Electronic Desktop Calculator

Desktop calculators pioneered the idea of computers small and cheap enough to sit on an individual’s desk. Eventually, they also became the impetus for the general-purpose microchip.

History

The first desktop electronic calculators were the ANITA Mark VII and ANITA Mark VIII, both launched in late 1961. The Bell Punch Co. of Britain designed the ANITA. Markedly, the machines used vacuum tubes and cold-cathode switching tubes, with Nixie tubes for the numerical display. Norbert (“Norman”) Kitz led the design and engineering work.

The ANITA VII sold in continental Europe and the ANITA VIII in the UK and the rest of the world. Soon after launch, however, Bell Punch dropped the ANITA VII and consolidated the product line.

Cost was a major factor in producing the ANITA. To be viable, Bell Punch needed to sell the calculator for about 1/100th of what the least expensive electronic computers of the day cost. ANITA went on the market for £355 (about £7,800 in 2018, roughly $10,500 USD). In contrast, the least expensive general-purpose computers in 1961 cost about £50,000 (just over £1 million adjusted to 2018). The device weighed 34 pounds (15.5 kg).

Transistor-Based Calculators

By 1964, competitors began releasing calculators that used transistors rather than tubes. Sharp, Canon, Sony, Toshiba, Wang, and countless others released transistor-based calculators. These calculators were similarly priced to the ANITA, or even more expensive, but were significantly smaller and lighter due to the lack of tubes.

The Soviet bloc literally weighed in with the Bulgarian-built T-64. Despite its use of semiconductors, the calculator weighed 8 kg (17.6 lbs.); it was also the first calculator to compute square roots.

Calculators continued to decrease in price and size while increasing in performance.

General-Purpose Microchip

Many calculator companies hired Intel, then a young company, to produce custom chips for their calculators. In 1970, Intel engineer Ted Hoff instead created a general-purpose chip for the Japanese company Busicom. Unlike other calculator chips, the Busicom chip was programmable to perform multiple functions, not only those specific to one calculator. In 1971, Intel licensed the chip back and rebranded it the Intel 4004, Intel’s first general-purpose microprocessor.

Nuclear Aircraft Carrier

Nuclear aircraft carriers are enormous ships capable of traveling the world indefinitely. The ships feature large flight decks capable of launching and landing fixed-wing aircraft, typically fighters.

At 1,123 ft. (342 m), the USS Enterprise was an enormous ship; only oil supertankers are larger.

The Enterprise supported 4,600 service members. First launched Sept. 24, 1960, Enterprise remained in service until December 1, 2012. The ship featured eight nuclear reactors; no subsequent US carrier, nor any other country’s, has carried more than two reactors per ship. The ship cost $451 million USD (about $4 billion in 2019 dollars).

Enterprise’s first mission was recovering astronaut John Glenn, the first American in orbit. Her second was the naval blockade during the Cuban Missile Crisis. By 1964, Enterprise was sailing around the world with two other nuclear-powered ships as a display of American firepower.

Enterprise fought extensively during the early Vietnam War, launching countless airstrikes. On Dec. 3, 1965, she launched a record 165 sorties in one day. The ship was later sent elsewhere for refitting and other missions, but returned to Vietnam in 1971, towards the end of the war. Enterprise also fought in the first and second Iraq Wars, as well as in Afghanistan and countless other American skirmishes.

Surprisingly, the nuclear aircraft carrier was designed and launched after the nuclear submarine, a more complex weapon. Today, the United States and France operate nuclear aircraft carriers. The portable airfields circle the globe endlessly, sailing from one conflict to another.

Mass-Scale Desalination

Reverse osmosis enables large-scale desalination of seawater, efficiently transforming it into drinking water.

People have tried to desalinate seawater into drinking water for millennia. Aristotle and other ancient Greeks attempted, unsuccessfully, to desalinate seawater. By the 16th century, ship-based evaporation desalination systems created potable water. By 1955, the first multi-stage flash distillation (MSF) plant went online. It desalinated water but required distillation, consuming enormous amounts of time and energy.

In 1959, the first multi-effect distillation (MED) plant came online, reusing heat across successive evaporation stages. Subsequently, researchers at the University of California developed the synthetic reverse osmosis membrane. Together, these advances provided the building blocks of a modern desalination plant.

Reverse osmosis desalination methods were refined over the following decades. In particular, the membranes became vastly more efficient at filtering salt and other particulate matter from seawater while using ever less electricity.
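To see why pumping dominates a plant’s electricity use, a back-of-envelope estimate of seawater’s osmotic pressure helps. The sketch below uses the van ’t Hoff relation with assumed round numbers (seawater approximated as 0.6 M NaCl at room temperature); it is illustrative, not a measurement from any particular plant.

```python
# Back-of-envelope estimate of why reverse osmosis needs high-pressure pumps.
# All values are illustrative assumptions: seawater approximated as 0.6 M NaCl.

R = 0.08314  # gas constant in L·bar/(mol·K)
T = 298.0    # temperature in kelvin (about 25 °C)
i = 2        # van 't Hoff factor: NaCl dissociates into two ions
M = 0.6      # approximate molar salt concentration of seawater (mol/L)

# van 't Hoff relation: osmotic pressure = i * M * R * T
osmotic_pressure_bar = i * M * R * T
print(f"Osmotic pressure of seawater: ~{osmotic_pressure_bar:.0f} bar")  # ~30 bar
```

Only pressure applied beyond that roughly 30-bar threshold pushes fresh water through the membrane, so better membranes and pumps directly cut the electricity consumed per cubic meter.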

As aquifers around the world run dry, desalination promises to help offset the use of natural potable water.

As of 2019, Saudi Arabia has the largest desalination plant in the world. It features 8 evaporators and 17 reverse osmosis units, and produces 1 million cubic meters of drinking water every day.

Israel comes second; using 16-inch (40.6 cm) membranes, its plant produces 624,000 cubic meters of drinkable water per day. Thanks to this and other desalination plants, Israel generates more water than the country uses, using the excess to refill drained aquifers and the Sea of Galilee. A plant north of San Diego, in the US, will produce about 190,000 cubic meters of freshwater a day for Californians, who have suffered water rationing for years as aquifers ran dry.

Barbie

In 1945, Harold “Matt” Matson and Elliot Handler created a garage-based manufacturing business. They named it Mattel, combining their names. First, they manufactured picture frames. Using the leftover pieces of wood, Elliot built dollhouses that sold well. Soon, Matson dropped out of the business, leaving it solely owned by Handler and his wife, Ruth.

Ruth and Elliot created a toy ukulele that sold well, the company’s first major success. In 1955, they licensed the rights for popular “Mickey Mouse Club” products. Licensing pop-culture characters for toys was an emerging and popular business model. That same year, they patented a toy cap gun. The business wasn’t spectacular, but the Handlers were doing well.

Dolls during this era were babies or children. However, Ruth noticed her own daughter, Barbara, playing with dolls and assigning them adult roles. In 1956, the family took a trip to Europe and purchased a German toy doll called Bild Lilli that looked like a small woman rather than a child. Originally marketed to adults, Lilli proved more popular with children.

Back in the US, Ruth decided to make her own grown-up doll. The doll should look fun, she reasoned, rather than realistic. It had an unusually large bust, a slim waist, and full hips. Ruth named it Barbie, after her daughter Barbara.

On March 9, 1959, Barbie was introduced to the world at the American International Toy Fair in New York. She wore a black-and-white one-piece swimsuit and came in blonde or brunette. The “Teen-age Fashion Model” was wildly successful, selling about 350,000 dolls during her first year. Ruth died in 2002 and Elliot in 2011 but, in 2019 at 60 years old, Barbie is still very much alive.

Just-In-Time Manufacturing

Just-in-time manufacturing delivers the parts required to complete a product shortly before they are needed. This vastly reduces inventory cost while typically increasing quality by aligning the manufacturing needs of part suppliers and the final manufacturer.

Background

Toyota engineer Taiichi Ohno needed a better way to manufacture. Efficiency was low and quality suffered, especially when necessary parts ran out. He noticed that supermarkets used a visual card to indicate when an item was running low, signaling supermarket workers to restock the bin immediately. Without this system, bins might be filled with unneeded food that would spoil, or sit empty, forcing customers to make a later trip or go to a different store.

Ohno adapted this system calling it “Kanban,” which means “visual signal” or “card” in Japanese.

Ohno brought the system to Toyota’s manufacturing facilities. When parts ran low, workers turned over a card and somebody quickly came to replenish the parts. There were never too many nor too few parts for a workstation on an assembly line.

Kanban has six core practices, the first two of which are illustrated in the sketch below. First, visualize the workflow: lay out the workflow so an ordinary person can grasp it visually. Second, limit work-in-progress: there must never be too much nor too little work-in-progress. Third, manage flow: align the workflow with the workers and the need for literal or figurative parts. Fourth, make process policies explicit: clarify the workflow so everybody understands what is required. Fifth, create feedback loops: ask and observe what works and what doesn’t, and adjust accordingly. Finally, improve collaboratively: use small, continuous, incremental evolutionary changes that stick. Do not try to boil the ocean.
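A minimal sketch of a visualized workflow with explicit work-in-progress limits might look like the following. The stage names and limits here are illustrative assumptions, not Toyota’s actual system.

```python
# A minimal Kanban board with explicit work-in-progress (WIP) limits.
# Stage names and limits are illustrative assumptions, not Toyota's system.

class KanbanBoard:
    def __init__(self, wip_limits):
        self.wip_limits = wip_limits                        # stage -> max items
        self.columns = {stage: [] for stage in wip_limits}  # visualized workflow

    def add(self, stage, item):
        # A full column is the "visual signal": stop pushing work into it.
        if len(self.columns[stage]) >= self.wip_limits[stage]:
            raise RuntimeError(f"WIP limit reached in '{stage}'")
        self.columns[stage].append(item)

    def move(self, item, src, dst):
        # Pull-based flow: an item advances only if the next stage has room.
        if len(self.columns[dst]) >= self.wip_limits[dst]:
            raise RuntimeError(f"'{dst}' is full; clear the bottleneck first")
        self.columns[src].remove(item)
        self.columns[dst].append(item)

board = KanbanBoard({"queued": 5, "in progress": 2, "done": 1000})
board.add("queued", "weld chassis")
board.move("weld chassis", "queued", "in progress")
```

The exception plays the role of the card: when a stage is full, upstream workers stop producing and help clear the bottleneck instead.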

Toyota found Kanban vastly increased efficiency and decreased costs, and adopted it throughout the Toyota production system.

JIT

From the 1950s through the 1970s, the quality of Japanese manufacturing rapidly increased while the quality of US manufacturing steadily declined. American executives studied the Japanese and found that the two core components of Japan’s secret sauce were the use of Kanban and techniques taught by statistician W. Edwards Deming after the war. Deming tied Kanban’s flow into a statistical system called Total Quality Management (TQM), producing higher quality goods (especially cars) at lower prices.

Eventually, US firms adopted Kanban and TQM while the process evolved in Japan, the US, and elsewhere. Most notably, Michael Dell created a computer company that relied heavily on parts created by others. Dell computers were custom-configured when ordered, then quickly delivered. He needed a system where vendors aligned with his own factory to quickly build high-quality computers. Dell’s Just-In-Time (JIT) methods revolutionized manufacturing, enabling him to work with countless suppliers to ensure the supply bins were neither empty nor too full.

Nuclear Submarine

Nuclear submarines can stay underwater for an unlimited amount of time, or at least until the food runs out. Prior to nuclear subs, there were diesel-electric subs, which still exist today. These subs use diesel engines to recharge their batteries, and must therefore surface to run the engines, limiting their range.

Nuclear submarines, including the first one, were designed to remain underwater indefinitely. Rather than remaining close to coastal waters, nuclear submarines are able to travel the globe. In particular, nuclear subs routinely sail under the polar ice caps and are capable of breaking through the ice to surface.

Background

The first nuclear submarine was the Nautilus, launched Jan. 21, 1954. It was tested for years, becoming the first ship to reach the North Pole on Aug. 3, 1958. In 1960, it was assigned to the Sixth Fleet as an active-duty submarine. By 1966, however, new technologies had rendered Nautilus obsolete, and it was relegated to training duties.

Nuclear submarine technology evolved, and the ships were eventually equipped with all manner of weapons besides ordinary torpedoes. Nuclear submarines can launch cruise missiles and even full-blown nuclear ballistic missiles. Since they are quiet and travel under the polar ice caps, it is virtually impossible to destroy a nuclear submarine before it launches its missiles unless an enemy submarine is nearby.

The USSR eventually developed its own nuclear subs, but the early versions, and even some more recent models, lacked reliability. An early Soviet nuclear submarine, the K-19, launched in 1959 and earned the nickname “the widowmaker.”

Both the US and USSR/Russia developed two basic types of nuclear submarines: “boomers,” which launch ballistic missiles, and hunters, which destroy other submarines and support special operations missions. There is a broad consensus that a nuclear submarine armed with nuclear ballistic missiles is the most powerful weapon developed in history.

Genetic Testing

Genetic testing identifies genetic patterns, including irregularities. In 2019, genetic testing is typically used to search for abnormalities and susceptibilities. However, new treatments under development target the specific traits of a patient or disease, attacking and curing at the genetic level. In addition, genetic testing is entertaining: people find unknown relatives or trace family origins.

Background

In April 1953, James Watson and Francis Crick, drawing on the X-ray crystallography of Rosalind Franklin, discovered that DNA is a double helix. They explained how DNA self-replicates and encodes hereditary information. Watson and Crick eventually won the Nobel Prize for their work (Franklin had died, rendering her ineligible). However, while they accurately described the form of DNA, they did not explain the chromosomes that render our biological blueprint.

In 1956, Joe Hin Tjio and Albert Levan released the first substantive work on chromosomes, the core of genetic testing. They found human cells contained 46 chromosomes, not 48 as previously believed. Almost more importantly, they identified how to read information from chromosomes.

Not long after, the earliest genetic testing began. Reports emerged concurrently identifying the genetic abnormality responsible for Down syndrome. Next came reports tying Turner and Klinefelter syndromes to genetic anomalies.

Progress identifying genetic differences proceeded slowly until the 1980s, when new technologies lowered the cost and increased the value of the information. By the 1990s, these techniques had increased in speed and decreased in cost.

Human Genome Mapping

In 1990, scientists started a project to map the entire human genome, the Human Genome Project. It finished in April 2003 and cost about $2.7 billion USD. By late 2018, one company ran a sale to sequence an entire human genome for $200. The regular price was $999, though the company, Veritas, predicts the retail price for a full DNA sequence will be $99 by 2024 at the latest.

Countless DNA sequencing companies exist that read and report partial DNA results. For example, 23andMe offers a “Health + Ancestry Service.” For $199, customers receive over 125 gene-related health reports plus a fun family history report. The family history report (“Get a breakdown of your global ancestry by percentages, connect with DNA relatives and more”) costs $99 alone.

Sonography

Sonography is the process of using sound waves as an imaging device, typically for medical purposes.

Background

The principles of sonography come from the natural world. For example, bats and whales are mammals that use sound waves for navigation. In 1794, after performing studies on bats, Lazzaro Spallanzani gained a basic understanding of ultrasound physics.

In 1880, French brothers Jacques and Pierre Curie discovered piezoelectricity. Simplifying, piezoelectricity is an electric charge generated by deforming certain crystals; flint-less cigarette lighters and inkjet printers both utilize the piezo effect. Crucially, piezoelectricity enables ultrasound transducers that emit and receive sound waves.

On April 14, 1912, the RMS Titanic famously struck an iceberg and sank, killing about 1,500 people. Government agencies around the world called for some method to better detect icebergs. In 1914, Paul Langevin built on the work of Reginald Fessenden (of AM radio) to invent the first ultrasound transducer aimed at icebergs. His machine detected icebergs up to about two miles away but had no directional capability: it could detect that an iceberg was somewhere close, but not in which direction.
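The ranging arithmetic behind these machines is simple enough to sketch in a few lines. The figures below are assumptions for illustration (a round-number sound speed in seawater), not specifications of Langevin’s device.

```python
# Echo ranging in one line of arithmetic. The sound speed is an assumed
# round number; the real speed varies with temperature, depth, and salinity.

SPEED_OF_SOUND_SEAWATER = 1500.0  # meters per second (approximate)

def range_from_echo(round_trip_seconds):
    # The pulse travels out to the target and back, so halve the round trip.
    return SPEED_OF_SOUND_SEAWATER * round_trip_seconds / 2

# An echo returning after 4.3 seconds puts the target ~3,200 m away,
# about the two-mile limit attributed to Langevin's machine.
print(f"{range_from_echo(4.3):.0f} meters")  # 3225 meters
```

A single transducer only yields this distance, which is why the early machines could say an iceberg was near without saying where; direction required the focused beams developed later.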

Ultrasound as Weaponry

The use of submarines in World War I increased the need for directional ultrasound in water. Eventually, Langevin and Constantin Chilowsky created a high-frequency ultrasound machine with directional capabilities. On April 23, 1916, their “hydrophone” was used to sink a German U-boat.

Medical Imaging

In 1942, Austrian neurologist Karl Dussik used sonography to detect brain tumors. Dussik used a method where sound waves were beamed towards the head of a patient partially submerged in water and the resulting echo was recorded on heat-sensitive paper, producing the first ultrasound image. Later, George Ludwig used ultrasound to detect gallstones and kidney stones.

Progress continued with physicians and engineers using ultrasound to measure various fluid-based organs. Most notable are studies in cardiology and obstetrics. By the 1970s, Doppler and color Doppler ultrasound imaging became commonplace. In the 1980s, Kazunori Baba of Japan developed 3D ultrasound.

By the 1990s, with the help of computers, real-time 3D ultrasound enabled surgeons to see inside a body during biopsies. Today, ultrasound machines are common, especially in obstetrics. Unlike radiation-based imaging devices, ultrasound machines are considered harmless.

Rock & Roll

“If you’re not doing something different, you’re not doing anything.”

Sam Phillips

Background

Billboard magazine started charting songs in 1940. Eventually, they divided songs into three categories: pop, country-and-western, and “race music.” Around 1949, race music was renamed rhythm and blues (R&B).

Music sales figures were proprietary and closely guarded, so Billboard based its charts on popularity estimates from jukebox and radio play. Before the 1940s, three national broadcast networks dominated. Then the Federal Communications Commission mandated more local radio licenses, and from the beginning to the end of the 1940s the number of local radio stations increased from about 800 to over 2,000.

Billboard categorized a song as pop or R&B depending upon whether the audience was African American or white. Songs on radio stations targeted to African Americans, or on jukeboxes in African American clubs, were R&B. Those targeted to white people were pop.

Rock & Roll

Rock & Roll came from the convergence of two events in Memphis, Tennessee. The first was a recording studio owned by Sam Phillips. “We Record Anything – Anywhere – Anytime” was its slogan. Any aspiring musician could visit and audition. If Phillips liked what he heard, he’d record them. Musicians could also cut their own records for a fee.

The second was WDIA, a local African-American radio station in Memphis. In 1949, it began broadcasting and hired a disk jockey and on-air performer named B.B. King. Radio waves did not respect the racial segregation lines in Memphis. One WDIA listener, a B.B. King fan, was a young white man named Elvis Presley.

Young Elvis wasn’t alone. By one estimate, about 40% of the people buying R&B music at African-American record stores were young white people.

Just about this time, the major labels exited the R&B market, focusing their catalogs on white audiences. In response, countless minor labels sprung up. Phillips, with his stream of fresh talent, decided to create one of his own, calling it Sun Records.

About this same time, television began widespread penetration. One of the featured events of live television was the musical performance. Phillips had great musicians and a great sound but knew television, in those days, would refuse to broadcast African-American performers. “If I could find a white man who had the Negro sound and the Negro feel, I could make a billion dollars,” Phillips said. In 1956, the Nat King Cole television show was canceled after only a year due to a dearth of sponsors. “Madison Avenue is afraid of the dark,” Cole noted.

Elvis

Elvis showed up at Phillips’s studio in the summer of 1953, at age 18, and paid about four dollars to record two songs. The studio noted “Good ballad singer. Hold.” and ignored him. Phillips invited him back a year later to try some ballads. Nothing clicked, but Phillips added some musicians, an electric guitarist and a standup bass player. After a few songs, Elvis suggested trying R&B music, singing “That’s All Right.”

Phillips asked a friend with an R&B show on a white radio station (yes, Memphis was that segregated) to play the record. Dewey Phillips played it and, due to repeated requests, kept playing it. Soon enough, Elvis was singing to 40 million people on television.

During those years, Phillips recorded Elvis, B.B. King, Howlin’ Wolf, Ike Turner, Carl Perkins, Johnny Cash, Jerry Lee Lewis, and Roy Orbison. And in a small studio in Memphis, Rock and Roll was born.

Heart-Lung Machine / Cardiopulmonary Bypass

Heart-lung machines temporarily do the work of the heart and lungs, allowing surgeons to operate on the heart or lungs. Despite its sci-fi nature, the machine was a husband-and-wife garage invention.

Background

In 1931, surgeon John Gibbon lost a patient he felt sure would have lived if he could have temporarily kept the patient’s blood circulating and oxygenated. He worked with his lab assistant, Mary Hopkinson, to develop a heart-lung machine. More than the machine progressed: John and Mary eventually married.

The Gibbons experimented on cats. By 1935, they were able to keep a cat alive for 20 minutes while their machine replaced heart and lung function. However, the machine damaged blood cells, and virtually no cat lived longer than 23 days after surgery.

Gibbon’s machine relied largely on a blood oxygenator: a series of rollers, developed in 1885 by von Frey and Gruber, that thinned out blood and exposed it to oxygen. The rollers mimicked the surface area of the lungs. Additionally, it was a challenge to find a pump as strong as the human heart. Patients’ blood had to be saturated with heparin to prevent coagulation, which would gunk up the machine.

By 1945, Gibbon had brought in other researchers and expanded the research to dogs. They found that by adding filters to remove blood clots, and applying suction to prevent air from entering the bloodstream, survival rates dramatically increased.

On May 6, 1953, Gibbon and his team decided their machine was ready for use on people. Gibbon operated on Cecelia Bavolek, bypassing her heart and lungs with his machine for 45 minutes. She lived and fully recovered from the operation. Unfortunately, Gibbon’s next four patients died and he abandoned heart surgery.

Walton Lillehei

Walton Lillehei picked up where Gibbon left off. He tried a radically different approach: connecting one person to another, whose heart and lungs would do the work for two. Typically, a child would be connected to one of their parents. Lillehei also invented the bubble oxygenator, replacing the rollers used in Gibbon’s machine. Finally, the heart-lung machine was reliable.

Despite the success, there was one major problem: the heart kept beating during the use of the machine. This caused a literal bloody mess, making it difficult for surgeons to see. Returning to animal research, surgeons found it was possible to stop a heart while a patient was connected to the machine and restart it later. However, the lack of blood in the heart caused tissue damage. Surgeons operated on beating hearts until the 1980s, when researchers at St. Thomas’ Hospital found that cooling the heart below 28°C (82°F) and treating it with a combination of drugs kept the heart healthy and intact. Today, this technique is used for extended operations where a heart must be stopped, and also to transport hearts for transplant.