Here we uncover “Knowledge Ingots” that have been mined in the fields of Science, Technology, Engineering, Art, and Mathematics. Special attention is given to Ingots where AI and Machine Learning provided the catalyst for the mining (see asterisked items). Each Ingot is referred to within the STEAM Pilot Lesson Plans.
-
Scientists at the National Institute of Standards and Technology (NIST) have discovered a way to “teach an AI to make an interconnected set of adjustments to tiny quantum dots,” which are among the many promising devices for creating the quantum bits, or “qubits,” that would form the switches in a quantum computer’s processor.
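As a minimal, illustrative aside (not a model of the NIST tuning work itself), a qubit can be pictured as a pair of complex amplitudes whose squared magnitudes give measurement probabilities. The example state below is arbitrary:

```python
import math

# A qubit state |psi> = alpha|0> + beta|1>, with |alpha|^2 + |beta|^2 = 1.
# These amplitudes are arbitrary example values, not taken from the NIST work.
alpha = complex(1 / math.sqrt(2), 0)
beta = complex(0, 1 / math.sqrt(2))

# The probability of measuring 0 or 1 is the squared magnitude of each amplitude.
p0 = abs(alpha) ** 2
p1 = abs(beta) ** 2

assert math.isclose(p0 + p1, 1.0)  # the state must be normalized
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")  # 0.50 each: an equal superposition
```

Unlike a classical switch, which is either on or off, this state holds both outcomes at once until measured.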
-
Scientists have recently used AI to learn about protein folding and to determine the structures of tens of thousands of proteins. Using AlphaFold, an AI program developed by DeepMind, scientists are developing treatments for a whole new set of diseases.
-
A nanometer (nm) is a unit of length in the metric system, equal to one billionth of a meter (10^-9 meters). It is commonly used to measure things on the molecular and atomic scales, such as the wavelengths of light, the dimensions of biological structures, and the sizes of semiconductor components in electronics.
Definition and Scale
Size Comparison:
1 nanometer = 0.000000001 meters (10^-9 meters)
For context, a single strand of human DNA is about 2.5 nanometers in diameter.
A typical human hair is about 80,000 to 100,000 nanometers wide.
A red blood cell is approximately 7,000 nanometers in diameter.
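To make these comparisons concrete, here is a small Python sketch (using the approximate figures quoted above) that converts each example from nanometers back to meters:

```python
# Approximate sizes from the comparisons above, in nanometers.
NM_PER_METER = 1e9  # 10^9 nanometers in one meter

examples_nm = {
    "DNA strand diameter": 2.5,
    "human hair width": 90_000,        # midpoint of 80,000-100,000 nm
    "red blood cell diameter": 7_000,
}

for name, nm in examples_nm.items():
    meters = nm / NM_PER_METER
    print(f"{name}: {nm:,} nm = {meters:.1e} m")
```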
Applications
Nanotechnology:
Nanomaterials: Materials engineered at the nanometer scale often exhibit unique properties, such as increased strength, lighter weight, and higher chemical reactivity. Examples include carbon nanotubes and quantum dots.
Nanodevices: The development of nanoscale devices, such as nanosensors and nanorobots, has potential applications in medicine, electronics, and environmental monitoring.
Semiconductor Industry:
Transistors: Modern microprocessors use transistors with dimensions measured in nanometers. The smaller the transistor, the more can fit on a chip, increasing computing power and efficiency.
Manufacturing Processes: Semiconductor fabrication processes are often described in terms of nanometers, such as the 7nm, 5nm, or even 3nm process nodes.
Biology and Medicine:
Biological Structures: Many biological structures, such as proteins, viruses, and cell membranes, are best described in nanometers. This scale is crucial for understanding cellular processes and developing medical treatments.
Drug Delivery: Nanoparticles can be used for targeted drug delivery, improving the efficacy and reducing the side effects of treatments.
Optics and Photonics:
Wavelengths of Light: Visible light has wavelengths in the range of approximately 400 to 700 nanometers. Understanding and manipulating light at this scale is essential for developing optical devices such as lasers and photonic circuits.
Nano-Optics: The field of nano-optics involves studying and controlling light at the nanometer scale, with applications in imaging, sensing, and information processing.
Materials Science:
Surface Properties: Nanotechnology allows for the modification of surface properties of materials, leading to innovations like self-cleaning surfaces, enhanced catalysts, and improved coatings.
Measurement and Tools
Electron Microscopy: Scanning electron microscopes (SEMs) and transmission electron microscopes (TEMs) are capable of imaging structures at the nanometer scale.
Atomic Force Microscopy (AFM): AFM is a type of scanning probe microscopy with very high resolution, able to measure features at the nanometer level by "feeling" the surface with a mechanical probe.
Challenges
Precision and Control: Working at the nanometer scale requires extremely precise control over the fabrication and manipulation of materials, often pushing the limits of current technology.
Cost and Scalability: The cost of developing and manufacturing nanoscale materials and devices can be high, and scaling up production to industrial levels presents additional challenges.
-
Independently developed by Isaac Newton and Gottfried Wilhelm Leibniz in the late 17th century, calculus provides a framework for understanding changes and motion. It includes the concepts of differentiation and integration, which are essential in physics, engineering, economics, and many other fields.
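As a minimal numerical sketch of those two concepts (a finite difference for the derivative and the trapezoid rule for the integral, with f(x) = x² as an arbitrary example function):

```python
# Numerical sketches of differentiation and integration for f(x) = x**2.

def f(x):
    return x * x

# Derivative at x = 3 via a central finite difference; the exact answer is 6.
h = 1e-6
derivative = (f(3 + h) - f(3 - h)) / (2 * h)

# Integral of f from 0 to 1 via the trapezoid rule; the exact answer is 1/3.
n = 1_000
width = 1.0 / n
integral = sum((f(i * width) + f((i + 1) * width)) / 2 * width for i in range(n))

print(f"f'(3) ~ {derivative:.6f}")         # ~6.000000
print(f"integral 0..1 ~ {integral:.6f}")   # ~0.333333
```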
-
CRISPR-Cas9 is a revolutionary gene-editing technology that allows scientists to add, remove, or alter genetic material within an organism's DNA. It's being explored for potential treatments for genetic disorders, cancer, and even the modification of crops for better yield.
-
Modern cryptography, which heavily relies on mathematical concepts such as number theory, algebra, and computational complexity, has revolutionized secure communication.
The development of public-key cryptography by Whitfield Diffie and Martin Hellman, and independently by Ralph Merkle, laid the foundation for secure digital communication.
Bitcoin uses a system of public and private keys to manage ownership and transfer of bitcoins. The public key is used as an address to receive bitcoins, while the private key is used to sign transactions, proving ownership.
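The core public-key idea, deriving a shared secret without ever transmitting a private key, can be sketched with a toy Diffie-Hellman exchange (the numbers below are deliberately tiny and insecure; Bitcoin itself uses elliptic-curve signatures rather than this scheme):

```python
# Toy Diffie-Hellman key exchange. The prime and keys are far too small to be
# secure; they only illustrate the mechanics.
p = 23  # public prime modulus
g = 5   # public generator

alice_private = 6   # kept secret by Alice
bob_private = 15    # kept secret by Bob

alice_public = pow(g, alice_private, p)  # sent openly to Bob
bob_public = pow(g, bob_private, p)      # sent openly to Alice

# Each side combines the other's public value with its own private key.
alice_shared = pow(bob_public, alice_private, p)
bob_shared = pow(alice_public, bob_private, p)

assert alice_shared == bob_shared
print(f"shared secret: {alice_shared}")  # 2
```

An eavesdropper sees only p, g, and the two public values; recovering the shared secret from those is the discrete-logarithm problem that makes the scheme hard to break at realistic key sizes.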
-
Thomas Edison could well have used Isaac Newton’s famous line about standing “on the shoulders of giants”: numerous scientists contributed to his understanding of electricity.
The timeline begins around 600 BC with the Greek philosopher Thales and proceeds through William Gilbert, Robert Boyle, and Benjamin Franklin on to Alessandro Volta, Michael Faraday, and James Clerk Maxwell.
It was Faraday’s discovery in 1821 of the principle of electromagnetic rotation that would later be the key to developing the electric motor. And it was Joseph Swan who actually invented the first incandescent lightbulb; however, his lightbulb burned out quickly.
-
Most students of biology are aware of the Watson and Crick double-helix model of DNA. Fewer take note of the research by Rosalind Franklin that pre-dated the 1953 model. Working at King's College London alongside Maurice Wilkins, Franklin exposed a crystallized form of DNA to X-rays to reveal clues about the structure of the molecule.
Franklin discovered that some of the rays are deflected by the atoms in the crystal, forming a diffraction pattern. Watson and Crick then employed this imaging, along with Erwin Chargaff’s findings about the A, C, T, and G nucleotides, to come up with their model.
Watson and Crick, along with Wilkins, received the Nobel Prize in Physiology or Medicine in 1962. Unfortunately, Franklin had by then passed away, and the prize was not awarded posthumously.
-
Thousands of exoplanets (planets outside our solar system) have been discovered, some within the habitable zones of their stars where conditions might be right for life. The Kepler Space Telescope and other missions have significantly advanced our understanding of these distant worlds.
-
A geodesic dome is a spherical or partial-spherical structure composed of a network of triangles. This triangular network distributes stress evenly across the structure, making it exceptionally strong relative to its weight.
The concept of geodesic structures can be traced back to earlier work by engineers such as Walther Bauersfeld, who designed a geodesic dome for the Zeiss planetarium in 1922. However, it was popularized by Buckminster Fuller, who improved the design and made it suitable for commercial production.
-
Widespread use of cloud computing began in 2006, when Amazon began selling its excess computing capacity as the Elastic Compute Cloud (EC2).
-
Cubism is an early-20th-century art movement that presented objects in multi-dimensional abstract forms. The movement was pioneered by Pablo Picasso and Georges Braque, had a profound influence on both painting and sculpture, and went on to inspire related movements in music, literature, and architecture.
-
Discovered by a group of French teenagers in 1940, the Lascaux Cave paintings date back around 17,000 years and feature vivid depictions of animals such as horses, deer, and bison. These Paleolithic cave paintings provide insight into early human life and artistic expression.
-
In 1911, Hiram Bingham rediscovered Machu Picchu, an Incan citadel high in the Andes. The site’s architecture, terracing, and stonework exemplify Incan art and engineering skills, offering insights into their civilization.
-
Developed by Blaise Pascal, Pierre de Fermat, and later formalized by Andrey Kolmogorov, probability theory provides a mathematical framework for analyzing random events. It is fundamental in fields such as statistics, finance, and many branches of science.
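As a minimal sketch of probability theory in action, the Monte Carlo simulation below estimates the chance that two fair dice sum to 7 and compares it with the exact value of 1/6:

```python
import random

# Monte Carlo estimate of P(two dice sum to 7); the exact value is 6/36 = 1/6.
trials = 100_000
hits = sum(
    1 for _ in range(trials)
    if random.randint(1, 6) + random.randint(1, 6) == 7
)

print(f"estimated: {hits / trials:.4f}")
print(f"exact:     {1 / 6:.4f}")
```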
-
Discovered by the ancient Greek philosopher Pythagoras of Samos around 550 BC, the Pythagorean theorem states that in a right-angled triangle, the square of the hypotenuse is equal to the sum of the squares of the other two sides (a² + b² = c²). This theorem is fundamental in geometry.
While the theorem was known to Babylonian mathematicians before Pythagoras, he and his followers are credited with providing the first known proof.
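A quick check of the theorem with the classic 3-4-5 right triangle (any right triangle would do):

```python
import math

a, b = 3.0, 4.0
c = math.hypot(a, b)  # sqrt(a**2 + b**2), the length of the hypotenuse
assert math.isclose(a**2 + b**2, c**2)
print(c)  # 5.0
```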
-
Discovered by French soldiers during Napoleon’s campaign in Egypt, the Rosetta Stone features a decree in three scripts: Greek, Demotic, and hieroglyphic. It was instrumental in deciphering Egyptian hieroglyphs, opening up the study of ancient Egyptian culture and art.
-
Development of the Internet
Early Concepts:
J.C.R. Licklider: In 1962, Licklider of MIT discussed his "Galactic Network" concept, envisioning a globally interconnected set of computers through which everyone could quickly access data and programs from any site.
ARPANET:
Origins: The Advanced Research Projects Agency Network (ARPANET) was funded by the U.S. Department of Defense in the late 1960s. ARPANET later became the first network to adopt the TCP/IP protocol suite, laying the foundation for the internet.
First Connection: The first message was sent over ARPANET on October 29, 1969, from UCLA to Stanford Research Institute.
TCP/IP Protocol:
Vint Cerf and Bob Kahn: In the 1970s, Cerf and Kahn developed the Transmission Control Protocol (TCP) and the Internet Protocol (IP), which became the standard networking protocols for the ARPANET and eventually the internet.
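As a minimal sketch of TCP/IP in action (assuming outbound network access; example.com is a public test domain), the snippet below opens a TCP connection and sends a raw HTTP request over the resulting byte stream:

```python
import socket

# IP routes packets to the host; TCP turns them into a reliable byte stream.
with socket.create_connection(("example.com", 80), timeout=5) as sock:
    sock.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    response = b""
    while chunk := sock.recv(4096):
        response += chunk

print(response.split(b"\r\n")[0].decode())  # status line, e.g. "HTTP/1.1 200 OK"
```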
Development of the World Wide Web:
Tim Berners-Lee: In 1989, while working at CERN, Berners-Lee proposed the World Wide Web, an information management system that allowed documents to be linked through hypertext. He also developed the first web browser and web server.
Launch: The World Wide Web became publicly available in 1991, revolutionizing how information is accessed and shared on the internet.
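To make "documents linked through hypertext" concrete, here is a minimal sketch (with hypothetical page content) that uses Python's standard-library http.server to serve a single HTML page containing one hyperlink; opening http://localhost:8000 in a browser shows the link:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# One hypertext page: the <a href> tag is the link that makes it "hyper".
PAGE = b"""<html><body>
<h1>Hello, Web</h1>
<p>This page links to <a href="https://info.cern.ch">the first website</a>.</p>
</body></html>"""

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Handler).serve_forever()
```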
Commercialization and Growth:
1990s Boom: The internet transitioned from a primarily academic and governmental network to a commercial and public network in the 1990s, leading to the dot-com boom.
Web Browsers: The development of user-friendly web browsers like Mosaic and Netscape Navigator in the early 1990s made the web accessible to a wider audience.
-
The Printing Press: An Overview
Johannes Gutenberg:
Invention: Johannes Gutenberg, a German blacksmith, goldsmith, printer, and publisher, is credited with inventing the movable type printing press around 1440.
Movable Type: The key innovation was the development of movable type, which allowed individual letters and characters to be arranged and rearranged quickly for each page, making the printing process much more efficient than previous methods.
Technology and Process:
Typesetting: Movable metal type pieces were assembled into pages of text.
Ink: Oil-based ink was applied to the type.
Press: The inked type was pressed onto paper, parchment, or vellum using a screw press adapted from the wine and olive presses of the period.
Mass Production: This method allowed for the mass production of books and other printed materials, significantly reducing the cost and time required to produce them.
The Gutenberg Bible:
First Major Work: One of Gutenberg's first major projects was the printing of the Gutenberg Bible (also known as the 42-line Bible) around 1455. This work demonstrated the potential of the printing press to produce high-quality, mass-produced books.
Impact of the Printing Press
Spread of Knowledge:
Books and Literacy: The printing press made books more accessible and affordable, leading to increased literacy rates. It facilitated the spread of knowledge, ideas, and education across Europe and eventually the world.
Standardization of Texts: Printed texts became more standardized, reducing errors and variations that were common in hand-copied manuscripts.
Scientific Revolution:
Dissemination of Scientific Ideas: The printing press enabled scientists to share their discoveries and theories more widely and rapidly, contributing to the Scientific Revolution. Works like Copernicus' "De revolutionibus orbium coelestium" and Newton's "Principia Mathematica" could reach a broader audience.
Collaboration: The ability to print and distribute scientific findings facilitated collaboration and the cumulative advancement of scientific knowledge.
Religious Reformation:
Martin Luther and the Reformation: The printing press played a crucial role in the Protestant Reformation. Martin Luther's 95 Theses were rapidly printed and disseminated, challenging the Catholic Church and leading to widespread religious reform.
Bible Translations: The printing press allowed for the production of Bibles in vernacular languages, making religious texts accessible to ordinary people and fostering personal interpretation of the scriptures.
Cultural and Social Change:
Renaissance: The spread of printed materials contributed to the cultural movement of the Renaissance by making classical texts and new ideas more accessible.
Public Discourse: Pamphlets, newspapers, and other printed materials became tools for public discourse, political debate, and the spread of new ideas, laying the groundwork for modern democracy.
Economic Impact:
Publishing Industry: The printing press gave rise to the publishing industry, creating new economic opportunities and professions.
Information Economy: The increased flow of information and knowledge contributed to economic growth and the development of more complex societies.
-
Invented in 1947 at Bell Labs, the transistor replaced vacuum tubes in electronic devices, leading to smaller, more efficient, and more reliable electronics, and paving the way for the development of modern computers and other digital devices.
-
The invention of the lithium-ion battery was a significant breakthrough in energy storage technology, leading to the development of many portable electronic devices, electric vehicles, and renewable energy systems. Here's an overview of the invention:
Background and Early Research
1950s-1970s: The concept of using lithium in batteries dates back to the 1950s, when researchers began exploring lithium's potential due to its high electrochemical potential and low atomic weight. Early experiments focused on primary (non-rechargeable) lithium batteries, but these had limitations due to lithium's reactivity.
1970s: In the 1970s, researchers started exploring rechargeable lithium batteries. Stanley Whittingham, working at Exxon, developed the first lithium battery using titanium disulfide (TiS2) as the cathode and lithium metal as the anode. However, safety concerns due to lithium metal's instability, particularly in recharging, made these early versions impractical.
Key Innovations and Invention
John B. Goodenough (1980): The true breakthrough came in 1980 when John B. Goodenough, a professor at the University of Oxford, discovered that lithium cobalt oxide (LiCoO2) could be used as a cathode material. This material allowed for much higher energy density and improved stability, making it suitable for rechargeable batteries.
Akira Yoshino (1985): Building on Goodenough's work, Akira Yoshino, a Japanese chemist working at Asahi Kasei Corporation, developed the first practical lithium-ion battery in 1985. Yoshino replaced the lithium metal anode with a safer material—petroleum coke, a carbonaceous material that can host lithium ions. This made the battery more stable and safer for commercial use.
Commercialization (1991): Sony and Asahi Kasei commercialized the first lithium-ion battery in 1991, marking the beginning of its widespread use in consumer electronics. The battery's high energy density, light weight, and long life cycle made it ideal for powering portable devices like camcorders, laptops, and eventually smartphones.
Impact and Applications
Portable Electronics: The lithium-ion battery became the standard for powering portable electronic devices due to its high energy density, relatively low self-discharge, and long cycle life.
Electric Vehicles: The development of lithium-ion batteries also enabled the growth of the electric vehicle industry, providing a lightweight, rechargeable power source that made electric cars viable on a large scale.
Renewable Energy Storage: Lithium-ion batteries are now essential for storing energy generated from renewable sources like solar and wind, helping to smooth out the supply of electricity and making renewable energy more practical.
Nobel Prize in Chemistry (2019)
In 2019, John B. Goodenough, M. Stanley Whittingham, and Akira Yoshino were awarded the Nobel Prize in Chemistry for their roles in the development of lithium-ion batteries. Their work has been recognized as a cornerstone of modern technology, enabling the creation of portable electronic devices, advancing electric vehicles, and contributing to the development of a more sustainable energy future.
Legacy and Future
The invention of the lithium-ion battery has had a profound impact on modern life, from the proliferation of portable electronics to the push for electric vehicles and renewable energy integration. As research continues, improvements in battery capacity, charging speed, safety, and environmental sustainability are expected to further expand the possibilities of lithium-ion and other advanced battery technologies.
-
The invention of penicillin is one of the most significant medical breakthroughs in history, leading to the development of antibiotics and revolutionizing the treatment of bacterial infections. Here’s an overview of the discovery and its impact:
Discovery
Alexander Fleming (1928): Penicillin was discovered by Scottish bacteriologist Alexander Fleming in 1928. While working at St. Mary's Hospital in London, Fleming was studying staphylococci bacteria. He returned from a vacation to find that one of his Petri dishes containing bacteria had been contaminated by mold. He noticed that the bacteria surrounding the mold were being destroyed, while colonies farther away were unaffected.
Identification of Penicillium notatum: Upon closer examination, Fleming identified the mold as belonging to the Penicillium genus, specifically Penicillium notatum. He realized that the mold was producing a substance that killed a wide variety of bacteria. Fleming named this substance penicillin.
Development and Mass Production
Early Research (1930s): After Fleming’s discovery, penicillin's potential as a "miracle drug" was recognized, but there were significant challenges in purifying and producing it in large quantities. Fleming's work attracted little immediate interest, and he moved on to other research.
Howard Florey and Ernst Boris Chain (1940s): The breakthrough in the development of penicillin came from a team of researchers led by Howard Florey and Ernst Boris Chain at the University of Oxford in the late 1930s and early 1940s. They succeeded in isolating and purifying penicillin, demonstrating its efficacy in treating bacterial infections in mice and later in humans.
World War II and Mass Production: The onset of World War II accelerated the need for effective treatments for infected wounds and diseases. The U.S. and British governments collaborated with pharmaceutical companies to scale up the production of penicillin. By D-Day in 1944, penicillin was being mass-produced and was widely used to treat Allied soldiers, saving countless lives.
Impact
Medical Revolution: Penicillin marked the beginning of the antibiotic era, transforming medicine by providing an effective treatment for previously deadly bacterial infections such as pneumonia, syphilis, gonorrhea, and streptococcal infections.
Reduction in Mortality Rates: The availability of penicillin drastically reduced mortality rates from bacterial infections and complications following surgery or injury.
Inspiration for New Antibiotics: Fleming's discovery inspired further research into antibiotics, leading to the development of a wide range of other antibiotic drugs.
Nobel Prize in Physiology or Medicine (1945)
In 1945, Alexander Fleming, Howard Florey, and Ernst Boris Chain were awarded the Nobel Prize in Physiology or Medicine for their roles in the discovery and development of penicillin. Fleming was recognized for his initial discovery, while Florey and Chain were honored for their work in turning penicillin into a practical and widely available drug.
Legacy and Challenges
Widespread Use: Penicillin and other antibiotics have saved millions of lives and are considered among the most important medical advancements of the 20th century.
Antibiotic Resistance: The widespread use of antibiotics has also led to the emergence of antibiotic-resistant bacteria, presenting a significant challenge to modern medicine. The need for new antibiotics and alternative treatments remains critical as resistance continues to grow.
The discovery of penicillin is a remarkable example of how a chance observation can lead to a revolutionary medical breakthrough, changing the course of history and improving human health on a global scale.
-
The invention of dynamite was a significant development in the history of explosives, with profound impacts on construction, mining, and warfare. Here’s an overview of its invention and its implications:
Inventor
Alfred Nobel (1867): Dynamite was invented by Alfred Nobel, a Swedish chemist, engineer, and inventor, in 1867. Nobel is most famously known for establishing the Nobel Prizes, but his invention of dynamite is one of his most important contributions to industry and science.
Background
Early Explosives: Before the invention of dynamite, the most common explosive used was black powder (gunpowder), which had been in use for centuries. However, black powder was relatively inefficient and dangerous to handle.
Nitroglycerin: In the mid-19th century, nitroglycerin was discovered by Italian chemist Ascanio Sobrero. Nitroglycerin was much more powerful than gunpowder, but it was highly unstable and prone to accidental explosions, making it dangerous to use.
Invention of Dynamite
Stabilizing Nitroglycerin: Alfred Nobel’s key innovation was finding a way to stabilize nitroglycerin so that it could be handled and transported safely. He discovered that when nitroglycerin was absorbed into an inert substance, such as diatomaceous earth (a type of porous silica), it became much safer to handle.
Formulation and Packaging: Nobel mixed nitroglycerin with diatomaceous earth to create a malleable paste, which he then rolled into cylindrical sticks and wrapped in paper. These sticks could be safely transported and handled, but they could still be detonated by a strong shock, such as from a blasting cap, another invention of Nobel’s that ensured the controlled detonation of dynamite.
Patent and Commercialization: Nobel patented his invention in 1867, naming it "dynamite," derived from the Greek word "dynamis," meaning "power." He immediately began producing and selling dynamite, revolutionizing the fields of construction, mining, and demolition.
Impact
Construction and Infrastructure: Dynamite played a crucial role in the construction of tunnels, roads, railways, and canals. It made it possible to blast through rock and other hard materials more efficiently than ever before, accelerating the pace of industrialization and infrastructure development.
Mining: In mining, dynamite allowed for the extraction of minerals and ores on a much larger scale, contributing to the growth of the mining industry and the availability of raw materials for manufacturing and construction.
Military Uses: While dynamite was initially intended for industrial use, it was also adopted for military purposes, particularly in warfare and the creation of explosives for bombs and other weapons. This application of dynamite, however, contributed to its controversial legacy.
Legacy and Controversy
Moral Reflections: Alfred Nobel became concerned about the destructive uses of his invention, especially after a French newspaper mistakenly published his obituary with the headline "The Merchant of Death is Dead," thinking he had passed away. The article criticized Nobel for profiting from the invention of explosives. This incident is said to have influenced Nobel’s decision to leave the bulk of his fortune to establish the Nobel Prizes, including the Nobel Peace Prize.
Modern Explosives: While dynamite was eventually surpassed by more powerful and safer explosives, such as TNT (trinitrotoluene), it remains a symbol of the industrial age and a key innovation in the history of explosives.
Conclusion
The invention of dynamite by Alfred Nobel was a groundbreaking development that significantly advanced the fields of construction, mining, and demolition. It facilitated the rapid expansion of infrastructure and industry in the 19th and early 20th centuries. However, the dual-use nature of dynamite—as both a tool for progress and a weapon of destruction—also led Nobel to consider the ethical implications of his work, ultimately inspiring the establishment of the Nobel Prizes.
-
The invention of the phonograph was a pivotal moment in the history of audio technology and music recording. Here's an overview of its invention:
Inventor:
The phonograph was invented by Thomas Edison in 1877. Edison, an American inventor known for his work with electricity and sound, came up with the idea while working on improvements to the telegraph and the telephone.
Concept and Development:
The phonograph was the first device capable of both recording and reproducing sound. Edison's initial idea was to record telegraph messages and then play them back, but this concept evolved into recording any sound, including the human voice.
The phonograph operated by inscribing sound waves onto a rotating cylinder wrapped in tin foil. A stylus (needle) attached to a diaphragm would move up and down as sound waves hit it, etching a groove into the cylinder. When the cylinder was rotated again, the stylus would retrace the groove, causing the diaphragm to vibrate and reproduce the recorded sound.
First Recorded Sound:
The first recording Edison made on his phonograph was a rendition of "Mary Had a Little Lamb." This moment was groundbreaking because it marked the first time sound could be recorded and played back.
Impact:
The phonograph revolutionized the way people interacted with sound and music. It allowed for the preservation of voices, music, and other sounds, laying the foundation for the modern music industry and audio recording.
Edison's phonograph led to the development of other sound recording and playback devices, including the gramophone, which used flat discs instead of cylinders. This evolution eventually gave rise to the modern record player.
Legacy:
The phonograph's invention is considered one of Thomas Edison's greatest achievements and a cornerstone in the development of audio technology. It opened the door to new forms of entertainment and communication, influencing everything from music to radio and eventually leading to the digital recording technologies we use today.
Further Innovations:
Emile Berliner later improved upon Edison's invention by creating the gramophone in the 1880s, which played flat discs rather than cylinders, a format that eventually became the industry standard.
Vinyl records, cassette tapes, CDs, and digital audio are all descendants of the phonograph, each representing advancements in the way sound is recorded and consumed.
The phonograph was not just a technological breakthrough but also a cultural milestone, forever changing how we experience music and sound.
-