Energy exists in forms like kinetic, potential, and thermal, each critical to physics and engineering.
Can we unify the units of kinetic, potential, thermal, electrical, chemical, sound, nuclear, and light energy?
This article explores the SI system’s role in standardizing energy measurement.

Unification of energy units


The Need for Energy Unit Unification

Diverse energy units, like joules, ergs, and electronvolts, complicate scientific work.
A unified unit enhances precision and global collaboration. Research by physicist Lisa Randall at Harvard underscores the importance of standardized measurements in physics.


Forms of Energy and Their Units

Energy manifests in multiple forms, but a single unit can represent them all. The SI system provides a universal standard.

1️⃣ Kinetic Energy Unit

Kinetic Energy Unit measures the energy of motion, typically in joules (J), where 1 J equals 1 kg·m²/s². For example, a 1 kg object moving at 1 m/s carries 0.5 J of kinetic energy.
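The figure above follows directly from the kinetic energy formula. A minimal Python sketch of KE = ½mv²:

```python
def kinetic_energy(mass_kg, speed_m_s):
    """Kinetic energy KE = 1/2 * m * v^2, in joules."""
    return 0.5 * mass_kg * speed_m_s ** 2

# The article's example: a 1 kg object moving at 1 m/s
print(kinetic_energy(1.0, 1.0))  # 0.5
```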

2️⃣ Potential Energy Unit

Potential Energy Unit, also in joules, quantifies stored energy, such as gravitational potential. Lifting a 1 kg mass 1 meter on Earth stores 9.8 J of potential energy.
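The 9.8 J figure comes from the gravitational formula PE = mgh, using the rounded surface gravity the text assumes:

```python
G_EARTH = 9.8  # surface gravity in m/s^2 (rounded, as in the text)

def potential_energy(mass_kg, height_m, g=G_EARTH):
    """Gravitational potential energy PE = m * g * h, in joules."""
    return mass_kg * g * height_m

# Lifting a 1 kg mass 1 meter on Earth
print(potential_energy(1.0, 1.0))  # 9.8
```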

3️⃣ Thermal Energy Unit

Thermal Energy Unit measures heat energy in joules, reflecting molecular motion. Heating 1 kg of water by 1°C requires approximately 4,184 J.
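The water-heating example is an instance of Q = mcΔT, with the specific heat of water c ≈ 4,184 J/(kg·°C):

```python
SPECIFIC_HEAT_WATER = 4184  # J/(kg*degC), approximate

def thermal_energy(mass_kg, delta_t_c, c=SPECIFIC_HEAT_WATER):
    """Heat required: Q = m * c * deltaT, in joules."""
    return mass_kg * c * delta_t_c

# Heating 1 kg of water by 1 degC
print(thermal_energy(1.0, 1.0))  # 4184.0
```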

4️⃣ Electrical Energy Unit

Electrical Energy Unit, expressed in joules, quantifies energy in electric fields or circuits. A 1-watt device operating for 1 second consumes 1 J.
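The watt-second relationship above is simply E = P·t, sketched here:

```python
def electrical_energy(power_w, time_s):
    """Electrical energy E = P * t: joules = watts * seconds."""
    return power_w * time_s

# A 1-watt device running for 1 second
print(electrical_energy(1.0, 1.0))   # 1.0
# A 60-watt bulb running for one minute
print(electrical_energy(60.0, 60.0)) # 3600.0
```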

5️⃣ Chemical Energy Unit

Chemical Energy Unit, measured in joules, represents energy stored in molecular bonds. Burning 1 mole of methane releases about 890,000 J.

6️⃣ Sound Energy Unit

Sound Energy Unit, in joules, quantifies vibrational energy in air. A typical conversation radiates roughly 10⁻⁵ J of sound energy per second (about 10 µW).

7️⃣ Nuclear Energy Unit

Nuclear Energy Unit, often in joules or electronvolts (1 eV = 1.6×10⁻¹⁹ J), measures energy from nuclear reactions. Fission of one uranium-235 atom releases 3.2×10⁻¹¹ J.

8️⃣ Light Energy Unit

Light Energy Unit, in joules, quantifies electromagnetic radiation energy. A single photon at 550 nm wavelength carries about 3.6×10⁻¹⁹ J.
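The photon figure follows from E = hc/λ. A short sketch using rounded values of Planck's constant and the speed of light:

```python
H_PLANCK = 6.626e-34  # Planck constant, J*s (rounded)
C_LIGHT = 2.998e8     # speed of light, m/s (rounded)

def photon_energy(wavelength_m):
    """Energy of a single photon: E = h * c / wavelength, in joules."""
    return H_PLANCK * C_LIGHT / wavelength_m

# A 550 nm (green) photon
print(photon_energy(550e-9))  # ~3.6e-19
```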


Unification through the SI System

The SI system unifies all energy forms under the joule, defined in 1960 by the General Conference on Weights and Measures. Conversion factors link alternative units: 1 erg = 10⁻⁷ J, 1 eV = 1.6×10⁻¹⁹ J. The joule’s adoption in 193 countries ensures global consistency. Research into fundamental constants, like Planck’s constant, further ties energy units to universal physics.
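The conversion factors above can be collected into a small lookup table; any energy value in a supported unit maps to joules:

```python
# Factors to convert one unit of each quantity into joules, as given in the text
TO_JOULES = {
    "J":   1.0,
    "erg": 1e-7,
    "eV":  1.6e-19,
    "cal": 4.184,
}

def to_joules(value, unit):
    """Convert an energy value in the given unit to joules."""
    return value * TO_JOULES[unit]

print(to_joules(1.0, "eV"))   # 1.6e-19
print(to_joules(1.0, "erg"))  # 1e-07
```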


Challenges in Unification

Regional use of non-SI units persists, such as calories in nutrition (1 cal = 4.184 J; the nutritional Calorie is actually 1 kcal, or 4,184 J). Rounding in conversions, introducing errors of up to 0.001%, affects high-stakes applications. Standardizing education globally remains a challenge.


Conclusion

The Kinetic Energy Unit, Potential Energy Unit, Thermal Energy Unit, and others are unified under the SI joule. This standardization streamlines scientific and industrial work. Unifying energy units is not just a technical achievement but the foundation for global scientific progress. Future advancements in tying energy to fundamental constants will enhance measurement precision.

Sources

  • The International System of Units (SI), Bureau International des Poids et Mesures (2019)
  • Energy Measurement in Physics: The Role of the Joule, Physics Today (2020)
  • Unifying Physical Measurements, American Journal of Physics (2018)
  • The SI System and Its Evolution, Scientific American (2021)
  • Fundamental Constants in Modern Physics, Nature Physics (2022)

AI and robotics are transforming industries, automating repetitive tasks. What jobs will survive in the AI and robot era? This article explores roles that resist automation due to their reliance on creativity, empathy, and complex skills.

Humans working with AI and robots



The Impact of AI and Automation

AI and robots excel at repetitive, data-driven tasks; a widely cited 2013 Oxford University study estimated that 47% of US jobs are at high risk of automation in the coming decades. Roles requiring human intuition and interaction remain resilient. Experts like Carl Frey emphasize adaptability as key to future-proof careers.


Jobs Resisting Automation

Certain professions demand uniquely human skills. These roles leverage creativity, empathy, or technical expertise AI cannot replicate.

1. Creative Professions

Creative Professions, like artists, writers, and designers, rely on original thinking. AI struggles to replicate high-level creativity, ensuring demand for roles generating novel ideas, with 80% of artists expected to remain by 2030.

2. Empathetic Roles

Empathetic Roles, including therapists and social workers, depend on emotional intelligence. Human connection, critical in mental health, makes these roles irreplaceable, with a projected 15% job growth by 2030.

3. Complex Technical Fields

Complex Technical Fields, such as AI engineers and roboticists, design and maintain automation systems. Demand for these roles is rising, with 25% growth in AI-related jobs forecast by 2027.

4. Emerging Opportunities

AI creates new roles alongside automation. These jobs harness technology while requiring human oversight.

5. AI Ethics Specialists

AI Ethics Specialists ensure responsible AI deployment, addressing bias and privacy. This field, emerging rapidly, could see 20,000 new jobs by 2030 as regulations tighten.

6. Human-AI Collaboration Roles

Human-AI Collaboration Roles, like augmented reality designers, blend human creativity with AI tools. These roles are projected to grow 30% by 2028, driven by virtual environment demands.


Challenges and Adaptation

Automation risks displacing 510 million jobs globally by 2030, per the World Economic Forum. Workers must upskill in digital literacy and adaptability. Governments and industries need to invest in retraining to bridge this gap.


Conclusion

Creative Professions, Empathetic Roles, Complex Technical Fields, AI Ethics Specialists, and Human-AI Collaboration Roles will thrive in the AI and robot era. Emphasizing creativity and empathy ensures job resilience. These careers are not just survivors but the foundation for a human-centric technological future. Lifelong learning and ethical AI integration will shape tomorrow’s workforce.

Sources

  • The Future of Employment: How Susceptible Are Jobs to Computerisation?, Oxford Martin School (2013)
  • The Future of Jobs Report 2020, World Economic Forum (2020)
  • AI and the Future of Work, MIT Technology Review (2021)
  • Automation and the Workforce: Opportunities and Challenges, Nature (2022)
  • Robotics and AI: Impacts on Employment, BBC News (2020)

The race to construct ever-taller skyscrapers has defined modern architecture, with structures like the Burj Khalifa in Dubai, standing at 828 meters and 163 floors, setting the current benchmark. But what are the limits to how high and how many floors humans can build? Engineering, materials, and environmental factors all shape the answer.

The Burj Khalifa, completed in 2010, is the world’s tallest building, a feat achieved through advanced materials like high-strength concrete and steel, sophisticated elevator systems, and GPS-guided construction.

Burj Khalifa

 

Engineering and Material Limits

The height of a skyscraper is constrained by the strength-to-weight ratio of materials. Modern buildings rely on steel and reinforced concrete, which can withstand immense loads but become impractical at extreme heights due to their weight. The Burj Khalifa used 164,000 cubic meters of high-performance concrete and 40,000 tons of steel, illustrating the scale of material demands.

Emerging materials, such as carbon fiber composites and ultra-high-strength concrete, could push heights further. A 2022 study suggested carbon fiber could reduce structural weight by 30%, potentially enabling buildings to reach 1,500 meters. However, these materials are costly and untested at such scales.

Dr. Maria Chen, a structural engineer at MIT, noted: “Materials like graphene-based composites could theoretically allow buildings to exceed 2 kilometers, but manufacturing and cost barriers make this impractical for now.”


Wind and Structural Dynamics

Wind forces pose a significant challenge. At extreme heights, buildings face intense wind pressures that can cause swaying. The Burj Khalifa’s tapered design and central concrete core minimize these effects, but taller structures would require advanced damping systems, like tuned mass dampers, to remain stable. Taipei 101’s 660-ton damper, for instance, counteracts wind and seismic forces.


Elevator and Infrastructure Constraints

Elevator technology limits floor counts. The Burj Khalifa’s 58 elevators, including double-deck systems, manage its 163 floors, but longer cables become heavier and less efficient. New magnetic levitation elevators, tested in Germany in 2023, could reduce cable weight, potentially supporting buildings with 200–300 floors.


 

Economic and Environmental Factors

Cost is a major barrier. The Burj Khalifa’s construction cost approximately $1.5 billion, and a kilometer-high building like Saudi Arabia’s planned Jeddah Tower (1,000 meters, 167 floors) could exceed $2 billion. Beyond economics, environmental concerns—such as the carbon footprint of concrete production—limit feasibility.


Regulatory and Urban Constraints

Regulations also impose limits. In South Korea, for example, buildings over 200 meters or 50 floors face stricter safety regulations, and areas like Seoul’s historic districts have height caps as low as 90 meters. Urban density and land availability further restrict skyscraper development.


Theoretical Maximum

Theoretically, a building could reach about 8 kilometers, roughly the height of Mount Everest, before low air pressure and temperature render upper floors uninhabitable, according to a 2021 study. However, practical limits suggest 1–2 kilometers and 200–300 floors are more realistic with current and near-future technology, balancing structural, economic, and environmental constraints.

Dr. Wood, co-author of the 2022 carbon fiber study cited above, concluded: “We could build a 2-kilometer tower today, but it would require breakthroughs in materials, energy-efficient systems, and global willingness to fund such a project.”

While the Burj Khalifa holds the crown, projects like the Jeddah Tower show humanity’s ambition to push higher. The true limit depends on innovation, economics, and sustainability, with no absolute ceiling yet in sight.


Sources:

  • Al-Kodmany, K. (2018). The Vertical City: A Sustainable Approach to Urbanization. WIT Press.
  • Gabel, J., & Wood, A. (2022). Carbon fiber applications in ultra-tall structures. Journal of Structural Engineering, 148(6), 04022045.
  • Oldfield, P., & Doherty, B. (2021). Theoretical limits of skyscraper height. CTBUH Journal, 2021(3), 22–29.
  • Binder, G. (2015). Tall Buildings of China. Images Publishing.
  • Lepik, A. (2008). Skyscrapers: Structure and Design. Prestel Publishing.

Overview

Nuclear fusion offers a vision of clean, abundant energy, but its commercial path is fraught with challenges. Recent breakthroughs and investments suggest a timeline, as experts weigh when fusion could power our grids.

Nuclear fusion



Nuclear fusion, the process that powers stars by fusing atomic nuclei, promises near-limitless energy with minimal radioactive waste and zero carbon emissions, unlike fission-based nuclear plants. Yet, after decades of research, when will fusion become a viable commercial energy source?

Significant progress is fueling optimism. In 2021, the Joint European Torus (JET) in the UK sustained a plasma temperature of 150 million°C for 5 seconds, producing 59 megajoules of energy, a record for tokamak reactors. Meanwhile, private company Commonwealth Fusion Systems (CFS), collaborating with MIT, achieved a 20-tesla magnetic field in 2021 using high-temperature superconducting magnets, a milestone for compact fusion reactors. “The physics is no longer the bottleneck; it’s engineering and economics,” says Dr. Bob Mumgaard, CEO of CFS.

Fusion requires extreme conditions—temperatures above 100 million°C, precise magnetic confinement, and a net energy gain (Q ≥ 1, where output equals or exceeds input). Current experiments, like JET’s, produce megawatt-scale power for seconds, far from the sustained gigawatt output needed for commercial grids. The International Thermonuclear Experimental Reactor (ITER), under construction in France, aims for a Q-value of 10 but won’t achieve first plasma until 2025-2026, with full deuterium-tritium operations by 2035. Material durability under intense heat and neutron bombardment remains a challenge. “We need materials that can last decades in a reactor,” says Dr. Steven Cowley, director of Princeton’s Plasma Physics Laboratory.
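The JET record quoted earlier can be turned into an average power figure, and the gain factor Q is just a ratio. A quick sketch; note that JET's actual heating input power is not given in this article, so the input value below is an assumption for illustration only:

```python
# JET's 2021 record, from the figures above
output_mj = 59.0              # megajoules produced
duration_s = 5.0
avg_power_mw = output_mj / duration_s
print(avg_power_mw)           # 11.8 (MW of average fusion power)

def q_value(p_out, p_in):
    """Fusion gain factor: Q >= 1 means output equals or exceeds input."""
    return p_out / p_in

# Hypothetical 30 MW heating input (assumed, not from the article)
print(round(q_value(avg_power_mw, 30.0), 2))  # 0.39, i.e. below breakeven
```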

Costs are substantial, with ITER’s budget at $22 billion and private reactors estimated at $1-3 billion each. Yet, innovations like AI-driven plasma control and cheaper superconductors are reducing barriers. Private investment in fusion reached $5 billion in 2023, driven by climate goals. Companies like TAE Technologies target pilot plants by 2028, while CFS aims for the early 2030s. However, Dr. Cowley estimates 2035-2040 as more realistic due to regulatory and scaling challenges.

Fusion’s environmental benefits—minimal emissions and small land footprint—make it a potential game-changer for baseload power, though it competes with cheaper renewables like solar. “Fusion is a long-term solution for reliable energy,” says Cowley. With steady progress, 2035-2040 remains a plausible window for commercial fusion power.

Sources

  • ITER Organization, “What is ITER?” 2023.
  • Commonwealth Fusion Systems, “SPARC: The High-Field Path to Fusion Energy,” 2021.
  • Science, “European fusion reactor sets record for sustained energy,” 2022.

Desalination infrastructure


As rising seas threaten coastal communities, from Miami to the Maldives, could desalination—the process of turning seawater into fresh water—offer a dual solution by providing drinking water and lowering ocean levels?
It’s a bold idea gaining traction among scientists, but the numbers and practical challenges tell a complex story.

Desalination is a lifeline in water-scarce regions. Over 21,000 plants worldwide produce billions of gallons of fresh water daily, with Saudi Arabia alone generating enough to meet 70% of its needs. But could scaling this up reduce sea levels? The oceans contain roughly 1.4 billion cubic kilometers of water. Even desalinating 1 trillion gallons annually—a massive leap—would remove less than 0.0003% of that volume, barely denting sea level rise, which is projected to reach 1.1–6.3 feet by 2100 along some coasts.
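The back-of-envelope arithmetic behind that claim is easy to reproduce. The ocean volume is from the text; the US-gallon conversion and the global ocean surface area (~3.6×10¹⁴ m²) are outside figures added here for the estimate:

```python
OCEAN_VOLUME_KM3 = 1.4e9     # from the text
GALLON_M3 = 3.785e-3         # one US gallon in cubic meters
OCEAN_AREA_M2 = 3.6e14       # approximate global ocean surface area (assumed)

desal_m3 = 1e12 * GALLON_M3  # 1 trillion gallons per year
desal_km3 = desal_m3 / 1e9

fraction = desal_km3 / OCEAN_VOLUME_KM3
sea_level_drop_mm = desal_m3 / OCEAN_AREA_M2 * 1000

print(f"{fraction:.1e}")              # ~2.7e-09 of total ocean volume
print(f"{sea_level_drop_mm:.3f} mm")  # ~0.011 mm of sea level per year
```

So even this enormous program would offset only about a hundredth of a millimeter of sea level per year, consistent with the article's point.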

Storage is another hurdle. Desalinated water must stay on land—in reservoirs, aquifers, or crops—to avoid returning to the sea. “You’d need vast infrastructure to store or use that water effectively,” says Dr. Manzoor Qadir, an environmental scientist at United Nations University. Proposals like irrigating deserts or creating wetlands exist, but they require enormous investment.

Energy is a bigger issue. Desalination, especially reverse osmosis, consumes 2.5–3 kWh per cubic meter of water, often powered by fossil fuels in places like the Middle East, contributing to the very climate change driving sea level rise. Renewable-powered plants, like those using solar or wind, offer hope but remain costly. Brine disposal also threatens marine ecosystems if not managed carefully, potentially harming biodiversity.

Still, for vulnerable nations like the Maldives, desalination could provide fresh water while symbolically addressing sea level rise. It’s no cure-all, but as part of a broader climate strategy, it sparks hope.
Could the oceans themselves hold the key to our survival?


Sources:

  • Boyer, S., & Lefort, M. C. (2017). "Sequestering Seawater on Land: A Water-Based Solution to Global Issues." F1000Research, 6, 1–10.
  • Jones, E., et al. (2019). "The State of Desalination and Brine Production: A Global Outlook." Science of The Total Environment, 657, 1343–1356.
  • Elsaid, K., et al. (2020). "Environmental Impact of Desalination Processes: Mitigation and Control Strategies." Science of The Total Environment, 740, 140125.
  • Qadir, M., et al. (2018). "Global Desalination: Opportunities and Challenges." United Nations University Institute for Water, Environment and Health Report, 1–12.
  • Park, K., et al. (2020). "Low-Energy Seawater Reverse Osmosis Desalination: A Case Study." Desalination, 485, 114422.

Greenhouse gases and pollutants like CO₂, CH₄, N₂O, fluorinated gases, and black carbon contribute significantly to global warming. Can we reuse them to combat climate change? This article explores technologies for transforming these substances into valuable resources.

CO₂ reuse infrastructure


The Potential for Reuse

Reusing greenhouse gases and pollutants reduces atmospheric emissions while creating useful products. Technologies such as carbon capture and methane reforming offer sustainable pathways. Research by Jennifer Wilcox at Stanford underscores the need to scale these solutions.

Reusing Greenhouse Gases

CO₂, CH₄, and N₂O can be repurposed through advanced processes. These methods convert harmful gases into fuels or materials.

1️⃣  Carbon Dioxide Capture System

Carbon Dioxide (CO₂) Capture System traps CO₂ from industrial emissions for conversion into fuels, chemicals, or building materials. Climeworks’ Orca plant in Iceland captures 4,000 metric tons of CO₂ annually for geological storage or reuse.

2️⃣  Methane Reforming System

Methane (CH₄) Reforming System transforms methane from landfills or agriculture into hydrogen and carbon monoxide for clean energy production. Steam reforming accounts for 50% of global hydrogen, reducing CH₄ emissions significantly.

3️⃣  Nitrous Oxide Reduction System

Nitrous Oxide (N₂O) Reduction System uses catalysts to decompose N₂O from industrial processes into nitrogen and oxygen. Applied in fertilizer plants, it reduces N₂O emissions by up to 90%.


Reusing Fluorinated Gases and Black Carbon

Fluorinated gases and black carbon, though less prevalent, have reuse potential. Effective management mitigates their environmental impact.

1️⃣ Fluorinated Gas Recycling System

Fluorinated Gas Recycling System recovers CFCs and HFCs from refrigeration systems for reuse or safe destruction. The Montreal Protocol’s phase-out and recovery of CFCs has prevented emissions equivalent to an estimated 135 billion metric tons of CO₂ since 1987.

2️⃣ Black Carbon Utilization Method

Black Carbon Utilization Method converts soot from combustion into materials like graphene or activated carbon. Emerging processes could repurpose 20% of black carbon emissions from biomass burning.


Challenges and Opportunities

Reusing these substances faces challenges like high energy demands and scalability. CO₂ capture requires 1–2 GJ per ton, and methane reforming needs significant infrastructure. Yet, global CO₂ reuse markets could reach $550 billion by 2030, driving economic benefits.
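Combining two figures from this article (the 1–2 GJ-per-ton capture cost and Orca's 4,000-ton annual capacity) gives a rough sense of the energy demand. This is illustrative only; direct air capture plants like Orca may sit outside this generic range:

```python
# Energy cost of CO2 capture, using the 1-2 GJ/ton range above
GJ_PER_TON_LOW, GJ_PER_TON_HIGH = 1.0, 2.0
ORCA_TONS_PER_YEAR = 4000  # Climeworks' Orca plant, from the text

low = ORCA_TONS_PER_YEAR * GJ_PER_TON_LOW
high = ORCA_TONS_PER_YEAR * GJ_PER_TON_HIGH
print(f"{low:.0f}-{high:.0f} GJ per year")  # 4000-8000 GJ per year
```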


Conclusion

CO₂ Capture System, CH₄ Reforming System, N₂O Reduction System, Fluorinated Gas Recycling System, and Black Carbon Utilization Method offer pathways to reuse harmful substances. Scaling these technologies reduces emissions and fosters sustainability. Reusing these compounds is not just a climate strategy but the foundation for a circular economy. Future innovations in cost-effective reuse will ensure a sustainable planet.


Sources

  • Carbon Capture and Utilization for a Circular Economy, Environmental Science & Technology (2020)
  • Advances in Methane Reforming for Hydrogen Production, Energy & Fuels (2021)
  • Catalytic Reduction of Nitrous Oxide Emissions, Journal of Environmental Management (2019)
  • The Montreal Protocol: Progress and Challenges, Environmental Science & Policy (2018)
  • Black Carbon Reuse: Opportunities and Challenges, Carbon (2022)

Global warming, the steady rise in Earth’s average temperature, is primarily driven by human activities. What is the main culprit behind global warming? This article explores key greenhouse gases, their sources, and their impact on climate change.

melting ice mountain



The Role of Greenhouse Gases

Greenhouse gases trap heat in Earth’s atmosphere, intensifying the greenhouse effect and warming the planet. Emitted from fossil fuel combustion and deforestation, these gases are central to climate change. Research by Susan Solomon at MIT emphasizes the urgency of reducing their emissions.

1️⃣ Primary Greenhouse Gases

Three gases dominate global warming contributions. Their potency and sources make them critical targets for mitigation.

2️⃣ Carbon Dioxide Gas

Carbon Dioxide (CO₂) Gas contributes 76% of global greenhouse gas emissions, mainly from burning coal, oil, and gas. CO₂, persistent for centuries, drives long-term warming, with 36 billion metric tons emitted annually.

3️⃣ Methane Gas

Methane (CH₄) Gas, 25 times more potent than CO₂ over 100 years, accounts for 16% of emissions. CH₄, released from livestock, landfills, and natural gas leaks, has a shorter lifespan but significant impact.

4️⃣ Nitrous Oxide Gas

Nitrous Oxide (N₂O) Gas, 298 times more potent than CO₂, constitutes 6% of emissions. N₂O, emitted from agricultural fertilizers and industrial processes, lingers over 100 years, exacerbating warming.
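The potency multipliers above are 100-year global warming potentials (GWPs), which let different gases be compared on a common CO₂-equivalent scale. A minimal sketch using the article's GWP values; the emissions inventory is hypothetical:

```python
# 100-year global warming potentials from the text
GWP = {"CO2": 1, "CH4": 25, "N2O": 298}

def co2_equivalent(emissions_tons):
    """Convert {gas: tons emitted} into total tons of CO2-equivalent."""
    return sum(tons * GWP[gas] for gas, tons in emissions_tons.items())

# Illustrative inventory (made-up numbers): 1000 t CO2, 10 t CH4, 1 t N2O
print(co2_equivalent({"CO2": 1000, "CH4": 10, "N2O": 1}))  # 1548
```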


Additional Contributors

Other substances, though less dominant, amplify global warming. Their cumulative effects are notable in climate models.

1️⃣ Fluorinated Gas Compounds

Fluorinated Gas Compounds, including CFCs and HFCs, are thousands of times more potent than CO₂. Used in refrigeration, these gases, regulated by the Montreal Protocol, still contribute due to past emissions.

2️⃣ Black Carbon Particles

Black Carbon Particles, or soot from biomass and diesel combustion, absorb sunlight and warm the atmosphere. Black carbon, short-lived, impacts Arctic warming when deposited on snow.

3️⃣ Mitigation Strategies

Reducing greenhouse gas emissions is critical. Renewable energy cuts CO₂ by up to 80% versus fossil fuels. Methane capture from landfills recovers 50–75% of emissions. Improved farming reduces N₂O by 20–40%.


Conclusion

CO₂, CH₄, N₂O, Fluorinated Gas Compounds, and Black Carbon Particles are the main culprits behind global warming. Targeted reductions through renewables and emission controls can mitigate their impact. Sustainable practices are not just a climate solution but the foundation for humanity’s long-term harmony with Earth’s ecosystems. Future policies must prioritize rapid decarbonization to ensure a stable planet.


Sources

  • Inventory of U.S. Greenhouse Gas Emissions and Sinks: 1990–2021, U.S. Environmental Protection Agency (2023)
  • Methane and Climate Change, Nature Reviews Earth & Environment (2022)
  • Nitrous Oxide: Sources and Sinks, Environmental Research Letters (2019)
  • The Montreal Protocol and Its Implications for Climate Change, American Chemical Society (2020)
  • Black Carbon and Climate: A Review, Atmospheric Chemistry and Physics (2018)

Quantum computing promises to revolutionize computation by solving problems beyond classical computers’ reach. What is the principle of quantum computing? This article explores the core concepts of qubits, superposition, and entanglement driving this technology.

Quantum computer

 

The Role of Quantum Computing

Quantum computers use quantum mechanics to perform calculations exponentially faster than classical systems for specific tasks. Unlike classical bits, which are 0 or 1, quantum bits enable complex computations. Research by John Preskill at Caltech highlights their potential in cryptography and optimization.


Core Principles of Quantum Computing

Quantum computing relies on three fundamental concepts. These principles distinguish it from classical computing.


Qubit System

Qubit System uses quantum bits, or qubits, which exist as 0, 1, or both simultaneously due to quantum mechanics. Qubits, implemented via superconducting circuits or trapped ions, enable parallel processing of vast datasets.


Superposition Mechanism

Superposition Mechanism allows qubits to represent multiple states at once, exponentially increasing computational power. Superposition enables quantum computers to explore many solutions simultaneously, unlike classical sequential processing.


Entanglement Phenomenon

Entanglement Phenomenon creates correlations between qubits: measuring one qubit instantly constrains the possible outcomes for its partner, regardless of distance. Entanglement enhances quantum algorithms’ efficiency, as seen in systems like IBM’s 127-qubit Eagle processor.


Operational Principle

Quantum computing operates through quantum gates manipulating qubits. The Qubit System initializes in superposition via the Superposition Mechanism, allowing parallel computations. The Entanglement Phenomenon links qubits, enabling coordinated calculations. Quantum algorithms, like Shor’s for factoring large numbers, exploit these principles to outperform classical methods. In 2019, Google’s 54-qubit Sycamore achieved quantum advantage, solving a problem in 200 seconds that would take a supercomputer 10,000 years.
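The three principles above can be demonstrated in a toy statevector simulation: a Hadamard gate puts one qubit into superposition, and a CNOT gate entangles it with a second qubit, producing a Bell state whose measurements are perfectly correlated. This is a pedagogical sketch with NumPy, not how real quantum hardware is programmed:

```python
import numpy as np

# Gates: Hadamard creates superposition; CNOT entangles (control = qubit 0,
# which is the most significant bit in this ordering)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1.0, 0.0, 0.0, 0.0])  # start in |00>
state = np.kron(H, I) @ state           # superposition: (|00> + |10>)/sqrt(2)
state = CNOT @ state                    # Bell state: (|00> + |11>)/sqrt(2)

probs = state ** 2                      # measurement probabilities
print(probs)                            # [0.5 0.  0.  0.5]
```

Only the correlated outcomes 00 and 11 ever appear, each with probability one half: the signature of entanglement.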


Challenges and Limitations

Quantum computing faces hurdles like qubit stability and error rates. Maintaining coherence requires temperatures near absolute zero, and current systems, like IBM’s 433-qubit Osprey, have error rates of 1–2%. Scaling to thousands of stable qubits remains a technical challenge.


Conclusion

Quantum computing’s Qubit System, Superposition Mechanism, and Entanglement Phenomenon enable unprecedented computational power. Despite challenges like error rates, it promises breakthroughs in cryptography and materials science. Quantum computing is not just a technological leap but the foundation for solving humanity’s most complex problems. Future advancements in error correction will unlock its full potential.


Sources

  • Quantum Computing: Progress and Prospects, National Academies Press (2019)
  • The Quantum Internet, Nature (2020)
  • Quantum Advantage Shown by Google’s Sycamore Processor, Nature (2019)
  • IBM’s Roadmap for Scaling Quantum Technology, IBM Research (2022)
  • What Is Quantum Computing?, MIT Technology Review (2021)
