International Space Station
First launched in 1998, the International Space Station (ISS) is an engineering marvel. Costing close to US $100 billion, its construction required over 100 rocket and shuttle launches, plus 160 spacewalks, and the components were not all in place until 2011. Russia, the US, Canada, the European Space Agency and Japan all contributed, in what became a new collaborative approach to space exploration. The ISS is larger than a six-bedroom house and its systems are controlled by 52 computers – 3.3 million lines of software code on the ground support the 1.8 million lines of flight software code. Covering the area of a football field, the station is made up of living quarters, laboratories and robotic arms, with each module connected by nodes. It is so large it can be seen with the naked eye from earth.
The main purpose of the station is to conduct research that cannot be carried out anywhere else, such as studying the effects of microgravity on the human body and investigating the universe around us. The robotic arms helped assemble the station module by module after each component was launched, and they now assist astronauts on spacewalks. The station gets its power from solar arrays attached to the edges of the craft. It was first boarded by a crew of three astronauts in 2000, and is now home to rotating crews of six astronauts, who typically live in orbit for four to six months at a time.
The Mars Opportunity Rover
NASA’s Opportunity Rover has been exploring Mars since 2004, despite originally being launched to fulfil a 90-day mission. The project was green-lit in 2000, and the rover was designed with seven instruments, including cameras, a miniature thermal emission spectrometer, magnets, a rock abrasion tool, a microscopic imager and an alpha particle X-ray spectrometer. These tools allow the rover to investigate the terrain and atmospheric composition, and send the data back to scientists. The fact that Opportunity is now in its 15th year of exploring Mars is remarkable in itself: it is now the longest-lived robot on another planet. Opportunity and its twin Spirit were originally sent to opposite sides of Mars to explore the terrain and search for signs of past water and potential habitats, in case Mars was once like Earth. Since then, Opportunity has driven more than 45 kilometres and sent more than 250,000 images back to earth. The mission has helped scientists collect evidence of ancient, habitable environments on Mars, giving them a better understanding of the planet.
The Mars Curiosity Rover and Sky Crane
In 2012, Curiosity successfully landed on Mars to explore the planet’s habitability on a long-term mission. It was the most sophisticated interplanetary probe ever created, with more than 7,000 scientists and engineers involved in its construction. One of the most challenging aspects of the mission was devising a way to land it: earlier rovers were cushioned inside airbags, which inflated during descent to protect the machines on landing. Curiosity is five times the size of Opportunity, and airbags that large would have been torn apart during descent. In a marathon brainstorming session, scientists and engineers settled on the idea of a Sky Crane that would hover above the surface and lower Curiosity down on a cable.
Fitted with the most advanced suite of scientific tools ever sent to Mars, Curiosity uses its built-in laboratory to analyse the composition of rocks and soil in the hope of detecting chemical building blocks of life, such as carbon. In a sample collected from its first drill, it found sulphur, nitrogen, hydrogen, oxygen, phosphorus and carbon, elements that could potentially support life. Curiosity is also monitoring Mars’ environment, including its weather and radiation levels.
Chandra X-Ray Observatory
One of NASA’s “Great Observatories,” the Chandra X-Ray Observatory was first launched in 1999. Its purpose is to produce full-colour images of objects emitting X-ray light, so scientists can develop a better understanding of our universe. In 2007, the telescope monitored a galaxy 240 million light years away and helped astronomers realise it contained a type of exploding star they had never seen before. The Chandra X-Ray Observatory circles the earth every two-and-a-half days; its elliptical orbit allows it to travel up to 140,000 kilometres away. It primarily collects data about black holes, supernovae, starburst galaxies and other exotic objects far from earth, and it can also study transient objects that unexpectedly cross its field of view.
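As a rough consistency check on those orbit figures, Kepler's third law links the quoted 2.5-day period to the size of the orbit. The perigee distance used below is an assumed round figure, not stated in the text:

```python
import math

# Kepler's third law: a^3 = GM * P^2 / (4 * pi^2), with a the semi-major axis.
GM_EARTH = 3.986004418e14   # m^3/s^2, Earth's standard gravitational parameter
period_s = 2.5 * 86400      # the quoted ~2.5-day orbital period, in seconds

semi_major_axis_km = (GM_EARTH * period_s**2 / (4 * math.pi**2)) ** (1 / 3) / 1000

# For an ellipse, perigee + apogee = 2a (both measured from Earth's centre).
# The perigee distance below is an assumed round figure.
perigee_km = 16_000
apogee_km = 2 * semi_major_axis_km - perigee_km

print(f"semi-major axis ≈ {semi_major_axis_km:,.0f} km")
print(f"apogee distance ≈ {apogee_km:,.0f} km")   # close to the quoted 140,000 km
```

A 2.5-day period implies a semi-major axis of roughly 78,000 km, which with a modest perigee puts the far point of the orbit near the 140,000 km quoted above.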
It has four pairs of mirrors aligned almost parallel to the path of incoming light, because X-rays are absorbed rather than reflected if they strike a mirror head-on. As the light grazes off the top mirror in each pair, it is deflected down to the secondary mirror; two bounces are required before an X-ray comes into focus. The X-ray then travels down an eight-metre-long tube towards the telescope’s scientific instruments, and the resulting data is sent back to earth to be analysed by scientists. The Chandra X-ray Observatory also contains devices called gratings, which can be used to filter the light and determine an object’s density and temperature, as well as its motion towards or away from the telescope.
Large Hadron Collider
The world’s largest and most powerful particle accelerator was first used in 2008, after scientists at CERN had spent over two decades developing the concept. It consists of a 27-kilometre ring of superconducting magnets, designed to boost the energy of particles as they pass through. It reuses the underground tunnel built for CERN’s previous large accelerator, the Large Electron-Positron Collider, which was dismantled in 2000. This is for environmental reasons, such as minimal disruption to the landscape and using the earth’s crust as protection from radiation, as well as economic ones: it’s more cost-effective than finding enough above-ground space to accommodate the machine.
The machine’s purpose is to help scientists discover more about the universe by smashing subatomic particles together at almost the speed of light. This allows them to confirm the existence of particles they’ve predicted, discover new ones, and possibly even create microscopic black holes. For example, the Large Hadron Collider led to the discovery of the Higgs boson, the last undiscovered particle predicted by the Standard Model of physics. Scientists hope the machine will also help them discover how many dimensions there actually are.
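To put “almost the speed of light” in numbers, the relativistic energy relation gives a proton’s speed from its energy. The 6.5 TeV figure below is illustrative (the per-beam energy of the collider’s second run), not stated in the text:

```python
import math

# Relativistic energy: E = gamma * m * c^2, so gamma = E / (m * c^2)
PROTON_REST_ENERGY_GEV = 0.938272   # proton rest-mass energy
beam_energy_gev = 6500.0            # illustrative: per-beam energy in the LHC's second run

gamma = beam_energy_gev / PROTON_REST_ENERGY_GEV   # Lorentz factor
beta = math.sqrt(1 - 1 / gamma**2)                 # speed as a fraction of c

print(f"Lorentz factor gamma ≈ {gamma:,.0f}")
print(f"v/c ≈ {beta:.10f}")
```

At that energy the Lorentz factor is close to 7,000 and the protons travel only about three metres per second slower than light itself.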
3D Printing
This technology was first developed in the mid-1980s; however, it has advanced considerably since then. At a very basic level, the process begins with designing an object to the correct dimensions in computer software. Spools of plastic filament are fed into the 3D printer and melted, and the object is then built up layer by layer in a process known as additive manufacturing. Many industrial-grade printers can also use materials other than plastic, such as ceramic, brass and steel.
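The layer-by-layer idea can be sketched in a few lines: slice the object’s height into layers, then emit one loop of print moves per layer. All dimensions below are made-up illustrative values:

```python
# Minimal slicing sketch: divide an object's height into layers and emit one
# square perimeter of print moves per layer. All values are illustrative.
LAYER_HEIGHT_MM = 0.2      # a common desktop-printer layer height (assumed)
object_height_mm = 10.0
side_mm = 20.0

num_layers = round(object_height_mm / LAYER_HEIGHT_MM)
moves = []
for layer in range(1, num_layers + 1):
    z = round(layer * LAYER_HEIGHT_MM, 2)
    # trace the four sides of a square at this height, returning to the start
    for x, y in [(0, 0), (side_mm, 0), (side_mm, side_mm), (0, side_mm), (0, 0)]:
        moves.append((x, y, z))

print(f"{num_layers} layers, {len(moves)} print moves")   # 50 layers, 250 print moves
```

A real slicer emits toolpaths like these as G-code, along with infill and temperature commands, but the principle is the same: a 3D shape becomes a long sequence of flat, stacked outlines.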
There are endless opportunities as this technology continues to develop and its availability becomes more widespread. When the concept of 3D printing first came about, it was known as rapid prototyping, because it was simply used to make creating models easier for engineers. Now we have the ability to construct buildings, machines and furniture with these printers, and scientists are looking into ways to engineer intricate food delicacies. There’s also the opportunity to generate biomaterials for the creation of organs and other body parts, something that would revolutionise the medical world. The technology has a few limitations, however. The printer’s size determines the object’s size: you cannot manufacture a house using a regular-sized printer. Additive manufacturing is also an incredibly slow process, which means mass production is not yet viable. The cost of the printers and materials remains high, and a desktop printer cannot yet match the surface finish of traditional industrial machining.
Computed Tomography
Computed Tomography (CT) is traditionally the process of using X-rays to examine a specific part of the human body. Invented in 1972, the machine works by targeting the area to be examined with a narrow X-ray beam, which is quickly rotated around the body to generate a cross-sectional image. These images, known as “slices,” are far more detailed than a conventional X-ray. Once a number of successive images have been taken, the slices can be combined to form a 3D image of the body. When the machine was first invented, taking these images was a slow process, one slice at a time. The newest machines can record up to 640 slices at a time; however, it’s more common for machines to record between four and 320.
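The speed difference between single-slice and multi-slice machines follows from simple arithmetic: a scanner recording N slices per rotation needs total_slices / N rotations to cover a region. The slice thickness and scan length below are assumed illustrative values:

```python
import math

# A scanner that records N slices per rotation needs total_slices / N
# rotations to cover a region. Slice thickness and scan length are assumed.
SLICE_THICKNESS_MM = 0.5
scan_length_mm = 160

total_slices = int(scan_length_mm / SLICE_THICKNESS_MM)   # 320 slices

rotations_needed = {n: math.ceil(total_slices / n) for n in (1, 4, 320)}
for n, rotations in rotations_needed.items():
    print(f"{n:>3} slice(s) per rotation -> {rotations} rotations")
```

Under these assumptions, a single-slice machine would need 320 rotations to image the region, a four-slice machine 80, and a 320-slice machine just one, which is why modern scans take seconds rather than minutes.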
However, technological advancements have allowed this machine to be used well beyond the medical field. Industrial Computed Tomography allows engineers to capture the precise dimensions of external and internal structures without destroying them. This opens the door to reverse engineering in a manufacturing setting, where the measurements of an existing object are determined to allow for mass production. It also allows for easier quality control, such as determining material porosity and ensuring internal contact points are made during assembly. There are still limitations, such as the cost and size of these machines; however, this technology has the potential to change the engineering world.
LightStrike Germ-Zapping Robot
In 2010, the LightStrike was launched commercially to combat the spread of hospital-acquired infections, which studies showed were a leading cause of death in the US and throughout the world. The robot uses pulsed xenon lamps to produce flashes of germicidal UV light that destroy bugs in hard-to-clean places. The high-intensity UV light penetrates the cell walls of micro-organisms and fuses their DNA, killing them without any physical contact. Different pathogens die at different wavelengths, so the machine can be programmed to combat specific diseases, such as the pathogens most commonly associated with hospital-acquired infections, Ebola, Anthrax, HIV and other communicable diseases. The disinfection cycle runs for approximately five minutes, and the robot can clean 64 rooms per day without overheating. The UV-C light it emits is hazardous to humans; however, the light cannot travel through walls or glass, so it’s only harmful to people who are in the room while it’s operating. As a safety measure, the robot is fitted with heat and movement sensors so it automatically turns off if a human enters the room.
Bladeless Wind Turbines
Wind is a major contributor to renewable energy generation, and reliance on it will only increase as fossil fuels continue to deplete. The most common way to harvest this resource is via wind turbines, which convert wind energy into electrical energy as the rotation of the blades drives a generator. The top-heavy nature of these machines means the blades require high-quality components to reduce the risk of structural damage, so the cost of producing traditional wind turbines is high.
A Spanish engineering start-up has developed bladeless turbines, which harness the vorticity of the air. When wind passes a cylindrical mast, it forms a spinning vortex that exerts force on the cylinder and causes it to oscillate. The kinetic energy of the oscillating cylinder is then converted into electricity via a linear generator. The lightweight design has no gears or bearings, and it can produce electricity for approximately 40 per cent less than traditional wind turbines. It also poses less of a danger to birds flying nearby. Currently, bladeless turbines are not as effective at generating electricity as traditional turbines; however, with time these machines may be refined to increase their efficiency.
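The vortex formation the design relies on follows a well-known relation: a cylinder of diameter D in a flow of speed U sheds vortices at frequency f = St·U/D, where the Strouhal number St is roughly 0.2 for a circular cylinder. The mast diameter and wind speed below are assumed illustrative values:

```python
# Vortex shedding behind a cylinder: f = St * U / D, where the Strouhal
# number St is roughly 0.2 for a circular cylinder over a wide range of
# flow conditions. Mast diameter and wind speed are assumed values.
STROUHAL = 0.2
wind_speed_m_s = 10.0
mast_diameter_m = 0.25

shedding_freq_hz = STROUHAL * wind_speed_m_s / mast_diameter_m
print(f"vortices shed at ≈ {shedding_freq_hz:.1f} Hz")   # 8.0 Hz
```

The mast extracts the most energy when this shedding frequency sits near the structure’s own natural oscillation frequency, so tuning the two to match across varying wind speeds is central to the design.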