New Technologies New Risks

HDIAC Journal



Technology is advancing at such a rapid pace that it is becoming increasingly difficult to assess the safety and efficacy of each technology before implementation. Furthermore, regulatory oversight and the law, which have traditionally lagged behind technology, are slipping even further behind this wave of technological development. New technologies that are disruptive, untested and not well-established are referred to as emerging. There are many ways to define an emerging technology, but in general, an emerging technology is one that has the capacity to change the status quo. No matter the definition, all emerging technologies share five key attributes as defined by Rotolo et al.:

  • Radical novelty
  • Relatively fast growth
  • Coherence
  • Prominent impact
  • Uncertainty and ambiguity [1]

Figure 1: A representation of the six key groups of the most prominent emerging technologies. (Released)

It is the combination of these attributes that not only makes an emerging technology so attractive for positive societal change, but also generates an environment for potential concern. Apart from environmental, health and safety issues, the concept of emerging technologies has generated apprehension, and in some cases fear, among military leaders and security experts. NATO has evaluated the impact of emerging technologies on arms control, [2] and a review of U.S. Naval strategy warned that the impact of emerging technologies on mission capabilities is ambiguous. [3] In fact, the latest U.S. National Military Strategy goes so far as to state that, “emerging technologies are impacting the calculus of deterrence and conflict management by increasing uncertainty and compressing decision space.” [4]

The possibilities of nefarious use and security breaches emanating from emerging technologies have always been cause for alarm. In fact, some of the most notable technological developments in the 20th century (which were emerging in their time) still plague us today: nuclear fission, the internet and rocketry. These technologies have not just opened us up to new worlds of energy production, communication and travel, but also created fear and panic related to the advent of nuclear weapons, social media propaganda and cyber threats.

The Possibilities for New Threats

Each year the World Economic Forum releases a list of the top 10 emerging technologies to watch during the upcoming year. Over the past five years, six key groups of technologies have emerged as the most prominent (see Table 1) and are represented in Figure 1.

Aside from being new and not well-established, the concern surrounding many of these technologies is further complicated by the fact that they are considered dual-use technologies, meaning that most legitimate and beneficial uses of the technologies also bring darker, more sinister applications. The challenges associated with dual-use technologies are threefold:

  • Complexity – Many variables with unknown or unknowable consequences
  • Uncertainty – Inability to produce usable risk versus benefit assessments
  • Ambiguity – Often involve conflicts over ethical and professional values [11]

Many of these technologies, if not all, are also being adopted by militaries for use in weapons systems, leading to a resurgence not only in proliferation but also in state-sponsored scientific research programs that are often used to drive militarization.

Table 1: The World Economic Forum’s Top 10 Emerging Technologies by year for 2012-2016 with technologies representative of six key groups. (Released) [58,59,60,61,62]


The emergence of new technologies, their expected use in military and terror tactics, and a general lack of awareness of potential consequences have created a new class of security risks that could defy countermeasures used for more classical types of threats, such as nuclear weapons. Nontraditional risks associated with technological innovation can be classified as emerging security challenges and may defy established security agendas. [12] Examples of risks associated with the emerging technologies described earlier can be found in Table 2. These technologies are the most rapidly developing and pervasive, and the potential risks they bring deserve further attention.

Intellectual Property Theft and Espionage

Theft of intellectual property and espionage are not new crimes; however, the type of information being smuggled or stolen is constantly changing. As emerging technologies develop and evolve, the competitive advantage and innovation they offer will become more sought-after, and their value will bring big rewards for some and great losses for others.

In 2014, Jianyu Huang, a former scientist at the Department of Energy, pleaded guilty to making a false statement and unlawfully transporting converted government property in interstate and foreign commerce. Huang, a naturalized U.S. citizen from China, was employed at the Center for Integrated Nanotechnologies, a user facility jointly operated by Sandia National Laboratories and Los Alamos National Laboratory. From 2009 to 2012, Huang made a series of trips to China on which he brought his government-issued laptop containing sensitive information regarding his research. He admitted to sharing the information on this laptop with businesses and universities throughout China and was sentenced to one year in prison. [13]

Five men in Taiwan were charged in July 2016 with stealing intellectual property from a nanotechnology company and setting up competing nanotechnology plants in China. The men are former employees of Hsin Fang Nano Technology Co., which developed a novel process for manufacturing nanopowders used in a variety of applications. They used the information they stole to create Unicat Nan Advanced Materials & Devices Technology Co. and operated it successfully for several years before being apprehended. The estimated financial loss to Hsin Fang is approximately US$82 million, along with 15 years of research. [14]

Apart from nanotechnology, other technologies, such as 3D printing and synthetic biology, pose new risks to IP, particularly in the patenting aspect. Two things are needed to operate a 3D printer: material and a design. The material is either a polymer composite or a metal, and the design is an electronic file (computer-aided design file) that can be developed on special software and even downloaded from the internet. Traditionally, product design has been protected because only the manufacturer has the specifications and the ability to create the item; however, 3D printers are changing that model. The cost of a small 3D printer is only a few hundred dollars, and because CAD files are digital, they can be shared across the internet on file-sharing services, just like movies and music. [15] It is now possible for someone to find the plan online for a product they want to create and simply download and print it on a desktop 3D printer at home.
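The "a design is just a digital file" point can be made concrete. The sketch below (illustrative only; the geometry and part name are invented) writes a single-triangle ASCII STL, a common 3D-printing exchange format, showing that a printable design is nothing more than plain text that can be copied and shared like any other file.

```python
# Minimal sketch: a 3D-printable design is just a digital file.
# This builds a one-triangle ASCII STL (a common 3D-printing
# exchange format); real CAD models contain thousands of facets.

def make_stl(name, facets):
    """Serialize a list of triangles into ASCII STL text."""
    lines = [f"solid {name}"]
    for v1, v2, v3 in facets:
        lines.append("  facet normal 0 0 1")  # normal left trivial for the sketch
        lines.append("    outer loop")
        for x, y, z in (v1, v2, v3):
            lines.append(f"      vertex {x} {y} {z}")
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append(f"endsolid {name}")
    return "\n".join(lines)

triangle = [((0, 0, 0), (1, 0, 0), (0, 1, 0))]
stl_text = make_stl("demo_part", triangle)

# Because the design is plain data, it can be copied and shared as
# easily as any other file -- the crux of the IP concern.
print(stl_text.splitlines()[0])  # -> solid demo_part
```

Nothing in the file ties it to a manufacturer, which is why design files circulate on file-sharing services as readily as music or movies.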

Somewhat more complex, but perhaps even more alarming, is the possibility of patent wars over biological innovations. In 2013, the U.S. Supreme Court ruled in Association for Molecular Pathology et al. v. Myriad Genetics that human genes are not patentable but synthetic DNA is patent eligible because it does not occur in nature. [16] Although the ruling limits developments in biotechnology related to gene sequencing, it also opens up many new possibilities, including a potential race to create new organisms and be first to market in order to reap the financial rewards.

Loss of Control and Sabotage

The connectivity of items to the internet, and the expanded use of electronics and code in general, has created a new venue for causing damage and destruction. Sabotage may be one of the oldest threats known to humanity, but technology is opening up completely new ways to carry it out. One example is through 3D printers that are connected to the internet, allowing for remote control. Hackers might be able to target these printers and secretly introduce catastrophic internal defects into the manufacturing process, change the printing orientation or insert fine defects that manifest at a later time. [17] This may not seem to be a major concern until the extreme consequences are considered. Imagine that thousands of parts used in engines and airplanes are covertly altered and unknowingly placed into service, only to fail later during operation, leading to multiple unexplained crashes. This scenario becomes more plausible as industrial 3D printing becomes widespread. [18] These fears apply even to medicine. In August 2015, Aprecia Pharmaceuticals received Food and Drug Administration approval for the first 3D printed drug, Spritam (levetiracetam), which is used to control seizures brought on by epilepsy. [19,20] If the files that drive the printing of the pills were altered in any way, the dosing and effectiveness could be manipulated, potentially compromising the health of anyone taking the drug.
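One commonly discussed mitigation, sketched here under the assumption that the design source publishes a trusted checksum (the file fragment and workflow below are hypothetical), is to verify a cryptographic hash of the print file before manufacturing, so that even a one-character alteration is detectable:

```python
# Hypothetical sketch: detect tampering in a print file before
# manufacturing by checking its cryptographic hash against a value
# published by the trusted design source. (Illustrative only; a
# production workflow would also sign the published hash.)

import hashlib

def sha256_of(data: bytes) -> str:
    """Hex digest used as the file's integrity fingerprint."""
    return hashlib.sha256(data).hexdigest()

original = b"G1 X10 Y10 Z0.2 E5"   # toy fragment of a machine-instruction file
trusted_hash = sha256_of(original) # value the design source would publish

# An attacker's subtle edit (layer height 0.2 -> 0.3) changes the hash.
tampered = b"G1 X10 Y10 Z0.3 E5"

print(sha256_of(original) == trusted_hash)  # True: untampered file verifies
print(sha256_of(tampered) == trusted_hash)  # False: alteration is caught
```

This catches silent file substitution, though it cannot detect firmware-level attacks on the printer itself.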

Table 2: Key emerging technologies, their associated risks, and some examples of concerns. (Released)


Looking beyond 3D printing, connectivity in general has generated new ways of wreaking havoc on systems. In October 2016, a large portion of the internet was threatened when hackers used the Internet of Things (IoT) to execute a distributed denial of service (DDoS) attack. [18] A DDoS occurs when a machine receives so many requests that it cannot keep up and crashes. What is notable in this case is that the hackers utilized a “Mirai” botnet that specifically targeted internet-connected devices, resulting in more than 500,000 devices being compromised and sending out signals simultaneously, bringing down part of the internet. [21]
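The mechanics of a DDoS can be sketched as a simple queueing thought experiment (all rates below are invented for illustration): a server drains requests at a fixed rate, and once arrivals exceed that rate the backlog grows without bound until the service fails.

```python
# Toy sketch of why a DDoS works: a server can drain only so many
# requests per second, so once attacker traffic pushes arrivals past
# that capacity the backlog grows without bound. Numbers are invented.

def backlog_over_time(arrival_rate, service_rate, seconds):
    """Queue length each second for a server with a fixed drain rate."""
    backlog, history = 0, []
    for _ in range(seconds):
        backlog = max(0, backlog + arrival_rate - service_rate)
        history.append(backlog)
    return history

normal = backlog_over_time(arrival_rate=900, service_rate=1000, seconds=10)
attack = backlog_over_time(arrival_rate=500_000, service_rate=1000, seconds=10)

print(normal[-1])  # -> 0: normal load drains completely each second
print(attack[-1])  # -> 4990000: requests pile up until the service fails
```

The asymmetry is the point: 500,000 compromised devices each sending modest traffic overwhelm a target that was sized for ordinary demand.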

More than household devices are connected to the internet. Many aspects of critical infrastructure, such as power plants and dams, are connected to the internet to provide remote and automated access, improving the efficiency of operations and reducing the number of hours a person needs to spend on-site. While this connectivity improves operational capacity, it also creates obvious security threats. For example, sensors are often placed on dams to measure water levels and automatically control flood gates if the height of water in the reservoir changes. It is possible to manipulate the sensors and make the dam’s system believe that the levels are too high (a process called spoofing), opening the flood gates and leading to an unplanned and potentially catastrophic release. A group of Iranian hackers gained access to the Bowman Dam in Rye, New York, in 2013. [22] They never gained access to the dam itself, but they did learn information about how the flood gates are controlled.
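One defense against this kind of spoofing can be sketched as follows (the threshold and readings below are invented): because reservoir levels cannot physically change faster than inflow allows, a simple rate-of-change plausibility check can flag suspect sensor readings before the control system acts on them.

```python
# Hedged sketch: a physical-plausibility check against spoofed sensor
# data. Reservoir levels cannot jump faster than inflow physics allows,
# so implausibly large changes are flagged for human review instead of
# triggering automatic gate actions. Threshold and data are invented.

def flag_implausible(readings, max_change_per_interval=0.5):
    """Return indices where the level jumps more than physics permits."""
    suspects = []
    for i in range(1, len(readings)):
        if abs(readings[i] - readings[i - 1]) > max_change_per_interval:
            suspects.append(i)
    return suspects

levels = [10.0, 10.1, 10.2, 14.0, 10.3]   # metres; 14.0 is a spoofed spike
print(flag_implausible(levels))  # -> [3, 4]
```

A real system would cross-check multiple independent sensors as well, but even this single-sensor sanity check defeats the naive spoofing scenario described above.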

Artificial intelligence is creating a unique set of risks, in particular one that has frightened humanity for decades: the threat of machines taking over. While the threat may not be imminent, there is some cause for alarm as both the private sector and the militaries of many nations push for more autonomous and intelligent machines. In an effort to keep tabs on the activities of semi-autonomous and even autonomous systems, the Department of Defense released DoD Directive 3000.09, prohibiting the creation or use of unmanned systems to “select and engage individual targets or specific target groups that have not been previously selected by an authorized human operator.” [23] This sort of protection becomes imperative when looking at a weapons platform such as the South Korean Super aEgis II, the first weapon system capable of recognizing and targeting humans. [24] However, the nature of machine learning in intelligent systems could eventually enable a system to figure out how to get around this failsafe, [25] and at some point, the human could be removed from the equation altogether. The Google Brain deep learning project developed three neural networks, named Alice, Bob and Eve, that passed each other notes using an encryption method they created themselves. [26] This is the first AI-generated, human-independent encryption.

A scenario in which a human is unable to gain or regain control of an autonomous, semi-autonomous or intelligent system is becoming more likely, and understanding what that would mean is essential. [27,28] “We believe strongly that humans should be the only ones to decide when to use lethal force. But when you’re under attack, especially at machine speeds, we want to have a machine that can protect us,” Deputy Defense Secretary Robert Work said. [29] In fact, this type of response has already been demonstrated: an AI-controlled drone defeated a pilot in a simulated dogfight. [30] As the DoD continues to develop its Third Offset Strategy [31] and push for more autonomous systems in which humans play a vital role, but not necessarily the lead role, preparing for risks such as losing control through hacking or intrinsic systems design will be paramount. Equally important will be preparing for attacks from intelligent systems that do not have the same “human in the loop” safeguards.

Terrorism and Weapons of Mass Destruction

In 1346, a Tartar (a subjugated people of the Mongol Empire) army laid siege to the city of Caffa on the Crimean peninsula. Sometime after the siege began, the Tartars were stricken with an outbreak of the plague. As the body count mounted, the Tartar commander decided to change the rules of the game and ordered plague-infected corpses to be catapulted into the city. Shortly after, an outbreak of plague erupted inside the walls of Caffa. This event has long been believed to have initiated the Black Death that killed one-fourth to one-third of Europe’s population in the 14th century. [32]

In 1978, a Bulgarian dissident named Georgi Markov was attacked with an umbrella in London. Unbeknownst to Markov, the umbrella had been modified to inject into his leg a pellet containing ricin, a highly toxic substance found in the castor bean. He died four days after the attack. [33]

In November 2006, a healthy man was admitted into a London hospital with severe gastrointestinal symptoms. He died approximately three weeks later. His death was later determined to be caused by poisoning from Polonium-210, a radioactive substance. [34]

The connection between all three of these events is that each was the first of its kind. No one thought these things could or would happen until they did. In many ways, these incidents set the precedent for what it means to be prepared to face emerging security risks. The emerging technologies discussed throughout this article have the potential to be used as new weapons of terror.

One of the hallmarks of emerging technologies is the uncertainty they naturally harbor, which, for obvious reasons, poses unique challenges in preventing attacks, as evidenced by a strange report from Russia. Svetlana Zheludeva, a deputy director at the Russian Academy of Sciences, died suddenly after opening a letter containing a white powder. The sender had sent similar letters with the white powder to other scientists, claiming the powder was a product of his research in nanotechnology; however, no other recipients of the letters died, and test results supposedly showed that the powder was ordinary quartz sand. Russian authorities have stated that there will be no criminal charges, and no other information about the case is available. [35]

Even more concerning than bizarre nanotechnology case reports are real instances of biotechnology and synthetic biology being used to create more virulent organisms and, in some cases, completely new ones. A report prepared by the United Nations Interregional Crime and Justice Research Institute (UNICRI) called attention to the possibility that beneficial applications of synthetic biology and biotechnology could be suitable for the creation of new or enhanced biological weapons and used by criminals or terrorists. [36] Gene editing techniques, such as clustered regularly-interspaced short palindromic repeats (CRISPR), have the potential to revolutionize medical treatments by altering faulty genes in patients suffering from rare or degenerative disorders; [37] however, CRISPR could also be used to create lethal mutated strains of organisms, and, in fact, CRISPR has already been used to modify human embryos. [38]

Over the past decade and a half, researchers have used genetic engineering techniques to develop hypervirulent strains of viruses in an effort to use that information to create more efficacious vaccines. However, the development of these new viruses, and the debate over whether the results of this research should be published openly, has created ethical and security challenges. To date, some of the more notable biotechnology research results include:

  • 100 percent fatal strains of mouse pox created [39,40]
  • Genetic characterization of the 1918 influenza pandemic (H1N1) [41]
  • Genomic characterization of the Plague that destroyed Europe in the 14th century [42]
  • More virulent strain of bird flu (H5N1) engineered [43,44]
  • Recreation of 1918 influenza strain (H1N1) using various strains of bird flu (H5N1) [45]

In September 2016, researchers in the United Kingdom created the world’s smallest virus, measuring 12 nm in diameter. [46] This was unprecedented: it was the first artificial virus ever created, and it opens up a new possibility for biological warfare. Armies or terrorists no longer have to aerosolize existing organisms; they can now create them. These artificial viruses have no known treatments because they have never existed before, which creates new challenges for developing countermeasures and anticipating threats.

Aside from chem-bio attacks, emerging technologies are changing the landscape for how traditional attacks could be carried out. The first 3D printed weapon was successfully fired in 2013, [47] and in August 2016, Transportation Security Administration officers at the Reno, Nevada, airport confiscated a loaded 3D printed handgun in a passenger carry-on bag. [48] More interesting is the fact that the U.S. military has an interest in new 3D printing technology that would allow soldiers in the field to potentially print entire missiles at once. [49] A platform like this could be a game-changer for terrorist organizations in terms of leveling the playing field regarding weapons production and availability.

Finally, new technologies bring risk not only through the potential to create new types of weapons; the concept of the technology itself can pose a threat. Since the early 2000s, a series of attacks and protests aimed at nanotechnology-themed activities and research centers has occurred across the world (Table 3). In many cases, these were harmless protests incited by students or local environmental groups. However, a failed attack on the IBM facility in Zurich and several planned attacks throughout Mexico were orchestrated by known terrorist organizations, possibly connected to a group known as the Olga Cell, part of an Italian-based terrorist organization known as the Federazione Anarchica Informale. [50]


Part of the challenge of managing emerging technologies is anticipating what, if anything, could go wrong. The other is understanding what to do to manage the fallout of an adverse event. Typically, policy and the rule of law are how most threats are mitigated. In fact, legislators and regulators have looked to existing legal frameworks to control emerging threats from technologies. For nanotechnology, biotechnology and synthetic biology, chemical and biological weapons bans and treaties have been deemed appropriate to control the proliferation of new threats. However, biological innovations have always seemed to garner more attention than others, starting with the Asilomar Conference in 1975 [51] and culminating with the recent announcement by the National Academy of Sciences and the DoD’s Office of the Deputy Assistant Secretary of Defense, Chemical and Biological Defense, of their intent to assess biodefense given new advances in synthetic biology. [52]

Table 3: Protests and terror incidents involving nanotechnology-related venues. [63] (Released)


The big problem is that laws are only as good as the people who choose to follow them. The key issue is how to mitigate the threats of emerging technologies across any and all environments. Foremost is promoting awareness not only of emerging technologies, but also of the potential and actual risks they invoke, particularly as everyone, including rogue states and violent non-state actors, gains access to the same technology. [52] Recommendations for addressing the risks of emerging technologies revolve around awareness, social responsibility and research:

  • Increase awareness of the dual role of emerging technologies and their unintended security consequences
  • Identify and address emerging security concerns now, even if the threat seems to be remote
  • Build emerging security concerns into education curricula and address them within national institutions
  • Increase dialogue around risk governance priorities between industry, academia and government
  • Increase funding and priority for research related to risk governance
  • Promote a culture of responsibility around innovation
  • Include applications of technology in war gaming scenarios used to train military leaders [52,53,54]

Some of these recommendations are already in practice. The FBI launched a program to identify suspicious activity that could be connected to nefarious uses of synthetic biology: the Synthetic Biology Tripwire Initiative, a system in which suppliers of biological products used in synthetic biology or biotechnology alert the FBI when odd requests are made. [55] DARPA announced in September 2016 that it was developing Safe Genes, a program to create a set of tools for addressing potential risks of gene editing. [56] UNICRI is in the process of opening the Center for Artificial Intelligence and Robotics, aimed at raising awareness of the risks and benefits of AI. [57] While these initiatives are a good start, more focus needs to be placed on other technologies, including training for first responders.


In many ways, the title of this article may be misleading. These are not necessarily new risks, but rather relatively familiar risks brought about in ways that have not been seen before. However, the fact that most of the world is unprepared to mitigate any one of these scenarios, let alone the possibility of several occurring at once, is disconcerting, to say the least. If the 20th century taught us anything, it is that science and technology can be both our greatest friend and our worst enemy. Few people want to think about the impossible or the unknowable, let alone spend money preparing for something that may or may not happen. However, it is always better to plan, because each of the events in this article has happened, and these are just the ones we know about.


  1. Rotolo, D., Hicks, D., & Martin, B. (2015). What is an Emerging Technology? SSRN Electronic Journal. doi:10.2139/ssrn.2564094
  2. Special Report: Emerging Technologies and Their Impact on Arms Control and Non-Proliferation, NATO Parliamentary Assembly committee report, October 2001.
  3. Council, N. R., Sciences, Division on Engineering and Physical, & Board, N. S. (2013). Responding to Capability Surprise: Strategy for U.S. Naval Forces. Washington: National Academies Press.
  4. 2015 National Military Strategy
  5. How 3D Printers Work. (n.d.). Retrieved from  (accessed January 23, 2017).
  6. Russell, S. J., & Norvig, P. (2009). Artificial intelligence: a modern approach. Upper Saddle River: Prentice-Hall.
  7. Nature (n.d.) Biotechnology. Retrieved from (accessed January 23, 2017).
  8. Chui, M., Loffler, M., & Roberts, R. (2010, March). The Internet of Things. Retrieved from  (accessed January 23, 2017).
  9. National Nanotechnology Initiative. What is Nanotechnology? Retrieved from (accessed January 23, 2017).
  10. Osbourn, A. E., O'Maille, P. E., Rosser, S. J. and Lindsey, K. (2012), Synthetic biology. New Phytol, 196: 671–677. doi:10.1111/j.1469-8137.2012.04374.x
  11. Edwards, B. (2014). Taking Stock of Security Concerns Related to Synthetic Biology in an Age of Responsible Innovation. Frontiers in Public Health, 2, 79. http://doi.org/10.3389/fpubh.2014.00079
  12. Emerging Security Challenges Working Group Policy Brief No. 1, Emerging Security Challenges: Issues and Options for Consideration, November 13, 2013.
  13. U.S. Attorney’s Office. District of New Mexico. (2014, August 25). Former Sandia Corporation Scientist Pleads Guilty to Taking Government Property to China. Retrieved from (accessed January 23, 2017).
  14. Pan, Jason. (2016, July 28). Prosecutors charge five with nanotechnology theft. Taipei Times. Retrieved from (accessed January 23, 2017).
  15. Holbrook, Timothy. (2016, January 6). How 3D printing threatens our patent system. The Conversation. Retrieved from (accessed January 23, 2017).
  16. Staff Reporter. (2013, June 13). US Supreme Court Strikes Down Gene Patents but Allows Patenting of Synthetic DNA. GenomeWeb. Retrieved from (accessed January 23, 2017).
  17. Zeltmann, S.E., Gupta, N., Tsoutsos, N.G. et al. Manufacturing and Security Challenges in 3D Printing. JOM (2016) 68: 1872. doi:10.1007/s11837-016-1937-7
  18. GE Global Research. (n.d.) 3D Printing Creates New Parts for Aircraft Engines. Retrieved from (accessed January 23, 2017).
  19. Spritam. (n.d.) Making Medicine Using 3D Printing. Retrieved from (accessed January 23, 2017).
  20. Wainwright, Oliver. (2015, August 5). The first 3D-printed pill opens up a world of downloadable medicine. The Guardian. Retrieved from (accessed January 23, 2017).
  21. Newman, Lily H. (2016, December 9). The Botnet that Broke the Internet Isn't Going Away. Retrieved from (accessed April 13, 2017).
  22. Federal Bureau of Investigation. (2016, March 24). International Cyber Crime: Iranians Charged with Hacking U.S. Financial Sector. Retrieved from (accessed January 23, 2017).
  23. Department of Defense. (2012, November 21). Department of Defense Directive Number 3000.09. Retrieved from (accessed January 30, 2017).
  24. Parkin, Simon. (2015, July 16). Killer robots: The soldiers that never sleep. BBC. Retrieved from (accessed January 23, 2017).
  25. Yudkowsky, Eliezer. 2008. “Artificial Intelligence as a Positive and Negative Factor in Global Risk.” In Global Catastrophic Risks, edited by Nick Bostrom and Milan M. Ćirković, 308–345. New York: Oxford University Press.
  26. Dalton, A. (2016, October 28). Google's AI created its own form of encryption. Engadget. Retrieved from (accessed January 26, 2017).
  27. Meuser, C. (2016, July). Frankenweapons loom on the horizon. Proceedings Magazine. Retrieved from (accessed January 26, 2017).
  28. Girrier, B. (2016, August). Unmanned: The new normal [Blog post]. Retrieved from (accessed January 26, 2017).
  29. Work, Bob. (2015, December 14). Proceedings from CNAS Defense Forum: Deputy Secretary of Defense Speech. Washington, DC Retrieved from (accessed January 26, 2017).
  30. Clark, C. (2016, August 8). Artificial Intelligence Drone Defeats Fighter Pilot: The Future? Breaking Defense. Retrieved from (accessed January 27, 2017).
  31. Pellerin, C. (2016, October 31). Deputy Secretary: Third Offset Strategy Bolsters America’s Military Deterrence. Retrieved from (accessed January 27, 2017).
  32. Wheelis, M. (2002, September). Biological Warfare at the 1346 Siege of Caffa. Emerging Infectious Diseases, 8(9), 971-975. Retrieved from (accessed January 27, 2017).
  33. Crompton, R., & Gall, D. (1980, June 1). Georgi Markov — Death in a Pellet. Medico-Legal Journal, 48(2), 51-62. Retrieved from (accessed January 27, 2017).
  34. McFee, R., & Leikin. (2009). Death by Polonium-210: Lessons learned from the murder of former Soviet spy Alexander Litvinenko. Response Guide for Chemical and Radiological Threats, 26(1), 18-23.
  35. Nanotechnology Now (2008, May). Daughter Suspects Poisoning in Death. Retrieved from (accessed January 27, 2017).
  36.  United Nations Interregional Crime and Justice Research Institute. (2012). Security Implications of Synthetic Biology and Nanobiotechnology: A Risk and Response Assessment of Advances in Biotechnology. Retrieved from (accessed January 27, 2017).
  37. Ledford, H. (2015, June 3). CRISPR, the disruptor. Nature, 522(7554). Retrieved from (accessed January 27, 2017).
  38. Liang, P., Xu, Y., Zhang, X., Ding, C., Huang, R., Zhang, Z., . . . Huang, J. (2015, May). CRISPR/Cas9-mediated gene editing in human tripronuclear zygotes. Protein and Cell, 6(5), 363-372. Retrieved from (accessed January 27, 2017).
  39. Broad, W. (2001, January 23). Australians Create a Deadly Mouse Virus. The New York Times. Retrieved from (accessed January 27, 2017).
  40. Broad, W. (2003, November 1). Bioterror Researchers Build A More Lethal Mousepox. The New York Times. Retrieved from (accessed January 27, 2017).
  41. Tumpey, T. M., Basler, C. F., Aguilar, P. V., Zeng, H., Solórzano, A., Swayne, D. E., . . .García-Sastre, A. (2005, October 7). Characterization of the Reconstructed 1918 Spanish Influenza Pandemic Virus. Science, 310(5745), 77-80. Retrieved from (accessed January 27, 2017).
  42. Bos, K. I., Schuenemann, V. J., Golding, B., Burbano, H. A., Waglechner, N., Coombes, B. K., . . .Krause, J. (2011, October 27). A draft genome of Yersinia pestis from victims of the Black Death. Nature, 478(7370), 506-510. Retrieved from (accessed January 27, 2017).
  43. Yong, E. (2012, June 21). Second mutant-flu paper published: Just five mutations allow H5N1 to spread between ferrets. Nature. Retrieved from (accessed January 27, 2017).
  44. Yong, E. (2012, May 3). Mutant-flu paper published: Controversial study shows how dangerous forms of avian influenza could evolve in the wild. Nature, 485(7396), 13-14. Retrieved from (accessed January 27, 2017).
  45. Wahlberg, D. (2014, June 11). UW-Madison scientist creates new flu virus in lab. Wisconsin State Journal. Retrieved from (accessed January 27, 2017).
  46. Noble, J. E., De Santis, E., Ravi, J., Lamarre, B., Castelletto, V., Mantell, J., . . .Ryadnov, M.G. (2016, September 1). A De Novo Virus-Like Topology for Synthetic Virions. Journal of the American Chemical Society, 138(37), 12202-12210. Retrieved from (accessed January 27, 2017).
  47. Prindle, D. (2013, May 8). First working 3D-printed handgun has politicians sweating bullets. Retrieved from (accessed January 27, 2017).
  48. Hodgkins, K. (2016, August 10). Passenger caught with loaded 3D-printed gun in carry- on luggage. Retrieved from (accessed January 27, 2017).
  49. Selby, G. (2015, September 27). 3-D Printing Devices Could Assist Service Members in the Field [Blog post]. Retrieved from (accessed January 27, 2017).
  50. Phillips, L. (2012, May 31). Anarchists attack science: Armed extremists are targeting nuclear and nanotechnology workers. Nature, 485(7400), 561. Retrieved from (accessed January 27, 2017).
  51. Berg, P., Baltimore, D., Brenner, S., Roblin, R. O., III, & Singer, M. F. (1975). Summary statement of the Asilomar Conference on recombinant DNA molecules. Proceedings of the National Academy of Sciences of the United States of America, 72(6), 1981-1984.
  52. Partnership for Peace Consortium. (2013, November 13). Emerging Security Challenges: Issues and Options for Consideration. Retrieved from (accessed January 30, 2017).
  53. Drzik, J. (2015, January 15). The genie of emerging technology. Retrieved from (accessed January 27, 2017).
  54. McGuinness, J. P. (2005, January 15). Nanotechnology: The next Industrial Revolution – Military and societal implications. Army Environmental Policy Institute, Arlington, Virginia. Retrieved from (accessed February 6, 2017).
  55. Taylor, M. (2016, October 13). FBI: We need to collaborate with industry on bioterrorism, WMD. Forensic Magazine. Retrieved from (accessed January 27, 2017).
  56. Defense Advanced Research Projects Agency. (2016, September 7). Setting a Safe Course for Gene Editing Research. Retrieved from (accessed January 27, 2017).
  57. Drzik, J. (2015, January 15). The genie of emerging technology. Retrieved from (accessed January 27, 2017).
  58. Global Agenda Council on Emerging Technologies. (2012, February 15). The top 10 emerging technologies for 2012. Retrieved from (accessed January 27, 2017).
  59. King, D. (2013, February 14). The top 10 emerging technologies for 2013. Retrieved from (accessed January 27, 2017).
  60. Afeyan, N. (2014, September 1). Top 10 emerging technologies for 2014. Retrieved from (accessed January 27, 2017).
  61. Meyerson, B. (2015, March 4). Top 10 emerging technologies of 2015. Retrieved from (accessed January 27, 2017).
  62. Cann, O. (2016, June 23). These are the top 10 emerging technologies of 2016. Retrieved from (accessed January 27, 2017).
  63. Berube, D. M., Frith, J., & Cummings, C. L. (n.d.). Nanoterrorism (Unpublished). North Carolina State University, Raleigh, North Carolina. Retrieved from (accessed February 6, 2017).

Gregory Nichols, MPH, CPH



Gregory Nichols is the scientific and technical advisor for HDIAC. Previously, he managed the Nanotechnology Studies Program at ORAU in Oak Ridge, Tennessee, where he provided expertise on nanotechnology-related topics and conducted research. Prior to ORAU, Nichols spent 10 years in various healthcare roles including five years as a Hospital Corpsman in the U.S. Navy. He has published and presented on a variety of topics including nanotechnology, public health and risk assessment. He has a bachelor’s degree in philosophy and a Master of Public Health degree, both from the University of Tennessee and holds the Certified in Public Health credential.


