Creating Trust: A Proposal for Effective Communication of Genome Editing

Hilary Gehin


Imagine if discovering you had cancer meant that all you had to do for treatment was schedule an appointment to undergo gene therapy. After one simple procedure, free from the painful side effects of chemotherapy, your cancer is eradicated.

This is a potential reality on the horizon of the medical field, one that could have a significant impact on global health. Considering that 39.6% of men and women will be diagnosed with cancer at some point in their lifetime (National Cancer Institute), finding a cure would significantly improve many people’s quality of life.

An area of current research aimed at targeting cancer is genome editing. Genome editing is the process of inserting, deleting, or replacing genes using nucleases (Wikipedia). Much current genome editing research specifically employs the CRISPR-Cas9 system. In this technique, guide RNAs based on clustered regularly interspaced short palindromic repeats (CRISPR) attach to a targeted DNA sequence and recruit the Cas9 endonuclease to cleave the DNA. Either the targeted gene is removed, or a new gene is inserted at the cleavage site (Hwang, 2013). Compared to other genome editing systems, CRISPR-Cas9 has the advantage of being easier to use and more precise in identifying cancer targets (National Cancer Institute).

With its potential implications, genome editing affects a wide range of interest groups: cancer patients, research institutes, and scientists are all impacted. This technology could create a huge industry, curing patients of ailments, cutting the health costs of expensive cancer treatments, and establishing funding for researchers working on genome editing technology.

As a rapidly advancing technique, genome editing is quickly seeping into public discourse. The most current news related to genome editing is the Human Genome Editing Summit, held in Washington, DC, December 1–3, 2015. The purpose of the meeting was to bring together experts to gauge the varying attitudes and concerns behind genome editing. Additionally, this gathering uncovered the most up-to-date perceived benefits and risks of genome editing. On the one hand, there was the opinion that genome editing has potential health benefits for targeting conditions such as HIV and blood diseases (Travis, 2015).

However, concerns were raised about using genome editing for cosmetic reasons. People expressed wariness over eugenics, or utilizing genome editing for enhancement. Additionally, there was some disagreement about editing germ line cells. One side argued that editing germ line cells could help eradicate inherited genetic disorders before conception, while the opposing side argued that, with much still unknown about how genes interact, this procedure might affect future generations in undetermined, potentially negative ways. In the US, editing germ line cells is currently not supported by the government, although there are no policies outlawing it (Saey, 2015). Another issue posed at the summit was the possibility of affecting non-targeted genes, creating a domino effect that could damage healthy genes that interact with the targeted gene (Travis, 2015).


Overall, the summit raised awareness of general public and expert perceptions of genome editing. The key concern implied is that the excitement surrounding genome editing could overshadow the potential ethical implications. While for now the purpose is therapeutic, there is worry that this technology will be used unethically in the future. In addition to the concerns posed at the summit, some scientists worry about “designer babies,” or that one day we will be able to choose the intelligence, sexual orientation, physical traits, and personality of fetuses in vitro (Christensen, 2015). This could also have a negative impact on the poor. For example, could we start discriminating against people who cannot afford to change their genes for cosmetic reasons?

The other key issue raised at the summit was that experts should seek public acceptance of genome editing while acknowledging public concerns. As US Congressman Bill Foster announced at the summit:

“For many people outside this room, including most members of Congress, CRISPR is still an unknown term…I believe that it’s important that the first side of CRISPR presented to the public is a positive one. CRISPR and related technologies have the potential to revolutionize the treatment of diseases but could be used in many ways not beneficial to society” (Travis, 2015).


Despite participants’ agreement that public acceptance is necessary as genome editing research moves forward, no guidelines were given to scientists on how they could achieve this goal. Drawing on knowledge of how people perceive and process risk, how technologies similar to genome editing have been evaluated by the public, and what makes an effective risk communicator, I will outline a plan for how experts can effectively communicate the risks and benefits of genome editing. The goal is to keep the public informed, involved, and confident in the experts.

What is Risk?

Before exploring how people perceive risk, it is important to define “risk.” Renn (1992) defines risk as containing three elements: undesirable outcomes, possibility of occurrence, and state of reality. In this case, undesirable outcomes would be the unknown effects of interacting genes and the ethical problems of editing germ line cells and creating designer babies. There are several ways risk can be evaluated. From a technical perspective, you can look at the estimated expected physical harm (unknown side effects of gene interactions). From an economic perspective, the expected utility is considered (how much will this cost? Will the costs of this process cut down on health care spending in the future?). It is important to define risk because the definition can impact the outcome of policy debates and the allocation of resources (Brossard, 2015). In the case of genome editing, defining the risks is crucial to creating policies that both enforce ethical standards and fund therapeutic applications of genome technology.

Risk can further be defined as a combination of hazard and outrage (Brossard, 2015). On one hand, risk can be perceived in terms of hazards, which are the measured and quantified likelihoods of a risky event occurring (Brossard, 2015). Experts commonly define risk in terms of hazards. In contrast, the public tends to form risk perceptions by focusing on outrage, or the perceived devastation an event would cause if it occurred (Brossard, 2015). It is crucial for experts to keep this in mind when communicating the risks and benefits of genome editing, knowing that the public will not necessarily form opinions based on data or statistics. A further definition of risk is as an event where the outcome is uncertain (Aven & Renn, 2009). By this definition, risk is an event where something of human value is at stake and the outcome is unknown. In this case, human safety and trust in science are at stake, and the outcomes of potential risks of genome editing are highly uncertain since the field is still in development.

Perception and Processing of Risk

Before proposing how to most effectively communicate the benefits and risks of genome editing, it is important to know how people perceive and process risk. There are many theories that explain the perception of risk. Slovic argues that risk perception is shaped by five factors: controllability, dread, conflict between experts, equity of risks and benefits, and uncertainty (Slovic, 1992). Applying these factors to genome editing, one can estimate how people will perceive it.

One study on nuclear energy found that action for or against nuclear power was directly influenced by the perceived benefits and risks of nuclear technology, which in turn determined people’s acceptance or rejection of nuclear power (Gardner et al., 1982). Thus, whether someone feels in control of their situation affects whether they accept a technology. In the case of genome editing, the public has no direct control over where research is going and what it might enable scientists to do; dread cannot be measured until the matter reaches the public arena, and uncertainty is the key issue. When genome editing becomes more prevalent in public discourse, we might expect similar results: people who feel a sense of self-efficacy (that is, control over whether they undergo genome editing) could be more accepting of it. In communicating risk, then, genome editing should be framed as a potential option for cancer patients or people with a genetic disease seeking treatment, not as a tool to manipulate human genomes without awareness or consent.

As news sources focus on genome editing and share it with the general public, scientists and other experts must keep in mind that lack of controllability and uncertainty might cause people to perceive genome editing as risky rather than as a potential therapeutic tool.

Belief Gap Hypothesis

Another theory of risk perception is the belief gap hypothesis, which states that ideology is a stronger predictor of risk perception on scientific issues than prior knowledge or gains in knowledge from the media (Nisbet, 2014). Thus, the challenge posed to experts is that merely informing the public about genome editing is not sufficient to promote acceptance. People will instead be more likely to rely on prior beliefs (political or religious, for example) when forming their own opinions of genome editing. This could be an issue, for example, for religious people who see tampering with genes as “playing God” and will overlook the potential health benefits.

Support for the belief gap hypothesis is found throughout the literature. One study discovered that political ideology has a large impact on beliefs about science: regardless of education level, Republicans remained steadfast in their beliefs on climate change (Hamilton, 2013). Another study found that party identification moderates the correlation between knowledge and concern about global warming (Malka, Krosnick, & Langer, 2009). The challenge experts will face is how to frame genome editing in a way that does not cause it to become a politically charged, partisan issue like climate change (Nisbet, 2014).

Theory of Motivated Reasoning

Another communication theory that pertains to genome editing is motivated reasoning: the desire to arrive at conclusions consistent with previous beliefs leads to biased information processing (Nisbet, Cooper, & Ellithorpe, 2014). This is similar to the belief gap hypothesis in that humans tend to use previous convictions as shortcuts when forming opinions. Thus, when communicating the benefits and risks of genome editing, it is important to consider that people will process information differently based on their moral values. If communication related to genome editing runs counter to a group’s or person’s values, they will form a bias against genome editing based on those previous convictions. For example, people who currently oppose GMOs because they consider the technology to be “playing God” might consider genome editing to be similarly immoral and will thus oppose it without considering the benefits. Science communicators must address these values when forming a communication strategy.

Knowledge Gap Hypothesis

One common misconception in the field of science communication is that knowledge gaps account for higher risk perceptions. Many studies, however, have refuted this theory. Johnson (1993) made it clear that the tendency to equate lack of knowledge with ignorance is a mistake. Knowledge and ignorance are not two polar extremes, because everyone has different levels of knowledge and ignorance on varying topics.

In communicating the risks and benefits of genome editing, experts should avoid the assumption that the general public will form their perceptions based on scientific knowledge. Evidence suggests people process risk based on experience. This idea is the foundation of the heuristic-systematic model (HSM) (Trumbo, 2002). It states that humans process information using systematic processes, heuristic processes, or both. Systematic processing examines arguments, past information, and statistics before forming an opinion. Heuristic processing uses simple decision rules to arrive at a judgment. For example, gun violence is a highly polarized issue, with Republicans tending to support concealed carry licenses and Democrats arguing for stricter gun control. Your political stance might cause you to automatically form an opinion on the risk of gun violence, based on owning a gun yourself or watching news of mass shootings, without actually researching the statistics behind gun violence risks. In other words, humans use shortcuts in forming risk perceptions (Trumbo, 2002). It can thus be expected that when people evaluate the risks of genome editing, they will use heuristic processing as well.

Trust Gap Hypothesis

Another theory that explains public risk perception is the trust gap hypothesis. It states that risk perception can be predicted by measuring levels of trust in institutions. One study found that trust in institutions is more important than knowledge in predicting agreement with biotechnologies (Priest, Bonfadelli, & Rusanen, 2003). A further study found that trust in industry and in scientists predicts acceptance of genetically modified organism (GMO) field experiments (Siegrist, 2012). Since GMOs and genome editing are similar biotechnologies, both involving the manipulation of genes, these results are useful in forming risk/benefit communication campaigns. People may spend less time systematically processing a message, or use shortcuts, because it is easier for them to make a decision based on how much they trust the experts.

While communicating the risks and benefits of genome editing, this is crucial to consider because knowledge of the technology and the content of the message are not the most important factors in whether people will accept genome editing. The crucial step in gaining public acceptance is creating and maintaining trust in the experts. It is also important to consider people with high moral convictions. A previous study found that people with high moral convictions were influenced by procedural fairness in their acceptance of GMO experiments (Siegrist, 2012). The process of genome editing must be presented as fair in both its procedures and its outcomes. For example, genome editing should be available to everyone once it becomes an established therapy, and the outcomes should be improved health. To gain public support, genome editing should not be used now for controversial procedures such as germ line cell editing and eugenics.

Creating Trust in Experts

Trust has been shown to affect acceptance of technologies. The key to public acceptance of genome editing is building trust between the lay public and experts on genome editing. Engdahl and Lidskog (2014) suggest that trust is created when people are emotionally involved, take part, express their opinions, and recognize themselves as the recipients of trust. Communication of genome editing must be a conversation during which people express their concerns and opinions and scientists and other experts address these worries. One way this can be done is by interviewing experts in the field on TV about the potential benefits and ethical questions of genome manipulation. Additionally, experts could utilize social media by having well-known and trusted public figures, such as Bill Nye or Neil deGrasse Tyson, prompt public discourse on genome editing. Lastly, creating more opportunities for people to express their concerns would help foster an open dialogue. The Human Genome Editing Summit held in Washington, DC is one such example of providing a safe place for the public and experts to share their thoughts, expectations, and concerns regarding genome editing.

Media Amplification

After the Human Genome Editing Summit, and as more research is conducted with CRISPR-Cas9, the media will likely have an important influence on how the public perceives the risks and benefits of genome editing. While scientists can take responsibility for communicating scientifically grounded information, the media tends to amplify risks and outrage factors when presenting news stories (Brossard, 2015).

Studies have shown that the news can easily change our risk perceptions. One study found that outrage factors and catastrophic potential determine coverage of health-related food risks (Ju et al., 2015). A study on the Fukushima nuclear disaster analyzed how the media communicated the events to the public: although the news used creative approaches to inform audiences, some TV sources exaggerated the risks behind the nuclear disaster (Friedman, 2011). Television can effectively present relevant information to the public in an easily accessible format, but excessive coverage of potential dangers can skew public perception towards focusing solely on risks. Experts must keep this in mind, knowing that media sources tend to focus more on the negative aspects of a technology than on its benefits.

This could also apply to the potential risks of genome editing. A risk with more outrage factors tends to get more news coverage, and outrage factors thus amplify a story and can affect people’s risk perceptions (Brossard, 2015). For example, the popular blogger Food Babe created public outrage when she wrote about Subway sandwiches containing azodicarbonamide, a chemical also found in yoga mats. Although the ingredient was deemed safe to ingest, her negative spin created outrage among her followers, who were motivated to pressure Subway into taking the chemical out of its bread recipe (Tapper, 2014).

There seem to be many parallels between nanotechnology and genome editing: studies have shown that people are hesitant to accept emerging technologies like these that intervene in nature at a scale invisible to the public. In addition to the argument that such technologies are akin to “playing God,” the uncertainty of their potential risks has caused people to have higher risk perceptions (Priest, 2003). For example, the potential impacts of nanotechnology on human health and the environment are not well established and cannot be anticipated (Grossman). Not fully knowing the risks of genome editing could cause people to assume what they perceive to be the worst-case scenario without reassurance from experts on what the actual risks are.

Because genome editing specifically involves humans, will it cause more or less outrage and amplification in the news? Although experts have so far agreed that the risks of genome editing are largely speculative and unknown, the media could use the uncertainty of its future implications as an outrage factor. One article, for example, expressed concern that genome editing could lead to a situation akin to the movie “Gattaca,” in which people create designer babies in utero and adults are discriminated against based on their genomic profile (Friedman, 2015). In “Gattaca,” the main character is at a disadvantage his whole life for being born before the surge of designer babies; without considering environmental effects, a person’s worth, wealth, and employability are based on their genetic makeup (Nicoll). Could genetic discrimination become a reality? If insurance companies could base coverage on genetic profiles, people with more money could afford to have healthier babies, who would end up being the only ones able to afford health insurance.

While scientists can be as careful as possible about communicating the risks and benefits of genome editing, they must keep in mind that the media tends to latch onto the risks of technologies and has the power to amplify them (Brossard, 2015). This could tip the scale, leading to higher perception of genome editing risk and lower acceptance of the new technology. Media amplification alters risk perception, which in turn affects trust in science, ultimately leading to acceptance or rejection of emergent technologies (Siegrist et al., 2012). Although it is important that the public be aware of genome editing given its potential effect on the health field, the media could negatively affect perceptions of genome editing by lowering trust in experts.

It is clear that we need more effective ways for scientists to earn trust. Engdahl and Lidskog (2014) point out that lay people use contextual factors to form their views of risks. With this in mind, risk communication should focus on earning trust from the public, which is important because the public gives legitimacy to experts’ claims. They further argue that experts tend to make risk communication rigid rather than open to conversation and mutual understanding, and that experts forget that risks are socially embedded and leave no space for other people’s perspectives (Engdahl & Lidskog, 2014). This is especially important to keep in mind when communicating genome editing because this technology will affect the general public, if not now then in the future. If experts wish to be successful risk communicators, they must keep open lines of communication with the public and accept its opinions and input.

One example of how this can be done was the panel on designer genes at UW-Madison on October 24, 2015. The panel brought experts and the public together to discuss the potential benefits and risks surrounding genome editing. Events like these allow the public and experts to communicate and could significantly increase self-efficacy, reduce uncertainty, and make the balance of risks and benefits of genome editing clearer. As Engdahl and Lidskog (2014) note: “trust is created when people are emotionally involved, take part, express opinions, and recognize self in recipient of trust” (p. 714).

With these theories in mind, it is imperative to remember the importance of communicating risks. Risks imply a potential for accidents or catastrophes, which can incur social and economic costs (Brossard, 2015). While genome editing can be framed as beneficial, scientists need to avoid improper risk estimation by eschewing overconfidence in benefits, not dismissing low-probability but potentially catastrophic outcomes (such as crossing ethical boundaries into eugenics), and resisting pressures from sponsors or political parties that try to push a technology without properly considering all the risks (Freudenburg, 1988).

A Plan for Communicating Genome Editing

In communicating genome editing risks and benefits, the goal should be to avoid the backlash GMOs have faced. The intention behind creating GMOs was positive: make food that is insect resistant and more nutritionally dense. However, there has been significant backlash against GMOs. The moniker “Frankenfood” attributed to GMOs illustrates the uncertainty and fear behind them (Guthrie, 2013). People who oppose GMOs perceive the uncertainties behind them but are not well informed about the benefits. With genome editing, the main goals should be to accurately present the risks and benefits associated with the procedure and to build trust in experts.

One error science communicators should avoid when discussing genome editing is overconfidence. As important players in the field of risk communication, media sources should also avoid overconfidence in presenting the results of genome editing studies. Focusing too much on the benefits ignores public concerns over the uncertainties, potential risks, and ethical implications of this emergent technology. On the other hand, knowing that people respond more strongly to negativity, experts and the media should avoid amplifying uncertainty and should promote a sense of controllability. If the public does not feel in control with regard to genome editing, this could lead to fear, outrage, and higher perception of risk. For example, editing germ line cells or creating designer babies does not give the future child a choice in editing his or her genes. Experts should come together and present a unified outlook on genome editing, since disagreement among experts further amplifies public risk perception (Slovic, 1992).


With that, the question remains: how can the risks and benefits of genome editing be effectively articulated? First, specialists should find out what people’s needs and expectations are related to genome editing. A study on nuclear energy found that although most lay people reported a negative view of nuclear energy, they wanted more accessible information on the topic (Skarlatidou, Cheng, & Haklay, 2012). The findings show that although lay people had little knowledge of nuclear power, they were aware of their knowledge gap and were willing to seek more information if it was available and easy to digest. The authors suggested that information on nuclear power should be organized according to the needs and expectations of lay people (Skarlatidou, Cheng, & Haklay, 2012). The same approach should be used for genome editing. In this case, public needs could include therapies for cancer and other genetic disorders, and expectations could include scientists adhering to ethical standards. This information will further guide experts in disclosing the risks and benefits of genome editing.

The next step in communicating genome editing should be to clearly articulate the magnitude of the risks and ethical issues. Like all medical procedures, the risks need to be weighed against the benefits. Additionally, self-efficacy must be promoted: if people perceive that genome editing is something that could happen to them but that they would have no control over (for example, having their genes edited in vitro), they will have a higher risk perception (Brossard, 2015). Additionally, the risks of genome editing should be compared accurately to similar risks, since risk perception can be greatly amplified if a risk is taken out of context (Brossard, 2015).

While the public should be aware of the potential hazards and ethical issues of genome editing, it is important that they be kept in perspective. Sensationalist information, such as worst-case scenarios and loaded words, should be limited as experts and the media communicate genome editing (Brossard, 2015). If experts want the public to accept the potential therapeutic benefits of genome editing, sensationalism should be avoided. Lastly, genome editing should be presented thematically: it is not an isolated discovery but an emerging technology within the wider field of genomics and gene therapy. While informing the public of this new research, it is crucial to show how genome editing has evolved from years of work within the genetics field.


While experts have shown that genome editing has the potential to significantly improve public health, it is important to acknowledge the concerns and risk perceptions the public might harbor. Given what is known about similar technologies, how people perceive risk, and what comprises effective risk communication, I have outlined what experts need to know in order to accurately present genome editing. Additionally, I have stressed the importance of creating trust between the public and experts in regard to genome editing. What we can learn from past failures in science communication is that uncertainty about a technology creates mistrust in experts, which causes higher risk perceptions. Building trust is a crucial step towards public acceptance of genome editing. Trust in experts will show that the public is confident that scientists are taking proper precautions to ensure that genome editing is used ethically and responsibly.

In sum, clear communication of genome editing as a potentially beneficial medical procedure that requires further research and raises ethical questions will help citizens form informed decisions and opinions. The ultimate goal is to create trust in experts. Unsuccessful science communication has led to mistrust, which has in turn affected government policies, as has been the case with climate change (Brownell, 2013). With a clear communication campaign based on what has and hasn’t worked in the past, experts can work towards public understanding, thus promoting trust in genome editing experts. Ultimately, how successful scientists are at communicating genome editing will affect government decisions on policies, funding, and regulation, which in turn has the potential to significantly impact human health.






References

Aven, T., & Renn, O. (2009). On risk defined as an event where the outcome is uncertain. Journal of Risk Research, 12(1), 1–11. doi:10.1080/13669870802488883

Brossard, D. (2015). Defining risk [PowerPoint slides].

Brossard, D. (2015). Media and risk perception [PowerPoint slides].

Brownell, S. E., Price, J. V., & Steinman, L. (2013). Science communication to the general public: Why we need to teach undergraduate and graduate students this skill as part of their formal scientific training. Journal of Undergraduate Neuroscience Education, 12(1), E6–E10.

Christensen, J. (2015). The slow crawl to designer babies. CNN. Retrieved from:

Engdahl, E., & Lidskog, R. (2014). Risk, communication and trust: Towards an emotional understanding of trust. Public Understanding of Science, 23(6), 703–717. doi:10.1177/0963662512460953

Freudenburg, W. (1988). Perceived risk, real risk: Social science and the art of probabilistic risk assessment. Science, 242(4875), 44–49.

Freudenburg, W. (1992). Heuristics, biases, and the not-so general publics: Expertise and error in the assessment of risks. In S. Krimsky & D. Golding (Ed.), Social Theories of Risk (pp. 229–249). Westport, CT: Praeger.

Friedman, L. (2015). These are the countries where it’s ‘legal’ to edit human embryos (hint: the US is one). Business Insider. Retrieved December 8th, 2015 from:

Friedman, S. (2011). Three Mile Island, Chernobyl, and Fukushima: An analysis of traditional and new media coverage of nuclear waste accidents and radiation. Bulletin of the Atomic Scientists, 67(5), 55–65. doi:10.1177/0096340211421587

Gardner, G., Tiemann, A., Gould, L., DeLuca, D., Doob, L., & Stolwijk, J. (1982). Risk and benefit perceptions, acceptability judgments, and self-reported actions toward nuclear power. The Journal of Social Psychology, 116, 179–197.

Genome editing (n.d.). In Wikipedia. Retrieved December 8, 2015, from:

Grossman, E. (n.d.). Tiny materials in countless products raise big questions for environment and health. Ensia. Retrieved December 8th, 2015 from:

Guthrie, C. (2013). Frankenfood=genetically modified foods. Experience Life. Retrieved from:

Hwang, W., Du, W., Reyon, D., …, Joung, J. (2013). Efficient in vivo genome editing using RNA-guided nucleases. Nature Biotechnology, 31(3), 227–229. doi:10.1038/nbt.2501

Johnson, B. (1993). Risk: Issues in Health & Safety, 189–212.

Ju, Y., Lim, J., Shim, M., & You, M. (2015). Outrage factors in government press releases of food risk and their influence on news media coverage. Journal of Health Communication, 20, 879–887. doi: 10.1080/10810730.2015.1018602

Malka, A., Krosnick, J., & Langer, G. (2009). The association of knowledge with concern about global warming: Trusted information sources shape public thinking. Risk Analysis, 29(5), 633–647. doi:10.1111/j.1539-6924.2009.01220.x

National Cancer Institute (n.d.). CRISPR: Genome editing comes of age. Retrieved December 8th, 2015 from:

Nicoll, A. (n.d.). Gattaca movie review summary. Retrieved December 8th, 2015 from:

Nisbet, E., Cooper, K., & Ellithorpe, M. (2014). Ignorance or bias? Evaluating the ideological and informational drivers of communication gaps about climate change. Public Understanding of Science, 24(3), 285–301. doi:10.1177/0963662514545909

Priest, S., Bonfadelli, H., & Rusanen, M. (2003). The “trust gap” hypothesis: Predicting support for biotechnology across national cultures as a function of trust in actors. Risk Analysis, 23(4), 751–766.

Renn, O. (1992). Concepts of Risk: A Classification. In S. Krimsky & D. Golding (Ed.), Social Theories of Risk (pp. 53–79). Westport, CT: Praeger.

Saey, T. (2015). White House hits pause on editing human germ line cells. Science News. Retrieved December 8, 2015, from:

Siegrist, M., Connor, M., & Keller, C. (2012). Trust, confidence, procedural fairness, outcome fairness, moral conviction, and the acceptance of GM field experiments. Risk Analysis, 32(8), 1394–1403. doi:10.1111/j.1539-6924.2011.01739.x

Skarlatidou, A., Cheng, T., & Haklay, M. (2012). What do lay people want to know about the disposal of nuclear waste? A mental model approach to the design and development of an online risk communication. Risk Analysis, 32, 1496–1509. doi:10.1111/j.1539-6924.2011.01773.x

Slovic, P. (1992). Perception of Risk: Reflections on the psychometric paradigm. In S. Krimsky & D. Golding (Ed.), Social Theories of Risk (pp. 117–152). Westport: CT, Praeger.

Tapper, J. (2014). Meet the ‘food babe’ who helped convince Subway to remove chemical from bread. Lead the Tapper. Retrieved December 8th, 2015 from:

Travis, J. (2015). Inside the summit on human gene editing: A reporter’s notebook. Science Insider. Retrieved from:

Trumbo, C. (2002). Information processing and risk perception: An adaption of the heuristic-systematic model. Journal of Communication, 52(2), 367–381.

Wade, S. (2015, March 19). Scientists Seek Ban on Method of Editing the Human Genome. The New York Times. Retrieved from:

