In this episode of
Expert Insights for the Research Training Community, Dr. Jyllian
N. Kemsley, manager of C&E News’ Safety Zone, and Dr. Craig A. Merlic,
executive director of the University of California Center for Laboratory
Safety, discuss common underlying factors in lab accidents and the roles
of multiple stakeholders in improving safety in lab settings. They also
recommend relevant resources and tools for developing a culture of safety.
The original recording of this episode took place as a webinar on November
5, 2020, with NIGMS host and director Dr. Jon Lorsch. A Q&A session
with webinar attendees followed Dr. Kemsley and Dr. Merlic’s talk.
Recorded on November 5, 2020
Download Recording [MP3]
Welcome to Expert Insights for the Research Training Community—A podcast
from the National Institute of General Medical Sciences. Adapted from
our webinar series, this is where the biomedical research community can
connect with fellow scientists to gain valuable insights.
Dr. Jon Lorsch:
Welcome, everybody, to this webinar that NIGMS is hosting. I’m Jon
Lorsch, the director of NIGMS, and it’s my pleasure to welcome everyone.
I want to go ahead in just a second and introduce our speakers, but
first let me just quickly tell you how this is going to work. We’re
going to have two talks by the distinguished speakers we have here on
the topic of improving the culture of laboratory safety. And then we’ll
have an open Q&A session.
So let me go ahead and introduce our speakers. The first is Dr. Craig
Merlic, who is a professor of chemistry and biochemistry at UCLA and is
the executive director of the University of California Center for
Laboratory Safety. His main research focuses include transition metal
catalysis, synthetic strategies toward strained macrocycles, and
organometallic synthesis. He received his B.S. from the University of
California, Davis and his Ph.D. from the University of
The Center for Lab Safety was established in March 2011 in the wake of a
fatal lab accident at UCLA in 2008 that Craig will probably mention.
Craig actively promotes safety at UCLA and in the University of
California system, chairing several safety committees and directing an
information technology group creating safety software tools for use at
all 10 university campuses.
Our second speaker is Dr. Jyllian Kemsley. She is the executive editor,
policy and content partnerships at the American Chemical Society’s
flagship publication, Chemical & Engineering News, or C&E News,
and the manager of C&E News’s Safety Zone blog.
She received a B.A. in chemistry from Amherst College, a Ph.D. in
chemistry from Stanford University, and a graduate certificate in
science writing from the University of California at Santa Cruz. In her
executive editor role, Jyllian covers science policy in areas such as
the environment, energy, pharmaceuticals, food, worker safety,
intellectual property, and research policy. The Safety Zone blog covers
chemical safety issues in academic and industrial research labs, as well
as in manufacturing.
So thanks to both of you for agreeing to participate in this, and I will
turn it over to you. Craig, are you going first?
Dr. Craig Merlic:
I’m going to let Jyllian go first.
Dr. Jon Lorsch:
Jyllian’s going to go first. Perfect.
Dr. Jyllian Kemsley:
I’m going to go first, so let me share here. OK, well thank you to
everyone for being here and possibly taking a break from pressing
refresh on the election results.
I’m going to start off by talking about safety in labs that use
chemicals. And I phrased it that way deliberately, because there are an
awful lot of labs that use chemicals but don’t think they’re doing
chemistry. And they may not be. They may be using chemicals just as
solvents or something like that, but the chemicals are still there.
I’m going to talk through a few incidents that I’ve covered and what I
see are the common themes among them, and then I will pass it over to
Craig, who’s going to talk about the role of the PI in ensuring good
safety practices.
To start off, I’m going to talk about the fire at UCLA. The technician
involved was Sheri Sangji. She was 23 years old. She had just gotten her
bachelor’s degree a few months earlier. She was hired as a technician to
install and get up and running some analytical instrumentation, also to
do a little bit of synthesis. She was injured in a laboratory fire at
the end of 2008, and she died of her injuries about 2 1/2 weeks later.
She had just started the previous October, and the day of the fire was
only the second time she was doing this particular synthesis. It
involved a reagent that’s pyrophoric, meaning it ignites spontaneously
in air. And she had scaled up considerably, to an amount that most
people would not even consider.
So this is some of the safety information from the safety data sheet on
tert-butyllithium, that reagent, and you can see that the concerns about
it, the safety concerns, are very obvious—highly flammable liquid and
vapor, catches fire spontaneously if exposed to air.
In the top right here, this is how you should be handling that
particular reagent. It’s a liquid in a sealed bottle, so you clamp that,
and you run an inert gas line in, use an airtight syringe with a needle
that reaches to the bottom of the bottle, two hands on the syringe. And
you generally don’t want to pull the plunger more than halfway up; you
want to leave yourself that buffer for safety.
It’s unclear how Sheri was trained to do this particular operation. She
was told to ask a postdoc for help, and when that postdoc was
interviewed by investigators later, he said he couldn’t remember what he
had told her, but when asked about the procedure more generally, he
said, “Should you clamp the bottle? Yes. Did I do it? No.” He also
described pulling the plunger all the way up on the syringe.
So most likely the example she had in terms of training was a poor one.
The other thing is that day she was using a 60-milliliter syringe.
Here’s a reminder of how big that is for people. And the bottle wasn’t
clamped, so chances are good that she was holding the bottle in one hand
and trying to manipulate a 60-milliliter syringe with the other. The
plunger came out of the barrel, and the reagent ignited, and Sheri’s
clothes caught on fire.
There was a postdoc in the lab with her. What he described was hearing a
scream, seeing her on fire. She was panicking, which is a normal
response to having her clothes be on fire, but she did not go to the
shower that was in the lab. She did not stop, drop, and roll. The
postdoc tried to use a lab coat to wrap around her to put out the fire
and then started pouring tap water on her.
When her adviser got to the lab, he found her sitting on the floor with
the postdoc pouring water on her. “Her clothing from the waist up was
largely burned off and large blisters were forming on her abdomen and
hands—the skin seemed to be separating from her hands.”
If you want to know more detail on what I at least was able to piece
together about what happened here, that’s the UCLA fire link on the
slide. This set off a lot of changes within the UC system—some driven by
the district attorney. Both UC and the professor, the adviser, were
charged criminally for the incident. Both of those cases were settled.
But if you want to know a little more about the UC changes, that’s at
the UC safety link.
At Texas Tech University, here we had a senior graduate student who was
synthesizing explosives as part of a Department of Homeland Security
project. Because they were working with explosives, the lab ostensibly
had a maximum limit of 200 milligrams that you could make at any one
time.
This graduate student, who was training a junior graduate student at the
time, a first-year, was having trouble getting consistent
characterization results off of different equipment, and so he wanted to
make more to be able to do all of the tests on one batch. Reasonable
scientific thing to want to do. He scaled up to 10 grams, which again is
not something anyone in an academic laboratory should be doing.
After doing the synthesis, he took half of it; it was lumpy. He thought it
was safe to grind in a mortar and pestle if it was wet with a solvent,
and it blew up in his hands. And he was in the middle of the lab, no
glass shield, no protective equipment.
He lost three fingers and permanently injured his eyes. That’s probably
lucky in terms of the result that could have come from this.
Another point about this case is that other people in the lab knew he
was scaling up, but no one spoke up about it, even though there was a
clear risk to them. Had someone been standing next to him or on the
other side of this bench, they could have easily been hurt also. And
there’s a link for more details if anyone is interested.
At the University of Minnesota, here was a graduate student synthesizing
azidotrimethylsilane (trimethylsilyl azide), 200 grams of it. When making this in the past,
he’d had trouble with clumping while the reaction was going on, so they
switched the solvent to polyethylene glycol, or PEG. And one of the
reagents involved in this is sodium azide, which, when it comes into
contact with water, will form hydrazoic acid.
PEG is hygroscopic, so if you combine sodium azide with a solvent that
can pull in a lot of water, you do risk forming hydrazoic acid. The
sodium azide could also have reacted directly with the PEG. So going
back to that hydrazoic acid point, hydrazoic acid is an unstable
explosive, has some other issues, too, but the explosive part is
probably the biggest concern.
So on the day the incident happened, the graduate student was actually
on his way out of the lab to lunch. He walked by his hood. He realized
that the reaction had stopped stirring. He reached into the hood, and it
exploded at that point. It’s a little unclear whether he’d formed
hydrazoic acid or whether, because the whole thing had stopped stirring
and clumped together, some localized heating blew it up.
Consistent with what you expect if someone is reaching into a hood, his
arm and his side were injured, and he needed surgery to remove the
glass.
At UC Berkeley, this was a graduate student who had synthesized a
diazonium compound and then wanted to look at how different counter ions
affected the properties. Fairly common thing to do, but in this
particular case, he decided to include perchlorate as one of the counter
ions.
And here’s from PubChem on perchlorate. This is perchlorate as the ion,
not as a salt of something else, but again you’ve got “may cause fire or
explosion.” It’s a strong oxidizer. Hazards here are documented and well
known.
What happened in this case is that he got all the way through the
synthesis fine. It was at the very end where he had it in a funnel and
was scraping it out with a metal spatula; he’d taken off his eye
protection, and it blew up in his hands. And if we go back, we can see
this is what happened to his regular glasses. One lens is completely
gone, and the other one has a hole blown through it. He needed eye
surgery and had some facial lacerations, although no permanent damage.
And then lastly, from the University of Hawaii, this is a situation
where you’ve probably heard of the fire triangle but maybe not really
thought about how it might apply to what you’re doing in the lab. The
basic idea of the fire triangle is if you combine oxygen, heat, and
fuel, you can ignite. You put all those things in one place and
essentially you have a bomb.
So this particular lab was actually growing microbes to make biofuels,
and they were feeding the bugs a mixture of 55% hydrogen, 38% oxygen,
and 7% carbon dioxide. So hydrogen is your fuel, oxygen is your oxygen,
and all you need is some form of heat there to set that off. They had
had a problem previously with a smaller tank, I think it was 3-ish
liters, where a static shock had set off what they called a small
internal explosion. That should have been a heads up that you need to
rethink what you’re doing here, but it wasn’t.
So on the day of the incident, we had a 49-liter tank with that gas
mixture at about 117 psi and most likely— and this was an investigation
done by the Center for Laboratory Safety, so Craig can correct me if I
get anything wrong here—but the conclusion from that was that it was
probably a very small static shock, so small that possibly she didn’t
even realize it had happened, that set off that explosion. That’s the
remains of the tank that you see on the screen. The lab was pretty much
destroyed, and the postdoctoral researcher lost one of her arms and also
damaged her hearing.
So one of the common themes that I see in these and in other cases is
that the hazards are known and well documented. I included the safety
data sheets from PubChem and things here to demonstrate that the
information’s there, people. But you need to look for it and read it,
and doing that hazard assessment of the stuff that you’re working with
should be part of your safety culture.
I do sometimes hear from people, “Well, we’re working with new stuff.
We’re researchers and don’t always know.”
Again, a lot of the reagents that are the problem here, the risks are
well known, and I would argue if you really are working with something
unknown, that’s a reason to be that much more careful with what you’re
doing. In all of these cases, there was also a reasonable scientific
justification for the experiment, but they did not do that risk assessment.
So here’s a quote from then-chair of University of Minnesota chemistry
department Bill Tolman, “There was a real reason to use PEG as the
solvent to try to stop the clumping, but they hadn’t thought through
that there might be problems with doing that, especially at a 200-gram
scale.”
The other thing is something that I like to call “training by
telephone.” I did this as an exercise probably in elementary or middle
school where you sit in a circle, and one person whispers something to
the person next to them, and they whisper to the person next to them,
and by the time you get all the way around the circle, whatever the
starting message was, it’s completely garbled. That’s what happens when
your laboratory training is done just by passing information from one
person to another—say the PI when they start to their first generation
of grad students and postdocs and on down the line. You don’t have any
documentation to check against.
So this is an excerpt here from the UCLA investigation, the postdoc who,
in theory, trained Sheri is like, “Yeah, there was no SOP. It was just
sort of passed on, either me watching someone else or someone telling me
how to do it.” Likewise, if you think about Texas Tech, you had a senior
graduate student thinking it’s perfectly fine to scale up to 10 grams
while training a junior graduate student.
And then lastly, a lack of personal protective equipment. Now that is
last because it should be last. You think about the hierarchy of
controls. You need to get those elimination and substitution and
engineering controls in place. PPE is your last line of defense.
And that’s where safety culture comes in a lot: Are we doing those
hazard and risk assessments? Are we eliminating hazards or substituting
where we can? Are we putting in the appropriate controls? And then again
at the end, personal protective equipment.
Last thing very, very quickly. This is a bit tangential to safety
culture, but we published a story this week about graduate students
essentially not being eligible for workers’ compensation and being on
the hook for hundreds or thousands of dollars if they get hurt while
working in a laboratory. You really don’t have any protection for those
medical bills. So something to, perhaps, start asking your own
institutions about.
And with that, I will pass it over to Craig.
Dr. Craig Merlic:
Thank you, Jyllian. So let me go ahead and share my screen.
So I’d like to address some concepts more related to the safety culture
as we go through this talk. And so we’re going to start off talking
about four themes.
But before I start off there, I’d like to call out this article that was
published by Jon and coworkers on developing a culture of safety in
biomedical research training. So this article was very nice in covering
a broad set of issues and can be sort of a starting point for
researchers, administrators, and PIs in looking at their institutions
and their programs and seeing whether they’re missing anything that needs
to be covered to try and have a safe work environment.
And so, among the themes that I like to talk about when I address lab
safety, the first is that there’s no easy approach to improving lab safety
and creating a culture of safety. It takes a very strong commitment and
effort by people at every level of institutions—so the chancellor or
president of the university, through the deans and administrators, down
to the PI in particular and then down to the researchers in the labs, so
the graduate students, undergraduate researchers, and postdoctoral
It takes strong effort by everyone. And then almost every institution has
some form of environmental health and safety office or department, and
they are really fabulous people, and they perform many essential
functions and help set policies. But really safety outcomes are very
dependent on safety engagement by the lab PI. And so at UCLA I work a
lot with the EH&S office, but when it comes to lab safety, it’s
critical that it be done at the lab level.
And Jyllian mentioned some interesting accidents. And actually if you go
through those accidents in a detailed way, the sort of standard
institutional EH&S policies may not have prevented those accidents,
so it’s really at the lab level that it’s critical to address lab
safety.
Then the next two themes, or the bottom two here: regulatory compliance
should not be the end goal of a workplace safety program, but rather,
the outcome of a strong culture of safety. So it turns out that
institutions have metrics involving safety programs, and EH&S
oftentimes comes in and looks for compliance issues—so for example, the
existence of chemical inventory and proper storage of chemicals and
recording of safety trainings by people.
But actually those specific actions don’t necessarily prevent incidents
and injuries; it’s actually what goes on in the lab, how chemicals are
handled, how operations are done, how centrifuges and other energetic
pieces of equipment are run that determines lab safety.
And then finally, strong safety programs aim to prevent injuries while
supporting cutting-edge science. Sometimes there’s a feeling that safety
can get in the way of accomplishing the science, but really at the end
of the day, science done well is science done safely. If you think about
a sloppy researcher, do you trust their results scientifically? It’s
the same thing for safety. A sloppy researcher may not have safe lab
practices.
So really science done well is science done safely. So when I think
about safety, I typically think about a laboratory safety triad. So we
spend a lot of time talking about safety programs, and safety programs
include things like EH&S actions, using SOPs that Jyllian mentioned,
PPE usage that was just talked about, safety training. But they tie into
the safety culture, so what are the overall actions by researchers in
the labs, and what is the safety leadership shown by everyone in
the organization.
And then these two combine to address, really, the safety outcomes. And
we saw some terrible examples in the last few minutes. So what types of
incidents occur? Where do they occur? When do they occur? And why did
they occur? And so it’s critically important that institutions actually
keep track of safety outcomes, discuss them, discuss safety outcomes at
other institutions as a way of avoiding them in the future. So this is
the safety triad.
What I’d like to do is dive into the safety culture side, since we just
saw some excellent examples of safety programs and safety outcomes.
Safety culture. We can define it as the values, beliefs, and behaviors
resulting in a collective commitment to safety by everyone in an
organization. There are many inputs and outputs for safety culture—and
some of the points are shown here. But really, it’s involvement by
everyone and a quality involvement.
You can have the most outstanding set of SOPs—Safe Operating Procedures
or Standard Operating Procedures—but if no one is following them, then
that’s pointless. Or you can have verbal statements by people in
appropriate positions, but if there’s not follow-through, then you’re
also not going to have a safe organization. So really, it’s a
combination of what you have and how you live it that results in the
And we want to have a strong safety culture to do outstanding,
cutting-edge science in a safe manner. So my group, one of the things
we’ve done recently is conducted a safety culture survey to try and look
at the state of laboratory safety culture in some institutions, identify
factors that influence the safety culture, determine the strengths and
weaknesses of safety programs, and then identify factors that correlate
with safety outcomes.
So our overall goal is to use data to drive improvements in laboratory
safety, practices, culture, and outcomes. We looked at more than 1,000
researchers at four R1 universities. They ended up being about 50/50
male/female. The majority were in health and life sciences, and the
others were in physical sciences and engineering. And we had about an 80/20
split between researchers/trainees and PIs.
So what were some of the outcomes that we found? So one was that there
is a perception gap when discussing experiment safety. So when we asked
the PIs a question about whether they discuss with their lab group how
to conduct experiments safely, 89% of the PIs agreed or strongly agreed
that they discuss how to conduct experiments safely. But the trainees
and staff had a slightly different view.
Answering the question, “My PI discusses with me how to conduct
experiments safely,” only 66% agree or strongly agree. So we need to
have better communication within the labs. PIs need to do a better job
communicating with the trainees about laboratory safety.
And as I’ll show in the next couple slides, it’s critically important
for the safety outcomes experienced by the researchers. Then one
correlation we did is we correlated PI safety recognition with the
trainee and staff safety behavior and also attitudes. I’m a chemist, so
I like to look at actions and reactions.
So here are the PI actions: safety is recognized by the PI, the PI is
somewhat neutral, or the PI does not recognize and promote safety. And
then here are the trainees’ and staff’s reactions. “People in my lab
incorporate safety measures into their experimental protocols.” OK.
That’s a very good thing to do, as we just saw from Jyllian’s talk.
And yet, for researchers whose PI recognizes safety, almost 90% do
incorporate safety measures, but for researchers whose PI does not
recognize safety, only 36% of the researchers incorporate safety
measures into their experimental protocols. So what we see is a clear
dependence of the researchers’ safety behaviors on the PIs’ actions. This
is at institutions where all the researchers, all the PIs, have the same
environmental health and safety office, the same rules, but we can see
that the behavior depends on the actions of the PI.
Over here, we see something about attitudes.
The question was to the trainees, “The time devoted to compliance with
laboratory safety regulations is appropriate and valuable.” And so we
can see that the attitude toward safety among the researchers is
dependent on the PI actions. For those researchers whose PI recognized
safety, 80% thought safety compliance was a reasonable thing to do, but
conversely then, for researchers whose PI did not recognize safety, only
29% thought the time devoted to compliance was appropriate and valuable,
and so I worry about the safety of those researchers. Another
correlation. We’re looking at correlating PI safety recognition with the
behavior and attitudes related to reporting a minor injury or a near
miss.
And so a minor injury, those may occur, but they could be precursors to
major injuries. And a near miss could be a precursor to a serious event.
So as Jyllian mentioned in Hawaii, literally the day before a major
accident the lab had a near miss, which is defined as “an incident where
no one was injured,” and a roughly 2-liter vessel had an explosion
inside it. And the lab did not analyze that near miss but instead went
on to a scale-up experiment that resulted in a tragic accident.
It’s important that labs discuss minor injuries and near misses and that
they’re reported, but let’s see whether that will occur. And so we see
again the PI action. Are they holding safety discussions weekly?
Monthly? Only quarterly or maybe yearly or never?
And then we can see the trainee and staff reactions here. So in the
groups that discussed safety often, 70% to 80% then report minor
injuries and near misses, but for research groups where safety is never
discussed, only 40% to 50% would report minor injuries and near misses.
Lab safety discussions correlate with incident reporting, which can lead
to better safety practices.
What about safety outcomes? So safety outcomes, we’re going to define
those as actual injuries. And can we correlate the PI and trainees’
actions and attitudes and behaviors with these outcomes? We previously
looked at some major injuries, but today let’s look at some minor
injuries.
So here we see that fewer minor injuries
are reported when trainees/staff report doing risk assessments. The statement
put to people was, “People in my lab consider safety procedures before
conducting a new or scaled-up experiment.” And so if they agree with
that statement, we can see that 70% have no minor injuries, and only 7%
have more than one minor injury. But for the group of researchers who
disagree with the statement—that is, they’re not considering safety
procedures before conducting a new or scaled-up experiment—only 46%
don’t have any minor injuries, and almost four times as many have more
than one minor injury.
So we’re correlating the injuries with the performance of risk
assessments. As Jyllian mentioned, with the University of Hawaii
accident, not only was there a near miss the day before, but the
postdoctoral scholar had come from England, which has a strong culture
of safety at many of their research universities, and she was actually
used to conducting risk assessments in her training in England.
But when she went to the University of Hawaii, she actually asked her
professor about doing a risk assessment for this experiment involving a
mixing of hydrogen and oxygen gas, and the PI said, “No, we don’t need
to worry about doing a risk assessment.” But it was something that she
had actually thought of. She asked about doing it, and it was not done,
and then we saw the tragic consequence.
And then what about minor injuries here correlating with the PI
discussing experimental safety? The statement was, “My mentor/professor
discusses with me how to conduct experiments safely.” And for
researchers who agreed with that statement, 73% had no minor injuries,
and only 7% had more than one minor injury. But for the researchers who
disagreed with the statement, there was again almost a tripling of the
number of researchers who had more than one minor injury.
So we can correlate then safety recognition by the PI with injuries in
the labs. So what is PI engagement? What should PIs be doing? PIs need
to be discussing with students and staff how to conduct experiments
safely. They need to require students and staff to consider safety
procedures before they conduct a new or scaled-up experiment. The PIs
need to teach hazard identification and risk assessment, as critically
illustrated by the Hawaii incident. PIs need to monitor student safety
practices.
So the Texas Tech incident involved a scale up that should not have been
done by the student. PIs need to regularly discuss safety in lab
meetings. And in research groups where there are no incidents occurring,
that’s fabulous, but they need to draw upon incidents at other labs or
other universities as discussion points to lead the safety discussions.
There are lessons-learned sites. There are a number available at my own
site, at the Center for Laboratory Safety. PIs need to wear proper PPE
when in the lab, and they need to exemplify proper safety protocols at
all times. It’s oftentimes the PIs who are saying things like, “Do as I
say and not as I do.” Well, that’s not appropriate. We need to have
appropriate behavior by the PIs at all times to exemplify what should be
done.
So to conclude my very brief comments here, there’s no magic bullet for
lab safety. It’s a multipronged set of actions required at every
administrative level. And again, I would like to reference Jon’s nice
article that outlines many key points. And then engagement by faculty is
a critically important action with a documented impact on safety. It
reduces injuries and promotes a culture of safety, and we cannot rely
just on environmental health and safety offices to ensure the safety of
researchers.
And then also, as illustrated by Jyllian, experiments must be designed
and conducted with safety planning. Safer experiments constitute better
science. So that concludes my key points.
Dr. Jon Lorsch:
Excellent. Thank you so much to both of you. I think that was extremely
clear and illuminating.
Jyllian, you kind of touched on this, but I think there is sometimes a
perception in the life sciences field that this is more of a chemistry
problem. A lot of the examples you gave were chemistry departments. To
both of you, is this mostly a chemistry problem or are there lots of
examples of tragic accidents and near misses happening in the biological
sciences as well?
Dr. Jyllian Kemsley:
There are certainly examples in the biological sciences. I write for
chemists, so those are the examples I know best. But Jon, I think in
your paper you actually mentioned one that involved some sort of
pathogen that people had thought had been disabled—I can’t think of the
right word off the top of my head—neutralized maybe?
But when given the opportunity to infect someone who had a particular
disorder, the pathogen adapted, as we know they do. And if I remember
correctly, that person died. So yeah, they definitely occur in other
fields.
I think what I want the biomedical community to be mindful of is that
you are using chemicals. Even if you think you’re not doing chemistry,
you’re still using chemicals, and those can still cause problems. Maybe
they’re toxicological rather than explosive. But also that some of the
more biological things that I didn’t get into can present a problem
also. Craig, do you have anything you want to add?
Dr. Craig Merlic:
Sure. I’d like to point out that there are hazards in nearly every area
of research, except maybe computational.
Dr. Jyllian Kemsley:
Oh no, repetitive stress injuries there.
Dr. Craig Merlic:
Yeah, it’s critically important that even the biological researchers
address lab safety, and so there are, sadly, incidents that have
occurred. Some that I am familiar with: eye injuries. It’s
common among biomedical researchers, especially in academic labs, that they
don’t wear eye protection, and so there have been serious injuries to
the eye because people were not wearing eye protection.
Burn injuries are also relatively common among the researchers
who are doing a sterilization technique where they take the wire loop
and put it in ethanol and then the Bunsen burner and transfer the
bacteria. The Erlenmeyer flasks of ethanol have been knocked over and
people have been seriously burned from that.
And then there are infections from hazardous biological agents. Just recently there
was a research lab that was working with HIV, and the test tube
shattered inside the centrifuge and spread HIV throughout the
centrifuge. So there’s a number of hazardous situations. So safety is
something that everyone should be addressing at the institution and not
just the chemists.
Dr. Jyllian Kemsley:
I’d like to reemphasize something on eye protection too, because we, as
a news organization, reject several photos a week because someone has
taken a photograph of someone who is obviously in a lab—and I will grant
that they’re in a bio lab, and there’s a shelf behind them of large
chemical containers. We won’t run them, because our standard for photos
is that if you are in a lab, you have eye protection on. Lab
coat—situational. Gloves—situational. But if you are in a lab, you have
eye protection on and we, like I said, several times a week we reject
photos. If you are that concerned about your clothes that you’re putting
on a lab coat, or that concerned about your hands that you’re putting on
gloves, you need to protect your eyes.
Craig, do you want to mention lab coats? Because that's something I
learned from you that I didn't know: the difference between the standard
cotton white lab coats and more modern ones.
Yeah, it’s amazing to me that lab coats were invented more than 150
years ago, but just in the last 20 years there’s been tremendous
advances in lab coat materials. And so there are cotton lab coats,
polyester lab coats, which provide little or no protection to fire. But
now there are treated coats that provide flame resistance. But then the
latest lab coats actually provide not only flame resistance but also
In a biological lab, it’s common to be working with maybe some aqueous
basic solutions, or aqueous acidic solutions, or solutions involving
carcinogens, toxins, things like that. And the latest lab coats actually
have a barrier that prevents polar materials from penetrating the coat.
So it actually could provide protection against acids, bases, and polar
solutions of toxins and carcinogens.
I had no idea all these years that we've been wearing lab coats that
could catch fire if you spilled flaming ethanol on them, or that if you
spilled ethidium bromide, it would go right through.
Also, C&EN reviewed several different types of chemical splash
goggles, and I just dropped the link to that in the chat, if anyone is
interested.
Great. Thank you. Let's take a few from the Q&A, because quite a
few are coming in now. The first one says, "Hi, Craig and Jyllian. What
recommendations do you have for guidance to reviewers (so this would be
reviewers of training grants) related to evaluation of how training
programs prepare students to evaluate risk and work safely?"
The background here is NIGMS is now emphasizing that as one of the
important areas in our training grant applications, and reviewers are
being asked to look at how programs are preparing students in terms of
the things you’re talking about. How would you, if you were advising
reviewers, think about that, getting appropriate information, and what
would you emphasize?
I would say that it should be regular and ongoing. The idea that you
can bring in trainees the day they walk in the door, give them a
three-hour introduction to safety, and then consider them trained is
ludicrous. You need a program with active engagement at multiple
levels: when trainees come in the door, they should get some training,
but then it should be ongoing, and it should occur at several levels.
There's the generic training that EH&S might be able to provide, but
also the lab-specific training, so research labs need to be doing
things. And then maybe some training at the department level. It should
be ongoing, annual or more frequent, and it should be engaging, so that
the students get a variety of modalities in the training. What I'd like
to see is not just a single three-hour course at the beginning but
rather an ongoing, sustained program.
Jyllian, do you have anything to add to that?
Just that it’s also really important to remember that lab-specific
training is key. No general EH&S training is going to cover the
specific stuff that someone is going to be doing when they get into the
meat of their experiments. So if you assume that EH&S safety
training is going to cover everything students need to know, that’s not
the case. No general safety training, for example, is going to include
the specifics—Sorry, the noise is the printer, which is in my office.
The kids are printing something for school. Think about what
lab-specific activities, the specific properties of an infectious agent
or whatever, that the PI needs to make sure they expand upon for that
It seems very important that there is the didactic part of the
curriculum and the mentored part of the curriculum, and that they
somehow connect, amplify each other, and dig down, as Craig said. That
dovetails very nicely into the next question, which is, "How can
EH&S staff promote this culture of laboratory safety?" Given Craig's
point and your point, Jyllian, you can't just do one training at the
beginning. EH&S has one important role, but there's an awful lot
more to it. How can they be connected into the process in a maybe
more meaningful way? Is there a way?
Yeah, there’s lots of things EH&S can and should be doing. It’s
critical that they be seen as collaborators to facilitate cutting-edge
science and not the policemen. Too often, EH&S might come in to do
safety inspections in the lab and then sometimes the researchers
actually flee the lab, and that’s the worst thing. You want to have
collaboration. So EH&S should always be meeting with researchers and
labs when they do inspections.
EH&S should engage the researchers to find out what type of research
they're doing. Ask the researchers what they feel are the most
hazardous operations they perform, and then have discussions and
communications to address those topics. I think the most important
thing is for EH&S to be seen as, and to actively work to be,
collaborators promoting the science—safe science.
Do either of you know of any models where EH&S has been brought in
to the curriculum part of graduate education, for example, so they’re
not just the three hours at the beginning and then the police force, but
maybe there’s a collaboration in teaching as the curriculum goes along?
Has that been explored?
I think there are a couple, but they’re mostly smaller schools, not big
research universities. Craig, I don’t know if you’ve got any others.
It depends on the school. At UCLA, where I work, I teach a course
titled "Safety in Chemical and Biochemical Research," and I actually
bring in people from EH&S to give presentations. I know at some
other schools EH&S is engaged in some of the training at the
department level, and that's one of the ways EH&S can better
establish connections with the students. If they come in, make
presentations, and are part of the safety training, then the students,
from an early point, can see the members of EH&S and understand the
perspectives and goals of EH&S.
So another question. Craig, this is for you, though I think Jyllian
could probably comment as well: "What's your schedule for checking in
with labs? And what's the best approach to involving the labs
themselves in improving their safety culture?"
I think I’m catching the gist of the question. And so it’s really
important that the researchers, the students, get really engaged in lab
safety. And one of my favorite political comments is that when the
people lead, the leaders will follow. We’ve seen a number of
institutions that have really great student organizations, student-led
I think a pioneering one was at University of Minnesota, but now there’s
a number of universities, R1 universities around the nation where there
are actually student-led organizations focusing on safety. So that’s one
of the things that can be done, and that can be institution-wide. So
I’ve seen departments start it, graduate students within one department,
then it spreads to the university.
But also within a research group: students can bring up in a group
meeting, "Oh, by the way, this issue came up with me this week. I had
this problem." Or, "By the way, this was recently reported in the
literature. Since we do this hazardous operation, I think we should
discuss it." So there are things students can do to directly promote
laboratory safety and the culture of safety.
Jyllian, here’s one that maybe you can start with. “What do you think
academia can learn from industry safety practices,” since you’ve covered
Well, going momentarily back to the previous question: the managing
editor of the ACS journal Chemical Health & Safety used to be part
of one such student safety organization when she was a graduate student,
so if anyone is interested in pursuing that concept, I know there are
some resources out there. I don't have them at my fingertips right now,
but if you reach out to me—I have a very Google-able name; you can
find my email address or Twitter account—I can pass those on.
In terms of what academia can learn from industry, one of the
challenges is that they have very different worker populations.
Academia, proportionately, has a much younger workforce with much
higher turnover than industry does. If I think about what I was like in
my early or mid 20s, you tend to think you're a little bit invincible
at that age; the recognition of your own mortality tends to come later.
I think that's a really important thing for schools to recognize. It's
not an excuse for lax safety practices; if anything, it's an argument
that you need to pay even more attention. So that's not really learning
from industry safety practice directly, but it's understanding what
some of the differences are. Industry is pretty clear: you don't follow
safety practices, you're out.
And I do know of cases where someone has fired a graduate student for
poor safety practice. It is possible to do it. People don’t seem to want
to take those steps, but maybe a little more enforcement is appropriate.
Just to add to her comments, there are several active things being
done. One is that when industry researchers make presentations, they
can emphasize the safety considerations that go into an industrial
approach to research, process, and production.
Oftentimes there are also contacts between industrial researchers and
academic researchers, and industry offers its expertise. I've had a
number of interactions with pharmaceutical companies, and they're very
interested in helping academics improve laboratory safety. And then
some companies have gone above and beyond.
Dow Chemical has the Dow Safety Academy, which is a set of resources
available to promote safe research. Dow has direct connections with a
number of universities, but its website also contains freely available
materials, including videos on safely using materials. So there are a
number of things that can be done there and are being done.
I would think, and we sort of posit this in our paper, that if a
graduate program is teaching, or trying to teach, to the higher safety
standards of industry, then in addition to probably improving safety in
that institution's labs, its students will be better prepared for jobs
in industry, because they won't have to be retrained to higher
standards.
Yes, I hear that over and over from industry: "We feel like we have to
do remedial training for people coming out of graduate school." And
they would love not to have to do that, to the degree that they
currently do.
Yes, multiple benefits. What about suggestions for encouraging PIs to
take a more active role in promoting safety? Craig, you identified that
as a key problem. What are the steps that institutions and others should
take to do that?
Yeah, I did mention a few minutes ago having the students help
initiate. PIs really value their students and want them to be as
productive as possible in producing the science. So if the students go
to the PIs and say, "We want this to be part of group meetings," or,
"We want this to be part of the departmental education," I think that's
a good starting point.
I think administrators clearly need to put requirements on PIs to
promote lab safety, and when it comes to training grants, the
administrators can demand that type of action from the PIs. And the
NIH, from its perspective, could also demand that this be part of
training grant programs, asking what types of actions are being taken
at the PI level.
In site visits for training grants, the reviewers could ask what the
individual research PIs are doing to promote lab safety. So there are a
number of actions there. And then, maybe not for training grants but
for individual research grants, the NIH, for example, could ask PIs to
identify the hazards and risks in the research being proposed in the
grant.
The second part would be what the PI is going to do to mitigate those
risks, and the third part would be what training will be done as part
of the grant to promote safe research—training for the grad students
and postdoctoral scholars.
I think, Jyllian, ACS journals are now requiring special safety
sections, isn’t that right?
They are requiring safety information, although it varies by journal.
Some of that is because there are so many journals. We have ACS
Chemical Health & Safety, also Organic Process Research &
Development, and we also have more computational journals, more
biochemical ones, and then, of course, organic chemistry and inorganic
chemistry. So how that gets implemented varies quite a bit across the
board.
Honestly, I haven't asked recently how people think that's going, so
I'm not sure. It's certainly better than not asking authors to do it,
but I'm not sure how it's gone in practice.
In my opinion, it could be strengthened quite a bit, and I've made
comments to some people at ACS on that. As an example, when people
review papers, there are typically some free-response sections, but
also a few specific questions, like: What's the scientific merit of
this research? Do the results adequately support the conclusions? A
series of brief questions the reviewer is supposed to answer. It would
be easy for the journals to add one or two more questions for the
reviewers, such as: Are any particular hazards highlighted in the
safety information, and are measures given to mitigate any risks? It
would be easy to add a question in the review forms about whether
safety is addressed in the paper.
And as you said, the same possibly for grants. One of the attendees
says that her research is actually in academic safety, which is great
to hear.
Send me an email!
Send Craig an email. He wants to hear from you.
And Jyllian too. So that’s a great connection right there.
But that brings up the question of what more research would you like to
see done in this area? What are some key questions that we just don’t
know the answers to and would help us improve safety culture if we did?
I think some of the stuff that Craig has described, really trying to
understand what the drivers and motivations are for… Going back to the
original question: Does PI engagement make a difference? It turns out
that, yes, as you’d expect, it does.
Graduate students in general don't pay a whole lot of attention to
stuff that their advisor doesn't care about. I think the next step from
there is: What sorts of interventions work? Are there different
approaches to training that are more or less effective? There is
probably a lot of literature on this outside of laboratory safety, so
we could probably capitalize on some of that in terms of what training
approaches work and make some generalizations from it. Laboratory
scientists are data-driven people; if you can give them the data to say
this makes a difference, they are probably that much more likely to pay
attention to it.
Definitely. And there are other fields that have been doing safety
analysis for quite a long time, and we could translate that information
over. For example, the aviation industry has very robust procedures for
safety because the consequences are so severe, and there are things
done in aviation that could be emulated in research. One example is
checklists: you don't want your pilot to have the A310 lift off without
going through the checklist of safety issues. We could think about the
same thing for lab safety: when you're getting ready to fire up that
centrifuge, have you done A, B, C, D? There are a number of things that
can be looked at.
In engineering there’s a number of programs that actually have safety
analysis as a course in engineering. So I think looking at how we can
translate safety information and safety culture promotion from other
areas to lab research, I think that would be one thing to look at. As
Jyllian mentioned, what type of training is most effective? Do we want
to call it, say, safety education, maybe not safety training?
And then the structures of universities and how they promote safety
action. So human factor analysis. Many accidents occur because of human
actions and not paying attention to certain things, and so research
related to understanding human factors and how they contribute to lab
I know we’re almost at the time here, but going back to having a strong
safety culture. Where do people really start to become scientists? It’s
in their undergraduate labs. And I think people could do a better job of
building safety into the curriculum as a way to engage students and have
them understand the science that they’re doing.
The safety concerns are not separable from what you're doing
scientifically. If you understand the virulence of a particular
pathogen, for example, that drives your safety considerations. So
rather than just handing students a list of rules for the labs they're
taking, are there ways to have them engage in the hazard-assessment and
risk-assessment work for the teaching labs?
There’s a great example from Seattle University where they break
students up into groups and each group—separate from their lab
groups—each group is responsible for doing the risk assessment for the
laboratory and presenting it to the class. And it’s a good literature
exercise, it’s a good critical thinking exercise, and there’s lots of
pedagogy in there that you’re fulfilling that becomes more useful in the
long run in terms of setting that culture—this is part of what we do in
the laboratory—than handing people a set of rules.
We’re just about out of time. I’ll give one more question before we wrap
up. I’ll say there are lots of questions still in the Q&A. We’re
going to save these, and we’ll figure out if there is some way we can
answer those in the future in different ways. We’re just hitting the tip
of the iceberg here, I think. But the last question that came through
that I’ll ask you is, “What more can government agencies—NIH, NSF,
elsewhere—be doing, or state agencies?” Oh, nothing. Good. [LAUGHING]
So I did make some comments about grant applications: we can ask the
people proposing the research to identify the hazards, how they are
going to mitigate those hazards, and how they are going to train the
researchers. And at the government or institutional level, we can make
demands on institutions that they have robust safety programs and
engagement at all levels.
We need to make sure that the leaders of institutions value safety and
promote safety. That was one of the other chat questions. There's
nothing more destructive to scientific progress than to have an
accident and then have to spend enormous amounts of resources
addressing it. Safety is always much more efficient than having an
accident and trying to clean up the mess afterwards.
Nothing really to add on that front except to note, going back to the
enormous amount of resources involved: time and materials aside, UCLA
spent, if I remember correctly, somewhere on the order of $4.5 million
on external legal fees alone around Sheri Sangji's death. And that
doesn't include anyone's internal time or anything like that, or even
Sheri's medical bills. So yeah, better not to have the accident in the
first place.
Absolutely. Well, thank you so much. Again, I think we’ve just hit the
tip of the iceberg, but this is a great start of this conversation. I
hope we’ll be able to find ways to continue it. This is a real area of
emphasis for NIGMS and, I think, NIH as a whole, and, again, we’ll put
this up on the web. I encourage you to share it. And we will find ways
to answer the rest of the questions.
So Jyllian, Craig, thanks so much to both of you.
And thank you, Jon.
Thank you very much for having us.