Listed below are the details of the projects funded under
Advanced Statistical Experimental Design, Ethics and Data Analysis
Principal Investigator: Jeremy Boss, Ph.D., Emory University
This project aims to develop and teach a new semester-long course, Advanced Statistical Experimental Design, Ethics and Data Analysis, to provide graduate students with advanced knowledge of experimental design and statistical methodology beyond the introductory biostatistics and bioinformatics courses. The course will be taught in a multimodal format combining lectures, a practicum providing opportunities to work directly with datasets, a seminar series with lectures from data scientists in the field, and peer-to-peer consulting led by students who have completed the course. A website will also be developed containing prerequisite material, lecture videos, and other resources.
Accomplishments: Funding was provided to design a course in data analytics. The Advanced Seminar in Data Analytics course has been taught twice and is now part of the Genetics and Molecular Biology curriculum. In addition to enabling the design and implementation of the course, this support allowed outside speakers to be brought in for special lectures.
Best Practices to Ensure Reproducibility and Rigor in Research
Principal Investigator: Eugene Orringer, M.D., University of North Carolina at Chapel Hill
This project will provide formal training in research design and methods, data acquisition, and data analysis to ensure rigorous and reproducible science through a new 7-week, one-credit course, Best Practices for Reproducibility and Rigor in Research. Topics will include experimental design; experimental rigor; training and standard operating procedures; data acquisition and archiving; data analysis and reporting; recognizing biological diversity and potential artifacts; and best practices in pharma and biotech, presented by industry representatives. The course will be offered to students in T32 programs and made available to all graduate students at the University of North Carolina in subsequent years.
Accomplishments: The supplement supported the organization of a new course for graduate students that provides a strong foundation in research design and methods related to conducting reproducible and rigorous research. Through a collaborative effort with the UNC BBSP, 7 workshops were developed covering everything from starting an experiment to publishing the results. The workshops were offered to graduate students in May 2016 and will be offered again next spring.
Experimental Design for Biomedical Trainees
Principal Investigator: Joey Barnett, Ph.D., Vanderbilt University Medical Center
This project aims to develop and implement a curriculum covering topics such as experimental design, determining appropriate sample size, selection of exclusion criteria, awareness of bias, and other statistical considerations. The curriculum will be delivered through lectures, small group discussions, and "walk-in clinics" offering biostatistics consultation, and is designed to prepare Vanderbilt T32 trainees, chemical biology graduate students, and Meharry Medical College students training at Vanderbilt to address issues of reproducibility in science.
Accomplishments: The supplement was used to develop a program on experimental design and reproducibility for first-year students at Vanderbilt. The program also gave students regular access to biostatisticians to help plan their experiments. The course is now being taught. A brief description of Barnett's program was included in a Nature story ("Reproducibility: Seek out stronger science," http://www.nature.com/nature/journal/v537/n7622/full/nj7622-703a.html). In addition to the full course, short (1-2 hour) lectures introducing the history of the current interest in rigor and reproducibility, along with basic principles, have been developed. These have been delivered to students recently appointed to training grants, most of them not NIGMS-funded. The course directors developed a session on this topic for the ASPET Teaching Institute at Experimental Biology in April 2017: Dr. Barnett served as organizer and chair of the ASPET Teaching Institute session "Supporting Experiment Design and Rigor in Graduate Training," Experimental Biology, Chicago, IL (April 2017), and Dr. Damon presented the results to date. Dr. Barnett presented a lecture on rigor and reproducibility efforts at ORPHEUS (Organization for PhD Education in Biomedicine and Health Sciences in the European System), an international conference on graduate training: "Efforts to enhance rigor and reproducibility in science in the United States," Klaipeda, Lithuania (May 2017). Drs. Barnett and Damon presented a poster, "Survey of Student Knowledge and Behaviors Related to Experimental Design," at the TWD meeting in Baltimore in June 2017. Drs. Barnett and Damon have written, and will submit in 2018, an R25 application to expand the pilot course across Vanderbilt and its training partners (Meharry, Fisk, TSU) in Nashville.
Hypothesis, Design and Biostatistics
Principal Investigator: Matthew Trudeau, Ph.D., University of Maryland, Baltimore
This project will develop a new course, Hypothesis, Design and Biostatistics, with an emphasis on principles of rigor and reproducibility in science. The course will consist primarily of online content, and faculty will work with the Center for Academic Innovation and Distance Education at the University of Maryland to develop online materials and adaptive learning modules based on adult learning theory. The course and materials will be available to students at the University of Maryland and have the potential for broad accessibility and dissemination given their online format.
Accomplishments: This course has trained 25 graduate students, including 8 T32 fellows and 4 NIGMS T32 fellows. Students came from the Training Program in Integrated Membrane Biology (TPIMB), the MSTP, and the Training Program in Neuroscience, as well as other programs. The 3-credit course entails approximately 2 hours (or more) of online work and 1 hour of in-class time per week and ran for 17 weeks. Learning outcomes were evaluated by (1) online quizzes submitted to and evaluated by the coursemaster; (2) problem sets worked during out-of-class time and submitted electronically; (3) in-class time spent in small groups, with the instructor as a guide, to master the problem sets; (4) a midterm exam; and (5) a final exam. Exams were electronic and taken at a computer workstation. Grades were determined by participation in class small groups and problem sets (30%) and the two exams (35% each). Grades awarded were 19 A, 5 B, and 1 C. Now in its 2nd year, the course has enrolled 27 new graduate students, including 6 T32 trainees, of whom 3 are NIGMS T32 fellows (2 in the TPIMB and 1 in the MSTP). Results will be measured at the end of the semester by grades awarded. In summary, a total of 52 graduate students have been trained (or are in training) to date. The course will now continue in perpetuity.
Improving Rigor and Reproducibility in Big Data Research
Principal Investigator: Bruce Weir, Ph.D., University of Washington
This project will design new curriculum and enhance existing training in bioinformatics and machine learning to provide trainees with the quantitative expertise necessary to extract knowledge from biomedical big data. Activities include developing a data science curriculum for the biomedical sciences at the University of Washington, a course on data visualization for biomedical big data, a short course on graphical models and network analysis, revision of the Ethical Issues for Biostatisticians course, and a graduate certificate in data science for public health and medicine.
Accomplishments: The supplement allowed for the planning of three new graduate-level courses:
BIOST 544 Introduction to Biomedical Data Science (4 credits). Provides an introduction to biomedical data science with an emphasis on statistical perspectives, including the process of collecting, organizing, and integrating information toward extracting knowledge from data in public health, biology, and medicine. Prerequisite: either BIOST 511 or equivalent; either BIOST 509 or equivalent; or permission of instructor.
BIOST 545 Biostatistical Methods for Big Omics Data (3 credits). This "hands-on" course introduces statistical methods for high-dimensional omics data, as well as the R programming language and the Bioconductor project as tools to extract, query, integrate, visualize, and analyze real-world omics data sets. Prerequisite: BIOST 512, 514, or 517.
BIOST 546 Machine Learning for Biomedical and Public Health Big Data (3 credits). Provides an introduction to statistical learning for biomedical and public health data. Intended for graduate students in the School of Public Health or the School of Medicine.
These courses are designed to lead into an MS degree in statistics or the new MS degree in data science, and they serve to bring statistical training to biomedical students.
Mastering the Art of Reproducible Science
Principal Investigator: Bradley Yoder, Ph.D., University of Alabama at Birmingham
This project will develop and pilot a team-based learning course, Mastering the Art of Reproducible Science. The 16-week course is structured around four modules: reproducibility issues associated with tools, reagents, and research approaches; reproducibility issues involving pressure, bias, and data analysis and presentation; reproducibility in preclinical testing and failures in clinical studies; and reproducibility involving gender, race, age, and health disparities. Each module will include reading materials, web-based videos, and exercises to be completed outside of the classroom. Teams of students will then work through a series of application exercises involving scenarios that impact reproducibility and will produce short videos demonstrating the principle of reproducible research learned in each scenario.
Accomplishments: The 3 NIGMS training programs at UAB launched this course in fall 2016. The 21 NIGMS students analyzed published research, developed scenarios in which reproducibility was compromised, and devised strategies to use in their labs to ensure reproducible results. The course covered: 1) reproducibility issues associated with tools, reagents, and approaches; 2) reproducibility issues involving pressure, bias, data analysis, and presentation; 3) reproducibility in preclinical and clinical studies; 4) reproducibility involving gender, race, age, and health disparities; and 5) a panel of editors, scientists, and bioethicists who discussed how they are addressing reproducibility problems.
Quantitative Approaches to Experimental Design and Intro to Individual Development
Principal Investigator: Miguel Garcia-Diaz, Ph.D., State University of New York, Stony Brook
This project will implement curricular changes to enhance both rigor and reproducibility and the exposure of predoctoral students to multiple career paths in biomedical science. Curricular changes include the development of a 1-credit course, Quantitative Approaches to Experimental Design, for all first-year students; enhancement of the Statistics in Life Sciences course to expand on introductory content; requiring students to address reproducibility in their research seminar presentations; developing new elective courses; and revising the RCR training to include issues of reproducibility. The project will also implement a 1-credit Individual Development Plan course in which students evaluate their interest in, and develop plans for, their preferred career path. A Career Conference will also be held every 2 years to provide exposure to different career paths.
Accomplishments: The supplement was used to design a course titled "Introduction to Computational and Quantitative Methods in Biology" for the program's trainees. The course provides an introduction to computational methods using the Python programming language. The supplement was also used to purchase iPad Pro/Microsoft Surface Pro tablets for T32 students and a graphical workstation with software for graphical and statistical analysis. These purchases allowed graduate students to learn how to maintain electronic laboratory notebooks and handle notebook data acquisition. The program also developed a Rigor and Reproducibility training program for its trainees to discuss experimental design. The supplement additionally supported the establishment of a yearly career workshop to expose trainees to different career options and to discuss the importance of Individual Development Plans and career planning.
Training in Experimental Design for Biomolecular Pharmacology
Principal Investigator: David Farb, Ph.D., Boston University
This project will enhance training in experimental design and reproducibility of research results by developing a new training core within the predoctoral training program in biomolecular pharmacology at Boston University. The training core will consist of a two-semester course emphasizing a 360-degree view of study design, objective statistical evaluation of results, blinding of investigators during analysis of research data, directed measures of quality control, and communication of the scientific product; students will apply this knowledge in developing NRSA proposals. The course will also include collaboration with the pharmaceutical industry (Pfizer) and "role-playing" exercises to replicate real-world experiences in communication and decision-making related to scientific results.
Accomplishments: The supplement was used to introduce the topic of experimental design and reproducibility into the core curriculum. Starting in the spring semester of 2016, course directors included experimental design and rigor/reproducibility training in the first-year course. In the 2nd-year Current Topics course, students are provided information and publications about the importance of experimental design and data reproducibility in research. A seminar course was developed that discusses strategies for building rigor into experimental planning to reduce the risk that results will not be reproducible. A discussion session on industry practices for maximizing the reproducibility of experimentation was arranged for five Biomolecular Pharmacology trainees at Pfizer and five at Biogen, together with their scientific supervisors, at the end of their 7-week summer rotations. A new elective course for the Biomolecular Pharmacology Training Program, designed to enhance trainees' skills in statistical analysis of genomic data, was reviewed and approved.
A course, entitled Genomics Data Mining and Statistics (BS831, 2 credits), was developed by faculty in Computational Biomedicine, Department of Medicine and offered through the Department of Biostatistics at BU School of Public Health as a direct result of the institutional engagement supported by this NIGMS supplement.
Using ORANGE to Improve Rigor and Reproducibility in the Analysis of Large Datasets
Principal Investigator: Theodore Wensel, Ph.D., Baylor College of Medicine
This project will provide graduate students with a strong foundation in methods for designing and conducting reproducible and rigorous research by teaching an innovative 2-week course on Orange, an analytical tool for large data sets. The course will focus on data mining essentials, covering standard approaches to clustering, classification, regression, and model selection, as well as domain-oriented techniques including gene enrichment analysis. In the second phase of the project, students from the Orange course will participate in a 7-month application of the content through three team projects. The team project component will also include access to mentors, guest speakers presenting professional development seminars, and a business-plan-style competition.
Accomplishments (supplement T32-GM008280-27S1): The goal was to design and implement a course to train students in reproducible data analysis. The focus was on reproducible analysis of large data sets rather than, for example, reproducible acquisition of primary data. A course was developed based on the data mining software suite Orange. Students were given lectures on reproducible analysis and the correct use of statistical measures, along with hands-on exercises using the software. There was great interest in the course, and 40 students from a range of Ph.D. programs in the Graduate School of Biomedical Sciences enrolled in and completed it. The students' self-assessment at the end of the course was that they had advanced significantly in their understanding of reproducible data analysis, and their assessment of the course was overwhelmingly positive. The plan had initially been to assess the effectiveness of the course objectively by conducting a "reproducibility challenge," with volunteers assessed with and without the benefit of the course.
Although 14 participants in the course agreed to take part, an appropriate control group of students who would undergo the same assessment without the course could not be identified. As it did not seem wise to conduct what might itself be an irreproducible assessment as part of a course on reproducibility, this idea was abandoned. Because of the positive response of the students and instructors, the course was implemented as an ongoing part of the Graduate School of Biomedical Sciences curriculum. This year the course again has a full enrollment of 40 students.
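The course's emphasis on reproducible analysis of large data sets can be illustrated with a minimal sketch. The example below is not drawn from the course materials and uses plain Python rather than the Orange suite; it shows two habits such training typically stresses: seeding any stochastic analysis step (here, a bootstrap) so it is bit-for-bit repeatable, and recording provenance alongside the result.

```python
import random
import platform

def reproducible_summary(values, n_boot=1000, seed=42):
    """Bootstrap a 95% interval for the mean with a fixed seed,
    recording provenance so the analysis can be rerun identically."""
    rng = random.Random(seed)  # local seeded RNG, never global state
    boots = []
    for _ in range(n_boot):
        # resample with replacement, same size as the original data
        sample = [rng.choice(values) for _ in values]
        boots.append(sum(sample) / len(sample))
    boots.sort()
    # record enough context to repeat the run exactly
    provenance = {"python": platform.python_version(),
                  "seed": seed, "n_boot": n_boot}
    lo = boots[int(0.025 * n_boot)]   # 2.5th percentile
    hi = boots[int(0.975 * n_boot)]   # 97.5th percentile
    return (lo, hi), provenance

data = [2.1, 3.4, 2.9, 3.8, 2.5, 3.1, 2.7, 3.3]
interval1, prov = reproducible_summary(data)
interval2, _ = reproducible_summary(data)
assert interval1 == interval2  # same seed, identical result every run
```

Because the random generator is local and seeded, any collaborator rerunning the script with the recorded seed reproduces the interval exactly; the same discipline applies when the stochastic step is clustering or model selection in a tool like Orange.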
Adoption of Good Research Practices
Principal Investigator: Yoji Shimizu, Ph.D., University of Minnesota
This project is to develop educational activities aimed at providing predoctoral trainees with foundational training in research quality assurance best practices, also known as Good Research Practices (GRP). The curriculum will be delivered using lectures, discussion, interactive workshops, and one-on-one support opportunities. A website will provide lecture videos, course readings, GRP integration tools, templates, and self-assessment checklists. The evaluation plan includes trainee and course evaluation using results from GRP trainee audits, trainee satisfaction surveys, data reconstruction exercises, analysis of pre- and post-training assessments by trainees, and GRP integration metrics.
Accomplishments: We implemented a pilot course to introduce predoctoral trainees to Research Quality Assurance (RQA) and to help them establish an individualized RQA process to maintain research rigor, data quality, and integrity throughout their specific research activity. Faculty with QA and data management expertise worked with trainees to facilitate their adoption of risk-based RQA best practices in their research environments. The implementation of RQA best practices helps ensure that data are collected, generated, used, and applied in a transparent, reliable, and secure way throughout the research life cycle. This is achieved through attention to the risk-based management of projects, personnel, facilities, equipment, materials and reagents, assay validation, procedures, research and work records, computerized systems, and research review. Twelve predoctoral trainees volunteered to participate in five group workshops, 1 Data Carpentry workshop (http://www.datacarpentry.org), individual trainee-facilitator sessions, and a focus group led by an external facilitator and evaluator.
Program topics included: targeting RQA to specific projects; identifying and mitigating risk to research data quality; managing research data to improve research reproducibility; data organization and cleaning; and introductory computational skills needed for data management and analysis (data carpentry). Trainees conducted data risk assessments, developed standard operating procedures, generated equipment and reagent inventories, and designed data and research management policies and procedures. Seven of the 12 trainees completed the program. We are in the process of evaluating the course and trainee attitudes using trainee evaluations, pre- and post-training assessment data, and an external focus group. Preliminary data indicate that the trainees found the workshops useful: average responses ranged from 3.3 to 4.8 across the workshops on a 5-point scale, where 5 = strongly agree that the content was useful, 4 = agree, 3 = neutral, 2 = disagree, and 1 = strongly disagree. However, 7 of 9 trainees had difficulty completing RQA assignments, suggesting that time constraints may impede successful integration of RQA best practices if implementation support is not provided within the laboratory or institution.
Development of an Online Course on Statistical and Computational Tools for Reproducible Science
Principal Investigator: Xihong Lin, Ph.D., Harvard School of Public Health
This project is to develop an online HarvardX course, Principles and Statistical and Computational Tools for Reproducible Science. HarvardX is a Harvard University-wide strategic initiative, overseen by the Office of the Vice Provost for Advances in Learning, to enable faculty to build and create open online learning experiences for residential and online use. The proposed online course will consist of eight 1-hour lectures on quantitative reproducible science taught by the faculty investigators on this T32 supplement, coupled with flipped-classroom discussions led by the faculty instructors and augmented with support from teaching assistants. Topics include fundamentals of reproducible science, case studies, data provenance, statistical methods and computational tools for reproducible science, and reproducible science reporting. All trainees on the NIGMS T32 training grant will be required to take this course, and their performance will be evaluated.
Accomplishments: The course PH527X: Principles, Statistical, and Computational Tools for Reproducible Science was successfully launched in October 2017. From the course description: Today the principles and techniques of reproducible research are more important than ever, across diverse disciplines from astrophysics to political science. No one wants to do research that can't be reproduced. Thus, this course is really for anyone who is doing any data-intensive research. While many of us come from a biomedical background, this course is for a broad audience of data scientists.
To meet the needs of the scientific community, this course will examine the fundamentals of methods and tools for reproducible research. Led by experienced faculty from the Harvard T.H. Chan School of Public Health, you will participate in six modules that will include several case studies that illustrate the significant impact of reproducible research methods on scientific discovery.
Learn skills and tools that support data science and reproducible research, to ensure you can trust your own research results, reproduce them yourself, and communicate them to others.
Ensuring Rigor and Reproducibility: A Team-Based Approach
Principal Investigator: Perry Victor Halushka, Ph.D., Medical University of South Carolina
This project will support the development of a curriculum focused on rigor, transparency, and reproducibility to enhance the training of students in the Medical Scientist Training Program at the Medical University of South Carolina. The activities include development of a course covering validation of biological resources, principles of study design, data handling and analysis, recording and reporting, and external influences. The course will be evaluated in the context of teaching effectiveness and student learning, and the project includes a dissemination plan as well as a plan for instruction in the responsible conduct of research.
Accomplishments: The course started on August 22, 2017. The 15-week, 1-hour-per-week course is designed for trainees to develop a comprehensive understanding of the biological and statistical variables that can significantly impact the rigor and reproducibility of experimental results. The course comprises a series of reading assignments, lectures, discussion groups, and a final written synopsis of lessons learned. The students will also write a short research proposal at the beginning of the course, with 2 weeks to turn it in; it will serve as the starting point for a revision to be turned in at the end of the semester. The objective is to see whether the students have mastered the principles of designing a rigorous NIH-type research application by incorporating what they learned into the revision. Course Evaluation: The course will undergo a formal evaluation that uses both quantitative and qualitative data to ensure the most rigorous, effective, and efficient training. Students will complete the "Student Assessment of Learning Gains" instrument to assess perceived educational gains.
The online survey will be administered during weeks 1 and 15 to assess the impact of the course activities as well as student gains toward learning objectives. All student responses are anonymous. This information will be used for ongoing course development and enhancement.
Experimental Design, Biostatistics and Biological Variable Consideration
Principal Investigator: Kimberly L. Mowry, Ph.D., Brown University
This project is to design and implement seven new training modules in rigor and reproducibility featuring topics such as experimental design, biostatistics for the bench scientist, biological variable consideration, transparency/reproduction of published experimental wet-bench and big-data methods, and team science. Trainees will participate in this training during their first year of graduate study, and the training modules will use web-based materials developed to engage the trainees prior to in-class small group discussions and hands-on experimental scenarios. Plans to assess the impact of the training include assessment of students' core competencies by online quizzes, completion of anonymous web-based curriculum evaluation forms by students, and an annual survey of students and research mentors using an anonymous web-based system to gauge the impact of the training.
Accomplishments: We designed and implemented seven new training modules in Rigor and Transparency (R&T) for first-year PhD students. The series used a blended design with Canvas-based online components and in-person discussions. We incorporated videos of our instructors explaining content that often confuses students and designed exercises that required the students to apply new or sharpened skills and perspectives to their concurrent laboratory research projects. The R&T modules were as follows: 1) Foundations of Experimental Design, 2) Variables and Variation in Experimental Biology, 3) Transparency/Reproduction of Big-Data Methods, 4) Critique and Revision of Published Experimental Methods, 5) Team Science, and 6-7) Biostatistics for the Bench Scientist (two modules).
The seven modules were offered in January-June of 2017 and we have received constructive feedback from both the trainees and instructors that will guide revisions to the training in future years. It is planned that the R&T program will be permanently incorporated into the core training curriculum, along with the Responsible Conduct of Research and Individual Development Plans, for all PhD students in the Division of Biology & Medicine at Brown University.
Experimental Design, Biostatistics, and Quantitative Analysis
Principal Investigator: Kimberly L. Mowry, Ph.D., Brown University
This project is to develop a curriculum that focuses on core issues and best practices that cut across all types of biomedical research. Topics include: developing awareness of the background for concerns about replication failure; defining critical factors in experimental design and data management; minimizing unrecognized bias in sampling and data analysis; testing hypotheses vs. "proving" hypotheses; statistical analysis; consideration of biological variables, including sex, and authentication of reagents; transparency in reporting; balancing enhancement of rigor with the 3 R's that govern the use of animals in research; rigor and the sociology of science; perverse incentives and career advancement; and practical aspects of addressing new NIH requirements to address rigor in experimental design. The team plans to incorporate an assessment of the planned curricular activities into future evaluations of other training programs.
Accomplishments: The project began with a 5-day training session for postdoctoral fellows and graduate students in September 2016 to test teaching approaches and strategies and obtain feedback. Based on this experience, a new course for MSTP students will be piloted in April 2017, interleaved with a previous course on statistical design. Course Overview: This course will cover guidelines and best practices for experimental design in biomedical research and will consist of a total of eight sessions over four weeks. The work includes developing reading lists and exercises, as well as experimenting with videotaping key didactic sessions. An important component will be continuous feedback and evaluation from the participants; where appropriate, faculty will ask the students to participate in presentations that have been modified based on their comments.
Fundamental Concepts of Study Design, Statistics and Informatics
Principal Investigator: Joanna Louise Groden, Ph.D., Ohio State University
This project will develop a new course for T32 trainees, to be made available to all Ohio State University graduate students, focused on concepts of reproducibility and rigor in the practice of biomedical science research. The course content covers the fundamental concepts of study design, statistics, and informatics; the intent, however, is to illustrate these principles in practice and to demonstrate scenarios in which they are both adhered to and violated. The success of the course will be evaluated by students and external advisory boards.
Accomplishments: The course "Rigorous and Reproducible Design and Data Analysis" taught the analysis of datasets and principles of reproducibility and rigor. The course enrolled 19 students from 9 different graduate programs; nine were NIGMS T32 fellows. Lecture material included examples from the literature of experimental designs that were rigorous and designs with built-in flaws. Students learned programming skills and analytic tools for large datasets. At the completion of the course, students were expected to have an intermediate level of competency in the R programming language for data analysis and knowledge of how to manage and analyze large datasets.
Improved Reagent Verification as a Means for Enhanced Research Reproducibility
Principal Investigator: Raghavendra G. Mirmira, Ph.D., Indiana University-Purdue University at Indianapolis
This project will develop a novel course, Improved Reagent Verification as a Means for Enhanced Research Reproducibility, which is not currently available at the Indiana University School of Medicine. The course will entail a traditional didactic lecture series accompanied by relevant whole-class practical exercises allowing students to implement basic methods of reagent verification. The project also plans to provide Pre-Doctoral Student Intramural Grants, awarded on a competitive basis, to be used by predoctoral graduate students for reagent verification or for pilot studies examining biologic variability. The course is designed to run for 10 weeks, meeting once a week for 2 hours. To complement the lecture topics, the class will engage in whole-class practicums demonstrating how to validate reagents beyond standard in-lab surveillance methods.
Accomplishments:
Integrated Introduction to Biostatistics and Computation
Principal Investigator: Wayne M. Yokoyama, Ph.D., Washington University
This project will develop a course with four central components: basic computer programming for data analysis, statistical analysis by computer simulation, an intuitive introduction to statistical hypothesis testing (i.e., p-values) and computational tools for experimental design. The course will be evaluated by using pre- and post-class surveys to gauge perceived gains in student knowledge of programming, statistics, experimental design and data analysis and to garner feedback on how to improve the class in subsequent years; by analysis during and after the course to provide critical, expert evaluation of how best to train students in effective experimental design and data analysis; and by having one of the faculty members (an experimental biologist and a seasoned educator) attend the course to provide a perspective on how best to evolve the class to meet student needs.
Accomplishments: The course was launched in September 2016 with 37 students, including MD-PhD and PhD students in the Division of Biology and Biomedical Sciences and MD-PhD students in the Department of Biomedical Engineering. The 2-hour-per-week course alternates between lectures and interactive workshops to teach computation, statistics and experimental design. A key component of the course is in-class programming exercises, completed with the assistance of tutors. Lectures are being recorded and a website is being developed to allow other groups of learners (e.g., clinical fellows) to benefit from the material. Student surveys at the conclusion of the course indicated that 1) knowledge of computer programming increased from 1.8 to 3.1 on a 5-point Likert scale and 2) knowledge of basic statistical concepts increased from 2.5 to 3.8 on the same scale. We believe the course effectively introduces early-stage graduate students to basic programming skills and basic concepts in statistics. The September 2017 iteration of the course will enroll 60 students, a 62% increase over the initial offering.
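The course's "statistical analysis by computer simulation" component can be sketched with a short, self-contained example. The code below is illustrative only, not drawn from the course materials (the course itself may use a different language and dataset): a permutation test that estimates a two-sided p-value by repeatedly shuffling group labels, which makes the null distribution concrete for students before formal theory is introduced.

```python
import random
from statistics import mean

def permutation_p_value(a, b, n_perm=5000, seed=0):
    """Two-sided permutation p-value for a difference in group means."""
    rng = random.Random(seed)
    observed = abs(mean(a) - mean(b))
    pooled = list(a) + list(b)
    n_a = len(a)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # re-deal the pooled values into two groups
        diff = abs(mean(pooled[:n_a]) - mean(pooled[n_a:]))
        if diff >= observed:
            count += 1
    # Add-one smoothing keeps the estimate away from exactly zero
    return (count + 1) / (n_perm + 1)

# Two samples drawn from the same distribution: the test should usually
# report a large (non-significant) p-value
rng = random.Random(42)
x = [rng.gauss(0, 1) for _ in range(30)]
y = [rng.gauss(0, 1) for _ in range(30)]
p = permutation_p_value(x, y)
```

The appeal of this approach for teaching is that the p-value is literally counted rather than read off a table: it is the fraction of label shufflings that produce a difference at least as large as the observed one.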
Integrating Concepts of Rigor, Repeatability and Reproducibility in Molecular Biology
Principal Investigator: Robert A. Sclafani, Ph.D., University of Colorado Denver
This project proposes to integrate the concepts of rigor, repeatability and reproducibility into the applicant's current curriculum and to develop a new curriculum combining wet- and dry-lab components focused on teaching these concepts and laboratory skills. The proposed curricular activities include cell line and animal authentication by genotyping, quality control of antibodies, rigorous materials and methods, and big-data analysis and statistical methods. The evaluation plan includes student evaluations, written/oral examinations of the students at benchmarks during their tenure, and design of an evaluative rubric to help students provide additional feedback to the faculty.
Accomplishments: Dr. Sclafani and Dr. Hesselberth have developed lectures and discussion materials for integrating rigor and reproducibility concepts into coursework. Appropriate software for rigorous statistical and bioinformatic analyses has been purchased. A new section on rigor and reproducibility has been incorporated into the written parts of qualifying exams. A new course, Rigor and Reproducibility in Biomedical Research (MOLB/PHCL 7801), is now being taught and includes both dry-lab and wet-lab (cell authentication, antibody verification) components. Dr. Mark Johnson, Editor in Chief of Genetics, will lead a discussion on rigor and reproducibility requirements in scientific journals. UPDATE: The course was a big success and will continue next year. Student evaluations were very positive (4.5/5.0, Excellent). Other T32 programs on our campus have inquired about it and plan similar courses. In addition, integration into other requirements, including dissertations and candidacy exams, is ongoing and working well.
Open Source Training in Computational Competence and Hands-on Data Analysis
Principal Investigator: Martha S. Cyert, Ph.D., Stanford University
This project will implement a set of career development activities for Cellular and Molecular Biology trainees to enhance their training in data analysis and computational competence. Activities will include creation of hands-on data analysis sections for three existing mini-courses; incorporation of principles of metrology, the science of measurement, into the graduate curriculum; and partnership with the non-profit educational organizations Software and Data Carpentry (https://software-carpentry.org) to build skills in computational competence, data management and data analysis and to create a workshop for analysis of digital images using the Data Carpentry format. Because all Carpentry materials are freely distributed, they could be widely adopted to develop a resource for the entire biomedical community. The project also offers trainees a new professional development opportunity: becoming a licensed Carpentry instructor.
Accomplishments: Organized and implemented a 2-day workshop via a partnership agreement with Data and Software Carpentry in January 2017. Developed "Datalucence", a new mini-course on analysis of digital images using the Data Carpentry format, to be taught in Spring 2017. Offered a new course, Principles and Tools for Metrology in Biology, in Spring 2017. Developed assessment approaches.
Promotion of Strong Foundations in Research Design and Methods Towards Reproducible and Rigorous Research
Principal Investigator: Gordon William Laurie, Ph.D., University of Virginia
The project will enhance and restructure the Essentials of Translational Science course offered by the Biotechnology T32 to address reproducibility training, taking better advantage of thought leaders in the Charlottesville and University of Virginia (UVa) community and meeting the needs of other UVa T32 programs. The new modules will include research integrity and reproducibility through research openness and transparency; ethics in the creation, publication, patenting and commercialization of scientific data; and strategies of design thinking in science. The evaluation plan includes students' assessment of the course content (perceived relevance and applicability, understanding of subject matter) and of professors' teaching/mentoring skills. Multivariate analysis of outcomes will include comparisons by T32 program, degree, externship and career goals.
Accomplishments: The redesigned course, Essentials of Translational Science [Cell 8401], is underway. The first module was presented by Tim Ellington of the Center for Open Science (COS) on February 15, 2017, and discussed current challenges and practical steps to enhance rigor and reproducibility in scientific research. The second module, presented by Soderberg (COS) on February 17, was a hands-on session to help students begin to build more reproducible projects, using examples from the Reproducibility Project: Cancer Biology as well as students' own projects. The remaining modules were presented twice weekly by UVa affiliates and included topics on Ethics in Translational Science (IP, data generation, publication), Strategies of Design Thinking in Science, Scientists as Entrepreneurs, Intellectual Property I, Intellectual Property II, Commercialization and Entrepreneurship, Regulatory Science and Engagement Part 1 (FDA History and Authorities), Regulatory Science and Engagement Part 2 (FDA and Product Development), and Translational Science on Grounds. Students are graded based on attendance, class participation, and a concise summary paper expanding on a relevant topic of the student's own choice from the coursework. A second initiative is to develop COS's Open Science Framework (OSF) as a graduate-student-friendly electronic notebook. The OSF guides and captures all elements of the research discovery cycle, from project initiation and planning to publication and open uploading of data, in a free cloud-based format supported by top journals. Upon creation of a new project, assigned collaborators are updated with each new entry; collaborators might logically be thesis committee members. Ideas for improvements emerged from a graduate student committee.
Rigor and Reproducibility Training for Cellular and Molecular Medicine Research
Principal Investigator: Rajini Rao, Ph.D., Johns Hopkins University
This project will develop a course on rigor and reproducibility in research consisting of 10 online, self-paced modules accompanied by 10 in-class, faculty-led, small-group sessions. Modules will include multimedia content and interactive sessions to keep learners engaged. Each module will take a minimum of 15 minutes to complete and will include assessments and learner evaluation. The topics include an introduction to rigor and reproducibility; validation and authentication of reagents; controls and experimental design; normalization and data handling; sample size and descriptive statistics; evaluation of experimental data and hypotheses; preclinical animal studies; and telling the story: manuscript preparation, social pressures in publishing and critical review of published data. The impact of the training will be evaluated based on online module quizzes taken by students. In addition, pre- and post-course assessments will be developed.
Accomplishments:
Training in Design of Research Methods for Reproducibility and Rigor
Principal Investigator: Robert Sekuler, Ph.D., Brandeis University
This project will develop three minicourses and a symposium that teach students to work with real research data and to design research methods for reproducibility and rigor. The minicourses include statistical errors beyond p-values: back-tracking on p-hacking; a bioinformatics module (bioinformatic analysis of gene expression); and a computational module (Bayesian methods for analysis of protein evolution). The impact of the minicourses will be evaluated at a symposium hosted by the applicant's institution by surveying students before and after each course on their perceived skill levels in particular areas, using a survey to be developed by the three training grant leaders, the instructors and the minicourse coordinator.
Accomplishments:
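The "back-tracking on p-hacking" minicourse topic mentioned above lends itself to a small demonstration. The following Python sketch is our own illustration, not drawn from the Brandeis materials: it simulates the classic p-hacking pattern of running many null comparisons and reporting only the best one, which inflates the false-positive rate far beyond the nominal 5%.

```python
import random
from math import sqrt, erf
from statistics import mean, stdev

def approx_p_value(a, b):
    """Two-sided p-value from a normal approximation to the two-sample t statistic."""
    se = sqrt(stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b))
    z = abs(mean(a) - mean(b)) / se
    return 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))

rng = random.Random(0)
alpha = 0.05
trials = 2000
hacked_hits = 0
for _ in range(trials):
    # "p-hack": run 10 independent null comparisons (no true effect anywhere)
    # and keep only the smallest p-value
    best = min(
        approx_p_value([rng.gauss(0, 1) for _ in range(20)],
                       [rng.gauss(0, 1) for _ in range(20)])
        for _ in range(10)
    )
    if best < alpha:
        hacked_hits += 1
# With 10 looks per "study", significant results appear far more often
# than the nominal 5%, even though every null hypothesis is true
false_positive_rate = hacked_hits / trials
```

Seeing the inflated rate emerge from a simulation, rather than from a formula, is one way such a minicourse could make the cost of undisclosed multiple testing tangible.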
Training in Experimental Rigor and Reproducibility
Principal Investigator: Christopher J. Chang, Ph.D., University of California, Berkeley
This project aims to design sustainable, scalable training in experimental rigor and reproducibility for first-year training grant students. The plan is to create 5 to 7 weeks of in-depth training covering the strategies, resources and best practices necessary to design meaningful experiments that lead to consistent, reproducible results. The activities cover experimental design, data collection and recording, and data analysis. The plan includes a strategy to maintain the curriculum over time as a resource for the entire UC Berkeley campus.
Accomplishments: The objective is to create a new curriculum for teaching rigor and reproducibility in science, pilot it, and make the results available to all training grants on campus. In conjunction with several of the training grant PIs as well as administrative personnel, the decision was made to extend MCB293C from 5 weeks to 10 weeks in order to introduce 5 additional weeks on rigor and reproducibility topics. Seventy-six students from 4 training grants enrolled in the class; 54 enrolled for two units. Students registered for the new curriculum represent six different departments. Students attending lectures 6-10, in addition to lectures and discussion sections 1-5, earn 2 credits and not only fulfill their RCR requirement but receive additional training in rigor and reproducibility. New course topics include: The Reproducibility Crisis, Experimental Design, Data Sharing, Metadata, Statistical Analysis, Image Analysis and Presentation, Project Lifecycle, The Open-Science Ecosystem, Storing Larger Datasets, Effective Data Analysis, Structuring Data, Management of Analysis Workflow, Data Visualization and Resources on Berkeley's Campus. Now that instruction has begun, student evaluations are being collected regularly during the course, by topic section, in order to obtain feedback.
All lectures are being recorded via "Course Capture" and posted, along with lecture slides, on a web portal available to students and participating faculty. Before starting the new material in sessions 6-10, students were polled online on their level of understanding and their desire to learn more about specific proposed topics. Once instruction is complete, student polling information, student evaluations, recorded lectures, slide decks, and instructor evaluations will be made available to the advisory board. All materials, including the advisory board's recommendations, will be used to improve the curriculum, format, and materials for the course. All training grant PIs at Berkeley have been invited to participate as advisory board members in the final evaluation process.
Title: Redesign of PGY 603 – Design and Analysis
Principal Investigator: Bret Smith, Ph.D., University of Kentucky
We propose to restructure an existing course based in the College of Medicine at the University of Kentucky to include more extensive training focused on the principles of scientific rigor and research transparency. This area has become an important concern of all involved in the research enterprise, and justifiably so, given the profession's duty to public accountability and social responsibility in the conduct of science. The University of Kentucky has an excellent program for training in the responsible conduct of research that applies to all institutional and individual training grants. However, the existing program is not designed to adequately address the new requirements for scientific rigor and transparency described in NOT-OD-16-004, NOT-OD-16-011, and NOT-OD-16-012. Furthermore, experimental design and data analysis are covered across a variety of uncoordinated courses, and scientific rigor and reproducibility are not emphasized. The planned curriculum will enhance the training of biomedical scientists to strengthen the scientific premise and rigor of their independent projects.
Title: Rigor and Reproducibility in Biomedical Research
Principal Investigator: Sue Jinks-Robertson, Ph.D., Duke University
The Duke School of Medicine course in “Rigor and Reproducibility” will comprise didactic lectures, team-based learning opportunities, lectures by invited speakers, and workshops dedicated to providing direct, operational expertise. The topic areas for didactic lectures will include: i) Bias: Why We Are Wired for Bias and How to Combat It. This lecture series will explore the cognitive origins of bias, decision making, and perception and will also cover approaches to blinding and randomization, valid methodologies for outlier identification, and data reporting expectations. ii) Experimental Design: Perils, Pitfalls, and Solutions. A lecture series devoted to principles of experimental design; selection of model systems; methods for validation of cell lines, methods, and reagents; distinguishing quantitative and qualitative data and their application; robustness and reproducibility criteria; power calculations; statistical methods; laboratory records; and data archiving. We will also include lectures on how to identify relevant biological variables in different experimental systems and scientific approaches (e.g., sex, age, and strain variations as biological variables). iii) FFP: Defining Falsification, Fabrication, and Plagiarism. Although much of what comprises FFP would seem to be self-evident examples of scientific misconduct, the reality is that trainees must be engaged on these topics, with definitions and examples provided that are relevant to the diversity of research conducted at Duke.
In this set of modules, FFP will be discussed from the standpoint of the different primary disciplines that comprise modern science, the experimental systems that are commonly used, the types of data that are obtained in these experimental systems, and in particular, the strengths, weaknesses, and limitations in the different methods of data identification, collection, and analysis. We will also address the reality that there are cultural variations in what is considered plagiarism and how plagiarism is viewed in ethical terms.
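The power calculations named among the experimental-design topics above can be made concrete with a brief worked example. The sketch below is our own illustration, not Duke course material: it approximates the power of a two-sided, two-sample z-test for a standardized effect size (Cohen's d) using only the Python standard library.

```python
from math import sqrt
from statistics import NormalDist

def two_sample_power(n_per_group, effect_size, alpha=0.05):
    """Approximate power of a two-sided, two-sample z-test for a
    standardized effect size (Cohen's d) with equal group sizes."""
    norm = NormalDist()
    z_crit = norm.inv_cdf(1 - alpha / 2)
    ncp = effect_size * sqrt(n_per_group / 2)  # noncentrality parameter
    # Probability that |Z| exceeds the critical value under the alternative
    return norm.cdf(ncp - z_crit) + norm.cdf(-ncp - z_crit)

# Power ≈ 0.89 for a 1-SD effect with 20 subjects per group;
# halving the effect size drops power to roughly 0.35
high = two_sample_power(20, 1.0)
low = two_sample_power(20, 0.5)
```

The point such a module would drive home is the asymmetry: an underpowered study does not merely risk missing real effects, it also makes any significant result it does report less trustworthy.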
Title: Training in Rigor, Transparency to Enhance Reproducibility
Principal Investigator: Brian Athey, Ph.D., University of Michigan
This is a proposal to plan, develop, and deliver a recurring 5-day workshop to improve Rigor and Transparency to enhance Reproducibility (R&TeR). For this effort, faculty will partner with the University of Michigan Institute for Social Research (ISR) and its Inter-university Consortium for Political and Social Research (ICPSR), which are renowned worldwide for their expertise in policies and practices regarding data sharing, data management, data and code curation, information security and privacy practices, data stewardship, and open science.
This page last reviewed on 10/17/2018 2:53 PM