Archive for the ‘Data’ Category
Monday, August 24th, 2015
The following is an announcement from the NIH Big Data 2 Knowledge email list. If you are in the South Central Region and are interested in attending, please consider applying for a Professional Development Award to cover related expenses.
The NIH Big Data to Knowledge (BD2K) program and the NIH Library are pleased to join the Johns Hopkins (JHU) Bloomberg School of Public Health Department of Biostatistics in announcing the first JHU DaSH – Data Science Hackathon – on September 21-23, 2015 in Baltimore.
The organizers – Drs. Brian Caffo, Leah Jager, Jeff Leek, and Roger Peng – include JHU professors who teach the popular Coursera Data Science Specialization. This Data Science Hackathon will provide an opportunity for hands-on training that reinforces and builds on data management and analysis skills such as those covered in the MOOC specialization (completion of the specialization is not a prerequisite).
“This event will be an opportunity for data scientists and data scientists-in-training to get together and hack on real-world problems collaboratively and to learn from each other. The DaSH will feature data scientists from government, academia, and industry presenting problems and describing challenges in their respective areas. There will also be a number of networking opportunities where attendees can get to know each other.”
— Simply Statistics blog post
JHU DaSH will provide a local opportunity for NIH scientists and trainees to participate in a data science Hackathon. For more information, see https://regonline.com/jhudash. NIH staff or trainees who would like to attend should complete the application at https://www.surveymonkey.com/r/NIH-JHUDaSH (no later than Aug 14th) rather than the one on the https://regonline.com/jhudash website. For questions, contact Lisa Federer (firstname.lastname@example.org) in the NIH Library. Non-affiliates of NIH should apply directly through https://regonline.com/jhudash.
Friday, August 21st, 2015
Librarians specializing in health and related sciences are invited to participate in the next offering of the bioinformatics training course, “A Librarian’s Guide to NCBI,” sponsored by the National Library of Medicine (NLM), the National Center for Biotechnology Information (NCBI), and the National Network of Libraries of Medicine, NLM Training Center (NTC). Enrollment is limited to 25 participants.
The course provides knowledge and skills for librarians interested in helping patrons use online molecular databases and tools from the NCBI. Prior knowledge of molecular biology and genetics is not required. Participating in the Librarian’s Guide course will improve your ability to initiate or extend bioinformatics services at your institution.
Online Pre-Course and In-Person Course Components
The two parts of “A Librarian’s Guide to NCBI” are listed below. Applicants must complete both parts: participants must finish the online pre-course (Part 1) with full CE credit in order to attend the 5-day in-person course (Part 2).
Part 1: “Fundamentals in Bioinformatics and Searching,” an online (asynchronous) course, October 26-December 11, 2015.
An additional offering of this class will be in January-February 2016.
Part 2: A 5-day in-person course offered on-site at the National Library of Medicine in Bethesda, Maryland, March 7-11, 2016.
Application deadline: September 14, 2015
For more information and a link to the application, visit the NLM Technical Bulletin article: K. Majewski. NLM Tech Bull. 2015 Jul-Aug;(405):e4.
Monday, March 2nd, 2015
MLA is offering a webcast on data management. The program, “The Diversity of Data Management: Practical Approaches for Health Sciences Librarianship,” features Lisa Federer, AHIP; Kevin Read; and Jacqueline Wirz presenting strategies and success stories for data management.
Wednesday, April 22, 2015, 1:00-2:30pm CST
This webcast is designed to provide health sciences librarians with an introduction to data management, including how data is used within the research landscape, and the current climate around data management in biomedical research. Three librarians working with data management at their institutions will present case studies and examples of products and services they have implemented, and provide strategies and success stories about what has worked to get data management services up and running at their libraries.
More information on the program and speakers can be found on MLANET.
Tuesday, February 24th, 2015
The Next Generation of Access to Sequencing Data: Using NCBI’s SRA Toolkit to Access Data from dbGaP and SRA
Next Wednesday, February 25, NCBI staff will present a webinar on the SRA Toolkit, a system for accessing the approximately 3.4 petabases of next-generation genomic and expressed sequence data housed in the NCBI Sequence Read Archive (SRA). As data sets become larger, mining information and performing comparisons directly from structured databases becomes increasingly necessary. The SRA Toolkit is not only capable of dumping the data out as a FASTQ or SAM file, but also provides direct analysis and comparison of specific genomic regions across hundreds or thousands of samples.
In the webinar, we will show examples of configuration and use of the Toolkit for both public SRA and controlled access data associated with studies in the Database of Genotypes and Phenotypes (dbGaP).
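As a rough illustration of the kind of workflow such a session involves, a minimal public-data run through the SRA Toolkit might look like the following. This is a sketch only: it assumes the SRA Toolkit is installed and on your PATH, and uses SRR000001, an arbitrary small public run, purely as an example accession.

```shell
# Download the run to the local cache (public data; dbGaP controlled-access
# data additionally requires configuring the toolkit with vdb-config).
prefetch SRR000001

# Export the reads as FASTQ, one file per mate for paired-end runs.
fastq-dump --split-files SRR000001

# For runs that contain alignments, sam-dump exports SAM instead, e.g.:
# sam-dump <aligned-accession> > aligned.sam
```

Controlled-access dbGaP data follows the same general pattern once the toolkit has been configured for the approved project.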
To register for this webinar, please go here: https://attendee.gotowebinar.com/register/2847950984085163009
Monday, November 18th, 2013
The Lamar Soutter Library at the University of Massachusetts Medical School has recently released the New England Collaborative Data Management Curriculum, a set of openly available materials that librarians can use to teach research data management best practices to undergraduate and graduate students in the sciences, health sciences, and engineering fields. The lecture notes and slide presentations can be customized by librarians teaching RDM for their particular audiences, and the curriculum also includes a database of real-life research cases that can be integrated into instruction to address discipline-specific data management topics.
Each of the curriculum’s six online instructional modules aligns with the National Science Foundation’s data management plan recommendations and addresses universal data management challenges. Included in the curriculum is a collection of actual research cases that provides a discipline-specific context for the content of the instructional modules. These cases come from a range of research settings, such as clinical research, biomedical labs, an engineering project, and a qualitative behavioral health study, and additional cases will be added to the collection on an ongoing basis. Each module can be taught as a stand-alone class or as part of a series of classes. Instructors are welcome to customize the content of the instructional modules to meet the learning needs of their students and the policies and resources at their institutions.
Monday, September 30th, 2013
Guest Author: Susan Barnes, Assistant Director, NN/LM Outreach Evaluation Resource Center (OERC), Health Sciences Libraries and Information Center, University of Washington
The 2nd Edition of the Planning and Evaluating Health Information Outreach Projects series of three booklets (http://nnlm.gov/evaluation/guides.html#A2) is now available online from the NN/LM Outreach Evaluation Resource Center (OERC).
Getting Started with Community-Based Outreach (Booklet 1) http://nnlm.gov/evaluation/booklets508/bookletOne508.html
What’s new? More emphasis and background on the value of health information outreach, including its relationship to the Healthy People 2020 Health Communication and Health Information Technology topic areas.
Planning Outcomes-Based Outreach Projects (Booklet 2) http://nnlm.gov/evaluation/booklets508/bookletTwo508.html
What’s new? Focus on uses of the logic model planning tool beyond project planning, such as approaches to writing proposals and reports.
Collecting and Analyzing Evaluation Data (Booklet 3) http://nnlm.gov/evaluation/booklets508/bookletThree508.html
What’s new? Step-by-step guide to collecting, analyzing, and assessing the validity (or trustworthiness) of quantitative and qualitative data, using questionnaires and interviews as examples.
These are all available free to network members. To request printed copies, send an email to email@example.com. PDF versions of all three booklets are available here: http://nnlm.gov/evaluation/guides.html#A2.
The Planning and Evaluating Health Information Outreach Projects series, by Cynthia Olney and Susan Barnes, supplements and summarizes material in Cathy Burroughs’ groundbreaking work from 2000, Measuring the Difference: Guide to Planning and Evaluating Health Information Outreach. Printed copies of Burroughs’ book are also available free—just send an email request to firstname.lastname@example.org.
Wednesday, September 25th, 2013
Is a table worth a thousand words? Sometimes you need an understandable and memorable diagram that will illustrate what you are trying to say. This Periodic Table of Visualization Methods http://www.visual-literacy.org/periodic_table/periodic_table.html# (a resource from Visual-Literacy.org http://www.visual-literacy.org/) provides examples of 100 visualization methods.
This table is not just a cool-looking list of visualization methods; it also uses the familiar format of the Periodic Table of Elements to organize the visualization techniques by type and purpose.
For example, the chart is color coded from yellow to purple. The colors represent different kinds of visualization types: Data Visualization, Information Visualization, Concept Visualization, Strategy Visualization, Metaphor Visualization, and Compound Visualization (examples below). Hovering over each box in the table brings up a pop-up window with an example.
The table also encodes several other pieces of information about each method. Icons indicate whether a method is a process visualization (depicting a temporal sequence) or a structure visualization (depicting conceptual relationships), whether it shows macro patterns (overview) or micro patterns (detail), and whether it supports divergent or convergent thinking. These cues can help you determine whether a given visualization technique is right for you.
Here are some examples from each of the main categories of visualization methods:
Data Visualization: example Area Chart
Information Visualization: example Radar Chart
Concept Visualization: example Argument Slide
Strategy Visualization: example Fishbone Diagram
Metaphor Visualization: example Tree
Compound Visualization: example Knowledge Map
Friday, September 6th, 2013
One of the Centers in the National Network of Libraries of Medicine is the Outreach and Evaluation Resource Center (OERC), located at the University of Washington in Seattle, WA (http://nnlm.gov/evaluation). The OERC has created a Guide to evaluation tools and other resources that you and your library can use to evaluate your programs: http://guides.nnlm.gov/content.php?pid=494137&sid=4058311. Here are some of the tools and resources described in the Guide:
Community Oriented Outreach
- Building Partnerships: tips on successful collaborations, tools for improving collaboration with community networks
- Participatory Evaluation: toolkits for practical participatory evaluation, processes for conducting outcome-based evaluations
- OERC Guides to incorporating evaluation planning into your outreach projects
- Evaluation planning resources from other organizations, including logic models
- List of outreach projects funded by NLM
Data Collection and Analysis
- Needs Assessments and Data Collection: access to data indicators, tips for questionnaire development, guides for using Appreciative Inquiry for evaluation
- Data Analysis: Resources for statistical methods and guides for analyzing qualitative and quantitative data
Reporting and Visualizing
- Data Dashboards: guides for creating popular data dashboards
- Data Visualization: teachings of Edward Tufte and lists of visualization methods
- Reporting: tools for presentation design and TED Talks about presentation structure
Monday, August 26th, 2013
When demonstrating your library’s impact to your institution, you will need to organize all the data that you have collected – gate counts, reference statistics, cost/benefit analyses, anecdotal data, etc. – and present them to your administration in some format. Your goal is for your presentation to get the attention of your administration, make the case that your library has a huge positive impact on the institution, and convince them that support for the library needs to be maintained or increased.
Part 3 in the Demonstrating Your Impact series is called “Telling Your Story.” This section explores using storytelling as a way to organize your data for maximum impact.
Andy Goodman, the author of Why Bad Presentations Happen to Good Causes (free download http://www.thegoodmancenter.com/resources/) says that “stories are a terrific way to bring large issues down to ground level where people can get their minds (and hearts) around them. But after you have told your story, you must back it up with the numbers that prove you have more than one story to tell.” In this video of a Plenary address for the National Assembly on School-Based Health Care, Andy Goodman gives a powerful demonstration of the importance of storytelling in engaging decision makers: http://www.ustream.tv/recorded/15665748.
How can you take this concept of storytelling and apply it to the data that you have been collecting on your library? Cindy Olney, with the NN/LM Outreach and Evaluation Resource Center, describes a very doable process in her April 17, 2013 SCR CONNECTions webinar, Once Upon a Time: Using Evaluation Findings to Tell Your Project’s Story (recorded webinar: https://webmeeting.nih.gov/p18217101/). In her description of how to organize your presentation, Olney suggests:
- analyzing the data that you have collected,
- articulating the key findings from charts and graphs as sentences,
- deciding which findings are most important, and
- weaving them into one of two story systems: Sparkline or Storybook.
Sparkline: This system, described by Nancy Duarte in a TED Talk (http://www.ted.com/talks/nancy_duarte_the_secret_structure_of_great_talks.html), is designed for persuasive arguments (like convincing your employers to expand the role of the library). In this system, the presentation moves back and forth between the vision of what could be and the situation as it is now, and ends with a call to action. The Sparkline system can be shown to underlie great persuasive speeches, such as Abraham Lincoln’s Gettysburg Address and Martin Luther King Jr.’s “I Have a Dream” speech.
Storybook: Olney suggests the storybook format is best for presenting the results of a completed project. Three important elements should be included for a good story:
- a likeable main character in undesirable circumstances
- this main character takes steps toward improving those circumstances – their progress is rife with obstacles
- at the end, the main character is transformed
Whether you use the Storybook or the Sparkline system, to keep your story interesting and memorable, Olney adds “don’t let the data get in the way of a good story – write your story, then weave the data into it.”
Read parts 1 and 2 of the Demonstrating Your Impact series (Return on Investment and Collecting Stories).
Monday, August 19th, 2013
To demonstrate your library’s impact to decision makers, it can be helpful to bring your data to life with some great success stories: researchers who were helped by your librarians, doctors whose time was saved, or patients who understood their follow-up instructions. Even better than the success stories you tell your administration are stories told about your library by satisfied customers, for example, satisfied doctors whose time is valuable to their hospital as well as themselves, satisfied patients who can recommend your hospital to others, or satisfied researchers who can vote on where their city dollars go. In addition, there is evidence that anecdotal data can influence the outcomes of decisions (http://mande.co.uk/2010/uncategorized/stories-vs-statistics-the-impact-of-anecdotal-data-on-accounting-decision-making/).
Part 2 in the Demonstrating Your Impact series is about collecting and telling those success stories. The Centers for Disease Control and Prevention (CDC) has a publication called Impact and Value: Telling Your Program’s Story http://www.cdc.gov/oralhealth/publications/library/pdf/success_story_workbook.pdf. This document is intended for program managers to provide steps they can use to systematically collect and create success stories: “with attention to detail, a system of regular data collection and practice, this tool can become a powerful instrument to spread the word about your program.”
According to the Impact and Value publication, stories should not be the main method of presenting data, but they put a face to the numbers of research and evaluation data: “What does it really mean when you report that you have provided ‘X’ amount of services to ‘Y’ amount of people? How are the lives of the program participants [or your library customers] changed because of your services?”
A great example of systematic story collection can be found in the article, “MedlinePlus and the challenge of low health literacy: findings from the Colonias project,” (http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1773027/pdf/i0025-7338-095-01-0031.pdf) which describes a project funded by the National Library of Medicine in which community health workers, known as promotoras, were trained to help members of some Texas-Mexico border communities find health information using MedlinePlus. These promotoras were asked to collect up to two stories every week on how they used online resources to help residents with health concerns. The 157 stories that resulted from this technique were treated as data: thematically coded, checked for validity, and studied to show the degree of success of the promotoras project.
What to do with all this data? Stay tuned for part 3 of the Demonstrating Your Impact series: Telling Your Story