Archive for the ‘Communications Tools’ Category
NCBI has a new Twitter feed, @ncbibooks, to announce new books and documents available on the NCBI Bookshelf. An online resource providing free access to the full text of books and documents in life sciences and health care, the Bookshelf currently provides access to over 4,500 titles.
The Bookshelf is continuously expanding with new materials as well as receiving updates to existing books and documents. Between May 16 and May 20, 2016, for example, 19 new titles were added. Among the new titles are several Agency for Healthcare Research and Quality reports, health technology assessments and systematic reviews from the Canadian Agency for Drugs and Technologies in Health and the National Institute for Health Research (UK), and World Health Organization guidelines on daily iron supplementation.
For general news, follow NCBI on Twitter, Facebook and LinkedIn.
The U.S. Department of Health and Human Services’ Agency for Healthcare Research and Quality (AHRQ) recently released its Comparative Effectiveness Review Improving Cultural Competence to Reduce Health Disparities for Priority Populations. This review examines existing system-, clinic-, provider-, and individual-level interventions to improve culturally appropriate health care for people with disabilities; lesbian, gay, bisexual, and transgender (LGBT) populations; and racial/ethnic minority populations.
The National Library of Medicine’s Outreach and Special Populations Branch (OSPB) works to reduce health disparities within underserved and special populations by improving access to accurate, quality health information. OSPB manages Minority Health Information Outreach projects for specific populations, such as American Indian Health Web Portal for Native Americans and HealthReach for refugee populations.
NCBI has enhanced My Bibliography and Other Citations with two improvements: a search-and-select tool to add citations from PubMed, and an option to add citations in bulk from files in the MEDLINE or RIS (Research Information Systems) format. These features let you add PubMed citations directly to your My Bibliography and Other Citations collections and upload citations in bulk using a file, which is especially useful for publications that are not in PubMed. For further details, visit this NLM Technical Bulletin article.
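For context, a RIS file is plain text in which each citation is a block of tagged lines: a two-letter tag, two spaces, a hyphen, a space, then the value, with an `ER` line closing each record. A minimal sketch of building one such record for upload (the author, title, and journal here are invented for illustration, and the file name is arbitrary):

```python
# A minimal, hypothetical RIS record for a publication not indexed in
# PubMed; the tag names (TY, AU, TI, JO, PY, ER) are standard RIS tags.
ris_record = "\n".join([
    "TY  - JOUR",                      # reference type: journal article
    "AU  - Doe, Jane",                 # author (one AU line per author)
    "TI  - An Example Article Title",  # title
    "JO  - Journal of Examples",       # journal name
    "PY  - 2016",                      # publication year
    "ER  - ",                          # end of record
])

# Save the record to a file suitable for bulk upload.
with open("my_bibliography_upload.ris", "w") as f:
    f.write(ris_record + "\n")
```

A file can contain many such records back to back, one `TY`…`ER` block per citation.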
The newest video on the NCBI YouTube channel, Navigating the NIH Manuscript Submission Process, covers details about submitting, reviewing and approving your manuscript in the NIH Manuscript Submission (NIHMS) system in ten minutes. The NIHMS system supports manuscript depositing into PubMed Central (PMC) as required by the public access policies of NIH and other participating funding agencies. Subscribe to the NCBI YouTube channel to receive alerts about new videos ranging from quick tips to full webinar presentations.
This month, the National Library of Medicine’s Disaster Lit database added its 10,000th record on the clinical and public health aspects of natural disasters, human-caused disasters, terrorism, disease outbreaks, and other public health emergencies. Disaster Lit describes and links to reports, webinars, training, conferences, factsheets and other documents that are not commercially published. Disaster Lit complements the journal literature in PubMed and the resources for the public in MedlinePlus. Materials are carefully selected by NLM medical librarians and subject experts from nearly 1,000 approved sources and provide current awareness for health professionals, first responders and emergency planners who have disaster health responsibilities.
New content is sent daily to nearly 14,000 subscribers via RSS, Twitter, email subscriptions, and the DISASTR-OUTREACH-LIB listserv. Disaster Lit plays a key role in collecting the earliest available trusted medical guidance soon after a disaster event or disease outbreak, often long before the same guidance can be published in peer-reviewed medical journals.
Disaster Lit also supports other federal disaster information programs.
The Disaster Lit collection of grey literature was started in 2002 by the New York Academy of Medicine, with funding from the National Library of Medicine (NLM) National Information Center for Health Services Research (NICHSR). In 2010, the database moved to the then-new Disaster Information Management Research Center, Specialized Information Services (SIS) Division, NLM. The database continues to grow with funding support from SIS, NICHSR and the National Institute of Environmental Health Sciences.
Questions or comments may be sent to the Disaster Information Management Research Center.
The American Evaluation Association’s Statement on Cultural Competence in Evaluation describes the importance of cultural competence in terms of ethics, validity of results, and theory.
- Ethics – quality evaluation has an ethical responsibility to ensure fair, just and equitable treatment of all persons.
- Validity – evaluation results that are considered valid require trust from the diverse perspectives of the people providing the data and trust that the data will be honestly and fairly represented.
- Theory – theories underlie all of evaluation, but theories are not created in a cultural vacuum. Assumptions behind theories must be carefully examined to ensure that they apply in the cultural context of the evaluation.
The Statement also makes some recommendations for essential practices for cultural competence, including the following examples:
- Acknowledge the complexity of cultural identity. Cultural groups are not static, and people belong to multiple cultural groups. Attempts to categorize people often collapse them into cultural groupings that may not accurately represent the true diversity that exists.
- Recognize the dynamics of power. Cultural privilege can create and perpetuate inequities in power. Work to avoid reinforcing cultural stereotypes and prejudice in evaluation. Evaluators often work with data organized by cultural categories. The choices you make in working with these data can affect prejudice and discrimination attached to such categories.
- Recognize and eliminate bias in language. Language is often used as the code for a certain treatment of groups. Thoughtful use of language can reduce bias when conducting evaluations.
Two recent entries on the Evergreen Blog on data visualizations and how they can show cultural bias illustrate how these principles can be applied to the evaluation of an outreach project. The first case, How Dataviz Can Unintentionally Perpetuate Inequality: The Bleeding Infestation Example, shows how using red to represent individual participants on a map made the actual participants feel like they were perceived as a threat. The more recent blog post, How Dataviz Can Unintentionally Perpetuate Inequality Part 2, shows how the categories used in a chart on median household income contribute to stereotyping certain cultures and skew the data to show something that does not accurately represent income levels of the different groups.
On March 16, NN/LM PSR presented What the heck is Data Visualization and why should a librarian care?! for the Midday at the Oasis monthly webinar. Jackie Wirz, PhD, Research Data Ninja and Assistant Professor at Oregon Health & Science University, discussed the basic principles of presenting data with good visual design. You can view the webinar by visiting the Midday at the Oasis Archives page or by clicking on the YouTube video player below.
Note: To switch to full screen, click on the full screen icon in the bottom corner of the video player. To exit the full screen, press Esc on your keyboard or click on the Full screen icon again. If you have problems viewing full screen videos, make sure you have the most up-to-date version of Adobe Flash Player.
The ACRL Roadshow Workshop, Scholarly Communication: From Understanding to Engagement! will be offered on Thursday, March 24, 8:30 am – 4:30 pm, at the Toll Room, Alumni House, on the UC Berkeley campus. Registration is free and limited to 100 participants. The session is directed towards librarians and library staff who need a broad foundational knowledge of scholarly communication issues. Participants will learn about and discuss content access barriers, intellectual property, emerging opportunities, and engagement with faculty and students. Attendees will leave with practical ideas for developing outreach activities and models for supporting changes in scholarly communication. The two presenters for this workshop will be Katie Fortney, Copyright Policy & Education Officer, California Digital Library, and Jaron Porciello, Digital Scholarship Initiatives Coordinator, Digital Scholarship and Preservation Services, Cornell University.
The Alumni House is a short distance from the Downtown Berkeley BART station. Parking around campus is limited and taking public transportation is recommended. For inquiries regarding the workshop, please contact Jean McKenzie, UC Berkeley Acting Associate University Librarian for Collections.
You have surely noticed poorly designed data visualization displays with too many details, unnecessary icons, and many variables piled into one chart. But you don’t have to be an artist to create good visual displays. Most information designers concur that data visualization is about communication, not art. However, you do have to design with a purpose. To understand the basics of good design, you need to understand why humans respond so well to visual displays of data, which is the topic of an excellent blog article by Stephen Few, a thought leader in the data visualization field. First, he argues that data visualizations help users perform three primary functions: exploring, making sense of, and communicating data. Evaluators would add a fourth goal: helping users apply data in planning and decision-making. To that end, Few argues that data visualizations should be designed to support readers’ ability to perform these four cognitive tasks:
- See the big picture in the data
- Compare values
- See patterns among values
- Compare patterns
Design experts like Stephen Few are avowed minimalists, who hate chart junk, such as gridlines and data labels. They have an affinity for small multiples, which are series of graphs displaying different slices of data. If you have never seen small multiples, visit this post from Juice Analytics with good examples. In general, they do not include any element that will hinder users’ ability to make comparisons, find patterns, and identify pattern abnormalities that may be indicators of important events. Decorative features like gas gauges and paper doll icons are viewed as unnecessary distractions.
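As a toy illustration of the small-multiples idea, not of any particular tool: repeating the same tiny chart for each slice of the data, on one shared scale, makes comparisons and pattern differences easy to spot even in plain text. The branch names and monthly visit counts below are invented for illustration:

```python
# Small multiples as text sparklines: one mini chart per data slice,
# all drawn against the same shared scale so values stay comparable.
BARS = " ▁▂▃▄▅▆▇█"

def sparkline(values, vmax):
    # Map each value onto the shared 0..vmax scale, one bar character each.
    return "".join(BARS[round(v / vmax * (len(BARS) - 1))] for v in values)

# Hypothetical monthly visit counts for three library branches.
visits = {
    "Branch A": [40, 55, 70, 90, 85, 60],
    "Branch B": [20, 22, 25, 24, 26, 30],
    "Branch C": [80, 60, 40, 30, 20, 15],
}
shared_max = max(v for series in visits.values() for v in series)
for name, series in visits.items():
    print(f"{name}: {sparkline(series, shared_max)}")
```

Because every sparkline shares one scale, the rising, flat, and falling patterns across the three slices stand out at a glance, with no gridlines, labels, or decoration needed.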
There is a distinction between data visualizations and infographics. Alberto Cairo, Knight Chair in Visual Journalism at the University of Miami’s School of Communication, wrote that data visualizations are tools for interactive data exploration while infographics are visual displays that make a specific point. One way to think of this is that data visualizations have users, while infographics have readers. Chart art may be more legitimate in infographics because it supports the primary message or story. But Cairo admits that the boundary between infographics and data visualizations is fuzzy. He noted a trend toward infographics with two layers: a presentation layer and an exploration one. These infographics have an obvious primary message, but readers are also presented with opportunities to explore their own questions.
That said, Cairo still argues that data design principles hold true for both data visualizations and infographics. Don’t drown your readers in images or distract them with bling. Zen is in; bells and whistles are out. The good news is that simple data visualizations do not require sophisticated software or design skills. That’s not to say that simple is the same as easy. Good data visualizations and infographics take a lot of thought. For the more interactive data visualizations, you must identify how your users will use your data and design accordingly. For infographics, you need first to clearly identify your central message and then be sure that every element has a supporting role. To read further about developing good visual design habits, check out Presenting Data Effectively by Stephanie Evergreen (Sage, 2013).
The Public Library Association (PLA) recently launched a service called Project Outcome. An article by Carolyn Anthony, director of the Skokie Public Library, IL, entitled “Project Outcome – Looking Back, Looking Forward,” recently published in Public Libraries Online, describes the successes of libraries using this service over the past six months.
Project Outcome is an online resource that provides evaluation tools designed to measure the impact of library programs and services, such as summer reading programs or career development programming. It also provides ready-made reports and data dashboards that give libraries and stakeholders immediate data on their programs’ outcomes. Finally, Project Outcome provides support and peer sharing opportunities to address common challenges and increase capacity for outcomes evaluation.
Following are some highlights about this service:
- Project Outcome has managed to create a structured approach for program outcome evaluation that can be used online by public libraries of all shapes and sizes, by people who have not done outcome evaluation before. Along with tools for collecting data, the resource has tutorials and support for libraries doing outcomes evaluation for the first time.
- Continued support and peer sharing as an integral part of the service means that PLA is building a community of librarians who use outcome evaluation.
- The peer-shared stories described in the article help build the understanding that evaluation isn’t something forced on you from outside, but something that can help you create a better library and enhance the meaning of your library’s programs.
- This process teaches librarians to start with the evaluation question (“decide what you want to learn about outcomes in your community”) and a plan for what to do with the findings. And the process ends with successfully communicating your findings to stakeholders and implementing next steps.
- Lastly, Project Outcome and the PLA Performance Measurement Task Force are planning the next iteration of their project that will measure whether program participants followed through with their intended outcomes.