Professional Context
The daily grind of news analysts, reporters, and journalists is a constant battle between meeting deadlines and uncovering the truth: the need to deliver breaking news quickly conflicts with the necessity of verifying facts and sources. With the pressure to publish mounting, journalists must navigate complex data sets, conduct thorough research, and craft compelling stories, all while maintaining the highest standards of accuracy and integrity.
💡 Expert Advice & Considerations
Don't rely on ChatGPT to generate entire articles. Instead, use it for tasks like data analysis, research, and organization, freeing up time for the real work of journalism: interviewing sources, verifying facts, and crafting a compelling narrative.
Advanced Prompt Library
4 Expert Prompts
Data-Driven Storytelling
Analyze the SQL database containing the past year's worth of crime statistics for a major metropolitan area and identify the top 5 precincts with the highest rates of violent crime. Then use Python to create a scatter plot illustrating the relationship between crime rates and socioeconomic factors such as poverty and education levels. Finally, write a 2-paragraph summary of the findings, including potential story angles and sources to interview. The database schema includes tables for incidents, offenders, and victims, with columns for date, time, location, and type of crime.
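The first step of this prompt can be sketched with Python's built-in `sqlite3` module. The table name, column names, crime categories, and sample rows below are all hypothetical stand-ins for whatever schema your real database uses:

```python
import sqlite3

# Hypothetical schema: an `incidents` table with precinct, crime_type, and date.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE incidents (precinct TEXT, crime_type TEXT, date TEXT)")
conn.executemany(
    "INSERT INTO incidents VALUES (?, ?, ?)",
    [
        ("P1", "assault", "2024-01-01"),
        ("P1", "robbery", "2024-01-02"),
        ("P2", "theft",   "2024-01-03"),
        ("P3", "assault", "2024-01-04"),
        ("P2", "assault", "2024-01-05"),
    ],
)

# Count only crime types classified as violent, then rank precincts.
VIOLENT = ("assault", "robbery", "homicide")
placeholders = ",".join("?" * len(VIOLENT))
top5 = conn.execute(
    f"""
    SELECT precinct, COUNT(*) AS n
    FROM incidents
    WHERE crime_type IN ({placeholders})
    GROUP BY precinct
    ORDER BY n DESC
    LIMIT 5
    """,
    VIOLENT,
).fetchall()
print(top5)  # top5[0] is ('P1', 2): two violent incidents
```

For the scatter plot, the resulting precinct counts would then be joined against poverty and education figures and passed to a plotting library such as matplotlib.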
Investigative Research Assistance
Research and compile a list of the top 10 most influential stakeholders in the renewable energy industry, including their current and past positions, affiliations, and published works. Then use Tableau to create a network diagram illustrating their connections and relationships. Finally, write a brief profile of each stakeholder, including their background, expertise, and potential biases. Use Snowflake to query the database of industry publications and extract relevant articles and quotes.
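Before building the Tableau diagram, the connection data itself can be prototyped in plain Python. This sketch builds an adjacency structure and ranks stakeholders by degree centrality as a rough proxy for influence; the names and connections are invented examples, not real stakeholders:

```python
from collections import defaultdict

# Hypothetical stakeholder connection pairs; in practice these would come
# from the Snowflake query of industry publications described in the prompt.
connections = [
    ("A. Rivera", "B. Chen"),
    ("A. Rivera", "C. Okafor"),
    ("B. Chen",   "C. Okafor"),
    ("D. Haas",   "A. Rivera"),
]

# Build an undirected graph as an adjacency dict.
graph = defaultdict(set)
for a, b in connections:
    graph[a].add(b)
    graph[b].add(a)

# Rank stakeholders by number of connections (degree centrality).
ranking = sorted(graph, key=lambda name: len(graph[name]), reverse=True)
print(ranking[0])  # "A. Rivera" has the most connections (3)
```

The same edge list, exported as CSV, is exactly the shape Tableau or a dedicated graph library would consume for the visual network diagram.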
Statistical Summary and Modeling
Develop a regression model using R to analyze the relationship between social media engagement and website traffic for a news organization, using a data set that includes metrics such as likes, shares, comments, and page views. Then create a statistical summary of the findings, including coefficients, p-values, and confidence intervals. Finally, write a 1-page report interpreting the results and recommending strategies for increasing engagement and traffic. The data set includes 6 months' worth of daily data, with 10 variables and 180 observations.
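The prompt asks for R, but the core idea, ordinary least squares, is easy to sketch in standard-library Python for a single predictor. The engagement and traffic numbers here are simulated, not real newsroom data:

```python
import random
from statistics import mean

# Simulated 180 days of data: a daily engagement score and page views that
# follow pageviews ≈ 50 + 3.2 * engagement plus noise (all numbers invented).
random.seed(0)
engagement = [random.uniform(100, 1000) for _ in range(180)]
pageviews = [50 + 3.2 * e + random.gauss(0, 40) for e in engagement]

# OLS with one predictor: slope = cov(x, y) / var(x), fitted from the data.
mx, my = mean(engagement), mean(pageviews)
sxx = sum((x - mx) ** 2 for x in engagement)
sxy = sum((x - mx) * (y - my) for x, y in zip(engagement, pageviews))
slope = sxy / sxx
intercept = my - slope * mx
print(f"pageviews ≈ {intercept:.1f} + {slope:.2f} * engagement")
```

A full analysis with 10 variables, p-values, and confidence intervals would use R's `lm()` (or Python's statsmodels), but the fitted slope here should land close to the true 3.2 used in the simulation, which is the sanity check any model summary should pass.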
ETL Pipeline Development
Design and implement an ETL pipeline using Python to extract data from a collection of CSV files containing information on government contracts, transform the data into a standardized format, and load it into a Snowflake database for analysis. Then write a data cleaning script to handle missing values and outliers. Finally, create a data dictionary documenting the pipeline's architecture, data sources, and processing steps. The CSV files include columns for contract number, vendor, date, and amount, with varying formats and structures.
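A minimal sketch of the extract-transform-load loop, using in-memory CSV strings and the standard-library `sqlite3` as a stand-in for Snowflake. The two header layouts, vendor names, and amounts are invented to illustrate the "varying formats" problem:

```python
import csv
import io
import sqlite3

# Extract: two hypothetical CSV files whose headers differ in case and spacing.
file_a = "contract_number,vendor,date,amount\nC-001,Acme,2024-01-05,1200.50\n"
file_b = "Vendor,Amount,Contract Number,Date\nGlobex,980,C-002,2024-02-10\n"

def normalize(reader):
    """Transform: map varying header spellings onto one standard schema."""
    for row in reader:
        std = {k.strip().lower().replace(" ", "_"): v for k, v in row.items()}
        yield (std["contract_number"], std["vendor"], std["date"],
               float(std["amount"]))

# Load: sqlite3 stands in for the Snowflake target here.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE contracts "
    "(contract_number TEXT, vendor TEXT, date TEXT, amount REAL)"
)
for blob in (file_a, file_b):
    rows = normalize(csv.DictReader(io.StringIO(blob)))
    conn.executemany("INSERT INTO contracts VALUES (?, ?, ?, ?)", rows)

summary = conn.execute("SELECT COUNT(*), SUM(amount) FROM contracts").fetchone()
print(summary)  # (2, 2180.5)
```

The cleaning step the prompt describes would slot into `normalize`: rows with missing required fields can be skipped and logged, and amounts outside an expected range flagged for review before loading.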