Professional Context
I still remember the chaotic night we had to meet a tight deadline on a breaking election story: our data analyst was stuck trying to optimize a SQL query to extract the latest results from our Snowflake database. The newsroom was on edge and every minute counted as we struggled to get the numbers right. That night taught me how much accurate reporting depends on having the right tools and expertise for data analysis.
💡 Expert Advice & Considerations
Don't rely on AI to generate entire news stories. Instead, use it to augment your research and data analysis so you can focus on what really matters: storytelling and fact-checking.
Advanced Prompt Library
4 Expert Prompts
Election Results Data Extraction and Visualization
Write a Python script using the Snowflake connector to extract the latest election results from our database, including voter turnout, candidate vote shares, and demographic breakdowns. Then, use Tableau to create an interactive dashboard visualizing the results, with filters for region, party, and time series analysis. Ensure the dashboard is optimized for mobile devices and includes a map view of the electoral districts. Finally, generate a statistical summary of the results, including mean, median, and standard deviation of the vote shares, as well as a correlation analysis between voter turnout and candidate performance.
Regression Model for Predicting News Article Engagement
Develop a regression model using R to predict the engagement metrics (likes, shares, comments) of news articles based on features such as article length, keywords, author, publication date, and social media promotion. Use a dataset of historical articles and their corresponding engagement metrics to train the model. Then, use the model to predict the engagement metrics for a new set of articles and generate a report comparing the predicted values with the actual values. Finally, refine the model by incorporating additional features, such as sentiment analysis and entity recognition, and re-evaluate its performance using metrics such as mean absolute error and R-squared.
Data Cleaning and ETL Pipeline for News Archives
Design an ETL pipeline using SQL and Python to extract, transform, and load news archives from a legacy database into a modern data warehouse. The pipeline should handle data cleaning tasks, such as removing duplicates, handling missing values, and standardizing date formats. Then, use the cleaned data to generate a statistical summary of the news archives, including frequency distributions of keywords, authors, and publication dates. Finally, create a data visualization using Tableau to illustrate the trends and patterns in the news archives over time, with filters for topic, author, and date range.
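The transform stage of this pipeline (deduplication, missing-value handling, date standardization) can be sketched in pure Python. The rows, field names, and legacy date formats below are invented for illustration; a real pipeline would read rows from the legacy database with SQL and write the cleaned output to the warehouse.

```python
from datetime import datetime

# Hypothetical raw rows from a legacy archive export.
raw_rows = [
    {"id": 1, "title": "Budget vote passes", "author": "Lee", "published": "03/14/2019"},
    {"id": 1, "title": "Budget vote passes", "author": "Lee", "published": "03/14/2019"},  # duplicate
    {"id": 2, "title": "Storm hits coast", "author": None, "published": "2019-06-02"},     # missing author
    {"id": 3, "title": "Election preview", "author": "Ortiz", "published": "14 Jul 2019"},
]

# Date formats assumed to appear in the legacy data.
DATE_FORMATS = ("%m/%d/%Y", "%Y-%m-%d", "%d %b %Y")

def standardize_date(text):
    """Try each known legacy format; return an ISO-8601 date string or None."""
    for fmt in DATE_FORMATS:
        try:
            return datetime.strptime(text, fmt).date().isoformat()
        except ValueError:
            continue
    return None

def clean(rows):
    seen_ids = set()
    cleaned = []
    for row in rows:
        if row["id"] in seen_ids:  # drop exact-ID duplicates
            continue
        seen_ids.add(row["id"])
        cleaned.append({
            **row,
            "author": row["author"] or "unknown",      # fill missing values
            "published": standardize_date(row["published"]),
        })
    return cleaned

cleaned = clean(raw_rows)
for row in cleaned:
    print(row)
```

The frequency distributions the prompt asks for (keywords, authors, dates) follow naturally from the cleaned rows, e.g. with `collections.Counter`; the Tableau visualization sits downstream of the warehouse load.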
Topic Modeling and Entity Recognition for News Trend Analysis
Apply topic modeling techniques using Python and the Gensim library to a large corpus of news articles, with the goal of identifying emerging trends and patterns in the news landscape. Then, use entity recognition techniques to extract and analyze the key entities mentioned in the articles, such as people, organizations, and locations. Finally, generate a report highlighting the top trends and entities, along with their corresponding sentiment analysis and network visualization, to provide insights into the current news narrative and its key players.
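The preprocessing and counting steps behind this prompt can be sketched with the standard library alone. The three-sentence corpus, the stopword list, and the names in it are all invented; term frequencies stand in for the topics Gensim's `LdaModel` would learn, and the capitalized-word heuristic is a deliberately naive stand-in for a trained NER model.

```python
import re
from collections import Counter

# Tiny illustrative corpus; a real run would use thousands of articles.
articles = [
    "Senator Rivera proposed a new climate bill in Washington on Monday.",
    "The climate bill drew criticism from industry groups in Washington.",
    "Rivera defended the climate proposal during a press briefing.",
]

STOPWORDS = {"a", "the", "in", "on", "from", "during", "new"}

def tokenize(text):
    """Lowercase word tokens with stopwords removed."""
    return [w for w in re.findall(r"[a-z]+", text.lower()) if w not in STOPWORDS]

# Term frequencies as a crude proxy for learned topic distributions.
term_counts = Counter(tok for doc in articles for tok in tokenize(doc))

# Naive entity extraction: capitalized words that are not sentence-initial.
# A real pipeline would use a trained NER model here instead.
entity_counts = Counter(
    m.group() for doc in articles
    for m in re.finditer(r"(?<!^)(?<![.!?]\s)\b[A-Z][a-z]+", doc)
)

print("Top terms:", term_counts.most_common(3))
print("Top entities:", entity_counts.most_common(3))
```

In the full workflow, `tokenize` would feed a `gensim.corpora.Dictionary` and `LdaModel`, and the entity counts would come from an NER library; the sentiment scores and network visualization the prompt asks for build on those same token and entity streams.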