Grok Optimized

Best Grok prompts for News Analysts, Reporters, and Journalists

A specialized toolkit of advanced AI prompts designed specifically for News Analysts, Reporters, and Journalists.

Professional Context

Daily deadlines collide with the need for meticulous research, forcing News Analysts, Reporters, and Journalists to navigate a delicate balance between speed and accuracy, all while staying ahead of the competition in a 24-hour news cycle.

💡 Expert Advice & Considerations

Don't waste time using Grok to churn out generic summaries; instead, focus on leveraging its capabilities to identify nuanced trends and patterns that can inform your reporting and give you a competitive edge.

Advanced Prompt Library

4 Expert Prompts
1

Real-Time Crisis Monitoring Dashboard


Create a SQL query to extract the latest 100 tweets containing keywords related to an emerging crisis, such as a natural disaster or political uprising, and then use Python to analyze the sentiment of these tweets, grouping them by location and time of posting. Next, visualize the results in a Tableau dashboard, including a map view and a time-series chart, to help identify areas of high activity and track the evolution of the crisis over time. Finally, write a 200-word summary of the key findings, including any notable trends or patterns that emerge from the data.

✏️ Customization: Replace the crisis keywords with those relevant to the current event being monitored.
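Before handing this prompt to Grok, it helps to know what the middle (Python) step should produce. Here is a minimal sketch of the sentiment-by-location-and-hour grouping, assuming the tweets have already been pulled into memory; the word-list scorer is a toy stand-in for a real sentiment model such as VADER:

```python
from collections import defaultdict
from datetime import datetime

# Toy lexicon scorer -- a stand-in for a real sentiment library
POSITIVE = {"safe", "rescued", "relief"}
NEGATIVE = {"flood", "damage", "trapped"}

def score(text: str) -> int:
    words = [w.strip(".,!?") for w in text.lower().split()]
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

# Sample records standing in for the 100 tweets returned by the SQL step
tweets = [
    {"text": "Flood damage reported downtown", "location": "Houston",
     "created_at": datetime(2024, 5, 1, 14, 10)},
    {"text": "Residents rescued, shelters safe", "location": "Houston",
     "created_at": datetime(2024, 5, 1, 15, 45)},
    {"text": "Relief supplies arriving", "location": "Austin",
     "created_at": datetime(2024, 5, 1, 14, 30)},
]

# Average sentiment per (location, hour-of-posting) bucket -- the same
# grouping the Tableau map and time-series views would be built on
buckets = defaultdict(list)
for t in tweets:
    buckets[(t["location"], t["created_at"].hour)].append(score(t["text"]))
summary = {k: sum(v) / len(v) for k, v in buckets.items()}

for (loc, hour), avg in sorted(summary.items()):
    print(f"{loc} {hour:02d}:00  avg sentiment {avg:+.1f}")
```

The `(location, hour)` keys map directly onto the dashboard's map and time-series axes, so the Python step and the Tableau step stay in sync.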
2

In-Depth Trend Analysis Report


Use Snowflake to query a database of historical news articles, extracting a list of the top 20 most frequently mentioned entities (people, organizations, locations) over the past quarter, along with the number of times each entity was mentioned and the sentiment of the surrounding text. Then, use R to perform a regression analysis to identify which entities are most closely correlated with each other, and which are most strongly associated with positive or negative sentiment. Finally, write a 500-word report detailing the key findings, including any surprising trends or correlations that emerge from the data, and provide recommendations for future reporting based on these insights.

✏️ Customization: Adjust the time frame and entity types to suit the specific needs of the analysis.
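The prompt asks R for the statistical modeling, but the core aggregation is easy to sanity-check first. A minimal sketch, assuming the entity mentions and their surrounding-text sentiment have already been extracted into rows of hypothetical `(article_id, entity, sentiment)` tuples; co-occurrence counts stand in here for the full correlation analysis:

```python
from collections import Counter, defaultdict
from itertools import combinations

# Hypothetical extracted rows: (article_id, entity, sentiment of surrounding text)
mentions = [
    (1, "FEMA", 0.2), (1, "Texas", -0.4),
    (2, "FEMA", 0.5), (2, "Red Cross", 0.6),
    (3, "Texas", -0.3), (3, "FEMA", 0.1),
]

# Mention counts and average surrounding sentiment per entity
counts = Counter(e for _, e, _ in mentions)
sent = defaultdict(list)
for _, e, s in mentions:
    sent[e].append(s)
avg_sent = {e: round(sum(v) / len(v), 2) for e, v in sent.items()}

# Co-occurrence within the same article -- a simple proxy for which
# entities the regression step would find correlated
by_article = defaultdict(set)
for a, e, _ in mentions:
    by_article[a].add(e)
pairs = Counter()
for ents in by_article.values():
    for p in combinations(sorted(ents), 2):
        pairs[p] += 1

print(counts.most_common(3))
print(avg_sent)
print(pairs.most_common(2))
```

Entities that co-occur frequently and carry consistently signed sentiment are the ones worth flagging in the 500-word report.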
3

ETL Pipeline for Social Media Data


Design an ETL pipeline to extract social media data from Twitter, Facebook, and Instagram, using APIs to collect posts, comments, and engagement metrics, and then load the data into a Snowflake database for analysis. Use Python to handle errors and exceptions, and to transform the data into a standardized format, including converting timestamps to a uniform timezone and handling missing values. Next, create a data cleaning script to remove duplicates, handle outliers, and perform data quality checks, and finally, write a 100-word summary of the pipeline's architecture and any challenges encountered during implementation.

✏️ Customization: Modify the pipeline to accommodate different social media platforms or data sources as needed.
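The transform-and-clean stage of the pipeline above can be sketched compactly. This is a minimal illustration, assuming records have already been fetched from the platform APIs; field names like `ts` and `likes` are placeholders for whatever schema the real pipeline lands on:

```python
from datetime import datetime, timezone

# Raw records as they might arrive from the platform APIs (note the
# duplicate id and the missing engagement value)
raw = [
    {"id": "a1", "platform": "twitter", "ts": "2024-05-01T14:10:00+02:00", "likes": 5},
    {"id": "a1", "platform": "twitter", "ts": "2024-05-01T14:10:00+02:00", "likes": 5},
    {"id": "b2", "platform": "instagram", "ts": "2024-05-01T08:00:00-05:00", "likes": None},
]

def transform(rec: dict) -> dict:
    """Normalize a raw record: timestamps to UTC, missing metrics to 0."""
    ts = datetime.fromisoformat(rec["ts"]).astimezone(timezone.utc)
    return {**rec, "ts": ts.isoformat(), "likes": rec["likes"] or 0}

# Dedupe on id, then transform -- the cleaned rows are what would be
# bulk-loaded into Snowflake
seen, clean = set(), []
for rec in raw:
    if rec["id"] in seen:
        continue  # drop exact/duplicate ids
    seen.add(rec["id"])
    clean.append(transform(rec))

for rec in clean:
    print(rec["id"], rec["ts"], rec["likes"])
```

In a production pipeline each stage would also wrap its API calls in retry/error handling, as the prompt requests; that is omitted here for brevity.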
4

Statistical Summary of Reader Engagement


Use Tableau to connect to a database of reader engagement metrics, including page views, click-through rates, and time on site, and then create a series of visualizations to summarize the data, including a heatmap of engagement by topic and time of day, a bar chart of top-performing articles, and a scatter plot of reader demographics versus engagement metrics. Next, use SQL to query the database and extract a list of the top 10 most engaging articles over the past month, along with their corresponding metrics, and finally, write a 300-word summary of the key findings, including any notable trends or patterns that emerge from the data, and provide recommendations for increasing reader engagement based on these insights.

✏️ Customization: Update the metrics and visualizations to reflect changing reader engagement patterns over time.
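The "top 10 most engaging articles" step hinges on how the three metrics are combined, which the prompt leaves open. A minimal sketch, using a hypothetical composite score (views × click-through rate × minutes on page) and toy data; the weighting is an assumption, not a standard:

```python
# Toy engagement rows standing in for the database query results
articles = [
    {"title": "Flood coverage",      "views": 12000, "ctr": 0.08, "avg_time_s": 95},
    {"title": "Election explainer",  "views": 9500,  "ctr": 0.12, "avg_time_s": 140},
    {"title": "Sports recap",        "views": 15000, "ctr": 0.05, "avg_time_s": 60},
]

def engagement(a: dict) -> float:
    # Hypothetical composite: clicks earned, scaled by minutes of attention
    return a["views"] * a["ctr"] * (a["avg_time_s"] / 60)

# Equivalent of the SQL "ORDER BY ... LIMIT 10" step
top = sorted(articles, key=engagement, reverse=True)[:10]
for a in top:
    print(f"{a['title']}: {engagement(a):.0f}")
```

Making the scoring function explicit like this keeps the SQL ranking and the Tableau visualizations telling the same story; change the weights and both should change together.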