Extract Tweets for Free: Easy Python Guide Using Twikit (2025)
Want to collect tweets for your project but don't want to pay for Twitter's API? 🐦 You're not alone!
Twitter (now X) started charging for their API in 2023, but there's good news - you can still extract tweets for free using Python and a special library called Twikit!
In this beginner-friendly guide, I'll show you exactly how to:
- Set up everything you need (even if you're new to Python)
- Search for tweets on any topic
- Save tweets to analyze later
- All without spending a penny! 💰
Quick Summary: Install Twikit β Set up authentication β Search for tweets β Save data to CSV. Total time: ~15 minutes.
What You'll Need
Before we start, make sure you have:
- Python installed on your computer (download here if you don't have it)
- A Twitter/X account (free to create)
- Basic knowledge of running commands (I'll explain everything step by step)
Step 1: Install the Required Tools 🛠️
First, we need to install two Python libraries:
- Twikit: Helps us connect to Twitter and extract tweets
- Pandas: Makes it easy to work with and save the tweet data
Open your terminal (Command Prompt on Windows) and run this command:
pip install twikit pandas
What this does: This command tells Python to download and install the tools we need. You should see some progress text as they install.
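If you want to confirm the install worked before going further, this stdlib-only snippet reports anything that failed to import (`check_install` is just a helper name I made up for this sketch, not part of Twikit):

```python
import importlib

def check_install(names=("twikit", "pandas")):
    # Try importing each library; collect the names that are missing
    missing = []
    for name in names:
        try:
            importlib.import_module(name)
        except ImportError:
            missing.append(name)
    return missing

print(check_install())  # an empty list means you're ready to go
```

If a name shows up in the list, rerun `pip install` for that library before continuing.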
Step 2: Set Up Your Twitter Login 🔐
Twikit needs to log into Twitter to extract tweets. The safest way to handle your login information is to use environment variables (temporary settings that only exist during your current session).
Run these commands in your terminal, replacing the values with your actual Twitter information:
# For Windows Command Prompt (no quotes - cmd stores them as part of the value):
set TWITTER_USERNAME=your_twitter_username
set TWITTER_EMAIL=your_twitter_email
set TWITTER_PASSWORD=your_twitter_password
# For Mac/Linux Terminal:
export TWITTER_USERNAME="your_twitter_username"
export TWITTER_EMAIL="your_twitter_email"
export TWITTER_PASSWORD="your_twitter_password"
Security Tip: This method keeps your password out of your code. The environment variables will disappear when you close your terminal.
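Your script reads those values back with os.getenv. This stdlib-only sketch (`load_credentials` is my own helper name, not something Twikit provides) also flags anything you forgot to set:

```python
import os

def load_credentials():
    # os.getenv returns None when a variable was never set
    creds = {
        "username": os.getenv("TWITTER_USERNAME"),
        "email": os.getenv("TWITTER_EMAIL"),
        "password": os.getenv("TWITTER_PASSWORD"),
    }
    missing = [name for name, value in creds.items() if not value]
    return creds, missing

creds, missing = load_credentials()
if missing:
    print(f"Missing settings: {', '.join(missing)}")
```

Running this before the main script saves you a confusing login failure caused by an empty variable.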
Step 3: Create Your Tweet Extractor Script 📝
Now let's create a Python script that will:
- Log into Twitter
- Search for tweets
- Save the results
Create a new file called tweet_extractor.py and copy this code into it:
import asyncio
import os

import pandas as pd
from twikit import Client

# Create a Twitter client
client = Client("en-US")

# Step 1: Log into Twitter
async def authenticate():
    cookies_file = "twitter_cookies.json"

    # Try using saved cookies first (faster login)
    if os.path.exists(cookies_file):
        try:
            client.load_cookies(cookies_file)
            await client.user_id()
            print("✅ Logged in using saved cookies!")
            return True
        except Exception:
            print("⚠️ Saved cookies expired, logging in again...")

    # Get login info from environment variables
    username = os.getenv("TWITTER_USERNAME")
    email = os.getenv("TWITTER_EMAIL")
    password = os.getenv("TWITTER_PASSWORD")

    # Log in with username/password
    try:
        await client.login(auth_info_1=username, auth_info_2=email, password=password)
        client.save_cookies(cookies_file)
        print("✅ Successfully logged in!")
        return True
    except Exception as e:
        print(f"❌ Login failed: {e}")
        return False

# Step 2: Search for tweets
async def search_tweets(keyword, max_tweets=100):
    print(f"🔍 Searching for tweets about '{keyword}'...")

    # Get tweets (newest first); one request returns roughly 20 results
    tweets = await client.search_tweet(keyword, "Latest")

    results = []
    # Extract the information we want from each tweet,
    # fetching more pages until we have enough
    while len(results) < max_tweets:
        page_had_tweets = False
        for tweet in tweets:
            page_had_tweets = True
            if len(results) >= max_tweets:
                break
            results.append({
                "text": tweet.text,
                "user": tweet.user.screen_name,
                "date": tweet.created_at,
                "likes": tweet.like_count,
                "retweets": tweet.retweet_count,
                "replies": tweet.reply_count,
                "url": f"https://twitter.com/{tweet.user.screen_name}/status/{tweet.id}"
            })
        if not page_had_tweets or len(results) >= max_tweets:
            break
        # Fetch the next page of search results
        tweets = await tweets.next()

    print(f"✅ Found {len(results)} tweets!")
    return results

# Step 3: Save tweets to a CSV file
def save_tweets_to_csv(tweets, filename="tweets.csv"):
    df = pd.DataFrame(tweets)
    df.to_csv(filename, index=False)
    print(f"💾 Saved {len(tweets)} tweets to {filename}")
    return filename

# Main function that runs everything
async def main():
    # Ask the user what they want to search for
    search_term = input("What topic do you want to search for? ")
    max_count = int(input("How many tweets do you want to collect? "))

    # Run the whole process
    if await authenticate():
        tweets = await search_tweets(search_term, max_count)
        if tweets:
            filename = save_tweets_to_csv(tweets, f"{search_term}_tweets.csv")
            print(f"🎉 All done! Your tweets are saved in {filename}")
            print("📊 You can open this file in Excel or Google Sheets")

# Run the program
if __name__ == "__main__":
    asyncio.run(main())
What This Code Does:
This script:
- Logs into Twitter using your credentials (or saved cookies for faster login)
- Searches for tweets on any topic you specify
- Extracts useful information from each tweet (text, user, likes, etc.)
- Saves everything to a CSV file you can open in Excel
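If you'd rather not depend on pandas for the save step, the standard library's csv module can do the same job. This is a sketch of an alternative (`save_tweets_to_csv_stdlib` is my own name; it mirrors the save_tweets_to_csv function in the script above):

```python
import csv

def save_tweets_to_csv_stdlib(tweets, filename="tweets.csv"):
    # tweets is a list of dicts; use the first dict's keys as the header row
    if not tweets:
        return filename
    with open(filename, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=list(tweets[0].keys()))
        writer.writeheader()
        writer.writerows(tweets)
    return filename

sample = [{"text": "hello world", "user": "alice", "likes": 3}]
save_tweets_to_csv_stdlib(sample, "sample_tweets.csv")
```

The output file opens in Excel or Google Sheets exactly like the pandas version's.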
Step 4: Run Your Tweet Extractor 🚀
Now let's run the script! In your terminal, navigate to where you saved the file and run:
python tweet_extractor.py
The script will:
- Ask what topic you want to search for
- Ask how many tweets you want to collect
- Log into Twitter
- Find and save the tweets
You should see something like this:
What topic do you want to search for? Python
How many tweets do you want to collect? 50
✅ Logged in using saved cookies!
🔍 Searching for tweets about 'Python'...
✅ Found 50 tweets!
💾 Saved 50 tweets to Python_tweets.csv
🎉 All done! Your tweets are saved in Python_tweets.csv
📊 You can open this file in Excel or Google Sheets
Step 5: Analyze Your Tweet Data 📊
Now you have a CSV file with tweets! You can:
- Open it in Excel or Google Sheets to view all the data
- Sort and filter to find the most popular tweets
- Create charts to visualize engagement
Here's a simple Python script to show some basic stats about your tweets:
import pandas as pd
# Load the CSV file
filename = input("Enter your CSV filename: ")
tweets = pd.read_csv(filename)
# Show some basic stats
print(f"Total tweets: {len(tweets)}")
print(f"Most liked tweet: {tweets.loc[tweets['likes'].idxmax()]['text']}")
print(f"Most retweeted tweet: {tweets.loc[tweets['retweets'].idxmax()]['text']}")
# Show average engagement
print(f"Average likes: {tweets['likes'].mean():.1f}")
print(f"Average retweets: {tweets['retweets'].mean():.1f}")
Save this as analyze_tweets.py and run it to see some quick stats about your tweets!
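If you'd like the same "most liked" lookup without pandas, plain Python works on the rows csv.DictReader gives you. A small sketch (`top_tweet` is a hypothetical helper of mine; note that DictReader returns every value as a string, hence the int() conversion):

```python
def top_tweet(rows, metric):
    # rows: list of dicts as produced by csv.DictReader
    # metric: a numeric column name such as "likes" or "retweets"
    return max(rows, key=lambda row: int(row[metric]))

rows = [
    {"text": "tweet A", "likes": "12", "retweets": "9"},
    {"text": "tweet B", "likes": "40", "retweets": "3"},
]
print(top_tweet(rows, "likes")["text"])     # → tweet B
print(top_tweet(rows, "retweets")["text"])  # → tweet A
```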
Common Problems and Solutions
"Login Failed" Error
Solution: Double-check your Twitter credentials. Make sure:
- Your username/email is correct
- Your password is correct
- Your Twitter account isn't locked or requiring a verification code
"Module Not Found" Error
Solution: Make sure you've installed the required libraries:
pip install twikit pandas
Rate Limiting
Solution: If you're collecting a lot of tweets, Twitter might temporarily limit your access. Wait 15-30 minutes and try again with a smaller number of tweets.
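If you'd rather automate that waiting, you could wrap your search in a retry helper like this sketch (`search_with_retry` and `search_fn` are my own names, not part of Twikit; it simply sleeps between attempts so a temporary limit has time to clear):

```python
import asyncio

async def search_with_retry(search_fn, keyword, retries=3, wait_seconds=900):
    # search_fn stands in for your own search coroutine, e.g. search_tweets
    for attempt in range(retries):
        try:
            return await search_fn(keyword)
        except Exception as e:
            if attempt == retries - 1:
                raise  # out of attempts; give up
            print(f"Hit a limit ({e}); waiting {wait_seconds // 60} minutes...")
            await asyncio.sleep(wait_seconds)
```

The default 900 seconds matches the 15-minute wait suggested above; pass a smaller value while testing.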
Going Further: Advanced Tweet Collection
Want to do more with your tweet collection? Here are some ideas:
1. Filter Tweets by Language
Twikit's search_tweet function doesn't take a language argument, but Twitter's search syntax supports a lang: operator you can add to the query itself:
tweets = await client.search_tweet(f"{keyword} lang:en", "Latest")  # English only
2. Collect Tweets from a Specific User
async def get_user_tweets(username, max_tweets=50):
    user = await client.get_user_by_screen_name(username)
    tweets = await client.get_user_tweets(user.id, "Tweets")
    # Process tweets as before (build a list of dicts)
    results = []
    for tweet in tweets:
        if len(results) >= max_tweets:
            break
        results.append({"text": tweet.text, "date": tweet.created_at})
    return results
3. Schedule Regular Collection
Use a scheduling library like schedule to run your script automatically:
import asyncio
import time

import schedule

# main() is the function from tweet_extractor.py above
def job():
    asyncio.run(main())

schedule.every().day.at("10:00").do(job)

while True:
    schedule.run_pending()
    time.sleep(1)
Conclusion
Congratulations! 🎉 You've learned how to:
- Extract tweets for free using Python and Twikit
- Save tweet data for analysis
- Avoid paying for the expensive Twitter API
This approach is perfect for:
- Research projects
- Social media analysis
- Tracking trends
- Sentiment analysis
By following this guide, you now have a powerful tool for collecting Twitter data without spending money on API access.
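As a taste of the sentiment-analysis idea, here is a deliberately tiny word-list scorer: a toy sketch, not a real sentiment model (the word sets and `naive_sentiment` name are my own invention; real projects use trained models or libraries).

```python
POSITIVE = {"great", "love", "awesome", "good"}
NEGATIVE = {"bad", "hate", "terrible", "awful"}

def naive_sentiment(text):
    # Lowercase each word and strip trailing/leading punctuation
    words = {w.strip(".,!?").lower() for w in text.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(naive_sentiment("I love Python, it's awesome!"))  # → positive
```

You could map this over the "text" column of your CSV to get a rough first impression before reaching for a proper tool.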
Next Steps
Want to learn more? Check out these related articles:
- Extract Tweets Using a Browser Extension - Another free method to collect tweets
Want to see the complete code? Visit my GitHub repository for the full solution and additional examples.
Happy tweet collecting! 🐦