Add authentication, dashboard, contacts page, and contact details functionality

boilerrat 2025-03-18 21:58:47 -04:00
parent e7f81d3219
commit 3894be3a5f
61 changed files with 13644 additions and 500 deletions

.env.production.template Normal file

@@ -0,0 +1,23 @@
# Database
DATABASE_URL="postgresql://stones@66.179.188.130:5433/stones?schema=public"
PYTHON_DATABASE_URL="postgresql://stones@66.179.188.130:5433/stones"
# API Keys (replace with your actual API keys)
ETHEREUM_ETHERSCAN_API_KEY="YOUR_ETHERSCAN_API_KEY"
ALCHEMY_API_KEY="YOUR_ALCHEMY_API_KEY"
# Web3 Provider
WEB3_PROVIDER_URL="https://eth-mainnet.g.alchemy.com/v2/${ALCHEMY_API_KEY}"
OPTIMISM_RPC_URL="https://opt-mainnet.g.alchemy.com/v2/YOUR_OPTIMISM_KEY"
# Application
NODE_ENV="production"
PORT=3000
# Next.js
NEXT_PUBLIC_API_URL="http://your-domain.com/api"
OPTIMISM_ETHERSCAN_API_KEY="YOUR_OPTIMISM_ETHERSCAN_KEY"
# IMPORTANT: Add your database password to DATABASE_URL and PYTHON_DATABASE_URL.
# Insert it as ':your_password' between 'stones' and '@66.179'
# Example: DATABASE_URL="postgresql://stones:your_password_here@66.179.188.130:5433/stones?schema=public"
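One way to splice the password in without opening an editor is a single sed pass over the file. This is a sketch against a stand-in file (`./env_prod_demo` and the `your_password_here` value are illustrative; in practice target `.env.production` and keep real passwords out of shell history):

```shell
# Stand-in for .env.production with the two connection strings from the template.
ENV_FILE=./env_prod_demo
printf '%s\n' \
  'DATABASE_URL="postgresql://stones@66.179.188.130:5433/stones?schema=public"' \
  'PYTHON_DATABASE_URL="postgresql://stones@66.179.188.130:5433/stones"' > "$ENV_FILE"

DB_PASS='your_password_here'
# Rewrite user@host to user:password@host on both connection strings.
sed -i "s|postgresql://stones@|postgresql://stones:${DB_PASS}@|" "$ENV_FILE"
grep DATABASE_URL "$ENV_FILE"
```

Because the replacement only touches the `postgresql://stones@` prefix, the rest of each URL (host, port, schema query string) is left intact.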

DEPLOYMENT.md Normal file

@@ -0,0 +1,214 @@
# Stones Database - Deployment Guide
This guide provides step-by-step instructions for deploying the Stones Database application to a VPS with Nginx and PostgreSQL.
## Prerequisites
- A VPS server with Ubuntu/Debian
- A domain or subdomain (e.g., contact.boilerhaus.org)
- SSH access to your server
- PostgreSQL database server
- Node.js and npm installed on the server
- Nginx web server
- Let's Encrypt SSL certificates for your domain
- SSH key set up for Gitea access
## Server Setup Checklist
### 1. Update your server
```bash
sudo apt update && sudo apt upgrade -y
```
### 2. Install Node.js using NVM
```bash
# Install NVM
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.3/install.sh | bash
source ~/.bashrc
# Install Node.js v18
nvm install 18
nvm use 18
nvm alias default 18
```
### 3. Install and configure PostgreSQL
```bash
sudo apt install postgresql postgresql-contrib -y
# Create a database user and database
sudo -u postgres psql -c "CREATE USER stonesadmin WITH PASSWORD 'your-secure-password';"
sudo -u postgres psql -c "CREATE DATABASE stones;"
sudo -u postgres psql -c "GRANT ALL PRIVILEGES ON DATABASE stones TO stonesadmin;"
```
### 4. Install Nginx
```bash
sudo apt install nginx -y
```
### 5. Install Let's Encrypt Certbot
```bash
sudo apt install certbot python3-certbot-nginx -y
```
### 6. Generate SSL certificate
```bash
sudo certbot --nginx -d contact.boilerhaus.org
```
### 7. Set up SSH key for Gitea
If you don't already have an SSH key:
```bash
ssh-keygen -t rsa -b 4096 -C "your_email@example.com"
```
Add your public key to your Gitea account at git.boilerhaus.org:
```bash
cat ~/.ssh/id_rsa.pub
```
Copy the output and add it to your Gitea account settings.
## Deployment Process
### 1. Set up your repository on Gitea (git.boilerhaus.org)
Make sure your project is pushed to your Gitea repository at git.boilerhaus.org/boiler/stones. This repository will be used for deployment.
### 2. Clone the repository to your local machine to prepare for deployment
```bash
git clone git@git.boilerhaus.org:boiler/stones.git
cd stones
```
### 3. Prepare the deployment files
Copy the Nginx configuration and deployment script to your repository:
- `contact-boilerhaus-org.conf`: Nginx configuration for your subdomain
- `deploy.sh`: Deployment script to automate the deployment process
The deployment script is already configured to use your Gitea server:
```
REPO_URL="git@git.boilerhaus.org:boiler/stones.git"
```
### 4. Make the deployment script executable
```bash
chmod +x deploy.sh backup-db.sh
```
### 5. Commit and push these files to your repository
```bash
git add contact-boilerhaus-org.conf deploy.sh backup-db.sh DEPLOYMENT.md
git commit -m "Add deployment files"
git push origin main
```
### 6. Upload the repository to your server
You can clone the repository directly to your server:
```bash
ssh your-server-user@your-server-ip
git clone git@git.boilerhaus.org:boiler/stones.git
cd stones
```
Make sure your server has the proper SSH key set up to access your Gitea repository.
### 7. Run the deployment script
```bash
./deploy.sh
```
The script will:
- Check if SSH key is set up for git user access
- Clone or update the repository
- Install dependencies
- Build the application
- Create a .env.production file if it doesn't exist
- Set up PM2 for process management
- Configure Nginx
### 8. Update the .env.production file with your actual values
```bash
nano .env.production
```
Make sure to update:
- `DATABASE_URL` with your PostgreSQL credentials
- `AUTH_SECRET` with a strong random string
- Any other configuration variables
### 9. Import your database dump (if you have one)
```bash
psql -U stonesadmin -d stones -f path/to/your/dump.sql
```
## Updating the Application
When you need to update the application, you can either:
1. Run the deployment script again, which will pull the latest changes:
```bash
cd /path/to/repository
./deploy.sh
```
2. Or manually update:
```bash
cd /var/www/stones-database
git pull origin main
npm ci
npm run build
pm2 restart stones-database
```
## Troubleshooting
### Git Access Issues
If you encounter issues with Git access:
```bash
# Test SSH connection to Gitea
ssh -T git@git.boilerhaus.org
# Check if SSH agent is running
eval $(ssh-agent -s)
ssh-add ~/.ssh/id_rsa
```
### Nginx Configuration
If you encounter issues with Nginx:
```bash
sudo nginx -t # Test Nginx configuration
sudo systemctl reload nginx # Reload Nginx
sudo systemctl status nginx # Check Nginx status
```
### PM2 Issues
```bash
pm2 logs stones-database # View application logs
pm2 list # Check running processes
pm2 restart stones-database # Restart the application
```
### Database Connection
If your application can't connect to the database:
1. Check if PostgreSQL is running: `sudo systemctl status postgresql`
2. Verify your `.env.production` file has the correct DATABASE_URL
3. Make sure your PostgreSQL configuration allows connections (check pg_hba.conf)
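pg_hba.conf rules are positional: the first line whose TYPE/DATABASE/USER/ADDRESS columns match the incoming connection decides the auth method. A quick way to surface the active rules, shown here against a sample file (the real file usually lives under /etc/postgresql/<version>/main/, a path that varies by install):

```shell
# Sample pg_hba.conf; contents are illustrative only.
cat > ./pg_hba_sample.conf << 'EOF'
# TYPE  DATABASE  USER         ADDRESS        METHOD
local   all       postgres                    peer
host    stones    stonesadmin  127.0.0.1/32   md5
host    all       all          0.0.0.0/0      reject
EOF
# List the active (non-comment, non-blank) rules.
grep -Ev '^\s*(#|$)' ./pg_hba_sample.conf > ./pg_hba_rules.txt
cat ./pg_hba_rules.txt
```

If your application user and address fall through to a `reject` (or never match a `host` line at all), connections will fail even with correct credentials.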
### SSL Certificate
If you have issues with SSL:
```bash
sudo certbot renew --dry-run # Test certificate renewal
sudo certbot certificates # List certificates
```
## Notes
- The application runs on port 3001 locally and is proxied through Nginx
- PM2 is used to keep the application running and restart it if it crashes
- Make sure to back up your database regularly

README.md

@@ -1,57 +1,138 @@
# Stones Database
A database application for collecting Ethereum addresses and contact information for the Farcastle $Stones token launch.
A web application for managing contacts and their blockchain-related information, including NFT holdings, DAO memberships, and token holdings.
## Project Overview
## Development Setup
This application provides:
- A database to store Ethereum addresses, ENS names, and contact information
- Data collection scripts to gather information from various sources (NFT holders, ERC20 holders, Moloch DAO members)
- A web interface for accessing and managing the database at stones.boilerhaus.org
### Prerequisites
## Tech Stack
- Node.js (v16 or higher)
- PostgreSQL database
- Git
- **Backend**: Node.js with Express
- **Frontend**: Next.js with App Router, React, Shadcn UI, and Tailwind CSS
- **Database**: PostgreSQL
- **Data Collection**: Python scripts for blockchain data scraping
- **Deployment**: Docker for containerization
## Project Structure
```
/
├── src/ # Source code
│ ├── app/ # Next.js app router pages
│ ├── components/ # React components
│ ├── lib/ # Shared utilities
│ └── server/ # Server-side code
├── scripts/ # Python scripts for data collection
│ ├── nft_holders/ # Scripts to collect NFT holder data
│ ├── erc20_holders/ # Scripts to collect ERC20 token holder data
│ ├── moloch_dao/ # Scripts to collect Moloch DAO member data
│ └── utils/ # Shared utilities for scripts
├── prisma/ # Database schema and migrations
├── public/ # Static assets
└── docker/ # Docker configuration
```
## Getting Started
### Setup Instructions
1. Clone the repository
2. Install dependencies: `npm install`
3. Set up environment variables
4. Run the development server: `npm run dev`
5. Access the application at http://localhost:3000
```bash
git clone git@git.boilerhaus.org:boiler/stones.git
cd stones
```
## Data Collection
2. Install dependencies
```bash
npm install
```
The application includes various Python scripts to collect data from:
- NFT holders
- ERC20 token holders
- Moloch DAO members (Raid Guild, DAOhaus, Metacartel)
- ENS resolution for contact information
3. Set up the database
- Create a PostgreSQL database named `stones`
- Update the `.env.local` file with your database connection string:
```
DATABASE_URL="postgresql://username:password@localhost:5432/stones"
```
4. Run database migrations
```bash
npx prisma migrate dev
npx prisma generate
```
5. Start the development server
```bash
npm run dev
# or
./run-dev.sh
```
This will start the application at http://localhost:3000.
## Utility Scripts
This project includes several utility scripts to streamline the development and deployment process:
- `run-dev.sh`: Starts the development server with all necessary checks
- `check-db.sh`: Tests database connectivity and displays database statistics
- `push-to-gitea.sh`: Pushes changes to the Gitea repository
- `deploy.sh`: Deploys the application to a production server
- `backup-db.sh`: Creates a backup of the PostgreSQL database
### Using the Development Server
To run the development server with automatic checks for dependencies and database setup:
```bash
./run-dev.sh
```
This script will:
- Check for Node.js and npm
- Create a `.env.local` file if it doesn't exist
- Install dependencies if needed
- Run database migrations
- Start the development server
### Checking Database Connectivity
To test your database connection and view statistics:
```bash
./check-db.sh
```
This script will connect to your database and display the number of contacts, NFT holdings, DAO memberships, and token holdings.
## Authentication
The application uses a simple authentication system with a hardcoded admin user:
- Username: `admin`
- Password: `stones1234`
For security in production, this should be replaced with a proper authentication system.
## Deployment
The application is deployed at stones.boilerhaus.org
For detailed deployment instructions, see [DEPLOYMENT.md](DEPLOYMENT.md).
To deploy to a server:
1. Push changes to Gitea:
```bash
./push-to-gitea.sh
```
2. Connect to your server and clone the repository:
```bash
ssh your-server-user@your-server-ip
git clone git@git.boilerhaus.org:boiler/stones.git
cd stones
```
3. Run the deployment script:
```bash
./deploy.sh
```
4. Update the `.env.production` file with your production settings:
```bash
nano .env.production
```
5. Access your application at https://contact.boilerhaus.org (or your configured domain).
## Database Backups
To back up your database:
```bash
./backup-db.sh
```
This creates a compressed SQL backup in `/var/backups/stones-database/` with a timestamp.
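Restoring from one of these backups is the reverse pipeline, roughly `gunzip -c backup.sql.gz | psql -U stonesadmin -d stones` (user and database names as configured above). The compression round-trip itself can be sanity-checked without a database, using a dummy dump:

```shell
# Dummy "dump" standing in for real pg_dump output.
printf 'CREATE TABLE demo (id int);\n' > ./demo_dump.sql
gzip -kf ./demo_dump.sql                 # writes demo_dump.sql.gz, keeps the original
gunzip -c ./demo_dump.sql.gz > ./demo_restored.sql
cmp -s ./demo_dump.sql ./demo_restored.sql && echo "round-trip intact"
```

Streaming through `gunzip -c` avoids ever materializing the uncompressed dump on disk during a real restore.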
## Project Structure
- `src/app`: Next.js App Router pages and API routes
- `src/components`: React components
- `src/lib`: Utility functions and libraries
- `prisma`: Database schema and migrations
- `scripts`: Data collection and processing scripts

backup-db.sh Executable file

@@ -0,0 +1,69 @@
#!/bin/bash
# Backup script for Stones Database PostgreSQL database
set -e # Exit immediately if a command exits with a non-zero status
# Configuration
DB_NAME="stones"
DB_USER="stonesadmin" # Replace with your actual database user
BACKUP_DIR="/var/backups/stones-database"
DATETIME=$(date +"%Y-%m-%d_%H-%M-%S")
BACKUP_FILE="$BACKUP_DIR/stones_db_backup_$DATETIME.sql"
LOG_FILE="$BACKUP_DIR/backup.log"
ROTATION_DAYS=30 # Number of days to keep backups
# Colors for pretty output
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
RED='\033[0;31m'
NC='\033[0m' # No Color
# Create backup directory if it doesn't exist
if [ ! -d "$BACKUP_DIR" ]; then
echo -e "${YELLOW}Creating backup directory...${NC}"
mkdir -p $BACKUP_DIR
fi
# Log start of backup
echo -e "$(date): Starting backup of $DB_NAME database" | tee -a $LOG_FILE
# Export database
echo -e "${YELLOW}Exporting database to $BACKUP_FILE...${NC}"
if pg_dump -U $DB_USER $DB_NAME > $BACKUP_FILE; then
# Compress the backup file
echo -e "${YELLOW}Compressing backup file...${NC}"
gzip $BACKUP_FILE
BACKUP_FILE="$BACKUP_FILE.gz"
# Calculate file size
FILE_SIZE=$(du -h "$BACKUP_FILE" | cut -f1)
echo -e "${GREEN}Backup completed successfully: $BACKUP_FILE (Size: $FILE_SIZE)${NC}" | tee -a $LOG_FILE
else
echo -e "${RED}Backup failed!${NC}" | tee -a $LOG_FILE
exit 1
fi
# Remove old backups
echo -e "${YELLOW}Removing backups older than $ROTATION_DAYS days...${NC}"
find $BACKUP_DIR -name "stones_db_backup_*.sql.gz" -type f -mtime +$ROTATION_DAYS -delete
echo -e "$(date): Removed old backups" | tee -a $LOG_FILE
# Optional: Copy backup to another location (e.g., remote server or cloud storage)
# Uncomment and modify these lines to enable remote backup
# SCP to remote server
# echo -e "${YELLOW}Copying backup to remote server...${NC}"
# scp $BACKUP_FILE user@remote-server:/path/to/backup/dir/
# Or upload to S3 (requires AWS CLI)
# echo -e "${YELLOW}Uploading backup to S3...${NC}"
# aws s3 cp $BACKUP_FILE s3://your-bucket/stones-database/
echo -e "${GREEN}Backup process completed!${NC}"
# To use this script as a cronjob, add a line like this to your crontab:
# Run daily at 2:00 AM: 0 2 * * * /path/to/backup-db.sh
# To setup the cronjob automatically, uncomment these lines:
# (crontab -l 2>/dev/null; echo "0 2 * * * $(readlink -f $0)") | crontab -
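The rotation step in this script hinges on find's `-mtime +N` semantics (strictly older than N 24-hour periods). It can be exercised in isolation with backdated dummy files (GNU `touch -d` assumed):

```shell
mkdir -p ./rotation_demo
# One "old" backup (40 days) and one fresh backup.
touch -d "40 days ago" ./rotation_demo/stones_db_backup_old.sql.gz
touch ./rotation_demo/stones_db_backup_new.sql.gz
# Same expression the script uses, with ROTATION_DAYS=30.
find ./rotation_demo -name "stones_db_backup_*.sql.gz" -type f -mtime +30 -delete
ls ./rotation_demo
```

Only the 40-day-old file is deleted; anything newer than the 30-day window survives.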

check-db.sh Executable file

@@ -0,0 +1,78 @@
#!/bin/bash
# Script to check database connectivity for the Stones Database application
set -e # Exit immediately if a command exits with a non-zero status
# Colors for pretty output
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
RED='\033[0;31m'
NC='\033[0m' # No Color
echo -e "${YELLOW}Checking database connectivity...${NC}"
# Check if Prisma is installed
if [ ! -d "node_modules/.prisma" ]; then
echo -e "${YELLOW}Installing Prisma dependencies...${NC}"
npm install @prisma/client
npx prisma generate
fi
# Create temporary script to check DB connection
cat > db-check.js << EOF
const { PrismaClient } = require('@prisma/client');
async function main() {
console.log('Attempting to connect to database...');
const prisma = new PrismaClient();
try {
// Test the connection
await prisma.\$connect();
console.log('Successfully connected to the database.');
// Get some database statistics
const contactCount = await prisma.contact.count();
const nftHoldingCount = await prisma.nftHolding.count();
const daoMembershipCount = await prisma.daoMembership.count();
const tokenHoldingCount = await prisma.tokenHolding.count();
console.log('Database statistics:');
console.log(\`Contacts: \${contactCount}\`);
console.log(\`NFT Holdings: \${nftHoldingCount}\`);
console.log(\`DAO Memberships: \${daoMembershipCount}\`);
console.log(\`Token Holdings: \${tokenHoldingCount}\`);
await prisma.\$disconnect();
process.exit(0);
} catch (error) {
console.error('Failed to connect to the database:', error.message);
if (error.message.includes('database does not exist')) {
console.error('The database specified in your DATABASE_URL does not exist.');
console.error('You may need to create it manually:');
console.error(' 1. Connect to PostgreSQL using: psql -U postgres');
console.error(' 2. Create the database: CREATE DATABASE stones;');
console.error(' 3. Update your .env or .env.local file with the correct DATABASE_URL');
} else if (error.message.includes('authentication failed')) {
console.error('Authentication failed. Check your username and password in DATABASE_URL.');
} else if (error.message.includes('connect ECONNREFUSED')) {
console.error('Could not connect to PostgreSQL server. Make sure it is running.');
}
await prisma.\$disconnect();
process.exit(1);
}
}
main();
EOF
# Run the temporary script
echo -e "${YELLOW}Running database connection test...${NC}"
node db-check.js
# Clean up
rm db-check.js
echo -e "${GREEN}Database check completed.${NC}"
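The `\$connect`-style escapes in the heredoc above are deliberate: with an unquoted `EOF` delimiter the shell expands `$...` inside the heredoc unless it is backslash-escaped, which is how the script keeps Prisma's `$connect` and the template literals intact. A minimal illustration:

```shell
WHO="shell"
# Unquoted delimiter: $WHO expands, \$WHO survives as a literal $WHO.
cat > ./heredoc_demo.txt << EOF
expanded: $WHO
literal: \$WHO
EOF
cat ./heredoc_demo.txt
```

Quoting the delimiter (`<< 'EOF'`) would instead suppress all expansion, at the cost of not being able to interpolate any shell variables at all.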

contact-boilerhaus-org.conf Normal file

@@ -0,0 +1,64 @@
server {
listen 80;
server_name contact.boilerhaus.org;
# Redirect HTTP to HTTPS
location / {
return 301 https://$host$request_uri;
}
}
server {
listen 443 ssl http2;
server_name contact.boilerhaus.org;
# SSL Configuration (make sure to update paths to your certificates)
ssl_certificate /etc/letsencrypt/live/boilerhaus.org/fullchain.pem;
ssl_certificate_key /etc/letsencrypt/live/boilerhaus.org/privkey.pem;
ssl_trusted_certificate /etc/letsencrypt/live/boilerhaus.org/chain.pem;
ssl_session_timeout 1d;
ssl_session_cache shared:SSL:50m;
ssl_session_tickets off;
ssl_protocols TLSv1.2 TLSv1.3;
ssl_ciphers 'ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384';
ssl_prefer_server_ciphers on;
# HSTS
add_header Strict-Transport-Security "max-age=63072000" always;
# Security Headers
add_header X-Content-Type-Options nosniff;
add_header X-Frame-Options SAMEORIGIN;
add_header X-XSS-Protection "1; mode=block";
# Logs
access_log /var/log/nginx/contact.boilerhaus.org.access.log;
error_log /var/log/nginx/contact.boilerhaus.org.error.log;
# Proxy to Node.js application
location / {
proxy_pass http://localhost:3001; # Assuming your Next.js app will run on port 3001
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection 'upgrade';
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_cache_bypass $http_upgrade;
}
# Serve static files directly
location /_next/static {
alias /path/to/your/app/.next/static;
expires 365d;
access_log off;
}
# Serve public files directly
location /public {
alias /path/to/your/app/public;
expires 365d;
access_log off;
}
}
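deploy.sh later rewrites the `/path/to/your/app` placeholders in this config with a single sed pass (`sed -i "s|/path/to/your/app|$APP_DIR|g"`). The substitution can be previewed on a throwaway snippet before touching the live config (paths illustrative):

```shell
APP_DIR="/var/www/stones-database"   # matches APP_DIR in deploy.sh
printf '%s\n' 'alias /path/to/your/app/.next/static;' > ./nginx_snippet.conf
# Same substitution deploy.sh applies; '|' as delimiter avoids escaping the slashes.
sed -i "s|/path/to/your/app|$APP_DIR|g" ./nginx_snippet.conf
cat ./nginx_snippet.conf
```

Using `|` as the sed delimiter is what makes path substitution readable here; with the default `/` every slash in both paths would need escaping.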

deploy.sh Executable file

@@ -0,0 +1,118 @@
#!/bin/bash
# Deployment script for Stones Database Next.js application
set -e # Exit immediately if a command exits with a non-zero status
# Configuration
APP_NAME="stones-database"
APP_DIR="/var/www/$APP_NAME"
REPO_URL="git@git.boilerhaus.org:boiler/stones.git" # Updated to correct Gitea repo URL
BRANCH="main" # Or whatever branch you want to deploy
NODE_VERSION="18" # Make sure this matches your development environment
# Colors for pretty output
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
RED='\033[0;31m'
NC='\033[0m' # No Color
echo -e "${GREEN}Starting deployment of $APP_NAME...${NC}"
# Check if SSH key is set up for git user
if [ ! -f ~/.ssh/id_rsa ]; then
echo -e "${YELLOW}SSH key not found. Make sure your SSH key is set up for git user access.${NC}"
echo -e "${YELLOW}You may need to run: ssh-keygen -t rsa -b 4096 -C 'your_email@example.com'${NC}"
echo -e "${YELLOW}Then add the public key to your Gitea account.${NC}"
exit 1
fi
# Check if directory exists, if not create it
if [ ! -d "$APP_DIR" ]; then
echo -e "${YELLOW}Creating application directory...${NC}"
mkdir -p $APP_DIR
# Clone the repository if it's the first time
echo -e "${YELLOW}Cloning repository...${NC}"
git clone --branch $BRANCH $REPO_URL $APP_DIR
else
echo -e "${YELLOW}Pulling latest changes...${NC}"
cd $APP_DIR
git fetch --all
git reset --hard origin/$BRANCH
fi
cd $APP_DIR
# Make sure we're using the right Node.js version
echo -e "${YELLOW}Using Node.js version $NODE_VERSION...${NC}"
export NVM_DIR="$HOME/.nvm"
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh" # This loads nvm
nvm use $NODE_VERSION || { echo -e "${RED}Failed to switch Node.js version. Make sure NVM is installed.${NC}"; exit 1; }
# Install dependencies
echo -e "${YELLOW}Installing dependencies...${NC}"
npm ci || { echo -e "${RED}Failed to install dependencies.${NC}"; exit 1; }
# Build the application
echo -e "${YELLOW}Building the application...${NC}"
npm run build || { echo -e "${RED}Build failed.${NC}"; exit 1; }
# Setup environment variables
echo -e "${YELLOW}Setting up environment variables...${NC}"
if [ ! -f ".env.production" ]; then
echo -e "${YELLOW}Creating .env.production file...${NC}"
cat > .env.production << EOF
# Database Connection
DATABASE_URL="postgresql://username:password@localhost:5432/stones"
# Authentication
AUTH_SECRET="your-auth-secret" # Replace with a strong random string
# Application
NEXT_PUBLIC_APP_URL="https://contact.boilerhaus.org"
EOF
echo -e "${YELLOW}Please update the .env.production file with your actual values.${NC}"
fi
# Setup PM2 process manager if not already configured
if ! command -v pm2 &> /dev/null; then
echo -e "${YELLOW}Installing PM2 process manager...${NC}"
npm install -g pm2
fi
# Check if the PM2 process already exists
if pm2 list | grep -q "$APP_NAME"; then
echo -e "${YELLOW}Restarting application with PM2...${NC}"
pm2 restart $APP_NAME
else
echo -e "${YELLOW}Setting up application with PM2...${NC}"
pm2 start npm --name $APP_NAME -- start -- -p 3001
# Save PM2 configuration to persist across server restarts
pm2 save
pm2 startup
fi
# Update Nginx configuration
echo -e "${YELLOW}Setting up Nginx configuration...${NC}"
NGINX_CONF="/etc/nginx/sites-available/contact-boilerhaus-org.conf"
if [ ! -f "$NGINX_CONF" ]; then
echo -e "${YELLOW}Copying Nginx configuration file...${NC}"
# Assuming contact-boilerhaus-org.conf is in the same directory as this script
cp ./contact-boilerhaus-org.conf $NGINX_CONF
# Update paths in the Nginx configuration
sed -i "s|/path/to/your/app|$APP_DIR|g" $NGINX_CONF
# Create symlink if it doesn't exist
if [ ! -f "/etc/nginx/sites-enabled/contact-boilerhaus-org.conf" ]; then
ln -s $NGINX_CONF /etc/nginx/sites-enabled/
fi
else
echo -e "${YELLOW}Nginx configuration already exists.${NC}"
fi
# Test Nginx configuration and reload (both branches)
nginx -t && systemctl reload nginx
echo -e "${GREEN}Deployment completed successfully!${NC}"
echo -e "${GREEN}Your application should now be accessible at https://contact.boilerhaus.org${NC}"

import-data.js Normal file

@@ -0,0 +1,58 @@
const { PrismaClient } = require('@prisma/client');
const prisma = new PrismaClient();
// This script imports data directly via Prisma instead of using psql
async function main() {
console.log('Importing data sources...');
try {
// Create DataSource records first
await prisma.dataSource.createMany({
data: [
{
name: 'Public Nouns',
type: 'NFT',
description: 'Public Nouns NFT holders',
createdAt: new Date(),
updatedAt: new Date()
},
{
name: 'Raid Guild',
type: 'DAO',
description: 'Raid Guild DAO members',
createdAt: new Date(),
updatedAt: new Date()
},
{
name: 'Moloch DAO',
type: 'DAO',
description: 'Moloch DAO members',
createdAt: new Date(),
updatedAt: new Date()
},
{
name: 'MetaCartel',
type: 'DAO',
description: 'MetaCartel DAO members',
createdAt: new Date(),
updatedAt: new Date()
}
],
skipDuplicates: true
});
console.log('Data sources imported successfully');
// You can add more import steps here if needed
} catch (error) {
console.error('Error importing data:', error);
} finally {
await prisma.$disconnect();
}
}
main().catch(error => {
console.error(error);
process.exit(1);
});

next.config.js Normal file

@@ -0,0 +1,7 @@
/** @type {import('next').NextConfig} */
const nextConfig = {
reactStrictMode: true,
swcMinify: true,
}
module.exports = nextConfig

package-lock.json generated

File diff suppressed because it is too large

package.json

@@ -12,7 +12,7 @@
"prisma:studio": "prisma studio"
},
"dependencies": {
- "@prisma/client": "5.10.2",
+ "@prisma/client": "^6.5.0",
"@radix-ui/react-avatar": "^1.0.4",
"@radix-ui/react-dialog": "^1.0.5",
"@radix-ui/react-dropdown-menu": "^2.0.6",
@@ -22,16 +22,16 @@
"@radix-ui/react-tabs": "^1.0.4",
"@radix-ui/react-toast": "^1.1.5",
"class-variance-authority": "^0.7.0",
- "clsx": "^2.1.0",
+ "clsx": "^2.1.1",
"express": "^4.18.2",
"framer-motion": "^11.0.5",
"lucide-react": "^0.331.0",
- "next": "14.1.0",
+ "next": "^14.2.25",
"next-themes": "^0.2.1",
"nuqs": "^1.16.0",
"react": "^18.2.0",
"react-dom": "^18.2.0",
- "tailwind-merge": "^2.2.1",
+ "tailwind-merge": "^2.6.0",
"tailwindcss-animate": "^1.0.7",
"zod": "^3.22.4"
},
@@ -42,10 +42,10 @@
"@types/react-dom": "^18.2.19",
"autoprefixer": "^10.4.17",
"eslint": "^8.56.0",
- "eslint-config-next": "14.1.0",
+ "eslint-config-next": "^14.2.25",
"postcss": "^8.4.35",
- "prisma": "^5.10.2",
+ "prisma": "^6.5.0",
"tailwindcss": "^3.4.1",
"typescript": "^5.3.3"
}
}

prisma/schema.prisma

@@ -8,27 +8,26 @@ datasource db {
}
model Contact {
id String @id @default(cuid())
ethereumAddress String @unique
id String @id @default(cuid())
ethereumAddress String @unique
ensName String?
name String?
email String?
twitter String?
discord String?
telegram String?
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
farcaster String?
otherSocial String?
warpcastAddress String?
ethereumAddress2 String?
warpcastAddress String?
ensName String?
name String?
farcaster String?
twitter String?
discord String?
telegram String?
email String?
otherSocial String?
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
// Relations
nftHoldings NftHolding[]
tokenHoldings TokenHolding[]
daoMemberships DaoMembership[]
notes Note[]
tags TagsOnContacts[]
ContactSource ContactSource[]
daoMemberships DaoMembership[]
nftHoldings NftHolding[]
notes Note[]
tags TagsOnContacts[]
tokenHoldings TokenHolding[]
}
model NftHolding {
@@ -101,13 +100,14 @@ model TagsOnContacts {
}
model DataSource {
id String @id @default(cuid())
name String @unique
type String
description String?
lastScraped DateTime?
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
id String @id @default(cuid())
name String @unique
type String
description String?
lastScraped DateTime?
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
ContactSource ContactSource[]
}
model ScrapingJob {
@@ -123,3 +123,17 @@ model ScrapingJob {
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
}
model ContactSource {
id String @id
contactId String
dataSourceId String
createdAt DateTime @db.Timestamp(6)
updatedAt DateTime @db.Timestamp(6)
Contact Contact @relation(fields: [contactId], references: [id], onDelete: Cascade, onUpdate: NoAction)
DataSource DataSource @relation(fields: [dataSourceId], references: [id], onDelete: Cascade, onUpdate: NoAction)
@@unique([contactId, dataSourceId])
@@index([contactId])
@@index([dataSourceId])
}

push-to-gitea.sh Executable file

@@ -0,0 +1,56 @@
#!/bin/bash
# Script to push changes to Gitea repository
set -e # Exit immediately if a command exits with a non-zero status
# Colors for pretty output
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
RED='\033[0;31m'
NC='\033[0m' # No Color
# Check if git is installed
if ! command -v git &> /dev/null; then
echo -e "${RED}Error: git is not installed. Please install git first.${NC}"
exit 1
fi
# Check if we're in a git repository
if ! git rev-parse --is-inside-work-tree &> /dev/null; then
echo -e "${RED}Error: Not a git repository. Please run this script from within a git repository.${NC}"
exit 1
fi
# Check if remote already exists
if git remote | grep -q "gitea"; then
echo -e "${YELLOW}Remote 'gitea' already exists.${NC}"
else
echo -e "${YELLOW}Adding 'gitea' remote...${NC}"
git remote add gitea git@git.boilerhaus.org:boiler/stones.git
fi
# Get current branch
CURRENT_BRANCH=$(git symbolic-ref --short HEAD)
echo -e "${YELLOW}Current branch: ${CURRENT_BRANCH}${NC}"
# Check for uncommitted changes
if ! git diff-index --quiet HEAD --; then
echo -e "${YELLOW}You have uncommitted changes.${NC}"
read -p "Do you want to commit them? (y/n): " -n 1 -r
echo
if [[ $REPLY =~ ^[Yy]$ ]]; then
read -p "Enter commit message: " COMMIT_MSG
git add .
git commit -m "$COMMIT_MSG"
else
echo -e "${YELLOW}Continuing without committing changes.${NC}"
fi
fi
# Push to Gitea
echo -e "${YELLOW}Pushing to Gitea...${NC}"
git push -u gitea $CURRENT_BRANCH
echo -e "${GREEN}Successfully pushed to Gitea!${NC}"
echo -e "${GREEN}Repository URL: git@git.boilerhaus.org:boiler/stones.git${NC}"
echo -e "${YELLOW}To deploy, run the deploy.sh script on your server.${NC}"
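The uncommitted-changes probe in this script, `git diff-index --quiet HEAD --`, exits 0 when the working tree matches HEAD and non-zero otherwise. That behavior can be verified in a throwaway repository (all names and paths below are scratch values):

```shell
# Throwaway repo exercising the uncommitted-changes probe used above.
rm -rf ./diffindex_demo && mkdir ./diffindex_demo && cd ./diffindex_demo
git init -q
echo a > file.txt && git add file.txt
git -c user.email=demo@example.com -c user.name=demo commit -q -m init
# Clean tree: probe succeeds.
git diff-index --quiet HEAD -- && echo clean > state_before.txt || echo dirty > state_before.txt
echo b >> file.txt                     # unstaged modification
# Dirty tree: probe fails.
git diff-index --quiet HEAD -- && echo clean > state_after.txt || echo dirty > state_after.txt
cd ..
```

Because the probe is exit-code based, it composes cleanly with `if !` as the script does, without parsing any porcelain output.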

run-dev.sh Executable file

@@ -0,0 +1,119 @@
#!/bin/bash
# Script to run a development server for the Stones Database application
set -e # Exit immediately if a command exits with a non-zero status
# Colors for pretty output
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
RED='\033[0;31m'
NC='\033[0m' # No Color
# Check for required tools
echo -e "${YELLOW}Checking for required tools...${NC}"
# Check if Node.js is installed
if ! command -v node &> /dev/null; then
echo -e "${RED}Error: Node.js is not installed. Please install Node.js first.${NC}"
exit 1
fi
# Check Node.js version
NODE_VERSION=$(node -v | cut -d "v" -f 2 | cut -d "." -f 1)
if [ "$NODE_VERSION" -lt "16" ]; then
echo -e "${RED}Error: Node.js version 16 or higher is required. Current version: $(node -v)${NC}"
exit 1
fi
# Check if npm is installed
if ! command -v npm &> /dev/null; then
echo -e "${RED}Error: npm is not installed. Please install npm first.${NC}"
exit 1
fi
# Print npm and Node.js versions for diagnostic purposes
echo -e "${YELLOW}Node.js version: $(node -v)${NC}"
echo -e "${YELLOW}npm version: $(npm -v)${NC}"
# Check Next.js version for diagnostic purposes
echo -e "${YELLOW}Next.js version: $(npm list next | grep next@ | head -1)${NC}"
# Check for .env file
if [ ! -f ".env.local" ] && [ ! -f ".env" ]; then
echo -e "${YELLOW}Creating .env.local file with development settings...${NC}"
cat > .env.local << EOF
# Database Connection for Development
DATABASE_URL="postgresql://postgres:postgres@localhost:5432/stones"
# Authentication
AUTH_SECRET="dev-secret-key-for-testing"
# Application
NEXT_PUBLIC_APP_URL="http://localhost:3000"
EOF
echo -e "${YELLOW}Please update the .env.local file with your actual development database values.${NC}"
fi
# Install dependencies if node_modules doesn't exist
if [ ! -d "node_modules" ]; then
echo -e "${YELLOW}Installing dependencies...${NC}"
npm install
else
echo -e "${YELLOW}Dependencies already installed. To reinstall, remove the node_modules directory.${NC}"
fi
# Run database migrations if schema.prisma exists
if [ -f "prisma/schema.prisma" ]; then
echo -e "${YELLOW}Running database migrations...${NC}"
npx prisma migrate dev --name dev-migration
echo -e "${YELLOW}Generating Prisma client...${NC}"
npx prisma generate
fi
# Check PostgreSQL connectivity
echo -e "${YELLOW}Checking PostgreSQL connectivity...${NC}"
if command -v pg_isready &> /dev/null; then
if pg_isready -h localhost -p 5432; then
echo -e "${GREEN}PostgreSQL server is running at localhost:5432${NC}"
else
echo -e "${RED}Warning: PostgreSQL server at localhost:5432 is not responding${NC}"
echo -e "${RED}Please ensure your PostgreSQL server is running${NC}"
fi
else
echo -e "${YELLOW}pg_isready not found, skipping PostgreSQL connectivity check${NC}"
fi
# Add network debug information
echo -e "${YELLOW}Network interfaces:${NC}"
ip addr | grep "inet " | awk '{print $2}' | cut -d/ -f1
echo ""
# Create next.config.js file with proper configuration
echo -e "${YELLOW}Creating/updating next.config.js file...${NC}"
if [ -f "next.config.js" ]; then
mv next.config.js next.config.js.bak
fi
cat > next.config.js << EOF
/** @type {import('next').NextConfig} */
const nextConfig = {
reactStrictMode: true,
swcMinify: true,
}
module.exports = nextConfig
EOF
# Add HOST environment variable for Next.js to bind to all interfaces
echo -e "${YELLOW}Starting Next.js with HOST=0.0.0.0 to enable network access${NC}"
echo -e "${GREEN}Starting development server...${NC}"
echo -e "${GREEN}The application will be available at:${NC}"
echo -e "${GREEN} - http://localhost:3000${NC}"
ip addr | grep "inet " | grep -v "127.0.0.1" | awk '{print $2}' | cut -d/ -f1 | awk '{print " - http://" $1 ":3000"}'
# Use HOST environment variable to allow access from any IP
HOST=0.0.0.0 npm run dev
# This part will execute when the server is stopped
echo -e "${YELLOW}Development server stopped.${NC}"
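The PostgreSQL reachability check above falls back to skipping the test when `pg_isready` is absent. As a rough sketch (a hypothetical helper, not part of this commit), the same probe can be done with a stdlib TCP connect, so the check works on hosts without the PostgreSQL client tools:

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Mirrors the pg_isready check in the dev script for the default port.
    status = "running" if port_open("localhost", 5432) else "not responding"
    print(f"PostgreSQL at localhost:5432 is {status}")
```

Note this only confirms something is listening on the port, not that it speaks the PostgreSQL protocol, so it is a weaker check than `pg_isready`.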

View File

@ -0,0 +1,78 @@
#!/usr/bin/env python3
"""
Check DAOhaus API Response
This script checks the response from the DAOhaus API for Public Haus DAO.
"""
import requests
import sys
# Constants
DAOHAUS_API_URL = "https://admin.daohaus.club/api"
PUBLIC_HAUS_DAO_ID = "0xf5d6b637a9185707f52d40d452956ca49018247a" # Public Haus DAO ID on Optimism
CHAIN_ID = "10" # Optimism chain ID
def check_api_response(url):
"""
Check the response from an API URL
Args:
url: The URL to check
"""
print(f"Checking URL: {url}")
try:
# Make request to API
response = requests.get(url)
# Print status code
print(f"Status code: {response.status_code}")
# Print headers
print("Headers:")
for key, value in response.headers.items():
print(f" {key}: {value}")
# Print content
print("\nContent:")
print(response.text)
# Try to parse as JSON
try:
data = response.json()
print("\nJSON data:")
print(data)
except Exception as e:
print(f"\nError parsing JSON: {e}")
except Exception as e:
print(f"Exception checking API: {e}")
def main():
"""Main function"""
# Check DAO info endpoint
dao_url = f"{DAOHAUS_API_URL}/dao/{CHAIN_ID}/{PUBLIC_HAUS_DAO_ID}"
print(f"Checking DAO info endpoint: {dao_url}")
check_api_response(dao_url)
print("\n" + "=" * 80 + "\n")
# Check members endpoint
members_url = f"{DAOHAUS_API_URL}/dao/{CHAIN_ID}/{PUBLIC_HAUS_DAO_ID}/members"
print(f"Checking members endpoint: {members_url}")
check_api_response(members_url)
# Try alternative API URL
print("\n" + "=" * 80 + "\n")
print("Trying alternative API URL...")
alt_api_url = "https://api.daohaus.club/api"
alt_dao_url = f"{alt_api_url}/dao/{CHAIN_ID}/{PUBLIC_HAUS_DAO_ID}"
print(f"Checking alternative DAO info endpoint: {alt_dao_url}")
check_api_response(alt_dao_url)
return 0
if __name__ == "__main__":
sys.exit(main())

View File

@ -0,0 +1,147 @@
#!/usr/bin/env python3
"""
Check DAOhaus v3 Subgraph
This script checks if the DAOhaus v3 subgraph on Optimism is responding
and lists any available DAOs without filtering.
"""
import requests
import json
# Constants
SUBGRAPH_URLS = [
"https://api.thegraph.com/subgraphs/name/hausdao/daohaus-v3-optimism",
"https://api.thegraph.com/subgraphs/name/hausdao/daohaus-v3",
"https://api.thegraph.com/subgraphs/name/hausdao/daohaus-v3-goerli",
"https://api.thegraph.com/subgraphs/name/hausdao/daohaus-v3-gnosis",
"https://api.thegraph.com/subgraphs/name/hausdao/daohaus-v3-arbitrum",
"https://api.thegraph.com/subgraphs/name/hausdao/daohaus-v3-polygon",
"https://api.thegraph.com/subgraphs/name/hausdao/daohaus-v3-celo"
]
def check_subgraph(url):
"""
Check if a subgraph is responding
Args:
url: The subgraph URL to check
Returns:
True if responding, False otherwise
"""
# Simple query to check if subgraph is responding
query = """
query {
_meta {
block {
number
}
deployment
hasIndexingErrors
}
}
"""
try:
# Make request to subgraph
response = requests.post(
url,
json={"query": query}
)
# Check for errors
if response.status_code != 200:
print(f"Error checking subgraph: {response.text}")
return False
data = response.json()
# Check if response has data
if not data.get("data") or not data["data"].get("_meta"):
print(f"Invalid response from subgraph: {data}")
return False
# Get meta data
meta = data["data"]["_meta"]
print(f"Subgraph is responding at {url}")
print(f"Block number: {meta['block']['number']}")
print(f"Deployment: {meta['deployment']}")
print(f"Has indexing errors: {meta['hasIndexingErrors']}")
print("-" * 50)
return True
except Exception as e:
print(f"Exception checking subgraph: {e}")
return False
def list_daos(url):
"""
List all DAOs in a subgraph
Args:
url: The subgraph URL to query
"""
# GraphQL query to list all DAOs
query = """
query {
daos(first: 10) {
id
name
createdAt
totalShares
totalLoot
activeMemberCount
}
}
"""
try:
# Make request to subgraph
response = requests.post(
url,
json={"query": query}
)
# Check for errors
if response.status_code != 200:
print(f"Error listing DAOs: {response.text}")
return
data = response.json()
# Check if DAOs exist
if not data.get("data") or not data["data"].get("daos"):
print("No DAOs found")
return
# Get DAOs
daos = data["data"]["daos"]
print(f"Found {len(daos)} DAOs")
# Print results
for dao in daos:
print(f"ID: {dao['id']}")
print(f"Name: {dao['name']}")
print(f"Created: {dao['createdAt']}")
print(f"Members: {dao['activeMemberCount']}")
print(f"Shares: {dao['totalShares']}")
print(f"Loot: {dao['totalLoot']}")
print("-" * 50)
except Exception as e:
print(f"Exception listing DAOs: {e}")
def main():
"""Main function"""
print("Checking DAOhaus v3 subgraphs...")
for url in SUBGRAPH_URLS:
print(f"\nChecking subgraph at {url}...")
if check_subgraph(url):
print("\nListing DAOs in subgraph...")
list_daos(url)
return 0
if __name__ == "__main__":
    raise SystemExit(main())
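Each of these check scripts issues a single `requests.post` per subgraph, so a transient hiccup reads as a hard failure. A minimal sketch of a retry-with-backoff wrapper that could be layered around those calls — a hypothetical helper written against a generic callable, so it is library-agnostic:

```python
import time
from typing import Callable, TypeVar

T = TypeVar("T")

def with_retries(fn: Callable[[], T], attempts: int = 3, base_delay: float = 0.5) -> T:
    """Call fn(), retrying with exponential backoff; re-raise after the last attempt."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise
            # 0.5s, 1s, 2s, ... between attempts
            time.sleep(base_delay * (2 ** attempt))
    raise RuntimeError("unreachable")
```

Usage would look like `with_retries(lambda: requests.post(url, json={"query": query}))`; whether three attempts is appropriate depends on the subgraph's rate limits.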

View File

@ -0,0 +1,70 @@
#!/usr/bin/env python3
"""
Check DAOhaus v3 Subgraph Status
This script checks the status of the DAOhaus v3 subgraph with a simple query.
"""
import requests
import json
# Constants
SUBGRAPH_URL = "https://api.thegraph.com/subgraphs/id/HouDe2pTdyKM9CTG1aodnPPPhm7U148BCH7eJ4HHwpdQ"
def check_subgraph_status():
"""
Check the status of the subgraph
"""
# Simple query to check if subgraph is responding
query = """
query {
_meta {
block {
number
}
deployment
hasIndexingErrors
}
}
"""
print(f"Checking subgraph at {SUBGRAPH_URL}...")
try:
# Make request to subgraph
response = requests.post(
SUBGRAPH_URL,
json={"query": query}
)
# Print status code
print(f"Status code: {response.status_code}")
# Print headers
print("Headers:")
for key, value in response.headers.items():
print(f" {key}: {value}")
# Print content
print("\nContent:")
print(response.text)
# Try to parse as JSON
try:
data = response.json()
print("\nJSON data:")
print(json.dumps(data, indent=2))
except Exception as e:
print(f"\nError parsing JSON: {e}")
except Exception as e:
print(f"Exception checking subgraph: {e}")
def main():
"""Main function"""
check_subgraph_status()
return 0
if __name__ == "__main__":
    raise SystemExit(main())

View File

@ -0,0 +1,227 @@
#!/usr/bin/env python3
"""
Explore DAOhaus v3 Subgraph
This script explores the DAOhaus v3 subgraph and lists all available DAOs.
It can also search for DAOs by name or ID.
"""
import sys
import requests
import json
import argparse
# Constants
SUBGRAPH_URL = "https://api.thegraph.com/subgraphs/id/HouDe2pTdyKM9CTG1aodnPPPhm7U148BCH7eJ4HHwpdQ"
def list_daos(limit=100, skip=0):
"""
List all DAOs in the subgraph
Args:
limit: Maximum number of DAOs to return
skip: Number of DAOs to skip
Returns:
List of DAOs
"""
# GraphQL query to list all DAOs
query = """
query {
daos(first: %d, skip: %d, orderBy: createdAt, orderDirection: desc) {
id
name
createdAt
totalShares
totalLoot
activeMemberCount
}
}
""" % (limit, skip)
# Make request to subgraph
response = requests.post(
SUBGRAPH_URL,
json={"query": query}
)
# Check for errors
if response.status_code != 200:
print(f"Error listing DAOs: {response.text}")
return []
data = response.json()
# Check if DAOs exist
if not data.get("data") or not data["data"].get("daos"):
print("No DAOs found")
return []
# Get DAOs
daos = data["data"]["daos"]
print(f"Found {len(daos)} DAOs")
return daos
def search_daos_by_name(name, limit=100):
"""
Search for DAOs by name
Args:
name: Name to search for
limit: Maximum number of DAOs to return
Returns:
List of matching DAOs
"""
# GraphQL query to search for DAOs by name
query = """
query {
daos(first: %d, where: {name_contains_nocase: "%s"}, orderBy: createdAt, orderDirection: desc) {
id
name
createdAt
totalShares
totalLoot
activeMemberCount
}
}
""" % (limit, name)
# Make request to subgraph
response = requests.post(
SUBGRAPH_URL,
json={"query": query}
)
# Check for errors
if response.status_code != 200:
print(f"Error searching DAOs: {response.text}")
return []
data = response.json()
# Check if DAOs exist
if not data.get("data") or not data["data"].get("daos"):
print(f"No DAOs found with name containing '{name}'")
return []
# Get DAOs
daos = data["data"]["daos"]
print(f"Found {len(daos)} DAOs with name containing '{name}'")
return daos
def get_dao_by_id(dao_id):
"""
Get a DAO by ID
Args:
dao_id: ID of the DAO to get
Returns:
DAO data if found, None otherwise
"""
# GraphQL query to get a DAO by ID
query = """
query {
dao(id: "%s") {
id
name
createdAt
totalShares
totalLoot
activeMemberCount
members {
id
memberAddress
shares
loot
createdAt
}
}
}
""" % dao_id.lower()
# Make request to subgraph
response = requests.post(
SUBGRAPH_URL,
json={"query": query}
)
# Check for errors
if response.status_code != 200:
print(f"Error getting DAO: {response.text}")
return None
data = response.json()
# Check if DAO exists
if not data.get("data") or not data["data"].get("dao"):
print(f"DAO not found with ID: {dao_id}")
return None
# Get DAO
dao = data["data"]["dao"]
print(f"Found DAO with ID: {dao_id}")
print(f"Name: {dao['name']}")
print(f"Created: {dao['createdAt']}")
print(f"Members: {dao['activeMemberCount']}")
print(f"Shares: {dao['totalShares']}")
print(f"Loot: {dao['totalLoot']}")
print(f"Member count: {len(dao['members'])}")
return dao
def print_dao_info(dao):
"""
Print information about a DAO
Args:
dao: DAO data to print
"""
print(f"ID: {dao['id']}")
print(f"Name: {dao['name']}")
print(f"Created: {dao['createdAt']}")
print(f"Members: {dao['activeMemberCount']}")
print(f"Shares: {dao['totalShares']}")
print(f"Loot: {dao['totalLoot']}")
print("-" * 50)
def main():
"""Main function"""
parser = argparse.ArgumentParser(description="Explore DAOhaus v3 Subgraph")
parser.add_argument("--list", action="store_true", help="List all DAOs")
parser.add_argument("--search", type=str, help="Search for DAOs by name")
parser.add_argument("--id", type=str, help="Get a DAO by ID")
parser.add_argument("--limit", type=int, default=100, help="Maximum number of DAOs to return")
parser.add_argument("--skip", type=int, default=0, help="Number of DAOs to skip")
args = parser.parse_args()
if args.id:
# Get a DAO by ID
dao = get_dao_by_id(args.id)
if dao:
print_dao_info(dao)
elif args.search:
# Search for DAOs by name
daos = search_daos_by_name(args.search, args.limit)
for dao in daos:
print_dao_info(dao)
elif args.list:
# List all DAOs
daos = list_daos(args.limit, args.skip)
for dao in daos:
print_dao_info(dao)
else:
# Default to listing all DAOs
print("Listing all DAOs...")
daos = list_daos(args.limit, args.skip)
for dao in daos:
print_dao_info(dao)
return 0
if __name__ == "__main__":
sys.exit(main())
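`list_daos` exposes `first`/`skip` paging, but the `--limit`/`--skip` flags leave the looping to the caller. A sketch of a paginator (a hypothetical helper, assuming a `list_daos`-shaped callable) that walks all pages until a short page signals the end:

```python
from typing import Any, Callable, Dict, Iterator, List

def paginate_daos(
    fetch_page: Callable[[int, int], List[Dict[str, Any]]],
    page_size: int = 100,
) -> Iterator[Dict[str, Any]]:
    """Yield DAOs across pages; fetch_page(limit, skip) behaves like list_daos."""
    skip = 0
    while True:
        page = fetch_page(page_size, skip)
        yield from page
        # A page shorter than page_size means there is nothing left to fetch.
        if len(page) < page_size:
            break
        skip += page_size
```

Something like `for dao in paginate_daos(list_daos): print_dao_info(dao)` would then cover the whole collection instead of the first `--limit` entries.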

View File

@ -0,0 +1,205 @@
#!/usr/bin/env python3
"""
Find Public Haus DAO ID
This script queries the DAOhaus v3 subgraph on Optimism mainnet to find the Public Haus DAO ID.
It searches for DAOs with names containing 'Public Haus' or similar terms, and also checks
a specific hard-coded DAO ID.
"""
import os
import sys
import requests
import json
from dotenv import load_dotenv
# Load environment variables
load_dotenv()
# Constants
SUBGRAPH_URL = "https://api.thegraph.com/subgraphs/name/hausdao/daohaus-v3-optimism"
SPECIFIC_DAO_ID = "0xf5d6b637a9185707f52d40d452956ca49018247a" # Public Haus DAO ID to check
def check_specific_dao(dao_id):
"""
Check if a specific DAO ID exists
Args:
dao_id: The DAO ID to check
Returns:
DAO data if found, None otherwise
"""
# GraphQL query to check a specific DAO
query = """
query {
dao(id: "%s") {
id
name
createdAt
totalShares
totalLoot
activeMemberCount
}
}
""" % dao_id.lower()
# Make request to subgraph
response = requests.post(
SUBGRAPH_URL,
json={"query": query}
)
# Check for errors
if response.status_code != 200:
print(f"Error checking DAO: {response.text}")
return None
data = response.json()
# Check if DAO exists
if not data.get("data") or not data["data"].get("dao"):
print(f"DAO not found with ID: {dao_id}")
return None
# Get DAO
dao = data["data"]["dao"]
print(f"Found DAO with ID: {dao_id}")
print(f"Name: {dao['name']}")
print(f"Created: {dao['createdAt']}")
print(f"Members: {dao['activeMemberCount']}")
print(f"Shares: {dao['totalShares']}")
print(f"Loot: {dao['totalLoot']}")
print("-" * 50)
return dao
def search_daos(search_term):
"""
Search for DAOs with names containing the search term
Args:
search_term: Term to search for in DAO names
Returns:
List of matching DAOs
"""
# GraphQL query to search for DAOs
query = """
query {
daos(where: {name_contains_nocase: "%s"}, first: 100) {
id
name
createdAt
totalShares
totalLoot
activeMemberCount
}
}
""" % search_term
# Make request to subgraph
response = requests.post(
SUBGRAPH_URL,
json={"query": query}
)
# Check for errors
if response.status_code != 200:
print(f"Error searching DAOs: {response.text}")
return []
data = response.json()
# Check if DAOs exist
if not data.get("data") or not data["data"].get("daos"):
print(f"No DAOs found with name containing '{search_term}'")
return []
# Get DAOs
daos = data["data"]["daos"]
print(f"Found {len(daos)} DAOs with name containing '{search_term}'")
return daos
def main():
"""Main function"""
# First check the specific DAO ID
print(f"Checking specific DAO ID: {SPECIFIC_DAO_ID}...")
specific_dao = check_specific_dao(SPECIFIC_DAO_ID)
# Search terms to try
search_terms = ["Public Haus", "PublicHaus", "Public", "Haus"]
all_daos = []
# Try each search term
for term in search_terms:
print(f"\nSearching for DAOs with name containing '{term}'...")
daos = search_daos(term)
all_daos.extend(daos)
# Print results
for dao in daos:
print(f"ID: {dao['id']}")
print(f"Name: {dao['name']}")
print(f"Created: {dao['createdAt']}")
print(f"Members: {dao['activeMemberCount']}")
print(f"Shares: {dao['totalShares']}")
print(f"Loot: {dao['totalLoot']}")
print("-" * 50)
# If no DAOs found, try listing all DAOs
if not all_daos and not specific_dao:
print("\nNo DAOs found with the search terms. Listing all DAOs...")
# GraphQL query to list all DAOs
query = """
query {
daos(first: 100) {
id
name
createdAt
totalShares
totalLoot
activeMemberCount
}
}
"""
# Make request to subgraph
response = requests.post(
SUBGRAPH_URL,
json={"query": query}
)
# Check for errors
if response.status_code != 200:
print(f"Error listing DAOs: {response.text}")
return 1
data = response.json()
# Check if DAOs exist
if not data.get("data") or not data["data"].get("daos"):
print("No DAOs found")
return 1
# Get DAOs
daos = data["data"]["daos"]
print(f"Found {len(daos)} DAOs")
# Print results
for dao in daos:
print(f"ID: {dao['id']}")
print(f"Name: {dao['name']}")
print(f"Created: {dao['createdAt']}")
print(f"Members: {dao['activeMemberCount']}")
print(f"Shares: {dao['totalShares']}")
print(f"Loot: {dao['totalLoot']}")
print("-" * 50)
return 0
if __name__ == "__main__":
sys.exit(main())
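Address case-insensitivity is handled ad hoc above: the GraphQL queries call `dao_id.lower()` while the SQL lookups rely on `ILIKE`. A sketch of a single normalization helper (hypothetical, not part of the commit) that both paths could share, validating the `0x`-prefixed 40-hex-digit shape along the way:

```python
def normalize_address(address: str) -> str:
    """Lowercase a 0x-prefixed Ethereum address for case-insensitive comparison."""
    addr = address.strip().lower()
    if not (addr.startswith("0x") and len(addr) == 42):
        raise ValueError(f"not a valid Ethereum address: {address!r}")
    int(addr[2:], 16)  # raises ValueError if the body is not hex
    return addr
```

Storing only normalized addresses would also let the `ILIKE` lookups become plain equality comparisons, which can use an ordinary index.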

View File

@ -0,0 +1,351 @@
#!/usr/bin/env python3
"""
Import Public Haus Members using Optimism Etherscan API
This script fetches holders of the Public Haus shares token using the Optimism Etherscan API,
imports them into the database, and links them to the Public Haus DAO.
Usage:
python import_public_haus_etherscan.py
"""
import os
import sys
import logging
import json
import time
import requests
from typing import Dict, Any, List, Optional
from dotenv import load_dotenv
# Add parent directory to path to import utils
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from utils.db_connector import DatabaseConnector
from utils.logger import setup_logger
# Load environment variables
load_dotenv()
# Setup logging
logger = setup_logger("public_haus_etherscan_importer")
# Constants
PUBLIC_HAUS_DAO_ID = "0xf5d6b637a9185707f52d40d452956ca49018247a" # Public Haus DAO ID on Optimism
SHARES_TOKEN_ADDRESS = "0x4950c436F69c8b4F68ed814A70a5E1D94495c4a7" # From the image, sharesToken address
# Optimism Etherscan API
OPTIMISM_ETHERSCAN_API_URL = "https://api-optimistic.etherscan.io/api"
class PublicHausEtherscanImporter:
"""Importer for Public Haus members using Optimism Etherscan API"""
def __init__(self):
"""Initialize the importer"""
# Initialize database
self.db = DatabaseConnector()
# Get Etherscan API key
self.etherscan_api_key = os.getenv("ETHERSCAN_API_KEY")
if not self.etherscan_api_key:
logger.warning("ETHERSCAN_API_KEY not set, using API without key (rate limited)")
self.etherscan_api_key = ""
# Register data source
self.data_source_id = self.register_data_source()
# Initialize scraping job
self.job_id = self.db.create_scraping_job(
source_name="Public Haus DAO Etherscan",
status="running"
)
logger.info(f"Created scraping job with ID: {self.job_id}")
def register_data_source(self) -> str:
"""Register the Public Haus data source in the database"""
return self.db.upsert_data_source(
name="Public Haus DAO Etherscan",
source_type="blockchain",
description="Public Haus DAO members identified by token holdings via Etherscan"
)
def get_token_info(self) -> Dict[str, Any]:
"""
Get information about the shares token from Etherscan
Returns:
Token information
"""
try:
# Get token info from Etherscan
params = {
"module": "token",
"action": "tokeninfo",
"contractaddress": SHARES_TOKEN_ADDRESS,
"apikey": self.etherscan_api_key
}
response = requests.get(OPTIMISM_ETHERSCAN_API_URL, params=params)
data = response.json()
if data["status"] == "1":
token_info = data["result"][0]
logger.info(f"Token info: {token_info.get('name')} ({token_info.get('symbol')})")
return token_info
else:
# If Etherscan API fails, use hardcoded values
logger.warning(f"Error getting token info from Etherscan: {data.get('message')}")
return {
"name": "Public Haus Shares",
"symbol": "SHARES",
"decimals": "18",
"totalSupply": "0"
}
except Exception as e:
logger.error(f"Error getting token info: {e}")
# Return default values
return {
"name": "Public Haus Shares",
"symbol": "SHARES",
"decimals": "18",
"totalSupply": "0"
}
def fetch_token_holders(self) -> List[Dict[str, Any]]:
"""
Fetch holders of the shares token using Etherscan API
Returns:
List of token holders with their balances
"""
try:
# Get token info
token_info = self.get_token_info()
decimals = int(token_info.get("decimals", 18))
# Get token holders from Etherscan
params = {
"module": "token",
"action": "tokenholderlist",
"contractaddress": SHARES_TOKEN_ADDRESS,
"page": 1,
"offset": 100, # Get up to 100 holders
"apikey": self.etherscan_api_key
}
response = requests.get(OPTIMISM_ETHERSCAN_API_URL, params=params)
data = response.json()
holders = []
if data["status"] == "1":
for holder in data["result"]:
address = holder["address"]
balance = int(holder["TokenHolderQuantity"])
# Skip zero balances
if balance > 0:
holders.append({
"address": address,
"balance": balance,
"balanceFormatted": balance / (10 ** decimals),
"dao": "Public Haus"
})
logger.info(f"Found {len(holders)} token holders with non-zero balance")
else:
# If Etherscan API fails, try alternative approach
logger.warning(f"Error getting token holders from Etherscan: {data.get('message')}")
# If the tokenholderlist endpoint is not available, try getting transfers
params = {
"module": "account",
"action": "tokentx",
"contractaddress": SHARES_TOKEN_ADDRESS,
"page": 1,
"offset": 1000, # Get up to 1000 transfers
"sort": "desc",
"apikey": self.etherscan_api_key
}
response = requests.get(OPTIMISM_ETHERSCAN_API_URL, params=params)
data = response.json()
if data["status"] == "1":
# Extract unique addresses from transfers
addresses = set()
for tx in data["result"]:
addresses.add(tx["to"])
addresses.add(tx["from"])
# Remove zero address
if "0x0000000000000000000000000000000000000000" in addresses:
addresses.remove("0x0000000000000000000000000000000000000000")
# Create holder objects
for address in addresses:
holders.append({
"address": address,
"balance": 1, # We don't know the actual balance
"balanceFormatted": 1,
"dao": "Public Haus"
})
logger.info(f"Found {len(holders)} unique addresses from token transfers")
# If we still don't have any holders, use the DAO address itself
if not holders:
logger.warning("No token holders found, using DAO address as fallback")
holders.append({
"address": PUBLIC_HAUS_DAO_ID,
"balance": 1,
"balanceFormatted": 1,
"dao": "Public Haus"
})
return holders
except Exception as e:
logger.error(f"Error fetching token holders: {e}")
raise
def process_holder(self, holder: Dict[str, Any]) -> Optional[str]:
"""
Process a token holder and import into the database
Args:
holder: Token holder information
Returns:
Contact ID if successful, None otherwise
"""
try:
# Extract holder information
address = holder["address"]
balance = holder["balance"]
balance_formatted = holder["balanceFormatted"]
dao_name = holder["dao"]
# Check if contact exists
query = 'SELECT id, name, "ensName" FROM "Contact" WHERE "ethereumAddress" ILIKE %(address)s'
existing_contacts = self.db.execute_query(query, {"address": address})
contact_id = None
if existing_contacts:
# Use existing contact
contact_id = existing_contacts[0]["id"]
logger.info(f"Found existing contact {contact_id} for address {address}")
else:
# Create new contact
contact_id = self.db.upsert_contact(
ethereum_address=address,
ens_name=None
)
logger.info(f"Created new contact {contact_id} for address {address}")
# Add DAO membership
self.db.execute_update(
"""
INSERT INTO "DaoMembership" ("contactId", "daoName", "shares", "loot", "delegatingTo")
VALUES (%(contact_id)s, %(dao_name)s, %(shares)s, %(loot)s, %(delegating_to)s)
ON CONFLICT ("contactId", "daoName")
DO UPDATE SET
"shares" = %(shares)s,
"loot" = %(loot)s,
"updatedAt" = NOW()
""",
{
"contact_id": contact_id,
"dao_name": dao_name,
"shares": balance, # Use token balance as shares
"loot": 0, # We don't have loot information
"delegating_to": None
}
)
# Add note about membership
note_content = f"Public Haus DAO Member\nShares Token Balance: {balance_formatted}"
self.db.add_note_to_contact(
contact_id=contact_id,
content=note_content
)
# Add tag for the DAO
self.db.add_tag_to_contact(
contact_id=contact_id,
tag_name=dao_name
)
# Link to data source
self.db.link_contact_to_data_source(contact_id, self.data_source_id)
return contact_id
except Exception as e:
logger.error(f"Error processing holder {holder.get('address')}: {e}")
return None
def run(self) -> int:
"""
Run the importer
Returns:
Number of holders imported
"""
try:
# Fetch token holders
holders = self.fetch_token_holders()
if not holders:
logger.info("No token holders found")
self.db.update_scraping_job(self.job_id, "completed")
return 0
# Process holders
imported_count = 0
existing_count = 0
for holder in holders:
try:
contact_id = self.process_holder(holder)
if contact_id:
imported_count += 1
except Exception as e:
logger.exception(f"Error processing holder {holder.get('address')}: {e}")
# Add a small delay to avoid overwhelming the database
time.sleep(0.1)
# Complete the scraping job
self.db.update_scraping_job(
self.job_id,
"completed",
records_processed=len(holders),
records_added=imported_count,
records_updated=existing_count
)
logger.info(f"Imported {imported_count} holders out of {len(holders)} processed")
return imported_count
except Exception as e:
# Update the scraping job with error
self.db.update_scraping_job(self.job_id, "failed", error_message=str(e))
logger.exception(f"Error importing holders: {e}")
raise
def main():
"""Main function"""
try:
importer = PublicHausEtherscanImporter()
imported_count = importer.run()
logger.info(f"Import completed successfully. Imported {imported_count} token holders.")
return 0
except Exception as e:
logger.exception(f"Error importing token holders: {e}")
return 1
if __name__ == "__main__":
sys.exit(main())
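`fetch_token_holders` computes `balance / (10 ** decimals)` with float division, which silently loses precision for large 18-decimal balances (Python floats only hold ~15-16 significant digits). A `Decimal`-based sketch (a hypothetical helper, not part of the importer) that keeps the conversion exact:

```python
from decimal import Decimal

def format_token_amount(raw: int, decimals: int = 18) -> Decimal:
    """Convert a raw integer token quantity to human units without float loss."""
    return Decimal(raw) / (Decimal(10) ** decimals)
```

For display the result formats cleanly with `f"{format_token_amount(raw):f}"`; the raw integer should still be what gets stored in the database.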

View File

@ -0,0 +1,248 @@
#!/usr/bin/env python3
"""
Import Public Haus Members from DAOhaus v3 Subgraph
This script fetches members of Public Haus DAO from the DAOhaus v3 subgraph on Optimism mainnet,
imports them into the database, and links them to the Public Haus DAO.
Usage:
python import_public_haus_members.py
"""
import os
import sys
import logging
import requests
import json
from typing import Dict, Any, List, Optional
from dotenv import load_dotenv
# Add parent directory to path to import utils
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from utils.db_connector import DatabaseConnector
from utils.logger import setup_logger
# Load environment variables
load_dotenv()
# Setup logging
logger = setup_logger("public_haus_importer")
# Constants
SUBGRAPH_URL = "https://api.thegraph.com/subgraphs/name/hausdao/daohaus-v3-optimism"
PUBLIC_HAUS_DAO_ID = "0xf5d6b637a9185707f52d40d452956ca49018247a" # Public Haus DAO ID on Optimism
class PublicHausImporter:
"""Importer for Public Haus members from DAOhaus v3 subgraph"""
def __init__(self):
"""Initialize the importer"""
# Initialize database
self.db = DatabaseConnector()
# Register data source
self.data_source_id = self.register_data_source()
def register_data_source(self) -> str:
"""Register the Public Haus data source in the database"""
return self.db.upsert_data_source(
name="Public Haus DAO Subgraph",
source_type="subgraph",
description="Public Haus DAO members from DAOhaus v3 subgraph on Optimism mainnet"
)
def fetch_members_from_subgraph(self) -> List[Dict[str, Any]]:
"""
Fetch Public Haus members from the DAOhaus v3 subgraph
Returns:
List of member data from the subgraph
"""
# GraphQL query to fetch members
query = """
query {
dao(id: "%s") {
id
name
members {
id
memberAddress
shares
loot
createdAt
delegatingTo
delegateOfCount
delegateOf {
memberAddress
}
}
}
}
""" % PUBLIC_HAUS_DAO_ID.lower()
# Make request to subgraph
response = requests.post(
SUBGRAPH_URL,
json={"query": query}
)
# Check for errors
if response.status_code != 200:
logger.error(f"Error fetching members: {response.text}")
raise Exception(f"Error fetching members: {response.status_code}")
data = response.json()
# Check if DAO exists
if not data.get("data") or not data["data"].get("dao"):
logger.error(f"DAO not found: {PUBLIC_HAUS_DAO_ID}")
raise Exception(f"DAO not found: {PUBLIC_HAUS_DAO_ID}")
# Get members
members = data["data"]["dao"]["members"]
logger.info(f"Fetched {len(members)} members from subgraph")
return members
def process_member(self, member: Dict[str, Any]) -> Optional[str]:
"""
Process a single member and import into database
Args:
member: Member data from the subgraph
Returns:
Contact ID if successful, None otherwise
"""
# Extract member data
address = member["memberAddress"]
shares = int(member["shares"])
loot = int(member["loot"])
created_at = member["createdAt"]
delegating_to = member.get("delegatingTo")
# Skip if no address
if not address:
logger.warning(f"Member has no address: {member}")
return None
# Check if contact already exists
query = 'SELECT id, name, "ensName" FROM "Contact" WHERE "ethereumAddress" ILIKE %(address)s'
existing_contacts = self.db.execute_query(query, {"address": address})
contact_id = None
if existing_contacts:
# Use existing contact
contact_id = existing_contacts[0]["id"]
logger.info(f"Found existing contact {contact_id} for address {address}")
else:
# Create new contact
contact_data = {
"ethereumAddress": address,
"name": f"Public Haus Member {address[:8]}", # Default name
}
contact_id = self.db.upsert_contact(contact_data)
logger.info(f"Created new contact {contact_id} for address {address}")
# Add DAO membership
self.db.execute_update(
"""
INSERT INTO "DaoMembership" ("contactId", "daoName", "shares", "loot", "delegatingTo")
VALUES (%(contact_id)s, %(dao_name)s, %(shares)s, %(loot)s, %(delegating_to)s)
ON CONFLICT ("contactId", "daoName")
DO UPDATE SET
"shares" = %(shares)s,
"loot" = %(loot)s,
"delegatingTo" = %(delegating_to)s,
"updatedAt" = NOW()
""",
{
"contact_id": contact_id,
"dao_name": "Public Haus",
"shares": shares,
"loot": loot,
"delegating_to": delegating_to
}
)
# Add note about membership
note_content = f"Public Haus DAO Member\nShares: {shares}\nLoot: {loot}\nJoined: {created_at}"
if delegating_to:
note_content += f"\nDelegating to: {delegating_to}"
self.db.add_note_to_contact(
contact_id=contact_id,
content=note_content,
source="Public Haus DAO Subgraph"
)
# Link to data source
self.db.link_contact_to_data_source(contact_id, self.data_source_id)
return contact_id
def run(self) -> int:
"""
Run the importer
Returns:
Number of members imported
"""
# Create a scraping job
job_id = self.db.create_scraping_job("Public Haus DAO Importer", "running")
logger.info(f"Created scraping job with ID: {job_id}")
try:
# Fetch members
members = self.fetch_members_from_subgraph()
if not members:
logger.info("No members found")
self.db.update_scraping_job(job_id, "completed")
return 0
# Process members
imported_count = 0
existing_count = 0
for member in members:
try:
contact_id = self.process_member(member)
if contact_id:
imported_count += 1
except Exception as e:
logger.exception(f"Error processing member {member.get('memberAddress')}: {e}")
# Complete the scraping job
self.db.update_scraping_job(
job_id,
"completed",
records_processed=len(members),
records_added=imported_count,
records_updated=existing_count
)
logger.info(f"Imported {imported_count} members out of {len(members)} processed")
return imported_count
except Exception as e:
# Update the scraping job with error
self.db.update_scraping_job(job_id, "failed", error_message=str(e))
logger.exception(f"Error importing members: {e}")
raise
def main():
"""Main function"""
try:
importer = PublicHausImporter()
imported_count = importer.run()
logger.info(f"Import completed successfully. Imported {imported_count} members.")
return 0
except Exception as e:
logger.exception(f"Error importing members: {e}")
return 1
if __name__ == "__main__":
sys.exit(main())
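The `DaoMembership` write above leans on PostgreSQL's `INSERT ... ON CONFLICT ... DO UPDATE`, which needs a unique constraint on `("contactId", "daoName")` to target. A minimal self-contained sketch of the same upsert pattern using stdlib `sqlite3` (a toy table, not the real Prisma schema) shows the shape:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dao_membership (
        contact_id TEXT NOT NULL,
        dao_name   TEXT NOT NULL,
        shares     INTEGER NOT NULL,
        loot       INTEGER NOT NULL,
        UNIQUE (contact_id, dao_name)  -- the conflict target
    )
""")

def upsert_membership(contact_id, dao_name, shares, loot):
    # Same shape as the PostgreSQL statement in the importer above.
    conn.execute(
        """
        INSERT INTO dao_membership (contact_id, dao_name, shares, loot)
        VALUES (?, ?, ?, ?)
        ON CONFLICT (contact_id, dao_name)
        DO UPDATE SET shares = excluded.shares, loot = excluded.loot
        """,
        (contact_id, dao_name, shares, loot),
    )

upsert_membership("c1", "Public Haus", 10, 0)
upsert_membership("c1", "Public Haus", 25, 5)  # updates in place, no duplicate row
```

The second call overwrites the first row rather than inserting a new one, which is exactly the idempotence the importer relies on when re-run.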

View File

@ -0,0 +1,254 @@
#!/usr/bin/env python3
"""
Import Public Haus Members from DAOhaus API
This script fetches members of Public Haus DAO from the DAOhaus API on Optimism mainnet,
imports them into the database, and links them to the Public Haus DAO.
Usage:
python import_public_haus_members_api.py
"""
import os
import sys
import logging
import requests
import json
import time
from typing import Dict, Any, List, Optional
from dotenv import load_dotenv
# Add parent directory to path to import utils
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from utils.db_connector import DatabaseConnector
from utils.logger import setup_logger
# Load environment variables
load_dotenv()
# Setup logging
logger = setup_logger("public_haus_importer")
# Constants
DAOHAUS_API_URL = "https://admin.daohaus.club/api"
PUBLIC_HAUS_DAO_ID = "0xf5d6b637a9185707f52d40d452956ca49018247a" # Public Haus DAO ID on Optimism
CHAIN_ID = "10" # Optimism chain ID
class PublicHausImporter:
"""Importer for Public Haus members from DAOhaus API"""
def __init__(self):
"""Initialize the importer"""
# Initialize database
self.db = DatabaseConnector()
# Register data source
self.data_source_id = self.register_data_source()
def register_data_source(self) -> str:
"""Register the Public Haus data source in the database"""
return self.db.upsert_data_source(
name="Public Haus DAO API",
source_type="api",
description="Public Haus DAO members from DAOhaus API on Optimism mainnet"
)
def fetch_dao_info(self) -> Dict[str, Any]:
"""
Fetch Public Haus DAO information from the DAOhaus API
Returns:
DAO information
"""
# Make request to DAOhaus API
url = f"{DAOHAUS_API_URL}/dao/{CHAIN_ID}/{PUBLIC_HAUS_DAO_ID}"
response = requests.get(url)
# Check for errors
if response.status_code != 200:
logger.error(f"Error fetching DAO info: {response.text}")
raise Exception(f"Error fetching DAO info: {response.status_code}")
data = response.json()
logger.info(f"Fetched DAO info: {data.get('name')}")
return data
def fetch_members(self) -> List[Dict[str, Any]]:
"""
Fetch Public Haus members from the DAOhaus API
Returns:
List of member data from the API
"""
# Make request to DAOhaus API
url = f"{DAOHAUS_API_URL}/dao/{CHAIN_ID}/{PUBLIC_HAUS_DAO_ID}/members"
response = requests.get(url)
# Check for errors
if response.status_code != 200:
logger.error(f"Error fetching members: {response.text}")
raise Exception(f"Error fetching members: {response.status_code}")
data = response.json()
# Check if members exist
if not data:
logger.error(f"No members found for DAO: {PUBLIC_HAUS_DAO_ID}")
return []
logger.info(f"Fetched {len(data)} members from API")
return data
def process_member(self, member: Dict[str, Any]) -> Optional[str]:
"""
Process a single member and import into database
Args:
member: Member data from the API
Returns:
Contact ID if successful, None otherwise
"""
# Extract member data
address = member.get("memberAddress")
shares = int(member.get("shares", 0))
loot = int(member.get("loot", 0))
joined_at = member.get("createdAt")
delegating_to = member.get("delegatingTo")
# Skip if no address
if not address:
logger.warning(f"Member has no address: {member}")
return None
# Check if contact already exists
query = 'SELECT id, name, "ensName" FROM "Contact" WHERE "ethereumAddress" ILIKE %(address)s'
existing_contacts = self.db.execute_query(query, {"address": address})
contact_id = None
if existing_contacts:
# Use existing contact
contact_id = existing_contacts[0]["id"]
logger.info(f"Found existing contact {contact_id} for address {address}")
else:
# Create new contact
contact_data = {
"ethereumAddress": address,
"name": f"Public Haus Member {address[:8]}", # Default name
}
contact_id = self.db.upsert_contact(contact_data)
logger.info(f"Created new contact {contact_id} for address {address}")
# Add DAO membership
self.db.execute_update(
"""
INSERT INTO "DaoMembership" ("contactId", "daoName", "shares", "loot", "delegatingTo")
VALUES (%(contact_id)s, %(dao_name)s, %(shares)s, %(loot)s, %(delegating_to)s)
ON CONFLICT ("contactId", "daoName")
DO UPDATE SET
"shares" = %(shares)s,
"loot" = %(loot)s,
"delegatingTo" = %(delegating_to)s,
"updatedAt" = NOW()
""",
{
"contact_id": contact_id,
"dao_name": "Public Haus",
"shares": shares,
"loot": loot,
"delegating_to": delegating_to
}
)
# Add note about membership
note_content = f"Public Haus DAO Member\nShares: {shares}\nLoot: {loot}"
if joined_at:
note_content += f"\nJoined: {joined_at}"
if delegating_to:
note_content += f"\nDelegating to: {delegating_to}"
self.db.add_note_to_contact(
contact_id=contact_id,
content=note_content,
source="Public Haus DAO API"
)
# Link to data source
self.db.link_contact_to_data_source(contact_id, self.data_source_id)
return contact_id
def run(self) -> int:
"""
Run the importer
Returns:
Number of members imported
"""
# Create a scraping job
job_id = self.db.create_scraping_job("Public Haus DAO Importer", "running")
logger.info(f"Created scraping job with ID: {job_id}")
try:
# Fetch DAO info
dao_info = self.fetch_dao_info()
logger.info(f"DAO Name: {dao_info.get('name')}")
# Fetch members
members = self.fetch_members()
if not members:
logger.info("No members found")
self.db.update_scraping_job(job_id, "completed")
return 0
# Process members
imported_count = 0
existing_count = 0
for member in members:
try:
contact_id = self.process_member(member)
if contact_id:
imported_count += 1
except Exception as e:
logger.exception(f"Error processing member {member.get('memberAddress')}: {e}")
# Add a small delay to avoid overwhelming the database
time.sleep(0.1)
# Complete the scraping job
self.db.update_scraping_job(
job_id,
"completed",
records_processed=len(members),
records_added=imported_count,
records_updated=existing_count
)
logger.info(f"Imported {imported_count} members out of {len(members)} processed")
return imported_count
except Exception as e:
# Update the scraping job with error
self.db.update_scraping_job(job_id, "failed", error_message=str(e))
logger.exception(f"Error importing members: {e}")
raise
def main():
"""Main function"""
try:
importer = PublicHausImporter()
imported_count = importer.run()
logger.info(f"Import completed successfully. Imported {imported_count} members.")
return 0
except Exception as e:
logger.exception(f"Error importing members: {e}")
return 1
if __name__ == "__main__":
sys.exit(main())
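The `requests.get` calls above run without retry logic, so a single transient API hiccup aborts the whole import. A minimal retry wrapper could harden them — a sketch only; `get_with_retries` and its parameters are assumptions, not part of the importer:

```python
import time
from typing import Any, Callable, Optional

def get_with_retries(fetch: Callable[[], Any], retries: int = 3, backoff: float = 1.0) -> Any:
    """Call `fetch`, retrying with exponential backoff between attempts.

    `fetch` is any zero-argument callable (e.g. a lambda wrapping requests.get);
    the last exception is re-raised once all attempts are exhausted.
    """
    last_error: Optional[Exception] = None
    for attempt in range(retries):
        try:
            return fetch()
        except Exception as e:  # in production, narrow to requests.RequestException
            last_error = e
            if attempt < retries - 1:
                time.sleep(backoff * (2 ** attempt))
    raise last_error
```

Each `requests.get(url, timeout=30)` call site would then become `get_with_retries(lambda: requests.get(url, timeout=30))`, keeping the surrounding status-code checks unchanged.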


@ -0,0 +1,304 @@
#!/usr/bin/env python3
"""
Import Public Haus Members from Optimism Blockchain using Events
This script fetches members of Public Haus DAO by querying events from the Optimism blockchain,
imports them into the database, and links them to the Public Haus DAO.
Usage:
python import_public_haus_members_events.py
"""
import os
import sys
import logging
import json
import time
from typing import Dict, Any, List, Optional
from web3 import Web3
from dotenv import load_dotenv
# Add parent directory to path to import utils
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from utils.db_connector import DatabaseConnector
from utils.logger import setup_logger
# Load environment variables
load_dotenv()
# Setup logging
logger = setup_logger("public_haus_importer")
# Constants
PUBLIC_HAUS_DAO_ID = "0xf5d6b637a9185707f52d40d452956ca49018247a" # Public Haus DAO ID on Optimism
# Moloch DAO V3 ABI (partial, only what we need for events)
MOLOCH_V3_ABI = [
{
"anonymous": False,
"inputs": [
{"indexed": True, "internalType": "address", "name": "member", "type": "address"},
{"indexed": False, "internalType": "uint256", "name": "shares", "type": "uint256"}
],
"name": "SharingEvent",
"type": "event"
},
{
"anonymous": False,
"inputs": [
{"indexed": True, "internalType": "address", "name": "applicant", "type": "address"}
],
"name": "MembershipProposalSubmitted",
"type": "event"
},
{
"anonymous": False,
"inputs": [
{"indexed": True, "internalType": "address", "name": "member", "type": "address"}
],
"name": "MemberAdded",
"type": "event"
},
{
"anonymous": False,
"inputs": [
{"indexed": True, "internalType": "address", "name": "memberAddress", "type": "address"},
{"indexed": False, "internalType": "uint256", "name": "shares", "type": "uint256"},
{"indexed": False, "internalType": "uint256", "name": "loot", "type": "uint256"}
],
"name": "ProcessProposal",
"type": "event"
}
]
class PublicHausImporter:
"""Importer for Public Haus members from Optimism blockchain using events"""
def __init__(self):
"""Initialize the importer"""
# Initialize database
self.db = DatabaseConnector()
# Initialize Web3
optimism_rpc_url = os.getenv("OPTIMISM_RPC_URL")
if not optimism_rpc_url:
raise ValueError("OPTIMISM_RPC_URL environment variable not set")
self.web3 = Web3(Web3.HTTPProvider(optimism_rpc_url))
if not self.web3.is_connected():
raise ValueError("Failed to connect to Optimism RPC")
logger.info(f"Connected to Optimism: {self.web3.is_connected()}")
# Initialize contract
self.contract = self.web3.eth.contract(
address=self.web3.to_checksum_address(PUBLIC_HAUS_DAO_ID),
abi=MOLOCH_V3_ABI
)
# Register data source
self.data_source_id = self.register_data_source()
# Initialize scraping job
self.job_id = self.db.create_scraping_job(
source_name="Public Haus DAO Blockchain Events",
status="running"
)
logger.info(f"Created scraping job with ID: {self.job_id}")
def register_data_source(self) -> str:
"""Register the Public Haus data source in the database"""
return self.db.upsert_data_source(
name="Public Haus DAO Blockchain",
source_type="blockchain",
description="Public Haus DAO members from Optimism blockchain"
)
def fetch_members_from_events(self) -> List[Dict[str, Any]]:
"""
Fetch Public Haus members by querying events
Returns:
List of member information
"""
try:
# Get the latest block number
latest_block = self.web3.eth.block_number
# Calculate the starting block (approximately 6 months ago)
# Optimism has ~1 block every 2 seconds
blocks_per_day = 43200 # 86400 seconds / 2 seconds per block
start_block = max(0, latest_block - (blocks_per_day * 180)) # 180 days
logger.info(f"Fetching events from block {start_block} to {latest_block}")
# Get all member-related events
member_addresses = set()
# Try different event types that might indicate membership
for event_name in ["MemberAdded", "ProcessProposal", "SharingEvent", "MembershipProposalSubmitted"]:
try:
event_filter = self.contract.events[event_name].create_filter(
from_block=start_block,  # web3.py v6 uses snake_case kwargs (fromBlock/toBlock in v5)
to_block=latest_block
)
events = event_filter.get_all_entries()
logger.info(f"Found {len(events)} {event_name} events")
for event in events:
if hasattr(event.args, 'member'):
member_addresses.add(event.args.member)
elif hasattr(event.args, 'memberAddress'):
member_addresses.add(event.args.memberAddress)
elif hasattr(event.args, 'applicant'):
member_addresses.add(event.args.applicant)
except Exception as e:
logger.warning(f"Error fetching {event_name} events: {e}")
continue
# If we didn't find any members through events, try a different approach
if not member_addresses:
logger.warning("No members found through events, trying alternative approach")
# Try to get members by checking recent transactions to the DAO
transactions = []
# NOTE: fetching full transactions block-by-block is slow; acceptable only for a small window
for block_num in range(latest_block - 1000, latest_block):
block = self.web3.eth.get_block(block_num, full_transactions=True)
for tx in block.transactions:
if tx.to and tx.to.lower() == PUBLIC_HAUS_DAO_ID.lower():
transactions.append(tx)
member_addresses.add(tx['from'])
# Convert addresses to member objects
members = []
for address in member_addresses:
members.append({
"address": address,
"dao": "Public Haus",
"shares": 0, # We don't have share information from events
"loot": 0 # We don't have loot information from events
})
logger.info(f"Found {len(members)} unique members")
return members
except Exception as e:
logger.error(f"Error fetching members from events: {e}")
raise
def process_member(self, member: Dict[str, Any]) -> Optional[str]:
"""
Process a member and import into the database
Args:
member: Member information
Returns:
Contact ID if successful, None otherwise
"""
try:
# Extract member information
address = member["address"]
dao_name = member["dao"]
# Check if contact exists
contact_id = self.db.get_contact_by_ethereum_address(address)
if contact_id:
logger.info(f"Contact already exists for address {address}")
else:
# Create new contact
contact_id = self.db.create_contact(
name=f"Public Haus Member {address[:8]}",
ethereum_address=address,
email=None,
twitter=None,
github=None,
telegram=None,
discord=None
)
logger.info(f"Created new contact with ID {contact_id} for address {address}")
# Link contact to data source
self.db.link_contact_to_data_source(
contact_id=contact_id,
data_source_id=self.data_source_id,
external_id=address
)
# Add tag for the DAO
self.db.add_tag_to_contact(
contact_id=contact_id,
tag_name=dao_name
)
# Add note about membership
self.db.add_note_to_contact(
contact_id=contact_id,
content=f"Member of {dao_name} DAO on Optimism"
)
return contact_id
except Exception as e:
logger.error(f"Error processing member {member['address']}: {e}")
return None
def run(self) -> int:
"""
Run the importer
Returns:
Number of imported members
"""
try:
# Fetch members
members = self.fetch_members_from_events()
# Process members
imported_count = 0
for member in members:
contact_id = self.process_member(member)
if contact_id:
imported_count += 1
# Sleep to avoid rate limiting
time.sleep(0.1)
# Update scraping job
self.db.update_scraping_job(
job_id=self.job_id,
status="completed",
records_processed=len(members),
records_added=imported_count,
records_updated=0
)
logger.info(f"Imported {imported_count} members out of {len(members)}")
return imported_count
except Exception as e:
logger.error(f"Error importing members: {e}")
# Update scraping job with error
self.db.update_scraping_job(
job_id=self.job_id,
status="failed",
error_message=str(e)
)
raise
def main():
"""Main entry point"""
try:
importer = PublicHausImporter()
imported_count = importer.run()
logger.info(f"Successfully imported {imported_count} Public Haus members")
except Exception as e:
logger.error(f"Error importing members: {e}")
sys.exit(1)
if __name__ == "__main__":
main()
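The event importer above requests roughly 180 days of logs in a single filter; many RPC providers reject or truncate `eth_getLogs` ranges that wide. One way to stay under such limits is to walk the range in fixed-size windows — a sketch, where `iter_block_ranges` and the 2,000-block default are assumptions rather than provider-documented values:

```python
from typing import Iterator, Tuple

def iter_block_ranges(start: int, end: int, chunk: int = 2000) -> Iterator[Tuple[int, int]]:
    """Yield inclusive (from_block, to_block) windows covering start..end.

    Windows are contiguous and non-overlapping; the last one is clipped to `end`.
    """
    lo = start
    while lo <= end:
        hi = min(lo + chunk - 1, end)
        yield (lo, hi)
        lo = hi + 1
```

Each `(from_block, to_block)` window would then get its own `create_filter` call, with member addresses accumulated across windows into the same set.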


@ -0,0 +1,298 @@
#!/usr/bin/env python3
"""
Import Public Haus Members from DAOhaus v3 Subgraph
This script fetches members of Public Haus DAO from the DAOhaus v3 subgraph,
imports them into the database, and links them to the Public Haus DAO.
Usage:
python import_public_haus_members_graph.py
"""
import os
import sys
import logging
import requests
import json
import time
from typing import Dict, Any, List, Optional
from dotenv import load_dotenv
# Add parent directory to path to import utils
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from utils.db_connector import DatabaseConnector
from utils.logger import setup_logger
# Load environment variables
load_dotenv()
# Setup logging
logger = setup_logger("public_haus_importer")
# Constants
SUBGRAPH_URL = "https://api.thegraph.com/subgraphs/id/HouDe2pTdyKM9CTG1aodnPPPhm7U148BCH7eJ4HHwpdQ"  # The Graph hosted service; may require migration to the decentralized gateway
PUBLIC_HAUS_DAO_ID = "0xf5d6b637a9185707f52d40d452956ca49018247a" # Public Haus DAO ID
class PublicHausImporter:
"""Importer for Public Haus members from DAOhaus v3 subgraph"""
def __init__(self):
"""Initialize the importer"""
# Initialize database
self.db = DatabaseConnector()
# Register data source
self.data_source_id = self.register_data_source()
def register_data_source(self) -> str:
"""Register the Public Haus data source in the database"""
return self.db.upsert_data_source(
name="Public Haus DAO Subgraph",
source_type="subgraph",
description="Public Haus DAO members from DAOhaus v3 subgraph"
)
def fetch_dao_info(self) -> Dict[str, Any]:
"""
Fetch Public Haus DAO information from the subgraph
Returns:
DAO information
"""
# GraphQL query to fetch DAO info
query = """
query {
dao(id: "%s") {
id
name
createdAt
totalShares
totalLoot
activeMemberCount
}
}
""" % PUBLIC_HAUS_DAO_ID.lower()
# Make request to subgraph
response = requests.post(
SUBGRAPH_URL,
json={"query": query},
timeout=30
)
# Check for errors
if response.status_code != 200:
logger.error(f"Error fetching DAO info: {response.text}")
raise Exception(f"Error fetching DAO info: {response.status_code}")
data = response.json()
# Check if DAO exists
if not data.get("data") or not data["data"].get("dao"):
logger.error(f"DAO not found: {PUBLIC_HAUS_DAO_ID}")
raise Exception(f"DAO not found: {PUBLIC_HAUS_DAO_ID}")
dao = data["data"]["dao"]
logger.info(f"Fetched DAO info: {dao.get('name')}")
return dao
def fetch_members(self) -> List[Dict[str, Any]]:
"""
Fetch Public Haus members from the subgraph
Returns:
List of member data from the subgraph
"""
# GraphQL query to fetch members
query = """
query {
dao(id: "%s") {
members {
id
memberAddress
shares
loot
createdAt
delegatingTo
delegateOfCount
delegateOf {
memberAddress
}
}
}
}
""" % PUBLIC_HAUS_DAO_ID.lower()
# Make request to subgraph
response = requests.post(
SUBGRAPH_URL,
json={"query": query},
timeout=30
)
# Check for errors
if response.status_code != 200:
logger.error(f"Error fetching members: {response.text}")
raise Exception(f"Error fetching members: {response.status_code}")
data = response.json()
# Check if DAO exists
if not data.get("data") or not data["data"].get("dao"):
logger.error(f"DAO not found: {PUBLIC_HAUS_DAO_ID}")
raise Exception(f"DAO not found: {PUBLIC_HAUS_DAO_ID}")
# Get members
members = data["data"]["dao"]["members"]
logger.info(f"Fetched {len(members)} members from subgraph")
return members
def process_member(self, member: Dict[str, Any]) -> Optional[str]:
"""
Process a single member and import into database
Args:
member: Member data from the subgraph
Returns:
Contact ID if successful, None otherwise
"""
# Extract member data
address = member["memberAddress"]
shares = int(member["shares"])
loot = int(member["loot"])
created_at = member["createdAt"]
delegating_to = member.get("delegatingTo")
# Skip if no address
if not address:
logger.warning(f"Member has no address: {member}")
return None
# Check if contact already exists
query = 'SELECT id, name, "ensName" FROM "Contact" WHERE "ethereumAddress" ILIKE %(address)s'
existing_contacts = self.db.execute_query(query, {"address": address})
contact_id = None
if existing_contacts:
# Use existing contact
contact_id = existing_contacts[0]["id"]
logger.info(f"Found existing contact {contact_id} for address {address}")
else:
# Create new contact
contact_data = {
"ethereumAddress": address,
"name": f"Public Haus Member {address[:8]}", # Default name
}
contact_id = self.db.upsert_contact(contact_data)
logger.info(f"Created new contact {contact_id} for address {address}")
# Add DAO membership
self.db.execute_update(
"""
INSERT INTO "DaoMembership" ("contactId", "daoName", "shares", "loot", "delegatingTo")
VALUES (%(contact_id)s, %(dao_name)s, %(shares)s, %(loot)s, %(delegating_to)s)
ON CONFLICT ("contactId", "daoName")
DO UPDATE SET
"shares" = %(shares)s,
"loot" = %(loot)s,
"delegatingTo" = %(delegating_to)s,
"updatedAt" = NOW()
""",
{
"contact_id": contact_id,
"dao_name": "Public Haus",
"shares": shares,
"loot": loot,
"delegating_to": delegating_to
}
)
# Add note about membership
note_content = f"Public Haus DAO Member\nShares: {shares}\nLoot: {loot}\nJoined: {created_at}"
if delegating_to:
note_content += f"\nDelegating to: {delegating_to}"
self.db.add_note_to_contact(
contact_id=contact_id,
content=note_content,
source="Public Haus DAO Subgraph"
)
# Link to data source
self.db.link_contact_to_data_source(contact_id, self.data_source_id)
return contact_id
def run(self) -> int:
"""
Run the importer
Returns:
Number of members imported
"""
# Create a scraping job
job_id = self.db.create_scraping_job("Public Haus DAO Importer", "running")
logger.info(f"Created scraping job with ID: {job_id}")
try:
# Fetch DAO info
dao_info = self.fetch_dao_info()
logger.info(f"DAO Name: {dao_info.get('name')}")
# Fetch members
members = self.fetch_members()
if not members:
logger.info("No members found")
self.db.update_scraping_job(job_id, "completed")
return 0
# Process members
imported_count = 0
existing_count = 0
for member in members:
try:
contact_id = self.process_member(member)
if contact_id:
imported_count += 1
except Exception as e:
logger.exception(f"Error processing member {member.get('memberAddress')}: {e}")
# Add a small delay to avoid overwhelming the database
time.sleep(0.1)
# Complete the scraping job
self.db.update_scraping_job(
job_id,
"completed",
records_processed=len(members),
records_added=imported_count,
records_updated=existing_count
)
logger.info(f"Imported {imported_count} members out of {len(members)} processed")
return imported_count
except Exception as e:
# Update the scraping job with error
self.db.update_scraping_job(job_id, "failed", error_message=str(e))
logger.exception(f"Error importing members: {e}")
raise
def main():
"""Main function"""
try:
importer = PublicHausImporter()
imported_count = importer.run()
logger.info(f"Import completed successfully. Imported {imported_count} members.")
return 0
except Exception as e:
logger.exception(f"Error importing members: {e}")
return 1
if __name__ == "__main__":
sys.exit(main())
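The subgraph query above fetches `members` with no pagination arguments; The Graph returns a limited page of entities per query (100 by default, at most 1,000 with `first`), so a large DAO would be silently truncated. A paginated query builder might look like this — a sketch; the field list mirrors the script's own query, and `orderBy: createdAt` is an assumption about the schema:

```python
def build_members_query(dao_id: str, first: int = 1000, skip: int = 0) -> str:
    """Build a paginated members query for the DAOhaus v3 subgraph."""
    return """
    query {
      dao(id: "%s") {
        members(first: %d, skip: %d, orderBy: createdAt) {
          memberAddress
          shares
          loot
          createdAt
          delegatingTo
        }
      }
    }
    """ % (dao_id.lower(), first, skip)
```

The caller would POST this repeatedly, incrementing `skip` by `first` until a page comes back empty, and concatenate the pages before processing.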


@ -0,0 +1,333 @@
#!/usr/bin/env python3
"""
Import Public Haus Members from Optimism Blockchain
This script fetches members of Public Haus DAO by directly querying the Optimism blockchain,
imports them into the database, and links them to the Public Haus DAO.
Usage:
python import_public_haus_members_web3.py
"""
import os
import sys
import logging
import json
import time
from typing import Dict, Any, List, Optional
from web3 import Web3
from dotenv import load_dotenv
# Add parent directory to path to import utils
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from utils.db_connector import DatabaseConnector
from utils.logger import setup_logger
# Load environment variables
load_dotenv()
# Setup logging
logger = setup_logger("public_haus_importer")
# Constants
PUBLIC_HAUS_DAO_ID = "0xf5d6b637a9185707f52d40d452956ca49018247a" # Public Haus DAO ID on Optimism
# Moloch DAO V3 ABI (partial, only what we need)
MOLOCH_V3_ABI = [
{
"inputs": [{"internalType": "address", "name": "memberAddress", "type": "address"}],
"name": "members",
"outputs": [
{"internalType": "address", "name": "delegateKey", "type": "address"},
{"internalType": "uint256", "name": "shares", "type": "uint256"},
{"internalType": "uint256", "name": "loot", "type": "uint256"},
{"internalType": "bool", "name": "exists", "type": "bool"},
{"internalType": "uint256", "name": "highestIndexYesVote", "type": "uint256"},
{"internalType": "uint256", "name": "jailed", "type": "uint256"}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [],
"name": "getMemberAddresses",
"outputs": [{"internalType": "address[]", "name": "", "type": "address[]"}],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [],
"name": "totalShares",
"outputs": [{"internalType": "uint256", "name": "", "type": "uint256"}],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [],
"name": "totalLoot",
"outputs": [{"internalType": "uint256", "name": "", "type": "uint256"}],
"stateMutability": "view",
"type": "function"
}
]
class PublicHausImporter:
"""Importer for Public Haus members from Optimism blockchain"""
def __init__(self):
"""Initialize the importer"""
# Initialize database
self.db = DatabaseConnector()
# Initialize Web3
optimism_rpc_url = os.getenv("OPTIMISM_RPC_URL")
if not optimism_rpc_url:
raise ValueError("OPTIMISM_RPC_URL environment variable not set")
self.web3 = Web3(Web3.HTTPProvider(optimism_rpc_url))
if not self.web3.is_connected():
raise ValueError("Failed to connect to Optimism RPC")
logger.info(f"Connected to Optimism: {self.web3.is_connected()}")
# Initialize contract
self.contract = self.web3.eth.contract(
address=self.web3.to_checksum_address(PUBLIC_HAUS_DAO_ID),
abi=MOLOCH_V3_ABI
)
# Register data source
self.data_source_id = self.register_data_source()
def register_data_source(self) -> str:
"""Register the Public Haus data source in the database"""
return self.db.upsert_data_source(
name="Public Haus DAO Blockchain",
source_type="blockchain",
description="Public Haus DAO members from Optimism blockchain"
)
def fetch_dao_info(self) -> Dict[str, Any]:
"""
Fetch Public Haus DAO information from the blockchain
Returns:
DAO information
"""
try:
# Get total shares and loot
total_shares = self.contract.functions.totalShares().call()
total_loot = self.contract.functions.totalLoot().call()
dao_info = {
"id": PUBLIC_HAUS_DAO_ID,
"name": "Public Haus",
"totalShares": total_shares,
"totalLoot": total_loot
}
logger.info("Fetched DAO info: Public Haus")
logger.info(f"Total Shares: {total_shares}")
logger.info(f"Total Loot: {total_loot}")
return dao_info
except Exception as e:
logger.error(f"Error fetching DAO info: {e}")
raise
def fetch_members(self) -> List[Dict[str, Any]]:
"""
Fetch Public Haus members from the blockchain
Returns:
List of member data
"""
try:
# Get member addresses
member_addresses = self.contract.functions.getMemberAddresses().call()
logger.info(f"Fetched {len(member_addresses)} member addresses")
members = []
# Get member details
for address in member_addresses:
try:
member = self.contract.functions.members(address).call()
# Extract member data
delegate_key, shares, loot, exists, highest_index_yes_vote, jailed = member
# Skip if not a member
if not exists:
continue
# Create member object
member_data = {
"memberAddress": address,
"delegateKey": delegate_key,
"shares": shares,
"loot": loot,
"jailed": jailed > 0
}
members.append(member_data)
except Exception as e:
logger.error(f"Error fetching member {address}: {e}")
logger.info(f"Fetched {len(members)} members with details")
return members
except Exception as e:
logger.error(f"Error fetching members: {e}")
raise
def process_member(self, member: Dict[str, Any]) -> Optional[str]:
"""
Process a single member and import into database
Args:
member: Member data from the blockchain
Returns:
Contact ID if successful, None otherwise
"""
# Extract member data
address = member["memberAddress"]
shares = int(member["shares"])
loot = int(member["loot"])
delegate_key = member["delegateKey"]
jailed = member["jailed"]
# Skip if no address
if not address:
logger.warning(f"Member has no address: {member}")
return None
# Check if contact already exists
query = 'SELECT id, name, "ensName" FROM "Contact" WHERE "ethereumAddress" ILIKE %(address)s'
existing_contacts = self.db.execute_query(query, {"address": address})
contact_id = None
if existing_contacts:
# Use existing contact
contact_id = existing_contacts[0]["id"]
logger.info(f"Found existing contact {contact_id} for address {address}")
else:
# Create new contact
contact_data = {
"ethereumAddress": address,
"name": f"Public Haus Member {address[:8]}", # Default name
}
contact_id = self.db.upsert_contact(contact_data)
logger.info(f"Created new contact {contact_id} for address {address}")
# Add DAO membership
self.db.execute_update(
"""
INSERT INTO "DaoMembership" ("contactId", "daoName", "shares", "loot", "delegatingTo")
VALUES (%(contact_id)s, %(dao_name)s, %(shares)s, %(loot)s, %(delegating_to)s)
ON CONFLICT ("contactId", "daoName")
DO UPDATE SET
"shares" = %(shares)s,
"loot" = %(loot)s,
"delegatingTo" = %(delegating_to)s,
"updatedAt" = NOW()
""",
{
"contact_id": contact_id,
"dao_name": "Public Haus",
"shares": shares,
"loot": loot,
"delegating_to": delegate_key if delegate_key != address else None
}
)
# Add note about membership
note_content = f"Public Haus DAO Member\nShares: {shares}\nLoot: {loot}"
if delegate_key != address:
note_content += f"\nDelegating to: {delegate_key}"
if jailed:
note_content += "\nJailed: Yes"
self.db.add_note_to_contact(
contact_id=contact_id,
content=note_content,
source="Public Haus DAO Blockchain"
)
# Link to data source
self.db.link_contact_to_data_source(contact_id, self.data_source_id)
return contact_id
def run(self) -> int:
"""
Run the importer
Returns:
Number of members imported
"""
# Create a scraping job
job_id = self.db.create_scraping_job("Public Haus DAO Importer", "running")
logger.info(f"Created scraping job with ID: {job_id}")
try:
# Fetch DAO info
dao_info = self.fetch_dao_info()
# Fetch members
members = self.fetch_members()
if not members:
logger.info("No members found")
self.db.update_scraping_job(job_id, "completed")
return 0
# Process members
imported_count = 0
existing_count = 0
for member in members:
try:
contact_id = self.process_member(member)
if contact_id:
imported_count += 1
except Exception as e:
logger.exception(f"Error processing member {member.get('memberAddress')}: {e}")
# Add a small delay to avoid overwhelming the database
time.sleep(0.1)
# Complete the scraping job
self.db.update_scraping_job(
job_id,
"completed",
records_processed=len(members),
records_added=imported_count,
records_updated=existing_count
)
logger.info(f"Imported {imported_count} members out of {len(members)} processed")
return imported_count
except Exception as e:
# Update the scraping job with error
self.db.update_scraping_job(job_id, "failed", error_message=str(e))
logger.exception(f"Error importing members: {e}")
raise
def main():
"""Main function"""
try:
importer = PublicHausImporter()
imported_count = importer.run()
logger.info(f"Import completed successfully. Imported {imported_count} members.")
return 0
except Exception as e:
logger.exception(f"Error importing members: {e}")
return 1
if __name__ == "__main__":
sys.exit(main())
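All of these importers match contacts with a case-insensitive `ILIKE` on `ethereumAddress`, because the same account can surface checksummed from one source and lowercased from another. When merging address lists gathered by several importers, the same case-insensitive rule avoids double-creating contacts — a sketch; `dedupe_addresses` is not part of the original code:

```python
from typing import Iterable, List

def dedupe_addresses(addresses: Iterable[str]) -> List[str]:
    """Deduplicate Ethereum addresses case-insensitively, keeping the first-seen form."""
    seen = set()
    unique = []
    for addr in addresses:
        key = addr.lower()
        if key not in seen:
            seen.add(key)
            unique.append(addr)
    return unique
```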


@ -0,0 +1,392 @@
#!/usr/bin/env python3
"""
Import Public Haus Members by Querying Token Holders
This script fetches members of Public Haus DAO by querying holders of the shares token,
imports them into the database, and links them to the Public Haus DAO.
Usage:
python import_public_haus_token_holders.py
"""
import os
import sys
import logging
import json
import time
from typing import Dict, Any, List, Optional
from web3 import Web3
from dotenv import load_dotenv
# Add parent directory to path to import utils
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from utils.db_connector import DatabaseConnector
from utils.logger import setup_logger
# Load environment variables
load_dotenv()
# Setup logging
logger = setup_logger("public_haus_token_importer")
# Constants
PUBLIC_HAUS_DAO_ID = "0xf5d6b637a9185707f52d40d452956ca49018247a" # Public Haus DAO ID on Optimism
SHARES_TOKEN_ADDRESS = "0x4950c436F69c8b4F68ed814A70a5E1D94495c4a7"  # Public Haus sharesToken address
# ERC20 ABI (minimal for balance checking)
ERC20_ABI = [
{
"constant": True,
"inputs": [{"name": "_owner", "type": "address"}],
"name": "balanceOf",
"outputs": [{"name": "balance", "type": "uint256"}],
"type": "function"
},
{
"constant": True,
"inputs": [],
"name": "totalSupply",
"outputs": [{"name": "", "type": "uint256"}],
"type": "function"
},
{
"constant": True,
"inputs": [],
"name": "name",
"outputs": [{"name": "", "type": "string"}],
"type": "function"
},
{
"constant": True,
"inputs": [],
"name": "symbol",
"outputs": [{"name": "", "type": "string"}],
"type": "function"
},
{
"constant": True,
"inputs": [],
"name": "decimals",
"outputs": [{"name": "", "type": "uint8"}],
"type": "function"
},
{
"constant": True,
"inputs": [{"name": "_owner", "type": "address"}, {"name": "_spender", "type": "address"}],
"name": "allowance",
"outputs": [{"name": "", "type": "uint256"}],
"type": "function"
},
{
"constant": False,
"inputs": [{"name": "_to", "type": "address"}, {"name": "_value", "type": "uint256"}],
"name": "transfer",
"outputs": [{"name": "", "type": "bool"}],
"type": "function"
}
]
# Transfer event ABI for querying token transfers
TRANSFER_EVENT_ABI = [
{
"anonymous": False,
"inputs": [
{"indexed": True, "name": "from", "type": "address"},
{"indexed": True, "name": "to", "type": "address"},
{"indexed": False, "name": "value", "type": "uint256"}
],
"name": "Transfer",
"type": "event"
}
]
class PublicHausTokenImporter:
"""Importer for Public Haus members by querying token holders"""
def __init__(self):
"""Initialize the importer"""
# Initialize database
self.db = DatabaseConnector()
# Initialize Web3
optimism_rpc_url = os.getenv("OPTIMISM_RPC_URL")
if not optimism_rpc_url:
raise ValueError("OPTIMISM_RPC_URL environment variable not set")
self.web3 = Web3(Web3.HTTPProvider(optimism_rpc_url))
if not self.web3.is_connected():
raise ValueError("Failed to connect to Optimism RPC")
logger.info(f"Connected to Optimism: {self.web3.is_connected()}")
# Initialize token contract
self.shares_token = self.web3.eth.contract(
address=self.web3.to_checksum_address(SHARES_TOKEN_ADDRESS),
abi=ERC20_ABI
)
# Register data source
self.data_source_id = self.register_data_source()
# Initialize scraping job
self.job_id = self.db.create_scraping_job(
source_name="Public Haus Token Holders",
status="running"
)
logger.info(f"Created scraping job with ID: {self.job_id}")
def register_data_source(self) -> str:
"""Register the Public Haus data source in the database"""
return self.db.upsert_data_source(
name="Public Haus DAO Token Holders",
source_type="blockchain",
description="Public Haus DAO members identified by token holdings"
)
def get_token_info(self) -> Dict[str, Any]:
"""
Get information about the shares token
Returns:
Token information
"""
try:
name = self.shares_token.functions.name().call()
symbol = self.shares_token.functions.symbol().call()
decimals = self.shares_token.functions.decimals().call()
total_supply = self.shares_token.functions.totalSupply().call()
token_info = {
"address": SHARES_TOKEN_ADDRESS,
"name": name,
"symbol": symbol,
"decimals": decimals,
"totalSupply": total_supply
}
logger.info(f"Token info: {name} ({symbol})")
logger.info(f"Total supply: {total_supply / (10 ** decimals):.2f} {symbol}")
return token_info
except Exception as e:
logger.error(f"Error getting token info: {e}")
raise
def fetch_token_holders(self) -> List[Dict[str, Any]]:
"""
Fetch holders of the shares token by analyzing transfer events
Returns:
List of token holders with their balances
"""
try:
# Get token info
token_info = self.get_token_info()
decimals = token_info["decimals"]
# Get the latest block number
latest_block = self.web3.eth.block_number
# Calculate the starting block (approximately 6 months ago)
# Optimism has ~1 block every 2 seconds
blocks_per_day = 43200 # 86400 seconds / 2 seconds per block
start_block = max(0, latest_block - (blocks_per_day * 180)) # 180 days
logger.info(f"Fetching Transfer events from block {start_block} to {latest_block}")
# Create a contract instance with the Transfer event ABI
token_events = self.web3.eth.contract(
address=self.web3.to_checksum_address(SHARES_TOKEN_ADDRESS),
abi=TRANSFER_EVENT_ABI
)
# Get Transfer events
            transfer_filter = token_events.events.Transfer.create_filter(
                from_block=start_block,
                to_block=latest_block
            )
transfers = transfer_filter.get_all_entries()
logger.info(f"Found {len(transfers)} Transfer events")
# Track addresses that have received tokens
holder_addresses = set()
            for transfer in transfers:
                to_address = transfer.args.get('to')
                # Skip the zero address (mint/burn events)
                if to_address != '0x0000000000000000000000000000000000000000':
                    holder_addresses.add(to_address)
# Check current balances for all potential holders
holders = []
for address in holder_addresses:
try:
balance = self.shares_token.functions.balanceOf(address).call()
# Only include addresses with non-zero balance
if balance > 0:
holders.append({
"address": address,
"balance": balance,
"balanceFormatted": balance / (10 ** decimals),
"dao": "Public Haus"
})
except Exception as e:
logger.error(f"Error checking balance for {address}: {e}")
# Sort holders by balance (descending)
holders.sort(key=lambda x: x["balance"], reverse=True)
logger.info(f"Found {len(holders)} token holders with non-zero balance")
return holders
except Exception as e:
logger.error(f"Error fetching token holders: {e}")
raise
def process_holder(self, holder: Dict[str, Any]) -> Optional[str]:
"""
Process a token holder and import into the database
Args:
holder: Token holder information
Returns:
Contact ID if successful, None otherwise
"""
try:
# Extract holder information
address = holder["address"]
balance = holder["balance"]
balance_formatted = holder["balanceFormatted"]
dao_name = holder["dao"]
# Check if contact exists
query = 'SELECT id, name, "ensName" FROM "Contact" WHERE "ethereumAddress" ILIKE %(address)s'
existing_contacts = self.db.execute_query(query, {"address": address})
contact_id = None
if existing_contacts:
# Use existing contact
contact_id = existing_contacts[0]["id"]
logger.info(f"Found existing contact {contact_id} for address {address}")
else:
# Create new contact
contact_data = {
"ethereumAddress": address,
"name": f"Public Haus Member {address[:8]}", # Default name
}
contact_id = self.db.upsert_contact(contact_data)
logger.info(f"Created new contact {contact_id} for address {address}")
# Add DAO membership
self.db.execute_update(
"""
INSERT INTO "DaoMembership" ("contactId", "daoName", "shares", "loot", "delegatingTo")
VALUES (%(contact_id)s, %(dao_name)s, %(shares)s, %(loot)s, %(delegating_to)s)
ON CONFLICT ("contactId", "daoName")
DO UPDATE SET
"shares" = %(shares)s,
"loot" = %(loot)s,
"updatedAt" = NOW()
""",
{
"contact_id": contact_id,
"dao_name": dao_name,
"shares": balance, # Use token balance as shares
"loot": 0, # We don't have loot information
"delegating_to": None
}
)
# Add note about membership
note_content = f"Public Haus DAO Member\nShares Token Balance: {balance_formatted}"
self.db.add_note_to_contact(
contact_id=contact_id,
content=note_content,
source="Public Haus DAO Token Holders"
)
# Add tag for the DAO
self.db.add_tag_to_contact(
contact_id=contact_id,
tag_name=dao_name
)
# Link to data source
self.db.link_contact_to_data_source(contact_id, self.data_source_id)
return contact_id
except Exception as e:
logger.error(f"Error processing holder {holder.get('address')}: {e}")
return None
def run(self) -> int:
"""
Run the importer
Returns:
Number of holders imported
"""
try:
# Fetch token holders
holders = self.fetch_token_holders()
if not holders:
logger.info("No token holders found")
self.db.update_scraping_job(self.job_id, "completed")
return 0
# Process holders
imported_count = 0
existing_count = 0
for holder in holders:
try:
contact_id = self.process_holder(holder)
if contact_id:
imported_count += 1
except Exception as e:
logger.exception(f"Error processing holder {holder.get('address')}: {e}")
# Add a small delay to avoid overwhelming the database
time.sleep(0.1)
# Complete the scraping job
self.db.update_scraping_job(
self.job_id,
"completed",
records_processed=len(holders),
records_added=imported_count,
records_updated=existing_count
)
logger.info(f"Imported {imported_count} holders out of {len(holders)} processed")
return imported_count
except Exception as e:
# Update the scraping job with error
self.db.update_scraping_job(self.job_id, "failed", error_message=str(e))
logger.exception(f"Error importing holders: {e}")
raise
def main():
"""Main function"""
try:
importer = PublicHausTokenImporter()
imported_count = importer.run()
logger.info(f"Import completed successfully. Imported {imported_count} token holders.")
return 0
except Exception as e:
logger.exception(f"Error importing token holders: {e}")
return 1
if __name__ == "__main__":
sys.exit(main())


@@ -0,0 +1,756 @@
#!/usr/bin/env python3
"""
Import Public Haus Members by Querying Token Holders
This script fetches members of Public Haus DAO by querying holders of both the voting (shares)
and non-voting (loot) tokens, imports them into the database, and links them to the Public Haus DAO.
Usage:
python import_public_haus_tokens.py
"""
import os
import sys
import logging
import json
import time
import requests
from typing import Dict, Any, List, Optional, Set
from web3 import Web3
from dotenv import load_dotenv
# Add parent directory to path to import utils
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from utils.db_connector import DatabaseConnector
from utils.logger import setup_logger
from utils.ens_resolver import ENSResolver
# Load environment variables
load_dotenv()
# Setup logging
logger = setup_logger("public_haus_tokens_importer")
# Constants
PUBLIC_HAUS_DAO_ID = "0xf5d6b637a9185707f52d40d452956ca49018247a" # Public Haus DAO ID on Optimism
SHARES_TOKEN_ADDRESS = "0x4950c436F69c8b4F80f688edc814C5bA84Aa70f5" # Voting token (shares)
LOOT_TOKEN_ADDRESS = "0xab6033E3EC2144FB279fe68dA92B7aC6a42Da6d8" # Non-voting token (loot)
# Optimism Etherscan API
OPTIMISM_ETHERSCAN_API_URL = "https://api-optimistic.etherscan.io/api"
# ERC20 ABI (minimal for balance checking)
ERC20_ABI = [
{
"constant": True,
"inputs": [{"name": "_owner", "type": "address"}],
"name": "balanceOf",
"outputs": [{"name": "balance", "type": "uint256"}],
"type": "function"
},
{
"constant": True,
"inputs": [],
"name": "totalSupply",
"outputs": [{"name": "", "type": "uint256"}],
"type": "function"
},
{
"constant": True,
"inputs": [],
"name": "name",
"outputs": [{"name": "", "type": "string"}],
"type": "function"
},
{
"constant": True,
"inputs": [],
"name": "symbol",
"outputs": [{"name": "", "type": "string"}],
"type": "function"
},
{
"constant": True,
"inputs": [],
"name": "decimals",
"outputs": [{"name": "", "type": "uint8"}],
"type": "function"
}
]
class PublicHausTokensImporter:
"""Importer for Public Haus DAO members based on token holdings"""
def __init__(self):
"""Initialize the importer"""
# Initialize database
self.db = DatabaseConnector()
# Initialize Web3
optimism_rpc_url = os.getenv("OPTIMISM_RPC_URL")
if not optimism_rpc_url:
raise ValueError("OPTIMISM_RPC_URL environment variable not set")
self.web3 = Web3(Web3.HTTPProvider(optimism_rpc_url))
if not self.web3.is_connected():
raise ValueError("Failed to connect to Optimism RPC")
logger.info(f"Connected to Optimism: {self.web3.is_connected()}")
# Initialize token contracts
self.shares_token = self.web3.eth.contract(
address=self.web3.to_checksum_address(SHARES_TOKEN_ADDRESS),
abi=ERC20_ABI
)
self.loot_token = self.web3.eth.contract(
address=self.web3.to_checksum_address(LOOT_TOKEN_ADDRESS),
abi=ERC20_ABI
)
# Get Etherscan API key
self.etherscan_api_key = os.getenv("OPTIMISM_ETHERSCAN_API_KEY")
if not self.etherscan_api_key:
logger.warning("OPTIMISM_ETHERSCAN_API_KEY not set, using API without key (rate limited)")
self.etherscan_api_key = ""
else:
logger.info("Using Optimism Etherscan API key")
# Initialize ENS resolver for Ethereum mainnet
ethereum_rpc_url = os.getenv("ETHEREUM_RPC_URL", "https://eth-mainnet.g.alchemy.com/v2/1fkIfqUX_MoHhd4Qqnu9UItM8Fc4Ls2q")
self.eth_web3 = Web3(Web3.HTTPProvider(ethereum_rpc_url))
if not self.eth_web3.is_connected():
logger.warning("Failed to connect to Ethereum RPC for ENS resolution")
self.ens_resolver = None
else:
logger.info(f"Connected to Ethereum for ENS resolution: {self.eth_web3.is_connected()}")
self.ens_resolver = ENSResolver(self.eth_web3)
# Register data source
self.data_source_id = self.register_data_source()
# Initialize scraping job
self.job_id = self.db.create_scraping_job(
source_name="Public Haus DAO Tokens",
status="running"
)
logger.info(f"Created scraping job with ID: {self.job_id}")
def register_data_source(self) -> str:
"""Register the Public Haus data source in the database"""
return self.db.upsert_data_source(
name="Public Haus DAO Tokens",
source_type="blockchain",
description="Public Haus DAO members identified by token holdings"
)
def get_token_info(self, token_contract, token_name) -> Dict[str, Any]:
"""
Get information about a token
Args:
token_contract: Web3 contract instance
token_name: Name of the token for logging
Returns:
Token information
"""
try:
name = token_contract.functions.name().call()
symbol = token_contract.functions.symbol().call()
decimals = token_contract.functions.decimals().call()
total_supply = token_contract.functions.totalSupply().call()
token_info = {
"name": name,
"symbol": symbol,
"decimals": decimals,
"totalSupply": total_supply
}
logger.info(f"{token_name} info: {name} ({symbol})")
logger.info(f"{token_name} total supply: {total_supply / (10 ** decimals):.2f} {symbol}")
return token_info
except Exception as e:
logger.error(f"Error getting {token_name} info via Web3: {e}")
# Try Etherscan API as fallback
try:
token_address = token_contract.address
params = {
"module": "token",
"action": "tokeninfo",
"contractaddress": token_address,
"apikey": self.etherscan_api_key
}
                response = requests.get(OPTIMISM_ETHERSCAN_API_URL, params=params, timeout=30)
data = response.json()
if data["status"] == "1":
token_info = data["result"][0]
logger.info(f"{token_name} info from Etherscan: {token_info.get('name')} ({token_info.get('symbol')})")
return {
"name": token_info.get("name", f"Public Haus {token_name}"),
"symbol": token_info.get("symbol", token_name.upper()),
"decimals": int(token_info.get("decimals", 18)),
"totalSupply": int(token_info.get("totalSupply", "0"))
}
except Exception as etherscan_error:
logger.error(f"Error getting {token_name} info via Etherscan: {etherscan_error}")
# Return default values if both methods fail
return {
"name": f"Public Haus {token_name}",
"symbol": token_name.upper(),
"decimals": 18,
"totalSupply": 0
}
def fetch_token_holders_via_etherscan(self, token_address, token_name, decimals) -> List[Dict[str, Any]]:
"""
Fetch holders of a token using Etherscan API
Args:
token_address: Address of the token
token_name: Name of the token for logging
decimals: Token decimals
Returns:
List of token holders with their balances
"""
try:
# Get token holders from Etherscan
params = {
"module": "token",
"action": "tokenholderlist",
"contractaddress": token_address,
"page": 1,
"offset": 100, # Get up to 100 holders
"apikey": self.etherscan_api_key
}
            response = requests.get(OPTIMISM_ETHERSCAN_API_URL, params=params, timeout=30)
data = response.json()
holders = []
if data["status"] == "1":
for holder in data["result"]:
address = holder["address"]
balance = int(holder["TokenHolderQuantity"])
# Skip zero balances
if balance > 0:
holders.append({
"address": address,
"balance": balance,
"balanceFormatted": balance / (10 ** decimals),
"tokenType": token_name
})
logger.info(f"Found {len(holders)} {token_name} holders with non-zero balance via Etherscan")
return holders
else:
logger.warning(f"Error getting {token_name} holders from Etherscan: {data.get('message')}")
return []
except Exception as e:
logger.error(f"Error fetching {token_name} holders via Etherscan: {e}")
return []
def fetch_token_transfers_via_etherscan(self, token_address, token_name, decimals) -> List[Dict[str, Any]]:
"""
Fetch token transfers using Etherscan API and extract unique addresses
Args:
token_address: Address of the token
token_name: Name of the token for logging
decimals: Token decimals
Returns:
List of token holders with their balances
"""
try:
# Get token transfers from Etherscan
params = {
"module": "account",
"action": "tokentx",
"contractaddress": token_address,
"page": 1,
"offset": 1000, # Get up to 1000 transfers
"sort": "desc",
"apikey": self.etherscan_api_key
}
            response = requests.get(OPTIMISM_ETHERSCAN_API_URL, params=params, timeout=30)
data = response.json()
addresses = set()
if data["status"] == "1":
for tx in data["result"]:
addresses.add(tx["to"])
addresses.add(tx["from"])
# Remove zero address
if "0x0000000000000000000000000000000000000000" in addresses:
addresses.remove("0x0000000000000000000000000000000000000000")
# Create holder objects
holders = []
for address in addresses:
holders.append({
"address": address,
"balance": 1, # We don't know the actual balance
"balanceFormatted": 1,
"tokenType": token_name
})
logger.info(f"Found {len(holders)} unique addresses from {token_name} transfers via Etherscan")
return holders
else:
logger.warning(f"Error getting {token_name} transfers from Etherscan: {data.get('message')}")
return []
except Exception as e:
logger.error(f"Error fetching {token_name} transfers via Etherscan: {e}")
return []
def fetch_token_holders_via_web3(self, token_contract, token_name, decimals) -> List[Dict[str, Any]]:
"""
Fetch holders of a token by checking balances of known addresses
Args:
token_contract: Web3 contract instance
token_name: Name of the token for logging
decimals: Token decimals
Returns:
List of token holders with their balances
"""
try:
# Get the latest block number
latest_block = self.web3.eth.block_number
# Try to get some known addresses from recent transactions to the token contract
known_addresses = set()
# Check the last 100 blocks for transactions to the token contract
for block_num in range(max(0, latest_block - 100), latest_block + 1):
try:
block = self.web3.eth.get_block(block_num, full_transactions=True)
for tx in block.transactions:
if tx.to and tx.to.lower() == token_contract.address.lower():
known_addresses.add(tx['from'])
except Exception as e:
logger.warning(f"Error getting block {block_num}: {e}")
continue
# Add the DAO address as a known address
known_addresses.add(self.web3.to_checksum_address(PUBLIC_HAUS_DAO_ID))
# Check balances for known addresses
holders = []
for address in known_addresses:
try:
balance = token_contract.functions.balanceOf(address).call()
# Only include addresses with non-zero balance
if balance > 0:
holders.append({
"address": address,
"balance": balance,
"balanceFormatted": balance / (10 ** decimals),
"tokenType": token_name
})
except Exception as e:
logger.error(f"Error checking {token_name} balance for {address}: {e}")
logger.info(f"Found {len(holders)} {token_name} holders with non-zero balance via Web3")
return holders
except Exception as e:
logger.error(f"Error fetching {token_name} holders via Web3: {e}")
return []
def fetch_all_token_holders(self) -> List[Dict[str, Any]]:
"""
Fetch holders of both shares and loot tokens
Returns:
List of token holders with their balances
"""
all_holders = []
# Get token info
shares_info = self.get_token_info(self.shares_token, "Shares")
loot_info = self.get_token_info(self.loot_token, "Loot")
shares_decimals = shares_info["decimals"]
loot_decimals = loot_info["decimals"]
# Try different methods to get token holders
# 1. Try Etherscan tokenholderlist endpoint
shares_holders = self.fetch_token_holders_via_etherscan(
SHARES_TOKEN_ADDRESS, "Shares", shares_decimals
)
loot_holders = self.fetch_token_holders_via_etherscan(
LOOT_TOKEN_ADDRESS, "Loot", loot_decimals
)
# 2. If that fails, try getting transfers
if not shares_holders:
shares_holders = self.fetch_token_transfers_via_etherscan(
SHARES_TOKEN_ADDRESS, "Shares", shares_decimals
)
if not loot_holders:
loot_holders = self.fetch_token_transfers_via_etherscan(
LOOT_TOKEN_ADDRESS, "Loot", loot_decimals
)
# 3. If that fails, try Web3
if not shares_holders:
shares_holders = self.fetch_token_holders_via_web3(
self.shares_token, "Shares", shares_decimals
)
if not loot_holders:
loot_holders = self.fetch_token_holders_via_web3(
self.loot_token, "Loot", loot_decimals
)
# Combine holders
all_holders.extend(shares_holders)
all_holders.extend(loot_holders)
# If we still don't have any holders, use the DAO address itself
if not all_holders:
logger.warning("No token holders found, using DAO address as fallback")
all_holders.append({
"address": PUBLIC_HAUS_DAO_ID,
"balance": 1,
"balanceFormatted": 1,
"tokenType": "Fallback"
})
return all_holders
def merge_holders(self, holders: List[Dict[str, Any]]) -> List[Dict[str, Any]]:
"""
Merge holders by address, combining shares and loot
Args:
holders: List of token holders
Returns:
List of merged holders
"""
merged = {}
for holder in holders:
address = holder["address"]
balance = holder["balance"]
token_type = holder["tokenType"]
if address not in merged:
merged[address] = {
"address": address,
"shares": 0,
"sharesFormatted": 0,
"loot": 0,
"lootFormatted": 0,
"dao": "Public Haus"
}
if token_type == "Shares":
merged[address]["shares"] = balance
merged[address]["sharesFormatted"] = holder["balanceFormatted"]
elif token_type == "Loot":
merged[address]["loot"] = balance
merged[address]["lootFormatted"] = holder["balanceFormatted"]
else:
# Fallback
merged[address]["shares"] = balance
merged[address]["sharesFormatted"] = holder["balanceFormatted"]
return list(merged.values())
def resolve_ens_name(self, address: str) -> Optional[str]:
"""
Resolve ENS name for an Ethereum address
Args:
address: Ethereum address to resolve
Returns:
ENS name if found, None otherwise
"""
if not self.ens_resolver:
return None
try:
ens_name = self.ens_resolver.get_ens_name(address)
if ens_name:
logger.info(f"Resolved ENS name for {address}: {ens_name}")
return ens_name
except Exception as e:
logger.error(f"Error resolving ENS name for {address}: {e}")
return None
def process_holder(self, holder: Dict[str, Any]) -> Optional[str]:
"""
Process a token holder and import into the database
Args:
holder: Token holder information
Returns:
Contact ID if successful, None otherwise
"""
try:
# Extract holder information
address = holder["address"]
shares = holder["shares"]
shares_formatted = holder["sharesFormatted"]
loot = holder["loot"]
loot_formatted = holder["lootFormatted"]
dao_name = holder["dao"]
# Check if contact exists
query = 'SELECT id, name, "ensName" FROM "Contact" WHERE "ethereumAddress" ILIKE %(address)s'
existing_contacts = self.db.execute_query(query, {"address": address})
contact_id = None
if existing_contacts:
# Use existing contact
contact_id = existing_contacts[0]["id"]
logger.info(f"Found existing contact {contact_id} for address {address}")
else:
# Create new contact
contact_id = self.db.upsert_contact(
ethereum_address=address,
ens_name=None
)
logger.info(f"Created new contact {contact_id} for address {address}")
# Add DAO membership
self.db.execute_update(
"""
INSERT INTO "DaoMembership" (id, "contactId", "daoName", "daoType", "createdAt", "updatedAt")
VALUES (gen_random_uuid(), %(contact_id)s, %(dao_name)s, %(dao_type)s, NOW(), NOW())
ON CONFLICT ("contactId", "daoName")
DO UPDATE SET
"updatedAt" = NOW()
""",
{
"contact_id": contact_id,
"dao_name": dao_name,
"dao_type": "Moloch V3"
}
)
# Add note about membership with token holdings
note_content = f"Public Haus DAO Member\nShares: {shares_formatted}\nLoot: {loot_formatted}"
self.db.add_note_to_contact(
contact_id=contact_id,
content=note_content
)
# Add tags for the DAO and token holdings
self.db.add_tag_to_contact(
contact_id=contact_id,
tag_name=dao_name
)
if shares > 0:
self.db.add_tag_to_contact(
contact_id=contact_id,
tag_name=f"{dao_name} Voting Member"
)
if loot > 0:
self.db.add_tag_to_contact(
contact_id=contact_id,
tag_name=f"{dao_name} Non-Voting Member"
)
# Link to data source
self.db.link_contact_to_data_source(contact_id, self.data_source_id)
return contact_id
except Exception as e:
logger.error(f"Error processing holder {holder.get('address')}: {e}")
return None
def process_address(self, address: str, shares_balance: float = 0, loot_balance: float = 0) -> Optional[str]:
"""
Process a single address, creating or updating a contact and linking to the DAO
Args:
address: Ethereum address
shares_balance: Balance of shares token
loot_balance: Balance of loot token
Returns:
Contact ID if successful, None otherwise
"""
try:
# Check if contact already exists
query = 'SELECT id, name, "ensName" FROM "Contact" WHERE "ethereumAddress" ILIKE %(address)s'
result = self.db.execute_query(query, {"address": address})
contact_id = None
# Resolve ENS name if needed
ens_name = None
if not result or not result[0].get("ensName"):
ens_name = self.resolve_ens_name(address)
if result:
# Contact exists, get ID
contact_id = result[0]["id"]
logger.info(f"Found existing contact for {address}: {contact_id}")
# Update ENS name if we found one and it's not already set
if ens_name and not result[0].get("ensName"):
self.db.update_contact(contact_id, {"ensName": ens_name})
logger.info(f"Updated ENS name for contact {contact_id}: {ens_name}")
else:
# Create new contact
contact_id = self.db.upsert_contact(
ethereum_address=address,
ens_name=ens_name
)
logger.info(f"Created new contact for {address}: {contact_id}")
# Add DAO membership
self.db.add_dao_membership(
contact_id=contact_id,
dao_name="Public Haus",
dao_type="Moloch V3"
)
# Add note about membership with token holdings
note_content = f"Public Haus DAO Member\nShares: {shares_balance}\nLoot: {loot_balance}"
self.db.add_note_to_contact(
contact_id=contact_id,
content=note_content
)
# Add tags for the DAO and token holdings
self.db.add_tag_to_contact(
contact_id=contact_id,
tag_name="Public Haus"
)
if shares_balance > 0:
self.db.add_tag_to_contact(
contact_id=contact_id,
tag_name="Public Haus Voting Member"
)
if loot_balance > 0:
self.db.add_tag_to_contact(
contact_id=contact_id,
tag_name="Public Haus Non-Voting Member"
)
# Link to data source
self.db.link_contact_to_data_source(contact_id, self.data_source_id)
return contact_id
except Exception as e:
logger.error(f"Error processing address {address}: {e}")
return None
def run(self) -> int:
"""
Run the importer
Returns:
Number of holders imported
"""
try:
# Fetch token holders
all_holders = self.fetch_all_token_holders()
# Merge holders by address
merged_holders = self.merge_holders(all_holders)
if not merged_holders:
logger.info("No token holders found")
self.db.update_scraping_job(self.job_id, "completed")
return 0
# Process holders
imported_count = 0
existing_count = 0
for holder in merged_holders:
try:
contact_id = self.process_address(
address=holder["address"],
shares_balance=holder["shares"],
loot_balance=holder["loot"]
)
if contact_id:
imported_count += 1
except Exception as e:
logger.exception(f"Error processing holder {holder.get('address')}: {e}")
# Add a small delay to avoid overwhelming the database
time.sleep(0.1)
# Complete the scraping job
self.db.update_scraping_job(
self.job_id,
"completed",
records_processed=len(merged_holders),
records_added=imported_count,
records_updated=existing_count
)
logger.info(f"Imported {imported_count} holders out of {len(merged_holders)} processed")
# Run ENS resolution for any contacts that don't have ENS names
if self.ens_resolver:
logger.info("Running ENS resolution for contacts without ENS names...")
from utils.resolve_ens_names import ENSResolver as BatchENSResolver
batch_resolver = BatchENSResolver()
batch_resolver.run(batch_size=50, delay_seconds=0.5)
return imported_count
except Exception as e:
# Update the scraping job with error
self.db.update_scraping_job(self.job_id, "failed", error_message=str(e))
logger.exception(f"Error importing holders: {e}")
raise
def main():
"""Main function"""
try:
importer = PublicHausTokensImporter()
imported_count = importer.run()
logger.info(f"Import completed successfully. Imported {imported_count} token holders.")
return 0
except Exception as e:
logger.exception(f"Error importing token holders: {e}")
return 1
if __name__ == "__main__":
sys.exit(main())


@@ -0,0 +1,147 @@
#!/usr/bin/env python3
"""
Resolve ENS Names for Meta Cartel Members
This script resolves ENS names for Meta Cartel members imported from the CSV file.
It updates the contacts with ENS names and profile information, and links them to the data source.
"""
import os
import sys
import logging
from typing import Dict, Any, List, Optional
from web3 import Web3
from dotenv import load_dotenv
# Add parent directory to path to import utils
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from utils.db_connector import DatabaseConnector
from utils.ens_resolver import ENSResolver
from utils.logger import setup_logger
# Load environment variables
load_dotenv()
# Setup logging
logger = setup_logger("metacartel_ens_resolver")
class MetaCartelENSResolver:
"""Resolver for ENS names of Meta Cartel members"""
def __init__(self):
"""Initialize the resolver"""
# Initialize database
self.db = DatabaseConnector()
# Initialize Web3 and ENS resolver
alchemy_api_key = os.getenv("ALCHEMY_API_KEY")
if not alchemy_api_key:
raise ValueError("ALCHEMY_API_KEY not found in environment variables")
self.web3 = Web3(Web3.HTTPProvider(f"https://eth-mainnet.g.alchemy.com/v2/{alchemy_api_key}"))
self.ens_resolver = ENSResolver(self.web3)
# Get data source ID
self.data_source_id = self.get_data_source_id()
def get_data_source_id(self) -> str:
"""Get the ID of the Meta Cartel DAO CSV data source"""
query = 'SELECT id FROM "DataSource" WHERE name = %(name)s'
result = self.db.execute_query(query, {"name": "Meta Cartel DAO CSV"})
if not result:
raise ValueError("Meta Cartel DAO CSV data source not found")
return result[0]["id"]
def get_metacartel_members(self) -> List[Dict[str, Any]]:
"""Get all Meta Cartel members from the database"""
query = """
SELECT c.id, c."ethereumAddress", c."ensName"
FROM "Contact" c
JOIN "DaoMembership" dm ON c.id = dm."contactId"
WHERE dm."daoName" = 'Meta Cartel'
"""
return self.db.execute_query(query)
def resolve_ens_for_member(self, contact_id: str, ethereum_address: str, current_ens: Optional[str] = None) -> bool:
"""
Resolve ENS name for a member and update their profile.
Args:
contact_id: ID of the contact
ethereum_address: Ethereum address of the member
current_ens: Current ENS name of the member, if any
Returns:
True if ENS was resolved or already exists, False otherwise
"""
# Skip if already has ENS
if current_ens:
logger.info(f"Contact {contact_id} already has ENS: {current_ens}")
# Still update profile from ENS if needed
self.ens_resolver.update_contact_from_ens(contact_id, current_ens)
# Link to data source
self.db.link_contact_to_data_source(contact_id, self.data_source_id)
return True
# Resolve ENS name
ens_name = self.ens_resolver.get_ens_name(ethereum_address)
if not ens_name:
logger.info(f"No ENS name found for {ethereum_address}")
# Still link to data source
self.db.link_contact_to_data_source(contact_id, self.data_source_id)
return False
# Update contact with ENS name
self.db.update_contact(contact_id, {"ensName": ens_name})
logger.info(f"Updated contact {contact_id} with ENS name: {ens_name}")
# Update profile from ENS
self.ens_resolver.update_contact_from_ens(contact_id, ens_name)
# Link to data source
self.db.link_contact_to_data_source(contact_id, self.data_source_id)
return True
def run(self):
"""Run the resolver"""
logger.info("Starting ENS resolution for Meta Cartel members")
# Get all Meta Cartel members
members = self.get_metacartel_members()
logger.info(f"Found {len(members)} Meta Cartel members")
# Resolve ENS for each member
resolved_count = 0
for member in members:
if self.resolve_ens_for_member(
member["id"],
member["ethereumAddress"],
member.get("ensName")
):
resolved_count += 1
logger.info(f"Resolved ENS for {resolved_count} out of {len(members)} members")
return resolved_count
def main():
"""Main function"""
try:
resolver = MetaCartelENSResolver()
resolved_count = resolver.run()
logger.info(f"ENS resolution completed successfully. Resolved {resolved_count} members.")
return 0
except Exception as e:
logger.exception(f"Error resolving ENS names: {e}")
return 1
if __name__ == "__main__":
sys.exit(main())


@@ -0,0 +1,147 @@
#!/usr/bin/env python3
"""
Resolve ENS Names for Public Haus Members
This script resolves ENS names for Public Haus members imported by the token-holder importer on Optimism mainnet.
It updates the contacts with ENS names and profile information, and links them to the data source.
"""
import os
import sys
import logging
from typing import Dict, Any, List, Optional
from web3 import Web3
from dotenv import load_dotenv
# Add parent directory to path to import utils
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from utils.db_connector import DatabaseConnector
from utils.ens_resolver import ENSResolver
from utils.logger import setup_logger
# Load environment variables
load_dotenv()
# Setup logging
logger = setup_logger("public_haus_ens_resolver")
class PublicHausENSResolver:
"""Resolver for ENS names of Public Haus members"""
def __init__(self):
"""Initialize the resolver"""
# Initialize database
self.db = DatabaseConnector()
# Initialize Web3 and ENS resolver
alchemy_api_key = os.getenv("ALCHEMY_API_KEY")
if not alchemy_api_key:
raise ValueError("ALCHEMY_API_KEY not found in environment variables")
self.web3 = Web3(Web3.HTTPProvider(f"https://eth-mainnet.g.alchemy.com/v2/{alchemy_api_key}"))
self.ens_resolver = ENSResolver(self.web3)
# Get data source ID
self.data_source_id = self.get_data_source_id()
def get_data_source_id(self) -> str:
"""Get the ID of the Public Haus DAO API data source"""
query = 'SELECT id FROM "DataSource" WHERE name = %(name)s'
result = self.db.execute_query(query, {"name": "Public Haus DAO Tokens"})
if not result:
raise ValueError("Public Haus DAO Tokens data source not found")
return result[0]["id"]
def get_public_haus_members(self) -> List[Dict[str, Any]]:
"""Get all Public Haus members from the database"""
query = """
SELECT c.id, c."ethereumAddress", c."ensName"
FROM "Contact" c
JOIN "DaoMembership" dm ON c.id = dm."contactId"
WHERE dm."daoName" = 'Public Haus'
"""
return self.db.execute_query(query)
def resolve_ens_for_member(self, contact_id: str, ethereum_address: str, current_ens: Optional[str] = None) -> bool:
"""
Resolve ENS name for a member and update their profile.
Args:
contact_id: ID of the contact
ethereum_address: Ethereum address of the member
current_ens: Current ENS name of the member, if any
Returns:
True if ENS was resolved or already exists, False otherwise
"""
# Skip if already has ENS
if current_ens:
logger.info(f"Contact {contact_id} already has ENS: {current_ens}")
# Still update profile from ENS if needed
self.ens_resolver.update_contact_from_ens(contact_id, current_ens)
# Link to data source
self.db.link_contact_to_data_source(contact_id, self.data_source_id)
return True
# Resolve ENS name
ens_name = self.ens_resolver.get_ens_name(ethereum_address)
if not ens_name:
logger.info(f"No ENS name found for {ethereum_address}")
# Still link to data source
self.db.link_contact_to_data_source(contact_id, self.data_source_id)
return False
# Update contact with ENS name
self.db.update_contact(contact_id, {"ensName": ens_name})
logger.info(f"Updated contact {contact_id} with ENS name: {ens_name}")
# Update profile from ENS
self.ens_resolver.update_contact_from_ens(contact_id, ens_name)
# Link to data source
self.db.link_contact_to_data_source(contact_id, self.data_source_id)
return True
def run(self):
"""Run the resolver"""
logger.info("Starting ENS resolution for Public Haus members")
# Get all Public Haus members
members = self.get_public_haus_members()
logger.info(f"Found {len(members)} Public Haus members")
# Resolve ENS for each member
resolved_count = 0
for member in members:
if self.resolve_ens_for_member(
member["id"],
member["ethereumAddress"],
member.get("ensName")
):
resolved_count += 1
logger.info(f"Resolved ENS for {resolved_count} out of {len(members)} members")
return resolved_count
def main():
"""Main function"""
try:
resolver = PublicHausENSResolver()
resolved_count = resolver.run()
logger.info(f"ENS resolution completed successfully. Resolved {resolved_count} members.")
return 0
except Exception as e:
logger.exception(f"Error resolving ENS names: {e}")
return 1
if __name__ == "__main__":
sys.exit(main())
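The skip-or-resolve decision in `resolve_ens_for_member` can be factored out as a pure function, which makes its three outcomes easy to unit-test without a database or Web3 connection. This is a sketch, not part of the script; `lookup` stands in for `ens_resolver.get_ens_name`:

```python
from typing import Callable, Optional, Tuple

def resolve_member_ens(current_ens: Optional[str],
                       lookup: Callable[[], Optional[str]]) -> Tuple[Optional[str], bool]:
    """Return (ens_name, resolved) following the script's skip-or-resolve rule."""
    if current_ens:
        # Member already has an ENS name: keep it, just refresh the profile.
        return current_ens, True
    ens_name = lookup()  # reverse lookup, e.g. ens_resolver.get_ens_name(address)
    if ens_name is None:
        # No reverse record; the script still links the contact to the data source.
        return None, False
    return ens_name, True
```

In every branch the contact is linked to the data source; only the boolean outcome differs.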


@ -46,7 +46,7 @@ class PublicNounsHoldersScraper:
"""
self.contract_address = Web3.to_checksum_address(contract_address)
self.collection_name = collection_name
-self.etherscan_api_key = os.getenv("ETHERSCAN_API_KEY")
+self.etherscan_api_key = os.getenv("ETHEREUM_ETHERSCAN_API_KEY")
self.alchemy_api_key = os.getenv("ALCHEMY_API_KEY")
self.web3 = Web3(Web3.HTTPProvider(f"https://eth-mainnet.g.alchemy.com/v2/{self.alchemy_api_key}"))
self.db = DatabaseConnector()
@ -54,7 +54,7 @@ class PublicNounsHoldersScraper:
# Validate API keys
if not self.etherscan_api_key:
-logger.error("ETHERSCAN_API_KEY not found in environment variables")
+logger.error("ETHEREUM_ETHERSCAN_API_KEY not found in environment variables")
sys.exit(1)
if not self.alchemy_api_key:
logger.error("ALCHEMY_API_KEY not found in environment variables")


@ -0,0 +1,368 @@
#!/usr/bin/env python3
"""
Fix Contact Issues
This script addresses two main issues with the contacts in the database:
1. Removes prefixed names like "RG_0x..." and "MC_0x..." and replaces them with NULL
if they don't have ENS names
2. Merges duplicate contacts that have the same Ethereum address but different records
Usage:
python fix_contact_issues.py
"""
import os
import sys
import argparse
from typing import Dict, List, Any, Optional
from dotenv import load_dotenv
# Add parent directory to path to import utils
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from utils.db_connector import DatabaseConnector
from utils.logger import setup_logger
# Load environment variables
load_dotenv()
# Setup logging
logger = setup_logger("fix_contact_issues")
class ContactFixer:
"""Fixes issues with contacts in the database."""
def __init__(self):
"""Initialize the contact fixer."""
self.db = DatabaseConnector()
def fix_prefixed_names(self) -> int:
"""
Replace prefixed names like "RG_0x..." and "MC_0x..." with NULL.
Only do this for contacts that don't have ENS names.
Returns:
Number of contacts fixed
"""
logger.info("Fixing prefixed names...")
# Find contacts with prefixed names
query = """
SELECT id, name, "ethereumAddress", "ensName"
FROM "Contact"
WHERE (name LIKE 'RG\\_%' OR name LIKE 'MC\\_%' OR name LIKE 'ETH\\_%'
OR name LIKE '%\\_0x%')
AND "ensName" IS NULL
"""
contacts = self.db.execute_query(query, {})
logger.info(f"Found {len(contacts)} contacts with prefixed names")
# Update contacts to set name to NULL
fixed_count = 0
for contact in contacts:
update_query = """
UPDATE "Contact"
SET name = NULL,
"updatedAt" = NOW()
WHERE id = %(contact_id)s
"""
rows_updated = self.db.execute_update(update_query, {
"contact_id": contact["id"]
})
if rows_updated > 0:
logger.info(f"Cleared name for contact {contact['id']} (was '{contact['name']}')")
fixed_count += 1
logger.info(f"Fixed {fixed_count} contacts with prefixed names")
return fixed_count
def find_duplicate_contacts(self) -> List[Dict[str, Any]]:
"""
Find contacts with duplicate Ethereum addresses.
Returns:
List of Ethereum addresses with duplicate contacts
"""
query = """
SELECT "ethereumAddress", COUNT(*) as count
FROM "Contact"
GROUP BY "ethereumAddress"
HAVING COUNT(*) > 1
ORDER BY COUNT(*) DESC
"""
duplicates = self.db.execute_query(query, {})
logger.info(f"Found {len(duplicates)} Ethereum addresses with duplicate contacts")
return duplicates
def merge_duplicate_contacts(self) -> int:
"""
Merge duplicate contacts by keeping the most complete record.
Returns:
Number of contacts merged
"""
logger.info("Merging duplicate contacts...")
# Find duplicate contacts
duplicates = self.find_duplicate_contacts()
# For each duplicate address
total_merged = 0
for duplicate in duplicates:
eth_address = duplicate["ethereumAddress"]
# Get all contacts with this address
query = """
SELECT id, "ethereumAddress", "ensName", name, email,
twitter, discord, telegram, farcaster, "otherSocial",
"warpcastAddress", "ethereumAddress2", "createdAt"
FROM "Contact"
WHERE "ethereumAddress" = %(eth_address)s
ORDER BY "createdAt" ASC
"""
contacts = self.db.execute_query(query, {"eth_address": eth_address})
if len(contacts) <= 1:
continue
# Determine the primary contact (the one to keep)
# We'll keep the oldest one (first created) as the primary
primary_contact = contacts[0]
primary_id = primary_contact["id"]
# Merge data from other contacts into the primary
for contact in contacts[1:]:
# Update primary contact with any non-null fields from this contact
update_data = {}
for field in ["ensName", "name", "email", "twitter", "discord",
"telegram", "farcaster", "otherSocial", "warpcastAddress",
"ethereumAddress2"]:
if contact[field] is not None and primary_contact[field] is None:
update_data[field] = contact[field]
if update_data:
self.db.update_contact(primary_id, update_data)
primary_contact.update(update_data)  # keep the local copy in sync for later duplicates
logger.info(f"Updated primary contact {primary_id} with data from {contact['id']}")
# Move all related data to the primary contact
self.move_related_data(contact["id"], primary_id)
# Delete the duplicate contact
delete_query = """
DELETE FROM "Contact"
WHERE id = %(contact_id)s
"""
self.db.execute_update(delete_query, {"contact_id": contact["id"]})
logger.info(f"Deleted duplicate contact {contact['id']}")
total_merged += 1
logger.info(f"Merged {total_merged} duplicate contacts")
return total_merged
def move_related_data(self, from_id: str, to_id: str) -> None:
"""
Move all related data from one contact to another.
Args:
from_id: ID of the contact to move data from
to_id: ID of the contact to move data to
"""
# Move NFT holdings
self.move_nft_holdings(from_id, to_id)
# Move token holdings
self.move_token_holdings(from_id, to_id)
# Move DAO memberships
self.move_dao_memberships(from_id, to_id)
# Move notes
self.move_notes(from_id, to_id)
# Move tags
self.move_tags(from_id, to_id)
# Move contact sources
self.move_contact_sources(from_id, to_id)
def move_nft_holdings(self, from_id: str, to_id: str) -> None:
"""
Move NFT holdings from one contact to another.
Args:
from_id: ID of the contact to move holdings from
to_id: ID of the contact to move holdings to
"""
query = """
INSERT INTO "NftHolding" (
id, "contactId", "contractAddress", "tokenId", "collectionName",
"acquiredAt", "createdAt", "updatedAt"
)
SELECT
gen_random_uuid(), %(to_id)s, "contractAddress", "tokenId", "collectionName",
"acquiredAt", "createdAt", NOW()
FROM "NftHolding"
WHERE "contactId" = %(from_id)s
ON CONFLICT ("contactId", "contractAddress", "tokenId") DO NOTHING
"""
self.db.execute_update(query, {"from_id": from_id, "to_id": to_id})
def move_token_holdings(self, from_id: str, to_id: str) -> None:
"""
Move token holdings from one contact to another.
Args:
from_id: ID of the contact to move holdings from
to_id: ID of the contact to move holdings to
"""
query = """
INSERT INTO "TokenHolding" (
id, "contactId", "contractAddress", "tokenSymbol", balance,
"lastUpdated", "createdAt", "updatedAt"
)
SELECT
gen_random_uuid(), %(to_id)s, "contractAddress", "tokenSymbol", balance,
"lastUpdated", "createdAt", NOW()
FROM "TokenHolding"
WHERE "contactId" = %(from_id)s
ON CONFLICT ("contactId", "contractAddress") DO NOTHING
"""
self.db.execute_update(query, {"from_id": from_id, "to_id": to_id})
def move_dao_memberships(self, from_id: str, to_id: str) -> None:
"""
Move DAO memberships from one contact to another.
Args:
from_id: ID of the contact to move memberships from
to_id: ID of the contact to move memberships to
"""
query = """
INSERT INTO "DaoMembership" (
id, "contactId", "daoName", "daoType", "joinedAt", "createdAt", "updatedAt"
)
SELECT
gen_random_uuid(), %(to_id)s, "daoName", "daoType", "joinedAt", "createdAt", NOW()
FROM "DaoMembership"
WHERE "contactId" = %(from_id)s
ON CONFLICT ("contactId", "daoName") DO NOTHING
"""
self.db.execute_update(query, {"from_id": from_id, "to_id": to_id})
def move_notes(self, from_id: str, to_id: str) -> None:
"""
Move notes from one contact to another.
Args:
from_id: ID of the contact to move notes from
to_id: ID of the contact to move notes to
"""
query = """
INSERT INTO "Note" (
id, "contactId", content, "createdAt", "updatedAt"
)
SELECT
gen_random_uuid(), %(to_id)s, content, "createdAt", NOW()
FROM "Note"
WHERE "contactId" = %(from_id)s
"""
self.db.execute_update(query, {"from_id": from_id, "to_id": to_id})
def move_tags(self, from_id: str, to_id: str) -> None:
"""
Move tags from one contact to another.
Args:
from_id: ID of the contact to move tags from
to_id: ID of the contact to move tags to
"""
query = """
INSERT INTO "TagsOnContacts" (
"contactId", "tagId", "assignedAt"
)
SELECT
%(to_id)s, "tagId", "assignedAt"
FROM "TagsOnContacts"
WHERE "contactId" = %(from_id)s
ON CONFLICT ("contactId", "tagId") DO NOTHING
"""
self.db.execute_update(query, {"from_id": from_id, "to_id": to_id})
def move_contact_sources(self, from_id: str, to_id: str) -> None:
"""
Move contact sources from one contact to another.
Args:
from_id: ID of the contact to move sources from
to_id: ID of the contact to move sources to
"""
# Check if the ContactSource table exists
query = """
SELECT EXISTS (
SELECT FROM information_schema.tables
WHERE table_name = 'ContactSource'
) as exists
"""
result = self.db.execute_query(query, {})
if not result or not result[0]["exists"]:
logger.info("ContactSource table does not exist, skipping contact sources migration")
return
query = """
INSERT INTO "ContactSource" (
id, "contactId", "dataSourceId", "createdAt", "updatedAt"
)
SELECT
gen_random_uuid(), %(to_id)s, "dataSourceId", "createdAt", NOW()
FROM "ContactSource"
WHERE "contactId" = %(from_id)s
ON CONFLICT ("contactId", "dataSourceId") DO NOTHING
"""
self.db.execute_update(query, {"from_id": from_id, "to_id": to_id})
def run(self) -> None:
"""Run all fixes."""
logger.info("Starting contact fixes...")
# Fix prefixed names
fixed_names = self.fix_prefixed_names()
# Merge duplicate contacts
merged_contacts = self.merge_duplicate_contacts()
logger.info(f"Completed fixes: {fixed_names} name prefixes removed, {merged_contacts} duplicate contacts merged")
def main():
"""Main entry point for the script."""
parser = argparse.ArgumentParser(description="Fix contact issues")
parser.add_argument("--names-only", action="store_true",
help="Only fix prefixed names, don't merge duplicates")
parser.add_argument("--duplicates-only", action="store_true",
help="Only merge duplicate contacts, don't fix names")
args = parser.parse_args()
fixer = ContactFixer()
if args.names_only:
fixer.fix_prefixed_names()
elif args.duplicates_only:
fixer.merge_duplicate_contacts()
else:
fixer.run()
if __name__ == "__main__":
main()
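The merge rule above — keep the oldest record as primary and copy a field from a duplicate only when the primary's value is NULL — can be sketched as a standalone function. This is a hypothetical helper for illustration, not part of the script:

```python
from typing import Any, Dict, Iterable

def merge_records(primary: Dict[str, Any],
                  duplicate: Dict[str, Any],
                  fields: Iterable[str]) -> Dict[str, Any]:
    """Return the update payload ContactFixer would apply to the primary:
    a field is copied only if the duplicate has it and the primary does not."""
    return {
        f: duplicate[f]
        for f in fields
        if duplicate.get(f) is not None and primary.get(f) is None
    }
```

Existing data on the primary is never overwritten, so the oldest record always wins ties.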


@ -0,0 +1,80 @@
#!/usr/bin/env python3
"""
Fix Contact Names
This script removes prefixed names like "RG_0x...", "MC_0x...", and "ETH_0x..."
and replaces them with NULL if they don't have ENS names.
Usage:
python fix_contact_names.py
"""
import os
import sys
import argparse
import psycopg2
from psycopg2.extras import RealDictCursor
from dotenv import load_dotenv
# Load environment variables
load_dotenv()
def fix_contact_names():
"""
Fix contact names by removing prefixed names and replacing with NULL.
"""
# Get database connection string from environment variables
db_url = os.getenv("PYTHON_DATABASE_URL")
if not db_url:
fallback = os.getenv("DATABASE_URL")
if not fallback:
sys.exit("Error: PYTHON_DATABASE_URL or DATABASE_URL must be set")
db_url = fallback.split("?schema=")[0]
# Connect to the database
conn = psycopg2.connect(db_url)
conn.autocommit = True
try:
with conn.cursor(cursor_factory=RealDictCursor) as cursor:
# Find contacts with prefixed names
query = """
SELECT id, name, "ethereumAddress", "ensName"
FROM "Contact"
WHERE (name LIKE 'RG\\_%' OR name LIKE 'MC\\_%' OR name LIKE 'ETH\\_%'
OR name LIKE '%\\_0x%' ESCAPE '\\')
AND "ensName" IS NULL
"""
cursor.execute(query)
contacts = cursor.fetchall()
print(f"Found {len(contacts)} contacts with prefixed names")
# Update contacts to set name to NULL
fixed_count = 0
for contact in contacts:
update_query = """
UPDATE "Contact"
SET name = NULL,
"updatedAt" = NOW()
WHERE id = %s
"""
cursor.execute(update_query, (contact["id"],))
rows_updated = cursor.rowcount
if rows_updated > 0:
print(f"Cleared name for contact {contact['id']} (was '{contact['name']}')")
fixed_count += 1
print(f"Fixed {fixed_count} contacts with prefixed names")
finally:
conn.close()
def main():
"""Main entry point for the script."""
parser = argparse.ArgumentParser(description="Fix contact names")
args = parser.parse_args()
fix_contact_names()
if __name__ == "__main__":
main()
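The SQL LIKE patterns above translate to a small regular expression, shown here as a hypothetical predicate for spotting generated placeholder names (the underscore is literal, matching the escaped `\_` in the query):

```python
import re

# An RG_/MC_/ETH_ prefix, or a literal "_0x" anywhere in the name.
PREFIXED_NAME = re.compile(r"^(?:RG_|MC_|ETH_)|_0x")

def is_prefixed_name(name: str) -> bool:
    """True when a contact name looks like a generated placeholder
    rather than a real, human-entered name."""
    return bool(PREFIXED_NAME.search(name))
```

The script only clears names for rows where `ensName` is NULL, so a resolved ENS name always takes precedence over this check.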


@ -0,0 +1,224 @@
#!/usr/bin/env python3
"""
Merge Duplicate Contacts
This script finds and merges duplicate contacts in the database.
Duplicates are defined as contacts with the same Ethereum address.
Usage:
python merge_duplicate_contacts.py
"""
import os
import sys
import argparse
import psycopg2
from psycopg2.extras import RealDictCursor
from dotenv import load_dotenv
# Load environment variables
load_dotenv()
def merge_duplicate_contacts():
"""
Find and merge duplicate contacts.
"""
# Get database connection string from environment variables
db_url = os.getenv("PYTHON_DATABASE_URL")
if not db_url:
fallback = os.getenv("DATABASE_URL")
if not fallback:
sys.exit("Error: PYTHON_DATABASE_URL or DATABASE_URL must be set")
db_url = fallback.split("?schema=")[0]
# Connect to the database
conn = psycopg2.connect(db_url)
conn.autocommit = True
try:
with conn.cursor(cursor_factory=RealDictCursor) as cursor:
# Find duplicate Ethereum addresses
query = """
SELECT "ethereumAddress", COUNT(*) as count
FROM "Contact"
GROUP BY "ethereumAddress"
HAVING COUNT(*) > 1
ORDER BY COUNT(*) DESC
"""
cursor.execute(query)
duplicates = cursor.fetchall()
print(f"Found {len(duplicates)} Ethereum addresses with duplicate contacts")
# Process each set of duplicates
total_merged = 0
for duplicate in duplicates:
eth_address = duplicate["ethereumAddress"]
# Get all contacts with this address
query = """
SELECT id, "ethereumAddress", "ensName", name, email,
twitter, discord, telegram, farcaster, "otherSocial",
"warpcastAddress", "ethereumAddress2", "createdAt"
FROM "Contact"
WHERE "ethereumAddress" = %s
ORDER BY "createdAt" ASC
"""
cursor.execute(query, (eth_address,))
contacts = cursor.fetchall()
# Skip if we somehow don't have duplicates
if len(contacts) <= 1:
continue
# Choose the oldest contact as the primary
primary_contact = contacts[0]
primary_id = primary_contact["id"]
print(f"Processing {len(contacts)} duplicates for address {eth_address}")
print(f" Primary contact: {primary_id}")
# Merge data from other contacts into the primary
for contact in contacts[1:]:
contact_id = contact["id"]
# Move NFT holdings
print(f" Moving NFT holdings from {contact_id} to {primary_id}")
query = """
INSERT INTO "NftHolding" (
id, "contactId", "contractAddress", "tokenId", "collectionName",
"acquiredAt", "createdAt", "updatedAt"
)
SELECT
gen_random_uuid(), %s, "contractAddress", "tokenId", "collectionName",
"acquiredAt", "createdAt", NOW()
FROM "NftHolding"
WHERE "contactId" = %s
ON CONFLICT ("contactId", "contractAddress", "tokenId") DO NOTHING
"""
cursor.execute(query, (primary_id, contact_id))
# Move token holdings
print(f" Moving token holdings from {contact_id} to {primary_id}")
query = """
INSERT INTO "TokenHolding" (
id, "contactId", "contractAddress", "tokenSymbol", balance,
"lastUpdated", "createdAt", "updatedAt"
)
SELECT
gen_random_uuid(), %s, "contractAddress", "tokenSymbol", balance,
"lastUpdated", "createdAt", NOW()
FROM "TokenHolding"
WHERE "contactId" = %s
ON CONFLICT ("contactId", "contractAddress") DO NOTHING
"""
cursor.execute(query, (primary_id, contact_id))
# Move DAO memberships
print(f" Moving DAO memberships from {contact_id} to {primary_id}")
query = """
INSERT INTO "DaoMembership" (
id, "contactId", "daoName", "daoType", "joinedAt", "createdAt", "updatedAt"
)
SELECT
gen_random_uuid(), %s, "daoName", "daoType", "joinedAt", "createdAt", NOW()
FROM "DaoMembership"
WHERE "contactId" = %s
ON CONFLICT ("contactId", "daoName") DO NOTHING
"""
cursor.execute(query, (primary_id, contact_id))
# Move notes
print(f" Moving notes from {contact_id} to {primary_id}")
query = """
INSERT INTO "Note" (
id, "contactId", content, "createdAt", "updatedAt"
)
SELECT
gen_random_uuid(), %s, content, "createdAt", NOW()
FROM "Note"
WHERE "contactId" = %s
"""
cursor.execute(query, (primary_id, contact_id))
# Move tags
print(f" Moving tags from {contact_id} to {primary_id}")
query = """
INSERT INTO "TagsOnContacts" (
"contactId", "tagId", "assignedAt"
)
SELECT
%s, "tagId", "assignedAt"
FROM "TagsOnContacts"
WHERE "contactId" = %s
ON CONFLICT ("contactId", "tagId") DO NOTHING
"""
cursor.execute(query, (primary_id, contact_id))
# Check if ContactSource table exists
query = """
SELECT EXISTS (
SELECT FROM information_schema.tables
WHERE table_name = 'ContactSource'
) as exists
"""
cursor.execute(query)
result = cursor.fetchone()
# Move contact sources if table exists
if result and result["exists"]:
print(f" Moving contact sources from {contact_id} to {primary_id}")
query = """
INSERT INTO "ContactSource" (
id, "contactId", "dataSourceId", "createdAt", "updatedAt"
)
SELECT
gen_random_uuid(), %s, "dataSourceId", "createdAt", NOW()
FROM "ContactSource"
WHERE "contactId" = %s
ON CONFLICT ("contactId", "dataSourceId") DO NOTHING
"""
cursor.execute(query, (primary_id, contact_id))
# Update primary contact with non-null values from this contact
update_fields = []
update_values = []
for field in ["ensName", "name", "email", "twitter", "discord",
"telegram", "farcaster", "otherSocial", "warpcastAddress",
"ethereumAddress2"]:
if contact[field] is not None and primary_contact[field] is None:
update_fields.append(f'"{field}" = %s')
update_values.append(contact[field])
primary_contact[field] = contact[field]  # keep the local copy in sync for later duplicates
print(f"  Updating primary contact {field} to {contact[field]}")
if update_fields:
update_values.append(primary_id)
query = f"""
UPDATE "Contact"
SET {', '.join(update_fields)}, "updatedAt" = NOW()
WHERE id = %s
"""
cursor.execute(query, update_values)
# Delete the duplicate contact
print(f" Deleting duplicate contact {contact_id}")
query = """
DELETE FROM "Contact"
WHERE id = %s
"""
cursor.execute(query, (contact_id,))
total_merged += 1
print(f"Merged {total_merged} duplicate contacts")
finally:
conn.close()
def main():
"""Main entry point for the script."""
parser = argparse.ArgumentParser(description="Merge duplicate contacts")
args = parser.parse_args()
merge_duplicate_contacts()
if __name__ == "__main__":
main()
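The dynamic UPDATE built near the end of the loop — field names interpolated from a fixed whitelist, values bound as parameters — can be sketched as a helper. This is a hypothetical function for illustration; psycopg2 substitutes each `%s` from the parameter list at execution time:

```python
from typing import Any, Dict, List, Tuple

def build_update(table: str, fields: Dict[str, Any],
                 record_id: str) -> Tuple[str, List[Any]]:
    """Assemble a parameterized UPDATE the way the merge loop does.
    Only field names (from a known whitelist) go into the SQL string;
    all values stay in the parameter list for safe binding."""
    assignments = ", ".join(f'"{f}" = %s' for f in fields)
    sql = f'UPDATE "{table}" SET {assignments}, "updatedAt" = NOW() WHERE id = %s'
    return sql, list(fields.values()) + [record_id]
```

Interpolating names is safe here only because they come from the script's hard-coded field list, never from user input.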


@ -0,0 +1,224 @@
#!/usr/bin/env python3
"""
Resolve ENS Names and Contact Information for All Contacts
This script resolves ENS names and additional contact information for all contacts
in the database that have Ethereum addresses. It uses the existing ENS resolver utility
to fetch ENS names and text records containing social profiles and contact information.
Usage:
python resolve_all_ens.py [--batch-size BATCH_SIZE] [--delay DELAY]
"""
import os
import sys
import logging
import time
import argparse
from typing import Dict, Any, List, Optional, Tuple
from web3 import Web3
from dotenv import load_dotenv
# Add parent directory to path to import utils
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from utils.db_connector import DatabaseConnector
from utils.ens_resolver import ENSResolver
from utils.logger import setup_logger
# Load environment variables
load_dotenv()
# Setup logging
logger = setup_logger("all_ens_resolver")
class AllContactsENSResolver:
"""Resolver for ENS names and contact information for all contacts"""
def __init__(self):
"""Initialize the resolver"""
# Initialize database
self.db = DatabaseConnector()
# Initialize Web3 and ENS resolver
alchemy_api_key = os.getenv("ALCHEMY_API_KEY")
if not alchemy_api_key:
raise ValueError("ALCHEMY_API_KEY not found in environment variables")
self.web3 = Web3(Web3.HTTPProvider(f"https://eth-mainnet.g.alchemy.com/v2/{alchemy_api_key}"))
self.ens_resolver = ENSResolver(self.web3)
# Register data source
self.data_source_id = self.register_data_source()
def register_data_source(self) -> str:
"""Register the ENS data source in the database"""
return self.db.upsert_data_source(
name="ENS Resolver",
source_type="blockchain",
description="ENS names and profile information resolved from Ethereum addresses"
)
def get_contacts_without_ens(self) -> List[Dict[str, Any]]:
"""Get all contacts that have an Ethereum address but no ENS name"""
query = """
SELECT id, "ethereumAddress", name
FROM "Contact"
WHERE "ethereumAddress" IS NOT NULL
AND "ensName" IS NULL
"""
result = self.db.execute_query(query)
logger.info(f"Found {len(result)} contacts without ENS names")
return result
def get_all_contacts_with_eth_address(self) -> List[Dict[str, Any]]:
"""Get all contacts that have an Ethereum address"""
query = """
SELECT id, "ethereumAddress", "ensName", name, twitter, discord, telegram, email, farcaster
FROM "Contact"
WHERE "ethereumAddress" IS NOT NULL
"""
result = self.db.execute_query(query)
logger.info(f"Found {len(result)} contacts with Ethereum addresses")
return result
def process_contact(self, contact: Dict[str, Any]) -> Tuple[bool, bool]:
"""
Process a single contact to resolve ENS name and contact info
Args:
contact: Contact data from the database
Returns:
Tuple of (ens_updated, info_updated) booleans
"""
contact_id = contact["id"]
address = contact["ethereumAddress"]
current_ens = contact.get("ensName")
ens_updated = False
info_updated = False
# Skip if no address
if not address:
return ens_updated, info_updated
# Resolve ENS name if not already set
if not current_ens:
ens_name = self.ens_resolver.get_ens_name(address)
if ens_name:
# Update contact with ENS name
self.db.update_contact(contact_id, {"ensName": ens_name})
logger.info(f"Updated contact {contact_id} with ENS name: {ens_name}")
current_ens = ens_name
ens_updated = True
# Get contact info from ENS text records if we have an ENS name
if current_ens:
# Update profile from ENS
self.ens_resolver.update_contact_from_ens(contact_id, current_ens)
info_updated = True
# Link to data source
self.db.link_contact_to_data_source(contact_id, self.data_source_id)
return ens_updated, info_updated
def run(self, batch_size: int = 50, delay_seconds: float = 0.5, resolve_all: bool = False):
"""
Run the resolver for contacts
Args:
batch_size: Number of contacts to process in each batch
delay_seconds: Delay between processing contacts
resolve_all: Whether to process all contacts or just those without ENS names
Returns:
Tuple of (ens_updated_count, info_updated_count)
"""
# Create a scraping job
job_id = self.db.create_scraping_job("ENS Resolver", "running")
logger.info(f"Created scraping job with ID: {job_id}")
try:
if resolve_all:
contacts = self.get_all_contacts_with_eth_address()
else:
contacts = self.get_contacts_without_ens()
if not contacts:
logger.info("No contacts found to process")
self.db.update_scraping_job(job_id, "completed")
return 0, 0
ens_updated_count = 0
info_updated_count = 0
# Process in batches to avoid rate limiting
for i in range(0, len(contacts), batch_size):
batch = contacts[i:i+batch_size]
logger.info(f"Processing batch {i//batch_size + 1}/{(len(contacts) + batch_size - 1)//batch_size}")
batch_updated = 0
for contact in batch:
ens_updated, info_updated = self.process_contact(contact)
if ens_updated:
ens_updated_count += 1
if info_updated:
info_updated_count += 1
if ens_updated or info_updated:
batch_updated += 1
# Add a small delay to avoid rate limiting
time.sleep(delay_seconds)
# Update the scraping job without re-running process_contact on the batch
self.db.update_scraping_job(
job_id,
"running",
records_processed=len(batch),
records_updated=batch_updated
)
# Complete the scraping job
self.db.update_scraping_job(
job_id,
"completed",
records_processed=len(contacts),
records_added=ens_updated_count,
records_updated=info_updated_count
)
logger.info(f"Updated ENS names for {ens_updated_count} contacts and contact info for {info_updated_count} contacts out of {len(contacts)} processed")
return ens_updated_count, info_updated_count
except Exception as e:
# Update the scraping job with error
self.db.update_scraping_job(job_id, "failed", error_message=str(e))
logger.exception(f"Error resolving ENS names: {e}")
raise
def main():
"""Main function"""
try:
parser = argparse.ArgumentParser(description="Resolve ENS names and contact information for all contacts")
parser.add_argument("--all", action="store_true", help="Process all contacts with Ethereum addresses, not just those without ENS names")
parser.add_argument("--batch-size", type=int, default=50, help="Number of contacts to process in each batch")
parser.add_argument("--delay", type=float, default=0.5, help="Delay in seconds between processing contacts")
args = parser.parse_args()
resolver = AllContactsENSResolver()
ens_count, info_count = resolver.run(
batch_size=args.batch_size,
delay_seconds=args.delay,
resolve_all=args.all
)
logger.info(f"ENS resolution completed successfully. Updated {ens_count} ENS names and {info_count} contact info records.")
return 0
except Exception as e:
logger.exception(f"Error running ENS resolver: {e}")
return 1
if __name__ == "__main__":
sys.exit(main())
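The batch slicing used in `run` generalizes to a small helper (a sketch for illustration; the script inlines the same `range`/slice arithmetic):

```python
from typing import Iterator, List, Sequence, TypeVar

T = TypeVar("T")

def batches(items: Sequence[T], size: int) -> Iterator[List[T]]:
    """Yield consecutive slices of at most `size` items, matching the
    chunking resolve_all_ens uses between rate-limit sleeps."""
    for i in range(0, len(items), size):
        yield list(items[i:i + size])
```

The total batch count is `(len(items) + size - 1) // size`, which is the expression the script logs as the batch denominator.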


@ -0,0 +1,361 @@
#!/usr/bin/env python3
"""
Resolve ENS Names and Contact Information
This script fetches ENS names and additional contact information for Ethereum addresses
in the database. It uses the Web3 library to query the Ethereum blockchain for ENS records
and text records containing social profiles and contact information.
Usage:
python resolve_ens_names.py
"""
import os
import sys
import logging
import time
from typing import List, Dict, Any, Optional, Tuple
from web3 import Web3
from dotenv import load_dotenv
# Add parent directory to path to import utils
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from utils.db_connector import DatabaseConnector
from utils.logger import setup_logger
# Load environment variables
load_dotenv()
# Setup logging
logger = setup_logger("ens_resolver")
class ENSResolver:
"""Resolver for ENS names and contact information from Ethereum addresses"""
# ENS text record keys to check
TEXT_RECORDS = [
"name", # Display name
"email", # Email address
"url", # Website URL
"avatar", # Avatar URL
"description", # Bio/description
"notice", # Notice
"keywords", # Keywords/tags
"com.twitter", # Twitter handle
"com.github", # GitHub username
"org.telegram", # Telegram username
"com.discord", # Discord username
"com.reddit", # Reddit username
"xyz.farcaster", # Farcaster handle
"social.picture", # Profile picture
"vnd.twitter", # Alternative Twitter
"vnd.github", # Alternative GitHub
]
def __init__(self):
"""Initialize the resolver"""
# Initialize database
self.db = DatabaseConnector()
# Initialize Web3 connection
infura_key = os.getenv("INFURA_API_KEY")
if not infura_key:
raise ValueError("INFURA_API_KEY environment variable is required")
self.w3 = Web3(Web3.HTTPProvider(f"https://mainnet.infura.io/v3/{infura_key}"))
if not self.w3.is_connected():
raise ConnectionError("Failed to connect to Ethereum node")
logger.info(f"Connected to Ethereum node: {self.w3.client_version}")
def get_contacts_without_ens(self) -> List[Dict[str, Any]]:
"""Get all contacts that have an Ethereum address but no ENS name"""
query = """
SELECT id, "ethereumAddress", name
FROM "Contact"
WHERE "ethereumAddress" IS NOT NULL
AND "ensName" IS NULL
"""
result = self.db.execute_query(query)
logger.info(f"Found {len(result)} contacts without ENS names")
return result
def get_all_contacts_with_eth_address(self) -> List[Dict[str, Any]]:
"""Get all contacts that have an Ethereum address to check for additional info"""
query = """
SELECT id, "ethereumAddress", "ensName", name, twitter, discord, telegram, email, farcaster
FROM "Contact"
WHERE "ethereumAddress" IS NOT NULL
"""
result = self.db.execute_query(query)
logger.info(f"Found {len(result)} contacts with Ethereum addresses")
return result
def resolve_ens_name(self, address: str) -> Optional[str]:
"""Resolve ENS name for an Ethereum address"""
try:
# Ensure the address is properly formatted
checksum_address = self.w3.to_checksum_address(address)
# Try to get the ENS name
ens_name = self.w3.ens.name(checksum_address)
# If we got a name, verify it resolves back to the same address
if ens_name:
resolved_address = self.w3.ens.address(ens_name)
if resolved_address and resolved_address.lower() == address.lower():
logger.info(f"Resolved ENS name for {address}: {ens_name}")
return ens_name
else:
logger.warning(f"ENS name {ens_name} for {address} resolves to different address {resolved_address}")
return None
except Exception as e:
logger.error(f"Error resolving ENS name for {address}: {e}")
return None
def get_ens_text_records(self, ens_name: str) -> Dict[str, str]:
"""Get text records for an ENS name"""
text_records = {}
try:
for key in self.TEXT_RECORDS:
try:
value = self.w3.ens.get_text(ens_name, key)
if value:
text_records[key] = value
except Exception as e:
logger.debug(f"Error getting text record '{key}' for {ens_name}: {e}")
if text_records:
logger.info(f"Found {len(text_records)} text records for {ens_name}: {', '.join(text_records.keys())}")
return text_records
except Exception as e:
logger.error(f"Error getting text records for {ens_name}: {e}")
return {}
def map_text_records_to_contact_fields(self, text_records: Dict[str, str]) -> Dict[str, str]:
"""Map ENS text records to Contact model fields"""
contact_fields = {}
# Map known fields
if "name" in text_records:
contact_fields["name"] = text_records["name"]
if "email" in text_records:
contact_fields["email"] = text_records["email"]
# Twitter can be in different text records
for twitter_key in ["com.twitter", "vnd.twitter"]:
if twitter_key in text_records:
twitter = text_records[twitter_key]
# Remove @ if present
if twitter.startswith("@"):
twitter = twitter[1:]
contact_fields["twitter"] = twitter
break
# Discord
if "com.discord" in text_records:
contact_fields["discord"] = text_records["com.discord"]
# Telegram
if "org.telegram" in text_records:
contact_fields["telegram"] = text_records["org.telegram"]
# Farcaster
if "xyz.farcaster" in text_records:
contact_fields["farcaster"] = text_records["xyz.farcaster"]
# Collect other social profiles
other_social = []
if "com.github" in text_records or "vnd.github" in text_records:
github = text_records.get("com.github") or text_records.get("vnd.github")
other_social.append(f"GitHub: {github}")
if "com.reddit" in text_records:
other_social.append(f"Reddit: {text_records['com.reddit']}")
if "url" in text_records:
other_social.append(f"Website: {text_records['url']}")
if other_social:
contact_fields["otherSocial"] = "; ".join(other_social)
return contact_fields
def update_contact_info(self, contact_id: str, ens_name: Optional[str] = None, contact_info: Optional[Dict[str, str]] = None) -> bool:
"""Update a contact with ENS name and additional contact information"""
try:
# Build the update query dynamically based on what fields we have
update_fields = []
params = {"contact_id": contact_id}
if ens_name:
update_fields.append('"ensName" = %(ens_name)s')
params["ens_name"] = ens_name
if contact_info:
for field, value in contact_info.items():
update_fields.append(f'"{field}" = %({field})s')
params[field] = value
if not update_fields:
logger.warning(f"No fields to update for contact {contact_id}")
return False
query = f"""
UPDATE "Contact"
SET {", ".join(update_fields)},
"updatedAt" = NOW()
WHERE id = %(contact_id)s
"""
self.db.execute_update(query, params)
# Also update the name if it's currently a generic name and we have a better name
if ens_name and (not contact_info or "name" not in contact_info):
name_query = """
SELECT name FROM "Contact" WHERE id = %(contact_id)s
"""
result = self.db.execute_query(name_query, {"contact_id": contact_id})
current_name = result[0]["name"] if result else None
# If the current name is generic (starts with MC_ or ETH_ or RG_), update it
if current_name and (current_name.startswith("MC_") or current_name.startswith("ETH_") or current_name.startswith("RG_")):
# Use ENS name without .eth suffix as the name
name = ens_name[:-4] if ens_name.endswith('.eth') else ens_name
update_name_query = """
UPDATE "Contact"
SET name = %(name)s,
"updatedAt" = NOW()
WHERE id = %(contact_id)s
"""
self.db.execute_update(update_name_query, {
"contact_id": contact_id,
"name": name
})
logger.info(f"Updated contact {contact_id} name from '{current_name}' to '{name}'")
fields_updated = []
if ens_name:
fields_updated.append("ENS name")
if contact_info:
fields_updated.extend(list(contact_info.keys()))
logger.info(f"Updated contact {contact_id} with: {', '.join(fields_updated)}")
return True
except Exception as e:
logger.error(f"Error updating contact {contact_id}: {e}")
return False
def process_contact(self, contact: Dict[str, Any]) -> Tuple[bool, bool]:
"""Process a single contact to resolve ENS name and contact info"""
contact_id = contact["id"]
address = contact["ethereumAddress"]
current_ens = contact.get("ensName")
ens_updated = False
info_updated = False
# Skip if no address
if not address:
return ens_updated, info_updated
# Resolve ENS name if not already set
ens_name = None
if not current_ens:
ens_name = self.resolve_ens_name(address)
if ens_name:
ens_updated = True
else:
ens_name = current_ens
# Get contact info from ENS text records if we have an ENS name
contact_info = {}
if ens_name:
text_records = self.get_ens_text_records(ens_name)
if text_records:
contact_info = self.map_text_records_to_contact_fields(text_records)
# Only include fields that are different from what we already have
for field in list(contact_info.keys()):
if field in contact and contact[field] == contact_info[field]:
del contact_info[field]
if contact_info:
info_updated = True
# Update the contact if we have new information
if ens_updated or info_updated:
self.update_contact_info(contact_id, ens_name if ens_updated else None, contact_info if info_updated else None)
return ens_updated, info_updated
def run(self, batch_size: int = 50, delay_seconds: float = 0.5, resolve_all: bool = False):
"""Run the resolver for contacts"""
if resolve_all:
contacts = self.get_all_contacts_with_eth_address()
else:
contacts = self.get_contacts_without_ens()
if not contacts:
logger.info("No contacts found to process")
return 0, 0
ens_updated_count = 0
info_updated_count = 0
# Process in batches to avoid rate limiting
for i in range(0, len(contacts), batch_size):
batch = contacts[i:i+batch_size]
logger.info(f"Processing batch {i//batch_size + 1}/{(len(contacts) + batch_size - 1)//batch_size}")
for contact in batch:
ens_updated, info_updated = self.process_contact(contact)
if ens_updated:
ens_updated_count += 1
if info_updated:
info_updated_count += 1
# Add a small delay to avoid rate limiting
time.sleep(delay_seconds)
logger.info(f"Updated ENS names for {ens_updated_count} contacts and contact info for {info_updated_count} contacts out of {len(contacts)} processed")
return ens_updated_count, info_updated_count
def main():
"""Main function"""
try:
import argparse
parser = argparse.ArgumentParser(description="Resolve ENS names and contact information")
parser.add_argument("--all", action="store_true", help="Process all contacts with Ethereum addresses, not just those without ENS names")
parser.add_argument("--batch-size", type=int, default=50, help="Number of contacts to process in each batch")
parser.add_argument("--delay", type=float, default=0.5, help="Delay in seconds between processing contacts")
args = parser.parse_args()
resolver = ENSResolver()
ens_count, info_count = resolver.run(
batch_size=args.batch_size,
delay_seconds=args.delay,
resolve_all=args.all
)
logger.info(f"ENS resolution completed successfully. Updated {ens_count} ENS names and {info_count} contact info records.")
return 0
except Exception as e:
logger.exception(f"Error running ENS resolver: {e}")
return 1
if __name__ == "__main__":
sys.exit(main())
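The text-record mapping in the resolver above can be checked in isolation; below is a minimal standalone sketch of the same rules (the function name `map_text_records` is illustrative, not part of the resolver class):

```python
def map_text_records(text_records: dict) -> dict:
    """Mirror the resolver's ENS text-record -> contact-field rules."""
    fields = {}
    if "email" in text_records:
        fields["email"] = text_records["email"]
    # Twitter can live under either key; drop a leading "@"
    for key in ("com.twitter", "vnd.twitter"):
        if key in text_records:
            handle = text_records[key]
            fields["twitter"] = handle[1:] if handle.startswith("@") else handle
            break
    # Direct one-to-one mappings
    for key, field in (("com.discord", "discord"),
                       ("org.telegram", "telegram"),
                       ("xyz.farcaster", "farcaster")):
        if key in text_records:
            fields[field] = text_records[key]
    # Remaining profiles are collapsed into a single semicolon-joined string
    other = []
    if "com.github" in text_records or "vnd.github" in text_records:
        github = text_records.get("com.github") or text_records.get("vnd.github")
        other.append(f"GitHub: {github}")
    if "com.reddit" in text_records:
        other.append(f"Reddit: {text_records['com.reddit']}")
    if "url" in text_records:
        other.append(f"Website: {text_records['url']}")
    if other:
        fields["otherSocial"] = "; ".join(other)
    return fields
```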

@@ -0,0 +1,65 @@
import { NextRequest, NextResponse } from "next/server";
import { cookies } from "next/headers";
// Mock user data - in a real app this would come from a database
const USERS = [
{
id: "1",
name: "Admin",
role: "admin",
username: "admin",
password: "stones1234" // In production, use hashed passwords
}
];
export async function POST(request: NextRequest) {
try {
const body = await request.json();
const { username, password } = body;
// Validate inputs
if (!username || !password) {
return NextResponse.json(
{ success: false, message: "Missing required fields" },
{ status: 400 }
);
}
// Find user
const user = USERS.find(u => u.username === username && u.password === password);
if (!user) {
return NextResponse.json(
{ success: false, message: "Invalid credentials" },
{ status: 401 }
);
}
// Set auth cookie
const cookieStore = cookies();
cookieStore.set('auth', JSON.stringify({
id: user.id,
name: user.name,
role: user.role
}), {
httpOnly: true,
secure: process.env.NODE_ENV === 'production',
maxAge: 60 * 60 * 24 * 7, // 1 week
path: '/'
});
return NextResponse.json({
success: true,
user: {
name: user.name,
role: user.role
}
});
} catch (error) {
console.error("Login error:", error);
return NextResponse.json(
{ success: false, message: "Internal server error" },
{ status: 500 }
);
}
}
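The route above compares plaintext passwords for a mock user, and its own comment defers hashing to production. A hedged, standard-library-only sketch of what hashed verification could look like (PBKDF2 parameters are illustrative, not the project's):

```python
import hashlib
import hmac
import os

def hash_password(password: str, *, iterations: int = 200_000) -> str:
    """Return 'salthex$digesthex' using PBKDF2-HMAC-SHA256."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return f"{salt.hex()}${digest.hex()}"

def verify_password(password: str, stored: str, *, iterations: int = 200_000) -> bool:
    """Recompute the digest from the stored salt and compare in constant time."""
    salt_hex, digest_hex = stored.split("$")
    candidate = hashlib.pbkdf2_hmac(
        "sha256", password.encode(), bytes.fromhex(salt_hex), iterations
    )
    # hmac.compare_digest avoids timing side channels on the comparison
    return hmac.compare_digest(candidate, bytes.fromhex(digest_hex))
```

The stored record would then hold the salted hash instead of the literal password, and the login handler would call the verify step rather than `u.password === password`.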

@@ -0,0 +1,20 @@
import { NextRequest, NextResponse } from "next/server";
import { cookies } from "next/headers";
export async function POST(request: NextRequest) {
try {
// Delete auth cookie
const cookieStore = cookies();
cookieStore.delete('auth');
return NextResponse.json({
success: true
});
} catch (error) {
console.error("Logout error:", error);
return NextResponse.json(
{ success: false, message: "Internal server error" },
{ status: 500 }
);
}
}

@@ -0,0 +1,313 @@
import { Metadata } from "next";
import Link from "next/link";
import { notFound } from "next/navigation";
import { getUser } from "@/lib/auth";
import { prisma } from "@/lib/prisma";
import { Button } from "@/components/ui/button";
import { Card, CardContent, CardDescription, CardHeader, CardTitle } from "@/components/ui/card";
import { Tabs, TabsContent, TabsList, TabsTrigger } from "@/components/ui/tabs";
import { LogoutButton } from "@/components/auth/logout-button";
import { Badge } from "@/components/ui/badge";
export const metadata: Metadata = {
title: "Contact Details - Stones Database",
description: "View and manage contact details",
};
interface ContactDetailPageProps {
params: {
id: string;
};
}
export default async function ContactDetailPage({
params,
}: ContactDetailPageProps) {
const user = await getUser();
// Get contact with all related data
const contact = await prisma.contact.findUnique({
where: {
id: params.id,
},
include: {
nftHoldings: true,
daoMemberships: true,
tokenHoldings: true,
notes: true,
tags: {
include: {
tag: true,
},
},
},
});
if (!contact) {
notFound();
}
return (
<div className="flex min-h-screen flex-col">
<header className="sticky top-0 z-50 w-full border-b bg-background/95 backdrop-blur supports-[backdrop-filter]:bg-background/60">
<div className="container flex h-14 items-center justify-between">
<div className="mr-4 flex">
<Link href="/" className="mr-6 flex items-center space-x-2">
<span className="font-bold">Stones Database</span>
</Link>
</div>
<nav className="flex items-center space-x-4">
<Link href="/contacts" className="text-sm font-medium">
Contacts
</Link>
<Link href="/dashboard" className="text-sm font-medium">
Dashboard
</Link>
{user && (
<div className="flex items-center gap-4">
<span className="text-sm text-muted-foreground">
Hello, {user.name}
</span>
<LogoutButton />
</div>
)}
</nav>
</div>
</header>
<main className="flex-1 container py-6">
<div className="flex items-center justify-between mb-6">
<div>
<Link href="/contacts">
<Button variant="ghost" size="sm" className="mb-2">
Back to Contacts
</Button>
</Link>
<h1 className="text-3xl font-bold">
{contact.name || contact.ensName || "Unnamed Contact"}
</h1>
<p className="text-muted-foreground">{contact.ethereumAddress}</p>
</div>
<div className="flex gap-2">
<Button variant="outline">Edit Contact</Button>
<Button>Add Note</Button>
</div>
</div>
<div className="grid grid-cols-1 lg:grid-cols-3 gap-6">
<div className="lg:col-span-1">
<Card>
<CardHeader>
<CardTitle>Contact Information</CardTitle>
</CardHeader>
<CardContent className="space-y-4">
<div>
<h3 className="text-sm font-medium text-muted-foreground">Name</h3>
<p>{contact.name || "-"}</p>
</div>
<div>
<h3 className="text-sm font-medium text-muted-foreground">ENS Name</h3>
<p>{contact.ensName || "-"}</p>
</div>
<div>
<h3 className="text-sm font-medium text-muted-foreground">Ethereum Address</h3>
<p className="break-all">{contact.ethereumAddress}</p>
</div>
{contact.ethereumAddress2 && (
<div>
<h3 className="text-sm font-medium text-muted-foreground">Secondary Ethereum Address</h3>
<p className="break-all">{contact.ethereumAddress2}</p>
</div>
)}
<div>
<h3 className="text-sm font-medium text-muted-foreground">Email</h3>
<p>{contact.email || "-"}</p>
</div>
<div className="grid grid-cols-2 gap-4">
<div>
<h3 className="text-sm font-medium text-muted-foreground">Twitter</h3>
<p>{contact.twitter || "-"}</p>
</div>
<div>
<h3 className="text-sm font-medium text-muted-foreground">Discord</h3>
<p>{contact.discord || "-"}</p>
</div>
</div>
<div className="grid grid-cols-2 gap-4">
<div>
<h3 className="text-sm font-medium text-muted-foreground">Telegram</h3>
<p>{contact.telegram || "-"}</p>
</div>
<div>
<h3 className="text-sm font-medium text-muted-foreground">Farcaster</h3>
<p>{contact.farcaster || "-"}</p>
</div>
</div>
{contact.tags.length > 0 && (
<div>
<h3 className="text-sm font-medium text-muted-foreground mb-2">Tags</h3>
<div className="flex flex-wrap gap-2">
{contact.tags.map((tagItem) => (
<Badge key={tagItem.tagId} variant="outline">
{tagItem.tag.name}
</Badge>
))}
</div>
</div>
)}
<div>
<h3 className="text-sm font-medium text-muted-foreground">Added On</h3>
<p>{new Date(contact.createdAt).toLocaleDateString()}</p>
</div>
</CardContent>
</Card>
</div>
<div className="lg:col-span-2">
<Tabs defaultValue="nft">
<TabsList className="mb-4">
<TabsTrigger value="nft">
NFT Holdings ({contact.nftHoldings.length})
</TabsTrigger>
<TabsTrigger value="dao">
DAO Memberships ({contact.daoMemberships.length})
</TabsTrigger>
<TabsTrigger value="tokens">
Token Holdings ({contact.tokenHoldings.length})
</TabsTrigger>
<TabsTrigger value="notes">
Notes ({contact.notes.length})
</TabsTrigger>
</TabsList>
<TabsContent value="nft" className="space-y-4">
{contact.nftHoldings.length > 0 ? (
<div className="rounded-md border">
<table className="w-full">
<thead>
<tr className="border-b bg-muted/50">
<th className="p-2 text-left font-medium">Collection</th>
<th className="p-2 text-left font-medium">Token ID</th>
<th className="p-2 text-left font-medium">Contract Address</th>
</tr>
</thead>
<tbody>
{contact.nftHoldings.map((nft) => (
<tr key={nft.id} className="border-b">
<td className="p-2">{nft.collectionName || "Unknown Collection"}</td>
<td className="p-2">{nft.tokenId}</td>
<td className="p-2">
{nft.contractAddress.substring(0, 6)}...
{nft.contractAddress.substring(nft.contractAddress.length - 4)}
</td>
</tr>
))}
</tbody>
</table>
</div>
) : (
<div className="text-center py-8 text-muted-foreground">
No NFT holdings found.
</div>
)}
</TabsContent>
<TabsContent value="dao" className="space-y-4">
{contact.daoMemberships.length > 0 ? (
<div className="rounded-md border">
<table className="w-full">
<thead>
<tr className="border-b bg-muted/50">
<th className="p-2 text-left font-medium">DAO Name</th>
<th className="p-2 text-left font-medium">DAO Type</th>
<th className="p-2 text-left font-medium">Joined Date</th>
</tr>
</thead>
<tbody>
{contact.daoMemberships.map((dao) => (
<tr key={dao.id} className="border-b">
<td className="p-2">{dao.daoName}</td>
<td className="p-2">{dao.daoType}</td>
<td className="p-2">
{dao.joinedAt
? new Date(dao.joinedAt).toLocaleDateString()
: "Unknown"}
</td>
</tr>
))}
</tbody>
</table>
</div>
) : (
<div className="text-center py-8 text-muted-foreground">
No DAO memberships found.
</div>
)}
</TabsContent>
<TabsContent value="tokens" className="space-y-4">
{contact.tokenHoldings.length > 0 ? (
<div className="rounded-md border">
<table className="w-full">
<thead>
<tr className="border-b bg-muted/50">
<th className="p-2 text-left font-medium">Token</th>
<th className="p-2 text-left font-medium">Balance</th>
<th className="p-2 text-left font-medium">Last Updated</th>
</tr>
</thead>
<tbody>
{contact.tokenHoldings.map((token) => (
<tr key={token.id} className="border-b">
<td className="p-2">{token.tokenSymbol || "Unknown Token"}</td>
<td className="p-2">{token.balance}</td>
<td className="p-2">
{new Date(token.lastUpdated).toLocaleDateString()}
</td>
</tr>
))}
</tbody>
</table>
</div>
) : (
<div className="text-center py-8 text-muted-foreground">
No token holdings found.
</div>
)}
</TabsContent>
<TabsContent value="notes" className="space-y-4">
{contact.notes.length > 0 ? (
<div className="space-y-4">
{contact.notes.map((note) => (
<Card key={note.id}>
<CardHeader>
<CardTitle className="text-sm font-medium">
{new Date(note.createdAt).toLocaleDateString()}
</CardTitle>
</CardHeader>
<CardContent>
<p>{note.content}</p>
</CardContent>
</Card>
))}
</div>
) : (
<div className="text-center py-8 text-muted-foreground">
No notes found.
</div>
)}
</TabsContent>
</Tabs>
</div>
</div>
</main>
<footer className="w-full border-t py-6">
<div className="container flex flex-col items-center justify-between gap-4 md:flex-row">
<p className="text-center text-sm leading-loose text-muted-foreground md:text-left">
© 2023 Farcastle. All rights reserved.
</p>
</div>
</footer>
</div>
);
}

src/app/contacts/page.tsx Normal file
@@ -0,0 +1,105 @@
import { Metadata } from "next";
import Link from "next/link";
import { getUser } from "@/lib/auth";
import { prisma } from "@/lib/prisma";
import { Button } from "@/components/ui/button";
import { Input } from "@/components/ui/input";
import { LogoutButton } from "@/components/auth/logout-button";
import { ContactsList } from "@/components/contacts/contacts-list";
export const metadata: Metadata = {
title: "Contacts - Stones Database",
description: "Manage contacts in the Stones Database",
};
interface ContactsPageProps {
searchParams: { [key: string]: string | string[] | undefined };
}
export default async function ContactsPage({ searchParams }: ContactsPageProps) {
const user = await getUser();
const page = Number(searchParams.page) || 1;
const limit = 25;
const skip = (page - 1) * limit;
// Get contacts with pagination
const contacts = await prisma.contact.findMany({
skip,
take: limit,
orderBy: {
createdAt: "desc",
},
include: {
nftHoldings: {
take: 1,
},
daoMemberships: {
take: 1,
},
},
});
// Get total count for pagination
const totalContacts = await prisma.contact.count();
const totalPages = Math.ceil(totalContacts / limit);
return (
<div className="flex min-h-screen flex-col">
<header className="sticky top-0 z-50 w-full border-b bg-background/95 backdrop-blur supports-[backdrop-filter]:bg-background/60">
<div className="container flex h-14 items-center justify-between">
<div className="mr-4 flex">
<Link href="/" className="mr-6 flex items-center space-x-2">
<span className="font-bold">Stones Database</span>
</Link>
</div>
<nav className="flex items-center space-x-4">
<Link href="/contacts" className="text-sm font-medium">
Contacts
</Link>
<Link href="/dashboard" className="text-sm font-medium">
Dashboard
</Link>
{user && (
<div className="flex items-center gap-4">
<span className="text-sm text-muted-foreground">
Hello, {user.name}
</span>
<LogoutButton />
</div>
)}
</nav>
</div>
</header>
<main className="flex-1 container py-6">
<div className="flex items-center justify-between mb-6">
<h1 className="text-3xl font-bold">Contacts</h1>
<div className="flex gap-4">
<div className="relative w-full md:w-60">
<Input
type="search"
placeholder="Search contacts..."
className="pr-8"
/>
</div>
<Button>
Add Contact
</Button>
</div>
</div>
<ContactsList
contacts={contacts}
currentPage={page}
totalPages={totalPages}
/>
</main>
<footer className="w-full border-t py-6">
<div className="container flex flex-col items-center justify-between gap-4 md:flex-row">
<p className="text-center text-sm leading-loose text-muted-foreground md:text-left">
© 2023 Farcastle. All rights reserved.
</p>
</div>
</footer>
</div>
);
}
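The `skip`/`take` arithmetic in the page above is plain offset pagination; sketched standalone, assuming the same 25-per-page limit:

```python
import math

def page_window(page: int, total_items: int, limit: int = 25):
    """Return (skip, take, total_pages) for offset pagination.

    Pages are 1-indexed; values below 1 are clamped to the first page,
    matching `Number(searchParams.page) || 1` in the page component.
    """
    page = max(page, 1)
    skip = (page - 1) * limit
    total_pages = math.ceil(total_items / limit) if total_items else 0
    return skip, limit, total_pages
```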

src/app/dashboard/page.tsx Normal file
@@ -0,0 +1,150 @@
import { Metadata } from "next";
import Link from "next/link";
import { getUser } from "@/lib/auth";
import { Button } from "@/components/ui/button";
import { Card, CardContent, CardDescription, CardFooter, CardHeader, CardTitle } from "@/components/ui/card";
import { prisma } from "@/lib/prisma";
import { LogoutButton } from "@/components/auth/logout-button";
export const metadata: Metadata = {
title: "Dashboard - Stones Database",
description: "Dashboard for Stones Database",
};
export default async function DashboardPage() {
const user = await getUser();
// Get counts from database
const contactCount = await prisma.contact.count();
const nftHoldingCount = await prisma.nftHolding.count();
const daoMembershipCount = await prisma.daoMembership.count();
const tokenHoldingCount = await prisma.tokenHolding.count();
return (
<div className="flex min-h-screen flex-col">
<header className="sticky top-0 z-50 w-full border-b bg-background/95 backdrop-blur supports-[backdrop-filter]:bg-background/60">
<div className="container flex h-14 items-center justify-between">
<div className="mr-4 flex">
<Link href="/" className="mr-6 flex items-center space-x-2">
<span className="font-bold">Stones Database</span>
</Link>
</div>
<nav className="flex items-center space-x-4">
<Link href="/contacts" className="text-sm font-medium">
Contacts
</Link>
<Link href="/dashboard" className="text-sm font-medium">
Dashboard
</Link>
{user && (
<div className="flex items-center gap-4">
<span className="text-sm text-muted-foreground">
Hello, {user.name}
</span>
<LogoutButton />
</div>
)}
</nav>
</div>
</header>
<main className="flex-1 container py-6">
<h1 className="text-3xl font-bold mb-6">Dashboard</h1>
<div className="grid grid-cols-1 gap-6 md:grid-cols-2 lg:grid-cols-4">
<Card>
<CardHeader className="pb-2">
<CardTitle className="text-sm font-medium">Total Contacts</CardTitle>
</CardHeader>
<CardContent>
<div className="text-2xl font-bold">{contactCount}</div>
</CardContent>
<CardFooter>
<Button asChild variant="ghost" size="sm" className="w-full">
<Link href="/contacts">View All Contacts</Link>
</Button>
</CardFooter>
</Card>
<Card>
<CardHeader className="pb-2">
<CardTitle className="text-sm font-medium">NFT Holdings</CardTitle>
</CardHeader>
<CardContent>
<div className="text-2xl font-bold">{nftHoldingCount}</div>
</CardContent>
<CardFooter>
<Button asChild variant="ghost" size="sm" className="w-full">
<Link href="/nft-holdings">View NFT Holdings</Link>
</Button>
</CardFooter>
</Card>
<Card>
<CardHeader className="pb-2">
<CardTitle className="text-sm font-medium">DAO Memberships</CardTitle>
</CardHeader>
<CardContent>
<div className="text-2xl font-bold">{daoMembershipCount}</div>
</CardContent>
<CardFooter>
<Button asChild variant="ghost" size="sm" className="w-full">
<Link href="/dao-memberships">View DAO Memberships</Link>
</Button>
</CardFooter>
</Card>
<Card>
<CardHeader className="pb-2">
<CardTitle className="text-sm font-medium">Token Holdings</CardTitle>
</CardHeader>
<CardContent>
<div className="text-2xl font-bold">{tokenHoldingCount}</div>
</CardContent>
<CardFooter>
<Button asChild variant="ghost" size="sm" className="w-full">
<Link href="/token-holdings">View Token Holdings</Link>
</Button>
</CardFooter>
</Card>
</div>
<div className="mt-8">
<h2 className="text-xl font-bold mb-4">Quick Actions</h2>
<div className="grid grid-cols-1 gap-4 md:grid-cols-2 lg:grid-cols-3">
<Button asChild variant="outline" className="h-auto py-4">
<Link href="/contacts">
<div className="flex flex-col items-start">
<span className="font-medium">Browse Contacts</span>
<span className="text-sm text-muted-foreground">View and manage all contacts</span>
</div>
</Link>
</Button>
<Button asChild variant="outline" className="h-auto py-4">
<Link href="/contacts/search">
<div className="flex flex-col items-start">
<span className="font-medium">Search Contacts</span>
<span className="text-sm text-muted-foreground">Find specific contacts</span>
</div>
</Link>
</Button>
<Button asChild variant="outline" className="h-auto py-4">
<Link href="/contacts/export">
<div className="flex flex-col items-start">
<span className="font-medium">Export Data</span>
<span className="text-sm text-muted-foreground">Download contact data</span>
</div>
</Link>
</Button>
</div>
</div>
</main>
<footer className="w-full border-t py-6">
<div className="container flex flex-col items-center justify-between gap-4 md:flex-row">
<p className="text-center text-sm leading-loose text-muted-foreground md:text-left">
© 2023 Farcastle. All rights reserved.
</p>
</div>
</footer>
</div>
);
}

@@ -1,8 +1,6 @@
import type { Metadata } from "next";
import { Inter } from "next/font/google";
import "./globals.css";
import { ThemeProvider } from "@/components/theme-provider";
import { Toaster } from "@/components/ui/toaster";
const inter = Inter({ subsets: ["latin"] });
@@ -17,17 +15,9 @@ export default function RootLayout({
children: React.ReactNode;
}>) {
return (
<html lang="en" suppressHydrationWarning>
<html lang="en">
<body className={inter.className}>
<ThemeProvider
attribute="class"
defaultTheme="dark"
enableSystem
disableTransitionOnChange
>
{children}
<Toaster />
</ThemeProvider>
{children}
</body>
</html>
);

src/app/login/page.tsx Normal file
@@ -0,0 +1,25 @@
import { Metadata } from "next";
import { LoginForm } from "@/components/auth/login-form";
export const metadata: Metadata = {
title: "Login - Stones Database",
description: "Login to the Stones Database",
};
export default function LoginPage() {
return (
<div className="container flex h-screen w-screen flex-col items-center justify-center">
<div className="mx-auto flex w-full flex-col justify-center space-y-6 sm:w-[350px]">
<div className="flex flex-col space-y-2 text-center">
<h1 className="text-2xl font-semibold tracking-tight">
Login to Stones Database
</h1>
<p className="text-sm text-gray-500">
Enter your credentials to access the database
</p>
</div>
<LoginForm />
</div>
</div>
);
}

@@ -1,7 +1,5 @@
import { Metadata } from "next";
import Link from "next/link";
import { Button } from "@/components/ui/button";
import { Card, CardContent, CardDescription, CardFooter, CardHeader, CardTitle } from "@/components/ui/card";
export const metadata: Metadata = {
title: "Stones Database",
@@ -11,103 +9,85 @@ export const metadata: Metadata = {
export default function Home() {
return (
<div className="flex min-h-screen flex-col">
<header className="sticky top-0 z-50 w-full border-b bg-background/95 backdrop-blur supports-[backdrop-filter]:bg-background/60">
<header className="sticky top-0 z-50 w-full border-b">
<div className="container flex h-14 items-center">
<div className="mr-4 flex">
<Link href="/" className="mr-6 flex items-center space-x-2">
<span className="font-bold">Stones Database</span>
</Link>
</div>
<nav className="flex flex-1 items-center justify-between space-x-2 md:justify-end">
<div className="w-full flex-1 md:w-auto md:flex-none">
<Button asChild variant="outline">
<Link href="/dashboard">Dashboard</Link>
</Button>
<nav className="flex flex-1 items-center justify-between space-x-2">
<div className="w-full flex-1">
<Link href="/dashboard" className="px-4 py-2 border rounded">
Dashboard
</Link>
</div>
</nav>
</div>
</header>
<main className="flex-1">
<section className="w-full py-12 md:py-24 lg:py-32 xl:py-48">
<div className="container px-4 md:px-6">
<section className="w-full py-12">
<div className="container px-4">
<div className="flex flex-col items-center space-y-4 text-center">
<div className="space-y-2">
<h1 className="text-3xl font-bold tracking-tighter sm:text-4xl md:text-5xl lg:text-6xl/none">
<h1 className="text-3xl font-bold">
Farcastle $Stones Database
</h1>
<p className="mx-auto max-w-[700px] text-gray-500 md:text-xl dark:text-gray-400">
<p className="mx-auto max-w-[700px] text-gray-500">
A comprehensive database of Ethereum addresses and contact information for the Farcastle $Stones token launch.
</p>
</div>
<div className="space-x-4">
<Button asChild>
<Link href="/dashboard">View Dashboard</Link>
</Button>
<Button variant="outline" asChild>
<Link href="/contacts">Browse Contacts</Link>
</Button>
<Link href="/dashboard" className="px-4 py-2 bg-blue-500 text-white rounded">
View Dashboard
</Link>
<Link href="/contacts" className="px-4 py-2 border rounded">
Browse Contacts
</Link>
</div>
</div>
</div>
</section>
<section className="w-full py-12 md:py-24 lg:py-32 bg-muted">
<div className="container px-4 md:px-6">
<div className="mx-auto grid max-w-5xl items-center gap-6 py-12 lg:grid-cols-3">
<Card>
<CardHeader>
<CardTitle>NFT Holders</CardTitle>
<CardDescription>
Track holders of specific NFT collections
</CardDescription>
</CardHeader>
<CardContent>
<section className="w-full py-12 bg-gray-100">
<div className="container px-4">
<div className="mx-auto grid max-w-5xl items-center gap-6 py-12 grid-cols-1 md:grid-cols-3">
<div className="border rounded p-4">
<h3 className="text-xl font-bold">NFT Holders</h3>
<p className="text-sm text-gray-500">Track holders of specific NFT collections</p>
<div className="mt-4">
<p>Automatically collect Ethereum addresses of NFT holders and resolve their ENS names.</p>
</CardContent>
<CardFooter>
<Button variant="outline" className="w-full">
View NFT Data
</Button>
</CardFooter>
</Card>
<Card>
<CardHeader>
<CardTitle>Token Holders</CardTitle>
<CardDescription>
Track holders of ERC20 tokens
</CardDescription>
</CardHeader>
<CardContent>
</div>
<div className="mt-4">
<a href="#" className="px-4 py-2 border rounded block text-center">View NFT Data</a>
</div>
</div>
<div className="border rounded p-4">
<h3 className="text-xl font-bold">Token Holders</h3>
<p className="text-sm text-gray-500">Track holders of ERC20 tokens</p>
<div className="mt-4">
<p>Collect data on ERC20 token holders, including balance information and transaction history.</p>
</CardContent>
<CardFooter>
<Button variant="outline" className="w-full">
View Token Data
</Button>
</CardFooter>
</Card>
<Card>
<CardHeader>
<CardTitle>DAO Members</CardTitle>
<CardDescription>
Track members of Moloch DAOs
</CardDescription>
</CardHeader>
<CardContent>
</div>
<div className="mt-4">
<a href="#" className="px-4 py-2 border rounded block text-center">View Token Data</a>
</div>
</div>
<div className="border rounded p-4">
<h3 className="text-xl font-bold">DAO Members</h3>
<p className="text-sm text-gray-500">Track members of Moloch DAOs</p>
<div className="mt-4">
<p>Collect information on members of Moloch DAOs such as Raid Guild, DAOhaus, and Metacartel.</p>
</CardContent>
<CardFooter>
<Button variant="outline" className="w-full">
View DAO Data
</Button>
</CardFooter>
</Card>
</div>
<div className="mt-4">
<a href="#" className="px-4 py-2 border rounded block text-center">View DAO Data</a>
</div>
</div>
</div>
</div>
</section>
</main>
<footer className="w-full border-t py-6">
<div className="container flex flex-col items-center justify-between gap-4 md:flex-row">
<p className="text-center text-sm leading-loose text-muted-foreground md:text-left">
<div className="container flex flex-col items-center justify-between gap-4">
<p className="text-center text-sm text-gray-500">
© 2023 Farcastle. All rights reserved.
</p>
</div>

@@ -0,0 +1,100 @@
"use client";
import { useState } from "react";
import { useRouter } from "next/navigation";
export function LoginForm() {
const router = useRouter();
const [isLoading, setIsLoading] = useState(false);
const [username, setUsername] = useState("");
const [password, setPassword] = useState("");
const [error, setError] = useState("");
async function onSubmit(event: React.FormEvent) {
event.preventDefault();
setIsLoading(true);
setError("");
try {
const response = await fetch("/api/auth/login", {
method: "POST",
headers: {
"Content-Type": "application/json",
},
body: JSON.stringify({
username,
password,
}),
});
if (!response.ok) {
throw new Error("Login failed");
}
const data = await response.json();
if (data.success) {
// Redirect to the dashboard
router.push("/dashboard");
router.refresh();
} else {
setError("Authentication failed. Please check your credentials and try again.");
}
} catch (error) {
setError("Something went wrong. Please try again.");
} finally {
setIsLoading(false);
}
}
return (
<div className="grid gap-6">
<form onSubmit={onSubmit}>
<div className="grid gap-4">
<div className="grid gap-2">
<label htmlFor="username" className="text-sm font-medium">
Username
</label>
<input
id="username"
placeholder="Username"
type="text"
className="px-3 py-2 border rounded-md"
value={username}
onChange={(e) => setUsername(e.target.value)}
autoCapitalize="none"
autoCorrect="off"
disabled={isLoading}
/>
</div>
<div className="grid gap-2">
<label htmlFor="password" className="text-sm font-medium">
Password
</label>
<input
id="password"
placeholder="Password"
type="password"
className="px-3 py-2 border rounded-md"
value={password}
onChange={(e) => setPassword(e.target.value)}
disabled={isLoading}
/>
</div>
{error && (
<div className="text-red-500 text-sm">
{error}
</div>
)}
<button
type="submit"
className="px-4 py-2 bg-blue-500 text-white rounded-md hover:bg-blue-600 disabled:opacity-50"
disabled={isLoading}
>
{isLoading ? "Signing in..." : "Sign In"}
</button>
</div>
</form>
</div>
);
}

@@ -0,0 +1,51 @@
"use client";
import { useState } from "react";
import { useRouter } from "next/navigation";
import { Button } from "@/components/ui/button";
import { toast } from "@/components/ui/use-toast";
export function LogoutButton() {
const router = useRouter();
const [isLoading, setIsLoading] = useState(false);
async function handleLogout() {
setIsLoading(true);
try {
const response = await fetch("/api/auth/logout", {
method: "POST",
headers: {
"Content-Type": "application/json",
},
});
if (!response.ok) {
throw new Error("Logout failed");
}
// Redirect to the login page
router.push("/login");
router.refresh();
} catch (error) {
toast({
title: "Logout error",
description: "Something went wrong. Please try again.",
variant: "destructive",
});
} finally {
setIsLoading(false);
}
}
return (
<Button
variant="outline"
size="sm"
onClick={handleLogout}
disabled={isLoading}
>
{isLoading ? "Signing out..." : "Sign Out"}
</Button>
);
}

@@ -0,0 +1,126 @@
import Link from "next/link";
import { Contact, NftHolding, DaoMembership } from "@prisma/client";
import { Button } from "@/components/ui/button";
import {
Table,
TableBody,
TableCell,
TableHead,
TableHeader,
TableRow,
} from "@/components/ui/table";
import { Badge } from "@/components/ui/badge";
interface ContactsListProps {
contacts: (Contact & {
nftHoldings: NftHolding[];
daoMemberships: DaoMembership[];
})[];
currentPage: number;
totalPages: number;
}
export function ContactsList({
contacts,
currentPage,
totalPages,
}: ContactsListProps) {
return (
<div className="space-y-4">
<div className="rounded-md border">
<Table>
<TableHeader>
<TableRow>
<TableHead>Name</TableHead>
<TableHead>Ethereum Address</TableHead>
<TableHead>ENS</TableHead>
<TableHead>Sources</TableHead>
<TableHead>Actions</TableHead>
</TableRow>
</TableHeader>
<TableBody>
{contacts.map((contact) => (
<TableRow key={contact.id}>
<TableCell className="font-medium">
{contact.name || "Unknown"}
</TableCell>
<TableCell>
{contact.ethereumAddress.substring(0, 6)}...
{contact.ethereumAddress.substring(contact.ethereumAddress.length - 4)}
</TableCell>
<TableCell>{contact.ensName || "-"}</TableCell>
<TableCell>
<div className="flex gap-2">
{contact.nftHoldings.length > 0 && (
<Badge variant="outline">NFT</Badge>
)}
{contact.daoMemberships.length > 0 && (
<Badge variant="outline">DAO</Badge>
)}
</div>
</TableCell>
<TableCell>
<Link href={`/contacts/${contact.id}`}>
<Button variant="ghost" size="sm">
View
</Button>
</Link>
</TableCell>
</TableRow>
))}
{contacts.length === 0 && (
<TableRow>
<TableCell colSpan={5} className="text-center h-24">
No contacts found.
</TableCell>
</TableRow>
)}
</TableBody>
</Table>
</div>
{/* Pagination */}
<div className="flex items-center justify-between">
<div className="text-sm text-muted-foreground">
Showing{" "}
<strong>
{contacts.length > 0
? (currentPage - 1) * 25 + 1
: 0}
</strong>{" "}
to{" "}
<strong>
{(currentPage - 1) * 25 + contacts.length}
</strong>{" "}
of <strong>{totalPages * 25}</strong> contacts{/* upper bound: assumes a full last page */}
</div>
<div className="flex gap-2">
<Link
href={`/contacts?page=${currentPage - 1}`}
aria-disabled={currentPage <= 1}
>
<Button
variant="outline"
size="sm"
disabled={currentPage <= 1}
>
Previous
</Button>
</Link>
<Link
href={`/contacts?page=${currentPage + 1}`}
aria-disabled={currentPage >= totalPages}
>
<Button
variant="outline"
size="sm"
disabled={currentPage >= totalPages}
>
Next
</Button>
</Link>
</div>
</div>
</div>
);
}
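The "Showing X to Y" line above hard-codes a page size of 25 and repeats the offset arithmetic inline. The same math can be isolated in a small pure helper for reuse and testing (a sketch — `pageRange` is a hypothetical name, not part of the source):

```typescript
// Hypothetical helper mirroring the pagination math used in ContactsList.
// Assumes the same fixed page size of 25 as the component.
const PAGE_SIZE = 25;

export function pageRange(currentPage: number, itemsOnPage: number) {
  // First item index shown on this page (1-based); 0 when the page is empty.
  const from = itemsOnPage > 0 ? (currentPage - 1) * PAGE_SIZE + 1 : 0;
  // Last item index shown on this page.
  const to = (currentPage - 1) * PAGE_SIZE + itemsOnPage;
  return { from, to };
}
```

With a helper like this, a partially filled last page (e.g. page 2 with 10 rows) reports "26 to 35" correctly.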


@@ -1,8 +1,9 @@
"use client";
import * as React from "react";
import { ThemeProvider as NextThemesProvider } from "next-themes";
import { type ThemeProviderProps } from "next-themes/dist/types";
export function ThemeProvider({ children, ...props }: ThemeProviderProps) {
return <NextThemesProvider {...props}>{children}</NextThemesProvider>;
}


@@ -0,0 +1,25 @@
import * as React from "react"
export interface BadgeProps extends React.HTMLAttributes<HTMLDivElement> {
variant?: "default" | "secondary" | "outline" | "destructive"
}
export function Badge({
className,
variant = "default",
...props
}: BadgeProps) {
const variantStyles = {
default: "bg-blue-100 text-blue-800",
secondary: "bg-gray-100 text-gray-800",
outline: "border border-gray-200 text-gray-800",
destructive: "bg-red-100 text-red-800"
}
const baseStyle = "inline-flex items-center rounded-md px-2.5 py-0.5 text-xs font-medium"
const variantStyle = variantStyles[variant] || variantStyles.default
return (
<div className={`${baseStyle} ${variantStyle} ${className || ""}`} {...props} />
)
}


@@ -1,56 +1,46 @@
"use client"
import * as React from "react";
import { Slot } from "@radix-ui/react-slot";
import { cva, type VariantProps } from "class-variance-authority";
import { cn } from "@/lib/utils";
const buttonVariants = cva(
"inline-flex items-center justify-center whitespace-nowrap rounded-md text-sm font-medium ring-offset-background transition-colors focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2 disabled:pointer-events-none disabled:opacity-50",
{
variants: {
variant: {
default: "bg-primary text-primary-foreground hover:bg-primary/90",
destructive:
"bg-destructive text-destructive-foreground hover:bg-destructive/90",
outline:
"border border-input bg-background hover:bg-accent hover:text-accent-foreground",
secondary:
"bg-secondary text-secondary-foreground hover:bg-secondary/80",
ghost: "hover:bg-accent hover:text-accent-foreground",
link: "text-primary underline-offset-4 hover:underline",
},
size: {
default: "h-10 px-4 py-2",
sm: "h-9 rounded-md px-3",
lg: "h-11 rounded-md px-8",
icon: "h-10 w-10",
},
},
defaultVariants: {
variant: "default",
size: "default",
},
}
);
export interface ButtonProps
extends React.ButtonHTMLAttributes<HTMLButtonElement>,
VariantProps<typeof buttonVariants> {
asChild?: boolean;
extends React.ButtonHTMLAttributes<HTMLButtonElement> {
variant?: "default" | "outline" | "secondary" | "ghost" | "link" | "destructive"
size?: "default" | "sm" | "lg" | "icon"
}
const Button = React.forwardRef<HTMLButtonElement, ButtonProps>(
({ className, variant, size, asChild = false, ...props }, ref) => {
const Comp = asChild ? Slot : "button";
({ className, variant = "default", size = "default", ...props }, ref) => {
const baseStyles = "inline-flex items-center justify-center rounded-md font-medium transition-colors focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-blue-500 focus-visible:ring-offset-2 disabled:opacity-50 disabled:pointer-events-none"
const variantStyles = {
default: "bg-blue-500 text-white hover:bg-blue-600",
outline: "border border-gray-300 hover:bg-gray-100",
secondary: "bg-gray-100 text-gray-900 hover:bg-gray-200",
ghost: "hover:bg-gray-100 hover:text-gray-900",
link: "text-blue-500 underline-offset-4 hover:underline",
destructive: "bg-red-500 text-white hover:bg-red-600"
}
const sizeStyles = {
default: "h-10 py-2 px-4",
sm: "h-9 px-3 rounded-md",
lg: "h-11 px-8 rounded-md",
icon: "h-10 w-10"
}
const variantStyle = variantStyles[variant] || variantStyles.default
const sizeStyle = sizeStyles[size] || sizeStyles.default
const buttonClass = `${baseStyles} ${variantStyle} ${sizeStyle} ${className || ""}`
return (
<Comp
className={cn(buttonVariants({ variant, size, className }))}
<button
className={buttonClass}
ref={ref}
{...props}
/>
);
)
}
);
Button.displayName = "Button";
)
Button.displayName = "Button"
export { Button, buttonVariants };
export { Button }


@@ -2,78 +2,58 @@ import * as React from "react";
import { cn } from "@/lib/utils";
const Card = React.forwardRef<
HTMLDivElement,
React.HTMLAttributes<HTMLDivElement>
>(({ className, ...props }, ref) => (
<div
ref={ref}
className={cn(
"rounded-lg border bg-card text-card-foreground shadow-sm",
className
)}
{...props}
/>
));
Card.displayName = "Card";
export interface CardProps extends React.HTMLAttributes<HTMLDivElement> {}
const CardHeader = React.forwardRef<
HTMLDivElement,
React.HTMLAttributes<HTMLDivElement>
>(({ className, ...props }, ref) => (
<div
ref={ref}
className={cn("flex flex-col space-y-1.5 p-6", className)}
{...props}
/>
));
CardHeader.displayName = "CardHeader";
export function Card({ className, ...props }: CardProps) {
return (
<div
className={`rounded-lg border bg-white text-gray-950 shadow-sm ${className || ""}`}
{...props}
/>
);
}
const CardTitle = React.forwardRef<
HTMLParagraphElement,
React.HTMLAttributes<HTMLHeadingElement>
>(({ className, ...props }, ref) => (
<h3
ref={ref}
className={cn(
"text-2xl font-semibold leading-none tracking-tight",
className
)}
{...props}
/>
));
CardTitle.displayName = "CardTitle";
export interface CardHeaderProps extends React.HTMLAttributes<HTMLDivElement> {}
const CardDescription = React.forwardRef<
HTMLParagraphElement,
React.HTMLAttributes<HTMLParagraphElement>
>(({ className, ...props }, ref) => (
<p
ref={ref}
className={cn("text-sm text-muted-foreground", className)}
{...props}
/>
));
CardDescription.displayName = "CardDescription";
export function CardHeader({ className, ...props }: CardHeaderProps) {
return <div className={`flex flex-col space-y-1.5 p-6 ${className || ""}`} {...props} />;
}
const CardContent = React.forwardRef<
HTMLDivElement,
React.HTMLAttributes<HTMLDivElement>
>(({ className, ...props }, ref) => (
<div ref={ref} className={cn("p-6 pt-0", className)} {...props} />
));
CardContent.displayName = "CardContent";
export interface CardTitleProps extends React.HTMLAttributes<HTMLHeadingElement> {}
const CardFooter = React.forwardRef<
HTMLDivElement,
React.HTMLAttributes<HTMLDivElement>
>(({ className, ...props }, ref) => (
<div
ref={ref}
className={cn("flex items-center p-6 pt-0", className)}
{...props}
/>
));
CardFooter.displayName = "CardFooter";
export function CardTitle({ className, ...props }: CardTitleProps) {
return (
<h3
className={`text-2xl font-semibold leading-none tracking-tight ${className || ""}`}
{...props}
/>
);
}
export { Card, CardHeader, CardFooter, CardTitle, CardDescription, CardContent };
export interface CardDescriptionProps extends React.HTMLAttributes<HTMLParagraphElement> {}
export function CardDescription({ className, ...props }: CardDescriptionProps) {
return (
<p
className={`text-sm text-gray-500 ${className || ""}`}
{...props}
/>
);
}
export interface CardContentProps extends React.HTMLAttributes<HTMLDivElement> {}
export function CardContent({ className, ...props }: CardContentProps) {
return <div className={`p-6 pt-0 ${className || ""}`} {...props} />;
}
export interface CardFooterProps extends React.HTMLAttributes<HTMLDivElement> {}
export function CardFooter({ className, ...props }: CardFooterProps) {
return (
<div
className={`flex items-center p-6 pt-0 ${className || ""}`}
{...props}
/>
);
}


@@ -0,0 +1,22 @@
"use client"
import * as React from "react"
export interface InputProps
extends React.InputHTMLAttributes<HTMLInputElement> {}
const Input = React.forwardRef<HTMLInputElement, InputProps>(
({ className, type, ...props }, ref) => {
return (
<input
type={type}
className={`flex h-10 w-full rounded-md border border-gray-300 bg-white px-3 py-2 text-sm placeholder:text-gray-400 focus:outline-none focus:ring-2 focus:ring-blue-500 focus:border-transparent disabled:cursor-not-allowed disabled:opacity-50 ${className || ""}`}
ref={ref}
{...props}
/>
)
}
)
Input.displayName = "Input"
export { Input }

106
src/components/ui/table.tsx Normal file

@@ -0,0 +1,106 @@
import * as React from "react"
const Table = React.forwardRef<
HTMLTableElement,
React.HTMLAttributes<HTMLTableElement>
>(({ className, ...props }, ref) => (
<div className="relative w-full overflow-auto">
<table
ref={ref}
className={`w-full caption-bottom text-sm ${className || ""}`}
{...props}
/>
</div>
))
Table.displayName = "Table"
const TableHeader = React.forwardRef<
HTMLTableSectionElement,
React.HTMLAttributes<HTMLTableSectionElement>
>(({ className, ...props }, ref) => (
<thead ref={ref} className={`[&_tr]:border-b ${className || ""}`} {...props} />
))
TableHeader.displayName = "TableHeader"
const TableBody = React.forwardRef<
HTMLTableSectionElement,
React.HTMLAttributes<HTMLTableSectionElement>
>(({ className, ...props }, ref) => (
<tbody
ref={ref}
className={`[&_tr:last-child]:border-0 ${className || ""}`}
{...props}
/>
))
TableBody.displayName = "TableBody"
const TableFooter = React.forwardRef<
HTMLTableSectionElement,
React.HTMLAttributes<HTMLTableSectionElement>
>(({ className, ...props }, ref) => (
<tfoot
ref={ref}
className={`border-t bg-gray-100/50 font-medium [&>tr]:last:border-b-0 ${className || ""}`}
{...props}
/>
))
TableFooter.displayName = "TableFooter"
const TableRow = React.forwardRef<
HTMLTableRowElement,
React.HTMLAttributes<HTMLTableRowElement>
>(({ className, ...props }, ref) => (
<tr
ref={ref}
className={`border-b transition-colors hover:bg-gray-100/50 data-[state=selected]:bg-gray-100 ${className || ""}`}
{...props}
/>
))
TableRow.displayName = "TableRow"
const TableHead = React.forwardRef<
HTMLTableCellElement,
React.ThHTMLAttributes<HTMLTableCellElement>
>(({ className, ...props }, ref) => (
<th
ref={ref}
className={`h-12 px-4 text-left align-middle font-medium text-gray-500 [&:has([role=checkbox])]:pr-0 ${className || ""}`}
{...props}
/>
))
TableHead.displayName = "TableHead"
const TableCell = React.forwardRef<
HTMLTableCellElement,
React.TdHTMLAttributes<HTMLTableCellElement>
>(({ className, ...props }, ref) => (
<td
ref={ref}
className={`p-4 align-middle [&:has([role=checkbox])]:pr-0 ${className || ""}`}
{...props}
/>
))
TableCell.displayName = "TableCell"
const TableCaption = React.forwardRef<
HTMLTableCaptionElement,
React.HTMLAttributes<HTMLTableCaptionElement>
>(({ className, ...props }, ref) => (
<caption
ref={ref}
className={`mt-4 text-sm text-gray-500 ${className || ""}`}
{...props}
/>
))
TableCaption.displayName = "TableCaption"
export {
Table,
TableHeader,
TableBody,
TableFooter,
TableHead,
TableRow,
TableCell,
TableCaption,
}

113
src/components/ui/tabs.tsx Normal file

@@ -0,0 +1,113 @@
"use client"
import * as React from "react"
export interface TabsProps extends React.HTMLAttributes<HTMLDivElement> {
defaultValue?: string
value?: string
onValueChange?: (value: string) => void
}
export interface TabsListProps extends React.HTMLAttributes<HTMLDivElement> {}
export interface TabsTriggerProps extends React.ButtonHTMLAttributes<HTMLButtonElement> {
value: string
}
export interface TabsContentProps extends React.HTMLAttributes<HTMLDivElement> {
value: string
}
const TabsContext = React.createContext<{
value: string
onValueChange: (value: string) => void
} | null>(null)
function useTabs() {
const context = React.useContext(TabsContext)
if (!context) {
throw new Error("Tabs components must be used within a Tabs provider")
}
return context
}
export function Tabs({
defaultValue,
value,
onValueChange,
children,
className,
...props
}: TabsProps) {
// Note: the `value` prop is only read on first render, so the component
// is effectively uncontrolled after mount.
const [tabValue, setTabValue] = React.useState(value || defaultValue || "")
const handleValueChange = React.useCallback((newValue: string) => {
setTabValue(newValue)
onValueChange?.(newValue)
}, [onValueChange])
return (
<TabsContext.Provider
value={{
value: tabValue,
onValueChange: handleValueChange
}}
>
<div className={`w-full ${className || ""}`} {...props}>
{children}
</div>
</TabsContext.Provider>
)
}
export function TabsList({ className, children, ...props }: TabsListProps) {
return (
<div
className={`inline-flex h-10 items-center justify-center rounded-md bg-gray-100 p-1 text-gray-600 ${className || ""}`}
role="tablist"
{...props}
>
{children}
</div>
)
}
export function TabsTrigger({ className, value, children, ...props }: TabsTriggerProps) {
const { value: selectedValue, onValueChange } = useTabs()
const isSelected = selectedValue === value
return (
<button
role="tab"
aria-selected={isSelected}
data-state={isSelected ? "active" : "inactive"}
className={`inline-flex items-center justify-center whitespace-nowrap rounded-sm px-3 py-1.5 text-sm font-medium ring-offset-white transition-all focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-gray-400 focus-visible:ring-offset-2 disabled:pointer-events-none disabled:opacity-50 ${
isSelected
? "bg-white text-gray-900 shadow-sm"
: "text-gray-600 hover:text-gray-900"
} ${className || ""}`}
onClick={() => onValueChange(value)}
{...props}
>
{children}
</button>
)
}
export function TabsContent({ className, value, children, ...props }: TabsContentProps) {
const { value: selectedValue } = useTabs()
const isSelected = selectedValue === value
if (!isSelected) return null
return (
<div
role="tabpanel"
data-state={isSelected ? "active" : "inactive"}
className={`mt-2 ring-offset-white focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-gray-400 focus-visible:ring-offset-2 ${className || ""}`}
{...props}
>
{children}
</div>
)
}

84
src/lib/auth.ts Normal file

@@ -0,0 +1,84 @@
import { cookies } from 'next/headers'
// In a real application, you would use a more secure method
// This is a basic implementation for demonstration purposes
export interface User {
id: string
name: string
role: string
}
// Set via an environment variable in production.
// NOTE: AUTH_SECRET is not yet used anywhere below — the auth cookie is
// stored as plain, unsigned JSON, so its contents are forgeable.
const AUTH_SECRET = process.env.AUTH_SECRET || 'stones-database-secret'
// In a real app, this would be stored in a database
const USERS = [
{
id: '1',
name: 'Admin',
role: 'admin' as const,
username: 'admin',
password: 'stones1234' // In production, use hashed passwords
}
]
export async function login(username: string, password: string): Promise<User | null> {
// Find the user by username and password
const user = USERS.find(u => u.username === username && u.password === password)
if (!user) {
return null
}
// Store user info in the cookie
const cookieStore = cookies()
cookieStore.set('auth', JSON.stringify({
id: user.id,
name: user.name,
role: user.role
}), {
httpOnly: true,
secure: process.env.NODE_ENV === 'production',
maxAge: 60 * 60 * 24 * 7, // 1 week
path: '/'
})
return {
id: user.id,
name: user.name,
role: user.role
}
}
export async function logout() {
const cookieStore = cookies()
cookieStore.delete('auth')
}
export async function getUser(): Promise<User | null> {
const cookieStore = cookies()
const authCookie = cookieStore.get('auth')
if (!authCookie?.value) {
return null
}
try {
const userData = JSON.parse(authCookie.value)
return userData as User
} catch (error) {
console.error('Error parsing user data:', error)
return null
}
}
export async function requireAuth() {
const user = await getUser()
if (!user) {
throw new Error('Unauthorized')
}
return user
}
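`AUTH_SECRET` is declared above but never used, which leaves the auth cookie as plain JSON anyone can forge. One way to harden it is to append an HMAC tag over the payload (a sketch under that assumption, not the project's implementation — `signValue` and `verifyValue` are hypothetical names):

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

const AUTH_SECRET = process.env.AUTH_SECRET || "stones-database-secret";

// Append an HMAC-SHA256 tag so the cookie value cannot be forged
// without knowing AUTH_SECRET.
export function signValue(payload: string): string {
  const tag = createHmac("sha256", AUTH_SECRET).update(payload).digest("hex");
  return `${payload}.${tag}`;
}

// Returns the payload if the tag verifies, null otherwise.
export function verifyValue(signed: string): string | null {
  const dot = signed.lastIndexOf(".");
  if (dot < 0) return null;
  const payload = signed.slice(0, dot);
  const tag = Buffer.from(signed.slice(dot + 1), "hex");
  const expected = createHmac("sha256", AUTH_SECRET).update(payload).digest();
  // Constant-time compare; guard the length first since
  // timingSafeEqual throws on mismatched lengths.
  if (tag.length !== expected.length || !timingSafeEqual(tag, expected)) {
    return null;
  }
  return payload;
}
```

`login` would then store `signValue(JSON.stringify(user))` and `getUser` would call `verifyValue` before `JSON.parse`.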

14
src/lib/prisma.ts Normal file

@@ -0,0 +1,14 @@
import { PrismaClient } from '@prisma/client'
// Create a singleton Prisma client instance
const globalForPrisma = global as unknown as {
prisma: PrismaClient | undefined;
};
export const prisma =
globalForPrisma.prisma ??
new PrismaClient({
log: process.env.NODE_ENV === "development" ? ["query", "error", "warn"] : ["error"],
});
if (process.env.NODE_ENV !== 'production') globalForPrisma.prisma = prisma


@@ -1,4 +1,4 @@
import { type ClassValue, clsx } from "clsx";
import { ClassValue, clsx } from "clsx";
import { twMerge } from "tailwind-merge";
export function cn(...inputs: ClassValue[]) {

37
src/middleware.ts Normal file

@@ -0,0 +1,37 @@
import { NextResponse } from 'next/server'
import type { NextRequest } from 'next/server'
// This function can be marked `async` if using `await` inside
export function middleware(request: NextRequest) {
const authCookie = request.cookies.get('auth')
const { pathname } = request.nextUrl
// Routes that don't require authentication
const publicRoutes = ['/', '/login', '/api/auth/login']
const isPublicAsset = pathname.startsWith('/_next/') ||
pathname.includes('/favicon.ico') ||
pathname.endsWith('.png') ||
pathname.endsWith('.svg')
// Allow access to public routes without authentication
if (publicRoutes.includes(pathname) || isPublicAsset) {
return NextResponse.next()
}
// Allow access to API routes without cookie check (API routes handle their own auth)
if (pathname.startsWith('/api/')) {
return NextResponse.next()
}
// Redirect to login page if not authenticated and trying to access a protected route
if (!authCookie) {
return NextResponse.redirect(new URL('/login', request.url))
}
return NextResponse.next()
}
// Configure which routes middleware should run on
export const config = {
matcher: ['/((?!_next/static|_next/image|favicon.ico).*)']
}
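The route checks in the middleware can be pulled into a pure predicate so they are unit-testable outside the Next.js runtime (a sketch — `isPublicPath` is an assumed name, not in the source):

```typescript
const PUBLIC_ROUTES = ["/", "/login", "/api/auth/login"];

// Mirrors the middleware's classification: public pages, static assets,
// and API routes (which handle their own auth) all skip the cookie check.
export function isPublicPath(pathname: string): boolean {
  if (PUBLIC_ROUTES.includes(pathname)) return true;
  if (
    pathname.startsWith("/_next/") ||
    pathname.includes("/favicon.ico") ||
    pathname.endsWith(".png") ||
    pathname.endsWith(".svg")
  ) {
    return true;
  }
  return pathname.startsWith("/api/");
}
```

The middleware body would then reduce to `isPublicPath(pathname) ? NextResponse.next() : …` plus the cookie redirect.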

2783
stones_data.sql Normal file

File diff suppressed because it is too large

2373
stones_data_only.sql Normal file

File diff suppressed because it is too large


@@ -2,12 +2,10 @@
module.exports = {
darkMode: ["class"],
content: [
'./pages/**/*.{ts,tsx}',
'./components/**/*.{ts,tsx}',
'./app/**/*.{ts,tsx}',
'./src/**/*.{ts,tsx}',
'./src/pages/**/*.{js,ts,jsx,tsx,mdx}',
'./src/components/**/*.{js,ts,jsx,tsx,mdx}',
'./src/app/**/*.{js,ts,jsx,tsx,mdx}',
],
prefix: "",
theme: {
container: {
center: true,
@@ -59,12 +57,12 @@ module.exports = {
},
keyframes: {
"accordion-down": {
from: { height: "0" },
from: { height: 0 },
to: { height: "var(--radix-accordion-content-height)" },
},
"accordion-up": {
from: { height: "var(--radix-accordion-content-height)" },
to: { height: "0" },
to: { height: 0 },
},
},
animation: {

39
tsconfig.json Normal file

@@ -0,0 +1,39 @@
{
"compilerOptions": {
"target": "es5",
"lib": [
"dom",
"dom.iterable",
"esnext"
],
"allowJs": true,
"skipLibCheck": true,
"strict": true,
"forceConsistentCasingInFileNames": true,
"noEmit": true,
"incremental": true,
"esModuleInterop": true,
"module": "esnext",
"moduleResolution": "bundler",
"resolveJsonModule": true,
"isolatedModules": true,
"jsx": "preserve",
"plugins": [
{
"name": "next"
}
],
"paths": {
"@/*": ["./src/*"]
}
},
"include": [
"next-env.d.ts",
".next/types/**/*.ts",
"**/*.ts",
"**/*.tsx"
],
"exclude": [
"node_modules"
]
}