From antigravity-awesome-skills
Scrapes official auctioneer (leiloeiro) data from Brazil's 27 Juntas Comerciais, stores it in a local SQLite database, serves it via a FastAPI REST API, and exports CSV/JSON.
npx claudepluginhub sickn33/antigravity-awesome-skills
Collects public data on official auctioneers from all 27 state Juntas Comerciais, persists it in a local SQLite database, and provides a REST API plus export in multiple formats.
C:\Users\renat\skills\junta-leiloeiros\
├── scripts/
│   ├── scraper/
│   │   ├── base_scraper.py     ← abstract base class
│   │   ├── states.py           ← registry of the 27 scrapers
│   │   ├── jucesp.py / jucerja.py / jucemg.py / jucec.py / jucis_df.py
│   │   └── generic_scraper.py  ← used by the remaining 22 states
│   ├── db.py                   ← SQLite database layer
│   ├── run_all.py              ← scraping orchestrator
│   ├── serve_api.py            ← FastAPI app
│   ├── export.py               ← export
│   └── requirements.txt
├── references/
│   ├── juntas_urls.md          ← URLs and status for all 27 juntas
│   ├── schema.md               ← database schema
│   └── legal.md                ← legal basis
└── data/
    ├── leiloeiros.db           ← SQLite database (created on first run)
    ├── scraping_log.json       ← log of each scrape
    └── exports/                ← exported files
## Install Dependencies
pip install -r C:\Users\renat\skills\junta-leiloeiros\scripts\requirements.txt
## For Sites That Require JavaScript
playwright install chromium
## All 27 States
python C:\Users\renat\skills\junta-leiloeiros\scripts\run_all.py
## Specific States
python C:\Users\renat\skills\junta-leiloeiros\scripts\run_all.py --estado SP RJ MG
## Preview What Would Be Collected Without Running
python C:\Users\renat\skills\junta-leiloeiros\scripts\run_all.py --dry-run
## Control Concurrency (default: 5)
python C:\Users\renat\skills\junta-leiloeiros\scripts\run_all.py --concurrency 3
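The `--concurrency` flag caps how many state scrapers run at once. A minimal sketch of how such a cap can be implemented with `asyncio.Semaphore` (hypothetical names; not the actual `run_all.py` code):

```python
import asyncio

async def scrape_state(estado: str, sem: asyncio.Semaphore) -> str:
    async with sem:              # at most `concurrency` scrapers run at once
        await asyncio.sleep(0)   # placeholder for real network I/O
        return estado

async def run(estados: list[str], concurrency: int = 5) -> list[str]:
    sem = asyncio.Semaphore(concurrency)
    # gather preserves input order regardless of completion order
    return await asyncio.gather(*(scrape_state(e, sem) for e in estados))

results = asyncio.run(run(["SP", "RJ", "MG"], concurrency=3))
```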
## Query The Database
python C:\Users\renat\skills\junta-leiloeiros\scripts\db.py
sqlite3 C:\Users\renat\skills\junta-leiloeiros\data\leiloeiros.db "SELECT estado, COUNT(*) FROM leiloeiros GROUP BY estado"
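The same per-state count can be reproduced from Python's built-in `sqlite3` module. Illustrative only: this uses an in-memory database with a minimal `leiloeiros` table (just `nome` and `estado`), not the full schema from `references/schema.md`:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE leiloeiros (nome TEXT, estado TEXT)")
conn.executemany(
    "INSERT INTO leiloeiros VALUES (?, ?)",
    [("A", "SP"), ("B", "SP"), ("C", "RJ")],
)
# count auctioneers per state, ordered for a deterministic result
rows = conn.execute(
    "SELECT estado, COUNT(*) FROM leiloeiros GROUP BY estado ORDER BY estado"
).fetchall()
print(rows)  # [('RJ', 1), ('SP', 2)]
```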
## Serve The REST API
```bash
python C:\Users\renat\skills\junta-leiloeiros\scripts\serve_api.py
# interactive docs: http://localhost:8000/docs
```
Endpoints:
- GET /leiloeiros?estado=SP&situacao=ATIVO&nome=silva&limit=100
- GET /leiloeiros/{estado} (e.g. /leiloeiros/SP)
- GET /busca?q=texto
- GET /stats
- GET /export/json
- GET /export/csv

## Export Data
python C:\Users\renat\skills\junta-leiloeiros\scripts\export.py --format csv
python C:\Users\renat\skills\junta-leiloeiros\scripts\export.py --format json
python C:\Users\renat\skills\junta-leiloeiros\scripts\export.py --format all
python C:\Users\renat\skills\junta-leiloeiros\scripts\export.py --format csv --estado SP
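The `GET /leiloeiros` endpoint takes its filters as query-string parameters. A small sketch of assembling such a request URL with `urllib.parse` (the base URL assumes the local server started above; `leiloeiros_url` is a hypothetical helper):

```python
from urllib.parse import urlencode

BASE = "http://localhost:8000"

def leiloeiros_url(**params) -> str:
    # drop None values so only the filters actually supplied appear
    query = urlencode({k: v for k, v in params.items() if v is not None})
    return f"{BASE}/leiloeiros?{query}"

url = leiloeiros_url(estado="SP", situacao="ATIVO", limit=100)
print(url)  # http://localhost:8000/leiloeiros?estado=SP&situacao=ATIVO&limit=100
```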
## Use From Python
```python
import sys
sys.path.insert(0, r"C:\Users\renat\skills\junta-leiloeiros\scripts")

from db import Database

db = Database()
db.init()

# all active auctioneers in SP
leiloeiros = db.get_all(estado="SP", situacao="ATIVO")

# search by name
resultados = db.search("silva")

# statistics
stats = db.get_stats()
```
If a state needs custom logic (e.g. its site requires JavaScript):
```python
# scripts/scraper/meu_estado.py
from typing import List

from .base_scraper import AbstractJuntaScraper, Leiloeiro


class MeuEstadoScraper(AbstractJuntaScraper):
    estado = "XX"
    junta = "JUCEX"
    url = "https://www.jucex.xx.gov.br/leiloeiros"

    async def parse_leiloeiros(self) -> List[Leiloeiro]:
        soup = await self.fetch_page()
        if not soup:
            return []
        # state-specific parsing logic here
        return [self.make_leiloeiro(nome="...", matricula="...")]
```
Register it in scripts/scraper/states.py:
```python
from .meu_estado import MeuEstadoScraper

SCRAPERS["XX"] = MeuEstadoScraper
```
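Registering the class in `SCRAPERS` lets the orchestrator dispatch by UF code. A minimal sketch of that registry pattern, using a stub class in place of a real scraper (how `run_all.py` actually selects scrapers is an assumption here):

```python
class StubScraper:  # stand-in for a real AbstractJuntaScraper subclass
    estado = "XX"

SCRAPERS = {"XX": StubScraper}

def select(estados=None):
    # None means "all registered states", mirroring the --estado flag
    keys = estados if estados is not None else list(SCRAPERS)
    return [SCRAPERS[uf] for uf in keys]

selected = select(["XX"])
```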