Top Related Projects
Python ProxyPool for web spider
:sparkling_heart: Highly available distributed ip proxy pool, powered by Scrapy and Redis
An Efficient ProxyPool with Getter, Tester and Server
Malicious traffic detection system
Proxy [Finder | Checker | Server]. HTTP(S) & SOCKS :performing_arts:
Quick Overview
IPProxyPool is an open-source project that provides a pool of free IP proxies. It automatically collects, tests, and maintains a list of working proxies from various sources, offering a reliable and up-to-date proxy service for users who need to bypass IP restrictions or maintain anonymity online.
Pros
- Automatic proxy collection and validation
- Regular updates to ensure proxy availability
- Easy-to-use API for retrieving proxies
- Supports both HTTP and HTTPS proxies
Cons
- Limited documentation, especially for non-Chinese speakers
- Potential reliability issues with free proxies
- May require frequent maintenance to keep the proxy list current
- Performance can vary depending on the quality of collected proxies
Code Examples
- Retrieving a proxy:

```python
import requests

# The documented API is GET /; it returns a JSON list of [ip, port, score] entries.
response = requests.get("http://127.0.0.1:8000/?count=1")
if response.status_code == 200:
    proxies = response.json()
    if proxies:
        ip, port, score = proxies[0]
        print(f"IP: {ip}, Port: {port}")
```
- Retrieving multiple proxies:

```python
import requests

response = requests.get("http://127.0.0.1:8000/?count=5")
if response.status_code == 200:
    for ip, port, score in response.json():
        print(f"IP: {ip}, Port: {port}, Score: {score}")
```
- Using a proxy with requests:

```python
import requests

ip, port, score = requests.get("http://127.0.0.1:8000/?count=1").json()[0]
proxies = {
    "http": f"http://{ip}:{port}",
    "https": f"http://{ip}:{port}",
}
response = requests.get("https://api.ipify.org", proxies=proxies)
print(f"Your IP: {response.text}")
```
Getting Started

- Clone the repository:

```bash
git clone https://github.com/qiyeboy/IPProxyPool.git
```

- Install dependencies:

```bash
cd IPProxyPool
pip install -r requirements.txt
```

- Start the proxy pool:

```bash
python IPProxy.py
```

- Access the API at http://127.0.0.1:8000 to retrieve proxies.
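Once the server is up, a quick smoke test from Python can confirm the pool is serving proxies. This is a sketch: the endpoint shape follows the project's documented `GET /` API (a JSON list of `[ip, port, score]` entries), the helper name `pool_status` is mine, and the pool may legitimately be empty right after startup.

```python
import requests

def pool_status(base="http://127.0.0.1:8000"):
    """Return how many proxies the pool currently serves, or -1 if unreachable."""
    try:
        r = requests.get(base + "/?count=5", timeout=5)
        # Each entry in the JSON response is a [ip, port, score] triple.
        return len(r.json()) if r.status_code == 200 else -1
    except (requests.RequestException, ValueError):
        return -1
```

A return value of 0 right after startup just means the crawler has not filled the pool yet.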
Competitor Comparisons
Python ProxyPool for web spider
Pros of proxy_pool
- More active development with recent updates and contributions
- Includes a web interface for easier management and visualization
- Supports multiple proxy validation methods
Cons of proxy_pool
- Less comprehensive documentation compared to IPProxyPool
- Fewer built-in proxy sources out of the box
Code Comparison
proxy_pool:

```python
class ProxyCheck(object):
    def __init__(self):
        self.selfip = self.getMyIP()
        self.detect_pool = Pool(THREADNUM)

    def checkProxy(self, proxy):
        proxies = {"http": "http://%s" % proxy, "https": "https://%s" % proxy}
        try:
            r = requests.get(url=TEST_URL, headers=HEADER, timeout=TIMEOUT, proxies=proxies)
            if r.status_code == 200:
                return True
        except Exception:
            return False
```
IPProxyPool:

```python
class Validator(object):
    def __init__(self):
        self.detect_pool = Pool(THREADNUM)

    def detect_proxy(self, proxy):
        proxies = {"http": "http://%s" % proxy, "https": "https://%s" % proxy}
        try:
            r = requests.get(url=TEST_URL, headers=HEADER, timeout=TIMEOUT, proxies=proxies)
            if r.status_code == 200:
                return True
        except Exception:
            return False
```
Both projects take a similar approach to proxy validation: a pool of workers fans out test requests through each proxy and keeps the ones that respond. The main difference is naming, with proxy_pool using `ProxyCheck` and `checkProxy`, while IPProxyPool uses `Validator` and `detect_proxy`.
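Stripped of project specifics, the shared pattern is: fan proxy candidates out over a worker pool, try one request through each, and keep those that answer 200. Below is a hedged sketch of that pattern using the standard library's thread pool instead of gevent; `TEST_URL` and the constants are placeholders, not either project's configuration.

```python
from concurrent.futures import ThreadPoolExecutor

import requests

TEST_URL = "http://httpbin.org/ip"  # placeholder: any stable check URL works
TIMEOUT = 5
THREADNUM = 5

def check_proxy(proxy):
    """Return True only if TEST_URL answers 200 through `proxy` ('ip:port')."""
    proxies = {"http": "http://%s" % proxy, "https": "https://%s" % proxy}
    try:
        r = requests.get(TEST_URL, timeout=TIMEOUT, proxies=proxies)
        return r.status_code == 200
    except requests.RequestException:
        return False

def filter_alive(candidates):
    """Check candidates concurrently and keep only the working ones."""
    with ThreadPoolExecutor(max_workers=THREADNUM) as pool:
        results = list(pool.map(check_proxy, candidates))
    return [p for p, ok in zip(candidates, results) if ok]
```

Both projects use a gevent `Pool` for the same fan-out; `concurrent.futures` is used here only to keep the sketch dependency-free.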
:sparkling_heart: Highly available distributed ip proxy pool, powered by Scrapy and Redis
Pros of haipproxy
- More advanced proxy validation and scoring system
- Better scalability with distributed architecture
- Supports multiple proxy sources and protocols
Cons of haipproxy
- More complex setup and configuration
- Requires additional dependencies (Redis, Scrapy)
- Steeper learning curve for beginners
Code Comparison
haipproxy:

```python
class ProxyFetcher(object):
    def __init__(self):
        self.proxy_set = set()

    def get_proxy(self):
        return random.choice(list(self.proxy_set))

    def add_proxy(self, proxy):
        self.proxy_set.add(proxy)
```
IPProxyPool:

```python
class IPPool(object):
    def __init__(self):
        self.db = SQLite()

    def get(self):
        result = self.db.select()
        if result:
            return random.choice(result)
        else:
            return None
```
Both projects aim to provide a pool of proxy IP addresses, but haipproxy offers a more sophisticated approach with its distributed architecture and advanced scoring system. It supports multiple proxy sources and protocols, making it more versatile. However, this comes at the cost of increased complexity in setup and configuration.
IPProxyPool, on the other hand, is simpler and easier to set up, making it more suitable for smaller projects or beginners. It uses SQLite for storage, while haipproxy relies on Redis for its distributed nature.
The code comparison shows that haipproxy uses a set to store proxies in memory, while IPProxyPool uses a database. This reflects the different approaches to scalability and performance in the two projects.
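To make the storage trade-off concrete, the two strategies can be sketched side by side. The class names and the table schema below are illustrative, not either project's actual code.

```python
import random
import sqlite3

class MemoryPool:
    """haipproxy-style: proxies live in an in-memory set (fast, volatile)."""
    def __init__(self):
        self.proxy_set = set()

    def add(self, proxy):
        self.proxy_set.add(proxy)

    def get(self):
        return random.choice(list(self.proxy_set)) if self.proxy_set else None

class SqlitePool:
    """IPProxyPool-style: proxies persist in a sqlite table (durable, slower)."""
    def __init__(self, path=":memory:"):
        self.conn = sqlite3.connect(path)
        self.conn.execute("CREATE TABLE IF NOT EXISTS proxys (ip TEXT, port INTEGER)")

    def add(self, ip, port):
        self.conn.execute("INSERT INTO proxys VALUES (?, ?)", (ip, port))

    def get(self):
        rows = self.conn.execute("SELECT ip, port FROM proxys").fetchall()
        return random.choice(rows) if rows else None
```

The in-memory set survives only as long as the process, which is why haipproxy pairs it with Redis for persistence across a cluster, while IPProxyPool's sqlite file persists on its own.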
An Efficient ProxyPool with Getter, Tester and Server
Pros of ProxyPool
- More active development with recent updates and contributions
- Supports multiple proxy sources and validation methods
- Includes a web API for easy integration with other applications
Cons of ProxyPool
- Potentially more complex setup due to additional dependencies
- May require more system resources to run effectively
- Less extensive documentation compared to IPProxyPool
Code Comparison
IPProxyPool:

```python
class IPProxy(Base):
    __tablename__ = 'proxys'

    id = Column(Integer, primary_key=True, autoincrement=True)
    ip = Column(String(64), nullable=False)
    port = Column(Integer, nullable=False)
    types = Column(Integer, nullable=False)
    protocol = Column(Integer, nullable=False, default=0)
    country = Column(String(100), nullable=False)
    area = Column(String(100), nullable=False)
```
ProxyPool:

```python
class Proxy(object):
    def __init__(self, ip, port, protocol=-1, nick_type=-1, speed=-1,
                 area=None, score=0, disable_domains=None):
        self.ip = ip
        self.port = port
        self.protocol = protocol
        self.nick_type = nick_type
        self.speed = speed
        self.area = area
        self.score = score
        self.disable_domains = disable_domains or set()
```
Both projects aim to provide a pool of proxy IP addresses, but they differ in their implementation and features. ProxyPool offers more flexibility and active development, while IPProxyPool may be simpler to set up and use for basic needs. The code comparison shows different approaches to storing proxy information, with ProxyPool using a more comprehensive object-oriented structure.
Malicious traffic detection system
Pros of Maltrail
- Focuses on network security and malware detection, offering a more specialized toolset
- Provides real-time monitoring and alerting capabilities for potential threats
- Includes a comprehensive database of malicious indicators and patterns
Cons of Maltrail
- More complex setup and configuration compared to IPProxyPool
- Requires ongoing maintenance and updates to keep threat intelligence current
- May generate false positives, requiring careful tuning and analysis
Code Comparison
Maltrail (Python):

```python
def check_sudo():
    if not os.geteuid() == 0:
        exit("[!] please run with sudo/Administrator privileges")

def init_sensor():
    for module in sensor.MODULES:
        logger.info("initializing module '%s'..." % module.__name__)
        module.init()
```
IPProxyPool (Python, methods excerpted from a pool class — `self` refers to the pool instance):

```python
def get_proxy(self):
    proxy = None
    try:
        proxy = self.db.get(self.proxy_queue.get(timeout=1))
    except:
        pass
    return proxy

def delete_proxy(self, proxy):
    self.db.delete(proxy)
```
The code snippets highlight the different focus areas of each project. Maltrail's code emphasizes security checks and sensor initialization, while IPProxyPool's code deals with proxy management and database operations.
Proxy [Finder | Checker | Server]. HTTP(S) & SOCKS :performing_arts:
Pros of ProxyBroker
- Written in Python 3, offering better performance and modern language features
- Supports multiple proxy protocols (HTTP, HTTPS, SOCKS4, SOCKS5)
- Provides a built-in asynchronous server for proxy checking
Cons of ProxyBroker
- Less frequent updates and maintenance compared to IPProxyPool
- Smaller community and fewer contributors
- Limited documentation and examples for advanced usage
Code Comparison
IPProxyPool (Python 2):

```python
class IPProxy(object):
    def __init__(self, ip, port, proxy_type=0, protocol=-1, country='',
                 area='', speed=0, source=''):
        self.ip = ip
        self.port = port
        self.proxy_type = proxy_type
        self.protocol = protocol
        self.country = country
        self.area = area
        self.speed = speed
        self.source = source
```
ProxyBroker (Python 3):

```python
class Proxy:
    def __init__(self, host=None, port=None):
        self.host = host
        self.port = port
        self.is_working = False
        self.schemes = set()
        self.avg_resp_time = None
        self.error_rate = 0
```
Both projects aim to provide proxy management functionality, but ProxyBroker offers a more modern codebase with support for multiple proxy protocols. IPProxyPool, while having a larger community and more frequent updates, is built on Python 2, which may limit its long-term viability. ProxyBroker's code structure is more concise and focuses on essential proxy attributes, while IPProxyPool includes additional metadata like country and area.
# IPProxyPool

IPProxyPool is a proxy pool project that provides proxy IPs. Both py2 and py3 are supported.

My new book *Python爬虫开发与项目实战* (Python Spider Development and Project Practice) has been published; if you like it, take a look at the sample chapters.

For detailed usage instructions, see my blog post: http://www.cnblogs.com/qiyeboy/p/5693128.html

I am currently adding second-level proxy support to IPProxyPool to make scheduling easier. You can follow my WeChat official account to be notified when updates land.

My WeChat official account:

Please suggest more proxy websites; the number of usable proxy IPs being crawled is still too small.

Thanks to super1-chen, fancoo and Leibnizhu for their contributions to the project.
## Project Dependencies

### Ubuntu, Debian

1. Install the sqlite database (usually preinstalled):

```bash
apt-get install sqlite3
```

2. Install requests, chardet, web.py, sqlalchemy, gevent and psutil:

```bash
pip install requests chardet web.py sqlalchemy gevent psutil
```

3. Install lxml:

```bash
apt-get install python-lxml
```

Note:

- On Python 3, use pip3 instead.
- An outdated gevent can sometimes make the program exit on its own; upgrade it with `pip install gevent --upgrade`.
- On Python 3, web.py cannot be installed with pip; download the py3 source release and install it manually.

### Windows

1. Download sqlite and add its path to the environment variables.

2. Install requests, chardet, web.py, sqlalchemy and gevent:

```bash
pip install requests chardet web.py sqlalchemy gevent
```

3. Install lxml: `pip install lxml`, or download the lxml Windows build.

Note:

- On Python 3, use pip3 instead.
- An outdated gevent can sometimes make the program exit on its own; upgrade it with `pip install gevent --upgrade`.
- On Python 3, web.py cannot be installed with pip; download the py3 source release and install it manually.
## Extending the Storage Backend

The default database is sqlite, but the project is built on sqlalchemy's ORM layer, so through the reserved interface it can be extended to MySQL, MongoDB and other databases.

Configuration:

### 1. MySQL

Step 1: install the MySQL database and start it.

Step 2: install MySQLdb or pymysql (recommended).

Step 3: configure DB_CONFIG in config.py. If the MySQLdb module is installed, configure it as:

```python
DB_CONFIG = {
    'DB_CONNECT_TYPE': 'sqlalchemy',
    'DB_CONNECT_STRING': 'mysql+mysqldb://root:root@localhost/proxy?charset=utf8'
}
```

If the pymysql module is installed, configure it as:

```python
DB_CONFIG = {
    'DB_CONNECT_TYPE': 'sqlalchemy',
    'DB_CONNECT_STRING': 'mysql+pymysql://root:root@localhost/proxy?charset=utf8'
}
```

For DB_CONNECT_STRING, refer to sqlalchemy's list of supported databases. In theory this configuration style is not limited to MySQL; any database sqlalchemy supports should work, but only MySQL has been tested.
### 2. MongoDB

Step 1: install the MongoDB database and start it.

Step 2: install the pymongo module.

Step 3: configure DB_CONFIG in config.py, similar to:

```python
DB_CONFIG = {
    'DB_CONNECT_TYPE': 'pymongo',
    'DB_CONNECT_STRING': 'mongodb://localhost:27017/'
}
```

Since sqlalchemy does not support MongoDB, a separate pymongo mode was added; DB_CONNECT_STRING follows pymongo's connection string format.
### Note

If you want to support another database, inherit the ISqlHelper class in db, implement its methods (see my code for the details), and then import your class in DataStore:

```python
try:
    if DB_CONFIG['DB_CONNECT_TYPE'] == 'pymongo':
        from db.MongoHelper import MongoHelper as SqlHelper
    else:
        from db.SqlHelper import SqlHelper as SqlHelper
    sqlhelper = SqlHelper()
    sqlhelper.init_db()
except Exception, e:
    raise Con_DB_Fail
```

Anyone interested is welcome to contribute a Redis implementation.
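As a starting point for that Redis contribution, here is a loose sketch in the spirit of the ISqlHelper interface. The method names (`init_db`, `insert`, `select`, `delete`) are assumptions based on the snippets in this README, not the actual interface in db, and the `redis` package is an extra dependency:

```python
import json
import random

class RedisHelper(object):
    """Hypothetical ISqlHelper-style backend keeping proxies in a Redis set."""

    KEY = "ipproxypool:proxys"

    def __init__(self, url="redis://localhost:6379/8"):
        import redis  # assumed extra dependency: pip install redis
        self.db = redis.StrictRedis.from_url(url)

    def init_db(self):
        pass  # Redis needs no schema

    def insert(self, proxy):
        # `proxy` is assumed to be a JSON-serializable dict like {'ip': ..., 'port': ...}
        self.db.sadd(self.KEY, json.dumps(proxy, sort_keys=True))

    def select(self, count=1):
        members = [json.loads(m) for m in self.db.smembers(self.KEY)]
        return random.sample(members, min(count, len(members)))

    def delete(self, proxy):
        self.db.srem(self.KEY, json.dumps(proxy, sort_keys=True))
```

The real method set to implement lives in the ISqlHelper class; check it before wiring such a helper into DataStore.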
## How to Use

Clone the project into the current folder:

```bash
$ git clone https://github.com/qiyeboy/IPProxyPool.git
```

Enter the project directory:

```bash
$ cd IPProxyPool
```

Run the script:

```bash
python IPProxy.py
```

After a successful start, it prints information like:

```
IPProxyPool----->>>>>>>>beginning
http://0.0.0.0:8000/
IPProxyPool----->>>>>>>>db exists ip:0
IPProxyPool----->>>>>>>>now ip num < MINNUM,start crawling...
IPProxyPool----->>>>>>>>Success ip num :134,Fail ip num:7882
```
## API Usage

### Mode 1: `GET /`

This mode queries the proxy IP data. A scoring mechanism is applied: results are returned ordered by score from high to low, then by speed from fast to slow.

Parameters:

| Name | Type | Description |
|---|---|---|
| types | int | 0: high anonymity, 1: anonymous, 2: transparent |
| protocol | int | 0: http, 1: https, 2: http/https |
| count | int | number of proxies to return |
| country | str | either 国内 (domestic) or 国外 (foreign) |
| area | str | region |

Example

IPProxys listens on port 8000 by default; the port can be configured in config.py.

If testing on the local machine:

1. Fetch 5 high-anonymity proxies located in China: http://127.0.0.1:8000/?types=0&count=5&country=国内
2. The response is JSON, ordered by score from high to low and response speed from fast to slow:

```
[["122.226.189.55", 138, 10], ["183.61.236.54", 3128, 10], ["61.132.241.109", 808, 10], ["183.61.236.53", 3128, 10], ["122.227.246.102", 808, 10]]
```

Taking ["122.226.189.55", 138, 10] as an example: the first element is the ip, the second is the port, and the third is the score.
```python
import requests
import json

r = requests.get('http://127.0.0.1:8000/?types=0&count=5&country=国内')
ip_ports = json.loads(r.text)
print ip_ports
ip = ip_ports[0][0]
port = ip_ports[0][1]
proxies = {
    'http': 'http://%s:%s' % (ip, port),
    'https': 'http://%s:%s' % (ip, port)
}
r = requests.get('http://ip.chinaz.com/', proxies=proxies)
r.encoding = 'utf-8'
print r.text
```
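The example above uses Python 2 print statements. Since the project also supports Python 3, here is the same flow as a py3 function; the helper name `fetch_via_pool` is mine, while the endpoint and the `[ip, port, score]` response shape are the ones documented above:

```python
import requests

def fetch_via_pool(target, api='http://127.0.0.1:8000'):
    """Fetch `target` through a high-anonymity domestic proxy from the pool."""
    r = requests.get(api + '/?types=0&count=5&country=国内', timeout=5)
    ip, port, score = r.json()[0]  # each entry is [ip, port, score]
    proxies = {
        'http': 'http://{}:{}'.format(ip, port),
        'https': 'http://{}:{}'.format(ip, port),
    }
    resp = requests.get(target, proxies=proxies, timeout=10)
    resp.encoding = 'utf-8'
    return resp.text
```

If the pool is empty, `r.json()[0]` raises IndexError; real callers should handle that and retry after the crawler has run.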
### Mode 2: `GET /delete`

This mode lets users delete proxy IP records that match their own criteria.

Parameters:

| Name | Type | Description |
|---|---|---|
| ip | str | e.g. 192.168.1.1 |
| port | int | e.g. 80 |
| types | int | 0: high anonymity, 1: anonymous, 2: transparent |
| protocol | int | 0: http, 1: https, 2: http/https |
| count | int | number of proxies |
| country | str | either 国内 (domestic) or 国外 (foreign) |
| area | str | region |

You can delete records by specifying any one or several of the parameters above.

Example

If testing on the local machine:

1. Delete the proxy with ip 120.92.3.127: http://127.0.0.1:8000/delete?ip=120.92.3.127
2. The response is JSON and reports either success/failure or the number of deleted records, similar to:

```
["deleteNum", "ok"] or ["deleteNum", 1]
```
```python
import requests

r = requests.get('http://127.0.0.1:8000/delete?ip=120.92.3.127')
print r.text
```
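`GET /delete` pairs naturally with scraping code: when a proxy stops working, drop it from the pool before picking the next one. A small sketch; the helper is my own wrapper around the endpoint documented above:

```python
import requests

def drop_dead_proxy(ip, api='http://127.0.0.1:8000'):
    """Ask the pool to forget a proxy that just failed; True on success."""
    try:
        r = requests.get(api + '/delete', params={'ip': ip}, timeout=5)
        return r.status_code == 200
    except requests.RequestException:
        return False
```

A scraper loop would call this inside its `except` branch, then request a fresh proxy from `GET /`.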
## config.py Parameters

parserList is the list of site parsing rules. When you find a new proxy site, add its URL and extraction rule here so the crawler can scrape it.

```python
parserList = [
    {
        'urls': ['http://www.66ip.cn/%s.html' % n for n in ['index'] + list(range(2, 12))],
        'type': 'xpath',
        'pattern': ".//*[@id='main']/div/div[1]/table/tr[position()>1]",
        'position': {'ip': './td[1]', 'port': './td[2]', 'type': './td[4]', 'protocol': ''}
    },
    ......
    {
        'urls': ['http://www.cnproxy.com/proxy%s.html' % i for i in range(1, 11)],
        'type': 'module',
        'moduleName': 'CnproxyPraser',
        'pattern': r'<tr><td>(\d+\.\d+\.\d+\.\d+)<SCRIPT type=text/javascript>document.write\(\"\:\"(.+)\)</SCRIPT></td><td>(HTTP|SOCKS4)\s*',
        'position': {'ip': 0, 'port': 1, 'type': -1, 'protocol': 2}
    }
]
```
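To show how an 'xpath'-type rule is consumed, here is a sketch of the extraction step. The project's real crawler does more (type/protocol fields, encoding detection); `extract` is my own helper, the rule is the 66ip.cn entry above trimmed to ip and port, and the HTML is a stand-in page shaped the way the rule expects:

```python
from lxml import etree

rule = {
    'pattern': ".//*[@id='main']/div/div[1]/table/tr[position()>1]",
    'position': {'ip': './td[1]', 'port': './td[2]'},
}

def extract(html, rule):
    """Apply an xpath parser rule to a page, yielding (ip, port) pairs."""
    tree = etree.HTML(html)
    for row in tree.xpath(rule['pattern']):  # one <tr> per proxy
        ip = row.xpath(rule['position']['ip'])[0].text
        port = row.xpath(rule['position']['port'])[0].text
        yield ip, port

# A stand-in page with the structure the rule expects:
page = ("<div id='main'><div><div><table>"
        "<tr><th>ip</th><th>port</th></tr>"
        "<tr><td>1.2.3.4</td><td>8080</td></tr>"
        "</table></div></div></div>")
```

Running `list(extract(page, rule))` should yield `[('1.2.3.4', '8080')]`; `tr[position()>1]` skips the header row.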
Database configuration:

```python
DB_CONFIG = {
    'DB_CONNECT_TYPE': 'sqlalchemy',  # 'pymongo' / 'sqlalchemy' / 'redis'
    # 'DB_CONNECT_STRING': 'mongodb://localhost:27017/'
    'DB_CONNECT_STRING': 'sqlite:///' + os.path.dirname(__file__) + '/data/proxy.db'
    # 'DB_CONNECT_STRING': 'mysql+mysqldb://root:root@localhost/proxy?charset=utf8'
    # 'DB_CONNECT_TYPE': 'redis',
    # 'DB_CONNECT_STRING': 'redis://localhost:6379/8',
}
```
```python
# THREADNUM is the number of coroutines in the gevent pool
THREADNUM = 5

# API_PORT is the port of the API web server
API_PORT = 8000

# Settings for crawling and checking ips.
# There is no need to check whether an ip already exists, because stale entries are cleared periodically.

# UPDATE_TIME: check every half hour whether any proxy ip has gone stale
UPDATE_TIME = 30 * 60

# When the number of valid ips drops below MINNUM, the crawler is started
MINNUM = 50

# socket timeout
TIMEOUT = 5

# Number of retries when the crawler downloads a page
RETRY_TIME = 3

# USER_AGENTS: random request headers, used to get around the scraped sites' anti-crawling measures
USER_AGENTS = [
    "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; AcooBrowser; .NET CLR 1.1.4322; .NET CLR 2.0.50727)",
    "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0; Acoo Browser; SLCC1; .NET CLR 2.0.50727; Media Center PC 5.0; .NET CLR 3.0.04506)",
    "Mozilla/4.0 (compatible; MSIE 7.0; AOL 9.5; AOLBuild 4337.35; Windows NT 5.1; .NET CLR 1.1.4322; .NET CLR 2.0.50727)",
    "Mozilla/5.0 (Windows; U; MSIE 9.0; Windows NT 9.0; en-US)",
    "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Win64; x64; Trident/5.0; .NET CLR 3.5.30729; .NET CLR 3.0.30729; .NET CLR 2.0.50727; Media Center PC 6.0)",
    "Mozilla/5.0 (compatible; MSIE 8.0; Windows NT 6.0; Trident/4.0; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; .NET CLR 1.0.3705; .NET CLR 1.1.4322)",
    "Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.2; .NET CLR 1.1.4322; .NET CLR 2.0.50727; InfoPath.2; .NET CLR 3.0.04506.30)",
]
```
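This list is typically consumed by drawing one agent per request so successive requests do not share a fingerprint. A sketch (the project's downloader does the equivalent internally; the trimmed list below is a stand-in):

```python
import random

# A trimmed stand-in for the USER_AGENTS list in config.py.
USER_AGENTS = [
    "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)",
    "Mozilla/5.0 (Windows; U; MSIE 9.0; Windows NT 9.0; en-US)",
]

def random_headers():
    """Pick a random User-Agent for the next download request."""
    return {'User-Agent': random.choice(USER_AGENTS)}
```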
```python
# By default each working ip is assigned DEFAULT_SCORE points; each failed check
# costs one point, and when the score is used up the ip is deleted from the database.
DEFAULT_SCORE = 10

# CHECK_PROXY lets you plug in a custom proxy-checking function; the default is
# CHECK_PROXY = {'function': 'checkProxy'}.
# The site currently used for checking is httpbin.org, but even if an ip passes
# validation there, it only proves the proxy can reach httpbin.org; it is not
# guaranteed to reach the site you actually want to scrape. So you can add your
# own check function here; I tried Baidu as the test URL. See the baidu_check
# and detect_proxy functions in Validator.py for how this works.
CHECK_PROXY = {'function': 'checkProxy'}  # or {'function': 'baidu_check'}
```
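A custom check in the spirit of baidu_check could look like the sketch below. The real signature lives in Validator.py; here I assume the check function receives an 'ip:port' string and returns a boolean, and `my_check` and the target URL are my own placeholders:

```python
import requests

def my_check(proxy, target='https://www.baidu.com', timeout=5):
    """Hypothetical custom check: validate the proxy against the site you
    actually intend to scrape, instead of httpbin.org."""
    proxies = {'http': 'http://%s' % proxy, 'https': 'http://%s' % proxy}
    try:
        r = requests.get(target, proxies=proxies, timeout=timeout)
        return r.status_code == 200
    except requests.RequestException:
        return False

# Then, hypothetically, in config.py:
# CHECK_PROXY = {'function': 'my_check'}
```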
## TODO

1. Add squid proxy support to simplify crawler configuration.
## Changelog

### 2017-4-6

1. Updated the scoring mechanism.
   - Previously, every newly added proxy ip started at 0 points and was checked every half hour; if still valid it gained points, otherwise it was deleted.
   - Now every new proxy ip starts with 10 points and is checked every half hour; if still valid its score stays the same, otherwise it loses one point, and the ip is deleted when the score reaches 0. This avoids mistaken deletions caused by an unstable checking site.
2. You can now supply a custom check function, configured through the CHECK_PROXY variable in config.py.
   CHECK_PROXY exists so that users can define their own proxy check. The site currently used is httpbin.org, but passing that check only proves the proxy can reach httpbin.org, not necessarily the site you want to scrape. You can therefore add your own check function; I tried Baidu as the test URL. See the baidu_check and detect_proxy functions in Validator.py.
   CHECK_PROXY = {'function': 'baidu_check'}
3. Thanks to everyone's joint effort, the zombie process problem has been completely solved.
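The scoring rule above (10 points for a new ip, unchanged on a passed check, minus one on a failed check, deletion at zero) can be sketched as a pure function; the name `update_score` and the None-means-delete convention are mine:

```python
DEFAULT_SCORE = 10

def update_score(score, still_valid):
    """Apply one half-hourly check result to a proxy's score.

    Returns the new score, or None to signal the ip should be deleted.
    """
    if still_valid:
        return score  # unchanged while the proxy keeps working
    score -= 1        # one failed check costs one point
    return score if score > 0 else None  # at zero, delete from the database
```

A proxy that fails the check DEFAULT_SCORE times in a row thus survives about five hours before being dropped, which is what absorbs transient instability of the checking site.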
### 2017-1-16

1. Merged the py2 and py3 versions into one compatible codebase.
2. Fixed a pymongo query bug.

### 2017-1-11

1. Use httpbin.org to check the anonymity of proxy ips.
2. Use 国内 (domestic) and 国外 (foreign) as query values for country.
3. Changed the types and protocol parameters. Pay close attention to how protocol is used; try accessing both http://www.baidu.com and https://www.baidu.com.
4. Cleaned up the code style.
### 2016-12-11

Large-scale refactoring, mainly covering:

1. Crawling and validation now use multiprocessing plus coroutines, improving efficiency more than 50x; all valid IPs can be fetched within a few minutes.
2. web.py is now used as the API server, and the HTTP interface was reworked.
3. Added adapters for databases such as MySQL and MongoDB.
4. Added more proxy websites.
5. Added a scoring mechanism to rank stable ips.
6. Added python3 support.
### 2016-11-24

1. Added chardet to detect web page encodings.
2. Worked around 66ip.cn's anti-scraping restrictions.

### 2016-10-27

1. Added proxy checking: test whether the target URL can really be reached through the proxy.
2. Added page parsing via regular expressions and via loadable parser modules.
3. Added more new proxy websites.

### 2016-7-20

1. Fixed bugs and compacted the database.