Steps to Build a Docker Image for Scrapyd

2024-11-11 15:04:18

1. Install the Docker service and log in to Docker Hub (docker login).

2. Create a new project directory and add a scrapyd.conf file, i.e. Scrapyd's configuration file, with the following content:

[scrapyd]
eggs_dir          = eggs
logs_dir          = logs
items_dir         =
jobs_to_keep      = 5
dbs_dir           = dbs
max_proc          = 0
max_proc_per_cpu  = 4
finished_to_keep  = 100
poll_interval     = 5.0
bind_address      = 0.0.0.0
http_port         = 6800
debug             = off
runner            = scrapyd.runner
application       = scrapyd.app.application
launcher          = scrapyd.launcher.Launcher
webroot           = scrapyd.website.Root

[services]
schedule.json     = scrapyd.webservice.Schedule
cancel.json       = scrapyd.webservice.Cancel
addversion.json   = scrapyd.webservice.AddVersion
listprojects.json = scrapyd.webservice.ListProjects
listversions.json = scrapyd.webservice.ListVersions
listspiders.json  = scrapyd.webservice.ListSpiders
delproject.json   = scrapyd.webservice.DeleteProject
delversion.json   = scrapyd.webservice.DeleteVersion
listjobs.json     = scrapyd.webservice.ListJobs
daemonstatus.json = scrapyd.webservice.DaemonStatus

Note: the only change from the default is bind_address = 0.0.0.0. The default value is 127.0.0.1, which blocks access from outside the container; setting it to 0.0.0.0 lifts that restriction.
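Before baking the file into the image, it can be sanity-checked with Python's standard configparser, since scrapyd.conf is ordinary INI. A minimal sketch (the config content is inlined here for illustration; in practice you would pass the path to your scrapyd.conf):

```python
from configparser import ConfigParser

# A fragment of the [scrapyd] section above, inlined for the check.
CONF = """
[scrapyd]
bind_address = 0.0.0.0
http_port = 6800
max_proc_per_cpu = 4
"""

parser = ConfigParser()
parser.read_string(CONF)

# Confirm the one setting that was changed from the default (127.0.0.1).
assert parser.get("scrapyd", "bind_address") == "0.0.0.0"
print(parser.get("scrapyd", "http_port"))  # → 6800
```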

3. Create a requirements.txt file listing libraries commonly used in Scrapy projects:

requests
selenium
aiohttp
beautifulsoup4
pyquery
pymysql
pymongo
redis
flask
django
scrapy
scrapyd
scrapyd-client
scrapy-redis
scrapy-splash

If your Scrapy project needs other libraries, add them here.

4. Create a Dockerfile with the following content:

FROM python:3.6
ADD . /code
WORKDIR /code
COPY ./scrapyd.conf /etc/scrapyd/
EXPOSE 6800
RUN pip3 install -r requirements.txt
CMD scrapyd

5. In the current directory, run docker build -t scrapyd:latest . and do not forget the trailing dot, which specifies the build context.

6. Tag the image for your registry account: docker tag scrapyd:latest username/scrapyd:latest (replace username with your Docker Hub username).

7. Run docker push username/scrapyd. The image is now published; from here on, installing and starting the Scrapyd service on any target machine takes a single command.

8. On the target machine, run docker run -d -p 6800:6800 qinexpire/scrapyd. That completes the whole process: the Scrapyd service is now running and listening on port 6800.
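Once the container is up, the service can be verified programmatically through Scrapyd's daemonstatus.json endpoint. A minimal sketch using only the Python standard library (the base URL assumes port 6800 was mapped to localhost as in step 8):

```python
import json
from urllib.request import urlopen


def daemon_status(base_url: str) -> dict:
    """Query Scrapyd's daemonstatus.json endpoint and return the parsed JSON.

    Works against any Scrapyd instance, e.g. the container started in
    step 8 with base_url="http://localhost:6800".
    """
    with urlopen(f"{base_url}/daemonstatus.json") as resp:
        return json.load(resp)


# Example usage (assumes the container from step 8 is running):
#   status = daemon_status("http://localhost:6800")
#   status["status"] is "ok" when the service is healthy
```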
