Updating your forked repository on GitHub
2018-01-11 14:37
Summary:
1. git fetch scrapy
2. git merge scrapy/master
3. git push origin master
GitHub has a feature called "fork" that copies another user's repository into your own account. It is convenient, but it has one drawback: when the upstream project is updated, your fork is not updated with it; you have to sync it yourself.
As an example, take the GitHub user lyt-python (an organization or user name) and its fork of the scrapy project (https://github.com/scrapy/scrapy.git):
1. Clone the fork from your own account:
$ git clone https://github.com/lyt-python/scrapy.git
$ cd scrapy
2. Add the original upstream repository as a remote (use git remote -v to list the remotes already configured):
$ git remote -v
origin  https://github.com/lyt-python/scrapy.git (fetch)
origin  https://github.com/lyt-python/scrapy.git (push)
If the upstream remote is not listed yet, add it:
$ git remote add scrapy https://github.com/scrapy/scrapy.git
List the remotes again to confirm:
$ git remote -v
origin  https://github.com/lyt-python/scrapy.git (fetch)
origin  https://github.com/lyt-python/scrapy.git (push)
scrapy  https://github.com/scrapy/scrapy.git (fetch)
scrapy  https://github.com/scrapy/scrapy.git (push)
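The remote-add step above can be guarded so it is safe to re-run, e.g. from a script. A minimal sketch, assuming the remote name and URL from this article; it builds a throwaway local repository so it runs without touching GitHub:

```shell
# Add the upstream remote only if it is not configured yet, so the
# snippet can be re-run safely. A throwaway repository stands in for
# the real clone; no network access is needed.
set -e
demo=$(mktemp -d)
git init -q "$demo"
cd "$demo"

upstream_name=scrapy
upstream_url=https://github.com/scrapy/scrapy.git

# `git remote get-url` fails when the remote does not exist; use that
# as the existence check before adding it.
if ! git remote get-url "$upstream_name" >/dev/null 2>&1; then
    git remote add "$upstream_name" "$upstream_url"
fi

git remote -v
```

Running the snippet twice is harmless: the second run finds the remote already present and skips the `git remote add`.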
3. Fetch the latest upstream changes into the local repository:
$ git fetch scrapy
The output looks like this:
remote: Counting objects: 2147, done.
remote: Compressing objects: 100% (11/11), done.
remote: Total 2147 (delta 1280), reused 1280 (delta 1280), pack-reused 856
Receiving objects: 100% (2147/2147), 490.60 KiB | 39.00 KiB/s, done.
Resolving deltas: 100% (1585/1585), completed with 204 local objects.
From https://github.com/scrapy/scrapy
 * [new branch]      0.12       -> scrapy/0.12
 * [new branch]      0.14       -> scrapy/0.14
 * [new branch]      0.16       -> scrapy/0.16
 * [new branch]      0.18       -> scrapy/0.18
 * [new branch]      0.20       -> scrapy/0.20
 * [new branch]      0.22       -> scrapy/0.22
 * [new branch]      0.24       -> scrapy/0.24
 * [new branch]      1.0        -> scrapy/1.0
 * [new branch]      1.1        -> scrapy/1.1
 * [new branch]      asyncio    -> scrapy/asyncio
 * [new branch]      deprecate-make-requests-from-url -> scrapy/deprecate-make-requests-from-url
 * [new branch]      disable-toplevel-2 -> scrapy/disable-toplevel-2
 * [new branch]      feature-1371-download-prios -> scrapy/feature-1371-download-prios
 * [new branch]      fix-1330   -> scrapy/fix-1330
 * [new branch]      fix-util-function-to-work-outside-project-dir -> scrapy/fix-util-function-to-work-outside-project-dir
 * [new branch]      master     -> scrapy/master
 * [new branch]      no-max-rss -> scrapy/no-max-rss
 * [new tag]         1.1.0rc3   -> 1.1.0rc3
 * [new tag]         1.0.4      -> 1.0.4
 * [new tag]         1.0.5      -> 1.0.5
 * [new tag]         1.1.0rc1   -> 1.1.0rc1
 * [new tag]         1.1.0rc2   -> 1.1.0rc2
 * [new tag]         1.2.0dev2  -> 1.2.0dev2
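After the fetch, and before merging, you can check how many commits your branch is behind upstream with `git rev-list --count`. A self-contained sketch: throwaway local repositories stand in for GitHub here, and the clone's default remote `origin` plays the upstream role:

```shell
set -e
work=$(mktemp -d)

# A throwaway "upstream" repository with two (empty) commits.
git init -q -b master "$work/upstream"
cd "$work/upstream"
git -c user.name=demo -c user.email=demo@example.com commit -qm "first" --allow-empty
git -c user.name=demo -c user.email=demo@example.com commit -qm "second" --allow-empty

# Simulate a fork that was cloned one commit ago.
git clone -q "$work/upstream" "$work/fork"
cd "$work/fork"
git reset -q --hard HEAD~1

# Fetch, then count the commits upstream has that we do not.
git fetch -q origin
behind=$(git rev-list --count HEAD..origin/master)
echo "behind upstream by $behind commit(s)"
```

In the real workflow the same check would read `git rev-list --count HEAD..scrapy/master`; if it prints 0, there is nothing to merge.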
4. Merge the upstream branch into your local branch:
$ git merge scrapy/master
The output looks like this:
Updating a5db7f8..ebef6d7
Fast-forward
 .bumpversion.cfg                                   |   2 +-
 .travis.yml                                        |   7 +-
 CODE_OF_CONDUCT.md                                 |  50 ++++
 README.rst                                         |   6 +
 conftest.py                                        |   2 +-
 docs/contributing.rst                              |   4 +-
 docs/faq.rst                                       |  22 +-
 docs/intro/install.rst                             |   8 +-
 docs/intro/overview.rst                            |   6 +-
 docs/intro/tutorial.rst                            |   8 +-
 docs/news.rst                                      | 242 ++++++++++++++++-
 docs/topics/api.rst                                |   2 +-
 docs/topics/architecture.rst                       |   6 +-
 docs/topics/autothrottle.rst                       |   2 +
 docs/topics/broad-crawls.rst                       |   2 +-
 docs/topics/commands.rst                           |   4 +-
 docs/topics/djangoitem.rst                         |   2 +-
 docs/topics/downloader-middleware.rst              |  40 ++-
 docs/topics/email.rst                              |  11 +-
 docs/topics/extensions.rst                         |   6 +-
 docs/topics/feed-exports.rst                       |  15 +-
 docs/topics/firebug.rst                            |   2 +-
 docs/topics/item-pipeline.rst                      |   4 +-
 docs/topics/media-pipeline.rst                     |  14 +-
 docs/topics/practices.rst                          |   2 +-
 docs/topics/request-response.rst                   |  53 ++--
 docs/topics/selectors.rst                          |  10 +-
 docs/topics/settings.rst                           | 164 ++++++++++--
 docs/topics/shell.rst                              |  58 ++++-
 docs/topics/signals.rst                            |   6 +-
 docs/topics/spider-middleware.rst                  |   7 +-
 docs/topics/spiders.rst                            |   4 +-
 docs/topics/stats.rst                              |   8 +-
 docs/topics/webservice.rst                         |   2 +-
 docs/versioning.rst                                |   2 +-
 requirements-py3.txt                               |   2 +-
 scrapy/VERSION                                     |   2 +-
 scrapy/_monkeypatches.py                           |   4 +-
 scrapy/cmdline.py                                  |   9 +-
 scrapy/commands/__init__.py                        |   2 -
 scrapy/commands/fetch.py                           |  13 +-
 scrapy/commands/settings.py                        |   9 +-
 scrapy/commands/shell.py                           |   7 +-
 scrapy/core/downloader/__init__.py                 |   2 +-
 scrapy/core/downloader/contextfactory.py           | 116 +++++++--
 scrapy/core/downloader/handlers/http10.py          |   5 +-
 scrapy/core/downloader/handlers/http11.py          | 119 +++++++--
 scrapy/core/downloader/handlers/s3.py              |  72 ++++--
 scrapy/core/downloader/tls.py                      |  16 ++
 scrapy/core/downloader/webclient.py                |  41 ++-
 scrapy/core/engine.py                              |  23 +-
 scrapy/core/scheduler.py                           |  13 +-
 scrapy/crawler.py                                  |  17 +-
 scrapy/downloadermiddlewares/ajaxcrawl.py          |   2 +-
 scrapy/downloadermiddlewares/httpcompression.py    |   8 +-
 scrapy/downloadermiddlewares/httpproxy.py          |  15 +-
 scrapy/downloadermiddlewares/retry.py              |   2 +-
 scrapy/downloadermiddlewares/robotstxt.py          |   8 +-
 scrapy/exporters.py                                |  77 ++++--
 scrapy/extensions/closespider.py                   |   4 +
 scrapy/extensions/feedexport.py                    |  37 ++-
 scrapy/extensions/httpcache.py                     |  69 ++---
 scrapy/extensions/spiderstate.py                   |   7 +-
 scrapy/http/cookies.py                             |  22 +-
 scrapy/http/headers.py                             |  10 +
 scrapy/http/request/form.py                        |  14 +-
 scrapy/http/response/text.py                       |   7 +-
 scrapy/linkextractors/__init__.py                  |   3 +-
 scrapy/mail.py                                     |   7 +-
 scrapy/middleware.py                               |   5 +-
 scrapy/pipelines/files.py                          | 103 ++++++--
 scrapy/responsetypes.py                            |   3 +-
 scrapy/selector/unified.py                         |   2 +-
 scrapy/settings/__init__.py                        |  29 ++-
 scrapy/settings/default_settings.py                |  11 +-
 scrapy/spiderloader.py                             |  11 +-
 scrapy/spiders/sitemap.py                          |   2 +-
 scrapy/squeues.py                                  |   4 +-
 scrapy/templates/project/module/settings.py.tmpl   |   3 +
 scrapy/utils/benchserver.py                        |  12 +-
 scrapy/utils/boto.py                               |  21 ++
 scrapy/utils/datatypes.py                          |  25 +-
 scrapy/utils/gz.py                                 |  28 +-
 scrapy/utils/iterators.py                          |  28 +-
 scrapy/utils/misc.py                               |   4 +-
 scrapy/utils/reactor.py                            |   2 +-
 scrapy/utils/response.py                           |   4 +-
 scrapy/utils/test.py                               |  35 ++-
 scrapy/utils/testsite.py                           |  12 +-
 scrapy/utils/url.py                                |  34 ++-
 scrapy/xlib/lsprofcalltree.py                      | 120 --------
 scrapy/xlib/pydispatch.py                          |  19 ++
 setup.cfg                                          |   3 +
 setup.py                                           |   4 +
 tests/mockserver.py                                |  89 ++++---
 tests/py3-ignores.txt                              |  29 ---
 tests/requirements-py3.txt                         |   2 +
 tests/spiders.py                                   |   2 +-
 tests/test_cmdline/__init__.py                     |   2 +-
 tests/test_command_fetch.py                        |   8 +-
 tests/test_command_shell.py                        |  40 ++-
 tests/test_commands.py                             |  71 +++--
 tests/test_crawl.py                                |   9 +-
 tests/test_downloader_handlers.py                  | 287 +++++++++++++++------
 tests/test_downloadermiddleware.py                 |   7 +-
 tests/test_downloadermiddleware_httpcache.py       |  13 +-
 tests/test_downloadermiddleware_httpcompression.py |  24 +-
 tests/test_downloadermiddleware_httpproxy.py       |  23 +-
 tests/test_downloadermiddleware_retry.py           |   8 +-
 tests/test_downloadermiddleware_robotstxt.py       |  12 +
 tests/test_engine.py                               |  14 +-
 tests/test_exporters.py                            | 206 +++++++++++----
 tests/test_feedexport.py                           |  63 ++++-
 tests/test_http_cookies.py                         |   4 +
 tests/test_http_request.py                         |  20 ++
 tests/test_http_response.py                        |  32 ++-
 tests/test_mail.py                                 |  54 +++-
 tests/test_pipeline_files.py                       |  41 ++-
 tests/test_pydispatch_deprecated.py                |  12 +
 tests/test_responsetypes.py                        |  11 +-
 tests/test_settings/__init__.py                    |  23 +-
 tests/test_spider.py                               |  12 +
 tests/test_spidermiddleware_offsite.py             |   9 +-
 tests/test_spiderstate.py                          |   6 +
 tests/test_utils_iterators.py                      |  83 +++++-
 tests/test_utils_url.py                            |  65 ++++-
 tests/test_webclient.py                            | 182 +++++++------
 tox.ini                                            |  10 +-
 128 files changed, 2577 insertions(+), 967 deletions(-)
 create mode 100644 CODE_OF_CONDUCT.md
 create mode 100644 scrapy/core/downloader/tls.py
 create mode 100644 scrapy/utils/boto.py
 delete mode 100644 scrapy/xlib/lsprofcalltree.py
 create mode 100644 scrapy/xlib/pydispatch.py
 create mode 100644 tests/test_pydispatch_deprecated.py
5. Push the merged code to your own fork (lyt-python) on GitHub:
$ git push origin master
The output looks like this:
Counting objects: 1802, done.
Delta compression using up to 24 threads.
Compressing objects: 100% (558/558), done.
Writing objects: 100% (1802/1802), 280.30 KiB | 0 bytes/s, done.
Total 1802 (delta 1395), reused 1644 (delta 1241)
To https://github.com/lyt-python/scrapy.git
   a5db7f8..ebef6d7  master -> master
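The whole five-step cycle can be rehearsed offline. In the sketch below, throwaway local repositories stand in for scrapy/scrapy (upstream) and lyt-python/scrapy (the fork); all paths are invented for the demo, but the fetch/merge/push sequence is exactly the one described above:

```shell
set -e
work=$(mktemp -d)

# A throwaway "upstream" repository, seeded with one commit.
git init -q -b master "$work/upstream"
cd "$work/upstream"
echo v1 > file.txt
git add file.txt
git -c user.name=demo -c user.email=demo@example.com commit -qm "initial"

# "Fork" the project: on GitHub this is the Fork button; locally, a bare clone.
git clone -q --bare "$work/upstream" "$work/fork.git"

# Step 1: clone the fork.  Step 2: add upstream as the "scrapy" remote.
git clone -q "$work/fork.git" "$work/scrapy"
cd "$work/scrapy"
git remote add scrapy "$work/upstream"

# Meanwhile, upstream moves ahead by one commit.
cd "$work/upstream"
echo v2 >> file.txt
git add file.txt
git -c user.name=demo -c user.email=demo@example.com commit -qm "upstream change"

# Steps 3-5: fetch upstream, merge it, and push the result to the fork.
cd "$work/scrapy"
git fetch -q scrapy
git merge -q scrapy/master
git push -q origin master
```

Because the clone had no commits of its own, the merge is a fast-forward, just like the `Updating a5db7f8..ebef6d7 / Fast-forward` output shown earlier; if you had committed to your fork's master in the meantime, the same `git merge` would create a merge commit instead.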