
Updating Your Forked Repository on GitHub

2018-02-01 17:06
GitHub has a feature called forking, which copies another user's repository into your own account. It is very convenient, but it has one drawback: when the source project is updated, your fork does not update along with it. You have to pull in the changes yourself.

As an example, suppose the GitHub user lyt-python (an organization or user name) has forked the project scrapy (https://github.com/scrapy/scrapy.git):

1. Clone the fork from your own account
$ git clone https://github.com/lyt-python/scrapy.git
$ cd scrapy

2. Add the original repository as a remote (you can list the current remotes with git remote -v)
$ git remote -v
origin  https://github.com/lyt-python/scrapy.git (fetch)
origin  https://github.com/lyt-python/scrapy.git (push)

If the original repository is not listed as a remote yet, add it:
$ git remote add scrapy https://github.com/scrapy/scrapy.git
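As an aside, the remote name itself is arbitrary: this article registers the original project as scrapy, while many guides use the conventional name upstream. The minimal sketch below (the demo directory name is made up for illustration) shows that adding a remote only records a name-to-URL mapping:

```shell
set -e
cd "$(mktemp -d)"
git init -q demo && cd demo
# "upstream" is only a naming convention; the article's choice of
# "scrapy" works exactly the same way.
git remote add upstream https://github.com/scrapy/scrapy.git
git remote -v
```

Nothing here contacts GitHub; git remote add merely writes the URL into .git/config, and no data is transferred until you fetch.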

Check the remote list again to confirm:
$ git remote -v

origin  https://github.com/lyt-python/scrapy.git (fetch)
origin  https://github.com/lyt-python/scrapy.git (push)
scrapy  https://github.com/scrapy/scrapy.git (fetch)
scrapy  https://github.com/scrapy/scrapy.git (push)

3. Fetch the new commits from the original remote
$ git fetch scrapy

You will see output like the following:
remote: Counting objects: 2147, done.
remote: Compressing objects: 100% (11/11), done.
remote: Total 2147 (delta 1280), reused 1280 (delta 1280), pack-reused 856
Receiving objects: 100% (2147/2147), 490.60 KiB | 39.00 KiB/s, done.
Resolving deltas: 100% (1585/1585), completed with 204 local objects.
From https://github.com/scrapy/scrapy
 * [new branch]      0.12       -> scrapy/0.12
* [new branch]      0.14       -> scrapy/0.14
* [new branch]      0.16       -> scrapy/0.16
* [new branch]      0.18       -> scrapy/0.18
* [new branch]      0.20       -> scrapy/0.20
* [new branch]      0.22       -> scrapy/0.22
* [new branch]      0.24       -> scrapy/0.24
* [new branch]      1.0        -> scrapy/1.0
* [new branch]      1.1        -> scrapy/1.1
* [new branch]      asyncio    -> scrapy/asyncio
* [new branch]      deprecate-make-requests-from-url -> scrapy/deprecate-make-requests-from-url
* [new branch]      disable-toplevel-2 -> scrapy/disable-toplevel-2
* [new branch]      feature-1371-download-prios -> scrapy/feature-1371-download-prios
* [new branch]      fix-1330   -> scrapy/fix-1330
* [new branch]      fix-util-function-to-work-outside-project-dir -> scrapy/fix-util-function-to-work-outside-project-dir
* [new branch]      master     -> scrapy/master
* [new branch]      no-max-rss -> scrapy/no-max-rss
* [new tag]         1.1.0rc3   -> 1.1.0rc3
* [new tag]         1.0.4      -> 1.0.4
* [new tag]         1.0.5      -> 1.0.5
* [new tag]         1.1.0rc1   -> 1.1.0rc1
* [new tag]         1.1.0rc2   -> 1.1.0rc2
* [new tag]         1.2.0dev2  -> 1.2.0dev2
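Note that git fetch only updates the remote-tracking branches (scrapy/*); your checked-out branch and working tree are untouched until the merge in the next step. The self-contained sketch below rehearses this with throwaway local repositories standing in for the GitHub URLs (the names source, clone and the file f are all invented for the illustration):

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"
C="git -c user.name=demo -c user.email=demo@example.com"   # throwaway identity

# A stand-in for the original project, with one commit the clone will
# already have and one it will not.
git init -q -b master source
( cd source && echo a > f && git add f && $C commit -q -m "shared history" )
git clone -q "$tmp/source" clone
( cd source && echo b > f && git add f && $C commit -q -m "new upstream work" )

cd clone
git remote add scrapy "$tmp/source"    # local path in place of the real URL
git fetch -q scrapy
# The fetch created scrapy/master but left the working tree alone:
cat f                                  # still the old content "a"
# List what scrapy/master has that your branch does not:
git log --oneline HEAD..scrapy/master
```

The HEAD..scrapy/master range is a convenient way to review what a merge would bring in before you actually run it.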

4. Merge the two versions of the code (make sure your local master branch is checked out first)
$ git merge scrapy/master

You will see output like the following:
Updating a5db7f8..ebef6d7
Fast-forward
.bumpversion.cfg                                   |   2 +-
.travis.yml                                        |   7 +-
CODE_OF_CONDUCT.md                                 |  50 ++++
README.rst                                         |   6 +
conftest.py                                        |   2 +-
docs/contributing.rst                              |   4 +-
docs/faq.rst                                       |  22 +-
docs/intro/install.rst                             |   8 +-
docs/intro/overview.rst                            |   6 +-
docs/intro/tutorial.rst                            |   8 +-
docs/news.rst                                      | 242 ++++++++++++++++-
docs/topics/api.rst                                |   2 +-
docs/topics/architecture.rst                       |   6 +-
docs/topics/autothrottle.rst                       |   2 +
docs/topics/broad-crawls.rst                       |   2 +-
docs/topics/commands.rst                           |   4 +-
docs/topics/djangoitem.rst                         |   2 +-
docs/topics/downloader-middleware.rst              |  40 ++-
docs/topics/email.rst                              |  11 +-
docs/topics/extensions.rst                         |   6 +-
docs/topics/feed-exports.rst                       |  15 +-
docs/topics/firebug.rst                            |   2 +-
docs/topics/item-pipeline.rst                      |   4 +-
docs/topics/media-pipeline.rst                     |  14 +-
docs/topics/practices.rst                          |   2 +-
docs/topics/request-response.rst                   |  53 ++--
docs/topics/selectors.rst                          |  10 +-
docs/topics/settings.rst                           | 164 ++++++++++--
docs/topics/shell.rst                              |  58 ++++-
docs/topics/signals.rst                            |   6 +-
docs/topics/spider-middleware.rst                  |   7 +-
docs/topics/spiders.rst                            |   4 +-
docs/topics/stats.rst                              |   8 +-
docs/topics/webservice.rst                         |   2 +-
docs/versioning.rst                                |   2 +-
requirements-py3.txt                               |   2 +-
scrapy/VERSION                                     |   2 +-
scrapy/_monkeypatches.py                           |   4 +-
scrapy/cmdline.py                                  |   9 +-
scrapy/commands/__init__.py                        |   2 -
scrapy/commands/fetch.py                           |  13 +-
scrapy/commands/settings.py                        |   9 +-
scrapy/commands/shell.py                           |   7 +-
scrapy/core/downloader/__init__.py                 |   2 +-
scrapy/core/downloader/contextfactory.py           | 116 +++++++--
scrapy/core/downloader/handlers/http10.py          |   5 +-
scrapy/core/downloader/handlers/http11.py          | 119 +++++++--
scrapy/core/downloader/handlers/s3.py              |  72 ++++--
scrapy/core/downloader/tls.py                      |  16 ++
scrapy/core/downloader/webclient.py                |  41 ++-
scrapy/core/engine.py                              |  23 +-
scrapy/core/scheduler.py                           |  13 +-
scrapy/crawler.py                                  |  17 +-
scrapy/downloadermiddlewares/ajaxcrawl.py          |   2 +-
scrapy/downloadermiddlewares/httpcompression.py    |   8 +-
scrapy/downloadermiddlewares/httpproxy.py          |  15 +-
scrapy/downloadermiddlewares/retry.py              |   2 +-
scrapy/downloadermiddlewares/robotstxt.py          |   8 +-
scrapy/exporters.py                                |  77 ++++--
scrapy/extensions/closespider.py                   |   4 +
scrapy/extensions/feedexport.py                    |  37 ++-
scrapy/extensions/httpcache.py                     |  69 ++---
scrapy/extensions/spiderstate.py                   |   7 +-
scrapy/http/cookies.py                             |  22 +-
scrapy/http/headers.py                             |  10 +
scrapy/http/request/form.py                        |  14 +-
scrapy/http/response/text.py                       |   7 +-
scrapy/linkextractors/__init__.py                  |   3 +-
scrapy/mail.py                                     |   7 +-
scrapy/middleware.py                               |   5 +-
scrapy/pipelines/files.py                          | 103 ++++++--
scrapy/responsetypes.py                            |   3 +-
scrapy/selector/unified.py                         |   2 +-
scrapy/settings/__init__.py                        |  29 ++-
scrapy/settings/default_settings.py                |  11 +-
scrapy/spiderloader.py                             |  11 +-
scrapy/spiders/sitemap.py                          |   2 +-
scrapy/squeues.py                                  |   4 +-
scrapy/templates/project/module/settings.py.tmpl   |   3 +
scrapy/utils/benchserver.py                        |  12 +-
scrapy/utils/boto.py                               |  21 ++
scrapy/utils/datatypes.py                          |  25 +-
scrapy/utils/gz.py                                 |  28 +-
scrapy/utils/iterators.py                          |  28 +-
scrapy/utils/misc.py                               |   4 +-
scrapy/utils/reactor.py                            |   2 +-
scrapy/utils/response.py                           |   4 +-
scrapy/utils/test.py                               |  35 ++-
scrapy/utils/testsite.py                           |  12 +-
scrapy/utils/url.py                                |  34 ++-
scrapy/xlib/lsprofcalltree.py                      | 120 ---------
scrapy/xlib/pydispatch.py                          |  19 ++
setup.cfg                                          |   3 +
setup.py                                           |   4 +
tests/mockserver.py                                |  89 ++++---
tests/py3-ignores.txt                              |  29 ---
tests/requirements-py3.txt                         |   2 +
tests/spiders.py                                   |   2 +-
tests/test_cmdline/__init__.py                     |   2 +-
tests/test_command_fetch.py                        |   8 +-
tests/test_command_shell.py                        |  40 ++-
tests/test_commands.py                             |  71 +++--
tests/test_crawl.py                                |   9 +-
tests/test_downloader_handlers.py                  | 287 +++++++++++++++------
tests/test_downloadermiddleware.py                 |   7 +-
tests/test_downloadermiddleware_httpcache.py       |  13 +-
tests/test_downloadermiddleware_httpcompression.py |  24 +-
tests/test_downloadermiddleware_httpproxy.py       |  23 +-
tests/test_downloadermiddleware_retry.py           |   8 +-
tests/test_downloadermiddleware_robotstxt.py       |  12 +
tests/test_engine.py                               |  14 +-
tests/test_exporters.py                            | 206 +++++++++++----
tests/test_feedexport.py                           |  63 ++++-
tests/test_http_cookies.py                         |   4 +
tests/test_http_request.py                         |  20 ++
tests/test_http_response.py                        |  32 ++-
tests/test_mail.py                                 |  54 +++-
tests/test_pipeline_files.py                       |  41 ++-
tests/test_pydispatch_deprecated.py                |  12 +
tests/test_responsetypes.py                        |  11 +-
tests/test_settings/__init__.py                    |  23 +-
tests/test_spider.py                               |  12 +
tests/test_spidermiddleware_offsite.py             |   9 +-
tests/test_spiderstate.py                          |   6 +
tests/test_utils_iterators.py                      |  83 +++++-
tests/test_utils_url.py                            |  65 ++++-
tests/test_webclient.py                            | 182 +++++++------
tox.ini                                            |  10 +-
128 files changed, 2577 insertions(+), 967 deletions(-)
create mode 100644 CODE_OF_CONDUCT.md
create mode 100644 scrapy/core/downloader/tls.py
create mode 100644 scrapy/utils/boto.py
delete mode 100644 scrapy/xlib/lsprofcalltree.py
create mode 100644 scrapy/xlib/pydispatch.py
create mode 100644 tests/test_pydispatch_deprecated.py

5. Push the merged code to your own fork (lyt-python) on GitHub
$ git push origin master
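The entire workflow can be rehearsed offline. In the sketch below, throwaway local repositories stand in for the GitHub URLs (the names upstream, fork.git and work are invented for the illustration): a "fork" is just a copy taken at one point in time, after which the original moves ahead and steps 1-5 bring the copy back up to date.

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"
C="git -c user.name=demo -c user.email=demo@example.com"   # throwaway identity

# Stand-in for the original project (scrapy/scrapy)
git init -q -b master upstream
( cd upstream && echo v1 > VERSION && git add VERSION && $C commit -q -m "upstream: v1" )

# "Forking" amounts to copying the repository at this point in time
git clone -q --bare "$tmp/upstream" fork.git

# The original moves on after the fork was made
( cd upstream && echo v2 > VERSION && git add VERSION && $C commit -q -m "upstream: v2" )

# Steps 1-5 of the article, with local paths in place of GitHub URLs
git clone -q "$tmp/fork.git" work      # step 1: clone your fork
cd work
git remote add scrapy "$tmp/upstream"  # step 2: register the original as a remote
git fetch -q scrapy                    # step 3: fetch its new commits
git merge -q scrapy/master             # step 4: fast-forward merge into master
git push -q origin master              # step 5: update the fork
cat VERSION                            # the clone now has v2
```

Because the clone has no commits of its own, the merge in step 4 is a fast-forward, exactly as in the article's output above; with local commits of your own, git would create a merge commit instead.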