Cloud Drive Transfer

Baidu Netdisk

Danger

This project has been deleted from GitHub, so it is not guaranteed to keep working. If it no longer works, download the files from Baidu Netdisk and upload them by other means.

BaiduPCS-Go is a command-line client for Baidu Netdisk that supports the full range of file operations on the drive. The project lives at iikira/BaiduPCS-Go; see its documentation for more advanced and detailed usage.

Logging in to Baidu Netdisk

  1. Run the BaiduPCS-Go login command to start the interactive login.

  2. Enter your username and password.

  3. Open the link shown in the prompt to view the captcha, then enter it.

  4. If further verification is required, choose a verification method and enter the verification code.

~# BaiduPCS-Go login
请输入百度用户名(手机号/邮箱/用户名), 回车键提交 > AAAAAAAA
请输入密码(输入的密码无回显, 确认输入完成, 回车提交即可) >

验证码有误,请重新输入
打开以下路径, 以查看验证码
/tmp/captcha.png

或者打开以下的网址, 以查看验证码
https://wappass.baidu.com/cgi-bin/genimage?AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA

请输入验证码 > xsgb

需要验证手机或邮箱才能登录
选择一种验证方式
1: 手机: 131******51
2: 邮箱: 未找到邮箱地址

请输入验证方式 (1 或 2) > 1
消息: 验证码已发送至你的手机 139******11

请输入接收到的验证码 > 021956
百度帐号登录成功: AAAAAAAA
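
If the interactive login keeps failing (for example because the captcha flow has changed), BaiduPCS-Go can also log in with the BDUSS cookie of a browser session that is already signed in to Baidu Netdisk. A minimal sketch, assuming the -bduss flag supported by the version you have installed; the value is a placeholder:

# Log in with a BDUSS cookie copied from the browser (the value below is a placeholder)
~# BaiduPCS-Go login -bduss=<BDUSS>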

Working with Baidu Netdisk

Baidu Netdisk is operated much like a local file system: commands run against the current remote directory, which serves as the working directory.

Important

File checksum failures

When downloading, you may see the message "该文件校验失败, 文件md5值与服务器记录的不匹配" (checksum failed: the file's md5 does not match the server's record). Add --nocheck to the download command to skip the check. This appears to be caused by a change in the Baidu Netdisk API; the downloaded file is in fact identical to the one on the server, but please keep it in mind.

# Show the current remote directory
~# BaiduPCS-Go pwd
/

# List files in the current directory
~# BaiduPCS-Go ls

当前目录: /
----
#  文件大小       修改日期               文件(目录)
0         -  2014-05-29 12:08:35  apps/
1         -  2015-03-20 21:06:52  docs/
    总: 0B                       文件总数: 0, 目录总数: 2
----

# Change the working directory
~# BaiduPCS-Go cd apps
改变工作目录: /apps

# Upload a file
~# BaiduPCS-Go upload localfile.txt /
[1] 加入上传队列: localfile.txt
[1] 准备上传: localfile.txt
[1] 秒传失败, 开始上传文件...

[1] ↑ 10B/10B 5B/s in 2s .............
[1] 上传文件成功, 保存到网盘路径: /localfile.txt

全部上传完毕, 总大小: 10B
~# BaiduPCS-Go ls

当前目录: /
----
#  文件大小       修改日期               文件(目录)
0         -  2014-05-29 12:08:35  apps/
1         -  2015-03-20 21:06:52  docs/
2       10B  2020-02-27 16:24:30  localfile.txt
    总: 10B                       文件总数: 1, 目录总数: 2
----

# Set the directory where downloaded files are saved
~# BaiduPCS-Go config set -savedir ~
保存配置成功!

# Download a file
~# BaiduPCS-Go download --nocheck /remotefile.txt

[0] 提示: 当前下载最大并发量为: 8, 下载缓存为: 65536
[1] 加入下载队列: /remotefile.txt

[1] ----
类型              文件
文件路径          /remotefile.txt
文件名称          remotefile.txt
文件大小          11, 11B
md5 (截图请打码)  ed7de0053r6d070f93d967a5caac39e1
app_id            250528
fs_id             112464246121328
创建日期          2020-02-27 16:36:54
修改日期          2020-02-27 16:36:54

[1] 准备下载: /remotefile.txt
[1] 将会下载到路径: /root/AAAAAAAA/remotefile.txt

[1] ↓ 11B/11B 11B/s in 4s, left 0s ............
[1] 下载完成, 保存位置: /root/AAAAAAAA/remotefile.txt

任务结束, 时间: 7.322s, 数据总量: 11B
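
The upload and download commands also accept directories, which is convenient for moving an entire data set at once. A brief sketch; the paths are hypothetical:

# Upload a local directory to /apps on the drive
~# BaiduPCS-Go upload ./dataset /apps

# Download a whole remote directory, again skipping the md5 check
~# BaiduPCS-Go download --nocheck /apps/dataset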

Adding Remote Storage Configurations with Rclone

Rclone supports a large number of cloud storage services and drives; see the Rclone project website for the full list.

Multiple kinds of drives or storage backends can be added; a configuration is added interactively by entering the authentication and configuration details.

Once a configuration has been added, rclone commands can operate on files both locally and on the remote storage. Because every operation goes through Rclone, the differences between the various backends are hidden, which makes them easy to use.
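
Before adding remotes, it can help to confirm the installation and see where Rclone keeps its configuration; a short sketch using standard rclone subcommands:

# Show the installed version
~# rclone version

# Show the path of the configuration file that stores the remotes
~# rclone config file

# List the names of all configured remotes (empty until one is added)
~# rclone listremotes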

SFTP

Adding the configuration

If you use key-based authentication, generate the key or put it in place beforehand.

Use rclone config to add the drive.

  1. Choose to create a new remote: enter n

  2. Enter a custom name for this remote configuration; this name is what you will use to operate on the drive, e.g. sftp

  3. Enter the storage type: sftp

  4. Enter the SSH host address

  5. Enter the username and the port

  6. Choose the authentication method: enter y to use a password, or n to use a key

  7. Enter the key file location: ~/.ssh/id_rsa

  8. Specify whether the key is encrypted: enter n if it is not

  9. Whether to force the use of ssh-agent: enter false

  10. Whether to enable insecure ciphers: enter false

  11. Whether to disable the hash check: enter false

  12. Whether to edit the advanced config: enter n

  13. Confirm the configuration: enter y

  14. After confirming, the current list of remotes is printed; enter q to quit

~# rclone config
No remotes found - make a new one
n) New remote
s) Set configuration password
q) Quit config
n/s/q> n
name> sftp
Type of storage to configure.
Enter a string value. Press Enter for the default ("").
Choose a number from below, or type in your own value
1 / 1Fichier
\ "fichier"
2 / Alias for an existing remote
\ "alias"
3 / Amazon Drive
\ "amazon cloud drive"
4 / Amazon S3 Compliant Storage Provider (AWS, Alibaba, Ceph, Digital Ocean, Dreamhost, IBM COS, Minio, etc)
\ "s3"
5 / Backblaze B2
\ "b2"
6 / Box
\ "box"
7 / Cache a remote
\ "cache"
8 / Citrix Sharefile
\ "sharefile"
9 / Dropbox
\ "dropbox"
10 / Encrypt/Decrypt a remote
\ "crypt"
11 / FTP Connection
\ "ftp"
12 / Google Cloud Storage (this is not Google Drive)
\ "google cloud storage"
13 / Google Drive
\ "drive"
14 / Google Photos
\ "google photos"
15 / Hubic
\ "hubic"
16 / In memory object storage system.
\ "memory"
17 / JottaCloud
\ "jottacloud"
18 / Koofr
\ "koofr"
19 / Local Disk
\ "local"
20 / Mail.ru Cloud
\ "mailru"
21 / Mega
\ "mega"
22 / Microsoft Azure Blob Storage
\ "azureblob"
23 / Microsoft OneDrive
\ "onedrive"
24 / OpenDrive
\ "opendrive"
25 / Openstack Swift (Rackspace Cloud Files, Memset Memstore, OVH)
\ "swift"
26 / Pcloud
\ "pcloud"
27 / Put.io
\ "putio"
28 / QingCloud Object Storage
\ "qingstor"
29 / SSH/SFTP Connection
\ "sftp"
30 / Sugarsync
\ "sugarsync"
31 / Transparently chunk/split large files
\ "chunker"
32 / Union merges the contents of several remotes
\ "union"
33 / Webdav
\ "webdav"
34 / Yandex Disk
\ "yandex"
35 / http Connection
\ "http"
36 / premiumize.me
\ "premiumizeme"
Storage> sftp
** See help for sftp backend at: https://rclone.org/sftp/ **

SSH host to connect to
Enter a string value. Press Enter for the default ("").
Choose a number from below, or type in your own value
1 / Connect to example.com
\ "example.com"
host> 1.1.1.1
SSH username, leave blank for current username, root
Enter a string value. Press Enter for the default ("").
user> root
SSH port, leave blank to use default (22)
Enter a string value. Press Enter for the default ("").
port> 22
SSH password, leave blank to use ssh-agent.
y) Yes type in my own password
g) Generate random password
n) No leave this optional password blank (default)
y/g/n> n
Path to PEM-encoded private key file, leave blank or set key-use-agent to use ssh-agent.
Enter a string value. Press Enter for the default ("").
key_file> ~/.ssh/id_rsa
The passphrase to decrypt the PEM-encoded private key file.

Only PEM encrypted key files (old OpenSSH format) are supported. Encrypted keys
in the new OpenSSH format can't be used.
y) Yes type in my own password
g) Generate random password
n) No leave this optional password blank (default)
y/g/n> n
When set forces the usage of the ssh-agent.

When key-file is also set, the ".pub" file of the specified key-file is read and only the associated key is
requested from the ssh-agent. This allows to avoid `Too many authentication failures for *username*` errors
when the ssh-agent contains many keys.
Enter a boolean value (true or false). Press Enter for the default ("false").
key_use_agent> false
Enable the use of insecure ciphers and key exchange methods.

This enables the use of the the following insecure ciphers and key exchange methods:

- aes128-cbc
- aes192-cbc
- aes256-cbc
- 3des-cbc
- diffie-hellman-group-exchange-sha256
- diffie-hellman-group-exchange-sha1

Those algorithms are insecure and may allow plaintext data to be recovered by an attacker.
Enter a boolean value (true or false). Press Enter for the default ("false").
Choose a number from below, or type in your own value
1 / Use default Cipher list.
\ "false"
2 / Enables the use of the aes128-cbc cipher and diffie-hellman-group-exchange-sha256, diffie-hellman-group-exchange-sha1 key exchange.
\ "true"
use_insecure_cipher> false
Disable the execution of SSH commands to determine if remote file hashing is available.
Leave blank or set to false to enable hashing (recommended), set to true to disable hashing.
Enter a boolean value (true or false). Press Enter for the default ("false").
disable_hashcheck> false
Edit advanced config? (y/n)
y) Yes
n) No (default)
y/n> n
Remote config
--------------------
[sftp]
type = sftp
host = 1.1.1.1
user = root
port = 22
key_use_agent = false
use_insecure_cipher = false
disable_hashcheck = false
--------------------
y) Yes this is OK (default)
e) Edit this remote
d) Delete this remote
y/e/d> y
Current remotes:

Name                 Type
====                 ====
sftp                 sftp

e) Edit existing remote
n) New remote
d) Delete remote
r) Rename remote
c) Copy remote
s) Set configuration password
q) Quit config
e/n/d/r/c/s/q> q
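
The same SFTP remote can also be created non-interactively with rclone config create, which is convenient in scripts; a hedged sketch that mirrors the answers above (the host, user and key path are the example values):

# Create the sftp remote in one command instead of the interactive wizard
~# rclone config create sftp sftp host 1.1.1.1 user root port 22 key_file ~/.ssh/id_rsa

# Verify the new remote by listing its top-level directories
~# rclone lsd sftp: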

Object Storage (OSS)

Alibaba OSS

Adding the configuration

Use rclone config to add the drive.

  1. Choose to create a new remote: enter n

  2. Enter a custom name for this remote configuration; this name is what you will use to operate on the drive, e.g. oss

  3. Enter the storage type: s3

  4. Choose the S3 provider: enter Alibaba

  5. Choose where the credentials come from: enter false to type the keys in the next step

  6. Enter the access_key_id and secret_access_key

  7. Choose the OSS endpoint to connect to: enter the address or its number

  8. Choose the ACL mode: enter private for full control

  9. Enter the storage class: leave blank

  10. Whether to edit the advanced config: enter n

  11. Confirm the configuration: enter y

  12. After confirming, the current list of remotes is printed; enter q to quit

Important

Notes on using OSS

When using an OSS remote, append the bucket after the remote name, i.e. "<remote>:<BUCKET>". For example, to list all files in the test bucket of the remote named oss: rclone ls oss:test/

~# rclone config
No remotes found - make a new one
n) New remote
s) Set configuration password
q) Quit config
n/s/q> n
name> oss
Type of storage to configure.
Enter a string value. Press Enter for the default ("").
Choose a number from below, or type in your own value
1 / 1Fichier
\ "fichier"
2 / Alias for an existing remote
\ "alias"
3 / Amazon Drive
\ "amazon cloud drive"
4 / Amazon S3 Compliant Storage Provider (AWS, Alibaba, Ceph, Digital Ocean, Dreamhost, IBM COS, Minio, etc)
\ "s3"
5 / Backblaze B2
\ "b2"
6 / Box
\ "box"
7 / Cache a remote
\ "cache"
8 / Citrix Sharefile
\ "sharefile"
9 / Dropbox
\ "dropbox"
10 / Encrypt/Decrypt a remote
\ "crypt"
11 / FTP Connection
\ "ftp"
12 / Google Cloud Storage (this is not Google Drive)
\ "google cloud storage"
13 / Google Drive
\ "drive"
14 / Google Photos
\ "google photos"
15 / Hubic
\ "hubic"
16 / In memory object storage system.
\ "memory"
17 / JottaCloud
\ "jottacloud"
18 / Koofr
\ "koofr"
19 / Local Disk
\ "local"
20 / Mail.ru Cloud
\ "mailru"
21 / Mega
\ "mega"
22 / Microsoft Azure Blob Storage
\ "azureblob"
23 / Microsoft OneDrive
\ "onedrive"
24 / OpenDrive
\ "opendrive"
25 / Openstack Swift (Rackspace Cloud Files, Memset Memstore, OVH)
\ "swift"
26 / Pcloud
\ "pcloud"
27 / Put.io
\ "putio"
28 / QingCloud Object Storage
\ "qingstor"
29 / SSH/SFTP Connection
\ "sftp"
30 / Sugarsync
\ "sugarsync"
31 / Transparently chunk/split large files
\ "chunker"
32 / Union merges the contents of several remotes
\ "union"
33 / Webdav
\ "webdav"
34 / Yandex Disk
\ "yandex"
35 / http Connection
\ "http"
36 / premiumize.me
\ "premiumizeme"
Storage> s3
** See help for s3 backend at: https://rclone.org/s3/ **

Choose your S3 provider.
Enter a string value. Press Enter for the default ("").
Choose a number from below, or type in your own value
1 / Amazon Web Services (AWS) S3
\ "AWS"
2 / Alibaba Cloud Object Storage System (OSS) formerly Aliyun
\ "Alibaba"
3 / Ceph Object Storage
\ "Ceph"
4 / Digital Ocean Spaces
\ "DigitalOcean"
5 / Dreamhost DreamObjects
\ "Dreamhost"
6 / IBM COS S3
\ "IBMCOS"
7 / Minio Object Storage
\ "Minio"
8 / Netease Object Storage (NOS)
\ "Netease"
9 / StackPath Object Storage
\ "StackPath"
10 / Wasabi Object Storage
\ "Wasabi"
11 / Any other S3 compatible provider
\ "Other"
provider> Alibaba
Get AWS credentials from runtime (environment variables or EC2/ECS meta data if no env vars).
Only applies if access_key_id and secret_access_key is blank.
Enter a boolean value (true or false). Press Enter for the default ("false").
Choose a number from below, or type in your own value
1 / Enter AWS credentials in the next step
\ "false"
2 / Get AWS credentials from the environment (env vars or IAM)
\ "true"
env_auth> false
AWS Access Key ID.
Leave blank for anonymous access or runtime credentials.
Enter a string value. Press Enter for the default ("").
access_key_id> AAAAAAAAAAAAAAAAAAAAAAAA
AWS Secret Access Key (password)
Leave blank for anonymous access or runtime credentials.
Enter a string value. Press Enter for the default ("").
secret_access_key> AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
Endpoint for OSS API.
Enter a string value. Press Enter for the default ("").
Choose a number from below, or type in your own value
1 / East China 1 (Hangzhou)
\ "oss-cn-hangzhou.aliyuncs.com"
2 / East China 2 (Shanghai)
\ "oss-cn-shanghai.aliyuncs.com"
3 / North China 1 (Qingdao)
\ "oss-cn-qingdao.aliyuncs.com"
4 / North China 2 (Beijing)
\ "oss-cn-beijing.aliyuncs.com"
5 / North China 3 (Zhangjiakou)
\ "oss-cn-zhangjiakou.aliyuncs.com"
6 / North China 5 (Huhehaote)
\ "oss-cn-huhehaote.aliyuncs.com"
7 / South China 1 (Shenzhen)
\ "oss-cn-shenzhen.aliyuncs.com"
8 / Hong Kong (Hong Kong)
\ "oss-cn-hongkong.aliyuncs.com"
9 / US West 1 (Silicon Valley)
\ "oss-us-west-1.aliyuncs.com"
10 / US East 1 (Virginia)
\ "oss-us-east-1.aliyuncs.com"
11 / Southeast Asia Southeast 1 (Singapore)
\ "oss-ap-southeast-1.aliyuncs.com"
12 / Asia Pacific Southeast 2 (Sydney)
\ "oss-ap-southeast-2.aliyuncs.com"
13 / Southeast Asia Southeast 3 (Kuala Lumpur)
\ "oss-ap-southeast-3.aliyuncs.com"
14 / Asia Pacific Southeast 5 (Jakarta)
\ "oss-ap-southeast-5.aliyuncs.com"
15 / Asia Pacific Northeast 1 (Japan)
\ "oss-ap-northeast-1.aliyuncs.com"
16 / Asia Pacific South 1 (Mumbai)
\ "oss-ap-south-1.aliyuncs.com"
17 / Central Europe 1 (Frankfurt)
\ "oss-eu-central-1.aliyuncs.com"
18 / West Europe (London)
\ "oss-eu-west-1.aliyuncs.com"
19 / Middle East 1 (Dubai)
\ "oss-me-east-1.aliyuncs.com"
endpoint> oss-cn-shanghai.aliyuncs.com
Canned ACL used when creating buckets and storing or copying objects.

This ACL is used for creating objects and if bucket_acl isn't set, for creating buckets too.

For more info visit https://docs.aws.amazon.com/AmazonS3/latest/dev/acl-overview.html#canned-acl

Note that this ACL is applied when server side copying objects as S3
doesn't copy the ACL from the source but rather writes a fresh one.
Enter a string value. Press Enter for the default ("").
Choose a number from below, or type in your own value
1 / Owner gets FULL_CONTROL. No one else has access rights (default).
\ "private"
2 / Owner gets FULL_CONTROL. The AllUsers group gets READ access.
\ "public-read"
/ Owner gets FULL_CONTROL. The AllUsers group gets READ and WRITE access.
3 | Granting this on a bucket is generally not recommended.
\ "public-read-write"
4 / Owner gets FULL_CONTROL. The AuthenticatedUsers group gets READ access.
\ "authenticated-read"
/ Object owner gets FULL_CONTROL. Bucket owner gets READ access.
5 | If you specify this canned ACL when creating a bucket, Amazon S3 ignores it.
\ "bucket-owner-read"
/ Both the object owner and the bucket owner get FULL_CONTROL over the object.
6 | If you specify this canned ACL when creating a bucket, Amazon S3 ignores it.
\ "bucket-owner-full-control"
acl> private
The storage class to use when storing new objects in OSS.
Enter a string value. Press Enter for the default ("").
Choose a number from below, or type in your own value
1 / Default
\ ""
2 / Standard storage class
\ "STANDARD"
3 / Archive storage mode.
\ "GLACIER"
4 / Infrequent access storage mode.
\ "STANDARD_IA"
storage_class> 1
Edit advanced config? (y/n)
y) Yes
n) No (default)
y/n> n
Remote config
--------------------
[oss]
type = s3
provider = Alibaba
env_auth = false
access_key_id = AAAAAAAAAAAAAAAAAAAAAAAA
secret_access_key = AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
endpoint = oss-cn-shanghai.aliyuncs.com
acl = private
--------------------
y) Yes this is OK (default)
e) Edit this remote
d) Delete this remote
y/e/d> y
Current remotes:

Name                 Type
====                 ====
oss                  s3

e) Edit existing remote
n) New remote
d) Delete remote
r) Rename remote
c) Copy remote
s) Set configuration password
q) Quit config
e/n/d/r/c/s/q> q
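
The OSS remote can likewise be created non-interactively, and because S3-style remotes address buckets after the colon, rclone lsd with nothing after the colon lists the buckets themselves. A hedged sketch; the keys are placeholders:

# Create the oss remote in one command (the keys below are placeholders)
~# rclone config create oss s3 provider Alibaba env_auth false access_key_id AAAA secret_access_key AAAA endpoint oss-cn-shanghai.aliyuncs.com acl private

# List the buckets visible to this key
~# rclone lsd oss: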

OneDrive

Obtaining an authorization token

Using this drive requires obtaining an account token and approving third-party access.

First download Rclone on a local machine, run the Rclone authorize command to open a browser, log in and grant access on the web page, then copy the field the command returns as prompted; it will be used later to add the drive in the cloud environment.

Download the file for your platform from Rclone Download and install it, or install it with the script. After installation, run the rclone command to check that it was installed successfully.

Use Rclone to authorize and obtain the token. Running the command opens the drive's login page; if it does not appear, open the link printed by the command manually.

~# ./rclone authorize "onedrive"
If your browser doesn't open automatically go to the following link: http://127.0.0.1:53682/auth?state=SzoLkcm_8c2SEmYMGSf2lA
Log in and authorize rclone for access
Waiting for code...

Log in with your account

../_images/01.png ../_images/02.png

Authorize Rclone

../_images/03.png

After a successful authorization, close the browser

../_images/04.png

Copy the line between ---> and <---End paste; it is used to configure the drive in the cloud environment.

Got code
Paste the following into your remote machine --->
{"access_token":AAAA","token_type":"Bearer","refresh_token":"AAAA","expiry":"2020-02-27T11:50:18.679297+08:00"}
<---End paste

Adding the configuration

Use rclone config to add the drive.

  1. Choose to create a new remote: enter n

  2. Enter a custom name for this remote configuration; this name is what you will use to operate on the drive, e.g. onedrive

  3. Enter the storage type: onedrive

  4. Enter client_id and client_secret: leave both blank

  5. Whether to edit the advanced config: enter n

  6. Whether to use auto config: enter n

  7. Enter the token: paste the complete authorization field obtained locally earlier

  8. Choose the drive type: enter onedrive

  9. Choose the drive: enter 0

  10. Confirm the selected drive: enter y

  11. Confirm the configuration: enter y

  12. After confirming, the current list of remotes is printed; enter q to quit

~# rclone config
No remotes found - make a new one
n) New remote
s) Set configuration password
q) Quit config
n/s/q> n
name> onedrive
Type of storage to configure.
Enter a string value. Press Enter for the default ("").
Choose a number from below, or type in your own value
1 / 1Fichier
\ "fichier"
2 / Alias for an existing remote
\ "alias"
3 / Amazon Drive
\ "amazon cloud drive"
4 / Amazon S3 Compliant Storage Provider (AWS, Alibaba, Ceph, Digital Ocean, Dreamhost, IBM COS, Minio, etc)
\ "s3"
5 / Backblaze B2
\ "b2"
6 / Box
\ "box"
7 / Cache a remote
\ "cache"
8 / Citrix Sharefile
\ "sharefile"
9 / Dropbox
\ "dropbox"
10 / Encrypt/Decrypt a remote
\ "crypt"
11 / FTP Connection
\ "ftp"
12 / Google Cloud Storage (this is not Google Drive)
\ "google cloud storage"
13 / Google Drive
\ "drive"
14 / Google Photos
\ "google photos"
15 / Hubic
\ "hubic"
16 / In memory object storage system.
\ "memory"
17 / JottaCloud
\ "jottacloud"
18 / Koofr
\ "koofr"
19 / Local Disk
\ "local"
20 / Mail.ru Cloud
\ "mailru"
21 / Mega
\ "mega"
22 / Microsoft Azure Blob Storage
\ "azureblob"
23 / Microsoft OneDrive
\ "onedrive"
24 / OpenDrive
\ "opendrive"
25 / Openstack Swift (Rackspace Cloud Files, Memset Memstore, OVH)
\ "swift"
26 / Pcloud
\ "pcloud"
27 / Put.io
\ "putio"
28 / QingCloud Object Storage
\ "qingstor"
29 / SSH/SFTP Connection
\ "sftp"
30 / Sugarsync
\ "sugarsync"
31 / Transparently chunk/split large files
\ "chunker"
32 / Union merges the contents of several remotes
\ "union"
33 / Webdav
\ "webdav"
34 / Yandex Disk
\ "yandex"
35 / http Connection
\ "http"
36 / premiumize.me
\ "premiumizeme"
Storage> onedrive
** See help for onedrive backend at: https://rclone.org/onedrive/ **

Microsoft App Client Id
Leave blank normally.
Enter a string value. Press Enter for the default ("").
client_id>
Microsoft App Client Secret
Leave blank normally.
Enter a string value. Press Enter for the default ("").
client_secret>
Edit advanced config? (y/n)
y) Yes
n) No (default)
y/n> n
Remote config
Use auto config?
* Say Y if not sure
* Say N if you are working on a remote or headless machine
y) Yes (default)
n) No
y/n> n
For this to work, you will need rclone available on a machine that has a web browser available.
Execute the following on your machine (same rclone version recommended) :
    rclone authorize "onedrive"
Then paste the result below:
result> {"access_token":AAAA","token_type":"Bearer","refresh_token":"AAAA","expiry":"2020-02-27T11:50:18.679297+08:00"}
Choose a number from below, or type in an existing value
1 / OneDrive Personal or Business
\ "onedrive"
2 / Root Sharepoint site
\ "sharepoint"
3 / Type in driveID
\ "driveid"
4 / Type in SiteID
\ "siteid"
5 / Search a Sharepoint site
\ "search"
Your choice> onedrive
Found 1 drives, please select the one you want to use:
0:  (personal) id=ea7ddb67ac0bce34
Chose drive to use:> 0
Found drive 'root' of type 'personal', URL: https://onedrive.live.com/?cid=ea7ddb67ac0bce34
Is that okay?
y) Yes (default)
n) No
y/n> y
--------------------
[onedrive]
type = onedrive
token = {"access_token":AAAA","token_type":"Bearer","refresh_token":"AAAA","expiry":"2020-02-27T11:50:18.679297+08:00"}
drive_id = ea7ddb67ac0bce34
drive_type = personal
--------------------
y) Yes this is OK (default)
e) Edit this remote
d) Delete this remote
y/e/d> y
Current remotes:

Name                 Type
====                 ====
onedrive             onedrive

e) Edit existing remote
n) New remote
d) Delete remote
r) Rename remote
c) Copy remote
s) Set configuration password
q) Quit config
e/n/d/r/c/s/q> q
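
After the configuration is saved, a quick listing confirms that the token works before starting any large transfer; a short sketch (the backup folder name is just an example):

# List the top-level folders of the OneDrive remote
~# rclone lsd onedrive:

# Create a folder to hold uploads from the cloud environment
~# rclone mkdir onedrive:backup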

Dropbox

Obtaining an authorization token

Using this drive requires obtaining an account token and approving third-party access.

First download Rclone on a local machine, run the Rclone authorize command to open a browser, log in and grant access on the web page, then copy the field the command returns as prompted; it will be used later to add the drive in the cloud environment.

Download the file for your platform from Rclone Download and install it, or install it with the script. After installation, run the rclone command to check that it was installed successfully.

Use Rclone to authorize and obtain the token. Running the command opens the drive's login page; if it does not appear, open the link printed by the command manually.

~# ./rclone authorize "dropbox"
If your browser doesn't open automatically go to the following link: http://127.0.0.1:53682/auth?state=2iq_SZoOScHjhD4ubld9-w
Log in and authorize rclone for access
Waiting for code...

Log in with your account

../_images/011.png

Complete the human verification

../_images/021.png

Get the verification code from your email and continue

../_images/031.png

Authorize Rclone

../_images/041.png

After a successful authorization, close the browser

../_images/05.png

Copy the line between ---> and <---End paste; it is used to configure the drive in the cloud environment.

Got code
Paste the following into your remote machine --->
{"access_token":"AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA","token_type":"bearer","expiry":"0001-01-01T00:00:00Z"}
<---End paste

Adding the configuration

Use rclone config to add the drive.

  1. Choose to create a new remote: enter n

  2. Enter a custom name for this remote configuration; this name is what you will use to operate on the drive, e.g. dropbox

  3. Enter the storage type: dropbox

  4. Enter client_id and client_secret: leave both blank

  5. Whether to edit the advanced config: enter n

  6. Whether to use auto config: enter n

  7. Enter the token: paste the complete authorization field obtained locally earlier

  8. Confirm the configuration: enter y

  9. After confirming, the current list of remotes is printed; enter q to quit

~# rclone config
No remotes found - make a new one
n) New remote
s) Set configuration password
q) Quit config
n/s/q> n
name> dropbox
Type of storage to configure.
Enter a string value. Press Enter for the default ("").
Choose a number from below, or type in your own value
1 / 1Fichier
\ "fichier"
2 / Alias for an existing remote
\ "alias"
3 / Amazon Drive
\ "amazon cloud drive"
4 / Amazon S3 Compliant Storage Provider (AWS, Alibaba, Ceph, Digital Ocean, Dreamhost, IBM COS, Minio, etc)
\ "s3"
5 / Backblaze B2
\ "b2"
6 / Box
\ "box"
7 / Cache a remote
\ "cache"
8 / Citrix Sharefile
\ "sharefile"
9 / Dropbox
\ "dropbox"
10 / Encrypt/Decrypt a remote
\ "crypt"
11 / FTP Connection
\ "ftp"
12 / Google Cloud Storage (this is not Google Drive)
\ "google cloud storage"
13 / Google Drive
\ "drive"
14 / Google Photos
\ "google photos"
15 / Hubic
\ "hubic"
16 / In memory object storage system.
\ "memory"
17 / JottaCloud
\ "jottacloud"
18 / Koofr
\ "koofr"
19 / Local Disk
\ "local"
20 / Mail.ru Cloud
\ "mailru"
21 / Mega
\ "mega"
22 / Microsoft Azure Blob Storage
\ "azureblob"
23 / Microsoft OneDrive
\ "onedrive"
24 / OpenDrive
\ "opendrive"
25 / Openstack Swift (Rackspace Cloud Files, Memset Memstore, OVH)
\ "swift"
26 / Pcloud
\ "pcloud"
27 / Put.io
\ "putio"
28 / QingCloud Object Storage
\ "qingstor"
29 / SSH/SFTP Connection
\ "sftp"
30 / Sugarsync
\ "sugarsync"
31 / Transparently chunk/split large files
\ "chunker"
32 / Union merges the contents of several remotes
\ "union"
33 / Webdav
\ "webdav"
34 / Yandex Disk
\ "yandex"
35 / http Connection
\ "http"
36 / premiumize.me
\ "premiumizeme"
Storage> dropbox
** See help for dropbox backend at: https://rclone.org/dropbox/ **

Dropbox App Client Id
Leave blank normally.
Enter a string value. Press Enter for the default ("").
client_id>
Dropbox App Client Secret
Leave blank normally.
Enter a string value. Press Enter for the default ("").
client_secret>
Edit advanced config? (y/n)
y) Yes
n) No (default)
y/n> n
Remote config
Use auto config?
* Say Y if not sure
* Say N if you are working on a remote or headless machine
y) Yes (default)
n) No
y/n> n
For this to work, you will need rclone available on a machine that has a web browser available.
Execute the following on your machine (same rclone version recommended) :
    rclone authorize "dropbox"
Then paste the result below:
result> {"access_token":"AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA","token_type":"bearer","expiry":"0001-01-01T00:00:00Z"}
--------------------
[dropbox]
type = dropbox
token = {"access_token":"AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA","token_type":"bearer","expiry":"0001-01-01T00:00:00Z"}
--------------------
y) Yes this is OK (default)
e) Edit this remote
d) Delete this remote
y/e/d> y
Current remotes:

Name                 Type
====                 ====
dropbox              dropbox

e) Edit existing remote
n) New remote
d) Delete remote
r) Rename remote
c) Copy remote
s) Set configuration password
q) Quit config
e/n/d/r/c/s/q> q
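
Dropbox remotes support rclone about, which reports the account quota and is a quick way to confirm that the pasted token is valid; a brief sketch:

# Show total, used and free space for the Dropbox account
~# rclone about dropbox: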

Google Drive

Obtaining an authorization token

Using this drive requires obtaining an account token and approving third-party access.

When configuring Google Drive in the cloud environment, an authorization link is generated. Open the link in a local browser, log in to the account, then copy the returned code back into the interactive prompt to complete the authorization.

Adding the configuration

Use rclone config to add the drive.

  1. Choose to create a new remote: enter n

  2. Enter a custom name for this remote configuration; this name is what you will use to operate on the drive, e.g. googledrive

  3. Enter the storage type: drive

  4. Enter client_id and client_secret: leave both blank

  5. Choose the scope: enter 1 for full access

  6. Enter the root folder ID: leave blank for the default

  7. Enter the service account file: leave blank for the default

  8. Whether to edit the advanced config: enter n

  9. Whether to use auto config: enter n

  10. Open the URL shown in the prompt in a browser

~# rclone config
No remotes found - make a new one
n) New remote
s) Set configuration password
q) Quit config
n/s/q> n
name> googledrive
Type of storage to configure.
Enter a string value. Press Enter for the default ("").
Choose a number from below, or type in your own value
1 / 1Fichier
\ "fichier"
2 / Alias for an existing remote
\ "alias"
3 / Amazon Drive
\ "amazon cloud drive"
4 / Amazon S3 Compliant Storage Provider (AWS, Alibaba, Ceph, Digital Ocean, Dreamhost, IBM COS, Minio, etc)
\ "s3"
5 / Backblaze B2
\ "b2"
6 / Box
\ "box"
7 / Cache a remote
\ "cache"
8 / Citrix Sharefile
\ "sharefile"
9 / Dropbox
\ "dropbox"
10 / Encrypt/Decrypt a remote
\ "crypt"
11 / FTP Connection
\ "ftp"
12 / Google Cloud Storage (this is not Google Drive)
\ "google cloud storage"
13 / Google Drive
\ "drive"
14 / Google Photos
\ "google photos"
15 / Hubic
\ "hubic"
16 / In memory object storage system.
\ "memory"
17 / JottaCloud
\ "jottacloud"
18 / Koofr
\ "koofr"
19 / Local Disk
\ "local"
20 / Mail.ru Cloud
\ "mailru"
21 / Mega
\ "mega"
22 / Microsoft Azure Blob Storage
\ "azureblob"
23 / Microsoft OneDrive
\ "onedrive"
24 / OpenDrive
\ "opendrive"
25 / Openstack Swift (Rackspace Cloud Files, Memset Memstore, OVH)
\ "swift"
26 / Pcloud
\ "pcloud"
27 / Put.io
\ "putio"
28 / QingCloud Object Storage
\ "qingstor"
29 / SSH/SFTP Connection
\ "sftp"
30 / Sugarsync
\ "sugarsync"
31 / Transparently chunk/split large files
\ "chunker"
32 / Union merges the contents of several remotes
\ "union"
33 / Webdav
\ "webdav"
34 / Yandex Disk
\ "yandex"
35 / http Connection
\ "http"
36 / premiumize.me
\ "premiumizeme"
Storage> drive
** See help for drive backend at: https://rclone.org/drive/ **

Google Application Client Id
Setting your own is recommended.
See https://rclone.org/drive/#making-your-own-client-id for how to create your own.
If you leave this blank, it will use an internal key which is low performance.
Enter a string value. Press Enter for the default ("").
client_id>
Google Application Client Secret
Setting your own is recommended.
Enter a string value. Press Enter for the default ("").
client_secret>
Scope that rclone should use when requesting access from drive.
Enter a string value. Press Enter for the default ("").
Choose a number from below, or type in your own value
1 / Full access all files, excluding Application Data Folder.
\ "drive"
2 / Read-only access to file metadata and file contents.
\ "drive.readonly"
/ Access to files created by rclone only.
3 | These are visible in the drive website.
| File authorization is revoked when the user deauthorizes the app.
\ "drive.file"
/ Allows read and write access to the Application Data folder.
4 | This is not visible in the drive website.
\ "drive.appfolder"
/ Allows read-only access to file metadata but
5 | does not allow any access to read or download file content.
\ "drive.metadata.readonly"
scope> 1
ID of the root folder
Leave blank normally.

Fill in to access "Computers" folders (see docs), or for rclone to use
a non root folder as its starting point.

Note that if this is blank, the first time rclone runs it will fill it
in with the ID of the root folder.

Enter a string value. Press Enter for the default ("").
root_folder_id>
Service Account Credentials JSON file path
Leave blank normally.
Needed only if you want use SA instead of interactive login.
Enter a string value. Press Enter for the default ("").
service_account_file>
Edit advanced config? (y/n)
y) Yes
n) No (default)
y/n> n
Remote config
Use auto config?
* Say Y if not sure
* Say N if you are working on a remote or headless machine
y) Yes (default)
n) No
y/n> n
Please go to the following link: https://accounts.google.com/o/oauth2/auth?access_type=offline&client_id=AAAAAAAAAAAA.apps.googleusercontent.com&redirect_uri=urn%3Aietf%3Awg%3Aoauth%3A2.0%3Aoob&response_type=code&scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fdrive&state=AAAAAAAAAAAAAAAAAAA-AA
Log in and authorize rclone for access
Enter verification code>

Log in with your account

../_images/012.png

Authorize Rclone

../_images/022.png

After a successful authorization, copy the code

../_images/032.png

  1. Enter the copied code

  2. Whether to configure this as a team drive: enter n (the default)

  3. Confirm the configuration: enter y

  4. After confirming, the current list of remotes is printed; enter q to quit

Enter verification code> 4/AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA-AAAAAAAAAA-AAAAAAAA
Configure this as a team drive?
y) Yes
n) No (default)
y/n> n
--------------------
[googledrive]
type = drive
scope = drive
token = {"access_token":"AAAA.AA-_AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA","token_type":"Bearer","refresh_token":"1//AAAA-AAAAAAAAAAAAAAAAAAAAAAA-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA","expiry":"2020-02-26T11:58:01.263310846Z"}
--------------------
y) Yes this is OK (default)
e) Edit this remote
d) Delete this remote
y/e/d> y
Current remotes:

Name                 Type
====                 ====
googledrive          drive

e) Edit existing remote
n) New remote
d) Delete remote
r) Rename remote
c) Copy remote
s) Set configuration password
q) Quit config
e/n/d/r/c/s/q> q

Using Remote Storage with Rclone

Once a drive configuration has been added, the rclone command can be used to transfer and sync files with the drive. For more advanced and detailed features, see the subcommands documentation.

# List the files under the root directory of the remote named dropbox
~# rclone ls dropbox:
        7 remotefile.txt
# Listing OSS requires the bucket name; here the bucket is test-bucket
~# rclone ls oss:test-bucket/

# Copy a file from the remote named dropbox to the current local directory
~# rclone copy dropbox:remotefile.txt .
~# ls
remotefile.txt
# Copy a file from the remote named oss to the current local directory; the bucket is test-bucket
~# rclone copy oss:test-bucket/remotefile.txt .

# Copy a local file to the root directory of the remote named dropbox
~# rclone copy localfile.txt dropbox:/
~# rclone ls dropbox:/
    10 localfile.txt

# Sync the drive's root directory to a local directory (the reverse direction also works). Use --dry-run first to review the changes and avoid accidentally deleting local files.
~# mkdir syncfolder
~# rclone sync dropbox:/ syncfolder --dry-run
2020/02/26 08:55:33 NOTICE: newfile.txt: Not copying as --dry-run
~# rclone sync dropbox:/ syncfolder
~# ls syncfolder
newfile.txt
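
For long transfers, or for tools that expect a local path, two further rclone features may help: the --progress flag prints live transfer statistics, and rclone mount exposes a remote as a local directory (FUSE is required on Linux). A hedged sketch; the mount point is just an example:

# Copy with live progress output
~# rclone copy localfile.txt dropbox:/ --progress

# Mount the remote as a local filesystem in the background (needs FUSE)
~# mkdir -p /mnt/dropbox
~# rclone mount dropbox:/ /mnt/dropbox --daemon

# Unmount when finished
~# fusermount -u /mnt/dropbox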