Issue
I developed two Python scripts with Paramiko to transfer a lot of data (~120 GB) to my VM, which is hosted on an OVH server. The first script transfers ~40 GB and the second ~80 GB.
Stack :
Python 3.9.1
Paramiko 2.7.2
SCP 0.13.3
In both of my scripts, I use this function to set up the SSH connection:
def connect():
    transport = paramiko.Transport((target_host, target_port))
    transport.connect(None, target_username, target_pwd)
    sftp_client = paramiko.SFTPClient.from_transport(transport)
    green_print("SSH connected")
    return sftp_client, transport
If I create one script that does both transfers, I get a timeout after 3 hours.
With two distinct scripts running at the same time, I get a timeout after 2h30 of transfer.
I have already read many posts about Paramiko, SSH connections, the timeout parameter, ClientAliveInterval, etc., but nothing works; the keepalive approach those posts describe is sketched after the error message below, for reference.
After that time, I get this error:
Connexion fermée par l'hôte distant (Connection closed by remote host)
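For reference, the keepalive approach from those posts amounts to calling set_keepalive() on the Paramiko transport before opening the SFTP session. This is only a sketch (the function name and the 30-second interval are arbitrary choices, not something confirmed to fix my case):

def connect_with_keepalive():
    transport = paramiko.Transport((target_host, target_port))
    transport.connect(None, target_username, target_pwd)
    # Send an SSH keepalive packet after every 30 seconds without traffic,
    # to keep an idle connection from being dropped by the server.
    transport.set_keepalive(30)
    sftp_client = paramiko.SFTPClient.from_transport(transport)
    return sftp_client, transport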
The three functions of my script:
def connect():
    transport = paramiko.Transport((target_host, target_port))
    transport.connect(None, target_username, target_pwd)
    sftp_client = paramiko.SFTPClient.from_transport(transport)
    green_print("SSH connected")
    return sftp_client, transport

def transfert(sftp, vm, object_path):
    os.chdir(os.path.split(object_path)[0])
    parent = os.path.split(object_path)[1]
    try:
        sftp.mkdir(vm)
    except:
        pass
    for path, _, files in os.walk(parent):
        try:
            sftp.mkdir(os.path.join(vm, path))
        except:
            pass
        for filename in files:
            sftp.put(os.path.join(object_path, filename),
                     os.path.join(vm, path, filename))

def job():
    green_print("\nProcess start...")
    check_folder()
    folder = forfiles_method()
    vm, lidar, pos = name_path(folder)
    sftp, transport = connect()
    transfert(sftp, vm, pos)
    sftp.close()
    transport.close()
Minimal reproducible example:
from paramiko.sftp_client import SFTPClient
import paramiko
import os

target_host = 'xx.xx.x.xxx'
target_port = 22
target_username = "xxxxxxx"
target_pwd = 'xxxxxx'

remote_path = "e:/x/"       # => on your vm
target_folder = '/folder1'  # => on your computer

def connect():
    transport = paramiko.Transport((target_host, target_port))
    transport.connect(None, target_username, target_pwd)
    sftp_client = paramiko.SFTPClient.from_transport(transport)
    return sftp_client, transport

def transfert(sftp, remote_path, object_path):
    os.chdir(os.path.split(object_path)[0])
    parent = os.path.split(object_path)[1]
    try:
        sftp.mkdir(remote_path)
    except:
        pass
    for path, _, files in os.walk(parent):
        try:
            sftp.mkdir(os.path.join(remote_path, path))
        except:
            pass
        for filename in files:
            sftp.put(os.path.join(object_path, filename),
                     os.path.join(remote_path, path, filename))

def job():
    sftp, transport = connect()
    transfert(sftp, remote_path, target_folder)
    sftp.close()
    transport.close()
This is the tree structure of my files; I want to transfer only the "test" folder, which contains more than 120 GB:
folder / test
I'm new to Python development. If someone has a solution, I'll take it!
Solution
So, the solution:
subprocess.run(["winscp.com", "/script=" + cmdFile], shell=True)
If winscp.com is not found as a command, use its full path instead, e.g. C:/Program Files (x86)/WinSCP/winscp.com.
Write your command lines in a text file, here cmdFile; a sketch of the whole setup is below.
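As a sketch of how this fits together (the host, credentials, host key fingerprint, and paths are placeholders taken from the question, not a working configuration):

import subprocess

cmdFile = "commands.txt"

# Write the WinSCP script: open the SFTP session, upload the folder, exit.
with open(cmdFile, "w") as f:
    f.write('open sftp://user:password@xx.xx.x.xxx/ -hostkey="ssh-ed25519 255 xxxxxxxx"\n')
    f.write('put "C:\\folder1\\test" "e:/x/"\n')
    f.write('exit\n')

# Use the full path to winscp.com here if it is not on your PATH.
subprocess.run(["winscp.com", "/script=" + cmdFile], shell=True)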
Links that may help:
Running WinSCP command from Python
From Python run WinSCP commands in console
https://winscp.net/eng/docs/commandline
Answered By - Adrien Masson