
Packet Splitting Networking


fiveworlds

Recommended Posts

9 hours ago, fiveworlds said:

Is there an existing application/hardware that will allow a single computer to send duplicate network traffic to multiple computers? For example, say I want many computers to download the same update file?

Don't all servers that allow multiple connections already do that (i.e. servers that spawn a new thread after socket listen() and accept())?

https://docs.microsoft.com/en-us/windows/desktop/api/winsock2/nf-winsock2-accept

e.g. the Apache HTTP server (which allows 256 simultaneous connections if you don't change the config file).

The limit was added to prevent DDoS attacks on the HTTP server, which would otherwise exhaust the server machine's resources and cause a crash.

If your server code does not spawn a new thread after listen() and accept(), only one connection to such a server is possible at a time. Connection attempts from other machines are rejected or delayed (depending on the time-outs set with setsockopt(), using the SO_RCVTIMEO and SO_SNDTIMEO options).

https://docs.microsoft.com/en-us/windows/desktop/api/winsock/nf-winsock-setsockopt
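
For illustration, here is a minimal sketch of that pattern in Python rather than raw Winsock (the address, port and payload below are placeholders, not anything from this thread): the main loop blocks in accept() and hands each new connection to its own thread, so several clients can be served at once.

```python
import socket
import threading

HOST, PORT = "0.0.0.0", 8000  # placeholder address and port


def handle_client(conn, addr):
    """Serve one client on its own thread, then close its socket."""
    with conn:
        conn.sendall(b"hello from the server\n")  # placeholder payload


def serve():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen()
        while True:
            conn, addr = srv.accept()
            # One thread per accepted connection; without this, only one
            # client could be served at a time, as described above.
            threading.Thread(target=handle_client, args=(conn, addr),
                             daemon=True).start()


if __name__ == "__main__":
    serve()
```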

 

 

On client machines, wget allows you to specify an initial offset using --start-pos:

Quote

‘--start-pos=OFFSET’

Start downloading at zero-based position OFFSET. Offset may be expressed in bytes, kilobytes with the ‘k’ suffix, or megabytes with the ‘m’ suffix, etc.

‘--start-pos’ has higher precedence over ‘--continue’. When ‘--start-pos’ and ‘--continue’ are both specified, wget will emit a warning then proceed as if ‘--continue’ was absent.

Server support for continued download is required, otherwise ‘--start-pos’ cannot help. See ‘-c’ for details.

 

https://www.gnu.org/software/wget/manual/wget.html#Download-Options

Some FTP servers also support retrieving a file from a specified initial offset.

 

The curl command allows you to download a specified number of bytes from a specified offset:

https://serverfault.com/questions/18834/how-to-ftp-get-a-partial-file-only

Search "range" in:

https://curl.haxx.se/docs/manual.html
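
The same idea, sketched in Python with urllib instead of curl or wget (the URL and byte range below are invented for the example): send an HTTP Range header, and a server that supports partial content returns only those bytes with status 206.

```python
import urllib.request

# Invented URL and byte range, purely for illustration.
URL = "http://example.com/update.zip"
start, end = 1_048_576, 2_097_151  # the second mebibyte of the file

req = urllib.request.Request(URL, headers={"Range": f"bytes={start}-{end}"})
with urllib.request.urlopen(req) as resp:
    # 206 Partial Content means the server honoured the range;
    # 200 means it ignored the header and sent the whole file.
    print("HTTP status:", resp.status)
    data = resp.read()
    print("received", len(data), "bytes")
```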

 

The main questions:

- are you writing software or using existing software?

- are you writing/using client or server machines?

If somebody who is not a programmer would like to split, let's say, a 1 GB file, he or she can simply split the file on disk into files like name.0, name.1, etc.

After putting them on an HTTP server, up to 256 simultaneous client machines can fetch those files from your default-config Apache HTTP server.
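
As a rough sketch of that splitting step in Python (the file name and piece size are arbitrary examples; on Linux the standard split command does the same job):

```python
# Split a large file into fixed-size pieces name.0, name.1, ...
# The file name and piece size below are arbitrary examples.
PIECE_SIZE = 64 * 1024 * 1024  # 64 MiB per piece


def split_file(path, piece_size=PIECE_SIZE):
    index = 0
    with open(path, "rb") as src:
        while True:
            block = src.read(piece_size)
            if not block:
                break
            with open(f"{path}.{index}", "wb") as dst:
                dst.write(block)
            index += 1
    return index  # number of pieces written


if __name__ == "__main__":
    print(split_file("name"), "pieces written")  # produces name.0, name.1, ...
```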

 


11 hours ago, fiveworlds said:

For example, say I want many computers to download the same update file?

Free software such as BitTorrent is available to reduce server load.

Quote

The BitTorrent protocol can be used to reduce the server and network impact of distributing large files. Rather than downloading a file from a single source server, the BitTorrent protocol allows users to join a "swarm" of hosts to upload to/download from each other simultaneously.

It's (I think) only useful if several users are downloading the same file(s) simultaneously.

 


@Carrock

Aren't you talking about multiple sources on multiple machines ("servers" or "peers") and a single target user?

Torrent is generally not an option if you want to redistribute your own legitimate software to worldwide clients.

 

Edited by Sensei

21 minutes ago, Sensei said:

@Carrock

Aren't you talking about multiple sources on multiple machines ("servers" or "peers") and a single target user?

 

Sort of.

The first user starts downloading from the source server.

The second user gets partial downloads from the source server and from the first user, who has already downloaded part of the file.

The third user downloads from the source and from the first and second users. The first user can now download from the source and from the second and third users. And so on...

My post was only a response to the OP's second sentence.

11 hours ago, fiveworlds said:

Is there an existing application/hardware that will allow a single computer to send duplicate network traffic to multiple computers? For example, say I want many computers to download the same update file?

i.e. I felt his first sentence unnecessarily limited his options.

Sensei: "Torrent is rather not an option, if you want to redistribute your own legit software, to worldwide clients."

 

BitTorrent is an option for the worldwide downloads of every major Linux OS; users are asked to download with BitTorrent or similar to reduce server load.

 

Edited by Carrock
Responding to Sensei's addition, despite Forum bug.

Quote

are you writing software or using existing software?

I would like to use existing software if possible. If not, I could write it.

Quote

are you writing/using client or server machines?

I am using Windows Server 2012 as the switch, which will be connected to many machines (not necessarily Windows) on the same network.

Quote

It's (I think) only useful if several users are downloading the same file(s) simultaneously.

There are no users. It is an automated process that downloads a large (12 GB) zip file containing Windows driver tests to about 40 computers, a process which normally takes several hours from the Windows Assessment Services server rig. Since the server is also the switch, I would like to try using

https://en.wikipedia.org/wiki/Broadcasting_(networking)

or https://en.wikipedia.org/wiki/Multicast

But most usages of these tend to be for things other than downloading a file, such as SMS/video emergency broadcasting.
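
Roughly, the multicast transport I have in mind would look something like this Python sketch (the group address and port are made-up examples); chunking, ordering and retransmission of a 12 GB file would still have to be built on top of this:

```python
import socket
import struct

GROUP, PORT = "239.1.2.3", 5007  # made-up multicast group and port


def send(data: bytes):
    """Send one datagram; every receiver joined to the group gets it."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)  # stay on the LAN
        s.sendto(data, (GROUP, PORT))


def receive():
    """Join the multicast group and print whatever arrives."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        s.bind(("", PORT))
        mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
        s.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
        while True:
            data, addr = s.recvfrom(65535)
            print(len(data), "bytes from", addr)
```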

 


33 minutes ago, fiveworlds said:

But most usages of these tend to be for things other than downloading a file, such as SMS/video emergency broadcasting.

Broadcasting a live signal (digitized on the fly, e.g. from a camera and microphone in real time) requires a completely different tactic than downloading a fully existing file from a server.

In the second case, when the entire file exists, the 1st client could start downloading from offset 0, the 2nd client from e.g. a 1 MB offset, the 3rd client from e.g. a 2 MB offset, and so on, for as many clients as you want. Then the 2nd client could download its missing first block of data from the 1st client, and the 1st client its missing block from the 2nd client, without bothering the server any more.
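
Just to sketch that offset arithmetic in Python (the file size and client count below are invented numbers): each client asks the server for a different slice, e.g. via the Range request shown earlier, and the remaining slices are then swapped between the clients.

```python
def client_slice(file_size: int, n_clients: int, client_index: int):
    """Byte range (start, end) that this client fetches from the server."""
    per_client = file_size // n_clients
    start = client_index * per_client
    # The last client also picks up any remainder bytes.
    end = file_size - 1 if client_index == n_clients - 1 else start + per_client - 1
    return start, end


# Invented example: a 1 GiB file shared among 4 clients.
for i in range(4):
    print(i, client_slice(1 << 30, 4, i))
```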

It's pretty complicated, though. A client can have a static or dynamic IP, a public or private IP, be behind a firewall/NAT or not, etc. Only clients with a public static IP can listen for connections from clients that are unable to open local ports and expose them to the Internet ("active" vs. "passive" clients). The server would have to reveal the IP of one client to another client so that they can connect peer-to-peer (if they can), which is a potential privacy vulnerability (abused by anti-piracy organizations). There are also security issues: modified client software could send something other than what it downloaded from the server, so some authentication and verification of the integrity of data passed from client to client is needed.

33 minutes ago, fiveworlds said:

I would like to use existing software if possible. If not, I could write it.

Given the new details you have provided about what you need, Carrock's advice makes sense and is worth a try, although you are not limited to just torrents. There are other, alternative p2p protocols.

DirectConnect has an easy protocol specification:

http://ftp.isu.edu.tw/pub/Unix/GNU/ftp/savannah/files/mldonkey/docs/Direct-Connect/dc1_protocol.html

 

 

Edited by Sensei

On 5/11/2019 at 4:48 AM, fiveworlds said:

Is there an existing application/hardware that will allow a single computer to send duplicate network traffic to multiple computers? For example, say I want many computers to download the same update file?

This is not going to work.


Quote

This is not going to work

I am not sure you are right about that. It just isn't a conventional use case. There are some applications available on GitHub that do something similar to this, for example https://github.com/gistrec/File-Broadcaster

 

Quote


In the second case, when the entire file exists, the 1st client could start downloading from offset 0, the 2nd client from e.g. a 1 MB offset, the 3rd client from e.g. a 2 MB offset, and so on, for as many clients as you want. Then the 2nd client could download its missing first block of data from the 1st client, and the 1st client its missing block from the 2nd client, without bothering the server any more.

It's pretty complicated, though. A client can have a static or dynamic IP, a public or private IP, be behind a firewall/NAT or not, etc. Only clients with a public static IP can listen for connections from clients that are unable to open local ports and expose them to the Internet ("active" vs. "passive" clients). The server would have to reveal the IP of one client to another client so that they can connect peer-to-peer (if they can), which is a potential privacy vulnerability (abused by anti-piracy organizations). There are also security issues: modified client software could send something other than what it downloaded from the server, so some authentication and verification of the integrity of data passed from client to client is needed.

 

You can assume that you have complete access to the clients. For example, you could start a service on all clients to listen for the download. As for lost packets, surely each client can keep track of where it loses packets and ask the sender to resend? Or the sender can loop through the download several times.
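
Something like this bookkeeping is what I mean (the chunk count and lost chunks below are invented for the example): the sender numbers every chunk it broadcasts, each client records which sequence numbers arrived, and the gaps are either requested again or simply picked up on the sender's next pass through the file.

```python
class ChunkTracker:
    """Track which fixed-size chunks of a broadcast file have arrived."""

    def __init__(self, total_chunks: int):
        self.total = total_chunks
        self.received = set()

    def mark(self, seq: int):
        # A real receiver would also write the chunk's payload to
        # offset seq * chunk_size in the output file.
        self.received.add(seq)

    def missing(self):
        """Chunks to ask the sender to resend, or to catch on its next loop."""
        return [seq for seq in range(self.total) if seq not in self.received]


# Invented example: 10 chunks were broadcast and chunks 3 and 7 were lost.
tracker = ChunkTracker(10)
for seq in (0, 1, 2, 4, 5, 6, 8, 9):
    tracker.mark(seq)
print(tracker.missing())  # -> [3, 7]
```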

 

