amp

Full file recheck and recheck on resume

49 posts in this topic

To clarify: I don't care if lower-priority downloads have their slowest (or even all) sources removed, as long as they resume automatically when there are no higher-priority downloads or there is "extra" bandwidth. But I do care when higher-priority downloads are intentionally slowed down and most of the download bandwidth is occupied by lower-priority stuff, which is what happens now.


What connection type do you have? On my 5 Mbit line it doesn't slow anything down; it only brings faster downloads and more free slots in the network.

Segmented downloading is meant for fast connections, to download from several fast users at once, not to occupy every single slot in the network to no effect.


What connection type do you have?

512/256 kbps

On my 5 Mbit line it doesn't slow anything down; it only brings faster downloads and more free slots in the network.

Our "DC network" is quite different. Most users have nearly equal (and relatively slow) connection speeds, so per-slot download speed is typically 2–5 KB/s (when DC gets all the bandwidth), and I typically use 10–15 download slots to reach optimum total speed. This way your hard-coded "slow user limit" becomes 1–1.5 KB/s, which obviously becomes a problem.

Oh, and please do not say "you are a leecher because your upstream bandwidth is lower". I will hit this problem with any upstream bandwidth, because raising mine does not change the upstream bandwidth of the other users. For what it's worth, I am one of the most useful sources in our network ;-)

Segmented downloading is meant for fast connections, to download from several fast users at once, not to occupy every single slot in the network to no effect.

It is even more useful to prioritize one file above the others when there are no fast sources for it, and this "slow users bug" breaks exactly that.

If you do not want to make this "feature" optional, you could make it work differently for higher-priority downloads. And, BTW, removing "slow" alternate sources completely is not a very good idea IMHO: the download may freeze for a very long time if all of the remaining sources go offline.


I won't comment on your text now, because I don't have much time. But I will be serious now, so sorry for my following words: are you kidding that you use segmented downloading on such a slow connection????? That's your problem, not the slow-download disconnecting.


Ok, sponsor our community (300+ users) to upgrade our network connectivity and we will agree with you. And don't forget, a 10 Mbps connection here costs "only" $2000 per month.


I have never said that you should upgrade your network. I just said you shouldn't use segmented downloading.


Ok, then tell us: what is this whole "segmented download" thing for? Why did you implement it?


Ok, from the StrongDC project's main page:

Safe segmented downloading: Download one file from more users simultaneously without corruption! This way of downloading dramatically improves download speed, especially for fast connections, and it's safer than classical downloading.

So this is why we're still using your software: because segmented downloading allows us to improve download speed! We have a problem. A problem which CAN be solved.

And all you can say is "f**k off guys, buy good internet access, not my business". Good point. Next time someone says your software takes a lot of CPU, tell them their PCs are crap and you can't do anything about it. And you would be damn right.

I hope ApexDC developers won't be so selfish.


btw ... it gave me some headache that I couldn't manage downloading like this -> disconnect slow downloads, but don't remove slow users; only disconnect them when the download speed is high enough.

The client always disconnected them and removed them from the sources list (here the internet speeds are also like 256K/128K, 1024K/256K and so on).

And another question ... it doesn't check the file's TTH root again after downloading, like PWDC did. Am I blind, does it just not show it, or does it really not check it?


2amp: you forgot to highlight some words

Safe segmented downloading: Download one file from more users simultaneously without corruption! This way of downloading dramatically improves download speed, especially for fast connections, and it's safer than classical downloading.

Yes, of course you can use it on a slow connection, but then you should expect some problems, because we don't want a situation where all slots are eaten by slow users.

It's like thinking your bicycle will go faster if you're allowed to ride it on a highway.


What's the difference between a slow and a fast connection in the segmented downloading algorithm? Some hardcoded constants?

From my point of view, segmented downloading helps to maximize the efficiency of upstream and downstream bandwidth usage, no matter how fast you are. So, since this algorithm is about efficiency rather than raw speed, it should be tunable for any type of connection. Give us fine-tuning settings!


It should be connected with the speed indicators in General, or am I wrong?


It should be connected with the speed indicators in General, or am I wrong?

I don't think it's in any way connected to those, and I sure hope it never will be, as so many users enter incorrect (usually higher) values there.


What's the difference between a slow and a fast connection in the segmented downloading algorithm? Some hardcoded constants?

From my point of view, segmented downloading helps to maximize the efficiency of upstream and downstream bandwidth usage, no matter how fast you are. So, since this algorithm is about efficiency rather than raw speed, it should be tunable for any type of connection. Give us fine-tuning settings!

There's one big difference: on a slow connection you will occupy the slots of all segments for a very long time. And you should also realize that a slow connection can't handle more connections; its speed will just decrease.

But I don't know whether I've already mentioned it: automatic removal of very slow sources is bad, but there's no other possibility. "Slow speed" should be based on your real download rate, independent of the upload speed of the remote side, but it's not technically possible to get your real download speed, so it's done the way it's done ;)


Your logic about leeching isn't clear enough to me.

You see, I have a 1M/1M connection, which is relatively fast for our community: most have from 256/128 to 1024/256, with the former being the most common.

So almost every source is slow to me, and they can't become faster by dropping sources. So I'm a seeder who cannot download as fast as is actually possible ;). From my point of view, this feature is not anti-leeching but anti-downloading. After all, the faster I get a file, the faster I will seed it! Even if I were on 256/128 this would remain true.

I like torrents because they are fast and have no slots or stupid limits, but they really suck for sharing lots of things: making a torrent for every single file :-/.

Crise

Better tell us your point of view about all these things :)


"Leeching" isn't really the right word here, because leeching is about uploading. It's hard for me to explain in English, but there's one simple fact: you should use as few slots as possible, not all available slots.

A lot of people think that if they download one file from five users then the slots will be released 5x sooner, but that's close to true only on really fast connections.


Leeching is about downloading without uploading, AFAIR.

A lot of people think that if they download one file from five users then the slots will be released 5x sooner, but that's close to true only on really fast connections.
I think that it tends to be true on *any* connection. You have download-slot limiting, after all.

Hot discussion about one simple edit ;)

if(autoDrop) vs if(autoDrop && enabledSlowDisconnecting)

A question arises: what is the "Enable slow downloads disconnecting" option for, if it's enabled in any case?

It's hard for me to explain in English

Unfortunately I do not understand Czech, only Russian and English :)


It doesn't work that way on slow connections, because the real per-user speed decreases. Theoretically it should work the way some users think, but according to many tests it works that way only on fast connections, so auto-disconnecting is protection against abusing segmented downloading on slow connections.

Manual slow-download disconnecting is there if someone wants a higher limit than the default.


Then it should have a different description.

So, you won't do it the easy way.

Let me state the current problems again:

- Download speed decreases a lot if the other side (the sources) is on a slow connection, even when it could be faster.

- The download may stall completely if all slow sources were dropped and the last remaining source goes offline.

So, as void mentioned above, you could change your dropping algorithm for high-priority downloads.

Or you could add an "auto-retry source" feature to automatically re-add a dropped source after some time or on some event (another source going offline).

Or you could try to add "training" to discover the real connection speed of a host. There are online tools which do this. You could also monitor the current speed and extrapolate it toward the real one; higher math techniques are available ;).


I have one more idea about limiting: what if you made it speed-related? I mean, related to the speed the user specifies in their description. You could also limit download/upload speed by those values. For example, if I've set 256K/128K but actually have 1M/1M, I won't be able to download/upload faster than 256K/128K, and my slow-user limit should be decreased accordingly. If I've set 1M/1M but physically have only 256K/128K, I'd get an increased slow-user limit. So I'd better put the correct speeds in my description :)


I have one more idea about limiting: what if you made it speed-related? I mean, related to the speed the user specifies in their description. You could also limit download/upload speed by those values. For example, if I've set 256K/128K but actually have 1M/1M, I won't be able to download/upload faster than 256K/128K, and my slow-user limit should be decreased accordingly. If I've set 1M/1M but physically have only 256K/128K, I'd get an increased slow-user limit. So I'd better put the correct speeds in my description :)

I had a similar idea in the past, but it was related to the Line speed visible in the Connection column.


I have seen the problem that amp and void see, and it is very annoying. Big Muscle is plain wrong to say that you shouldn't use segments on a slow connection. It's all relative: what is helpful at high speed is just as helpful at lower speed.

The fix that would keep me happy is this:

- Drop slow sources IF there are other sources, like we currently do (using an arbitrary limit if you have to).

- If the other sources go away for whatever reason, put the slow sources back to give them another chance.


The fix that would keep me happy is this:

- Drop slow sources IF there are other sources, like we currently do (using an arbitrary limit if you have to).

- If the other sources go away for whatever reason, put the slow sources back to give them another chance.

As I also had this problem because of the slow connections around here, the client usually removed the sources, and there was an option to "readd sources"... It's OK until I shut down the client and restart it (something needs the PC to be restarted, or I'm changing hardware, or I just want to sleep :ermm: )... it simply forgets about the readdable sources, so there will be no sources to re-add to the files. I know the automatic search will find them after a while, but still... I also liked being able to ask the client just to disconnect slow sources, not remove them. :)
