Alex H

Member
  • Content count

    5
  • Joined

  • Last visited

About Alex H

  • Rank
    Newbie

  1. Oh no, it doesn't need to be website based at all. It's an individual user publishing their own feed from their own PC (which needs to have either a static IP or a DynDNS address). I guess it would be kind of similar to a bot that posts a magnet link into a hub's main chat window whenever the bot's owner adds a new file to their share. The RSS way of doing it is, of course, a lot less like spam because it's opt-in - User A decides whether they want to ask User B if User B's feed contains anything new, rather than User B just spamming everyone about the great new file they're sharing. As you say, DC is really good for obscure stuff, so an example might be that Alice wants to share a bunch of old radio show MP3s as they enter the public domain. If the old radio show was originally broadcast weekly, then the copyrights would expire on a weekly basis, so she'd have a new one to share every week. Alice might set her feed to update weekly, so when Bob subscribes to her feed, his Apex would ping Alice's feed (e.g. http://alice.dyndns.org/apex/radiofeed.rss) once a week on Tuesdays. The only reason I suggest this is because it's a real pain letting everyone know you have new files in your share for people to download. For me, I probably only share stuff with a maximum of 20 people that I personally know, but I don't want to have to go around and tell everyone "hey, I've got some new files in my share" - they should be able to be informed as soon as I put them in my shared folder. Apex can already download a user's file list whenever they log on to a particular hub, but that only happens when they initially log on, and then you have to search through it manually to see if they have any new stuff. This way, you'd be doing the same thing but using different criteria: check for files labeled with a specific tag (e.g. alice.radio.show) at a fixed time (e.g. every 7 days at 12:01).
(By way of example, I had to write an RSS reader for a course in C# - it was only basic, but from memory it was less than a hundred lines of code.)
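The polling idea above (check a feed on a schedule, pick out entries carrying a specific tag, ignore anything already seen) really is only a few dozen lines. Here's a rough Python sketch, assuming the feed owner labels entries with RSS `<category>` elements; the feed contents, tag name, and GUIDs are made up for illustration:

```python
import xml.etree.ElementTree as ET

def new_tagged_items(rss_text, tag, seen_guids):
    """Return (guid, link) pairs for unseen items carrying `tag`.

    `tag` is matched against each item's <category> elements, which is one
    way a feed owner could label entries (e.g. "alice.radio.show")."""
    root = ET.fromstring(rss_text)
    found = []
    for item in root.iter("item"):
        cats = [c.text for c in item.findall("category")]
        guid = item.findtext("guid") or item.findtext("link")
        if tag in cats and guid not in seen_guids:
            found.append((guid, item.findtext("link")))
            seen_guids.add(guid)  # remember it so the next poll skips it
    return found

# A toy feed with one entry per tag (placeholder TTH values).
FEED = """<rss version="2.0"><channel><title>Alice's radio shows</title>
<item><title>Episode 12</title><link>magnet:?xt=urn:tree:tiger:AAAA</link>
<guid>ep12</guid><category>alice.radio.show</category></item>
<item><title>Holiday snaps</title><link>magnet:?xt=urn:tree:tiger:BBBB</link>
<guid>snaps1</guid><category>alice.photos</category></item>
</channel></rss>"""

seen = set()
hits = new_tagged_items(FEED, "alice.radio.show", seen)
# Only the radio-show entry comes back; polling again with the same
# `seen` set returns nothing, so Bob only hears about genuinely new files.
```

Scheduling the call every 7 days is then just a timer around `new_tagged_items`.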
  2. I don't think it's especially advanced - it's just tacking on a cheap-arsed RSS reader and getting it to output a bunch of info that Apex holds already. The &kt= token is basically just a tag that the feed owner can set.
  3. Gee, I was hoping someone would at least reply saying it was a **** idea. :blushing:
  4. I've been talking with a mate who is trying to do some scripting and was wondering if there is another solution. He is trying to get a remote computer to automatically download files from him (via Apex) as soon as he sticks them in a particular shared folder. From what I gather, he wants to upload them to a seedbox. Rather than screw around with scripts, I thought this was exactly what RSS was built for. Considering Apex already has a webserver (I don't know if it's a stripped-back server or what), would it be possible for a client to publish an RSS feed using that server? Provided they had a static IP or a no-ip.org address, they would be easy to contact. If the RSS feed were to spit out magnet links, there are other nifty things that would make the RSS downloader very useful. Magnet links are very versatile: as well as supplying the TTH (which most do), they can carry other tokens, and I was wondering if Apex supports the &xs= token? This defines an eXact Source and is a URL of some kind (IP or hostname). This is obviously the person publishing the RSS feed, so the person receiving the feed starts downloading [TTH] from [the feed publisher]. So that is the basic way of doing a feed. The slightly more complicated but much cooler way of doing it would be to use the &kt= token, which is for Key Words. Usually, you click on the magnet link and the client starts searching a network for the key words. The file and source are already known here, though, so this token can be used as a tag to only download items in the feed containing [Key Word]. This means the feed owner could publish two or more podcasts (for example) and clients could choose to only download the ones with one key word and not the other.
Basically:

    Read random.no-ip.org/example.rss
    Check for entries marked with [Key+Word+1]
    Feed contains 3x files marked with [Key+Word+1], 4x marked with [Key+Word+2], 7x marked with [Key+Word+3]
    Get [TTH] for entries marked with [Key+Word+1]
    Start downloading [TTH] from [IP:PORT]

...and from there on, the publisher gets to choose who to give priority/extra slots/etc. to. Anyway, to me at least, this seems like a much better way of automatically downloading files (to your box or a remote one) than complicated scripts. I imagine an RSS reader/publisher would be fairly easy to implement, as there are hundreds of them out there and they're usually the subject of programming tutorials. What are the capabilities of Apex's magnet handling? Oh, and is this a good idea in the first place?
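The token-pulling step in the walkthrough above is straightforward on the client side. A rough Python sketch of splitting a feed entry's magnet link into the xt/xs/kt tokens the post describes (the hostname, port, and TTH value are placeholders; note that in a query string `+` decodes to a space, so the key words come back space-separated):

```python
from urllib.parse import parse_qs

def parse_magnet(uri):
    """Split a magnet link into the tokens discussed above: xt (exact
    topic, carrying the TTH), xs (exact source), kt (key words)."""
    assert uri.startswith("magnet:?")
    params = parse_qs(uri[len("magnet:?"):])
    xt = params.get("xt", [""])[0]          # e.g. urn:tree:tiger:<TTH>
    tth = xt.rsplit(":", 1)[-1] if xt.startswith("urn:tree:tiger:") else None
    xs = params.get("xs", [None])[0]        # the publisher's address
    # '+' between key words is decoded to spaces by parse_qs, so split().
    kt = params.get("kt", [""])[0].split()
    return {"tth": tth, "xs": xs, "kt": kt}

link = ("magnet:?xt=urn:tree:tiger:EXAMPLETTHBASE32VALUE"
        "&xs=random.no-ip.org:411&kt=Key+Word+1")
info = parse_magnet(link)
# info["tth"] names the file, info["xs"] says who to download it from,
# and info["kt"] is the tag the subscriber filters on.
```

A client would then compare `info["kt"]` against the subscriber's chosen key words and queue `tth` from `xs` only on a match.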
  5. Segmented downloading techniques

    So, basically you want the "check for corruption and ONLY re-download corrupted parts" feature that Shareaza has?
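The Shareaza feature described here boils down to hashing each fixed-size chunk of the received file, comparing against the expected per-chunk (Merkle leaf) hashes, and re-fetching only the chunks that mismatch. A minimal sketch, using SHA-256 as a stand-in since Python's standard library has no Tiger hash (real DC clients use the Tiger tree hash behind the TTH), with a toy 4-byte chunk size:

```python
import hashlib

CHUNK = 4  # tiny chunk size for the demo; real clients use 64 KiB+ leaves

def leaf_hashes(data):
    """Hash each fixed-size chunk (stand-in for the TTH leaf hashes)."""
    return [hashlib.sha256(data[i:i + CHUNK]).digest()
            for i in range(0, len(data), CHUNK)]

def corrupted_chunks(expected_leaves, received):
    """Compare the received file's chunk hashes against the expected
    leaves; return the chunk indices that need re-downloading."""
    got = leaf_hashes(received)
    return [i for i, (a, b) in enumerate(zip(expected_leaves, got))
            if a != b]

original = b"good data, every chunk intact!!!"   # 32 bytes = 8 chunks
leaves = leaf_hashes(original)                   # published with the file
damaged = original[:8] + b"XXXX" + original[12:]  # corrupt chunk index 2
bad = corrupted_chunks(leaves, damaged)
# Only chunk 2 is flagged, so only those 4 bytes get re-requested
# instead of the whole file.
```

The same comparison run on an intact copy flags nothing, which is what makes the check cheap to repeat after each repair pass.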