
How to improve Gnutella

In this document I store some of my ideas on how Gnutella could be improved or used in different ways. New ones might appear regularly, as I often get these strange flashes of what "should be done" without being able to implement them. See also Anonymize Gnutella and gdf-queue-disscussion (gdf-queue-disscussion.html).
I hope you enjoy the trip!


Automatic Requerying

Would only provide short-term gain, but do long-term harm!

From Dave (from the_GDF, a BearShare developer, afaik):
> Regardless of "what you know", planning an automated system to requery based upon an expected population size of 30,000, I would expect it to have real problems as improvements to the searching architecture make searching 60,000 hosts possible, or more. In short, it won't scale, period!

This is what convinced me that automated requeries are wrong, and will remain wrong in the future, because they would only give a short-lived gain but would hinder the further development of the GNet.

Thanks, Dave!

---following the original idea---

Some time ago I had the idea that a client would be more user-friendly if it regularly retried its queries. As too many queries would flood the network, I wanted to ask: would you consider a client that sent exactly one automatic query every 15 minutes as hostile?

The reason is that I like to let my client run overnight on weekends, and that I'd like to be able to start one rare query (which wouldn't get results at once, or would only give too few) and have it requery automatically, so the number of results would slowly increase; I could start it when going to work and find a good list of results when coming back.

I thought about restricting this to _one_ query every 15 minutes, so if I had four searches, the same search would be sent only once an hour.
To keep popular searches from accumulating too many results, I'd set a cap at 250 results, because I don't need requerying when I get enough results anyhow.
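A minimal sketch of how that rotation could have looked (Python; send_query and count_results are hypothetical hooks into the client, and this is purely an illustration of the rejected idea):

import time
from itertools import cycle

REQUERY_INTERVAL = 15 * 60   # one automatic query every 15 minutes
RESULT_CAP = 250             # stop requerying a search once it has enough results

def requery_loop(searches, send_query, count_results):
    # Rotate through the open searches, sending at most one query per interval.
    # With four searches, each single search goes out roughly once an hour.
    for query in cycle(searches):
        if count_results(query) < RESULT_CAP:
            send_query(query)
        time.sleep(REQUERY_INTERVAL)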

The drawback of this would be one query every 15 minutes, which is still far less aggressive than the way most users search.

The plus side would be longer uptime, even when the host doesn't download actively (because being up wouldn't just waste time anymore, but would make very rare files accessible), and a better feeling for the user (because the program doesn't need to be micro-managed, but can be left alone for hours, so that one returns and finds what one wants). I can't program myself, so please read this as an idea.

Remember: This thought was wrong!


Gnutella News (GNews)

A client to publish news or other regularly updated documents over Gnutella.

I thought of a G-client which used the GNet to publish news files.

It would use the GNet to find other GNews clients (by simple vendor preferencing, which works quite well in LW) and would then send them queries for files starting with "GNews - [ID of the author] - [date:yyyymmdd] - ".

The files could be of any format that uses a single file to convey the information (like pdf, rtf, txt, html archives, or similar).

The client would automatically request documents for a specified date range of a certain author and store and share them. That way the files would get published quickly, and only new files would get distributed without manual searching.
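A rough sketch of how such a client might build those queries (Python; the string layout follows the format above, while the date-range helper and the example author ID are just illustrations):

from datetime import date, timedelta

def gnews_query(author_id, day):
    # Build the search string for one author and one day (yyyymmdd).
    return "GNews - {} - {} - ".format(author_id, day.strftime("%Y%m%d"))

def queries_for_range(author_id, first_day, last_day):
    # One query per day in the range; the client would send these
    # automatically and download whatever it does not already share.
    day = first_day
    while day <= last_day:
        yield gnews_query(author_id, day)
        day += timedelta(days=1)

# e.g. everything Pentagon_Spy published in the first week of July 2003
for query in queries_for_range("Pentagon_Spy", date(2003, 7, 1), date(2003, 7, 7)):
    print(query)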

An author could publish stories there, an e-zine could be done this way, and authors could link to other authors' ID-tags (that would be the way to find them).

An example would be a program which searches every hour for files containing the tag "GNews - Pentagon_Spy - 200307", which would find all files of that author published in July 2003.
Already downloaded files shouldn't be downloaded again, naturally.

The program could also create the searches based on when the last search ran: if that was less than a day ago, it would search for a specific day; if longer, for a whole month; and only manually for the whole year.
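That choice of date granularity might look like this (Python; last_search would come from the client's own bookkeeping, which is assumed here):

from datetime import datetime, timedelta

def gnews_search_tag(author_id, last_search, now=None):
    # Less than a day since the last search: query a specific day.
    # Longer: query the whole month.  A whole-year search stays manual.
    now = now or datetime.now()
    if now - last_search < timedelta(days=1):
        date_part = now.strftime("%Y%m%d")
    else:
        date_part = now.strftime("%Y%m")
    return "GNews - {} - {}".format(author_id, date_part)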

Each file should be signed by the author, and the authors should publish their public keys somewhere, so that the program could verify authorship (after downloading the file, or maybe via a signed hash-string in the MetaData or something similar). False signatures could be marked in the interface as possible frauds. If the hash or the URN were signed, the download might not even be necessary before validating authorship. You couldn't get a wrong file, because a copied URN with hash-string would lead to the correct file, afaik.
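A sketch of that pre-download check, using Ed25519 signatures via the Python cryptography library as one possible choice (how the keys are distributed and where the signed URN sits in the MetaData are assumptions):

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

def urn_is_authentic(urn, signature, author_public_key_bytes):
    # Verify the signed URN (e.g. "urn:sha1:...") taken from the MetaData.
    # If the signature holds, the URN itself guarantees that the later
    # download resolves to exactly the file the author signed.
    public_key = Ed25519PublicKey.from_public_bytes(author_public_key_bytes)
    try:
        public_key.verify(signature, urn.encode("utf-8"))
        return True
    except InvalidSignature:
        return False   # mark as a possible fraud in the interface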

I could imagine anything from the simple layout of a forum up to the complex layout of a real newspaper, though I prefer non-HTML interfaces most of the time. It could also be designed like a news client, or with the threads in a sidebar and the text in the main window.

This would make online publishing far less expensive, because you wouldn't even need a server or a web page designed for it, but could simply put your texts there. Hobby authors might use it.
It could also be used by universities to publish papers.
Or to publish images, maybe by modifying the string to
"GNews - [author] - [date] - [content type] - [name]"

Since you can also search for parts of filenames, a search for GNews [author] [content-type] would result in all files in that specific format (up to the limit of the sharing host).

I can't program, so I can't do this myself, but I thought there might be someone who could.
Most of the core code should already be there.

-----

Both the user ID and the signature go into the MetaData and into the file itself.

Process:
Search for file name,
Check MetaData,
Verify Hash/FileName,
Verify User,
Download file,
Verify file.

Alternate:
Download Signature file, check the Signed Hash, Download News File.

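Glued together, the process could look like this (Python; every helper on the client object is a hypothetical hook named after the steps above):

def fetch_news_file(query, client):
    # Follow the process above: search, check metadata, verify the signed
    # hash/URN and the author, then download and verify the file itself.
    for hit in client.search(query):
        meta = client.read_metadata(hit)
        if not client.verify_signed_urn(meta):   # verify hash / file name
            continue
        if not client.verify_author(meta):       # verify user
            continue
        path = client.download(hit)              # download file
        if client.verify_file(path, meta):       # verify file
            return path
    return None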


Gnutella for distributed Forums

I'd love to see a forum based on Gnutella as storage. The database would have to consist of a file for every user, signed by that user (name and data), so that it would be possible to ban users and to assign trust to users.
It would also have one file carrying the design, which could be updated as well, but maybe only by users you trust enough, or only by yourself.
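A sketch of how those per-user files might be merged into the forum database (Python; the file tuples, the ban list and the verify_signature hook are all assumptions about a format that does not exist yet):

def build_forum_database(user_files, banned_users, verify_signature):
    # Merge the per-user files into one forum database, skipping banned
    # users and files whose signature does not match the claimed user.
    database = {}
    for user_id, payload, signature in user_files:
        if user_id in banned_users:
            continue
        if not verify_signature(user_id, payload, signature):
            continue   # reject tampered or mis-attributed files
        database[user_id] = payload   # the user's name and data
    return database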

It would be distributed over an Island in Gnutella.

For that, a client would have to be modified in such a way that it had only one connection to the normal Gnut, and some connections which used very strong vendor preferencing for its own client.
It would search for the database files once an hour to find other clients of its kind, so that no two separate forum islands would form.

Then it would ask for the current database files via the islanded connections and merge them into the database.

Then it would act like a CGI/PHP/whatever server, so that you could either look at the pages in your browser via localhost, or even have the forum built into the program, so that you could view it there.
You might even have the design built into the program, so that you wouldn't need to send it, but the last two points would limit the flexibility.
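Serving the assembled forum to a normal browser over localhost could be as simple as this (Python standard library; render_forum_html is an assumed function that turns the merged database into HTML pages):

from http.server import BaseHTTPRequestHandler, HTTPServer

def serve_forum(render_forum_html, port=8080):
    # Answer browser requests on http://localhost:8080/ with the forum pages.
    class ForumHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = render_forum_html(self.path).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "text/html; charset=utf-8")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    HTTPServer(("127.0.0.1", port), ForumHandler).serve_forever()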

It might also need a key by which the vendor preferencing would work, so that you only get info for one forum, and not for all forums on the Gnut.

Arne Babenhauserheide

