Zeus

Among the server stuff we've got in the space, we have several IBM x3590 M2s.

Four of them can be linked together to form a single computer; with the bits we've got, this will have 64 CPUs and 512 GB of RAM.

This page is for listing the uses to which we would put such a machine.

Short term usage

Play around with it for a week or so:

  • Test the BSDs on it and make it available to *BSD developers for kernel testing. NetBSD, at least, has only recently gained support for more than 32 CPUs and may not scale well at 64 CPUs; giving developers access to a machine of this size will enable them to test scalability (a rough userland scaling check is sketched after this list).
  • Try to get on the Steam Linux beta and run Steam/Source on it.
  • Muck about with it because we can.
  • We'd then sell it...
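
For the scalability testing mentioned above, a userland smoke test along these lines would at least show whether throughput keeps improving all the way up to 64 workers. This is only a minimal Python sketch - the work sizes and worker counts are arbitrary, and proper kernel-level testing would be down to the *BSD developers themselves:

 # Rough SMP scaling smoke test -- times a trivially parallel, CPU-bound job
 # at increasing worker counts to see roughly where adding CPUs stops helping.
 import time
 from multiprocessing import Pool

 def burn(_):
     # A fixed lump of pure CPU work per task.
     total = 0
     for i in range(2000000):
         total += i * i
     return total

 if __name__ == "__main__":
     tasks = 256  # total chunks of work; arbitrary
     for workers in (1, 2, 4, 8, 16, 32, 64):
         start = time.time()
         with Pool(processes=workers) as pool:
             pool.map(burn, range(tasks))
         print(f"{workers:2d} workers: {time.time() - start:.1f}s")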

Long term usage

  • Zeus would *not* be left on 24/7. We would use a remotely controllable PDU to power it off when not in use, and could charge for the power used (a rough control sketch follows this list).
  • We could set up password protection for the PDU.
  • We cannot use large amounts of internet bandwidth with it - the space's internet connection isn't up to it and it would not be fair to other members.
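
As a rough idea of how switching Zeus off and on via the PDU could work, here is a sketch that drives an SNMP-switchable PDU with net-snmp's snmpset. The hostname, community string and OID are placeholders - the real values depend on whichever PDU we end up with, and the SNMP write community is effectively the password protection mentioned above:

 # Sketch of switching Zeus's PDU outlet on/off over SNMP.  Assumes a PDU that
 # exposes outlet control via a writable OID; the values below are placeholders
 # to be replaced from the real PDU's documentation/MIB.
 import subprocess
 import sys

 PDU_HOST = "pdu.lan"                  # placeholder hostname
 COMMUNITY = "private"                 # SNMP write community (placeholder)
 OUTLET_OID = "1.3.6.1.4.1.xxxxx.1.1"  # placeholder -- take from the PDU's MIB
 STATES = {"on": "1", "off": "2"}      # integer values vary per vendor

 def switch(state):
     subprocess.check_call([
         "snmpset", "-v1", "-c", COMMUNITY, PDU_HOST,
         OUTLET_OID, "i", STATES[state],
     ])

 if __name__ == "__main__":
     switch(sys.argv[1])   # usage: python pdu.py on|off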

Render farm

  • There is some interest from the graphics hackers (?) in a Blender/YafaRay render farm (a rough driver sketch follows this list).
  • Could also be used for video rendering.
  • Data would have to be transferred from/to Zeus with external HDs or by copying over the hackspace LAN.
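
A minimal sketch of how frames could be farmed out on Zeus itself, assuming Blender's command-line renderer (-b / -s / -e / -t / -a); the file path, frame range and job split are invented for the example:

 # Split an animation's frame range into chunks and run one background Blender
 # instance per chunk, so all 64 CPUs stay busy.
 import subprocess
 from concurrent.futures import ThreadPoolExecutor

 BLEND_FILE = "/data/project.blend"   # copied onto Zeus over the LAN or by external HD
 FIRST, LAST = 1, 960                 # example frame range
 JOBS = 16                            # concurrent Blender instances
 THREADS_PER_JOB = 4                  # 16 x 4 = 64 CPUs

 def render_chunk(start, end):
     # -b: background, -s/-e: frame range, -t: render threads, -a: render animation
     subprocess.check_call([
         "blender", "-b", BLEND_FILE,
         "-s", str(start), "-e", str(end),
         "-t", str(THREADS_PER_JOB), "-a",
     ])

 if __name__ == "__main__":
     step = (LAST - FIRST + 1) // JOBS
     with ThreadPoolExecutor(max_workers=JOBS) as pool:
         for i in range(JOBS):
             start = FIRST + i * step
             end = LAST if i == JOBS - 1 else start + step - 1
             pool.submit(render_chunk, start, end)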

Charitable use

  • For charitable bodies who have computing-intensive jobs (from the occasional BARJ (big ass R job) to rendering); or
  • for educational institutions in developing countries who have bona fide research projects that require computing power they cannot afford - this is a *good* idea, but there are so many complications that need to be ironed out, including: getting the data here, export/import controls, verifying the projects, export/import controls, delivering the results, export-import controls... did I mention export-import controls?

Very large dataset processing

  • There are a few hackspace members interested in space and photogrammetry. We could get lunar and Martian photographic datasets from NASA/JAXA/ESA, process them into virtual planets, and then go on virtual moonwalks and Mars walks with Project:OculusRIFT.
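
As a sketch of the embarrassingly-parallel part of that job, something like the following would keep all 64 CPUs busy chewing through a directory of tiles; the paths, formats and the simple downsampling step are just examples (real photogrammetry would need proper tooling on top):

 # Process a directory of lunar/martian image tiles in parallel, one worker per
 # CPU -- here just downsampling each tile to something a demo machine can handle.
 import os
 from multiprocessing import Pool
 from PIL import Image

 SRC = "/data/moon_tiles"         # e.g. raw NASA/JAXA/ESA tiles copied onto Zeus
 DST = "/data/moon_tiles_small"

 def shrink(name):
     img = Image.open(os.path.join(SRC, name))
     w, h = img.size
     img.resize((w // 8, h // 8)).save(os.path.join(DST, name))
     return name

 if __name__ == "__main__":
     os.makedirs(DST, exist_ok=True)
     tiles = [n for n in os.listdir(SRC) if n.lower().endswith((".png", ".tif"))]
     with Pool() as pool:         # defaults to one worker per CPU, i.e. 64 on Zeus
         for done in pool.imap_unordered(shrink, tiles):
             print("done:", done)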

Internet Simulator

  • Using lightweight process isolation (e.g. FreeBSD jails + dummynet, User Mode Linux, etc.), set up a system that approximates the internet (see the dummynet sketch at the end of this section), with e.g.:
    • many virtual ISPs
    • end users with realistic bandwidth
    • an 'Atlantic' and a 'Pacific' with appropriate latency

We can then use this to:

  • test out the various network-topology-aware BitTorrent schemes and see how much of a difference (both in terms of bandwidth usage over backbones and in terms of download speed) the different schemes make.
  • set up our own isolated versions of some of the proposed distributed P2P social network and darknet systems (e.g. Freenet, Bitcoin, Tahrir), and
    • try to break them:
      • can they be tampered with in transit?
      • can they be persuaded to ddos people?
      • can they be poisoned with false info?
      • and so on.
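
To give a flavour of the dummynet side, here is a sketch of shaping a couple of links with FreeBSD's ipfw pipes, driven from Python; the subnets, bandwidths and latencies are invented, and a real setup would generate rules like these for every virtual ISP and jail:

 # Shape links between two simulated 'continents' and one end user with
 # ipfw + dummynet.  Run on the FreeBSD host carrying the jails' traffic.
 import subprocess

 def ipfw(*args):
     subprocess.check_call(["ipfw"] + list(args))

 # A transatlantic 'cable': limited bandwidth, 40 ms each way (~80 ms RTT).
 ipfw("pipe", "1", "config", "bw", "10Mbit/s", "delay", "40ms")

 # A consumer ADSL tail circuit for one simulated end user, with a little loss.
 ipfw("pipe", "2", "config", "bw", "8Mbit/s", "delay", "20ms", "plr", "0.001")

 # Push traffic between the 'European' and 'American' jail subnets through pipe 1.
 ipfw("add", "100", "pipe", "1", "ip", "from", "10.1.0.0/16", "to", "10.2.0.0/16")
 ipfw("add", "101", "pipe", "1", "ip", "from", "10.2.0.0/16", "to", "10.1.0.0/16")

 # And the end user's tail circuit.
 ipfw("add", "200", "pipe", "2", "ip", "from", "10.1.5.7", "to", "any")
 ipfw("add", "201", "pipe", "2", "ip", "from", "any", "to", "10.1.5.7")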

Games

  • Biggest ever Minecraft server
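
For what it's worth, giving the Minecraft server a silly share of Zeus's RAM is just a couple of JVM flags; the jar name and heap sizes below are examples, and the practical ceiling is probably the server's mostly single-threaded world tick rather than memory:

 # Launch the Minecraft server headless with a very large heap.
 import subprocess

 subprocess.check_call([
     "java",
     "-Xms64G", "-Xmx256G",           # initial / maximum heap
     "-jar", "minecraft_server.jar",  # example jar name
     "nogui",                         # no server GUI on a headless box
 ])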