Thursday, December 27, 2007

Immortality for 30 cents

If a gigabyte of hard disk storage costs $100, then you have to manage what you save and what you decide to record in digital form. Now if this cost keeps falling every year, what does it mean when a gigabyte of storage costs nothing? On the Internet a gigabyte of storage is already free; I can get it from Google or Yahoo! just by signing up for an account. If I insist on having that disk drive in my own home, then I can get a gigabyte for about 30 cents.

  • Would you pay 30 cents to be able to keep your worst digital photos forever?
  • Would you pay 30 cents to eliminate the job of cleaning up your hard drive?
  • Would you pay 30 cents so that your email and your voice mail never hit their max capacity?
  • Would you pay 30 cents to store the medical records of 1,000 people in Africa who do not have 30 cents to store their own medical data?
  • Would you pay 30 cents to store your most valuable memories in a Digital Library of Congress forever?
  • Would you pay 30 cents to store a survey of your home for insurance claims?
  • Is there anything so trivial that you would refuse to pay 30 cents to save it?
  • How would you answer these questions when 30 cents drops to 1 cent?
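The arithmetic behind these questions is easy to sketch. Here is a minimal back-of-the-envelope calculation in Python, assuming (my numbers, purely illustrative) a digital photo of about 2 MB and a medical record of about 1 MB, at the post's price of $0.30 per gigabyte:

```python
# Back-of-the-envelope storage economics.
# Assumed sizes (illustrative, not from the post):
#   photo ~ 2 MB, medical record ~ 1 MB.
COST_PER_GB = 0.30  # dollars per gigabyte, from the post

def cost_to_store(megabytes):
    """Dollar cost to store the given number of megabytes."""
    return (megabytes / 1024) * COST_PER_GB

# One gigabyte holds roughly 500 two-megabyte photos,
# so a single photo costs a small fraction of a cent to keep.
photos_per_gb = 1024 // 2
cost_per_photo = COST_PER_GB / photos_per_gb

# 1,000 one-megabyte medical records fit in under a gigabyte,
# so the whole archive costs less than the 30 cents in the question.
records_cost = cost_to_store(1000 * 1)

print(photos_per_gb)   # 512
print(cost_per_photo)
print(records_cost)
```

At these assumed sizes, keeping a photo forever costs well under a tenth of a cent, which is the whole point: the price of saving something has fallen below the price of deciding whether to save it.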

Have you stopped to think about what all of this means to the future of information, commerce, personal memories, ... everything?


Tuesday, December 25, 2007

Santa meets the Warcrafter ....


Monday, December 24, 2007

10 Kinds of People in the World (The Techie Version)

There are 10 kinds of people in the world:

01 - Those who understand binary, and

10 - Those who don't.

If you don't understand this joke, then you are in category 10.
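For anyone who wants the punchline verified, the binary reading of "10" is a one-liner in Python:

```python
# "10" interpreted as a base-2 numeral is the number two,
# so "10 kinds of people" really means "2 kinds of people".
kinds_of_people = int("10", 2)
print(kinds_of_people)  # prints 2
```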


Sunday, December 16, 2007

New Ideas from Next Door

Where do new ideas come from? There are the researchers who look for the next big thing in any industry, including simulation. They seek to improve the state of tools or science based on limitations that customers have right now. The simulation community has been clamoring for better interoperability, shared and rapid terrain generation, and more powerful AI for decades. There are entire conferences and committees that explore and discuss each of these.

But who is looking at how multi-core PCs might fundamentally change the simulation industry? Who can see how to apply business IT tools to simulation? What about supercomputers and Web 2.0 tools? Adjacent to simulation are a number of fields that are thriving on different, but related customer problems - as well as lots of money to solve them. Digging deeper in the same hole is not always the best way to solve the problems that are in the hole with you. Sometimes you need to get out of the hole and see what your neighbors are doing in their holes. Your technology neighbors are just as smart as the people in your hole, perhaps smarter. And often by looking at a similar problem from a different angle they come up with a solution that really makes the problem look a lot easier.

Another advantage of the neighbor's hole is that you are not required to be consistent with all of the historical work that has been done in your own area. You are allowed to think and explore at tangents that are just not quite proper in the official hole. Gian Zaccai at the Design Continuum says that "moving among many different industries frees you from the dogma of any one industry and their firm belief in the links between problems and solutions." Andrew Hargadon at UC Davis believes that "bridging multiple worlds, in essence, makes you less susceptible to the pressures of conforming in any one because you have somewhere else to go."

So where are some promising places to look for technologies that are valuable in the simulation world? I like:

  1. High Performance Computing, including multicore and GPU.
  2. Business IT, including the Service Oriented Architecture.
  3. Computer Games, with emphasis on their tools for creating simulations.
  4. Web 2.0 because they are all about collaboration, networking, and authoring unique information.

When I look at what these communities are doing, I see so many great ideas that can be used directly in our community. The struggle is always in bringing new ideas from the neighbors next door and convincing my own family that they are valuable. Imagine how the two Marines who created Marine Doom felt back in the 1990s when they introduced their ideas. Back then it was, "that's nice, you made a toy look more real." Today the toys are overturning big parts of the industry. All of the industries listed above offer similarly powerful tools.

Get out of your hole and go visit the neighbors.


Thursday, December 6, 2007


What is the primary and overriding difference between the CPU and the GPU in a nice consumer computer? It is NOT the speed at which vectors can be processed, and it is NOT the ability or inability to process double precision numbers.

The most important factor is who makes the most money from that PC you bought. Under the current configuration a nice graphics card can account for 50% of the cost of a computer and the GPU often costs more than the CPU.

This situation does not sit well with Intel and AMD. The latter has made a move to change it by purchasing ATI and working on a new computer design that combines the capabilities of the CPU and the GPU and brings more of the computer's revenues to the combined company (AMD+ATI).

Intel, on the other hand, is changing the paradigm from inside the company and inside the CPU. Their Larrabee project aims to provide a multi-core chip that includes cores that can handle the graphics work that has traditionally been owned by the GPU.

So what is Nvidia doing to defend their very profitable turf? It appears that they are pursuing high-performance applications with their Tesla product, which uses multiple GPUs to handle compute-heavy problems. Given that the consumer desktop is where all of the money is, you would expect them to be doing their own innovation in the consumer space. That may include multi-core GPUs, multi-chip cards, combined CPU/GPU architectures ... or something entirely different.

The situation where the GPU pulls in a significant share of the PC price is, for Intel, the equivalent of a serious threat to Microsoft's ownership of the O/S and Office productivity tools. Intel HAS to rise up to reclaim these revenues. There will be a new CPU/GPU architecture for consumer grade computers specifically because of the current revenue split ... Intel will make it happen. The real question is: why has it taken so long?
