The last day of IDF is usually reserved for an entertaining keynote by Pat Gelsinger, but since Pat has moved on to his new role as Senior VP and GM of the new Digital Enterprise Group, someone else had to fill his shoes.

That someone else is none other than Justin Rattner, quite possibly the best person for the job. Rattner's keynote started off on a clearly nervous note, but wouldn't you be nervous if you were on the receiving end of the Gelsinger torch?

Mannerisms aside, Rattner is an extremely capable engineer and he should have no problem fulfilling the demands of his new role. His keynote today wasn't able to top Gelsinger's past keynotes, but he's on the right track to picking up where Pat left off.

The keynote was a bit long and drawn out, but in the usual Gelsinger style, Rattner provided a good look at what is coming down the road in a segment he called Platform 2015. The idea is to look at trends that will become reality over the next decade, and here are some of the more interesting points...

The Super Resolution demo

15 Comments

  • Verdant - Saturday, March 5, 2005 - link

    sigh - no compiler is going to magically make software work in parallel.


    not everything is "parallel-able" (my new word for the day)

    some tasks must be processed serially; it is the nature of computing.

    my main point is that I hope we can see individual "cores" keep increasing their speed...


    did he say anything about what the highlighted "photonics" box on the slide was about?
  • mkruer - Friday, March 4, 2005 - link

    As if Intel can predict 10 years into the future. They're having trouble predicting one year in advance. I seriously doubt that Intel's massive parallelism will be the solution to all their CPU issues. Looking somewhat ahead, I see the parallelism trend dying out at around 8 pipelines, for the simple reason that most "standard" programs (not games or scientific apps) would never use more than eight. Look at RISC: most RISC architectures have 10 threads, and it's been that way for the last 10 years or more. You can only go so wide before the width becomes detrimental to the processing of the instruction.
  • Locut0s - Thursday, March 3, 2005 - link

    #12 Oops, should have read the above posts. Yeah, that makes more sense then.
  • xsilver - Thursday, March 3, 2005 - link

    the super resolution demo requires video, people;
    it interpolates 60-90 frames into 1 frame like the guy above said....

    and #8 ... I think they mean 1000x because the size of the image used in the demo is very small... so if you wanted to use it on, say, a face, then you would need WAY more computing power.... e.g. the stuff on CSI is so bunk....
  • Locut0s - Thursday, March 3, 2005 - link

    Am I the only one who thinks that the "Super Resolution" demo shown there is just a little too good to be true?
  • xsilver - Thursday, March 3, 2005 - link

    "nanoscale thermal pumps"
    sounds like some tool you need to get botox done :)
  • sphinx - Thursday, March 3, 2005 - link

    All I can say is, we'll see.
  • DCstewieG - Thursday, March 3, 2005 - link

    60 seconds to do 3 seconds of footage. That would seem to me to mean it needs 20x the power to do it in real time. What's this about 1000x?
  • clarkey01 - Thursday, March 3, 2005 - link

    Intel said in early '03 that they would be at 10 GHz (Nehalem) in 2005.

    So don't hold your breath on their dual core predictions.
  • Phlargo - Thursday, March 3, 2005 - link

    Didn't Intel originally say that they could scale the P4 architecture to 10 GHz?
