Earlier today we reported on the most interesting element of Justin Rattner's keynote: Intel's research project to bring a CMOS voltage regulator and North Bridge/GMCH on-chip. Now we bring you two of the other (less) interesting items that he talked about in his keynote, starting with: diamonds.

The Diamond Project

The first demo of the day came from Intel Research's Diamond Project, an Interactive Data Exploration effort. Put plainly, the project addresses the problem of finding specific photos among unorganized, unlabeled digital images.

Despite the introduction of desktop search and technologies like Apple's Spotlight, finding digital photos on your PC is still a problem because, for the most part, they come in unlabeled. The Diamond Project demo offered another solution: instead of searching against an index, it searched for pictures based on their visual content.

The demo during the keynote featured a computer loaded with 85,000 unlabeled, unorganized photos. The goal was to find a picture of Intel's Justin Rattner from his first IDF keynote speech back in the spring. The search was done not by matching labels or file names, but by what Justin Rattner actually looked like in the pictures.

For example, a face recognition filter was run to find all pictures that looked like they were of a person. Obviously sometimes you end up seeing faces where none exist (e.g. in the clouds), so the accuracy of the filter isn't all that great.

The demo also showed how the technology can be configured on the fly, as a new filter to find pictures that featured a blue IDF shirt was created alongside the face filter.


Selecting the filter criteria

The results of that filter ended up being mostly pictures of keynote speakers at past IDFs, which was exactly the type of picture the user was looking for in the first place.
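
To make the filter idea concrete, here's a minimal sketch of what such a pipeline could look like. This is purely illustrative and is not Intel's Diamond code: it assumes the OpenCV library (cv2) with its bundled Haar-cascade face detector, and it approximates the "blue IDF shirt" filter with a crude HSV color mask.

```python
# Illustrative sketch only -- NOT Intel's Diamond code.
# Assumes opencv-python is installed (pip install opencv-python).
import os
import cv2

# OpenCV ships a pre-trained Haar cascade for frontal faces.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def has_face(img):
    """Filter 1: does the picture appear to contain a person?"""
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(faces) > 0

def has_blue_region(img, min_fraction=0.05):
    """Filter 2: does a meaningful fraction of the pixels fall in a rough 'blue shirt' hue band?"""
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (100, 80, 50), (130, 255, 255))  # approximate blue range
    return (mask > 0).mean() >= min_fraction

def search(photo_dir):
    """Run both filters over an unlabeled photo collection, Diamond-style."""
    for name in os.listdir(photo_dir):
        img = cv2.imread(os.path.join(photo_dir, name))
        if img is None:
            continue  # skip anything that isn't a readable image
        if has_face(img) and has_blue_region(img):
            yield name

# Example: print(list(search("idf_photos/")))
```

As in the demo, each filter is just a cheap predicate over pixel data, so new criteria can be composed on the fly, and false positives (faces in the clouds) get whittled down by stacking more filters rather than by relying on labels.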

Comments

  • LoneWolf15 - Friday, August 26, 2005 - link

    I think the Network Manageability Engine is a great concept, but only if it can be updated if necessary through some sort of PROM setup or other option. As resilient as it seems today, someone will eventually find a way around it. At that point, if the hardware can't be reprogrammed with updates to meet the threat, it will be useless and instantly obsolete.
  • Regs - Friday, August 26, 2005 - link

    I had the same general reaction as all of you guys. What's the use of all this crap? However, I guess it's just something Intel wants to show where the future might lead. I just hope AMD are the ones holding the baton.
  • mkruer - Thursday, August 25, 2005 - link

    Is it my imagination, or is this year's IDF very lackluster (Lacking brightness, luster, or vitality; dull)? It seems to me that I'm not remotely interested in any of the hardware they are offering. About the only thing I saw that was interesting was the virtualization technology, but even that will take years to come into the mainstream. This, for me, will go down as one of the more mundane conferences. So far I have seen a lot of hype and a bunch of pretty slides.
  • mikecel79 - Thursday, August 25, 2005 - link

    Running demos of Intel's next generation hardware is not impressive? Lots of 65nm chips running isn't impressive?
  • Kensei - Friday, August 26, 2005 - link

    I agree. I don't know what's not to like here. This is a peek at the future, not a peek at next week. Multi-core, hyperthreading, etc. have huge implications for software engineering and computer science in general regarding how and when to best "divide and bring back together" various computational processes. It adds great complexity to the software engineering field at a time when it already has difficulty writing code that isn't buggy and/or easily exploited. I think this will make architecting (is that a word?) software, before writing the code, an even more important step. Unfortunately, software architecture is something not often taught either on-the-job or in universities.

    What I find interesting about this whole "diamond" thing is why Intel is interested in this sort of stuff at all. It seems much more suited to the type of research being done by MS or at universities. I may be missing something, but what's hardware architecture got to do with identifying people in pictures? Is Intel planning on entering the software development world also?
  • PrinceGaz - Tuesday, August 30, 2005 - link

    Yeah, Diamond does seem a totally software-related project, and unless Intel code it to only run on their processors (which they probably would), it would work just as well on AMD chips.

    I suppose Intel have made a few video codecs in the past which were quite well used; maybe they are planning on doing something like that again, but restricting their use to Intel chips this time?
  • mkruer - Friday, August 26, 2005 - link

    Intel's next generation is the problem. After the Prescott debacle, Intel made a knee-jerk reaction; just look at some of the benchmarks for Yonah and Sossaman (what a joke).

    My predictions for 2006/2007

    1. Intel will go core-ballistic with their “performance per watt” push and run into the opposite of the “extreme gigahertz” mistake, i.e. they will have lots of slower cores, but those cores can't be effectively used by common x86 applications.
    2. VT will be rushed out the door, and be quickly replaced by VT2

    I have been hearing that Intel will be no threat to AMD for 2006, and after reading all the tech sites, it looks like that is 100% correct. Intel, for all their “new technology”, is still playing catch-up.

    Like I said, the only two really interesting things are VT and the new “dynamic” L2 cache that is going to be used for power savings. That's about it.

    BTW, to date I think Transmeta's Crusoe chip holds the record for “performance per watt”, running the equivalent of a 500MHz CPU at 1 watt.
  • CSMR - Friday, August 26, 2005 - link

    You have to take into account that power is something like voltage^2 * speed, and voltage can be reduced if speed is reduced, meaning that performance/watt is not a good metric; something between performance/watt and performance^3/watt would be better.
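
For readers who want the arithmetic behind that comment, here is the standard back-of-the-envelope derivation. It's a sketch assuming the usual CMOS dynamic-power model, with C standing for the switched capacitance:

```latex
% Dynamic power of CMOS logic (C = switched capacitance,
% V = supply voltage, f = clock frequency):
P \approx C \, V^2 f
% Supply voltage can typically be lowered as frequency drops;
% assuming V \propto f as a rough approximation:
P \propto f^3
% Hence halving the clock cuts power by roughly 8x while at best
% halving performance, so plain performance/watt rewards slow clocks,
% whereas performance^3/watt is roughly clock-neutral under this model.
```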
  • ElFenix - Friday, August 26, 2005 - link

    performance per watt is very important for a lot of customers. anywhere that has a huge amount of cores running will love this, especially as energy prices go up. datacenters are already paying attention to this, their AC bills alone would make most places choke. many customers need nothing more than a 1 ghz p3 to run their email, word, and powerpoint. to get these people to upgrade will need a serious focus on TCO, and power management is going to be a huge part of that.
  • mkruer - Friday, August 26, 2005 - link

    But therein lies the problem. With this massively multi-core approach, it will show huge “performance per watt”, but only in massively parallel environments. The good news is that most server applications will be able to use multiple threads; the bad news is that having multiple threads does not mean there is a 1-to-1 performance increase per core. In real life, the maximum number of cores that would be utilized to their full extent is around 4, because too many processes require the outcome of an earlier process.

    What is required is balance, and so far, from what I have seen, Intel is not gunning for balance, but for yet another PR stunt.
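
The diminishing returns described in that last comment are essentially Amdahl's law. Here is a minimal sketch; the function name and the 20% serial fraction are illustrative assumptions, not measured figures for any real workload:

```python
# Minimal Amdahl's law sketch -- illustrative numbers, not a benchmark.
def amdahl_speedup(serial_fraction, cores):
    """Ideal speedup when `serial_fraction` of the work cannot be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# With 20% of a workload serial (one process waiting on another's outcome),
# extra cores quickly stop paying for themselves:
for cores in (1, 2, 4, 8, 16):
    print(f"{cores:2d} cores -> {amdahl_speedup(0.2, cores):.2f}x")
# 1 -> 1.00x, 2 -> 1.67x, 4 -> 2.50x, 8 -> 3.33x, 16 -> 4.00x
```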
