Grid Model Computing

Questions and Answers : Windows : Grid Model Computing
bigsky

Joined: 29 Mar 05
Posts: 4
Credit: 2,348,450
RAC: 0
Message 29583 - Posted: 17 Jul 2007, 13:38:04 UTC

Hello all,
I was wondering if anyone has heard of any plans to migrate any of the computing for climateprediction onto the GPU? Does the grid structure of the model translate well onto a Staggered lattice model that could be programmed in the GPU?
Just wondering....

bigsky
ID: 29583
Profile astroWX
Volunteer moderator

Joined: 5 Aug 04
Posts: 1496
Credit: 95,522,203
RAC: 0
Message 29589 - Posted: 17 Jul 2007, 20:00:51 UTC
Last modified: 17 Jul 2007, 20:05:04 UTC

Written with the understanding that you mean the graphics card and not 'grid computing', which has a different meaning for me:

None to our knowledge. The developers have enough on their plates without wandering into an area of who-knows-what return. They'd have to know how many people have heavy-duty graphics cards, for starters.

What instruction set do they use? Something more limited than CPU instruction sets, isn't it?

For now, developers are busy with 64-bit, with higher-resolution models, with making everything we've completed to date available to scientists worldwide -- and a wee bit of fire-fighting. One wonders when they have time to sleep.
"We have met the enemy and he is us." -- Pogo
Greetings from coastal Washington state, the scenic US Pacific Northwest.
ID: 29589
bigsky

Joined: 29 Mar 05
Posts: 4
Credit: 2,348,450
RAC: 0
Message 29608 - Posted: 18 Jul 2007, 22:56:27 UTC - in response to Message 29589.  

Yes, I'm a developer, so I understand about no sleep. Thank you for your reply.
I'm also an undergraduate at the University of Utah, currently writing a brief, high-level overview of general-purpose computing on the GPU. Actually, I was poking around trying to understand some of the math behind the General Circulation Model: lattice QCD, ODEs, PDEs, matrices, all or none of these? A lot of work has been done (successfully) translating these computational models onto the GPU, even on some older cards. The equations are highly parallel, and the GPU (and thus the project) can take advantage of that. I am aware of one distributed computing project that has written a separate client to do just this. They believe a gain of 20-40% will be achieved, a nice gain for WU turnaround time. A current check of their stats shows 723 (unspecified) GPU units generating 42 TFLOPS, outperforming 22,729 Linux machines running at 39 TFLOPS.
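The data-parallel pattern described above can be illustrated with a toy example. The sketch below is purely illustrative and is not taken from the project's code (which is Fortran): it applies one explicit diffusion step to a 2-D grid, updating every interior point from its four neighbours. Because each point's update is independent of the others, a GPU can compute thousands of them at once.

```python
import numpy as np

# Illustrative only: a single explicit diffusion step on a 2-D grid.
# Each interior point is updated from its four neighbours independently,
# which is exactly the data-parallel pattern that maps well onto GPUs.

def diffusion_step(t, alpha=0.1):
    """One Jacobi-style update; every grid point is computed independently."""
    new = t.copy()
    new[1:-1, 1:-1] = t[1:-1, 1:-1] + alpha * (
        t[:-2, 1:-1] + t[2:, 1:-1] +
        t[1:-1, :-2] + t[1:-1, 2:] -
        4 * t[1:-1, 1:-1]
    )
    return new

grid = np.zeros((64, 64))
grid[32, 32] = 100.0          # a point heat source in the centre
for _ in range(10):
    grid = diffusion_step(grid)
```

On a GPU, each thread would compute one grid point per step; the per-point independence is what makes the speed-ups quoted above plausible for this class of equation.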
Anyway, it's all quite interesting. I am aware of another project that would like to try to code it... maybe in 2009. I'd be glad to pass along any info if you would like; it's pretty cool stuff. If you happen to know of any detail on the climate model and could pass it along, it would be greatly appreciated.
Thanks,
"Bigsky"


ID: 29608
Profile MikeMarsUK
Volunteer moderator
Avatar

Joined: 13 Jan 06
Posts: 1498
Credit: 15,613,038
RAC: 0
Message 29609 - Posted: 18 Jul 2007, 23:23:21 UTC
Last modified: 18 Jul 2007, 23:31:29 UTC

As an indication of the complexity of the code:

* The current model (HadCM3) is one million lines of Fortran

* The newest model (not yet on the site), HadGEM + HadGAM, is ten million lines of Fortran.

To migrate these to an entirely different architecture would probably take man-centuries of work (i.e., rewriting them from scratch).

I'm not sure that current GPUs are capable of that. Perhaps AMD's Fusion project will change things.

The project you mention which already uses GPU processing (Folding@home) can only model very simple molecules on the GPU (although it does so extremely quickly). Larger molecules need to be modelled on the CPU.

--- Edit:

SNAP! again. Two answers within the same second...

--- Edit:

Must be the week for records, last time it was a triple/quad answer, now it\'s a perfectly simultaneous answer :-)

I'm a volunteer and my views are my own.
News and Announcements and FAQ
ID: 29609
Les Bayliss
Volunteer moderator

Joined: 5 Sep 04
Posts: 7629
Credit: 24,240,330
RAC: 0
Message 29610 - Posted: 18 Jul 2007, 23:23:21 UTC
Last modified: 18 Jul 2007, 23:25:01 UTC

It's been posted many times, and a Search would show some posts with more detail.

However, to summarise:
The code is owned by the UK Met Office.
It's been written in Fortran over many decades, by many software engineers, to run on their supercomputers.
The source code is over one million lines long.
It took two software engineers at Oxford University over two years to convert it for use on desktops, make it stable, and get it producing reliable results.
Each time a new climate model is developed, it takes months of testing to find a set of compiler options that creates code producing results consistent with known results for the same set of parameters.
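The consistency testing described above -- checking that a newly compiled build reproduces known results for the same parameters -- can be sketched in miniature. Everything below is hypothetical (the field values, the function name, the tolerances); the real comparison is against full model output, not three numbers.

```python
import numpy as np

# Hypothetical sketch of a build-consistency check: compare a candidate
# build's output against a trusted reference run with identical parameters.
# Tolerances and data are illustrative, not from the actual project.

def consistent(candidate, reference, rtol=1e-5, atol=1e-8):
    """True if every output value of the candidate matches the reference."""
    return np.allclose(candidate, reference, rtol=rtol, atol=atol)

reference = np.array([288.15, 287.90, 289.20])  # e.g. surface temperatures, K
good_build = reference + 1e-7                   # numerically equivalent build
bad_build = reference + 0.5                     # a build that has diverged
```

In practice the hard part is the one described in the post: many compiler-option combinations produce builds that are fast but fail a check like this, and finding a set that passes takes months of test runs.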

Testing has been going on for some time along these lines, and it's slow going, even on the fastest desktops running continuously.
I don't think that the two SEs really like the idea of having to start all over again just to produce something for GPUs. Projects that have written apps for the PS3, for instance, have quite simple requirements compared to this project.
By next year it's hoped to start testing a new high-res model that uses 4.7 GB of RAM.

Horses for courses, as they say. :)

edit
Hi Mike. Now we've got identical posting times. :)

ID: 29610
bigsky

Joined: 29 Mar 05
Posts: 4
Credit: 2,348,450
RAC: 0
Message 29612 - Posted: 19 Jul 2007, 1:16:50 UTC

Thanks for the perspective. I knew this was a much more complex project than the others, but maybe didn't realize it was quite that big. Sorry if I got a bit carried away in my enthusiasm :)
64-bit, eh? I've got a couple of those lying about somewhere as well... doing 32-bit stuff, no doubt. It will be a fine day when those extra bits can be put to good use!
Thanks again!
Aaron
ID: 29612


©2024 cpdn.org