So if I expect people to vote for my proposal for a National Canadian Supercomputer centre, I should provide some examples of what we would do with such a machine. This is certainly not a comprehensive list, just a sampling of interesting articles I have read recently.
Medicine
We all hope technology will bring us breakthroughs in the prevention and treatment of disease, and supercomputing lends a hand in many ways. Here is an example of basic research on the human proteome enabled by a supercomputer shared by universities in Ottawa and Kingston. This type of basic research, also known as bioinformatics, will accelerate the discovery of new cures. There is a small Canadian company that is developing GPU-based solutions for bioinformatics. Genome research, which has similarities to proteome research, is being used to find cures for cancer, and Canada is participating in a global effort to share genome-based cancer research. This is an interesting article on the simulation of DNA repair mechanisms. Gene research becomes a data mining exercise.
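To give a flavour of why gene research becomes a data mining exercise, consider one of the most basic operations in genome analysis: counting k-mers (overlapping subsequences of length k) in a DNA sequence. Here is a toy sketch in Python with a made-up fragment; real genomes run to billions of bases, which is exactly why this kind of counting gets farmed out to big machines:

```python
from collections import Counter

def count_kmers(sequence, k):
    """Count every overlapping k-length subsequence (k-mer) in a DNA string."""
    return Counter(sequence[i:i + k] for i in range(len(sequence) - k + 1))

# A made-up 12-base fragment for illustration only.
seq = "ACGTACGTGACG"
counts = count_kmers(seq, 3)
print(counts.most_common(2))  # the two most frequent 3-mers
```

Scaled up to a full genome, even this trivial step touches billions of substrings, and real pipelines layer alignment, assembly, and statistical analysis on top of it.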
Batteries
Batteries are a foundational technology for many products and of special interest in new automobile design; this is a nice write-up on how supercomputers are being used for better battery design.
Oil & Gas
The oil and gas industry uses supercomputer simulation for exploration (processing seismic data) and production (reservoir simulation), but this story talks about a new computer at the University of Regina that is being used for research into greener methods of petroleum processing. Even the manufacturer of the machine is Canadian.
Space
Although supercomputers are usually associated with computationally intensive algorithms, this article talks about the demands of processing massive amounts of data for space research. Canada also uses supercomputers for processing telescope data.
Climate
Weather simulation has been run on supercomputers for decades, and more recently climate simulation has become another growing area of supercomputer research. Canada should play a leading role in modelling the changes in Arctic climate. This article makes reference to chemistry research using high performance computing for CO2 capture, an important component of climate modelling. There is an interesting animation of an ocean current simulation that attempts to predict the spread of the BP oil spill. It has been suggested that disaster scenarios like this should be simulated as part of the due diligence for all new ocean oil drilling platforms. With all the interest in drilling in the Arctic, it would be in Canada's best interest to model disaster scenarios not only for Canadian rigs in the Arctic, but also for the drilling sites of our Arctic neighbours.
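For a rough flavour of what a spill simulation computes, here is a toy Python sketch: a single "spill" spreading across a small grid by explicit finite-difference diffusion. The grid size, rate, and step count are all made up for illustration; real ocean models couple currents, winds, and chemistry on enormous grids, which is where the supercomputer comes in:

```python
def diffuse(grid, rate, steps):
    """Spread concentration to the four neighbouring cells each step
    (explicit finite-difference diffusion on a square grid)."""
    n = len(grid)
    for _ in range(steps):
        new = [row[:] for row in grid]
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                lap = (grid[i-1][j] + grid[i+1][j] +
                       grid[i][j-1] + grid[i][j+1] - 4 * grid[i][j])
                new[i][j] = grid[i][j] + rate * lap
        grid = new
    return grid

# One unit of "oil" released at the centre of a 9x9 grid.
n = 9
grid = [[0.0] * n for _ in range(n)]
grid[n // 2][n // 2] = 1.0
result = diffuse(grid, rate=0.2, steps=10)
# After ten steps the concentration at the centre has dropped well
# below 1.0 as the spill spreads symmetrically outward.
print(result[n // 2][n // 2])
```

A production model replaces this toy stencil with advection from measured currents, millions of grid cells, and thousands of time steps, but the basic pattern, updating every cell from its neighbours over and over, is why the problem parallelizes so well across a supercomputer.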
Canadian contributions to Supercomputing technology
Canada’s biggest supercomputer belongs to the SciNet consortium. Although this CBC article written last year states that it would be among the top 15 machines in the world, it actually ranked 22nd when it first came online, and when the list was updated last month it dropped to 28th.
I wrote a blog post on an innovative supercomputer built in Quebec. Although the design is truly innovative, the machine currently ranks only 72nd.
This article mentions Canada’s contribution to a multi-national effort to develop exascale computing software, although it is not specific about which Canadian institutions are participating in this development.
Although this CATA article is from last year, it paints a sad picture for companies that develop new technology in Canada; the case in point here is a Canadian company that was developing a supercomputer system, which I have written about in another post. This company did not survive, but I wonder if it would have survived had it landed a Government of Canada (GoC) reference account a year earlier than it did.