
Seattle Children’s Hospital Mines Big Data

August 9, 2011

Dan Gatti, Big Data I/O Forum. Ted Corbett, the director of knowledge management at Seattle Children’s Hospital, is using software from a company called Tableau to draw smart inferences from the 10 terabytes of data locked up in his servers and warehousing appliances; see the article by Stacey Higginbotham, Aug. […]

Continue Reading

eBay/PayPal attacks I/O performance and cuts power by 78%

August 8, 2011

Dan Gatti, Big Data I/O Forum. eBay is a prime example of the benefits of flash. According to Nimbus Data CEO Thomas Isakovich, “eBay had only 2.5TB of flash installed six months ago before recently upgrading to 100TB. Within the PayPal division, where Nimbus is deployed,” […]
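A hypothetical back-of-envelope shows how consolidating a disk tier onto flash can yield a cut of that size. Every number below is an illustrative assumption; the article itself reports only the headline percentage:

```python
# Hypothetical power math behind a ~78% reduction claim.
# All drive counts and wattages are illustrative assumptions; the
# article reports only the headline percentage.

hdd_drives, hdd_watts_each = 600, 10   # assumed disk tier being replaced
ssd_drives, ssd_watts_each = 100, 13   # assumed flash tier replacing it

hdd_watts = hdd_drives * hdd_watts_each   # 6000 W
ssd_watts = ssd_drives * ssd_watts_each   # 1300 W

saving = 1 - ssd_watts / hdd_watts
print(f"{hdd_watts} W -> {ssd_watts} W, a {saving:.0%} reduction")  # ~78%
```

The point is that flash wins not watt-for-watt per drive, but because far fewer devices deliver the same random I/O.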

Continue Reading

Is DARPA looking for a Big Data Big I/O Headache or BIDIOH PAIN?

July 23, 2011

As Big Data grows, the question becomes: how do you handle the I/O bottleneck when processing metadata in a zettabyte world? Is the result a bigger migraine headache, or a solution that may be provided by IBM’s research group? According to Michael Feldman, “Typically metadata itself doesn’t require lots of capacity. To store the attributes […]
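A rough back-of-envelope makes the scale concrete. All figures below are illustrative assumptions, not numbers from the article, but they show why the pain is the rate of small metadata operations rather than raw capacity:

```python
# Back-of-envelope estimate of metadata volume at zettabyte scale.
# Average file size and bytes of attributes per file are assumptions.

ZETTABYTE = 10**21               # bytes of user data
AVG_FILE_SIZE = 10 * 10**6       # assume 10 MB per file
METADATA_PER_FILE = 512          # assume 512 bytes of attributes per file

n_files = ZETTABYTE // AVG_FILE_SIZE
metadata_bytes = n_files * METADATA_PER_FILE

print(f"files:    {n_files:.1e}")                     # ~1.0e+14
print(f"metadata: {metadata_bytes / 10**15:.0f} PB")  # ~51 PB
```

Fifty-odd petabytes is storable; roughly 10^14 objects to index, scan, and update is the headache.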

Continue Reading

Utah CHPC solves I/O bottlenecks, a performance roadblock

July 10, 2011

The chemical and fuels engineering group at CHPC was running an application authored by the Center for the Simulation of Accidental Fires and Explosions, a composite of code contributed by scientists across the country (see the Hemsoth article below). July 08, 2011: HPC Center Traces Storage Selection Experience […]

Continue Reading

Lustre used in Top 70 Big Computer Systems

July 2, 2011

Brent Gorda, CEO and President, Whamcloud

File systems are a critical component of the modern supercomputing architectural model. Tying together vast numbers of compute nodes to achieve the highest computational speeds depends on a set of robust, coordinated fire hoses of data to connect the compute to the storage. Just as the computational model […]
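The fire-hose image maps to something concrete in Lustre: files are split into fixed-size stripes dealt round-robin across object storage targets (OSTs), so many storage servers serve one file in parallel. Below is a minimal sketch of that placement logic; the stripe count and 1 MB stripe size are illustrative choices, not Whamcloud code:

```python
# Sketch of Lustre-style round-robin striping: each stripe_size chunk
# of a file maps to the next OST, RAID-0 style, so a single file can
# be read or written by all OSTs at once.
# stripe_count and stripe_size are illustrative, not real defaults.

def stripe_layout(file_size, stripe_count=4, stripe_size=1 << 20):
    """Map each stripe_size chunk of a file to an OST index."""
    return [(offset, (offset // stripe_size) % stripe_count)
            for offset in range(0, file_size, stripe_size)]

# A 5 MB file across four OSTs with 1 MB stripes lands on OSTs 0,1,2,3,0.
for offset, ost in stripe_layout(5 * (1 << 20)):
    print(f"offset {offset:>8} -> OST {ost}")
```

Aggregate bandwidth then scales roughly with the number of OSTs serving stripes, until the clients or the interconnect saturate.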

Continue Reading

Mercedes Renders Big Data with NVIDIA

June 30, 2011

Striking photorealistic creations that capture the appeal and allure of the iconic Mercedes brand are the latest endeavor of Jeff Patton, a prolific, self-taught, freelance computer graphics (CG) artist. Patton originally created the mechanical illustrations as an example of his skill, but the quality of his work soon captured the attention of Mercedes-Benz USA […]

Continue Reading

LexisNexis HPCC Systems turns 10 years old

June 26, 2011

June 15, 2011 – New York – LexisNexis Risk Solutions today announced that it will offer its data-intensive supercomputing platform under a dual-license, open source model, as HPCC Systems. HPCC Systems is designed for the enterprise to solve big data problems. The platform is built on top of high performance computing technology, and […]

Continue Reading

Hadoop is the Answer! What is the Question?

June 8, 2011

Tim Negris, the VP of Marketing at 1010data, makes some interesting HADOOPIAN observations in this post (see http://cloudcomputing.sys-com/node/1860458). According to Tim, 1010data doesn’t have much actual overlap with Hadoop, which provides programmatic batch job access to linked content files for text, log, and social graph data analytics. I must confess, I have not been paying […]
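For readers who haven't seen that batch model up close, the canonical shape is a mapper and a reducer fed line-oriented text, as in Hadoop Streaming. A minimal word-count sketch, assuming the standard streaming contract (the mapper emits "key\tvalue" lines, Hadoop sorts by key, and the reducer sees runs of identical keys):

```python
#!/usr/bin/env python
# Word count in the Hadoop Streaming style. Run the same script with
# "map" as its argument for the mapper and with no argument for the
# reducer; wiring it into an actual streaming job is left out here.
import sys

def mapper():
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")       # emit one count per word

def reducer():
    current, count = None, 0
    for line in sys.stdin:            # input arrives sorted by key
        word, n = line.rsplit("\t", 1)
        if word != current:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = word, 0
        count += int(n)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()
```

That shape is what the excerpt means by batch job access: every query is a full scan over files, a different animal from 1010data's interactive analytics, hence the limited overlap Negris points to.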

Continue Reading

Big Data I/O Forum Approach

May 28, 2011

Big Data applications are proliferating rapidly, and the Forum’s objective is to help customers implement them successfully. The Forum will explore in depth the issues impacting these applications, identify real-world paradigms, and provide solutions. It will reach out to educate the community and help accelerate the adoption of new technologies.

Continue Reading

UCSD: Gordon tackling Data Intensive Applications

May 28, 2011

Data-intensive applications are creating a data tsunami, requiring new architectures and new ideas. The good folks down at UCSD have received a $20 million grant from NSF to attack data-intensive applications. Gordon, AKA Flash Gordon, is a supercomputer based on SSD flash memory and virtual shared memory.
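The bet on flash is easiest to see as a latency comparison. The figures below are generic order-of-magnitude assumptions for hardware of the era, not Gordon's published specifications:

```python
# Order-of-magnitude random-read comparison, disk vs. flash.
# Latencies are generic era-level assumptions, not Gordon's specs.

HDD_SEEK_S = 10e-3    # ~10 ms per random read on a spinning disk
SSD_READ_S = 100e-6   # ~100 us per random read on flash

hdd_iops = 1 / HDD_SEEK_S   # ~100 random reads/s per spindle
ssd_iops = 1 / SSD_READ_S   # ~10,000 random reads/s per device

print(f"HDD ~{hdd_iops:.0f} IOPS, SSD ~{ssd_iops:.0f} IOPS, "
      f"~{ssd_iops / hdd_iops:.0f}x for random access")
```

For workloads dominated by random access rather than streaming reads, that gap, not peak FLOPS, is the kind of bottleneck a flash-based design targets.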

Continue Reading