Friday, July 21, 2006

Genetic algorithm-based optimization design methods; mimicking nature

Genetic algorithms work well in designing enterprise decision management systems, data mining, and artificial intelligence applications, but you must know what you are doing, since you will not be able to pick one off the shelf!
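To make the "mimicking nature" idea concrete, here is a minimal sketch of the genetic algorithm loop (selection, crossover, mutation) on the toy "OneMax" problem. The fitness function, population size, and rates are all illustrative choices of mine, not from any particular product.

```python
import random

# Toy "OneMax" problem: evolve a bitstring toward all 1s.
# All parameters below are illustrative, not from any real system.

def fitness(individual):
    return sum(individual)  # number of 1-bits

def evolve(n_bits=20, pop_size=30, generations=60,
           mutation_rate=0.02, seed=42):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            # Tournament selection: fitter of two random individuals.
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        new_pop = []
        while len(new_pop) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, n_bits)        # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [bit ^ 1 if rng.random() < mutation_rate else bit
                     for bit in child]            # bit-flip mutation
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))  # best fitness found (maximum possible: 20)
```

The same loop carries over to real decision problems by swapping the bitstring for an encoding of the decision variables and the fitness function for a business objective — which is exactly the part you cannot buy off the shelf.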

Thursday, July 20, 2006

Math Grid Toolkit brings grid computing to business


Search Engine Optimization and Semantics: Index Content Not Keywords

If you are in the data mining or artificial intelligence fields, this is important; please read.

SPARQL Query Language for RDF

The future language of the web. Data mining and artificial intelligence through the web.

Google exec challenges Berners-Lee

Is the Semantic Web the future? Yes it is. Are there some serious issues (like deception) that must be dealt with? Yes.

The Uninvited Guest: Patents on Wall Street

For companies in the financial services (banking, investment banking, stock brokerage firms, and the like) and healthcare industries, understanding the case of State Street Bank & Trust Co. v. Signature Financial Group, Inc. is a keystone when developing products based on data mining and artificial intelligence technologies.

Wednesday, July 19, 2006

Strategic Technology Planning: Picking the Winners

The fundamentals of strategic technology planning for CIOs, CEOs, and other C-level executives.

Data Mining for Network Intrusion Detection

This paper provides good information for those looking for data mining techniques to detect novel and unknown attacks to their networks.

CRM Data Mining: Methods

This very good article compares neural network data mining techniques in CRM vis-a-vis CHAID (Chi-Square Automatic Interaction Detection). The core of the article is that while a neural net's rules are obscure, CHAID produces an understandable set of rules. Moreover, its implementation appears to be easier too.
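The transparency credited to CHAID comes from how it chooses splits: each candidate attribute is scored by a chi-square test on a contingency table against the target, so the winning split reads as a plain rule. Below is a rough sketch of that scoring step only (real CHAID also merges similar categories and applies Bonferroni-adjusted p-values); the CRM records are invented for illustration.

```python
from collections import Counter

def chi_square(rows, attr, target):
    """Chi-square statistic of attr vs. target over a list of dicts."""
    counts = Counter((r[attr], r[target]) for r in rows)
    attr_tot = Counter(r[attr] for r in rows)
    targ_tot = Counter(r[target] for r in rows)
    n = len(rows)
    stat = 0.0
    for a in attr_tot:
        for t in targ_tot:
            expected = attr_tot[a] * targ_tot[t] / n
            observed = counts.get((a, t), 0)
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical CRM records: which attribute best predicts churn?
data = [
    {"plan": "basic",   "region": "east", "churn": "yes"},
    {"plan": "basic",   "region": "west", "churn": "yes"},
    {"plan": "basic",   "region": "east", "churn": "yes"},
    {"plan": "premium", "region": "west", "churn": "no"},
    {"plan": "premium", "region": "east", "churn": "no"},
    {"plan": "premium", "region": "west", "churn": "no"},
]

best = max(["plan", "region"], key=lambda a: chi_square(data, a, "churn"))
print(best)  # -> plan ("plan" separates churners perfectly, so it scores highest)
```

The resulting split is self-explaining — "IF plan = basic THEN churn likely" — which is exactly the readability advantage over a trained neural network's weight matrix.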

A Fast Clustering Algorithm for Large Categorical Data Sets

This algorithm might be efficient (fast performance) for data mining when trying to cluster large categorical (non-numerical) data sets.
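If this is Huang's k-modes algorithm (which the title suggests, though I am not certain), the trick is to replace k-means' Euclidean distance with a count of attribute mismatches, and the cluster mean with a per-attribute mode. A minimal sketch with invented data:

```python
import random
from collections import Counter

# Minimal k-modes sketch: k-means adapted to categorical data.
# Distance = number of mismatched attributes; center = per-attribute mode.
# Records, k, and iteration count are illustrative.

def mismatch(a, b):
    return sum(x != y for x, y in zip(a, b))

def k_modes(records, k, iters=10, seed=1):
    rng = random.Random(seed)
    modes = rng.sample(records, k)          # initial modes: random records
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for r in records:                   # assign to nearest mode
            idx = min(range(k), key=lambda i: mismatch(r, modes[i]))
            clusters[idx].append(r)
        for i, members in enumerate(clusters):
            if members:                     # update: most common value per column
                modes[i] = tuple(
                    Counter(col).most_common(1)[0][0]
                    for col in zip(*members))
    return modes, clusters

records = [
    ("red",  "small", "round"),  ("red",  "small", "oval"),
    ("red",  "small", "round"),  ("blue", "large", "square"),
    ("blue", "large", "square"), ("blue", "small", "square"),
]
modes, clusters = k_modes(records, k=2)
print(sorted(len(c) for c in clusters))
```

Because both the distance and the mode update are simple counting operations, each pass is linear in the number of records — which is where the "fast on large data sets" claim comes from.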

Tuesday, July 11, 2006

Kurzweil develops PDA for the blind

Ray Kurzweil is on the forefront of practical applications for artificial intelligence and we are all thankful!

If you do not know who Ray is, you must read:

High-tech prosthetics: Out on a limb

Here is a real-life application of artificial intelligence (one which also integrates the rules of physics). Somehow I can see the prosthetics and gaming industries providing crucial components to the artificial intelligence industry.

New IBM Mainframe Aims to Optimize Enterprise Resources

This tool is efficient because it separates the data warehousing from the data mining and business intelligence components. Although this may seem like a simple fact, some companies do not understand the computing time and space that data mining techniques impose upon their warehousing.

INC Research Latest Organization to Partner with SAS in Life Science Industry

A good alternative when a company has multiple clients with multiple data formats that must be converted to the same format before beginning the analysis.

Monday, July 10, 2006

Cost Estimation Predictive Modeling: Regression vs. Neural Network

If you know the cost estimating relationships (CERs), regression is the better method; but if you don't, a neural network is the appropriate tool to select the CER for regression.
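To illustrate the first half of that claim: when the form of the CER is known (say, cost roughly linear in weight), ordinary least squares fits it directly and the coefficients stay interpretable — something a neural network would not give you. The data below are made up for illustration.

```python
# Fitting a known-form CER (cost linear in weight) by closed-form
# ordinary least squares. Weights and costs are invented; the "true"
# relationship behind them is roughly cost = 10 * weight + 5.

def linear_fit(xs, ys):
    """Simple linear regression: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = ((sum(x * y for x, y in zip(xs, ys)) - n * mx * my)
             / (sum(x * x for x in xs) - n * mx * mx))
    return slope, my - slope * mx

weights = [10, 20, 30, 40, 50]        # e.g. component weight (kg)
costs   = [105, 198, 305, 402, 495]   # observed cost ($k)

slope, intercept = linear_fit(weights, costs)
print(round(slope, 2), round(intercept, 2))  # -> 9.84 5.8
```

The fitted slope reads directly as "about $9.8k per kg," which is the interpretability advantage regression has when the CER's form is already known.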

Competing on Analytics

Characteristics of companies that use predictive modeling and analytics in their decision making. Read:

Saturday, July 08, 2006

Failure is an option

This is an article that deserves careful and thoughtful reading. Innovation requires tolerance for failure (or for not achieving the original objectives). This concept is no different from anything else in life that is worthwhile: the question is not whether we are going to fail to achieve an objective (ah, how frail and imperfect we humans are...). The issues are: when we fail (or fall), do we get up and try again, and what did we learn during the process that we can use in the future?

Friday, July 07, 2006


I believe that the visualization and simulation technology of the gaming industry and the data mining techniques of econometrics will join at some point to give us incredibly robust artificial intelligence applications for everyday use.

Kognitio and IBM BladeCenter provide fastest, scalable database platform for Data Analytics


I have not had the opportunity to look at this solution.

Supercomputing landscape changing

The rate of companies worldwide that use supercomputers is only going to accelerate as the costs of these systems keep going down. The "business" architecture for the utilization of these high-performance systems is lagging behind the science. The role of a CTO is essential in this matter, since IT in some organizations assumes a task-oriented role rather than a business-partner role.

HyperTransport Consortium Announces New HTX OEM Reference Design Kit

Computer science keeps advancing and making more efficient products for high-performance and cluster computing. Science and business need to join efforts to test these new technologies to increase return on investment (ROI).

Deploying dashboards the right way

I still remember when the CEO of a Fortune 200 company was shown the GE dashboard. He went back to his company and said, "I want a dashboard like this". I tried to explain that the metrics of a dashboard needed to be customized per company, but he did not want to spend the time nor resources needed to create a meaningful dashboard for his company (he wanted what GE had). So, he got what he wanted (and less than a year later he was not the CEO anymore).

Early Warning Signs

In healthcare, as in other industries, a transparent decision management system can make an organization profitable. As in any other industry, deciding what to measure (the metrics) is one of the foundations of an efficient data mining system.

Thursday, July 06, 2006

Scientists to automate thought

This is the future! In my experience, all companies do some type of "cognitive querying" of the available data in their decision management. The keystone of an artificial intelligence is how to integrate that cognitive querying with data mining techniques. Understanding how the human brain works is important in determining how we do ranking (or classification) in our decision making. Go GATSBY COMPUTATIONAL NEUROSCIENCE UNIT!!!

twentysix New York's Andrew Brust Authors Microsoft SQL Server 2005 Developer's Reference

This book was needed...period! In order for developers to use SQL Server 2005 to turn data into information using data mining and artificial intelligence techniques, it is crucial to understand the capabilities of the system. This becomes even more crucial when business and IT come together to design an enterprise decision management system that meets the goals of the business in an efficient and cost-effective manner. Andrew Brust and Stephen Forte have made a significant contribution to the advancement of EDM systems. Thanks.

Fair Isaac Expands Decision Management Opportunities Globally with Localization of Blaze Advisor System

Fair Isaac is again looking ahead of the curve in the area of global decision systems...good job! The next challenge will be to introduce Chinese as one of the languages!!!


Monday, July 03, 2006

Instinct, Not Technology, Rules Pricing Decisions

I have seen this over and over: a company makes a substantial investment in technology and predictive modeling but still makes pricing and most other business decisions on "instinct". In a data-poor environment this approach makes sense, but in a data-rich environment it does not. The bottom line is that it takes change management to incorporate predictive modeling into the business model's decision making.


Microsoft Releases Windows Compute Cluster Server 2003, Bringing High-Performance Computing to the Mainstream

This is the right move for Microsoft: bringing high-performance computing to the mainstream. As Microsoft and other companies compete to provide the most efficient platform for high-performance computing, the challenge is to bring web-based artificial intelligence to the mainstream. I have not tested this system yet, but would like to know its capabilities.

REDMOND, Wash., June 9 /PRNewswire-FirstCall/ -- Microsoft Corp. today announced the release to manufacturing of Windows(R) Compute Cluster Server 2003, the company's first software offering designed to run parallel, high-performance computing (HPC) applications for customers solving complex computations. Windows Compute Cluster Server 2003 accelerates customers' time to insight by providing a reliable, HPC platform that is simple to deploy, operate, and integrate with existing infrastructure and tools. The product will be generally available to customers in August, and evaluation versions will be provided to attendees of the Microsoft(R) TechEd 2006 conference, June 11-16 in Boston.
"High-performance computing technology holds great potential for expanding the opportunities within engineering, medical research, exploration and other critical human endeavors, but until now it has been too expensive and too difficult for many people to use effectively," said Bob Muglia, senior vice president of the Server and Tools Business at Microsoft. "Microsoft is making HPC technology more mainstream by bringing the cost advantages, ease of use and partner ecosystem of the Windows Server(TM) platform to departments and divisions in commercial industry and the public sector. We want HPC technology to become a pervasive resource -- something that's as easy to locate and use as printers are today. Microsoft is excited about the promise this holds for our customers and partners in the months and years ahead."
Bringing High-Performance Computing to the Mainstream
Windows Compute Cluster Server 2003 provides customers with a simplified deployment and management experience, offers easy integration with existing Windows infrastructures, and enables customers to leverage their existing development skills using Microsoft Visual Studio(R) 2005. Via Microsoft's collaboration with the HPC community and strategic partners, Windows Compute Cluster Server 2003 will deliver a more mainstream way for engineers, scientists and researchers to solve scaled-out business and scientific computational problems. This collaboration is designed to meet customers' unique needs by enabling them to choose among and run a variety of compatible HPC applications. Microsoft has also made a multiyear, multimillion-dollar investment in joint projects at academic institutions to help guide ongoing software research and product innovation at Microsoft to address the most challenging technical computing problems.
Windows Compute Cluster Server 2003 has been used by early-adopter customers for oil and gas reservoir simulation and seismic processing, by life sciences customers for simulations of enzyme catalysis and protein folding, and by manufacturing customers for vehicle design and safety improvements. Microsoft's early-adopter customers include AREVA-Challenge (France), BAE Systems, CASPUR (Italy), Cornell University's Computational Biology Service Unit, the National Center for Atmospheric Research, Northrop Grumman Corp., Petrobras (Brazil), Queen's University of Belfast (U.K.), the University of Cincinnati's Genome Research Institute and Virginia Tech's Computational Bioinformatics and Bioimaging Laboratory.
The Computational Biology Service Unit in Ithaca, N.Y., is a core facility for computational biology and bioinformatics for Cornell University researchers, providing research and computational support to biology groups. Windows Compute Cluster Server 2003 has been adopted as a platform for computational biology applications of a wide range of research activities in bioinformatics, including sequence-based data mining, population genetics and protein structure prediction. Many of the projects require lengthy calculations, so a massively parallel computing system, such as Windows Compute Cluster Server 2003, helps accelerate the pace of discovery and insights.
"Adopting Windows Compute Cluster Server 2003 was a natural step for us since we use SQL Server(TM) for our database needs and Windows servers for hosting our Web interfaces," said Jaroslaw Pillardy, Ph.D., senior research associate at the Computational Biology Service Unit. "In addition to serving massively parallel applications, I've found that Windows Compute Cluster Server is a convenient tool for serving the computational needs of many small projects, where installing the software, updating databases and managing other such tasks are much easier than on a set of separate computers."
Northrop Grumman's Space Technology sector has adopted Windows Compute Cluster Server 2003 to deliver on its commitment to helping U.S. government customers achieve mission success. The Space Technology sector is a leading developer of military and civil space systems, satellite payloads, and advanced technologies from high-power lasers to high-performance microelectronics. By running simulation applications, such as MATLAB and FLUENT, in a familiar Windows-based infrastructure, the sector has been able to develop and run simulations and analysis more quickly with lower cost. "Scientists and engineers have huge unmet computing needs today," said Thi Pham, systems engineer in the Space Technology sector at Northrop Grumman. "By adopting Microsoft's high-performance computing solution, we are able to take advantage of economies of scale and efficiencies, helping our scientists and engineers save time and money while increasing availability of computing resources. Beforehand, I had to limit my problem size because I ran out of resources. Now I feel enabled to think bigger."
Microsoft also is working closely with software and hardware partners to help ensure integration of Windows Compute Cluster Server 2003. By the end of 2006, the following software and hardware partners are scheduled to publicly release solutions that run on, or interoperate with, Windows Compute Cluster Server 2003: ABAQUS Inc., Absoft Corp., Advanced Micro Devices Inc. (AMD), ANSYS Inc., BioTeam Inc., Broadcom Corp., CD-adapco, Cisco Systems Inc., Dell Inc., ESI Group, Fluent Inc., Fujitsu Ltd., Hitachi Ltd., HP, IBM Corp., Intel Corporation, Livermore Software Technology Corp., Macrovision Corp., the MathWorks Inc., Mecalog Group, Mellanox Technologies Ltd., MSC Software Corp., Myricom Inc., NEC Corp., Parallel Geoscience Corp., Platform Computing Inc., the Portland Group Inc. (PGI), Schlumberger Ltd., SilverStorm Technologies, Tyan Computer Corp., Verari Systems Inc., Voltaire and Wolfram Research Inc.
Meeting the Growing Demand for High-Performance Computing
Microsoft's entrance into high-performance computing comes at a time when customers are presented with powerful computing economics in the forms of multicore processors, standards-based, high-speed interconnects and ubiquitous x64 (64-bit x86 architecture) computers. Customer demand for HPC is being driven by a combination of increased performance in processors per compute node, low acquisition price per node, and the overall price and performance of compute clusters. These trends are driving new customers to adopt HPC to replace or supplement live, physical experiments with computer-simulated modeling, tests and analysis.
According to analyst firm IDC, the high-performance and technical computing (HPTC) market grew approximately 24 percent in 2005 to reach a record $9.2 billion (U.S.) in revenue, which is the second consecutive year of 20 percent-plus growth in this market. The HPC cluster market share continued to show explosive growth, representing over 50 percent of HPTC market revenue in the first quarter of 2006. IDC reported that worldwide x86 HPTC cluster revenue grew 70 percent year over year (2004 to 2005). IDC indicated that high-performance computing clusters in the lower-end capacity segments of the market will see substantial customer adoption in the coming years. These systems represent a significant initial opportunity for Windows Compute Cluster Server 2003.
Evaluation versions are available today, with general availability scheduled for August via volume licensing and original equipment manufacturer licensing. Windows Compute Cluster Server 2003 will be available in the volume license channel for an estimated price of $469 (U.S.) per node, but prices will vary depending on license and volume.
Published Jun. 9, 2006
