Tuesday, August 22, 2006

Enterprise Decision Management

James Taylor has moved his EDM (Enterprise Decision Management) blog to a new URL. Jim is an expert in the field, so I decided to let everyone know the new link: the primary URL for the EDM blog is www.edmblog.com

Wednesday, August 16, 2006

Artificial Intelligence II

Everything you wanted to know (more or less) but were afraid to ask. It is not complete and not perfect, but it will help you navigate some concepts. Good for beginners.
Read: http://www.resultspk.net/artificial_intelligence/

Artificial intelligence applied to network load balancing using Ant Colony Optimisation

A good potential application of ant colony optimisation to network load balancing issues. Read: http://www.codeproject.com/useritems/Ant_Colony_Optimisation.asp
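
To make the idea concrete, here is a toy ant-colony-style load balancer in Python. This is my illustration rather than the article's code, and the server response times and parameters are invented: requests deposit pheromone on servers that answer quickly, pheromone evaporates over time, and new requests favor high-pheromone servers.

    import random

    servers = {"A": 0.2, "B": 0.5, "C": 0.9}     # hypothetical mean response times (s)
    pheromone = {name: 1.0 for name in servers}  # initial pheromone per server
    EVAPORATION = 0.1

    def pick_server():
        # roulette-wheel choice proportional to pheromone
        total = sum(pheromone.values())
        r = random.uniform(0, total)
        for name, p in pheromone.items():
            r -= p
            if r <= 0:
                return name
        return name

    for request in range(1000):
        chosen = pick_server()
        response_time = random.expovariate(1.0 / servers[chosen])
        for name in pheromone:                   # evaporation step
            pheromone[name] *= (1 - EVAPORATION)
        pheromone[chosen] += 1.0 / (1.0 + response_time)  # reward fast responses

    print(pheromone)  # the fastest server should accumulate the most pheromone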

Tuesday, August 15, 2006

Eleventh International Conference on Computer Aided Systems Theory

Date: February 12-16, 2007

Looks good!

http://www.iuctc.ulpgc.es/spain/eurocast2007/workshop.html

Segregative Genetic Algorithms (SEGA): A Hybrid Superstructure

This sounds like a robust and efficient remedy for premature convergence in genetic algorithms (the population stagnates because offspring are no better than their parents).
http://www.heuristiclab.com/publications/papers/affenzeller01d.pdf
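
The paper's SEGA superstructure is more elaborate, but a toy Python sketch of the underlying offspring-selection idea (my illustration, on an invented one-max problem) shows the principle: a child enters the next generation only if it is strictly fitter than its parents, which maintains selection pressure and counters premature convergence.

    import random

    GENES, POP, GENERATIONS = 40, 30, 200
    fitness = lambda ind: sum(ind)               # one-max: count the 1-bits

    pop = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
    for gen in range(GENERATIONS):
        nxt = []
        while len(nxt) < POP:
            p1, p2 = random.sample(pop, 2)
            cut = random.randrange(1, GENES)
            child = p1[:cut] + p2[cut:]          # one-point crossover
            if random.random() < 0.05:           # occasional mutation
                i = random.randrange(GENES)
                child[i] ^= 1
            if fitness(child) > max(fitness(p1), fitness(p2)):
                nxt.append(child)                # offspring selection passed
            else:
                nxt.append(max((p1, p2), key=fitness))  # fall back to best parent
        pop = nxt

    print(max(fitness(ind) for ind in pop))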

Friday, July 21, 2006

Genetic algorithm-based optimization design methods; mimicking nature

Genetic algorithms work in designing enterprise decision management systems, data mining, and artificial intelligence, but you must know what you are doing, since you will not be able to pick one off the shelf!
Read: http://smarteconomy.typepad.com/smart_economy/2006/05/genetic_algorit.html

Thursday, July 20, 2006

Math Grid Toolkit brings grid computing to business

Read: http://www.macworld.com/news/2006/07/20/mgt/index.php?lsrc=mwrss

Search Engine Optimization and Semantics: Index Content Not Keywords

If you are in the data mining or artificial intelligence field, this is important; please read.
Read: http://www.isedb.com/db/articles/1491/

SPARQL Query Language for RDF

A future query language for the web: data mining and artificial intelligence over web data.
Read: http://www.w3.org/TR/rdf-sparql-query/
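
To give a taste of the language, here is a minimal sketch of running a SPARQL query from Python, assuming the open-source rdflib package; the tiny FOAF graph is invented.

    from rdflib import Graph

    TTL = """
    @prefix foaf: <http://xmlns.com/foaf/0.1/> .
    <http://example.org/alice> foaf:name "Alice" ; foaf:knows <http://example.org/bob> .
    <http://example.org/bob>   foaf:name "Bob" .
    """

    g = Graph()
    g.parse(data=TTL, format="turtle")

    query = """
    PREFIX foaf: <http://xmlns.com/foaf/0.1/>
    SELECT ?name
    WHERE { ?person foaf:knows ?friend . ?friend foaf:name ?name . }
    """
    for row in g.query(query):
        print(row.name)   # -> Bob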

Google exec challenges Berners-Lee

Is the Semantic Web the future? Yes, it is. Are there serious issues (like deception) that must be dealt with? Yes.
Read: http://nundesign.blogspot.com/2006/07/google-exec-challenges-berners-lee.html

The Uninvited Guest: Patents on Wall Street

For companies in the financial services (banking, investment banking, stock brokerage firms, and the like) and healthcare industries, understanding the case of State Street Bank & Trust Co. v. Signature Financial Group, Inc. is a keystone when developing products based on data mining and artificial intelligence technologies.
Read: http://www.frbatlanta.org/filelegacydocs/erq403_merges.pdf

Wednesday, July 19, 2006

Strategic Technology Planning: Picking the Winners

The fundamentals of strategic technology planning for CIOs and CEOs (and other C-level executives).
Read: http://www.cata.ca/files/PDF/Resource_Centres/hightech/reports/indepstudies/StrategicTechnologyPlanningPickingtheWinners.pdf

Data Mining for Network Intrusion Detection

This paper provides good information for those looking for data mining techniques to detect novel and unknown attacks on their networks.
Read: http://scholar.google.com/scholar?hl=en&lr=&q=cache:cU5cnHJrw2kJ:www-users.cs.umn.edu/~kumar/papers/nsf_ngdm_2002.pdf+mahalanobis+%22data+mining%22
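
The search link above mentions the Mahalanobis distance, one of the unsupervised techniques used for anomaly detection in this area. Here is a minimal Python sketch of Mahalanobis scoring of network flows against a baseline of normal traffic; the two features and all numbers are invented.

    import numpy as np

    rng = np.random.default_rng(0)
    # baseline of normal flows: (bytes per flow, packets per flow)
    normal = rng.normal(loc=[500.0, 60.0], scale=[50.0, 5.0], size=(1000, 2))
    mean = normal.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(normal, rowvar=False))

    def mahalanobis(x):
        d = x - mean
        return float(np.sqrt(d @ cov_inv @ d))

    print(mahalanobis(np.array([510.0, 61.0])))   # typical flow: small distance
    print(mahalanobis(np.array([5000.0, 10.0])))  # anomalous flow: large distance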

CRM Data Mining: Methods

This very good article compares neural network data mining techniques in CRM with CHAID (Chi-Square Automatic Interaction Detection). Its core point is that while a neural network's rules are opaque, CHAID yields an understandable rule set. Moreover, CHAID appears to be easier to implement as well.
Read: http://scholar.google.com/scholar?hl=en&lr=&scoring=r&q=cache:1e03s57-wVgJ:chern.ie.nthu.edu.tw/IEEM7103/923834-paper-1-june21.pdf+%22chi+square%22+%22data+mining%22
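
At the heart of CHAID is the chi-square test of independence between a categorical predictor and the response. A minimal Python sketch using SciPy, with an invented contingency table:

    from scipy.stats import chi2_contingency

    #                 churned  retained
    table = [[30, 170],    # customers acquired via web
             [80, 120]]    # customers acquired via phone
    chi2, p, dof, expected = chi2_contingency(table)
    print(chi2, p)  # a small p-value means CHAID would split on this predictor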

A Fast Clustering Algorithm for Large Categorical Data Sets

This algorithm might be efficient (fast) for data mining when clustering large categorical (non-numerical) data sets.
Read: http://scholar.google.com/scholar?hl=en&lr=&q=cache:hPC_IAqQuWUJ:www.dmi.usherb.ca/~wang/teaching/winter_03/Huang_Categ_Clustering.pdf+%22chi+square%22+%22data+mining%22
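
The approach is in the k-modes family: like k-means, but with a mismatch count as the distance and per-cluster modes instead of means, so categories never have to be turned into numbers. A toy Python sketch with invented records:

    import random
    from collections import Counter

    data = [("red", "suv", "cash"), ("red", "suv", "credit"),
            ("blue", "sedan", "credit"), ("blue", "sedan", "cash"),
            ("blue", "coupe", "credit"), ("red", "truck", "cash")]
    K = 2

    def distance(a, b):                 # count of mismatched attributes
        return sum(x != y for x, y in zip(a, b))

    def mode(records):                  # most frequent value per column
        return tuple(Counter(col).most_common(1)[0][0] for col in zip(*records))

    random.seed(1)
    centers = random.sample(data, K)
    for _ in range(10):
        clusters = [[] for _ in range(K)]
        for rec in data:
            clusters[min(range(K), key=lambda k: distance(rec, centers[k]))].append(rec)
        centers = [mode(c) if c else centers[k] for k, c in enumerate(clusters)]

    print(centers)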

Tuesday, July 11, 2006

Kurzweil develops PDA for the blind

Ray Kurzweil is at the forefront of practical applications for artificial intelligence, and we are all thankful!
Read: http://jkontherun.blogs.com/jkontherun/2006/06/kurzweil_develo.html

If you do not know who Ray is, you must read: http://www.kurzweiltech.com/aboutray.html

High-tech prosthetics: Out on a limb

Here is a real-life application of artificial intelligence (one that also integrates the rules of physics). Somehow I can see the prosthetics and gaming industries providing crucial components to the artificial intelligence industry.
Read: http://news.com.com/2100-11393_3-6091189.html?part=rss&tag=6091189&subj=news

New IBM Mainframe Aims to Optimize Enterprise Resources

This tool is efficient because it separates data warehousing from the data mining and business intelligence components. Although this may seem like a simple point, some companies do not understand the computing time and space that data mining techniques impose upon their warehousing.
Read: http://www.ecommercetimes.com/rsstory/51099.html

INC Research Latest Organization to Partner with SAS in Life Science Industry

A good alternative when a company has multiple clients with multiple data formats that must be converted to a common format before beginning the analysis.
Read: http://www.crm2day.com/news/crm/119051.php

Monday, July 10, 2006

Cost Estimation Predictive Modeling: Regression vs. Neural Network

If you know the cost estimating relationships (CERs), regression is the better method; if you don't, a neural network is the appropriate tool to select the CER for regression.
Read:
http://scholar.google.com/scholar?hl=en&lr=&q=cache:aLOpDudTY2AJ:www.eng.auburn.edu/users/aesmith/postscript/tony.pdf++%22predictive+modeling%22
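
A minimal Python sketch of the trade-off on synthetic cost data (my illustration, not the paper's models): when the CER is known to be linear, ordinary least squares suffices; when its form is unknown, a small neural network can approximate it.

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    weight = rng.uniform(1, 10, size=(200, 1))                  # cost driver
    cost = 3.0 * weight[:, 0] ** 1.5 + rng.normal(0, 1, 200)    # nonlinear CER

    linear = LinearRegression().fit(weight, cost)
    net = make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(16,),
                                     max_iter=5000, random_state=0)).fit(weight, cost)

    print("linear R^2:", linear.score(weight, cost))
    print("net    R^2:", net.score(weight, cost))   # captures the curvature better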

Competing on Analytics

Characteristics of companies that use predictive modeling and analytics in their decision making. Read: https://securemx.ed4.net/hbsp/sas/competingsingles.6.pdf

Saturday, July 08, 2006

Failure is an option

This is an article that deserves careful and thoughtful reading. Innovation requires tolerance of failure (or of not achieving the original objectives). This concept is no different from anything else in life that is worthwhile: the question is not whether we are going to fail to achieve an objective (ah, how frail and imperfect we humans are...). The issues are: when we fail (or fall), do we get up and try again, and what did we learn in the process that we can use in the future?
Read: http://innovateonpurpose.blogspot.com/2006/07/failure-is-option.html

Friday, July 07, 2006

INSTINCT GAINS INTELLIGENCE

I believe the visualization and simulation technology of the gaming industry and the data mining techniques of econometrics will one day join to give us an incredibly robust artificial intelligence application for everyday use.
Read:
http://www.gamesindustry.biz/content_page.php?aid=18194

Kognitio and IBM BladeCenter provide fastest, scalable database platform for Data Analytics

Read:
http://www.prweb.com/releases/2006/7/prweb407614.htm

I have not had the opportunity to look at this solution.

Supercomputing landscape changing

The number of companies worldwide that use supercomputers is only going to accelerate as the costs of these systems keep going down. The "business" architecture for the utilization of these high-performance systems is lagging behind the science. The role of the CTO is essential in this matter, since IT in some organizations assumes a task-oriented role rather than a business-partner role.
Read:
http://infotech.indiatimes.com/articleshow/1702364.cms

HyperTransport Consortium Announces New HTX OEM Reference Design Kit

Computer science keeps advancing, producing more efficient products for high-performance and cluster computing. Science and business need to join efforts to test these new technologies and increase return on investment (ROI).
Read:
http://www.linuxelectrons.com/article.php/20060706154023903

Deploying dashboards the right way

I still remember when the CEO of a Fortune 200 company was shown GE's dashboard. He went back to his company and said, "I want a dashboard like this." I tried to explain that the metrics of a dashboard needed to be customized for each company, but he did not want to spend the time or resources needed to create a meaningful dashboard for his company (he wanted what GE had). So, he got what he wanted (and less than a year later he was no longer the CEO).
Read:
http://www.arnnet.com.au/index.php/id;1547840507;fp;2;fpid;1

Early Warning Signs

In healthcare, as in any other industry, a transparent decision management system can make an organization profitable. And as in any other industry, deciding what to measure (the metrics) is one of the foundations of an efficient data mining system.
Read:
http://www.healthleadersmedia.com/magazine/viewmagfeature/content/80595.html

Thursday, July 06, 2006

Scientists to automate thought

This is the future! In my experience, all companies do some type of "cognitive querying" of the available data in their decision management. The keystone of an artificial intelligence is how to integrate that cognitive querying with data mining techniques. Understanding how the human brain works is important in determining how we do ranking (or classification) in our decision making. Go GATSBY COMPUTATIONAL NEUROSCIENCE UNIT!!!
Read: http://www.computing.co.uk/computing/analysis/2159789/scientists-automate-thought

twentysix New York's Andrew Brust Authors Microsoft SQL Server 2005 Developer's Reference

This book was needed...period! For developers to use SQL Server 2005 to turn data into information using data mining and artificial intelligence techniques, it is crucial to understand the capabilities of the system. This becomes even more crucial when business and IT come together to design an enterprise decision management system that meets the goals of the business in an efficient and cost-effective manner. Andrew Brust and Stephen Forte have made a significant contribution to the advancement of EDM systems. Thanks.
Read:
http://biz.yahoo.com/prnews/060706/nyth066.html?.v=58

Fair Isaac Expands Decision Management Opportunities Globally with Localization of Blaze Advisor System

Fair Isaac again is looking ahead of the curve in the area of global decision systems...good job! The next challenge will be to introduce Chinese as one of the languages!!!

Read:
http://www.fairisaac.com/Fairisaac/News/Press+Releases/Fair+Isaac+Expands+Decision+Management+Opportunities+Globally+with+Localization+of+Blaze+Advisor+Sys.htm

Monday, July 03, 2006

Instinct, Not Technology, Rules Pricing Decisions

I have seen this over and over: a company makes a substantial investment in technology and predictive modeling but still makes pricing and most other business decisions on "instinct". In a data-poor environment this makes sense, but in a data-rich environment it does not. The bottom line is that it takes change management to make decisions that incorporate predictive modeling into the business model.

Read:
http://www.destinationcrm.com/articles/default.asp?ArticleID=6143&TopicID=9

Microsoft Releases Windows Compute Cluster Server 2003, Bringing High-Performance Computing to the Mainstream

This is the right move for Microsoft: bringing high-performance computing to the mainstream. As Microsoft and other companies compete to offer the most efficient platform for high-performance computing, the challenge is to bring web-based artificial intelligence to the mainstream as well. I have not tested this system yet, but I would like to know its capabilities.

REDMOND, Wash., June 9 /PRNewswire-FirstCall/ -- Microsoft Corp. today announced the release to manufacturing of Windows(R) Compute Cluster Server 2003, the company's first software offering designed to run parallel, high-performance computing (HPC) applications for customers solving complex computations. Windows Compute Cluster Server 2003 accelerates customers' time to insight by providing a reliable, HPC platform that is simple to deploy, operate, and integrate with existing infrastructure and tools. The product will be generally available to customers in August, and evaluation versions will be provided to attendees of the Microsoft(R) TechEd 2006 conference, June 11-16 in Boston.
"High-performance computing technology holds great potential for expanding the opportunities within engineering, medical research, exploration and other critical human endeavors, but until now it has been too expensive and too difficult for many people to use effectively," said Bob Muglia, senior vice president of the Server and Tools Business at Microsoft. "Microsoft is making HPC technology more mainstream by bringing the cost advantages, ease of use and partner ecosystem of the Windows Server(TM) platform to departments and divisions in commercial industry and the public sector. We want HPC technology to become a pervasive resource -- something that's as easy to locate and use as printers are today. Microsoft is excited about the promise this holds for our customers and partners in the months and years ahead."
Bringing High-Performance Computing to the Mainstream
Windows Compute Cluster Server 2003 provides customers with a simplified deployment and management experience, offers easy integration with existing Windows infrastructures, and enables customers to leverage their existing development skills using Microsoft Visual Studio(R) 2005. Via Microsoft's collaboration with the HPC community and strategic partners, Windows Compute Cluster Server 2003 will deliver a more mainstream way for engineers, scientists and researchers to solve scaled-out business and scientific computational problems. This collaboration is designed to meet customers' unique needs by enabling them to choose among and run a variety of compatible HPC applications. Microsoft has also made a multiyear, multimillion-dollar investment in joint projects at academic institutions to help guide ongoing software research and product innovation at Microsoft to address the most challenging technical computing problems.
Windows Compute Cluster Server 2003 has been used by early-adopter customers for oil and gas reservoir simulation and seismic processing, by life sciences customers for simulations of enzyme catalysis and protein folding, and by manufacturing customers for vehicle design and safety improvements. Microsoft's early-adopter customers include AREVA-Challenge (France), BAE Systems, CASPUR (Italy), Cornell University's Computational Biology Service Unit, the National Center for Atmospheric Research, Northrop Grumman Corp., Petrobras (Brazil), Queen's University of Belfast (U.K.), the University of Cincinnati's Genome Research Institute and Virginia Tech's Computational Bioinformatics and Bioimaging Laboratory.
The Computational Biology Service Unit in Ithaca, N.Y., is a core facility for computational biology and bioinformatics for Cornell University researchers, providing research and computational support to biology groups. Windows Compute Cluster Server 2003 has been adopted as a platform for computational biology applications of a wide range of research activities in bioinformatics, including sequence-based data mining, population genetics and protein structure prediction. Many of the projects require lengthy calculations, so a massively parallel computing system, such as Windows Compute Cluster Server 2003, helps accelerate the pace of discovery and insights.
"Adopting Windows Compute Cluster Server 2003 was a natural step for us since we use SQL Server(TM) for our database needs and Windows servers for hosting our Web interfaces," said Jaroslaw Pillardy, Ph.D., senior research associate at the Computational Biology Service Unit. "In addition to serving massively parallel applications, I've found that Windows Compute Cluster Server is a convenient tool for serving the computational needs of many small projects, where installing the software, updating databases and managing other such tasks are much easier than on a set of separate computers."
Northrop Grumman's Space Technology sector has adopted Windows Compute Cluster Server 2003 to deliver on its commitment to helping U.S. government customers achieve mission success. The Space Technology sector is a leading developer of military and civil space systems, satellite payloads, and advanced technologies from high-power lasers to high-performance microelectronics. By running simulation applications, such as MATLAB and FLUENT, in a familiar Windows-based infrastructure, the sector has been able to develop and run simulations and analysis more quickly with lower cost. "Scientists and engineers have huge unmet computing needs today," said Thi Pham, systems engineer in the Space Technology sector at Northrop Grumman. "By adopting Microsoft's high-performance computing solution, we are able to take advantage of economies of scale and efficiencies, helping our scientists and engineers save time and money while increasing availability of computing resources. Beforehand, I had to limit my problem size because I ran out of resources. Now I feel enabled to think bigger."
Microsoft also is working closely with software and hardware partners to help ensure integration of Windows Compute Cluster Server 2003. By the end of 2006, the following software and hardware partners are scheduled to publicly release solutions that run on, or interoperate with, Windows Compute Cluster Server 2003: ABAQUS Inc., Absoft Corp., Advanced Micro Devices Inc. (AMD), ANSYS Inc., BioTeam Inc., Broadcom Corp., CD-adapco, Cisco Systems Inc., Dell Inc., ESI Group, Fluent Inc., Fujitsu Ltd., Hitachi Ltd., HP, IBM Corp., Intel Corporation, Livermore Software Technology Corp., Macrovision Corp., the MathWorks Inc., Mecalog Group, Mellanox Technologies Ltd., MSC Software Corp., Myricom Inc., NEC Corp., Parallel Geoscience Corp., Platform Computing Inc., the Portland Group Inc. (PGI), Schlumberger Ltd., SilverStorm Technologies, Tyan Computer Corp., Verari Systems Inc., Voltaire and Wolfram Research Inc.
Meeting the Growing Demand for High-Performance Computing
Microsoft's entrance into high-performance computing comes at a time when customers are presented with powerful computing economics in the forms of multicore processors, standards-based, high-speed interconnects and ubiquitous x64 (64-bit x86 architecture) computers. Customer demand for HPC is being driven by a combination of increased performance in processors per compute node, low acquisition price per node, and the overall price and performance of compute clusters. These trends are driving new customers to adopt HPC to replace or supplement live, physical experiments with computer-simulated modeling, tests and analysis.
According to analyst firm IDC, the high-performance and technical computing (HPTC) market grew approximately 24 percent in 2005 to reach a record $9.2 billion (U.S.) in revenue, which is the second consecutive year of 20 percent-plus growth in this market. The HPC cluster market share continued to show explosive growth, representing over 50 percent of HPTC market revenue in the first quarter of 2006. IDC reported that worldwide x86 HPTC cluster revenue grew 70 percent year over year (2004 to 2005). IDC indicated that high-performance computing clusters in the lower-end capacity segments of the market will see substantial customer adoption in the coming years. These systems represent a significant initial opportunity for Windows Compute Cluster Server 2003.
Availability
Evaluation versions are available today from http://www.microsoft.com/hpc, with general availability scheduled for August via volume licensing and original equipment manufacturing licensing. Windows Compute Cluster Server 2003 will be available in the volume license channel for an estimated price of $469 (U.S.) per node, but prices will vary depending on license and volume.
Founded in 1975, Microsoft is the worldwide leader in software, services and solutions that help people and businesses realize their full potential.

Friday, June 30, 2006

Cisco Integrates Ethernet and InfiniBand Management to Support High Performance Data Center Applications and Strengthen InfiniBand As a Data Center Te

I think that cluster computing is an efficient architecture for data mining and artificial intelligence.

Read: http://www.redorbit.com/news/technology/551560/cisco_integrates_ethernet_and_infiniband_management_to_support_high_performance/index.html?source=r_technology

Web-Based Data Mining and Agile Reporting Now Possible with AJAX Technology

Although I am not saying that this is THE TOOL, the idea of using a web-based data mining application for non-technical users IS the right idea.

Read:
http://sys-con.com/read/241997.htm

Artificial Intelligence Techniques Used In Computerized Valuation System

Read:
http://www.sciencedaily.com/releases/2006/06/060621085141.htm

CombineNet Founder Dr. Tuomas Sandholm to Receive IAAI Award for Application of Artificial Intelligence in Strategic Sourcing

PITTSBURGH, PA—June 26, 2006
Dr. Tuomas Sandholm, Founder, Chairman and Chief Scientist of CombineNet, will present his paper "Expressive Commerce and Its Application to Sourcing" at the Eighteenth Annual Conference on Innovative Applications of Artificial Intelligence (IAAI-06). Dr. Sandholm will be honored with the American Association for Artificial Intelligence (AAAI) Deployed Application Award for his insight and achievement in applying artificial intelligence to strategic sourcing activities at the IAAI conference on July 18-20, 2006 in Boston, MA.

Dr. Sandholm's paper discusses the application and impact of artificial intelligence in CombineNet's proprietary optimization technology, used for advanced decision guidance primarily in the field of strategic sourcing. CombineNet's optimization technology analyzes problems with massive amounts of disparate data to guide users to the optimal solution. In advanced sourcing applications, this technology enables corporate sourcing and logistics teams to quickly evaluate hundreds of thousands of supplier proposals alongside corporate business rules and preferences, to find the best award allocations. Across more than 250 advanced sourcing events, CombineNet has produced savings of more than $2.5B on an aggregate spend of more than $19B.

"We have not yet begun to grasp the application of computer science to the world's most pressing issues, inside and outside of industry," said Sandholm. "I'm honored to be recognized by this award, and I look forward to presenting my paper at the conference in July."
Dr. Sandholm's award-winning paper "Expressive Commerce and Its Application to Sourcing" is available for download at http://www.combinenet.com/technology/learning_center/.

In addition to his role at CombineNet, Dr. Sandholm is a professor in the Computer Science Department at Carnegie Mellon University. He has been internationally recognized for his efforts as the recipient of several of the most selective academic awards in the field, including the prestigious Computers and Thought Award, presented by the International Joint Conference on Artificial Intelligence (IJCAI), and the Sloan Research Fellowship, presented by the Alfred P. Sloan Foundation. He has also received the National Science Foundation Career Award and the Association for Computing Machinery Autonomous Agents Research Award.

Dr. Sandholm received his Ph.D. and M.S. in computer science from the University of Massachusetts at Amherst and an M.S. with distinction in Industrial Engineering and Management Science from the Helsinki University of Technology in Finland.



About CombineNet
CombineNet is the advanced sourcing technology company. CombineNet's Advanced Sourcing Application Platform enables companies to engage in Expressive Commerce, the strategic sourcing initiative that allows buyers and sellers to communicate supply and demand more expressively, collaboratively and strategically. The result is a win-win for both buyer and supplier, where greater innovation and supply chain efficiency drive the absolute best value and lowest total cost of ownership for goods and services. CombineNet's ASAP delivers 5, 10, even 20 percent greater realized cost savings than other e-sourcing solutions for the largest businesses in the world including General Mills, PepsiCo, Procter & Gamble, Siemens and others. For more information, visit www.combinenet.com.

When Rules Make the Best Medicine

Again Jim shows his skill at distilling complex issues into practical terms.

http://edmblog.fairisaac.com/weblog/2006/05/live_from_inter_3.html

In the article Jim writes: "You need knowledge governance to build consensus around rules that cross disciplines or involve hard stops."

This statement is crucial in the business of data mining, and it is often mistakenly overlooked by scientists (yes, folks, that's what we are).

Thursday, June 29, 2006

Data Mining and Innovation: Keys to U.S. Health Reform

A very good article on data mining and artificial intelligence in health care.


Data Mining and Innovation: Keys to U.S. Health Reform

Contributor Richard L. Reece, M.D., says that it’s possible that a powerful force embedded in American culture--innovation--will bridge the growing political divide between market-driven and single-payer advocates who seek to resolve cost, coverage and quality problems.

In The Consequential Divide: Which Direction Healthcare? (April 27, 2006), HealthLeaders contributor Preston Gee asserts that a political divide exists between market-driven and single-payer advocates who seek to resolve cost, coverage and quality problems. Either solution, the title implies, harbors profound consequences for healthcare stakeholders. It’s possible that a powerful force embedded in American culture--our genius for innovation--will bridge the divide.

A new solution

Experts point to four basic reform solutions that exist for the U.S.:

A national universal system of coverage
A consumer-driven, market-based system covering those able to pay
State-by-state universal coverage, Massachusetts-style
A national consumer-driven, market-based model with universal coverage through Federal Employee Health Benefits Plan or the Universal Health Voucher Plan, as proposed by the Mayo Clinic
I propose another approach incorporating all these solutions--systematic innovation by government and market-based organizations. This solution will take time. It overlaps government and private sectors, and it is not without doubters. George Lundberg, M.D., past editor of the Journal of The American Medical Association and now editor of Medscape’s MedGenJournal, observes, “Innovations tend to be limited and localized. For the masses, innovations would have to propagate like crazy.”

Comparisons across the pond

In Innovation and Entrepreneurship, Peter Drucker argues the U.S. entrepreneurial economy distinguishes us from Western Europe. Our current economic growth rate is 4 to 5 percent while Europe’s is 1 percent. The U.S. unemployment rate is half of Europe’s. To Drucker, such differences exist because U.S. entrepreneurs are closer to customers while socialistic bureaucrats are isolated and remote from people.

Critics argue that Europe has universal coverage and better health statistics. True, but it’s at the cost of economic stagnation, long waits and limited access to medical technologies. One could persuasively argue U.S. innovations often are strictly technological in nature and have little to do with solving social problems ranging from the uninsured to high cost and poor quality. But I assert that these problems can and will be addressed in innovative ways in the political, data collection and deployment, information technology and healthcare organization arenas.

Major innovations

Six major innovations, sometimes inspired by government, sometimes undertaken independently or in concert with the private sector, are driving health reform: data mining reform, consumer-driven care, pay-for-performance initiatives, national electronic infrastructure building, state-by-state reform experimentation, and “disruptive simplification” innovations at the practice management level. Data mining is the most important and sweeping innovation, because it gives us the tools to restructure and rebuild the existing system based on irrefutable and impersonal data. According to Webopedia, the computer technology dictionary, data mining may be defined as “the class of database applications that look for hidden patterns in a group of data that can be used to predict future behavior. For example, data mining software can help retail companies find customers with common interests. The term is commonly misused to describe software that presents data in new ways. True data mining software doesn't just change the presentation, but actually discovers previously unknown relationships among the data.”

Four areas of data mining are transforming healthcare:

Medicare data mining
This form of data mining is not new, but it remains an inexhaustible innovation source because of its size. John Wennberg and Alan Gittelsohn first explored the Medicare mine in 1973 when they published their classic findings on how medical care varied from one region of the country to another. Ever since, Medicare data has been considered the sine qua non for studying and judging health costs and outcomes. Wennberg considers medical service variation across regions and academic centers “unwarranted.” The variation data, he concludes, does not correlate with better outcomes data. He has proven beyond statistical doubt that “more is not better.” Employers and health plans are aware Medicare data is a treasure trove for data miners wishing to improve quality and outcomes and to pay hospitals and doctors for performance, which is why the Business Round Table and others are pressuring the Bush administration to release all Medicare claims data.

Pharmaceutical data mining
I was present in Minneapolis in the 1970s at the creation of the UnitedHealthcare Group. Perhaps that is why I maintain that pharmaceutical data mining, outside of the billion-dollar leadership of William McGuire, M.D., is what made UnitedHealthcare what it is today. It isn’t generally recognized that 75 percent of United’s profits come from outside the traditional HMO business. In 2005, I spoke with Brian Gould, M.D., a former senior executive for United. “In early 1990, I moved to Minneapolis. I was in charge of United’s Specialty Operations Division--all the non-HMO businesses. These included a pioneering pharmaceutical benefit company, Diversified Pharmaceutical Services. In 1993, we sold DPM to Smith Kline Beecham for an astonishing price of $2.3 billion,” he said. Under the terms of agreement, United HealthCare agreed to provide Smith Kline Beecham “with access to medical data and outcomes analysis.” This meant access to United’s pharmaceutical data mining operation data. For example, if United had pharmaceutical claims data indicating who was taking insulin, Smith Kline could use that data to study a huge population of diabetics.

United has not abandoned pharmaceutical data mining. Its Ingenix division provides clinic research services, medical education services, and therapeutic outcomes and epidemiology research data to pharmaceutical companies, biotechnology companies and medical device manufacturers.

Printed word data mining
Google is so powerful, it has become a verb. One no longer looks up information in medical libraries, one “googles” medical information. Google, I would argue, is turning the medical world upside-down. Medical journals, for example, are struggling to survive because of drops in advertising and readership. Moreover, Google has leveled the information playing field between doctors and patients. The late Tom Ferguson, M.D., a pioneer and prophet of the consumer-driven movement, put it this way in an interview I conducted with him in 1999: “Patient knowledge is different from physician knowledge. Depending on the area of specialization, a specialist might have to stay current on 30, 200 or 400 medical conditions. A general practitioner might have to keep up with 600. Patients only have to know about one disease--their own.”

Clinical, practice management and practice pattern data mining
In the 1970s and 1980s, in a clinical laboratory setting, Russell Hobbie, Ph.D., a physics professor at the University of Minnesota, and I used the Internet to develop two practical clinical applications using data available in physician’s offices--patient age and gender, physical measurements (height, weight, blood pressure), and laboratory data. From this universally available data, we developed two products--the Unified Presentation of Relevant Tests, a differential diagnosis report listing the top ten diagnostic possibilities, and the Health Quotient, a health status report based on height, weight, blood pressure, family or personal history of heart attack or stroke, and laboratory findings. UNIPORT was 80 percent accurate and was commercially successful; the HQ was acclaimed by its recipients and predicted imminent heart attacks with unexpected precision.

True potential

The real potential of data mining lies in two areas: practice pattern grouping using existing data to define costs and consequences, and predictive modeling using broad clinical and financial databases to define the effect of current patient behavior, diagnoses, and interventions on future outcomes and costs.

Practice pattern grouping often goes by the name of episode grouping. As government and private healthcare organizations seek to deliver top-quality care more cost-effectively, episode grouping has come into vogue. By clustering costs around a clinical episode--everything from doctors involved, to diagnoses, to medications, to interventions, to hospitalization, to rehabilitations, to nursing home care, to outcomes-- you can more precisely analyze total outcomes and costs. You can also more accurately—and fairly—assess physician performance. Much of the total cost, for example, of hospitalizations resides in the hospital’s costs. Hospital charges make up about 80 percent of physician costs in the hospital setting. The hospital charges may be beyond the doctor’s control. On the other hand, drugs doctors prescribe or interventions they choose are not. It has been found that total episode costs may vary by factors of as much as 20 to one. In these instances, and even with smaller variations, systematic or structural reforms are in order. True reform lies in rationalizing, not rationing, care.

Predictive modeling requires a more sophisticated mathematical approach and artificial intelligence deployment. One of the pioneers in this field is David Eddy, M.D., Ph.D., who, over the last 10 years at Kaiser Permanente, has developed a predictive model called the Archimedes Model. This model provides a mathematically based lever that moves and manipulates vast amounts of data in a way that simulates reality. It improves and speeds healthcare decision-making at decision points along the healthcare spectrum. Archimedes, funded by Kaiser, has been 10 years in the making. It uses mathematical simulation to create a visual world to help healthcare organizations make critical and administrative decisions. The model has been repeatedly tested and validated to answer complex real-world decisions. In the words of a Kaiser publicist, “The Archimedes model has virtual people who get virtual diseases, go to virtual doctors, get virtual tests, receive virtual treatment, and have virtual outcomes.” Using Kaiser’s eight million-member database, Archimedes played a role in the Vioxx recall, and it is currently being used as a tool to conduct virtual clinical trials by major pharmaceutical companies.

Another company pursuing goals similar to Archimedes is MedAI (short for Medical Artificial Intelligence) in Orlando, Fla. MedAI’s outcomes measurement application, Pin Point Quality, enables users to easily identify specific steps to monitor and improve clinical outcomes while reducing healthcare costs. Clients can integrate data from clinical and financial legacy systems. This allows clients to undertake quality initiatives. Medical directors, administrative directors and other members of the organization can create reports of quality indicators, which they can then use to drive practice changes in their organization.

In formulating the argument that American innovation in general and innovation in the handling of data in particular will change the world, I have only touched briefly on such innovative and powerful movements as consumer-driven care, pay-for-performance, the building of a national electronic infrastructure, the political innovation in Massachusetts, or “disruptive innovations” that are simpler, less costly, and more convenient to use. These are all terribly important, and their full potentials will, no doubt, require data-based innovations.



Richard L. Reece, M.D., is a pathologist, writer, editor, speaker and consultant in Old Saybrook, Conn. His latest book, Key “Under the Radar” Innovations Transforming U.S. Health Care, will be published later this year. Reece may be reached at rreece1500@aol.com.


Thursday, June 22, 2006

SQL 2005 Analysis Services

Has anybody tried to build an artificial intelligence using this tool?

Artificial Intelligence Resources

Read:
http://airesources.blogspot.com/2006/06/artificial-intelligence-resources.html

If you do not know where to start your research, this is a good site!

Data Mining Resources

Read:
http://dataminingresources.blogspot.com/2006/06/data-mining-resources.html

If you do not know where to start your research, this is a good site!

Integrating Analytics into Business Processes

Read:
http://edmblog.fairisaac.com/weblog/2006/06/integrating_ana.html

Jim's article is accurate and insightful. My comment to him was that he did not make a distinction between supervised and unsupervised data mining. Jim is obviously a real pro!!

Wednesday, June 21, 2006

Research explores data mining, privacy

Read:
http://www.usatoday.com/tech/news/surveillance/2006-06-18-data-mining-privacy_x.htm?csp=34

The key to using data mining and artificial intelligence techniques in a cost-efficient manner is finding ways to use the raw real-time data (thanks to Federal Reserve Chairman Ben Bernanke for his contribution in this area). If we need to strip data identifiers and convert them to non-identifiers before we apply DM and AI techniques, we will never have a cost-efficient method. The issue is not stripping the data of identifiers; the issue is limiting who has access to the data and the analysis, and for what purpose.

Data mining still needs a clue to be effective

Read:
http://www.detnews.com/apps/pbcs.dll/article?AID=/20060620/BIZ04/606200378

Not completely correct. There are supervised data mining models that require data tags to be effective. On the other hand, there are unsupervised data mining models that do not require data tags; the sketch below contrasts the two.
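
A minimal Python sketch of the distinction on synthetic data: the supervised model cannot be fit without the label tags, while the clustering model finds structure with no tags at all.

    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs
    from sklearn.linear_model import LogisticRegression

    X, y = make_blobs(n_samples=300, centers=2, random_state=0)

    supervised = LogisticRegression().fit(X, y)          # needs the tags y
    unsupervised = KMeans(n_clusters=2, n_init=10,
                          random_state=0).fit(X)         # needs no tags

    print(supervised.predict(X[:5]))
    print(unsupervised.labels_[:5])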

Which company will create the next generation of artificial intelligence that could be used as the foundation for any software?

I think that Microsoft and Google have the financial and intellectual resources, but it may be smaller companies (gaming, healthcare, etc.) that have the flexibility and urgency to come up with a solution.

Physics may take video games to next level

Read:
http://www.usatoday.com/tech/gaming/2006-06-20-physics-gaming_x.htm?csp=34

Congratulations Manju and Ageia!!! These folks understand that an AI must be 3D and based on science. This will bring a dramatic change to the gaming community and to the AI community as a whole.

URA pilots e-filing system that “learns” from human decision-making

Read:
http://www.computerworld.com.sg/ShowPage.aspx?pagetype=2&articleid=3837&pubid=3&tab=Home&issueid=92

Data mining + human experience = AI

Poker Academy Supplied World Renowned Texas Holdem Poker Artificial Intelligence to Myelin Media's "STACKED(TM) with Daniel Negreanu"

Read:
http://news.yahoo.com/s/prweb/20060619/bs_prweb/prweb400195_1

Analytics + human experience = AI

Artificial Intelligence Helps Stock Shelves

Read:
http://www.redorbit.com/news/technology/543187/artificial_intelligence_helps_stock_shelves/index.html?source=r_technology

Data Mining + Human Experience = AI

Artificial Intelligence Techniques Used In Computerized Valuation System

Read:

http://www.sciencedaily.com/releases/2006/06/060621085141.htm

An example of data mining plus cognitive experience = AI

Microsoft bets on a robotic future

Read this article:
http://money.cnn.com/2006/06/20/technology/microsoft_robots.reut/index.htm?cnn=yes

The key in robotics is the design of an artificial intelligence that combines data mining techniques (analytics) and cognitive (human) experience. Microsoft has the basic analytical tools (regression, clustering, averages, standard deviations, ranking, etc.). The challenge is the application of those tools. The analytical tools determine patterns in the data, but we still need the cognitive exceptions of human experience as the basic benchmark for an AI to develop.

Tuesday, June 20, 2006

AI = Analytics + Past Experience

I am in the process of developing an artificial intelligence to detect patterns in very large databases (over 300 terabytes). My theory is that if you join data mining techniques (analytics for supervised and unsupervised data modeling) and past experience (the exceptions in cognitive querying), you get an artificial intelligence; a toy sketch of the combination follows.
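
A toy Python sketch of that formula (every number and rule here is invented for illustration): an analytic score flags candidate patterns, and a hand-coded rule capturing past human experience overrides the known exceptions.

    import numpy as np

    rng = np.random.default_rng(0)
    amounts = rng.lognormal(mean=4.0, sigma=1.0, size=10_000)   # e.g. transactions

    # analytics: flag statistical outliers (z-score on the log scale)
    log_amounts = np.log(amounts)
    z = (log_amounts - log_amounts.mean()) / log_amounts.std()
    flagged = z > 3.0

    # past experience: a hypothetical cognitive exception -- amounts that are
    # (nearly) round multiples of 100 are known-benign batch payments
    mod = amounts % 100
    is_batch_payment = np.minimum(mod, 100 - mod) < 0.5
    alerts = flagged & ~is_batch_payment

    print(int(flagged.sum()), "statistical flags ->", int(alerts.sum()), "alerts")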
