Steps toward machine intelligence

Futurists paint a view of an integrated man/machine ‘hybrid’, achieved by instantiating elements of human cognition in computational devices. This essay outlines several important technologies, in development or early product launch, that form the foundation of this future.

Recent developments in new computer architectures exhibit ‘bio-mimicry’ of the function and structure of the brain. In general these new architectures are configurable, optimizing around the nature of the computational problem to be solved. Formerly, solution strategies were constrained by the general constructs of the Von Neumann architecture that has dominated computing for fifty years. Rather than executing explicit instructions (move x to location y, perform operation z), data is processed in a more natural flow, whose function is analogous to that of synapses in the brain.
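
To make the contrast concrete, the toy Python sketch below models a single leaky integrate-and-fire neuron, a common abstraction for this synapse-like style of processing. It is a minimal illustration of the principle (the class name, weights, and thresholds are invented for the example), not a model of any vendor’s actual chip.

```python
# A minimal sketch of synapse-like processing: a leaky integrate-and-fire
# neuron accumulates weighted input 'spikes' and fires when a threshold is
# crossed, rather than executing an explicit move-and-operate instruction
# stream as a Von Neumann machine does.

class LIFNeuron:
    def __init__(self, threshold=1.0, leak=0.9):
        self.threshold = threshold  # firing threshold
        self.leak = leak            # fraction of potential retained each step
        self.potential = 0.0        # membrane potential (the neuron's state)

    def step(self, spikes, weights):
        """Integrate one time step of weighted input spikes; return True on fire."""
        self.potential = self.potential * self.leak + sum(
            w for s, w in zip(spikes, weights) if s
        )
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return True
        return False

neuron = LIFNeuron()
weights = [0.4, 0.3, 0.5]           # invented synaptic weights
for t, spikes in enumerate([[1, 0, 1], [0, 1, 0], [1, 1, 1]]):
    print(f"t={t} fired={neuron.step(spikes, weights)}")
```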

IBM has achieved important milestones with neurosynaptic chipsets, a step toward human cognition. The major challenge for these architectures is the absence of mature compilers, operating systems, and other development tools that can bring to bear the awesome potential of the hardware. Additionally, in an industry where economies of scale are critical, reaching volume production determines profitability.

HP, with The Machine, has introduced ‘special purpose’ processor cores architecturally tuned toward solving problems using ‘big data’. The Machine also incorporates memristor memory technology. Not unlike persistent memory in a human brain, memristors preserve their state (1’s or 0’s) when power is removed. Another significant memristor benefit is easing the challenges of ‘queuing’ data from disk into high-speed memory, a highly complex process that must anticipate what data will be needed, and when, and move that data from disk in time. This optimization yields a significant performance benefit. Further advances involve optical interconnects, which offer high-speed data transfer without the increased heat and signal noise that accompany smaller, faster metal interconnects.
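
The practical effect of persistence is easiest to see in code. In the hedged Python sketch below, an ordinary memory-mapped file stands in for a byte-addressable memristor array (the file name and layout are assumptions made for illustration); the point is the pattern: state is read and updated in place, with no separate staging from disk.

```python
import mmap
import os
import struct

# Illustration only: a memory-mapped file emulates persistent, byte-addressable
# memory. State written here survives the process, so there is no separate
# step of queuing data from disk into fast memory.
PATH = "counter.pmem"   # hypothetical persistent-memory region

def open_pmem(path, size=8):
    if not os.path.exists(path):
        with open(path, "wb") as f:
            f.write(b"\x00" * size)
    f = open(path, "r+b")
    return f, mmap.mmap(f.fileno(), size)

f, pmem = open_pmem(PATH)
count = struct.unpack_from("<Q", pmem, 0)[0]   # read state in place
struct.pack_into("<Q", pmem, 0, count + 1)     # update state in place
pmem.flush()                                   # make the update durable
print(f"run number {count + 1}")               # increases across runs
pmem.close()
f.close()
```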

Beyond computational architecture, data schemas are also evolving to accommodate a wide breadth of data types. There are multiple commercial and open-source ‘database management’ applications optimized for ‘unstructured’ data, as might be used for customer analytics of ‘big data’. Hadoop is a popular open-source framework, with commercial products from IBM, MarkLogic, and others. Further out on the horizon, from a commercialization perspective, are more radical approaches to computation.
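
Hadoop’s programming model is worth a quick illustration. The self-contained Python sketch below runs the classic map/shuffle/reduce word count in a single process; a real Hadoop cluster distributes the same three phases across many machines, but the model is unchanged.

```python
from collections import defaultdict

# In-process sketch of the map/reduce pattern Hadoop popularized for
# unstructured data. The records here are invented sample input.
records = ["big data needs new schemas", "data schemas keep evolving"]

# Map: emit (key, value) pairs from each unstructured record.
pairs = [(word, 1) for record in records for word in record.split()]

# Shuffle: group values by key.
groups = defaultdict(list)
for word, count in pairs:
    groups[word].append(count)

# Reduce: aggregate each key's values.
counts = {word: sum(values) for word, values in groups.items()}
print(counts)   # e.g., {'big': 1, 'data': 2, ...}
```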

Although not applicable to all computational problems, quantum computing (defined by Cheuk Chi Lo and John J.L. Morton, in an August 2014 IEEE Spectrum article, as “a system that can store and process information according to the laws of quantum mechanics”) holds great promise in selected applications, such as molecular engineering and cryptography.
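
Lo and Morton’s definition can be illustrated, though not exploited, on a classical machine. The short NumPy sketch below simulates a single qubit: a Hadamard gate places it in superposition, and measurement collapses it to a classical bit.

```python
import numpy as np

# A qubit's state is a vector of complex amplitudes; gates are unitary
# matrices acting on it. This merely simulates quantum behavior.
ket0 = np.array([1, 0], dtype=complex)                       # |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate

state = H @ ket0              # equal superposition of |0> and |1>
probs = np.abs(state) ** 2    # Born rule: probability = |amplitude|^2
print(probs)                  # [0.5 0.5]

# Measurement collapses the superposition to a single classical outcome.
outcome = np.random.choice([0, 1], p=probs)
print(f"measured |{outcome}>")
```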

D-Wave Systems of Burnaby, British Columbia, offers a ‘quantum computer’ implemented in exotic superconducting materials that require near-absolute-zero temperatures. The referenced article discusses technical developments that would implement quantum computers in silicon-based circuits operating at room temperature; a significant advance in this field.

The black magic, from the author’s point of view, is the advent of machine learning, as demonstrated by IBM’s Watson of Jeopardy fame. Early commercial offerings of artificial intelligence were named expert systems. These involved attempts to catalog a specific body of knowledge in its entirety (e.g., every possible chess move and its counter-move(s)); an exceedingly difficult task to undertake, especially for non-deterministic problems. The set of problems to which this approach applies is limited to those with identifiable and discrete options.
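
A toy example makes the limitation plain. The rules below (invented for illustration) are an expert system in miniature: every condition and conclusion must be cataloged by hand, and the system is blind to anything outside its catalog.

```python
# A toy rule-based 'expert system' in the spirit of those early products.
# The rules are hypothetical; a real system would need thousands of them.
RULES = [
    (lambda f: f["fever"] and f["cough"], "possible flu"),
    (lambda f: f["fever"] and not f["cough"], "possible infection"),
    (lambda f: not f["fever"], "likely not flu"),
]

def diagnose(facts):
    # Fire the first rule whose condition matches the known facts.
    for condition, conclusion in RULES:
        if condition(facts):
            return conclusion
    return "no rule matched"   # the system knows nothing outside its catalog

print(diagnose({"fever": True, "cough": True}))   # possible flu
```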

The machine learning approach involves the ‘computer’ learning from its responses, whether correct or incorrect, as indicators for answering future questions. The initial use model of these capabilities (referred to as ‘cognitive computing’ by IBM) is collaboration-based (e.g., medical and financial professionals use these tools to augment their professional activity), as opposed to stand-alone ‘decision machines’.
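
That feedback loop is simple to sketch. The perceptron below is a deliberately minimal stand-in (Watson’s actual methods are far richer): it adjusts its weights after every wrong answer, so each response shapes how it answers future questions.

```python
# Minimal learning-from-feedback loop: a perceptron updates its weights
# only when its answer is wrong, improving future answers.

def predict(weights, x):
    return 1 if sum(w * xi for w, xi in zip(weights, x)) > 0 else 0

def train(examples, passes=10, lr=0.1):
    weights = [0.0] * len(examples[0][0])
    for _ in range(passes):
        for x, label in examples:
            error = label - predict(weights, x)   # 0 when the answer was correct
            weights = [w + lr * error * xi for w, xi in zip(weights, x)]
    return weights

# Hypothetical training data: learn the OR function from feedback.
# The leading 1 in each input is a bias term.
data = [([1, 0, 0], 0), ([1, 0, 1], 1), ([1, 1, 0], 1), ([1, 1, 1], 1)]
w = train(data)
print([predict(w, x) for x, _ in data])   # [0, 1, 1, 1]
```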

Continued collaboration between developers and domain experts (e.g., physicians) promises workable machine-assisted decision-making. The primary goal of ‘big data’ is the quality of decision-support analytics, itself a step toward the ultimate goal of machine intelligence; one achievable through the discipline of incremental improvement as opposed to revolutionary discovery.

About Don

Former C-Level Exec of NASDAQ company / Founding member of several 'startups' / Georgetown University, Masters, Technology Management / InfoSec Certifications: CISSP, CISO (Carnegie Mellon CIO Institute)