Inventing the Future

Alan Kay once said, “The best way to predict the future is to invent it.” It is always nice to see efforts to invent the future. Some are small with near-term potential and some are large, ambitious ones. Here are a couple I came across today in a mail I received from ACM TechNews.

Intel’s Challenge

“We Challenge You: A Call to Action on Global Issues

Intel’s INSPIRE•EMPOWER Challenge is calling for the best technology solutions to address four areas of global need – education, healthcare, economic development, and the environment. The contest will award seed funding of $100,000 USD to one winner in each category. The Challenge is designed to inspire developers, individuals, and organizations to innovate and empower them to deliver new ways to apply technology to these issues. The INSPIRE•EMPOWER Challenge advances the commitment of the Intel World Ahead Program – to connect people to a world of opportunity.”

You can find out more about Intel’s challenge here.

Expeditions in Computing

This initiative from NSF is a more ambitious one.

The Directorate for Computer and Information Science and Engineering (CISE) at the National Science Foundation (NSF) has established four new Expeditions in Computing. Each of these $10 million grants will allow teams of researchers and educators to pursue far-reaching research agendas that promise significant advances in the computing frontier and great benefit to society.

“We created the Expeditions program to encourage the research community to send us their brightest and boldest ideas,” said Jeannette Wing, NSF’s assistant director for CISE. “We received an overwhelming response, and I’m delighted with the results of our first annual competition. The four Expeditions together push both the depth and breadth of our field: pursuing longstanding scientific questions of computing, creating a new field of computational sustainability, experimenting with novel technologies for secure, ubiquitous computing and communications of the future, and exploring what we even think of as computing.”

It covers four broad areas:

  1. Understand, Cope with, and Benefit From Intractability
  2. Computational Sustainability: Computational Methods for a Sustainable Environment, Economy, and Society
  3. Open Programmable Mobile Internet 2020 project
  4. Molecular Programming Project

CISE anticipates hosting an Expeditions competition annually, with three new awards anticipated each year. The deadline for preproposal submission to the second annual Expeditions competition is September 10, 2008.

Do you know of any other initiatives of a similar nature around the world? Please add a comment with a link.

Linguistic Interface to the Web?

What is common between Ubiquity, YubNub, and Inky? They all seem to bring back a simple command-line interface to the web: more flexible, a bit more powerful, and hopefully a lot easier to customize. I have always yearned for this kind of interface. In fact, we were even thinking of adding scripting to our browser plug-ins. It is nice to see that Mozilla.org is planning to introduce this interface in Firefox. Here is a brief description from the article in Technology Review:

The idea, says Beard, is to make it easier to find and share information on the Web while avoiding cumbersome copy-and-paste instructions. Traditionally, if you want to e-mail a picture or a piece of text to a friend, look up a word in an online dictionary, or map an address, you have to follow a series of well-worn steps: copy the information, open a new browser tab or an external program, paste in the text, and run the program.

I really like the concept of Command Extensibility:

Ubiquity is highly customizable. From the start, the interface will come with built-in instructions or “verbs,” such as “e-mail,” “Twitter,” and “Digg,” but Beard expects people to add many new ones.
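To make the idea of command extensibility concrete, here is a minimal sketch of a verb registry. It is purely illustrative: Ubiquity's real commands are written in JavaScript, and the verbs and helper names below are invented for this example.

```python
# Hypothetical sketch of an extensible "verb" registry in the spirit of
# Ubiquity's commands. This is not Ubiquity's actual API (which is
# JavaScript-based); the verbs and helpers here are made up.

VERBS = {}

def verb(name):
    """Register a function as a named command verb."""
    def register(func):
        VERBS[name] = func
        return func
    return register

@verb("email")
def email(selection, arg):
    print(f"Sending {selection!r} to {arg}")

@verb("map")
def map_address(selection, arg=None):
    target = arg or selection
    print(f"Looking up {target!r} on a map service")

def run(command_line, selection=""):
    """Parse 'verb argument' input and dispatch to the registered verb."""
    name, _, arg = command_line.partition(" ")
    VERBS[name](selection, arg or None)

run("email alice@example.com", selection="a paragraph I highlighted")
run("map 1600 Amphitheatre Parkway")
```

Adding a new command is just a matter of registering another function, which is exactly the kind of user-level extensibility the article describes.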

Book: Software Pipelines and the Multi-core Revolution

I like receiving email/RSS streams on new books in tech. It is one of my tools for tracking technology trends. Here is a title from an O’Reilly Safari service that caught my attention: Software Pipelines: The Key to Capitalizing on the Multi-core Revolution.

Software Pipelines is a new architecture that specifically addresses the problem of using parallel processing in the multi-core era. Pipeline technology abstracts away the complexities of parallel computing and makes it possible to harness the power of the new CPUs for business applications. The ability to implement performance-based SOA applications and other high-performance computing applications in the enterprise hangs in the balance.

I recall reading about pipes in Unix way back in the mid 80s. In fact, a book I really liked (and which is still on my bookshelf) is called Software Tools. Its philosophy is somewhat similar to the Unix philosophy of building small interoperable tools that work together using Unix pipes.
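For a taste of that classic idea, here is the Unix pipe expressed from Python, wiring two small tools together so the output of one becomes the input of the next (the equivalent of `ls | wc -l` in a shell; a Unix-like system is assumed):

```python
# Connect two small programs with a pipe: ls produces a listing,
# wc -l counts the lines. Equivalent to `ls | wc -l` in a shell.
import subprocess

ls = subprocess.Popen(["ls"], stdout=subprocess.PIPE)
wc = subprocess.Popen(["wc", "-l"], stdin=ls.stdout, stdout=subprocess.PIPE)
ls.stdout.close()  # allow ls to receive SIGPIPE if wc exits first
print(wc.communicate()[0].decode().strip())
```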

The technology of software pipelines, while conceptually similar, is quite different in practice. Unix pipes work well for a set of programs on the same machine, but we need something different for a set of applications running on different CPUs. I am looking forward to reading this book and understanding how software pipelines work.
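As a rough illustration of the general concept (and explicitly not the framework from the book, which I have not read yet), here is a small sketch of a software pipeline in which each stage runs concurrently and stages hand work to one another through queues. The stage functions are made up; a real deployment would spread stages across cores or machines rather than threads in one process.

```python
# A minimal software-pipeline sketch: each stage runs in its own thread,
# and stages pass work to each other through queues. This illustrates
# the general concept only; it is not the book's actual framework.
import queue
import threading

SENTINEL = object()  # signals end-of-stream

def stage(func, inbox, outbox):
    """Pull items from inbox, apply func, push results to outbox."""
    while True:
        item = inbox.get()
        if item is SENTINEL:
            outbox.put(SENTINEL)
            break
        outbox.put(func(item))

def run_pipeline(funcs, items):
    queues = [queue.Queue() for _ in range(len(funcs) + 1)]
    threads = [
        threading.Thread(target=stage, args=(f, queues[i], queues[i + 1]))
        for i, f in enumerate(funcs)
    ]
    for t in threads:
        t.start()
    for item in items:
        queues[0].put(item)
    queues[0].put(SENTINEL)

    results = []
    while (out := queues[-1].get()) is not SENTINEL:
        results.append(out)
    for t in threads:
        t.join()
    return results

# Example: clean, normalize, and measure records concurrently.
print(run_pipeline([str.strip, str.upper, len], ["  order-1  ", " order-22 "]))
```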

This book was written primarily for software architects, application developers, and application development managers who need high-performance, scalable business applications. Project managers, software quality assurance specialists, and IT operations managers will also find it useful; however, the main focus is software development.

The Web’s Red Pill

I liked this video, especially the part where Harry says:

It is like a red pill for the web. When you take it, all you see is triples. The true graph nature of the universe is revealed.

While we are still a bit far away from that red pill, the concept of seeding a little metadata into your web pages to enable a data web is a great idea.
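To show what “seeing triples” means in practice, here is a tiny sketch using the Python rdflib library. The page URL and the vocabulary are invented for illustration; GRDDL’s job is to derive triples like these from a page’s existing markup via a transformation.

```python
# A few facts about a (made-up) web page expressed as
# subject-predicate-object triples with rdflib.
from rdflib import Graph, Literal, Namespace, URIRef

EX = Namespace("http://example.org/terms/")         # hypothetical vocabulary
page = URIRef("http://example.org/posts/red-pill")  # hypothetical page

g = Graph()
g.add((page, EX.title, Literal("The Web's Red Pill")))
g.add((page, EX.author, Literal("Harry")))
g.add((page, EX.topic, Literal("GRDDL")))

# Serialize the graph as Turtle: every statement is one triple.
print(g.serialize(format="turtle"))
```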

GRDDL, bridging the interwebs? from Marcos Caceres on Vimeo.