7 Things I Learned from Listening to The Culture of Innovation Talk

I really enjoyed watching “The Culture of Innovation” from MIT Technology Review.

The talk covers several interesting topics worth exploring.

  1. Permissionless innovation and innovation at the edges
  2. A culture of practice over theory
  3. The concept of Social Investing
  4. Connectivity in Communities
  5. Peripheral vision and pattern recognition, and how they are the opposite of focus and execution
  6. Attachment bias
  7. Cultures and sub-cultures

My favorite quote from the talk:

“We so cherish focus, execution and they are the opposites of peripheral vision, pattern recognition. Peripheral vision and pattern recognition lead to discovering new ways of doing things.”
Here is a link to the video interview with Joi Ito.

Where is Machine Learning Being Applied?

When I give talks on Machine Learning, I often get these questions:

  • What is Machine Learning?
  • What are some Machine Learning Applications?
  • Is Machine Learning Mature?
  • Who is using Machine Learning?
  • How do we get started?

If you use Google or Bing Search, if you get recommendations for books or other products from Amazon, or if you get hints for the next word to type on a mobile keyboard, you are already using Machine Learning.
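To make the last example concrete, here is a minimal sketch of next-word suggestion using simple bigram counts from a toy corpus. Real mobile keyboards use far more sophisticated models; the corpus and the suggest_next helper are invented purely for illustration.

    # A toy next-word suggester: count which words follow which, then suggest
    # the most frequent followers. Invented for illustration only.
    from collections import Counter, defaultdict

    corpus = "machine learning is fun and machine learning is useful and learning is hard"

    bigrams = defaultdict(Counter)
    words = corpus.split()
    for prev_word, next_word in zip(words, words[1:]):
        bigrams[prev_word][next_word] += 1

    def suggest_next(word, k=3):
        """Return up to k most likely words to follow `word`."""
        return [w for w, _ in bigrams[word].most_common(k)]

    print(suggest_next("machine"))   # ['learning']
    print(suggest_next("learning"))  # ['is']

A real keyboard learns from your own typing history and uses much longer context, but the principle of predicting from observed patterns is the same.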

Here is a sample list of Machine Learning applications.

From Apple’s Core ML Brings AI to the Masses:

  • Real Time Image Recognition
  • Sentiment Analysis
  • Search Ranking
  • Personalization
  • Speaker Identification
  • Text Prediction
  • Handwriting Recognition
  • Machine Translation
  • Face Detection
  • Music Tagging
  • Entity Recognition
  • Style Transfer
  • Image Captioning
  • Emotion Detection
  • Text Summarization

From Seven Machine Learning Applications at Google:

  • Google Translate
  • Google Voice Search
  • Gmail Inbox Smart Reply
  • RankBrain
  • Google Photos
  • Google Cloud Vision API
  • DeepDream

Also, see – How Google is Remaking Itself as a “Machine Learning First” Company.

While Apple, Google, Facebook, Amazon, IBM, and Microsoft are the most visible companies in the AI space, take a look at business applications of Machine Learning.

What is Artificial Intelligence (AI)?

Artificial Intelligence (aka AI) will have a deep impact on our lives – both positive and negative. Like any other tool or technology, a lot depends on how we use it. I often get asked these questions:

  • What is AI?
  • What is good about it?
  • Will it destroy jobs?
  • Will it take over humanity?
  • What do we need to do to leverage AI?

AI traditionally refers to an artificial creation of human-like intelligence that can learn, reason, plan, perceive, or process natural language. These traits allow AI to bring immense socioeconomic opportunities, while also posing ethical and socioeconomic challenges.

Right now the opportunities are in research, technology development, skill development and business application development.

The technologies that power AI – neural networks, Bayesian probability, statistical machine learning – have been around for several decades (some date back to the late 1950s). The availability of Big Data is bringing AI applications to life.
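As a small, hedged illustration of one of those decades-old techniques, the sketch below trains a tiny neural network (a multi-layer perceptron) on a synthetic dataset with scikit-learn. The dataset and hyperparameters are arbitrary, chosen only to show the shape of the workflow.

    # A minimal neural-network example with scikit-learn. The synthetic data and
    # hyperparameters are arbitrary; this only illustrates the basic workflow.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    X, y = make_classification(n_samples=500, n_features=10, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
    model.fit(X_train, y_train)
    print("test accuracy:", model.score(X_test, y_test))

What has changed is not the algorithm so much as the scale of data and compute we can now throw at it.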

There are concerns about the misuse of AI and worries that uncontrolled proliferation may kill jobs in its wake. Other concerns include unethical uses and unintended biases. It is too early to take one side or the other.

Please take a look at Artificial Intelligence and Machine Learning: Policy Paper. It looks at AI through a variety of lenses.

ReadLog: When Leaders Think Aloud…

When leaders think aloud, it is fascinating to listen. Satya Nadella talks about innovation, handling failures, AI, advances in cloud computing, using silicon to speed up machine learning, and a variety of other topics, including bits of Microsoft history and philosophy.


Microsoft had been there, too early. And it was far behind on the Internet, yet managed to catch up.

On handling failures – instead of saying “I have an idea”, what if you said, “I have a new hypothesis”?


Satya Nadella goes on to talk about some of Microsoft’s innovations (accelerating AI using FPGAs), about investing in the future, and about the future of innovation. This article is a good read.

Q&A with Microsoft CEO Satya Nadella: On artificial intelligence, work culture, and what’s next

Insights into IoT

The Internet of Things, or IoT, may be the most important online development yet.

First thing in a new technology, people do all the obvious things that look like the old market, but more efficiently. In the Internet, GNN had web ads like old newspaper ads. Later there was Google search, which was a different way of doing advertising, by focusing more on data. Now we’ve got social search, social networks. The business model moves to something that is more native to the technology.

Dart, Swift and Popularity of Big Data and Computational Statistics

Watching programming language popularity is one of my hobbies. The TIOBE index for November 2014 shows some interesting trends. Let us take a look.

 

Click on these images to see a full page view.

[Image: TIOBE index rankings, November 2014]

[Image: TIOBE index rankings, August 2014]

This paragraph from the TIOBE site is worth noting:

Thanks to the big data hype, computational statistics is gaining attention nowadays. The TIOBE index lists various of these statistical programming languages available, e.g. Julia (position #126), LabView (#63), Mathematica (#80), MATLAB (#24), S (#84), SAS (#21), SPSS (#104) and Stata (#110). Most of these languages are getting more popular every month. The clear winner of the pack is the open source programming language R. This month it jumped to position 12, while being at position 15 last month.

Other trends:

  1. The top 7 languages (from a year ago) retain their spots, but all of them drop a bit in popularity.
  2. Dart, a programming language from Google, jumps into the Top 20 from a previous rank of #81. Dart is a language for building web and cloud apps.
  3. Swift comes from nowhere and enters at the #18 spot. Swift is a new programming language from Apple for iOS and OS X.
  4. Perl and Visual Basic .NET stay in the Top 10. It will be interesting to watch their moves.
  5. F# keeps moving up (from #23 to #16).
  6. Watch the Top 50 languages (#21-#50). Some of them are leading indicators of the future of computing.
  7. To see potential new entrants into the Top 20, you may want to watch the other languages in the Top 50 on the TIOBE site.
  8. I expected Scala to be on this list, but for some reason I don’t see it. I think it will soon move into the Top 20.
  9. Three SQL dialects are still in the Top 20. I am not surprised, since SQL is still one of the most popular languages for database programming.
  10. I keep hearing a lot about Julia. I will be watching it with interest.

The images on this page are from InfoMinder. InfoMinder is a tool for tracking web pages; I use it to track a few interesting pages on the web. When InfoMinder detects a change in a page, it highlights the change and creates a new version of the page. It is one of the tools we built over a decade ago, and it is still chugging along, helping me and others watch the web.
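The post does not describe InfoMinder’s internals, but the basic idea of change tracking, comparing a stored snapshot of a page with a freshly fetched copy and surfacing the differences, can be sketched in a few lines. This is a simplified illustration, not InfoMinder’s actual implementation; the snapshots here are hard-coded strings standing in for fetched pages.

    # A simplified sketch of page change tracking: compare two snapshots of a
    # page and print what changed. Not InfoMinder's actual implementation.
    import difflib
    import hashlib

    old_snapshot = "Welcome to the site\nLatest release: 1.0\nContact us"
    new_snapshot = "Welcome to the site\nLatest release: 1.1\nContact us"

    def page_changed(old: str, new: str) -> bool:
        """Cheap check: compare content hashes."""
        return hashlib.sha256(old.encode()).hexdigest() != hashlib.sha256(new.encode()).hexdigest()

    def highlight_changes(old: str, new: str):
        """Return a line-by-line diff of the two snapshots."""
        return list(difflib.unified_diff(old.splitlines(), new.splitlines(), lineterm=""))

    if page_changed(old_snapshot, new_snapshot):
        for line in highlight_changes(old_snapshot, new_snapshot):
            print(line)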

Recommended Reading: What Will Our World Look Like in 2022?

Predicting the future is hard and risky. Predicting the future in the computer industry is even harder and riskier due to dramatic changes in technology and limitless challenges to innovation. Only a small fraction of innovations truly disrupt the state of the art. Some are not practical or cost-effective, some are ahead of their time, and some simply do not have a market. There are numerous examples of superior technologies that were never adopted because others arrived on time or fared better in the market. Therefore, this document is only an attempt to better understand where technologies are going. The book The Innovator’s Dilemma and its sequels best describe the process of innovation and disruption.

Nine technical leaders of the IEEE Computer Society joined forces to write a technical report, entitled IEEE CS 2022, symbolically surveying 23 potential technologies that could change the landscape of computer science and industry by the year 2022. In particular, this report focuses on:

  1. Security cross-cutting issues
  2. The open intellectual property movement
  3. Sustainability
  4. Massive open online courses
  5. Quantum computing
  6. Devices and nanotechnology
  7. 3D integrated circuits
  8. Universal memory
  9. Multicore
  10. Photonics
  11. Networking and interconnectivity
  12. Software-defined networks
  13. High-performance computing (HPC)
  14. Cloud computing
  15. The Internet of Things
  16. Natural user interfaces
  17. 3D printing
  18. Big data and analytics
  19. Machine learning and intelligent systems
  20. Computer vision and pattern recognition
  21. Life sciences
  22. Computational biology and bioinformatics
  23. Medical robotics

You can find the comprehensive report here.

LinkLog: Smart and Connected Health Program

From the synopsis of an NSF request for proposals:

The goal of the Smart and Connected Health (SCH) Program is to accelerate the development and use of innovative approaches that would support the much needed transformation of healthcare from reactive and hospital-centered to preventive, proactive, evidence-based, person-centered and focused on well-being rather than disease. Approaches that partner technology-based solutions with biobehavioral health research are supported by multiple agencies of the federal government including the National Science Foundation (NSF) and the National Institutes of Health (NIH). The purpose of this program is to develop next generation health care solutions and encourage existing and new research communities to focus on breakthrough ideas in a variety of areas of value to health, such as sensor technology, networking, information and machine learning technology, decision support systems, modeling of behavioral and cognitive processes, as well as system and process modeling. Effective solutions must satisfy a multitude of constraints arising from clinical/medical needs, social interactions, cognitive limitations, barriers to behavioral change, heterogeneity of data, semantic mismatch and limitations of current cyberphysical systems. Such solutions demand multidisciplinary teams ready to address technical, behavioral and clinical issues ranging from fundamental science to clinical practice.

Due in large part to advances in high throughput and connective computing, medicine is at the cusp of a sector-wide transformation that – if nurtured through rigorous scientific innovation – promises to accelerate discovery, improve patient outcomes, decrease costs, and address the complexity of such challenging health problems as cancer, heart disease, diabetes and neurological degeneration.  These transformative changes are possible in areas ranging from the basic science of molecular genomics and proteomics to decision support for physicians, patients and caregivers through data mining to support behavior change through technology-enabled social and motivational support.  In addition to these scientific discoveries, innovative approaches are required to address delivery of high quality, economically-efficient healthcare that is rapidly becoming one of the key economic, societal and scientific challenges in the United States.

Discovered while doing some research on Smart Homes and Places.

IBM Watson – Augmenting Human Knowledge

Amazing! Between Watson, Siri and other similar Natural Language apps, we will be entering a new era of Knowledge Augmentation. I am especially thrilled about the impact it will have on teaching.

Watson looks at the question it is being asked and groups words together, finding statistically related phrases. Thanks to a massively parallel architecture, it then simultaneously uses thousands of language analysis algorithms to sift through its database of 15 terabytes of human knowledge and find the correct answer. The more algorithms find the same answer independently, the more a certain answer is likely to be correct. This is how, back in 2011, it managed to win a game of Jeopardy against two human champions.

In a presentation at the Milken Institute Global Conference, IBM senior vice president and director of research John Kelly III demonstrated how Watson can now list, without human assistance, what it believes are the most valid arguments for and against a topic of choice. In other words, it can now debate for or against any topic, in natural language.

From a Gizmag article on IBM Watson
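One idea from the quote is worth dwelling on: confidence grows when many independent methods converge on the same answer. Below is a toy sketch of that voting idea; the answer generators and candidates are invented for illustration, and this is nothing like Watson’s real pipeline.

    # A toy illustration of "independent agreement": several simple answer
    # generators each propose a candidate, and the one proposed most often gets
    # the highest confidence. The generators are stubs invented for illustration;
    # Watson's real pipeline is vastly more complex.
    from collections import Counter

    def keyword_match(question):
        return "Answer A"

    def passage_search(question):
        return "Answer A"

    def date_heuristic(question):
        return "Answer B"

    algorithms = [keyword_match, passage_search, date_heuristic]

    def answer_with_confidence(question):
        votes = Counter(algo(question) for algo in algorithms)
        best_answer, count = votes.most_common(1)[0]
        return best_answer, count / len(algorithms)

    print(answer_with_confidence("Who wrote ...?"))  # ('Answer A', 0.666...)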

Machine Learning Application: Job Classification at LinkedIn

I am fascinated by Machine Learning (ML) and keep looking for case studies where ML solves real-world problems. This talk – Machine Learning: The Basics by Ron Bekkerman (video) – provides a great overview of machine learning and how it is being used by LinkedIn for job analysis. LinkedIn is one of the early companies to jump into Data Science. With over 200 million subscribers, they have ample data to analyze. The data is very contextual too, and that helps build better algorithms (they claim 95% accuracy in prediction in a specific case). At one point in the talk, Ron mentions that the ML study helped build a product that generates about 6 million dollars in revenue for LinkedIn. That is a great payoff.

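The talk does not publish LinkedIn’s code, but a minimal sketch of the underlying idea, classifying free-text job titles into standard categories, might look like the following. The tiny training set, the categories, and the choice of TF-IDF plus logistic regression are my own illustration (assuming scikit-learn); LinkedIn’s production system is certainly far more sophisticated.

    # A minimal job-classification sketch: map free-text job titles to standard
    # categories using TF-IDF features and logistic regression. The training
    # data and categories are invented for illustration.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    titles = [
        "senior software engineer", "java developer", "backend programmer",
        "sales account executive", "regional sales manager", "inside sales rep",
        "registered nurse", "icu nurse", "nurse practitioner",
    ]
    categories = ["engineering"] * 3 + ["sales"] * 3 + ["nursing"] * 3

    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    model.fit(titles, categories)

    print(model.predict(["staff software developer", "nurse anesthetist"]))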

Why is job analysis interesting in general? It provides you with some interesting insights into the direction a specific industry is moving:

  • If you are in the IT staffing industry, you may want to know what kinds of jobs are in demand, which ones are growing, and which ones are shrinking (a small aggregation sketch follows this list).
  • If you are an outsourcing company, you may want to analyze hiring patterns in different parts of the world.
  • What kinds of skills are in demand at startups, medium-sized companies, and large enterprises? Lots of people, from startups to training companies, can use this data to build and tailor their offerings.
  • How can training companies and conference organizers use job analysis to meet the demand for skills?
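Once postings are tagged with a category and a region, answering questions like the ones above is largely a matter of aggregation. Here is a small sketch with pandas; the tiny dataset is invented for illustration.

    # Count postings by category and region, and look at year-over-year growth.
    # The data is invented for illustration.
    import pandas as pd

    postings = pd.DataFrame([
        {"category": "data science", "region": "US",    "year": 2013},
        {"category": "data science", "region": "India", "year": 2013},
        {"category": "data science", "region": "US",    "year": 2014},
        {"category": "qa",           "region": "US",    "year": 2013},
        {"category": "qa",           "region": "India", "year": 2014},
    ])

    # Which categories are in demand, and where?
    print(postings.groupby(["category", "region"]).size())

    # Which categories are growing year over year?
    print(postings.groupby(["category", "year"]).size().unstack(fill_value=0))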

Ultimately, it is all market intelligence of a kind. It is fascinating that we now have large amounts of data to analyze and can get glimpses into the patterns of demand and supply. So where do you get all this data from? That is a topic for another blog post.

Meta

One of our interns is working on an app to do Job Classification and automatic tagging of jobs. We were debating whether we should use some simple techniques or ML. I was going around looking for case studies and stumbled upon this video.