"Big data hubris”" is the often implicit assumption that big data are a substitute for, rather than a supplement to, traditional data collection and analysis.  In a recent article in Science, "The Parable of Google Flu: Traps in Big Data Analysis," declare that Google was guilty of "big data hubris," pointing to Google Flu Trends' missed predictions as an example of this hubris:


According to a blog post in the New York Times, the technical criticism of Google Flu Trends (GFT) is that it did not use a broader array of data analysis tools. Indeed, the authors' analysis shows that combining Google Flu Trends with C.D.C. data, and applying a few tweaks, works best. "The mash-up is the way to go," said one scientist. Interestingly, Matt Mohebbi, the co-inventor of Google Flu Trends, agrees with this assessment. His view is that the GFT service was always intended as a "complementary signal" rather than a stand-alone forecasting tool.
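To see why a "mash-up" can beat either signal alone, here is a minimal sketch on entirely synthetic data, blending a noisy search-based signal with lagged surveillance data via least squares. The numbers and the blending method are invented for illustration; this is not the method from the Science paper:

```python
import numpy as np

# Toy illustration of a GFT/CDC "mash-up": blend a noisy search-based
# signal with lagged surveillance data via least squares. All numbers
# are synthetic; this is not the method from the Science paper.

rng = np.random.default_rng(0)
weeks = 52
true_ili = 2.0 + np.sin(np.linspace(0, 4 * np.pi, weeks))   # "true" flu activity
gft = true_ili + rng.normal(0, 0.5, weeks)                  # noisy search-based estimate
cdc = np.roll(true_ili, 2) + rng.normal(0, 0.1, weeks)      # CDC data, two weeks late
                                                            # (wrap-around ignored here)

# Fit the blend: ili = a*gft + b*cdc + c
X = np.column_stack([gft, cdc, np.ones(weeks)])
coef, *_ = np.linalg.lstsq(X, true_ili, rcond=None)

for name, est in [("GFT alone", gft), ("blended", X @ coef)]:
    rmse = np.sqrt(np.mean((est - true_ili) ** 2))
    print(f"{name}: RMSE = {rmse:.3f}")
```

The blend wins here by construction, since the weights are fit to the data; the point of the article is that the combination also wins on real flu data.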

One of our favorite pharma bloggers, Derek Lowe, said it best: "Only trust the big data when the little data are trustworthy in turn."

Lowe made another important observation back in 2007:

There's a problem in the drug industry that people have recognized for some years, but we're not that much closer to dealing with it than we were then. We keep coming up with these technologies and techniques which seem as if they might be able to help us with some of our nastiest problems - I'm talking about genomics in all its guises, and metabolic profiling, and naturally the various high-throughput screening platforms, and others. But whether these are helping or not (and opinions sure do vary), one thing that they all have in common is that they generate enormous heaps of data.

We're not the only field to wish that the speed of collating and understanding all these results would start to catch up with the speed with which they're being generated. But some days I feel as if the two curves don't even have the same exponent in their equations. High-throughput screening data are fairly manageable, as these things go, and it's a good thing. When you can rip through a million compounds screening a new target, generating multiple-point binding curves along the way, you have a good-sized brick of numbers. But you're looking for just the ones with tight binding and reasonable curves, which is a relatively simple operation, and by the time you're done there may only be a couple of dozen compounds worth looking at. (More often than you'd think, there may be none at all).
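To make the winnowing Lowe describes concrete, here is a toy sketch of triaging screening results down to tight binders with clean curves. The compound names, field names, and thresholds are all invented for illustration:

```python
# Toy sketch of the winnowing Lowe describes: reduce a screening deck to
# the few compounds with tight binding and a reasonable dose-response
# curve. Field names and thresholds are invented for illustration.

screen_results = [
    {"compound": "CMP-001", "ic50_nM": 12.0,   "curve_r2": 0.98},
    {"compound": "CMP-002", "ic50_nM": 8500.0, "curve_r2": 0.91},  # weak binder
    {"compound": "CMP-003", "ic50_nM": 45.0,   "curve_r2": 0.62},  # noisy curve
    {"compound": "CMP-004", "ic50_nM": 90.0,   "curve_r2": 0.95},
]

hits = [
    r for r in screen_results
    if r["ic50_nM"] <= 100.0     # tight binding
    and r["curve_r2"] >= 0.90    # reasonable curve fit
]

for hit in hits:
    print(f"{hit['compound']}: IC50 = {hit['ic50_nM']} nM, R^2 = {hit['curve_r2']}")
```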

His message was essentially this: new technologies may be cool, but they generate heaps of data faster than we can make sense of them. This problem still exists today, but we're starting to see new approaches, like our Intelligent Waveform Service (IWS), that are yielding results. In my view, another form of hubris is to ignore these developments.

It’s no secret that R&D productivity has been a real challenge for the pharmaceutical industry. But there is a growing sense that Big Data can help. 

The McKinsey Global Institute has estimated that applying big-data strategies to better inform decision making could generate up to $100 billion in value annually across the US health-care system, by optimizing innovation, improving the efficiency of research and clinical trials, and building new tools for physicians, consumers, insurers, and regulators to meet the promise of more individualized approaches. An article from McKinsey paints the following vision:

  • Predictive modeling of biological processes and drugs becomes significantly more sophisticated and widespread. By leveraging the diversity of available molecular and clinical data, predictive modeling could help identify new potential-candidate molecules with a high probability of being successfully developed into drugs that act on biological targets safely and effectively (a toy sketch of this kind of scoring follows this list).
  • Patients are identified to enroll in clinical trials based on more sources—for example, social media—than doctors’ visits. Furthermore, the criteria for including patients in a trial could take significantly more factors (for instance, genetic information) into account to target specific populations, thereby enabling trials that are smaller, shorter, less expensive, and more powerful.
  • Trials are monitored in real time to rapidly identify safety or operational signals requiring action to avoid significant and potentially costly issues such as adverse events and unnecessary delays.
  • Instead of rigid data silos that are difficult to exploit, data are captured electronically and flow easily between functions, for example, discovery and clinical development, as well as to external partners, for instance, physicians and contract research organizations (CROs). This easy flow is essential for powering the real-time and predictive analytics that generate business value.
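As a toy illustration of the first item above, here is a minimal sketch of scoring candidate molecules by a modeled probability of success. The descriptors and labels are entirely synthetic, and the logistic model is a stand-in; nothing here reflects McKinsey's vision or any real pipeline:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy sketch: rank candidate molecules by modeled probability of success.
# Features and labels are synthetic; a real pipeline would use curated
# molecular descriptors and historical development outcomes.

rng = np.random.default_rng(42)
n_known, n_features = 200, 5
X_known = rng.normal(size=(n_known, n_features))           # past candidates
y_known = (X_known[:, 0] + X_known[:, 1] > 0).astype(int)  # synthetic "succeeded" label

model = LogisticRegression().fit(X_known, y_known)

X_new = rng.normal(size=(3, n_features))                   # new molecules to score
for i, p in enumerate(model.predict_proba(X_new)[:, 1]):
    print(f"candidate {i}: P(success) = {p:.2f}")
```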
The authors warn that many pharmaceutical companies are wary about investing significantly in improving big-data analytical capabilities, partly because there are few examples of peers creating a lot of value from it. But, they hasten to add, they believe investment and value creation will grow. 

In our own work, we find that our enterprise analytics platform directly impacts pharma R&D. It helps teams recognize patterns in waveform data, powering the next phase of Big Data solutions for unstructured content. Our Intelligent Waveform Service (IWS) automates biosignal recognition and analysis, improving drug research and development by addressing the critical issues of risk management, cost containment, and time to market. The enterprise software works across the development lifecycle, organizations, and applications, empowering teams to learn faster and make smarter decisions about drug programs. Here's how (a generic sketch follows the list):

  • Leverages machine learning to provide waveform analytics of biosignals.
  • Automates and accelerates discovery & preclinical research, providing more complete results at a fraction of the time and cost.
  • Learns example patterns, then runs batch jobs against multiple file sets and data formats, making analytics shareable among researchers and groups.
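We can't reproduce the platform's internals here, but as a generic sketch of the pattern-learning step above: learning an example pattern and scanning new recordings for it can be done with normalized cross-correlation against a template. This is a stand-in illustration, not the actual IWS algorithm:

```python
import numpy as np

# Generic sketch of pattern-based waveform matching: "learn" a template
# from a labeled example, then scan a new recording with normalized
# cross-correlation. A stand-in illustration, not the actual IWS code.

def match_template(signal, template, threshold=0.8):
    """Return start indices where the template matches the signal."""
    t = (template - template.mean()) / template.std()
    n = len(template)
    matches = []
    for start in range(len(signal) - n + 1):
        window = signal[start:start + n]
        w = (window - window.mean()) / (window.std() + 1e-12)
        if float(np.dot(w, t)) / n >= threshold:   # correlation in [-1, 1]
            matches.append(start)
    return matches

rng = np.random.default_rng(1)
template = np.exp(-0.5 * ((np.arange(20) - 10) / 2.0) ** 2)   # Gaussian "spike"
signal = rng.normal(0, 0.1, 500)
for pos in (100, 300):                                        # plant two events
    signal[pos:pos + 20] += template

print(match_template(signal, template))   # indices near 100 and 300
```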

We demonstrated for the first time a scalable analysis platform capable of handling the large volumes of data produced by multi-electrode arrays (MEAs). IWS was compared to existing techniques and tools, and its pattern-based methodology and automated machine learning algorithms delivered substantial performance improvements.

The use case showed that human induced pluripotent stem cell derived cardiomyocytes (hiPSC-CMs) with MEA technology could be used to develop assays for predictive electrophysiological safety screening. Performance was significantly improved:

Time to Analysis - Average analysis was five times (5x) faster than with current techniques (Fig 11).
Quality of Results - IWS detected marker spikes on every channel, improving field potential duration (FPD) accuracy by approximately 3-5%.
Analysis Scope - Earlier techniques limited analysis to channels with the strongest signals; IWS enabled analysis of lower-signal waveforms that could not previously be measured.
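For readers unfamiliar with the measurement, here is a hypothetical sketch of per-channel field potential analysis: find the depolarization spike on a channel, then estimate FPD as the spike-to-repolarization interval. The waveform and thresholds are synthetic, and this is not IWS code:

```python
import numpy as np

# Hypothetical sketch of per-channel MEA analysis: find the depolarization
# spike, then estimate field potential duration (FPD) as the interval from
# the spike to the repolarization peak. Synthetic data; not IWS code.

def analyze_channel(trace, rate_hz):
    """Return estimated FPD in milliseconds for one beat on one channel."""
    spike_idx = int(np.argmax(np.abs(trace)))            # sharp depolarization spike
    after = trace[spike_idx + 10:]                       # skip past the spike itself
    repol_idx = spike_idx + 10 + int(np.argmax(after))   # slow repolarization peak
    return (repol_idx - spike_idx) / rate_hz * 1000.0

rate_hz = 1000.0
t = np.arange(0, 0.6, 1 / rate_hz)
rng = np.random.default_rng(7)

# Build a toy beat: sharp negative spike at 50 ms, repolarization bump at 350 ms.
trace = rng.normal(0, 0.02, t.size)
trace[50] -= 3.0
trace += 0.4 * np.exp(-0.5 * ((t - 0.35) / 0.02) ** 2)

print(f"estimated FPD: {analyze_channel(trace, rate_hz):.0f} ms")   # ~300 ms
```

Doing this on every channel, rather than only the strongest ones, is the kind of analysis-scope gain described above.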

Employing machine learning and pattern recognition technology will:

  • Improve accuracy and facilitate greater use of MEAs in compound screening.
  • Provide a truly scalable analysis platform for MEAs.
  • Support the expansion of assays using hiPSC-CMs, which research suggests offer a reliable, cost-effective surrogate for preclinical in vitro testing, in addition to the 3Rs benefit (refine, reduce, and replace animals in research).
Learn more about our waveform analytics solution here >>

In 2014, Big Data in the Enterprise will take on an added dimension of urgency as companies race to compete on analytics. Here are a few predictions and insights for 2014:

Renewed Emphasis on Security 
While there are numerous surveys on how and why companies are using Big Data, there is less information on how these initiatives are being kept secure.  Security concerns will be paramount in 2014, given the scope and nature of the data being analyzed.  

Shift from Time-to-Answer to Time-to-Value
While companies agree that decision-making is enhanced or better supported by data, there will be new approaches to accelerate “time-to-analysis” and “time-to-decision-making,” leading to “time-to-value.”

[Chart omitted; source: New Vantage Survey 2013]

Big Data Project Investments Explode
Companies will ramp up the application of Big Data Analytics as they start reaping the benefits of the first round of experiments and pilots.  Companies will move beyond the types of applications shown below to embrace the use of Big Data in their work processes, speeding up decision-making and reducing cycle time.

[Chart omitted; source: New Vantage Survey 2013]

Business Drives Big Data Projects
The role of IT in Big Data will continue to diminish as business and data analysts with deep functional experience lead projects and initiatives.  IT will provide infrastructure support for the various functional "sandboxes."

The Supply Chain Gets Big Data
New analytic capabilities will improve decision-making in supply chain applications, leading to improvements in “order-to-invoice” processes for manufacturers.  We will likely see a fair number of hiccups that make the headlines as kinks are worked out in these new applications.

HR Freaks Out
The lack of readily available talent, specifically advanced data analysts, is going to turn HR departments upside down. We can expect real growth in both internal and external training programs in Big Data Analytics.  Salaries for top talent will climb even higher than expected.

Healthcare and Life Sciences Get Engaged
From our perspective, we see a sharp rise in the use of Big Data Analytics as healthcare applications for patient monitoring hit stratospheric heights and as Life Sciences professionals use waveform analytics to improve their decision-making.

Big Data Goes Small
Micro-apps that focus on narrow but useful subsets of Big Data will drive mobile, on-demand investments.  Small businesses will hire Big Data service providers.

Unstructured Data Gets More Organized
It’s just a matter of time, but we believe 2014 will show real strides in the use of Big Data Analytics on unstructured data.

Predictive Analytics
By now you’ve heard about Amazon shipping stuff to you before you order it, but jokes aside, the analytical insights gained via Big Data Analytics will make predictive and preventative measures part of manufacturing and inventory operations.

As the discipline of Big Data Analytics becomes a critical component of business competitiveness, we need to ask: who in the C-suite should lead the charge?

The topic is raised in a recent McKinsey article, which offers the following guidelines:

1. Establish new mind-sets
2. Define a data-analytics strategy
3. Determine what to build, purchase, borrow, or rent
4. Secure analytics expertise
5. Mobilize resources
6. Build frontline capabilities
7. Put leadership capacity where it's needed
The article concludes by observing:

At all companies, top teams, and probably board members as well, need a better understanding of the scale of what's needed to ensure data-analytics success. Then they must notch these responsibilities against their existing management capacity in a way that's sensitive to the organization's core sources of value and that meshes with existing structures.

But the question remains: do companies need a Big-Data Daddy (or Mama) at the top?  The answer is yes, but it depends on where your company sits on the Big Data maturity model.


In phases 4 and 5 of the maturity model, there is clear executive leadership and sponsorship.

Most companies will adopt a committee-based approach, establishing a Big-Data Council or a Big-Data Steering Committee. This is the "safe" way to go, but it holds its own risks. If Big Data Analytics really is a core competence required for future survival, and increasingly it seems that way, then you would be wise to create teams across the company focused on functional data analysis, along with cross-functional analysis that spans both the company and the industry.

The key: Don't wait!

A recently concluded conference on precision medicine at UCSF pitched a number of promising ideas in the areas of data collection, data storage, data analysis and technology development, and data use. The mapping of the human genome, the advent of Big Data Analytics, and the new mantra of prevention, with the citizen as a healthcare stakeholder, have made possible the field of precision medicine, the wave of the future. The conference made amply clear that precision medicine is the future of medicine, defining it as the "practice of harnessing technology, science, and medical records to better understand the roots of disease, develop targeted therapies, and ultimately save lives."

The ideas coalesced around the critical issues of big data quality, transparency, and portability needed to realize the full potential of precision medicine, built around a more granular new taxonomy of diseases involving specific molecular/pathogenic pathways rather than amorphous "signs and symptoms." Disparate diseases might share the same molecular/genetic disruptions in their pathways, which opens up treatment options and improves effectiveness.  With genomic sequencing at its disposal, medical science can also chart risk probabilities within families and facilitate prevention.

Such specific taxonomies can build on the traditional ICD classifications, which serve clinical and statistical purposes but are either too static or inadequate in describing the genetic pathways or driver mutations of diseases.  Interlinking clinical, environmental, behavioral, and socio-economic data derived from comprehensive Electronic Health Records with genomic parameters can form an Information Commons: a data ecosystem built from millions of patients' individualized data. From it, pilot studies and large cohort studies can be launched by biotech firms, drug researchers, clinicians, and every other healthcare stakeholder, all contributing to a Knowledge Network that can be used to classify diseases on a molecular level, discover disease mechanisms, detect diseases early, establish accurate diagnoses, measure disease predisposition, target treatments, develop drugs, and reduce health disparities. The results from these studies are used to update the Knowledge Network and validate the disease taxonomy, moving toward a dynamic, sustainable, precise, and economical healthcare delivery model.
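To make the shared-pathway idea concrete, here is a toy sketch of how such a molecular taxonomy might be queried. The diseases, pathways, and drugs below are placeholders, not real biology:

```python
# Toy sketch of a molecular disease taxonomy: diseases annotated by
# disrupted pathway, so therapies can be cross-referenced across
# "disparate" diseases. All names below are placeholders.

disease_pathways = {
    "disease_A": {"pathway_1", "pathway_2"},
    "disease_B": {"pathway_2", "pathway_3"},
    "disease_C": {"pathway_4"},
}
pathway_therapies = {
    "pathway_2": ["drug_X"],
    "pathway_4": ["drug_Y"],
}

def repurposing_candidates(disease):
    """Therapies that target any pathway disrupted in this disease."""
    return sorted(
        drug
        for pathway in disease_pathways.get(disease, set())
        for drug in pathway_therapies.get(pathway, [])
    )

# disease_A and disease_B share pathway_2, so both surface drug_X.
print(repurposing_candidates("disease_A"))  # ['drug_X']
print(repurposing_candidates("disease_B"))  # ['drug_X']
```

In a real Knowledge Network the annotations would come from curated molecular and clinical evidence, but the lookup pattern is the same.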

The USA spends approximately 18% of its GDP on healthcare, the costliest in the world and roughly twice the OECD average, yet it ranks 46th out of 48 countries in healthcare efficiency, just ahead of Serbia and Brazil, per a recent Bloomberg study. The challenge is controlling healthcare costs while delivering high-quality care and increasing medical coverage among the uninsured.

As David Houle and Jonathan Fleece reveal in The New Health Age, this is to be achieved by transforming citizens into stakeholders in their own healthcare as the system shifts from a sickness model to a wellness model, creates awareness and understanding, and integrates horizontally.  A new generation of informed citizen scientists is armed with portable monitoring devices that measure vital signs ranging from blood pressure to oxygen saturation; the readings are collected, analyzed, and uploaded to offsite doctors' offices when red-flagged.  Citizens receive credits for remaining healthy, and in turn the industry is developing reimbursement codes to incentivize clinicians for keeping them that way. As we saw in the previous blog post, companies have already jumpstarted the multibillion-dollar personal health and fitness business with devices that measure calories and sleep and impart wellness advice. Such devices make the goal of an Information Commons attainable.

The citizen scientist theme was echoed at the conference in ideas such as Me For You, a social media campaign to raise awareness of precision medicine and targeted therapies by encouraging users to become advocates, share health data, and empower patients to advance the field. Data Donor Drive (D3) envisions a grassroots campaign collecting one million genetic data sets from volunteers to build a database as a launching pad for precision medicine's Knowledge Network. In another citizen scientist endeavor, the Smart Toilet initiative, stool samples, a key health indicator, could be collected and analyzed at home, generating insights into diets, personal genomes, and microbiomes. The "lab in your toilet" could be used to monitor a family's well-being and, in addition, help identify prevailing infectious diseases before they reach epidemic proportions.

A citizen stakeholder adds yet another layer to an already multilayered stakeholder landscape in healthcare. This is a welcome development, but it brings with it ridiculously high volumes of high-velocity, high-variety data that will swallow every conceivable size measure in a matter of seconds.

For example, pharmaceutical companies see big data as invaluable in early-stage drug discovery, understanding the market, and personalized medicine. While concerns about storing and managing data abate, the priority is data curation that makes the data meaningful to various stakeholders, and the next-generation tools for it are still nascent. The mission of a well-meaning data scientist is to ensure data provenance and high-quality analysis, which can unlock a treasure trove of insights that brings us closer to predictability, prevention, and precision and achieves the Knowledge Network's goals enumerated above.

The past few years have seen an explosion in the monitoring of personal bio-signals in order to improve performance. The exponential growth of smartphone apps has put these performance-feedback tools into the hands of consumers, from weekend athletes to the pros. More significantly, these same technologies can provide real-time feedback on a patient's condition, giving doctors an "early warning" system so they can take proactive steps to keep patients at optimal health.

The sports technology (adidas' miCoach Elite System) tracks Lionel Messi's movements, speed, heart rate, "power output" (similar to exertion), positioning, stamina, and a few other physical and physiological metrics. The coach is instantly alerted to any changes in athlete performance, often before the player notices them.

Over time, the team's performance data are analyzed by the coach, who in some ways becomes a part-time data scientist. The miCoach Elite System helps players optimize their play. By measuring every move, heartbeat, and step, and relaying that data to a coach on the touchline in less than a second, the technology enables a better understanding of the physical and physiological impact on the team, or any individual, during a game or training session. Simply by monitoring their iPads, coaches can now fully understand the physical impact on the body (work rate, stamina, speed, distance, performance efficiency and, for the first time, power) of every player, in every position. From influencing in-game managerial decisions, such as substitutions and team tactics, to analyzing trends to prevent overtraining and risk of injury, the technology helps maintain optimum levels of player performance week in, week out, throughout the season.

The miCoach Elite System includes a small data cell that fits into a protective pocket on the back of a player's base layer, between the shoulder blades.  Connected by a series of electrodes and sensors woven into the fabric of the base layer, the cell wirelessly transmits more than 200 data records per second from each player to a central computer, which displays a series of simplified insights and results on the coach's tablet.  At his fingertips, a coach can monitor the workload of an individual player, compare one athlete with another, or view the whole team to gain a complete picture of the 90-minute game.
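What the coach-side software does with that stream is proprietary, but as a hedged sketch of the general pattern, assume each record carries a player ID and a heart rate field (the field names and alert threshold below are invented):

```python
from collections import deque

# Hedged sketch of coach-side stream processing: keep a rolling window of
# records per player (the system sends 200+ per second) and flag anyone
# whose rolling average heart rate stays high. Field names and the
# threshold are invented for illustration.

WINDOW = 200 * 10            # ~10 seconds of records at 200 records/sec
HR_ALERT_BPM = 190.0

windows = {}                 # player -> deque of recent heart rates

def ingest(record):
    """Add one record; return an alert string if the rolling average is high."""
    win = windows.setdefault(record["player"], deque(maxlen=WINDOW))
    win.append(record["heart_rate_bpm"])
    avg = sum(win) / len(win)
    if avg > HR_ALERT_BPM:
        return f"{record['player']}: rolling HR {avg:.0f} bpm - consider a substitution"
    return None

# Simulated burst of records for one player, heart rate slowly climbing.
for i in range(3000):
    alert = ingest({"player": "player_10", "heart_rate_bpm": 185 + i * 0.01})
    if alert:
        print(alert)
        break
```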

This is just the beginning. By using artificial intelligence, entire games can be analyzed to identify performance patterns and gain insights to help coaches make better decisions for future games.

As NeuralDude pointed out in the previous blog post, patient-generated data becomes a core competence of the hospital of the future. As mobile health merges with artificial intelligence, predictive applications will play an important role in patient care. Doctors will have access to their patients at home and on the move. Alerts will raise flags before a catastrophe.  The rich interaction between physicians and informed patients will transform healthcare as we know it now.

This phenomenon of the "quantified self" is already here. What is important is how we use the feedback to improve our lives on a day-to-day basis. One immediate application I'd like to see is the use of this technology to prevent heat-exhaustion deaths in high school athletes and outdoor workers across the US. 

A central thrust of the Patient Protection and Affordable Care Act (PPACA), a.k.a. Obamacare, is patients becoming collaborators in their own healthcare. The idea is to reduce disease chronicity and minimize the expensive diagnostic and surgical interventions that inflate healthcare costs, making the USA the most expensive country in the world for medical care.

The medical device industry is poised for huge growth as more and more Americans take to portable monitoring devices built with sensors that can detect changes in everything from blood pressure to blood sugar to blood oxygen levels. Offsite recordings can be fed to a doctor's office via smartphones and tablets, and medication dosages adjusted or changed accordingly. Naturally, the savings over a brick-and-mortar visit could be substantial.  As David Houle and Jonathan Fleece reveal, such virtual healthcare delivery collaborations will be the way of the future as technology revolutionizes the traditional interface between provider and patient. [David Houle and Jonathan Fleece: The New Health Age: The Future of Healthcare in America, 2011]

Personal monitoring devices could pull up thousands of data points on blood sugar levels and cardiovascular function per patient daily, with the doctor using statistical analysis tools to draw actionable insights for appropriate treatment.  A number of devices and apps are now available, from Nike+ pedometers to Instant Heart Rate for Android to general wellness trackers like Fitbit. It won't be long before Star Trek-like tricorders become reality.  The explosion of such devices and apps has raised FDA regulatory concerns, with data accuracy and privacy at stake when sensitive healthcare data are transmitted through such channels.  The FDA, receptive to these concerns, is unveiling new guidelines this October: any mobile device or app that monitors critical vital signs like blood oxygen or blood pressure, regulates drug delivery, or is used to diagnose critical conditions will have to go through approval.  General-purpose devices that run medical apps, and "lifestyle" devices and apps, will be exempt from approval.
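As a toy example of the kind of statistical screening a doctor's office might run on those feeds, here is a sketch that separates plausible out-of-range readings from probable device errors. The thresholds are invented for illustration and are not medical guidance:

```python
import statistics

# Toy sketch: screen a day's home glucose readings, separating plausible
# out-of-range values (red flags for the clinician) from probable device
# errors that need confirmatory testing. Thresholds are invented and are
# not medical guidance.

readings_mg_dl = [98, 110, 105, 410, 102, 12, 99]   # one day's uploads

PLAUSIBLE = (40, 500)        # outside this range, suspect device/user error
NORMAL = (70, 180)           # inside this range, no action needed

for value in readings_mg_dl:
    if not PLAUSIBLE[0] <= value <= PLAUSIBLE[1]:
        print(f"{value} mg/dL: implausible - request confirmatory test")
    elif not NORMAL[0] <= value <= NORMAL[1]:
        print(f"{value} mg/dL: red flag - route to clinician")

print(f"median: {statistics.median(readings_mg_dl)} mg/dL")
```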

From a Big Data perspective, such patient-generated data raises concerns about accuracy and provenance.  Acting on it without corroboration could be risky; not acting at all could risk negligence.  Such questions are inevitable in a healthcare system transitioning to one of personal responsibility, as patients increasingly become aware that the accuracy of the information they provide can make all the difference.  Knowledge of device limitations and errors, data iterations, and confirmatory formal testing in cases of suspected patient error or outliers will be required as such data becomes part of a patient's permanent medical record. Thus, having good, clean data could prove crucial to a patient's health or recovery.

From a healthcare organization's perspective, such data can be tricky because the PPACA now ties reimbursement to performance metrics such as rates of patient recovery and cost effectiveness.  This makes data governance imperative: the source of the data and its pedigree can drive decisions about patient care and healthcare delivery.  Beth Israel Deaconess Medical Center is a big data pioneer in healthcare, using it for electronic medical records, reimbursement, and claims, and it recognizes the necessity of good data governance.

"There will be a lot of pressure put on health IT organizations to turn the data around rapidly," says Bill Gillis, CIO of Beth Israel Deaconess.  He however cautioned, "It's critical that the 'tyranny of the urgent' not win over," Gillis says. "Having governance in place up front can help avoid that pitfall and keep things on track."
Before you embark on a detailed enterprise planning process for Big Data, here are 11 critical questions that will help you set your strategic framework:

1. Does your company have a clear view of how Big Data and Intelligent Value Creation is likely to reshape your relevant markets over the next five years?

2. Does your company have an equally clear view of the implications for the changes you will need to make to continue to create value?

3. Are these views shared effectively among your senior managers across the organization?

4. Does senior management recognize the risks and uncertainties as part of the decision-making process?

5. Has your company been sufficiently aggressive in using Big Data and Intelligent Value Creation to enhance your operations?

6. Are there opportunities to use Big Data and Intelligent Value Creation to improve operations around existing products and services?

7. Are there opportunities to use Big Data and Intelligent Value Creation to significantly reduce costs and cycle time in existing work processes?

8. What are the data sources?

9. How will you monitor them?

10. How do you trigger events based on the intelligence gathered from the data?

11. Is there a growth or cost-savings optimization opportunity that is impacted?

Not answering these questions will cause problems down the road (we speak from experience).

The hype surrounding Big Data is inescapable. Fortunately, we are starting to see real-world examples of business value to justify the investment. So how does an organization get started? A recent article in the McKinsey Quarterly makes the case for Big Data Planning. The "missing step for most companies is spending the time required to create a simple plan for how data, analytics, frontline tools, and people come together to create business value."

Furthermore, the article states:

In these early days of big-data and analytics planning, companies should address analogous issues: choosing the internal and external data they will integrate; selecting, from a long list of potential analytic models and tools, the ones that will best support their business goals; and building the organizational capabilities needed to exploit this potential.
If only it were that simple.

As we discussed earlier in our Intelligent Value Creation Maturity Model, the adoption of Big Data Analytics is an organizational journey.  Most companies begin their Big Data transformation led by an individual or a small group of like-minded individuals. Often viewed as troublemakers, these groups should be encouraged, brought into technology strategy meetings, and recognized for their passion. Skunkworks and research projects are typically not given the credit they deserve.  What is important is that learning is taking place, and even though it is informal, it should be encouraged.  A path into the company's mainstream technology strategy should be mapped out for these sorts of initiatives.

The main point is that without executive participation, Big Data Analytics will not become a business priority, period. And most executives are already too busy and overstretched to take on one more enterprise-wide initiative. The key concept is participation. An executive should take ownership of analytics as an organizational competence and be aware and supportive of the potential impact of data-complete models across the organization. This involvement will help shift the enterprise from relying on intuition to making data-driven decisions.

What about IT? Is this an opportunity to make IT strategic? Of course it is, but, as is always the case, it has to be business-driven. As we've seen with digital initiatives over the past decade, Big Data will become a core skill requirement across all divisions, and big data budgets will not all originate in IT.  To remain relevant, IT needs to think strategically.

So who should lead Big Data Analytics? Who has the skills required to understand the business impact of Big Data?  NeuralDude's advice is to listen to Peter Drucker, who said something like this: find the best individual in your company, and charge them with making the transformation.

Which brings us back to the plan.  How do you plan for Big Data?  Answer: it has to be an integral part of your business strategy.

By strategy, I don't mean just executive-level strategy.  There has to be an educational process that demonstrates the value to all employees, embedded into your operations.

More on how to build a Big Data Strategy in the next post.
It has been a couple of weeks since FutureMed 2013 at Singularity University wrapped up, and the dust is finally beginning to settle. Here are a few observations from NeuralDude:

Daniel Kraft, MD, Executive Director of FutureMed, brought together a powerful cross-section of thought leaders, scientists, and entrepreneurs for the third year of this incredible program. "The world of health and medicine is at the cusp of radical disruption, with novel applications and the convergence of fast-moving technologies ranging from the digitization of cloud-based healthcare data, to mobile health merging with artificial intelligence, to regenerative medicine and 3D printing," said Kraft, a Stanford- and Harvard-trained physician, scientist, and Faculty Chair of the SU Medicine Track.  "By understanding the trajectory and potential of rapidly developing technologies, merged with unmet needs and creative thinking, we have the potential to re-invent many aspects of wellness, diagnosis, and intervention, leading to better outcomes and healthier lives, at lower cost."

There are great strides being made in this community in areas spanning from synthetic biology to next generation healthcare practice.  An optimistic crowd to say the least, we all brought our thinking and leadership to the topics shared during this fascinating program.

Here are two health related factoids for you to think about:

  • Americans in the early 20th century ate on average 1 lb of high-fructose corn syrup per year.  Now the average American consumes around 50 lbs a year!  Houston, we have a problem!
  • By eating ¼ teaspoon less salt per day, Americans can potentially add 20 years to their lifespan.
For me, the biggest takeaway was the famous quote: "We have met the enemy, and he is us!"

We need to think inclusively and beyond compliance to create a healthier future. David Duncan hosted several experts in an intriguing presentation on the future of personalized medicine; we are quickly approaching consumer-level costs for the personal genome.  It now costs only a few thousand dollars to get your genome sequenced, and we will soon see it done for under $100!   The personal genome has not changed our lives yet, but stay tuned!

Techniques in Life Sciences to support the growth of personalization are on the move as well.  Multi-electrode arrays (MEAs) support tens to hundreds of individual experiments running simultaneously.  An excellent talk on the correlation of cardiomyocyte tissues to clinical outcomes was presented at the conference; this will also speed drug discovery and the reality of personalized drug therapies.  Neural ID is involved in this area, providing a solution for automating biosignal analysis of MEA content.

IMHO, the combination of AI with these new drug discovery platforms will speed the delivery of new drug therapies and revolutionize Life Sciences productivity.

In addition, the personal stories at FutureMed are always awe-inspiring!   In the inaugural year I was deeply touched by the story of e-patient Dave, a man who was diagnosed with late-stage cancer and who, through the support of his doctor and the internet, continues to inspire us all with his life experience and "beating the odds" against cancer.  This year it was Eric Rasmussen who provided an interesting truth about global health in defining the condition of a person in need of health services in poor or developing nations: "Keep in mind the person you are trying to assist is hot, stressed (e.g., may have been attacked or injured) and hungry."   Open your mind to that, then think about how you solve the complex problems of global health.  I found the information Eric shared on creating clean water resources through technology of great interest.  Puralytics has several products to offer, including the SolarBag (nanotech in a bag); it requires no power other than sunlight, and the plastic container can be reused up to 500 times.  Way cool!  Another startup has figured out desalination with a powered system that can purify 20,000 gallons a day!  Keep an eye on Alrafidane, solving a critical problem for the global water supply. Houston, we have a solution!!  I need to get this info over to my buddies at the $300 House project...

Big Data will play a major role in understanding the internet of everything (i.e., sensors) connected to the complex world of global health (all of us).  Exponentials are what it's all about! Topics on artificial intelligence, IT, global health, developing countries, cancer, body sensing, the human genome, and a myriad of others were delivered in a thoughtful and inspiring way.  The number of variables for each human being is a subject of study in and of itself.  Creating personalized health models that drive behavior and education will make a huge difference in the next 10 years. We should focus on the bigger problems and make significant strides each day.  Check out FutureMed and take a glimpse into a healthy future!  Our global health must be all-inclusive.


I encourage anyone in the Life Sciences or Healthcare communities to make plans now for next year's conference.