Complex information networks are ubiquitous, with increasingly voluminous and diverse data: examples include social networks, medical health records, stock markets, disaster response, and environmental science. Complex, partial, unstructured, and time-varying multimodal data pose significant new challenges for knowledge discovery. Healthcare is an excellent example of a domain presenting all of these challenges: a patient's health record consists of imaging data (CT scans, MRI, pathology images), physiological measurements, physician notes, disease states, treatment history including medication, and outcomes (the well-being of the patient). Spatial variations (e.g., in 2D/3D images) and temporal behavior (response to medication, changes in imaging and other measurements) are further complicated by the fact that data sampling is not uniform in any of these dimensions. The primary mission of the Center for Multimodal Big Data Science is to address the grand challenge of developing novel computational methods, leveraging image analysis, natural language processing, machine learning, system identification, and database technologies, to discover significant knowledge with a transformative impact on a broad spectrum of applications.

A central component of this research effort is the BisQue image management and analysis platform, developed in the lab over the past 10 years. BisQue specifically supports large-scale, multi-dimensional, multimodal images and image analysis. Metadata is stored as arbitrarily nested and linked tag/value pairs, allowing for domain-specific data organization. Image analysis modules can be added to perform complex analysis tasks on compute clusters, and analysis results are stored in the database for further querying and processing. Data and analysis provenance is maintained for reproducibility of results. BisQue can be easily deployed in cloud computing environments or on compute clusters for scalability, and users interact with it through any modern web browser.
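To illustrate the nested tag/value metadata model described above, the following is a minimal sketch, not the BisQue API itself: it builds a small XML resource whose metadata is expressed as arbitrarily nested tag/value pairs. The specific tag names used here (`patient`, `scan`, `modality`, and so on) are hypothetical examples of a domain-specific organization, not part of any fixed schema.

```python
# Illustrative sketch of nested tag/value metadata (hypothetical tag names,
# not the actual BisQue API): each metadata item is a "tag" element with a
# name and optional value, and tags may nest to arbitrary depth.
import xml.etree.ElementTree as ET

def add_tag(parent, name, value=None):
    """Attach a tag/value pair under a parent element; nesting is unlimited."""
    attrs = {"name": name}
    if value is not None:
        attrs["value"] = value
    return ET.SubElement(parent, "tag", attrs)

# A resource (here, an image) annotated with domain-specific metadata.
image = ET.Element("image", {"name": "scan_001.tif"})
patient = add_tag(image, "patient")
add_tag(patient, "age", "54")
scan = add_tag(patient, "scan")
add_tag(scan, "modality", "MRI")
add_tag(scan, "date", "2015-06-01")

print(ET.tostring(image, encoding="unicode"))
```

Because the nesting is unconstrained, each application area (marine science, plant biology, materials science, healthcare) can define its own hierarchy without changes to the underlying storage.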

The interdisciplinary research projects built on top of BisQue span four main areas: marine science, plant biology, materials science, and healthcare. 

Recent Papers

(None)

Research

BisQue has been used to manage and analyze 23.3 hours (884 GB) of high-definition video from dives in Bering Sea submarine canyons, to evaluate the density of fishes and structure-forming corals and sponges, and to document and describe fishing damage.

The BisQue image analysis platform was used to develop algorithms that assay phenotypes such as directional root-tip growth and differences in seed size.

In this project, we integrate the microstructure analysis software package DREAM.3D into BisQue.

The Multimodal Multiview Network for Healthcare is a collaborative effort between researchers from the Electrical and Computer Engineering Department at the University of California, Santa Barbara and the intensivists and medical practitioners of the Medical Intensive Care Unit (MICU) at Santa Barbara Cottage Hospital. The objective of the research is to improve quality of care by monitoring patients and workflows in real ICU rooms. The network is non-disruptive and non-intrusive, and its methods and protocols protect and maintain the privacy of patients and staff.

Contact

Prof. B. S. Manjunath
Director, Center for Multimodal Big Data Science and Healthcare
University of California
Santa Barbara, CA 93106-9560

Tel: (805) 893 7112
Fax: (805) 893 3262
E-mail: manj [at] ece [dot] ucsb [dot] edu

News
