Project Wizard

Laika

Laika analyzes and reports on the interoperability capabilities of EHR systems. This includes testing for certification of EHR software products and networks.

To support EHR data interoperability testing, Laika is designed to verify the input and output of EHR data against the standards and criteria identified by the Certification Commission for Health Information Technology (CCHIT). Laika is used by the Certification Commission to perform part of the interoperability certification inspection of EHRs.
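
Laika's own test harness is not reproduced here, but as a rough illustration of one kind of document-level check involved in this sort of testing, the Java sketch below validates an exported XML document against a schema using the standard javax.xml.validation API. The file names are placeholders, and Laika's actual CCHIT inspections go well beyond schema validation.

    import java.io.File;
    import javax.xml.XMLConstants;
    import javax.xml.transform.stream.StreamSource;
    import javax.xml.validation.Schema;
    import javax.xml.validation.SchemaFactory;
    import javax.xml.validation.Validator;

    public class DocumentValidation {
        public static void main(String[] args) throws Exception {
            // Placeholder files: a clinical document schema and a candidate EHR export.
            SchemaFactory factory = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
            Schema schema = factory.newSchema(new File("clinical-document.xsd"));
            Validator validator = schema.newValidator();
            validator.validate(new StreamSource(new File("ehr-export.xml")));
            System.out.println("Document is schema-valid.");
        }
    }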

Snofyre

Snofyre is an open-source, service-oriented API for creating SNOMED CT-enabled applications in Java. It provides a number of SNOMED CT-related services out of the box. These services can be used:

  • as a starting point for understanding how to add SNOMED CT functionality to an application.
  • to rapidly prototype a SNOMED CT-enabled application.

The Snofyre API aims to reduce the 'ramp-up' time needed to understand and embed SNOMED CT functionality in an application.
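
As a hedged sketch of the kind of terminology service such an API exposes, the Java example below defines a concept-lookup and subsumption-testing service. All names here (ConceptService, Concept, isSubtypeOf, the in-memory implementation) are hypothetical placeholders for illustration, not Snofyre's actual interfaces, and the hard-coded concepts stand in for a real SNOMED CT release.

    import java.util.HashMap;
    import java.util.Map;

    // Hypothetical sketch: these types are illustrative placeholders, not Snofyre's API.
    interface ConceptService {
        Concept getConcept(String conceptId);
        boolean isSubtypeOf(String conceptId, String ancestorId);
    }

    class Concept {
        final String id;
        final String preferredTerm;
        Concept(String id, String preferredTerm) { this.id = id; this.preferredTerm = preferredTerm; }
    }

    // Trivial in-memory stand-in so the sketch runs without a SNOMED CT release.
    class InMemoryConceptService implements ConceptService {
        private final Map<String, Concept> concepts = new HashMap<>();
        private final Map<String, String> parentOf = new HashMap<>();

        void add(Concept c, String parentId) {
            concepts.put(c.id, c);
            parentOf.put(c.id, parentId);
        }

        public Concept getConcept(String conceptId) { return concepts.get(conceptId); }

        // Walk the (single-parent) hierarchy upwards to test subsumption.
        public boolean isSubtypeOf(String conceptId, String ancestorId) {
            for (String current = conceptId; current != null; current = parentOf.get(current)) {
                if (current.equals(ancestorId)) return true;
            }
            return false;
        }
    }

    public class SnomedServiceSketch {
        public static void main(String[] args) {
            InMemoryConceptService service = new InMemoryConceptService();
            service.add(new Concept("73211009", "Diabetes mellitus"), null);
            service.add(new Concept("44054006", "Type 2 diabetes mellitus"), "73211009");

            System.out.println(service.getConcept("44054006").preferredTerm);
            // true: the type 2 concept is a subtype of diabetes mellitus in this toy hierarchy
            System.out.println(service.isSubtypeOf("44054006", "73211009"));
        }
    }

A real SNOMED CT service would draw on a full terminology release and support multiple parents per concept; the point of the sketch is only the service-oriented shape of the API.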

WEKA

Weka is a collection of machine learning algorithms for data mining tasks. The algorithms can either be applied directly to a dataset or called from your own Java code. Weka contains tools for data pre-processing, classification, regression, clustering, association rules, and visualization. It is also well-suited for developing new machine learning schemes.
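
For example, a Weka classifier can be trained and evaluated from plain Java code. This is a minimal sketch, assuming the Weka jar is on the classpath; the dataset path is a placeholder.

    import java.util.Random;
    import weka.classifiers.Evaluation;
    import weka.classifiers.trees.J48;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class WekaExample {
        public static void main(String[] args) throws Exception {
            // Load an ARFF dataset; the last attribute is taken as the class.
            Instances data = new DataSource("iris.arff").getDataSet();
            data.setClassIndex(data.numAttributes() - 1);

            // Evaluate a J48 decision tree with 10-fold cross-validation.
            J48 tree = new J48();
            Evaluation eval = new Evaluation(data);
            eval.crossValidateModel(tree, data, 10, new Random(1));
            System.out.println(eval.toSummaryString());
        }
    }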

MOLGENIS

MOLGENIS is a modular web application for scientific data. MOLGENIS was born from molecular genetics research (and was originally called a 'molecular genetics information system') but has grown, thanks to many sponsors and contributors, to be used in many scientific areas such as biobanking, rare disease research, patient registries, and even energy research. MOLGENIS provides researchers with user-friendly and scalable software infrastructures to capture, exchange, and exploit the large amounts of data being produced by scientific organisations all around the world.