The amount and complexity of globally available and newly generated data is increasing dramatically. In many cases, the problem is not whether data is available but whether it can easily be accessed in a structured manner. Beyond the raw data, there are major challenges in maintaining ongoing access to, and being able to verify, analyses performed on the data by many independent actors. For some data sets, infectious diseases being one example, parts of the data may be subject to a high level of confidentiality, yet to be useful the data must at the same time be fully searchable, trustworthy and available for integration. In addition, the data must be available to a defined subset of researchers within a very short time horizon (e.g. hours). A crucial element is a data processing infrastructure able to provide all the relevant data, to offer easy access at different levels, to integrate with public datasets, and to rapidly execute relevant analyses and publish apps and models that end users can call on demand. This has never been done before, and state-of-the-art big data processing technology is needed.
05/03/2018 12:00:00
72200000-7 Software programming and consultancy services
Danmarks Tekniske Universitet - DTU
Anker Engelunds Vej 1
2800
Kgs. Lyngby
Denmark
Christian Torrendrup
https://www.dtu.dk
| Notice | Date of dispatch |
|---|---|
| 02. Contract notice (TED (v209)) | 23/01/2018 16:10 |
| 03. Contract award notice (TED (v209)) | 17/04/2018 13:40 |