Rec. 20: Support legacy data to be made FAIR #20
Comments
This is not a very clear case. Legacy data can often be made obsolete by new technological developments and may not meet today's standards. Communities already have trouble keeping up with valuable new data, so is it wise to divert resources to deal with legacy data?
I think that this is an important point. Even with new techniques, much legacy data has intrinsic value, plus it can be important for reproducibility or the integration of old and new knowledge. I would suggest including in the actions here a reference to the importance of data curation.
There needs to be some consideration of the type of legacy data. For example, census data (typically going back 50 or so years) that cannot be recollected but is highly informative for social science research is a good candidate for making FAIR. However, for crystallographic data that can be reproduced at higher quality with new technology, it does not make sense.
4TU.Centre for Research Data position: We offer a data refinement fund, and it became clear very quickly that there is high demand for funding to digitize legacy data. More support from the funders would be great.
DFG position: It is true that large amounts of data already exist that do not follow the FAIR principles. However, it is also true that not all of those data are still of value, and consequently it would not make much sense to take on the effort of making all of them FAIR. The answer to the question of whether those data are of value has to be given by scientists or the scientific communities. Where a clear scientific value can be demonstrated convincingly, funding may already be available. Providing dedicated means of support to make legacy data FAIR should have a lower priority than other issues.
I see this recommendation as one of the most important ones. Making valuable legacy data FAIR (e.g. historical climate data) will have a huge impact. Many of our researchers really want to do that but lack the time and money.
Thumbs up, but make sure that you select and prioritise. Consider funding projects for turning legacy data into Linked Open Data.
ESO position
INAF (astronomy) position: agree
ILL position: A very interesting proposition that shouldn't require a large budget if limited to some legacy datasets that could be extremely valuable for teaching purposes.
ELIXIR-UK position: Support for legacy data has to be considered in terms of return on investment.
Fully support making legacy data FAIR and providing funding and support for this. There needs to be a clear selection process, involving researchers and research communities, to decide which legacy data should be made FAIR. We could envision an open call for researchers to identify such valuable legacy data.
There are large amounts of legacy data that are not FAIR but would have considerable value if they were. Mechanisms should be explored to include some legacy data in the FAIR ecosystem where required.
Research communities and data owners should explore legacy data to identify indispensable collections with significant reuse potential that warrant effort to make them FAIR.
Stakeholders: Research communities; Institutions; Data services.
Funding should be provided to adapt legacy datasets that have been identified as particularly crucial in a given discipline.
Stakeholders: Funders; Institutions; Research communities.