2024 / Version 2.0 Plan #409
Comments
Have you got any plans to support the EXT_mesh_features and EXT_structural_metadata extensions? Having support for writing/reading these extensions would be very useful now that they are fully supported in Cesium Ion.
@scottmcnab, if the only requirement is to be able to read the JSON then these objectives will make supporting new extensions much easier. Having said that, there are no plans to support any particular extension in the near future. If you need a quick solution in the meantime then you might want to consider using #395.
I've come up with a neat (albeit slightly cursed) way of auto-generating the top-level glTF crate: https://play.rust-lang.org/?version=stable&mode=debug&edition=2021&gist=516e6a2c73525eb5efbd0412789923e1
Any plans to take other 3D formats and export to glTF? (perhaps even vice versa) My use case is that I would like to serve .stp, .stl, and .3mf files to a renderer in the frontend. My infrastructure is all Rust, and I would like to stay in Rust if possible. I'm not sure about the scope of what I'm asking. From what I managed to glean from basic research, it seems there aren't really any libraries that do this, just features within larger applications. I think it would be a great contribution to the community at large if there were a single library that managed this, so that instead of being absorbed into the codebase of Blender, for example, it exists as part of a pseudo-standard library that can be integrated into any software that needs to work with glTF.
I think you might want to create a new issue for this. That being said, while I'd love to have that functionality, the scope of what you're asking about is pretty big.
Thanks for the feedback. I can now see the nature of my question better, and see that it is probably well out of scope for this project.
A substantial amount of work has been going on over the last month or so. New macros have been added to `gltf-derive`, and I'm planning to open a large PR soon. There will be a final 1.4.1 release for the latest fixes before a 2.0 release is entertained.
Looking forward to the 1.4.1 release with the dependency bumps! 🤞
A new release with #414 in it is now the only thing blocking emilk/egui#4160. |
2024 / Version 2.0 Plan
Happy new year all!
As discussed in issue #385, the next released version is expected to be 2.0, and this version is intended to mark the start of semantic versioning for all the crates in this repository rather than just the top-level `gltf` crate. There are a few critical requirements that would need to be met for this to be reasonably achievable.
Critical requirements
Merging of validation errors with parsing errors to simplify the JSON data structures
This pertains to the `Checked` type:
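Roughly speaking, `Checked` wraps a value that may turn out to be semantically invalid after deserialisation; the sketch below is a close paraphrase of the current definition in `gltf-json`, with derives omitted:

```rust
/// A value that deserialised successfully but may be semantically invalid.
/// Invalid values are recorded rather than aborting the parse, so that a
/// JSON path to the offending datum can be reported afterwards.
pub enum Checked<T> {
    /// The value is valid.
    Valid(T),
    /// The value is invalid.
    Invalid,
}
```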
The original idea behind this was to separate validation errors (i.e., "the data does not make sense") from parsing errors (e.g., "an object missing its closing brace"). Where a validation error occurs, a JSON path to the offending datum can be reported. This is achieved using a custom deserialiser. For example, consider the `camera::Type` enumeration:
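The following is a close approximation of that enumeration and its hand-written deserialiser, reusing the `Checked` sketch above; the real code differs in minor details, but the key behaviour is that unknown strings map to `Checked::Invalid` rather than aborting the parse:

```rust
use serde::de;
use std::fmt;

/// Camera projection type, mirroring `camera::Type` in `gltf-json`.
#[derive(Clone, Copy, Debug)]
pub enum Type {
    Orthographic,
    Perspective,
}

// The custom deserialiser: a hand-written visitor maps the JSON string to
// `Checked<Type>`, recording unrecognised values as `Checked::Invalid`
// instead of failing the whole parse.
impl<'de> de::Deserialize<'de> for Checked<Type> {
    fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
    where
        D: de::Deserializer<'de>,
    {
        struct Visitor;

        impl<'de> de::Visitor<'de> for Visitor {
            type Value = Checked<Type>;

            fn expecting(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
                f.write_str("\"orthographic\" or \"perspective\"")
            }

            fn visit_str<E: de::Error>(self, value: &str) -> Result<Self::Value, E> {
                Ok(match value {
                    "orthographic" => Checked::Valid(Type::Orthographic),
                    "perspective" => Checked::Valid(Type::Perspective),
                    _ => Checked::Invalid,
                })
            }
        }

        deserializer.deserialize_str(Visitor)
    }
}
```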
The use of the `Checked` type allows parsing to complete even if the data is malformed; however, pragmatically speaking, if the data is malformed then what use is it to continue parsing other than to provide the JSON path? Discontinuing this practice would allow the code to be generated trivially by `serde_derive` instead. For exporters, it would also simplify populating the JSON data structures.

Unification of the `gltf` and `gltf-json` crates
Historically, this split served two requirements: (1) to reduce build times and (2) to allow users to avoid using the wrapper. The first point might still be an issue. The second point could be achieved by gating the generated wrapper (see "Generation of wrapper" below) behind a feature flag; adding a feature flag to each wrapper type manually would otherwise be a pain. We could forgo the second requirement for the time being.
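For illustration only, a unified crate could keep the JSON module unconditional and compile the generated wrapper on demand; the feature name "wrapper" is an assumption, not an existing flag:

```rust
// lib.rs of a hypothetical unified crate: the JSON data structures are
// always built, while the generated wrapper is gated behind an assumed
// "wrapper" Cargo feature, so users who only want the JSON layer avoid
// the extra build time.
pub mod json;

#[cfg(feature = "wrapper")]
pub mod wrapper;
```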
Automated semantic versioning checks
This will make a stronger semantic versioning guarantee, reducing the chance of human error. See #385 (comment) for details.
Important but not critical requirements
Generation of wrapper
This has been discussed in other issues such as #198 and #234. A significant amount of time has elapsed since those issues were written, and I would now feel more comfortable implementing a procedural macro on our existing data structures than implementing any other method of code generation. I have begun prototyping gltf-derive 2.0 to provide such macros.
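To make the idea concrete, the sketch below shows roughly the hand-written pattern such a macro would need to emit; the `BufferWrapper` name, the simplified fields, and the shape of the accessors are illustrative, not the actual gltf-derive 2.0 output:

```rust
use serde::Deserialize;

/// A simplified JSON data structure, as defined in gltf-json today.
#[derive(Deserialize)]
pub struct Buffer {
    #[serde(rename = "byteLength")]
    pub byte_length: u64,
    pub name: Option<String>,
}

/// The wrapper that a hypothetical derive (e.g. `#[derive(Wrap)]`) could
/// generate from the struct above, instead of it being written by hand.
pub struct BufferWrapper<'a> {
    json: &'a Buffer,
}

impl<'a> BufferWrapper<'a> {
    /// Returns the length of the buffer in bytes.
    pub fn byte_length(&self) -> u64 {
        self.json.byte_length
    }

    /// Returns the user-provided name, if any.
    pub fn name(&self) -> Option<&str> {
        self.json.name.as_deref()
    }
}
```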
There is a decision to be made regarding the structure of the generated crate. The existing structure is designed to keep type names terse and to avoid repeated prefixes. For example, `Buffer`, `BufferView`, and `BufferTarget` are grouped together as `buffer::Buffer`, `buffer::View`, and `buffer::Target`. All of the JSON data structures (i.e., those defined in the `gltf-json` crate) are re-exported under the `json` module, and the crate structure matches that of the wrapper; for example, `json::buffer::View` corresponds to `buffer::View`. This structure could be difficult to replicate with a procedural macro. It is likely some compromise would have to be made, such as `buffer::View` corresponding to either `buffer::ViewJson` or `buffer::ViewReader`; both layouts are sketched below.
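For concreteness, the two layouts side by side; the `ViewJson`/`ViewReader` names are the candidate compromise mentioned above, not a settled API:

```rust
// Existing layout (simplified): terse wrapper names, with the JSON data
// structures mirrored under a parallel `json` module.
mod existing {
    pub mod buffer {
        pub struct Buffer; // gltf::buffer::Buffer
        pub struct View;   // gltf::buffer::View
        pub struct Target; // gltf::buffer::Target
    }
    pub mod json {
        pub mod buffer {
            pub struct View; // gltf::json::buffer::View
        }
    }
}

// Possible compromise if the wrapper is generated by a procedural macro:
// a single module tree, with a suffix distinguishing the JSON type from
// its wrapper.
mod compromise {
    pub mod buffer {
        pub struct ViewJson;   // JSON data structure
        pub struct ViewReader; // generated wrapper
    }
}
```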
."Trusted" imports
The validation step is reportedly slow (see #402). The purpose of these checks is to ensure ahead of time that the wrapper crate will not crash due to malformed data such as out-of-range indices. This is reasonable for arbitrary incoming data; however, for applications such as games with static assets, it is an unnecessary overhead. For these scenarios, I'd like to introduce "trusted" variants of the import functions that skip the validation step.
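A possible shape for that API; `import_trusted` is a name used here for illustration only, and only `gltf::import` exists today:

```rust
use std::path::Path;

/// Hypothetical companion to `gltf::import`: same return type, but the
/// validation pass is skipped on the assumption that the asset is known to
/// be well-formed (e.g. a static game asset checked at build time).
pub fn import_trusted<P>(
    path: P,
) -> Result<(gltf::Document, Vec<gltf::buffer::Data>, Vec<gltf::image::Data>), gltf::Error>
where
    P: AsRef<Path>,
{
    let _ = path;
    // A real implementation would deserialise the document and resolve
    // buffers and images exactly like `gltf::import`, minus validation.
    unimplemented!("sketch of the proposed API surface only")
}
```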