How to implement re-export behavior in ScopeComputation? #1508
Hey, I'd like to add a feature to our language that allows re-exporting imported elements. To that end, I extended the `DefaultScopeComputation`. However, my implementation is running into an error.
**Example**

The code snippets below are from our project Jayvee. I omitted everything that is not relevant to the example.

**grammar.langium**
**Implementation**

```typescript
export class JayveeScopeComputation extends DefaultScopeComputation {
  constructor(services: LangiumCoreServices) {
    super(services);
  }

  protected override exportNode(
    node: AstNode,
    exports: AstNodeDescription[],
    document: LangiumDocument,
  ): void {
    // [...]
    if (isExportDefinition(node)) {
      const exportedNode = node.element.ref;
      if (exportedNode === undefined) {
        return;
      }
      return super.exportNode(exportedNode, exports, document);
    }
  }
}
```

**Questions**
Thanks in advance!
Hey @georg-schwarz, great question!

In general, I believe you're trying to achieve too much with the scope computation mechanism. While this could in theory work with a smarter scope computation strategy (i.e. a topological sorting of the documents), it would break down in case of a cyclic dependency between files.

My solution would be to do this in the scoping mechanism directly instead of in the scope computation. During scoping, we're allowed to resolve cross-references. In your case it should be enough to adjust the global scope:

```typescript
// compute the scope that can be accessed from the file that contains the reference
getGlobalScope(referenceType: string, context: ReferenceInfo): Scope {
  const entryElement = getContainerOfType(context.container, isModel); // assuming the topmost element in your grammar is `Model`
  const imports = entryElement.imports; // collect all `use` statements
  const resolvedFiles = resolveImports(imports); // this should be a list of all imported documents
  const elements: AstNodeDescription[] = [];
  for (const file of resolvedFiles) {
    const allPublishedElements = getPublishedElements(file); // gets all elements annotated with `publish`
    elements.push(...allPublishedElements.map(e => this.descriptions.createDescription(e, e.name)));
  }
  return new MapScope(elements);
}

getPublishedElements(document: LangiumDocument): Element[] {
  const model = document.parseResult.value;
  const publishedElements = model.elements.filter(e => e.published);
  const exportedElements: Element[] = [];
  for (const published of publishedElements) {
    if (isPublishStatement(published)) {
      // follow the chain of published elements here,
      // i.e. use published.element.ref recursively until you find
      // the defining element (or run into a cycle)
      exportedElements.push(followPublishStatementChain(published));
    } else {
      exportedElements.push(published);
    }
  }
  return exportedElements;
}
```

Note that the code above is a very rough outline. Unless you perform caching, you will quickly run into performance issues with it. Also, it doesn't deal with renaming of objects yet; you might need an intermediate layer for that. I.e. don't expose the