The separate "spectests" directory was an artifact of our former nesting
of the main package under a "hcl" directory. However, it was confusing
to have both specsuite and spectests directories at the top level, so
instead we'll just conflate these two by putting the automatic Go testing
helper into the specsuite directory.
These grammar files are not yet feature-complete and so we'll remove them
for now until such time as we're ready to support them.
The main blocker here is in lacking a good testing procedure for updates
to these. While in principle they can be loaded into a number of different
text editors, that is a very manual process that often requires a high
degree of familiarity with the extension API. In order to support these
properly we'd instead need some sort of automatic test suite which can
run various inputs through these rulesets and check that the resulting
tokenization matches what we expect.
This experimental extension is not ready to be included in a release. It
ought to be reworked so that "include" blocks get replaced with what they
include _in-place_, preserving the relative ordering of blocks.
However, there is no application making use of this yet and so we'll defer
that work until there's a real use-case to evaluate it with.
This package remains highly experimental, and so we'll remove it for now
and consider spinning it off into its own repository for further iteration
if the direction seems promising.
This is the first step in bringing HCL 2 into the main HCL repository.
Subsequent commits will prune and reorganize this in preparation for
an initial tagged HCL 2 release.
When I tried a spec with quoted types, I got this error:
`A type is required, not string.`
Later in the document, the types don’t have quotes, so I figured that’s what’s expected.
If an expression in a conditional contains a DynamicVal, we know that
the opposing condition can pass through with no conversion since
converting to a DynamicPseudoType is a no-op. We can also just pass
through the DynamicVal itself, since it is unknown and can't be converted.
The type unification done when evaluating a conditional normally needs
to return a DynamicPseudoType when either condition is dynamic. However,
a null dynamic value represents a known value of the desired type rather
than an unknown type, and we can be certain that it is convertible to
the desired type during the final evaluation. Rather than unifying
types against nulls, directly assign the needed conversion when
evaluating the conditional.
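As a rough sketch of the intended effect (using the hclsyntax and go-cty APIs; the import paths and exact result shown are assumptions), a conditional with a null arm and an unknown condition should now yield an unknown value of the known arm's type rather than a wholly unknown value:

```go
package main

import (
	"fmt"

	"github.com/hashicorp/hcl/v2"
	"github.com/hashicorp/hcl/v2/hclsyntax"
	"github.com/zclconf/go-cty/cty"
)

func main() {
	// One arm is a null literal; the other has a concrete (tuple) type.
	expr, diags := hclsyntax.ParseExpression(
		[]byte(`cond ? ["a", "b"] : null`), "example.hcl", hcl.InitialPos)
	if diags.HasErrors() {
		panic(diags.Error())
	}

	ctx := &hcl.EvalContext{
		Variables: map[string]cty.Value{
			// The condition is not yet known, so the result must be unknown.
			"cond": cty.UnknownVal(cty.Bool),
		},
	}

	val, diags := expr.Value(ctx)
	if diags.HasErrors() {
		panic(diags.Error())
	}

	// With nulls no longer forcing unification to DynamicPseudoType, the
	// unknown result should carry the concrete arm's type.
	fmt.Println(val.IsKnown(), val.Type().FriendlyName())
}
```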
A sequence like "foo" is represented in the AST as a TemplateExpr with a
single string literal inside rather than as a string literal node
directly, so we need to recognize that situation during parsing and treat
it as a special case so we can get the intended behavior of representing
that index as a traversal step rather than as a dynamic index operation.
Most of the time this distinction doesn't matter, but it's important for
static analysis use-cases. In particular, hcl.AbsTraversalForExpr will now
accept an expression like foo["bar"] where before it would've rejected it.
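For example (a sketch against the HCL 2 Go API; the import paths are assumed), an index whose key is a quoted literal string can now be converted to a static traversal:

```go
package main

import (
	"fmt"

	"github.com/hashicorp/hcl/v2"
	"github.com/hashicorp/hcl/v2/hclsyntax"
)

func main() {
	// foo["bar"] parses as an index whose key is a single-literal template,
	// which is now recognized as equivalent to the traversal foo.bar.
	expr, diags := hclsyntax.ParseExpression(
		[]byte(`foo["bar"]`), "example.hcl", hcl.InitialPos)
	if diags.HasErrors() {
		panic(diags.Error())
	}

	trav, diags := hcl.AbsTraversalForExpr(expr)
	if diags.HasErrors() {
		panic(diags.Error())
	}

	// Root name "foo", with a second traversal step for the "bar" index.
	fmt.Println(trav.RootName(), len(trav))
}
```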
This also includes a better error message for when an expression cannot be
recognized as a single traversal. There isn't really any context here to
return a direct reference to the construct that was problematic, which is
what we'd ideally do, but at least this new message includes a summary
of what is allowed and some examples of things that are not allowed as an
aid to understanding what "static variable reference" means.
Fix a bug where diagnostics found when evaluating individual array
elements were ignored, resulting in suppressed errors.
Nomad observed this issue in https://github.com/hashicorp/nomad/issues/5694.
This introduces only some minor bugfixes compared to the commit we
selected before; the main goal here is to be on an actual tagged release
rather than an arbitrary commit.
Previously we were using the EndOfLine as a mandatory marker to end the
comment, but that meant that if a comment appeared immediately before EOF
without a newline on the end it would fail to match.
Now we use the :>> operator similarly to how we previously fixed
greediness in the multi-line comment case: it tells Ragel to end the
Comment production if the following pattern matches (if EndOfLine is found)
but also allows the point before EndOfLine to be a final state, in case
EOF shows up there.
The contract for our parser is that in the case of errors our result is
still valid, though possibly incomplete, so that development tools can
still do analysis of the parts of the result we _were_ able to parse.
However, we were previously failing to meet that contract in the presence
of certain syntax errors during block parsing, where we were producing
a nil body instead of a valid empty one.
Now we'll produce an empty placeholder body if for any reason we don't
have a real one before we return from block parsing, which then allows
analysis tools to see that the containing block was present but makes
its content appear totally empty. This is always done in conjunction with
returning an error, so a calling application will not be misled into
thinking it is a complete result even though parts are missing.
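A sketch of what that contract means for callers (the input here is hypothetical and the exact diagnostics will vary): even when a block's body fails to parse, the returned file can still be walked:

```go
package main

import (
	"fmt"

	"github.com/hashicorp/hcl/v2"
	"github.com/hashicorp/hcl/v2/hclsyntax"
)

func main() {
	// The body of this block contains a syntax error.
	src := []byte("service \"web\" {\n  port = \n}\n")

	f, diags := hclsyntax.ParseConfig(src, "example.hcl", hcl.InitialPos)
	fmt.Println("has errors:", diags.HasErrors())

	// Even with errors, the block is still present in the result and its
	// body is non-nil (possibly an empty placeholder), so analysis tools
	// can keep walking the structure.
	content, _, _ := f.Body.PartialContent(&hcl.BodySchema{
		Blocks: []hcl.BlockHeaderSchema{{Type: "service", LabelNames: []string{"name"}}},
	})
	for _, block := range content.Blocks {
		fmt.Println("block:", block.Type, block.Labels, "nil body:", block.Body == nil)
	}
}
```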
The TemplateStringLiteral production was not quite right, causing a
literal $ or % immediately followed by " to consume the closing quote and
any remaining characters on that line.
Now we match things more precisely, but at the expense of generating some
redundant extra tokens when escapes and literal dollar/percent signs are
present. Those extra tokens don't matter in practice because the resulting
strings get concatenated together anyway, which is proven by the fact
that this changeset includes changes only to the scanner and parser tests,
and not to any of the expression result tests.
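For instance (a small sketch, assuming the hclsyntax API and import path), quoted strings that end in a literal dollar or percent sign now scan and evaluate as expected:

```go
package main

import (
	"fmt"

	"github.com/hashicorp/hcl/v2"
	"github.com/hashicorp/hcl/v2/hclsyntax"
)

func main() {
	// A literal $ or % immediately before the closing quote used to make
	// the scanner swallow the quote and the rest of the line.
	for _, src := range []string{`"100%"`, `"cost: 5$"`} {
		expr, diags := hclsyntax.ParseExpression([]byte(src), "example.hcl", hcl.InitialPos)
		if diags.HasErrors() {
			panic(diags.Error())
		}
		val, diags := expr.Value(nil)
		if diags.HasErrors() {
			panic(diags.Error())
		}
		fmt.Println(val.AsString())
	}
}
```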
While here, I also improved the error message for when the user attempts
to split a quoted string over multiple lines. Previously it was just using
the generic "invalid character" message, which isn't particularly
actionable. Now we'll give the user a couple options of what to do
instead.
Previously our behavior for an unknown for_each was to produce a single
block whose content was the result of evaluating content with the iterator
set to cty.DynamicVal. That produced a reasonable idea of the content, but
the number of blocks in the result was still not accurate, and that can
present a problem for applications that use unknown values to predict
the overall shape of a not-yet-complete structure.
We can't return an unknown block via the HCL API, but to make that
situation easier to recognize by callers we'll now go a little further and
force _all_ of the leaf attributes in such a block to be unknown values,
even if they are constants in the configuration. This allows a calling
application that is making predictions to use a single object whose
leaves are all unknown as a heuristic to recognize what is effectively
an unknown set of blocks.
This is still not a perfect heuristic, but is the best we can do here
within the HCL API assumptions. A fundamental assumption of the HCL API
is that it's possible to walk the block structure without evaluating any
expressions and the dynamic block extension is intentionally subverting
that assumption, so some oddities are to be expected. Calling applications
that need a fully reliable sense of the final structure should not use
the dynamic block extension.
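As a sketch of the resulting behavior (the import paths and exact API details are assumptions), a dynamic block with an unknown for_each now expands to a single block whose leaf attributes all read as unknown, even where the configuration gives constants:

```go
package main

import (
	"fmt"

	"github.com/hashicorp/hcl/v2"
	"github.com/hashicorp/hcl/v2/ext/dynblock"
	"github.com/hashicorp/hcl/v2/hclsyntax"
	"github.com/zclconf/go-cty/cty"
)

const src = `
dynamic "rule" {
  for_each = rules
  content {
    name = "fixed-name"
  }
}
`

func main() {
	f, diags := hclsyntax.ParseConfig([]byte(src), "example.hcl", hcl.InitialPos)
	if diags.HasErrors() {
		panic(diags.Error())
	}

	ctx := &hcl.EvalContext{
		Variables: map[string]cty.Value{
			// The collection to iterate is not yet known.
			"rules": cty.UnknownVal(cty.List(cty.String)),
		},
	}

	content, _, diags := dynblock.Expand(f.Body, ctx).PartialContent(&hcl.BodySchema{
		Blocks: []hcl.BlockHeaderSchema{{Type: "rule"}},
	})
	if diags.HasErrors() {
		panic(diags.Error())
	}

	// A single placeholder block is produced, and even the constant "name"
	// attribute reads as unknown so callers can recognize the situation.
	for _, block := range content.Blocks {
		attrs, _ := block.Body.JustAttributes()
		val, _ := attrs["name"].Expr.Value(ctx)
		fmt.Println("name known:", val.IsKnown())
	}
}
```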
Slight adaptation of Ian Remmler <ian@remmler.org>'s fix in PR #39:
> If decoding into a pointer, and the pointer points to a value, decode
> into the existing value. Otherwise, decode into a new value and point
> to it.
> Addresses issue #38.
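A sketch of what this enables for gohcl callers (the struct and field names here are hypothetical):

```go
package main

import (
	"fmt"

	"github.com/hashicorp/hcl/v2"
	"github.com/hashicorp/hcl/v2/gohcl"
	"github.com/hashicorp/hcl/v2/hclsyntax"
)

type Network struct {
	Name string `hcl:"name,optional"`
	Port int    `hcl:"port,optional"`
}

type Config struct {
	Network *Network `hcl:"network,block"`
}

func main() {
	src := []byte("network {\n  port = 8080\n}\n")
	f, diags := hclsyntax.ParseConfig(src, "example.hcl", hcl.InitialPos)
	if diags.HasErrors() {
		panic(diags.Error())
	}

	// Pre-populate the pointer: decoding should now reuse this existing
	// value rather than allocating a new one and pointing at it.
	existing := &Network{Name: "default"}
	cfg := Config{Network: existing}

	if diags := gohcl.DecodeBody(f.Body, nil, &cfg); diags.HasErrors() {
		panic(diags.Error())
	}
	fmt.Println(cfg.Network == existing, cfg.Network.Port)
}
```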
Fixes an issue where the name range could be incorrect when there are
multiple attributes and one of them is invalid: a later attribute's name
range could overwrite the previous one, because the attr variable is
reused across iterations of the for loop.
This situation is likely to arise if the user attempts to compute an index
using division while expecting HCL to do integer division rather than
float division.
This message is intended therefore to help the user see what fix is needed
(round the number) but because of layering it sadly cannot suggest a
specific remedy because HCL itself has no built-in rounding functionality,
and thus how exactly that is done is a matter for the calling application.
We're accepting that compromise for now to see how common this turns out
to be in practice. My hypothesis is that it'll be seen more when an
application moves from HCL 1 to HCL 2 because HCL 1 would do integer
division in some cases, but then it will be less common once new patterns
are established in the language community. For example, any existing
examples of using integer division to index in Terraform will over time
be replaced with examples showing the use of Terraform's "floor" function.
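For instance (a sketch with assumed import paths), an index computed with division evaluates to a fractional number and is rejected, and the diagnostic is where the improved message appears:

```go
package main

import (
	"fmt"

	"github.com/hashicorp/hcl/v2"
	"github.com/hashicorp/hcl/v2/hclsyntax"
	"github.com/zclconf/go-cty/cty"
)

func main() {
	// In HCL 2 the / operator always produces a fractional result, so
	// 3 / 2 is 1.5 and cannot be used directly as a sequence index.
	expr, diags := hclsyntax.ParseExpression(
		[]byte(`items[3 / 2]`), "example.hcl", hcl.InitialPos)
	if diags.HasErrors() {
		panic(diags.Error())
	}

	ctx := &hcl.EvalContext{
		Variables: map[string]cty.Value{
			"items": cty.ListVal([]cty.Value{
				cty.StringVal("a"), cty.StringVal("b"),
			}),
		},
	}

	_, diags = expr.Value(ctx)
	fmt.Println(diags.Error()) // the improved diagnostic explains the fractional index problem
}
```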
This is a variant of NewRangeScanner that allows the caller to specify the
start position, which is appropriate when this utility is being used with
a sub-slice of a file, rather than the whole file.
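A sketch of how such a constructor might be used; the name NewRangeScannerFragment and the parameter order here are assumptions based on the description above:

```go
package main

import (
	"bufio"
	"fmt"

	"github.com/hashicorp/hcl/v2"
)

func main() {
	// A fragment taken from the middle of a larger file: reported ranges
	// should be relative to the original file, not to the fragment.
	fragment := []byte("first line of fragment\nsecond line\n")
	start := hcl.Pos{Line: 10, Column: 1, Byte: 200} // assumed position of the fragment in its file

	// NewRangeScannerFragment is the assumed name of the new variant.
	sc := hcl.NewRangeScannerFragment(fragment, "example.hcl", start, bufio.ScanLines)
	for sc.Scan() {
		fmt.Printf("%s: %s\n", sc.Range(), sc.Bytes())
	}
}
```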
Previously, the hclsyntax MissingItemRange() function returned a zero-length
range anchored at the end of the block in question. This commit changes
that to the beginning of the block. In practice, the end of a block is
generally just a "}" and not very useful in error messages.
In normal situations the block type name alone is enough to determine the
appropriate schema for a child, but when callers are otherwise doing
unusual pre-processing of bodies to dynamically generate schemas during
decoding they are likely to need to take similar steps while analyzing
for variables, to ensure that all of the references can be located in
spite of the not-yet-applied pre-processing.
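A sketch of that schema-aware analysis using the walk API described below (the node/child shape is assumed, and schemaForBlockType is a hypothetical stand-in for the caller's own schema lookup):

```go
package main

import (
	"fmt"

	"github.com/hashicorp/hcl/v2"
	"github.com/hashicorp/hcl/v2/ext/dynblock"
	"github.com/hashicorp/hcl/v2/hclsyntax"
)

// schemaForBlockType is a hypothetical stand-in for whatever dynamic schema
// lookup the application performs while decoding.
func schemaForBlockType(typeName string) *hcl.BodySchema {
	switch typeName {
	case "rule":
		return &hcl.BodySchema{
			Attributes: []hcl.AttributeSchema{{Name: "port"}},
		}
	default:
		return &hcl.BodySchema{
			Blocks: []hcl.BlockHeaderSchema{{Type: "rule"}},
		}
	}
}

// collectVariables visits each level of the walk with the schema appropriate
// to that block type, so that every reference can be located.
func collectVariables(node dynblock.WalkVariablesNode, schema *hcl.BodySchema) []hcl.Traversal {
	vars, children := node.Visit(schema)
	for _, child := range children {
		vars = append(vars, collectVariables(child.Node, schemaForBlockType(child.BlockTypeName))...)
	}
	return vars
}

func main() {
	src := "dynamic \"rule\" {\n  for_each = rules\n  content {\n    port = base_port\n  }\n}\n"
	f, diags := hclsyntax.ParseConfig([]byte(src), "example.hcl", hcl.InitialPos)
	if diags.HasErrors() {
		panic(diags.Error())
	}

	for _, t := range collectVariables(dynblock.WalkVariables(f.Body), schemaForBlockType("")) {
		fmt.Println(t.RootName()) // expect "rules" (expand-time) and "base_port" (content)
	}
}
```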
Our API previously had a function only for retrieving the variables used
in the for_each and labels arguments used during an Expand call, and
expected callers to then interrogate the resulting expanded block to find
the other variables required to fully decode the content.
That approach is insufficient for any application that needs to know the
full set of required variables before any evaluation begins, such as when
a dependency graph will be constructed to allow a topological traversal
through blocks while evaluating.
Now we have WalkVariables, which finds both the variables used to expand
_and_ the variables within any blocks. This also renames
WalkForEachVariables to WalkExpandVariables since that name is more
accurate with the addition of the "label" argument into the expand-time
dependency set.
There is also a hcldec-based helper wrapper for each of those, allowing
single-shot analysis of blocks for applications that use hcldec.
This is a breaking change to the dynblock package API, because the old
WalkForEachVariables and ForEachVariablesHCLDec functions are no longer
present.
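For hcldec-based applications, the single-shot helpers look roughly like this (a sketch; the helper names and the spec shape are assumptions consistent with the description above):

```go
package main

import (
	"fmt"

	"github.com/hashicorp/hcl/v2"
	"github.com/hashicorp/hcl/v2/ext/dynblock"
	"github.com/hashicorp/hcl/v2/hcldec"
	"github.com/hashicorp/hcl/v2/hclsyntax"
	"github.com/zclconf/go-cty/cty"
)

const src = `
dynamic "rule" {
  for_each = rules
  labels   = [rule.value.name]
  content {
    port = rule.value.port + base_port
  }
}
`

func main() {
	f, diags := hclsyntax.ParseConfig([]byte(src), "example.hcl", hcl.InitialPos)
	if diags.HasErrors() {
		panic(diags.Error())
	}

	spec := &hcldec.BlockMapSpec{
		TypeName:   "rule",
		LabelNames: []string{"name"},
		Nested:     &hcldec.AttrSpec{Name: "port", Type: cty.Number},
	}

	// Full analysis: expand-time arguments (for_each, labels) plus block
	// contents, with iterator references filtered out. Expect "rules" and
	// "base_port" here.
	for _, t := range dynblock.VariablesHCLDec(f.Body, spec) {
		fmt.Println(t.RootName())
	}

	// Expand-time arguments only: just "rules".
	for _, t := range dynblock.ExpandVariablesHCLDec(f.Body, spec) {
		fmt.Println(t.RootName())
	}
}
```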