
Adding upstream version 1.34.4.

Signed-off-by: Daniel Baumann <daniel@debian.org>
This commit is contained in:
Daniel Baumann 2025-05-24 07:26:29 +02:00
parent e393c3af3f
commit 4978089aab
Signed by: daniel
GPG key ID: FBB4F0E80A80222F
4963 changed files with 677545 additions and 0 deletions

# XPath Parser Plugin
The XPath data format parser parses different formats into metric fields using
[XPath][xpath] expressions.
For supported XPath functions check [the underlying XPath library][xpath lib].
__NOTE:__ The types of fields are specified using [XPath functions][xpath
lib]. The only exception is _integer_ fields, which need to be specified in a
`fields_int` section.
## Supported data formats
| name | `data_format` setting | comment |
| -------------------------------------------- | --------------------- | ------- |
| [Extensible Markup Language (XML)][xml] | `"xml"` | |
| [Concise Binary Object Representation][cbor] | `"xpath_cbor"` | [see additional notes](#concise-binary-object-representation-notes)|
| [JSON][json] | `"xpath_json"` | |
| [MessagePack][msgpack] | `"xpath_msgpack"` | |
| [Protocol-buffers][protobuf] | `"xpath_protobuf"` | [see additional parameters](#protocol-buffers-additional-settings)|
### Protocol-buffers additional settings
To use the protocol-buffer format you need to specify additional properties
for the parser. These options, both _mandatory_ and _optional_, are described here.
#### `xpath_protobuf_files` (mandatory)
Use this option to specify the names of the protocol-buffer definition files
(`.proto`).
#### `xpath_protobuf_type` (mandatory)
This option contains the top-level message type to use for deserializing the
data to be parsed. Usually, this is constructed from the `package` name in the
protocol-buffer definition file and the `message` name as `<package
name>.<message name>`.
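As a sketch, for a hypothetical definition file containing

```protobuf
syntax = "proto3";
package foo;

message Measurement {
    ...
}
```

the corresponding setting would be `xpath_protobuf_type = "foo.Measurement"`.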
#### `xpath_protobuf_import_paths` (optional)
In case you import other protocol-buffer definitions within your `.proto` file
(i.e. you use the `import` statement), you can use this option to specify paths
to search for the imported definition file(s). By default, imports are only
searched for in `.`, the current working directory, i.e. usually the
directory you are in when starting Telegraf.
Imagine you have multiple protocol-buffer definitions (e.g. `A.proto`,
`B.proto` and `C.proto`) in a directory (e.g. `/data/my_proto_files`), where your
top-level file (e.g. `A.proto`) imports at least one other definition
```protobuf
syntax = "proto3";
package foo;

import "B.proto";

message Measurement {
    ...
}
```
In this case you should use the following settings:
```toml
[[inputs.file]]
  files = ["example.dat"]
  data_format = "xpath_protobuf"
  xpath_protobuf_files = ["A.proto"]
  xpath_protobuf_type = "foo.Measurement"
  xpath_protobuf_import_paths = [".", "/data/my_proto_files"]
  ...
```
#### `xpath_protobuf_skip_bytes` (optional)
This option allows skipping a number of bytes before trying to parse
the protocol-buffer message. This is useful in cases where the raw data
has a header, e.g. containing the message length, or in the case of gRPC
messages.
This is a list of known headers and the corresponding values for
`xpath_protobuf_skip_bytes`:
| name | setting | comment |
| --------------------------------------- | ------- | ------- |
| [GRPC protocol][GRPC] | 5 | GRPC adds a 5-byte header for _Length-Prefixed-Messages_ |
| [PowerDNS logging][PDNS] | 2 | Sent messages contain a 2-byte header containing the message length |
[GRPC]: https://github.com/grpc/grpc/blob/master/doc/PROTOCOL-HTTP2.md
[PDNS]: https://docs.powerdns.com/recursor/lua-config/protobuf.html
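For example, a minimal sketch for parsing length-prefixed gRPC payloads
(filenames and the message type are illustrative):

```toml
[[inputs.file]]
  files = ["grpc_payload.dat"]
  data_format = "xpath_protobuf"
  xpath_protobuf_files = ["A.proto"]
  xpath_protobuf_type = "foo.Measurement"
  ## Skip the 5-byte gRPC header for Length-Prefixed-Messages
  xpath_protobuf_skip_bytes = 5
```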
### Concise Binary Object Representation notes
Concise Binary Object Representation supports numeric keys in the data. However,
XML (and this parser) expects node names to be strings starting with a letter.
To be compatible with these requirements, numeric nodes are prefixed with
a lowercase `n` and converted to strings. This means that if you, for example,
have a node with the key `123` in CBOR, you will need to query `n123` in your
XPath expressions.
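For example, a sketch of querying a CBOR document whose top-level map uses the
numeric key `123` (filename and field name are illustrative):

```toml
[[inputs.file]]
  files = ["example.cbor"]
  data_format = "xpath_cbor"

  [[inputs.file.xpath]]
    [inputs.file.xpath.fields]
      ## The CBOR key `123` has to be queried as `n123`
      value = "number(/n123)"
```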
## Configuration
```toml
[[inputs.file]]
  files = ["example.xml"]

  ## Data format to consume.
  ## Each data format has its own unique set of configuration options, read
  ## more about them here:
  ## https://github.com/influxdata/telegraf/blob/master/docs/DATA_FORMATS_INPUT.md
  data_format = "xml"

  ## PROTOCOL-BUFFER definitions
  ## Protocol-buffer definition file
  # xpath_protobuf_files = ["sparkplug_b.proto"]
  ## Name of the protocol-buffer message type to use in a fully qualified form.
  # xpath_protobuf_type = "org.eclipse.tahu.protobuf.Payload"
  ## List of paths to use when looking up imported protocol-buffer definition files.
  # xpath_protobuf_import_paths = ["."]
  ## Number of (header) bytes to ignore before parsing the message.
  # xpath_protobuf_skip_bytes = 0

  ## Print the internal XML document when in debug logging mode.
  ## This is especially useful when using the parser with non-XML formats like protocol-buffers
  ## to get an idea on the expression necessary to derive fields etc.
  # xpath_print_document = false

  ## Allow the results of one of the parsing sections to be empty.
  ## Useful when not all selected files have the exact same structure.
  # xpath_allow_empty_selection = false

  ## Get native data-types for all data-formats that contain type information.
  ## Currently, CBOR, protobuf, msgpack and JSON support native data-types.
  # xpath_native_types = false

  ## Trace empty node selections for debugging
  # log_level = "trace"

  ## Multiple parsing sections are allowed
  [[inputs.file.xpath]]
    ## Optional: XPath-query to select a subset of nodes from the XML document.
    # metric_selection = "/Bus/child::Sensor"

    ## Optional: XPath-query to set the metric (measurement) name.
    # metric_name = "string('example')"

    ## Optional: Query to extract metric timestamp.
    ## If not specified the time of execution is used.
    # timestamp = "/Gateway/Timestamp"
    ## Optional: Format of the timestamp determined by the query above.
    ## This can be any of "unix", "unix_ms", "unix_us", "unix_ns" or a valid Golang
    ## time format. If not specified, a "unix" timestamp (in seconds) is expected.
    # timestamp_format = "2006-01-02T15:04:05Z"
    ## Optional: Timezone of the parsed time
    ## This will locate the parsed time to the given timezone. Please note that
    ## for times with timezone-offsets (e.g. RFC3339) the timestamp is unchanged.
    ## This is ignored for all (unix) timestamp formats.
    # timezone = "UTC"

    ## Optional: List of fields to convert to hex-strings if they are
    ## containing byte-arrays. This might be the case for e.g. protocol-buffer
    ## messages encoding data as byte-arrays. Wildcard patterns are allowed.
    ## By default, all byte-array-fields are converted to string.
    # fields_bytes_as_hex = []

    ## Optional: List of fields to convert to base64-strings if they
    ## contain byte-arrays. The resulting string will generally be shorter
    ## than using hex encoding. Base64 encoding is RFC4648 compliant.
    # fields_bytes_as_base64 = []

    ## Tag definitions using the given XPath queries.
    [inputs.file.xpath.tags]
      name   = "substring-after(Sensor/@name, ' ')"
      device = "string('the ultimate sensor')"

    ## Integer field definitions using XPath queries.
    [inputs.file.xpath.fields_int]
      consumers = "Variable/@consumers"

    ## Non-integer field definitions using XPath queries.
    ## The field type is defined using XPath expressions such as number(),
    ## boolean() or string(). If no conversion is performed the field will
    ## be of type string.
    [inputs.file.xpath.fields]
      temperature = "number(Variable/@temperature)"
      power       = "number(Variable/@power)"
      frequency   = "number(Variable/@frequency)"
      ok          = "Mode != 'error'"
```
In this configuration mode, you explicitly specify the fields and tags you want
to scrape from your data.
A configuration can contain multiple _xpath_ subsections, e.g. for the
file plugin to process the XML string multiple times. Consult the
[XPath syntax][xpath] and the [underlying library's functions][xpath lib]
for details and help regarding XPath queries. Consider using an XPath tester
such as [xpather.com][xpather] or [Code Beautify's XPath Tester][xpath tester]
for help developing and debugging your queries.
## Configuration (batch)
As an alternative to the configuration above, fields can also be specified in a
batch way. Instead of specifying each field in its own section, you define a
`name` and a `value` selector used to determine the names and values of the
fields in the metric.
```toml
[[inputs.file]]
  files = ["example.xml"]

  ## Data format to consume.
  ## Each data format has its own unique set of configuration options, read
  ## more about them here:
  ## https://github.com/influxdata/telegraf/blob/master/docs/DATA_FORMATS_INPUT.md
  data_format = "xml"

  ## PROTOCOL-BUFFER definitions
  ## Protocol-buffer definition file
  # xpath_protobuf_files = ["sparkplug_b.proto"]
  ## Name of the protocol-buffer message type to use in a fully qualified form.
  # xpath_protobuf_type = "org.eclipse.tahu.protobuf.Payload"
  ## List of paths to use when looking up imported protocol-buffer definition files.
  # xpath_protobuf_import_paths = ["."]

  ## Print the internal XML document when in debug logging mode.
  ## This is especially useful when using the parser with non-XML formats like protocol-buffers
  ## to get an idea on the expression necessary to derive fields etc.
  # xpath_print_document = false

  ## Allow the results of one of the parsing sections to be empty.
  ## Useful when not all selected files have the exact same structure.
  # xpath_allow_empty_selection = false

  ## Get native data-types for all data-formats that contain type information.
  ## Currently, CBOR, protobuf, msgpack and JSON support native data-types.
  # xpath_native_types = false

  ## Multiple parsing sections are allowed
  [[inputs.file.xpath]]
    ## Optional: XPath-query to select a subset of nodes from the XML document.
    metric_selection = "/Bus/child::Sensor"

    ## Optional: XPath-query to set the metric (measurement) name.
    # metric_name = "string('example')"

    ## Optional: Query to extract metric timestamp.
    ## If not specified the time of execution is used.
    # timestamp = "/Gateway/Timestamp"
    ## Optional: Format of the timestamp determined by the query above.
    ## This can be any of "unix", "unix_ms", "unix_us", "unix_ns" or a valid Golang
    ## time format. If not specified, a "unix" timestamp (in seconds) is expected.
    # timestamp_format = "2006-01-02T15:04:05Z"

    ## Field specifications using a selector.
    field_selection = "child::*"

    ## Optional: Queries to specify field name and value.
    ## These options are only to be used in combination with 'field_selection'!
    ## By default the node name and node content are used if a field-selection
    ## is specified.
    # field_name = "name()"
    # field_value = "."

    ## Optional: Expand field names relative to the selected node
    ## This allows to flatten out nodes with non-unique names in the subtree
    # field_name_expansion = false

    ## Tag specifications using a selector.
    # tag_selection = "child::*"

    ## Optional: Queries to specify tag name and value.
    ## These options are only to be used in combination with 'tag_selection'!
    ## By default the node name and node content are used if a tag-selection
    ## is specified.
    # tag_name = "name()"
    # tag_value = "."

    ## Optional: Expand tag names relative to the selected node
    ## This allows to flatten out nodes with non-unique names in the subtree
    # tag_name_expansion = false

    ## Tag definitions using the given XPath queries.
    [inputs.file.xpath.tags]
      name   = "substring-after(Sensor/@name, ' ')"
      device = "string('the ultimate sensor')"
```
__Please note__: The resulting fields are _always_ of type string!
It is also possible to mix the two ways of specifying fields. In this case,
_explicitly_ defined tags and fields take _precedence_ over the batch instances
if both use the same tag or field name.
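A sketch of such a mixed configuration, combining a batch `field_selection`
with an explicit field definition:

```toml
[[inputs.file.xpath]]
  metric_selection = "/Bus/child::Sensor"

  ## Batch-derived fields (always of type string)
  field_selection = "child::*"

  ## Explicitly defined fields take precedence over batch-derived
  ## fields of the same name
  [inputs.file.xpath.fields]
    temperature = "number(Variable/@temperature)"
```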
### metric_selection (optional)
You can specify an [XPath][xpath] query to select a subset of nodes from the XML
document, each of which is used to generate a new metric with the specified
fields, tags etc.
Relative queries in subsequent options are evaluated relative to the
`metric_selection`. To specify absolute paths, start the query with a
slash (`/`).
Specifying `metric_selection` is optional. If it is not specified, all relative
queries are relative to the root node of the XML document.
### metric_name (optional)
By specifying `metric_name` you can override the metric/measurement name with
the result of the given [XPath][xpath] query. If not specified, the default
metric name is used.
### timestamp, timestamp_format, timezone (optional)
By default the current time is used for all created metrics. To set the
time from values in the XML document, you can specify an [XPath][xpath] query in
`timestamp` and set the format in `timestamp_format`.
The `timestamp_format` can be set to `unix`, `unix_ms`, `unix_us`, `unix_ns`, or
an accepted [Go "reference time"][time const]. Consult the Go [time][time parse]
package for details and additional examples on how to set the time format. If
`timestamp_format` is omitted, the result of the `timestamp` query is
interpreted as a `unix` timestamp.
The `timezone` setting is used to locate the parsed time in the given
timezone. This is helpful for cases where the time does not contain timezone
information, e.g. `2023-03-09 14:04:40`, and is not located in _UTC_, which is
the default setting. It is also possible to set the `timezone` to `Local`, which
uses the configured host timezone.
For time formats with timezone information, e.g. RFC3339, the resulting
timestamp is unchanged. The `timezone` setting is ignored for all `unix`
timestamp formats.
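For example, to parse a timezone-less timestamp such as `2023-03-09 14:04:40`
that is known to be in Central European Time (the query is illustrative):

```toml
[[inputs.file.xpath]]
  timestamp = "/Gateway/Timestamp"
  ## Go reference-time layout matching "2023-03-09 14:04:40"
  timestamp_format = "2006-01-02 15:04:05"
  ## Locate the parsed time in the given timezone
  timezone = "Europe/Berlin"
```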
### tags sub-section
[XPath][xpath] queries in the `tag name = query` format to add tags to the
metrics. The specified path can be absolute (starting with `/`) or
relative. Relative paths use the currently selected node as reference.
__NOTE:__ Results of tag-queries will always be converted to strings.
### fields_int sub-section
[XPath][xpath] queries in the `field name = query` format to add integer typed
fields to the metrics. The specified path can be absolute (starting with `/`) or
relative. Relative paths use the currently selected node as reference.
__NOTE:__ Results of field_int-queries will always be converted to
__int64__. The conversion will fail in case the query result is not convertible!
### fields sub-section
[XPath][xpath] queries in the `field name = query` format to add non-integer
fields to the metrics. The specified path can be absolute (starting with `/`) or
relative. Relative paths use the currently selected node as reference.
The type of the field is specified in the [XPath][xpath] query using the type
conversion functions of XPath such as `number()`, `boolean()` or `string()`. If
no conversion is performed in the query, the field will be of type string.
__NOTE: XPath conversion functions will always succeed even if you convert a
text to float!__
### field_selection, field_name, field_value (optional)
You can specify a [XPath][xpath] query to select a set of nodes forming the
fields of the metric. The specified path can be absolute (starting with `/`) or
relative to the currently selected node. Each node selected by `field_selection`
forms a new field within the metric.
The _name_ and the _value_ of each field can be specified using the optional
`field_name` and `field_value` queries. The queries are relative to the selected
field if not starting with `/`. If not specified the field's _name_ defaults to
the node name and the field's _value_ defaults to the content of the selected
field node.
__NOTE__: `field_name` and `field_value` queries are only evaluated if a
`field_selection` is specified.
Specifying `field_selection` is optional. This is an alternative way to specify
fields especially for documents where the node names are not known a priori or
if there is a large number of fields to be specified. These options can also be
combined with the field specifications above.
__NOTE: XPath conversion functions will always succeed even if you convert a
text to float!__
### field_name_expansion (optional)
When _true_, field names selected with `field_selection` are expanded to a
_path_ relative to the _selected node_. This is necessary if we e.g. select all
leaf nodes as fields and those leaf nodes do not have unique names. That is in
case you have duplicate names in the fields you select you should set this to
`true`.
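A sketch of enabling the expansion for a batch field selection; this is only
useful when the selected nodes sit below intermediate nodes whose names are
needed to disambiguate duplicates:

```toml
[[inputs.file.xpath]]
  ## Select all leaf nodes as fields...
  field_selection = "descendant::*[not(*)]"
  ## ...and prefix the field names with their path relative to the
  ## selected node to disambiguate duplicate leaf-node names
  field_name_expansion = true
```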
### tag_selection, tag_name, tag_value (optional)
You can specify a [XPath][xpath] query to select a set of nodes forming the tags
of the metric. The specified path can be absolute (starting with `/`) or
relative to the currently selected node. Each node selected by `tag_selection`
forms a new tag within the metric.
The _name_ and the _value_ of each tag can be specified using the optional
`tag_name` and `tag_value` queries. The queries are relative to the selected tag
if not starting with `/`. If not specified the tag's _name_ defaults to the node
name and the tag's _value_ defaults to the content of the selected tag node.
__NOTE__: `tag_name` and `tag_value` queries are only evaluated if a
`tag_selection` is specified.
Specifying `tag_selection` is optional. This is an alternative way to specify
tags especially for documents where the node names are not known a priori or if
there is a large number of tags to be specified. These options can also be
combined with the tag specifications above.
### tag_name_expansion (optional)
When _true_, tag names selected with `tag_selection` are expanded to a _path_
relative to the _selected node_. This is necessary if we e.g. select all leaf
nodes as tags and those leaf nodes do not have unique names. That is in case you
have duplicate names in the tags you select you should set this to `true`.
## Examples
This `example.xml` file is used in the configuration examples below:
```xml
<?xml version="1.0"?>
<Gateway>
  <Name>Main Gateway</Name>
  <Timestamp>2020-08-01T15:04:03Z</Timestamp>
  <Sequence>12</Sequence>
  <Status>ok</Status>
</Gateway>

<Bus>
  <Sensor name="Sensor Facility A">
    <Variable temperature="20.0"/>
    <Variable power="123.4"/>
    <Variable frequency="49.78"/>
    <Variable consumers="3"/>
    <Mode>busy</Mode>
  </Sensor>
  <Sensor name="Sensor Facility B">
    <Variable temperature="23.1"/>
    <Variable power="14.3"/>
    <Variable frequency="49.78"/>
    <Variable consumers="1"/>
    <Mode>standby</Mode>
  </Sensor>
  <Sensor name="Sensor Facility C">
    <Variable temperature="19.7"/>
    <Variable power="0.02"/>
    <Variable frequency="49.78"/>
    <Variable consumers="0"/>
    <Mode>error</Mode>
  </Sensor>
</Bus>
```
### Basic Parsing
This example shows the basic usage of the xml parser.
Config:
```toml
[[inputs.file]]
  files = ["example.xml"]
  data_format = "xml"

  [[inputs.file.xpath]]
    [inputs.file.xpath.tags]
      gateway = "substring-before(/Gateway/Name, ' ')"

    [inputs.file.xpath.fields_int]
      seqnr = "/Gateway/Sequence"

    [inputs.file.xpath.fields]
      ok = "/Gateway/Status = 'ok'"
```
Output:
```text
file,gateway=Main,host=Hugin seqnr=12i,ok=true 1598610830000000000
```
In the _tags_ definition the XPath function `substring-before()` is used to
extract only the sub-string before the space. To get the integer value of
`/Gateway/Sequence` we have to use the _fields_int_ section, as there is no
XPath function to convert node values to integers (only to float).
The `ok` field is filled with a boolean by specifying a query comparing the
query result of `/Gateway/Status` with the string _ok_. Use the type conversions
available in the XPath syntax to specify field types.
### Time and metric names
This is an example for using time and name of the metric from the XML document
itself.
Config:
```toml
[[inputs.file]]
  files = ["example.xml"]
  data_format = "xml"

  [[inputs.file.xpath]]
    metric_name = "name(/Gateway/Status)"

    timestamp = "/Gateway/Timestamp"
    timestamp_format = "2006-01-02T15:04:05Z"

    [inputs.file.xpath.tags]
      gateway = "substring-before(/Gateway/Name, ' ')"

    [inputs.file.xpath.fields]
      ok = "/Gateway/Status = 'ok'"
```
Output:
```text
Status,gateway=Main,host=Hugin ok=true 1596294243000000000
```
In addition to the basic parsing example, the metric name is defined as the
name of the `/Gateway/Status` node and the timestamp is derived from the XML
document instead of using the execution time.
### Multi-node selection
For XML documents containing metrics for e.g. multiple devices (like `Sensor`s
in the _example.xml_), multiple metrics can be generated using node
selection. This example shows how to generate a metric for each _Sensor_ in the
example.
Config:
```toml
[[inputs.file]]
  files = ["example.xml"]
  data_format = "xml"

  [[inputs.file.xpath]]
    metric_selection = "/Bus/child::Sensor"
    metric_name = "string('sensors')"

    timestamp = "/Gateway/Timestamp"
    timestamp_format = "2006-01-02T15:04:05Z"

    [inputs.file.xpath.tags]
      name = "substring-after(@name, ' ')"

    [inputs.file.xpath.fields_int]
      consumers = "Variable/@consumers"

    [inputs.file.xpath.fields]
      temperature = "number(Variable/@temperature)"
      power       = "number(Variable/@power)"
      frequency   = "number(Variable/@frequency)"
      ok          = "Mode != 'error'"
```
Output:
```text
sensors,host=Hugin,name=Facility\ A consumers=3i,frequency=49.78,ok=true,power=123.4,temperature=20 1596294243000000000
sensors,host=Hugin,name=Facility\ B consumers=1i,frequency=49.78,ok=true,power=14.3,temperature=23.1 1596294243000000000
sensors,host=Hugin,name=Facility\ C consumers=0i,frequency=49.78,ok=false,power=0.02,temperature=19.7 1596294243000000000
```
Using the `metric_selection` option we select all `Sensor` nodes in the XML
document. Please note that all field and tag definitions are relative to these
selected nodes. An exception is the timestamp definition which is relative to
the root node of the XML document.
### Batch field processing with multi-node selection
For XML documents containing metrics with a large number of fields or where the
fields are not known before (e.g. an unknown set of `Variable` nodes in the
_example.xml_), field selectors can be used. This example shows how to generate
a metric for each _Sensor_ in the example with fields derived from the
_Variable_ nodes.
Config:
```toml
[[inputs.file]]
  files = ["example.xml"]
  data_format = "xml"

  [[inputs.file.xpath]]
    metric_selection = "/Bus/child::Sensor"
    metric_name = "string('sensors')"

    timestamp = "/Gateway/Timestamp"
    timestamp_format = "2006-01-02T15:04:05Z"

    field_selection = "child::Variable"
    field_name = "name(@*[1])"
    field_value = "number(@*[1])"

    [inputs.file.xpath.tags]
      name = "substring-after(@name, ' ')"
```
Output:
```text
sensors,host=Hugin,name=Facility\ A consumers=3,frequency=49.78,power=123.4,temperature=20 1596294243000000000
sensors,host=Hugin,name=Facility\ B consumers=1,frequency=49.78,power=14.3,temperature=23.1 1596294243000000000
sensors,host=Hugin,name=Facility\ C consumers=0,frequency=49.78,power=0.02,temperature=19.7 1596294243000000000
```
Using the `metric_selection` option we select all `Sensor` nodes in the XML
document. For each _Sensor_ we then use `field_selection` to select all child
nodes of the sensor as _field-nodes_. Please note that the field selection is
relative to the selected nodes. For each selected _field-node_ we use
`field_name` and `field_value` to determine the field's name and value,
respectively. The `field_name` query derives the name from the first attribute
of the node, while `field_value` derives the value from the first attribute and
converts the result to a number.
[cbor]: https://cbor.io/
[json]: https://www.json.org/
[msgpack]: https://msgpack.org/
[protobuf]: https://developers.google.com/protocol-buffers
[time const]: https://golang.org/pkg/time/#pkg-constants
[time parse]: https://golang.org/pkg/time/#Parse
[xml]: https://www.w3.org/XML/
[xpath]: https://www.w3.org/TR/xpath/
[xpath lib]: https://github.com/antchfx/xpath
[xpath tester]: https://codebeautify.org/Xpath-Tester
[xpather]: http://xpather.com/

```go
package xpath

import (
	"reflect"
	"strconv"
	"strings"

	path "github.com/antchfx/xpath"
	"github.com/srebhan/cborquery"
)

type cborDocument struct{}

func (*cborDocument) Parse(buf []byte) (dataNode, error) {
	return cborquery.Parse(strings.NewReader(string(buf)))
}

func (*cborDocument) QueryAll(node dataNode, expr string) ([]dataNode, error) {
	// If this panics it's a programming error as we changed the document type while processing
	native, err := cborquery.QueryAll(node.(*cborquery.Node), expr)
	if err != nil {
		return nil, err
	}

	nodes := make([]dataNode, 0, len(native))
	for _, n := range native {
		nodes = append(nodes, n)
	}
	return nodes, nil
}

func (*cborDocument) CreateXPathNavigator(node dataNode) path.NodeNavigator {
	// If this panics it's a programming error as we changed the document type while processing
	return cborquery.CreateXPathNavigator(node.(*cborquery.Node))
}

func (d *cborDocument) GetNodePath(node, relativeTo dataNode, sep string) string {
	names := make([]string, 0)

	// If these panic it's a programming error as we changed the document type while processing
	nativeNode := node.(*cborquery.Node)
	nativeRelativeTo := relativeTo.(*cborquery.Node)

	// Climb up the tree and collect the node names
	n := nativeNode.Parent
	for n != nil && n != nativeRelativeTo {
		nodeName := d.GetNodeName(n, sep, false)
		names = append(names, nodeName)
		n = n.Parent
	}

	if len(names) < 1 {
		return ""
	}

	// Construct the nodes
	nodepath := ""
	for _, name := range names {
		nodepath = name + sep + nodepath
	}

	return nodepath[:len(nodepath)-1]
}

func (d *cborDocument) GetNodeName(node dataNode, sep string, withParent bool) string {
	// If this panics it's a programming error as we changed the document type while processing
	nativeNode := node.(*cborquery.Node)

	name := nativeNode.Name

	// Check if the node is part of an array. If so, determine the index and
	// concatenate the parent name and the index.
	kind := reflect.Invalid
	if nativeNode.Parent != nil && nativeNode.Parent.Value() != nil {
		kind = reflect.TypeOf(nativeNode.Parent.Value()).Kind()
	}
	switch kind {
	case reflect.Slice, reflect.Array:
		// Determine the index for array elements
		if name == "" && nativeNode.Parent != nil && withParent {
			name = nativeNode.Parent.Name + sep
		}
		return name + d.index(nativeNode)
	}

	return name
}

func (*cborDocument) OutputXML(node dataNode) string {
	native := node.(*cborquery.Node)
	return native.OutputXML()
}

func (*cborDocument) index(node *cborquery.Node) string {
	idx := 0
	for n := node; n.PrevSibling != nil; n = n.PrevSibling {
		idx++
	}
	return strconv.Itoa(idx)
}
```

```go
package xpath

import (
	"reflect"
	"strconv"
	"strings"

	"github.com/antchfx/jsonquery"
	path "github.com/antchfx/xpath"
)

type jsonDocument struct{}

func (*jsonDocument) Parse(buf []byte) (dataNode, error) {
	return jsonquery.Parse(strings.NewReader(string(buf)))
}

func (*jsonDocument) QueryAll(node dataNode, expr string) ([]dataNode, error) {
	// If this panics it's a programming error as we changed the document type while processing
	native, err := jsonquery.QueryAll(node.(*jsonquery.Node), expr)
	if err != nil {
		return nil, err
	}

	nodes := make([]dataNode, 0, len(native))
	for _, n := range native {
		nodes = append(nodes, n)
	}
	return nodes, nil
}

func (*jsonDocument) CreateXPathNavigator(node dataNode) path.NodeNavigator {
	// If this panics it's a programming error as we changed the document type while processing
	return jsonquery.CreateXPathNavigator(node.(*jsonquery.Node))
}

func (d *jsonDocument) GetNodePath(node, relativeTo dataNode, sep string) string {
	names := make([]string, 0)

	// If these panic it's a programming error as we changed the document type while processing
	nativeNode := node.(*jsonquery.Node)
	nativeRelativeTo := relativeTo.(*jsonquery.Node)

	// Climb up the tree and collect the node names
	n := nativeNode.Parent
	for n != nil && n != nativeRelativeTo {
		nodeName := d.GetNodeName(n, sep, false)
		names = append(names, nodeName)
		n = n.Parent
	}

	if len(names) < 1 {
		return ""
	}

	// Construct the nodes
	nodepath := ""
	for _, name := range names {
		nodepath = name + sep + nodepath
	}

	return nodepath[:len(nodepath)-1]
}

func (d *jsonDocument) GetNodeName(node dataNode, sep string, withParent bool) string {
	// If this panics it's a programming error as we changed the document type while processing
	nativeNode := node.(*jsonquery.Node)

	name := nativeNode.Data

	// Check if the node is part of an array. If so, determine the index and
	// concatenate the parent name and the index.
	kind := reflect.Invalid
	if nativeNode.Parent != nil && nativeNode.Parent.Value() != nil {
		kind = reflect.TypeOf(nativeNode.Parent.Value()).Kind()
	}
	switch kind {
	case reflect.Slice, reflect.Array:
		// Determine the index for array elements
		if name == "" && nativeNode.Parent != nil && withParent {
			name = nativeNode.Parent.Data + sep
		}
		return name + d.index(nativeNode)
	}

	return name
}

func (*jsonDocument) OutputXML(node dataNode) string {
	native := node.(*jsonquery.Node)
	return native.OutputXML()
}

func (*jsonDocument) index(node *jsonquery.Node) string {
	idx := 0
	for n := node; n.PrevSibling != nil; n = n.PrevSibling {
		idx++
	}
	return strconv.Itoa(idx)
}
```

```go
package xpath

import (
	"bytes"
	"fmt"

	"github.com/antchfx/jsonquery"
	path "github.com/antchfx/xpath"
	"github.com/tinylib/msgp/msgp"
)

type msgpackDocument jsonDocument

func (*msgpackDocument) Parse(buf []byte) (dataNode, error) {
	var json bytes.Buffer

	// Unmarshal the message-pack binary message to JSON and proceed with the jsonquery class
	if _, err := msgp.UnmarshalAsJSON(&json, buf); err != nil {
		return nil, fmt.Errorf("unmarshalling to json failed: %w", err)
	}
	return jsonquery.Parse(&json)
}

func (d *msgpackDocument) QueryAll(node dataNode, expr string) ([]dataNode, error) {
	return (*jsonDocument)(d).QueryAll(node, expr)
}

func (d *msgpackDocument) CreateXPathNavigator(node dataNode) path.NodeNavigator {
	return (*jsonDocument)(d).CreateXPathNavigator(node)
}

func (d *msgpackDocument) GetNodePath(node, relativeTo dataNode, sep string) string {
	return (*jsonDocument)(d).GetNodePath(node, relativeTo, sep)
}

func (d *msgpackDocument) GetNodeName(node dataNode, sep string, withParent bool) string {
	return (*jsonDocument)(d).GetNodeName(node, sep, withParent)
}

func (d *msgpackDocument) OutputXML(node dataNode) string {
	return (*jsonDocument)(d).OutputXML(node)
}
```

```go
package xpath

import (
	"encoding/base64"
	"encoding/hex"
	"errors"
	"fmt"
	"reflect"
	"slices"
	"strconv"
	"strings"
	"time"

	"github.com/antchfx/jsonquery"
	path "github.com/antchfx/xpath"
	"github.com/srebhan/cborquery"
	"github.com/srebhan/protobufquery"

	"github.com/influxdata/telegraf"
	"github.com/influxdata/telegraf/config"
	"github.com/influxdata/telegraf/filter"
	"github.com/influxdata/telegraf/internal"
	"github.com/influxdata/telegraf/metric"
	"github.com/influxdata/telegraf/plugins/parsers"
)

type dataNode interface{}

type dataDocument interface {
	Parse(buf []byte) (dataNode, error)
	QueryAll(node dataNode, expr string) ([]dataNode, error)
	CreateXPathNavigator(node dataNode) path.NodeNavigator
	GetNodePath(node, relativeTo dataNode, sep string) string
	GetNodeName(node dataNode, sep string, withParent bool) string
	OutputXML(node dataNode) string
}

type Parser struct {
	Format               string            `toml:"-"`
	ProtobufMessageFiles []string          `toml:"xpath_protobuf_files"`
	ProtobufMessageDef   string            `toml:"xpath_protobuf_file" deprecated:"1.32.0;1.40.0;use 'xpath_protobuf_files' instead"`
	ProtobufMessageType  string            `toml:"xpath_protobuf_type"`
	ProtobufImportPaths  []string          `toml:"xpath_protobuf_import_paths"`
	ProtobufSkipBytes    int64             `toml:"xpath_protobuf_skip_bytes"`
	PrintDocument        bool              `toml:"xpath_print_document"`
	AllowEmptySelection  bool              `toml:"xpath_allow_empty_selection"`
	NativeTypes          bool              `toml:"xpath_native_types"`
	Trace                bool              `toml:"xpath_trace" deprecated:"1.35.0;use 'log_level' 'trace' instead"`
	Configs              []Config          `toml:"xpath"`
	DefaultMetricName    string            `toml:"-"`
	DefaultTags          map[string]string `toml:"-"`
	Log                  telegraf.Logger   `toml:"-"`

	// Required for backward compatibility
	ConfigsXML     []Config `toml:"xml" deprecated:"1.23.1;1.35.0;use 'xpath' instead"`
	ConfigsJSON    []Config `toml:"xpath_json" deprecated:"1.23.1;1.35.0;use 'xpath' instead"`
	ConfigsMsgPack []Config `toml:"xpath_msgpack" deprecated:"1.23.1;1.35.0;use 'xpath' instead"`
	ConfigsProto   []Config `toml:"xpath_protobuf" deprecated:"1.23.1;1.35.0;use 'xpath' instead"`

	document dataDocument
}

type Config struct {
	MetricQuery  string            `toml:"metric_name"`
	Selection    string            `toml:"metric_selection"`
	Timestamp    string            `toml:"timestamp"`
	TimestampFmt string            `toml:"timestamp_format"`
	Timezone     string            `toml:"timezone"`
	Tags         map[string]string `toml:"tags"`
	Fields       map[string]string `toml:"fields"`
	FieldsInt    map[string]string `toml:"fields_int"`
	FieldsHex    []string          `toml:"fields_bytes_as_hex"`
	FieldsBase64 []string          `toml:"fields_bytes_as_base64"`

	FieldSelection  string `toml:"field_selection"`
	FieldNameQuery  string `toml:"field_name"`
	FieldValueQuery string `toml:"field_value"`
	FieldNameExpand bool   `toml:"field_name_expansion"`

	TagSelection  string `toml:"tag_selection"`
	TagNameQuery  string `toml:"tag_name"`
	TagValueQuery string `toml:"tag_value"`
	TagNameExpand bool   `toml:"tag_name_expansion"`

	FieldsHexFilter    filter.Filter
	FieldsBase64Filter filter.Filter
	Location           *time.Location
}

func (p *Parser) Init() error {
	switch p.Format {
	case "", "xml":
		p.document = &xmlDocument{}

		// Required for backward compatibility
		if len(p.ConfigsXML) > 0 {
			p.Configs = append(p.Configs, p.ConfigsXML...)
			config.PrintOptionDeprecationNotice("parsers.xpath", "xml", telegraf.DeprecationInfo{
				Since:     "1.23.1",
				RemovalIn: "1.35.0",
				Notice:    "use 'xpath' instead",
			})
		}
	case "xpath_cbor":
		p.document = &cborDocument{}
	case "xpath_json":
		p.document = &jsonDocument{}

		// Required for backward compatibility
		if len(p.ConfigsJSON) > 0 {
			p.Configs = append(p.Configs, p.ConfigsJSON...)
			config.PrintOptionDeprecationNotice("parsers.xpath", "xpath_json", telegraf.DeprecationInfo{
				Since:     "1.23.1",
				RemovalIn: "1.35.0",
				Notice:    "use 'xpath' instead",
			})
		}
	case "xpath_msgpack":
		p.document = &msgpackDocument{}

		// Required for backward compatibility
		if len(p.ConfigsMsgPack) > 0 {
			p.Configs = append(p.Configs, p.ConfigsMsgPack...)
```
config.PrintOptionDeprecationNotice("parsers.xpath", "xpath_msgpack", telegraf.DeprecationInfo{
Since: "1.23.1",
RemovalIn: "1.35.0",
Notice: "use 'xpath' instead",
})
}
case "xpath_protobuf":
if p.ProtobufMessageDef != "" && !slices.Contains(p.ProtobufMessageFiles, p.ProtobufMessageDef) {
p.ProtobufMessageFiles = append(p.ProtobufMessageFiles, p.ProtobufMessageDef)
}
pbdoc := protobufDocument{
MessageFiles: p.ProtobufMessageFiles,
MessageType: p.ProtobufMessageType,
ImportPaths: p.ProtobufImportPaths,
SkipBytes: p.ProtobufSkipBytes,
Log: p.Log,
}
if err := pbdoc.Init(); err != nil {
return err
}
p.document = &pbdoc
// Required for backward compatibility
if len(p.ConfigsProto) > 0 {
p.Configs = append(p.Configs, p.ConfigsProto...)
config.PrintOptionDeprecationNotice("parsers.xpath", "xpath_protobuf", telegraf.DeprecationInfo{
Since: "1.23.1",
RemovalIn: "1.35.0",
Notice: "use 'xpath' instead",
})
}
default:
return fmt.Errorf("unknown data-format %q for xpath parser", p.Format)
}
// Make sure we do have a metric name
if p.DefaultMetricName == "" {
return errors.New("missing default metric name")
}
// Update the configs with default values
for i, cfg := range p.Configs {
if cfg.Selection == "" {
cfg.Selection = "/"
}
if cfg.TimestampFmt == "" {
cfg.TimestampFmt = "unix"
}
if cfg.Timezone == "" {
cfg.Location = time.UTC
} else {
loc, err := time.LoadLocation(cfg.Timezone)
if err != nil {
return fmt.Errorf("invalid location in config %d: %w", i+1, err)
}
cfg.Location = loc
}
f, err := filter.Compile(cfg.FieldsHex)
if err != nil {
return fmt.Errorf("creating hex-fields filter failed: %w", err)
}
cfg.FieldsHexFilter = f
bf, err := filter.Compile(cfg.FieldsBase64)
if err != nil {
return fmt.Errorf("creating base64-fields filter failed: %w", err)
}
cfg.FieldsBase64Filter = bf
p.Configs[i] = cfg
}
return nil
}
func (p *Parser) Parse(buf []byte) ([]telegraf.Metric, error) {
t := time.Now()
// Parse the XML
doc, err := p.document.Parse(buf)
if err != nil {
return nil, err
}
if p.PrintDocument {
p.Log.Debugf("XML document equivalent: %q", p.document.OutputXML(doc))
}
// Queries
metrics := make([]telegraf.Metric, 0)
p.Log.Debugf("Number of configs: %d", len(p.Configs))
for _, cfg := range p.Configs {
selectedNodes, err := p.document.QueryAll(doc, cfg.Selection)
if err != nil {
return nil, err
}
if (len(selectedNodes) < 1 || selectedNodes[0] == nil) && !p.AllowEmptySelection {
p.debugEmptyQuery("metric selection", doc, cfg.Selection)
return metrics, errors.New("cannot parse with empty selection node")
}
p.Log.Debugf("Number of selected metric nodes: %d", len(selectedNodes))
for _, selected := range selectedNodes {
m, err := p.parseQuery(t, doc, selected, cfg)
if err != nil {
return metrics, err
}
metrics = append(metrics, m)
}
}
return metrics, nil
}
func (p *Parser) ParseLine(line string) (telegraf.Metric, error) {
metrics, err := p.Parse([]byte(line))
if err != nil {
return nil, err
}
switch len(metrics) {
case 0:
return nil, nil
case 1:
return metrics[0], nil
default:
return metrics[0], fmt.Errorf("cannot parse line with multiple (%d) metrics", len(metrics))
}
}
func (p *Parser) SetDefaultTags(tags map[string]string) {
p.DefaultTags = tags
}
func (p *Parser) parseQuery(starttime time.Time, doc, selected dataNode, cfg Config) (telegraf.Metric, error) {
var timestamp time.Time
var metricname string
// Determine the metric name. If a query was specified, use the result of this query and the default metric name
// otherwise.
metricname = p.DefaultMetricName
if len(cfg.MetricQuery) > 0 {
v, err := p.executeQuery(doc, selected, cfg.MetricQuery)
if err != nil {
return nil, fmt.Errorf("failed to query metric name: %w", err)
}
var ok bool
if metricname, ok = v.(string); !ok {
if v == nil {
p.Log.Infof("Hint: Empty metric-name-node. If you wanted to set a constant please use `metric_name = \"'name'\"`.")
}
return nil, fmt.Errorf("failed to query metric name: query result is of type %T not 'string'", v)
}
}
// By default take the time the parser was invoked and override the value
// with the queried timestamp if an expression was specified.
timestamp = starttime
if len(cfg.Timestamp) > 0 {
v, err := p.executeQuery(doc, selected, cfg.Timestamp)
if err != nil {
return nil, fmt.Errorf("failed to query timestamp: %w", err)
}
if v != nil {
timestamp, err = internal.ParseTimestamp(cfg.TimestampFmt, v, cfg.Location)
if err != nil {
return nil, fmt.Errorf("failed to parse timestamp: %w", err)
}
}
}
// Query tags and add default ones
tags := make(map[string]string)
// Handle the tag batch definitions if any.
if len(cfg.TagSelection) > 0 {
tagnamequery := "name()"
tagvaluequery := "."
if len(cfg.TagNameQuery) > 0 {
tagnamequery = cfg.TagNameQuery
}
if len(cfg.TagValueQuery) > 0 {
tagvaluequery = cfg.TagValueQuery
}
// Query all tags
selectedTagNodes, err := p.document.QueryAll(selected, cfg.TagSelection)
if err != nil {
return nil, err
}
p.Log.Debugf("Number of selected tag nodes: %d", len(selectedTagNodes))
if len(selectedTagNodes) > 0 && selectedTagNodes[0] != nil {
for _, selectedtag := range selectedTagNodes {
n, err := p.executeQuery(doc, selectedtag, tagnamequery)
if err != nil {
return nil, fmt.Errorf("failed to query tag name with query %q: %w", tagnamequery, err)
}
name, ok := n.(string)
if !ok {
return nil, fmt.Errorf("failed to query tag name with query %q: result is not a string (%v)", tagnamequery, n)
}
name = p.constructFieldName(selected, selectedtag, name, cfg.TagNameExpand)
v, err := p.executeQuery(doc, selectedtag, tagvaluequery)
if err != nil {
return nil, fmt.Errorf("failed to query tag value for %q: %w", name, err)
}
// Check if field name already exists and if so, append an index number.
if _, ok := tags[name]; ok {
for i := 1; ; i++ {
p := name + "_" + strconv.Itoa(i)
if _, ok := tags[p]; !ok {
name = p
break
}
}
}
// Convert the tag to be a string
s, err := internal.ToString(v)
if err != nil {
return nil, fmt.Errorf("failed to query tag value for %q: result is not a string (%v)", name, v)
}
tags[name] = s
}
} else {
p.debugEmptyQuery("tag selection", selected, cfg.TagSelection)
}
}
// Handle explicitly defined tags
for name, query := range cfg.Tags {
// Execute the query and cast the returned values into strings
v, err := p.executeQuery(doc, selected, query)
if err != nil {
return nil, fmt.Errorf("failed to query tag %q: %w", name, err)
}
switch v := v.(type) {
case string:
tags[name] = v
case bool:
tags[name] = strconv.FormatBool(v)
case float64:
tags[name] = strconv.FormatFloat(v, 'G', -1, 64)
case nil:
continue
default:
return nil, fmt.Errorf("unknown format '%T' for tag %q", v, name)
}
}
// Add default tags
for name, v := range p.DefaultTags {
tags[name] = v
}
// Query fields
fields := make(map[string]interface{})
// Handle the field batch definitions if any.
if len(cfg.FieldSelection) > 0 {
fieldnamequery := "name()"
fieldvaluequery := "."
if len(cfg.FieldNameQuery) > 0 {
fieldnamequery = cfg.FieldNameQuery
}
if len(cfg.FieldValueQuery) > 0 {
fieldvaluequery = cfg.FieldValueQuery
}
// Query all fields
selectedFieldNodes, err := p.document.QueryAll(selected, cfg.FieldSelection)
if err != nil {
return nil, err
}
p.Log.Debugf("Number of selected field nodes: %d", len(selectedFieldNodes))
if len(selectedFieldNodes) > 0 && selectedFieldNodes[0] != nil {
for _, selectedfield := range selectedFieldNodes {
n, err := p.executeQuery(doc, selectedfield, fieldnamequery)
if err != nil {
return nil, fmt.Errorf("failed to query field name with query %q: %w", fieldnamequery, err)
}
name, ok := n.(string)
if !ok {
return nil, fmt.Errorf("failed to query field name with query %q: result is not a string (%v)", fieldnamequery, n)
}
name = p.constructFieldName(selected, selectedfield, name, cfg.FieldNameExpand)
v, err := p.executeQuery(doc, selectedfield, fieldvaluequery)
if err != nil {
return nil, fmt.Errorf("failed to query field value for %q: %w", name, err)
}
// Check if field name already exists and if so, append an index number.
if _, ok := fields[name]; ok {
for i := 1; ; i++ {
p := name + "_" + strconv.Itoa(i)
if _, ok := fields[p]; !ok {
name = p
break
}
}
}
// Handle complex types which would be dropped otherwise for
// native type handling
if v != nil {
switch reflect.TypeOf(v).Kind() {
case reflect.Array, reflect.Slice, reflect.Map:
if b, ok := v.([]byte); ok {
if cfg.FieldsHexFilter != nil && cfg.FieldsHexFilter.Match(name) {
v = hex.EncodeToString(b)
}
if cfg.FieldsBase64Filter != nil && cfg.FieldsBase64Filter.Match(name) {
v = base64.StdEncoding.EncodeToString(b)
}
} else {
v = fmt.Sprintf("%v", v)
}
}
}
fields[name] = v
}
} else {
p.debugEmptyQuery("field selection", selected, cfg.FieldSelection)
}
}
// Handle explicitly defined fields
for name, query := range cfg.FieldsInt {
// Execute the query and cast the returned values into integers
v, err := p.executeQuery(doc, selected, query)
if err != nil {
return nil, fmt.Errorf("failed to query field (int) %q: %w", name, err)
}
switch v := v.(type) {
case string:
fields[name], err = strconv.ParseInt(v, 10, 64)
if err != nil {
return nil, fmt.Errorf("failed to parse field (int) %q: %w", name, err)
}
case bool:
fields[name] = int64(0)
if v {
fields[name] = int64(1)
}
case float64:
fields[name] = int64(v)
case nil:
continue
default:
return nil, fmt.Errorf("unknown format '%T' for field (int) %q", v, name)
}
}
for name, query := range cfg.Fields {
// Execute the query and store the result in fields
v, err := p.executeQuery(doc, selected, query)
if err != nil {
return nil, fmt.Errorf("failed to query field %q: %w", name, err)
}
// Handle complex types which would be dropped otherwise for
// native type handling
if v != nil {
switch reflect.TypeOf(v).Kind() {
case reflect.Array, reflect.Slice, reflect.Map:
if b, ok := v.([]byte); ok {
if cfg.FieldsHexFilter != nil && cfg.FieldsHexFilter.Match(name) {
v = hex.EncodeToString(b)
}
if cfg.FieldsBase64Filter != nil && cfg.FieldsBase64Filter.Match(name) {
v = base64.StdEncoding.EncodeToString(b)
}
} else {
v = fmt.Sprintf("%v", v)
}
}
}
fields[name] = v
}
return metric.New(metricname, tags, fields, timestamp), nil
}
func (p *Parser) executeQuery(doc, selected dataNode, query string) (r interface{}, err error) {
// Check if the query is relative or absolute and set the root for the query
root := selected
if strings.HasPrefix(query, "/") {
root = doc
}
// Compile the query
expr, err := path.Compile(query)
if err != nil {
return nil, fmt.Errorf("failed to compile query %q: %w", query, err)
}
// Evaluate the compiled expression and handle returned node-iterators
// separately. Those iterators will be returned for queries directly
// referencing a node (value or attribute).
n := expr.Evaluate(p.document.CreateXPathNavigator(root))
iter, ok := n.(*path.NodeIterator)
if !ok {
return n, nil
}
// We got an iterator, so take the first match and get the referenced
// property. This will always be a string.
if iter.MoveNext() {
current := iter.Current()
// If the dataformat supports native types and if support is
// enabled, we should return the native type of the data
if p.NativeTypes {
switch nn := current.(type) {
case *cborquery.NodeNavigator:
return nn.GetValue(), nil
case *jsonquery.NodeNavigator:
return nn.GetValue(), nil
case *protobufquery.NodeNavigator:
return nn.GetValue(), nil
}
}
return iter.Current().Value(), nil
}
return nil, nil
}
func splitLastPathElement(query string) []string {
// This is a rudimentary xpath-parser that splits the path
// into the last path element and the remaining path-part.
// The last path element is then further split into
// parts such as attributes or selectors. Each returned
// element is a full path!
// Nothing left
if query == "" || query == "/" || query == "//" || query == "." {
return nil
}
separatorIdx := strings.LastIndex(query, "/")
if separatorIdx < 0 {
query = "./" + query
separatorIdx = 1
}
// For double slash we want to split at the first slash
if separatorIdx > 0 && query[separatorIdx-1] == byte('/') {
separatorIdx--
}
base := query[:separatorIdx]
if base == "" {
base = "/"
}
elements := make([]string, 0, 3)
elements = append(elements, base)
offset := separatorIdx
if i := strings.Index(query[offset:], "::"); i >= 0 {
// Check for axis operator
offset += i
elements = append(elements, query[:offset]+"::*")
}
if i := strings.Index(query[offset:], "["); i >= 0 {
// Check for predicates
offset += i
elements = append(elements, query[:offset])
} else if i := strings.Index(query[offset:], "@"); i >= 0 {
// Check for attributes
offset += i
elements = append(elements, query[:offset])
}
return elements
}
func (p *Parser) constructFieldName(root, node dataNode, name string, expand bool) string {
var expansion string
// In case the name is empty we should determine the current node's name.
// This involves array index expansion in case the parent of the node is
// and array. If we expanded here, we should skip our parent as this is
// already encoded in the name
if name == "" {
name = p.document.GetNodeName(node, "_", !expand)
}
// If name expansion is requested, construct a path between the current
// node and the root node of the selection. Concatenate the elements with
// an underscore.
if expand {
expansion = p.document.GetNodePath(node, root, "_")
}
if len(expansion) > 0 {
name = expansion + "_" + name
}
return name
}
func (p *Parser) debugEmptyQuery(operation string, root dataNode, initialquery string) {
if p.Log == nil || (!p.Log.Level().Includes(telegraf.Trace) && !p.Trace) { // for backward compatibility
return
}
query := initialquery
// We already know that the initial query returned no nodes, so successively
// strip the last path element to narrow down where the selection fails.
p.Log.Tracef("got 0 nodes for query %q in %s", query, operation)
for {
parts := splitLastPathElement(query)
if len(parts) < 1 {
return
}
for i := len(parts) - 1; i >= 0; i-- {
q := parts[i]
nodes, err := p.document.QueryAll(root, q)
if err != nil {
p.Log.Tracef("executing query %q in %s failed: %v", q, operation, err)
return
}
p.Log.Tracef("got %d nodes for query %q in %s", len(nodes), q, operation)
if len(nodes) > 0 && nodes[0] != nil {
return
}
query = parts[0]
}
}
}
func init() {
// Register all variants
parsers.Add("xml",
func(defaultMetricName string) telegraf.Parser {
return &Parser{
Format: "xml",
DefaultMetricName: defaultMetricName,
}
},
)
parsers.Add("xpath_cbor",
func(defaultMetricName string) telegraf.Parser {
return &Parser{
Format: "xpath_cbor",
DefaultMetricName: defaultMetricName,
}
},
)
parsers.Add("xpath_json",
func(defaultMetricName string) telegraf.Parser {
return &Parser{
Format: "xpath_json",
DefaultMetricName: defaultMetricName,
}
},
)
parsers.Add("xpath_msgpack",
func(defaultMetricName string) telegraf.Parser {
return &Parser{
Format: "xpath_msgpack",
DefaultMetricName: defaultMetricName,
}
},
)
parsers.Add("xpath_protobuf",
func(defaultMetricName string) telegraf.Parser {
return &Parser{
Format: "xpath_protobuf",
DefaultMetricName: defaultMetricName,
}
},
)
}
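The duplicate-name handling in `parseQuery` above (append `_1`, `_2`, ... until the key is free) is what produces field names like `phones_number_1` and `phones_number_2` in the expected outputs further down. A minimal standalone sketch, with the inline loop factored into a hypothetical `uniqueName` helper:

```go
package main

import (
	"fmt"
	"strconv"
)

// uniqueName mirrors the collision handling in parseQuery: when a queried
// tag or field name already exists, append "_1", "_2", ... until a free
// key is found. The parser inlines this logic instead of naming it.
func uniqueName(existing map[string]struct{}, name string) string {
	if _, ok := existing[name]; !ok {
		return name
	}
	for i := 1; ; i++ {
		candidate := name + "_" + strconv.Itoa(i)
		if _, ok := existing[candidate]; !ok {
			return candidate
		}
	}
}

func main() {
	seen := map[string]struct{}{"number": {}, "number_1": {}}
	fmt.Println(uniqueName(seen, "number")) // number_2
	fmt.Println(uniqueName(seen, "type"))   // type
}
```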

File diff suppressed because it is too large

View file

@ -0,0 +1,214 @@
package xpath
import (
"context"
"encoding/hex"
"errors"
"fmt"
"reflect"
"sort"
"strconv"
"strings"
path "github.com/antchfx/xpath"
"github.com/bufbuild/protocompile"
"github.com/srebhan/protobufquery"
"google.golang.org/protobuf/encoding/protowire"
"google.golang.org/protobuf/proto"
"google.golang.org/protobuf/reflect/protoreflect"
"google.golang.org/protobuf/reflect/protoregistry"
"google.golang.org/protobuf/types/dynamicpb"
"github.com/influxdata/telegraf"
)
type protobufDocument struct {
MessageFiles []string
MessageType string
ImportPaths []string
SkipBytes int64
Log telegraf.Logger
msg *dynamicpb.Message
unmarshaller proto.UnmarshalOptions
}
func (d *protobufDocument) Init() error {
// Check the message definition and type
if len(d.MessageFiles) == 0 {
return errors.New("protocol-buffer files not set")
}
if d.MessageType == "" {
return errors.New("protocol-buffer message-type not set")
}
// Load the file descriptors from the given protocol-buffer definition
ctx := context.Background()
resolver := &protocompile.SourceResolver{ImportPaths: d.ImportPaths}
compiler := &protocompile.Compiler{
Resolver: protocompile.WithStandardImports(resolver),
}
files, err := compiler.Compile(ctx, d.MessageFiles...)
if err != nil {
return fmt.Errorf("parsing protocol-buffer definition failed: %w", err)
}
if len(files) < 1 {
return errors.New("files do not contain a file descriptor")
}
// Register all definitions in the file in the global registry
var registry protoregistry.Files
for _, f := range files {
if err := registry.RegisterFile(f); err != nil {
return fmt.Errorf("adding file %q to registry failed: %w", f.Path(), err)
}
}
d.unmarshaller = proto.UnmarshalOptions{
RecursionLimit: protowire.DefaultRecursionLimit,
Resolver: dynamicpb.NewTypes(&registry),
}
// Lookup given type in the loaded file descriptors
msgFullName := protoreflect.FullName(d.MessageType)
descriptor, err := registry.FindDescriptorByName(msgFullName)
if err != nil {
d.Log.Infof("Could not find %q... Known messages:", msgFullName)
var known []string
registry.RangeFiles(func(fd protoreflect.FileDescriptor) bool {
name := strings.TrimSpace(string(fd.FullName()))
if name != "" {
known = append(known, name)
}
return true
})
sort.Strings(known)
for _, name := range known {
d.Log.Infof(" %s", name)
}
return err
}
// Get a prototypical message for later use
msgDesc, ok := descriptor.(protoreflect.MessageDescriptor)
if !ok {
return fmt.Errorf("%q is not a message descriptor (%T)", msgFullName, descriptor)
}
d.msg = dynamicpb.NewMessage(msgDesc)
if d.msg == nil {
return fmt.Errorf("creating message template for %q failed", msgDesc.FullName())
}
return nil
}
func (d *protobufDocument) Parse(buf []byte) (dataNode, error) {
msg := d.msg.New()
// Unmarshal the received buffer
if err := d.unmarshaller.Unmarshal(buf[d.SkipBytes:], msg.Interface()); err != nil {
hexbuf := hex.EncodeToString(buf)
d.Log.Debugf("raw data (hex): %q (skip %d bytes)", hexbuf, d.SkipBytes)
return nil, err
}
return protobufquery.Parse(msg)
}
func (*protobufDocument) QueryAll(node dataNode, expr string) ([]dataNode, error) {
// If this panics it's a programming error as we changed the document type while processing
native, err := protobufquery.QueryAll(node.(*protobufquery.Node), expr)
if err != nil {
return nil, err
}
nodes := make([]dataNode, 0, len(native))
for _, n := range native {
nodes = append(nodes, n)
}
return nodes, nil
}
func (*protobufDocument) CreateXPathNavigator(node dataNode) path.NodeNavigator {
// If this panics it's a programming error as we changed the document type while processing
return protobufquery.CreateXPathNavigator(node.(*protobufquery.Node))
}
func (d *protobufDocument) GetNodePath(node, relativeTo dataNode, sep string) string {
names := make([]string, 0)
// If these panic it's a programming error as we changed the document type while processing
nativeNode := node.(*protobufquery.Node)
nativeRelativeTo := relativeTo.(*protobufquery.Node)
// Climb up the tree and collect the node names
n := nativeNode.Parent
for n != nil && n != nativeRelativeTo {
kind := reflect.Invalid
if n.Parent != nil && n.Parent.Value() != nil {
kind = reflect.TypeOf(n.Parent.Value()).Kind()
}
switch kind {
case reflect.Slice, reflect.Array:
// Determine the index for array elements
names = append(names, d.index(n))
default:
// Use the name if not an array
names = append(names, n.Name)
}
n = n.Parent
}
if len(names) < 1 {
return ""
}
// Construct the nodes
nodepath := ""
for _, name := range names {
nodepath = name + sep + nodepath
}
return nodepath[:len(nodepath)-1]
}
func (d *protobufDocument) GetNodeName(node dataNode, sep string, withParent bool) string {
// If this panics it's a programming error as we changed the document type while processing
nativeNode := node.(*protobufquery.Node)
name := nativeNode.Name
// Check if the node is part of an array. If so, determine the index and
// concatenate the parent name and the index.
kind := reflect.Invalid
if nativeNode.Parent != nil && nativeNode.Parent.Value() != nil {
kind = reflect.TypeOf(nativeNode.Parent.Value()).Kind()
}
switch kind {
case reflect.Slice, reflect.Array:
if name == "" && nativeNode.Parent != nil && withParent {
name = nativeNode.Parent.Name + sep
}
return name + d.index(nativeNode)
}
return name
}
func (*protobufDocument) OutputXML(node dataNode) string {
native := node.(*protobufquery.Node)
return native.OutputXML()
}
func (*protobufDocument) index(node *protobufquery.Node) string {
idx := 0
for n := node; n.PrevSibling != nil; n = n.PrevSibling {
idx++
}
return strconv.Itoa(idx)
}
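The `index` method above derives an element's array position by counting `PrevSibling` links back to the head of the list. A self-contained sketch of the same walk, using a stripped-down stand-in for `protobufquery.Node`:

```go
package main

import (
	"fmt"
	"strconv"
)

// node is a minimal stand-in for protobufquery.Node; only the PrevSibling
// link matters for index computation.
type node struct{ PrevSibling *node }

// index mirrors protobufDocument.index: an element's array index is the
// number of siblings preceding it.
func index(n *node) string {
	idx := 0
	for ; n.PrevSibling != nil; n = n.PrevSibling {
		idx++
	}
	return strconv.Itoa(idx)
}

func main() {
	a := &node{}
	b := &node{PrevSibling: a}
	c := &node{PrevSibling: b}
	fmt.Println(index(a), index(b), index(c)) // 0 1 2
}
```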

View file

@ -0,0 +1,28 @@
# Example for parsing an example protocol buffer data.
#
# File:
# testcases/addressbook.dat xpath_protobuf
#
# Protobuf:
# testcases/protos/addressbook.proto addressbook.AddressBook
#
# Expected Output:
# addresses,id=101,name=John\ Doe age=42i,email="john@example.com" 1621430181000000000
# addresses,id=102,name=Jane\ Doe age=40i 1621430181000000000
# addresses,id=201,name=Jack\ Doe age=12i,email="jack@example.com" 1621430181000000000
# addresses,id=301,name=Jack\ Buck age=19i,email="buck@example.com" 1621430181000000000
# addresses,id=1001,name=Janet\ Doe age=16i,email="janet@example.com" 1621430181000000000
#
metric_name = "'addresses'"
metric_selection = "//people"
[tags]
id = "id"
name = "name"
[fields_int]
age = "age"
[fields]
email = "email"

View file

@ -0,0 +1,17 @@
John Doeejohn@example.com *

Jane Doef (
3
Jack DoeÉjack@example.com *
555-555-5555
V
Jack Buck­buck@example.com *
555-555-0000*
555-555-0001*
555-555-0002
E
Janet Doeéjanet@example.com *
555-777-0000*
555-777-0001homeprivatefriends

View file

@ -0,0 +1 @@
¢fpeople…¤dnamehJohn Doebideeemailpjohn@example.comcage*£dnamehJane Doebidfcage(¥dnamehJack DoebidÉeemailpjack@example.comcage fphones<65>¢fnumberl555-555-5555dtype¥dnameiJack Buckbid-eemailpbuck@example.comcagefphonesƒ¢fnumberl555-555-0000dtype¡fnumberl555-555-0001¢fnumberl555-555-0002dtype¥dnameiJanet Doebidéeemailqjanet@example.comcagefphones¡fnumberl555-777-0000¢fnumberl555-777-0001dtypedtagsƒdhomegprivategfriends

View file

@ -0,0 +1,5 @@
addresses age="42",email="john@example.com",id="101",name="John Doe"
addresses age="40",id="102",name="Jane Doe"
addresses age="12",email="jack@example.com",id="201",name="Jack Doe",phones_number="555-555-5555",phones_type="2"
addresses age="19",email="buck@example.com",id="301",name="Jack Buck",phones_number="555-555-0000",phones_number_1="555-555-0001",phones_number_2="555-555-0002",phones_type="1",phones_type_1="2"
addresses age="16",email="janet@example.com",id="1001",name="Janet Doe",phones_number="555-777-0000",phones_number_1="555-777-0001",phones_type="1"

View file

@ -0,0 +1,9 @@
[[inputs.file]]
files = ["./testcases/cbor/addressbook.bin"]
data_format = "xpath_cbor"
[[inputs.file.xpath]]
metric_name = "'addresses'"
metric_selection = "//people"
field_selection = "descendant::*[not(*)]"
field_name_expansion = true

View file

@ -0,0 +1 @@
data str_a="this is a test",str_b="foobar",bytes_a="0001020304050607",fielda="AAECAwQFBgc=",bytes_b="Zm9vYmFy",timestamp=1687852514u 1687852514000000000

View file

@ -0,0 +1,28 @@
# Example data:
# [
# {
# "str_a": bytearray("this is a test"),
# "str_b": bytearray("foobar"),
# "bytes_a": bytearray([0, 1, 2, 3, 4, 5, 6, 7]),
# "bytes_b": bytearray("foobar"),
# "timestamp": 1687852514
# }
# ]
[[inputs.file]]
files = ["./testcases/cbor_base64_encoding/data.bin"]
data_format = "xpath_cbor"
xpath_native_types = true
[[inputs.file.xpath]]
metric_name = "'data'"
metric_selection = "child::*"
timestamp = "timestamp"
timestamp_format = "unix"
field_selection = "child::*"
fields_bytes_as_base64 = ["bytes_b", "field*"]
fields_bytes_as_hex = ["bytes_a"]
[inputs.file.xpath.fields]
fielda = "bytes_a"

View file

@ -0,0 +1,2 @@
benchmark,source=myhost,tags_platform=python,tags_sdkver=3.11.5 value=5.0 1653643421000000000
benchmark,source=myhost,tags_platform=python,tags_sdkver=3.11.4 value=4.0 1653643421000000000

View file

@ -0,0 +1,20 @@
[[inputs.file]]
files = ["./testcases/cbor_benchmark/message.bin"]
data_format = "xpath_cbor"
xpath_native_types = true
[[inputs.file.xpath]]
metric_name = "'benchmark'"
metric_selection = "//data"
timestamp = "timestamp"
timestamp_format = "unix_ns"
[inputs.file.xpath.tags]
source = "source"
tags_sdkver = "tags_sdkver"
tags_platform = "tags_platform"
[inputs.file.xpath.fields]
value = "value"

View file

@ -0,0 +1 @@
data str_a="this is a test",str_b="foobar",bytes_a="0001020304050607",bytes_b="666f6f626172",timestamp=1687852514u 1687852514000000000

View file

@ -0,0 +1,24 @@
# Example data:
# [
# {
# "str_a": bytearray("this is a test"),
# "str_b": bytearray("foobar"),
# "bytes_a": bytearray([0, 1, 2, 3, 4, 5, 6, 7]),
# "bytes_b": bytearray("foobar"),
# "timestamp": 1687852514
# }
# ]
[[inputs.file]]
files = ["./testcases/cbor_hex_encoding/data.bin"]
data_format = "xpath_cbor"
xpath_native_types = true
[[inputs.file.xpath]]
metric_name = "'data'"
metric_selection = "child::*"
timestamp = "timestamp"
timestamp_format = "unix"
field_selection = "child::*"
fields_bytes_as_hex = ["bytes_*"]

View file

@ -0,0 +1 @@
data fieldc="this is a test",fielda="0001020304050607",fieldb="666f6f626172" 1687852514000000000

View file

@ -0,0 +1,29 @@
# Example data:
# [
# {
# "str_a": bytearray("this is a test"),
# "str_b": bytearray("foobar"),
# "bytes_a": bytearray([0, 1, 2, 3, 4, 5, 6, 7]),
# "bytes_b": bytearray("foobar"),
# "timestamp": 1687852514
# }
# ]
[[inputs.file]]
files = ["./testcases/cbor_hex_encoding/data.bin"]
data_format = "xpath_cbor"
xpath_native_types = true
[[inputs.file.xpath]]
metric_name = "'data'"
metric_selection = "child::*"
timestamp = "timestamp"
timestamp_format = "unix"
fields_bytes_as_hex = ["fielda", "fieldb"]
[inputs.file.xpath.fields]
fielda = "bytes_a"
fieldb = "bytes_b"
fieldc = "str_a"

View file

@ -0,0 +1 @@
data n258="002-2.1.x",n259="14ca85ed9",n260=1687787189711304960u,n261=true,n263=3u,n264=23.76,n265=68.934,n266=false 1687787189711304960

View file

@ -0,0 +1,26 @@
# Example data:
# [
# {
# 258: "002-2.1.x",
# 259: "14ca85ed9",
# 260: 1687787189711304960,
# 261: true,
# 263: 3,
# 264: 23.760,
# 265: 68.934,
# 266: false
# }
# ]
[[inputs.file]]
files = ["./testcases/cbor_numeric_keys/data.bin"]
data_format = "xpath_cbor"
xpath_native_types = true
[[inputs.file.xpath]]
metric_name = "'data'"
metric_selection = "child::*"
timestamp = "n260"
timestamp_format = "unix_ns"
field_selection = "child::*"

View file

@ -0,0 +1,44 @@
# Example for parsing QuakeML measurement data.
#
# File:
# testcases/earthquakes.quakeml
#
# Expected Output:
# earthquakes,agency=us,type=mww depth=13000,eventid="7000dg8x",lat=-37.6099,lon=179.6102,mag=6.3,station_count=33i 1614989782185000000
# earthquakes,agency=us,type=mww depth=17000,eventid="7000dft1",lat=-28.7146,lon=-176.5582,mag=6.3,station_count=15i 1614911436571000000
# earthquakes,agency=us,type=mww depth=26450,eventid="7000dflf",lat=-29.7347,lon=-177.2817,mag=8.1,station_count=81i 1614886112819000000
# earthquakes,agency=us,type=mb depth=10000,eventid="7000dfku",lat=39.7886,lon=22.1189,mag=5.8,station_count=279i 1614883099415000000
# earthquakes,agency=us,type=mww depth=53090,eventid="7000dfk3",lat=-29.6647,lon=-177.8343,mag=7.4,station_count=40i 1614879684425000000
# earthquakes,agency=us,type=mww depth=20780,eventid="7000dffl",lat=-37.5628,lon=179.4443,mag=7.3,station_count=33i 1614864456464000000
# earthquakes,agency=us,type=mww depth=10000,eventid="7000df40",lat=39.7641,lon=22.1756,mag=6.3,station_count=81i 1614766570197000000
# earthquakes,type=mww depth=42100,eventid="0212o88mof",lat=61.3286,lon=-149.9991,mag=5.3 1614452365398000000
# earthquakes,agency=us,type=mww depth=10000,eventid="6000dkmk",lat=63.9602,lon=-22.2736,mag=5.6,station_count=64i 1614161159873000000
# earthquakes,agency=NC,type=mw depth=6220,eventid="73526151",lat=37.0456667,lon=-121.4781667,mag=3.76,station_count=3i 1613957893840000000
# earthquakes,agency=US,type=mwr depth=7000,eventid="2021dmpg",lat=36.96366667,lon=-98.09383333,mag=4.2,station_count=39i 1613743017950000000
# earthquakes,agency=us,type=mww depth=5590,eventid="6000dhxn",lat=-17.8192,lon=167.5901,mag=6.2,station_count=24i 1613436564078000000
# earthquakes,agency=us,type=mww depth=49940,eventid="6000dher",lat=37.7453,lon=141.7494,mag=7.1,station_count=74i 1613225270397000000
# earthquakes,agency=us,type=mww depth=98950,eventid="6000dh48",lat=38.1314,lon=73.545,mag=5.9,station_count=34i 1613149295308000000
# earthquakes,agency=us,type=mww depth=10000,eventid="6000dg77",lat=-23.0508,lon=171.657,mag=7.7,station_count=54i 1612963195532000000
#
metric_selection = "//event"
metric_name = "string('earthquakes')"
# Pad the millisecond timestamp to nanosecond precision, as Go's
# RFC3339 parsing only supports second OR nanosecond resolution.
timestamp = "replace(normalize-space(origin/time), 'Z', '000000Z')"
timestamp_format = "2006-01-02T15:04:05.999999999Z"
[fields]
eventid = "@catalog:eventid"
lon = "number(origin/longitude/value)"
lat = "number(origin/latitude/value)"
depth = "number(origin/depth/value)"
mag = "number(magnitude/mag/value)"
[fields_int]
station_count = "magnitude/stationCount"
[tags]
agency = "magnitude/creationInfo/agencyID"
type = "magnitude/type"
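The `replace(..., 'Z', '000000Z')` padding above can be checked in plain Go; `toNano` is a hypothetical helper reproducing the padding plus the configured `timestamp_format` layout:

```go
package main

import (
	"fmt"
	"strings"
	"time"
)

// toNano pads a millisecond RFC3339 timestamp to nanosecond precision,
// mirroring the replace() expression in the config, then parses it with
// the layout given in timestamp_format.
func toNano(raw string) int64 {
	padded := strings.Replace(raw, "Z", "000000Z", 1)
	t, err := time.Parse("2006-01-02T15:04:05.999999999Z", padded)
	if err != nil {
		panic(err)
	}
	return t.UnixNano()
}

func main() {
	// First event time from the QuakeML testcase.
	fmt.Println(toNano("2021-03-06T00:16:22.185Z")) // 1614989782185000000
}
```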

View file

@ -0,0 +1,20 @@
<?xml version="1.0" encoding="UTF-8"?>
<q:quakeml xmlns="http://quakeml.org/xmlns/bed/1.2" xmlns:anss="http://anss.org/xmlns/event/0.1" xmlns:catalog="http://anss.org/xmlns/catalog/0.1" xmlns:q="http://quakeml.org/xmlns/quakeml/1.2">
<eventParameters publicID="quakeml:earthquake.usgs.gov/earthquakes/feed/v1.0/summary/significant_month.quakeml">
<event catalog:datasource="us" catalog:eventsource="us" catalog:eventid="7000dg8x" publicID="quakeml:earthquake.usgs.gov/earthquakes/feed/v1.0/detail/us7000dg8x.quakeml"><description><type>earthquake name</type><text>182 km NE of Gisborne, New Zealand</text></description><origin catalog:datasource="us" catalog:dataid="us7000dg8x" catalog:eventsource="us" catalog:eventid="7000dg8x" publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dg8x/us/1615258919040/product.xml"><time><value>2021-03-06T00:16:22.185Z</value></time><longitude><value>179.6102</value></longitude><latitude><value>-37.6099</value></latitude><depth><value>13000</value><uncertainty>1700</uncertainty></depth><originUncertainty><horizontalUncertainty>8100</horizontalUncertainty><preferredDescription>horizontal uncertainty</preferredDescription></originUncertainty><quality><usedPhaseCount>290</usedPhaseCount><standardError>1.04</standardError><azimuthalGap>34</azimuthalGap><minimumDistance>1.036</minimumDistance></quality><evaluationMode>manual</evaluationMode><creationInfo><agencyID>us</agencyID><creationTime>2021-03-09T03:01:59.040Z</creationTime></creationInfo></origin><magnitude catalog:datasource="us" catalog:dataid="us7000dg8x" catalog:eventsource="us" catalog:eventid="7000dg8x" 
publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dg8x/us/1615258919040/product.xml#magnitude"><mag><value>6.3</value><uncertainty>0.054</uncertainty></mag><type>mww</type><stationCount>33</stationCount><originID>quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dg8x/us/1615258919040/product.xml</originID><evaluationMode>manual</evaluationMode><creationInfo><agencyID>us</agencyID><creationTime>2021-03-09T03:01:59.040Z</creationTime></creationInfo></magnitude><preferredOriginID>quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dg8x/us/1615258919040/product.xml</preferredOriginID><preferredMagnitudeID>quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dg8x/us/1615258919040/product.xml#magnitude</preferredMagnitudeID><type>earthquake</type><creationInfo><agencyID>us</agencyID><creationTime>2021-03-09T03:05:51.084Z</creationTime></creationInfo></event>
<event catalog:datasource="us" catalog:eventsource="us" catalog:eventid="7000dft1" publicID="quakeml:earthquake.usgs.gov/earthquakes/feed/v1.0/detail/us7000dft1.quakeml"><description><type>earthquake name</type><text>Kermadec Islands region</text></description><origin catalog:datasource="us" catalog:dataid="us7000dft1" catalog:eventsource="us" catalog:eventid="7000dft1" publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dft1/us/1614970064040/product.xml"><time><value>2021-03-05T02:30:36.571Z</value></time><longitude><value>-176.5582</value></longitude><latitude><value>-28.7146</value></latitude><depth><value>17000</value><uncertainty>1800</uncertainty></depth><originUncertainty><horizontalUncertainty>9800</horizontalUncertainty><preferredDescription>horizontal uncertainty</preferredDescription></originUncertainty><quality><usedPhaseCount>89</usedPhaseCount><standardError>1.25</standardError><azimuthalGap>41</azimuthalGap><minimumDistance>9.815</minimumDistance></quality><evaluationMode>manual</evaluationMode><creationInfo><agencyID>us</agencyID><creationTime>2021-03-05T18:47:44.040Z</creationTime></creationInfo></origin><magnitude catalog:datasource="us" catalog:dataid="us7000dft1" catalog:eventsource="us" catalog:eventid="7000dft1" 
publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dft1/us/1614970064040/product.xml#magnitude"><mag><value>6.3</value><uncertainty>0.08</uncertainty></mag><type>mww</type><stationCount>15</stationCount><originID>quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dft1/us/1614970064040/product.xml</originID><evaluationMode>manual</evaluationMode><creationInfo><agencyID>us</agencyID><creationTime>2021-03-05T18:47:44.040Z</creationTime></creationInfo></magnitude><preferredOriginID>quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dft1/us/1614970064040/product.xml</preferredOriginID><preferredMagnitudeID>quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dft1/us/1614970064040/product.xml#magnitude</preferredMagnitudeID><type>earthquake</type><creationInfo><agencyID>us</agencyID><creationTime>2021-03-06T02:34:07.561Z</creationTime></creationInfo></event>
<event catalog:datasource="us" catalog:eventsource="us" catalog:eventid="7000dflf" publicID="quakeml:earthquake.usgs.gov/earthquakes/feed/v1.0/detail/us7000dflf.quakeml"><description><type>earthquake name</type><text>Kermadec Islands, New Zealand</text></description><origin catalog:datasource="us" catalog:dataid="us7000dflf" catalog:eventsource="us" catalog:eventid="7000dflf" publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dflf/us/1614967711040/product.xml"><time><value>2021-03-04T19:28:32.819Z</value></time><longitude><value>-177.2817</value></longitude><latitude><value>-29.7347</value></latitude><depth><value>26450</value><uncertainty>3700</uncertainty></depth><originUncertainty><horizontalUncertainty>7800</horizontalUncertainty><preferredDescription>horizontal uncertainty</preferredDescription></originUncertainty><quality><usedPhaseCount>130</usedPhaseCount><standardError>0.67</standardError><azimuthalGap>21</azimuthalGap><minimumDistance>0.746</minimumDistance></quality><evaluationMode>manual</evaluationMode><creationInfo><agencyID>us</agencyID><creationTime>2021-03-05T18:08:31.040Z</creationTime></creationInfo></origin><magnitude catalog:datasource="us" catalog:dataid="us7000dflf" catalog:eventsource="us" catalog:eventid="7000dflf" 
publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dflf/us/1614967711040/product.xml#magnitude"><mag><value>8.1</value><uncertainty>0.034</uncertainty></mag><type>mww</type><stationCount>81</stationCount><originID>quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dflf/us/1614967711040/product.xml</originID><evaluationMode>manual</evaluationMode><creationInfo><agencyID>us</agencyID><creationTime>2021-03-05T18:08:31.040Z</creationTime></creationInfo></magnitude><preferredOriginID>quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dflf/us/1614967711040/product.xml</preferredOriginID><preferredMagnitudeID>quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dflf/us/1614967711040/product.xml#magnitude</preferredMagnitudeID><type>earthquake</type><creationInfo><agencyID>us</agencyID><creationTime>2021-03-09T18:52:08.298Z</creationTime></creationInfo></event>
<event catalog:datasource="us" catalog:eventsource="us" catalog:eventid="7000dfku" publicID="quakeml:earthquake.usgs.gov/earthquakes/feed/v1.0/detail/us7000dfku.quakeml"><description><type>earthquake name</type><text>Greece</text></description><origin catalog:datasource="us" catalog:dataid="us7000dfku" catalog:eventsource="us" catalog:eventid="7000dfku" publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dfku/us/1614956583040/product.xml"><time><value>2021-03-04T18:38:19.415Z</value></time><longitude><value>22.1189</value></longitude><latitude><value>39.7886</value></latitude><depth><value>10000</value><uncertainty>1800</uncertainty></depth><originUncertainty><horizontalUncertainty>5200</horizontalUncertainty><preferredDescription>horizontal uncertainty</preferredDescription></originUncertainty><quality><usedPhaseCount>140</usedPhaseCount><standardError>0.9</standardError><azimuthalGap>19</azimuthalGap><minimumDistance>0.424</minimumDistance></quality><evaluationMode>manual</evaluationMode><creationInfo><agencyID>us</agencyID><creationTime>2021-03-05T15:03:03.040Z</creationTime></creationInfo></origin><magnitude catalog:datasource="us" catalog:dataid="us7000dfku" catalog:eventsource="us" catalog:eventid="7000dfku" 
publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dfku/us/1614956583040/product.xml#magnitude"><mag><value>5.8</value><uncertainty>0.036</uncertainty></mag><type>mb</type><stationCount>279</stationCount><originID>quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dfku/us/1614956583040/product.xml</originID><evaluationMode>manual</evaluationMode><creationInfo><agencyID>us</agencyID><creationTime>2021-03-05T15:03:03.040Z</creationTime></creationInfo></magnitude><preferredOriginID>quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dfku/us/1614956583040/product.xml</preferredOriginID><preferredMagnitudeID>quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dfku/us/1614956583040/product.xml#magnitude</preferredMagnitudeID><type>earthquake</type><creationInfo><agencyID>us</agencyID><creationTime>2021-03-07T08:43:06.987Z</creationTime></creationInfo></event>
<event catalog:datasource="us" catalog:eventsource="us" catalog:eventid="7000dfk3" publicID="quakeml:earthquake.usgs.gov/earthquakes/feed/v1.0/detail/us7000dfk3.quakeml"><description><type>earthquake name</type><text>Kermadec Islands, New Zealand</text></description><origin catalog:datasource="us" catalog:dataid="us7000dfk3" catalog:eventsource="us" catalog:eventid="7000dfk3" publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dfk3/us/1614952174040/product.xml"><time><value>2021-03-04T17:41:24.425Z</value></time><longitude><value>-177.8343</value></longitude><latitude><value>-29.6647</value></latitude><depth><value>53090</value><uncertainty>3600</uncertainty></depth><originUncertainty><horizontalUncertainty>7800</horizontalUncertainty><preferredDescription>horizontal uncertainty</preferredDescription></originUncertainty><quality><usedPhaseCount>132</usedPhaseCount><standardError>1.14</standardError><azimuthalGap>30</azimuthalGap><minimumDistance>0.426</minimumDistance></quality><evaluationMode>manual</evaluationMode><creationInfo><agencyID>us</agencyID><creationTime>2021-03-05T13:49:34.040Z</creationTime></creationInfo></origin><magnitude catalog:datasource="us" catalog:dataid="us7000dfk3" catalog:eventsource="us" catalog:eventid="7000dfk3" 
publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dfk3/us/1614952174040/product.xml#magnitude"><mag><value>7.4</value><uncertainty>0.049</uncertainty></mag><type>mww</type><stationCount>40</stationCount><originID>quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dfk3/us/1614952174040/product.xml</originID><evaluationMode>manual</evaluationMode><creationInfo><agencyID>us</agencyID><creationTime>2021-03-05T13:49:34.040Z</creationTime></creationInfo></magnitude><preferredOriginID>quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dfk3/us/1614952174040/product.xml</preferredOriginID><preferredMagnitudeID>quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dfk3/us/1614952174040/product.xml#magnitude</preferredMagnitudeID><type>earthquake</type><creationInfo><agencyID>us</agencyID><creationTime>2021-03-09T18:42:04.756Z</creationTime></creationInfo></event>
<event catalog:datasource="us" catalog:eventsource="us" catalog:eventid="7000dffl" publicID="quakeml:earthquake.usgs.gov/earthquakes/feed/v1.0/detail/us7000dffl.quakeml"><description><type>earthquake name</type><text>174 km NE of Gisborne, New Zealand</text></description><origin catalog:datasource="us" catalog:dataid="us7000dffl" catalog:eventsource="us" catalog:eventid="7000dffl" publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dffl/us/1614870527040/product.xml"><time><value>2021-03-04T13:27:36.464Z</value></time><longitude><value>179.4443</value></longitude><latitude><value>-37.5628</value></latitude><depth><value>20780</value><uncertainty>3200</uncertainty></depth><originUncertainty><horizontalUncertainty>6600</horizontalUncertainty><preferredDescription>horizontal uncertainty</preferredDescription></originUncertainty><quality><usedPhaseCount>141</usedPhaseCount><standardError>1.35</standardError><azimuthalGap>23</azimuthalGap><minimumDistance>0.904</minimumDistance></quality><evaluationMode>manual</evaluationMode><creationInfo><agencyID>us</agencyID><creationTime>2021-03-04T15:08:47.040Z</creationTime></creationInfo></origin><magnitude catalog:datasource="us" catalog:dataid="us7000dffl" catalog:eventsource="us" catalog:eventid="7000dffl" 
publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dffl/us/1614870527040/product.xml#magnitude"><mag><value>7.3</value><uncertainty>0.054</uncertainty></mag><type>mww</type><stationCount>33</stationCount><originID>quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dffl/us/1614870527040/product.xml</originID><evaluationMode>manual</evaluationMode><creationInfo><agencyID>us</agencyID><creationTime>2021-03-04T15:08:47.040Z</creationTime></creationInfo></magnitude><preferredOriginID>quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dffl/us/1614870527040/product.xml</preferredOriginID><preferredMagnitudeID>quakeml:earthquake.usgs.gov/realtime/product/origin/us7000dffl/us/1614870527040/product.xml#magnitude</preferredMagnitudeID><type>earthquake</type><creationInfo><agencyID>us</agencyID><creationTime>2021-03-10T21:54:32.975Z</creationTime></creationInfo></event>
<event catalog:datasource="us" catalog:eventsource="us" catalog:eventid="7000df40" publicID="quakeml:earthquake.usgs.gov/earthquakes/feed/v1.0/detail/us7000df40.quakeml"><description><type>earthquake name</type><text>10 km WNW of Týrnavos, Greece</text></description><origin catalog:datasource="us" catalog:dataid="us7000df40" catalog:eventsource="us" catalog:eventid="7000df40" publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/us7000df40/us/1614767518040/product.xml"><time><value>2021-03-03T10:16:10.197Z</value></time><longitude><value>22.1756</value></longitude><latitude><value>39.7641</value></latitude><depth><value>10000</value><uncertainty>1800</uncertainty></depth><originUncertainty><horizontalUncertainty>5400</horizontalUncertainty><preferredDescription>horizontal uncertainty</preferredDescription></originUncertainty><quality><usedPhaseCount>129</usedPhaseCount><standardError>1.05</standardError><azimuthalGap>17</azimuthalGap><minimumDistance>0.415</minimumDistance></quality><evaluationMode>manual</evaluationMode><creationInfo><agencyID>us</agencyID><creationTime>2021-03-03T10:31:58.040Z</creationTime></creationInfo></origin><magnitude catalog:datasource="us" catalog:dataid="us7000df40" catalog:eventsource="us" catalog:eventid="7000df40" 
publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/us7000df40/us/1614767518040/product.xml#magnitude"><mag><value>6.3</value><uncertainty>0.034</uncertainty></mag><type>mww</type><stationCount>81</stationCount><originID>quakeml:earthquake.usgs.gov/realtime/product/origin/us7000df40/us/1614767518040/product.xml</originID><evaluationMode>manual</evaluationMode><creationInfo><agencyID>us</agencyID><creationTime>2021-03-03T10:31:58.040Z</creationTime></creationInfo></magnitude><preferredOriginID>quakeml:earthquake.usgs.gov/realtime/product/origin/us7000df40/us/1614767518040/product.xml</preferredOriginID><preferredMagnitudeID>quakeml:earthquake.usgs.gov/realtime/product/origin/us7000df40/us/1614767518040/product.xml#magnitude</preferredMagnitudeID><type>earthquake</type><creationInfo><agencyID>us</agencyID><creationTime>2021-03-08T04:19:29.249Z</creationTime></creationInfo></event>
<event catalog:datasource="ak" catalog:eventsource="ak" catalog:eventid="0212o88mof" publicID="quakeml:earthquake.usgs.gov/earthquakes/feed/v1.0/detail/ak0212o88mof.quakeml"><description><type>earthquake name</type><text>3 km SSW of Point MacKenzie, Alaska</text></description><origin catalog:datasource="ak" catalog:dataid="AK0212o88mof" catalog:eventsource="ak" catalog:eventid="0212o88mof" publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/AK0212o88mof/ak/1614453659442/product.xml"><time><value>2021-02-27T18:59:25.398Z</value></time><longitude><value>-149.9991</value></longitude><latitude><value>61.3286</value></latitude><depth><value>42100</value><uncertainty>300</uncertainty></depth><originUncertainty><horizontalUncertainty>0</horizontalUncertainty><preferredDescription>horizontal uncertainty</preferredDescription></originUncertainty><quality><usedPhaseCount>134</usedPhaseCount><standardError>0.86</standardError></quality><evaluationMode>manual</evaluationMode><creationInfo><creationTime>2021-02-27T19:20:59.442Z</creationTime><version>2</version></creationInfo></origin><magnitude catalog:datasource="ak" catalog:dataid="AK0212o88mof" catalog:eventsource="ak" catalog:eventid="0212o88mof" 
publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/AK0212o88mof/ak/1614453659442/product.xml#magnitude"><mag><value>5.3</value></mag><type>mww</type><originID>quakeml:earthquake.usgs.gov/realtime/product/origin/AK0212o88mof/ak/1614453659442/product.xml</originID><evaluationMode>manual</evaluationMode><creationInfo><creationTime>2021-02-27T19:20:59.442Z</creationTime></creationInfo></magnitude><preferredOriginID>quakeml:earthquake.usgs.gov/realtime/product/origin/AK0212o88mof/ak/1614453659442/product.xml</preferredOriginID><preferredMagnitudeID>quakeml:earthquake.usgs.gov/realtime/product/origin/AK0212o88mof/ak/1614453659442/product.xml#magnitude</preferredMagnitudeID><type>earthquake</type><creationInfo><agencyID>ak</agencyID><creationTime>2021-03-10T19:09:33.840Z</creationTime><version>2</version></creationInfo></event>
<event catalog:datasource="us" catalog:eventsource="us" catalog:eventid="6000dkmk" publicID="quakeml:earthquake.usgs.gov/earthquakes/feed/v1.0/detail/us6000dkmk.quakeml"><description><type>earthquake name</type><text>5 km ESE of Vogar, Iceland</text></description><origin catalog:datasource="us" catalog:dataid="us6000dkmk" catalog:eventsource="us" catalog:eventid="6000dkmk" publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/us6000dkmk/us/1614179124040/product.xml"><time><value>2021-02-24T10:05:59.873Z</value></time><longitude><value>-22.2736</value></longitude><latitude><value>63.9602</value></latitude><depth><value>10000</value><uncertainty>1800</uncertainty></depth><originUncertainty><horizontalUncertainty>5600</horizontalUncertainty><preferredDescription>horizontal uncertainty</preferredDescription></originUncertainty><quality><usedPhaseCount>129</usedPhaseCount><standardError>1.22</standardError><azimuthalGap>46</azimuthalGap><minimumDistance>0.891</minimumDistance></quality><evaluationMode>manual</evaluationMode><creationInfo><agencyID>us</agencyID><creationTime>2021-02-24T15:05:24.040Z</creationTime></creationInfo></origin><magnitude catalog:datasource="us" catalog:dataid="us6000dkmk" catalog:eventsource="us" catalog:eventid="6000dkmk" 
publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/us6000dkmk/us/1614179124040/product.xml#magnitude"><mag><value>5.6</value><uncertainty>0.039</uncertainty></mag><type>mww</type><stationCount>64</stationCount><originID>quakeml:earthquake.usgs.gov/realtime/product/origin/us6000dkmk/us/1614179124040/product.xml</originID><evaluationMode>manual</evaluationMode><creationInfo><agencyID>us</agencyID><creationTime>2021-02-24T15:05:24.040Z</creationTime></creationInfo></magnitude><preferredOriginID>quakeml:earthquake.usgs.gov/realtime/product/origin/us6000dkmk/us/1614179124040/product.xml</preferredOriginID><preferredMagnitudeID>quakeml:earthquake.usgs.gov/realtime/product/origin/us6000dkmk/us/1614179124040/product.xml#magnitude</preferredMagnitudeID><type>earthquake</type><creationInfo><agencyID>us</agencyID><creationTime>2021-03-07T02:32:18.760Z</creationTime></creationInfo></event>
<event catalog:datasource="nc" catalog:eventsource="nc" catalog:eventid="73526151" publicID="quakeml:earthquake.usgs.gov/earthquakes/feed/v1.0/detail/nc73526151.quakeml"><description><type>earthquake name</type><text>9km ENE of Gilroy, CA</text></description><origin catalog:datasource="nc" catalog:dataid="nc73526151" catalog:eventsource="nc" catalog:eventid="73526151" publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/nc73526151/nc/1614041646560/product.xml"><time><value>2021-02-22T01:38:13.840Z</value></time><longitude><value>-121.4781667</value></longitude><latitude><value>37.0456667</value></latitude><depth><value>6220</value><uncertainty>240</uncertainty></depth><originUncertainty><horizontalUncertainty>90</horizontalUncertainty><preferredDescription>horizontal uncertainty</preferredDescription></originUncertainty><quality><usedPhaseCount>178</usedPhaseCount><usedStationCount>164</usedStationCount><standardError>0.15</standardError><azimuthalGap>33</azimuthalGap><minimumDistance>0.02089</minimumDistance></quality><evaluationMode>manual</evaluationMode><creationInfo><agencyID>NC</agencyID><creationTime>2021-02-23T00:54:06.560Z</creationTime><version>10</version></creationInfo></origin><magnitude catalog:datasource="nc" catalog:dataid="nc73526151" catalog:eventsource="nc" catalog:eventid="73526151" 
publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/nc73526151/nc/1614041646560/product.xml#magnitude"><mag><value>3.76</value></mag><type>mw</type><stationCount>3</stationCount><originID>quakeml:earthquake.usgs.gov/realtime/product/origin/nc73526151/nc/1614041646560/product.xml</originID><evaluationMode>manual</evaluationMode><creationInfo><agencyID>NC</agencyID><creationTime>2021-02-23T00:54:06.560Z</creationTime></creationInfo></magnitude><preferredOriginID>quakeml:earthquake.usgs.gov/realtime/product/origin/nc73526151/nc/1614041646560/product.xml</preferredOriginID><preferredMagnitudeID>quakeml:earthquake.usgs.gov/realtime/product/origin/nc73526151/nc/1614041646560/product.xml#magnitude</preferredMagnitudeID><type>earthquake</type><creationInfo><agencyID>nc</agencyID><creationTime>2021-03-04T06:33:36.782Z</creationTime><version>10</version></creationInfo></event>
<event catalog:datasource="ok" catalog:eventsource="ok" catalog:eventid="2021dmpg" publicID="quakeml:earthquake.usgs.gov/earthquakes/feed/v1.0/detail/ok2021dmpg.quakeml"><description><type>earthquake name</type><text>6 km SW of Manchester, Oklahoma</text></description><origin catalog:datasource="ok" catalog:dataid="ogs2021dmpg" catalog:eventsource="ok" catalog:eventid="2021dmpg" publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/ogs2021dmpg/ok/1613745730861/product.xml"><time><value>2021-02-19T13:56:57.950Z</value></time><longitude><value>-98.09383333</value></longitude><latitude><value>36.96366667</value></latitude><depth><value>7000</value><uncertainty>300</uncertainty></depth><originUncertainty><horizontalUncertainty>0</horizontalUncertainty><preferredDescription>horizontal uncertainty</preferredDescription></originUncertainty><quality><usedPhaseCount>182</usedPhaseCount><usedStationCount>98</usedStationCount><standardError>0.15</standardError><azimuthalGap>96</azimuthalGap><minimumDistance>0</minimumDistance></quality><evaluationMode>manual</evaluationMode><creationInfo><agencyID>OK</agencyID><creationTime>2021-02-19T14:42:10.861Z</creationTime></creationInfo></origin><magnitude catalog:datasource="ok" catalog:dataid="ogs2021dmpg" catalog:eventsource="ok" catalog:eventid="2021dmpg" 
publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/ogs2021dmpg/ok/1613745730861/product.xml#magnitude"><mag><value>4.2</value></mag><type>mwr</type><stationCount>39</stationCount><originID>quakeml:earthquake.usgs.gov/realtime/product/origin/ogs2021dmpg/ok/1613745730861/product.xml</originID><evaluationMode>manual</evaluationMode><creationInfo><agencyID>US</agencyID><creationTime>2021-02-19T14:42:10.861Z</creationTime></creationInfo></magnitude><preferredOriginID>quakeml:earthquake.usgs.gov/realtime/product/origin/ogs2021dmpg/ok/1613745730861/product.xml</preferredOriginID><preferredMagnitudeID>quakeml:earthquake.usgs.gov/realtime/product/origin/ogs2021dmpg/ok/1613745730861/product.xml#magnitude</preferredMagnitudeID><type>earthquake</type><creationInfo><agencyID>ok</agencyID><creationTime>2021-03-05T02:13:24.659Z</creationTime></creationInfo></event>
<event catalog:datasource="us" catalog:eventsource="us" catalog:eventid="6000dhxn" publicID="quakeml:earthquake.usgs.gov/earthquakes/feed/v1.0/detail/us6000dhxn.quakeml"><description><type>earthquake name</type><text>77 km W of Port-Vila, Vanuatu</text></description><origin catalog:datasource="us" catalog:dataid="us6000dhxn" catalog:eventsource="us" catalog:eventid="6000dhxn" publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/us6000dhxn/us/1613705801040/product.xml"><time><value>2021-02-16T00:49:24.078Z</value></time><longitude><value>167.5901</value></longitude><latitude><value>-17.8192</value></latitude><depth><value>5590</value><uncertainty>3300</uncertainty></depth><originUncertainty><horizontalUncertainty>7400</horizontalUncertainty><preferredDescription>horizontal uncertainty</preferredDescription></originUncertainty><quality><usedPhaseCount>386</usedPhaseCount><standardError>0.86</standardError><azimuthalGap>32</azimuthalGap><minimumDistance>3.666</minimumDistance></quality><evaluationMode>manual</evaluationMode><creationInfo><agencyID>us</agencyID><creationTime>2021-02-19T03:36:41.040Z</creationTime></creationInfo></origin><magnitude catalog:datasource="us" catalog:dataid="us6000dhxn" catalog:eventsource="us" catalog:eventid="6000dhxn" 
publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/us6000dhxn/us/1613705801040/product.xml#magnitude"><mag><value>6.2</value><uncertainty>0.063</uncertainty></mag><type>mww</type><stationCount>24</stationCount><originID>quakeml:earthquake.usgs.gov/realtime/product/origin/us6000dhxn/us/1613705801040/product.xml</originID><evaluationMode>manual</evaluationMode><creationInfo><agencyID>us</agencyID><creationTime>2021-02-19T03:36:41.040Z</creationTime></creationInfo></magnitude><preferredOriginID>quakeml:earthquake.usgs.gov/realtime/product/origin/us6000dhxn/us/1613705801040/product.xml</preferredOriginID><preferredMagnitudeID>quakeml:earthquake.usgs.gov/realtime/product/origin/us6000dhxn/us/1613705801040/product.xml#magnitude</preferredMagnitudeID><type>earthquake</type><creationInfo><agencyID>us</agencyID><creationTime>2021-03-04T11:07:03.880Z</creationTime></creationInfo></event>
<event catalog:datasource="us" catalog:eventsource="us" catalog:eventid="6000dher" publicID="quakeml:earthquake.usgs.gov/earthquakes/feed/v1.0/detail/us6000dher.quakeml"><description><type>earthquake name</type><text>72 km ENE of Namie, Japan</text></description><origin catalog:datasource="us" catalog:dataid="us6000dher" catalog:eventsource="us" catalog:eventid="6000dher" publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/us6000dher/us/1613340262040/product.xml"><time><value>2021-02-13T14:07:50.397Z</value></time><longitude><value>141.7494</value></longitude><latitude><value>37.7453</value></latitude><depth><value>49940</value><uncertainty>3500</uncertainty></depth><originUncertainty><horizontalUncertainty>7000</horizontalUncertainty><preferredDescription>horizontal uncertainty</preferredDescription></originUncertainty><quality><usedPhaseCount>144</usedPhaseCount><standardError>1.12</standardError><azimuthalGap>33</azimuthalGap><minimumDistance>3.073</minimumDistance></quality><evaluationMode>manual</evaluationMode><creationInfo><agencyID>us</agencyID><creationTime>2021-02-14T22:04:22.040Z</creationTime></creationInfo></origin><magnitude catalog:datasource="us" catalog:dataid="us6000dher" catalog:eventsource="us" catalog:eventid="6000dher" 
publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/us6000dher/us/1613340262040/product.xml#magnitude"><mag><value>7.1</value><uncertainty>0.036</uncertainty></mag><type>mww</type><stationCount>74</stationCount><originID>quakeml:earthquake.usgs.gov/realtime/product/origin/us6000dher/us/1613340262040/product.xml</originID><evaluationMode>manual</evaluationMode><creationInfo><agencyID>us</agencyID><creationTime>2021-02-14T22:04:22.040Z</creationTime></creationInfo></magnitude><preferredOriginID>quakeml:earthquake.usgs.gov/realtime/product/origin/us6000dher/us/1613340262040/product.xml</preferredOriginID><preferredMagnitudeID>quakeml:earthquake.usgs.gov/realtime/product/origin/us6000dher/us/1613340262040/product.xml#magnitude</preferredMagnitudeID><type>earthquake</type><creationInfo><agencyID>us</agencyID><creationTime>2021-03-05T13:32:14.760Z</creationTime></creationInfo></event>
<event catalog:datasource="us" catalog:eventsource="us" catalog:eventid="6000dh48" publicID="quakeml:earthquake.usgs.gov/earthquakes/feed/v1.0/detail/us6000dh48.quakeml"><description><type>earthquake name</type><text>37 km W of Murghob, Tajikistan</text></description><origin catalog:datasource="us" catalog:dataid="us6000dh48" catalog:eventsource="us" catalog:eventid="6000dh48" publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/us6000dh48/us/1613670813040/product.xml"><time><value>2021-02-12T17:01:35.308Z</value></time><longitude><value>73.545</value></longitude><latitude><value>38.1314</value></latitude><depth><value>98950</value><uncertainty>1200</uncertainty></depth><originUncertainty><horizontalUncertainty>5400</horizontalUncertainty><preferredDescription>horizontal uncertainty</preferredDescription></originUncertainty><quality><usedPhaseCount>298</usedPhaseCount><standardError>0.91</standardError><azimuthalGap>16</azimuthalGap><minimumDistance>1.915</minimumDistance></quality><evaluationMode>manual</evaluationMode><creationInfo><agencyID>us</agencyID><creationTime>2021-02-18T17:53:33.040Z</creationTime></creationInfo></origin><magnitude catalog:datasource="us" catalog:dataid="us6000dh48" catalog:eventsource="us" catalog:eventid="6000dh48" 
publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/us6000dh48/us/1613670813040/product.xml#magnitude"><mag><value>5.9</value><uncertainty>0.053</uncertainty></mag><type>mww</type><stationCount>34</stationCount><originID>quakeml:earthquake.usgs.gov/realtime/product/origin/us6000dh48/us/1613670813040/product.xml</originID><evaluationMode>manual</evaluationMode><creationInfo><agencyID>us</agencyID><creationTime>2021-02-18T17:53:33.040Z</creationTime></creationInfo></magnitude><preferredOriginID>quakeml:earthquake.usgs.gov/realtime/product/origin/us6000dh48/us/1613670813040/product.xml</preferredOriginID><preferredMagnitudeID>quakeml:earthquake.usgs.gov/realtime/product/origin/us6000dh48/us/1613670813040/product.xml#magnitude</preferredMagnitudeID><type>earthquake</type><creationInfo><agencyID>us</agencyID><creationTime>2021-03-04T10:24:38.562Z</creationTime></creationInfo></event>
<event catalog:datasource="us" catalog:eventsource="us" catalog:eventid="6000dg77" publicID="quakeml:earthquake.usgs.gov/earthquakes/feed/v1.0/detail/us6000dg77.quakeml"><description><type>earthquake name</type><text>southeast of the Loyalty Islands</text></description><origin catalog:datasource="us" catalog:dataid="us6000dg77" catalog:eventsource="us" catalog:eventid="6000dg77" publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/us6000dg77/us/1615190090040/product.xml"><time><value>2021-02-10T13:19:55.532Z</value></time><longitude><value>171.657</value></longitude><latitude><value>-23.0508</value></latitude><depth><value>10000</value><uncertainty>1800</uncertainty></depth><originUncertainty><horizontalUncertainty>7800</horizontalUncertainty><preferredDescription>horizontal uncertainty</preferredDescription></originUncertainty><quality><usedPhaseCount>270</usedPhaseCount><standardError>0.42</standardError><azimuthalGap>15</azimuthalGap><minimumDistance>7.988</minimumDistance></quality><evaluationMode>manual</evaluationMode><creationInfo><agencyID>us</agencyID><creationTime>2021-03-08T07:54:50.040Z</creationTime></creationInfo></origin><magnitude catalog:datasource="us" catalog:dataid="us6000dg77" catalog:eventsource="us" catalog:eventid="6000dg77" 
publicID="quakeml:earthquake.usgs.gov/realtime/product/origin/us6000dg77/us/1615190090040/product.xml#magnitude"><mag><value>7.7</value><uncertainty>0.042</uncertainty></mag><type>mww</type><stationCount>54</stationCount><originID>quakeml:earthquake.usgs.gov/realtime/product/origin/us6000dg77/us/1615190090040/product.xml</originID><evaluationMode>manual</evaluationMode><creationInfo><agencyID>us</agencyID><creationTime>2021-03-08T07:54:50.040Z</creationTime></creationInfo></magnitude><preferredOriginID>quakeml:earthquake.usgs.gov/realtime/product/origin/us6000dg77/us/1615190090040/product.xml</preferredOriginID><preferredMagnitudeID>quakeml:earthquake.usgs.gov/realtime/product/origin/us6000dg77/us/1615190090040/product.xml#magnitude</preferredMagnitudeID><type>earthquake</type><creationInfo><agencyID>us</agencyID><creationTime>2021-03-08T08:07:24.427Z</creationTime></creationInfo></event>
<creationInfo><creationTime>2021-03-11T11:55:37.000Z</creationTime></creationInfo>
</eventParameters></q:quakeml>

@ -0,0 +1,14 @@
# Example for reading batches of metrics by selecting both fields AND tags.
#
# File:
# testcases/field_tag_batch.json xpath_json
#
# Expected Output:
# measurementName,machine=machineValue,source=sourceValue field1="1",field2="2" 1643760000000000000
#
metric_name = "/measurement"
timestamp = "/timestamp"
timestamp_format = "2006-01-02T15:04:05Z"
field_selection = "fields/child::*"
tag_selection = "tags/child::*"

@ -0,0 +1,12 @@
{
"measurement": "measurementName",
"timestamp": "2022-02-02T00:00:00Z",
"tags": {
"source": "sourceValue",
"machine": "machineValue"
},
"fields": {
"field1": 1.0,
"field2": 2.0
}
}


@ -0,0 +1 @@
foo adr=true,applicationID="1",applicationName="Test",data="AGhCAGcA0g==",devEUI="27c0817d2aba3052",deviceName="TcsSensorNode",deviceProfileID="100ca98d-e075-4b1b-8cd3-41edbab355f5",deviceProfileName="TcsSensor",fCnt=71,fPort=2,object_humiditySensor_0=33,object_humiditySensor_1=25.5,object_temperatureSensor_0=21,object_temperatureSensor_1=19.3,rxInfo_0_gatewayID="b827ebfffeaa4582",rxInfo_0_loRaSNR=7.5,rxInfo_0_name="local",rxInfo_0_rssi=-60,rxInfo_0_uplinkID="659cbfab-3216-42fd-9f71-f2c470b7f9da",rxInfo_1_gatewayID="ca925641ce08b33e",rxInfo_1_loRaSNR=4.2,rxInfo_1_name="local",rxInfo_1_rssi=-98,rxInfo_1_uplinkID="15ca3b44-17a0-4662-82c9-aa23b40e16eb",txInfo_dr=5,txInfo_frequency=868500000,timestamp=1677099936000000000 1677099936000000000


@ -0,0 +1,12 @@
[[inputs.file]]
files = ["./testcases/json_array_expand/test.json"]
data_format = "xpath_json"
xpath_native_types = true
[[inputs.file.xpath]]
field_name_expansion = true
metric_name = "'foo'"
field_selection = "descendant::*[not(*)]"
timestamp = "//timestamp"
timestamp_format = "unix_ns"


@ -0,0 +1,43 @@
{
"applicationID":"1",
"applicationName":"Test",
"deviceName":"TcsSensorNode",
"deviceProfileName":"TcsSensor",
"deviceProfileID":"100ca98d-e075-4b1b-8cd3-41edbab355f5",
"devEUI":"27c0817d2aba3052",
"timestamp": 1677099936000000000,
"rxInfo":[
{
"gatewayID":"b827ebfffeaa4582",
"uplinkID":"659cbfab-3216-42fd-9f71-f2c470b7f9da",
"name":"local",
"rssi":-60,
"loRaSNR":7.5
},
{
"gatewayID":"ca925641ce08b33e",
"uplinkID":"15ca3b44-17a0-4662-82c9-aa23b40e16eb",
"name":"local",
"rssi":-98,
"loRaSNR":4.2
}
],
"txInfo":{
"frequency":868500000,
"dr":5
},
"adr":true,
"fCnt":71,
"fPort":2,
"data":"AGhCAGcA0g==",
"object":{
"humiditySensor":{
"0":33,
"1":25.5
},
"temperatureSensor":{
"0":21,
"1":19.3
}
}
}


@ -0,0 +1 @@
foo name="PC1",cpus_0="cpu1",cpus_1="cpu2",cpus_2="cpu3",disks_sata_0="disk1",disks_sata_1="disk2",disks_nvme_0="disk3",disks_nvme_1="disk4",disks_nvme_2="disk5",timestamp=1690218699 1690218699000000000


@ -0,0 +1,12 @@
[[inputs.file]]
files = ["./testcases/json_array_expand_simple_types/test.json"]
data_format = "xpath_json"
xpath_native_types = true
[[inputs.file.xpath]]
metric_name = "'foo'"
field_selection = "descendant::*[not(*)]"
field_name_expansion = true
timestamp = "//timestamp"
timestamp_format = "unix"


@ -0,0 +1,20 @@
{
"name": "PC1",
"cpus": [
"cpu1",
"cpu2",
"cpu3"
],
"disks": {
"sata": [
"disk1",
"disk2"
],
"nvme": [
"disk3",
"disk4",
"disk5"
]
},
"timestamp": 1690218699
}


@ -0,0 +1 @@
foo name="PC1",cpus_0="cpu1",cpus_1="cpu2",cpus_2="cpu3",sata_0="disk1",sata_1="disk2",nvme_0="disk3",nvme_1="disk4",nvme_2="disk5",timestamp=1690218699 1690218699000000000


@ -0,0 +1,11 @@
[[inputs.file]]
files = ["./testcases/json_array_simple_types/test.json"]
data_format = "xpath_json"
xpath_native_types = true
[[inputs.file.xpath]]
metric_name = "'foo'"
field_selection = "descendant::*[not(*)]"
timestamp = "//timestamp"
timestamp_format = "unix"


@ -0,0 +1,20 @@
{
"name": "PC1",
"cpus": [
"cpu1",
"cpu2",
"cpu3"
],
"disks": {
"sata": [
"disk1",
"disk2"
],
"nvme": [
"disk3",
"disk4",
"disk5"
]
},
"timestamp": 1690218699
}


@ -0,0 +1 @@
foo a="a string",b=3.1415,c=true,d="{\"d1\":1,\"d2\":\"foo\",\"d3\":true,\"d4\":null}",e="[\"master\",42,true]",timestamp=1690193829 1690193829000000000


@ -0,0 +1,15 @@
[[inputs.file]]
files = ["./testcases/json_string_representation/test.json"]
data_format = "xpath_json"
xpath_native_types = true
[[inputs.file.xpath]]
metric_name = "'foo'"
field_selection = "*"
timestamp = "timestamp"
timestamp_format = "unix"
[inputs.file.xpath.fields]
d = "string(d)"
e = "string(e)"


@ -0,0 +1,13 @@
{
"a": "a string",
"b": 3.1415,
"c": true,
"d": {
"d1": 1,
"d2": "foo",
"d3": true,
"d4": null
},
"e": ["master", 42, true],
"timestamp": 1690193829
}


@ -0,0 +1 @@
foo a="a string",b=3.1415,c=true,d="map[d1:1 d2:foo d3:true d4:<nil>]",e="[master 42 true]",timestamp=1690193829 1690193829000000000


@ -0,0 +1,11 @@
[[inputs.file]]
files = ["./testcases/json_string_representation/test.json"]
data_format = "xpath_json"
xpath_native_types = true
[[inputs.file.xpath]]
metric_name = "'foo'"
field_selection = "*"
timestamp = "timestamp"
timestamp_format = "unix"


@ -0,0 +1,13 @@
{
"a": "a string",
"b": 3.1415,
"c": true,
"d": {
"d1": 1,
"d2": "foo",
"d3": true,
"d4": null
},
"e": ["master", 42, true],
"timestamp": 1690193829
}


@ -0,0 +1 @@
foo a="a string",b="3.1415",c="true",d="{\"d1\":1,\"d2\":\"foo\",\"d3\":true,\"d4\":null}",e="[\"master\",42,true]",timestamp="1690193829" 1690193829000000000


@ -0,0 +1,9 @@
[[inputs.file]]
files = ["./testcases/json_string_representation/test.json"]
data_format = "xpath_json"
[[inputs.file.xpath]]
metric_name = "'foo'"
field_selection = "*"
timestamp = "timestamp"
timestamp_format = "unix"


@ -0,0 +1,13 @@
{
"a": "a string",
"b": 3.1415,
"c": true,
"d": {
"d1": 1,
"d2": "foo",
"d3": true,
"d4": null
},
"e": ["master", 42, true],
"timestamp": 1690193829
}


@ -0,0 +1,31 @@
<?xml version="1.0"?>
<Gateway>
<Name>Main Gateway</Name>
<Timestamp>2020-08-01T15:04:03Z</Timestamp>
<Sequence>12</Sequence>
<Status>ok</Status>
</Gateway>
<Bus>
<Sensor name="Sensor Facility A">
<Variable temperature="20.0"/>
<Variable power="123.4"/>
<Variable frequency="49.78"/>
<Variable consumers="3"/>
<Mode>busy</Mode>
</Sensor>
<Sensor name="Sensor Facility B">
<Variable temperature="23.1"/>
<Variable power="14.3"/>
<Variable frequency="49.78"/>
<Variable consumers="1"/>
<Mode>standby</Mode>
</Sensor>
<Sensor name="Sensor Facility C">
<Variable temperature="19.7"/>
<Variable power="0.02"/>
<Variable frequency="49.78"/>
<Variable consumers="0"/>
<Mode>error</Mode>
</Sensor>
</Bus>


@ -0,0 +1,17 @@
# Simple example for using the XML parser.
#
# File:
# testcases/multisensor.xml
#
# Expected Output:
# xml,gateway=Main seqnr=12i,ok=true
#
[tags]
gateway = "substring-before(/Gateway/Name, ' ')"
[fields_int]
seqnr = "/Gateway/Sequence"
[fields]
ok = "/Gateway/Status = 'ok'"


@ -0,0 +1,28 @@
# Example for explicitly selecting fields from a set of selected metrics.
#
# File:
# testcases/multisensor.xml
#
# Expected Output:
# sensors,name=Facility\ A consumers=3i,frequency=49.78,power=123.4,temperature=20,ok=true 1596294243000000000
# sensors,name=Facility\ B consumers=1i,frequency=49.78,power=14.3,temperature=23.1,ok=true 1596294243000000000
# sensors,name=Facility\ C consumers=0i,frequency=49.78,power=0.02,temperature=19.7,ok=false 1596294243000000000
#
metric_selection = "/Bus/child::Sensor"
metric_name = "string('sensors')"
timestamp = "/Gateway/Timestamp"
timestamp_format = "2006-01-02T15:04:05Z"
[tags]
name = "substring-after(@name, ' ')"
[fields_int]
consumers = "Variable/@consumers"
[fields]
temperature = "number(Variable/@temperature)"
power = "number(Variable/@power)"
frequency = "number(Variable/@frequency)"
ok = "Mode != 'error'"


@ -0,0 +1,23 @@
# Example for batch-selecting fields from a set of selected metrics.
#
# File:
# testcases/multisensor.xml
#
# Expected Output:
# sensors,name=Facility\ A consumers=3,frequency=49.78,power=123.4,temperature=20 1596294243000000000
# sensors,name=Facility\ B consumers=1,frequency=49.78,power=14.3,temperature=23.1 1596294243000000000
# sensors,name=Facility\ C consumers=0,frequency=49.78,power=0.02,temperature=19.7 1596294243000000000
#
metric_selection = "/Bus/child::Sensor"
metric_name = "string('sensors')"
timestamp = "/Gateway/Timestamp"
timestamp_format = "2006-01-02T15:04:05Z"
field_selection = "child::Variable"
field_name = "name(@*[1])"
field_value = "number(@*[1])"
[tags]
name = "substring-after(@name, ' ')"


@ -0,0 +1,3 @@
devices ok=true,phases_0_load=34,phases_0_voltage=231,phases_1_load=35,phases_1_voltage=231,phases_2_load=36,phases_2_voltage=231,rpm=423,type="Motor"
devices flow=3.1414,hours=8762,ok=true,type="Pump"
devices ok=true,phases_0_load=341,phases_0_voltage=231,phases_1_load=352,phases_1_voltage=231,phases_2_load=363,phases_2_voltage=231,throughput=1026,type="Machine"


@ -0,0 +1,10 @@
[[inputs.file]]
files = ["./testcases/name_expansion/test.json"]
data_format = "xpath_json"
xpath_native_types = true
[[inputs.file.xpath]]
metric_name = "'devices'"
metric_selection = "/devices/*"
field_selection = "descendant::*[not(*)]"
field_name_expansion = true


@ -0,0 +1,48 @@
{
"devices": [
{
"type": "Motor",
"rpm": 423,
"phases": [
{
"load": 34,
"voltage": 231
},
{
"load": 35,
"voltage": 231
},
{
"load": 36,
"voltage": 231
}
],
"ok": true
},
{
"type": "Pump",
"hours": 8762,
"flow": 3.1414,
"ok": true
},
{
"type": "Machine",
"throughput": 1026,
"phases": [
{
"load": 341,
"voltage": 231
},
{
"load": 352,
"voltage": 231
},
{
"load": 363,
"voltage": 231
}
],
"ok": true
}
]
}


@ -0,0 +1 @@
¢fpeople…¤dnamehJohn Doebideeemailpjohn@example.comcage*£dnamehJane Doebidfcage(¥dnamehJack DoebidÉeemailpjack@example.comcage fphones<65>¢fnumberl555-555-5555dtype¥dnameiJack Buckbid-eemailpbuck@example.comcagefphonesƒ¢fnumberl555-555-0000dtype¡fnumberl555-555-0001¢fnumberl555-555-0002dtype¥dnameiJanet Doebidéeemailqjanet@example.comcagefphones¡fnumberl555-777-0000¢fnumberl555-777-0001dtypedtagsƒdhomegprivategfriends


@ -0,0 +1,5 @@
addresses age=42u,email="john@example.com",id=101u,name="John Doe"
addresses age=40u,id=102u,name="Jane Doe"
addresses age=12u,email="jack@example.com",id=201u,name="Jack Doe",phones_number="555-555-5555",phones_type=2u
addresses age=19u,email="buck@example.com",id=301u,name="Jack Buck",phones_number="555-555-0000",phones_number_1="555-555-0001",phones_number_2="555-555-0002",phones_type=1u,phones_type_1=2u
addresses age=16u,email="janet@example.com",id=1001u,name="Janet Doe",phones_number="555-777-0000",phones_number_1="555-777-0001",phones_type=1u


@ -0,0 +1,11 @@
[[inputs.file]]
files = ["./testcases/cbor/addressbook.bin"]
data_format = "xpath_cbor"
xpath_native_types = true
[[inputs.file.xpath]]
metric_name = "'addresses'"
metric_selection = "//people"
field_selection = "descendant::*[not(*)]"
field_name_expansion = true


@ -0,0 +1 @@
native_types value_a="a string",value_b=3.1415,value_c=42.0,value_d=true


@ -0,0 +1,13 @@
[[inputs.file]]
files = ["./testcases/native_types_json/test.json"]
data_format = "xpath_json"
xpath_native_types = true
[[inputs.file.xpath]]
metric_name = "'native_types'"
[inputs.file.xpath.fields]
value_a = "//a"
value_b = "//b"
value_c = "//c"
value_d = "//d"


@ -0,0 +1,6 @@
{
"a": "a string",
"b": 3.1415,
"c": 42,
"d": true
}


@ -0,0 +1 @@
native_types value_a="a string",value_b=3.1415,value_c=42.0,value_d=true


@ -0,0 +1 @@
native_types value_a="a string",value_b=3.1415,value_c=true


@ -0,0 +1,12 @@
[[inputs.file]]
files = ["./testcases/native_types_json/test.json"]
data_format = "xpath_json"
xpath_native_types = true
[[inputs.file.xpath]]
metric_name = "'native_types'"
[inputs.file.xpath.fields]
value_a = "//a"
value_b = "//b"
value_c = "//c"


@ -0,0 +1,5 @@
{
"a": "a string",
"b": 3.1415,
"c": true
}


@ -0,0 +1,13 @@
[[inputs.file]]
files = ["./testcases/native_types_msgpack/test.msg"]
data_format = "xpath_msgpack"
xpath_native_types = true
[[inputs.file.xpath]]
metric_name = "'native_types'"
[inputs.file.xpath.fields]
value_a = "//a"
value_b = "//b"
value_c = "//c"
value_d = "//d"


@ -0,0 +1 @@
native_types value_a="a string",value_b=3.1415,value_c=42i,value_d=true


@ -0,0 +1,10 @@
syntax = "proto3";
package native_type;
message Message {
string a = 1;
double b = 2;
int32 c = 3;
bool d = 4;
}


@ -0,0 +1,17 @@
[[inputs.file]]
files = ["./testcases/native_types_protobuf/test.dat"]
data_format = "xpath_protobuf"
xpath_native_types = true
xpath_protobuf_files = ["message.proto"]
xpath_protobuf_type = "native_type.Message"
xpath_protobuf_import_paths = [".", "./testcases/native_types_protobuf"]
[[inputs.file.xpath]]
metric_name = "'native_types'"
[inputs.file.xpath.fields]
value_a = "//a"
value_b = "//b"
value_c = "//c"
value_d = "//d"


@ -0,0 +1,2 @@
a stringoƒÀÊ! @* 


@ -0,0 +1,127 @@
{
"cod": "200",
"message": 0.0179,
"cnt": 96,
"list": [
{
"dt": 1596632400,
"main": {
"temp": 280.16,
"feels_like": 280.41,
"temp_min": 280.16,
"temp_max": 280.16,
"pressure": 1010,
"sea_level": 1010,
"grnd_level": 1010,
"humidity": 70,
"temp_kf": 0
},
"weather": [
{
"id": 804,
"main": "Clouds",
"description": "overcast clouds",
"icon": "04n"
}
],
"clouds": {
"all": 100
},
"wind": {
"speed": 2.03,
"deg": 252,
"gust":5.46
},
"visibility": 10000,
"pop": 0.04,
"sys": {
"pod": "n"
},
"dt_txt": "2020-08-05 13:00:00"
},
{
"dt": 159663600,
"main": {
"temp": 281.16,
"feels_like": 281.41,
"temp_min": 281.16,
"temp_max": 281.16,
"pressure": 1011,
"sea_level": 1011,
"grnd_level": 1011,
"humidity": 71,
"temp_kf": 0
},
"weather": [
{
"id": 804,
"main": "Clouds",
"description": "overcast clouds",
"icon": "04n"
}
],
"clouds": {
"all": 100
},
"wind": {
"speed": 2.03,
"deg": 252,
"gust":5.46
},
"visibility": 10000,
"pop": 0.04,
"sys": {
"pod": "n"
},
"dt_txt": "2020-08-05 14:00:00"
},
{
"dt": 159667200,
"main": {
"temp": 282.16,
"feels_like": 282.41,
"temp_min": 282.16,
"temp_max": 282.16,
"pressure": 1012,
"sea_level": 1012,
"grnd_level": 1012,
"humidity": 71,
"temp_kf": 0
},
"weather": [
{
"id": 804,
"main": "Clouds",
"description": "overcast clouds",
"icon": "04n"
}
],
"clouds": {
"all": 100
},
"wind": {
"speed": 2.03,
"deg": 252,
"gust":5.46
},
"visibility": 10000,
"pop": 0.04,
"sys": {
"pod": "n"
},
"dt_txt": "2020-08-05 15:00:00"
}
],
"city": {
"id": 2643743,
"name": "London",
"coord": {
"lat": 51.5085,
"lon": -0.1258
},
"country": "GB",
"timezone": 0,
"sunrise": 1568958164,
"sunset": 1569002733
}
}


@ -0,0 +1,38 @@
<?xml version="1.0" encoding="UTF-8"?>
<!-- Taken from https://openweathermap.org/forecast5#XML -->
<weatherdata>
<location>
<name>London</name>
<type/>
<country>GB</country>
<timezone>3600</timezone>
<location altitude="0" latitude="51.5085" longitude="-0.1258" geobase="geonames" geobaseid="2643743"/>
</location>
<meta>
<lastupdate>2015-06-30T00:00:00Z</lastupdate>
</meta>
<sun rise="2015-06-30T10:08:46" set="2015-07-01T01:06:20"/>
<forecast>
<time from="2015-06-30T09:00:00" to="2015-06-30T12:00:00">
<symbol number="500" name="light rain" var="10n"/>
<precipitation value="5" unit="3h" type="rain"/>
<windDirection deg="253.5" code="WSW" name="West-southwest"/>
<windSpeed mps="4.9" name="Gentle Breeze"/>
<temperature unit="celsius" value="16.89" min="16.89" max="17.375"/>
<feels_like value="281.37" unit="kelvin"/>
<pressure unit="hPa" value="989.51"/>
<humidity value="96" unit="%"/>
<clouds value="broken clouds" all="64" unit="%"/>
</time>
<time from="2015-06-30T12:00:00" to="2015-06-30T15:00:00">
<symbol number="500" name="light rain" var="10d"/>
<precipitation value="99" unit="3h" type="rain"/>
<windDirection deg="248.001" code="WSW" name="West-southwest"/>
<windSpeed mps="4.86" name="Gentle Breeze"/>
<temperature unit="celsius" value="17.23" min="17.23" max="17.614"/>
<pressure unit="hPa" value="991.29"/>
<humidity value="97" unit="%"/>
<clouds value="scattered clouds" all="44" unit="%"/>
</time>
</forecast>
</weatherdata>


@ -0,0 +1,28 @@
# Example for parsing openweathermap five-day-forecast data.
#
# File:
# testcases/openweathermap_5d.json xpath_json
#
# Expected Output:
# weather,city=London,country=GB humidity=70i,clouds=100i,wind_direction=252,wind_speed=2.03,temperature=137.86666666666667 1596632400000000000
# weather,city=London,country=GB wind_direction=252,wind_speed=2.03,temperature=138.42222222222225,clouds=100i,humidity=71i 159663600000000000
# weather,city=London,country=GB humidity=71i,clouds=100i,wind_direction=252,wind_speed=2.03,temperature=138.9777777777778 159667200000000000
#
metric_name = "'weather'"
metric_selection = "//list/*"
timestamp = "dt"
timestamp_format = "unix"
[tags]
city = "/city/name"
country = "/city/country"
[fields_int]
humidity = "main/humidity"
clouds = "clouds/all"
[fields]
wind_direction = "number(wind/deg)"
wind_speed = "number(wind/speed)"
temperature = "(number(main/temp) - 32.0)*(5.0 div 9.0)"


@ -0,0 +1,28 @@
# Example for parsing openweathermap five-day-forecast data.
#
# File:
# testcases/openweathermap_5d.xml xml
#
# Expected Output:
# weather,city=London,country=GB clouds=64i,humidity=96i,precipitation=5,temperature=16.89,wind_direction=253.5,wind_speed=4.9 1435654800000000000
# weather,city=London,country=GB clouds=44i,humidity=97i,precipitation=99,temperature=17.23,wind_direction=248.001,wind_speed=4.86 1435665600000000000
#
metric_name = "'weather'"
metric_selection = "//forecast/*"
timestamp = "@from"
timestamp_format = "2006-01-02T15:04:05"
[tags]
city = "/weatherdata/location/name"
country = "/weatherdata/location/country"
[fields_int]
humidity = "humidity/@value"
clouds = "clouds/@all"
[fields]
precipitation = "number(precipitation/@value)"
wind_direction = "number(windDirection/@deg)"
wind_speed = "number(windSpeed/@mps)"
temperature = "number(temperature/@value)"


@ -0,0 +1,15 @@
syntax = "proto3";
package benchmark;
message Entry {
string source = 1;
string tags_sdkver = 2;
string tags_platform = 3;
double value = 4;
uint64 timestamp = 5;
}
message BenchmarkData {
repeated Entry data = 1;
}


@ -0,0 +1,2 @@
benchmark,source=myhost,tags_platform=python,tags_sdkver=3.11.5 value=5.0 1653643421000000000
benchmark,source=myhost,tags_platform=python,tags_sdkver=3.11.4 value=4.0 1653643421000000000


@ -0,0 +1,24 @@
[[inputs.file]]
files = ["./testcases/protobuf_benchmark/message.bin"]
data_format = "xpath_protobuf"
xpath_protobuf_files = ["benchmark.proto"]
xpath_protobuf_type = "benchmark.BenchmarkData"
xpath_protobuf_import_paths = [".", "./testcases/protobuf_benchmark"]
xpath_native_types = true
[[inputs.file.xpath]]
metric_name = "'benchmark'"
metric_selection = "//data"
timestamp = "timestamp"
timestamp_format = "unix_ns"
[inputs.file.xpath.tags]
source = "source"
tags_sdkver = "tags_sdkver"
tags_platform = "tags_platform"
[inputs.file.xpath.fields]
value = "value"


@ -0,0 +1 @@
dune,application=ERSStreamTest,session=TestPartition final_context_application_name="ERSStreamTest",final_context_cwd="/afs/cern.ch/user/r/riehecky",final_context_file_name="/tmp/root/spack-stage/spack-stage-erskafka-NB23-07-26-iao7bogflcsyozrhmzbnotd6gfqx6pye/spack-src/test/apps/protobuf_stream_test.cxx",final_context_function_name="int main(int, char**)",final_context_host_name="lxplus790.cern.ch",final_context_line_number=33u,final_context_package_name="unknown",final_context_process_id=31799u,final_context_thread_id=31799u,final_context_user_id=132836u,final_context_user_name="riehecky",final_inheritance_="ers::Issue",final_inheritance__1="erskafka::TestIssue",final_message="this is issue with ID: 14",final_name="erskafka::TestIssue",final_parameters_id="14",final_severity="4" 1690499167530000000


@ -0,0 +1,38 @@
syntax = "proto3";
package dunedaq.ersschema;
message Context {
string cwd = 1;
string file_name = 2;
string function_name = 3;
string host_name = 4;
uint32 line_number = 5;
string package_name = 6;
uint32 process_id = 11;
uint32 thread_id = 12;
uint32 user_id = 13;
string user_name = 14;
string application_name = 15;
}
message SimpleIssue {
Context context = 1;
string name = 2;
repeated string inheritance = 3;
string message = 4;
string severity = 5;
uint64 time = 6;
map<string, string> parameters = 7;
}
message IssueChain {
SimpleIssue final = 1;
repeated SimpleIssue causes = 2;
string session = 10;
string application = 11;
string module = 12;
}


@ -0,0 +1,6 @@
æ
þ
/afs/cern.ch/user/r/rieheckyƒ/tmp/root/spack-stage/spack-stage-erskafka-NB23-07-26-iao7bogflcsyozrhmzbnotd6gfqx6pye/spack-src/test/apps/protobuf_stream_test.cxxint main(int, char**)"lxplus790.cern.ch(!2unknownX·ø`·ø<68>rrieheckyz ERSStreamTesterskafka::TestIssue
ers::Issueerskafka::TestIssue"this is issue with ID: 14*40ªêèÌ™1:
id14R TestPartitionZ ERSStreamTest


@ -0,0 +1,19 @@
[[inputs.file]]
files = ["./testcases/protobuf_issue_13715/message.bin"]
data_format = "xpath_protobuf"
xpath_native_types = true
xpath_protobuf_files = ["issue.proto"]
xpath_protobuf_type = "dunedaq.ersschema.IssueChain"
xpath_protobuf_import_paths = [".", "./testcases/protobuf_issue_13715"]
[[inputs.file.xpath]]
metric_name = "'dune'"
field_selection = "//final/descendant::*[not(*) and name() != 'time']"
field_name_expansion = true
timestamp = "//time"
timestamp_format = "unix_ms"
[inputs.file.xpath.tags]
application = "/application"
session = "/session"

File diff suppressed because it is too large


@ -0,0 +1 @@
test component_id=0u,enterprise_juniperNetworks_jnpr_interface_ext_interface_stats_if_name="ge-0/0/0",enterprise_juniperNetworks_jnpr_interface_ext_interface_stats_if_name_1="ge-0/0/1",enterprise_juniperNetworks_jnpr_interface_ext_interface_stats_if_name_10="ge-0/0/0",enterprise_juniperNetworks_jnpr_interface_ext_interface_stats_if_name_11="ge-0/0/1",enterprise_juniperNetworks_jnpr_interface_ext_interface_stats_if_name_12="ge-0/0/0",enterprise_juniperNetworks_jnpr_interface_ext_interface_stats_if_name_13="ge-0/0/1",enterprise_juniperNetworks_jnpr_interface_ext_interface_stats_if_name_14="ge-0/0/0",enterprise_juniperNetworks_jnpr_interface_ext_interface_stats_if_name_15="ge-0/0/1",enterprise_juniperNetworks_jnpr_interface_ext_interface_stats_if_name_16="ge-0/0/0",enterprise_juniperNetworks_jnpr_interface_ext_interface_stats_if_name_17="ge-0/0/1",enterprise_juniperNetworks_jnpr_interface_ext_interface_stats_if_name_18="ge-0/0/0",enterprise_juniperNetworks_jnpr_interface_ext_interface_stats_if_name_19="ge-0/0/1",enterprise_juniperNetworks_jnpr_interface_ext_interface_stats_if_name_2="ge-0/0/0",enterprise_juniperNetworks_jnpr_interface_ext_interface_stats_if_name_3="ge-0/0/1",enterprise_juniperNetworks_jnpr_interface_ext_interface_stats_if_name_4="ge-0/0/0",enterprise_juniperNetworks_jnpr_interface_ext_interface_stats_if_name_5="ge-0/0/1",enterprise_juniperNetworks_jnpr_interface_ext_interface_stats_if_name_6="ge-0/0/0",enterprise_juniperNetworks_jnpr_interface_ext_interface_stats_if_name_7="ge-0/0/1",enterprise_juniperNetworks_jnpr_interface_ext_interface_stats_if_name_8="ge-0/0/0",enterprise_juniperNetworks_jnpr_interface_ext_interface_stats_if_name_9="ge-0/0/1",sensor_name="ROUTER-INF:/junos/system/linecard/interface/:/junos/system/linecard/interface/:PFE",sequence_number=103837u,sub_component_id=0u,system_id="ST-Justin:10.55.60.125",timestamp=1719150965216u,version_major=1u,version_minor=1u 1719150965216000000


@ -0,0 +1,226 @@
//
// Copyrights (c) 2015, 2016, Juniper Networks, Inc.
// All rights reserved.
//
//
// Licensed to the Apache Software Foundation (ASF) under one
// or more contributor license agreements. See the NOTICE file
// distributed with this work for additional information
// regarding copyright ownership. The ASF licenses this file
// to you under the Apache License, Version 2.0 (the
// "License"); you may not use this file except in compliance
// with the License. You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing,
// software distributed under the License is distributed on an
// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
// KIND, either express or implied. See the License for the
// specific language governing permissions and limitations
// under the License.
//
//
// Nitin Kumar, Jan 2015
//
// This file defines the messages in Protocol Buffers format used by
// the port sensor. The top-level message is Port.
//
// Version 1.1
//
syntax = "proto2";
import "telemetry_top.proto";
//
// This occupies branch 3 from JuniperNetworksSensors
//
extend JuniperNetworksSensors {
optional Port jnpr_interface_ext = 3;
}
//
// Top-level message
//
message Port {
repeated InterfaceInfos interface_stats = 1;
}
//
// Interface information
//
message InterfaceInfos {
// Interface name, e.g., xe-0/0/0
required string if_name = 1 [(telemetry_options).is_key = true];
// Time when interface is created
optional uint64 init_time = 2;
// Global Index
optional uint32 snmp_if_index = 3;
// Name of parent for AE interface, if applicable
optional string parent_ae_name = 4;
// Egress queue information
repeated QueueStats egress_queue_info = 5;
// Ingress queue information
repeated QueueStats ingress_queue_info = 6;
// Inbound traffic statistics
optional InterfaceStats ingress_stats = 7;
// Outbound traffic statistics
optional InterfaceStats egress_stats = 8;
// Inbound traffic errors
optional IngressInterfaceErrors ingress_errors = 9;
// Interface administration status
optional string if_administration_status = 10;
// Interface operational status
optional string if_operational_status = 11;
// Interface description
optional string if_description = 12;
// Counter: number of carrier transitions on this interface
optional uint64 if_transitions = 13 [(telemetry_options).is_counter = true];
// This corresponds to the ifLastChange object in the standard interface MIB
optional uint32 ifLastChange = 14;
// This corresponds to the ifHighSpeed object in the standard interface MIB
optional uint32 ifHighSpeed = 15;
// Outbound traffic errors
optional EgressInterfaceErrors egress_errors = 16;
}
//
// Interface queue statistics
//
message QueueStats {
// Queue number
optional uint32 queue_number = 1 [(telemetry_options).is_key = true];
// The total number of packets that have been added to this queue
optional uint64 packets = 2 [(telemetry_options).is_counter = true];
// The total number of bytes that have been added to this queue
optional uint64 bytes = 3 [(telemetry_options).is_counter = true];
// The total number of tail dropped packets
optional uint64 tail_drop_packets = 4 [(telemetry_options).is_counter = true];
// The total number of rate-limited packets
optional uint64 rl_drop_packets = 5 [(telemetry_options).is_counter = true];
// The total number of rate-limited bytes
optional uint64 rl_drop_bytes = 6 [(telemetry_options).is_counter = true];
// The total number of red-dropped packets
optional uint64 red_drop_packets = 7 [(telemetry_options).is_counter = true];
// The total number of red-dropped bytes
optional uint64 red_drop_bytes = 8 [(telemetry_options).is_counter = true];
// Average queue depth, in packets
optional uint64 avg_buffer_occupancy = 9 [(telemetry_options).is_gauge = true];
// Current queue depth, in packets
optional uint64 cur_buffer_occupancy = 10 [(telemetry_options).is_gauge = true];
// The max measured queue depth, in packets, across all measurements since boot
optional uint64 peak_buffer_occupancy = 11 [(telemetry_options).is_gauge = true];
// Allocated buffer size
optional uint64 allocated_buffer_size = 12 [(telemetry_options).is_gauge = true];
}
//
// Interface statistics
//
message InterfaceStats {
// The total number of packets sent/received by this interface
optional uint64 if_pkts = 1 [(telemetry_options).is_counter = true];
// The total number of bytes sent/received by this interface
optional uint64 if_octets = 2 [(telemetry_options).is_counter = true];
// The rate at which packets are sent/received by this interface (in packets/sec)
optional uint64 if_1sec_pkts = 3 [(telemetry_options).is_gauge = true];
// The rate at which bytes are sent/received by this interface
optional uint64 if_1sec_octets = 4 [(telemetry_options).is_gauge = true];
// Total number of unicast packets sent/received by this interface
optional uint64 if_uc_pkts = 5 [(telemetry_options).is_counter = true];
// Total number of multicast packets sent/received by this interface
optional uint64 if_mc_pkts = 6 [(telemetry_options).is_counter = true];
// Total number of broadcast packets sent/received by this interface
optional uint64 if_bc_pkts = 7 [(telemetry_options).is_counter = true];
// Counter: total no of error packets sent/rcvd by this interface
optional uint64 if_error = 8 [(telemetry_options).is_counter = true];
// Counter: total no of PAUSE packets sent/rcvd by this interface
optional uint64 if_pause_pkts = 9 [(telemetry_options).is_counter = true];
// Counter: total no of UNKNOWN proto packets sent/rcvd by this interface
optional uint64 if_unknown_proto_pkts = 10 [(telemetry_options).is_counter = true];
}
//
// Inbound traffic error statistics
//
message IngressInterfaceErrors {
// The number of packets that contained errors
optional uint64 if_errors = 1 [(telemetry_options).is_counter = true];
// The number of packets dropped by the input queue of the I/O Manager ASIC
optional uint64 if_in_qdrops = 2 [(telemetry_options).is_counter = true];
// The number of packets which were misaligned
optional uint64 if_in_frame_errors = 3 [(telemetry_options).is_counter = true];
// The number of non-error packets which were chosen to be discarded
optional uint64 if_discards = 4 [(telemetry_options).is_counter = true];
// The number of runt packets
optional uint64 if_in_runts = 5 [(telemetry_options).is_counter = true];
// The number of packets that fail Layer 3 sanity checks of the header
optional uint64 if_in_l3_incompletes = 6 [(telemetry_options).is_counter = true];
// The number of packets for which the software could not find a valid logical interface
optional uint64 if_in_l2chan_errors = 7 [(telemetry_options).is_counter = true];
// The number of malformed or short packets
optional uint64 if_in_l2_mismatch_timeouts = 8 [(telemetry_options).is_counter = true];
// The number of FIFO errors
optional uint64 if_in_fifo_errors = 9 [(telemetry_options).is_counter = true];
// The number of resource errors
optional uint64 if_in_resource_errors = 10 [(telemetry_options).is_counter = true];
}
//
// Outbound traffic error statistics
//
message EgressInterfaceErrors {
// The number of packets that contained errors
optional uint64 if_errors = 1 [(telemetry_options).is_counter = true];
// The number of non-error packets which were chosen to be discarded
optional uint64 if_discards = 2 [(telemetry_options).is_counter = true];
}


@ -0,0 +1,16 @@
[[inputs.file]]
files = ["./testcases/protobuf_issue_15571/message.bin"]
data_format = "xpath_protobuf"
xpath_print_document = true
xpath_native_types = true
xpath_protobuf_files = ["telemetry_top.proto", "port.proto"]
xpath_protobuf_type = "TelemetryStream"
xpath_protobuf_import_paths = [".", "./testcases/protobuf_issue_15571"]
[[inputs.file.xpath]]
metric_name = "'test'"
field_selection = "* | //if_name"
field_name_expansion = true
timestamp = "//timestamp"
timestamp_format = "unix_ms"


@ -0,0 +1,99 @@
//
// Copyrights (c) 2015, 2016, Juniper Networks, Inc.
// All rights reserved.
//
//
// Licensed to the Apache Software Foundation (ASF) under one
// or more contributor license agreements. See the NOTICE file
// distributed with this work for additional information
// regarding copyright ownership. The ASF licenses this file
// to you under the Apache License, Version 2.0 (the
// "License"); you may not use this file except in compliance
// with the License. You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing,
// software distributed under the License is distributed on an
// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
// KIND, either express or implied. See the License for the
// specific language governing permissions and limitations
// under the License.
//
//
// This file defines the top level message used for all Juniper
// Telemetry packets encoded to the protocol buffer format.
// The top level message is TelemetryStream.
//
syntax = "proto2";
import "google/protobuf/descriptor.proto";
extend google.protobuf.FieldOptions {
optional TelemetryFieldOptions telemetry_options = 1024;
}
message TelemetryFieldOptions {
optional bool is_key = 1;
optional bool is_timestamp = 2;
optional bool is_counter = 3;
optional bool is_gauge = 4;
}
message TelemetryStream {
// router hostname
// (or, just in the case of legacy (microkernel) PFEs, the IP address)
required string system_id = 1 [(telemetry_options).is_key = true];
// line card / RE (slot number). For RE, it will be 65535
optional uint32 component_id = 2 [(telemetry_options).is_key = true];
// PFE (if applicable)
optional uint32 sub_component_id = 3 [(telemetry_options).is_key = true];
// Overload sensor name with "sensor name, internal path, external path
// and component" separated by ":". For RE sensors, component will be
// daemon-name and for PFE sensors it will be "PFE".
optional string sensor_name = 4 [(telemetry_options).is_key = true];
// sequence number, monotonically increasing for each
// system_id, component_id, sub_component_id + sensor_name.
optional uint32 sequence_number = 5;
// timestamp (milliseconds since 00:00:00 UTC 1/1/1970)
optional uint64 timestamp = 6 [(telemetry_options).is_timestamp = true];
// major version
optional uint32 version_major = 7;
// minor version
optional uint32 version_minor = 8;
// end-of-message marker, set to true when the end of wrap is reached
optional bool eom = 9;
optional IETFSensors ietf = 100;
optional EnterpriseSensors enterprise = 101;
}
message IETFSensors {
extensions 1 to max;
}
message EnterpriseSensors {
extensions 1 to max;
}
extend EnterpriseSensors {
// re-use IANA assigned numbers
optional JuniperNetworksSensors juniperNetworks = 2636;
}
message JuniperNetworksSensors {
extensions 1 to max;
}


@ -0,0 +1 @@
cannot parse invalid wire-format data


@ -0,0 +1,10 @@
syntax = "proto3";
package native_type;
message Message {
string a = 1;
double b = 2;
int32 c = 3;
bool d = 4;
}
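The `native_type.Message` definition above is small enough to encode by hand, which illustrates what the `xpath_protobuf` parser decodes on the wire. The following sketch (an illustration only, not part of the test fixtures) builds the protobuf wire-format bytes for `Message{a: "a string", b: 3.1415, c: 42, d: true}`:

```python
import struct

# Protobuf wire format: each field is a tag byte
# (field_number << 3 | wire_type) followed by its payload.
payload = b"a string"
msg = bytes([0x0A, len(payload)]) + payload       # field 1, wire type 2 (length-delimited string)
msg += bytes([0x11]) + struct.pack("<d", 3.1415)  # field 2, wire type 1 (64-bit double, little-endian)
msg += bytes([0x18, 42])                          # field 3, wire type 0 (varint int32)
msg += bytes([0x20, 1])                           # field 4, wire type 0 (bool as varint)

print(msg.hex())
```

Feeding such bytes to the parser with the config above would yield the string, double, integer, and boolean fields as native types.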


@ -0,0 +1,18 @@
[[inputs.file]]
files = ["./testcases/protobuf_skip_bytes_grpc/test.dat"]
data_format = "xpath_protobuf"
xpath_native_types = true
xpath_protobuf_files = ["message.proto"]
xpath_protobuf_type = "native_type.Message"
xpath_protobuf_import_paths = [".", "./testcases/protobuf_skip_bytes_grpc"]
#xpath_protobuf_skip_bytes = 5
[[inputs.file.xpath]]
metric_name = "'native_types'"
[inputs.file.xpath.fields]
value_a = "//a"
value_b = "//b"
value_c = "//c"
value_d = "//d"


@ -0,0 +1 @@
powerdns from="7f000001",fromPort=45729u,to="7f000001",toPort=53u,inBytes=48u,serverIdentity="xxxxxxxxxxxxxxxxxxx.com",messageId="943f90bea57a4eecbc5b0bea820a8aae",qName="ilse.nl.",qClass=1u,qType=1u,id=64100u,timeSec=1665050957u,timeUsec=500976u


@ -0,0 +1,184 @@
/*
* This file describes the message format used by the protobuf logging feature in PowerDNS and dnsdist.
*
* MIT License
*
* Copyright (c) 2016-now PowerDNS.COM B.V. and its contributors.
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all
* copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
* SOFTWARE.
*/
syntax = "proto2";
message PBDNSMessage {
enum Type {
DNSQueryType = 1; // Query received by the service
DNSResponseType = 2; // Response returned by the service
DNSOutgoingQueryType = 3; // Query sent out by the service to a remote server
DNSIncomingResponseType = 4; // Response returned by the remote server
}
enum SocketFamily {
INET = 1; // IPv4 (RFC 791)
INET6 = 2; // IPv6 (RFC 2460)
}
enum SocketProtocol {
UDP = 1; // User Datagram Protocol (RFC 768)
TCP = 2; // Transmission Control Protocol (RFC 793)
DOT = 3; // DNS over TLS (RFC 7858)
DOH = 4; // DNS over HTTPS (RFC 8484)
DNSCryptUDP = 5; // DNSCrypt over UDP (https://dnscrypt.info/protocol)
DNSCryptTCP = 6; // DNSCrypt over TCP (https://dnscrypt.info/protocol)
}
enum PolicyType {
UNKNOWN = 1; // No RPZ policy applied, or unknown type
QNAME = 2; // Policy matched on the QName
CLIENTIP = 3; // Policy matched on the client IP
RESPONSEIP = 4; // Policy matched on one of the IPs contained in the answer
NSDNAME = 5; // Policy matched on the name of one nameserver involved
NSIP = 6; // Policy matched on the IP of one nameserver involved
}
enum PolicyKind {
NoAction = 1; // No action taken
Drop = 2; // https://tools.ietf.org/html/draft-vixie-dns-rpz-04 3.4
NXDOMAIN = 3; // https://tools.ietf.org/html/draft-vixie-dns-rpz-04 3.1
NODATA = 4; // https://tools.ietf.org/html/draft-vixie-dns-rpz-04 3.2
Truncate = 5; // https://tools.ietf.org/html/draft-vixie-dns-rpz-04 3.5
Custom = 6; // https://tools.ietf.org/html/draft-vixie-dns-rpz-04 3.6
}
enum VState {
Indeterminate = 1;
Insecure = 2;
Secure = 3;
BogusNoValidDNSKEY = 4;
BogusInvalidDenial = 5;
BogusUnableToGetDSs = 6;
BogusUnableToGetDNSKEYs = 7;
BogusSelfSignedDS = 8;
BogusNoRRSIG = 9;
BogusNoValidRRSIG = 10;
BogusMissingNegativeIndication = 11;
BogusSignatureNotYetValid = 12;
BogusSignatureExpired = 13;
BogusUnsupportedDNSKEYAlgo = 14;
BogusUnsupportedDSDigestType = 15;
BogusNoZoneKeyBitSet = 16;
BogusRevokedDNSKEY = 17;
BogusInvalidDNSKEYProtocol = 18;
}
required Type type = 1; // Type of event
optional bytes messageId = 2; // UUID, shared by the query and the response
optional bytes serverIdentity = 3; // ID of the server emitting the protobuf message
optional SocketFamily socketFamily = 4;
optional SocketProtocol socketProtocol = 5;
optional bytes from = 6; // DNS requestor (client) as 4 (IPv4) or 16 (IPv6) raw bytes in network byte order
optional bytes to = 7; // DNS responder (server) as 4 (IPv4) or 16 (IPv6) raw bytes in network byte order
optional uint64 inBytes = 8; // Size of the query or response on the wire
optional uint32 timeSec = 9; // Time of message reception (seconds since epoch)
optional uint32 timeUsec = 10; // Time of message reception (additional micro-seconds)
optional uint32 id = 11; // ID of the query/response as found in the DNS header
message DNSQuestion {
optional string qName = 1; // Fully qualified DNS name (with trailing dot)
optional uint32 qType = 2; // https://www.iana.org/assignments/dns-parameters/dns-parameters.xhtml#dns-parameters-4
optional uint32 qClass = 3; // Typically 1 (IN), see https://www.iana.org/assignments/dns-parameters/dns-parameters.xhtml#dns-parameters-2
}
optional DNSQuestion question = 12; // DNS query received from client
message DNSResponse {
// See exportTypes in https://docs.powerdns.com/recursor/lua-config/protobuf.html#protobufServer
// for the list of supported resource record types.
message DNSRR {
optional string name = 1; // Fully qualified DNS name (with trailing dot)
optional uint32 type = 2; // https://www.iana.org/assignments/dns-parameters/dns-parameters.xhtml#dns-parameters-4
optional uint32 class = 3; // Typically 1 (IN), see https://www.iana.org/assignments/dns-parameters/dns-parameters.xhtml#dns-parameters-2
optional uint32 ttl = 4; // TTL in seconds
optional bytes rdata = 5; // raw address bytes in network byte order for A & AAAA; text representation for others, with fully qualified (trailing dot) domain names
optional bool udr = 6; // True if this is the first time this RR has been seen for this question
}
optional uint32 rcode = 1; // DNS Response code, or 65536 for a network error including a timeout
repeated DNSRR rrs = 2; // DNS resource records in response
optional string appliedPolicy = 3; // Filtering policy (RPZ or Lua) applied
repeated string tags = 4; // Additional tags applied
optional uint32 queryTimeSec = 5; // Time of the corresponding query reception (seconds since epoch)
optional uint32 queryTimeUsec = 6; // Time of the corresponding query reception (additional micro-seconds)
optional PolicyType appliedPolicyType = 7; // Type of the filtering policy (RPZ or Lua) applied
optional string appliedPolicyTrigger = 8; // The RPZ trigger
optional string appliedPolicyHit = 9; // The value (qname or IP) that caused the hit
optional PolicyKind appliedPolicyKind = 10; // The Kind (RPZ action) applied by the hit
optional VState validationState = 11; // The DNSSEC Validation State
}
optional DNSResponse response = 13;
optional bytes originalRequestorSubnet = 14; // EDNS Client Subnet value (4 or 16 raw bytes in network byte order)
optional string requestorId = 15; // Username of the requestor
optional bytes initialRequestId = 16; // UUID of the incoming query that initiated this outgoing query or incoming response
optional bytes deviceId = 17; // Device ID of the requestor (could be MAC address, IP address or e.g. IMEI, format implementation dependent)
optional bool newlyObservedDomain = 18; // True if the domain has not been seen before
optional string deviceName = 19; // Device name of the requestor
optional uint32 fromPort = 20; // Source port of the DNS query (client)
optional uint32 toPort = 21; // Destination port of the DNS query (server)
message MetaValue {
repeated string stringVal = 1;
repeated int64 intVal = 2;
}
message Meta {
required string key = 1; // MUST be unique, so if you have multiple values they must be aggregated into one Meta
required MetaValue value = 2;
}
repeated Meta meta = 22; // Arbitrary meta-data - to be used in future rather than adding new fields all the time
// The well known EventTrace event numbers
enum EventType {
// Range 0..99: Generic events
CustomEvent = 0; // A custom event
ReqRecv = 1; // A request was received
PCacheCheck = 2; // A packet cache check was initiated or completed; value: bool cacheHit
AnswerSent = 3; // An answer was sent to the client
// Range 100: Recursor events
SyncRes = 100; // Recursor Syncres main function has started or completed; value: int rcode
LuaGetTag = 101; // Events below mark start or end of Lua hook calls; value: return value of hook
LuaGetTagFFI = 102;
LuaIPFilter = 103;
LuaPreRPZ = 104;
LuaPreResolve = 105;
LuaPreOutQuery = 106;
LuaPostResolve = 107;
LuaNoData = 108;
LuaNXDomain = 109;
LuaPostResolveFFI = 110;
}
message Event {
required int64 ts = 1; // Timestamp in ns relative to time of creation of event trace data structure
required EventType event = 2; // Type of event
required bool start = 3; // true for "start" events, false for "completed" events
optional bool boolVal = 4; // Below are optional values associated with events
optional int64 intVal = 5;
optional string stringVal = 6;
optional bytes bytesVal = 7;
optional string custom = 8; // The name of the event for custom events
}
repeated Event trace = 23;
}
message PBDNSMessageList {
repeated PBDNSMessage msg = 1;
}
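`PBDNSMessage` splits the reception time across `timeSec` and `timeUsec`. As an illustration (values taken from the expected-output line earlier in this diff: `timeSec=1665050957`, `timeUsec=500976`), the two fields combine into a single timestamp like this:

```python
from datetime import datetime, timezone

# Combine the split reception time from PBDNSMessage into one timestamp.
time_sec, time_usec = 1665050957, 500976
ts = time_sec + time_usec / 1_000_000
print(datetime.fromtimestamp(ts, tz=timezone.utc).isoformat())
```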


@ -0,0 +1,14 @@
[[inputs.file]]
files = ["./testcases/protobuf_powerdns_hex/powerdns_message.bin"]
data_format = "xpath_protobuf"
xpath_native_types = true
xpath_protobuf_files = ["powerdns_message.proto"]
xpath_protobuf_type = "PBDNSMessage"
xpath_protobuf_import_paths = [".", "./testcases/protobuf_powerdns_hex"]
xpath_protobuf_skip_bytes = 2
[[inputs.file.xpath]]
metric_name = "'powerdns'"
fields_bytes_as_hex = ["from", "to", "messageId"]
field_selection = "descendant::*"
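The `fields_bytes_as_hex` option in this config renders raw `bytes` fields as hex strings. A quick sketch of the idea (not the parser's code) for the 4-byte network-order client address in the `from` field, which appears as `from="7f000001"` in the expected output:

```python
# `from` carries the client address as raw network-order bytes;
# hex rendering turns 127.0.0.1 into the string seen in the output.
raw_from = bytes([127, 0, 0, 1])  # 127.0.0.1 in network byte order
print(raw_from.hex())             # -> 7f000001
```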


@ -0,0 +1 @@
native_types value_a="a string",value_b=3.1415,value_c=42i,value_d=true


@ -0,0 +1,10 @@
syntax = "proto3";
package native_type;
message Message {
string a = 1;
double b = 2;
int32 c = 3;
bool d = 4;
}

Some files were not shown because too many files have changed in this diff.