Concepts
It's worth getting familiar with the basic concepts that comprise Vector as they are used throughout the documentation. This knowledge will be helpful as you proceed and is also cool to brag about amongst friends.
Components
"Component" is the generic term we use for sources, transforms, and sinks. You compose components to create pipelines, allowing you to ingest, transform, and send data.
Sources
Vector would be junk if it couldn't ingest data. A "source" defines where Vector
should pull data from, or how it should receive data pushed to it. A pipeline
can have any number of sources, and as they ingest data they proceed to
normalize it into events (see next section). This sets the stage
for easy and consistent processing of your data. Examples of sources include
file, syslog,
tcp, and stdin.
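As a sketch, a file source might be declared like this in Vector's TOML configuration. The component name "apache_logs" and the path are placeholders, and option names can vary between Vector versions:

```toml
# Tail matching log files and ingest each new line as an event.
# "apache_logs" is an arbitrary name used to reference this source later.
[sources.apache_logs]
type    = "file"
include = ["/var/log/apache2/*.log"]  # placeholder glob
```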
Transforms
A "transform" is responsible for mutating events as they are transported by Vector. This might involve parsing, filtering, sampling, or aggregating. You can have any number of transforms in your pipeline and how they are composed is up to you.
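For example, a parsing transform might look like the following sketch. The upstream name "apache_logs" and the transform type are illustrative assumptions; consult the transforms reference for the exact types your version supports:

```toml
# Parse each event's message field as JSON.
# "inputs" wires this transform to the upstream component by name.
[transforms.parse_json]
type   = "json_parser"        # assumed transform type
inputs = ["apache_logs"]      # assumed upstream source name
```

Because composition is driven entirely by the `inputs` list, you can chain transforms in any order, fan one source out to several transforms, or merge several inputs into one.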
Sinks
A "sink" is a destination for events. Each sink's
design and transmission method is dictated by the downstream service it is
interacting with. For example, the tcp sink will stream
individual events, while the aws_s3 sink will buffer and
flush data.
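A buffering sink such as aws_s3 could be configured roughly like this. The bucket, region, and upstream name are placeholders:

```toml
# Ship events to S3; the sink handles batching and flushing internally.
[sinks.archive]
type   = "aws_s3"
inputs = ["parse_json"]       # assumed upstream transform name
bucket = "my-log-archive"     # placeholder bucket
region = "us-east-1"          # placeholder region
```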
Events
All items (both logs and metrics) passing through Vector are known as "events", which are explained in detail in the data model section.
Pipelines
A "pipeline" is the end result of connecting sources, transforms, and sinks. You can see a full example of a pipeline in the configuration section.
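To make this concrete, here is a minimal end-to-end sketch: one source wired directly to one sink, reading lines from stdin and echoing them to stdout. Option names may differ slightly across Vector versions:

```toml
# Smallest possible pipeline: stdin -> console.
[sources.in]
type = "stdin"

[sinks.out]
type     = "console"
inputs   = ["in"]    # connect the sink to the source by name
encoding = "text"    # emit events as plain text
```

Running Vector with this file and typing a line should print it straight back, which makes it a handy smoke test before adding transforms.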