This commit is contained in:
Biedermann Steve 2024-05-08 14:43:29 +02:00
parent 9948b07f76
commit fd8ea49ac7
4 changed files with 195 additions and 83 deletions

out.md Normal file

@@ -0,0 +1,120 @@
# Requirements for journal-uploader
[[_TOC_]]
The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED",
"MAY", and "OPTIONAL" in this document are to be interpreted as described in
[RFC 2119](https://datatracker.ietf.org/doc/html/rfc2119).
## Description
The journal-uploader has two main functionalities.
- Take a stream of log messages and filter them depending on their severity
- Upload journal logs for a specified time when activated through a cloud call
## Requirements
### [_TOPIC-1_] Journal Watcher
#### [_SUB-1.1_] File Monitoring
- **[_REQ-1.1.1_] Continuous Monitoring:** The tool **_MUST_** continuously monitor a designated directory.
#### [_SUB-1.2_] File Detection
- **[_REQ-1.2.1_] Detection of New Files:** The tool **_MUST_** detect the addition of new files in the monitored directory.
- **[_REQ-1.2.2_] Avoid Re-processing:** The tool **_MUST NOT_** process files that have already been processed.
### [_TOPIC-2_] Traced Logging
#### [_SUB-2.1_] File Processing
- **[_REQ-2.1.1_] Reading Log Messages:** When a new file is processed, each log message **_SHOULD_** be put into a buffer.
- **[_REQ-2.1.2_] Filtering Log Messages:** The tool will search for messages of a defined priority (Trigger Priority).
Each message of this priority, as well as every message within a defined timespan before and after it, **_MUST_**
be written to a file. Every other message **_SHOULD_** be dropped.
- **[_REQ-2.1.3_] No Duplicate Log Messages:** The tool **_SHALL_** ensure that no log entry is written to the file twice.
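
One way to read REQ-2.1.2 is as a window around each trigger-priority message. A sketch under that assumption (the `LogMessage` type and its fields are illustrative; the real tool reads journald entries):

```rust
/// Illustrative in-memory form of a buffered log message (REQ-2.1.1).
struct LogMessage {
    timestamp_secs: u64, // seconds since the epoch
    priority: u8,        // 0 = Emergency .. 7 = Debug, as in syslog/journald
    text: String,
}

/// Keeps every message that lies within `context_secs` of at least one message
/// whose priority is at least as severe as `trigger` (lower value = more severe);
/// everything else is dropped (REQ-2.1.2). Order is preserved and each input
/// message appears at most once in the result; de-duplication across files
/// (REQ-2.1.3) is not shown here.
fn filter_traced(messages: &[LogMessage], trigger: u8, context_secs: u64) -> Vec<&LogMessage> {
    let trigger_times: Vec<u64> = messages
        .iter()
        .filter(|m| m.priority <= trigger)
        .map(|m| m.timestamp_secs)
        .collect();
    messages
        .iter()
        .filter(|m| {
            trigger_times
                .iter()
                .any(|t| m.timestamp_secs.abs_diff(*t) <= context_secs)
        })
        .collect()
}
```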
#### [_SUB-2.2_] Traced Log Rotation
- **[_REQ-2.2.1_] Rotating Files:** When the size of the current traced log file exceeds a certain threshold,
it **_MUST_** be closed and a new file **_MUST_** be opened for writing.
- **[_REQ-2.2.2_] Compression of Rotated Files:** Each traced log file **_MUST_** be compressed after it has been rotated.
- **[_REQ-2.2.3_] Rotating Directory:** When the directory size exceeds a certain threshold, the tool **_MUST_** delete the oldest
files in the directory, until the size is below the threshold again.
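
The directory limit from REQ-2.2.3 could look roughly like this (a sketch; `enforce_directory_limit` is a hypothetical helper, "oldest" is approximated by modification time, and compression per REQ-2.2.2 is not shown because no format is specified here):

```rust
use std::fs;
use std::io;
use std::path::{Path, PathBuf};
use std::time::SystemTime;

/// REQ-2.2.3: delete the oldest files until the directory is at or below `max_dir_size` bytes.
fn enforce_directory_limit(dir: &Path, max_dir_size: u64) -> io::Result<()> {
    let mut files: Vec<(SystemTime, PathBuf, u64)> = Vec::new();
    for entry in fs::read_dir(dir)? {
        let entry = entry?;
        let meta = entry.metadata()?;
        if meta.is_file() {
            files.push((meta.modified()?, entry.path(), meta.len()));
        }
    }
    files.sort_by_key(|(mtime, _, _)| *mtime); // oldest first
    let mut total: u64 = files.iter().map(|(_, _, len)| *len).sum();
    for (_, path, len) in files {
        if total <= max_dir_size {
            break;
        }
        fs::remove_file(&path)?;
        total -= len;
    }
    Ok(())
}
```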
### [_TOPIC-3_] Remote Journal Logging
#### [_SUB-3.1_] Service Activation
- **[_REQ-3.1.1_] Cloud Activation:** The remote journal logging **_SHALL_** be startable through a function call from the cloud.
The API call takes the duration and the max interval as arguments.
- **[_REQ-3.1.2_] Duration:** The remote journal logging **_SHOULD_** stay active until the specified duration has elapsed.
- **[_REQ-3.1.3_] Max Interval:** If no upload has occurred within the time specified by max interval,
a log rotation **_SHALL_** be triggered, which will in turn be picked up by the file monitoring.
- **[_REQ-3.1.4_] Analytics Not Accepted:** If the user has not accepted the usage of their data, the cloud call **_MUST_**
result in an error.
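
A sketch of the session state behind REQ-3.1.1 through REQ-3.1.4, with hypothetical names (`RemoteLogging`, `activate`); how the cloud call reaches the tool is outside the scope of this document:

```rust
use std::time::{Duration, Instant};

/// Illustrative state for one remote-logging session started by a cloud call (REQ-3.1.1).
struct RemoteLogging {
    started: Instant,
    duration: Duration,     // REQ-3.1.2: stay active until this has elapsed
    max_interval: Duration, // REQ-3.1.3: force a rotation if no upload happened within this window
    last_upload: Instant,
}

impl RemoteLogging {
    /// REQ-3.1.4: activation fails when the user has not accepted the usage of their data.
    fn activate(
        duration: Duration,
        max_interval: Duration,
        analytics_accepted: bool,
    ) -> Result<Self, &'static str> {
        if !analytics_accepted {
            return Err("analytics not accepted");
        }
        let now = Instant::now();
        Ok(Self { started: now, duration, max_interval, last_upload: now })
    }

    /// REQ-3.1.2: the session ends once the requested duration has elapsed.
    fn is_active(&self) -> bool {
        self.started.elapsed() < self.duration
    }

    /// REQ-3.1.3: true when a log rotation should be triggered because no upload
    /// has happened within `max_interval`.
    fn needs_rotation(&self) -> bool {
        self.is_active() && self.last_upload.elapsed() >= self.max_interval
    }
}
```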
#### [_SUB-3.2_] File Processing
- **[_REQ-3.2.1_] File Upload:** When a file is detected, it **_SHOULD_** be uploaded to the cloud.
- **[_REQ-3.2.2_] No Duplicate Files:** Already processed files **_MUST NOT_** be uploaded again.
- **[_REQ-3.2.3_] Revoking Analytics:** If the user revokes the usage of their data, the service **_MAY_** continue running
but **_MUST NOT_** upload any data until the user allows the usage of their data again.
- **[_REQ-3.2.4_] Duration Expired:** After the specified duration has expired, the service **_SHOULD_** stop uploading files.
### [_TOPIC-4_] Configuration
- **[_CONF-4.1_] Journal Directory:** Users **_SHOULD_** be able to specify the directory to be monitored for journal files.
- **[_CONF-4.2_] Output Directory:** Users **_SHOULD_** be able to specify the directory into which the final files will be written.
- **[_CONF-4.3_] Trigger Priority:** Users **_SHOULD_** be able to specify which priority triggers the filtering.
- **[_CONF-4.4_] Journal Context:** Users **_SHOULD_** be able to specify how many seconds of context will be added to traced logs when encountering a trigger priority.
- **[_CONF-4.5_] Max File Size:** Users **_SHOULD_** be able to specify the maximum file size at which a file is rotated.
- **[_CONF-4.6_] Max Directory Size:** Users **_SHOULD_** be able to specify the maximum directory size at which the directory is rotated.
- **[_CONF-4.7_] File Monitoring Interval:** Users **_SHOULD_** be able to specify the interval that determines
how long the tool waits before checking whether new files are available.
### [_TOPIC-5_] Performance Requirements
- **[_PERF-5.1_] Efficiency:** The tool **_SHOULD_** efficiently monitor and process files without excessive resource consumption.
- **[_PERF-5.2_] Interval Delay:** The tool **_SHOULD_** do its work with no more than 10 seconds of delay after its interval elapses.
### [_TOPIC-6_] Security & Data Protection
- **[_SEC-6.1_] No Insecure Connection:** The tool **_MUST_** send data only through a secure connection.
- **[_SEC-6.2_] GDPR compliance:** The tool **_MUST NOT_** upload data if the user has not agreed to share this information.
### [_TOPIC-7_] Testing
- **[_TEST-7.1_] Unit Tests:** Comprehensive unit tests **_SHOULD_** be written to cover major functionalities.
- **[_TEST-7.2_] Integration Tests:** Integration tests **_SHOULD_** be conducted to ensure all parts of the tool work together seamlessly.
## Definitions
- Default Journal Directory: /run/log/journal/<machine_id>
- The machine ID can be found at /etc/machine-id
- Default Output Directory: /run/log/filtered-journal
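
The default journal directory above can be derived from the machine ID, for example (a sketch; error handling is left to the implementation):

```rust
use std::fs;
use std::io;
use std::path::PathBuf;

/// Builds the default journal directory /run/log/journal/<machine_id>,
/// reading the machine ID from /etc/machine-id as listed above.
fn default_journal_dir() -> io::Result<PathBuf> {
    let machine_id = fs::read_to_string("/etc/machine-id")?;
    Ok(PathBuf::from("/run/log/journal").join(machine_id.trim()))
}
```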
## Config Defaults
- **Journal Directory**
- Type: Path
- **Required**: This value **_MUST_** be provided as a start parameter.
- **Output Directory**
- Type: Path
- **Required**: This value **_MUST_** be provided as a start parameter.
- **Trigger Priority**
- Type: Enum
- Valid Values: _Emergency, Alert, Critical, Error, Warning, Notice, Info, Debug_
- Default Value: _Warning_
- **Journal Context**
- Type: Integer
- Unit: Seconds
- Default Value: _15_
- **Max File Size**
- Type: Integer
- Unit: Bytes
- Default Value: _8388608_ (8 MB)
- **Max Directory Size**
- Type: Integer
- Unit: Bytes
- Default Value: _75497472_ (72 MB)
- **File Monitoring Interval**
- Type: Integer
- Unit: Seconds
- Default Value: _10_
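
Expressed as a Rust struct, the defaults above could look like this (a sketch only; the field and type names are illustrative, not the tool's actual configuration type):

```rust
use std::path::PathBuf;

/// Priorities in syslog/journald order; Warning is the default trigger.
#[derive(Clone, Copy, Debug)]
enum Priority {
    Emergency,
    Alert,
    Critical,
    Error,
    Warning,
    Notice,
    Info,
    Debug,
}

struct Config {
    journal_directory: PathBuf,    // required start parameter
    output_directory: PathBuf,     // required start parameter
    trigger_priority: Priority,    // default: Warning
    journal_context_secs: u64,     // default: 15
    max_file_size_bytes: u64,      // default: 8388608 (8 MB)
    max_directory_size_bytes: u64, // default: 75497472 (72 MB)
    monitoring_interval_secs: u64, // default: 10
}

impl Config {
    /// Applies the defaults listed above; the two directories have no default
    /// and must be provided as start parameters.
    fn with_defaults(journal_directory: PathBuf, output_directory: PathBuf) -> Self {
        Self {
            journal_directory,
            output_directory,
            trigger_priority: Priority::Warning,
            journal_context_secs: 15,
            max_file_size_bytes: 8_388_608,
            max_directory_size_bytes: 75_497_472,
            monitoring_interval_secs: 10,
        }
    }
}
```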

req.yml

@@ -4,163 +4,152 @@ description: |-
- Take a stream of log messages and filter them depending on their severity
- Upload journal logs for a specified time when activated through a cloud call
topics:
FEAT-1:
name: Traced Logging
TOPIC-1:
name: Journal Watcher
subtopics:
SUB-1:
SUB-1.1:
name: File Monitoring
requirements:
REQ-1:
REQ-1.1.1:
name: Continuous Monitoring
description: The tool must continuously monitor a designated directory.
SUB-2:
SUB-1.2:
name: File Detection
requirements:
REQ-1:
REQ-1.2.1:
name: Detection of New Files
description: The tool must detect the addition of new files in the monitored directory.
REQ-2:
REQ-1.2.2:
name: Avoid Re-processing
description: The tool must not process files that have already been processed.
SUB-3:
TOPIC-2:
name: Traced Logging
subtopics:
SUB-2.1:
name: File Processing
requirements:
REQ-1:
REQ-2.1.1:
name: Reading Log Messages
description: When a new file is processed, each log message should be put into a buffer.
REQ-2:
REQ-2.1.2:
name: Filtering Log Messages
description: |-
The tool will search for messages of a defined priority (Trigger Priority).
Each message of this priority, as well as every message within a defined timespan before and after it, must
be written to a file. Every other message should be dropped.
REQ-3:
REQ-2.1.3:
name: No Duplicate Log Messages
description: The tool shall ensure that no log entry is written to the file twice.
SUB-4:
SUB-2.2:
name: Traced Log Rotation
requirements:
REQ-1:
REQ-2.2.1:
name: Rotating Files
description: |-
When the size of the current traced log file exceeds a certain threshold,
it must be closed and a new file must be opened for writing.
REQ-2:
REQ-2.2.2:
name: Compression of Rotated Files
description: Each traced log file must be compressed after it has been rotated.
REQ-3:
REQ-2.2.3:
name: Rotating Directory
description: |-
When the directory size exceeds a certain threshold, the tool must delete the oldest
files in the directory, until the size is below the threshold again.
FEAT-2:
TOPIC-3:
name: Remote Journal Logging
subtopics:
SUB-1:
SUB-3.1:
name: Service Activation
requirements:
REQ-1:
REQ-3.1.1:
name: Cloud Activation
description: |-
The remote journal logging shall be startable through a function call from the cloud.
The API call takes the duration and the max interval as arguments.
REQ-2:
REQ-3.1.2:
name: Duration
description: The remote journal logging should stay active until the specified duration has elapsed.
REQ-3:
REQ-3.1.3:
name: Max Interval
description: |-
If no upload has occurred within the time specified by max interval,
a log rotation shall be triggered, which will in turn be picked up by the file monitoring.
REQ-4:
REQ-3.1.4:
name: Analytics Not Accepted
description: |-
If the user has not accepted the usage of their data, the cloud call must
result in an error.
SUB-2:
name: File Monitoring
requirements:
REQ-1:
name: Continuous Monitoring
description: The tool should continuously monitor a designated directory.
SUB-3:
name: File Detection
requirements:
REQ-1:
name: Detection of New Files
description: The tool must detect the addition of new files in the monitored directory.
REQ-2:
name: Avoid Re-processing
description: The tool must not process files that have already been processed.
SUB-4:
SUB-3.2:
name: File Processing
requirements:
REQ-1:
REQ-3.2.1:
name: File Upload
description: When a file is detected, it should be uploaded to the cloud.
REQ-2:
REQ-3.2.2:
name: No Duplicate Files
description: Already processed files must not be uploaded again.
REQ-3:
REQ-3.2.3:
name: Revoking Analytics
description: |-
If the user revokes the usage of their data, the service may continue running
but must not upload any data until the user allows the usage of their data again.
REQ-4:
REQ-3.2.4:
name: Duration Expired
description: After the specified duration has expired, the service should stop uploading files.
FEAT-3:
TOPIC-4:
name: Configuration
requirements:
REQ-1:
CONF-4.1:
name: Journal Directory
description: Users should be able to specify the directory to be monitored for journal files.
REQ-2:
CONF-4.2:
name: Output Directory
description: Users should be able to specify the directory into which the final files will be written.
REQ-3:
CONF-4.3:
name: Trigger Priority
description: Users should be able to specify which priority triggers the filtering.
REQ-4:
CONF-4.4:
name: Journal Context
description: Users should be able to specify how many seconds of context will be added
to traced logs when encountering a trigger priority.
REQ-5:
CONF-4.5:
name: Max File Size
description: Users should be able to specify the maximum file size at which a file is rotated.
REQ-6:
CONF-4.6:
name: Max Directory Size
description: Users should be able to specify the maximum directory size at which the directory is rotated.
REQ-7:
CONF-4.7:
name: File Monitoring Interval
description: |-
Users should be able to specify the interval that determines
how long the tool waits before checking whether new files are available.
FEAT-4:
TOPIC-5:
name: Performance Requirements
requirements:
REQ-1:
PERF-5.1:
name: Efficiency
description: The tool should efficiently monitor and process files without excessive resource consumption.
REQ-2:
PERF-5.2:
name: Interval Delay
description: The tool should do its work with no more than 10 seconds of delay after its interval elapses.
FEAT-5:
name: Data Protection
TOPIC-6:
name: Security & Data Protection
requirements:
REQ-1:
SEC-6.1:
name: No Insecure Connection
description: The tool must send data only through a secure connection.
REQ-2:
SEC-6.2:
name: GDPR compliance
description: The tool must not upload data if the user has not agreed to share this information.
FEAT-6:
TOPIC-7:
name: Testing
requirements:
REQ-1:
TEST-7.1:
name: Unit Tests
description: Comprehensive unit tests should be written to cover major functionalities.
REQ-2:
TEST-7.2:
name: Integration Tests
description: Integration tests should be conducted to ensure all parts of the tool work together seamlessly.

View File

@@ -2,27 +2,6 @@ use indexmap::{indexmap, IndexMap};
use serde::{Deserialize, Serialize, Serializer};
use stringlit::s;
#[allow(dead_code)]
pub const WORD_DESCRIPTION: &str = //
r#"The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED",
"MAY", and "OPTIONAL" in this document are to be interpreted as described in
[RFC 2119](https://datatracker.ietf.org/doc/html/rfc2119).
"#;
#[allow(dead_code)]
pub const HIGHLIGHTED_WORDS: [&str; 10] = [
"must not",
"must",
"required",
"shall not",
"shall",
"should not",
"should",
"recommended",
"may",
"optional",
];
pub fn my_trim<S>(v: &String, s: S) -> Result<S::Ok, S::Error>
where
S: Serializer,

View File

@@ -2,6 +2,25 @@ use indexmap::IndexMap;
use req::*;
use stringlit::s;
pub const WORD_DESCRIPTION: &str = //
r#"The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED",
"MAY", and "OPTIONAL" in this document are to be interpreted as described in
[RFC 2119](https://datatracker.ietf.org/doc/html/rfc2119).
"#;
pub const HIGHLIGHTED_WORDS: [&str; 10] = [
"must not",
"must",
"required",
"shall not",
"shall",
"should not",
"should",
"recommended",
"may",
"optional",
];
fn nl() -> String {
s!("")
}
@@ -9,7 +28,7 @@ fn nl() -> String {
fn add_requirements(output: &mut Vec<String>, requirements: &IndexMap<String, Requirement>) {
for (id, requirement) in requirements {
output.push(format!(
"- **{id} {}:** {}",
"- **[_{id}_] {}:** {}",
requirement.name, requirement.description
));
}
@@ -17,7 +36,7 @@ fn add_requirements(output: &mut Vec<String>, requirements: &IndexMap<String, Re
fn add_topics(output: &mut Vec<String>, topics: &IndexMap<String, Topic>, level: usize) {
for (id, topic) in topics {
output.push(format!("{} {id} {}", "#".repeat(level), topic.name));
output.push(format!("{} [_{id}_] {}", "#".repeat(level), topic.name));
if !topic.requirements.is_empty() {
add_requirements(output, &topic.requirements);
output.push(nl());
@@ -89,7 +108,12 @@ fn main() -> anyhow::Result<()> {
}
}
println!("{}", output.join("\n"));
let mut output = output.join("\n");
for word in HIGHLIGHTED_WORDS {
output = output.replace(word, &format!("**_{}_**", word.to_uppercase()));
}
println!("{output}");
Ok(())
}
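
One detail of the highlighting loop above is worth noting: `HIGHLIGHTED_WORDS` lists "must not" before "must" (and likewise "shall not"/"should not" before their short forms), and every replacement uppercases the match, so the shorter patterns no longer touch text that has already been highlighted. A small check of that behaviour (illustrative only, outside the actual crate):

```rust
fn main() {
    // Same list and replacement scheme as in the diff above.
    const HIGHLIGHTED_WORDS: [&str; 10] = [
        "must not", "must", "required", "shall not", "shall",
        "should not", "should", "recommended", "may", "optional",
    ];
    let mut output =
        String::from("The tool must not process files twice and should drop other messages.");
    for word in HIGHLIGHTED_WORDS {
        output = output.replace(word, &format!("**_{}_**", word.to_uppercase()));
    }
    assert_eq!(
        output,
        "The tool **_MUST NOT_** process files twice and **_SHOULD_** drop other messages."
    );
    println!("{output}");
}
```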