6 changes: 6 additions & 0 deletions .dockerignore
@@ -21,6 +21,7 @@ conf/
log_merging_config.json
dockerComposeMPI.yml
*.csv
reversim-conf

# Debugpy logs, WinMerge Backups etc.
*.log
@@ -29,6 +30,11 @@ dockerComposeMPI.yml
# Ignore statistics folder
statistics/

# Hide deployment storage
tmp/
secrets/


# Maybe ignore unnecessary code
# app/statistics
# app/tests
46 changes: 32 additions & 14 deletions .github/workflows/deploy-image.yml
@@ -1,18 +1,21 @@
#
name: Create and publish a Docker image

# Configures this workflow to run every time a change is pushed to the branch called `main`.
# Configures this workflow to run every time a change is pushed to the branch called
# `main` or `dev`.
on:
push:
branches: ['main', 'dev']
tags: ['*']

# Defines two custom environment variables for the workflow. These are used for the Container registry domain, and a name for the Docker image that this workflow builds.
# Defines two custom environment variables for the workflow. These are used for the
# Container registry domain, and a name for the Docker image that this workflow builds.
env:
REGISTRY: ghcr.io
IMAGE_NAME: ${{ github.repository }}

# There is a single job in this workflow. It's configured to run on the latest available version of Ubuntu.
# There is a single job in this workflow. It's configured to run on the latest available
# version of Ubuntu.
jobs:
build-and-push-image:
runs-on: ubuntu-latest
@@ -25,21 +28,28 @@ jobs:

steps:
- name: Checkout repository
uses: actions/checkout@v4
uses: actions/checkout@v6
with:
submodules: true

# Uses the `docker/login-action` action to log in to the Container registry registry using the account and password that will publish the packages. Once published, the packages are scoped to the account defined here.
# Uses the `docker/login-action` action to log in to the Container registry
# using the account and password that will publish the packages. Once published, the
# packages are scoped to the account defined here.
- name: Log in to the Container registry
uses: docker/login-action@65b78e6e13532edd9afa3aa52ac7964289d1a9c1
uses: docker/login-action@v3
with:
registry: ${{ env.REGISTRY }}
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}

# This step uses [docker/metadata-action](https://github.com/docker/metadata-action#about) to extract tags and labels that will be applied to the specified image. The `id` "meta" allows the output of this step to be referenced in a subsequent step. The `images` value provides the base name for the tags and labels.
# This step uses [docker/metadata-action](https://github.com/docker/metadata-action#about)
# to extract tags and labels that will be applied to the specified image.
# The `id` "meta" allows the output of this step to be referenced in a subsequent
# step. The `images` value provides the base name for the tags and labels.
# It will automatically create the latest Docker tag, if a git tag is found: https://github.com/docker/metadata-action?tab=readme-ov-file#latest-tag
- name: Extract metadata (tags, labels) for Docker
id: meta
uses: docker/metadata-action@9ec57ed1fcdbf14dcef7dfbe97b2010124a938b7
uses: docker/metadata-action@v5
with:
images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}

@@ -51,12 +61,17 @@ jobs:
calculatedSha=$(git rev-parse --short ${{ github.sha }})
echo "COMMIT_SHORT_SHA=$calculatedSha" >> $GITHUB_ENV

# This step uses the `docker/build-push-action` action to build the image, based on your repository's `Dockerfile`. If the build succeeds, it pushes the image to GitHub Packages.
# It uses the `context` parameter to define the build's context as the set of files located in the specified path. For more information, see [Usage](https://github.com/docker/build-push-action#usage) in the README of the `docker/build-push-action` repository.
# It uses the `tags` and `labels` parameters to tag and label the image with the output from the "meta" step.
# This step uses the `docker/build-push-action` action to build the image, based on
# your repository's `Dockerfile`. If the build succeeds, it pushes the image to
# GitHub Packages.
# It uses the `context` parameter to define the build's context as the set of files
# located in the specified path. For more information, see [Usage](https://github.com/docker/build-push-action#usage)
# in the README of the `docker/build-push-action` repository.
# It uses the `tags` and `labels` parameters to tag and label the image with the
# output from the "meta" step.
- name: Build and push Docker image
id: push
uses: docker/build-push-action@f2a1d5e99d037542a71f64918e516c093c6f3fc4
uses: docker/build-push-action@v6
with:
context: .
push: true
@@ -66,9 +81,12 @@
GAME_GIT_HASH=${{ github.sha }}
GAME_GIT_HASH_SHORT=${{ env.COMMIT_SHORT_SHA }}

# This step generates an artifact attestation for the image, which is an unforgeable statement about where and how it was built. It increases supply chain security for people who consume the image. For more information, see [Using artifact attestations to establish provenance for builds](/actions/security-guides/using-artifact-attestations-to-establish-provenance-for-builds).
# This step generates an artifact attestation for the image, which is an unforgeable
# statement about where and how it was built. It increases supply chain security for
# people who consume the image. For more information, see [Using artifact attestations
# to establish provenance for builds](/actions/security-guides/using-artifact-attestations-to-establish-provenance-for-builds).
- name: Generate artifact attestation
uses: actions/attest-build-provenance@v2
uses: actions/attest-build-provenance@v3
with:
subject-name: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME}}
subject-digest: ${{ steps.push.outputs.digest }}
5 changes: 4 additions & 1 deletion .gitignore
@@ -21,7 +21,6 @@ __pycache__/
env/
venv/
.venv/
tmp/

# --- Debugpy logs, WinMerge Backups etc.
*.log
@@ -44,3 +43,7 @@ reversim-conf

# --- Generated level thumbnails
doc/levels

# --- Don't push instance relevant configuration
tmp/
secrets/
1 change: 1 addition & 0 deletions .vscode/settings.json
@@ -143,6 +143,7 @@
"hreserver",
"hrestudy",
"htmlsafe",
"httpauth",
"iframe",
"imgdata",
"imgstring",
3 changes: 2 additions & 1 deletion Dockerfile
@@ -15,10 +15,10 @@

# Labels as per:
# https://github.com/opencontainers/image-spec/blob/main/annotations.md#pre-defined-annotation-keys
MAINTAINER Max Planck Institute for Security and Privacy

[Check warning on line 18 in Dockerfile (GitHub Actions / build-and-push-image): The MAINTAINER instruction is deprecated, use a label instead to define an image author. More info: https://docs.docker.com/go/dockerfile/rule/maintainer-deprecated/]
LABEL org.opencontainers.image.authors="Max Planck Institute for Security and Privacy"
# NOTE Also change the version in config.py
LABEL org.opencontainers.image.version="2.1.0"
LABEL org.opencontainers.image.version="2.1.1"
LABEL org.opencontainers.image.licenses="AGPL-3.0-only"
LABEL org.opencontainers.image.description="Ready to deploy Docker container to use ReverSim for research. ReverSim is an open-source environment for the browser, originally developed at the Max Planck Institute for Security and Privacy (MPI-SP) to study human aspects in hardware reverse engineering."
LABEL org.opencontainers.image.source="https://github.com/emsec/ReverSim"
@@ -62,6 +62,7 @@
# Create empty statistics folders
WORKDIR /usr/var/reversim-instance/statistics/LogFiles
WORKDIR /usr/var/reversim-instance/statistics/canvasPics
WORKDIR /usr/var/reversim-instance/secrets
WORKDIR /usr/src/hregame

# Specify mount points for the statistics folder, levels, researchInfo & disclaimer
45 changes: 45 additions & 0 deletions app/authentication.py
@@ -0,0 +1,45 @@
import logging
import os
import secrets

from flask_httpauth import HTTPTokenAuth # type: ignore

from app.config import BEARER_TOKEN_BYTES
from app.model.ApiKey import ApiKey
from app.storage.database import db
from app.utilsGame import safe_join

auth = HTTPTokenAuth(scheme='Bearer')

USER_METRICS = 'api_metrics'

@auth.verify_token # type: ignore
def verifyToken(token: str) -> ApiKey|None:
"""Check if this token exists. If it does, return the user object; otherwise return `None`

https://flask-httpauth.readthedocs.io/en/latest/#flask_httpauth.HTTPTokenAuth.verify_token
"""

return db.session.query(ApiKey).filter_by(token=token).first()


def populate_data(instance_path: str):
if ApiKey.query.count() < 1:
apiKey = ApiKey(secrets.token_urlsafe(BEARER_TOKEN_BYTES), USER_METRICS)
db.session.add(apiKey)
db.session.commit()

defaultToken = db.session.query(ApiKey).where(ApiKey.user == USER_METRICS).first()
if defaultToken is not None:

# Try to write the bearer secret to a file so other containers can use it
try:
folder = safe_join(instance_path, 'secrets')
os.makedirs(folder, exist_ok=True)
with open(safe_join(folder, 'bearer_api.txt'), encoding='UTF-8', mode='wt') as f:
f.write(defaultToken.token)

except Exception as e:
# When the file can't be created, log the bearer token instead
logging.error('Could not write bearer token to file: ' + str(e))
logging.info('Bearer token for /metrics endpoint: ' + defaultToken.token)
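The new `populate_data()` writes the bearer token to `<instance>/secrets/bearer_api.txt` so sibling containers can pick it up. A minimal consumer-side sketch of that handoff (the helper names, host, and port are assumptions for illustration, not part of this PR):

```python
from pathlib import Path
import urllib.request


def load_token(instance_path: str) -> str:
    """Read the bearer token that populate_data() wrote to disk."""
    token_file = Path(instance_path, 'secrets', 'bearer_api.txt')
    return token_file.read_text(encoding='UTF-8').strip()


def fetch_metrics(instance_path: str, base_url: str = 'http://localhost:8000') -> str:
    """Scrape the protected /metrics endpoint with the stored token."""
    request = urllib.request.Request(
        base_url + '/metrics',
        headers={'Authorization': 'Bearer ' + load_token(instance_path)}
    )
    with urllib.request.urlopen(request) as response:
        return response.read().decode('UTF-8')
```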
5 changes: 4 additions & 1 deletion app/config.py
@@ -16,14 +16,17 @@

# CONFIG Current Log File Version.
# NOTE Also change this in the Dockerfile
LOGFILE_VERSION = "2.1.0" # Major.Milestone.Subversion
LOGFILE_VERSION = "2.1.1" # Major.Milestone.Subversion

PSEUDONYM_LENGTH = 32
LEVEL_ENCODING = 'UTF-8' # was Windows-1252
TIME_DRIFT_THRESHOLD = 200 # ms
STALE_LOGFILE_TIME = 48 * 60 * 60 # close logfiles after 48h
MAX_ERROR_LOGS_PER_PLAYER = 25

# The bearer token for the /metrics endpoint
BEARER_TOKEN_BYTES = 32

# Number of seconds, after which the player is considered disconnected. A "Back Online"
# message will be printed to the log, if the player connects afterwards. Also used for the
# Prometheus Online Player Count metric
16 changes: 16 additions & 0 deletions app/model/ApiKey.py
@@ -0,0 +1,16 @@
from datetime import datetime, timezone
from sqlalchemy import DateTime, String
from sqlalchemy.orm import Mapped, mapped_column

from app.storage.database import db


class ApiKey(db.Model):
token: Mapped[str] = mapped_column(primary_key=True)
user: Mapped[str] = mapped_column(String(64))
created: Mapped[datetime] = mapped_column(DateTime)

def __init__(self, token: str, user: str) -> None:
self.token = token
self.user = user
self.created = datetime.now(timezone.utc)
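For context, the values an `ApiKey` row is populated with can be sketched without the database layer (`mint_key` is an illustrative helper, not part of this PR). Note that `secrets.token_urlsafe(32)` yields a 43-character URL-safe token:

```python
import secrets
from datetime import datetime, timezone

BEARER_TOKEN_BYTES = 32  # mirrors BEARER_TOKEN_BYTES in app/config.py


def mint_key(user: str) -> dict:
    """Sketch of the fields an ApiKey row is created with."""
    return {
        'token': secrets.token_urlsafe(BEARER_TOKEN_BYTES),  # 32 bytes -> 43 chars
        'user': user,
        'created': datetime.now(timezone.utc),  # timezone-aware, unlike utcnow()
    }
```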
7 changes: 5 additions & 2 deletions app/model/LevelLoader/JsonLevelList.py
@@ -54,9 +54,11 @@ def getPossibleLevels(self) -> list[Level]:
current_list = MappingProxyType(self.levelList[list_name])

for entry in current_list['levels']:
# If this is a list, add all levels that are contained in that list
if isinstance(entry, list):
for subEntry in cast(list[dict[str, str]], entry):
levels.append(Level(type=subEntry['type'], fileName=subEntry['name']))
levels.append(Level(type=subEntry['type'], fileName=subEntry['name']))
# Else add the single level
else:
levels.append(Level(type=entry['type'], fileName=entry['name']))

@@ -152,7 +154,8 @@ def fromFile(
try:
conf = load_config(fileName=fileName, instanceFolder=instanceFolder)

# TODO Run checks
# TODO Run checks to catch any errors directly on launch and not later when
# someone tries to load the first level

logging.info(f'Successfully loaded {len(conf)} level lists.')
return conf
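The nested-list handling in `getPossibleLevels` above can be sketched in isolation (the dict shape follows the diff; `flatten_levels` is an illustrative name):

```python
def flatten_levels(entries: list) -> list[tuple[str, str]]:
    """Collect (type, name) pairs; an entry is either a single level dict
    or a list of level dicts, all of which are added as candidates."""
    levels: list[tuple[str, str]] = []
    for entry in entries:
        if isinstance(entry, list):
            # If this is a list, add all levels contained in that list
            for sub_entry in entry:
                levels.append((sub_entry['type'], sub_entry['name']))
        else:
            # Else add the single level
            levels.append((entry['type'], entry['name']))
    return levels
```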
14 changes: 9 additions & 5 deletions app/prometheusMetrics.py
@@ -1,6 +1,7 @@
import logging
from threading import Thread
import time
from typing import Any
from flask import Flask

# Prometheus Metrics
@@ -14,27 +15,29 @@

class ServerMetrics:
@staticmethod
def __prometheusFactory():
def __prometheusFactory(auth_provider: Any):
EXCLUDED_PATHS = ["/?res\\/.*", "/?src\\/.*", "/?doc\\/.*"]

# Try to use the uWSGI exporter. This will fail, if uWSGI is not installed
try:
metrics = UWsgiPrometheusMetrics.for_app_factory( # type: ignore
excluded_paths=EXCLUDED_PATHS
excluded_paths=EXCLUDED_PATHS,
metrics_decorator=auth_provider
)

# Use the regular Prometheus exporter
except Exception as e:
logging.error(e)

metrics = PrometheusMetrics.for_app_factory( # type: ignore
excluded_paths=EXCLUDED_PATHS
excluded_paths=EXCLUDED_PATHS,
metrics_decorator=auth_provider
)

logging.info(f'Using {type(metrics).__name__} as the Prometheus exporter')
return metrics

metrics = __prometheusFactory()
metrics: PrometheusMetrics|UWsgiPrometheusMetrics|None = None

# ReverSim Prometheus Metrics
#met_openLogs = Gauge("reversim_logfile_count", "The number of open logfiles") # type: ignore
@@ -47,8 +50,9 @@ def __prometheusFactory():
met_clientErrors: Gauge|None = None

@classmethod
def createPrometheus(cls, app: Flask):
def createPrometheus(cls, app: Flask, auth_provider: Any):
"""Init Prometheus"""
cls.metrics = cls.__prometheusFactory(auth_provider)
cls.metrics.init_app(app) # type: ignore
cls.metrics.info('app_info', 'Application info', version=gameConfig.LOGFILE_VERSION) # type: ignore

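The `metrics_decorator` parameter wraps the exporter's `/metrics` view with an arbitrary decorator, which is how passing `auth.login_required` as the `auth_provider` ends up gating every scrape. A dependency-free sketch of that pattern (the names and fixed token are assumptions, not the PR's code):

```python
from functools import wraps


def require_token(view):
    """Stand-in for auth.login_required: reject calls without the right token."""
    @wraps(view)
    def wrapped(token: str):
        if token != 'expected-token':
            return ('Unauthorized', 401)
        return view()
    return wrapped


@require_token
def metrics_view():
    """Stand-in for the exporter's /metrics endpoint."""
    return ('app_info 1.0', 200)
```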
32 changes: 21 additions & 11 deletions app/statistics/statsPhase.py
@@ -11,7 +11,7 @@
from app.statistics.statisticUtils import LogSyntaxError, calculateDuration, removeprefix
from app.statistics.statsLevel import StatsLevel

from app.utilsGame import PhaseType
from app.utilsGame import LevelType, PhaseType


class StatsPhase:
@@ -196,22 +196,35 @@ def onLevelRequested(self, event: EVENT_T):
nextLevelName: str = event['Filename']
expectedLevelType: str = ALL_LEVEL_TYPES[self.levels[self.levelCounter].type]

# New Special Case: The old asset folder was /res
if nextLevelName.startswith('/res/'):
logging.debug(f'Old asset folder: "{nextLevelName}"')
nextLevelName = nextLevelName.replace('res', 'assets')

# Check if the current level type matches the expected level type
if nextLevelType != expectedLevelType:
# Tutorials might get inserted dynamically
if nextLevelType in [ALL_LEVEL_TYPES[SLIDE_TYPE_TUTORIAL], ALL_LEVEL_TYPES[SLIDE_TYPE_SPECIAL]]:
if nextLevelType in [ALL_LEVEL_TYPES[LevelType.TUTORIAL], ALL_LEVEL_TYPES[LevelType.SPECIAL]]:
nextLevelTypeLog = next(k for k, v in ALL_LEVEL_TYPES.items() if v == nextLevelType)
tut = StatsLevel(nextLevelTypeLog, nextLevelName)
self.levels.insert(self.levelCounter, tut)
nextLevel = tut

# We might have too many levels in the queue, since the new JSON Level List
# allows picking one of multiple levels. Try to skip ahead to find a slide
# with matching type
elif nextLevelType == ALL_LEVEL_TYPES[LevelType.INFO]:
oldLevelCounter = self.levelCounter
for level in self.levels[self.levelCounter:]:
# Break if we found a slide with matching type
if ALL_LEVEL_TYPES[level.type] == nextLevelType:
nextLevel = self.levels[self.levelCounter]
assert nextLevel == level
break

# Increment counter and check if we reached the end of the phase
self.levelCounter += 1
if self.levelCounter > len(self.levels)-1:
raise LogSyntaxError(f'Unable to find a slide of type {nextLevelType} until the end of the phase')

# Unable to find a matching slide with all edge cases, raise error
else:
raise LogSyntaxError("Expected slide of type " + expectedLevelType + ", got " + nextLevelType + " in " + self.name + "!")


# If the level contains a circuit, the order might be randomized, so search for the level in the array
if nextLevelType in [ALL_LEVEL_TYPES['level'], ALL_LEVEL_TYPES['tutorial']]:
Expand Down Expand Up @@ -364,9 +377,6 @@ def createStatsPhase() -> Dict[str, int]:
THINKALOUD_CONFIG_OPTIONS = ['concurrent', 'retrospective']
THINKALOUD_SLIDE_NAMES = ['thinkaloudCon', 'thinkaloudRet']

SLIDE_TYPE_SPECIAL = 'special'
SLIDE_TYPE_TUTORIAL = 'tutorial'

INTRO_SLIDE_NAMES = ['covert', 'camouflage']
INTRO_SLIDE_NAMES_OLD = {
'Tutorial covert': INTRO_SLIDE_NAMES[0],
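The new skip-ahead in `onLevelRequested` can be reduced to a standalone forward scan (`skip_ahead` is an illustrative reduction, not the PR's exact code):

```python
def skip_ahead(levels: list, counter: int, wanted_type: str) -> int:
    """Advance counter to the next queued slide of wanted_type, mirroring the
    forward scan in onLevelRequested; raise if the phase runs out of slides."""
    for level in levels[counter:]:
        # Break as soon as a slide with matching type is found
        if level['type'] == wanted_type:
            return counter
        # Otherwise skip this slide and check if we reached the end of the phase
        counter += 1
        if counter > len(levels) - 1:
            raise ValueError(f'Unable to find a slide of type {wanted_type}')
    raise ValueError(f'Unable to find a slide of type {wanted_type}')
```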