feat(save-transfer): add saveutil CLI and token-gated import endpoint

Adds two complementary paths for transferring character save data between
Erupe instances without breaking the SHA-256 integrity check system:

- `cmd/saveutil/`: admin CLI with `import`, `export`, `grant-import`, and
  `revoke-import` subcommands. Direct DB access; does not require a running server.
- `POST /v2/characters/{id}/import`: player-facing API endpoint gated behind
  a one-time token issued by `saveutil grant-import` (default TTL 24 h).
  Token is validated and consumed atomically to prevent TOCTOU races.
- Migration `0013_save_transfer`: `savedata_import_token` and
  `savedata_import_token_expiry` columns on `characters` table.
- Both paths decompress incoming savedata and recompute the SHA-256 hash
  server-side, so the integrity check remains valid after import.
- README documents both methods and the per-character hash-reset workaround.

Closes #183.
Commit 5fe1b22550 (parent dbbfb927f8) by Houmgaor, 2026-03-21 20:14:58 +01:00; 9 changed files, 679 insertions, 0 deletions.


@@ -9,6 +9,10 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
### Added
- `saveutil` admin CLI (`cmd/saveutil/`): `import`, `export`, `grant-import`, and `revoke-import` commands for transferring character save data between server instances without touching the database manually.
- `POST /v2/characters/{id}/import` API endpoint: player-facing save import gated behind a one-time admin-granted token (generated by `saveutil grant-import`). Token expires after a configurable TTL (default 24 h).
- Database migration `0013_save_transfer`: adds `savedata_import_token` and `savedata_import_token_expiry` columns to the `characters` table.
- `DisableSaveIntegrityCheck` config flag: when `true`, the SHA-256 savedata integrity check is skipped on load. Intended for cross-server save transfers where the stored hash in the database does not match the imported save blob. Defaults to `false`. Individual characters can alternatively be unblocked with `UPDATE characters SET savedata_hash = NULL WHERE id = <id>`.
- Guild scout invitations now use a dedicated `guild_invites` table (migration `0012_guild_invites`), giving each invitation a real serial PK; the scout list response now returns accurate invite IDs and timestamps, and `CancelGuildScout` uses the correct PK instead of the character ID.
- Event Tent (campaign) system: code redemption, stamp tracking, reward claiming, and quest gating for special event quests, backed by 8 new database tables and seeded with community-researched live-game campaign data ([#182](https://github.com/Mezeporta/Erupe/pull/182), by stratick).
- Database migration `0010_campaign` (campaigns, campaign_categories, campaign_category_links, campaign_rewards, campaign_rewards_claimed, campaign_state, campaign_codes, campaign_quest).


@@ -130,6 +130,64 @@ Edit `config.json` before starting the server. The essential settings are:
`config.example.json` is intentionally minimal — all other settings have sane defaults built into the server. For the full configuration reference (gameplay multipliers, debug options, Discord integration, in-game commands, entrance/channel definitions), see [config.reference.json](./config.reference.json) and the [Erupe Wiki](https://github.com/Mezeporta/Erupe/wiki).
## Save Transfers
To move a character from one Erupe instance to another, use the `saveutil` admin tool.
### Build saveutil
```bash
go build -o saveutil ./cmd/saveutil/
```
### Method 1: Direct admin import (recommended)
This method does not require the server to be running.
**On the source server**, export the character:
```bash
./saveutil export --config config.json --char-id <SOURCE_ID> --output my_character.json
```
**On the destination server**, find the target character ID (use pgAdmin or `psql`), then import:
```bash
./saveutil import --config config.json --char-id <DEST_ID> --file my_character.json
```
### Method 2: Player self-service via API
This method lets players import their own save without admin DB access, but requires the admin to grant a one-time token first.
**Admin step** — grant a token (valid for 24 hours by default):
```bash
./saveutil grant-import --config config.json --char-id <DEST_ID> [--ttl 48h]
# → Import token for character 42: abc123...
```
Give the printed token to the player. They then call the import endpoint:
```bash
curl -X POST http://<server>:8080/v2/characters/<DEST_ID>/import \
-H "Authorization: Bearer <player_token>" \
-H "Content-Type: application/json" \
-d '{
"import_token": "<admin_token>",
"character": <paste contents of my_character.json .character field here>
}'
```
The token is consumed on success and cannot be reused. To cancel a pending grant:
```bash
./saveutil revoke-import --config config.json --char-id <DEST_ID>
```
### Troubleshooting
**"savedata integrity check failed"** — the character was imported directly into the DB without going through `saveutil`. Fix by clearing the stored hash:
```sql
UPDATE characters SET savedata_hash = NULL WHERE id = <char_id>;
```
The correct hash will be recomputed on the next save.
## Features
- **Multi-version Support**: Compatible with all Monster Hunter Frontier versions from Season 6.0 to ZZ

cmd/saveutil/main.go (new file, 334 lines)

@@ -0,0 +1,334 @@
// saveutil is an admin CLI for Erupe save data management.
//
// Usage:
//
// saveutil import --config config.json --char-id 42 --file export.json
// saveutil export --config config.json --char-id 42 [--output export.json]
// saveutil grant-import --config config.json --char-id 42 [--ttl 24h]
// saveutil revoke-import --config config.json --char-id 42
package main
import (
"crypto/rand"
"crypto/sha256"
"encoding/base64"
"encoding/hex"
"encoding/json"
"errors"
"flag"
"fmt"
"os"
"time"
"erupe-ce/server/channelserver/compression/nullcomp"
"github.com/jmoiron/sqlx"
_ "github.com/lib/pq"
)
// dbConfig is the minimal config subset needed to connect to PostgreSQL.
type dbConfig struct {
Database struct {
Host string `json:"Host"`
Port int `json:"Port"`
User string `json:"User"`
Password string `json:"Password"`
Database string `json:"Database"`
} `json:"Database"`
}
func main() {
if len(os.Args) < 2 {
printUsage()
os.Exit(1)
}
cmd := os.Args[1]
args := os.Args[2:]
var err error
switch cmd {
case "import":
err = runImport(args)
case "export":
err = runExport(args)
case "grant-import":
err = runGrantImport(args)
case "revoke-import":
err = runRevokeImport(args)
default:
fmt.Fprintf(os.Stderr, "unknown command: %s\n", cmd)
printUsage()
os.Exit(1)
}
if err != nil {
fmt.Fprintf(os.Stderr, "error: %v\n", err)
os.Exit(1)
}
}
func printUsage() {
fmt.Fprintln(os.Stderr, `saveutil — Erupe save data admin tool
Commands:
import --config config.json --char-id N --file export.json
export --config config.json --char-id N [--output file.json]
grant-import --config config.json --char-id N [--ttl 24h]
revoke-import --config config.json --char-id N`)
}
// openDB parses config.json and returns an open database connection.
func openDB(configPath string) (*sqlx.DB, error) {
data, err := os.ReadFile(configPath)
if err != nil {
return nil, fmt.Errorf("read config: %w", err)
}
var cfg dbConfig
if err := json.Unmarshal(data, &cfg); err != nil {
return nil, fmt.Errorf("parse config: %w", err)
}
dsn := fmt.Sprintf(
"host='%s' port='%d' user='%s' password='%s' dbname='%s' sslmode=disable",
cfg.Database.Host, cfg.Database.Port,
cfg.Database.User, cfg.Database.Password,
cfg.Database.Database,
)
db, err := sqlx.Open("postgres", dsn)
if err != nil {
return nil, fmt.Errorf("open db: %w", err)
}
if err := db.Ping(); err != nil {
return nil, fmt.Errorf("ping db: %w", err)
}
return db, nil
}
// generateToken returns a 32-byte cryptographically random hex token.
func generateToken() (string, error) {
b := make([]byte, 32)
if _, err := rand.Read(b); err != nil {
return "", err
}
return hex.EncodeToString(b), nil
}
// --- import ---
func runImport(args []string) error {
fs := flag.NewFlagSet("import", flag.ExitOnError)
configPath := fs.String("config", "config.json", "Path to config.json")
charID := fs.Uint("char-id", 0, "Destination character ID")
filePath := fs.String("file", "", "Path to export JSON file (required)")
_ = fs.Parse(args)
if *charID == 0 {
return errors.New("--char-id is required")
}
if *filePath == "" {
return errors.New("--file is required")
}
db, err := openDB(*configPath)
if err != nil {
return err
}
defer func() { _ = db.Close() }()
// Read and parse the export JSON.
raw, err := os.ReadFile(*filePath)
if err != nil {
return fmt.Errorf("read file: %w", err)
}
var export struct {
Character map[string]interface{} `json:"character"`
}
if err := json.Unmarshal(raw, &export); err != nil {
return fmt.Errorf("parse export JSON: %w", err)
}
if export.Character == nil {
return errors.New("export JSON has no 'character' key")
}
blobs, err := extractAllBlobs(export.Character)
if err != nil {
return fmt.Errorf("extract blobs: %w", err)
}
// Compute savedata hash.
var savedataHash []byte
if len(blobs["savedata"]) > 0 {
decompressed, err := nullcomp.Decompress(blobs["savedata"])
if err != nil {
return fmt.Errorf("decompress savedata: %w", err)
}
h := sha256.Sum256(decompressed)
savedataHash = h[:]
}
	res, err := db.Exec(
		`UPDATE characters SET
			savedata=$1, savedata_hash=$2, decomyset=$3, hunternavi=$4,
			otomoairou=$5, partner=$6, platebox=$7, platedata=$8,
			platemyset=$9, rengokudata=$10, savemercenary=$11, gacha_items=$12,
			house_info=$13, login_boost=$14, skin_hist=$15, scenariodata=$16,
			savefavoritequest=$17, mezfes=$18,
			savedata_import_token=NULL, savedata_import_token_expiry=NULL
		WHERE id=$19`,
		blobs["savedata"], savedataHash, blobs["decomyset"], blobs["hunternavi"],
		blobs["otomoairou"], blobs["partner"], blobs["platebox"], blobs["platedata"],
		blobs["platemyset"], blobs["rengokudata"], blobs["savemercenary"], blobs["gacha_items"],
		blobs["house_info"], blobs["login_boost"], blobs["skin_hist"], blobs["scenariodata"],
		blobs["savefavoritequest"], blobs["mezfes"],
		*charID,
	)
	if err != nil {
		return fmt.Errorf("update characters: %w", err)
	}
	// Mirror grant-import: fail loudly instead of silently updating zero rows
	// when the destination character does not exist.
	n, _ := res.RowsAffected()
	if n == 0 {
		return fmt.Errorf("character %d not found", *charID)
	}
fmt.Printf("Save data imported into character %d\n", *charID)
return nil
}
// --- export ---
func runExport(args []string) error {
fs := flag.NewFlagSet("export", flag.ExitOnError)
configPath := fs.String("config", "config.json", "Path to config.json")
charID := fs.Uint("char-id", 0, "Character ID to export")
outputPath := fs.String("output", "", "Output file (default: stdout)")
_ = fs.Parse(args)
if *charID == 0 {
return errors.New("--char-id is required")
}
db, err := openDB(*configPath)
if err != nil {
return err
}
defer func() { _ = db.Close() }()
row := db.QueryRowx("SELECT * FROM characters WHERE id=$1", *charID)
result := make(map[string]interface{})
if err := row.MapScan(result); err != nil {
return fmt.Errorf("query character: %w", err)
}
export := map[string]interface{}{"character": result}
enc := json.NewEncoder(os.Stdout)
if *outputPath != "" {
f, err := os.Create(*outputPath)
if err != nil {
return fmt.Errorf("create output file: %w", err)
}
defer func() { _ = f.Close() }()
enc = json.NewEncoder(f)
}
enc.SetIndent("", " ")
if err := enc.Encode(export); err != nil {
return fmt.Errorf("encode JSON: %w", err)
}
if *outputPath != "" {
fmt.Printf("Character %d exported to %s\n", *charID, *outputPath)
}
return nil
}
// --- grant-import ---
func runGrantImport(args []string) error {
fs := flag.NewFlagSet("grant-import", flag.ExitOnError)
configPath := fs.String("config", "config.json", "Path to config.json")
charID := fs.Uint("char-id", 0, "Character ID to grant import permission for")
ttl := fs.Duration("ttl", 24*time.Hour, "Token validity duration (e.g. 24h, 48h)")
_ = fs.Parse(args)
if *charID == 0 {
return errors.New("--char-id is required")
}
db, err := openDB(*configPath)
if err != nil {
return err
}
defer func() { _ = db.Close() }()
token, err := generateToken()
if err != nil {
return fmt.Errorf("generate token: %w", err)
}
expiry := time.Now().Add(*ttl)
res, err := db.Exec(
`UPDATE characters SET savedata_import_token=$1, savedata_import_token_expiry=$2 WHERE id=$3`,
token, expiry, *charID,
)
if err != nil {
return fmt.Errorf("update characters: %w", err)
}
n, _ := res.RowsAffected()
if n == 0 {
return fmt.Errorf("character %d not found", *charID)
}
fmt.Printf("Import token for character %d (expires %s):\n%s\n",
*charID, expiry.Format(time.RFC3339), token)
return nil
}
// --- revoke-import ---
func runRevokeImport(args []string) error {
fs := flag.NewFlagSet("revoke-import", flag.ExitOnError)
configPath := fs.String("config", "config.json", "Path to config.json")
charID := fs.Uint("char-id", 0, "Character ID to revoke import permission for")
_ = fs.Parse(args)
if *charID == 0 {
return errors.New("--char-id is required")
}
db, err := openDB(*configPath)
if err != nil {
return err
}
defer func() { _ = db.Close() }()
_, err = db.Exec(
`UPDATE characters SET savedata_import_token=NULL, savedata_import_token_expiry=NULL WHERE id=$1`,
*charID,
)
if err != nil {
return fmt.Errorf("update characters: %w", err)
}
fmt.Printf("Import token revoked for character %d\n", *charID)
return nil
}
// blobColumns is the ordered list of transferable save blob column names.
var blobColumns = []string{
"savedata", "decomyset", "hunternavi", "otomoairou", "partner",
"platebox", "platedata", "platemyset", "rengokudata", "savemercenary",
"gacha_items", "house_info", "login_boost", "skin_hist", "scenariodata",
"savefavoritequest", "mezfes",
}
// extractAllBlobs decodes all save blob columns from a character export map.
func extractAllBlobs(m map[string]interface{}) (map[string][]byte, error) {
out := make(map[string][]byte, len(blobColumns))
for _, col := range blobColumns {
v, ok := m[col]
if !ok || v == nil {
out[col] = nil
continue
}
s, ok := v.(string)
if !ok {
return nil, fmt.Errorf("column %q: expected string, got %T", col, v)
}
b, err := base64.StdEncoding.DecodeString(s)
if err != nil {
return nil, fmt.Errorf("column %q: base64: %w", col, err)
}
out[col] = b
}
return out, nil
}


@@ -94,6 +94,7 @@ func (s *APIServer) Start() error {
v2Auth.HandleFunc("/characters/{id}/delete", s.DeleteCharacter).Methods("POST")
v2Auth.HandleFunc("/characters/{id}", s.DeleteCharacter).Methods("DELETE")
v2Auth.HandleFunc("/characters/{id}/export", s.ExportSave).Methods("GET")
v2Auth.HandleFunc("/characters/{id}/import", s.ImportSave).Methods("POST")
handler := handlers.CORS(
handlers.AllowedHeaders([]string{"Content-Type", "Authorization"}),


@@ -2,13 +2,16 @@ package api
import (
"context"
"crypto/sha256"
"database/sql"
"encoding/base64"
"encoding/json"
"encoding/xml"
"errors"
"erupe-ce/common/gametime"
"erupe-ce/common/mhfcourse"
cfg "erupe-ce/config"
"erupe-ce/server/channelserver/compression/nullcomp"
"fmt"
"image"
"image/jpeg"
@@ -590,3 +593,151 @@ func (s *APIServer) Health(w http.ResponseWriter, r *http.Request) {
"status": "ok",
})
}
// ImportSave handles POST /v2/characters/{id}/import.
// The request body must contain a one-time import_token (granted by an admin
// via saveutil) plus a character export blob in the same format as ExportSave.
func (s *APIServer) ImportSave(w http.ResponseWriter, r *http.Request) {
ctx := r.Context()
userID, _ := UserIDFromContext(ctx)
var charID uint32
if _, err := fmt.Sscanf(mux.Vars(r)["id"], "%d", &charID); err != nil {
writeError(w, http.StatusBadRequest, "invalid_request", "Invalid character ID")
return
}
var req struct {
ImportToken string `json:"import_token"`
Character map[string]interface{} `json:"character"`
}
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
writeError(w, http.StatusBadRequest, "invalid_request", "Malformed request body")
return
}
	if req.ImportToken == "" {
		writeError(w, http.StatusBadRequest, "missing_token", "import_token is required")
		return
	}
	// Reject an absent character payload up front; otherwise every blob
	// extracts as nil and the import would silently wipe the save.
	if req.Character == nil {
		writeError(w, http.StatusBadRequest, "invalid_request", "character payload is required")
		return
	}
blobs, err := saveBlobsFromMap(req.Character)
if err != nil {
s.logger.Warn("ImportSave: failed to extract blobs", zap.Error(err), zap.Uint32("charID", charID))
writeError(w, http.StatusBadRequest, "invalid_request", "Invalid save data: "+err.Error())
return
}
// Compute savedata hash server-side.
if len(blobs.Savedata) > 0 {
decompressed, err := nullcomp.Decompress(blobs.Savedata)
if err != nil {
writeError(w, http.StatusBadRequest, "invalid_request", "savedata decompression failed")
return
}
h := sha256.Sum256(decompressed)
blobs.SavedataHash = h[:]
}
if err := s.charRepo.ImportSave(ctx, charID, userID, req.ImportToken, blobs); err != nil {
s.logger.Warn("ImportSave: failed", zap.Error(err), zap.Uint32("charID", charID))
writeError(w, http.StatusForbidden, "import_denied", "Import token invalid, expired, or character not owned by user")
return
}
s.logger.Info("ImportSave: save imported successfully", zap.Uint32("charID", charID), zap.Uint32("userID", userID))
w.WriteHeader(http.StatusOK)
}
// saveBlobsFromMap extracts save blob columns from an export character map.
// Values must be base64-encoded strings (as produced by json.Marshal on []byte).
func saveBlobsFromMap(m map[string]interface{}) (SaveBlobs, error) {
	var b SaveBlobs
	// Map each export key to its destination field so the seventeen
	// near-identical extract-and-check blocks collapse into one loop.
	fields := []struct {
		key string
		dst *[]byte
	}{
		{"savedata", &b.Savedata},
		{"decomyset", &b.Decomyset},
		{"hunternavi", &b.Hunternavi},
		{"otomoairou", &b.Otomoairou},
		{"partner", &b.Partner},
		{"platebox", &b.Platebox},
		{"platedata", &b.Platedata},
		{"platemyset", &b.Platemyset},
		{"rengokudata", &b.Rengokudata},
		{"savemercenary", &b.Savemercenary},
		{"gacha_items", &b.GachaItems},
		{"house_info", &b.HouseInfo},
		{"login_boost", &b.LoginBoost},
		{"skin_hist", &b.SkinHist},
		{"scenariodata", &b.Scenariodata},
		{"savefavoritequest", &b.Savefavoritequest},
		{"mezfes", &b.Mezfes},
	}
	for _, f := range fields {
		v, err := extractBlob(m, f.key)
		if err != nil {
			return b, err
		}
		*f.dst = v
	}
	return b, nil
}
// extractBlob decodes a single base64-encoded blob from a character export map.
// Returns nil (not an error) if the key is absent or its value is JSON null.
func extractBlob(m map[string]interface{}, key string) ([]byte, error) {
v, ok := m[key]
if !ok || v == nil {
return nil, nil
}
s, ok := v.(string)
if !ok {
return nil, fmt.Errorf("field %q: expected base64 string, got %T", key, v)
}
b, err := base64.StdEncoding.DecodeString(s)
if err != nil {
return nil, fmt.Errorf("field %q: base64 decode: %w", key, err)
}
return b, nil
}


@@ -2,6 +2,8 @@ package api
import (
"context"
"errors"
"time"
"github.com/jmoiron/sqlx"
)
@@ -89,3 +91,80 @@ func (r *APICharacterRepository) ExportSave(ctx context.Context, userID, charID
}
return result, nil
}
func (r *APICharacterRepository) GrantImportToken(ctx context.Context, charID, userID uint32, token string, expiry time.Time) error {
res, err := r.db.ExecContext(ctx,
`UPDATE characters SET savedata_import_token=$1, savedata_import_token_expiry=$2
WHERE id=$3 AND user_id=$4 AND deleted=false`,
token, expiry, charID, userID,
)
if err != nil {
return err
}
n, err := res.RowsAffected()
if err != nil {
return err
}
if n == 0 {
return errors.New("character not found or not owned by user")
}
return nil
}
func (r *APICharacterRepository) RevokeImportToken(ctx context.Context, charID, userID uint32) error {
_, err := r.db.ExecContext(ctx,
`UPDATE characters SET savedata_import_token=NULL, savedata_import_token_expiry=NULL
WHERE id=$1 AND user_id=$2`,
charID, userID,
)
return err
}
func (r *APICharacterRepository) ImportSave(ctx context.Context, charID, userID uint32, token string, blobs SaveBlobs) error {
tx, err := r.db.BeginTxx(ctx, nil)
if err != nil {
return err
}
defer func() { _ = tx.Rollback() }()
// Validate token ownership and expiry, then clear it — all in one UPDATE.
res, err := tx.ExecContext(ctx,
`UPDATE characters
SET savedata_import_token=NULL, savedata_import_token_expiry=NULL
WHERE id=$1 AND user_id=$2
AND savedata_import_token=$3
AND savedata_import_token_expiry > now()`,
charID, userID, token,
)
if err != nil {
return err
}
n, err := res.RowsAffected()
if err != nil {
return err
}
if n == 0 {
return errors.New("import token invalid, expired, or character not owned by user")
}
// Write all save blobs.
_, err = tx.ExecContext(ctx,
`UPDATE characters SET
savedata=$1, savedata_hash=$2, decomyset=$3, hunternavi=$4,
otomoairou=$5, partner=$6, platebox=$7, platedata=$8,
platemyset=$9, rengokudata=$10, savemercenary=$11, gacha_items=$12,
house_info=$13, login_boost=$14, skin_hist=$15, scenariodata=$16,
savefavoritequest=$17, mezfes=$18
WHERE id=$19`,
blobs.Savedata, blobs.SavedataHash, blobs.Decomyset, blobs.Hunternavi,
blobs.Otomoairou, blobs.Partner, blobs.Platebox, blobs.Platedata,
blobs.Platemyset, blobs.Rengokudata, blobs.Savemercenary, blobs.GachaItems,
blobs.HouseInfo, blobs.LoginBoost, blobs.SkinHist, blobs.Scenariodata,
blobs.Savefavoritequest, blobs.Mezfes,
charID,
)
if err != nil {
return err
}
return tx.Commit()
}


@@ -8,6 +8,29 @@ import (
// Repository interfaces decouple API server business logic from concrete
// PostgreSQL implementations, enabling mock/stub injection for unit tests.
// SaveBlobs holds the transferable save data columns for a character.
// SavedataHash must be set by the caller (SHA-256 of decompressed Savedata).
type SaveBlobs struct {
Savedata []byte
SavedataHash []byte
Decomyset []byte
Hunternavi []byte
Otomoairou []byte
Partner []byte
Platebox []byte
Platedata []byte
Platemyset []byte
Rengokudata []byte
Savemercenary []byte
GachaItems []byte
HouseInfo []byte
LoginBoost []byte
SkinHist []byte
Scenariodata []byte
Savefavoritequest []byte
Mezfes []byte
}
// APIUserRepo defines the contract for user-related data access.
type APIUserRepo interface {
// Register creates a new user and returns their ID and rights.
@@ -42,6 +65,13 @@ type APICharacterRepo interface {
GetForUser(ctx context.Context, userID uint32) ([]Character, error)
// ExportSave returns the full character row as a map.
ExportSave(ctx context.Context, userID, charID uint32) (map[string]interface{}, error)
// GrantImportToken sets a one-time import token for a character owned by userID.
GrantImportToken(ctx context.Context, charID, userID uint32, token string, expiry time.Time) error
// RevokeImportToken clears any pending import token for a character owned by userID.
RevokeImportToken(ctx context.Context, charID, userID uint32) error
// ImportSave atomically validates+consumes the import token and writes all save blobs.
// Returns an error if the token is invalid, expired, or the character doesn't belong to userID.
ImportSave(ctx context.Context, charID, userID uint32, token string, blobs SaveBlobs) error
}
// APIEventRepo defines the contract for read-only event data access.


@@ -72,6 +72,10 @@ type mockAPICharacterRepo struct {
exportResult map[string]interface{}
exportErr error
grantImportTokenErr error
revokeImportTokenErr error
importSaveErr error
}
func (m *mockAPICharacterRepo) GetNewCharacter(_ context.Context, _ uint32) (Character, error) {
@@ -106,6 +110,18 @@ func (m *mockAPICharacterRepo) ExportSave(_ context.Context, _, _ uint32) (map[s
return m.exportResult, m.exportErr
}
func (m *mockAPICharacterRepo) GrantImportToken(_ context.Context, _, _ uint32, _ string, _ time.Time) error {
return m.grantImportTokenErr
}
func (m *mockAPICharacterRepo) RevokeImportToken(_ context.Context, _, _ uint32) error {
return m.revokeImportTokenErr
}
func (m *mockAPICharacterRepo) ImportSave(_ context.Context, _, _ uint32, _ string, _ SaveBlobs) error {
return m.importSaveErr
}
// mockAPIEventRepo implements APIEventRepo for testing.
type mockAPIEventRepo struct {
featureWeapon *FeatureWeaponRow


@@ -0,0 +1,6 @@
-- Save transfer tokens: one-time admin-granted permission for a character
-- to receive an imported save via the API endpoint.
-- NULL means no import is pending for this character.
ALTER TABLE characters
ADD COLUMN IF NOT EXISTS savedata_import_token TEXT,
ADD COLUMN IF NOT EXISTS savedata_import_token_expiry TIMESTAMPTZ;