diff --git a/CHANGELOG.md b/CHANGELOG.md index dd210a336..32bdd19a1 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -7,6 +7,31 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 ## [Unreleased] +### Added + +- Reverse-engineered user binary data types from `mhfo-hd.dll` via Ghidra: type 1 = character name (max 17B SJIS), type 2 = player profile with self-introduction (208B), type 3 = equipment/appearance snapshot (384B). Added structured parsing with size validation warnings to `handleMsgSysSetUserBinary`. +- French (`fr`) and Spanish (`es`) server language translations. Set `"Language": "fr"` or `"Language": "es"` in `config.json` to activate. +- `TestLangCompleteness` uses reflection to verify that every string field in `i18n` is populated for all registered languages — catches missing translations at CI time rather than silently serving empty strings in-game. +- Server-generated strings (commands, mail templates, Raviente announcements, Diva bead names, guild names) are now split into one file per language (`lang_en.go`, `lang_jp.go`, etc.). Adding a new language requires only a single self-contained file and a one-line registration in `getLangStrings` ([#185](https://github.com/Mezeporta/Erupe/issues/185)). +- Hunting Tournament system: all six tournament handlers are now fully implemented and DB-backed. `MsgMhfEnterTournamentQuest` (0x00D2) wire format was derived from `mhfo-hd.dll` binary analysis. Schedule, cups, sub-events, player registrations, and run submissions are stored in five new tables. `EnumerateRanking` returns the active tournament schedule with phase-state computation; `EnumerateOrder` returns per-event leaderboards ranked by submission time. `TournamentDefaults.sql` seeds cup and sub-event data from live tournament #150. One field (`Unk2` / event_id mapping) remains unconfirmed pending a packet capture ([#184](https://github.com/Mezeporta/Erupe/issues/184)). 
Database migration `0021_tournament` (`tournaments`, `tournament_cups`, `tournament_sub_events`, `tournament_entries`, `tournament_results`). +- Return/Rookie Guild system: new players are automatically placed in a temporary rookie guild (`return_type=1`) and returning players in a comeback guild (`return_type=2`) via `MSG_MHF_ENTRY_ROOKIE_GUILD`. Players graduate (leave) via `OperateGuildGraduateRookie`/`OperateGuildGraduateReturn`. Guild info response now reports `isReturnGuild` correctly. Database migration `0020_return_guilds` adds `return_type` to the `guilds` table. +- `saveutil` admin CLI (`cmd/saveutil/`): `import`, `export`, `grant-import`, and `revoke-import` commands for transferring character save data between server instances without touching the database manually. +- `POST /v2/characters/{id}/import` API endpoint: player-facing save import gated behind a one-time admin-granted token (generated by `saveutil grant-import`). Token expires after a configurable TTL (default 24 h). +- Database migration `0019_save_transfer`: adds `savedata_import_token` and `savedata_import_token_expiry` columns to the `characters` table. +- Guild scout invitations now use a dedicated `guild_invites` table (migration `0018_guild_invites`), giving each invitation a real serial PK; the scout list response now returns accurate invite IDs and timestamps, and `CancelGuildScout` uses the correct PK instead of the character ID. +- Event Tent (campaign) system: code redemption, stamp tracking, reward claiming, and quest gating for special event quests, backed by 8 new database tables and seeded with community-researched live-game campaign data ([#182](https://github.com/Mezeporta/Erupe/pull/182), by stratick). +- Database migration `0016_campaign` (campaigns, campaign_categories, campaign_category_links, campaign_rewards, campaign_rewards_claimed, campaign_state, campaign_codes, campaign_quest). 
+- JSON Hunting Road config: `bin/rengoku_data.json` is now supported as a human-readable alternative to the opaque `rengoku_data.bin` — the server assembles and ECD-encrypts the binary at startup, with `.bin` used as a fallback ([#173](https://github.com/Mezeporta/Erupe/issues/173)). +- JSON scenario files: `.json` files in `bin/scenarios/` are now supported alongside `.bin` — the server tries `.bin` first, then compiles `.json` on demand. Supports sub-header chunks (flags 0x01/0x02, strings UTF-8 → Shift-JIS, opaque metadata preserved as base64), inline episode listings (flag 0x08), and raw JKR blob chunks (flags 0x10/0x20) ([#172](https://github.com/Mezeporta/Erupe/issues/172)). A `ParseScenarioBinary` function allows existing `.bin` files to be exported to JSON. Fixed off-by-one in JPK decompressor that caused the last literal byte to be dropped. +- JKR type-3 (LZ77) compressor added (`common/decryption.PackSimple`), the inverse of `UnpackSimple`, ported from ReFrontier `JPKEncodeLz.cs` ([#172](https://github.com/Mezeporta/Erupe/issues/172)). +- JSON quest files: `.json` files in `bin/quests/` are now supported alongside `.bin` — the server tries `.bin` first (full backward compatibility), then compiles `.json` on the fly to the MHF binary wire format ([#160](https://github.com/Mezeporta/Erupe/issues/160)). Covers all binary sections: quest text (UTF-8 → Shift-JIS), all 12 objective types, monster spawns (large + minion), reward tables, supply box, loaded stages, rank requirements, variant flags, forced equipment, map sections, area transitions, coordinate mappings, map info, gathering points, gathering tables, and area facilities. A `ParseQuestBinary` reverse function allows existing `.bin` files to be inspected and exported to JSON. +- Diva Defense (UD) system: full implementation of prayer bead selection, point accumulation, interception mechanics, and reward tracking. 
Adds 4 new tables (`diva_beads`, `diva_beads_assignment`, `diva_beads_points`, `diva_prizes`) and columns for guild/character interception state. Seeded with 26 prize milestones for personal and guild reward tracks. Packet handlers corrected against mhfo-hd.dll RE findings: `GetKijuInfo` (546 bytes/entry with color_id and bead_type), `GetUdMyPoint` (8×18-byte entries, no count prefix), `GetUdTotalPointInfo` (u64[64] thresholds + u8[64] types + u64 total), `GetUdSelectedColorInfo` (9 bytes), present/reward list formats, `GetRewardSong` (22-byte layout), and `AddRewardSongCount` parse (was NOT IMPLEMENTED stub). EN/JP bead names for all 18 bead types. Original system design by wish, with contributions from stratic-dev, Samboge, Re-Nest, and Houmgaor (feature/diva branch). Supplemental RE documentation by ezemania2.
+- Database migration `0017_diva` (bead pool, assignments, points log, prizes, interception columns).
+
+### Fixed
+
+- Fixed backup recovery panic: `recoverFromBackups` now rejects decompressed backup data smaller than the minimum save layout size, preventing a slice-bounds panic when nullcomp passes through garbage bytes as "already decompressed" data ([#182](https://github.com/Mezeporta/Erupe/pull/182)).
+
 ## [9.3.2] - 2026-04-06
 
 ### Added
@@ -31,8 +56,8 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 
 ### Added
 
-- `DisableSaveIntegrityCheck` config flag: when `true`, the SHA-256 savedata integrity check is skipped on load.
-Intended for cross-server save transfers where the stored hash in the database does not match the imported save blob.
+- `DisableSaveIntegrityCheck` config flag: when `true`, the SHA-256 savedata integrity check is skipped on load.
+Intended for cross-server save transfers where the stored hash in the database does not match the imported save blob. Defaults to `false`. Affected characters can alternatively be unblocked per-character with `UPDATE characters SET savedata_hash = NULL WHERE id = <id>`.
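The minimum-size guard described in the `recoverFromBackups` fix above can be sketched as follows. This is a minimal illustration only: `checkRecoveredSave` and the `minSaveSize` value are hypothetical names, not Erupe's actual identifiers.

```go
package main

import "fmt"

// checkRecoveredSave is a hypothetical stand-in for the validation added to
// recoverFromBackups: decompressed backup data shorter than the minimum save
// layout is rejected up front instead of being sliced blindly later, which
// is what previously caused the slice-bounds panic when nullcomp passed
// garbage bytes through as "already decompressed" data.
func checkRecoveredSave(decompressed []byte, minSaveSize int) error {
	if len(decompressed) < minSaveSize {
		return fmt.Errorf("recovered backup too small: %d bytes, need at least %d",
			len(decompressed), minSaveSize)
	}
	return nil
}

func main() {
	// A two-byte garbage blob is rejected rather than sliced downstream.
	fmt.Println(checkRecoveredSave([]byte{0xDE, 0xAD}, 64) != nil) // prints "true"
}
```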
@@ -520,6 +545,6 @@ Initial Community Edition and foundational work: This changelog documents all known changes from the Community Edition reupload (February 25, 2022) onwards. The period before this (Einherjar Team era, ~2020-2022) has no public git history. -Earlier development by Cappuccino/Ellie42 (March 2020) focused on basic server infrastructure, multiplayer systems, and core functionality. See [AUTHORS.md](AUTHORS.md) for detailed development history. +Earlier development by Cappuccino/Ellie42 (March 2020) focused on basic server infrastructure, multiplayer systems, and core functionality. See [HISTORY.md](HISTORY.md) for detailed development history. The project began following semantic versioning with v9.0.0 (August 3, 2022) and maintains tagged releases for stable versions. Development continues on the main branch with features merged from feature branches. diff --git a/CLAUDE.md b/CLAUDE.md index 1c4bff185..f32b27c4d 100644 --- a/CLAUDE.md +++ b/CLAUDE.md @@ -165,4 +165,4 @@ golangci-lint run ./... # Must pass with zero errors - Branch naming: `feature/`, `fix/`, `refactor/`, `docs/` - Commit messages: conventional commits (`feat:`, `fix:`, `refactor:`, `docs:`) -- Update `CHANGELOG.md` under "Unreleased" for all changes +- Update `CHANGELOG.md` under "Unreleased" for every feature or fix — one concise line per change (two lines maximum) diff --git a/AUTHORS.md b/HISTORY.md similarity index 90% rename from AUTHORS.md rename to HISTORY.md index a99170750..ce53e714d 100644 --- a/AUTHORS.md +++ b/HISTORY.md @@ -168,6 +168,17 @@ This version transformed Erupe from a proof-of-concept into a feature-complete, This is a recent fork and its specific goals or contributions are not yet documented. +### Houmgaor / Mogapédia (2025-present) + + (continued from ZeruLight/Mezeporta) + +Houmgaor, maintainer of the French MH community Mogapédia, took over active development with a focus on engineering quality and production reliability. 
Key contributions:
+
+* Shift-left quality gates: mandatory race-detector CI, coverage floor (≥50%), `golangci-lint`, mock repo pattern for unit tests without a database
+* Savedata integrity: SHA-256 checksums, rotating backups, transparent corruption recovery
+* Operator tooling: interactive setup wizard, live dashboard, protocol bot for integration testing
+* New game systems: Event Tent campaigns, JSON quest files, JSON Hunting Road config, achievement rank-up notifications, Diva Defense point accumulation
+
 ## Authorship of the code
 
 Authorship is assigned for each commit within the git history, which is stored in these git repos:
diff --git a/README.md b/README.md
index 487a605ab..400d4825e 100644
--- a/README.md
+++ b/README.md
@@ -100,6 +100,18 @@ These files contain quest definitions and scenario data that the server sends to
 **Without these files, quests will not load and the client will crash.**
+### JSON Format Support
+
+As an alternative to opaque `.bin` files, Erupe supports human-readable `.json` files for quests, scenarios, and Hunting Road config. For quests and scenarios the server tries `.bin` first and falls back to `.json` automatically; for Hunting Road, `rengoku_data.json` takes precedence with `rengoku_data.bin` as the fallback. Existing binary files work unchanged.
+
+| File type | Location | Documentation |
+|-----------|----------|---------------|
+| Quest | `bin/quests/<name>.json` | Erupe wiki |
+| Scenario | `bin/scenarios/<name>.json` | `docs/scenario-format.md` |
+| Hunting Road | `bin/rengoku_data.json` | Erupe wiki |
+
+JSON quests and scenarios use UTF-8 text (converted to Shift-JIS on the wire), making them diff-friendly and editable without binary tools.
+
 ## Client Setup
 
 1. Obtain a Monster Hunter Frontier client (version G10 or later recommended)
@@ -149,6 +161,64 @@ Edit `config.json` before starting the server. The essential settings are:
 
 `config.example.json` is intentionally minimal — all other settings have sane defaults built into the server.
For the full configuration reference (gameplay multipliers, debug options, Discord integration, in-game commands, entrance/channel definitions), see [config.reference.json](./config.reference.json) and the [Erupe Wiki](https://github.com/Mezeporta/Erupe/wiki).
+## Save Transfers
+
+To move a character from one Erupe instance to another, use the `saveutil` admin tool.
+
+### Build saveutil
+
+```bash
+go build -o saveutil ./cmd/saveutil/
+```
+
+### Method 1: Direct admin import (recommended)
+
+This method does not require the server to be running.
+
+**On the source server**, export the character:
+```bash
+./saveutil export --config config.json --char-id <char_id> --output my_character.json
+```
+
+**On the destination server**, find the target character ID (use pgAdmin or `psql`), then import:
+```bash
+./saveutil import --config config.json --char-id <char_id> --file my_character.json
+```
+
+### Method 2: Player self-service via API
+
+This method lets players import their own save without admin DB access, but requires the admin to grant a one-time token first.
+
+**Admin step** — grant a token (valid for 24 hours by default):
+```bash
+./saveutil grant-import --config config.json --char-id <char_id> [--ttl 48h]
+# → Import token for character 42: abc123...
+```
+
+Give the printed token to the player. They then call the import endpoint:
+```bash
+curl -X POST http://<server>:8080/v2/characters/<char_id>/import \
+  -H "Authorization: Bearer <token>" \
+  -H "Content-Type: application/json" \
+  -d '{
+    "import_token": "<token>",
+    "character": <contents of the export file's "character" object>
+  }'
+```
+
+The token is consumed on success and cannot be reused. To cancel a pending grant:
+```bash
+./saveutil revoke-import --config config.json --char-id <char_id>
+```
+
+### Troubleshooting
+
+**"savedata integrity check failed"** — the character was imported directly into the DB without going through `saveutil`. Fix by clearing the stored hash:
+```sql
+UPDATE characters SET savedata_hash = NULL WHERE id = <char_id>;
+```
+The correct hash will be recomputed on the next save.
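For context on why clearing the hash works: Erupe stores a SHA-256 digest of the *decompressed* savedata blob, and `saveutil import` computes the same value before writing the row. The computation can be sketched as below; the real tool first decompresses the blob with Erupe's internal `nullcomp` package, which is omitted here so the example stays self-contained, and stores the raw digest bytes rather than the hex string shown.

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// savedataHash mirrors the value saveutil stores in characters.savedata_hash:
// SHA-256 over the decompressed savedata bytes. Hex-encoded here purely for
// display; the database column holds the raw 32-byte digest.
func savedataHash(decompressed []byte) string {
	sum := sha256.Sum256(decompressed)
	return hex.EncodeToString(sum[:])
}

func main() {
	fmt.Println(savedataHash([]byte("example savedata")))
}
```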
+ ## Features - **Multi-version Support**: Compatible with all Monster Hunter Frontier versions from Season 6.0 to ZZ @@ -270,4 +340,4 @@ See [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines. ## Authors -A list of authors can be found at [AUTHORS.md](AUTHORS.md). +A list of authors can be found at [HISTORY.md](HISTORY.md). diff --git a/cmd/saveutil/main.go b/cmd/saveutil/main.go new file mode 100644 index 000000000..61ab96f97 --- /dev/null +++ b/cmd/saveutil/main.go @@ -0,0 +1,334 @@ +// saveutil is an admin CLI for Erupe save data management. +// +// Usage: +// +// saveutil import --config config.json --char-id 42 --file export.json +// saveutil export --config config.json --char-id 42 [--output export.json] +// saveutil grant-import --config config.json --char-id 42 [--ttl 24h] +// saveutil revoke-import --config config.json --char-id 42 +package main + +import ( + "crypto/rand" + "crypto/sha256" + "encoding/base64" + "encoding/hex" + "encoding/json" + "errors" + "flag" + "fmt" + "os" + "time" + + "erupe-ce/server/channelserver/compression/nullcomp" + + "github.com/jmoiron/sqlx" + _ "github.com/lib/pq" +) + +// dbConfig is the minimal config subset needed to connect to PostgreSQL. 
+type dbConfig struct { + Database struct { + Host string `json:"Host"` + Port int `json:"Port"` + User string `json:"User"` + Password string `json:"Password"` + Database string `json:"Database"` + } `json:"Database"` +} + +func main() { + if len(os.Args) < 2 { + printUsage() + os.Exit(1) + } + cmd := os.Args[1] + args := os.Args[2:] + + var err error + switch cmd { + case "import": + err = runImport(args) + case "export": + err = runExport(args) + case "grant-import": + err = runGrantImport(args) + case "revoke-import": + err = runRevokeImport(args) + default: + fmt.Fprintf(os.Stderr, "unknown command: %s\n", cmd) + printUsage() + os.Exit(1) + } + if err != nil { + fmt.Fprintf(os.Stderr, "error: %v\n", err) + os.Exit(1) + } +} + +func printUsage() { + fmt.Fprintln(os.Stderr, `saveutil — Erupe save data admin tool + +Commands: + import --config config.json --char-id N --file export.json + export --config config.json --char-id N [--output file.json] + grant-import --config config.json --char-id N [--ttl 24h] + revoke-import --config config.json --char-id N`) +} + +// openDB parses config.json and returns an open database connection. +func openDB(configPath string) (*sqlx.DB, error) { + data, err := os.ReadFile(configPath) + if err != nil { + return nil, fmt.Errorf("read config: %w", err) + } + var cfg dbConfig + if err := json.Unmarshal(data, &cfg); err != nil { + return nil, fmt.Errorf("parse config: %w", err) + } + dsn := fmt.Sprintf( + "host='%s' port='%d' user='%s' password='%s' dbname='%s' sslmode=disable", + cfg.Database.Host, cfg.Database.Port, + cfg.Database.User, cfg.Database.Password, + cfg.Database.Database, + ) + db, err := sqlx.Open("postgres", dsn) + if err != nil { + return nil, fmt.Errorf("open db: %w", err) + } + if err := db.Ping(); err != nil { + return nil, fmt.Errorf("ping db: %w", err) + } + return db, nil +} + +// generateToken returns a 32-byte cryptographically random hex token. 
+func generateToken() (string, error) { + b := make([]byte, 32) + if _, err := rand.Read(b); err != nil { + return "", err + } + return hex.EncodeToString(b), nil +} + +// --- import --- + +func runImport(args []string) error { + fs := flag.NewFlagSet("import", flag.ExitOnError) + configPath := fs.String("config", "config.json", "Path to config.json") + charID := fs.Uint("char-id", 0, "Destination character ID") + filePath := fs.String("file", "", "Path to export JSON file (required)") + _ = fs.Parse(args) + + if *charID == 0 { + return errors.New("--char-id is required") + } + if *filePath == "" { + return errors.New("--file is required") + } + + db, err := openDB(*configPath) + if err != nil { + return err + } + defer func() { _ = db.Close() }() + + // Read and parse the export JSON. + raw, err := os.ReadFile(*filePath) + if err != nil { + return fmt.Errorf("read file: %w", err) + } + var export struct { + Character map[string]interface{} `json:"character"` + } + if err := json.Unmarshal(raw, &export); err != nil { + return fmt.Errorf("parse export JSON: %w", err) + } + if export.Character == nil { + return errors.New("export JSON has no 'character' key") + } + + blobs, err := extractAllBlobs(export.Character) + if err != nil { + return fmt.Errorf("extract blobs: %w", err) + } + + // Compute savedata hash. 
+ var savedataHash []byte + if len(blobs["savedata"]) > 0 { + decompressed, err := nullcomp.Decompress(blobs["savedata"]) + if err != nil { + return fmt.Errorf("decompress savedata: %w", err) + } + h := sha256.Sum256(decompressed) + savedataHash = h[:] + } + + _, err = db.Exec( + `UPDATE characters SET + savedata=$1, savedata_hash=$2, decomyset=$3, hunternavi=$4, + otomoairou=$5, partner=$6, platebox=$7, platedata=$8, + platemyset=$9, rengokudata=$10, savemercenary=$11, gacha_items=$12, + house_info=$13, login_boost=$14, skin_hist=$15, scenariodata=$16, + savefavoritequest=$17, mezfes=$18, + savedata_import_token=NULL, savedata_import_token_expiry=NULL + WHERE id=$19`, + blobs["savedata"], savedataHash, blobs["decomyset"], blobs["hunternavi"], + blobs["otomoairou"], blobs["partner"], blobs["platebox"], blobs["platedata"], + blobs["platemyset"], blobs["rengokudata"], blobs["savemercenary"], blobs["gacha_items"], + blobs["house_info"], blobs["login_boost"], blobs["skin_hist"], blobs["scenariodata"], + blobs["savefavoritequest"], blobs["mezfes"], + *charID, + ) + if err != nil { + return fmt.Errorf("update characters: %w", err) + } + fmt.Printf("Save data imported into character %d\n", *charID) + return nil +} + +// --- export --- + +func runExport(args []string) error { + fs := flag.NewFlagSet("export", flag.ExitOnError) + configPath := fs.String("config", "config.json", "Path to config.json") + charID := fs.Uint("char-id", 0, "Character ID to export") + outputPath := fs.String("output", "", "Output file (default: stdout)") + _ = fs.Parse(args) + + if *charID == 0 { + return errors.New("--char-id is required") + } + + db, err := openDB(*configPath) + if err != nil { + return err + } + defer func() { _ = db.Close() }() + + row := db.QueryRowx("SELECT * FROM characters WHERE id=$1", *charID) + result := make(map[string]interface{}) + if err := row.MapScan(result); err != nil { + return fmt.Errorf("query character: %w", err) + } + + export := 
map[string]interface{}{"character": result} + enc := json.NewEncoder(os.Stdout) + if *outputPath != "" { + f, err := os.Create(*outputPath) + if err != nil { + return fmt.Errorf("create output file: %w", err) + } + defer func() { _ = f.Close() }() + enc = json.NewEncoder(f) + } + enc.SetIndent("", " ") + if err := enc.Encode(export); err != nil { + return fmt.Errorf("encode JSON: %w", err) + } + if *outputPath != "" { + fmt.Printf("Character %d exported to %s\n", *charID, *outputPath) + } + return nil +} + +// --- grant-import --- + +func runGrantImport(args []string) error { + fs := flag.NewFlagSet("grant-import", flag.ExitOnError) + configPath := fs.String("config", "config.json", "Path to config.json") + charID := fs.Uint("char-id", 0, "Character ID to grant import permission for") + ttl := fs.Duration("ttl", 24*time.Hour, "Token validity duration (e.g. 24h, 48h)") + _ = fs.Parse(args) + + if *charID == 0 { + return errors.New("--char-id is required") + } + + db, err := openDB(*configPath) + if err != nil { + return err + } + defer func() { _ = db.Close() }() + + token, err := generateToken() + if err != nil { + return fmt.Errorf("generate token: %w", err) + } + expiry := time.Now().Add(*ttl) + + res, err := db.Exec( + `UPDATE characters SET savedata_import_token=$1, savedata_import_token_expiry=$2 WHERE id=$3`, + token, expiry, *charID, + ) + if err != nil { + return fmt.Errorf("update characters: %w", err) + } + n, _ := res.RowsAffected() + if n == 0 { + return fmt.Errorf("character %d not found", *charID) + } + + fmt.Printf("Import token for character %d (expires %s):\n%s\n", + *charID, expiry.Format(time.RFC3339), token) + return nil +} + +// --- revoke-import --- + +func runRevokeImport(args []string) error { + fs := flag.NewFlagSet("revoke-import", flag.ExitOnError) + configPath := fs.String("config", "config.json", "Path to config.json") + charID := fs.Uint("char-id", 0, "Character ID to revoke import permission for") + _ = fs.Parse(args) + + if *charID 
== 0 { + return errors.New("--char-id is required") + } + + db, err := openDB(*configPath) + if err != nil { + return err + } + defer func() { _ = db.Close() }() + + _, err = db.Exec( + `UPDATE characters SET savedata_import_token=NULL, savedata_import_token_expiry=NULL WHERE id=$1`, + *charID, + ) + if err != nil { + return fmt.Errorf("update characters: %w", err) + } + fmt.Printf("Import token revoked for character %d\n", *charID) + return nil +} + +// blobColumns is the ordered list of transferable save blob column names. +var blobColumns = []string{ + "savedata", "decomyset", "hunternavi", "otomoairou", "partner", + "platebox", "platedata", "platemyset", "rengokudata", "savemercenary", + "gacha_items", "house_info", "login_boost", "skin_hist", "scenariodata", + "savefavoritequest", "mezfes", +} + +// extractAllBlobs decodes all save blob columns from a character export map. +func extractAllBlobs(m map[string]interface{}) (map[string][]byte, error) { + out := make(map[string][]byte, len(blobColumns)) + for _, col := range blobColumns { + v, ok := m[col] + if !ok || v == nil { + out[col] = nil + continue + } + s, ok := v.(string) + if !ok { + return nil, fmt.Errorf("column %q: expected string, got %T", col, v) + } + b, err := base64.StdEncoding.DecodeString(s) + if err != nil { + return nil, fmt.Errorf("column %q: base64: %w", col, err) + } + out[col] = b + } + return out, nil +} diff --git a/common/decryption/ecd.go b/common/decryption/ecd.go new file mode 100644 index 000000000..6acf8b087 --- /dev/null +++ b/common/decryption/ecd.go @@ -0,0 +1,162 @@ +package decryption + +/* + ECD encryption/decryption ported from: + - ReFrontier (C#): https://github.com/Chakratos/ReFrontier (LibReFrontier/Crypto.cs) + - FrontierTextHandler (Python): src/crypto.py + + ECD is a stream cipher used to protect MHF game data files. All known + MHF files use key index 4 (DefaultECDKey). 
The cipher uses a 32-bit LCG
+ for key-stream generation with a Feistel-like nibble transformation and
+ plaintext-feedback chaining (each ciphertext byte is XORed with the
+ previous plaintext byte; the first feedback byte comes from the key stream).
+*/
+
+import (
+	"encoding/binary"
+	"errors"
+	"fmt"
+	"hash/crc32"
+)
+
+// ECDMagic is the ECD container magic ("ecd\x1a"), stored little-endian on disk.
+// On-disk bytes: 65 63 64 1A; decoded as LE uint32: 0x1A646365.
+const ECDMagic = uint32(0x1A646365)
+
+// DefaultECDKey is the LCG key index used by all known MHF game files.
+const DefaultECDKey = 4
+
+const ecdHeaderSize = 16
+
+// rndBufECD holds the 6 LCG key-parameter sets. Each entry is an 8-byte pair
+// of (multiplier, increment) stored big-endian, indexed by the key field in
+// the ECD header.
+var rndBufECD = [...]byte{
+	0x4A, 0x4B, 0x52, 0x2E, 0x00, 0x00, 0x00, 0x01, // key 0
+	0x00, 0x01, 0x0D, 0xCD, 0x00, 0x00, 0x00, 0x01, // key 1
+	0x00, 0x01, 0x0D, 0xCD, 0x00, 0x00, 0x00, 0x01, // key 2
+	0x00, 0x01, 0x0D, 0xCD, 0x00, 0x00, 0x00, 0x01, // key 3
+	0x00, 0x19, 0x66, 0x0D, 0x00, 0x00, 0x00, 0x03, // key 4 (default; all MHF files)
+	0x7D, 0x2B, 0x89, 0xDD, 0x00, 0x00, 0x00, 0x01, // key 5
+}
+
+const numECDKeys = len(rndBufECD) / 8
+
+// getRndECD advances the LCG by one step using the selected key's parameters
+// and returns the new 32-bit state.
+func getRndECD(key int, rnd uint32) uint32 {
+	offset := key * 8
+	multiplier := binary.BigEndian.Uint32(rndBufECD[offset:])
+	increment := binary.BigEndian.Uint32(rndBufECD[offset+4:])
+	return rnd*multiplier + increment
+}
+
+// DecodeECD decrypts an ECD-encrypted buffer and returns the plaintext payload.
+// The 16-byte ECD header is consumed; only the decrypted payload is returned.
+//
+// The cipher uses the CRC32 stored in the header to seed the LCG key stream.
+// No post-decryption CRC check is performed (matching reference implementations).
+func DecodeECD(data []byte) ([]byte, error) { + if len(data) < ecdHeaderSize { + return nil, errors.New("ecd: buffer too small for header") + } + if binary.LittleEndian.Uint32(data[:4]) != ECDMagic { + return nil, errors.New("ecd: invalid magic") + } + + key := int(binary.LittleEndian.Uint16(data[4:6])) + if key >= numECDKeys { + return nil, fmt.Errorf("ecd: invalid key index %d", key) + } + + payloadSize := int(binary.LittleEndian.Uint32(data[8:12])) + if len(data) < ecdHeaderSize+payloadSize { + return nil, fmt.Errorf("ecd: declared payload size %d exceeds buffer (%d bytes available)", + payloadSize, len(data)-ecdHeaderSize) + } + + // Seed LCG: rotate the stored CRC32 by 16 bits and set LSB to 1. + storedCRC := binary.LittleEndian.Uint32(data[12:16]) + rnd := (storedCRC<<16 | storedCRC>>16) | 1 + + // Initial LCG step establishes the cipher-feedback byte r8. + rnd = getRndECD(key, rnd) + r8 := byte(rnd) + + out := make([]byte, payloadSize) + for i := 0; i < payloadSize; i++ { + rnd = getRndECD(key, rnd) + xorpad := rnd + + // Nibble-feedback decryption: XOR with previous decrypted byte, then + // apply 8 rounds of Feistel-like nibble mixing using the key stream. + r11 := uint32(data[ecdHeaderSize+i]) ^ uint32(r8) + r12 := (r11 >> 4) & 0xFF + + for j := 0; j < 8; j++ { + r10 := xorpad ^ r11 + r11 = r12 + r12 = (r12 ^ r10) & 0xFF + xorpad >>= 4 + } + + r8 = byte((r12 & 0xF) | ((r11 & 0xF) << 4)) + out[i] = r8 + } + + return out, nil +} + +// EncodeECD encrypts plaintext using the ECD cipher and returns the complete +// ECD container (16-byte header + encrypted payload). Use DefaultECDKey (4) +// for all MHF-compatible output. 
+func EncodeECD(data []byte, key int) ([]byte, error) { + if key < 0 || key >= numECDKeys { + return nil, fmt.Errorf("ecd: invalid key index %d", key) + } + + payloadSize := len(data) + checksum := crc32.ChecksumIEEE(data) + + out := make([]byte, ecdHeaderSize+payloadSize) + binary.LittleEndian.PutUint32(out[0:], ECDMagic) + binary.LittleEndian.PutUint16(out[4:], uint16(key)) + // out[6:8] = 0 (reserved padding) + binary.LittleEndian.PutUint32(out[8:], uint32(payloadSize)) + binary.LittleEndian.PutUint32(out[12:], checksum) + + // Seed LCG identically to decryption so the streams stay in sync. + rnd := (checksum<<16 | checksum>>16) | 1 + rnd = getRndECD(key, rnd) + r8 := byte(rnd) + + for i := 0; i < payloadSize; i++ { + rnd = getRndECD(key, rnd) + xorpad := rnd + + // Inverse Feistel: compute the nibble-mixed values using a zeroed + // initial state, then XOR the plaintext nibbles through. + r11 := uint32(0) + r12 := uint32(0) + + for j := 0; j < 8; j++ { + r10 := xorpad ^ r11 + r11 = r12 + r12 = (r12 ^ r10) & 0xFF + xorpad >>= 4 + } + + b := data[i] + dig2 := uint32(b) + dig1 := (dig2 >> 4) & 0xFF + dig1 ^= r11 + dig2 ^= r12 + dig1 ^= dig2 + + rr := byte((dig2 & 0xF) | ((dig1 & 0xF) << 4)) + rr ^= r8 + out[ecdHeaderSize+i] = rr + r8 = b // Cipher-feedback: next iteration uses current plaintext byte. + } + + return out, nil +} diff --git a/common/decryption/ecd_test.go b/common/decryption/ecd_test.go new file mode 100644 index 000000000..82da56102 --- /dev/null +++ b/common/decryption/ecd_test.go @@ -0,0 +1,140 @@ +package decryption + +import ( + "bytes" + "testing" +) + +// TestEncodeDecodeECD_RoundTrip verifies that encoding then decoding returns +// the original plaintext for various payloads and key indices. 
+func TestEncodeDecodeECD_RoundTrip(t *testing.T) { + cases := []struct { + name string + payload []byte + key int + }{ + {"empty", []byte{}, DefaultECDKey}, + {"single_byte", []byte{0x42}, DefaultECDKey}, + {"all_zeros", make([]byte, 64), DefaultECDKey}, + {"all_ones", bytes.Repeat([]byte{0xFF}, 64), DefaultECDKey}, + {"sequential", func() []byte { + b := make([]byte, 256) + for i := range b { + b[i] = byte(i) + } + return b + }(), DefaultECDKey}, + {"key0", []byte("hello world"), 0}, + {"key1", []byte("hello world"), 1}, + {"key5", []byte("hello world"), 5}, + {"large", bytes.Repeat([]byte("MHFrontier"), 1000), DefaultECDKey}, + } + + for _, tc := range cases { + t.Run(tc.name, func(t *testing.T) { + enc, err := EncodeECD(tc.payload, tc.key) + if err != nil { + t.Fatalf("EncodeECD: %v", err) + } + + // Encoded output must start with ECD magic. + if len(enc) < 4 { + t.Fatalf("encoded output too short: %d bytes", len(enc)) + } + + dec, err := DecodeECD(enc) + if err != nil { + t.Fatalf("DecodeECD: %v", err) + } + + if !bytes.Equal(dec, tc.payload) { + t.Errorf("round-trip mismatch:\n got %x\n want %x", dec, tc.payload) + } + }) + } +} + +// TestDecodeECD_Errors verifies that invalid inputs are rejected with errors. 
+func TestDecodeECD_Errors(t *testing.T) { + cases := []struct { + name string + data []byte + wantErr string + }{ + { + name: "too_small", + data: []byte{0x65, 0x63, 0x64}, + wantErr: "too small", + }, + { + name: "bad_magic", + data: func() []byte { + b := make([]byte, 16) + b[0] = 0xDE + return b + }(), + wantErr: "invalid magic", + }, + { + name: "invalid_key", + data: func() []byte { + b := make([]byte, 16) + // ECD magic + b[0], b[1], b[2], b[3] = 0x65, 0x63, 0x64, 0x1A + // key index = 99 (out of range) + b[4] = 99 + return b + }(), + wantErr: "invalid key", + }, + { + name: "payload_exceeds_buffer", + data: func() []byte { + b := make([]byte, 16) + b[0], b[1], b[2], b[3] = 0x65, 0x63, 0x64, 0x1A + // key 4 + b[4] = DefaultECDKey + // declare payload size larger than the buffer + b[8], b[9], b[10], b[11] = 0xFF, 0xFF, 0xFF, 0x00 + return b + }(), + wantErr: "exceeds buffer", + }, + } + + for _, tc := range cases { + t.Run(tc.name, func(t *testing.T) { + _, err := DecodeECD(tc.data) + if err == nil { + t.Fatal("expected error, got nil") + } + if !bytes.Contains([]byte(err.Error()), []byte(tc.wantErr)) { + t.Errorf("error %q does not contain %q", err.Error(), tc.wantErr) + } + }) + } +} + +// TestEncodeECD_InvalidKey verifies that an out-of-range key is rejected. +func TestEncodeECD_InvalidKey(t *testing.T) { + _, err := EncodeECD([]byte("test"), 99) + if err == nil { + t.Fatal("expected error for invalid key, got nil") + } +} + +// TestDecodeECD_EmptyPayload verifies that a valid header with zero payload +// decodes to an empty slice without error. 
+func TestDecodeECD_EmptyPayload(t *testing.T) { + enc, err := EncodeECD([]byte{}, DefaultECDKey) + if err != nil { + t.Fatalf("EncodeECD: %v", err) + } + dec, err := DecodeECD(enc) + if err != nil { + t.Fatalf("DecodeECD: %v", err) + } + if len(dec) != 0 { + t.Errorf("expected empty payload, got %d bytes", len(dec)) + } +} diff --git a/common/decryption/jpk.go b/common/decryption/jpk.go index b4a60a4c9..6476f9355 100644 --- a/common/decryption/jpk.go +++ b/common/decryption/jpk.go @@ -54,7 +54,7 @@ func ProcessDecode(data *byteframe.ByteFrame, outBuffer []byte) { func (s *jpkState) processDecode(data *byteframe.ByteFrame, outBuffer []byte) { outIndex := 0 - for int(data.Index()) < len(data.Data()) && outIndex < len(outBuffer)-1 { + for int(data.Index()) < len(data.Data()) && outIndex < len(outBuffer) { if s.bitShift(data) == 0 { outBuffer[outIndex] = ReadByte(data) outIndex++ diff --git a/common/decryption/jpk_compress.go b/common/decryption/jpk_compress.go new file mode 100644 index 000000000..d193c2a7d --- /dev/null +++ b/common/decryption/jpk_compress.go @@ -0,0 +1,169 @@ +package decryption + +import "encoding/binary" + +// PackSimple compresses data using JPK type-3 (LZ77) compression and wraps it +// in a JKR header. It is the inverse of UnpackSimple. +func PackSimple(data []byte) []byte { + compressed := lzEncode(data) + + out := make([]byte, 16+len(compressed)) + binary.LittleEndian.PutUint32(out[0:4], 0x1A524B4A) // JKR magic + binary.LittleEndian.PutUint16(out[4:6], 0x0108) // version + binary.LittleEndian.PutUint16(out[6:8], 0x0003) // type 3 = LZ only + binary.LittleEndian.PutUint32(out[8:12], 0x00000010) // data offset = 16 (after header) + binary.LittleEndian.PutUint32(out[12:16], uint32(len(data))) + copy(out[16:], compressed) + return out +} + +// lzEncoder holds mutable state for the LZ77 compression loop. +// Ported from ReFrontier JPKEncodeLz.cs. 
+// +// The format groups 8 items behind a flag byte (MSB = item 0): +// +// bit=0 → literal byte follows +// bit=1 → back-reference follows (with sub-cases below) +// +// Back-reference sub-cases: +// +// 10xx + 1 byte → length 3–6, offset ≤ 255 +// 11 + 2 bytes → length 3–9, offset ≤ 8191 (length encoded in hi byte bits 7–5) +// 11 + 2 bytes + 0 + 4 bits → length 10–25, offset ≤ 8191 +// 11 + 2 bytes + 1 + 1 byte → length 26–280, offset ≤ 8191 +type lzEncoder struct { + flag byte + shiftIndex int + toWrite [1024]byte // data bytes for the current flag group + indexToWrite int + out []byte +} + +func (e *lzEncoder) setFlag(value bool) { + if e.shiftIndex <= 0 { + e.flushFlag(false) + e.shiftIndex = 7 + } else { + e.shiftIndex-- + } + if value { + e.flag |= 1 << uint(e.shiftIndex) + } +} + +// setFlagsReverse writes `count` bits of value MSB-first. +func (e *lzEncoder) setFlagsReverse(value byte, count int) { + for i := count - 1; i >= 0; i-- { + e.setFlag(((value >> uint(i)) & 1) == 1) + } +} + +func (e *lzEncoder) writeByte(b byte) { + e.toWrite[e.indexToWrite] = b + e.indexToWrite++ +} + +func (e *lzEncoder) flushFlag(final bool) { + if !final || e.indexToWrite > 0 { + e.out = append(e.out, e.flag) + } + e.flag = 0 + e.out = append(e.out, e.toWrite[:e.indexToWrite]...) + e.indexToWrite = 0 +} + +// lzEncode compresses data with the JPK LZ77 algorithm, producing the raw +// compressed bytes (without the JKR header). 
+func lzEncode(data []byte) []byte { + const ( + compressionLevel = 280 // max match length + maxIndexDist = 0x300 // max look-back distance (768) + ) + + enc := &lzEncoder{shiftIndex: 8} + + for pos := 0; pos < len(data); { + repLen, repOff := lzLongestRepetition(data, pos, compressionLevel, maxIndexDist) + + if repLen == 0 { + // Literal byte + enc.setFlag(false) + enc.writeByte(data[pos]) + pos++ + } else { + enc.setFlag(true) + if repLen <= 6 && repOff <= 0xff { + // Short: flag=10, 2-bit length, 1-byte offset + enc.setFlag(false) + enc.setFlagsReverse(byte(repLen-3), 2) + enc.writeByte(byte(repOff)) + } else { + // Long: flag=11, 2-byte offset/length header + enc.setFlag(true) + u16 := uint16(repOff) + if repLen <= 9 { + // Length fits in hi byte bits 7-5 + u16 |= uint16(repLen-2) << 13 + } + enc.writeByte(byte(u16 >> 8)) + enc.writeByte(byte(u16 & 0xff)) + if repLen > 9 { + if repLen <= 25 { + // Extended: flag=0, 4-bit length + enc.setFlag(false) + enc.setFlagsReverse(byte(repLen-10), 4) + } else { + // Extended: flag=1, 1-byte length + enc.setFlag(true) + enc.writeByte(byte(repLen - 0x1a)) + } + } + } + pos += repLen + } + } + + enc.flushFlag(true) + return enc.out +} + +// lzLongestRepetition finds the longest match for data[pos:] in the look-back +// window. Returns (matchLen, encodedOffset) where encodedOffset is +// (pos - matchStart - 1). Returns (0, 0) when no usable match exists. 
+func lzLongestRepetition(data []byte, pos, compressionLevel, maxIndexDist int) (int, uint) { + const minLength = 3 + + // Clamp threshold to available bytes + threshold := compressionLevel + if remaining := len(data) - pos; remaining < threshold { + threshold = remaining + } + + if pos == 0 || threshold < minLength { + return 0, 0 + } + + windowStart := pos - maxIndexDist + if windowStart < 0 { + windowStart = 0 + } + + maxLen := 0 + var bestOffset uint + + for left := windowStart; left < pos; left++ { + curLen := 0 + for curLen < threshold && data[left+curLen] == data[pos+curLen] { + curLen++ + } + if curLen >= minLength && curLen > maxLen { + maxLen = curLen + bestOffset = uint(pos - left - 1) + if maxLen >= threshold { + break + } + } + } + + return maxLen, bestOffset +} diff --git a/common/decryption/jpk_compress_test.go b/common/decryption/jpk_compress_test.go new file mode 100644 index 000000000..a175ee31e --- /dev/null +++ b/common/decryption/jpk_compress_test.go @@ -0,0 +1,78 @@ +package decryption + +import ( + "bytes" + "encoding/binary" + "testing" +) + +func TestPackSimpleRoundTrip(t *testing.T) { + tests := []struct { + name string + data []byte + }{ + {"single byte", []byte{0x42}}, + {"ascii text", []byte("hello world")}, + {"repeated pattern", bytes.Repeat([]byte{0xAB, 0xCD}, 100)}, + {"all zeros", make([]byte, 256)}, + {"all 0xFF", bytes.Repeat([]byte{0xFF}, 128)}, + {"sequential bytes", func() []byte { + b := make([]byte, 256) + for i := range b { + b[i] = byte(i) + } + return b + }()}, + {"long repeating run", bytes.Repeat([]byte("ABCDEFGH"), 50)}, + {"mixed", []byte{0x00, 0x01, 0x02, 0xFF, 0xFE, 0xFD, 0x80, 0x81}}, + } + + for _, tc := range tests { + t.Run(tc.name, func(t *testing.T) { + compressed := PackSimple(tc.data) + got := UnpackSimple(compressed) + if !bytes.Equal(got, tc.data) { + t.Errorf("round-trip mismatch\n want len=%d\n got len=%d", len(tc.data), len(got)) + } + }) + } +} + +func TestPackSimpleHeader(t *testing.T) { + data := 
[]byte("test data") + compressed := PackSimple(data) + + if len(compressed) < 16 { + t.Fatalf("output too short: %d bytes", len(compressed)) + } + + magic := binary.LittleEndian.Uint32(compressed[0:4]) + if magic != 0x1A524B4A { + t.Errorf("wrong magic: got 0x%08X, want 0x1A524B4A", magic) + } + + jpkType := binary.LittleEndian.Uint16(compressed[6:8]) + if jpkType != 3 { + t.Errorf("wrong type: got %d, want 3", jpkType) + } + + decompSize := binary.LittleEndian.Uint32(compressed[12:16]) + if decompSize != uint32(len(data)) { + t.Errorf("wrong decompressed size: got %d, want %d", decompSize, len(data)) + } +} + +func TestPackSimpleLargeRepeating(t *testing.T) { + // 4 KB of repeating pattern — should compress well + data := bytes.Repeat([]byte{0xAA, 0xBB, 0xCC, 0xDD}, 1024) + compressed := PackSimple(data) + + if len(compressed) >= len(data) { + t.Logf("note: compressed (%d) not smaller than original (%d)", len(compressed), len(data)) + } + + got := UnpackSimple(compressed) + if !bytes.Equal(got, data) { + t.Errorf("round-trip failed for large repeating data") + } +} diff --git a/config/config.go b/config/config.go index b731b45b0..a006614bb 100644 --- a/config/config.go +++ b/config/config.go @@ -67,7 +67,8 @@ type Config struct { Host string `mapstructure:"Host"` BinPath string `mapstructure:"BinPath"` Language string - DisableSoftCrash bool // Disables the 'Press Return to exit' dialog allowing scripts to reboot the server automatically + DisableSoftCrash bool // Disables the 'Press Return to exit' dialog allowing scripts to reboot the server automatically + ShutdownCountdownSeconds int // Seconds to count down before shutting down (default 10; ignored when DisableSoftCrash is true) HideLoginNotice bool // Hide the Erupe notice on login LoginNotices []string // MHFML string of the login notices displayed PatchServerManifest string // Manifest patch server override @@ -354,6 +355,7 @@ func registerDefaults() { viper.SetDefault("CommandPrefix", "!") 
viper.SetDefault("AutoCreateAccount", true) viper.SetDefault("LoopDelay", 50) + viper.SetDefault("ShutdownCountdownSeconds", 10) viper.SetDefault("DefaultCourses", []uint16{1, 23, 24}) viper.SetDefault("EarthMonsters", []int32{0, 0, 0, 0}) diff --git a/docs/conquest-war.md b/docs/conquest-war.md new file mode 100644 index 000000000..0881802a9 --- /dev/null +++ b/docs/conquest-war.md @@ -0,0 +1,514 @@ +# Conquest War (討伐征戦 / Seibatsu) + +Tracks what is known about the Conquest War event system and what remains to be +reverse-engineered before it can be fully implemented in Erupe. + +The `feature/conquest` branch (origin) attempted a partial implementation but drifted too far +from `develop` without completing the core gameplay loop and is not mergeable in its current +state. Its findings are incorporated below. + +--- + +## Game Context + +**Conquest War** (討伐征戦, also called *Seibatsu*) is a weekly rotating time-limited event +introduced in the G2 update (July 2013). Players hunt legendary monsters and race to level them +up on a per-player leaderboard. + +The event follows a **three-week, three-phase cycle** tracked server-side as the "Earth" system: + +| Week | Phase | Japanese | Description | +|------|-------|----------|-------------| +| 1 | **Conquest (Seibatsu)** | 討伐征戦 | Hunting phase — players level their monsters | +| 2 | **Pallone Festival** | パローネ祭典 | Side festival event concurrent with conquest rewards | +| 3 | **Tower (Dure)** | 塔 | Tower climbing event for additional rewards | + +### Conquest Mechanics + +- Each player has their own independent monster (not shared with others). +- Players hunt their own monster or join quests at the same or higher level. +- A monster starts at level 1 and caps at **9999**. +- Level gain per hunt: **+5** (no faints), **+3** (one faint), **+1** (multiple faints). +- As the monster levels, its stats scale up, making each subsequent hunt harder. 
+- At week end, rewards are distributed based on the player's rank on the per-monster + leaderboard. + +### Target Monsters (configurable) + +The live service used **Shantien**, **Disufiroa**, **G-Rank Black Fatalis**, and +**G-Rank Crimson Fatalis**. The branch defaults to monster IDs `[116, 107, 2, 36]` +(Deviljho, Rajang, Rathalos, Gore Magala — suitable for G8 and below, where the original +four are not available). + +For clients at `RealClientMode <= G8`, only the first 3 monsters are exposed; G9+ exposes 4. + +### Reward Distribution Types + +The `DistributionType` field in reward packets uses these sentinel values: + +| Value | Meaning | +|-------|---------| +| `7201` | Item reward (ItemID + quantity) | +| `7202` | N-Points (currency) | +| `7203` | Guild contribution points | + +--- + +## Packet Overview + +Thirteen packets implement the Conquest/Earth system. All live in `network/mhfpacket/`. +None have `Build()` implemented (all return `NOT IMPLEMENTED`) — responses are built +directly in handler code using `byteframe`. + +### `MsgMhfGetEarthStatus` — Client → Server → Client + +Fetches the current Earth event windows and which monsters are active. + +**Request** (`msg_mhf_get_earth_status.go`): +``` +AckHandle uint32 +Unk0 uint32 — unknown; never used by handler +Unk1 uint32 — unknown; never used by handler +``` + +**Response** (built in `handlers_earth.go → handleMsgMhfGetEarthStatus`): +``` +for each active earth event (up to 3: Conquest, Pallone, Tower): + [uint32] StartTime — Unix timestamp + [uint32] EndTime — Unix timestamp + [int32] StatusID — 1 or 2 (Conquest); 11 (Pallone active) or 12 (Pallone reward); 21 (Tower) + [int32] EarthID — unique event ID from DB row + [int32] MonsterID × N — active conquest monsters (3 for G8, 4 for G9+) +``` + +**Status ID semantics**: the difference between `1` and `2` for the Conquest phase is not +known. The branch selects `1` when the hunt week is active and `2` otherwise, but this is +a guess. 
+
+**Current state**: Implemented. Event windows are generated from a single `events` table row
+(`event_type = 'earth'`). A 21-day rolling cycle is computed from that row's `start_time`.
+Debug mode (`EarthDebug = true`) collapses the windows to week boundaries for faster testing.
+
+---
+
+### `MsgMhfGetEarthValue` — Client → Server → Client
+
+Fetches numeric values associated with the current Earth event (kill counts, floor tallies,
+special flags).
+
+**Request** (`msg_mhf_get_earth_value.go`):
+```
+AckHandle uint32
+Unk0      uint32 — unknown
+Unk1      uint32 — unknown
+ReqType   uint32 — 1, 2, or 3 (see below)
+Unk3–Unk6 uint32 — unknown; never used by handler
+```
+
+**Response**: a variable-length array of 6-uint32 entries, wrapped in `doAckEarthSucceed`.
+Each entry: `[ID, Value, Unk, Unk, Unk, Unk]`. The last four fields are zero in almost all
+known captures; the `{9002, 10, 300}` entry below is the one known exception, carrying a
+value in its third field.
+
+| ReqType | Known entries | Notes |
+|---------|--------------|-------|
+| 1 | `{1, 100}`, `{2, 100}` | Block + DureSlays count — exact meaning unclear |
+| 2 | `{1, 5771}`, `{2, 1847}` | Block + Floors? — "Floors?" is a guess |
+| 3 | `{1001, 36}` getTouhaHistory; `{9001, 3}` getKohouhinDropStopFlag; `{9002, 10, 300}` getKohouhinForceValue | `ttcSetDisableFlag` relationship unknown |
+
+**Current state**: Implemented with hardcoded values. No database persistence.
+
+---
+
+### `MsgMhfReadBeatLevel` — Client → Server → Client
+
+Reads the player's current conquest beat levels (monster progress values) from the server.
+
+**Request** (`msg_mhf_read_beat_level.go`):
+```
+AckHandle    uint32
+Unk0         uint32 — always 1 in the JP client (hardcoded literal)
+ValidIDCount uint32 — always 4 in the JP client
+IDs          [16]uint32 — always [0x74, 0x6B, 0x02, 0x24, 0, 0, ...] (hardcoded)
+```
+
+**Response**: `ValidIDCount` entries of `[ID uint32, Value uint32, 0 uint32, 0 uint32]`.
+Default value if no DB data: `{0,1, 0,1, 0,1, 0,1}` (level 1 for each slot).
+
+**Current state**: Fully implemented.
Beat levels are read from `characters.conquest_data` +(16-byte BYTEA). Defaults to level 1 if the column is NULL. + +--- + +### `MsgMhfUpdateBeatLevel` — Client → Server → Client + +Saves the player's updated conquest beat levels after a quest. + +**Request** (`msg_mhf_update_beat_level.go`): +``` +AckHandle uint32 +Unk1 uint32 — unknown +Unk2 uint32 — unknown +Data1 [16]int32 — unknown purpose; entirely discarded by the handler +Data2 [16]int32 — beat level data; only first 4 values are stored +``` + +**Response**: `{0x00, 0x00, 0x00, 0x00}`. + +**Current state**: Implemented, but incomplete. Only `Data2[0..3]` is written to the DB. +`Data1` and `Data2[4..15]` are silently ignored. `Unk1`/`Unk2` purposes are unknown. + +--- + +### `MsgMhfReadBeatLevelAllRanking` — Client → Server → Client + +Fetches the global leaderboard for a given monster. + +**Request** (`msg_mhf_read_beat_level_all_ranking.go`): +``` +AckHandle uint32 +Unk0 uint32 +MonsterID int32 — which monster's ranking to fetch +Unk2 int32 — unknown +``` + +**Response structure** (from known captures): +``` +[uint32] Unk +[int32] Unk +[int32] Unk +for each of 100 entries: + [uint32] Rank + [uint32] Level + [32 bytes] HunterName (null-padded) +``` + +**Current state**: Stubbed. Returns 100 zero-filled entries. No database ranking data exists. + +--- + +### `MsgMhfReadBeatLevelMyRanking` — Client → Server → Client + +Fetches the player's own rank on the conquest leaderboard. + +**Request** (`msg_mhf_read_beat_level_my_ranking.go`): +``` +AckHandle uint32 +Unk0 uint32 +Unk1 uint32 +Unk2 [16]int32 — unknown; possibly the same ID array as ReadBeatLevel +``` + +**Current state**: Stubbed. Returns an empty buffer. Response format unknown. + +--- + +### `MsgMhfReadLastWeekBeatRanking` — Client → Server → Client + +Purpose is partially understood: the handler comment says "controls the monster headings for +the other menus". Likely provides context for which monster's data to display. 
+ +**Request** (`msg_mhf_read_last_week_beat_ranking.go`): +``` +AckHandle uint32 +Unk0 uint32 +EarthMonster int32 +``` + +**Response** (current stub): `[EarthMonster uint32, 0, 0, 0]`. Actual format unknown. + +**Current state**: Minimal stub. Response structure not reverse-engineered. + +--- + +### `MsgMhfGetBreakSeibatuLevelReward` — Client → Server → Client + +Returns per-monster level-break milestone rewards (items granted at specific level thresholds). + +**Request** (`msg_mhf_get_break_seibatu_level_reward.go`): +``` +AckHandle uint32 +Unk0 uint32 — unknown; debug-printed but never used +EarthMonster int32 +``` + +**Response**: variable-length array of reward entries via `doAckEarthSucceed`: +``` +[int32] ItemID +[int32] Quantity +[int32] Level — the level threshold at which this reward unlocks +[int32] Unk — always 0 in known data +``` + +**Current state**: Implemented with hardcoded per-monster reward tables. Item IDs were +derived from packet captures. No database backend. + +--- + +### `MsgMhfGetWeeklySeibatuRankingReward` — Client → Server → Client + +Returns reward tables for conquest ranking, Pallone Festival routes, and Tower floors. +The most complex handler in the branch. 
+ +**Request** (`msg_mhf_get_weekly_seibatu_ranking_reward.go`): +``` +AckHandle uint32 +Unk0 uint32 — unknown; debug-printed but never used +Operation uint32 — 1 = conquest ranking, 3 = Pallone festival, 5 = event rewards +ID uint32 — event/route ID (for Op=1: aligns with EarthStatus 1 and 2) +EarthMonster uint32 +``` + +**Response format for Operation = 1** (conquest ranking rewards): +``` +per entry: + [int32] Unk0 + [int32] ItemID + [uint32] Amount + [int32] PlaceFrom + [int32] PlaceTo +``` + +**Response format for Operations 3 and 5** (Pallone/Tower): +``` +per entry: + [int32] Index0 — floor number (Op=5) or place rank (Op=3) + [int32] Index1 + [uint32] Index2 — distribution slot (Op=5 tower dure: 1 or 2) + [int32] DistributionType — 7201/7202/7203 + [int32] ItemID + [int32] Amount +``` + +**Current state**: Implemented with hardcoded tables derived from packet captures. +- Operation 1: All four monsters return the same bracket table (ranks 1–100, 101–1000, + 1000–1001). The tables are identical for all monsters — this may be correct, or captures + were only recorded for one monster. +- Operation 3 (Pallone): 91 entries across 11 routes, all zero-filled — format is known + but content is not. +- Operation 5 (Tower): Tower dure kill rewards (260001) and 155-entry floor reward table + (260003, floors 1–1500) are hardcoded from captures. + +Note in source: "Can only have 10 in each dist" — the maximum entries per distribution slot +before the client discards them is 10. + +--- + +### `MsgMhfGetFixedSeibatuRankingTable` — Client → Server → Client + +Returns a static "fixed" leaderboard (likely a seeded/display ranking, not live player data). +The handler notes this packet is *not* triggered when `EarthStatus == 1`, suggesting it +belongs to the reward-week display rather than the hunt-week display. 
+ +**Request** (`msg_mhf_get_fixed_seibatu_ranking_table.go`): +``` +AckHandle uint32 +Unk0 uint32 — unknown +Unk1 int32 — unknown +EarthMonster int32 +Unk3 int32 — unknown +Unk4 int32 — unknown +``` + +**Response**: up to 9 entries: +``` +[int32] Rank +[int32] Level +[32 bytes] HunterName (null-padded) +``` + +**Current state**: Implemented with 9 hardcoded "Hunter N" placeholder entries per monster. +`Unk1`, `Unk3`, `Unk4` purposes unknown. + +--- + +### `MsgMhfGetSeibattle` — Client → Server → Client + +Fetches Seibattle (guild-vs-guild battle) data. The `GuildID` field suggests this is +guild-specific, but the handler ignores it entirely. + +**Request** (`msg_mhf_get_seibattle.go`): +``` +AckHandle uint32 +Unk0 uint8 +Type uint8 — 1=timetable, 3=key score, 4=career, 5=opponent, 6=convention result, + 7=char score, 8=cur result +GuildID uint32 +Unk3 uint8 — unknown +Unk4 uint16 — unknown +``` + +**Response**: varies by `Type`. Timetable (Type=1) returns 3 eight-hour battle windows +computed from midnight. All other types return zero-filled structs. + +**Current state**: Stubbed. No database queries, no guild-specific data. The seibattle +guild-vs-guild combat system is entirely unimplemented. + +--- + +### `MsgMhfPostSeibattle` — Client → Server → Client + +Submits a seibattle result. All fields are unknown. + +**Request** (`msg_mhf_post_seibattle.go`): +``` +AckHandle uint32 +Unk0 uint8 +Unk1 uint8 +Unk2 uint32 +Unk3 uint8 +Unk4 uint16 +Unk5 uint16 +Unk6 uint8 +``` + +**Current state**: Stubbed. Returns `{0,0,0,0}`. No data is read or persisted. + +--- + +### `MsgMhfGetAdditionalBeatReward` — Client → Server → Client + +Purpose unclear. The handler comment states: *"Actual responses in packet captures are all +just giant batches of null bytes. 
I'm assuming this is because it used to be tied to an +actual event that no longer triggers in the client."* + +**Request** (`msg_mhf_get_additional_beat_reward.go`): +``` +AckHandle uint32 +Unk0–Unk3 uint32 — all unknown +``` + +**Current state**: Returns 260 (`0x104`) zero bytes. Whether real responses were ever +non-zero is unknown. + +--- + +## Database Schema + +The branch adds two migrations: + +```sql +-- schemas/patch-schema/23-earth.sql +ALTER TYPE event_type ADD VALUE 'earth'; + +-- schemas/patch-schema/24-conquest.sql +ALTER TABLE public.characters ADD COLUMN IF NOT EXISTS conquest_data BYTEA; +``` + +And seeds four conquest quests (`schemas/bundled-schema/ConquestQuests.sql`): +quest IDs `54257`, `54258`, `54277`, `54370` — all `quest_type = 33`, `max_players = 0`. + +**Missing tables** required for a full implementation: + +| Table | Purpose | +|-------|---------| +| `conquest_rankings` | Per-player, per-monster beat level leaderboard | +| `conquest_reward_claims` | Track which level-break and ranking rewards have been claimed | +| `seibattle_scores` | Guild seibattle results and career records | +| `seibattle_schedules` | Persistent timetable (currently computed in memory) | + +--- + +## Configuration + +Two keys were added to `config.go` / `config.json` by the branch: + +| Key | Type | Default | Purpose | +|-----|------|---------|---------| +| `EarthDebug` | bool | `false` | Collapses event windows to week boundaries for testing | +| `EarthMonsters` | []int32 | `[116, 107, 2, 36]` | Active conquest target monster IDs | + +--- + +## What Is Already Understood + +- The three-phase Earth event cycle (Conquest → Pallone → Tower) and its 21-day rolling + window, keyed to a single `events` row. +- `GetEarthStatus` response wire format: per-phase `[Start, End, StatusID, EarthID, MonsterIDs…]`. +- `ReadBeatLevel` request is fully hardcoded by the JP client (IDs `0x74, 0x6B, 0x02, 0x24`); + no dynamic ID resolution is needed. 
+- Per-character beat level storage: 4 × int32, 16 bytes, in `characters.conquest_data`. +- Level-break reward item IDs and quantities for monsters 116, 107, 2, 36 (from captures). +- Weekly ranking reward brackets for conquest (ranks 1–100, 101–1000, 1000–1001). +- Tower floor reward table (floors 1–1500, item IDs and quantities from captures). +- Tower dure kill reward distributions (dist 1 and 2, from captures). +- `GetWeeklySeibatuRankingReward` response wire format for all three operations. +- `GetFixedSeibatuRankingTable` response wire format (rank + level + 32-byte name). +- `GetBreakSeibatuLevelReward` response wire format (ItemID + Quantity + Level + Unk). +- Distribution type sentinels: `7201` = item, `7202` = N-Points, `7203` = guild contribution. +- The 10-entry-per-distribution-slot limit in weekly seibatu ranking rewards. +- `GetSeibattle` timetable format: 3 × 8-hour windows from midnight. +- Conquest quest IDs: `54257`, `54258`, `54277`, `54370` (type 33). + +--- + +## What Needs RE Before Full Implementation + +### High Priority — blocks any functional gameplay + +| Unknown | Where to look | Notes | +|---------|---------------|-------| +| Semantics of `EarthStatus` IDs 1 vs 2 | Packet captures during hunt week vs reward week | Currently guessed; wrong selection may break phase detection | +| `MsgMhfUpdateBeatLevel.Data1[16]` | Captures with known quest outcome | Second int32 array entirely discarded; may carry the level gain delta | +| `MsgMhfUpdateBeatLevel.Unk1 / Unk2` | Same captures | May carry monster ID or quest ID needed for routing | +| `ReadBeatLevelAllRanking` response structure | Captures from an active leaderboard | Header fields (3 × uint32 before the 100 entries) unknown | +| `ReadBeatLevelMyRanking` response structure | Same | Format entirely unknown; returns empty today | +| `ReadLastWeekBeatRanking` full response | Captures after week rollover | Only monster ID echoed back today | + +### Medium Priority — required for accurate 
reward flow + +| Unknown | Where to look | Notes | +|---------|---------------|-------| +| `MsgMhfPostSeibattle` all fields (`Unk0–Unk6`) | Captures after a seibattle result | Handler does nothing today; this is the score submission path | +| `GetSeibattle` types 3–8 response formats | Captures for each `Type` value | Currently all return zero structs | +| `GetSeibattle.Unk0 / Unk3 / Unk4` | Same captures | Likely context selectors for guild/season | +| `GetEarthValue.Unk0 / Unk1 / Unk3–Unk6` | Captures across different event phases | 6 of the 8 request fields are unknown | +| `GetEarthStatus.Unk0 / Unk1` | Captures across phases | Never used by the handler; may be version or session flags | +| `GetWeeklySeibatuRankingReward` Op=3 content | Captures during Pallone Festival | 91 entries are zero-filled placeholders | +| Claim tracking semantics | Compare reward endpoint with gacha claim flow | No "claimed" flag exists anywhere in the schema | + +### Low Priority — cosmetic / completeness + +| Unknown | Where to look | Notes | +|---------|---------------|-------| +| `GetFixedSeibatuRankingTable.Unk1 / Unk3 / Unk4` | Captures | Likely unused alignment or version fields | +| `GetBreakSeibatuLevelReward.Unk0` | Captures with different monsters | Debug-printed; may be season or event ID | +| `ReadBeatLevelAllRanking.Unk0 / Unk2` | Captures | Likely pagination or season selectors | +| `GetAdditionalBeatReward` full structure | Captures if the packet is ever non-null | May be permanently dead in the last client version | +| Pallone Festival route semantics | JP wiki / community guides | 11 routes × 13 entries, all content unknown | +| Original live-service event scheduling cadence | JP wiki archives | Cycle length and reset time not publicly documented | + +--- + +## Relation to Other Systems + +**Gacha service** (`svc_gacha.go`): The reward distribution model (distribution type, +item ID, amount, rank brackets) is structurally similar to the gacha reward pipeline. 
+Conquest reward claiming can likely reuse or adapt `GachaService.ClaimRewards` and its +point transaction infrastructure. + +**Raviente siege** (`sys_channel_server.go`, `handlers_register.go`): Conquest quests may +use the same `MsgMhfRegisterEvent` / semaphore pattern for quest slot management, though +this has not been confirmed with captures. + +**Tower event** (`feature/tower` branch): The Tower phase is part of the same Earth event +cycle. The `GetWeeklySeibatuRankingReward` handler already covers Tower rewards (Op=5). +The two branches should be coordinated or merged. + +--- + +## Known Code Quality Issues in the Branch + +The following must be fixed before any part of this branch is merged: + +- `fmt.Printf` debug prints left in packet `Parse()` methods: + `msg_mhf_get_break_seibatu_level_reward.go`, `msg_mhf_get_weekly_seibatu_ranking_reward.go`, + `msg_mhf_get_fixed_seibatu_ranking_table.go`, `msg_mhf_read_last_week_beat_ranking.go` +- The `cleanupEarthStatus` function wipes `conquest_data` for all characters on event + expiry — this erases history. Completed conquest data should be archived, not deleted. +- The branch introduced a large `handlers.go` consolidation file that deleted existing test + files (`handlers_achievement_test.go`, `channel_isolation_test.go`, etc.). These must be + restored. +- DB access in `handlers_earth.go` uses raw `s.server.db` calls instead of the repo pattern. + Any merge must route these through `CharacterRepo` and `EventRepo` interfaces. +- `EarthMonsters` config currently accepts any IDs; a G9+ client will crash if fewer than 4 + monsters are configured when `RealClientMode > G8`. 
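The last issue can be caught at startup rather than at client connect. A minimal guard, sketched with hypothetical names (not code from the branch):

```go
package main

import "fmt"

// validateEarthMonsters is a hypothetical startup check for the crash noted
// above: G9+ clients read four conquest monster IDs, older clients three.
// The function name and signature are illustrative; Erupe's actual config
// plumbing differs.
func validateEarthMonsters(monsters []int32, g9OrLater bool) error {
	required := 3
	if g9OrLater {
		required = 4
	}
	if len(monsters) < required {
		return fmt.Errorf("EarthMonsters: need at least %d IDs for this client mode, got %d",
			required, len(monsters))
	}
	return nil
}

func main() {
	// A G9+ client with only three configured monsters would crash at
	// enumeration time; failing fast at boot surfaces the misconfiguration.
	if err := validateEarthMonsters([]int32{116, 107, 2}, true); err != nil {
		fmt.Println(err)
	}
}
```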
diff --git a/docs/fort-attack-event.md b/docs/fort-attack-event.md new file mode 100644 index 000000000..f7478c271 --- /dev/null +++ b/docs/fort-attack-event.md @@ -0,0 +1,201 @@ +# Fort Attack Event (迎撃拠点 / Interceptor's Base) + +Tracks what is known about the Interceptor's Base fort attack event system and what remains to be +reverse-engineered before it can be implemented in Erupe. + +The `feature/enum-event` branch (origin) attempted a partial implementation but is not mergeable in +its current state. Its useful findings are incorporated below. + +--- + +## Game Context + +The **Interceptor's Base** (迎撃拠点) is a persistent field introduced in Forward.1 (April 2011). +Guilds defend a fortress adjacent to Mezeporta against invading Elder Dragons. The fort has a +**durability meter** — if monster attacks reduce it to 0% the quest fails regardless of time or +lives remaining. Monsters known to attack include Rukodiora, Rebidiora, Teoleskatle, Yamatukami, +Shengaroren, Harudomerugu, Rusted Kushala Daora, Belkyruros, Abiologu, and Keoaruboru. + +**Keoaruboru** (古龍の侵攻 culmination, added MHF-Z Z1.1) is the hardest variant. Its limbs +accumulate heat as the fight progresses; if any limb reaches maximum heat it fires a beam at the +fort dealing 20% durability damage and resetting all heat. The fort starts at 80% integrity, meaning +four unchecked beams cause quest failure. Managing heat across limbs is the central mechanic. + +The event was scheduled by Capcom's live servers on a cycle. The exact trigger frequency is not +publicly documented in either English or Japanese sources. + +--- + +## Packet Overview + +Five packets are involved. All live in `network/mhfpacket/`. + +### `MsgMhfEnumerateEvent` (0x72) — Client → Server → Client + +The client polls this on login to learn what fort attack events are currently scheduled. + +**Request** (`msg_mhf_enumerate_event.go`): `AckHandle uint32` + two zeroed `uint16`. 
+ +**Response** built in `handleMsgMhfEnumerateEvent` (`handlers_event.go`): + +``` +[uint8] event count +for each event: + [uint16] EventType — 0 = nothing; 1 or 2 = "Ancient Dragon has attacked the fort" + [uint16] Unk1 — unknown; always 0 in known captures + [uint16] Unk2 — unknown; always 0 + [uint16] Unk3 — unknown; always 0 + [uint16] Unk4 — unknown; always 0 + [uint32] StartTime — Unix timestamp (seconds) when event begins + [uint32] EndTime — Unix timestamp when event ends + if EventType == 2: + [uint8] quest file count + [uint16] quest file ID × N +``` + +What `EventType == 1` means vs `EventType == 2` is not known. The quest file ID list only appears +when `EventType == 2`. The semantics of Unk1–Unk4 are entirely unknown. + +**Current state**: Handler returns an empty event list (0 events). The `feature/enum-event` branch +adds DB-backed scheduling with a configurable `Duration` / `RestartAfter` cycle and a hardcoded +list of 19 quest IDs, but has a logic bug (inverted `rows.Next()` check) and uses raw DB calls +instead of the repo pattern. + +--- + +### `MsgMhfRegisterEvent` — Client → Server → Client + +Sent when a player attempts to join a fort attack session. + +**Request** (`msg_mhf_register_event.go`): +``` +AckHandle uint32 +Unk0 uint16 — unknown +WorldID uint16 +LandID uint16 +CheckOnly bool — if true, only check whether an event is active (don't join) +[uint8 zeroed padding] +``` + +**Response** (4 bytes): `WorldID uint8 | LandID uint8 | RaviID uint16` + +**Current state**: Implemented in `handlers_register.go`. On `CheckOnly=true` with no active +Raviente semaphore it returns a zeroed 4-byte success. Otherwise it echoes back the world/land IDs +and `s.server.raviente.id`. This is the Raviente siege plumbing reused — whether it is correct for +fort attack (as opposed to the Raviente siege proper) is unknown. + +--- + +### `MsgMhfReleaseEvent` — Client → Server + +Sent when a player leaves a fort attack session. 
Carries `RaviID uint32` (the session ID returned +by RegisterEvent) plus a zeroed `uint32`. + +**Current state**: Always returns `_ACK_EFAILED` (0x41). The correct success response format is +unknown — packet `Build()` is also unimplemented. + +--- + +### `MsgMhfGetRestrictionEvent` — Client → Server → Client + +Purpose unknown. Likely fetches per-player or per-world restrictions for event participation +(e.g. quest rank gate, prior completion check). + +**Current state**: Packet `Parse()` and `Build()` both return `NOT IMPLEMENTED`. Handler is an +empty no-op (`handleMsgMhfGetRestrictionEvent`). No captures of this packet are known. + +--- + +### `MsgMhfSetRestrictionEvent` — Client → Server → Client + +Purpose unknown. Likely sets restriction state after an event completes or a player qualifies. + +**Request** (`msg_mhf_set_restriction_event.go`): +``` +AckHandle uint32 +Unk0 uint32 — unknown +Unk1 uint32 — unknown +Unk2 uint32 — unknown +Unk3 uint8 — unknown +``` + +**Current state**: Handler returns a zeroed 4-byte success. `Build()` is unimplemented. Packet +semantics are entirely unknown. + +--- + +## Shared State (Registers) + +The Raviente siege uses three named register banks (`raviRegisterState`, `raviRegisterSupport`, +`raviRegisterGeneral`) served via `MsgSysLoadRegister` and mutated via `MsgSysOperateRegister`. +The fort attack event likely uses the same register mechanism for shared state (fort durability, +Keoaruboru heat accumulation, etc.), but which register IDs and slot indices map to which fort +variables has not been reverse-engineered. + +`handleMsgSysNotifyRegister` is a stub (`// stub: unimplemented`) — this handler broadcasts +register updates to other players in the session. It must be implemented for multi-player fort +state synchronisation to work. 
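Whatever the register mapping turns out to be, the durability arithmetic described under Game Context is easy to model. A hypothetical sketch (struct and method names invented here; this is not tied to any real register ID or slot) of the fort/beam interaction:

```go
package main

import "fmt"

// fortState is a hypothetical model of the shared fort variables the
// register banks presumably carry; real register IDs/slots are unknown.
type fortState struct {
	durability int // percent; the quest fails when this reaches 0
}

// applyBeam models one unchecked Keoaruboru beam: 20% durability damage.
// It returns true when the fort has fallen (quest failure).
func (f *fortState) applyBeam() bool {
	f.durability -= 20
	if f.durability < 0 {
		f.durability = 0
	}
	return f.durability == 0
}

func main() {
	f := fortState{durability: 80} // fort starts at 80% integrity
	for i := 1; i <= 4; i++ {
		failed := f.applyBeam()
		fmt.Printf("beam %d: durability=%d%% failed=%v\n", i, f.durability, failed)
	}
}
```

This reproduces the documented failure condition: four unchecked beams from 80% integrity end the quest.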
+ +--- + +## Database + +The `events` table (`server/migrations/sql/0001_init.sql`) already supports timestamped events: + +```sql +CREATE TABLE public.events ( + id SERIAL PRIMARY KEY, + event_type event_type NOT NULL, + start_time TIMESTAMP WITH TIME ZONE NOT NULL +); +``` + +The `event_type` enum currently contains `festa`, `diva`, `vs`, `mezfes`. Adding `ancientdragon` +requires a migration: + +```sql +ALTER TYPE event_type ADD VALUE 'ancientdragon'; +``` + +The `feature/enum-event` branch placed this in `schemas/patch-schema/event-ancientdragon.sql`, +which is outside the numbered migration sequence and will not be auto-applied. It needs to be +added as `server/migrations/sql/0002_ancientdragon_event_type.sql` (or folded into the next +migration). + +--- + +## What Needs RE Before Implementation + +| Unknown | Where to look | Priority | +|---------|---------------|---------| +| Semantics of `EventType` values (1 vs 2, others?) | Packet captures during event window | High | +| Meaning of Unk1–Unk4 in the EnumerateEvent response | Packet captures + client disassembly | Medium | +| Correct `MsgMhfReleaseEvent` success response format | Packet captures | High | +| `MsgMhfGetRestrictionEvent` full structure (parse + response) | Packet captures | High | +| `MsgMhfSetRestrictionEvent` field semantics (Unk0–Unk3) | Packet captures | Medium | +| Which register IDs / slots carry fort durability | Packet captures during fort quest | High | +| Keoaruboru heat accumulation register mapping | Packet captures during Keoaruboru quest | High | +| Whether `MsgMhfRegisterEvent` reuses Raviente state correctly for fort | Packet captures + comparison with Raviente behaviour | Medium | +| Original event scheduling cadence (cycle length, trigger time) | Live server logs / JP wiki sources | Low | + +--- + +## What Is Already Understood + +- `MsgMhfEnumerateEvent` response wire format (field order, types, conditional quest ID list) +- `StartTime` / `EndTime` are Unix timestamps 
(confirmed by the feature branch) +- `MsgMhfRegisterEvent` request structure and plausible response format (echoes world/land + ravi ID) +- `MsgMhfReleaseEvent` request structure (carries the ravi session ID) +- `MsgMhfSetRestrictionEvent` request structure (5 fields, semantics unknown) +- The fort event cycles via the `events` table and can share the existing Raviente semaphore infrastructure +- Quest file IDs for fort quests: `20001, 20004–20006, 20011–20013, 20018–20029` (from feature branch config; unvalidated against captures) + +--- + +## Relation to Raviente + +The Raviente siege (`sys_channel_server.go`, `handlers_register.go`) is the closest implemented +analogue. It uses the same `MsgMhfRegisterEvent` / `MsgSysOperateRegister` / `MsgSysLoadRegister` +pipeline. Fort attack implementation can likely reuse or extend this infrastructure rather than +building a separate system. The key difference is that Raviente is always available (with its own +scheduling), while fort attacks are event-gated via `MsgMhfEnumerateEvent`. diff --git a/docs/hunting-tournament.md b/docs/hunting-tournament.md new file mode 100644 index 000000000..6838db9dc --- /dev/null +++ b/docs/hunting-tournament.md @@ -0,0 +1,190 @@ +# Official Hunting Tournament (公式狩猟大会) + +Documents the tournament system implementation status, known protocol details, and remaining +reverse-engineering gaps. + +The `feature/hunting-tournament` branch (origin) is **not mergeable** — it duplicates handlers +that already exist in `handlers_tournament.go`. Its useful findings are incorporated below. + +--- + +## Game Context + +The 公式狩猟大会 (Official Hunting Tournament) was a recurring competitive event numbered +sequentially from the first (late 2007) through at least the 150th before service ended in +December 2019. It ran during the **登録祭** (Registration Festival) — week 1 of each 3-week +Mezeporta Festival (狩人祭) cycle. 
+ +### Competition Cups (杯) + +| Cup | Group | Type | Description | +|-----|-------|------|-------------| +| **個人 G級韋駄天杯** (Solo speed hunt) | 16 | 7 | Time-attack solo vs. a designated monster. Results ranked per weapon class (EventSubType 0–13+ map to weapon categories). | +| **猟団対抗韋駄天杯** (Guild speed hunt) | 17 | 7 | Same time-attack concept, up to 4 hunters from the same guild. Guild rankings determine 魂 (souls) payouts to Mezeporta Festival. EventSubType -1 = all weapon classes combined. | +| **巨大魚杯** (Giant fish cup) | 6 | 6 | Fish size competition. Three designated species; largest catch wins. EventSubType maps to fish species. | + +### Tournament Schedule + +The tournament ran inside each 登録祭 week, and had four phases: + +| Phase | State byte | Duration | +|-------|-----------|---------| +| Before start | 0 | Until `StartTime` | +| Registration + hunting | 1 | `StartTime` → `EntryEnd` (~3 days, Fri 14:00 to Mon 14:00) | +| Scoring / ranking | 2 | `EntryEnd` → `RankingEnd` (~+8.9 days) | +| Reward distribution | 3 | `RankingEnd` → `RewardEnd` (+7 days) | + +The four Unix timestamps (`StartTime`, `EntryEnd`, `RankingEnd`, `RewardEnd`) are all included in +the `EnumerateRanking` response alongside the current state byte. 
+ +### Rewards + +| Placement | Reward | +|-----------|--------| +| All participants | カフの素 (Skill Cuff base materials), ネコ珠の素 (Cat Gem base) | +| Top 500 | 匠チケット + ハーフチケット白 | +| Top 100 | 猟団ポイント (Guild points) | +| Top 3 (speed hunt) | 公式のしるし【金/銀/銅】(Official Mark Gold/Silver/Bronze) | +| Top 3 (fish cup) | 魚杯のしるし【金/銀/銅】(Fish Cup Mark Gold/Silver/Bronze) | +| 1st place (from tournament 76+) | 王者のメダル (King's Medal) — crafts exclusive weapons | +| Guild rank 1–10 | 50,000 魂 to faction + 5,000 to guild (Mezeporta Festival souls) | +| Guild rank 11–30 | 20,000 魂 to faction + 2,000 to guild | + +--- + +## Implementation Status in `develop` + +The tournament is **substantially implemented** in `handlers_tournament.go` and `repo_tournament.go` +with a full repository pattern and DB schema (`server/migrations/sql/0015_tournament.sql`). + +### What Works + +| Handler | File | Status | +|---------|------|--------| +| `handleMsgMhfEnumerateRanking` | `handlers_tournament.go` | Full — DB-backed, state machine, cups + sub-events | +| `handleMsgMhfEnumerateOrder` | `handlers_tournament.go` | Partial — returns leaderboard entries, but ranked by submission time (see gaps) | +| `handleMsgMhfInfoTournament` | `handlers_tournament.go` | Partial — type 0 (listing) and type 1 (registration check) work; type 2 (reward structures) returns empty | +| `handleMsgMhfEntryTournament` | `handlers_tournament.go` | Full — registers character, returns `entryID` | +| `handleMsgMhfEnterTournamentQuest` | `handlers_tournament.go` | Partial — records the submission, but clear time is not stored (see gaps) | +| `handleMsgMhfAcquireTournament` | `handlers_tournament.go` | Stub — returns empty reward list | + +### Database Schema + +Five tables in `0015_tournament.sql`: + +``` +tournaments — schedule: id, name, start_time, entry_end, ranking_end, reward_end +tournament_cups — per-tournament cup categories (cup_group, cup_type, name, description) +tournament_sub_events — shared event definitions 
(cup_group, event_sub_type, quest_file_id, name) +tournament_entries — per-character registration (char_id, tournament_id, UNIQUE) +tournament_results — per-submission record (char_id, tournament_id, event_id, quest_slot, stage_handle, submitted_at) +``` + +Note: `tournament_results` records *when* a submission arrived but not the actual quest clear time. +The leaderboard in `GetLeaderboard` therefore ranks by `submitted_at ASC` (first to submit = rank 1) +which is incorrect — the real server ranked by quest clear time. + +--- + +## Known Gaps (RE Required) + +### 1. Ranking by Quest Clear Time + +**Impact**: High — the leaderboard is fundamentally wrong. + +`handleMsgMhfEnterTournamentQuest` receives `TournamentID`, `EntryHandle`, `Unk2` (likely +`EventID`), `QuestSlot`, and `StageHandle`. None of these fields carry the actual clear time +directly. The clear time likely arrives via a separate packet (possibly `MsgMhfEndQuest` or a +dedicated score submission packet) that is not yet identified. Until it is, ranking by submission +order is a best-effort placeholder. + +### 2. Guild Leaderboard Filtering by `ClanID` + +**Impact**: Medium — guild cup leaderboard shows all entries instead of filtering by clan. + +`MsgMhfEnumerateOrder` sends both `EventID` and `ClanID` (field names confirmed by the +`feature/hunting-tournament` branch). The current `GetLeaderboard` implementation queries only +by `event_id` and ignores `ClanID`. The guild cup (cup_group 17) leaderboard is presumably +filtered to show only that clan's members, or possibly compared against other clans. The exact +filtering semantics are unknown. + +### 3. `AcquireTournament` Reward Delivery + +**Impact**: High — players cannot receive any tournament rewards. + +`handleMsgMhfAcquireTournament` returns an empty `TournamentReward` list. The +`TournamentReward` struct has three `uint16` fields (`Unk0`, `Unk1`, `Unk2`) that are entirely +unknown. 
It is unclear whether these carry item IDs, quantities, and flags, or whether the reward +delivery uses a different mechanism (e.g. mail). The 王者のメダル and 公式のしるし item IDs are +also unknown. + +### 4. `InfoTournament` Type 2 (Reward Structures) + +**Impact**: Medium — in-game reward preview is empty. + +Query type 2 returns `TournamentInfo21` and `TournamentInfo22` lists — these likely describe +the per-placement reward tiers shown in the UI before a player claims their prize. All fields in +both structs are unknown (`Unk0`–`Unk4`). + +### 5. `TournamentInfo0` Unknown Fields + +**Impact**: Low — mostly display metadata. + +The `TournamentInfo0` struct (used in `InfoTournament` type 0) has several unknown fields: +`MaxPlayers`, `CurrentPlayers`, `TextColor`, `Unk1`–`Unk6`, `MinHR`, `MaxHR`, plus two +unknown strings. Currently all written as zero/empty. The HR min/max likely gate tournament +access by hunter rank; `TextColor` likely styles the tournament name in the UI. + +### 6. Guild Cup Souls → Mezeporta Festival Attribution + +**Impact**: Medium — guild cup placement does not feed into Festa soul pool. + +The guild speed hunt cup (cup_group 17) awarded 魂 to the guild's Mezeporta Festival account +based on placement. `handleMsgMhfAcquireTournament` currently delivers no rewards at all, let +alone Festa souls. Even once reward delivery is implemented, the soul injection into the Festa +system (via `FestaRepo.SubmitSouls` or similar) needs to be wired up. + +--- + +## What the `feature/hunting-tournament` Branch Adds + +The branch is not mergeable because it adds `handleMsgMhfEnumerateRanking` and +`handleMsgMhfEnumerateOrder` to `handlers_festa.go`, creating duplicate definitions that already +exist in `handlers_tournament.go`. However it contains several useful findings: + +**`ClanID` field name on `MsgMhfEnumerateOrder`** +The two unknown fields (`Unk0`, `Unk1`) are identified as `EventID` and `ClanID`. 
`EventID` was +already used correctly in develop; `ClanID` is the new insight (currently ignored). + +**Phase timing constants** +The branch's `generateTournamentTimestamps` debug modes confirm the timestamp offsets: +- `StartTime` → `EntryEnd`: +259,200 s (3 days) +- `EntryEnd` → `RankingEnd`: +766,800 s (~8.9 days) +- `RankingEnd` → `RewardEnd`: +604,800 s (7 days) + +These match the real-server cadence and are already reflected in `TournamentDefaults.sql`. + +**Festa timing correction (unrelated side effect)** +The branch also modifies `generateFestaTimestamps` in two ways that are not related to the +tournament but should be evaluated independently: +- `RestartAfter` threshold: 2,977,200 s → 3,024,000 s (34.45 days → 35 days) +- New event start time: midnight+24h → midnight+35h (i.e. 11:00 the following morning) + +These changes appear to better match the real server schedule but have no test coverage. They +should be assessed against packet captures before merging. + +--- + +## Seed Data Reference + +`server/migrations/seed/TournamentDefaults.sql` pre-populates: + +- 1 tournament (tournament #150, "第150回公式狩猟大会") with correct phase durations +- 18 sub-events: + - cup_group 16 (individual speed hunt): EventSubType 0–13 against Brachydios, quest_file_id 60691 + - cup_group 17 (guild speed hunt): EventSubType -1, quest_file_id 60690 + - cup_group 6 (fish): キレアジ (EventSubType 234), ハリマグロ (237), カクサンデメキン (239) +- 3 cups: 個人 巨大魚杯 (id 569), 猟団 G級韋駄天杯 (id 570), 個人 G級韋駄天杯 (id 571) + +The cup descriptions contain hardcoded dates ("2019年11月22日") from the original live event. +These should be templated or made dynamic when reward delivery is implemented. diff --git a/docs/reorg-proposals.md b/docs/reorg-proposals.md new file mode 100644 index 000000000..fd407ef47 --- /dev/null +++ b/docs/reorg-proposals.md @@ -0,0 +1,237 @@ +# Reorg Proposals + +Architectural improvements identified from the `chore/reorg` branch (never merged — too much drift). 
Each item is self-contained and can be implemented independently. + +--- + +## High Priority + +### 1. Remove `clientctx` from the mhfpacket interface + +`ClientContext` is an empty struct (`struct{}`) that was never used. It appears as a parameter in every `Parse` and `Build` signature across all ~453 mhfpacket files. + +**Current:** +```go +type MHFPacket interface { + Parse(bf *byteframe.ByteFrame, ctx *clientctx.ClientContext) error + Build(bf *byteframe.ByteFrame, ctx *clientctx.ClientContext) error + Opcode() packetid.PacketID +} +``` + +**Proposed:** +```go +type MHFPacket interface { + Parse(bf *byteframe.ByteFrame) error + Build(bf *byteframe.ByteFrame) error + Opcode() packetid.PacketID +} +``` + +Delete `network/mhfpacket/clientcontext.go` and remove the `ctx` argument from all 453 packet files and all call sites. Purely mechanical — no behaviour change. + +--- + +### 2. Lazy packet two-buffer sendLoop + +ACK responses (which the client blocks on) should always flush before broadcast state updates (position syncs, cast binaries from other players). Without priority ordering, a busy stage with many players can delay ACKs long enough that the client appears to softlock or experiences input lag. + +**Proposed change to `sys_session.go`:** + +Add a `Lazy bool` field to `queuedMHFPacket`. Add `QueueSendMHFLazy` for low-priority packets. In `sendLoop`, maintain two accumulators: flush `buffer` first, then `lazybuffer`. + +```go +type queuedMHFPacket struct { + pkt mhfpacket.MHFPacket + Lazy bool +} + +func (s *Session) QueueSendMHFLazy(pkt mhfpacket.MHFPacket) { + s.sendPackets <- queuedMHFPacket{pkt: pkt, Lazy: true} +} +``` + +`BroadcastMHF` and `Stage.BroadcastMHF` should use `QueueSendMHFLazy` since broadcast packets are lower priority than direct ACK responses. + +The current `QueueSendNonBlocking` (which drops packets silently on full queue) should be merged into `QueueSendMHFLazy` with a warning log on drop. + +--- + +### 3. 
`SessionStage` interface — decouple Stage from `*Session` + +`Stage` currently stores `map[*Session]uint32` for its clients and `*Session` for its host. This creates a circular import: Stage imports Session, Session imports Stage. It makes Stage impossible to move to a shared package and impossible to test in isolation. + +**Proposed — new interface in `sys_stage.go`:** +```go +type SessionStage interface { + QueueSendMHFLazy(pkt mhfpacket.MHFPacket) + GetCharID() uint32 + GetName() string +} +``` + +Change Stage fields: +```go +// Before +Clients map[*Session]uint32 +Host *Session + +// After +Clients map[SessionStage]uint32 +Host SessionStage +``` + +`*Session` already satisfies this interface via existing methods. `Stage.BroadcastMHF` iterates `SessionStage` values — no concrete session reference needed. + +This is a prerequisite for eventually moving Stage to a shared internal package and for writing Stage unit tests with mock sessions. + +--- + +### 4. Fix `semaphoreIndex` data race + +`semaphoreIndex uint32` is a shared incrementing counter on the `Server` struct, initialised to `7`. It is read and written from multiple goroutines without a lock — this is a data race. + +**Current (`sys_channel_server.go`):** +```go +type Server struct { + // ... + semaphoreIndex uint32 +} +``` + +**Proposed — remove from Server, derive per-session:** + +Each session already tracks `semaphoreID []uint16` and `semaphoreMode bool`. Derive the semaphore ID from those: + +```go +func (s *Session) GetSemaphoreID() uint32 { + if s.semaphoreMode { + return 0x000E0000 + uint32(s.semaphoreID[1]) + } + return 0x000F0000 + uint32(s.semaphoreID[0]) +} +``` + +No shared counter, no race. Verify with `go test -race` before and after. + +--- + +## Medium Priority + +### 5. 
Extract chat commands to `chat_commands.go` + +`handlers_cast_binary.go` currently contains both packet handlers and all chat command implementations (`ban`, `timer`, `psn`, `reload`, `kqf`, `rights`, `course`, `ravi`, `teleport`, `discord`, `help`) — roughly 400 lines of command logic mixed into a handler file. + +**Proposed:** Move `parseChatCommand` and all command implementations to a new `chat_commands.go`. The handler in `handlers_cast_binary.go` calls `parseChatCommand(s, message)` — that call site stays unchanged. + +While doing this, change `Config.Commands` from `[]Command` (linear scan) to `map[string]Command` (O(1) lookup by name). + +--- + +### 6. i18n rewrite + +`sys_language.go` currently uses bare string literals concatenated per locale. The chore/reorg branch introduces a cleaner pattern worth adopting independently of the full reorg. + +**Proposed `i18n.go`:** +```go +var translations = map[string]map[string]string{ + "en": { + "commands.ban.success.permanent": "User {username} has been permanently banned.", + "commands.ban.success.timed": "User {username} has been banned for {duration}.", + // ... + }, + "jp": { + "commands.ban.success.permanent": "ユーザー {username} は永久BANされました。", + // ... + }, +} + +type v = map[string]string + +func t(key string, placeholders v) string { + locale := currentLocale() // or derive from session + tmpl, ok := translations[locale][key] + if !ok { + tmpl = translations["en"][key] + } + for k, val := range placeholders { + tmpl = strings.ReplaceAll(tmpl, "{"+k+"}", val) + } + return tmpl +} +``` + +Usage: `t("commands.ban.success.permanent", v{"username": uname})` + +--- + +### 7. Extract model structs to shared location + +Model structs currently defined inside handler files (`Guild`, `Mail`, `GuildApplication`, `FestivalColor`, etc.) cannot be used by services or other packages without importing `channelserver`, which risks import cycles. 
+ +**Proposed:** Move `db:`-tagged model structs out of handler files and into a dedicated location (e.g. `server/channelserver/model/` or a future `internal/model/`). Local-only types (used by exactly one handler) can stay in place. + +`guild_model.go` is already a partial example of this pattern — extend it. + +--- + +### 8. `doAck*` as session methods + +The four free functions `doAckBufSucceed`, `doAckBufFail`, `doAckSimpleSucceed`, `doAckSimpleFail` take `*Session` as their first argument. They are inherently session operations and should be methods. + +**Current:** +```go +doAckBufSucceed(s, pkt.AckHandle, data) +doAckBufFail(s, pkt.AckHandle, data) +``` + +**Proposed:** +```go +s.DoAckBufSucceed(pkt.AckHandle, data) +s.DoAckBufFail(pkt.AckHandle, data) +``` + +Exporting them (`DoAck…` vs `doAck…`) makes them accessible from service packages without having to pass a raw response buffer around. Mechanical update across all handler files. + +--- + +### 9. `Server` → `ChannelServer` rename + +The channel server's `Server` struct is ambiguous when `signserver.Server` and `entranceserver.Server` all appear in `main.go`. Within handler files, the `s` receiver is used for both `*Session` and `*Server` methods — reading a handler requires tracking which `s` is which. + +**Proposed:** +- Rename `type Server struct` → `type ChannelServer struct` in `sys_channel_server.go` +- Change `Server` method receivers from `s *Server` to `server *ChannelServer` +- Session method receivers remain `s *Session` +- `s.server` field (on Session) stays as-is but now types to `*ChannelServer` + +This makes "am I reading session code or server code?" immediately obvious from the receiver name. + +--- + +## Low Priority + +### 10. Split `sys_channel_server.go` — extract Raviente to `sys_ravi.go` + +`sys_channel_server.go` handles server lifecycle, world management, and all Raviente siege state. Raviente is conceptually a separate subsystem. 
+ +**Proposed:** Move `Raviente` struct, `resetRaviente()`, `GetRaviMultiplier()`, and `UpdateRavi()` to a new `sys_ravi.go`. No logic changes. + +--- + +### 11. Split `handlers_cast_binary.go` — extract broadcast utils to `sys_broadcast.go` + +`BroadcastMHF`, `WorldcastMHF`, `BroadcastChatMessage`, and `BroadcastRaviente` are server-level infrastructure, not packet handlers. They live in `handlers_cast_binary.go` for historical reasons. + +**Proposed:** Move them to a new `sys_broadcast.go`. No logic changes. + +--- + +## What Not to Adopt from chore/reorg + +These patterns appeared in the branch but should not be brought over: + +- **DB singleton (`database.GetDB()`)** — services calling a global DB directly lose the repo-interface/mock pattern we rely on for handler unit tests. The session-as-service-locator problem is better solved by passing the DB through constructors explicitly. +- **`db *sqlx.DB` added to every handler signature** — the intent (make DB dependency explicit) is right, but the implementation is inconsistent (many handlers still call `database.GetDB()` internally). The existing repo-mock pattern is a better testability mechanism. +- **Services in `internal/service/` with inline SQL** — the move out of `channelserver` is correct; dropping the repo-interface pattern is not. Any service extraction should preserve mockability. +- **Logger singleton** — same concern as DB singleton: `sync.Once` initialization cannot be reset between tests, complicating parallel or isolated test runs. 
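To illustrate the constructor-injection alternative argued for above, a minimal hypothetical sketch (all type and method names are invented for illustration; these are not the real Erupe interfaces):

```go
package main

import "fmt"

// CharacterRepo is the kind of narrow interface the repo pattern relies on.
type CharacterRepo interface {
	SaveData(charID uint32) ([]byte, error)
}

// TournamentService receives its dependency explicitly instead of calling
// a global like database.GetDB(), so tests can hand it a mock.
type TournamentService struct {
	chars CharacterRepo
}

func NewTournamentService(chars CharacterRepo) *TournamentService {
	return &TournamentService{chars: chars}
}

func (s *TournamentService) SaveSize(charID uint32) (int, error) {
	data, err := s.chars.SaveData(charID)
	if err != nil {
		return 0, err
	}
	return len(data), nil
}

// mockRepo shows the testability win: no DB, no sync.Once to reset.
type mockRepo struct{ data []byte }

func (m mockRepo) SaveData(uint32) ([]byte, error) { return m.data, nil }

func main() {
	svc := NewTournamentService(mockRepo{data: []byte{1, 2, 3}})
	n, _ := svc.SaveSize(1)
	fmt.Println(n) // 3
}
```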
diff --git a/docs/scenario-format.md b/docs/scenario-format.md index e78aeffb4..3d18d0a55 100644 --- a/docs/scenario-format.md +++ b/docs/scenario-format.md @@ -12,45 +12,218 @@ When `IsScenario == true`, the client sends a `scenarioFileIdentifier`: | Offset | Type | Field | Description | |--------|--------|-------------|-------------| -| 0 | uint8 | CategoryID | Scenario category | +| 0 | uint8 | CategoryID | Scenario category (0=Basic, 1=Veteran, 3=Other, 6=Pallone, 7=Diva) | | 1 | uint32 | MainID | Main scenario identifier | | 5 | uint8 | ChapterID | Chapter within the scenario | | 6 | uint8 | Flags | Bit flags selecting chunk types (see below) | +The server constructs the filename as: + +``` +{CategoryID}_0_0_0_S{MainID}_T{Flags}_C{ChapterID}.bin (or .json) +``` + ## Flags (Chunk Type Selection) The `Flags` byte is a bitmask that selects which chunk types the client requests: -| Bit | Value | Type | Recursive | Content | -|------|-------|---------|-----------|---------| -| 0 | 0x01 | Chunk0 | Yes | Quest name/description + 0x14 byte info block | -| 1 | 0x02 | Chunk1 | Yes | NPC dialog(?) + 0x2C byte info block | -| 2 | 0x04 | — | — | Unknown (no instances found; possibly Chunk2) | -| 3 | 0x08 | Chunk0 | No | Episode listing (0x1 prefixed?) | -| 4 | 0x10 | Chunk1 | No | JKR-compressed blob, NPC dialog(?) | -| 5 | 0x20 | Chunk2 | No | JKR-compressed blob, menu options or quest titles(?) 
| -| 6 | 0x40 | — | — | Unknown (no instances found) | -| 7 | 0x80 | — | — | Unknown (no instances found) | +| Bit | Value | Format | Content | +|-----|-------|-----------------|---------| +| 0 | 0x01 | Sub-header | Quest name/description (chunk0) | +| 1 | 0x02 | Sub-header | NPC dialog (chunk1) | +| 2 | 0x04 | — | Unknown (no instances found) | +| 3 | 0x08 | Inline | Episode listing (chunk0 inline) | +| 4 | 0x10 | JKR-compressed | NPC dialog blob (chunk1) | +| 5 | 0x20 | JKR-compressed | Menu options or quest titles (chunk2) | +| 6 | 0x40 | — | Unknown (no instances found) | +| 7 | 0x80 | — | Unknown (no instances found) | -### Chunk Types +The flags are part of the filename — each unique `(CategoryID, MainID, Flags, ChapterID)` tuple corresponds to its own file on disk. -- **Chunk0**: Contains text data (quest names, descriptions, episode titles) with an accompanying fixed-size info block. -- **Chunk1**: Contains dialog or narrative text with a larger info block (0x2C bytes). -- **Chunk2**: Contains menu/selection text. +## Container Format (big-endian) -### Recursive vs Non-Recursive +``` +Offset Field +@0x00 u32 BE chunk0_size +@0x04 u32 BE chunk1_size +@0x08 bytes chunk0_data (chunk0_size bytes) +@0x08+c0 bytes chunk1_data (chunk1_size bytes) +@0x08+c0+c1 u32 BE chunk2_size (only present if file continues) + bytes chunk2_data (chunk2_size bytes) +``` -- **Recursive chunks** (flags 0x01, 0x02): The chunk data itself contains nested sub-chunks that must be parsed recursively. -- **Non-recursive chunks** (flags 0x08, 0x10, 0x20): The chunk is a flat binary blob. Flags 0x10 and 0x20 are JKR-compressed and must be decompressed before reading. +The 8-byte header is always present. Chunks with size 0 are absent. Chunk2 is only read if at least 4 bytes remain after chunk0+chunk1. 
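The container layout above can be split with a few lines of code. A hedged sketch (function name assumed) using `encoding/binary`:

```go
package main

import (
	"encoding/binary"
	"fmt"
)

// splitScenarioContainer follows the documented container layout:
// two big-endian u32 sizes, chunk0 data, chunk1 data, then an optional
// size-prefixed chunk2 if at least 4 bytes remain.
// Note: the HD client drops any chunk larger than 0x8000 bytes, so a
// server should additionally refuse to serve oversized chunks.
func splitScenarioContainer(data []byte) (c0, c1, c2 []byte, err error) {
	if len(data) < 8 {
		return nil, nil, nil, fmt.Errorf("short header: %d bytes", len(data))
	}
	s0 := binary.BigEndian.Uint32(data[0:4])
	s1 := binary.BigEndian.Uint32(data[4:8])
	rest := data[8:]
	if uint64(s0)+uint64(s1) > uint64(len(rest)) {
		return nil, nil, nil, fmt.Errorf("chunk sizes exceed payload")
	}
	c0, rest = rest[:s0], rest[s0:]
	c1, rest = rest[:s1], rest[s1:]
	if len(rest) >= 4 { // chunk2 is only read if the file continues
		s2 := binary.BigEndian.Uint32(rest[0:4])
		if uint64(s2) > uint64(len(rest)-4) {
			return nil, nil, nil, fmt.Errorf("chunk2 size exceeds payload")
		}
		c2 = rest[4 : 4+s2]
	}
	return c0, c1, c2, nil
}

func main() {
	// chunk0 = 2 bytes, chunk1 = 1 byte, then chunk2 (size 1, data 0xEE)
	raw := []byte{0, 0, 0, 2, 0, 0, 0, 1, 0xAA, 0xBB, 0xCC, 0, 0, 0, 1, 0xEE}
	c0, c1, c2, err := splitScenarioContainer(raw)
	fmt.Println(len(c0), len(c1), len(c2), err) // 2 1 1 <nil>
}
```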
-## Response Format +**Client-side size limits (confirmed from `FUN_11525c60` in `mhfo-hd.dll`):** each chunk is silently dropped (treated as size 0) if its size exceeds `0x8000` bytes (32 768). The client allocates three fixed 0x8000-byte buffers — one per chunk — so the server must not serve chunks larger than that limit. -The server responds with the scenario file data via `doAckBufSucceed`. The response is the raw binary blob matching the requested chunk types. If the scenario file is not found, the server sends `doAckBufFail` to prevent a client crash. +## Chunk Formats -## Current Implementation +### Sub-header Format (flags 0x01, 0x02) -Scenario files are loaded from `quests/scenarios/` on disk. The server currently serves them as opaque binary blobs with no parsing. Issue #172 proposes adding JSON/CSV support for easier editing, which would require implementing a parser/serializer for this format. +Used for structured text chunks containing named strings with metadata. + +**Sub-header (8 bytes, fields at byte offsets within the chunk):** + +| Off | Type | Field | Notes | +|-----|---------|--------------|-------| +| 0 | u8 | Type | Usually `0x01` | +| 1 | u8 | Pad | Always `0x00`; used to detect this format vs inline | +| 2 | u16 LE | TotalSize | Total chunk size including this header | +| 4 | u8 | EntryCount | Number of string entries | +| 5 | u8 | Unknown1 | Unknown; preserved in JSON for round-trip | +| 6 | u8 | MetadataSize | Total bytes of the metadata block that follows | +| 7 | u8 | Unknown2 | Unknown; preserved in JSON for round-trip | + +**Layout after the 8-byte header:** + +``` +[MetadataSize bytes: opaque metadata block] +[null-terminated Shift-JIS string #1] +[null-terminated Shift-JIS string #2] +... +[0xFF end-of-strings sentinel] +``` + +**Metadata block** (partially decoded): + +The metadata block is `MetadataSize` bytes long. 
Known sizes from real files: + +- Chunk0 (flag 0x01): `MetadataSize = 0x14` (20 bytes = 10 × u16 LE) +- Chunk1 (flag 0x02): `MetadataSize = 0x2C` (44 bytes = 22 × u16 LE) + +**Chunk0 metadata (20 bytes decoded from 145,000+ real scenario files):** + +Client parser (`FUN_1080d310` in `mhfo-hd.dll`) extracts only m[0]–m[6]; fields m[7]–m[9] are not read. + +| u16 index | Field | Notes | +|-----------|-------|-------| +| m[0] | CategoryID | Matches the first field of the filename (0=basic, 1=GR, 3=exchange, 6=pallone, 7=diva) | +| m[1] | MainID | Matches the `S` field of the filename | +| m[2] | 0x0000 | Always zero; used as offset to string 0 (i.e., strings section start = str0 start) | +| m[3–4] | 0x0000 | Reserved / always zero; not used by client | +| m[5] | str0_len | Byte length of string 0 in Shift-JIS including the null terminator; used as offset to string 1 | +| m[6] | SceneRef | `MainID` when CategoryID=0; `0xFFFF` (−1 as s16) when CategoryID≠0 — stored in client struct as signed short; purpose unclear | +| m[7] | 0x0000 | Not read by client parser | +| m[8] | 0x0005 | Not read by client parser; constant whose purpose is unknown | +| m[9] | varies | Not read by client parser | + +**Chunk1 metadata (44 bytes decoded from multi-dialog scenario files):** + +The 22 u16 fields encode string offsets and dialog script positions. Client parser (`FUN_1080d3b0`) interprets m[8]–m[17] as **signed** offsets: if the value is negative (as s16), the absolute position is `(~value) + dialog_base` where `dialog_base` is the start of the post-0xFF binary data; if non-negative, the position is `value + strings_base`. 
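The signed-offset rule can be stated compactly in code. A sketch (function name assumed):

```go
package main

import "fmt"

// resolveOffset implements the documented rule for chunk1 metadata fields
// m[8]–m[17]: negative (as s16) values point into the post-0xFF dialog data
// via (~value) + dialog_base; non-negative values are offsets from the
// start of the strings section.
func resolveOffset(v int16, stringsBase, dialogBase int) int {
	if v < 0 {
		return int(^v) + dialogBase // Go's ^v is bitwise NOT, i.e. (~value)
	}
	return int(v) + stringsBase
}

func main() {
	fmt.Println(resolveOffset(5, 100, 400))  // 105 — into the strings section
	fmt.Println(resolveOffset(-3, 100, 400)) // 402 — ^(-3) = 2, plus dialog base
}
```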
+ +| u16 index | Field | Notes | +|-----------|-------|-------| +| m[0] | ID byte 0 | Low byte only is read; typically 0 | +| m[1] | ID byte 1 | High byte only is read; varies | +| m[1] (u16) | TotalSize copy | Bytes 2–3 read as u16 LE; mirrors the sub-header TotalSize | +| m[2] | EntryCount (s16) | Read as signed short; number of strings or related count | +| m[3] | u16 at offset 6 | Read as u16 | +| m[4–5] | u32 at offset 8 | Read as single u32 | +| m[6] | u16 at offset 12 | Read as u16 | +| m[7] | u16 at offset 14 | Read as u16 | +| m[8] | signed offset | String/dialog pointer (see signed offset formula above) | +| m[9] | cumOff[2] | Byte offset to string 2 from strings section start (= str0_len + str1_len) | +| m[10] | cumOff[1] | Byte offset to string 1 = str0_len | +| m[11] | dialog offset | Offset into the post-0xFF dialog data section | +| m[12] | dialog offset | Offset into the post-0xFF dialog data section | +| m[13] | dialog offset | Offset into the post-0xFF dialog data section | +| m[14] | cumOff[3] | Byte offset to string 3 | +| m[15] | cumOff[4] | Total string bytes without the 0xFF sentinel | +| m[16] | dialog offset | Further offset into the post-0xFF dialog data section | +| m[17] | signed offset | Final offset; if negative, `(~m[17]) + dialog_base`; byte at m[18]×2 is also read | +| m[18–19] | byte fields | Individual bytes read (not as u16 pairs) | +| m[20] | 0x0005 | Constant (same as chunk0 m[8]); not confirmed whether client reads this | +| m[21] | DataSize − 4 | Approximately equal to `chunk1_size − 8 − MetadataSize + 4` | + +The metadata is preserved verbatim in JSON as a base64 blob so that clients receive correct values for all fields including those not yet fully understood. + +**Format detection for chunk0:** if `chunk_data[1] == 0x00` → sub-header, else → inline. + +### Inline Format (flag 0x08) + +Used for episode listings. 
Each entry is:
+
+```
+{u8 index}{null-terminated Shift-JIS string}
+```
+
+Entries are sequential with no separator. Null bytes between entries are ignored during parsing.
+
+### JKR-compressed Chunks (flags 0x10, 0x20)
+
+Chunks with flags 0x10 (chunk1) and 0x20 (chunk2) are JKR-compressed blobs. The JKR header (magic `0x1A524B4A`) appears at the start of the chunk data.
+
+The decompressed content contains metadata bytes interleaved with null-terminated Shift-JIS strings, but the detailed format is not yet fully documented. These chunks are stored as opaque base64 blobs in the JSON format and served to the client unchanged.
+
+## JSON Format (for `.json` scenario files)
+
+Erupe supports `.json` files in `bin/scenarios/` as an alternative to `.bin` files. The server compiles `.json` to wire format on demand. `.bin` takes priority if both exist.
+
+Example `0_0_0_0_S102_T1_C0.json` (the `metadata` blob is exactly 20 zero bytes, matching chunk0's `MetadataSize = 0x14`):
+
+```json
+{
+  "chunk0": {
+    "subheader": {
+      "type": 1,
+      "unknown1": 0,
+      "unknown2": 0,
+      "metadata": "AAAAAAAAAAAAAAAAAAAAAAAAAAA=",
+      "strings": ["Quest Name", "Quest description goes here."]
+    }
+  }
+}
+```
+
+Example with inline chunk0 (flag 0x08):
+
+```json
+{
+  "chunk0": {
+    "inline": [
+      {"index": 1, "text": "Chapter 1"},
+      {"index": 2, "text": "Chapter 2"}
+    ]
+  }
+}
+```
+
+Example with both chunk0 and chunk1 (chunk1's `metadata` is exactly 44 zero bytes, matching `MetadataSize = 0x2C`):
+
+```json
+{
+  "chunk0": {
+    "subheader": {
+      "type": 1, "unknown1": 0, "unknown2": 0,
+      "metadata": "AAAAAAAAAAAAAAAAAAAAAAAAAAA=",
+      "strings": ["Quest Name"]
+    }
+  },
+  "chunk1": {
+    "subheader": {
+      "type": 1, "unknown1": 0, "unknown2": 0,
+      "metadata": "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=",
+      "strings": ["NPC: Welcome, hunter.", "NPC: Good luck!"]
+    }
+  }
+}
+```
+
+**Key fields:**
+
+- `metadata`: Base64-encoded opaque blob. Copy from `ParseScenarioBinary` output. For new scenarios with zero-filled metadata, use a base64 string of the right number of zero bytes.
+- `strings`: UTF-8 text.
The compiler converts to Shift-JIS on the wire. +- `chunk2.data`: Raw JKR-compressed bytes, base64-encoded. Copy from the original `.bin` file. ## JKR Compression -Chunks with flags 0x10 and 0x20 use JPK compression (magic bytes `0x1A524B4A`). See the ReFrontier tool for decompression utilities. +Chunks with flags 0x10 and 0x20 use JKR compression (magic `0x1A524B4A`, type 3 LZ77). The Go compressor is in `common/decryption.PackSimple` and the decompressor in `common/decryption.UnpackSimple`. These implement type-3 (LZ-only) compression, which is the format used throughout Erupe. + +Type-4 (HFI = Huffman + LZ77) JKR blobs from real game files pass through as opaque base64 in `.json` — the server serves them as-is without re-compression. + +## Implementation + +- **Handler**: `server/channelserver/handlers_quest.go` → `handleMsgSysGetFile` → `loadScenarioBinary` +- **JSON schema + compiler**: `server/channelserver/scenario_json.go` +- **JKR compressor**: `common/decryption/jpk_compress.go` (`PackSimple`) +- **JKR decompressor**: `common/decryption/jpk.go` (`UnpackSimple`) diff --git a/docs/tower.md b/docs/tower.md new file mode 100644 index 000000000..42a28976f --- /dev/null +++ b/docs/tower.md @@ -0,0 +1,511 @@ +# Tower / Sky Corridor (天廊 / Tenrou) + +Tracks what is known about the Tower system and what remains to be reverse-engineered +or implemented in Erupe. + +The core of this system **is already implemented on `develop`** via the repository and +service pattern (`repo_tower.go`, `svc_tower.go`, `handlers_tower.go`). Two branches carry +earlier work: `wip/tower` predates the refactor and uses direct SQL; `feature/tower` merges +`wip/tower` with `feature/earth`. Their useful findings are incorporated below. + +The `feature/tower` branch is **not mergeable in its current state** — it diverged too far +from `develop` and was superseded by the direct integration of tower code into `develop`. 
+Its main remaining value is the `PresentBox` handler logic and the `TimeTaken`/`CID` +field naming for `MsgMhfPostTowerInfo`. + +--- + +## Game Context + +The **Sky Corridor** (天廊, *Tenrou*) is a permanent dungeon introduced in MHFG6. Players +explore a multi-floor structure built by an ancient civilization on a desolate island. Unlike +quest zones, the Sky Corridor has its own progression systems independent of normal quests. + +### Individual Tower Progression + +- Each clear adds **10 minutes** to the timer; quests consist of 2–4 randomly selected floors. +- Each floor contains 2–3 rooms separated by gates, with Felynes that drop **purple medals**, + fixed-position treasure chests, and shiny crystals. +- Players accumulate **Tower Rank (TR)**, **Tower Rank Points (TRP)**, and + **Tower Skill Points (TSP)**. +- TSP is spent to level up one of 64 tower-specific skills stored server-side. +- Two floor-count columns exist in the DB (`block1`, `block2`), corresponding to the + pre-G7 and G7+ floor tiers respectively. + +### Duremudira (The Guardian) + +**Duremudira** (天廊の番人, Tower Guardian) is the Emperor Ice Dragon — an Elder Dragon +introduced in MHFG6. It appears as an optional boss in the second district of the Sky +Corridor. Slaying it yields **Red and Grey Liquids** used to craft Sky Corridor Gems and +Sigils. + +**Arrogant Duremudira** is a harder variant; less information about it is available in +English sources. + +### Gems (Sky Corridor Decorations) + +Gems are collectibles (30 slots, organized as 6 tiers × 5 per tier) obtained from the Tower. +They slot into equipment to activate skills without consuming armor skill slots. Stored +server-side as a 30-element CSV in `tower.gems`. + +### Tenrouirai (天廊威来) — Guild Mission System + +A guild-parallel challenge system layered over the Sky Corridor. Each guild has a +**mission page** (1-based) containing 3 active missions. 
Guild members contribute scores +by completing tower runs; when cumulative scores meet all three mission goals, the page +advances (after a guild RP donation threshold is also met). There are 33 hardcoded +missions across 3 blocks. + +**Mission types** (the `Mission` field in `TenrouiraiData`): +| Value | Objective | +|-------|-----------| +| 1 | Floors climbed | +| 2 | Antiques collected | +| 3 | Chests opened | +| 4 | Felynes (cats) saved | +| 5 | TRP acquisition | +| 6 | Monster slays | + +### Relation to the Earth Event Cycle + +The Tower phase is **week 3** of the three-week Earth event rotation (see `docs/conquest-war.md`). +The `GetWeeklySeibatuRankingReward` handler (Operation=5, IDs 260001 and 260003) already +handles Tower dure kill rewards and the 155-entry floor reward table. These do not need to be +reimplemented here. + +--- + +## Database Schema + +All tower tables are defined in `server/migrations/sql/0001_init.sql`. + +### `tower` — per-character progression + +```sql +CREATE TABLE public.tower ( + char_id integer, + tr integer, -- Tower Rank + trp integer, -- Tower Rank Points + tsp integer, -- Tower Skill Points + block1 integer, -- Floor count, era 1 (pre-G7) + block2 integer, -- Floor count, era 2 (G7+) + skills text, -- CSV of 64 skill levels + gems text -- CSV of 30 gem quantities (6 tiers × 5) +); +``` + +`skills` and `gems` default to `EmptyTowerCSV(N)` — comma-separated zeros — when NULL. +Gems are encoded as `tier << 8 | (index_within_tier + 1)` in wire responses. 
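The gem wire encoding can be sketched as a pair of conversions between the 0-based CSV slot (0–29) and the wire value. This assumes the tier in the wire value is 0-based, which matches the index arithmetic in the gem handler; the function names are illustrative, not Erupe's:

```go
package main

import "fmt"

// encodeGem packs a 0-based CSV slot (0–29, six tiers of five gems) into
// the wire encoding (tier << 8) | (index_within_tier + 1).
func encodeGem(slot int) uint16 {
	tier := slot / 5
	idx := slot % 5
	return uint16(tier<<8 | (idx + 1))
}

// decodeGem inverts encodeGem, recovering the CSV slot index.
func decodeGem(gem uint16) int {
	tier := int(gem >> 8)
	idx := int(gem&0xFF) - 1 // wire index is 1-based within the tier
	return tier*5 + idx
}

func main() {
	fmt.Printf("0x%04X\n", encodeGem(12)) // slot 12 = tier 2, third gem
	fmt.Println(decodeGem(0x0505))        // tier 5, fifth gem = last slot
}
```

A side note on the index computation quoted later for `PostGemInfo` Op=1: in Go, `>>`, `*`, `&`, and `%` all bind at the multiplication level and associate left-to-right, so `(pkt.Gem >> 8 * 5) + (pkt.Gem - pkt.Gem&0xFF00 - 1%5)` evaluates as `(Gem>>8)*5 + Gem - (Gem&0xFF00) - 1`, which for in-range gem values equals the same slot `decodeGem` computes; the expression reads ambiguously but appears arithmetically sound.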
+ +### Guild columns (`guilds` and `guild_characters`) + +```sql +-- guilds +tower_mission_page integer DEFAULT 1 -- Current Tenrouirai mission page +tower_rp integer DEFAULT 0 -- Accumulated guild tower RP + +-- guild_characters +tower_mission_1 integer -- Member's score for mission slot 1 +tower_mission_2 integer -- Member's score for mission slot 2 +tower_mission_3 integer -- Member's score for mission slot 3 +``` + +--- + +## Packet Overview + +Ten packets implement the Tower system. All live in `network/mhfpacket/`. None have +`Build()` implemented (all return `NOT IMPLEMENTED`). + +### `MsgMhfGetTowerInfo` — Client → Server → Client + +Fetches character tower data. The `InfoType` field selects what data to return. + +**Request** (`msg_mhf_get_tower_info.go`): +``` +AckHandle uint32 +InfoType uint32 — 1=TR/TRP, 2=TSP+skills, 3=level(pre-G7), 4=history, 5=level(G7+) +Unk0 uint32 — unknown; never used by handler +Unk1 uint32 — unknown; never used by handler +``` + +**Response**: variable-length array of frames via `doAckEarthSucceed`. + +| InfoType | Response per frame | +|----------|-------------------| +| 1 | `TR int32, TRP int32` | +| 2 | `TSP int32, Skills [64]int16` | +| 3, 5 | `Floors int32, Unk1 int32, Unk2 int32, Unk3 int32` — one frame per era (1 for G7, 2 for G8+) | +| 4 | `[5]int16 (history group 0), [5]int16 (history group 1)` | + +**InfoTypes 3 and 5** use the same code path and both return `TowerInfoLevel` entries. +The distinction between them is not understood. The `wip/tower` branch treats them +identically. + +**TowerInfoLevel Unk1/Unk2/Unk3**: three of the four level-entry fields are unknown. +They are hardcoded to `5` in `wip/tower` and `0` on `develop`. Whether they carry +max floor, session count, or display state is not known. + +**InfoType 4 (history)**: returns two groups of 5 × int16. The `wip/tower` branch +hardcodes them as `{1, 2, 3, 4, 5}` / `{1, 2, 3, 4, 5}`. Their meaning (e.g. 
recent +clear times, floor high scores) is not reverse-engineered. On `develop` they return zeros. + +**Current state on develop**: Implemented. Reads from `repo_tower.GetTowerData()`. +History data is zero-filled (semantics unknown). Level Unk1–Unk3 are zero-filled. + +--- + +### `MsgMhfPostTowerInfo` — Client → Server → Client + +Submits updated tower progress after a quest. + +**Request** (`msg_mhf_post_tower_info.go`): +``` +AckHandle uint32 +InfoType uint32 — 1 or 7 = progress update, 2 = skill purchase +Unk1 uint32 — unknown; logged in debug mode +Skill int32 — skill index to level up (InfoType=2 only) +TR int32 — new Tower Rank to set +TRP int32 — TRP earned (added to existing) +Cost int32 — TSP cost (InfoType=2) or TSP earned (InfoType=1,7) +Unk6 int32 — unknown; logged in debug mode +Unk7 int32 — unknown; logged in debug mode +Block1 int32 — floor count increment (InfoType=1,7) +Unk9 int64 — develop: reads as int64; wip/tower: reads as TimeTaken int32 + CID int32 +``` + +**Field disambiguation — `Unk9` vs `TimeTaken + CID`**: the `wip/tower` branch splits the +final 8 bytes into `TimeTaken int32` (quest duration in seconds) and `CID int32` +(character ID). This interpretation appears more correct than a single int64 — the character +ID would make sense as a submission attribution field. `develop` keeps it as `Unk9 int64` +until confirmed. This should be verified with a packet capture. + +**InfoType 7**: handled identically to InfoType 1. The difference between them is unknown +— it may relate to whether the run included Duremudira or was a normal floor clear. + +**TSP rate note**: the handler comment in both branches says "This might give too much TSP? +No idea what the rate is supposed to be." The `Cost` field is used for both TSP earned +(on progress updates) and TSP spent (on skill purchases); the actual earn rate formula is +unknown. + +**`block2` not written**: `UpdateProgress` only writes `block1`. 
The `block2` column (G7+ +floor era) is never incremented by the handler. This is likely a bug — `block2` should be +written when the client sends a G7+ floor run. + +**Current state on develop**: Implemented. Calls `towerRepo.UpdateSkills` (InfoType=2) and +`towerRepo.UpdateProgress` (InfoType=1,7). `Unk9`/`Unk1`/`Unk6`/`Unk7` are logged in +debug mode but not acted on. + +--- + +### `MsgMhfGetTenrouirai` — Client → Server → Client + +Fetches Tenrouirai (guild mission) data. + +**Request** (`msg_mhf_get_tenrouirai.go`): +``` +AckHandle uint32 +Unk0 uint8 — unknown; never used +DataType uint8 — 1=mission defs, 2=rewards, 4=guild progress, 5=char scores, 6=guild RP +GuildID uint32 +MissionIndex uint8 — which mission to query scores for (DataType=5 only; 1-3) +Unk4 uint8 — unknown; never used +``` + +**DataType=1 response**: 33 frames, one per mission definition: +``` +[uint8] Block — 1–3 +[uint8] Mission — type (1–6, see table above) +[uint16] Goal — score required +[uint16] Cost — RP cost to unlock/advance +[uint8] Skill1–6 — 6 skill requirement bytes (values: 80, 40, 40, 20, 40, 50) +``` + +**DataType=2 response (rewards)**: `TenrouiraiReward` struct is defined but never +populated. Returns an empty array. Response format: +``` +[uint8] Index +[uint16] Item[0..4] — 5 item IDs +[uint8] Quantity[0..4] — 5 quantities +``` +No captures of a populated reward response are known. 
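The DataType=1 frame layout above can be sketched as a 12-byte serialization. The struct and function names here are illustrative rather than Erupe's actual types, and big-endian byte order is an assumption carried over from the rest of the MHF packet format:

```go
package main

import (
	"bytes"
	"encoding/binary"
	"fmt"
)

// tenrouiraiDef mirrors the DataType=1 mission-definition frame:
// block (u8), mission type (u8), goal (u16), RP cost (u16), 6 skill bytes.
type tenrouiraiDef struct {
	Block   uint8
	Mission uint8
	Goal    uint16
	Cost    uint16
	Skills  [6]uint8
}

// buildFrame serializes one 12-byte frame; a full response would carry 33
// of these, one per hardcoded mission.
func buildFrame(d tenrouiraiDef) []byte {
	var buf bytes.Buffer
	_ = binary.Write(&buf, binary.BigEndian, d) // fixed-size struct: cannot fail
	return buf.Bytes()
}

func main() {
	frame := buildFrame(tenrouiraiDef{
		Block: 1, Mission: 5, Goal: 3000, Cost: 150,
		Skills: [6]uint8{80, 40, 40, 20, 40, 50},
	})
	fmt.Printf("% X (len %d)\n", frame, len(frame))
}
```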
+ +**DataType=4 response**: 1 frame: +``` +[uint8] Page — current mission page (1-based) +[uint16] Mission1 — aggregated guild score for slot 1 (capped to goal) +[uint16] Mission2 — aggregated guild score for slot 2 (capped to goal) +[uint16] Mission3 — aggregated guild score for slot 3 (capped to goal) +``` + +**DataType=5 response**: N frames, one per guild member with a non-null score: +``` +[int32] Score +[14 bytes] HunterName (null-padded) +``` + +**DataType=6 response**: 1 frame: +``` +[uint8] Unk0 — always 0 +[uint32] RP — guild's accumulated tower RP +[uint32] Unk2 — unknown; always 0 +``` + +**`TenrouiraiKeyScore`** (`Unk0 uint8, Unk1 int32`): defined and included in the +`Tenrouirai` struct but never written into or sent. Likely related to an unimplemented +DataType (possibly 3). Purpose unknown. + +**Current state on develop**: DataTypes 1, 4, 5, 6 implemented. DataType 2 (rewards) +returns empty. `Unk0`, `Unk4`, and `TenrouiraiKeyScore` are unresolved. + +--- + +### `MsgMhfPostTenrouirai` — Client → Server → Client + +Submits Tenrouirai results or donates guild RP. + +**Request** (`msg_mhf_post_tenrouirai.go`): +``` +AckHandle uint32 +Unk0 uint8 +Op uint8 — 1 = submit mission results, 2 = donate RP +GuildID uint32 +Unk1 uint8 — unknown + +Op=1 fields: + Floors uint16 — floors climbed this run + Antiques uint16 — antiques collected + Chests uint16 — chests opened + Cats uint16 — Felynes saved + TRP uint16 — TRP obtained + Slays uint16 — monsters slain + +Op=2 fields: + DonatedRP uint16 — RP to donate + PreviousRP uint16 — prior RP total (from client; used for display only?) + Unk2_0–3 uint16 — unknown; 4 reserved fields +``` + +**Critical gap — Op=1 does nothing**: the handler logs the fields in debug mode and +returns a success ACK, but **does not write any data to the database**. Mission scores +(`guild_characters.tower_mission_1/2/3`) are never updated from quest results. 
This means +Tenrouirai missions can never actually advance via normal gameplay — the `SUM` aggregation +in DataType=4 will always return zero. + +To fix: the handler needs to determine which of the three active missions the current run +contributes to (based on mission type and the run's stats), then write to the appropriate +`tower_mission_N` column. + +**Op=2 (RP donation)**: implemented. Deducts RP from character save data, updates +`guilds.tower_rp`, and advances the mission page when the cumulative donation threshold +is met. `Unk0`, `Unk1`, and `Unk2_0-3` are parsed but unused. + +**Current state on develop**: Op=2 fully implemented. Op=1 is a no-op. + +--- + +### `MsgMhfGetGemInfo` — Client → Server → Client + +Fetches gem inventory or gem acquisition history. + +**Request** (`msg_mhf_get_gem_info.go`): +``` +AckHandle uint32 +QueryType uint32 — 1=gem inventory, 2=gem history +Unk1 uint32 — unknown +Unk2–Unk6 int32 — unknown; 5 additional fields +``` + +**QueryType=1 response**: 30 frames (one per gem slot): +``` +[uint16] Gem — encoded as (tier << 8) | (index_within_tier + 1) +[uint16] Quantity +``` + +**QueryType=2 response**: N frames (gem history): +``` +[uint16] Gem +[uint16] Message — purpose unknown; likely a display string ID +[uint32] Timestamp — Unix timestamp +[14 bytes] Sender — null-padded character name +``` + +**Current state on develop**: QueryType=1 implemented via `towerRepo.GetGems()`. +QueryType=2 returns empty (the `GemHistory` slice is never populated). `Unk1`–`Unk6` +are parsed but unused; purpose unknown. + +--- + +### `MsgMhfPostGemInfo` — Client → Server → Client + +Adds or transfers gems. + +**Request** (`msg_mhf_post_gem_info.go`): +``` +AckHandle uint32 +Op uint32 — 1=add gem, 2=transfer gem +Unk1 uint32 — unknown +Gem int32 — gem ID encoded as (tier << 8) | (index+1) +Quantity int32 — amount +CID int32 — target character ID (Op=2 likely uses this) +Message int32 — display message ID? 
purpose unknown +Unk6 int32 — unknown +``` + +**Op=1 (add gem)**: implemented. Decodes the gem index from the `Gem` field, increments +the quantity in the CSV, and saves. Note: the index computation `(pkt.Gem >> 8 * 5) + (pkt.Gem - pkt.Gem&0xFF00 - 1%5)` may have operator precedence issues — verify with captures. + +**Op=2 (transfer gem)**: not implemented. Handler comment: *"no way im doing this for now"*. +The `CID` field likely identifies the recipient character. Format of the response (if any +acknowledgement is sent to the recipient) is unknown. + +**Current state on develop**: Op=1 implemented via `towerService.AddGem()`. Op=2 stub. +`Unk1`, `Message`, `Unk6` purposes unknown. + +--- + +### `MsgMhfPresentBox` — Client → Server → Client + +Fetches or claims items from the Tower present box (a reward inbox for seibatsu/Tower +milestone awards). This packet's field names differ between `develop` and `wip/tower`: + +**Request** — `wip/tower` naming (more accurate than develop's all-`Unk*` version): +``` +AckHandle uint32 +Unk0 uint32 +Operation uint32 — 1=open list, 2=claim item, 3=close +PresentCount uint32 — number of PresentType entries that follow +Unk3 uint32 — unknown +Unk4 uint32 — unknown +Unk5 uint32 — unknown +Unk6 uint32 — unknown +PresentType []uint32 — array of present type IDs (length = PresentCount) +``` + +On `develop`, `Operation` is `Unk1` and `PresentCount` is `Unk2` — the field is correctly +used to drive the `for` loop but the semantic name is lost. 
+ +**Response** for Op=1 and Op=2: N frames via `doAckEarthSucceed`, each: +``` +[uint32] ItemClaimIndex — unique claim ID +[int32] PresentType — echoes the request PresentType +[int32] Unk2–Unk7 — 6 unknown fields (always 0 in captures) +[int32] DistributionType — 7201=item, 7202=N-Points, 7203=guild contribution +[int32] ItemID +[int32] Amount +``` + +**Critical gap — no claim tracking**: `ItemClaimIndex` is a sequential ID that the client +uses for "claimed" state, but the server has no DB table or flag for it. Every call +returns the same hardcoded items, so a player can claim the same rewards repeatedly. + +**Op=3**: returns an empty buffer (close/dismiss). + +**Current state on develop**: handler returns an empty item list (`data` slice is nil). +The `wip/tower` branch has a working hardcoded handler (7 dummy items per `PresentType`) +with the correct response structure. `Unk0`, `Unk3`–`Unk6` purposes unknown. + +**Note**: `wip/tower`'s `MsgMhfPresentBox.Parse()` still contains `fmt.Printf` debug +print statements that must be removed before any merge. + +--- + +### `MsgMhfGetNotice` — Client → Server → Client + +Purpose unknown. Likely fetches in-lobby Tower notices or announcements. + +**Request** (`msg_mhf_get_notice.go`): +``` +AckHandle uint32 +Unk0 uint32 +Unk1 uint32 +Unk2 int32 +``` + +**Current state**: Stub on `develop` — returns `{0, 0, 0, 0}`. Response format unknown. + +--- + +### `MsgMhfPostNotice` — Client → Server → Client + +Purpose unknown. Likely submits a read-receipt or acknowledgement for a Tower notice. + +**Request** (`msg_mhf_post_notice.go`): +``` +AckHandle uint32 +Unk0 uint32 +Unk1 uint32 +Unk2 int32 +Unk3 int32 +``` + +**Current state**: Stub on `develop` — returns `{0, 0, 0, 0}`. Response format unknown. + +--- + +## What Is Already Working on Develop + +- Character tower data (TR, TRP, TSP, skills, floor counts) is read and written via the + full repository pattern. 
+- Skill levelling (InfoType=2) deducts TSP and increments the correct CSV index. +- Floor progress (InfoType=1,7) updates TR, TRP, TSP, and block1. +- All 33 Tenrouirai mission definitions are hardcoded and served correctly. +- Guild Tenrouirai progress (page, aggregated mission scores) is read and score-capped. +- Per-character Tenrouirai leaderboard (DataType=5) is read from DB. +- Guild tower RP donation (Op=2) deducts player RP, accumulates guild RP, and advances + the mission page when the threshold is met. +- Gem inventory (QueryType=1) is read and returned correctly. +- Gem add (Op=1) updates the CSV at the correct index. + +--- + +## What Needs RE or Implementation + +### Functional bugs (affect gameplay today) + +| Issue | Location | Notes | +|-------|----------|-------| +| `PostTenrouirai` Op=1 is a no-op | `handlers_tower.go` | Mission scores are never written; Tenrouirai cannot advance via normal play | +| `block2` never written | `repo_tower.go → UpdateProgress` | G7+ floor count not persisted; requires captures to confirm which InfoType sends it | +| `PresentBox` returns empty list | `handlers_tower.go` | No items are ever shown; `wip/tower` handler logic can be adapted | +| Present claim tracking absent | DB schema | No "claimed" flag; players can re-claim indefinitely once handler is populated | + +### Unknown packet fields (need captures to resolve) + +| Field | Packet | Notes | +|-------|--------|-------| +| `Unk9 int64` vs `TimeTaken int32 + CID int32` | `MsgMhfPostTowerInfo` | `wip/tower` splits this into two int32; likely correct — needs capture confirmation | +| `Unk1`, `Unk6`, `Unk7` | `MsgMhfPostTowerInfo` | Logged in debug but unused; may encode run metadata | +| `Unk0`, `Unk1` | `MsgMhfGetTowerInfo` | Never used in handler | +| `TowerInfoLevel.Unk1–Unk3` | `GetTowerInfo` InfoType=3,5 | 3 of 4 level-entry fields zero-filled; may be max floor, session count, display state | +| `TowerInfoHistory` 10 × int16 | `GetTowerInfo` InfoType=4 | Two 
groups of 5; semantics unknown (recent clear times? floor high scores?) | +| InfoType 3 vs 5 distinction | `MsgMhfGetTowerInfo` | Same code path; difference not understood | +| InfoType 7 vs 1 distinction | `MsgMhfPostTowerInfo` | Both update progress; difference not understood | +| `Unk0`, `Unk4` | `MsgMhfGetTenrouirai` | Always 0 in known captures | +| `TenrouiraiKeyScore` | `GetTenrouirai` | Struct defined, never sent; likely an unimplemented DataType | +| `Unk0`, `Unk1`, `Unk2_0–3` | `MsgMhfPostTenrouirai` | 6 parsed fields that are never used | +| `Unk1–Unk6` | `MsgMhfGetGemInfo` | 6 extra fields; may filter by tier, season, or character | +| `Message`, `Unk1`, `Unk6` | `MsgMhfPostGemInfo` | `Message` likely a display string ID for gem transfer notices | +| `Unk0`, `Unk3–Unk6` | `MsgMhfPresentBox` | 5 unknown request fields | +| All fields | `MsgMhfGetNotice`, `MsgMhfPostNotice` | Both packets entirely uncharacterized | + +### Missing features (require further RE + design) + +| Feature | Notes | +|---------|-------| +| Tenrouirai mission score submission | `PostTenrouirai` Op=1 needs to map run stats to the correct mission type and write `tower_mission_N` | +| Tenrouirai rewards (DataType=2) | `TenrouiraiReward` response format is known; item IDs and quantities are not | +| Gem transfer (PostGemInfo Op=2) | Recipient lookup via `CID`; likely requires a notification to the target session | +| Gem history (GetGemInfo QueryType=2) | Response structure is known; DB storage is not — would require a `gem_history` table | +| PresentBox claim tracking | Needs a `present_claims` table or a bitfield on the character | +| Notice system | Both Get/Post are stubs; may be Tower bulletin board or reward notifications | + +--- + +## Relation to the Conquest War Doc + +`docs/conquest-war.md` covers the `GetWeeklySeibatuRankingReward` handler which already +implements the Tower dure kill reward table (Op=5, ID 260001) and the Tower floor reward +table (Op=5, ID 260003). 
Those are not missing here — they live in `handlers_seibattle.go` +on the `feature/conquest` branch. When that branch is eventually integrated, ensure the +Tower floor reward data is preserved. diff --git a/docs/unimplemented.md b/docs/unimplemented.md index 1f8aa691d..ec50db3d2 100644 --- a/docs/unimplemented.md +++ b/docs/unimplemented.md @@ -15,7 +15,7 @@ All empty handlers carry an inline comment — `// stub: unimplemented` for real --- -## Unimplemented (68 handlers) +## Unimplemented (65 handlers) Grouped by handler file / game subsystem. Handlers with an open branch are marked **[branch]**. @@ -44,7 +44,7 @@ Grouped by handler file / game subsystem. Handlers with an open branch are marke | Handler | Notes | |---------|-------| -| `handleMsgMhfShutClient` | Server-initiated client disconnect — partial draft in `fix/clan-invites` | +| `handleMsgMhfShutClient` | Server-initiated client disconnect | | `handleMsgSysHideClient` | Hide a client session from the session list | ### Data / Auth (`handlers_data.go`) @@ -57,7 +57,7 @@ Grouped by handler file / game subsystem. Handlers with an open branch are marke | Handler | Notes | |---------|-------| -| `handleMsgMhfGetRestrictionEvent` | Fetch event-based gameplay restrictions — **[`feature/enum-event`]** (4 commits), **[`feature/event-tent`]** (30 commits, most mature) | +| `handleMsgMhfGetRestrictionEvent` | Fetch event-based gameplay restrictions — see `docs/fort-attack-event.md` | ### Guild (`handlers_guild.go`) @@ -77,7 +77,7 @@ Grouped by handler file / game subsystem. Handlers with an open branch are marke | Handler | Notes | |---------|-------| -| `handleMsgMhfStampcardPrize` | Claim a stamp card prize — **[`feature/event-tent`]** (campaign stamp work) | +| `handleMsgMhfStampcardPrize` | Claim a stamp card prize | ### Misc (`handlers_misc.go`) @@ -95,8 +95,6 @@ Grouped by handler file / game subsystem. Handlers with an open branch are marke All five mutex handlers are empty. 
MHF mutexes are distributed locks used for event coordination (Raviente, etc.). The server currently uses its own semaphore system instead. -**[`fix/mutex-rework`]** (2 commits) proposes replacing the semaphore system with proper -protocol-level mutex handling. | Handler | Notes | |---------|-------| @@ -125,12 +123,6 @@ secondary operations are stubs: | `handleMsgSysDispObject` | Display/show a previously hidden object | | `handleMsgSysHideObject` | Hide an object from other clients | -### Quests (`handlers_quest.go`) - -| Handler | Notes | -|---------|-------| -| `handleMsgMhfEnterTournamentQuest` | Enter a tournament-mode quest — **[`feature/hunting-tournament`]** (7 commits, actively rebased onto main) | - ### Register (`handlers_register.go`) | Handler | Notes | @@ -141,8 +133,6 @@ secondary operations are stubs: | Handler | Notes | |---------|-------| -| `handleMsgMhfUseRewardSong` | Use/activate a reward song buff — **[`feature/diva`]** | -| `handleMsgMhfAddRewardSongCount` | Increment reward song usage counter — **[`feature/diva`]** | | `handleMsgMhfAcceptReadReward` | Claim a reward for reading an in-game notice | ### Session (`handlers_session.go`) @@ -199,15 +189,10 @@ that needs no reply). Others are genuine feature gaps. 
| Branch | Commits ahead | Handlers targeted | |--------|:---:|-------------------| -| `feature/event-tent` | 30 | `GetRestrictionEvent`, `StampcardPrize` (campaign stamps) | -| `feature/hunting-tournament` | 7 | `EnterTournamentQuest` | -| `fix/mutex-rework` | 2 | All 5 Mutex handlers | -| `feature/enum-event` | 4 | `GetRestrictionEvent` | -| `feature/conquest` | 4 | Conquest quest handlers | -| `feature/tower` | 4 | Tower handlers | -| `feature/diva` | 6 | `UseRewardSong`, `AddRewardSongCount` | -| `feature/return-guild` | 1 | `UpdateGuild` | -| `fix/clan-invites` | 1 | `ShutClient` (unfinished draft) | +| `feature/enum-event` | 4 | `EnumerateEvent` scheduling only — not mergeable, see `docs/fort-attack-event.md` | +| `feature/conquest` | 4 | Conquest quest handlers — not mergeable, see `docs/conquest-war.md` | +| `feature/hunting-tournament` | 7 | `EnumerateRanking` / `EnumerateOrder` — not mergeable (duplicates handlers_tournament.go), see `docs/hunting-tournament.md` | +| `feature/tower` | 4 | Tower handlers — superseded by direct integration; see `docs/tower.md` for gaps | --- diff --git a/main.go b/main.go index 899d073a0..4082cebcc 100644 --- a/main.go +++ b/main.go @@ -1,6 +1,7 @@ package main import ( + "context" "errors" "flag" "fmt" @@ -106,7 +107,7 @@ func main() { } } - logger.Info(fmt.Sprintf("Starting Erupe (9.3.2-%s)", Commit())) + logger.Info(fmt.Sprintf("Starting Erupe (9.4.0-dev-%s)", Commit())) logger.Info(fmt.Sprintf("Client Mode: %s (%d)", config.ClientMode, config.RealClientMode)) if config.Database.Password == "" { @@ -335,6 +336,7 @@ func main() { si := 0 ci := 0 count := 1 + seenPorts := make(map[uint16]string) for j, ee := range config.Entrance.Entries { for i, ce := range ee.Channels { sid := (4096 + si*256) + (16 + ci) @@ -344,6 +346,13 @@ func main() { count++ continue } + if prev, exists := seenPorts[ce.Port]; exists { + preventClose(config, fmt.Sprintf("Channel %d: port %d already used by %s", count, ce.Port, prev)) + ci++ + count++ 
+ continue + } + seenPorts[ce.Port] = fmt.Sprintf("channel %d", count) c := *channelserver.NewServer(&channelserver.Config{ ID: uint16(sid), Logger: logger.Named("channel-" + fmt.Sprint(count)), @@ -393,8 +402,12 @@ func main() { <-sig if !config.DisableSoftCrash { - for i := 0; i < 10; i++ { - message := fmt.Sprintf("Shutting down in %d...", 10-i) + countdown := config.ShutdownCountdownSeconds + if countdown <= 0 { + countdown = 10 + } + for i := 0; i < countdown; i++ { + message := fmt.Sprintf("Shutting down in %d...", countdown-i) for _, c := range channels { c.BroadcastChatMessage(message) } @@ -409,8 +422,10 @@ func main() { } if config.Channel.Enabled { + drainCtx, drainCancel := context.WithTimeout(context.Background(), 30*time.Second) + defer drainCancel() for _, c := range channels { - c.Shutdown() + c.ShutdownAndDrain(drainCtx) } } diff --git a/network/mhfpacket/msg_batch_parse_test.go b/network/mhfpacket/msg_batch_parse_test.go index 8e97fe1e1..78b72c63f 100644 --- a/network/mhfpacket/msg_batch_parse_test.go +++ b/network/mhfpacket/msg_batch_parse_test.go @@ -420,7 +420,7 @@ func TestBatchParseMultiField(t *testing.T) { if err := pkt.Parse(bf, ctx); err != nil { t.Fatal(err) } - if pkt.Unk0 != 2 || pkt.Unk1 != 3 || pkt.Unk2 != 4 { + if pkt.QuestID != 2 || pkt.ItemType != 3 || pkt.Quantity != 4 { t.Error("field mismatch") } }) diff --git a/network/mhfpacket/msg_build_test.go b/network/mhfpacket/msg_build_test.go index 18e938e53..d572bee87 100644 --- a/network/mhfpacket/msg_build_test.go +++ b/network/mhfpacket/msg_build_test.go @@ -515,8 +515,8 @@ func TestBuildParseStateCampaign(t *testing.T) { if parsed.CampaignID != tt.campaignID { t.Errorf("CampaignID = %d, want %d", parsed.CampaignID, tt.campaignID) } - if parsed.Unk1 != tt.unk1 { - t.Errorf("Unk1 = %d, want %d", parsed.Unk1, tt.unk1) + if parsed.NullPadding != tt.unk1 { + t.Errorf("NullPadding = %d, want %d", parsed.NullPadding, tt.unk1) } }) } @@ -526,15 +526,14 @@ func 
TestBuildParseStateCampaign(t *testing.T) { // Build is NOT IMPLEMENTED, so we manually write the binary representation. func TestBuildParseApplyCampaign(t *testing.T) { tests := []struct { - name string - ackHandle uint32 - unk0 uint32 - unk1 uint16 - unk2 []byte + name string + ackHandle uint32 + campaignID uint32 + code string }{ - {"typical", 0x55667788, 5, 10, make([]byte, 16)}, - {"zero", 0, 0, 0, make([]byte, 16)}, - {"max", 0xFFFFFFFF, 0xFFFFFFFF, 0xFFFF, make([]byte, 16)}, + {"typical", 0x55667788, 5, "TESTCODE"}, + {"zero", 0, 0, ""}, + {"max", 0xFFFFFFFF, 0xFFFFFFFF, "MAXCODE"}, } ctx := &clientctx.ClientContext{RealClientMode: cfg.ZZ} @@ -542,9 +541,11 @@ func TestBuildParseApplyCampaign(t *testing.T) { t.Run(tt.name, func(t *testing.T) { bf := byteframe.NewByteFrame() bf.WriteUint32(tt.ackHandle) - bf.WriteUint32(tt.unk0) - bf.WriteUint16(tt.unk1) - bf.WriteBytes(tt.unk2) + bf.WriteUint32(tt.campaignID) + bf.WriteUint16(0) // zeroed + codeBytes := make([]byte, 16) + copy(codeBytes, []byte(tt.code)) + bf.WriteBytes(codeBytes) _, _ = bf.Seek(0, io.SeekStart) parsed := &MsgMhfApplyCampaign{} @@ -555,14 +556,11 @@ func TestBuildParseApplyCampaign(t *testing.T) { if parsed.AckHandle != tt.ackHandle { t.Errorf("AckHandle = 0x%X, want 0x%X", parsed.AckHandle, tt.ackHandle) } - if parsed.Unk0 != tt.unk0 { - t.Errorf("Unk0 = %d, want %d", parsed.Unk0, tt.unk0) + if parsed.CampaignID != tt.campaignID { + t.Errorf("CampaignID = %d, want %d", parsed.CampaignID, tt.campaignID) } - if parsed.Unk1 != tt.unk1 { - t.Errorf("Unk1 = %d, want %d", parsed.Unk1, tt.unk1) - } - if len(parsed.Unk2) != len(tt.unk2) { - t.Errorf("Unk2 len = %d, want %d", len(parsed.Unk2), len(tt.unk2)) + if parsed.Code != tt.code { + t.Errorf("Code = %q, want %q", parsed.Code, tt.code) } }) } diff --git a/network/mhfpacket/msg_mhf_acquire_item.go b/network/mhfpacket/msg_mhf_acquire_item.go index de86ce846..8bdb65fc9 100644 --- a/network/mhfpacket/msg_mhf_acquire_item.go +++ 
b/network/mhfpacket/msg_mhf_acquire_item.go @@ -11,9 +11,7 @@ import ( // MsgMhfAcquireItem represents the MSG_MHF_ACQUIRE_ITEM type MsgMhfAcquireItem struct { AckHandle uint32 - Unk0 uint16 - Length uint16 - Unk1 []uint32 + RewardIDs []uint32 } // Opcode returns the ID associated with this packet type. @@ -24,10 +22,10 @@ // Parse parses the packet from binary func (m *MsgMhfAcquireItem) Parse(bf *byteframe.ByteFrame, ctx *clientctx.ClientContext) error { m.AckHandle = bf.ReadUint32() - m.Unk0 = bf.ReadUint16() - m.Length = bf.ReadUint16() - for i := 0; i < int(m.Length); i++ { - m.Unk1 = append(m.Unk1, bf.ReadUint32()) + bf.ReadUint16() // Zeroed + ids := bf.ReadUint16() + for i := uint16(0); i < ids; i++ { + m.RewardIDs = append(m.RewardIDs, bf.ReadUint32()) } return nil } diff --git a/network/mhfpacket/msg_mhf_add_reward_song_count.go index fa6afc236..20db789a1 100644 --- a/network/mhfpacket/msg_mhf_add_reward_song_count.go +++ b/network/mhfpacket/msg_mhf_add_reward_song_count.go @@ -1,27 +1,53 @@ package mhfpacket import ( - "errors" "erupe-ce/common/byteframe" "erupe-ce/network" "erupe-ce/network/clientctx" ) -// MsgMhfAddRewardSongCount represents the MSG_MHF_ADD_REWARD_SONG_COUNT -type MsgMhfAddRewardSongCount struct{} +// MsgMhfAddRewardSongCount represents the MSG_MHF_ADD_REWARD_SONG_COUNT packet. +// Request layout: +// +// u32 ack_handle +// u32 prayer_id +// u16 array_size_bytes (= count × 2) +// u8 count +// u16[count] entries +type MsgMhfAddRewardSongCount struct { + AckHandle uint32 + PrayerID uint32 + ArraySizeBytes uint16 + Count uint8 + Entries []uint16 +} // Opcode returns the ID associated with this packet type. func (m *MsgMhfAddRewardSongCount) Opcode() network.PacketID { return network.MSG_MHF_ADD_REWARD_SONG_COUNT } -// Parse parses the packet from binary +// Parse parses the packet from binary.
func (m *MsgMhfAddRewardSongCount) Parse(bf *byteframe.ByteFrame, ctx *clientctx.ClientContext) error { - return errors.New("NOT IMPLEMENTED") + m.AckHandle = bf.ReadUint32() + m.PrayerID = bf.ReadUint32() + m.ArraySizeBytes = bf.ReadUint16() + m.Count = bf.ReadUint8() + m.Entries = make([]uint16, m.Count) + for i := range m.Entries { + m.Entries[i] = bf.ReadUint16() + } + return bf.Err() } // Build builds a binary packet from the current data. func (m *MsgMhfAddRewardSongCount) Build(bf *byteframe.ByteFrame, ctx *clientctx.ClientContext) error { - return errors.New("NOT IMPLEMENTED") + bf.WriteUint32(m.AckHandle) + bf.WriteUint32(m.PrayerID) + bf.WriteUint16(uint16(len(m.Entries) * 2)) + bf.WriteUint8(uint8(len(m.Entries))) + for _, e := range m.Entries { + bf.WriteUint16(e) + } + return nil } diff --git a/network/mhfpacket/msg_mhf_apply_campaign.go b/network/mhfpacket/msg_mhf_apply_campaign.go index b39dab499..9eac62af7 100644 --- a/network/mhfpacket/msg_mhf_apply_campaign.go +++ b/network/mhfpacket/msg_mhf_apply_campaign.go @@ -2,6 +2,7 @@ package mhfpacket import ( "errors" + "erupe-ce/common/bfutil" "erupe-ce/common/byteframe" "erupe-ce/network" "erupe-ce/network/clientctx" @@ -9,10 +10,9 @@ import ( // MsgMhfApplyCampaign represents the MSG_MHF_APPLY_CAMPAIGN type MsgMhfApplyCampaign struct { - AckHandle uint32 - Unk0 uint32 - Unk1 uint16 - Unk2 []byte + AckHandle uint32 + CampaignID uint32 + Code string } // Opcode returns the ID associated with this packet type. 
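The new `Code` field in `MsgMhfApplyCampaign` above travels as a fixed 16-byte, NUL-padded region on the wire. A standalone sketch of the round-trip — using `bytes.IndexByte` in place of the project's `bfutil.UpToNull` helper, whose exact behaviour is an assumption here:

```go
package main

import (
	"bytes"
	"fmt"
)

// encodeCode pads a campaign code into the fixed 16-byte wire field.
func encodeCode(code string) [16]byte {
	var buf [16]byte
	copy(buf[:], code) // silently truncates codes longer than 16 bytes
	return buf
}

// decodeCode trims at the first NUL byte — the assumed behaviour of
// bfutil.UpToNull in the patch above.
func decodeCode(buf [16]byte) string {
	if i := bytes.IndexByte(buf[:], 0); i >= 0 {
		return string(buf[:i])
	}
	return string(buf[:])
}

func main() {
	fmt.Println(decodeCode(encodeCode("TESTCODE"))) // TESTCODE
}
```

This is also why the test above builds its buffer with `copy(codeBytes, []byte(tt.code))` into a zeroed 16-byte slice: short codes are padded, and the parser must stop at the padding.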
@@ -23,9 +23,9 @@ func (m *MsgMhfApplyCampaign) Opcode() network.PacketID { // Parse parses the packet from binary func (m *MsgMhfApplyCampaign) Parse(bf *byteframe.ByteFrame, ctx *clientctx.ClientContext) error { m.AckHandle = bf.ReadUint32() - m.Unk0 = bf.ReadUint32() - m.Unk1 = bf.ReadUint16() - m.Unk2 = bf.ReadBytes(16) + m.CampaignID = bf.ReadUint32() + bf.ReadUint16() // Zeroed + m.Code = string(bfutil.UpToNull(bf.ReadBytes(16))) return nil } diff --git a/network/mhfpacket/msg_mhf_enter_tournament_quest.go b/network/mhfpacket/msg_mhf_enter_tournament_quest.go index 84b3f99f2..d4c0730a7 100644 --- a/network/mhfpacket/msg_mhf_enter_tournament_quest.go +++ b/network/mhfpacket/msg_mhf_enter_tournament_quest.go @@ -8,8 +8,33 @@ import ( "erupe-ce/network/clientctx" ) -// MsgMhfEnterTournamentQuest represents the MSG_MHF_ENTER_TOURNAMENT_QUEST -type MsgMhfEnterTournamentQuest struct{} +// MsgMhfEnterTournamentQuest represents the MSG_MHF_ENTER_TOURNAMENT_QUEST (opcode 0x00D2). +// +// Wire format derived from mhfo-hd.dll binary analysis (FUN_114f4280 = putEnterTournamentQuest). +// The client sends this packet when entering the actual tournament quest instance after +// completing the ENTRY_TOURNAMENT (0xD1) flow. Fields are all big-endian. 
+// +// Byte layout (after opcode): +// +// [0..3] uint32 AckHandle +// [4..7] uint32 TournamentID — tournament being entered +// [8..11] uint32 EntryHandle — slot handle assigned by server during ENTRY_TOURNAMENT response +// [12..15] uint32 Unk2 — third field from server INFO response; semantics unclear +// [16..19] uint32 QuestSlot — derived from quest table (DAT_1e41d3b4); effectively uint16 in uint32 +// [20..23] uint32 StageHandle — quest node offset (DAT_1e41d3b8); computed as quest_node + 0x10 +// [24..27] uint32 Unk5 — formatted string identifier (result of FUN_11586310) +// [28] uint8 StringLen — length of optional trailing string (0 = absent in normal flow) +// [29+] bytes String — pascal-style string data (StringLen bytes, absent when 0) +type MsgMhfEnterTournamentQuest struct { + AckHandle uint32 + TournamentID uint32 + EntryHandle uint32 + Unk2 uint32 + QuestSlot uint32 + StageHandle uint32 + Unk5 uint32 + String []byte // pascal-style: 1-byte length prefix, then data; nil when absent +} // Opcode returns the ID associated with this packet type. func (m *MsgMhfEnterTournamentQuest) Opcode() network.PacketID { @@ -18,7 +43,18 @@ func (m *MsgMhfEnterTournamentQuest) Opcode() network.PacketID { // Parse parses the packet from binary func (m *MsgMhfEnterTournamentQuest) Parse(bf *byteframe.ByteFrame, ctx *clientctx.ClientContext) error { - return errors.New("NOT IMPLEMENTED") + m.AckHandle = bf.ReadUint32() + m.TournamentID = bf.ReadUint32() + m.EntryHandle = bf.ReadUint32() + m.Unk2 = bf.ReadUint32() + m.QuestSlot = bf.ReadUint32() + m.StageHandle = bf.ReadUint32() + m.Unk5 = bf.ReadUint32() + strLen := bf.ReadUint8() + if strLen > 0 { + m.String = bf.ReadBytes(uint(strLen)) + } + return nil } // Build builds a binary packet from the current data. 
diff --git a/network/mhfpacket/msg_mhf_enumerate_item.go b/network/mhfpacket/msg_mhf_enumerate_item.go index fea4f3371..21b670599 100644 --- a/network/mhfpacket/msg_mhf_enumerate_item.go +++ b/network/mhfpacket/msg_mhf_enumerate_item.go @@ -11,8 +11,6 @@ import ( // MsgMhfEnumerateItem represents the MSG_MHF_ENUMERATE_ITEM type MsgMhfEnumerateItem struct { AckHandle uint32 - Unk0 uint16 - Unk1 uint16 CampaignID uint32 } @@ -24,8 +22,8 @@ func (m *MsgMhfEnumerateItem) Opcode() network.PacketID { // Parse parses the packet from binary func (m *MsgMhfEnumerateItem) Parse(bf *byteframe.ByteFrame, ctx *clientctx.ClientContext) error { m.AckHandle = bf.ReadUint32() - m.Unk0 = bf.ReadUint16() - m.Unk1 = bf.ReadUint16() + bf.ReadUint16() // Zeroed + bf.ReadUint16() // Always 2 m.CampaignID = bf.ReadUint32() return nil } diff --git a/network/mhfpacket/msg_mhf_enumerate_order.go b/network/mhfpacket/msg_mhf_enumerate_order.go index bf4fa7abf..e88402b15 100644 --- a/network/mhfpacket/msg_mhf_enumerate_order.go +++ b/network/mhfpacket/msg_mhf_enumerate_order.go @@ -11,8 +11,8 @@ import ( // MsgMhfEnumerateOrder represents the MSG_MHF_ENUMERATE_ORDER type MsgMhfEnumerateOrder struct { AckHandle uint32 - Unk0 uint32 - Unk1 uint32 + EventID uint32 + ClanID uint32 } // Opcode returns the ID associated with this packet type. 
@@ -23,8 +23,8 @@ func (m *MsgMhfEnumerateOrder) Opcode() network.PacketID { // Parse parses the packet from binary func (m *MsgMhfEnumerateOrder) Parse(bf *byteframe.ByteFrame, ctx *clientctx.ClientContext) error { m.AckHandle = bf.ReadUint32() - m.Unk0 = bf.ReadUint32() - m.Unk1 = bf.ReadUint32() + m.EventID = bf.ReadUint32() + m.ClanID = bf.ReadUint32() return nil } diff --git a/network/mhfpacket/msg_mhf_get_daily_mission_master.go b/network/mhfpacket/msg_mhf_get_daily_mission_master.go index f844ce171..9bfc6fad4 100644 --- a/network/mhfpacket/msg_mhf_get_daily_mission_master.go +++ b/network/mhfpacket/msg_mhf_get_daily_mission_master.go @@ -8,17 +8,22 @@ import ( "erupe-ce/network/clientctx" ) -// MsgMhfGetDailyMissionMaster represents the MSG_MHF_GET_DAILY_MISSION_MASTER -type MsgMhfGetDailyMissionMaster struct{} +// MsgMhfGetDailyMissionMaster requests the server-side daily mission master list. +// Full request payload beyond the AckHandle is not yet reverse-engineered. +type MsgMhfGetDailyMissionMaster struct { + AckHandle uint32 +} // Opcode returns the ID associated with this packet type. func (m *MsgMhfGetDailyMissionMaster) Opcode() network.PacketID { return network.MSG_MHF_GET_DAILY_MISSION_MASTER } -// Parse parses the packet from binary +// Parse parses the packet from binary. +// Only the AckHandle is parsed; additional fields are unknown. func (m *MsgMhfGetDailyMissionMaster) Parse(bf *byteframe.ByteFrame, ctx *clientctx.ClientContext) error { - return errors.New("NOT IMPLEMENTED") + m.AckHandle = bf.ReadUint32() + return nil } // Build builds a binary packet from the current data. 
diff --git a/network/mhfpacket/msg_mhf_get_daily_mission_personal.go b/network/mhfpacket/msg_mhf_get_daily_mission_personal.go index 58088f7b5..3055ca659 100644 --- a/network/mhfpacket/msg_mhf_get_daily_mission_personal.go +++ b/network/mhfpacket/msg_mhf_get_daily_mission_personal.go @@ -8,17 +8,22 @@ import ( "erupe-ce/network/clientctx" ) -// MsgMhfGetDailyMissionPersonal represents the MSG_MHF_GET_DAILY_MISSION_PERSONAL -type MsgMhfGetDailyMissionPersonal struct{} +// MsgMhfGetDailyMissionPersonal requests the character's personal daily mission progress. +// Full request payload beyond the AckHandle is not yet reverse-engineered. +type MsgMhfGetDailyMissionPersonal struct { + AckHandle uint32 +} // Opcode returns the ID associated with this packet type. func (m *MsgMhfGetDailyMissionPersonal) Opcode() network.PacketID { return network.MSG_MHF_GET_DAILY_MISSION_PERSONAL } -// Parse parses the packet from binary +// Parse parses the packet from binary. +// Only the AckHandle is parsed; additional fields are unknown. func (m *MsgMhfGetDailyMissionPersonal) Parse(bf *byteframe.ByteFrame, ctx *clientctx.ClientContext) error { - return errors.New("NOT IMPLEMENTED") + m.AckHandle = bf.ReadUint32() + return nil } // Build builds a binary packet from the current data. diff --git a/network/mhfpacket/msg_mhf_set_daily_mission_personal.go b/network/mhfpacket/msg_mhf_set_daily_mission_personal.go index 4b5da5c52..d19ef2fcb 100644 --- a/network/mhfpacket/msg_mhf_set_daily_mission_personal.go +++ b/network/mhfpacket/msg_mhf_set_daily_mission_personal.go @@ -8,17 +8,22 @@ import ( "erupe-ce/network/clientctx" ) -// MsgMhfSetDailyMissionPersonal represents the MSG_MHF_SET_DAILY_MISSION_PERSONAL -type MsgMhfSetDailyMissionPersonal struct{} +// MsgMhfSetDailyMissionPersonal writes the character's personal daily mission progress. +// Full request payload beyond the AckHandle is not yet reverse-engineered. 
+type MsgMhfSetDailyMissionPersonal struct { + AckHandle uint32 +} // Opcode returns the ID associated with this packet type. func (m *MsgMhfSetDailyMissionPersonal) Opcode() network.PacketID { return network.MSG_MHF_SET_DAILY_MISSION_PERSONAL } -// Parse parses the packet from binary +// Parse parses the packet from binary. +// Only the AckHandle is parsed; additional fields are unknown. func (m *MsgMhfSetDailyMissionPersonal) Parse(bf *byteframe.ByteFrame, ctx *clientctx.ClientContext) error { - return errors.New("NOT IMPLEMENTED") + m.AckHandle = bf.ReadUint32() + return nil } // Build builds a binary packet from the current data. diff --git a/network/mhfpacket/msg_mhf_state_campaign.go b/network/mhfpacket/msg_mhf_state_campaign.go index ab6342c55..a34b72981 100644 --- a/network/mhfpacket/msg_mhf_state_campaign.go +++ b/network/mhfpacket/msg_mhf_state_campaign.go @@ -10,9 +10,9 @@ import ( // MsgMhfStateCampaign represents the MSG_MHF_STATE_CAMPAIGN type MsgMhfStateCampaign struct { - AckHandle uint32 - CampaignID uint32 - Unk1 uint16 + AckHandle uint32 + CampaignID uint32 + NullPadding uint16 } // Opcode returns the ID associated with this packet type. 
@@ -24,7 +24,7 @@ func (m *MsgMhfStateCampaign) Opcode() network.PacketID { // Parse parses the packet from binary func (m *MsgMhfStateCampaign) Parse(bf *byteframe.ByteFrame, ctx *clientctx.ClientContext) error { m.AckHandle = bf.ReadUint32() m.CampaignID = bf.ReadUint32() - m.Unk1 = bf.ReadUint16() + m.NullPadding = bf.ReadUint16() // 0 in Z2 return nil } diff --git a/network/mhfpacket/msg_mhf_transfer_item.go index 69dfdb13f..a98ae7bb9 100644 --- a/network/mhfpacket/msg_mhf_transfer_item.go +++ b/network/mhfpacket/msg_mhf_transfer_item.go @@ -11,12 +11,9 @@ import ( // MsgMhfTransferItem represents the MSG_MHF_TRANSFER_ITEM type MsgMhfTransferItem struct { AckHandle uint32 - // looking at packets, these were static across sessions and did not actually - // correlate with any item IDs that would make sense to get after quests so - // I have no idea what this actually does - Unk0 uint32 - Unk1 uint8 - Unk2 uint16 + QuestID uint32 + ItemType uint8 + Quantity uint16 } // Opcode returns the ID associated with this packet type.
@@ -27,10 +24,10 @@ func (m *MsgMhfTransferItem) Opcode() network.PacketID { // Parse parses the packet from binary func (m *MsgMhfTransferItem) Parse(bf *byteframe.ByteFrame, ctx *clientctx.ClientContext) error { m.AckHandle = bf.ReadUint32() - m.Unk0 = bf.ReadUint32() - m.Unk1 = bf.ReadUint8() + m.QuestID = bf.ReadUint32() + m.ItemType = bf.ReadUint8() bf.ReadUint8() // Zeroed - m.Unk2 = bf.ReadUint16() + m.Quantity = bf.ReadUint16() return nil } diff --git a/network/mhfpacket/msg_parse_coverage_test.go b/network/mhfpacket/msg_parse_coverage_test.go index 974089cef..2dcb42d46 100644 --- a/network/mhfpacket/msg_parse_coverage_test.go +++ b/network/mhfpacket/msg_parse_coverage_test.go @@ -102,8 +102,8 @@ func TestParseCoverage_VariableLength(t *testing.T) { if err := pkt.Parse(parsed, ctx); err != nil { t.Errorf("Parse() error: %v", err) } - if len(pkt.Unk1) != 2 { - t.Errorf("expected 2 items, got %d", len(pkt.Unk1)) + if len(pkt.RewardIDs) != 2 { + t.Errorf("expected 2 items, got %d", len(pkt.RewardIDs)) } }) diff --git a/network/mhfpacket/msg_parse_small_test.go b/network/mhfpacket/msg_parse_small_test.go index 5bf69d474..1db2e9bc4 100644 --- a/network/mhfpacket/msg_parse_small_test.go +++ b/network/mhfpacket/msg_parse_small_test.go @@ -18,13 +18,9 @@ func TestParseSmallNotImplemented(t *testing.T) { }{ // MHF packets - NOT IMPLEMENTED {"MsgMhfAcceptReadReward", &MsgMhfAcceptReadReward{}}, - {"MsgMhfAddRewardSongCount", &MsgMhfAddRewardSongCount{}}, {"MsgMhfDebugPostValue", &MsgMhfDebugPostValue{}}, - {"MsgMhfEnterTournamentQuest", &MsgMhfEnterTournamentQuest{}}, {"MsgMhfGetCaAchievementHist", &MsgMhfGetCaAchievementHist{}}, {"MsgMhfGetCaUniqueID", &MsgMhfGetCaUniqueID{}}, - {"MsgMhfGetDailyMissionMaster", &MsgMhfGetDailyMissionMaster{}}, - {"MsgMhfGetDailyMissionPersonal", &MsgMhfGetDailyMissionPersonal{}}, {"MsgMhfGetRestrictionEvent", &MsgMhfGetRestrictionEvent{}}, {"MsgMhfKickExportForce", &MsgMhfKickExportForce{}}, {"MsgMhfPaymentAchievement", 
&MsgMhfPaymentAchievement{}}, @@ -33,7 +29,6 @@ func TestParseSmallNotImplemented(t *testing.T) { {"MsgMhfResetAchievement", &MsgMhfResetAchievement{}}, {"MsgMhfResetTitle", &MsgMhfResetTitle{}}, {"MsgMhfSetCaAchievement", &MsgMhfSetCaAchievement{}}, - {"MsgMhfSetDailyMissionPersonal", &MsgMhfSetDailyMissionPersonal{}}, {"MsgMhfSetUdTacticsFollower", &MsgMhfSetUdTacticsFollower{}}, {"MsgMhfStampcardPrize", &MsgMhfStampcardPrize{}}, {"MsgMhfUpdateForceGuildRank", &MsgMhfUpdateForceGuildRank{}}, diff --git a/server/api/api_server.go b/server/api/api_server.go index 557d4611d..5f03f0ac4 100644 --- a/server/api/api_server.go +++ b/server/api/api_server.go @@ -86,6 +86,7 @@ func (s *APIServer) Start() error { v2.HandleFunc("/version", s.Version).Methods("GET") v2.HandleFunc("/health", s.Health).Methods("GET") v2.HandleFunc("/server/status", s.ServerStatus).Methods("GET") + v2.HandleFunc("/server/info", s.ServerInfo).Methods("GET") // V2 authenticated routes v2Auth := v2.PathPrefix("").Subrouter() @@ -94,6 +95,7 @@ func (s *APIServer) Start() error { v2Auth.HandleFunc("/characters/{id}/delete", s.DeleteCharacter).Methods("POST") v2Auth.HandleFunc("/characters/{id}", s.DeleteCharacter).Methods("DELETE") v2Auth.HandleFunc("/characters/{id}/export", s.ExportSave).Methods("GET") + v2Auth.HandleFunc("/characters/{id}/import", s.ImportSave).Methods("POST") handler := handlers.CORS( handlers.AllowedHeaders([]string{"Content-Type", "Authorization"}), diff --git a/server/api/dashboard_test.go b/server/api/dashboard_test.go new file mode 100644 index 000000000..babfe98cc --- /dev/null +++ b/server/api/dashboard_test.go @@ -0,0 +1,153 @@ +package api + +import ( + "encoding/json" + "net/http" + "net/http/httptest" + "strings" + "testing" + "time" +) + +// TestDashboardStatsJSON_NoDB verifies the stats endpoint returns valid JSON +// with safe zero values when no database is configured. 
+func TestDashboardStatsJSON_NoDB(t *testing.T) { + logger := NewTestLogger(t) + defer func() { _ = logger.Sync() }() + + server := &APIServer{ + logger: logger, + erupeConfig: NewTestConfig(), + startTime: time.Now().Add(-5 * time.Minute), + // db intentionally nil + } + + req := httptest.NewRequest(http.MethodGet, "/api/dashboard/stats", nil) + rec := httptest.NewRecorder() + + server.DashboardStatsJSON(rec, req) + + if rec.Code != http.StatusOK { + t.Errorf("Expected status %d, got %d", http.StatusOK, rec.Code) + } + + ct := rec.Header().Get("Content-Type") + if !strings.HasPrefix(ct, "application/json") { + t.Errorf("Expected Content-Type application/json, got %q", ct) + } + + var stats DashboardStats + if err := json.NewDecoder(rec.Body).Decode(&stats); err != nil { + t.Fatalf("Failed to decode response: %v", err) + } + + // Verify required fields are present and have expected zero-DB values. + if stats.ServerVersion == "" { + t.Error("Expected non-empty ServerVersion") + } + if stats.Uptime == "" || stats.Uptime == "unknown" { + // startTime is set so uptime should be computed, not "unknown". + t.Errorf("Expected computed uptime, got %q", stats.Uptime) + } + if stats.TotalAccounts != 0 { + t.Errorf("Expected TotalAccounts=0 without DB, got %d", stats.TotalAccounts) + } + if stats.TotalCharacters != 0 { + t.Errorf("Expected TotalCharacters=0 without DB, got %d", stats.TotalCharacters) + } + if stats.OnlinePlayers != 0 { + t.Errorf("Expected OnlinePlayers=0 without DB, got %d", stats.OnlinePlayers) + } + if stats.DatabaseOK { + t.Error("Expected DatabaseOK=false without DB") + } + if stats.Channels != nil { + t.Errorf("Expected nil Channels without DB, got %v", stats.Channels) + } +} + +// TestDashboardStatsJSON_UptimeUnknown verifies "unknown" uptime when startTime is zero. 
+func TestDashboardStatsJSON_UptimeUnknown(t *testing.T) { + logger := NewTestLogger(t) + defer func() { _ = logger.Sync() }() + + server := &APIServer{ + logger: logger, + erupeConfig: NewTestConfig(), + // startTime is zero value + } + + req := httptest.NewRequest(http.MethodGet, "/api/dashboard/stats", nil) + rec := httptest.NewRecorder() + + server.DashboardStatsJSON(rec, req) + + if rec.Code != http.StatusOK { + t.Errorf("Expected status %d, got %d", http.StatusOK, rec.Code) + } + + var stats DashboardStats + if err := json.NewDecoder(rec.Body).Decode(&stats); err != nil { + t.Fatalf("Failed to decode response: %v", err) + } + + if stats.Uptime != "unknown" { + t.Errorf("Expected Uptime='unknown' for zero startTime, got %q", stats.Uptime) + } +} + +// TestDashboardStatsJSON_JSONShape validates every field of the DashboardStats payload. +func TestDashboardStatsJSON_JSONShape(t *testing.T) { + logger := NewTestLogger(t) + defer func() { _ = logger.Sync() }() + + server := &APIServer{ + logger: logger, + erupeConfig: NewTestConfig(), + startTime: time.Now(), + } + + req := httptest.NewRequest(http.MethodGet, "/api/dashboard/stats", nil) + rec := httptest.NewRecorder() + + server.DashboardStatsJSON(rec, req) + + // Decode into a raw map so we can check key presence independent of type. + var raw map[string]interface{} + if err := json.NewDecoder(rec.Body).Decode(&raw); err != nil { + t.Fatalf("Failed to decode response as raw map: %v", err) + } + + requiredKeys := []string{ + "uptime", "serverVersion", "clientMode", + "onlinePlayers", "totalAccounts", "totalCharacters", + "databaseOK", + } + for _, key := range requiredKeys { + if _, ok := raw[key]; !ok { + t.Errorf("Missing required JSON key %q", key) + } + } +} + +// TestFormatDuration covers the human-readable duration formatter. 
+func TestFormatDuration(t *testing.T) { + tests := []struct { + d time.Duration + want string + }{ + {10 * time.Second, "10s"}, + {90 * time.Second, "1m 30s"}, + {2*time.Hour + 15*time.Minute + 5*time.Second, "2h 15m 5s"}, + {25*time.Hour + 3*time.Minute + 0*time.Second, "1d 1h 3m 0s"}, + } + + for _, tt := range tests { + t.Run(tt.want, func(t *testing.T) { + got := formatDuration(tt.d) + if got != tt.want { + t.Errorf("formatDuration(%v) = %q, want %q", tt.d, got, tt.want) + } + }) + } +} diff --git a/server/api/endpoints.go b/server/api/endpoints.go index 50adb9e37..29c4ae71e 100644 --- a/server/api/endpoints.go +++ b/server/api/endpoints.go @@ -2,13 +2,16 @@ package api import ( "context" + "crypto/sha256" "database/sql" + "encoding/base64" "encoding/json" "encoding/xml" "errors" "erupe-ce/common/gametime" "erupe-ce/common/mhfcourse" cfg "erupe-ce/config" + "erupe-ce/server/channelserver/compression/nullcomp" "fmt" "image" "image/jpeg" @@ -153,6 +156,32 @@ func (s *APIServer) Version(w http.ResponseWriter, r *http.Request) { _ = json.NewEncoder(w).Encode(resp) } +// ServerInfoResponse is the JSON payload returned by GET /v2/server/info. +// It exposes the server's configured game version in a form that launcher +// tools (e.g. mhf-outpost) can use to check version compatibility. +type ServerInfoResponse struct { + // ClientMode is the version string as configured in Erupe (e.g. "ZZ", "G10.1"). + ClientMode string `json:"clientMode"` + // ManifestID is the normalized form of ClientMode (lowercase, dots removed) + // matching mhf-outpost manifest IDs (e.g. "zz", "g101"). + ManifestID string `json:"manifestId"` + // Name is the server software name. + Name string `json:"name"` +} + +// ServerInfo handles GET /v2/server/info, returning the server's configured +// game version in a format compatible with mhf-outpost manifest IDs. 
+func (s *APIServer) ServerInfo(w http.ResponseWriter, r *http.Request) { + clientMode := s.erupeConfig.ClientMode + resp := ServerInfoResponse{ + ClientMode: clientMode, + ManifestID: strings.ToLower(strings.ReplaceAll(clientMode, ".", "")), + Name: "Erupe-CE", + } + w.Header().Set("Content-Type", "application/json") + _ = json.NewEncoder(w).Encode(resp) +} + // Launcher handles GET /launcher and returns banners, messages, and links for the launcher UI. func (s *APIServer) Launcher(w http.ResponseWriter, r *http.Request) { var respData LauncherResponse @@ -590,3 +619,151 @@ func (s *APIServer) Health(w http.ResponseWriter, r *http.Request) { "status": "ok", }) } + +// ImportSave handles POST /v2/characters/{id}/import. +// The request body must contain a one-time import_token (granted by an admin +// via saveutil) plus a character export blob in the same format as ExportSave. +func (s *APIServer) ImportSave(w http.ResponseWriter, r *http.Request) { + ctx := r.Context() + userID, _ := UserIDFromContext(ctx) + + var charID uint32 + if _, err := fmt.Sscanf(mux.Vars(r)["id"], "%d", &charID); err != nil { + writeError(w, http.StatusBadRequest, "invalid_request", "Invalid character ID") + return + } + + var req struct { + ImportToken string `json:"import_token"` + Character map[string]interface{} `json:"character"` + } + if err := json.NewDecoder(r.Body).Decode(&req); err != nil { + writeError(w, http.StatusBadRequest, "invalid_request", "Malformed request body") + return + } + if req.ImportToken == "" { + writeError(w, http.StatusBadRequest, "missing_token", "import_token is required") + return + } + + blobs, err := saveBlobsFromMap(req.Character) + if err != nil { + s.logger.Warn("ImportSave: failed to extract blobs", zap.Error(err), zap.Uint32("charID", charID)) + writeError(w, http.StatusBadRequest, "invalid_request", "Invalid save data: "+err.Error()) + return + } + + // Compute savedata hash server-side. 
+ if len(blobs.Savedata) > 0 { + decompressed, err := nullcomp.Decompress(blobs.Savedata) + if err != nil { + writeError(w, http.StatusBadRequest, "invalid_request", "savedata decompression failed") + return + } + h := sha256.Sum256(decompressed) + blobs.SavedataHash = h[:] + } + + if err := s.charRepo.ImportSave(ctx, charID, userID, req.ImportToken, blobs); err != nil { + s.logger.Warn("ImportSave: failed", zap.Error(err), zap.Uint32("charID", charID)) + writeError(w, http.StatusForbidden, "import_denied", "Import token invalid, expired, or character not owned by user") + return + } + + s.logger.Info("ImportSave: save imported successfully", zap.Uint32("charID", charID), zap.Uint32("userID", userID)) + w.WriteHeader(http.StatusOK) +} + +// saveBlobsFromMap extracts save blob columns from an export character map. +// Values must be base64-encoded strings (as produced by json.Marshal on []byte). +func saveBlobsFromMap(m map[string]interface{}) (SaveBlobs, error) { + var b SaveBlobs + var err error + b.Savedata, err = extractBlob(m, "savedata") + if err != nil { + return b, err + } + b.Decomyset, err = extractBlob(m, "decomyset") + if err != nil { + return b, err + } + b.Hunternavi, err = extractBlob(m, "hunternavi") + if err != nil { + return b, err + } + b.Otomoairou, err = extractBlob(m, "otomoairou") + if err != nil { + return b, err + } + b.Partner, err = extractBlob(m, "partner") + if err != nil { + return b, err + } + b.Platebox, err = extractBlob(m, "platebox") + if err != nil { + return b, err + } + b.Platedata, err = extractBlob(m, "platedata") + if err != nil { + return b, err + } + b.Platemyset, err = extractBlob(m, "platemyset") + if err != nil { + return b, err + } + b.Rengokudata, err = extractBlob(m, "rengokudata") + if err != nil { + return b, err + } + b.Savemercenary, err = extractBlob(m, "savemercenary") + if err != nil { + return b, err + } + b.GachaItems, err = extractBlob(m, "gacha_items") + if err != nil { + return b, err + } + b.HouseInfo, 
err = extractBlob(m, "house_info") + if err != nil { + return b, err + } + b.LoginBoost, err = extractBlob(m, "login_boost") + if err != nil { + return b, err + } + b.SkinHist, err = extractBlob(m, "skin_hist") + if err != nil { + return b, err + } + b.Scenariodata, err = extractBlob(m, "scenariodata") + if err != nil { + return b, err + } + b.Savefavoritequest, err = extractBlob(m, "savefavoritequest") + if err != nil { + return b, err + } + b.Mezfes, err = extractBlob(m, "mezfes") + if err != nil { + return b, err + } + return b, nil +} + +// extractBlob decodes a single base64-encoded blob from a character export map. +// Returns nil (not an error) if the key is absent or its value is JSON null. +func extractBlob(m map[string]interface{}, key string) ([]byte, error) { + v, ok := m[key] + if !ok || v == nil { + return nil, nil + } + s, ok := v.(string) + if !ok { + return nil, fmt.Errorf("field %q: expected base64 string, got %T", key, v) + } + b, err := base64.StdEncoding.DecodeString(s) + if err != nil { + return nil, fmt.Errorf("field %q: base64 decode: %w", key, err) + } + return b, nil +} diff --git a/server/api/endpoints_coverage_test.go b/server/api/endpoints_coverage_test.go index 9eaa26e7a..4c520faa4 100644 --- a/server/api/endpoints_coverage_test.go +++ b/server/api/endpoints_coverage_test.go @@ -44,6 +44,56 @@ func TestVersionEndpoint(t *testing.T) { } } +func TestServerInfoEndpoint(t *testing.T) { + tests := []struct { + clientMode string + wantID string + }{ + {"ZZ", "zz"}, + {"GG", "gg"}, + {"G10.1", "g101"}, + {"G9.1", "g91"}, + {"FW.5", "fw5"}, + } + for _, tt := range tests { + t.Run(tt.clientMode, func(t *testing.T) { + logger := NewTestLogger(t) + c := NewTestConfig() + c.ClientMode = tt.clientMode + + server := &APIServer{ + logger: logger, + erupeConfig: c, + } + + req := httptest.NewRequest("GET", "/v2/server/info", nil) + rec := httptest.NewRecorder() + server.ServerInfo(rec, req) + + if rec.Code != http.StatusOK { + t.Errorf("status = %d, 
want 200", rec.Code) + } + if ct := rec.Header().Get("Content-Type"); ct != "application/json" { + t.Errorf("Content-Type = %q, want application/json", ct) + } + + var resp ServerInfoResponse + if err := json.NewDecoder(rec.Body).Decode(&resp); err != nil { + t.Fatalf("decode error: %v", err) + } + if resp.ClientMode != tt.clientMode { + t.Errorf("ClientMode = %q, want %q", resp.ClientMode, tt.clientMode) + } + if resp.ManifestID != tt.wantID { + t.Errorf("ManifestID = %q, want %q", resp.ManifestID, tt.wantID) + } + if resp.Name != "Erupe-CE" { + t.Errorf("Name = %q, want Erupe-CE", resp.Name) + } + }) + } +} + func TestLandingPageEndpoint_Enabled(t *testing.T) { logger := NewTestLogger(t) c := NewTestConfig() diff --git a/server/api/repo_character.go b/server/api/repo_character.go index 1c5a5e97d..026b078d2 100644 --- a/server/api/repo_character.go +++ b/server/api/repo_character.go @@ -2,6 +2,8 @@ package api import ( "context" + "errors" + "time" "github.com/jmoiron/sqlx" ) @@ -89,3 +91,80 @@ func (r *APICharacterRepository) ExportSave(ctx context.Context, userID, charID } return result, nil } + +func (r *APICharacterRepository) GrantImportToken(ctx context.Context, charID, userID uint32, token string, expiry time.Time) error { + res, err := r.db.ExecContext(ctx, + `UPDATE characters SET savedata_import_token=$1, savedata_import_token_expiry=$2 + WHERE id=$3 AND user_id=$4 AND deleted=false`, + token, expiry, charID, userID, + ) + if err != nil { + return err + } + n, err := res.RowsAffected() + if err != nil { + return err + } + if n == 0 { + return errors.New("character not found or not owned by user") + } + return nil +} + +func (r *APICharacterRepository) RevokeImportToken(ctx context.Context, charID, userID uint32) error { + _, err := r.db.ExecContext(ctx, + `UPDATE characters SET savedata_import_token=NULL, savedata_import_token_expiry=NULL + WHERE id=$1 AND user_id=$2`, + charID, userID, + ) + return err +} + +func (r *APICharacterRepository) 
ImportSave(ctx context.Context, charID, userID uint32, token string, blobs SaveBlobs) error { + tx, err := r.db.BeginTxx(ctx, nil) + if err != nil { + return err + } + defer func() { _ = tx.Rollback() }() + + // Validate token ownership and expiry, then clear it — all in one UPDATE. + res, err := tx.ExecContext(ctx, + `UPDATE characters + SET savedata_import_token=NULL, savedata_import_token_expiry=NULL + WHERE id=$1 AND user_id=$2 + AND savedata_import_token=$3 + AND savedata_import_token_expiry > now()`, + charID, userID, token, + ) + if err != nil { + return err + } + n, err := res.RowsAffected() + if err != nil { + return err + } + if n == 0 { + return errors.New("import token invalid, expired, or character not owned by user") + } + + // Write all save blobs. + _, err = tx.ExecContext(ctx, + `UPDATE characters SET + savedata=$1, savedata_hash=$2, decomyset=$3, hunternavi=$4, + otomoairou=$5, partner=$6, platebox=$7, platedata=$8, + platemyset=$9, rengokudata=$10, savemercenary=$11, gacha_items=$12, + house_info=$13, login_boost=$14, skin_hist=$15, scenariodata=$16, + savefavoritequest=$17, mezfes=$18 + WHERE id=$19`, + blobs.Savedata, blobs.SavedataHash, blobs.Decomyset, blobs.Hunternavi, + blobs.Otomoairou, blobs.Partner, blobs.Platebox, blobs.Platedata, + blobs.Platemyset, blobs.Rengokudata, blobs.Savemercenary, blobs.GachaItems, + blobs.HouseInfo, blobs.LoginBoost, blobs.SkinHist, blobs.Scenariodata, + blobs.Savefavoritequest, blobs.Mezfes, + charID, + ) + if err != nil { + return err + } + return tx.Commit() +} diff --git a/server/api/repo_interfaces.go b/server/api/repo_interfaces.go index 383668764..b4e842def 100644 --- a/server/api/repo_interfaces.go +++ b/server/api/repo_interfaces.go @@ -8,6 +8,29 @@ import ( // Repository interfaces decouple API server business logic from concrete // PostgreSQL implementations, enabling mock/stub injection for unit tests. +// SaveBlobs holds the transferable save data columns for a character. 
+// SavedataHash must be set by the caller (SHA-256 of decompressed Savedata). +type SaveBlobs struct { + Savedata []byte + SavedataHash []byte + Decomyset []byte + Hunternavi []byte + Otomoairou []byte + Partner []byte + Platebox []byte + Platedata []byte + Platemyset []byte + Rengokudata []byte + Savemercenary []byte + GachaItems []byte + HouseInfo []byte + LoginBoost []byte + SkinHist []byte + Scenariodata []byte + Savefavoritequest []byte + Mezfes []byte +} + // APIUserRepo defines the contract for user-related data access. type APIUserRepo interface { // Register creates a new user and returns their ID and rights. @@ -42,6 +65,13 @@ type APICharacterRepo interface { GetForUser(ctx context.Context, userID uint32) ([]Character, error) // ExportSave returns the full character row as a map. ExportSave(ctx context.Context, userID, charID uint32) (map[string]interface{}, error) + // GrantImportToken sets a one-time import token for a character owned by userID. + GrantImportToken(ctx context.Context, charID, userID uint32, token string, expiry time.Time) error + // RevokeImportToken clears any pending import token for a character owned by userID. + RevokeImportToken(ctx context.Context, charID, userID uint32) error + // ImportSave atomically validates+consumes the import token and writes all save blobs. + // Returns an error if the token is invalid, expired, or the character doesn't belong to userID. + ImportSave(ctx context.Context, charID, userID uint32, token string, blobs SaveBlobs) error } // APIEventRepo defines the contract for read-only event data access. 
diff --git a/server/api/repo_mocks_test.go b/server/api/repo_mocks_test.go index b7a23f5d0..aefd1bb5e 100644 --- a/server/api/repo_mocks_test.go +++ b/server/api/repo_mocks_test.go @@ -72,6 +72,10 @@ type mockAPICharacterRepo struct { exportResult map[string]interface{} exportErr error + + grantImportTokenErr error + revokeImportTokenErr error + importSaveErr error } func (m *mockAPICharacterRepo) GetNewCharacter(_ context.Context, _ uint32) (Character, error) { @@ -106,6 +110,18 @@ func (m *mockAPICharacterRepo) ExportSave(_ context.Context, _, _ uint32) (map[s return m.exportResult, m.exportErr } +func (m *mockAPICharacterRepo) GrantImportToken(_ context.Context, _, _ uint32, _ string, _ time.Time) error { + return m.grantImportTokenErr +} + +func (m *mockAPICharacterRepo) RevokeImportToken(_ context.Context, _, _ uint32) error { + return m.revokeImportTokenErr +} + +func (m *mockAPICharacterRepo) ImportSave(_ context.Context, _, _ uint32, _ string, _ SaveBlobs) error { + return m.importSaveErr +} + // mockAPIEventRepo implements APIEventRepo for testing. 
type mockAPIEventRepo struct { featureWeapon *FeatureWeaponRow diff --git a/server/api/routing_test.go b/server/api/routing_test.go index d72f7567b..a681f476c 100644 --- a/server/api/routing_test.go +++ b/server/api/routing_test.go @@ -44,6 +44,7 @@ func newTestRouter(s *APIServer) *mux.Router { v2Auth.HandleFunc("/characters/{id}/export", s.ExportSave).Methods("GET") v2.HandleFunc("/server/status", s.ServerStatus).Methods("GET") + v2.HandleFunc("/server/info", s.ServerInfo).Methods("GET") return r } diff --git a/server/channelserver/guild_model.go b/server/channelserver/guild_model.go index d6bbab2ea..f86966d21 100644 --- a/server/channelserver/guild_model.go +++ b/server/channelserver/guild_model.go @@ -43,7 +43,8 @@ type Guild struct { EventRP uint32 `db:"event_rp"` RoomRP uint16 `db:"room_rp"` RoomExpiry time.Time `db:"room_expiry"` - Comment string `db:"comment"` + Comment string `db:"comment"` + ReturnType uint8 `db:"return_type"` PugiName1 string `db:"pugi_name_1"` PugiName2 string `db:"pugi_name_2"` PugiName3 string `db:"pugi_name_3"` diff --git a/server/channelserver/handlers_campaign.go b/server/channelserver/handlers_campaign.go index ad6854fee..f7cb79853 100644 --- a/server/channelserver/handlers_campaign.go +++ b/server/channelserver/handlers_campaign.go @@ -9,55 +9,83 @@ import ( "time" ) -// CampaignEvent represents a promotional campaign event. 
type CampaignEvent struct { - ID uint32 - Unk0 uint32 - MinHR int16 - MaxHR int16 - MinSR int16 - MaxSR int16 - MinGR int16 - MaxGR int16 - Unk1 uint16 - Unk2 uint8 - Unk3 uint8 - Unk4 uint16 - Unk5 uint16 - Start time.Time - End time.Time - Unk6 uint8 - String0 string - String1 string - String2 string - String3 string - Link string - Prefix string - Categories []uint16 + ID uint32 `db:"id"` + MinHR int16 `db:"min_hr"` + MaxHR int16 `db:"max_hr"` + MinSR int16 `db:"min_sr"` + MaxSR int16 `db:"max_sr"` + MinGR int16 `db:"min_gr"` + MaxGR int16 `db:"max_gr"` + RewardType uint16 `db:"reward_type"` + Stamps uint8 `db:"stamps"` + ReceiveType uint8 `db:"receive_type"` + BackgroundID uint16 `db:"background_id"` + Start time.Time `db:"start_time"` + End time.Time `db:"end_time"` + Title string `db:"title"` + Reward string `db:"reward"` + Link string `db:"link"` + Prefix string `db:"code_prefix"` } -// CampaignCategory represents a category grouping for campaign events. type CampaignCategory struct { - ID uint16 - Type uint8 - Title string - Description string + ID uint16 `db:"id"` + Type uint8 `db:"type"` + Title string `db:"title"` + Description string `db:"description"` } -// CampaignLink links a campaign event to its items/rewards. type CampaignLink struct { - CategoryID uint16 - CampaignID uint32 + CategoryID uint16 `db:"category_id"` + CampaignID uint32 `db:"campaign_id"` +} + +type CampaignReward struct { + ID uint32 `db:"id"` + ItemType uint16 `db:"item_type"` + Quantity uint16 `db:"quantity"` + ItemID uint16 `db:"item_id"` + Deadline time.Time `db:"deadline"` +} + +// campaignRequiredStamps returns the stamp requirement for a campaign, +// clamping to a minimum of 1. Campaigns with 0 stamps in the DB are +// treated as requiring a single stamp (code redemption) to unlock. 
+func campaignRequiredStamps(stamps int) int { + if stamps < 1 { + return 1 + } + return stamps } func handleMsgMhfEnumerateCampaign(s *Session, p mhfpacket.MHFPacket) { pkt := p.(*mhfpacket.MsgMhfEnumerateCampaign) + if s.server.db == nil { + doAckBufFail(s, pkt.AckHandle, make([]byte, 4)) + return + } bf := byteframe.NewByteFrame() - events := []CampaignEvent{} - categories := []CampaignCategory{} + var events []CampaignEvent + var categories []CampaignCategory var campaignLinks []CampaignLink + err := s.server.db.Select(&events, "SELECT id,min_hr,max_hr,min_sr,max_sr,min_gr,max_gr,reward_type,stamps,receive_type,background_id,start_time,end_time,title,reward,link,code_prefix FROM campaigns") + if err != nil { + doAckBufFail(s, pkt.AckHandle, make([]byte, 4)) + return + } + err = s.server.db.Select(&categories, "SELECT id, type, title, description FROM campaign_categories") + if err != nil { + doAckBufFail(s, pkt.AckHandle, make([]byte, 4)) + return + } + err = s.server.db.Select(&campaignLinks, "SELECT campaign_id, category_id FROM campaign_category_links") + if err != nil { + doAckBufFail(s, pkt.AckHandle, make([]byte, 4)) + return + } if len(events) > 255 { bf.WriteUint8(255) bf.WriteUint16(uint16(len(events))) @@ -66,7 +94,7 @@ func handleMsgMhfEnumerateCampaign(s *Session, p mhfpacket.MHFPacket) { } for _, event := range events { bf.WriteUint32(event.ID) - bf.WriteUint32(event.Unk0) + bf.WriteUint32(0) bf.WriteInt16(event.MinHR) bf.WriteInt16(event.MaxHR) bf.WriteInt16(event.MinSR) @@ -75,34 +103,19 @@ func handleMsgMhfEnumerateCampaign(s *Session, p mhfpacket.MHFPacket) { bf.WriteInt16(event.MinGR) bf.WriteInt16(event.MaxGR) } - bf.WriteUint16(event.Unk1) - bf.WriteUint8(event.Unk2) - bf.WriteUint8(event.Unk3) - bf.WriteUint16(event.Unk4) - bf.WriteUint16(event.Unk5) + bf.WriteUint16(event.RewardType) + bf.WriteUint8(event.Stamps) + bf.WriteUint8(event.ReceiveType) + bf.WriteUint16(event.BackgroundID) + bf.WriteUint16(0) 
bf.WriteUint32(uint32(event.Start.Unix())) bf.WriteUint32(uint32(event.End.Unix())) - bf.WriteUint8(event.Unk6) - ps.Uint8(bf, event.String0, true) - ps.Uint8(bf, event.String1, true) - ps.Uint8(bf, event.String2, true) - ps.Uint8(bf, event.String3, true) + bf.WriteBool(event.End.Before(time.Now())) + ps.Uint8(bf, event.Title, true) + ps.Uint8(bf, event.Reward, true) + ps.Uint8(bf, event.Prefix, true) + ps.Uint8(bf, "", false) ps.Uint8(bf, event.Link, true) - for i := range event.Categories { - campaignLinks = append(campaignLinks, CampaignLink{event.Categories[i], event.ID}) - } - } - - if len(events) > 255 { - bf.WriteUint8(255) - bf.WriteUint16(uint16(len(events))) - } else { - bf.WriteUint8(uint8(len(events))) - } - for _, event := range events { - bf.WriteUint32(event.ID) - bf.WriteUint8(1) // Always 1? - bf.WriteBytes([]byte(event.Prefix)) } if len(categories) > 255 { @@ -137,43 +150,185 @@ func handleMsgMhfEnumerateCampaign(s *Session, p mhfpacket.MHFPacket) { func handleMsgMhfStateCampaign(s *Session, p mhfpacket.MHFPacket) { pkt := p.(*mhfpacket.MsgMhfStateCampaign) + if s.server.db == nil { + doAckBufFail(s, pkt.AckHandle, make([]byte, 4)) + return + } bf := byteframe.NewByteFrame() - bf.WriteUint16(1) - bf.WriteUint16(0) + var required int + var deadline time.Time + var stamps []uint32 + + err := s.server.db.Select(&stamps, "SELECT id FROM campaign_state WHERE campaign_id = $1 AND character_id = $2", pkt.CampaignID, s.charID) + if err != nil { + doAckBufFail(s, pkt.AckHandle, make([]byte, 4)) + return + } + err = s.server.db.QueryRow(`SELECT stamps, end_time FROM campaigns WHERE id = $1`, pkt.CampaignID).Scan(&required, &deadline) + if err != nil { + doAckBufFail(s, pkt.AckHandle, make([]byte, 4)) + return + } + bf.WriteUint16(uint16(len(stamps))) + required = campaignRequiredStamps(required) + + if len(stamps) >= required && deadline.After(time.Now()) { + bf.WriteUint16(2) + } else { + bf.WriteUint16(0) + } + + for _, v := range stamps { + 
bf.WriteUint32(v) + } + doAckBufSucceed(s, pkt.AckHandle, bf.Data()) } func handleMsgMhfApplyCampaign(s *Session, p mhfpacket.MHFPacket) { pkt := p.(*mhfpacket.MsgMhfApplyCampaign) - bf := byteframe.NewByteFrame() - bf.WriteUint32(1) - doAckSimpleSucceed(s, pkt.AckHandle, bf.Data()) + if s.server.db == nil { + doAckSimpleFail(s, pkt.AckHandle, make([]byte, 4)) + return + } + // Check if the code exists, belongs to this campaign, and check if it's a multi-code + var multi bool + err := s.server.db.QueryRow(`SELECT multi FROM public.campaign_codes WHERE code = $1 AND campaign_id = $2`, pkt.Code, pkt.CampaignID).Scan(&multi) + if err != nil { + doAckSimpleFail(s, pkt.AckHandle, make([]byte, 4)) + return + } + + // Check if the code is already used + var exists bool + if multi { + err = s.server.db.QueryRow(`SELECT COUNT(*) > 0 FROM public.campaign_state WHERE code = $1 AND character_id = $2`, pkt.Code, s.charID).Scan(&exists) + } else { + err = s.server.db.QueryRow(`SELECT COUNT(*) > 0 FROM public.campaign_state WHERE code = $1`, pkt.Code).Scan(&exists) + } + if err != nil || exists { + doAckSimpleFail(s, pkt.AckHandle, make([]byte, 4)) + return + } + + _, err = s.server.db.Exec(`INSERT INTO public.campaign_state (code, campaign_id, character_id) VALUES ($1, $2, $3)`, pkt.Code, pkt.CampaignID, s.charID) + if err != nil { + doAckSimpleFail(s, pkt.AckHandle, make([]byte, 4)) + return + } + doAckSimpleSucceed(s, pkt.AckHandle, make([]byte, 4)) } func handleMsgMhfEnumerateItem(s *Session, p mhfpacket.MHFPacket) { pkt := p.(*mhfpacket.MsgMhfEnumerateItem) - items := []struct { - Unk0 uint32 - Unk1 uint16 - Unk2 uint16 - Unk3 uint16 - Unk4 uint32 - Unk5 uint32 - }{} - bf := byteframe.NewByteFrame() - bf.WriteUint16(uint16(len(items))) - for _, item := range items { - bf.WriteUint32(item.Unk0) - bf.WriteUint16(item.Unk1) - bf.WriteUint16(item.Unk2) - bf.WriteUint16(item.Unk3) - bf.WriteUint32(item.Unk4) - bf.WriteUint32(item.Unk5) + if s.server.db == nil { + doAckBufFail(s, 
pkt.AckHandle, make([]byte, 4)) + return + } + bf := byteframe.NewByteFrame() + + var stamps, required, rewardType uint16 + err := s.server.db.QueryRow(`SELECT COUNT(*) FROM campaign_state WHERE campaign_id = $1 AND character_id = $2`, pkt.CampaignID, s.charID).Scan(&stamps) + if err != nil { + doAckBufFail(s, pkt.AckHandle, make([]byte, 4)) + return + } + err = s.server.db.QueryRow(`SELECT stamps, reward_type FROM campaigns WHERE id = $1`, pkt.CampaignID).Scan(&required, &rewardType) + if err != nil { + doAckBufFail(s, pkt.AckHandle, make([]byte, 4)) + return + } + required = uint16(campaignRequiredStamps(int(required))) + + if stamps >= required { + var items []CampaignReward + if rewardType == 2 { + var exists int + err = s.server.db.QueryRow(`SELECT COUNT(*) FROM campaign_quest WHERE campaign_id = $1 AND character_id = $2`, pkt.CampaignID, s.charID).Scan(&exists) + if err != nil { + doAckBufFail(s, pkt.AckHandle, make([]byte, 4)) + return + } + if exists > 0 { + err = s.server.db.Select(&items, ` + SELECT id, item_type, quantity, item_id, TO_TIMESTAMP(0) AS deadline FROM campaign_rewards + WHERE campaign_id = $1 AND item_type != 9 + AND NOT EXISTS (SELECT 1 FROM campaign_rewards_claimed WHERE reward_id = campaign_rewards.id AND character_id = $2) + `, pkt.CampaignID, s.charID) + } else { + err = s.server.db.Select(&items, ` + SELECT cr.id, cr.item_type, cr.quantity, cr.item_id, COALESCE(c.end_time, TO_TIMESTAMP(0)) AS deadline FROM campaign_rewards cr + JOIN campaigns c ON cr.campaign_id = c.id + WHERE campaign_id = $1 AND item_type = 9`, pkt.CampaignID) + } + } else { + err = s.server.db.Select(&items, ` + SELECT id, item_type, quantity, item_id, TO_TIMESTAMP(0) AS deadline FROM campaign_rewards + WHERE campaign_id = $1 + AND NOT EXISTS (SELECT 1 FROM campaign_rewards_claimed WHERE reward_id = campaign_rewards.id AND character_id = $2) + `, pkt.CampaignID, s.charID) + } + if err != nil { + doAckBufFail(s, pkt.AckHandle, make([]byte, 4)) + return + } + + 
bf.WriteUint16(uint16(len(items))) + for _, item := range items { + bf.WriteUint32(item.ID) + bf.WriteUint16(item.ItemType) + bf.WriteUint16(item.Quantity) + bf.WriteUint16(item.ItemID) // HACK: quest ID is stored in this field to fit the Item No pattern; it may actually be another field (possibly one of the unks). + bf.WriteUint16(0) // Unk4 (client casts this to uint8) + bf.WriteUint32(0) // Unk5 + bf.WriteUint32(uint32(item.Deadline.Unix())) + } + if len(items) == 0 { + doAckBufSucceed(s, pkt.AckHandle, make([]byte, 4)) + } else { + doAckBufSucceed(s, pkt.AckHandle, bf.Data()) + } + } else { + doAckBufSucceed(s, pkt.AckHandle, make([]byte, 4)) } - doAckBufSucceed(s, pkt.AckHandle, bf.Data()) } func handleMsgMhfAcquireItem(s *Session, p mhfpacket.MHFPacket) { pkt := p.(*mhfpacket.MsgMhfAcquireItem) + if s.server.db == nil { + doAckSimpleFail(s, pkt.AckHandle, make([]byte, 4)) + return + } + for _, id := range pkt.RewardIDs { + _, err := s.server.db.Exec(`INSERT INTO campaign_rewards_claimed (reward_id, character_id) VALUES ($1, $2)`, id, s.charID) + if err != nil { + doAckSimpleFail(s, pkt.AckHandle, make([]byte, 4)) + return + } + } + doAckSimpleSucceed(s, pkt.AckHandle, make([]byte, 4)) +} + +func handleMsgMhfTransferItem(s *Session, p mhfpacket.MHFPacket) { + pkt := p.(*mhfpacket.MsgMhfTransferItem) + if s.server.db == nil { + doAckSimpleSucceed(s, pkt.AckHandle, make([]byte, 4)) + return + } + if pkt.ItemType == 9 { + var campaignID uint32 + err := s.server.db.QueryRow(` + SELECT ce.campaign_id FROM campaign_rewards ce + JOIN event_quests eq ON ce.item_id = eq.quest_id + WHERE eq.id = $1 + `, pkt.QuestID).Scan(&campaignID) + if err == nil { + _, err = s.server.db.Exec(`INSERT INTO campaign_quest (campaign_id, character_id) VALUES ($1, $2)`, campaignID, s.charID) + if err != nil { + doAckSimpleFail(s, pkt.AckHandle, make([]byte, 4)) + return + } + } + } doAckSimpleSucceed(s, pkt.AckHandle, make([]byte, 4)) } diff --git a/server/channelserver/handlers_character.go
b/server/channelserver/handlers_character.go index 429dc1a23..3c788163d 100644 --- a/server/channelserver/handlers_character.go +++ b/server/channelserver/handlers_character.go @@ -66,8 +66,7 @@ func GetCharacterSaveData(s *Session, charID uint32) (*CharacterSaveData, error) zap.Binary("stored_hash", storedHash), zap.Binary("computed_hash", computedHash[:]), ) - // TODO: attempt recovery from savedata_backups here - return nil, errors.New("savedata integrity check failed") + return recoverFromBackups(s, saveData, charID) } } else if storedHash != nil && s.server.erupeConfig.DisableSaveIntegrityCheck { s.logger.Warn("Savedata integrity check skipped (DisableSaveIntegrityCheck=true)", @@ -80,6 +79,77 @@ func GetCharacterSaveData(s *Session, charID uint32) (*CharacterSaveData, error) return saveData, nil } +// recoverFromBackups is called when the primary savedata fails its integrity check. +// It queries savedata_backups in recency order and returns the first slot whose +// compressed blob decompresses cleanly. It never writes to the database — the +// next successful Save() will overwrite the primary with fresh data and a new hash, +// self-healing the corruption without any extra recovery writes. 
+func recoverFromBackups(s *Session, base *CharacterSaveData, charID uint32) (*CharacterSaveData, error) { + backups, err := s.server.charRepo.LoadBackupsByRecency(charID) + if err != nil { + s.logger.Error("Failed to load savedata backups during recovery", + zap.Uint32("charID", charID), + zap.Error(err), + ) + return nil, errors.New("savedata integrity check failed") + } + + if len(backups) == 0 { + s.logger.Error("Savedata corrupted and no backups available", + zap.Uint32("charID", charID), + ) + return nil, errors.New("savedata integrity check failed: no backups available") + } + + for _, backup := range backups { + candidate := &CharacterSaveData{ + CharID: base.CharID, + IsNewCharacter: base.IsNewCharacter, + Name: base.Name, + Mode: base.Mode, + Pointers: base.Pointers, + compSave: backup.Data, + } + + if err := candidate.Decompress(); err != nil { + s.logger.Warn("Backup slot decompression failed during recovery, trying next", + zap.Uint32("charID", charID), + zap.Int("slot", backup.Slot), + zap.Time("saved_at", backup.SavedAt), + zap.Error(err), + ) + continue + } + + // nullcomp passes through data without a "cmp" header as-is (legitimate for + // old uncompressed saves). Guard against garbage data that is too small to + // contain the minimum save layout (name field at offset 88–100). 
+ const minSaveSize = saveFieldNameOffset + saveFieldNameLen + if len(candidate.decompSave) < minSaveSize { + s.logger.Warn("Backup slot data too small after decompression, skipping", + zap.Uint32("charID", charID), + zap.Int("slot", backup.Slot), + zap.Int("size", len(candidate.decompSave)), + ) + continue + } + + s.logger.Warn("Savedata recovered from backup — primary was corrupt", + zap.Uint32("charID", charID), + zap.Int("slot", backup.Slot), + zap.Time("saved_at", backup.SavedAt), + ) + candidate.updateStructWithSaveData() + return candidate, nil + } + + s.logger.Error("Savedata corrupted and all backup slots failed decompression", + zap.Uint32("charID", charID), + zap.Int("backups_tried", len(backups)), + ) + return nil, errors.New("savedata integrity check failed: all backup slots exhausted") +} + func (save *CharacterSaveData) Save(s *Session) error { if save.decompSave == nil { s.logger.Warn("No decompressed save data, skipping save", diff --git a/server/channelserver/handlers_character_test.go b/server/channelserver/handlers_character_test.go index 102647b34..e0a1d6dff 100644 --- a/server/channelserver/handlers_character_test.go +++ b/server/channelserver/handlers_character_test.go @@ -446,6 +446,144 @@ func TestGetCharacterSaveData_Integration(t *testing.T) { } } +// TestGetCharacterSaveData_BackupRecovery tests that a character whose primary +// savedata has a hash mismatch is transparently recovered from the backup table. +func TestGetCharacterSaveData_BackupRecovery(t *testing.T) { + db := SetupTestDB(t) + defer TeardownTestDB(t, db) + + // Build valid compressed savedata (same layout as CreateTestCharacter). + rawSave := make([]byte, 150000) + copy(rawSave[88:], append([]byte("BackupChar"), 0x00)) + validCompressed, err := nullcomp.Compress(rawSave) + if err != nil { + t.Fatalf("compress valid savedata: %v", err) + } + + // Build a compressed blob that will fail decompression (garbage bytes). 
+ invalidCompressed := []byte("this is not valid compressed data") + + corruptHash := make([]byte, 32) // deliberately mismatched hash: zeroes with a 0xFF marker byte; no real savedata hashes to this value + corruptHash[0] = 0xFF + + repo := NewCharacterRepository(db) + + t.Run("recovers_from_most_recent_backup", func(t *testing.T) { + userID := CreateTestUser(t, db, "recovery_user") + charID := CreateTestCharacter(t, db, userID, "BackupChar") + + // Store a valid backup in slot 0. + if err := repo.SaveBackup(charID, 0, validCompressed); err != nil { + t.Fatalf("SaveBackup: %v", err) + } + + // Set a wrong hash on the primary so the integrity check fails. + if _, err := db.Exec("UPDATE characters SET savedata_hash = $1 WHERE id = $2", corruptHash, charID); err != nil { + t.Fatalf("set corrupt hash: %v", err) + } + + mock := &MockCryptConn{sentPackets: make([][]byte, 0)} + s := createTestSession(mock) + s.charID = charID + SetTestDB(s.server, db) + s.server.erupeConfig.RealClientMode = cfg.Z2 + + got, err := GetCharacterSaveData(s, charID) + if err != nil { + t.Fatalf("GetCharacterSaveData() unexpected error: %v", err) + } + if got == nil { + t.Fatal("GetCharacterSaveData() returned nil") + } + if got.CharID != charID { + t.Errorf("CharID = %d, want %d", got.CharID, charID) + } + }) + + t.Run("skips_corrupt_backup_and_uses_next", func(t *testing.T) { + userID := CreateTestUser(t, db, "multibackup_user") + charID := CreateTestCharacter(t, db, userID, "BackupChar") + + // Slot 1 is newer (saved second) but has invalid compressed data. + // Slot 0 is older but valid. Recovery must skip slot 1 and use slot 0. + if err := repo.SaveBackup(charID, 0, validCompressed); err != nil { + t.Fatalf("SaveBackup slot 0: %v", err) + } + if err := repo.SaveBackup(charID, 1, invalidCompressed); err != nil { + t.Fatalf("SaveBackup slot 1: %v", err) + } + // Update slot 1's saved_at to be newer than slot 0.
+ if _, err := db.Exec( + "UPDATE savedata_backups SET saved_at = now() + interval '1 minute' WHERE char_id = $1 AND slot = 1", + charID, + ); err != nil { + t.Fatalf("update saved_at: %v", err) + } + + if _, err := db.Exec("UPDATE characters SET savedata_hash = $1 WHERE id = $2", corruptHash, charID); err != nil { + t.Fatalf("set corrupt hash: %v", err) + } + + mock := &MockCryptConn{sentPackets: make([][]byte, 0)} + s := createTestSession(mock) + s.charID = charID + SetTestDB(s.server, db) + s.server.erupeConfig.RealClientMode = cfg.Z2 + + got, err := GetCharacterSaveData(s, charID) + if err != nil { + t.Fatalf("GetCharacterSaveData() unexpected error: %v", err) + } + if got == nil { + t.Fatal("GetCharacterSaveData() returned nil") + } + }) + + t.Run("returns_error_when_no_backups", func(t *testing.T) { + userID := CreateTestUser(t, db, "nobackup_user") + charID := CreateTestCharacter(t, db, userID, "NoBackupChar") + + if _, err := db.Exec("UPDATE characters SET savedata_hash = $1 WHERE id = $2", corruptHash, charID); err != nil { + t.Fatalf("set corrupt hash: %v", err) + } + + mock := &MockCryptConn{sentPackets: make([][]byte, 0)} + s := createTestSession(mock) + s.charID = charID + SetTestDB(s.server, db) + s.server.erupeConfig.RealClientMode = cfg.Z2 + + _, err := GetCharacterSaveData(s, charID) + if err == nil { + t.Fatal("expected error when no backups available, got nil") + } + }) + + t.Run("returns_error_when_all_backups_corrupt", func(t *testing.T) { + userID := CreateTestUser(t, db, "allcorrupt_user") + charID := CreateTestCharacter(t, db, userID, "AllCorruptChar") + + if err := repo.SaveBackup(charID, 0, invalidCompressed); err != nil { + t.Fatalf("SaveBackup: %v", err) + } + + if _, err := db.Exec("UPDATE characters SET savedata_hash = $1 WHERE id = $2", corruptHash, charID); err != nil { + t.Fatalf("set corrupt hash: %v", err) + } + + mock := &MockCryptConn{sentPackets: make([][]byte, 0)} + s := createTestSession(mock) + s.charID = charID + 
SetTestDB(s.server, db) + s.server.erupeConfig.RealClientMode = cfg.Z2 + + _, err := GetCharacterSaveData(s, charID) + if err == nil { + t.Fatal("expected error when all backups corrupt, got nil") + } + }) +} + // TestCharacterSaveData_Save_Integration tests saving character data to database func TestCharacterSaveData_Save_Integration(t *testing.T) { db := SetupTestDB(t) diff --git a/server/channelserver/handlers_data.go b/server/channelserver/handlers_data.go index dc3e8989b..d2c68b636 100644 --- a/server/channelserver/handlers_data.go +++ b/server/channelserver/handlers_data.go @@ -123,6 +123,7 @@ func handleMsgMhfSavedata(s *Session, p mhfpacket.MHFPacket) { if characterSaveData.Name == s.Name || s.server.erupeConfig.RealClientMode <= cfg.S10 { if err := characterSaveData.Save(s); err != nil { s.logger.Error("Failed to save character data", zap.Error(err)) + doAckSimpleFail(s, pkt.AckHandle, make([]byte, 4)) return } s.logger.Info("Wrote recompressed savedata back to DB.") diff --git a/server/channelserver/handlers_diva.go b/server/channelserver/handlers_diva.go index b79979d8e..c576d83bb 100644 --- a/server/channelserver/handlers_diva.go +++ b/server/channelserver/handlers_diva.go @@ -23,6 +23,9 @@ func cleanupDiva(s *Session) { if err := s.server.divaRepo.DeleteEvents(); err != nil { s.logger.Error("Failed to delete diva events", zap.Error(err)) } + if err := s.server.divaRepo.CleanupBeads(); err != nil { + s.logger.Error("Failed to cleanup diva beads", zap.Error(err)) + } } func generateDivaTimestamps(s *Session, start uint32, debug bool) []uint32 { @@ -137,16 +140,50 @@ func handleMsgMhfGetUdInfo(s *Session, p mhfpacket.MHFPacket) { doAckBufSucceed(s, pkt.AckHandle, resp.Data()) } +// defaultBeadTypes are used when the database has no bead rows configured. 
+var defaultBeadTypes = []int{1, 3, 4, 8} + func handleMsgMhfGetKijuInfo(s *Session, p mhfpacket.MHFPacket) { pkt := p.(*mhfpacket.MsgMhfGetKijuInfo) - // Temporary canned response - data, _ := hex.DecodeString("04965C959782CC8B468EEC00000000000000000000000000000000000000000000815C82A082E782B582DC82A982BA82CC82AB82B682E3815C0A965C959782C682CD96D282E98E7682A281420A95B782AD8ED282C997458B4382F0975E82A682E98142000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000001018BAD8C8282CC8B468EEC00000000000000000000000000000000000000000000815C82AB82E582A482B082AB82CC82AB82B682E3815C0A8BAD8C8282C682CD8BAD82A290BA904681420A95B782AD8ED282CC97CD82F08CA482AC909F82DC82B78142200000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000003138C8B8F5782CC8B468EEC00000000000000000000000000000000000000000000815C82AF82C182B582E382A482CC82AB82B682E3815C0A8C8B8F5782C682CD8A6D8CC582BD82E9904D978A81420A8F5782DF82E982D982C782C98EEB906C82BD82BF82CC90B8905F97CD82C682C882E9814200000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000041189CC8CEC82CC8B468EEC00000000000000000000000000000000000000000000815C82A482BD82DC82E082E882CC82AB82B682E3815C0A89CC8CEC82C682CD89CC955082CC8CEC82E881420A8F5782DF82E982D982C782C98EEB906C82BD82BF82CC8E7882A682C682C882E9814220000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000212") - doAckBufSucceed(s, pkt.AckHandle, data) + + // RE-confirmed entry layout (546 bytes each): + // +0x000 char[32] name + // +0x020 char[512] description + // +0x220 u8 color_id (slot index, 1-based) + // +0x221 u8 bead_type (effect ID) + // Response: u8 count + count × 546 bytes. + beadTypes, err := s.server.divaRepo.GetBeads() + if err != nil || len(beadTypes) == 0 { + beadTypes = defaultBeadTypes + } + + lang := getLangStrings(s.server) + bf := byteframe.NewByteFrame() + bf.WriteUint8(uint8(len(beadTypes))) + for i, bt := range beadTypes { + name, desc := lang.beadName(bt), lang.beadDescription(bt) + bf.WriteBytes(stringsupport.PaddedString(name, 32, true)) + bf.WriteBytes(stringsupport.PaddedString(desc, 512, true)) + bf.WriteUint8(uint8(i + 1)) // color_id: slot 1..N + bf.WriteUint8(uint8(bt)) // bead_type: effect ID + } + + doAckBufSucceed(s, pkt.AckHandle, bf.Data()) } func handleMsgMhfSetKiju(s *Session, p mhfpacket.MHFPacket) { pkt := p.(*mhfpacket.MsgMhfSetKiju) - doAckSimpleSucceed(s, pkt.AckHandle, []byte{0x00, 0x00, 0x00, 0x00}) + beadIndex := int(pkt.Unk1) + expiry := TimeAdjusted().Add(24 * time.Hour) + if err := s.server.divaRepo.AssignBead(s.charID, beadIndex, expiry); err != nil { + s.logger.Warn("Failed to assign bead", + zap.Uint32("charID", s.charID), + zap.Int("beadIndex", beadIndex), + zap.Error(err)) + } else { + s.currentBeadIndex = beadIndex + } + 
doAckSimpleSucceed(s, pkt.AckHandle, []byte{0x00}) } func handleMsgMhfAddUdPoint(s *Session, p mhfpacket.MHFPacket) { @@ -169,6 +206,17 @@ func handleMsgMhfAddUdPoint(s *Session, p mhfpacket.MHFPacket) { zap.Uint32("bonusPoints", pkt.BonusPoints), zap.Error(err)) } + if s.currentBeadIndex >= 0 { + total := int(pkt.QuestPoints) + int(pkt.BonusPoints) + if total > 0 { + if err := s.server.divaRepo.AddBeadPoints(s.charID, s.currentBeadIndex, total); err != nil { + s.logger.Warn("Failed to add bead points", + zap.Uint32("charID", s.charID), + zap.Int("beadIndex", s.currentBeadIndex), + zap.Error(err)) + } + } + } } doAckSimpleSucceed(s, pkt.AckHandle, []byte{0x00, 0x00, 0x00, 0x00}) @@ -176,23 +224,92 @@ func handleMsgMhfAddUdPoint(s *Session, p mhfpacket.MHFPacket) { func handleMsgMhfGetUdMyPoint(s *Session, p mhfpacket.MHFPacket) { pkt := p.(*mhfpacket.MsgMhfGetUdMyPoint) - // Temporary canned response - data, _ := hex.DecodeString("00040000013C000000FA000000000000000000040000007E0000003C02000000000000000000000000000000000000000000000000000002000004CC00000438000000000000000000000000000000000000000000000000000000020000026E00000230000000000000000000020000007D0000007D000000000000000000000000000000000000000000000000000000") - doAckBufSucceed(s, pkt.AckHandle, data) + + // RE confirms: no count prefix. Client hardcodes exactly 8 loop iterations. + // Per-entry stride is 18 bytes: + // +0x00 u8 bead_index + // +0x01 u32 points + // +0x05 u32 points_dupe (same value as points) + // +0x09 u8 unk1 (half-period: 0=first 12h, 1=second 12h) + // +0x0A u32 unk2 + // +0x0E u32 unk3 + // Total: 8 × 18 = 144 bytes. 
+ beadPoints, err := s.server.divaRepo.GetCharacterBeadPoints(s.charID)
+ if err != nil {
+ s.logger.Warn("Failed to get bead points", zap.Uint32("charID", s.charID), zap.Error(err))
+ beadPoints = map[int]int{}
+ }
+ activeBead := uint8(0)
+ if s.currentBeadIndex >= 0 {
+ activeBead = uint8(s.currentBeadIndex)
+ }
+ pts := uint32(0)
+ if s.currentBeadIndex >= 0 {
+ if p, ok := beadPoints[s.currentBeadIndex]; ok {
+ pts = uint32(p)
+ }
+ }
+ bf := byteframe.NewByteFrame()
+ for i := 0; i < 8; i++ {
+ bf.WriteUint8(activeBead)
+ bf.WriteUint32(pts)
+ bf.WriteUint32(pts) // points_dupe
+ bf.WriteUint8(uint8(i % 2)) // unk1: 0=first half, 1=second half
+ bf.WriteUint32(0) // unk2
+ bf.WriteUint32(0) // unk3
+ }
+ doAckBufSucceed(s, pkt.AckHandle, bf.Data())
+}
+
+// udMilestones are the global contribution milestones for Diva Defense.
+// RE confirms: u8 error + 64 × u64 target_values + 64 × u8 target_types + u64 total = 585 bytes.
+// Slots 0–12 are populated; slots 13–63 are zero.
+var udMilestones = []uint64{
+ 500000, 1000000, 2000000, 3000000, 5000000, 7000000, 10000000,
+ 15000000, 20000000, 30000000, 50000000, 70000000, 100000000, } func handleMsgMhfGetUdTotalPointInfo(s *Session, p mhfpacket.MHFPacket) { pkt := p.(*mhfpacket.MsgMhfGetUdTotalPointInfo) - // Temporary canned response - data, _ := 
hex.DecodeString("00000000000007A12000000000000F424000000000001E848000000000002DC6C000000000003D090000000000004C4B4000000000005B8D8000000000006ACFC000000000007A1200000000000089544000000000009896800000000000E4E1C00000000001312D0000000000017D78400000000001C9C3800000000002160EC00000000002625A000000000002AEA5400000000002FAF0800000000003473BC0000000000393870000000000042C1D800000000004C4B40000000000055D4A800000000005F5E10000000000008954400000000001C9C3800000000003473BC00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000001020300000000000000000000000000000000000000000000000000000000000000000000000000000000101F1420") - doAckBufSucceed(s, pkt.AckHandle, data) + + total, err := s.server.divaRepo.GetTotalBeadPoints() + if err != nil { + s.logger.Warn("Failed to get total bead points", zap.Error(err)) + } + + bf := byteframe.NewByteFrame() + bf.WriteUint8(0) // error = success + // 64 × u64 target_values (big-endian) + for i := 0; i < 64; i++ { + var v uint64 + if i < len(udMilestones) { + v = udMilestones[i] + } + bf.WriteUint64(v) + } + // 64 × u8 target_types (0 = global) + for i := 0; i < 64; i++ { + bf.WriteUint8(0) + } + // u64 total_souls + bf.WriteUint64(uint64(total)) + doAckBufSucceed(s, pkt.AckHandle, bf.Data()) } func handleMsgMhfGetUdSelectedColorInfo(s *Session, p mhfpacket.MHFPacket) { pkt := p.(*mhfpacket.MsgMhfGetUdSelectedColorInfo) - // Unk - doAckBufSucceed(s, pkt.AckHandle, []byte{0x00, 0x01, 
0x01, 0x01, 0x02, 0x03, 0x02, 0x00, 0x00}) + // RE confirms: exactly 9 bytes = u8 error + u8[8] winning colors. + bf := byteframe.NewByteFrame() + bf.WriteUint8(0) // error = success + for day := 0; day < 8; day++ { + topBead, err := s.server.divaRepo.GetTopBeadPerDay(day) + if err != nil { + topBead = 0 + } + bf.WriteUint8(uint8(topBead)) + } + doAckBufSucceed(s, pkt.AckHandle, bf.Data()) } func handleMsgMhfGetUdMonsterPoint(s *Session, p mhfpacket.MHFPacket) { @@ -329,16 +446,25 @@ func handleMsgMhfGetUdMonsterPoint(s *Session, p mhfpacket.MHFPacket) { func handleMsgMhfGetUdDailyPresentList(s *Session, p mhfpacket.MHFPacket) { pkt := p.(*mhfpacket.MsgMhfGetUdDailyPresentList) - // Temporary canned response - data, _ := hex.DecodeString("0100001600000A5397DF00000000000000000000000000000000") - doAckBufSucceed(s, pkt.AckHandle, data) + // DailyPresentList: u16 count + count × 15-byte entries. + // Entry: u8 rank_type, u16 rank_from, u16 rank_to, u8 item_type, + // u16 _pad0(skip), u16 item_id, u16 _pad1(skip), u16 quantity, u8 unk. + // Padding at +6 and +10 is NOT read by the client. + bf := byteframe.NewByteFrame() + bf.WriteUint16(0) // count = 0 (no entries configured) + doAckBufSucceed(s, pkt.AckHandle, bf.Data()) } func handleMsgMhfGetUdNormaPresentList(s *Session, p mhfpacket.MHFPacket) { pkt := p.(*mhfpacket.MsgMhfGetUdNormaPresentList) - // Temporary canned response - data, _ := hex.DecodeString("0100001600000A5397DF00000000000000000000000000000000") - doAckBufSucceed(s, pkt.AckHandle, data) + // NormaPresentList: u16 count + count × 19-byte entries. + // Same layout as DailyPresent (+0x00..+0x0D), plus: + // +0x0E u32 points_required (norma threshold) + // +0x12 u8 bead_type (BeadType that unlocks this tier) + // Padding at +6 and +10 NOT read. 
+ bf := byteframe.NewByteFrame() + bf.WriteUint16(0) // count = 0 (no entries configured) + doAckBufSucceed(s, pkt.AckHandle, bf.Data()) } func handleMsgMhfAcquireUdItem(s *Session, p mhfpacket.MHFPacket) { diff --git a/server/channelserver/handlers_festa.go b/server/channelserver/handlers_festa.go index 6b6be370b..b8d1bb84c 100644 --- a/server/channelserver/handlers_festa.go +++ b/server/channelserver/handlers_festa.go @@ -26,65 +26,6 @@ func handleMsgMhfLoadMezfesData(s *Session, p mhfpacket.MHFPacket) { []byte{0x00, 0x00, 0x00, 0x00, 0x02, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00}) } -func handleMsgMhfEnumerateRanking(s *Session, p mhfpacket.MHFPacket) { - pkt := p.(*mhfpacket.MsgMhfEnumerateRanking) - bf := byteframe.NewByteFrame() - state := s.server.erupeConfig.DebugOptions.TournamentOverride - // Unk - // Unk - // Start? - // End? - midnight := TimeMidnight() - switch state { - case 1: - bf.WriteUint32(uint32(midnight.Unix())) - bf.WriteUint32(uint32(midnight.Add(3 * 24 * time.Hour).Unix())) - bf.WriteUint32(uint32(midnight.Add(13 * 24 * time.Hour).Unix())) - bf.WriteUint32(uint32(midnight.Add(20 * 24 * time.Hour).Unix())) - case 2: - bf.WriteUint32(uint32(midnight.Add(-3 * 24 * time.Hour).Unix())) - bf.WriteUint32(uint32(midnight.Unix())) - bf.WriteUint32(uint32(midnight.Add(10 * 24 * time.Hour).Unix())) - bf.WriteUint32(uint32(midnight.Add(17 * 24 * time.Hour).Unix())) - case 3: - bf.WriteUint32(uint32(midnight.Add(-13 * 24 * time.Hour).Unix())) - bf.WriteUint32(uint32(midnight.Add(-10 * 24 * time.Hour).Unix())) - bf.WriteUint32(uint32(midnight.Unix())) - bf.WriteUint32(uint32(midnight.Add(7 * 24 * time.Hour).Unix())) - default: - bf.WriteBytes(make([]byte, 16)) - bf.WriteUint32(uint32(TimeAdjusted().Unix())) // TS Current Time - bf.WriteUint8(3) - bf.WriteBytes(make([]byte, 4)) - doAckBufSucceed(s, pkt.AckHandle, bf.Data()) - return - } - bf.WriteUint32(uint32(TimeAdjusted().Unix())) // TS Current Time - bf.WriteUint8(3) - 
ps.Uint8(bf, "", false) - bf.WriteUint16(0) // numEvents - bf.WriteUint8(0) // numCups - - /* - struct event - uint32 eventID - uint16 unk - uint16 unk - uint32 unk - psUint8 name - - struct cup - uint32 cupID - uint16 unk - uint16 unk - uint16 unk - psUint8 name - psUint16 desc - */ - - doAckBufSucceed(s, pkt.AckHandle, bf.Data()) -} - // Festa timing constants (all values in seconds) const ( festaVotingDuration = 9000 // 150 min voting window diff --git a/server/channelserver/handlers_guild.go b/server/channelserver/handlers_guild.go index 11b060d41..0e5b5d132 100644 --- a/server/channelserver/handlers_guild.go +++ b/server/channelserver/handlers_guild.go @@ -372,7 +372,39 @@ func handleMsgMhfReadGuildcard(s *Session, p mhfpacket.MHFPacket) { func handleMsgMhfEntryRookieGuild(s *Session, p mhfpacket.MHFPacket) { pkt := p.(*mhfpacket.MsgMhfEntryRookieGuild) - doAckSimpleFail(s, pkt.AckHandle, make([]byte, 4)) + + // pkt.Unk==0: fresh rookie entering a rookie guild (return_type=1). + // pkt.Unk>=1: returning player entering a comeback/return guild (return_type=2). 
+ returnType := uint8(1) + nameTemplate := s.server.i18n.guild.rookieGuildName + if pkt.Unk >= 1 { + returnType = 2 + nameTemplate = s.server.i18n.guild.returnGuildName + } + + guildID, err := s.server.guildRepo.FindOrCreateReturnGuild(returnType, nameTemplate) + if err != nil { + s.logger.Error("failed to find/create return guild", + zap.Uint32("charID", s.charID), + zap.Error(err), + ) + doAckSimpleFail(s, pkt.AckHandle, make([]byte, 4)) + return + } + + if err := s.server.guildRepo.AddMember(guildID, s.charID); err != nil { + s.logger.Error("failed to add character to return guild", + zap.Uint32("charID", s.charID), + zap.Uint32("guildID", guildID), + zap.Error(err), + ) + doAckSimpleFail(s, pkt.AckHandle, make([]byte, 4)) + return + } + + bf := byteframe.NewByteFrame() + bf.WriteUint32(guildID) + doAckSimpleSucceed(s, pkt.AckHandle, bf.Data()) } func handleMsgMhfUpdateForceGuildRank(s *Session, p mhfpacket.MHFPacket) {} // stub: unimplemented diff --git a/server/channelserver/handlers_guild_info.go b/server/channelserver/handlers_guild_info.go index dc7dc8182..4542e00f9 100644 --- a/server/channelserver/handlers_guild_info.go +++ b/server/channelserver/handlers_guild_info.go @@ -98,9 +98,9 @@ func handleMsgMhfInfoGuild(s *Session, p mhfpacket.MHFPacket) { bf.WriteInt8(int8(FestivalColorCodes[guild.FestivalColor])) bf.WriteUint32(guild.RankRP) bf.WriteBytes(guildLeaderName) - bf.WriteUint32(0) // Unk - bf.WriteBool(false) // isReturnGuild - bf.WriteBool(false) // earnedSpecialHall + bf.WriteUint32(0) // Unk + bf.WriteBool(guild.ReturnType > 0) // isReturnGuild + bf.WriteBool(false) // earnedSpecialHall bf.WriteUint8(2) bf.WriteUint8(2) bf.WriteUint32(guild.EventRP) // Skipped if last byte is <2? 
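The `Unk` → `return_type` mapping used by `handleMsgMhfEntryRookieGuild`, and the `isReturnGuild` flag the guild info response derives from it, can be sketched as small pure functions. This is an illustrative sketch only — the helper names `returnGuildType` and `isReturnGuild` are not identifiers from the codebase:

```go
package main

import "fmt"

// returnGuildType mirrors the branch in handleMsgMhfEntryRookieGuild:
// Unk == 0 is a fresh rookie joining a rookie guild (return_type = 1);
// any Unk >= 1 is a returning player joining a comeback guild (return_type = 2).
func returnGuildType(unk uint32) uint8 {
	if unk >= 1 {
		return 2
	}
	return 1
}

// isReturnGuild mirrors the InfoGuild response flag: any nonzero
// return_type marks the guild as a temporary return/rookie guild.
func isReturnGuild(returnType uint8) bool {
	return returnType > 0
}

func main() {
	for _, unk := range []uint32{0, 1, 2} {
		rt := returnGuildType(unk)
		fmt.Printf("Unk=%d -> return_type=%d isReturnGuild=%v\n", unk, rt, isReturnGuild(rt))
	}
}
```

Regular guilds keep `return_type = 0`, so they continue to report `isReturnGuild = false` through the same code path.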
diff --git a/server/channelserver/handlers_guild_ops.go b/server/channelserver/handlers_guild_ops.go index 4fa1ce6ed..faff03246 100644 --- a/server/channelserver/handlers_guild_ops.go +++ b/server/channelserver/handlers_guild_ops.go @@ -125,6 +125,13 @@ func handleMsgMhfOperateGuild(s *Session, p mhfpacket.MHFPacket) { s.logger.Error("Failed to exchange guild event RP", zap.Error(err)) } bf.WriteUint32(balance) + case mhfpacket.OperateGuildGraduateRookie, mhfpacket.OperateGuildGraduateReturn: + // Player graduates (leaves) a temporary return/rookie guild. + // No extra packet data — just remove and succeed. + isApplicant := characterGuildInfo != nil && characterGuildInfo.IsApplicant + if _, err := s.server.guildService.Leave(s.charID, guild.ID, isApplicant, guild.Name); err != nil { + s.logger.Error("Failed to graduate from return guild", zap.Error(err)) + } default: s.logger.Error("unhandled operate guild action", zap.Uint8("action", uint8(pkt.Action))) } diff --git a/server/channelserver/handlers_guild_scout.go b/server/channelserver/handlers_guild_scout.go index 8e390afb4..ab5ea8042 100644 --- a/server/channelserver/handlers_guild_scout.go +++ b/server/channelserver/handlers_guild_scout.go @@ -53,7 +53,7 @@ func handleMsgMhfCancelGuildScout(s *Session, p mhfpacket.MHFPacket) { return } - err = s.server.guildRepo.CancelInvitation(guild.ID, pkt.InvitationID) + err = s.server.guildRepo.CancelInvite(pkt.InvitationID) if err != nil { doAckBufFail(s, pkt.AckHandle, make([]byte, 4)) @@ -123,28 +123,25 @@ func handleMsgMhfGetGuildScoutList(s *Session, p mhfpacket.MHFPacket) { } } - chars, err := s.server.guildRepo.ListInvitedCharacters(guildInfo.ID) + invites, err := s.server.guildRepo.ListInvites(guildInfo.ID) if err != nil { - s.logger.Error("failed to retrieve scouted characters", zap.Error(err)) + s.logger.Error("failed to retrieve scout invites", zap.Error(err)) doAckSimpleSucceed(s, pkt.AckHandle, make([]byte, 4)) return } bf := byteframe.NewByteFrame() bf.SetBE() 
- bf.WriteUint32(uint32(len(chars))) + bf.WriteUint32(uint32(len(invites))) - for _, sc := range chars { - // This seems to be used as a unique ID for the invitation sent - // we can just use the charID and then filter on guild_id+charID when performing operations - // this might be a problem later with mails sent referencing IDs but we'll see. - bf.WriteUint32(sc.CharID) - bf.WriteUint32(sc.ActorID) - bf.WriteUint32(sc.CharID) - bf.WriteUint32(uint32(TimeAdjusted().Unix())) - bf.WriteUint16(sc.HR) - bf.WriteUint16(sc.GR) - bf.WriteBytes(stringsupport.PaddedString(sc.Name, 32, true)) + for _, inv := range invites { + bf.WriteUint32(inv.ID) + bf.WriteUint32(inv.ActorID) + bf.WriteUint32(inv.CharID) + bf.WriteUint32(uint32(inv.InvitedAt.Unix())) + bf.WriteUint16(inv.HR) + bf.WriteUint16(inv.GR) + bf.WriteBytes(stringsupport.PaddedString(inv.Name, 32, true)) } doAckBufSucceed(s, pkt.AckHandle, bf.Data()) diff --git a/server/channelserver/handlers_guild_scout_test.go b/server/channelserver/handlers_guild_scout_test.go index f41973a05..f601a5264 100644 --- a/server/channelserver/handlers_guild_scout_test.go +++ b/server/channelserver/handlers_guild_scout_test.go @@ -12,7 +12,7 @@ func TestAnswerGuildScout_Accept(t *testing.T) { server := createMockServer() mailMock := &mockMailRepo{} guildMock := &mockGuildRepo{ - application: &GuildApplication{GuildID: 10, CharID: 1}, + hasInviteResult: true, } guildMock.guild = &Guild{ID: 10, Name: "TestGuild"} guildMock.guild.LeaderCharID = 50 @@ -29,8 +29,8 @@ func TestAnswerGuildScout_Accept(t *testing.T) { handleMsgMhfAnswerGuildScout(session, pkt) - if guildMock.acceptedCharID != 1 { - t.Errorf("AcceptApplication charID = %d, want 1", guildMock.acceptedCharID) + if guildMock.acceptInviteCharID != 1 { + t.Errorf("AcceptInvite charID = %d, want 1", guildMock.acceptInviteCharID) } if len(mailMock.sentMails) != 2 { t.Fatalf("Expected 2 mails (self + leader), got %d", len(mailMock.sentMails)) @@ -47,7 +47,7 @@ func 
TestAnswerGuildScout_Decline(t *testing.T) { server := createMockServer() mailMock := &mockMailRepo{} guildMock := &mockGuildRepo{ - application: &GuildApplication{GuildID: 10, CharID: 1}, + hasInviteResult: true, } guildMock.guild = &Guild{ID: 10, Name: "TestGuild"} guildMock.guild.LeaderCharID = 50 @@ -64,8 +64,8 @@ func TestAnswerGuildScout_Decline(t *testing.T) { handleMsgMhfAnswerGuildScout(session, pkt) - if guildMock.rejectedCharID != 1 { - t.Errorf("RejectApplication charID = %d, want 1", guildMock.rejectedCharID) + if guildMock.declineInviteCharID != 1 { + t.Errorf("DeclineInvite charID = %d, want 1", guildMock.declineInviteCharID) } if len(mailMock.sentMails) != 2 { t.Fatalf("Expected 2 mails (self + leader), got %d", len(mailMock.sentMails)) @@ -101,7 +101,7 @@ func TestAnswerGuildScout_ApplicationMissing(t *testing.T) { server := createMockServer() mailMock := &mockMailRepo{} guildMock := &mockGuildRepo{ - application: nil, // no application found + hasInviteResult: false, // no invite found } guildMock.guild = &Guild{ID: 10, Name: "TestGuild"} guildMock.guild.LeaderCharID = 50 @@ -134,7 +134,7 @@ func TestAnswerGuildScout_MailError(t *testing.T) { server := createMockServer() mailMock := &mockMailRepo{sendErr: errNotFound} guildMock := &mockGuildRepo{ - application: &GuildApplication{GuildID: 10, CharID: 1}, + hasInviteResult: true, } guildMock.guild = &Guild{ID: 10, Name: "TestGuild"} guildMock.guild.LeaderCharID = 50 diff --git a/server/channelserver/handlers_guild_test.go b/server/channelserver/handlers_guild_test.go index 073a4ddc0..4b16ceacd 100644 --- a/server/channelserver/handlers_guild_test.go +++ b/server/channelserver/handlers_guild_test.go @@ -923,23 +923,36 @@ func TestCheckMonthlyItem_UnknownType(t *testing.T) { } func TestHandleMsgMhfEntryRookieGuild(t *testing.T) { - server := createMockServer() - session := createMockSession(1, server) - - pkt := &mhfpacket.MsgMhfEntryRookieGuild{ - AckHandle: 12345, - Unk: 42, + tests := []struct { + 
name string + unk uint32 + }{ + {"rookie (Unk=0)", 0}, + {"comeback (Unk=1)", 1}, + {"comeback with hr (Unk=2)", 2}, } + for _, tt := range tests { + t.Run(tt.name, func(t *testing.T) { + server := createMockServer() + server.guildRepo = &mockGuildRepo{} + session := createMockSession(1, server) - handleMsgMhfEntryRookieGuild(session, pkt) + pkt := &mhfpacket.MsgMhfEntryRookieGuild{ + AckHandle: 12345, + Unk: tt.unk, + } - select { - case p := <-session.sendPackets: - if len(p.data) == 0 { - t.Error("Response packet should have data") - } - default: - t.Error("No response packet queued") + handleMsgMhfEntryRookieGuild(session, pkt) + + select { + case p := <-session.sendPackets: + if len(p.data) == 0 { + t.Error("Response packet should have data") + } + default: + t.Error("No response packet queued") + } + }) } } diff --git a/server/channelserver/handlers_items.go b/server/channelserver/handlers_items.go index 83ab57b5b..ce2c10f52 100644 --- a/server/channelserver/handlers_items.go +++ b/server/channelserver/handlers_items.go @@ -9,11 +9,6 @@ import ( "go.uber.org/zap" ) -func handleMsgMhfTransferItem(s *Session, p mhfpacket.MHFPacket) { - pkt := p.(*mhfpacket.MsgMhfTransferItem) - doAckSimpleSucceed(s, pkt.AckHandle, []byte{0x00, 0x00, 0x00, 0x00}) -} - func handleMsgMhfEnumeratePrice(s *Session, p mhfpacket.MHFPacket) { pkt := p.(*mhfpacket.MsgMhfEnumeratePrice) bf := byteframe.NewByteFrame() @@ -50,10 +45,6 @@ func handleMsgMhfEnumeratePrice(s *Session, p mhfpacket.MHFPacket) { doAckBufSucceed(s, pkt.AckHandle, bf.Data()) } -func handleMsgMhfEnumerateOrder(s *Session, p mhfpacket.MHFPacket) { - pkt := p.(*mhfpacket.MsgMhfEnumerateOrder) - stubEnumerateNoResults(s, pkt.AckHandle) -} func handleMsgMhfGetExtraInfo(s *Session, p mhfpacket.MHFPacket) { pkt := p.(*mhfpacket.MsgMhfGetExtraInfo) diff --git a/server/channelserver/handlers_misc.go b/server/channelserver/handlers_misc.go index 0e64453e1..525a13e84 100644 --- a/server/channelserver/handlers_misc.go +++ 
b/server/channelserver/handlers_misc.go @@ -147,11 +147,39 @@ func handleMsgMhfGetSenyuDailyCount(s *Session, p mhfpacket.MHFPacket) { doAckBufSucceed(s, pkt.AckHandle, bf.Data()) } -func handleMsgMhfGetDailyMissionMaster(s *Session, p mhfpacket.MHFPacket) {} // stub: unimplemented +// handleMsgMhfGetDailyMissionMaster returns an empty daily mission master list. +// The full response format is not yet reverse-engineered; count=0 is safe. +func handleMsgMhfGetDailyMissionMaster(s *Session, p mhfpacket.MHFPacket) { + if p == nil { + return + } + pkt := p.(*mhfpacket.MsgMhfGetDailyMissionMaster) + bf := byteframe.NewByteFrame() + bf.WriteUint32(0) // entry count = 0 + doAckBufSucceed(s, pkt.AckHandle, bf.Data()) +} -func handleMsgMhfGetDailyMissionPersonal(s *Session, p mhfpacket.MHFPacket) {} // stub: unimplemented +// handleMsgMhfGetDailyMissionPersonal returns an empty personal daily mission progress list. +// The full response format is not yet reverse-engineered; count=0 is safe. +func handleMsgMhfGetDailyMissionPersonal(s *Session, p mhfpacket.MHFPacket) { + if p == nil { + return + } + pkt := p.(*mhfpacket.MsgMhfGetDailyMissionPersonal) + bf := byteframe.NewByteFrame() + bf.WriteUint32(0) // entry count = 0 + doAckBufSucceed(s, pkt.AckHandle, bf.Data()) +} -func handleMsgMhfSetDailyMissionPersonal(s *Session, p mhfpacket.MHFPacket) {} // stub: unimplemented +// handleMsgMhfSetDailyMissionPersonal acknowledges a personal daily mission progress write. +// The full request/response format is not yet reverse-engineered. 
+func handleMsgMhfSetDailyMissionPersonal(s *Session, p mhfpacket.MHFPacket) { + if p == nil { + return + } + pkt := p.(*mhfpacket.MsgMhfSetDailyMissionPersonal) + doAckSimpleSucceed(s, pkt.AckHandle, make([]byte, 4)) +} // Equip skin history buffer sizes per game version const ( diff --git a/server/channelserver/handlers_quest.go b/server/channelserver/handlers_quest.go index 2914b0974..cf5fd713f 100644 --- a/server/channelserver/handlers_quest.go +++ b/server/channelserver/handlers_quest.go @@ -107,10 +107,9 @@ func handleMsgSysGetFile(s *Session, p mhfpacket.MHFPacket) { ) } filename := fmt.Sprintf("%d_0_0_0_S%d_T%d_C%d", pkt.ScenarioIdentifer.CategoryID, pkt.ScenarioIdentifer.MainID, pkt.ScenarioIdentifer.Flags, pkt.ScenarioIdentifer.ChapterID) - // Read the scenario file. - data, err := os.ReadFile(filepath.Join(s.server.erupeConfig.BinPath, fmt.Sprintf("scenarios/%s.bin", filename))) + data, err := loadScenarioBinary(s, filename) if err != nil { - s.logger.Error("Failed to open scenario file", zap.String("binPath", s.server.erupeConfig.BinPath), zap.String("filename", filename)) + s.logger.Error("Failed to open scenario file", zap.String("binPath", s.server.erupeConfig.BinPath), zap.String("filename", filename), zap.Error(err)) doAckBufFail(s, pkt.AckHandle, nil) return } @@ -127,9 +126,9 @@ func handleMsgSysGetFile(s *Session, p mhfpacket.MHFPacket) { pkt.Filename = seasonConversion(s, pkt.Filename) } - data, err := os.ReadFile(filepath.Join(s.server.erupeConfig.BinPath, fmt.Sprintf("quests/%s.bin", pkt.Filename))) + data, err := loadQuestBinary(s, pkt.Filename) if err != nil { - s.logger.Error("Failed to open quest file", zap.String("binPath", s.server.erupeConfig.BinPath), zap.String("filename", pkt.Filename)) + s.logger.Error("Failed to open quest file", zap.String("binPath", s.server.erupeConfig.BinPath), zap.String("filename", pkt.Filename), zap.Error(err)) doAckBufFail(s, pkt.AckHandle, nil) return } @@ -141,10 +140,54 @@ func handleMsgSysGetFile(s 
*Session, p mhfpacket.MHFPacket) { } func questFileExists(s *Session, filename string) bool { - _, err := os.Stat(filepath.Join(s.server.erupeConfig.BinPath, fmt.Sprintf("quests/%s.bin", filename))) + base := filepath.Join(s.server.erupeConfig.BinPath, "quests", filename) + if _, err := os.Stat(base + ".bin"); err == nil { + return true + } + _, err := os.Stat(base + ".json") return err == nil } +// loadQuestBinary loads a quest file by name, trying .bin first then .json. +// For .json files it compiles the JSON to the MHF binary wire format. +func loadQuestBinary(s *Session, filename string) ([]byte, error) { + base := filepath.Join(s.server.erupeConfig.BinPath, "quests", filename) + + if data, err := os.ReadFile(base + ".bin"); err == nil { + return data, nil + } + + jsonData, err := os.ReadFile(base + ".json") + if err != nil { + return nil, err + } + compiled, err := CompileQuestJSON(jsonData) + if err != nil { + return nil, fmt.Errorf("compile quest JSON %s: %w", filename, err) + } + return compiled, nil +} + +// loadScenarioBinary loads a scenario file by name, trying .bin first then .json. +// For .json files it compiles the JSON to the MHF binary wire format. 
+func loadScenarioBinary(s *Session, filename string) ([]byte, error) { + base := filepath.Join(s.server.erupeConfig.BinPath, "scenarios", filename) + + if data, err := os.ReadFile(base + ".bin"); err == nil { + return data, nil + } + + jsonData, err := os.ReadFile(base + ".json") + if err != nil { + return nil, err + } + compiled, err := CompileScenarioJSON(jsonData) + if err != nil { + return nil, fmt.Errorf("compile scenario JSON %s: %w", filename, err) + } + return compiled, nil +} + func seasonConversion(s *Session, questFile string) string { // Try the seasonal override file (e.g., 00001d2 for season 2) filename := fmt.Sprintf("%s%d", questFile[:6], s.server.Season()) @@ -209,12 +252,22 @@ func loadQuestFile(s *Session, questId int) []byte { return cached } - file, err := os.ReadFile(filepath.Join(s.server.erupeConfig.BinPath, fmt.Sprintf("quests/%05dd0.bin", questId))) - if err != nil { + base := filepath.Join(s.server.erupeConfig.BinPath, fmt.Sprintf("quests/%05dd0", questId)) + var decrypted []byte + if data, err := os.ReadFile(base + ".bin"); err == nil { + decrypted = decryption.UnpackSimple(data) + } else if jsonData, err := os.ReadFile(base + ".json"); err == nil { + compiled, err := CompileQuestJSON(jsonData) + if err != nil { + s.logger.Error("loadQuestFile: failed to compile quest JSON", + zap.Int("questId", questId), zap.Error(err)) + return nil + } + decrypted = compiled + } else { return nil } - decrypted := decryption.UnpackSimple(file) if s.server.erupeConfig.RealClientMode <= cfg.Z1 && s.server.erupeConfig.DebugOptions.AutoQuestBackport { decrypted = BackportQuest(decrypted, s.server.erupeConfig.RealClientMode) } @@ -290,7 +343,34 @@ func makeEventQuest(s *Session, eq EventQuest) ([]byte, error) { } bf.WriteUint8(eq.QuestType) if eq.QuestType == QuestTypeSpecialTool { - bf.WriteBool(false) + var stamps, required int + var deadline time.Time + err := s.server.db.QueryRow(`SELECT COUNT(*) FROM campaign_state WHERE campaign_id = ( + SELECT 
campaign_id + FROM campaign_rewards + WHERE item_type = 9 + AND item_id = $1 + LIMIT 1 + ) AND character_id = $2`, eq.QuestID, s.charID).Scan(&stamps) + if err != nil { + bf.WriteBool(false) + } else { + err = s.server.db.QueryRow(`SELECT stamps, end_time + FROM campaigns + WHERE id = ( + SELECT campaign_id + FROM campaign_rewards + WHERE item_type = 9 + AND item_id = $1 + LIMIT 1 + )`, eq.QuestID).Scan(&required, &deadline) + required = campaignRequiredStamps(required) + if err == nil && stamps >= required && deadline.After(time.Now()) { + bf.WriteBool(true) + } else { + bf.WriteBool(false) + } + } } else { bf.WriteBool(true) } @@ -654,7 +734,6 @@ func getTuneValueRange(start uint16, value uint16) []tuneValue { return tv } -func handleMsgMhfEnterTournamentQuest(s *Session, p mhfpacket.MHFPacket) {} // stub: unimplemented func handleMsgMhfGetUdBonusQuestInfo(s *Session, p mhfpacket.MHFPacket) { pkt := p.(*mhfpacket.MsgMhfGetUdBonusQuestInfo) diff --git a/server/channelserver/handlers_quest_test.go b/server/channelserver/handlers_quest_test.go index 16f3def5b..61e2a5fd8 100644 --- a/server/channelserver/handlers_quest_test.go +++ b/server/channelserver/handlers_quest_test.go @@ -604,17 +604,16 @@ func TestQuestFileLoadingErrors(t *testing.T) { } } -// TestTournamentQuestEntryStub tests the stub tournament quest handler -func TestTournamentQuestEntryStub(t *testing.T) { +// TestTournamentQuestEntryHandler tests the tournament quest entry handler. 
+func TestTournamentQuestEntryHandler(t *testing.T) { mockConn := &MockCryptConn{sentPackets: make([][]byte, 0)} s := createTestSession(mockConn) + s.server.tournamentRepo = &mockTournamentRepo{} - pkt := &mhfpacket.MsgMhfEnterTournamentQuest{} + pkt := &mhfpacket.MsgMhfEnterTournamentQuest{AckHandle: 1} - // This tests that the stub function doesn't panic handleMsgMhfEnterTournamentQuest(s, pkt) - // Verify no crash occurred (pass if we reach here) if s.logger == nil { t.Errorf("Session corrupted") } diff --git a/server/channelserver/handlers_reward.go b/server/channelserver/handlers_reward.go index 9e6a04cdf..56a3a5dd8 100644 --- a/server/channelserver/handlers_reward.go +++ b/server/channelserver/handlers_reward.go @@ -1,8 +1,6 @@ package channelserver import ( - "encoding/hex" - "erupe-ce/common/byteframe" "erupe-ce/network/mhfpacket" ) @@ -18,21 +16,44 @@ func handleMsgMhfGetAdditionalBeatReward(s *Session, p mhfpacket.MHFPacket) { func handleMsgMhfGetUdRankingRewardList(s *Session, p mhfpacket.MHFPacket) { pkt := p.(*mhfpacket.MsgMhfGetUdRankingRewardList) - // Temporary canned response - data, _ := hex.DecodeString("0100001600000A5397DF00000000000000000000000000000000") - doAckBufSucceed(s, pkt.AckHandle, data) + // RankingRewardList: u16 count + count × 14-byte entries. + // Entry: u8 rank_type, u16 rank_from, u16 rank_to, u8 item_type, + // u32 item_id, u32 quantity. No padding gaps. 
+ bf := byteframe.NewByteFrame() + bf.WriteUint16(0) // count = 0 (no entries configured) + doAckBufSucceed(s, pkt.AckHandle, bf.Data()) } func handleMsgMhfGetRewardSong(s *Session, p mhfpacket.MHFPacket) { pkt := p.(*mhfpacket.MsgMhfGetRewardSong) - // Temporary canned response - data, _ := hex.DecodeString("0100001600000A5397DF00000000000000000000000000000000") - doAckBufSucceed(s, pkt.AckHandle, data) + // RE-confirmed layout (22 bytes): + // +0x00 u8 error + // +0x01 u8 usage_count + // +0x02 u32 prayer_id + // +0x06 u32 prayer_end (0xFFFFFFFF = no active prayer) + // then 4 × (u8 color_error, u8 color_id, u8 color_usage_count) + bf := byteframe.NewByteFrame() + bf.WriteUint8(0) // error + bf.WriteUint8(0) // usage_count + bf.WriteUint32(0) // prayer_id + bf.WriteUint32(0xFFFFFFFF) // prayer_end: no active prayer + for colorID := uint8(1); colorID <= 4; colorID++ { + bf.WriteUint8(0) // color_error + bf.WriteUint8(colorID) // color_id + bf.WriteUint8(0) // color_usage_count + } + doAckBufSucceed(s, pkt.AckHandle, bf.Data()) } -func handleMsgMhfUseRewardSong(s *Session, p mhfpacket.MHFPacket) {} // stub: unimplemented +func handleMsgMhfUseRewardSong(s *Session, p mhfpacket.MHFPacket) { + pkt := p.(*mhfpacket.MsgMhfUseRewardSong) + doAckSimpleSucceed(s, pkt.AckHandle, []byte{0x00}) +} -func handleMsgMhfAddRewardSongCount(s *Session, p mhfpacket.MHFPacket) {} // stub: unimplemented +func handleMsgMhfAddRewardSongCount(s *Session, p mhfpacket.MHFPacket) { + pkt := p.(*mhfpacket.MsgMhfAddRewardSongCount) + doAckSimpleSucceed(s, pkt.AckHandle, []byte{0x00}) +} func handleMsgMhfAcquireMonthlyReward(s *Session, p mhfpacket.MHFPacket) { pkt := p.(*mhfpacket.MsgMhfAcquireMonthlyReward) diff --git a/server/channelserver/handlers_reward_test.go b/server/channelserver/handlers_reward_test.go index 028b20865..a11fe515f 100644 --- a/server/channelserver/handlers_reward_test.go +++ b/server/channelserver/handlers_reward_test.go @@ -70,26 +70,34 @@ func 
TestHandleMsgMhfUseRewardSong(t *testing.T) { server := createMockServer() session := createMockSession(1, server) - defer func() { - if r := recover(); r != nil { - t.Errorf("handleMsgMhfUseRewardSong panicked: %v", r) - } - }() + pkt := &mhfpacket.MsgMhfUseRewardSong{AckHandle: 12345} + handleMsgMhfUseRewardSong(session, pkt) - handleMsgMhfUseRewardSong(session, nil) + select { + case p := <-session.sendPackets: + if len(p.data) == 0 { + t.Error("Response packet should have data") + } + default: + t.Error("No response packet queued") + } } func TestHandleMsgMhfAddRewardSongCount(t *testing.T) { server := createMockServer() session := createMockSession(1, server) - defer func() { - if r := recover(); r != nil { - t.Errorf("handleMsgMhfAddRewardSongCount panicked: %v", r) - } - }() + pkt := &mhfpacket.MsgMhfAddRewardSongCount{AckHandle: 42} + handleMsgMhfAddRewardSongCount(session, pkt) - handleMsgMhfAddRewardSongCount(session, nil) + select { + case p := <-session.sendPackets: + if len(p.data) == 0 { + t.Error("Response packet should have data") + } + default: + t.Error("No response packet queued") + } } func TestHandleMsgMhfAcquireMonthlyReward(t *testing.T) { @@ -193,16 +201,15 @@ func TestEmptyHandlers_MiscFiles_Reward(t *testing.T) { server := createMockServer() session := createMockSession(1, server) - tests := []struct { + // Handlers that accept nil and take no action (no AckHandle). 
+ nilSafeTests := []struct { name string fn func() }{ - {"handleMsgMhfUseRewardSong", func() { handleMsgMhfUseRewardSong(session, nil) }}, - {"handleMsgMhfAddRewardSongCount", func() { handleMsgMhfAddRewardSongCount(session, nil) }}, {"handleMsgMhfAcceptReadReward", func() { handleMsgMhfAcceptReadReward(session, nil) }}, } - for _, tt := range tests { + for _, tt := range nilSafeTests { t.Run(tt.name, func(t *testing.T) { defer func() { if r := recover(); r != nil { @@ -212,4 +219,18 @@ func TestEmptyHandlers_MiscFiles_Reward(t *testing.T) { tt.fn() }) } + + // handleMsgMhfUseRewardSong is a real handler (requires a typed packet). + t.Run("handleMsgMhfUseRewardSong", func(t *testing.T) { + pkt := &mhfpacket.MsgMhfUseRewardSong{AckHandle: 1} + handleMsgMhfUseRewardSong(session, pkt) + select { + case p := <-session.sendPackets: + if len(p.data) == 0 { + t.Error("handleMsgMhfUseRewardSong: response should have data") + } + default: + t.Error("handleMsgMhfUseRewardSong: no response queued") + } + }) } diff --git a/server/channelserver/handlers_simple_test.go b/server/channelserver/handlers_simple_test.go index 97f68b653..44ebb1464 100644 --- a/server/channelserver/handlers_simple_test.go +++ b/server/channelserver/handlers_simple_test.go @@ -33,16 +33,18 @@ func TestHandlerMsgMhfSexChanger(t *testing.T) { func TestHandlerMsgMhfEnterTournamentQuest(t *testing.T) { server := createMockServer() + server.tournamentRepo = &mockTournamentRepo{} session := createMockSession(1, server) - // Should not panic with nil packet (empty handler) + pkt := &mhfpacket.MsgMhfEnterTournamentQuest{AckHandle: 1} + defer func() { if r := recover(); r != nil { t.Errorf("handleMsgMhfEnterTournamentQuest panicked: %v", r) } }() - handleMsgMhfEnterTournamentQuest(session, nil) + handleMsgMhfEnterTournamentQuest(session, pkt) } func TestHandlerMsgMhfGetUdBonusQuestInfo(t *testing.T) { @@ -293,7 +295,6 @@ func TestEmptyHandlers_NoDb(t *testing.T) { {"handleMsgMhfKickExportForce", 
handleMsgMhfKickExportForce},
 		{"handleMsgSysSetStatus", handleMsgSysSetStatus},
 		{"handleMsgSysEcho", handleMsgSysEcho},
-		{"handleMsgMhfEnterTournamentQuest", handleMsgMhfEnterTournamentQuest},
 	}
 
 	for _, tt := range tests {
diff --git a/server/channelserver/handlers_tactics.go b/server/channelserver/handlers_tactics.go
index 7ad787995..6ec231a7d 100644
--- a/server/channelserver/handlers_tactics.go
+++ b/server/channelserver/handlers_tactics.go
@@ -2,29 +2,121 @@ package channelserver
 
 import (
 	"encoding/hex"
+	"fmt"
+	"strconv"
+
+	"erupe-ce/common/byteframe"
 	"erupe-ce/network/mhfpacket"
+
+	"go.uber.org/zap"
 )
 
 func handleMsgMhfGetUdTacticsPoint(s *Session, p mhfpacket.MHFPacket) {
 	// Diva defense interception points
 	pkt := p.(*mhfpacket.MsgMhfGetUdTacticsPoint)
-	// Temporary canned response
-	data, _ := hex.DecodeString("000000A08F0BE2DAE30BE30AE2EAE2E9E2E8E2F5E2F3E2F2E2F1E2BB")
-	doAckBufSucceed(s, pkt.AckHandle, data)
+
+	pointsMap, err := s.server.divaRepo.GetCharacterInterceptionPoints(s.charID)
+	if err != nil {
+		s.logger.Warn("Failed to get interception points", zap.Uint32("charID", s.charID), zap.Error(err))
+		pointsMap = map[string]int{}
+	}
+
+	// Build per-quest list and compute total.
+	type questEntry struct {
+		questFileID int
+		points      int
+	}
+	var entries []questEntry
+	var total int
+	for k, pts := range pointsMap {
+		qid, err := strconv.Atoi(k)
+		if err != nil {
+			continue
+		}
+		entries = append(entries, questEntry{qid, pts})
+		total += pts
+	}
+
+	bf := byteframe.NewByteFrame()
+	bf.WriteUint32(uint32(total))
+	bf.WriteUint32(uint32(len(entries)))
+	for _, e := range entries {
+		bf.WriteUint32(uint32(e.questFileID))
+		bf.WriteUint32(uint32(e.points))
+	}
+
+	doAckBufSucceed(s, pkt.AckHandle, bf.Data())
 }
 
+// udTacticsQuestMin and udTacticsQuestMax bound the allowed range of interception quest file IDs.
+const ( + udTacticsQuestMin = 58079 + udTacticsQuestMax = 58083 +) + func handleMsgMhfAddUdTacticsPoint(s *Session, p mhfpacket.MHFPacket) { pkt := p.(*mhfpacket.MsgMhfAddUdTacticsPoint) - stubEnumerateNoResults(s, pkt.AckHandle) + questFileID := int(pkt.QuestID) + points := int(pkt.TacticsPoints) + + if questFileID < udTacticsQuestMin || questFileID > udTacticsQuestMax { + s.logger.Warn("AddUdTacticsPoint: quest file ID out of range", + zap.Int("questFileID", questFileID), + zap.String("range", fmt.Sprintf("%d-%d", udTacticsQuestMin, udTacticsQuestMax))) + doAckSimpleSucceed(s, pkt.AckHandle, []byte{0x00, 0x00, 0x00, 0x00}) + return + } + + if points > 0 { + if err := s.server.divaRepo.AddInterceptionPoints(s.charID, questFileID, points); err != nil { + s.logger.Warn("Failed to add interception points", + zap.Uint32("charID", s.charID), + zap.Int("questFileID", questFileID), + zap.Int("points", points), + zap.Error(err)) + } + } + + doAckSimpleSucceed(s, pkt.AckHandle, []byte{0x00, 0x00, 0x00, 0x00}) +} + +func writeDivaPrizeList(bf *byteframe.ByteFrame, prizes []DivaPrize) { + bf.WriteUint32(uint32(len(prizes))) + for _, p := range prizes { + bf.WriteUint32(uint32(p.PointsReq)) + bf.WriteUint16(uint16(p.ItemType)) + bf.WriteUint16(uint16(p.ItemID)) + bf.WriteUint16(uint16(p.Quantity)) + if p.GR { + bf.WriteUint8(1) + } else { + bf.WriteUint8(0) + } + if p.Repeatable { + bf.WriteUint8(1) + } else { + bf.WriteUint8(0) + } + } } func handleMsgMhfGetUdTacticsRewardList(s *Session, p mhfpacket.MHFPacket) { - // Diva defense interception + // Diva defense interception reward list pkt := p.(*mhfpacket.MsgMhfGetUdTacticsRewardList) - // Temporary canned response - data, _ := 
hex.DecodeString("000094000000010732DD00010000000000010732DD00010100000000C8071F2800050100000000C80705C000050000000001901A000001F40000000001901A000001F40100000002580705C00005000000000258071F2800050100000003201A000003E80100000003201A000003E80000000003E81A000004B00100000003E81A000004B00000000004B01A000005DC0100000004B01A000005DC0000000005781A000008FC0100000005781A000008FC0000000006401A000009C40000000006401A000009C40100000007081A00000BB80100000007081A00000BB80000000007D00725FA00010000000007D01A00000CE40000000007D00725FC00010100000007D00725FB00010100000007D00725FA00010100000007D01A00000CE40100000007D00725FC00010000000007D00725FB0001000000000BB80705C00005000000000BB8071F280005010000000FA01A00000DAC000000000FA01A00000DAC0100000013880705C00005000000001388071F2800050100000017700725FE00010100000017700725FD00010100000017700725FF00010100000017700725FD00010000000017700725FE00010000000017700725FF0001000000001B581A00000E74000000001B581A00000E74010000001F400727D00005010000001F400727D000050000000023281A00000FA00000000023281A00000FA00100000027100736EF000100000000271007369600010100000027100736EF00010100000027100736EF0001000000002EE00727D10005010000002EE00727D100050000000036B01D000000010100000036B01D00000001000000003A980737DB0001010000003A980736EF00010000000046500725E600010100000046500725E60001000000004E200738C90001010000004E200736EF00010000000055F01A000010680100000055F01A000010680000000061A80736EF00010000000061A80739A600010100000065900727D200050000000065900727D20005010000007530073A0600010100000075300736EF00010000000075300736EF00010000000075300736EF00010100000084D01D000000020000000084D01D00000002010000009C400727D30005010000009C400727D3000500000000B3B01A0000119400000000B3B01A0000119401000000C3500727D4000500000000C3500727D4000501000000D2F01D0000000300000000D2F01D0000000301000000EA600736EF000100000000EA600736EF000101000000F6181A0000125C00000000F6181A0000125C0100000111700727D500050000000111700727D500050100000119400727D600050100000119400727D600050000000121101D000000040000000121101D00000004
0100000130B01A000013880000000130B01A000013880100000140500727D700050000000140500727D700050100000148201D000000050000000148201D00000005010000014FF01A000014B4000000014FF01A000014B4010000015F900736EF0001000000015F900736EF00010100000167600729EA00050000000167600729EA0005010000016F301D00000006010000016F301D00000006000000017ED00729EB0005000000017ED00729EB0005010000018E701A0000157C010000018E701A0000157C0000000196401D000000070000000196401D00000007010000019E100729EC0005000000019E100729EC000501000001ADB00727CD000100000001ADB00727CD000101000001BD501D0000000800000001BD501D0000000801000001CCF01A0000164401000001CCF01A0000164400000001E4601D0000000901000001E4601D0000000900000001EC300727CC000101000001EC300727CC0001000000020B701D0000000A000000020B701D0000000A010000023A501A0000170C010000023A501A0000170C0000000249F00736EF00010100000249F00736EF00010000000271001A000017D40100000271001A000017D400000002A7B01A0000189C01000002A7B01A0000189C00000002BF200736EF000100000002BF200736EF000101000002D6901A0000196401000002D6901A00001964000000030D400727CB0001000000030D400727CB00010100000343F01A00001A2C0100000343F01A00001A2C0000000372D0072CB0000F0000000372D0072CB0000F01000003A9801A00001BBC00000003A9801A00001BBC01000003F7A01A000003E800010003F7A01A000003E80101000445C01A000003E80101000445C01A000003E80001005E000000020704020005010000000002070402000500000000000307040200140000000000030704020014010000000005071D200003010000000005071D20000300000000000607040200140100000000060704020014000000000008071D210003010000000008071D21000300000000000A070402001401000000000A070402001400000000000C0722EC000501000000000C0722ED000500000000000C0722F2000500000000000C0722EC000500000000000C0722EF000500000000000C0722ED000501000000000C0722F2000501000000000C0722EF000501000000000D1A000003E801000000000D1A000003E800000000000F07357C000501000000000F07357D000501000000000F07357C000500000000000F07357D00050000000000111A000007D00000000000111A000007D00100000000141C00000001000000000014071D2200030000000000141C00000001010000000014071D2200030100000000160735
7D000701000000001607357C00070000000000160704020028000000000016070402002801000000001607357C000701000000001607357D0007000000000018071D270003000000000018071D27000301000000001A1A00000BB800000000001A1A00000BB801000000001C07357D000701000000001C070402002801000000001C07357D000700000000001C07357C000700000000001C070402002800000000001C07357C000701000000001E070402003C01000000001E070402003C000000000020071D26000301000000002007357C000700000000002007357D000700000000002007357C000701000000002007357D0007010000000020071D260003000000000023071D280003010000000023071D28000300000000002A070402003C00000000002A070402003C01000000002C0725EE000100000000002C0725EE000101000000002E070402005001000000002E07357D000A01000000002E070402005000000000002E07357C000A00000000002E07357D000A00000000002E07357C000A0100000000300725ED00010000000000300725ED0001010000000032071D200003010000000032071D200003000000000034072C7B0001000000000034072C7B0001010000000037071D210003000000000037071D21000301000000003C0722F1000A00000000003C0722F1000A01000000004107040200500000000000410704020050010000000046071D220003010000000046071D22000300000000004B071D27000301000000004B071D2700030000000000500722F1000F0100000000500722F1000F0000000000550704020050010000000055070402005000000000005A071D26000301000000005A071D26000300000000005F071D28000300000000005F071D2800030100000000641A0000C3500100000000641A0000C3500000002607000E00C8000000010000000307000F0032000000010000000307001000320000000100000003070011003200000001000000030700120032000000010000000307000E0096000000040000000A07000F0028000000040000000A0700100028000000040000000A0700110028000000040000000A0700120028000000040000000A07000E00640000000B0000001907000F001E0000000B00000019070010001E0000000B00000019070011001E0000000B00000019070012001E0000000B0000001907000E00320000001A0000002807000F00140000001A0000002807001000140000001A0000002807001100140000001A0000002807001200140000001A0000002807000E001E000000290000004607000F000A0000002900000046070010000A000000290000004607001100010000002900000046070012000A0000002900
00004607000E0019000000470000006407000F0008000000470000006407001000080000004700000064070011000100000047000000640700120008000000470000006407000E000F000000650000009607000F0006000000650000009607001000010000006500000096070011000600000065000000960700120006000000650000009607000E000500000097000001F407000F000500000097000001F4070010000500000097000001F4") - doAckBufSucceed(s, pkt.AckHandle, data) + + personal, err := s.server.divaRepo.GetPersonalPrizes() + if err != nil { + s.logger.Warn("Failed to get personal prizes", zap.Error(err)) + } + guild, err := s.server.divaRepo.GetGuildPrizes() + if err != nil { + s.logger.Warn("Failed to get guild prizes", zap.Error(err)) + } + + bf := byteframe.NewByteFrame() + writeDivaPrizeList(bf, personal) + writeDivaPrizeList(bf, guild) + + doAckBufSucceed(s, pkt.AckHandle, bf.Data()) } func handleMsgMhfGetUdTacticsFollower(s *Session, p mhfpacket.MHFPacket) { @@ -39,11 +131,18 @@ func handleMsgMhfGetUdTacticsBonusQuest(s *Session, p mhfpacket.MHFPacket) { doAckBufSucceed(s, pkt.AckHandle, data) } +// udTacticsFirstQuestBonuses are the static first-quest bonus point values. 
+var udTacticsFirstQuestBonuses = []uint32{1500, 2000, 2500, 3000, 4500}
+
 func handleMsgMhfGetUdTacticsFirstQuestBonus(s *Session, p mhfpacket.MHFPacket) {
 	pkt := p.(*mhfpacket.MsgMhfGetUdTacticsFirstQuestBonus)
-	// Temporary canned response
-	data, _ := hex.DecodeString("0500000005DC01000007D002000009C40300000BB80400001194")
-	doAckBufSucceed(s, pkt.AckHandle, data)
+	// Same wire format as the old canned response:
+	// u8 count, then u8 quest index + u32 point value per entry.
+	bf := byteframe.NewByteFrame()
+	bf.WriteUint8(uint8(len(udTacticsFirstQuestBonuses)))
+	for i, bonus := range udTacticsFirstQuestBonuses {
+		bf.WriteUint8(uint8(i))
+		bf.WriteUint32(bonus)
+	}
+	doAckBufSucceed(s, pkt.AckHandle, bf.Data())
 }
 
 func handleMsgMhfGetUdTacticsRemainingPoint(s *Session, p mhfpacket.MHFPacket) {
diff --git a/server/channelserver/handlers_tournament.go b/server/channelserver/handlers_tournament.go
index ec575199c..90bb8b685 100644
--- a/server/channelserver/handlers_tournament.go
+++ b/server/channelserver/handlers_tournament.go
@@ -3,8 +3,11 @@ package channelserver
 import (
 	"erupe-ce/common/byteframe"
 	ps "erupe-ce/common/pascalstring"
+	cfg "erupe-ce/config"
 	"erupe-ce/network/mhfpacket"
 	"time"
+
+	"go.uber.org/zap"
 )
 
 // TournamentInfo0 represents tournament information (type 0).
@@ -46,73 +49,6 @@ type TournamentInfo22 struct { Unk4 string } -func handleMsgMhfInfoTournament(s *Session, p mhfpacket.MHFPacket) { - pkt := p.(*mhfpacket.MsgMhfInfoTournament) - bf := byteframe.NewByteFrame() - - tournamentInfo0 := []TournamentInfo0{} - tournamentInfo21 := []TournamentInfo21{} - tournamentInfo22 := []TournamentInfo22{} - - switch pkt.QueryType { - case 0: - bf.WriteUint32(0) - bf.WriteUint32(uint32(len(tournamentInfo0))) - for _, tinfo := range tournamentInfo0 { - bf.WriteUint32(tinfo.ID) - bf.WriteUint32(tinfo.MaxPlayers) - bf.WriteUint32(tinfo.CurrentPlayers) - bf.WriteUint16(tinfo.Unk1) - bf.WriteUint16(tinfo.TextColor) - bf.WriteUint32(tinfo.Unk2) - bf.WriteUint32(uint32(tinfo.Time1.Unix())) - bf.WriteUint32(uint32(tinfo.Time2.Unix())) - bf.WriteUint32(uint32(tinfo.Time3.Unix())) - bf.WriteUint32(uint32(tinfo.Time4.Unix())) - bf.WriteUint32(uint32(tinfo.Time5.Unix())) - bf.WriteUint32(uint32(tinfo.Time6.Unix())) - bf.WriteUint8(tinfo.Unk3) - bf.WriteUint8(tinfo.Unk4) - bf.WriteUint32(tinfo.MinHR) - bf.WriteUint32(tinfo.MaxHR) - ps.Uint8(bf, tinfo.Unk5, true) - ps.Uint16(bf, tinfo.Unk6, true) - } - case 1: - bf.WriteUint32(uint32(TimeAdjusted().Unix())) - bf.WriteUint32(0) // Registered ID - bf.WriteUint32(0) - bf.WriteUint32(0) - bf.WriteUint8(0) - bf.WriteUint32(0) - ps.Uint8(bf, "", true) - case 2: - bf.WriteUint32(0) - bf.WriteUint32(uint32(len(tournamentInfo21))) - for _, info := range tournamentInfo21 { - bf.WriteUint32(info.Unk0) - bf.WriteUint32(info.Unk1) - bf.WriteUint32(info.Unk2) - bf.WriteUint8(info.Unk3) - } - bf.WriteUint32(uint32(len(tournamentInfo22))) - for _, info := range tournamentInfo22 { - bf.WriteUint32(info.Unk0) - bf.WriteUint32(info.Unk1) - bf.WriteUint32(info.Unk2) - bf.WriteUint8(info.Unk3) - ps.Uint8(bf, info.Unk4, true) - } - } - - doAckBufSucceed(s, pkt.AckHandle, bf.Data()) -} - -func handleMsgMhfEntryTournament(s *Session, p mhfpacket.MHFPacket) { - pkt := p.(*mhfpacket.MsgMhfEntryTournament) - 
doAckSimpleSucceed(s, pkt.AckHandle, make([]byte, 4)) -} - // TournamentReward represents a tournament reward entry. type TournamentReward struct { Unk0 uint16 @@ -120,8 +56,254 @@ type TournamentReward struct { Unk2 uint16 } +// tournamentState returns the state byte for the EnumerateRanking response. +// 0 = no tournament / before start, 1 = registration open, 2 = hunting active, +// 3 = ranking/reward period. +func tournamentState(now int64, t *Tournament) uint8 { + if t == nil || now < t.StartTime { + return 0 + } + if now <= t.EntryEnd { + return 1 + } + if now <= t.RankingEnd { + return 2 + } + return 3 +} + +func handleMsgMhfEnumerateRanking(s *Session, p mhfpacket.MHFPacket) { + pkt := p.(*mhfpacket.MsgMhfEnumerateRanking) + bf := byteframe.NewByteFrame() + + now := TimeAdjusted().Unix() + tournament, err := s.server.tournamentRepo.GetActive(now) + if err != nil { + s.logger.Error("Failed to get active tournament for EnumerateRanking", zap.Error(err)) + } + + if tournament == nil { + // No active tournament: write zeroed timestamps, current time, state 0, empty data. 
+ bf.WriteBytes(make([]byte, 16)) + bf.WriteUint32(uint32(now)) + bf.WriteUint8(0) + ps.Uint8(bf, "", false) + bf.WriteUint16(0) // numEvents + bf.WriteUint8(0) // numCups + doAckBufSucceed(s, pkt.AckHandle, bf.Data()) + return + } + + state := tournamentState(now, tournament) + + bf.WriteUint32(uint32(tournament.StartTime)) + bf.WriteUint32(uint32(tournament.EntryEnd)) + bf.WriteUint32(uint32(tournament.RankingEnd)) + bf.WriteUint32(uint32(tournament.RewardEnd)) + bf.WriteUint32(uint32(now)) + bf.WriteUint8(state) + ps.Uint8(bf, tournament.Name, true) + + subEvents, err := s.server.tournamentRepo.GetSubEvents() + if err != nil { + s.logger.Error("Failed to get tournament sub-events", zap.Error(err)) + subEvents = nil + } + bf.WriteUint16(uint16(len(subEvents))) + for _, se := range subEvents { + bf.WriteUint32(se.ID) + bf.WriteUint16(uint16(se.CupGroup)) + bf.WriteInt16(se.EventSubType) + bf.WriteUint32(se.QuestFileID) + ps.Uint8(bf, se.Name, true) + } + + cups, err := s.server.tournamentRepo.GetCups(tournament.ID) + if err != nil { + s.logger.Error("Failed to get tournament cups", zap.Error(err)) + cups = nil + } + bf.WriteUint8(uint8(len(cups))) + for _, cup := range cups { + bf.WriteUint32(cup.ID) + bf.WriteUint16(uint16(cup.CupGroup)) + bf.WriteUint16(uint16(cup.CupType)) + bf.WriteUint16(uint16(cup.Unk)) + ps.Uint8(bf, cup.Name, true) + ps.Uint16(bf, cup.Description, true) + } + + doAckBufSucceed(s, pkt.AckHandle, bf.Data()) +} + +func handleMsgMhfEnumerateOrder(s *Session, p mhfpacket.MHFPacket) { + pkt := p.(*mhfpacket.MsgMhfEnumerateOrder) + bf := byteframe.NewByteFrame() + + now := uint32(TimeAdjusted().Unix()) + bf.WriteUint32(pkt.EventID) + bf.WriteUint32(now) + + entries, err := s.server.tournamentRepo.GetLeaderboard(pkt.EventID) + if err != nil { + s.logger.Error("Failed to get tournament leaderboard", zap.Error(err), zap.Uint32("eventID", pkt.EventID)) + entries = nil + } + + bf.WriteUint16(uint16(len(entries))) + bf.WriteUint16(0) // unk + + for _, 
e := range entries { + bf.WriteUint32(e.CharID) + bf.WriteUint32(e.Rank) + bf.WriteUint16(e.Grade) + bf.WriteUint16(0) // pad + bf.WriteUint16(e.HR) + if s.server.erupeConfig.RealClientMode >= cfg.G10 { + bf.WriteUint16(e.GR) + } + bf.WriteUint16(0) // pad + charNameBytes := []byte(e.CharName) + guildNameBytes := []byte(e.GuildName) + bf.WriteUint8(uint8(len(charNameBytes) + 1)) + bf.WriteUint8(uint8(len(guildNameBytes) + 1)) + bf.WriteNullTerminatedBytes(charNameBytes) + bf.WriteNullTerminatedBytes(guildNameBytes) + } + + doAckBufSucceed(s, pkt.AckHandle, bf.Data()) +} + +func handleMsgMhfInfoTournament(s *Session, p mhfpacket.MHFPacket) { + pkt := p.(*mhfpacket.MsgMhfInfoTournament) + bf := byteframe.NewByteFrame() + + now := TimeAdjusted().Unix() + + switch pkt.QueryType { + case 0: + tournament, err := s.server.tournamentRepo.GetActive(now) + if err != nil { + s.logger.Error("Failed to get active tournament for InfoTournament type 0", zap.Error(err)) + } + bf.WriteUint32(0) // unk header + if tournament == nil { + bf.WriteUint32(0) // count = 0 + break + } + bf.WriteUint32(1) // count + bf.WriteUint32(tournament.ID) + bf.WriteUint32(0) // MaxPlayers + bf.WriteUint32(0) // CurrentPlayers + bf.WriteUint16(0) // Unk1 + bf.WriteUint16(0) // TextColor + bf.WriteUint32(0) // Unk2 + bf.WriteUint32(uint32(tournament.StartTime)) + bf.WriteUint32(uint32(tournament.EntryEnd)) + bf.WriteUint32(uint32(tournament.RankingEnd)) + bf.WriteUint32(uint32(tournament.RewardEnd)) + bf.WriteUint32(uint32(tournament.RewardEnd)) + bf.WriteUint32(uint32(tournament.RewardEnd)) + bf.WriteUint8(0) // Unk3 + bf.WriteUint8(0) // Unk4 + bf.WriteUint32(0) // MinHR + bf.WriteUint32(0) // MaxHR + ps.Uint8(bf, tournament.Name, true) + ps.Uint16(bf, "", false) + case 1: + // Return player registration status. 
+ bf.WriteUint32(uint32(now)) + tournament, err := s.server.tournamentRepo.GetActive(now) + if err != nil { + s.logger.Error("Failed to get active tournament for InfoTournament type 1", zap.Error(err)) + } + if tournament == nil { + bf.WriteUint32(0) // tournamentID + bf.WriteUint32(0) // entryID + bf.WriteUint32(0) + bf.WriteUint8(0) // not registered + bf.WriteUint32(0) + ps.Uint8(bf, "", true) + break + } + entry, err := s.server.tournamentRepo.GetEntry(s.charID, tournament.ID) + if err != nil { + s.logger.Error("Failed to get tournament entry for InfoTournament type 1", zap.Error(err)) + } + bf.WriteUint32(tournament.ID) + if entry != nil { + bf.WriteUint32(entry.ID) + bf.WriteUint32(0) + bf.WriteUint8(1) // registered + } else { + bf.WriteUint32(0) + bf.WriteUint32(0) + bf.WriteUint8(0) // not registered + } + bf.WriteUint32(0) + ps.Uint8(bf, tournament.Name, true) + case 2: + // Return empty lists (reward structures unknown). + bf.WriteUint32(0) + bf.WriteUint32(0) // count type 21 + bf.WriteUint32(0) // count type 22 + } + + doAckBufSucceed(s, pkt.AckHandle, bf.Data()) +} + +func handleMsgMhfEntryTournament(s *Session, p mhfpacket.MHFPacket) { + pkt := p.(*mhfpacket.MsgMhfEntryTournament) + now := TimeAdjusted().Unix() + + tournament, err := s.server.tournamentRepo.GetActive(now) + if err != nil { + s.logger.Error("Failed to get active tournament for EntryTournament", zap.Error(err)) + doAckSimpleSucceed(s, pkt.AckHandle, make([]byte, 4)) + return + } + if tournament == nil { + doAckSimpleSucceed(s, pkt.AckHandle, make([]byte, 4)) + return + } + + entryID, err := s.server.tournamentRepo.Register(s.charID, tournament.ID) + if err != nil { + s.logger.Error("Failed to register for tournament", zap.Error(err), + zap.Uint32("charID", s.charID), zap.Uint32("tournamentID", tournament.ID)) + doAckSimpleSucceed(s, pkt.AckHandle, make([]byte, 4)) + return + } + + bf := byteframe.NewByteFrame() + bf.WriteUint32(entryID) + doAckBufSucceed(s, pkt.AckHandle, bf.Data()) +} 
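Reviewer note: the registration/hunting/ranking thresholds used by `tournamentState` above can be exercised in isolation. A minimal sketch with invented timestamps (the real handler reads them from the `tournaments` table via the repo):

```go
package main

import "fmt"

// State byte values mirroring the tournamentState doc comment.
const (
	stateNone    = 0 // no tournament / before start
	stateEntry   = 1 // registration open
	stateHunting = 2 // hunting active
	stateRanking = 3 // ranking / reward period
)

// phaseState applies the same inclusive thresholds as tournamentState:
// entry while now <= entryEnd, hunting while now <= rankingEnd, then ranking.
func phaseState(now, start, entryEnd, rankingEnd int64) uint8 {
	switch {
	case now < start:
		return stateNone
	case now <= entryEnd:
		return stateEntry
	case now <= rankingEnd:
		return stateHunting
	default:
		return stateRanking
	}
}

func main() {
	// Hypothetical schedule: registration for 1h, hunting until t+3h.
	start, entryEnd, rankingEnd := int64(1000), int64(4600), int64(11800)
	for _, now := range []int64{500, 2000, 8000, 20000} {
		fmt.Println(now, "->", phaseState(now, start, entryEnd, rankingEnd))
	}
}
```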
+ +func handleMsgMhfEnterTournamentQuest(s *Session, p mhfpacket.MHFPacket) { + pkt := p.(*mhfpacket.MsgMhfEnterTournamentQuest) + s.logger.Debug("EnterTournamentQuest", + zap.Uint32("tournamentID", pkt.TournamentID), + zap.Uint32("entryHandle", pkt.EntryHandle), + zap.Uint32("unk2", pkt.Unk2), + zap.Uint32("questSlot", pkt.QuestSlot), + zap.Uint32("stageHandle", pkt.StageHandle), + ) + if err := s.server.tournamentRepo.SubmitResult( + s.charID, + pkt.TournamentID, + pkt.Unk2, + pkt.QuestSlot, + pkt.StageHandle, + ); err != nil { + s.logger.Error("Failed to submit tournament result", zap.Error(err)) + } + doAckSimpleSucceed(s, pkt.AckHandle, make([]byte, 4)) +} + func handleMsgMhfAcquireTournament(s *Session, p mhfpacket.MHFPacket) { pkt := p.(*mhfpacket.MsgMhfAcquireTournament) + // Reward item IDs are unknown. Return an empty reward list. rewards := []TournamentReward{} bf := byteframe.NewByteFrame() bf.WriteUint8(uint8(len(rewards))) diff --git a/server/channelserver/handlers_users.go b/server/channelserver/handlers_users.go index d2f2f7fb0..713a40487 100644 --- a/server/channelserver/handlers_users.go +++ b/server/channelserver/handlers_users.go @@ -1,10 +1,41 @@ package channelserver import ( + "encoding/binary" + "erupe-ce/common/bfutil" + "erupe-ce/common/stringsupport" "erupe-ce/network/mhfpacket" "go.uber.org/zap" ) +// User binary expected sizes and offsets (from mhfo-hd.dll RE). +// Types 4-5 are accepted by the server but never sent by the ZZ client. 
+const ( + userBinaryNameMaxSize = 17 // Type 1: SJIS null-terminated name + userBinaryProfileSize = 208 // Type 2: 0xD0 — player profile + userBinaryEquipSize = 384 // Type 3: 0x180 — equipment/appearance + + // Type 2 profile offsets + profileNameOff = 0x0C // 25-byte SJIS name + profileNameLen = 25 + profileIntroOff = 0x25 // 35-byte SJIS self-introduction + profileIntroLen = 35 + profileGuildIDOff = 0x48 // u32 guild ID + + // Type 3 equipment offsets + equipHROff = 0x00 // u16 HR (XOR'd with session key) + equipWeaponOff = 0x08 // 12-byte weapon entry + equipHeadOff = 0x18 // 12-byte head armor entry + equipChestOff = 0x24 // 12-byte chest armor entry + equipArmsOff = 0x30 // 12-byte arms armor entry + equipWaistOff = 0x3C // 12-byte waist armor entry + equipLegsOff = 0x48 // 12-byte legs armor entry + equipGuildIDOff = 0x64 // u32 guild ID + equipGenderOff = 0x68 // u8 gender flag + equipSharpnessOff = 0x69 // u8 sharpness level + equipEntrySize = 12 // Each equipment entry: 3x u32 +) + func handleMsgSysInsertUser(s *Session, p mhfpacket.MHFPacket) {} // stub: unimplemented func handleMsgSysDeleteUser(s *Session, p mhfpacket.MHFPacket) {} // stub: unimplemented @@ -15,6 +46,9 @@ func handleMsgSysSetUserBinary(s *Session, p mhfpacket.MHFPacket) { s.logger.Warn("Invalid BinaryType", zap.Uint8("type", pkt.BinaryType)) return } + + logUserBinaryFields(s, pkt.BinaryType, pkt.RawDataPayload) + s.server.userBinary.Set(s.charID, pkt.BinaryType, pkt.RawDataPayload) s.server.BroadcastMHF(&mhfpacket.MsgSysNotifyUserBinary{ @@ -23,6 +57,101 @@ func handleMsgSysSetUserBinary(s *Session, p mhfpacket.MHFPacket) { }, s) } +// logUserBinaryFields parses and logs the structured fields of a user binary +// payload based on its type. Logs a warning if the payload size does not match +// the expected format from the client RE. 
+func logUserBinaryFields(s *Session, binaryType uint8, data []byte) { + switch binaryType { + case 1: + logUserBinaryName(s, data) + case 2: + logUserBinaryProfile(s, data) + case 3: + logUserBinaryEquipment(s, data) + default: + s.logger.Info("User binary received (unknown type)", + zap.Uint8("type", binaryType), + zap.Int("size", len(data)), + zap.Uint32("charID", s.charID), + ) + } +} + +// logUserBinaryName parses type 1: character name (SJIS, null-terminated). +func logUserBinaryName(s *Session, data []byte) { + if len(data) == 0 { + s.logger.Warn("User binary type 1 (name): empty payload", + zap.Uint32("charID", s.charID), + ) + return + } + if len(data) > userBinaryNameMaxSize { + s.logger.Warn("User binary type 1 (name): payload exceeds expected max", + zap.Int("size", len(data)), + zap.Int("expected_max", userBinaryNameMaxSize), + zap.Uint32("charID", s.charID), + ) + } + name := stringsupport.SJISToUTF8Lossy(bfutil.UpToNull(data)) + s.logger.Info("User binary type 1 (name)", + zap.String("name", name), + zap.Int("size", len(data)), + zap.Uint32("charID", s.charID), + ) +} + +// logUserBinaryProfile parses type 2: player profile (208 bytes). 
+func logUserBinaryProfile(s *Session, data []byte) { + if len(data) != userBinaryProfileSize { + s.logger.Warn("User binary type 2 (profile): unexpected size", + zap.Int("size", len(data)), + zap.Int("expected", userBinaryProfileSize), + zap.Uint32("charID", s.charID), + ) + return + } + nameBytes := bfutil.UpToNull(data[profileNameOff : profileNameOff+profileNameLen]) + name := stringsupport.SJISToUTF8Lossy(nameBytes) + + introBytes := bfutil.UpToNull(data[profileIntroOff : profileIntroOff+profileIntroLen]) + intro := stringsupport.SJISToUTF8Lossy(introBytes) + + guildID := binary.BigEndian.Uint32(data[profileGuildIDOff : profileGuildIDOff+4]) + + s.logger.Info("User binary type 2 (profile)", + zap.String("name", name), + zap.String("self_intro", intro), + zap.Uint32("guild_id", guildID), + zap.Int("size", len(data)), + zap.Uint32("charID", s.charID), + ) +} + +// logUserBinaryEquipment parses type 3: equipment/appearance (384 bytes). +func logUserBinaryEquipment(s *Session, data []byte) { + if len(data) != userBinaryEquipSize { + s.logger.Warn("User binary type 3 (equipment): unexpected size", + zap.Int("size", len(data)), + zap.Int("expected", userBinaryEquipSize), + zap.Uint32("charID", s.charID), + ) + return + } + hr := binary.BigEndian.Uint16(data[equipHROff : equipHROff+2]) + guildID := binary.BigEndian.Uint32(data[equipGuildIDOff : equipGuildIDOff+4]) + gender := data[equipGenderOff] + sharpness := data[equipSharpnessOff] + + s.logger.Info("User binary type 3 (equipment)", + zap.Uint16("hr_xored", hr), + zap.Uint32("guild_id", guildID), + zap.Uint8("gender", gender), + zap.Uint8("sharpness", sharpness), + zap.Int("size", len(data)), + zap.Uint32("charID", s.charID), + ) +} + func handleMsgSysGetUserBinary(s *Session, p mhfpacket.MHFPacket) { pkt := p.(*mhfpacket.MsgSysGetUserBinary) diff --git a/server/channelserver/lang_en.go b/server/channelserver/lang_en.go new file mode 100644 index 000000000..797c79388 --- /dev/null +++ 
b/server/channelserver/lang_en.go @@ -0,0 +1,99 @@ +package channelserver + +func langEnglish() i18n { + var i i18n + + i.language = "English" + i.cafe.reset = "Resets on %d/%d" + i.timer = "Time: %02d:%02d:%02d.%03d (%df)" + + i.commands.noOp = "You don't have permission to use this command" + i.commands.disabled = "%s command is disabled" + i.commands.reload = "Reloading players..." + i.commands.playtime = "Playtime: %d hours %d minutes %d seconds" + + i.commands.kqf.get = "KQF: %x" + i.commands.kqf.set.error = "Error in command. Format: %s set xxxxxxxxxxxxxxxx" + i.commands.kqf.set.success = "KQF set, please switch Land/World" + i.commands.kqf.version = "This command is disabled prior to MHFG10" + i.commands.rights.error = "Error in command. Format: %s x" + i.commands.rights.success = "Set rights integer: %d" + i.commands.course.error = "Error in command. Format: %s " + i.commands.course.disabled = "%s Course disabled" + i.commands.course.enabled = "%s Course enabled" + i.commands.course.locked = "%s Course is locked" + i.commands.teleport.error = "Error in command. Format: %s x y" + i.commands.teleport.success = "Teleporting to %d %d" + i.commands.psn.error = "Error in command. Format: %s " + i.commands.psn.success = "Connected PSN ID: %s" + i.commands.psn.exists = "PSN ID is connected to another account!" + + i.commands.discord.success = "Your Discord token: %s" + + i.commands.ban.noUser = "Could not find user" + i.commands.ban.success = "Successfully banned %s" + i.commands.ban.invalid = "Invalid Character ID" + i.commands.ban.error = "Error in command. Format: %s [length]" + i.commands.ban.length = " until %s" + + i.commands.timer.enabled = "Quest timer enabled" + i.commands.timer.disabled = "Quest timer disabled" + + i.commands.ravi.noCommand = "No Raviente command specified!" + i.commands.ravi.start.success = "The Great Slaying will begin in a moment" + i.commands.ravi.start.error = "The Great Slaying has already begun!" 
+ i.commands.ravi.multiplier = "Raviente multiplier is currently %.2fx"
+ i.commands.ravi.res.success = "Sending resurrection support!"
+ i.commands.ravi.res.error = "Resurrection support has not been requested!"
+ i.commands.ravi.sed.success = "Sending sedation support if requested!"
+ i.commands.ravi.request = "Requesting sedation support!"
+ i.commands.ravi.error = "Raviente command not recognised!"
+ i.commands.ravi.noPlayers = "No one has joined the Great Slaying!"
+ i.commands.ravi.version = "This command is disabled outside of MHFZZ"
+
+ i.raviente.berserk = " is being held!"
+ i.raviente.extreme = " is being held!"
+ i.raviente.extremeLimited = " is being held!"
+ i.raviente.berserkSmall = " is being held!"
+
+ i.guild.rookieGuildName = "Rookie Clan %d"
+ i.guild.returnGuildName = "Return Clan %d"
+
+ i.guild.invite.title = "Invitation!"
+ i.guild.invite.body = "You have been invited to join\n「%s」\nDo you want to accept?"
+
+ i.guild.invite.success.title = "Success!"
+ i.guild.invite.success.body = "You have successfully joined\n「%s」."
+
+ i.guild.invite.accepted.title = "Accepted"
+ i.guild.invite.accepted.body = "The recipient accepted your invitation to join\n「%s」."
+
+ i.guild.invite.rejected.title = "Rejected"
+ i.guild.invite.rejected.body = "You rejected the invitation to join\n「%s」."
+
+ i.guild.invite.declined.title = "Declined"
+ i.guild.invite.declined.body = "The recipient declined your invitation to join\n「%s」."
+
+ i.beads = []Bead{
+ {1, "Bead of Storms", "A prayer bead imbued with the power of storms.\nSummons raging winds to bolster allies."},
+ {3, "Bead of Severing", "A prayer bead imbued with severing power.\nGrants allies increased cutting strength."},
+ {4, "Bead of Vitality", "A prayer bead imbued with vitality.\nBoosts the health of those around it."},
+ {8, "Bead of Healing", "A prayer bead imbued with healing power.\nProtects allies with restorative energy."},
+ {9, "Bead of Fury", "A prayer bead imbued with furious energy.\nFuels allies with battle rage."},
+ {10, "Bead of Blight", "A prayer bead imbued with miasma.\nInfuses allies with poisonous power."},
+ {11, "Bead of Power", "A prayer bead imbued with raw might.\nGrants allies overwhelming strength."},
+ {14, "Bead of Thunder", "A prayer bead imbued with lightning.\nCharges allies with electric force."},
+ {15, "Bead of Ice", "A prayer bead imbued with freezing cold.\nGrants allies chilling elemental power."},
+ {17, "Bead of Fire", "A prayer bead imbued with searing heat.\nIgnites allies with fiery elemental power."},
+ {18, "Bead of Water", "A prayer bead imbued with flowing water.\nGrants allies water elemental power."},
+ {19, "Bead of Dragon", "A prayer bead imbued with dragon energy.\nGrants allies dragon elemental power."},
+ {20, "Bead of Earth", "A prayer bead imbued with earth power.\nGrounds allies with elemental earth force."},
+ {21, "Bead of Wind", "A prayer bead imbued with swift wind.\nGrants allies increased agility."},
+ {22, "Bead of Light", "A prayer bead imbued with radiant light.\nInspires allies with luminous energy."},
+ {23, "Bead of Shadow", "A prayer bead imbued with darkness.\nInfuses allies with shadowy power."},
+ {24, "Bead of Iron", "A prayer bead imbued with iron strength.\nGrants allies fortified defence."},
+ {25, "Bead of Immunity", "A prayer bead imbued with sealing power.\nNullifies elemental weaknesses for allies."},
+ }
+
+ return i
+}
diff --git a/server/channelserver/lang_es.go b/server/channelserver/lang_es.go
new file mode 100644
index 000000000..c1f1cdd44
--- /dev/null
+++ b/server/channelserver/lang_es.go
@@ -0,0 +1,99 @@
+package channelserver
+
+func langSpanish() i18n {
+ var i i18n
+
+ i.language = "Español"
+ i.cafe.reset = "Se reinicia el %d/%d"
+ i.timer = "Tiempo: %02d:%02d:%02d.%03d (%df)"
+
+ i.commands.noOp = "No tienes permiso para usar este comando"
+ i.commands.disabled = "El comando %s está desactivado"
+ i.commands.reload = "Recargando jugadores..."
+ i.commands.playtime = "Tiempo de juego: %d hora(s) %d minuto(s) %d segundo(s)"
+
+ i.commands.kqf.get = "KQF: %x"
+ i.commands.kqf.set.error = "Error en el comando. Formato: %s set xxxxxxxxxxxxxxxx"
+ i.commands.kqf.set.success = "KQF establecido, por favor cambia de Zona/Mundo"
+ i.commands.kqf.version = "Este comando está desactivado antes de MHFG10"
+ i.commands.rights.error = "Error en el comando. Formato: %s x"
+ i.commands.rights.success = "Establecer entero de derechos: %d"
+ i.commands.course.error = "Error en el comando. Formato: %s "
+ i.commands.course.disabled = "Curso %s desactivado"
+ i.commands.course.enabled = "Curso %s activado"
+ i.commands.course.locked = "El curso %s está bloqueado"
+ i.commands.teleport.error = "Error en el comando. Formato: %s x y"
+ i.commands.teleport.success = "Teletransportando a %d %d"
+ i.commands.psn.error = "Error en el comando. Formato: %s "
+ i.commands.psn.success = "ID de PSN conectado: %s"
+ i.commands.psn.exists = "Este ID de PSN ya está asociado a otra cuenta"
+
+ i.commands.discord.success = "Tu token de Discord: %s"
+
+ i.commands.ban.noUser = "No se encontró al usuario"
+ i.commands.ban.success = "%s ha sido baneado con éxito"
+ i.commands.ban.invalid = "ID de personaje inválido"
+ i.commands.ban.error = "Error en el comando. Formato: %s [duración]"
+ i.commands.ban.length = " hasta el %s"
+
+ i.commands.timer.enabled = "Temporizador de misión activado"
+ i.commands.timer.disabled = "Temporizador de misión desactivado"
+
+ i.commands.ravi.noCommand = "No se especificó ningún comando de Raviente"
+ i.commands.ravi.start.success = "La Gran Cacería comenzará en un momento"
+ i.commands.ravi.start.error = "¡La Gran Cacería ya ha comenzado!"
+ i.commands.ravi.multiplier = "El multiplicador de Raviente es actualmente %.2fx"
+ i.commands.ravi.res.success = "¡Enviando apoyo de resurrección!"
+ i.commands.ravi.res.error = "¡El apoyo de resurrección no ha sido solicitado!"
+ i.commands.ravi.sed.success = "¡Enviando apoyo de sedación si fue solicitado!"
+ i.commands.ravi.request = "¡Solicitando apoyo de sedación!"
+ i.commands.ravi.error = "¡Comando de Raviente no reconocido!"
+ i.commands.ravi.noPlayers = "¡Nadie se ha unido a la Gran Cacería!"
+ i.commands.ravi.version = "Este comando está desactivado fuera de MHFZZ"
+
+ i.raviente.berserk = "¡ está en curso!"
+ i.raviente.extreme = "¡ está en curso!"
+ i.raviente.extremeLimited = "¡ está en curso!"
+ i.raviente.berserkSmall = "¡ está en curso!"
+
+ i.guild.rookieGuildName = "Clan Novato %d"
+ i.guild.returnGuildName = "Clan Regreso %d"
+
+ i.guild.invite.title = "¡Invitación!"
+ i.guild.invite.body = "Has sido invitado a unirte a\n「%s」\n¿Deseas aceptar?"
+
+ i.guild.invite.success.title = "¡Éxito!"
+ i.guild.invite.success.body = "Te has unido a\n「%s」 con éxito."
+
+ i.guild.invite.accepted.title = "Aceptada"
+ i.guild.invite.accepted.body = "El destinatario aceptó tu invitación para unirse a\n「%s」."
+
+ i.guild.invite.rejected.title = "Rechazada"
+ i.guild.invite.rejected.body = "Rechazaste la invitación para unirte a\n「%s」."
+
+ i.guild.invite.declined.title = "Declinada"
+ i.guild.invite.declined.body = "El destinatario declinó tu invitación para unirse a\n「%s」."
+
+ i.beads = []Bead{
+ {1, "Perla de Tormentas", "Una perla de oración imbuida con el poder de las tormentas.\nInvoca vientos furiosos para fortalecer a los aliados."},
+ {3, "Perla de Corte", "Una perla de oración imbuida con poder cortante.\nOtorga a los aliados mayor fuerza de corte."},
+ {4, "Perla de Vitalidad", "Una perla de oración imbuida con vitalidad.\nAumenta los puntos de vida de los aliados cercanos."},
+ {8, "Perla de Curación", "Una perla de oración imbuida con poder curativo.\nProtege a los aliados con energía restauradora."},
+ {9, "Perla de Furia", "Una perla de oración imbuida con energía furiosa.\nImbuye a los aliados con rabia de combate."},
+ {10, "Perla de Plaga", "Una perla de oración imbuida con miasma.\nInfunde a los aliados con poder venenoso."},
+ {11, "Perla de Poder", "Una perla de oración imbuida con fuerza bruta.\nOtorga a los aliados una fuerza abrumadora."},
+ {14, "Perla del Trueno", "Una perla de oración imbuida con rayos.\nCarga a los aliados con fuerza eléctrica."},
+ {15, "Perla de Hielo", "Una perla de oración imbuida con frío glacial.\nOtorga a los aliados poder elemental helado."},
+ {17, "Perla de Fuego", "Una perla de oración imbuida con calor abrasador.\nEnciende a los aliados con poder elemental ígneo."},
+ {18, "Perla de Agua", "Una perla de oración imbuida con agua fluyente.\nOtorga a los aliados poder elemental acuático."},
+ {19, "Perla del Dragón", "Una perla de oración imbuida con energía dracónica.\nOtorga a los aliados poder elemental dracónico."},
+ {20, "Perla de Tierra", "Una perla de oración imbuida con el poder de la tierra.\nAfianza a los aliados con fuerza elemental telúrica."},
+ {21, "Perla del Viento", "Una perla de oración imbuida con viento veloz.\nOtorga a los aliados mayor agilidad."},
+ {22, "Perla de Luz", "Una perla de oración imbuida con luz radiante.\nInspira a los aliados con energía luminosa."},
+ {23, "Perla de Sombra", "Una perla de oración imbuida con oscuridad.\nInfunde a los aliados con poder sombrío."},
+ {24, "Perla de Hierro", "Una perla de oración imbuida con la resistencia del hierro.\nOtorga a los aliados una defensa reforzada."},
+ {25, "Perla de Inmunidad", "Una perla de oración imbuida con poder de sellado.\nAnula las debilidades elementales de los aliados."},
+ }
+
+ return i
+}
diff --git a/server/channelserver/lang_fr.go b/server/channelserver/lang_fr.go
new file mode 100644
index 000000000..2db839b84
--- /dev/null
+++ b/server/channelserver/lang_fr.go
@@ -0,0 +1,99 @@
+package channelserver
+
+func langFrench() i18n {
+ var i i18n
+
+ i.language = "Français"
+ i.cafe.reset = "Réinitialisation le %d/%d"
+ i.timer = "Temps : %02d:%02d:%02d.%03d (%df)"
+
+ i.commands.noOp = "Vous n'avez pas la permission d'utiliser cette commande"
+ i.commands.disabled = "La commande %s est désactivée"
+ i.commands.reload = "Rechargement des joueurs..."
+ i.commands.playtime = "Temps de jeu : %d heure(s) %d minute(s) %d seconde(s)"
+
+ i.commands.kqf.get = "KQF : %x"
+ i.commands.kqf.set.error = "Erreur de commande. Format : %s set xxxxxxxxxxxxxxxx"
+ i.commands.kqf.set.success = "KQF défini, veuillez changer de Zone/Monde"
+ i.commands.kqf.version = "Cette commande est désactivée avant MHFG10"
+ i.commands.rights.error = "Erreur de commande. Format : %s x"
+ i.commands.rights.success = "Définir entier de droits : %d"
+ i.commands.course.error = "Erreur de commande. Format : %s "
+ i.commands.course.disabled = "Cours %s désactivé"
+ i.commands.course.enabled = "Cours %s activé"
+ i.commands.course.locked = "Le cours %s est verrouillé"
+ i.commands.teleport.error = "Erreur de commande. Format : %s x y"
+ i.commands.teleport.success = "Téléportation vers %d %d"
+ i.commands.psn.error = "Erreur de commande. Format : %s "
+ i.commands.psn.success = "ID PSN connecté : %s"
+ i.commands.psn.exists = "Cet ID PSN est déjà associé à un autre compte !"
+
+ i.commands.discord.success = "Votre jeton Discord : %s"
+
+ i.commands.ban.noUser = "Utilisateur introuvable"
+ i.commands.ban.success = "%s a été banni avec succès"
+ i.commands.ban.invalid = "ID de personnage invalide"
+ i.commands.ban.error = "Erreur de commande. Format : %s [durée]"
+ i.commands.ban.length = " jusqu'au %s"
+
+ i.commands.timer.enabled = "Minuteur de quête activé"
+ i.commands.timer.disabled = "Minuteur de quête désactivé"
+
+ i.commands.ravi.noCommand = "Aucune commande Raviente spécifiée !"
+ i.commands.ravi.start.success = "La Grande Chasse va commencer dans un instant"
+ i.commands.ravi.start.error = "La Grande Chasse a déjà commencé !"
+ i.commands.ravi.multiplier = "Le multiplicateur Raviente est actuellement de %.2fx"
+ i.commands.ravi.res.success = "Envoi du soutien de résurrection !"
+ i.commands.ravi.res.error = "Le soutien de résurrection n'a pas été demandé !"
+ i.commands.ravi.sed.success = "Envoi du soutien de sédation si demandé !"
+ i.commands.ravi.request = "Demande de soutien de sédation !"
+ i.commands.ravi.error = "Commande Raviente non reconnue !"
+ i.commands.ravi.noPlayers = "Personne n'a rejoint la Grande Chasse !"
+ i.commands.ravi.version = "Cette commande est désactivée en dehors de MHFZZ"
+
+ i.raviente.berserk = " est en cours !"
+ i.raviente.extreme = " est en cours !"
+ i.raviente.extremeLimited = " est en cours !"
+ i.raviente.berserkSmall = " est en cours !"
+
+ i.guild.rookieGuildName = "Clan Novice %d"
+ i.guild.returnGuildName = "Clan Retour %d"
+
+ i.guild.invite.title = "Invitation !"
+ i.guild.invite.body = "Vous avez été invité à rejoindre\n「%s」\nSouhaitez-vous accepter ?"
+
+ i.guild.invite.success.title = "Succès !"
+ i.guild.invite.success.body = "Vous avez rejoint\n「%s」 avec succès."
+
+ i.guild.invite.accepted.title = "Acceptée"
+ i.guild.invite.accepted.body = "Le destinataire a accepté votre invitation à rejoindre\n「%s」."
+
+ i.guild.invite.rejected.title = "Refusée"
+ i.guild.invite.rejected.body = "Vous avez refusé l'invitation à rejoindre\n「%s」."
+
+ i.guild.invite.declined.title = "Déclinée"
+ i.guild.invite.declined.body = "Le destinataire a décliné votre invitation à rejoindre\n「%s」."
+
+ i.beads = []Bead{
+ {1, "Perle des Tempêtes", "Une perle de prière imprégnée du pouvoir des tempêtes.\nInvoque des vents déchaînés pour soutenir les alliés."},
+ {3, "Perle de Tranchant", "Une perle de prière imprégnée du pouvoir tranchant.\nAccorde aux alliés une force de coupe accrue."},
+ {4, "Perle de Vitalité", "Une perle de prière imprégnée de vitalité.\nAugmente les points de vie des alliés proches."},
+ {8, "Perle de Guérison", "Une perle de prière imprégnée du pouvoir de guérison.\nProtège les alliés avec une énergie restauratrice."},
+ {9, "Perle de Fureur", "Une perle de prière imprégnée d'énergie furieuse.\nEmbrasse les alliés d'une rage au combat."},
+ {10, "Perle de Fléau", "Une perle de prière imprégnée de miasmes.\nInfuse les alliés d'un pouvoir venimeux."},
+ {11, "Perle de Puissance", "Une perle de prière imprégnée d'une force brute.\nAccorde aux alliés une force accablante."},
+ {14, "Perle du Tonnerre", "Une perle de prière imprégnée de foudre.\nCharge les alliés d'une force électrique."},
+ {15, "Perle de Glace", "Une perle de prière imprégnée d'un froid glacial.\nAccorde aux alliés un pouvoir élémentaire glacé."},
+ {17, "Perle de Feu", "Une perle de prière imprégnée d'une chaleur brûlante.\nEnflamme les alliés d'un pouvoir élémentaire ardent."},
+ {18, "Perle d'Eau", "Une perle de prière imprégnée d'eau courante.\nAccorde aux alliés un pouvoir élémentaire aquatique."},
+ {19, "Perle du Dragon", "Une perle de prière imprégnée d'énergie draconique.\nAccorde aux alliés un pouvoir élémentaire draconique."},
+ {20, "Perle de Terre", "Une perle de prière imprégnée du pouvoir de la terre.\nAncre les alliés avec une force élémentaire tellurique."},
+ {21, "Perle du Vent", "Une perle de prière imprégnée d'un vent rapide.\nAccorde aux alliés une agilité accrue."},
+ {22, "Perle de Lumière", "Une perle de prière imprégnée d'une lumière radieuse.\nInspire les alliés avec une énergie lumineuse."},
+ {23, "Perle d'Ombre", "Une perle de prière imprégnée d'obscurité.\nInfuse les alliés d'un pouvoir ténébreux."},
+ {24, "Perle de Fer", "Une perle de prière imprégnée de la résistance du fer.\nAccorde aux alliés une défense renforcée."},
+ {25, "Perle d'Immunité", "Une perle de prière imprégnée d'un pouvoir de scellement.\nAnnule les faiblesses élémentaires des alliés."},
+ }
+
+ return i
+}
diff --git a/server/channelserver/lang_jp.go b/server/channelserver/lang_jp.go
new file mode 100644
index 000000000..c4ff8b0a8
--- /dev/null
+++ b/server/channelserver/lang_jp.go
@@ -0,0 +1,99 @@
+package channelserver
+
+func langJapanese() i18n {
+ var i i18n
+
+ i.language = "日本語"
+ i.cafe.reset = "%d/%dにリセット"
+ i.timer = "タイマー:%02d'%02d\"%02d.%03d (%df)"
+
+ i.commands.noOp = "このコマンドを使用する権限がありません"
+ i.commands.disabled = "%sのコマンドは無効です"
+ i.commands.reload = "リロードします"
+ i.commands.kqf.get = "現在のキークエストフラグ:%x"
+ i.commands.kqf.set.error = "キークエコマンドエラー 例:%s set xxxxxxxxxxxxxxxx"
+ i.commands.kqf.set.success = "キークエストのフラグが更新されました。ワールド/ランドを移動してください"
+ i.commands.kqf.version = "このコマンドはMHFG10以前では無効です"
+ i.commands.rights.error = "コース更新コマンドエラー 例:%s x"
+ i.commands.rights.success = "コース情報を更新しました:%d"
+ i.commands.course.error = "コース確認コマンドエラー 例:%s "
+ i.commands.course.disabled = "%sコースは無効です"
+ i.commands.course.enabled = "%sコースは有効です"
+ i.commands.course.locked = "%sコースはロックされています"
+ i.commands.teleport.error = "テレポートコマンドエラー 構文:%s x y"
+ i.commands.teleport.success = "%d %dにテレポート"
+ i.commands.psn.error = "PSN連携コマンドエラー 例:%s "
+ i.commands.psn.success = "PSN「%s」が連携されています"
+ i.commands.psn.exists = "PSNは既存のユーザに接続されています"
+
+ i.commands.discord.success = "あなたのDiscordトークン:%s"
+
+ i.commands.ban.noUser = "ユーザーが見つかりません"
+ i.commands.ban.success = "%sをBANしました"
+ i.commands.ban.invalid = "無効なキャラクターIDです"
+ i.commands.ban.error = "コマンドエラー 例:%s [期間]"
+ i.commands.ban.length = " ~%sまで"
+
+ i.commands.playtime = "プレイ時間:%d時間%d分%d秒"
+
+ i.commands.timer.enabled = "クエストタイマーが有効になりました"
+ i.commands.timer.disabled = "クエストタイマーが無効になりました"
+
+ i.commands.ravi.noCommand = "ラヴィコマンドが指定されていません"
+ i.commands.ravi.start.success = "大討伐を開始します"
+ i.commands.ravi.start.error = "大討伐は既に開催されています"
+ i.commands.ravi.multiplier = "ラヴィダメージ倍率:x%.2f"
+ i.commands.ravi.res.success = "復活支援を実行します"
+ i.commands.ravi.res.error = "復活支援は実行されませんでした"
+ i.commands.ravi.sed.success = "鎮静支援を実行します"
+ i.commands.ravi.request = "鎮静支援を要請します"
+ i.commands.ravi.error = "ラヴィコマンドが認識されません"
+ i.commands.ravi.noPlayers = "誰も大討伐に参加していません"
+ i.commands.ravi.version = "このコマンドはMHFZZ以外では無効です"
+
+ i.raviente.berserk = "<大討伐:猛狂期>が開催されました!"
+ i.raviente.extreme = "<大討伐:猛狂期【極】>が開催されました!"
+ i.raviente.extremeLimited = "<大討伐:猛狂期【極】(制限付)>が開催されました!"
+ i.raviente.berserkSmall = "<大討伐:猛狂期(小数)>が開催されました!"
+
+ i.guild.rookieGuildName = "新米猟団%d"
+ i.guild.returnGuildName = "復帰猟団%d"
+
+ i.guild.invite.title = "猟団勧誘のご案内"
+ i.guild.invite.body = "猟団「%s」からの勧誘通知です。\n「勧誘に返答」より、返答を行ってください。"
+
+ i.guild.invite.success.title = "成功"
+ i.guild.invite.success.body = "あなたは「%s」に参加できました。"
+
+ i.guild.invite.accepted.title = "承諾されました"
+ i.guild.invite.accepted.body = "招待した狩人が「%s」への招待を承諾しました。"
+
+ i.guild.invite.rejected.title = "却下しました"
+ i.guild.invite.rejected.body = "あなたは「%s」への参加を却下しました。"
+
+ i.guild.invite.declined.title = "辞退しました"
+ i.guild.invite.declined.body = "招待した狩人が「%s」への招待を辞退しました。"
+
+ i.beads = []Bead{
+ {1, "暴風の祈珠", "暴風の力を宿した祈珠。\n嵐を呼ぶ力で仲間を鼓舞する。"},
+ {3, "断力の祈珠", "断力の力を宿した祈珠。\n斬撃の力を仲間に授ける。"},
+ {4, "活力の祈珠", "活力の力を宿した祈珠。\n体力を高める力で仲間を鼓舞する。"},
+ {8, "癒しの祈珠", "癒しの力を宿した祈珠。\n回復の力で仲間を守る。"},
+ {9, "激昂の祈珠", "激昂の力を宿した祈珠。\n怒りの力を仲間に与える。"},
+ {10, "瘴気の祈珠", "瘴気の力を宿した祈珠。\n毒霧の力を仲間に与える。"},
+ {11, "剛力の祈珠", "剛力の力を宿した祈珠。\n強大な力を仲間に授ける。"},
+ {14, "雷光の祈珠", "雷光の力を宿した祈珠。\n稲妻の力を仲間に与える。"},
+ {15, "氷結の祈珠", "氷結の力を宿した祈珠。\n冷気の力を仲間に与える。"},
+ {17, "炎熱の祈珠", "炎熱の力を宿した祈珠。\n炎の力を仲間に与える。"},
+ {18, "水流の祈珠", "水流の力を宿した祈珠。\n水の力を仲間に与える。"},
+ {19, "龍気の祈珠", "龍気の力を宿した祈珠。\n龍属性の力を仲間に与える。"},
+ {20, "大地の祈珠", "大地の力を宿した祈珠。\n大地の力を仲間に与える。"},
+ {21, "疾風の祈珠", "疾風の力を宿した祈珠。\n素早さを高める力を仲間に与える。"},
+ {22, "光輝の祈珠", "光輝の力を宿した祈珠。\n光の力で仲間を鼓舞する。"},
+ {23, "暗影の祈珠", "暗影の力を宿した祈珠。\n闇の力を仲間に与える。"},
+ {24, "鋼鉄の祈珠", "鋼鉄の力を宿した祈珠。\n防御力を高める力を仲間に与える。"},
+ {25, "封属の祈珠", "封属の力を宿した祈珠。\n属性を封じる力を仲間に与える。"},
+ }
+
+ return i
+}
diff --git a/server/channelserver/quest_json.go b/server/channelserver/quest_json.go
new file mode 100644
index 000000000..1814cfe3f
--- /dev/null
+++ b/server/channelserver/quest_json.go
@@ -0,0 +1,1101 @@
+package channelserver
+
+import (
+ "bytes"
+ "encoding/binary"
+ "encoding/json"
+ "fmt"
+ "math"
+
+ "golang.org/x/text/encoding/japanese"
+ "golang.org/x/text/transform"
+)
+
+// Objective type constants matching questObjType in questfile.bin.hexpat.
+const (
+ questObjNone = uint32(0x00000000)
+ questObjHunt = uint32(0x00000001)
+ questObjDeliver = uint32(0x00000002)
+ questObjEsoteric = uint32(0x00000010)
+ questObjCapture = uint32(0x00000101)
+ questObjSlay = uint32(0x00000201)
+ questObjDeliverFlag = uint32(0x00001002)
+ questObjBreakPart = uint32(0x00004004)
+ questObjDamage = uint32(0x00008004)
+ questObjSlayOrDamage = uint32(0x00018004)
+ questObjSlayTotal = uint32(0x00020000)
+ questObjSlayAll = uint32(0x00040000)
+)
+
+var questObjTypeMap = map[string]uint32{
+ "none": questObjNone,
+ "hunt": questObjHunt,
+ "deliver": questObjDeliver,
+ "esoteric": questObjEsoteric,
+ "capture": questObjCapture,
+ "slay": questObjSlay,
+ "deliver_flag": questObjDeliverFlag,
+ "break_part": questObjBreakPart,
+ "damage": questObjDamage,
+ "slay_or_damage": questObjSlayOrDamage,
+ "slay_total": questObjSlayTotal,
+ "slay_all": questObjSlayAll,
+}
+
+// ---- JSON schema types ----
+
+// QuestObjectiveJSON represents a single quest objective.
+type QuestObjectiveJSON struct { + // Type is one of: none, hunt, capture, slay, deliver, deliver_flag, + // break_part, damage, slay_or_damage, slay_total, slay_all, esoteric. + Type string `json:"type"` + // Target is a monster ID for hunt/capture/slay/break_part/damage, + // or an item ID for deliver/deliver_flag. + Target uint16 `json:"target"` + // Count is the quantity required (hunts, item count, etc.). + Count uint16 `json:"count"` + // Part is the monster part ID for break_part objectives. + Part uint16 `json:"part,omitempty"` +} + +// QuestRewardItemJSON is one entry in a reward table. +type QuestRewardItemJSON struct { + Rate uint16 `json:"rate"` + Item uint16 `json:"item"` + Quantity uint16 `json:"quantity"` +} + +// QuestRewardTableJSON is a named reward table with its items. +type QuestRewardTableJSON struct { + TableID uint8 `json:"table_id"` + Items []QuestRewardItemJSON `json:"items"` +} + +// QuestMonsterJSON describes one large monster spawn. +type QuestMonsterJSON struct { + ID uint8 `json:"id"` + SpawnAmount uint32 `json:"spawn_amount"` + SpawnStage uint32 `json:"spawn_stage"` + Orientation uint32 `json:"orientation"` + X float32 `json:"x"` + Y float32 `json:"y"` + Z float32 `json:"z"` +} + +// QuestSupplyItemJSON is one supply box entry. +type QuestSupplyItemJSON struct { + Item uint16 `json:"item"` + Quantity uint16 `json:"quantity"` +} + +// QuestStageJSON is a loaded stage definition. +type QuestStageJSON struct { + StageID uint32 `json:"stage_id"` +} + +// QuestForcedEquipJSON defines forced equipment per slot. +// Each slot is [equipment_id, attach1, attach2, attach3]. +// Zero values mean no restriction. 
+type QuestForcedEquipJSON struct { + Legs [4]uint16 `json:"legs,omitempty"` + Weapon [4]uint16 `json:"weapon,omitempty"` + Head [4]uint16 `json:"head,omitempty"` + Chest [4]uint16 `json:"chest,omitempty"` + Arms [4]uint16 `json:"arms,omitempty"` + Waist [4]uint16 `json:"waist,omitempty"` +} + +// QuestMinionSpawnJSON is one minion spawn entry within a map section. +type QuestMinionSpawnJSON struct { + Monster uint8 `json:"monster"` + SpawnToggle uint16 `json:"spawn_toggle"` + SpawnAmount uint32 `json:"spawn_amount"` + X float32 `json:"x"` + Y float32 `json:"y"` + Z float32 `json:"z"` +} + +// QuestMapSectionJSON defines one map section with its minion spawns. +// Each section corresponds to a loaded stage area. +type QuestMapSectionJSON struct { + LoadedStage uint32 `json:"loaded_stage"` + SpawnMonsters []uint8 `json:"spawn_monsters,omitempty"` // monster IDs for spawn type list + MinionSpawns []QuestMinionSpawnJSON `json:"minion_spawns,omitempty"` +} + +// QuestAreaTransitionJSON is one zone transition (floatSet). +type QuestAreaTransitionJSON struct { + TargetStageID1 int16 `json:"target_stage_id"` + StageVariant int16 `json:"stage_variant"` + CurrentX float32 `json:"current_x"` + CurrentY float32 `json:"current_y"` + CurrentZ float32 `json:"current_z"` + TransitionBox [5]float32 `json:"transition_box"` + TargetX float32 `json:"target_x"` + TargetY float32 `json:"target_y"` + TargetZ float32 `json:"target_z"` + TargetRotation [2]int16 `json:"target_rotation"` +} + +// QuestAreaTransitionsJSON holds the transitions for one area zone entry. +// The pointer may be null (empty transitions list) for zones without transitions. +type QuestAreaTransitionsJSON struct { + Transitions []QuestAreaTransitionJSON `json:"transitions,omitempty"` +} + +// QuestAreaMappingJSON defines coordinate mappings between area and base map. +// Layout: 32 bytes per entry (Area_xPos, Area_zPos, pad8, Base_xPos, Base_zPos, kn_Pos, pad4). 
+type QuestAreaMappingJSON struct { + AreaX float32 `json:"area_x"` + AreaZ float32 `json:"area_z"` + BaseX float32 `json:"base_x"` + BaseZ float32 `json:"base_z"` + KnPos float32 `json:"kn_pos"` +} + +// QuestMapInfoJSON contains the map ID and return base camp ID. +type QuestMapInfoJSON struct { + MapID uint32 `json:"map_id"` + ReturnBCID uint32 `json:"return_bc_id"` +} + +// QuestGatheringPointJSON is one gathering point (24 bytes). +type QuestGatheringPointJSON struct { + X float32 `json:"x"` + Y float32 `json:"y"` + Z float32 `json:"z"` + Range float32 `json:"range"` + GatheringID uint16 `json:"gathering_id"` + MaxCount uint16 `json:"max_count"` + MinCount uint16 `json:"min_count"` +} + +// QuestAreaGatheringJSON holds up to 4 gathering points for one area zone entry. +// A nil/empty list means the pointer is null for this zone. +type QuestAreaGatheringJSON struct { + Points []QuestGatheringPointJSON `json:"points,omitempty"` +} + +// QuestFacilityPointJSON is one facility point (24 bytes, facPoint in hexpat). +type QuestFacilityPointJSON struct { + Type uint16 `json:"type"` // SpecAc: 1=cooking, 2=fishing, 3=bluebox, etc. + X float32 `json:"x"` + Y float32 `json:"y"` + Z float32 `json:"z"` + Range float32 `json:"range"` + ID uint16 `json:"id"` +} + +// QuestAreaFacilitiesJSON holds the facilities block for one area zone entry. +// A nil/empty list means the pointer is null for this zone. +type QuestAreaFacilitiesJSON struct { + Points []QuestFacilityPointJSON `json:"points,omitempty"` +} + +// QuestGatherItemJSON is one entry in a gathering table. +type QuestGatherItemJSON struct { + Rate uint16 `json:"rate"` + Item uint16 `json:"item"` +} + +// QuestGatheringTableJSON is one gathering loot table. +type QuestGatheringTableJSON struct { + Items []QuestGatherItemJSON `json:"items,omitempty"` +} + +// QuestJSON is the human-readable quest definition. +// Time values: TimeLimitMinutes is converted to frames (×30×60) in the binary. 
+// Strings: encoded as UTF-8 here, converted to Shift-JIS in the binary. +type QuestJSON struct { + // Quest identification + QuestID uint16 `json:"quest_id"` + + // Text (UTF-8; converted to Shift-JIS in binary) + Title string `json:"title"` + Description string `json:"description"` + TextMain string `json:"text_main"` + TextSubA string `json:"text_sub_a"` + TextSubB string `json:"text_sub_b"` + SuccessCond string `json:"success_cond"` + FailCond string `json:"fail_cond"` + Contractor string `json:"contractor"` + + // General quest properties (generalQuestProperties section, 0x44–0x85) + MonsterSizeMulti uint16 `json:"monster_size_multi"` // 100 = 100% + SizeRange uint16 `json:"size_range"` + StatTable1 uint32 `json:"stat_table_1,omitempty"` + StatTable2 uint8 `json:"stat_table_2,omitempty"` + MainRankPoints uint32 `json:"main_rank_points"` + SubARankPoints uint32 `json:"sub_a_rank_points"` + SubBRankPoints uint32 `json:"sub_b_rank_points"` + + // Main quest properties + Fee uint32 `json:"fee"` + RewardMain uint32 `json:"reward_main"` + RewardSubA uint16 `json:"reward_sub_a"` + RewardSubB uint16 `json:"reward_sub_b"` + TimeLimitMinutes uint32 `json:"time_limit_minutes"` + Map uint32 `json:"map"` + RankBand uint16 `json:"rank_band"` + HardHRReq uint16 `json:"hard_hr_req,omitempty"` + JoinRankMin uint16 `json:"join_rank_min,omitempty"` + JoinRankMax uint16 `json:"join_rank_max,omitempty"` + PostRankMin uint16 `json:"post_rank_min,omitempty"` + PostRankMax uint16 `json:"post_rank_max,omitempty"` + + // Quest variant flags (see handlers_quest.go makeEventQuest comments) + QuestVariant1 uint8 `json:"quest_variant1,omitempty"` + QuestVariant2 uint8 `json:"quest_variant2,omitempty"` + QuestVariant3 uint8 `json:"quest_variant3,omitempty"` + QuestVariant4 uint8 `json:"quest_variant4,omitempty"` + + // Objectives + ObjectiveMain QuestObjectiveJSON `json:"objective_main"` + ObjectiveSubA QuestObjectiveJSON `json:"objective_sub_a,omitempty"` + ObjectiveSubB 
QuestObjectiveJSON `json:"objective_sub_b,omitempty"` + + // Monster spawns + LargeMonsters []QuestMonsterJSON `json:"large_monsters,omitempty"` + + // Reward tables + Rewards []QuestRewardTableJSON `json:"rewards,omitempty"` + + // Supply box (main: up to 24, sub_a/sub_b: up to 8 each) + SupplyMain []QuestSupplyItemJSON `json:"supply_main,omitempty"` + SupplySubA []QuestSupplyItemJSON `json:"supply_sub_a,omitempty"` + SupplySubB []QuestSupplyItemJSON `json:"supply_sub_b,omitempty"` + + // Loaded stages + Stages []QuestStageJSON `json:"stages,omitempty"` + + // Forced equipment (optional) + ForcedEquipment *QuestForcedEquipJSON `json:"forced_equipment,omitempty"` + + // Map sections with minion spawns (questAreaPtr) + MapSections []QuestMapSectionJSON `json:"map_sections,omitempty"` + + // Area transitions per zone (areaTransitionsPtr); one entry per zone. + // Length determines area1Zones in generalQuestProperties. + AreaTransitions []QuestAreaTransitionsJSON `json:"area_transitions,omitempty"` + + // Area coordinate mappings (areaMappingPtr) + AreaMappings []QuestAreaMappingJSON `json:"area_mappings,omitempty"` + + // Map info: map ID + return base camp ID (mapInfoPtr) + MapInfo *QuestMapInfoJSON `json:"map_info,omitempty"` + + // Per-zone gathering points (gatheringPointsPtr); one entry per zone. + GatheringPoints []QuestAreaGatheringJSON `json:"gathering_points,omitempty"` + + // Per-zone area facilities (areaFacilitiesPtr); one entry per zone. + AreaFacilities []QuestAreaFacilitiesJSON `json:"area_facilities,omitempty"` + + // Additional metadata strings (someStringsPtr / unk30). Optional. + SomeString string `json:"some_string,omitempty"` + QuestType string `json:"quest_type_string,omitempty"` + + // Gathering loot tables (gatheringTablesPtr) + GatheringTables []QuestGatheringTableJSON `json:"gathering_tables,omitempty"` +} + +// toShiftJIS converts a UTF-8 string to a null-terminated Shift-JIS byte slice. +// ASCII-only strings pass through unchanged. 
+func toShiftJIS(s string) ([]byte, error) { + enc := japanese.ShiftJIS.NewEncoder() + out, _, err := transform.Bytes(enc, []byte(s)) + if err != nil { + return nil, fmt.Errorf("shift-jis encode %q: %w", s, err) + } + return append(out, 0x00), nil +} + +// writeUint16LE writes a little-endian uint16 to buf. +func writeUint16LE(buf *bytes.Buffer, v uint16) { + b := [2]byte{} + binary.LittleEndian.PutUint16(b[:], v) + buf.Write(b[:]) +} + +// writeInt16LE writes a little-endian int16 to buf. +func writeInt16LE(buf *bytes.Buffer, v int16) { + writeUint16LE(buf, uint16(v)) +} + +// writeUint32LE writes a little-endian uint32 to buf. +func writeUint32LE(buf *bytes.Buffer, v uint32) { + b := [4]byte{} + binary.LittleEndian.PutUint32(b[:], v) + buf.Write(b[:]) +} + +// writeFloat32LE writes a little-endian IEEE-754 float32 to buf. +func writeFloat32LE(buf *bytes.Buffer, v float32) { + b := [4]byte{} + binary.LittleEndian.PutUint32(b[:], math.Float32bits(v)) + buf.Write(b[:]) +} + +// pad writes n zero bytes to buf. +func pad(buf *bytes.Buffer, n int) { + buf.Write(make([]byte, n)) +} + +// questBuilder is a small helper for building a quest binary with pointer patching. +// All pointers are absolute offsets from the start of the buffer (file start). +type questBuilder struct { + out *bytes.Buffer +} + +// reserve writes a u32(0) placeholder and returns its offset in the buffer. +func (b *questBuilder) reserve() int { + off := b.out.Len() + writeUint32LE(b.out, 0) + return off +} + +// patch writes the current buffer length as a u32 at the previously reserved offset. +func (b *questBuilder) patch(reservedOff int) { + binary.LittleEndian.PutUint32(b.out.Bytes()[reservedOff:], uint32(b.out.Len())) +} + +// patchValue writes a specific uint32 value at a previously reserved offset. +func (b *questBuilder) patchValue(reservedOff int, v uint32) { + binary.LittleEndian.PutUint32(b.out.Bytes()[reservedOff:], v) +} + +// objectiveBytes serialises one QuestObjectiveJSON to 8 bytes. 
+// Layout per hexpat objective.hexpat: +// +// u32 goalType +// if hunt/capture/slay/damage/break_part: u8 target, u8 pad +// else: u16 target +// if break_part: u16 goalPart +// else: u16 goalCount +// if none: trailing padding[4] instead of the above +func objectiveBytes(obj QuestObjectiveJSON) ([]byte, error) { + goalType, ok := questObjTypeMap[obj.Type] + if !ok { + if obj.Type == "" { + goalType = questObjNone + } else { + return nil, fmt.Errorf("unknown objective type %q", obj.Type) + } + } + + buf := &bytes.Buffer{} + writeUint32LE(buf, goalType) + + if goalType == questObjNone { + pad(buf, 4) + return buf.Bytes(), nil + } + + switch goalType { + case questObjHunt, questObjCapture, questObjSlay, questObjDamage, + questObjSlayOrDamage, questObjBreakPart: + buf.WriteByte(uint8(obj.Target)) + buf.WriteByte(0x00) + default: + writeUint16LE(buf, obj.Target) + } + + if goalType == questObjBreakPart { + writeUint16LE(buf, obj.Part) + } else { + writeUint16LE(buf, obj.Count) + } + + return buf.Bytes(), nil +} + +// CompileQuestJSON parses JSON quest data and compiles it to the MHF quest +// binary format (ZZ/G10 version, little-endian, uncompressed). 
+// +// Binary layout produced: +// +// 0x000–0x043 QuestFileHeader (68 bytes, 17 pointers) +// 0x044–0x085 generalQuestProperties (66 bytes) +// 0x086–0x1C5 mainQuestProperties (320 bytes, questBodyLenZZ) +// 0x1C6+ QuestText pointer table (32 bytes) + strings (Shift-JIS) +// aligned+ stages, supply box, reward tables, monster spawns, +// map sections, area mappings, area transitions, +// map info, gathering points, area facilities, +// some strings, gathering tables +func CompileQuestJSON(data []byte) ([]byte, error) { + var q QuestJSON + if err := json.Unmarshal(data, &q); err != nil { + return nil, fmt.Errorf("parse quest JSON: %w", err) + } + + // ── Compute counts before writing generalQuestProperties ───────────── + numZones := len(q.AreaTransitions) + numGatheringTables := len(q.GatheringTables) + + // Validate zone-length consistency. + if len(q.GatheringPoints) != 0 && len(q.GatheringPoints) != numZones { + return nil, fmt.Errorf("GatheringPoints len (%d) must equal AreaTransitions len (%d) or be 0", + len(q.GatheringPoints), numZones) + } + if len(q.AreaFacilities) != 0 && len(q.AreaFacilities) != numZones { + return nil, fmt.Errorf("AreaFacilities len (%d) must equal AreaTransitions len (%d) or be 0", + len(q.AreaFacilities), numZones) + } + + // ── Section offsets (computed as we build) ────────────────────────── + const ( + headerSize = 68 // 0x44 + genPropSize = 66 // 0x42 + mainPropSize = questBodyLenZZ // 320 = 0x140 + questTextSize = 32 // 8 × 4-byte s32p pointers + ) + + questTypeFlagsPtr := uint32(headerSize + genPropSize) // 0x86 + questStringsTablePtr := questTypeFlagsPtr + uint32(mainPropSize) // 0x1C6 + + // ── Build Shift-JIS strings ───────────────────────────────────────── + // Order matches QuestText struct: title, textMain, textSubA, textSubB, + // successCond, failCond, contractor, description. 
+	rawTexts := []string{
+		q.Title, q.TextMain, q.TextSubA, q.TextSubB,
+		q.SuccessCond, q.FailCond, q.Contractor, q.Description,
+	}
+	var sjisStrings [][]byte
+	for _, s := range rawTexts {
+		b, err := toShiftJIS(s)
+		if err != nil {
+			return nil, err
+		}
+		sjisStrings = append(sjisStrings, b)
+	}
+
+	// Compute absolute pointers for each string (right after the s32p table).
+	stringDataStart := questStringsTablePtr + uint32(questTextSize)
+	stringPtrs := make([]uint32, len(sjisStrings))
+	cursor := stringDataStart
+	for i, s := range sjisStrings {
+		stringPtrs[i] = cursor
+		cursor += uint32(len(s))
+	}
+
+	// ── Locate variable sections ─────────────────────────────────────────
+	// Offset after all string data, 4-byte aligned.
+	align4 := func(n uint32) uint32 { return (n + 3) &^ 3 }
+	afterStrings := align4(cursor)
+
+	// Stages: each Stage is u32 stageID + 12 bytes padding = 16 bytes.
+	loadedStagesPtr := afterStrings
+	stagesSize := uint32(len(q.Stages)) * 16
+	afterStages := align4(loadedStagesPtr + stagesSize)
+	// unk34 (fixedCoords1Ptr) terminates the stages loop in the hexpat.
+	unk34Ptr := afterStages
+
+	// Supply box: main=24×4, subA=8×4, subB=8×4 = 160 bytes total.
+	supplyBoxPtr := afterStages
+	const supplyBoxSize = (24 + 8 + 8) * 4
+	afterSupply := align4(supplyBoxPtr + supplyBoxSize)
+
+	// Reward tables: compute size.
+	rewardPtr := afterSupply
+	rewardBuf := buildRewardTables(q.Rewards)
+	afterRewards := align4(rewardPtr + uint32(len(rewardBuf)))
+
+	// Large monster spawns: each is 60 bytes + 1-byte terminator.
+	largeMonsterPtr := afterRewards
+	monsterBuf := buildMonsterSpawns(q.LargeMonsters)
+	afterMonsters := align4(largeMonsterPtr + uint32(len(monsterBuf)))
+
+	// ── Assemble file ────────────────────────────────────────────────────
+	qb := &questBuilder{out: &bytes.Buffer{}}
+
+	// ── Header placeholders (68 bytes) ───────────────────────────────────
+	// We'll write the header now with known values; variable section pointers
+	// that depend on the preceding variable sections are also known at this
+	// point because we computed them above. The new sections (area, gathering,
+	// etc.) will be appended after the monster spawns and patched in.
+	hdrQuestAreaOff := 0x14    // questAreaPtr placeholder
+	hdrAreaTransOff := 0x1C    // areaTransitionsPtr placeholder
+	hdrAreaMappingOff := 0x20  // areaMappingPtr placeholder
+	hdrMapInfoOff := 0x24      // mapInfoPtr placeholder
+	hdrGatherPtsOff := 0x28    // gatheringPointsPtr placeholder
+	hdrFacilitiesOff := 0x2C   // areaFacilitiesPtr placeholder
+	hdrSomeStringsOff := 0x30  // someStringsPtr placeholder
+	hdrGatherTablesOff := 0x38 // gatheringTablesPtr placeholder
+
+	writeUint32LE(qb.out, questTypeFlagsPtr) // 0x00 questTypeFlagsPtr
+	writeUint32LE(qb.out, loadedStagesPtr)   // 0x04 loadedStagesPtr
+	writeUint32LE(qb.out, supplyBoxPtr)      // 0x08 supplyBoxPtr
+	writeUint32LE(qb.out, rewardPtr)         // 0x0C rewardPtr
+	writeUint16LE(qb.out, 0)                 // 0x10 subSupplyBoxPtr (unused)
+	qb.out.WriteByte(0)                      // 0x12 hidden
+	qb.out.WriteByte(0)                      // 0x13 subSupplyBoxLen
+	writeUint32LE(qb.out, 0)                 // 0x14 questAreaPtr (patched later)
+	writeUint32LE(qb.out, largeMonsterPtr)   // 0x18 largeMonsterPtr
+	writeUint32LE(qb.out, 0)                 // 0x1C areaTransitionsPtr (patched later)
+	writeUint32LE(qb.out, 0)                 // 0x20 areaMappingPtr (patched later)
+	writeUint32LE(qb.out, 0)                 // 0x24 mapInfoPtr (patched later)
+	writeUint32LE(qb.out, 0)                 // 0x28 gatheringPointsPtr (patched later)
+	writeUint32LE(qb.out, 0)                 // 0x2C areaFacilitiesPtr (patched later)
+	writeUint32LE(qb.out, 0)                 // 0x30 someStringsPtr (patched later)
+	writeUint32LE(qb.out, unk34Ptr)          // 0x34 fixedCoords1Ptr (stages end)
+	writeUint32LE(qb.out, 0)                 // 0x38 gatheringTablesPtr (patched later)
+	writeUint32LE(qb.out, 0)                 // 0x3C fixedCoords2Ptr (null)
+	writeUint32LE(qb.out, 0)                 // 0x40 fixedInfoPtr (null)
+
+	if qb.out.Len() != headerSize {
+		return nil, fmt.Errorf("header size mismatch: got %d want %d", qb.out.Len(), headerSize)
+	}
+
+	// ── General Quest Properties (66 bytes, 0x44–0x85) ───────────────────
+	writeUint16LE(qb.out, q.MonsterSizeMulti) // 0x44 monsterSizeMulti
+	writeUint16LE(qb.out, q.SizeRange)        // 0x46 sizeRange
+	writeUint32LE(qb.out, q.StatTable1)       // 0x48 statTable1
+	writeUint32LE(qb.out, q.MainRankPoints)   // 0x4C mainRankPoints
+	writeUint32LE(qb.out, 0)                  // 0x50 unknown
+	writeUint32LE(qb.out, q.SubARankPoints)   // 0x54 subARankPoints
+	writeUint32LE(qb.out, q.SubBRankPoints)   // 0x58 subBRankPoints
+	writeUint32LE(qb.out, 0)                  // 0x5C questTypeID / unknown
+	qb.out.WriteByte(0)                       // 0x60 padding
+	qb.out.WriteByte(q.StatTable2)            // 0x61 statTable2
+	pad(qb.out, 0x11)                         // 0x62–0x72 padding
+	qb.out.WriteByte(0)                       // 0x73 questKn1
+	writeUint16LE(qb.out, 0)                  // 0x74 questKn2
+	writeUint16LE(qb.out, 0)                  // 0x76 questKn3
+	writeUint16LE(qb.out, uint16(numGatheringTables)) // 0x78 gatheringTablesQty
+	writeUint16LE(qb.out, 0)                  // 0x7A unknown
+	qb.out.WriteByte(uint8(numZones))         // 0x7C area1Zones
+	qb.out.WriteByte(0)                       // 0x7D area2Zones
+	qb.out.WriteByte(0)                       // 0x7E area3Zones
+	qb.out.WriteByte(0)                       // 0x7F area4Zones
+	writeUint16LE(qb.out, 0)                  // 0x80 unknown
+	writeUint16LE(qb.out, 0)                  // 0x82 unknown
+	writeUint16LE(qb.out, 0)                  // 0x84 unknown
+
+	if qb.out.Len() != headerSize+genPropSize {
+		return nil, fmt.Errorf("genProp size mismatch: got %d want %d", qb.out.Len(), headerSize+genPropSize)
+	}
+
+	// ── Main Quest Properties (320 bytes, 0x86–0x1C5) ────────────────────
+	mainStart := qb.out.Len()
+	qb.out.WriteByte(0)               // +0x00 unknown
+	qb.out.WriteByte(0)               // +0x01 musicMode
+	qb.out.WriteByte(0)               // +0x02 localeFlags
+	qb.out.WriteByte(0)               // +0x03 unknown
+	qb.out.WriteByte(0)               // +0x04 rankingID
+	qb.out.WriteByte(0)               // +0x05 unknown
+	writeUint16LE(qb.out, 0)          // +0x06 unknown
+	writeUint16LE(qb.out, q.RankBand) // +0x08 rankBand
+	writeUint16LE(qb.out, 0)          // +0x0A questTypeID
+	writeUint32LE(qb.out, q.Fee)        // +0x0C questFee
+	writeUint32LE(qb.out, q.RewardMain) // +0x10 rewardMain
+	writeUint32LE(qb.out, 0)            // +0x14 cartsOrReduction
+	writeUint16LE(qb.out, q.RewardSubA) // +0x18 rewardA
+	writeUint16LE(qb.out, 0)            // +0x1A padding
+	writeUint16LE(qb.out, q.RewardSubB) // +0x1C rewardB
+	writeUint16LE(qb.out, q.HardHRReq)  // +0x1E hardHRReq
+	writeUint32LE(qb.out, q.TimeLimitMinutes*60*30) // +0x20 questTime (frames at 30Hz)
+	writeUint32LE(qb.out, q.Map)                    // +0x24 questMap
+	writeUint32LE(qb.out, questStringsTablePtr)     // +0x28 questStringsPtr
+	writeUint16LE(qb.out, 0)         // +0x2C unknown
+	writeUint16LE(qb.out, q.QuestID) // +0x2E questID
+
+	// +0x30 objectives[3] (8 bytes each)
+	for _, obj := range []QuestObjectiveJSON{q.ObjectiveMain, q.ObjectiveSubA, q.ObjectiveSubB} {
+		b, err := objectiveBytes(obj)
+		if err != nil {
+			return nil, err
+		}
+		qb.out.Write(b)
+	}
+
+	// +0x48 post-objectives fields
+	qb.out.WriteByte(0)                  // +0x48 unknown
+	qb.out.WriteByte(0)                  // +0x49 unknown
+	writeUint16LE(qb.out, 0)             // +0x4A padding
+	writeUint16LE(qb.out, q.JoinRankMin) // +0x4C joinRankMin
+	writeUint16LE(qb.out, q.JoinRankMax) // +0x4E joinRankMax
+	writeUint16LE(qb.out, q.PostRankMin) // +0x50 postRankMin
+	writeUint16LE(qb.out, q.PostRankMax) // +0x52 postRankMax
+	pad(qb.out, 8)                       // +0x54 padding[8]
+
+	// +0x5C forced equipment (6 slots × 4 u16 = 48 bytes)
+	eq := q.ForcedEquipment
+	if eq == nil {
+		eq = &QuestForcedEquipJSON{}
+	}
+	for _, slot := range [][4]uint16{eq.Legs, eq.Weapon, eq.Head, eq.Chest, eq.Arms, eq.Waist} {
+		for _, v := range slot {
+			writeUint16LE(qb.out, v)
+		}
+	}
+
+	// +0x8C unknown u32
+	writeUint32LE(qb.out, 0)
+
+	// +0x90 monster variants[3] + mapVariant
+	qb.out.WriteByte(0) // monsterVariants[0]
+	qb.out.WriteByte(0) // monsterVariants[1]
+	qb.out.WriteByte(0) // monsterVariants[2]
+	qb.out.WriteByte(0) // mapVariant
+
+	// +0x94 requiredItemType (ItemID = u16), requiredItemCount
+	writeUint16LE(qb.out, 0)
+	qb.out.WriteByte(0) // requiredItemCount
+
+	// +0x97 questVariants
+	qb.out.WriteByte(q.QuestVariant1)
+	qb.out.WriteByte(q.QuestVariant2)
+	qb.out.WriteByte(q.QuestVariant3)
+	qb.out.WriteByte(q.QuestVariant4)
+
+	// +0x9B padding[5]
+	pad(qb.out, 5)
+
+	// +0xA0 allowedEquipBitmask, points
+	writeUint32LE(qb.out, 0) // allowedEquipBitmask
+	writeUint32LE(qb.out, 0) // mainPoints
+	writeUint32LE(qb.out, 0) // subAPoints
+	writeUint32LE(qb.out, 0) // subBPoints
+
+	// +0xB0 rewardItems[3] (ItemID = u16, 3 items = 6 bytes)
+	pad(qb.out, 6)
+
+	// +0xB6 interception section (non-SlayAll path: padding[3] + MonsterID[1] = 4 bytes)
+	pad(qb.out, 4)
+
+	// +0xBA padding[0xA] = 10 bytes
+	pad(qb.out, 10)
+
+	// +0xC4 questClearsAllowed
+	writeUint32LE(qb.out, 0)
+
+	// +0xC8 = 200 bytes so far for documented fields. ZZ body = 320 bytes.
+	// Zero-pad the remaining unknown ZZ-specific fields.
+	writtenInMain := qb.out.Len() - mainStart
+	if writtenInMain < mainPropSize {
+		pad(qb.out, mainPropSize-writtenInMain)
+	} else if writtenInMain > mainPropSize {
+		return nil, fmt.Errorf("mainQuestProperties overflowed: wrote %d, max %d", writtenInMain, mainPropSize)
+	}
+
+	if qb.out.Len() != int(questTypeFlagsPtr)+mainPropSize {
+		return nil, fmt.Errorf("main prop end mismatch: at %d, want %d", qb.out.Len(), int(questTypeFlagsPtr)+mainPropSize)
+	}
+
+	// ── QuestText pointer table (32 bytes) ───────────────────────────────
+	for _, ptr := range stringPtrs {
+		writeUint32LE(qb.out, ptr)
+	}
+
+	// ── String data ──────────────────────────────────────────────────────
+	for _, s := range sjisStrings {
+		qb.out.Write(s)
+	}
+
+	// Pad to afterStrings alignment.
+	for uint32(qb.out.Len()) < afterStrings {
+		qb.out.WriteByte(0)
+	}
+
+	// ── Stages ───────────────────────────────────────────────────────────
+	for _, st := range q.Stages {
+		writeUint32LE(qb.out, st.StageID)
+		pad(qb.out, 12)
+	}
+	for uint32(qb.out.Len()) < afterStages {
+		qb.out.WriteByte(0)
+	}
+
+	// ── Supply Box ───────────────────────────────────────────────────────
+	type slot struct {
+		items []QuestSupplyItemJSON
+		max   int
+	}
+	for _, section := range []slot{
+		{q.SupplyMain, 24},
+		{q.SupplySubA, 8},
+		{q.SupplySubB, 8},
+	} {
+		written := 0
+		for _, item := range section.items {
+			if written >= section.max {
+				break
+			}
+			writeUint16LE(qb.out, item.Item)
+			writeUint16LE(qb.out, item.Quantity)
+			written++
+		}
+		for written < section.max {
+			writeUint32LE(qb.out, 0)
+			written++
+		}
+	}
+
+	// ── Reward Tables ────────────────────────────────────────────────────
+	qb.out.Write(rewardBuf)
+	for uint32(qb.out.Len()) < largeMonsterPtr {
+		qb.out.WriteByte(0)
+	}
+
+	// ── Large Monster Spawns ─────────────────────────────────────────────
+	qb.out.Write(monsterBuf)
+	for uint32(qb.out.Len()) < afterMonsters {
+		qb.out.WriteByte(0)
+	}
+
+	// ── Variable sections: map sections, area mappings, transitions, etc. ──
+	// All written at afterMonsters and beyond, pointers patched into header.
+
+	// ── Map Sections (questAreaPtr) ──────────────────────────────────────
+	// Layout:
+	//   u32 ptr[0], u32 ptr[1], ..., u32(0) terminator
+	//   For each section:
+	//     mapSection: u32 loadedStage, u32 unk, u32 spawnTypesPtr, u32 spawnStatsPtr
+	//     u32(0) gap, u16 unk (= 6 bytes after mapSection)
+	//     spawnTypes data: (MonsterID u8 + pad[3]) per entry, terminated by 0xFFFF
+	//     spawnStats data: MinionSpawn (60 bytes) per entry, terminated by 0xFFFF
+	if len(q.MapSections) > 0 {
+		questAreaOff := qb.out.Len()
+		qb.patchValue(hdrQuestAreaOff, uint32(questAreaOff))
+
+		// Write pointer array (one u32 per section + terminator).
+		sectionPtrOffs := make([]int, len(q.MapSections))
+		for i := range q.MapSections {
+			sectionPtrOffs[i] = qb.reserve()
+		}
+		writeUint32LE(qb.out, 0) // terminator
+
+		// Write each mapSection block.
+		type sectionPtrs struct {
+			spawnTypesOff int
+			spawnStatsOff int
+		}
+		internalPtrs := make([]sectionPtrs, len(q.MapSections))
+
+		for i, ms := range q.MapSections {
+			// Patch the pointer-array entry to point here.
+			qb.patch(sectionPtrOffs[i])
+
+			// mapSection: loadedStage, unk, spawnTypesPtr, spawnStatsPtr
+			writeUint32LE(qb.out, ms.LoadedStage)
+			writeUint32LE(qb.out, 0) // unk
+			internalPtrs[i].spawnTypesOff = qb.reserve()
+			internalPtrs[i].spawnStatsOff = qb.reserve()
+
+			// u32(0) gap + u16 unk immediately after the 16-byte mapSection.
+			writeUint32LE(qb.out, 0)
+			writeUint16LE(qb.out, 0)
+		}
+
+		// Write spawn data for each section.
+		for i, ms := range q.MapSections {
+			// spawnTypes: varPaddT = u8 monster + pad[3] per entry.
+			// Terminated by first 2 bytes == 0xFFFF.
+			qb.patch(internalPtrs[i].spawnTypesOff)
+			for _, monID := range ms.SpawnMonsters {
+				qb.out.WriteByte(monID)
+				pad(qb.out, 3)
+			}
+			writeUint16LE(qb.out, 0xFFFF) // terminator
+
+			// Align to 4 bytes before spawnStats.
+			for qb.out.Len()%4 != 0 {
+				qb.out.WriteByte(0)
+			}
+
+			// spawnStats: MinionSpawn per entry (60 bytes), terminated by 0xFFFF.
+			qb.patch(internalPtrs[i].spawnStatsOff)
+			for _, ms2 := range ms.MinionSpawns {
+				qb.out.WriteByte(ms2.Monster)
+				qb.out.WriteByte(0)                    // padding[1]
+				writeUint16LE(qb.out, ms2.SpawnToggle) // spawnToggle
+				writeUint32LE(qb.out, ms2.SpawnAmount) // spawnAmount
+				writeUint32LE(qb.out, 0)               // unk u32
+				pad(qb.out, 0x10)                      // padding[16]
+				writeUint32LE(qb.out, 0)               // unk u32
+				writeFloat32LE(qb.out, ms2.X)
+				writeFloat32LE(qb.out, ms2.Y)
+				writeFloat32LE(qb.out, ms2.Z)
+				pad(qb.out, 0x10) // padding[16]
+			}
+			writeUint16LE(qb.out, 0xFFFF) // terminator
+
+			// Align for next section.
+			for qb.out.Len()%4 != 0 {
+				qb.out.WriteByte(0)
+			}
+		}
+	}
+
+	// ── Area Mappings (areaMappingPtr) ───────────────────────────────────
+	// Written BEFORE area transitions so the parser can use
+	// "read until areaTransitionsPtr" to know the count.
+	// Layout: AreaMappings[n] × 32 bytes each, back-to-back.
+	//   float area_xPos, float area_zPos, pad[8],
+	//   float base_xPos, float base_zPos, float kn_Pos, pad[4]
+	if len(q.AreaMappings) > 0 {
+		areaMappingOff := qb.out.Len()
+		qb.patchValue(hdrAreaMappingOff, uint32(areaMappingOff))
+
+		for _, am := range q.AreaMappings {
+			writeFloat32LE(qb.out, am.AreaX)
+			writeFloat32LE(qb.out, am.AreaZ)
+			pad(qb.out, 8)
+			writeFloat32LE(qb.out, am.BaseX)
+			writeFloat32LE(qb.out, am.BaseZ)
+			writeFloat32LE(qb.out, am.KnPos)
+			pad(qb.out, 4)
+		}
+	}
+
+	// ── Area Transitions (areaTransitionsPtr) ────────────────────────────
+	// Layout: playerAreaChange[area1Zones] = u32 ptr per zone.
+	// Then floatSet arrays for each zone with transitions.
+	if numZones > 0 {
+		areaTransOff := qb.out.Len()
+		qb.patchValue(hdrAreaTransOff, uint32(areaTransOff))
+
+		// Write pointer array.
+		zonePtrOffs := make([]int, numZones)
+		for i := range q.AreaTransitions {
+			zonePtrOffs[i] = qb.reserve()
+		}
+
+		// Write floatSet arrays for non-empty zones.
+		for i, zone := range q.AreaTransitions {
+			if len(zone.Transitions) == 0 {
+				// Null pointer — leave as 0.
+				continue
+			}
+			qb.patch(zonePtrOffs[i])
+			for _, tr := range zone.Transitions {
+				writeInt16LE(qb.out, tr.TargetStageID1)
+				writeInt16LE(qb.out, tr.StageVariant)
+				writeFloat32LE(qb.out, tr.CurrentX)
+				writeFloat32LE(qb.out, tr.CurrentY)
+				writeFloat32LE(qb.out, tr.CurrentZ)
+				for _, f := range tr.TransitionBox {
+					writeFloat32LE(qb.out, f)
+				}
+				writeFloat32LE(qb.out, tr.TargetX)
+				writeFloat32LE(qb.out, tr.TargetY)
+				writeFloat32LE(qb.out, tr.TargetZ)
+				for _, r := range tr.TargetRotation {
+					writeInt16LE(qb.out, r)
+				}
+			}
+			// Terminate with s16(-1).
+			writeInt16LE(qb.out, -1)
+			// Align.
+			for qb.out.Len()%4 != 0 {
+				qb.out.WriteByte(0)
+			}
+		}
+	}
+
+	// ── Map Info (mapInfoPtr) ────────────────────────────────────────────
+	if q.MapInfo != nil {
+		mapInfoOff := qb.out.Len()
+		qb.patchValue(hdrMapInfoOff, uint32(mapInfoOff))
+		writeUint32LE(qb.out, q.MapInfo.MapID)
+		writeUint32LE(qb.out, q.MapInfo.ReturnBCID)
+	}
+
+	// ── Gathering Points (gatheringPointsPtr) ────────────────────────────
+	// Layout: ptGatheringPoint[area1Zones] = u32 ptr per zone.
+	// Each non-null ptr points to gatheringPoint[4] terminated by xPos=-1.0.
+	if numZones > 0 && len(q.GatheringPoints) > 0 {
+		gatherPtsOff := qb.out.Len()
+		qb.patchValue(hdrGatherPtsOff, uint32(gatherPtsOff))
+
+		// Write pointer array.
+		gpPtrOffs := make([]int, numZones)
+		for i := range q.GatheringPoints {
+			gpPtrOffs[i] = qb.reserve()
+		}
+
+		// Write gathering point arrays for non-empty zones.
+		for i, zone := range q.GatheringPoints {
+			if len(zone.Points) == 0 {
+				continue
+			}
+			qb.patch(gpPtrOffs[i])
+			for _, gp := range zone.Points {
+				writeFloat32LE(qb.out, gp.X)
+				writeFloat32LE(qb.out, gp.Y)
+				writeFloat32LE(qb.out, gp.Z)
+				writeFloat32LE(qb.out, gp.Range)
+				writeUint16LE(qb.out, gp.GatheringID)
+				writeUint16LE(qb.out, gp.MaxCount)
+				pad(qb.out, 2)
+				writeUint16LE(qb.out, gp.MinCount)
+			}
+			// Terminator: xPos == -1.0 (0xBF800000).
+			writeFloat32LE(qb.out, math.Float32frombits(0xBF800000))
+			// Pad terminator entry to 24 bytes total (only wrote 4).
+			pad(qb.out, 20)
+		}
+	}
+
+	// ── Area Facilities (areaFacilitiesPtr) ──────────────────────────────
+	// Layout: ptVar[area1Zones] = u32 ptr per zone.
+	// Each non-null ptr points to a facPointBlock.
+	// facPoint: pad[2] + SpecAc(u16) + xPos + yPos + zPos + range + id(u16) + pad[2] = 24 bytes
+	// facPointBlock: facPoints[] terminated by (xPos-at-$+4 == 0xBF800000) + pad[0xC] + float + float
+	// Terminator layout: write pad[2]+type[2] then float32(-1.0) to trigger termination,
+	// then block footer: pad[0xC] + float(0) + float(0).
+	if numZones > 0 && len(q.AreaFacilities) > 0 {
+		facOff := qb.out.Len()
+		qb.patchValue(hdrFacilitiesOff, uint32(facOff))
+
+		facPtrOffs := make([]int, numZones)
+		for i := range q.AreaFacilities {
+			facPtrOffs[i] = qb.reserve()
+		}
+
+		for i, zone := range q.AreaFacilities {
+			if len(zone.Points) == 0 {
+				continue
+			}
+			qb.patch(facPtrOffs[i])
+
+			for _, fp := range zone.Points {
+				pad(qb.out, 2)                 // pad[2]
+				writeUint16LE(qb.out, fp.Type) // SpecAc type
+				writeFloat32LE(qb.out, fp.X)
+				writeFloat32LE(qb.out, fp.Y)
+				writeFloat32LE(qb.out, fp.Z)
+				writeFloat32LE(qb.out, fp.Range)
+				writeUint16LE(qb.out, fp.ID)
+				pad(qb.out, 2) // pad[2]
+			}
+
+			// Terminator: the while condition checks read_unsigned($+4,4).
+			// Write 4 bytes header (pad[2]+type[2]) then float32(-1.0).
+			pad(qb.out, 2)
+			writeUint16LE(qb.out, 0)
+			writeFloat32LE(qb.out, math.Float32frombits(0xBF800000))
+
+			// Block footer: padding[0xC] + float(0) + float(0) = 20 bytes.
+			pad(qb.out, 0xC)
+			writeFloat32LE(qb.out, 0)
+			writeFloat32LE(qb.out, 0)
+		}
+	}
+
+	// ── Some Strings (someStringsPtr / unk30) ────────────────────────────
+	// Layout at unk30: ptr someStringPtr, ptr questTypePtr (8 bytes),
+	// then the string data.
+	hasSomeStrings := q.SomeString != "" || q.QuestType != ""
+	if hasSomeStrings {
+		someStringsOff := qb.out.Len()
+		qb.patchValue(hdrSomeStringsOff, uint32(someStringsOff))
+
+		// Two pointer slots.
+		someStrPtrOff := qb.reserve()
+		questTypePtrOff := qb.reserve()
+
+		if q.SomeString != "" {
+			qb.patch(someStrPtrOff)
+			b, err := toShiftJIS(q.SomeString)
+			if err != nil {
+				return nil, err
+			}
+			qb.out.Write(b)
+		}
+
+		if q.QuestType != "" {
+			qb.patch(questTypePtrOff)
+			b, err := toShiftJIS(q.QuestType)
+			if err != nil {
+				return nil, err
+			}
+			qb.out.Write(b)
+		}
+	}
+
+	// ── Gathering Tables (gatheringTablesPtr) ────────────────────────────
+	// Layout: ptVar[gatheringTablesQty] = u32 ptr per table.
+	// Each ptr points to GatherItem[] terminated by u16(0xFFFF).
+	// GatherItem: u16 rate + u16 item = 4 bytes.
+	if numGatheringTables > 0 {
+		gatherTablesOff := qb.out.Len()
+		qb.patchValue(hdrGatherTablesOff, uint32(gatherTablesOff))
+
+		tblPtrOffs := make([]int, numGatheringTables)
+		for i := range q.GatheringTables {
+			tblPtrOffs[i] = qb.reserve()
+		}
+
+		for i, tbl := range q.GatheringTables {
+			qb.patch(tblPtrOffs[i])
+			for _, item := range tbl.Items {
+				writeUint16LE(qb.out, item.Rate)
+				writeUint16LE(qb.out, item.Item)
+			}
+			writeUint16LE(qb.out, 0xFFFF) // terminator
+		}
+	}
+
+	return qb.out.Bytes(), nil
+}
+
+// buildRewardTables serialises the reward table array and all reward item lists.
+// Layout per hexpat:
+//
+//	RewardTable[] { u8 tableId, u8 pad, u16 pad, u32 tableOffset } terminated by int16(-1)
+//	RewardItem[]  { u16 rate, u16 item, u16 quantity } terminated by int16(-1)
+func buildRewardTables(tables []QuestRewardTableJSON) []byte {
+	if len(tables) == 0 {
+		// Empty: just the terminator.
+		b := [2]byte{0xFF, 0xFF}
+		return b[:]
+	}
+
+	headers := &bytes.Buffer{}
+	itemData := &bytes.Buffer{}
+
+	// Header array size = len(tables) × 8 bytes + 2-byte terminator.
+	headerArraySize := uint32(len(tables)*8 + 2)
+
+	for _, t := range tables {
+		// tableOffset is relative to the start of rewardPtr in the file.
+		tableOffset := headerArraySize + uint32(itemData.Len())
+
+		headers.WriteByte(t.TableID)
+		headers.WriteByte(0)      // padding
+		writeUint16LE(headers, 0) // padding
+		writeUint32LE(headers, tableOffset)
+
+		for _, item := range t.Items {
+			writeUint16LE(itemData, item.Rate)
+			writeUint16LE(itemData, item.Item)
+			writeUint16LE(itemData, item.Quantity)
+		}
+		// Terminate this table's item list with -1.
+		writeUint16LE(itemData, 0xFFFF)
+	}
+	// Terminate the table header array.
+	writeUint16LE(headers, 0xFFFF)
+
+	return append(headers.Bytes(), itemData.Bytes()...)
+}
+
+// buildMonsterSpawns serialises the large monster spawn list.
+// Each entry is 60 bytes; terminated with a 0xFF byte.
+func buildMonsterSpawns(monsters []QuestMonsterJSON) []byte {
+	buf := &bytes.Buffer{}
+	for _, m := range monsters {
+		buf.WriteByte(m.ID)
+		pad(buf, 3)                       // +0x01 padding[3]
+		writeUint32LE(buf, m.SpawnAmount) // +0x04
+		writeUint32LE(buf, m.SpawnStage)  // +0x08
+		pad(buf, 16)                      // +0x0C padding[0x10]
+		writeUint32LE(buf, m.Orientation) // +0x1C
+		writeFloat32LE(buf, m.X)          // +0x20
+		writeFloat32LE(buf, m.Y)          // +0x24
+		writeFloat32LE(buf, m.Z)          // +0x28
+		pad(buf, 16)                      // +0x2C padding[0x10]
+	}
+	buf.WriteByte(0xFF) // terminator
+	return buf.Bytes()
+}
diff --git a/server/channelserver/quest_json_parser.go b/server/channelserver/quest_json_parser.go
new file mode 100644
index 000000000..b91a1efc6
--- /dev/null
+++ b/server/channelserver/quest_json_parser.go
@@ -0,0 +1,844 @@
+package channelserver
+
+import (
+	"encoding/binary"
+	"fmt"
+	"math"
+
+	"golang.org/x/text/encoding/japanese"
+	"golang.org/x/text/transform"
+)
+
+// ParseQuestBinary reads a MHF quest binary (ZZ/G10 layout, little-endian)
+// and returns a QuestJSON ready for re-compilation with CompileQuestJSON.
+//
+// The binary layout is described in quest_json.go (CompileQuestJSON).
+// Sections guarded by null pointers in the header are skipped; the
+// corresponding QuestJSON slices will be nil/empty.
+func ParseQuestBinary(data []byte) (*QuestJSON, error) {
+	if len(data) < 0x86 {
+		return nil, fmt.Errorf("quest binary too short: %d bytes (minimum 0x86)", len(data))
+	}
+
+	// ── Helper closures ──────────────────────────────────────────────────
+	u8 := func(off int) uint8 {
+		return data[off]
+	}
+	u16 := func(off int) uint16 {
+		return binary.LittleEndian.Uint16(data[off:])
+	}
+	i16 := func(off int) int16 {
+		return int16(binary.LittleEndian.Uint16(data[off:]))
+	}
+	u32 := func(off int) uint32 {
+		return binary.LittleEndian.Uint32(data[off:])
+	}
+	f32 := func(off int) float32 {
+		return math.Float32frombits(binary.LittleEndian.Uint32(data[off:]))
+	}
+
+	// check bounds-checks a read of n bytes at off.
+	check := func(off, n int, ctx string) error {
+		if off < 0 || off+n > len(data) {
+			return fmt.Errorf("%s: offset 0x%X len %d out of bounds (file len %d)", ctx, off, n, len(data))
+		}
+		return nil
+	}
+
+	// readSJIS reads a null-terminated Shift-JIS string starting at off.
+	readSJIS := func(off int) (string, error) {
+		if off < 0 || off >= len(data) {
+			return "", fmt.Errorf("string offset 0x%X out of bounds", off)
+		}
+		end := off
+		for end < len(data) && data[end] != 0 {
+			end++
+		}
+		sjis := data[off:end]
+		if len(sjis) == 0 {
+			return "", nil
+		}
+		dec := japanese.ShiftJIS.NewDecoder()
+		utf8, _, err := transform.Bytes(dec, sjis)
+		if err != nil {
+			return "", fmt.Errorf("shift-jis decode at 0x%X: %w", off, err)
+		}
+		return string(utf8), nil
+	}
+
+	q := &QuestJSON{}
+
+	// ── Header (0x00–0x43) ───────────────────────────────────────────────
+	questTypeFlagsPtr := int(u32(0x00))
+	loadedStagesPtr := int(u32(0x04))
+	supplyBoxPtr := int(u32(0x08))
+	rewardPtr := int(u32(0x0C))
+	questAreaPtr := int(u32(0x14))
+	largeMonsterPtr := int(u32(0x18))
+	areaTransitionsPtr := int(u32(0x1C))
+	areaMappingPtr := int(u32(0x20))
+	mapInfoPtr := int(u32(0x24))
+	gatheringPointsPtr := int(u32(0x28))
+	areaFacilitiesPtr := int(u32(0x2C))
+	someStringsPtr := int(u32(0x30))
+	unk34Ptr := int(u32(0x34)) // stages-end sentinel
+	gatheringTablesPtr := int(u32(0x38))
+
+	// ── General Quest Properties (0x44–0x85) ─────────────────────────────
+	q.MonsterSizeMulti = u16(0x44)
+	q.SizeRange = u16(0x46)
+	q.StatTable1 = u32(0x48)
+	q.MainRankPoints = u32(0x4C)
+	// 0x50 unknown u32 — skipped
+	q.SubARankPoints = u32(0x54)
+	q.SubBRankPoints = u32(0x58)
+	// 0x5C questTypeID/unknown — skipped
+	// 0x60 padding
+	q.StatTable2 = u8(0x61)
+	// 0x62–0x72 padding
+	// 0x73 questKn1, 0x74 questKn2, 0x76 questKn3 — skipped
+	gatheringTablesQty := int(u16(0x78))
+	// 0x7A unknown
+	area1Zones := int(u8(0x7C))
+	// 0x7D–0x7F area2–4Zones (not needed for parsing)
+
+	// ── Main Quest Properties (at questTypeFlagsPtr, 320 bytes) ──────────
+	if questTypeFlagsPtr == 0 {
+		return nil, fmt.Errorf("questTypeFlagsPtr is null; cannot read main quest properties")
+	}
+	if err := check(questTypeFlagsPtr, questBodyLenZZ, "mainQuestProperties"); err != nil {
+		return nil, err
+	}
+
+	mp := questTypeFlagsPtr // shorthand
+
+	q.RankBand = u16(mp + 0x08)
+	q.Fee = u32(mp + 0x0C)
+	q.RewardMain = u32(mp + 0x10)
+	q.RewardSubA = u16(mp + 0x18)
+	q.RewardSubB = u16(mp + 0x1C)
+	q.HardHRReq = u16(mp + 0x1E)
+	questFrames := u32(mp + 0x20)
+	q.TimeLimitMinutes = questFrames / (60 * 30)
+	q.Map = u32(mp + 0x24)
+	questStringsPtr := int(u32(mp + 0x28))
+	q.QuestID = u16(mp + 0x2E)
+
+	// +0x30 objectives[3] (8 bytes each)
+	objectives, err := parseObjectives(data, mp+0x30)
+	if err != nil {
+		return nil, err
+	}
+	q.ObjectiveMain = objectives[0]
+	q.ObjectiveSubA = objectives[1]
+	q.ObjectiveSubB = objectives[2]
+
+	// +0x4C joinRankMin/Max, postRankMin/Max
+	q.JoinRankMin = u16(mp + 0x4C)
+	q.JoinRankMax = u16(mp + 0x4E)
+	q.PostRankMin = u16(mp + 0x50)
+	q.PostRankMax = u16(mp + 0x52)
+
+	// +0x5C forced equipment (6 slots × 4 × u16 = 48 bytes)
+	eq, hasEquip := parseForcedEquip(data, mp+0x5C)
+	if hasEquip {
+		q.ForcedEquipment = eq
+	}
+
+	// +0x97 questVariants
+	q.QuestVariant1 = u8(mp + 0x97)
+	q.QuestVariant2 = u8(mp + 0x98)
+	q.QuestVariant3 = u8(mp + 0x99)
+	q.QuestVariant4 = u8(mp + 0x9A)
+
+	// ── QuestText strings ────────────────────────────────────────────────
+	if questStringsPtr != 0 {
+		if err := check(questStringsPtr, 32, "questTextTable"); err != nil {
+			return nil, err
+		}
+		strPtrs := make([]int, 8)
+		for i := range strPtrs {
+			strPtrs[i] = int(u32(questStringsPtr + i*4))
+		}
+		texts := make([]string, 8)
+		for i, ptr := range strPtrs {
+			if ptr == 0 {
+				continue
+			}
+			s, err := readSJIS(ptr)
+			if err != nil {
+				return nil, fmt.Errorf("string[%d]: %w", i, err)
+			}
+			texts[i] = s
+		}
+		q.Title = texts[0]
+		q.TextMain = texts[1]
+		q.TextSubA = texts[2]
+		q.TextSubB = texts[3]
+		q.SuccessCond = texts[4]
+		q.FailCond = texts[5]
+		q.Contractor = texts[6]
+		q.Description = texts[7]
+	}
+
+	// ── Stages ───────────────────────────────────────────────────────────
+	if loadedStagesPtr != 0 && unk34Ptr > loadedStagesPtr {
+		off := loadedStagesPtr
+		for off+16 <= unk34Ptr {
+			if err := check(off, 16, "stage"); err != nil {
+				return nil, err
+			}
+			stageID := u32(off)
+			q.Stages = append(q.Stages, QuestStageJSON{StageID: stageID})
+			off += 16
+		}
+	}
+
+	// ── Supply Box ───────────────────────────────────────────────────────
+	if supplyBoxPtr != 0 {
+		const supplyBoxSize = (24 + 8 + 8) * 4
+		if err := check(supplyBoxPtr, supplyBoxSize, "supplyBox"); err != nil {
+			return nil, err
+		}
+		q.SupplyMain = readSupplySlots(data, supplyBoxPtr, 24)
+		q.SupplySubA = readSupplySlots(data, supplyBoxPtr+24*4, 8)
+		q.SupplySubB = readSupplySlots(data, supplyBoxPtr+24*4+8*4, 8)
+	}
+
+	// ── Reward Tables ────────────────────────────────────────────────────
+	if rewardPtr != 0 {
+		tables, err := parseRewardTables(data, rewardPtr)
+		if err != nil {
+			return nil, err
+		}
+		q.Rewards = tables
+	}
+
+	// ── Large Monster Spawns ─────────────────────────────────────────────
+	if largeMonsterPtr != 0 {
+		monsters, err := parseMonsterSpawns(data, largeMonsterPtr, f32)
+		if err != nil {
+			return nil, err
+		}
+		q.LargeMonsters = monsters
+	}
+
+	// ── Map Sections (questAreaPtr) ──────────────────────────────────────
+	// Layout: u32 ptr[] terminated by u32(0), then each mapSection:
+	//   u32 loadedStage, u32 unk, u32 spawnTypesPtr, u32 spawnStatsPtr,
+	//   u32(0) gap, u16 unk — then spawnTypes and spawnStats data.
+	if questAreaPtr != 0 {
+		sections, err := parseMapSections(data, questAreaPtr, u32, u16, f32)
+		if err != nil {
+			return nil, err
+		}
+		q.MapSections = sections
+	}
+
+	// ── Area Mappings (areaMappingPtr) ───────────────────────────────────
+	// Read AreaMappings until reaching areaTransitionsPtr (or end of file
+	// if areaTransitionsPtr is null). Each entry is 32 bytes.
+	if areaMappingPtr != 0 {
+		endOff := len(data)
+		if areaTransitionsPtr != 0 {
+			endOff = areaTransitionsPtr
+		}
+		mappings, err := parseAreaMappings(data, areaMappingPtr, endOff, f32)
+		if err != nil {
+			return nil, err
+		}
+		q.AreaMappings = mappings
+	}
+
+	// ── Area Transitions (areaTransitionsPtr) ────────────────────────────
+	// playerAreaChange[area1Zones]: one u32 ptr per zone.
+	if areaTransitionsPtr != 0 && area1Zones > 0 {
+		transitions, err := parseAreaTransitions(data, areaTransitionsPtr, area1Zones, u32, i16, f32)
+		if err != nil {
+			return nil, err
+		}
+		q.AreaTransitions = transitions
+	}
+
+	// ── Map Info (mapInfoPtr) ────────────────────────────────────────────
+	if mapInfoPtr != 0 {
+		if err := check(mapInfoPtr, 8, "mapInfo"); err != nil {
+			return nil, err
+		}
+		q.MapInfo = &QuestMapInfoJSON{
+			MapID:      u32(mapInfoPtr),
+			ReturnBCID: u32(mapInfoPtr + 4),
+		}
+	}
+
+	// ── Gathering Points (gatheringPointsPtr) ────────────────────────────
+	// ptGatheringPoint[area1Zones]: one u32 ptr per zone.
+	if gatheringPointsPtr != 0 && area1Zones > 0 {
+		gatherPts, err := parseGatheringPoints(data, gatheringPointsPtr, area1Zones, u32, u16, f32)
+		if err != nil {
+			return nil, err
+		}
+		q.GatheringPoints = gatherPts
+	}
+
+	// ── Area Facilities (areaFacilitiesPtr) ──────────────────────────────
+	// ptVar[area1Zones]: one u32 ptr per zone.
+	if areaFacilitiesPtr != 0 && area1Zones > 0 {
+		facilities, err := parseAreaFacilities(data, areaFacilitiesPtr, area1Zones, u32, u16, f32)
+		if err != nil {
+			return nil, err
+		}
+		q.AreaFacilities = facilities
+	}
+
+	// ── Some Strings (someStringsPtr / unk30) ────────────────────────────
+	// Layout: ptr someStringPtr, ptr questTypePtr (8 bytes at someStringsPtr).
+	if someStringsPtr != 0 {
+		if err := check(someStringsPtr, 8, "someStrings"); err != nil {
+			return nil, err
+		}
+		someStrP := int(u32(someStringsPtr))
+		questTypeP := int(u32(someStringsPtr + 4))
+		if someStrP != 0 {
+			s, err := readSJIS(someStrP)
+			if err != nil {
+				return nil, fmt.Errorf("someString: %w", err)
+			}
+			q.SomeString = s
+		}
+		if questTypeP != 0 {
+			s, err := readSJIS(questTypeP)
+			if err != nil {
+				return nil, fmt.Errorf("questTypeString: %w", err)
+			}
+			q.QuestType = s
+		}
+	}
+
+	// ── Gathering Tables (gatheringTablesPtr) ────────────────────────────
+	// ptVar[gatheringTablesQty]: one u32 ptr per table.
+	// GatherItem: u16 rate + u16 item, terminated by u16(0xFFFF).
+	if gatheringTablesPtr != 0 && gatheringTablesQty > 0 {
+		tables, err := parseGatheringTables(data, gatheringTablesPtr, gatheringTablesQty, u32, u16)
+		if err != nil {
+			return nil, err
+		}
+		q.GatheringTables = tables
+	}
+
+	return q, nil
+}
+
+// ── Section parsers ──────────────────────────────────────────────────────────
+
+// parseObjectives reads the three 8-byte objective entries at off.
+func parseObjectives(data []byte, off int) ([3]QuestObjectiveJSON, error) {
+	var objs [3]QuestObjectiveJSON
+	for i := range objs {
+		base := off + i*8
+		if base+8 > len(data) {
+			return objs, fmt.Errorf("objective[%d] at 0x%X out of bounds", i, base)
+		}
+		goalType := binary.LittleEndian.Uint32(data[base:])
+		typeName, ok := objTypeToString(goalType)
+		if !ok {
+			typeName = "none"
+		}
+		obj := QuestObjectiveJSON{Type: typeName}
+
+		if goalType != questObjNone {
+			switch goalType {
+			case questObjHunt, questObjCapture, questObjSlay, questObjDamage,
+				questObjSlayOrDamage, questObjBreakPart:
+				obj.Target = uint16(data[base+4])
+				// data[base+5] is padding
+			default:
+				obj.Target = binary.LittleEndian.Uint16(data[base+4:])
+			}
+
+			secondary := binary.LittleEndian.Uint16(data[base+6:])
+			if goalType == questObjBreakPart {
+				obj.Part = secondary
+			} else {
+				obj.Count = secondary
+			}
+		}
+		objs[i] = obj
+	}
+	return objs, nil
+}
+
+// parseForcedEquip reads 6 slots × 4 uint16 (48 bytes) at off.
+// Returns nil, false if the block is truncated or all values are zero
+// (no forced equipment).
+func parseForcedEquip(data []byte, off int) (*QuestForcedEquipJSON, bool) {
+	eq := &QuestForcedEquipJSON{}
+	slots := []*[4]uint16{&eq.Legs, &eq.Weapon, &eq.Head, &eq.Chest, &eq.Arms, &eq.Waist}
+	if off+len(slots)*8 > len(data) {
+		// Truncated block: treat as no forced equipment instead of panicking.
+		return nil, false
+	}
+	anyNonZero := false
+	for _, slot := range slots {
+		for j := range slot {
+			v := binary.LittleEndian.Uint16(data[off:])
+			slot[j] = v
+			if v != 0 {
+				anyNonZero = true
+			}
+			off += 2
+		}
+	}
+	if !anyNonZero {
+		return nil, false
+	}
+	return eq, true
+}
+
+// readSupplySlots reads n supply item slots (each 4 bytes: u16 item + u16 qty)
+// starting at off and returns only non-empty entries (item != 0).
+func readSupplySlots(data []byte, off, n int) []QuestSupplyItemJSON { + var out []QuestSupplyItemJSON + for i := 0; i < n; i++ { + base := off + i*4 + item := binary.LittleEndian.Uint16(data[base:]) + qty := binary.LittleEndian.Uint16(data[base+2:]) + if item == 0 { + continue + } + out = append(out, QuestSupplyItemJSON{Item: item, Quantity: qty}) + } + return out +} + +// parseRewardTables reads the reward table array starting at baseOff. +// Header array: {u8 tableId, u8 pad, u16 pad, u32 tableOffset} per entry, +// terminated by int16(-1). tableOffset is relative to baseOff. +// Each item list: {u16 rate, u16 item, u16 quantity} terminated by int16(-1). +func parseRewardTables(data []byte, baseOff int) ([]QuestRewardTableJSON, error) { + var tables []QuestRewardTableJSON + off := baseOff + for { + if off+2 > len(data) { + return nil, fmt.Errorf("reward table header truncated at 0x%X", off) + } + if binary.LittleEndian.Uint16(data[off:]) == 0xFFFF { + break + } + if off+8 > len(data) { + return nil, fmt.Errorf("reward table header entry truncated at 0x%X", off) + } + tableID := data[off] + tableOff := int(binary.LittleEndian.Uint32(data[off+4:])) + baseOff + off += 8 + + items, err := parseRewardItems(data, tableOff) + if err != nil { + return nil, fmt.Errorf("reward table %d items: %w", tableID, err) + } + tables = append(tables, QuestRewardTableJSON{TableID: tableID, Items: items}) + } + return tables, nil +} + +// parseRewardItems reads a null-terminated reward item list at off. 
+func parseRewardItems(data []byte, off int) ([]QuestRewardItemJSON, error) { + var items []QuestRewardItemJSON + for { + if off+2 > len(data) { + return nil, fmt.Errorf("reward item list truncated at 0x%X", off) + } + if binary.LittleEndian.Uint16(data[off:]) == 0xFFFF { + break + } + if off+6 > len(data) { + return nil, fmt.Errorf("reward item entry truncated at 0x%X", off) + } + rate := binary.LittleEndian.Uint16(data[off:]) + item := binary.LittleEndian.Uint16(data[off+2:]) + qty := binary.LittleEndian.Uint16(data[off+4:]) + items = append(items, QuestRewardItemJSON{Rate: rate, Item: item, Quantity: qty}) + off += 6 + } + return items, nil +} + +// parseMonsterSpawns reads large monster spawn entries at baseOff. +// Each entry is 60 bytes; the list is terminated by a 0xFF byte. +func parseMonsterSpawns(data []byte, baseOff int, f32fn func(int) float32) ([]QuestMonsterJSON, error) { + var monsters []QuestMonsterJSON + off := baseOff + const entrySize = 60 + for { + if off >= len(data) { + return nil, fmt.Errorf("monster spawn list unterminated at end of file") + } + if data[off] == 0xFF { + break + } + if off+entrySize > len(data) { + return nil, fmt.Errorf("monster spawn entry at 0x%X truncated", off) + } + m := QuestMonsterJSON{ + ID: data[off], + SpawnAmount: binary.LittleEndian.Uint32(data[off+4:]), + SpawnStage: binary.LittleEndian.Uint32(data[off+8:]), + // +0x0C padding[16] + Orientation: binary.LittleEndian.Uint32(data[off+0x1C:]), + X: f32fn(off + 0x20), + Y: f32fn(off + 0x24), + Z: f32fn(off + 0x28), + // +0x2C padding[16] + } + monsters = append(monsters, m) + off += entrySize + } + return monsters, nil +} + +// parseMapSections reads the MapZones structure at baseOff. +// Layout: u32 ptr[] terminated by u32(0); each ptr points to a mapSection: +// +// u32 loadedStage, u32 unk, u32 spawnTypesPtr, u32 spawnStatsPtr. +// +// After the 16-byte mapSection: u32(0) gap + u16 unk (2 bytes). 
+// spawnTypes: varPaddT = u8+pad[3] per entry, terminated by 0xFFFF. +// spawnStats: MinionSpawn (60 bytes) per entry, terminated by 0xFFFF in first 2 bytes. +func parseMapSections(data []byte, baseOff int, + u32fn func(int) uint32, + u16fn func(int) uint16, + f32fn func(int) float32, +) ([]QuestMapSectionJSON, error) { + var sections []QuestMapSectionJSON + + // Read pointer array (terminated by u32(0)). + off := baseOff + for { + if off+4 > len(data) { + return nil, fmt.Errorf("mapSection pointer array truncated at 0x%X", off) + } + ptr := int(u32fn(off)) + off += 4 + if ptr == 0 { + break + } + + // Read mapSection at ptr. + if ptr+16 > len(data) { + return nil, fmt.Errorf("mapSection at 0x%X truncated", ptr) + } + loadedStage := u32fn(ptr) + // ptr+4 is unk u32 — skip + spawnTypesPtr := int(u32fn(ptr + 8)) + spawnStatsPtr := int(u32fn(ptr + 12)) + + ms := QuestMapSectionJSON{LoadedStage: loadedStage} + + // Read spawnTypes: varPaddT terminated by 0xFFFF. + if spawnTypesPtr != 0 { + stOff := spawnTypesPtr + for { + if stOff+2 > len(data) { + return nil, fmt.Errorf("spawnTypes at 0x%X truncated", stOff) + } + if u16fn(stOff) == 0xFFFF { + break + } + if stOff+4 > len(data) { + return nil, fmt.Errorf("spawnType entry at 0x%X truncated", stOff) + } + monID := data[stOff] + ms.SpawnMonsters = append(ms.SpawnMonsters, monID) + stOff += 4 // u8 + pad[3] + } + } + + // Read spawnStats: MinionSpawn terminated by 0xFFFF in first 2 bytes. + if spawnStatsPtr != 0 { + const minionSize = 60 + ssOff := spawnStatsPtr + for { + if ssOff+2 > len(data) { + return nil, fmt.Errorf("spawnStats at 0x%X truncated", ssOff) + } + // Terminator: first 2 bytes == 0xFFFF. 
+ if u16fn(ssOff) == 0xFFFF { + break + } + if ssOff+minionSize > len(data) { + return nil, fmt.Errorf("minionSpawn at 0x%X truncated", ssOff) + } + spawn := QuestMinionSpawnJSON{ + Monster: data[ssOff], + // ssOff+1 padding + SpawnToggle: u16fn(ssOff + 2), + SpawnAmount: u32fn(ssOff + 4), + // +8 unk u32, +0xC pad[16], +0x1C unk u32 + X: f32fn(ssOff + 0x20), + Y: f32fn(ssOff + 0x24), + Z: f32fn(ssOff + 0x28), + } + ms.MinionSpawns = append(ms.MinionSpawns, spawn) + ssOff += minionSize + } + } + + sections = append(sections, ms) + } + + return sections, nil +} + +// parseAreaMappings reads AreaMappings entries at baseOff until endOff. +// Each entry is 32 bytes: float areaX, float areaZ, pad[8], +// float baseX, float baseZ, float knPos, pad[4]. +func parseAreaMappings(data []byte, baseOff, endOff int, f32fn func(int) float32) ([]QuestAreaMappingJSON, error) { + var mappings []QuestAreaMappingJSON + const entrySize = 32 + off := baseOff + for off+entrySize <= endOff { + if off+entrySize > len(data) { + return nil, fmt.Errorf("areaMapping at 0x%X truncated", off) + } + am := QuestAreaMappingJSON{ + AreaX: f32fn(off), + AreaZ: f32fn(off + 4), + // off+8: pad[8] + BaseX: f32fn(off + 16), + BaseZ: f32fn(off + 20), + KnPos: f32fn(off + 24), + // off+28: pad[4] + } + mappings = append(mappings, am) + off += entrySize + } + return mappings, nil +} + +// parseAreaTransitions reads playerAreaChange[numZones] at baseOff. +// Each entry is a u32 pointer to a floatSet array terminated by s16(-1). +// floatSet: s16 targetStageId + s16 stageVariant + float[3] current + float[5] box + +// float[3] target + s16[2] rotation = 52 bytes. 
+func parseAreaTransitions(data []byte, baseOff, numZones int, + u32fn func(int) uint32, + i16fn func(int) int16, + f32fn func(int) float32, +) ([]QuestAreaTransitionsJSON, error) { + result := make([]QuestAreaTransitionsJSON, numZones) + + if baseOff+numZones*4 > len(data) { + return nil, fmt.Errorf("areaTransitions pointer array at 0x%X truncated", baseOff) + } + + for i := 0; i < numZones; i++ { + ptr := int(u32fn(baseOff + i*4)) + if ptr == 0 { + // Null pointer — no transitions for this zone. + continue + } + + // Read floatSet entries until targetStageId1 == -1. + var transitions []QuestAreaTransitionJSON + off := ptr + for { + if off+2 > len(data) { + return nil, fmt.Errorf("floatSet at 0x%X truncated", off) + } + targetStageID := i16fn(off) + if targetStageID == -1 { + break + } + // Each floatSet is 52 bytes: + // s16 targetStageId1 + s16 stageVariant = 4 + // float[3] current = 12 + // float[5] transitionBox = 20 + // float[3] target = 12 + // s16[2] rotation = 4 + // Total = 52 + const floatSetSize = 52 + if off+floatSetSize > len(data) { + return nil, fmt.Errorf("floatSet at 0x%X truncated (need %d bytes)", off, floatSetSize) + } + tr := QuestAreaTransitionJSON{ + TargetStageID1: targetStageID, + StageVariant: i16fn(off + 2), + CurrentX: f32fn(off + 4), + CurrentY: f32fn(off + 8), + CurrentZ: f32fn(off + 12), + TargetX: f32fn(off + 36), + TargetY: f32fn(off + 40), + TargetZ: f32fn(off + 44), + } + for j := 0; j < 5; j++ { + tr.TransitionBox[j] = f32fn(off + 16 + j*4) + } + tr.TargetRotation[0] = i16fn(off + 48) + tr.TargetRotation[1] = i16fn(off + 50) + transitions = append(transitions, tr) + off += floatSetSize + } + result[i] = QuestAreaTransitionsJSON{Transitions: transitions} + } + + return result, nil +} + +// parseGatheringPoints reads ptGatheringPoint[numZones] at baseOff. +// Each entry is a u32 pointer to gatheringPoint[4] terminated by xPos==-1.0. 
+// gatheringPoint: float xPos, yPos, zPos, range, u16 gatheringID, u16 maxCount, pad[2], u16 minCount = 24 bytes. +func parseGatheringPoints(data []byte, baseOff, numZones int, + u32fn func(int) uint32, + u16fn func(int) uint16, + f32fn func(int) float32, +) ([]QuestAreaGatheringJSON, error) { + result := make([]QuestAreaGatheringJSON, numZones) + + if baseOff+numZones*4 > len(data) { + return nil, fmt.Errorf("gatheringPoints pointer array at 0x%X truncated", baseOff) + } + + const sentinel = uint32(0xBF800000) // float32(-1.0) + const pointSize = 24 + + for i := 0; i < numZones; i++ { + ptr := int(u32fn(baseOff + i*4)) + if ptr == 0 { + continue + } + + var points []QuestGatheringPointJSON + off := ptr + for { + if off+4 > len(data) { + return nil, fmt.Errorf("gatheringPoint at 0x%X truncated", off) + } + // Terminator: xPos bit pattern == 0xBF800000 (-1.0f). + if binary.LittleEndian.Uint32(data[off:]) == sentinel { + break + } + if off+pointSize > len(data) { + return nil, fmt.Errorf("gatheringPoint entry at 0x%X truncated", off) + } + gp := QuestGatheringPointJSON{ + X: f32fn(off), + Y: f32fn(off + 4), + Z: f32fn(off + 8), + Range: f32fn(off + 12), + GatheringID: u16fn(off + 16), + MaxCount: u16fn(off + 18), + // off+20 pad[2] + MinCount: u16fn(off + 22), + } + points = append(points, gp) + off += pointSize + } + result[i] = QuestAreaGatheringJSON{Points: points} + } + + return result, nil +} + +// parseAreaFacilities reads ptVar[numZones] at baseOff. +// Each entry is a u32 pointer to a facPointBlock. +// facPoint: pad[2] + SpecAc(u16) + xPos + yPos + zPos + range + id(u16) + pad[2] = 24 bytes. +// Termination: the loop condition checks read_unsigned($+4,4) != 0xBF800000. +// So a facPoint whose xPos (at offset +4 from start of that potential entry) == -1.0 terminates. +// After all facPoints: padding[0xC] + float + float = 20 bytes (block footer, not parsed into JSON). 
+func parseAreaFacilities(data []byte, baseOff, numZones int,
+	u32fn func(int) uint32,
+	u16fn func(int) uint16,
+	f32fn func(int) float32,
+) ([]QuestAreaFacilitiesJSON, error) {
+	result := make([]QuestAreaFacilitiesJSON, numZones)
+
+	if baseOff+numZones*4 > len(data) {
+		return nil, fmt.Errorf("areaFacilities pointer array at 0x%X truncated", baseOff)
+	}
+
+	const sentinel = uint32(0xBF800000)
+	const pointSize = 24
+
+	for i := 0; i < numZones; i++ {
+		ptr := int(u32fn(baseOff + i*4))
+		if ptr == 0 {
+			continue
+		}
+
+		var points []QuestFacilityPointJSON
+		off := ptr
+		for {
+			// An unterminated list is an error, matching the other section parsers.
+			if off+8 > len(data) {
+				return nil, fmt.Errorf("facPoint terminator at 0x%X truncated", off)
+			}
+			// Check: read_unsigned($+4, 4) == sentinel means terminate.
+			// $+4 is the xPos field of the potential next facPoint.
+			if binary.LittleEndian.Uint32(data[off+4:]) == sentinel {
+				break
+			}
+			if off+pointSize > len(data) {
+				return nil, fmt.Errorf("facPoint at 0x%X truncated", off)
+			}
+			fp := QuestFacilityPointJSON{
+				// off+0: pad[2]
+				Type:  u16fn(off + 2),
+				X:     f32fn(off + 4),
+				Y:     f32fn(off + 8),
+				Z:     f32fn(off + 12),
+				Range: f32fn(off + 16),
+				ID:    u16fn(off + 20),
+				// off+22: pad[2]
+			}
+			points = append(points, fp)
+			off += pointSize
+		}
+		result[i] = QuestAreaFacilitiesJSON{Points: points}
+	}
+
+	return result, nil
+}
+
+// parseGatheringTables reads ptVar[count] at baseOff.
+// Each entry is a u32 pointer to GatherItem[] terminated by u16(0xFFFF).
+// GatherItem: u16 rate + u16 item = 4 bytes.
+func parseGatheringTables(data []byte, baseOff, count int, + u32fn func(int) uint32, + u16fn func(int) uint16, +) ([]QuestGatheringTableJSON, error) { + result := make([]QuestGatheringTableJSON, count) + + if baseOff+count*4 > len(data) { + return nil, fmt.Errorf("gatheringTables pointer array at 0x%X truncated", baseOff) + } + + for i := 0; i < count; i++ { + ptr := int(u32fn(baseOff + i*4)) + if ptr == 0 { + continue + } + + var items []QuestGatherItemJSON + off := ptr + for { + if off+2 > len(data) { + return nil, fmt.Errorf("gatheringTable at 0x%X truncated", off) + } + if u16fn(off) == 0xFFFF { + break + } + if off+4 > len(data) { + return nil, fmt.Errorf("gatherItem at 0x%X truncated", off) + } + items = append(items, QuestGatherItemJSON{ + Rate: u16fn(off), + Item: u16fn(off + 2), + }) + off += 4 + } + result[i] = QuestGatheringTableJSON{Items: items} + } + + return result, nil +} + +// objTypeToString maps a uint32 goal type to its JSON string name. +// Returns "", false for unknown types. +func objTypeToString(t uint32) (string, bool) { + for name, v := range questObjTypeMap { + if v == t { + return name, true + } + } + return "", false +} diff --git a/server/channelserver/quest_json_test.go b/server/channelserver/quest_json_test.go new file mode 100644 index 000000000..cf0ff89bf --- /dev/null +++ b/server/channelserver/quest_json_test.go @@ -0,0 +1,1265 @@ +package channelserver + +import ( + "bytes" + "encoding/binary" + "encoding/json" + "math" + "testing" +) + +// minimalQuestJSON is a small but complete quest used across many test cases. 
+var minimalQuestJSON = `{ + "quest_id": 1, + "title": "Test Quest", + "description": "A test quest.", + "text_main": "Hunt the Rathalos.", + "text_sub_a": "", + "text_sub_b": "", + "success_cond": "Slay the Rathalos.", + "fail_cond": "Time runs out or all hunters faint.", + "contractor": "Guild Master", + "monster_size_multi": 100, + "stat_table_1": 0, + "main_rank_points": 120, + "sub_a_rank_points": 60, + "sub_b_rank_points": 0, + "fee": 500, + "reward_main": 5000, + "reward_sub_a": 1000, + "reward_sub_b": 0, + "time_limit_minutes": 50, + "map": 2, + "rank_band": 0, + "objective_main": {"type": "hunt", "target": 11, "count": 1}, + "objective_sub_a": {"type": "deliver", "target": 149, "count": 3}, + "objective_sub_b": {"type": "none"}, + "large_monsters": [ + {"id": 11, "spawn_amount": 1, "spawn_stage": 5, "orientation": 180, "x": 1500.0, "y": 0.0, "z": -2000.0} + ], + "rewards": [ + { + "table_id": 1, + "items": [ + {"rate": 50, "item": 149, "quantity": 1}, + {"rate": 30, "item": 153, "quantity": 1} + ] + } + ], + "supply_main": [ + {"item": 1, "quantity": 5} + ], + "stages": [ + {"stage_id": 2} + ] +}` + +// ── Compiler tests (existing) ──────────────────────────────────────────────── + +func TestCompileQuestJSON_MinimalQuest(t *testing.T) { + data, err := CompileQuestJSON([]byte(minimalQuestJSON)) + if err != nil { + t.Fatalf("CompileQuestJSON: %v", err) + } + if len(data) == 0 { + t.Fatal("empty output") + } + + // Header check: first pointer (questTypeFlagsPtr) must equal headerSize+genPropSize = 0x86 + questTypeFlagsPtr := binary.LittleEndian.Uint32(data[0:4]) + const expectedBodyStart = uint32(68 + 66) // 0x86 + if questTypeFlagsPtr != expectedBodyStart { + t.Errorf("questTypeFlagsPtr = 0x%X, want 0x%X", questTypeFlagsPtr, expectedBodyStart) + } + + // QuestStringsPtr (mainQuestProperties+40) must point past the body. 
+ questStringsPtr := binary.LittleEndian.Uint32(data[questTypeFlagsPtr+40 : questTypeFlagsPtr+44]) + if questStringsPtr < questTypeFlagsPtr+questBodyLenZZ { + t.Errorf("questStringsPtr 0x%X is inside main body (ends at 0x%X)", questStringsPtr, questTypeFlagsPtr+questBodyLenZZ) + } + + // QuestStringsPtr must be within the file. + if int(questStringsPtr) >= len(data) { + t.Errorf("questStringsPtr 0x%X out of range (file len %d)", questStringsPtr, len(data)) + } + + // The quest text pointer table: 8 string pointers, all within the file. + for i := 0; i < 8; i++ { + off := int(questStringsPtr) + i*4 + if off+4 > len(data) { + t.Fatalf("string pointer %d out of bounds", i) + } + strPtr := binary.LittleEndian.Uint32(data[off : off+4]) + if int(strPtr) >= len(data) { + t.Errorf("string pointer %d = 0x%X out of file range (%d bytes)", i, strPtr, len(data)) + } + } + + // QuestID at mainQuestProperties+0x2E. + questID := binary.LittleEndian.Uint16(data[questTypeFlagsPtr+0x2E : questTypeFlagsPtr+0x30]) + if questID != 1 { + t.Errorf("questID = %d, want 1", questID) + } + + // QuestTime at mainQuestProperties+0x20: 50 minutes × 60s × 30Hz = 90000 frames. 
+ questTime := binary.LittleEndian.Uint32(data[questTypeFlagsPtr+0x20 : questTypeFlagsPtr+0x24]) + if questTime != 90000 { + t.Errorf("questTime = %d frames, want 90000 (50min)", questTime) + } +} + +func TestCompileQuestJSON_BadObjectiveType(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + q.ObjectiveMain.Type = "invalid_type" + b, _ := json.Marshal(q) + + _, err := CompileQuestJSON(b) + if err == nil { + t.Fatal("expected error for invalid objective type, got nil") + } +} + +func TestCompileQuestJSON_AllObjectiveTypes(t *testing.T) { + types := []string{ + "none", "hunt", "capture", "slay", "deliver", "deliver_flag", + "break_part", "damage", "slay_or_damage", "slay_total", "slay_all", "esoteric", + } + for _, typ := range types { + t.Run(typ, func(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + q.ObjectiveMain.Type = typ + b, _ := json.Marshal(q) + if _, err := CompileQuestJSON(b); err != nil { + t.Fatalf("CompileQuestJSON with type %q: %v", typ, err) + } + }) + } +} + +func TestCompileQuestJSON_EmptyRewards(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + q.Rewards = nil + b, _ := json.Marshal(q) + if _, err := CompileQuestJSON(b); err != nil { + t.Fatalf("unexpected error with no rewards: %v", err) + } +} + +func TestCompileQuestJSON_MultipleRewardTables(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + q.Rewards = []QuestRewardTableJSON{ + {TableID: 1, Items: []QuestRewardItemJSON{{Rate: 50, Item: 149, Quantity: 1}}}, + {TableID: 2, Items: []QuestRewardItemJSON{{Rate: 100, Item: 153, Quantity: 2}}}, + } + b, _ := json.Marshal(q) + data, err := CompileQuestJSON(b) + if err != nil { + t.Fatalf("CompileQuestJSON: %v", err) + } + + // Verify reward pointer points into the file. 
+ rewardPtr := binary.LittleEndian.Uint32(data[0x0C:0x10]) + if int(rewardPtr) >= len(data) { + t.Errorf("rewardPtr 0x%X out of file range (%d)", rewardPtr, len(data)) + } +} + +// ── Parser tests ───────────────────────────────────────────────────────────── + +func TestParseQuestBinary_TooShort(t *testing.T) { + _, err := ParseQuestBinary([]byte{0x01, 0x02}) + if err == nil { + t.Fatal("expected error for undersized input, got nil") + } +} + +func TestParseQuestBinary_NullQuestTypeFlagsPtr(t *testing.T) { + // Build a buffer that is long enough but has a null questTypeFlagsPtr. + buf := make([]byte, 0x200) + // questTypeFlagsPtr at 0x00 = 0 (null) + binary.LittleEndian.PutUint32(buf[0x00:], 0) + _, err := ParseQuestBinary(buf) + if err == nil { + t.Fatal("expected error for null questTypeFlagsPtr, got nil") + } +} + +func TestParseQuestBinary_MinimalQuest(t *testing.T) { + data, err := CompileQuestJSON([]byte(minimalQuestJSON)) + if err != nil { + t.Fatalf("compile: %v", err) + } + + q, err := ParseQuestBinary(data) + if err != nil { + t.Fatalf("parse: %v", err) + } + + // Identification + if q.QuestID != 1 { + t.Errorf("QuestID = %d, want 1", q.QuestID) + } + + // Text strings + if q.Title != "Test Quest" { + t.Errorf("Title = %q, want %q", q.Title, "Test Quest") + } + if q.Description != "A test quest." { + t.Errorf("Description = %q, want %q", q.Description, "A test quest.") + } + if q.TextMain != "Hunt the Rathalos." { + t.Errorf("TextMain = %q, want %q", q.TextMain, "Hunt the Rathalos.") + } + if q.SuccessCond != "Slay the Rathalos." { + t.Errorf("SuccessCond = %q, want %q", q.SuccessCond, "Slay the Rathalos.") + } + if q.FailCond != "Time runs out or all hunters faint." 
{ + t.Errorf("FailCond = %q, want %q", q.FailCond, "Time runs out or all hunters faint.") + } + if q.Contractor != "Guild Master" { + t.Errorf("Contractor = %q, want %q", q.Contractor, "Guild Master") + } + + // Numeric fields + if q.MonsterSizeMulti != 100 { + t.Errorf("MonsterSizeMulti = %d, want 100", q.MonsterSizeMulti) + } + if q.MainRankPoints != 120 { + t.Errorf("MainRankPoints = %d, want 120", q.MainRankPoints) + } + if q.SubARankPoints != 60 { + t.Errorf("SubARankPoints = %d, want 60", q.SubARankPoints) + } + if q.SubBRankPoints != 0 { + t.Errorf("SubBRankPoints = %d, want 0", q.SubBRankPoints) + } + if q.Fee != 500 { + t.Errorf("Fee = %d, want 500", q.Fee) + } + if q.RewardMain != 5000 { + t.Errorf("RewardMain = %d, want 5000", q.RewardMain) + } + if q.RewardSubA != 1000 { + t.Errorf("RewardSubA = %d, want 1000", q.RewardSubA) + } + if q.TimeLimitMinutes != 50 { + t.Errorf("TimeLimitMinutes = %d, want 50", q.TimeLimitMinutes) + } + if q.Map != 2 { + t.Errorf("Map = %d, want 2", q.Map) + } + + // Objectives + if q.ObjectiveMain.Type != "hunt" { + t.Errorf("ObjectiveMain.Type = %q, want hunt", q.ObjectiveMain.Type) + } + if q.ObjectiveMain.Target != 11 { + t.Errorf("ObjectiveMain.Target = %d, want 11", q.ObjectiveMain.Target) + } + if q.ObjectiveMain.Count != 1 { + t.Errorf("ObjectiveMain.Count = %d, want 1", q.ObjectiveMain.Count) + } + if q.ObjectiveSubA.Type != "deliver" { + t.Errorf("ObjectiveSubA.Type = %q, want deliver", q.ObjectiveSubA.Type) + } + if q.ObjectiveSubA.Target != 149 { + t.Errorf("ObjectiveSubA.Target = %d, want 149", q.ObjectiveSubA.Target) + } + if q.ObjectiveSubA.Count != 3 { + t.Errorf("ObjectiveSubA.Count = %d, want 3", q.ObjectiveSubA.Count) + } + if q.ObjectiveSubB.Type != "none" { + t.Errorf("ObjectiveSubB.Type = %q, want none", q.ObjectiveSubB.Type) + } + + // Stages + if len(q.Stages) != 1 { + t.Fatalf("Stages len = %d, want 1", len(q.Stages)) + } + if q.Stages[0].StageID != 2 { + t.Errorf("Stages[0].StageID = %d, want 2", 
q.Stages[0].StageID) + } + + // Supply box + if len(q.SupplyMain) != 1 { + t.Fatalf("SupplyMain len = %d, want 1", len(q.SupplyMain)) + } + if q.SupplyMain[0].Item != 1 || q.SupplyMain[0].Quantity != 5 { + t.Errorf("SupplyMain[0] = {%d, %d}, want {1, 5}", q.SupplyMain[0].Item, q.SupplyMain[0].Quantity) + } + if len(q.SupplySubA) != 0 { + t.Errorf("SupplySubA len = %d, want 0", len(q.SupplySubA)) + } + + // Rewards + if len(q.Rewards) != 1 { + t.Fatalf("Rewards len = %d, want 1", len(q.Rewards)) + } + rt := q.Rewards[0] + if rt.TableID != 1 { + t.Errorf("Rewards[0].TableID = %d, want 1", rt.TableID) + } + if len(rt.Items) != 2 { + t.Fatalf("Rewards[0].Items len = %d, want 2", len(rt.Items)) + } + if rt.Items[0].Rate != 50 || rt.Items[0].Item != 149 || rt.Items[0].Quantity != 1 { + t.Errorf("Rewards[0].Items[0] = %+v, want {50 149 1}", rt.Items[0]) + } + if rt.Items[1].Rate != 30 || rt.Items[1].Item != 153 || rt.Items[1].Quantity != 1 { + t.Errorf("Rewards[0].Items[1] = %+v, want {30 153 1}", rt.Items[1]) + } + + // Large monsters + if len(q.LargeMonsters) != 1 { + t.Fatalf("LargeMonsters len = %d, want 1", len(q.LargeMonsters)) + } + m := q.LargeMonsters[0] + if m.ID != 11 { + t.Errorf("LargeMonsters[0].ID = %d, want 11", m.ID) + } + if m.SpawnAmount != 1 { + t.Errorf("LargeMonsters[0].SpawnAmount = %d, want 1", m.SpawnAmount) + } + if m.SpawnStage != 5 { + t.Errorf("LargeMonsters[0].SpawnStage = %d, want 5", m.SpawnStage) + } + if m.Orientation != 180 { + t.Errorf("LargeMonsters[0].Orientation = %d, want 180", m.Orientation) + } + if m.X != 1500.0 { + t.Errorf("LargeMonsters[0].X = %v, want 1500.0", m.X) + } + if m.Y != 0.0 { + t.Errorf("LargeMonsters[0].Y = %v, want 0.0", m.Y) + } + if m.Z != -2000.0 { + t.Errorf("LargeMonsters[0].Z = %v, want -2000.0", m.Z) + } +} + +// ── Round-trip tests ───────────────────────────────────────────────────────── + +// roundTrip compiles JSON → binary, parses back to QuestJSON, re-serializes +// to JSON, compiles again, and 
asserts the two binaries are byte-for-byte equal. +func roundTrip(t *testing.T, label, jsonSrc string) { + t.Helper() + + bin1, err := CompileQuestJSON([]byte(jsonSrc)) + if err != nil { + t.Fatalf("%s: compile(1): %v", label, err) + } + + q, err := ParseQuestBinary(bin1) + if err != nil { + t.Fatalf("%s: parse: %v", label, err) + } + + jsonOut, err := json.Marshal(q) + if err != nil { + t.Fatalf("%s: marshal: %v", label, err) + } + + bin2, err := CompileQuestJSON(jsonOut) + if err != nil { + t.Fatalf("%s: compile(2): %v", label, err) + } + + if !bytes.Equal(bin1, bin2) { + t.Errorf("%s: round-trip binary mismatch (bin1 len=%d, bin2 len=%d)", label, len(bin1), len(bin2)) + // Find first differing byte to aid debugging. + limit := len(bin1) + if len(bin2) < limit { + limit = len(bin2) + } + for i := 0; i < limit; i++ { + if bin1[i] != bin2[i] { + t.Errorf(" first diff at offset 0x%X: bin1=0x%02X bin2=0x%02X", i, bin1[i], bin2[i]) + break + } + } + } +} + +func TestRoundTrip_MinimalQuest(t *testing.T) { + roundTrip(t, "minimal", minimalQuestJSON) +} + +func TestRoundTrip_NoRewards(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + q.Rewards = nil + b, _ := json.Marshal(q) + roundTrip(t, "no rewards", string(b)) +} + +func TestRoundTrip_NoMonsters(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + q.LargeMonsters = nil + b, _ := json.Marshal(q) + roundTrip(t, "no monsters", string(b)) +} + +func TestRoundTrip_NoStages(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + q.Stages = nil + b, _ := json.Marshal(q) + roundTrip(t, "no stages", string(b)) +} + +func TestRoundTrip_MultipleStages(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + q.Stages = []QuestStageJSON{{StageID: 2}, {StageID: 5}, {StageID: 11}} + b, _ := json.Marshal(q) + roundTrip(t, "multiple stages", string(b)) +} + +func TestRoundTrip_MultipleMonsters(t 
*testing.T) {
+	var q QuestJSON
+	_ = json.Unmarshal([]byte(minimalQuestJSON), &q)
+	q.LargeMonsters = []QuestMonsterJSON{
+		{ID: 11, SpawnAmount: 1, SpawnStage: 5, Orientation: 180, X: 1500.0, Y: 0.0, Z: -2000.0},
+		{ID: 37, SpawnAmount: 2, SpawnStage: 3, Orientation: 90, X: 0.0, Y: 50.0, Z: 300.0},
+	}
+	b, _ := json.Marshal(q)
+	roundTrip(t, "multiple monsters", string(b))
+}
+
+func TestRoundTrip_MultipleRewardTables(t *testing.T) {
+	var q QuestJSON
+	_ = json.Unmarshal([]byte(minimalQuestJSON), &q)
+	q.Rewards = []QuestRewardTableJSON{
+		{TableID: 1, Items: []QuestRewardItemJSON{
+			{Rate: 50, Item: 149, Quantity: 1},
+			{Rate: 50, Item: 153, Quantity: 2},
+		}},
+		{TableID: 2, Items: []QuestRewardItemJSON{
+			{Rate: 100, Item: 200, Quantity: 3},
+		}},
+	}
+	b, _ := json.Marshal(q)
+	roundTrip(t, "multiple reward tables", string(b))
+}
+
+func TestRoundTrip_FullSupplyBox(t *testing.T) {
+	var q QuestJSON
+	_ = json.Unmarshal([]byte(minimalQuestJSON), &q)
+	// Capacity is 24 main + 8 subA + 8 subB; fill main to capacity and
+	// partially fill subA/subB.
+ q.SupplyMain = make([]QuestSupplyItemJSON, 24) + for i := range q.SupplyMain { + q.SupplyMain[i] = QuestSupplyItemJSON{Item: uint16(i + 1), Quantity: uint16(i + 1)} + } + q.SupplySubA = []QuestSupplyItemJSON{{Item: 10, Quantity: 2}, {Item: 20, Quantity: 1}} + q.SupplySubB = []QuestSupplyItemJSON{{Item: 30, Quantity: 5}} + b, _ := json.Marshal(q) + roundTrip(t, "full supply box", string(b)) +} + +func TestRoundTrip_BreakPartObjective(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + q.ObjectiveMain = QuestObjectiveJSON{Type: "break_part", Target: 11, Part: 3} + b, _ := json.Marshal(q) + roundTrip(t, "break_part objective", string(b)) +} + +func TestRoundTrip_AllObjectiveTypes(t *testing.T) { + types := []string{ + "none", "hunt", "capture", "slay", "deliver", "deliver_flag", + "break_part", "damage", "slay_or_damage", "slay_total", "slay_all", "esoteric", + } + for _, typ := range types { + t.Run(typ, func(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + q.ObjectiveMain = QuestObjectiveJSON{Type: typ, Target: 11, Count: 1} + b, _ := json.Marshal(q) + roundTrip(t, typ, string(b)) + }) + } +} + +func TestRoundTrip_RankFields(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + q.RankBand = 7 + q.HardHRReq = 300 + q.JoinRankMin = 100 + q.JoinRankMax = 999 + q.PostRankMin = 50 + q.PostRankMax = 500 + b, _ := json.Marshal(q) + roundTrip(t, "rank fields", string(b)) +} + +func TestRoundTrip_QuestVariants(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + q.QuestVariant1 = 1 + q.QuestVariant2 = 2 + q.QuestVariant3 = 4 + q.QuestVariant4 = 8 + b, _ := json.Marshal(q) + roundTrip(t, "quest variants", string(b)) +} + +func TestRoundTrip_EmptyQuest(t *testing.T) { + q := QuestJSON{ + QuestID: 999, + TimeLimitMinutes: 30, + MonsterSizeMulti: 100, + ObjectiveMain: QuestObjectiveJSON{Type: "slay_all"}, + } + b, _ := json.Marshal(q) + 
roundTrip(t, "empty quest", string(b)) +} + +// ── New section round-trip tests ───────────────────────────────────────────── + +func TestRoundTrip_MapSections(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + q.MapSections = []QuestMapSectionJSON{ + { + LoadedStage: 5, + SpawnMonsters: []uint8{0x0F, 0x33}, // Khezu, Blangonga + MinionSpawns: []QuestMinionSpawnJSON{ + {Monster: 0x0F, SpawnToggle: 1, SpawnAmount: 3, X: 100.0, Y: 0.0, Z: -200.0}, + {Monster: 0x33, SpawnToggle: 1, SpawnAmount: 2, X: 250.0, Y: 5.0, Z: 300.0}, + }, + }, + } + b, _ := json.Marshal(q) + roundTrip(t, "map sections", string(b)) +} + +func TestRoundTrip_MapSectionsMultiple(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + q.MapSections = []QuestMapSectionJSON{ + { + LoadedStage: 2, + SpawnMonsters: []uint8{0x06}, + MinionSpawns: []QuestMinionSpawnJSON{ + {Monster: 0x06, SpawnToggle: 1, SpawnAmount: 4, X: 50.0, Y: 0.0, Z: 50.0}, + }, + }, + { + LoadedStage: 3, + SpawnMonsters: nil, + MinionSpawns: nil, + }, + } + b, _ := json.Marshal(q) + roundTrip(t, "map sections multiple", string(b)) +} + +func TestRoundTrip_AreaTransitions(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + q.AreaTransitions = []QuestAreaTransitionsJSON{ + { + Transitions: []QuestAreaTransitionJSON{ + { + TargetStageID1: 3, + StageVariant: 0, + CurrentX: 100.0, + CurrentY: 0.0, + CurrentZ: 50.0, + TransitionBox: [5]float32{10.0, 5.0, 10.0, 0.0, 0.0}, + TargetX: -100.0, + TargetY: 0.0, + TargetZ: -50.0, + TargetRotation: [2]int16{90, 0}, + }, + }, + }, + { + // Zone 2: no transitions (null pointer). 
+ Transitions: nil, + }, + } + b, _ := json.Marshal(q) + roundTrip(t, "area transitions", string(b)) +} + +func TestRoundTrip_AreaMappings(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + // AreaMappings without AreaTransitions: the parser reads until areaTransitionsPtr, + // which will be null, so it reads until end of file's mapping section. To make + // this round-trip cleanly, add both together. + q.AreaTransitions = []QuestAreaTransitionsJSON{{}, {}} + q.AreaMappings = []QuestAreaMappingJSON{ + {AreaX: 100.0, AreaZ: 200.0, BaseX: 10.0, BaseZ: 20.0, KnPos: 5.0}, + {AreaX: 300.0, AreaZ: 400.0, BaseX: 30.0, BaseZ: 40.0, KnPos: 7.5}, + } + b, _ := json.Marshal(q) + roundTrip(t, "area mappings", string(b)) +} + +func TestRoundTrip_MapInfo(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + q.MapInfo = &QuestMapInfoJSON{ + MapID: 2, + ReturnBCID: 1, + } + b, _ := json.Marshal(q) + roundTrip(t, "map info", string(b)) +} + +func TestRoundTrip_GatheringPoints(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + q.AreaTransitions = []QuestAreaTransitionsJSON{{}, {}} + q.GatheringPoints = []QuestAreaGatheringJSON{ + { + Points: []QuestGatheringPointJSON{ + {X: 50.0, Y: 0.0, Z: 100.0, Range: 3.0, GatheringID: 5, MaxCount: 3, MinCount: 1}, + {X: 150.0, Y: 0.0, Z: 200.0, Range: 3.0, GatheringID: 6, MaxCount: 2, MinCount: 1}, + }, + }, + { + // Zone 2: no gathering points (null pointer). 
+ Points: nil, + }, + } + b, _ := json.Marshal(q) + roundTrip(t, "gathering points", string(b)) +} + +func TestRoundTrip_AreaFacilities(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + q.AreaTransitions = []QuestAreaTransitionsJSON{{}, {}} + q.AreaFacilities = []QuestAreaFacilitiesJSON{ + { + Points: []QuestFacilityPointJSON{ + {Type: 1, X: 10.0, Y: 0.0, Z: -5.0, Range: 2.0, ID: 1}, // cooking + {Type: 7, X: 20.0, Y: 0.0, Z: -10.0, Range: 3.0, ID: 2}, // red box + }, + }, + { + // Zone 2: no facilities (null pointer). + Points: nil, + }, + } + b, _ := json.Marshal(q) + roundTrip(t, "area facilities", string(b)) +} + +func TestRoundTrip_SomeStrings(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + q.SomeString = "extra info" + q.QuestType = "standard" + b, _ := json.Marshal(q) + roundTrip(t, "some strings", string(b)) +} + +func TestRoundTrip_SomeStringOnly(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + q.SomeString = "only this string" + b, _ := json.Marshal(q) + roundTrip(t, "some string only", string(b)) +} + +func TestRoundTrip_GatheringTables(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + q.GatheringTables = []QuestGatheringTableJSON{ + { + Items: []QuestGatherItemJSON{ + {Rate: 50, Item: 100}, + {Rate: 30, Item: 101}, + {Rate: 20, Item: 102}, + }, + }, + { + Items: []QuestGatherItemJSON{ + {Rate: 100, Item: 200}, + }, + }, + } + b, _ := json.Marshal(q) + roundTrip(t, "gathering tables", string(b)) +} + +func TestRoundTrip_AllSections(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + + q.MapSections = []QuestMapSectionJSON{ + { + LoadedStage: 5, + SpawnMonsters: []uint8{0x0F}, + MinionSpawns: []QuestMinionSpawnJSON{ + {Monster: 0x0F, SpawnToggle: 1, SpawnAmount: 2, X: 100.0, Y: 0.0, Z: -100.0}, + }, + }, + } + q.AreaTransitions = []QuestAreaTransitionsJSON{ + { + 
Transitions: []QuestAreaTransitionJSON{ + { + TargetStageID1: 2, + StageVariant: 0, + CurrentX: 50.0, + CurrentY: 0.0, + CurrentZ: 25.0, + TransitionBox: [5]float32{5.0, 5.0, 5.0, 0.0, 0.0}, + TargetX: -50.0, + TargetY: 0.0, + TargetZ: -25.0, + TargetRotation: [2]int16{180, 0}, + }, + }, + }, + {Transitions: nil}, + } + q.AreaMappings = []QuestAreaMappingJSON{ + {AreaX: 100.0, AreaZ: 200.0, BaseX: 10.0, BaseZ: 20.0, KnPos: 1.0}, + } + q.MapInfo = &QuestMapInfoJSON{MapID: 2, ReturnBCID: 0} + q.GatheringPoints = []QuestAreaGatheringJSON{ + { + Points: []QuestGatheringPointJSON{ + {X: 75.0, Y: 0.0, Z: 150.0, Range: 2.5, GatheringID: 3, MaxCount: 3, MinCount: 1}, + }, + }, + {Points: nil}, + } + q.AreaFacilities = []QuestAreaFacilitiesJSON{ + { + Points: []QuestFacilityPointJSON{ + {Type: 3, X: 5.0, Y: 0.0, Z: -5.0, Range: 2.0, ID: 10}, + }, + }, + {Points: nil}, + } + q.SomeString = "test string" + q.QuestType = "hunt" + q.GatheringTables = []QuestGatheringTableJSON{ + { + Items: []QuestGatherItemJSON{ + {Rate: 60, Item: 300}, + {Rate: 40, Item: 301}, + }, + }, + } + + b, _ := json.Marshal(q) + roundTrip(t, "all sections", string(b)) +} + +// ── Golden file test ───────────────────────────────────────────────────────── +// +// This test manually constructs expected binary bytes at specific offsets and +// verifies the compiler produces them exactly for minimalQuestJSON. +// Hard-coded values are derived from the documented binary layout. 
+// +// Layout constants for minimalQuestJSON: +// +// headerSize = 68 (0x44) +// genPropSize = 66 (0x42) +// mainPropOffset = 0x86 (= headerSize + genPropSize) +// questStringsPtr = 0x1C6 (= mainPropOffset + 320) +func TestGolden_MinimalQuestBinaryLayout(t *testing.T) { + data, err := CompileQuestJSON([]byte(minimalQuestJSON)) + if err != nil { + t.Fatalf("compile: %v", err) + } + + const ( + mainPropOffset = 0x86 + questStringsPtr = uint32(mainPropOffset + 320) // 0x1C6 + ) + + // ── Header (0x00–0x43) ─────────────────────────────────────────────── + assertU32(t, data, 0x00, mainPropOffset, "questTypeFlagsPtr") + assertU16(t, data, 0x10, 0, "subSupplyBoxPtr (unused)") + assertByte(t, data, 0x12, 0, "hidden") + assertByte(t, data, 0x13, 0, "subSupplyBoxLen") + assertU32(t, data, 0x14, 0, "questAreaPtr (null)") + assertU32(t, data, 0x1C, 0, "areaTransitionsPtr (null)") + assertU32(t, data, 0x20, 0, "areaMappingPtr (null)") + assertU32(t, data, 0x24, 0, "mapInfoPtr (null)") + assertU32(t, data, 0x28, 0, "gatheringPointsPtr (null)") + assertU32(t, data, 0x2C, 0, "areaFacilitiesPtr (null)") + assertU32(t, data, 0x30, 0, "someStringsPtr (null)") + assertU32(t, data, 0x38, 0, "gatheringTablesPtr (null)") + assertU32(t, data, 0x3C, 0, "fixedCoords2Ptr (null)") + assertU32(t, data, 0x40, 0, "fixedInfoPtr (null)") + + loadedStagesPtr := binary.LittleEndian.Uint32(data[0x04:]) + unk34Ptr := binary.LittleEndian.Uint32(data[0x34:]) + if unk34Ptr != loadedStagesPtr+16 { + t.Errorf("unk34Ptr 0x%X != loadedStagesPtr+16 (0x%X); expected exactly 1 stage × 16 bytes", + unk34Ptr, loadedStagesPtr+16) + } + + // ── General Quest Properties (0x44–0x85) ──────────────────────────── + assertU16(t, data, 0x44, 100, "monsterSizeMulti") + assertU16(t, data, 0x46, 0, "sizeRange") + assertU32(t, data, 0x48, 0, "statTable1") + assertU32(t, data, 0x4C, 120, "mainRankPoints") + assertU32(t, data, 0x50, 0, "unknown@0x50") + assertU32(t, data, 0x54, 60, "subARankPoints") + assertU32(t, 
data, 0x58, 0, "subBRankPoints") + assertU32(t, data, 0x5C, 0, "questTypeID@0x5C") + assertByte(t, data, 0x60, 0, "padding@0x60") + assertByte(t, data, 0x61, 0, "statTable2") + // 0x62–0x72: padding (17 bytes of zeros) + for i := 0x62; i <= 0x72; i++ { + assertByte(t, data, i, 0, "padding") + } + assertByte(t, data, 0x73, 0, "questKn1") + assertU16(t, data, 0x74, 0, "questKn2") + assertU16(t, data, 0x76, 0, "questKn3") + assertU16(t, data, 0x78, 0, "gatheringTablesQty") + assertByte(t, data, 0x7C, 0, "area1Zones") + assertByte(t, data, 0x7D, 0, "area2Zones") + assertByte(t, data, 0x7E, 0, "area3Zones") + assertByte(t, data, 0x7F, 0, "area4Zones") + + // ── Main Quest Properties (0x86–0x1C5) ────────────────────────────── + mp := mainPropOffset + assertByte(t, data, mp+0x00, 0, "mp.unknown@+0x00") + assertByte(t, data, mp+0x01, 0, "mp.musicMode") + assertByte(t, data, mp+0x02, 0, "mp.localeFlags") + assertByte(t, data, mp+0x08, 0, "mp.rankBand lo") // rankBand = 0 + assertByte(t, data, mp+0x09, 0, "mp.rankBand hi") + // questFee = 500 → LE bytes: 0xF4 0x01 0x00 0x00 + assertU32(t, data, mp+0x0C, 500, "mp.questFee") + // rewardMain = 5000 → LE: 0x88 0x13 0x00 0x00 + assertU32(t, data, mp+0x10, 5000, "mp.rewardMain") + assertU32(t, data, mp+0x14, 0, "mp.cartsOrReduction") + // rewardA = 1000 → LE: 0xE8 0x03 + assertU16(t, data, mp+0x18, 1000, "mp.rewardA") + assertU16(t, data, mp+0x1A, 0, "mp.padding@+0x1A") + assertU16(t, data, mp+0x1C, 0, "mp.rewardB") + assertU16(t, data, mp+0x1E, 0, "mp.hardHRReq") + // questTime = 50 × 60 × 30 = 90000 → LE: 0x10 0x5F 0x01 0x00 + assertU32(t, data, mp+0x20, 90000, "mp.questTime") + assertU32(t, data, mp+0x24, 2, "mp.questMap") + assertU32(t, data, mp+0x28, uint32(questStringsPtr), "mp.questStringsPtr") + assertU16(t, data, mp+0x2C, 0, "mp.unknown@+0x2C") + assertU16(t, data, mp+0x2E, 1, "mp.questID") + + // Objective[0]: hunt, target=11, count=1 + assertU32(t, data, mp+0x30, questObjHunt, "obj[0].goalType") + assertByte(t, data, 
mp+0x34, 11, "obj[0].target") + assertByte(t, data, mp+0x35, 0, "obj[0].pad") + assertU16(t, data, mp+0x36, 1, "obj[0].count") + + // Objective[1]: deliver, target=149, count=3 + assertU32(t, data, mp+0x38, questObjDeliver, "obj[1].goalType") + assertU16(t, data, mp+0x3C, 149, "obj[1].target") + assertU16(t, data, mp+0x3E, 3, "obj[1].count") + + // Objective[2]: none + assertU32(t, data, mp+0x40, questObjNone, "obj[2].goalType") + assertU32(t, data, mp+0x44, 0, "obj[2].trailing pad") + + assertU16(t, data, mp+0x4C, 0, "mp.joinRankMin") + assertU16(t, data, mp+0x4E, 0, "mp.joinRankMax") + assertU16(t, data, mp+0x50, 0, "mp.postRankMin") + assertU16(t, data, mp+0x52, 0, "mp.postRankMax") + + // forced equip: 6 slots × 4 × 2 = 48 bytes, all zero + for i := 0; i < 48; i++ { + assertByte(t, data, mp+0x5C+i, 0, "forced equip zero") + } + + assertByte(t, data, mp+0x97, 0, "mp.questVariant1") + assertByte(t, data, mp+0x98, 0, "mp.questVariant2") + assertByte(t, data, mp+0x99, 0, "mp.questVariant3") + assertByte(t, data, mp+0x9A, 0, "mp.questVariant4") + + // ── QuestText pointer table (0x1C6–0x1E5) ─────────────────────────── + for i := 0; i < 8; i++ { + off := int(questStringsPtr) + i*4 + strPtr := int(binary.LittleEndian.Uint32(data[off:])) + if strPtr < 0 || strPtr >= len(data) { + t.Errorf("string[%d] ptr 0x%X out of bounds (len=%d)", i, strPtr, len(data)) + } + } + + // Title pointer → "Test Quest" + titlePtr := int(binary.LittleEndian.Uint32(data[int(questStringsPtr):])) + end := titlePtr + for end < len(data) && data[end] != 0 { + end++ + } + if string(data[titlePtr:end]) != "Test Quest" { + t.Errorf("title bytes = %q, want %q", data[titlePtr:end], "Test Quest") + } + + // ── Stage entry (1 stage: stageID=2) ──────────────────────────────── + assertU32(t, data, int(loadedStagesPtr), 2, "stage[0].stageID") + for i := 1; i < 16; i++ { + assertByte(t, data, int(loadedStagesPtr)+i, 0, "stage padding") + } + + // ── Supply box: main[0] = {item:1, qty:5} 
─────────────────────────── + supplyBoxPtr := int(binary.LittleEndian.Uint32(data[0x08:])) + assertU16(t, data, supplyBoxPtr, 1, "supply_main[0].item") + assertU16(t, data, supplyBoxPtr+2, 5, "supply_main[0].quantity") + for i := 1; i < 24; i++ { + assertU32(t, data, supplyBoxPtr+i*4, 0, "supply_main slot empty") + } + subABase := supplyBoxPtr + 24*4 + for i := 0; i < 8; i++ { + assertU32(t, data, subABase+i*4, 0, "supply_subA slot empty") + } + subBBase := subABase + 8*4 + for i := 0; i < 8; i++ { + assertU32(t, data, subBBase+i*4, 0, "supply_subB slot empty") + } + + // ── Reward table ──────────────────────────────────────────────────── + rewardPtr := int(binary.LittleEndian.Uint32(data[0x0C:])) + assertByte(t, data, rewardPtr, 1, "reward header[0].tableID") + assertByte(t, data, rewardPtr+1, 0, "reward header[0].pad1") + assertU16(t, data, rewardPtr+2, 0, "reward header[0].pad2") + // headerArraySize = 1×8 + 2 = 10 + assertU32(t, data, rewardPtr+4, 10, "reward header[0].tableOffset") + assertU16(t, data, rewardPtr+8, 0xFFFF, "reward header terminator") + itemsBase := rewardPtr + 10 + assertU16(t, data, itemsBase, 50, "reward[0].items[0].rate") + assertU16(t, data, itemsBase+2, 149, "reward[0].items[0].item") + assertU16(t, data, itemsBase+4, 1, "reward[0].items[0].quantity") + assertU16(t, data, itemsBase+6, 30, "reward[0].items[1].rate") + assertU16(t, data, itemsBase+8, 153, "reward[0].items[1].item") + assertU16(t, data, itemsBase+10, 1, "reward[0].items[1].quantity") + assertU16(t, data, itemsBase+12, 0xFFFF, "reward item terminator") + + // ── Large monster spawn ────────────────────────────────────────────── + largeMonsterPtr := int(binary.LittleEndian.Uint32(data[0x18:])) + assertByte(t, data, largeMonsterPtr, 11, "monster[0].id") + assertByte(t, data, largeMonsterPtr+1, 0, "monster[0].pad1") + assertByte(t, data, largeMonsterPtr+2, 0, "monster[0].pad2") + assertByte(t, data, largeMonsterPtr+3, 0, "monster[0].pad3") + assertU32(t, data, 
largeMonsterPtr+4, 1, "monster[0].spawnAmount") + assertU32(t, data, largeMonsterPtr+8, 5, "monster[0].spawnStage") + for i := 0; i < 16; i++ { + assertByte(t, data, largeMonsterPtr+0x0C+i, 0, "monster[0].pad16") + } + assertU32(t, data, largeMonsterPtr+0x1C, 180, "monster[0].orientation") + assertF32(t, data, largeMonsterPtr+0x20, 1500.0, "monster[0].x") + assertF32(t, data, largeMonsterPtr+0x24, 0.0, "monster[0].y") + assertF32(t, data, largeMonsterPtr+0x28, -2000.0, "monster[0].z") + for i := 0; i < 16; i++ { + assertByte(t, data, largeMonsterPtr+0x2C+i, 0, "monster[0].trailing_pad") + } + assertByte(t, data, largeMonsterPtr+60, 0xFF, "monster list terminator") + + // ── Total file size ────────────────────────────────────────────────── + minExpectedLen := largeMonsterPtr + 61 + if len(data) < minExpectedLen { + t.Errorf("file too short: len=%d, need at least %d", len(data), minExpectedLen) + } +} + +// ── Golden test: generalQuestProperties with populated sections ─────────────── + +func TestGolden_GeneralQuestPropertiesCounts(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + q.AreaTransitions = []QuestAreaTransitionsJSON{{}, {}, {}} + q.GatheringTables = []QuestGatheringTableJSON{ + {Items: []QuestGatherItemJSON{{Rate: 100, Item: 1}}}, + {Items: []QuestGatherItemJSON{{Rate: 100, Item: 2}}}, + } + + b, _ := json.Marshal(q) + data, err := CompileQuestJSON(b) + if err != nil { + t.Fatalf("compile: %v", err) + } + + // area1Zones at 0x7C should be 3. + assertByte(t, data, 0x7C, 3, "area1Zones") + // gatheringTablesQty at 0x78 should be 2. 
+ assertU16(t, data, 0x78, 2, "gatheringTablesQty") +} + +// ── Golden test: map sections binary layout ─────────────────────────────────── + +func TestGolden_MapSectionsBinaryLayout(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + q.MapSections = []QuestMapSectionJSON{ + { + LoadedStage: 7, + SpawnMonsters: []uint8{0x0B}, // Rathalos + MinionSpawns: []QuestMinionSpawnJSON{ + {Monster: 0x0B, SpawnToggle: 1, SpawnAmount: 2, X: 500.0, Y: 10.0, Z: -300.0}, + }, + }, + } + + b, _ := json.Marshal(q) + data, err := CompileQuestJSON(b) + if err != nil { + t.Fatalf("compile: %v", err) + } + + // questAreaPtr must be non-null. + questAreaPtr := int(binary.LittleEndian.Uint32(data[0x14:])) + if questAreaPtr == 0 { + t.Fatal("questAreaPtr is null, expected non-null") + } + + // First entry in pointer array must be non-null (points to mapSection). + sectionPtr := int(binary.LittleEndian.Uint32(data[questAreaPtr:])) + if sectionPtr == 0 { + t.Fatal("mapSection[0] ptr is null") + } + + // Terminator after the pointer. + terminatorOff := questAreaPtr + 4 + if terminatorOff+4 > len(data) { + t.Fatalf("terminator out of bounds") + } + termVal := binary.LittleEndian.Uint32(data[terminatorOff:]) + if termVal != 0 { + t.Errorf("pointer array terminator = 0x%08X, want 0", termVal) + } + + // mapSection at sectionPtr: loadedStage = 7. + if sectionPtr+16 > len(data) { + t.Fatalf("mapSection out of bounds") + } + loadedStage := binary.LittleEndian.Uint32(data[sectionPtr:]) + if loadedStage != 7 { + t.Errorf("mapSection.loadedStage = %d, want 7", loadedStage) + } + + // spawnTypes and spawnStats ptrs must be non-null. 
+ spawnTypesPtr := int(binary.LittleEndian.Uint32(data[sectionPtr+8:])) + spawnStatsPtr := int(binary.LittleEndian.Uint32(data[sectionPtr+12:])) + if spawnTypesPtr == 0 { + t.Fatal("spawnTypesPtr is null") + } + if spawnStatsPtr == 0 { + t.Fatal("spawnStatsPtr is null") + } + + // spawnTypes: first entry = Rathalos (0x0B) + pad[3], then 0xFFFF terminator. + if spawnTypesPtr+6 > len(data) { + t.Fatalf("spawnTypes data out of bounds") + } + if data[spawnTypesPtr] != 0x0B { + t.Errorf("spawnTypes[0].monster = 0x%02X, want 0x0B", data[spawnTypesPtr]) + } + termU16 := binary.LittleEndian.Uint16(data[spawnTypesPtr+4:]) + if termU16 != 0xFFFF { + t.Errorf("spawnTypes terminator = 0x%04X, want 0xFFFF", termU16) + } + + // spawnStats: first entry monster = Rathalos (0x0B). + if data[spawnStatsPtr] != 0x0B { + t.Errorf("spawnStats[0].monster = 0x%02X, want 0x0B", data[spawnStatsPtr]) + } + // spawnToggle at +2 = 1. + spawnToggle := binary.LittleEndian.Uint16(data[spawnStatsPtr+2:]) + if spawnToggle != 1 { + t.Errorf("spawnStats[0].spawnToggle = %d, want 1", spawnToggle) + } + // spawnAmount at +4 = 2. + spawnAmount := binary.LittleEndian.Uint32(data[spawnStatsPtr+4:]) + if spawnAmount != 2 { + t.Errorf("spawnStats[0].spawnAmount = %d, want 2", spawnAmount) + } + // xPos at +0x20 = 500.0. + xBits := binary.LittleEndian.Uint32(data[spawnStatsPtr+0x20:]) + xPos := math.Float32frombits(xBits) + if xPos != 500.0 { + t.Errorf("spawnStats[0].x = %v, want 500.0", xPos) + } +} + +// ── Golden test: gathering tables binary layout ─────────────────────────────── + +func TestGolden_GatheringTablesBinaryLayout(t *testing.T) { + var q QuestJSON + _ = json.Unmarshal([]byte(minimalQuestJSON), &q) + q.GatheringTables = []QuestGatheringTableJSON{ + {Items: []QuestGatherItemJSON{{Rate: 75, Item: 500}, {Rate: 25, Item: 501}}}, + } + + b, _ := json.Marshal(q) + data, err := CompileQuestJSON(b) + if err != nil { + t.Fatalf("compile: %v", err) + } + + // gatheringTablesPtr must be non-null. 
+ gatherTablesPtr := int(binary.LittleEndian.Uint32(data[0x38:])) + if gatherTablesPtr == 0 { + t.Fatal("gatheringTablesPtr is null") + } + + // gatheringTablesQty at 0x78 must be 1. + assertU16(t, data, 0x78, 1, "gatheringTablesQty") + + // Table 0: pointer to item data. + tblPtr := int(binary.LittleEndian.Uint32(data[gatherTablesPtr:])) + if tblPtr == 0 { + t.Fatal("gathering table[0] ptr is null") + } + + // Item 0: rate=75, item=500. + if tblPtr+4 > len(data) { + t.Fatalf("gathering table items out of bounds") + } + rate0 := binary.LittleEndian.Uint16(data[tblPtr:]) + item0 := binary.LittleEndian.Uint16(data[tblPtr+2:]) + if rate0 != 75 { + t.Errorf("table[0].items[0].rate = %d, want 75", rate0) + } + if item0 != 500 { + t.Errorf("table[0].items[0].item = %d, want 500", item0) + } + + // Item 1: rate=25, item=501. + rate1 := binary.LittleEndian.Uint16(data[tblPtr+4:]) + item1 := binary.LittleEndian.Uint16(data[tblPtr+6:]) + if rate1 != 25 { + t.Errorf("table[0].items[1].rate = %d, want 25", rate1) + } + if item1 != 501 { + t.Errorf("table[0].items[1].item = %d, want 501", item1) + } + + // Terminator: 0xFFFF. 
+ term := binary.LittleEndian.Uint16(data[tblPtr+8:]) + if term != 0xFFFF { + t.Errorf("gathering table terminator = 0x%04X, want 0xFFFF", term) + } +} + +// ── Objective encoding golden tests ───────────────────────────────────────── + +func TestGolden_ObjectiveEncoding(t *testing.T) { + cases := []struct { + name string + obj QuestObjectiveJSON + wantRaw [8]byte // goalType(4) + payload(4) + }{ + { + name: "none", + obj: QuestObjectiveJSON{Type: "none"}, + // goalType=0x00000000, trailing zeros + wantRaw: [8]byte{0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00}, + }, + { + name: "hunt target=11 count=1", + obj: QuestObjectiveJSON{Type: "hunt", Target: 11, Count: 1}, + // goalType=0x00000001, u8(11)=0x0B, u8(0), u16(1)=0x01 0x00 + wantRaw: [8]byte{0x01, 0x00, 0x00, 0x00, 0x0B, 0x00, 0x01, 0x00}, + }, + { + name: "capture target=11 count=1", + obj: QuestObjectiveJSON{Type: "capture", Target: 11, Count: 1}, + // goalType=0x00000101 + wantRaw: [8]byte{0x01, 0x01, 0x00, 0x00, 0x0B, 0x00, 0x01, 0x00}, + }, + { + name: "slay target=37 count=3", + obj: QuestObjectiveJSON{Type: "slay", Target: 37, Count: 3}, + // goalType=0x00000201, u8(37)=0x25, u8(0), u16(3)=0x03 0x00 + wantRaw: [8]byte{0x01, 0x02, 0x00, 0x00, 0x25, 0x00, 0x03, 0x00}, + }, + { + name: "deliver target=149 count=3", + obj: QuestObjectiveJSON{Type: "deliver", Target: 149, Count: 3}, + // goalType=0x00000002, u16(149)=0x95 0x00, u16(3)=0x03 0x00 + wantRaw: [8]byte{0x02, 0x00, 0x00, 0x00, 0x95, 0x00, 0x03, 0x00}, + }, + { + name: "break_part target=11 part=3", + obj: QuestObjectiveJSON{Type: "break_part", Target: 11, Part: 3}, + // goalType=0x00004004, u8(11)=0x0B, u8(0), u16(part=3)=0x03 0x00 + wantRaw: [8]byte{0x04, 0x40, 0x00, 0x00, 0x0B, 0x00, 0x03, 0x00}, + }, + { + name: "slay_all", + obj: QuestObjectiveJSON{Type: "slay_all"}, + // goalType=0x00040000 — slay_all uses default (deliver) path: u16(target), u16(count) + wantRaw: [8]byte{0x00, 0x00, 0x04, 0x00, 0x00, 0x00, 0x00, 0x00}, + }, + } + + for _, tc 
:= range cases { + t.Run(tc.name, func(t *testing.T) { + got, err := objectiveBytes(tc.obj) + if err != nil { + t.Fatalf("objectiveBytes: %v", err) + } + if len(got) != 8 { + t.Fatalf("len(got) = %d, want 8", len(got)) + } + if [8]byte(got) != tc.wantRaw { + t.Errorf("bytes = %v, want %v", got, tc.wantRaw[:]) + } + }) + } +} + +// ── Helper assertions ──────────────────────────────────────────────────────── + +func assertByte(t *testing.T, data []byte, off int, want byte, label string) { + t.Helper() + if off >= len(data) { + t.Errorf("%s @ 0x%X: out of bounds (len=%d)", label, off, len(data)) + return + } + if data[off] != want { + t.Errorf("%s @ 0x%X: got 0x%02X, want 0x%02X", label, off, data[off], want) + } +} + +func assertU16(t *testing.T, data []byte, off int, want uint16, label string) { + t.Helper() + if off+2 > len(data) { + t.Errorf("%s @ 0x%X: out of bounds (len=%d)", label, off, len(data)) + return + } + got := binary.LittleEndian.Uint16(data[off:]) + if got != want { + t.Errorf("%s @ 0x%X: got %d (0x%04X), want %d (0x%04X)", label, off, got, got, want, want) + } +} + +func assertU32(t *testing.T, data []byte, off int, want uint32, label string) { + t.Helper() + if off+4 > len(data) { + t.Errorf("%s @ 0x%X: out of bounds (len=%d)", label, off, len(data)) + return + } + got := binary.LittleEndian.Uint32(data[off:]) + if got != want { + t.Errorf("%s @ 0x%X: got %d (0x%08X), want %d (0x%08X)", label, off, got, got, want, want) + } +} + +func assertF32(t *testing.T, data []byte, off int, want float32, label string) { + t.Helper() + if off+4 > len(data) { + t.Errorf("%s @ 0x%X: out of bounds (len=%d)", label, off, len(data)) + return + } + got := math.Float32frombits(binary.LittleEndian.Uint32(data[off:])) + if got != want { + t.Errorf("%s @ 0x%X: got %v, want %v", label, off, got, want) + } +} diff --git a/server/channelserver/rengoku_binary.go b/server/channelserver/rengoku_binary.go new file mode 100644 index 000000000..a8559c6b8 --- /dev/null +++ 
b/server/channelserver/rengoku_binary.go @@ -0,0 +1,181 @@ +package channelserver + +import ( + "encoding/binary" + "fmt" +) + +// rengoku binary layout (after ECD decryption + JKR decompression): +// +// @0x00: magic bytes 'r','e','f',0x1A +// @0x04: version (u8, expected 1) +// @0x05: 15 bytes of header offsets (unused by this parser) +// @0x14: RoadMode multiDef (24 bytes) +// @0x2C: RoadMode soloDef (24 bytes) +const ( + rengokuMinSize = 0x44 // header (0x14) + two RoadModes (2×24) + rengokuMultiOffset = 0x14 + rengokuSoloOffset = 0x2C + floorStatsByteSize = 24 + spawnTableByteSize = 32 + spawnPtrEntrySize = 4 // each spawn-table pointer is a u32 +) + +// rengokuRoadMode holds a parsed RoadMode struct. All pointer fields are file +// offsets into the raw (decrypted + decompressed) byte slice. +type rengokuRoadMode struct { + FloorStatsCount uint32 + SpawnCountCount uint32 + SpawnTablePtrCount uint32 + FloorStatsPtr uint32 // → FloorStats[FloorStatsCount] + SpawnTablePtrsPtr uint32 // → u32[SpawnTablePtrCount] → SpawnTable[] + SpawnCountPtrsPtr uint32 // → u32[SpawnCountCount] +} + +// RengokuBinaryInfo summarises the validated rengoku_data.bin contents for +// structured logging. It is populated by parseRengokuBinary. +type RengokuBinaryInfo struct { + MultiFloors int + MultiSpawnTables int + SoloFloors int + SoloSpawnTables int + UniqueMonsters int +} + +// parseRengokuBinary validates the structural integrity of a decrypted and +// decompressed rengoku_data.bin and returns a summary of its contents. 
+// +// It checks: +// - magic bytes and version +// - all pointer-derived ranges lie within the file +// - individual spawn-table pointers fall within the file +func parseRengokuBinary(data []byte) (*RengokuBinaryInfo, error) { + if len(data) < rengokuMinSize { + return nil, fmt.Errorf("rengoku: file too small (%d bytes, need %d)", len(data), rengokuMinSize) + } + + // Magic: 'r','e','f',0x1A + if data[0] != 'r' || data[1] != 'e' || data[2] != 'f' || data[3] != 0x1A { + return nil, fmt.Errorf("rengoku: invalid magic %02x %02x %02x %02x", + data[0], data[1], data[2], data[3]) + } + + if data[4] != 1 { + return nil, fmt.Errorf("rengoku: unexpected version %d (want 1)", data[4]) + } + + multi, err := readRoadMode(data, rengokuMultiOffset) + if err != nil { + return nil, fmt.Errorf("rengoku: multiDef: %w", err) + } + solo, err := readRoadMode(data, rengokuSoloOffset) + if err != nil { + return nil, fmt.Errorf("rengoku: soloDef: %w", err) + } + + if err := validateRoadMode(data, multi, "multiDef"); err != nil { + return nil, err + } + if err := validateRoadMode(data, solo, "soloDef"); err != nil { + return nil, err + } + + uniqueMonsters := countUniqueMonsters(data, multi) + for id := range countUniqueMonsters(data, solo) { + uniqueMonsters[id] = struct{}{} + } + + return &RengokuBinaryInfo{ + MultiFloors: int(multi.FloorStatsCount), + MultiSpawnTables: int(multi.SpawnTablePtrCount), + SoloFloors: int(solo.FloorStatsCount), + SoloSpawnTables: int(solo.SpawnTablePtrCount), + UniqueMonsters: len(uniqueMonsters), + }, nil +} + +// readRoadMode reads a 24-byte RoadMode struct from data at offset. 
+func readRoadMode(data []byte, offset int) (rengokuRoadMode, error) { + end := offset + 24 + if len(data) < end { + return rengokuRoadMode{}, fmt.Errorf("RoadMode at 0x%X extends beyond file", offset) + } + d := data[offset:] + return rengokuRoadMode{ + FloorStatsCount: binary.LittleEndian.Uint32(d[0:]), + SpawnCountCount: binary.LittleEndian.Uint32(d[4:]), + SpawnTablePtrCount: binary.LittleEndian.Uint32(d[8:]), + FloorStatsPtr: binary.LittleEndian.Uint32(d[12:]), + SpawnTablePtrsPtr: binary.LittleEndian.Uint32(d[16:]), + SpawnCountPtrsPtr: binary.LittleEndian.Uint32(d[20:]), + }, nil +} + +// ptrInBounds returns true if the region [ptr, ptr+size) fits within data. +// It guards against overflow when ptr+size wraps uint32. +func ptrInBounds(data []byte, ptr, size uint32) bool { + end := ptr + size + if end < ptr { // overflow + return false + } + return int(end) <= len(data) +} + +// validateRoadMode checks that all pointer-derived byte ranges for a RoadMode +// lie within data. +func validateRoadMode(data []byte, rm rengokuRoadMode, label string) error { + fileLen := uint32(len(data)) + + // Floor-stats array bounds. + if !ptrInBounds(data, rm.FloorStatsPtr, rm.FloorStatsCount*floorStatsByteSize) { + return fmt.Errorf("rengoku: %s: floorStats array [0x%X, +%d×%d] out of bounds (file %d B)", + label, rm.FloorStatsPtr, rm.FloorStatsCount, floorStatsByteSize, fileLen) + } + + // Spawn-table pointer array bounds. + if !ptrInBounds(data, rm.SpawnTablePtrsPtr, rm.SpawnTablePtrCount*spawnPtrEntrySize) { + return fmt.Errorf("rengoku: %s: spawnTablePtrs array [0x%X, +%d×4] out of bounds (file %d B)", + label, rm.SpawnTablePtrsPtr, rm.SpawnTablePtrCount, fileLen) + } + + // Spawn-count pointer array bounds. 
+ if !ptrInBounds(data, rm.SpawnCountPtrsPtr, rm.SpawnCountCount*spawnPtrEntrySize) { + return fmt.Errorf("rengoku: %s: spawnCountPtrs array [0x%X, +%d×4] out of bounds (file %d B)", + label, rm.SpawnCountPtrsPtr, rm.SpawnCountCount, fileLen) + } + + // Individual spawn-table pointer targets. + ptrBase := rm.SpawnTablePtrsPtr + for i := uint32(0); i < rm.SpawnTablePtrCount; i++ { + tablePtr := binary.LittleEndian.Uint32(data[ptrBase+i*4:]) + if !ptrInBounds(data, tablePtr, spawnTableByteSize) { + return fmt.Errorf("rengoku: %s: spawnTable[%d] at 0x%X is out of bounds (file %d B)", + label, i, tablePtr, fileLen) + } + } + + return nil +} + +// countUniqueMonsters iterates all SpawnTables for a RoadMode and returns a +// set of unique non-zero monster IDs (from both monsterID1 and monsterID2). +func countUniqueMonsters(data []byte, rm rengokuRoadMode) map[uint32]struct{} { + ids := make(map[uint32]struct{}) + ptrBase := rm.SpawnTablePtrsPtr + for i := uint32(0); i < rm.SpawnTablePtrCount; i++ { + tablePtr := binary.LittleEndian.Uint32(data[ptrBase+i*4:]) + if !ptrInBounds(data, tablePtr, spawnTableByteSize) { + continue + } + t := data[tablePtr:] + id1 := binary.LittleEndian.Uint32(t[0:]) + id2 := binary.LittleEndian.Uint32(t[8:]) + if id1 != 0 { + ids[id1] = struct{}{} + } + if id2 != 0 { + ids[id2] = struct{}{} + } + } + return ids +} diff --git a/server/channelserver/rengoku_binary_test.go b/server/channelserver/rengoku_binary_test.go new file mode 100644 index 000000000..da21d917c --- /dev/null +++ b/server/channelserver/rengoku_binary_test.go @@ -0,0 +1,182 @@ +package channelserver + +import ( + "encoding/binary" + "strings" + "testing" +) + +// buildRengokuData constructs a minimal but structurally valid rengoku binary +// for testing. It contains one floor and one spawn table per road mode. 
+// +// Layout: +// +// 0x00–0x13 header (magic + version + padding) +// 0x14–0x2B multiDef RoadMode +// 0x2C–0x43 soloDef RoadMode +// 0x44–0x5B multiDef FloorStats (24 bytes) +// 0x5C–0x5F multiDef spawnTablePtrs (1×u32 = 4 bytes) +// 0x60–0x63 unused (padding) +// 0x64–0x67 multiDef spawnCountPtrs (1×u32 = 4 bytes) +// 0x68–0x87 multiDef SpawnTable (32 bytes) +// 0x88–0x9F soloDef FloorStats (24 bytes) +// 0xA0–0xA3 soloDef spawnTablePtrs (1×u32) +// 0xA4–0xA7 soloDef spawnCountPtrs (1×u32) +// 0xA8–0xC7 soloDef SpawnTable (32 bytes) +func buildRengokuData(multiMonster1, multiMonster2, soloMonster1, soloMonster2 uint32) []byte { + buf := make([]byte, 0xC8) + + // Header + buf[0] = 'r' + buf[1] = 'e' + buf[2] = 'f' + buf[3] = 0x1A + buf[4] = 1 // version + + le := binary.LittleEndian + + // multiDef RoadMode at 0x14 + le.PutUint32(buf[0x14:], 1) // floorStatsCount + le.PutUint32(buf[0x18:], 1) // spawnCountCount + le.PutUint32(buf[0x1C:], 1) // spawnTablePtrCount + le.PutUint32(buf[0x20:], 0x44) // floorStatsPtr + le.PutUint32(buf[0x24:], 0x5C) // spawnTablePtrsPtr + le.PutUint32(buf[0x28:], 0x64) // spawnCountPtrsPtr + + // soloDef RoadMode at 0x2C + le.PutUint32(buf[0x2C:], 1) // floorStatsCount + le.PutUint32(buf[0x30:], 1) // spawnCountCount + le.PutUint32(buf[0x34:], 1) // spawnTablePtrCount + le.PutUint32(buf[0x38:], 0x88) // floorStatsPtr + le.PutUint32(buf[0x3C:], 0xA0) // spawnTablePtrsPtr + le.PutUint32(buf[0x40:], 0xA4) // spawnCountPtrsPtr + + // multiDef FloorStats at 0x44 (24 bytes) + le.PutUint32(buf[0x44:], 1) // floorNumber + + // multiDef spawnTablePtrs at 0x5C: points to SpawnTable at 0x68 + le.PutUint32(buf[0x5C:], 0x68) + + // multiDef SpawnTable at 0x68 (32 bytes) + le.PutUint32(buf[0x68:], multiMonster1) + le.PutUint32(buf[0x70:], multiMonster2) + + // soloDef FloorStats at 0x88 (24 bytes) + le.PutUint32(buf[0x88:], 1) // floorNumber + + // soloDef spawnTablePtrs at 0xA0: points to SpawnTable at 0xA8 + le.PutUint32(buf[0xA0:], 0xA8) + + // soloDef SpawnTable at 0xA8 (32 
bytes) + le.PutUint32(buf[0xA8:], soloMonster1) + le.PutUint32(buf[0xB0:], soloMonster2) + + return buf +} + +func TestParseRengokuBinary_ValidMinimal(t *testing.T) { + data := buildRengokuData(101, 102, 103, 101) // monster 101 appears in both roads + + info, err := parseRengokuBinary(data) + if err != nil { + t.Fatalf("parseRengokuBinary: %v", err) + } + if info.MultiFloors != 1 { + t.Errorf("MultiFloors = %d, want 1", info.MultiFloors) + } + if info.MultiSpawnTables != 1 { + t.Errorf("MultiSpawnTables = %d, want 1", info.MultiSpawnTables) + } + if info.SoloFloors != 1 { + t.Errorf("SoloFloors = %d, want 1", info.SoloFloors) + } + if info.SoloSpawnTables != 1 { + t.Errorf("SoloSpawnTables = %d, want 1", info.SoloSpawnTables) + } + // IDs present: 101, 102, 103 → 3 unique (101 shared between roads) + if info.UniqueMonsters != 3 { + t.Errorf("UniqueMonsters = %d, want 3", info.UniqueMonsters) + } +} + +func TestParseRengokuBinary_ZeroMonsterIDsExcluded(t *testing.T) { + data := buildRengokuData(0, 55, 0, 0) // only monster 55 is non-zero + + info, err := parseRengokuBinary(data) + if err != nil { + t.Fatalf("parseRengokuBinary: %v", err) + } + if info.UniqueMonsters != 1 { + t.Errorf("UniqueMonsters = %d, want 1 (zeros excluded)", info.UniqueMonsters) + } +} + +func TestParseRengokuBinary_Errors(t *testing.T) { + validData := buildRengokuData(1, 2, 3, 4) + + cases := []struct { + name string + data []byte + wantErr string + }{ + { + name: "too_small", + data: make([]byte, 10), + wantErr: "too small", + }, + { + name: "bad_magic", + data: func() []byte { + d := make([]byte, len(validData)) + copy(d, validData) + d[0] = 0xFF + return d + }(), + wantErr: "invalid magic", + }, + { + name: "wrong_version", + data: func() []byte { + d := make([]byte, len(validData)) + copy(d, validData) + d[4] = 2 + return d + }(), + wantErr: "unexpected version", + }, + { + name: "floorStats_ptr_out_of_bounds", + data: func() []byte { + d := make([]byte, len(validData)) + copy(d, 
validData) + // Set multiDef floorStatsPtr to beyond file end + binary.LittleEndian.PutUint32(d[0x20:], uint32(len(d)+1)) + return d + }(), + wantErr: "out of bounds", + }, + { + name: "spawnTable_ptr_target_out_of_bounds", + data: func() []byte { + d := make([]byte, len(validData)) + copy(d, validData) + // Point the spawn table pointer to just before the end so SpawnTable + // (32 bytes) would extend beyond the file. + binary.LittleEndian.PutUint32(d[0x5C:], uint32(len(d)-4)) + return d + }(), + wantErr: "out of bounds", + }, + } + + for _, tc := range cases { + t.Run(tc.name, func(t *testing.T) { + _, err := parseRengokuBinary(tc.data) + if err == nil { + t.Fatal("expected error, got nil") + } + if !strings.Contains(err.Error(), tc.wantErr) { + t.Errorf("error %q does not contain %q", err.Error(), tc.wantErr) + } + }) + } +} diff --git a/server/channelserver/rengoku_build.go b/server/channelserver/rengoku_build.go new file mode 100644 index 000000000..c48efde0f --- /dev/null +++ b/server/channelserver/rengoku_build.go @@ -0,0 +1,270 @@ +package channelserver + +/* + JSON-based rengoku_data.bin builder. + + Operators can place rengoku_data.json in the bin/ directory instead of + rengoku_data.bin. An existing .bin file takes precedence (consistent with + the quest and scenario loaders); the JSON is the fallback: it is parsed, + validated, assembled into the raw binary layout, and ECD-encrypted before caching. 
+ + Binary layout produced by BuildRengokuBinary: + 0x00–0x13 header (20 bytes: magic + version + zeros) + 0x14–0x2B multiDef RoadMode (24 bytes) + 0x2C–0x43 soloDef RoadMode (24 bytes) + -- multi road data -- + floorStats[] (floorStatsCount × 24 bytes) + spawnTablePtrs[] (spawnTablePtrCount × 4 bytes) + spawnCountPtrs[] (spawnTablePtrCount × 4 bytes, zeroed) + spawnTables[] (spawnTablePtrCount × 32 bytes) + -- solo road data -- (same sub-layout) +*/ + +import ( + "encoding/binary" + "encoding/json" + "fmt" + "math" + "os" + "path/filepath" + + "erupe-ce/common/decryption" + + "go.uber.org/zap" +) + +// ─── JSON schema ──────────────────────────────────────────────────────────── + +// RengokuConfig is the top-level JSON structure for rengoku_data.json. +type RengokuConfig struct { + MultiRoad RoadConfig `json:"multi_road"` + SoloRoad RoadConfig `json:"solo_road"` +} + +// RoadConfig describes one road mode (multi or solo) with its floors and +// spawn tables. Floors reference spawn tables by zero-based index. +type RoadConfig struct { + Floors []FloorConfig `json:"floors"` + SpawnTables []SpawnTableConfig `json:"spawn_tables"` +} + +// FloorConfig describes one floor within a road mode. +// +// - SpawnTableIndex: zero-based index into this road's SpawnTables slice, +// selecting which monster configuration is active on this floor. +// - PointMulti1/2: point multipliers applied to rewards on this floor. +// - FinalLoop: non-zero on the last floor of a loop cycle. +type FloorConfig struct { + FloorNumber uint32 `json:"floor_number"` + SpawnTableIndex uint32 `json:"spawn_table_index"` + Unk0 uint32 `json:"unk0,omitempty"` + PointMulti1 float32 `json:"point_multi_1"` + PointMulti2 float32 `json:"point_multi_2"` + FinalLoop uint32 `json:"final_loop,omitempty"` +} + +// SpawnTableConfig describes the two monsters that appear together on a floor. 
+type SpawnTableConfig struct { + Monster1ID uint32 `json:"monster1_id"` + Monster1Variant uint32 `json:"monster1_variant,omitempty"` + Monster2ID uint32 `json:"monster2_id"` + Monster2Variant uint32 `json:"monster2_variant,omitempty"` + StatTable uint32 `json:"stat_table,omitempty"` + MapZoneOverride uint32 `json:"map_zone_override,omitempty"` + SpawnWeighting uint32 `json:"spawn_weighting,omitempty"` + AdditionalFlag uint32 `json:"additional_flag,omitempty"` +} + +// ─── Builder ───────────────────────────────────────────────────────────────── + +// BuildRengokuBinary assembles a raw (unencrypted, uncompressed) rengoku +// binary from a RengokuConfig. The result can be passed to EncodeECD and +// served directly to clients. +func BuildRengokuBinary(cfg RengokuConfig) ([]byte, error) { + if err := validateRengokuConfig(cfg); err != nil { + return nil, err + } + + // ── Offset plan ────────────────────────────────────────────────────────── + // Fixed regions: header (0x14) + two RoadModes (2×24) = 0x44 + const dataStart = uint32(rengokuMinSize) // 0x44 + + // Multi road sections + mFloorOff := dataStart + mFloorSz := uint32(len(cfg.MultiRoad.Floors)) * floorStatsByteSize + mPtrsOff := mFloorOff + mFloorSz + mPtrsSz := uint32(len(cfg.MultiRoad.SpawnTables)) * spawnPtrEntrySize + mCntOff := mPtrsOff + mPtrsSz + mCntSz := uint32(len(cfg.MultiRoad.SpawnTables)) * spawnPtrEntrySize + mTablesOff := mCntOff + mCntSz + mTablesSz := uint32(len(cfg.MultiRoad.SpawnTables)) * spawnTableByteSize + + // Solo road sections (appended directly after multi) + sFloorOff := mTablesOff + mTablesSz + sFloorSz := uint32(len(cfg.SoloRoad.Floors)) * floorStatsByteSize + sPtrsOff := sFloorOff + sFloorSz + sPtrsSz := uint32(len(cfg.SoloRoad.SpawnTables)) * spawnPtrEntrySize + sCntOff := sPtrsOff + sPtrsSz + sCntSz := uint32(len(cfg.SoloRoad.SpawnTables)) * spawnPtrEntrySize + sTablesOff := sCntOff + sCntSz + sTablesSz := uint32(len(cfg.SoloRoad.SpawnTables)) * spawnTableByteSize + + 
totalSize := sTablesOff + sTablesSz + buf := make([]byte, totalSize) + + // ── Header ─────────────────────────────────────────────────────────────── + buf[0], buf[1], buf[2], buf[3] = 'r', 'e', 'f', 0x1A + buf[4] = 1 // version + + le := binary.LittleEndian + + // ── RoadMode structs ───────────────────────────────────────────────────── + writeRoadMode(buf, 0x14, le, RoadModeFields{ + FloorCount: uint32(len(cfg.MultiRoad.Floors)), + SpawnCount: uint32(len(cfg.MultiRoad.SpawnTables)), + TablePtrCnt: uint32(len(cfg.MultiRoad.SpawnTables)), + FloorPtr: mFloorOff, + TablePtrsPtr: mPtrsOff, + CountPtrsPtr: mCntOff, + }) + writeRoadMode(buf, 0x2C, le, RoadModeFields{ + FloorCount: uint32(len(cfg.SoloRoad.Floors)), + SpawnCount: uint32(len(cfg.SoloRoad.SpawnTables)), + TablePtrCnt: uint32(len(cfg.SoloRoad.SpawnTables)), + FloorPtr: sFloorOff, + TablePtrsPtr: sPtrsOff, + CountPtrsPtr: sCntOff, + }) + + // ── Data sections ──────────────────────────────────────────────────────── + writeFloors(buf, cfg.MultiRoad.Floors, mFloorOff, le) + writeSpawnSection(buf, cfg.MultiRoad.SpawnTables, mPtrsOff, mTablesOff, le) + + writeFloors(buf, cfg.SoloRoad.Floors, sFloorOff, le) + writeSpawnSection(buf, cfg.SoloRoad.SpawnTables, sPtrsOff, sTablesOff, le) + + return buf, nil +} + +// RoadModeFields carries the computed field values for one RoadMode struct. 
+type RoadModeFields struct { + FloorCount, SpawnCount, TablePtrCnt uint32 + FloorPtr, TablePtrsPtr, CountPtrsPtr uint32 +} + +func writeRoadMode(buf []byte, offset int, le binary.ByteOrder, f RoadModeFields) { + le.PutUint32(buf[offset:], f.FloorCount) + le.PutUint32(buf[offset+4:], f.SpawnCount) + le.PutUint32(buf[offset+8:], f.TablePtrCnt) + le.PutUint32(buf[offset+12:], f.FloorPtr) + le.PutUint32(buf[offset+16:], f.TablePtrsPtr) + le.PutUint32(buf[offset+20:], f.CountPtrsPtr) +} + +func writeFloors(buf []byte, floors []FloorConfig, base uint32, le binary.ByteOrder) { + for i, f := range floors { + off := base + uint32(i)*floorStatsByteSize + le.PutUint32(buf[off:], f.FloorNumber) + le.PutUint32(buf[off+4:], f.SpawnTableIndex) + le.PutUint32(buf[off+8:], f.Unk0) + le.PutUint32(buf[off+12:], math.Float32bits(f.PointMulti1)) + le.PutUint32(buf[off+16:], math.Float32bits(f.PointMulti2)) + le.PutUint32(buf[off+20:], f.FinalLoop) + } +} + +func writeSpawnSection(buf []byte, tables []SpawnTableConfig, ptrsBase, tablesBase uint32, le binary.ByteOrder) { + for i, t := range tables { + tableOff := tablesBase + uint32(i)*spawnTableByteSize + // Pointer entry + le.PutUint32(buf[ptrsBase+uint32(i)*spawnPtrEntrySize:], tableOff) + // SpawnTable (32 bytes) + le.PutUint32(buf[tableOff:], t.Monster1ID) + le.PutUint32(buf[tableOff+4:], t.Monster1Variant) + le.PutUint32(buf[tableOff+8:], t.Monster2ID) + le.PutUint32(buf[tableOff+12:], t.Monster2Variant) + le.PutUint32(buf[tableOff+16:], t.StatTable) + le.PutUint32(buf[tableOff+20:], t.MapZoneOverride) + le.PutUint32(buf[tableOff+24:], t.SpawnWeighting) + le.PutUint32(buf[tableOff+28:], t.AdditionalFlag) + } +} + +// validateRengokuConfig checks that all spawn_table_index references are +// within range for both road modes. 
+func validateRengokuConfig(cfg RengokuConfig) error { + for _, road := range []struct { + name string + r RoadConfig + }{{"multi_road", cfg.MultiRoad}, {"solo_road", cfg.SoloRoad}} { + n := len(road.r.SpawnTables) + for i, f := range road.r.Floors { + if int(f.SpawnTableIndex) >= n { + return fmt.Errorf("rengoku: %s floor %d: spawn_table_index %d out of range (have %d tables)", + road.name, i, f.SpawnTableIndex, n) + } + } + } + return nil +} + +// ─── Shared helper ─────────────────────────────────────────────────────────── + +// encodeRengokuECD wraps decryption.EncodeECD with error logging. +func encodeRengokuECD(raw []byte, logger *zap.Logger) ([]byte, error) { + enc, err := decryption.EncodeECD(raw, decryption.DefaultECDKey) + if err != nil { + logger.Error("rengoku: ECD encryption failed", zap.Error(err)) + } + return enc, err +} + +// ─── JSON loader ───────────────────────────────────────────────────────────── + +// loadRengokuFromJSON attempts to load rengoku configuration from +// rengoku_data.json in binPath. It returns the ECD-encrypted binary ready for +// caching, or nil if the file is absent or cannot be processed. +func loadRengokuFromJSON(binPath string, logger *zap.Logger) []byte { + path := filepath.Join(binPath, "rengoku_data.json") + raw, err := os.ReadFile(path) + if err != nil { + return nil // file absent — not an error + } + + var cfg RengokuConfig + if err := json.Unmarshal(raw, &cfg); err != nil { + logger.Error("rengoku_data.json: JSON parse error", + zap.String("path", path), zap.Error(err)) + return nil + } + + bin, err := BuildRengokuBinary(cfg) + if err != nil { + logger.Error("rengoku_data.json: binary build failed", + zap.String("path", path), zap.Error(err)) + return nil + } + + // Validate the freshly built binary (should always pass, but good to confirm). 
+ info, parseErr := parseRengokuBinary(bin) + if parseErr != nil { + logger.Error("rengoku_data.json: structural validation of built binary failed", + zap.String("path", path), zap.Error(parseErr)) + return nil + } + + enc, err := encodeRengokuECD(bin, logger) + if err != nil { + return nil + } + + logger.Info("Hunting Road config (from JSON)", + zap.Int("multi_floors", info.MultiFloors), + zap.Int("multi_spawn_tables", info.MultiSpawnTables), + zap.Int("solo_floors", info.SoloFloors), + zap.Int("solo_spawn_tables", info.SoloSpawnTables), + zap.Int("unique_monsters", info.UniqueMonsters), + ) + logger.Info("Loaded rengoku_data.json", zap.Int("bytes", len(enc))) + return enc +} diff --git a/server/channelserver/rengoku_build_test.go b/server/channelserver/rengoku_build_test.go new file mode 100644 index 000000000..286c8ef35 --- /dev/null +++ b/server/channelserver/rengoku_build_test.go @@ -0,0 +1,216 @@ +package channelserver + +import ( + "encoding/json" + "math" + "os" + "path/filepath" + "strings" + "testing" + + "go.uber.org/zap" +) + +// sampleRengokuConfig returns a small but complete RengokuConfig for tests. 
+func sampleRengokuConfig() RengokuConfig { + spawnTables := []SpawnTableConfig{ + {Monster1ID: 101, Monster1Variant: 0, Monster2ID: 102, Monster2Variant: 1, + StatTable: 3, SpawnWeighting: 10}, + {Monster1ID: 103, Monster1Variant: 2, Monster2ID: 104, Monster2Variant: 0, + SpawnWeighting: 20}, + } + floors := []FloorConfig{ + {FloorNumber: 1, SpawnTableIndex: 0, PointMulti1: 1.0, PointMulti2: 1.5}, + {FloorNumber: 2, SpawnTableIndex: 1, PointMulti1: 1.2, PointMulti2: 2.0}, + {FloorNumber: 3, SpawnTableIndex: 0, PointMulti1: 1.5, PointMulti2: 2.5, FinalLoop: 1}, + } + soloFloors := []FloorConfig{ + {FloorNumber: 1, SpawnTableIndex: 0, PointMulti1: 1.0, PointMulti2: 1.5}, + {FloorNumber: 2, SpawnTableIndex: 0, PointMulti1: 1.2, PointMulti2: 2.0}, + } + return RengokuConfig{ + MultiRoad: RoadConfig{Floors: floors, SpawnTables: spawnTables}, + SoloRoad: RoadConfig{Floors: soloFloors, SpawnTables: spawnTables[1:]}, + } +} + +// TestBuildRengokuBinary_RoundTrip builds a binary from a config and verifies +// that parseRengokuBinary accepts it and reports the expected summary. 
+func TestBuildRengokuBinary_RoundTrip(t *testing.T) { + cfg := sampleRengokuConfig() + + bin, err := BuildRengokuBinary(cfg) + if err != nil { + t.Fatalf("BuildRengokuBinary: %v", err) + } + + info, err := parseRengokuBinary(bin) + if err != nil { + t.Fatalf("parseRengokuBinary on built binary: %v", err) + } + + if info.MultiFloors != len(cfg.MultiRoad.Floors) { + t.Errorf("MultiFloors = %d, want %d", info.MultiFloors, len(cfg.MultiRoad.Floors)) + } + if info.MultiSpawnTables != len(cfg.MultiRoad.SpawnTables) { + t.Errorf("MultiSpawnTables = %d, want %d", info.MultiSpawnTables, len(cfg.MultiRoad.SpawnTables)) + } + if info.SoloFloors != len(cfg.SoloRoad.Floors) { + t.Errorf("SoloFloors = %d, want %d", info.SoloFloors, len(cfg.SoloRoad.Floors)) + } + if info.SoloSpawnTables != len(cfg.SoloRoad.SpawnTables) { + t.Errorf("SoloSpawnTables = %d, want %d", info.SoloSpawnTables, len(cfg.SoloRoad.SpawnTables)) + } + // Unique monsters: multi has 101,102,103,104; solo has 103,104 → 4 total + if info.UniqueMonsters != 4 { + t.Errorf("UniqueMonsters = %d, want 4", info.UniqueMonsters) + } +} + +// TestBuildRengokuBinary_FloatFields verifies that PointMulti1/2 values +// survive the binary encoding intact. +func TestBuildRengokuBinary_FloatFields(t *testing.T) { + cfg := RengokuConfig{ + MultiRoad: RoadConfig{ + Floors: []FloorConfig{ + {FloorNumber: 1, SpawnTableIndex: 0, PointMulti1: 1.25, PointMulti2: 3.75}, + }, + SpawnTables: []SpawnTableConfig{{Monster1ID: 1}}, + }, + SoloRoad: RoadConfig{ + Floors: []FloorConfig{{FloorNumber: 1, SpawnTableIndex: 0}}, + SpawnTables: []SpawnTableConfig{{Monster1ID: 2}}, + }, + } + + bin, err := BuildRengokuBinary(cfg) + if err != nil { + t.Fatalf("BuildRengokuBinary: %v", err) + } + + // Re-parse the binary and check that we can read back the float fields. + // The floor stats for multiDef start at rengokuMinSize (0x44). 
+ // Layout: floorNumber(4) + spawnTableIndex(4) + unk0(4) + pointMulti1(4) + pointMulti2(4) + floorBase := rengokuMinSize // 0x44 + pm1Bits := uint32(bin[floorBase+12]) | uint32(bin[floorBase+13])<<8 | + uint32(bin[floorBase+14])<<16 | uint32(bin[floorBase+15])<<24 + pm2Bits := uint32(bin[floorBase+16]) | uint32(bin[floorBase+17])<<8 | + uint32(bin[floorBase+18])<<16 | uint32(bin[floorBase+19])<<24 + + if got := math.Float32frombits(pm1Bits); got != 1.25 { + t.Errorf("PointMulti1 = %f, want 1.25", got) + } + if got := math.Float32frombits(pm2Bits); got != 3.75 { + t.Errorf("PointMulti2 = %f, want 3.75", got) + } +} + +// TestBuildRengokuBinary_ValidationErrors verifies that out-of-range +// spawn_table_index values are caught before the binary is built. +func TestBuildRengokuBinary_ValidationErrors(t *testing.T) { + cases := []struct { + name string + cfg RengokuConfig + wantErr string + }{ + { + name: "multi_index_out_of_range", + cfg: RengokuConfig{ + MultiRoad: RoadConfig{ + Floors: []FloorConfig{{FloorNumber: 1, SpawnTableIndex: 5}}, + SpawnTables: []SpawnTableConfig{{Monster1ID: 1}}, + }, + SoloRoad: RoadConfig{ + Floors: []FloorConfig{{FloorNumber: 1, SpawnTableIndex: 0}}, + SpawnTables: []SpawnTableConfig{{Monster1ID: 2}}, + }, + }, + wantErr: "multi_road", + }, + { + name: "solo_index_out_of_range", + cfg: RengokuConfig{ + MultiRoad: RoadConfig{ + Floors: []FloorConfig{{FloorNumber: 1, SpawnTableIndex: 0}}, + SpawnTables: []SpawnTableConfig{{Monster1ID: 1}}, + }, + SoloRoad: RoadConfig{ + Floors: []FloorConfig{{FloorNumber: 1, SpawnTableIndex: 99}}, + SpawnTables: []SpawnTableConfig{{Monster1ID: 2}}, + }, + }, + wantErr: "solo_road", + }, + } + + for _, tc := range cases { + t.Run(tc.name, func(t *testing.T) { + _, err := BuildRengokuBinary(tc.cfg) + if err == nil { + t.Fatal("expected error, got nil") + } + if !strings.Contains(err.Error(), tc.wantErr) { + t.Errorf("error %q does not contain %q", err.Error(), tc.wantErr) + } + }) + } +} + +// 
TestLoadRengokuBinary_BinPreferredOverJSON writes both a JSON file and a +// .bin file and verifies that the .bin source is used (consistent with the +// quest and scenario loaders). +func TestLoadRengokuBinary_BinPreferredOverJSON(t *testing.T) { + dir := t.TempDir() + logger, _ := zap.NewDevelopment() + + // Write a valid rengoku_data.json (would produce a much larger binary). + cfg := sampleRengokuConfig() + jsonBytes, err := json.Marshal(cfg) + if err != nil { + t.Fatalf("marshal: %v", err) + } + if err := os.WriteFile(filepath.Join(dir, "rengoku_data.json"), jsonBytes, 0644); err != nil { + t.Fatal(err) + } + + // Write a minimal valid-magic .bin — should be preferred over JSON. + binData := make([]byte, 16) // 16-byte ECD header, zero payload + binData[0], binData[1], binData[2], binData[3] = 0x65, 0x63, 0x64, 0x1A + if err := os.WriteFile(filepath.Join(dir, "rengoku_data.bin"), binData, 0644); err != nil { + t.Fatal(err) + } + + result := loadRengokuBinary(dir, logger) + if result == nil { + t.Fatal("expected non-nil result") + } + // The JSON-built binary would be much larger; 16 bytes confirms .bin was used. + if len(result) != 16 { + t.Errorf("result is %d bytes — looks like JSON was used instead of .bin", len(result)) + } +} + +// TestLoadRengokuBinary_JSONFallbackWhenNoBin verifies that when no .bin file +// is present, loadRengokuBinary falls back to rengoku_data.json. +func TestLoadRengokuBinary_JSONFallbackWhenNoBin(t *testing.T) { + dir := t.TempDir() + logger, _ := zap.NewDevelopment() + + cfg := sampleRengokuConfig() + jsonBytes, err := json.Marshal(cfg) + if err != nil { + t.Fatalf("marshal: %v", err) + } + if err := os.WriteFile(filepath.Join(dir, "rengoku_data.json"), jsonBytes, 0644); err != nil { + t.Fatal(err) + } + + result := loadRengokuBinary(dir, logger) + if result == nil { + t.Fatal("expected fallback to JSON, got nil") + } + // JSON-built result is much larger than 16 bytes. 
+ if len(result) <= 16 { + t.Errorf("result is %d bytes — JSON fallback likely did not run", len(result)) + } +} diff --git a/server/channelserver/repo_character.go b/server/channelserver/repo_character.go index 66b978e1f..a9a385d46 100644 --- a/server/channelserver/repo_character.go +++ b/server/channelserver/repo_character.go @@ -2,6 +2,7 @@ package channelserver import ( "database/sql" + "errors" "fmt" "time" @@ -49,10 +50,24 @@ func (r *CharacterRepository) LoadColumn(charID uint32, column string) ([]byte, return data, err } +// ErrCharacterNotFound is returned by write methods when no character row is matched. +var ErrCharacterNotFound = errors.New("character not found") + // SaveColumn writes a single []byte column by character ID. +// Returns ErrCharacterNotFound if no row was updated (character does not exist). func (r *CharacterRepository) SaveColumn(charID uint32, column string, data []byte) error { - _, err := r.db.Exec("UPDATE characters SET "+column+"=$1 WHERE id=$2", data, charID) - return err + result, err := r.db.Exec("UPDATE characters SET "+column+"=$1 WHERE id=$2", data, charID) + if err != nil { + return err + } + n, err := result.RowsAffected() + if err != nil { + return err + } + if n == 0 { + return fmt.Errorf("SaveColumn %s for char %d: %w", column, charID, ErrCharacterNotFound) + } + return nil } // ReadInt reads a single integer column (0 for NULL) by character ID. @@ -226,13 +241,26 @@ func (r *CharacterRepository) ReadGuildPostChecked(charID uint32) (time.Time, er // When rastaID is 0, only the mercenary blob is saved — the existing rasta_id // (typically NULL for characters without a mercenary) is preserved. Writing 0 // would pollute GetMercenaryLoans queries that match on pact_id. +// Returns ErrCharacterNotFound if no row was updated. 
func (r *CharacterRepository) SaveMercenary(charID uint32, data []byte, rastaID uint32) error { + var result sql.Result + var err error if rastaID == 0 { - _, err := r.db.Exec("UPDATE characters SET savemercenary=$1 WHERE id=$2", data, charID) + result, err = r.db.Exec("UPDATE characters SET savemercenary=$1 WHERE id=$2", data, charID) + } else { + result, err = r.db.Exec("UPDATE characters SET savemercenary=$1, rasta_id=$2 WHERE id=$3", data, rastaID, charID) + } + if err != nil { return err } - _, err := r.db.Exec("UPDATE characters SET savemercenary=$1, rasta_id=$2 WHERE id=$3", data, rastaID, charID) - return err + n, err := result.RowsAffected() + if err != nil { + return err + } + if n == 0 { + return fmt.Errorf("SaveMercenary for char %d: %w", charID, ErrCharacterNotFound) + } + return nil } // UpdateGCPAndPact updates gcp and pact_id atomically. @@ -241,6 +269,38 @@ func (r *CharacterRepository) UpdateGCPAndPact(charID uint32, gcp uint32, pactID return err } +// SavedataBackup holds one row from the savedata_backups table. +type SavedataBackup struct { + Slot int + Data []byte + SavedAt time.Time +} + +// LoadBackupsByRecency returns all backup slots for a character, ordered +// most-recent first. Returns an empty (non-nil) slice if no backups exist. +func (r *CharacterRepository) LoadBackupsByRecency(charID uint32) ([]SavedataBackup, error) { + rows, err := r.db.Query( + `SELECT slot, savedata, saved_at FROM savedata_backups + WHERE char_id = $1 + ORDER BY saved_at DESC`, + charID, + ) + if err != nil { + return nil, err + } + defer rows.Close() //nolint:errcheck // rows.Close error is non-actionable here + + backups := make([]SavedataBackup, 0) + for rows.Next() { + var b SavedataBackup + if err := rows.Scan(&b.Slot, &b.Data, &b.SavedAt); err != nil { + return nil, err + } + backups = append(backups, b) + } + return backups, rows.Err() +} + // SaveBackup upserts a savedata snapshot into the rotating backup table. 
func (r *CharacterRepository) SaveBackup(charID uint32, slot int, data []byte) error { _, err := r.db.Exec(` diff --git a/server/channelserver/repo_diva.go b/server/channelserver/repo_diva.go index 69fe3c658..20cf286cf 100644 --- a/server/channelserver/repo_diva.go +++ b/server/channelserver/repo_diva.go @@ -1,6 +1,11 @@ package channelserver import ( + "database/sql" + "encoding/json" + "errors" + "time" + + "github.com/jmoiron/sqlx" ) @@ -75,3 +78,155 @@ func (r *DivaRepository) GetTotalPoints(eventID uint32) (int64, int64, error) { } return qp, bp, nil } + +// GetBeads returns all active bead types from the diva_beads table. +func (r *DivaRepository) GetBeads() ([]int, error) { + var types []int + err := r.db.Select(&types, "SELECT type FROM diva_beads ORDER BY id") + return types, err +} + +// AssignBead inserts a bead assignment for a character; if an assignment already exists for that bead slot, the insert is a no-op (ON CONFLICT DO NOTHING). +func (r *DivaRepository) AssignBead(characterID uint32, beadIndex int, expiry time.Time) error { + _, err := r.db.Exec(` + INSERT INTO diva_beads_assignment (character_id, bead_index, expiry) + VALUES ($1, $2, $3) + ON CONFLICT DO NOTHING`, + characterID, beadIndex, expiry) + return err +} + +// AddBeadPoints records a bead point contribution for a character. +func (r *DivaRepository) AddBeadPoints(characterID uint32, beadIndex int, points int) error { + _, err := r.db.Exec( + "INSERT INTO diva_beads_points (character_id, bead_index, points) VALUES ($1, $2, $3)", + characterID, beadIndex, points) + return err +} + +// GetCharacterBeadPoints returns the summed points per bead_index for a character. 
+func (r *DivaRepository) GetCharacterBeadPoints(characterID uint32) (map[int]int, error) { + rows, err := r.db.Query( + "SELECT bead_index, COALESCE(SUM(points),0) FROM diva_beads_points WHERE character_id=$1 GROUP BY bead_index", + characterID) + if err != nil { + return nil, err + } + defer func() { _ = rows.Close() }() + result := make(map[int]int) + for rows.Next() { + var idx, pts int + if err := rows.Scan(&idx, &pts); err != nil { + return nil, err + } + result[idx] = pts + } + return result, rows.Err() +} + +// GetTotalBeadPoints returns the sum of all points across all characters and bead slots. +func (r *DivaRepository) GetTotalBeadPoints() (int64, error) { + var total int64 + err := r.db.QueryRow("SELECT COALESCE(SUM(points),0) FROM diva_beads_points").Scan(&total) + return total, err +} + +// GetTopBeadPerDay returns the bead_index with the most points contributed on day offset `day` +// (0 = today, 1 = yesterday, etc.). Returns 0 if no data exists for that day. +func (r *DivaRepository) GetTopBeadPerDay(day int) (int, error) { + var beadIndex int + err := r.db.QueryRow(` + SELECT bead_index + FROM diva_beads_points + WHERE timestamp >= (NOW() - ($1 + 1) * INTERVAL '1 day') + AND timestamp < (NOW() - $1 * INTERVAL '1 day') + GROUP BY bead_index + ORDER BY SUM(points) DESC + LIMIT 1`, + day).Scan(&beadIndex) + if errors.Is(err, sql.ErrNoRows) { + return 0, nil // no data for this day is not an error + } + if err != nil { + return 0, err + } + return beadIndex, nil +} + +// CleanupBeads deletes all rows from diva_beads, diva_beads_assignment, and diva_beads_points. +func (r *DivaRepository) CleanupBeads() error { + if _, err := r.db.Exec("DELETE FROM diva_beads_points"); err != nil { + return err + } + if _, err := r.db.Exec("DELETE FROM diva_beads_assignment"); err != nil { + return err + } + _, err := r.db.Exec("DELETE FROM diva_beads") + return err +} + +// GetPersonalPrizes returns all prize rows with type='personal', ordered by points_req. 
+func (r *DivaRepository) GetPersonalPrizes() ([]DivaPrize, error) { + return r.getPrizesByType("personal") +} + +// GetGuildPrizes returns all prize rows with type='guild', ordered by points_req. +func (r *DivaRepository) GetGuildPrizes() ([]DivaPrize, error) { + return r.getPrizesByType("guild") +} + +func (r *DivaRepository) getPrizesByType(prizeType string) ([]DivaPrize, error) { + rows, err := r.db.Query(` + SELECT id, type, points_req, item_type, item_id, quantity, gr, repeatable + FROM diva_prizes + WHERE type=$1 + ORDER BY points_req`, + prizeType) + if err != nil { + return nil, err + } + defer func() { _ = rows.Close() }() + var prizes []DivaPrize + for rows.Next() { + var p DivaPrize + if err := rows.Scan(&p.ID, &p.Type, &p.PointsReq, &p.ItemType, &p.ItemID, &p.Quantity, &p.GR, &p.Repeatable); err != nil { + return nil, err + } + prizes = append(prizes, p) + } + return prizes, rows.Err() +} + +// GetCharacterInterceptionPoints returns the interception_points JSON map from guild_characters; a character with no guild row yields an empty map. +func (r *DivaRepository) GetCharacterInterceptionPoints(characterID uint32) (map[string]int, error) { + var raw []byte + err := r.db.QueryRow( + "SELECT interception_points FROM guild_characters WHERE char_id=$1", + characterID).Scan(&raw) + if errors.Is(err, sql.ErrNoRows) { + return map[string]int{}, nil + } + if err != nil { + return nil, err + } + result := make(map[string]int) + if len(raw) > 0 { + if err := json.Unmarshal(raw, &result); err != nil { + return nil, err + } + } + return result, nil +} + +// AddInterceptionPoints increments the interception points for a quest file ID in guild_characters. 
+func (r *DivaRepository) AddInterceptionPoints(characterID uint32, questFileID int, points int) error { + _, err := r.db.Exec(` + UPDATE guild_characters + SET interception_points = interception_points || jsonb_build_object( + $2::text, + COALESCE((interception_points->>$2::text)::int, 0) + $3 + ) + WHERE char_id=$1`, + characterID, questFileID, points) + return err +} diff --git a/server/channelserver/repo_guild.go b/server/channelserver/repo_guild.go index f9eca11e5..09d30d85e 100644 --- a/server/channelserver/repo_guild.go +++ b/server/channelserver/repo_guild.go @@ -5,6 +5,7 @@ import ( "database/sql" "errors" "fmt" + "time" "github.com/jmoiron/sqlx" ) @@ -34,6 +35,7 @@ SELECT leader_id, c.name AS leader_name, comment, + return_type, COALESCE(pugi_name_1, '') AS pugi_name_1, COALESCE(pugi_name_2, '') AS pugi_name_2, COALESCE(pugi_name_3, '') AS pugi_name_3, @@ -195,6 +197,62 @@ func (r *GuildRepository) Create(leaderCharID uint32, guildName string) (int32, return guildID, nil } +// FindOrCreateReturnGuild finds an existing return guild of the given type with fewer +// than 60 members, or creates a new one. The name template receives the guild count+1 +// as its single %d argument. Returns the guild ID. +func (r *GuildRepository) FindOrCreateReturnGuild(returnType uint8, nameTemplate string) (uint32, error) { + var guildID uint32 + err := r.db.QueryRow(` + SELECT g.id FROM guilds g + WHERE g.return_type = $1 + AND (SELECT COUNT(1) FROM guild_characters gc WHERE gc.guild_id = g.id) < 60 + LIMIT 1 + `, returnType).Scan(&guildID) + if err == nil { + return guildID, nil + } + if !errors.Is(err, sql.ErrNoRows) { + return 0, err + } + + // No suitable guild — count existing ones and create a new one. 
+ var count int + if err := r.db.QueryRow( + `SELECT COUNT(1) FROM guilds WHERE return_type = $1`, returnType, + ).Scan(&count); err != nil { + return 0, err + } + + tx, err := r.db.BeginTxx(context.Background(), nil) + if err != nil { + return 0, err + } + defer func() { _ = tx.Rollback() }() + + name := fmt.Sprintf(nameTemplate, count+1) + if err := tx.QueryRow( + `INSERT INTO guilds (name, leader_id, return_type, rank_rp) VALUES ($1, 0, $2, 1200) RETURNING id`, + name, returnType, + ).Scan(&guildID); err != nil { + return 0, err + } + + if err := tx.Commit(); err != nil { + return 0, err + } + return guildID, nil +} + +// AddMember inserts a character into a guild's member list. +func (r *GuildRepository) AddMember(guildID, charID uint32) error { + _, err := r.db.Exec(` + INSERT INTO guild_characters (guild_id, character_id, order_index) + VALUES ($1, $2, (SELECT COALESCE(MAX(order_index), 0) + 1 FROM guild_characters WHERE guild_id = $1)) + ON CONFLICT (guild_id, character_id) DO NOTHING + `, guildID, charID) + return err +} + // Save persists guild metadata changes. func (r *GuildRepository) Save(guild *Guild) error { _, err := r.db.Exec(` @@ -270,8 +328,9 @@ func (r *GuildRepository) CreateApplication(guildID, charID, actorID uint32, app return err } -// CreateApplicationWithMail atomically creates an application and sends a notification mail. -func (r *GuildRepository) CreateApplicationWithMail(guildID, charID, actorID uint32, appType GuildApplicationType, mailSenderID, mailRecipientID uint32, mailSubject, mailBody string) error { +// CreateInviteWithMail atomically inserts a scout invitation into guild_invites +// and sends a notification mail to the target character. 
+func (r *GuildRepository) CreateInviteWithMail(guildID, charID, actorID uint32, mailSenderID, mailRecipientID uint32, mailSubject, mailBody string) error { tx, err := r.db.BeginTxx(context.Background(), nil) if err != nil { return err @@ -279,8 +338,8 @@ func (r *GuildRepository) CreateApplicationWithMail(guildID, charID, actorID uin defer func() { _ = tx.Rollback() }() if _, err := tx.Exec( - `INSERT INTO guild_applications (guild_id, character_id, actor_id, application_type) VALUES ($1, $2, $3, $4)`, - guildID, charID, actorID, appType); err != nil { + `INSERT INTO guild_invites (guild_id, character_id, actor_id) VALUES ($1, $2, $3)`, + guildID, charID, actorID); err != nil { return err } if _, err := tx.Exec(mailInsertQuery, mailSenderID, mailRecipientID, mailSubject, mailBody, 0, 0, true, false); err != nil { @@ -289,11 +348,55 @@ func (r *GuildRepository) CreateApplicationWithMail(guildID, charID, actorID uin return tx.Commit() } -// CancelInvitation removes an invitation for a character. -func (r *GuildRepository) CancelInvitation(guildID, charID uint32) error { +// HasInvite reports whether a pending scout invitation exists for the character in the guild. +func (r *GuildRepository) HasInvite(guildID, charID uint32) (bool, error) { + var n int + err := r.db.QueryRow( + `SELECT 1 FROM guild_invites WHERE guild_id = $1 AND character_id = $2`, + guildID, charID, + ).Scan(&n) + if errors.Is(err, sql.ErrNoRows) { + return false, nil + } + if err != nil { + return false, err + } + return true, nil +} + +// CancelInvite removes a scout invitation by its primary key. +func (r *GuildRepository) CancelInvite(inviteID uint32) error { + _, err := r.db.Exec(`DELETE FROM guild_invites WHERE id = $1`, inviteID) + return err +} + +// AcceptInvite removes the scout invitation and adds the character to the guild atomically. 
+func (r *GuildRepository) AcceptInvite(guildID, charID uint32) error {
+	tx, err := r.db.BeginTxx(context.Background(), nil)
+	if err != nil {
+		return err
+	}
+	defer func() { _ = tx.Rollback() }()
+
+	if _, err := tx.Exec(
+		`DELETE FROM guild_invites WHERE guild_id = $1 AND character_id = $2`,
+		guildID, charID); err != nil {
+		return err
+	}
+	if _, err := tx.Exec(`
+		INSERT INTO guild_characters (guild_id, character_id, order_index)
+		VALUES ($1, $2, (SELECT COALESCE(MAX(order_index), 0) + 1 FROM guild_characters WHERE guild_id = $1))
+	`, guildID, charID); err != nil {
+		return err
+	}
+	return tx.Commit()
+}
+
+// DeclineInvite removes a scout invitation without joining the guild.
+func (r *GuildRepository) DeclineInvite(guildID, charID uint32) error {
 	_, err := r.db.Exec(
-		`DELETE FROM guild_applications WHERE character_id = $1 AND guild_id = $2 AND application_type = 'invited'`,
-		charID, guildID,
+		`DELETE FROM guild_invites WHERE guild_id = $1 AND character_id = $2`,
+		guildID, charID,
 	)
 	return err
 }
@@ -433,34 +536,39 @@ func (r *GuildRepository) SetRecruiter(charID uint32, allowed bool) error {
 	return err
 }
 
-// ScoutedCharacter represents an invited character in the scout list.
-type ScoutedCharacter struct {
-	CharID  uint32 `db:"id"`
-	Name    string `db:"name"`
-	HR      uint16 `db:"hr"`
-	GR      uint16 `db:"gr"`
-	ActorID uint32 `db:"actor_id"`
+// GuildInvite represents a pending scout invitation with the target character's info.
+type GuildInvite struct {
+	ID        uint32    `db:"id"`
+	GuildID   uint32    `db:"guild_id"`
+	CharID    uint32    `db:"character_id"`
+	ActorID   uint32    `db:"actor_id"`
+	InvitedAt time.Time `db:"created_at"`
+	HR        uint16    `db:"hr"`
+	GR        uint16    `db:"gr"`
+	Name      string    `db:"name"`
 }
 
-// ListInvitedCharacters returns all characters with pending guild invitations.
-func (r *GuildRepository) ListInvitedCharacters(guildID uint32) ([]*ScoutedCharacter, error) { +// ListInvites returns all pending scout invitations for a guild, including +// the target character's HR, GR, and name. +func (r *GuildRepository) ListInvites(guildID uint32) ([]*GuildInvite, error) { rows, err := r.db.Queryx(` - SELECT c.id, c.name, c.hr, c.gr, ga.actor_id - FROM guild_applications ga - JOIN characters c ON c.id = ga.character_id - WHERE ga.guild_id = $1 AND ga.application_type = 'invited' + SELECT gi.id, gi.guild_id, gi.character_id, gi.actor_id, gi.created_at, + c.hr, c.gr, c.name + FROM guild_invites gi + JOIN characters c ON c.id = gi.character_id + WHERE gi.guild_id = $1 `, guildID) if err != nil { return nil, err } defer func() { _ = rows.Close() }() - var chars []*ScoutedCharacter + var invites []*GuildInvite for rows.Next() { - sc := &ScoutedCharacter{} - if err := rows.StructScan(sc); err != nil { + inv := &GuildInvite{} + if err := rows.StructScan(inv); err != nil { continue } - chars = append(chars, sc) + invites = append(invites, inv) } - return chars, nil + return invites, nil } diff --git a/server/channelserver/repo_guild_subsystems_test.go b/server/channelserver/repo_guild_subsystems_test.go new file mode 100644 index 000000000..d636c2c15 --- /dev/null +++ b/server/channelserver/repo_guild_subsystems_test.go @@ -0,0 +1,227 @@ +package channelserver + +// Tests for guild subsystem methods not covered by repo_guild_test.go: +// - SetAllianceRecruiting (repo_guild_alliance.go) +// - RolloverDailyRP (repo_guild_rp.go) +// - AddWeeklyBonusUsers (repo_guild_rp.go) +// - InsertKillLog (repo_guild_hunt.go) +// - ClearTreasureHunt (repo_guild_hunt.go) + +import ( + "testing" + "time" +) + +func TestSetAllianceRecruiting(t *testing.T) { + db := SetupTestDB(t) + defer TeardownTestDB(t, db) + + userID := CreateTestUser(t, db, "sar_user") + charID := CreateTestCharacter(t, db, userID, "SAR_Leader") + guildID := CreateTestGuild(t, db, charID, 
"SAR_Guild") + repo := NewGuildRepository(db) + + if err := repo.CreateAlliance("SAR_Alliance", guildID); err != nil { + t.Fatalf("CreateAlliance failed: %v", err) + } + alliances, err := repo.ListAlliances() + if err != nil { + t.Fatalf("ListAlliances failed: %v", err) + } + if len(alliances) == 0 { + t.Fatal("Expected at least 1 alliance") + } + allianceID := alliances[0].ID + + // Default should be false. + if alliances[0].Recruiting { + t.Error("Expected initial Recruiting=false") + } + + if err := repo.SetAllianceRecruiting(allianceID, true); err != nil { + t.Fatalf("SetAllianceRecruiting(true) failed: %v", err) + } + alliance, err := repo.GetAllianceByID(allianceID) + if err != nil { + t.Fatalf("GetAllianceByID after set true failed: %v", err) + } + if !alliance.Recruiting { + t.Error("Expected Recruiting=true after SetAllianceRecruiting(true)") + } + + if err := repo.SetAllianceRecruiting(allianceID, false); err != nil { + t.Fatalf("SetAllianceRecruiting(false) failed: %v", err) + } + alliance, err = repo.GetAllianceByID(allianceID) + if err != nil { + t.Fatalf("GetAllianceByID after set false failed: %v", err) + } + if alliance.Recruiting { + t.Error("Expected Recruiting=false after SetAllianceRecruiting(false)") + } +} + +func TestRolloverDailyRP(t *testing.T) { + db := SetupTestDB(t) + defer TeardownTestDB(t, db) + + userID := CreateTestUser(t, db, "rollover_user") + charID := CreateTestCharacter(t, db, userID, "Rollover_Leader") + guildID := CreateTestGuild(t, db, charID, "Rollover_Guild") + repo := NewGuildRepository(db) + + // Set rp_today for the member so we can verify the rollover. 
+ if _, err := db.Exec("UPDATE guild_characters SET rp_today = 50 WHERE character_id = $1", charID); err != nil { + t.Fatalf("Failed to set rp_today: %v", err) + } + + noon := time.Now().UTC() + if err := repo.RolloverDailyRP(guildID, noon); err != nil { + t.Fatalf("RolloverDailyRP failed: %v", err) + } + + var rpToday, rpYesterday int + if err := db.QueryRow("SELECT rp_today, rp_yesterday FROM guild_characters WHERE character_id = $1", charID). + Scan(&rpToday, &rpYesterday); err != nil { + t.Fatalf("Failed to read rp values: %v", err) + } + if rpToday != 0 { + t.Errorf("Expected rp_today=0 after rollover, got %d", rpToday) + } + if rpYesterday != 50 { + t.Errorf("Expected rp_yesterday=50 after rollover, got %d", rpYesterday) + } +} + +func TestRolloverDailyRP_Idempotent(t *testing.T) { + db := SetupTestDB(t) + defer TeardownTestDB(t, db) + + userID := CreateTestUser(t, db, "idem_rollover_user") + charID := CreateTestCharacter(t, db, userID, "IdemRollLeader") + guildID := CreateTestGuild(t, db, charID, "Idem_Rollover_Guild") + repo := NewGuildRepository(db) + + if _, err := db.Exec("UPDATE guild_characters SET rp_today = 100 WHERE character_id = $1", charID); err != nil { + t.Fatalf("Failed to set rp_today: %v", err) + } + + noon := time.Now().UTC() + if err := repo.RolloverDailyRP(guildID, noon); err != nil { + t.Fatalf("First RolloverDailyRP failed: %v", err) + } + // Second call with same noon should be a no-op (rp_reset_at >= noon). 
+ if err := repo.RolloverDailyRP(guildID, noon); err != nil { + t.Fatalf("Second RolloverDailyRP (idempotent) failed: %v", err) + } + + var rpToday int + _ = db.QueryRow("SELECT rp_today FROM guild_characters WHERE character_id = $1", charID).Scan(&rpToday) + if rpToday != 0 { + t.Errorf("Expected rp_today=0 after idempotent rollover, got %d", rpToday) + } +} + +func TestAddWeeklyBonusUsers(t *testing.T) { + db := SetupTestDB(t) + defer TeardownTestDB(t, db) + + userID := CreateTestUser(t, db, "wbu_user") + charID := CreateTestCharacter(t, db, userID, "WBU_Leader") + guildID := CreateTestGuild(t, db, charID, "WBU_Guild") + repo := NewGuildRepository(db) + + if err := repo.AddWeeklyBonusUsers(guildID, 3); err != nil { + t.Fatalf("AddWeeklyBonusUsers failed: %v", err) + } + + // Verify the column incremented. + var wbu int + if err := db.QueryRow("SELECT weekly_bonus_users FROM guilds WHERE id = $1", guildID).Scan(&wbu); err != nil { + t.Fatalf("Failed to read weekly_bonus_users: %v", err) + } + if wbu != 3 { + t.Errorf("Expected weekly_bonus_users=3, got %d", wbu) + } + + // Add again and verify accumulation. + if err := repo.AddWeeklyBonusUsers(guildID, 2); err != nil { + t.Fatalf("Second AddWeeklyBonusUsers failed: %v", err) + } + if err := db.QueryRow("SELECT weekly_bonus_users FROM guilds WHERE id = $1", guildID).Scan(&wbu); err != nil { + t.Fatalf("Failed to read weekly_bonus_users after second add: %v", err) + } + if wbu != 5 { + t.Errorf("Expected weekly_bonus_users=5 after second add, got %d", wbu) + } +} + +func TestInsertKillLogAndCount(t *testing.T) { + db := SetupTestDB(t) + defer TeardownTestDB(t, db) + + userID := CreateTestUser(t, db, "kill_log_user") + charID := CreateTestCharacter(t, db, userID, "Kill_Logger") + guildID := CreateTestGuild(t, db, charID, "Kill_Guild") + repo := NewGuildRepository(db) + + // Set box_claimed to 1 hour ago so kills inserted now are within the window. 
+ if _, err := db.Exec("UPDATE guild_characters SET box_claimed = now() - interval '1 hour' WHERE character_id = $1", charID); err != nil { + t.Fatalf("Failed to set box_claimed: %v", err) + } + + if err := repo.InsertKillLog(charID, 42, 2, time.Now()); err != nil { + t.Fatalf("InsertKillLog failed: %v", err) + } + + count, err := repo.CountGuildKills(guildID, charID) + if err != nil { + t.Fatalf("CountGuildKills failed: %v", err) + } + if count != 1 { + t.Errorf("Expected 1 kill log entry, got %d", count) + } +} + +func TestClearTreasureHunt(t *testing.T) { + db := SetupTestDB(t) + defer TeardownTestDB(t, db) + + userID := CreateTestUser(t, db, "cth_user") + charID := CreateTestCharacter(t, db, userID, "CTH_Leader") + guildID := CreateTestGuild(t, db, charID, "CTH_Guild") + repo := NewGuildRepository(db) + + // Create and register a hunt. + if err := repo.CreateHunt(guildID, charID, 7, 1, []byte{}, ""); err != nil { + t.Fatalf("CreateHunt failed: %v", err) + } + hunt, err := repo.GetPendingHunt(charID) + if err != nil || hunt == nil { + t.Fatalf("GetPendingHunt failed or nil: %v", err) + } + if err := repo.RegisterHuntReport(hunt.HuntID, charID); err != nil { + t.Fatalf("RegisterHuntReport failed: %v", err) + } + + // Verify treasure_hunt is set. + var th interface{} + if err := db.QueryRow("SELECT treasure_hunt FROM guild_characters WHERE character_id = $1", charID).Scan(&th); err != nil { + t.Fatalf("Failed to read treasure_hunt: %v", err) + } + if th == nil { + t.Error("Expected treasure_hunt to be set after RegisterHuntReport") + } + + // Clear it. 
+ if err := repo.ClearTreasureHunt(charID); err != nil { + t.Fatalf("ClearTreasureHunt failed: %v", err) + } + + if err := db.QueryRow("SELECT treasure_hunt FROM guild_characters WHERE character_id = $1", charID).Scan(&th); err != nil { + t.Fatalf("Failed to read treasure_hunt after clear: %v", err) + } + if th != nil { + t.Errorf("Expected treasure_hunt=nil after ClearTreasureHunt, got %v", th) + } +} diff --git a/server/channelserver/repo_guild_test.go b/server/channelserver/repo_guild_test.go index fc2ff6c3c..d239739d2 100644 --- a/server/channelserver/repo_guild_test.go +++ b/server/channelserver/repo_guild_test.go @@ -533,66 +533,77 @@ func TestAddMemberDailyRP(t *testing.T) { // --- Invitation / Scout tests --- -func TestCancelInvitation(t *testing.T) { +func TestCancelInvite(t *testing.T) { repo, db, guildID, leaderID := setupGuildRepo(t) user2 := CreateTestUser(t, db, "invite_user") char2 := CreateTestCharacter(t, db, user2, "Invited") - if err := repo.CreateApplication(guildID, char2, leaderID, GuildApplicationTypeInvited); err != nil { - t.Fatalf("CreateApplication (invited) failed: %v", err) + if err := repo.CreateInviteWithMail(guildID, char2, leaderID, leaderID, char2, "Invite", "body"); err != nil { + t.Fatalf("CreateInviteWithMail failed: %v", err) } - if err := repo.CancelInvitation(guildID, char2); err != nil { - t.Fatalf("CancelInvitation failed: %v", err) + invites, err := repo.ListInvites(guildID) + if err != nil || len(invites) != 1 { + t.Fatalf("Expected 1 invite, got %d (err: %v)", len(invites), err) } - has, err := repo.HasApplication(guildID, char2) + if err := repo.CancelInvite(invites[0].ID); err != nil { + t.Fatalf("CancelInvite failed: %v", err) + } + + has, err := repo.HasInvite(guildID, char2) if err != nil { - t.Fatalf("HasApplication failed: %v", err) + t.Fatalf("HasInvite failed: %v", err) } if has { - t.Error("Expected no application after cancellation") + t.Error("Expected no invite after cancellation") } } -func 
TestListInvitedCharacters(t *testing.T) { +func TestListInvites(t *testing.T) { repo, db, guildID, leaderID := setupGuildRepo(t) user2 := CreateTestUser(t, db, "scout_user") char2 := CreateTestCharacter(t, db, user2, "Scouted") - if err := repo.CreateApplication(guildID, char2, leaderID, GuildApplicationTypeInvited); err != nil { - t.Fatalf("CreateApplication failed: %v", err) + if err := repo.CreateInviteWithMail(guildID, char2, leaderID, leaderID, char2, "Invite", "body"); err != nil { + t.Fatalf("CreateInviteWithMail failed: %v", err) } - chars, err := repo.ListInvitedCharacters(guildID) + invites, err := repo.ListInvites(guildID) if err != nil { - t.Fatalf("ListInvitedCharacters failed: %v", err) + t.Fatalf("ListInvites failed: %v", err) } - if len(chars) != 1 { - t.Fatalf("Expected 1 invited character, got %d", len(chars)) + if len(invites) != 1 { + t.Fatalf("Expected 1 invite, got %d", len(invites)) } - if chars[0].CharID != char2 { - t.Errorf("Expected char ID %d, got %d", char2, chars[0].CharID) + if invites[0].CharID != char2 { + t.Errorf("Expected char ID %d, got %d", char2, invites[0].CharID) } - if chars[0].Name != "Scouted" { - t.Errorf("Expected name 'Scouted', got %q", chars[0].Name) + if invites[0].Name != "Scouted" { + t.Errorf("Expected name 'Scouted', got %q", invites[0].Name) } - if chars[0].ActorID != leaderID { - t.Errorf("Expected actor ID %d, got %d", leaderID, chars[0].ActorID) + if invites[0].ActorID != leaderID { + t.Errorf("Expected actor ID %d, got %d", leaderID, invites[0].ActorID) + } + if invites[0].ID == 0 { + t.Error("Expected non-zero invite ID") + } + if invites[0].InvitedAt.IsZero() { + t.Error("Expected non-zero InvitedAt timestamp") } } -func TestListInvitedCharactersEmpty(t *testing.T) { +func TestListInvitesEmpty(t *testing.T) { repo, _, guildID, _ := setupGuildRepo(t) - chars, err := repo.ListInvitedCharacters(guildID) + invites, err := repo.ListInvites(guildID) if err != nil { - t.Fatalf("ListInvitedCharacters failed: %v", 
err) + t.Fatalf("ListInvites failed: %v", err) } - if len(chars) != 0 { - t.Errorf("Expected 0 invited characters, got %d", len(chars)) + if len(invites) != 0 { + t.Errorf("Expected 0 invites, got %d", len(invites)) } } @@ -1486,28 +1497,26 @@ func TestDisbandCleansUpAlliance(t *testing.T) { } } -// --- CreateApplicationWithMail --- +// --- CreateInviteWithMail --- -func TestCreateApplicationWithMail(t *testing.T) { +func TestCreateInviteWithMail(t *testing.T) { repo, db, guildID, leaderID := setupGuildRepo(t) user2 := CreateTestUser(t, db, "scout_mail_user") char2 := CreateTestCharacter(t, db, user2, "ScoutTarget") - err := repo.CreateApplicationWithMail( - guildID, char2, leaderID, GuildApplicationTypeInvited, - leaderID, char2, "Guild Invite", "You have been invited!") + err := repo.CreateInviteWithMail(guildID, char2, leaderID, leaderID, char2, "Guild Invite", "You have been invited!") if err != nil { - t.Fatalf("CreateApplicationWithMail failed: %v", err) + t.Fatalf("CreateInviteWithMail failed: %v", err) } - // Verify application was created - has, err := repo.HasApplication(guildID, char2) + // Verify invite was created + has, err := repo.HasInvite(guildID, char2) if err != nil { - t.Fatalf("HasApplication failed: %v", err) + t.Fatalf("HasInvite failed: %v", err) } if !has { - t.Error("Expected application to exist after CreateApplicationWithMail") + t.Error("Expected invite to exist after CreateInviteWithMail") } // Verify mail was sent diff --git a/server/channelserver/repo_interfaces.go b/server/channelserver/repo_interfaces.go index 7d630a0c6..b25087b6f 100644 --- a/server/channelserver/repo_interfaces.go +++ b/server/channelserver/repo_interfaces.go @@ -48,6 +48,9 @@ type CharacterRepo interface { // LoadSaveDataWithHash loads savedata along with its stored SHA-256 hash. // The hash may be nil for characters saved before checksums were introduced. 
LoadSaveDataWithHash(charID uint32) (id uint32, savedata []byte, isNew bool, name string, hash []byte, err error) + // LoadBackupsByRecency returns all backup slots for a character ordered + // most-recent first. Returns an empty slice if no backups exist. + LoadBackupsByRecency(charID uint32) ([]SavedataBackup, error) } // GuildRepo defines the contract for guild data access. @@ -61,8 +64,11 @@ type GuildRepo interface { RemoveCharacter(charID uint32) error AcceptApplication(guildID, charID uint32) error CreateApplication(guildID, charID, actorID uint32, appType GuildApplicationType) error - CreateApplicationWithMail(guildID, charID, actorID uint32, appType GuildApplicationType, mailSenderID, mailRecipientID uint32, mailSubject, mailBody string) error - CancelInvitation(guildID, charID uint32) error + CreateInviteWithMail(guildID, charID, actorID uint32, mailSenderID, mailRecipientID uint32, mailSubject, mailBody string) error + HasInvite(guildID, charID uint32) (bool, error) + CancelInvite(inviteID uint32) error + AcceptInvite(guildID, charID uint32) error + DeclineInvite(guildID, charID uint32) error RejectApplication(guildID, charID uint32) error ArrangeCharacters(charIDs []uint32) error GetApplication(guildID, charID uint32, appType GuildApplicationType) (*GuildApplication, error) @@ -117,9 +123,11 @@ type GuildRepo interface { CountGuildKills(guildID, charID uint32) (int, error) ClearTreasureHunt(charID uint32) error InsertKillLog(charID uint32, monster int, quantity uint8, timestamp time.Time) error - ListInvitedCharacters(guildID uint32) ([]*ScoutedCharacter, error) + ListInvites(guildID uint32) ([]*GuildInvite, error) RolloverDailyRP(guildID uint32, noon time.Time) error AddWeeklyBonusUsers(guildID uint32, numUsers uint8) error + FindOrCreateReturnGuild(returnType uint8, nameTemplate string) (uint32, error) + AddMember(guildID, charID uint32) error } // UserRepo defines the contract for user account data access. 
@@ -319,6 +327,18 @@ type GoocooRepo interface { SaveSlot(charID uint32, slot uint32, data []byte) error } +// DivaPrize represents a single reward milestone for the personal or guild track. +type DivaPrize struct { + ID int + Type string + PointsReq int + ItemType int + ItemID int + Quantity int + GR bool + Repeatable bool +} + // DivaRepo defines the contract for diva event data access. type DivaRepo interface { DeleteEvents() error @@ -327,6 +347,23 @@ type DivaRepo interface { AddPoints(charID uint32, eventID uint32, questPoints, bonusPoints uint32) error GetPoints(charID uint32, eventID uint32) (questPoints, bonusPoints int64, err error) GetTotalPoints(eventID uint32) (questPoints, bonusPoints int64, err error) + + // Bead management + GetBeads() ([]int, error) + AssignBead(characterID uint32, beadIndex int, expiry time.Time) error + AddBeadPoints(characterID uint32, beadIndex int, points int) error + GetCharacterBeadPoints(characterID uint32) (map[int]int, error) + GetTotalBeadPoints() (int64, error) + GetTopBeadPerDay(day int) (int, error) + CleanupBeads() error + + // Prize rewards + GetPersonalPrizes() ([]DivaPrize, error) + GetGuildPrizes() ([]DivaPrize, error) + + // Interception points (guild_characters.interception_points JSON) + GetCharacterInterceptionPoints(characterID uint32) (map[string]int, error) + AddInterceptionPoints(characterID uint32, questFileID int, points int) error } // MiscRepo defines the contract for miscellaneous data access. @@ -348,3 +385,61 @@ type MercenaryRepo interface { GetGuildHuntCatsUsed(charID uint32) ([]GuildHuntCatUsage, error) GetGuildAirou(guildID uint32) ([][]byte, error) } + +// Tournament represents a tournament schedule entry. 
+type Tournament struct { + ID uint32 `db:"id"` + Name string `db:"name"` + StartTime int64 `db:"start_time"` + EntryEnd int64 `db:"entry_end"` + RankingEnd int64 `db:"ranking_end"` + RewardEnd int64 `db:"reward_end"` +} + +// TournamentCup represents a competition category within a tournament. +type TournamentCup struct { + ID uint32 `db:"id"` + CupGroup int16 `db:"cup_group"` + CupType int16 `db:"cup_type"` + Unk int16 `db:"unk"` + Name string `db:"name"` + Description string `db:"description"` +} + +// TournamentSubEvent represents a specific hunt/fish target within a cup group. +type TournamentSubEvent struct { + ID uint32 `db:"id"` + CupGroup int16 `db:"cup_group"` + EventSubType int16 `db:"event_sub_type"` + QuestFileID uint32 `db:"quest_file_id"` + Name string `db:"name"` +} + +// TournamentRankEntry is a single entry in a leaderboard. +type TournamentRankEntry struct { + CharID uint32 + Rank uint32 + Grade uint16 + HR uint16 + GR uint16 + CharName string + GuildName string +} + +// TournamentEntry represents a player's registration for a tournament. +type TournamentEntry struct { + ID uint32 `db:"id"` + CharID uint32 `db:"char_id"` + TournamentID uint32 `db:"tournament_id"` +} + +// TournamentRepo defines the contract for tournament schedule and result data access. 
+type TournamentRepo interface { + GetActive(now int64) (*Tournament, error) + GetCups(tournamentID uint32) ([]TournamentCup, error) + GetSubEvents() ([]TournamentSubEvent, error) + Register(charID, tournamentID uint32) (entryID uint32, err error) + GetEntry(charID, tournamentID uint32) (*TournamentEntry, error) + SubmitResult(charID, tournamentID, eventID, questSlot, stageHandle uint32) error + GetLeaderboard(eventID uint32) ([]TournamentRankEntry, error) +} diff --git a/server/channelserver/repo_mocks_test.go b/server/channelserver/repo_mocks_test.go index 1291c5a74..c05ac359e 100644 --- a/server/channelserver/repo_mocks_test.go +++ b/server/channelserver/repo_mocks_test.go @@ -248,6 +248,9 @@ func (m *mockCharacterRepo) SaveCharacterDataAtomic(_ SaveAtomicParams) error { func (m *mockCharacterRepo) LoadSaveDataWithHash(_ uint32) (uint32, []byte, bool, string, []byte, error) { return m.loadSaveDataID, m.loadSaveDataData, m.loadSaveDataNew, m.loadSaveDataName, m.loadSaveDataHash, m.loadSaveDataErr } +func (m *mockCharacterRepo) LoadBackupsByRecency(_ uint32) ([]SavedataBackup, error) { + return []SavedataBackup{}, nil +} // --- mockGoocooRepo --- @@ -306,8 +309,10 @@ type mockGuildRepo struct { removeErr error createAppErr error getMemberErr error - hasAppResult bool - hasAppErr error + hasAppResult bool + hasAppErr error + hasInviteResult bool + hasInviteErr error listPostsErr error createPostErr error deletePostErr error @@ -315,8 +320,10 @@ type mockGuildRepo struct { // State tracking disbandedID uint32 removedCharID uint32 - acceptedCharID uint32 - rejectedCharID uint32 + acceptedCharID uint32 + rejectedCharID uint32 + acceptInviteCharID uint32 + declineInviteCharID uint32 savedGuild *Guild savedMembers []*GuildMember createdAppArgs []interface{} @@ -569,10 +576,19 @@ func (m *mockGuildRepo) CountGuildKills(_, _ uint32) (int, error) { // No-op stubs for remaining GuildRepo interface methods. 
func (m *mockGuildRepo) ListAll() ([]*Guild, error) { return nil, nil } func (m *mockGuildRepo) Create(_ uint32, _ string) (int32, error) { return 0, nil } -func (m *mockGuildRepo) CreateApplicationWithMail(_, _, _ uint32, _ GuildApplicationType, _, _ uint32, _, _ string) error { - return nil +func (m *mockGuildRepo) CreateInviteWithMail(_, _, _, _, _ uint32, _, _ string) error { return nil } +func (m *mockGuildRepo) HasInvite(_, _ uint32) (bool, error) { + return m.hasInviteResult, m.hasInviteErr +} +func (m *mockGuildRepo) CancelInvite(_ uint32) error { return nil } +func (m *mockGuildRepo) AcceptInvite(_, charID uint32) error { + m.acceptInviteCharID = charID + return m.acceptErr +} +func (m *mockGuildRepo) DeclineInvite(_, charID uint32) error { + m.declineInviteCharID = charID + return m.rejectErr } -func (m *mockGuildRepo) CancelInvitation(_, _ uint32) error { return nil } func (m *mockGuildRepo) ArrangeCharacters(_ []uint32) error { return nil } func (m *mockGuildRepo) GetItemBox(_ uint32) ([]byte, error) { return nil, nil } func (m *mockGuildRepo) SaveItemBox(_ uint32, _ []byte) error { return nil } @@ -595,11 +611,13 @@ func (m *mockGuildRepo) CountNewPosts(_ uint32, _ time.Time) (int, error) func (m *mockGuildRepo) ListAlliances() ([]*GuildAlliance, error) { return nil, nil } func (m *mockGuildRepo) ClearTreasureHunt(_ uint32) error { return nil } func (m *mockGuildRepo) InsertKillLog(_ uint32, _ int, _ uint8, _ time.Time) error { return nil } -func (m *mockGuildRepo) ListInvitedCharacters(_ uint32) ([]*ScoutedCharacter, error) { - return nil, nil -} +func (m *mockGuildRepo) ListInvites(_ uint32) ([]*GuildInvite, error) { return nil, nil } func (m *mockGuildRepo) RolloverDailyRP(_ uint32, _ time.Time) error { return nil } func (m *mockGuildRepo) AddWeeklyBonusUsers(_ uint32, _ uint8) error { return nil } +func (m *mockGuildRepo) FindOrCreateReturnGuild(_ uint8, _ string) (uint32, error) { + return 1, nil +} +func (m *mockGuildRepo) AddMember(_, _ uint32) 
error { return nil } // --- mockUserRepoForItems --- @@ -1147,6 +1165,22 @@ func (m *mockDivaRepo) GetTotalPoints(eventID uint32) (int64, int64, error) { return tq, tb, nil } +func (m *mockDivaRepo) GetBeads() ([]int, error) { return nil, nil } +func (m *mockDivaRepo) AssignBead(_ uint32, _ int, _ time.Time) error { return nil } +func (m *mockDivaRepo) AddBeadPoints(_ uint32, _ int, _ int) error { return nil } +func (m *mockDivaRepo) GetCharacterBeadPoints(_ uint32) (map[int]int, error) { + return map[int]int{}, nil +} +func (m *mockDivaRepo) GetTotalBeadPoints() (int64, error) { return 0, nil } +func (m *mockDivaRepo) GetTopBeadPerDay(_ int) (int, error) { return 0, nil } +func (m *mockDivaRepo) CleanupBeads() error { return nil } +func (m *mockDivaRepo) GetPersonalPrizes() ([]DivaPrize, error) { return nil, nil } +func (m *mockDivaRepo) GetGuildPrizes() ([]DivaPrize, error) { return nil, nil } +func (m *mockDivaRepo) GetCharacterInterceptionPoints(_ uint32) (map[string]int, error) { + return map[string]int{}, nil +} +func (m *mockDivaRepo) AddInterceptionPoints(_ uint32, _ int, _ int) error { return nil } + // --- mockEventRepo --- type mockEventRepo struct { @@ -1232,3 +1266,35 @@ func (m *mockCafeRepo) GetBonusItem(_ uint32) (uint32, uint32, error) { return m.bonusItemType, m.bonusItemQty, m.bonusItemErr } func (m *mockCafeRepo) AcceptBonus(_, _ uint32) error { return nil } + +// --- mockTournamentRepo --- + +type mockTournamentRepo struct { + active *Tournament + activeErr error + cups []TournamentCup + subEvents []TournamentSubEvent + ranks []TournamentRankEntry + registerID uint32 + registerErr error + entry *TournamentEntry + entryErr error +} + +func (m *mockTournamentRepo) GetActive(_ int64) (*Tournament, error) { + return m.active, m.activeErr +} +func (m *mockTournamentRepo) GetCups(_ uint32) ([]TournamentCup, error) { return m.cups, nil } +func (m *mockTournamentRepo) GetSubEvents() ([]TournamentSubEvent, error) { + return m.subEvents, nil +} +func (m 
*mockTournamentRepo) Register(_, _ uint32) (uint32, error) { + return m.registerID, m.registerErr +} +func (m *mockTournamentRepo) GetEntry(_, _ uint32) (*TournamentEntry, error) { + return m.entry, m.entryErr +} +func (m *mockTournamentRepo) SubmitResult(_, _, _, _, _ uint32) error { return nil } +func (m *mockTournamentRepo) GetLeaderboard(_ uint32) ([]TournamentRankEntry, error) { + return m.ranks, nil +} diff --git a/server/channelserver/repo_tournament.go b/server/channelserver/repo_tournament.go new file mode 100644 index 000000000..ee1150c2b --- /dev/null +++ b/server/channelserver/repo_tournament.go @@ -0,0 +1,167 @@ +package channelserver + +import ( + "database/sql" + "fmt" + + "github.com/jmoiron/sqlx" +) + +// TournamentRepository centralizes all database access for tournament tables. +type TournamentRepository struct { + db *sqlx.DB +} + +// NewTournamentRepository creates a new TournamentRepository. +func NewTournamentRepository(db *sqlx.DB) *TournamentRepository { + return &TournamentRepository{db: db} +} + +// GetActive returns the most recently started tournament that is still within its +// reward window (reward_end >= now), or nil if no active tournament exists. +func (r *TournamentRepository) GetActive(now int64) (*Tournament, error) { + var t Tournament + err := r.db.QueryRowx( + `SELECT id, name, start_time, entry_end, ranking_end, reward_end + FROM tournaments + WHERE start_time <= $1 AND reward_end >= $1 + ORDER BY start_time DESC + LIMIT 1`, + now, + ).StructScan(&t) + if err == sql.ErrNoRows { + return nil, nil + } + if err != nil { + return nil, fmt.Errorf("get active tournament: %w", err) + } + return &t, nil +} + +// GetCups returns all cups belonging to the given tournament, ordered by ID. 
+func (r *TournamentRepository) GetCups(tournamentID uint32) ([]TournamentCup, error) { + var cups []TournamentCup + err := r.db.Select(&cups, + `SELECT id, cup_group, cup_type, unk, name, description + FROM tournament_cups + WHERE tournament_id = $1 + ORDER BY id`, + tournamentID, + ) + return cups, err +} + +// GetSubEvents returns all sub-events ordered by cup group and event sub type. +func (r *TournamentRepository) GetSubEvents() ([]TournamentSubEvent, error) { + var events []TournamentSubEvent + err := r.db.Select(&events, + `SELECT id, cup_group, event_sub_type, quest_file_id, name + FROM tournament_sub_events + ORDER BY cup_group, event_sub_type`, + ) + return events, err +} + +// Register registers a character for a tournament. If the character is already +// registered the existing entry ID is returned (ON CONFLICT DO NOTHING, then re-SELECT). +func (r *TournamentRepository) Register(charID, tournamentID uint32) (uint32, error) { + _, err := r.db.Exec( + `INSERT INTO tournament_entries (char_id, tournament_id) + VALUES ($1, $2) + ON CONFLICT (char_id, tournament_id) DO NOTHING`, + charID, tournamentID, + ) + if err != nil { + return 0, fmt.Errorf("insert tournament entry: %w", err) + } + var id uint32 + err = r.db.QueryRow( + `SELECT id FROM tournament_entries WHERE char_id = $1 AND tournament_id = $2`, + charID, tournamentID, + ).Scan(&id) + if err != nil { + return 0, fmt.Errorf("fetch tournament entry id: %w", err) + } + return id, nil +} + +// GetEntry returns the registration record for a character/tournament pair, or nil if not found. 
+func (r *TournamentRepository) GetEntry(charID, tournamentID uint32) (*TournamentEntry, error) { + var e TournamentEntry + err := r.db.QueryRowx( + `SELECT id, char_id, tournament_id + FROM tournament_entries + WHERE char_id = $1 AND tournament_id = $2`, + charID, tournamentID, + ).StructScan(&e) + if err == sql.ErrNoRows { + return nil, nil + } + if err != nil { + return nil, fmt.Errorf("get tournament entry: %w", err) + } + return &e, nil +} + +// SubmitResult records a completed tournament run for a character. +func (r *TournamentRepository) SubmitResult(charID, tournamentID, eventID, questSlot, stageHandle uint32) error { + _, err := r.db.Exec( + `INSERT INTO tournament_results (char_id, tournament_id, event_id, quest_slot, stage_handle) + VALUES ($1, $2, $3, $4, $5)`, + charID, tournamentID, eventID, questSlot, stageHandle, + ) + if err != nil { + return fmt.Errorf("insert tournament result: %w", err) + } + return nil +} + +// GetLeaderboard returns the ranked leaderboard for an event ID. +// Rank is assigned by submission order (first submitted = rank 1). +// Returns at most 100 entries. 
+func (r *TournamentRepository) GetLeaderboard(eventID uint32) ([]TournamentRankEntry, error) { + type row struct { + CharID uint32 `db:"char_id"` + Rank int64 `db:"rank"` + Grade int `db:"grade"` + HR int `db:"hr"` + GR int `db:"gr"` + CharName string `db:"char_name"` + GuildName string `db:"guild_name"` + } + var rows []row + err := r.db.Select(&rows, ` + SELECT + r.char_id, + ROW_NUMBER() OVER (ORDER BY r.submitted_at ASC)::int AS rank, + c.gr::int AS grade, + c.hr::int AS hr, + c.gr::int AS gr, + c.name AS char_name, + COALESCE(g.name, '') AS guild_name + FROM tournament_results r + JOIN characters c ON c.id = r.char_id + LEFT JOIN guild_characters gc ON gc.character_id = r.char_id + LEFT JOIN guilds g ON g.id = gc.guild_id + WHERE r.event_id = $1 + ORDER BY r.submitted_at ASC + LIMIT 100`, + eventID, + ) + if err != nil { + return nil, fmt.Errorf("get tournament leaderboard: %w", err) + } + entries := make([]TournamentRankEntry, len(rows)) + for i, row := range rows { + entries[i] = TournamentRankEntry{ + CharID: row.CharID, + Rank: uint32(row.Rank), + Grade: uint16(row.Grade), + HR: uint16(row.HR), + GR: uint16(row.GR), + CharName: row.CharName, + GuildName: row.GuildName, + } + } + return entries, nil +} diff --git a/server/channelserver/scenario_json.go b/server/channelserver/scenario_json.go new file mode 100644 index 000000000..b3116fafe --- /dev/null +++ b/server/channelserver/scenario_json.go @@ -0,0 +1,472 @@ +package channelserver + +import ( + "bytes" + "encoding/base64" + "encoding/binary" + "encoding/json" + "fmt" + + "golang.org/x/text/encoding/japanese" + "golang.org/x/text/transform" +) + +// ── Constants ───────────────────────────────────────────────────────────────── + +// jkrMagic is the little-endian magic number at the start of a JKR-compressed +// blob: bytes 0x4A 0x4B 0x52 0x1A ('J','K','R',0x1A). 
+const jkrMagic uint32 = 0x1A524B4A + +// scenarioChunkSizeLimit is the maximum byte length the client accepts for any +// single chunk (chunk0, chunk1, or chunk2). Confirmed from the client's response +// handler (FUN_11525c60 in mhfo-hd.dll): chunks larger than this are silently +// discarded, so the server must never serve a chunk exceeding this limit. +const scenarioChunkSizeLimit = 0x8000 + +// ── JSON schema types ──────────────────────────────────────────────────────── + +// ScenarioJSON is the open, human-editable representation of a scenario .bin file. +// Strings are stored as UTF-8; the compiler converts to/from Shift-JIS. +// +// Container layout (big-endian sizes): +// +// @0x00: u32 BE chunk0_size +// @0x04: u32 BE chunk1_size +// [chunk0_data] +// [chunk1_data] +// u32 BE chunk2_size (only present when non-zero) +// [chunk2_data] +// +// Each chunk must not exceed scenarioChunkSizeLimit bytes. +type ScenarioJSON struct { + // Chunk0 holds quest name/description data (sub-header or inline format). + Chunk0 *ScenarioChunk0JSON `json:"chunk0,omitempty"` + // Chunk1 holds NPC dialog data (sub-header format or raw JKR blob). + Chunk1 *ScenarioChunk1JSON `json:"chunk1,omitempty"` + // Chunk2 holds JKR-compressed menu/title data. + Chunk2 *ScenarioRawChunkJSON `json:"chunk2,omitempty"` +} + +// ScenarioChunk0JSON represents chunk0, which is either sub-header or inline format. +// Exactly one of Subheader/Inline is non-nil. +type ScenarioChunk0JSON struct { + Subheader *ScenarioSubheaderJSON `json:"subheader,omitempty"` + Inline []ScenarioInlineEntry `json:"inline,omitempty"` +} + +// ScenarioChunk1JSON represents chunk1, which is either sub-header or raw JKR. +// Exactly one of Subheader/JKR is non-nil. +type ScenarioChunk1JSON struct { + Subheader *ScenarioSubheaderJSON `json:"subheader,omitempty"` + JKR *ScenarioRawChunkJSON `json:"jkr,omitempty"` +} + +// ScenarioSubheaderJSON represents a chunk in sub-header format. 
+// +// Sub-header binary layout (8 bytes, little-endian where applicable): +// +// @0: u8 Type (usually 0x01; the client treats this as a compound-container tag) +// @1: u8 0x00 (pad; must be 0x00 — used by the server to detect this format vs inline) +// @2: u16 Size (total chunk size including this header, LE) +// @4: u8 Count (number of string entries) +// @5: u8 Unknown1 (purpose unconfirmed; preserved round-trip) +// @6: u8 MetaSize (byte length of the metadata block; 0x14 for chunk0, 0x2C for chunk1) +// @7: u8 Unknown2 (purpose unconfirmed; preserved round-trip) +// [MetaSize bytes: opaque metadata — see docs/scenario-format.md for field breakdown] +// [null-terminated Shift-JIS strings, one per entry] +// [0xFF end-of-strings sentinel] +// +// Chunk0 metadata (MetaSize=0x14, 10×u16 LE): +// +// m[0]=CategoryID m[1]=MainID m[2]=0 m[3]=0 m[4]=0 +// m[5]=str0_len m[6]=SceneRef (MainID when cat=0, 0xFFFF otherwise) +// m[7..9]: not read by the client parser (FUN_1080d310 in mhfo-hd.dll) +// +// Chunk1 metadata (MetaSize=0x2C, 22×u16 LE): +// +// m[8..17] are interpreted as signed offsets by the client (FUN_1080d3b0): +// negative → (~value) + dialog_base (into post-0xFF dialog script) +// non-negative → value + strings_base (into strings section) +// m[18..19] are read as individual bytes, not u16 pairs. +type ScenarioSubheaderJSON struct { + // Type is the chunk type byte (almost always 0x01). + Type uint8 `json:"type"` + // Unknown1 is the byte at sub-header offset 5. Purpose not confirmed; + // always 0x00 in observed files. + Unknown1 uint8 `json:"unknown1"` + // Unknown2 is the byte at sub-header offset 7. Purpose not confirmed; + // always 0x00 in observed files. + Unknown2 uint8 `json:"unknown2"` + // Metadata is the opaque metadata block, base64-encoded. + // It is preserved verbatim so the client receives correct values for all + // fields, including those the server does not need to interpret. 
+ // For chunk0, the client only reads m[0]–m[6]; m[7]–m[9] are ignored. + Metadata string `json:"metadata"` + // Strings contains the human-editable text (UTF-8). + // The compiler converts each string to null-terminated Shift-JIS on the wire. + Strings []string `json:"strings"` +} + +// ScenarioInlineEntry is one entry in an inline-format chunk0. +// Format on wire: {u8 index}{Shift-JIS string}{0x00}. +type ScenarioInlineEntry struct { + Index uint8 `json:"index"` + Text string `json:"text"` +} + +// ScenarioRawChunkJSON stores a JKR-compressed chunk as its raw compressed bytes. +// The data is served to the client as-is; the format of the decompressed content +// is not yet fully documented. +type ScenarioRawChunkJSON struct { + // Data is the raw JKR-compressed bytes, base64-encoded. + Data string `json:"data"` +} + +// ── Parse: binary → JSON ───────────────────────────────────────────────────── + +// ParseScenarioBinary reads a scenario .bin file and returns a ScenarioJSON +// suitable for editing and re-compilation with CompileScenarioJSON. 
+func ParseScenarioBinary(data []byte) (*ScenarioJSON, error) { + if len(data) < 8 { + return nil, fmt.Errorf("scenario data too short: %d bytes", len(data)) + } + + c0Size := int(binary.BigEndian.Uint32(data[0:4])) + c1Size := int(binary.BigEndian.Uint32(data[4:8])) + + result := &ScenarioJSON{} + + // Chunk0 + c0Off := 8 + if c0Size > 0 { + if c0Off+c0Size > len(data) { + return nil, fmt.Errorf("chunk0 size %d overruns data at offset %d", c0Size, c0Off) + } + chunk0, err := parseScenarioChunk0(data[c0Off : c0Off+c0Size]) + if err != nil { + return nil, fmt.Errorf("chunk0: %w", err) + } + result.Chunk0 = chunk0 + } + + // Chunk1 + c1Off := c0Off + c0Size + if c1Size > 0 { + if c1Off+c1Size > len(data) { + return nil, fmt.Errorf("chunk1 size %d overruns data at offset %d", c1Size, c1Off) + } + chunk1, err := parseScenarioChunk1(data[c1Off : c1Off+c1Size]) + if err != nil { + return nil, fmt.Errorf("chunk1: %w", err) + } + result.Chunk1 = chunk1 + } + + // Chunk2 (preceded by its own 4-byte size field) + c2HdrOff := c1Off + c1Size + if c2HdrOff+4 <= len(data) { + c2Size := int(binary.BigEndian.Uint32(data[c2HdrOff : c2HdrOff+4])) + if c2Size > 0 { + c2DataOff := c2HdrOff + 4 + if c2DataOff+c2Size > len(data) { + return nil, fmt.Errorf("chunk2 size %d overruns data at offset %d", c2Size, c2DataOff) + } + result.Chunk2 = &ScenarioRawChunkJSON{ + Data: base64.StdEncoding.EncodeToString(data[c2DataOff : c2DataOff+c2Size]), + } + } + } + + return result, nil +} + +// parseScenarioChunk0 auto-detects sub-header vs inline format. +// The second byte being 0x00 is the pad byte in sub-headers; non-zero means inline. 
+func parseScenarioChunk0(data []byte) (*ScenarioChunk0JSON, error) { + if len(data) < 2 { + return &ScenarioChunk0JSON{}, nil + } + if data[1] == 0x00 { + sh, err := parseScenarioSubheader(data) + if err != nil { + return nil, err + } + return &ScenarioChunk0JSON{Subheader: sh}, nil + } + entries, err := parseScenarioInline(data) + if err != nil { + return nil, err + } + return &ScenarioChunk0JSON{Inline: entries}, nil +} + +// parseScenarioChunk1 parses chunk1 as JKR or sub-header depending on magic bytes. +// JKR-compressed chunks start with the magic 'J','K','R',0x1A (LE u32 = jkrMagic). +func parseScenarioChunk1(data []byte) (*ScenarioChunk1JSON, error) { + if len(data) >= 4 && binary.LittleEndian.Uint32(data[0:4]) == jkrMagic { + return &ScenarioChunk1JSON{ + JKR: &ScenarioRawChunkJSON{ + Data: base64.StdEncoding.EncodeToString(data), + }, + }, nil + } + sh, err := parseScenarioSubheader(data) + if err != nil { + return nil, err + } + return &ScenarioChunk1JSON{Subheader: sh}, nil +} + +// parseScenarioSubheader parses the 8-byte sub-header + metadata + strings. 
+func parseScenarioSubheader(data []byte) (*ScenarioSubheaderJSON, error) { + if len(data) < 8 { + return nil, fmt.Errorf("sub-header chunk too short: %d bytes", len(data)) + } + + // 8-byte sub-header fields: + chunkType := data[0] // @0: chunk type (0x01 = compound container) + // data[1] // @1: pad 0x00 (format detector; not stored) + // data[2:4] // @2: u16 LE total size (recomputed on compile) + entryCount := int(data[4]) // @4: number of string entries + unknown1 := data[5] // @5: purpose unknown; always 0x00 in observed files + metaSize := int(data[6]) // @6: byte length of metadata block (0x14=C0, 0x2C=C1) + unknown2 := data[7] // @7: purpose unknown; always 0x00 in observed files + + metaEnd := 8 + metaSize + if metaEnd > len(data) { + return nil, fmt.Errorf("metadata block (size %d) overruns chunk (len %d)", metaSize, len(data)) + } + + metadata := base64.StdEncoding.EncodeToString(data[8:metaEnd]) + + strings, err := scenarioReadStrings(data, metaEnd, entryCount) + if err != nil { + return nil, err + } + + return &ScenarioSubheaderJSON{ + Type: chunkType, + Unknown1: unknown1, + Unknown2: unknown2, + Metadata: metadata, + Strings: strings, + }, nil +} + +// parseScenarioInline parses chunk0 inline format: {u8 index}{Shift-JIS string}{0x00}. 
+func parseScenarioInline(data []byte) ([]ScenarioInlineEntry, error) { + var result []ScenarioInlineEntry + pos := 0 + for pos < len(data) { + if data[pos] == 0x00 { + pos++ + continue + } + idx := data[pos] + pos++ + if pos >= len(data) { + break + } + end := pos + for end < len(data) && data[end] != 0x00 { + end++ + } + if end > pos { + text, err := scenarioDecodeShiftJIS(data[pos:end]) + if err != nil { + return nil, fmt.Errorf("inline entry at 0x%x: %w", pos, err) + } + result = append(result, ScenarioInlineEntry{Index: idx, Text: text}) + } + pos = end + 1 // skip null terminator + } + return result, nil +} + +// scenarioReadStrings scans for null-terminated Shift-JIS strings starting at +// offset start, reading at most maxCount strings (0 = unlimited). Stops on 0xFF. +func scenarioReadStrings(data []byte, start, maxCount int) ([]string, error) { + var result []string + pos := start + for pos < len(data) { + if maxCount > 0 && len(result) >= maxCount { + break + } + if data[pos] == 0x00 { + pos++ + continue + } + if data[pos] == 0xFF { + break + } + end := pos + for end < len(data) && data[end] != 0x00 { + end++ + } + if end > pos { + text, err := scenarioDecodeShiftJIS(data[pos:end]) + if err != nil { + return nil, fmt.Errorf("string at 0x%x: %w", pos, err) + } + result = append(result, text) + } + pos = end + 1 + } + return result, nil +} + +// ── Compile: JSON → binary ─────────────────────────────────────────────────── + +// CompileScenarioJSON parses jsonData and compiles it to MHF scenario binary format. 
+func CompileScenarioJSON(jsonData []byte) ([]byte, error) { + var s ScenarioJSON + if err := json.Unmarshal(jsonData, &s); err != nil { + return nil, fmt.Errorf("unmarshal scenario JSON: %w", err) + } + return compileScenario(&s) +} + +func compileScenario(s *ScenarioJSON) ([]byte, error) { + var chunk0, chunk1, chunk2 []byte + var err error + + if s.Chunk0 != nil { + chunk0, err = compileScenarioChunk0(s.Chunk0) + if err != nil { + return nil, fmt.Errorf("chunk0: %w", err) + } + } + if s.Chunk1 != nil { + chunk1, err = compileScenarioChunk1(s.Chunk1) + if err != nil { + return nil, fmt.Errorf("chunk1: %w", err) + } + } + if s.Chunk2 != nil { + chunk2, err = compileScenarioRawChunk(s.Chunk2) + if err != nil { + return nil, fmt.Errorf("chunk2: %w", err) + } + } + + for i, chunk := range [][]byte{chunk0, chunk1, chunk2} { + if len(chunk) > scenarioChunkSizeLimit { + return nil, fmt.Errorf("chunk%d size %d exceeds client limit of %d bytes", i, len(chunk), scenarioChunkSizeLimit) + } + } + + var buf bytes.Buffer + // Container header: c0_size, c1_size (big-endian u32) + _ = binary.Write(&buf, binary.BigEndian, uint32(len(chunk0))) + _ = binary.Write(&buf, binary.BigEndian, uint32(len(chunk1))) + buf.Write(chunk0) + buf.Write(chunk1) + // Chunk2 preceded by its own size field + if len(chunk2) > 0 { + _ = binary.Write(&buf, binary.BigEndian, uint32(len(chunk2))) + buf.Write(chunk2) + } + + return buf.Bytes(), nil +} + +func compileScenarioChunk0(c *ScenarioChunk0JSON) ([]byte, error) { + if c.Subheader != nil { + return compileScenarioSubheader(c.Subheader) + } + return compileScenarioInline(c.Inline) +} + +func compileScenarioChunk1(c *ScenarioChunk1JSON) ([]byte, error) { + if c.JKR != nil { + return compileScenarioRawChunk(c.JKR) + } + if c.Subheader != nil { + return compileScenarioSubheader(c.Subheader) + } + return nil, nil +} + +// compileScenarioSubheader builds the binary sub-header chunk: +// [8-byte header][metadata][null-terminated Shift-JIS strings][0xFF] 
+func compileScenarioSubheader(sh *ScenarioSubheaderJSON) ([]byte, error) { + meta, err := base64.StdEncoding.DecodeString(sh.Metadata) + if err != nil { + return nil, fmt.Errorf("decode metadata base64: %w", err) + } + + var strBuf bytes.Buffer + for _, s := range sh.Strings { + sjis, err := scenarioEncodeShiftJIS(s) + if err != nil { + return nil, err + } + strBuf.Write(sjis) // sjis already has null terminator from helper + } + strBuf.WriteByte(0xFF) // end-of-strings sentinel + + // Total size = 8-byte header + metadata + strings + totalSize := 8 + len(meta) + strBuf.Len() + + var buf bytes.Buffer + buf.WriteByte(sh.Type) + buf.WriteByte(0x00) // pad (format detector) + // u16 LE total size + buf.WriteByte(byte(totalSize)) + buf.WriteByte(byte(totalSize >> 8)) + buf.WriteByte(byte(len(sh.Strings))) // entry count + buf.WriteByte(sh.Unknown1) + buf.WriteByte(byte(len(meta))) // metadata total size + buf.WriteByte(sh.Unknown2) + buf.Write(meta) + buf.Write(strBuf.Bytes()) + + return buf.Bytes(), nil +} + +// compileScenarioInline builds the inline-format chunk0 bytes. +func compileScenarioInline(entries []ScenarioInlineEntry) ([]byte, error) { + var buf bytes.Buffer + for _, e := range entries { + buf.WriteByte(e.Index) + sjis, err := scenarioEncodeShiftJIS(e.Text) + if err != nil { + return nil, err + } + buf.Write(sjis) // includes null terminator + } + return buf.Bytes(), nil +} + +// compileScenarioRawChunk decodes the base64 raw chunk bytes. +// These are served to the client as-is (no re-compression). +func compileScenarioRawChunk(rc *ScenarioRawChunkJSON) ([]byte, error) { + data, err := base64.StdEncoding.DecodeString(rc.Data) + if err != nil { + return nil, fmt.Errorf("decode raw chunk base64: %w", err) + } + return data, nil +} + +// ── String helpers ─────────────────────────────────────────────────────────── + +// scenarioDecodeShiftJIS converts a raw Shift-JIS byte slice to UTF-8 string. 
+func scenarioDecodeShiftJIS(b []byte) (string, error) { + dec := japanese.ShiftJIS.NewDecoder() + out, _, err := transform.Bytes(dec, b) + if err != nil { + return "", fmt.Errorf("shift-jis decode: %w", err) + } + return string(out), nil +} + +// scenarioEncodeShiftJIS converts a UTF-8 string to a null-terminated Shift-JIS byte slice. +func scenarioEncodeShiftJIS(s string) ([]byte, error) { + enc := japanese.ShiftJIS.NewEncoder() + out, _, err := transform.Bytes(enc, []byte(s)) + if err != nil { + return nil, fmt.Errorf("shift-jis encode %q: %w", s, err) + } + return append(out, 0x00), nil +} + diff --git a/server/channelserver/scenario_json_test.go b/server/channelserver/scenario_json_test.go new file mode 100644 index 000000000..134a10045 --- /dev/null +++ b/server/channelserver/scenario_json_test.go @@ -0,0 +1,472 @@ +package channelserver + +import ( + "bytes" + "encoding/binary" + "encoding/json" + "os" + "testing" +) + +// ── test helpers ───────────────────────────────────────────────────────────── + +// buildTestSubheaderChunk constructs a minimal sub-header format chunk. +// metadata is zero-filled to metaSize bytes. 
+func buildTestSubheaderChunk(t *testing.T, strings []string, metaSize int) []byte { + t.Helper() + var strBuf bytes.Buffer + for _, s := range strings { + sjis, err := scenarioEncodeShiftJIS(s) + if err != nil { + t.Fatalf("encode %q: %v", s, err) + } + strBuf.Write(sjis) + } + strBuf.WriteByte(0xFF) // end sentinel + + totalSize := 8 + metaSize + strBuf.Len() + meta := make([]byte, metaSize) // zero metadata + + var buf bytes.Buffer + buf.WriteByte(0x01) // type + buf.WriteByte(0x00) // pad + buf.WriteByte(byte(totalSize)) // size lo + buf.WriteByte(byte(totalSize >> 8)) // size hi + buf.WriteByte(byte(len(strings))) // entry count + buf.WriteByte(0x00) // unknown1 + buf.WriteByte(byte(metaSize)) // metadata total + buf.WriteByte(0x00) // unknown2 + buf.Write(meta) + buf.Write(strBuf.Bytes()) + return buf.Bytes() +} + +// buildTestInlineChunk constructs an inline-format chunk0. +func buildTestInlineChunk(t *testing.T, strings []string) []byte { + t.Helper() + var buf bytes.Buffer + for i, s := range strings { + buf.WriteByte(byte(i + 1)) // 1-based index + sjis, err := scenarioEncodeShiftJIS(s) + if err != nil { + t.Fatalf("encode %q: %v", s, err) + } + buf.Write(sjis) + } + return buf.Bytes() +} + +// buildTestScenarioBinary assembles a complete scenario container for testing. +func buildTestScenarioBinary(t *testing.T, c0, c1 []byte) []byte { + t.Helper() + var buf bytes.Buffer + if err := binary.Write(&buf, binary.BigEndian, uint32(len(c0))); err != nil { + t.Fatal(err) + } + if err := binary.Write(&buf, binary.BigEndian, uint32(len(c1))); err != nil { + t.Fatal(err) + } + buf.Write(c0) + buf.Write(c1) + // c2 size = 0 + if err := binary.Write(&buf, binary.BigEndian, uint32(0)); err != nil { + t.Fatal(err) + } + return buf.Bytes() +} + +// extractStringsFromScenario parses a binary and returns all strings it contains. 
+func extractStringsFromScenario(t *testing.T, data []byte) []string { + t.Helper() + s, err := ParseScenarioBinary(data) + if err != nil { + t.Fatalf("ParseScenarioBinary: %v", err) + } + var result []string + if s.Chunk0 != nil { + if s.Chunk0.Subheader != nil { + result = append(result, s.Chunk0.Subheader.Strings...) + } + for _, e := range s.Chunk0.Inline { + result = append(result, e.Text) + } + } + if s.Chunk1 != nil && s.Chunk1.Subheader != nil { + result = append(result, s.Chunk1.Subheader.Strings...) + } + return result +} + +// ── parse tests ────────────────────────────────────────────────────────────── + +func TestParseScenarioBinary_TooShort(t *testing.T) { + _, err := ParseScenarioBinary([]byte{0x00, 0x01}) + if err == nil { + t.Error("expected error for short input") + } +} + +func TestParseScenarioBinary_EmptyChunks(t *testing.T) { + data := buildTestScenarioBinary(t, nil, nil) + s, err := ParseScenarioBinary(data) + if err != nil { + t.Fatalf("unexpected error: %v", err) + } + if s.Chunk0 != nil || s.Chunk1 != nil || s.Chunk2 != nil { + t.Error("expected all chunks nil for empty scenario") + } +} + +func TestParseScenarioBinary_SubheaderChunk0(t *testing.T) { + c0 := buildTestSubheaderChunk(t, []string{"Quest A", "Quest B"}, 4) + data := buildTestScenarioBinary(t, c0, nil) + + s, err := ParseScenarioBinary(data) + if err != nil { + t.Fatalf("unexpected error: %v", err) + } + if s.Chunk0 == nil || s.Chunk0.Subheader == nil { + t.Fatal("expected chunk0 subheader") + } + got := s.Chunk0.Subheader.Strings + want := []string{"Quest A", "Quest B"} + if len(got) != len(want) { + t.Fatalf("string count: got %d, want %d", len(got), len(want)) + } + for i := range want { + if got[i] != want[i] { + t.Errorf("[%d]: got %q, want %q", i, got[i], want[i]) + } + } +} + +func TestParseScenarioBinary_InlineChunk0(t *testing.T) { + c0 := buildTestInlineChunk(t, []string{"Item1", "Item2"}) + data := buildTestScenarioBinary(t, c0, nil) + + s, err := 
ParseScenarioBinary(data) + if err != nil { + t.Fatalf("unexpected error: %v", err) + } + if s.Chunk0 == nil || len(s.Chunk0.Inline) == 0 { + t.Fatal("expected chunk0 inline entries") + } + want := []string{"Item1", "Item2"} + for i, e := range s.Chunk0.Inline { + if e.Text != want[i] { + t.Errorf("[%d]: got %q, want %q", i, e.Text, want[i]) + } + } +} + +func TestParseScenarioBinary_BothChunks(t *testing.T) { + c0 := buildTestSubheaderChunk(t, []string{"Quest"}, 4) + c1 := buildTestSubheaderChunk(t, []string{"NPC1", "NPC2"}, 8) + data := buildTestScenarioBinary(t, c0, c1) + + strings := extractStringsFromScenario(t, data) + want := []string{"Quest", "NPC1", "NPC2"} + if len(strings) != len(want) { + t.Fatalf("string count: got %d, want %d", len(strings), len(want)) + } + for i := range want { + if strings[i] != want[i] { + t.Errorf("[%d]: got %q, want %q", i, strings[i], want[i]) + } + } +} + +func TestParseScenarioBinary_Japanese(t *testing.T) { + c0 := buildTestSubheaderChunk(t, []string{"テスト", "日本語"}, 4) + data := buildTestScenarioBinary(t, c0, nil) + + s, err := ParseScenarioBinary(data) + if err != nil { + t.Fatalf("unexpected error: %v", err) + } + want := []string{"テスト", "日本語"} + got := s.Chunk0.Subheader.Strings + for i := range want { + if got[i] != want[i] { + t.Errorf("[%d]: got %q, want %q", i, got[i], want[i]) + } + } +} + +// ── compile tests ───────────────────────────────────────────────────────────── + +func TestCompileScenarioJSON_Subheader(t *testing.T) { + input := &ScenarioJSON{ + Chunk0: &ScenarioChunk0JSON{ + Subheader: &ScenarioSubheaderJSON{ + Type: 0x01, + Unknown1: 0x00, + Unknown2: 0x00, + Metadata: "AAAABBBB", // base64 of 6 arbitrary metadata bytes + Strings: []string{"Hello", "World"}, + }, + }, + } + + jsonData, err := json.Marshal(input) + if err != nil { + t.Fatalf("marshal: %v", err) + } + + compiled, err := CompileScenarioJSON(jsonData) + if err != nil { + t.Fatalf("CompileScenarioJSON: %v", err) + } + + // Parse the compiled output and
verify strings survive + result, err := ParseScenarioBinary(compiled) + if err != nil { + t.Fatalf("ParseScenarioBinary on compiled output: %v", err) + } + if result.Chunk0 == nil || result.Chunk0.Subheader == nil { + t.Fatal("expected chunk0 subheader in compiled output") + } + want := []string{"Hello", "World"} + got := result.Chunk0.Subheader.Strings + for i := range want { + if i >= len(got) || got[i] != want[i] { + t.Errorf("[%d]: got %q, want %q", i, got[i], want[i]) + } + } +} + +func TestCompileScenarioJSON_Inline(t *testing.T) { + input := &ScenarioJSON{ + Chunk0: &ScenarioChunk0JSON{ + Inline: []ScenarioInlineEntry{ + {Index: 1, Text: "Sword"}, + {Index: 2, Text: "Shield"}, + }, + }, + } + jsonData, _ := json.Marshal(input) + compiled, err := CompileScenarioJSON(jsonData) + if err != nil { + t.Fatalf("CompileScenarioJSON: %v", err) + } + + result, err := ParseScenarioBinary(compiled) + if err != nil { + t.Fatalf("ParseScenarioBinary: %v", err) + } + if result.Chunk0 == nil || len(result.Chunk0.Inline) != 2 { + t.Fatal("expected 2 inline entries") + } + if result.Chunk0.Inline[0].Text != "Sword" { + t.Errorf("got %q, want Sword", result.Chunk0.Inline[0].Text) + } + if result.Chunk0.Inline[1].Text != "Shield" { + t.Errorf("got %q, want Shield", result.Chunk0.Inline[1].Text) + } +} + +// ── round-trip tests ───────────────────────────────────────────────────────── + +func TestScenarioRoundTrip_Subheader(t *testing.T) { + original := buildTestScenarioBinary(t, + buildTestSubheaderChunk(t, []string{"QuestName", "Description"}, 0x14), + buildTestSubheaderChunk(t, []string{"Dialog1", "Dialog2", "Dialog3"}, 0x2C), + ) + + s, err := ParseScenarioBinary(original) + if err != nil { + t.Fatalf("parse: %v", err) + } + + jsonData, err := json.Marshal(s) + if err != nil { + t.Fatalf("marshal: %v", err) + } + + compiled, err := CompileScenarioJSON(jsonData) + if err != nil { + t.Fatalf("compile: %v", err) + } + + // Re-parse compiled and compare strings + wantStrings := 
[]string{"QuestName", "Description", "Dialog1", "Dialog2", "Dialog3"} + gotStrings := extractStringsFromScenario(t, compiled) + if len(gotStrings) != len(wantStrings) { + t.Fatalf("string count: got %d, want %d", len(gotStrings), len(wantStrings)) + } + for i := range wantStrings { + if gotStrings[i] != wantStrings[i] { + t.Errorf("[%d]: got %q, want %q", i, gotStrings[i], wantStrings[i]) + } + } +} + +func TestScenarioRoundTrip_Inline(t *testing.T) { + original := buildTestScenarioBinary(t, + buildTestInlineChunk(t, []string{"EpisodeA", "EpisodeB"}), + nil, + ) + + s, _ := ParseScenarioBinary(original) + jsonData, _ := json.Marshal(s) + compiled, err := CompileScenarioJSON(jsonData) + if err != nil { + t.Fatalf("compile: %v", err) + } + + got := extractStringsFromScenario(t, compiled) + want := []string{"EpisodeA", "EpisodeB"} + for i := range want { + if i >= len(got) || got[i] != want[i] { + t.Errorf("[%d]: got %q, want %q", i, got[i], want[i]) + } + } +} + +func TestScenarioRoundTrip_MetadataPreserved(t *testing.T) { + // The metadata block must survive parse → JSON → compile unchanged. + metaBytes := []byte{0x01, 0x02, 0x03, 0x04, 0xFF, 0xFE, 0xFD, 0xFC} + // Build a chunk with custom metadata and unknown field values by hand. 
+ var buf bytes.Buffer + str := []byte("A\x00\xFF") + totalSize := 8 + len(metaBytes) + len(str) + buf.WriteByte(0x01) + buf.WriteByte(0x00) + buf.WriteByte(byte(totalSize)) + buf.WriteByte(byte(totalSize >> 8)) + buf.WriteByte(0x01) // entry count + buf.WriteByte(0xAA) // unknown1 + buf.WriteByte(byte(len(metaBytes))) + buf.WriteByte(0xBB) // unknown2 + buf.Write(metaBytes) + buf.Write(str) + c0 := buf.Bytes() + + data := buildTestScenarioBinary(t, c0, nil) + s, err := ParseScenarioBinary(data) + if err != nil { + t.Fatalf("parse: %v", err) + } + sh := s.Chunk0.Subheader + if sh.Type != 0x01 || sh.Unknown1 != 0xAA || sh.Unknown2 != 0xBB { + t.Errorf("header fields: type=%02X unk1=%02X unk2=%02X", sh.Type, sh.Unknown1, sh.Unknown2) + } + + // Compile and parse again — metadata must survive + jsonData, _ := json.Marshal(s) + compiled, err := CompileScenarioJSON(jsonData) + if err != nil { + t.Fatalf("compile: %v", err) + } + s2, err := ParseScenarioBinary(compiled) + if err != nil { + t.Fatalf("re-parse: %v", err) + } + sh2 := s2.Chunk0.Subheader + if sh2.Metadata != sh.Metadata { + t.Errorf("metadata changed:\n before: %s\n after: %s", sh.Metadata, sh2.Metadata) + } + if sh2.Unknown1 != sh.Unknown1 || sh2.Unknown2 != sh.Unknown2 { + t.Errorf("unknown fields changed: unk1 %02X→%02X unk2 %02X→%02X", + sh.Unknown1, sh2.Unknown1, sh.Unknown2, sh2.Unknown2) + } +} + +// ── real-file round-trip tests ──────────────────────────────────────────────── + +// scenarioBinPath is the relative path from the package to the scenario files. +// These tests are skipped if the directory does not exist (CI without game data). 
+const scenarioBinPath = "../../bin/scenarios" + +func TestScenarioRoundTrip_RealFiles(t *testing.T) { + samples := []struct { + name string + wantC0 bool // expect chunk0 subheader + wantC1 bool // expect chunk1 (subheader or JKR) + }{ + // cat=0 basic quest scenarios (chunk0 subheader, no chunk1) + {"0_0_0_0_S0_T101_C0", true, false}, + {"0_0_0_0_S1_T101_C0", true, false}, + {"0_0_0_0_S5_T101_C0", true, false}, + // cat=1 GR scenarios (chunk0 subheader, T101 has no chunk1) + {"1_0_0_0_S0_T101_C0", true, false}, + {"1_0_0_0_S1_T101_C0", true, false}, + // cat=3 item exchange (chunk0 subheader, chunk1 subheader with extra data) + {"3_0_0_0_S0_T103_C0", true, true}, + // multi-chapter file with chunk1 subheader + {"0_0_0_0_S0_T103_C0", true, true}, + } + + for _, tc := range samples { + tc := tc + t.Run(tc.name, func(t *testing.T) { + path := scenarioBinPath + "/" + tc.name + ".bin" + original, err := os.ReadFile(path) + if err != nil { + t.Skipf("scenario file not found (game data not present): %v", err) + } + + // Parse binary → JSON schema + parsed, err := ParseScenarioBinary(original) + if err != nil { + t.Fatalf("ParseScenarioBinary: %v", err) + } + + // Verify expected chunk presence + if tc.wantC0 && (parsed.Chunk0 == nil || parsed.Chunk0.Subheader == nil) { + t.Error("expected chunk0 subheader") + } + if tc.wantC1 && parsed.Chunk1 == nil { + t.Error("expected chunk1") + } + + // Marshal to JSON + jsonData, err := json.Marshal(parsed) + if err != nil { + t.Fatalf("json.Marshal: %v", err) + } + + // Compile JSON → binary + compiled, err := CompileScenarioJSON(jsonData) + if err != nil { + t.Fatalf("CompileScenarioJSON: %v", err) + } + + // Re-parse compiled output + result, err := ParseScenarioBinary(compiled) + if err != nil { + t.Fatalf("ParseScenarioBinary on compiled output: %v", err) + } + + // Verify strings survive round-trip unchanged + origStrings := extractStringsFromScenario(t, original) + gotStrings := extractStringsFromScenario(t, compiled) + if 
len(gotStrings) != len(origStrings) { + t.Fatalf("string count changed: %d → %d", len(origStrings), len(gotStrings)) + } + for i := range origStrings { + if gotStrings[i] != origStrings[i] { + t.Errorf("[%d]: %q → %q", i, origStrings[i], gotStrings[i]) + } + } + + // Verify metadata is preserved byte-for-byte + if parsed.Chunk0 != nil && parsed.Chunk0.Subheader != nil { + if result.Chunk0 == nil || result.Chunk0.Subheader == nil { + t.Fatal("chunk0 subheader lost in round-trip") + } + if result.Chunk0.Subheader.Metadata != parsed.Chunk0.Subheader.Metadata { + t.Errorf("chunk0 metadata changed after round-trip") + } + } + if parsed.Chunk1 != nil && parsed.Chunk1.Subheader != nil { + if result.Chunk1 == nil || result.Chunk1.Subheader == nil { + t.Fatal("chunk1 subheader lost in round-trip") + } + if result.Chunk1.Subheader.Metadata != parsed.Chunk1.Subheader.Metadata { + t.Errorf("chunk1 metadata changed after round-trip") + } + } + }) + } +} diff --git a/server/channelserver/svc_guild.go b/server/channelserver/svc_guild.go index 3ea5268ad..fd45c33dc 100644 --- a/server/channelserver/svc_guild.go +++ b/server/channelserver/svc_guild.go @@ -280,16 +280,16 @@ func (svc *GuildService) PostScout(actorCharID, targetCharID uint32, strings Sco return fmt.Errorf("guild lookup: %w", err) } - hasApp, err := svc.guildRepo.HasApplication(guild.ID, targetCharID) + hasInvite, err := svc.guildRepo.HasInvite(guild.ID, targetCharID) if err != nil { - return fmt.Errorf("check application: %w", err) + return fmt.Errorf("check invite: %w", err) } - if hasApp { + if hasInvite { return ErrAlreadyInvited } - err = svc.guildRepo.CreateApplicationWithMail( - guild.ID, targetCharID, actorCharID, GuildApplicationTypeInvited, + err = svc.guildRepo.CreateInviteWithMail( + guild.ID, targetCharID, actorCharID, actorCharID, targetCharID, strings.Title, fmt.Sprintf(strings.Body, guild.Name)) @@ -309,8 +309,8 @@ func (svc *GuildService) AnswerScout(charID, leaderID uint32, accept bool, strin return 
nil, fmt.Errorf("guild lookup for leader %d: %w", leaderID, err) } - app, err := svc.guildRepo.GetApplication(guild.ID, charID, GuildApplicationTypeInvited) - if app == nil || err != nil { + hasInvite, err := svc.guildRepo.HasInvite(guild.ID, charID) + if err != nil || !hasInvite { return &AnswerScoutResult{ GuildID: guild.ID, Success: false, @@ -319,13 +319,13 @@ func (svc *GuildService) AnswerScout(charID, leaderID uint32, accept bool, strin var mails []Mail if accept { - err = svc.guildRepo.AcceptApplication(guild.ID, charID) + err = svc.guildRepo.AcceptInvite(guild.ID, charID) mails = []Mail{ {SenderID: 0, RecipientID: charID, Subject: strings.SuccessTitle, Body: fmt.Sprintf(strings.SuccessBody, guild.Name), IsSystemMessage: true}, {SenderID: charID, RecipientID: leaderID, Subject: strings.AcceptedTitle, Body: fmt.Sprintf(strings.AcceptedBody, guild.Name), IsSystemMessage: true}, } } else { - err = svc.guildRepo.RejectApplication(guild.ID, charID) + err = svc.guildRepo.DeclineInvite(guild.ID, charID) mails = []Mail{ {SenderID: 0, RecipientID: charID, Subject: strings.RejectedTitle, Body: fmt.Sprintf(strings.RejectedBody, guild.Name), IsSystemMessage: true}, {SenderID: charID, RecipientID: leaderID, Subject: strings.DeclinedTitle, Body: fmt.Sprintf(strings.DeclinedBody, guild.Name), IsSystemMessage: true}, diff --git a/server/channelserver/svc_guild_test.go b/server/channelserver/svc_guild_test.go index 3b7afd2f0..f84f5aa87 100644 --- a/server/channelserver/svc_guild_test.go +++ b/server/channelserver/svc_guild_test.go @@ -385,14 +385,14 @@ func TestGuildService_PostScout(t *testing.T) { strings := ScoutInviteStrings{Title: "Invite", Body: "Join 「%s」"} tests := []struct { - name string - membership *GuildMember - guild *Guild - hasApp bool - hasAppErr error - createAppErr error - getMemberErr error - wantErr error + name string + membership *GuildMember + guild *Guild + hasInvite bool + hasInviteErr error + createAppErr error + getMemberErr error + wantErr error 
}{ { name: "successful scout", @@ -403,7 +403,7 @@ func TestGuildService_PostScout(t *testing.T) { name: "already invited", membership: &GuildMember{GuildID: 10, CharID: 1, IsLeader: true, OrderIndex: 1}, guild: &Guild{ID: 10, Name: "TestGuild"}, - hasApp: true, + hasInvite: true, wantErr: ErrAlreadyInvited, }, { @@ -423,11 +423,11 @@ func TestGuildService_PostScout(t *testing.T) { for _, tt := range tests { t.Run(tt.name, func(t *testing.T) { guildMock := &mockGuildRepo{ - membership: tt.membership, - hasAppResult: tt.hasApp, - hasAppErr: tt.hasAppErr, - createAppErr: tt.createAppErr, - getMemberErr: tt.getMemberErr, + membership: tt.membership, + hasInviteResult: tt.hasInvite, + hasInviteErr: tt.hasInviteErr, + createAppErr: tt.createAppErr, + getMemberErr: tt.getMemberErr, } guildMock.guild = tt.guild svc := newTestGuildService(guildMock, &mockMailRepo{}) @@ -468,7 +468,7 @@ func TestGuildService_AnswerScout(t *testing.T) { name string accept bool guild *Guild - application *GuildApplication + hasInvite bool acceptErr error rejectErr error sendErr error @@ -477,13 +477,13 @@ func TestGuildService_AnswerScout(t *testing.T) { wantErr error wantMailCount int wantAccepted uint32 - wantRejected uint32 + wantDeclined uint32 }{ { name: "accept invitation", accept: true, guild: &Guild{ID: 10, Name: "TestGuild", GuildLeader: GuildLeader{LeaderCharID: 50}}, - application: &GuildApplication{GuildID: 10, CharID: 1}, + hasInvite: true, wantSuccess: true, wantMailCount: 2, wantAccepted: 1, @@ -492,16 +492,16 @@ func TestGuildService_AnswerScout(t *testing.T) { name: "decline invitation", accept: false, guild: &Guild{ID: 10, Name: "TestGuild", GuildLeader: GuildLeader{LeaderCharID: 50}}, - application: &GuildApplication{GuildID: 10, CharID: 1}, + hasInvite: true, wantSuccess: true, wantMailCount: 2, - wantRejected: 1, + wantDeclined: 1, }, { name: "application missing", accept: true, guild: &Guild{ID: 10, Name: "TestGuild", GuildLeader: GuildLeader{LeaderCharID: 50}}, - 
application: nil, + hasInvite: false, wantSuccess: false, wantErr: ErrApplicationMissing, }, @@ -516,7 +516,7 @@ func TestGuildService_AnswerScout(t *testing.T) { name: "mail error is best-effort", accept: true, guild: &Guild{ID: 10, Name: "TestGuild", GuildLeader: GuildLeader{LeaderCharID: 50}}, - application: &GuildApplication{GuildID: 10, CharID: 1}, + hasInvite: true, sendErr: errors.New("mail failed"), wantSuccess: true, wantMailCount: 2, @@ -527,9 +527,9 @@ func TestGuildService_AnswerScout(t *testing.T) { for _, tt := range tests { t.Run(tt.name, func(t *testing.T) { guildMock := &mockGuildRepo{ - application: tt.application, - acceptErr: tt.acceptErr, - rejectErr: tt.rejectErr, + hasInviteResult: tt.hasInvite, + acceptErr: tt.acceptErr, + rejectErr: tt.rejectErr, } guildMock.guild = tt.guild guildMock.getErr = tt.getErr @@ -559,11 +559,11 @@ func TestGuildService_AnswerScout(t *testing.T) { if len(mailMock.sentMails) != tt.wantMailCount { t.Errorf("sentMails count = %d, want %d", len(mailMock.sentMails), tt.wantMailCount) } - if tt.wantAccepted != 0 && guildMock.acceptedCharID != tt.wantAccepted { - t.Errorf("acceptedCharID = %d, want %d", guildMock.acceptedCharID, tt.wantAccepted) + if tt.wantAccepted != 0 && guildMock.acceptInviteCharID != tt.wantAccepted { + t.Errorf("acceptInviteCharID = %d, want %d", guildMock.acceptInviteCharID, tt.wantAccepted) } - if tt.wantRejected != 0 && guildMock.rejectedCharID != tt.wantRejected { - t.Errorf("rejectedCharID = %d, want %d", guildMock.rejectedCharID, tt.wantRejected) + if tt.wantDeclined != 0 && guildMock.declineInviteCharID != tt.wantDeclined { + t.Errorf("declineInviteCharID = %d, want %d", guildMock.declineInviteCharID, tt.wantDeclined) } }) } diff --git a/server/channelserver/sys_channel_server.go b/server/channelserver/sys_channel_server.go index 72f537b81..a20fc0bf9 100644 --- a/server/channelserver/sys_channel_server.go +++ b/server/channelserver/sys_channel_server.go @@ -1,6 +1,7 @@ package channelserver 
import ( + "context" "encoding/binary" "errors" "fmt" @@ -11,6 +12,7 @@ import ( "time" "erupe-ce/common/byteframe" + "erupe-ce/common/decryption" cfg "erupe-ce/config" "erupe-ce/network" "erupe-ce/network/binpacket" @@ -74,6 +76,7 @@ type Server struct { miscRepo MiscRepo scenarioRepo ScenarioRepo mercenaryRepo MercenaryRepo + tournamentRepo TournamentRepo mailService *MailService guildService *GuildService achievementService *AchievementService @@ -167,6 +170,7 @@ func NewServer(config *Config) *Server { s.miscRepo = NewMiscRepository(config.DB) s.scenarioRepo = NewScenarioRepository(config.DB) s.mercenaryRepo = NewMercenaryRepository(config.DB) + s.tournamentRepo = NewTournamentRepository(config.DB) s.mailService = NewMailService(s.mailRepo, s.guildRepo, s.logger) s.guildService = NewGuildService(s.guildRepo, s.mailService, s.charRepo, s.logger) @@ -245,6 +249,51 @@ func (s *Server) Shutdown() { } +// ShutdownAndDrain stops accepting new connections, force-closes every active +// session so that their logoutPlayer cleanup runs (saves character data, removes +// from stages, etc.), then waits until all sessions have been removed from the +// sessions map or ctx is cancelled. It is safe to call multiple times. +func (s *Server) ShutdownAndDrain(ctx context.Context) { + s.Shutdown() + + // Snapshot all active connections while holding the lock, then close them + // outside the lock so we don't hold it during I/O. Closing a connection + // causes the session's recvLoop to see io.EOF and call logoutPlayer(), which + // in turn deletes the entry from s.sessions under the server mutex. + s.Lock() + conns := make([]net.Conn, 0, len(s.sessions)) + for conn := range s.sessions { + conns = append(conns, conn) + } + s.Unlock() + + for _, conn := range conns { + _ = conn.Close() + } + + // Poll until logoutPlayer has removed every session or the deadline passes. 
+ ticker := time.NewTicker(50 * time.Millisecond) + defer ticker.Stop() + for { + select { + case <-ctx.Done(): + s.Lock() + remaining := len(s.sessions) + s.Unlock() + s.logger.Warn("Shutdown drain timed out", zap.Int("remaining_sessions", remaining)) + return + case <-ticker.C: + s.Lock() + n := len(s.sessions) + s.Unlock() + if n == 0 { + s.logger.Info("Shutdown drain complete") + return + } + } + } +} + func (s *Server) acceptClients() { for { conn, err := s.listener.Accept() @@ -449,31 +498,52 @@ func (s *Server) Season() uint8 { return uint8(((TimeAdjusted().Unix() / secsPerDay) + sid) % 3) } -// ecdMagic is the ECD magic as read by binary.LittleEndian.Uint32. -// On-disk bytes: 65 63 64 1A ("ecd\x1a"), LE-decoded: 0x1A646365. -const ecdMagic = uint32(0x1A646365) - -// loadRengokuBinary reads and validates rengoku_data.bin from binPath. -// Returns the raw bytes on success, or nil if the file is missing or invalid. +// loadRengokuBinary loads and caches Hunting Road config. It tries +// rengoku_data.bin first and falls back to rengoku_data.json (built on the +// fly). Returns ECD-encrypted bytes ready to serve, or nil if no valid source +// is found. 
func loadRengokuBinary(binPath string, logger *zap.Logger) []byte { path := filepath.Join(binPath, "rengoku_data.bin") data, err := os.ReadFile(path) - if err != nil { - logger.Warn("rengoku_data.bin not found, Hunting Road will be unavailable", - zap.String("path", path), zap.Error(err)) - return nil + if err == nil { + if len(data) < 4 { + logger.Warn("rengoku_data.bin too small, ignoring", + zap.Int("bytes", len(data))) + } else if magic := binary.LittleEndian.Uint32(data[:4]); magic != decryption.ECDMagic { + logger.Warn("rengoku_data.bin has invalid ECD magic, ignoring", + zap.String("expected", fmt.Sprintf("0x%08x", decryption.ECDMagic)), + zap.String("got", fmt.Sprintf("0x%08x", magic))) + } else { + // Decrypt and decompress to validate the internal structure and emit a + // human-readable summary at startup. Failures here are non-fatal: the + // encrypted blob is still served to clients unchanged. + if plain, decErr := decryption.DecodeECD(data); decErr != nil { + logger.Warn("rengoku_data.bin ECD decryption failed — serving anyway", + zap.Error(decErr)) + } else { + raw := decryption.UnpackSimple(plain) + if info, parseErr := parseRengokuBinary(raw); parseErr != nil { + logger.Warn("rengoku_data.bin structural validation failed", + zap.Error(parseErr)) + } else { + logger.Info("Hunting Road config", + zap.Int("multi_floors", info.MultiFloors), + zap.Int("multi_spawn_tables", info.MultiSpawnTables), + zap.Int("solo_floors", info.SoloFloors), + zap.Int("solo_spawn_tables", info.SoloSpawnTables), + zap.Int("unique_monsters", info.UniqueMonsters), + ) + } + } + logger.Info("Loaded rengoku_data.bin", zap.Int("bytes", len(data))) + return data + } } - if len(data) < 4 { - logger.Warn("rengoku_data.bin too small, ignoring", - zap.Int("bytes", len(data))) - return nil + + if enc := loadRengokuFromJSON(binPath, logger); enc != nil { + return enc } - if magic := binary.LittleEndian.Uint32(data[:4]); magic != ecdMagic { - logger.Warn("rengoku_data.bin has invalid 
ECD magic, ignoring", - zap.String("expected", "0x1a646365"), - zap.String("got", fmt.Sprintf("0x%08x", magic))) - return nil - } - logger.Info("Loaded rengoku_data.bin", zap.Int("bytes", len(data))) - return data + + logger.Warn("No Hunting Road config found (rengoku_data.bin or rengoku_data.json), Hunting Road will be unavailable") + return nil } diff --git a/server/channelserver/sys_channel_server_test.go b/server/channelserver/sys_channel_server_test.go index f0a477d4f..3ec29d98b 100644 --- a/server/channelserver/sys_channel_server_test.go +++ b/server/channelserver/sys_channel_server_test.go @@ -11,6 +11,7 @@ import ( "time" + "erupe-ce/common/decryption" cfg "erupe-ce/config" "erupe-ce/network/clientctx" "erupe-ce/network/mhfpacket" @@ -750,7 +751,7 @@ func TestLoadRengokuBinary_ValidECD(t *testing.T) { dir := t.TempDir() // Build a minimal valid ECD file: magic + some payload data := make([]byte, 16) - binary.LittleEndian.PutUint32(data[:4], ecdMagic) + binary.LittleEndian.PutUint32(data[:4], decryption.ECDMagic) if err := os.WriteFile(filepath.Join(dir, "rengoku_data.bin"), data, 0644); err != nil { t.Fatal(err) } diff --git a/server/channelserver/sys_language.go b/server/channelserver/sys_language.go index 6f24b6f32..6c970909f 100644 --- a/server/channelserver/sys_language.go +++ b/server/channelserver/sys_language.go @@ -1,7 +1,15 @@ package channelserver +// Bead holds the display strings for a single kiju prayer bead type. +type Bead struct { + ID int + Name string + Description string +} + type i18n struct { language string + beads []Bead cafe struct { reset string } @@ -79,7 +87,9 @@ type i18n struct { berserkSmall string } guild struct { - invite struct { + rookieGuildName string + returnGuildName string + invite struct { title string body string success struct { @@ -102,140 +112,37 @@ type i18n struct { } } +// beadName returns the localised name for a bead type. 
+func (i *i18n) beadName(beadType int) string { + for _, b := range i.beads { + if b.ID == beadType { + return b.Name + } + } + return "" +} + +// beadDescription returns the localised description for a bead type. +func (i *i18n) beadDescription(beadType int) string { + for _, b := range i.beads { + if b.ID == beadType { + return b.Description + } + } + return "" +} + +// getLangStrings returns the i18n string table for the configured language, +// falling back to English for unknown language codes. func getLangStrings(s *Server) i18n { - var i i18n switch s.erupeConfig.Language { case "jp": - i.language = "日本語" - i.cafe.reset = "%d/%dにリセット" - i.timer = "タイマー:%02d'%02d\"%02d.%03d (%df)" - - i.commands.noOp = "You don't have permission to use this command" - i.commands.disabled = "%sのコマンドは無効です" - i.commands.reload = "リロードします" - i.commands.kqf.get = "現在のキークエストフラグ:%x" - i.commands.kqf.set.error = "キークエコマンドエラー 例:%s set xxxxxxxxxxxxxxxx" - i.commands.kqf.set.success = "キークエストのフラグが更新されました。ワールド/ランドを移動してください" - i.commands.kqf.version = "This command is disabled prior to MHFG10" - i.commands.rights.error = "コース更新コマンドエラー 例:%s x" - i.commands.rights.success = "コース情報を更新しました:%d" - i.commands.course.error = "コース確認コマンドエラー 例:%s " - i.commands.course.disabled = "%sコースは無効です" - i.commands.course.enabled = "%sコースは有効です" - i.commands.course.locked = "%sコースはロックされています" - i.commands.teleport.error = "テレポートコマンドエラー 構文:%s x y" - i.commands.teleport.success = "%d %dにテレポート" - i.commands.psn.error = "PSN連携コマンドエラー 例:%s " - i.commands.psn.success = "PSN「%s」が連携されています" - i.commands.psn.exists = "PSNは既存のユーザに接続されています" - - i.commands.discord.success = "あなたのDiscordトークン:%s" - - i.commands.ban.noUser = "Could not find user" - i.commands.ban.success = "Successfully banned %s" - i.commands.ban.invalid = "Invalid Character ID" - i.commands.ban.error = "Error in command. 
Format: %s [length]" - i.commands.ban.length = " until %s" - - i.commands.ravi.noCommand = "ラヴィコマンドが指定されていません" - i.commands.ravi.start.success = "大討伐を開始します" - i.commands.ravi.start.error = "大討伐は既に開催されています" - i.commands.ravi.multiplier = "ラヴィダメージ倍率:x%.2f" - i.commands.ravi.res.success = "復活支援を実行します" - i.commands.ravi.res.error = "復活支援は実行されませんでした" - i.commands.ravi.sed.success = "鎮静支援を実行します" - i.commands.ravi.request = "鎮静支援を要請します" - i.commands.ravi.error = "ラヴィコマンドが認識されません" - i.commands.ravi.noPlayers = "誰も大討伐に参加していません" - i.commands.ravi.version = "This command is disabled outside of MHFZZ" - - i.raviente.berserk = "<大討伐:猛狂期>が開催されました!" - i.raviente.extreme = "<大討伐:猛狂期【極】>が開催されました!" - i.raviente.extremeLimited = "<大討伐:猛狂期【極】(制限付)>が開催されました!" - i.raviente.berserkSmall = "<大討伐:猛狂期(小数)>が開催されました!" - - i.guild.invite.title = "猟団勧誘のご案内" - i.guild.invite.body = "猟団「%s」からの勧誘通知です。\n「勧誘に返答」より、返答を行ってください。" - - i.guild.invite.success.title = "成功" - i.guild.invite.success.body = "あなたは「%s」に参加できました。" - - i.guild.invite.accepted.title = "承諾されました" - i.guild.invite.accepted.body = "招待した狩人が「%s」への招待を承諾しました。" - - i.guild.invite.rejected.title = "却下しました" - i.guild.invite.rejected.body = "あなたは「%s」への参加を却下しました。" - - i.guild.invite.declined.title = "辞退しました" - i.guild.invite.declined.body = "招待した狩人が「%s」への招待を辞退しました。" + return langJapanese() + case "fr": + return langFrench() + case "es": + return langSpanish() default: - i.language = "English" - i.cafe.reset = "Resets on %d/%d" - i.timer = "Time: %02d:%02d:%02d.%03d (%df)" - - i.commands.noOp = "You don't have permission to use this command" - i.commands.disabled = "%s command is disabled" - i.commands.reload = "Reloading players..." - i.commands.playtime = "Playtime: %d hours %d minutes %d seconds" - - i.commands.kqf.get = "KQF: %x" - i.commands.kqf.set.error = "Error in command. 
Format: %s set xxxxxxxxxxxxxxxx" - i.commands.kqf.set.success = "KQF set, please switch Land/World" - i.commands.kqf.version = "This command is disabled prior to MHFG10" - i.commands.rights.error = "Error in command. Format: %s x" - i.commands.rights.success = "Set rights integer: %d" - i.commands.course.error = "Error in command. Format: %s " - i.commands.course.disabled = "%s Course disabled" - i.commands.course.enabled = "%s Course enabled" - i.commands.course.locked = "%s Course is locked" - i.commands.teleport.error = "Error in command. Format: %s x y" - i.commands.teleport.success = "Teleporting to %d %d" - i.commands.psn.error = "Error in command. Format: %s " - i.commands.psn.success = "Connected PSN ID: %s" - i.commands.psn.exists = "PSN ID is connected to another account!" - - i.commands.discord.success = "Your Discord token: %s" - - i.commands.ban.noUser = "Could not find user" - i.commands.ban.success = "Successfully banned %s" - i.commands.ban.invalid = "Invalid Character ID" - i.commands.ban.error = "Error in command. Format: %s [length]" - i.commands.ban.length = " until %s" - - i.commands.timer.enabled = "Quest timer enabled" - i.commands.timer.disabled = "Quest timer disabled" - - i.commands.ravi.noCommand = "No Raviente command specified!" - i.commands.ravi.start.success = "The Great Slaying will begin in a moment" - i.commands.ravi.start.error = "The Great Slaying has already begun!" - i.commands.ravi.multiplier = "Raviente multiplier is currently %.2fx" - i.commands.ravi.res.success = "Sending resurrection support!" - i.commands.ravi.res.error = "Resurrection support has not been requested!" - i.commands.ravi.sed.success = "Sending sedation support if requested!" - i.commands.ravi.request = "Requesting sedation support!" - i.commands.ravi.error = "Raviente command not recognised!" - i.commands.ravi.noPlayers = "No one has joined the Great Slaying!" 
- i.commands.ravi.version = "This command is disabled outside of MHFZZ" - - i.raviente.berserk = "<Great Slaying: Berserk> is being held!" - i.raviente.extreme = "<Great Slaying: Berserk (Extreme)> is being held!" - i.raviente.extremeLimited = "<Great Slaying: Berserk (Extreme, Limited)> is being held!" - i.raviente.berserkSmall = "<Great Slaying: Berserk (Small)> is being held!" - - i.guild.invite.title = "Invitation!" - i.guild.invite.body = "You have been invited to join\n「%s」\nDo you want to accept?" - - i.guild.invite.success.title = "Success!" - i.guild.invite.success.body = "You have successfully joined\n「%s」." - - i.guild.invite.accepted.title = "Accepted" - i.guild.invite.accepted.body = "The recipient accepted your invitation to join\n「%s」." - - i.guild.invite.rejected.title = "Rejected" - i.guild.invite.rejected.body = "You rejected the invitation to join\n「%s」." - - i.guild.invite.declined.title = "Declined" - i.guild.invite.declined.body = "The recipient declined your invitation to join\n「%s」." + return langEnglish() } - return i } diff --git a/server/channelserver/sys_language_test.go b/server/channelserver/sys_language_test.go index 8888c07ec..d0665c4d6 100644 --- a/server/channelserver/sys_language_test.go +++ b/server/channelserver/sys_language_test.go @@ -1,6 +1,8 @@ package channelserver import ( + "fmt" + "reflect" "testing" cfg "erupe-ce/config" @@ -92,3 +94,36 @@ func TestGetLangStrings_EmptyLanguage(t *testing.T) { t.Errorf("Empty language should default to English, got %q", lang.language) } } + +// checkNoEmptyStrings recursively walks v and fails the test for any empty string field. 
+func checkNoEmptyStrings(t *testing.T, v reflect.Value, path string) { + t.Helper() + switch v.Kind() { + case reflect.String: + if v.String() == "" { + t.Errorf("missing translation: %s is empty", path) + } + case reflect.Struct: + for i := 0; i < v.NumField(); i++ { + checkNoEmptyStrings(t, v.Field(i), path+"."+v.Type().Field(i).Name) + } + case reflect.Slice: + for i := 0; i < v.Len(); i++ { + checkNoEmptyStrings(t, v.Index(i), fmt.Sprintf("%s[%d]", path, i)) + } + } +} + +func TestLangCompleteness(t *testing.T) { + languages := map[string]i18n{ + "en": langEnglish(), + "jp": langJapanese(), + "fr": langFrench(), + "es": langSpanish(), + } + for code, lang := range languages { + t.Run(code, func(t *testing.T) { + checkNoEmptyStrings(t, reflect.ValueOf(lang), code) + }) + } +} diff --git a/server/channelserver/sys_session.go b/server/channelserver/sys_session.go index be9d23e3d..6f3c7a235 100644 --- a/server/channelserver/sys_session.go +++ b/server/channelserver/sys_session.go @@ -72,6 +72,10 @@ type Session struct { // Contains the mail list that maps accumulated indexes to mail IDs mailList []int + // currentBeadIndex is the bead slot selected by the player via MsgMhfSetKiju. + // A value of -1 means no bead is currently assigned this session. 
+ currentBeadIndex int + Name string closed atomic.Bool ackStart map[uint32]time.Time @@ -86,20 +90,21 @@ func NewSession(server *Server, conn net.Conn) *Session { cryptConn, captureConn, captureCleanup := startCapture(server, cryptConn, conn.RemoteAddr(), pcap.ServerTypeChannel) s := &Session{ - logger: server.logger.Named(conn.RemoteAddr().String()), - server: server, - rawConn: conn, - cryptConn: cryptConn, - sendPackets: make(chan packet, 20), - clientContext: &clientctx.ClientContext{RealClientMode: server.erupeConfig.RealClientMode}, - lastPacket: time.Now(), - objectID: server.getObjectId(), - sessionStart: TimeAdjusted().Unix(), - stageMoveStack: stringstack.New(), - ackStart: make(map[uint32]time.Time), - semaphoreID: make([]uint16, 2), - captureConn: captureConn, - captureCleanup: captureCleanup, + logger: server.logger.Named(conn.RemoteAddr().String()), + server: server, + rawConn: conn, + cryptConn: cryptConn, + sendPackets: make(chan packet, 20), + clientContext: &clientctx.ClientContext{RealClientMode: server.erupeConfig.RealClientMode}, + lastPacket: time.Now(), + objectID: server.getObjectId(), + sessionStart: TimeAdjusted().Unix(), + stageMoveStack: stringstack.New(), + ackStart: make(map[uint32]time.Time), + semaphoreID: make([]uint16, 2), + captureConn: captureConn, + captureCleanup: captureCleanup, + currentBeadIndex: -1, } return s } diff --git a/server/channelserver/test_helpers_test.go b/server/channelserver/test_helpers_test.go index ed7c55933..de289652c 100644 --- a/server/channelserver/test_helpers_test.go +++ b/server/channelserver/test_helpers_test.go @@ -48,6 +48,10 @@ func createMockServer() *Server { state: make([]uint32, 30), support: make([]uint32, 30), }, + // divaRepo and tournamentRepo defaults prevent nil-deref in handler tests + // that don't need specific repo behaviour. Tests that need controlled data override them. 
+ divaRepo: &mockDivaRepo{}, + tournamentRepo: &mockTournamentRepo{}, } s.i18n = getLangStrings(s) s.Registry = NewLocalChannelRegistry([]*Server{s}) diff --git a/server/migrations/seed/CampaignDemo.sql b/server/migrations/seed/CampaignDemo.sql new file mode 100644 index 000000000..343504328 --- /dev/null +++ b/server/migrations/seed/CampaignDemo.sql @@ -0,0 +1,5169 @@ +BEGIN; + +INSERT INTO + public.campaigns ( + id, + min_hr, + max_hr, + min_sr, + max_sr, + min_gr, + max_gr, + reward_type, + stamps, + receive_type, + background_id, + start_time, + end_time, + title, + reward, + link, + code_prefix + ) +VALUES + ( + 921805910, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ガイド娘の差し入れ', + 'ガイド娘の差し入れ', + 'http://www.mhf-z.jp/', + 'SCSC' + ), ( + 142209682, + -1, + -1, + -1, + -1, + -1, + -1, + 2, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'VISAクエスト【ベイルZP】', + 'VISAクエスト【ベイルZP】', + 'http://www.mhf-z.jp/', + 'VC10' + ), ( + 488594222, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + '推奨PC特典トゥールム', + '推奨PC特典トゥールム', + 'http://www.mhf-z.jp', + 'TOUR' + ), ( + 649144304, + -1, + -1, + -1, + -1, + -1, + -1, + 2, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'VISAクエスト【ベイルGP】', + 'VISAクエスト【ベイルGP】', + 'http://www.mhf-z.jp', + 'VC09' + ), ( + 421484974, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + '新・狩友紹介特典【紹介】', + '新・狩友紹介特典【紹介】', + 'http://www.mhf-z.jp', + 'SFR1' + ), ( + 730957531, + 100, + 999, + -1, + -1, + -1, + -1, + 1, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + '新・狩友紹介特典【HR100・複数】', + '新・狩友紹介特典【HR100・複数】', + 'http://www.mhf-z.jp', + 'SFR7' + ), ( + 865537514, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'G-Tune特典G-Tuneソード', + 'GTUNE生産券×1', + 'http://www.mhf-z.jp', + 'GT01' + ), ( + 488594158, + -1, + -1, + -1, + -1, + -1, + 
-1, + 2, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'VISAクエスト【ベイルGS】', + 'VISAクエスト【ベイルGS】', + 'http://www.mhf-z.jp', + 'VC08' + ), ( + 699557165, + -1, + -1, + -1, + -1, + -1, + -1, + 2, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'VISAクエスト【ベイルHS】', + 'VISAクエスト【ベイルHS】', + 'http://www.mhf-z.jp', + 'VC07' + ), ( + 766535101, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'はじめての狩猟セット【蒼】', + 'はじめての狩猟セット【蒼】', + 'http://www.mhf-z.jp', + 'HSSG' + ), ( + 919840278, + -1, + -1, + -1, + -1, + 1, + 999, + 2, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'レア素材獲得クエスト[GR1]', + 'レア素材獲得クエスト[GR1]', + 'http://www.mhf-z.jp', + 'RAQ8' + ), ( + 747709809, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 0, + 10, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'プーギー服「ゴスペル風の服」', + 'プーギー服「ゴスペル風の服」', + 'http://members.mhf-z.jp/topic/payment/', + 'PG08' + ), ( + 257005183, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'スリートキット生産券一式', + 'スリートキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'SRTK' + ), ( + 649274608, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ステノキット生産券一式', + 'ステノキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'STNK' + ), ( + 680601073, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'エディオキット生産券一式', + 'エディオキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'EDEK' + ), ( + 142209810, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ガリトスキット生産券一式', + 'ガリトスキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'GLSK' + ), ( + 140111954, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 0, + 10, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'プーギー服「パリアっぽい服」', + 'プーギー服「パリアっぽい服」', + 'http://members.mhf-z.jp/topic/payment/', + 'PG10' + ), ( + 225604757, + -1, + -1, + -1, + 
-1, + -1, + -1, + 1, + 0, + 10, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'プーギー服「釣り上げそうな服」', + 'プーギー服「釣り上げそうな服」', + 'http://members.mhf-z.jp/topic/payment/', + 'PG12' + ), ( + 970837204, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ブースターパックG第1弾', + 'ブースターパックG第1弾', + 'http://members.mhf-z.jp/topic/payment/', + 'BGK1' + ), ( + 647465687, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'Fate/stay night生産券一式', + 'Fate/stay night生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'FSNK' + ), ( + 615934211, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + '初音ミクキット生産券一式', + '初音ミクキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'HMKK' + ), ( + 477754130, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'G強化券', + 'G強化券×10', + 'http://members.mhf-z.jp/topic/payment/', + 'GUGK' + ), ( + 477131813, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + '伊達政宗ノ装束生産券一式', + '伊達政宗ノ装束生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'BDMK' + ), ( + 919839894, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + '真田幸村ノ装束生産券一式', + '真田幸村ノ装束生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'BSYK' + ), ( + 410645266, + -1, + -1, + -1, + -1, + -1, + -1, + 2, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'VISAクエスト【ベイルHC】', + 'VISAクエスト【ベイルHC】', + 'http://www.mhf-z.jp', + 'VC06' + ), ( + 769511124, + -1, + -1, + -1, + -1, + -1, + -1, + 2, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'VISAクエスト【ベイルF】', + 'VISAクエスト【ベイルF】', + 'http://www.mhf-z.jp', + 'VC04' + ), ( + 697509361, + -1, + -1, + -1, + -1, + -1, + -1, + 2, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'VISAクエスト【ベイルFZ】', + 'VISAクエスト【ベイルFZ】', + 'http://www.mhf-z.jp', + 'VC05' + ), ( + 
1037946068, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'SF25周年記念キット生産券一式', + 'SF25周年記念キット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'SFAK' + ), ( + 615803171, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'グロリアキット生産券一式', + 'グロリアキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'GLRK' + ), ( + 257005151, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + '鬼武者キット生産券一式', + '鬼武者キット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'NMSK' + ), ( + 209318802, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'クロースキット生産券一式', + 'クロースキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'CLTK' + ), ( + 664111863, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'テクストキット生産券一式', + 'テクストキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'TEKK' + ), ( + 597101930, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ファランキット生産券一式', + 'ファランキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'FRNK' + ), ( + 494917916, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'アージェキット生産券一式', + 'アージェキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'ARJK' + ), ( + 477262885, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ガイド娘のお助けパック第5弾', + 'ガイド娘のお助けパック第5弾', + 'http://members.mhf-z.jp/topic/payment/', + 'GG5L' + ), ( + 584295574, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ブースターパック第15弾', + 'ブースターパック第15弾', + 'http://members.mhf-z.jp/topic/payment/', + 'BK15' + ), ( + 615934227, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'レイストキット生産券一式', + 
'レイストキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'RESK' + ), ( + 865438535, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 0, + 0, + 18, + NOW(), + NOW() + INTERVAL '24 hours', + 'ドスパラPC特典ドスパライズ', + 'ドスパライズ', + 'http://www.mhf-z.jp', + 'DP01' + ), ( + 209318162, + 300, + 999, + -1, + -1, + -1, + -1, + 2, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'レア素材獲得クエスト[HR300]', + 'レア素材獲得クエスト[HR300]', + 'http://www.mhf-z.jp', + 'RAQ7' + ), ( + 427809564, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + '秘伝書コース', + '秘伝書コース購入者プレゼント', + 'http://members.mhf-z.jp/topic/payment/', + 'HDNP' + ), ( + 769510612, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ブースターピアス第4弾', + 'ブースターピアス第4弾', + 'http://members.mhf-z.jp/topic/payment/', + 'BUP4' + ), ( + 632580403, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ブースターパック第14弾', + 'ブースターパック第14弾', + 'http://members.mhf-z.jp/topic/payment/', + 'BK14' + ), ( + 697509345, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ガイド娘のお助けパック第4弾', + 'ガイド娘のお助けパック第4弾', + 'http://members.mhf-z.jp/topic/payment/', + 'GG4L' + ), ( + 477754258, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'シエナキット生産券一式', + 'シエナキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'SENK' + ), ( + 982746827, + 11, + 999, + -1, + -1, + -1, + -1, + 1, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + '狩友紹介特典【HR11】', + '狩友紹介特典【HR11】', + 'http://www.mhf-z.jp', + 'FRS2' + ), ( + 427809180, + 31, + 999, + -1, + -1, + -1, + -1, + 1, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + '狩友紹介特典【HR31】', + '狩友紹介特典【HR31】', + 'http://www.mhf-z.jp', + 'FRS3' + ), ( + 326425369, + 51, + 999, + -1, + -1, + -1, + -1, + 1, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + '狩友紹介特典【HR51】', + 
'狩友紹介特典【HR51】', + 'http://www.mhf-z.jp', + 'FRS4' + ), ( + 932646762, + 71, + 999, + -1, + -1, + -1, + -1, + 1, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + '狩友紹介特典【HR71】', + '狩友紹介特典【HR71】', + 'http://www.mhf-z.jp', + 'FRS5' + ), ( + 664242887, + 100, + 999, + -1, + -1, + -1, + -1, + 1, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + '狩友紹介特典【HR100】', + '狩友紹介特典【HR100】', + 'http://www.mhf-z.jp', + 'FRS6' + ), ( + 324377496, + 100, + 999, + -1, + -1, + -1, + -1, + 1, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + '狩友紹介特典【HR100・複数】', + '狩友紹介特典【HR100・複数】', + 'http://www.mhf-z.jp', + 'FRS7' + ), ( + 488593646, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + '狩友紹介特典【紹介】', + '狩友紹介特典【紹介】', + 'http://www.mhf-z.jp', + 'FRS1' + ), ( + 917579440, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ガイド娘のお助けパック第3弾', + 'ガイド娘のお助けパック第3弾', + 'http://members.mhf-z.jp/topic/payment/', + 'GG3L' + ), ( + 1032922465, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ブースターパック第13弾', + 'ブースターパック第13弾', + 'http://members.mhf-z.jp/topic/payment/', + 'BK13' + ), ( + 410775890, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ターボパック第6弾', + 'ターボパック第6弾', + 'http://members.mhf-z.jp/topic/payment/', + 'TPK6' + ), ( + 190027375, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ブースターピアス第3弾', + 'ブースターピアス第3弾', + 'http://members.mhf-z.jp/topic/payment/', + 'BUP3' + ), ( + 704499284, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'カウチュキット生産券一式', + 'カウチュキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'CULK' + ), ( + 749757837, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'VISA引換券交換【30日】', + 'VISA引換券交換【30日】', + 
'http://www.mhf-z.jp', + 'VT30' + ), ( + 41343673, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'パッケージ引換券交換【30日】', + 'パッケージ引換券交換【30日】', + 'http://www.mhf-z.jp', + 'PT30' + ), ( + 664080362, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'パッケージ引換券交換【60日】', + 'パッケージ引換券交換【60日】', + 'http://www.mhf-z.jp', + 'PT60' + ), ( + 582034480, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'はじめての狩猟セット【紅】', + 'はじめての狩猟セット【紅】', + 'http://www.mhf-z.jp', + 'HSSD' + ), ( + 884238483, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ガイド娘のお助けパック第2弾', + 'ガイド娘のお助けパック第2弾', + 'http://members.mhf-z.jp/topic/payment/', + 'GG2L' + ), ( + 125278872, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ヴァンパイアキット生産券一式', + 'ヴァンパイアキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'VAMK' + ), ( + 848792567, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'トラスキット生産券一式', + 'トラスキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'TRSK' + ), ( + 852600342, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ブースターピアス第2弾', + 'ブースターピアス第2弾', + 'http://members.mhf-z.jp/topic/payment/', + 'BUP2' + ), ( + 426931253, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ブースターパック第12弾', + 'ブースターパック第12弾', + 'http://members.mhf-z.jp/topic/payment/', + 'BK12' + ), ( + 582034992, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + '推奨PC特典マグネストーン', + '推奨PC特典マグネストーン', + 'http://www.mhf-z.jp', + 'RPC1' + ), ( + 919708822, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 0, + 10, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'プーギー服「レインボウな服」', + 'プーギー服「レインボウな服」', + 
'http://members.mhf-z.jp/topic/payment/', + 'PG4A' + ), ( + 477885266, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ブースターパック第11弾', + 'ブースターパック第11弾', + 'http://members.mhf-z.jp/topic/payment/', + 'BK11' + ), ( + 1032922449, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'マイストキット生産券一式', + 'マイストキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'MAIK' + ), ( + 190027359, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ターボパック第5弾', + 'ターボパック第5弾', + 'http://members.mhf-z.jp/topic/payment/', + 'TPK5' + ), ( + 917578800, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ブースターピアス', + 'ブースターピアス', + 'http://members.mhf-z.jp/topic/payment/', + 'BUP1' + ), ( + 651273238, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ガイド娘のお助けパック', + 'ガイド娘のお助けパック', + 'http://members.mhf-z.jp/topic/payment/', + 'GGHL' + ), ( + 157276444, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + '蘭陀キット生産券一式', + '蘭陀キット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'RANK' + ), ( + 664079850, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'スキルカフP セット', + 'スキルカフP セット', + 'http://members.mhf-z.jp/topic/payment/', + 'SC2K' + ), ( + 490559854, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ブースターパック第10弾', + 'ブースターパック第10弾', + 'http://members.mhf-z.jp/topic/payment/', + 'BK10' + ), ( + 1066502091, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'スパIVキット第2弾生産券一式', + 'スパIVキット第2弾生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'SU2K' + ), ( + 766666125, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 0, + 10, + 12, + NOW(), + NOW() + INTERVAL '24 
hours', + 'カシラの差し入れ', + 'カシラの差し入れ', + 'http://www.mhf-z.jp', + 'SASB' + ), ( + 917579312, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 0, + 10, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + '教官見習ネコの差し入れ', + '教官見習ネコの差し入れ', + 'http://www.mhf-z.jp', + 'SASA' + ), ( + 58169496, + -1, + -1, + -1, + -1, + -1, + -1, + 2, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'レア素材獲得クエスト[HR100]', + 'レア素材獲得クエスト[HR100]', + 'http://www.mhf-z.jp', + 'RAQ6' + ), ( + 57989801, + -1, + -1, + -1, + -1, + -1, + -1, + 2, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'レア素材獲得クエスト[HR31]', + 'レア素材獲得クエスト[HR31]', + 'http://www.mhf-g.jp', + 'RAQ3' + ), ( + 865406314, + -1, + -1, + -1, + -1, + -1, + -1, + 2, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'レア素材獲得クエスト[HR51]', + 'レア素材獲得クエスト[HR51]', + 'http://www.mhf-z.jp', + 'RAQ4' + ), ( + 848792551, + -1, + -1, + -1, + -1, + -1, + -1, + 2, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'レア素材獲得クエスト[HR71]', + 'レア素材獲得クエスト[HR71]', + 'http://www.mhf-z.jp', + 'RAQ5' + ), ( + 865438711, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ターボパック第4弾', + 'ターボパック第4弾', + 'http://members.mhf-z.jp/topic/payment/', + 'TPK4' + ), ( + 596970858, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ブースターパック第9弾', + 'ブースターパック第9弾', + 'http://members.mhf-z.jp/topic/payment/', + 'BPK9' + ), ( + 57989769, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'バーニーキット生産券一式', + 'バーニーキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'BUKT' + ), ( + 884238467, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 0, + 10, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'マイトレポイント10P', + 'マイトレポイント10P', + 'http://www.mhf-z.jp', + 'MYTP' + ), ( + 125278360, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'スパIVキット生産券一式', + 'スパIVキット生産券一式', + 
'http://members.mhf-z.jp/topic/payment/', + 'S4KT' + ), ( + 766666173, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ゴルトキット生産券一式', + 'ゴルトキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'GLKT' + ), ( + 58169880, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ターボパックN第2弾', + 'ターボパックN第2弾', + 'http://members.mhf-z.jp/topic/payment/', + 'TPN2' + ), ( + 58120841, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ブースターパック第8弾', + 'ブースターパック第8弾', + 'http://members.mhf-z.jp/topic/payment/', + 'BPK8' + ), ( + 848661495, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'リゲリアキット生産券一式', + 'リゲリアキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'RGKT' + ), ( + 865406954, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'クレールキット生産券一式', + 'クレールキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'CLKT' + ), ( + 901015731, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'シャランキット生産券一式', + 'シャランキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'SAKT' + ), ( + 702533268, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 10, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'プーギー服「アクラっぽい服」', + 'プーギー服「アクラっぽい服」', + 'http://members.mhf-z.jp/topic/payment/', + 'PG07' + ), ( + 584164886, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 0, + 10, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'プーギー服「美味しそうな服」', + 'プーギー服「美味しそうな服」', + 'http://members.mhf-z.jp/topic/payment/', + 'PG13' + ), ( + 425712412, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ターボパック第3弾', + 'ターボパック第3弾', + 'http://members.mhf-z.jp/topic/payment/', + 'TPK3' + ), ( + 749888957, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + 
INTERVAL '24 hours', + 'きんねこぎんねこキット生産券一式', + 'きんねこぎんねこキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'GSKT' + ), ( + 1066633211, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ブースターパック第7弾', + 'ブースターパック第7弾', + 'http://members.mhf-z.jp/topic/payment/', + 'BPK7' + ), ( + 702532628, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'アリストキット生産券一式', + 'アリストキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'ASKT' + ), ( + 764487025, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ブースターパック第6弾', + 'ブースターパック第6弾', + 'http://members.mhf-z.jp/topic/payment/', + 'BPK6' + ), ( + 714180459, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ブースターパック第5弾', + 'ブースターパック第5弾', + 'http://members.mhf-z.jp/topic/payment/', + 'BPK5' + ), ( + 492820700, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ターボパック第2弾', + 'ターボパック第2弾', + 'http://members.mhf-z.jp/topic/payment/', + 'TPK2' + ), ( + 225604741, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ターボパックN', + 'ターボパックN', + 'http://members.mhf-z.jp/topic/payment/', + 'TPN1' + ), ( + 865438551, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'スキルカフPセット', + 'スキルカフPセット', + 'http://members.mhf-z.jp/topic/payment/', + 'SCP1' + ), ( + 391617368, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'イメチェンサービス', + 'イメチェンサービス', + 'http://members.mhf-z.jp/topic/payment/', + 'IMCH' + ), ( + 309779081, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'メレティキット生産券一式', + 'メレティキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'MTKT' + ), ( + 649143728, + -1, + -1, + -1, + -1, + -1, + -1, + 1, 
+ 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'フェルムキット生産券一式', + 'フェルムキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'FLKT' + ), ( + 123182040, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'レアルキット生産券一式', + 'レアルキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'RLKT' + ), ( + 766534669, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'アルマキット生産券一式', + 'アルマキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'APK1' + ), ( + 901015555, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'イクスキット生産券一式', + 'イクスキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'EXKT' + ), ( + 919708950, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'リアンキット生産券一式', + 'リアンキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'LEKT' + ), ( + 666307882, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'マギサ/ウィザーキット生産券一式', + 'マギサ/ウィザーキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'MGWZ' + ), ( + 766665773, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + '星祭/七夕キット生産券一式', + '星祭/七夕キット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'HTKT' + ), ( + 582035248, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'プロミスキット生産券一式', + 'プロミスキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'PRMS' + ), ( + 173119487, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ホワイトメタルキット生産券一式', + 'ホワイトメタルキット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', + 'WMTL' + ), ( + 766534701, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'しろねこ服キット生産券一式', + 'しろねこ服キット生産券一式', + 'http://members.mhf-z.jp/topic/payment/', 
+ 'SRNK' + ), ( + 747709777, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ターボパック第1弾', + 'ターボパック第1弾', + 'http://members.mhf-z.jp/topic/payment/', + 'TPK1' + ), ( + 189896703, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ブースターパック第4弾', + 'ブースターパック第4弾', + 'http://members.mhf-z.jp/topic/payment/', + 'BPK4' + ), ( + 848792439, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ブースターパック第3弾', + 'ブースターパック第3弾', + 'http://members.mhf-z.jp/topic/payment/', + 'BPK3' + ), ( + 970968596, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ブースターパック第2弾', + 'ブースターパック第2弾', + 'http://members.mhf-z.jp/topic/payment/', + 'BPK2' + ), ( + 425711708, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'ブースターパック', + 'ブースターパック', + 'http://members.mhf-z.jp/topic/payment/', + 'BPAC' + ), ( + 867634602, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'プーギー服「悪戯しそうな服」', + 'プーギー服「悪戯しそうな服」', + 'http://members.mhf-z.jp/topic/payment/', + 'PG11' + ), ( + 714180427, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'プーギー服「ベルっぽい服」', + 'プーギー服「ベルっぽい服」', + 'http://members.mhf-z.jp/topic/payment/', + 'PG09' + ), ( + 309648009, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'プーギー服「アスール風の服」', + 'プーギー服「アスール風の服」', + 'http://members.mhf-z.jp/topic/payment/', + 'PG06' + ), ( + 391616984, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'プーギー服「エスピっぽい服」', + 'プーギー服「エスピっぽい服」', + 'http://members.mhf-z.jp/topic/payment/', + 'PG04' + ), ( + 208827525, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'プーギー服「コムラダ風の服」', + 
'プーギー服「コムラダ風の服」', + 'http://members.mhf-z.jp/topic/payment/', + 'PG05' + ), ( + 326556329, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'プーギー服「ヒプっぽい服」', + 'プーギー服「ヒプっぽい服」', + 'http://members.mhf-z.jp/topic/payment/', + 'PG02' + ), ( + 934743978, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'プーギー服「ヴォルっぽい服」', + 'プーギー服「ヴォルっぽい服」', + 'http://members.mhf-z.jp/topic/payment/', + 'PG03' + ), ( + 224385244, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 1, + 1, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'プーギー服「野生的な服」', + 'プーギー服「野生的な服」', + 'http://members.mhf-z.jp/topic/payment/', + 'PG01' + ), ( + 884369427, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 0, + 10, + 41, + NOW(), + NOW() + INTERVAL '24 hours', + 'マイトレポイント30P', + 'マイトレポイント30P', + 'http://www.mhf-z.jp', + 'S3P2' + ), ( + 867634474, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 0, + 10, + 57, + NOW(), + NOW() + INTERVAL '24 hours', + '便利アイテムセット【入門】', + '便利アイテムセット【入門】', + 'http://www.mhf-z.jp', + 'BITM' + ), ( + 848792423, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 0, + 10, + 58, + NOW(), + NOW() + INTERVAL '24 hours', + '便利アイテムセット【達人】', + '便利アイテムセット【達人】', + 'http://www.mhf-z.jp', + 'EITM' + ), ( + 208696469, + -1, + -1, + -1, + -1, + -1, + -1, + 2, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'VISAクエスト【ゴシック】', + 'VISA生産券獲得クエスト', + 'http://www.mhf-z.jp', + 'VC01' + ), ( + 584165270, + -1, + -1, + -1, + -1, + -1, + -1, + 2, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'VISAクエスト【ゴシックF】', + 'VISA生産券獲得クエスト', + 'http://www.mhf-z.jp', + 'VC02' + ), ( + 901146643, + -1, + -1, + -1, + -1, + -1, + -1, + 2, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'VISAクエスト【ベイル】', + 'VISA生産券獲得クエスト', + 'http://www.mhf-z.jp', + 'VC03' + ), ( + 492821212, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'MHF VISAカード入会特典', + 'MHF VISAカード入会特典', + 
'http://www.mhf-z.jp', + 'VG03' + ), ( + 901015571, + -1, + -1, + -1, + -1, + -1, + -1, + 2, + 0, + 0, + 12, + NOW(), + NOW() + INTERVAL '24 hours', + 'レア素材獲得クエスト[HR11]', + 'レア素材獲得クエスト[HR11]', + 'http://www.mhf-z.jp', + 'RAQ2' + ), ( + 1038077588, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 0, + 0, + 5, + NOW(), + NOW() + INTERVAL '24 hours', + 'ポルタチケット配布所', + '換金用アイテム:ポルタチケット', + 'http://www.mhf-z.jp', + 'PTC1' + ), ( + 190027775, + -1, + -1, + -1, + -1, + -1, + -1, + 1, + 0, + 0, + 44, + NOW(), + NOW() + INTERVAL '24 hours', + 'パタパタメラルー配布所', + '家具:パタパタメラルー', + 'http://www.mhf-z.jp', + 'PATA' + ); + +INSERT INTO + public.campaign_categories (id, type, title, description) +VALUES + ( + 1, + 0, + 'イベント・キャンペーン', + '各種~C04イベント~C00用のイベントコードや、各 +種~C04キャンペーン~C00で入手できるイベントコ +ードを入力することができます。' + ), ( + 2, + 0, + 'パッケージ特典', + '~C11プレミアムパッケージ~C00など、各種~C04パッ +ケージ~C00に付属されているイベントコード +を入力することができます。' + ), ( + 3, + 0, + '関連グッズ付属特典', + '攻略本やゲームパッドなど、各種~C04MHF関 +連商品~C00に付属されているイベントコード +を入力することができます。' + ), ( + 5, + 0, + 'その他', + '複数の手段で入手できるイベントコード +や不具合のアイテム補償用などのイベン +トコードを入力することができます。' + ), ( + 6, + 1, + 'プレミアムキット・オリジナル', + 'アイテム販売商品~C11プレミアムキットシ +リーズ【オリジナルキット】~C00のイベン +トコードを入力することができます。' + ), ( + 7, + 1, + 'プレミアムキット・パッケージ', + 'アイテム販売商品~C11プレミアムキットシ +リーズ【パッケージキット】~C00のイベン +トコードを入力することができます。' + ), ( + 8, + 1, + 'ブースターパック', + 'アイテム販売商品~C11ブースターパック~C00の +イベントコードを入力することができま +す。' + ), ( + 9, + 1, + 'ターボパック', + 'アイテム販売商品~C11ターボパック~C00のイベ +ントコードを入力することができます。' + ), ( + 10, + 1, + 'マイトレプーギー服', + 'アイテム販売商品やパッケージに付属し +ている~C11マイトレプーギー服~C00のイベント +コードを入力することができます。' + ), ( + 11, + 1, + 'その他の商品', + 'アイテム販売商品~C11スキルカフPセット~C00や +~C11イメチェンサービス~C00などのイベントコ +ードを入力することができます。' + ); + +INSERT INTO + public.campaign_category_links (campaign_id, category_id) +VALUES + (921805910, 1), + (142209682, 3), + (488594222, 1), + (488594222, 3), + (649144304, 3), + (421484974, 1), + (730957531, 1), + (865537514, 3), + (488594158, 3), + (699557165, 3), + (766535101, 1), + (766535101, 3), + (919840278, 3), + 
(919840278, 5), + (747709809, 10), + (257005183, 7), + (649274608, 7), + (680601073, 7), + (142209810, 7), + (140111954, 10), + (225604757, 10), + (970837204, 8), + (647465687, 6), + (615934211, 6), + (477754130, 11), + (477131813, 6), + (919839894, 6), + (410645266, 3), + (769511124, 3), + (697509361, 3), + (1037946068, 6), + (615803171, 6), + (257005151, 6), + (209318802, 6), + (664111863, 7), + (597101930, 7), + (494917916, 7), + (477262885, 11), + (584295574, 8), + (615934227, 6), + (865438535, 3), + (209318162, 1), + (209318162, 5), + (427809564, 11), + (769510612, 8), + (632580403, 8), + (697509345, 11), + (477754258, 6), + (982746827, 1), + (427809180, 1), + (326425369, 1), + (932646762, 1), + (664242887, 1), + (324377496, 1), + (488593646, 1), + (917579440, 11), + (1032922465, 8), + (410775890, 9), + (190027375, 8), + (704499284, 6), + (749757837, 1), + (749757837, 3), + (41343673, 2), + (664080362, 2), + (582034480, 3), + (884238483, 11), + (125278872, 6), + (848792567, 6), + (852600342, 8), + (426931253, 8), + (582034992, 3), + (919708822, 10), + (477885266, 8), + (1032922449, 6), + (190027359, 9), + (917578800, 8), + (651273238, 11), + (157276444, 6), + (664079850, 11), + (490559854, 8), + (1066502091, 6), + (766666125, 1), + (766666125, 2), + (766666125, 5), + (766666125, 11), + (917579312, 1), + (917579312, 2), + (917579312, 5), + (917579312, 11), + (58169496, 1), + (58169496, 2), + (58169496, 5), + (57989801, 1), + (57989801, 2), + (57989801, 5), + (865406314, 1), + (865406314, 2), + (865406314, 5), + (848792551, 1), + (848792551, 2), + (848792551, 5), + (865438711, 9), + (596970858, 8), + (57989769, 6), + (884238467, 1), + (884238467, 5), + (884238467, 11), + (125278360, 6), + (766666173, 7), + (58169880, 9), + (58120841, 8), + (848661495, 6), + (865406954, 7), + (901015731, 7), + (702533268, 10), + (584164886, 10), + (425712412, 9), + (749888957, 7), + (1066633211, 8), + (702532628, 6), + (764487025, 8), + (714180459, 8), + (492820700, 9), + 
(225604741, 9), + (865438551, 11), + (391617368, 11), + (309779081, 6), + (649143728, 6), + (123182040, 6), + (766534669, 6), + (901015555, 7), + (919708950, 7), + (666307882, 6), + (766665773, 7), + (582035248, 7), + (173119487, 7), + (766534701, 7), + (747709777, 9), + (189896703, 8), + (848792439, 8), + (970968596, 8), + (425711708, 8), + (867634602, 10), + (714180427, 10), + (309648009, 10), + (391616984, 10), + (208827525, 10), + (326556329, 10), + (934743978, 10), + (224385244, 10), + (884369427, 1), + (884369427, 2), + (884369427, 5), + (884369427, 11), + (867634474, 5), + (867634474, 11), + (848792423, 5), + (848792423, 11), + (208696469, 1), + (208696469, 3), + (584165270, 1), + (584165270, 3), + (901146643, 1), + (901146643, 3), + (492821212, 1), + (492821212, 3), + (901015571, 1), + (901015571, 2), + (901015571, 5), + (1038077588, 1), + (1038077588, 5), + (190027775, 1), + (190027775, 5); + +INSERT INTO + public.campaign_rewards ( + campaign_id, + item_type, + quantity, + item_id + ) +VALUES + ( + 410645266, + 9, + 0, + 40161 + ), ( + 884238467, + 14, + 10, + 0 + ), ( + 747709809, + 15, + 1, + 16 + ), ( + 140111954, + 15, + 1, + 19 + ), ( + 225604757, + 15, + 1, + 26 + ), ( + 919708822, + 15, + 1, + 28 + ), ( + 702533268, + 15, + 1, + 11 + ), ( + 584164886, + 15, + 1, + 25 + ), ( + 867634602, + 15, + 1, + 21 + ), ( + 714180427, + 15, + 1, + 15 + ), ( + 309648009, + 15, + 1, + 12 + ), ( + 391616984, + 15, + 1, + 7 + ), ( + 208827525, + 15, + 1, + 8 + ), ( + 326556329, + 15, + 1, + 5 + ), ( + 934743978, + 15, + 1, + 6 + ), ( + 224385244, + 15, + 1, + 4 + ), ( + 697509361, + 9, + 0, + 40143 + ), ( + 769511124, + 9, + 0, + 40141 + ), ( + 901146643, + 9, + 0, + 40081 + ), ( + 584165270, + 9, + 0, + 40080 + ), ( + 208696469, + 9, + 0, + 40079 + ), ( + 492821212, + 7, + 110, + 1859 + ), ( + 190027775, + 8, + 1, + 96 + ), ( + 884369427, + 14, + 30, + 0 + ), ( + 57989769, + 7, + 1, + 
3617 + ), ( + 57989769, + 7, + 1, + 3618 + ), ( + 57989769, + 7, + 1, + 3622 + ), ( + 57989769, + 7, + 1, + 3623 + ), ( + 57989769, + 7, + 1, + 3624 + ), ( + 57989769, + 7, + 1, + 3625 + ), ( + 57989769, + 7, + 1, + 3626 + ), ( + 57989769, + 7, + 1, + 3627 + ), ( + 57989769, + 7, + 2, + 3631 + ), ( + 57989769, + 7, + 2, + 3632 + ), ( + 57989769, + 7, + 2, + 3633 + ), ( + 57989769, + 7, + 2, + 3634 + ), ( + 57989769, + 7, + 2, + 3635 + ), ( + 173119487, + 7, + 1, + 1767 + ), ( + 173119487, + 7, + 1, + 1759 + ), ( + 173119487, + 7, + 1, + 1760 + ), ( + 173119487, + 7, + 1, + 1776 + ), ( + 173119487, + 7, + 1, + 1777 + ), ( + 173119487, + 7, + 1, + 1778 + ), ( + 173119487, + 7, + 1, + 1779 + ), ( + 173119487, + 7, + 1, + 1780 + ), ( + 173119487, + 7, + 1, + 1781 + ), ( + 173119487, + 7, + 1, + 1806 + ), ( + 173119487, + 7, + 1, + 1807 + ), ( + 582035248, + 7, + 1, + 1460 + ), ( + 582035248, + 7, + 1, + 1761 + ), ( + 582035248, + 7, + 1, + 1762 + ), ( + 582035248, + 7, + 1, + 1461 + ), ( + 582035248, + 7, + 1, + 1462 + ), ( + 582035248, + 7, + 1, + 1464 + ), ( + 582035248, + 7, + 1, + 1463 + ), ( + 582035248, + 7, + 1, + 1465 + ), ( + 582035248, + 7, + 1, + 1466 + ), ( + 766665773, + 7, + 1, + 1532 + ), ( + 766665773, + 7, + 1, + 1084 + ), ( + 766665773, + 7, + 1, + 2020 + ), ( + 766665773, + 7, + 1, + 2019 + ), ( + 766665773, + 7, + 5, + 1533 + ), ( + 766665773, + 7, + 5, + 1534 + ), ( + 766665773, + 7, + 1, + 1535 + ), ( + 766665773, + 7, + 1, + 1074 + ), ( + 919708950, + 7, + 1, + 1614 + ), ( + 919708950, + 7, + 1, + 1615 + ), ( + 919708950, + 7, + 1, + 1616 + ), ( + 919708950, + 7, + 1, + 1617 + ), ( + 919708950, + 7, + 1, + 1618 + ), ( + 919708950, + 7, + 1, + 1619 + ), ( + 919708950, + 7, + 1, + 1620 + ), ( + 919708950, + 7, + 1, + 1621 + ), ( + 919708950, + 7, + 2, + 1622 + ), ( + 919708950, + 7, + 2, + 1623 + ), ( + 919708950, + 7, + 2, + 1624 + ), ( + 919708950, + 7, + 2, + 1625 + ), ( + 919708950, + 7, + 2, + 1626 + ), ( + 901015555, + 7, + 1, + 1765 + ), ( + 
901015555, + 7, + 1, + 1766 + ), ( + 901015555, + 7, + 1, + 1763 + ), ( + 901015555, + 7, + 1, + 1764 + ), ( + 901015555, + 7, + 2, + 1769 + ), ( + 901015555, + 7, + 2, + 1770 + ), ( + 901015555, + 7, + 2, + 1771 + ), ( + 901015555, + 7, + 2, + 1772 + ), ( + 901015555, + 7, + 2, + 1773 + ), ( + 749888957, + 7, + 1, + 2972 + ), ( + 749888957, + 7, + 1, + 2973 + ), ( + 749888957, + 7, + 1, + 2983 + ), ( + 749888957, + 7, + 1, + 2984 + ), ( + 749888957, + 7, + 1, + 2985 + ), ( + 749888957, + 7, + 1, + 2986 + ), ( + 749888957, + 7, + 1, + 2987 + ), ( + 749888957, + 7, + 1, + 2988 + ), ( + 749888957, + 7, + 2, + 3012 + ), ( + 749888957, + 7, + 2, + 3013 + ), ( + 749888957, + 7, + 2, + 3014 + ), ( + 749888957, + 7, + 2, + 3015 + ), ( + 749888957, + 7, + 2, + 3016 + ), ( + 749888957, + 7, + 2, + 3017 + ), ( + 749888957, + 7, + 2, + 3018 + ), ( + 749888957, + 7, + 2, + 3019 + ), ( + 749888957, + 7, + 2, + 3020 + ), ( + 749888957, + 7, + 2, + 3021 + ), ( + 865406954, + 7, + 1, + 2015 + ), ( + 865406954, + 7, + 1, + 2016 + ), ( + 865406954, + 7, + 1, + 2017 + ), ( + 865406954, + 7, + 1, + 2018 + ), ( + 865406954, + 7, + 2, + 2026 + ), ( + 865406954, + 7, + 2, + 2027 + ), ( + 865406954, + 7, + 2, + 2028 + ), ( + 865406954, + 7, + 2, + 2029 + ), ( + 865406954, + 7, + 2, + 2030 + ), ( + 766666173, + 7, + 1, + 2099 + ), ( + 766666173, + 7, + 1, + 2100 + ), ( + 766666173, + 7, + 1, + 2101 + ), ( + 766666173, + 7, + 1, + 2102 + ), ( + 766666173, + 7, + 1, + 2105 + ), ( + 766666173, + 7, + 1, + 2106 + ), ( + 766666173, + 7, + 1, + 2107 + ), ( + 766666173, + 7, + 1, + 2108 + ), ( + 766666173, + 7, + 1, + 2109 + ), ( + 766666173, + 7, + 1, + 2110 + ), ( + 766666173, + 7, + 2, + 2121 + ), ( + 766666173, + 7, + 2, + 2122 + ), ( + 766666173, + 7, + 2, + 2123 + ), ( + 766666173, + 7, + 2, + 2124 + ), ( + 766666173, + 7, + 2, + 2125 + ), ( + 664111863, + 7, + 1, + 2561 + ), ( + 664111863, + 7, + 1, + 2562 + ), ( + 664111863, + 7, + 1, + 2564 + ), ( + 664111863, + 7, + 1, + 2563 + ), ( + 
664111863, + 7, + 1, + 2567 + ), ( + 664111863, + 7, + 1, + 2568 + ), ( + 664111863, + 7, + 1, + 2569 + ), ( + 664111863, + 7, + 1, + 2570 + ), ( + 664111863, + 7, + 1, + 2571 + ), ( + 664111863, + 7, + 1, + 2572 + ), ( + 664111863, + 7, + 2, + 2578 + ), ( + 664111863, + 7, + 2, + 2579 + ), ( + 664111863, + 7, + 2, + 2580 + ), ( + 664111863, + 7, + 2, + 2581 + ), ( + 664111863, + 7, + 2, + 2582 + ), ( + 664111863, + 7, + 2, + 2583 + ), ( + 664111863, + 7, + 2, + 2584 + ), ( + 664111863, + 7, + 2, + 2585 + ), ( + 664111863, + 7, + 2, + 2586 + ), ( + 664111863, + 7, + 2, + 2587 + ), ( + 597101930, + 7, + 1, + 2968 + ), ( + 597101930, + 7, + 1, + 2969 + ), ( + 597101930, + 7, + 1, + 2970 + ), ( + 597101930, + 7, + 1, + 2971 + ), ( + 597101930, + 7, + 1, + 2977 + ), ( + 597101930, + 7, + 1, + 2978 + ), ( + 597101930, + 7, + 1, + 2979 + ), ( + 597101930, + 7, + 1, + 6398 + ), ( + 597101930, + 7, + 1, + 2980 + ), ( + 597101930, + 7, + 1, + 2981 + ), ( + 597101930, + 7, + 1, + 2982 + ), ( + 597101930, + 7, + 1, + 6399 + ), ( + 597101930, + 7, + 2, + 3007 + ), ( + 597101930, + 7, + 2, + 3008 + ), ( + 597101930, + 7, + 2, + 3009 + ), ( + 597101930, + 7, + 2, + 3010 + ), ( + 597101930, + 7, + 2, + 3011 + ), ( + 666307882, + 7, + 1, + 1231 + ), ( + 666307882, + 7, + 1, + 1230 + ), ( + 666307882, + 7, + 1, + 1232 + ), ( + 666307882, + 7, + 2, + 1233 + ), ( + 666307882, + 7, + 2, + 1234 + ), ( + 666307882, + 7, + 2, + 1235 + ), ( + 666307882, + 7, + 2, + 1236 + ), ( + 666307882, + 7, + 2, + 1237 + ), ( + 766534669, + 7, + 1, + 1708 + ), ( + 766534669, + 7, + 1, + 1709 + ), ( + 766534669, + 7, + 1, + 1713 + ), ( + 766534669, + 7, + 1, + 1714 + ), ( + 766534669, + 7, + 1, + 1715 + ), ( + 766534669, + 7, + 4, + 1718 + ), ( + 766534669, + 7, + 2, + 1719 + ), ( + 766534669, + 7, + 2, + 1720 + ), ( + 
+    (766534669, 7, 2, 1721), (766534669, 7, 2, 1722),
+    (901015731, 7, 1, 1821), (901015731, 7, 1, 1822), (901015731, 7, 1, 1820), (901015731, 7, 1, 1819), (901015731, 7, 1, 1828), (901015731, 7, 1, 1829),
+    (901015731, 7, 2, 1836), (901015731, 7, 2, 1837), (901015731, 7, 2, 1838), (901015731, 7, 2, 1839), (901015731, 7, 2, 1840),
+    (123182040, 7, 1, 1826), (123182040, 7, 1, 1827), (123182040, 7, 1, 1830), (123182040, 7, 1, 1831), (123182040, 7, 1, 1832), (123182040, 7, 1, 1833), (123182040, 7, 1, 1834), (123182040, 7, 1, 1835),
+    (123182040, 7, 2, 1841), (123182040, 7, 2, 1842), (123182040, 7, 2, 1843), (123182040, 7, 2, 1844), (123182040, 7, 2, 1845),
+    (649143728, 7, 1, 2013), (649143728, 7, 1, 2014), (649143728, 7, 1, 2011), (649143728, 7, 1, 2012),
+    (649143728, 7, 2, 2021), (649143728, 7, 2, 2022), (649143728, 7, 2, 2023), (649143728, 7, 2, 2024), (649143728, 7, 2, 2025),
+    (309779081, 7, 1, 2097), (309779081, 7, 1, 2098), (309779081, 7, 1, 2103), (309779081, 7, 1, 2104),
+    (309779081, 7, 2, 2116), (309779081, 7, 2, 2117), (309779081, 7, 2, 2118), (309779081, 7, 2, 2119), (309779081, 7, 2, 2120),
+    (702532628, 7, 1, 2559), (702532628, 7, 1, 2560), (702532628, 7, 1, 2565), (702532628, 7, 1, 2566),
+    (702532628, 7, 2, 2573), (702532628, 7, 2, 2574), (702532628, 7, 2, 2575), (702532628, 7, 2, 2576), (702532628, 7, 2, 2577),
+    (848792567, 7, 1, 4784), (848792567, 7, 1, 4783), (848792567, 7, 1, 4785), (848792567, 7, 1, 4786),
+    (848792567, 7, 2, 4787), (848792567, 7, 2, 4788), (848792567, 7, 2, 4789), (848792567, 7, 2, 4790), (848792567, 7, 2, 4791),
+    (848661495, 7, 1, 3252), (848661495, 7, 1, 3251), (848661495, 7, 1, 3249), (848661495, 7, 1, 3250),
+    (848661495, 7, 2, 3263), (848661495, 7, 2, 3264), (848661495, 7, 2, 3265), (848661495, 7, 2, 3266), (848661495, 7, 2, 3267),
+    (704499284, 7, 1, 5133), (704499284, 7, 1, 5134), (704499284, 7, 1, 5135), (704499284, 7, 1, 5136),
+    (704499284, 7, 2, 5139), (704499284, 7, 2, 5140), (704499284, 7, 2, 5141), (704499284, 7, 2, 5142), (704499284, 7, 2, 5143),
+    (157276444, 7, 1, 4078), (157276444, 7, 1, 4079), (157276444, 7, 1, 4076), (157276444, 7, 1, 4077),
+    (157276444, 7, 2, 4080), (157276444, 7, 2, 4081), (157276444, 7, 2, 4082), (157276444, 7, 2, 4083), (157276444, 7, 2, 4084),
+    (209318802, 7, 1, 6422), (209318802, 7, 1, 6423), (209318802, 7, 1, 6420), (209318802, 7, 1, 6421), (209318802, 7, 1, 6424), (209318802, 7, 1, 6425),
+    (209318802, 7, 2, 6426), (209318802, 7, 2, 6427), (209318802, 7, 2, 6428), (209318802, 7, 2, 6429), (209318802, 7, 2, 6430),
+    (494917916, 7, 2, 2583), (494917916, 7, 2, 2584), (494917916, 7, 2, 2585), (494917916, 7, 2, 2586), (494917916, 7, 2, 2587),
+    (477754258, 7, 1, 5626), (477754258, 7, 1, 5627), (477754258, 7, 1, 5628), (477754258, 7, 1, 5629), (477754258, 7, 1, 5630), (477754258, 7, 1, 5631), (477754258, 7, 1, 5632), (477754258, 7, 1, 5633), (477754258, 7, 1, 5634), (477754258, 7, 1, 5635),
+    (477754258, 7, 2, 5636), (477754258, 7, 2, 5637), (477754258, 7, 2, 5638), (477754258, 7, 2, 5639), (477754258, 7, 2, 5640),
+    (257005151, 7, 1, 6403), (257005151, 7, 1, 6404), (257005151, 7, 1, 6405), (257005151, 7, 1, 6406), (257005151, 7, 1, 6407), (257005151, 7, 1, 6408),
+    (257005151, 7, 1, 6409), (257005151, 7, 1, 6410), (257005151, 7, 1, 6411), (257005151, 7, 1, 6412), (257005151, 7, 1, 6413), (257005151, 7, 1, 6414),
+    (257005151, 7, 2, 6415), (257005151, 7, 2, 6416), (257005151, 7, 2, 6417), (257005151, 7, 2, 6418), (257005151, 7, 2, 6419),
+    (615803171, 7, 1, 6476), (615803171, 7, 1, 6477), (615803171, 7, 1, 6478), (615803171, 7, 1, 6479), (615803171, 7, 1, 6480), (615803171, 7, 1, 6481),
+    (615803171, 7, 1, 6482), (615803171, 7, 1, 6483), (615803171, 7, 1, 6484), (615803171, 7, 1, 6485), (615803171, 7, 1, 6486), (615803171, 7, 1, 6487),
+    (615803171, 7, 2, 6438), (615803171, 7, 2, 6439), (615803171, 7, 2, 6440), (615803171, 7, 2, 6441), (615803171, 7, 2, 6442),
+    (125278360, 7, 2, 3586), (125278360, 7, 2, 3587), (125278360, 7, 2, 3588), (125278360, 7, 2, 3589), (125278360, 7, 2, 3590),
+    (1066502091, 7, 2, 4042), (1066502091, 7, 2, 4043), (1066502091, 7, 2, 4044), (1066502091, 7, 2, 4045), (1066502091, 7, 2, 4046),
+    (1037946068, 7, 2, 6443), (1037946068, 7, 2, 6444), (1037946068, 7, 2, 6445), (1037946068, 7, 2, 6446), (1037946068, 7, 2, 6447),
+    (919839894, 7, 1, 7188), (919839894, 7, 1, 7189), (919839894, 7, 1, 7190), (919839894, 7, 1, 7191),
+    (919839894, 7, 2, 7192), (919839894, 7, 2, 7193), (919839894, 7, 2, 7194), (919839894, 7, 2, 7195), (919839894, 7, 2, 7196),
+    (477131813, 7, 1, 7215), (477131813, 7, 1, 7216), (477131813, 7, 1, 7217), (477131813, 7, 1, 7218),
+    (477131813, 7, 2, 7093), (477131813, 7, 2, 7094), (477131813, 7, 2, 7095), (477131813, 7, 2, 7096), (477131813, 7, 2, 7097),
+    (615934211, 7, 1, 7206), (615934211, 7, 1, 7207), (615934211, 7, 1, 7208), (615934211, 7, 1, 7209),
+    (615934211, 7, 2, 7210), (615934211, 7, 2, 7211), (615934211, 7, 2, 7212), (615934211, 7, 2, 7213), (615934211, 7, 2, 7214),
+    (647465687, 7, 1, 7197), (647465687, 7, 1, 7198), (647465687, 7, 1, 7199), (647465687, 7, 1, 7200),
+    (647465687, 7, 2, 7201), (647465687, 7, 2, 7202), (647465687, 7, 2, 7203), (647465687, 7, 2, 7204), (647465687, 7, 2, 7205),
+    (257005183, 7, 1, 3254), (257005183, 7, 1, 3253), (257005183, 7, 1, 3255), (257005183, 7, 1, 3256), (257005183, 7, 1, 3257), (257005183, 7, 1, 3258),
+    (257005183, 7, 1, 3259), (257005183, 7, 1, 6396), (257005183, 7, 1, 3260), (257005183, 7, 1, 3261), (257005183, 7, 1, 3262), (257005183, 7, 1, 6397),
+    (257005183, 7, 2, 3273), (257005183, 7, 2, 3274), (257005183, 7, 2, 3275), (257005183, 7, 2, 3276), (257005183, 7, 2, 3277), (257005183, 7, 2, 3278), (257005183, 7, 2, 3279),
+    (257005183, 7, 2, 3280), (257005183, 7, 2, 3281), (257005183, 7, 2, 3282), (257005183, 7, 2, 7447), (257005183, 7, 2, 7448), (257005183, 7, 2, 8881), (257005183, 7, 2, 8882),
+    (649274608, 7, 1, 3607), (649274608, 7, 1, 3608), (649274608, 7, 1, 3593), (649274608, 7, 1, 3594), (649274608, 7, 1, 3595), (649274608, 7, 1, 3596),
+    (649274608, 7, 2, 3597), (649274608, 7, 2, 3598), (649274608, 7, 2, 3599), (649274608, 7, 2, 3600), (649274608, 7, 2, 3601),
+    (649274608, 7, 2, 3602), (649274608, 7, 2, 3603), (649274608, 7, 2, 3604), (649274608, 7, 2, 3605), (649274608, 7, 2, 3606),
+    (680601073, 7, 1, 4014), (680601073, 7, 1, 4015), (680601073, 7, 1, 4016), (680601073, 7, 1, 4017), (680601073, 7, 1, 4018),
+    (680601073, 7, 1, 4019), (680601073, 7, 1, 4020), (680601073, 7, 1, 4021), (680601073, 7, 1, 4022), (680601073, 7, 1, 4023),
+    (680601073, 7, 2, 4024), (680601073, 7, 2, 4025), (680601073, 7, 2, 4026), (680601073, 7, 2, 4027), (680601073, 7, 2, 4028),
+    (680601073, 7, 2, 4029), (680601073, 7, 2, 4030), (680601073, 7, 2, 4031), (680601073, 7, 2, 4032), (680601073, 7, 2, 4033),
+    (142209810, 7, 1, 4803), (142209810, 7, 1, 4804), (142209810, 7, 1, 4805), (142209810, 7, 1, 4806), (142209810, 7, 1, 4807),
+    (142209810, 7, 1, 4808), (142209810, 7, 1, 4809), (142209810, 7, 1, 4810), (142209810, 7, 1, 4811), (142209810, 7, 1, 4812),
+    (142209810, 7, 2, 4813), (142209810, 7, 2, 4814), (142209810, 7, 2, 4815), (142209810, 7, 2, 4816), (142209810, 7, 2, 4817),
+    (142209810, 7, 2, 4818), (142209810, 7, 2, 4819), (142209810, 7, 2, 4820), (142209810, 7, 2, 4821), (142209810, 7, 2, 4822);
+
+END;
diff --git a/server/migrations/seed/DivaDefaults.sql b/server/migrations/seed/DivaDefaults.sql
new file mode 100644
index 000000000..30a388e65
--- /dev/null
+++ b/server/migrations/seed/DivaDefaults.sql
@@ -0,0 +1,32 @@
+-- Diva Defense default prize rewards.
+-- Personal track: type='personal', quantity=1 per milestone.
+-- Guild track: type='guild', quantity=5 per milestone.
+-- item_type=26 is Diva Coins; item_id=0 for all.
+INSERT INTO diva_prizes (type, points_req, item_type, item_id, quantity, gr, repeatable) VALUES
+    ('personal', 500000, 26, 0, 1, false, false),
+    ('personal', 1000000, 26, 0, 1, false, false),
+    ('personal', 2000000, 26, 0, 1, false, false),
+    ('personal', 3000000, 26, 0, 1, false, false),
+    ('personal', 5000000, 26, 0, 1, false, false),
+    ('personal', 7000000, 26, 0, 1, false, false),
+    ('personal', 10000000, 26, 0, 1, false, false),
+    ('personal', 15000000, 26, 0, 1, false, false),
+    ('personal', 20000000, 26, 0, 1, false, false),
+    ('personal', 30000000, 26, 0, 1, false, false),
+    ('personal', 50000000, 26, 0, 1, false, false),
+    ('personal', 70000000, 26, 0, 1, false, false),
+    ('personal', 100000000, 26, 0, 1, false, false),
+    ('guild', 500000, 26, 0, 5, false, false),
+    ('guild', 1000000, 26, 0, 5, false, false),
+    ('guild', 2000000, 26, 0, 5, false, false),
+    ('guild', 3000000, 26, 0, 5, false, false),
+    ('guild', 5000000, 26, 0, 5, false, false),
+    ('guild', 7000000, 26, 0, 5, false, false),
+    ('guild', 10000000, 26, 0, 5, false, false),
+    ('guild', 15000000, 26, 0, 5, false, false),
+    ('guild', 20000000, 26, 0, 5, false, false),
+    ('guild', 30000000, 26, 0, 5, false, false),
+    ('guild', 50000000, 26, 0, 5, false, false),
+    ('guild', 70000000, 26, 0, 5, false, false),
+    ('guild', 100000000, 26, 0, 5, false, false)
+ON CONFLICT DO NOTHING;
diff --git a/server/migrations/seed/TournamentDefaults.sql b/server/migrations/seed/TournamentDefaults.sql
new file mode 100644
index 000000000..7ee9083fb
--- /dev/null
+++ b/server/migrations/seed/TournamentDefaults.sql
@@ -0,0 +1,62 @@
+-- Tournament #150 default data.
+-- One tournament is inserted that starts immediately and has a wide window so operators
+-- can adjust the timestamps after installation. The sub-events and cups are seeded
+-- idempotently via WHERE NOT EXISTS guards, so rerunning this file is a no-op.
+-- Cup groups: 16 = speed hunt (Brachydios variants), 17 = guild hunt, 6 = fishing size.
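The four epoch timestamps seeded here are what the server's phase-state computation works from. As a rough illustration only — the phase names and boundary semantics below are assumptions, not taken from the `EnumerateRanking` handler — the current phase of each tournament could be derived directly in SQL:

```sql
-- Illustrative sketch; phase names and boundary conditions are assumed,
-- not the handler's actual logic.
SELECT id, name,
       CASE
           WHEN EXTRACT(epoch FROM NOW())::bigint < start_time  THEN 'upcoming'
           WHEN EXTRACT(epoch FROM NOW())::bigint < entry_end   THEN 'entry'
           WHEN EXTRACT(epoch FROM NOW())::bigint < ranking_end THEN 'ranking'
           WHEN EXTRACT(epoch FROM NOW())::bigint < reward_end  THEN 'reward'
           ELSE 'finished'
       END AS phase
FROM tournaments;
```

With the defaults seeded below (entry ends at +3 days, ranking at +13, rewards at +20), a fresh install starts in the entry phase.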
+-- Cup types: 7 = speed hunt, 6 = fishing size.
+
+BEGIN;
+
+-- Default tournament (always active on a fresh install).
+-- start_time = now, entry_end = +3 days, ranking_end = +13 days, reward_end = +20 days.
+INSERT INTO tournaments (name, start_time, entry_end, ranking_end, reward_end)
+SELECT
+    'Tournament #150',
+    EXTRACT(epoch FROM NOW())::bigint,
+    EXTRACT(epoch FROM NOW() + INTERVAL '3 days')::bigint,
+    EXTRACT(epoch FROM NOW() + INTERVAL '13 days')::bigint,
+    EXTRACT(epoch FROM NOW() + INTERVAL '20 days')::bigint
+WHERE NOT EXISTS (SELECT 1 FROM tournaments);
+
+-- Sub-events (shared across tournaments; NOT tournament-specific).
+-- CupGroup 16: Speed hunt Brachydios variants (event_sub_type 0-14, quest_file_id 60691).
+INSERT INTO tournament_sub_events (cup_group, event_sub_type, quest_file_id, name)
+SELECT * FROM (VALUES
+    (16::smallint, 0::smallint, 60691, 'ブラキディオス'),
+    (16::smallint, 1::smallint, 60691, 'ブラキディオス'),
+    (16::smallint, 2::smallint, 60691, 'ブラキディオス'),
+    (16::smallint, 3::smallint, 60691, 'ブラキディオス'),
+    (16::smallint, 4::smallint, 60691, 'ブラキディオス'),
+    (16::smallint, 5::smallint, 60691, 'ブラキディオス'),
+    (16::smallint, 6::smallint, 60691, 'ブラキディオス'),
+    (16::smallint, 7::smallint, 60691, 'ブラキディオス'),
+    (16::smallint, 8::smallint, 60691, 'ブラキディオス'),
+    (16::smallint, 9::smallint, 60691, 'ブラキディオス'),
+    (16::smallint, 10::smallint, 60691, 'ブラキディオス'),
+    (16::smallint, 11::smallint, 60691, 'ブラキディオス'),
+    (16::smallint, 12::smallint, 60691, 'ブラキディオス'),
+    (16::smallint, 13::smallint, 60691, 'ブラキディオス'),
+    (16::smallint, 14::smallint, 60691, 'ブラキディオス'),
+    -- CupGroup 17: Guild hunt Brachydios (event_sub_type -1)
+    (17::smallint, -1::smallint, 60690, 'ブラキディオスギルド'),
+    -- CupGroup 6: Fishing size categories
+    (6::smallint, 234::smallint, 0, 'キレアジ'),
+    (6::smallint, 237::smallint, 0, 'ハリマグロ'),
+    (6::smallint, 239::smallint, 0, 'カクサンデメキン')
+) AS v(cup_group, event_sub_type, quest_file_id, name)
+WHERE NOT EXISTS (SELECT 1 FROM tournament_sub_events);
+
+-- Cups for the default tournament.
+-- cup_type 7 = speed hunt, cup_type 6 = fishing size.
+INSERT INTO tournament_cups (tournament_id, cup_group, cup_type, unk, name, description)
+SELECT t.id, v.cup_group, v.cup_type, v.unk, v.name, v.description
+FROM tournaments t
+CROSS JOIN (VALUES
+    (16::smallint, 7::smallint, 0::smallint, 'スピードハントカップ', 'ブラキディオスをより速く狩れ'),
+    (17::smallint, 7::smallint, 0::smallint, 'ギルドハントカップ', 'ブラキディオスをギルドで狩れ'),
+    (6::smallint, 6::smallint, 0::smallint, 'フィッシングサイズカップ', '大きな魚を釣れ')
+) AS v(cup_group, cup_type, unk, name, description)
+WHERE NOT EXISTS (SELECT 1 FROM tournament_cups WHERE tournament_id = t.id)
+ORDER BY t.id;
+
+COMMIT;
diff --git a/server/migrations/sql/0001_init.sql b/server/migrations/sql/0001_init.sql
index 3f15fdd23..92f3aa57d 100644
--- a/server/migrations/sql/0001_init.sql
+++ b/server/migrations/sql/0001_init.sql
@@ -998,7 +998,7 @@ CREATE TABLE public.guilds (
     pugi_name_1 character varying(12) DEFAULT ''::character varying,
     pugi_name_2 character varying(12) DEFAULT ''::character varying,
     pugi_name_3 character varying(12) DEFAULT ''::character varying,
-    recruiting boolean DEFAULT true NOT NULL,
+    recruiting boolean DEFAULT false NOT NULL,
     pugi_outfit_1 integer DEFAULT 0 NOT NULL,
     pugi_outfit_2 integer DEFAULT 0 NOT NULL,
     pugi_outfit_3 integer DEFAULT 0 NOT NULL,
diff --git a/server/migrations/sql/0004_alliance_recruiting.sql b/server/migrations/sql/0004_alliance_recruiting.sql
index 500b182dd..1824a1068 100644
--- a/server/migrations/sql/0004_alliance_recruiting.sql
+++ b/server/migrations/sql/0004_alliance_recruiting.sql
@@ -1 +1 @@
-ALTER TABLE public.guild_alliances ADD COLUMN IF NOT EXISTS recruiting boolean NOT NULL DEFAULT true;
+ALTER TABLE public.guild_alliances ADD COLUMN IF NOT EXISTS recruiting boolean NOT NULL DEFAULT false;
diff --git a/server/migrations/sql/0016_campaign.sql b/server/migrations/sql/0016_campaign.sql
new file mode 100644
index 000000000..84cba9f80
--- /dev/null
+++ b/server/migrations/sql/0016_campaign.sql
@@ -0,0 +1,66 @@
+-- Campaign / Event Tent system tables.
+CREATE TABLE IF NOT EXISTS public.campaigns (
+    id INTEGER PRIMARY KEY,
+    min_hr INTEGER,
+    max_hr INTEGER,
+    min_sr INTEGER,
+    max_sr INTEGER,
+    min_gr INTEGER,
+    max_gr INTEGER,
+    reward_type INTEGER,
+    stamps INTEGER,
+    receive_type INTEGER,
+    background_id INTEGER,
+    start_time TIMESTAMP WITH TIME ZONE,
+    end_time TIMESTAMP WITH TIME ZONE,
+    title TEXT,
+    reward TEXT,
+    link TEXT,
+    code_prefix TEXT
+);
+
+CREATE TABLE IF NOT EXISTS public.campaign_categories (
+    id SERIAL PRIMARY KEY,
+    type INTEGER,
+    title TEXT,
+    description TEXT
+);
+
+CREATE TABLE IF NOT EXISTS public.campaign_category_links (
+    id SERIAL PRIMARY KEY,
+    campaign_id INTEGER REFERENCES public.campaigns(id) ON DELETE CASCADE,
+    category_id INTEGER REFERENCES public.campaign_categories(id) ON DELETE CASCADE
+);
+
+CREATE TABLE IF NOT EXISTS public.campaign_rewards (
+    id SERIAL PRIMARY KEY,
+    campaign_id INTEGER REFERENCES public.campaigns(id) ON DELETE CASCADE,
+    item_type INTEGER,
+    quantity INTEGER,
+    item_id INTEGER
+);
+
+CREATE TABLE IF NOT EXISTS public.campaign_rewards_claimed (
+    character_id INTEGER REFERENCES public.characters(id) ON DELETE CASCADE,
+    reward_id INTEGER REFERENCES public.campaign_rewards(id) ON DELETE CASCADE,
+    PRIMARY KEY (character_id, reward_id)
+);
+
+CREATE TABLE IF NOT EXISTS public.campaign_state (
+    id SERIAL PRIMARY KEY,
+    campaign_id INTEGER REFERENCES public.campaigns(id) ON DELETE CASCADE,
+    character_id INTEGER REFERENCES public.characters(id) ON DELETE CASCADE,
+    code TEXT
+);
+
+CREATE TABLE IF NOT EXISTS public.campaign_codes (
+    code TEXT PRIMARY KEY,
+    campaign_id INTEGER REFERENCES public.campaigns(id) ON DELETE CASCADE,
+    multi BOOLEAN NOT NULL DEFAULT FALSE
+);
+
+CREATE TABLE IF NOT EXISTS public.campaign_quest (
+    campaign_id INTEGER REFERENCES public.campaigns(id) ON DELETE CASCADE,
+    character_id INTEGER REFERENCES public.characters(id) ON DELETE CASCADE,
+    PRIMARY KEY (campaign_id, character_id)
+);
diff --git a/server/migrations/sql/0017_diva.sql b/server/migrations/sql/0017_diva.sql
new file mode 100644
index 000000000..6cb891db7
--- /dev/null
+++ b/server/migrations/sql/0017_diva.sql
@@ -0,0 +1,44 @@
+-- Diva Defense (United Defense) extended schema.
+-- Adds bead selection, per-bead point accumulation, interception points,
+-- and prize reward tables for personal and guild tracks.
+
+-- Interception map data per guild (binary blob, existing column pattern).
+ALTER TABLE guilds ADD COLUMN IF NOT EXISTS interception_maps bytea;
+
+-- Per-character interception points keyed by quest file ID.
+ALTER TABLE guild_characters ADD COLUMN IF NOT EXISTS interception_points jsonb NOT NULL DEFAULT '{}';
+
+-- Prize reward table for personal and guild tracks.
+CREATE TABLE IF NOT EXISTS diva_prizes (
+    id SERIAL PRIMARY KEY,
+    type VARCHAR(10) NOT NULL CHECK (type IN ('personal', 'guild')),
+    points_req INTEGER NOT NULL,
+    item_type INTEGER NOT NULL,
+    item_id INTEGER NOT NULL,
+    quantity INTEGER NOT NULL,
+    gr BOOLEAN NOT NULL DEFAULT false,
+    repeatable BOOLEAN NOT NULL DEFAULT false
+);
+
+-- Active bead types for the current Diva Defense event.
+CREATE TABLE IF NOT EXISTS diva_beads (
+    id SERIAL PRIMARY KEY,
+    type INTEGER NOT NULL
+);
+
+-- Per-character bead slot assignments with expiry.
+CREATE TABLE IF NOT EXISTS diva_beads_assignment (
+    id SERIAL PRIMARY KEY,
+    character_id INTEGER NOT NULL REFERENCES characters(id) ON DELETE CASCADE,
+    bead_index INTEGER NOT NULL,
+    expiry TIMESTAMPTZ NOT NULL
+);
+
+-- Per-character bead point accumulation log.
+CREATE TABLE IF NOT EXISTS diva_beads_points (
+    id SERIAL PRIMARY KEY,
+    character_id INTEGER NOT NULL REFERENCES characters(id) ON DELETE CASCADE,
+    bead_index INTEGER NOT NULL,
+    points INTEGER NOT NULL,
+    timestamp TIMESTAMPTZ NOT NULL DEFAULT NOW()
+);
diff --git a/server/migrations/sql/0018_guild_invites.sql b/server/migrations/sql/0018_guild_invites.sql
new file mode 100644
index 000000000..64d91de19
--- /dev/null
+++ b/server/migrations/sql/0018_guild_invites.sql
@@ -0,0 +1,19 @@
+-- Dedicated table for guild-initiated scout invitations, separate from
+-- player-initiated applications. This gives each invitation a real serial PK
+-- so the client's InvitationID field can map to an actual database row
+-- instead of being aliased to the character ID.
+CREATE TABLE IF NOT EXISTS guild_invites (
+    id serial PRIMARY KEY,
+    guild_id integer REFERENCES guilds(id),
+    character_id integer REFERENCES characters(id),
+    actor_id integer REFERENCES characters(id),
+    created_at timestamptz NOT NULL DEFAULT now()
+);
+
+-- Migrate any existing scout invitations from guild_applications.
+INSERT INTO guild_invites (guild_id, character_id, actor_id, created_at)
+SELECT guild_id, character_id, actor_id, COALESCE(created_at, now())
+FROM guild_applications
+WHERE application_type = 'invited';
+
+DELETE FROM guild_applications WHERE application_type = 'invited';
diff --git a/server/migrations/sql/0019_save_transfer.sql b/server/migrations/sql/0019_save_transfer.sql
new file mode 100644
index 000000000..760709353
--- /dev/null
+++ b/server/migrations/sql/0019_save_transfer.sql
@@ -0,0 +1,6 @@
+-- Save transfer tokens: one-time admin-granted permission for a character
+-- to receive an imported save via the API endpoint.
+-- NULL means no import is pending for this character.
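With invitations in their own table, the scout list can be built from real primary keys rather than character-ID aliases. A hypothetical lookup over the migrated `guild_invites` table (illustrative only, not the handler's actual statement):

```sql
-- Hypothetical scout-list query; the EnumerateGuildScout handler's
-- real statement may differ.
SELECT i.id AS invitation_id, i.character_id, i.actor_id, i.created_at
FROM guild_invites i
WHERE i.guild_id = $1
ORDER BY i.created_at DESC;
```

Because `id` is a serial PK, `CancelGuildScout` can delete by invitation ID alone instead of guessing at a character ID.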
+ALTER TABLE characters
+    ADD COLUMN IF NOT EXISTS savedata_import_token TEXT,
+    ADD COLUMN IF NOT EXISTS savedata_import_token_expiry TIMESTAMPTZ;
diff --git a/server/migrations/sql/0020_return_guilds.sql b/server/migrations/sql/0020_return_guilds.sql
new file mode 100644
index 000000000..afed5b5e2
--- /dev/null
+++ b/server/migrations/sql/0020_return_guilds.sql
@@ -0,0 +1 @@
+ALTER TABLE public.guilds ADD COLUMN IF NOT EXISTS return_type SMALLINT NOT NULL DEFAULT 0;
diff --git a/server/migrations/sql/0021_tournament.sql b/server/migrations/sql/0021_tournament.sql
new file mode 100644
index 000000000..692e3849b
--- /dev/null
+++ b/server/migrations/sql/0021_tournament.sql
@@ -0,0 +1,44 @@
+CREATE TABLE IF NOT EXISTS tournaments (
+    id SERIAL PRIMARY KEY,
+    name VARCHAR(64) NOT NULL,
+    start_time BIGINT NOT NULL,
+    entry_end BIGINT NOT NULL,
+    ranking_end BIGINT NOT NULL,
+    reward_end BIGINT NOT NULL
+);
+
+CREATE TABLE IF NOT EXISTS tournament_cups (
+    id SERIAL PRIMARY KEY,
+    tournament_id INTEGER NOT NULL REFERENCES tournaments(id) ON DELETE CASCADE,
+    cup_group SMALLINT NOT NULL,
+    cup_type SMALLINT NOT NULL,
+    unk SMALLINT NOT NULL DEFAULT 0,
+    name VARCHAR(64) NOT NULL,
+    description TEXT NOT NULL DEFAULT ''
+);
+
+CREATE TABLE IF NOT EXISTS tournament_sub_events (
+    id SERIAL PRIMARY KEY,
+    cup_group SMALLINT NOT NULL,
+    event_sub_type SMALLINT NOT NULL DEFAULT 0,
+    quest_file_id INTEGER NOT NULL DEFAULT 0,
+    name VARCHAR(64) NOT NULL
+);
+
+CREATE TABLE IF NOT EXISTS tournament_entries (
+    id SERIAL PRIMARY KEY,
+    char_id INTEGER NOT NULL REFERENCES characters(id) ON DELETE CASCADE,
+    tournament_id INTEGER NOT NULL REFERENCES tournaments(id) ON DELETE CASCADE,
+    registered_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
+    UNIQUE (char_id, tournament_id)
+);
+
+CREATE TABLE IF NOT EXISTS tournament_results (
+    id SERIAL PRIMARY KEY,
+    char_id INTEGER NOT NULL REFERENCES characters(id) ON DELETE CASCADE,
+    tournament_id INTEGER NOT NULL REFERENCES tournaments(id) ON DELETE CASCADE,
+    event_id INTEGER NOT NULL,
+    quest_slot INTEGER NOT NULL DEFAULT 0,
+    stage_handle INTEGER NOT NULL DEFAULT 0,
+    submitted_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
+);
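The changelog describes `EnumerateOrder` as returning per-event leaderboards ranked by submission time. A sketch of the ordering these tables support — the shape is assumed, not the handler's actual query:

```sql
-- Assumed leaderboard shape over tournament_results;
-- the real EnumerateOrder statement may differ.
SELECT r.char_id,
       r.submitted_at,
       RANK() OVER (PARTITION BY r.event_id ORDER BY r.submitted_at) AS place
FROM tournament_results r
WHERE r.tournament_id = $1
ORDER BY r.event_id, place;
```

The `UNIQUE (char_id, tournament_id)` constraint on `tournament_entries` keeps registration one-per-character, while `tournament_results` deliberately allows multiple run submissions per entrant.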