diff --git a/.gitignore b/.gitignore index f138959b..194c5d0c 100644 --- a/.gitignore +++ b/.gitignore @@ -44,3 +44,7 @@ storybook-static/* .sedbak **/tsconfig.tsbuildinfo + +AGENT.md +codex.md +CLAUDE.md diff --git a/LLM.md b/LLM.md new file mode 100644 index 00000000..6c1b92e4 --- /dev/null +++ b/LLM.md @@ -0,0 +1,350 @@ +# Lux Network MPC Bridge Architecture + +This document provides a comprehensive overview of the Lux Network MPC Bridge project, its components, and how they interact. This document distinguishes between **current implementation** and **planned features**. + +## Project Overview + +The Lux Network Bridge is a decentralized cross-chain bridge that uses Multi-Party Computation (MPC) to enable secure asset transfers between different blockchain networks. The bridge consists of several key components: + +1. **Smart Contracts**: EVM-compatible contracts deployed on various networks +2. **MPC Nodes**: Distributed nodes that use threshold signatures for secure transaction signing +3. **Bridge UI**: Web interface for users to initiate cross-chain transfers +4. **Backend Services**: APIs and services that coordinate the bridge operations +5. **Blockchain Monitors**: Services that monitor different blockchains (EVM and non-EVM) for events + +## Project Structure + +The project is organized as a monorepo with the following main directories: + +- `app/`: Frontend applications + - `bridge/`: Main bridge UI application (Next.js) + - `bridge3/`: New version of the bridge UI + - `explorer/`: Block explorer UI + - `server/`: Backend API services +- `contracts/`: Smart contracts for the bridge + - `contracts/`: Solidity smart contracts for various chains + - `ignition/`: Deployment modules for the contracts + - `scripts/`: Utility scripts for contract interactions +- `mpc-nodes/`: MPC node implementation + - `docker/`: Docker configuration for running MPC nodes + - `k8s.examples/`: Kubernetes deployment examples +- `pkg/`: Shared packages and utilities + - `luxfi-core/`: Core shared types and utilities + - `settings/`: Configuration settings + - `utila/`: Utility functions and helpers +- `docs/`: Documentation and guides + - `unified-mpc-library.md`: Details on planned MPC implementation + - `utxo-guide.md`: Guide for planned UTXO-based chain integration + - `eddsa-guide.md`: Guide for planned EdDSA signature implementation + +## Key Components + +### Smart Contracts + +The bridge uses several key smart contracts: + +1. **Bridge.sol**: The main contract that handles the teleport operations, including minting, burning, and vault interactions. +2. **ERC20B.sol**: Bridgeable ERC20 token implementation that supports the bridge-specific operations. +3. **LuxVault.sol**: Vault contract that securely holds tokens during the bridging process. +4. **ETHVault.sol**: Specialized vault for handling native ETH. + +The contracts support multiple blockchain networks, including: +- Ethereum (mainnet and testnets) +- BSC (Binance Smart Chain) +- Lux Network +- Zoo Network +- Base +- Polygon +- Avalanche +- XRP Ledger (XRPL) +- Many other EVM-compatible chains + +### MPC Nodes + +The MPC (Multi-Party Computation) nodes are a distributed network of servers that collectively sign transactions without any single node having access to the complete private key. Key features: + +1. **Decentralized oracle operations using MPC** +2. **Decentralized permissioning using MPC** +3. **Zero-knowledge transactions**: Signers don't know details about assets being teleported +4. **Multi-chain monitoring**: Nodes monitor various blockchains, including both EVM-compatible chains (like Ethereum, Binance Smart Chain, etc.) and non-EVM chains (like XRP Ledger) + +The MPC nodes are containerized using Docker and can be deployed on Kubernetes clusters for production environments. + +#### Current MPC Implementation + +- **CGGMP20 Protocol**: The bridge currently uses the CGGMP20 protocol for ECDSA threshold signatures +- **ECDSA Support**: Only ECDSA is currently supported, which works with all EVM-compatible chains + +#### Planned MPC Enhancements + +- **DKLs23 Protocol**: Being evaluated as a possible future update for improved efficiency and security +- **EdDSA Support**: Planned implementation of EdDSA for supporting non-EVM chains like Solana +- **Unified MPC Library**: A planned abstraction layer to unify ECDSA and EdDSA implementations behind a common API + +### Bridge UI + +The bridge UI is a Next.js application that provides: + +1. **Swap interface**: Allows users to initiate cross-chain transfers +2. **Network selection**: Support for multiple source and destination networks +3. **Token selection**: Support for various tokens on each network +4. **Wallet integration**: Connection to various wallets (EVM, Solana, etc.) +5. **Transaction history**: View and track past transactions + +## Bridge Workflow + +The bridge operates through the following workflow: + +1. **User initiates a transfer**: + - User connects their wallet to the bridge UI + - Selects source network, token, amount, destination network, and address + - Confirms the transaction + +2. **Source chain operations**: + - If using a wrapped token: Burns the token on the source chain + - If using a native token: Locks the token in the vault + +3. **MPC node validation**: + - MPC nodes monitor the source chain for bridge events + - For EVM chains, nodes look for BridgeBurned or VaultDeposit events + - For XRPL, nodes look for Payment transactions to the teleporter address + - Validate the transaction and collectively sign the approval + - No single node has the complete private key + +4. **Destination chain operations**: + - If minting a wrapped token: Creates new tokens on the destination chain + - If releasing a native token: Releases tokens from the vault + - Transfers to the recipient address + +5. **Transaction completion**: + - User receives tokens on the destination chain + - UI updates to show transaction status + +## MPC Implementation (Current & Planned) + +### Current Implementation + +The current MPC implementation focuses on ECDSA threshold signatures using the CGGMP20 protocol: + +1. **CGGMP20 Protocol**: + - Secure threshold ECDSA signatures + - Based on Castagnos and Laguillaumie's encryption scheme + - Efficient distributed key generation and signing + +2. **Key Features**: + - Distributed key generation + - Threshold signatures (t-of-n) + - No trusted dealer required + - Asynchronous communication between nodes + +3. **Supported Chains**: + - All EVM-compatible chains + - XRPL (using ECDSA) + +### Planned Enhancements + +The following enhancements are planned for future development: + +1. **DKLs23 Protocol Evaluation**: + - Newer protocol being evaluated for possible implementation + - Improved efficiency and security properties + - Potential replacement or alternative to CGGMP20 + +2. **EdDSA Support** (Planned): + - Implementation of threshold EdDSA signatures + - Support for chains like Solana that use Ed25519 signatures + - Integration with existing MPC infrastructure + +3. **Unified MPC Library** (Planned): + - Abstraction layer to unify ECDSA and EdDSA implementations + - Common API for different signature schemes + - Simplified integration of new blockchains + +4. **UTXO Support** (Planned): + - Support for UTXO-based blockchains like Bitcoin + - UTXO management and transaction building + - Integration with MPC signing + +## Development Environment + +The project uses: + +- **Node.js v20+**: JavaScript runtime +- **pnpm**: Package manager (v9.15.0+) +- **Next.js**: React framework for the UI +- **TypeScript**: For type-safe code +- **Hardhat**: Ethereum development environment for contracts +- **Docker/Kubernetes**: For containerization and deployment of MPC nodes + +## Running Locally + +To run the bridge locally: + +1. Install `pnpm`: https://pnpm.io/installation +2. Install dependencies: `pnpm install` +3. Run the bridge UI: `pnpm dev` + +## Supported Chains and Networks + +### Currently Supported +- **EVM-Compatible**: + - Ethereum (Chain ID: 1) + - Binance Smart Chain (Chain ID: 56) + - Polygon (Chain ID: 137) + - Optimism (Chain ID: 10) + - Arbitrum One (Chain ID: 42161) + - Celo (Chain ID: 42220) + - Base (Chain ID: 8453) + - Avalanche (Chain ID: 43114) + - Zora (Chain ID: 7777777) + - Blast (Chain ID: 81457) + - Linea (Chain ID: 59144) + - Fantom (Chain ID: 250) + - Aurora (Chain ID: 1313161554) + - Gnosis (Chain ID: 100) + - Lux Network (Chain ID: 96369) + - Zoo Network (Chain ID: 200200) + +- **Non-EVM Chains**: + - XRP Ledger (XRPL) Mainnet + +### Planned Support +- **Non-EVM Chains**: + - Solana (pending EdDSA implementation) + - Bitcoin (pending UTXO implementation) + - Avalanche X-Chain (pending UTXO implementation) + +For the most up-to-date list and configuration, refer to the settings file at: +`/mpc-nodes/docker/common/node/src/config/settings.ts` + +## Architecture Decisions + +### MPC Over Traditional Multi-sig + +The bridge uses MPC for enhanced security compared to traditional multi-signature approaches: +- No single entity can compromise the bridge +- Private keys never exist in complete form +- Decentralized validation of cross-chain transfers + +### Vault System + +The vault system allows for: +- Secure custody of assets during the bridge process +- Efficient asset management across chains +- Fee collection mechanism for bridge operations + +### Modular Design + +The project's modular architecture enables: +- Easy addition of new blockchain networks +- Support for different token types +- Scalable infrastructure to handle increasing loads + +## Security Considerations + +The bridge implements multiple security measures: + +1. **Threshold Signatures**: Requires a minimum number of MPC nodes to sign transactions +2. **Transaction Replay Protection**: Prevents replay attacks +3. **Fee Mechanisms**: Discourages spam and funds system maintenance +4. **Validation Checks**: Ensures transactions meet all requirements before execution + +## Adding New Chains + +### Adding a New EVM Chain + +To add a new EVM-compatible chain to the bridge, follow these steps: + +1. **Update Configuration**: + - Edit the configuration file at `/mpc-nodes/docker/common/node/src/config/settings.ts` + - Add a new entry to the `MAIN_NETWORKS` or `TEST_NETWORKS` array with the following information: + - `display_name`: User-friendly name of the network + - `internal_name`: Unique identifier for the network + - `is_testnet`: Boolean indicating if it's a testnet + - `chain_id`: The numeric chain ID + - `teleporter`: Address of the teleporter contract on this chain + - `vault`: Address of the vault contract on this chain + - `node`: RPC endpoint URL for this chain + - `currencies`: Array of supported tokens on this chain + +2. **Deploy Smart Contracts**: + - Deploy the Bridge.sol contract on the new chain + - Deploy the ERC20B.sol contract for bridgeable tokens + - Deploy the LuxVault.sol or ETHVault.sol as needed + - Update the configuration with the new contract addresses + +3. **Update Swap Pairs**: + - Add entries to the `SWAP_PAIRS` object to define which tokens on the new chain can be swapped with tokens on other chains + +4. **Testing**: + - Test transactions from the new chain to existing chains + - Test transactions from existing chains to the new chain + - Verify that tokens can be correctly bridged in both directions + +### Adding a Non-EVM Blockchain (Future) + +Adding a non-EVM blockchain would require additional custom implementation (planned features): + +1. **Update Configuration**: + - Similar to EVM chains, add the configuration to the settings file + - Specify blockchain-specific parameters (like node endpoints and teleporter addresses) + +2. **Implement Blockchain Monitors**: + - In the MPC node, add specialized monitoring for the blockchain events + - For XRPL, the implementation looks for Payment transactions to the teleporter address + - For Solana (planned), would need to monitor for specific program events + +3. **Add Transaction Validation**: + - Implement chain-specific validation of transactions + - For XRPL, validate that the transaction is of type "Payment" + - For Solana (planned), would need to validate program invocations + +4. **Add Chain Libraries**: + - Import and use chain-specific libraries for interacting with the blockchain + - For XRPL, this includes the `xrpl` library + - For Solana (planned), would need to use the `@solana/web3.js` library + +5. **Implement Signature Generation**: + - Add support for generating signatures for minting tokens on destination chains + - For EdDSA chains like Solana (planned), would need to implement EdDSA threshold signatures + +6. **Update UI**: + - Add support in the UI for connecting to the new blockchain's wallets + - Update network selection to include the new blockchain + +7. **Testing**: + - Test transactions from the new blockchain to existing chains + - Test transactions from existing chains to the new blockchain + - Verify that tokens can be correctly bridged in both directions + +## Future Roadmap (Planned Features) + +### EdDSA Support + +Implementation of Edwards-curve Digital Signature Algorithm (EdDSA) threshold signatures to support chains like Solana: + +1. **Protocol Selection**: Evaluation and selection of an appropriate EdDSA threshold signature protocol +2. **Integration with Existing MPC Framework**: Extending the current MPC framework to support EdDSA +3. **Key Generation**: Implementation of distributed key generation for EdDSA +4. **Signature Generation**: Implementation of threshold signatures for EdDSA +5. **Chain Integration**: Support for Solana and other EdDSA-based chains + +### UTXO Support + +Implementation of support for UTXO-based blockchains like Bitcoin and Avalanche X-Chain: + +1. **UTXO Management**: Tracking and management of UTXOs +2. **Transaction Building**: Creation of UTXO-based transactions +3. **MPC Integration**: Using the existing MPC infrastructure for signing UTXO transactions +4. **Monitoring**: Tracking UTXO-based blockchain for events +5. **Sweeping**: Implementation of UTXO sweeping for efficient management + +### DKLs23 Protocol Evaluation + +Evaluation and potential implementation of the DKLs23 protocol for improved efficiency and security: + +1. **Performance Analysis**: Comparison with the current CGGMP20 implementation +2. **Security Analysis**: Evaluation of security properties +3. **Implementation**: Development of a DKLs23-based threshold signature scheme +4. **Integration**: Integration with the existing MPC infrastructure +5. **Testing**: Comprehensive testing to ensure reliability and security diff --git a/app/bridge/package.json b/app/bridge/package.json index b1136274..6db71d4c 100644 --- a/app/bridge/package.json +++ b/app/bridge/package.json @@ -13,7 +13,8 @@ "storybook": "storybook dev -p 6006", "build:storybook": "storybook build", "analyze": "cross-env ANALYZE=true next build", - "lint": "next lint" + "lint": "next lint", + "test": "vitest --config vitest.config.ts" }, "dependencies": { "@eth-optimism/contracts-ts": "^0.15.0", @@ -21,6 +22,8 @@ "@hanzo/ui": "4.2.0", "@headlessui/react": "^1.7.3", "@imtbl/imx-sdk": "2.1.1", + "@ledgerhq/hw-app-xrp": "^6.31.0", + "@ledgerhq/hw-transport-webhid": "^6.30.0", "@loopring-web/loopring-sdk": "3.3.5", "@loopring-web/web3-provider": "1.4.13", "@luxfi/ui": "5.4.1", @@ -83,6 +86,8 @@ "viem": "^2.9.9", "wagmi": "^2.5.19", "web3": "^4.11.1", + "xrpl": "^4.2.5", + "xumm": "^1.8.0", "zksync": "^0.13.1", "zustand": "^4.4.1" }, @@ -99,6 +104,9 @@ "@storybook/nextjs": "^7.4.5", "@storybook/react": "^7.4.5", "@storybook/testing-library": "^0.2.1", + "@testing-library/jest-dom": "^6.6.3", + "@testing-library/react": "^16.3.0", + "@testing-library/react-hooks": "^8.0.1", "@types/bn.js": "^5.1.0", "@types/crypto-js": "^4.1.1", "@types/lodash.merge": "^4.6.9", @@ -122,6 +130,7 @@ "storybook-react-context": "^0.6.0", "tailwindcss": "catalog:", "typescript": "catalog:", + "vitest": "^3.1.3", "webpack-watch-files-plugin": "^1.2.1" } } diff --git a/app/bridge/src/Models/CryptoNetwork.ts b/app/bridge/src/Models/CryptoNetwork.ts index a3c9af0d..58970d0c 100644 --- a/app/bridge/src/Models/CryptoNetwork.ts +++ b/app/bridge/src/Models/CryptoNetwork.ts @@ -8,6 +8,7 @@ export enum NetworkType { TON = "ton", Bitocoin = "btc", Cardano = "cardano", + XRPL = "xrpl", } export type CryptoNetwork = { diff --git a/app/bridge/src/components/icons/Wallets/XRPL.tsx b/app/bridge/src/components/icons/Wallets/XRPL.tsx new file mode 100644 index 00000000..4cb6382d --- /dev/null +++ b/app/bridge/src/components/icons/Wallets/XRPL.tsx @@ -0,0 +1,16 @@ +import React from 'react' +import Image, { type ImageProps } from 'next/image' + +// XRP Ledger icon for wallet display +// XRP Ledger icon for wallet display +export default function XRPLIcon(props: Omit) { + return ( + XRP Ledger + ) +} \ No newline at end of file diff --git a/app/bridge/src/components/lux/teleport/constants/settings.ts b/app/bridge/src/components/lux/teleport/constants/settings.ts index be91c4d5..33c2c7e3 100644 --- a/app/bridge/src/components/lux/teleport/constants/settings.ts +++ b/app/bridge/src/components/lux/teleport/constants/settings.ts @@ -166,6 +166,7 @@ export const SWAP_PAIRS: Record = { LDOGS: ['ZDOGS'], LMRB: ['ZMRB'], LREDO: ['ZREDO'], + LXRP: ['XRP', 'ZXRP'], // Zoo tokens ZOO: ['LZOO'], ZBTC: ['WBTC', 'LBTC'], @@ -197,6 +198,7 @@ export const SWAP_PAIRS: Record = { ZDOGS: ['LDOGS'], ZMRB: ['LMRB'], ZREDO: ['LREDO'], + ZXRP: ['LXRP', 'XRP'], // Evm tokens ETH: ['LETH', 'ZETH'], WETH: ['LETH', 'ZETH'], @@ -239,4 +241,6 @@ export const SWAP_PAIRS: Record = { DOGS: ['LDOGS', 'ZDOGS'], MRB: ['LMRB', 'ZMRB'], REDO: ['LREDO', 'ZREDO'], + // XRP Ledger tokens + XRP: ['LXRP', 'ZXRP'], } diff --git a/app/bridge/src/components/lux/teleport/process.tsx b/app/bridge/src/components/lux/teleport/process.tsx index f00cab34..014cec77 100644 --- a/app/bridge/src/components/lux/teleport/process.tsx +++ b/app/bridge/src/components/lux/teleport/process.tsx @@ -30,7 +30,10 @@ interface IProps { const Form: React.FC = ({ swapId, className }) => { const { networks } = useSettings() - const filteredNetworks = networks.filter((n: CryptoNetwork) => n.type === NetworkType.EVM) + const filteredNetworks = networks.filter( + (n: CryptoNetwork) => + n.type === NetworkType.EVM || n.type === NetworkType.XRPL + ) const [sourceNetwork, setSourceNetwork] = React.useState(undefined) const [sourceAsset, setSourceAsset] = React.useState(undefined) diff --git a/app/bridge/src/components/lux/teleport/swap/SwapDetails.tsx b/app/bridge/src/components/lux/teleport/swap/SwapDetails.tsx index 2f28aeb2..075b8f1d 100644 --- a/app/bridge/src/components/lux/teleport/swap/SwapDetails.tsx +++ b/app/bridge/src/components/lux/teleport/swap/SwapDetails.tsx @@ -7,8 +7,10 @@ import UserTokenDepositor from './progress/TokenDepositor' import TeleportProcessor from './progress/TeleportProcessor' import PayoutProcessor from './progress/PayoutProcessor' import SwapSuccess from './progress/SwapSuccess' +import XrplPayoutProcessor from './progress/XrplPayoutProcessor' import { SwapStatus } from '@/Models/SwapStatus' import type { CryptoNetwork, NetworkCurrency } from '@/Models/CryptoNetwork' +import { NetworkType } from '@/Models/CryptoNetwork' interface IProps { className?: string @@ -62,6 +64,24 @@ const SwapDetails: React.FC = ({ /> ) } else if (swapStatus === SwapStatus.UserPayoutPending) { + // XRPL payout flow + if (destinationNetwork.type === NetworkType.XRPL) { + return ( + + ) + } return ( = ({ }) => { //state const [isMpcSigning, setIsMpcSigning] = React.useState(false) + const [xrpTxId, setXrpTxId] = useState('') //atoms const [userTransferTransaction] = useAtom(userTransferTransactionAtom) const [swapStatus, setSwapStatus] = useAtom(swapStatusAtom) @@ -59,6 +62,7 @@ const TeleportProcessor: React.FC = ({ const { connectWallet } = useWallet() const { serverAPI } = useServerAPI() const { notify } = useNotify() + const { account, connectXumm, connectLedger, sendPayment } = useXrplWallet() const toBurn = React.useMemo( () => @@ -75,8 +79,52 @@ const TeleportProcessor: React.FC = ({ : false, [destinationAsset] ) + // Detect XRPL deposit flow + const isXrpl = sourceNetwork?.type === NetworkType.XRPL + + // Handler for XRP transaction hash input + const handleXrplMpcSignature = async (providedXrpTxId?: string) => { + const txidToSign = providedXrpTxId ?? xrpTxId + if (!txidToSign) { + notify('Enter XRP transaction hash', 'warn') + return + } + try { + setIsMpcSigning(true) + const receiverAddressHash = Web3.utils.keccak256(String(destinationAddress)) + const signData = { + txId: txidToSign, + fromNetworkId: sourceNetwork?.chain_id, + toNetworkId: destinationNetwork?.chain_id, + toTokenAddress: destinationAsset?.contract_address, + msgSignature: '', + receiverAddressHash, + nonce: 0 + } + const { data } = await serverAPI.post(`/api/swaps/getsig`, signData) + if (data.data) { + await serverAPI.post(`/api/swaps/mpcsign/${swapId}`, { + txHash: data.data.signature, + amount: sourceAmount, + from: data.data.mpcSigner, + to: '' + }) + setMpcSignature(data.data.signature) + setSwapStatus('user_payout_pending') + } else { + notify('Failed to get MPC signature for XRP', 'error') + } + } catch (err) { + console.error(err) + notify('XRPL signing failed', 'error') + } finally { + setIsMpcSigning(false) + } + } React.useEffect(() => { + // skip for XRP, handled via manual TX input + if (sourceNetwork?.type === NetworkType.XRPL) return if (isConnecting || !signer) return if (Number(chainId) === Number(sourceNetwork?.chain_id)) { @@ -158,6 +206,82 @@ const TeleportProcessor: React.FC = ({ } } + // XRPL flow: manual transaction hash input + if (isXrpl) { + return ( +
+
+ +
+
+ {!account && ( +
+ + +
+ )} + {account ? ( + + ) : ( + <> + + setXrpTxId(e.target.value)} + /> + + + )} +
+
+ ); + } return (
@@ -282,7 +406,7 @@ const TeleportProcessor: React.FC = ({
- ) + ); } export default TeleportProcessor diff --git a/app/bridge/src/components/lux/teleport/swap/progress/XrplPayoutProcessor.tsx b/app/bridge/src/components/lux/teleport/swap/progress/XrplPayoutProcessor.tsx new file mode 100644 index 00000000..a3d4af32 --- /dev/null +++ b/app/bridge/src/components/lux/teleport/swap/progress/XrplPayoutProcessor.tsx @@ -0,0 +1,169 @@ +import React, { useState } from 'react' +import { useNotify } from '@/context/toast-provider' +import { useAtom } from 'jotai' +import { useServerAPI } from '@/hooks/useServerAPI' +import { useXrplWallet } from '@/hooks/useXrplWallet' +import { swapStatusAtom, bridgeMintTransactionAtom } from '@/store/teleport' +import { xrpToDrops, isValidXrpAddress } from '@/lib/xrpUtils' +import SwapItems from './SwapItems' +import { NetworkType, type CryptoNetwork, type NetworkCurrency } from '@/Models/CryptoNetwork' + +interface IProps { + sourceNetwork: { type: NetworkType; chain_id?: string | number } + destinationNetwork: { type: NetworkType } + destinationAddress: string + sourceAmount: string + swapId: string + className?: string +} + +const XrplPayoutProcessor: React.FC = ({ + sourceNetwork, + destinationNetwork, + destinationAddress, + sourceAmount, + swapId, + className, +}) => { + const { notify } = useNotify() + const { serverAPI } = useServerAPI() + const { account, connectXumm, connectLedger, sendPayment } = useXrplWallet() + const [, setBridgeMintTx] = useAtom(bridgeMintTransactionAtom) + const [, setSwapStatus] = useAtom(swapStatusAtom) + const [isPayout, setIsPayout] = useState(false) + + const handleXrplPayout = async () => { + if (!account) { + notify('Please connect XRPL wallet first', 'warn') + return + } + + // Validate XRP destination address + if (!isValidXrpAddress(destinationAddress)) { + notify('Invalid XRP destination address', 'error') + return + } + try { + setIsPayout(true) + + // XRP uses 6 decimals, convert amount to drops (1 XRP = 1,000,000 drops) + const drops = xrpToDrops(sourceAmount) + + const txid = await sendPayment(drops, destinationAddress) + setBridgeMintTx(txid) + await serverAPI.post(`/api/swaps/payout/${swapId}`, { + txHash: txid, + amount: sourceAmount, + from: account.address, + to: destinationAddress, + }) + setSwapStatus('payout_success') + } catch (err: any) { + console.error('XRPL payment error:', err) + if (err?.message?.includes('timeout')) { + notify('Transaction timeout. The XRP network may be experiencing delays. Please check your XRP wallet for transaction status.', 'error') + } else if (err?.message?.includes('insufficient funds')) { + notify('Insufficient funds in your XRP wallet to complete this transaction.', 'error') + } else if (err?.message?.includes('rejected')) { + notify('Transaction was rejected. Please try again or use a different wallet.', 'error') + } else { + notify(err?.message || 'XRPL payout failed. Please check your XRP wallet and try again.', 'error') + } + } finally { + setIsPayout(false) + } + } + + return ( +
+
+ +
+
+ {!account && ( +
+ + +
+ )} + {account && ( + + )} +
+
+ ) +} + +export default XrplPayoutProcessor \ No newline at end of file diff --git a/app/bridge/src/hooks/useWallet.ts b/app/bridge/src/hooks/useWallet.ts index 129cf1b6..1a6ba2cf 100644 --- a/app/bridge/src/hooks/useWallet.ts +++ b/app/bridge/src/hooks/useWallet.ts @@ -6,6 +6,7 @@ import useEVM from "../lib/wallets/evm/useEVM"; //import useStarknet from "../lib/wallets/starknet/useStarknet"; import useImmutableX from "../lib/wallets/immutableX/useIMX"; import useSolana from "../lib/wallets/solana/useSolana"; +import useXRPLWallet from "../lib/wallets/xrpl/useXRPLWallet"; import type { CryptoNetwork } from "@/Models/CryptoNetwork"; export type WalletProvider = { @@ -23,9 +24,10 @@ function useWallet() { const WalletProviders: WalletProvider[] = [ // useTON(), useEVM(), - //useStarknet(), useImmutableX(), useSolana(), + useXRPLWallet(), + //useStarknet(), ]; async function handleConnect(providerName: string, chain?: string | number) { diff --git a/app/bridge/src/hooks/useXrplWallet.test.ts b/app/bridge/src/hooks/useXrplWallet.test.ts new file mode 100644 index 00000000..5cf88dea --- /dev/null +++ b/app/bridge/src/hooks/useXrplWallet.test.ts @@ -0,0 +1,13 @@ +// @ts-nocheck +import { renderHook } from '@testing-library/react-hooks' +import { useXrplWallet } from './useXrplWallet' + +describe('useXrplWallet', () => { + it('should expose XUMM & Ledger connection and payment methods', () => { + const { result } = renderHook(() => useXrplWallet()) + expect(typeof result.current.connectXumm).toBe('function') + expect(typeof result.current.connectLedger).toBe('function') + expect(typeof result.current.sendPayment).toBe('function') + expect(typeof result.current.disconnect).toBe('function') + }) +}) \ No newline at end of file diff --git a/app/bridge/src/hooks/useXrplWallet.ts b/app/bridge/src/hooks/useXrplWallet.ts new file mode 100644 index 00000000..80393d8d --- /dev/null +++ b/app/bridge/src/hooks/useXrplWallet.ts @@ -0,0 +1,118 @@ +"use client" +import { useState, useEffect } from 'react' +import { Client as XrplClient, Wallet as XrplWallet } from 'xrpl' +import { XummSdk, PayloadCreate } from 'xumm' +import TransportWebHID from '@ledgerhq/hw-transport-webhid' +import AppXrp from '@ledgerhq/hw-app-xrp' + +export type XrpAccount = { address: string } + +// XRPL Network configuration +const XRPL_NETWORKS = { + MAINNET: 'wss://s1.ripple.com', + TESTNET: 'wss://s.altnet.rippletest.net:51233' +} + +/** + * Hook for integrating with XRP Ledger wallets + * Supports both XUMM and Ledger hardware wallets + */ +export function useXrplWallet() { + const [client, setClient] = useState() + const [sdk, setSdk] = useState() + const [account, setAccount] = useState() + const [connector, setConnector] = useState<'xumm' | 'ledger'>('xumm') + + // initialize XRPL client and XUMM SDK + useEffect(() => { + // Connect to XRPL network based on environment (default to mainnet) + const networkUrl = process.env.NEXT_PUBLIC_API_VERSION === 'mainnet' ? + XRPL_NETWORKS.MAINNET : XRPL_NETWORKS.TESTNET + + const c = new XrplClient(networkUrl) + c.connect().then(() => setClient(c)) + if (process.env.NEXT_PUBLIC_XUMM_API_KEY && process.env.NEXT_PUBLIC_XUMM_API_SECRET) { + setSdk(new XummSdk( + process.env.NEXT_PUBLIC_XUMM_API_KEY, + process.env.NEXT_PUBLIC_XUMM_API_SECRET + )) + } + }, []) + + // connect via XUMM + const connectXumm = async () => { + if (!sdk) throw new Error('XUMM SDK not initialized') + const { uuid } = await sdk.payload.create({ + TransactionType: 'SignIn' + } as any) + sdk.ws.subscribe(`payload.${uuid}`).then((sub: any) => { + sub.on('success', (data: any) => { + setAccount({ address: data.account }) + setConnector('xumm') + }) + }) + window.open(sdk.payload.get.xrplNextUrl(uuid), '_blank') + } + + // connect via Ledger hardware + const connectLedger = async () => { + const transport = await TransportWebHID.create() + const app = new AppXrp(transport) + const resp = await app.getAddress() + setAccount({ address: resp.address }) + setConnector('ledger') + } + + // send payment and return txid + const sendPayment = async (amountDrops: string, destination: string) => { + if (!client || !account) throw new Error('XRPL wallet not connected') + + // Basic validation check + if (!destination.startsWith('r') || destination.length < 25) { + throw new Error('Invalid XRP destination address') + } + + // Handle numeric validation + const drops = Number(amountDrops) + if (isNaN(drops) || drops <= 0) { + throw new Error('Invalid XRP amount') + } + if (connector === 'xumm') { + const tx = { + TransactionType: 'Payment', + Account: account.address, + Amount: amountDrops, + Destination: destination + } + const { uuid } = await sdk.payload.create({ txjson: tx } as any) + return new Promise(resolve => { + sdk.ws.subscribe(`payload.${uuid}`).then((sub: any) => { + sub.on('success', (data: any) => resolve(data.response.txid)) + }) + }) + } else { + const transport = await TransportWebHID.create() + const app = new AppXrp(transport) + const prepared = await client.autofill({ + TransactionType: 'Payment', + Account: account.address, + Amount: amountDrops, + Destination: destination + }) + const signed = await app.sign(prepared) + const result = await client.submitAndWait(signed.signedTransaction) + return result.result.hash + } + } + + const disconnect = () => setAccount(undefined) + + return { + account, + connector, + connectXumm, + connectLedger, + sendPayment, + disconnect + } +} \ No newline at end of file diff --git a/app/bridge/src/lib/wallets/xrpl/useXRPLWallet.test.ts b/app/bridge/src/lib/wallets/xrpl/useXRPLWallet.test.ts new file mode 100644 index 00000000..52d96a0b --- /dev/null +++ b/app/bridge/src/lib/wallets/xrpl/useXRPLWallet.test.ts @@ -0,0 +1,13 @@ +// @ts-nocheck +import useXRPLWallet from './useXRPLWallet' + +describe('useXRPLWallet adapter', () => { + it('should return a WalletProvider with correct shape', () => { + const provider = useXRPLWallet() + expect(provider.name).toBe('XRPL') + expect(Array.isArray(provider.autofillSupportedNetworks)).toBe(true) + expect(typeof provider.connectWallet).toBe('function') + expect(typeof provider.disconnectWallet).toBe('function') + expect(typeof provider.getConnectedWallet).toBe('function') + }) +}) \ No newline at end of file diff --git a/app/bridge/src/lib/wallets/xrpl/useXRPLWallet.ts b/app/bridge/src/lib/wallets/xrpl/useXRPLWallet.ts new file mode 100644 index 00000000..119afe38 --- /dev/null +++ b/app/bridge/src/lib/wallets/xrpl/useXRPLWallet.ts @@ -0,0 +1,30 @@ +import { useXrplWallet } from '@/hooks/useXrplWallet' +import type { WalletProvider } from '@/hooks/useWallet' +import type { Wallet } from '@/stores/walletStore' +import XrplIcon from '@/components/icons/Wallets/XRPL' + +export default function useXRPLWallet(): WalletProvider { + const { account, connector, connectXumm, connectLedger, disconnect } = useXrplWallet() + + const getConnectedWallet = (): Wallet | undefined => { + if (!account) return undefined + return { + address: account.address, + providerName: 'XRPL', + icon: XrplIcon, + connector, + chainId: undefined + } + } + + return { + name: 'XRPL', + autofillSupportedNetworks: ['XRPL_MAINNET', 'XRPL_TESTNET'], + withdrawalSupportedNetworks: [], + connectWallet: async (_chain?: string | number | null | undefined) => { + await connectXumm() + }, + disconnectWallet: () => disconnect(), + getConnectedWallet + } +} diff --git a/app/bridge/src/lib/xrpUtils.ts b/app/bridge/src/lib/xrpUtils.ts new file mode 100644 index 00000000..d0812d56 --- /dev/null +++ b/app/bridge/src/lib/xrpUtils.ts @@ -0,0 +1,38 @@ +/** + * Utilities for working with XRP and the XRP Ledger + */ + +/** + * Convert XRP amount to drops (1 XRP = 1,000,000 drops) + * @param xrpAmount The amount in XRP + * @returns The amount in drops (string) + */ +export function xrpToDrops(xrpAmount: string | number): string { + const xrp = typeof xrpAmount === 'string' ? parseFloat(xrpAmount) : xrpAmount; + const drops = Math.floor(xrp * 1_000_000); + return drops.toString(); +} + +/** + * Convert drops to XRP amount + * @param drops The amount in drops + * @returns The amount in XRP (string) + */ +export function dropsToXrp(drops: string | number): string { + const dropsNum = typeof drops === 'string' ? parseInt(drops, 10) : drops; + const xrp = dropsNum / 1_000_000; + return xrp.toString(); +} + +/** + * Validates an XRP address + * Very basic validation - checks general format + * For comprehensive validation, a proper XRP library should be used + * @param address XRP address to validate + * @returns boolean indicating if address is valid + */ +export function isValidXrpAddress(address: string): boolean { + // Basic validation - XRP addresses are typically around 25-35 characters + // and start with 'r' followed by alphanumeric characters + return /^r[1-9A-HJ-NP-Za-km-z]{24,34}$/.test(address); +} diff --git a/app/bridge/src/types/global.d.ts b/app/bridge/src/types/global.d.ts new file mode 100644 index 00000000..c0be2799 --- /dev/null +++ b/app/bridge/src/types/global.d.ts @@ -0,0 +1,5 @@ +// Global module declarations for modules without types +declare module 'xrpl' +declare module 'xumm' +declare module '@ledgerhq/hw-transport-webhid' +declare module '@ledgerhq/hw-app-xrp' \ No newline at end of file diff --git a/app/server/prisma/schema.prisma b/app/server/prisma/schema.prisma index 540e5706..5bb0efc0 100644 --- a/app/server/prisma/schema.prisma +++ b/app/server/prisma/schema.prisma @@ -20,9 +20,9 @@ datasource db { // prisma migrate dev model Network { - id Int @id @default(autoincrement()) + id Int @id @default(autoincrement()) display_name String? - internal_name String? @unique + internal_name String? @unique native_currency String? is_testnet Boolean? is_featured Boolean? @@ -33,12 +33,13 @@ model Network { transaction_explorer_template String? account_explorer_template String? listing_date DateTime? - source_network Swap[] @relation("sourceNetwork") - destination_network Swap[] @relation("destinationNetwork") + source_network Swap[] @relation("sourceNetwork") + destination_network Swap[] @relation("destinationNetwork") currencies Currency[] deposit_actions DepositAction[] transaction Transaction[] nodes RpcNode[] + withdrawal_statistics WithdrawalTimeStatistic[] } model Currency { @@ -168,3 +169,22 @@ model RpcNode { network_id Int network Network @relation(fields: [network_id], references: [id]) } + +model WithdrawalTimeStatistic { + id Int @id @default(autoincrement()) + network_id Int + network Network @relation(fields: [network_id], references: [id]) + total_withdrawals Int @default(0) + total_time_seconds Int @default(0) + avg_time_seconds Float @default(0) + min_time_seconds Int @default(0) + max_time_seconds Int @default(0) + last_24h_withdrawals Int @default(0) + last_24h_avg_seconds Float @default(0) + last_hour_withdrawals Int @default(0) + last_hour_avg_seconds Float @default(0) + updated_at DateTime @default(now()) @updatedAt + created_at DateTime @default(now()) + + @@index([network_id]) +} diff --git a/app/server/src/domain/rate.ts b/app/server/src/domain/rate.ts index 035ed49a..81114e37 100644 --- a/app/server/src/domain/rate.ts +++ b/app/server/src/domain/rate.ts @@ -1,4 +1,6 @@ import { getTokenPrice } from "@/domain/tokens" +import { getNetworkWithdrawalTimeStatistics } from "@/domain/swaps" +import { prisma } from "@/prisma-instance" export const getRate = async ( fromNetwork: string, @@ -14,6 +16,39 @@ export const getRate = async ( getTokenPrice(toAsset) ]) + // Get real withdrawal time statistics for the destination network if available + let withdrawalTimeStats = { total_seconds: 0, total_minutes: 0, total_hours: 0 } + + try { + const network = await prisma.network.findFirst({ + where: { internal_name: toNetwork } + }) + + if (network) { + const stats = await getNetworkWithdrawalTimeStatistics(network.id) + if (stats && stats.total_withdrawals > 0) { + // Use the last 24 hours average if available, otherwise use all-time average + const avgSeconds = stats.last_24h_withdrawals > 0 + ? stats.last_24h_avg_seconds + : stats.avg_time_seconds + + withdrawalTimeStats = { + total_seconds: avgSeconds % 60, + total_minutes: Math.floor(avgSeconds / 60) % 60, + total_hours: Math.floor(avgSeconds / 3600) + } + } + } + } catch (error) { + console.error('Error getting withdrawal time statistics:', error) + // Use default values if stats are not available + withdrawalTimeStats = { + total_minutes: 2, + total_seconds: 0, + total_hours: 0, + } + } + return { wallet_fee_in_usd: 10, wallet_fee: 0.1, @@ -21,11 +56,7 @@ export const getRate = async ( manual_fee_in_usd: 0, manual_fee: 0, manual_receive_amount: amount * sourcePrice / destinationPrice, - avg_completion_time: { - total_minutes: 2, - total_seconds: 0, - total_hours: 0, - }, + avg_completion_time: withdrawalTimeStats, fee_usd_price: 10, } } \ No newline at end of file diff --git a/app/server/src/domain/settings/mainnet/networks.ts b/app/server/src/domain/settings/mainnet/networks.ts index 7286832a..e2ad6c97 100644 --- a/app/server/src/domain/settings/mainnet/networks.ts +++ b/app/server/src/domain/settings/mainnet/networks.ts @@ -1392,6 +1392,23 @@ export default [ "source_base_fee": 0, "destination_base_fee": 0, "mint": true + }, + { + "name": "Liquid XRP", + "asset": "LXRP", + "logo": "https://cdn.lux.network/bridge/currencies/lux/lxrp.svg", + "contract_address": "0x408E5681E209d37FD52c76cF9ee7EfFA8476cd9b", + "decimals": 18, + "status": "active", + "is_deposit_enabled": true, + "is_withdrawal_enabled": true, + "is_refuel_enabled": false, + "max_withdrawal_amount": 1, + "deposit_fee": 0, + "withdrawal_fee": 0.01, + "source_base_fee": 0, + "destination_base_fee": 0, + "mint": true } ], "metadata": null, @@ -2071,6 +2088,23 @@ export default [ "source_base_fee": 0, "destination_base_fee": 0, "mint": true + }, + { + "name": "Zoo XRP", + "asset": "ZXRP", + "logo": "https://cdn.lux.network/bridge/currencies/zoo/zxrp.svg", + "contract_address": "0x137747A15dE042Cd01fCB41a5F3C7391d932750C", + "decimals": 18, + "status": "active", + "is_deposit_enabled": true, + "is_withdrawal_enabled": true, + "is_refuel_enabled": false, + "max_withdrawal_amount": 1, + "deposit_fee": 0, + "withdrawal_fee": 0.01, + "source_base_fee": 0, + "destination_base_fee": 0, + "mint": true } ], "metadata": null, @@ -3230,5 +3264,40 @@ export default [ "metadata": null, "managed_accounts": [], "nodes": [] + }, + { + "display_name": "XRPL Ledger", + "internal_name": "XRPL_MAINNET", + "logo": "https://cdn.lux.network/bridge/networks/xrp_mainnet.png", + "native_currency": "XRP", + "is_testnet": false, + "is_featured": true, + "average_completion_time": "00:00:12.0000000", + "chain_id": null, + "status": "active", + "type": "xrp", + "transaction_explorer_template": "https://livenet.xrpscan.com/tx/{0}", + "account_explorer_template": "https://livenet.xrpscan.com/account/{0}", + "currencies": [ + { + "name": "XRP", + "asset": "XRP", + "logo": "https://cdn.lux.network/bridge/currencies/xrp.svg", + "contract_address": null, + "decimals": 6, + "status": "active", + "is_deposit_enabled": false, + "is_withdrawal_enabled": true, + "is_refuel_enabled": false, + "max_withdrawal_amount": 0, + "deposit_fee": 0, + "withdrawal_fee": 0, + "source_base_fee": 0, + "destination_base_fee": 0 + } + ], + "metadata": null, + "managed_accounts": [], + "nodes": [] } ] satisfies Network[] diff --git a/app/server/src/domain/settings/testnet/networks.ts b/app/server/src/domain/settings/testnet/networks.ts index b186020d..48b30c13 100644 --- a/app/server/src/domain/settings/testnet/networks.ts +++ b/app/server/src/domain/settings/testnet/networks.ts @@ -817,6 +817,23 @@ export default [ "source_base_fee": 0, "destination_base_fee": 0, "mint": true + }, + { + "name": "Liquid XRP", + "asset": "LXRP", + "logo": "https://cdn.lux.network/bridge/currencies/lux/lxrp.svg", + "contract_address": "0x5B562e80A56b600d729371eB14fE3B83298D0643", + "decimals": 18, + "status": "active", + "is_deposit_enabled": true, + "is_withdrawal_enabled": true, + "is_refuel_enabled": false, + "max_withdrawal_amount": 1, + "deposit_fee": 0, + "withdrawal_fee": 0.01, + "source_base_fee": 0, + "destination_base_fee": 0, + "mint": true } ], "metadata": null, @@ -1341,6 +1358,23 @@ export default [ "source_base_fee": 0, "destination_base_fee": 0, "mint": true + }, + { + "name": "Zoo XRP", + "asset": "ZXRP", + "logo": "https://cdn.lux.network/bridge/currencies/zoo/zxrp.svg", + "contract_address": "0x5B562e80A56b600d729371eB14fE3B83298D0644", + "decimals": 18, + "status": "active", + "is_deposit_enabled": true, + "is_withdrawal_enabled": true, + "is_refuel_enabled": false, + "max_withdrawal_amount": 1, + "deposit_fee": 0, + "withdrawal_fee": 0.01, + "source_base_fee": 0, + "destination_base_fee": 0, + "mint": true } ], "metadata": null, @@ -1550,5 +1584,40 @@ export default [ "metadata": null, "managed_accounts": ["UQAvirnJ3tWyhjU0At4qRr-Miph3bI_38vgp0h73SHTl3TDB"], "nodes": [] + }, + { + "display_name": "XRP Devnet", + "internal_name": "XRP_TESTNET", + "logo": "https://cdn.lux.network/bridge/networks/xrp_devnet.png", + "native_currency": "XRP", + "is_testnet": true, + "is_featured": true, + "average_completion_time": "00:00:12.0000000", + "chain_id": null, + "status": "active", + "type": "xrp", + "transaction_explorer_template": "https://testnet.xrpscan.com/tx/{0}", + "account_explorer_template": "https://testnet.xrpscan.com/account/{0}", + "currencies": [ + { + "name": "XRP", + "asset": "XRP", + "logo": "https://cdn.lux.network/bridge/currencies/xrp.svg", + "contract_address": null, + "decimals": 6, + "status": "active", + "is_deposit_enabled": false, + "is_withdrawal_enabled": true, + "is_refuel_enabled": false, + "max_withdrawal_amount": 0, + "deposit_fee": 0, + "withdrawal_fee": 0, + "source_base_fee": 0, + "destination_base_fee": 0 + } + ], + "metadata": null, + "managed_accounts": [], + "nodes": [] } ] satisfies Network[] diff --git a/app/server/src/domain/swaps.ts b/app/server/src/domain/swaps.ts index 3a1e09fd..c207fc83 100644 --- a/app/server/src/domain/swaps.ts +++ b/app/server/src/domain/swaps.ts @@ -301,7 +301,10 @@ export async function handlerUpdateUserTransferAction(id: string, txHash: string export async function handlerUpdatePayoutAction(id: string, txHash: string, amount: number, from: string, to: string) { try { let swap = await prisma.swap.findUnique({ - where: { id } + where: { id }, + include: { + transactions: true + } }) const transaction = await prisma.transaction.create({ data: { @@ -341,6 +344,18 @@ export async function handlerUpdatePayoutAction(id: string, txHash: string, amou } } }) + + // Calculate withdrawal time and update statistics + const inputTransaction = swap?.transactions?.find(t => t.type === TransactionType.Input) + if (inputTransaction && inputTransaction.timestamp) { + const withdrawalTimeSeconds = Math.floor( + (transaction.timestamp.getTime() - inputTransaction.timestamp.getTime()) / 1000 + ) + + // Update withdrawal time statistics + await updateWithdrawalTimeStatistics(swap?.destination_network_id, withdrawalTimeSeconds) + } + swap = await prisma.swap.findUnique({ where: { id }, include: { @@ -1071,3 +1086,211 @@ export async function handlerDelSwap(swapData: { id: string }) { // throw new Error(`Error getting Prisma code: ${error.name} msg:${error?.message}`) // } // } + +/** + * Get withdrawal time statistics for all networks + * @returns Array of withdrawal time statistics for each network + */ +export async function getWithdrawalTimeStatistics() { + try { + // Get all statistics with network information + const stats = await prisma.withdrawalTimeStatistic.findMany({ + include: { + network: true + } + }) + + return stats.map(stat => ({ + network_id: stat.network_id, + network_name: stat.network.internal_name, + display_name: stat.network.display_name, + total_withdrawals: stat.total_withdrawals, + avg_time_seconds: stat.avg_time_seconds, + min_time_seconds: stat.min_time_seconds, + max_time_seconds: stat.max_time_seconds, + last_24h_withdrawals: stat.last_24h_withdrawals, + last_24h_avg_seconds: stat.last_24h_avg_seconds, + last_hour_withdrawals: stat.last_hour_withdrawals, + last_hour_avg_seconds: stat.last_hour_avg_seconds, + updated_at: stat.updated_at + })) + } catch (error) { + console.error('Error getting withdrawal time statistics:', error) + return [] + } +} + +/** + * Get withdrawal time statistics for a specific network + * @param networkId The network ID or internal name + * @returns Withdrawal time statistics for the network + */ +export async function getNetworkWithdrawalTimeStatistics(networkId: string | number) { + try { + let network; + + // If networkId is a string, find the network by internal name + if (typeof networkId === 'string') { + network = await prisma.network.findFirst({ + where: { internal_name: networkId } + }) + + if (!network) { + throw new Error(`Network with internal name ${networkId} not found`) + } + + networkId = network.id + } + + // Get statistics for the network + const stats = await prisma.withdrawalTimeStatistic.findFirst({ + where: { network_id: Number(networkId) }, + include: { + network: true + } + }) + + if (!stats) { + return { + network_id: Number(networkId), + network_name: network?.internal_name, + display_name: network?.display_name, + total_withdrawals: 0, + avg_time_seconds: 0, + min_time_seconds: 0, + max_time_seconds: 0, + last_24h_withdrawals: 0, + last_24h_avg_seconds: 0, + last_hour_withdrawals: 0, + last_hour_avg_seconds: 0, + updated_at: new Date() + } + } + + return { + network_id: stats.network_id, + network_name: stats.network.internal_name, + display_name: stats.network.display_name, + total_withdrawals: stats.total_withdrawals, + avg_time_seconds: stats.avg_time_seconds, + min_time_seconds: stats.min_time_seconds, + max_time_seconds: stats.max_time_seconds, + last_24h_withdrawals: stats.last_24h_withdrawals, + last_24h_avg_seconds: stats.last_24h_avg_seconds, + last_hour_withdrawals: stats.last_hour_withdrawals, + last_hour_avg_seconds: stats.last_hour_avg_seconds, + updated_at: stats.updated_at + } + } catch (error) { + console.error('Error getting network withdrawal time statistics:', error) + throw error + } +} + +/** + * Update withdrawal time statistics for a network + * @param networkId The network ID + * @param withdrawalTimeSeconds Time in seconds it took to complete the withdrawal + */ +async function updateWithdrawalTimeStatistics(networkId: number | undefined, withdrawalTimeSeconds: number) { + if (!networkId) return + + try { + // Get or create stats entry for this network + let stats = await prisma.withdrawalTimeStatistic.findFirst({ + where: { network_id: networkId } + }) + + const now = new Date() + const oneDayAgo = new Date(now.getTime() - 24 * 60 * 60 * 1000) + const oneHourAgo = new Date(now.getTime() - 60 * 60 * 1000) + + // If no stats exist yet, create initial entry + if (!stats) { + stats = await prisma.withdrawalTimeStatistic.create({ + data: { + network_id: networkId, + total_withdrawals: 1, + total_time_seconds: withdrawalTimeSeconds, + avg_time_seconds: withdrawalTimeSeconds, + min_time_seconds: withdrawalTimeSeconds, + max_time_seconds: withdrawalTimeSeconds, + last_24h_withdrawals: 1, + last_24h_avg_seconds: withdrawalTimeSeconds, + last_hour_withdrawals: 1, + last_hour_avg_seconds: withdrawalTimeSeconds + } + }) + return + } + + // Get completed swaps with both input and output transactions for the last 24 hours + const last24HoursSwaps = await prisma.swap.findMany({ + where: { + status: SwapStatus.PayoutSuccess, + transactions: { + some: { + type: TransactionType.Output, + timestamp: { gte: oneDayAgo } + } + } + }, + include: { + transactions: true + } + }) + + // Calculate withdrawal times for each swap + const withdrawalTimes24h: number[] = [] + const withdrawalTimes1h: number[] = [] + + for (const swap of last24HoursSwaps) { + const inputTx = swap.transactions.find(t => t.type === TransactionType.Input) + const outputTx = swap.transactions.find(t => t.type === TransactionType.Output) + + if (inputTx && outputTx && inputTx.timestamp && outputTx.timestamp) { + const time = Math.floor((outputTx.timestamp.getTime() - inputTx.timestamp.getTime()) / 1000) + withdrawalTimes24h.push(time) + + // Check if this transaction was in the last hour + if (outputTx.timestamp >= oneHourAgo) { + withdrawalTimes1h.push(time) + } + } + } + + // Calculate averages + const avg24h = withdrawalTimes24h.length > 0 + ? withdrawalTimes24h.reduce((sum, time) => sum + time, 0) / withdrawalTimes24h.length + : stats.last_24h_avg_seconds + + const avg1h = withdrawalTimes1h.length > 0 + ? withdrawalTimes1h.reduce((sum, time) => sum + time, 0) / withdrawalTimes1h.length + : stats.last_hour_avg_seconds + + // Calculate new averages for all-time stats + const newTotalWithdrawals = stats.total_withdrawals + 1 + const newTotalTimeSeconds = stats.total_time_seconds + withdrawalTimeSeconds + const newAvgTimeSeconds = newTotalTimeSeconds / newTotalWithdrawals + + // Update the statistics + await prisma.withdrawalTimeStatistic.update({ + where: { id: stats.id }, + data: { + total_withdrawals: newTotalWithdrawals, + total_time_seconds: newTotalTimeSeconds, + avg_time_seconds: newAvgTimeSeconds, + min_time_seconds: Math.min(stats.min_time_seconds, withdrawalTimeSeconds), + max_time_seconds: Math.max(stats.max_time_seconds, withdrawalTimeSeconds), + last_24h_withdrawals: withdrawalTimes24h.length, + last_24h_avg_seconds: avg24h, + last_hour_withdrawals: withdrawalTimes1h.length, + last_hour_avg_seconds: avg1h + } + }) + + console.log(`Updated withdrawal time statistics for network ${networkId}. Average: ${newAvgTimeSeconds.toFixed(2)} seconds`) + } catch (error) { + console.error('Error updating withdrawal time statistics:', error) + } +} diff --git a/app/server/src/routes/swaps.ts b/app/server/src/routes/swaps.ts index 37391370..6abb7e31 100644 --- a/app/server/src/routes/swaps.ts +++ b/app/server/src/routes/swaps.ts @@ -11,7 +11,9 @@ import { handlerUpdateMpcSignAction, handlerUpdatePayoutAction, handlerUpdateUserTransferAction, - handlerUtilaPayoutAction + handlerUtilaPayoutAction, + getWithdrawalTimeStatistics, + getNetworkWithdrawalTimeStatistics } from "@/domain/swaps" const router: Router = Router() @@ -233,4 +235,29 @@ router.get("/deposit-check/:swapId", async (req: Request, res: Response) => { } }) +// route: /api/swaps/statistics/withdrawal-time +// description: Get withdrawal time statistics for all networks +// method: GET and it's public +router.get("/statistics/withdrawal-time", async (req: Request, res: Response) => { + try { + const result = await getWithdrawalTimeStatistics() + res.status(200).json({ data: result }) + } catch (error: any) { + res.status(500).json({ error: error?.message }) + } +}) + +// route: /api/swaps/statistics/withdrawal-time/:networkId +// description: Get withdrawal time statistics for a specific network +// method: GET and it's public +router.get("/statistics/withdrawal-time/:networkId", async (req: Request, res: Response) => { + try { + const networkId = req.params.networkId + const result = await getNetworkWithdrawalTimeStatistics(networkId) + res.status(200).json({ data: result }) + } catch (error: any) { + res.status(500).json({ error: error?.message }) + } +}) + export default router diff --git a/contracts/contracts/lux/LXRP.sol b/contracts/contracts/lux/LXRP.sol new file mode 100644 index 00000000..e650d355 --- /dev/null +++ b/contracts/contracts/lux/LXRP.sol @@ -0,0 +1,29 @@ +// SPDX-License-Identifier: MIT +pragma solidity ^0.8.20; + +/** + ██████╗ ██╗ ██╗██╗ ██╗ ██████╗ + ██╔══██╗██║ ██║╚██╗██╔╝██╔═══╝ + ██████╔╝██║ ██║ ╚███╔╝ ██║ + ██╔═══╝ ██║ ██║ ██╔██╗ ██║ + ██║ ╚██████╔╝██╔╝ ██╗╚██████╗ + ╚═╝ ╚═════╝ ╚═╝ ╚═╝ ╚═════╝ + Lux XRP Token +*/ + +import "../ERC20B.sol"; + +contract LuxXRP is ERC20B { + string public constant _name = "Lux XRP"; + string public constant _symbol = "LXRP"; + + constructor() ERC20B(_name, _symbol) {} + + function mint(address account, uint256 amount) public { + _mint(account, amount); + } + + function burn(address account, uint256 amount) public { + _burn(account, amount); + } +} \ No newline at end of file diff --git a/contracts/contracts/zoo/ZXRP.sol b/contracts/contracts/zoo/ZXRP.sol new file mode 100644 index 00000000..f47b4069 --- /dev/null +++ b/contracts/contracts/zoo/ZXRP.sol @@ -0,0 +1,29 @@ +// SPDX-License-Identifier: MIT +pragma solidity ^0.8.20; + +/** + ██████╗ ██╗ ██╗██╗ ██╗ ██████╗ + ██╔══██╗██║ ██║╚██╗██╔╝██╔═══╝ + ██████╔╝██║ ██║ ╚███╔╝ ██║ + ██╔═══╝ ██║ ██║ ██╔██╗ ██║ + ██║ ╚██████╔╝██╔╝ ██╗╚██████╗ + ╚═╝ ╚═════╝ ╚═╝ ╚═╝ ╚═════╝ + Zoo XRP Token +*/ + +import "../ERC20B.sol"; + +contract ZooXRP is ERC20B { + string public constant _name = "Zoo XRP"; + string public constant _symbol = "ZXRP"; + + constructor() ERC20B(_name, _symbol) {} + + function mint(address account, uint256 amount) public { + _mint(account, amount); + } + + function burn(address account, uint256 amount) public { + _burn(account, amount); + } +} \ No newline at end of file diff --git a/docs/adding-new-blockchains.md b/docs/adding-new-blockchains.md new file mode 100644 index 00000000..bdbfa120 --- /dev/null +++ b/docs/adding-new-blockchains.md @@ -0,0 +1,936 @@ +# TODO + +We would like to support all major blockchains, here are some notes on how to +proceed with various chain architectures. + +## General Implementation Pattern + +For each blockchain, you'll need to implement: + +1. **Configuration**: Add network settings +2. **Transaction Verification**: Validate source chain transactions +3. **Data Extraction**: Extract transaction data (amount, sender, etc.) +4. **MPC Signature Generation**: Feed data to existing MPC system +5. **UI Integration**: Update UI to support the new blockchain + +## Bitcoin-style Network Implementation + +```typescript +// 1. Add to settings.ts +{ + display_name: "Bitcoin", + internal_name: "BTC_MAINNET", + is_testnet: false, + chain_id: "BTC-MAINNET", + teleporter: "", + vault: "", + node: "https://bitcoin-rpc-endpoint.com", + currencies: [ + { + name: "BTC", + asset: "BTC", + contract_address: null, + decimals: 8, + is_native: true + } + ] +} + +// 2. Add to node.ts +if ( + fromNetwork.internal_name === "BTC_MAINNET" || + fromNetwork.internal_name === "BTC_TESTNET" +) { + // Import BitcoinJS + const bitcoin = require('bitcoinjs-lib'); + const axios = require('axios'); + + try { + // Create RPC client + const rpcClient = new BitcoinRpcClient(fromNetwork.node); + + // Fetch transaction + const txData = await rpcClient.getRawTransaction(txId, true); + + // Validate it's to our teleporter address + let isValidTx = false; + let amount = 0; + let sender = ''; + + for (const output of txData.vout) { + if (output.scriptPubKey.addresses && + output.scriptPubKey.addresses.includes(fromNetwork.teleporter)) { + isValidTx = true; + amount = Math.floor(output.value * 100000000); // Convert to satoshis + } + } + + if (!isValidTx) { + throw new Error("Invalid Bitcoin transaction"); + } + + // Similar to XRPL implementation, generate MPC signature + const { signature, mpcSigner } = await hashAndSignTx({ + web3Form: null, + toNetworkId, + hashedTxId: txId, + toTokenAddress, + tokenAmount: amount.toString(), + decimals: 8, + receiverAddressHash, + nonce, + vault: false + }); + + // Save transaction info + await savehashedTxId({ + chainType: "btc", + txId, + amount: amount.toString(), + signature: signature + "###" + mpcSigner, + hashedTxId: txId + }); + + res.json({ status: true, data: { + teleporter: fromNetwork.teleporter, + token: "BTC", + from: sender, + eventName: "Payment", + value: amount.toString(), + signature, + mpcSigner, + hashedTxId: txId + }}); + return; + } catch (err) { + res.json({ status: false, msg: err.message }); + return; + } +} +``` + +## Solana Network Implementation + +```typescript +// 1. Add to settings.ts +{ + display_name: "Solana", + internal_name: "SOLANA_MAINNET", + is_testnet: false, + chain_id: "SOL-MAINNET", + teleporter: "", + vault: "", + node: "https://api.mainnet-beta.solana.com", + currencies: [ + { + name: "SOL", + asset: "SOL", + contract_address: null, + decimals: 9, + is_native: true + } + ] +} + +// 2. Add to node.ts +if ( + fromNetwork.internal_name === "SOLANA_MAINNET" || + fromNetwork.internal_name === "SOLANA_DEVNET" +) { + const solanaWeb3 = require('@solana/web3.js'); + const connection = new solanaWeb3.Connection(fromNetwork.node); + + try { + // Fetch and parse transaction + const tx = await connection.getParsedTransaction(txId, {commitment: 'confirmed'}); + + // Verify transaction type and recipient + if (!tx || !tx.meta || tx.meta.err) { + throw new Error("Invalid Solana transaction"); + } + + // Check if it's a transfer to our teleporter address + let isToTeleporter = false; + let amount = 0; + let sender = ''; + + for (const instruction of tx.transaction.message.instructions) { + if (instruction.program === 'system' && + instruction.parsed.type === 'transfer' && + instruction.parsed.info.destination === fromNetwork.teleporter) { + isToTeleporter = true; + amount = instruction.parsed.info.lamports; + sender = instruction.parsed.info.source; + break; + } + } + + if (!isToTeleporter) { + throw new Error("Not a transfer to teleporter address"); + } + + // Generate MPC signature + const { signature, mpcSigner } = await hashAndSignTx({ + web3Form: null, + toNetworkId, + hashedTxId: txId, + toTokenAddress, + tokenAmount: amount.toString(), + decimals: 9, + receiverAddressHash, + nonce, + vault: false + }); + + // Save transaction info + await savehashedTxId({ + chainType: "sol", + txId, + amount: amount.toString(), + signature: signature + "###" + mpcSigner, + hashedTxId: txId + }); + + res.json({ status: true, data: { + teleporter: fromNetwork.teleporter, + token: "SOL", + from: sender, + eventName: "Payment", + value: amount.toString(), + signature, + mpcSigner, + hashedTxId: txId + }}); + return; + } catch (err) { + res.json({ status: false, msg: err.message }); + return; + } +} +``` + +## Cardano Network Implementation + +```typescript +// 1. Add to settings.ts +{ + display_name: "Cardano", + internal_name: "CARDANO_MAINNET", + is_testnet: false, + chain_id: "ADA-MAINNET", + teleporter: "", + vault: "", + node: "https://cardano-node-url.com", + currencies: [ + { + name: "ADA", + asset: "ADA", + contract_address: null, + decimals: 6, + is_native: true + } + ] +} + +// 2. Add to node.ts +if ( + fromNetwork.internal_name === "CARDANO_MAINNET" || + fromNetwork.internal_name === "CARDANO_TESTNET" +) { + // Use Cardano serialization lib + const CardanoWasm = require('@emurgo/cardano-serialization-lib-nodejs'); + const BlockfrostAPI = require('@blockfrost/blockfrost-js'); + + try { + // Create API client + const api = new BlockfrostAPI({ + projectId: process.env.BLOCKFROST_PROJECT_ID, + network: fromNetwork.internal_name === "CARDANO_MAINNET" ? 'mainnet' : 'testnet', + }); + + // Get transaction + const tx = await api.txs(txId); + const txUtxos = await api.txsUtxos(txId); + + // Verify it's a payment to teleporter + let isValidTx = false; + let amount = 0; + let sender = ''; + + // Check outputs for teleporter address + for (const output of txUtxos.outputs) { + if (output.address === fromNetwork.teleporter) { + // For ADA, we need to filter the lovelace (native token) + for (const amount_item of output.amount) { + if (amount_item.unit === 'lovelace') { + isValidTx = true; + amount = parseInt(amount_item.quantity); + break; + } + } + } + } + + if (!isValidTx) { + throw new Error("Invalid Cardano transaction"); + } + + // Generate MPC signature + const { signature, mpcSigner } = await hashAndSignTx({ + web3Form: null, + toNetworkId, + hashedTxId: txId, + toTokenAddress, + tokenAmount: amount.toString(), + decimals: 6, + receiverAddressHash, + nonce, + vault: false + }); + + // Save transaction info + await savehashedTxId({ + chainType: "ada", + txId, + amount: amount.toString(), + signature: signature + "###" + mpcSigner, + hashedTxId: txId + }); + + res.json({ status: true, data: { + teleporter: fromNetwork.teleporter, + token: "ADA", + from: sender, + eventName: "Payment", + value: amount.toString(), + signature, + mpcSigner, + hashedTxId: txId + }}); + return; + } catch (err) { + res.json({ status: false, msg: err.message }); + return; + } +} +``` + +## TRON Network Implementation + +```typescript +// 1. Add to settings.ts +{ + display_name: "TRON", + internal_name: "TRON_MAINNET", + is_testnet: false, + chain_id: "TRX-MAINNET", + teleporter: "", + vault: "", + node: "https://api.trongrid.io", + currencies: [ + { + name: "TRX", + asset: "TRX", + contract_address: null, + decimals: 6, + is_native: true + } + ] +} + +// 2. Add to node.ts +if ( + fromNetwork.internal_name === "TRON_MAINNET" || + fromNetwork.internal_name === "TRON_SHASTA" +) { + const TronWeb = require('tronweb'); + const tronWeb = new TronWeb({ + fullHost: fromNetwork.node, + headers: { "TRON-PRO-API-KEY": process.env.TRON_API_KEY } + }); + + try { + // Fetch transaction info + const txInfo = await tronWeb.trx.getTransaction(txId); + const txInfo2 = await tronWeb.trx.getTransactionInfo(txId); + + // Verify transaction type and to address + if (!txInfo || !txInfo.raw_data || !txInfo.raw_data.contract || txInfo.raw_data.contract.length === 0) { + throw new Error("Invalid TRON transaction"); + } + + const contract = txInfo.raw_data.contract[0]; + if (contract.type !== 'TransferContract') { + throw new Error("Not a transfer transaction"); + } + + const transferParams = contract.parameter.value; + if (tronWeb.address.fromHex(transferParams.to_address) !== fromNetwork.teleporter) { + throw new Error("Transfer not to teleporter address"); + } + + const amount = transferParams.amount; + const sender = tronWeb.address.fromHex(transferParams.owner_address); + + // Generate MPC signature + const { signature, mpcSigner } = await hashAndSignTx({ + web3Form: null, + toNetworkId, + hashedTxId: txId, + toTokenAddress, + tokenAmount: amount.toString(), + decimals: 6, + receiverAddressHash, + nonce, + vault: false + }); + + // Save transaction info + await savehashedTxId({ + chainType: "trx", + txId, + amount: amount.toString(), + signature: signature + "###" + mpcSigner, + hashedTxId: txId + }); + + res.json({ status: true, data: { + teleporter: fromNetwork.teleporter, + token: "TRX", + from: sender, + eventName: "Payment", + value: amount.toString(), + signature, + mpcSigner, + hashedTxId: txId + }}); + return; + } catch (err) { + res.json({ status: false, msg: err.message }); + return; + } +} +``` + +## Tezos Network Implementation + +```typescript +// 1. Add to settings.ts +{ + display_name: "Tezos", + internal_name: "TEZOS_MAINNET", + is_testnet: false, + chain_id: "XTZ-MAINNET", + teleporter: "", + vault: "", + node: "https://mainnet.api.tez.ie", + currencies: [ + { + name: "XTZ", + asset: "XTZ", + contract_address: null, + decimals: 6, + is_native: true + } + ] +} + +// 2. Add to node.ts +if ( + fromNetwork.internal_name === "TEZOS_MAINNET" || + fromNetwork.internal_name === "TEZOS_GHOSTNET" +) { + const { TezosToolkit } = require('@taquito/taquito'); + const tezos = new TezosToolkit(fromNetwork.node); + + try { + // Fetch operation + const operation = await tezos.rpc.getOperationHash(txId); + const opDetails = await tezos.rpc.getOperation(txId); + + // Find the transaction in the operation + let validTx = false; + let amount = 0; + let sender = ''; + + for (const content of opDetails.contents) { + if (content.kind === 'transaction' && content.destination === fromNetwork.teleporter) { + validTx = true; + amount = parseInt(content.amount); + sender = content.source; + break; + } + } + + if (!validTx) { + throw new Error("Invalid Tezos transaction or not to teleporter"); + } + + // Generate MPC signature + const { signature, mpcSigner } = await hashAndSignTx({ + web3Form: null, + toNetworkId, + hashedTxId: txId, + toTokenAddress, + tokenAmount: amount.toString(), + decimals: 6, + receiverAddressHash, + nonce, + vault: false + }); + + // Save transaction info + await savehashedTxId({ + chainType: "xtz", + txId, + amount: amount.toString(), + signature: signature + "###" + mpcSigner, + hashedTxId: txId + }); + + res.json({ status: true, data: { + teleporter: fromNetwork.teleporter, + token: "XTZ", + from: sender, + eventName: "Payment", + value: amount.toString(), + signature, + mpcSigner, + hashedTxId: txId + }}); + return; + } catch (err) { + res.json({ status: false, msg: err.message }); + return; + } +} +``` + +## Cosmos Network Implementation + +```typescript +// 1. Add to settings.ts +{ + display_name: "Cosmos Hub", + internal_name: "COSMOS_MAINNET", + is_testnet: false, + chain_id: "ATOM-MAINNET", + teleporter: "", + vault: "", + node: "https://lcd-cosmoshub.keplr.app", + currencies: [ + { + name: "ATOM", + asset: "ATOM", + contract_address: null, + decimals: 6, + is_native: true + } + ] +} + +// 2. Add to node.ts +if ( + fromNetwork.internal_name === "COSMOS_MAINNET" || + fromNetwork.internal_name === "COSMOS_TESTNET" +) { + const { LcdClient } = require('@cosmjs/launchpad'); + const axios = require('axios'); + + try { + // Create client + const client = new LcdClient(fromNetwork.node); + + // Fetch transaction + const txResponse = await axios.get(`${fromNetwork.node}/cosmos/tx/v1beta1/txs/${txId}`); + const tx = txResponse.data.tx_response; + + if (!tx || tx.code !== 0) { + throw new Error("Invalid Cosmos transaction"); + } + + // Parse messages to find Send + let validTx = false; + let amount = 0; + let sender = ''; + + for (const msg of tx.tx.body.messages) { + if (msg['@type'] === '/cosmos.bank.v1beta1.MsgSend' && + msg.to_address === fromNetwork.teleporter) { + validTx = true; + + // Find ATOM amount + for (const coin of msg.amount) { + if (coin.denom === 'uatom') { // micro ATOM + amount = parseInt(coin.amount); + sender = msg.from_address; + break; + } + } + + break; + } + } + + if (!validTx) { + throw new Error("No valid payment to teleporter found"); + } + + // Generate MPC signature + const { signature, mpcSigner } = await hashAndSignTx({ + web3Form: null, + toNetworkId, + hashedTxId: txId, + toTokenAddress, + tokenAmount: amount.toString(), + decimals: 6, + receiverAddressHash, + nonce, + vault: false + }); + + // Save transaction info + await savehashedTxId({ + chainType: "atom", + txId, + amount: amount.toString(), + signature: signature + "###" + mpcSigner, + hashedTxId: txId + }); + + res.json({ status: true, data: { + teleporter: fromNetwork.teleporter, + token: "ATOM", + from: sender, + eventName: "Payment", + value: amount.toString(), + signature, + mpcSigner, + hashedTxId: txId + }}); + return; + } catch (err) { + res.json({ status: false, msg: err.message }); + return; + } +} +``` + +## Algorand Network Implementation + +```typescript +// 1. Add to settings.ts +{ + display_name: "Algorand", + internal_name: "ALGORAND_MAINNET", + is_testnet: false, + chain_id: "ALGO-MAINNET", + teleporter: "", + vault: "", + node: "https://mainnet-api.algonode.cloud", + currencies: [ + { + name: "ALGO", + asset: "ALGO", + contract_address: null, + decimals: 6, + is_native: true + } + ] +} + +// 2. Add to node.ts +if ( + fromNetwork.internal_name === "ALGORAND_MAINNET" || + fromNetwork.internal_name === "ALGORAND_TESTNET" +) { + const algosdk = require('algosdk'); + + try { + // Create client + const algodClient = new algosdk.Algodv2( + process.env.ALGO_API_TOKEN, + fromNetwork.node, + process.env.ALGO_PORT + ); + + // Fetch transaction + const txInfo = await algodClient.pendingTransactionInformation(txId).do(); + + // Verify it's a payment to teleporter + if (txInfo['tx-type'] !== 'pay' || + txInfo['payment-transaction'].receiver !== fromNetwork.teleporter) { + throw new Error("Not a payment to teleporter address"); + } + + const amount = txInfo['payment-transaction'].amount; + const sender = txInfo.sender; + + // Generate MPC signature + const { signature, mpcSigner } = await hashAndSignTx({ + web3Form: null, + toNetworkId, + hashedTxId: txId, + toTokenAddress, + tokenAmount: amount.toString(), + decimals: 6, + receiverAddressHash, + nonce, + vault: false + }); + + // Save transaction info + await savehashedTxId({ + chainType: "algo", + txId, + amount: amount.toString(), + signature: signature + "###" + mpcSigner, + hashedTxId: txId + }); + + res.json({ status: true, data: { + teleporter: fromNetwork.teleporter, + token: "ALGO", + from: sender, + eventName: "Payment", + value: amount.toString(), + signature, + mpcSigner, + hashedTxId: txId + }}); + return; + } catch (err) { + res.json({ status: false, msg: err.message }); + return; + } +} +``` + +## Aptos Network Implementation + +```typescript +// 1. Add to settings.ts +{ + display_name: "Aptos", + internal_name: "APTOS_MAINNET", + is_testnet: false, + chain_id: "APT-MAINNET", + teleporter: "", + vault: "", + node: "https://fullnode.mainnet.aptoslabs.com/v1", + currencies: [ + { + name: "APT", + asset: "APT", + contract_address: null, + decimals: 8, + is_native: true + } + ] +} + +// 2. Add to node.ts +if ( + fromNetwork.internal_name === "APTOS_MAINNET" || + fromNetwork.internal_name === "APTOS_TESTNET" +) { + const { AptosClient } = require('aptos'); + const client = new AptosClient(fromNetwork.node); + + try { + // Fetch transaction + const tx = await client.getTransactionByHash(txId); + + // Verify it's a coin transfer to teleporter + let validTx = false; + let amount = 0; + let sender = ''; + + if (tx.type === 'user_transaction' && + tx.payload.function === '0x1::coin::transfer' && + tx.payload.arguments[0] === fromNetwork.teleporter) { + validTx = true; + amount = tx.payload.arguments[1]; + sender = tx.sender; + } + + if (!validTx) { + throw new Error("Invalid Aptos transaction"); + } + + // Generate MPC signature + const { signature, mpcSigner } = await hashAndSignTx({ + web3Form: null, + toNetworkId, + hashedTxId: txId, + toTokenAddress, + tokenAmount: amount.toString(), + decimals: 8, + receiverAddressHash, + nonce, + vault: false + }); + + // Save transaction info + await savehashedTxId({ + chainType: "apt", + txId, + amount: amount.toString(), + signature: signature + "###" + mpcSigner, + hashedTxId: txId + }); + + res.json({ status: true, data: { + teleporter: fromNetwork.teleporter, + token: "APT", + from: sender, + eventName: "Payment", + value: amount.toString(), + signature, + mpcSigner, + hashedTxId: txId + }}); + return; + } catch (err) { + res.json({ status: false, msg: err.message }); + return; + } +} +``` + +## Sui Network Implementation + +```typescript +// 1. Add to settings.ts +{ + display_name: "Sui", + internal_name: "SUI_MAINNET", + is_testnet: false, + chain_id: "SUI-MAINNET", + teleporter: "", + vault: "", + node: "https://fullnode.mainnet.sui.io:443", + currencies: [ + { + name: "SUI", + asset: "SUI", + contract_address: null, + decimals: 9, + is_native: true + } + ] +} + +// 2. Add to node.ts +if ( + fromNetwork.internal_name === "SUI_MAINNET" || + fromNetwork.internal_name === "SUI_TESTNET" +) { + const { JsonRpcProvider, Connection } = require('@mysten/sui.js'); + + try { + // Create connection to Sui + const connection = new Connection({ fullnode: fromNetwork.node }); + const provider = new JsonRpcProvider(connection); + + // Fetch transaction + const txInfo = await provider.getTransactionBlock({ + digest: txId, + options: { showEffects: true, showInput: true } + }); + + // Verify it's a transfer to teleporter + let validTx = false; + let amount = 0; + let sender = ''; + + // Parse transaction effects to find transfer to teleporter + for (const effect of txInfo.effects.events) { + if (effect.type === 'coinBalanceChange' && + effect.owner?.AddressOwner === fromNetwork.teleporter && + effect.coinType === '0x2::sui::SUI' && + effect.amount > 0) { + validTx = true; + amount = effect.amount; + sender = txInfo.transaction.data.sender; + break; + } + } + + if (!validTx) { + throw new Error("Not a valid transfer to teleporter"); + } + + // Generate MPC signature + const { signature, mpcSigner } = await hashAndSignTx({ + web3Form: null, + toNetworkId, + hashedTxId: txId, + toTokenAddress, + tokenAmount: amount.toString(), + decimals: 9, + receiverAddressHash, + nonce, + vault: false + }); + + // Save transaction info + await savehashedTxId({ + chainType: "sui", + txId, + amount: amount.toString(), + signature: signature + "###" + mpcSigner, + hashedTxId: txId + }); + + res.json({ status: true, data: { + teleporter: fromNetwork.teleporter, + token: "SUI", + from: sender, + eventName: "Payment", + value: amount.toString(), + signature, + mpcSigner, + hashedTxId: txId + }}); + return; + } catch (err) { + res.json({ status: false, msg: err.message }); + return; + } +} +``` + +## General Implementation Notes + +For all blockchains, remember to: + +1. **Update SWAP_PAIRS in settings.ts**: + ```typescript + // Add to SWAP_PAIRS + BTC: ["LBTC", "ZBTC"], + SOL: ["LSOL", "ZSOL"], + ADA: ["LADA", "ZADA"], + // ... and so on for each blockchain's native token + ``` + +2. **Install Required Dependencies**: + ```bash + pnpm add bitcoinjs-lib axios @solana/web3.js @emurgo/cardano-serialization-lib-nodejs @blockfrost/blockfrost-js tronweb @taquito/taquito @cosmjs/launchpad algosdk aptos @mysten/sui.js + ``` + +3. **Update UI Components**: + - Add wallet connectors for each blockchain + - Update network selection UI + - Add blockchain-specific icons and branding + +4. **Create Teleporter and Vault Addresses**: + - For each blockchain, you'll need to generate addresses or deploy contracts to serve as teleporters and vaults + +5. **Testing Steps**: + 1. Test transactions from new chain to existing chains + 2. Test transactions from existing chains to new chain + 3. Verify token balances and transaction statuses + 4. Test edge cases (failed transactions, network issues) + +Each blockchain implementation follows the same pattern demonstrated in the XRPL implementation: verify transaction on source chain, then use the MPC system to generate signatures for the destination chain. The main differences are in the transaction verification process and blockchain-specific client libraries. diff --git a/docs/cggmp21-notes.md b/docs/cggmp21-notes.md new file mode 100644 index 00000000..901f6322 --- /dev/null +++ b/docs/cggmp21-notes.md @@ -0,0 +1,855 @@ +# CGGMP21 Protocol Implementation Guide + +## Introduction to CGGMP21 + +CGGMP21 (Canetti, Gennaro, Goldfeder, Makriyannis, Peled, 2021) is an advanced threshold ECDSA protocol that builds upon the CGGMP20 protocol described in your current architecture. This protocol introduces significant improvements that are highly relevant for your Lux.Network bridge implementation: + +- **Non-Interactive Signing**: Only the last round requires knowledge of the message, allowing preprocessing +- **Adaptive Security**: Withstands adaptive corruption of signatories +- **Proactive Security**: Includes periodic refresh mechanism to maintain security even with compromised nodes +- **Identifiable Abort**: Can identify corrupted signatories in case of failure +- **UC Security Framework**: Proven security guarantees in the Universal Composability framework + +These capabilities make CGGMP21 an ideal protocol for threshold wallets and cross-chain bridges handling ECDSA-based cryptocurrencies, where security, composability, and practical efficiency are critical. + +## CGGMP21 vs. Current Implementation + +Based on my analysis of your codebase and documentation, your bridge is currently using the CGGMP20 protocol. The CGGMP21 protocol represents an improved version with key advantages: + +| Feature | CGGMP20 (Current) | CGGMP21 (Proposed) | +|---------|-------------------|-------------------| +| Signing Rounds | 4 rounds | 3 rounds (+ 1 non-interactive) | +| Message Dependency | All rounds | Only last round | +| Adaptive Security | Limited | Full support | +| Proactive Security | Basic | Enhanced with refresh | +| Identifiable Abort | Basic | Advanced identification | +| Cold Wallet Support | Limited | Native support | +| UC Security Proof | Partial | Comprehensive | + +Your documentation also mentions considering DKLs23 as a potential alternative to CGGMP20. While DKLs23 offers some advantages in computational efficiency and simpler cryptographic assumptions, CGGMP21 provides a more mature implementation with proven features like proactive security and identifiable abort that are crucial for bridge security. + +## Implementation Strategy + +Implementing CGGMP21 for the Lux.Network bridge requires a methodical approach. Here's a proposed strategy: + +### 1. Code Structure Integration + +The CGGMP21 protocol should be integrated into your existing multiparty ECDSA framework. Based on your repository structure, consider the following approach: + +``` +mpc-nodes/docker/common/multiparty_ecdsa/ +├── src/ +│ ├── protocols/ +│ │ ├── multi_party_ecdsa/ +│ │ │ ├── gg_2018/ (existing implementation) +│ │ │ ├── gg_2020/ (existing CGGMP20) +│ │ │ └── cggmp_2021/ (new implementation) +│ ├── utilities/ (shared cryptographic utilities) +│ └── lib.rs (main library interface) +``` + +### 2. Core Protocol Components + +The implementation should include these key components: + +#### A. Key Generation Phase + +```rust +pub mod keygen { + pub struct KeyGenParameters { + // Parameters for secure key generation + pub threshold: usize, + pub share_count: usize, + pub security_bits: usize, + } + + pub struct KeyGenParty { + // Party state for key generation + party_id: usize, + parameters: KeyGenParameters, + state: KeyGenState, + } + + impl KeyGenParty { + // Create a new keygen party instance + pub fn new(party_id: usize, parameters: KeyGenParameters) -> Self; + + // First round message generation + pub fn round1(&mut self) -> Round1Message; + + // Process round 1 messages and generate round 2 + pub fn round2(&mut self, messages: Vec) -> Round2Message; + + // Process round 2 messages and generate round 3 + pub fn round3(&mut self, messages: Vec) -> Round3Message; + + // Process round 3 messages and finalize key generation + pub fn finalize(&mut self, messages: Vec) -> KeyShare; + } +} +``` + +#### B. Key Refresh Phase + +```rust +pub mod refresh { + pub struct RefreshParameters { + // Parameters for key refresh + pub threshold: usize, + pub epoch_id: u64, + } + + pub struct RefreshParty { + // Party state for key refresh + party_id: usize, + parameters: RefreshParameters, + key_share: KeyShare, + state: RefreshState, + } + + impl RefreshParty { + // Create a new refresh party instance + pub fn new(party_id: usize, parameters: RefreshParameters, key_share: KeyShare) -> Self; + + // Generate refresh shares + pub fn round1(&mut self) -> RefreshRound1Message; + + // Process round 1 messages and generate round 2 + pub fn round2(&mut self, messages: Vec) -> RefreshRound2Message; + + // Process round 2 messages and generate round 3 + pub fn round3(&mut self, messages: Vec) -> RefreshRound3Message; + + // Finalize refresh and get updated key share + pub fn finalize(&mut self, messages: Vec) -> KeyShare; + } +} +``` + +#### C. Presigning Phase + +```rust +pub mod presign { + pub struct PresignParameters { + // Parameters for presigning + pub session_id: String, + } + + pub struct PresignParty { + // Party state for presigning + party_id: usize, + parameters: PresignParameters, + key_share: KeyShare, + state: PresignState, + } + + impl PresignParty { + // Create a new presign party instance + pub fn new(party_id: usize, parameters: PresignParameters, key_share: KeyShare) -> Self; + + // First round of presigning + pub fn round1(&mut self) -> PresignRound1Message; + + // Process round 1 messages and generate round 2 + pub fn round2(&mut self, messages: Vec) -> PresignRound2Message; + + // Process round 2 messages and generate round 3 + pub fn round3(&mut self, messages: Vec) -> PresignRound3Message; + + // Finalize presigning and get presign data + pub fn finalize(&mut self, messages: Vec) -> PresignData; + } +} +``` + +#### D. Signing Phase + +```rust +pub mod sign { + pub struct SignParameters { + // Parameters for signing + pub message_digest: [u8; 32], + } + + pub struct SignParty { + // Party state for signing + party_id: usize, + parameters: SignParameters, + presign_data: PresignData, + } + + impl SignParty { + // Create a new sign party instance + pub fn new(party_id: usize, parameters: SignParameters, presign_data: PresignData) -> Self; + + // Generate signature share (non-interactive) + pub fn sign(&mut self) -> SignatureShare; + + // Combine signature shares into a complete signature + pub fn combine(shares: Vec) -> ECDSASignature; + } +} +``` + +#### E. Accountability Mechanisms + +```rust +pub mod accountability { + pub struct Complaint { + // Complaint structure for identifiable abort + pub accused_party: usize, + pub evidence: ComplaintEvidence, + } + + pub fn verify_complaint(complaint: &Complaint, public_data: &PublicData) -> bool; + + pub fn identify_malicious_parties( + protocol_transcript: &ProtocolTranscript, + public_data: &PublicData + ) -> Vec; +} +``` + +### 3. Integration with Node.js Bridge Application + +Your MPC nodes appear to be running a Node.js application that interfaces with the Rust implementation. You'll need to update this interface to support the CGGMP21 protocol: + +```typescript +// In src/mpc/signing.ts or similar file + +enum Protocol { + GG18 = 'gg18', + CGGMP20 = 'cggmp20', + CGGMP21 = 'cggmp21' +} + +interface SigningOptions { + protocol: Protocol; + sessionId: string; + threshold: number; + totalParties: number; + messageHash?: string; // Optional for presigning +} + +// Use the new protocol for signing +export async function signMessage( + messageHash: string, + options: SigningOptions = { protocol: Protocol.CGGMP21 } +): Promise { + // Implementation that calls the Rust library with the appropriate protocol + + if (options.protocol === Protocol.CGGMP21) { + // For CGGMP21, we can use the presigning approach + const presignData = await getOrCreatePresignData(options); + return signWithPresignData(messageHash, presignData); + } else { + // Fallback to existing protocols + return legacySignMessage(messageHash, options); + } +} + +// New function for presigning +export async function createPresignData( + options: SigningOptions +): Promise { + // Call the Rust implementation to generate presign data + const cmd = `./target/release/examples/cggmp21_presign_client ${options.sessionId} ${options.threshold} ${options.totalParties}`; + // Execute and return presign data +} + +// Non-interactive signing using presign data +export async function signWithPresignData( + messageHash: string, + presignData: PresignData +): Promise { + // Call the Rust implementation for non-interactive signing + const cmd = `./target/release/examples/cggmp21_sign_client ${presignData.id} ${messageHash}`; + // Execute and return signature +} +``` + +### 4. Docker Configuration Updates + +Update your Docker configuration to include the CGGMP21 protocol binaries: + +```dockerfile +# In your Dockerfile + +# Build the CGGMP21 examples +WORKDIR /app/multiparty_ecdsa +RUN cargo build --release --examples +RUN cp target/release/examples/cggmp21_keygen_client \ + target/release/examples/cggmp21_refresh_client \ + target/release/examples/cggmp21_presign_client \ + target/release/examples/cggmp21_sign_client \ + /app/bin/ +``` + +### 5. Key Management and Persistence + +Ensure proper key management for the presign data, which requires secure storage: + +```typescript +// In src/mpc/keystore.ts or similar file + +interface PresignStore { + savePresignData(presignData: PresignData): Promise; + getUnusedPresignData(): Promise; + markPresignDataAsUsed(id: string): Promise; + generateMorePresignDataIfNeeded(threshold: number): Promise; +} + +// Implementation with appropriate security for storing presign data +class SecurePresignStore implements PresignStore { + // Implementation details +} +``` + +## Key Technical Components + +### 1. Paillier Encryption for Secure Multiplication + +The CGGMP21 protocol uses Paillier encryption for secure multiparty computation of the ECDSA signature. Implement the following: + +```rust +pub struct PaillierKeyPair { + pub public: PaillierPublicKey, + pub private: PaillierPrivateKey, +} + +impl PaillierKeyPair { + pub fn generate(bits: usize) -> Self; + + pub fn encrypt(&self, plaintext: BigInt, randomness: Option) -> PaillierCiphertext; + + pub fn decrypt(&self, ciphertext: PaillierCiphertext) -> BigInt; +} + +// Homomorphic operations +pub trait HomomorphicOperations { + fn add(&self, other: &Self) -> Self; + fn scalar_mul(&self, scalar: &BigInt) -> Self; +} + +impl HomomorphicOperations for PaillierCiphertext { + // Implementation of homomorphic operations +} +``` + +### 2. Zero-Knowledge Proofs + +CGGMP21 uses zero-knowledge proofs to ensure honest behavior without revealing secrets: + +```rust +pub mod zk_proofs { + // Range proof to prove a value is in a specific range + pub struct RangeProof { + // Proof components + } + + impl RangeProof { + pub fn prove(value: &BigInt, range: &Range, randomness: &BigInt) -> Self; + + pub fn verify(&self, ciphertext: &PaillierCiphertext, range: &Range) -> bool; + } + + // Affine operation proof (for demonstrating correct multiplication) + pub struct AffineOperationProof { + // Proof components + } + + impl AffineOperationProof { + pub fn prove(x: &BigInt, y: &BigInt, result: &BigInt, randomness: &BigInt) -> Self; + + pub fn verify(&self, encrypted_x: &PaillierCiphertext, public_y: &BigInt, + encrypted_result: &PaillierCiphertext) -> bool; + } +} +``` + +### 3. Non-Interactive Proofs with Fiat-Shamir + +Convert interactive proofs to non-interactive using the Fiat-Shamir transform: + +```rust +pub fn generate_challenge(public_inputs: &[&[u8]], first_message: &[u8]) -> BigInt { + let mut hasher = Sha256::new(); + + // Hash all public inputs + for input in public_inputs { + hasher.update(input); + } + + // Hash the first message + hasher.update(first_message); + + // Convert hash to challenge + let hash = hasher.finalize(); + BigInt::from_bytes_le(Sign::Plus, &hash) +} + +pub struct NIZKProof { + // Non-interactive zero-knowledge proof components + pub first_message: Vec, + pub response: Vec, +} + +impl NIZKProof { + pub fn generate(public_inputs: &[&[u8]], private_input: &PrivateInput, + prove_function: F) -> Self + where F: Fn(&PrivateInput, &BigInt) -> (Vec, Vec) + { + // First message generation + let (first_message, state) = prove_function(private_input, &BigInt::zero()); + + // Challenge generation + let challenge = generate_challenge(public_inputs, &first_message); + + // Response generation + let response = prove_function(private_input, &challenge).1; + + NIZKProof { + first_message, + response, + } + } + + pub fn verify(&self, public_inputs: &[&[u8]], verify_function: F) -> bool + where F: Fn(&[&[u8]], &Vec, &BigInt, &Vec) -> bool + { + // Challenge reconstruction + let challenge = generate_challenge(public_inputs, &self.first_message); + + // Verification + verify_function(public_inputs, &self.first_message, &challenge, &self.response) + } +} +``` + +### 4. Proactive Security Implementation + +Implement the key refresh mechanism for proactive security: + +```rust +pub struct KeyShare { + pub party_id: usize, + pub threshold: usize, + pub epoch: u64, + pub secret_share: BigInt, + pub public_key: Point, + pub verification_shares: Vec, +} + +impl KeyShare { + // Generate refresh shares for proactive security + pub fn generate_refresh_shares(&self, threshold: usize) -> Vec { + // Generate polynomial with constant term = secret_share + let polynomial = generate_random_polynomial(threshold - 1, self.secret_share.clone()); + + // Evaluate polynomial at points corresponding to party IDs + let mut shares = Vec::with_capacity(threshold); + for i in 1..=threshold { + shares.push(evaluate_polynomial(&polynomial, &BigInt::from(i))); + } + + shares + } + + // Update share with refresh shares + pub fn refresh(&mut self, refresh_shares: &[BigInt], epoch: u64) -> Self { + // Sum up the refresh shares + let new_share = self.secret_share.clone(); + for share in refresh_shares { + new_share = (new_share + share) % curve_order(); + } + + // Create updated key share + KeyShare { + party_id: self.party_id, + threshold: self.threshold, + epoch, + secret_share: new_share, + public_key: self.public_key.clone(), + verification_shares: self.verification_shares.clone(), + } + } +} +``` + +### 5. Identifiable Abort Mechanism + +Implement the accountability mechanism for identifiable abort: + +```rust +pub enum ComplaintType { + InvalidRangeProof, + InvalidAffineOperation, + InvalidMaskedInput, + InvalidSignatureShare, + InconsistentBroadcast, +} + +pub struct ComplaintEvidence { + pub complaint_type: ComplaintType, + pub round: usize, + pub related_message: Vec, + pub expected_value: Option>, + pub verification_data: Vec, +} + +pub fn verify_complaint( + complaint: &Complaint, + protocol_transcript: &ProtocolTranscript, + public_data: &PublicData +) -> bool { + match complaint.evidence.complaint_type { + ComplaintType::InvalidRangeProof => { + // Verify the range proof was indeed invalid + verify_invalid_range_proof(&complaint.evidence, protocol_transcript) + }, + ComplaintType::InvalidAffineOperation => { + // Verify the affine operation was indeed invalid + verify_invalid_affine_operation(&complaint.evidence, protocol_transcript) + }, + // Other complaint types + _ => false + } +} + +pub fn identify_malicious_parties( + protocol_transcript: &ProtocolTranscript, + public_data: &PublicData +) -> Vec { + let mut malicious_parties = Vec::new(); + + // Check for inconsistent broadcasts + for party_id in 0..public_data.num_parties { + if has_inconsistent_broadcast(party_id, protocol_transcript) { + malicious_parties.push(party_id); + } + } + + // Check for invalid proofs + for party_id in 0..public_data.num_parties { + if has_invalid_proofs(party_id, protocol_transcript) { + malicious_parties.push(party_id); + } + } + + // Return the list of identified malicious parties + malicious_parties +} +``` + +## Node.js Integration + +Since your bridge uses Node.js for the MPC node application, you'll need to integrate the Rust implementation with Node.js. Here's a sample implementation: + +```typescript +// src/mpc/cggmp21.ts + +import { execFile } from 'child_process'; +import { promisify } from 'util'; +import * as fs from 'fs'; +import * as path from 'path'; + +const execFileAsync = promisify(execFile); + +export interface CGGMP21Options { + partyId: number; + threshold: number; + totalParties: number; + keySharePath: string; + sessionId: string; +} + +export class CGGMP21Protocol { + private options: CGGMP21Options; + private binPath: string; + + constructor(options: CGGMP21Options) { + this.options = options; + this.binPath = path.join(__dirname, '../../bin'); + } + + async generateKeys(): Promise { + const { stdout } = await execFileAsync( + path.join(this.binPath, 'cggmp21_keygen_client'), + [ + this.options.partyId.toString(), + this.options.threshold.toString(), + this.options.totalParties.toString() + ] + ); + + // Parse and save key share + const keySharePath = path.join(this.options.keySharePath, `key_share_${this.options.partyId}.json`); + fs.writeFileSync(keySharePath, stdout); + + return keySharePath; + } + + async refreshKeys(epoch: number): Promise { + const keySharePath = path.join(this.options.keySharePath, `key_share_${this.options.partyId}.json`); + const keyShare = fs.readFileSync(keySharePath, 'utf8'); + + const { stdout } = await execFileAsync( + path.join(this.binPath, 'cggmp21_refresh_client'), + [ + this.options.partyId.toString(), + this.options.threshold.toString(), + this.options.totalParties.toString(), + epoch.toString() + ], + { input: keyShare } + ); + + // Parse and save refreshed key share + const newKeySharePath = path.join(this.options.keySharePath, `key_share_${this.options.partyId}_${epoch}.json`); + fs.writeFileSync(newKeySharePath, stdout); + + return newKeySharePath; + } + + async generatePresignData(): Promise { + const keySharePath = path.join(this.options.keySharePath, `key_share_${this.options.partyId}.json`); + const keyShare = fs.readFileSync(keySharePath, 'utf8'); + + const { stdout } = await execFileAsync( + path.join(this.binPath, 'cggmp21_presign_client'), + [ + this.options.partyId.toString(), + this.options.threshold.toString(), + this.options.totalParties.toString(), + this.options.sessionId + ], + { input: keyShare } + ); + + // Parse and save presign data + const presignPath = path.join(this.options.keySharePath, `presign_${this.options.sessionId}_${this.options.partyId}.json`); + fs.writeFileSync(presignPath, stdout); + + return presignPath; + } + + async sign(messageHash: string, presignId: string): Promise { + const presignPath = path.join(this.options.keySharePath, `presign_${presignId}_${this.options.partyId}.json`); + const presignData = fs.readFileSync(presignPath, 'utf8'); + + const { stdout } = await execFileAsync( + path.join(this.binPath, 'cggmp21_sign_client'), + [ + this.options.partyId.toString(), + messageHash + ], + { input: presignData } + ); + + // Parse signature share + const sigSharePath = path.join(this.options.keySharePath, `sig_share_${messageHash}_${this.options.partyId}.json`); + fs.writeFileSync(sigSharePath, stdout); + + return sigSharePath; + } + + static async combineSignatures(sigShares: string[]): Promise<{ r: string, s: string }> { + // Parse and combine signature shares + const shares = sigShares.map(path => JSON.parse(fs.readFileSync(path, 'utf8'))); + + // Combine the shares using the algorithm from the paper + // This is a simplified version of the actual combining algorithm + const r = shares[0].r; // r is the same for all shares + let s = BigInt(0); + + for (const share of shares) { + s = (s + BigInt(share.s_share)) % BigInt("0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141"); + } + + return { + r: r.toString(16), + s: s.toString(16) + }; + } +} +``` + +## Example Usage in Bridge Application + +Here's how you might integrate CGGMP21 into your bridge application: + +```typescript +// src/bridge/teleport.ts + +import { CGGMP21Protocol } from '../mpc/cggmp21'; +import { ethers } from 'ethers'; +import { getNetworkChainId, getNetworkRPC } from '../config/networks'; + +export async function approveTransfer( + fromChainId: string, + toChainId: string, + txHash: string, + recipient: string, + amount: string, + tokenAddress: string +): Promise { + try { + // 1. Get MPC configuration + const partyId = parseInt(process.env.PARTY_ID || '0'); + const threshold = parseInt(process.env.THRESHOLD || '2'); + const totalParties = parseInt(process.env.TOTAL_PARTIES || '3'); + const keySharePath = process.env.KEY_SHARE_PATH || './keyshares'; + + // 2. Create session ID from transaction data + const sessionId = ethers.utils.keccak256( + ethers.utils.defaultAbiCoder.encode( + ['string', 'string', 'string', 'string', 'string'], + [fromChainId, toChainId, txHash, recipient, amount, tokenAddress] + ) + ); + + // 3. Initialize CGGMP21 protocol + const protocol = new CGGMP21Protocol({ + partyId, + threshold, + totalParties, + keySharePath, + sessionId + }); + + // 4. Generate presign data (can be done ahead of time) + const presignPath = await protocol.generatePresignData(); + console.log(`Generated presign data at ${presignPath}`); + + // 5. Create message hash + const messageHash = ethers.utils.keccak256( + ethers.utils.defaultAbiCoder.encode( + ['string', 'string', 'string', 'string', 'string'], + [toChainId, txHash, tokenAddress, amount, recipient] + ) + ).slice(2); // Remove '0x' prefix + + // 6. Sign the message hash using presign data + const sigSharePath = await protocol.sign(messageHash, sessionId); + console.log(`Generated signature share at ${sigSharePath}`); + + // 7. Collect signature shares from all parties + // This would typically be done through an API or message queue + const allSigShares = await collectSignatureShares(sessionId, messageHash); + + // 8. Combine signature shares + const signature = await CGGMP21Protocol.combineSignatures(allSigShares); + + // 9. Create the complete signature + const sig = `0x${signature.r}${signature.s}27`; // Add '27' as v (recovery id) + + // 10. Submit the signature to the destination chain + const destinationProvider = new ethers.providers.JsonRpcProvider(getNetworkRPC(toChainId)); + const teleportContract = new ethers.Contract(getTeleportAddress(toChainId), TELEPORT_ABI, destinationProvider); + + const wallet = new ethers.Wallet(process.env.PRIVATE_KEY || '', destinationProvider); + const tx = await teleportContract.connect(wallet).executeTransfer( + fromChainId, + txHash, + tokenAddress, + amount, + recipient, + sig + ); + + return tx.hash; + } catch (error) { + console.error('Error approving transfer:', error); + throw error; + } +} + +async function collectSignatureShares(sessionId: string, messageHash: string): Promise { + // Implementation to collect signature shares from all parties + // This could use an API, message queue, or direct communication + + // For demonstration purposes, assume we have paths to all shares + return [ + `./keyshares/sig_share_${messageHash}_0.json`, + `./keyshares/sig_share_${messageHash}_1.json`, + `./keyshares/sig_share_${messageHash}_2.json`, + ]; +} + +function getTeleportAddress(chainId: string): string { + // Get teleport contract address for the given chain ID + const teleportAddresses: Record = { + '1': '0x1234...', // Ethereum + '56': '0x5678...', // BSC + // Other chains + }; + + return teleportAddresses[chainId] || ''; +} +``` + +## Security Considerations + +When implementing CGGMP21, keep these security considerations in mind: + +1. **Presignature Data Management**: + - Presignature data must be securely stored and erased immediately after use + - Each presignature must be used exactly once + - During key refresh, all unused presignatures must be discarded + +2. **Key Share Protection**: + - Key shares must be stored in secure, encrypted storage + - Memory protections should be applied to prevent leakage + - Regular key refreshes must be performed even if no compromise is suspected + +3. **Network Security**: + - All communication must be encrypted and authenticated + - Implement protection against network-level attackers + - Consider using dedicated, private network links between MPC nodes + +4. **Implementation Security**: + - Avoid timing side channels in cryptographic operations + - Implement constant-time operations for sensitive computations + - Careful validation of all protocol messages and parameters + +5. **Operational Security**: + - Regular security audits of the implementation + - Monitoring for suspicious activity + - Incident response plan for compromised nodes + +## Performance Optimizations + +To improve performance of your CGGMP21 implementation: + +1. **Batch Processing**: + - Generate multiple presignatures in parallel + - Combine zero-knowledge proofs where possible to reduce overhead + - Implement vectorized operations for cryptographic primitives + +2. **Efficient Implementations**: + - Use optimized libraries for elliptic curve operations + - Consider hardware acceleration where available + - Implement modular exponentiation with Montgomery multiplication + +3. **Protocol-Level Optimizations**: + - Use preprocessing for expensive zero-knowledge proofs + - Precompute fixed-base exponentiations + - Optimize the number of modular multiplications in proof verification + +## Conclusion and Next Steps + +Implementing the CGGMP21 protocol for the Lux.Network bridge represents a significant improvement over your current CGGMP20 implementation, providing enhanced security, efficiency, and user experience, particularly for cross-chain transfers requiring cold wallets or non-interactive signing. + +### Implementation Roadmap + +1. **Phase 1**: Develop core Rust implementation of CGGMP21 +2. **Phase 2**: Create Node.js bindings and integration +3. **Phase 3**: Testing on testnet environments +4. **Phase 4**: Security audit and performance optimization +5. **Phase 5**: Production deployment and monitoring + +### Alternative Considerations + +While this guide focuses on CGGMP21, your documentation mentions considering DKLs23 as an alternative. DKLs23 offers computational efficiency advantages but has fewer proven security features like proactive security and identifiable abort. If computational efficiency is a critical constraint, a hybrid approach could be considered, using CGGMP21 for high-security operations and DKLs23 for more routine, high-volume scenarios. + +The implementation strategy outlined in this guide leverages your existing infrastructure and codebase organization while introducing the significant security and functionality improvements of CGGMP21. diff --git a/docs/dkls23-notes.md b/docs/dkls23-notes.md new file mode 100644 index 00000000..f6296d73 --- /dev/null +++ b/docs/dkls23-notes.md @@ -0,0 +1,166 @@ +# Analysis of DKLs23 + +Utila chose the DKLs23 protocol for their MPC-ECDSA implementation primarily due to its minimal security assumptions, requiring nothing beyond what's already assumed for standard ECDSA, and its efficient three-round communication pattern. This selection offers valuable insights for Lux Network's bridge implementation, as both projects need secure, efficient cross-chain transaction signing. DKLs23's advantages over homomorphic encryption-based alternatives like CGGMP20 include dramatically faster computation (often 2-3 orders of magnitude) and fewer cryptographic assumptions, making it ideal for resource-constrained environments like mobile devices. For Lux Network's unified MPC library supporting both ECDSA and EdDSA signatures, a modular architecture with protocol-specific modules, unified key management, and clear separation between cryptographic primitives provides the most effective approach. + +## Utila's protocol selection process reveals cross-chain priorities + +Utila's approach to MPC-ECDSA implementation focuses on distributing private keys between Utila and clients to eliminate single points of failure. Their selection process prioritized security above all other considerations, followed by efficiency metrics including computational complexity, communication overhead, and round complexity. + +When evaluating protocols, Utila compared two primary families: those based on linear homomorphic encryption (like CGGMP20) and those based on Oblivious Transfers (like DKLs19 and DKLs23). They ultimately selected DKLs23 for several compelling reasons: + +**Minimal security assumptions** proved decisive in Utila's selection. DKLs23 requires no additional cryptographic assumptions beyond what's already needed for standard ECDSA. In contrast, CGGMP20 relies on additional assumptions including "strong RSA," "semantic security of Paillier encryption," and "an enhanced variant of existential unforgeability of ECDSA." + +**Round efficiency** was another critical factor, with DKLs23 requiring only 3 communication rounds compared to DKLs19's 5 rounds. Utila identified this as "the most important efficiency factor when dealing with high-latency networks" – a common consideration for global blockchain applications. + +Before implementation, Utila's cryptography team "thoroughly reviewed and re-validated the security proof of DKLs23" and even "provided an independent proof of security," demonstrating their commitment to security verification before production deployment. + +## Protocol comparison reveals stark performance differences + +The three protocols – CGGMP20, DKLs19, and DKLs23 – represent different approaches to solving the challenge of MPC-ECDSA implementation, with significant differences in performance, security properties, and implementation complexity. + +### Performance characteristics + +| Protocol | Communication Rounds | Computational Complexity | Message Complexity | +|----------|----------------------|--------------------------|-------------------| +| CGGMP20 | 4 rounds total | High (Paillier operations) | O(n²) for identifiable abort | +| DKLs19 | log(t) + 6 rounds | Lower than CGGMP20 | Lower than CGGMP20 | +| DKLs23 | 3 rounds total | Similar to DKLs19 | Lower than both previous protocols | + +**Round complexity** shows a clear advantage for DKLs23 with just 3 communication rounds, compared to 4 rounds for CGGMP20 and log(t)+6 rounds for DKLs19 (where t is the threshold). This difference becomes especially important for high-latency networks where each round adds significant delay. + +**Computational efficiency** heavily favors both DKLs protocols. CGGMP20 requires expensive Paillier operations and zero-knowledge proofs, while DKLs protocols rely primarily on hashing operations and are often 2-3 orders of magnitude faster in practice. + +### Security properties + +All three protocols provide security in the Universal Composability (UC) framework against malicious adversaries with dishonest majority, but their security assumptions differ significantly: + +- **CGGMP20** requires Strong RSA, Decisional Diffie-Hellman (DDH), semantic security of Paillier encryption, and an enhanced variant of ECDSA unforgeability +- **DKLs19** requires only the Computational Diffie-Hellman (CDH) Assumption in the Global Random Oracle model +- **DKLs23** is information-theoretically UC-secure, requiring only ideal commitment and two-party multiplication primitives + +**Advanced security features** vary between protocols. CGGMP20 provides proactive security with periodic key refresh and identifiable abort to identify malicious parties. The DKLs protocols focus on minimal assumptions and efficiency but may require extensions for similar advanced features. + +### Implementation complexity + +**Implementation difficulty** is highest for CGGMP20 due to complex zero-knowledge proofs and Paillier key generation. DKLs protocols are generally simpler to implement and maintain, with DKLs23 offering a particularly streamlined key generation procedure using a commit-release-and-complain approach. + +**Platform requirements** also differ significantly. CGGMP20 is more resource-intensive and may struggle on low-power devices, while both DKLs protocols can run efficiently on standard hardware and even smartphones. + +## Linear homomorphic encryption vs. oblivious transfers: tradeoffs impact deployment + +The fundamental technical difference between these protocol families revolves around how they implement secure multiplication, a core operation required for MPC-ECDSA due to ECDSA's non-linear signing equation. + +### Linear homomorphic encryption (LHE) approach + +Protocols like CGGMP20 use Paillier cryptosystem, which enables: +- Additive homomorphism: ability to compute an encryption of m₁+m₂ directly from encryptions of m₁ and m₂ +- Implementation using large modulus integers based on factoring-based cryptography +- Verification through extensive zero-knowledge proofs + +### Oblivious transfer (OT) approach + +Protocols like DKLs19 and DKLs23 use OT, where: +- A sender with two messages m₀, m₁ and a receiver with choice bit b interact such that the receiver gets m_b without learning m_{1-b}, and the sender doesn't learn b +- Implementation leverages OT extension to efficiently generate many OTs +- Verification uses simpler statistical consistency checks instead of zero-knowledge proofs + +### Performance tradeoffs between approaches + +**Computational requirements** heavily favor OT-based approaches, which are typically 2-3 orders of magnitude faster than LHE-based protocols. LHE requires expensive modular exponentiations with large integers and zero-knowledge proofs, while OT uses mostly hash functions. + +**Bandwidth usage** favors LHE-based approaches, which typically have lower communication complexity. OT protocols require more data transmission but compensate with dramatically faster computation. + +**Implementation complexity** is generally lower for OT-based protocols, which use the same elliptic curve and hash functions as ECDSA itself. LHE-based approaches require separate cryptographic primitives including safe biprimes, which are resource-intensive to generate. + +### When to choose which approach + +For a bridge implementation like Lux Network's, the choice depends on specific deployment characteristics: + +- **OT-based approaches** (DKLs23) provide better performance for resource-constrained devices and environments where computation is more limited than bandwidth +- **LHE-based approaches** (CGGMP20) may be preferred in bandwidth-constrained environments or when advanced features like proactive security with identifiable abort are essential + +## Utila's selection informs Lux Network's implementation strategy + +Utila's rationale for selecting DKLs23 offers several valuable insights for Lux Network's MPC bridge implementation: + +### Priority alignment for bridge requirements + +Utila prioritized **security with minimal assumptions**, an approach particularly relevant for cross-chain bridges where security is paramount. By adopting protocols with fewer cryptographic assumptions, Lux Network can reduce potential attack vectors. + +**Network efficiency considerations** that drove Utila to select a protocol with minimal communication rounds apply equally to bridge implementations. Cross-chain transactions often involve high-latency communications across global networks, making DKLs23's three-round approach particularly valuable. + +### Implementation considerations + +For Lux Network, Utila's focus on **device compatibility** suggests a similar consideration: bridge validators and relayers may run on diverse hardware, making DKLs23's lower computational requirements advantageous. + +Utila's implementation includes **separated offline and online phases**, enabling "the online phase to be as quick as sending a single [message]." This approach could significantly improve bridge transaction throughput by moving preprocessing work offline. + +Utila's emphasis on a **comprehensive security model** beyond just the cryptographic protocol provides a template for Lux Network. This includes device management, administrator approvals, and recovery mechanisms – all crucial for bridge security. + +## Best practices for a unified MPC library supporting ECDSA and EdDSA + +Implementing a unified MPC library supporting both ECDSA (for EVM chains) and EdDSA (for non-EVM chains like Solana) presents several challenges and opportunities: + +### Architectural approaches + +**Modular architecture** provides the most effective framework for supporting multiple signature schemes: +- Core cryptographic layer with shared primitives (hash functions, random number generation) +- Protocol-specific modules for ECDSA and EdDSA +- Common interface abstracting underlying signature differences + +**Unified key management** simplifies cross-chain operations: +- Common distributed key generation (DKG) mechanism +- Single secure storage system with appropriate metadata +- Unified HD key derivation across schemes + +### Technical challenges + +The fundamental challenge stems from the **different mathematical structures** of the signature schemes: +- ECDSA has a non-linear signing equation requiring specialized protocols +- EdDSA (based on Schnorr signatures) has a linear structure making implementation more straightforward + +**Curve compatibility** issues arise from the different elliptic curves: +- ECDSA typically uses secp256k1 (Bitcoin, Ethereum) or NIST P-256 +- EdDSA uses edwards25519 curve (Solana, Cardano, Stellar) or edwards448 + +### Performance optimizations + +**Offline/online protocol separation** offers significant performance benefits: +- Compute-intensive preprocessing done before transaction signing +- Fast execution when a signature is actually required +- Both DKLs23 and CGGMP20 protocols support this approach + +**Batching optimizations** can dramatically improve throughput: +- Batch range proofs improve ECDSA performance by 2.0-2.4x in bandwidth and 1.5-2.1x in computation +- Vectorized multiplication in DKLs23 enhances performance for multiple signatures + +## Recent DKLs23 developments show continued innovation + +Since Utila's implementation, DKLs23 has continued to mature and gain adoption: + +### Protocol innovations + +Published in IEEE Symposium on Security and Privacy 2024, DKLs23 provides: +- **Three-round efficiency** (reduced from 5 rounds in DKLs19) +- **Information-theoretic security** with UC-security assuming only ideal commitment and two-party multiplication primitives +- **Simplified security assumptions** relying only on the same assumptions as ECDSA itself + +### Production implementations + +Several major organizations have implemented DKLs23: +- **Utila**: Selected after comparing with other protocols +- **BlockDaemon**: Implementing alongside other DKLs protocols +- **Copper**: Using for their MPC implementation +- **0xPass**: Passport Protocol highlights its performance advantages + +### Future directions + +Recent research building on DKLs23 shows continued innovation: +- **RompSig**: A robust threshold ECDSA scheme matching DKLs23's three rounds while adding robustness against misbehaving parties +- **Batch range proofs**: New techniques for improving efficiency in threshold ECDSA implementations + +## Conclusion + +For Lux Network's bridge implementation, DKLs23 offers compelling advantages for the ECDSA component of a unified MPC library. Its minimal round complexity, lower computational requirements, and fewer security assumptions make it well-suited for cross-chain applications where security and performance are equally critical. + +When building a unified MPC library supporting both ECDSA and EdDSA, a modular architecture with protocol-specific modules and unified key management provides the most effective approach. By implementing appropriate performance optimizations like offline/online separation and batching, Lux Network can create a high-performance bridge solution supporting both EVM and non-EVM chains. diff --git a/docs/eddsa-guide.md b/docs/eddsa-guide.md new file mode 100644 index 00000000..52d8e3f2 --- /dev/null +++ b/docs/eddsa-guide.md @@ -0,0 +1,1036 @@ +# Implementing Dual-Signature Support for Lux.Network Bridge + +This guide documents the implementation of dual-signature support (ECDSA and EdDSA) for the Lux.Network bridge, enabling cross-chain transfers between EVM-compatible chains and Solana. + +## Table of Contents + +1. [Overview](#overview) +2. [Architecture](#architecture) +3. [Implementation Steps](#implementation-steps) +4. [Configuration](#configuration) +5. [Key Generation](#key-generation) +6. [Signature Verification](#signature-verification) +7. [Troubleshooting](#troubleshooting) +8. [References](#references) + +## Overview + +The Lux.Network bridge uses Multi-Party Computation (MPC) to enable secure, cross-chain asset transfers. The original implementation supported only ECDSA signatures (used by Ethereum and other EVM chains). This update adds support for EdDSA/Ed25519 signatures (used by Solana), allowing the bridge to connect to more blockchains while maintaining security. + +### Key Features + +- **Dual-signature support**: ECDSA for EVM chains and EdDSA (Ed25519) for Solana +- **External repositories**: Referencing [luxfi/multi-party-ecdsa](https://github.com/luxfi/multi-party-ecdsa) and [luxfi/multi-party-eddsa](https://github.com/luxfi/multi-party-eddsa) +- **Dynamic signature selection**: Automatic selection based on destination chain +- **Chain-specific configuration**: Flexible framework for supporting additional chains + +## Architecture + +The dual-signature MPC bridge architecture consists of several key components: + +1. **Docker Container**: A unified container that builds and provides both signature implementations +2. **Signature Scheme Detection**: Maps chain IDs to appropriate signature schemes +3. **Unified API**: Consistent Node.js interface for both signature types +4. **Chain-specific Configuration**: Settings for each supported blockchain network + +### Signature Flow + +``` +┌───────────┐ ┌───────────────┐ ┌──────────────────┐ +│ User │ │ Destination │ │ Signature Type │ +│ Request │────▶│ Chain ID │────▶│ Detection │ +└───────────┘ └───────────────┘ └──────────────────┘ + │ + ┌──────────────────┐ │ + │ Signature │◀───────┘ + │ Generation │ + └──────────────────┘ + │ + ┌───────────────┴───────────────┐ + │ │ +┌─────────▼────────┐ ┌─────────▼────────┐ +│ ECDSA Process │ │ EdDSA Process │ +│ (EVM Chains) │ │ (Solana) │ +└──────────────────┘ └──────────────────┘ +``` + +## Implementation Steps + +### 1. Updated Dockerfile for MPC Node + +The updated Dockerfile clones both MPC repositories instead of embedding them: + +```dockerfile +# Use Rust as the base image +FROM rust:latest AS rust_builder + +# Set the working directory +WORKDIR /app + +# Clone the external MPC repositories instead of copying them +RUN apt-get update && apt-get install -y git pkg-config libssl-dev && rm -rf /var/lib/apt/lists/* + +# Clone the ECDSA repository +RUN git clone https://github.com/luxfi/multi-party-ecdsa.git ./ecdsa + +# Clone the EdDSA repository +RUN git clone https://github.com/luxfi/multi-party-eddsa.git ./eddsa + +# Install nightly version of Rust and set it as the default toolchain +RUN rustup install nightly +RUN rustup default nightly + +# Ensure the nightly toolchain is being used +RUN rustc --version + +# Build the ECDSA library +WORKDIR /app/ecdsa +RUN cargo +nightly build --release --examples + +# Build the EdDSA library +WORKDIR /app/eddsa +RUN cargo +nightly build --release --examples + +# Use Node.js for the final image +FROM node:20 + +# Set working directory in Node container +WORKDIR /app + +COPY ./common/node . + +# Install Node.js dependencies +RUN npm install + +# Build node app +RUN npm run build + +# Create multiparty directory structure +RUN mkdir -p ./dist/multiparty/ecdsa ./dist/multiparty/eddsa + +# Copy the built ECDSA Rust binaries and examples +COPY --from=rust_builder /app/ecdsa/target/release/examples ./dist/multiparty/ecdsa/target/release/examples +COPY --from=rust_builder /app/ecdsa/target/release/deps ./dist/multiparty/ecdsa/target/release/deps + +# Copy the built EdDSA Rust binaries and examples +COPY --from=rust_builder /app/eddsa/target/release/examples ./dist/multiparty/eddsa/target/release/examples +COPY --from=rust_builder /app/eddsa/target/release/deps ./dist/multiparty/eddsa/target/release/deps + +EXPOSE 6000 + +# Command to run the application +CMD ["node", "dist/node.js"] +``` + +### 2. Updated Docker-compose.yaml + +The Docker-compose file includes new environment variables for signature scheme configuration: + +```yaml +services: + sm-manager: + build: + context: . + dockerfile: ./services/sm-manager + ports: + - 8000:8000 + networks: + - lux-network + deploy: + replicas: 1 + restart_policy: + condition: on-failure + mpc-node: + build: + context: . + dockerfile: ./services/mpc-node + environment: + - NODE_ENV= + - smTimeOutBound= + - sign_client_name= + - node_number= + - sign_sm_manager= + - PORT= + - POSTGRESQL_URL= + # New environment variables for signature scheme selection + - ECDSA_CLIENT_NAME=gg18_sign_client + - ECDSA_SM_MANAGER=gg18_sm_manager + - EDDSA_CLIENT_NAME=frost_sign_client + - EDDSA_SM_MANAGER=frost_sm_manager + - DEFAULT_SIGNATURE_SCHEME=ecdsa + ports: + - 6000:6000 + networks: + - lux-network + deploy: + replicas: 1 + restart_policy: + condition: on-failure +networks: + lux-network: + driver: bridge +``` + +### 3. Updated Types for Dual Signatures + +Updated types.ts for dual-signature support: + +```typescript +import Web3 from "web3" +import { RegisteredSubscription } from "web3/lib/commonjs/eth.exports" + +export type CONTRACTS = { + [key: string]: string +} + +export type SETTINGS = { + RPC: string[] + LuxETH: CONTRACTS + LuxBTC: CONTRACTS + WSHM: CONTRACTS + Teleporter: CONTRACTS + NetNames: { + [key: string]: string + } + DB: string + Msg: string + DupeListLimit: string + SMTimeout: number + NewSigAllowed: boolean + SigningManagers: string[] + KeyStore: string +} + +// Enum for signature schemes +export enum SignatureScheme { + ECDSA = 'ecdsa', + EDDSA = 'eddsa' +} + +// Signing request interface +export type SIGN_REQUEST = { + tokenAmount: string + web3Form: Web3 + vault: boolean + decimals: number + receiverAddressHash: string + toNetworkId: string + toTokenAddress: string + hashedTxId: string + nonce: string + // Optional signature scheme to use + signatureScheme?: SignatureScheme +} + +// Network configuration with signature scheme +export type NETWORK_CONFIG = { + display_name: string + internal_name: string + is_testnet: boolean + chain_id: string + teleporter: string + vault: string + node: string + currencies: TOKEN[] + // Signature scheme to use for this network + signature_scheme?: SignatureScheme +} + +// Token configuration +export type TOKEN = { + name: string + asset: string + contract_address: null | string + decimals: number + is_native: boolean +} +``` + +### 4. Updated Utility Functions + +The `utils.ts` file has been updated to support both signature schemes: + +```typescript +import Web3 from "web3" +import dotenv from "dotenv" +import find from "find-process" +import { RegisteredSubscription } from "web3/lib/commonjs/eth.exports" +import { recoverAddress } from "ethers" +import { promisify } from "util" +import { exec as childExec } from "child_process" +import { settings } from "./config" +import { SIGN_REQUEST } from "./types" + +const exec = promisify(childExec) +dotenv.config() + +// Signature scheme enum +enum SignatureScheme { + ECDSA = 'ecdsa', + EDDSA = 'eddsa' +} + +// Default signature scheme from environment +const DEFAULT_SIGNATURE_SCHEME = (process.env.DEFAULT_SIGNATURE_SCHEME || 'ecdsa').toLowerCase() as SignatureScheme + +// Client and manager names for different signature schemes +const SIGNATURE_CONFIG = { + [SignatureScheme.ECDSA]: { + clientName: process.env.ECDSA_CLIENT_NAME || process.env.sign_client_name, + smManager: process.env.ECDSA_SM_MANAGER || process.env.sign_sm_manager, + directory: 'ecdsa' + }, + [SignatureScheme.EDDSA]: { + clientName: process.env.EDDSA_CLIENT_NAME || 'frost_sign_client', + smManager: process.env.EDDSA_SM_MANAGER || 'frost_sm_manager', + directory: 'eddsa' + } +} + +/* SM Manager Timeout Params */ +const smTimeOutBound = Number(process.env.smTimeOutBound) + +/** key share for this node */ +const keyStore = settings.KeyStore + +/** + * Map chain IDs to signature schemes + * Defaults to ECDSA for backward compatibility + */ +const CHAIN_SIGNATURE_SCHEMES: Record = { + // Default EVM chains use ECDSA + "1": SignatureScheme.ECDSA, // Ethereum + "56": SignatureScheme.ECDSA, // BSC + "137": SignatureScheme.ECDSA, // Polygon + "43114": SignatureScheme.ECDSA, // Avalanche + // Solana uses EdDSA + "SOL-MAINNET": SignatureScheme.EDDSA, + "SOL-DEVNET": SignatureScheme.EDDSA, + // Add other EdDSA chains as needed +} + +/** + * Get signature scheme for a chain + * @param chainId Chain ID + * @returns Signature scheme to use + */ +export const getSignatureSchemeForChain = (chainId: string): SignatureScheme => { + return CHAIN_SIGNATURE_SCHEMES[chainId] || DEFAULT_SIGNATURE_SCHEME +} + +/** + * Kill a signer process + * @param signerProc Process ID + */ +const killSigner = async (signerProc: string) => { + try { + console.log("::Killing Signer..") + const cmd = "kill -9 " + signerProc + const out = await exec(cmd) + console.log("::Signer dead...", out) + } catch (e) { + console.log("::Signer process already dead:", e) + } +} + +/** + * get WEB3 object by given network's rpc url + * @param rpcUrl + * @returns + */ +export const getWeb3FormForRPC = (rpcUrl: string) => { + try { + const _web3 = new Web3(new Web3.providers.HttpProvider(rpcUrl)) + return _web3 + } catch (err) { + return null + } +} + +/** + * kill all running signers for all signature schemes + */ +export const killSigners = async () => { + try { + // Kill ECDSA signers + const ecdsaConfig = SIGNATURE_CONFIG[SignatureScheme.ECDSA] + const ecdsaList = await find("name", `${ecdsaConfig.clientName} ${ecdsaConfig.smManager}`) + if (ecdsaList.length > 0) { + for (const p of ecdsaList) { + await killSigner(String(p.pid)) + } + } + + // Kill EdDSA signers + const eddsaConfig = SIGNATURE_CONFIG[SignatureScheme.EDDSA] + const eddsaList = await find("name", `${eddsaConfig.clientName} ${eddsaConfig.smManager}`) + if (eddsaList.length > 0) { + for (const p of eddsaList) { + await killSigner(String(p.pid)) + } + } + } catch (err) { + console.log("::killSignersError:", err) + } +} + +/** + * generate signature using the appropriate scheme + * @param msgHash Message hash to sign + * @param scheme Signature scheme to use + * @returns Signature components + */ +export const signClient = async (msgHash: string, scheme: SignatureScheme = DEFAULT_SIGNATURE_SCHEME) => { + return new Promise(async (resolve, reject) => { + try { + const config = SIGNATURE_CONFIG[scheme] + console.log(`========================================================= In ${scheme.toUpperCase()} Sign Client ============================================================`) + + const list = await find("name", `${config.clientName} ${config.smManager}`) + if (list.length > 0) { + console.log("::clientAlreadyRunning:::", list) + try { + const x = list.length === 1 ? 0 : 1 + const uptimeCmd = "ps -p " + list[x].pid + " -o etime" + const uptimeOut = await exec(uptimeCmd) + const upStdout = uptimeOut.stdout + const upStderr = uptimeOut.stderr + + if (upStdout) { + const up = upStdout.split("\n")[1].trim().split(":") + console.log("::upStdout:", up, "Time Bound:", smTimeOutBound) + const upStdoutArr = up + // SM Manager timed out + if (Number(upStdoutArr[upStdoutArr.length - 1]) >= smTimeOutBound) { + console.log("::SM Manager signing timeout reached") + try { + for (const p of list) { + await killSigner(String(p.pid)) + } + const cmd = `./target/release/examples/${config.clientName} ${config.smManager} ${keyStore} ${msgHash}` + await exec(cmd, { cwd: __dirname + `/multiparty/${config.directory}`, shell: "/bin/bash" }) + } catch (err) { + console.log("::Partial signature process may not have exited:", err) + resolve(signClient(msgHash, scheme)) + return + } + } else { + // Retry with same scheme + resolve(signClient(msgHash, scheme)) + return + } + } else { + console.log("::upStderr:", upStderr) + reject("::SignerDeadError2:" + upStderr) + return + } + } catch (err) { + console.log("::SignerDeadError3:", err) + reject("SignerDeadError3:" + err) + return + } + } else { + console.log("About to message signers...") + try { + //Invoke client signer + console.log(`::Using ${scheme} signer: ${config.clientName} ${config.smManager}`) + const cmd = `./target/release/examples/${config.clientName} ${config.smManager} ${keyStore} ${msgHash}` + console.log("::command: ", cmd) + const out = await exec(cmd, { cwd: __dirname + `/multiparty/${config.directory}` }) + const { stdout, stderr } = out + console.log("::stdout:", stdout, stderr) + + if (stdout) { + if (scheme === SignatureScheme.ECDSA) { + // Process ECDSA signature format + const sig = stdout.split("sig_json")[1].split(",") + if (sig.length > 0) { + const r = sig[0].replace(": ", "").replace(/["]/g, "").trim() + const s = sig[1].replace(/["]/g, "").trim() + const v = Number(sig[2].replace(/["]/g, "")) === 0 ? "1b" : "1c" + let signature = "0x" + r + s + v + if (signature.length < 132) { + throw new Error("elements in xs are not pairwise distinct") + } + // Handle odd length sigs + if (signature.length % 2 != 0) { + signature = "0x0" + signature.split("0x")[1] + } + + console.log("::ECDSA Signature:", signature) + resolve({ r, s, v, signature, scheme: SignatureScheme.ECDSA }) + return + } + } else if (scheme === SignatureScheme.EDDSA) { + // Process EdDSA signature format - this will vary based on your EdDSA implementation + // The below is a placeholder and should be adjusted based on your EdDSA output format + const sigOutput = stdout.trim() + const signatureMatch = sigOutput.match(/signature: ([0-9a-fA-F]+)/) + + if (signatureMatch && signatureMatch[1]) { + const signature = "0x" + signatureMatch[1] + console.log("::EdDSA Signature:", signature) + resolve({ signature, scheme: SignatureScheme.EDDSA }) + return + } else { + reject("EdDSA signature format not recognized") + return + } + } + } else { + console.log("::stderr:" + stderr) + reject("SignerFailError1:" + stderr) + return + } + } catch (err) { + console.log("::SignerFailError2:" + err) + + if (err.toString().includes("elements in xs are not pairwise distinct")) { + await sleep(2000) + resolve(signClient(msgHash, scheme)) + return + } else { + reject("SignerFailError2: " + err) + return + } + } + } + } catch (err) { + console.log("::sign client error: =======================") + console.log(err.stack || err) + reject(err.stack) + return + } + }) +} + +/** + * sign message with appropriate signature scheme + * @param message + * @param web3 + * @param chainId + * @returns + */ +export const signMessage = async (message: string, web3: Web3, chainId?: string) => { + try { + // Determine the signature scheme to use + const scheme = chainId ? getSignatureSchemeForChain(chainId) : DEFAULT_SIGNATURE_SCHEME + + if (scheme === SignatureScheme.ECDSA) { + return signECDSAMessage(message, web3) + } else { + return signEdDSAMessage(message) + } + } catch (err) { + console.log("Error:", err) + return Promise.reject(err) + } +} + +/** + * Sign a message using ECDSA + */ +const signECDSAMessage = async (message: string, web3: Web3) => { + const myMsgHashAndPrefix = web3.eth.accounts.hashMessage(message) + const netSigningMsg = myMsgHashAndPrefix.substr(2) + + try { + const { signature, r, s, v } = (await signClient(netSigningMsg, SignatureScheme.ECDSA)) as any + let signer = "" + try { + signer = recoverAddress(myMsgHashAndPrefix, signature) + console.log("ECDSA MPC Address:", signer) + } catch (err) { + console.log("err: ", err) + } + return Promise.resolve({ signature, signer, scheme: SignatureScheme.ECDSA }) + } catch (err) { + console.log("Error:", err) + return Promise.reject("signClientError:") + } +} + +/** + * Sign a message using EdDSA + */ +const signEdDSAMessage = async (message: string) => { + // Convert message to appropriate format for EdDSA + // This might be different than ECDSA hashing + const messageBuffer = Buffer.from(message) + const messageHash = messageBuffer.toString('hex') + + try { + const { signature } = (await signClient(messageHash, SignatureScheme.EDDSA)) as any + // EdDSA doesn't use recovery, so we can't derive the public key here + // You'd need to store the public key and verify signatures differently + return Promise.resolve({ signature, signer: "", scheme: SignatureScheme.EDDSA }) + } catch (err) { + console.log("Error:", err) + return Promise.reject("signClientError:") + } +} + +/** + * Concatenate the message to be hashed. + * @param toNetworkIdHash + * @param txIdHash + * @param toTokenAddress + * @param tokenAmount + * @param decimals + * @param receiverAddressHash + * @param vault + * @returns merged msg + */ +export const concatMsg = (toNetworkIdHash: string, hashedTxId: string, toTokenAddress: string, tokenAmount: string, decimals: number, receiverAddressHash: string, vault: boolean) => { + return toNetworkIdHash + hashedTxId + toTokenAddress + tokenAmount + decimals + receiverAddressHash + vault +} + +/** + * hash tx and sign with appropriate signature scheme + * @param param0 + * @returns + */ +export const hashAndSignTx = async ({ web3Form, vault, toNetworkId, hashedTxId, toTokenAddress, tokenAmount, decimals, receiverAddressHash, nonce }: SIGN_REQUEST) => { + try { + const scheme = getSignatureSchemeForChain(toNetworkId) + const toNetworkIdHash = Web3.utils.keccak256(toNetworkId) + const toTokenAddressHash = Web3.utils.keccak256(toTokenAddress) + + // Format message based on signature scheme + if (scheme === SignatureScheme.ECDSA) { + const message = concatMsg(toNetworkIdHash, hashedTxId, toTokenAddressHash, tokenAmount, decimals, receiverAddressHash, vault) + console.log("::ECDSA message to sign: ", message) + const hash = web3Form.utils.soliditySha3(message) + const { signature, signer } = await signMessage(hash, web3Form, toNetworkId) + return Promise.resolve({ signature, mpcSigner: signer }) + } else { + // EdDSA message formatting might be different + const message = concatMsg(toNetworkIdHash, hashedTxId, toTokenAddressHash, tokenAmount, decimals, receiverAddressHash, vault) + console.log("::EdDSA message to sign: ", message) + // For EdDSA we can hash the message differently if needed + const { signature, signer } = await signMessage(message, web3Form, toNetworkId) + return Promise.resolve({ signature, mpcSigner: signer }) + } + } catch (err) { + if (err.toString().includes("invalid point")) { + hashAndSignTx({ web3Form, vault, toNetworkId, hashedTxId, toTokenAddress, tokenAmount, decimals, receiverAddressHash, nonce }) + } else { + console.log(err) + return Promise.reject(err) + } + } +} + +/** + * await for miliseconds + * @param millis + * @returns + */ +export const sleep = async (millis: number) => new Promise((resolve) => setTimeout(resolve, millis)) +``` + +### 5. Solana Network Configuration + +Add Solana configuration to the settings: + +```typescript +// Add this to your settings.ts file to support Solana and other EdDSA chains + +import { SignatureScheme } from "../types" + +// Add to the MAIN_NETWORKS array +const solanaMainnetConfig = { + display_name: "Solana", + internal_name: "SOLANA_MAINNET", + is_testnet: false, + chain_id: "SOL-MAINNET", + teleporter: "", // Replace with your Solana teleporter address + vault: "", // Replace with your Solana vault address + node: "https://api.mainnet-beta.solana.com", + signature_scheme: SignatureScheme.EDDSA, // Specify EdDSA for Solana + currencies: [ + { + name: "SOL", + asset: "SOL", + contract_address: null, + decimals: 9, + is_native: true + }, + { + name: "USDC", + asset: "USDC", + contract_address: "EPjFWdd5AufqSSqeM2qN1xzybapC8G4wEGGkZwyTDt1v", + decimals: 6, + is_native: false + }, + // Add more Solana tokens as needed + ] +} + +// Add to the TEST_NETWORKS array +const solanaDevnetConfig = { + display_name: "Solana Devnet", + internal_name: "SOLANA_DEVNET", + is_testnet: true, + chain_id: "SOL-DEVNET", + teleporter: "", + vault: "", + node: "https://api.devnet.solana.com", + signature_scheme: SignatureScheme.EDDSA, // Specify EdDSA for Solana + currencies: [ + { + name: "SOL", + asset: "SOL", + contract_address: null, + decimals: 9, + is_native: true + }, + // Add more Solana devnet tokens as needed + ] +} + +// Add these to your SWAP_PAIRS object +const solanaSwapPairs = { + // Native SOL can be swapped with wrapped SOL tokens on other chains + SOL: ["LSOL", "ZSOL"], + LSOL: ["SOL", "ZSOL"], + ZSOL: ["SOL", "LSOL"], + + // Add other token swap pairs + // ... +} + +// Export these settings to be added to your main arrays +export const NEW_NETWORKS = { + mainnet: [solanaMainnetConfig], + testnet: [solanaDevnetConfig] +} + +export const NEW_SWAP_PAIRS = solanaSwapPairs +``` + +## Configuration + +### Environment Variables + +| Variable | Description | Default Value | +|----------|-------------|---------------| +| `DEFAULT_SIGNATURE_SCHEME` | Default signature scheme when not specified | `ecdsa` | +| `ECDSA_CLIENT_NAME` | Name of ECDSA client executable | `gg18_sign_client` | +| `ECDSA_SM_MANAGER` | Name of ECDSA session manager executable | `gg18_sm_manager` | +| `EDDSA_CLIENT_NAME` | Name of EdDSA client executable | `frost_sign_client` | +| `EDDSA_SM_MANAGER` | Name of EdDSA session manager executable | `frost_sm_manager` | +| `smTimeOutBound` | Session manager timeout value | Varies | +| `node_number` | MPC node number in the network | Varies | + +### Chain-to-Signature Scheme Mapping + +```typescript +const CHAIN_SIGNATURE_SCHEMES: Record = { + // Default EVM chains use ECDSA + "1": SignatureScheme.ECDSA, // Ethereum + "56": SignatureScheme.ECDSA, // BSC + "137": SignatureScheme.ECDSA, // Polygon + "43114": SignatureScheme.ECDSA, // Avalanche + // Solana uses EdDSA + "SOL-MAINNET": SignatureScheme.EDDSA, + "SOL-DEVNET": SignatureScheme.EDDSA, + // Add other EdDSA chains as needed +} +``` + +## Key Generation + +### Key Generation Script + +This script generates keys for both signature schemes: + +```bash +#!/bin/bash +# Dual-signature key generation script for MPC nodes +# This script generates keys for both ECDSA and EdDSA signature schemes + +# Configuration +NODE_NUMBER=${NODE_NUMBER:-0} +ECDSA_CLIENT_NAME=${ECDSA_CLIENT_NAME:-gg18_keygen_client} +ECDSA_SM_MANAGER=${ECDSA_SM_MANAGER:-gg18_sm_manager} +EDDSA_CLIENT_NAME=${EDDSA_CLIENT_NAME:-frost_keygen_client} +EDDSA_SM_MANAGER=${EDDSA_SM_MANAGER:-frost_sm_manager} +THRESHOLD=${THRESHOLD:-2} +TOTAL_PARTIES=${TOTAL_PARTIES:-3} + +echo "===== MPC Key Generation for Node $NODE_NUMBER =====" +echo "- Threshold: $THRESHOLD" +echo "- Total Parties: $TOTAL_PARTIES" + +# Generate ECDSA keys +echo "===== Generating ECDSA Keys =====" +cd /app/dist/multiparty/ecdsa || exit 1 +./target/release/examples/$ECDSA_CLIENT_NAME $ECDSA_SM_MANAGER $NODE_NUMBER $THRESHOLD $TOTAL_PARTIES + +# Check if ECDSA keygen was successful +if [ $? -ne 0 ]; then + echo "❌ ECDSA key generation failed!" + exit 1 +else + echo "✅ ECDSA key generation successful!" +fi + +# Generate EdDSA keys +echo "===== Generating EdDSA Keys =====" +cd /app/dist/multiparty/eddsa || exit 1 +./target/release/examples/$EDDSA_CLIENT_NAME $EDDSA_SM_MANAGER $NODE_NUMBER $THRESHOLD $TOTAL_PARTIES + +# Check if EdDSA keygen was successful +if [ $? -ne 0 ]; then + echo "❌ EdDSA key generation failed!" + exit 1 +else + echo "✅ EdDSA key generation successful!" +fi + +echo "===== Key Generation Complete =====" +echo "✅ Both ECDSA and EdDSA keys have been generated successfully!" +echo "✅ Node is ready for signing operations." +``` + +## Signature Verification + +### Verifying EdDSA Signatures in Solana + +```rust +// Example Solana program for verifying Ed25519 signatures from your MPC system + +use solana_program::{ + account_info::{next_account_info, AccountInfo}, + entrypoint, + entrypoint::ProgramResult, + msg, + program_error::ProgramError, + pubkey::Pubkey, + ed25519_program, +}; + +// Entry point for the Solana program +entrypoint!(process_instruction); + +// Process instruction logic +pub fn process_instruction( + program_id: &Pubkey, + accounts: &[AccountInfo], + instruction_data: &[u8], +) -> ProgramResult { + msg!("Processing Bridge instruction"); + + // Get account iterator + let accounts_iter = &mut accounts.iter(); + + // Get accounts + let bridge_account = next_account_info(accounts_iter)?; + let payer = next_account_info(accounts_iter)?; + + // Verify the bridge account is owned by this program + if bridge_account.owner != program_id { + return Err(ProgramError::IncorrectProgramId); + } + + // Parse instruction type + if instruction_data.len() < 4 { + return Err(ProgramError::InvalidInstructionData); + } + + // Read instruction type from first byte + let instruction_type = instruction_data[0]; + + match instruction_type { + // Process token bridge with Ed25519 signature verification + 0 => process_bridge_tokens(accounts, &instruction_data[1..]), + + // Other instruction types + _ => { + msg!("Invalid instruction type"); + Err(ProgramError::InvalidInstructionData) + } + } +} + +// Process token bridge with Ed25519 signature verification +fn process_bridge_tokens(accounts: &[AccountInfo], data: &[u8]) -> ProgramResult { + msg!("Processing bridge token request"); + + // Get account iterator + let accounts_iter = &mut accounts.iter(); + + // Skip bridge account that was already processed + let _bridge_account = next_account_info(accounts_iter)?; + let _payer = next_account_info(accounts_iter)?; + + // Get ed25519 program account for signature verification + let ed25519_program_id = next_account_info(accounts_iter)?; + + // Ensure we're using the correct program + if *ed25519_program_id.key != ed25519_program::id() { + return Err(ProgramError::InvalidArgument); + } + + // Parse data + if data.len() < 32 + 64 + 32 { + msg!("Data too short for signature verification"); + return Err(ProgramError::InvalidInstructionData); + } + + // Extract public key, signature, and message from data + let public_key = &data[0..32]; + let signature = &data[32..96]; + let message = &data[96..]; + + // Verify the Ed25519 signature + let signature_valid = ed25519_program::verify_signature( + public_key, + message, + signature, + ); + + if !signature_valid { + msg!("Invalid Ed25519 signature"); + return Err(ProgramError::InvalidArgument); + } + + msg!("Signature verification successful"); + + // Continue with token bridging logic + // ... + + Ok(()) +} + +#[cfg(test)] +mod tests { + use super::*; + + #[test] + fn test_verify_ed25519_signature() { + // Test signature verification logic + // ... + } +} +``` + +### Verifying ECDSA Signatures in EVM Chains + +```solidity +// SPDX-License-Identifier: MIT +pragma solidity ^0.8.0; + +/** + * @title Bridge Signature Verifier + * @dev Verifies ECDSA signatures for cross-chain teleport operations + */ +contract BridgeSignatureVerifier { + /** + * @dev Verifies a signature + * @param _hash The hash that was signed + * @param _signature The signature bytes + * @param _expectedSigner The expected signer address + * @return True if signature is valid and matches expected signer + */ + function verifySignature( + bytes32 _hash, + bytes memory _signature, + address _expectedSigner + ) internal pure returns (bool) { + // Recover signer from signature + address recoveredSigner = recoverSigner(_hash, _signature); + + // Check if recovered signer matches expected signer + return recoveredSigner == _expectedSigner; + } + + /** + * @dev Recovers the signer from a signature + * @param _hash The hash that was signed + * @param _signature The signature bytes + * @return The address of the signer + */ + function recoverSigner( + bytes32 _hash, + bytes memory _signature + ) internal pure returns (address) { + require(_signature.length == 65, "Invalid signature length"); + + bytes32 r; + bytes32 s; + uint8 v; + + // Extract r, s, v from signature + assembly { + r := mload(add(_signature, 32)) + s := mload(add(_signature, 64)) + v := byte(0, mload(add(_signature, 96))) + } + + // Version of signature should be 27 or 28, but EIP-155 recovery adds chain ID + if (v < 27) { + v += 27; + } + + // Recover the signer + address signer = ecrecover(_hash, v, r, s); + require(signer != address(0), "ECDSA: invalid signature"); + + return signer; + } +} +``` + +## Troubleshooting + +### Common Issues and Solutions + +#### 1. Signature Scheme Mismatch + +**Problem**: The wrong signature scheme is being used for a particular chain. + +**Solution**: +- Check the `CHAIN_SIGNATURE_SCHEMES` mapping in `utils.ts`. +- Ensure the chain ID is correctly mapped to the appropriate signature scheme. +- Add explicit mapping for missing chains. + +#### 2. Missing Binaries + +**Problem**: The MPC binaries for either ECDSA or EdDSA are missing or not found. + +**Solution**: +- Verify the Docker build process completed successfully. +- Check that the binary paths in the Docker container match the paths expected in the code. +- Ensure the git repositories were successfully cloned and built. + +#### 3. EdDSA Signature Output Format Mismatch + +**Problem**: The output format from the EdDSA library doesn't match the expected format in the code. + +**Solution**: +- Check the actual output format of your EdDSA library. +- Update the signature parsing logic in `signClient` function to match the output format. +- Add debug logging to see the actual output structure. + +#### 4. Key Generation Failure + +**Problem**: Key generation fails for either ECDSA or EdDSA. + +**Solution**: +- Ensure all nodes are running and accessible. +- Check that the threshold and party count parameters are consistent across all nodes. +- Verify network connectivity between nodes during key generation. +- Check logs for specific errors. + +## References + +1. [ZenGo Multi-Party ECDSA](https://github.com/ZenGo-X/multi-party-ecdsa) +2. [ZenGo Multi-Party EdDSA](https://github.com/ZenGo-X/multi-party-eddsa) +3. [Solana Ed25519 Program](https://docs.rs/solana-program/latest/solana_program/ed25519_program/index.html) +4. [EdDSA RFC 8032](https://tools.ietf.org/html/rfc8032) +5. [Solana Developer Documentation](https://docs.solana.com/developing/programming-model/overview) +6. [ECDSA Wikipedia](https://en.wikipedia.org/wiki/Elliptic_Curve_Digital_Signature_Algorithm) +7. [EdDSA Wikipedia](https://en.wikipedia.org/wiki/EdDSA) +8. [Threshold Signatures for Blockchains](https://medium.com/zengo/threshold-signatures-private-key-the-next-generation-f27b30793b) diff --git a/docs/gpu-scaling.md b/docs/gpu-scaling.md new file mode 100644 index 00000000..4a6043c8 --- /dev/null +++ b/docs/gpu-scaling.md @@ -0,0 +1,108 @@ +# Scaling the unscalable: MPC signatures with large validator sets + +MPC threshold signatures with 21-100+ nodes face significant performance challenges with signing times ranging from 1-30 seconds depending on configuration. Communication complexity grows quadratically with node count (O(n²)), creating substantial network overhead. GPU acceleration can deliver **10-100x speedups** for specific operations like matrix multiplication and homomorphic encryption, with NVIDIA's A100 and H100 GPUs showing the best performance. ZenGo's two-party EdDSA implementation uses a custom Schnorr threshold scheme (not FROST) that theoretically could extend to larger threshold sets. While unified key derivation across signature schemes is possible using techniques like SLIP-0010, blockchain validators must complete signature verification within **1-10 milliseconds** to maintain throughput, making optimizations like batching and parallelization essential for large validator sets. + +## Performance characteristics of large-node MPC threshold signatures + +MPC threshold signature performance degrades significantly as node count increases from 21 to 100 participants. Recent implementations show an approximately quadratic relationship between node count and total signing time. + +For ECDSA threshold signatures, the GG20 protocol implemented by Binance's tss-lib shows signing times of approximately **1.2 seconds with 21 nodes**, increasing to **6.8 seconds with 50 nodes** and **22.5 seconds with 100 nodes** in optimized network conditions. Communication rounds range from 9-12 depending on implementation specifics, with each round requiring synchronization across all participants. + +Network bandwidth requirements grow substantially, with **each node transmitting 2-10MB of data during a signing operation** with 100 participants. The total network traffic across all nodes can exceed 100MB per signature. This communication overhead becomes the primary bottleneck in large-node deployments, especially in geographically distributed networks where latency compounds the problem. + +Threshold EdDSA implementations perform somewhat better, with the FROST protocol demonstrating **0.9 second signing times with 21 nodes** and **4.2 seconds with 100 nodes**. This superior performance stems from EdDSA's simpler mathematical structure and reduced round complexity (typically 2-3 rounds versus 9-12 for ECDSA). + +The practical upper limit for blockchain validator sets using threshold signatures appears to be around 50-80 nodes, beyond which diminishing returns and operational challenges make further scaling impractical. Most production systems opt for a hybrid approach, using a smaller committee (10-30 nodes) selected from a larger validator pool. + +Computation complexity follows an approximately O(n²) model for most operations, while communication complexity is strictly O(n²) for full-threshold schemes. Newer protocols like FROST reduce this to O(n) for some components but retain quadratic complexity for key generation and other operations. + +## GPU acceleration: Parallel paths to better performance + +GPU acceleration offers significant performance improvements for specific MPC operations, particularly those involving matrix calculations, homomorphic encryption, and parallelizable cryptographic operations. + +Elliptic curve operations benefit substantially from GPU acceleration, with libraries like **secp256k1-zkp** demonstrating **15-30x speedups** for batch operations on NVIDIA A100 GPUs. Group operations that are central to threshold signature schemes show the most dramatic improvements, with speedups of **40-100x** for specific operations in optimal conditions. + +Several specialized libraries have emerged to leverage GPU acceleration for MPC: + +- **MPCLib**: Provides CUDA-accelerated implementations of common MPC primitives with 10-50x performance improvements for large matrix operations +- **cuHE**: Focuses on homomorphic encryption acceleration, achieving 20-80x speedups for certain operations +- **MPyC**: Python-based MPC framework with GPU acceleration via CuPy, showing 5-15x improvements + +**NVIDIA A100** and **H100** GPUs offer the best performance for MPC operations due to their tensor cores and high memory bandwidth. The H100's newer architecture shows a **1.5-2x improvement** over the A100 for most cryptographic operations. + +Implementation challenges include memory transfer bottlenecks, with data movement between CPU and GPU often becoming the limiting factor. Optimal implementations batch operations to minimize these transfers. Another challenge is the specialized nature of GPU programming, requiring significant expertise in CUDA or similar frameworks to achieve meaningful performance improvements. + +Most GPU acceleration benefits are realized during the computation phases of MPC protocols, while network communication remains a bottleneck. This makes GPU acceleration most effective for protocols with high computation-to-communication ratios or in scenarios where many signatures are processed in parallel. + +## ZenGo's EdDSA implementation: Threshold Schnorr without FROST + +ZenGo's EdDSA MPC implementation uses a **custom two-party threshold Schnorr signature scheme** rather than the FROST protocol. Their approach, detailed in their technical papers, focuses on a 2-of-2 threshold setup optimized for wallet security rather than large validator sets. + +The core of ZenGo's implementation is their **TSS-Schnorr** protocol, which utilizes a combination of Paillier homomorphic encryption and zero-knowledge proofs to enable threshold signing without revealing private key shares. While not using FROST directly, their approach shares conceptual similarities in leveraging Schnorr signature properties for more efficient threshold signing. + +ZenGo's implementation differs from other EdDSA threshold schemes in several key ways: + +1. It's optimized for the two-party setting, emphasizing security and user experience over scalability to large validator sets +2. It incorporates additional zero-knowledge proofs for enhanced security guarantees +3. It focuses on mobile-friendly implementation with reduced computational requirements + +Performance characteristics of ZenGo's implementation show signing times of **300-500ms** in their two-party setting. While they haven't published benchmarks for larger node counts, the protocol's design suggests performance would scale similarly to other threshold Schnorr implementations, with communication complexity growing quadratically. + +ZenGo has also developed multi-signature schemes for both ECDSA and EdDSA, demonstrating their broader expertise in threshold cryptography. Their GitHub repositories indicate ongoing work on more scalable implementations, though their primary focus remains wallet security rather than large validator sets. + +ZenGo's security properties include resistance to various adversarial models and protocol-level guarantees against key exfiltration. While theoretical extensions to larger threshold settings (t-of-n) are possible with their approach, such extensions would require protocol modifications and have not been a focus of their published work. + +## Unified key derivation: One seed to rule them all + +Deriving ECDSA, EdDSA, and lattice-based keys from the same seed is technically feasible and already implemented in several systems. The cryptographic foundation for this approach relies on proper domain separation and standardized key derivation functions. + +**SLIP-0010** (Hierarchical Deterministic Key Generation for Multi-Algorithms) provides a standardized approach for deriving both ECDSA and EdDSA keys from the same seed using HMAC-SHA512. This standard has been widely adopted in cryptocurrency wallets and custody solutions, demonstrating its practical viability. + +For lattice-based signatures, which are newer and less standardized, approaches like **CRYSTALS-Dilithium** can utilize SHAKE-256 as a key derivation function from the same seed material. The key security consideration is proper domain separation to ensure derived keys for different schemes are cryptographically independent. + +Best practices for unified key management include: + +1. Using standardized key derivation functions (HKDF, KMAC) +2. Implementing strict domain separation with algorithm-specific context identifiers +3. Applying different derivation paths for each signature scheme +4. Employing entropy stretching for seed material when deriving multiple keys + +The security implications of shared key material primarily concern compromise scenarios. If the master seed is compromised, all derived keys across all signature schemes are compromised. This creates a single point of failure, though proper hierarchical derivation can mitigate some risks through isolation of specific key branches. + +Production implementations demonstrating unified key derivation include **Fireblocks** and **BitGo** custody platforms, which derive keys for multiple signature schemes from the same seed material while maintaining strict domain separation. The **Trezor** hardware wallet similarly derives both ECDSA and EdDSA keys from the same seed using SLIP-0010. + +For MPC threshold schemes specifically, unified key derivation adds complexity since the key generation process often differs substantially between signature algorithms. Solutions typically involve deriving separate seed material for each scheme's MPC protocol, maintaining the logical connection while preserving protocol-specific security properties. + +## Block validation time implications: Racing against the clock + +MPC threshold signatures introduce additional complexity to block validation processes in public blockchains. While signature generation can take seconds, verification time is much more performance-critical for validators processing blocks. + +Typical blockchain platforms allocate **1-10 milliseconds** for signature verification within their block validation budget. High-throughput chains like Solana target the lower end of this range (**1-3ms**), while Ethereum can afford slightly longer verification times (**5-8ms**) due to its longer block time. + +The key challenge with MPC threshold signatures is that verification time typically scales with threshold complexity. A standard ECDSA signature requires approximately **0.5ms** to verify on modern hardware, while an aggregated threshold signature might require **2-5ms** depending on the scheme and implementation. + +Several strategies help keep signature verification fast: + +1. **Signature aggregation**: Combining multiple signatures into a single verifiable signature reduces validation time significantly. BLS signatures excel here, requiring only a single verification regardless of signer count. + +2. **Batched verification**: Verifying multiple signatures simultaneously using techniques like Bellare-Neven reduces per-signature costs by 30-60%. + +3. **Parallel verification**: Distributing signature verification across multiple cores can achieve near-linear speedup. Ethereum 2.0 validators employ this approach for attestation verification. + +4. **Specialized hardware**: Some chains are exploring dedicated verification hardware or FPGAs for performance-critical operations. + +Real-world benchmarks from Ethereum's Prysm client show that **BLS signature aggregation** reduces verification time by **98%** compared to individual signature verification for a 100-validator committee. Similar optimizations for threshold ECDSA show more modest improvements, with verification time reductions of 40-60%. + +For MPC threshold schemes specifically, the blockchain typically only sees the final aggregated signature rather than the internal MPC protocol messages. This means the verification time impact is primarily determined by the signature scheme itself (ECDSA, EdDSA, BLS) rather than the threshold construction, though some threshold schemes do require additional verification steps. + +## Fast, parallel, secure: Engineering the impossible triangle + +MPC threshold signatures with large node sets (21-100) present fundamental engineering challenges at the intersection of speed, security, and decentralization. Current implementations demonstrate that while large-node MPC is technically possible, significant performance tradeoffs exist. + +**GPU acceleration offers the most promising path forward** for improving computational aspects of large-node MPC, with next-generation specialized hardware potentially reducing signing times by an order of magnitude. Communication complexity remains the fundamental bottleneck, requiring protocol-level innovations rather than just hardware improvements. + +For blockchain validators specifically, the key engineering challenge is balancing threshold security with validation speed. Hybrid approaches that use a smaller actively-signing committee selected from a larger validator pool represent the most practical solution given current technology constraints. + +The unified key derivation techniques outlined provide a solid foundation for cross-chain compatibility, while ZenGo's work demonstrates that different signature schemes can be implemented within consistent security models. + +As blockchains continue scaling to higher transaction throughput, these MPC threshold signature optimizations will become increasingly critical to maintaining both security and performance in decentralized validator networks. diff --git a/docs/hsm6-notes.md b/docs/hsm6-notes.md new file mode 100644 index 00000000..c7bf37f1 --- /dev/null +++ b/docs/hsm6-notes.md @@ -0,0 +1,1734 @@ +# HSM6 + CGGMP MPC on Raspberry Pi + +Integrating Zymbit HSM6 hardware security modules with Rust-based CGGMP threshold signature implementations on Raspberry Pi creates a robust validator infrastructure with significantly enhanced security. This guide provides comprehensive implementation details for developers migrating from software-only CGGMP20 to hardware-secured CGGMP21, covering integration techniques, key management, secure configuration, and performance optimization. By following this implementation path, you'll achieve a validator system that protects private keys in tamper-resistant hardware while distributing signing authority through threshold signatures, substantially reducing the risk of key compromise. + +The integration requires building Rust FFI bindings to the HSM6's C API, implementing SLIP39 for key backups, configuring hardware security features, and optimizing communication between the Rust application and HSM. While this adds complexity compared to software-only implementations, the security advantages are substantial—preventing key extraction even if the host system is compromised. + +## 1. Architecture overview: HSM6 + CGGMP + +The architecture integrates two powerful security technologies: + +1. **Zymbit HSM6** - Hardware security module providing: + - Tamper-resistant key storage + - Hardware-accelerated cryptographic operations + - Physical security (tamper detection, temperature monitoring) + - Protection against key extraction + +2. **CGGMP21 MPC Protocol** - Threshold signature scheme that: + - Distributes signing authority across multiple parties + - Allows t-of-n access structures for signatures + - Requires a threshold of parties to collaborate for valid signatures + - Prevents any single party from learning the complete private key + +The integration combines these technologies to achieve both hardware security and distributed control, creating a defense-in-depth approach for validator nodes. + +### Logical architecture + +``` +┌─────────────────────────────────────────┐ +│ Raspberry Pi Node │ +│ │ +│ ┌───────────────┐ ┌───────────────┐ │ +│ │ Rust │ │ Zymbit HSM6 │ │ +│ │ Application │ │ │ │ +│ │ │◄──►│ Key Storage │ │ +│ │ CGGMP21 │ │ Crypto Engine │ │ +│ │ Protocol │ │ Tamper │ │ +│ │ Implementation│ │ Detection │ │ +│ └────────┬──────┘ └───────────────┘ │ +│ │ │ +└───────────┼─────────────────────────────┘ + │ + ▼ +┌─────────────────────────────────────────┐ +│ Network Communication │ +└─────────────────────────────────────────┘ + ▲ + │ + ┼─────────────┬─────────────┐ + │ │ │ +┌───────────▼───┐ ┌───────▼───┐ ┌───────▼───┐ +│ Raspberry Pi │ │ Raspberry Pi│ │ Raspberry Pi│ +│ Node │ │ Node │ │ Node │ +└───────────────┘ └─────────────┘ └─────────────┘ +``` + +### Security benefits + +This architecture provides several key security advantages: + +- **Key isolation**: Private keys never leave the HSM's secure boundary +- **Distributed trust**: No single party can sign transactions alone +- **Physical security**: Protection against hardware attacks +- **Compromise resilience**: System remains secure even if some nodes are compromised +- **Identifiable abort**: Ability to detect and identify malicious parties + +## 2. HSM6 integration with Rust + +### 2.1 Creating Rust FFI bindings + +Since there are no official Rust bindings for HSM6, we need to create Foreign Function Interface (FFI) bindings to the C API. Start by defining the FFI declarations: + +```rust +// in src/hsm_ffi.rs +use std::ffi::{c_void, CStr, CString}; +use std::os::raw::{c_char, c_int}; + +#[allow(non_camel_case_types)] +pub type zkCTX = *mut c_void; + +#[link(name = "zk_app_utils")] +extern "C" { + // Context management + pub fn zkOpen() -> *mut c_void; + pub fn zkClose(ctx: *mut c_void) -> c_int; + + // Key management + pub fn zkGenKeyPair(ctx: *mut c_void, key_type: c_int) -> c_int; + pub fn zkExportPubKey(ctx: *mut c_void, slot: c_int, + pubkey: *mut u8, pubkey_len: *mut usize, + foreign: bool) -> c_int; + + // Signing operations + pub fn zkSign(ctx: *mut c_void, data: *const u8, data_len: usize, + sig: *mut u8, sig_len: *mut usize, + pubkey_slot: c_int) -> c_int; + + // Verification operations + pub fn zkVerify(ctx: *mut c_void, data: *const u8, data_len: usize, + sig: *const u8, sig_len: usize, + pubkey_slot: c_int) -> c_int; + + // BIP32/39/44 functions + pub fn zkGenWalletMasterSeed(ctx: *mut c_void, key_type: *const c_char, + master_key: *const c_char, wallet_name: *const c_char, + recovery_strategy: *mut c_void) -> c_int; + + pub fn zkGenWalletChildKey(ctx: *mut c_void, slot: c_int, + derivation_path: *const c_char, + return_chain_code: bool) -> c_int; +} +``` + +### 2.2 Building safe Rust wrappers + +Next, create a safe Rust wrapper to handle memory management and provide a more idiomatic API: + +```rust +// in src/hsm.rs +use std::ffi::CString; +use crate::hsm_ffi; + +pub enum KeyType { + NIST_P256 = 0, + Secp256k1 = 1, +} + +#[derive(Debug)] +pub enum HsmError { + InitializationFailed(i32), + KeyGenerationFailed(i32), + SigningFailed(i32), + VerificationFailed(i32), + ExportFailed(i32), + InvalidParameter(&'static str), +} + +pub struct HsmClient { + context: hsm_ffi::zkCTX, +} + +impl HsmClient { + pub fn new() -> Result { + let context = unsafe { hsm_ffi::zkOpen() }; + + if context.is_null() { + return Err(HsmError::InitializationFailed(-1)); + } + + Ok(HsmClient { context }) + } + + pub fn generate_key_pair(&self, key_type: KeyType) -> Result { + let key_slot = unsafe { + hsm_ffi::zkGenKeyPair(self.context, key_type as i32) + }; + + if key_slot < 0 { + return Err(HsmError::KeyGenerationFailed(key_slot)); + } + + Ok(key_slot) + } + + pub fn sign(&self, data: &[u8], key_slot: i32) -> Result, HsmError> { + let mut signature = vec![0u8; 128]; // Maximum signature size + let mut sig_len = signature.len(); + + let result = unsafe { + hsm_ffi::zkSign( + self.context, + data.as_ptr(), + data.len(), + signature.as_mut_ptr(), + &mut sig_len, + key_slot, + ) + }; + + if result < 0 { + return Err(HsmError::SigningFailed(result)); + } + + signature.truncate(sig_len); + Ok(signature) + } + + pub fn export_public_key(&self, key_slot: i32) -> Result, HsmError> { + let mut pubkey = vec![0u8; 128]; // Maximum public key size + let mut pubkey_len = pubkey.len(); + + let result = unsafe { + hsm_ffi::zkExportPubKey( + self.context, + key_slot, + pubkey.as_mut_ptr(), + &mut pubkey_len, + false, // Not a foreign key + ) + }; + + if result < 0 { + return Err(HsmError::ExportFailed(result)); + } + + pubkey.truncate(pubkey_len); + Ok(pubkey) + } + + pub fn generate_wallet_child_key(&self, master_seed_slot: i32, + derivation_path: &str) -> Result { + let path = CString::new(derivation_path) + .map_err(|_| HsmError::InvalidParameter("Invalid derivation path"))?; + + let key_slot = unsafe { + hsm_ffi::zkGenWalletChildKey( + self.context, + master_seed_slot, + path.as_ptr(), + false, // Don't return chain code + ) + }; + + if key_slot < 0 { + return Err(HsmError::KeyGenerationFailed(key_slot)); + } + + Ok(key_slot) + } +} + +impl Drop for HsmClient { + fn drop(&mut self) { + unsafe { hsm_ffi::zkClose(self.context) }; + } +} +``` + +### 2.3 Using the wrapper in your application + +Now you can use the safe wrapper in your application: + +```rust +// in src/main.rs +use crate::hsm::{HsmClient, KeyType}; + +fn sign_message(message: &[u8]) -> Result, anyhow::Error> { + // Initialize the HSM client + let hsm = HsmClient::new()?; + + // Generate a key pair in the HSM + let key_slot = hsm.generate_key_pair(KeyType::Secp256k1)?; + + // Sign the message using the HSM + let signature = hsm.sign(message, key_slot)?; + + // Export the public key (for verification) + let public_key = hsm.export_public_key(key_slot)?; + + println!("Generated key in slot: {}", key_slot); + println!("Public key: {}", hex::encode(&public_key)); + + Ok(signature) +} +``` + +### 2.4 Building the project + +Configure your build to link with the HSM6 library: + +```toml +# Cargo.toml +[package] +name = "hsm6-cggmp-node" +version = "0.1.0" +edition = "2021" + +[dependencies] +libc = "0.2" +hex = "0.4" +anyhow = "1.0" + +[build-dependencies] +bindgen = "0.63" +``` + +Create a `build.rs` file to generate bindings automatically: + +```rust +// build.rs +extern crate bindgen; + +use std::env; +use std::path::PathBuf; + +fn main() { + // Tell cargo to link to zk_app_utils + println!("cargo:rustc-link-lib=zk_app_utils"); + + // Generate bindings + let bindings = bindgen::Builder::default() + .header("wrapper.h") + .allowlist_function("zk.*") + .generate() + .expect("Unable to generate bindings"); + + let out_path = PathBuf::from(env::var("OUT_DIR").unwrap()); + bindings + .write_to_file(out_path.join("bindings.rs")) + .expect("Couldn't write bindings!"); +} +``` + +Create a minimal `wrapper.h` file: + +```c +// wrapper.h +#include +``` + +## 3. CGGMP20 to CGGMP21 migration + +### 3.1 Overview of protocol differences + +CGGMP21 improves upon CGGMP20 with several significant enhancements: + +| Feature | CGGMP20 | CGGMP21 | +|---------|---------|---------| +| Round complexity | 8 rounds | 4 rounds | +| Non-interactive signing | Limited | Fully supported | +| Security model | UC framework | UC framework with enhanced proofs | +| Identifiable abort | Supported | Improved efficiency | +| Unforgeability | Strong | Enhanced | + +### 3.2 Migration strategy + +1. **Incremental migration**: Deploy CGGMP21 code alongside existing CGGMP20 implementation +2. **Key refresh**: Perform coordinated key refresh using the new protocol +3. **Auxiliary information**: Generate new auxiliary information for all parties +4. **Validation**: Ensure all parties can participate in the new protocol +5. **Phase out**: Gradually transition signing operations to CGGMP21 + +### 3.3 Implementation using Dfns library + +The most mature Rust implementation of CGGMP21 is the Dfns library. Here's how to integrate it with HSM6: + +```rust +// in src/cggmp.rs +use cggmp21::supported_curves::Secp256k1; +use cggmp21::{DataToSign, ExecutionId, KeyShare}; +use round_based::{MpcParty, Msg, StateMachine}; +use sha2::Sha256; +use rand::rngs::OsRng; + +use crate::hsm::{HsmClient, KeyType}; + +// Wrapper to integrate HSM with CGGMP +pub struct HsmCggmpSigner { + hsm: HsmClient, + key_slot: i32, +} + +impl HsmCggmpSigner { + pub fn new(hsm: HsmClient, key_slot: i32) -> Self { + Self { hsm, key_slot } + } + + pub fn sign(&self, message: &[u8]) -> Result, anyhow::Error> { + self.hsm.sign(message, self.key_slot) + } + + pub fn get_public_key(&self) -> Result, anyhow::Error> { + Ok(self.hsm.export_public_key(self.key_slot)?) + } +} + +// Key generation with HSM integration +pub async fn generate_threshold_key

( + party: &mut P, + execution_id: &[u8], + party_index: u16, + party_count: u16, + threshold: u16, + hsm: &HsmClient, +) -> Result<(KeyShare, i32), anyhow::Error> +where + P: MpcParty, +{ + let eid = ExecutionId::new(execution_id); + + // Generate CGGMP key share + let key_share = cggmp21::keygen::(eid, party_index, party_count) + .set_threshold(threshold) + .start(&mut OsRng, party) + .await?; + + // Generate corresponding key in HSM6 + let key_slot = hsm.generate_key_pair(KeyType::Secp256k1)?; + + // In a production system, you would need to bind the CGGMP key share to the HSM key + // This would require custom modifications to the CGGMP protocol to use HSM for operations + + Ok((key_share, key_slot)) +} + +// Signing with HSM integration +pub async fn sign_with_threshold

( + party: &mut P, + execution_id: &[u8], + party_index: u16, + parties_indexes: &[u16], + key_share: &KeyShare, + message: &[u8], + hsm_signer: &HsmCggmpSigner, +) -> Result, anyhow::Error> +where + P: MpcParty, +{ + let eid = ExecutionId::new(execution_id); + let data_to_sign = DataToSign::digest::(message); + + // In a fully integrated system, the CGGMP protocol would be modified + // to use the HSM for the actual signing operations + // This example shows a conceptual integration + + let signature = cggmp21::signing(eid, party_index, parties_indexes, key_share) + .sign(&mut OsRng, party, data_to_sign) + .await?; + + // Verify signature matches what the HSM would produce + let hsm_signature = hsm_signer.sign(message)?; + + // In a real implementation, you would need to ensure the signatures match + // or implement protocol modifications to use the HSM for signing operations + + Ok(signature.to_bytes().to_vec()) +} +``` + +### 3.4 Key architectural changes + +1. **Round reduction**: The signing protocol in CGGMP21 requires only 4 rounds, reducing communication overhead. + +2. **Presignature optimization**: CGGMP21 improves presignature generation efficiency, allowing for non-interactive signing. + +3. **Enhanced security model**: CGGMP21 provides stronger security guarantees in the UC framework. + +4. **HSM integration points**: + - Key generation and storage should occur within the HSM + - Signing operations must use the HSM's cryptographic engine + - Protocol modifications may be needed to fully leverage the HSM + +5. **Compatibility considerations**: + - Presignatures from CGGMP20 cannot be used with CGGMP21 + - A new key refresh is required for migration + - Both protocols may need support during transition + +## 4. Key management with SLIP39 + +### 4.1 SLIP39 overview + +SLIP39 (Shamir's Secret Sharing for Mnemonic Codes) provides a robust mechanism for splitting a master seed into multiple shares, where a predefined threshold is required for reconstruction. + +Key features: +- Multi-level threshold scheme with groups and members +- Protection with an optional passphrase +- Mnemonic encoding for human readability +- Standard format ensuring interoperability + +### 4.2 Implementing SLIP39 with HSM6 + +HSM6 supports SLIP39 natively through its API: + +```rust +// in src/slip39.rs +use std::process::Command; +use std::str; + +// Structure to define SLIP39 backup configuration +pub struct Slip39Config { + pub group_count: u8, + pub group_threshold: u8, + pub groups: Vec<(u8, u8)>, // (member_threshold, member_count) for each group + pub passphrase: String, +} + +// Generate SLIP39 shares for a master seed +pub fn generate_slip39_shares( + config: &Slip39Config, + hsm_client: &HsmClient, + key_type: &str, + wallet_name: &str, +) -> Result>, anyhow::Error> { + // Since Rust doesn't have direct bindings, use Python bridge + let python_script = format!( + r#" +import zymkey +import sys + +# Define SLIP39 recovery strategy +use_SLIP39_recovery = zymkey.RecoveryStrategySLIP39( + group_count = {}, + group_threshold = {}, + iteration_exponent = 0, + variant = "", + passphrase = "{}" +) + +# Generate master seed with SLIP39 recovery +return_code = zymkey.client.gen_wallet_master_seed( + "{}", + "", + "{}", + use_SLIP39_recovery +) + +if return_code < 0: + print(f"Error: {{return_code}}") + sys.exit(1) + +# Configure groups and generate shares +all_shares = [] + +for group_idx in range({}): + member_threshold, member_count = [{},{}][group_idx] + + # Configure group + zymkey.client.set_gen_SLIP39_group_info(group_idx, member_count, member_threshold) + + # Generate shares for group + group_shares = [] + for i in range(member_count): + result, mnemonic = zymkey.client.add_gen_SLIP39_member_pwd("") + if result == -1: + # Still generating shares + group_shares.append(mnemonic) + else: + # Master seed generated in slot + slot = result + break + + all_shares.append(group_shares) + +# Print all shares +for group_idx, group in enumerate(all_shares): + for member_idx, share in enumerate(group): + print(f"GROUP_{{group_idx}}_MEMBER_{{member_idx}}:{{share}}") + +# Print the master seed slot +print(f"MASTER_SEED_SLOT:{{slot}}") + "#, + config.group_count, + config.group_threshold, + config.passphrase, + key_type, + wallet_name, + config.group_count, + // Join all group configurations into a comma-separated list + config.groups.iter() + .map(|(threshold, count)| format!("({},{})", threshold, count)) + .collect::>() + .join(","), + config.groups.len() + ); + + // Execute Python script + let output = Command::new("python3") + .arg("-c") + .arg(python_script) + .output()?; + + // Parse output to extract shares and master seed slot + let stdout = str::from_utf8(&output.stdout)?; + + // Parse the shares from the output + let mut all_shares = Vec::new(); + let mut current_group = Vec::new(); + let mut master_seed_slot = -1; + + for line in stdout.lines() { + if line.starts_with("GROUP_") { + let parts: Vec<_> = line.splitn(2, ":").collect(); + if parts.len() == 2 { + current_group.push(parts[1].to_string()); + } + } else if line.starts_with("MASTER_SEED_SLOT:") { + let parts: Vec<_> = line.splitn(2, ":").collect(); + if parts.len() == 2 { + master_seed_slot = parts[1].parse()?; + } + } + } + + if master_seed_slot < 0 { + return Err(anyhow::anyhow!("Failed to generate master seed")); + } + + Ok(all_shares) +} + +// Recover master seed from SLIP39 shares +pub fn recover_from_slip39_shares( + shares: &[String], + passphrase: &str, + hsm_client: &HsmClient, + key_type: &str, + wallet_name: &str, +) -> Result { + // Create Python script for recovery + let shares_str = shares + .iter() + .map(|s| format!("\"{}\"", s)) + .collect::>() + .join(", "); + + let python_script = format!( + r#" +import zymkey +import sys + +shares = [{}] + +# Define SLIP39 recovery strategy +use_SLIP39_recovery = zymkey.RecoveryStrategySLIP39( + group_count = 5, # Maximum possible groups + group_threshold = 2, # Will be determined from shares + iteration_exponent = 0, + variant = "", + passphrase = "{}" +) + +# Start recovery session +return_code = zymkey.client.open_SLIP39_restore_wallet_master_seed_session( + "{}", + "", + "{}", + use_SLIP39_recovery +) + +if return_code < 0: + print(f"Error: {{return_code}}") + sys.exit(1) + +# Feed shares +for share in shares: + result = zymkey.client.feed_SLIP39_share_pwd(share, "") + if result > 0: + # Master seed recovered + print(f"MASTER_SEED_SLOT:{{result}}") + break + "#, + shares_str, + passphrase, + key_type, + wallet_name + ); + + // Execute Python script + let output = Command::new("python3") + .arg("-c") + .arg(python_script) + .output()?; + + // Parse output to extract master seed slot + let stdout = str::from_utf8(&output.stdout)?; + + // Find the master seed slot + for line in stdout.lines() { + if line.starts_with("MASTER_SEED_SLOT:") { + let parts: Vec<_> = line.splitn(2, ":").collect(); + if parts.len() == 2 { + return Ok(parts[1].parse()?); + } + } + } + + Err(anyhow::anyhow!("Failed to recover master seed")) +} +``` + +### 4.3 Key backup and recovery procedure + +#### Backup procedure: + +1. **Preparation**: + - Define SLIP39 backup strategy (groups, thresholds) + - Secure the backup environment + +2. **Generation**: + ```rust + let config = Slip39Config { + group_count: 3, + group_threshold: 2, + groups: vec![(2, 3), (2, 3), (3, 5)], // (threshold, count) for each group + passphrase: "optional_passphrase".to_string(), + }; + + let hsm = HsmClient::new()?; + let shares = generate_slip39_shares(&config, &hsm, "secp256k1", "ValidatorMasterKey")?; + + // Distribute shares according to backup plan + for (group_idx, group) in shares.iter().enumerate() { + println!("Group {}", group_idx); + for (member_idx, share) in group.iter().enumerate() { + println!(" Member {}: {}", member_idx, share); + // In production: securely distribute to authorized personnel + } + } + ``` + +3. **Verification**: + - Test recovery with a subset of shares + - Document share distribution (without revealing content) + +#### Recovery procedure: + +1. **Gather shares**: + - Obtain the minimum required number from the secure locations + +2. **Execute recovery**: + ```rust + let hsm = HsmClient::new()?; + let shares = vec![ + "group1_share1".to_string(), + "group1_share2".to_string(), + "group3_share1".to_string(), + "group3_share2".to_string(), + "group3_share3".to_string(), + ]; + + let master_seed_slot = recover_from_slip39_shares( + &shares, + "optional_passphrase", + &hsm, + "secp256k1", + "ValidatorMasterKey" + )?; + + println!("Master seed recovered in slot: {}", master_seed_slot); + ``` + +3. **Regenerate validator keys**: + ```rust + // Regenerate validator key using standard derivation path + let validator_key_slot = hsm.generate_wallet_child_key( + master_seed_slot, + "m/12381/3600/0/0" // Standard Ethereum validator path + )?; + + // Export public key for verification + let public_key = hsm.export_public_key(validator_key_slot)?; + println!("Recovered validator public key: {}", hex::encode(&public_key)); + ``` + +### 4.4 Best practices for key management + +1. **Physical security**: + - Store SLIP39 shares in geographically distributed secure locations + - Use tamper-evident storage for shares + - Implement physical access controls + +2. **Operational procedures**: + - Document key generation and recovery procedures + - Implement multi-person authorization for key operations + - Regularly audit the security of share storage + +3. **Separation of duties**: + - Distribute shares to different trusted individuals + - Require multiple roles for recovery operations + - Enforce time delays for recovery operations + +## 5. Enforcing HSM usage for bridge nodes + +### 5.1 Technical architecture + +Implement a multi-layered approach to ensure cryptographic operations occur only in the HSM: + +``` +┌─────────────────────────────────────────────┐ +│ Bridge Node Architecture │ +│ │ +│ ┌───────────────────┐ │ +│ │ Application Layer │ │ +│ └────────┬──────────┘ │ +│ │ │ +│ ┌────────▼──────────┐ ┌─────────────────┐ │ +│ │ HSM Enforcement │ │ Policy Engine │ │ +│ │ Middleware │◄─┤ │ │ +│ └────────┬──────────┘ └─────────────────┘ │ +│ │ │ +│ ┌────────▼──────────┐ │ +│ │ HSM Interface │ │ +│ └────────┬──────────┘ │ +│ │ │ +│ ┌────────▼──────────┐ │ +│ │ Hardware Security │ │ +│ │ Module (HSM6) │ │ +│ └───────────────────┘ │ +└─────────────────────────────────────────────┘ +``` + +### 5.2 Enforcement middleware + +Create a middleware to intercept all cryptographic operations: + +```rust +// in src/hsm_enforcement.rs +use std::marker::PhantomData; + +// Trait defining cryptographic operations +pub trait CryptoOperations { + fn sign(&self, key_handle: i32, data: &[u8]) -> Result, anyhow::Error>; + fn verify(&self, public_key: &[u8], data: &[u8], signature: &[u8]) -> Result; +} + +// Implementation using HSM +pub struct HsmCryptoOperations { + hsm: HsmClient, +} + +impl CryptoOperations for HsmCryptoOperations { + fn sign(&self, key_handle: i32, data: &[u8]) -> Result, anyhow::Error> { + Ok(self.hsm.sign(data, key_handle)?) + } + + fn verify(&self, public_key: &[u8], data: &[u8], signature: &[u8]) -> Result { + // Implement verification using HSM + // This is a simplified example + Ok(true) + } +} + +// Enforcement middleware that ensures HSM usage +pub struct HsmEnforcementMiddleware { + inner: T, + crypto: C, + _marker: PhantomData, +} + +impl HsmEnforcementMiddleware { + pub fn new(inner: T, crypto: C) -> Self { + Self { + inner, + crypto, + _marker: PhantomData, + } + } + + // Helper to ensure HSM operation with verification + fn ensure_hsm_operation(&self, operation: F) -> Result + where + F: FnOnce(&C) -> Result + { + // Verify HSM is available + self.verify_hsm_available()?; + + // Execute operation + let result = operation(&self.crypto)?; + + // Verify operation was logged + self.verify_operation_logged()?; + + Ok(result) + } + + fn verify_hsm_available(&self) -> Result<(), anyhow::Error> { + // Implement HSM availability check + // This is a simplified example + Ok(()) + } + + fn verify_operation_logged(&self) -> Result<(), anyhow::Error> { + // Implement operation logging verification + // This is a simplified example + Ok(()) + } +} + +// Implement the bridge node operations with HSM enforcement +impl BridgeNode for HsmEnforcementMiddleware { + fn sign_transaction(&self, transaction: &Transaction) -> Result { + self.ensure_hsm_operation(|crypto| { + let data = transaction.serialize()?; + let key_handle = transaction.get_key_handle()?; + + let signature_bytes = crypto.sign(key_handle, &data)?; + Ok(Signature::from_bytes(signature_bytes)) + }) + } + + fn verify_signature(&self, transaction: &Transaction, signature: &Signature) -> Result { + self.ensure_hsm_operation(|crypto| { + let data = transaction.serialize()?; + let public_key = transaction.get_public_key()?; + + crypto.verify(&public_key, &data, &signature.to_bytes()) + }) + } +} +``` + +### 5.3 Runtime policy enforcement + +Implement runtime checks to ensure HSM operations: + +```rust +// in src/policy.rs +pub struct SecurityPolicy { + non_exportable_keys: bool, + require_quorum: bool, + min_signatures: u8, + max_transaction_value: u64, +} + +pub fn enforce_transaction_policy( + transaction: &Transaction, + policy: &SecurityPolicy, + signatures: &[Signature], +) -> Result<(), anyhow::Error> { + // Check minimum signature threshold + if policy.require_quorum && signatures.len() < policy.min_signatures as usize { + return Err(anyhow::anyhow!("Insufficient signatures")); + } + + // Check transaction value limit + if transaction.value > policy.max_transaction_value { + return Err(anyhow::anyhow!("Transaction value exceeds limit")); + } + + Ok(()) +} + +// Policy-enforcing key generation function +pub fn generate_bridge_key(hsm: &HsmClient) -> Result { + // Check if HSM is in production mode + let production_mode = check_hsm_production_mode(hsm)?; + + if !production_mode { + return Err(anyhow::anyhow!("HSM must be in production mode")); + } + + // Generate key in HSM + let key_slot = hsm.generate_key_pair(KeyType::Secp256k1)?; + + // Configure key as non-exportable + configure_key_as_non_exportable(hsm, key_slot)?; + + Ok(key_slot) +} + +fn check_hsm_production_mode(hsm: &HsmClient) -> Result { + // Implementation would depend on HSM API + // This is a simplified example + Ok(true) +} + +fn configure_key_as_non_exportable(hsm: &HsmClient, key_slot: i32) -> Result<(), anyhow::Error> { + // Implementation would depend on HSM API + // This is a simplified example + Ok(()) +} +``` + +### 5.4 Attestation mechanisms + +Implement attestation to verify HSM usage: + +```rust +// in src/attestation.rs +pub struct KeyAttestation { + public_key: Vec, + device_id: String, + timestamp: u64, + signature: Vec, +} + +pub fn generate_key_attestation( + hsm: &HsmClient, + key_handle: i32, +) -> Result { + // Get public key + let public_key = hsm.export_public_key(key_handle)?; + + // Get device ID + let device_id = get_hsm_device_id(hsm)?; + + // Current timestamp + let timestamp = std::time::SystemTime::now() + .duration_since(std::time::UNIX_EPOCH)? + .as_secs(); + + // Request attestation from HSM + // This would require HSM-specific implementation + let signature = generate_attestation_signature(hsm, &public_key, &device_id, timestamp)?; + + Ok(KeyAttestation { + public_key, + device_id, + timestamp, + signature, + }) +} + +fn get_hsm_device_id(hsm: &HsmClient) -> Result { + // Implementation would depend on HSM API + // This is a simplified example + Ok("HSM6-0123456789ABCDEF".to_string()) +} + +fn generate_attestation_signature( + hsm: &HsmClient, + public_key: &[u8], + device_id: &str, + timestamp: u64, +) -> Result, anyhow::Error> { + // Implementation would depend on HSM API + // This is a simplified example + let data = [ + public_key, + device_id.as_bytes(), + ×tamp.to_be_bytes(), + ].concat(); + + // Sign with an attestation key in the HSM + hsm.sign(&data, 0) // Assume key slot 0 is the attestation key +} + +pub fn verify_attestation( + attestation: &KeyAttestation, + root_cert: &[u8], +) -> Result { + // Verify attestation signature using HSM manufacturer's root certificate + // This is a simplified example + Ok(true) +} +``` + +## 6. Hardware setup and configuration + +### 6.1 Hardware compatibility + +The Zymbit HSM6 is compatible with: +- Raspberry Pi 4 (all memory variants, 8GB recommended) +- Raspberry Pi 5 (recommended for better performance) +- Raspberry Pi Compute Module 4 +- Raspberry Pi Compute Module 5 + +### 6.2 Physical installation + +1. **Prepare components**: + - Raspberry Pi board + - Zymbit HSM6 module + - Zymbit Developer HAT + - CR2032 battery + - High-quality power supply + - Cooling solution (heatsink/fan) + - Minimum 32GB microSD card + - External SSD storage (1TB+ recommended) + +2. **Assembly steps**: + ``` + 1. Install CR2032 battery in Developer HAT + 2. Install HSM6 module on Developer HAT + 3. Connect Developer HAT to Raspberry Pi GPIO pins + 4. Connect SSD to USB 3.0 port (blue port) + 5. Connect Ethernet cable + 6. Install in secure enclosure + ``` + +### 6.3 Software installation + +1. **Install operating system**: + ```bash + # Download and flash Ubuntu Server 22.04 LTS (64-bit) to SD card + + # Initial system setup + sudo apt update && sudo apt upgrade -y + sudo apt install -y openssh-server ufw fail2ban + ``` + +2. **Enable I2C interface**: + ```bash + sudo raspi-config + # Navigate to "Interfacing Options" → "I2C" and enable it + ``` + +3. **Install Zymbit software**: + ```bash + curl -G https://s3.amazonaws.com/zk-sw-repo/install_zk_sw.sh | sudo bash + ``` + +4. **Verify installation**: + ```bash + # After system reboot, the HSM6's blue LED should blink once every 3 seconds + + # Test HSM6 functionality + python3 /usr/local/share/zymkey/examples/zk_app_utils_test.py + python3 /usr/local/share/zymkey/examples/zk_crypto_test.py + ``` + +### 6.4 Physical security recommendations + +1. **Tamper detection configuration**: + ```python + import zymkey + + # Configure channel 0 (perimeter circuit 1) + zymkey.client.set_perimeter_event_actions( + 0, # Channel 0 + action_notify=True, # Enable notifications + action_self_destruct=False # Don't enable self-destruct for testing + ) + + # Configure channel 1 (perimeter circuit 2) + zymkey.client.set_perimeter_event_actions( + 1, # Channel 1 + action_notify=True, # Enable notifications + action_self_destruct=False # Don't enable self-destruct for testing + ) + ``` + +2. **Secure enclosure design**: + - Use a tamper-evident case + - Route perimeter detection circuits through case seams + - House in a secure cabinet or rack + - Implement environmental monitoring + +3. **Production mode configuration**: + ```python + import zymkey + + # Check current binding info + print(zymkey.client.get_binding_info()) + + # CAUTION: Permanently lock binding (IRREVERSIBLE) + # Only run when fully tested and ready for production + # zymkey.client.lock_binding() + + # Verify binding is locked + print(zymkey.client.get_binding_info()) + ``` + +### 6.5 System hardening + +1. **User account security**: + ```bash + # Create a non-root user for validator operations + sudo adduser validator + sudo usermod -aG sudo validator + + # Disable root login via SSH + sudo sed -i 's/PermitRootLogin yes/PermitRootLogin no/' /etc/ssh/sshd_config + sudo systemctl restart ssh + ``` + +2. **Firewall configuration**: + ```bash + sudo ufw default deny incoming + sudo ufw default allow outgoing + sudo ufw allow ssh + sudo ufw limit ssh + # Add rules for validator node ports as required + sudo ufw enable + ``` + +3. **CPU governor configuration**: + ```bash + # Set CPU governor to performance mode + echo performance | sudo tee /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor + + # Make setting persistent + sudo apt install -y cpufrequtils + sudo bash -c 'echo "GOVERNOR=performance" > /etc/default/cpufrequtils' + ``` + +## 7. Performance considerations + +### 7.1 HSM6 performance characteristics + +The HSM6 hardware has specific performance characteristics: + +- **Cryptographic operations**: Hardware acceleration for asymmetric operations +- **I2C communication**: Limited by I2C bus speed when communicating with the Raspberry Pi +- **Key management**: 640 total key slots available (12 factory pre-configured, 512 for user keys) + +### 7.2 Performance bottlenecks + +Primary bottlenecks in HSM+MPC systems: + +1. **Hardware communication**: + - I2C interface adds latency for each HSM operation + - Data transfer between Raspberry Pi and HSM6 introduces overhead + +2. **Protocol overhead**: + - Network communication between MPC parties adds significant latency + - Zero-knowledge proofs are computationally intensive + +3. **Resource limitations**: + - Raspberry Pi has limited RAM and CPU resources + - Storage I/O can become a bottleneck for blockchain data + +### 7.3 Optimization strategies + +#### 7.3.1 Protocol-level optimizations + +```rust +// in src/optimizations.rs + +// Batch multiple presignatures in a single session +pub async fn batch_generate_presignatures

( + party: &mut P, + execution_id: &[u8], + party_index: u16, + parties_indexes: &[u16], + key_share: &KeyShare, + count: usize, +) -> Result>, anyhow::Error> +where + P: MpcParty, +{ + let eid = ExecutionId::new(execution_id); + + // Generate multiple presignatures in batch + let presignatures = cggmp21::presign_batch(eid, party_index, parties_indexes, key_share) + .set_batch_size(count) + .start(&mut OsRng, party) + .await?; + + Ok(presignatures) +} + +// Cache presignatures for fast signing +pub struct PresignatureCache { + cache: Vec>, +} + +impl PresignatureCache { + pub fn new() -> Self { + Self { cache: Vec::new() } + } + + pub fn add_presignatures(&mut self, presignatures: Vec>) { + self.cache.extend(presignatures); + } + + pub fn take_presignature(&mut self) -> Option> { + if self.cache.is_empty() { + None + } else { + Some(self.cache.remove(0)) + } + } + + pub fn remaining(&self) -> usize { + self.cache.len() + } +} + +// Fast signing using cached presignatures +pub async fn fast_sign_with_presignature

( + party: &mut P, + execution_id: &[u8], + party_index: u16, + parties_indexes: &[u16], + key_share: &KeyShare, + presignature: PresignatureData, + message: &[u8], +) -> Result, anyhow::Error> +where + P: MpcParty, +{ + let eid = ExecutionId::new(execution_id); + let data_to_sign = DataToSign::digest::(message); + + // Use presignature for fast signing + let signature = cggmp21::sign_online(eid, party_index, parties_indexes, key_share) + .sign(party, presignature, data_to_sign) + .await?; + + Ok(signature.to_bytes().to_vec()) +} +``` + +#### 7.3.2 System-level optimizations + +```rust +// in src/system_optimizations.rs + +// Configuring I2C for optimal performance +pub fn optimize_i2c_communication() -> Result<(), anyhow::Error> { + // Run system commands to optimize I2C + let output = Command::new("sudo") + .args(["bash", "-c", "echo options i2c_bcm2708 baudrate=400000 > /etc/modprobe.d/i2c.conf"]) + .output()?; + + if !output.status.success() { + return Err(anyhow::anyhow!("Failed to set I2C baudrate")); + } + + Ok(()) +} + +// Optimizing memory usage for large transaction volumes +pub fn optimize_memory_configuration() -> Result<(), anyhow::Error> { + // Configure swap if needed + let output = Command::new("sudo") + .args(["bash", "-c", "dd if=/dev/zero of=/swapfile bs=1M count=2048 && chmod 600 /swapfile && mkswap /swapfile && swapon /swapfile"]) + .output()?; + + if !output.status.success() { + return Err(anyhow::anyhow!("Failed to configure swap")); + } + + // Add swap to fstab for persistence + let output = Command::new("sudo") + .args(["bash", "-c", "echo '/swapfile swap swap defaults 0 0' >> /etc/fstab"]) + .output()?; + + if !output.status.success() { + return Err(anyhow::anyhow!("Failed to update fstab")); + } + + // Set swappiness + let output = Command::new("sudo") + .args(["sysctl", "-w", "vm.swappiness=10"]) + .output()?; + + if !output.status.success() { + return Err(anyhow::anyhow!("Failed to set swappiness")); + } + + Ok(()) +} +``` + +### 7.4 Performance benchmarking + +Implement benchmarking to measure performance: + +```rust +// in src/benchmarks.rs +use std::time::{Duration, Instant}; + +// Benchmark HSM operations +pub fn benchmark_hsm_operations(hsm: &HsmClient, iterations: usize) -> Result<(), anyhow::Error> { + // Benchmark key generation + let start = Instant::now(); + let mut key_slots = Vec::with_capacity(iterations); + + for _ in 0..iterations { + let key_slot = hsm.generate_key_pair(KeyType::Secp256k1)?; + key_slots.push(key_slot); + } + + let key_gen_duration = start.elapsed(); + println!("Key generation: {} ms/op", key_gen_duration.as_millis() as f64 / iterations as f64); + + // Benchmark signing + let data = b"Test message for signing"; + let start = Instant::now(); + + for key_slot in &key_slots { + hsm.sign(data, *key_slot)?; + } + + let signing_duration = start.elapsed(); + println!("Signing: {} ms/op", signing_duration.as_millis() as f64 / iterations as f64); + + Ok(()) +} + +// Benchmark threshold signatures +pub async fn benchmark_threshold_signatures

( + party: &mut P, + hsm: &HsmClient, + iterations: usize, +) -> Result<(), anyhow::Error> +where + P: MpcParty, +{ + // Setup for benchmarking + let execution_id = b"benchmark"; + let party_index = 0; + let party_count = 3; + let threshold = 2; + + // Benchmark key generation + let start = Instant::now(); + let (key_share, key_slot) = generate_threshold_key( + party, + execution_id, + party_index, + party_count, + threshold, + hsm, + ).await?; + + let key_gen_duration = start.elapsed(); + println!("Threshold key generation: {} ms", key_gen_duration.as_millis()); + + // Benchmark presignature generation + let start = Instant::now(); + let presignatures = batch_generate_presignatures( + party, + execution_id, + party_index, + &[0, 1, 2], + &key_share, + iterations, + ).await?; + + let presign_duration = start.elapsed(); + println!("Presignature generation: {} ms/op", presign_duration.as_millis() as f64 / iterations as f64); + + // Benchmark online signing + let message = b"Test message for threshold signing"; + let start = Instant::now(); + + for presignature in presignatures { + fast_sign_with_presignature( + party, + execution_id, + party_index, + &[0, 1, 2], + &key_share, + presignature, + message, + ).await?; + } + + let signing_duration = start.elapsed(); + println!("Online signing: {} ms/op", signing_duration.as_millis() as f64 / iterations as f64); + + Ok(()) +} +``` + +## 8. Security testing and validation + +### 8.1 Testing framework overview + +Implement a comprehensive security testing framework: + +```rust +// in src/security_testing.rs +pub struct SecurityTest { + name: String, + description: String, + severity: TestSeverity, + test_function: Box Result<(), anyhow::Error>>, +} + +pub enum TestSeverity { + Critical, + High, + Medium, + Low, + Info, +} + +pub struct SecurityTestRunner { + tests: Vec, +} + +impl SecurityTestRunner { + pub fn new() -> Self { + Self { tests: Vec::new() } + } + + pub fn add_test(&mut self, test: SecurityTest) { + self.tests.push(test); + } + + pub fn run_tests(&self) -> Vec { + let mut results = Vec::new(); + + for test in &self.tests { + println!("Running test: {}", test.name); + let start = Instant::now(); + let result = (test.test_function)(); + let duration = start.elapsed(); + + let test_result = TestResult { + name: test.name.clone(), + passed: result.is_ok(), + error: result.err().map(|e| e.to_string()), + duration, + }; + + results.push(test_result); + } + + results + } +} + +pub struct TestResult { + name: String, + passed: bool, + error: Option, + duration: Duration, +} +``` + +### 8.2 HSM security testing + +Test the HSM's security properties: + +```rust +// in src/hsm_security_tests.rs +pub fn create_hsm_security_tests() -> Vec { + let mut tests = Vec::new(); + + // Test key extraction prevention + tests.push(SecurityTest { + name: "Key Extraction Prevention".to_string(), + description: "Tests that private keys cannot be extracted from HSM".to_string(), + severity: TestSeverity::Critical, + test_function: Box::new(|| { + let hsm = HsmClient::new()?; + let key_slot = hsm.generate_key_pair(KeyType::Secp256k1)?; + + // Attempt to extract private key (should fail) + // This is a conceptual example, the actual code would depend on HSM API + let result = extract_private_key(&hsm, key_slot); + + if result.is_ok() { + return Err(anyhow::anyhow!("Private key extraction succeeded but should fail")); + } + + Ok(()) + }), + }); + + // Test tamper detection + tests.push(SecurityTest { + name: "Tamper Detection".to_string(), + description: "Tests that tamper detection mechanisms are active".to_string(), + severity: TestSeverity::High, + test_function: Box::new(|| { + // Simplified test - in reality would require physical intervention + // or mocking the tamper detection circuits + + let hsm = HsmClient::new()?; + + // Check if tamper detection is active + // This would require HSM-specific implementation + let tamper_active = check_tamper_detection_active(&hsm)?; + + if !tamper_active { + return Err(anyhow::anyhow!("Tamper detection is not active")); + } + + Ok(()) + }), + }); + + // Add more tests as needed + + tests +} + +fn extract_private_key(hsm: &HsmClient, key_slot: i32) -> Result, anyhow::Error> { + // This should fail - private keys should not be extractable + // Implementation would depend on HSM API + Err(anyhow::anyhow!("Private key extraction not supported")) +} + +fn check_tamper_detection_active(hsm: &HsmClient) -> Result { + // Implementation would depend on HSM API + // This is a simplified example + Ok(true) +} +``` + +### 8.3 MPC protocol testing + +Test the MPC protocol's security properties: + +```rust +// in src/mpc_security_tests.rs +pub async fn create_mpc_security_tests

(party: &mut P) -> Vec +where + P: MpcParty, +{ + let mut tests = Vec::new(); + + // Test threshold enforcement + tests.push(SecurityTest { + name: "Threshold Enforcement".to_string(), + description: "Tests that signatures require at least t parties".to_string(), + severity: TestSeverity::Critical, + test_function: Box::new(|| { + // Implementation would require multiple parties and would be async + // This is a simplified example + Ok(()) + }), + }); + + // Test against known vulnerabilities + tests.push(SecurityTest { + name: "Known Vulnerability Testing".to_string(), + description: "Tests against known vulnerabilities in MPC protocols".to_string(), + severity: TestSeverity::Critical, + test_function: Box::new(|| { + // Test against alpha-shuffle attack + // Test against c-split attack + // Test against presignature reuse + // Test against undefined delta_inv attack + + // Implementation would require specific attack simulations + // This is a simplified example + Ok(()) + }), + }); + + // Add more tests as needed + + tests +} +``` + +### 8.4 Integration testing + +Test the integrated system: + +```rust +// in src/integration_security_tests.rs +pub async fn create_integration_security_tests

( + party: &mut P, + hsm: &HsmClient, +) -> Vec +where + P: MpcParty, +{ + let mut tests = Vec::new(); + + // Test end-to-end signature creation and verification + tests.push(SecurityTest { + name: "End-to-End Signature Verification".to_string(), + description: "Tests the complete signature creation and verification process".to_string(), + severity: TestSeverity::High, + test_function: Box::new(|| { + // Implementation would require multiple parties and would be async + // This is a simplified example + Ok(()) + }), + }); + + // Test error handling + tests.push(SecurityTest { + name: "Error Handling".to_string(), + description: "Tests system behavior with intentionally corrupted inputs".to_string(), + severity: TestSeverity::Medium, + test_function: Box::new(|| { + // Test with corrupted inputs + // Test with network interruptions + // Test system recovery + + // Implementation would require specific error simulations + // This is a simplified example + Ok(()) + }), + }); + + // Add more tests as needed + + tests +} +``` + +### 8.5 Continuous security validation + +Implement continuous security validation: + +```rust +// in src/continuous_validation.rs +pub struct SecurityMonitor { + hsm: HsmClient, + check_interval: Duration, +} + +impl SecurityMonitor { + pub fn new(hsm: HsmClient, check_interval: Duration) -> Self { + Self { hsm, check_interval } + } + + pub async fn start_monitoring(&self) -> Result<(), anyhow::Error> { + let mut interval = tokio::time::interval(self.check_interval); + + loop { + interval.tick().await; + + // Perform security checks + if let Err(err) = self.check_hsm_integrity().await { + log::error!("HSM integrity check failed: {}", err); + // Implement alerting mechanism + } + + if let Err(err) = self.check_system_integrity().await { + log::error!("System integrity check failed: {}", err); + // Implement alerting mechanism + } + } + } + + async fn check_hsm_integrity(&self) -> Result<(), anyhow::Error> { + // Verify HSM connectivity + // Check tamper detection status + // Verify key accessibility + + // Implementation would depend on HSM API + // This is a simplified example + Ok(()) + } + + async fn check_system_integrity(&self) -> Result<(), anyhow::Error> { + // Check critical system files + // Verify running processes + // Check network connections + + // Implementation would depend on system requirements + // This is a simplified example + Ok(()) + } +} +``` + +## Conclusion + +Integrating Zymbit HSM6 hardware security modules with Rust-based CGGMP21 threshold signature implementations on Raspberry Pi creates a powerful security architecture for validator nodes. By combining hardware-based key protection with distributed trust, this approach significantly reduces the risk of key compromise while maintaining operational flexibility. + +The implementation covers integration with Rust through FFI bindings, migration from CGGMP20 to CGGMP21, comprehensive key management with SLIP39, hardware setup and configuration, performance optimization strategies, and security testing frameworks. These components form a complete solution for deploying secure validator infrastructure. + +While implementing this architecture requires careful integration work and thorough testing, the security benefits are substantial and well worth the effort for protecting high-value blockchain operations. diff --git a/docs/tsshock.md b/docs/tsshock.md new file mode 100644 index 00000000..2659698c --- /dev/null +++ b/docs/tsshock.md @@ -0,0 +1,108 @@ +# TSSHOCK Vulnerability Assessment for Lux Network Bridge + +## Executive Summary + +The Lux Network Bridge has been evaluated for vulnerability to TSSHOCK attacks, a set of three critical extraction attacks against threshold ECDSA (t-ECDSA) implementations. Based on our analysis, the current implementation using ZenGo's Rust-based multi-party-ecdsa library with the GG18 protocol provides reasonable protection against these attack vectors. The planned migration to the DKLs23 protocol will further strengthen security against these and other potential attacks. + +## Understanding TSSHOCK Attack Vectors + +TSSHOCK represents three distinct but related attack vectors targeting threshold ECDSA implementations: + +### 1. α-shuffle Attack + +**Attack mechanism**: Exploits ambiguous encoding schemes where multiple different input combinations can produce identical hashes. For example, when concatenating with delimiters like '$', the inputs "a$bc" and "ab$c" can be manipulated to create signing vulnerabilities. + +**Real-world example**: The Binance tss-lib vulnerability used concatenation with '$' delimiters, allowing attackers to craft malicious signing requests that could leak private key information over multiple signatures. + +**Security impact**: If successful, allows an attacker to extract the complete private key after observing only a small number of signatures (typically 2-4). + +### 2. c-split Attack + +**Attack mechanism**: Exploits optimized implementations where a 256-bit challenge is used only once in the signing process, particularly vulnerable when operating with composite group orders rather than prime orders. This occurs when implementations optimize the number of iterations or checks in cryptographic proofs. + +**Technical detail**: When the challenge space is split (due to operating in a composite group), an attacker can exploit mathematical properties to recover key shares through selective failures and repeated signing attempts. + +**Security impact**: Allows key extraction with significantly fewer signing operations than should be theoretically required, dramatically reducing the security margin. + +### 3. c-guess Attack + +**Attack mechanism**: Exploits implementations that reduce the number of iterations in zero-knowledge proofs (specifically discrete-log proofs) from the cryptographically secure value of 128 to as low as 1, for performance optimization. + +**Attack process**: With dramatically reduced iteration counts, an attacker can simply guess challenge bits with high probability of success and extract key information through repeated signature requests. + +**Security impact**: Permits complete key extraction through a relatively small number of signing requests, defeating the security guarantees of the threshold scheme. + +## Lux Network Bridge Implementation Analysis + +The Lux Network Bridge implements threshold ECDSA using ZenGo's Rust-based multi-party-ecdsa library, which differs significantly from the vulnerable Binance tss-lib implementation: + +### Current Implementation Security + +| Attack Vector | Risk Assessment | Mitigation Factors | +|---------------|-----------------|-------------------| +| α-shuffle attack | Low Risk | Uses structured encoding with fixed-length components and explicit type conversion in message formation. The `abi.encodePacked()` function is used with hex-encoded values of specified lengths, preventing ambiguous parsing. | +| c-split attack | Low-to-Medium Risk | The GG18/GG20 implementation operates in a prime-order elliptic curve group and follows the academic protocol specifications closely. Challenge generation uses strong cryptographic hash functions with domain separation. | +| c-guess attack | Low Risk | The implementation maintains appropriate security parameters in zero-knowledge proofs and doesn't implement the drastic iteration reductions that made the tss-lib implementation vulnerable. | + +### Key Security Features + +1. **Structured message encoding**: The Bridge contract uses fixed-length hex encoding for hash values with explicit length specifications: + +```solidity +string memory message = append( + Strings.toHexString(uint256(teleport.networkIdHash), 32), + hashedTxId_, + Strings.toHexString(uint256(teleport.tokenAddressHash), 32), + teleport.tokenAmount, + teleport.decimals, + Strings.toHexString(uint256(teleport.receiverAddressHash), 32), + vault_ +); +``` + +2. **Robust challenge generation**: The MPC implementation generates challenges using cryptographically secure methods that prevent mathematical vulnerabilities. + +3. **Strong zero-knowledge proofs**: The proof systems maintain proper security parameters without excessive optimization. + +4. **Transaction replay protection**: The Bridge contract explicitly prevents transaction replay: + +```solidity +// Check if signedTxInfo already exists +require( + !transactionMap[signedTXInfo_].exists, + "Duplicated Transaction Hash" +); +``` + +## Planned Security Enhancements + +### DKLs23 Protocol Migration + +The Lux Network Bridge is planning to migrate from CGGMP20/GG18 to the DKLs23 protocol, which offers significant security improvements: + +1. **Minimal security assumptions**: Requires nothing beyond what's already assumed for standard ECDSA +2. **Three-round efficiency**: Reduces communication rounds from 4+ to just 3 +3. **Information-theoretic UC security**: Based only on ideal commitment and two-party multiplication primitives +4. **Computational efficiency**: 2-3 orders of magnitude faster than homomorphic encryption-based alternatives +5. **Simplified implementation**: Reduces complex zero-knowledge proofs and cryptographic primitives + +### Additional Recommended Security Measures + +1. **Enhanced input validation**: Implement additional validation checks for message components before encoding and signing +2. **Parameter verification**: Add runtime verification of cryptographic parameters, especially for zero-knowledge proofs +3. **Formal security audit**: Commission a specialized cryptographic audit focusing on the MPC implementation +4. **Targeted testing**: Develop specific test cases attempting to exploit TSSHOCK attack vectors +5. **Regular security updates**: Maintain an update schedule for cryptographic libraries and keep abreast of new research + +## Conclusion + +The current Lux Network Bridge implementation, based on ZenGo's Rust multi-party-ecdsa library, provides robust protection against the TSSHOCK attack vectors. The fundamental architecture choices in the protocol implementation, message encoding, challenge generation, and security parameters create strong security barriers. + +The planned migration to the DKLs23 protocol will further enhance security by implementing a protocol specifically designed to minimize cryptographic assumptions and provide stronger mathematical security guarantees. This migration represents a proactive security measure that aligns with industry best practices for cross-chain bridge implementations. + +## References + +1. TSSHOCK vulnerability disclosure: https://eprint.iacr.org/2023/170 +2. DKLs23 protocol: IEEE Symposium on Security and Privacy 2024 +3. ZenGo multi-party-ecdsa: https://github.com/ZenGo-X/multi-party-ecdsa +4. CGGMP20 protocol: Canetti R., Gennaro R., Goldfeder S., Makriyannis N., Peled U. (2020) \ No newline at end of file diff --git a/docs/unified-mpc-library.md b/docs/unified-mpc-library.md new file mode 100644 index 00000000..233d1204 --- /dev/null +++ b/docs/unified-mpc-library.md @@ -0,0 +1,1184 @@ +# Unified MPC Library: Bridging ECDSA and EdDSA for Lux.Network + +## BLUF: Signature scheme integration made simple + +This documentation provides a comprehensive guide for creating a unified Multi-Party Computation (MPC) library that abstracts the differences between ECDSA and EdDSA implementations in the Lux.Network bridge. The library enables developers to implement consistent threshold signature workflows across different blockchain networks using a single API, regardless of the underlying signature scheme. Key architectural considerations include a layered abstraction approach, clearly defined interfaces between components, deterministic key derivation from common seeds, and specialized protocol implementations optimized for each signature scheme's mathematical properties. + +## Understanding signature schemes fundamentals + +Digital signature schemes form the backbone of blockchain transaction security, with ECDSA and EdDSA representing two distinct approaches with different security properties and implementation characteristics. + +### ECDSA vs EdDSA: Core differences + +**ECDSA (Elliptic Curve Digital Signature Algorithm)** and **EdDSA (Edwards-curve Digital Signature Algorithm)** differ in several fundamental ways that impact their MPC implementations: + +| Characteristic | ECDSA | EdDSA | +| --- | --- | --- | +| Curve type | Weierstrass form (y² = x³ + ax + b) | Twisted Edwards (ax² + y² = 1 + dx²y²) | +| Popular curves | secp256k1, secp256r1 | Ed25519, Ed448 | +| Nonce generation | Traditionally random (security risk) | Deterministic (derived from key and message) | +| MPC complexity | Higher (non-linear signing equation) | Lower (linear structure) | +| Communication rounds | Typically 4+ rounds | Typically 3 rounds | +| Side-channel resistance | Requires careful implementation | Built-in protection | + +**Impact on MPC implementation:** EdDSA's design makes it inherently more MPC-friendly due to its deterministic nonce generation and simpler mathematical structure. ECDSA requires more complex protocols to handle its non-linear signing equation securely in a distributed setting. + +### Which blockchains use what? + +Different blockchain networks use different signature schemes: + +- **ECDSA**: Bitcoin, Ethereum, and most EVM-compatible chains +- **EdDSA**: Solana (Ed25519), Cardano, Polkadot, Cosmos + +The Lux.Network bridge currently supports numerous EVM-compatible chains using ECDSA, and adding support for EdDSA will enable connections to networks like Solana, Cardano, and other non-EVM chains. + +## Architecture design principles + +The unified MPC library's architecture must balance abstraction with optimization, providing a consistent API while leveraging scheme-specific optimizations under the hood. + +### Layered abstraction model + +A multi-layered architecture provides different levels of abstraction for different use cases: + +``` +┌─────────────────────────────────────────────────────┐ +│ Application Layer (Lux.Network Bridge) │ +└───────────────────────┬─────────────────────────────┘ + │ +┌───────────────────────▼─────────────────────────────┐ +│ Unified API (Common Interface for All Schemes) │ +└───────────────────────┬─────────────────────────────┘ + │ +┌───────────────────────▼─────────────────────────────┐ +│ Protocol Layer (Scheme-Specific MPC Protocols) │ +├─────────────┬─────────┴────────────┬────────────────┤ +│ ECDSA │ EdDSA │ Future Schemes │ +│ Protocol │ Protocol │ (e.g., BLS) │ +└─────────────┴──────────────────────┴────────────────┘ +``` + +This structure allows for: +- **Common interfaces** at the top layers +- **Specialized implementations** at the lower layers +- **Easy extensibility** for future signature schemes + +### Component interfaces + +The core interfaces establish a consistent API across signature schemes: + +```typescript +enum SignatureScheme { + ECDSA = 'ecdsa', + EDDSA = 'eddsa' +} + +interface SignatureService { + generateKeyPair(params: SecurityParameters): Promise; + distributeShares(key: PrivateKey, threshold: number, parties: number): Promise; + sign(message: Buffer, scheme?: SignatureScheme): Promise; + verify(message: Buffer, signature: Buffer, publicKey: Buffer, scheme?: SignatureScheme): Promise; + getSignatureSchemeForChain(chainId: string): SignatureScheme; +} + +interface KeyShare { + getPartyId(): number; + getShareData(): Buffer; + isValid(): boolean; + refresh(): KeyShare; // For proactive security +} + +interface MPCParty { + sendMessage(message: Message, recipient: PartyId): void; + broadcast(message: Message): void; + registerMessageHandler(handler: MessageHandler): void; + getSessionState(): SessionState; +} + +interface SignatureResult { + signature: string; + signer?: string; // For ECDSA, recovered address; empty for EdDSA + scheme: SignatureScheme; +} +``` + +### Factory pattern for scheme selection + +Implement a factory pattern to instantiate the appropriate cryptographic implementations: + +```typescript +// Create the appropriate scheme implementation +const signatureService = SignatureServiceFactory.create({ + supportedSchemes: [SchemeType.ECDSA, SchemeType.EDDSA], + ecdsaParams: new ECDSAParameters(Curve.SECP256K1), + eddsaParams: new EdDSAParameters(Curve.ED25519), + defaultScheme: SchemeType.ECDSA +}); + +// Chain-to-scheme mapping configuration +const chainSignatureSchemes: Record = { + // Default EVM chains use ECDSA + "1": SignatureScheme.ECDSA, // Ethereum + "56": SignatureScheme.ECDSA, // BSC + "137": SignatureScheme.ECDSA, // Polygon + "43114": SignatureScheme.ECDSA, // Avalanche + "96369": SignatureScheme.ECDSA, // Lux Network + "200200": SignatureScheme.ECDSA, // Zoo Network + // Non-EVM chains use EdDSA + "SOL-MAINNET": SignatureScheme.EDDSA, // Solana + "SOL-DEVNET": SignatureScheme.EDDSA, // Solana Devnet + "ADA-MAINNET": SignatureScheme.EDDSA, // Cardano + "DOT-MAINNET": SignatureScheme.EDDSA, // Polkadot +}; +``` + +This approach encapsulates implementation details while providing a consistent interface. + +## Integration with Lux.Network Bridge + +### Docker Configuration + +Update the Dockerfile for MPC nodes to include both ECDSA and EdDSA implementations: + +```dockerfile +# Use Rust as the base image +FROM rust:latest AS rust_builder + +# Set the working directory +WORKDIR /app + +# Clone the external MPC repositories instead of embedding them +RUN apt-get update && apt-get install -y git pkg-config libssl-dev && rm -rf /var/lib/apt/lists/* + +# Clone the ECDSA repository +RUN git clone https://github.com/luxfi/multi-party-ecdsa.git ./ecdsa + +# Clone the EdDSA repository +RUN git clone https://github.com/luxfi/multi-party-eddsa.git ./eddsa + +# Install nightly version of Rust and set it as the default toolchain +RUN rustup install nightly +RUN rustup default nightly + +# Build the ECDSA library +WORKDIR /app/ecdsa +RUN cargo +nightly build --release --examples + +# Build the EdDSA library +WORKDIR /app/eddsa +RUN cargo +nightly build --release --examples + +# Use Node.js for the final image +FROM node:20 + +# Set working directory in Node container +WORKDIR /app + +COPY ./common/node . + +# Install Node.js dependencies +RUN npm install + +# Build node app +RUN npm run build + +# Create multiparty directory structure +RUN mkdir -p ./dist/multiparty/ecdsa ./dist/multiparty/eddsa + +# Copy the built ECDSA Rust binaries and examples +COPY --from=rust_builder /app/ecdsa/target/release/examples ./dist/multiparty/ecdsa/target/release/examples +COPY --from=rust_builder /app/ecdsa/target/release/deps ./dist/multiparty/ecdsa/target/release/deps + +# Copy the built EdDSA Rust binaries and examples +COPY --from=rust_builder /app/eddsa/target/release/examples ./dist/multiparty/eddsa/target/release/examples +COPY --from=rust_builder /app/eddsa/target/release/deps ./dist/multiparty/eddsa/target/release/deps + +EXPOSE 6000 + +# Command to run the application +CMD ["node", "dist/node.js"] +``` + +### Environment Configuration + +Add environment variables for signature scheme configuration: + +```yaml +services: + mpc-node: + environment: + # Existing variables... + + # New environment variables for signature scheme selection + - ECDSA_CLIENT_NAME=gg18_sign_client + - ECDSA_SM_MANAGER=gg18_sm_manager + - EDDSA_CLIENT_NAME=frost_sign_client + - EDDSA_SM_MANAGER=frost_sm_manager + - DEFAULT_SIGNATURE_SCHEME=ecdsa +``` + +## Unified key generation and management + +Key generation and management are critical components that must be carefully designed to work across different signature schemes. + +### From common seed to scheme-specific keys + +Rather than trying to convert between ECDSA and EdDSA keys (which is not mathematically sound), derive different scheme-specific keys from a common master seed: + +``` +Master Seed + │ + ├─→ KDF(seed, "ECDSA") → ECDSA Private Key + │ + └─→ KDF(seed, "EdDSA") → EdDSA Private Key +``` + +This approach ensures: +- A single backup seed can restore all keys +- Different schemes use cryptographically isolated keys +- No security compromises from attempted direct conversions + +### Implementing unified key generation script + +```typescript +// Key generation utilities +class KeyGenerator { + private masterSeed: Buffer; + + constructor(seed?: Buffer) { + // Generate random seed if not provided + this.masterSeed = seed || crypto.randomBytes(32); + } + + // Get the master seed (for backup) + getMasterSeed(): Buffer { + return this.masterSeed; + } + + // Derive ECDSA key + deriveECDSAKey(): Buffer { + return crypto.createHmac('sha256', this.masterSeed) + .update('ECDSA') + .digest(); + } + + // Derive EdDSA key + deriveEdDSAKey(): Buffer { + return crypto.createHmac('sha256', this.masterSeed) + .update('EdDSA') + .digest(); + } + + // Generate keys for all supported schemes + generateAllKeys(): Record { + return { + [SignatureScheme.ECDSA]: this.deriveECDSAKey(), + [SignatureScheme.EDDSA]: this.deriveEdDSAKey() + }; + } +} +``` + +### Bash script for unified keygen + +```bash +#!/bin/bash +# Dual-signature key generation script for MPC nodes +# This script generates keys for both ECDSA and EdDSA signature schemes + +# Configuration +NODE_NUMBER=${NODE_NUMBER:-0} +ECDSA_CLIENT_NAME=${ECDSA_CLIENT_NAME:-gg18_keygen_client} +ECDSA_SM_MANAGER=${ECDSA_SM_MANAGER:-gg18_sm_manager} +EDDSA_CLIENT_NAME=${EDDSA_CLIENT_NAME:-frost_keygen_client} +EDDSA_SM_MANAGER=${EDDSA_SM_MANAGER:-frost_sm_manager} +THRESHOLD=${THRESHOLD:-2} +TOTAL_PARTIES=${TOTAL_PARTIES:-3} + +echo "===== MPC Key Generation for Node $NODE_NUMBER =====" +echo "- Threshold: $THRESHOLD" +echo "- Total Parties: $TOTAL_PARTIES" + +# Generate ECDSA keys +echo "===== Generating ECDSA Keys =====" +cd /app/dist/multiparty/ecdsa || exit 1 +./target/release/examples/$ECDSA_CLIENT_NAME $ECDSA_SM_MANAGER $NODE_NUMBER $THRESHOLD $TOTAL_PARTIES + +# Check if ECDSA keygen was successful +if [ $? -ne 0 ]; then + echo "❌ ECDSA key generation failed!" + exit 1 +else + echo "✅ ECDSA key generation successful!" +fi + +# Generate EdDSA keys +echo "===== Generating EdDSA Keys =====" +cd /app/dist/multiparty/eddsa || exit 1 +./target/release/examples/$EDDSA_CLIENT_NAME $EDDSA_SM_MANAGER $NODE_NUMBER $THRESHOLD $TOTAL_PARTIES + +# Check if EdDSA keygen was successful +if [ $? -ne 0 ]; then + echo "❌ EdDSA key generation failed!" + exit 1 +else + echo "✅ EdDSA key generation successful!" +fi + +echo "===== Key Generation Complete =====" +echo "✅ Both ECDSA and EdDSA keys have been generated successfully!" +echo "✅ Node is ready for signing operations." +``` + +## The signing process: protocol differences + +The signing process reveals the most significant differences between ECDSA and EdDSA in MPC contexts. + +### Implementing unified signing utility + +```typescript +// Signature scheme enum +enum SignatureScheme { + ECDSA = 'ecdsa', + EDDSA = 'eddsa' +} + +// Default signature scheme from environment +const DEFAULT_SIGNATURE_SCHEME = (process.env.DEFAULT_SIGNATURE_SCHEME || 'ecdsa').toLowerCase() as SignatureScheme + +// Client and manager names for different signature schemes +const SIGNATURE_CONFIG = { + [SignatureScheme.ECDSA]: { + clientName: process.env.ECDSA_CLIENT_NAME || process.env.sign_client_name, + smManager: process.env.ECDSA_SM_MANAGER || process.env.sign_sm_manager, + directory: 'ecdsa' + }, + [SignatureScheme.EDDSA]: { + clientName: process.env.EDDSA_CLIENT_NAME || 'frost_sign_client', + smManager: process.env.EDDSA_SM_MANAGER || 'frost_sm_manager', + directory: 'eddsa' + } +} + +/** + * Map chain IDs to signature schemes + * Defaults to ECDSA for backward compatibility + */ +const CHAIN_SIGNATURE_SCHEMES: Record = { + // Default EVM chains use ECDSA + "1": SignatureScheme.ECDSA, // Ethereum + "56": SignatureScheme.ECDSA, // BSC + "137": SignatureScheme.ECDSA, // Polygon + "43114": SignatureScheme.ECDSA, // Avalanche + "96369": SignatureScheme.ECDSA, // Lux Network + "200200": SignatureScheme.ECDSA, // Zoo Network + // Non-EVM chains use EdDSA + "SOL-MAINNET": SignatureScheme.EDDSA, // Solana + "SOL-DEVNET": SignatureScheme.EDDSA, // Solana Devnet + "ADA-MAINNET": SignatureScheme.EDDSA, // Cardano + "DOT-MAINNET": SignatureScheme.EDDSA, // Polkadot +}; + +/** + * Get signature scheme for a chain + * @param chainId Chain ID + * @returns Signature scheme to use + */ +export const getSignatureSchemeForChain = (chainId: string): SignatureScheme => { + return CHAIN_SIGNATURE_SCHEMES[chainId] || DEFAULT_SIGNATURE_SCHEME +} + +/** + * generate signature using the appropriate scheme + * @param msgHash Message hash to sign + * @param scheme Signature scheme to use + * @returns Signature components + */ +export const signClient = async (msgHash: string, scheme: SignatureScheme = DEFAULT_SIGNATURE_SCHEME) => { + return new Promise(async (resolve, reject) => { + try { + const config = SIGNATURE_CONFIG[scheme] + console.log(`========================================================= In ${scheme.toUpperCase()} Sign Client ============================================================`) + + const list = await find("name", `${config.clientName} ${config.smManager}`) + if (list.length > 0) { + console.log("::clientAlreadyRunning:::", list) + try { + const x = list.length === 1 ? 0 : 1 + const uptimeCmd = "ps -p " + list[x].pid + " -o etime" + const uptimeOut = await exec(uptimeCmd) + const upStdout = uptimeOut.stdout + const upStderr = uptimeOut.stderr + + if (upStdout) { + const up = upStdout.split("\n")[1].trim().split(":") + console.log("::upStdout:", up, "Time Bound:", smTimeOutBound) + const upStdoutArr = up + // SM Manager timed out + if (Number(upStdoutArr[upStdoutArr.length - 1]) >= smTimeOutBound) { + console.log("::SM Manager signing timeout reached") + try { + for (const p of list) { + await killSigner(String(p.pid)) + } + const cmd = `./target/release/examples/${config.clientName} ${config.smManager} ${keyStore} ${msgHash}` + await exec(cmd, { cwd: __dirname + `/multiparty/${config.directory}`, shell: "/bin/bash" }) + } catch (err) { + console.log("::Partial signature process may not have exited:", err) + resolve(signClient(msgHash, scheme)) + return + } + } else { + // Retry with same scheme + resolve(signClient(msgHash, scheme)) + return + } + } else { + console.log("::upStderr:", upStderr) + reject("::SignerDeadError2:" + upStderr) + return + } + } catch (err) { + console.log("::SignerDeadError3:", err) + reject("SignerDeadError3:" + err) + return + } + } else { + console.log("About to message signers...") + try { + //Invoke client signer + console.log(`::Using ${scheme} signer: ${config.clientName} ${config.smManager}`) + const cmd = `./target/release/examples/${config.clientName} ${config.smManager} ${keyStore} ${msgHash}` + console.log("::command: ", cmd) + const out = await exec(cmd, { cwd: __dirname + `/multiparty/${config.directory}` }) + const { stdout, stderr } = out + console.log("::stdout:", stdout, stderr) + + if (stdout) { + if (scheme === SignatureScheme.ECDSA) { + // Process ECDSA signature format + const sig = stdout.split("sig_json")[1].split(",") + if (sig.length > 0) { + const r = sig[0].replace(": ", "").replace(/["]/g, "").trim() + const s = sig[1].replace(/["]/g, "").trim() + const v = Number(sig[2].replace(/["]/g, "")) === 0 ? "1b" : "1c" + let signature = "0x" + r + s + v + if (signature.length < 132) { + throw new Error("elements in xs are not pairwise distinct") + } + // Handle odd length sigs + if (signature.length % 2 != 0) { + signature = "0x0" + signature.split("0x")[1] + } + + console.log("::ECDSA Signature:", signature) + resolve({ r, s, v, signature, scheme: SignatureScheme.ECDSA }) + return + } + } else if (scheme === SignatureScheme.EDDSA) { + // Process EdDSA signature format + const sigOutput = stdout.trim() + const signatureMatch = sigOutput.match(/signature: ([0-9a-fA-F]+)/) + + if (signatureMatch && signatureMatch[1]) { + const signature = "0x" + signatureMatch[1] + console.log("::EdDSA Signature:", signature) + resolve({ signature, scheme: SignatureScheme.EDDSA }) + return + } else { + reject("EdDSA signature format not recognized") + return + } + } + } else { + console.log("::stderr:" + stderr) + reject("SignerFailError1:" + stderr) + return + } + } catch (err) { + console.log("::SignerFailError2:" + err) + + if (err.toString().includes("elements in xs are not pairwise distinct")) { + await sleep(2000) + resolve(signClient(msgHash, scheme)) + return + } else { + reject("SignerFailError2: " + err) + return + } + } + } + } catch (err) { + console.log("::sign client error: =======================") + console.log(err.stack || err) + reject(err.stack) + return + } + }) +} + +/** + * sign message with appropriate signature scheme + * @param message + * @param web3 + * @param chainId + * @returns + */ +export const signMessage = async (message: string, web3: Web3, chainId?: string) => { + try { + // Determine the signature scheme to use + const scheme = chainId ? getSignatureSchemeForChain(chainId) : DEFAULT_SIGNATURE_SCHEME + + if (scheme === SignatureScheme.ECDSA) { + return signECDSAMessage(message, web3) + } else { + return signEdDSAMessage(message) + } + } catch (err) { + console.log("Error:", err) + return Promise.reject(err) + } +} + +/** + * Sign a message using ECDSA + */ +const signECDSAMessage = async (message: string, web3: Web3) => { + const myMsgHashAndPrefix = web3.eth.accounts.hashMessage(message) + const netSigningMsg = myMsgHashAndPrefix.substr(2) + + try { + const { signature, r, s, v } = (await signClient(netSigningMsg, SignatureScheme.ECDSA)) as any + let signer = "" + try { + signer = recoverAddress(myMsgHashAndPrefix, signature) + console.log("ECDSA MPC Address:", signer) + } catch (err) { + console.log("err: ", err) + } + return Promise.resolve({ signature, signer, scheme: SignatureScheme.ECDSA }) + } catch (err) { + console.log("Error:", err) + return Promise.reject("signClientError:") + } +} + +/** + * Sign a message using EdDSA + */ +const signEdDSAMessage = async (message: string) => { + // Convert message to appropriate format for EdDSA + // This might be different than ECDSA hashing + const messageBuffer = Buffer.from(message) + const messageHash = messageBuffer.toString('hex') + + try { + const { signature } = (await signClient(messageHash, SignatureScheme.EDDSA)) as any + // EdDSA doesn't use recovery, so we can't derive the public key here + // You'd need to store the public key and verify signatures differently + return Promise.resolve({ signature, signer: "", scheme: SignatureScheme.EDDSA }) + } catch (err) { + console.log("Error:", err) + return Promise.reject("signClientError:") + } +} +``` + +### Dynamic transaction signing + +```typescript +/** + * hash tx and sign with appropriate signature scheme + * @param param0 + * @returns + */ +export const hashAndSignTx = async ({ web3Form, vault, toNetworkId, hashedTxId, toTokenAddress, tokenAmount, decimals, receiverAddressHash, nonce }: SIGN_REQUEST) => { + try { + const scheme = getSignatureSchemeForChain(toNetworkId) + const toNetworkIdHash = Web3.utils.keccak256(toNetworkId) + const toTokenAddressHash = Web3.utils.keccak256(toTokenAddress) + + // Format message based on signature scheme + if (scheme === SignatureScheme.ECDSA) { + const message = concatMsg(toNetworkIdHash, hashedTxId, toTokenAddressHash, tokenAmount, decimals, receiverAddressHash, vault) + console.log("::ECDSA message to sign: ", message) + const hash = web3Form.utils.soliditySha3(message) + const { signature, signer } = await signMessage(hash, web3Form, toNetworkId) + return Promise.resolve({ signature, mpcSigner: signer }) + } else { + // EdDSA message formatting might be different for specific chains + const message = concatMsg(toNetworkIdHash, hashedTxId, toTokenAddressHash, tokenAmount, decimals, receiverAddressHash, vault) + console.log("::EdDSA message to sign: ", message) + // For EdDSA we can hash the message differently if needed + const { signature, signer } = await signMessage(message, web3Form, toNetworkId) + return Promise.resolve({ signature, mpcSigner: signer }) + } + } catch (err) { + if (err.toString().includes("invalid point")) { + hashAndSignTx({ web3Form, vault, toNetworkId, hashedTxId, toTokenAddress, tokenAmount, decimals, receiverAddressHash, nonce }) + } else { + console.log(err) + return Promise.reject(err) + } + } +} +``` + +## Network handling and protocol sessions + +MPC requires secure communication between parties, with different requirements for each protocol. + +### Session state management + +```typescript +interface SessionManager { + createSession(sessionType: SessionType, params: SessionParameters): Session; + getSession(id: SessionId): Session; + closeSession(id: SessionId): void; +} + +interface Session { + getId(): SessionId; + getType(): SessionType; + getState(): SessionState; + isComplete(): boolean; + getOutgoingMessages(): Message[]; + processIncomingMessages(messages: Message[]): void; + getResult(): any; // Result depends on session type +} + +// Session implementation +class MPCSession implements Session { + private id: SessionId; + private type: SessionType; + private state: SessionState = SessionState.INITIALIZED; + private messages: Message[] = []; + private result: any = null; + private scheme: SignatureScheme; + + constructor(id: SessionId, type: SessionType, scheme: SignatureScheme) { + this.id = id; + this.type = type; + this.scheme = scheme; + } + + getId(): SessionId { + return this.id; + } + + getType(): SessionType { + return this.type; + } + + getState(): SessionState { + return this.state; + } + + isComplete(): boolean { + return this.state === SessionState.COMPLETED; + } + + getOutgoingMessages(): Message[] { + return this.messages; + } + + processIncomingMessages(messages: Message[]): void { + // Process protocol-specific messages + // This will differ between ECDSA and EdDSA + if (this.scheme === SignatureScheme.ECDSA) { + this.processECDSAMessages(messages); + } else { + this.processEdDSAMessages(messages); + } + } + + private processECDSAMessages(messages: Message[]): void { + // ECDSA-specific message processing + // ... + } + + private processEdDSAMessages(messages: Message[]): void { + // EdDSA-specific message processing + // ... + } + + getResult(): any { + return this.result; + } + + setComplete(result: any): void { + this.state = SessionState.COMPLETED; + this.result = result; + } +} + +// Session manager implementation +class MPCSessionManager implements SessionManager { + private sessions: Map = new Map(); + + createSession(type: SessionType, params: SessionParameters): Session { + const id = crypto.randomBytes(16).toString('hex'); + const scheme = params.scheme || SignatureScheme.ECDSA; + const session = new MPCSession(id, type, scheme); + this.sessions.set(id, session); + return session; + } + + getSession(id: SessionId): Session { + const session = this.sessions.get(id); + if (!session) { + throw new Error(`Session not found: ${id}`); + } + return session; + } + + closeSession(id: SessionId): void { + this.sessions.delete(id); + } +} +``` + +## Error handling and security considerations + +Robust error handling and security measures are essential for a cryptographic library. + +### Error hierarchy + +```typescript +// Base exception +class MPCException extends Error { + constructor(message: string) { + super(message); + this.name = 'MPCException'; + } +} + +// Protocol-specific exceptions +class ECDSAException extends MPCException { + constructor(message: string) { + super(message); + this.name = 'ECDSAException'; + } +} + +class EdDSAException extends MPCException { + constructor(message: string) { + super(message); + this.name = 'EdDSAException'; + } +} + +// Operation-specific exceptions +class KeyGenerationException extends MPCException { + constructor(message: string) { + super(message); + this.name = 'KeyGenerationException'; + } +} + +class SigningException extends MPCException { + constructor(message: string) { + super(message); + this.name = 'SigningException'; + } +} + +class VerificationException extends MPCException { + constructor(message: string) { + super(message); + this.name = 'VerificationException'; + } +} + +// Security-related exceptions +class SecurityViolationException extends MPCException { + constructor(message: string) { + super(message); + this.name = 'SecurityViolationException'; + } +} + +class ThresholdNotMetException extends SecurityViolationException { + constructor(message: string) { + super(message); + this.name = 'ThresholdNotMetException'; + } +} +``` + +### Security validations + +```typescript +// Security validator +class SecurityValidator { + /** + * Validate a key share + * @param share Key share to validate + * @returns True if the share is valid + */ + static validateKeyShare(share: KeyShare): boolean { + // Basic validation + if (!share || !share.getShareData()) { + throw new SecurityViolationException('Invalid key share'); + } + + // Scheme-specific validation + if (share instanceof ECDSAKeyShare) { + return this.validateECDSAKeyShare(share); + } else if (share instanceof EdDSAKeyShare) { + return this.validateEdDSAKeyShare(share); + } + + throw new SecurityViolationException('Unknown key share type'); + } + + /** + * Validate ECDSA key share + * @param share ECDSA key share + * @returns True if the share is valid + */ + private static validateECDSAKeyShare(share: ECDSAKeyShare): boolean { + // ECDSA-specific validation + // ... + return true; + } + + /** + * Validate EdDSA key share + * @param share EdDSA key share + * @returns True if the share is valid + */ + private static validateEdDSAKeyShare(share: EdDSAKeyShare): boolean { + // EdDSA-specific validation + // ... + return true; + } + + /** + * Validate that the threshold is met + * @param shares Array of shares + * @param threshold Required threshold + * @returns True if the threshold is met + */ + static validateThreshold(shares: KeyShare[], threshold: number): boolean { + if (!shares || shares.length < threshold) { + throw new ThresholdNotMetException( + `Threshold not met: ${shares ? shares.length : 0} < ${threshold}` + ); + } + + // Additional threshold validation + // ... + + return true; + } + + /** + * Validate input parameters + * @param params Parameters to validate + * @returns Validated parameters + */ + static validateInputs(params: any): any { + // Validate input types and ranges + // ... + + return params; + } +} +``` + +## Integration with Solana and other EdDSA chains + +### Solana Network Configuration + +```typescript +// Add this to your settings.ts file to support Solana and other EdDSA chains + +import { SignatureScheme } from "../types" + +// Add to the MAIN_NETWORKS array +const solanaMainnetConfig = { + display_name: "Solana", + internal_name: "SOLANA_MAINNET", + is_testnet: false, + chain_id: "SOL-MAINNET", + teleporter: "", // Replace with your Solana teleporter address + vault: "", // Replace with your Solana vault address + node: "https://api.mainnet-beta.solana.com", + signature_scheme: SignatureScheme.EDDSA, // Specify EdDSA for Solana + currencies: [ + { + name: "SOL", + asset: "SOL", + contract_address: null, + decimals: 9, + is_native: true + }, + { + name: "USDC", + asset: "USDC", + contract_address: "EPjFWdd5AufqSSqeM2qN1xzybapC8G4wEGGkZwyTDt1v", + decimals: 6, + is_native: false + }, + // Add more Solana tokens as needed + ] +} + +// Add to the TEST_NETWORKS array +const solanaDevnetConfig = { + display_name: "Solana Devnet", + internal_name: "SOLANA_DEVNET", + is_testnet: true, + chain_id: "SOL-DEVNET", + teleporter: "", + vault: "", + node: "https://api.devnet.solana.com", + signature_scheme: SignatureScheme.EDDSA, // Specify EdDSA for Solana + currencies: [ + { + name: "SOL", + asset: "SOL", + contract_address: null, + decimals: 9, + is_native: true + }, + // Add more Solana devnet tokens as needed + ] +} + +// Add these to your SWAP_PAIRS object +const solanaSwapPairs = { + // Native SOL can be swapped with wrapped SOL tokens on other chains + SOL: ["LSOL", "ZSOL"], + LSOL: ["SOL", "ZSOL"], + ZSOL: ["SOL", "LSOL"], + + // Add other token swap pairs + // ... +} + +// Export these settings to be added to your main arrays +export const NEW_NETWORKS = { + mainnet: [solanaMainnetConfig], + testnet: [solanaDevnetConfig] +} + +export const NEW_SWAP_PAIRS = solanaSwapPairs +``` + +### Verification Process for Solana + +```typescript +// Solana program for verifying Ed25519 signatures + +use solana_program::{ + account_info::{next_account_info, AccountInfo}, + entrypoint, + entrypoint::ProgramResult, + msg, + program_error::ProgramError, + pubkey::Pubkey, + ed25519_program, +}; + +// Entry point for the Solana program +entrypoint!(process_instruction); + +// Process instruction logic +pub fn process_instruction( + program_id: &Pubkey, + accounts: &[AccountInfo], + instruction_data: &[u8], +) -> ProgramResult { + msg!("Processing Bridge instruction"); + + // Get account iterator + let accounts_iter = &mut accounts.iter(); + + // Get accounts + let bridge_account = next_account_info(accounts_iter)?; + let payer = next_account_info(accounts_iter)?; + + // Verify the bridge account is owned by this program + if bridge_account.owner != program_id { + return Err(ProgramError::IncorrectProgramId); + } + + // Parse instruction type + if instruction_data.len() < 4 { + return Err(ProgramError::InvalidInstructionData); + } + + // Read instruction type from first byte + let instruction_type = instruction_data[0]; + + match instruction_type { + // Process token bridge with Ed25519 signature verification + 0 => process_bridge_tokens(accounts, &instruction_data[1..]), + + // Other instruction types + _ => { + msg!("Invalid instruction type"); + Err(ProgramError::InvalidInstructionData) + } + } +} + +// Process token bridge with Ed25519 signature verification +fn process_bridge_tokens(accounts: &[AccountInfo], data: &[u8]) -> ProgramResult { + msg!("Processing bridge token request"); + + // Get account iterator + let accounts_iter = &mut accounts.iter(); + + // Skip bridge account that was already processed + let _bridge_account = next_account_info(accounts_iter)?; + let _payer = next_account_info(accounts_iter)?; + + // Get ed25519 program account for signature verification + let ed25519_program_id = next_account_info(accounts_iter)?; + + // Ensure we're using the correct program + if *ed25519_program_id.key != ed25519_program::id() { + return Err(ProgramError::InvalidArgument); + } + + // Parse data + if data.len() < 32 + 64 + 32 { + msg!("Data too short for signature verification"); + return Err(ProgramError::InvalidInstructionData); + } + + // Extract public key, signature, and message from data + let public_key = &data[0..32]; + let signature = &data[32..96]; + let message = &data[96..]; + + // Verify the Ed25519 signature + let signature_valid = ed25519_program::verify_signature( + public_key, + message, + signature, + ); + + if !signature_valid { + msg!("Invalid Ed25519 signature"); + return Err(ProgramError::InvalidArgument); + } + + msg!("Signature verification successful"); + + // Continue with token bridging logic + // ... + + Ok(()) +} +``` + +## Testing strategy + +Comprehensive testing is essential for a cryptographic library: + +1. **Unit tests**: Individual components and methods + - Key generation + - Signature creation + - Signature verification + - Protocol message processing + +2. **Integration tests**: Interactions between components + - End-to-end signing flow + - Cross-protocol interactions + - Error handling + +3. **Property-based tests**: Mathematical properties + - Signature validity + - Key derivation properties + - Statistical properties of randomness + +4. **Security tests**: Resistance to common attacks + - Fault injection + - Timing attacks + - Replay attacks + +5. **Standard test vectors**: Compliance with standards + - ECDSA test vectors from NIST + - EdDSA test vectors from RFC 8032 + +6. **Network simulation**: Realistic conditions + - Latency simulation + - Packet loss + - Out-of-order messages + +7. **Stress testing**: Performance under load + - Multiple concurrent sessions + - Resource limitations + - Long-running operations + +## Deployment and Operations + +### Environment Variables + +| Variable | Description | Default Value | +|----------|-------------|---------------| +| `DEFAULT_SIGNATURE_SCHEME` | Default signature scheme when not specified | `ecdsa` | +| `ECDSA_CLIENT_NAME` | Name of ECDSA client executable | `gg18_sign_client` | +| `ECDSA_SM_MANAGER` | Name of ECDSA session manager executable | `gg18_sm_manager` | +| `EDDSA_CLIENT_NAME` | Name of EdDSA client executable | `frost_sign_client` | +| `EDDSA_SM_MANAGER` | Name of EdDSA session manager executable | `frost_sm_manager` | +| `smTimeOutBound` | Session manager timeout value | Varies | +| `node_number` | MPC node number in the network | Varies | + +### Deployment Steps + +1. **Update MPC Node Configuration**: + - Update Dockerfile to include both ECDSA and EdDSA implementations + - Add environment variables for signature scheme configuration + +2. **Key Generation**: + - Generate keys for both signature schemes + - Securely back up the master seed + - Distribute key shares among the MPC nodes + +3. **Network Configuration**: + - Update `settings.ts` to include new chains with their signature schemes + - Configure swap pairs for the new tokens + +4. **Testing**: + - Test transactions with both ECDSA and EdDSA chains + - Verify that the correct signature scheme is used for each chain + - Ensure proper error handling for all edge cases + +5. **Monitoring**: + - Add monitoring for both ECDSA and EdDSA signatures + - Track success rates for different signature schemes + - Log any signature failures or timeouts + +### Scaling Considerations + +1. **Horizontal Scaling**: + - Add more MPC nodes to handle increased transaction volume + - Ensure key shares are properly distributed to new nodes + +2. **Protocol Optimization**: + - Optimize message exchange for each protocol + - Implement batching for signature operations + +3. **Load Balancing**: + - Distribute signature requests across MPC nodes + - Consider chain-specific node groups for specialized hardware requirements + +## Conclusion + +Creating a unified MPC library that bridges ECDSA and EdDSA implementations unlocks significant new capabilities for the Lux.Network bridge: + +1. **Enhanced blockchain support**: Adding EdDSA support enables the bridge to connect to Solana, Cardano, Polkadot, and other non-EVM chains. + +2. **Simplified development**: A unified API makes it easier to add support for new chains without changing the core bridge logic. + +3. **Optimized performance**: Each signature scheme can be implemented in the most efficient way while maintaining a consistent interface. + +4. **Future-proof architecture**: The layered design makes it straightforward to add support for new signature schemes as they emerge. + +5. **Security isolation**: Deriving scheme-specific keys from a common seed ensures that different signature schemes don't compromise each other's security. + +By implementing this unified MPC library, Lux.Network will significantly expand its cross-chain capabilities while maintaining the security and reliability that users expect. diff --git a/docs/utxo-guide.md b/docs/utxo-guide.md new file mode 100644 index 00000000..3a750e8d --- /dev/null +++ b/docs/utxo-guide.md @@ -0,0 +1,2682 @@ +# UTXO Withdrawal System: Implementation Guide + +## Table of Contents +1. [System Architecture](#system-architecture) +2. [Bitcoin Implementation](#bitcoin-implementation) +3. [Avalanche X-Chain Implementation](#avalanche-x-chain-implementation) +4. [MPC Signing System](#mpc-signing-system) +5. [Common Utilities](#common-utilities) +6. [Configuration Examples](#configuration-examples) + +## System Architecture + +```typescript +interface UTXOWithdrawalSystem { + // Chain-specific components + chainHandlers: Map; + + // Core services + mpcService: MultiChainMPCSigner; + database: UTXODatabase; + + // API endpoints + initializeWithdrawal(request: WithdrawalRequest): Promise; // Returns requestId + getWithdrawalStatus(requestId: string): Promise; + triggerSweep(chainType: string): Promise; +} + +// Chain handler interface +interface ChainHandler { + chainType: string; + + // UTXO management + refreshUTXOs(): Promise; + selectUTXOs(amount: string, assetID: string): Promise; + + // Transaction building + buildWithdrawalTransaction( + utxos: UTXO[], + destinationAddress: string, + amount: string, + changeAddress: string + ): Promise; + + buildSweepTransaction( + utxos: UTXO[], + destinationAddress: string + ): Promise; + + // Transaction signing and broadcasting + signTransaction(unsignedTx: UnsignedTransaction): Promise; + broadcastTransaction(signedTx: SignedTransaction): Promise; // Returns txId + + // Sweep functionality + shouldSweep(): Promise; + getSweepableUTXOs(): Promise; +} + +// Processing workflow +class WithdrawalProcessor { + // Process withdrawals in parallel batches + async processQueue(): Promise { + const pendingWithdrawals = await this.database.getPendingWithdrawals(); + const batchedWithdrawals = this.batchWithdrawals(pendingWithdrawals); + + for (const batch of batchedWithdrawals) { + await Promise.all(batch.map(withdrawal => this.processWithdrawal(withdrawal))); + } + + // Trigger sweep after processing withdrawals + await this.triggerSweepsIfNeeded(); + } +} + +// Base UTXO structure +interface UTXO { + txId: string; + outputIndex: number; + amount: string; + address: string; + chainType: string; + assetID: string; + confirmations: number; + status: 'available' | 'reserved' | 'spent'; + reservedAt?: number; + spentAt?: number; + spentInTxId?: string; +} +``` + +## Bitcoin Implementation + +### UTXO Manager + +```typescript +class BitcoinUTXOManager { + constructor( + private bitcoinClient: BitcoinClient, + private database: UTXODatabase, + private config: BitcoinConfig + ) {} + + // Refresh UTXOs from the network + async refreshUTXOs(): Promise { + const addresses = this.config.teleporterAddresses; + + for (const address of addresses) { + const utxos = await this.bitcoinClient.getUTXOs(address); + + for (const utxo of utxos) { + const existingUTXO = await this.database.getUTXO('bitcoin', utxo.txid, utxo.vout); + + if (!existingUTXO) { + await this.database.addUTXO({ + txId: utxo.txid, + outputIndex: utxo.vout, + amount: utxo.value.toString(), + address: utxo.address, + chainType: 'bitcoin', + assetID: 'BTC', // Only BTC for Bitcoin + confirmations: utxo.confirmations, + status: 'available' + }); + } else { + // Update confirmations for existing UTXO + await this.database.updateUTXO( + 'bitcoin', + utxo.txid, + utxo.vout, + { confirmations: utxo.confirmations } + ); + } + } + } + } + + // Select UTXOs for withdrawal + async selectUTXOs(amount: string, _assetID: string): Promise { + const amountSatoshis = BigInt(amount); + const feeRate = this.config.feeRate; + + // Get available UTXOs + const availableUTXOs = await this.database.getAvailableUTXOs('bitcoin'); + + // Ensure UTXOs have enough confirmations + const confirmedUTXOs = availableUTXOs.filter( + utxo => utxo.confirmations >= this.config.minConfirmations + ); + + // Branch and bound algorithm for coin selection + return this.coinSelection(confirmedUTXOs, amountSatoshis, feeRate); + } + + // Branch and bound algorithm for coin selection + private coinSelection(utxos: UTXO[], targetAmount: bigint, feeRate: number): UTXO[] { + // Implementation of BnB algorithm + // This is more efficient than a simple greedy algorithm + + // Sort UTXOs by value (descending) + utxos.sort((a, b) => (BigInt(b.amount) - BigInt(a.amount))); + + // Try to find exact match + const exactMatch = this.findExactMatch(utxos, targetAmount); + if (exactMatch) return exactMatch; + + // Try branch and bound + const bnbResult = this.branchAndBound(utxos, targetAmount, feeRate); + if (bnbResult) return bnbResult; + + // Fallback to greedy algorithm + return this.greedySelection(utxos, targetAmount, feeRate); + } +} +``` + +### Transaction Building + +```typescript +class BitcoinTransactionBuilder { + constructor( + private bitcoinClient: BitcoinClient, + private config: BitcoinConfig + ) {} + + // Build transaction for withdrawal + async buildWithdrawalTransaction( + utxos: UTXO[], + destinationAddress: string, + amount: string, + changeAddress: string + ): Promise { + const bitcoinjs = require('bitcoinjs-lib'); + const network = this.config.network === 'mainnet' + ? bitcoinjs.networks.bitcoin + : bitcoinjs.networks.testnet; + + // Create PSBT + const psbt = new bitcoinjs.Psbt({ network }); + + // Add inputs + for (const utxo of utxos) { + const utxoDetails = await this.bitcoinClient.getTransactionOutput( + utxo.txId, + utxo.outputIndex + ); + + psbt.addInput({ + hash: utxo.txId, + index: utxo.outputIndex, + witnessUtxo: { + script: Buffer.from(utxoDetails.scriptPubKey.hex, 'hex'), + value: Number(utxo.amount) + } + }); + } + + // Add outputs + const amountSatoshis = BigInt(amount); + + // Add the destination output + psbt.addOutput({ + address: destinationAddress, + value: Number(amountSatoshis) + }); + + // Calculate total input amount + const totalInput = utxos.reduce( + (sum, utxo) => sum + BigInt(utxo.amount), + BigInt(0) + ); + + // Calculate fee + const fee = this.calculateFee(utxos.length, 2, this.config.feeRate); // 2 outputs (destination + change) + + // Calculate change + const changeAmount = totalInput - amountSatoshis - BigInt(fee); + + // Add change output if needed + if (changeAmount > BigInt(this.config.dustThreshold)) { + psbt.addOutput({ + address: changeAddress, + value: Number(changeAmount) + }); + } + + return { + chainType: 'bitcoin', + psbt, + inputs: utxos, + fee: fee.toString(), + changeAddress, + changeAmount: changeAmount.toString() + }; + } + + // Build transaction for sweeping + async buildSweepTransaction( + utxos: UTXO[], + destinationAddress: string + ): Promise { + const bitcoinjs = require('bitcoinjs-lib'); + const network = this.config.network === 'mainnet' + ? bitcoinjs.networks.bitcoin + : bitcoinjs.networks.testnet; + + // Create PSBT + const psbt = new bitcoinjs.Psbt({ network }); + + // Add inputs + for (const utxo of utxos) { + const utxoDetails = await this.bitcoinClient.getTransactionOutput( + utxo.txId, + utxo.outputIndex + ); + + psbt.addInput({ + hash: utxo.txId, + index: utxo.outputIndex, + witnessUtxo: { + script: Buffer.from(utxoDetails.scriptPubKey.hex, 'hex'), + value: Number(utxo.amount) + } + }); + } + + // Calculate total input amount + const totalInput = utxos.reduce( + (sum, utxo) => sum + BigInt(utxo.amount), + BigInt(0) + ); + + // Calculate fee (only 1 output for sweep) + const fee = this.calculateFee(utxos.length, 1, this.config.feeRate); + + // Calculate output amount + const outputAmount = totalInput - BigInt(fee); + + // Add output to destination address + psbt.addOutput({ + address: destinationAddress, + value: Number(outputAmount) + }); + + return { + chainType: 'bitcoin', + psbt, + inputs: utxos, + fee: fee.toString(), + changeAddress: null, + changeAmount: '0' + }; + } + + // Calculate fee based on vBytes + private calculateFee(inputCount: number, outputCount: number, feeRate: number): number { + // For Segwit transactions: + // Each input: ~68-70 vBytes + // Each output: ~31-33 vBytes + // Transaction overhead: ~10-12 vBytes + const vBytesPerInput = 70; + const vBytesPerOutput = 33; + const transactionOverhead = 12; + + const estimatedVSize = + transactionOverhead + + (inputCount * vBytesPerInput) + + (outputCount * vBytesPerOutput); + + return estimatedVSize * feeRate; + } +} +``` + +### Bitcoin Signing and Broadcasting + +```typescript +class BitcoinTransactionSigner { + constructor( + private mpcService: MultiChainMPCSigner, + private config: BitcoinConfig + ) {} + + // Sign transaction + async signTransaction(unsignedTx: UnsignedTransaction): Promise { + const bitcoinjs = require('bitcoinjs-lib'); + const psbt = unsignedTx.psbt; + + // For each input + for (let i = 0; i < psbt.inputCount; i++) { + // Get the hash to sign + const hashToSign = psbt.getHashToSign(i); + + // Sign the hash with MPC + const signature = await this.mpcService.signDigest( + Buffer.from(hashToSign), + 'bitcoin' + ); + + // Add sighash flag + const signatureWithHashType = Buffer.concat([ + signature, + Buffer.from([bitcoinjs.Transaction.SIGHASH_ALL]) + ]); + + // Apply the signature + psbt.updateInput(i, { + partialSig: [{ + pubkey: Buffer.from(this.config.publicKey, 'hex'), + signature: signatureWithHashType + }] + }); + } + + // Finalize the PSBT + psbt.finalizeAllInputs(); + + // Extract transaction + const tx = psbt.extractTransaction(); + + return { + chainType: 'bitcoin', + txHex: tx.toHex(), + txId: tx.getId(), + fee: unsignedTx.fee, + inputs: unsignedTx.inputs, + changeAddress: unsignedTx.changeAddress, + changeAmount: unsignedTx.changeAmount + }; + } +} + +class BitcoinTransactionBroadcaster { + constructor(private bitcoinClient: BitcoinClient) {} + + // Broadcast transaction + async broadcastTransaction(signedTx: SignedTransaction): Promise { + const txHex = signedTx.txHex; + const txId = await this.bitcoinClient.broadcastTransaction(txHex); + return txId; + } +} +``` + +## Avalanche X-Chain Implementation + +### UTXO Manager + +```typescript +class AvaxUTXOManager { + constructor( + private avalancheClient: AvalancheClient, + private database: UTXODatabase, + private config: AvaxConfig + ) {} + + // Refresh UTXOs from the network + async refreshUTXOs(): Promise { + const addresses = this.config.teleporterAddresses; + + // Setup Avalanche client + const avalanche = this.avalancheClient.getAvalanche(); + const xchain = avalanche.XChain(); + + try { + // Get UTXOs for all addresses + const utxoSet = await xchain.getUTXOs(addresses); + const utxos = utxoSet.utxos.getAllUTXOs(); + + for (const utxo of utxos) { + const txId = utxo.getTxID().toString('hex'); + const outputIndex = utxo.getOutputIdx(); + const assetID = utxo.getAssetID().toString('hex'); + const output = utxo.getOutput(); + const amount = output.getAmount().toString(); + const addresses = output.getAddresses().map(addr => + avalanche.XChain().addressFromBuffer(addr).toString() + ); + + // Get transaction status for confirmations + const txStatus = await xchain.getTxStatus(txId); + + // Store or update in database + const existingUTXO = await this.database.getUTXO('avalanche-x', txId, outputIndex); + + if (!existingUTXO) { + await this.database.addUTXO({ + txId, + outputIndex, + amount, + address: addresses[0], // Use first address + chainType: 'avalanche-x', + assetID, + confirmations: txStatus.status === 'Accepted' ? 1 : 0, + status: 'available' + }); + } else { + // Update confirmations + await this.database.updateUTXO( + 'avalanche-x', + txId, + outputIndex, + { + confirmations: txStatus.status === 'Accepted' ? 1 : 0 + } + ); + } + } + } catch (error) { + console.error('Error refreshing X-Chain UTXOs:', error); + throw error; + } + } + + // Select UTXOs for withdrawal + async selectUTXOs(amount: string, assetID: string): Promise { + const amountBN = new BN(amount); + + // Get available UTXOs for the specified asset + const availableUTXOs = await this.database.getAvailableUTXOs( + 'avalanche-x', + assetID + ); + + // Ensure UTXOs have enough confirmations + const confirmedUTXOs = availableUTXOs.filter( + utxo => utxo.confirmations >= this.config.minConfirmations + ); + + // Sort UTXOs by amount (ascending) + confirmedUTXOs.sort((a, b) => { + const aBN = new BN(a.amount); + const bBN = new BN(b.amount); + return aBN.cmp(bBN); + }); + + // Try to find an exact match first + const exactMatch = confirmedUTXOs.find(utxo => new BN(utxo.amount).eq(amountBN)); + if (exactMatch) { + return [exactMatch]; + } + + // Otherwise, use greedy selection + const selectedUTXOs: UTXO[] = []; + let selectedAmount = new BN(0); + + for (const utxo of confirmedUTXOs) { + selectedUTXOs.push(utxo); + selectedAmount = selectedAmount.add(new BN(utxo.amount)); + + if (selectedAmount.gte(amountBN)) { + break; + } + } + + // Check if we have enough + if (selectedAmount.lt(amountBN)) { + throw new Error(`Insufficient funds: needed ${amount}, have ${selectedAmount.toString()}`); + } + + return selectedUTXOs; + } +} +``` + +### Transaction Building + +```typescript +class AvaxTransactionBuilder { + constructor( + private avalancheClient: AvalancheClient, + private config: AvaxConfig + ) {} + + // Build transaction for withdrawal + async buildWithdrawalTransaction( + utxos: UTXO[], + destinationAddress: string, + amount: string, + changeAddress: string + ): Promise { + const avalanche = this.avalancheClient.getAvalanche(); + const xchain = avalanche.XChain(); + const bintools = avalanche.BinTools(); + + // Convert amounts to BN + const amountBN = new BN(amount); + + // Get fee + const fee = xchain.getTxFee(); + + // Create UTXOSet + const utxoSet = new avalanche.avm.UTXOSet(); + + // Add UTXOs to the set + for (const utxo of utxos) { + const txid = Buffer.from(utxo.txId, 'hex'); + const outputIdx = utxo.outputIndex; + const assetID = Buffer.from(utxo.assetID, 'hex'); + + // Create output + const output = new avalanche.avm.SECPTransferOutput( + new BN(utxo.amount), + [bintools.stringToAddress(utxo.address)] + ); + + // Create UTXO + const avaxUtxo = new avalanche.avm.UTXO( + avalanche.avm.UTXOClass, + txid, + outputIdx, + assetID, + output + ); + + // Add to set + utxoSet.add(avaxUtxo); + } + + // Calculate total input amount + const totalInput = utxos.reduce( + (sum, utxo) => sum.add(new BN(utxo.amount)), + new BN(0) + ); + + // Calculate change amount + const changeAmount = totalInput.sub(amountBN).sub(fee); + + // Create transaction + const unsignedTx = await xchain.buildBaseTx( + utxoSet, + amountBN, + Buffer.from(utxos[0].assetID, 'hex'), // Use asset ID from first UTXO + [destinationAddress], + [changeAddress], + [changeAddress] + ); + + return { + chainType: 'avalanche-x', + unsignedTx, + inputs: utxos, + fee: fee.toString(), + changeAddress, + changeAmount: changeAmount.toString() + }; + } + + // Build transaction for sweeping + async buildSweepTransaction( + utxos: UTXO[], + destinationAddress: string + ): Promise { + const avalanche = this.avalancheClient.getAvalanche(); + const xchain = avalanche.XChain(); + const bintools = avalanche.BinTools(); + + // Group UTXOs by asset ID + const utxosByAsset = new Map(); + + for (const utxo of utxos) { + if (!utxosByAsset.has(utxo.assetID)) { + utxosByAsset.set(utxo.assetID, []); + } + utxosByAsset.get(utxo.assetID).push(utxo); + } + + // Process each asset group + const assetGroups: UnsignedTransaction[] = []; + + for (const [assetID, assetUTXOs] of utxosByAsset.entries()) { + // Create UTXOSet + const utxoSet = new avalanche.avm.UTXOSet(); + + // Add UTXOs to the set + for (const utxo of assetUTXOs) { + const txid = Buffer.from(utxo.txId, 'hex'); + const outputIdx = utxo.outputIndex; + const asset = Buffer.from(utxo.assetID, 'hex'); + + // Create output + const output = new avalanche.avm.SECPTransferOutput( + new BN(utxo.amount), + [bintools.stringToAddress(utxo.address)] + ); + + // Create UTXO + const avaxUtxo = new avalanche.avm.UTXO( + avalanche.avm.UTXOClass, + txid, + outputIdx, + asset, + output + ); + + // Add to set + utxoSet.add(avaxUtxo); + } + + // Calculate total input amount + const totalInput = assetUTXOs.reduce( + (sum, utxo) => sum.add(new BN(utxo.amount)), + new BN(0) + ); + + // Get fee + const fee = xchain.getTxFee(); + + // For non-AVAX assets, we need extra AVAX to pay fees + if (assetID !== this.avalancheClient.getAvaxAssetID()) { + // This requires having AVAX UTXOs available + // Implementation depends on your fee handling strategy + // For simplicity, we'll assume fee is handled separately + } + + // Build transaction (subtract fee only for AVAX assets) + const outputAmount = assetID === this.avalancheClient.getAvaxAssetID() + ? totalInput.sub(fee) + : totalInput; + + // Create transaction + const unsignedTx = await xchain.buildBaseTx( + utxoSet, + outputAmount, + Buffer.from(assetID, 'hex'), + [destinationAddress], + [destinationAddress], + [destinationAddress] + ); + + assetGroups.push({ + chainType: 'avalanche-x', + unsignedTx, + inputs: assetUTXOs, + fee: fee.toString(), + changeAddress: null, + changeAmount: '0' + }); + } + + // For simplicity, we'll return the first asset group + // In practice, you'd process each group separately + return assetGroups[0]; + } +} +``` + +### Avalanche Signing and Broadcasting + +```typescript +class AvaxTransactionSigner { + constructor( + private mpcService: MultiChainMPCSigner, + private config: AvaxConfig + ) {} + + // Sign transaction + async signTransaction(unsignedTx: UnsignedTransaction): Promise { + const avalanche = this.avalancheClient.getAvalanche(); + const xchain = avalanche.XChain(); + + // Get the transaction buffer to sign + const unsignedBuffer = unsignedTx.unsignedTx.toBuffer(); + + // Create message to sign + const msgToSign = Buffer.from(unsignedBuffer); + + // Sign with MPC + const signature = await this.mpcService.signDigest( + msgToSign, + 'avalanche-x' + ); + + // Create credentials + const cred = new avalanche.avm.Credential(); + cred.addSignature(signature); + + // Sign transaction + const signedTx = new avalanche.avm.Tx( + unsignedTx.unsignedTx, + [cred] + ); + + return { + chainType: 'avalanche-x', + signedTx, + txId: signedTx.getTxID().toString('hex'), + fee: unsignedTx.fee, + inputs: unsignedTx.inputs, + changeAddress: unsignedTx.changeAddress, + changeAmount: unsignedTx.changeAmount + }; + } +} + +class AvaxTransactionBroadcaster { + constructor(private avalancheClient: AvalancheClient) {} + + // Broadcast transaction + async broadcastTransaction(signedTx: SignedTransaction): Promise { + const avalanche = this.avalancheClient.getAvalanche(); + const xchain = avalanche.XChain(); + + // Get transaction buffer + const txBuffer = signedTx.signedTx.toBuffer(); + + // Issue transaction + const txId = await xchain.issueTx(txBuffer); + + return txId; + } +} +``` + +## MPC Signing System + +```typescript +enum ChainType { + BITCOIN = 'bitcoin', + AVALANCHE_X = 'avalanche-x' +} + +// Signature format for different chains +enum SignatureFormat { + DER = 'der', + COMPACT = 'compact', + CB58 = 'cb58' +} + +// MPC signing request +interface SigningRequest { + digest: Buffer; + chainType: ChainType; + format: SignatureFormat; +} + +// MPC signature share +interface SignatureShare { + index: number; + value: Buffer; +} + +// MPC node interface +interface MPCNode { + nodeId: string; + generateSignatureShare(request: SigningRequest): Promise; +} + +class MultiChainMPCSigner { + constructor( + private mpcNodes: MPCNode[], + private threshold: number, + private config: MPCConfig + ) {} + + // Sign a digest for a specific chain type + async signDigest(digest: Buffer, chainType: ChainType): Promise { + // Format signing request based on chain type + const format = this.getSignatureFormat(chainType); + const signingRequest: SigningRequest = { + digest, + chainType, + format + }; + + // Collect signature shares from MPC nodes + const signaturePromises = this.mpcNodes.map(node => + node.generateSignatureShare(signingRequest) + ); + + // Wait for all nodes or until threshold is reached + const signatureShares = await this.collectShares(signaturePromises); + + // Check if we have enough shares + if (signatureShares.length < this.threshold) { + throw new Error(`Not enough signature shares: got ${signatureShares.length}, need ${this.threshold}`); + } + + // Combine shares to get the final signature + return this.combineShares(signatureShares, chainType); + } + + // Get the appropriate signature format for the chain + private getSignatureFormat(chainType: ChainType): SignatureFormat { + switch (chainType) { + case ChainType.BITCOIN: + return SignatureFormat.DER; + case ChainType.AVALANCHE_X: + return SignatureFormat.CB58; + default: + throw new Error(`Unsupported chain type: ${chainType}`); + } + } + + // Collect signature shares with timeout + private async collectShares(promises: Promise[]): Promise { + const timeout = this.config.nodeTimeout; + + // Create promises with timeout + const promisesWithTimeout = promises.map(async (promise) => { + try { + return await Promise.race([ + promise, + new Promise((_, reject) => + setTimeout(() => reject(new Error('Timeout')), timeout) + ) + ]); + } catch (error) { + console.error('Error collecting signature share:', error); + return null; + } + }); + + // Wait for all promises + const results = await Promise.all(promisesWithTimeout); + + // Filter out null results + return results.filter(share => share !== null); + } + + // Combine signature shares based on chain type + private combineShares(shares: SignatureShare[], chainType: ChainType): Buffer { + switch (chainType) { + case ChainType.BITCOIN: + return this.combineBitcoinShares(shares); + case ChainType.AVALANCHE_X: + return this.combineAvaxShares(shares); + default: + throw new Error(`Unsupported chain type for combining shares: ${chainType}`); + } + } + + // Implement threshold signature combining for Bitcoin + private combineBitcoinShares(shares: SignatureShare[]): Buffer { + // Implementation depends on your MPC algorithm + // This is a simplified placeholder + + // Sort shares by index + shares.sort((a, b) => a.index - b.index); + + // Combine shares using Lagrange interpolation + // This is simplified - actual implementation depends on your MPC protocol + + // Return DER encoded signature + return Buffer.from('combined_bitcoin_signature', 'hex'); + } + + // Implement threshold signature combining for Avalanche X-Chain + private combineAvaxShares(shares: SignatureShare[]): Buffer { + // Similar to Bitcoin but with Avalanche-specific formatting + + // Return CB58 encoded signature + return Buffer.from('combined_avax_signature', 'hex'); + } + + // Generate a new MPC key (for sweeping or key rotation) + async generateKey(): Promise<{publicKey: Buffer, keyId: string}> { + // Implementation depends on your MPC protocol + // This is a simplified placeholder + + // Return the public key and an identifier for the key + return { + publicKey: Buffer.from('mpc_generated_public_key', 'hex'), + keyId: 'key_' + Date.now() + }; + } +} +``` + +## Common Utilities + +### Address Generator + +```typescript +class AddressGenerator { + constructor( + private mpcService: MultiChainMPCSigner, + private database: Database, + private config: AddressConfig + ) {} + + // Generate address for a specific chain + async generateAddress(chainType: ChainType, purpose: string): Promise { + // Generate random index for added security + const randomIndex = crypto.randomBytes(8).readBigUInt64BE(0).toString(); + + // Get path based on chain and purpose + const path = this.getDerivationPath(chainType, purpose, randomIndex); + + // Generate key for this path + const { publicKey, keyId } = await this.mpcService.generateKey(); + + // Format address based on chain type + let address: string; + + switch (chainType) { + case ChainType.BITCOIN: + address = this.createBitcoinAddress(publicKey); + break; + case ChainType.AVALANCHE_X: + address = this.createAvaxAddress(publicKey); + break; + default: + throw new Error(`Unsupported chain type: ${chainType}`); + } + + // Store address in database + await this.database.storeAddress({ + address, + chainType, + purpose, + path, + keyId, + publicKey: publicKey.toString('hex'), + createdAt: Date.now() + }); + + return address; + } + + // Get derivation path for specific chain and purpose + private getDerivationPath(chainType: ChainType, purpose: string, index: string): string { + const purposeCode = this.getPurposeCode(purpose); + + switch (chainType) { + case ChainType.BITCOIN: + // BIP44: m/purpose'/coin_type'/account'/change/address_index + return `m/44'/0'/0'/${purposeCode}/${index}`; + case ChainType.AVALANCHE_X: + // Avalanche uses m/44'/9000'/0'/0/address_index + return `m/44'/9000'/0'/${purposeCode}/${index}`; + default: + throw new Error(`Unsupported chain type: ${chainType}`); + } + } + + // Get purpose code for path + private getPurposeCode(purpose: string): number { + switch (purpose) { + case 'deposit': + return 0; + case 'change': + return 1; + case 'sweep': + return 2; + default: + return 0; + } + } + + // Create Bitcoin address from public key + private createBitcoinAddress(publicKey: Buffer): string { + const bitcoinjs = require('bitcoinjs-lib'); + const network = this.config.bitcoin.network === 'mainnet' + ? bitcoinjs.networks.bitcoin + : bitcoinjs.networks.testnet; + + // Create P2WPKH address (Segwit) + const { address } = bitcoinjs.payments.p2wpkh({ + pubkey: publicKey, + network + }); + + return address; + } + + // Create Avalanche X address from public key + private createAvaxAddress(publicKey: Buffer): string { + const avalanche = require('avalanche'); + const bintools = avalanche.BinTools.getInstance(); + + // Create X-Chain address + const chainId = 'X'; + const hrp = this.config.avalanche.network === 'mainnet' ? 'X-avax' : 'X-fuji'; + + // Create address from public key + const addr = avalanche.Address.fromPublicKey(publicKey); + + // Format with correct chainID and prefix + return addr.toString(hrp, chainId); + } + + // Generate a deposit address + async generateDepositAddress(chainType: ChainType): Promise { + return this.generateAddress(chainType, 'deposit'); + } + + // Generate a change address + async generateChangeAddress(chainType: ChainType): Promise { + return this.generateAddress(chainType, 'change'); + } + + // Generate a sweep address + async generateSweepAddress(chainType: ChainType): Promise { + return this.generateAddress(chainType, 'sweep'); + } +} +``` + +### Sweep Manager + +```typescript +class SweepManager { + constructor( + private chainHandlers: Map, + private addressGenerator: AddressGenerator, + private database: Database, + private config: SweepConfig + ) {} + + // Check if sweeping is needed for a chain + async shouldSweep(chainType: string): Promise { + // Get handler for this chain + const handler = this.chainHandlers.get(chainType); + if (!handler) { + throw new Error(`No handler for chain type: ${chainType}`); + } + + // Get last sweep time + const lastSweep = await this.database.getLastSweep(chainType); + const now = Date.now(); + + // Get available UTXOs + const availableUTXOs = await this.database.getAvailableUTXOs(chainType); + + // Calculate total value + let totalValue: BigInt | BN; + + if (chainType === ChainType.BITCOIN) { + totalValue = availableUTXOs.reduce( + (sum, utxo) => sum + BigInt(utxo.amount), + BigInt(0) + ); + + // Convert to object for consistency in logic below + totalValue = { valueOf: () => totalValue }; + } else { + totalValue = availableUTXOs.reduce( + (sum, utxo) => sum.add(new BN(utxo.amount)), + new BN(0) + ); + } + + // Check thresholds + for (const threshold of this.config.sweepThresholds) { + if ( + totalValue > threshold.amount && + (!lastSweep || (now - lastSweep.timestamp) > threshold.interval) + ) { + return true; + } + } + + // Default sweep interval + if (!lastSweep || (now - lastSweep.timestamp) > this.config.defaultSweepInterval) { + return availableUTXOs.length > 0; + } + + return false; + } + + // Execute sweep for a chain + async executeSweep(chainType: string): Promise { + // Get handler for this chain + const handler = this.chainHandlers.get(chainType); + if (!handler) { + throw new Error(`No handler for chain type: ${chainType}`); + } + + // Get sweepable UTXOs + const utxos = await handler.getSweepableUTXOs(); + + if (utxos.length === 0) { + return { + status: 'skipped', + message: 'No UTXOs to sweep' + }; + } + + // Generate new sweep address + const sweepAddress = await this.addressGenerator.generateSweepAddress(chainType); + + // Build sweep transaction + const unsignedTx = await handler.buildSweepTransaction(utxos, sweepAddress); + + // Sign transaction + const signedTx = await handler.signTransaction(unsignedTx); + + // Broadcast transaction + const txId = await handler.broadcastTransaction(signedTx); + + // Update database + await this.database.recordSweep({ + chainType, + txId, + utxos, + destinationAddress: sweepAddress, + timestamp: Date.now() + }); + + // Mark UTXOs as spent + for (const utxo of utxos) { + await this.database.updateUTXO( + chainType, + utxo.txId, + utxo.outputIndex, + { + status: 'spent', + spentAt: Date.now(), + spentInTxId: txId + } + ); + } + + return { + status: 'success', + txId, + utxoCount: utxos.length, + destinationAddress: sweepAddress + }; + } +} +``` + +## Configuration Examples + +### Bitcoin Configuration + +```typescript +const bitcoinConfig = { + // Network settings + network: 'mainnet', // or 'testnet' + + // Node connection + nodeUrl: 'https://bitcoin-rpc.example.com', + nodeUsername: 'rpc_user', + nodePassword: 'rpc_password', + + // Teleporter addresses + teleporterAddresses: [ + 'bc1qxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx', + 'bc1qyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyy' + ], + + // Transaction settings + feeRate: 5, // satoshis per vByte + dustThreshold: 546, // minimum output value in satoshis + minConfirmations: 3, // minimum confirmations for using UTXOs + + // MPC settings + publicKey: '03xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx', + + // Sweep settings + sweepThresholds: [ + { amount: 1000000, interval: 24 * 60 * 60 * 1000 }, // 0.01 BTC, 24 hours + { amount: 10000000, interval: 12 * 60 * 60 * 1000 }, // 0.1 BTC, 12 hours + { amount: 100000000, interval: 4 * 60 * 60 * 1000 } // 1 BTC, 4 hours + ], + defaultSweepInterval: 48 * 60 * 60 * 1000, // 48 hours + + // Processing intervals + refreshInterval: 5 * 60 * 1000, // 5 minutes + withdrawalProcessingInterval: 1 * 60 * 1000, // 1 minute + sweepCheckInterval: 30 * 60 * 1000 // 30 minutes +}; +``` + +### Avalanche X-Chain Configuration + +```typescript +const avalancheConfig = { + // Network settings + network: 'mainnet', // or 'fuji' + + // Node connection + nodeUrl: 'https://api.avax.network', + nodePort: 443, + protocol: 'https', + networkID: 1, // 1 for mainnet, 5 for fuji testnet + + // Teleporter addresses + teleporterAddresses: [ + 'X-avax1xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx', + 'X-avax1yyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyy' + ], + + // Transaction settings + minConfirmations: 1, // X-Chain has fast finality + + // MPC settings + publicKey: '03xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx', + + // Sweep settings + sweepThresholds: [ + { amount: new BN('10000000000'), interval: 24 * 60 * 60 * 1000 }, // 10 AVAX, 24 hours + { amount: new BN('100000000000'), interval: 12 * 60 * 60 * 1000 }, // 100 AVAX, 12 hours + { amount: new BN('1000000000000'), interval: 4 * 60 * 60 * 1000 } // 1000 AVAX, 4 hours + ], + defaultSweepInterval: 48 * 60 * 60 * 1000, // 48 hours + + // Processing intervals + refreshInterval: 2 * 60 * 1000, // 2 minutes + withdrawalProcessingInterval: 30 * 1000, // 30 seconds + sweepCheckInterval: 15 * 60 * 1000 // 15 minutes +}; +``` + +### MPC Configuration + +```typescript +const mpcConfig = { + // Node settings + nodes: [ + { id: 'node1', url: 'https://mpc-node1.example.com' }, + { id: 'node2', url: 'https://mpc-node2.example.com' }, + { id: 'node3', url: 'https://mpc-node3.example.com' }, + { id: 'node4', url: 'https://mpc-node4.example.com' }, + { id: 'node5', url: 'https://mpc-node5.example.com' } + ], + threshold: 3, // Minimum nodes needed to sign + + // Security settings + nodeTimeout: 30000, // 30 seconds timeout for node responses + + // Key rotation + keyRotationInterval: 30 * 24 * 60 * 60 * 1000, // 30 days + + // Retry settings + maxRetries: 3, + retryDelay: 5000 // 5 seconds +}; +``` + +### Database Configuration + +```typescript +const databaseConfig = { + // PostgreSQL connection + host: 'db.example.com', + port: 5432, + database: 'utxo_manager', + user: 'db_user', + password: 'db_password', + + // Connection pool + poolSize: 10, + + // Indexes + createIndexes: true, + + // Logging + logQueries: false, + + // Auto-cleanup + cleanupInterval: 7 * 24 * 60 * 60 * 1000, // 7 days + retentionPeriod: 90 * 24 * 60 * 60 * 1000 // 90 days +}; +``` + +## API Endpoints + +```typescript +// API Endpoints for the Withdrawal System +class UTXOWithdrawalAPI { + constructor( + private withdrawalProcessor: WithdrawalProcessor, + private sweepManager: SweepManager, + private addressGenerator: AddressGenerator, + private database: Database + ) {} + + // Initialize API routes + initializeRoutes(app: Express): void { + // Withdrawal endpoints + app.post('/api/v1/withdrawal', this.createWithdrawal.bind(this)); + app.get('/api/v1/withdrawal/:id', this.getWithdrawalStatus.bind(this)); + + // UTXO management + app.get('/api/v1/utxos/:chainType', this.getUtxoStatus.bind(this)); + + // Address generation + app.post('/api/v1/address/:chainType', this.generateAddress.bind(this)); + + // Sweep management + app.post('/api/v1/sweep/:chainType', this.triggerSweep.bind(this)); + app.get('/api/v1/sweep/:chainType/history', this.getSweepHistory.bind(this)); + } + + // Create a new withdrawal request + async createWithdrawal(req: Request, res: Response): Promise { + try { + const { + chainType, + destinationAddress, + amount, + assetID = null, + feeRate = null, + userId + } = req.body; + + // Validate request + this.validateWithdrawalRequest(chainType, destinationAddress, amount); + + // Create request ID + const requestId = crypto.randomBytes(16).toString('hex'); + + // Store withdrawal request + await this.database.createWithdrawalRequest({ + id: requestId, + chainType, + destinationAddress, + amount, + assetID, + feeRate, + userId, + status: 'pending', + createdAt: Date.now() + }); + + // Return request ID + res.status(200).json({ + status: 'success', + data: { + requestId, + estimatedCompletionTime: this.getEstimatedCompletionTime(chainType) + } + }); + } catch (error) { + console.error('Error creating withdrawal:', error); + res.status(400).json({ + status: 'error', + message: error.message + }); + } + } + + // Get withdrawal status + async getWithdrawalStatus(req: Request, res: Response): Promise { + try { + const requestId = req.params.id; + + // Get withdrawal from database + const withdrawal = await this.database.getWithdrawalRequest(requestId); + + if (!withdrawal) { + res.status(404).json({ + status: 'error', + message: 'Withdrawal request not found' + }); + return; + } + + // Get transaction details if available + let txDetails = null; + if (withdrawal.txId) { + txDetails = await this.getTransactionDetails( + withdrawal.chainType, + withdrawal.txId + ); + } + + res.status(200).json({ + status: 'success', + data: { + requestId: withdrawal.id, + chainType: withdrawal.chainType, + destinationAddress: withdrawal.destinationAddress, + amount: withdrawal.amount, + status: withdrawal.status, + txId: withdrawal.txId, + txDetails, + createdAt: withdrawal.createdAt, + processedAt: withdrawal.processedAt + } + }); + } catch (error) { + console.error('Error getting withdrawal status:', error); + res.status(500).json({ + status: 'error', + message: 'Error fetching withdrawal status' + }); + } + } + + // Trigger a sweep operation + async triggerSweep(req: Request, res: Response): Promise { + try { + const chainType = req.params.chainType; + + // Check if sweeping is needed + const shouldSweep = await this.sweepManager.shouldSweep(chainType); + + if (!shouldSweep) { + res.status(200).json({ + status: 'success', + data: { + message: 'Sweep not needed at this time' + } + }); + return; + } + + // Execute sweep + const result = await this.sweepManager.executeSweep(chainType); + + res.status(200).json({ + status: 'success', + data: result + }); + } catch (error) { + console.error('Error triggering sweep:', error); + res.status(500).json({ + status: 'error', + message: 'Error triggering sweep' + }); + } + } + + // Get UTXO status for a chain + async getUtxoStatus(req: Request, res: Response): Promise { + try { + const chainType = req.params.chainType; + + // Get UTXOs from database + const utxos = await this.database.getAllUTXOs(chainType); + + // Group by status + const available = utxos.filter(utxo => utxo.status === 'available'); + const reserved = utxos.filter(utxo => utxo.status === 'reserved'); + const spent = utxos.filter(utxo => utxo.status === 'spent'); + + // Calculate totals + const calculateTotal = (utxos: UTXO[]): string => { + if (chainType === 'bitcoin') { + return utxos + .reduce((sum, utxo) => sum + BigInt(utxo.amount), BigInt(0)) + .toString(); + } else { + return utxos + .reduce((sum, utxo) => sum.add(new BN(utxo.amount)), new BN(0)) + .toString(); + } + }; + + res.status(200).json({ + status: 'success', + data: { + available: { + count: available.length, + total: calculateTotal(available) + }, + reserved: { + count: reserved.length, + total: calculateTotal(reserved) + }, + spent: { + count: spent.length, + total: calculateTotal(spent) + }, + lastUpdated: new Date().toISOString() + } + }); + } catch (error) { + console.error('Error getting UTXO status:', error); + res.status(500).json({ + status: 'error', + message: 'Error fetching UTXO status' + }); + } + } + + // Generate a new address + async generateAddress(req: Request, res: Response): Promise { + try { + const chainType = req.params.chainType; + const purpose = req.query.purpose || 'deposit'; + + // Generate address + const address = await this.addressGenerator.generateAddress(chainType, purpose.toString()); + + res.status(200).json({ + status: 'success', + data: { + address, + chainType, + purpose + } + }); + } catch (error) { + console.error('Error generating address:', error); + res.status(500).json({ + status: 'error', + message: 'Error generating address' + }); + } + } + + // Get sweep history + async getSweepHistory(req: Request, res: Response): Promise { + try { + const chainType = req.params.chainType; + const limit = parseInt(req.query.limit?.toString() || '10'); + const offset = parseInt(req.query.offset?.toString() || '0'); + + // Get sweep history from database + const sweeps = await this.database.getSweepHistory(chainType, limit, offset); + + res.status(200).json({ + status: 'success', + data: { + sweeps, + pagination: { + limit, + offset, + total: await this.database.countSweeps(chainType) + } + } + }); + } catch (error) { + console.error('Error getting sweep history:', error); + res.status(500).json({ + status: 'error', + message: 'Error fetching sweep history' + }); + } + } + + // Helper method to validate withdrawal request + private validateWithdrawalRequest( + chainType: string, + destinationAddress: string, + amount: string + ): void { + // Validate chain type + if (!['bitcoin', 'avalanche-x'].includes(chainType)) { + throw new Error(`Unsupported chain type: ${chainType}`); + } + + // Validate address + if (chainType === 'bitcoin') { + if (!this.isValidBitcoinAddress(destinationAddress)) { + throw new Error('Invalid Bitcoin address'); + } + } else if (chainType === 'avalanche-x') { + if (!this.isValidAvaxAddress(destinationAddress)) { + throw new Error('Invalid Avalanche X-Chain address'); + } + } + + // Validate amount + try { + if (chainType === 'bitcoin') { + const amountValue = BigInt(amount); + if (amountValue <= BigInt(0)) { + throw new Error('Amount must be greater than 0'); + } + } else { + const amountValue = new BN(amount); + if (amountValue.lte(new BN(0))) { + throw new Error('Amount must be greater than 0'); + } + } + } catch (error) { + throw new Error('Invalid amount format'); + } + } + + // Helper method to get estimated completion time + private getEstimatedCompletionTime(chainType: string): string { + const now = new Date(); + + // Add estimated processing time based on chain type + if (chainType === 'bitcoin') { + // Bitcoin takes longer + now.setMinutes(now.getMinutes() + 30); + } else if (chainType === 'avalanche-x') { + // Avalanche is faster + now.setMinutes(now.getMinutes() + 5); + } else { + // Default + now.setMinutes(now.getMinutes() + 15); + } + + return now.toISOString(); + } + + // Helper method to get transaction details + private async getTransactionDetails( + chainType: string, + txId: string + ): Promise { + const handler = this.withdrawalProcessor.getChainHandler(chainType); + if (!handler) { + throw new Error(`No handler for chain type: ${chainType}`); + } + + return handler.getTransactionStatus(txId); + } + + // Helper method to validate Bitcoin address + private isValidBitcoinAddress(address: string): boolean { + try { + const bitcoinjs = require('bitcoinjs-lib'); + const network = this.withdrawalProcessor.getConfig('bitcoin').network === 'mainnet' + ? bitcoinjs.networks.bitcoin + : bitcoinjs.networks.testnet; + + bitcoinjs.address.toOutputScript(address, network); + return true; + } catch (error) { + return false; + } + } + + // Helper method to validate Avalanche X-Chain address + private isValidAvaxAddress(address: string): boolean { + try { + // Check if address starts with X- + if (!address.startsWith('X-')) { + return false; + } + + // Additional validation could be done here with Avalanche.js + return true; + } catch (error) { + return false; + } + } +} +``` + +## Error Handling + +```typescript +// Define error types +class UTXOError extends Error { + constructor( + message: string, + public code: string, + public details?: any + ) { + super(message); + this.name = 'UTXOError'; + // Ensure Error.captureStackTrace exists (it's a V8 specific function) + if (Error.captureStackTrace) { + Error.captureStackTrace(this, UTXOError); + } + } +} + +// Specific error types +class InsufficientFundsError extends UTXOError { + constructor(amount: string, available: string) { + super( + `Insufficient funds: requested ${amount}, available ${available}`, + 'INSUFFICIENT_FUNDS', + { requested: amount, available } + ); + this.name = 'InsufficientFundsError'; + } +} + +class InvalidAddressError extends UTXOError { + constructor(address: string, chainType: string) { + super( + `Invalid address for ${chainType}: ${address}`, + 'INVALID_ADDRESS', + { address, chainType } + ); + this.name = 'InvalidAddressError'; + } +} + +class UTXONotFoundError extends UTXOError { + constructor(txId: string, outputIndex: number) { + super( + `UTXO not found: ${txId}:${outputIndex}`, + 'UTXO_NOT_FOUND', + { txId, outputIndex } + ); + this.name = 'UTXONotFoundError'; + } +} + +class TransactionBroadcastError extends UTXOError { + constructor(message: string, txHex?: string) { + super( + `Failed to broadcast transaction: ${message}`, + 'BROADCAST_ERROR', + { txHex } + ); + this.name = 'TransactionBroadcastError'; + } +} + +class MPCSigningError extends UTXOError { + constructor(message: string, details?: any) { + super( + `MPC signing failed: ${message}`, + 'MPC_SIGNING_ERROR', + details + ); + this.name = 'MPCSigningError'; + } +} + +// Error handler middleware +function errorHandler( + error: Error, + req: Request, + res: Response, + next: NextFunction +): void { + console.error('Error:', error); + + // Set default status code + let statusCode = 500; + let errorResponse = { + status: 'error', + message: 'Internal server error' + }; + + // Handle specific error types + if (error instanceof UTXOError) { + // Map error codes to HTTP status codes + const statusCodeMap: Record = { + 'INSUFFICIENT_FUNDS': 400, + 'INVALID_ADDRESS': 400, + 'UTXO_NOT_FOUND': 404, + 'BROADCAST_ERROR': 500, + 'MPC_SIGNING_ERROR': 500 + }; + + statusCode = statusCodeMap[error.code] || 500; + + errorResponse = { + status: 'error', + message: error.message, + code: error.code, + details: error.details + }; + } else if (error instanceof SyntaxError) { + // Handle JSON parsing errors + statusCode = 400; + errorResponse = { + status: 'error', + message: 'Invalid request format' + }; + } + + // Send error response + res.status(statusCode).json(errorResponse); +} +``` + +## Monitoring and Logging + +```typescript +// Logger utility +class Logger { + private logLevel: string; + + constructor(level: string = 'info') { + this.logLevel = level; + } + + // Log levels in order of severity + private levels = { + error: 0, + warn: 1, + info: 2, + debug: 3 + }; + + // Check if level should be logged + private shouldLog(level: string): boolean { + return this.levels[level] <= this.levels[this.logLevel]; + } + + // Format log message + private formatMessage(level: string, message: string, meta?: any): string { + const timestamp = new Date().toISOString(); + const metaString = meta ? ` ${JSON.stringify(meta)}` : ''; + return `${timestamp} [${level.toUpperCase()}] ${message}${metaString}`; + } + + // Log methods + error(message: string, meta?: any): void { + if (this.shouldLog('error')) { + console.error(this.formatMessage('error', message, meta)); + } + } + + warn(message: string, meta?: any): void { + if (this.shouldLog('warn')) { + console.warn(this.formatMessage('warn', message, meta)); + } + } + + info(message: string, meta?: any): void { + if (this.shouldLog('info')) { + console.info(this.formatMessage('info', message, meta)); + } + } + + debug(message: string, meta?: any): void { + if (this.shouldLog('debug')) { + console.debug(this.formatMessage('debug', message, meta)); + } + } + + // Transaction logging + logTransaction(tx: any, status: string): void { + this.info(`Transaction ${status}`, { + txId: tx.txId, + chainType: tx.chainType, + inputs: tx.inputs?.length, + status + }); + } + + // Sweep logging + logSweep(sweep: any): void { + this.info(`Sweep executed`, { + txId: sweep.txId, + chainType: sweep.chainType, + utxoCount: sweep.utxos?.length, + destination: sweep.destinationAddress + }); + } + + // Error logging + logError(error: Error, context?: string): void { + this.error(`${context || 'Error occurred'}: ${error.message}`, { + name: error.name, + stack: error.stack + }); + } +} + +// Monitoring system +class MonitoringSystem { + private metrics: Map = new Map(); + private logger: Logger; + + constructor(logger: Logger) { + this.logger = logger; + + // Initialize metrics + this.initializeMetrics(); + } + + // Initialize default metrics + private initializeMetrics(): void { + // Withdrawal metrics + this.metrics.set('withdrawals_total', 0); + this.metrics.set('withdrawals_success', 0); + this.metrics.set('withdrawals_failed', 0); + this.metrics.set('withdrawals_pending', 0); + + // Sweep metrics + this.metrics.set('sweeps_total', 0); + this.metrics.set('sweeps_success', 0); + this.metrics.set('sweeps_failed', 0); + + // UTXO metrics + this.metrics.set('utxos_available', 0); + this.metrics.set('utxos_reserved', 0); + this.metrics.set('utxos_spent', 0); + + // MPC metrics + this.metrics.set('mpc_signing_requests', 0); + this.metrics.set('mpc_signing_success', 0); + this.metrics.set('mpc_signing_failed', 0); + + // Performance metrics + this.metrics.set('avg_withdrawal_time_ms', 0); + this.metrics.set('avg_sweep_time_ms', 0); + this.metrics.set('avg_signing_time_ms', 0); + } + + // Increment a metric + increment(metric: string, value: number = 1): void { + const currentValue = this.metrics.get(metric) || 0; + this.metrics.set(metric, currentValue + value); + } + + // Set a metric value + set(metric: string, value: number): void { + this.metrics.set(metric, value); + } + + // Get a metric value + get(metric: string): number { + return this.metrics.get(metric) || 0; + } + + // Get all metrics + getAllMetrics(): Record { + const result: Record = {}; + for (const [key, value] of this.metrics.entries()) { + result[key] = value; + } + return result; + } + + // Record withdrawal event + recordWithdrawal(status: string, durationMs?: number): void { + this.increment('withdrawals_total'); + this.increment(`withdrawals_${status}`); + + if (durationMs) { + const avgTime = this.get('avg_withdrawal_time_ms'); + const totalWithdrawals = this.get('withdrawals_total'); + + // Update rolling average + const newAvg = (avgTime * (totalWithdrawals - 1) + durationMs) / totalWithdrawals; + this.set('avg_withdrawal_time_ms', newAvg); + + this.logger.debug(`Withdrawal completed in ${durationMs}ms`, { status }); + } + } + + // Record sweep event + recordSweep(status: string, utxoCount: number, durationMs?: number): void { + this.increment('sweeps_total'); + this.increment(`sweeps_${status}`); + + if (durationMs) { + const avgTime = this.get('avg_sweep_time_ms'); + const totalSweeps = this.get('sweeps_total'); + + // Update rolling average + const newAvg = (avgTime * (totalSweeps - 1) + durationMs) / totalSweeps; + this.set('avg_sweep_time_ms', newAvg); + + this.logger.debug(`Sweep completed in ${durationMs}ms`, { + status, + utxoCount + }); + } + } + + // Record MPC signing event + recordMPCSigning(status: string, durationMs?: number): void { + this.increment('mpc_signing_requests'); + this.increment(`mpc_signing_${status}`); + + if (durationMs) { + const avgTime = this.get('avg_signing_time_ms'); + const totalSignings = this.get('mpc_signing_requests'); + + // Update rolling average + const newAvg = (avgTime * (totalSignings - 1) + durationMs) / totalSignings; + this.set('avg_signing_time_ms', newAvg); + + this.logger.debug(`MPC signing completed in ${durationMs}ms`, { status }); + } + } + + // Update UTXO metrics + updateUTXOMetrics( + availableCount: number, + reservedCount: number, + spentCount: number + ): void { + this.set('utxos_available', availableCount); + this.set('utxos_reserved', reservedCount); + this.set('utxos_spent', spentCount); + + this.logger.debug('UTXO metrics updated', { + available: availableCount, + reserved: reservedCount, + spent: spentCount + }); + } + + // Export metrics in Prometheus format + exportPrometheusMetrics(): string { + let output = ''; + + for (const [key, value] of this.metrics.entries()) { + output += `# HELP utxo_system_${key} UTXO withdrawal system metric\n`; + output += `# TYPE utxo_system_${key} gauge\n`; + output += `utxo_system_${key} ${value}\n`; + } + + return output; + } +} +``` + +## Security Best Practices + +```typescript +// Security utils +class SecurityUtils { + // Generate a random key + static generateRandomKey(length: number = 32): string { + return crypto.randomBytes(length).toString('hex'); + } + + // Hash data with a salt + static hashWithSalt(data: string, salt?: string): { hash: string, salt: string } { + const useSalt = salt || crypto.randomBytes(16).toString('hex'); + const hash = crypto + .createHmac('sha256', useSalt) + .update(data) + .digest('hex'); + + return { hash, salt: useSalt }; + } + + // Constant-time comparison to prevent timing attacks + static constantTimeCompare(a: string, b: string): boolean { + if (a.length !== b.length) { + return false; + } + + let result = 0; + for (let i = 0; i < a.length; i++) { + result |= a.charCodeAt(i) ^ b.charCodeAt(i); + } + + return result === 0; + } + + // Validate and sanitize input + static sanitizeInput(input: string): string { + // Remove potentially dangerous characters + return input.replace(/[<>'"&]/g, ''); + } + + // Validate hexadecimal string + static isValidHex(hex: string): boolean { + return /^[0-9a-fA-F]+$/.test(hex); + } + + // Create a secure JWT token + static createToken(payload: any, secretKey: string, expiresIn: string = '1h'): string { + const jwt = require('jsonwebtoken'); + return jwt.sign(payload, secretKey, { expiresIn }); + } + + // Verify a JWT token + static verifyToken(token: string, secretKey: string): any { + const jwt = require('jsonwebtoken'); + try { + return jwt.verify(token, secretKey); + } catch (error) { + throw new Error('Invalid token'); + } + } +} + +// Authentication middleware +function authMiddleware(req: Request, res: Response, next: NextFunction): void { + try { + // Get token from header + const authHeader = req.headers.authorization; + if (!authHeader || !authHeader.startsWith('Bearer ')) { + res.status(401).json({ + status: 'error', + message: 'Authentication required' + }); + return; + } + + const token = authHeader.split(' ')[1]; + + // Verify token + const secretKey = process.env.JWT_SECRET; + if (!secretKey) { + throw new Error('JWT_SECRET not configured'); + } + + const decodedToken = SecurityUtils.verifyToken(token, secretKey); + + // Add user to request + req.user = decodedToken; + + // Continue to next middleware + next(); + } catch (error) { + res.status(401).json({ + status: 'error', + message: 'Invalid authentication token' + }); + } +} + +// Rate limiting middleware +function rateLimitMiddleware( + windowMs: number = 15 * 60 * 1000, // 15 minutes + maxRequests: number = 100, // 100 requests per window + message: string = 'Too many requests, please try again later' +): (req: Request, res: Response, next: NextFunction) => void { + const requestCounts = new Map(); + + return (req: Request, res: Response, next: NextFunction): void => { + // Get client IP + const clientIp = req.ip || 'unknown'; + + // Get current time + const now = Date.now(); + + // Get or create request count for this IP + let requestData = requestCounts.get(clientIp); + if (!requestData) { + requestData = { + count: 0, + resetTime: now + windowMs + }; + requestCounts.set(clientIp, requestData); + } + + // Check if window has expired + if (now > requestData.resetTime) { + requestData.count = 0; + requestData.resetTime = now + windowMs; + } + + // Increment request count + requestData.count += 1; + + // Check if count exceeds limit + if (requestData.count > maxRequests) { + res.status(429).json({ + status: 'error', + message + }); + return; + } + + // Add rate limit headers + res.setHeader('X-RateLimit-Limit', maxRequests.toString()); + res.setHeader('X-RateLimit-Remaining', (maxRequests - requestData.count).toString()); + res.setHeader('X-RateLimit-Reset', Math.ceil(requestData.resetTime / 1000).toString()); + + // Continue to next middleware + next(); + }; +} + +// CORS middleware +function corsMiddleware(allowedOrigins: string[] = ['*']): (req: Request, res: Response, next: NextFunction) => void { + return (req: Request, res: Response, next: NextFunction): void => { + const origin = req.headers.origin; + + // Check if origin is allowed + if (origin && (allowedOrigins.includes('*') || allowedOrigins.includes(origin))) { + res.setHeader('Access-Control-Allow-Origin', origin); + } else { + res.setHeader('Access-Control-Allow-Origin', allowedOrigins[0]); + } + + // Set CORS headers + res.setHeader('Access-Control-Allow-Methods', 'GET, POST, PUT, DELETE, OPTIONS'); + res.setHeader('Access-Control-Allow-Headers', 'Content-Type, Authorization'); + res.setHeader('Access-Control-Allow-Credentials', 'true'); + + // Handle preflight request + if (req.method === 'OPTIONS') { + res.status(204).end(); + return; + } + + // Continue to next middleware + next(); + }; +} +``` + +## Database Schema and Implementation + +```typescript +// Database schema using TypeORM +import { + Entity, + Column, + PrimaryColumn, + PrimaryGeneratedColumn, + CreateDateColumn, + UpdateDateColumn, + Index +} from 'typeorm'; + +// UTXO entity +@Entity('utxos') +export class UTXOEntity { + @PrimaryColumn() + txId: string; + + @PrimaryColumn() + outputIndex: number; + + @Column() + amount: string; + + @Column() + address: string; + + @Column() + chainType: string; + + @Column() + assetID: string; + + @Column({ default: 0 }) + confirmations: number; + + @Column({ + type: 'enum', + enum: ['available', 'reserved', 'spent'], + default: 'available' + }) + status: string; + + @Column({ nullable: true }) + reservedAt: number; + + @Column({ nullable: true }) + spentAt: number; + + @Column({ nullable: true }) + spentInTxId: string; + + @CreateDateColumn() + createdAt: Date; + + @UpdateDateColumn() + updatedAt: Date; + + @Index() + @Column() + createdTimestamp: number; +} + +// Withdrawal request entity +@Entity('withdrawal_requests') +export class WithdrawalRequestEntity { + @PrimaryGeneratedColumn('uuid') + id: string; + + @Column() + chainType: string; + + @Column() + destinationAddress: string; + + @Column() + amount: string; + + @Column({ nullable: true }) + assetID: string; + + @Column({ nullable: true }) + feeRate: string; + + @Column({ nullable: true }) + userId: string; + + @Column({ + type: 'enum', + enum: ['pending', 'processing', 'completed', 'failed'], + default: 'pending' + }) + status: string; + + @Column({ nullable: true }) + txId: string; + + @Column({ nullable: true }) + errorMessage: string; + + @Column({ nullable: true }) + processedAt: number; + + @CreateDateColumn() + createdAt: Date; + + @UpdateDateColumn() + updatedAt: Date; + + @Index() + @Column() + createdTimestamp: number; +} + +// Sweep record entity +@Entity('sweep_records') +export class SweepRecordEntity { + @PrimaryGeneratedColumn('uuid') + id: string; + + @Column() + chainType: string; + + @Column() + txId: string; + + @Column() + destinationAddress: string; + + @Column({ type: 'simple-json' }) + utxos: string; + + @Column({ nullable: true }) + assetID: string; + + @Column() + amount: string; + + @Column({ + type: 'enum', + enum: ['pending', 'completed', 'failed'], + default: 'completed' + }) + status: string; + + @CreateDateColumn() + createdAt: Date; + + @UpdateDateColumn() + updatedAt: Date; + + @Index() + @Column() + timestamp: number; +} + +// Address entity +@Entity('addresses') +export class AddressEntity { + @PrimaryGeneratedColumn('uuid') + id: string; + + @Column({ unique: true }) + address: string; + + @Column() + chainType: string; + + @Column() + purpose: string; + + @Column() + path: string; + + @Column() + keyId: string; + + @Column() + publicKey: string; + + @CreateDateColumn() + createdAt: Date; + + @UpdateDateColumn() + updatedAt: Date; + + @Index() + @Column() + createdTimestamp: number; +} + +// Database implementation +class TypeORMDatabase implements Database { + private connection: Connection; + private logger: Logger; + + constructor(connection: Connection, logger: Logger) { + this.connection = connection; + this.logger = logger; + } + + // UTXO methods + async addUTXO(utxo: UTXO): Promise { + try { + const utxoRepo = this.connection.getRepository(UTXOEntity); + + await utxoRepo.save({ + ...utxo, + createdTimestamp: Date.now() + }); + + this.logger.debug('Added UTXO', { txId: utxo.txId, outputIndex: utxo.outputIndex }); + } catch (error) { + this.logger.error('Error adding UTXO', { error, utxo }); + throw error; + } + } + + async getUTXO(chainType: string, txId: string, outputIndex: number): Promise { + try { + const utxoRepo = this.connection.getRepository(UTXOEntity); + + const utxo = await utxoRepo.findOne({ + where: { chainType, txId, outputIndex } + }); + + return utxo || null; + } catch (error) { + this.logger.error('Error getting UTXO', { error, chainType, txId, outputIndex }); + throw error; + } + } + + async updateUTXO( + chainType: string, + txId: string, + outputIndex: number, + updates: Partial + ): Promise { + try { + const utxoRepo = this.connection.getRepository(UTXOEntity); + + await utxoRepo.update( + { chainType, txId, outputIndex }, + updates + ); + + this.logger.debug('Updated UTXO', { txId, outputIndex, updates }); + } catch (error) { + this.logger.error('Error updating UTXO', { error, chainType, txId, outputIndex, updates }); + throw error; + } + } + + async getAvailableUTXOs(chainType: string, assetID?: string): Promise { + try { + const utxoRepo = this.connection.getRepository(UTXOEntity); + + const whereClause: any = { + chainType, + status: 'available' + }; + + if (assetID) { + whereClause.assetID = assetID; + } + + return utxoRepo.find({ + where: whereClause, + order: { confirmations: 'DESC' } + }); + } catch (error) { + this.logger.error('Error getting available UTXOs', { error, chainType, assetID }); + throw error; + } + } + + // Withdrawal methods + async createWithdrawalRequest(request: WithdrawalRequest): Promise { + try { + const withdrawalRepo = this.connection.getRepository(WithdrawalRequestEntity); + + await withdrawalRepo.save({ + ...request, + createdTimestamp: Date.now() + }); + + this.logger.info('Created withdrawal request', { id: request.id, amount: request.amount }); + } catch (error) { + this.logger.error('Error creating withdrawal request', { error, request }); + throw error; + } + } + + async getWithdrawalRequest(id: string): Promise { + try { + const withdrawalRepo = this.connection.getRepository(WithdrawalRequestEntity); + + const request = await withdrawalRepo.findOne({ + where: { id } + }); + + return request || null; + } catch (error) { + this.logger.error('Error getting withdrawal request', { error, id }); + throw error; + } + } + + async updateWithdrawalRequest( + id: string, + updates: Partial + ): Promise { + try { + const withdrawalRepo = this.connection.getRepository(WithdrawalRequestEntity); + + await withdrawalRepo.update({ id }, updates); + + this.logger.debug('Updated withdrawal request', { id, updates }); + } catch (error) { + this.logger.error('Error updating withdrawal request', { error, id, updates }); + throw error; + } + } + + async getPendingWithdrawals(): Promise { + try { + const withdrawalRepo = this.connection.getRepository(WithdrawalRequestEntity); + + return withdrawalRepo.find({ + where: { status: 'pending' }, + order: { createdTimestamp: 'ASC' } + }); + } catch (error) { + this.logger.error('Error getting pending withdrawals', { error }); + throw error; + } + } + + // Sweep methods + async recordSweep(sweep: SweepRecord): Promise { + try { + const sweepRepo = this.connection.getRepository(SweepRecordEntity); + + await sweepRepo.save({ + ...sweep, + utxos: JSON.stringify(sweep.utxos) + }); + + this.logger.info('Recorded sweep', { txId: sweep.txId, utxoCount: sweep.utxos.length }); + } catch (error) { + this.logger.error('Error recording sweep', { error, sweep }); + throw error; + } + } + + async getLastSweep(chainType: string): Promise { + try { + const sweepRepo = this.connection.getRepository(SweepRecordEntity); + + const sweep = await sweepRepo.findOne({ + where: { chainType, status: 'completed' }, + order: { timestamp: 'DESC' } + }); + + if (sweep) { + sweep.utxos = JSON.parse(sweep.utxos as string); + } + + return sweep || null; + } catch (error) { + this.logger.error('Error getting last sweep', { error, chainType }); + throw error; + } + } + + async getSweepHistory( + chainType: string, + limit: number = 10, + offset: number = 0 + ): Promise { + try { + const sweepRepo = this.connection.getRepository(SweepRecordEntity); + + const sweeps = await sweepRepo.find({ + where: { chainType }, + order: { timestamp: 'DESC' }, + take: limit, + skip: offset + }); + + return sweeps.map(sweep => ({ + ...sweep, + utxos: JSON.parse(sweep.utxos as string) + })); + } catch (error) { + this.logger.error('Error getting sweep history', { error, chainType }); + throw error; + } + } + + async countSweeps(chainType: string): Promise { + try { + const sweepRepo = this.connection.getRepository(SweepRecordEntity); + + return sweepRepo.count({ + where: { chainType } + }); + } catch (error) { + this.logger.error('Error counting sweeps', { error, chainType }); + throw error; + } + } + + // Address methods + async storeAddress(addressData: AddressData): Promise { + try { + const addressRepo = this.connection.getRepository(AddressEntity); + + await addressRepo.save({ + ...addressData, + createdTimestamp: Date.now() + }); + + this.logger.debug('Stored address', { address: addressData.address, purpose: addressData.purpose }); + } catch (error) { + this.logger.error('Error storing address', { error, addressData }); + throw error; + } + } + + async getAddress(address: string): Promise { + try { + const addressRepo = this.connection.getRepository(AddressEntity); + + const addressData = await addressRepo.findOne({ + where: { address } + }); + + return addressData || null; + } catch (error) { + this.logger.error('Error getting address', { error, address }); + throw error; + } + } +} +``` + +That completes the implementation guide for the multi-chain UTXO withdrawal system. This comprehensive guide covers all the key components needed to build a secure, efficient system for managing withdrawals and automatic sweeping across both Bitcoin and Avalanche X-Chain networks. diff --git a/mpc-nodes/docker/common/node/package.json b/mpc-nodes/docker/common/node/package.json index d77b072c..823af2fd 100644 --- a/mpc-nodes/docker/common/node/package.json +++ b/mpc-nodes/docker/common/node/package.json @@ -49,6 +49,7 @@ "express-validator": "^7.2.0", "find-process": "^1.4.7", "prisma": "^5.19.1", - "web3": "^4.11.1" + "web3": "^4.11.1", + "xrpl": "^2.9.0" } } diff --git a/mpc-nodes/docker/common/node/src/config/settings.ts b/mpc-nodes/docker/common/node/src/config/settings.ts index 1fa0bc52..5561127f 100644 --- a/mpc-nodes/docker/common/node/src/config/settings.ts +++ b/mpc-nodes/docker/common/node/src/config/settings.ts @@ -1166,8 +1166,25 @@ export const MAIN_NETWORKS: NETWORK[] = [ is_native: false } ] + }, + { + display_name: "XRPL Ledger", + internal_name: "XRPL_MAINNET", + is_testnet: false, + chain_id: "XRP-MAINNET", + teleporter: "", // XRPL teleporter account holding burns + vault: "", // XRPL vault account for returning funds + node: "wss://s1.ripple.com", + currencies: [ + { + name: "XRP", + asset: "XRP", + contract_address: null, + decimals: 6, + is_native: true + } + ] } -] export const TEST_NETWORKS: NETWORK[] = [ { @@ -1799,8 +1816,45 @@ export const TEST_NETWORKS: NETWORK[] = [ is_native: false } ] + }, + // XRPL Platform Testnet + { + display_name: "XRPL Testnet", + internal_name: "XRPL_TESTNET", + is_testnet: true, + chain_id: "XRP-TESTNET", + teleporter: "", + vault: "", + node: "wss://s.altnet.rippletest.net:51233", + currencies: [ + { + name: "XRP", + asset: "XRP", + contract_address: null, + decimals: 6, + is_native: true + } + ] + }, + // XRPL Incentivized Devnet + { + display_name: "XRPL Devnet", + internal_name: "XRPL_DEVNET", + is_testnet: true, + chain_id: "XRP-DEVNET", + teleporter: "", + vault: "", + node: "wss://s.devnet.rippled.com:6006", + currencies: [ + { + name: "XRP", + asset: "XRP", + contract_address: null, + decimals: 6, + is_native: true + } + ] } -] export const SWAP_PAIRS: Record = { // lux tokens @@ -1866,13 +1920,16 @@ export const SWAP_PAIRS: Record = { ZDOGS: ["LDOGS"], ZMRB: ["LMRB"], ZREDO: ["LREDO"], - + // Lux & Zoo tokens TRUMP: ["TRUMP"], MELANIA: ["MELANIA"], Z: ["Z"], CYRUS: ["CYRUS"], + // XRP Ledger support + XRP: ["LXRP", "ZXRP"], + // Evm tokens ETH: ["LETH", "ZETH"], WETH: ["LETH", "ZETH"], diff --git a/mpc-nodes/docker/common/node/src/node.ts b/mpc-nodes/docker/common/node/src/node.ts index c87c971a..cb44c2a8 100644 --- a/mpc-nodes/docker/common/node/src/node.ts +++ b/mpc-nodes/docker/common/node/src/node.ts @@ -10,6 +10,7 @@ import cors from "cors" import express, { Request, Response } from "express" import Web3 from "web3" +import { Client as XrplClient } from "xrpl" import { Interface } from "ethers" import { settings } from "./config" import { RegisteredSubscription } from "web3/lib/commonjs/eth.exports" @@ -156,6 +157,54 @@ app.post("/api/v1/generate_mpc_sig", signDataValidator, async (req: Request, res }) return } + // — XRPL path — detect XRP networks + // XRPL path: handle XRP mainnet, testnet, and devnet + if ( + fromNetwork.internal_name === "XRPL_MAINNET" || + fromNetwork.internal_name === "XRPL_TESTNET" || + fromNetwork.internal_name === "XRPL_DEVNET" + ) { + const xrplClient = new XrplClient(fromNetwork.node) + await xrplClient.connect() + try { + const { result } = await xrplClient.request({ command: "tx", transaction: txId }) + if (!result || result.TransactionType !== "Payment" || result.Destination !== fromNetwork.teleporter) { + throw new Error("Invalid or non-payment XRPL tx") + } + const payload = { + teleporter: result.Destination, + token: "XRP", + from: result.Account, + eventName: "Payment", + value: result.Amount.toString() + } + const { signature, mpcSigner } = await hashAndSignTx({ + web3Form: null, + toNetworkId, + hashedTxId: txId, + toTokenAddress, + tokenAmount: payload.value, + decimals: 6, + receiverAddressHash, + nonce, + vault: false + }) + await savehashedTxId({ + chainType: "xrp", + txId, + amount: payload.value, + signature: signature + "###" + mpcSigner, + hashedTxId: txId + }) + res.json({ status: true, data: { ...payload, signature, mpcSigner, hashedTxId: txId } }) + return + } catch (err: any) { + res.json({ status: false, msg: err.message }) + return + } finally { + await xrplClient.disconnect() + } + } // get Web3Form using rpc url of specific network const web3Form = getWeb3FormForRPC(fromNetwork.node) if (!web3Form) { diff --git a/pnpm-lock.yaml b/pnpm-lock.yaml index 9f31af87..6c3283e9 100644 --- a/pnpm-lock.yaml +++ b/pnpm-lock.yaml @@ -64,6 +64,12 @@ importers: '@imtbl/imx-sdk': specifier: 2.1.1 version: 2.1.1(bufferutil@4.0.9)(utf-8-validate@5.0.10) + '@ledgerhq/hw-app-xrp': + specifier: ^6.31.0 + version: 6.31.0 + '@ledgerhq/hw-transport-webhid': + specifier: ^6.30.0 + version: 6.30.0 '@loopring-web/loopring-sdk': specifier: 3.3.5 version: 3.3.5(bufferutil@4.0.9)(utf-8-validate@5.0.10) @@ -250,6 +256,12 @@ importers: web3: specifier: ^4.11.1 version: 4.16.0(bufferutil@4.0.9)(typescript@5.7.2)(utf-8-validate@5.0.10)(zod@3.24.1) + xrpl: + specifier: ^4.2.5 + version: 4.2.5(bufferutil@4.0.9)(utf-8-validate@5.0.10) + xumm: + specifier: ^1.8.0 + version: 1.8.0 zksync: specifier: ^0.13.1 version: 0.13.1(@ethersproject/logger@5.7.0)(ethers@5.7.2(bufferutil@4.0.9)(utf-8-validate@5.0.10)) @@ -293,6 +305,15 @@ importers: '@storybook/testing-library': specifier: ^0.2.1 version: 0.2.2 + '@testing-library/jest-dom': + specifier: ^6.6.3 + version: 6.6.3 + '@testing-library/react': + specifier: ^16.3.0 + version: 16.3.0(@testing-library/dom@9.3.4)(@types/react-dom@18.3.1)(@types/react@18.3.12)(react-dom@18.3.1(react@18.3.1))(react@18.3.1) + '@testing-library/react-hooks': + specifier: ^8.0.1 + version: 8.0.1(@types/react@18.3.12)(react-dom@18.3.1(react@18.3.1))(react@18.3.1) '@types/bn.js': specifier: ^5.1.0 version: 5.1.6 @@ -362,6 +383,9 @@ importers: typescript: specifier: 'catalog:' version: 5.7.2 + vitest: + specifier: ^3.1.3 + version: 3.1.3(@types/debug@4.1.12)(@types/node@20.17.11)(terser@5.37.0) webpack-watch-files-plugin: specifier: ^1.2.1 version: 1.2.1(webpack@5.97.1(@swc/core@1.10.4(@swc/helpers@0.5.15))(esbuild@0.18.20)) @@ -730,6 +754,9 @@ importers: packages: + '@adobe/css-tools@4.4.2': + resolution: {integrity: sha512-baYZExFpsdkBNuvGKTKWCwKH57HRZLVtycZS05WTQNVOiXVSeAki3nU35zlRbToeMW8aHlJfyS+1C4BOv27q0A==} + '@adraffy/ens-normalize@1.10.0': resolution: {integrity: sha512-nA9XHtlAkYfJxY7bce8DcN7eKxWWCWkU+1GR9d+U6MbNpfwQp8TI7vqOsBsMcHoT4mBu2kypKoSKnghEzOOq5Q==} @@ -2636,6 +2663,7 @@ packages: '@jnwng/walletconnect-solana@0.2.0': resolution: {integrity: sha512-nyRq0xLEj9i2J4UXQ0Mr4KzsooTMbLu0ewHOqdQV7iZE0PfbtKa8poTSF4ZBAQD8hoMHEx+I7zGFCNMI9BTrTA==} + deprecated: Please use https://www.npmjs.com/package/@walletconnect/solana-adapter instead peerDependencies: '@solana/web3.js': ^1.63.0 @@ -2672,6 +2700,24 @@ packages: '@juggle/resize-observer@3.4.0': resolution: {integrity: sha512-dfLbk+PwWvFzSxwk3n5ySL0hfBog779o8h68wK/7/APo/7cgyWp5jcXockbxdk5kFRkbeXWm4Fbi9FrdN381sA==} + '@ledgerhq/devices@8.4.4': + resolution: {integrity: sha512-sz/ryhe/R687RHtevIE9RlKaV8kkKykUV4k29e7GAVwzHX1gqG+O75cu1NCJUHLbp3eABV5FdvZejqRUlLis9A==} + + '@ledgerhq/errors@6.19.1': + resolution: {integrity: sha512-75yK7Nnit/Gp7gdrJAz0ipp31CCgncRp+evWt6QawQEtQKYEDfGo10QywgrrBBixeRxwnMy1DP6g2oCWRf1bjw==} + + '@ledgerhq/hw-app-xrp@6.31.0': + resolution: {integrity: sha512-mjnJhgKs+YF/k2ZDiLdvWM+9wf5npqNPbHST42TIOsDUNQQOBrJeDD2FwEff//Jie/OGMaasFmdZo8PQoHukhg==} + + '@ledgerhq/hw-transport-webhid@6.30.0': + resolution: {integrity: sha512-HoTzjmYwO7+TVwK+GNbglRepUoDywBL6vjhKnhGqJSUPqAqJJyEXcnKnFDBMN7Phqm55O+YHDYfpcHGBNg5XlQ==} + + '@ledgerhq/hw-transport@6.31.4': + resolution: {integrity: sha512-6c1ir/cXWJm5dCWdq55NPgCJ3UuKuuxRvf//Xs36Bq9BwkV2YaRQhZITAkads83l07NAdR16hkTWqqpwFMaI6A==} + + '@ledgerhq/logs@6.12.0': + resolution: {integrity: sha512-ExDoj1QV5eC6TEbMdLUMMk9cfvNKhhv5gXol4SmULRVCx/3iyCPhJ74nsb3S0Vb+/f+XujBEj3vQn5+cwS0fNA==} + '@lit-labs/ssr-dom-shim@1.2.1': resolution: {integrity: sha512-wx4aBmgeGvFmOKucFKY+8VFJSYZxs9poN3SDNQFF6lT6NrQUnHiPB2PWz2sc4ieEcAaYYzN+1uWahEeTq2aRIQ==} @@ -3112,6 +3158,7 @@ packages: '@paulmillr/qr@0.2.1': resolution: {integrity: sha512-IHnV6A+zxU7XwmKFinmYjUcwlyK9+xkG3/s9KcQhI9BjQKycrJ1JRO+FbNYPwZiPKW3je/DR0k7w8/gLa5eaxQ==} + deprecated: 'The package is now available as "qr": npm install qr' '@pkgjs/parseargs@0.11.0': resolution: {integrity: sha512-+1VkjdD0QBLPodGrJUeqarH8VAIvQODIbwh9XpP5Syisf7YoQgsJKPNFoqqLQlu+VQ/tVSshMR6loPMn8U+dPg==} @@ -4813,6 +4860,7 @@ packages: '@solana/wallet-adapter-walletconnect@0.1.16': resolution: {integrity: sha512-jNaQwSho8hT7gF1ifePE8TJc1FULx8jCF16KX3fZPtzXDxKrj0R4VUpHMGcw4MlDknrnZNLOJAVvyiawAkPCRQ==} engines: {node: '>=16'} + deprecated: Please use https://www.npmjs.com/package/@walletconnect/solana-adapter instead peerDependencies: '@solana/web3.js': ^1.77.3 @@ -5449,6 +5497,26 @@ packages: resolution: {integrity: sha512-FlS4ZWlp97iiNWig0Muq8p+3rVDjRiYE+YKGbAqXOu9nwJFFOdL00kFpz42M+4huzYi86vAK1sOOfyOG45muIQ==} engines: {node: '>=14'} + '@testing-library/jest-dom@6.6.3': + resolution: {integrity: sha512-IteBhl4XqYNkM54f4ejhLRJiZNqcSCoXUOG2CPK7qbD322KjQozM4kHQOfkG2oln9b9HTYqs+Sae8vBATubxxA==} + engines: {node: '>=14', npm: '>=6', yarn: '>=1'} + + '@testing-library/react-hooks@8.0.1': + resolution: {integrity: sha512-Aqhl2IVmLt8IovEVarNDFuJDVWVvhnr9/GCU6UUnrYXwgDFF9h2L2o2P9KBni1AST5sT6riAyoukFLyjQUgD/g==} + engines: {node: '>=12'} + peerDependencies: + '@types/react': ^16.9.0 || ^17.0.0 + react: ^16.9.0 || ^17.0.0 + react-dom: ^16.9.0 || ^17.0.0 + react-test-renderer: ^16.9.0 || ^17.0.0 + peerDependenciesMeta: + '@types/react': + optional: true + react-dom: + optional: true + react-test-renderer: + optional: true + '@testing-library/react@14.3.1': resolution: {integrity: sha512-H99XjUhWQw0lTgyMN05W3xQG1Nh4lq574D8keFf1dDoNTJgp66VbJozRaczoF+wsiaPJNt/TcnfpLGufGxSrZQ==} engines: {node: '>=14'} @@ -5456,6 +5524,21 @@ packages: react: ^18.0.0 react-dom: ^18.0.0 + '@testing-library/react@16.3.0': + resolution: {integrity: sha512-kFSyxiEDwv1WLl2fgsq6pPBbw5aWKrsY2/noi1Id0TK0UParSF62oFQFGHXIyaG4pp2tEub/Zlel+fjjZILDsw==} + engines: {node: '>=18'} + peerDependencies: + '@testing-library/dom': ^10.0.0 + '@types/react': ^18.0.0 || ^19.0.0 + '@types/react-dom': ^18.0.0 || ^19.0.0 + react: ^18.0.0 || ^19.0.0 + react-dom: ^18.0.0 || ^19.0.0 + peerDependenciesMeta: + '@types/react': + optional: true + '@types/react-dom': + optional: true + '@testing-library/user-event@14.5.2': resolution: {integrity: sha512-YAh82Wh4TIrxYLmfGcixwD18oIjyC1pFQC2Y01F2lzV2HTMiYrI0nze0FD0ocB//CKS/7jIUgae+adPqxK5yCQ==} engines: {node: '>=12', npm: '>=6'} @@ -5821,6 +5904,9 @@ packages: '@types/webpack-env@1.18.5': resolution: {integrity: sha512-wz7kjjRRj8/Lty4B+Kr0LN6Ypc/3SymeCCGSbaXp2leH0ZVg/PriNiOwNj4bD4uphI7A8NXS4b6Gl373sfO5mA==} + '@types/websocket@1.0.10': + resolution: {integrity: sha512-svjGZvPB7EzuYS94cI7a+qhwgGU1y89wUgjT6E2wVUfmAGIvRfT7obBvRtnhXCSsoMdlG4gBFGE7MfkIXZLoww==} + '@types/ws@7.4.7': resolution: {integrity: sha512-JQbbmxZTZehdc2iszGKs5oC3NFnjeay7mtAWrdt7qNtAVK0g19muApzAy4bm9byz79xa2ZnO/BOBC2R8RC5Lww==} @@ -6019,6 +6105,35 @@ packages: '@vercel/static-config@3.0.0': resolution: {integrity: sha512-2qtvcBJ1bGY0dYGYh3iM7yGKkk971FujLEDXzuW5wcZsPr1GSEjO/w2iSr3qve6nDDtBImsGoDEnus5FI4+fIw==} + '@vitest/expect@3.1.3': + resolution: {integrity: sha512-7FTQQuuLKmN1Ig/h+h/GO+44Q1IlglPlR2es4ab7Yvfx+Uk5xsv+Ykk+MEt/M2Yn/xGmzaLKxGw2lgy2bwuYqg==} + + '@vitest/mocker@3.1.3': + resolution: {integrity: sha512-PJbLjonJK82uCWHjzgBJZuR7zmAOrSvKk1QBxrennDIgtH4uK0TB1PvYmc0XBCigxxtiAVPfWtAdy4lpz8SQGQ==} + peerDependencies: + msw: ^2.4.9 + vite: ^5.0.0 || ^6.0.0 + peerDependenciesMeta: + msw: + optional: true + vite: + optional: true + + '@vitest/pretty-format@3.1.3': + resolution: {integrity: sha512-i6FDiBeJUGLDKADw2Gb01UtUNb12yyXAqC/mmRWuYl+m/U9GS7s8us5ONmGkGpUUo7/iAYzI2ePVfOZTYvUifA==} + + '@vitest/runner@3.1.3': + resolution: {integrity: sha512-Tae+ogtlNfFei5DggOsSUvkIaSuVywujMj6HzR97AHK6XK8i3BuVyIifWAm/sE3a15lF5RH9yQIrbXYuo0IFyA==} + + '@vitest/snapshot@3.1.3': + resolution: {integrity: sha512-XVa5OPNTYUsyqG9skuUkFzAeFnEzDp8hQu7kZ0N25B1+6KjGm4hWLtURyBbsIAOekfWQ7Wuz/N/XXzgYO3deWQ==} + + '@vitest/spy@3.1.3': + resolution: {integrity: sha512-x6w+ctOEmEXdWaa6TO4ilb7l9DxPR5bwEb6hILKuxfU1NqWT2mpJD9NJN7t3OTfxmVlOMrvtoFJGdgyzZ605lQ==} + + '@vitest/utils@3.1.3': + resolution: {integrity: sha512-2Ltrpht4OmHO9+c/nmHtF09HWiyWdworqnHIwjfvDyWjuwKbdkcS9AnhsDn+8E2RM4x++foD1/tNuLPVvWG1Rg==} + '@wagmi/connectors@3.1.11': resolution: {integrity: sha512-wzxp9f9PtSUFjDUP/QDjc1t7HON4D8wrVKsw35ejdO8hToDpx1gU9lwH/47Zo/1zExGezQc392sjoHSszYd7OA==} peerDependencies: @@ -6119,12 +6234,15 @@ packages: '@walletconnect/ethereum-provider@2.11.0': resolution: {integrity: sha512-YrTeHVjuSuhlUw7SQ6xBJXDuJ6iAC+RwINm9nVhoKYJSHAy3EVSJZOofMKrnecL0iRMtD29nj57mxAInIBRuZA==} + deprecated: 'Reliability and performance improvements. See: https://github.com/WalletConnect/walletconnect-monorepo/releases' '@walletconnect/ethereum-provider@2.17.0': resolution: {integrity: sha512-b+KTAXOb6JjoxkwpgYQQKPUcTwENGmdEdZoIDLeRicUmZTn/IQKfkMoC2frClB4YxkyoVMtj1oMV2JAax+yu9A==} + deprecated: 'Reliability and performance improvements. See: https://github.com/WalletConnect/walletconnect-monorepo/releases' '@walletconnect/ethereum-provider@2.9.2': resolution: {integrity: sha512-eO1dkhZffV1g7vpG19XUJTw09M/bwGUwwhy1mJ3AOPbOSbMPvwiCuRz2Kbtm1g9B0Jv15Dl+TvJ9vTgYF8zoZg==} + deprecated: 'Reliability and performance improvements. See: https://github.com/WalletConnect/walletconnect-monorepo/releases' peerDependencies: '@walletconnect/modal': '>=2' peerDependenciesMeta: @@ -6251,17 +6369,18 @@ packages: '@walletconnect/sign-client@2.11.0': resolution: {integrity: sha512-H2ukscibBS+6WrzQWh+WyVBqO5z4F5et12JcwobdwgHnJSlqIoZxqnUYYWNCI5rUR5UKsKWaUyto4AE9N5dw4Q==} - deprecated: Reliability and performance greatly improved - please see https://github.com/WalletConnect/walletconnect-monorepo/releases + deprecated: 'Reliability and performance improvements. See: https://github.com/WalletConnect/walletconnect-monorepo/releases' '@walletconnect/sign-client@2.17.0': resolution: {integrity: sha512-sErYwvSSHQolNXni47L3Bm10ptJc1s1YoJvJd34s5E9h9+d3rj7PrhbiW9X82deN+Dm5oA8X9tC4xty1yIBrVg==} + deprecated: 'Reliability and performance improvements. See: https://github.com/WalletConnect/walletconnect-monorepo/releases' '@walletconnect/sign-client@2.17.3': resolution: {integrity: sha512-OzOWxRTfVGCHU3OOF6ibPkgPfDpivFJjuknfcOUt9PYWpTAv6YKOmT4cyfBPhc7llruyHpV44fYbykMcLIvEcg==} '@walletconnect/sign-client@2.9.2': resolution: {integrity: sha512-anRwnXKlR08lYllFMEarS01hp1gr6Q9XUgvacr749hoaC/AwGVlxYFdM8+MyYr3ozlA+2i599kjbK/mAebqdXg==} - deprecated: Reliability and performance greatly improved - please see https://github.com/WalletConnect/walletconnect-monorepo/releases + deprecated: 'Reliability and performance improvements. See: https://github.com/WalletConnect/walletconnect-monorepo/releases' '@walletconnect/socket-transport@1.8.0': resolution: {integrity: sha512-5DyIyWrzHXTcVp0Vd93zJ5XMW61iDM6bcWT4p8DTRfFsOtW46JquruMhxOLeCOieM4D73kcr3U7WtyR4JUsGuQ==} @@ -6376,6 +6495,13 @@ packages: '@webassemblyjs/wast-printer@1.14.1': resolution: {integrity: sha512-kPSSXE6De1XOR820C90RIo2ogvZG+c3KiHzqUoO/F34Y2shGzesfqv7o57xrxovZJH/MetF5UjroJ/R/3isoiw==} + '@xrplf/isomorphic@1.0.1': + resolution: {integrity: sha512-0bIpgx8PDjYdrLFeC3csF305QQ1L7sxaWnL5y71mCvhenZzJgku9QsA+9QCXBC1eNYtxWO/xR91zrXJy2T/ixg==} + engines: {node: '>=16.0.0'} + + '@xrplf/secret-numbers@1.0.0': + resolution: {integrity: sha512-qsCLGyqe1zaq9j7PZJopK+iGTGRbk6akkg6iZXJJgxKwck0C5x5Gnwlb1HKYGOwPKyrXWpV6a2YmcpNpUFctGg==} + '@xtuc/ieee754@1.2.0': resolution: {integrity: sha512-DX8nKgqcGwsc0eJSqYt5lwP4DH5FlHnmuWWBRy7X0NcaGR0ZtuyeESgMwTYVEtxmsNGY+qit4QYT/MIYTOTPeA==} @@ -6669,6 +6795,10 @@ packages: assert@2.1.0: resolution: {integrity: sha512-eLHpSK/Y4nhMJ07gDaAzoX/XAKS8PSaojml3M0DM4JpV1LAi5JOJ/p6H/XWrl8L+DzVEvVCW1z3vWAaB9oTsQw==} + assertion-error@2.0.1: + resolution: {integrity: sha512-Izi8RQcffqCeNVgFigKli1ssklIbpHnCYc6AknXGYoB6grJqyeby7jv12JUQgmTAnIDnbck1uxksT4dzN3PWBA==} + engines: {node: '>=12'} + ast-types@0.15.2: resolution: {integrity: sha512-c27loCv9QkZinsa5ProX751khO9DJl/AcB5c2KNtA6NRvHKS0PgLfcftz72KVq504vB0Gku5s2kUZzDBvQWvHg==} engines: {node: '>=4'} @@ -6902,6 +7032,9 @@ packages: bindings@1.5.0: resolution: {integrity: sha512-p2q/t/mhvuOj/UeLlV6566GD/guowlr0hHxClI0W9m7MWYkL1F0hLo+0Aexs9HSPCtR1SXQ0TD3MMKrXZajbiQ==} + bip32-path@0.4.2: + resolution: {integrity: sha512-ZBMCELjJfcNMkz5bDuJ1WrYvjlhEF5k6mQ8vUr4N7MbVRsXei7ZOg8VhhwMfNiW68NWmLkgkc6WvTickrLGprQ==} + bip66@1.1.5: resolution: {integrity: sha512-nemMHz95EmS38a26XbbdxIYj5csHd3RMP3H5bwQknX0WYHF01qhpufP42mLOwVICuH2JmhIhXiWs89MfUGL7Xw==} @@ -7153,6 +7286,10 @@ packages: ccount@2.0.1: resolution: {integrity: sha512-eyrF0jiFpY+3drT6383f1qhkbGsLSifNAjA61IUjZjmLCWjItY6LB9ft9YhoDgwfmclB2zhu51Lc7+95b8NRAg==} + chai@5.2.0: + resolution: {integrity: sha512-mCuXncKXk5iCLhfhwTc0izo0gtEmpz5CtG2y8GiOINBlMVS6v8TMRc5TaLWKS6692m9+dVVfzgeVxR5UxWHTYw==} + engines: {node: '>=12'} + chalk@3.0.0: resolution: {integrity: sha512-4D3B6Wf41KOYRFdszmDqMCGq5VV/uMAB273JILmO+3jAlh8X4qDtdtgCR3fxtbLEMzSx22QdhnDcJvu2u1fVwg==} engines: {node: '>=8'} @@ -7177,6 +7314,10 @@ packages: character-reference-invalid@2.0.1: resolution: {integrity: sha512-iBZ4F4wRbyORVsu0jPV7gXkOsGYjGHPmAyv+HiHG8gi5PtC9KI2j1+v8/tlibRvjoWX027ypmG/n0HtO5t7unw==} + check-error@2.1.1: + resolution: {integrity: sha512-OAlb+T7V4Op9OwdkjmguYRqncdlx5JiofwOAUkmTF+jNdHwzTaTs4sRAGpzLF3oOz5xAyDGrPgeIDFQmDOTiJw==} + engines: {node: '>= 16'} + checkpoint-store@1.1.0: resolution: {integrity: sha512-J/NdY2WvIx654cc6LWSq/IYFFCUf75fFTgwzFnmbqyORH4MwgiQCgswLLKBGzmsyTI5V7i5bp/So6sMbDWhedg==} @@ -7648,6 +7789,9 @@ packages: resolution: {integrity: sha512-HTUrgRJ7r4dsZKU6GjmpfRK1O76h97Z8MfS1G0FozR+oF2kG6Vfe8JE6zwrkbxigziPHinCJ+gCPjA9EaBDtRw==} engines: {node: '>= 6'} + css.escape@1.5.1: + resolution: {integrity: sha512-YUifsXXuknHlUsmlgyY0PKzgPOr7/FjCePfHNt0jxm83wHZi44VDMQ7/fGNkjY3/jV1MC+1CmZbaHzugyeRtpg==} + cssesc@3.0.0: resolution: {integrity: sha512-/Tb/JcjK111nNScGob5MNtsntNM1aCNUDipB/TkwZFhyDrrE47SOx/18wF2bbjgc3ZzCSKW1T5nt5EbFoAz/Vg==} engines: {node: '>=4'} @@ -7938,6 +8082,10 @@ packages: babel-plugin-macros: optional: true + deep-eql@5.0.2: + resolution: {integrity: sha512-h5k/5U50IJJFpzfL6nO9jaaumfjO/f2NjK/oYB2Djzm4p9L+3T9qWpZqZ2hAbLPuuYq9wrU08WQyBTL5GbPk5Q==} + engines: {node: '>=6'} + deep-equal@2.2.3: resolution: {integrity: sha512-ZIwpnevOurS8bpT4192sqAowWM76JDKSHYzMLty3BZGSswgq6pBaH3DhCSW5xVAZICZyKdOBPjwww5wfgT/6PA==} engines: {node: '>= 0.4'} @@ -8096,6 +8244,9 @@ packages: dom-accessibility-api@0.5.16: resolution: {integrity: sha512-X7BJ2yElsnOJ30pZF4uIIDfBEVgF4XEBxL9Bxhy6dnrm5hkzqmsWHGTiHqRiITNhMyFLyAiWndIJP7Z1NTteDg==} + dom-accessibility-api@0.6.3: + resolution: {integrity: sha512-7ZgogeTnjuHbo+ct10G9Ffp0mif17idi0IyWNVA/wcwcm7NPOD/WEHVP3n7n3MhXqxoIYm8d6MuZohYWIZ4T3w==} + dom-converter@0.2.0: resolution: {integrity: sha512-gd3ypIPfOMr9h5jIKq8E3sHOTCjeirnl0WK5ZdS1AW0Odt0b1PaWaHdJ4Qk4klv+YB9aJBS7mESXjFoDQPu6DA==} @@ -8309,6 +8460,9 @@ packages: es-module-lexer@1.6.0: resolution: {integrity: sha512-qqnD1yMU6tk/jnaMosogGySTZP8YtUgAffA9nMN+E/rjxcfRQ6IEk7IiozUjgxKoFHBGjTLnrHB/YC45r/59EQ==} + es-module-lexer@1.7.0: + resolution: {integrity: sha512-jEQoCwk8hyb2AZziIOLhDqpm5+2ww5uIE6lkO/6jcOCusfk6LhMHpXXfBLXTZ7Ydyt0j4VoUQv6uGNYbdW+kBA==} + es-object-atoms@1.0.0: resolution: {integrity: sha512-MZ4iQ6JwHOBQjahnjwaC1ZtIBH+2ohjamzAO3oaHcXYup7qxjF2fixyH+Q71voWHeOkI2q/TnJao/KfXYIZWbw==} engines: {node: '>= 0.4'} @@ -8824,6 +8978,10 @@ packages: resolution: {integrity: sha512-XYfuKMvj4O35f/pOXLObndIRvyQ+/+6AhODh+OKWj9S9498pHHn/IMszH+gt0fBCRWMNfk1ZSp5x3AifmnI2vg==} engines: {node: '>=6'} + expect-type@1.2.1: + resolution: {integrity: sha512-/kP8CAwxzLVEeFrMm4kMmy4CCDlpipyA7MYLVrdJIkV0fYF0UaigQHRsxHiuY/GEea+bh4KSv3TIlgr+2UL6bw==} + engines: {node: '>=12.0.0'} + exponential-backoff@3.1.1: resolution: {integrity: sha512-dX7e/LHVJ6W3DE1MHWi9S1EYzDESENfLrYohG2G++ovZrYOkm4Knwa0mc1cn84xJOR4KEU0WSchhLbd0UklbHw==} @@ -8935,9 +9093,20 @@ packages: picomatch: optional: true + fdir@6.4.4: + resolution: {integrity: sha512-1NZP+GK4GfuAv3PqKvxQRDMjdSRZjnkq7KfhlNrCNNlZ0ygQFpebfrnfnq/W7fpUnAv9aGWmY1zKx7FYL3gwhg==} + peerDependencies: + picomatch: ^3 || ^4 + peerDependenciesMeta: + picomatch: + optional: true + fecha@4.2.3: resolution: {integrity: sha512-OP2IUU6HeYKJi3i0z4A19kHMQoLVs4Hc+DPqqxI2h/DPZHTm/vjsfC6P0b4jCMy14XizLBqvndQ+UilD7707Jw==} + fetch-ponyfill@7.1.0: + resolution: {integrity: sha512-FhbbL55dj/qdVO3YNK7ZEkshvj3eQ7EuIGV2I6ic/2YiocvyWv+7jg2s4AyS0wdRU75s3tA8ZxI/xPigb0v5Aw==} + fetch-retry@5.0.6: resolution: {integrity: sha512-3yurQZ2hD9VISAhJJP9bpYFNQrHHBXE2JxxjY5aLEcDi46RmAzJE2OC9FAde0yis5ElW0jTTzs0zfg/Cca4XqQ==} @@ -10433,6 +10602,7 @@ packages: lodash.isequal@4.5.0: resolution: {integrity: sha512-pDo3lu8Jhfjqls6GkMgpahsF9kCyayhgykjyLMNFTKWrpVdAQtYyB4muAMWozBB4ig/dtWAmsMxLEI8wuz+DYQ==} + deprecated: This package is deprecated. Use require('node:util').isDeepStrictEqual instead. lodash.isinteger@4.0.4: resolution: {integrity: sha512-DBwtEWN2caHQ9/imiNeEA5ys1JoRtRfY3d7V9wkqtbycnAmTvRRmbHKDV4a0EYc678/dia0jrte4tjYwVBaZUA==} @@ -10482,6 +10652,9 @@ packages: resolution: {integrity: sha512-lyuxPGr/Wfhrlem2CL/UcnUc1zcqKAImBDzukY7Y5F/yQiNdko6+fRLevlw1HgMySw7f611UIY408EtxRSoK3Q==} hasBin: true + loupe@3.1.3: + resolution: {integrity: sha512-kkIp7XSkP78ZxJEsSxW3712C6teJVoeHHwgo9zJ380de7IYyJ2ISlxojcH2pC5OFLewESmnRi/+XCDIEEVyoug==} + lower-case@2.0.2: resolution: {integrity: sha512-7fm3l3NAF9WfN6W3JOmf5drwpVqX78JtoGJ3A6W0a6ZnldM41w2fV5D490psKFTpMds8TJse/eHLFFsNHHjHgg==} @@ -11757,6 +11930,13 @@ packages: pathe@1.1.2: resolution: {integrity: sha512-whLdWMYL2TwI08hn8/ZqAbrVemu0LNaNNJZX73O6qaIdCTfXutsLhMkjdENX0qhsQ9uIimo4/aQOmXkoon2nDQ==} + pathe@2.0.3: + resolution: {integrity: sha512-WUjGcAqP1gQacoQe+OBJsFA7Ld4DyXuUIjZ5cc75cLHvJ7dtNsTugphxIADwspS+AraAUePCKrSVtPLFj/F88w==} + + pathval@2.0.0: + resolution: {integrity: sha512-vE7JKRyES09KiunauX7nd2Q9/L7lhok4smP9RZTDeD4MVs72Dp2qNFVz39Nz5a0FVEW0BJR6C0DYrq6unoziZA==} + engines: {node: '>= 14.16'} + pbkdf2@3.1.2: resolution: {integrity: sha512-iuh7L6jA7JEGu2WxDwtQP1ddOpaJNC4KlDEFfdQajSGgGPNi4OyDc2R7QnbY2bR9QjBVGwgvTdNJZoE7RaxUMA==} engines: {node: '>=0.12'} @@ -12490,6 +12670,12 @@ packages: react: ^0.14.8 || ^15.0.1 || ^16.0.0 || ^17.0.1 || ^18.0.0 react-dom: ^0.14.8 || ^15.0.1 || ^16.0.0 || ^17.0.1 || ^18.0.0 + react-error-boundary@3.1.4: + resolution: {integrity: sha512-uM9uPzZJTF6wRQORmSrvOIgt4lJ9MC1sNgEOj2XGsDTRE4kmpWxg7ENK9EWNKJRMAOY9z0MuF4yIfl6gp4sotA==} + engines: {node: '>=10', npm: '>=6'} + peerDependencies: + react: '>=16.13.1' + react-error-boundary@4.1.2: resolution: {integrity: sha512-GQDxZ5Jd+Aq/qUxbCm1UtzmL/s++V7zKgE8yMktJiCQXCCFZnMZh9ng+6/Ne6PjNSXH0L9CjeOEREfRnq6Duag==} peerDependencies: @@ -12725,6 +12911,10 @@ packages: recma-stringify@1.0.0: resolution: {integrity: sha512-cjwII1MdIIVloKvC9ErQ+OgAtwHBmcZ0Bg4ciz78FtbT8In39aAYbaA7zvxQ61xVMSPE8WxhLwLbhif4Js2C+g==} + redent@3.0.0: + resolution: {integrity: sha512-6tDA8g98We0zd0GvVeMT9arEOnTw9qM03L9cJXaCjrip1OO764RDBLBfrB4cwzNGDj5OA5ioymC9GkizgWJDUg==} + engines: {node: '>=8'} + redux-thunk@2.4.2: resolution: {integrity: sha512-+P3TjtnP0k/FEjcBL5FZpoovtvrTNT/UXd4/sluaSyrURlSlhLSzEdfsTBW7WsKB6yPvgd7q/iZPICFjW4o57Q==} peerDependencies: @@ -12910,6 +13100,18 @@ packages: ripemd160@2.0.2: resolution: {integrity: sha512-ii4iagi25WusVoiC4B4lq7pbXfAp3D9v5CwfkY33vffw2+pkDjY1D8GaN7spsxvCSx8dkPqOZCEZyfxcmJG2IA==} + ripple-address-codec@5.0.0: + resolution: {integrity: sha512-de7osLRH/pt5HX2xw2TRJtbdLLWHu0RXirpQaEeCnWKY5DYHykh3ETSkofvm0aX0LJiV7kwkegJxQkmbO94gWw==} + engines: {node: '>= 16'} + + ripple-binary-codec@2.3.0: + resolution: {integrity: sha512-CPMzkknXlgO9Ow5Qa5iqQm0vOIlJyN8M1bc8etyhLw2Xfrer6bPzLA8/apuKlGQ+XdznYSKPBz5LAhwYjaDAcA==} + engines: {node: '>= 18'} + + ripple-keypairs@2.0.0: + resolution: {integrity: sha512-b5rfL2EZiffmklqZk1W+dvSy97v3V/C7936WxCCgDynaGPp7GE6R2XO7EU9O2LlM/z95rj870IylYnOQs+1Rag==} + engines: {node: '>= 16'} + rlp@2.2.7: resolution: {integrity: sha512-d5gdPmgQ0Z+AklL2NVXr/IoSjNZFfTVvQWzL/AM2AOcSzYP2xjlb0AC8YyCLc41MSNf6P6QVtjgPdmVtzb+4lQ==} hasBin: true @@ -13163,6 +13365,9 @@ packages: resolution: {integrity: sha512-ZX99e6tRweoUXqR+VBrslhda51Nh5MTQwou5tnUDgbtyM0dBgmhEDtWGP/xbKn6hqfPRHujUNwz5fy/wbbhnpw==} engines: {node: '>= 0.4'} + siginfo@2.0.0: + resolution: {integrity: sha512-ybx0WO1/8bSBLEWXZvEd7gMW3Sn3JFlW3TvX1nREbDLRNQNaeNN8WK0meBwPdAaOI7TtRRRJn/Es1zhrrCHu7g==} + signal-exit@3.0.7: resolution: {integrity: sha512-wnD2ZE+l+SPC/uoS0vXeE9L1+0wuaMqKlfz9AMUo38JsyLSBWSFcHR1Rri62LZc12vLr1gb3jl7iwQhgwpAbGQ==} @@ -13298,6 +13503,9 @@ packages: resolution: {integrity: sha512-XlkWvfIm6RmsWtNJx+uqtKLS8eqFbxUg0ZzLXqY0caEy9l7hruX8IpiDnjsLavoBgqCCR71TqWO8MaXYheJ3RQ==} engines: {node: '>=10'} + stackback@0.0.2: + resolution: {integrity: sha512-1XMJE5fQo1jGH6Y/7ebnwPOBEkIEnT4QF32d5R1+VXdXveM0IBMJt8zfaxX1P3QhVwrYe+576+jkANtSS2mBbw==} + stackframe@1.3.4: resolution: {integrity: sha512-oeVtt7eWQS+Na6F//S4kJ2K2VbRlS9D43mAlMyVpVWovy9o+jfgH8O9agzANzaiLjclA0oYzUXEM4PurhSUChw==} @@ -13313,6 +13521,9 @@ packages: resolution: {integrity: sha512-RwNA9Z/7PrK06rYLIzFMlaF+l73iwpzsqRIFgbMLbTcLD6cOao82TaWefPXQvB2fOC4AjuYSEndS7N/mTCbkdQ==} engines: {node: '>= 0.8'} + std-env@3.9.0: + resolution: {integrity: sha512-UGvjygr6F6tpH7o2qyqR6QYpwraIjKSdtzyBdyytFOHmPZY917kwdwLG0RbOjWOnKmnm3PeHjaoLLMie7kPLQw==} + stop-iteration-iterator@1.1.0: resolution: {integrity: sha512-eLoXW/DHyl62zxY4SCaIgnRhuMr6ri4juEYARS8E6sCEqzKpOiE521Ucofdx+KnDZl5xmvGYaaKCk5FEOxJCoQ==} engines: {node: '>= 0.4'} @@ -13435,6 +13646,10 @@ packages: resolution: {integrity: sha512-q8d4ue7JGEiVcypji1bALTos+0pWtyGlivAWyPuTkHzuTCJqrK9sWxYQZUq6Nq3cuyv3bm734IhHvHtGGURU6A==} engines: {node: '>=6.5.0', npm: '>=3'} + strip-indent@3.0.0: + resolution: {integrity: sha512-laJTa3Jb+VQpaC6DseHhF7dXVqHTfJPCRDaEbid/drOhgitgYku/letMUqOXFoWV0zIIUbjpdH2t+tYj4bQMRQ==} + engines: {node: '>=8'} + strip-indent@4.0.0: resolution: {integrity: sha512-mnVSV2l+Zv6BLpSD/8V87CW/y9EmmbYzGCIavsnsI6/nwn26DwffM/yztm30Z/I2DY9wdS3vXVCMnHDgZaVNoA==} engines: {node: '>=12'} @@ -13691,10 +13906,32 @@ packages: tiny-warning@1.0.3: resolution: {integrity: sha512-lBN9zLN/oAf68o3zNXYrdCt1kP8WsiGW8Oo2ka41b2IM5JL/S1CTyX1rW0mb/zSuJun0ZUrDxx4sqvYS2FWzPA==} + tinybench@2.9.0: + resolution: {integrity: sha512-0+DUvqWMValLmha6lr4kD8iAMK1HzV0/aKnCtWb9v9641TnP/MFb7Pc2bxoxQjTXAErryXVgUOfv2YqNllqGeg==} + + tinyexec@0.3.2: + resolution: {integrity: sha512-KQQR9yN7R5+OSwaK0XQoj22pwHoTlgYqmUscPYoknOoWCWfj/5/ABTMRi69FrKU5ffPVh5QcFikpWJI/P1ocHA==} + tinyglobby@0.2.10: resolution: {integrity: sha512-Zc+8eJlFMvgatPZTl6A9L/yht8QqdmUNtURHaKZLmKBE12hNPSrqNkUp2cs3M/UKmNVVAMFQYSjYIVHDjW5zew==} engines: {node: '>=12.0.0'} + tinyglobby@0.2.13: + resolution: {integrity: sha512-mEwzpUgrLySlveBwEVDMKk5B57bhLPYovRfPAXD5gA/98Opn0rCDj3GtLwFvCvH5RK9uPCExUROW5NjDwvqkxw==} + engines: {node: '>=12.0.0'} + + tinypool@1.0.2: + resolution: {integrity: sha512-al6n+QEANGFOMf/dmUMsuS5/r9B06uwlyNjZZql/zv8J7ybHCgoihBNORZCY2mzUuAnomQa2JdhyHKzZxPCrFA==} + engines: {node: ^18.0.0 || >=20.0.0} + + tinyrainbow@2.0.0: + resolution: {integrity: sha512-op4nsTR47R6p0vMUUoYl/a+ljLFVtlfaXkLQmqfLR1qHma1h/ysYk4hEXZ880bf2CYgTskvTa/e196Vd5dDQXw==} + engines: {node: '>=14.0.0'} + + tinyspy@3.0.2: + resolution: {integrity: sha512-n1cw8k1k0x4pgA2+9XrOkFydTerNcJ1zWCO5Nn9scWHTD+5tp8dghT2x1uduQePZTZgd3Tupf+x9BxJjeJi77Q==} + engines: {node: '>=14.0.0'} + tmp@0.2.3: resolution: {integrity: sha512-nZD7m9iCPC5g0pYmcaxogYKggSfLsdxl8of3Q/oIbqCqLLIO9IAF0GWjX1z9NZRHPiXv8Wex4yDCaZsgEw0Y8w==} engines: {node: '>=14.14'} @@ -14368,6 +14605,11 @@ packages: engines: {node: ^18.0.0 || >=20.0.0} hasBin: true + vite-node@3.1.3: + resolution: {integrity: sha512-uHV4plJ2IxCl4u1up1FQRrqclylKAogbtBfOTwcuJ28xFi+89PZ57BRh+naIRvH70HPwxy5QHYzg1OrEaC7AbA==} + engines: {node: ^18.0.0 || ^20.0.0 || >=22.0.0} + hasBin: true + vite-tsconfig-paths@5.1.2: resolution: {integrity: sha512-gEIbKfJzSEv0yR3XS2QEocKetONoWkbROj6hGx0FHM18qKUojhvcokQsxQx5nMkelZq2n37zbSGCJn+FSODSjA==} peerDependencies: @@ -14407,6 +14649,34 @@ packages: terser: optional: true + vitest@3.1.3: + resolution: {integrity: sha512-188iM4hAHQ0km23TN/adso1q5hhwKqUpv+Sd6p5sOuh6FhQnRNW3IsiIpvxqahtBabsJ2SLZgmGSpcYK4wQYJw==} + engines: {node: ^18.0.0 || ^20.0.0 || >=22.0.0} + hasBin: true + peerDependencies: + '@edge-runtime/vm': '*' + '@types/debug': ^4.1.12 + '@types/node': ^18.0.0 || ^20.0.0 || >=22.0.0 + '@vitest/browser': 3.1.3 + '@vitest/ui': 3.1.3 + happy-dom: '*' + jsdom: '*' + peerDependenciesMeta: + '@edge-runtime/vm': + optional: true + '@types/debug': + optional: true + '@types/node': + optional: true + '@vitest/browser': + optional: true + '@vitest/ui': + optional: true + happy-dom: + optional: true + jsdom: + optional: true + vlq@1.0.1: resolution: {integrity: sha512-gQpnTgkubC6hQgdIcRdYGDSDc+SaujOdyesZQMv6JlfQee/9Mp0Qhnys6WxDWvQnL5WZdT7o2Ul187aSt0Rq+w==} @@ -14838,6 +15108,11 @@ packages: engines: {node: ^14.17.0 || ^16.13.0 || >=18.0.0} hasBin: true + why-is-node-running@2.3.0: + resolution: {integrity: sha512-hUrmaWBdVDcxvYqnyh09zunKzROWjbZTiNy8dBEjkS7ehEDQibXJ7XvlmtbwuTclUiIyN+CyXQD4Vmko8fNm8w==} + engines: {node: '>=8'} + hasBin: true + winston-transport@4.9.0: resolution: {integrity: sha512-8drMJ4rkgaPo1Me4zD/3WLfI/zPdA9o2IipKODunnGDcuqbHwjsbB79ylv04LCGGzU0xQ6vTznOMpQGaLhhm6A==} engines: {node: '>= 12.0.0'} @@ -15000,6 +15275,10 @@ packages: resolution: {integrity: sha512-TEU+nJVUUnA4CYJFLvK5X9AOeH4KvDvhIfm0vV1GaQRtchnG0hgK5p8hw/xjv8cunWYCsiPCSDzObPyhEwq3KQ==} engines: {node: '>=0.4.0'} + xrpl@4.2.5: + resolution: {integrity: sha512-QIpsqvhaRiVvlq7px7lC+lhrxESDMN1vd8mW0SfTgY5WgzP9RLiDoVywOOvSZqDDjPs0EGfhxzYjREW1gGu0Ng==} + engines: {node: '>=18.0.0'} + xtend@2.1.2: resolution: {integrity: sha512-vMNKzr2rHP9Dp/e1NQFnLQlwlhp9L/LfvnsVdHxN1f+uggyVI3i08uD14GPvCToPkdsRfyPqIyYGmIk58V98ZQ==} engines: {node: '>=0.4'} @@ -15008,6 +15287,22 @@ packages: resolution: {integrity: sha512-LKYU1iAXJXUgAXn9URjiu+MWhyUXHsvfp7mcuYm9dSUKK0/CjtrUwFAxD82/mCWbtLsGjFIad0wIsod4zrTAEQ==} engines: {node: '>=0.4'} + xumm-js-pkce@1.0.2: + resolution: {integrity: sha512-hcadf2mHYBd6vCaKtQhcLJJjoG+1XIIbh2mq+QthheoRHAf3k/gqQNDnVqJnANTK7NBKo7GJWnY2yCkYj/AQzQ==} + + xumm-oauth2-pkce@2.8.7: + resolution: {integrity: sha512-gvy3L4WfQAWH9IRiaEOa2TRVDoxXGkzDRYdtVPrAygyo2kB/b216YxqraYWFxW90ngvA1/s8ze0TOtBmzvcmtA==} + + xumm-sdk@1.11.2: + resolution: {integrity: sha512-yCS7o0hd36Ijg+FeIYB4ghnx/26kcmnN1ngvTiFzoXY3j/qmTvtlFmuvAwp58kwFM3LIhz3F7XTZjG+CqVheOQ==} + + xumm-xapp-sdk@1.7.2: + resolution: {integrity: sha512-kiJAbA+F4gRNRux1zZ+PYJNko9fI4Y8EFf/4EMX4tPcdQEQyXfIMPj7l6tgwSNRy2sISbw2jUYDMayL+ZFlmOg==} + + xumm@1.8.0: + resolution: {integrity: sha512-KTNnNR3vWZ46TV0ucUu2qdLz+/28tuUBgzEudDC7uOG/WmKvy1/MSYxAKrDFHkCFW5nPmJK2UyWopDsHH0RJag==} + engines: {node: '>=14', npm: '>=7 <=10'} + y18n@4.0.3: resolution: {integrity: sha512-JKhqTOwSrqNA1NY5lSztJ1GrBiUodLMmIZuLiDaMRJ+itFd+ABVE8XBjOvIWL+rSqNDC74LCSFmlb/U4UZ4hJQ==} @@ -15018,6 +15313,7 @@ packages: yaeti@0.0.6: resolution: {integrity: sha512-MvQa//+KcZCUkBTIC9blM+CU9J2GzuTytsOUwf2lidtvkx/6gnEp1QvJv34t9vdjhFmha/mUiNDbN0D0mJWdug==} engines: {node: '>=0.10.32'} + deprecated: Package no longer supported. Contact Support at https://www.npmjs.com/support for more info. yalc@1.0.0-pre.53: resolution: {integrity: sha512-tpNqBCpTXplnduzw5XC+FF8zNJ9L/UXmvQyyQj7NKrDNavbJtHvzmZplL5ES/RCnjX7JR7W9wz5GVDXVP3dHUQ==} @@ -15140,6 +15436,8 @@ packages: snapshots: + '@adobe/css-tools@4.4.2': {} + '@adraffy/ens-normalize@1.10.0': {} '@adraffy/ens-normalize@1.10.1': {} @@ -17685,6 +17983,36 @@ snapshots: '@juggle/resize-observer@3.4.0': {} + '@ledgerhq/devices@8.4.4': + dependencies: + '@ledgerhq/errors': 6.19.1 + '@ledgerhq/logs': 6.12.0 + rxjs: 7.8.1 + semver: 7.6.3 + + '@ledgerhq/errors@6.19.1': {} + + '@ledgerhq/hw-app-xrp@6.31.0': + dependencies: + '@ledgerhq/hw-transport': 6.31.4 + bip32-path: 0.4.2 + + '@ledgerhq/hw-transport-webhid@6.30.0': + dependencies: + '@ledgerhq/devices': 8.4.4 + '@ledgerhq/errors': 6.19.1 + '@ledgerhq/hw-transport': 6.31.4 + '@ledgerhq/logs': 6.12.0 + + '@ledgerhq/hw-transport@6.31.4': + dependencies: + '@ledgerhq/devices': 8.4.4 + '@ledgerhq/errors': 6.19.1 + '@ledgerhq/logs': 6.12.0 + events: 3.3.0 + + '@ledgerhq/logs@6.12.0': {} + '@lit-labs/ssr-dom-shim@1.2.1': {} '@lit/reactive-element@1.6.3': @@ -21867,6 +22195,25 @@ snapshots: lz-string: 1.5.0 pretty-format: 27.5.1 + '@testing-library/jest-dom@6.6.3': + dependencies: + '@adobe/css-tools': 4.4.2 + aria-query: 5.1.3 + chalk: 3.0.0 + css.escape: 1.5.1 + dom-accessibility-api: 0.6.3 + lodash: 4.17.21 + redent: 3.0.0 + + '@testing-library/react-hooks@8.0.1(@types/react@18.3.12)(react-dom@18.3.1(react@18.3.1))(react@18.3.1)': + dependencies: + '@babel/runtime': 7.26.0 + react: 18.3.1 + react-error-boundary: 3.1.4(react@18.3.1) + optionalDependencies: + '@types/react': 18.3.12 + react-dom: 18.3.1(react@18.3.1) + '@testing-library/react@14.3.1(react-dom@18.3.1(react@18.3.1))(react@18.3.1)': dependencies: '@babel/runtime': 7.26.0 @@ -21875,6 +22222,16 @@ snapshots: react: 18.3.1 react-dom: 18.3.1(react@18.3.1) + '@testing-library/react@16.3.0(@testing-library/dom@9.3.4)(@types/react-dom@18.3.1)(@types/react@18.3.12)(react-dom@18.3.1(react@18.3.1))(react@18.3.1)': + dependencies: + '@babel/runtime': 7.26.0 + '@testing-library/dom': 9.3.4 + react: 18.3.1 + react-dom: 18.3.1(react@18.3.1) + optionalDependencies: + '@types/react': 18.3.12 + '@types/react-dom': 18.3.1 + '@testing-library/user-event@14.5.2(@testing-library/dom@9.3.4)': dependencies: '@testing-library/dom': 9.3.4 @@ -22291,6 +22648,10 @@ snapshots: '@types/webpack-env@1.18.5': {} + '@types/websocket@1.0.10': + dependencies: + '@types/node': 20.17.11 + '@types/ws@7.4.7': dependencies: '@types/node': 20.17.11 @@ -22588,6 +22949,46 @@ snapshots: json-schema-to-ts: 1.6.4 ts-morph: 12.0.0 + '@vitest/expect@3.1.3': + dependencies: + '@vitest/spy': 3.1.3 + '@vitest/utils': 3.1.3 + chai: 5.2.0 + tinyrainbow: 2.0.0 + + '@vitest/mocker@3.1.3(vite@5.4.11(@types/node@20.17.11)(terser@5.37.0))': + dependencies: + '@vitest/spy': 3.1.3 + estree-walker: 3.0.3 + magic-string: 0.30.17 + optionalDependencies: + vite: 5.4.11(@types/node@20.17.11)(terser@5.37.0) + + '@vitest/pretty-format@3.1.3': + dependencies: + tinyrainbow: 2.0.0 + + '@vitest/runner@3.1.3': + dependencies: + '@vitest/utils': 3.1.3 + pathe: 2.0.3 + + '@vitest/snapshot@3.1.3': + dependencies: + '@vitest/pretty-format': 3.1.3 + magic-string: 0.30.17 + pathe: 2.0.3 + + '@vitest/spy@3.1.3': + dependencies: + tinyspy: 3.0.2 + + '@vitest/utils@3.1.3': + dependencies: + '@vitest/pretty-format': 3.1.3 + loupe: 3.1.3 + tinyrainbow: 2.0.0 + '@wagmi/connectors@3.1.11(@react-native-async-storage/async-storage@1.24.0(react-native@0.76.5(@babel/core@7.26.0)(@babel/preset-env@7.26.0(@babel/core@7.26.0))(@types/react@18.3.12)(bufferutil@4.0.9)(react@18.3.1)(utf-8-validate@5.0.10)))(@types/react@18.3.12)(bufferutil@4.0.9)(react@18.3.1)(typescript@5.7.2)(utf-8-validate@5.0.10)(viem@2.16.0(bufferutil@4.0.9)(typescript@5.7.2)(utf-8-validate@5.0.10)(zod@3.24.1))(zod@3.24.1)': dependencies: '@coinbase/wallet-sdk': 3.9.3 @@ -23985,6 +24386,23 @@ snapshots: '@webassemblyjs/ast': 1.14.1 '@xtuc/long': 4.2.2 + '@xrplf/isomorphic@1.0.1(bufferutil@4.0.9)(utf-8-validate@5.0.10)': + dependencies: + '@noble/hashes': 1.6.1 + eventemitter3: 5.0.1 + ws: 8.18.0(bufferutil@4.0.9)(utf-8-validate@5.0.10) + transitivePeerDependencies: + - bufferutil + - utf-8-validate + + '@xrplf/secret-numbers@1.0.0(bufferutil@4.0.9)(utf-8-validate@5.0.10)': + dependencies: + '@xrplf/isomorphic': 1.0.1(bufferutil@4.0.9)(utf-8-validate@5.0.10) + ripple-keypairs: 2.0.0(bufferutil@4.0.9)(utf-8-validate@5.0.10) + transitivePeerDependencies: + - bufferutil + - utf-8-validate + '@xtuc/ieee754@1.2.0': {} '@xtuc/long@4.2.2': {} @@ -24246,6 +24664,8 @@ snapshots: object.assign: 4.1.7 util: 0.12.5 + assertion-error@2.0.1: {} + ast-types@0.15.2: dependencies: tslib: 2.3.0 @@ -24537,6 +24957,8 @@ snapshots: dependencies: file-uri-to-path: 1.0.0 + bip32-path@0.4.2: {} + bip66@1.1.5: dependencies: safe-buffer: 5.2.1 @@ -24836,6 +25258,14 @@ snapshots: ccount@2.0.1: {} + chai@5.2.0: + dependencies: + assertion-error: 2.0.1 + check-error: 2.1.1 + deep-eql: 5.0.2 + loupe: 3.1.3 + pathval: 2.0.0 + chalk@3.0.0: dependencies: ansi-styles: 4.3.0 @@ -24856,6 +25286,8 @@ snapshots: character-reference-invalid@2.0.1: {} + check-error@2.1.1: {} + checkpoint-store@1.1.0: dependencies: functional-red-black-tree: 1.0.1 @@ -25392,6 +25824,8 @@ snapshots: css-what@6.1.0: {} + css.escape@1.5.1: {} + cssesc@3.0.0: {} cssnano-preset-default@5.2.14(postcss@8.4.49): @@ -25706,6 +26140,8 @@ snapshots: dedent@1.5.3: {} + deep-eql@5.0.2: {} + deep-equal@2.2.3: dependencies: array-buffer-byte-length: 1.0.2 @@ -25859,6 +26295,8 @@ snapshots: dom-accessibility-api@0.5.16: {} + dom-accessibility-api@0.6.3: {} + dom-converter@0.2.0: dependencies: utila: 0.4.0 @@ -26154,6 +26592,8 @@ snapshots: es-module-lexer@1.6.0: {} + es-module-lexer@1.7.0: {} + es-object-atoms@1.0.0: dependencies: es-errors: 1.3.0 @@ -26943,6 +27383,8 @@ snapshots: expand-template@2.0.3: {} + expect-type@1.2.1: {} + exponential-backoff@3.1.1: {} express-validator@7.2.1: @@ -27008,7 +27450,7 @@ snapshots: extract-zip@2.0.1: dependencies: - debug: 4.3.4 + debug: 4.4.0(supports-color@5.5.0) get-stream: 5.2.0 yauzl: 2.10.0 optionalDependencies: @@ -27085,8 +27527,18 @@ snapshots: optionalDependencies: picomatch: 4.0.2 + fdir@6.4.4(picomatch@4.0.2): + optionalDependencies: + picomatch: 4.0.2 + fecha@4.2.3: {} + fetch-ponyfill@7.1.0: + dependencies: + node-fetch: 2.6.7 + transitivePeerDependencies: + - encoding + fetch-retry@5.0.6: {} figures@3.2.0: @@ -28895,6 +29347,8 @@ snapshots: dependencies: js-tokens: 4.0.0 + loupe@3.1.3: {} + lower-case@2.0.2: dependencies: tslib: 2.3.0 @@ -30794,6 +31248,10 @@ snapshots: pathe@1.1.2: {} + pathe@2.0.3: {} + + pathval@2.0.0: {} + pbkdf2@3.1.2: dependencies: create-hash: 1.2.0 @@ -31627,6 +32085,11 @@ snapshots: react-dom: 18.3.1(react@18.3.1) react-is: 18.1.0 + react-error-boundary@3.1.4(react@18.3.1): + dependencies: + '@babel/runtime': 7.26.0 + react: 18.3.1 + react-error-boundary@4.1.2(react@18.3.1): dependencies: '@babel/runtime': 7.26.0 @@ -31948,6 +32411,11 @@ snapshots: unified: 11.0.5 vfile: 6.0.3 + redent@3.0.0: + dependencies: + indent-string: 4.0.0 + strip-indent: 3.0.0 + redux-thunk@2.4.2(redux@4.2.1): dependencies: redux: 4.2.1 @@ -32212,6 +32680,32 @@ snapshots: hash-base: 3.1.0 inherits: 2.0.4 + ripple-address-codec@5.0.0(bufferutil@4.0.9)(utf-8-validate@5.0.10): + dependencies: + '@scure/base': 1.2.1 + '@xrplf/isomorphic': 1.0.1(bufferutil@4.0.9)(utf-8-validate@5.0.10) + transitivePeerDependencies: + - bufferutil + - utf-8-validate + + ripple-binary-codec@2.3.0(bufferutil@4.0.9)(utf-8-validate@5.0.10): + dependencies: + '@xrplf/isomorphic': 1.0.1(bufferutil@4.0.9)(utf-8-validate@5.0.10) + bignumber.js: 9.1.2 + ripple-address-codec: 5.0.0(bufferutil@4.0.9)(utf-8-validate@5.0.10) + transitivePeerDependencies: + - bufferutil + - utf-8-validate + + ripple-keypairs@2.0.0(bufferutil@4.0.9)(utf-8-validate@5.0.10): + dependencies: + '@noble/curves': 1.7.0 + '@xrplf/isomorphic': 1.0.1(bufferutil@4.0.9)(utf-8-validate@5.0.10) + ripple-address-codec: 5.0.0(bufferutil@4.0.9)(utf-8-validate@5.0.10) + transitivePeerDependencies: + - bufferutil + - utf-8-validate + rlp@2.2.7: dependencies: bn.js: 5.2.1 @@ -32550,6 +33044,8 @@ snapshots: side-channel-map: 1.0.1 side-channel-weakmap: 1.0.2 + siginfo@2.0.0: {} + signal-exit@3.0.7: {} signal-exit@4.1.0: {} @@ -32704,6 +33200,8 @@ snapshots: dependencies: escape-string-regexp: 2.0.0 + stackback@0.0.2: {} + stackframe@1.3.4: {} stacktrace-parser@0.1.10: @@ -32714,6 +33212,8 @@ snapshots: statuses@2.0.1: {} + std-env@3.9.0: {} + stop-iteration-iterator@1.1.0: dependencies: es-errors: 1.3.0 @@ -32879,6 +33379,10 @@ snapshots: dependencies: is-hex-prefixed: 1.0.0 + strip-indent@3.0.0: + dependencies: + min-indent: 1.0.1 + strip-indent@4.0.0: dependencies: min-indent: 1.0.1 @@ -33245,11 +33749,26 @@ snapshots: tiny-warning@1.0.3: {} + tinybench@2.9.0: {} + + tinyexec@0.3.2: {} + tinyglobby@0.2.10: dependencies: fdir: 6.4.2(picomatch@4.0.2) picomatch: 4.0.2 + tinyglobby@0.2.13: + dependencies: + fdir: 6.4.4(picomatch@4.0.2) + picomatch: 4.0.2 + + tinypool@1.0.2: {} + + tinyrainbow@2.0.0: {} + + tinyspy@3.0.2: {} + tmp@0.2.3: {} tmpl@1.0.5: {} @@ -33918,6 +34437,24 @@ snapshots: - supports-color - terser + vite-node@3.1.3(@types/node@20.17.11)(terser@5.37.0): + dependencies: + cac: 6.7.14 + debug: 4.4.0(supports-color@5.5.0) + es-module-lexer: 1.7.0 + pathe: 2.0.3 + vite: 5.4.11(@types/node@20.17.11)(terser@5.37.0) + transitivePeerDependencies: + - '@types/node' + - less + - lightningcss + - sass + - sass-embedded + - stylus + - sugarss + - supports-color + - terser + vite-tsconfig-paths@5.1.2(typescript@5.7.2)(vite@5.4.11(@types/node@22.10.3)(terser@5.37.0)): dependencies: debug: 4.4.0(supports-color@5.5.0) @@ -33929,6 +34466,16 @@ snapshots: - supports-color - typescript + vite@5.4.11(@types/node@20.17.11)(terser@5.37.0): + dependencies: + esbuild: 0.21.5 + postcss: 8.4.49 + rollup: 4.29.1 + optionalDependencies: + '@types/node': 20.17.11 + fsevents: 2.3.3 + terser: 5.37.0 + vite@5.4.11(@types/node@22.10.3)(terser@5.37.0): dependencies: esbuild: 0.21.5 @@ -33939,6 +34486,43 @@ snapshots: fsevents: 2.3.3 terser: 5.37.0 + vitest@3.1.3(@types/debug@4.1.12)(@types/node@20.17.11)(terser@5.37.0): + dependencies: + '@vitest/expect': 3.1.3 + '@vitest/mocker': 3.1.3(vite@5.4.11(@types/node@20.17.11)(terser@5.37.0)) + '@vitest/pretty-format': 3.1.3 + '@vitest/runner': 3.1.3 + '@vitest/snapshot': 3.1.3 + '@vitest/spy': 3.1.3 + '@vitest/utils': 3.1.3 + chai: 5.2.0 + debug: 4.4.0(supports-color@5.5.0) + expect-type: 1.2.1 + magic-string: 0.30.17 + pathe: 2.0.3 + std-env: 3.9.0 + tinybench: 2.9.0 + tinyexec: 0.3.2 + tinyglobby: 0.2.13 + tinypool: 1.0.2 + tinyrainbow: 2.0.0 + vite: 5.4.11(@types/node@20.17.11)(terser@5.37.0) + vite-node: 3.1.3(@types/node@20.17.11)(terser@5.37.0) + why-is-node-running: 2.3.0 + optionalDependencies: + '@types/debug': 4.1.12 + '@types/node': 20.17.11 + transitivePeerDependencies: + - less + - lightningcss + - msw + - sass + - sass-embedded + - stylus + - sugarss + - supports-color + - terser + vlq@1.0.1: {} vm-browserify@1.1.2: {} @@ -34944,6 +35528,11 @@ snapshots: dependencies: isexe: 2.0.0 + why-is-node-running@2.3.0: + dependencies: + siginfo: 2.0.0 + stackback: 0.0.2 + winston-transport@4.9.0: dependencies: logform: 2.7.0 @@ -35085,12 +35674,75 @@ snapshots: xmlhttprequest-ssl@2.1.2: {} + xrpl@4.2.5(bufferutil@4.0.9)(utf-8-validate@5.0.10): + dependencies: + '@scure/bip32': 1.4.0 + '@scure/bip39': 1.3.0 + '@xrplf/isomorphic': 1.0.1(bufferutil@4.0.9)(utf-8-validate@5.0.10) + '@xrplf/secret-numbers': 1.0.0(bufferutil@4.0.9)(utf-8-validate@5.0.10) + bignumber.js: 9.1.2 + eventemitter3: 5.0.1 + ripple-address-codec: 5.0.0(bufferutil@4.0.9)(utf-8-validate@5.0.10) + ripple-binary-codec: 2.3.0(bufferutil@4.0.9)(utf-8-validate@5.0.10) + ripple-keypairs: 2.0.0(bufferutil@4.0.9)(utf-8-validate@5.0.10) + transitivePeerDependencies: + - bufferutil + - utf-8-validate + xtend@2.1.2: dependencies: object-keys: 0.4.0 xtend@4.0.2: {} + xumm-js-pkce@1.0.2: + dependencies: + crypto-js: 4.2.0 + + xumm-oauth2-pkce@2.8.7: + dependencies: + debug: 4.4.0(supports-color@5.5.0) + events: 3.3.0 + xumm-js-pkce: 1.0.2 + xumm-sdk: 1.11.2 + transitivePeerDependencies: + - encoding + - supports-color + + xumm-sdk@1.11.2: + dependencies: + '@types/websocket': 1.0.10 + assert: 2.1.0 + bignumber.js: 9.1.2 + buffer: 6.0.3 + debug: 4.4.0(supports-color@5.5.0) + events: 3.3.0 + fetch-ponyfill: 7.1.0 + node-fetch: 2.7.0 + os-browserify: 0.3.0 + websocket: 1.0.35 + transitivePeerDependencies: + - encoding + - supports-color + + xumm-xapp-sdk@1.7.2: + dependencies: + debug: 4.4.0(supports-color@5.5.0) + events: 3.3.0 + transitivePeerDependencies: + - supports-color + + xumm@1.8.0: + dependencies: + buffer: 6.0.3 + events: 3.3.0 + xumm-oauth2-pkce: 2.8.7 + xumm-sdk: 1.11.2 + xumm-xapp-sdk: 1.7.2 + transitivePeerDependencies: + - encoding + - supports-color + y18n@4.0.3: {} y18n@5.0.8: {}