Playground Overview

The Rastion Playground is an interactive, web-based environment for testing, comparing, and analyzing optimization algorithms in real time. It provides a powerful interface for experimenting with different algorithms and parameters without setting up a local development environment.

What is the Playground?

The Playground is a cloud-based execution environment that combines the power of the Qubots framework with an intuitive web interface, enabling:

Real-time Execution

Run optimizations directly in your browser with immediate results

Parameter Tuning

Adjust algorithm parameters with intuitive controls and see the impact

Visual Analytics

Visualize optimization progress and results with interactive charts

Algorithm Comparison

Compare multiple algorithms side-by-side on the same problem

Interface Overview

Main Components

The Playground interface consists of several key areas:

┌────────────────────────────┬─────────────────────────────┐
│ Problem Selection Panel    │ Optimizer Selection Panel   │
├────────────────────────────┴─────────────────────────────┤
│ Parameter Configuration Area                             │
├────────────────────────────┬─────────────────────────────┤
│ Execution Controls         │ Real-time Progress Monitor  │
├────────────────────────────┴─────────────────────────────┤
│ Results Visualization and Analysis Dashboard             │
└──────────────────────────────────────────────────────────┘

Problem Selection Panel

Choose from available optimization problems:

  • Problem Categories: Browse by domain (TSP, MaxCut, VRP, etc.)
  • Problem Details: View problem descriptions and parameters
  • Instance Selection: Choose specific problem instances
  • Custom Problems: Upload and test your own problems

Optimizer Selection Panel

Select optimization algorithms to test:

  • Algorithm Categories: Browse by type (exact, heuristic, metaheuristic)
  • Compatibility Check: See which optimizers work with selected problems
  • Algorithm Details: View optimizer descriptions and parameters
  • Custom Optimizers: Upload and test your own algorithms

Getting Started

Step 1: Access the Playground

  1. Navigate to rastion.com/playground
  2. Log in to your Rastion account
  3. The playground interface will load with default selections

Step 2: Select a Problem

  1. Browse Categories: Click on a problem category (e.g., “Combinatorial Optimization”)
  2. Choose Problem: Select a specific problem (e.g., “Maximum Cut Problem”)
  3. Review Details: Read the problem description and default parameters
  4. Configure Instance: Adjust problem parameters if needed

Example problem selection:

Selected: Maximum Cut Problem
Description: Graph partitioning optimization to maximize edge cuts
Parameters:
  • n_vertices: 15
  • graph_type: random
  • density: 0.4
  • weight_range: [1.0, 10.0]

Step 3: Choose an Optimizer

  1. View Compatible Optimizers: See algorithms that can solve the selected problem
  2. Select Algorithm: Choose an optimizer (e.g., “OR-Tools MaxCut Solver”)
  3. Review Parameters: Check default algorithm parameters
  4. Customize Settings: Adjust parameters for your experiment

Example optimizer selection:

Selected: OR-Tools MaxCut Optimizer
Description: Constraint programming solver using Google OR-Tools
Parameters:
  • time_limit: 60.0 seconds
  • num_search_workers: 4
  • use_symmetry: true
  • log_search_progress: false

Step 4: Run Optimization

  1. Review Configuration: Verify problem and optimizer settings
  2. Start Execution: Click the “Run Optimization” button
  3. Monitor Progress: Watch real-time progress updates
  4. View Results: Analyze results when optimization completes

Parameter Configuration

Problem Parameters

Customize problem instances with intuitive controls:

Slider Controls

For numerical parameters like graph size or density:

n_vertices: [====|====] 15 (range: 5-50)
density:    [==|======] 0.4 (range: 0.1-1.0)

Dropdown Menus

For categorical parameters:

graph_type: [Random ▼]
Options: Random, Complete, Cycle, Grid, Scale-free

Text Inputs

For complex parameters:

weight_range: [1.0, 10.0]
custom_seed: 42

Optimizer Parameters

Fine-tune algorithm behavior:

Performance Settings

time_limit:        [====|==] 60.0 seconds
memory_limit:      [===|===] 1024 MB
num_workers:       [==|====] 4 threads

Algorithm-Specific Options

☑ use_symmetry_breaking
☑ enable_preprocessing
☐ log_detailed_progress
☐ save_intermediate_solutions
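The sliders, dropdowns, and checkboxes above ultimately set plain key-value parameters. As a rough local analogue, you might validate a configuration against the same ranges and choices before a run. The helper and spec below are hypothetical sketches, not part of the qubots API:

```python
# Hypothetical sketch: validate playground-style parameters against
# numeric ranges (tuples) or categorical choices (sets).

def validate_params(params, spec):
    """Check each parameter against a (min, max) range or a set of choices."""
    for name, value in params.items():
        if name not in spec:
            raise KeyError(f"unknown parameter: {name}")
        rule = spec[name]
        if isinstance(rule, tuple):           # numeric range
            low, high = rule
            if not low <= value <= high:
                raise ValueError(f"{name}={value} outside [{low}, {high}]")
        elif value not in rule:               # categorical choices
            raise ValueError(f"{name}={value!r} not in {sorted(rule)}")

problem_spec = {
    "n_vertices": (5, 50),
    "density": (0.1, 1.0),
    "graph_type": {"random", "complete", "cycle", "grid", "scale_free"},
}
problem_params = {"n_vertices": 15, "density": 0.4, "graph_type": "random"}
validate_params(problem_params, problem_spec)  # raises on invalid settings
```

The same pattern applies to optimizer parameters such as `time_limit` or `num_workers`.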

Execution and Monitoring

Real-time Progress

Monitor optimization progress with live updates:

Progress Indicators

  • Overall Progress: Percentage completion
  • Current Best: Best solution found so far
  • Iterations: Number of algorithm iterations
  • Time Elapsed: Runtime since start

Live Charts

  • Convergence Plot: Solution quality over time
  • Resource Usage: CPU and memory consumption
  • Search Progress: Algorithm-specific metrics
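Behind a convergence plot is simply a series of (elapsed time, best value) pairs. A minimal local sketch (the `ConvergenceTracker` class and the improvement values are illustrative, not part of qubots):

```python
import time

# Sketch: record the data behind a convergence plot as (elapsed, best) pairs.
class ConvergenceTracker:
    def __init__(self):
        self.start = time.monotonic()
        self.history = []          # (elapsed_seconds, best_value_so_far)
        self.best = float("-inf")

    def record(self, value):
        if value > self.best:      # maximization: keep improvements only
            self.best = value
            self.history.append((time.monotonic() - self.start, value))

tracker = ConvergenceTracker()
for value in [101.0, 98.5, 110.2, 110.2, 118.6]:  # illustrative values
    tracker.record(value)
print([v for _, v in tracker.history])  # [101.0, 110.2, 118.6]
```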

Execution Controls

Manage running optimizations:

[▶ Run] [⏸ Pause] [⏹ Stop] [🔄 Restart]

  • Run: Start the optimization
  • Pause: Temporarily pause execution
  • Stop: Terminate optimization early
  • Restart: Reset and run again with same parameters

Results Analysis

Solution Visualization

View optimization results in multiple formats:

Numerical Results

Optimization Results:
✅ Status: Success
🎯 Best Value: 11,856.0
⏱️ Runtime: 45.23 seconds
🔄 Iterations: 1,247
📊 Success Rate: 100%

Solution Details

Best Solution Found:
Partition A: [0, 2, 5, 7, 9, 11, 13]
Partition B: [1, 3, 4, 6, 8, 10, 12, 14]
Cut Weight: 11,856.0
Cut Edges: 23 out of 42 total edges
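A cut weight like the one above can be recomputed directly from the partition and the weighted edge list. A minimal sketch, using a small illustrative graph rather than the 15-vertex instance:

```python
# Sketch: recompute a MaxCut objective from a partition and an edge list.
def cut_weight(partition_a, edges):
    """Sum the weights of edges with exactly one endpoint in partition_a."""
    in_a = set(partition_a)
    return sum(w for u, v, w in edges if (u in in_a) != (v in in_a))

# Tiny illustrative graph: (u, v, weight) triples.
edges = [(0, 1, 3.0), (1, 2, 2.5), (0, 2, 4.0), (2, 3, 1.5)]
print(cut_weight([0, 2], edges))  # edges (0,1), (1,2), (2,3) are cut -> 7.0
```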

Visual Representations

  • Graph Visualization: For graph problems (TSP tours, graph cuts)
  • Route Maps: For routing problems (VRP, TSP)
  • Convergence Charts: Solution quality over time
  • Performance Metrics: Detailed algorithm statistics

Comparison Tools

Compare multiple optimization runs:

Side-by-Side Comparison

┌─────────────────┬─────────────────┬─────────────────┐
│ OR-Tools        │ Genetic Alg.    │ Simulated Ann.  │
├─────────────────┼─────────────────┼─────────────────┤
│ Value: 11,856   │ Value: 11,834   │ Value: 11,798   │
│ Time: 45.2s     │ Time: 120.5s    │ Time: 89.3s     │
│ Success: 100%   │ Success: 95%    │ Success: 88%    │
└─────────────────┴─────────────────┴─────────────────┘

Performance Charts

  • Quality vs Time: Compare convergence speed
  • Parameter Sensitivity: How parameters affect performance
  • Scalability Analysis: Performance on different problem sizes

Advanced Features

Batch Experiments

Run multiple experiments automatically:

  1. Parameter Sweeps: Test different parameter combinations
  2. Algorithm Comparison: Compare multiple algorithms systematically
  3. Statistical Analysis: Multiple runs for statistical significance
  4. Automated Reporting: Generate comprehensive experiment reports
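The sweep-and-repeat pattern above can be sketched locally in a few lines. Here `run_experiment` is a placeholder objective standing in for an actual optimization call, and the grid values are illustrative:

```python
import itertools
import random

# Placeholder for a real optimization run; returns a score for one seed.
def run_experiment(n_vertices, density, seed):
    random.seed(seed)
    return random.uniform(0.8, 1.0) * n_vertices * density

grid = {"n_vertices": [10, 15, 20], "density": [0.2, 0.4]}
results = []
for n, d in itertools.product(grid["n_vertices"], grid["density"]):
    # Repeat each configuration across several seeds for statistical stability
    scores = [run_experiment(n, d, seed) for seed in range(3)]
    results.append({"n_vertices": n, "density": d,
                    "mean_score": sum(scores) / len(scores)})

best = max(results, key=lambda r: r["mean_score"])
```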

Custom Uploads

Test your own optimization components:

Upload Problem

  1. Prepare a repository containing qubot.py and config.json
  2. Use the “Upload Problem” button in the playground
  3. Test the problem immediately after upload

Upload Optimizer

  1. Ensure the optimizer is compatible with the selected problems
  2. Use the “Upload Optimizer” button
  3. Configure parameters and run tests

Export and Sharing

Share your experiments with others:

  • Export Results: Download results as CSV, JSON, or PDF
  • Share Configuration: Generate shareable links to experiment setups
  • Embed Results: Embed interactive charts in presentations
  • Collaboration: Invite team members to view and modify experiments
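Outside the playground, equivalent CSV and JSON exports take only a few lines of standard-library Python. The result records below are illustrative:

```python
import csv
import json

# Illustrative run summaries, shaped like a playground comparison table.
results = [
    {"optimizer": "OR-Tools", "best_value": 11856.0, "runtime_s": 45.2},
    {"optimizer": "Genetic Alg.", "best_value": 11834.0, "runtime_s": 120.5},
]

# JSON export: one self-describing document.
with open("results.json", "w") as f:
    json.dump(results, f, indent=2)

# CSV export: header row from the dict keys, one row per run.
with open("results.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=results[0].keys())
    writer.writeheader()
    writer.writerows(results)
```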

Integration with Development Workflow

Local Development

Use playground results to guide local development:

# Reproduce playground results locally
from qubots import AutoProblem, AutoOptimizer

# Use same configuration as playground
problem = AutoProblem.from_repo("examples/maxcut_problem", override_params={
    "n_vertices": 15,
    "graph_type": "random",
    "density": 0.4
})

optimizer = AutoOptimizer.from_repo("examples/ortools_maxcut_optimizer", override_params={
    "time_limit": 60.0,
    "use_symmetry": True
})

result = optimizer.optimize(problem)

Leaderboard Preparation

Use the playground to prepare for leaderboard submissions:

  1. Test Algorithm: Verify your algorithm works correctly
  2. Tune Parameters: Find optimal parameter settings
  3. Validate Performance: Ensure consistent results
  4. Submit to Leaderboard: Submit with confidence

Best Practices

Effective Experimentation

  • Start Simple: Begin with default parameters
  • Change One Thing: Modify one parameter at a time
  • Document Results: Keep notes on what works
  • Compare Systematically: Use consistent evaluation criteria

Performance Optimization

  • Monitor Resources: Watch CPU and memory usage
  • Tune Time Limits: Balance quality and speed
  • Test Scalability: Try different problem sizes
  • Validate Robustness: Test on multiple instances

Collaboration

  • Share Configurations: Use shareable links for team collaboration
  • Document Insights: Add comments to experiment results
  • Discuss Results: Use community features to discuss findings
  • Learn from Others: Study successful configurations from the community

Next Steps