How to Create a Public Optimization Experiment
Learn how to create and share optimization experiments on Rastion for community collaboration and research
Optimization experiments (also called “workflows”) on Rastion allow you to share your research findings, algorithm comparisons, and optimization insights with the community. This guide shows you how to create compelling experiments that others can learn from and build upon.
What is an Optimization Experiment?
An optimization experiment is a structured study that:
- Documents Optimization Runs: Records specific problem-optimizer combinations
- Shares Configurations: Makes successful parameter settings available to others
- Enables Reproduction: Provides all details needed to reproduce results
- Facilitates Learning: Helps the community understand what works and why
Prerequisites
Before creating an experiment:
- Uploaded Repositories: Upload your algorithms to Rastion
- Playground Testing: Test your algorithms in the playground
- GitHub Login: Sign in with GitHub
- Successful Optimization Run: Complete at least one successful playground execution
Creating Experiments from Playground
1. Save Playground Configuration
After a successful optimization run in the playground:
Save Workflow Option
- Complete Optimization: Finish a playground run with good results
- Click “Save”: Use the save button in the playground interface
- Name Your Workflow: Give it a descriptive name
- Add Description: Explain what makes this configuration special
- Set Visibility: Choose public (shareable) or private (personal)
Workflow Information
Your saved workflow captures the following (a sketch appears after this list):
- Problem Selection: Exact problem repository and version
- Optimizer Selection: Your algorithm repository and version
- Parameter Configuration: All problem and optimizer parameters
- Performance Results: Best value, runtime, and other metrics
- Execution Environment: Playground environment specifications
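The exact storage format is internal to Rastion, but conceptually the saved workflow amounts to a record like the hypothetical sketch below. All field names and values here are illustrative assumptions, not the platform's actual schema.

```python
# Hypothetical sketch of what a saved workflow captures.
# Field names and values are illustrative only, not Rastion's actual schema.
saved_workflow = {
    "name": "TSP, simulated annealing, tuned cooling schedule",
    "visibility": "public",  # or "private"
    "problem": {"repo": "username/tsp_problem", "version": "v1.2"},
    "optimizer": {"repo": "username/sa_optimizer", "version": "v0.4"},
    "parameters": {
        "problem": {"n_cities": 100},
        "optimizer": {"initial_temp": 50.0, "cooling_rate": 0.995, "max_iters": 20000},
    },
    "results": {"best_value": 7542.0, "runtime_seconds": 3.1},
    "environment": {"python": "3.11", "playground": "standard"},
}
```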
2. Access Experiments Page
Navigate to rastion.com/experiments to:
View Community Experiments
- Browse Public Workflows: See what others have shared
- Search by Problem Type: Filter by TSP, VRP, MaxCut, etc.
- Sort by Performance: Find the best-performing configurations
- Learn from Success: Study successful parameter combinations
Manage Your Experiments
- My Workflows: View your saved configurations
- Edit Descriptions: Update experiment documentation
- Share Settings: Change visibility between public and private
- Performance Tracking: Monitor how your experiments perform
Experiment Features
1. Experiment Interface
The experiments page provides:
Compact Search and Sorting
- Search Bar: A compact search field for quickly locating experiments
- Sorting Options: Sort by performance, date, or popularity
- Filter by Type: Filter by problem category (TSP, VRP, etc.)
- Consistent Width: Matches the layout of the benchmarks page
Experiment Cards
Each experiment displays:
- Configuration Summary: Problem and optimizer combination
- Performance Metrics: Best value, runtime, success rate
- Author Information: Creator and creation date
- Repository Links: Direct links to used repositories
- Action Buttons: Open in playground, copy URL, view details
2. Sharing and Collaboration
Open in Playground
- “Open in Playground” Button: Launches the experiment directly in the playground
- Pre-configured Setup: Loads exact configuration in playground
- Parameter Inheritance: All parameters automatically set
- Immediate Testing: Start experimenting with proven configurations
Repository Integration
- Clickable Repository Names: Link to rastion.com/username/repo_name
- Repository Browsing: Full access to algorithm and problem code
- Version Tracking: Links to specific repository versions used
- Code Review: Examine implementation details
URL Sharing
- “Copy URL” Button: Share experiment URLs with other Rastion users
- Direct Access: Recipients can immediately view and use experiments
- Community Sharing: Facilitate knowledge sharing across the platform
- Collaboration: Enable team members to access shared experiments
Creating Multiple Experiments
1. Systematic Experimentation
Parameter Studies
Create multiple experiments to study parameter effects (see the sketch after this list):
- Baseline Experiment: Start with default parameters
- Single Parameter Variation: Change one parameter at a time
- Combination Studies: Test promising parameter combinations
- Performance Analysis: Compare results across experiments
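As a concrete illustration, the sketch below is plain Python and not tied to any Rastion API: it holds a toy simulated annealer fixed except for one parameter, runs several seeds per setting, and reports the mean best value per setting.

```python
import math
import random
from statistics import mean

def objective(x):
    """Toy objective: sphere function, minimum 0 at the origin."""
    return sum(v * v for v in x)

def simulated_annealing(cooling_rate, seed, dims=5, initial_temp=10.0, iters=5000):
    """Minimal SA loop; only cooling_rate is varied in this study."""
    rng = random.Random(seed)
    x = [rng.uniform(-5, 5) for _ in range(dims)]
    current = best = objective(x)
    temp = initial_temp
    for _ in range(iters):
        candidate = [v + rng.gauss(0, 0.5) for v in x]
        value = objective(candidate)
        # Accept improvements always, worse moves with a temperature-dependent probability.
        if value < current or rng.random() < math.exp((current - value) / max(temp, 1e-12)):
            x, current = candidate, value
            best = min(best, current)
        temp *= cooling_rate
    return best

# Baseline plus single-parameter variations, each run with several random seeds.
for cooling_rate in (0.90, 0.95, 0.99, 0.999):
    results = [simulated_annealing(cooling_rate, seed) for seed in range(5)]
    print(f"cooling_rate={cooling_rate}: mean best value = {mean(results):.4f}")
```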
Algorithm Comparisons
Compare different approaches (see the sketch after this list):
- Same Problem, Different Algorithms: Test multiple optimizers on one problem
- Algorithm Variants: Compare different versions of your algorithm
- Hybrid Approaches: Test combinations of optimization techniques
- Benchmark Studies: Compare against established algorithms
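A comparison study follows the same pattern: same objective, same seeds, different algorithms. The sketch below uses two self-contained toy optimizers purely for illustration.

```python
import random
from statistics import mean, stdev

def objective(x):
    """Shared toy objective: sphere function."""
    return sum(v * v for v in x)

def random_search(seed, dims=5, iters=5000):
    """Sample uniformly at random and keep the best value seen."""
    rng = random.Random(seed)
    return min(objective([rng.uniform(-5, 5) for _ in range(dims)]) for _ in range(iters))

def hill_climbing(seed, dims=5, iters=5000):
    """Perturb the current point and accept only improving moves."""
    rng = random.Random(seed)
    x = [rng.uniform(-5, 5) for _ in range(dims)]
    best = objective(x)
    for _ in range(iters):
        candidate = [v + rng.gauss(0, 0.2) for v in x]
        value = objective(candidate)
        if value < best:
            x, best = candidate, value
    return best

# Same problem, same seeds, different algorithms.
seeds = range(10)
for name, algorithm in [("random_search", random_search), ("hill_climbing", hill_climbing)]:
    results = [algorithm(seed) for seed in seeds]
    print(f"{name}: mean best = {mean(results):.4f}, std = {stdev(results):.4f}")
```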
2. Experiment Documentation
Effective Descriptions
Write clear experiment descriptions: state the hypothesis you tested, the configuration you used, and the key takeaway from the results.
Tags and Categorization
Use relevant tags for discoverability:
- Problem Type: “tsp”, “vrp”, “maxcut”, “scheduling”
- Algorithm Type: “genetic”, “simulated_annealing”, “local_search”
- Study Type: “parameter_study”, “comparison”, “benchmark”
- Domain: “logistics”, “graph_theory”, “combinatorial”
Community Engagement
1. Learning from Others
Browse Community Experiments
- Study Successful Configurations: Learn from high-performing experiments
- Understand Parameter Choices: See why certain parameters work well
- Discover New Approaches: Find innovative optimization strategies
- Benchmark Your Work: Compare your results with community standards
Experiment Analysis
When viewing others’ experiments:
- Performance Metrics: Compare objective values and runtimes
- Parameter Settings: Study successful parameter combinations
- Problem-Algorithm Fit: Understand which algorithms work for which problems
- Implementation Details: Access repository code for deeper understanding
2. Contributing to Knowledge
Share Your Insights
- Document Findings: Explain why certain configurations work
- Share Failures: Negative results are also valuable for the community
- Provide Context: Explain the reasoning behind parameter choices
- Offer Improvements: Suggest enhancements to existing approaches
Best Practices for Public Experiments
- Clear Naming: Use descriptive experiment names
- Comprehensive Descriptions: Explain methodology and findings
- Reproducible Setup: Ensure others can replicate your results
- Performance Context: Explain what constitutes good performance
Experiment Analysis and Visualization
Statistical Analysis
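At minimum, report summary statistics over repeated runs rather than a single best value. The sketch below uses only the Python standard library and illustrative numbers; it computes the mean, median, standard deviation, and an approximate 95% confidence interval for the mean.

```python
from statistics import mean, median, stdev

# Illustrative best objective values from repeated runs of one configuration.
best_values = [7542.0, 7610.3, 7588.1, 7549.7, 7575.2, 7601.8, 7560.4, 7593.9]

n = len(best_values)
avg = mean(best_values)
std = stdev(best_values)
half_width = 1.96 * std / n ** 0.5  # approximate 95% CI half-width (normal assumption)

print(f"n={n}  mean={avg:.1f}  median={median(best_values):.1f}  std={std:.1f}")
print(f"approx. 95% CI for the mean: [{avg - half_width:.1f}, {avg + half_width:.1f}]")
```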
Visualization Options
The platform provides built-in visualizations (an offline matplotlib sketch follows this list):
- Performance Comparison: Box plots, violin plots
- Convergence Analysis: Line plots showing optimization progress
- Parameter Sensitivity: Heatmaps and scatter plots
- Statistical Summary: Tables with means, medians, confidence intervals
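These charts are generated by the platform. If you also want figures for a paper or report built from exported data, a short matplotlib script along these lines works; the values here are illustrative.

```python
import matplotlib.pyplot as plt

# Illustrative best-value distributions for two optimizers on the same problem.
results = {
    "simulated_annealing": [7542, 7610, 7588, 7550, 7575],
    "local_search": [7701, 7688, 7732, 7695, 7710],
}

fig, ax = plt.subplots(figsize=(6, 4))
ax.boxplot(list(results.values()))
ax.set_xticks(range(1, len(results) + 1))
ax.set_xticklabels(list(results.keys()))
ax.set_ylabel("Best objective value")
ax.set_title("Optimizer comparison (lower is better)")
fig.tight_layout()
fig.savefig("optimizer_comparison.png")
```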
Sharing and Collaboration
Publication Settings
- Public Experiments: Visible to all users, searchable
- Unlisted: Accessible via direct link only
- Private: Visible only to you and collaborators
Collaboration Features
Sharing Options
- Direct Links: Share experiment URLs
- Embed Code: Embed results in websites/papers
- Export Data: Download results in CSV or JSON format (a loader sketch follows this list)
- Citation: Generate academic citations
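For downstream analysis of an exported file, a small loader like this hypothetical sketch is usually enough. The file name and column names here are assumptions; adjust them to match the actual export.

```python
import csv
from statistics import mean

# Hypothetical CSV export with one row per run; column names are assumptions.
with open("experiment_results.csv", newline="") as f:
    rows = list(csv.DictReader(f))

best_values = [float(row["best_value"]) for row in rows]
runtimes = [float(row["runtime_seconds"]) for row in rows]
print(f"{len(rows)} runs: mean best = {mean(best_values):.2f}, mean runtime = {mean(runtimes):.2f}s")
```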
Best Practices
Experimental Design
- Clear Hypothesis: State what you’re testing
- Controlled Variables: Keep non-tested parameters constant
- Multiple Runs: Use multiple random seeds for statistical validity
- Appropriate Baselines: Include well-known algorithms for comparison
Documentation
Reproducibility
- Version Control: Specify exact algorithm versions
- Parameter Documentation: Record all parameter settings
- Environment Details: Note any special requirements
- Data Availability: Ensure all data is accessible
Next Steps
Create Benchmark
Create standardized benchmarks for the community
Join Leaderboard
Submit your algorithms to competitions
View Experiments
Browse community experiments for inspiration
Playground Testing
Test algorithms before creating experiments
Your experiments contribute to the collective knowledge of the optimization community. Share your insights and help advance the field!