About the Roles
Rex.zone curates freelance opportunities that plug directly into human-in-the-loop AI/ML workflows. Successful contributors help define annotation guidelines, raise training-data quality, run rubric-driven assessments, and ship reliable ground truth to model engineers. Our remote freelance roles span short-burst contracts and multi-month engagements across NLP, computer vision, and multimodal tasks. You will partner with client teams on evaluation cycles (A/B comparisons, model preference ranking), dataset curation (gold sets, adversarial prompts), and QA (inter-annotator agreement, error taxonomies, guideline compliance).
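Inter-annotator agreement, mentioned above, is commonly measured with a chance-corrected statistic such as Cohen's kappa. Below is a minimal sketch of that calculation; the function name and sample labels are illustrative, not part of Rex.zone's actual tooling.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two annotators labeling the same items.

    Returns 1.0 for perfect agreement, 0.0 for chance-level agreement.
    """
    if len(labels_a) != len(labels_b) or not labels_a:
        raise ValueError("need two equal-length, non-empty label lists")
    n = len(labels_a)
    # Observed agreement: fraction of items both annotators labeled the same.
    p_observed = sum(x == y for x, y in zip(labels_a, labels_b)) / n
    # Expected agreement: chance overlap based on each annotator's label rates.
    counts_a, counts_b = Counter(labels_a), Counter(labels_b)
    p_expected = sum(
        (counts_a[label] / n) * (counts_b[label] / n)
        for label in set(counts_a) | set(counts_b)
    )
    if p_expected == 1.0:  # both annotators used a single identical label
        return 1.0
    return (p_observed - p_expected) / (1 - p_expected)
```

For example, two annotators agreeing on 3 of 4 binary labels can still score well below their raw 75% agreement once chance is factored out, which is why guideline-compliance reviews typically track kappa rather than plain percent agreement.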
