- Navigate to Microsoft Translator Hub.
- Click on “Build a translation system”.
- Log in.
- Click on the “Projects” tab.
- In the “Projects with deployed training systems to be reviewed” section, click the “Project Name” link for the project you want to test.
- This will take you to a “Test system” page where you can manually translate text using your trained system.
- In addition to manual testing, you can gain insight into how the BLEU score was determined by clicking the “Evaluate results” link. The resulting page shows the test data that was used to calculate the BLEU score (0 to 100). Note the green number preceded by a plus sign: this is the amount of improvement your training achieved over Microsoft's base translation system.
The content in the left-hand column is the source text. The text next to the “Ref:” label in the right-hand column is the human-translated reference, and the text next to the “MT:” label is the machine-translated output from your trained system. The BLEU score is calculated, roughly speaking, from the average similarity between the Ref and MT translations. Scores above 50 are considered very good.
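Under the hood, BLEU is based on n-gram overlap between the MT output and the reference, combined with a brevity penalty, rather than a literal percent similarity. The following is a minimal, illustrative sentence-level sketch; it is not the Hub's exact implementation, which scores an entire test set at once:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Multiset of all n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(reference, hypothesis, max_n=4):
    """Simplified sentence-level BLEU on a 0-100 scale: geometric mean of
    clipped n-gram precisions, times a brevity penalty for short output."""
    ref = reference.split()
    hyp = hypothesis.split()
    precisions = []
    for n in range(1, max_n + 1):
        hyp_ngrams = ngrams(hyp, n)
        ref_ngrams = ngrams(ref, n)
        # Clipped overlap: an n-gram counts only as often as it appears in the reference
        overlap = sum((hyp_ngrams & ref_ngrams).values())
        total = max(sum(hyp_ngrams.values()), 1)
        # Tiny floor avoids log(0) when a higher-order n-gram never matches
        precisions.append(max(overlap, 1e-9) / total)
    log_avg = sum(math.log(p) for p in precisions) / max_n
    # Brevity penalty: penalize hypotheses shorter than the reference
    bp = 1.0 if len(hyp) > len(ref) else math.exp(1 - len(ref) / max(len(hyp), 1))
    return 100 * bp * math.exp(log_avg)

print(bleu("the cat sat on the mat", "the cat sat on the mat"))  # identical → 100.0
```

A perfect match scores 100, while partial overlap lands somewhere in between, which is why the Hub reports the score on a 0-to-100 scale.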