
Commit

update README to include "Microsoft COCO Captions: Data Collection and Evaluation Server"
tylin committed Apr 6, 2015
1 parent 612ce6b commit f1a9663
Showing 1 changed file with 5 additions and 4 deletions.
9 changes: 5 additions & 4 deletions README.md
@@ -12,12 +12,12 @@ Evaluation codes for MS COCO caption generation.
 - cocoEvalCapDemo.py (demo script)
 
 ./annotation
-- captions_val2014.json (COCO 2014 caption validation set)
-- More detials can be found under download tab on [COCO dataset](http://mscoco.org/dataset)
+- captions_val2014.json (MS COCO 2014 caption validation set)
+- Visit MS COCO [download](http://mscoco.org/dataset/#download) page for more details.
 
 ./results
-- captions_val2014_fakecap_results.json (example fake results for running demo)
-- More details can be found under evaluate->format tab on [COCO dataset](http://mscoco.org/dataset)
+- captions_val2014_fakecap_results.json (an example of fake results for running demo)
+- Visit MS COCO [format](http://mscoco.org/dataset/#format) page for more details.
 
 ./pycocoevalcap: The folder where all evaluation codes are stored.
 - evals.py: The file includes COCOEvalCap class that can be used to evaluate results on COCO.
@@ -29,6 +29,7 @@ Evaluation codes for MS COCO caption generation.
 
 ## References ##
 
+- [Microsoft COCO Captions: Data Collection and Evaluation Server](http://arxiv.org/abs/1504.00325)
 - PTBTokenizer: We use the [Stanford Tokenizer](http://nlp.stanford.edu/software/tokenizer.shtml) which is included in [Stanford CoreNLP 3.4.1](http://nlp.stanford.edu/software/corenlp.shtml).
 - BLEU: [BLEU: a Method for Automatic Evaluation of Machine Translation](http://delivery.acm.org/10.1145/1080000/1073135/p311-papineni.pdf?ip=72.229.132.206&id=1073135&acc=OPEN&key=4D4702B0C3E38B35%2E4D4702B0C3E38B35%2E4D4702B0C3E38B35%2E6D218144511F3437&CFID=644819513&CFTOKEN=71377947&__acm__=1426607117_16e4342fbc20d41c064c8fb685cffe60)
 - Meteor: [Project page](http://www.cs.cmu.edu/~alavie/METEOR/) with related publications. We use the latest version (1.5) of the [Code](https://github.com/mjdenkowski/meteor). Changes have been made to the source code to properly aggregate the statistics for the entire corpus.
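The diff above references a fake caption results file used to drive the demo. The results format expected by the MS COCO caption evaluation server is a JSON array of records, each pairing an `image_id` from the annotation set with a generated `caption`. A minimal sketch of producing and validating such a file (the file name and the `image_id` values below are made-up placeholders, not real validation-set ids):

```python
import json
import os
import tempfile

# Two placeholder results in the MS COCO caption results format:
# a JSON array of {"image_id": int, "caption": str} records.
fake_results = [
    {"image_id": 1, "caption": "a black and white photo of a street"},
    {"image_id": 2, "caption": "a group of people standing on a beach"},
]

# Write the results file (a stand-in for captions_val2014_fakecap_results.json).
results_path = os.path.join(tempfile.gettempdir(), "captions_fakecap_results.json")
with open(results_path, "w") as f:
    json.dump(fake_results, f)

# Reload and sanity-check the structure before handing the file to the
# evaluation code or the submission server.
with open(results_path) as f:
    loaded = json.load(f)

assert all({"image_id", "caption"} <= set(r) for r in loaded)
```

A file with this shape is what the demo loads as fake results for the 2014 caption validation set.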
