


Classes and methods for scoring processing modules.





ConfusionMatrix The confusion matrix between a list of reference values and a corresponding list of test values. 

AnnotationTask Represents an annotation task, i.e. people assign labels to items. 

TrigramAssocMeasures A collection of trigram association measures. 

NgramAssocMeasures An abstract class defining a collection of generic association measures. 

ContingencyMeasures Wraps NgramAssocMeasures classes such that the arguments of association measures are contingency table values rather than marginals. 

BigramAssocMeasures A collection of bigram association measures. 





























Given a list of reference values and a corresponding list of test probability distributions, return the average log likelihood of the reference values, given the probability distributions.
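As a rough sketch of the computation described above — the real function takes NLTK probability-distribution objects, but here each distribution is modelled as a plain dict mapping outcomes to probabilities, and the name `average_log_likelihood` is ours:

```python
import math

def average_log_likelihood(reference, test_dists):
    # Average base-2 log probability assigned to each reference value
    # by its corresponding test distribution. Assumes every reference
    # value receives nonzero probability (log(0) is undefined).
    total = 0.0
    for value, dist in zip(reference, test_dists):
        total += math.log2(dist[value])
    return total / len(reference)
```

The higher (closer to zero) the result, the more probability mass the test distributions place on the actual reference outcomes.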

Given a set of reference values and a set of test values, return the
fraction of reference values that appear in the test set. In particular,
return card(reference ∩ test) / card(reference).


Given a set of reference values and a set of test values, return the
fraction of test values that appear in the reference set. In particular,
return card(reference ∩ test) / card(test).


Returns an approximate significance level between two lists of independently generated test values. Approximate randomization calculates significance by randomly drawing from a sample of the possible permutations; at the limit of the number of possible permutations, the significance level is exact. The approximate significance level is the sample proportion of permutations for which the statistic of the permuted lists differs from the actual statistic of the unpermuted argument lists by at least as much as observed.
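The randomization idea can be sketched as follows. This is a hedged illustration, not the library's signature: the function name, the pairing assumption (elements at the same index are swapped as a pair), and the add-one smoothing of the returned level are all ours.

```python
import random

def approxrand(a, b, statistic, trials=1000, seed=0):
    # Paired approximate randomization test: randomly swap corresponding
    # elements of a and b, and count how often the permuted difference in
    # the statistic is at least as large as the observed difference.
    rng = random.Random(seed)
    observed = abs(statistic(a) - statistic(b))
    at_least_as_large = 0
    for _ in range(trials):
        perm_a, perm_b = [], []
        for x, y in zip(a, b):
            if rng.random() < 0.5:       # swap this pair
                perm_a.append(y)
                perm_b.append(x)
            else:
                perm_a.append(x)
                perm_b.append(y)
        if abs(statistic(perm_a) - statistic(perm_b)) >= observed:
            at_least_as_large += 1
    # smoothed significance level in (0, 1]
    return (at_least_as_large + 1) / (trials + 1)
```

A small returned value means the observed difference is rarely matched by chance permutations, i.e. the difference is significant.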

Given a set of reference values and a set of test values, return the
f-measure of the test values, when compared against the reference values.
The f-measure is the harmonic mean of the precision and recall,
weighted by alpha. In particular, given precision p and recall r,
the f-measure is 1 / (alpha/p + (1-alpha)/r).
If either the reference set or the test set is empty, None is returned.

Given a list of reference values and a corresponding list of test
values, return the fraction of corresponding values that are equal. In
particular, return the fraction of indices 0 <= i < len(test) such that
test[i] == reference[i].

Compute the windowdiff score for a pair of segmentations. A segmentation is any sequence over a vocabulary of two items (e.g. "0", "1"), where the specified boundary value is used to mark the edge of a segmentation.

    >>> s1 = "00000010000000001000000"
    >>> s2 = "00000001000000010000000"
    >>> s3 = "00010000000000000001000"
    >>> windowdiff(s1, s1, 3)
    0
    >>> windowdiff(s1, s2, 3)
    4
    >>> windowdiff(s2, s3, 3)
    16
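One way to reproduce those scores is the sliding-window count below. This is a sketch inferred from the expected outputs (window length k+1, raw unnormalized count); other windowdiff variants normalize by the number of windows.

```python
def windowdiff(seg1, seg2, k, boundary="1"):
    # Slide a window of length k+1 across both segmentations and count
    # the positions where the number of boundaries inside the window
    # differs between the two.
    if len(seg1) != len(seg2):
        raise ValueError("Segmentations have unequal length")
    wd = 0
    for i in range(len(seg1) - k):
        window1 = seg1[i : i + k + 1]
        window2 = seg2[i : i + k + 1]
        wd += window1.count(boundary) != window2.count(boundary)
    return wd
```

Intuitively, a near-miss boundary (off by one position, as between s1 and s2) is penalized only lightly, while boundaries in entirely different places (s2 vs. s3) incur the full window penalty.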

Calculate the Levenshtein edit distance between two strings. The edit distance is the number of characters that need to be substituted, inserted, or deleted, to transform s1 into s2. For example, transforming "rain" to "shine" requires three steps, consisting of two substitutions and one insertion: "rain" -> "sain" -> "shin" -> "shine". These operations could have been done in other orders, but at least three steps are needed.
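The standard dynamic-programming solution fills a table d where d[i][j] is the distance between the first i characters of s1 and the first j characters of s2 (a textbook sketch of the Wagner–Fischer algorithm, not the library's exact code):

```python
def edit_distance(s1, s2):
    m, n = len(s1), len(s2)
    # d[i][j] = edit distance between s1[:i] and s2[:j]
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i          # delete all of s1[:i]
    for j in range(n + 1):
        d[0][j] = j          # insert all of s2[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if s1[i - 1] == s2[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,        # deletion
                d[i][j - 1] + 1,        # insertion
                d[i - 1][j - 1] + cost, # substitution (or match)
            )
    return d[m][n]
```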

Krippendorff's interval distance metric.

    >>> interval_distance(1, 10)
    81

Krippendorff 1980, Content Analysis: An Introduction to its Methodology. 
Simple equality test. 0.0 if the labels are identical, 1.0 if they are different.

    >>> binary_distance(1, 1)
    0.0
    >>> binary_distance(1, 3)
    1.0
Distance metric that takes into account partial agreement when multiple labels are assigned.

    >>> masi_distance(set([1, 2]), set([1, 2, 3, 4]))
    0.5

Passonneau 2005, Measuring Agreement on Set-Valued Items (MASI) for Semantic and Pragmatic Annotation. 
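The three distance metrics above can be sketched from their definitions. These are illustrations, not the library's exact code; in particular the MASI sketch models the distance as 1 minus the Jaccard overlap of the two label sets, which reproduces the value in the doctest above, whereas published MASI also applies a monotonicity weight.

```python
def interval_distance(label1, label2):
    # squared difference between two numeric labels
    return (label1 - label2) ** 2

def binary_distance(label1, label2):
    # 0.0 if identical, 1.0 otherwise
    return 0.0 if label1 == label2 else 1.0

def masi_distance(label1, label2):
    # partial credit for overlapping label sets:
    # 1 minus the Jaccard overlap (a simplification; see lead-in)
    return 1.0 - len(label1 & label2) / len(label1 | label2)
```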
Returns the Spearman correlation coefficient for two rankings, which should be dicts or sequences of (key, rank). The coefficient ranges from -1.0 (ranks are opposite) to 1.0 (ranks are identical), and is only calculated for keys in both rankings (for meaningful results, remove keys present in only one list before ranking). 
Given a sequence of (key, score) tuples, yields each key with an increasing rank, tying with the previous key's rank if the difference between their scores is less than rank_gap. Suitable for use as an argument to spearman_correlation. 
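For rankings given as dicts mapping key to rank, the coefficient can be sketched with the classical formula rho = 1 - 6·Σd² / (n(n²-1)), where d is the rank difference for each shared key (an illustration of the formula, not the library's exact code; it assumes no tied ranks):

```python
def spearman_correlation(ranks1, ranks2):
    # Spearman's rho over the keys present in both rankings.
    n = 0
    d_squared = 0
    for key, rank1 in ranks1.items():
        if key in ranks2:
            n += 1
            d_squared += (rank1 - ranks2[key]) ** 2
    if n < 2:
        raise ValueError("Need at least two shared keys.")
    return 1.0 - 6.0 * d_squared / (n * (n * n - 1))
```

Identical rankings give 1.0; exactly reversed rankings give -1.0; keys present in only one ranking are ignored, as the description above notes.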


Generated by Epydoc 3.0.1 on Mon Apr 11 14:39:41 2011  http://epydoc.sourceforge.net 