Happy December! Christmas is just around the corner and our goals for this semester are also so close to being completed.
This week, I successfully ran the selected sentences through the trained model. I had to update the code a bit to get everything working together, but after running it, we are getting an accuracy of 97.07%, which is what we need.
The next thing I’ll need to accomplish is taking the data that was output to “test.probs” and transforming it into a matrix of ones and zeros, which Dani can use.
Each line of test.probs holds a pair of numbers, representing the probability of that particular pair of sentences entailing each other (1) or not (0).
If the probability that the sentences entail each other is higher than the probability that they don’t, I put a ‘1’ in that spot of the matrix; otherwise, I put a ‘0’.
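A minimal sketch of that thresholding step, assuming each line of test.probs holds two whitespace-separated probabilities with the “not entail” value first (the exact column order and filename are my assumptions, not confirmed by the output format):

```python
def probs_to_labels(lines):
    """Map each line of 'p_not_entail p_entail' to a 1 or a 0.

    Returns 1 when the entailment probability is higher,
    0 otherwise. Blank or malformed lines are skipped.
    """
    labels = []
    for line in lines:
        parts = line.split()
        if len(parts) < 2:
            continue  # skip blank or malformed lines
        p_not, p_entail = float(parts[0]), float(parts[1])
        labels.append(1 if p_entail > p_not else 0)
    return labels

# Usage with the real file (filename from the post):
# with open("test.probs") as f:
#     binary_row = probs_to_labels(f)
```

From there, the list of ones and zeros just needs to be reshaped into whatever matrix layout Dani expects.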