text (string column, lengths 13 to 30k)
{"text": "Secondly the << dynamical model >> is derived and trained based on this [[ intrinsic representation ]] .", "label": "USED-FOR", "metadata": [11, 12, 2, 3]}
{"text": "Thirdly the learned [[ intrinsic object structure ]] is integrated into a << particle-filter style tracker >> .", "label": "PART-OF", "metadata": [3, 5, 10, 12]}
{"text": "We will show that this intrinsic object representation has some interesting properties and based on which the newly derived [[ dynamical model ]] makes << particle-filter style tracker >> more robust and reliable .", "label": "USED-FOR", "metadata": [19, 20, 22, 24]}
{"feat_id": "13716782", "text": "Kelly: Oh! Oh! Can I pick the first question?\r\nJessica: Sure. Go for it!\r\nKelly: What's the scariest place you've been to!\r\nJessica: I'll start: Palermo in Italy.\r\nMickey: And what's so scary about that? Did you break your nail? :P\r\nJessica: Shut it, Mickey! No, there are the Capuchin Catacombs with 8000 corpses! \r\nKelly: Ewwww! Corpses? Rly?\r\nJessica: Yeah! And you can look at them like museum exhibits. I think they're divided somehow, but have no clue how!\r\nOllie: That's so cool! Do you get to see the bones or are they covered up?\r\nJessica: Well, partly. Most of them were exhibited in their clothes. Basically only skulls and hands. \r\nMickey: I'm writing this one down! That's so precious!\r\nOllie: Me too!", "target": "The scariest place for Jessica was the Capuchin Catacombs in Palermo.", "evaluation_predictions": [2, 115, 136, 613, 1351, 110, 108, 145, 491, 165, 120, 186, 117, 114, 575, 115, 109, 3955, 113, 109, 110, 105, 1309, 113, 523, 16837, 110, 107, 106, 145, 491, 165, 120, 186, 117, 114, 575, 115, 109, 3955, 113, 109, 110, 105, 1309, 113, 523, 16837, 110, 107, 106, 145, 491, 165, 120, 186, 117, 114, 575, 115, 109, 3955, 113, 109, 110, 105, 1309, 113, 523, 16837, 110, 107, 115, 136, 613, 1351, 110, 108, 106, 145, 491, 165, 120, 186, 117, 114, 575, 115, 109, 3955, 113, 109, 110, 105, 1309, 113, 523, 16837, 110, 107, 106, 145, 2426, 109, 3955, 113, 109, 110, 105, 1309, 113, 523, 16837, 130, 4083, 110, 151, 2650, 757, 23811, 3142, 241, 2650, 757, 23811, 740, 117, 109, 1309, 113, 523, 110, 108, 2650, 757, 23811, 522, 117, 109, 1309, 113, 523, 110, 108, 2650, 757, 23811, 726, 117, 109, 1309, 113, 523, 110, 108, 2650, 757, 23811, 914, 117, 109, 1309, 113, 523, 110, 108, 2650, 757, 23811, 1343, 117, 109, 1309, 113, 523, 110, 108, 2650, 757, 23811, 1717, 117, 109, 1309, 113, 523, 110, 108, 2650, 757, 23811, 1954, 117, 109, 1309, 113, 523, 110, 108, 2650, 757, 23811, 2000, 117, 109, 1309, 113, 523, 110, 108, 2650, 
757, 23811, 2507, 117, 109, 1309, 113, 523, 110, 108, 2650, 757, 23811, 2449, 117, 109, 1309, 113, 523, 110, 108, 2650, 757, 23811, 4363, 117, 109, 1309, 113, 523, 110, 108, 2650, 757, 23811, 3602, 117, 109, 1309, 113, 523, 110, 108, 2650, 757]}
{"feat_id": "13716592", "text": "Carrie: Just back from Fantastic Beast :)\r\nGina: and what do you think?\r\nCarrie: generally good - as usual nice special effect and visuals, an ok plot, a glimpse of the wizarding community in the US.\r\nAlex: Sounds cool. I was thinking of going this weekend with Lane, but I've seen some bad reviews.\r\nCarrie: Depends on what you expect really - I have a lot of sentiment towards Harry Potter so, I'm gonna like everything the do. But seriously the movie was decent. However, if you're expecting to have your mind blown, then no, it's not THAT good.\r\nGina: I agree. I saw it last week and basically I'm satisfied.\r\nAlex: No spoilers, girls.\r\nCarrie: no worries ;)\r\nCarrie: And Gina, what do you think about Eddie Redmayne as Newt?\r\nGina: I loved him <3 I loved how introverted and awkward he was and how caring he was towards the animals. And with all that he showed a lot of confidence in his beliefs and was a genuinely compassionate character\r\nCarrie: not your standard protagonist, that's for sure\r\nGina: and that's what I liked about him\r\nAlex: Maybe I'll go and see it sooner so we can all talk about it.\r\nCarrie: go see it. If' you're not expecting god-knows-what you're going to enjoy it ;)", "target": "Carrie and Gina saw \"Fantastic Beast\" and liked it. Ginna loved Eddie Redmayne as Newt. 
", "evaluation_predictions": [2, 220, 8876, 110, 151, 220, 8876, 110, 151, 180, 171, 119, 311, 110, 152, 110, 151, 180, 171, 119, 311, 113, 130, 144, 110, 152, 110, 151, 110, 116, 180, 171, 119, 311, 113, 130, 144, 110, 152, 110, 151, 110, 116, 180, 171, 119, 311, 113, 130, 144, 110, 152, 110, 151, 110, 116, 180, 368, 119, 311, 113, 130, 144, 110, 152, 110, 151, 110, 116, 180, 368, 119, 311, 113, 130, 144, 110, 152, 110, 151, 110, 116, 180, 368, 119, 311, 113, 130, 144, 110, 152, 110, 151, 110, 116, 180, 368, 119, 311, 113, 130, 144, 110, 152, 110, 151, 110, 116, 180, 368, 119, 311, 113, 130, 144, 110, 152, 110, 151, 110, 116, 180, 368, 119, 311, 113, 130, 144, 110, 152, 110, 151, 110, 116, 180, 368, 119, 311, 113, 130, 144, 110, 152, 110, 151, 110, 116, 180, 368, 119, 311, 113, 130, 144, 110, 152, 110, 151, 110, 116, 180, 368, 119, 311, 113, 130, 144, 110, 152, 110, 151, 110, 116, 180, 368, 119, 311, 113, 130, 144, 110, 152, 110, 151, 110, 116, 180, 368, 119, 311, 113, 130, 144, 110, 152, 110, 151, 110, 116, 180, 368, 119, 311, 113, 130, 144, 110, 152, 110, 151, 110, 116, 180, 368, 119, 311, 113, 130, 144, 110, 152, 110, 151, 110, 116, 180, 368, 119, 311, 113, 130, 144, 110, 152, 110, 151, 110, 116, 180, 368, 119, 311, 113, 130, 144, 110, 152, 110, 151, 110, 116, 180, 368, 119, 311, 113]}
{"feat_id": "13728528", "text": "Samantha: How are you doing today?\r\nRobyn: better, but I really exaggerated with bier last night\r\nSamantha: was it really only bier?\r\nRobyn: sure, why?\r\nSamantha: I don't know, your eyes, behaviour\r\nRobyn: you want to say I took drugs?\r\nSamantha: I'm only asking, I don't have anything against drugs\r\nRobyn: but I have and I never take them\r\nSamantha: ok, sorry, I didn't intend to offend you\r\nRobyn: I just drank too much, that's eat\r\nSamantha: much too much\r\nRobyn: yes, I lost control a bit\r\nRobyn: I am sorry for that\r\nSamantha: important that you feel better today", "target": "Robyn drank too much beer last night and lost control. She is feeling better today.", "evaluation_predictions": [2, 115, 136, 613, 1351, 145, 491, 165, 120, 110, 108, 10137, 112, 180, 117, 3163, 115, 109, 1560, 113, 109, 800, 110, 108, 186, 117, 220, 1812, 115, 109, 4413, 118, 109, 4657, 113, 109, 12584, 4511, 1298, 110, 107, 106, 145, 163, 491, 165, 120, 110, 108, 10137, 112, 180, 117, 3163, 115, 109, 800, 110, 108, 186, 117, 220, 1812, 115, 109, 4413, 118, 109, 4657, 113, 109, 12584, 4511, 1298, 110, 107, 115, 136, 613, 1351, 145, 491, 165, 120, 110, 108, 10137, 112, 180, 117, 3163, 115, 109, 800, 110, 108, 186, 117, 220, 1812, 115, 109, 4413, 118, 109, 4657, 113, 109, 12584, 4511, 1298, 110, 107, 110, 106, 1126, 1126, 1201, 110, 1100, 110, 1100, 115, 136, 613, 1351, 145, 491, 165, 120, 110, 108, 10137, 112, 180, 117, 3163, 115, 109, 800, 110, 108, 186, 117, 220, 1812, 115, 109, 4413, 118, 109, 4657, 113, 109, 12584, 4511, 1298, 110, 107, 106, 145, 163, 491, 165, 120, 110, 108, 10137, 112, 180, 117, 3163, 115, 109, 800, 110, 108, 186, 117, 220, 1812, 115, 109, 4413, 118, 109, 4657, 113, 109, 12584, 4511, 1298, 110, 107, 110, 106, 1126, 1126, 1201, 110, 1100, 110, 1100, 115, 136, 613, 1351, 145, 491, 165, 120, 110, 108, 10137, 112, 180, 117, 3163, 115, 109, 800, 110, 108, 186, 117, 220, 1812, 115, 109, 4413, 118, 109, 4657, 113, 109, 
12584, 4511, 1298, 110, 107, 110, 106, 1126, 1126, 1201, 110, 1100, 110, 1100, 115, 136, 613, 1351, 145, 491, 165]}
{"feat_id": "13813429", "text": "Peyton: I have been asking you to bring that video game for me\r\nCameron: Honey, I am not having enough time to come home\r\nPeyton: When would you come home?\r\nCameron: I will have to stay out of town for another week i guess\r\nPeyton: Cant you just deliver that game through the courier? :P\r\nCameron: Dont be mean :/\r\nPeyton: Get the job done and come to home then. ASAP :P", "target": "Peyton is expecting Cameron to bring the video game. Cameron will probably be out for another week.", "evaluation_predictions": [2, 115, 136, 1801, 110, 108, 145, 731, 124, 114, 692, 113, 109, 2650, 757, 23811, 3142, 110, 108, 2650, 757, 23811, 740, 110, 108, 2650, 757, 23811, 522, 110, 108, 2650, 757, 23811, 726, 110, 108, 2650, 757, 23811, 914, 110, 108, 2650, 757, 23811, 1343, 110, 108, 2650, 757, 23811, 1717, 110, 108, 2650, 757, 23811, 1954, 110, 108, 2650, 757, 23811, 2000, 110, 108, 2650, 757, 23811, 2507, 110, 108, 2650, 757, 23811, 2449, 110, 108, 2650, 757, 23811, 4363, 110, 108, 2650, 757, 23811, 3602, 110, 108, 2650, 757, 23811, 5517, 110, 108, 2650, 757, 23811, 5700, 110, 108, 2650, 757, 23811, 4262, 110, 108, 2650, 757, 23811, 5357, 110, 108, 2650, 757, 23811, 6113, 110, 108, 2650, 757, 23811, 4876, 110, 108, 2650, 757, 23811, 6601, 110, 108, 2650, 757, 23811, 3214, 110, 108, 2650, 757, 23811, 7090, 110, 108, 2650, 757, 23811, 8101, 110, 108, 2650, 757, 23811, 8791, 110, 108, 2650, 757, 23811, 6304, 110, 108, 2650, 757, 23811, 5247, 110, 108, 2650, 757, 23811, 5247, 110, 108, 2650, 757, 23811, 9965, 110, 108, 2650, 757, 23811, 9613, 110, 108, 2650, 757, 23811, 9169, 110, 108, 2650, 757, 23811, 10340, 110, 108, 2650, 757, 23811, 6695, 110, 108, 2650, 757, 23811, 12365, 110, 108, 2650, 757, 23811, 914, 110, 108, 2650, 757, 23811, 914, 110, 108, 2650, 757, 23811, 1343, 110, 108, 2650, 757, 23811, 914, 110, 108, 2650, 757, 23811, 1343, 110, 108, 2650, 757, 23811, 914, 110, 108, 2650, 757, 23811, 1343, 110, 108, 2650, 757, 23811]}
{"feat_id": "13731477", "text": "Gene: Did you get the package I sent you\r\nJack: No, when did you send it?\r\nGene: on Friday\r\nJack: shit I should have gotten it by now\r\nJack: send me the tracking umber I;ll check whats up\r\nGene: 12345678900", "target": "Jack has not received yet the package Gene had sent him on Friday. She sent him the tracking number, so he could check the status of the shipment. ", "evaluation_predictions": [2, 145, 731, 124, 109, 211, 5892, 113, 109, 2650, 757, 23811, 14333, 1384, 757, 23811, 17246, 1384, 757, 23811, 522, 11949, 3160, 113, 109, 2650, 757, 23811, 16971, 1384, 757, 23811, 914, 11949, 110, 107, 106, 109, 5892, 140, 266, 115, 109, 2210, 113, 2650, 757, 23811, 1343, 110, 108, 2650, 757, 23811, 1717, 110, 108, 2650, 757, 23811, 1954, 110, 108, 2650, 757, 23811, 2000, 110, 108, 2650, 757, 23811, 2507, 110, 108, 111, 2650, 757, 23811, 2449, 110, 107, 110, 106, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940]}
{"feat_id": "13730896", "text": "Russ: Hi Gurdun, are you feeling better?\r\nGurdun: Hi Ross, a little bit better, yes, thank you.\r\nRuss: Do you think you'll be back at the uni this week?\r\nGurdun: Not tomorrow for sure, but I hope Wednesday maybe.\r\nRuss: Oh, that's fine.\r\nGurdun: Yeah. I'm feeling a bit weak still\r\nRuss: I guess that's pretty normal at this stage.\r\nGurdun: I believe so. And the muscle pain.\r\nRuss: I know what you mean. I had flu like that three months ago.\r\nGurdun: Luckily the headache's gone. It was absolutely awful.\r\nRuss: True enough. I had the same.\r\nGurdun: Anyway, how are things with you?\r\nRuss: Same old, same old, you know. Not much happening in the middle of term.\r\nGurdun: And the team?\r\nRuss: Oh yeah, we won the last match. It was a good game.\r\nGurdun: Great. Listen Russ, I need to run. Catch up later.\r\nRuss: OK, talk to you later.", "target": "Gurdun has the flu. He is feeling better. Gurdun hopes he'll be back at the uni on Wednesday. Russ and his team won the last match. ", "evaluation_predictions": [2, 199, 171, 119, 2488, 109, 2028, 112, 114, 491, 124, 109, 4630, 110, 152, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]}
{"feat_id": "13730348", "text": "Susan: Will you be home tomorrow?\r\nRob: Why do you ask?\r\nSusan: We could have dinner together if you have time after work\r\nRob: Sure! Sounds great\r\nSusan: Fantastic! I will cook something special then!", "target": "Rob will be home tomorrow. Rob and Susan will have dinner together after Rob's work. Susan will cook something special.", "evaluation_predictions": [2, 145, 731, 124, 109, 211, 5892, 113, 109, 2650, 757, 23811, 14333, 1384, 757, 23811, 740, 111, 2650, 757, 23811, 11809, 1384, 757, 23811, 726, 3181, 9428, 113, 109, 2650, 757, 23811, 18143, 1384, 757, 23811, 1343, 111, 2650, 757, 23811, 25390, 1384, 757, 23811, 1954, 15896, 115, 2650, 757, 23811, 2000, 3509, 110, 108, 2650, 757, 23811, 2507, 3509, 110, 108, 2650, 757, 23811, 2449, 3509, 111, 2650, 757, 23811, 4363, 3509, 110, 108, 4802, 110, 108, 115, 2650, 757, 23811, 3602, 3509, 110, 108, 2650, 757, 23811, 5517, 3509, 110, 108, 2650, 757, 23811, 5700, 3509, 110, 108, 2650, 757, 23811, 4262, 3509, 110, 108, 2650, 757, 23811, 5357, 3509, 110, 108, 2650, 757, 23811, 6113, 3509, 110, 108, 2650, 757, 23811, 4876, 3509, 110, 108, 2650, 757, 23811, 6601, 3509, 110, 108, 2650, 757, 23811, 3214, 3509, 110, 108, 2650, 757, 23811, 7090, 3509, 110, 108, 2650, 757, 23811, 8101, 3509, 110, 108, 2650, 757, 23811, 8791, 3509, 110, 108, 2650, 757, 23811, 6304, 3509, 110, 108, 2650, 757, 23811, 5247, 3509, 110, 108, 2650, 757, 23811, 726, 110, 108, 2650, 757, 23811, 726, 110, 108, 2650, 757, 23811, 726, 110, 108, 2650, 757, 23811, 726, 110, 108, 2650, 757, 23811, 914, 110, 108, 2650, 757, 23811, 726, 110, 108, 2650, 757, 23811, 914, 110, 108, 2650, 757, 23811, 726, 110, 108, 2650, 757, 23811, 914, 110, 108, 2650, 757, 23811, 726, 110, 108, 2650, 757, 23811, 726, 110, 108, 2650, 757, 23811, 914, 110, 108, 2650, 757, 23811, 726, 110, 108]}
{"feat_id": "13731077", "text": "Evan: Hi Dennis, I need to cancel my lesson tomorrow, sorry.\r\nDennis: Fine, what's up?\r\nEvan: Oh, got the flu, been off college since Tuesday. Should be ok for next Thurs.\r\nDennis: OK, just tell me in good time if you can't make it. Hope you feel better soon.\r\nEvan: Bye, Den! Thanks.", "target": "Evan can't come to his lesson tomorrow, because he's got the flu.", "evaluation_predictions": [2, 145, 491, 165, 120, 110, 108, 115, 114, 878, 1288, 110, 108, 109, 3241, 113, 114, 2187, 117, 16760, 110, 107, 110, 106, 145, 491, 165, 120, 110, 108, 115, 114, 878, 1288, 110, 108, 109, 3241, 113, 114, 2187, 117, 16760, 110, 107, 110, 106, 145, 491, 165, 120, 110, 108, 115, 114, 878, 1288, 110, 108, 109, 3241, 113, 114, 2187, 117, 16760, 110, 107, 110, 106, 145, 491, 165, 120, 110, 108, 115, 114, 878, 1288, 110, 108, 109, 3241, 113, 114, 2187, 117, 16760, 110, 107, 110, 106, 145, 491, 165, 120, 110, 108, 115, 114, 878, 1288, 110, 108, 109, 3241, 113, 114, 2187, 117, 16760, 110, 107, 110, 106, 145, 491, 165, 120, 110, 108, 115, 114, 878, 1288, 110, 108, 109, 3241, 113, 114, 2187, 117, 16760, 110, 107, 110, 106, 145, 491, 165, 120, 110, 108, 115, 114, 878, 1288, 110, 108, 109, 3241, 113, 114, 2187, 117, 16760, 110, 107, 110, 106, 145, 491, 165, 120, 110, 108, 115, 114, 878, 1288, 110, 108, 109, 3241, 113, 114, 2187, 117, 16760, 110, 107, 110, 106, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110, 940, 110]}
{"feat_id": "13817550", "text": "Lorena: Hi, can you help me with something?\r\nMartin: Well, I can try.\r\nMartin: Depends what it is?\r\nLorena: I got a new desk and it comes with assembly instructions but I give up, I just can't do it.\r\nLorena: I'm illiterate when it comes to instructions, haha.\r\nLorena: So I could really use some help...\r\nMartin: Hmm, I can't today, but how about tomorrow? Shouldn't take long anyway.\r\nLorena: Yeah, it's fine! I just need it for the weekend, but tomorrow's great! Sorry for troubling you again.\r\nMartin: Don't mention it, we've known each other for so long it's almost like we're siblings, haha.\r\nLorena: Thanks a lot! I should be home by 6, so let me know when you can.", "target": "Martin is going to help Lorena assemble a new desk. He is coming to her house tomorrow.", "evaluation_predictions": [2, 145, 799, 114, 3816, 113, 114, 44515, 120, 11343, 109, 628, 113, 114, 2702, 112, 109, 344, 113, 5291, 120, 126, 137, 337, 110, 107, 106, 145, 3486, 120, 109, 344, 113, 5291, 120, 137, 129, 2394, 141, 114, 2702, 113, 114, 634, 628, 117, 114, 1074, 5448, 124, 109, 344, 113, 5291, 120, 137, 129, 2394, 141, 114, 2702, 113, 114, 291, 628, 110, 107, 106, 145, 403, 120, 109, 344, 113, 5291, 120, 137, 129, 2394, 141, 114, 2702, 113, 114, 291, 628, 117, 114, 1074, 5448, 124, 109, 344, 113, 5291, 120, 137, 129, 2394, 141, 114, 2702, 113, 114, 291, 628, 110, 107, 106, 145, 3486, 120, 109, 344, 113, 5291, 120, 137, 129, 2394, 141, 114, 2702, 113, 114, 291, 628, 117, 114, 1074, 5448, 124, 109, 344, 113, 5291, 120, 137, 129, 2394, 141, 114, 2702, 113, 114, 291, 628, 110, 107, 110, 106, 1126, 44515, 1100, 5282, 33208, 3372, 1126, 44515, 1100, 1114, 2967, 64101, 208, 1126, 44515, 1100, 17810, 4430, 208, 1126, 44515, 1100, 27694, 1126, 44515, 1100, 5409, 30540, 1126, 44515, 1100, 40330, 1126, 44515, 1100, 5409, 25730, 10415, 1126, 44515, 1100, 1572, 13413, 5797, 1126, 44515, 1100, 51795, 420, 2928, 1126, 44515, 1100, 29777, 1126, 44515, 
1100, 30412, 1126, 44515, 1100, 56515, 1126, 44515, 1100, 95007, 1126, 44515, 1100, 2957, 3580, 1126, 44515, 1100, 37453, 1126, 44515, 1100, 6948, 19459, 1126, 44515, 1100, 216, 9035, 1126, 44515, 1100, 48076, 1126, 44515, 1100, 94307, 145, 799, 114, 3816, 113, 114, 44515, 120, 11343, 109, 628, 113]}
{"feat_id": "13681441", "text": "Joanna: They are sending emails about Lewandowska.\r\nMerve: What happened?\r\nJoanna: <file_photo> \r\nMerve: Wooow!\r\nJoanna: She is hospitalized because she has measles.\r\nMerve: She had what?\r\nJoanna: Anyone who had contact with her within the last couple of days must get vaccinated.\r\nMerve: Luckily I didn't see her since the last semester...\r\nJoanna: I did, she is my thesis mentor :(\r\nMerve: What will you do?\r\nJoanna: They are organizing vaccinations in the main building from 17th until 19th.\r\nMerve: You have to go!\r\nJoanna: I know... And I just started working so I really don't have a lot of time.\r\nMerve: Come on, this is really important.\r\nJoanna: I will try to do it before work on 18th, hopefully I won't lose the entire day...", "target": "Lewandowska has measles. There are vaccinations in the main building from 17th until 19th for everyone who had contact with her. ", "evaluation_predictions": [2, 115, 136, 1801, 110, 108, 145, 845, 109, 645, 906, 110, 151, 117, 126, 921, 120, 110, 151, 143, 532, 110, 158, 109, 449, 113, 109, 327, 117, 3908, 110, 206, 143, 27333, 110, 158, 109, 449, 113, 109, 327, 117, 3908, 110, 206, 143, 51318, 110, 158, 109, 449, 113, 109, 327, 117, 3908, 110, 206, 143, 53301, 110, 158, 109, 449, 113, 109, 327, 117, 3908, 110, 206, 143, 2294, 110, 158, 109, 449, 113, 109, 327, 117, 3908, 110, 206, 143, 18604, 110, 158, 109, 449, 113, 109, 327, 117, 3908, 110, 206, 143, 110, 63631, 110, 158, 109, 449, 113, 109, 327, 117, 3908, 110, 206, 143, 18604, 110, 158, 109, 449, 113, 109, 327, 117, 3908, 110, 206, 143, 2294, 110, 158, 109, 449, 113, 109, 327, 117, 3908, 110, 206, 143, 18604, 110, 158, 109, 449, 113, 109, 327, 117, 3908, 110, 206, 143, 110, 63631, 110, 158, 109, 449, 113, 109, 327, 117, 3908, 110, 206, 143, 18604, 110, 158, 109, 449, 113, 109, 327, 117, 3908, 110, 206, 143, 2294, 110, 158, 109, 449, 113, 109, 327, 117, 3908, 110, 206, 143, 18604, 110, 158, 109, 449, 113, 109, 327, 
117, 3908, 110, 206, 143, 110, 63631, 110, 158, 109, 449, 113, 109, 327, 117, 3908, 110, 206, 143, 18604, 110, 158, 109, 449, 113, 109, 327, 117, 3908, 110, 206, 143, 2294, 110, 158, 109, 449, 113, 109, 327, 117, 3908, 110, 206, 143, 18604, 110, 158, 109, 449, 113, 109, 327, 117, 3908, 110, 206]}
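The summarization records above pair a chat log (`text`) with a reference summary (`target`) and a flat list of predicted token ids (`evaluation_predictions`); in several records that list visibly loops over the same id sequence. A small sketch of flagging such degenerate decoding by checking whether a fixed-size window of ids repeats back-to-back; the window size and repeat threshold are arbitrary choices of mine, not from the source:

```python
def looks_degenerate(ids, window=8, min_repeats=3):
    """Return True if some length-`window` slice of `ids` repeats
    `min_repeats` times in immediate succession."""
    for start in range(len(ids) - window * min_repeats + 1):
        chunk = ids[start:start + window]
        if all(ids[start + k * window:start + (k + 1) * window] == chunk
               for k in range(1, min_repeats)):
            return True
    return False

# A looping id sequence (length-4 pattern repeated) vs. a non-repeating one.
assert looks_degenerate([2, 650, 757, 23811] * 10, window=4)
assert not looks_degenerate(list(range(100)), window=4)
```

Run against the `evaluation_predictions` arrays above, a check like this would separate the looping outputs from predictions that actually vary.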
{"dialog": ["Say , Jim , how about going for a few beers after dinner ? ", " You know that is tempting but is really not good for our fitness . ", " What do you mean ? It will help us to relax . ", " Do you really think so ? I don't . It will just make us fat and act silly . Remember last time ? ", " I guess you are right.But what shall we do ? I don't feel like sitting at home . ", " I suggest a walk over to the gym where we can play singsong and meet some of our friends . ", " That's a good idea . I hear Mary and Sally often go there to play pingpong.Perhaps we can make a foursome with them . ", " Sounds great to me ! If they are willing , we could ask them to go dancing with us.That is excellent exercise and fun , too . ", " Good.Let ' s go now . ", " All right . "], "act": [3, 4, 2, 2, 2, 3, 4, 1, 3, 4], "emotion": [0, 0, 0, 0, 0, 0, 4, 4, 4, 4], "mention": [["dinner", "beer"], ["fitness", "know"], ["will", "mean", "relax"], ["will", "remember", "time"], ["guess", "sit", "home", "feel"], ["walk", "meet", "friend", "play", "gym"], ["hear", "idea", "play"], ["fun", "dance", "sound", "exercise"], [], []], "source": ["dinner", "beer"], "target": ["fun", "dance", "sound", "exercise"]}
{"dialog": ["Can you do push-ups ? ", " Of course I can . It's a piece of cake ! Believe it or not , I can do 30 push-ups a minute . ", " Really ? I think that's impossible ! ", " You mean 30 push-ups ? ", " Yeah ! ", " It's easy . If you do exercise everyday , you can make it , too . "], "act": [2, 1, 2, 2, 1, 1], "emotion": [0, 0, 6, 0, 0, 0], "mention": [["push"], ["cake", "push", "believe", "minute", "piece"], [], ["mean", "push"], [], ["everyday", "exercise"]], "source": ["push"], "target": ["everyday", "exercise"]}
{"dialog": ["Can you study with the radio on ? ", " No , I listen to background music . ", " What is the difference ? ", " The radio has too many comerials . ", " That's true , but then you have to buy a record player . "], "act": [2, 1, 2, 1, 1], "emotion": [0, 0, 0, 0, 0], "mention": [["radio", "study"], ["listen", "background", "music"], ["difference"], ["radio"], ["player", "buy", "record"]], "source": ["radio", "study"], "target": ["player", "buy", "record"]}
{"dialog": ["Hi , Becky , what's up ? ", " Not much , except that my mother-in-law is driving me up the wall . ", " What's the problem ? ", " She loves to nit-pick and criticizes everything that I do . I can never do anything right when she's around . ", " For example ? ", " Well , last week I invited her over to dinner . My husband and I had no problem with the food , but if you listened to her , then it would seem like I fed her old meat and rotten vegetables . There's just nothing can please her . ", " No , I can't see that happening . I know you're a good cook and nothing like that would ever happen . ", " It's not just that . She also criticizes how we raise the kids . ", " My mother-in-law used to do the same thing to us . If it wasn't disciplining them enough , then we were disciplining them too much . She also complained about the food we fed them , the schools we sent them too , and everything else under the sun . ", " You said she used to ? How did you stop her ? ", " We basically sat her down and told her how we felt about her constant criticizing , and how we welcomed her advice but hoped she'd let us do our things . She understood , and now everything is a lot more peaceful . ", " That sounds like a good idea . I'll have to try that . "], "act": [2, 1, 2, 1, 2, 1, 1, 1, 1, 2, 1, 1], "emotion": [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 4], "mention": [[], ["drive", "mother", "law", "wall"], ["problem"], ["pick", "love", "criticize", "nit"], [], ["husband", "listen", "meat", "week", "food", "dinner", "vegetable", "problem", "invite", "feed"], ["see", "know", "happen", "cook"], ["criticize", "kid", "raise"], ["discipline", "mother", "thing", "food", "use", "school", "law", "sun", "feed", "send", "complain"], ["use"], ["advice", "thing", "sit", "hope", "lot", "criticizing", "understand", "feel"], ["will", "idea", "sound"]], "source": ["drive", "mother", "law", "wall"], "target": ["will", "idea", "sound"]}
{"dialog": ["How are Zina's new programmers working out ? ", " I hate to admit it , but they're good . And fast . The Filipino kid is a genius . ", " So you'll make the Stars.com deadline , and have us up and running next week ? ", " It'll be close , but we'll make it . ", " Good . After Stars.com starts paying us , we won't need Vikam's cash anymore . ", " And if we don't need them , we won't need Zina , either . "], "act": [2, 1, 2, 1, 1, 1], "emotion": [0, 0, 0, 0, 0, 0], "mention": [["programmer", "work"], ["genius", "hate", "admit", "kid"], ["will", "week", "deadline"], ["will"], ["pay", "need", "good", "will", "cash", "start"], ["will", "need"]], "source": ["programmer", "work"], "target": ["will", "need"]}
{"q_id": "11j8u0", "title": "Is it possible that in the distant future, an organism that exists right now on Earth, will evolve to the point that it can be considered intelligent life? Which animal comes the closest? ", "selftext": "nOTHING", "document": "NONE", "subreddit": "askscience", "url": "http://www.reddit.com/r/askscience/comments/11j8u0/is_it_possible_that_in_the_distant_future_an/", "answers": {"a_id": ["c6my89v", "c6my8ed"], "score": [3, 2], "text": [" > Is it possible that in the distant future, an organism that exists right now on Earth, will evolve to the point that it can be considered intelligent life?\n\nYes, it's possible, but by no means is it a sure thing. Evolution doesn't have a goal in mind, nor does it have specific end points that are inherently better than others. Evolution adapts to the environment, and what works works, what doesn't dies off.\n > Which animal comes the closest?\n\nWe have a difficult time defining intelligence *within our own species*. The difficulty goes up astronomically when you look at other species.\n\nAre you talking about technic intelligence? It's unlikely anything in the ocean will ever develop technology (at least, as far as we can tell - without fire lots of tools are going to be difficult to make).\n\nSocial intelligence? Well, insects seem to have a big advantage there... but they're probably a ways off on the tool front.\n\netc, etc, etc.\n\n\nYou see the difficulties? ", "Evolution is not directed.\n\nWe can't answer this without speculation."]}, "title_urls": ["https://facebookresearch.github.io/ELI5/"], "selftext_urls": ["https://facebookresearch.github.io/ELI5/"], "answers_urls": [["https://facebookresearch.github.io/ELI5/"], ["https://facebookresearch.github.io/ELI5/"]]}
{"q_id": "yu0i8", "title": "When travelling fast why do things that we pass appear \"blurry\"? Is there a lack of light for us to create a clear picture?", "selftext": "I was wondering whether things appear blurry as we zoom past them is a result of a lack of light or the result of our brains and eyes not completing the picture", "document": "", "subreddit": "askscience", "url": "http://www.reddit.com/r/askscience/comments/yu0i8/when_travelling_fast_why_do_things_that_we_pass/", "answers": {"a_id": ["c5yt3dq"], "score": [10], "text": ["When something moves quickly in your field of vision, your visual cortex has a tough time distinguishing edges of the object that you're viewing. This is due to the speed that it is moving across your [retina](_URL_0_). When something moves across your retina, your photoreceptors are activated as it moves and as that speed increases, it's hard for your brain to distinguish the line between where your receptors are activated and where they aren't (which is one of the ways we see borders/details of objects).\n\nThat's a simplified explanation.\n\nIt is NOT because your brain has some \"maximum\" FPS that it can process. That is not true at all. Don't believe any maximum FPS that anyone says our brain/eye can process, it varies greatly."]}, "title_urls": [], "selftext_urls": [], "answers_urls": [["http://en.wikipedia.org/wiki/Retina"]]}
{"q_id": "sijx8", "title": "Could a cow survive in the wild?", "selftext": "Could the modern Holstein dairy cow survive in a temperate climate without any human assistance? Do cattle have enough wild-type characteristics that they would be able to live without assistance in a wild environment? Thanks.", "document": "", "subreddit": "askscience", "url": "http://www.reddit.com/r/askscience/comments/sijx8/could_a_cow_survive_in_the_wild/", "answers": {"a_id": ["c4ebwtn", "c4ec1j4", "c4ed1bh", "c4ed7a1", "c4eenia"], "score": [26, 17, 4, 6, 4], "text": ["Some milk cows are bred to over produce milk, this wouldn't necessarily make them die. I live on a cattle ranch in Wyoming, and other cows such as black Angus would be perfectly fine without people. Ranching in Wyoming generally consists of bringing them to mountain pasture for most of the year, where they survive for 6-8 months entirely alone.", "They would be easy prey for predators - where I live, wolf packs kill a lot of cows. Bears and mountain lions also kill cows. \n\nSome breeds of beef cows - like the Highland Cattle (_URL_0_) which a friend of mine raises, are really good at foraging in the wild and are more athletic and able to defend themselves. My friend's Highland Cattle are really cool, more like a real wild animal. They are pretty resourceful beasts and escape a lot.\n\nSo dairy cows might have trouble surviving, beef cows like Angus would do better, and Highland Cattle better still. As long as there weren't too many predators, they'd be fine. Over a few generations, they would start to become less domesticated, and after a few hundred generations, they would be wild.", "In Hawaii they escape into our forests from ranched areas and they exterminate them using helicopters every couple years.\n\nTemperatures just barely drop below freezing where this occurs. 
There is lots of grass and other vegetation to graze on.\n\nThey are leaner than the domestic cows as they get additional alfalfa feed.", "This really depends on where you live. Cattle raised in parts of Australia or some parts of the US are heavily dependent on humans as a source of food - simply because those areas are prone to extended drought and not much of anything grows there.\n\nAssuming that food supply was not an issue, some would likely not survive to maturity, but over several generations those that did survive would adapt to the specific environment. Eventually they would develop into a new species specific to that region.\n\nOther comments have addressed issues of milking & predators etc. so I won't repeat, except to say that in a country like Australia, New Zealand, or England there aren't a lot of large predators to worry about.\n\nSome other examples to look up where this has actually happened with other species are Pablo Escobar's Hippos in Colombia, Camels & Pigs in Australia, & Horses being brought to the Americas by the Spanish (now known as the Mustang).\n\nEdit: spelling.", "There is a herd of feral cattle in the Big Slough Wilderness, in Texas. I have come across them while hiking there. They were leaner than domestic animals, and surprisingly quiet. There were two bulls, 5 cows, and 4 calfs when I saw them in 1992. I heard from a friend who lives in the area that the herd was spotted in 2005, and was about the same size.\n\nThese, however, were not bred for dairy production, but for meat. "]}, "title_urls": [], "selftext_urls": [], "answers_urls": [[], ["https://en.wikipedia.org/wiki/Highland_cattle"], [], [], []]}
{"q_id": "grifu", "title": "Will humanity ever be able to destroy the universe itself?", "selftext": "Considering our remarkable progress in the field of destroying things, is it possible that some day humanity might have the means to destroy the whole universe in some way? ", "document": "", "subreddit": "askscience", "url": "http://www.reddit.com/r/askscience/comments/grifu/will_the_humanity_ever_be_able_to_destroy_the/", "answers": {"a_id": ["c1ps8jr", "c1ptckt"], "score": [3, 2], "text": ["Is it possible? Sure, why not.\n\nIs it probable? Not really. \n\nWe don't currently possess the means to travel even to the other end of our own galaxy, much less destroy it. And by the time we do, given our \"remarkable progress in the field of destroying things\", we'd most likely destroy ourselves before we could turn our fury on the rest of the universe.", "How different would a destroyed universe appear compared to a not-yet-destroyed universe?"]}, "title_urls": [], "selftext_urls": [], "answers_urls": [[], []]}
{"q_id": "1tklna", "title": "How does DNA get repaired?", "selftext": "What are the mechanisms of DNA repair?\n", "document": "", "subreddit": "askscience", "url": "http://www.reddit.com/r/askscience/comments/1tklna/how_does_dna_get_repaired/", "answers": {"a_id": ["ce8z5cl"], "score": [10], "text": ["There are various mechanisms in place here, so I'll start off by describing two examples of how DNA is repaired.\n\n**During DNA replication** - During DNA replication, which must occur before any cell can divide, there is a chance that the wrong base could get incorporated into the DNA molecule. In this case, bacterial as well as Eukaryotic (human) cells have a mechanism known as [DNA mismatch repair](_URL_0_). The five-cent version of this mechanism is that cellular machinery can distinguish between newly synthesized DNA and template DNA. This machinery has ways of detecting mismatched [base pairs](_URL_7_). This machinery is then able to degrade the new strand and insert the correct base.\n\n**After DNA replication** - mutations in our DNA can occur during a cell's lifetime, even while not replicating the DNA. Things such as ultraviolet light can cause damage to the [nucleotides](_URL_3_) which make up our DNA. [Nucleotide excision repair](_URL_3__excision_repair) mechanisms are employed in many cases. During the cell's lifetime, cells have special machinery that detects aberrations in DNA such as [pyrimidine dimers](_URL_2_). Specific nucleases are engaged, which degrade the DNA (at the site of the lesion) and specialized [DNA polymerases](_URL_1__polymerase) can be used to fill in the gap in the DNA left by the nucleases.\n\nBonus reading if you are interested in this kind of stuff (may also help you understand the answer better)\n\n1. [DNA](_URL_1_)\n2. 
[DNA replication](_URL_1__replication)\n\nHope this helped, I'll answer any questions to the best of my ability."]}, "title_urls": [], "selftext_urls": [], "answers_urls": [["http://en.wikipedia.org/wiki/DNA_mismatch_repair", "http://en.wikipedia.org/wiki/DNA", "http://en.wikipedia.org/wiki/Pyrimidine_dimer", "http://en.wikipedia.org/wiki/Nucleotide", "http://en.wikipedia.org/wiki/DNA_polymerase", "http://en.wikipedia.org/wiki/DNA_replication", "http://en.wikipedia.org/wiki/Nucleotide_excision_repair", "http://en.wikipedia.org/wiki/Base_pair"]]}
{"q_id": "1ybuc0", "title": "When my ears \"tune out\" sounds at night, are there any physical changes within my ears occurring, or is it simply the brain at work?", "selftext": "I want to fully understand the concept of \"tuning out\" ", "document": "", "subreddit": "askscience", "url": "http://www.reddit.com/r/askscience/comments/1ybuc0/when_my_ears_tune_out_sounds_at_night_are_there/", "answers": {"a_id": ["cfjbpmo", "cfk0mwv"], "score": [3, 4], "text": ["Related to this, but not an answer: I have problems sleeping sometimes and so I'm sometimes very sleepy, so sleepy that I can fall asleep momentarily when I don't want to. In that state, I can sense that my hearing actually turns off right before I fall asleep. I'm only aware of it for a fraction of a second, like maybe a quarter of a second, but all ambient sounds definitely cease, then I fall asleep for a fraction of a second, then wake back up.", "Your brain never truly \"turns off\", as it is always receiving stimuli from the environment around you and making sense of them. When you are asleep your brain doesn't just turn off your auditory cortex; if it did, your alarm in the morning would never wake you up. However, certain senses can be dimmed while another is being heavily used. For example, when you hear a sound in a dim room and you're scared, your hearing suddenly becomes the priority, and senses such as smell and sight may degrade slightly. \n\nSo to answer your question, there aren't many physical changes occurring in the ear which cause this; it is your brain. Also, \"tuning out\" is most likely a form of day-dreaming. You may not even be tired, yet your body relaxes and you concentrate heavily on the thoughts in your head, dulling your other senses, so you may completely not hear a friend's sentence from a foot away. "]}, "title_urls": [], "selftext_urls": [], "answers_urls": [[], []]}
{"q_id": "clgrjd", "title": "Are band gaps experimentally measured or can they be predicted to a certain precision?", "selftext": "I understand what band gaps are, how they differ for conductors, insulators, and semi-conductors, and generally why they exist, though I have not taken quantum, electrodynamics, or modern physics, so my knowledge may be incomplete.", "document": "", "subreddit": "askscience", "url": "https://www.reddit.com/r/askscience/comments/clgrjd/are_band_gaps_experimentally_measured_or_can_they/", "answers": {"a_id": ["evw42f9", "evymdmi", "ew01kwg"], "score": [6, 2, 2], "text": ["Bandgaps can also exist in acoustics or general electrodynamics. Generally, in periodic systems such as atomic lattices, waves propagating through them will not be able to assume certain frequencies because they destructively interfere with themselves when they interact with the structure. Because frequency is related to energy, in crystal lattices you can get forbidden energies. In practice the lattice acts as a band-stop filter for waves propagating through them. So you can measure them, but you can also accurately predict them using just math. The Kronig-Penney model is such an example.", "There are many, many computational techniques, such as density functional theory, that can be used to calculate the band-structure of a material (including band-gap). They can also be measured, for example, by looking at the absorption vs. wavelength.", "Both.\n\nIn an ideal theorist's toy model (infinite periodic lattice without strong interactions between atoms) the electronic structure can be calculated analytically. 
In more realistic systems there are more advanced models and techniques for calculating bandgaps, and there's been a lot of interest in developing a way to calculate band gaps using [density functional theory](_URL_0_) although it tends to underestimate the actual gap somewhat.\n\nExperimentally, one of the common ways of finding the bandgap of a solid is through optical methods where you illuminate the sample with different frequencies and measure how much is absorbed to extract the minimum energy that can excite a transition, such as described [here](_URL_1_)"]}, "title_urls": [], "selftext_urls": [], "answers_urls": [[], [], ["https://en.wikipedia.org/wiki/Density_functional_theory", "https://archive.cnx.org/contents/d1c4c0c9-7bcb-4245-b320-6d6fdb3ade9e@1/band-gap-measurement"]]}
{"q_id": "1zdf17", "title": "What would happen if all the matter in the universe was suddenly converted into antimatter? Would it still work the same way?", "selftext": "So, going through some nuclear physics, I questioned this very debatable thought about the existence of an antimatter universe.\nBut that is not what I am asking. What I want to know is whether the basic laws of physics will still be valid: will electricity continue to flow, will objects still fall, or does the concept of negative gravity come into the picture?\nOr is this just a blasphemous hypothetical question?\nEither way, I am just very curious to know.", "document": "", "subreddit": "askscience", "url": "http://www.reddit.com/r/askscience/comments/1zdf17/what_would_happen_if_all_the_matter_in_the/", "answers": {"a_id": ["cfsr8f8", "cfsskwf"], "score": [16, 2], "text": ["Charge conjugation is the operation of replacing a particle with its anti-particle. If physical laws remain the same under such an operation, they are said to be [C-symmetric](_URL_1_).\n\nThe weak force is known to couple only to left-handed fermions and right-handed anti-fermions (I'm talking about [chirality](_URL_1_) when I say \"handedness\". Not to be confused with [helicity](_URL_3_)). If we perform charge conjugation on a left-handed fermion, we get a left-handed anti-fermion, which would not couple to the weak force. Therefore the weak force is C-violating.\n\nSimilarly if we perform a spatial inversion, or parity operation (think of it like a mirror) it's easy to see that the weak force is also P-violating.\n\nHow about if we do both? Replace all left-handed fermions with right-handed anti-fermions? This is known as CP-conjugation, and any law invariant under such a transformation is CP-symmetric.\n\nThe weak force, as it turns out, is [CP-violating](_URL_2_). 
Neither the strong nor electromagnetic forces have been measured to be CP-violating, and [gravity](_URL_0_) is bloody hard to measure.\n\nThis being said, the amount of CP-violation in the weak force is tiny: it's certainly not enough to explain why there is so much more matter than antimatter. Therefore a universe replaced with its CP-conjugate would be largely the same, with tiny differences in weak-force-mediated processes.", "I vaguely recall reading an article that declared \"parity is not conserved\" -- essentially that some nuclear breakdowns would behave differently. This was the result of an experiment with normal matter but it makes sense to presume antimatter would behave similarly. \n \n_URL_0_"]}, "title_urls": [], "selftext_urls": [], "answers_urls": [["https://en.wikipedia.org/wiki/Gravitational_interaction_of_antimatter", "https://en.wikipedia.org/wiki/C-symmetry", "https://en.wikipedia.org/wiki/CP_violation", "https://en.wikipedia.org/wiki/Helicity_%28particle_physics%29"], ["http://hyperphysics.phy-astr.gsu.edu/hbase/quantum/parity.html"]]}
{"q_id": "17th4n", "title": "Why don't we use freezing to desalinate water?", "selftext": "I am wondering about the energy costs of making potable water by heating methods compared to freezing it. \n\nMy line of reasoning is that a byproduct of the cooling would be useful in tropical climates and the final product could be heated by the ambient air before use.\n\nOne of the costs of producing potable water with heat is the cost of cooling it before use.", "document": "", "subreddit": "askscience", "url": "http://www.reddit.com/r/askscience/comments/17th4n/why_dont_we_use_freezing_to_desalinate_water/", "answers": {"a_id": ["c88p305", "c88p3hj", "c88pgtt", "c88wil6"], "score": [56, 37, 7, 2], "text": [" > One of the costs of producing potable water with heat is the cost of cooling it before use.\n\nWhere did you get that idea? You recycle the heat with a counterflow heat exchanger. The outgoing stream heats the incoming one. \n\n", "Some salt gets trapped in the crystal lattice, so the re-melted water won't be free of salt.\n\nCooling is also much more energy intensive than heating and not as easily achieved in places with poor water purification systems.\n\n*Edit:* Just to clarify something too...\n\nDesalinization plants don't always use distillation to desalinate water; they often use reverse osmosis. Vaporizing a large volume of water is tremendously expensive and should be avoided when possible.\n\nWhen we boil water to make it potable, we're holding it at 212F to kill any microorganisms that may be present. This does not significantly change the dissolved salt concentration since most of the water never leaves the liquid phase.", "When water boils, it evaporates, but it doesn't take the salt with it. Salt's boiling point is very, very high ( > 1000 degrees F or C), so it stays put where it started. On the other hand, as Lithium said, salty water freezes at a depressed point... but it will freeze. 
There is no way to sequester water from salinated water, as salts dissolve so readily in water.", "Freezing would be very inefficient in terms of energy and effectiveness. The salinity would mean that the water would have to be much colder than normal, as low as 0 F, in order to freeze. Also, it is much easier to heat something than to cool it.\n\nEven if it did freeze, this wouldn't guarantee that the water would be salt free since the water molecules can still freeze around salt ions."]}, "title_urls": [], "selftext_urls": [], "answers_urls": [[], [], [], []]}
{"q_id": "1id6dy", "title": "To what degree can modern man manipulate the weather?", "selftext": "In general. This might be better over in /r/asksciencediscussion.", "document": "", "subreddit": "askscience", "url": "http://www.reddit.com/r/askscience/comments/1id6dy/to_what_degree_can_modern_man_manipulate_the/", "answers": {"a_id": ["cb3b2dj", "cb3me96"], "score": [2, 3], "text": ["[Cloud seeding](_URL_0_)\n\nSome people say China controls the weather to make it sunny on days there are parades - if it rains for a day before, it is less likely to rain the day of the parade.", "Very minimally, as it turns out.\n\nWhile humans definitely have an impact on weather -- regionally through actions such as deforestation, which alters the local evapotranspiration rate, and globally through the emission of anthropogenic greenhouse gases -- our ability to control weather is still in its infancy. The main reason lies with the scale of weather events: temperature and winds are often synoptic (a few hundred km) and are thus beyond what we can practically modify.\n\nCloud seeding is the only plausible manipulation that comes to mind, and yet even that is controversial. To begin with, cloud seeding only induces existing clouds of the appropriate types to precipitate; it does not create rain out of, say, a clear sky. Furthermore, as I understand it, although seeding works well in the lab, field experiments have generally failed to produce statistically significant increases in rainfall. The only exceptions I'm aware of are the experiments conducted by [Hydro Tasmania](_URL_0_) (see the articles under 'Studies' in the right panel of the page), which is possibly explained by the pristine environment (and hence lowered quantities of aerosols onto which cloud droplets can form)."]}, "title_urls": [], "selftext_urls": [], "answers_urls": [["http://en.wikipedia.org/wiki/Cloud_seeding"], ["http://www.hydro.com.au/water/cloud-seeding"]]}
{"text": "Brush, brush, it's time to brush your teeth\nYes, yes, yes I want to brush my teeth\nGood, good, brushing is good for you\nYay, yay, yay, I like it, oooh\nSee, see, Teddy likes to brush\nOne, two, three, almost clean you see\nYes, yes, yes, you see, I'm all done now!\nBrush, brush, brush, they're all clean, wow!\nBath, bath, it's time to take a bath\nYes, yes, yes, I want to take a bath\nGood, good, a bath is good for you\nYay, yay, yay, I like it ooh\nSee, see, Elephant likes the bath\nOne, two, three, almost clean you see\nYes, yes, yes, you see, I'm all done now!\nSplash, splash, splash, we like it wow!!\nPajamas, pajamas, it's time to wear pajamas\nYes, yes, yes, I want to wear pajamas\nGood, good, pajamas are good for you\nYay, yay, yay I like them, ooh\nSee, see, Monkey likes pajamas\nOne, two, three, almost on you see\nYes, yes, yes, I'll put them on right now!\nSoft, soft, soft, we like them wow!!\nBed, bed, it's time to get in bed\nYes, yes, yes, I want to get in bed\nGood, good, sleep is good for you\nYay, yay, yay, I like it, oooh\nSee, see, Mousie likes the bed\nOne, two, three, cozy warm you see\nYes, yes, yes, I'm warm and happy now!\nCozy, cozy, cozy, we like it wow!!\nStory, story, story, it's time to read a story\nYes, yes, yes, I want to read a story\nGood, good, the story's fun for you\nYay, yay, yay, I like it, oooh\nSee, see, Teddy reads the story\nOne, two, three, almost done you see\nYes, yes, yes, the story is all done now!\nFun, fun, fun, we like it, wow!!!"}
{"text": "Row, row, row your boat\nTreasures down the stream\nUp, down, up, down\nLet's all row and sing\nRow, row, row your boat\nGently down the stream\nMerrily, merrily, merrily, merrily\nLife is but a dream\nRow, row, row your boat\nTreasures down the stream\nUp, down, up, down\nLet's all row and sing\nRow, row, row your boat\nGently down the stream\nMerrily, merrily, merrily, merrily\nLife is but a dream\nRow, row, row your boat\nTreasures down the stream\nUp, down, up, down\nLet's all row and sing\nRow, row, row your boat\nGently down the stream\nMerrily, merrily, merrily, merrily\nLife is but a dream"}
{"text": "This is the way we make our bed\nMake our bed, make our bed\nThis is the way we make our bed\nSo early in the morning\nThis is the way we wash our face\nWash our face, wash our face\nThis is the way we wash our face\nSo early in the morning\nThis is the way we brush our teeth\nBrush our teeth, brush our teeth\nThis is the way we brush our teeth\nSo early in the morning\nThis is the way we put on our clothes\nPut on our clothes, put on our clothes\nThis is the way we put on our clothes\nSo early in the morning\nThis is the way we comb our hair\nComb our hair, comb our hair\nThis is the way we comb our hair\nSo early in the morning\nThis is the way we put on our shoes\nPut on our shoes, put on our shoes\nThis is the way we put on our shoes\nSo early in the morning"}
{"text": "Humpty Dumpty sat on a wall\nHumpty Dumpty had a great fall\nAll the king's horses and all the king's men\nCouldn't put Humpty together again\nHumpty Dumpty sat on a wall\nHumpty Dumpty had a great fall\nAll the king's horses and all the king's men\nCouldn't put Humpty together again\nHumpty Dumpty sat on a wall\nHumpty Dumpty had a great fall\nAll the king's horses and all the king's men\nCouldn't put Humpty together again\nHumpty Dumpty sat on a wall\nHumpty Dumpty had a great fall\nAll the king's horses and all the king's men\nCouldn't put Humpty together again"}
{"id": "42953", "img": "img/42953.png", "label": 0, "text": "its their character not their color that matters"}
{"id": "23058", "img": "img/23058.png", "label": 0, "text": "don't be afraid to love again everyone is not like your ex"}
{"id": "13894", "img": "img/13894.png", "label": 0, "text": "putting bows on your pet"}
{"id": "37408", "img": "img/37408.png", "label": 0, "text": "i love everything and everybody! except for squirrels i hate squirrels"}
{"id": "82403", "img": "img/82403.png", "label": 0, "text": "everybody loves chocolate chip cookies, even hitler"}
{"id": "16952", "img": "img/16952.png", "label": 0, "text": "go sports! do the thing! win the points!"}
{"id": "76932", "img": "img/76932.png", "label": 0, "text": "fine you're right. now can we fucking drop it?"}
{"id": "70914", "img": "img/70914.png", "label": 1, "text": "tattoos are bad for your health i know 5 million people that had tattoos and they all died"}
{"id": "02973", "img": "img/02973.png", "label": 0, "text": "how long can i run? till the chain tightens"}
{"id": "58306", "img": "img/58306.png", "label": 0, "text": "what is he hiding? we need to see his tax returns! let me stop you right there hillary you deleted 30,000 emails, used bleach bit on hard drives, then destroyed phones with hammers you have no right to talk about people hiding anything truth uncensored"}
{"method": "def __init__(self, range_str):\n<TAB>self.range_str = to_str(range_str)\n<TAB>self.range = set()\n<TAB>range_str = to_str(range_str).split(\",\")\n<TAB>for item in range_str:\n<TAB><TAB>try:\n<TAB><TAB><TAB>int_range = item.split(\"-\")\n<TAB><TAB><TAB>if len(int_range) == 1:\n<TAB><TAB><TAB><TAB>if item:\n<TAB><TAB><TAB><TAB><TAB>self.range.add(int(item))\n<TAB><TAB><TAB>elif len(int_range) == 2:\n<TAB><TAB><TAB><TAB>int_range[0] = int(int_range[0])\n<TAB><TAB><TAB><TAB>int_range[1] = int(int_range[1])\n<TAB><TAB><TAB><TAB>if int_range[0] < 0:\n<TAB><TAB><TAB><TAB><TAB>int_range[0] = 0\n<TAB><TAB><TAB><TAB>if int_range[1] > 65535:\n<TAB><TAB><TAB><TAB><TAB>int_range[1] = 65535\n<TAB><TAB><TAB><TAB>i = int_range[0]\n<TAB><TAB><TAB><TAB>while i <= int_range[1]:\n<TAB><TAB><TAB><TAB><TAB>self.range.add(i)\n<TAB><TAB><TAB><TAB><TAB>i += 1\n<TAB><TAB>except Exception as e:\n<TAB><TAB><TAB>logging.error(e)\n", "block": "<TAB><TAB>try:\n<TAB><TAB><TAB>int_range = item.split(\"-\")\n<TAB><TAB><TAB>if len(int_range) == 1:\n<TAB><TAB><TAB><TAB>if item:\n<TAB><TAB><TAB><TAB><TAB>self.range.add(int(item))\n<TAB><TAB><TAB>elif len(int_range) == 2:\n<TAB><TAB><TAB><TAB>int_range[0] = int(int_range[0])\n<TAB><TAB><TAB><TAB>int_range[1] = int(int_range[1])\n<TAB><TAB><TAB><TAB>if int_range[0] < 0:\n<TAB><TAB><TAB><TAB><TAB>int_range[0] = 0\n<TAB><TAB><TAB><TAB>if int_range[1] > 65535:\n<TAB><TAB><TAB><TAB><TAB>int_range[1] = 65535\n<TAB><TAB><TAB><TAB>i = int_range[0]\n<TAB><TAB><TAB><TAB>while i <= int_range[1]:\n<TAB><TAB><TAB><TAB><TAB>self.range.add(i)\n<TAB><TAB><TAB><TAB><TAB>i += 1\n<TAB><TAB>except Exception as e:\n<TAB><TAB><TAB>logging.error(e)", "complex_masked_block": "<TAB><TAB>try:\n<TAB><TAB><TAB>int_range = item.split(\"-\")\n<TAB><TAB><TAB>if len(int_range) == 1:\n<TAB><TAB><TAB><TAB>if item:\n<TAB><TAB><TAB><TAB><TAB>self.range.add(int(item))\n<TAB><TAB><TAB>elif len(int_range) == 2:\n<TAB><TAB><TAB><TAB>int_range[0] = 
int(int_range[0])\n<TAB><TAB><TAB><TAB>int_range[1] = int(int_range[1])\n<TAB><TAB><TAB><TAB>if int_range[0] < 0:\n<TAB><TAB><TAB><TAB><TAB>int_range[0] = 0\n<TAB><TAB><TAB><TAB>if int_range[1] > 65535:\n<TAB><TAB><TAB><TAB><TAB>int_range[1] = 65535\n<TAB><TAB><TAB><TAB>i = int_range[0]\n<TAB><TAB><TAB><TAB>while i <=<MASK>", "complex_input": "def __init__(self, range_str): <TAB>self.range_str = to_str(range_str) <TAB>self.range = set() <TAB>range_str = to_str(range_str).split(\",\") <TAB>for item in range_str: <TAB><TAB>try: <TAB><TAB><TAB>int_range = item.split(\"-\") <TAB><TAB><TAB>if len(int_range) == 1: <TAB><TAB><TAB><TAB>if item: <TAB><TAB><TAB><TAB><TAB>self.range.add(int(item)) <TAB><TAB><TAB>elif len(int_range) == 2: <TAB><TAB><TAB><TAB>int_range[0] = int(int_range[0]) <TAB><TAB><TAB><TAB>int_range[1] = int(int_range[1]) <TAB><TAB><TAB><TAB>if int_range[0] < 0: <TAB><TAB><TAB><TAB><TAB>int_range[0] = 0 <TAB><TAB><TAB><TAB>if int_range[1] > 65535: <TAB><TAB><TAB><TAB><TAB>int_range[1] = 65535 <TAB><TAB><TAB><TAB>i = int_range[0] <TAB><TAB><TAB><TAB>while i <=<MASK> ", "complex_target": "int_range[1]: <TAB><TAB><TAB><TAB><TAB>self.range.add(i) <TAB><TAB><TAB><TAB><TAB>i += 1 <TAB><TAB>except Exception as e: <TAB><TAB><TAB>logging.error(e)", "medium_masked_block": "<TAB><TAB>try:\n<TAB><TAB><TAB>int_range = item.split(\"-\")\n<TAB><TAB><TAB>if len(int_range) == 1:\n<TAB><TAB><TAB><TAB>if item:\n<TAB><TAB><TAB><TAB><TAB>self.range.add(int(item))\n<TAB><TAB><TAB>elif len(int_range) == 2:\n<TAB><TAB><TAB><TAB>int_range[0] = int(int_range[0])\n<TAB><TAB><TAB><TAB>int_range[1] = int(int_range[1])\n<TAB><TAB><TAB><TAB>if int_range[0] < 0:\n<TAB><TAB><TAB><TAB><TAB>int_range[0] = 0\n<TAB><TAB><TAB><TAB>if int_range[1] > 65535:\n<TAB><TAB><TAB><TAB><TAB>int_range[1] = 65535\n<TAB><TAB><TAB><TAB>i = int_range[0]\n<TAB><TAB><TAB><TAB>while i <= int_range[1]:\n<TAB><TAB><TAB><TAB><TAB>self.range.add(i)\n<TAB><TAB><TAB><TAB><TAB>i<MASK>", "medium_input": "def 
__init__(self, range_str): <TAB>self.range_str = to_str(range_str) <TAB>self.range = set() <TAB>range_str = to_str(range_str).split(\",\") <TAB>for item in range_str: <TAB><TAB>try: <TAB><TAB><TAB>int_range = item.split(\"-\") <TAB><TAB><TAB>if len(int_range) == 1: <TAB><TAB><TAB><TAB>if item: <TAB><TAB><TAB><TAB><TAB>self.range.add(int(item)) <TAB><TAB><TAB>elif len(int_range) == 2: <TAB><TAB><TAB><TAB>int_range[0] = int(int_range[0]) <TAB><TAB><TAB><TAB>int_range[1] = int(int_range[1]) <TAB><TAB><TAB><TAB>if int_range[0] < 0: <TAB><TAB><TAB><TAB><TAB>int_range[0] = 0 <TAB><TAB><TAB><TAB>if int_range[1] > 65535: <TAB><TAB><TAB><TAB><TAB>int_range[1] = 65535 <TAB><TAB><TAB><TAB>i = int_range[0] <TAB><TAB><TAB><TAB>while i <= int_range[1]: <TAB><TAB><TAB><TAB><TAB>self.range.add(i) <TAB><TAB><TAB><TAB><TAB>i<MASK> ", "medium_target": "+= 1 <TAB><TAB>except Exception as e: <TAB><TAB><TAB>logging.error(e)", "simple_masked_block": "<TAB><TAB>try:\n<TAB><TAB><TAB>int_range = item.split(\"-\")\n<TAB><TAB><TAB>if len(int_range) == 1:\n<TAB><TAB><TAB><TAB>if item:\n<TAB><TAB><TAB><TAB><TAB>self.range.add(int(item))\n<TAB><TAB><TAB>elif len(int_range) == 2:\n<TAB><TAB><TAB><TAB>int_range[0] = int(int_range[0])\n<TAB><TAB><TAB><TAB>int_range[1] = int(int_range[1])\n<TAB><TAB><TAB><TAB>if int_range[0] < 0:\n<TAB><TAB><TAB><TAB><TAB>int_range[0] = 0\n<TAB><TAB><TAB><TAB>if int_range[1] > 65535:\n<TAB><TAB><TAB><TAB><TAB>int_range[1] = 65535\n<TAB><TAB><TAB><TAB>i = int_range[0]\n<TAB><TAB><TAB><TAB>while i <= int_range[1]:\n<TAB><TAB><TAB><TAB><TAB>self.range.add(i)\n<TAB><TAB><TAB><TAB><TAB>i += 1\n<TAB><TAB>except Exception as<MASK>", "simple_input": "def __init__(self, range_str): <TAB>self.range_str = to_str(range_str) <TAB>self.range = set() <TAB>range_str = to_str(range_str).split(\",\") <TAB>for item in range_str: <TAB><TAB>try: <TAB><TAB><TAB>int_range = item.split(\"-\") <TAB><TAB><TAB>if len(int_range) == 1: <TAB><TAB><TAB><TAB>if item: 
<TAB><TAB><TAB><TAB><TAB>self.range.add(int(item)) <TAB><TAB><TAB>elif len(int_range) == 2: <TAB><TAB><TAB><TAB>int_range[0] = int(int_range[0]) <TAB><TAB><TAB><TAB>int_range[1] = int(int_range[1]) <TAB><TAB><TAB><TAB>if int_range[0] < 0: <TAB><TAB><TAB><TAB><TAB>int_range[0] = 0 <TAB><TAB><TAB><TAB>if int_range[1] > 65535: <TAB><TAB><TAB><TAB><TAB>int_range[1] = 65535 <TAB><TAB><TAB><TAB>i = int_range[0] <TAB><TAB><TAB><TAB>while i <= int_range[1]: <TAB><TAB><TAB><TAB><TAB>self.range.add(i) <TAB><TAB><TAB><TAB><TAB>i += 1 <TAB><TAB>except Exception as<MASK> ", "simple_target": "e: <TAB><TAB><TAB>logging.error(e)"}
{"method": "def __init__(self, range_str):\n<TAB>self.range_str = to_str(range_str)\n<TAB>self.range = set()\n<TAB>range_str = to_str(range_str).split(\",\")\n<TAB>for item in range_str:\n<TAB><TAB>try:\n<TAB><TAB><TAB>int_range = item.split(\"-\")\n<TAB><TAB><TAB>if len(int_range) == 1:\n<TAB><TAB><TAB><TAB>if item:\n<TAB><TAB><TAB><TAB><TAB>self.range.add(int(item))\n<TAB><TAB><TAB>elif len(int_range) == 2:\n<TAB><TAB><TAB><TAB>int_range[0] = int(int_range[0])\n<TAB><TAB><TAB><TAB>int_range[1] = int(int_range[1])\n<TAB><TAB><TAB><TAB>if int_range[0] < 0:\n<TAB><TAB><TAB><TAB><TAB>int_range[0] = 0\n<TAB><TAB><TAB><TAB>if int_range[1] > 65535:\n<TAB><TAB><TAB><TAB><TAB>int_range[1] = 65535\n<TAB><TAB><TAB><TAB>i = int_range[0]\n<TAB><TAB><TAB><TAB>while i <= int_range[1]:\n<TAB><TAB><TAB><TAB><TAB>self.range.add(i)\n<TAB><TAB><TAB><TAB><TAB>i += 1\n<TAB><TAB>except Exception as e:\n<TAB><TAB><TAB>logging.error(e)\n", "block": "<TAB><TAB><TAB>int_range = item.split(\"-\")\n<TAB><TAB><TAB>if len(int_range) == 1:\n<TAB><TAB><TAB><TAB>if item:\n<TAB><TAB><TAB><TAB><TAB>self.range.add(int(item))\n<TAB><TAB><TAB>elif len(int_range) == 2:\n<TAB><TAB><TAB><TAB>int_range[0] = int(int_range[0])\n<TAB><TAB><TAB><TAB>int_range[1] = int(int_range[1])\n<TAB><TAB><TAB><TAB>if int_range[0] < 0:\n<TAB><TAB><TAB><TAB><TAB>int_range[0] = 0\n<TAB><TAB><TAB><TAB>if int_range[1] > 65535:\n<TAB><TAB><TAB><TAB><TAB>int_range[1] = 65535\n<TAB><TAB><TAB><TAB>i = int_range[0]\n<TAB><TAB><TAB><TAB>while i <= int_range[1]:\n<TAB><TAB><TAB><TAB><TAB>self.range.add(i)\n<TAB><TAB><TAB><TAB><TAB>i += 1", "complex_masked_block": "<TAB><TAB><TAB>int_range = item.split(\"-\")\n<TAB><TAB><TAB>if len(int_range) == 1:\n<TAB><TAB><TAB><TAB>if item:\n<TAB><TAB><TAB><TAB><TAB>self.range.add(int(item))\n<TAB><TAB><TAB>elif len(int_range) == 2:\n<TAB><TAB><TAB><TAB>int_range[0] = int(int_range[0])\n<TAB><TAB><TAB><TAB>int_range[1] = int(int_range[1])\n<TAB><TAB><TAB><TAB>if int_range[0] < 
0:\n<TAB><TAB><TAB><TAB><TAB>int_range[0] = 0\n<TAB><TAB><TAB><TAB>if int_range[1] > 65535:\n<TAB><TAB><TAB><TAB><TAB>int_range[1] = 65535\n<MASK>", "complex_input": "def __init__(self, range_str): <TAB>self.range_str = to_str(range_str) <TAB>self.range = set() <TAB>range_str = to_str(range_str).split(\",\") <TAB>for item in range_str: <TAB><TAB>try: <TAB><TAB><TAB>int_range = item.split(\"-\") <TAB><TAB><TAB>if len(int_range) == 1: <TAB><TAB><TAB><TAB>if item: <TAB><TAB><TAB><TAB><TAB>self.range.add(int(item)) <TAB><TAB><TAB>elif len(int_range) == 2: <TAB><TAB><TAB><TAB>int_range[0] = int(int_range[0]) <TAB><TAB><TAB><TAB>int_range[1] = int(int_range[1]) <TAB><TAB><TAB><TAB>if int_range[0] < 0: <TAB><TAB><TAB><TAB><TAB>int_range[0] = 0 <TAB><TAB><TAB><TAB>if int_range[1] > 65535: <TAB><TAB><TAB><TAB><TAB>int_range[1] = 65535 <MASK> <TAB><TAB>except Exception as e: <TAB><TAB><TAB>logging.error(e) ", "complex_target": "<TAB><TAB><TAB><TAB>i = int_range[0] <TAB><TAB><TAB><TAB>while i <= int_range[1]: <TAB><TAB><TAB><TAB><TAB>self.range.add(i) <TAB><TAB><TAB><TAB><TAB>i += 1", "medium_masked_block": "<TAB><TAB><TAB>int_range = item.split(\"-\")\n<TAB><TAB><TAB>if len(int_range) == 1:\n<TAB><TAB><TAB><TAB>if item:\n<TAB><TAB><TAB><TAB><TAB>self.range.add(int(item))\n<TAB><TAB><TAB>elif len(int_range) == 2:\n<TAB><TAB><TAB><TAB>int_range[0] = int(int_range[0])\n<TAB><TAB><TAB><TAB>int_range[1] = int(int_range[1])\n<TAB><TAB><TAB><TAB>if int_range[0] < 0:\n<TAB><TAB><TAB><TAB><TAB>int_range[0] = 0\n<TAB><TAB><TAB><TAB>if int_range[1] > 65535:\n<TAB><TAB><TAB><TAB><TAB>int_range[1] = 65535\n<TAB><TAB><TAB><TAB>i = int_range[0]\n<TAB><TAB><TAB><TAB>while i <= int_range[1]:\n<MASK>", "medium_input": "def __init__(self, range_str): <TAB>self.range_str = to_str(range_str) <TAB>self.range = set() <TAB>range_str = to_str(range_str).split(\",\") <TAB>for item in range_str: <TAB><TAB>try: <TAB><TAB><TAB>int_range = item.split(\"-\") <TAB><TAB><TAB>if len(int_range) == 1: 
<TAB><TAB><TAB><TAB>if item: <TAB><TAB><TAB><TAB><TAB>self.range.add(int(item)) <TAB><TAB><TAB>elif len(int_range) == 2: <TAB><TAB><TAB><TAB>int_range[0] = int(int_range[0]) <TAB><TAB><TAB><TAB>int_range[1] = int(int_range[1]) <TAB><TAB><TAB><TAB>if int_range[0] < 0: <TAB><TAB><TAB><TAB><TAB>int_range[0] = 0 <TAB><TAB><TAB><TAB>if int_range[1] > 65535: <TAB><TAB><TAB><TAB><TAB>int_range[1] = 65535 <TAB><TAB><TAB><TAB>i = int_range[0] <TAB><TAB><TAB><TAB>while i <= int_range[1]: <MASK> <TAB><TAB>except Exception as e: <TAB><TAB><TAB>logging.error(e) ", "medium_target": "<TAB><TAB><TAB><TAB><TAB>self.range.add(i) <TAB><TAB><TAB><TAB><TAB>i += 1", "simple_masked_block": "<TAB><TAB><TAB>int_range = item.split(\"-\")\n<TAB><TAB><TAB>if len(int_range) == 1:\n<TAB><TAB><TAB><TAB>if item:\n<TAB><TAB><TAB><TAB><TAB>self.range.add(int(item))\n<TAB><TAB><TAB>elif len(int_range) == 2:\n<TAB><TAB><TAB><TAB>int_range[0] = int(int_range[0])\n<TAB><TAB><TAB><TAB>int_range[1] = int(int_range[1])\n<TAB><TAB><TAB><TAB>if int_range[0] < 0:\n<TAB><TAB><TAB><TAB><TAB>int_range[0] = 0\n<TAB><TAB><TAB><TAB>if int_range[1] > 65535:\n<TAB><TAB><TAB><TAB><TAB>int_range[1] = 65535\n<TAB><TAB><TAB><TAB>i = int_range[0]\n<TAB><TAB><TAB><TAB>while i <= int_range[1]:\n<MASK>", "simple_input": "def __init__(self, range_str): <TAB>self.range_str = to_str(range_str) <TAB>self.range = set() <TAB>range_str = to_str(range_str).split(\",\") <TAB>for item in range_str: <TAB><TAB>try: <TAB><TAB><TAB>int_range = item.split(\"-\") <TAB><TAB><TAB>if len(int_range) == 1: <TAB><TAB><TAB><TAB>if item: <TAB><TAB><TAB><TAB><TAB>self.range.add(int(item)) <TAB><TAB><TAB>elif len(int_range) == 2: <TAB><TAB><TAB><TAB>int_range[0] = int(int_range[0]) <TAB><TAB><TAB><TAB>int_range[1] = int(int_range[1]) <TAB><TAB><TAB><TAB>if int_range[0] < 0: <TAB><TAB><TAB><TAB><TAB>int_range[0] = 0 <TAB><TAB><TAB><TAB>if int_range[1] > 65535: <TAB><TAB><TAB><TAB><TAB>int_range[1] = 65535 <TAB><TAB><TAB><TAB>i = int_range[0] 
<TAB><TAB><TAB><TAB>while i <= int_range[1]: <MASK> <TAB><TAB>except Exception as e: <TAB><TAB><TAB>logging.error(e) ", "simple_target": "<TAB><TAB><TAB><TAB><TAB>self.range.add(i) <TAB><TAB><TAB><TAB><TAB>i += 1"}
{"method": "def run(callbacks=None):\n<TAB>keras_utils.set_session_config(enable_xla=FLAGS.enable_xla)\n<TAB>params = config_factory.config_generator(FLAGS.model)\n<TAB>params = params_dict.override_params_dict(params, FLAGS.config_file, is_strict=True)\n<TAB>params = params_dict.override_params_dict(\n<TAB><TAB>params, FLAGS.params_override, is_strict=True\n<TAB>)\n<TAB>params.override(\n<TAB><TAB>{\n<TAB><TAB><TAB>\"strategy_type\": FLAGS.strategy_type,\n<TAB><TAB><TAB>\"model_dir\": FLAGS.model_dir,\n<TAB><TAB><TAB>\"strategy_config\": executor.strategy_flags_dict(),\n<TAB><TAB>},\n<TAB><TAB>is_strict=False,\n<TAB>)\n<TAB># Make sure use_tpu and strategy_type are in sync.\n<TAB>params.use_tpu = params.strategy_type == \"tpu\"\n<TAB>if not params.use_tpu:\n<TAB><TAB>params.override(\n<TAB><TAB><TAB>{\n<TAB><TAB><TAB><TAB>\"architecture\": {\n<TAB><TAB><TAB><TAB><TAB>\"use_bfloat16\": False,\n<TAB><TAB><TAB><TAB>},\n<TAB><TAB><TAB><TAB>\"norm_activation\": {\n<TAB><TAB><TAB><TAB><TAB>\"use_sync_bn\": False,\n<TAB><TAB><TAB><TAB>},\n<TAB><TAB><TAB>},\n<TAB><TAB><TAB>is_strict=True,\n<TAB><TAB>)\n<TAB>params.validate()\n<TAB>params.lock()\n<TAB>pp = pprint.PrettyPrinter()\n<TAB>params_str = pp.pformat(params.as_dict())\n<TAB>logging.info(\"Model Parameters: %s\", params_str)\n<TAB>train_input_fn = None\n<TAB>eval_input_fn = None\n<TAB>training_file_pattern = (\n<TAB><TAB>FLAGS.training_file_pattern or params.train.train_file_pattern\n<TAB>)\n<TAB>eval_file_pattern = FLAGS.eval_file_pattern or params.eval.eval_file_pattern\n<TAB>if not training_file_pattern and not eval_file_pattern:\n<TAB><TAB>raise ValueError(\n<TAB><TAB><TAB>\"Must provide at least one of training_file_pattern and \"\n<TAB><TAB><TAB>\"eval_file_pattern.\"\n<TAB><TAB>)\n<TAB>if training_file_pattern:\n<TAB><TAB># Use global batch size for single host.\n<TAB><TAB>train_input_fn = 
input_reader.InputFn(\n<TAB><TAB><TAB>file_pattern=training_file_pattern,\n<TAB><TAB><TAB>params=params,\n<TAB><TAB><TAB>mode=input_reader.ModeKeys.TRAIN,\n<TAB><TAB><TAB>batch_size=params.train.batch_size,\n<TAB><TAB>)\n<TAB>if eval_file_pattern:\n<TAB><TAB>eval_input_fn = input_reader.InputFn(\n<TAB><TAB><TAB>file_pattern=eval_file_pattern,\n<TAB><TAB><TAB>params=params,\n<TAB><TAB><TAB>mode=input_reader.ModeKeys.PREDICT_WITH_GT,\n<TAB><TAB><TAB>batch_size=params.eval.batch_size,\n<TAB><TAB><TAB>num_examples=params.eval.eval_samples,\n<TAB><TAB>)\n<TAB>if callbacks is None:\n<TAB><TAB>callbacks = []\n<TAB>if FLAGS.log_steps:\n<TAB><TAB>callbacks.append(\n<TAB><TAB><TAB>keras_utils.TimeHistory(\n<TAB><TAB><TAB><TAB>batch_size=params.train.batch_size,\n<TAB><TAB><TAB><TAB>log_steps=FLAGS.log_steps,\n<TAB><TAB><TAB>)\n<TAB><TAB>)\n<TAB>return run_executor(\n<TAB><TAB>params,\n<TAB><TAB>FLAGS.mode,\n<TAB><TAB>checkpoint_path=FLAGS.checkpoint_path,\n<TAB><TAB>train_input_fn=train_input_fn,\n<TAB><TAB>eval_input_fn=eval_input_fn,\n<TAB><TAB>callbacks=callbacks,\n<TAB>)\n", "block": "<TAB><TAB>params.override(\n<TAB><TAB><TAB>{\n<TAB><TAB><TAB><TAB>\"architecture\": {\n<TAB><TAB><TAB><TAB><TAB>\"use_bfloat16\": False,\n<TAB><TAB><TAB><TAB>},\n<TAB><TAB><TAB><TAB>\"norm_activation\": {\n<TAB><TAB><TAB><TAB><TAB>\"use_sync_bn\": False,\n<TAB><TAB><TAB><TAB>},\n<TAB><TAB><TAB>},\n<TAB><TAB><TAB>is_strict=True,\n<TAB><TAB>)", "complex_masked_block": "<TAB><TAB>params.override(\n<TAB><TAB><TAB>{\n<TAB><TAB><TAB><TAB>\"architecture\": {\n<TAB><TAB><TAB><TAB><TAB>\"use_bfloat16\": False,\n<TAB><TAB><TAB><TAB>},\n<MASK>", "complex_input": "def run(callbacks=None): <TAB>keras_utils.set_session_config(enable_xla=FLAGS.enable_xla) <TAB>params = config_factory.config_generator(FLAGS.model) <TAB>params = params_dict.override_params_dict(params, FLAGS.config_file, is_strict=True) <TAB>params = params_dict.override_params_dict( <TAB><TAB>params, FLAGS.params_override, is_strict=True 
<TAB>) <TAB>params.override( <TAB><TAB>{ <TAB><TAB><TAB>\"strategy_type\": FLAGS.strategy_type, <TAB><TAB><TAB>\"model_dir\": FLAGS.model_dir, <TAB><TAB><TAB>\"strategy_config\": executor.strategy_flags_dict(), <TAB><TAB>}, <TAB><TAB>is_strict=False, <TAB>) <TAB># Make sure use_tpu and strategy_type are in sync. <TAB>params.use_tpu = params.strategy_type == \"tpu\" <TAB>if not params.use_tpu: <TAB><TAB>params.override( <TAB><TAB><TAB>{ <TAB><TAB><TAB><TAB>\"architecture\": { <TAB><TAB><TAB><TAB><TAB>\"use_bfloat16\": False, <TAB><TAB><TAB><TAB>}, <MASK> <TAB>params.validate() <TAB>params.lock() <TAB>pp = pprint.PrettyPrinter() <TAB>params_str = pp.pformat(params.as_dict()) <TAB>logging.info(\"Model Parameters: %s\", params_str) <TAB>train_input_fn = None <TAB>eval_input_fn = None <TAB>training_file_pattern = ( <TAB><TAB>FLAGS.training_file_pattern or params.train.train_file_pattern <TAB>) <TAB>eval_file_pattern = FLAGS.eval_file_pattern or params.eval.eval_file_pattern <TAB>if not training_file_pattern and not eval_file_pattern: <TAB><TAB>raise ValueError( <TAB><TAB><TAB>\"Must provide at least one of training_file_pattern and \" <TAB><TAB><TAB>\"eval_file_pattern.\" <TAB><TAB>) <TAB>if training_file_pattern: <TAB><TAB># Use global batch size for single host. 
<TAB><TAB>train_input_fn = input_reader.InputFn( <TAB><TAB><TAB>file_pattern=training_file_pattern, <TAB><TAB><TAB>params=params, <TAB><TAB><TAB>mode=input_reader.ModeKeys.TRAIN, <TAB><TAB><TAB>batch_size=params.train.batch_size, <TAB><TAB>) <TAB>if eval_file_pattern: <TAB><TAB>eval_input_fn = input_reader.InputFn( <TAB><TAB><TAB>file_pattern=eval_file_pattern, <TAB><TAB><TAB>params=params, <TAB><TAB><TAB>mode=input_reader.ModeKeys.PREDICT_WITH_GT, <TAB><TAB><TAB>batch_size=params.eval.batch_size, <TAB><TAB><TAB>num_examples=params.eval.eval_samples, <TAB><TAB>) <TAB>if callbacks is None: <TAB><TAB>callbacks = [] <TAB>if FLAGS.log_steps: <TAB><TAB>callbacks.append( <TAB><TAB><TAB>keras_utils.TimeHistory( <TAB><TAB><TAB><TAB>batch_size=params.train.batch_size, <TAB><TAB><TAB><TAB>log_steps=FLAGS.log_steps, <TAB><TAB><TAB>) <TAB><TAB>) <TAB>return run_executor( <TAB><TAB>params, <TAB><TAB>FLAGS.mode, <TAB><TAB>checkpoint_path=FLAGS.checkpoint_path, <TAB><TAB>train_input_fn=train_input_fn, <TAB><TAB>eval_input_fn=eval_input_fn, <TAB><TAB>callbacks=callbacks, <TAB>) ", "complex_target": "<TAB><TAB><TAB><TAB>\"norm_activation\": { <TAB><TAB><TAB><TAB><TAB>\"use_sync_bn\": False, <TAB><TAB><TAB><TAB>}, <TAB><TAB><TAB>}, <TAB><TAB><TAB>is_strict=True, <TAB><TAB>)", "medium_masked_block": "<TAB><TAB>params.override(\n<TAB><TAB><TAB>{\n<TAB><TAB><TAB><TAB>\"architecture\": {\n<TAB><TAB><TAB><TAB><TAB>\"use_bfloat16\": False,\n<TAB><TAB><TAB><TAB>},\n<TAB><TAB><TAB><TAB>\"norm_activation\": {\n<TAB><TAB><TAB><TAB><TAB>\"use_sync_bn\": False,\n<MASK>", "medium_input": "def run(callbacks=None): <TAB>keras_utils.set_session_config(enable_xla=FLAGS.enable_xla) <TAB>params = config_factory.config_generator(FLAGS.model) <TAB>params = params_dict.override_params_dict(params, FLAGS.config_file, is_strict=True) <TAB>params = params_dict.override_params_dict( <TAB><TAB>params, FLAGS.params_override, is_strict=True <TAB>) <TAB>params.override( <TAB><TAB>{ 
<TAB><TAB><TAB>\"strategy_type\": FLAGS.strategy_type, <TAB><TAB><TAB>\"model_dir\": FLAGS.model_dir, <TAB><TAB><TAB>\"strategy_config\": executor.strategy_flags_dict(), <TAB><TAB>}, <TAB><TAB>is_strict=False, <TAB>) <TAB># Make sure use_tpu and strategy_type are in sync. <TAB>params.use_tpu = params.strategy_type == \"tpu\" <TAB>if not params.use_tpu: <TAB><TAB>params.override( <TAB><TAB><TAB>{ <TAB><TAB><TAB><TAB>\"architecture\": { <TAB><TAB><TAB><TAB><TAB>\"use_bfloat16\": False, <TAB><TAB><TAB><TAB>}, <TAB><TAB><TAB><TAB>\"norm_activation\": { <TAB><TAB><TAB><TAB><TAB>\"use_sync_bn\": False, <MASK> <TAB>params.validate() <TAB>params.lock() <TAB>pp = pprint.PrettyPrinter() <TAB>params_str = pp.pformat(params.as_dict()) <TAB>logging.info(\"Model Parameters: %s\", params_str) <TAB>train_input_fn = None <TAB>eval_input_fn = None <TAB>training_file_pattern = ( <TAB><TAB>FLAGS.training_file_pattern or params.train.train_file_pattern <TAB>) <TAB>eval_file_pattern = FLAGS.eval_file_pattern or params.eval.eval_file_pattern <TAB>if not training_file_pattern and not eval_file_pattern: <TAB><TAB>raise ValueError( <TAB><TAB><TAB>\"Must provide at least one of training_file_pattern and \" <TAB><TAB><TAB>\"eval_file_pattern.\" <TAB><TAB>) <TAB>if training_file_pattern: <TAB><TAB># Use global batch size for single host. 
<TAB><TAB>train_input_fn = input_reader.InputFn( <TAB><TAB><TAB>file_pattern=training_file_pattern, <TAB><TAB><TAB>params=params, <TAB><TAB><TAB>mode=input_reader.ModeKeys.TRAIN, <TAB><TAB><TAB>batch_size=params.train.batch_size, <TAB><TAB>) <TAB>if eval_file_pattern: <TAB><TAB>eval_input_fn = input_reader.InputFn( <TAB><TAB><TAB>file_pattern=eval_file_pattern, <TAB><TAB><TAB>params=params, <TAB><TAB><TAB>mode=input_reader.ModeKeys.PREDICT_WITH_GT, <TAB><TAB><TAB>batch_size=params.eval.batch_size, <TAB><TAB><TAB>num_examples=params.eval.eval_samples, <TAB><TAB>) <TAB>if callbacks is None: <TAB><TAB>callbacks = [] <TAB>if FLAGS.log_steps: <TAB><TAB>callbacks.append( <TAB><TAB><TAB>keras_utils.TimeHistory( <TAB><TAB><TAB><TAB>batch_size=params.train.batch_size, <TAB><TAB><TAB><TAB>log_steps=FLAGS.log_steps, <TAB><TAB><TAB>) <TAB><TAB>) <TAB>return run_executor( <TAB><TAB>params, <TAB><TAB>FLAGS.mode, <TAB><TAB>checkpoint_path=FLAGS.checkpoint_path, <TAB><TAB>train_input_fn=train_input_fn, <TAB><TAB>eval_input_fn=eval_input_fn, <TAB><TAB>callbacks=callbacks, <TAB>) ", "medium_target": "<TAB><TAB><TAB><TAB>}, <TAB><TAB><TAB>}, <TAB><TAB><TAB>is_strict=True, <TAB><TAB>)", "simple_masked_block": "<TAB><TAB>params.override(\n<TAB><TAB><TAB>{\n<TAB><TAB><TAB><TAB>\"architecture\": {\n<TAB><TAB><TAB><TAB><TAB>\"use_bfloat16\": False,\n<TAB><TAB><TAB><TAB>},\n<TAB><TAB><TAB><TAB>\"norm_activation\": {\n<TAB><TAB><TAB><TAB><TAB>\"use_sync_bn\": False,\n<TAB><TAB><TAB><TAB>},\n<TAB><TAB><TAB>},\n<MASK>", "simple_input": "def run(callbacks=None): <TAB>keras_utils.set_session_config(enable_xla=FLAGS.enable_xla) <TAB>params = config_factory.config_generator(FLAGS.model) <TAB>params = params_dict.override_params_dict(params, FLAGS.config_file, is_strict=True) <TAB>params = params_dict.override_params_dict( <TAB><TAB>params, FLAGS.params_override, is_strict=True <TAB>) <TAB>params.override( <TAB><TAB>{ <TAB><TAB><TAB>\"strategy_type\": FLAGS.strategy_type, 
<TAB><TAB><TAB>\"model_dir\": FLAGS.model_dir, <TAB><TAB><TAB>\"strategy_config\": executor.strategy_flags_dict(), <TAB><TAB>}, <TAB><TAB>is_strict=False, <TAB>) <TAB># Make sure use_tpu and strategy_type are in sync. <TAB>params.use_tpu = params.strategy_type == \"tpu\" <TAB>if not params.use_tpu: <TAB><TAB>params.override( <TAB><TAB><TAB>{ <TAB><TAB><TAB><TAB>\"architecture\": { <TAB><TAB><TAB><TAB><TAB>\"use_bfloat16\": False, <TAB><TAB><TAB><TAB>}, <TAB><TAB><TAB><TAB>\"norm_activation\": { <TAB><TAB><TAB><TAB><TAB>\"use_sync_bn\": False, <TAB><TAB><TAB><TAB>}, <TAB><TAB><TAB>}, <MASK> <TAB>params.validate() <TAB>params.lock() <TAB>pp = pprint.PrettyPrinter() <TAB>params_str = pp.pformat(params.as_dict()) <TAB>logging.info(\"Model Parameters: %s\", params_str) <TAB>train_input_fn = None <TAB>eval_input_fn = None <TAB>training_file_pattern = ( <TAB><TAB>FLAGS.training_file_pattern or params.train.train_file_pattern <TAB>) <TAB>eval_file_pattern = FLAGS.eval_file_pattern or params.eval.eval_file_pattern <TAB>if not training_file_pattern and not eval_file_pattern: <TAB><TAB>raise ValueError( <TAB><TAB><TAB>\"Must provide at least one of training_file_pattern and \" <TAB><TAB><TAB>\"eval_file_pattern.\" <TAB><TAB>) <TAB>if training_file_pattern: <TAB><TAB># Use global batch size for single host. 
<TAB><TAB>train_input_fn = input_reader.InputFn( <TAB><TAB><TAB>file_pattern=training_file_pattern, <TAB><TAB><TAB>params=params, <TAB><TAB><TAB>mode=input_reader.ModeKeys.TRAIN, <TAB><TAB><TAB>batch_size=params.train.batch_size, <TAB><TAB>) <TAB>if eval_file_pattern: <TAB><TAB>eval_input_fn = input_reader.InputFn( <TAB><TAB><TAB>file_pattern=eval_file_pattern, <TAB><TAB><TAB>params=params, <TAB><TAB><TAB>mode=input_reader.ModeKeys.PREDICT_WITH_GT, <TAB><TAB><TAB>batch_size=params.eval.batch_size, <TAB><TAB><TAB>num_examples=params.eval.eval_samples, <TAB><TAB>) <TAB>if callbacks is None: <TAB><TAB>callbacks = [] <TAB>if FLAGS.log_steps: <TAB><TAB>callbacks.append( <TAB><TAB><TAB>keras_utils.TimeHistory( <TAB><TAB><TAB><TAB>batch_size=params.train.batch_size, <TAB><TAB><TAB><TAB>log_steps=FLAGS.log_steps, <TAB><TAB><TAB>) <TAB><TAB>) <TAB>return run_executor( <TAB><TAB>params, <TAB><TAB>FLAGS.mode, <TAB><TAB>checkpoint_path=FLAGS.checkpoint_path, <TAB><TAB>train_input_fn=train_input_fn, <TAB><TAB>eval_input_fn=eval_input_fn, <TAB><TAB>callbacks=callbacks, <TAB>) ", "simple_target": "<TAB><TAB><TAB>is_strict=True, <TAB><TAB>)"}
{"method": "def run(callbacks=None):\n<TAB>keras_utils.set_session_config(enable_xla=FLAGS.enable_xla)\n<TAB>params = config_factory.config_generator(FLAGS.model)\n<TAB>params = params_dict.override_params_dict(params, FLAGS.config_file, is_strict=True)\n<TAB>params = params_dict.override_params_dict(\n<TAB><TAB>params, FLAGS.params_override, is_strict=True\n<TAB>)\n<TAB>params.override(\n<TAB><TAB>{\n<TAB><TAB><TAB>\"strategy_type\": FLAGS.strategy_type,\n<TAB><TAB><TAB>\"model_dir\": FLAGS.model_dir,\n<TAB><TAB><TAB>\"strategy_config\": executor.strategy_flags_dict(),\n<TAB><TAB>},\n<TAB><TAB>is_strict=False,\n<TAB>)\n<TAB># Make sure use_tpu and strategy_type are in sync.\n<TAB>params.use_tpu = params.strategy_type == \"tpu\"\n<TAB>if not params.use_tpu:\n<TAB><TAB>params.override(\n<TAB><TAB><TAB>{\n<TAB><TAB><TAB><TAB>\"architecture\": {\n<TAB><TAB><TAB><TAB><TAB>\"use_bfloat16\": False,\n<TAB><TAB><TAB><TAB>},\n<TAB><TAB><TAB><TAB>\"norm_activation\": {\n<TAB><TAB><TAB><TAB><TAB>\"use_sync_bn\": False,\n<TAB><TAB><TAB><TAB>},\n<TAB><TAB><TAB>},\n<TAB><TAB><TAB>is_strict=True,\n<TAB><TAB>)\n<TAB>params.validate()\n<TAB>params.lock()\n<TAB>pp = pprint.PrettyPrinter()\n<TAB>params_str = pp.pformat(params.as_dict())\n<TAB>logging.info(\"Model Parameters: %s\", params_str)\n<TAB>train_input_fn = None\n<TAB>eval_input_fn = None\n<TAB>training_file_pattern = (\n<TAB><TAB>FLAGS.training_file_pattern or params.train.train_file_pattern\n<TAB>)\n<TAB>eval_file_pattern = FLAGS.eval_file_pattern or params.eval.eval_file_pattern\n<TAB>if not training_file_pattern and not eval_file_pattern:\n<TAB><TAB>raise ValueError(\n<TAB><TAB><TAB>\"Must provide at least one of training_file_pattern and \"\n<TAB><TAB><TAB>\"eval_file_pattern.\"\n<TAB><TAB>)\n<TAB>if training_file_pattern:\n<TAB><TAB># Use global batch size for single host.\n<TAB><TAB>train_input_fn = 
input_reader.InputFn(\n<TAB><TAB><TAB>file_pattern=training_file_pattern,\n<TAB><TAB><TAB>params=params,\n<TAB><TAB><TAB>mode=input_reader.ModeKeys.TRAIN,\n<TAB><TAB><TAB>batch_size=params.train.batch_size,\n<TAB><TAB>)\n<TAB>if eval_file_pattern:\n<TAB><TAB>eval_input_fn = input_reader.InputFn(\n<TAB><TAB><TAB>file_pattern=eval_file_pattern,\n<TAB><TAB><TAB>params=params,\n<TAB><TAB><TAB>mode=input_reader.ModeKeys.PREDICT_WITH_GT,\n<TAB><TAB><TAB>batch_size=params.eval.batch_size,\n<TAB><TAB><TAB>num_examples=params.eval.eval_samples,\n<TAB><TAB>)\n<TAB>if callbacks is None:\n<TAB><TAB>callbacks = []\n<TAB>if FLAGS.log_steps:\n<TAB><TAB>callbacks.append(\n<TAB><TAB><TAB>keras_utils.TimeHistory(\n<TAB><TAB><TAB><TAB>batch_size=params.train.batch_size,\n<TAB><TAB><TAB><TAB>log_steps=FLAGS.log_steps,\n<TAB><TAB><TAB>)\n<TAB><TAB>)\n<TAB>return run_executor(\n<TAB><TAB>params,\n<TAB><TAB>FLAGS.mode,\n<TAB><TAB>checkpoint_path=FLAGS.checkpoint_path,\n<TAB><TAB>train_input_fn=train_input_fn,\n<TAB><TAB>eval_input_fn=eval_input_fn,\n<TAB><TAB>callbacks=callbacks,\n<TAB>)\n", "block": "<TAB><TAB>train_input_fn = input_reader.InputFn(\n<TAB><TAB><TAB>file_pattern=training_file_pattern,\n<TAB><TAB><TAB>params=params,\n<TAB><TAB><TAB>mode=input_reader.ModeKeys.TRAIN,\n<TAB><TAB><TAB>batch_size=params.train.batch_size,\n<TAB><TAB>)", "complex_masked_block": "<TAB><TAB>train_input_fn = input_reader.InputFn(\n<MASK>", "complex_input": "def run(callbacks=None): <TAB>keras_utils.set_session_config(enable_xla=FLAGS.enable_xla) <TAB>params = config_factory.config_generator(FLAGS.model) <TAB>params = params_dict.override_params_dict(params, FLAGS.config_file, is_strict=True) <TAB>params = params_dict.override_params_dict( <TAB><TAB>params, FLAGS.params_override, is_strict=True <TAB>) <TAB>params.override( <TAB><TAB>{ <TAB><TAB><TAB>\"strategy_type\": FLAGS.strategy_type, <TAB><TAB><TAB>\"model_dir\": FLAGS.model_dir, <TAB><TAB><TAB>\"strategy_config\": 
executor.strategy_flags_dict(), <TAB><TAB>}, <TAB><TAB>is_strict=False, <TAB>) <TAB># Make sure use_tpu and strategy_type are in sync. <TAB>params.use_tpu = params.strategy_type == \"tpu\" <TAB>if not params.use_tpu: <TAB><TAB>params.override( <TAB><TAB><TAB>{ <TAB><TAB><TAB><TAB>\"architecture\": { <TAB><TAB><TAB><TAB><TAB>\"use_bfloat16\": False, <TAB><TAB><TAB><TAB>}, <TAB><TAB><TAB><TAB>\"norm_activation\": { <TAB><TAB><TAB><TAB><TAB>\"use_sync_bn\": False, <TAB><TAB><TAB><TAB>}, <TAB><TAB><TAB>}, <TAB><TAB><TAB>is_strict=True, <TAB><TAB>) <TAB>params.validate() <TAB>params.lock() <TAB>pp = pprint.PrettyPrinter() <TAB>params_str = pp.pformat(params.as_dict()) <TAB>logging.info(\"Model Parameters: %s\", params_str) <TAB>train_input_fn = None <TAB>eval_input_fn = None <TAB>training_file_pattern = ( <TAB><TAB>FLAGS.training_file_pattern or params.train.train_file_pattern <TAB>) <TAB>eval_file_pattern = FLAGS.eval_file_pattern or params.eval.eval_file_pattern <TAB>if not training_file_pattern and not eval_file_pattern: <TAB><TAB>raise ValueError( <TAB><TAB><TAB>\"Must provide at least one of training_file_pattern and \" <TAB><TAB><TAB>\"eval_file_pattern.\" <TAB><TAB>) <TAB>if training_file_pattern: <TAB><TAB># Use global batch size for single host. 
<TAB><TAB>train_input_fn = input_reader.InputFn( <MASK> <TAB>if eval_file_pattern: <TAB><TAB>eval_input_fn = input_reader.InputFn( <TAB><TAB><TAB>file_pattern=eval_file_pattern, <TAB><TAB><TAB>params=params, <TAB><TAB><TAB>mode=input_reader.ModeKeys.PREDICT_WITH_GT, <TAB><TAB><TAB>batch_size=params.eval.batch_size, <TAB><TAB><TAB>num_examples=params.eval.eval_samples, <TAB><TAB>) <TAB>if callbacks is None: <TAB><TAB>callbacks = [] <TAB>if FLAGS.log_steps: <TAB><TAB>callbacks.append( <TAB><TAB><TAB>keras_utils.TimeHistory( <TAB><TAB><TAB><TAB>batch_size=params.train.batch_size, <TAB><TAB><TAB><TAB>log_steps=FLAGS.log_steps, <TAB><TAB><TAB>) <TAB><TAB>) <TAB>return run_executor( <TAB><TAB>params, <TAB><TAB>FLAGS.mode, <TAB><TAB>checkpoint_path=FLAGS.checkpoint_path, <TAB><TAB>train_input_fn=train_input_fn, <TAB><TAB>eval_input_fn=eval_input_fn, <TAB><TAB>callbacks=callbacks, <TAB>) ", "complex_target": "<TAB><TAB><TAB>file_pattern=training_file_pattern, <TAB><TAB><TAB>params=params, <TAB><TAB><TAB>mode=input_reader.ModeKeys.TRAIN, <TAB><TAB><TAB>batch_size=params.train.batch_size, <TAB><TAB>)", "medium_masked_block": "<TAB><TAB>train_input_fn = input_reader.InputFn(\n<TAB><TAB><TAB>file_pattern=training_file_pattern,\n<TAB><TAB><TAB>params=params,\n<TAB><TAB><TAB>mode=input_reader.ModeKeys.TRAIN,\n<MASK>", "medium_input": "def run(callbacks=None): <TAB>keras_utils.set_session_config(enable_xla=FLAGS.enable_xla) <TAB>params = config_factory.config_generator(FLAGS.model) <TAB>params = params_dict.override_params_dict(params, FLAGS.config_file, is_strict=True) <TAB>params = params_dict.override_params_dict( <TAB><TAB>params, FLAGS.params_override, is_strict=True <TAB>) <TAB>params.override( <TAB><TAB>{ <TAB><TAB><TAB>\"strategy_type\": FLAGS.strategy_type, <TAB><TAB><TAB>\"model_dir\": FLAGS.model_dir, <TAB><TAB><TAB>\"strategy_config\": executor.strategy_flags_dict(), <TAB><TAB>}, <TAB><TAB>is_strict=False, <TAB>) <TAB># Make sure use_tpu and strategy_type are in sync. 
<TAB>params.use_tpu = params.strategy_type == \"tpu\" <TAB>if not params.use_tpu: <TAB><TAB>params.override( <TAB><TAB><TAB>{ <TAB><TAB><TAB><TAB>\"architecture\": { <TAB><TAB><TAB><TAB><TAB>\"use_bfloat16\": False, <TAB><TAB><TAB><TAB>}, <TAB><TAB><TAB><TAB>\"norm_activation\": { <TAB><TAB><TAB><TAB><TAB>\"use_sync_bn\": False, <TAB><TAB><TAB><TAB>}, <TAB><TAB><TAB>}, <TAB><TAB><TAB>is_strict=True, <TAB><TAB>) <TAB>params.validate() <TAB>params.lock() <TAB>pp = pprint.PrettyPrinter() <TAB>params_str = pp.pformat(params.as_dict()) <TAB>logging.info(\"Model Parameters: %s\", params_str) <TAB>train_input_fn = None <TAB>eval_input_fn = None <TAB>training_file_pattern = ( <TAB><TAB>FLAGS.training_file_pattern or params.train.train_file_pattern <TAB>) <TAB>eval_file_pattern = FLAGS.eval_file_pattern or params.eval.eval_file_pattern <TAB>if not training_file_pattern and not eval_file_pattern: <TAB><TAB>raise ValueError( <TAB><TAB><TAB>\"Must provide at least one of training_file_pattern and \" <TAB><TAB><TAB>\"eval_file_pattern.\" <TAB><TAB>) <TAB>if training_file_pattern: <TAB><TAB># Use global batch size for single host. 
<TAB><TAB>train_input_fn = input_reader.InputFn( <TAB><TAB><TAB>file_pattern=training_file_pattern, <TAB><TAB><TAB>params=params, <TAB><TAB><TAB>mode=input_reader.ModeKeys.TRAIN, <MASK> <TAB>if eval_file_pattern: <TAB><TAB>eval_input_fn = input_reader.InputFn( <TAB><TAB><TAB>file_pattern=eval_file_pattern, <TAB><TAB><TAB>params=params, <TAB><TAB><TAB>mode=input_reader.ModeKeys.PREDICT_WITH_GT, <TAB><TAB><TAB>batch_size=params.eval.batch_size, <TAB><TAB><TAB>num_examples=params.eval.eval_samples, <TAB><TAB>) <TAB>if callbacks is None: <TAB><TAB>callbacks = [] <TAB>if FLAGS.log_steps: <TAB><TAB>callbacks.append( <TAB><TAB><TAB>keras_utils.TimeHistory( <TAB><TAB><TAB><TAB>batch_size=params.train.batch_size, <TAB><TAB><TAB><TAB>log_steps=FLAGS.log_steps, <TAB><TAB><TAB>) <TAB><TAB>) <TAB>return run_executor( <TAB><TAB>params, <TAB><TAB>FLAGS.mode, <TAB><TAB>checkpoint_path=FLAGS.checkpoint_path, <TAB><TAB>train_input_fn=train_input_fn, <TAB><TAB>eval_input_fn=eval_input_fn, <TAB><TAB>callbacks=callbacks, <TAB>) ", "medium_target": "<TAB><TAB><TAB>batch_size=params.train.batch_size, <TAB><TAB>)", "simple_masked_block": "<TAB><TAB>train_input_fn = input_reader.InputFn(\n<TAB><TAB><TAB>file_pattern=training_file_pattern,\n<TAB><TAB><TAB>params=params,\n<TAB><TAB><TAB>mode=input_reader.ModeKeys.TRAIN,\n<MASK>", "simple_input": "def run(callbacks=None): <TAB>keras_utils.set_session_config(enable_xla=FLAGS.enable_xla) <TAB>params = config_factory.config_generator(FLAGS.model) <TAB>params = params_dict.override_params_dict(params, FLAGS.config_file, is_strict=True) <TAB>params = params_dict.override_params_dict( <TAB><TAB>params, FLAGS.params_override, is_strict=True <TAB>) <TAB>params.override( <TAB><TAB>{ <TAB><TAB><TAB>\"strategy_type\": FLAGS.strategy_type, <TAB><TAB><TAB>\"model_dir\": FLAGS.model_dir, <TAB><TAB><TAB>\"strategy_config\": executor.strategy_flags_dict(), <TAB><TAB>}, <TAB><TAB>is_strict=False, <TAB>) <TAB># Make sure use_tpu and strategy_type are in sync. 
<TAB>params.use_tpu = params.strategy_type == \"tpu\" <TAB>if not params.use_tpu: <TAB><TAB>params.override( <TAB><TAB><TAB>{ <TAB><TAB><TAB><TAB>\"architecture\": { <TAB><TAB><TAB><TAB><TAB>\"use_bfloat16\": False, <TAB><TAB><TAB><TAB>}, <TAB><TAB><TAB><TAB>\"norm_activation\": { <TAB><TAB><TAB><TAB><TAB>\"use_sync_bn\": False, <TAB><TAB><TAB><TAB>}, <TAB><TAB><TAB>}, <TAB><TAB><TAB>is_strict=True, <TAB><TAB>) <TAB>params.validate() <TAB>params.lock() <TAB>pp = pprint.PrettyPrinter() <TAB>params_str = pp.pformat(params.as_dict()) <TAB>logging.info(\"Model Parameters: %s\", params_str) <TAB>train_input_fn = None <TAB>eval_input_fn = None <TAB>training_file_pattern = ( <TAB><TAB>FLAGS.training_file_pattern or params.train.train_file_pattern <TAB>) <TAB>eval_file_pattern = FLAGS.eval_file_pattern or params.eval.eval_file_pattern <TAB>if not training_file_pattern and not eval_file_pattern: <TAB><TAB>raise ValueError( <TAB><TAB><TAB>\"Must provide at least one of training_file_pattern and \" <TAB><TAB><TAB>\"eval_file_pattern.\" <TAB><TAB>) <TAB>if training_file_pattern: <TAB><TAB># Use global batch size for single host. 
<TAB><TAB>train_input_fn = input_reader.InputFn( <TAB><TAB><TAB>file_pattern=training_file_pattern, <TAB><TAB><TAB>params=params, <TAB><TAB><TAB>mode=input_reader.ModeKeys.TRAIN, <MASK> <TAB>if eval_file_pattern: <TAB><TAB>eval_input_fn = input_reader.InputFn( <TAB><TAB><TAB>file_pattern=eval_file_pattern, <TAB><TAB><TAB>params=params, <TAB><TAB><TAB>mode=input_reader.ModeKeys.PREDICT_WITH_GT, <TAB><TAB><TAB>batch_size=params.eval.batch_size, <TAB><TAB><TAB>num_examples=params.eval.eval_samples, <TAB><TAB>) <TAB>if callbacks is None: <TAB><TAB>callbacks = [] <TAB>if FLAGS.log_steps: <TAB><TAB>callbacks.append( <TAB><TAB><TAB>keras_utils.TimeHistory( <TAB><TAB><TAB><TAB>batch_size=params.train.batch_size, <TAB><TAB><TAB><TAB>log_steps=FLAGS.log_steps, <TAB><TAB><TAB>) <TAB><TAB>) <TAB>return run_executor( <TAB><TAB>params, <TAB><TAB>FLAGS.mode, <TAB><TAB>checkpoint_path=FLAGS.checkpoint_path, <TAB><TAB>train_input_fn=train_input_fn, <TAB><TAB>eval_input_fn=eval_input_fn, <TAB><TAB>callbacks=callbacks, <TAB>) ", "simple_target": "<TAB><TAB><TAB>batch_size=params.train.batch_size, <TAB><TAB>)"}
{"method": "def run(callbacks=None):\n<TAB>keras_utils.set_session_config(enable_xla=FLAGS.enable_xla)\n<TAB>params = config_factory.config_generator(FLAGS.model)\n<TAB>params = params_dict.override_params_dict(params, FLAGS.config_file, is_strict=True)\n<TAB>params = params_dict.override_params_dict(\n<TAB><TAB>params, FLAGS.params_override, is_strict=True\n<TAB>)\n<TAB>params.override(\n<TAB><TAB>{\n<TAB><TAB><TAB>\"strategy_type\": FLAGS.strategy_type,\n<TAB><TAB><TAB>\"model_dir\": FLAGS.model_dir,\n<TAB><TAB><TAB>\"strategy_config\": executor.strategy_flags_dict(),\n<TAB><TAB>},\n<TAB><TAB>is_strict=False,\n<TAB>)\n<TAB># Make sure use_tpu and strategy_type are in sync.\n<TAB>params.use_tpu = params.strategy_type == \"tpu\"\n<TAB>if not params.use_tpu:\n<TAB><TAB>params.override(\n<TAB><TAB><TAB>{\n<TAB><TAB><TAB><TAB>\"architecture\": {\n<TAB><TAB><TAB><TAB><TAB>\"use_bfloat16\": False,\n<TAB><TAB><TAB><TAB>},\n<TAB><TAB><TAB><TAB>\"norm_activation\": {\n<TAB><TAB><TAB><TAB><TAB>\"use_sync_bn\": False,\n<TAB><TAB><TAB><TAB>},\n<TAB><TAB><TAB>},\n<TAB><TAB><TAB>is_strict=True,\n<TAB><TAB>)\n<TAB>params.validate()\n<TAB>params.lock()\n<TAB>pp = pprint.PrettyPrinter()\n<TAB>params_str = pp.pformat(params.as_dict())\n<TAB>logging.info(\"Model Parameters: %s\", params_str)\n<TAB>train_input_fn = None\n<TAB>eval_input_fn = None\n<TAB>training_file_pattern = (\n<TAB><TAB>FLAGS.training_file_pattern or params.train.train_file_pattern\n<TAB>)\n<TAB>eval_file_pattern = FLAGS.eval_file_pattern or params.eval.eval_file_pattern\n<TAB>if not training_file_pattern and not eval_file_pattern:\n<TAB><TAB>raise ValueError(\n<TAB><TAB><TAB>\"Must provide at least one of training_file_pattern and \"\n<TAB><TAB><TAB>\"eval_file_pattern.\"\n<TAB><TAB>)\n<TAB>if training_file_pattern:\n<TAB><TAB># Use global batch size for single host.\n<TAB><TAB>train_input_fn = 
input_reader.InputFn(\n<TAB><TAB><TAB>file_pattern=training_file_pattern,\n<TAB><TAB><TAB>params=params,\n<TAB><TAB><TAB>mode=input_reader.ModeKeys.TRAIN,\n<TAB><TAB><TAB>batch_size=params.train.batch_size,\n<TAB><TAB>)\n<TAB>if eval_file_pattern:\n<TAB><TAB>eval_input_fn = input_reader.InputFn(\n<TAB><TAB><TAB>file_pattern=eval_file_pattern,\n<TAB><TAB><TAB>params=params,\n<TAB><TAB><TAB>mode=input_reader.ModeKeys.PREDICT_WITH_GT,\n<TAB><TAB><TAB>batch_size=params.eval.batch_size,\n<TAB><TAB><TAB>num_examples=params.eval.eval_samples,\n<TAB><TAB>)\n<TAB>if callbacks is None:\n<TAB><TAB>callbacks = []\n<TAB>if FLAGS.log_steps:\n<TAB><TAB>callbacks.append(\n<TAB><TAB><TAB>keras_utils.TimeHistory(\n<TAB><TAB><TAB><TAB>batch_size=params.train.batch_size,\n<TAB><TAB><TAB><TAB>log_steps=FLAGS.log_steps,\n<TAB><TAB><TAB>)\n<TAB><TAB>)\n<TAB>return run_executor(\n<TAB><TAB>params,\n<TAB><TAB>FLAGS.mode,\n<TAB><TAB>checkpoint_path=FLAGS.checkpoint_path,\n<TAB><TAB>train_input_fn=train_input_fn,\n<TAB><TAB>eval_input_fn=eval_input_fn,\n<TAB><TAB>callbacks=callbacks,\n<TAB>)\n", "block": "<TAB><TAB>eval_input_fn = input_reader.InputFn(\n<TAB><TAB><TAB>file_pattern=eval_file_pattern,\n<TAB><TAB><TAB>params=params,\n<TAB><TAB><TAB>mode=input_reader.ModeKeys.PREDICT_WITH_GT,\n<TAB><TAB><TAB>batch_size=params.eval.batch_size,\n<TAB><TAB><TAB>num_examples=params.eval.eval_samples,\n<TAB><TAB>)", "complex_masked_block": "<TAB><TAB>eval_input_fn = input_reader.InputFn(\n<TAB><TAB><TAB>file_pattern=eval_file_pattern,\n<TAB><TAB><TAB>params=params,\n<MASK>", "complex_input": "def run(callbacks=None): <TAB>keras_utils.set_session_config(enable_xla=FLAGS.enable_xla) <TAB>params = config_factory.config_generator(FLAGS.model) <TAB>params = params_dict.override_params_dict(params, FLAGS.config_file, is_strict=True) <TAB>params = params_dict.override_params_dict( <TAB><TAB>params, FLAGS.params_override, is_strict=True <TAB>) <TAB>params.override( <TAB><TAB>{ 
<TAB><TAB><TAB>\"strategy_type\": FLAGS.strategy_type, <TAB><TAB><TAB>\"model_dir\": FLAGS.model_dir, <TAB><TAB><TAB>\"strategy_config\": executor.strategy_flags_dict(), <TAB><TAB>}, <TAB><TAB>is_strict=False, <TAB>) <TAB># Make sure use_tpu and strategy_type are in sync. <TAB>params.use_tpu = params.strategy_type == \"tpu\" <TAB>if not params.use_tpu: <TAB><TAB>params.override( <TAB><TAB><TAB>{ <TAB><TAB><TAB><TAB>\"architecture\": { <TAB><TAB><TAB><TAB><TAB>\"use_bfloat16\": False, <TAB><TAB><TAB><TAB>}, <TAB><TAB><TAB><TAB>\"norm_activation\": { <TAB><TAB><TAB><TAB><TAB>\"use_sync_bn\": False, <TAB><TAB><TAB><TAB>}, <TAB><TAB><TAB>}, <TAB><TAB><TAB>is_strict=True, <TAB><TAB>) <TAB>params.validate() <TAB>params.lock() <TAB>pp = pprint.PrettyPrinter() <TAB>params_str = pp.pformat(params.as_dict()) <TAB>logging.info(\"Model Parameters: %s\", params_str) <TAB>train_input_fn = None <TAB>eval_input_fn = None <TAB>training_file_pattern = ( <TAB><TAB>FLAGS.training_file_pattern or params.train.train_file_pattern <TAB>) <TAB>eval_file_pattern = FLAGS.eval_file_pattern or params.eval.eval_file_pattern <TAB>if not training_file_pattern and not eval_file_pattern: <TAB><TAB>raise ValueError( <TAB><TAB><TAB>\"Must provide at least one of training_file_pattern and \" <TAB><TAB><TAB>\"eval_file_pattern.\" <TAB><TAB>) <TAB>if training_file_pattern: <TAB><TAB># Use global batch size for single host. 
<TAB><TAB>train_input_fn = input_reader.InputFn( <TAB><TAB><TAB>file_pattern=training_file_pattern, <TAB><TAB><TAB>params=params, <TAB><TAB><TAB>mode=input_reader.ModeKeys.TRAIN, <TAB><TAB><TAB>batch_size=params.train.batch_size, <TAB><TAB>) <TAB>if eval_file_pattern: <TAB><TAB>eval_input_fn = input_reader.InputFn( <TAB><TAB><TAB>file_pattern=eval_file_pattern, <TAB><TAB><TAB>params=params, <MASK> <TAB>if callbacks is None: <TAB><TAB>callbacks = [] <TAB>if FLAGS.log_steps: <TAB><TAB>callbacks.append( <TAB><TAB><TAB>keras_utils.TimeHistory( <TAB><TAB><TAB><TAB>batch_size=params.train.batch_size, <TAB><TAB><TAB><TAB>log_steps=FLAGS.log_steps, <TAB><TAB><TAB>) <TAB><TAB>) <TAB>return run_executor( <TAB><TAB>params, <TAB><TAB>FLAGS.mode, <TAB><TAB>checkpoint_path=FLAGS.checkpoint_path, <TAB><TAB>train_input_fn=train_input_fn, <TAB><TAB>eval_input_fn=eval_input_fn, <TAB><TAB>callbacks=callbacks, <TAB>) ", "complex_target": "<TAB><TAB><TAB>mode=input_reader.ModeKeys.PREDICT_WITH_GT, <TAB><TAB><TAB>batch_size=params.eval.batch_size, <TAB><TAB><TAB>num_examples=params.eval.eval_samples, <TAB><TAB>)", "medium_masked_block": "<TAB><TAB>eval_input_fn = input_reader.InputFn(\n<TAB><TAB><TAB>file_pattern=eval_file_pattern,\n<TAB><TAB><TAB>params=params,\n<TAB><TAB><TAB>mode=input_reader.ModeKeys.PREDICT_WITH_GT,\n<TAB><TAB><TAB>batch_size=params.eval.batch_size,\n<MASK>", "medium_input": "def run(callbacks=None): <TAB>keras_utils.set_session_config(enable_xla=FLAGS.enable_xla) <TAB>params = config_factory.config_generator(FLAGS.model) <TAB>params = params_dict.override_params_dict(params, FLAGS.config_file, is_strict=True) <TAB>params = params_dict.override_params_dict( <TAB><TAB>params, FLAGS.params_override, is_strict=True <TAB>) <TAB>params.override( <TAB><TAB>{ <TAB><TAB><TAB>\"strategy_type\": FLAGS.strategy_type, <TAB><TAB><TAB>\"model_dir\": FLAGS.model_dir, <TAB><TAB><TAB>\"strategy_config\": executor.strategy_flags_dict(), <TAB><TAB>}, <TAB><TAB>is_strict=False, <TAB>) 
<TAB># Make sure use_tpu and strategy_type are in sync. <TAB>params.use_tpu = params.strategy_type == \"tpu\" <TAB>if not params.use_tpu: <TAB><TAB>params.override( <TAB><TAB><TAB>{ <TAB><TAB><TAB><TAB>\"architecture\": { <TAB><TAB><TAB><TAB><TAB>\"use_bfloat16\": False, <TAB><TAB><TAB><TAB>}, <TAB><TAB><TAB><TAB>\"norm_activation\": { <TAB><TAB><TAB><TAB><TAB>\"use_sync_bn\": False, <TAB><TAB><TAB><TAB>}, <TAB><TAB><TAB>}, <TAB><TAB><TAB>is_strict=True, <TAB><TAB>) <TAB>params.validate() <TAB>params.lock() <TAB>pp = pprint.PrettyPrinter() <TAB>params_str = pp.pformat(params.as_dict()) <TAB>logging.info(\"Model Parameters: %s\", params_str) <TAB>train_input_fn = None <TAB>eval_input_fn = None <TAB>training_file_pattern = ( <TAB><TAB>FLAGS.training_file_pattern or params.train.train_file_pattern <TAB>) <TAB>eval_file_pattern = FLAGS.eval_file_pattern or params.eval.eval_file_pattern <TAB>if not training_file_pattern and not eval_file_pattern: <TAB><TAB>raise ValueError( <TAB><TAB><TAB>\"Must provide at least one of training_file_pattern and \" <TAB><TAB><TAB>\"eval_file_pattern.\" <TAB><TAB>) <TAB>if training_file_pattern: <TAB><TAB># Use global batch size for single host. 
<TAB><TAB>train_input_fn = input_reader.InputFn( <TAB><TAB><TAB>file_pattern=training_file_pattern, <TAB><TAB><TAB>params=params, <TAB><TAB><TAB>mode=input_reader.ModeKeys.TRAIN, <TAB><TAB><TAB>batch_size=params.train.batch_size, <TAB><TAB>) <TAB>if eval_file_pattern: <TAB><TAB>eval_input_fn = input_reader.InputFn( <TAB><TAB><TAB>file_pattern=eval_file_pattern, <TAB><TAB><TAB>params=params, <TAB><TAB><TAB>mode=input_reader.ModeKeys.PREDICT_WITH_GT, <TAB><TAB><TAB>batch_size=params.eval.batch_size, <MASK> <TAB>if callbacks is None: <TAB><TAB>callbacks = [] <TAB>if FLAGS.log_steps: <TAB><TAB>callbacks.append( <TAB><TAB><TAB>keras_utils.TimeHistory( <TAB><TAB><TAB><TAB>batch_size=params.train.batch_size, <TAB><TAB><TAB><TAB>log_steps=FLAGS.log_steps, <TAB><TAB><TAB>) <TAB><TAB>) <TAB>return run_executor( <TAB><TAB>params, <TAB><TAB>FLAGS.mode, <TAB><TAB>checkpoint_path=FLAGS.checkpoint_path, <TAB><TAB>train_input_fn=train_input_fn, <TAB><TAB>eval_input_fn=eval_input_fn, <TAB><TAB>callbacks=callbacks, <TAB>) ", "medium_target": "<TAB><TAB><TAB>num_examples=params.eval.eval_samples, <TAB><TAB>)", "simple_masked_block": "<TAB><TAB>eval_input_fn = input_reader.InputFn(\n<TAB><TAB><TAB>file_pattern=eval_file_pattern,\n<TAB><TAB><TAB>params=params,\n<TAB><TAB><TAB>mode=input_reader.ModeKeys.PREDICT_WITH_GT,\n<TAB><TAB><TAB>batch_size=params.eval.batch_size,\n<MASK>", "simple_input": "def run(callbacks=None): <TAB>keras_utils.set_session_config(enable_xla=FLAGS.enable_xla) <TAB>params = config_factory.config_generator(FLAGS.model) <TAB>params = params_dict.override_params_dict(params, FLAGS.config_file, is_strict=True) <TAB>params = params_dict.override_params_dict( <TAB><TAB>params, FLAGS.params_override, is_strict=True <TAB>) <TAB>params.override( <TAB><TAB>{ <TAB><TAB><TAB>\"strategy_type\": FLAGS.strategy_type, <TAB><TAB><TAB>\"model_dir\": FLAGS.model_dir, <TAB><TAB><TAB>\"strategy_config\": executor.strategy_flags_dict(), <TAB><TAB>}, <TAB><TAB>is_strict=False, <TAB>) 
<TAB># Make sure use_tpu and strategy_type are in sync. <TAB>params.use_tpu = params.strategy_type == \"tpu\" <TAB>if not params.use_tpu: <TAB><TAB>params.override( <TAB><TAB><TAB>{ <TAB><TAB><TAB><TAB>\"architecture\": { <TAB><TAB><TAB><TAB><TAB>\"use_bfloat16\": False, <TAB><TAB><TAB><TAB>}, <TAB><TAB><TAB><TAB>\"norm_activation\": { <TAB><TAB><TAB><TAB><TAB>\"use_sync_bn\": False, <TAB><TAB><TAB><TAB>}, <TAB><TAB><TAB>}, <TAB><TAB><TAB>is_strict=True, <TAB><TAB>) <TAB>params.validate() <TAB>params.lock() <TAB>pp = pprint.PrettyPrinter() <TAB>params_str = pp.pformat(params.as_dict()) <TAB>logging.info(\"Model Parameters: %s\", params_str) <TAB>train_input_fn = None <TAB>eval_input_fn = None <TAB>training_file_pattern = ( <TAB><TAB>FLAGS.training_file_pattern or params.train.train_file_pattern <TAB>) <TAB>eval_file_pattern = FLAGS.eval_file_pattern or params.eval.eval_file_pattern <TAB>if not training_file_pattern and not eval_file_pattern: <TAB><TAB>raise ValueError( <TAB><TAB><TAB>\"Must provide at least one of training_file_pattern and \" <TAB><TAB><TAB>\"eval_file_pattern.\" <TAB><TAB>) <TAB>if training_file_pattern: <TAB><TAB># Use global batch size for single host. 
<TAB><TAB>train_input_fn = input_reader.InputFn( <TAB><TAB><TAB>file_pattern=training_file_pattern, <TAB><TAB><TAB>params=params, <TAB><TAB><TAB>mode=input_reader.ModeKeys.TRAIN, <TAB><TAB><TAB>batch_size=params.train.batch_size, <TAB><TAB>) <TAB>if eval_file_pattern: <TAB><TAB>eval_input_fn = input_reader.InputFn( <TAB><TAB><TAB>file_pattern=eval_file_pattern, <TAB><TAB><TAB>params=params, <TAB><TAB><TAB>mode=input_reader.ModeKeys.PREDICT_WITH_GT, <TAB><TAB><TAB>batch_size=params.eval.batch_size, <MASK> <TAB>if callbacks is None: <TAB><TAB>callbacks = [] <TAB>if FLAGS.log_steps: <TAB><TAB>callbacks.append( <TAB><TAB><TAB>keras_utils.TimeHistory( <TAB><TAB><TAB><TAB>batch_size=params.train.batch_size, <TAB><TAB><TAB><TAB>log_steps=FLAGS.log_steps, <TAB><TAB><TAB>) <TAB><TAB>) <TAB>return run_executor( <TAB><TAB>params, <TAB><TAB>FLAGS.mode, <TAB><TAB>checkpoint_path=FLAGS.checkpoint_path, <TAB><TAB>train_input_fn=train_input_fn, <TAB><TAB>eval_input_fn=eval_input_fn, <TAB><TAB>callbacks=callbacks, <TAB>) ", "simple_target": "<TAB><TAB><TAB>num_examples=params.eval.eval_samples, <TAB><TAB>)"}
{"method": "def run(callbacks=None):\n<TAB>keras_utils.set_session_config(enable_xla=FLAGS.enable_xla)\n<TAB>params = config_factory.config_generator(FLAGS.model)\n<TAB>params = params_dict.override_params_dict(params, FLAGS.config_file, is_strict=True)\n<TAB>params = params_dict.override_params_dict(\n<TAB><TAB>params, FLAGS.params_override, is_strict=True\n<TAB>)\n<TAB>params.override(\n<TAB><TAB>{\n<TAB><TAB><TAB>\"strategy_type\": FLAGS.strategy_type,\n<TAB><TAB><TAB>\"model_dir\": FLAGS.model_dir,\n<TAB><TAB><TAB>\"strategy_config\": executor.strategy_flags_dict(),\n<TAB><TAB>},\n<TAB><TAB>is_strict=False,\n<TAB>)\n<TAB># Make sure use_tpu and strategy_type are in sync.\n<TAB>params.use_tpu = params.strategy_type == \"tpu\"\n<TAB>if not params.use_tpu:\n<TAB><TAB>params.override(\n<TAB><TAB><TAB>{\n<TAB><TAB><TAB><TAB>\"architecture\": {\n<TAB><TAB><TAB><TAB><TAB>\"use_bfloat16\": False,\n<TAB><TAB><TAB><TAB>},\n<TAB><TAB><TAB><TAB>\"norm_activation\": {\n<TAB><TAB><TAB><TAB><TAB>\"use_sync_bn\": False,\n<TAB><TAB><TAB><TAB>},\n<TAB><TAB><TAB>},\n<TAB><TAB><TAB>is_strict=True,\n<TAB><TAB>)\n<TAB>params.validate()\n<TAB>params.lock()\n<TAB>pp = pprint.PrettyPrinter()\n<TAB>params_str = pp.pformat(params.as_dict())\n<TAB>logging.info(\"Model Parameters: %s\", params_str)\n<TAB>train_input_fn = None\n<TAB>eval_input_fn = None\n<TAB>training_file_pattern = (\n<TAB><TAB>FLAGS.training_file_pattern or params.train.train_file_pattern\n<TAB>)\n<TAB>eval_file_pattern = FLAGS.eval_file_pattern or params.eval.eval_file_pattern\n<TAB>if not training_file_pattern and not eval_file_pattern:\n<TAB><TAB>raise ValueError(\n<TAB><TAB><TAB>\"Must provide at least one of training_file_pattern and \"\n<TAB><TAB><TAB>\"eval_file_pattern.\"\n<TAB><TAB>)\n<TAB>if training_file_pattern:\n<TAB><TAB># Use global batch size for single host.\n<TAB><TAB>train_input_fn = 
input_reader.InputFn(\n<TAB><TAB><TAB>file_pattern=training_file_pattern,\n<TAB><TAB><TAB>params=params,\n<TAB><TAB><TAB>mode=input_reader.ModeKeys.TRAIN,\n<TAB><TAB><TAB>batch_size=params.train.batch_size,\n<TAB><TAB>)\n<TAB>if eval_file_pattern:\n<TAB><TAB>eval_input_fn = input_reader.InputFn(\n<TAB><TAB><TAB>file_pattern=eval_file_pattern,\n<TAB><TAB><TAB>params=params,\n<TAB><TAB><TAB>mode=input_reader.ModeKeys.PREDICT_WITH_GT,\n<TAB><TAB><TAB>batch_size=params.eval.batch_size,\n<TAB><TAB><TAB>num_examples=params.eval.eval_samples,\n<TAB><TAB>)\n<TAB>if callbacks is None:\n<TAB><TAB>callbacks = []\n<TAB>if FLAGS.log_steps:\n<TAB><TAB>callbacks.append(\n<TAB><TAB><TAB>keras_utils.TimeHistory(\n<TAB><TAB><TAB><TAB>batch_size=params.train.batch_size,\n<TAB><TAB><TAB><TAB>log_steps=FLAGS.log_steps,\n<TAB><TAB><TAB>)\n<TAB><TAB>)\n<TAB>return run_executor(\n<TAB><TAB>params,\n<TAB><TAB>FLAGS.mode,\n<TAB><TAB>checkpoint_path=FLAGS.checkpoint_path,\n<TAB><TAB>train_input_fn=train_input_fn,\n<TAB><TAB>eval_input_fn=eval_input_fn,\n<TAB><TAB>callbacks=callbacks,\n<TAB>)\n", "block": "<TAB><TAB>callbacks.append(\n<TAB><TAB><TAB>keras_utils.TimeHistory(\n<TAB><TAB><TAB><TAB>batch_size=params.train.batch_size,\n<TAB><TAB><TAB><TAB>log_steps=FLAGS.log_steps,\n<TAB><TAB><TAB>)\n<TAB><TAB>)", "complex_masked_block": "<TAB><TAB>callbacks.append(\n<MASK>", "complex_input": "def run(callbacks=None): <TAB>keras_utils.set_session_config(enable_xla=FLAGS.enable_xla) <TAB>params = config_factory.config_generator(FLAGS.model) <TAB>params = params_dict.override_params_dict(params, FLAGS.config_file, is_strict=True) <TAB>params = params_dict.override_params_dict( <TAB><TAB>params, FLAGS.params_override, is_strict=True <TAB>) <TAB>params.override( <TAB><TAB>{ <TAB><TAB><TAB>\"strategy_type\": FLAGS.strategy_type, <TAB><TAB><TAB>\"model_dir\": FLAGS.model_dir, <TAB><TAB><TAB>\"strategy_config\": executor.strategy_flags_dict(), <TAB><TAB>}, <TAB><TAB>is_strict=False, <TAB>) <TAB># Make 
sure use_tpu and strategy_type are in sync. <TAB>params.use_tpu = params.strategy_type == \"tpu\" <TAB>if not params.use_tpu: <TAB><TAB>params.override( <TAB><TAB><TAB>{ <TAB><TAB><TAB><TAB>\"architecture\": { <TAB><TAB><TAB><TAB><TAB>\"use_bfloat16\": False, <TAB><TAB><TAB><TAB>}, <TAB><TAB><TAB><TAB>\"norm_activation\": { <TAB><TAB><TAB><TAB><TAB>\"use_sync_bn\": False, <TAB><TAB><TAB><TAB>}, <TAB><TAB><TAB>}, <TAB><TAB><TAB>is_strict=True, <TAB><TAB>) <TAB>params.validate() <TAB>params.lock() <TAB>pp = pprint.PrettyPrinter() <TAB>params_str = pp.pformat(params.as_dict()) <TAB>logging.info(\"Model Parameters: %s\", params_str) <TAB>train_input_fn = None <TAB>eval_input_fn = None <TAB>training_file_pattern = ( <TAB><TAB>FLAGS.training_file_pattern or params.train.train_file_pattern <TAB>) <TAB>eval_file_pattern = FLAGS.eval_file_pattern or params.eval.eval_file_pattern <TAB>if not training_file_pattern and not eval_file_pattern: <TAB><TAB>raise ValueError( <TAB><TAB><TAB>\"Must provide at least one of training_file_pattern and \" <TAB><TAB><TAB>\"eval_file_pattern.\" <TAB><TAB>) <TAB>if training_file_pattern: <TAB><TAB># Use global batch size for single host. 
<TAB><TAB>train_input_fn = input_reader.InputFn( <TAB><TAB><TAB>file_pattern=training_file_pattern, <TAB><TAB><TAB>params=params, <TAB><TAB><TAB>mode=input_reader.ModeKeys.TRAIN, <TAB><TAB><TAB>batch_size=params.train.batch_size, <TAB><TAB>) <TAB>if eval_file_pattern: <TAB><TAB>eval_input_fn = input_reader.InputFn( <TAB><TAB><TAB>file_pattern=eval_file_pattern, <TAB><TAB><TAB>params=params, <TAB><TAB><TAB>mode=input_reader.ModeKeys.PREDICT_WITH_GT, <TAB><TAB><TAB>batch_size=params.eval.batch_size, <TAB><TAB><TAB>num_examples=params.eval.eval_samples, <TAB><TAB>) <TAB>if callbacks is None: <TAB><TAB>callbacks = [] <TAB>if FLAGS.log_steps: <TAB><TAB>callbacks.append( <MASK> <TAB>return run_executor( <TAB><TAB>params, <TAB><TAB>FLAGS.mode, <TAB><TAB>checkpoint_path=FLAGS.checkpoint_path, <TAB><TAB>train_input_fn=train_input_fn, <TAB><TAB>eval_input_fn=eval_input_fn, <TAB><TAB>callbacks=callbacks, <TAB>) ", "complex_target": "<TAB><TAB><TAB>keras_utils.TimeHistory( <TAB><TAB><TAB><TAB>batch_size=params.train.batch_size, <TAB><TAB><TAB><TAB>log_steps=FLAGS.log_steps, <TAB><TAB><TAB>) <TAB><TAB>)", "medium_masked_block": "<TAB><TAB>callbacks.append(\n<TAB><TAB><TAB>keras_utils.TimeHistory(\n<TAB><TAB><TAB><TAB>batch_size=params.train.batch_size,\n<MASK>", "medium_input": "def run(callbacks=None): <TAB>keras_utils.set_session_config(enable_xla=FLAGS.enable_xla) <TAB>params = config_factory.config_generator(FLAGS.model) <TAB>params = params_dict.override_params_dict(params, FLAGS.config_file, is_strict=True) <TAB>params = params_dict.override_params_dict( <TAB><TAB>params, FLAGS.params_override, is_strict=True <TAB>) <TAB>params.override( <TAB><TAB>{ <TAB><TAB><TAB>\"strategy_type\": FLAGS.strategy_type, <TAB><TAB><TAB>\"model_dir\": FLAGS.model_dir, <TAB><TAB><TAB>\"strategy_config\": executor.strategy_flags_dict(), <TAB><TAB>}, <TAB><TAB>is_strict=False, <TAB>) <TAB># Make sure use_tpu and strategy_type are in sync. 
<TAB>params.use_tpu = params.strategy_type == \"tpu\" <TAB>if not params.use_tpu: <TAB><TAB>params.override( <TAB><TAB><TAB>{ <TAB><TAB><TAB><TAB>\"architecture\": { <TAB><TAB><TAB><TAB><TAB>\"use_bfloat16\": False, <TAB><TAB><TAB><TAB>}, <TAB><TAB><TAB><TAB>\"norm_activation\": { <TAB><TAB><TAB><TAB><TAB>\"use_sync_bn\": False, <TAB><TAB><TAB><TAB>}, <TAB><TAB><TAB>}, <TAB><TAB><TAB>is_strict=True, <TAB><TAB>) <TAB>params.validate() <TAB>params.lock() <TAB>pp = pprint.PrettyPrinter() <TAB>params_str = pp.pformat(params.as_dict()) <TAB>logging.info(\"Model Parameters: %s\", params_str) <TAB>train_input_fn = None <TAB>eval_input_fn = None <TAB>training_file_pattern = ( <TAB><TAB>FLAGS.training_file_pattern or params.train.train_file_pattern <TAB>) <TAB>eval_file_pattern = FLAGS.eval_file_pattern or params.eval.eval_file_pattern <TAB>if not training_file_pattern and not eval_file_pattern: <TAB><TAB>raise ValueError( <TAB><TAB><TAB>\"Must provide at least one of training_file_pattern and \" <TAB><TAB><TAB>\"eval_file_pattern.\" <TAB><TAB>) <TAB>if training_file_pattern: <TAB><TAB># Use global batch size for single host. 
<TAB><TAB>train_input_fn = input_reader.InputFn( <TAB><TAB><TAB>file_pattern=training_file_pattern, <TAB><TAB><TAB>params=params, <TAB><TAB><TAB>mode=input_reader.ModeKeys.TRAIN, <TAB><TAB><TAB>batch_size=params.train.batch_size, <TAB><TAB>) <TAB>if eval_file_pattern: <TAB><TAB>eval_input_fn = input_reader.InputFn( <TAB><TAB><TAB>file_pattern=eval_file_pattern, <TAB><TAB><TAB>params=params, <TAB><TAB><TAB>mode=input_reader.ModeKeys.PREDICT_WITH_GT, <TAB><TAB><TAB>batch_size=params.eval.batch_size, <TAB><TAB><TAB>num_examples=params.eval.eval_samples, <TAB><TAB>) <TAB>if callbacks is None: <TAB><TAB>callbacks = [] <TAB>if FLAGS.log_steps: <TAB><TAB>callbacks.append( <TAB><TAB><TAB>keras_utils.TimeHistory( <TAB><TAB><TAB><TAB>batch_size=params.train.batch_size, <MASK> <TAB>return run_executor( <TAB><TAB>params, <TAB><TAB>FLAGS.mode, <TAB><TAB>checkpoint_path=FLAGS.checkpoint_path, <TAB><TAB>train_input_fn=train_input_fn, <TAB><TAB>eval_input_fn=eval_input_fn, <TAB><TAB>callbacks=callbacks, <TAB>) ", "medium_target": "<TAB><TAB><TAB><TAB>log_steps=FLAGS.log_steps, <TAB><TAB><TAB>) <TAB><TAB>)", "simple_masked_block": "<TAB><TAB>callbacks.append(\n<TAB><TAB><TAB>keras_utils.TimeHistory(\n<TAB><TAB><TAB><TAB>batch_size=params.train.batch_size,\n<MASK>", "simple_input": "def run(callbacks=None): <TAB>keras_utils.set_session_config(enable_xla=FLAGS.enable_xla) <TAB>params = config_factory.config_generator(FLAGS.model) <TAB>params = params_dict.override_params_dict(params, FLAGS.config_file, is_strict=True) <TAB>params = params_dict.override_params_dict( <TAB><TAB>params, FLAGS.params_override, is_strict=True <TAB>) <TAB>params.override( <TAB><TAB>{ <TAB><TAB><TAB>\"strategy_type\": FLAGS.strategy_type, <TAB><TAB><TAB>\"model_dir\": FLAGS.model_dir, <TAB><TAB><TAB>\"strategy_config\": executor.strategy_flags_dict(), <TAB><TAB>}, <TAB><TAB>is_strict=False, <TAB>) <TAB># Make sure use_tpu and strategy_type are in sync. 
<TAB>params.use_tpu = params.strategy_type == \"tpu\" <TAB>if not params.use_tpu: <TAB><TAB>params.override( <TAB><TAB><TAB>{ <TAB><TAB><TAB><TAB>\"architecture\": { <TAB><TAB><TAB><TAB><TAB>\"use_bfloat16\": False, <TAB><TAB><TAB><TAB>}, <TAB><TAB><TAB><TAB>\"norm_activation\": { <TAB><TAB><TAB><TAB><TAB>\"use_sync_bn\": False, <TAB><TAB><TAB><TAB>}, <TAB><TAB><TAB>}, <TAB><TAB><TAB>is_strict=True, <TAB><TAB>) <TAB>params.validate() <TAB>params.lock() <TAB>pp = pprint.PrettyPrinter() <TAB>params_str = pp.pformat(params.as_dict()) <TAB>logging.info(\"Model Parameters: %s\", params_str) <TAB>train_input_fn = None <TAB>eval_input_fn = None <TAB>training_file_pattern = ( <TAB><TAB>FLAGS.training_file_pattern or params.train.train_file_pattern <TAB>) <TAB>eval_file_pattern = FLAGS.eval_file_pattern or params.eval.eval_file_pattern <TAB>if not training_file_pattern and not eval_file_pattern: <TAB><TAB>raise ValueError( <TAB><TAB><TAB>\"Must provide at least one of training_file_pattern and \" <TAB><TAB><TAB>\"eval_file_pattern.\" <TAB><TAB>) <TAB>if training_file_pattern: <TAB><TAB># Use global batch size for single host. 
<TAB><TAB>train_input_fn = input_reader.InputFn( <TAB><TAB><TAB>file_pattern=training_file_pattern, <TAB><TAB><TAB>params=params, <TAB><TAB><TAB>mode=input_reader.ModeKeys.TRAIN, <TAB><TAB><TAB>batch_size=params.train.batch_size, <TAB><TAB>) <TAB>if eval_file_pattern: <TAB><TAB>eval_input_fn = input_reader.InputFn( <TAB><TAB><TAB>file_pattern=eval_file_pattern, <TAB><TAB><TAB>params=params, <TAB><TAB><TAB>mode=input_reader.ModeKeys.PREDICT_WITH_GT, <TAB><TAB><TAB>batch_size=params.eval.batch_size, <TAB><TAB><TAB>num_examples=params.eval.eval_samples, <TAB><TAB>) <TAB>if callbacks is None: <TAB><TAB>callbacks = [] <TAB>if FLAGS.log_steps: <TAB><TAB>callbacks.append( <TAB><TAB><TAB>keras_utils.TimeHistory( <TAB><TAB><TAB><TAB>batch_size=params.train.batch_size, <MASK> <TAB>return run_executor( <TAB><TAB>params, <TAB><TAB>FLAGS.mode, <TAB><TAB>checkpoint_path=FLAGS.checkpoint_path, <TAB><TAB>train_input_fn=train_input_fn, <TAB><TAB>eval_input_fn=eval_input_fn, <TAB><TAB>callbacks=callbacks, <TAB>) ", "simple_target": "<TAB><TAB><TAB><TAB>log_steps=FLAGS.log_steps, <TAB><TAB><TAB>) <TAB><TAB>)"}
{"method": "def check_surrounding_whitespaces(self, definition, docstring):\n<TAB>\"\"\"D210: No whitespaces allowed surrounding docstring text.\"\"\"\n<TAB>if docstring:\n<TAB><TAB>lines = ast.literal_eval(docstring).split(\"\\n\")\n<TAB><TAB>if lines[0].startswith(\" \") or len(lines) == 1 and lines[0].endswith(\" \"):\n<TAB><TAB><TAB>return violations.D210()\n", "block": "<TAB><TAB>lines = ast.literal_eval(docstring).split(\"\\n\")\n<TAB><TAB>if lines[0].startswith(\" \") or len(lines) == 1 and lines[0].endswith(\" \"):\n<TAB><TAB><TAB>return violations.D210()", "complex_masked_block": "<TAB><TAB>lines =<MASK>", "complex_input": "def check_surrounding_whitespaces(self, definition, docstring): <TAB>\"\"\"D210: No whitespaces allowed surrounding docstring text.\"\"\" <TAB>if docstring: <TAB><TAB>lines =<MASK> ", "complex_target": "ast.literal_eval(docstring).split(\"\\n\") <TAB><TAB>if lines[0].startswith(\" \") or len(lines) == 1 and lines[0].endswith(\" \"): <TAB><TAB><TAB>return violations.D210()", "medium_masked_block": "<TAB><TAB>lines = ast.literal_eval(docstring).split(\"\\n\")\n<TAB><TAB>if lines[0].startswith(\" \") or len(lines) == 1 and<MASK>", "medium_input": "def check_surrounding_whitespaces(self, definition, docstring): <TAB>\"\"\"D210: No whitespaces allowed surrounding docstring text.\"\"\" <TAB>if docstring: <TAB><TAB>lines = ast.literal_eval(docstring).split(\"\\n\") <TAB><TAB>if lines[0].startswith(\" \") or len(lines) == 1 and<MASK> ", "medium_target": "lines[0].endswith(\" \"): <TAB><TAB><TAB>return violations.D210()", "simple_masked_block": "<TAB><TAB>lines = ast.literal_eval(docstring).split(\"\\n\")\n<TAB><TAB>if lines[0].startswith(\" \") or len(lines) == 1 and lines[0].endswith(\" \"):\n<MASK>", "simple_input": "def check_surrounding_whitespaces(self, definition, docstring): <TAB>\"\"\"D210: No whitespaces allowed surrounding docstring text.\"\"\" <TAB>if docstring: <TAB><TAB>lines = ast.literal_eval(docstring).split(\"\\n\") 
<TAB><TAB>if lines[0].startswith(\" \") or len(lines) == 1 and lines[0].endswith(\" \"): <MASK> ", "simple_target": "<TAB><TAB><TAB>return violations.D210()"}
{"method": "def import_names():\n<TAB>plat_table = (\n<TAB><TAB>(\"windows\", (\"windows\")),\n<TAB><TAB>(\"darwin\", (\"darwin\", \"ios\")),\n<TAB><TAB>(\"linux\", (\"linux*\",)),\n<TAB><TAB>(\"freebsd\", (\"freebsd*\", \"openbsd*\")),\n<TAB><TAB>(\"poky\", (\"poky\",)),\n<TAB>)\n<TAB>arch_table = (\n<TAB><TAB>(\"x86\", (\"i386\", \"i486\", \"i586\", \"i686\")),\n<TAB><TAB>(\"x86_64\", (\"x64\", \"x86_64\", \"amd64\", \"intel\")),\n<TAB><TAB>(\"arm\", (\"armv5\",)),\n<TAB><TAB>(\"armv6\", (\"armv6l\",)),\n<TAB><TAB>(\"armv7\", (\"armv7l\",)),\n<TAB><TAB>(\"ppc64\", (\"ppc64le\",)),\n<TAB><TAB>(\"mips32\", (\"mips\",)),\n<TAB><TAB>(\"aarch32\", (\"aarch32\",)),\n<TAB><TAB>(\"aarch64\", (\"aarch64\", \"arm64\")),\n<TAB>)\n<TAB>plat = platform.system().lower()\n<TAB>mach = platform.machine().lower()\n<TAB>for alias, platlist in plat_table:\n<TAB><TAB>for s in platlist:\n<TAB><TAB><TAB>if s.startswith(plat):\n<TAB><TAB><TAB><TAB>plat = alias\n<TAB><TAB><TAB><TAB>break\n<TAB>if plat == \"linux\":\n<TAB><TAB>cname, cver = platform.libc_ver()\n<TAB><TAB>if cname == \"musl\":\n<TAB><TAB><TAB>plat = \"musl\"\n<TAB><TAB>elif cname == \"libc\":\n<TAB><TAB><TAB>plat = \"android\"\n<TAB>for alias, archlist in arch_table:\n<TAB><TAB>if mach in archlist:\n<TAB><TAB><TAB>mach = alias\n<TAB><TAB><TAB>break\n<TAB>if plat == \"windows\" and mach == \"x86_64\":\n<TAB><TAB>bitness = struct.calcsize(\"P\".encode()) * 8\n<TAB><TAB>if bitness == 32:\n<TAB><TAB><TAB>mach = \"x86\"\n<TAB>name = \".\".join([__name__, \"%s_%s\" % (plat, mach), \"pytransform\"])\n<TAB>m = __import__(name, globals(), locals(), [\"*\"])\n<TAB>sys.modules[__name__].__dict__.update(m.__dict__)\n", "block": "<TAB><TAB>cname, cver = platform.libc_ver()\n<TAB><TAB>if cname == \"musl\":\n<TAB><TAB><TAB>plat = \"musl\"\n<TAB><TAB>elif cname == \"libc\":\n<TAB><TAB><TAB>plat = \"android\"", "complex_masked_block": "<TAB><TAB>cname, cver =<MASK>", "complex_input": "def import_names(): <TAB>plat_table = ( 
<TAB><TAB>(\"windows\", (\"windows\")), <TAB><TAB>(\"darwin\", (\"darwin\", \"ios\")), <TAB><TAB>(\"linux\", (\"linux*\",)), <TAB><TAB>(\"freebsd\", (\"freebsd*\", \"openbsd*\")), <TAB><TAB>(\"poky\", (\"poky\",)), <TAB>) <TAB>arch_table = ( <TAB><TAB>(\"x86\", (\"i386\", \"i486\", \"i586\", \"i686\")), <TAB><TAB>(\"x86_64\", (\"x64\", \"x86_64\", \"amd64\", \"intel\")), <TAB><TAB>(\"arm\", (\"armv5\",)), <TAB><TAB>(\"armv6\", (\"armv6l\",)), <TAB><TAB>(\"armv7\", (\"armv7l\",)), <TAB><TAB>(\"ppc64\", (\"ppc64le\",)), <TAB><TAB>(\"mips32\", (\"mips\",)), <TAB><TAB>(\"aarch32\", (\"aarch32\",)), <TAB><TAB>(\"aarch64\", (\"aarch64\", \"arm64\")), <TAB>) <TAB>plat = platform.system().lower() <TAB>mach = platform.machine().lower() <TAB>for alias, platlist in plat_table: <TAB><TAB>for s in platlist: <TAB><TAB><TAB>if s.startswith(plat): <TAB><TAB><TAB><TAB>plat = alias <TAB><TAB><TAB><TAB>break <TAB>if plat == \"linux\": <TAB><TAB>cname, cver =<MASK> <TAB>for alias, archlist in arch_table: <TAB><TAB>if mach in archlist: <TAB><TAB><TAB>mach = alias <TAB><TAB><TAB>break <TAB>if plat == \"windows\" and mach == \"x86_64\": <TAB><TAB>bitness = struct.calcsize(\"P\".encode()) * 8 <TAB><TAB>if bitness == 32: <TAB><TAB><TAB>mach = \"x86\" <TAB>name = \".\".join([__name__, \"%s_%s\" % (plat, mach), \"pytransform\"]) <TAB>m = __import__(name, globals(), locals(), [\"*\"]) <TAB>sys.modules[__name__].__dict__.update(m.__dict__) ", "complex_target": "platform.libc_ver() <TAB><TAB>if cname == \"musl\": <TAB><TAB><TAB>plat = \"musl\" <TAB><TAB>elif cname == \"libc\": <TAB><TAB><TAB>plat = \"android\"", "medium_masked_block": "<TAB><TAB>cname, cver = platform.libc_ver()\n<TAB><TAB>if cname == \"musl\":\n<TAB><TAB><TAB>plat =<MASK>", "medium_input": "def import_names(): <TAB>plat_table = ( <TAB><TAB>(\"windows\", (\"windows\")), <TAB><TAB>(\"darwin\", (\"darwin\", \"ios\")), <TAB><TAB>(\"linux\", (\"linux*\",)), <TAB><TAB>(\"freebsd\", (\"freebsd*\", \"openbsd*\")), <TAB><TAB>(\"poky\", 
(\"poky\",)), <TAB>) <TAB>arch_table = ( <TAB><TAB>(\"x86\", (\"i386\", \"i486\", \"i586\", \"i686\")), <TAB><TAB>(\"x86_64\", (\"x64\", \"x86_64\", \"amd64\", \"intel\")), <TAB><TAB>(\"arm\", (\"armv5\",)), <TAB><TAB>(\"armv6\", (\"armv6l\",)), <TAB><TAB>(\"armv7\", (\"armv7l\",)), <TAB><TAB>(\"ppc64\", (\"ppc64le\",)), <TAB><TAB>(\"mips32\", (\"mips\",)), <TAB><TAB>(\"aarch32\", (\"aarch32\",)), <TAB><TAB>(\"aarch64\", (\"aarch64\", \"arm64\")), <TAB>) <TAB>plat = platform.system().lower() <TAB>mach = platform.machine().lower() <TAB>for alias, platlist in plat_table: <TAB><TAB>for s in platlist: <TAB><TAB><TAB>if s.startswith(plat): <TAB><TAB><TAB><TAB>plat = alias <TAB><TAB><TAB><TAB>break <TAB>if plat == \"linux\": <TAB><TAB>cname, cver = platform.libc_ver() <TAB><TAB>if cname == \"musl\": <TAB><TAB><TAB>plat =<MASK> <TAB>for alias, archlist in arch_table: <TAB><TAB>if mach in archlist: <TAB><TAB><TAB>mach = alias <TAB><TAB><TAB>break <TAB>if plat == \"windows\" and mach == \"x86_64\": <TAB><TAB>bitness = struct.calcsize(\"P\".encode()) * 8 <TAB><TAB>if bitness == 32: <TAB><TAB><TAB>mach = \"x86\" <TAB>name = \".\".join([__name__, \"%s_%s\" % (plat, mach), \"pytransform\"]) <TAB>m = __import__(name, globals(), locals(), [\"*\"]) <TAB>sys.modules[__name__].__dict__.update(m.__dict__) ", "medium_target": "\"musl\" <TAB><TAB>elif cname == \"libc\": <TAB><TAB><TAB>plat = \"android\"", "simple_masked_block": "<TAB><TAB>cname, cver = platform.libc_ver()\n<TAB><TAB>if cname == \"musl\":\n<TAB><TAB><TAB>plat = \"musl\"\n<TAB><TAB>elif cname ==<MASK>", "simple_input": "def import_names(): <TAB>plat_table = ( <TAB><TAB>(\"windows\", (\"windows\")), <TAB><TAB>(\"darwin\", (\"darwin\", \"ios\")), <TAB><TAB>(\"linux\", (\"linux*\",)), <TAB><TAB>(\"freebsd\", (\"freebsd*\", \"openbsd*\")), <TAB><TAB>(\"poky\", (\"poky\",)), <TAB>) <TAB>arch_table = ( <TAB><TAB>(\"x86\", (\"i386\", \"i486\", \"i586\", \"i686\")), <TAB><TAB>(\"x86_64\", (\"x64\", \"x86_64\", \"amd64\", 
\"intel\")), <TAB><TAB>(\"arm\", (\"armv5\",)), <TAB><TAB>(\"armv6\", (\"armv6l\",)), <TAB><TAB>(\"armv7\", (\"armv7l\",)), <TAB><TAB>(\"ppc64\", (\"ppc64le\",)), <TAB><TAB>(\"mips32\", (\"mips\",)), <TAB><TAB>(\"aarch32\", (\"aarch32\",)), <TAB><TAB>(\"aarch64\", (\"aarch64\", \"arm64\")), <TAB>) <TAB>plat = platform.system().lower() <TAB>mach = platform.machine().lower() <TAB>for alias, platlist in plat_table: <TAB><TAB>for s in platlist: <TAB><TAB><TAB>if s.startswith(plat): <TAB><TAB><TAB><TAB>plat = alias <TAB><TAB><TAB><TAB>break <TAB>if plat == \"linux\": <TAB><TAB>cname, cver = platform.libc_ver() <TAB><TAB>if cname == \"musl\": <TAB><TAB><TAB>plat = \"musl\" <TAB><TAB>elif cname ==<MASK> <TAB>for alias, archlist in arch_table: <TAB><TAB>if mach in archlist: <TAB><TAB><TAB>mach = alias <TAB><TAB><TAB>break <TAB>if plat == \"windows\" and mach == \"x86_64\": <TAB><TAB>bitness = struct.calcsize(\"P\".encode()) * 8 <TAB><TAB>if bitness == 32: <TAB><TAB><TAB>mach = \"x86\" <TAB>name = \".\".join([__name__, \"%s_%s\" % (plat, mach), \"pytransform\"]) <TAB>m = __import__(name, globals(), locals(), [\"*\"]) <TAB>sys.modules[__name__].__dict__.update(m.__dict__) ", "simple_target": "\"libc\": <TAB><TAB><TAB>plat = \"android\""}
{"method": "def _get_troubleshooting_result_initial(\n<TAB>self,\n<TAB>resource_group_name: str,\n<TAB>network_watcher_name: str,\n<TAB>parameters: \"_models.QueryTroubleshootingParameters\",\n<TAB>**kwargs\n) -> \"_models.TroubleshootingResult\":\n<TAB>cls = kwargs.pop(\"cls\", None) # type: ClsType[\"_models.TroubleshootingResult\"]\n<TAB>error_map = {\n<TAB><TAB>401: ClientAuthenticationError,\n<TAB><TAB>404: ResourceNotFoundError,\n<TAB><TAB>409: ResourceExistsError,\n<TAB>}\n<TAB>error_map.update(kwargs.pop(\"error_map\", {}))\n<TAB>api_version = \"2019-09-01\"\n<TAB>content_type = kwargs.pop(\"content_type\", \"application/json\")\n<TAB>accept = \"application/json\"\n<TAB># Construct URL\n<TAB>url = self._get_troubleshooting_result_initial.metadata[\"url\"] # type: ignore\n<TAB>path_format_arguments = {\n<TAB><TAB>\"resourceGroupName\": self._serialize.url(\n<TAB><TAB><TAB>\"resource_group_name\", resource_group_name, \"str\"\n<TAB><TAB>),\n<TAB><TAB>\"networkWatcherName\": self._serialize.url(\n<TAB><TAB><TAB>\"network_watcher_name\", network_watcher_name, \"str\"\n<TAB><TAB>),\n<TAB><TAB>\"subscriptionId\": self._serialize.url(\n<TAB><TAB><TAB>\"self._config.subscription_id\", self._config.subscription_id, \"str\"\n<TAB><TAB>),\n<TAB>}\n<TAB>url = self._client.format_url(url, **path_format_arguments)\n<TAB># Construct parameters\n<TAB>query_parameters = {} # type: Dict[str, Any]\n<TAB>query_parameters[\"api-version\"] = self._serialize.query(\n<TAB><TAB>\"api_version\", api_version, \"str\"\n<TAB>)\n<TAB># Construct headers\n<TAB>header_parameters = {} # type: Dict[str, Any]\n<TAB>header_parameters[\"Content-Type\"] = self._serialize.header(\n<TAB><TAB>\"content_type\", content_type, \"str\"\n<TAB>)\n<TAB>header_parameters[\"Accept\"] = self._serialize.header(\"accept\", accept, \"str\")\n<TAB>body_content_kwargs = {} # type: Dict[str, Any]\n<TAB>body_content = self._serialize.body(parameters, 
\"QueryTroubleshootingParameters\")\n<TAB>body_content_kwargs[\"content\"] = body_content\n<TAB>request = self._client.post(\n<TAB><TAB>url, query_parameters, header_parameters, **body_content_kwargs\n<TAB>)\n<TAB>pipeline_response = await self._client._pipeline.run(\n<TAB><TAB>request, stream=False, **kwargs\n<TAB>)\n<TAB>response = pipeline_response.http_response\n<TAB>if response.status_code not in [200, 202]:\n<TAB><TAB>map_error(\n<TAB><TAB><TAB>status_code=response.status_code, response=response, error_map=error_map\n<TAB><TAB>)\n<TAB><TAB>error = self._deserialize(_models.ErrorResponse, response)\n<TAB><TAB>raise HttpResponseError(\n<TAB><TAB><TAB>response=response, model=error, error_format=ARMErrorFormat\n<TAB><TAB>)\n<TAB>if response.status_code == 200:\n<TAB><TAB>deserialized = self._deserialize(\"TroubleshootingResult\", pipeline_response)\n<TAB>if response.status_code == 202:\n<TAB><TAB>deserialized = self._deserialize(\"TroubleshootingResult\", pipeline_response)\n<TAB>if cls:\n<TAB><TAB>return cls(pipeline_response, deserialized, {})\n<TAB>return deserialized\n", "block": "<TAB><TAB>map_error(\n<TAB><TAB><TAB>status_code=response.status_code, response=response, error_map=error_map\n<TAB><TAB>)\n<TAB><TAB>error = self._deserialize(_models.ErrorResponse, response)\n<TAB><TAB>raise HttpResponseError(\n<TAB><TAB><TAB>response=response, model=error, error_format=ARMErrorFormat\n<TAB><TAB>)", "complex_masked_block": "<TAB><TAB>map_error(\n<TAB><TAB><TAB>status_code=response.status_code, response=response, error_map=error_map\n<TAB><TAB>)\n<MASK>", "complex_input": "def _get_troubleshooting_result_initial( <TAB>self, <TAB>resource_group_name: str, <TAB>network_watcher_name: str, <TAB>parameters: \"_models.QueryTroubleshootingParameters\", <TAB>**kwargs ) -> \"_models.TroubleshootingResult\": <TAB>cls = kwargs.pop(\"cls\", None) # type: ClsType[\"_models.TroubleshootingResult\"] <TAB>error_map = { <TAB><TAB>401: ClientAuthenticationError, <TAB><TAB>404: 
ResourceNotFoundError, <TAB><TAB>409: ResourceExistsError, <TAB>} <TAB>error_map.update(kwargs.pop(\"error_map\", {})) <TAB>api_version = \"2019-09-01\" <TAB>content_type = kwargs.pop(\"content_type\", \"application/json\") <TAB>accept = \"application/json\" <TAB># Construct URL <TAB>url = self._get_troubleshooting_result_initial.metadata[\"url\"] # type: ignore <TAB>path_format_arguments = { <TAB><TAB>\"resourceGroupName\": self._serialize.url( <TAB><TAB><TAB>\"resource_group_name\", resource_group_name, \"str\" <TAB><TAB>), <TAB><TAB>\"networkWatcherName\": self._serialize.url( <TAB><TAB><TAB>\"network_watcher_name\", network_watcher_name, \"str\" <TAB><TAB>), <TAB><TAB>\"subscriptionId\": self._serialize.url( <TAB><TAB><TAB>\"self._config.subscription_id\", self._config.subscription_id, \"str\" <TAB><TAB>), <TAB>} <TAB>url = self._client.format_url(url, **path_format_arguments) <TAB># Construct parameters <TAB>query_parameters = {} # type: Dict[str, Any] <TAB>query_parameters[\"api-version\"] = self._serialize.query( <TAB><TAB>\"api_version\", api_version, \"str\" <TAB>) <TAB># Construct headers <TAB>header_parameters = {} # type: Dict[str, Any] <TAB>header_parameters[\"Content-Type\"] = self._serialize.header( <TAB><TAB>\"content_type\", content_type, \"str\" <TAB>) <TAB>header_parameters[\"Accept\"] = self._serialize.header(\"accept\", accept, \"str\") <TAB>body_content_kwargs = {} # type: Dict[str, Any] <TAB>body_content = self._serialize.body(parameters, \"QueryTroubleshootingParameters\") <TAB>body_content_kwargs[\"content\"] = body_content <TAB>request = self._client.post( <TAB><TAB>url, query_parameters, header_parameters, **body_content_kwargs <TAB>) <TAB>pipeline_response = await self._client._pipeline.run( <TAB><TAB>request, stream=False, **kwargs <TAB>) <TAB>response = pipeline_response.http_response <TAB>if response.status_code not in [200, 202]: <TAB><TAB>map_error( <TAB><TAB><TAB>status_code=response.status_code, response=response, 
error_map=error_map <TAB><TAB>) <MASK> <TAB>if response.status_code == 200: <TAB><TAB>deserialized = self._deserialize(\"TroubleshootingResult\", pipeline_response) <TAB>if response.status_code == 202: <TAB><TAB>deserialized = self._deserialize(\"TroubleshootingResult\", pipeline_response) <TAB>if cls: <TAB><TAB>return cls(pipeline_response, deserialized, {}) <TAB>return deserialized ", "complex_target": "<TAB><TAB>error = self._deserialize(_models.ErrorResponse, response) <TAB><TAB>raise HttpResponseError( <TAB><TAB><TAB>response=response, model=error, error_format=ARMErrorFormat <TAB><TAB>)", "medium_masked_block": "<TAB><TAB>map_error(\n<TAB><TAB><TAB>status_code=response.status_code, response=response, error_map=error_map\n<TAB><TAB>)\n<TAB><TAB>error = self._deserialize(_models.ErrorResponse, response)\n<TAB><TAB>raise HttpResponseError(\n<MASK>", "medium_input": "def _get_troubleshooting_result_initial( <TAB>self, <TAB>resource_group_name: str, <TAB>network_watcher_name: str, <TAB>parameters: \"_models.QueryTroubleshootingParameters\", <TAB>**kwargs ) -> \"_models.TroubleshootingResult\": <TAB>cls = kwargs.pop(\"cls\", None) # type: ClsType[\"_models.TroubleshootingResult\"] <TAB>error_map = { <TAB><TAB>401: ClientAuthenticationError, <TAB><TAB>404: ResourceNotFoundError, <TAB><TAB>409: ResourceExistsError, <TAB>} <TAB>error_map.update(kwargs.pop(\"error_map\", {})) <TAB>api_version = \"2019-09-01\" <TAB>content_type = kwargs.pop(\"content_type\", \"application/json\") <TAB>accept = \"application/json\" <TAB># Construct URL <TAB>url = self._get_troubleshooting_result_initial.metadata[\"url\"] # type: ignore <TAB>path_format_arguments = { <TAB><TAB>\"resourceGroupName\": self._serialize.url( <TAB><TAB><TAB>\"resource_group_name\", resource_group_name, \"str\" <TAB><TAB>), <TAB><TAB>\"networkWatcherName\": self._serialize.url( <TAB><TAB><TAB>\"network_watcher_name\", network_watcher_name, \"str\" <TAB><TAB>), <TAB><TAB>\"subscriptionId\": self._serialize.url( 
<TAB><TAB><TAB>\"self._config.subscription_id\", self._config.subscription_id, \"str\" <TAB><TAB>), <TAB>} <TAB>url = self._client.format_url(url, **path_format_arguments) <TAB># Construct parameters <TAB>query_parameters = {} # type: Dict[str, Any] <TAB>query_parameters[\"api-version\"] = self._serialize.query( <TAB><TAB>\"api_version\", api_version, \"str\" <TAB>) <TAB># Construct headers <TAB>header_parameters = {} # type: Dict[str, Any] <TAB>header_parameters[\"Content-Type\"] = self._serialize.header( <TAB><TAB>\"content_type\", content_type, \"str\" <TAB>) <TAB>header_parameters[\"Accept\"] = self._serialize.header(\"accept\", accept, \"str\") <TAB>body_content_kwargs = {} # type: Dict[str, Any] <TAB>body_content = self._serialize.body(parameters, \"QueryTroubleshootingParameters\") <TAB>body_content_kwargs[\"content\"] = body_content <TAB>request = self._client.post( <TAB><TAB>url, query_parameters, header_parameters, **body_content_kwargs <TAB>) <TAB>pipeline_response = await self._client._pipeline.run( <TAB><TAB>request, stream=False, **kwargs <TAB>) <TAB>response = pipeline_response.http_response <TAB>if response.status_code not in [200, 202]: <TAB><TAB>map_error( <TAB><TAB><TAB>status_code=response.status_code, response=response, error_map=error_map <TAB><TAB>) <TAB><TAB>error = self._deserialize(_models.ErrorResponse, response) <TAB><TAB>raise HttpResponseError( <MASK> <TAB>if response.status_code == 200: <TAB><TAB>deserialized = self._deserialize(\"TroubleshootingResult\", pipeline_response) <TAB>if response.status_code == 202: <TAB><TAB>deserialized = self._deserialize(\"TroubleshootingResult\", pipeline_response) <TAB>if cls: <TAB><TAB>return cls(pipeline_response, deserialized, {}) <TAB>return deserialized ", "medium_target": "<TAB><TAB><TAB>response=response, model=error, error_format=ARMErrorFormat <TAB><TAB>)", "simple_masked_block": "<TAB><TAB>map_error(\n<TAB><TAB><TAB>status_code=response.status_code, response=response, 
error_map=error_map\n<TAB><TAB>)\n<TAB><TAB>error = self._deserialize(_models.ErrorResponse, response)\n<TAB><TAB>raise HttpResponseError(\n<TAB><TAB><TAB>response=response, model=error,<MASK>", "simple_input": "def _get_troubleshooting_result_initial( <TAB>self, <TAB>resource_group_name: str, <TAB>network_watcher_name: str, <TAB>parameters: \"_models.QueryTroubleshootingParameters\", <TAB>**kwargs ) -> \"_models.TroubleshootingResult\": <TAB>cls = kwargs.pop(\"cls\", None) # type: ClsType[\"_models.TroubleshootingResult\"] <TAB>error_map = { <TAB><TAB>401: ClientAuthenticationError, <TAB><TAB>404: ResourceNotFoundError, <TAB><TAB>409: ResourceExistsError, <TAB>} <TAB>error_map.update(kwargs.pop(\"error_map\", {})) <TAB>api_version = \"2019-09-01\" <TAB>content_type = kwargs.pop(\"content_type\", \"application/json\") <TAB>accept = \"application/json\" <TAB># Construct URL <TAB>url = self._get_troubleshooting_result_initial.metadata[\"url\"] # type: ignore <TAB>path_format_arguments = { <TAB><TAB>\"resourceGroupName\": self._serialize.url( <TAB><TAB><TAB>\"resource_group_name\", resource_group_name, \"str\" <TAB><TAB>), <TAB><TAB>\"networkWatcherName\": self._serialize.url( <TAB><TAB><TAB>\"network_watcher_name\", network_watcher_name, \"str\" <TAB><TAB>), <TAB><TAB>\"subscriptionId\": self._serialize.url( <TAB><TAB><TAB>\"self._config.subscription_id\", self._config.subscription_id, \"str\" <TAB><TAB>), <TAB>} <TAB>url = self._client.format_url(url, **path_format_arguments) <TAB># Construct parameters <TAB>query_parameters = {} # type: Dict[str, Any] <TAB>query_parameters[\"api-version\"] = self._serialize.query( <TAB><TAB>\"api_version\", api_version, \"str\" <TAB>) <TAB># Construct headers <TAB>header_parameters = {} # type: Dict[str, Any] <TAB>header_parameters[\"Content-Type\"] = self._serialize.header( <TAB><TAB>\"content_type\", content_type, \"str\" <TAB>) <TAB>header_parameters[\"Accept\"] = self._serialize.header(\"accept\", accept, \"str\") 
<TAB>body_content_kwargs = {} # type: Dict[str, Any] <TAB>body_content = self._serialize.body(parameters, \"QueryTroubleshootingParameters\") <TAB>body_content_kwargs[\"content\"] = body_content <TAB>request = self._client.post( <TAB><TAB>url, query_parameters, header_parameters, **body_content_kwargs <TAB>) <TAB>pipeline_response = await self._client._pipeline.run( <TAB><TAB>request, stream=False, **kwargs <TAB>) <TAB>response = pipeline_response.http_response <TAB>if response.status_code not in [200, 202]: <TAB><TAB>map_error( <TAB><TAB><TAB>status_code=response.status_code, response=response, error_map=error_map <TAB><TAB>) <TAB><TAB>error = self._deserialize(_models.ErrorResponse, response) <TAB><TAB>raise HttpResponseError( <TAB><TAB><TAB>response=response, model=error,<MASK> <TAB>if response.status_code == 200: <TAB><TAB>deserialized = self._deserialize(\"TroubleshootingResult\", pipeline_response) <TAB>if response.status_code == 202: <TAB><TAB>deserialized = self._deserialize(\"TroubleshootingResult\", pipeline_response) <TAB>if cls: <TAB><TAB>return cls(pipeline_response, deserialized, {}) <TAB>return deserialized ", "simple_target": "error_format=ARMErrorFormat <TAB><TAB>)"}
{"method": "def sftp_copy_from_json(self, input):\n<TAB>try:\n<TAB><TAB>src_path = input.get(\"src_path\")\n<TAB><TAB>dest_path = input.get(\"dest_path\")\n<TAB><TAB>src_host = input.get(\"src_host\")\n<TAB><TAB>src_port = input.get(\"src_port\")\n<TAB><TAB>src_username = input.get(\"src_username\")\n<TAB><TAB>src_password = input.get(\"src_password\")\n<TAB><TAB>dest_host = input.get(\"dest_host\")\n<TAB><TAB>dest_port = input.get(\"dest_port\")\n<TAB><TAB>dest_username = input.get(\"dest_username\")\n<TAB><TAB>dest_password = input.get(\"dest_password\")\n<TAB>except:\n<TAB><TAB>return \"Couldn't get all objects\"\n<TAB>curr_dir = os.getcwd()\n<TAB>temp_dir = os.path.join(curr_dir, r\"temp_data\")\n<TAB>os.makedirs(temp_dir)\n<TAB>async with asyncssh.connect(\n<TAB><TAB>host=src_host,\n<TAB><TAB>port=src_port,\n<TAB><TAB>username=src_username,\n<TAB><TAB>password=src_password,\n<TAB><TAB>known_hosts=None,\n<TAB>) as conn:\n<TAB><TAB>async with asyncssh.connect(\n<TAB><TAB><TAB>host=dest_host,\n<TAB><TAB><TAB>port=dest_port,\n<TAB><TAB><TAB>username=dest_username,\n<TAB><TAB><TAB>password=dest_password,\n<TAB><TAB><TAB>tunnel=conn,\n<TAB><TAB><TAB>known_hosts=None,\n<TAB><TAB>) as tunneled_conn:\n<TAB><TAB><TAB># grab remote file, place in container\n<TAB><TAB><TAB>async with conn.start_sftp_client() as sftp:\n<TAB><TAB><TAB><TAB>results = await sftp.get(src_path, temp_dir)\n<TAB><TAB><TAB>spliced_path = src_path.split(\"/\")\n<TAB><TAB><TAB>file_name = spliced_path[len(spliced_path) - 1]\n<TAB><TAB><TAB># copy grabbed file to desired location\n<TAB><TAB><TAB>async with tunneled_conn.start_sftp_client() as sftp2:\n<TAB><TAB><TAB><TAB>results2 = await sftp2.put(temp_dir + \"/\" + file_name, dest_path)\n<TAB># cleaning up temp file\n<TAB>for file in os.listdir(temp_dir):\n<TAB><TAB>file_path = os.path.join(temp_dir, file)\n<TAB><TAB>if os.path.isfile(file_path):\n<TAB><TAB><TAB>os.remove(file_path)\n<TAB>os.rmdir(temp_dir)\n<TAB>return \"Successfully Copied 
File.\"\n", "block": "<TAB><TAB>src_path = input.get(\"src_path\")\n<TAB><TAB>dest_path = input.get(\"dest_path\")\n<TAB><TAB>src_host = input.get(\"src_host\")\n<TAB><TAB>src_port = input.get(\"src_port\")\n<TAB><TAB>src_username = input.get(\"src_username\")\n<TAB><TAB>src_password = input.get(\"src_password\")\n<TAB><TAB>dest_host = input.get(\"dest_host\")\n<TAB><TAB>dest_port = input.get(\"dest_port\")\n<TAB><TAB>dest_username = input.get(\"dest_username\")\n<TAB><TAB>dest_password = input.get(\"dest_password\")", "complex_masked_block": "<TAB><TAB>src_path = input.get(\"src_path\")\n<TAB><TAB>dest_path = input.get(\"dest_path\")\n<TAB><TAB>src_host = input.get(\"src_host\")\n<TAB><TAB>src_port = input.get(\"src_port\")\n<TAB><TAB>src_username = input.get(\"src_username\")\n<TAB><TAB>src_password = input.get(\"src_password\")\n<TAB><TAB>dest_host =<MASK>", "complex_input": "def sftp_copy_from_json(self, input): <TAB>try: <TAB><TAB>src_path = input.get(\"src_path\") <TAB><TAB>dest_path = input.get(\"dest_path\") <TAB><TAB>src_host = input.get(\"src_host\") <TAB><TAB>src_port = input.get(\"src_port\") <TAB><TAB>src_username = input.get(\"src_username\") <TAB><TAB>src_password = input.get(\"src_password\") <TAB><TAB>dest_host =<MASK> <TAB>except: <TAB><TAB>return \"Couldn't get all objects\" <TAB>curr_dir = os.getcwd() <TAB>temp_dir = os.path.join(curr_dir, r\"temp_data\") <TAB>os.makedirs(temp_dir) <TAB>async with asyncssh.connect( <TAB><TAB>host=src_host, <TAB><TAB>port=src_port, <TAB><TAB>username=src_username, <TAB><TAB>password=src_password, <TAB><TAB>known_hosts=None, <TAB>) as conn: <TAB><TAB>async with asyncssh.connect( <TAB><TAB><TAB>host=dest_host, <TAB><TAB><TAB>port=dest_port, <TAB><TAB><TAB>username=dest_username, <TAB><TAB><TAB>password=dest_password, <TAB><TAB><TAB>tunnel=conn, <TAB><TAB><TAB>known_hosts=None, <TAB><TAB>) as tunneled_conn: <TAB><TAB><TAB># grab remote file, place in container <TAB><TAB><TAB>async with conn.start_sftp_client() as 
sftp: <TAB><TAB><TAB><TAB>results = await sftp.get(src_path, temp_dir) <TAB><TAB><TAB>spliced_path = src_path.split(\"/\") <TAB><TAB><TAB>file_name = spliced_path[len(spliced_path) - 1] <TAB><TAB><TAB># copy grabbed file to desired location <TAB><TAB><TAB>async with tunneled_conn.start_sftp_client() as sftp2: <TAB><TAB><TAB><TAB>results2 = await sftp2.put(temp_dir + \"/\" + file_name, dest_path) <TAB># cleaning up temp file <TAB>for file in os.listdir(temp_dir): <TAB><TAB>file_path = os.path.join(temp_dir, file) <TAB><TAB>if os.path.isfile(file_path): <TAB><TAB><TAB>os.remove(file_path) <TAB>os.rmdir(temp_dir) <TAB>return \"Successfully Copied File.\" ", "complex_target": "input.get(\"dest_host\") <TAB><TAB>dest_port = input.get(\"dest_port\") <TAB><TAB>dest_username = input.get(\"dest_username\") <TAB><TAB>dest_password = input.get(\"dest_password\")", "medium_masked_block": "<TAB><TAB>src_path = input.get(\"src_path\")\n<TAB><TAB>dest_path = input.get(\"dest_path\")\n<TAB><TAB>src_host = input.get(\"src_host\")\n<TAB><TAB>src_port = input.get(\"src_port\")\n<TAB><TAB>src_username = input.get(\"src_username\")\n<TAB><TAB>src_password = input.get(\"src_password\")\n<TAB><TAB>dest_host = input.get(\"dest_host\")\n<TAB><TAB>dest_port = input.get(\"dest_port\")\n<TAB><TAB>dest_username =<MASK>", "medium_input": "def sftp_copy_from_json(self, input): <TAB>try: <TAB><TAB>src_path = input.get(\"src_path\") <TAB><TAB>dest_path = input.get(\"dest_path\") <TAB><TAB>src_host = input.get(\"src_host\") <TAB><TAB>src_port = input.get(\"src_port\") <TAB><TAB>src_username = input.get(\"src_username\") <TAB><TAB>src_password = input.get(\"src_password\") <TAB><TAB>dest_host = input.get(\"dest_host\") <TAB><TAB>dest_port = input.get(\"dest_port\") <TAB><TAB>dest_username =<MASK> <TAB>except: <TAB><TAB>return \"Couldn't get all objects\" <TAB>curr_dir = os.getcwd() <TAB>temp_dir = os.path.join(curr_dir, r\"temp_data\") <TAB>os.makedirs(temp_dir) <TAB>async with asyncssh.connect( 
<TAB><TAB>host=src_host, <TAB><TAB>port=src_port, <TAB><TAB>username=src_username, <TAB><TAB>password=src_password, <TAB><TAB>known_hosts=None, <TAB>) as conn: <TAB><TAB>async with asyncssh.connect( <TAB><TAB><TAB>host=dest_host, <TAB><TAB><TAB>port=dest_port, <TAB><TAB><TAB>username=dest_username, <TAB><TAB><TAB>password=dest_password, <TAB><TAB><TAB>tunnel=conn, <TAB><TAB><TAB>known_hosts=None, <TAB><TAB>) as tunneled_conn: <TAB><TAB><TAB># grab remote file, place in container <TAB><TAB><TAB>async with conn.start_sftp_client() as sftp: <TAB><TAB><TAB><TAB>results = await sftp.get(src_path, temp_dir) <TAB><TAB><TAB>spliced_path = src_path.split(\"/\") <TAB><TAB><TAB>file_name = spliced_path[len(spliced_path) - 1] <TAB><TAB><TAB># copy grabbed file to desired location <TAB><TAB><TAB>async with tunneled_conn.start_sftp_client() as sftp2: <TAB><TAB><TAB><TAB>results2 = await sftp2.put(temp_dir + \"/\" + file_name, dest_path) <TAB># cleaning up temp file <TAB>for file in os.listdir(temp_dir): <TAB><TAB>file_path = os.path.join(temp_dir, file) <TAB><TAB>if os.path.isfile(file_path): <TAB><TAB><TAB>os.remove(file_path) <TAB>os.rmdir(temp_dir) <TAB>return \"Successfully Copied File.\" ", "medium_target": "input.get(\"dest_username\") <TAB><TAB>dest_password = input.get(\"dest_password\")", "simple_masked_block": "<TAB><TAB>src_path = input.get(\"src_path\")\n<TAB><TAB>dest_path = input.get(\"dest_path\")\n<TAB><TAB>src_host = input.get(\"src_host\")\n<TAB><TAB>src_port = input.get(\"src_port\")\n<TAB><TAB>src_username = input.get(\"src_username\")\n<TAB><TAB>src_password = input.get(\"src_password\")\n<TAB><TAB>dest_host = input.get(\"dest_host\")\n<TAB><TAB>dest_port = input.get(\"dest_port\")\n<TAB><TAB>dest_username = input.get(\"dest_username\")\n<MASK>", "simple_input": "def sftp_copy_from_json(self, input): <TAB>try: <TAB><TAB>src_path = input.get(\"src_path\") <TAB><TAB>dest_path = input.get(\"dest_path\") <TAB><TAB>src_host = input.get(\"src_host\") 
<TAB><TAB>src_port = input.get(\"src_port\") <TAB><TAB>src_username = input.get(\"src_username\") <TAB><TAB>src_password = input.get(\"src_password\") <TAB><TAB>dest_host = input.get(\"dest_host\") <TAB><TAB>dest_port = input.get(\"dest_port\") <TAB><TAB>dest_username = input.get(\"dest_username\") <MASK> <TAB>except: <TAB><TAB>return \"Couldn't get all objects\" <TAB>curr_dir = os.getcwd() <TAB>temp_dir = os.path.join(curr_dir, r\"temp_data\") <TAB>os.makedirs(temp_dir) <TAB>async with asyncssh.connect( <TAB><TAB>host=src_host, <TAB><TAB>port=src_port, <TAB><TAB>username=src_username, <TAB><TAB>password=src_password, <TAB><TAB>known_hosts=None, <TAB>) as conn: <TAB><TAB>async with asyncssh.connect( <TAB><TAB><TAB>host=dest_host, <TAB><TAB><TAB>port=dest_port, <TAB><TAB><TAB>username=dest_username, <TAB><TAB><TAB>password=dest_password, <TAB><TAB><TAB>tunnel=conn, <TAB><TAB><TAB>known_hosts=None, <TAB><TAB>) as tunneled_conn: <TAB><TAB><TAB># grab remote file, place in container <TAB><TAB><TAB>async with conn.start_sftp_client() as sftp: <TAB><TAB><TAB><TAB>results = await sftp.get(src_path, temp_dir) <TAB><TAB><TAB>spliced_path = src_path.split(\"/\") <TAB><TAB><TAB>file_name = spliced_path[len(spliced_path) - 1] <TAB><TAB><TAB># copy grabbed file to desired location <TAB><TAB><TAB>async with tunneled_conn.start_sftp_client() as sftp2: <TAB><TAB><TAB><TAB>results2 = await sftp2.put(temp_dir + \"/\" + file_name, dest_path) <TAB># cleaning up temp file <TAB>for file in os.listdir(temp_dir): <TAB><TAB>file_path = os.path.join(temp_dir, file) <TAB><TAB>if os.path.isfile(file_path): <TAB><TAB><TAB>os.remove(file_path) <TAB>os.rmdir(temp_dir) <TAB>return \"Successfully Copied File.\" ", "simple_target": "<TAB><TAB>dest_password = input.get(\"dest_password\")"}
{"source": "Title: Andrea Arnold Text: Andrea Arnold Andrea Arnold, OBE (born 5 April 1961) is an English filmmaker and former actress. She won an Academy Award for her short film \"Wasp\" in 2005. She has since made the leap to feature films and television, including \"Red Road\" (2006), \"Fish Tank\" (2009), and \"American Honey\" (2016), all of which have won the Jury Prize at the Cannes Film Festival. Arnold has also directed four episodes of the Emmy Award-winning series \"Transparent\", as well as all seven episodes of the second season of the Emmy Award-winning series \"Big Little Lies\". Arnold was born in Dartford, Kent, the Question: big little lies season 2 how many episodes ", "meta": {"id": "7854255", "qid": "a", "question": "big little lies season 2 how many episodes", "title": "Andrea Arnold", "text": "Andrea Arnold Andrea Arnold, OBE (born 5 April 1961) is an English filmmaker and former actress. She won an Academy Award for her short film \"Wasp\" in 2005. She has since made the leap to feature films and television, including \"Red Road\" (2006), \"Fish Tank\" (2009), and \"American Honey\" (2016), all of which have won the Jury Prize at the Cannes Film Festival. Arnold has also directed four episodes of the Emmy Award-winning series \"Transparent\", as well as all seven episodes of the second season of the Emmy Award-winning series \"Big Little Lies\". Arnold was born in Dartford, Kent, the"}, "target": "Answer: seven"}
{"source": "Title: Designing Women Text: to whom she eventually loses. In reality, Dixie Carter was a Republican who disagreed with some of the liberal views expressed by her onscreen character, although she did become a Clinton supporter. Shout! Factory has released all seven seasons of \"Designing Women\" on DVD in Region 1. On September 2, 2003, Sony Pictures released \"The Best of Designing Women\", a single-disc DVD featuring five episodes ranging between seasons one through four: \"Designing Women (Pilot)\" (season 1), \"Killing All the Right People\" (season 2), \"Reservations for Eight\" (season 2), \"Big Haas and Little Falsie\" (season 3) and \"They Shoot Fat Women, Question: big little lies season 2 how many episodes ", "meta": {"id": "1523654", "qid": "i", "question": "big little lies season 2 how many episodes", "title": "Designing Women", "text": "to whom she eventually loses. In reality, Dixie Carter was a Republican who disagreed with some of the liberal views expressed by her onscreen character, although she did become a Clinton supporter. Shout! Factory has released all seven seasons of \"Designing Women\" on DVD in Region 1. On September 2, 2003, Sony Pictures released \"The Best of Designing Women\", a single-disc DVD featuring five episodes ranging between seasons one through four: \"Designing Women (Pilot)\" (season 1), \"Killing All the Right People\" (season 2), \"Reservations for Eight\" (season 2), \"Big Haas and Little Falsie\" (season 3) and \"They Shoot Fat Women,"}, "target": "Answer: seven"}
{"source": "Title: Waiting for a Girl Like You Text: Waiting for a Girl Like You \"Waiting for a Girl Like You\" is a 1981 power ballad by the British-American rock band Foreigner. The distinctive synthesizer theme was performed by the then-little-known Thomas Dolby, and this song also marked a major departure from their earlier singles because their previous singles were mid to upper tempo rock songs while this song was a softer love song with the energy of a power ballad. It was the second single released from the album \"4\" (1981) and was co-written by Lou Gramm and Mick Jones. It has become one of the band's most Question: who sang waiting for a girl like you ", "meta": {"id": "8685225", "qid": "1", "question": "who sang waiting for a girl like you", "title": "Waiting for a Girl Like You", "text": "Waiting for a Girl Like You \"Waiting for a Girl Like You\" is a 1981 power ballad by the British-American rock band Foreigner. The distinctive synthesizer theme was performed by the then-little-known Thomas Dolby, and this song also marked a major departure from their earlier singles because their previous singles were mid to upper tempo rock songs while this song was a softer love song with the energy of a power ballad. It was the second single released from the album \"4\" (1981) and was co-written by Lou Gramm and Mick Jones. It has become one of the band's most"}, "target": "Answer: Foreigner"}
{"source": "Title: Waiting for a Girl Like You Text: Waiting for a Girl Like You \"Waiting for a Girl Like You\" is a 1981 power ballad by the British-American rock band Foreigner. The distinctive synthesizer theme was performed by the then-little-known Thomas Dolby, and this song also marked a major departure from their earlier singles because their previous singles were mid to upper tempo rock songs while this song was a softer love song with the energy of a power ballad. It was the second single released from the album \"4\" (1981) and was co-written by Lou Gramm and Mick Jones. It has become one of the band's most Question: who sang waiting for a girl like you ", "meta": {"id": "8685225", "qid": "_", "question": "who sang waiting for a girl like you", "title": "Waiting for a Girl Like You", "text": "Waiting for a Girl Like You \"Waiting for a Girl Like You\" is a 1981 power ballad by the British-American rock band Foreigner. The distinctive synthesizer theme was performed by the then-little-known Thomas Dolby, and this song also marked a major departure from their earlier singles because their previous singles were mid to upper tempo rock songs while this song was a softer love song with the energy of a power ballad. It was the second single released from the album \"4\" (1981) and was co-written by Lou Gramm and Mick Jones. It has become one of the band's most"}, "target": "Answer: Foreigner"}
{"source": "Title: Foreigner discography Text: top 3 (\"Hot Blooded\", \"Double Vision\", \"Waiting for a Girl Like You\" and \"I Want to Know What Love Is\"). Despite its British roots, the band achieved only moderate success in the UK Singles Chart, with only two of their songs, \"Waiting for a Girl Like You\" and \"I Want to Know What Love Is\", peaking within the top 20. In 2014, Rhino released a boxed set of the band's seven albums recorded for Atlantic entitled \"Foreigner: The Complete Atlantic Studios Albums 1977-1991\". The first four albums that were expanded to include bonus tracks in 2002 were included in the Question: who sang waiting for a girl like you ", "meta": {"id": "11944859", "qid": "t", "question": "who sang waiting for a girl like you", "title": "Foreigner discography", "text": "top 3 (\"Hot Blooded\", \"Double Vision\", \"Waiting for a Girl Like You\" and \"I Want to Know What Love Is\"). Despite its British roots, the band achieved only moderate success in the UK Singles Chart, with only two of their songs, \"Waiting for a Girl Like You\" and \"I Want to Know What Love Is\", peaking within the top 20. In 2014, Rhino released a boxed set of the band's seven albums recorded for Atlantic entitled \"Foreigner: The Complete Atlantic Studios Albums 1977-1991\". The first four albums that were expanded to include bonus tracks in 2002 were included in the"}, "target": "Answer: Foreigner"}
{"text": "are there any exchange fees", "inputs": {"text": "are there any exchange fees"}, "prediction": [{"label": "NEGATIVE", "score": 0.9919849038124084}, {"label": "POSITIVE", "score": 0.008015076629817486}], "prediction_agent": "distilbert-base-uncased-finetuned-sst-2-english", "annotation": null, "annotation_agent": null, "multi_label": false, "explanation": null, "id": null, "metadata": {"category": 31}, "status": "Default", "event_timestamp": null, "metrics": null}
{"text": "A percentage fee was charged to my withdrawal.", "inputs": {"text": "A percentage fee was charged to my withdrawal."}, "prediction": [{"label": "NEGATIVE", "score": 0.9973815083503723}, {"label": "POSITIVE", "score": 0.002618502825498581}], "prediction_agent": "distilbert-base-uncased-finetuned-sst-2-english", "annotation": null, "annotation_agent": null, "multi_label": false, "explanation": null, "id": null, "metadata": {"category": 19}, "status": "Default", "event_timestamp": null, "metrics": null}
{"text": "How long does it take to activate my card?", "inputs": {"text": "How long does it take to activate my card?"}, "prediction": [{"label": "NEGATIVE", "score": 0.9980280995368958}, {"label": "POSITIVE", "score": 0.0019718497060239315}], "prediction_agent": "distilbert-base-uncased-finetuned-sst-2-english", "annotation": null, "annotation_agent": null, "multi_label": false, "explanation": null, "id": null, "metadata": {"category": 0}, "status": "Default", "event_timestamp": null, "metrics": null}
{"text": "The ATM I used for foreign currency provided the wrong exchange rate.", "inputs": {"text": "The ATM I used for foreign currency provided the wrong exchange rate."}, "prediction": [{"label": "NEGATIVE", "score": 0.9993785619735718}, {"label": "POSITIVE", "score": 0.0006214180612005293}], "prediction_agent": "distilbert-base-uncased-finetuned-sst-2-english", "annotation": null, "annotation_agent": null, "multi_label": false, "explanation": null, "id": null, "metadata": {"category": 76}, "status": "Default", "event_timestamp": null, "metrics": null}
{"text": "How do I get my money back for a duplicated charge?", "inputs": {"text": "How do I get my money back for a duplicated charge?"}, "prediction": [{"label": "NEGATIVE", "score": 0.9996625185012817}, {"label": "POSITIVE", "score": 0.00033745719701983035}], "prediction_agent": "distilbert-base-uncased-finetuned-sst-2-english", "annotation": null, "annotation_agent": null, "multi_label": false, "explanation": null, "id": null, "metadata": {"category": 63}, "status": "Default", "event_timestamp": null, "metrics": null}
{"text": "I can't freeze my account as I need the card as I am traveling, how do I change my pin?", "inputs": {"text": "I can't freeze my account as I need the card as I am traveling, how do I change my pin?"}, "prediction": [{"label": "NEGATIVE", "score": 0.9956651329994202}, {"label": "POSITIVE", "score": 0.004334910772740841}], "prediction_agent": "distilbert-base-uncased-finetuned-sst-2-english", "annotation": null, "annotation_agent": null, "multi_label": false, "explanation": null, "id": null, "metadata": {"category": 21}, "status": "Default", "event_timestamp": null, "metrics": null}
{"text": "A transfer to my account was denied.", "inputs": {"text": "A transfer to my account was denied."}, "prediction": [{"label": "NEGATIVE", "score": 0.9981142282485962}, {"label": "POSITIVE", "score": 0.00188581389375031}], "prediction_agent": "distilbert-base-uncased-finetuned-sst-2-english", "annotation": null, "annotation_agent": null, "multi_label": false, "explanation": null, "id": null, "metadata": {"category": 7}, "status": "Default", "event_timestamp": null, "metrics": null}
{"text": "WHAT IS THE REASON FOR THAT", "inputs": {"text": "WHAT IS THE REASON FOR THAT"}, "prediction": [{"label": "NEGATIVE", "score": 0.3312133848667145}, {"label": "POSITIVE", "score": 0.6687866449356079}], "prediction_agent": "distilbert-base-uncased-finetuned-sst-2-english", "annotation": null, "annotation_agent": null, "multi_label": false, "explanation": null, "id": null, "metadata": {"category": 14}, "status": "Default", "event_timestamp": null, "metrics": null}
{"text": "I looked at my statement and found a payment I and found a mistake.", "inputs": {"text": "I looked at my statement and found a payment I and found a mistake."}, "prediction": [{"label": "NEGATIVE", "score": 0.9990764856338501}, {"label": "POSITIVE", "score": 0.0009234766475856304}], "prediction_agent": "distilbert-base-uncased-finetuned-sst-2-english", "annotation": null, "annotation_agent": null, "multi_label": false, "explanation": null, "id": null, "metadata": {"category": 16}, "status": "Default", "event_timestamp": null, "metrics": null}
{"text": "Hi, I am unable to see transaction in my account which i made couples of hours ago from my UK account. Please help me in this.", "inputs": {"text": "Hi, I am unable to see transaction in my account which i made couples of hours ago from my UK account. Please help me in this."}, "prediction": [{"label": "NEGATIVE", "score": 0.9786044359207153}, {"label": "POSITIVE", "score": 0.021395523101091385}], "prediction_agent": "distilbert-base-uncased-finetuned-sst-2-english", "annotation": null, "annotation_agent": null, "multi_label": false, "explanation": null, "id": null, "metadata": {"category": 5}, "status": "Default", "event_timestamp": null, "metrics": null}
{"text": "A Porthcawl RNLI crew with a medic and the coastguard search and rescue helicopter were sent to Sker Beach, near Kenfig Nature Reserve, at 12:50 GMT on Sunday.\nCrewman Chris Page said the rider had head injuries and was very cold from lying unconscious on wet sand.\nShe was treated before being flown to Cardiff's University Hospital of Wales.", "target": "A woman has been airlifted to hospital after falling from a horse on a Bridgend county beach.", "feat_id": "38630833", "evaluation_predictions": [0, 25917, 196, 693, 3207, 111, 140, 221, 1505, 135, 7538, 18193, 124, 4434, 3391, 110, 107, 106, 2239, 140, 2839, 269, 270, 19136, 112, 16794, 131, 116, 502, 3348, 113, 5620, 110, 107, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]}
{"text": "Verheydt, who had been with Maastricht since 2015, has signed a three-year contract with Harry Kewell's side.\nThe 25-year-old scored 13 goals in 43 appearances in all competitions last season.\nMeanwhile, Watford winger Dennon Lewis, 20, has joined the Reds on a loan deal until 31 December.\nLewis, who made 29 National League appearances on loan at Woking last season, played under Kewell when the Australian was coach of Watford's under-23 team.\nFind all the latest football transfers on our dedicated page.", "target": "League Two side Crawley Town have signed striker Thomas Verheydt from Dutch second-tier side MVV Maastricht for an undisclosed fee.", "feat_id": "40523815", "evaluation_predictions": [0, 8640, 20898, 252, 144, 196, 174, 122, 67389, 381, 1680, 110, 107, 106, 5247, 121, 1019, 121, 1623, 3523, 1428, 1203, 115, 6926, 9273, 115, 149, 8863, 289, 578, 110, 107, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]}
{"text": "The 32-year-old had been playing in the Isthmian Premier Division with Leatherhead following his release by Newport at the end of last season.\nPidgeley has made 260 appearances in spells with nine clubs, including Chelsea, Watford and Millwall.\nForest Green are currently second in the National League table, one point behind leaders Cheltenham Town.\nPidgeley could make his Rovers debut when they host Aldershot on Friday.", "target": "National League side Forest Green Rovers have signed goalkeeper Lenny Pidgeley until the end of the season.", "feat_id": "35893195", "evaluation_predictions": [0, 139, 18253, 1019, 121, 1623, 196, 174, 1123, 115, 109, 125, 116, 307, 57010, 6235, 4034, 122, 8014, 4801, 110, 107, 106, 969, 35050, 2858, 148, 266, 21647, 9273, 115, 14659, 122, 2899, 4751, 108, 330, 8969, 108, 32686, 111, 4315, 8813, 110, 107, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]}
{"text": "The suspected green mamba was found on a ship that had docked in Aberdeen from west Africa last month.\nThe snake died after it was placed in a freezer by the animal charity after attempts to rehome it with specialist reptile keepers failed.\nThe snake was later identified as a harmless green tree snake.\nPolice Scotland said a complaint was under consideration.\nScottish SPCA Ch Supt Mike Flynn said: \"We were called out after a green snake arrived in Aberdeen on a boat from Africa.\n\"The snake was thought to be a green mamba, one of the deadliest snakes in the world. The snake was taken by police escort to our Aberdeenshire animal rescue and rehoming centre.\n\"Sadly the snake, which staff genuinely believed to be a green mamba, had to be put to sleep after our attempts to rehome it to specialist reptile keepers were unsuccessful.\"\nHe added: \"We could not keep the snake in our centre due to severe health and safety concerns, as the closest anti-venom is held in Bedford. Green mambas also require a Dangerous Wild Animal Licence which the society does not have.\n\"The safety of our staff and the public is paramount and as such the snake was placed in a freezer where it passed away.\n\"The Scottish SPCA is proud of its policy not put healthy animals to sleep. Animals are only put to sleep on veterinary advice if they are too ill or too aggressive to be rehomed, or where we are legally required to do so.\n\"The decision to euthanise the snake was not taken lightly. Unfortunately, the snake has since been identified as a harmless green tree snake. 
This has been an honest mistake on the society's part as we genuinely believed this was an extremely deadly snake.\"\nThe western green mamba feeds on small animals and rodents and is mainly found in the coastal tropical rainforests of western Africa.\nExperts say its bite can be fatal in as little as 30 minutes.", "target": "A complaint about the Scottish SPCA putting what was thought to be one of the world's most deadly snakes to sleep is being investigated by police.", "feat_id": "38249621", "evaluation_predictions": [0, 21624, 2342, 244, 126, 140, 1828, 115, 114, 8381, 141, 109, 2517, 4402, 110, 107, 106, 35932, 116, 112, 920, 4581, 126, 122, 3192, 45588, 41483, 3004, 110, 107, 106, 159, 13475, 140, 678, 3087, 130, 114, 16365, 1190, 1681, 13475, 110, 107, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]}
{"text": "The all-rounder has been included in the T20 squad to face Sri Lanka on 5 July at the Ageas Bowl, having been part of the World T20 squad in March.\n\"It's nice to be involved again,\" the 26-year-old slow left-armer and right-handed batsman told BBC Radio Solent.\n\"I wasn't expecting to be as I thought Moeen Ali would, but him being rested hands me another opportunity.\"\nDawson failed to make an appearance during England's run to the World T20 final in India this year, but his limited-overs performances for Hampshire this season have kept him in contention.\n\"I'm not sure what team they'll go with,\" he said.\n\"I've had some good performances for Hampshire in one-day and T20 cricket and if I keep going with that, hopefully I might get a chance.\"", "target": "Hampshire's Liam Dawson says it would be \"amazing\" to make his England Twenty20 debut on his home ground.", "feat_id": "36547398", "evaluation_predictions": [0, 10107, 131, 116, 21331, 22132, 148, 174, 953, 115, 2159, 131, 116, 781, 3214, 7784, 110, 107, 106, 159, 1311, 464, 6746, 10773, 117, 134, 109, 6271, 1483, 6552, 124, 371, 1307, 110, 107, 106, 92074, 140, 297, 113, 109, 894, 781, 3214, 7784, 115, 1051, 110, 107, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]}
{"text": "The Egypt-backed plan had envisaged a regional forum which analysts say might have forced Israel to reveal whether or not it has nuclear weapons.\nThe proposal was blocked by the US, the UK and Canada. The next review is set for 2020.\nIsrael neither confirms nor denies it has a stockpile of nuclear weapons.\nSpeaking after four weeks of negotiations, US Under-Secretary of State Rose Gottemoeller accused Egypt and other Arab countries of \"not willing to let go of these unrealistic and unworkable conditions\" for future talks.\nShe also said some participants tried to \"cynically manipulate\" the whole process.\nBut Egypt warned that the failure to reach a deal \"will have consequences in front of the Arab world and public opinion\", the Associated Press news agency reports.\nLast month, Egypt had proposed to stage a regional conference - with or without Israel's participation and without an agreed agenda.\nSome analysts suggested that this move might have forced Israel - which is not a party to the Treaty on the Non-Proliferation of Nuclear Weapons (NPT) - to publicly clarify its position on nuclear weapons.\nDecisions at NPT review conferences - held every five years - are made by consensus.\nThe failure of the current talks means the next gathering could only be held in 2020 at the earliest.", "target": "A UN conference aimed at preventing the proliferation of nuclear weapons has ended in failure after a row over a nuclear-free Middle East proposal.", "feat_id": "32856926", "evaluation_predictions": [0, 6325, 121, 17945, 511, 47408, 114, 2881, 3828, 162, 8067, 416, 382, 133, 3354, 3019, 112, 4494, 682, 132, 146, 126, 148, 4573, 4841, 110, 107, 106, 159, 3993, 140, 7010, 141, 109, 787, 108, 109, 926, 111, 1493, 110, 107, 106, 22751, 4891, 17350, 3001, 29525, 126, 148, 114, 38078, 113, 4573, 4841, 110, 107, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]}
{"text": "Mothers-to-be are more likely to get malaria as their immunity is lowered, says the Royal College of Obstetricians and Gynaecologists (RCOG).\nMalaria carries serious risks for mother and baby including miscarriage, stillbirth and premature labour.\nThe tropical disease is transmitted by mosquitoes and causes a fever, flu-like symptoms, vomiting and diarrhoea.\nThere have been no malaria-related deaths in pregnant or recently pregnant women in the UK in the past decade but the RCOG says it has been receiving inquires from worried women.\nIn the UK, about 1,500 cases of malaria are reported each year and about 10 people will die, said the RCOG.\nIt says all non-essential trips to areas with a high risk of malaria should be avoided.\nRisk areas include large areas of Africa, Asia including China and India, Central and South America, parts of the Middle East and some Pacific Islands.\nIf the trip is unavoidable, the college advises women to seek advice from a centre with expertise in malaria which will provide information on ways to reduce the risk of infection.\nWomen should make sure they are aware of the risk, take out measures such as mosquito nets for bite prevention and take anti-malarial medication.\nPhilippa Marsden, who chairs the RCOG's patient information committee, said although the risks were still relatively small it was important that women were well-informed.\nCath Broderick of the RCOG women's network said: \"If women are worried about symptoms after returning from a high-risk country and think they may have malaria, they should see a doctor immediately and inform them of their recent travels.\"\nSymptoms can take a week or more to develop after being bitten.", "target": "Pregnant women should visit countries with a risk of malaria only if their trip is essential, experts are warning.", "feat_id": "30003436", "evaluation_predictions": [0, 71386, 6087, 1651, 3198, 118, 1499, 111, 1362, 330, 31911, 108, 81095, 111, 16936, 6963, 110, 107, 106, 
621, 133, 174, 220, 24263, 121, 3316, 6996, 115, 5725, 132, 938, 5725, 652, 115, 109, 926, 115, 109, 555, 3496, 110, 107, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]}
{"text": "Researchers at the University of Sheffield said the tumours were effectively \"fertilising\" the bone to help themselves grow.\nThe study, in the journal Nature, said it may be possible to protect bone from a tumour's nefarious influence and consequently stop the cancer's spread.\nCancer charities said this opened up \"a whole new avenue for research\".\nAround 85% of breast cancers that spread around the body end up in bone, at which point the cancer is difficult to treat and more deadly.\nThe scientists, in Sheffield and the University of Copenhagen, discovered patients with secondary cancers had higher levels of an enzyme called LOX being produced by their tumours and released into the blood.\nBone is constantly being broken down and rebuilt. But in a series of experiments on mice, the research team showed LOX was disrupting the process and leaving lesions and holes in the bone.\nUsing drugs to block LOX prevented the cancer from spreading.\nDr Alison Gartland, a reader in bone and cancer biology at the university, told the BBC News website: \"We think it's a significant breakthrough in trying to prevent metastases (secondary tumours) in breast cancer.\n\"The cancer cells in the primary tumour are actually fertilising the soil for the future growth of itself, LOX is changing the environment in bone to make it better to grow.\"\nThe animal tests also showed that a set of osteoporosis drugs called bisphosphonates could prevent the spread of cancer.\nBisphosphonates also interfere with the way bone is recycled in order to strengthen it.\nThey are already given to some cancer patients, but the Sheffield team believe they could have a much larger role.\nThe effect was discovered only in oestrogen-negative breast cancers. 
They account for around a third of cases, but are far more deadly.\nKatherine Woods, from Breast Cancer Campaign and Breakthrough Breast Cancer, said: \"By unveiling the role that the protein LOX is playing, these results open up a whole new avenue for research and treatments that could stop breast cancer spreading to the bone.\n\"The research also adds weight to the growing body of evidence supporting the role of bisphosphonates in stopping secondary breast cancer in its tracks.\n\"The reality of living with secondary breast cancer in the bone is a stark one, which leaves many women with bone pain and fractures that need extensive surgery just when they need to be making the most of the time they have left with friends and family.\"\nThe findings may also apply in colon cancer.", "target": "Breast cancers can manipulate the structure of bone to make it easier to spread there, a study has found.", "feat_id": "32901133", "evaluation_predictions": [0, 10278, 21399, 113, 4622, 15791, 120, 2275, 279, 109, 513, 370, 164, 115, 4499, 110, 107, 106, 34182, 2642, 1044, 122, 4367, 15791, 196, 902, 1099, 113, 142, 17739, 568, 1054, 31231, 110, 107, 106, 1240, 31231, 140, 31945, 109, 366, 111, 2096, 23790, 111, 4343, 115, 109, 4499, 110, 107, 106, 37514, 116, 112, 2105, 1054, 31231, 11890, 109, 1695, 135, 8561, 110, 107, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]}
{"title": "Backup", "content": "The verb form, referring to the process, is back up, whereas the noun and adjective form is backup. Backups can be used to recover data after its loss from data deletion or corruption, or to recover data from an earlier time. Backups provide a simple form of disaster recovery; however, not all backup systems are able to reconstitute a computer system or other complex configuration such as a computer cluster, active directory server, or database server. A backup system contains at least one copy of all data considered worth saving. The data storage requirements can be large. An information repository model may be used to provide structure to this storage."}
{"title": "Backup", "content": "Various types of data storage devices are used for copying backups of data that is already in secondary storage onto archive files. There are also different ways these devices can be arranged to provide geographic dispersion, data security, and portability. Data is selected, extracted, and manipulated for storage. The process can include methods for dealing with live data, including open files, as well as compression, encryption, and deduplication. Additional techniques apply to enterprise client-server backup. Backup schemes may include dry runs that validate the reliability of the data being backed up."}
{"title": "Backup", "content": "There are also limitations and human factors involved in any backup scheme. Any backup strategy requires an information repository, a secondary storage space that aggregates backups of the data sources. The repository could be as simple as a list of all backup media (DVDs, etc.) and the dates they were produced, or could include a computerized index, catalog, or relational database. The backup data needs to be stored, requiring a backup rotation scheme: a system of backing up data to computer media that limits the number of backups of different dates retained separately, by appropriate reuse of the data storage media through overwriting of backups no longer needed. The scheme determines how each piece of removable storage is used for a backup operation and how long it is retained once it has backup data stored on it."}
{"title": "Backup", "content": "The 3-2-1 rule can aid the backup process. It states that at least 3 copies of the data should be stored, on 2 different types of storage media, with 1 copy kept offsite in a remote location, which can include cloud storage. Using 2 or more different media helps eliminate data loss due to similar causes: for example, optical discs may tolerate being underwater while LTO tapes may not, and SSDs cannot fail due to head crashes or damaged spindle motors since, unlike hard drives, they have no moving parts. An offsite copy protects against fire, theft of physical media such as tapes or discs, and natural disasters like floods and earthquakes. Disaster-protected hard drives, like those made by ioSafe, are an alternative to an offsite copy, but they have limitations, such as only being able to resist fire for a limited period of time, so an offsite copy still remains the ideal choice."}
{"title": "Backup", "content": "Backup methods. Unstructured: an unstructured repository may simply be a stack of tapes, DVD-Rs, or external HDDs with minimal information about what was backed up and when. This method is the easiest to implement but is unlikely to achieve a high level of recoverability, as it lacks automation. Full only / system imaging: a repository using this backup method contains complete source data copies taken at one or more specific points in time. Copying system images is a method frequently used by computer technicians to record known good configurations. However, imaging is generally more useful as a way of deploying a standard configuration to many systems rather than as a tool for making ongoing backups of diverse systems."}
{"title": "Backup", "content": "Incremental: an incremental backup stores data changed since a reference point in time. Duplicate copies of unchanged data are not copied. Typically, a full backup of all files is made at infrequent intervals, serving as the reference point for an incremental repository. Subsequently, a number of incremental backups are made over successive time periods. Restores begin with the last full backup and then apply the incrementals. A forever incremental backup starts with one initial full backup; afterwards, only incremental backups are created."}
{"title": "Backup", "content": "The benefits of forever incremental backup are less backup storage and less bandwidth usage, and users can schedule backups more frequently to achieve a shorter RPO. Some backup systems can create a synthetic full backup from a series of incrementals, thus providing the equivalent of a frequent full backup. When done to modify a single archive file, this speeds restores of recent versions of files. Near-CDP: continuous data protection (CDP) refers to a backup that instantly saves a copy of every change made to the data. This allows restoration of data to any point in time and is the most comprehensive and advanced form of data protection."}
{"title": "Backup", "content": "Differential: a differential backup saves only the data changed since the last full backup, which means a maximum of two backups from the repository are used to restore the data. However, as the time since the last full backup, and thus the accumulated changes in the data, increases, so does the time needed to perform the differential backup. Restoring an entire system requires starting from the most recent full backup and then applying just the last differential backup. A differential backup copies files created or changed since the last full backup, regardless of whether other differential backups have been made since, whereas an incremental backup copies files created or changed since the most recent backup of any type, full or incremental. Changes in files may be detected through a more recent date/time of last modification file attribute and/or changes in file size. Variations of incremental backup include multi-level incrementals and block-level incrementals that compare parts of files instead of entire files."}
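The full vs. differential vs. incremental distinction in the Backup records above comes down to which reference timestamp a file's change is measured against. The sketch below is illustrative only and is not part of the dataset: the function name is invented, and it detects changes solely by modification time, one of the two detection methods the text mentions (the other being file size).

```python
import os

def files_to_back_up(root, last_full_time, last_backup_time, mode):
    """Select files under root for a backup run, by last-modification time.

    mode: "full"         -> every file, regardless of timestamps
          "differential" -> files changed since the last FULL backup
          "incremental"  -> files changed since the most recent backup of ANY type
    """
    reference = {
        "full": 0.0,                      # epoch: everything is newer than this
        "differential": last_full_time,
        "incremental": last_backup_time,
    }[mode]
    selected = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            # A file changed after the reference point must be copied.
            if os.path.getmtime(path) > reference:
                selected.append(path)
    return selected
```

This makes the restore trade-off visible: a differential restore needs at most two backup sets (full plus last differential), while an incremental restore replays every incremental since the last full, but each incremental run copies less data.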
{"video_id": "hg2Q_O5b9w4", "title": "CURL: Contrastive Unsupervised Representations for Reinforcement Learning", "text": "hi there today we're going to look at curl contrastive unsupervised representations for reinforcement learning by Aravind Srinivas, Michael Laskin and Pieter Abbeel so this is a general framework for unsupervised representation learning for RL so let's untangle the title a little bit it is for reinforcement learning which if you don't know what reinforcement learning is I've done a bunch of videos on RL frameworks so it's for general reinforcement learning that means it can be paired with almost any RL algorithm out there so we're not", "start_timestamp": "00:00:00", "end_timestamp": "00:00:42", "start_second": "0", "end_second": "42", "url": "https://www.youtube.com/watch?v=hg2Q_O5b9w4&t=0s", "thumbnail": "https://i.ytimg.com/vi/hg2Q_O5b9w4/maxresdefault.jpg"}
{"video_id": "hg2Q_O5b9w4", "title": "CURL: Contrastive Unsupervised Representations for Reinforcement Learning", "text": "going to you know dive into specific RL algorithms today it is unsupervised which means it doesn't need any sort of labels and it also doesn't need a reward signal for RL which is pretty cool because usually the entire RL pipelines rely on some sort of a reward or auxiliary reward signal now there is a training objective here but it doesn't have to do with the RL reward and then it is learning representations which means it learns intermediate representations of the input data that is useful and in the end", "start_timestamp": "00:00:42", "end_timestamp": "00:01:23", "start_second": "42", "end_second": "83", "url": "https://www.youtube.com/watch?v=hg2Q_O5b9w4&t=42s", "thumbnail": "https://i.ytimg.com/vi/hg2Q_O5b9w4/maxresdefault.jpg"}
{"video_id": "hg2Q_O5b9w4", "title": "CURL: Contrastive Unsupervised Representations for Reinforcement Learning", "text": "it is contrastive and that is the the kind of secret sauce in here the training objective it's what's called contrastive learning and that's what we're going to spend most of our time on today exploring what that means alright so here's the general framework you can see it down here sorry about that so you can see that reinforcement learning is just a box which is we don't care about the RL algorithm you use that's just you know what what comes at the end what comes at the beginning oh here is the observation so the observation in an RL", "start_timestamp": "00:01:23", "end_timestamp": "00:02:04", "start_second": "83", "end_second": "124", "url": "https://www.youtube.com/watch?v=hg2Q_O5b9w4&t=83s", "thumbnail": "https://i.ytimg.com/vi/hg2Q_O5b9w4/maxresdefault.jpg"}
{"video_id": "hg2Q_O5b9w4", "title": "CURL: Contrastive Unsupervised Representations for Reinforcement Learning", "text": "algorithm is kind of fundamental now if someone explains RL to you or reinforcement learning usually what they'll say is there is some kind of actor and there is some kind of environment right and the environment will give you an observation right observation Oh which is some sort of let's say here is an image right so in this in this RL framework specifically the examples they give are of image based reinforcement learning so let's say the Atari game where you have this little spaceship here and there are meteorites up here and you need to shoot", "start_timestamp": "00:02:04", "end_timestamp": "00:02:48", "start_second": "124", "end_second": "168", "url": "https://www.youtube.com/watch?v=hg2Q_O5b9w4&t=124s", "thumbnail": "https://i.ytimg.com/vi/hg2Q_O5b9w4/maxresdefault.jpg"}
{"video_id": "hg2Q_O5b9w4", "title": "CURL: Contrastive Unsupervised Representations for Reinforcement Learning", "text": "them so there is a little shot here right you need to shoot those meteorites right so this is the observation oh and then as an agent as an actor you have to come up with some sort of action and the actions here can be something like moved to the left move to the right press the button that you know does the shooting so you have to come up with an action somehow given this observation and then the environment will give you back a reward along with the next observation like the next frame of the game and you're gonna have to come up with", "start_timestamp": "00:02:48", "end_timestamp": "00:03:23", "start_second": "168", "end_second": "203", "url": "https://www.youtube.com/watch?v=hg2Q_O5b9w4&t=168s", "thumbnail": "https://i.ytimg.com/vi/hg2Q_O5b9w4/maxresdefault.jpg"}
{"video_id": "hg2Q_O5b9w4", "title": "CURL: Contrastive Unsupervised Representations for Reinforcement Learning", "text": "another action in response to that and the environments going to give you back another reward and the next observation and so on so what you want to do is you want to find a mapping from observation to action such that your reward is going to be as high as possible right this is the fundamental problem of RL and usually what people do is they take this act this mapping here from observation to action to be some sort of function some sort of function that is parameterised maybe and nowadays of course it's often a neural network but", "start_timestamp": "00:03:23", "end_timestamp": "00:04:02", "start_second": "203", "end_second": "242", "url": "https://www.youtube.com/watch?v=hg2Q_O5b9w4&t=203s", "thumbnail": "https://i.ytimg.com/vi/hg2Q_O5b9w4/maxresdefault.jpg"}
{"video_id": "hg2Q_O5b9w4", "title": "CURL: Contrastive Unsupervised Representations for Reinforcement Learning", "text": "you're trying to learn given the input observation what output action you need to do and you can think of the same here so you have this input observation up here and down here after the reinforcement learning the output is going to be an action right and so this this function we talked about up here is usually implemented sorry is usually implement as you put the observation into the r.l framework and then the RL framework learns this f of theta function to give you an action now here you can see the pipeline is a bit different we don't", "start_timestamp": "00:04:02", "end_timestamp": "00:04:39", "start_second": "242", "end_second": "279", "url": "https://www.youtube.com/watch?v=hg2Q_O5b9w4&t=242s", "thumbnail": "https://i.ytimg.com/vi/hg2Q_O5b9w4/maxresdefault.jpg"}
{"video_id": "hg2Q_O5b9w4", "title": "CURL: Contrastive Unsupervised Representations for Reinforcement Learning", "text": "want to shove the observation in directly right we don't want the observation directly but what we put into the RL framework is this Q thing now the Q is supposed to be a representation of the observation and a useful representation so if we think of this of this game here of this Atari game up here what could be the what could be a useful representation if if I had to craft one by hand how would I construct a useful representation keep in mind the representation the goal is to have a representation of the observation that is more useful to the", "start_timestamp": "00:04:39", "end_timestamp": "00:05:22", "start_second": "279", "end_second": "322", "url": "https://www.youtube.com/watch?v=hg2Q_O5b9w4&t=279s", "thumbnail": "https://i.ytimg.com/vi/hg2Q_O5b9w4/maxresdefault.jpg"}
{"video_id": "hg2Q_O5b9w4", "title": "CURL: Contrastive Unsupervised Representations for Reinforcement Learning", "text": "RL algorithm than just the pure pixels of the image right so if I have to craft a representation let's say it's a vector right let's say our our our representations need to be vectors what I would do is I would probably take the x and y coordinates of the little spaceship right x and y and put it in the vector that's pretty useful and then I would probably take the x and y coordinates of the meteorites that are around right let's say there are maximum two XY XY here I would probably take the angle right the angle where my spaceship", "start_timestamp": "00:05:22", "end_timestamp": "00:06:07", "start_second": "322", "end_second": "367", "url": "https://www.youtube.com/watch?v=hg2Q_O5b9w4&t=322s", "thumbnail": "https://i.ytimg.com/vi/hg2Q_O5b9w4/maxresdefault.jpg"}
{"video_id": "hg2Q_O5b9w4", "title": "CURL: Contrastive Unsupervised Representations for Reinforcement Learning", "text": "is pointing to that should be pretty useful because if I shoot I want to know where I shoot right so theta here and then probably maybe the X and y coordinate of the of the shot here of the red shot that I fired if there is one right also going to put that into my representation so x and y and maybe Delta X Delta Y something like this right so you can see if I had to handcraft something if I I can pretty much guarantee that if I put in this representation right here into the RL algorithm but put this in here it would turn out", "start_timestamp": "00:06:07", "end_timestamp": "00:06:52", "start_second": "367", "end_second": "412", "url": "https://www.youtube.com/watch?v=hg2Q_O5b9w4&t=367s", "thumbnail": "https://i.ytimg.com/vi/hg2Q_O5b9w4/maxresdefault.jpg"}
{"id": "AskReddit/ey3wak1", "qid": "cvfrfv", "question": "Why do you stay away from the news?", "answer": "I feel like news increases my anxiety. I already have a diagnosed anxiety problem which I've been struggling with for years and the news tends to make it worse. Unfortunately staying away from the news makes me pretty ignorant regarding certain things but oh well, I'd rather be ignorant than worrying.", "score": 2}
{"id": "explainlikeimfive/djveahx", "qid": "6lojco", "question": "How do people brute force your passwords?", "answer": "Brute forcing rarely happens in the first place. And when it does, it's not thru the system UI. It usually happens when system has been compromised and the password data has been retrieved and stored on the hacker's or another hacker controlled machine. The raw password data is analysed and perhaps run thru a precalculated rainbow table.", "score": 3}
{"id": "AskReddit/ckcd0pi", "qid": "2fspcb", "question": "What are some brilliant FOREIGN shows I should be watching?", "answer": "Like the others, it depends on your definition of foreign. These are comedy shorts rather than shows, but try Service apres vente and Un gars, Une fille for a laugh. You just want to get into some shows from other English speaking countries than the USA? If you liked The Young Ones try Bottom, including the live shows, and Filthy, Rich, and Catflap for a bit of alt humour. In fact, if you don't know much British comedy then you're missing out on a gold mine.", "score": 3}
{"id": "askscience/c6wiw7b", "qid": "12n6tr", "question": "If an obese person attempts to lose weight and starts eating healthy, would the plaque on their arteries decrease and/or disappear over time?", "answer": "Point of clarification, just being obese doesn't mean that there are plaques in arteries (arteriosclerosis). Being obese only raises the risk for developing plaques. That said, a healthier diet and weight loss in a person that does have plaques can slowly decrease the natural progression of the plaques.", "score": 42}
{"id": "AskReddit/cp0razv", "qid": "2xjzf5", "question": "Why do so many people think Multiculturalism is a bad thing?", "answer": "I myself am a product of French, Irish, Jewish, Italian, Cherokee, Blackfoot Sioux and Muskogee Native American Ancestry. I see multiculturalism as a solution to racism and a way to battle ignorance of culture which leads to intolerance. Why shouldn't white people celebrate their diverse heritage if they have one? Why do white skinned people get pigeonholed into \"white culture\" white isn't even a real race it's a label for people who look European.", "score": 2}
{"id": "explainlikeimfive/da3cvuc", "qid": "5dbyyd", "question": "Why does light cast from two equidistant lamp posts not cancel out shadows?", "answer": "You have two light sources independently adding light to the region. Light A and Light B, both emitting 1 \"unit\" let's say. Where they both can reach it is receiving 2 units. Where neither can reach it is receiving 0 units. When you are blocking one light but not the other (occurs on both sides due to your positioning) it is receiving 1 unit. Therefore you can see all the area receiving 2 units, as well as the shallow shadows caused by you blocking 1 light but not the other, and if you looked behind yourself or turned around you would see the deep shadow caused by your body blocking both light sources. Since reflection occurs most of the time yes it would be slightly washed out. there are still shadows but it's more likely that the area you are casting a shadow in actually gets something like 1.4 units of light. And yes, this can definitely wash out. if Light A emits 1 unit but Light B emits 750, the 'shadow' might not even be noticeable due to reflection and you'd be seeing areas of 751 units vs areas of 750.8 or something and not be able to discern the difference with your eyes. \"Shadows\" are really just your interpretation of how an area appears less lit than what is nearby.", "score": 5}
{"id": "explainlikeimfive/cvmw56e", "qid": "3nd06z", "question": "Why flash memory is always in multiples of 8 (256GB, 512GB) while hard drives use a round number?", "answer": "Hard drives are lying. As for your second question, it's easier to quote the 'gross' storage capacity because the 'net' storage capacity depends on how the drive is formatted, which is up to the user/OEM and isn't something the drive manufacturer has control over.", "score": 13}