How to use the get_instances method in robotframework-pageobjects

The Python snippets below, taken from open-source projects on GitHub, show how a get_instances method is defined, called, and mocked in practice.

simulator_test.py

Source: simulator_test.py (GitHub)

...
    assert greedy_solution[0].name == 'hostA'
    assert greedy_solution[0].free_ram == 11.0 * 1024.0
    assert greedy_solution[0].get_deployment('hadoop1').datanodes == 0
    assert greedy_solution[0].get_deployment('hadoop1').tasktrackers == 2
    assert len(greedy_solution[0].get_deployment('hadoop1').get_instances()) == 2
    assert greedy_solution[0].get_deployment('hadoop1').get_instances()[0].name == 'instA'
    assert greedy_solution[0].get_deployment('hadoop1').get_instances()[1].name == 'instB'
    assert greedy_solution[0].get_deployment('hadoop2').datanodes == 0
    assert greedy_solution[0].get_deployment('hadoop2').tasktrackers == 0
    assert len(greedy_solution[0].get_deployment('hadoop2').get_instances()) == 0
    # HOST B
    assert greedy_solution[1].name == 'hostB'
    assert greedy_solution[1].free_ram == 8.0 * 1024.0
    assert greedy_solution[1].get_deployment('hadoop1').datanodes == 2
    assert greedy_solution[1].get_deployment('hadoop1').tasktrackers == 1
    assert len(greedy_solution[1].get_deployment('hadoop1').get_instances()) == 0
    assert greedy_solution[1].get_deployment('hadoop2').datanodes == 1
    assert greedy_solution[1].get_deployment('hadoop2').tasktrackers == 0
    assert len(greedy_solution[1].get_deployment('hadoop2').get_instances()) == 0
    # HOST C
    assert greedy_solution[2].name == 'hostC'
    assert greedy_solution[2].free_ram == 8.0 * 1024.0
    assert greedy_solution[2].get_deployment('hadoop1').datanodes == 1
    assert greedy_solution[2].get_deployment('hadoop1').tasktrackers == 1
    assert len(greedy_solution[2].get_deployment('hadoop1').get_instances()) == 0
    assert greedy_solution[2].get_deployment('hadoop2').datanodes == 3
    assert greedy_solution[2].get_deployment('hadoop2').tasktrackers == 3
    assert len(greedy_solution[2].get_deployment('hadoop2').get_instances()) == 1
    assert greedy_solution[2].get_deployment('hadoop2').get_instances()[0].name == 'instC'
    # HOST D
    assert greedy_solution[3].name == 'hostD'
    assert greedy_solution[3].free_ram == -10.0 * 1024.0
    assert greedy_solution[3].get_deployment('hadoop1').datanodes == 0
    assert greedy_solution[3].get_deployment('hadoop1').tasktrackers == 0
    assert len(greedy_solution[3].get_deployment('hadoop1').get_instances()) == 0
    assert greedy_solution[3].get_deployment('hadoop2').datanodes == 0
    assert greedy_solution[3].get_deployment('hadoop2').tasktrackers == 0
    assert len(greedy_solution[3].get_deployment('hadoop2').get_instances()) == 0
# UNIT TEST INPUT2 - GREEDY
if test_no == 2:
    # HOST A
    assert greedy_solution[0].name == 'hostA'
    assert greedy_solution[0].free_ram == 13.0 * 1024.0
    assert greedy_solution[0].get_deployment('hadoop1').datanodes == 0
    assert greedy_solution[0].get_deployment('hadoop1').tasktrackers == 1
    assert len(greedy_solution[0].get_deployment('hadoop1').get_instances()) == 1
    assert greedy_solution[0].get_deployment('hadoop1').get_instances()[0].name == 'instA'
    assert greedy_solution[0].get_deployment('hadoop2').datanodes == 0
    assert greedy_solution[0].get_deployment('hadoop2').tasktrackers == 1
    assert len(greedy_solution[0].get_deployment('hadoop2').get_instances()) == 1
    assert greedy_solution[0].get_deployment('hadoop2').get_instances()[0].name == 'instC'
    # HOST B
    assert greedy_solution[1].name == 'hostB'
    assert greedy_solution[1].free_ram == 8.0 * 1024.0
    assert greedy_solution[1].get_deployment('hadoop1').datanodes == 2
    assert greedy_solution[1].get_deployment('hadoop1').tasktrackers == 1
    assert len(greedy_solution[1].get_deployment('hadoop1').get_instances()) == 0
    assert greedy_solution[1].get_deployment('hadoop2').datanodes == 1
    assert greedy_solution[1].get_deployment('hadoop2').tasktrackers == 0
    assert len(greedy_solution[1].get_deployment('hadoop2').get_instances()) == 0
    # HOST C
    assert greedy_solution[2].name == 'hostC'
    assert greedy_solution[2].free_ram == 6.0 * 1024.0
    assert greedy_solution[2].get_deployment('hadoop1').datanodes == 1
    assert greedy_solution[2].get_deployment('hadoop1').tasktrackers == 1
    assert len(greedy_solution[2].get_deployment('hadoop1').get_instances()) == 0
    assert greedy_solution[2].get_deployment('hadoop2').datanodes == 3
    assert greedy_solution[2].get_deployment('hadoop2').tasktrackers == 3
    assert len(greedy_solution[2].get_deployment('hadoop2').get_instances()) == 1
    assert greedy_solution[2].get_deployment('hadoop2').get_instances()[0].name == 'instB'
    # HOST D
    assert greedy_solution[3].name == 'hostD'
    assert greedy_solution[3].free_ram == -10.0 * 1024.0
    assert greedy_solution[3].get_deployment('hadoop1').datanodes == 0
    assert greedy_solution[3].get_deployment('hadoop1').tasktrackers == 0
    assert len(greedy_solution[3].get_deployment('hadoop1').get_instances()) == 0
    assert greedy_solution[3].get_deployment('hadoop2').datanodes == 0
    assert greedy_solution[3].get_deployment('hadoop2').tasktrackers == 0
    assert len(greedy_solution[3].get_deployment('hadoop2').get_instances()) == 0
# UNIT TEST INPUT3 - GREEDY AND OPTIMAL
if test_no == 3:
    # HOST A
    assert greedy_solution[0].name == 'hostA'
    assert greedy_solution[0].free_ram == 0.0 * 1024.0
    assert greedy_solution[0].cost == 4.0 * 1024.0
    assert greedy_solution[0].get_deployment('hadoop1').datanodes == 0
    assert greedy_solution[0].get_deployment('hadoop1').tasktrackers == 3
    assert len(greedy_solution[0].get_deployment('hadoop1').get_instances()) == 2
    assert greedy_solution[0].get_deployment('hadoop1').get_instances()[0].name == 'instA'
    assert greedy_solution[0].get_deployment('hadoop1').get_instances()[1].name == 'instB'
    # HOST B
    assert greedy_solution[1].name == 'hostB'
    assert greedy_solution[1].free_ram == 0.0 * 1024.0
    assert greedy_solution[1].cost == 2.0 * 1024.0
    assert greedy_solution[1].get_deployment('hadoop1').datanodes == 0
    assert greedy_solution[1].get_deployment('hadoop1').tasktrackers == 3
    assert len(greedy_solution[1].get_deployment('hadoop1').get_instances()) == 1
    assert greedy_solution[1].get_deployment('hadoop1').get_instances()[0].name == 'instC'
    assert greedy_cost == 6.0 * 1024.0
    # HOST A
    assert optimal_solution[0].name == 'hostA'
    assert optimal_solution[0].free_ram == 4.0 * 1024.0
    assert optimal_solution[0].cost == 4.0 * 1024.0
    assert optimal_solution[0].get_deployment('hadoop1').datanodes == 0
    assert optimal_solution[0].get_deployment('hadoop1').datanodes == 0
    assert optimal_solution[0].get_deployment('hadoop1').tasktrackers == 1
    assert len(optimal_solution[0].get_deployment('hadoop1').get_instances()) == 0
    # HOST B
    assert optimal_solution[1].name == 'hostB'
    assert optimal_solution[1].free_ram == -4.0 * 1024.0
    assert optimal_solution[1].cost == 8.0 * 1024.0
    assert optimal_solution[1].get_deployment('hadoop1').datanodes == 0
    assert optimal_solution[1].get_deployment('hadoop1').tasktrackers == 5
    assert len(optimal_solution[1].get_deployment('hadoop1').get_instances()) == 3
    assert optimal_solution[1].get_deployment('hadoop1').get_instances()[0].name == 'instA'
    assert optimal_solution[1].get_deployment('hadoop1').get_instances()[1].name == 'instB'
    assert optimal_solution[1].get_deployment('hadoop1').get_instances()[2].name == 'instC'
    assert optimal_cost == 12.0 * 1024.0
# UNIT TEST INPUT4 - GREEDY AND OPTIMAL
if test_no == 4:
    # HOST A
    assert greedy_solution[0].name == 'hostA'
    assert greedy_solution[0].free_ram == 1.0 * 1024.0
    assert greedy_solution[0].cost == 1.0 * 1024.0
    assert greedy_solution[0].get_deployment('hadoop1').datanodes == 1
    assert greedy_solution[0].get_deployment('hadoop1').tasktrackers == 0
    assert len(greedy_solution[0].get_deployment('hadoop1').get_instances()) == 0
    assert greedy_solution[0].get_deployment('hadoop2').datanodes == 0
    assert greedy_solution[0].get_deployment('hadoop2').tasktrackers == 1
    assert len(greedy_solution[0].get_deployment('hadoop2').get_instances()) == 1
    assert greedy_solution[0].get_deployment('hadoop2').get_instances()[0].name == 'instB'
    # HOST B
    assert greedy_solution[1].name == 'hostB'
    assert greedy_solution[1].free_ram == 0.0 * 1024.0
    assert greedy_solution[1].cost == 0.0 * 1024.0
    assert greedy_solution[1].get_deployment('hadoop1').datanodes == 0
    assert greedy_solution[1].get_deployment('hadoop1').tasktrackers == 1
    assert len(greedy_solution[1].get_deployment('hadoop1').get_instances()) == 1
    assert greedy_solution[1].get_deployment('hadoop1').get_instances()[0].name == 'instA'
    assert greedy_solution[1].get_deployment('hadoop2').datanodes == 0
    assert greedy_solution[1].get_deployment('hadoop2').tasktrackers == 2
    assert len(greedy_solution[1].get_deployment('hadoop2').get_instances()) == 0
    assert greedy_cost == 1.0 * 1024.0
    # HOST A
    assert optimal_solution[0].name == 'hostA'
    assert optimal_solution[0].free_ram == -3.0 * 1024.0
    assert optimal_solution[0].cost == -1.0 * 1024.0
    assert optimal_solution[0].get_deployment('hadoop1').datanodes == 1
    assert optimal_solution[0].get_deployment('hadoop1').tasktrackers == 1
    assert len(optimal_solution[0].get_deployment('hadoop1').get_instances()) == 1
    assert optimal_solution[0].get_deployment('hadoop1').get_instances()[0].name == 'instA'
    assert optimal_solution[0].get_deployment('hadoop2').datanodes == 0
    assert optimal_solution[0].get_deployment('hadoop2').tasktrackers == 0
    assert len(optimal_solution[0].get_deployment('hadoop2').get_instances()) == 0
    # HOST B
    assert optimal_solution[1].name == 'hostB'
    assert optimal_solution[1].free_ram == 4.0 * 1024.0
    assert optimal_solution[1].cost == 6.0 * 1024.0
    assert optimal_solution[1].get_deployment('hadoop1').datanodes == 0
    assert optimal_solution[1].get_deployment('hadoop1').tasktrackers == 0
    assert len(optimal_solution[1].get_deployment('hadoop1').get_instances()) == 0
    assert optimal_solution[1].get_deployment('hadoop2').datanodes == 0
    assert optimal_solution[1].get_deployment('hadoop2').tasktrackers == 3
    assert len(optimal_solution[1].get_deployment('hadoop2').get_instances()) == 1
    assert optimal_solution[1].get_deployment('hadoop2').get_instances()[0].name == 'instB'
    assert optimal_cost == 5.0 * 1024.0
    # assert abs(original_unbalance_index - 0.428571428571) < 0.000000000001
    # assert abs(greedy_unbalance_index - 1.0) < 0.000000000001
    # assert abs(greedy_unbalance_diff - 1.333333333333) < 0.000000000001
    # assert abs(optimal_unbalance_index - 7.0) < 0.000000000001
    # assert abs(optimal_unbalance_diff - 15.333333333333) < 0.000000000001
...
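
In this test, each host in the solution exposes get_deployment(name), and each deployment exposes get_instances(), which returns the list of instance objects currently placed on it (each with a .name attribute). As a rough orientation only, here is a minimal sketch of the data model those assertions imply; the class shapes below are assumptions inferred from the test, not the project's actual code.

class Instance:
    def __init__(self, name):
        self.name = name

class Deployment:
    def __init__(self, name):
        self.name = name
        self.datanodes = 0
        self.tasktrackers = 0
        self._instances = []

    def add_instance(self, instance):
        self._instances.append(instance)

    def get_instances(self):
        # Return the instance objects currently assigned to this deployment.
        return self._instances

class Host:
    def __init__(self, name, free_ram, cost=0.0):
        self.name = name
        self.free_ram = free_ram
        self.cost = cost
        self._deployments = {}

    def get_deployment(self, name):
        # Return the deployment for this name, creating it if it does not exist yet,
        # so lookups like 'hadoop1' / 'hadoop2' always resolve.
        return self._deployments.setdefault(name, Deployment(name))

# Checks in the same shape as the test above:
host = Host('hostA', free_ram=11.0 * 1024.0)
host.get_deployment('hadoop1').add_instance(Instance('instA'))
assert len(host.get_deployment('hadoop1').get_instances()) == 1
assert host.get_deployment('hadoop1').get_instances()[0].name == 'instA'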

SOPdata_reader_helper.py

Source: SOPdata_reader_helper.py (GitHub)

...
        Dataset.__init__(self)
        data = []
        with open(file, "r", encoding="utf-8") as f:
            lines = csv.reader(f, delimiter='\t', quotechar=None)
            for instance in self.get_instances(lines):
                for proc in pipeline:
                    instance = proc(instance)
                data.append(instance)
        self.tensors = [torch.tensor(x, dtype=torch.long) for x in zip(*data)]

    def __len__(self):
        return self.tensors[0].size(0)

    def __getitem__(self, index):
        return tuple(tensor[index] for tensor in self.tensors)

    def get_instances(self, lines):
        """ get instance array from (csv-separated) line list """
        raise NotImplementedError

class load_MRPC_data(data_parser):
    """ Dataset class for MRPC """
    labels = ("0", "1")  # label names

    def __init__(self, file, pipeline=[]):
        super().__init__(file, pipeline)

    def get_instances(self, lines):
        for line in itertools.islice(lines, 1, None):  # skip header
            yield line[0], line[3], line[4]  # label, text_a, text_b

class load_MNLI_data(data_parser):
    """ Dataset class for MNLI """
    labels = ("contradiction", "entailment", "neutral")  # label names

    def __init__(self, file, pipeline=[]):
        super().__init__(file, pipeline)

    def get_instances(self, lines):
        for line in itertools.islice(lines, 1, None):  # skip header
            yield line[-1], line[8], line[9]  # label, text_a, text_b

class load_STSB_data(data_parser):
    """ Dataset class for STSB """
    labels = (None)  # label names

    def __init__(self, file, pipeline=[]):
        super().__init__(file, pipeline)

    def get_instances(self, lines):
        for line in itertools.islice(lines, 1, None):  # skip header
            yield line[-1], line[7], line[8]  # label, text_a, text_b

class load_QQP_data(data_parser):
    """ Dataset class for QQP """
    labels = ("0", "1")  # label names

    def __init__(self, file, pipeline=[]):
        super().__init__(file, pipeline)

    def get_instances(self, lines):
        for line in itertools.islice(lines, 1, None):  # skip header
            yield line[5], line[3], line[4]  # label, text_a, text_b

class load_QNLI_data(data_parser):
    """ Dataset class for QNLI """
    labels = ("entailment", "not_entailment")  # label names

    def __init__(self, file, pipeline=[]):
        super().__init__(file, pipeline)

    def get_instances(self, lines):
        for line in itertools.islice(lines, 1, None):  # skip header
            yield line[-1], line[1], line[2]  # label, text_a, text_b

class load_RTE_data(data_parser):
    """ Dataset class for RTE """
    labels = ("entailment", "not_entailment")  # label names

    def __init__(self, file, pipeline=[]):
        super().__init__(file, pipeline)

    def get_instances(self, lines):
        for line in itertools.islice(lines, 1, None):  # skip header
            yield line[-1], line[1], line[2]  # label, text_a, text_b

class load_WNLI_data(data_parser):
    """ Dataset class for WNLI """
    labels = ("0", "1")  # label names

    def __init__(self, file, pipeline=[]):
        super().__init__(file, pipeline)

    def get_instances(self, lines):
        for line in itertools.islice(lines, 1, None):  # skip header
            yield line[-1], line[1], line[2]  # label, text_a, text_b

def dataset_to_class_mapping(task):
    """ Mapping from task string to Dataset Class """
    table = {'mrpc': load_MRPC_data, 'mnli': load_MNLI_data, 'wnli': load_WNLI_data,
             'rte': load_RTE_data, 'qnli': load_QNLI_data, 'qqp': load_QQP_data,
             'stsb': load_STSB_data}
...
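
Here get_instances is the extension point: the data_parser base class handles reading the TSV file and turning processed instances into tensors, while each subclass only overrides get_instances to yield one (label, text_a, text_b) tuple per row for its task. Supporting a new task is therefore just another small subclass. The sketch below is a hypothetical example, not part of the original file; it reuses the module's data_parser base class and itertools import, and the task name, column indices, and header handling are assumptions about the TSV layout.

class load_SST2_data(data_parser):
    """ Dataset class for SST-2 (hypothetical example) """
    labels = ("0", "1")  # label names

    def __init__(self, file, pipeline=[]):
        super().__init__(file, pipeline)

    def get_instances(self, lines):
        for line in itertools.islice(lines, 1, None):  # skip header (assumed present)
            # label, text_a, text_b; text_b is left empty for single-sentence tasks
            yield line[1], line[0], ''

Such a class would then be added to the table built by dataset_to_class_mapping so it can be selected by its task string.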

test_conferences.py

Source: test_conferences.py (GitHub)

from datetime import date

import unittest

from mock import Mock
from twilio.rest.resources import Conferences

DEFAULT = {
    'DateUpdated<': None,
    'DateUpdated>': None,
    'DateUpdated': None,
    'DateCreated<': None,
    'DateCreated>': None,
    'DateCreated': None,
}

class ConferenceTest(unittest.TestCase):

    def setUp(self):
        self.resource = Conferences("foo", ("sid", "token"))
        self.params = DEFAULT.copy()

    def test_list(self):
        self.resource.get_instances = Mock()
        self.resource.list()
        self.resource.get_instances.assert_called_with(self.params)

    def test_list_after(self):
        self.resource.get_instances = Mock()
        self.resource.list(created_after=date(2011, 1, 1))
        self.params["DateCreated>"] = "2011-01-01"
        self.resource.get_instances.assert_called_with(self.params)

    def test_list_on(self):
        self.resource.get_instances = Mock()
        self.resource.list(created=date(2011, 1, 1))
        self.params["DateCreated"] = "2011-01-01"
        self.resource.get_instances.assert_called_with(self.params)

    def test_list_before(self):
        self.resource.get_instances = Mock()
        self.resource.list(created_before=date(2011, 1, 1))
        self.params["DateCreated<"] = "2011-01-01"
...
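
Each test replaces the resource's get_instances with a Mock(), calls the public list() helper, and then asserts that get_instances was invoked with the expected query parameters, so no HTTP request is ever made. The file is truncated here, but the DEFAULT dictionary hints at matching DateUpdated cases; a sketch of one such method for ConferenceTest, assuming list() accepts updated_after the same way it accepts created_after, might look like this (hypothetical, not from the original file):

    def test_list_updated_after(self):
        # Hypothetical continuation in the same style; assumes list() maps the
        # updated_after keyword onto the 'DateUpdated>' query parameter.
        self.resource.get_instances = Mock()
        self.resource.list(updated_after=date(2011, 1, 1))
        self.params["DateUpdated>"] = "2011-01-01"
        self.resource.get_instances.assert_called_with(self.params)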

Automation Testing Tutorials

Learn to execute automation testing from scratch with the LambdaTest Learning Hub. From setting up the prerequisites and running your first automation test to following best practices and diving into advanced test scenarios, the LambdaTest Learning Hub compiles step-by-step guides to help you become proficient with different test automation frameworks such as Selenium, Cypress, and TestNG.

YouTube

You can also refer to the video tutorials on the LambdaTest YouTube channel for step-by-step demonstrations from industry experts.

Run robotframework-pageobjects automation tests on LambdaTest cloud grid

Perform automation testing on 3000+ real desktop and mobile devices online.

Try LambdaTest Now!

Get 100 minutes of automation testing FREE!

Next-Gen App & Browser Testing Cloud
