How to use the fileName method in stryker-parent

The snippets below, collected from public GitHub projects, show typical ways a fileName/filename value is defined, composed, and consumed in build configuration, data-management, and API test code.

FILES.cfg

Source: FILES.cfg (GitHub)


# -*- python -*-
# ex: set syntax=python:
# Copyright (c) 2012 The Chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.

# This is a buildbot configuration file containing a tagged list of files
# processed by the stage/archive scripts. The known tags are:
#
# filename: Name of the file in the build output directory.
# arch: List of CPU architectures for which this file should be processed.
#       Leave this unspecified to process for all architectures.
#       Acceptable values are 64bit, 32bit and arm.
# buildtype: List of build types for which this file should be processed.
# archive: The name of the archive file to store filename in. If not specified,
#          filename is added to the default archive (e.g. platform.zip). If
#          archive == filename, filename is archived directly, not zipped.
# direct_archive: Force a file to be archived as-is, bypassing zip creation.
#                 NOTE: This flag will not apply if more than one file has the
#                 same 'archive' name, which will create a zip of all the
#                 files instead.
# filegroup: List of named groups to which this file belongs.
#            default: Legacy "default archive". TODO(mmoss): These should
#                     be updated to specify an 'archive' name and then this
#                     filegroup and the related archive_utils.ParseLegacyList()
#                     should go away.
#            symsrc: Files to upload to the symbol server.
# optional: List of buildtypes for which the file might not exist, and it's not
#           considered an error.

FILES = [
  {'filename': 'browser_tests.exe', 'buildtype': ['official'], 'archive': 'browser_tests.exe', 'optional': ['official']},
  {'filename': 'sync_integration_tests.exe', 'buildtype': ['official'], 'archive': 'sync_integration_tests.exe', 'optional': ['official']},
  {'filename': 'chrome.exe', 'buildtype': ['dev', 'official'], 'filegroup': ['default', 'symsrc']},
  {'filename': 'nacl64.exe', 'arch': ['32bit'], 'buildtype': ['dev', 'official'], 'filegroup': ['default', 'symsrc']},
  {'filename': 'chrome.dll', 'buildtype': ['dev', 'official'], 'filegroup': ['default', 'symsrc']},
  {'filename': 'chrome_child.dll', 'buildtype': ['dev', 'official'], 'filegroup': ['default', 'symsrc'], 'optional': ['dev', 'official']},
  {'filename': 'chrome_elf.dll', 'buildtype': ['dev', 'official'], 'filegroup': ['default', 'symsrc']},
  {'filename': 'eventlog_provider.dll', 'buildtype': ['dev', 'official'], 'filegroup': ['default']},
  {'filename': '*.manifest', 'buildtype': ['dev', 'official'], 'filegroup': ['default']},
  {'filename': 'chrome_100_percent.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'chrome_200_percent.pak', 'buildtype': ['dev', 'official'], 'optional': ['dev', 'official']},
  {'filename': 'First Run', 'buildtype': ['dev', 'official']},
  {'filename': 'icudtl.dat', 'buildtype': ['dev', 'official'], 'optional': ['dev', 'official']},
  {'filename': 'icudt.dll', 'buildtype': ['dev', 'official'], 'optional': ['dev', 'official']},
  {'filename': 'mojo_core.dll', 'buildtype': ['dev', 'official'], 'optional': ['dev', 'official']},
  {'filename': 'v8_context_snapshot.bin', 'buildtype': ['dev', 'official'], 'optional': ['dev', 'official']},
  {'filename': 'locales/ar.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/bg.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/bn.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/ca.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/cs.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/da.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/de.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/el.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/en-GB.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/en-US.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/es-419.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/es.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/et.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/fi.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/fil.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/fr.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/gu.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/he.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/hi.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/hr.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/hu.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/id.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/it.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/ja.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/kn.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/ko.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/lt.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/lv.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/ml.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/mr.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/ms.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/nb.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/nl.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/pl.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/pt-BR.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/pt-PT.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/ro.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/ru.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/sk.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/sl.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/sr.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/sv.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/ta.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/te.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/th.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/tr.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/uk.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/vi.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/zh-CN.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'locales/zh-TW.pak', 'buildtype': ['dev', 'official']},
  {'filename': 'policy_templates.zip', 'buildtype': ['official'], 'archive': 'policy_templates.zip'},
  {'filename': 'resources.pak', 'buildtype': ['dev', 'official']},
  # PNaCl translator (archive only, component updater used for shipping).
  {'filename': 'pnacl', 'buildtype': ['dev', 'official'], 'archive': 'pnacl.zip'},
  # Widevine CDM files:
  {'filename': 'WidevineCdm/manifest.json', 'buildtype': ['official']},
  {'filename': 'WidevineCdm/LICENSE', 'buildtype': ['official']},
  {'filename': 'WidevineCdm/_platform_specific/win_x86/widevinecdm.dll', 'arch': ['32bit'], 'buildtype': ['official']},
  {'filename': 'WidevineCdm/_platform_specific/win_x86/widevinecdm.dll.sig', 'arch': ['32bit'], 'buildtype': ['official']},
  {'filename': 'WidevineCdm/_platform_specific/win_x64/widevinecdm.dll', 'arch': ['64bit'], 'buildtype': ['official']},
  {'filename': 'WidevineCdm/_platform_specific/win_x64/widevinecdm.dll.sig', 'arch': ['64bit'], 'buildtype': ['official']},
  # ANGLE files:
  {'filename': 'D3DCompiler_47.dll', 'buildtype': ['dev', 'official']},
  {'filename': 'libEGL.dll', 'buildtype': ['dev', 'official'], 'filegroup': ['default', 'symsrc']},
  {'filename': 'libGLESv2.dll', 'buildtype': ['dev', 'official'], 'filegroup': ['default', 'symsrc']},
  # SwiftShader files:
  {'filename': 'swiftshader/libEGL.dll', 'buildtype': ['dev', 'official'], 'filegroup': ['default', 'symsrc']},
  {'filename': 'swiftshader/libGLESv2.dll', 'buildtype': ['dev', 'official'], 'filegroup': ['default', 'symsrc']},
  # Native Client plugin files:
  {'filename': 'nacl_irt_x86_32.nexe', 'arch': ['32bit'], 'buildtype': ['dev', 'official']},
  {'filename': 'nacl_irt_x86_64.nexe', 'buildtype': ['dev', 'official']},
  # Remoting files:
  {'filename': 'chromoting.msi', 'arch': ['32bit'], 'buildtype': ['dev', 'official'], 'archive': 'remoting-host.msi', 'direct_archive': 1, 'optional': ['dev']},
  {'filename': 'remoting-me2me-host-win.zip', 'arch': ['32bit'], 'buildtype': ['dev', 'official'], 'archive': 'remoting-me2me-host-win.zip', 'direct_archive': 1, 'optional': ['dev']},
  {'filename': 'remoting-me2me-host-win-unsupported.zip', 'arch': ['64bit'], 'buildtype': ['dev', 'official'], 'archive': 'remoting-me2me-host-win-unsupported.zip', 'direct_archive': 1, 'optional': ['dev']},
  {'filename': 'remote_assistance_host.exe', 'buildtype': ['official'], 'archive': 'remoting-win32.zip', 'filegroup': ['symsrc']},
  {'filename': 'remote_assistance_host.exe.pdb', 'buildtype': ['official'], 'archive': 'remoting-win32.zip'},
  {'filename': 'remote_assistance_host_uiaccess.exe', 'buildtype': ['official'], 'archive': 'remoting-win32.zip', 'filegroup': ['symsrc']},
  {'filename': 'remote_assistance_host_uiaccess.exe.pdb', 'buildtype': ['official'], 'archive': 'remoting-win32.zip'},
  {'filename': 'remote_security_key.exe', 'buildtype': ['official'], 'archive': 'remoting-win32.zip', 'filegroup': ['symsrc']},
  {'filename': 'remote_security_key.exe.pdb', 'buildtype': ['official'], 'archive': 'remoting-win32.zip'},
  {'filename': 'remoting_core.dll', 'buildtype': ['official'], 'archive': 'remoting-win32.zip', 'filegroup': ['symsrc']},
  {'filename': 'remoting_core.dll.pdb', 'buildtype': ['official'], 'archive': 'remoting-win32.zip', 'optional': ['official']},
  {'filename': 'remoting_desktop.exe', 'buildtype': ['official'], 'archive': 'remoting-win32.zip', 'filegroup': ['symsrc']},
  {'filename': 'remoting_desktop.exe.pdb', 'buildtype': ['official'], 'archive': 'remoting-win32.zip', 'optional': ['official']},
  {'filename': 'remoting_host.exe', 'buildtype': ['official'], 'archive': 'remoting-win32.zip', 'filegroup': ['symsrc']},
  {'filename': 'remoting_host.exe.pdb', 'buildtype': ['official'], 'archive': 'remoting-win32.zip'},
  {'filename': 'remoting_native_messaging_host.exe', 'buildtype': ['official'], 'archive': 'remoting-win32.zip', 'filegroup': ['symsrc']},
  {'filename': 'remoting_native_messaging_host.exe.pdb', 'buildtype': ['official'], 'archive': 'remoting-win32.zip'},
  {'filename': 'remoting_start_host.exe', 'buildtype': ['official'], 'archive': 'remoting-win32.zip', 'filegroup': ['symsrc']},
  {'filename': 'remoting_start_host.exe.pdb', 'buildtype': ['official'], 'archive': 'remoting-win32.zip'},
  # Credential Provider:
  {'filename': 'gcp_setup.exe', 'buildtype': ['dev', 'official'], 'optional': ['dev', 'official'], 'filegroup': ['symsrc']},
  {'filename': 'gcp_setup.exe.pdb', 'buildtype': ['dev', 'official'], 'optional': ['dev', 'official'], 'archive': 'chrome-win32-syms.zip'},
  {'filename': 'gaia1_0.dll', 'buildtype': ['dev', 'official'], 'optional': ['dev', 'official'], 'filegroup': ['symsrc']},
  {'filename': 'gaia1_0.dll.pdb', 'buildtype': ['dev', 'official'], 'optional': ['dev', 'official'], 'archive': 'chrome-win32-syms.zip'},
  {'filename': 'gcp_installer.exe', 'buildtype': ['official'], 'archive': 'gcp_installer.exe', 'filegroup': ['symsrc']},
  # Cloud Print files:
  {'filename': 'gcp_portmon.dll', 'buildtype': ['official'], 'archive': 'cloud_print.zip', 'filegroup': ['symsrc']},
  {'filename': 'gcp_portmon.dll.pdb', 'buildtype': ['official'], 'archive': 'cloud_print.zip', 'optional': ['official']},
  {'filename': 'gcp_portmon64.dll', 'buildtype': ['official'], 'archive': 'cloud_print.zip', 'filegroup': ['symsrc']},
  {'filename': 'gcp_driver.inf', 'buildtype': ['official'], 'archive': 'cloud_print.zip'},
  {'filename': 'gcp_driver.gpd', 'buildtype': ['official'], 'archive': 'cloud_print.zip'},
  {'filename': 'virtual_driver_setup.exe', 'buildtype': ['official'], 'archive': 'cloud_print.zip', 'filegroup': ['symsrc']},
  {'filename': 'virtual_driver_setup.exe.pdb', 'buildtype': ['official'], 'archive': 'cloud_print.zip'},
  # Test binaries for external QA:
  {'filename': 'interactive_ui_tests.exe', 'buildtype': ['dev', 'official'], 'optional': ['dev', 'official']},
  # Notification helper files:
  {'filename': 'notification_helper.exe', 'buildtype': ['dev', 'official'], 'filegroup': ['default', 'symsrc']},
  {'filename': 'notification_helper.exe.pdb', 'buildtype': ['dev', 'official'], 'archive': 'chrome-win32-syms.zip'},
  # Installer files (official build only):
  {'filename': 'setup.exe', 'buildtype': ['official'], 'archive': 'setup.exe', 'filegroup': ['symsrc']},
  {'filename': 'mini_installer.exe', 'buildtype': ['dev', 'official'], 'archive': 'mini_installer.exe', 'filegroup': ['symsrc']},
  {'filename': 'chrome.packed.7z', 'buildtype': ['official'], 'archive': 'chrome.packed.7z'},
  {'filename': 'mini_installer_exe_version.rc', 'buildtype': ['official'], 'archive': 'mini_installer_exe_version.rc'},
  {'filename': 'courgette.exe', 'buildtype': ['official'], 'archive': 'courgette.exe'},
  {'filename': 'courgette64.exe', 'buildtype': ['official'], 'archive': 'courgette64.exe'},
  {'filename': 'zucchini.exe', 'buildtype': ['official'], 'optional': ['official'], 'archive': 'zucchini.exe'},
  {'filename': 'zucchini.exe.pdb', 'buildtype': ['official'], 'optional': ['official'], 'archive': 'chrome-win32-syms.zip'},
  {'filename': 'chrome.dll.pdb', 'buildtype': ['dev', 'official'], 'archive': 'chrome-win32-syms.zip'},
  {'filename': 'chrome_child.dll.pdb', 'buildtype': ['dev', 'official'], 'optional': ['dev', 'official'], 'archive': 'chrome-win32-syms.zip'},
  {'filename': 'chrome_elf.dll.pdb', 'buildtype': ['dev', 'official'], 'archive': 'chrome-win32-syms.zip'},
  {'filename': 'chrome.exe.pdb', 'buildtype': ['dev', 'official'], 'archive': 'chrome-win32-syms.zip'},
  {'filename': 'eventlog_provider.dll.pdb', 'buildtype': ['dev', 'official'], 'archive': 'chrome-win32-syms.zip'},
  {'filename': 'libEGL.dll.pdb', 'buildtype': ['dev', 'official'], 'archive': 'chrome-win32-syms.zip'},
  {'filename': 'libGLESv2.dll.pdb', 'buildtype': ['dev', 'official'], 'archive': 'chrome-win32-syms.zip'},
  {'filename': 'mojo_core.dll.pdb', 'buildtype': ['dev', 'official'], 'archive': 'chrome-win32-syms.zip'},
  {'filename': 'swiftshader/libEGL.dll.pdb', 'buildtype': ['dev', 'official'], 'archive': 'chrome-win32-syms.zip'},
  {'filename': 'swiftshader/libGLESv2.dll.pdb', 'buildtype': ['dev', 'official'], 'archive': 'chrome-win32-syms.zip'},
  {'filename': 'mini_installer.exe.pdb', 'buildtype': ['dev', 'official'], 'archive': 'chrome-win32-syms.zip'},
  {'filename': 'nacl64.exe.pdb', 'arch': ['32bit'], 'buildtype': ['dev', 'official'], 'archive': 'chrome-win32-syms.zip'},
  {'filename': 'setup.exe.pdb', 'buildtype': ['dev', 'official'], 'archive': 'chrome-win32-syms.zip'},
  # Updater files (official build only):
  {'filename': 'updater.exe', 'buildtype': ['official'], 'archive': 'updater.exe'},
  {'filename': 'UpdaterSetup.exe', 'buildtype': ['official'], 'archive': 'UpdaterSetup.exe'},
  # Partner API files.
  {'filename': 'gcapi.h', 'buildtype': ['dev', 'official'], 'archive': 'gcapi.zip'},
  {'filename': 'gcapi_dll.dll', 'buildtype': ['dev', 'official'], 'archive': 'gcapi.zip', 'filegroup': ['symsrc']},
  {'filename': 'gcapi_dll.dll.lib', 'buildtype': ['dev', 'official'], 'archive': 'gcapi.zip'},
  {'filename': 'gcapi_dll.dll.pdb', 'buildtype': ['dev', 'official'], 'archive': 'chrome-win32-syms.zip'},
  {'filename': 'nacl_irt_x86_32.nexe.debug', 'arch': ['32bit'], 'buildtype': ['official'], 'archive': 'chrome-win32-nacl-irt-syms.zip'},
  {'filename': 'nacl_irt_x86_64.nexe.debug', 'buildtype': ['official'], 'archive': 'chrome-win32-nacl-irt-syms.zip'},
  # Content shell files:
  {'filename': 'blink_deprecated_test_plugin.dll', 'buildtype': ['dev'], 'archive': 'content-shell.zip', 'optional': ['dev']},
  {'filename': 'blink_test_plugin.dll', 'buildtype': ['dev'], 'archive': 'content-shell.zip', 'optional': ['dev']},
  {'filename': 'content_shell.exe', 'buildtype': ['dev'], 'archive': 'content-shell.zip', 'optional': ['dev']},
  {'filename': 'content_shell.pak', 'buildtype': ['dev'], 'archive': 'content-shell.zip', 'optional': ['dev']},
  {'filename': 'icudtl.dat', 'buildtype': ['dev'], 'archive': 'content-shell.zip', 'optional': ['dev']},
  {'filename': 'v8_context_snapshot.bin', 'buildtype': ['dev'], 'archive': 'content-shell.zip', 'optional': ['dev']},
  {'filename': 'resources', 'buildtype': ['dev'], 'archive': 'content-shell.zip', 'optional': ['dev']},
  # Metrics metadata files:
  {'filename': 'actions.xml', 'buildtype': ['dev', 'official'], 'archive': 'metrics-metadata.zip', 'optional': ['dev', 'official']},
  {'filename': 'histograms.xml', 'buildtype': ['dev', 'official'], 'archive': 'metrics-metadata.zip', 'optional': ['dev', 'official']},
  {'filename': 'rappor.xml', 'buildtype': ['dev', 'official'], 'archive': 'metrics-metadata.zip', 'optional': ['dev', 'official']},
  {'filename': 'ukm.xml', 'buildtype': ['dev', 'official'], 'archive': 'metrics-metadata.zip', 'optional': ['dev', 'official']},
  # MEI Preload files:
  {'filename': 'MEIPreload/manifest.json', 'buildtype': ['dev', 'official']},
  {'filename': 'MEIPreload/preloaded_data.pb', 'buildtype': ['dev', 'official']},
  # ChromeDriver binary:
  {'filename': 'chromedriver.exe', 'arch': ['32bit'], 'buildtype': ['dev', 'official'], 'archive': 'chromedriver_win32.zip', 'optional': ['dev', 'official'], 'filegroup': ['symsrc']},
  {'filename': 'chromedriver.exe.pdb', 'buildtype': ['dev', 'official'], 'archive': 'chromedriver_win32-syms.zip', 'optional': ['dev', 'official']},
  # Elevation service files:
  {'filename': 'elevation_service.exe', 'buildtype': ['dev', 'official'], 'filegroup': ['default', 'symsrc']},
  {'filename': 'elevation_service.exe.pdb', 'buildtype': ['dev', 'official'], 'archive': 'chrome-win32-syms.zip'},
  # Bookmark apps shortcut target:
  {'filename': 'chrome_proxy.exe', 'buildtype': ['dev', 'official'], 'filegroup': ['default', 'symsrc']},
  {'filename': 'chrome_proxy.exe.pdb', 'buildtype': ['dev', 'official'], 'archive': 'chrome-win32-syms.zip'},
  # DevTools front-end files:
  {'filename': 'resources/inspector', 'buildtype': ['dev', 'official'], 'archive': 'devtools-frontend.zip'},
  # Policy cloud documentation source files:
  {'filename': 'gen/chrome/app/policy/translations/policy_templates_de-DE.json', 'buildtype': ['official'], 'archive': 'policy_templates_de-DE.json', 'direct_archive': 1, 'optional': ['official']},
  {'filename': 'gen/chrome/app/policy/translations/policy_templates_en-US.json', 'buildtype': ['official'], 'archive': 'policy_templates_en-US.json', 'direct_archive': 1, 'optional': ['official']},
  {'filename': 'gen/chrome/app/policy/translations/policy_templates_es-419.json', 'buildtype': ['official'], 'archive': 'policy_templates_es-419.json', 'direct_archive': 1, 'optional': ['official']},
  {'filename': 'gen/chrome/app/policy/translations/policy_templates_es-ES.json', 'buildtype': ['official'], 'archive': 'policy_templates_es-ES.json', 'direct_archive': 1, 'optional': ['official']},
  {'filename': 'gen/chrome/app/policy/translations/policy_templates_fr-FR.json', 'buildtype': ['official'], 'archive': 'policy_templates_fr-FR.json', 'direct_archive': 1, 'optional': ['official']},
  {'filename': 'gen/chrome/app/policy/translations/policy_templates_id-ID.json', 'buildtype': ['official'], 'archive': 'policy_templates_id-ID.json', 'direct_archive': 1, 'optional': ['official']},
  {'filename': 'gen/chrome/app/policy/translations/policy_templates_it-IT.json', 'buildtype': ['official'], 'archive': 'policy_templates_it-IT.json', 'direct_archive': 1, 'optional': ['official']},
  {'filename': 'gen/chrome/app/policy/translations/policy_templates_ja-JP.json', 'buildtype': ['official'], 'archive': 'policy_templates_ja-JP.json', 'direct_archive': 1, 'optional': ['official']},
  {'filename': 'gen/chrome/app/policy/translations/policy_templates_ko-KR.json', 'buildtype': ['official'], 'archive': 'policy_templates_ko-KR.json', 'direct_archive': 1, 'optional': ['official']},
  {'filename': 'gen/chrome/app/policy/translations/policy_templates_nl-NL.json', 'buildtype': ['official'], 'archive': 'policy_templates_nl-NL.json', 'direct_archive': 1, 'optional': ['official']},
  {'filename': 'gen/chrome/app/policy/translations/policy_templates_pt-BR.json', 'buildtype': ['official'], 'archive': 'policy_templates_pt-BR.json', 'direct_archive': 1, 'optional': ['official']},
  {'filename': 'gen/chrome/app/policy/translations/policy_templates_ru-RU.json', 'buildtype': ['official'], 'archive': 'policy_templates_ru-RU.json', 'direct_archive': 1, 'optional': ['official']},
  {'filename': 'gen/chrome/app/policy/translations/policy_templates_th-TH.json', 'buildtype': ['official'], 'archive': 'policy_templates_th-TH.json', 'direct_archive': 1, 'optional': ['official']},
  {'filename': 'gen/chrome/app/policy/translations/policy_templates_tr-TR.json', 'buildtype': ['official'], 'archive': 'policy_templates_tr-TR.json', 'direct_archive': 1, 'optional': ['official']},
  {'filename': 'gen/chrome/app/policy/translations/policy_templates_uk-UA.json', 'buildtype': ['official'], 'archive': 'policy_templates_uk-UA.json', 'direct_archive': 1, 'optional': ['official']},
  {'filename': 'gen/chrome/app/policy/translations/policy_templates_vi-VN.json', 'buildtype': ['official'], 'archive': 'policy_templates_vi-VN.json', 'direct_archive': 1, 'optional': ['official']},
  {'filename': 'gen/chrome/app/policy/translations/policy_templates_zh-CN.json', 'buildtype': ['official'], 'archive': 'policy_templates_zh-CN.json', 'direct_archive': 1, 'optional': ['official']},
  {'filename': 'gen/chrome/app/policy/translations/policy_templates_zh-TW.json', 'buildtype': ['official'], 'archive': 'policy_templates_zh-TW.json', 'direct_archive': 1, 'optional': ['official']},
  # Progressive Web App launcher executable:
  {'filename': 'chrome_pwa_launcher.exe', 'buildtype': ['dev', 'official'], 'filegroup': ['default', 'symsrc']},
  {'filename': 'chrome_pwa_launcher.exe.pdb', 'buildtype': ['dev', 'official'], 'archive': 'chrome-win32-syms.zip'},
  ...
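
The tags documented at the top of FILES.cfg (filename, arch, buildtype, archive, direct_archive, filegroup, optional) describe how each entry is selected and packaged by the stage/archive scripts. Below is a minimal, hypothetical Python sketch of that selection step; select_files and the 'platform.zip' fallback are illustrative names only, not part of the actual Chromium tooling.

def select_files(files, buildtype, arch):
    # Keep only the entries that apply to this build type and architecture,
    # pairing each filename with the archive it would be stored in.
    selected = []
    for entry in files:
        if buildtype not in entry['buildtype']:
            continue  # not staged for this build type
        if 'arch' in entry and arch not in entry['arch']:
            continue  # restricted to other CPU architectures
        selected.append((entry['filename'], entry.get('archive', 'platform.zip')))
    return selected

# Example with two entries from the list above: an official 64-bit build
# picks up chrome.exe but skips the 32-bit-only nacl64.exe.
sample = [
    {'filename': 'chrome.exe', 'buildtype': ['dev', 'official'], 'filegroup': ['default', 'symsrc']},
    {'filename': 'nacl64.exe', 'arch': ['32bit'], 'buildtype': ['dev', 'official'], 'filegroup': ['default', 'symsrc']},
]
print(select_files(sample, buildtype='official', arch='64bit'))
# [('chrome.exe', 'platform.zip')]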


fileManaging.py

Source: fileManaging.py (GitHub)


import os
import pickle
import numpy as np
from time import sleep
import random
from ATLASClusterInterface import errorCorrectionsAndTests as EC
import yaml

file = open('configurations.yml', 'r')
docs = yaml.full_load(file)
file.close()
directories = docs['directories']


def get_storage_dir():
    cwd = os.getcwd()
    if cwd[0] == 'D':
        storage_dir = directories['storage_dir_office']
    if cwd[0] == 'C':
        storage_dir = directories['storage_dir_laptop']
    if cwd[0] == '/':
        storage_dir = directories['storage_dir_ATLAS']
    return storage_dir


"""
In case access to file is problematic
"""


def random_read_helper(func, params):
    try:
        return func(*params)
    except EOFError:
        sleep(random.random())
        return func(*params)


"""
Basic building blocks (basis, interaction of 2 particles)
"""


def filename_two_particle_matrices(Mmin, Mmax, potential_type):
    directory = 'pkl_data/two_particle_hamiltonian_annulus/'
    filename = 'Annulus_' + potential_type + '_two_particle_hamiltonian_Mmin=' + str(Mmin) + '_Mmax=' + str(
        Mmax) + '.pkl'
    storage_dir = get_storage_dir()
    directory = storage_dir + directory
    if not os.path.exists(directory):
        os.mkdir(directory)
    filename = directory + filename

    return filename


def directory_two_particle_matrices():
    directory = 'pkl_data/two_particle_hamiltonian_annulus/'
    storage_dir = get_storage_dir()
    directory = storage_dir + directory
    return directory


def write_two_particle_matrices(Mmin, Mmax, potential_type, H_2_particles):
    filename = filename_two_particle_matrices(Mmin, Mmax, potential_type)
    file1 = open(filename, 'wb')
    dic = {"two_particle_hamiltonian": H_2_particles}
    pickle.dump(dic, file1)
    file1.close()
    print("wrote two_particle_hamiltonian into file: " + filename)
    return 0


def read_two_particle_matrices(Mmin, Mmax, potential_type):
    filename = filename_two_particle_matrices(Mmin, Mmax, potential_type)
    file1 = open(filename, 'rb')
    dic = pickle.load(file1)
    file1.close()
    H_2_particles = dic["two_particle_hamiltonian"]
    print("read two_particle_hamiltonian out of: " + filename)
    return H_2_particles


def filename_basis_annulus(Mmin, Mmax, N):
    filename = 'basis_annulus_Mmin=' + str(Mmin) + '_Mmax=' + str(Mmax) + '_N=' + str(N) + '.npz'
    directory = 'pkl_data/basis_annulus/' + str(N) + '_particles/'
    storage_dir = get_storage_dir()
    directory = storage_dir + directory
    if not os.path.exists(directory):
        os.mkdir(directory)
    filename = directory + filename

    return filename


def read_basis_annulus(Mmin, Mmax, N):
    filename = filename_basis_annulus(Mmin, Mmax, N)
    # filename = filename_basis_annulus(0, Mmax - Mmin, N)
    basis_list = np.load(filename)['basis_list']
    return basis_list


def write_basis_annulus(Mmin, Mmax, N, basis_list):
    filename = filename_basis_annulus(Mmin, Mmax, N)
    # filename = filename_basis_annulus(0, Mmax - Mmin, N)
    np.savez(filename, basis_list=basis_list)
    print("wrote basis_list into file: " + filename)
    return 0


def filename_basis_annulus_const_lz(Mmin, Mmax, N, lz_val):
    directory = 'pkl_data/basis_annulus/' + str(N) + '_particles/'
    filename = 'basis_annulus_Mmin=' + str(Mmin) + '_Mmax=' + str(Mmax) + '_N=' + str(N) + '_lz_val=' + str(
        lz_val) + '.npz'
    storage_dir = get_storage_dir()
    directory = storage_dir + directory
    if not os.path.exists(directory):
        os.mkdir(directory)
    filename = directory + filename

    return filename


def read_basis_annulus_const_lz(Mmin, Mmax, N, lz_val):
    filename = filename_basis_annulus_const_lz(Mmin, Mmax, N, lz_val)
    basis_list = np.load(filename)['basis_list']
    return basis_list


def write_basis_annulus_const_lz(Mmin, Mmax, N, lz_val, basis_list):
    filename = filename_basis_annulus_const_lz(Mmin, Mmax, N, lz_val)
    np.savez(filename, basis_list=basis_list)
    print("wrote basis_list into file: " + filename)
    return 0


def directory_sizes_of_hilbert_space():
    directory = 'pkl_data/basis_annulus/sizes_of_hilbert_space/'
    storage_dir = get_storage_dir()
    directory = storage_dir + directory
    if not os.path.exists(directory):
        os.mkdir(directory)
    return directory


def filename_size_of_hilbert_space(Mmin, Mmax, N, lz_val):
    directory = directory_sizes_of_hilbert_space() + str(N) + '_particles/'
    if not os.path.exists(directory):
        os.mkdir(directory)
    filename = str(Mmin) + '_' + str(Mmax) + '_' + str(N) + '_' + str(lz_val) + '.npz'

    filename = directory + filename
    return filename


def read_size_of_hilbert_space(Mmin, Mmax, N, lz_val):
    filename = filename_size_of_hilbert_space(Mmin, Mmax, N, lz_val)
    hilbert_space_size = np.load(filename)['hilbert_space_size']
    return hilbert_space_size


def write_size_of_hilbert_space(Mmin, Mmax, N, lz_val, hilbert_space_size):
    filename = filename_size_of_hilbert_space(Mmin, Mmax, N, lz_val)
    np.savez(filename, hilbert_space_size=hilbert_space_size)
    print("wrote size of hilbert space into: " + filename)

    return 0


"""
Spectrum
"""


def filename_low_lying_spectrum(MminL, MmaxL, edge_states, N, lz_val, hamiltonian_labels, parameters):
    common_args = [MminL, MmaxL, edge_states, N, lz_val]
    common_args = [str(a) for a in common_args]
    filename = 'low_lying_spectrum_' + '_'.join(common_args) + '_'

    hamiltonian_args = [str(hamiltonian_labels[i]) + '_' + str(parameters[i]) for i in range(len(hamiltonian_labels))]
    filename = filename + '_'.join(hamiltonian_args) + '.pkl'
    directory = 'pkl_data/low_lying_spectrum_annulus/' + str(N) + '_particles/'
    storage_dir = get_storage_dir()
    directory = storage_dir + directory
    if not os.path.exists(directory):
        os.mkdir(directory)
    filename = directory + filename
    return filename


def read_low_lying_spectrum(MminL, MmaxL, edge_states, N, lz_val, hamiltonian_labels, parameters):
    filename = filename_low_lying_spectrum(MminL, MmaxL, edge_states, N, lz_val, hamiltonian_labels, parameters)
    spectrum = np.load(filename)['spectrum']

    return spectrum


def write_low_lying_spectrum(MminL, MmaxL, edge_states, N, lz_val, hamiltonian_labels, parameters, spectrum):
    filename = filename_low_lying_spectrum(MminL, MmaxL, edge_states, N, lz_val, hamiltonian_labels, parameters)
    np.savez(filename, spectrum=spectrum)
    print("wrote spectrum to file " + filename)
    return 0


def directory_full_spectrum(MminL, MmaxL, edge_states, N):
    directory = 'pkl_data/full_spectrum_annulus/' + str(N) + '_particles/'
    storage_dir = get_storage_dir()
    directory = storage_dir + directory
    if not os.path.exists(directory):
        os.mkdir(directory)

    sub_dir = '_'.join([str(MminL), str(MmaxL), str(edge_states)]) + '/'
    directory = directory + sub_dir
    if not os.path.exists(directory):
        os.mkdir(directory)
    return directory


def filename_full_spectrum(MminL, MmaxL, edge_states, N, window_of_lz, hamiltonian_labels, parameters):
    # window_of_lz = [number,'all','not_conserved']

    hamiltonian_args = [str(hamiltonian_labels[i]) + '_' + str(parameters[i]) for i in range(len(hamiltonian_labels))]
    args = [MminL, MmaxL, edge_states, N, 'Ham_lbls', *hamiltonian_args, 'lz_win', window_of_lz]
    args = [str(a) for a in args]
    filename = 'full_spectrum_' + '_'.join(args) + '.pkl'
    directory = directory_full_spectrum(MminL, MmaxL, edge_states, N)
    filename = directory + filename
    return filename


def read_full_spectrum(MminL, MmaxL, edge_states, N, window_of_lz, hamiltonian_labels, parameters):
    filename = filename_full_spectrum(MminL, MmaxL, edge_states, N, window_of_lz, hamiltonian_labels, parameters)
    file1 = open(filename, 'rb')
    data = pickle.load(file1)
    file1.close()

    spectrum = data['spectrum']
    return spectrum


def filename_spectrum_luttinger_parm(MminL, MmaxL, edge_states, N, window_of_lz, hamiltonian_labels, parameters):
    # window_of_lz = [number,'all','not_conserved']

    hamiltonian_args = [str(hamiltonian_labels[i]) + '_' + str(parameters[i]) for i in range(len(hamiltonian_labels))]
    args = [MminL, MmaxL, edge_states, N, 'Ham_lbls', *hamiltonian_args, 'lz_win', window_of_lz]
    args = [str(a) for a in args]
    filename = 'luttinger_parm_spectrum_' + '_'.join(args) + '.pkl'
    directory = directory_full_spectrum(MminL, MmaxL, edge_states, N)
    filename = directory + filename
    return filename


def read_spectrum_luttinger_parm(MminL, MmaxL, edge_states, N, window_of_lz, hamiltonian_labels, parameters):
    filename = filename_spectrum_luttinger_parm(MminL, MmaxL, edge_states, N, window_of_lz, hamiltonian_labels,
                                                parameters)
    file1 = open(filename, 'rb')
    data = pickle.load(file1)
    file1.close()

    spectrum = data['spectrum']
    return spectrum


def filename_spectrum_lz_total_vals(MminL, MmaxL, edge_states, N, hamiltonian_labels, parameters):
    hamiltonian_args = [str(hamiltonian_labels[i]) + '_' + str(parameters[i]) for i in range(len(hamiltonian_labels))]
    args = [MminL, MmaxL, edge_states, N, 'Ham_lbls', *hamiltonian_args]
    args = [str(a) for a in args]
    filename = 'spectrum_lz_total_vals_' + '_'.join(args) + '.pkl'
    directory = directory_full_spectrum(MminL, MmaxL, edge_states, N)
    filename = directory + filename
    return filename


def directory_spectrum_eigenstates(MminL, MmaxL, edge_states, N):
    directory = 'pkl_data/eigenstates_full_spectrum_annulus/' + str(N) + '_particles/'
    storage_dir = get_storage_dir()
    directory = storage_dir + directory
    if not os.path.exists(directory):
        os.mkdir(directory)

    sub_dir = '_'.join([str(MminL), str(MmaxL), str(edge_states)]) + '/'
    directory = directory + sub_dir
    if not os.path.exists(directory):
        os.mkdir(directory)
    return directory


def filename_spectrum_eigenstates(MminL, MmaxL, edge_states, N, lz_val, hamiltonian_labels, parameters):
    common_args = [MminL, MmaxL, edge_states, N, lz_val]
    common_args = [str(a) for a in common_args]
    filename = 'eigenstates_full_spectrum_' + '_'.join(common_args) + '_'

    hamiltonian_args = [str(hamiltonian_labels[i]) + '_' + str(parameters[i]) for i in range(len(hamiltonian_labels))]
    filename = filename + '_'.join(hamiltonian_args) + '.pkl'
    directory = directory_spectrum_eigenstates(MminL, MmaxL, edge_states, N)
    filename = directory + filename
    return filename


def read_spectrum_eigenstates(MminL, MmaxL, edge_states, N, lz_val, hamiltonian_labels, parameters):
    filename = filename_spectrum_eigenstates(MminL, MmaxL, edge_states, N, lz_val, hamiltonian_labels, parameters)
    file1 = open(filename, 'rb')
    dic = pickle.load(file1)
    file1.close()
    spectrum = dic['spectrum_eigenstates']
    return spectrum


def read_spectrum_eigenstates_from_file(filename):
    file1 = open(filename, 'rb')
    dic = pickle.load(file1)
    file1.close()
    spectrum = dic['spectrum_eigenstates']
    return spectrum


def write_spectrum_eigenstates(MminL, MmaxL, edge_states, N, lz_val, hamiltonian_labels, parameters, spectrum):
    filename = filename_spectrum_eigenstates(MminL, MmaxL, edge_states, N, lz_val, hamiltonian_labels, parameters)
    file1 = open(filename, 'wb')
    dic = {'spectrum_eigenstates': spectrum}
    pickle.dump(dic, file1)
    file1.close()
    print("wrote low lying spectrum into: " + filename)
    return 0


def write_spectrum_eigenstates_to_file(filename, spectrum):
    file1 = open(filename, 'wb')
    dic = {'spectrum_eigenstates': spectrum}
    pickle.dump(dic, file1)
    file1.close()
    return 0


def complete_matrices_directory(matrix_name, args):
    hilbert_space_string = "_".join([str(a) for a in args[:-1]])
    # directory = 'pkl_data/matrices/' + hilbert_space_string + '/'
    storage_dir = get_storage_dir()
    dir_N = storage_dir + 'pkl_data/matrices/' + str(args[3]) + '_particles/'
    if not os.path.exists(dir_N):
        os.mkdir(dir_N)
    dir_parms = dir_N + hilbert_space_string + '/'
    if not os.path.exists(dir_parms):
        os.mkdir(dir_parms)
    directory = dir_parms + matrix_name + '/'

    if not os.path.exists(directory):
        os.mkdir(directory)
    return directory


def filename_matrix_pieces_directory(matrix_name, args):
    directory = complete_matrices_directory(matrix_name, args)

    args_str = [str(a) for a in args]
    one_string = "_".join(args_str)
    directory = directory + matrix_name + '_' + one_string + '/'

    if not os.path.exists(directory):
        os.mkdir(directory)
    return directory


def filename_complete_matrix(matrix_name, args):
    args_str = [str(a) for a in args]
    one_string = "_".join(args_str)
    filename = matrix_name + '_' + one_string + '.npz'
    directory = complete_matrices_directory(matrix_name, args)
    filename = directory + filename
    return filename


def read_complete_matrix(matrix_name, args, output=1):
    # args = [MminL, MmaxL, edge_states, N, lz_val, matrix_label]
    filename = filename_complete_matrix(matrix_name, args)
    if EC.does_file_really_exist(filename):
        if not output:
            print("complete matrix exists")
            return 1
        npzfile = np.load(filename)
        row = npzfile['row']
        col = npzfile['col']
        matrix_elements = npzfile['matrix_elements']
        return row, col, matrix_elements

    print("complete matrix doesn't exist yet")
    return 0


def write_complete_matrix(matrix_name, args, row, col, matrix_elements):
    # args = [MminL, MmaxL, edge_states, N, lz_val, matrix_label]
    filename = filename_complete_matrix(matrix_name, args)
    np.savez(filename, col=col, row=row, matrix_elements=matrix_elements)
    return 0


def filename_matrix_piece(matrix_name, args, slice):
    directory = filename_matrix_pieces_directory(matrix_name, args)
    slice_str = str(slice[0]) + '_to_' + str(slice[1]) + '.npz'
    filename = directory + slice_str
    return filename


def read_matrix_piece(matrix_name, args, slice):
    filename = filename_matrix_piece(matrix_name, args, slice)
    npzfile = np.load(filename)
    col = npzfile['col']
    row = npzfile['row']
    matrix_elements = npzfile['matrix_elements']

    return row, col, matrix_elements


def write_matrix_piece(matrix_name, args, slice, row, col, matrix_elements):
    filename = filename_matrix_piece(matrix_name, args, slice)
    np.savez(filename, col=col, row=row, matrix_elements=matrix_elements)
    print("wrote matrix piece into file " + filename)
    return 0


def directory_parameter_files_annulus():
    directory = 'pkl_data/parameter_configuration_annulus/'
    storage_dir = get_storage_dir()
    directory = storage_dir + directory
    if not os.path.exists(directory):
        os.mkdir(directory)
    return directory


def filename_parameters_annulus(name):
    directory = directory_parameter_files_annulus()
    fullfilename = directory + name
    if fullfilename[-4:] != '.yml':
        fullfilename = fullfilename + '.yml'
    return fullfilename


def get_short_parameters_filename(full_filename):
    parameters_directory_path = directory_parameter_files_annulus()
    short_filename = full_filename[len(parameters_directory_path):]
    return short_filename


def filename_spectrum_vs_magnetic_flux(MminL, MmaxL, edge_states, N, hamiltonian_labels, parameters):
    hamiltonian_args = [str(hamiltonian_labels[i]) + '_' + str(parameters[i]) for i in range(len(hamiltonian_labels))]
    args = [MminL, MmaxL, edge_states, N, 'Ham_lbls', *hamiltonian_args]
    args = [str(a) for a in args]
    filename = 'spectrum_vs_magnetic_flux_' + "_".join(args) + '.pkl'
    directory = directory_full_spectrum(MminL, MmaxL, edge_states, N)
    filename = directory + filename
    return filename


def edge_subspace_matrices_directory(matrix_name, args):
    # args = [MminL,MmaxL,edge_states,N,lz_val,matrix_label]
    hilbert_space_string = "_".join([str(a) for a in args[:-1]])
    directory = 'pkl_data/matrices_edge_state_subspace/' + hilbert_space_string + '/'

    storage_dir = get_storage_dir()
    directory = storage_dir + directory
    if not os.path.exists(directory):
        os.mkdir(directory)
    return directory


def filename_edge_subspace_matrix(matrix_name, matrix_dim, args):
    args_str = [str(a) for a in args]
    one_string = "_".join(args_str)
    filename = matrix_name + '_matrix_dim=' + str(matrix_dim) + '_' + one_string + '.npz'
    directory = edge_subspace_matrices_directory(matrix_name, args)
    filename = directory + filename
    return filename


def read_edge_subspace_matrix(matrix_name, matrix_dim, args, output=1):
    # args = [MminL, MmaxL, edge_states, N, lz_val, matrix_label]
    filename = filename_edge_subspace_matrix(matrix_name, matrix_dim, args)
    if EC.does_file_really_exist(filename):
        if not output:
            print("complete matrix exists")
            return 1
        npzfile = np.load(filename)
        matrix = npzfile['matrix']
        return matrix

    print("complete matrix doesn't exist yet")
    return 0


def write_edge_subspace_matrix(matrix_name, args, matrix_dim, matrix):
    # args = [MminL, MmaxL, edge_states, N, lz_val, matrix_label]
    filename = filename_edge_subspace_matrix(matrix_name, matrix_dim, args)
    # filename = filename_edge_subspace_matrix(matrix_name, args)
    np.savez(filename, matrix=matrix)
    print("wrote subspace matrix into file " + filename)
    return 0


def directory_spectrum_subspace(MminL, MmaxL, edge_states, N):
    directory = 'pkl_data/spectrum_subspace_annulus/' + str(N) + '_particles/'
    storage_dir = get_storage_dir()
    directory = storage_dir + directory
    if not os.path.exists(directory):
        os.mkdir(directory)

    sub_dir = '_'.join([str(MminL), str(MmaxL), str(edge_states)]) + '/'
    directory = directory + sub_dir
    if not os.path.exists(directory):
        os.mkdir(directory)
    return directory


def filename_lz_total_spectrum_low_lying_subspace(MminL, MmaxL, edge_states, N, hamiltonian_labels,
                                                  parameters, subspace_size):
    # not including interactions because we only use toy interactions for the calculation
    hamiltonian_args = [str(hamiltonian_labels[i]) + '_' + str(parameters[i]) for i in
                        range(1, len(hamiltonian_labels))]
    args = [MminL, MmaxL, edge_states, N, 'Ham_lbls', *hamiltonian_args, 'subspace_dim', subspace_size]
    args = [str(a) for a in args]
    filename = 'spectrum_lz_total_vals_' + '_'.join(args) + '.pkl'
    directory = directory_spectrum_subspace(MminL, MmaxL, edge_states, N)
    filename = directory + filename
    return filename


def filename_density_profile_groundstate(MminL, MmaxL, edge_states, N, hamiltonian_labels,
                                         parameters, num_points):
    directory = 'pkl_data/density_profiles/' + str(N) + '_particles/'
    storage_dir = get_storage_dir()
    directory = storage_dir + directory
    if not os.path.exists(directory):
        os.mkdir(directory)
    hamiltonian_args = [str(hamiltonian_labels[i]) + '_' + str(parameters[i]) for i in range(len(hamiltonian_labels))]
    args = [MminL, MmaxL, edge_states, N, num_points, 'Ham_lbls', *hamiltonian_args]
    args = [str(a) for a in args]
    filename = 'density_profile_groundstate_' + '_'.join(args)
    filename = directory + filename
    return filename


def read_spectrum_data_from_file(filename):
    file1 = open(filename, 'rb')
    data = pickle.load(file1)
    file1.close()

    spectrum = data['spectrum'] ...
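
Every read_*/write_* pair above funnels through a matching filename_* helper, so callers never hard-code paths. The fragment below is an illustrative round trip, assuming configurations.yml provides the 'directories' mapping, the resolved storage directory already contains the pkl_data/ tree, and the project's packages (DataManaging, ATLASClusterInterface) are importable; it is a usage sketch, not part of the module itself.

import numpy as np
from DataManaging import fileManaging as FM  # import style used elsewhere in this project

# Stand-in for a real two-particle Hamiltonian block.
H = np.eye(4)

# Write and read back through the shared filename convention.
FM.write_two_particle_matrices(Mmin=0, Mmax=5, potential_type='toy', H_2_particles=H)
H_back = FM.read_two_particle_matrices(Mmin=0, Mmax=5, potential_type='toy')
assert np.allclose(H, H_back)

# The same helper also exposes just the path, e.g. for logging or cleanup.
print(FM.filename_two_particle_matrices(0, 5, 'toy'))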


test_filestorage.py

Source: test_filestorage.py (GitHub)


# Copyright 2013-2016 Canonical Ltd. This software is licensed under the
# GNU Affero General Public License version 3 (see the file LICENSE).
"""Tests for file-storage API."""
from base64 import b64decode
import http.client
from django.urls import reverse
from testtools.matchers import Contains, Equals, MatchesListwise
from maasserver.models import FileStorage
from maasserver.testing.api import APITestCase
from maasserver.testing.factory import factory
from maasserver.utils.converters import json_load_bytes
from maastesting.utils import sample_binary_data


class FileStorageAPITestMixin:
    def _create_API_params(self, op=None, filename=None, fileObj=None):
        params = {}
        if op is not None:
            params["op"] = op
        if filename is not None:
            params["filename"] = filename
        if fileObj is not None:
            params["file"] = fileObj
        return params

    def make_API_POST_request(self, op=None, filename=None, fileObj=None):
        """Make an API POST request and return the response."""
        params = self._create_API_params(op, filename, fileObj)
        return self.client.post(reverse("files_handler"), params)

    def make_API_GET_request(self, op=None, filename=None, fileObj=None):
        """Make an API GET request and return the response."""
        params = self._create_API_params(op, filename, fileObj)
        return self.client.get(reverse("files_handler"), params)


class AnonymousFileStorageAPITest(
    FileStorageAPITestMixin, APITestCase.ForAnonymous
):
    def test_get_does_not_work_anonymously(self):
        storage = factory.make_FileStorage()
        response = self.make_API_GET_request("get", storage.filename)
        self.assertEqual(http.client.BAD_REQUEST, response.status_code)

    def test_get_by_key_works_anonymously(self):
        storage = factory.make_FileStorage()
        response = self.client.get(
            reverse("files_handler"), {"key": storage.key, "op": "get_by_key"}
        )
        self.assertEqual(http.client.OK, response.status_code)
        self.assertEqual(storage.content, response.content)

    def test_anon_resource_uri_allows_anonymous_access(self):
        storage = factory.make_FileStorage()
        response = self.client.get(storage.anon_resource_uri)
        self.assertEqual(http.client.OK, response.status_code)
        self.assertEqual(storage.content, response.content)

    def test_anon_cannot_list_files(self):
        factory.make_FileStorage()
        response = self.make_API_GET_request("list")
        # The 'list' operation is not available to anon users.
        self.assertEqual(http.client.BAD_REQUEST, response.status_code)

    def test_anon_cannot_get_file(self):
        storage = factory.make_FileStorage()
        response = self.client.get(
            reverse("file_handler", args=[storage.filename])
        )
        self.assertEqual(http.client.UNAUTHORIZED, response.status_code)

    def test_anon_cannot_delete_file(self):
        storage = factory.make_FileStorage()
        response = self.client.delete(
            reverse("file_handler", args=[storage.filename])
        )
        self.assertEqual(http.client.UNAUTHORIZED, response.status_code)


class FileStorageAPITest(FileStorageAPITestMixin, APITestCase.ForUser):
    def test_files_handler_path(self):
        self.assertEqual("/MAAS/api/2.0/files/", reverse("files_handler"))

    def test_file_handler_path(self):
        self.assertEqual(
            "/MAAS/api/2.0/files/filename/",
            reverse("file_handler", args=["filename"]),
        )

    def test_add_file_succeeds(self):
        response = self.make_API_POST_request(
            None, factory.make_name("upload"), factory.make_file_upload()
        )
        self.assertEqual(http.client.CREATED, response.status_code)

    def test_add_file_with_slashes_in_name_succeeds(self):
        filename = "filename/with/slashes/in/it"
        response = self.make_API_POST_request(
            None, filename, factory.make_file_upload()
        )
        self.assertEqual(http.client.CREATED, response.status_code)
        self.assertItemsEqual(
            [filename],
            FileStorage.objects.filter(filename=filename).values_list(
                "filename", flat=True
            ),
        )

    def test_add_file_fails_with_no_filename(self):
        response = self.make_API_POST_request(
            None, fileObj=factory.make_file_upload()
        )
        self.assertEqual(http.client.BAD_REQUEST, response.status_code)
        self.assertIn("text/plain", response["Content-Type"])
        self.assertEqual(b"Filename not supplied", response.content)

    def test_add_empty_file(self):
        filename = "filename"
        response = self.make_API_POST_request(
            None,
            filename=filename,
            fileObj=factory.make_file_upload(content=b""),
        )
        self.assertEqual(http.client.CREATED, response.status_code)
        self.assertItemsEqual(
            [filename],
            FileStorage.objects.filter(filename=filename).values_list(
                "filename", flat=True
            ),
        )

    def test_add_file_fails_with_no_file_attached(self):
        response = self.make_API_POST_request(None, "foo")
        self.assertEqual(http.client.BAD_REQUEST, response.status_code)
        self.assertIn("text/plain", response["Content-Type"])
        self.assertEqual(b"File not supplied", response.content)

    def test_add_file_fails_with_too_many_files(self):
        foo = factory.make_file_upload(name="foo")
        foo2 = factory.make_file_upload(name="foo2")
        response = self.client.post(
            reverse("files_handler"),
            {"filename": "foo", "file": foo, "file2": foo2},
        )
        self.assertEqual(http.client.BAD_REQUEST, response.status_code)
        self.assertIn("text/plain", response["Content-Type"])
        self.assertEqual(
            b"Exactly one file must be supplied", response.content
        )

    def test_add_file_can_overwrite_existing_file_of_same_name(self):
        # Write file one.
        response = self.make_API_POST_request(
            None, "foo", factory.make_file_upload(content=b"file one")
        )
        self.assertEqual(http.client.CREATED, response.status_code)
        # Write file two with the same name but different contents.
        response = self.make_API_POST_request(
            None, "foo", factory.make_file_upload(content=b"file two")
        )
        self.assertEqual(http.client.CREATED, response.status_code)
        # Retrieve the file and check its contents are the new contents.
        response = self.make_API_GET_request("get", "foo")
        self.assertEqual(b"file two", response.content)

    def test_get_file_succeeds(self):
        filename = factory.make_name("file")
        factory.make_FileStorage(
            filename=filename, content=b"give me rope", owner=self.user
        )
        response = self.make_API_GET_request("get", filename)
        self.assertEqual(http.client.OK, response.status_code)
        self.assertEqual(b"give me rope", response.content)

    def test_get_file_checks_owner(self):
        filename = factory.make_name("file")
        factory.make_FileStorage(
            filename=filename,
            content=b"give me rope",
            owner=factory.make_User(),
        )
        response = self.make_API_GET_request("get", filename)
        self.assertEqual(http.client.NOT_FOUND, response.status_code)

    def test_get_fetches_the_most_recent_file(self):
        filename = factory.make_name("file")
        factory.make_FileStorage(filename=filename, owner=self.user)
        storage = factory.make_FileStorage(filename=filename, owner=self.user)
        response = self.make_API_GET_request("get", filename)
        self.assertEqual(http.client.OK, response.status_code)
        self.assertEqual(storage.content, response.content)

    def test_get_file_fails_with_no_filename(self):
        response = self.make_API_GET_request("get")
        self.assertEqual(http.client.BAD_REQUEST, response.status_code)
        self.assertIn("text/plain", response["Content-Type"])
        self.assertEqual(b"No provided filename!", response.content)

    def test_get_file_fails_with_missing_file(self):
        response = self.make_API_GET_request("get", filename="missingfilename")
        self.assertEqual(http.client.NOT_FOUND, response.status_code)
        self.assertIn("text/plain", response["Content-Type"])
        self.assertEqual(b"File not found", response.content)

    def test_list_files_returns_ordered_list(self):
        filenames = ["myfiles/a", "myfiles/z", "myfiles/b"]
        for filename in filenames:
            factory.make_FileStorage(
                filename=filename, content=b"test content", owner=self.user
            )
        response = self.make_API_GET_request()
        self.assertEqual(http.client.OK, response.status_code)
        parsed_results = json_load_bytes(response.content)
        filenames = [result["filename"] for result in parsed_results]
        self.assertEqual(sorted(filenames), filenames)

    def test_list_files_filters_by_owner(self):
        factory.make_FileStorage(owner=factory.make_User())
        response = self.make_API_GET_request()
        self.assertEqual(http.client.OK, response.status_code)
        parsed_results = json_load_bytes(response.content)
        self.assertEqual([], parsed_results)

    def test_list_files_lists_files_with_prefix(self):
        filenames_with_prefix = ["prefix-file1", "prefix-file2"]
        filenames = filenames_with_prefix + ["otherfile", "otherfile2"]
        for filename in filenames:
            factory.make_FileStorage(
                filename=filename, content=b"test content", owner=self.user
            )
        response = self.client.get(
            reverse("files_handler"), {"prefix": "prefix-"}
        )
        self.assertEqual(http.client.OK, response.status_code)
        parsed_results = json_load_bytes(response.content)
        filenames = [result["filename"] for result in parsed_results]
        self.assertItemsEqual(filenames_with_prefix, filenames)

    def test_list_files_does_not_include_file_content(self):
        factory.make_FileStorage(
            filename="filename", content=b"test content", owner=self.user
        )
        response = self.make_API_GET_request()
        parsed_results = json_load_bytes(response.content)
        self.assertNotIn("content", parsed_results[0])

    def test_files_resource_uri_supports_slashes_in_filenames(self):
        filename = "a/filename/with/slashes/in/it/"
        factory.make_FileStorage(
            filename=filename, content=b"test content", owner=self.user
        )
        response = self.make_API_GET_request()
        parsed_results = json_load_bytes(response.content)
        resource_uri = parsed_results[0]["resource_uri"]
        expected_uri = reverse("file_handler", args=[filename])
        self.assertEqual(expected_uri, resource_uri)

    def test_api_supports_slashes_in_filenames_roundtrip_test(self):
        # Do a roundtrip (upload a file then get it) for a file with a
        # name that contains slashes.
        filename = "filename/with/slashes/in/it"
        self.make_API_POST_request(None, filename, factory.make_file_upload())
        file_url = reverse("file_handler", args=[filename])
        # The file url contains the filename without any kind of
        # escaping.
        self.assertIn(filename, file_url)
        response = self.client.get(file_url)
        parsed_result = json_load_bytes(response.content)
        self.assertEqual(filename, parsed_result["filename"])

    def test_get_file_returns_file_object_with_content_base64_encoded(self):
        filename = factory.make_name("file")
        content = sample_binary_data
        factory.make_FileStorage(
            filename=filename, content=content, owner=self.user
        )
        response = self.client.get(reverse("file_handler", args=[filename]))
        parsed_result = json_load_bytes(response.content)
        self.assertEqual(
            (filename, content),
            (parsed_result["filename"], b64decode(parsed_result["content"])),
        )

    def test_get_file_returns_file_object_with_resource_uri(self):
        filename = factory.make_name("file")
        content = sample_binary_data
        factory.make_FileStorage(
            filename=filename, content=content, owner=self.user
        )
        response = self.client.get(reverse("file_handler", args=[filename]))
        parsed_result = json_load_bytes(response.content)
        self.assertEqual(
            reverse("file_handler", args=[filename]),
            parsed_result["resource_uri"],
        )

    def test_get_file_returns_owned_file(self):
        # If both an owned file and a non-owned file are present (with the
        # same name), the owned file is returned.
        filename = factory.make_name("file")
        factory.make_FileStorage(filename=filename, owner=None)
        content = sample_binary_data
        storage = factory.make_FileStorage(
            filename=filename, content=content, owner=self.user
        )
        response = self.client.get(reverse("file_handler", args=[filename]))
        parsed_result = json_load_bytes(response.content)
        self.assertEqual(
            (filename, storage.anon_resource_uri, content),
            (
                parsed_result["filename"],
                parsed_result["anon_resource_uri"],
                b64decode(parsed_result["content"]),
            ),
        )

    def test_get_file_returning_404_file_includes_header(self):
        # In order to fix bug 1123986 we need to distinguish between
        # a 404 returned when the file is not present and a 404 returned
        # when the API endpoint is not present. We do this by setting
        # a header: "Workaround: bug1123986".
        response = self.client.get(
            reverse("file_handler", args=[factory.make_name("file")])
        )
        self.assertThat(
            (response.status_code, list(response.items())),
            MatchesListwise(
                (
                    Equals(http.client.NOT_FOUND),
                    Contains(("Workaround", "bug1123986")),
                )
            ),
            response,
        )

    def test_delete_filters_by_owner(self):
        storage = factory.make_FileStorage(owner=factory.make_User())
        response = self.client.delete(
            reverse("file_handler", args=[storage.filename])
        )
        self.assertEqual(http.client.NOT_FOUND, response.status_code)
        files = FileStorage.objects.filter(filename=storage.filename)
        self.assertEqual([storage], list(files))

    def test_delete_file_deletes_file(self):
        filename = factory.make_name("file")
        factory.make_FileStorage(
            filename=filename, content=b"test content", owner=self.user
        )
        response = self.client.delete(reverse("file_handler", args=[filename]))
        self.assertEqual(http.client.NO_CONTENT, response.status_code)
        files = FileStorage.objects.filter(filename=filename)
        self.assertEqual([], list(files))

    def test_delete_on_files(self):
        filename = factory.make_name("file")
        factory.make_FileStorage(
            filename=filename, content=b"test content", owner=self.user
        )
        response = self.client.delete(
            reverse("files_handler"), query={"filename": filename}
        )
        self.assertEqual(http.client.NO_CONTENT, response.status_code)
        files = FileStorage.objects.filter(filename=filename)...
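
The _create_API_params helper above is what lets each test exercise the "filename missing" and "file missing" error paths: only the values a test actually supplies end up in the request payload. A self-contained restatement of that pattern follows; build_file_api_params is an illustrative name, not MAAS code.

def build_file_api_params(op=None, filename=None, file_obj=None):
    # Include only the keys the caller supplied, so omitting 'filename'
    # or 'file' reproduces the corresponding error response in the API.
    params = {}
    if op is not None:
        params["op"] = op
    if filename is not None:
        params["filename"] = filename
    if file_obj is not None:
        params["file"] = file_obj
    return params

print(build_file_api_params(op="get", filename="myfiles/a"))
# {'op': 'get', 'filename': 'myfiles/a'}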


current_files_for_cluster_jobs.py

Source:current_files_for_cluster_jobs.py Github


1from DataManaging import fileManaging as FM23"""4send_off_spectrum_flux():5"""678filename_start = 'FM_parameter_scaling/spectrum_vs_flux_edge=2_N='9# Ns = ['7', '8', '9']10Ns = ['8']11filename_end = '_linear_m_space_flux=0.0001_FM_alt_spatial_flux_edge=0.002'12filenames = [filename_start + N + filename_end for N in Ns]1314filename_start = 'testing_different_FM_terms/spectrum_vs_flux_edge=2_N=6_linear_m_space_flux=0.0001_FM_'15# FM_terms = ['spatial2', 'spatial6_flux']16# FM_terms = ['spatial3','spatialL3','spatialL2','spatial6L_flux','spatial4L_flux']17FM_terms = ['spatial_fixed1', 'spatial_fixed2', 'spatial_fixed_mixed']18filename_ends = ['0.001', '0.002', '0.003', '0.004', '0.005', '0.006', '0.007']19filenames1 = []20for filename_end in filename_ends:21 for FM_term in FM_terms:22 filenames1.append(filename_start + FM_term + filename_end)2324filename_start = 'testing_different_FM_terms/spectrum_vs_flux_edge=3_N=6_linear_m_space_flux=0.0001_FM_'25# FM_terms = ['spatial2', 'spatial3', 'spatial6_flux', 'spatial4_flux']26# FM_terms = ['spatialL3','spatialL2','spatial6L_flux','spatial4L_flux']27# FM_terms = ['spatial_fixed1', 'spatial_fixed2']2829filenames2 = []30for filename_end in filename_ends:31 for FM_term in FM_terms:32 filenames2.append(filename_start + FM_term + filename_end)3334filenames = filenames1 + filenames23536# filename_start = 'FM_parameter_scaling/spectrum_vs_flux_edge=2_N='37# Ns = ['7', '8', '9']38# Ns = ['7']39# filename_end = '_linear_m_space_flux=0.0001_FM_alt_spatial_flux_edge=0.002'40# filenames = [filename_start + N + filename_end for N in Ns]4142# filenames = ['spectrum_vs_flux_edge=2_N=6_linear_m_space_flux_conf_pot=0.0001_interactions']434445"""46HISTORY47"""4849"""50 full_spectrum_luttinger_test():51"""52# filename_p1 = 'luttinger_parm/luttinger_parm_calc_N='53filename_p1 = 'luttinger_parm/bigger_lut_parm_N='54filename_p2 = '_edges='55filename_p3 = '_MminL='56N_edges = [(6, 2), (6, 3), (6, 4), (6, 5), (6, 6), (7, 2), (7, 3), (7, 4)]57filenames_start = [filename_p1 + str(N_edges[i][0]) + filename_p2 + str(N_edges[i][1]) + filename_p3 for i in58 range(len(N_edges))]5960"""61full_spectrum_for_thesis():62"""63filename = 'full_spectrum_graph_for_thesis_v4'64params_filename = FM.filename_parameters_annulus(filename)6566"""67density_luttinger_calculations():68"""69filename_starts = ['luttinger_parm_calc_N=6_edges=2_MminL=', 'luttinger_parm_calc_N=6_edges=3_MminL=',70 'luttinger_parm_calc_N=6_edges=4_MminL=', 'luttinger_parm_calc_N=6_edges=5_MminL=',71 'luttinger_parm_calc_N=6_edges=6_MminL=', 'luttinger_parm_calc_N=7_edges=2_MminL=',72 'luttinger_parm_calc_N=7_edges=3_MminL=',73 'luttinger_parm/luttinger_parm_calc_N=8_edges=2_MminL=']74filename_end = [str(i) for i in range(5, 20)]7576"""77send_test_luttinger_parm():78"""79# filename_start = 'luttinger_parm/testing_size_luttinger_parm_calc_MminL=10_edges=3_N='80# filename_start = 'luttinger_parm/testing_size_luttinger_parm_calc_MminL=10_edges=4_N='81# filename_start = 'luttinger_parm/testing_size_luttinger_parm_calc_MminL=10_edges=5_N='82filename_start = 'luttinger_parm/testing_size_luttinger_parm_calc_MminL=10_edges=6_N='83filenames = [filename_start + str(N) for N in range(7, 8)]8485"""86send_luttinger_parm_calcs():87"""88# filename_start2 = 'luttinger_parm_calc_N=6_edges=5_MminL='89# filename_start3 = 'luttinger_parm_calc_N=6_edges=6_MminL='90# filename_start4 = 'luttinger_parm_calc_N=7_edges=4_MminL='91# filename_start = 'luttinger_parm/luttinger_parm_calc_N=8_edges=2_MminL='92# filename_start = 
'luttinger_parm/luttinger_parm_calc_N=7_edges=4_MminL='93# filename_start = 'luttinger_parm/luttinger_parm_calc_N=7_edges=6_MminL='94# filename_start = 'luttinger_parm/luttinger_parm_calc_N=9_edges=2_MminL='95# filename_start = 'luttinger_parm/luttinger_parm_calc_N=7_edges=5_MminL='9697# filename_end = [str(i) for i in range(5, 20)]98# filename_end = [str(i) for i in range(7, 27)]99# filename_end = [str(i) for i in range(7, 30)]100# filenames2 = [filename_start2 + end for end in filename_end]101# filenames3 = [filename_start3 + end for end in filename_end]102# filenames4 = [filename_start4 + end for end in filename_end]103# filenames = filenames2 + filenames3 + filenames4104# filenames = [filename_start + end for end in filename_end]105# filenames = [filename_start2 + '8', filename_start3 + '8', filename_start4 + '8']106107# filename_p1 = 'luttinger_parm/luttinger_parm_calc_N='108filename_p1 = 'luttinger_parm/bigger_lut_parm_N='109filename_p2 = '_edges='110filename_p3 = '_MminL='111# N_edges = [(6, 2), (6, 3), (6, 4), (6, 5), (6, 6), (7, 2), (7, 3), (7, 4), (7, 5), (7, 6), (8, 2), (8, 3), (9, 2)]112# N_edges = [(7, 5), (7, 6), (8, 2), (8, 3)]113# N_edges = [(9, 2)]114N_edges = [(7, 3), (7, 4)]115# N_edges = [(9, 2), (9, 3), (9, 4)]116N_edges = [(8, 4), (8, 5), (8, 6), (9, 2), (9, 3), (9, 4)]117# filename_end = [str(i) for i in range(7, 30)]118# N_edges = [(8, 5)]119# N_edges = [(8, 4)]120# N_edges = [(8, 5), (9, 2)]121N_edges = [(8, 6)]122123# filename_end = [str(i) for i in range(11, 16)]124filename_end = [str(i) for i in range(19, 26)]125126# N_edges = [(9, 2)]127# filename_end = [str(i) for i in range(10, 16)] + [str(i) for i in range(17, 24)]128129# N_edges = [(9, 4)]130# filename_end = [str(i) for i in range(29, 30)]131132# filename_end = ['10']133134filenames_start = [filename_p1 + str(N_edges[i][0]) + filename_p2 + str(N_edges[i][1]) + filename_p3 for i in135 range(len(N_edges))]136137"""138send_off_spectrum_flux():139"""140141filename_start = 'FM_parameter_scaling/spectrum_vs_flux_edge=2_N=6_linear_m_space_flux=0.0001_FM_alt_spatial_flux_edge='142# endings = ['0.002', '0.005', '0.001', '0.0001', '0.0005']143endings = ['0.0002']144endings = ['0.0001', '0.0002', '0.0003', '0.0004', '0.0005', '0.0006', '0.0007', '0.0008', '0.0009', '0.001',145 '0.0011', '0.0012', '0.0013', '0.0014', '0.0015', '0.0016', '0.0017', '0.0018', '0.0019', '0.002']146# endings = ['0.0021', '0.0022', '0.0023', '0.0024', '0.0025', '0.0026', '0.0027', '0.0028', '0.0029', '0.003',147# '0.0031', '0.0032', '0.0033', '0.0034', '0.0035', '0.0036', '0.0037', '0.0038', '0.0039', '0.004',148# '0.0041', '0.0042', '0.0043', '0.0044', '0.0045', '0.0046', '0.0047', '0.0048', '0.0049', '0.005']149150# filename_start = 'FM_single_term/spectrum_vs_flux_edge=2_N=6_linear_m_space_flux=0.0001_FM_spatial1='151# endings = ['0.0001', '0.002', '0.005']152# endings = ['0.004', '0.01', '0.007']153filenames = [filename_start + end for end in endings]154155filename_start = 'FM_parameter_scaling/spectrum_vs_flux_edge=2_N='156# Ns = ['7', '8', '9']157Ns = ['8']158filename_end = '_linear_m_space_flux=0.0001_FM_alt_spatial_flux_edge=0.002'159filenames = [filename_start + N + filename_end for N in Ns]160161# filename_end1 = '=0.002'162# filename_end2 = '=0.003'163# filename_end2 = '=0.004'164filename_ends = ['0.001', '=0.002', '=0.003', '0.004', '0.005', '0.006', '0.007']165166filename_start = 'testing_different_FM_terms/spectrum_vs_flux_edge=2_N=6_linear_m_space_flux=0.0001_FM_'167# FM_terms = ['spatial2', 'spatial6_flux']168# FM_terms = 
['spatial3','spatialL3','spatialL2','spatial6L_flux','spatial4L_flux']169FM_terms = ['spatial_fixed1', 'spatial_fixed2', 'spatial_fixed_mixed']170filenames1 = []171for filename_end in filename_ends:172 for FM_term in FM_terms:173 filenames1.append(filename_start + FM_term + filename_end)174175filename_start = 'testing_different_FM_terms/spectrum_vs_flux_edge=3_N=6_linear_m_space_flux=0.0001_FM_'176# FM_terms = ['spatial2', 'spatial3', 'spatial6_flux', 'spatial4_flux']177# FM_terms = ['spatialL3','spatialL2','spatial6L_flux','spatial4L_flux']178# FM_terms = ['spatial_fixed1', 'spatial_fixed2']179180filenames2 = []181for filename_end in filename_ends:182 for FM_term in FM_terms:183 filenames2.append(filename_start + FM_term + filename_end)184185filenames = filenames1 + filenames2186187# filename_start = 'FM_parameter_scaling/spectrum_vs_flux_edge=2_N='188# Ns = ['7', '8', '9']189# Ns = ['7']190# filename_end = '_linear_m_space_flux=0.0001_FM_alt_spatial_flux_edge=0.002'191# filenames = [filename_start + N + filename_end for N in Ns]192193# filenames = ['spectrum_vs_flux_edge=2_N=6_linear_m_space_flux_conf_pot=0.0001_interactions']194195196"""197calc_lz_spectrum(filename)198"""199200files_to_calc = ['parms_for_FM_range_N=6_edge=2', 'parms_for_FM_range_N=6_edge=2_random',201 'parms_for_FM_range_N=6_edge=1', 'parms_for_FM_range_N=6_edge=1_random',202 'parms_for_FM_range_N=6_edge=2_random_smaller_FM', 'parms_for_FM_range_N=6_edge=2_smaller_FM']203204# filename = 'lz_spectrum_for_IQH_flux_params_torus_like_conf_pot=0.001_interactions'205206207"""208calc_full_spectrum(filename)209"""210211filename_start = 'luttinger_parm_calc_N=6_edges=2_MminL='212filename_end = [str(i) for i in range(5, 15)] ...


demo_progressive_saliency_encoding.py

Source:demo_progressive_saliency_encoding.py Github


1#!/usr/bin/env python32# Copyright (c) the JPEG XL Project Authors. All rights reserved.3#4# Use of this source code is governed by a BSD-style5# license that can be found in the LICENSE file.6"""Produces demos for how progressive-saliency encoding would look like.7As long as we do not have a progressive decoder that allows showing images8generated from partially-available data, we can resort to building9animated gifs that show how progressive loading would look like.10Method:111. JPEG-XL encode the image, but stop at the pre-final (2nd) step.122. Use separate tool to compute a heatmap which shows where differences between13 the pre-final and final image are expected to be perceptually worst.143. Use this heatmap to JPEG-XL encode the image with the final step split into15 'salient parts only' and 'non-salient parts'. Generate a sequence of images16 that stop decoding after the 1st, 2nd, 3rd, 4th step. JPEG-XL decode these17 truncated images back to PNG.184. Measure byte sizes of the truncated-encoded images.195. Build an animated GIF with variable delays by calling ImageMagick's20 `convert` command.21"""22from __future__ import absolute_import23from __future__ import division24from __future__ import print_function25from six.moves import zip26import ast # For ast.literal_eval() only.27import os28import re29import shlex30import subprocess31import sys32_BLOCKSIZE = 833_CONF_PARSERS = dict(34 keep_tempfiles=lambda s: bool(ast.literal_eval(s)),35 heatmap_command=shlex.split,36 simulated_progressive_loading_time_sec=float,37 simulated_progressive_loading_delay_until_looparound_sec=float,38 jpegxl_encoder=shlex.split,39 jpegxl_decoder=shlex.split,40 blurring=lambda s: s.split(),41)42def parse_config(config_filename):43 """Parses the configuration file."""44 conf = {}45 re_comment = re.compile(r'^\s*(?:#.*)?$')46 re_param = re.compile(r'^(?P<option>\w+)\s*:\s*(?P<value>.*?)\s*$')47 try:48 with open(config_filename) as h:49 for line in h:50 if re_comment.match(line):51 continue52 m = re_param.match(line)53 if not m:54 raise ValueError('Syntax error')55 conf[m.group('option')] = (56 _CONF_PARSERS[m.group('option')](m.group('value')))57 except Exception as exn:58 raise ValueError('Bad Configuration line ({}): {}'.format(exn, line))59 missing_options = set(_CONF_PARSERS) - set(conf)60 if missing_options:61 raise ValueError('Missing configuration options: ' + ', '.join(62 sorted(missing_options)))63 return conf64def generate_demo_image(config, input_filename, output_filename):65 tempfiles = []66 #67 def encode_img(input_filename, output_filename, num_steps,68 heatmap_filename=None):69 replacements = {70 '${INPUT}': input_filename,71 '${OUTPUT}': output_filename,72 '${STEPS}': str(num_steps),73 # Heatmap argument will be provided in --param=value form.74 '${HEATMAP_ARG}': ('--saliency_map_filename=' + heatmap_filename75 if heatmap_filename is not None else '')76 }77 # Remove empty args. 
This removes the heatmap-argument if no heatmap78 # is provided..79 cmd = [80 _f for _f in81 [replacements.get(arg, arg) for arg in config['jpegxl_encoder']] if _f82 ]83 tempfiles.append(output_filename)84 subprocess.call(cmd)85 #86 def decode_img(input_filename, output_filename):87 replacements = {'${INPUT}': input_filename, '${OUTPUT}': output_filename}88 cmd = [replacements.get(arg, arg) for arg in config['jpegxl_decoder']]89 tempfiles.append(output_filename)90 subprocess.call(cmd)91 #92 def generate_heatmap(orig_image_filename, coarse_grained_filename,93 heatmap_filename):94 cmd = config['heatmap_command'] + [95 str(_BLOCKSIZE), orig_image_filename, coarse_grained_filename,96 heatmap_filename]97 tempfiles.append(heatmap_filename)98 subprocess.call(cmd)99 #100 try:101 encode_img(input_filename, output_filename + '._step1.pik', 1)102 decode_img(output_filename + '._step1.pik', output_filename + '._step1.png')103 encode_img(input_filename, output_filename + '._step2.pik', 2)104 decode_img(output_filename + '._step2.pik', output_filename + '._step2.png')105 generate_heatmap(input_filename, output_filename + '._step2.png',106 output_filename + '._heatmap.png')107 encode_img(input_filename,108 output_filename + '._step3.pik', 3,109 output_filename + '._heatmap.png')110 encode_img(input_filename,111 output_filename + '._step4.pik', 4,112 output_filename + '._heatmap.png')113 decode_img(output_filename + '._step3.pik', output_filename + '._step3.png')114 decode_img(output_filename + '._step4.pik', output_filename + '._step4.png')115 data_sizes = [116 os.stat('{}._step{}.pik'.format(output_filename, num_step)).st_size117 for num_step in (1, 2, 3, 4)]118 time_offsets = [0] + [119 # Imagemagick's `convert` accepts delays in units of 1/100 sec.120 round(100 * config['simulated_progressive_loading_time_sec'] * size /121 data_sizes[-1]) for size in data_sizes]122 time_delays = [t_next - t_prev123 for t_next, t_prev in zip(time_offsets[1:], time_offsets)]124 # Add a fake white initial image. As long as no usable image data is125 # available, the user will see a white background.126 subprocess.call(['convert',127 output_filename + '._step1.png',128 '-fill', 'white', '-colorize', '100%',129 output_filename + '._step0.png'])130 tempfiles.append(output_filename + '._step0.png')131 subprocess.call(132 ['convert', '-loop', '0', output_filename + '._step0.png'] +133 [arg for args in [134 ['-delay', str(time_delays[n - 1]),135 '-blur', config['blurring'][n - 1],136 '{}._step{}.png'.format(output_filename, n)]137 for n in (1, 2, 3, 4)] for arg in args] +138 ['-delay', str(round(100 * config[139 'simulated_progressive_loading_delay_until_looparound_sec'])),140 output_filename + '._step4.png',141 output_filename])142 finally:143 if not config['keep_tempfiles']:144 for filename in tempfiles:145 try:146 os.unlink(filename)147 except OSError:148 pass # May already have been deleted otherwise.149def main():150 if sys.version.startswith('2.'):151 sys.exit('This is a python3-only script.')152 if (len(sys.argv) != 4 or not sys.argv[-1].endswith('.gif')153 or not sys.argv[-2].endswith('.png')):154 sys.exit(155 'Usage: {} [config_options_file] [input.png] [output.gif]'.format(156 sys.argv[0]))157 try:158 _, config_filename, input_filename, output_filename = sys.argv159 config = parse_config(config_filename)160 generate_demo_image(config, input_filename, output_filename)161 except ValueError as exn:162 sys.exit(exn)163if __name__ == '__main__':...


embedthumbnail.py

Source:embedthumbnail.py Github


1# coding: utf-82from __future__ import unicode_literals3import os4import subprocess5from .ffmpeg import FFmpegPostProcessor6from ..utils import (7 check_executable,8 encodeArgument,9 encodeFilename,10 PostProcessingError,11 prepend_extension,12 replace_extension,13 shell_quote14)15class EmbedThumbnailPPError(PostProcessingError):16 pass17class EmbedThumbnailPP(FFmpegPostProcessor):18 def __init__(self, downloader=None, already_have_thumbnail=False):19 super(EmbedThumbnailPP, self).__init__(downloader)20 self._already_have_thumbnail = already_have_thumbnail21 def run(self, info):22 filename = info['filepath']23 temp_filename = prepend_extension(filename, 'temp')24 if not info.get('thumbnails'):25 self._downloader.to_screen('[embedthumbnail] There aren\'t any thumbnails to embed')26 return [], info27 thumbnail_filename = info['thumbnails'][-1]['filename']28 if not os.path.exists(encodeFilename(thumbnail_filename)):29 self._downloader.report_warning(30 'Skipping embedding the thumbnail because the file is missing.')31 return [], info32 def is_webp(path):33 with open(encodeFilename(path), 'rb') as f:34 b = f.read(12)35 return b[0:4] == b'RIFF' and b[8:] == b'WEBP'36 # Correct extension for WebP file with wrong extension (see #25687, #25717)37 _, thumbnail_ext = os.path.splitext(thumbnail_filename)38 if thumbnail_ext:39 thumbnail_ext = thumbnail_ext[1:].lower()40 if thumbnail_ext != 'webp' and is_webp(thumbnail_filename):41 self._downloader.to_screen(42 '[ffmpeg] Correcting extension to webp and escaping path for thumbnail "%s"' % thumbnail_filename)43 thumbnail_webp_filename = replace_extension(thumbnail_filename, 'webp')44 os.rename(encodeFilename(thumbnail_filename), encodeFilename(thumbnail_webp_filename))45 thumbnail_filename = thumbnail_webp_filename46 thumbnail_ext = 'webp'47 # Convert unsupported thumbnail formats to JPEG (see #25687, #25717)48 if thumbnail_ext not in ['jpg', 'png']:49 # NB: % is supposed to be escaped with %% but this does not work50 # for input files so working around with standard substitution51 escaped_thumbnail_filename = thumbnail_filename.replace('%', '#')52 os.rename(encodeFilename(thumbnail_filename), encodeFilename(escaped_thumbnail_filename))53 escaped_thumbnail_jpg_filename = replace_extension(escaped_thumbnail_filename, 'jpg')54 self._downloader.to_screen('[ffmpeg] Converting thumbnail "%s" to JPEG' % escaped_thumbnail_filename)55 self.run_ffmpeg(escaped_thumbnail_filename, escaped_thumbnail_jpg_filename, ['-bsf:v', 'mjpeg2jpeg'])56 os.remove(encodeFilename(escaped_thumbnail_filename))57 thumbnail_jpg_filename = replace_extension(thumbnail_filename, 'jpg')58 # Rename back to unescaped for further processing59 os.rename(encodeFilename(escaped_thumbnail_jpg_filename), encodeFilename(thumbnail_jpg_filename))60 thumbnail_filename = thumbnail_jpg_filename61 if info['ext'] == 'mp3':62 options = [63 '-c', 'copy', '-map', '0', '-map', '1',64 '-metadata:s:v', 'title="Album cover"', '-metadata:s:v', 'comment="Cover (Front)"']65 self._downloader.to_screen('[ffmpeg] Adding thumbnail to "%s"' % filename)66 self.run_ffmpeg_multiple_files([filename, thumbnail_filename], temp_filename, options)67 if not self._already_have_thumbnail:68 os.remove(encodeFilename(thumbnail_filename))69 os.remove(encodeFilename(filename))70 os.rename(encodeFilename(temp_filename), encodeFilename(filename))71 elif info['ext'] in ['m4a', 'mp4']:72 if not check_executable('AtomicParsley', ['-v']):73 raise EmbedThumbnailPPError('AtomicParsley was not found. 
Please install.')74 cmd = [encodeFilename('AtomicParsley', True),75 encodeFilename(filename, True),76 encodeArgument('--artwork'),77 encodeFilename(thumbnail_filename, True),78 encodeArgument('-o'),79 encodeFilename(temp_filename, True)]80 self._downloader.to_screen('[atomicparsley] Adding thumbnail to "%s"' % filename)81 if self._downloader.params.get('verbose', False):82 self._downloader.to_screen('[debug] AtomicParsley command line: %s' % shell_quote(cmd))83 p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)84 stdout, stderr = p.communicate()85 if p.returncode != 0:86 msg = stderr.decode('utf-8', 'replace').strip()87 raise EmbedThumbnailPPError(msg)88 if not self._already_have_thumbnail:89 os.remove(encodeFilename(thumbnail_filename))90 # for formats that don't support thumbnails (like 3gp) AtomicParsley91 # won't create to the temporary file92 if b'No changes' in stdout:93 self._downloader.report_warning('The file format doesn\'t support embedding a thumbnail')94 else:95 os.remove(encodeFilename(filename))96 os.rename(encodeFilename(temp_filename), encodeFilename(filename))97 else:98 raise EmbedThumbnailPPError('Only mp3 and m4a/mp4 are supported for thumbnail embedding for now.')...


uniturl.py

Source:uniturl.py Github


1import unittest2from bots.botslib import Uri3'''no plugin4'''5class TestTranslate(unittest.TestCase):6 def setUp(self):7 pass8 def testcombinations(self):9 self.assertEqual('scheme://username:password@hostname:80/path1/path2/filename',str(Uri(scheme='scheme',username='username',password='password',hostname='hostname',port=80,path='path1/path2',filename= 'filename',query={'query':'argument'},fragment='fragment')),'basis')10 self.assertEqual('scheme://username:password@hostname:80/path1/path2/filename',str(Uri(scheme='scheme',username='username',password='password',hostname='hostname',port=80,path='path1/path2',filename= 'filename')),'basis')11 12 self.assertEqual('scheme:/path1/path2/filename',str(Uri(scheme='scheme',path='/path1/path2',filename='filename')),'')13 self.assertEqual('scheme:path1/path2/filename',str(Uri(scheme='scheme',path='path1/path2',filename='filename')),'')14 self.assertEqual('path1/path2/filename',str(Uri(path='path1/path2',filename='filename')),'')15 self.assertEqual('path1/path2/',str(Uri(path='path1/path2')),'')16 self.assertEqual('filename',str(Uri(filename='filename')),'')17 self.assertEqual('scheme:path1/path2/',str(Uri(scheme='scheme',path='path1/path2')),'')18 self.assertEqual('scheme:filename',str(Uri(scheme='scheme',filename='filename')),'')19 self.assertEqual('scheme://username:password@hostname:80/path1/path2/',str(Uri(scheme='scheme',username='username',password='password',hostname='hostname',port=80,path='path1/path2')),'basis')20 self.assertEqual('scheme://username:password@hostname:80/filename',str(Uri(scheme='scheme',username='username',password='password',hostname='hostname',port=80,filename= 'filename')),'basis')21 self.assertEqual('scheme://username:password@hostname:80',str(Uri(scheme='scheme',username='username',password='password',hostname='hostname',port=80)),'bas')22 self.assertEqual('scheme://username:password@hostname',str(Uri(scheme='scheme',username='username',password='password',hostname='hostname')),'bas')23 self.assertEqual('scheme://username@hostname',str(Uri(scheme='scheme',username='username',hostname='hostname')),'bas')24 self.assertEqual('scheme://hostname',str(Uri(scheme='scheme',hostname='hostname')),'bas')25 self.assertEqual('scheme://username@hostname:80/path1/path2/filename',str(Uri(scheme='scheme',username='username',hostname='hostname',port=80,path='path1/path2',filename= 'filename')),'no password')26 self.assertEqual('scheme://hostname:80/path1/path2/filename',str(Uri(scheme='scheme',password='password',hostname='hostname',port=80,path='path1/path2',filename= 'filename')),'no username')27 self.assertEqual('scheme:path1/path2/filename',str(Uri(scheme='scheme',username='username',password='password',path='path1/path2',filename= 'filename')),'no hostname')28 self.assertEqual('//username:password@hostname:80/path1/path2/filename',str(Uri(username='username',password='password',hostname='hostname',port=80,path='path1/path2',filename= 'filename')),'no scheme')29 self.assertEqual('path1/path2/filename',str(Uri(username='username',password='password',port=80,path='path1/path2',filename= 'filename')),'no scheme no hostname')30 31 def testempty(self):32 #tests for empty values33 self.assertEqual('//username:password@hostname:80/path1/path2/filename',str(Uri(scheme=None,username='username',password='password',hostname='hostname',port=80,path='path1/path2',filename= 'filename')),'basis')34 
self.assertEqual('//username:password@hostname:80/path1/path2/filename',str(Uri(scheme='',username='username',password='password',hostname='hostname',port=80,path='path1/path2',filename= 'filename')),'basis')35 self.assertEqual('scheme://hostname:80/path1/path2/filename',str(Uri(scheme='scheme',username=None,password=None,hostname='hostname',port=80,path='path1/path2',filename= 'filename')),'basis')36 self.assertEqual('scheme://hostname:80/path1/path2/filename',str(Uri(scheme='scheme',username='',password='',hostname='hostname',port=80,path='path1/path2',filename= 'filename')),'basis')37 #~ self.assertEqual('scheme://username:password@hostname:80/path1/path2/filename',str(Uri(scheme='scheme',username='username',password='password',hostname='hostname',port=80,path='path1/path2',filename= 'filename')),'basis')38 def testcalls(self):39 self.assertEqual('scheme:/path1/path2/filename2',Uri(scheme='scheme',path='/path1/path2',filename='filename').uri(filename='filename2'),'')40if __name__ == '__main__':...


File.py

Source:File.py Github


# Copyright (C) 2014 Ninbora [admin@ninbora.com]
# Licensed under MIT license [http://opensource.org/licenses/MIT]
import shutil
import os
import os.path

#============================================================================
# create <folder> if does not exist yet
# Params: folder = the folder to
# Return: True if the folder exist or was created successfully
#============================================================================
def ensureFolder(folder):
    if not os.path.exists(folder):
        try:
            os.makedirs(folder)
        except:
            return False
    return True

#============================================================================
# Clean the name for invalid characters
# Params: filename = the filename to clean
# Return: The filename without invalid characters
#============================================================================
def cleanName(filename):
    filename = filename.replace("'","")
    filename = filename.replace('"',"")
    filename = filename.replace("?","")
    filename = filename.replace("<","")
    filename = filename.replace(">","")
    filename = filename.replace("*","")
    filename = filename.replace("|","")
    filename = filename.replace(":","")
    filename = filename.replace("!","")
    filename = filename.replace("\\","")
    filename = filename.replace("/","_")
    filename = filename.replace("\t","")
    return filename

#==============================================================================
# Deletes files in a <folder> and its sub-folders.
# Params : folder = path to local folder
#          extension = extension of the files to remove. if empty all files will be removed.
#          prefix = prefix of the files to remove. if empty all files will be removed.
#==============================================================================
def delFiles(folder, extension = '', prefix = ''):
    if (extension != ''):
        extension = '.' + extension
    try:
        for root, dirs, files in os.walk(folder, topdown=False):
            for name in files:
                filename = os.path.join(root, name)
                if extension == '' and prefix == '':
                    os.remove(filename)
                elif extension != '' and filename.endswith(extension):
                    os.remove(filename)
                elif prefix != '' and name.startswith(prefix):
                    os.remove(filename)
            for name in dirs:
                dir = os.path.join(root, name)
                os.rmdir(dir)
    except IOError:...


Using AI Code Generation


// Note: each require/console.log pair below appears to be its own example file;
// redeclaring `const fileName` inside a single module would throw a SyntaxError.
const fileName = require('stryker-parent').fileName;
console.log(fileName);

const fileName = require('stryker-parent').fileName;
console.log(fileName);

const fileName = require('stryker-parent').fileName;
console.log(fileName);


Using AI Code Generation


const fileName = require('stryker-parent').fileName;
console.log(fileName);

const fileName = require('stryker-child').fileName;
console.log(fileName);

module.exports = function(config) {
  config.set({
  });
};

15:28:25 (10262) INFO Stryker 0.20.0 (mutation testing framework for JavaScript) initialized
15:28:25 (10262) INFO ConcurrencyTokenProvider Creating 1 test runners (based on CPU count)
15:28:25 (10262) INFO DryRunResultReporter Reporting 1 total mutants
15:28:25 (10262) INFO DryRunResultReporter Mutant(s) survived: 0
15:28:25 (10262) INFO DryRunResultReporter Mutant(s) timed out: 0
15:28:25 (10262) INFO MutationTestExecutor Starting 1 test runners
15:28:25 (10262) INFO MutationTestExecutor 1 test runner(s) started in 0 seconds
15:28:25 (10262) INFO SandboxPool Creating 1 test runners (based on CPU count)
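The log above is from a Stryker dry run followed by mutation-test startup. As a point of reference, a minimal configuration that would drive a comparable run might look like the sketch below; the globs and test runner are illustrative assumptions, not values taken from the page.

module.exports = function (config) {
  config.set({
    // Illustrative values: adjust the glob and the runner to the project.
    mutate: ['src/**/*.js'],
    testRunner: 'mocha',
    reporters: ['progress', 'clear-text'],
  });
};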


Using AI Code Generation


var fileName = require('stryker-parent').fileName;
console.log(fileName);

module.exports = {
  fileName: function() {
    return __filename;
  }
};

{
}
{
  "dependencies": {
  }
}

var fileName = require('stryker-parent').fileName;
console.log(fileName);

{
  "dependencies": {
  }
}
{
  "dependencies": {
  }
}

var fileName = require('stryker-parent').fileName;
console.log(fileName);

{
  "dependencies": {
  }
}
{
  "dependencies": {
  }
}

var fileName = require('stryker-parent').fileName;
console.log(fileName);
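The exporting module in the snippet exposes fileName as a function that returns Node's __filename, i.e. the absolute path of the file the function is defined in, not the file that calls it. A minimal, self-contained sketch of that behavior (the helper is illustrative and not a published stryker-parent API):

const path = require('path');

// Illustrative helper in the style of the snippet above: Node resolves
// __filename per module at load time, so this always reports this file.
const fileName = () => __filename;

console.log(fileName());                 // absolute path of this script
console.log(path.basename(fileName())); // just the file name, e.g. "example.js"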


Using AI Code Generation


// Note: each require/console.log pair below appears to be its own example file;
// repeated `const fileName` declarations in one module would throw a SyntaxError.
// The last line is truncated in the source.
const fileName = require('stryker-parent').fileName;
const fileName = require('stryker-parent').fileName;
console.log(fileName);
const fileName = require('stryker-parent').fileName;
console.log(fileName);
const fileName = require('stryker-parent').fileName;
console.log(fileName);
const fileName = require('stryker-parent').fileName;
console.log(fileName);
const fileName = require('stryker-parent').fileName;
console.log(fileName);
const fileName = require('stryker-parent').fileName;
console.log(fileName);
const fileName = require('stryker-parent').fileName;
console.log(fileName);
const fileName = require('stryker-parent').fileName;
console.log(fileName);
const fileName = require('stryker-parent').fileName;
console.log(fileName);
const fileName = require('stryker-parent').fileName;
console.log(fileName);
const fileName = require('stryker-parent').fileName;
console.log(fileName);
const fileName = require('stryker-parent').fileName;
console.log(fileName);
const fileName = require('stryker-parent').fileName;
console.log(fileName);
const fileName = require('stryker-parent').fileName;
console.log(fileName);
const fileName = require('stryker-parent').fileName;


Using AI Code Generation


var strykerParent = require('stryker-parent');
var fileName = strykerParent.fileName;
var strykerAPI = require('stryker-api/core');
var log = strykerAPI.getLogger('test');
var path = require('path');
log.info('test.js');
log.info(fileName);
log.info(path.resolve(fileName, '..'));
log.info(path.resolve(fileName, '..', 'test'));
log.info(path.resolve(fileName, '..', '..', 'test'));
log.info(path.resolve(fileName, '..', '..', '..', 'test'));
log.info(path.resolve(fileName, '..', '..', '..', '..', 'test'));
log.info(path.resolve(fileName, '..', '..', '..', '..', '..', 'test'));
log.info(path.resolve(fileName, '..', '..', '..', '..', '..', '..', 'test'));
log.info(path.resolve(fileName, '..', '..', '..', '..', '..', '..', '..', 'test'));
log.info(path.resolve(fileName, '..', '..', '..', '..', '..', '..', '..', '..', 'test'));

var strykerParent = require('stryker-parent');
var fileName = strykerParent.fileName;
var strykerAPI = require('stryker-api/core');
var log = strykerAPI.getLogger('test');
var path = require('path');
log.info('test.js');
log.info(fileName);
log.info(path.resolve(fileName, '..'));
log.info(path.resolve(fileName, '..', 'test'));
log.info(path.resolve(fileName, '..', '..', 'test'));
log.info(path.resolve(fileName, '..', '..', '..', 'test'));
log.info(path.resolve(fileName, '..', '..', '..', '..', 'test'));
log.info(path.resolve(fileName, '..', '..', '..', '..', '..', 'test'));
log.info(path.resolve(fileName, '..', '..', '..', '..', '..', '..', 'test'));
log.info(path.resolve(fileName, '..', '..', '..', '..', '..', '..', '..', 'test'));
log.info(path.resolve(fileName, '..', '..', '..', '..', '..', '..', '..', '..', 'test'));

var strykerParent = require('stryker-parent');
var fileName = strykerParent.fileName;
var strykerAPI = require('stryker-api/core');
var log = strykerAPI.getLogger('test');
var path = require('path');
log.info('test.js');
log.info(fileName);
log.info(path.resolve(fileName, '..'));
log.info(path.resolve(fileName, '..', 'test'));
log.info(path.resolve(fileName, '..', '..', 'test'));
log.info(path.resolve(fileName, '..', '..', '..', 'test'));
log.info(path.resolve
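Each path.resolve call in the snippet climbs one directory further up from the resolved file path before appending 'test'. A small standalone sketch of that behavior (the example path is illustrative):

const path = require('path');

// path.resolve joins its arguments left to right and normalizes the result,
// so the first '..' strips the trailing file name and every further '..'
// climbs one directory higher.
const file = '/home/user/project/src/test.js'; // illustrative path
console.log(path.resolve(file, '..'));               // /home/user/project/src
console.log(path.resolve(file, '..', 'test'));       // /home/user/project/src/test
console.log(path.resolve(file, '..', '..', 'test')); // /home/user/project/test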


Using AI Code Generation


var fileName = require('stryker-parent').fileName;
var file = fileName('test.js');
var fileName = require('stryker-parent').fileName;
var file = fileName('test.js');
var fileName = require('stryker-parent').fileName;
var file = fileName('test.js');
var fileName = require('stryker-parent').fileName;
var file = fileName('test.js');
var fileName = require('stryker-parent').fileName;
var file = fileName('test.js');
var fileName = require('stryker-parent').fileName;
var file = fileName('test.js');
var fileName = require('stryker-parent').fileName;
var file = fileName('test.js');
var fileName = require('stryker-parent').fileName;
var file = fileName('test.js');
var fileName = require('stryker-parent').fileName;
var file = fileName('test.js');
var fileName = require('stryker-parent').fileName;
var file = fileName('test.js');
var fileName = require('stryker-parent').fileName;
var file = fileName('test.js');
var fileName = require('stryker-parent').fileName;
var file = fileName('test.js');
var fileName = require('


Using AI Code Generation


const fileName = require('stryker-parent').fileName;

module.exports = function(config){
  config.set({
    mutate: [fileName(__dirname, 'src/**/*.js')],
  });
};

module.exports = {
  fileName: function(basePath, filePattern) {
    return basePath + filePattern;
  }
};

module.exports = function(config){
  config.set({
  });
};

module.exports = {
  fileName: function(filePattern) {
    return __dirname + filePattern;
  }
};

That way, we could also add other methods, like ~cwd() or ~root(), or even ~packageJson('name'). What do you think?

module.exports = function(config){
  config.set({
  });
};

module.exports = {
  fileName: function(filePattern) {
    return __dirname + filePattern;
  }
};

That way, we could also add other methods, like ~cwd() or ~root(), or even ~packageJson('name'). What do you think?
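The fileName(basePath, filePattern) helper sketched above concatenates the two strings directly, which drops the separator between them and yields backslashes inside the glob on Windows. Below is a minimal sketch of a safer variant, reusing the same illustrative helper name; it is not a published stryker-parent API, and only `mutate` is a standard Stryker option.

const path = require('path');

// Illustrative helper: join a base directory and a glob pattern with forward
// slashes, since glob matchers generally expect '/' even on Windows.
const fileName = (basePath, filePattern) =>
  basePath.split(path.sep).join('/').replace(/\/$/, '') + '/' + filePattern;

module.exports = function (config) {
  config.set({
    // `mutate` is Stryker's array of glob patterns selecting files to mutate.
    mutate: [fileName(__dirname, 'src/**/*.js')],
  });
};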


Using AI Code Generation


module.exports = function(config) {
  config.set({
  });
};

module.exports = function(config) {
  config.set({
  });
};

module.exports = function(config) {
  config.set({
  });
};

module.exports = function(config) {
  config.set({
  });
};

module.exports = function(config) {
  config.set({
  });
};

module.exports = function(config) {
  config.set({


Automation Testing Tutorials

Learn to execute automation testing from scratch with the LambdaTest Learning Hub, from setting up the prerequisites and running your first automation test to following best practices and diving into advanced test scenarios. The LambdaTest Learning Hubs compile step-by-step guides to help you become proficient with different test automation frameworks such as Selenium, Cypress, and TestNG.

LambdaTest Learning Hubs:

YouTube

You can also refer to the video tutorials on the LambdaTest YouTube channel for step-by-step demonstrations from industry experts.

