# Nt-Mini-Noir-Jailbreak

Custom *Jailbreak* firmware for the Analogue Nt Mini v2 *Noir*:

- Famicom Disk System support added
- New NES core 100% supports 2.0 mappers
- Genesis core added
- Intellivision core added
- Megaduck core added
- SPC player added
- Mandelbrot zoomer added

(Custom jailbreak firmware for the 2017 original Analogue Nt Mini v1 is available [at that other repository](https://github.com/tjanas/Nt-Mini-Legacy-Jailbreak).)

**IMPORTANT**: Use the [NES 2.0 XML Database](https://forums.nesdev.com/viewtopic.php?f=3&t=19940&p=248796) and the [NES Header Repair Tool Python Script](https://github.com/Kitrinx/NES_Header_Repair) to fix your NES ROM headers!

It is recommended to have fewer than 800 files per subdirectory.

-------------------

# Updating Firmware

## Step 1: Flashing the Firmware

Format a 2GB (or larger) SD card as [FAT32](https://en.wikipedia.org/wiki/FAT32) (FAT16 and exFAT are not supported). In Windows, you must use a tool such as [fat32format](https://fat32-format.en.softonic.com/) for cards larger than 32GB.

Copy [ntmv2_firmware_verJB6.6.bin](https://raw.githubusercontent.com/SmokeMonsterPacks/Nt-Mini-Noir-Jailbreak/main/firmware/ntmv2_firmware_verJB6.6.bin) into the root directory of your SD card. Be sure that there is only one firmware file there.

Insert the card into your Nt Mini Noir and power on. The firmware will be flashed to the console. This process may take three or more minutes. While the firmware is flashing, the LED will flash, followed by a considerable pause and HDMI signal blackout. Do not power off. The system will reboot automatically when it has finished. Delete the firmware file from your card after flashing.

The main menu will present a new option, `Cores`, to signify it is in Jailbreak mode.

The Nt Mini Noir is protected from bricking as a result of firmware updates, but follow the above precautions to be safe. If need be, you can always reinstall the [official firmware](https://support.analogue.co/hc/en-us/articles/360052916371-Nt-mini-Noir-Firmware-Update-v1-0). If you encounter difficulties, make sure you are following the steps described [here](https://github.com/SmokeMonsterPacks/Nt-Mini-Noir-Jailbreak/issues/10).

## Step 2: System Setup

Unzip [SD Card System Structure\_verJB6.6.zip](https://github.com/SmokeMonsterPacks/Nt-Mini-Noir-Jailbreak/raw/main/firmware/SD%20Card%20System%20Structure_verJB6.6.zip) and move all the content from the unzipped folder into the root directory of your SD card. You will now have a set of folders reflecting where game ROMs should be stored. Follow the instructions below for configuring the `/BIOS/` folder.

## How to launch a core?

The cores menu lets you select one of the various cores that are and will be available. To run a specific core (say, NES), simply select it. If the core needs to be loaded, a box will pop up asking if you wish to load it. Loading takes approximately 4 seconds. During the core loading, the front LED will flicker rapidly to indicate it is loading. While this happens, the monitor will show no signal, since the FPGA is being reconfigured at this time. After loading, the picture will come back and you will be sitting in the file browser for this particular core.

The key setup for the core menus is:

- `UP`/`DOWN`: select a file
- `LEFT`/`RIGHT`: page through the files quickly, 16 at a time
- `B`: pressing once takes you to the top of the file list; pressing again takes you to the bottom.
- `A`: run the game.
- `START`: enter the settings menu. This is slightly different from the main menu; there may be a `Core Options` settings menu, and everything core-specific will be found here.
- `SELECT`: exit the menu. You will be asked to confirm. If you confirm, it returns to the core select menu. If you do not wish to exit, it returns to the currently running game.

## changelog

### v6.6 (Feb. 22, 2021)

(general)

* All cores should have their video window properly centered on the X and Y axis when the sliders for position are in the center. The exception is cores with an extremely low original resolution, such as Game King.
* All scaling defaults have been updated to more useful values.
* Added SPD HDMI Packet to identify console as "Nt Mini Noir"

(NES)

* Added cheat codes menu when running ROMs via jailbreak
* Added ability to decrement letters on cheat code and CopyNES mini by pressing select
* Replaced Firebrandx palette with the latest one ("Smooth_-_Balanced_Greys_FBX")
* Fixed Vs. Hogan's Alley and Duck Hunt "gun stolen" alarm sound going off all the time or in the menu
* Fixed FDS RAM Adapter, Super Mario Bros. and many other games exhibiting graphics issues when cartridges were being used
* Fixed Vs. Mighty Bomb Jack reversed controls
* Fixed Trojan audio channels cutting off early
* Fixed playing a Vs. game then an FDS game causing a bad palette
* Fixed Vs. coin insertion not working when using a SNES or NTT Data Keypad
* Fixed Sunsoft 5B audio implementation
* Fixed Battletoads and Double Dragon player 2 selection box moving on its own
* Fixed games (i.e. Kickmaster) with mappers that use A12 for scanline counts having jittering graphics in 16-sprite mode
* Fixed FDS manual disk ejection not working correctly
* Fixed CopyNES mini not properly exiting high resolution mode from the file menu
* Fixed Maniac Mansion's intermittent flickering top line
* Fixed Eggerland - Meikyuu no Fukkatsu and Egger Land - Souzou e no Tabidachi's popping square channel

(GBC)

* Fixed Barbie Pet Rescue's blank screen before the title
* Fixed Pokemon Pinball's corrupt frames when the screen was blanked
* Fixed Bear in the Big Blue House hanging on the title screen
* Fixed Bear in the Big Blue House's palette issue at the top of the screen
* Fixed Super Mario Deluxe (and other games) palette issues
* Fixed Mia Hamm Soccer title screen graphic issues
* Fixed Jungle Book: Mowgli's Wild Adventure corruption when paused/unpaused
* Fixed Klax crash when reaching level 5
* Fixed Mickey's Speedway palette issues
* Fixed Perfect Dark non-working digitized audio
* Fixed Perfect Dark status bar on the right not displaying

(Genesis)

* Added full screen dithering; setting the dithering slider to 63 (maximum) will enable it
* Removed non-functional cropping menu - use the crop left/right and top/bottom settings in the video extras menu
* Fixed interlaced mode display issues

(Game Gear)

* Fixed top line not visible

(2600)

* Fixed graphic issues on certain units (random pixels on green screens)
* Fixed palette sometimes flashing when entering/exiting the menu

(Supervision)

* Fixed Journey to the West

(Gamate, 2600, Intellivision)

* Fixed scaling values

### v6.5 (Dec. 15, 2020)

**General**

- Controller test added
- Fixed L/R reverse on RCA jacks
- Fixed filter cutoff display bug

**NES**

- Added full FDS support with manual, semiautomatic and automatic side switching (see NES section below for operation notes)
- Added *Super Russian Roulette* mapper (#413)
- Added support for *Haunted Halloween '85* and *Haunted Halloween '86*
- Added Vs. palette support for composite and s-video
- Reduced default expansion audio level for better balance with internal audio
- Passthrough mode can now be cancelled by tapping reset twice
- Fixed CopyNES mini save WRAM dumping; save games on cartridges can now be preserved
- Fixed Vice Project Doom / Gun-Dec starting in debug mode with a second controller plugged in
- Fixed VRC6 register swap function. Use both VRC6 and VRC6 Swap for *Madara* and *Esper Dream 2*, and just VRC6 for *Akumajou Densetsu*
- Fixed FDS audio channel imbalance when panning sliders are centered
- Fixed small FDS channel bug
- Fixed popping square wave audio in *Egger Land - Souzou e no Tabidachi* and *Eggerland - Meikyuu no Fukkatsu*
- Fixed *Vs. Goonies* graphic issue
- Fixed *Game of the Goose* graphic issues (#512)
- Fixed DPCM corruption bug for PAL and Dendy modes
- Fixed MMC5 *Castlevania III* PAL cartridge functionality
- Fixed Dendy mode for EverDrives and cartridges
- Fixed Dendy mode NMI flag

**GB/GBC**

- Fixed palette issues affecting multiple games (*Pokemon Pinball*, *Mega Man Xtreme*, *Pooh & Tigger's Hunny Safari*, *Harvest Moon 3*)
- Fixed wave channel audio bug
- Fixed *Tokyo Disney* crash when playing the 5th level
- Fixed crash in *Barbie Magic Genie Adventure* when using powers
- Fixed *Razor Freestyle Scooter* and *Lufia* failing to load
- Fixed *Lego Racers* cloud color bug

**Colecovision**

- Added Famicom Network controller functionality
- Fixed *Penguin Adventure*

**Genesis**

- Added Famicom Network controller functionality
- Fixed audio sliders bug

**SMS**

- Added Famicom Network controller functionality
- Fixed BIOSes loading built-in games
- Use the "end" button on a Famicom Network or Super Famicom NTT Data Keypad to simulate pressing the console reset button

**Intellivision**

- Added Famicom Network controller functionality. `0`, `A`, `B` are the three buttons. `0` on the keypad is remapped to .
- Added Player 1/2 swap to cores menu

**SPC**

- Fixed audio static bug

### v6.1 (initial release)

------------------------------

# NES/Famicom Core Release Notes

This core is extremely large and encompasses 279 mappers. All the major mappers and many other mappers are implemented and work. NES 2.0 headers are 100% supported and many NES 2.0 mappers are included. Every mapper was completely rewritten using the most current information available.

## save games

Save game RAM is fully supported for many mappers, and has three modes:

- always save
- never save
- prompt

This allows you to control how save RAM saving works. Save filenames should be usable out to 256 characters or so now, which should encompass just about anything.

EEPROM and Flash saving are not supported. EEPROM saving is used by some [Mapper 16](https://wiki.nesdev.com/w/index.php/INES_Mapper_016), [Mapper 157](https://wiki.nesdev.com/w/index.php/INES_Mapper_157) and all [Mapper 159](https://wiki.nesdev.com/w/index.php/INES_Mapper_159) games (Bandai releases and the Datach Joint ROM System). Flash saving is used by [Mappers 30](https://wiki.nesdev.com/w/index.php/UNROM_512) and [111](https://wiki.nesdev.com/w/index.php/GTROM) (homebrew releases).

## Expansion audio

Expansion cartridge audio is supported. Unlike cart mode, however, the act of loading a ROM will force the proper expansion chip to be selected. This includes many pirate conversions of FDS games to cartridge (except for *Tobidase Daisakusen*, which had its FDS audio data removed).

## Family BASIC Keyboard

Enable Passthrough Mode to use the keyboard.
## other features

Nt Mini Noir supports NES ROMs up to 16MiB in size, with up to 32KiB of CHR-RAM and 64KiB of PRG-RAM. So you cannot have a 16MiB PRG-ROM with 64KiB of CHR-RAM, for example.

Some mappers ([mapper 90](https://wiki.nesdev.com/w/index.php/J.Y._Company_ASIC), multicart mappers, the drip mapper) support dip switches. These can be found under the `Core Options` menu, which is accessed by hitting `START` in the file browser for the NES core. Each time a game is loaded, the dip switches will be cleared.

For the *Nintendo World Championships 1990* and *Nintendo Campus Challenge 1991* competition cartridges, Dips 1-4 set the competition time and Dip 5 removes the time limit. Here are the settings for Dips 1-4:

```
O = switch open ("off")
C = switch closed ("on")
```

| Dips 4321 | time (mins) |
| --------- | ----------- |
| `OOOO` | 5.001 |
| `OOOC` | 5.316 |
| `OOCO` | 5.629 |
| `OOCC` | 5.942 |
| `OCOO` | 6.254 (Official Tournament Times) |
| `OCOC` | 6.567 |
| `OCCO` | 6.880 |
| `OCCC` | 7.193 |
| `COOO` | 7.505 |
| `COOC` | 7.818 |
| `COCO` | 8.131 |
| `COCC` | 8.444 |
| `CCOO` | 8.756 |
| `CCOC` | 9.070 |
| `CCCO` | 9.318 |
| `CCCC` | 9.695 |

Dip switches always default to Off, so you should set the dip switches and then press Reset. These games are started by pressing `start` on the 2nd controller.

Coin insertion is done using the trigger on a 12-button controller, and can also be done on the dip switch submenu if the mapper supports it. All Vs. mappers and mapper 126 (arcade board) use this.

Nt Mini Noir's Core Jailbreak has full Vs. System support for any known game which does not use Vs. Dual System capabilities. This means that *Vs. Balloon Fight*, *Vs. Baseball*, *Vs. Ice Climber Dual*, *Vs. Mahjong*, *Vs. Raid on Bungeling Bay*, *Vs. Tennis* and *Vs. Wrecking Crew* are not playable, but the rest are playable.

Vs. System games require full NES 2.0 headers to work properly. Use the [NES 2.0 XML Database](https://forums.nesdev.com/viewtopic.php?f=3&t=19940&p=248796) and the [NES Header Repair Tool Python Script](https://github.com/Kitrinx/NES_Header_Repair) to fix your headers.

*Vs. Tetris* has only 24KiB of PRG-ROM, but the jailbreak will only load a ROM with 32KiB of PRG-ROM. You can insert 8,192 padding bytes of 0x00 (or 0xFF) between the header and the start of the PRG-ROM to get the game to work (a minimal sketch of this appears at the end of this section). *Vs. Gumshoe* may also need padding in addition to ROM reordering (see NewRisingSun's [fixes](https://forums.nesdev.com/viewtopic.php?f=3&t=17213&start=45#p220718)).

Vs. System games obtain their settings from a bank of eight dip switches; these will be found as Dips 1-8 under Core Options - Dip Switches. If you are not using a SNES-style controller, you can Add a Coin with the Dip Switch menu. With Vs. System games, you start games with the Select button on either Controller I or, for a two-player game, Controller II. Zappers are supported for Vs. games that use a light gun.

Vs. System games that use one of the 2C04 PPUs will show very incorrect colors when run on a PPU that uses a regular palette (2C02, 2C03, 2C05, 2C07). On Nt Mini Noir, the S-Video and Composite video output options generate palette colors like an original 2C02 or 2C07 would and cannot change their palettes. The result is that you will need to use Component, RGB or HDMI video to play these games with the proper colors.

NewRisingSun has [compiled](https://forums.nesdev.com/viewtopic.php?f=3&t=17213&start=45#p220718) a helpful set of NES 2.0 headers for many Vs. System ROMs and a script that can construct these NES ROMs from MAME's PRG/CHR ROM dumps.
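The *Vs. Tetris* padding fix mentioned above is a simple byte-level edit that can be done on a PC before copying the ROM to the SD card. Below is a minimal, hypothetical Python sketch of that idea only: it keeps the 16-byte header, inserts 8 KiB of 0x00, then appends the rest of the file. The file names are placeholders, and the header's size fields are not edited, so you may still want the header tools above to produce a fully correct NES 2.0 header.

```python
# pad_vs_tetris.py -- hypothetical helper; file names are placeholders.
# Inserts 8,192 bytes of 0x00 between the 16-byte header and the PRG-ROM,
# as described above for Vs. Tetris.  Header fields are left untouched.

HEADER_SIZE = 16
PADDING = bytes(8192)          # 0x00 padding; bytes([0xFF]) * 8192 also works

with open("Vs. Tetris.nes", "rb") as f:
    data = f.read()

header, rest = data[:HEADER_SIZE], data[HEADER_SIZE:]

with open("Vs. Tetris (padded).nes", "wb") as f:
    f.write(header + PADDING + rest)

print(f"wrote {HEADER_SIZE + len(PADDING) + len(rest)} bytes")
```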
## problem solving

This core takes full advantage of NES 2.0 headers. This includes mappers, save RAM size, etc. It is highly recommended that you upgrade your ROM set. If a game will not run, it is most likely because the core needs information contained in the NES 2.0 header. *Star Tropics*, for example, is actually mapper 4.1 (MMC6).

Some of the Sachen multi-carts also have issues inherent to their ROMs: *Super Cartridge Ver 1 - 4 in 1 - Honey Peach* has a broken triangle channel during title screen music, *Super Cartridge Ver 5 - 7 in 1 - Magical Mathematics* is missing its title screen audio and *Super Cartridge Ver 7 - 4 in 1 - Silver Eagle*'s main character sprite gets glitchy when shooting.

Complete list of implemented and tested mappers:

```
0 - NROM
1.0 - MMC1
1.5 - MMC1 with fixed PRG banking
2.0 - UNROM normal (no bus conflicts)
2.2 - UNROM bus conflicts
3 - CNROM normal (no bus conflicts)
3.2 - CNROM bus conflicts
4.0 - MMC3B
4.1 - MMC6
4.3 - MC-ACC (Acclaim MMC3 clone with different IRQ timing)
4.4 - MMC3A
5 - MMC5
7 - AxROM normal (no bus conflicts)
7.2 - AxROM bus conflicts
9 - MMC2
10 - MMC4
11 - colordreams (has bus conflicts)
12 - MMC3A variant with upper CHR bit select
13 - CPROM (has bus conflicts)
15 - mapper 15
16.4 - Bandai IRQ counter not latched
16.5 - Bandai IRQ counter latched (no EEPROM support yet)
18 - jaleco
19 - namco N163
21.0 - VRC2/VRC4
21.1 - VRC4a
22 - VRC2a
23.0 - VRC2/VRC4
23.1 - VRC4f
23.2 - VRC4e
23.3 - VRC2b
24 - VRC6
25.0 - VRC2/VRC4
25.1 - VRC4b
25.2 - VRC4d
25.3 - VRC2c
26 - VRC6
27 - World Hero
28 - action 53
29 - Sealie Computing Glider
30 - UNROM512 (no flash support yet)
31 - NSF
32.0 - Irem G-101
32.1 - Irem G-101 variant with 1 screen mirroring
33 - Taito TC0190
34 - BNROM and NINA-001
35 - mapper 90 variant with WRAM
36 - TXC 01-22000-400
37 - SMB + Tetris + NWC
38 - Bit Corp Crime Busters
39 - Subor Study and Game
40 - NTDEC 2722 SMB2j
41 - Caltron 6 in 1
42 - Bio Miracle FDS
43 - TONY-I and YS-612
44 - Super Big 7 in 1
45 - GA23C Multicarts
46 - Rumble Station
47 - Super Spike + NWC
48 - Taito TC0350
49 - Super HIK 4 in 1
50 - N-32 SMB2j
51 - 11 in 1 ball games
52 - Multicarts
53 - supervision 16 in 1
54 - Novel Diamond 999999 in 1
55 - Mario1-Malee2
56 - Kaiser SMB3
57 - Super GK 6 in 1
58 - Multicarts
59 - T3H53 Multicarts
60 - Reset-based multicarts
61 - 20 in 1
62 - 700 in 1
63 - Powerful 250 in 1
64 - Tengen RAMBO-1
65 - Irem H3001
66 - GxROM
67 - Sunsoft-3
68.0 - Sunsoft-4
68.1 - Sunsoft-4 dual cartridge system
69 - Sunsoft FME-7
70 - Bandai simple
71.0 - Codemasters UNROM-like
71.1 - Fire Hawk variant
72 - Jaleco JF-17
73 - VRC3
74 - Waixing 43-393/860908C
75 - VRC1
76 - Namco 3446
77 - Irem with VRAM
78.1 - Irem Cosmo Carrier
78.3 - Irem Holy Diver
79 - NINA-03 / NINA-06
80 - Taito X1-005
81 - Super Gun
82 - Taito X1-017
83.0 - Cony Mapper 256K CHR / heuristic
83.1 - 512K CHR
83.2 - 1M CHR
85 - VRC7
86 - Jaleco JF-13
87 - Jaleco
88 - Namco 118 variant
89 - Sunsoft
90 - mapper 90 inhibited ROM nametables and mirroring
91 - JY Company Super Fighter 3
92 - Jaleco
93 - Sunsoft-2
94 - UN1ROM
95 - Namco 118 variant
96 - Bandai simple
97 - Irem TAM-S1
99 - Vs. System
101 - Jaleco JF-10
102 - drip (moved to 284)
103 - Doki Doki Panic FDS conversion
104 - Pegasus 5 in 1
105 - NES-EVENT
106 - SMB3 bootleg
107 - Magic Dragon
108.1 - FDS-to-cartridge conversions
108.2 - FDS-to-cartridge conversions
108.3 - FDS-to-cartridge conversions
108.4 - FDS-to-cartridge conversions
109 - Great Wall (duplicate of 137)
110 - Sachen 74LS374N (duplicate of 243)
111 - GTROM (no flash support yet)
112 - San Guo Zhi - Qun Xiong Zheng Ba
113 - HES 6 in 1
114.0 - Multiple MMC3 with scrambling
114.1 - Multiple MMC3 with scrambling
115 - SFC-02B/-03/-004 boards
116.0 - SOMARI-P
116.1 - SOMARI-P
117 - san guo zhi 4
118 - TKSROM MMC3 + single screen mirroring
119 - TQROM MMC3 + CHR/RAM switching
120 - 3D worldrunner FDS conv.
121 - A9713 MMC3-clone-bearing protected board
122 - same as 184
123 - Scrambled MMC3
124 - Super Game Mega Type III (arcade board)
125 - FDS Monty on the Run
126 - MMC3 based multicart
127 - funky double dragon 2 pir8
128 - super HiK 4 in 1 - chi den
129 - duplicate of 213
130 - NS03 7 in 1 multicart
131 - duplicate of 205
132 - TXC
133 - Sachen Jovial Race & Drunkard
134 - MMC3 clone with extensions
135 - (dupe) sachen 8259A-1 (mapper 141)
136 - Sachen 3011
137 - Sachen 8259D
138 - Sachen 8259B
139 - Sachen 8259C
140 - Jaleco JF-11/JF-14
141 - Sachen 8259A
142 - Kaiser FDS ports
143 - Sachen with lame protection
144 - Death Race
145 - Sachen CNROM-like
146 - functional equivalent to NINA-03
147 - Sachen 3018
148 - Sachen SA-008-A
149 - Sachen SA-0036
150 - Sachen SA-015
151 - Vs. VRC1
152 - Bandai
153 - Bandai with WRAM
154 - Namco 118 variant
155 - MMC1 without WRAM protection
156 - Open Daou
157 - Bandai joint ROM system
158 - Tengen 800037
159 - Bandai with 24C01 EEPROM (no EEPROM support yet)
163 - Nanjing Copy Protected clone games
164 - Final Fantasy V
166 - Subor
167 - Subor
168 - Racermate
169 - Contra 168 in 1
170 - Shiko Game Syu
171 - MMC1 with hardwired mirroring
172 - Super Mega P-4070 board
173 - Idea-Tek
174 - NTDec 5 in 1
180 - Crazy Climber
182 - same as 114
183 - Pirate Gimmick
184 - Sunsoft
185.4 - Protected CNROM
185.5 - Protected CNROM
185.6 - Protected CNROM
185.7 - Protected CNROM
186 - Studybox
187 - MMC3 clone with extensions
188 - Bandai Karaoke (microphone is supported)
189 - Thunder Warrior
190 - Magic Kid GooGoo
191 - MMC3 clone similar to 119
192 - MMC3 clone similar to 119 but lets RAM/ROM work at the same time
193 - NTDEC TC-112
194 - Super Robot Taisen
196 - MMC3 hacks that use BNROM and other address lines than A0
198 - Chinese Mapper
200 - 1200 in 1
201 - 8 in 1
202 - 150 in 1
203 - 35 in 1
204 - 64 in 1
205 - 15 in 1
206 - Namco 118
207 - Taito X1-005
209 - mapper 90 variant (original version with selectable ROM nametables and mirroring)
210 - namco 175/340
210.1 - namco 175 with hardwired mirroring
211 - mapper 90 variant forced ROM nametables and mirroring
212 - Super HIK 300 in 1
213 - duplicate of 58
214 - Super Gun 20 in 1
215 - MMC3 with scrambling
216 - magic jewelry 2
217 - 255 in 1
218 - Magic Floor
221 - NTDEC N625092 multicarts
225 - 52 Games
226 - 76 in 1
227 - 1200 in 1
228 - Action 52
229 - 31 in 1
230 - 22 in 1
231 - 20 in 1
232.0 - Codemasters Quattro series (hard reset to get back to menu)
232.1 - Aladdin version (hard reset to get back to menu)
233 - 42 in 1
234 - AVE Maxi 15
235 - Golden Game 150 in 1
236 - Realtec 8155 multicarts
237 - Teletubbies 420 in 1
240 - misc chinese mapper
241 - BxROM-like
242 - Wai Xing Zhan Shi
243 - Sachen SA-020A
244 - C&E Decathlon
245 - Waixing MMC3 clone
246 - Taiwan mapper
248 - same as 115
249 - funky MMC3 pirate doodle
250 - Time Diver Avenger
253 - VRC4 clone with VRAM extensions, waixing
254 - ai senshi nicol
255 - 115 in 1
262 - Sachen Street Heroes (dipswitch 0 sets the game title screen)
281 - mapper 90 variant 256K outer bank
282 - mapper 90 variant 256K outer bank
284 - drip
290 - Asder 20 in 1
295 - mapper 90 variant 128K outer bank
302 - FDS Gyruss conversion
304 - Several FDS conversions
305 - FDS Castlevania 2
306 - Exciting Basket FDS conversion
307 - FDS Metroid
312 - Highway Star FDS Conversion
331 - duplicate of 130
346 - Zanac FDS Conversions
358 - mapper 90 variant 512K outer bank
389 - Caltron 9 in 1
512 - Sachen Game of the Goose (honk honk!)
513 - Sachen Princess Maker game port
516 - Edubank MMC3 thing
533 - Sachen Middle School English
534 - Atari Flashback, Intellivision Play Power
552 - Taito X1-017
553 - Sachen Penguin and Seal (original version)
554 - FDS Castlevania conversion
555 - NES-EVENT2
```

## CopyNES mini

This lets you dump cartridges and their save RAM (if equipped) directly to the SD card! There are many supported mappers. To dump a game, follow these steps:

- Insert the game in question into the cartridge slot.
- Select `Run Cartridge` to make sure it works and is making good contact.
- Enter the Tools menu and select `CopyNES mini`.
- Select the mapper that your game uses. Use the NES 2.0 XML Database to find the mapper for your game.
- Hit `B` to start the dump. Note that it might take a while (30 seconds) to determine the size of the ROMs on the cartridge.
- After the game is dumped, you can enter a filename using `up`/`down`/`left`/`right`. If no name is entered, it will be saved with a filename determined by the checksum of the ROM.
- Hitting `B` will save the ROM.

That's it! You can test the game by going into the cores menu and selecting `NES`, then going to the `/COPYNES/` directory and running the ROM.

## FDS

Full FDS disk drive and RAM adapter emulation has been added. This allows FDS games consisting of 1 to 4 disk sides to be played. The FDS images must be a multiple of 65500 bytes, and cannot contain headers. Valid file sizes in bytes:

```
65500 bytes  - 1 disk side
131000 bytes - 2 disk sides (these first two are the most common)
196500 bytes - 3 disk sides (very rare)
262000 bytes - 4 disk sides (somewhat rare)
```

To use the FDS functionality, an FDS BIOS needs to be placed into the `/BIOS/` directory. Any of the three BIOS dumps can be used: R0, R1, or the Twin Famicom version. It must be named `fds.bin`.
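As a quick sanity check of the size rule above, here is a small, hypothetical Python sketch that reports how many disk sides an `.fds` image contains and flags files that still carry a 16-byte header or otherwise do not match the headerless 65,500-byte-per-side layout. It is only an illustration; nothing like it ships with the jailbreak.

```python
# check_fds.py -- hypothetical helper based on the size rules above.
import os, sys

SIDE_SIZE = 65500          # bytes per FDS disk side (headerless image)

def describe(path):
    size = os.path.getsize(path)
    sides, leftover = divmod(size, SIDE_SIZE)
    if leftover == 16:
        return f"{path}: appears to have a 16-byte header; strip it before use"
    if leftover != 0 or not 1 <= sides <= 4:
        return f"{path}: {size} bytes is not a valid headerless FDS image"
    return f"{path}: {sides} disk side(s)"

for name in sys.argv[1:]:
    print(describe(name))
```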
Several methods to load disks and change sides are provided in a new menu titled "FDS" that appears on the cores menu only when an FDS game is loaded. By default, "automatic" mode is selected. This mode will automatically load the first side of the FDS image and run it without user intervention. It will also detect the side and disk the game is attempting to load, and will automatically select the side for you. The loading prompts are also automatically passed, so usually no intervention is required to play the games. There are a few exceptions to this, however, and a few games need to be loaded manually. These are listed later.

When an FDS game is loaded, a new FDS menu appears on the main menu. This menu allows full control of the FDS "drive". In the menu, the auto/manual selection can be made, a disk ejected, and/or a new side selected.

**Saving:** When the file menu is opened, the FDS disk data is saved to a `.sav` file like with cartridge games; however, the data is no longer in .fds format; it is in the expanded disk format with gaps and CRCs. This is due to the fact that .fds files must be expanded to be loaded. When loading a game with an existing .sav file, the .sav file will be loaded and the contents of the .fds will NOT be used, except to determine the number of disk sides the game contains. This means if you wish to reload a game from the .fds to play it, its .sav file must be deleted, or the .fds renamed. This way the original file is not modified. A .sav file will NOT be created unless the contents of the disk were modified. This means games that do not save to disk will not create .sav files.

There are three ways to handle disk changing:

**Manual mode:** You must manually select disks and sides in the menu. You do not need to eject before changing disks or sides; it will perform a quarter-second eject for you.

**Semiautomatic mode:** This is the default mode. Pressing select will eject the disk as long as select is down, and for about 1/2 second after it is released, and then auto-select the proper disk and side. Some games, such as *Doki Doki Panic*, have an unusually long wait to check for a disk being ejected, so this accommodates that. Note that since select ejects the disk, it is not a good idea to do this while a game is loading since it will throw an error 01. Usually most games can recover from this, though, and will attempt to load again. The same caveats with automatic mode carry over to semiautomatic mode with regard to a few games not being able to autodetect the side/disk. No known games or software have an issue with the disk being ejected once they have loaded; they only check it when they wish to load or are actively loading.

**Automatic mode:** The automatic mode works fine for most games except those outlined below. It has a few extra features that are activated by use of the select button on player 1. Some games, such as Zelda, will skip the title screen and directly load the game. To see the title screen, the select button can be held, which will stop the automatic switching from happening until it is released. On a few games, it can take a few seconds for the automatic loading process to start. This is normal, because every game has to read the state of the disk, and each one does it differently. A few games that use custom loading routines can produce `ERR 01` errors. If this happens, simply hold the select button down during the loading process. If a game seems unresponsive or won't autoload the disk, try holding down select for a second or so and/or tapping it to see if that unsticks it.

These games cannot be used with auto mode because they have custom disk routines, or they do not specify the proper side when loading:

- Doremikko - does not specify proper side to load
- Koneko Monogatari - uses custom disk routines and not the BIOS
- Some unlicensed games don't specify the side to load
- Bishoujo Shashinkan - Moving School - reboots unless run in manual mode

-------------------------------

# Atari 2600 Core Release Notes

The Atari 2600 core currently only supports joystick games.

Atari 2600 games determine the number of scanlines they will use. In HDMI mode, the display window for Atari 2600 games is 160 pixels by 239 pixels, which is adequate for most games.
A few games (*Pick and Pile* and *Acid Drop*) produce far too many scanlines; the viewport can be centered so that most of the screen is visible.

## Controller mapping

It is suggested to use a SNES controller or an SNES NTT Data Pad (NDK10). If you are not using one, the difficulty switches, TV type switch and Supercharger load functionality are present in the core options menu.

You can swap joystick ports virtually, because some games seem to use one or the other. This is also accessible in the core options menu.

`Y` is the fire button (`B` on an NES controller). Left and right triggers toggle the difficulty switch for the respective player, while on player 1, `X` and `A` toggle the B&W/colour switch: `X` is colour, `A` is B&W. In most games, left trigger is hard, and right trigger is easy. `Select` and `start` are `select` and `reset`, respectively.

## Core Options menu

You can enable/disable the Atarivox and the controller swap. A toggle for the difficulty and TV type switches is present. An option to load the next portion of a Supercharger ROM is also here.

## Supercharger games

If you wish to play Supercharger games, you must rename the 2K Supercharger BIOS ROM to `scbios.bin` (CRC32: `C3A3F073`) and place it in the `/BIOS/` directory.

When you wish to load a Supercharger game, press both `A` and `X` to 'press play on tape'. If you have a multiload game, you can press these again to do the next load when the game calls for it. Note: it takes about 1-2 seconds for the load to start when the `A`/`X` combo is pressed or the menu entry is selected.

## Atarivox

Atarivox is supported, via a PIC18F1320. The PIC ROM code is stored as `avoxrom.bin` and the PIC EEPROM data is stored as `avoxee.bin`. `avoxrom.bin` is 8K bytes, and `avoxee.bin` is 256 bytes. Place these two files in the `/BIOS/` directory. Note that the EEPROM is the data on the PIC micro itself, and not the high score/setting EEPROM. The EEPROM that stores high scores, etc. is not implemented at this time.

## Supercharger Demo Unit

The demo unit is fully supported. To run it, you need to make a single 66K file consisting of the 64K worth of data EPROMs, followed by the 2K EPROM.

## Palettes

All three palettes are supported: NTSC, PAL, and SECAM. There is also an "automatic" mode that will select between NTSC and PAL in most cases. It does this by checking the scanline count: if it's greater than 284, the PAL palette is selected, otherwise NTSC is. You can force the palette to any of the three, or to automatic, in the core settings menu.

## Mappers and ROMs

The 2600 has no standardized ROM header format, making it difficult to determine the identity of any extra hardware on the cartridge. The 2600 core determines the ROM's mapper based on file size, and then file extension.
The mapper is determined like so:

| Filesize | Mapper |
|-------------|------------------------------------------------------|
| 2048 bytes | standard 2K game, no bankswitching |
| 4096 bytes | standard 4K game, no bankswitching |
| 8192 bytes | standard 8K game, uses F8 (FFF8/FFF9) bankswitching |
| 16384 bytes | standard 16K game, uses F6 (FFF6-FFF9) bankswitching |
| 32768 bytes | standard 32K game, uses F4 (FFF4-FFFA) bankswitching |
| 65536 bytes | Dynacom Megaboy |
| 12288 bytes | RAM+ (FA) |
| 10240 bytes | Pitfall 2 (DPC) |
| 10495 bytes | Pitfall 2 (DPC) |
| 24576 bytes | 24K (FA2) |
| 8448 bytes | Supercharger single load |
| 16896 bytes | Supercharger dual load |
| 25344 bytes | Supercharger triple load |
| 33792 bytes | Supercharger quad load |
| 67584 bytes | Supercharger demo unit |

For 8K, 16K, and 32K games, superchip RAM is detected by looking at the first 256 bytes of the file. If it is all `0x00` or `0xff`, then the game is assumed to have RAM there.

At this point, the extension is checked. If it matches one of the below, then that mapper is selected:

| File extension | Mapper |
| --------------- | ------ |
| `.ACT` | Activision 8K FE banking |
| `.PB` | Parker Bros. E0 mapping |
| `.TV` | Tigervision 3F mapping |
| `.TVR` | Tigervision 3E (with RAM) mapping |
| `.MN` | M-Network E7 mapping |
| `.CV` | Commavid extra RAM |
| `.EB` | Econobanking |
| `.EF` | EF bankswitching |
| `.EFR` | EF with RAM |
| `.UA` | UA bankswitching |
| `.X07` | X07 bankswitching |
| `.SB` | Superbanking |

Note: a popular ROM pack includes some file names with characters outside the ASCII range (e.g. *Pele's Soccer*); these show up as flashing characters in the menu. This is normal.

The following changes need to be made to get these games to run.

These need the `.ACT` extension:

- Decathlon
- Robot Tank
- Thwocker

These need the `.PB` extension:

- Frogger II
- Gyruss
- James Bond 007
- Lord of the Rings
- Montezuma's Revenge
- Mr Do's Castle
- Popeye
- Q-bert's Qubes
- Star Wars: Return of the Jedi
- Star Wars: The Arcade Game
- Super Cobra
- Tooth Protectors
- Tutankham

These need the `.TV` extension:

- Espial
- Miner 2049'er
- Miner 2049'er Vol 2
- Polaris
- River Patrol
- Springer

These need the `.MN` extension:

- Bump 'n' Jump
- Burgertime
- Masters of the Universe
- Anteater
- Golden Skull
- Treasures of Tarmin

These need the `.CV` extension:

- Magicard
- Video Life

This needs the `.UA` extension:

- Pleiades

Other changes: *Dig Dug* needs to have the first 256 bytes zeroed out (or set to all `0xFF`'s) so that the extra RAM can be detected properly.
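The superchip heuristic and the *Dig Dug* fix are both about the first 256 bytes of the ROM. The hypothetical Python sketch below illustrates both ideas: it reports whether a ROM would currently match the superchip-RAM detection rule, and includes a helper that writes a copy with those bytes zeroed. File names are up to you; back up your ROMs first.

```python
# superchip_check.py -- hypothetical helper for the detection rule above.
import sys

def has_superchip_signature(data: bytes) -> bool:
    """True if the first 256 bytes are all 0x00 or all 0xFF."""
    head = data[:256]
    return head == bytes(256) or head == b"\xff" * 256

def zero_first_page(src: str, dst: str) -> None:
    """Write a copy of src with its first 256 bytes zeroed (the Dig Dug fix)."""
    with open(src, "rb") as f:
        data = bytearray(f.read())
    data[:256] = bytes(256)
    with open(dst, "wb") as f:
        f.write(data)

if __name__ == "__main__":
    for name in sys.argv[1:]:
        with open(name, "rb") as f:
            rom = f.read()
        print(name, "superchip RAM would be detected" if has_superchip_signature(rom)
                    else "no superchip signature")
```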
## Problem solving

**Kool-Aid Man**: uses the left (Player 1) difficulty switch to pause the game (default/disabled position `A` is paused, position `B` is un-paused). Enable this switch in the Core Options menu, or press the right-trigger button on controller 1, to play this game. The right (Player 2) difficulty switch sets the difficulty (default/disabled position `A` is fast/hard, position `B` is normal); this can be toggled in the Core Options menu or by pressing the left/right triggers for hard/easy on controller 2. See the [game manual](https://archive.org/details/Atari2600Manuals_201812/Kool-Aid%20Man%20%28USA%29) ([text version](https://atariage.com/manual_html_page.php?SoftwareLabelID=266)) for more details.

Some other games that use switches during gameplay: [Ghostbusters](https://archive.org/details/Atari2600Manuals_201812/Ghostbusters%20%28USA%29), [Phaser Patrol](https://archive.org/details/Atari2600Manuals_201812/Phaser%20Patrol%20%28USA%29), [Secret Quest](https://archive.org/details/Atari2600Manuals_201812/Secret%20Quest%20%28USA%29), [Space Shuttle](https://archive.org/details/Atari2600Manuals_201812/Space%20Shuttle%20-%20A%20Journey%20Into%20Space%20%28USA%29), [StarMaster](https://archive.org/details/Atari2600Manuals_201812/StarMaster%20%28USA%29), [Tomcat](https://archive.org/details/Atari2600Manuals_201812/Tomcat%20-%20The%20F-14%20Fighter%20Simulator%20%28USA%29)

-----------------------------

# Atari 7800 Core Release Notes

This Atari 7800 core supports official games and Pokey sound. It currently only supports joysticks and does not run 2600 games. Two-button controllers appear to be working.

## Controller mapping

This core requires the use of a SNES controller if you wish to manipulate the pause button and difficulty switches.

- `X` is pause on either controller.
- `L` and `R` triggers change the difficulty switch position for the respective player.
- `Select` and `Start` are `select` and `reset`, respectively.

The difficulty switches can also be toggled with `Core Options`.

## Mappers and ROMs

Atari 7800 ROMs use headers. There are a few ROMs with bad headers; here is how to fix some of them:

- *Commando* has no Pokey sound - the header needs to have it enabled (`0036h` needs to be `03h` instead of `02h`).
- There's a broken version of *Summer Games* and *Winter Games*. Both need address `0036h` changed from `02h` to `06h`.
- There is a broken version of *Sentinel* as well. Byte `0036h` needs to be `02h` instead of `03h`.

This core needs a BIOS to run, which you can select with `Core Options`. The core will load the file `7800bios.bin` (expected CRC32: `5D13730C`) found in the `/BIOS/` directory by default.

----------------------------

# Sega Genesis Jailbreak Notes

The Sega core supports all official Genesis ROMs except for *Virtua Racing*, and does not run 32X ROMs or common unlicensed games.

## Button Mapping

With a NES controller, button `C` on a Genesis 3-button controller is mapped to `select`. Other mappings are as follows:

| Genesis | SNES |
| ------- | ---- |
| `A` | `B` |
| `B` | `Y` |
| `C` | `A` |
| `X` | `X` |
| `Y` | `L` |
| `Z` | `R` |

| Genesis | M30 |
| ------- | --- |
| `A` | `B` |
| `B` | `Y` |
| `C` | `A` |
| `X` | `X` |
| `Y` | `Z` |
| `Z` | `C` |

----------------------

# SMS Core Release Notes

The SMS core runs games released for the Sega Mark III and the Master System. This core also runs SG-1000 games, but you must use the Japanese BIOS or no BIOS.

You can select a BIOS or disable the BIOS in the `Core Options` menu. Disabling it all the time may not be ideal, because a few games rely on it running first to set memory (see below for a list).

The BIOS is only loaded when a game is loaded. To change the BIOS you must reload the game. If it is missing, loading will fail and it will return you to the core menu. Place the BIOS into the `/BIOS/` directory on the SD card and try again, or turn the BIOS off in the `Core Options` menu. The core will load the file with the name `smsbios.bin` by default if one is found in the `/BIOS/` directory.
Note that the Sega Master System has [many different BIOS versions](https://datomatic.no-intro.org/index.php?page=search&op=datset&s=26&sel_s=Sega+-+Master+System+-+Mark+III&stext=BIOS&where=1&searchme=Search&pageSel=0&element=Titles&sort=Title&order=Ascending). You may need to change to an `export` region under `System/Hardware` to get some games or BIOSes to work.

Three mappers are supported; the mapper is selected by changing the file extension of the ROM file. A ROM that's 48K or less in size is run without a mapper; it is just loaded straight into `x0000-xBFFF`. A ROM with a size greater than 48K will use the standard Sega mapper. Other mappers supported:

`*.SCM` - selects the SMS Codemasters mapper. The following games require it:

- Cosmic Spacehead
- Dinobasher - Starring Bignose the Caveman (Proto)
- The Excellent Dizzy Collection (Proto)
- Fantastic Dizzy
- Micro Machines

`*.SKR` - selects the SMS Korean mapper. The following games require it:

- Dallyeora Pigu Wang (Korea) (Unl)
- Jang Pung II (Korea) (Unl)
- Jang Pung 3 (Korea) (Unl)
- Samgukji 3 (Korea) (Unl)

Any other extension is valid, and will just load as either no mapper or the standard Sega one.

The system will properly run larger, bankswitched BIOS ROMs such as the combined BIOS+Hang On ROMs, etc. Some ROMs have a useless 512-byte header at the beginning that is mostly `x00`'s. If this is found, it is ignored.

Some versions of the system have a `game reset` button. You may activate this feature by pressing `select` and `X` at the same time on a SNES controller. The combination is a safeguard to prevent it accidentally being pressed and resetting the game. Pause is mapped to the `start` button like you'd expect.

Lastly, you can turn the YM2413 on and off. This is because a couple of games crash if it's on.

## Problems and how to solve them

**SG-1000**: Unselect `Use BIOS` for playing these games, or select the Japanese BIOS. Also, select Japan as the Region in `System/Hardware`. *Terebi Oekaki* needs a drawing tablet and will not start without it.

**SMS**: You may need to switch between the US and Japanese BIOSes for certain games to work. The US BIOS performs a region check and will not play games (such as Sega's Japanese games) that do not pass the check.

- PAL games might have problems such as sprite flicker; this is normal and happens on an NTSC system too.
- Some games (*Walter Payton Football*, *Spy vs. Spy*) need the US BIOS to work. Otherwise, the former runs but has initial title screen corruption and the latter doesn't start at all.
- MSX ports larger than 48K use unusual mappers and are not supported.
- Games that require a paddle controller, light gun or the 3-D glasses will not work.
- *Wanted* - you must turn off FM for this game to work, but it still requires a light gun.
- *Super Tank* - select one of the `export` options in `System/Hardware`.
- *Back to the Future 3* - locks up at a black screen because it detects PAL/NTSC and will refuse to work if it detects NTSC.

----------------------------

# Game Gear Core Release Notes

Early models of the Game Gear did not include a BIOS, but later ones did. If you wish to see that blue startup screen, you can, by selecting the BIOS file under `Core Options`. The core will load a file with the name `ggbios.bin` (expected CRC32: `0EBEA9D4`) by default if found in the `/BIOS/` directory.

Because there are three mappers, the mapper is selected by changing the file extension of the ROM file.
A ROM that's 48K or less in size is run without a mapper; it is just loaded straight into `x0000-xBFFF`. A ROM greater than this size will use the standard Sega mapper. Other mappers supported:

- `*.GCM` - selects the SMS Codemasters mapper
- `*.GKR` - selects the SMS Korean mapper

Any other extension is valid, and will just load as either no mapper or the standard Sega one.

## Problems and how to solve them

These games are not actually Game Gear games but were released on the system and operate in SMS mode. You can play them on the SMS core:

- Castle of Illusion - Starring Mickey Mouse
- Cave Dude (Proto)
- Chase H.Q.
- The Excellent Dizzy Collection
- Fantastic Dizzy
- Jang Pung II/Street Battle (Unl)
- Olympic Gold
- Out Run Europa
- Predator 2
- Prince of Persia
- Rastan Saga
- R.C. Grand Prix
- Street Hero (Unl)
- Super Kick Off
- Super Tetris (Unl)
- WWF WrestleMania Steel Cage Challenge

These games need the `.GCM` file extension:

- CJ Elephant Fugitive
- Cosmic Spacehead
- (Archer MacLean's) Dropzone
- Ernie Els Golf
- The Excellent Dizzy Collection
- Fantastic Dizzy
- (S.S. Lucifer) Man Overboard!
- Micro Machines
- Micro Machines 2: Turbo Tournament
- Pete Sampras Tennis

The core will not properly play any games that use EEPROM for saving at this time; they will refuse to load. The following games are the only known games to use EEPROM:

- Hyper Pro Yakyuu '92
- Majors Pro Baseball, The
- Pro Yakyuu GG League
- World Series Baseball
- World Series Baseball '95 (including prototypes)

-------------------------------

# Colecovision Core Release Notes

The Coleco core will run games designed for the unenhanced ColecoVision as well as Super Game Module games.

Loading a BIOS is required for this core to load ROMs. Load a BIOS under the `Core Options` menu. The core will automatically load a file named `colbios.bin` if found in the `/BIOS/` directory, but you can use the Select New BIOS File option to select a BIOS file with any name.

| Coleco BIOS CRC32 | Description |
| ----------------- | ----------- |
| `3AA93EF3` | Official |
| `39BB16FC` | Unofficial, stylized font, can skip waiting |
| `4999ABC6` | Bit Corp clone |

## Button mapping

A SNES controller (or any compatible 12-button controller) or SNES NTT Data Pad (NDK10) is ideal for this core. The NTT Data Pad's numberpad is directly mapped to the Coleco controller's numberpad. ~~Same for the Famicom Network Controller (HVC-051).~~

If using a standard SNES controller, this is how the Coleco numberpad maps to the SNES buttons:

| SNES Button/Combo | Assignment |
| ------------------- | ---------- |
| `Start` | `1` (this usually selects the easiest game difficulty) |
| `Select` | `3` (this usually selects a harder game) |
| `X` | `#` (a popular option for some games to use as a start button) |
| `A` | `*` (another popular option to start games) |
| `L` + `up` | `0` |
| `L` + `right` | `1` |
| `L` + `down` | `2` |
| `L` + `left` | `3` |
| `R` + `up` | `4` |
| `R` + `right` | `5` |
| `R` + `down` | `6` |
| `R` + `left` | `7` |
| `L` + `R` + `up` | `8` |
| `L` + `R` + `right` | `9` |

## Special ROM handling

Two reproduction games, *The Black Onyx* and *Boxxle*, use EEPROM for saving (256 bytes and 32KB, respectively). Rename their extensions to `.ce0` and `.ce1`, respectively, to get them working. A third reproduction game, *Gradius*, saves to flash memory and uses a custom mapper. Rename its extension to `.cf0` to get it working.
--------------------------

# Gameboy Core Release Notes

The Gameboy core supports games using MBC1, MBC2, MBC3 (except RTC saving) and MBC5 (except rumble).

You must select the Gameboy's bootstrap (BIOS) for it to work. `dmgbios.bin` (expected CRC32: `59C8598E`) is the default file to be loaded if found in the `/BIOS/` directory. The Super Gameboy bootstrap `dmgbios2.bin` (example CRC32: `EC8A83B9`) also seems to work, and skips the scrolling intro.

---------------------------------

# Gameboy Color Core Release Notes

The Gameboy Color core supports games using MBC1, MBC2, MBC3 (except RTC saving) and MBC5 (except rumble).

You must select the Gameboy Color's bootstrap (BIOS) for it to work. `gbcbios.bin` (expected CRC32: `41884E46`) is the default file to be loaded if found in the `/BIOS/` directory.

--------------------------------

# Intellivision Core Release Notes

The Intellivision core supports games for the base system, the Intellivoice speech attachment and, to some extent, the Enhanced Computer System add-on (ECS).

You will need the GROM binary `grom.bin` placed in the `/BIOS/` directory (expected CRC32: `683A4158`).

You will need an Executive BIOS file (in INTV2 format) for the Intellivision (expected default name: `intvexec1.bin`). This file can be selected by name in the core menu. The recommended BIOS to convert and use is `Executive ROM, The (1978) (Mattel).int`, aka `intv/exec.bin` from MAME, with CRC32: `CBCE86F7`, 8192 bytes. When converted to the INTV2 format to be used with the Nt Mini Noir, it is 8208 bytes with CRC32: `EEB54C63`. This converted file should be placed in the `/BIOS/` directory with the expected name `intvexec1.bin`. See further below for information on the INTV2 format.

Note: You may also use the *Intellivision II* Executive BIOS (INTV2 format, expected CRC32: `A85FC6DD`, 8728 bytes), but it may be incompatible with some games (e.g. *Carnival*, *Donkey Kong*, *Mouse Trap*, *Venture*). This particular ROM can be found in [MAME](https://github.com/mamedev/mame/blob/master/src/mame/drivers/intv.cpp) (`intv/intv2/ro-3-9506-010.ic6`, 8704 bytes, CRC32: `DD7E1237`). The first 512 bytes are mapped to address 0x400, and the remaining 8192 bytes are mapped to address 0x1000. To convert this file to INTV2 format, each of the two chunks needs an eight-byte INTV2 header, plus an INTV2 file trailer:

`00 04 00 00 00 01 00 00` *(first 512 bytes of ROM)*

`00 10 00 00 00 10 00 00` *(remaining 8K bytes of ROM)*

`00 00 00 00 00 00 00 00`
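Those headers can be prepended with any hex editor, but as an illustration, here is a hypothetical Python sketch that wraps the 8,704-byte MAME dump in exactly the two chunk headers and the trailer shown above. The output file name is a placeholder (the expected name on the SD card is whatever you select in the core menu); the result should come out to 8,728 bytes.

```python
# make_intv2_exec2.py -- hypothetical sketch of the chunk layout shown above.
CHUNK1_HEADER = bytes.fromhex("0004000000010000")  # word address 0x0400, length 0x0100 words (512 bytes)
CHUNK2_HEADER = bytes.fromhex("0010000000100000")  # word address 0x1000, length 0x1000 words (8192 bytes)
TRAILER       = bytes(8)                           # address 0 / length 0 ends the file

with open("ro-3-9506-010.ic6", "rb") as f:         # 8704-byte Intellivision II Executive dump from MAME
    rom = f.read()

intv2 = CHUNK1_HEADER + rom[:512] + CHUNK2_HEADER + rom[512:] + TRAILER

with open("intvexec2.bin", "wb") as f:             # output name is a placeholder
    f.write(intv2)

print(len(intv2), "bytes written")                 # expect 8728
```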
To use the Intellivoice, the 2K ROM for the speech chip has to be present in the `/BIOS/` directory and named `012.bin`. `012.bin` can be constructed by reversing the _bit_ order of each byte in `sp0256-012.bin` (CRC32: `0DE7579D`) from the `intv_voice` entry in MAME.

## Controller support

Only the SNES NTT Data Pad (NDK10) ~~or the Famicom Network Controller (HVC-051)~~ can fully map the numberpads of the Intellivision controller.

By default, the controller plugged into port two on the console will likely be seen as player one by the game, so the `Core Options` menu allows you to swap controllers. It also allows you to enable ECS, Intellivoice and Intellicart mapping.

## INTV2 ROM Format

Because Intellivision games map themselves into various areas of memory, the ROM must be able to tell the core into which areas of the memory address space game code and data are to be loaded. There is more than one existing format for Intellivision ROMs, but the existing conventions were deemed unsuitable for the Intellivision core.

To make something usable, a new file format was created, called INTV2. It has the extension `.intv`, and the core will only load Intellivision ROMs in this format.

The files are a relatively simple format. All data is stored in little endian form, with the lowest 8 bits first and the upper 2 bits (for decles) stored next, with the top 6 bits as `0`s. This format allows the storage of 16-bit words too, since some homebrews use these instead; they are also stored in little endian format.

Since the ROMs self-map into memory, some provision to define where data goes is needed, thus a chunked format was devised. Each file consists of 1 or more chunks, which define the start address in memory where the data is to be loaded and its length, followed by the actual data. The last chunk is simply an address and length of 0. All address and length values are little endian. Each field is 32 bits.

The first 64K of the address space is the Intellivision's memory map directly. Each address is a word address, so `0x5000` is word address `0x5000` (which would technically be byte `0xa000` on a modern system).

Here's an example using *Burgertime*. It is a 16K game (8K words). It maps into memory at `0x5000`, and has a length of `0x2000` (words). It is constructed like so:

```
/ address \ / length  \        ROM data        / address \ / length  \
00 50 00 00 00 20 00 00 <16K of data follows>  00 00 00 00 00 00 00 00
```

Bit 16 of the address currently selects whether the area is writeable or not (i.e. RAM). The other bits may be used in the future for bankswitched games and/or homebrews as they arise.

The BIOS you wish to use also needs to be in this format, since the different BIOSes (e.g. the Intellivision II) load data in different places. A utility to convert files to the INTV2 format can be found [here](https://github.com/dot-bob/int2intv).

----------------------------------

# Adventurevision Core Release Notes

The Adventurevision core runs all games. The Code Red demo does not run as well as the games because it does not use BIOS routines for graphics drawing.

There are two BIOS files needed for this core:

| File | Description |
| ------------- | ----------- |
| `avbios.bin` | This is the 1K ROM found inside the 8048 on the system. CRC32: `279E33D1` |
| `avsound.bin` | This is the 512 byte ROM on the sound CPU. CRC32: `81E95975` |

Both of these need to be put in the `/BIOS/` directory and named as identified above.

*Turtles* seems to have a graphical "bug": the right side of the maze is shifted right by a pixel, but this is how the real system runs it. *Defender* likewise has a bad collision bug on original hardware; you can often shoot through enemies. This appears to occur because the game checks only when your shots touch the edge of an enemy sprite.

-------------------------------

# Arcadia 2001 Core Release Notes

The Arcadia 2001 core runs known 2001 games, but there are other systems based on the same/similar chips. Games for these systems may or may not run on this core. Several prototypes of released games have graphics issues that the released versions fixed. Some homebrew games rely on the numberpad rather than the joystick to move.

## Controls

You can use the keypad of an SNES NTT Data Pad (NDK10) or press "chords" on a SNES controller.
The "chording" works similarly to the Colecovision:

```
Ltrig + up            = 0
Ltrig + right         = 1
Ltrig + down          = 2
Ltrig + left          = 3
Rtrig + up            = 4
Rtrig + right         = 5
Rtrig + down          = 6
Rtrig + left          = 7
Ltrig + Rtrig + up    = 8
Ltrig + Rtrig + right = 9
Ltrig + Rtrig + down  = *
```

These buttons don't require chording:

```
X      = #
A      = option (both controllers)
start  = start  (both controllers)
select = select (both controllers)
Y      = fire button
```

----------------------------

# Channel F Core Release Notes

The Channel F BIOS is required to be present in the `/BIOS/` directory, and it must be named `cfbios.bin` (expected CRC32: `2882C02D`). The BIOS should be a concatenation of:

- `[BIOS] Fairchild Channel F (USA) (SL31253).bin` (CRC32: `04694ED9`),
- `[BIOS] Fairchild Channel F (USA) (SL31254).bin` (CRC32: `9C047BA3`)

resulting in a merged file with a CRC32 checksum of `2882C02D`. For example, via the Windows command line:

```
copy /B sl31253.bin + sl31254.bin cfbios.bin
```

## Button mapping

There are four buttons on the Channel F system to select a game, time length, and other things. Starting a game involves optionally setting these before playing. On this core, the four buttons map to a SNES controller like so:

```
1 - left trigger
2 - right trigger
3 - select
4 - start
```

If you have an SNES NTT Data Pad (NDK10), `1, 2, 3, 4` map to `1, 2, 3, 4` on the NTT Data Pad's numeric keypad.

Using it: when the system is reset, it says `G?` on the screen. It is asking which game to play. Hitting one of the four buttons selects one of the 4 games on the cartridge.

```
(left trigger)  1 - "game 1" (from the cart, or the BIOS if the "play bios games" file is run)
(right trigger) 2 - "game 2" (from the cart, or the BIOS if the "play bios games" file is run)
(select)        3 - "game 3" (from the cart)
(start)         4 - "game 4" (from the cart)
```

Next, it will print `S?` on the screen. You can either start a game now or enter more options. These options are:

```
(left trigger)  1 - time limit
(right trigger) 2 - game speed/(mode or motion)
(start)         4 - start game immediately
```

If you press `1` for a time limit, then `T?` is displayed. You can choose a time limit using the four buttons:

```
(left trigger)  1 - 2 minutes
(right trigger) 2 - 5 minutes
(select)        3 - 10 minutes
(start)         4 - 20 minutes
```

At this time, you can press either `1`, `2`, or `4`, since it should be back at the `S?` prompt.

If you had pressed `2` for a mode (motion/speed), `M?` will be displayed. Generally, 1-4 will be the speed of the game from slow to fast.

```
(left trigger)  1 - speed 1
(right trigger) 2 - speed 2
(select)        3 - speed 3
(start)         4 - speed 4
```

As with the time setting, it should then be back at the `S?` prompt, ready for the next option.

For general-variety playing, you can simply load a game and hit `ltrig`/`rtrig`/`select`/`start` then `start` to start one of the 4 games.

It seems many games choose one controller port at random to use for player 1 (i.e. some games use controller port 1, some use controller port 2), so if a game does not seem playable, add a second controller or swap ports.

-------------------------------

# Creativision Core Release Notes

You must press `reset` after loading a game on this system to play it. The Creativision plays games and supports the keyboard, but tape saving and loading are not working.
The Creativision BIOS must be present in the `/BIOS/` directory and named `crbios.bin` (expected CRC32: `05602697`).

## Button mapping

Even though there is no support for a physical keyboard, the games are still playable using a regular controller. The `B` and `A` buttons are mapped to the two fire buttons, and `start`/`select` are mapped to the two most common keys used to start the games.

On startup, most games will run, show a demo mode and even appear to respond to controller input. To get out of demo mode, you must reset the system.

-------------------------

# Gamate Core Release Notes

This core needs a BIOS, and there are two known BIOSes: a Bit Corp version (CRC32: `07090415`) and a UMC version (CRC32: `03A5F3A7`). Either can be used, but it must be named `gmbios.bin` and placed in the `/BIOS/` directory on your SD card.

There are three mappers on the system. Two of these can be distinguished by file size; however, the multicart(s) need the `.GML` extension. So rename a file such as `4-in-1.bin` to `4-in-1.gml` to run it.

-------------------------

# Game King Core Release Notes

This core needs a BIOS. It should be named `gkbios.bin` (sometimes it can be found named `gm218.bin`) and placed in the `/BIOS/` directory on your SD card. Expected CRC32: `5A1ADE3D`.

There are three games built into the system, so if you wish to play these, simply run the included file, `play bios games.bin`. This is an empty file consisting of nothing but bytes of `0xff`. This simulates having no cartridge in the system.

----------------------------

# Odyssey<sup>2</sup> Core Release Notes

The Odyssey<sup>2</sup> core supports keyboards via PS2 and an adapter that plugs into the Famicom expansion port. If you wish to make an adapter, you can find the schematic in the `/SYSTEM/` directory.

The Voice speech expansion add-on is also complete and fully functional if you wish to hear speech/sound effects in the games that support it. Some games (such as *Frogger*) play poorly with The Voice enabled all the time, but most games will work fine with it enabled. On *Frogger* it will constantly repeat an allophone over and over; this is not a bug but a byproduct of how The Voice works.

This core needs a BIOS. It should be named:

```
o2bios.bin - Main 8048 BIOS (1K byte) CRC32: 8016A315
```

This BIOS file can be selected with `Core Options`. If you wish to use The Voice, you need three files:

```
019.bin      - Speech ROM in the SP0256-019 speech chip (2K bytes)  CRC32: 19355075
sp128_03.bin - Speech ROM resident in the speech module (16K bytes) CRC32: 66041B03
sp128_04.bin - Speech ROM in Sid the Spellbinder (16K bytes)        CRC32: 6780C7D3
```

`019.bin` can be constructed by reversing the _bit_ order of each byte in `sp0256b-019.bin` (CRC32: 4BB43724) from the `o2_voice` entry in MAME. Place them in the `/BIOS/` directory and name them as indicated above.
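Reversing the bit order of every byte (the same operation described for the Intellivoice `012.bin` in the Intellivision section) takes only a few lines. The following is a hypothetical Python sketch, with the file names taken from the text above; it is only an illustration, not something provided by the jailbreak.

```python
# make_019.py -- hypothetical sketch: reverse the bit order of each byte.
def reverse_bits(b: int) -> int:
    return int(f"{b:08b}"[::-1], 2)

with open("sp0256b-019.bin", "rb") as f:        # dump from MAME's o2_voice entry
    data = f.read()

with open("019.bin", "wb") as f:
    f.write(bytes(reverse_bits(b) for b in data))
```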
All Odyssey<sup>2</sup> keys map to their corresponding keys on a PC keyboard, except "clear", which is mapped to backspace.

Most games can be played without the keyboard, since keys 1-4 (which are usually used to start games) are mapped to the controller. Some games, like *Killer Bees*, start using the fire button.

-------------------------------
# RCA Studio 2 Core Release Notes

The Studio 2 core runs games and homebrew. The video display can be problematic on this core, showing the following issues:

- The selected entry on the menus flashes really fast.
- The screen might scroll on reset.
- You cannot see the menu on composite/RGB if the CPU is reset.

These problems are related to how the system renders video: the CPU itself drives the video display, and when the CPU is not running, it does not generate video timing.

Many games on this system start with a black screen until you press a button. This is normal; refer to the instruction manual to determine how to start the game.

This system requires a BIOS, and it must be named `rca2bios.bin` (CRC32: `A494B339`) and placed in the `/BIOS/` directory.

The BIOS should be a concatenation of:

- `84932.ic11` (CRC32: `283B7E65`),
- `84933.ic12` (CRC32: `A396B77C`),
- `85456.ic13` (CRC32: `D25CF97F`),
- `85457.ic14` (CRC32: `74AA724F`)

from the `studio2` entry in MAME, resulting in a merged file with a CRC32 checksum of `A494B339`. For example, via the Windows command line:

```
copy /B 84932.ic11 + 84933.ic12 + 85456.ic13 + 85457.ic14 rca2bios.bin
```

## Controller mapping

The system has two 10-key keypads. These are almost always used as a "joystick". Here is how it maps to a SNES controller:

| RCA Studio 2 keypad | SNES Controller |
| ------------------- | --------------- |
| `1`                 | `up`+`left`     |
| `2`                 | `up`            |
| `3`                 | `up`+`right`    |
| `4`                 | `left`          |
| `5`                 | `Y`             |
| `6`                 | `right`         |
| `7`                 | `down`+`left`   |
| `8`                 | `down`          |
| `9`                 | `down`+`right`  |
| `0`                 | `B`             |

You can hold down `A`, which disables the D-pad on the controller, so you can easily press `1`, `3`, `9`, or `7` (the diagonals) without hitting one of the cardinal directions.

If you have an SNES NTT Data Pad (NDK10), you can use its numeric keypad to hit `0`-`9`.

This core uses a monochrome menu. This is normal.

Note: When the system is first started and a game is loaded, it is overwritten by the BIOS games, so the first games that will play are the BIOS games. Loading another game (or the same one again) will cause it to load the game as desired.

------------------------------
# Supervision Core Release Notes

The Supervision core plays all games and does not require a BIOS.

*sssnake* appears to output a very quiet sound when you pick something up, but this is a bug in the program: the game is playing one of the drum samples really slowly, so it doesn't make much sound.

-----------------------------
# Videobrain Core Release Notes

The Videobrain supports keyboards via a PS2 keyboard and the Odyssey<sup>2</sup> adapter.

This core needs a pair of BIOS ROMs. They must be named:

```
uvres1.bin - The first BIOS file (2K bytes)
             CRC32: 065FE7C2
             (commonly named "uvres 1n.d67" in the vidbrain entry in MAME)

uvres2.bin - The second BIOS file (2K bytes)
             CRC32: 1D85D7BE
             (commonly named "resn2.e5" in the vidbrain entry in MAME)
```

Place them in the `/BIOS/` directory.
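If you want to double-check any of the BIOS files mentioned in these notes before copying them to the card, comparing CRC32 checksums is a quick way to catch a bad dump or a botched concatenation. Below is a small, optional Python sketch; it is not part of the firmware, and the file list is only an example built from the checksums quoted in these notes. Run it in the directory that holds your BIOS files:

```
# Optional helper: verify BIOS files against the CRC32 values quoted in these notes.
# Only the files present in the current directory are checked.
import binascii

EXPECTED = {
    "uvres1.bin":   0x065FE7C2,  # Videobrain, first BIOS ROM
    "uvres2.bin":   0x1D85D7BE,  # Videobrain, second BIOS ROM
    "o2bios.bin":   0x8016A315,  # Odyssey2 main 8048 BIOS
    "rca2bios.bin": 0xA494B339,  # RCA Studio 2 merged BIOS
    "cfbios.bin":   0x2882C02D,  # Channel F merged BIOS
}

for name, expected in EXPECTED.items():
    try:
        with open(name, "rb") as f:
            actual = binascii.crc32(f.read()) & 0xFFFFFFFF
    except FileNotFoundError:
        continue  # skip BIOS files you don't have
    status = "OK" if actual == expected else "MISMATCH"
    print(f"{name}: {actual:08X} (expected {expected:08X}) {status}")
```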
## Controller mapping

The controller maps the directionals directly to the Videobrain's directionals. `B` is the fire button.

## Keyboard mapping

If you use the PS2 adapter and keyboard, the PC keyboard maps as follows:

`A`-`Z` maps to `A`-`Z`.

Capslock is the `shift` key, because the Capslock function is controlled by the system. The state of shift is indicated on screen with a box at the bottom right. If it's black, shift is off and `A`-`Z` work as usual. If it's grey, shift is on and symbols and numbers are usable.

The mapping of the keyboard when shifted and unshifted is as follows:

Non-shifted characters:

```
A-Z, 0, space
```

Shifted characters:

```
1-9, !, #, $, %, *, (, ), -, +, =, ., ', ", ?
F9  - cent symbol
F10 - pi symbol
F11 - division symbol
F12 - multiplication symbol
```

These characters and symbols are "dual mapped": the keyboard does not know the shift state, because shift is handled in the VideoBrain itself. So if shift is off and you press `%`, `Q` will print to the screen; likewise, if shift is on and `Q` is pressed, `%` will appear.

The remaining special keys map as follows:

```
F1 - restart/erase
F2 - run/stop
F3 - alarm/special
F4 - clock/next
F5 - color/previous
F6 - text/back
```

## Playing the games

Many of the games need the keyboard to start them. There is an empty binary file that lets you use the built-in BIOS functions.

----------------------
# Megaduck Release Notes

Megaduck has no BIOS, but it does have mappers.

## Special ROM handling

There are two different mappers for Megaduck. They are selected by extension: `.MD1` selects the single 32K selectable bank mode, and `.MD2` selects the 16K selectable bank mode.

All games greater than 32K need the `.MD2` extension except for the following two games:

- *Puppet Knight*
- *Suleiman's Treasure*

These need the `.MD1` extension. 32K games do not need a special extension.
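If you have a folder full of Megaduck dumps to sort out, the extension rule above is simple enough to script. The following is only an illustrative Python sketch; it is not part of the firmware, and it matches the two exception titles by filename, which you may need to adapt to your own naming:

```
# Illustrative helper: give Megaduck ROMs the extension described above.
# Run it inside a directory of .bin dumps.
import os

# Games larger than 32K that still use the .MD1 mapper (see the note above).
# These substrings are an assumption about how the files are named.
MD1_EXCEPTIONS = ("puppet knight", "suleiman")

for name in os.listdir("."):
    if not name.lower().endswith(".bin"):
        continue
    if os.path.getsize(name) <= 32 * 1024:
        continue  # 32K (or smaller) games keep their normal extension
    base, _ = os.path.splitext(name)
    ext = ".md1" if any(k in name.lower() for k in MD1_EXCEPTIONS) else ".md2"
    os.rename(name, base + ext)
    print(f"{name} -> {base + ext}")
```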
(2) Ist die elektronische Fassung der Urschrift ganz oder teilweise zerstört worden, soll die Urschrift erneut nach § 56 in die elektronische Form übertragen und in der elektronischen Urkundensammlung verwahrt werden. Ist die Urschrift nicht mehr vorhanden, gilt Absatz 1 entsprechend oder die Wiederherstellung erfolgt aus einer im Elektronischen Urkundenarchiv gespeicherten früheren elektronischen Fassung der Urschrift. Für die Wiederherstellung aus einer früheren elektronischen Fassung gilt § 56 Absatz 1 entsprechend; in dem Vermerk soll zusätzlich die Tatsache der sicheren Speicherung im Elektronischen Urkundenarchiv angegeben werden. (3) Die Ersetzung erfolgt durch die Stelle, die für die Erteilung einer Ausfertigung zuständig ist. (4) Vor der Ersetzung der Urschrift soll der Schuldner gehört werden, wenn er sich in der Urkunde der sofortigen Zwangsvollstreckung unterworfen hat. Von der Ersetzung der Urschrift sollen die Personen, die eine Ausfertigung verlangen können, verständigt werden, soweit sie sich ohne erhebliche Schwierigkeiten ermitteln lassen. ### § 47 Ausfertigung Die Ausfertigung der Niederschrift vertritt die Urschrift im Rechtsverkehr. ### § 48 Zuständigkeit für die Erteilung der Ausfertigung Die Ausfertigung erteilt, soweit bundes- oder landesrechtlich nichts anderes bestimmt ist, die Stelle, welche die Urschrift verwahrt. Wird die Urschrift bei einem Gericht verwahrt, so erteilt der Urkundsbeamte der Geschäftsstelle die Ausfertigung. ### § 49 Form der Ausfertigung (1) Die Ausfertigung besteht, jeweils mit einem Ausfertigungsvermerk versehen, in 1. einer Abschrift der Urschrift oder der elektronischen Fassung der Urschrift oder 2. einem Ausdruck der elektronischen Fassung der Urschrift. (2) Der Ausfertigungsvermerk soll den Tag und den Ort der Erteilung angeben, die Person bezeichnen, der die Ausfertigung erteilt wird, und die Übereinstimmung der Ausfertigung mit der Urschrift oder der elektronischen Fassung der Urschrift bestätigen. Er muß unterschrieben und mit dem Siegel der erteilenden Stelle versehen sein. Besteht die Ausfertigung in einer Abschrift oder einem Ausdruck der elektronischen Fassung der Urschrift, soll das Ergebnis der Signaturprüfung dokumentiert werden. (3) Werden Abschriften von Urkunden mit der Ausfertigung durch Schnur und Prägesiegel verbunden oder befinden sie sich mit dieser auf demselben Blatt, so genügt für die Beglaubigung dieser Abschriften der Ausfertigungsvermerk; dabei soll entsprechend § 42 Abs. 3 und, wenn die Urkunden, von denen die Abschriften hergestellt sind, nicht zusammen mit der Urschrift der ausgefertigten Urkunde verwahrt werden, auch entsprechend § 42 Abs. 1, 2 verfahren werden. (4) Im Urkundenverzeichnis soll vermerkt werden, wem und an welchem Tage eine Ausfertigung erteilt worden ist. (5) Die Ausfertigung kann auf Antrag auch auszugsweise erteilt werden. § 42 Abs. 3 ist entsprechend anzuwenden. ### § 50 Übersetzungen (1) Ein Notar kann die deutsche Übersetzung einer Urkunde mit der Bescheinigung der Richtigkeit und Vollständigkeit versehen, wenn er die Urkunde selbst in fremder Sprache errichtet hat oder für die Erteilung einer Ausfertigung der Niederschrift zuständig ist. Für die Bescheinigung gilt § 39 entsprechend. Der Notar soll die Bescheinigung nur erteilen, wenn er der fremden Sprache hinreichend kundig ist. (2) Eine Übersetzung, die mit einer Bescheinigung nach Absatz 1 versehen ist, gilt als richtig und vollständig. Der Gegenbeweis ist zulässig. 
(3) Von einer derartigen Übersetzung können Ausfertigungen und Abschriften erteilt werden. Die Übersetzung soll in diesem Fall zusammen mit der Urschrift verwahrt werden. ### § 51 Recht auf Ausfertigungen, Abschriften und Einsicht (1) Ausfertigungen können verlangen 1. bei Niederschriften über Willenserklärungen jeder, der eine Erklärung im eigenen Namen abgegeben hat oder in dessen Namen eine Erklärung abgegeben worden ist, 2. bei anderen Niederschriften jeder, der die Aufnahme der Urkunde beantragt hat, sowie die Rechtsnachfolger dieser Personen. (2) Die in Absatz 1 genannten Personen können gemeinsam in der Niederschrift oder durch besondere Erklärung gegenüber der zuständigen Stelle etwas anderes bestimmen. (3) Wer Ausfertigungen verlangen kann, ist auch berechtigt, einfache oder beglaubigte Abschriften zu verlangen und die Urschrift einzusehen. (4) Mitteilungspflichten, die auf Grund von Rechtsvorschriften gegenüber Gerichten oder Behörden bestehen, sowie das Recht auf Zugang zu Forschungszwecken nach § 18a der Bundesnotarordnung bleiben unberührt. ### § 52 Vollstreckbare Ausfertigungen Vollstreckbare Ausfertigungen werden nach den dafür bestehenden Vorschriften erteilt. ### § 53 Einreichung beim Grundbuchamt oder Registergericht Sind Willenserklärungen beurkundet worden, die beim Grundbuchamt oder Registergericht einzureichen sind, so soll der Notar dies veranlassen, sobald die Urkunde eingereicht werden kann, es sei denn, daß alle Beteiligten gemeinsam etwas anderes verlangen; auf die mit einer Verzögerung verbundenen Gefahren soll der Notar hinweisen. ### § 54 Rechtsmittel (1) Gegen die Ablehnung der Erteilung der Vollstreckungsklausel oder einer Amtshandlung nach den §§ 45a, 46 und 51 sowie gegen die Ersetzung einer Urschrift ist die Beschwerde gegeben. (2) Für das Beschwerdeverfahren gelten die Vorschriften des Gesetzes über das Verfahren in Familiensachen und in den Angelegenheiten der freiwilligen Gerichtsbarkeit. Über die Beschwerde entscheidet eine Zivilkammer des Landgerichts, in dessen Bezirk die Stelle, gegen die sich die Beschwerde richtet, ihren Sitz hat. ## Fünfter Abschnitt - Verwahrung der Urkunden ### § 55 Verzeichnis und Verwahrung der Urkunden (1) Der Notar führt ein elektronisches Verzeichnis über Beurkundungen und sonstige Amtshandlungen (Urkundenverzeichnis). (2) Das Urkundenverzeichnis und die elektronische Urkundensammlung sind vom Notar im Elektronischen Urkundenarchiv (§ 78h der Bundesnotarordnung) zu führen. (3) Die im Urkundenverzeichnis registrierten Urkunden verwahrt der Notar in einer Urkundensammlung, einer elektronischen Urkundensammlung und einer Erbvertragssammlung. ### § 56 Übertragung der Papierdokumente in die elektronische Form; Einstellung der elektronischen Dokumente in die elektronische Urkundensammlung (1) Bei der Übertragung der in Papierform vorliegenden Schriftstücke in die elektronische Form soll durch geeignete Vorkehrungen nach dem Stand der Technik sichergestellt werden, dass die elektronischen Dokumente mit den in Papierform vorhandenen Schriftstücken inhaltlich und bildlich übereinstimmen. Diese Übereinstimmung ist vom Notar in einem Vermerk unter Angabe des Ortes und des Tages seiner Ausstellung zu bestätigen. Durchstreichungen, Änderungen, Einschaltungen, Radierungen oder andere Mängel des Schriftstücks sollen im Vermerk angegeben werden, soweit sie nicht aus dem elektronischen Dokument eindeutig ersichtlich sind. Das elektronische Dokument und der Vermerk müssen mit einer qualifizierten elektronischen Signatur versehen werden. 
§ 39a Absatz 1 Satz 3 bis 5, Absatz 2 Satz 1 gilt entsprechend. (2) Werden nach der Einstellung der elektronischen Fassung einer in der Urkundensammlung zu verwahrenden Urschrift oder Abschrift in die elektronische Urkundensammlung Nachtragsvermerke, weitere Unterlagen oder andere Urschriften der Urschrift oder Abschrift beigefügt, sind die Nachtragsvermerke, die weiteren Unterlagen und die anderen Urschriften nach Absatz 1 in elektronische Dokumente zu übertragen und zusammen mit der elektronischen Fassung der Urschrift oder Abschrift in der elektronischen Urkundensammlung zu verwahren. (3) Die von dem Notar in der elektronischen Urkundensammlung verwahrten elektronischen Dokumente stehen den Dokumenten gleich, aus denen sie nach den Absätzen 1 und 2 übertragen worden sind. ## Sechster Abschnitt - Verwahrung ### § 57 Antrag auf Verwahrung (1) Der Notar darf Bargeld zur Aufbewahrung oder zur Ablieferung an Dritte nicht entgegennehmen. (2) Der Notar darf Geld zur Verwahrung nur entgegennehmen, wenn 1. hierfür ein berechtigtes Sicherungsinteresse der am Verwahrungsgeschäft beteiligten Personen besteht, 2. ihm ein Antrag auf Verwahrung verbunden mit einer Verwahrungsanweisung vorliegt, in der hinsichtlich der Masse und ihrer Erträge der Anweisende, der Empfangsberechtigte sowie die zeitlichen und sachlichen Bedingungen der Verwahrung und die Auszahlungsvoraussetzungen bestimmt sind, 3. er den Verwahrungsantrag und die Verwahrungsanweisung angenommen hat. (3) Der Notar darf den Verwahrungsantrag nur annehmen, wenn die Verwahrungsanweisung den Bedürfnissen einer ordnungsgemäßen Geschäftsabwicklung und eines ordnungsgemäßen Vollzugs der Verwahrung sowie dem Sicherungsinteresse aller am Verwahrungsgeschäft beteiligten Personen genügt. (4) Die Verwahrungsanweisung sowie deren Änderung, Ergänzung oder Widerruf bedürfen der Schriftform. (5) Auf der Verwahrungsanweisung hat der Notar die Annahme mit Datum und Unterschrift zu vermerken, sofern die Verwahrungsanweisung nicht Gegenstand einer Niederschrift (§§ 8, 36) ist, die er selbst oder seine Notarvertretung aufgenommen hat. (6) Die Absätze 3 bis 5 gelten entsprechend für Treuhandaufträge, die dem Notar im Zusammenhang mit dem Vollzug des der Verwahrung zugrundeliegenden Geschäfts von Personen erteilt werden, die an diesem nicht beteiligt sind. ### § 58 Durchführung der Verwahrung (1) Der Notar hat anvertraute Gelder unverzüglich einem Sonderkonto für fremde Gelder (Notaranderkonto) zuzuführen. Der Notar ist zu einer bestimmten Art der Anlage nur bei einer entsprechenden Anweisung der Beteiligten verpflichtet. Fremdgelder sowie deren Erträge dürfen auch nicht vorübergehend auf einem sonstigen Konto des Notars oder eines Dritten geführt werden. (2) Das Notaranderkonto muß bei einem im Inland zum Geschäftsbetrieb befugten Kreditinstitut oder der Deutschen Bundesbank eingerichtet sein. Die Anderkonten sollen bei Kreditinstituten in dem Amtsbereich des Notars oder den unmittelbar angrenzenden Amtsgerichtsbezirken desselben Oberlandesgerichtsbezirks eingerichtet werden, sofern in der Anweisung nicht ausdrücklich etwas anderes vorgesehen wird oder eine andere Handhabung sachlich geboten ist. Für jede Verwahrungsmasse muß ein gesondertes Anderkonto geführt werden, Sammelanderkonten sind nicht zulässig. (3) Über das Notaranderkonto dürfen nur der Notar persönlich, die Notarvertretung, der Notariatsverwalter oder der nach § 51 Absatz 1 Satz 2 der Bundesnotarordnung mit der Aktenverwahrung betraute Notar verfügen. 
Die Landesregierungen oder die von ihnen bestimmten Stellen werden ermächtigt, durch Rechtsverordnung zu bestimmen, daß Verfügungen auch durch einen entsprechend bevollmächtigten anderen Notar oder im Land Baden-Württemberg durch Notariatsabwickler erfolgen dürfen. Verfügungen sollen nur erfolgen, um Beträge unverzüglich dem Empfangsberechtigten oder einem von diesem schriftlich benannten Dritten zuzuführen. Sie sind grundsätzlich im bargeldlosen Zahlungsverkehr durchzuführen, sofern nicht besondere berechtigte Interessen der Beteiligten die Auszahlung in bar oder mittels Bar- oder Verrechnungsscheck gebieten. Die Gründe für eine Bar- oder Scheckauszahlung sind von dem Notar zu vermerken. Die Bar- oder Scheckauszahlung ist durch den berechtigten Empfänger oder einen von ihm schriftlich Beauftragten nach Feststellung der Person zu quittieren. Verfügungen zugunsten von Privat- oder Geschäftskonten des Notars sind lediglich zur Bezahlung von Kostenforderungen aus dem zugrundeliegenden Amtsgeschäft unter Angabe des Verwendungszwecks und nur dann zulässig, wenn hierfür eine notarielle Kostenrechnung erteilt und dem Kostenschuldner zugegangen ist und Auszahlungsreife des verwahrten Betrages zugunsten des Kostenschuldners gegeben ist. (4) Eine Verwahrung soll nur dann über mehrere Anderkonten durchgeführt werden, wenn dies sachlich geboten ist und in der Anweisung ausdrücklich bestimmt ist. (5) Schecks sollen unverzüglich eingelöst oder verrechnet werden, soweit sich aus den Anweisungen nichts anderes ergibt. Der Gegenwert ist nach den Absätzen 2 und 3 zu behandeln. ### § 59 Verordnungsermächtigung Das Bundesministerium der Justiz und für Verbraucherschutz hat durch Rechtsverordnung mit Zustimmung des Bundesrates die näheren Bestimmungen zu treffen über den Inhalt, den Aufbau und die Führung des Verwahrungsverzeichnisses einschließlich der Verweismöglichkeiten auf die im Urkundenverzeichnis zu der Urkunde gespeicherten Daten sowie über Einzelheiten der Datenübermittlung und -speicherung sowie der Datensicherheit. Die Verordnung kann auch Ausnahmen von der Eintragungspflicht anordnen. Die technischen und organisatorischen Maßnahmen zur Gewährleistung der Datensicherheit müssen denen zur Gewährleistung der Datensicherheit des Elektronischen Urkundenarchivs entsprechen. ### § 59a Verwahrungsverzeichnis (1) Der Notar führt ein elektronisches Verzeichnis über Verwahrungsmassen, die er nach § 23 der Bundesnotarordnung und nach den §§ 57 und 62 entgegennimmt (Verwahrungsverzeichnis). (2) Das Verwahrungsverzeichnis ist im Elektronischen Urkundenarchiv (§ 78h der Bundesnotarordnung) zu führen. Erfolgt die Verwahrung in Vollzug eines vom Notar in das Urkundenverzeichnis einzutragenden Amtsgeschäfts, soll der Notar im Verwahrungsverzeichnis auf die im Urkundenverzeichnis zu der Urkunde gespeicherten Daten verweisen, soweit diese auch in das Verwahrungsverzeichnis einzutragen wären. ### § 60 Widerruf (1) Den schriftlichen Widerruf einer Anweisung hat der Notar zu beachten, soweit er dadurch Dritten gegenüber bestehende Amtspflichten nicht verletzt. (2) Ist die Verwahrungsanweisung von mehreren Anweisenden erteilt, so ist der Widerruf darüber hinaus nur zu beachten, wenn er durch alle Anweisenden erfolgt. (3) Erfolgt der Widerruf nach Absatz 2 nicht durch alle Anweisenden und wird er darauf gegründet, daß das mit der Verwahrung durchzuführende Rechtsverhältnis aufgehoben, unwirksam oder rückabzuwickeln sei, soll sich der Notar jeder Verfügung über das Verwahrungsgut enthalten. 
Der Notar soll alle an dem Verwahrungsgeschäft beteiligten Personen im Sinne des § 57 hiervon unterrichten. Der Widerruf wird jedoch unbeachtlich, wenn 1. eine spätere übereinstimmende Anweisung vorliegt oder 2. der Widerrufende nicht innerhalb einer von dem Notar festzusetzenden angemessenen Frist dem Notar nachweist, daß ein gerichtliches Verfahren zur Herbeiführung einer übereinstimmenden Anweisung rechtshängig ist, oder 3. dem Notar nachgewiesen wird, daß die Rechtshängigkeit der nach Nummer 2 eingeleiteten Verfahren entfallen ist. (4) Die Verwahrungsanweisung kann von den Absätzen 2 und 3 abweichende oder ergänzende Regelungen enthalten. (5) § 15 Abs. 2 der Bundesnotarordnung bleibt unberührt. ### § 61 Absehen von Auszahlung Der Notar hat von der Auszahlung abzusehen und alle an dem Verwahrungsgeschäft beteiligten Personen im Sinne des § 57 hiervon zu unterrichten, wenn 1. hinreichende Anhaltspunkte dafür vorliegen, daß er bei Befolgung der unwiderruflichen Weisung an der Erreichung unerlaubter oder unredlicher Zwecke mitwirken würde, oder 2. einem Auftraggeber im Sinne des § 57 durch die Auszahlung des verwahrten Geldes ein unwiederbringlicher Schaden erkennbar droht. ### § 62 Verwahrung von Wertpapieren und Kostbarkeiten (1) Die §§ 57, 60 und 61 gelten entsprechend für die Verwahrung von Wertpapieren und Kostbarkeiten. (2) Der Notar ist berechtigt, Wertpapiere und Kostbarkeiten auch einer Bank im Sinne des § 58 Absatz 2 in Verwahrung zu geben, und ist nicht verpflichtet, von ihm verwahrte Wertpapiere zu verwalten, soweit in der Verwahrungsanweisung nichts anderes bestimmt ist. ## Siebter Abschnitt - Schlußvorschriften ### 1. - Verhältnis zu anderen Gesetzen #### a) - Bundesrecht ##### § 63 Beseitigung von Doppelzuständigkeiten (1) u. (2) (Änderungsvorschriften) (3) (weggefallen) (4) Auch wenn andere Vorschriften des bisherigen Bundesrechts die gerichtliche oder notarielle Beurkundung oder Beglaubigung oder die Erklärung vor einem Gericht oder Notar vorsehen, ist nur der Notar zuständig. ##### § 64 Beurkundungen nach dem Personenstandsgesetz Dieses Gesetz gilt nicht für Beurkundungen nach dem Personenstandsgesetz. ##### § 65 Unberührt bleibendes Bundesrecht Soweit in diesem Gesetz nichts anderes bestimmt ist, bleiben bundesrechtliche Vorschriften über Beurkundungen unberührt. #### b) - Landesrecht ##### § 66 Unberührt bleibendes Landesrecht (1) Unbeschadet der Zuständigkeit des Notars bleiben folgende landesrechtliche Vorschriften unberührt: 1. Vorschriften über die Beurkundung von freiwilligen Versteigerungen; dies gilt nicht für die freiwillige Versteigerung von Grundstücken und grundstücksgleichen Rechten; 2. Vorschriften über die Zuständigkeit zur Aufnahme von Inventaren, Bestandsverzeichnissen, Nachlaßverzeichnissen und anderen Vermögensverzeichnissen sowie zur Mitwirkung bei der Aufnahme solcher Vermögensverzeichnisse; 3. Vorschriften, nach denen die Gerichtsvollzieher zuständig sind, Wechsel- und Scheckproteste aufzunehmen sowie das tatsächliche Angebot einer Leistung zu beurkunden; 4. Vorschriften, nach denen die Amtsgerichte zuständig sind, außerhalb eines anhängigen Verfahrens die Aussagen von Zeugen und die Gutachten von Sachverständigen, die Vereidigung sowie eidesstattliche Versicherungen dieser Personen zu beurkunden; 5. Vorschriften, nach denen Beurkundungen in Fideikommißsachen, für die ein Kollegialgericht zuständig ist, durch einen beauftragten oder ersuchten Richter erfolgen können; 6. 
Vorschriften, nach denen die Vorstände der Vermessungsbehörden, die das amtliche Verzeichnis im Sinne des § 2 Abs. 2 der Grundbuchordnung führen, und die von den Vorständen beauftragten Beamten dieser Behörden zuständig sind, Anträge der Eigentümer auf Vereinigung oder Teilung von Grundstücken zu beurkunden oder zu beglaubigen; 7. Vorschriften über die Beurkundung der Errichtung fester Grenzzeichen (Abmarkung); 8. Vorschriften über die Beurkundung von Tatbeständen, die am Grund und Boden durch vermessungstechnische Ermittlungen festgestellt werden, durch Behörden, öffentlich bestellte Vermessungsingenieure oder Markscheider; 9. Vorschriften über Beurkundungen in Gemeinheitsteilungs- und agrarrechtlichen Ablösungsverfahren einschließlich der Rentenübernahme- und Rentengutsverfahren; 10. Vorschriften über Beurkundungen im Rückerstattungsverfahren; 11. Vorschriften über die Beglaubigung amtlicher Unterschriften zum Zwecke der Legalisation; 12. Vorschriften über Beurkundungen in Kirchenaustrittssachen. (2) Auf Grund dieser Vorbehalte können den Gerichten Beurkundungszuständigkeiten nicht neu übertragen werden. (3) Auf Grund anderer bundesrechtlicher Vorbehalte kann 1. die Zuständigkeit der Notare für öffentliche Beurkundungen (§ 20 der Bundesnotarordnung) nicht eingeschränkt werden, 2. nicht bestimmt werden, daß für öffentliche Beurkundungen neben dem Notar andere Urkundspersonen oder sonstige Stellen zuständig sind, und 3. keine Regelung getroffen werden, die den Vorschriften des Ersten bis Vierten Abschnitts dieses Gesetzes entgegensteht. (4) (weggefallen) ##### § 67 Zuständigkeit der Amtsgerichte, Zustellung (1) Unbeschadet der Zuständigkeit sonstiger Stellen sind die Amtsgerichte zuständig für die Beurkundung von 1. Erklärungen über die Anerkennung der Vaterschaft, 2. Verpflichtungen zur Erfüllung von Unterhaltsansprüchen eines Kindes, 3. Verpflichtungen zur Erfüllung von Unterhaltsansprüchen nach § 1615l des Bürgerlichen Gesetzbuchs. (2) Die Zustellung von Urkunden, die eine Verpflichtung nach Absatz 1 Nr. 2 oder 3 zum Gegenstand haben, kann auch dadurch vollzogen werden, daß der Schuldner eine beglaubigte Abschrift der Urkunde ausgehändigt erhält; § 174 Satz 2 und 3 der Zivilprozeßordnung gilt entsprechend. ##### § 68 Die Länder sind befugt, durch Gesetz die Zuständigkeit für die öffentliche Beglaubigung von Abschriften oder Unterschriften anderen Personen oder Stellen zu übertragen. ##### § 69 (weggefallen) #### c) - Amtliche Beglaubigungen ##### § 70 Dieses Gesetz gilt nicht für amtliche Beglaubigungen, mit denen eine Verwaltungsbehörde zum Zwecke der Verwendung in Verwaltungsverfahren oder für sonstige Zwecke, für die eine öffentliche Beglaubigung nicht vorgeschrieben ist, die Echtheit einer Unterschrift oder eines Handzeichens oder die Richtigkeit der Abschrift einer Urkunde bezeugt, die nicht von einer Verwaltungsbehörde ausgestellt ist. Die Beweiskraft dieser amtlichen Beglaubigungen beschränkt sich auf den in dem Beglaubigungsvermerk genannten Verwendungszweck. Die Befugnis der Verwaltungsbehörden, Abschriften ihrer eigenen Urkunden oder von Urkunden anderer Verwaltungsbehörden in der dafür vorgeschriebenen Form mit uneingeschränkter Beweiskraft zu beglaubigen, bleibt unberührt. #### d) - Eidesstattliche Versicherungen in Verwaltungsverfahren ##### § 71 Dieses Gesetz gilt nicht für die Aufnahme eidesstattlicher Versicherungen in Verwaltungsverfahren. 
#### e) - Erklärungen juristischer Personen des öffentlichen Rechts ##### § 72 Die bundes- oder landesrechtlich vorgeschriebene Beidrückung des Dienstsiegels bei Erklärungen juristischer Personen des öffentlichen Rechts wird durch die öffentliche Beurkundung ersetzt. #### f) - Bereits errichtete Urkunden ##### § 73 (1) §§ 45 bis 49, 51, 52, 54 dieses Gesetzes gelten auch für Urkunden, die vor dem Inkrafttreten dieses Gesetzes errichtet worden sind. Dies gilt auch, wenn die Beurkundungszuständigkeit weggefallen ist. (2) Eine vor dem Inkrafttreten dieses Gesetzes erteilte Ausfertigung einer Niederschrift ist auch dann als von Anfang an wirksam anzusehen, wenn sie den Vorschriften dieses Gesetzes genügt. (3) § 2256 Abs. 1, 2 des Bürgerlichen Gesetzbuchs gilt auch für Testamente, die vor dem Inkrafttreten dieses Gesetzes vor einem Richter errichtet worden sind. #### g) - Verweisungen ##### § 74 Soweit in Gesetzen oder Verordnungen auf die durch dieses Gesetz aufgehobenen oder abgeänderten Vorschriften verwiesen ist, treten die entsprechenden Vorschriften dieses Gesetzes an ihre Stelle. ### 2. - Geltung in Berlin #### § 75 (weggefallen) ### 3. - Übergangsvorschrift #### § 76 Übergangsvorschrift zur Einführung des Elektronischen Urkundenarchivs (1) Für Beurkundungen und sonstige Amtshandlungen, die vor dem 1. Januar 2022 vorgenommen worden sind, sind die §§ 55 und 56 nicht anzuwenden. Abweichend von § 49 Absatz 4 ist auf der Urschrift zu vermerken, wem und an welchem Tag eine Ausfertigung erteilt worden ist. Zusätze und Änderungen sind nach den vor dem 1. Januar 2022 geltenden Bestimmungen vorzunehmen. (2) Die Urkundensammlung und die Erbvertragssammlung für Urkunden, die vor dem 1. Januar 2022 errichtet wurden, werden von dem Notar nach Maßgabe der vor dem 1. Januar 2022 geltenden Vorschriften geführt und verwahrt. Zusätze und Änderungen sind nach den vor dem 1. Januar 2022 geltenden Bestimmungen vorzunehmen. (3) Für Verwahrungsmassen, die der Notar vor dem 1. Januar 2022 entgegengenommen hat, ist § 59a vorbehaltlich der Sätze 3 bis 5 nicht anzuwenden. Für diese Verwahrungsmassen werden die Verwahrungsbücher, die Massenbücher, die Namensverzeichnisse zum Massenbuch und die Anderkontenlisten nach den vor dem 1. Januar 2022 geltenden Bestimmungen geführt und verwahrt. Der Notar kann jedoch zum Schluss eines Kalenderjahres alle Verwahrungsmassen im Sinne des Satzes 1 in das Verwahrungsverzeichnis übernehmen und insoweit die Verzeichnisführung nach den vor dem 1. Januar 2022 geltenden Bestimmungen abschließen. Dazu sind für die zu übernehmenden Verwahrungsmassen die nach den Vorschriften des Abschnitts 3 der Verordnung über die Führung notarieller Akten und Verzeichnisse erforderlichen Angaben in das Verwahrungsverzeichnis einzutragen. Dabei sind sämtliche in den Massenbüchern und Verwahrungsbüchern verzeichneten Eintragungen zu übernehmen. (4) Die Urkundenrollen, die Erbvertragsverzeichnisse und die Namensverzeichnisse zur Urkundenrolle für Urkunden, die vor dem 1. Januar 2022 errichtet wurden, werden von dem Notar nach Maßgabe der vor dem 1. Januar 2022 geltenden Vorschriften geführt und verwahrt. (5) Für Beurkundungen und sonstige Amtshandlungen, die vom 1. Januar bis zum 30. Juni 2022 vorgenommen werden, gilt § 55 Absatz 2 nur im Hinblick auf das Urkundenverzeichnis und sind § 55 Absatz 3 sowie § 56 nicht anzuwenden. Im Übrigen gelten für die vom 1. Januar bis zum 30. Juni 2022 vorgenommenen Beurkundungen und sonstigen Amtshandlungen Absatz 1 Satz 3 und Absatz 2 entsprechend. 
#### Schlußformel Die verfassungsmäßigen Rechte des Landes Baden-Württemberg aus Artikel 138 des Grundgesetzes sind gewahrt.
36.64799
146
0.816056
deu_Latn
0.999723
0c1dcc0e77c0a52a3c38fc8c3c1f56c7e66b8202
4,323
md
Markdown
README.md
ILUMY/react-native-autocomplete-input
b9d66dbea19098a8b70c268043bd2722c0454a80
[ "MIT" ]
6
2018-07-06T02:00:32.000Z
2021-04-04T13:16:43.000Z
README.md
ILUMY/react-native-autocomplete-input
b9d66dbea19098a8b70c268043bd2722c0454a80
[ "MIT" ]
1
2018-07-12T00:29:51.000Z
2018-07-12T00:31:37.000Z
README.md
ILUMY/react-native-autocomplete-input
b9d66dbea19098a8b70c268043bd2722c0454a80
[ "MIT" ]
3
2018-08-08T12:32:39.000Z
2019-08-02T11:12:16.000Z
# react-native-autocomplete-input [![npm version](https://badge.fury.io/js/react-native-autocomplete-input.svg)](https://badge.fury.io/js/react-native-autocomplete-input) [![Build Status](https://travis-ci.org/l-urence/react-native-autocomplete-input.svg)](https://travis-ci.org/l-urence/react-native-autocomplete-input)

A pure JS autocomplete component for React Native. Use this component in your own projects or use it as inspiration to build your own autocomplete.

![Autocomplete Example](https://raw.githubusercontent.com/l-urence/react-native-autocomplete-input/master/example.gif)

## How to use react-native-autocomplete-input

Tested with RN >= 0.26.2. If you want to use RN < 0.26, try to install react-native-autocomplete-input <= 0.0.5.

### Installation

```shell
$ npm install --save react-native-autocomplete-input
```

or install HEAD from github.com:

```shell
$ npm install --save l-urence/react-native-autocomplete-input
```

### Example

```javascript
// ...
render() {
  const { query } = this.state;
  const data = this._filterData(query)
  return (
    <Autocomplete
      data={data}
      defaultValue={query}
      onChangeText={text => this.setState({ query: text })}
      renderItem={item => (
        <TouchableOpacity onPress={() => this.setState({ query: item })}>
          <Text>{item}</Text>
        </TouchableOpacity>
      )}
    />
  );
}
// ...
```

A complete example for Android and iOS can be found [here](//github.com/l-urence/react-native-autocomplete-input/blob/master/example/).

### Android

Android does not support overflows ([#20](https://github.com/l-urence/react-native-autocomplete-input/issues/20)); for that reason it is necessary to wrap the autocomplete in an *absolutely* positioned view on Android. This will allow the suggestion list to overlap other views inside your component.

```javascript
//...
render() {
  return (
    <View>
      <View style={styles.autocompleteContainer}>
        <Autocomplete {/* your props */} />
      </View>
      <View>
        <Text>Some content</Text>
      </View>
    </View>
  );
}
//...

const styles = StyleSheet.create({
  autocompleteContainer: {
    flex: 1,
    left: 0,
    position: 'absolute',
    right: 0,
    top: 0,
    zIndex: 1
  }
});
```

### Props

| Prop | Type | Description |
| :------------ |:---------------:| :-----|
| containerStyle | style | These styles will be applied to the container which surrounds the autocomplete component. |
| hideResults | bool | Set to `true` to hide the suggestion list. |
| data | array | An array with suggestion items to be rendered in `renderItem(item)`. Any array with length > 0 will open the suggestion list and any array with length < 1 will hide the list. |
| inputContainerStyle | style | These styles will be applied to the container which surrounds the textInput component. |
| listContainerStyle | style | These styles will be applied to the container which surrounds the result list. |
| listStyle | style | These styles will be applied to the result list. |
| onShowResult | function | `onShowResult` will be called when the autocomplete suggestions appear or disappear. |
| onStartShouldSetResponderCapture | function | `onStartShouldSetResponderCapture` will be passed to the result list view container ([onStartShouldSetResponderCapture](https://facebook.github.io/react-native/docs/gesture-responder-system.html#capture-shouldset-handlers)). |
| renderItem | function | `renderItem` will be called to render the data objects which will be displayed in the result view below the text input. |
| renderSeparator | function | `renderSeparator` will be called to render the list separators which will be displayed between the list elements in the result view below the text input. |
| renderTextInput | function | Renders a custom TextInput. All props are passed to this function. |

## Known issues

* By default the autocomplete will not behave as expected inside a `<ScrollView />`. Set the scroll view's prop to fix this: `keyboardShouldPersistTaps={true}` for RN <= 0.39, or `keyboardShouldPersistTaps='always'` for RN >= 0.40 ([#5](https://github.com/l-urence/react-native-autocomplete-input/issues/5)).
* If you want to test with Jest, add `jest.mock('react-native-autocomplete-input', () => 'Autocomplete');` to your test.

## Contribute

Feel free to open issues or do a PR!
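For completeness, here is one way the pieces above could fit together. This is an illustrative sketch rather than an official example from this README: it combines the absolutely positioned wrapper recommended for Android with the filtering pattern from the Example section, and the film list plus the filter logic are invented purely for illustration.

```javascript
// Illustrative sketch only: FILMS and the filter are invented for this example
// and are not part of the library.
import React, { Component } from 'react';
import { StyleSheet, Text, TouchableOpacity, View } from 'react-native';
import Autocomplete from 'react-native-autocomplete-input';

const FILMS = ['Alien', 'Aliens', 'Blade Runner'];

export default class FilmSearch extends Component {
  state = { query: '' };

  render() {
    const { query } = this.state;
    // An empty query (or an exact match) yields an empty array, which hides the list.
    const data = FILMS.filter(
      film => query.length > 0 && film !== query &&
        film.toLowerCase().startsWith(query.toLowerCase())
    );

    return (
      <View>
        <View style={styles.autocompleteContainer}>
          <Autocomplete
            data={data}
            defaultValue={query}
            onChangeText={text => this.setState({ query: text })}
            renderItem={item => (
              <TouchableOpacity onPress={() => this.setState({ query: item })}>
                <Text>{item}</Text>
              </TouchableOpacity>
            )}
          />
        </View>
        <Text>Some content below the input</Text>
      </View>
    );
  }
}

const styles = StyleSheet.create({
  // Absolute positioning lets the suggestion list overlap the sibling content on Android.
  autocompleteContainer: { flex: 1, left: 0, position: 'absolute', right: 0, top: 0, zIndex: 1 },
});
```

If the component ends up inside a `<ScrollView />`, remember to also set `keyboardShouldPersistTaps` on the scroll view as described in the Known issues above.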
41.567308
309
0.713625
eng_Latn
0.930589
0c1dfaa34dc32e89ba8a8a6daf1d1640738cda21
1,860
md
Markdown
_posts/Beakjoon/1000-2000/2021-02-17-1193.md
akfn20/github-blog
b12d178f846de8ebc85789c19abbc3924d1bac77
[ "MIT" ]
3
2021-06-29T15:54:21.000Z
2022-03-22T15:41:26.000Z
_posts/Beakjoon/1000-2000/2021-02-17-1193.md
akfn20/github-blog
b12d178f846de8ebc85789c19abbc3924d1bac77
[ "MIT" ]
5
2020-07-01T14:41:15.000Z
2020-10-15T16:04:29.000Z
_posts/Beakjoon/1000-2000/2021-02-17-1193.md
akfn20/github-blog
b12d178f846de8ebc85789c19abbc3924d1bac77
[ "MIT" ]
29
2021-03-27T07:47:57.000Z
2022-03-31T05:47:22.000Z
--- title: "[Beakjoon] 1193번 분수찾기" categories: - Beakjoon tags: - algorithm - beakjoon toc: true toc_sticky: true toc_label: "1193번 분수찾기" toc_icon: "sticky-note" --- 📣<br> **Beakjoon**에서 PASS된 코드만 업데이트합니다.<br> 알고리즘을 먼저 풀이하는 언어(Java)가 정해져있어, 풀이 언어(Python, C++, Java)가 모두 업데이트될 때까지는 시간이 걸릴 수 있습니다. {: .notice--primary} # 문제 제시 --- <br> <b><u><span style="font-size:20px">문제</span></u></b> 무한히 큰 배열에 다음과 같이 분수들이 적혀있다.<br> 1/1&ensp;1/2&ensp;1/3&ensp;1/4&ensp;1/5&ensp;&ensp;…<br> 2/1&ensp;2/2&ensp;2/3&ensp;2/4&ensp;&ensp;…&ensp;&ensp;…<br> 3/1&ensp;3/2&ensp;3/3&ensp;&ensp;…&ensp;&ensp;…&ensp;&ensp;…<br> 4/1&ensp;4/2&ensp;&ensp;…&ensp;&ensp;…&ensp;&ensp;…&ensp;&ensp;…<br> 5/1&ensp;…&ensp;&ensp;…&ensp;&ensp;…&ensp;&ensp;…&ensp;&ensp;…<br> …&ensp;&ensp;…&ensp;&ensp;…&ensp;&ensp;…&ensp;&ensp;…&ensp;&ensp;…<br> 이와 같이 나열된 분수들을 1/1 -> 1/2 -> 2/1 -> 3/1 -> 2/2 -> … 과 같은 지그재그 순서로 차례대로 1번, 2번, 3번, 4번, 5번, … 분수라고 하자. X가 주어졌을 때, X번째 분수를 구하는 프로그램을 작성하시오.<br> <br> <b><u><span style="font-size:20px">입력</span></u></b> 첫째 줄에 X(1 ≤ X ≤ 10,000,000)가 주어진다. <br> <b><u><span style="font-size:20px">출력</span></u></b> 첫째 줄에 분수를 출력한다. <br> <br> # 문제 풀이 --- ![image](https://user-images.githubusercontent.com/45550607/108091636-4003fe00-70bf-11eb-9d62-353c1254b41d.png){: .image-center} 분자와 분모가 각 행과 열의 특징을 그대로 나타내고 있습니다.<br> 따라서 모든 배열의 내용을 할당하고 탐색하는 것이 아닌 i,j의 특징을 찾아서 바로 표현하면 됩니다.<br> 또한 전체적으로 순서대로 진행되면서, 증가/감소 여부만 다르고 동일한 제한 조건(인덱스가 1 이상)을 가집니다.<br> 이를 스위치 변수로 삼아 구현하면 됩니다.<br> <br> 스위치 변수를 T/F로 하는 것이 아닌 1/-1로 하면 이를 바로 연산하면 됩니다.<br> 즉, 증가할 때와 감소할 때의 코드를 별개로 짤 필요가 없는 것이죠.<br> <br> <br> # 문제 코드 ## Java <script src="https://gist.github.com/eona1301/b6242396be3fadf0470479bb05d943c2.js"></script> <br> <br> # 제출 결과 ![image](https://user-images.githubusercontent.com/45550607/108090746-4fcf1280-70be-11eb-8745-f30fbafc9c12.png){: .image-center}
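For readers skimming in English: the following is an illustrative Java sketch of the diagonal walk described above (locate the diagonal that contains X, then swap numerator and denominator depending on whether the diagonal index is even or odd). It is a rewrite for illustration only, not the PASSed solution embedded in the gist.

```java
import java.util.Scanner;

public class Main {
    public static void main(String[] args) {
        long x = new Scanner(System.in).nextLong();

        // Find the diagonal d that contains the x-th fraction:
        // diagonals 1..d together hold d*(d+1)/2 fractions.
        long d = 1;
        while (d * (d + 1) / 2 < x) {
            d++;
        }

        // 1-based position of x inside diagonal d.
        long k = x - d * (d - 1) / 2;

        // Even diagonals are walked with the numerator increasing,
        // odd diagonals the other way; this is the +1/-1 "switch" idea.
        long numerator, denominator;
        if (d % 2 == 0) {
            numerator = k;
            denominator = d + 1 - k;
        } else {
            numerator = d + 1 - k;
            denominator = k;
        }

        System.out.println(numerator + "/" + denominator);
    }
}
```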
22.682927
128
0.645699
kor_Hang
0.996721
0c1e5e565ab8faafe2a7fd5804d4a16cef9efad4
772
md
Markdown
2020/CVE-2020-3993.md
justinforbes/cve
375c65312f55c34fc1a4858381315fe9431b0f16
[ "MIT" ]
2,340
2022-02-10T21:04:40.000Z
2022-03-31T14:42:58.000Z
2020/CVE-2020-3993.md
justinforbes/cve
375c65312f55c34fc1a4858381315fe9431b0f16
[ "MIT" ]
19
2022-02-11T16:06:53.000Z
2022-03-11T10:44:27.000Z
2020/CVE-2020-3993.md
justinforbes/cve
375c65312f55c34fc1a4858381315fe9431b0f16
[ "MIT" ]
280
2022-02-10T19:58:58.000Z
2022-03-26T11:13:05.000Z
### [CVE-2020-3993](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-3993)
![](https://img.shields.io/static/v1?label=Product&message=VMware%20NSX-T&color=blue)
![](https://img.shields.io/static/v1?label=Version&message=n%2Fa&color=blue)
![](https://img.shields.io/static/v1?label=Vulnerability&message=MITM%20vulnerability&color=brighgreen)

### Description

VMware NSX-T (3.x before 3.0.2, 2.5.x before 2.5.2.2.0) contains a security vulnerability that exists in the way it allows a KVM host to download and install packages from NSX manager. A malicious actor with MITM positioning may be able to exploit this issue to compromise the transport node.

### POC

#### Reference
No PoCs from references.

#### Github
- https://github.com/alphaSeclab/sec-daily-2020
42.888889
292
0.751295
eng_Latn
0.659067
0c1ea0a99e1b70899295ad8c4d4d124cf7911ead
4,604
markdown
Markdown
_posts/2012-04-23-Pseudo-terminals.markdown
ibutra/skeeto.github.com
4e61a64859561d4a4c8207ceb1242ebfdec31280
[ "Unlicense" ]
42
2016-12-27T14:04:02.000Z
2022-03-18T23:22:10.000Z
_posts/2012-04-23-Pseudo-terminals.markdown
ibutra/skeeto.github.com
4e61a64859561d4a4c8207ceb1242ebfdec31280
[ "Unlicense" ]
13
2017-01-31T21:43:56.000Z
2022-03-08T17:19:13.000Z
_posts/2012-04-23-Pseudo-terminals.markdown
ibutra/skeeto.github.com
4e61a64859561d4a4c8207ceb1242ebfdec31280
[ "Unlicense" ]
24
2016-04-18T15:07:02.000Z
2022-03-08T16:55:25.000Z
--- title: Pseudo-terminals layout: post tags: [c, posix] uuid: 269799fd-3a67-3a22-433a-c5224447e614 --- My dad recently had an interesting problem at work related to serial ports. Since I use serial ports at work, he asked me for advice. They have third-party software which reads and analyzes sensor data from the serial port. It's the only method this program has of inputting a stream of data and they're unable to patch it. Unfortunately, they have another piece of software that needs to massage the data before this final program gets it. The data needs to be intercepted coming on the serial port somehow. ![](/img/diagram/pseudo-terminals.png) The solution they were aiming for was to create a pair of virtual serial ports. The filter software would read data in on the real serial port, output the filtered data into a virtual serial port which would be virtually connected to a second virtual serial port. The analysis software would then read from this second serial port. They couldn't figure out how to set this up, short of buying a couple of USB/serial port adapters and plugging them into each other. It turns out this is very easy to do on Unix-like systems. POSIX defines two functions, `posix_openpt(3)` and `ptsname(3)`. The first one creates a pseudo-terminal — a virtual serial port — and returns a "master" *file descriptor* used to talk to it. The second provides the name of the pseudo-terminal device on the filesystem, usually named something like `/dev/pts/5`. ~~~c #define _GNU_SOURCE #include <stdio.h> #include <stdlib.h> #include <fcntl.h> int main() { int fd = posix_openpt(O_RDWR | O_NOCTTY); printf("%s\n", ptsname(fd)); /* ... read and write to fd ... */ return 0; } ~~~ The printed device name can be opened by software that's expecting to access a serial port, such as [minicom](http://en.wikipedia.org/wiki/Minicom), and it can be communicated with as if by a pipe. This could be useful in testing a program's serial port communication logic virtually. The reason for the unusually long name is because the function wasn't added to POSIX until 1998 (Unix98). They were probably afraid of name collisions with software already using `openpt()` as a function name. The GNU C Library provides an extension `getpt(3)`, which is just shorthand for the above. ~~~c int fd = getpt(); ~~~ Pseudo-terminal functionality was available much earlier, of course. It could be done through the poorly designed `openpty(3)`, added in BSD Unix. ~~~c int openpty(int *amaster, int *aslave, char *name, const struct termios *termp, const struct winsize *winp); ~~~ It accepts `NULL` for the last three arguments, allowing the user to ignore them. What makes it so bad is that string `name`. The user would pass it a chunk of allocated space and hope it was long enough for the file name. If not, `openpty()` would overwrite the end of the string and trash some memory. It's highly unlikely to ever exceed something like 32 bytes, but it's still a correctness problem. The newer `ptsname()` is only slightly better however. It returns a string that doesn't need to be `free()`d, because it's static memory. However, that means the function is not re-entrant; it has issues in multi-threaded programs, since that string could be trashed at any instant by another call to `ptsname()`. Consider this case, ~~~c int fd0 = getpt(); int fd1 = getpt(); printf("%s %s\n", ptsname(fd0), ptsname(fd1)); ~~~ `ptsname()` will be returning the same `char *` pointer each time it's called, merely filling the pointed-to space before returning. 
Rather than printing two different device filenames, the above would print the same filename twice. The GNU C Library provides an extension to correct this flaw, as `ptsname_r()`, where the user provides the memory as before but also indicates its maximum size. To make a one-way virtual connection between our pseudo-terminals, create two of them and do the typical buffer thing between the file descriptors (for succinctness, no checking for errors), ~~~c while (1) { char buffer; int in = read(pt0, &buffer, 1); write(pt1, &buffer, in); } ~~~ Making a two-way connection would require the use of threads or `select(2)`, but it wouldn't be much more complicated. While all this was new and interesting to me, it didn't help my dad at all because they're using Windows. These functions don't exist there and creating virtual serial ports is a highly non-trivial, less-interesting process. Buying the two adapters and connecting them together is my recommended solution for Windows.
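The post stops at the one-way copy loop, so here is an illustrative sketch (not from the original article) of the two-way bridge it mentions, built on `select(2)`. The `grantpt()`/`unlockpt()` calls are an assumption about the fuller setup, since another program normally needs them before it can open the slave device, and error handling is omitted for brevity.

~~~c
/* Illustrative two-way bridge between two pseudo-terminals. */
#define _GNU_SOURCE
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <sys/select.h>

static void forward(int from, int to)
{
    char buffer[256];
    ssize_t n = read(from, buffer, sizeof(buffer));
    if (n > 0)
        write(to, buffer, (size_t)n);
}

int main(void)
{
    int pt0 = getpt();
    int pt1 = getpt();
    grantpt(pt0);
    unlockpt(pt0);
    grantpt(pt1);
    unlockpt(pt1);

    /* Two separate printf() calls, because ptsname() reuses a static buffer
     * as described above. */
    printf("%s\n", ptsname(pt0));
    printf("%s\n", ptsname(pt1));

    while (1) {
        fd_set fds;
        FD_ZERO(&fds);
        FD_SET(pt0, &fds);
        FD_SET(pt1, &fds);
        int maxfd = (pt0 > pt1 ? pt0 : pt1) + 1;
        /* Wait until either side has data, then copy it across. */
        if (select(maxfd, &fds, NULL, NULL, NULL) > 0) {
            if (FD_ISSET(pt0, &fds))
                forward(pt0, pt1);
            if (FD_ISSET(pt1, &fds))
                forward(pt1, pt0);
        }
    }
    return 0;
}
~~~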
38.049587
70
0.754344
eng_Latn
0.999585
0c1f4b66f33368cc601b7dc4ebbcb56f57232538
5,415
md
Markdown
microApi/content/distributed.md
crufter/docuapi
aa150e83f8e5037cacf6585ba27b8ff75e985593
[ "Apache-2.0" ]
1
2020-08-13T15:04:43.000Z
2020-08-13T15:04:43.000Z
microApi/content/distributed.md
crufter/docuapi
aa150e83f8e5037cacf6585ba27b8ff75e985593
[ "Apache-2.0" ]
null
null
null
microApi/content/distributed.md
crufter/docuapi
aa150e83f8e5037cacf6585ba27b8ff75e985593
[ "Apache-2.0" ]
null
null
null
---
weight: 11
title: distributed
---

# distributed

## DistributedNotes.CreateNote

```go
package main

import (
    "fmt"

    "github.com/micro/clients/go/client"
    distributed_proto "github.com/micro/services/distributed/proto"
)

func main() {
    c := client.NewClient(nil)

    req := distributed_proto.CreateNoteRequest{}
    rsp := distributed_proto.CreateNoteResponse{}
    if err := c.Call("go.micro.srv.distributed", "DistributedNotes.CreateNote", req, &rsp); err != nil {
        fmt.Println(err)
        return
    }
    fmt.Println(rsp)
}
```

```javascript
// To install "npm install --save @microhq/ng-client"
import { Component, OnInit } from "@angular/core";
import { ClientService } from "@microhq/ng-client";

@Component({
  selector: "app-example",
  templateUrl: "./example.component.html",
  styleUrls: ["./example.component.css"]
})
export class ExampleComponent implements OnInit {
  constructor(private mc: ClientService) {}

  ngOnInit() {
    this.mc
      .call("go.micro.srv.distributed", "DistributedNotes.CreateNote", {})
      .then((response: any) => {
        console.log(response)
      });
  }
}
```

### Request Parameters

Name | Type | Description
--------- | --------- | ---------
note | Note |

### Response Parameters

Name | Type | Description
--------- | --------- | ---------
note | Note |

<aside class="success">
Remember — a happy kitten is an authenticated kitten!
</aside>

## DistributedNotes.DeleteNote

```go
package main

import (
    "fmt"

    "github.com/micro/clients/go/client"
    distributed_proto "github.com/micro/services/distributed/proto"
)

func main() {
    c := client.NewClient(nil)

    req := distributed_proto.DeleteNoteRequest{}
    rsp := distributed_proto.DeleteNoteResponse{}
    if err := c.Call("go.micro.srv.distributed", "DistributedNotes.DeleteNote", req, &rsp); err != nil {
        fmt.Println(err)
        return
    }
    fmt.Println(rsp)
}
```

```javascript
// To install "npm install --save @microhq/ng-client"
import { Component, OnInit } from "@angular/core";
import { ClientService } from "@microhq/ng-client";

@Component({
  selector: "app-example",
  templateUrl: "./example.component.html",
  styleUrls: ["./example.component.css"]
})
export class ExampleComponent implements OnInit {
  constructor(private mc: ClientService) {}

  ngOnInit() {
    this.mc
      .call("go.micro.srv.distributed", "DistributedNotes.DeleteNote", {})
      .then((response: any) => {
        console.log(response)
      });
  }
}
```

### Request Parameters

Name | Type | Description
--------- | --------- | ---------
note | Note |

### Response Parameters

Name | Type | Description
--------- | --------- | ---------

<aside class="success">
Remember — a happy kitten is an authenticated kitten!
</aside>

## DistributedNotes.ListNotes

```go
package main

import (
    "fmt"

    "github.com/micro/clients/go/client"
    distributed_proto "github.com/micro/services/distributed/proto"
)

func main() {
    c := client.NewClient(nil)

    req := distributed_proto.ListNotesRequest{}
    rsp := distributed_proto.ListNotesResponse{}
    if err := c.Call("go.micro.srv.distributed", "DistributedNotes.ListNotes", req, &rsp); err != nil {
        fmt.Println(err)
        return
    }
    fmt.Println(rsp)
}
```

```javascript
// To install "npm install --save @microhq/ng-client"
import { Component, OnInit } from "@angular/core";
import { ClientService } from "@microhq/ng-client";

@Component({
  selector: "app-example",
  templateUrl: "./example.component.html",
  styleUrls: ["./example.component.css"]
})
export class ExampleComponent implements OnInit {
  constructor(private mc: ClientService) {}

  ngOnInit() {
    this.mc
      .call("go.micro.srv.distributed", "DistributedNotes.ListNotes", {})
      .then((response: any) => {
        console.log(response)
      });
  }
}
```

### Request Parameters

Name | Type | Description
--------- | --------- | ---------

### Response Parameters

Name | Type | Description
--------- | --------- | ---------
notes | Note |

<aside class="success">
Remember — a happy kitten is an authenticated kitten!
</aside>

## DistributedNotes.UpdateNote

```go
package main

import (
    "fmt"

    "github.com/micro/clients/go/client"
    distributed_proto "github.com/micro/services/distributed/proto"
)

func main() {
    c := client.NewClient(nil)

    req := distributed_proto.UpdateNoteRequest{}
    rsp := distributed_proto.UpdateNoteResponse{}
    if err := c.Call("go.micro.srv.distributed", "DistributedNotes.UpdateNote", req, &rsp); err != nil {
        fmt.Println(err)
        return
    }
    fmt.Println(rsp)
}
```

```javascript
// To install "npm install --save @microhq/ng-client"
import { Component, OnInit } from "@angular/core";
import { ClientService } from "@microhq/ng-client";

@Component({
  selector: "app-example",
  templateUrl: "./example.component.html",
  styleUrls: ["./example.component.css"]
})
export class ExampleComponent implements OnInit {
  constructor(private mc: ClientService) {}

  ngOnInit() {
    this.mc
      .call("go.micro.srv.distributed", "DistributedNotes.UpdateNote", {})
      .then((response: any) => {
        console.log(response)
      });
  }
}
```

### Request Parameters

Name | Type | Description
--------- | --------- | ---------
note | Note |

### Response Parameters

Name | Type | Description
--------- | --------- | ---------

<aside class="success">
Remember — a happy kitten is an authenticated kitten!
</aside>
18.867596
102
0.642475
eng_Latn
0.304956
0c2037b6fefad800ab9b5caa9d86655f17747a23
11,139
md
Markdown
articles/communication-services/quickstarts/voice-video-calling/download-recording-file-sample.md
Tanksta/azure-docs.de-de
3fade99b63ec5fb175e7b72d9f49c28a25a183cc
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/communication-services/quickstarts/voice-video-calling/download-recording-file-sample.md
Tanksta/azure-docs.de-de
3fade99b63ec5fb175e7b72d9f49c28a25a183cc
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/communication-services/quickstarts/voice-video-calling/download-recording-file-sample.md
Tanksta/azure-docs.de-de
3fade99b63ec5fb175e7b72d9f49c28a25a183cc
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: 'Aufzeichnen und Herunterladen von Anrufen mit Event Grid: ein Azure Communication Services-Schnellstart' titleSuffix: An Azure Communication Services quickstart description: In dieser Schnellstartanleitung erfahren Sie, wie Sie Anrufe mithilfe von Event Grid aufzeichnen und herunterladen. author: joseys manager: anvalent services: azure-communication-services ms.author: joseys ms.date: 06/30/2021 ms.topic: overview ms.service: azure-communication-services ms.openlocfilehash: e7a114c5ce31ff4df96648ba2545c2222ba4893d ms.sourcegitcommit: 98308c4b775a049a4a035ccf60c8b163f86f04ca ms.translationtype: HT ms.contentlocale: de-DE ms.lasthandoff: 06/30/2021 ms.locfileid: "113111607" --- # <a name="record-and-download-calls-with-event-grid"></a>Aufzeichnen und Herunterladen von Anrufen mit Event Grid [!INCLUDE [Private Preview Notice](../../includes/private-preview-include.md)] Führen Sie erste Schritte mit Azure Communication Services aus, indem Sie Ihre Communication Services-Anrufe mithilfe von Azure Event Grid aufzeichnen. ## <a name="prerequisites"></a>Voraussetzungen - Ein Azure-Konto mit einem aktiven Abonnement. Sie können [kostenlos ein Konto erstellen](https://azure.microsoft.com/free/?WT.mc_id=A261C142F). - Eine aktive Communication Services-Ressource. [Erstellen Sie eine Communication Services-Ressource](../create-communication-resource.md?pivots=platform-azp&tabs=windows). - Das NuGet-Paket [`Microsoft.Azure.EventGrid`](https://www.nuget.org/packages/Microsoft.Azure.EventGrid/) ## <a name="create-a-webhook-and-subscribe-to-the-recording-events"></a>Erstellen eines Webhooks und Abonnieren der Aufzeichnungsereignisse Sie verwenden *Webhooks* und *Ereignisse*, um die Anrufaufzeichnung und Mediendateidownloads zu vereinfachen. Sie erstellen zunächst einen Webhook. Ihre Communication Services-Ressource benachrichtigt mithilfe von Event Grid diesen Webhook, wenn das `recording`-Ereignis ausgelöst wird, und dann erneut, wenn aufgezeichnete Medien heruntergeladen werden können. Sie können Ihren eigenen benutzerdefinierten Webhook schreiben, um diese Ereignisbenachrichtigungen zu empfangen. Es ist wichtig, dass dieser Webhook mit dem Validierungscode auf eingehende Nachrichten reagiert, um den Webhook erfolgreich für den Ereignisdienst zu abonnieren. ```csharp [HttpPost] public async Task<ActionResult> PostAsync([FromBody] object request) { //Deserializing the request var eventGridEvent = JsonConvert.DeserializeObject<EventGridEvent[]>(request.ToString()) .FirstOrDefault(); var data = eventGridEvent.Data as JObject; // Validate whether EventType is of "Microsoft.EventGrid.SubscriptionValidationEvent" if (string.Equals(eventGridEvent.EventType, EventTypes.EventGridSubscriptionValidationEvent, StringComparison.OrdinalIgnoreCase)) { var eventData = data.ToObject<SubscriptionValidationEventData>(); var responseData = new SubscriptionValidationResponseData { ValidationResponse = eventData.ValidationCode }; if (responseData.ValidationResponse != null) { return Ok(responseData); } } // Implement your logic here. ... ... } ``` Der obige Code hängt vom NuGet-Paket `Microsoft.Azure.EventGrid` ab. Weitere Informationen zur Event Grid-Endpunktüberprüfung finden Sie in der [Dokumentation zur Endpunktüberprüfung](../../../event-grid/receive-events.md#endpoint-validation). Anschließend abonnieren Sie diesen Webhook für das `recording`-Ereignis: 1. Wählen Sie in Ihrer Azure Communication Services-Ressource das Blatt `Events` aus. 2. 
Wählen Sie wie nachfolgend gezeigt `Event Subscription` aus: ![Screenshot: Event Grid-Benutzeroberfläche](./media/call-recording/image1-event-grid.png) 3. Konfigurieren Sie das Ereignisabonnement, und wählen Sie `Call Recording File Status Update` als `Event Type` aus. Wählen Sie `Webhook` als `Endpoint type` aus. ![Erstellen eines Ereignisabonnements](./media/call-recording/image2-create-subscription.png) 4. Geben Sie unter `Subscriber Endpoint` die URL Ihres Webhooks ein. ![Abonnieren des Ereignisses](./media/call-recording/image3-subscribe-to-event.png) Ihr Webhook wird nun benachrichtigt, wenn die Communication Services-Ressource zum Aufzeichnen eines Anrufs verwendet wird. ## <a name="notification-schema"></a>Benachrichtigungsschema Wenn die Aufzeichnung zum Download zur Verfügung steht, gibt die Communication Services-Ressource eine Benachrichtigung mit dem folgenden Ereignisschema aus. Die Dokument-IDs für die Aufzeichnung können aus den `documentId`-Feldern der einzelnen `recordingChunk`-Elemente abgerufen werden. ```json { "id": string, // Unique guid for event "topic": string, // Azure Communication Services resource id "subject": string, // /recording/call/{call-id} "data": { "recordingStorageInfo": { "recordingChunks": [ { "documentId": string, // Document id for retrieving from AMS storage "index": int, // Index providing ordering for this chunk in the entire recording "endReason": string, // Reason for chunk ending: "SessionEnded", "ChunkMaximumSizeExceeded”, etc. } ] }, "recordingStartTime": string, // ISO 8601 date time for the start of the recording "recordingDurationMs": int, // Duration of recording in milliseconds "sessionEndReason": string // Reason for call ending: "CallEnded", "InitiatorLeft”, etc. }, "eventType": string, // "Microsoft.Communication.RecordingFileStatusUpdated" "dataVersion": string, // "1.0" "metadataVersion": string, // "1" "eventTime": string // ISO 8601 date time for when the event was created } ``` ## <a name="download-the-recorded-media-files"></a>Herunterladen der aufgezeichneten Mediendateien Wenn Sie die Dokument-ID für die Datei abgerufen haben, die Sie herunterladen möchten, rufen Sie die folgenden Azure Communication Services-APIs auf, um die aufgezeichneten Medien und Metadaten mithilfe der HMAC-Authentifizierung herunterzuladen. Die maximale Größe der Aufzeichnungsdatei beträgt 1,5 GB. Wenn diese Dateigröße überschritten wird, teilt der Rekorder aufgezeichnete Medien automatisch in mehrere Dateien auf. Der Client sollte in der Lage sein, alle Mediendateien mit einer einzelnen Anforderung herunterzuladen. Bei einem Problem kann der Client den Vorgang mit einem Bereichsheader wiederholen, um das erneute Herunterladen bereits heruntergeladener Segmente zu vermeiden. So laden Sie aufgezeichnete Medien herunter: - Methode: `GET` - URL: https://contoso.communication.azure.com/recording/download/{documentId}?api-version=2021-04-15-preview1 So laden Sie Metadaten aufgezeichneter Medien herunter: - Methode: `GET` - URL: https://contoso.communication.azure.com/recording/download/{documentId}/metadata?api-version=2021-04-15-preview1 ### <a name="authentication"></a>Authentifizierung Verwenden Sie zum Herunterladen aufgezeichneter Medien und Metadaten die HMAC-Authentifizierung, um die Anforderung für Azure Communication Services-APIs zu authentifizieren. 
Erstellen Sie ein `HttpClient`-Element, und fügen Sie die erforderlichen Header mithilfe des unten angegebenen `HmacAuthenticationUtils`-Elements hinzu: ```csharp var client = new HttpClient(); // Set Http Method var method = HttpMethod.Get; StringContent content = null; // Build request var request = new HttpRequestMessage { Method = method, // Http GET method RequestUri = new Uri(<Download_Recording_Url>), // Download recording Url Content = content // content if required for POST methods }; // Question: Why do we need to pass String.Empty to CreateContentHash() method? // Answer: In HMAC authentication, the hash of the content is one of the parameters used to generate the HMAC token. // In our case our recording download APIs are GET methods and do not have any content/body to be passed in the request. // However in this case we still need the SHA256 hash for the empty content and hence we pass an empty string. string serializedPayload = string.Empty; // Hash the content of the request. var contentHashed = HmacAuthenticationUtils.CreateContentHash(serializedPayload); // Add HMAC headers. HmacAuthenticationUtils.AddHmacHeaders(request, contentHashed, accessKey, method); // Make a request to the Azure Communication Services APIs mentioned above var response = await client.SendAsync(request).ConfigureAwait(false); ``` #### <a name="hmacauthenticationutils"></a>HmacAuthenticationUtils Die folgenden Hilfsprogramme können zum Verwalten Ihres HMAC-Workflows verwendet werden: **Erstell des Inhaltshashs** ```csharp public static string CreateContentHash(string content) { var alg = SHA256.Create(); using (var memoryStream = new MemoryStream()) using (var contentHashStream = new CryptoStream(memoryStream, alg, CryptoStreamMode.Write)) { using (var swEncrypt = new StreamWriter(contentHashStream)) { if (content != null) { swEncrypt.Write(content); } } } return Convert.ToBase64String(alg.Hash); } ``` **Hinzufügen von HMAC-Headern** ```csharp public static void AddHmacHeaders(HttpRequestMessage requestMessage, string contentHash, string accessKey) { var utcNowString = DateTimeOffset.UtcNow.ToString("r", CultureInfo.InvariantCulture); var uri = requestMessage.RequestUri; var host = uri.Authority; var pathAndQuery = uri.PathAndQuery; var stringToSign = $"{requestMessage.Method}\n{pathAndQuery}\n{utcNowString};{host};{contentHash}"; var hmac = new HMACSHA256(Convert.FromBase64String(accessKey)); var hash = hmac.ComputeHash(Encoding.ASCII.GetBytes(stringToSign)); var signature = Convert.ToBase64String(hash); var authorization = $"HMAC-SHA256 SignedHeaders=date;host;x-ms-content-sha256&Signature={signature}"; requestMessage.Headers.Add("x-ms-content-sha256", contentHash); requestMessage.Headers.Add("Date", utcNowString); requestMessage.Headers.Add("Authorization", authorization); } ``` ## <a name="clean-up-resources"></a>Bereinigen von Ressourcen Wenn Sie ein Communication Services-Abonnement bereinigen und entfernen möchten, können Sie die Ressource oder die Ressourcengruppe löschen. Wenn Sie die Ressourcengruppe löschen, werden auch alle anderen Ressourcen gelöscht, die ihr zugeordnet sind. Weitere Informationen zum Bereinigen von Ressourcen finden Sie [hier](../create-communication-resource.md?pivots=platform-azp&tabs=windows#clean-up-resources). ## <a name="next-steps"></a>Nächste Schritte Weitere Informationen finden Sie in den folgenden Artikeln: - Sehen Sie sich das [Beispiel für Webanrufe](../../samples/web-calling-sample.md) an. 
- Informieren Sie sich über die [Funktionen des Calling SDK](./calling-client-samples.md?pivots=platform-web). - Informieren Sie sich über die [Funktionsweise von Anrufen](../../concepts/voice-video-calling/about-call-types.md).
49.950673
410
0.759763
deu_Latn
0.712398
0c20762eb9e7d33107beaa67afa5fe8f35bf5239
3,907
md
Markdown
README.md
facexteam/FaceDetection-DSFD
195cd194f67184bd80e842ae912b5305d22e533f
[ "Apache-2.0" ]
6
2019-03-05T07:13:45.000Z
2020-01-08T09:32:00.000Z
README.md
facexteam/FaceDetection-DSFD
195cd194f67184bd80e842ae912b5305d22e533f
[ "Apache-2.0" ]
null
null
null
README.md
facexteam/FaceDetection-DSFD
195cd194f67184bd80e842ae912b5305d22e533f
[ "Apache-2.0" ]
null
null
null
## [DSFD: Dual Shot Face Detector](https://arxiv.org/abs/1810.10220) [![License](https://img.shields.io/badge/license-BSD-blue.svg)](LICENSE) By [Jian Li](https://lijiannuist.github.io/), [Yabiao Wang](https://github.com/ChaunceyWang), [Changan Wang](https://github.com/HiKapok), [Ying Tai](https://github.com/tyshiwo) ## Introduction This paper is accepted by CVPR 2019. In this paper, we propose a novel face detection network, named DSFD, with superior performance over the state-of-the-art face detectors. You can use the code to evaluate our DSFD for face detection. For more details, please refer to our paper [DSFD: Dual Shot Face Detector](https://arxiv.org/abs/1810.10220)! <p align="center"> <img src="https://github.com/TencentYoutuResearch/FaceDetection-DSFD/blob/master/imgs/DSFD_framework.PNG" alt="DSFD Framework" width="1000px"> </p> Our DSFD face detector achieves state-of-the-art performance on [WIDER FACE](http://mmlab.ie.cuhk.edu.hk/projects/WIDERFace/WiderFace_Results.html) and [FDDB](http://vis-www.cs.umass.edu/fddb/results.html) benchmark. ### WIDER FACE <p align="center"> <img src="https://github.com/TencentYoutuResearch/FaceDetection-DSFD/blob/master/imgs/DSFD_widerface.PNG" alt="DSFD Widerface Performance" width="1000px"> </p> ### FDDB <p align="center"> <img src="https://github.com/TencentYoutuResearch/FaceDetection-DSFD/blob/master/imgs/DSFD_fddb.PNG" alt="DSFD FDDB Performance" width="1000px"> </p> ## Qualitative Results <p align='center'> <img src='https://github.com/TencentYoutuResearch/FaceDetection-DSFD/blob/master/imgs/DSFD_demo1.PNG' width='1000'/> </p> <p align='center'> <img src='https://github.com/TencentYoutuResearch/FaceDetection-DSFD/blob/master/imgs/DSFD_demo2.PNG' width='1000'/> </p> ## Requirements - Torch == 0.3.1 - Torchvision == 0.2.1 - Python == 3.6 - NVIDIA GPU == Tesla P40 - Linux CUDA CuDNN ## Getting Started ### Installation Clone the github repository. We will call the cloned directory as `$DSFD_ROOT`. ```bash git clone https://github.com/TencentYoutuResearch/FaceDetection-DSFD.git cd FaceDetection-DSFD ``` ### Evaluation 1. Download the images of [WIDER FACE](http://mmlab.ie.cuhk.edu.hk/projects/WIDERFace/) and [FDDB](https://drive.google.com/open?id=17t4WULUDgZgiSy5kpCax4aooyPaz3GQH) to `$DSFD_ROOT/data/`. 2. Download our [DSFD model](https://drive.google.com/open?id=1eyqFViMoBlN8JokGRHxbnJ8D4o0pTWac) trained on WIDER FACE training set to `$DSFD_ROOT/weights/`. 3. Check out [`tools/demo.py`](https://github.com/TencentYoutuResearch/FaceDetection-DSFD/blob/master/test/demo.py) on how to detect faces using the DSFD model and how to plot detection results. 4. Evaluate the trained model via [`./widerface_test.py`](https://github.com/TencentYoutuResearch/FaceDetection-DSFD/blob/master/test/widerface_test.py) on WIDER FACE. ``` export CUDA_VISIBLE_DEVICES=0 python widerface_test.py [--trained_model [TRAINED_MODEL]] [--save_folder [SAVE_FOLDER]] [--widerface_root [WIDERFACE_ROOT]] --trained_model Path to the saved model --save_folder Path of output widerface resutls --widerface_root Path of widerface dataset ``` 5. Evaluate the trained model via [`./fddb_test.py`](https://github.com/sTencentYoutuResearch/FaceDetection-DSFD/blob/master/test/fddb_test.py) on FDDB. 6. Download the [eval_tool](http://mmlab.ie.cuhk.edu.hk/projects/WIDERFace/support/eval_script/eval_tools.zip) to show the WIDERFACE performance. 
### Citing DSFD Please cite DSFD in your publications if it helps your research: ``` @inproceedings{li2018dsfd, title={DSFD: Dual Shot Face Detector}, author={Li, Jian and Wang, Yabiao and Wang, Changan and Tai, Ying and Qian, Jianjun and Yang, Jian and Wang, Chengjie and Li, Jilin and Huang, Feiyue}, booktitle={Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition}, year={2019} } ```
43.898876
216
0.748912
yue_Hant
0.510326
0c212feb76430827ee7ee668a4f57255e1b0691e
6,265
md
Markdown
power-platform/admin/create-edit-business-units.md
madeloe/power-platform
c857118b5efa590078d3024e719516aac79913e3
[ "CC-BY-4.0", "MIT" ]
null
null
null
power-platform/admin/create-edit-business-units.md
madeloe/power-platform
c857118b5efa590078d3024e719516aac79913e3
[ "CC-BY-4.0", "MIT" ]
1
2020-10-19T16:52:52.000Z
2020-10-20T05:29:04.000Z
power-platform/admin/create-edit-business-units.md
madeloe/power-platform
c857118b5efa590078d3024e719516aac79913e3
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: "Create or edit business units | MicrosoftDocs" description: Learn how to create or edit business units author: jimholtz ms.service: power-platform ms.component: pa-admin ms.topic: conceptual ms.date: 09/04/2020 ms.author: jimholtz search.audienceType: - admin search.app: - D365CE - PowerApps - Powerplatform - Flow --- # Create or edit business units A business unit is a logical grouping of related business activities. If your organization is structured around departments or divisions that have separate products, customers, and marketing lists, you might want to create business units. Business units are mapped to an organization’s departments or divisions. Users can securely access data in their own business unit, but they can’t access data in other business units. Business units, security roles, and users are linked together in a way that conforms to the role-based security model. Use business units together with security roles to control data access so people see just the information they need to do their jobs. Keep the following in mind when creating business units: - The organization (also known as the root business unit) is the top level of a business unit hierarchy. The customer engagement apps (Dynamics 365 Sales, Dynamics 365 Customer Service, Dynamics 365 Field Service, Dynamics 365 Marketing, and Dynamics 365 Project Service Automation), automatically create the organization when you install or provision customer engagement apps. You can’t delete the organization name. The organization name is derived from the domain name when the environment was provisioned. You cannot change the organization name using the Business Unit form but it can be changed using the [Web API](https://docs.microsoft.com/dynamics365/customer-engagement/web-api/businessunit?view=dynamics-ce-odata-9). - Each business unit can have just one parent business unit. - Each business unit can have multiple child business units. - Security roles and users are associated with a business unit. You must assign every user to one (and only one) business unit. - You cannot add a user into a business unit directly. All newly provisioned users are assigned to the root business. - You can change the user's business unit at anytime. Once the business unit is changed, the user will show up as a member of the business unit automatically. - Each business unit has a default team. You cannot update the default team's name nor delete the default team. - You cannot add or remove users from the business unit's default team. However you can change the user's business unit to the business unit and the user will automatically be added to the business unit's default team. - You can assign a security role to the business unit's default team. This is done to simplify security role management where all your business unit team members can share the same data access. - You can assign additional team to a business unit but there can only be one business unit per team. - A team can consist of users from one or many business units. Consider using this type of team if you have a situation where users from different business units need to work together on a shared set of records. ## Create a new business unit These settings can be found in the Power Platform admin center by going to **Environments** > [select an environment] > **Settings** > **Users + permissions** > **Business units**. Make sure you have the System Administrator or System Customizer security role or equivalent permissions to update the setting. 
- Follow the steps in [View your user profile](https://docs.microsoft.com/powerapps/user/view-your-user-profile). - Don’t have the correct permissions? Contact your system administrator. 1. Select an environment and go to **Settings** > **Users + permissions** > **Business units**. 2. On the Actions bar, select **New**. 3. In the **Business Unit** dialog box, type a name for the new business unit. Customer engagement apps automatically fills in the **Parent Business** field with the name of the root business unit. 4. If you want to change the parent business unit, select the **Lookup** button (![Lookup button](media/lookup-4.png)), **Look Up More Records**, and then do one of the following: - Select an existing business unit from the list. - Create a new parent business unit: 1. Choose **New**, and then add the information for the new parent business unit in the **Business Unit** dialog box. 2. When you’re done adding information, select **Save and Close**. 3. In the **Look Up Record** dialog box, select **Add**. 5. In the **Business Unit** dialog box, fill in any of the other optional fields, such as the Division, Website, contact information, or addresses. 6. When you’re done making entries, select **Save and Close**. ## Change the settings for a business unit 1. Select an environment and go to **Settings** > **Users + permissions** > **Business units**. 2. Select a business unit name. 3. In the **Business Unit** dialog box, do one or more of the following: - Modify the data in one or more fields. - Select a record type under **Organization** to see a list of related records. For example, select **Users** to view a list of users in the selected business unit. 4. When you’re done making changes select **Save and Close**. ### Change the business unit for a user > [!IMPORTANT] > By changing the business unit for a user, you remove all security role assignments for the user. At least one security role must be assigned to the user in the new business unit. 1. Select an environment and go to **Settings** > **Users + permissions** > **Users**. 2. Select a user name. 3. On the **More Commands** (**…**) menu, select **Change Business Unit**. 4. In the **Change Business Unit** dialog box, use the **Lookup** button (![Lookup button](media/lookup-4.png)) to select a new business unit, and then select **OK**. ### See also [Delete a business unit](delete-business-unit.md) [Assign a business unit a different parent business](assign-business-unit-different-parent.md)
56.441441
727
0.742059
eng_Latn
0.998537
0c213f4325d988389010c7cf27a478949d89e2dd
60
md
Markdown
locale/ja/lib/auth/fragments/flutter/common_prereq.md
aws-amplify-jp/docs
700cafa5d360d128a48fcfdf591f6a3ce5a0c71b
[ "Apache-2.0" ]
null
null
null
locale/ja/lib/auth/fragments/flutter/common_prereq.md
aws-amplify-jp/docs
700cafa5d360d128a48fcfdf591f6a3ce5a0c71b
[ "Apache-2.0" ]
null
null
null
locale/ja/lib/auth/fragments/flutter/common_prereq.md
aws-amplify-jp/docs
700cafa5d360d128a48fcfdf591f6a3ce5a0c71b
[ "Apache-2.0" ]
null
null
null
* [ウォークスルーを開始](~/lib/auth/getting-started.md) に応じたアプリのセットアップ
60
60
0.783333
eng_Latn
0.056604
0c216eb0a7a6b32094c8545270ab516c766e943e
459
md
Markdown
README.md
lmeerwood/youtube-file-store
27c668f0db0909933389a3f47fba2864859bd808
[ "MIT" ]
null
null
null
README.md
lmeerwood/youtube-file-store
27c668f0db0909933389a3f47fba2864859bd808
[ "MIT" ]
null
null
null
README.md
lmeerwood/youtube-file-store
27c668f0db0909933389a3f47fba2864859bd808
[ "MIT" ]
null
null
null
# YouTube File Store A small project testing the feasibility of using YouTube to store files. ## Goals The goal of this program is to test whether or not YouTube can be used as a method for distributing files. Upon completion the program should be able to: * Encode a file as a video and upload to YouTube * Download a video from YouTube and decode it ## Note Please use this program at your own risk. I am unsure if doing this will violate YouTube's ToS.
41.727273
153
0.775599
eng_Latn
0.999528
0c21d6f53a44ec1117ac2f2b522739be3a0f57ba
906
md
Markdown
README.md
bzaczynski/httpself
8959d40b93764d00d370c2608eea5e9e950df3e2
[ "MIT" ]
null
null
null
README.md
bzaczynski/httpself
8959d40b93764d00d370c2608eea5e9e950df3e2
[ "MIT" ]
1
2022-02-20T14:42:30.000Z
2022-02-20T14:42:30.000Z
README.md
bzaczynski/httpself
8959d40b93764d00d370c2608eea5e9e950df3e2
[ "MIT" ]
null
null
null
# httpself HTTP server over SSL/TLS with an automatically generated self-signed certificate. ## Usage Serve static files from the current working directory: ```shell $ https [-p/--port <NNNN>] [--public] ``` ### Example #1 By default, the server runs on port number `443`, which requires superuser privileges. Note that the server will only be accessible from the `localhost`: ```shell $ https Running server at https://localhost:443 ``` ### Example #2 To specify an alternative port number: ```shell $ https --port 8443 Running server at https://localhost:8443 ``` ### Example #3 To make the server accessible from other devices: ```shell $ https --public Running server at https://0.0.0.0:443 ``` ### Example #4 To run the server publicly on a custom port: ```shell $ https --public -p 8443 Running server at https://0.0.0.0:8443 ``` ## Installation ```shell $ pip install httpself ```
16.777778
153
0.700883
eng_Latn
0.894866
0c21e14506c656ffb22f13df9184969162450243
1,703
md
Markdown
resources/assets/plugins/news-ticker/README.md
zahrul100/DKPDGroboganOke
e924ca1ff6076b23305fcc63ef5a7054f57dc0a0
[ "MIT" ]
null
null
null
resources/assets/plugins/news-ticker/README.md
zahrul100/DKPDGroboganOke
e924ca1ff6076b23305fcc63ef5a7054f57dc0a0
[ "MIT" ]
null
null
null
resources/assets/plugins/news-ticker/README.md
zahrul100/DKPDGroboganOke
e924ca1ff6076b23305fcc63ef5a7054f57dc0a0
[ "MIT" ]
null
null
null
EndlessRiver ============ A **jQuery** content scroller! # Index - [Introduction](#introduction) - [Requirements](#requirements) - [Usage](#usage) - [License](#license) # Introduction **Endless river** is a jQuery plugin, that animates content and make it scroll endlessly! The usefull feature is that you don't have to set widths to elements, because speed is set in pixel/second!! EndlessRiver will calculate the best width for the &lt;li&gt; element and keep speed constant! # Requirements The only requirement needed is [jQuery](https://jquery.com/download/) that you can install it via [Bower](http://bower.io/). # Usage Simply include the endlessRiver JS and CSS ```html <html> <head> <script type="text/javascript" src="path-to/js/endlessRiver.js"></script> <link type="text/css" rel="stylesheet" href="path-to/css/endlessRiver.css" /> </head> </html> ``` Create an &lt;ul&gt; list like this ```html <ul id="myUl"> <li> | </li> <li style="color: red">Espresso</li> <li style="color: orange">Cappuccino</li> <li style="color: yellow">American</li> <li style="color: green">Tea</li> <li style="color: black">Milk</li> <li style="color: blue">Juice</li> <li> | </li> </ul> ``` and call it like this! ```javascript $(document).ready(function(){ $("#myUl").endlessRiver(); }); ``` **Params** |**option**|**description**|**type** |**default**| | -------- | -------- | -------- | -------- | |speed|Speed of animations in pixel/second|Number|100| |pause|Pause on hover: if true, scrolling will stop on mouseover|Boolean|true| |buttons|If true, shows play/stop buttons|Boolean|false| # License Check out LICENSE file (MIT)
25.80303
124
0.650029
eng_Latn
0.690265
0c22231320cac8db696169cc4238d07ecae4f683
217
md
Markdown
README.md
berkmancenter/odie_analysis
940a1bfad95403cfdc060c786707114efe6ab879
[ "MIT" ]
null
null
null
README.md
berkmancenter/odie_analysis
940a1bfad95403cfdc060c786707114efe6ab879
[ "MIT" ]
null
null
null
README.md
berkmancenter/odie_analysis
940a1bfad95403cfdc060c786707114efe6ab879
[ "MIT" ]
1
2021-09-22T01:58:09.000Z
2021-09-22T01:58:09.000Z
ODIE Analysis Stack =================== This is a quick assemblage of scripts wrapped in an HTTP interface. It's all setup for Docker. ## Setup 1. `cp app/env.template app/.env` and configure. 2. docker-compose up
19.727273
76
0.682028
eng_Latn
0.910936
0c228b57caaf935b7ec1f3832afd4affc017265b
6,318
md
Markdown
blog/_posts/2015/2015-12-10-rebuilding-the-pcjs-website.md
jriwanek-forks/pcjs
868047405c383e2b05848e771e89e2bd4558ee96
[ "MIT" ]
711
2015-01-12T21:43:49.000Z
2022-03-31T15:04:16.000Z
blog/_posts/2015/2015-12-10-rebuilding-the-pcjs-website.md
jriwanek-forks/pcjs
868047405c383e2b05848e771e89e2bd4558ee96
[ "MIT" ]
98
2015-01-06T11:52:01.000Z
2022-03-28T18:14:28.000Z
blog/_posts/2015/2015-12-10-rebuilding-the-pcjs-website.md
jriwanek-forks/pcjs
868047405c383e2b05848e771e89e2bd4558ee96
[ "MIT" ]
152
2015-02-01T20:23:27.000Z
2022-03-30T21:58:48.000Z
--- layout: post title: Rebuilding the PCjs Website date: 2015-12-10 11:03:00 category: Website permalink: /blog/2015/12/10/ --- It's been nice using Node.js to power the PCjs website, using Amazon's Elastic Beanstalk service, but that combination has also been a source of some frustrations. + When someone posts an article or a tweet linking to a PCjs page, the website bogs down, and while Amazon's Elastic Beanstalk service makes it easy to automatically scale up, each new instance automatically multiplies my expenses as well, which hover around $34/month for a single instance. With assorted S3 and transfer charges, my average monthly bill is over $55/month. That's a bit much for a site that generates zero revenue. + Once or twice a year, when I'm attempting to either update the server or upgrade my Node configuration, the update or upgrade will fail, and Amazon's web console provides virtually no details about why it failed. I get an error message like "**ERROR: Failed to deploy application**" and that is it. Literally. Granted, there are some "simple" things I could do to improve performance, like adding an **nginx** proxy server to the configuration, but that feels like a band-aid solution, and as a software developer, the time spent fiddling with web server issues is time I'd much rather spend writing code. Since PCjs is designed to do all its work in the user's web browser, and since the website can be completely built out as a set of static web pages, I've decided to stop using Node to power www.pcjs.org. I'm in the process of migrating the website to [GitHub Pages](https://pages.github.com/), and using [Jekyll](https://help.github.com/articles/using-jekyll-with-pages/) to convert all my existing Markdown files to HTML. This new approach is *very* similar to what the PCjs custom Node modules did: every time someone visited a folder on the website that did not yet contain an "index.html", the PCjs Node server would create one, either by converting the README.md file in that folder to HTML or generating a default HTML document. The PCjs Markdown-to-HTML converter also contained some special logic that made it easy to embed PCjs machines on a page. Thanks to GitHub Pages, all of that now happens ahead of time: whenever I update the PCjs "gh-pages" branch on GitHub, Jekyll automatically runs through the entire site and rebuilds a complete set of web pages. The Node web server would automatically embed PCjs machines on web pages by looking for special Markdown links, such as: [Embedded IBM PC](machine.xml "PCjs:ibm5150") After migrating to GitHub Pages and Jekyll, that markup must now be written as: {% raw %} {% include machine.html id="ibm5150" %} {% endraw %} and the following "Front Matter" (YAML) must appear at the top of the Markdown file: --- ... machines: - type: pcx86 id: ibm5150 --- Basically, the YAML at the top of the file lists all the machines that the page intends to use, and then {% raw %}`{% include ... %}`{% endraw %} is inserted in the text at the point where a machine should be embedded. The `machine.html` include accepts only one parameter: the `id` of a machine listed at the top of the file. The `machines` element at the top of the file must specify a `type` and `id` at a minimum. `type` should be one of the following values, depending on whether you want an IBM PC or Challenger 1P: - pc - c1p and `id` can be any identifier you want to use to embed the machine. 
If you want to include the machine's built-in debugger, set `debugger` to *true*; the older method of specifying *pc-dbg* or *c1p-dbg* instead of *pc* or *c1p* as the `type` still works, too. You may also use `config` to specify a machine XML configuration file if not using the default *machine.xml*; `template` to specify an alternate XSL template file if not using the default *components.xsl*; `state` to specify a JSON-encoded machine state file if the machine requires a predefined state; and `uncompiled` may be set to *true* to force a machine to use uncompiled sources, overriding the value of `site.pcjs.compiled` in **_config.yml**. For example, this [IBM PC](/machines/pcx86/ibm/5150/mda/) contains the following at the top of the [page]({{ site.github.pages }}/machines/pcx86/ibm/5150/mda/): machines: - id: ibm-5150-mda type: pcx86 config: /configs/pcx86/machine/ibm/5150/mda/64kb/machine.json If necessary, you can also override some of the settings in a machine XML file. Here's an example of overriding the FDC `autoMount` setting, making it easy to reuse the same machine XML file with different boot disks: machines: - type: pcx86 id: deskpro386 debugger: true autoMount: A: name: OS/2 FOOTBALL (v7.68.17) config: /configs/pcx86/machine/compaq/deskpro386/ega/4096kb/debugger/machine.xml Other settings that can currently be overridden include: + `autoStart` + `drives` + `messages` + `state` Additional overrides will be added as needed. See the [Windows 95 Demo](/software/pcx86/sys/windows/win95/4.00.950/) machine and its associated [Markdown file]({{ site.github.pages }}/software/pcx86/sys/windows/win95/4.00.950/index.md) for more override examples, including how to set `autoMount` to *not* mount any diskettes. I will continue to include a Node web server with the PCjs Project, but it's now intended for development purposes only (not production servers). I've updated the PCjs MarkOut component to parse any "Front Matter" at the top of the PCjs Markdown files, and to convert all new Jekyll-style embedded machines and screenshots to the older Markdown-compatible formats, so pages generated by the Node web server should still function as before. But there are no guarantees. WARNING: If you decide to run/alternate between Jekyll's web server (WEBrick) and the built-in Node web server (server.js), you should run "grunt clean" before starting either one, to remove any old **index.html** files. Node may inadvertently reuse old "index.html" files, and Jekyll may inadvertently propagate them to its "_site" folder. It's easy to tell when this happens, because you'll see the wrong color scheme: Node web server pages were designed to use *dark* colors, whereas Jekyll web server pages currently use *light* colors.
53.092437
135
0.756885
eng_Latn
0.997844
0c229c9cc2011c18c2faec199fe4567145446616
4,048
md
Markdown
README.md
mvhk/react-simple-boilerplate
6e73534bf3b1535567a4ff3d44c258d604765b97
[ "MIT" ]
null
null
null
README.md
mvhk/react-simple-boilerplate
6e73534bf3b1535567a4ff3d44c258d604765b97
[ "MIT" ]
3
2022-02-14T06:50:47.000Z
2022-02-27T14:04:35.000Z
README.md
udaypydi/blog-infinite
161ed1483a598a4b875129b5550fd70a406eae5a
[ "MIT" ]
null
null
null
<div align="center"> <a href="http://react-simple-boilerplate.surge.sh/"> <h1>React Simple Boilerplate</h1> </a> </div> <div align="center"> <strong>Start your next react project in seconds</strong> <br /> <div> <!-- travis --> <a href="https://travis-ci.org/udaypydi/react-simple-boilerplate"> <img src="https://travis-ci.org/udaypydi/react-simple-boilerplate.svg" alt="Test Coverage" /> </a> <!-- Mit License --> <a href="https://github.com/udaypydi/react-simple-boilerplate/blob/master/LICENSE"> <img src="https://img.shields.io/github/license/udaypydi/react-simple-boilerplate"> </a> <!-- Pr's --> <a href="https://github.com/udaypydi/react-simple-boilerplate/blob/master/CONTRIBUTING.md"> <img src="https://img.shields.io/badge/PRs-welcome-blueviolet.svg"> </a> </div> </div> A simple react boilerplate with webpack hmr and latest babel packages. It comes with inbuilt [Travis-CI](https://travis-ci.org/) for PR checks, lint checks and builds. Kick start the development with tailwind css, react hooks and much more. This React Boilerplate works on macOS, Windows, and Linux.<br> If something doesn’t work, please [file an issue](https://github.com/udaypydi/react-simple-boilerplate/issues/new/choose).<br> ## Quick Overview ```sh git clone https://github.com/udaypydi/react-simple-boilerplate.git cd my-app yarn install npm run dev ``` Then open [http://localhost:9000/](http://localhost:9000/) to load your app.<br> When you’re ready to deploy to production, create a minified bundle with `npm run build`. ### Get Started Immediately You **don’t** need to install or configure tools like webpack or Babel.<br> They are reconfigured and hidden so that you can focus on the code. Clone the project, and you’re good to go. ### Prerequisites - [yarn](https://classic.yarnpkg.com/en/docs/install/) (version > 1.0.0) - [node](https://nodejs.org/en/download/) ## Creating an App **You’ll need to have Node 10.16.0 or higher version on your local development machine**. You can use [nvm](https://github.com/creationix/nvm#installation) (macOS/Linux) or [nvm-windows](https://github.com/coreybutler/nvm-windows#node-version-manager-nvm-for-windows) to switch Node versions between different projects. To create a new app, you should clone this repository using the following command: ### Clone ```sh git clone https://github.com/udaypydi/react-simple-boilerplate.git my-app ``` It will create a directory called `my-app` inside the current folder.<br> Inside that directory, it will generate the initial project structure. No configuration or complicated folder structures, only the files you need to build your app.<br> Once the installation is done, you can open your project folder: ```sh cd my-app ``` Inside the newly created project, you can run some built-in commands: ### `yarn install` It will install all the packages that are required for project to up and running ### `npm run dev` Runs the app in development mode.<br> Open [http://localhost:9000](http://localhost:9000) to view it in the browser. The page will automatically reload if you make changes to the code.<br> You will see the build errors and lint warnings in the console. ### `npm run test` Runs the test watcher in an interactive mode.<br> By default, runs tests related to files changed since the last commit. [Read more about testing.](https://testing-library.com/docs/react-testing-library/intro) ### `npm run lint` Runs the eslint over the `src` dir. You get a list of list errors and warnings if any. ### `npm run lint:fix` Run this command to autofix eslint issues. 
All the autofixable issues will be fixed. ### `npm run build` or `yarn build` Builds the app for production to the `build` folder.<br> It correctly bundles React in production mode and optimizes the build for the best performance. The build is minified and the filenames include the hashes.<br> Your app is ready to be deployed.
35.508772
319
0.719368
eng_Latn
0.956847
0c2331bdb389244f19a6998e6d83638bee9675d4
18,509
md
Markdown
articles/vs-azure-tools-debug-cloud-services-virtual-machines.md
OpenLocalizationTestOrg/azure-docs-pr15_it-IT
a5b6eb257721d6a02db53be2d3b2bee1d9e5aa1c
[ "CC-BY-3.0", "CC-BY-4.0", "MIT" ]
null
null
null
articles/vs-azure-tools-debug-cloud-services-virtual-machines.md
OpenLocalizationTestOrg/azure-docs-pr15_it-IT
a5b6eb257721d6a02db53be2d3b2bee1d9e5aa1c
[ "CC-BY-3.0", "CC-BY-4.0", "MIT" ]
null
null
null
articles/vs-azure-tools-debug-cloud-services-virtual-machines.md
OpenLocalizationTestOrg/azure-docs-pr15_it-IT
a5b6eb257721d6a02db53be2d3b2bee1d9e5aa1c
[ "CC-BY-3.0", "CC-BY-4.0", "MIT" ]
null
null
null
<properties pageTitle="Il debug di un servizio cloud Azure o una macchina virtuale in Visual Studio | Microsoft Azure" description="Il debug di un servizio Cloud o la macchina virtuale in Visual Studio" services="visual-studio-online" documentationCenter="na" authors="TomArcher" manager="douge" editor="" /> <tags ms.service="visual-studio-online" ms.devlang="multiple" ms.topic="article" ms.tgt_pltfrm="multiple" ms.workload="na" ms.date="08/15/2016" ms.author="tarcher" /> # <a name="debugging-an-azure-cloud-service-or-virtual-machine-in-visual-studio"></a>Il debug di un servizio cloud Azure o una macchina virtuale in Visual Studio Visual Studio offre diverse opzioni per il debug di servizi cloud Azure e macchine virtuali. ## <a name="debug-your-cloud-service-on-your-local-computer"></a>Eseguire il debug il servizio cloud nel computer locale È possibile risparmiare tempo e denaro mediante la Azure calcolare emulatore per eseguire il debug il servizio cloud in un computer locale. Tramite il debug di un servizio in locale prima di distribuirlo, è possibile migliorare l'affidabilità e le prestazioni senza una elaborazione a pagamento. Tuttavia, potrebbero verificarsi alcuni errori solo quando si esegue un servizio cloud in Azure stesso. È possibile eseguire il debug di questi errori se si abilita il debug remoto quando si pubblica il servizio e il debug di un'istanza del ruolo. L'emulatore simula il servizio di Azure calcolare e viene eseguita nel proprio ambiente locale in modo che è possibile testare ed eseguire il debug il servizio cloud prima di distribuirlo. L'emulatore gestisce il ciclo di vita delle istanze del ruolo e consente di accedere alle risorse simulate, ad esempio archivio locale. Quando si esegue il debug o eseguire il servizio da Visual Studio, viene avviata automaticamente l'emulatore come un'applicazione di sfondo e quindi distribuisce il servizio all'emulatore. È possibile utilizzare l'emulatore per visualizzare il servizio quando viene eseguito in ambiente locale. È possibile eseguire la versione completa o la versione express dell'emulatore. (A partire da Azure 2.3, la versione express dell'emulatore è il valore predefinito) Vedere [emulatore utilizzando Express esecuzione e il Debug di un servizio Cloud in locale](https://msdn.microsoft.com/library/dn339018.aspx). ### <a name="to-debug-your-cloud-service-on-your-local-computer"></a>Eseguire il debug il servizio cloud nel computer locale 1. Sulla barra dei menu scegliere **Debug** **Avvia debug** per eseguire il progetto di servizio cloud Azure. In alternativa, è possibile premere F5. Verrà visualizzato un messaggio emulatore di calcolo in fase di avvio. Quando si avvia l'emulatore, nell'area di notifica di sistema per conferma. ![Azure emulatore sulla barra delle applicazioni](./media/vs-azure-tools-debug-cloud-services-virtual-machines/IC783828.png) 1. Visualizzare l'interfaccia utente per l'emulatore di calcolo, aprire il menu di scelta rapida per l'icona Azure nell'area di notifica e quindi selezionare **Mostra interfaccia utente di emulatore calcolare**. Riquadro a sinistra dell'interfaccia utente vengono illustrati i servizi attualmente installati nell'emulatore di calcolo e le istanze di ruolo che ogni servizio è in esecuzione. È possibile scegliere il servizio o i ruoli per visualizzare ciclo di vita, registrazione e informazioni di diagnostica nel riquadro destro. Se si inserisce lo stato attivo nel margine superiore di una finestra incluso, si espande per riempire riquadro a destra. 1. 
Passaggio tramite l'applicazione selezionando i comandi del menu **Debug** e impostare i punti di interruzione nel codice. Durante l'esecuzione dell'applicazione nella verrà, i riquadri vengono aggiornati con lo stato corrente dell'applicazione. Quando si interrompe il debug, viene eliminata la distribuzione dell'applicazione. Se l'applicazione include un ruolo web ed è stata impostata la proprietà azione di avvio per avviare il web browser, Visual Studio avvia l'applicazione web nel browser. Se si modifica il numero di istanze di un ruolo nella configurazione del servizio, è necessario arrestare il servizio cloud e quindi riavviare il debug in modo che è possibile eseguire il debug di queste nuove istanze del ruolo. **Nota:** Quando si interrompe esecuzione o il debug del servizio, il calcolo locale emulatore ed emulatore lo spazio di archiviazione non interrotti. È necessario interrompere loro in modo esplicito dall'area di notifica. ## <a name="debug-a-cloud-service-in-azure"></a>Eseguire il debug di un servizio cloud di Azure Per eseguire il debug di un servizio cloud da un computer remoto, è necessario abilitare la funzionalità in modo esplicito quando si distribuisce il servizio cloud, in modo che sia necessario servizi (ad esempio msvsmon.exe) vengono installati in macchine virtuali che eseguono le istanze del ruolo. Se non si è attiva il debug remoto quando è stato pubblicato il servizio, è necessario ripubblicare il servizio con il debug remoto attivato. Se si abilita il debug remoto per un servizio cloud, non presentano negativamente sulle prestazioni o sostenere costi aggiuntivi. Non è consigliabile utilizzare il debug remoto in un servizio di produzione, poiché potrebbe influire negativamente sulle client che usano il servizio. >[AZURE.NOTE] Quando si pubblica un servizio cloud da Visual Studio, è possibile abilitare **IntelliTrace** per i ruoli di tale servizio destinati a .NET Framework 4 o .NET Framework 4.5. Tramite **IntelliTrace**, è possibile esaminare gli eventi che si sono verificati in un'istanza di ruoli in passato e riprodurre il contesto di un'ora specifica. Vedere [debug di un servizio cloud pubblicato con traccia di diagnostica e Visual Studio](http://go.microsoft.com/fwlink/?LinkID=623016) e [Tramite IntelliTrace](https://msdn.microsoft.com/library/dd264915.aspx). ### <a name="to-enable-remote-debugging-for-a-cloud-service"></a>Per attivare il debug remoto per un servizio cloud 1. Aprire il menu di scelta rapida per il progetto Azure e quindi selezionare **pubblica**. 1. Selezionare l'ambiente di **gestione temporanea** e la configurazione di **Debug** . Verrà visualizzata solo linee guida. È possibile scegliere di eseguire gli ambienti di test in un ambiente di produzione. Tuttavia, si potrebbero influire negativamente sulle utenti se si abilita il debug remoto nell'ambiente di produzione. È possibile scegliere la configurazione di rilascio, ma la configurazione di Debug rende il debug più semplice. ![Selezionare la configurazione di Debug](./media/vs-azure-tools-debug-cloud-services-virtual-machines/IC746717.gif) 1. Seguire la procedura normale, ma selezionare la casella di controllo **Abilita Debugger remoto per tutti i ruoli** nella scheda **Impostazioni avanzate** . ![Configurazione di debug](./media/vs-azure-tools-debug-cloud-services-virtual-machines/IC746718.gif) ### <a name="to-attach-the-debugger-to-a-cloud-service-in-azure"></a>Il debug in un servizio cloud in Azure 1. In Esplora Server espandere il nodo per il servizio cloud. 1. 
Aprire il menu di scelta rapida per il ruolo o istanza del ruolo da allegare e quindi selezionare **Debugger allegare**. Se il debug di un ruolo, verrà Visual Studio connesso a ogni istanza di tale ruolo. Si verifica sarà un'interruzione in un punto di interruzione per la prima istanza di ruolo eseguito dalla riga di codice che soddisfa le condizioni di tale punto di interruzione. Se si esegue il debug di un'istanza di debugger è collegato a solo quell'istanza e interruzioni in un punto di interruzione solo quando quell ' istanza viene eseguita la riga di codice e soddisfa le condizioni del punto di interruzione. ![Connettere Debugger](./media/vs-azure-tools-debug-cloud-services-virtual-machines/IC746719.gif) 1. Dopo verrà si connette a un'istanza, eseguire il debug come di consueto. Verrà si connette automaticamente al processo di host appropriato per il proprio ruolo. A seconda che cos'è il ruolo verrà connette w3wp.exe, WaWorkerHost.exe o WaIISHost.exe. Per verificare che il processo a cui è associato verrà, espandere il nodo dell'istanza in Esplora Server. Per ulteriori informazioni sui processi Azure, vedere [Architettura di ruolo Azure](http://blogs.msdn.com/b/kwill/archive/2011/05/05/windows-azure-role-architecture.aspx) . ![Selezionare finestra di dialogo tipo di codice](./media/vs-azure-tools-debug-cloud-services-virtual-machines/IC718346.png) 1. Per identificare i processi a cui è associato verrà, aprire la finestra di dialogo processi, dal menu, scegliere Debug, Windows, processi. (Tastiera: Ctrl + Alt + Z) Per scollegare un processo specifico, aprire il menu di scelta rapida e quindi selezionare **Disconnetti processo**. O, individuare il nodo dell'istanza in Esplora Server, individuare il processo, aprire il menu di scelta rapida e quindi selezionare **Disconnetti processo**. ![Il debug di processi](./media/vs-azure-tools-debug-cloud-services-virtual-machines/IC690787.gif) >[AZURE.WARNING] Evitare interruzioni lunghi interruzione quando remote debug. Azure considera un processo che è in sospeso per più di pochi minuti come non risponde e smette di inviare il traffico a tale istanza. Se si interrompe troppo a lungo, msvsmon.exe si disconnette dal processo. Per scollegare verrà da tutti i processi nel proprio istanza o ruolo, aprire il menu di scelta rapida per il ruolo o istanza che si sta eseguendo il debug e quindi selezionare **Disconnetti Debugger**. ## <a name="limitations-of-remote-debugging-in-azure"></a>Limitazioni del debug remoto Azure Da Azure SDK 2.3, il debug remoto comporta le limitazioni seguenti. - Con il debug remoto attivato, non è possibile pubblicare un servizio cloud in cui qualsiasi ruolo dispone di più di 25 istanze. - Verrà utilizza le porte 30400 a 30424, 31400 a 31424 e 32400 a 32424. Se si tenta di utilizzare una di queste porte, non sarà possibile pubblicare il servizio e uno dei seguenti messaggi di errore verrà visualizzato nel log delle attività per Azure: - Errore di convalida file cscfg nel file di .csdef. La porta riservata intervallo "intervallo" per l'endpoint Microsoft.WindowsAzure.Plugins.RemoteDebugger.Connector del ruolo 'ruolo' si sovrappone a una porta già definita o un intervallo. - Assegnazione non è riuscita. Riprovare in seguito, provare a ridurre la dimensione memoria virtuale o il numero di istanze del ruolo o provare la distribuzione in un'area diversa. 
## <a name="debugging-azure-virtual-machines"></a>Il debug macchine virtuali di Azure È possibile eseguire il debug di programmi in esecuzione in macchine virtuali di Azure tramite Esplora Server in Visual Studio. Quando si abilita il debug remoto in un computer virtuale Azure, Azure installa l'estensione di debug remoto sul computer virtuale. Quindi, è possibile connettersi a processi sul computer virtuale e debug come di consueto. >[AZURE.NOTE] Macchine virtuali create attraverso lo stack di manager delle risorse Azure può essere eseguite in modalità remota da utilizzare Gestione Cloud in Visual Studio 2015. Per ulteriori informazioni, vedere [Gestione di risorse Azure con Esplora risorse Cloud](http://go.microsoft.com/fwlink/?LinkId=623031). ### <a name="to-debug-an-azure-virtual-machine"></a>Per eseguire il debug una macchina virtuale Azure 1. In Esplora Server espandere il nodo macchine virtuali e selezionare il nodo della macchina virtuale che si desidera eseguire il debug. 1. Aprire il menu di scelta rapida e selezionare **Attiva debug**. Quando viene richiesto se si è sicuri se si desidera attivare il debug sul computer virtuale, selezionare **Sì**. Azure installa l'estensione di debug remoto sul computer virtuale per attivare il debug. ![Macchina virtuale attivare comando debug](./media/vs-azure-tools-debug-cloud-services-virtual-machines/IC746720.png) ![Registro attività Azure](./media/vs-azure-tools-debug-cloud-services-virtual-machines/IC746721.png) 1. Al termine l'estensione di debug remoto l'installazione, aprire il menu di scelta rapida della macchina virtuale e selezionare **Allegare Debugger** Azure riceve un elenco dei processi sul computer virtuale e li visualizza nella finestra Connetti a processo finestra di dialogo. ![Comando debugger Allega](./media/vs-azure-tools-debug-cloud-services-virtual-machines/IC746722.png) 1. Nella finestra di dialogo **Connetti a processo** selezionare **Selezionare** per limitare l'elenco dei risultati per visualizzare solo i tipi di codice che per eseguire il debug. È possibile eseguire il debug di codice a 32 o 64 bit gestito, codice nativo o entrambe. ![Selezionare finestra di dialogo tipo di codice](./media/vs-azure-tools-debug-cloud-services-virtual-machines/IC718346.png) 1. Selezionare i processi che si desidera eseguire il debug sul computer virtuale e quindi selezionare **Connetti**. È possibile, ad esempio, il processo di w3wp.exe se si desidera eseguire il debug di un'app web sul computer virtuale. Per ulteriori informazioni, vedere [eseguire il Debug uno o più processi in Visual Studio](https://msdn.microsoft.com/library/jj919165.aspx) e [Architettura ruolo Azure](http://blogs.msdn.com/b/kwill/archive/2011/05/05/windows-azure-role-architecture.aspx) . ## <a name="create-a-web-project-and-a-virtual-machine-for-debugging"></a>Creare un progetto web e una macchina virtuale per il debug Prima di pubblicare il progetto Azure, può risultare utile per eseguire il test in un ambiente contenuto che supporta il debug e scenari di test e in cui è possibile installare il test e i programmi di controllo. È possibile eseguire questa operazione è eseguire il debug remoto l'app in un computer virtuale. Progetti di Visual Studio ASP.NET offrono un'opzione per creare una macchina virtuale comodo che è possibile utilizzare per il test di app. La macchina virtuale include endpoint normalmente richiesti, ad esempio PowerShell, desktop remoto e WebDeploy. 
### <a name="to-create-a-web-project-and-a-virtual-machine-for-debugging"></a>Per creare un progetto web e una macchina virtuale per il debug 1. In Visual Studio, creare una nuova applicazione Web ASP.NET. 1. Nella finestra di dialogo Nuovo progetto ASP.NET nella sezione Azure scegliere **macchina virtuale** nella casella di riepilogo a discesa. Lasciare selezionata la casella di controllo **crea risorse remote** . Fare clic su **OK** per continuare. Viene visualizzata la finestra di dialogo **Crea macchina virtuale in Azure** . ![Creare finestra di dialogo progetto web ASP.NET](./media/vs-azure-tools-debug-cloud-services-virtual-machines/IC746723.png) **Nota:** Verrà richiesto di accedere al proprio account Azure se è ancora effettuato l'accesso. 1. Selezionare le varie impostazioni per la macchina virtuale e fare clic su **OK**. Per ulteriori informazioni, vedere [macchine virtuali]( http://go.microsoft.com/fwlink/?LinkId=623033) . Il nome che immesso per nome DNS sarà il nome del computer virtuale. ![Creare macchina virtuale nella finestra di dialogo Azure](./media/vs-azure-tools-debug-cloud-services-virtual-machines/IC746724.png) Azure crea disposizioni e macchina virtuale e configura i punti finali, ad esempio Desktop remoto e distribuzione Web 1. Dopo la macchina virtuale completamente configurata, selezionare il nodo della macchina virtuale in Esplora Server. 1. Aprire il menu di scelta rapida e selezionare **Attiva debug**. Quando viene richiesto se si è sicuri se si desidera attivare il debug sul computer virtuale, selezionare **Sì**. Azure consente di installare l'estensione di debug remoto macchina virtuale per attivare il debug. ![Macchina virtuale attivare comando debug](./media/vs-azure-tools-debug-cloud-services-virtual-machines/IC746720.png) ![Registro attività Azure](./media/vs-azure-tools-debug-cloud-services-virtual-machines/IC746721.png) 1. Pubblicare il progetto come descritto in [procedura: distribuire un Web progetto tramite pubblicazione con un clic in Visual Studio](https://msdn.microsoft.com/library/dd465337.aspx). Perché si desidera eseguire il debug del computer virtuale, nella pagina **Impostazioni** della creazione guidata **Pubblicazione Web** selezionare **Debug** come la configurazione. In questo modo che i simboli di codice sono disponibili durante il debug. ![Impostazioni di pubblicazione](./media/vs-azure-tools-debug-cloud-services-virtual-machines/IC718349.png) 1. Nelle **Opzioni di pubblicazione di File**, selezionare **Rimuovi altri file di destinazione** se il progetto è già stato distribuito in precedenza. 1. Dopo il progetto pubblica, nel menu di scelta rapida della macchina virtuale in Esplora Server selezionare **Allegare Debugger** Azure riceve un elenco dei processi sul computer virtuale e li visualizza nella finestra Connetti a processo finestra di dialogo. ![Comando debugger Allega](./media/vs-azure-tools-debug-cloud-services-virtual-machines/IC746722.png) 1. Nella finestra di dialogo **Connetti a processo** selezionare **Selezionare** per limitare l'elenco dei risultati per visualizzare solo i tipi di codice che per eseguire il debug. È possibile eseguire il debug di codice a 32 o 64 bit gestito, codice nativo o entrambe. ![Selezionare finestra di dialogo tipo di codice](./media/vs-azure-tools-debug-cloud-services-virtual-machines/IC718346.png) 1. Selezionare i processi che si desidera eseguire il debug sul computer virtuale e quindi selezionare **Connetti**. 
È possibile, ad esempio, il processo di w3wp.exe se si desidera eseguire il debug di un'app web sul computer virtuale. Per ulteriori informazioni, vedere [eseguire il Debug uno o più processi in Visual Studio](https://msdn.microsoft.com/library/jj919165.aspx) . ## <a name="next-steps"></a>Passaggi successivi - Utilizzare **Intellitrace** per raccogliere un file di log delle chiamate e gli eventi da un server di rilascio. Vedere [debug di un servizio Cloud pubblicato con traccia di diagnostica e Visual Studio](http://go.microsoft.com/fwlink/?LinkID=623016). - Utilizzare **Azure diagnostica** per registrare informazioni dettagliate dal codice in esecuzione all'interno di ruoli, se i ruoli in esecuzione nell'ambiente di sviluppo o in Azure. Vedere [registrazione di raccolta dati tramite diagnostica Azure](http://go.microsoft.com/fwlink/p/?LinkId=400450).
95.901554
927
0.792425
ita_Latn
0.999119
0c2336e27d032678e8b9916371e16c7b12b04ec6
92
md
Markdown
README.md
integr8/terraform-aws-devops-selenium
b7340192002027fd1101dd4bf3bf4a054697f6ce
[ "BSD-2-Clause" ]
3
2019-01-29T03:33:21.000Z
2021-05-02T13:39:03.000Z
README.md
fabioluciano/terraform-aws-devops-selenium
b7340192002027fd1101dd4bf3bf4a054697f6ce
[ "BSD-2-Clause" ]
null
null
null
README.md
fabioluciano/terraform-aws-devops-selenium
b7340192002027fd1101dd4bf3bf4a054697f6ce
[ "BSD-2-Clause" ]
5
2021-02-06T23:07:30.000Z
2022-02-22T00:10:00.000Z
# terraform-aws-devops-selenium Terraform module to provision a selenium hub cluster on AWS
30.666667
59
0.826087
eng_Latn
0.559072
0c23f2071ac28134f88f3a4c5ae41e0e37392d2b
17,448
md
Markdown
articles/finance/cash-bank-management/set-up-advanced-bank-reconciliation-import-process.md
eltociear/Dynamics-365-Operations.hu-hu
b6935b9aa596c4f0efcd971958602e84679246b8
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/finance/cash-bank-management/set-up-advanced-bank-reconciliation-import-process.md
eltociear/Dynamics-365-Operations.hu-hu
b6935b9aa596c4f0efcd971958602e84679246b8
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/finance/cash-bank-management/set-up-advanced-bank-reconciliation-import-process.md
eltociear/Dynamics-365-Operations.hu-hu
b6935b9aa596c4f0efcd971958602e84679246b8
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: Továbbfejlesztett banki egyeztetés importálásának beállítása description: A Továbbfejlesztett banki egyeztetés funkció lehetővé teszi Önnek az elektronikus banki kivonatok és az automatikus egyeztetés importálását a Microsoft Dynamics 365 Finance rendszer banki tranzakcióiba. Ez a cikk ismerteti, hogyan állítható be az importálás az Ön banki kivonataihoz. author: ShylaThompson manager: AnnBe ms.date: 06/20/2017 ms.topic: article ms.prod: '' ms.service: dynamics-ax-applications ms.technology: '' ms.search.form: BankStatementFormat audience: Application User ms.reviewer: roschlom ms.search.scope: Core, Operations ms.custom: 106853 ms.assetid: 45dae275-ea45-4c7e-b38f-89297c7b5352 ms.search.region: Global ms.author: saraschi ms.search.validFrom: 2016-02-28 ms.dyn365.ops.version: AX 7.0.0 ms.openlocfilehash: 4d9a2f6efad6b8ddf3a445fe7831244e161c35d5 ms.sourcegitcommit: dd960cf07d8be791fd27c7bb72e6baa2d63ccd51 ms.translationtype: HT ms.contentlocale: hu-HU ms.lasthandoff: 10/14/2019 ms.locfileid: "2578195" --- # <a name="set-up-the-advanced-bank-reconciliation-import-process"></a>Továbbfejlesztett banki egyeztetés importálásának beállítása [!include [banner](../includes/banner.md)] A Továbbfejlesztett banki egyeztetés funkció lehetővé teszi Önnek az elektronikus banki kivonatok és az automatikus egyeztetés importálását a Dynamics 365 Finance rendszer banki tranzakcióiba. Ez a cikk ismerteti, hogyan állítható be az importálás az Ön banki kivonataihoz. A banki kivonat importálási beállítások az elektronikus banki kivonat formátumától függően változnak. A Finance által támogatott három banki kivonat formátuma: ISO20022, MT940 és BAI2. ## <a name="set-time-zone-preference"></a>Időzóna-preferencia beállítása A banki kivonatok importálási beállításainak konfigurálásakor fontos, hogy figyelembe vegye a banki kivonat fájljain belül a dátum-idő adatok időzónáját. Alapértelmezett módon feltételezhető, hogy a megadott dátum és időértékek már az Egyezményes világidőt (UTC) követik, így nem kell időzóna-átváltást alkalmazni az adatok importálásakor. Lehetőség van az adatok importálásához használni kívánt időzóna meghatározására. Ez a beállítás az egyes **Forrásadat-formátum részletei** oldal **Időzóna-preferencia** mezőjében áll rendelkezésre (**Adatkezelés munkaterület > Adatforrások konfigurálása > Adatformátum kiválasztása > Regionális beállítások** gyorslap). Ez a megadott időzóna-beállítás minden olyan importálásra vonatkozik, amely a forrásadatok formátumát használja. Tetszőleges számú adatforrás-formátumot létre lehet hozni a különböző időzónák adatainak importálásához. Előfordulhat, hogy ez az időzóna nem ugyanaz, mint a felhasználó vagy a vállalat időzónája, ezért ügyeljen arra, hogy az időzónát a dátum-és időadatok alapján kell egyértelműsíteni. Javasoljuk, hogy az időzóna-beállítások megadásakor vegye figyelembe a következő szempontokat. - Ez az időzóna-preferencia minden olyan importálásra vonatkozik, amely a forrásadatok formátumát használja. Tetszőleges számú adatforrás-formátumot létre lehet hozni a különböző időzónák adatainak importálásának kezeléséhez. - Az időzónabeállításnak meg kell felelnie a helyi időzónának az importált fájl dátum-és időadataiban - A dátum-és időadatok időzónája nem biztos, hogy ugyanaz, mint a felhasználó vagy a vállalat időzónája. - A felhasználók saját időzónájukat a **Felhasználói beállítások** pontban adhatják meg. Ne felejtse el, hogy a következő műveletek minimalizálni tudják a lehetséges dátum-és időütközéseket a banki kivonatok importálásakor. 
- Ne módosítsa az időzóna-preferenciákat olyan bankszámlák esetében, amelyeknél már importált kivonatokat. Az időzóna-preferencia módosítása hatással lehet az új kimutatások meglévő kimutatásokhoz viszonyított elrendezésére az időzóna-módosítás miatt. - Ellenőrizze az összes olyan importot, amely a kiválasztott adatforrás-formátumot használja. A formátumhoz megadott időzóna-preferencia a formátumot használó összes importálási projektre érvényes lesz. Ellenőrizze, hogy a formátumhoz megadott időzóna-preferencia megfelelő-e a formátumot használó összes importálási projektre. ## <a name="sample-files"></a>Mintafájlok Mind a három formátumra vonatkozóan rendelkezni kell olyan fájlokkal, amik az elektronikus banki kivonatot lefordítják az eredeti formátumról a Finance által használható formátumra. A szükséges erőforrás fájlokat megtalálhatja az **Erőforrások** fülön, az Application Explorer-ben a Microsoft Visual Studio alkalmazásban. Miután megtalálta a fájlokat, másolja azokat egy ismert helyre, így egyszerűen feltöltheti azokat a telepítési folyamat során. | Erőforrás neve | Fájlnév | |---------------------------------------------------------|--------------------------------------| | BankStmtImport\_BAI2CSV\_to\_BAI2XML\_xslt | BAI2CSV-to-BAI2XML.xslt | | BankStmtImport\_BAI2XML\_to\_Reconciliation\_xslt | BAI2XML-to-Reconciliation.xslt | | BankStmtImport\_BankReconciliation\_to\_Composite\_xslt | BankReconciliation-to-Composite.xslt | | BankStmtImport\_ISO20022XML\_to\_Reconciliation\_xslt | ISO20022XML-to-Reconciliation.xslt | | BankStmtImport\_MT940TXT\_to\_MT940XML\_xslt | MT940TXT-to-MT940XML.xslt | | BankStmtImport\_MT940XML\_to\_Reconciliation\_xslt | MT940XML-to-Reconciliation.xslt | | BankStmtImport\_SampleBankCompositeEntity\_xml | SampleBankCompositeEntity.xml | ## <a name="examples-of-bank-statement-formats-and-technical-layouts"></a>A banki kivonat formátumainak és a technikai elrendezések példái Az alábbiakban a továbbfejlesztett banki egyeztetési importfájl technikai elrendezésű definicóinak példáit és a három kapcsolódó banki kivonat példafájljait találhatja: https://mbs.microsoft.com/customersource/northamerica/AX/learning/documentation/how-to-articles/exofbankstfotechlayouts | Technikai elrendezésdefiníció | Banki kivonat példafájl | |---------------------------------------------------------|--------------------------------------| | DynamicsAXMT940Layout | MT940StatementExample | | DynamicsAXISO20022Layout | ISO20022StatementExample | | DynamicsAXBAI2Layout | BAI2StatementExample | ## <a name="set-up-the-import-of-iso20022-bank-statements"></a>Állítsa be a ISO20022 banki kivonatok importálását Először definiálni kell a banki kivonat formátum feldolgozási csoportját ISO20022 banki kivonatokhoz, az adatentitás keretrendszer használatával. 1. Menjen a **Munkaterületek** &gt; **Adatkezelés** lehetőségre. 2. Kattintson az **Importálás** gombra. 3. Adja meg a formátum nevét, például **ISO20022**. 4. A **Forrás adat formátuma** mezőt állítsa **XML-Element** értékre. 5. Az **Entitás neve** mezőbe írja be **Banki kivonatok**. 6. Az importálási fájlok feltöltéséhez, kattintson a **Feltöltés** gombra, majd tallózással válassza ki a **SampleBankCompositeEntity.xml** fájlt, amit már korábban lementett. 7. Miután a banki kimutatások entitás feltöltése és a hozzárendelés elkészült, kattintson a **Térkép megjelenítése** entitáshoz tartozó műveletre. 8. A banki kimutatások entitás egy összetett entitást, amelyet négy külön entitás alkot. 
Válassza ki a listából a **BankStatementDocumentEntity** lehetőséget, majd kattintson a **Térkép megjelenítése** műveletre. 9. Az **Átalakítások** fülön, kattintson az **Új** lehetőségre. 10. Az 1-es sorozatszámhoz, kattintson a **Fájlfeltöltés** lehetőségre, és válassza ki az **ISO20022XML-Reconciliation.xslt** fájl, amit már korábban lementett. **Megjegyzés:** Az átalakítási fájlok szabványos formátumra épülnek. Mivel a bankok gyakran eltérnek ebben a formátumban, lehet, hogy frissíteni kell az átalakító fájlt, hogy leképezze az ön banki kivonat formátumát. <!-- For details about the expected format for ISO20022, see [Dynamics AX ISO20022 Layout](./media/dynamicsaxiso20022layout1.xlsx).--> 11. Kattintson az **Új** elemre. 12. Az 2-es sorozatszámhoz, kattintson a **Fájlfeltöltés** lehetőségre, és válassza ki a **BankReconciliation-to-Composite.xslt** fájl, amit már korábban lementett. 13. Kattintson az **Átalakítások alkalmazása** lehetőségre. A formátum feldolgozó csoport beállítása után, a következő lépés célja az ISO20022 banki kivonatok banki kivonat formátuma szabályainak meghatározása. 1. Menjen a **Készpénz és bankkezelés** &gt; **Beállítás** &gt; **Továbbfejlesztett banki egyeztetés beállítása** &gt; **Banki kivonatok formátuma** elemre. 2. Kattintson az **Új** elemre. 3. Adja meg a kivonat formátumát, például a **ISO20022**. 4. Adja meg a formátum nevét. 5. Állítsa be a **Feldolgozó csoport** mezőt ahhoz a csoporthoz, amit már korábban definiált, például a **ISO20022**. 6. Jelölje be az **XML file** jelölőnégyzetet. Az utolsó lépés, a Speciális bankszámla egyeztetés engedélyezése és a kivonat formátumának beállítása a bank számlán. 1. Nyissa meg a következőt: **Készpénz- és bankkezelés** &gt; **Bankszámlák**. 2. Válassza ki azt a bankszámlát, és nyissa meg a részletek megtekintéséhez. 3. Az **Egyeztetés** lapon, állítsa a **Speciális banki egyeztetés** lehetőséget **Igen** értékre. 4. Állítsa be a **Kivonat formátuma** mezőt ahhoz a formátumhoz, amit már korábban létrehozott, például az **ISO20022**. ## <a name="set-up-the-import-of-mt940-bank-statements"></a>Állítsa be a MT940 banki kivonatok importálását Először definiálni kell a banki kivonat formátum feldolgozási csoportját MT940 banki kivonatokhoz, az adatentitás keretrendszer használatával. 1. Menjen a **Munkaterületek** &gt; **Adatkezelés** lehetőségre. 2. Kattintson az **Importálás** gombra. 3. Adja meg a formátum nevét, például **MT940**. 4. A **Forrás adat formátuma** mezőt állítsa **XML-Element** értékre. 5. Az **Entitás neve** mezőbe írja be **Banki kivonatok**. 6. Az importálási fájlok feltöltéséhez, kattintson a **Feltöltés** gombra, majd tallózással válassza ki a **SampleBankCompositeEntity.xml** fájlt, amit már korábban lementett. 7. Miután a banki kimutatások entitás feltöltése és a hozzárendelés elkészült, kattintson a **Térkép megjelenítése** entitáshoz tartozó műveletre. 8. A banki kimutatások entitás egy összetett entitást, amelyet négy külön entitás alkot. Válassza ki a listából a **BankStatementDocumentEntity** lehetőséget, majd kattintson a **Térkép megjelenítése** műveletre. 9. Az **Átalakítások** fülön, kattintson az **Új** lehetőségre. 10. Az 1-es sorozatszámhoz, kattintson a **Fájlfeltöltés** lehetőségre, és válassza ki az **MT940TXT-to-MT940XML.xslt** fájl, amit már korábban lementett. 11. Kattintson az **Új** elemre. 12. Az 2-es sorozatszámhoz, kattintson a **Fájlfeltöltés** lehetőségre, és válassza ki az **MT940XML-Reconciliation.xslt** fájl, amit már korábban lementett. 
**Megjegyzés:** Az átalakítási fájlok szabványos formátumra épülnek. Mivel a bankok gyakran eltérnek ebben a formátumban, lehet, hogy frissíteni kell az átalakító fájlt, hogy leképezze az ön banki kivonat formátumát. <!--- For details about the expected format for MT940, see [Dynamics AX MT940 Layout](./media/dynamicsaxmt940layout1.xlsx)--> 13. Kattintson az **Új** elemre. 14. Az 3-es sorozatszámhoz, kattintson a **Fájlfeltöltés** lehetőségre, és válassza ki a **BankReconciliation-to-Composite.xslt** fájl, amit már korábban lementett. 15. Kattintson az **Átalakítások alkalmazása** lehetőségre. A formátum feldolgozó csoport beállítása után, a következő lépés célja az MT940 banki kivonatok banki kivonat formátuma szabályainak meghatározása. 1. Menjen a **Készpénz és bankkezelés** &gt; **Beállítás** &gt; **Továbbfejlesztett banki egyeztetés beállítása** &gt; **Banki kivonatok formátuma** elemre. 2. Kattintson az **Új** elemre. 3. Adja meg a kivonat formátumát, például a **MT940**. 4. Adja meg a formátum nevét. 5. Állítsa be a **Feldolgozó csoport** mezőt ahhoz a csoporthoz, amit már korábban definiált, például a **MT940**. 6. Állítsa a **Fájl típus** mezőt **txt** értékre. Az utolsó lépés, a Speciális bankszámla egyeztetés engedélyezése és a kivonat formátumának beállítása a bank számlán. 1. Nyissa meg a következőt: **Készpénz- és bankkezelés** &gt; **Bankszámlák**. 2. Válassza ki azt a bankszámlát, és nyissa meg a részletek megtekintéséhez. 3. Az **Egyeztetés** lapon, állítsa a **Speciális banki egyeztetés** lehetőséget **Igen** értékre. 4. Amikor felszólítást kap a választás megerősítéséhez, és lehetővé teszi a Speciális banki egyeztetést, kattintson az **OK** lehetőségre. 5. Állítsa be a **Kivonat formátuma** mezőt ahhoz a formátumhoz, amit már korábban létrehozott, például az **MT940**. ## <a name="set-up-the-import-of-bai2-bank-statements"></a>Állítsa be a BAI2 banki kivonatok importálását Először definiálni kell a banki kivonat formátum feldolgozási csoportját BAI2 banki kivonatokhoz, az adatentitás keretrendszer használatával. 1. Menjen a **Munkaterületek** &gt; **Adatkezelés** lehetőségre. 2. Kattintson az **Importálás** gombra. 3. Adja meg a formátum nevét, például **BAI2**. 4. A **Forrás adat formátuma** mezőt állítsa **XML-Element** értékre. 5. Az **Entitás neve** mezőbe írja be **Banki kivonatok**. 6. Az importálási fájlok feltöltéséhez, kattintson a **Feltöltés** gombra, majd tallózással válassza ki a **SampleBankCompositeEntity.xml** fájlt, amit már korábban lementett. 7. Miután a banki kimutatások entitás feltöltése és a hozzárendelés elkészült, kattintson a **Térkép megjelenítése** entitáshoz tartozó műveletre. 8. A banki kimutatások entitás egy összetett entitást, amelyet négy külön entitás alkot. Válassza ki a listából a **BankStatementDocumentEntity** lehetőséget, majd kattintson a **Térkép megjelenítése** műveletre. 9. Az **Átalakítások** fülön, kattintson az **Új** lehetőségre. 10. Az 1-es sorozatszámhoz, kattintson a **Fájlfeltöltés** lehetőségre, és válassza ki az **BAI2CSV-to-BAI2XML.xslt** fájl, amit már korábban lementett. 11. Kattintson az **Új** elemre. 12. Az 2-es sorozatszámhoz, kattintson a **Fájlfeltöltés** lehetőségre, és válassza ki az **BAI2XML-Reconciliation.xslt** fájl, amit már korábban lementett. **Megjegyzés:** Az átalakítási fájlok szabványos formátumra épülnek. Mivel a bankok gyakran eltérnek ebben a formátumban, lehet, hogy frissíteni kell az átalakító fájlt, hogy leképezze az ön banki kivonat formátumát. 
<!--- For details about the expected format for BAI2, see [Dynamics AX BAI2 Layout](./media/dynamicsaxbai2layout1.xlsx).--> 13. Kattintson az **Új** elemre. 14. Az 3-es sorozatszámhoz, kattintson a **Fájlfeltöltés** lehetőségre, és válassza ki a **BankReconciliation-to-Composite.xslt** fájl, amit már korábban lementett. 15. Kattintson az **Átalakítások alkalmazása** lehetőségre. A formátum feldolgozó csoport beállítása után, a következő lépés célja az BAI2 banki kivonatok banki kivonat formátuma szabályainak meghatározása. 1. Menjen a **Készpénz és bankkezelés** &gt; **Beállítás** &gt; **Továbbfejlesztett banki egyeztetés beállítása** &gt; **Banki kivonatok formátuma** elemre. 2. Kattintson az **Új** elemre. 3. Adja meg a kivonat formátumát, például a **BAI2**. 4. Adja meg a formátum nevét. 5. Állítsa be a **Feldolgozó csoport** mezőt ahhoz a csoporthoz, amit már korábban definiált, például a **BAI2**. 6. Állítsa a **Fájl típus** mezőt **txt** értékre. Az utolsó lépés, a Speciális bankszámla egyeztetés engedélyezése és a kivonat formátumának beállítása a bank számlán. 1. Nyissa meg a következőt: **Készpénz- és bankkezelés** &gt; **Bankszámlák**. 2. Válassza ki azt a bankszámlát, és nyissa meg a részletek megtekintéséhez. 3. Az **Egyeztetés** lapon, állítsa a **Speciális banki egyeztetés** lehetőséget **Igen** értékre. 4. Amikor felszólítást kap a választás megerősítéséhez, és lehetővé teszi a Speciális banki egyeztetést, kattintson az **OK** lehetőségre. 5. Állítsa be a **Kivonat formátuma** mezőt ahhoz a formátumhoz, amit már korábban létrehozott, például az **BAI2**. ## <a name="test-the-bank-statement-import"></a>Bankkivonat importálásának tesztelése Az utolsó lépés a banki kivonatok importálásnak tesztelése. 1. Nyissa meg a következőt: **Készpénz- és bankkezelés** &gt; **Bankszámlák**. 2. Válassza ki azt a bankszámlát, amihez a Speciális banki egyeztetés funkció engedélyezve van. 3. Az **Egyeztetés** fülön, kattintson a **Banki kivonatok** lehetőségre. 4. A **Banki kivonat** oldalon, kattintson az **Importnyilatkozat** lehetőségre. 5. Az **Bank számla** mezőt állítsa a kiválasztott a bankszámlára. A **Kivonat formátuma** mező értéke automatikusan lesz beállítva, a bankszámla beállításainak alapján. 6. Kattintson a **Keresés** lehetőségre, és jelölje ki az ön elektronikus banki kivonat fájlját. 7. Kattintson a **Feltöltés** hivatkozásra. 8. Kattintson az **OK** gombra. Ha az importálás sikeres, egy üzenetet fog kapni, amely arról tájékoztatja, hogy sikeresen importálta az ön kivonatát. Ha az importálás nem sikeres, az **Adatok kezelése** munkaterületen, a **Feladatelőzmények** szakaszban, keresse meg a feladatot. Kattintson a feladathoz tartozó **Végrehajtási részletek** lehetőségre, a **Végrehajtási összefoglalás** lap megnyitásához, majd kattintson a **Végrehajtási napló megtekintése** lehetőségre, az importálási hibák megtekintéséhez.
86.376238
539
0.768913
hun_Latn
1.000009
0c241b85c11efb859f5a490fc7970790b42fa9df
3,775
md
Markdown
wdk-ddi-src/content/portcls/nn-portcls-iservicesink.md
pcfist/windows-driver-docs-ddi
a14a7b07cf628368a637899de9c47e9eefba804c
[ "CC-BY-4.0", "MIT" ]
null
null
null
wdk-ddi-src/content/portcls/nn-portcls-iservicesink.md
pcfist/windows-driver-docs-ddi
a14a7b07cf628368a637899de9c47e9eefba804c
[ "CC-BY-4.0", "MIT" ]
null
null
null
wdk-ddi-src/content/portcls/nn-portcls-iservicesink.md
pcfist/windows-driver-docs-ddi
a14a7b07cf628368a637899de9c47e9eefba804c
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- UID: NN:portcls.IServiceSink title: IServiceSink author: windows-driver-content description: The IServiceSink interface encapsulates handling of a service request. old-location: audio\iservicesink.htm old-project: audio ms.assetid: 329ae226-02fb-438b-b461-da51e3afd6eb ms.author: windowsdriverdev ms.date: 2/27/2018 ms.keywords: IServiceSink, IServiceSink interface [Audio Devices], IServiceSink interface [Audio Devices], described, audio.iservicesink, audmp-routines_68a03e77-6246-44e7-acad-6de0fbe10c41.xml, portcls/IServiceSink ms.prod: windows-hardware ms.technology: windows-devices ms.topic: interface req.header: portcls.h req.include-header: req.target-type: Windows req.target-min-winverclnt: req.target-min-winversvr: req.kmdf-ver: req.umdf-ver: req.ddi-compliance: req.unicode-ansi: req.idl: req.max-support: req.namespace: req.assembly: req.type-library: req.lib: Portcls.lib req.dll: req.irql: PASSIVE_LEVEL topic_type: - APIRef - kbSyntax api_type: - COM api_location: - portcls.h api_name: - IServiceSink product: Windows targetos: Windows req.typenames: PC_EXIT_LATENCY, *PPC_EXIT_LATENCY --- # IServiceSink interface ## -description The <code>IServiceSink</code> interface encapsulates handling of a service request. The source of the service request is typically the miniport driver's interrupt service routine. PortCls supports the <code>IServiceSink</code> interface. An <code>IServiceSink</code> object is typically a member of a service group that is managed by an <a href="..\portcls\nn-portcls-iservicegroup.md">IServiceGroup</a> object. <code>IServiceSink</code> inherits from the <b>IUnknown</b> interface. <code>IServiceSink</code> is the base interface for <b>IServiceGroup</b>. This allows an <b>IServiceGroup</b> object to add itself (as an object with an <code>IServiceSink</code> interface) to another <b>IServiceGroup</b> object's service group. Although the PortCls system driver provides a <a href="..\portcls\nf-portcls-pcnewservicegroup.md">PcNewServiceGroup</a> function for creating a service group object, no similar function exists for creating a service sink object. Instead, a driver object that requires a service sink simply implements an <code>IServiceSink</code> interface in the driver object. For convenience, header file portcls.h includes an <b>IMP_IServiceSink</b> constant for adding the <code>IServiceSink</code> implementation to the object's class definition. The cost of adding an <code>IServiceSink</code> interface to an object is small because the interface supports only a single method. A port driver typically adds an <code>IServiceSink</code> interface to its port object and stream objects so that they can receive notification of interrupts from an audio device. For more information, see <a href="https://msdn.microsoft.com/00e17e01-8889-4fae-a0ff-e110d7a9b21e">Service Sink and Service Group Objects</a>. ## -inheritance The <b xmlns:loc="http://microsoft.com/wdcml/l10n">IServiceSink</b> interface inherits from the <a href="https://msdn.microsoft.com/33f1d79a-33fc-4ce5-a372-e08bda378332">IUnknown</a> interface. <b>IServiceSink</b> also has these types of members: <ul> <li><a href="https://docs.microsoft.com/">Methods</a></li> </ul> ## -members The <b>IServiceSink</b> interface has these methods. 
<table class="members" id="memberListMethods"> <tr> <th align="left" width="37%">Method</th> <th align="left" width="63%">Description</th> </tr> <tr data="declared;"> <td align="left" width="37%"> <a href="https://msdn.microsoft.com/library/windows/hardware/ff537009">IServiceSink::RequestService</a> </td> <td align="left" width="63%"> The <code>RequestService</code> method is called to forward a service request to an <b>IServiceSink</b> object. </td> </tr> </table> 
43.390805
849
0.774834
eng_Latn
0.687237
0c24f59103ae97a7bff5f1ae3840ce709305189f
1,035
md
Markdown
docs/framework/wcf/diagnostics/tracing/system-servicemodel-channels-maxacceptedchannelsreached.md
BraisOliveira/docs.es-es
cc6cffb862a08615c53b07afbbdf52e2a5ee0990
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/framework/wcf/diagnostics/tracing/system-servicemodel-channels-maxacceptedchannelsreached.md
BraisOliveira/docs.es-es
cc6cffb862a08615c53b07afbbdf52e2a5ee0990
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/framework/wcf/diagnostics/tracing/system-servicemodel-channels-maxacceptedchannelsreached.md
BraisOliveira/docs.es-es
cc6cffb862a08615c53b07afbbdf52e2a5ee0990
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: System.ServiceModel.Channels.MaxAcceptedChannelsReached
ms.date: 03/30/2017
ms.assetid: 13d15194-a04f-4a5a-9d85-23ad350fdc7e
ms.openlocfilehash: 625597fa878b032c26e95fa2adfd7bfae9091dcf
ms.sourcegitcommit: 9b552addadfb57fab0b9e7852ed4f1f1b8a42f8e
ms.translationtype: MT
ms.contentlocale: es-ES
ms.lasthandoff: 04/23/2019
ms.locfileid: "61792449"
---
# <a name="systemservicemodelchannelsmaxacceptedchannelsreached"></a>System.ServiceModel.Channels.MaxAcceptedChannelsReached

System.ServiceModel.Channels.MaxAcceptedChannelsReached

## <a name="description"></a>Description

The maximum number of incoming session channels has been reached.

## <a name="see-also"></a>See also

- [Tracing](../../../../../docs/framework/wcf/diagnostics/tracing/index.md)
- [Using Tracing to Troubleshoot Your Application](../../../../../docs/framework/wcf/diagnostics/tracing/using-tracing-to-troubleshoot-your-application.md)
- [Administration and Diagnostics](../../../../../docs/framework/wcf/diagnostics/index.md)
45
171
0.77971
yue_Hant
0.308977
0c250fff034255abdf248588a85e1d3e199148a5
1,884
md
Markdown
articles/cognitive-services/personalizer/whats-new.md
asashiho/azure-docs.ja-jp
979cd1d2262d6eeef35e17df59f00c818fc37b3b
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/cognitive-services/personalizer/whats-new.md
asashiho/azure-docs.ja-jp
979cd1d2262d6eeef35e17df59f00c818fc37b3b
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/cognitive-services/personalizer/whats-new.md
asashiho/azure-docs.ja-jp
979cd1d2262d6eeef35e17df59f00c818fc37b3b
[ "CC-BY-4.0", "MIT" ]
1
2020-05-21T03:03:13.000Z
2020-05-21T03:03:13.000Z
---
title: What's new - Personalizer
titleSuffix: Azure Cognitive Services
description: This article contains news about Personalizer.
author: diberry
manager: nitinme
services: cognitive-services
ms.service: cognitive-services
ms.subservice: personalizer
ms.topic: conceptual
ms.date: 03/01/2019
ms.author: diberry
ms.openlocfilehash: 21e1be094716ac1d43e1f4458b41e855315d1779
ms.sourcegitcommit: 58faa9fcbd62f3ac37ff0a65ab9357a01051a64f
ms.translationtype: HT
ms.contentlocale: ja-JP
ms.lasthandoff: 04/29/2020
ms.locfileid: "82192988"
---
# <a name="whats-new-in-personalizer"></a>What's new in Personalizer

Learn what's new in the service. These items may include release notes, videos, blog posts, and other types of information. Bookmark this page to stay up to date with the service.

## <a name="release-notes"></a>Release notes

### <a name="march-2020"></a>March 2020

* TLS 1.2 is now enforced on all HTTP requests to this service. For more information, see [Azure Cognitive Services security](../cognitive-services-security.md).

### <a name="november-2019---ignite-conference"></a>November 2019 - Ignite Conference

* General availability (GA) of Personalizer
* Azure Notebooks [tutorial](tutorial-use-azure-notebook-generate-loop-data.md) covering the entire lifecycle

### <a name="may-2019---build-conference"></a>May 2019 - //Build Conference

The following preview features were released at the Build 2019 Conference:

* [Rank and reward learning loop](what-is-personalizer.md)

## <a name="videos"></a>Videos

### <a name="2019-build-videos"></a>2019 Build videos

* [Deliver the right experiences and content, Xbox-style, with Cognitive Services Personalizer](https://mybuild.techcommunity.microsoft.com/sessions/76970?source=sessions#top-anchor)

## <a name="service-updates"></a>Service updates

[Azure update announcements for Cognitive Services](https://azure.microsoft.com/updates/?product=cognitive-services)

## <a name="next-steps"></a>Next steps

* [Quickstart: Create a feedback loop in C#](csharp-quickstart-commandline-feedback-loop.md)
* [Use the interactive demo](https://personalizationdemo.azurewebsites.net/)
34.888889
160
0.779724
yue_Hant
0.614564
0c2529b3002e8f801b6f270d5b0c9fc2d74bc735
142
md
Markdown
posts/my-second-post.md
chuiliu/gatsby-site
ae45dff09d2c3d5d1937fa0dee1a5bec72d5e3c6
[ "MIT" ]
null
null
null
posts/my-second-post.md
chuiliu/gatsby-site
ae45dff09d2c3d5d1937fa0dee1a5bec72d5e3c6
[ "MIT" ]
null
null
null
posts/my-second-post.md
chuiliu/gatsby-site
ae45dff09d2c3d5d1937fa0dee1a5bec72d5e3c6
[ "MIT" ]
null
null
null
--- path: "/posts/my-second-post" date: "2018-03-20" title: "Second post" tags: ["animals", "zoos"] --- Has anyone heard about GatsbyJS yet?
15.777778
36
0.65493
eng_Latn
0.661445
0c259992f9eebd834db1cbc6ec3395c255a8613f
3,646
md
Markdown
README.md
moteef/jest-sonar-reporter
bf282b1d53442c83d3b6f95d08a6384815d818e2
[ "MIT" ]
null
null
null
README.md
moteef/jest-sonar-reporter
bf282b1d53442c83d3b6f95d08a6384815d818e2
[ "MIT" ]
null
null
null
README.md
moteef/jest-sonar-reporter
bf282b1d53442c83d3b6f95d08a6384815d818e2
[ "MIT" ]
2
2018-06-08T13:53:18.000Z
2021-02-10T12:20:30.000Z
# jest-sonar-reporter

[![Build Status](https://travis-ci.org/3dmind/jest-sonar-reporter.svg?branch=master)](https://travis-ci.org/3dmind/jest-sonar-reporter) [![Quality Gate](https://sonarcloud.io/api/project_badges/measure?project=jest-sonar-reporter&metric=alert_status)](https://sonarcloud.io/dashboard?id=jest-sonar-reporter)

jest-sonar-reporter is a custom results processor for Jest. The processor converts Jest's output into Sonar's [generic test data](https://docs.sonarqube.org/display/SONAR/Generic+Test+Data) format.

## Installation

Using npm:

```bash
$ npm i -D jest-sonar-reporter
```

Using yarn:

```bash
$ yarn add -D jest-sonar-reporter
```

## Configuration

Configure Jest in your `package.json` to use `jest-sonar-reporter` as a custom results processor.

```json
{
  "jest": {
    "testResultsProcessor": "jest-sonar-reporter"
  }
}
```

Configure Sonar to import the test results. Add the `sonar.testExecutionReportPaths` property to your `sonar-project.properties` file.

```properties
sonar.testExecutionReportPaths=test-report.xml
```

## Customization

To customize the reporter, use `package.json` to store the configuration. Create a `jestSonar` entry like this:

```json
{
  "jestSonar": {}
}
```

You can customize the following options:

- `reportPath` (default: `root`, type: `string`) - Specifies the path to put the report in.
- `reportFile` (default: `'test-report.xml'`, type: `string`) - Specifies the file name of the report.
- `indent` (default: `2`, type: `number`) - Specifies the indentation used to format the report.
- `useRelativePath` (default: `false`, type: `boolean`) - Creates a relative path for the file element in the report.

```json
{
  "jestSonar": {
    "reportPath": "reports",
    "reportFile": "test-reporter.xml",
    "indent": 4,
    "useRelativePath": false
  }
}
```

> Important: Don't forget to update `sonar.testExecutionReportPaths` when you use a custom path and file name.

### Support for Sonarqube 5.6.x

Sonarqube 5.6.x does not support [Generic Test Data](https://docs.sonarqube.org/display/SONAR/Generic+Test+Data); however, it has a [Generic Test Coverage plugin](https://docs.sonarqube.org/display/PLUG/Generic+Test+Coverage) which offers similar functionality. If you have the plugin installed on Sonarqube, you can configure this reporter to produce files in the supported format.

```json
{
  "jestSonar": {
    "sonar56x": true
  }
}
```

Configure Sonar to import the test results. Add the `sonar.genericcoverage.unitTestReportPaths` property to your `sonar-project.properties` file.

```properties
sonar.genericcoverage.unitTestReportPaths=test-report.xml
```

### Support for different configuration environments

To support different environments, add the `env` property to the configuration and overwrite the value of the option you want to modify for the specific environment. You can overwrite the following configuration options: `reportPath`, `reportFile`, `indent`, `sonar56x`

For example: Overwrite the path where the report will be stored.

```json
{
  "jestSonar": {
    "reportPath": "reports",
    "reportFile": "test-reporter.xml",
    "indent": 4,
    "env": {
      "test": {
        "reportPath": "reports-test"
      }
    }
  }
}
```

Use the `NODE_ENV` variable to activate the environment-specific configuration.

```shell
NODE_ENV=test npm run test
```

## Usage

1. Run Jest to execute your tests.

   Using npm:

   ```bash
   $ npm run test
   ```

   Using yarn:

   ```bash
   $ yarn run test
   ```

2. Run sonar-scanner to import the test results.

   ```bash
   $ sonar-scanner
   ```

## Licence

This project uses the [MIT](LICENSE) licence.
24.972603
259
0.721613
eng_Latn
0.808543
0c25b08d642cbd225d737dc24fb466490a57a1db
783
md
Markdown
README.md
ianvgs/backstk
16a346810891e7b8bf5ee7edadacc267bfd740c7
[ "MIT" ]
null
null
null
README.md
ianvgs/backstk
16a346810891e7b8bf5ee7edadacc267bfd740c7
[ "MIT" ]
null
null
null
README.md
ianvgs/backstk
16a346810891e7b8bf5ee7edadacc267bfd740c7
[ "MIT" ]
null
null
null
# backstk

BACK-END running at: https://bkstk.herokuapp.com/

Repository: https://github.com/ianvgs/backstk

Description: back end in NODE.JS and Express.js, with Sequelize as the ORM and PostgreSQL as the database.

POST requests are received on the "/fac" route of the host (https://github.com/ianvgs/backstk/fac).

If you want to install it locally, create a '.env' file in the project root with the following values for the database configuration:

DATABASE=exemplo
DB_HOST=exemplo
DB_PASSWORD=exemplo
DB_PORT=exemplo
DB_USER=exemplo

Run "npm start" to start the project, then make a single POST (JSON) request to the '/admin/' route with a parameter "acao" set to the value "aapl" to generate the first record in the table that is created automatically when the project runs.

Thanks, and cheers. Ian Guedes
34.043478
234
0.785441
por_Latn
0.997132
0c265d1058bc8e7af66ee337428836816d0aea3e
8,128
md
Markdown
articles/sql-database/sql-database-ssms-mfa-authentication.md
OpenLocalizationTestOrg/azure-docs-pr15_nl-BE
0820e32985e69325be0aaa272636461e11ed9eca
[ "CC-BY-3.0", "CC-BY-4.0", "MIT" ]
null
null
null
articles/sql-database/sql-database-ssms-mfa-authentication.md
OpenLocalizationTestOrg/azure-docs-pr15_nl-BE
0820e32985e69325be0aaa272636461e11ed9eca
[ "CC-BY-3.0", "CC-BY-4.0", "MIT" ]
null
null
null
articles/sql-database/sql-database-ssms-mfa-authentication.md
OpenLocalizationTestOrg/azure-docs-pr15_nl-BE
0820e32985e69325be0aaa272636461e11ed9eca
[ "CC-BY-3.0", "CC-BY-4.0", "MIT" ]
null
null
null
<properties pageTitle="Ondersteuning van SSMS voor Azure AD MVR gesloten met de SQL-Database en SQL Data Warehouse | Microsoft Azure" description="Rekening gehouden met Multi verificatie gebruiken met SSMS voor SQL-Database en SQL datawarehouse." services="sql-database" documentationCenter="" authors="BYHAM" manager="jhubbard" editor="" tags=""/> <tags ms.service="sql-database" ms.devlang="na" ms.topic="article" ms.tgt_pltfrm="na" ms.workload="data-management" ms.date="10/04/2016" ms.author="[email protected]"/> # <a name="ssms-support-for-azure-ad-mfa-with-sql-database-and-sql-data-warehouse"></a>SSMS-ondersteuning voor Azure AD MVR gesloten met de SQL-Database en SQL Data Warehouse Azure SQL-Database en Azure SQL Data Warehouse nu ondersteuning voor verbindingen van SQL Server Management Studio (SSMS) met *Universele verificatie van Active Directory*. Universele verificatie van Active Directory is een interactieve procesverloop die *Azure meerledige verificatie* (MVR gesloten) ondersteunt. Azure MVR gesloten helpt beschermen toegang tot gegevens en toepassingen voldoen aan eisen van de gebruiker voor een eenvoudig aanmelden proces. Sterke verificatie met tal van opties voor eenvoudige verificatie levert, telefoongesprek, tekstbericht, smartcards met pin of mobiele app-meldingen, zodat gebruikers kunnen kiezen hoe ze de voorkeur geven. Zie voor een beschrijving van een meerledige verificatie [Meerledige verificatie](../multi-factor-authentication/multi-factor-authentication.md). SSMS biedt nu ondersteuning voor: - Interactieve MVR gesloten met Azure Active Directory met de mogelijkheden voor de validatie van een pop-upvenster. - Niet-interactieve Active Directory-wachtwoord en geïntegreerde verificatie van Active Directory methoden die kunnen worden gebruikt in verschillende toepassingen (ADO.NET, JDBC, ODBC, enz.). Deze twee methoden nooit leiden tot pop-updialoogvensters. Als de gebruikersaccount is geconfigureerd voor de MVR gesloten is de werkstroom interactief verificatie vereist gebruikersinteractie via pop-updialoogvensters, het gebruik van smartcards, enz. Als de gebruikersaccount is geconfigureerd voor de MVR gesloten, moet de gebruiker universele Azure-verificatie om verbinding te selecteren. Als de gebruikersaccount geen MVR gesloten vereist, kan de gebruiker nog steeds de andere twee Azure Active Directory-verificatie-opties gebruiken. ## <a name="universal-authentication-limitations-for-sql-database-and-sql-data-warehouse"></a>Universele verificatie beperkingen voor SQL-Database en SQL Data Warehouse - SSMS is het enige hulpmiddel voor MVR gesloten door middel van universele verificatie van Active Directory ingeschakeld. - Slechts één Azure Active Directory-account kunt aanmelden voor een instantie van SSMS Universal-verificatie. Te melden als een andere Azure AD-account moet u een ander exemplaar van SSMS. (Deze beperking is beperkt tot universele verificatie van Active Directory, kunt u aanmelden bij verschillende servers met Active Directory-wachtwoordverificatie of geïntegreerde verificatie van Active Directory SQL Server-verificatie). - SSMS biedt ondersteuning voor universele verificatie van Active Directory voor visualisatie in Object Explorer met Query Editor en Query opslaan. - Ondersteuning voor universele verificatie DacFx noch de ontwerper van het Schema. - MSA-accounts worden niet ondersteund voor universele verificatie van Active Directory. 
- Universele verificatie van Active Directory wordt niet ondersteund in SSMS voor gebruikers die worden geïmporteerd in de huidige Active Directory uit andere Azure Active Directories. Deze gebruikers worden niet ondersteund omdat het vereist een huurder-ID voor het valideren van de rekeningen en er geen mechanisme is te bepalen dat. - Er zijn geen extra software-vereisten voor verificatie van Active Directory Universal, behalve dat u een ondersteunde versie van SSMS moet gebruiken. ## <a name="configuration-steps"></a>Configuratiestappen Meerledige verificatie implementeren, moet vier eenvoudige stappen. 1. **Een Azure Active Directory configureren** : Zie [identiteiten in ruimten met Azure Active Directory integreren](../active-directory/active-directory-aadconnect.md), [toevoegen van uw eigen domeinnaam naar Azure AD](https://azure.microsoft.com/blog/2012/11/28/windows-azure-now-supports-federation-with-windows-server-active-directory/) [nu Microsoft Azure federation met Windows Server Active Directory ondersteunt](https://azure.microsoft.com/blog/2012/11/28/windows-azure-now-supports-federation-with-windows-server-active-directory/), [beheert de directory Azure AD](https://msdn.microsoft.com/library/azure/hh967611.aspx)en [Azure AD met Windows PowerShell beheren](https://msdn.microsoft.com/library/azure/jj151815.aspx)voor meer informatie. 2. **MVR gesloten configureren** : Zie [Azure meerledige verificatie configureren](../multi-factor-authentication/multi-factor-authentication-whats-next.md)voor stapsgewijze instructies. 3. **SQL-Database configureren of SQL Data Warehouse voor Azure AD verificatie** – Zie [verbinding maken met een SQL-Database of SQL gegevens magazijn met behulp van Azure Active Directory-verificatie](sql-database-aad-authentication.md)voor stapsgewijze instructies. 4. **SSMS downloaden** : download de nieuwste SSMS op de clientcomputer (ten minste augustus 2016), van de [Download SQL Server Management Studio (SSMS)](https://msdn.microsoft.com/library/mt238290.aspx). ## <a name="connecting-by-using-universal-authentication-with-ssms"></a>Verbinding maken met behulp van universele verificatie met SSMS De volgende stappen laten zien hoe verbinding met de SQL-Database of SQL Data Warehouse met behulp van de nieuwste SSMS. 1. Voor gebruik van universele-verificatie in het dialoogvenster **verbinding maken met Server** , selecteert u de **Universele verificatie van Active Directory**. ![1mfa universele verbinding][1] 2. Zoals gebruikelijk voor SQL-Database en SQL Data Warehouse moet u op **Opties** en de database opgeven in het dialoogvenster **Opties** . Klik op **verbinding maken**. 3. Wanneer het dialoogvenster **aanmelden bij uw account** wordt weergegeven, geven de account en het wachtwoord van uw identiteit Azure Active Directory. ![2mfa-aanmelden][2] > [AZURE.NOTE] Voor universele verificatie met een account die niet nodig MVR gesloten u op dit moment. Voor gebruikers die behoefte hebben MVR gesloten, gaat u verder met de volgende stappen uit. 4. Twee MVR gesloten setup dialoogvensters kunnen worden weergegeven. Dit één keer bewerking is afhankelijk van de instelling beheerder van de MVR gesloten en daarom is niet verplicht. Voor een domein van de MVR gesloten ingeschakeld in deze stap is het soms vooraf gedefinieerde (bijvoorbeeld het domein gebruikers gebruiken een smartcard en de pincode vereist). ![3mfa setup][3] 5. De tweede mogelijke één keer in het dialoogvenster kunt u de details van uw verificatiemethode selecteren. 
De mogelijke opties worden geconfigureerd door de beheerder. ![4mfa-Controleer of-1][4] 6. Azure Active Directory stuurt de gegevens waarin u. Wanneer u de verificatiecode ontvangt, voert u deze in het vak **Voer de verificatiecode** en klik op **aanmelden**. ![5mfa-Controleer of 2][5] Wanneer de controle is voltooid, verbindt SSMS normaal vermoeden geldige referenties en toegang tot de firewall. ##<a name="next-steps"></a>Volgende stappen Anderen toegang verlenen tot uw database: [-Database SQL-verificatie en autorisatie: toegang verlenen](sql-database-manage-logins.md) Zorg ervoor dat andere gebruikers verbinding kunnen maken via de firewall: [een Azure SQL Database server niveau firewallregel met behulp van de portal Azure configureren](sql-database-configure-firewall-settings.md) [1]: ./media/sql-database-ssms-mfa-auth/1mfa-universal-connect.png [2]: ./media/sql-database-ssms-mfa-auth/2mfa-sign-in.png [3]: ./media/sql-database-ssms-mfa-auth/3mfa-setup.png [4]: ./media/sql-database-ssms-mfa-auth/4mfa-verify-1.png [5]: ./media/sql-database-ssms-mfa-auth/5mfa-verify-2.png
91.325843
811
0.805487
nld_Latn
0.997541
0c26a88f8ad20991e2f674c400b5d6671e5229b4
12,260
md
Markdown
articles/active-directory/saas-apps/foodee-provisioning-tutorial.md
chclaus/azure-docs.de-de
38be052bda16366997a146cf5589168d6c6f3387
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/active-directory/saas-apps/foodee-provisioning-tutorial.md
chclaus/azure-docs.de-de
38be052bda16366997a146cf5589168d6c6f3387
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/active-directory/saas-apps/foodee-provisioning-tutorial.md
chclaus/azure-docs.de-de
38be052bda16366997a146cf5589168d6c6f3387
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: 'Tutorial: Konfigurieren von Foodee für die automatische Benutzerbereitstellung mithilfe von Azure Active Directory | Microsoft-Dokumentation' description: Erfahren Sie, wie Sie Azure Active Directory zum automatischen Bereitstellen und Aufheben der Bereitstellung von Benutzerkonten in Foodee konfigurieren. services: active-directory documentationcenter: '' author: zchia writer: zchia manager: beatrizd ms.assetid: fb48deae-4653-448a-ba2f-90258edab3a7 ms.service: active-directory ms.subservice: saas-app-tutorial ms.workload: identity ms.tgt_pltfrm: na ms.devlang: na ms.topic: article ms.date: 08/30/2019 ms.author: Zhchia ms.openlocfilehash: abf2a752eaf0f1d0a9a8b07072dfc0b4c1ae45b7 ms.sourcegitcommit: 80da36d4df7991628fd5a3df4b3aa92d55cc5ade ms.translationtype: HT ms.contentlocale: de-DE ms.lasthandoff: 10/02/2019 ms.locfileid: "71812722" --- # <a name="tutorial-configure-foodee-for-automatic-user-provisioning"></a>Tutorial: Konfigurieren von Foodee für die automatische Benutzerbereitstellung In diesem Artikel wird gezeigt, wie Sie Azure Active Directory (Azure AD) in Foodee und Azure AD für die automatische Bereitstellung oder Aufhebung der Bereitstellung von Benutzern oder Gruppen in Foodee konfigurieren. > [!NOTE] > Der Artikel enthält die Beschreibung eines Connectors, der auf dem Benutzerbereitstellungsdienst von Azure AD basiert. Wenn Sie sich über den Zweck und die Funktionsweise dieses Diensts informieren sowie Antworten auf häufig gestellte Fragen erhalten möchten, lesen Sie [Automatisieren der Bereitstellung und Bereitstellungsaufhebung von Benutzern für SaaS-Anwendungen mit Azure Active Directory](../manage-apps/user-provisioning.md). > > Dieser Connector befindet sich derzeit in der Vorschauversion. Weitere Informationen zu den Nutzungsbedingungen von Azure für Previewfunktionen finden Sie unter [Zusätzliche Nutzungsbestimmungen für Microsoft Azure-Vorschauen](https://azure.microsoft.com/support/legal/preview-supplemental-terms/). ## <a name="prerequisites"></a>Voraussetzungen In diesem Tutorial wird davon ausgegangen, dass Sie die folgenden Voraussetzungen erfüllt haben und verfügen über: * Einen Azure AD-Mandanten * [Einen Foodee-Mandanten](https://www.food.ee/about/) * Ein Benutzerkonto in Foodee mit Administratorberechtigungen ## <a name="assign-users-to-foodee"></a>Zuweisen von Benutzern zu Foodee Azure AD ermittelt anhand von *Zuweisungen*, welche Benutzer Zugriff auf ausgewählte Apps erhalten sollen. Im Kontext der automatischen Benutzerbereitstellung werden nur die Benutzer oder Gruppen synchronisiert, die einer Anwendung in Azure AD zugewiesen wurden. Vor dem Konfigurieren und Aktivieren der automatischen Benutzerbereitstellung müssen Sie entscheiden, welche Benutzer oder Gruppen in Azure AD Zugriff auf Foodee benötigen. Nachdem Sie Ihre Entscheidung getroffen haben, können Sie diese Benutzer oder Gruppen entsprechend den Anleitungen unter [Zuweisen eines Benutzers oder einer Gruppe zu einer Unternehmens-App](../manage-apps/assign-user-or-group-access-portal.md) Foodee zuweisen. ## <a name="important-tips-for-assigning-users-to-foodee"></a>Wichtige Tipps zum Zuweisen von Benutzern zu Foodee Beachten Sie beim Zuweisen von Benutzern die folgenden Tipps: * Wir empfehlen Ihnen, Foodee zuerst nur einen einzelnen Azure AD-Benutzer zuzuweisen, um die Konfiguration der automatischen Benutzerbereitstellung zu testen. Später können Sie dann weitere Benutzer oder Gruppen zuweisen. 
* Wenn Sie Foodee einen Benutzer zuweisen, wählen Sie im Bereich **Zuweisung** eine gültige anwendungsspezifische Rolle aus (sofern verfügbar). Benutzer mit der Rolle *Standardzugriff* werden von der Bereitstellung ausgeschlossen. ## <a name="set-up-foodee-for-provisioning"></a>Einrichten von Foodee für die Bereitstellung Bevor Sie Foodee für die automatische Benutzerbereitstellung mithilfe von Azure AD konfigurieren, müssen Sie die SCIM-Bereitstellung (System for Cross-Domain Identity Management, System für domänenübergreifendes Identitätsmanagement) in Foodee aktivieren. 1. Melden Sie sich bei [Foodee](https://www.food.ee/login/) an, und wählen Sie dann Ihre Mandanten-ID aus. ![Foodee](media/Foodee-provisioning-tutorial/tenant.png) 1. Wählen Sie unter **Enterprise Portal** (Unternehmensportal) die Option **Single Sign On** (Einmaliges Anmelden) aus. ![Das Foodee-Menü „Enterprise Portal“ im linken Bereich](media/Foodee-provisioning-tutorial/scim.png) 1. Kopieren Sie den Wert im Feld **API-Token** zur späteren Verwendung. Sie werden ihn im Azure-Portal auf der Registerkarte **Bereitstellung** Ihrer Foodee-Anwendung im Feld **Geheimes Token** eingeben. ![Foodee](media/Foodee-provisioning-tutorial/token.png) ## <a name="add-foodee-from-the-gallery"></a>Hinzufügen von Foodee aus dem Katalog Um Foodee für die automatische Benutzerbereitstellung mithilfe von Azure AD konfigurieren zu können, müssen Sie Ihrer Liste der verwalteten SaaS-Anwendungen Foodee aus dem Azure AD-Anwendungskatalog hinzufügen. Führen Sie die folgenden Schritte aus, um Foodee aus dem Azure AD-Anwendungskatalog hinzuzufügen: 1. Wählen Sie im linken Bereich des [Azure-Portals](https://portal.azure.com) die Option **Azure Active Directory** aus. ![Der Befehl „Azure Active Directory“](common/select-azuread.png) 1. Wählen Sie **Unternehmensanwendungen** > **Alle Anwendungen**. ![Bereich „Unternehmensanwendungen“](common/enterprise-applications.png) 1. Wählen Sie oben im Bereich **Neue Anwendung** aus, um eine neue Anwendung hinzuzufügen. ![Schaltfläche „Neue Anwendung“](common/add-new-app.png) 1. Geben Sie im Suchfeld **Foodee** ein, wählen Sie im Ergebnisbereich **Foodee** und dann **Hinzufügen** aus, um die Anwendung hinzuzufügen. ![„Foodee“ in der Ergebnisliste](common/search-new-app.png) ## <a name="configure-automatic-user-provisioning-to-foodee"></a>Konfigurieren der automatischen Benutzerbereitstellung in Foodee In diesem Abschnitt konfigurieren Sie den Azure AD-Bereitstellungsdienst für das Erstellen, Aktualisieren und Deaktivieren von Benutzern oder Gruppen in Foodee auf der Grundlage von Benutzer- oder Gruppenzuweisungen in Azure AD. > [!TIP] > Sie können auch das SAML-basierte einmalige Anmelden für Foodee aktivieren. Folgen Sie dazu den Anleitungen unter [Tutorial: Integrieren von Foodee in Azure Active Directory](Foodee-tutorial.md). Sie können einmaliges Anmelden unabhängig von der automatischen Benutzerbereitstellung konfigurieren, obwohl diese beiden Features einander ergänzen. Konfigurieren Sie die automatische Benutzerbereitstellung für Foodee in Azure AD, indem Sie die folgenden Schritte ausführen: 1. Wählen Sie im [Azure-Portal](https://portal.azure.com) **Unternehmensanwendungen** > **Alle Anwendungen** aus. ![Bereich für Unternehmensanwendungen](common/enterprise-applications.png) 1. Wählen Sie in der Liste **Anwendungen** den Eintrag **Foodee** aus. ![Der „Foodee“-Link in der Liste „Anwendungen“](common/all-applications.png) 1. Wählen Sie die Registerkarte **Bereitstellung**. 
![Registerkarte „Bereitstellung“](common/provisioning.png) 1. Wählen Sie in der Dropdownliste **Bereitstellungsmodus** den Eintrag **Automatisch** aus. ![Registerkarte „Bereitstellung“](common/provisioning-automatic.png) 1. Führen Sie unter **Administratoranmeldeinformationen** die folgenden Schritte aus: a. Geben Sie im Feld **Mandanten-URL** den zuvor abgerufenen Wert **https://concierge.food.ee/scim/v2** ein. b. Geben Sie im Feld **Geheimes Token** den zuvor abgerufenen Wert für **API Token** ein. c. Um sicherzustellen, dass Azure AD eine Verbindung mit Foodee herstellen kann, wählen Sie **Verbindung testen** aus. Wenn die Verbindung nicht hergestellt werden kann, stellen Sie sicher, dass Ihr Foodee-Konto über Administratorberechtigungen verfügt, und versuchen Sie es dann noch einmal. ![Der Link „Verbindung testen“](common/provisioning-testconnection-tenanturltoken.png) 1. Geben Sie im Feld **Benachrichtigungs-E-Mail** die E-Mail-Adresse einer Person oder Gruppe ein, die Benachrichtigungen zu Bereitstellungsfehlern erhalten soll. Aktivieren Sie dann das Kontrollkästchen **Bei Fehler E-Mail-Benachrichtigung senden**. ![Das Textfeld „Benachrichtigungs-E-Mail“](common/provisioning-notification-email.png) 1. Wählen Sie **Speichern** aus. 1. Wählen Sie unter **Zuordnungen** die Option **Azure Active Directory-Benutzer mit Foodee synchronisieren** aus. ![Foodee-Benutzerzuordnungen](media/Foodee-provisioning-tutorial/usermapping.png) 1. Überprüfen Sie unter **Attributzuordnungen** die Benutzerattribute, die von Azure AD mit Foodee synchronisiert werden. Die als **übereinstimmende** Eigenschaften ausgewählten Attribute werden bei Aktualisierungsvorgängen für den Abgleich der *Benutzerkonten* in Foodee verwendet. ![Foodee-Benutzerzuordnungen](media/Foodee-provisioning-tutorial/userattribute.png) 1. Wenn Sie Ihre Änderungen committen möchten, wählen Sie **Speichern** aus. 1. Wählen Sie unter **Zuordnungen** die Option **Azure Active Directory-Gruppen mit Foodee synchronisieren** aus. ![Foodee-Benutzerzuordnungen](media/Foodee-provisioning-tutorial/groupmapping.png) 1. Überprüfen Sie unter **Attributzuordnungen** die Benutzerattribute, die von Azure AD mit Foodee synchronisiert werden. Die als **übereinstimmende** Eigenschaften ausgewählten Attribute werden bei Aktualisierungsvorgängen für den Abgleich der *Gruppenkonten* in Foodee verwendet. ![Foodee-Benutzerzuordnungen](media/Foodee-provisioning-tutorial/groupattribute.png) 1. Wenn Sie Ihre Änderungen committen möchten, wählen Sie **Speichern** aus. 1. Konfigurieren Sie die Bereichsfilter. Informationen zur Vorgehensweise finden Sie in den Anleitungen unter [Attributbasierte Anwendungsbereitstellung mit Bereichsfiltern](../manage-apps/define-conditional-rules-for-provisioning-user-accounts.md). 1. Um den Azure AD-Bereitstellungsdienst für Foodee zu aktivieren, ändern Sie im Abschnitt **Einstellungen** den **Bereitstellungsstatus** in **Ein**. ![Der Schalter „Bereitstellungsstatus“](common/provisioning-toggle-on.png) 1. Definieren Sie unter **Einstellungen** in der Dropdownliste **Bereich** die Benutzer oder Gruppen, die Sie für Foodee bereitstellen möchten. ![Die Dropdownliste für den Bereitstellungsbereich](common/provisioning-scope.png) 1. Wählen Sie **Speichern** aus, wenn die Bereitstellung erfolgen kann. 
![Die Schaltfläche „Speichern“ für die Bereitstellungskonfiguration](common/provisioning-configuration-save.png) Durch diesen Vorgang wird die Erstsynchronisierung der Benutzer oder Gruppen gestartet, die Sie in der Dropdownliste **Bereich** definiert haben. Die Erstsynchronisierung nimmt mehr Zeit in Anspruch als die nachfolgenden Synchronisierungen. Weitere Informationen finden Sie unter [Wie lange dauert die Bereitstellung von Benutzern?](../manage-apps/application-provisioning-when-will-provisioning-finish-specific-user.md#how-long-will-it-take-to-provision-users). Sie können den Abschnitt **Aktueller Status** verwenden, um den Fortschritt zu überwachen und den Links zu Ihrem Bereitstellungsaktivitätsbericht zu folgen. Der Bericht beschreibt alle vom Azure AD-Bereitstellungsdienst für Foodee ausgeführten Aktionen. Weitere Informationen finden Sie unter [Ermitteln, wann ein bestimmter Benutzer auf eine Anwendung zugreifen kann](../manage-apps/application-provisioning-when-will-provisioning-finish-specific-user.md). Informationen zum Lesen der Azure AD-Bereitstellungsprotokolle finden Sie unter [Tutorial: Berichterstellung zur automatischen Benutzerkontobereitstellung](../manage-apps/check-status-user-account-provisioning.md). ## <a name="additional-resources"></a>Zusätzliche Ressourcen * [Verwalten der Benutzerkontobereitstellung für Unternehmens-Apps im Azure-Portal](../manage-apps/configure-automatic-user-provisioning-portal.md) * [Was bedeuten Anwendungszugriff und einmaliges Anmelden mit Azure Active Directory?](../manage-apps/what-is-single-sign-on.md) ## <a name="next-steps"></a>Nächste Schritte * [Erfahren Sie, wie Sie Protokolle überprüfen und Berichte zu Bereitstellungsaktivitäten abrufen.](../manage-apps/check-status-user-account-provisioning.md)
68.49162
672
0.804894
deu_Latn
0.991648
0c277045d67cb611e2d3054e100eb6551a99abab
2,737
md
Markdown
info/Qt/allQT.md
nhyilin/devNotes
8daaedb1238afc1e8eb41f231ee01fe19b0d07b9
[ "MIT" ]
null
null
null
info/Qt/allQT.md
nhyilin/devNotes
8daaedb1238afc1e8eb41f231ee01fe19b0d07b9
[ "MIT" ]
null
null
null
info/Qt/allQT.md
nhyilin/devNotes
8daaedb1238afc1e8eb41f231ee01fe19b0d07b9
[ "MIT" ]
null
null
null
# Official examples

# Signals and slots

A **signal** is an event emitted under specific conditions. For example, the most common signal of a PushButton is the clicked() signal emitted when it is clicked with the mouse, and the most common signal of a ComboBox is the currentIndexChanged() signal emitted when the selected list item changes. GUI programming is largely about responding to the signals of the widgets on the interface: you only need to know which signals are emitted under which conditions and then respond to and handle them appropriately.

A **slot** is a function that responds to a signal. A slot is an ordinary C++ function: it can be defined in any section of a class (public, private, or protected), can take any parameters, and can be called directly. What distinguishes a slot from a plain function is that it can be connected to a signal; when the signal is emitted, the connected slot is executed automatically.

**Multiple signals can be connected to the same slot.**

The most common general form of connect():

```c++
connect(sender, signal, receiver, slot);
```

The five overloads:

```c++
QMetaObject::Connection connect(const QObject *, const char *, const QObject *, const char *, Qt::ConnectionType);
QMetaObject::Connection connect(const QObject *, const QMetaMethod &, const QObject *, const QMetaMethod &, Qt::ConnectionType);
QMetaObject::Connection connect(const QObject *, const char *, const char *, Qt::ConnectionType) const;
QMetaObject::Connection connect(const QObject *, PointerToMemberFunction, const QObject *, PointerToMemberFunction, Qt::ConnectionType)
QMetaObject::Connection connect(const QObject *, PointerToMemberFunction, Functor);
```

This general form can be used to analyze the five overloads above. In the first, sender is a const QObject *, signal is a const char *, receiver is a const QObject *, and slot is a const char *; this overload treats the signal and slot as strings. In the second, sender and receiver are again const QObject *, but signal and slot are const QMetaMethod &; every function can be regarded as a QMetaMethod, so this form allows type checking through QMetaMethod. In the third, sender is again const QObject * and signal and slot are const char *, but there is no receiver; this overload uses the this pointer as the receiver. In the fourth, sender and receiver are both const QObject *, while signal and slot are of type PointerToMemberFunction; as the name suggests, these are pointers to member functions. In the fifth, the first two parameters are the same as before, and the last parameter is a Functor, which accepts static functions, global functions, and lambda expressions.

From this we can see that in connect() the sender and receiver are always plain QObject pointers; the overloads differ mainly in the form of the signal and slot. In our example we are clearly using the fifth overload: the last argument is QApplication's static quit() function. That is, when the button emits the clicked() signal, QApplication's quit() function is called and the program exits.

Alternatively, with the Qt 5 signal/slot syntax, we can connect an object's signal to a lambda expression, for example:

```c++
// !!! Qt 5
#include <QApplication>
#include <QPushButton>
#include <QDebug>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    QPushButton button("Quit");
    // [](bool) {qDebug() << "You clicked me!";} is the Functor (a lambda) of the fifth overload
    QObject::connect(&button, &QPushButton::clicked, [](bool) {qDebug() << "You clicked me!";});
    button.show();
    return app.exec();
}
```

# Custom signals and slots

(A minimal sketch of declaring and connecting a custom signal and slot is given below, after the build-problems section.)

# Odd build problems

1. `multiple definition of qInitResources`
   Solution: the relevant explanation on the Qt site is that the .qrc resource was referenced twice. The original explanation reads: `u must not add this generated file to ur SOURCE since it's automatically added when adding the .qrc file to RECOURES. Therefore the file is compoled (and linked) twice`. I commented out the duplicate reference in the .pro file, `#RESOURCES += FastDemoWidget.qrc`, which avoids the double inclusion; all done well.
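Since the "Custom signals and slots" section above is empty in the original note, here is a minimal sketch only (the `Counter` class, its `valueChanged` signal, and the lambda receiver are illustrative names, not from the original text). A custom signal is declared in the `signals:` section of a `Q_OBJECT` class, emitted with `emit`, and connected with the same pointer-to-member and Functor overloads discussed above:

```c++
// counter.h -- illustrative sketch, not from the original note
#include <QObject>
#include <QDebug>

class Counter : public QObject {
    Q_OBJECT                 // required so the moc generates the signal/slot metadata
public:
    int value() const { return m_value; }

public slots:
    void setValue(int value) {            // an ordinary member function marked as a slot
        if (value != m_value) {
            m_value = value;
            emit valueChanged(m_value);   // emit the custom signal
        }
    }

signals:
    void valueChanged(int newValue);      // declaration only; the moc generates the body

private:
    int m_value = 0;
};
```

Connecting it uses the same Qt 5 syntax shown earlier:

```c++
// e.g. in main() after QApplication has been created
Counter a, b;
QObject::connect(&a, &Counter::valueChanged, &b, &Counter::setValue);   // member-function slot
QObject::connect(&a, &Counter::valueChanged,
                 [](int v) { qDebug() << "value is now" << v; });       // Functor/lambda receiver
a.setValue(12);   // b.value() becomes 12 and the lambda prints "value is now 12"
```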
27.646465
587
0.763975
yue_Hant
0.477047
0c2775ad76bf5428e974fe9a6981cf0fa2aed685
3,074
md
Markdown
articles/key-vault/general/keyvault-moveresourcegroup.md
gschrijvers/azure-docs.nl-nl
e46af0b9c1e4bb7cb8088835a8104c5d972bfb78
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/key-vault/general/keyvault-moveresourcegroup.md
gschrijvers/azure-docs.nl-nl
e46af0b9c1e4bb7cb8088835a8104c5d972bfb78
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/key-vault/general/keyvault-moveresourcegroup.md
gschrijvers/azure-docs.nl-nl
e46af0b9c1e4bb7cb8088835a8104c5d972bfb78
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Azure Key Vault moving a vault to another resource group | Microsoft Docs
description: Guidance for moving a key vault to another resource group.
services: key-vault
author: ShaneBala-keyvault
manager: ravijan
tags: azure-resource-manager
ms.service: key-vault
ms.subservice: general
ms.topic: conceptual
ms.date: 04/29/2020
ms.author: sudbalas
Customer intent: As a key vault administrator, I want to move my vault to another resource group.
ms.openlocfilehash: bbc27af9eb448911093473d6ab20fde8004c7b88
ms.sourcegitcommit: 31236e3de7f1933be246d1bfeb9a517644eacd61
ms.translationtype: MT
ms.contentlocale: nl-NL
ms.lasthandoff: 05/04/2020
ms.locfileid: "82783719"
---
# <a name="moving-an-azure-key-vault-across-resource-groups"></a>Moving an Azure Key Vault across resource groups

## <a name="overview"></a>Overview

Moving a key vault across resource groups is a supported key vault feature. Moving a key vault between resource groups does not affect the key vault's firewall configuration or access policies. Connected applications and service principals should continue to work as intended.

## <a name="design-considerations"></a>Design considerations

Your organization may have implemented Azure Policy with enforcement or exclusions at the resource group level. There may be a different set of policy assignments in the resource group where your key vault currently exists and in the resource group you want to move the key vault to. A conflict in policy requirements can break your applications.

### <a name="example"></a>Example

You have an application connected to a key vault that creates certificates that are valid for two years. The resource group you are trying to move the key vault to has a policy assignment that blocks the creation of certificates that are valid for longer than one year. After you move the key vault to the new resource group, the operation to create a certificate that is valid for two years will be blocked by an Azure Policy assignment.

### <a name="solution"></a>Solution

Go to the Azure Policy page in the Azure portal and review the policy assignments for your current resource group as well as the resource group you are moving to, and make sure there are no differences.

## <a name="procedure"></a>Procedure

1. Sign in to the Azure portal
2. Navigate to your key vault
3. Click the Overview tab
4. Select the "Move" button
5. Select "Move to another resource group" from the drop-down options
6. Select the resource group you want to move the key vault to
7. Acknowledge the warning about moving resources
8. Select OK

Key Vault now evaluates the validity of the resource move and flags any errors. If no errors are found, the resource move is completed.
60.27451
475
0.81067
nld_Latn
0.99973
0c27a90cbac441dedfa79b10c325cfe856802965
18,419
md
Markdown
articles/active-directory/authentication/howto-sspr-reporting.md
ebarbosahsi/azure-docs.es-es
b6dbec832e5dccd7118e05208730a561103b357e
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/active-directory/authentication/howto-sspr-reporting.md
ebarbosahsi/azure-docs.es-es
b6dbec832e5dccd7118e05208730a561103b357e
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/active-directory/authentication/howto-sspr-reporting.md
ebarbosahsi/azure-docs.es-es
b6dbec832e5dccd7118e05208730a561103b357e
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: 'Informes del autoservicio de restablecimiento de contraseña: Azure Active Directory' description: Creación de informes sobre eventos de autoservicio de restablecimiento de contraseña de Azure AD services: active-directory ms.service: active-directory ms.subservice: authentication ms.topic: how-to ms.date: 02/01/2019 ms.author: justinha author: justinha manager: daveba ms.reviewer: rhicock ms.collection: M365-identity-device-management ms.openlocfilehash: 5ad1b8318e261c7dfef7fc125716736087a84bdc ms.sourcegitcommit: f28ebb95ae9aaaff3f87d8388a09b41e0b3445b5 ms.translationtype: HT ms.contentlocale: es-ES ms.lasthandoff: 03/30/2021 ms.locfileid: "104579185" --- # <a name="reporting-options-for-azure-ad-password-management"></a>Opciones de creación de informes para la administración de contraseñas de Azure AD Tras la implementación, muchas organizaciones quieren saber cómo se usa (y si se usa) realmente el autoservicio de restablecimiento de contraseña (SSPR). La característica de informes que proporciona Azure Active Directory (Azure AD) le ayuda a responder preguntas mediante informes creados previamente. Si está debidamente protegido por licencia, también puede crear consultas personalizadas. ![Generación de informes en SSPR mediante los registros de auditoría en Azure AD][Reporting] Las siguientes preguntas se pueden responder mediante informes existentes en [Azure Portal](https://portal.azure.com/): > [!NOTE] > Debe ser [administrador global](../roles/permissions-reference.md) y habilitar la opción para que estos datos se recopilen en nombre de la organización. Para habilitarla, debe visitar la pestaña **Informes** o los registros de auditoría como mínimo una vez. Hasta entonces, no se recopilan datos para la organización. > * ¿Cuántas personas se han registrado para el restablecimiento de contraseña? * ¿Quién se ha registrado para el restablecimiento de contraseña? * ¿Qué datos está registrando la gente? * ¿Cuántas personas han restablecido sus contraseñas en los últimos siete días? * ¿Cuáles son los métodos más comunes que usan los usuarios o administradores para restablecer sus contraseñas? * ¿Cuáles son los problemas comunes que encuentran los usuarios o administradores al intentar usar el restablecimiento de contraseña? * ¿Qué administradores restablecen sus propias contraseñas con frecuencia? * ¿Hay alguna actividad sospechosa en relación con el restablecimiento de contraseña? ## <a name="how-to-view-password-management-reports-in-the-azure-portal"></a>Visualización de los informes de administración de contraseñas en Azure Portal En la experiencia de Azure Portal, se ha mejorado la visualización de la actividad de restablecimiento de contraseña y de registro de restablecimiento de contraseña. Realice los pasos siguientes para encontrar los eventos de restablecimiento de contraseña y de registro de restablecimiento de contraseña: 1. Vaya a [Azure Portal](https://portal.azure.com). 2. Seleccione **Todos los servicios** en el panel izquierdo. 3. Busque **Azure Active Directory** en la lista de servicios y selecciónelo. 4. Seleccione **Usuarios** en la sección Administrar. 5. Seleccione **Registros de auditoría** en la hoja **Usuarios**. Se muestran todos los eventos de auditoría que se producen en todos los usuarios del directorio. Puede filtrar esta vista para ver también todos los eventos relacionados con contraseñas. 6. 
En el menú **Filtrar** de la parte superior del panel, seleccione la lista desplegable **Servicio** y cámbiela al tipo de servicio **Self-service Password Management** (Administración de contraseñas de autoservicio). 7. También puede filtrar la lista si elige la **actividad** específica que le interesa. ### <a name="combined-registration"></a>Registro combinado Si ha habilitado el [registro combinado](./concept-registration-mfa-sspr-combined.md), la información relativa a la actividad del usuario en los registros de auditoría se encontrará en **Seguridad** > **Métodos de autenticación**. ## <a name="description-of-the-report-columns-in-the-azure-portal"></a>Descripción de las columnas de informe en Azure Portal En la siguiente lista se explica en detalle cada una de las columnas de informe en Azure Portal: * **Usuario**: el usuario que ha intentado realizar una operación de registro de restablecimiento de contraseña. * **Rol**: el rol del usuario en el directorio. * **Fecha y hora**: fecha y hora del intento. * **Datos registrados**: los datos de autenticación que el usuario ha proporcionado durante el registro de restablecimiento de contraseña. ## <a name="description-of-the-report-values-in-the-azure-portal"></a>Descripción de los valores de informe en Azure Portal En la tabla siguiente se describen los distintos valores que se que puede establecer para cada columna en Azure Portal: | Columna | Valores permitidos y su significado | | --- | --- | | Datos registrados |**Correo electrónico alternativo**: el usuario ha utilizado un correo electrónico alternativo o un correo electrónico de autenticación para autenticarse.<p><p>**Teléfono del trabajo**: el usuario ha utilizado un teléfono de la oficina para autenticarse.<p>**Teléfono móvil**: el usuario ha utilizado un teléfono móvil o un teléfono de autenticación para autenticarse.<p>**Preguntas de seguridad**: el usuario ha utilizado preguntas de seguridad para autenticarse.<p>**Cualquier combinación de los métodos anteriores (por ejemplo, correo electrónico alternativo + teléfono móvil)** : se produce cuando se especifica una directiva de dos puertas y se muestran los dos métodos que el usuario ha utilizado para autenticar su solicitud de restablecimiento de contraseña. | ## <a name="self-service-password-management-activity-types"></a>Tipos de actividad de Self-service Password Management (Administración de contraseñas de autoservicio) Los siguientes tipos de actividad aparecen en la categoría de evento de auditoría **Self-service Password Management** (Administración de contraseñas de autoservicio): * [Bloqueado para el restablecimiento de contraseña de autoservicio](#activity-type-blocked-from-self-service-password-reset): indica que un usuario ha intentado restablecer una contraseña, usar una puerta específica o validar un número de teléfono más de cinco veces en total en 24 horas. * [Cambio de contraseña (autoservicio)](#activity-type-change-password-self-service): indica que un usuario ha realizado un cambio de contraseña voluntario o forzado (por expiración). * [Restablecimiento de contraseña (por parte del administrador)](#activity-type-reset-password-by-admin): indica que un administrador realizó un restablecimiento de contraseña en nombre de un usuario en Azure Portal. * [Restablecimiento de contraseña (autoservicio)](#activity-type-reset-password-self-service): indica que un usuario ha restablecido correctamente su contraseña en el [portal de restablecimiento de contraseña de Azure AD](https://passwordreset.microsoftonline.com). 
* [Self-serve password reset flow activity progress](#activity-type-self-serve-password-reset-flow-activity-progress): indicates each specific step that a user goes through as part of the password reset process (such as passing a specific password reset authentication gate).
* [Unlock a user account (self-service)](#activity-type-unlock-a-user-account-self-service): indicates that a user successfully unlocked their Active Directory account without resetting their password from the [Azure AD password reset portal](https://passwordreset.microsoftonline.com), by using the unlock-without-reset feature for Active Directory accounts.
* [User registered for self-service password reset](#activity-type-user-registered-for-self-service-password-reset): indicates that a user registered all the information required to reset their password in accordance with the currently specified tenant password reset policy.

### <a name="activity-type-blocked-from-self-service-password-reset"></a>Activity type: Blocked from self-service password reset

The following list explains this activity in detail:

* **Activity description**: indicates that a user attempted to reset a password, use a specific gate, or validate a phone number more than five times in total within 24 hours.
* **Activity actor**: the user who was blocked from performing further reset operations. This can be an end user or an administrator.
* **Activity target**: the user who was blocked from performing further reset operations. This can be an end user or an administrator.
* **Activity status**:
  * _Success_: indicates that the user was blocked from performing further resets, attempting any further authentication methods, or validating any further phone numbers for the next 24 hours.
* **Activity status failure reason**: Not applicable.

### <a name="activity-type-change-password-self-service"></a>Activity type: Change password (self-service)

The following list explains this activity in detail:

* **Activity description**: indicates that a user performed a voluntary or forced (due to expiration) password change.
* **Activity actor**: the user who changed their password. This can be an end user or an administrator.
* **Activity target**: the user who changed their password. This can be an end user or an administrator.
* **Activity statuses**:
  * _Success_: indicates that a user successfully changed their password.
  * _Failure_: indicates that a user failed to change their password. You can select the row to view the **Activity status reason** category and learn more about why the failure occurred.
* **Activity status failure reason**:
  * _FuzzyPolicyViolationInvalidPassword_: the user chose a password that was automatically banned because Microsoft's banned-password detection capabilities found it to be too common or particularly weak.
### <a name="activity-type-reset-password-by-admin"></a>Tipo de actividad: Restablecimiento de contraseña (por parte del administrador) En la lista siguiente se explica en detalle esta actividad: * **Descripción de la actividad**: indica que un administrador realizó un restablecimiento de contraseña en nombre de un usuario en Azure Portal. * **Actor de la actividad**: el administrador que ha realizado el restablecimiento de contraseña en nombre de otro administrador o usuario final. Debe ser un administrador global, un administrador de usuarios o un administrador del departamento de soporte técnico. * **Destino de la actividad**: usuario cuya contraseña se ha restablecido. Puede tratarse de un usuario final o de otro administrador. * **Estados de la actividad**: * _Correcto_: indica que un administrador ha restablecido correctamente la contraseña de un usuario. * _Error_: indica que un administrador no ha podido cambiar la contraseña de un usuario. Puede seleccionar la fila para ver la categoría **Motivo del estado de la actividad** y obtener más información sobre por qué se ha producido el error. ### <a name="activity-type-reset-password-self-service"></a>Tipo de actividad: Restablecimiento de contraseña (autoservicio) En la lista siguiente se explica en detalle esta actividad: * **Descripción de la actividad**: indica que un usuario ha restablecido correctamente su contraseña en el [portal de restablecimiento de contraseña de Azure AD](https://passwordreset.microsoftonline.com). * **Actor de la actividad**: usuario que ha restablecido la contraseña. Puede tratarse de un usuario final o de un administrador. * **Destino de la actividad**: usuario que ha restablecido la contraseña. Puede tratarse de un usuario final o de un administrador. * **Estados de la actividad**: * _Correcto_: indica que un usuario ha restablecido correctamente su propia contraseña. * _Error_: indica que un usuario no ha podido restablecer su propia contraseña. Puede seleccionar la fila para ver la categoría **Motivo del estado de la actividad** y obtener más información sobre por qué se ha producido el error. * **Motivo del error de estado de la actividad**: * _FuzzyPolicyViolationInvalidPassword_: el administrador ha seleccionado una contraseña que se prohibió automáticamente debido a que las funcionalidades de detección de contraseñas prohibidas de Microsoft consideraron que era demasiado común o muy poco segura. ### <a name="activity-type-self-serve-password-reset-flow-activity-progress"></a>Tipo de actividad: Progreso de la actividad del flujo de restablecimiento de contraseña de autoservicio En la lista siguiente se explica en detalle esta actividad: * **Descripción de la actividad**: indica cada paso específico que un usuario sigue como parte del proceso de restablecimiento de contraseña (como atravesar una puerta de autenticación específica para el restablecimiento de contraseña). * **Actor de la actividad**: usuario que ha realizado parte del flujo de restablecimiento de contraseña. Puede tratarse de un usuario final o de un administrador. * **Destino de la actividad**: usuario que ha realizado parte del flujo de restablecimiento de contraseña. Puede tratarse de un usuario final o de un administrador. * **Estados de la actividad**: * _Correcto_: indica que un usuario ha completado correctamente un paso específico del flujo de restablecimiento de contraseña. * _Error_: indica que no se ha podido realizar un paso específico del flujo de restablecimiento de contraseña. 
    You can select the row to view the **Activity status reason** category and learn more about why the failure occurred.
* **Activity status reasons**: see the preceding table for [all the allowed reset activity status reasons](#description-of-the-report-columns-in-the-azure-portal).

### <a name="activity-type-unlock-a-user-account-self-service"></a>Activity type: Unlock a user account (self-service)

The following list explains this activity in detail:

* **Activity description**: indicates that a user successfully unlocked their Active Directory account without resetting their password from the [Azure AD password reset portal](https://passwordreset.microsoftonline.com), by using the unlock-without-reset feature for Active Directory accounts.
* **Activity actor**: the user who unlocked their account without resetting their password. This can be an end user or an administrator.
* **Activity target**: the user who unlocked their account without resetting their password. This can be an end user or an administrator.
* **Allowed activity statuses**:
  * _Success_: indicates that a user successfully unlocked their own account.
  * _Failure_: indicates that a user failed to unlock their account. You can select the row to view the **Activity status reason** category and learn more about why the failure occurred.

### <a name="activity-type-user-registered-for-self-service-password-reset"></a>Activity type: User registered for self-service password reset

The following list explains this activity in detail:

* **Activity description**: indicates that a user registered all the information required to reset their password in accordance with the currently specified tenant password reset policy.
* **Activity actor**: the user who registered for password reset. This can be an end user or an administrator.
* **Activity target**: the user who registered for password reset. This can be an end user or an administrator.
* **Allowed activity statuses**:
  * _Success_: indicates that a user successfully registered for password reset in accordance with the current policy.
  * _Failure_: indicates that a user failed to register for password reset. You can select the row to view the **Activity status reason** category and learn more about why the failure occurred.

>[!NOTE]
>A failure doesn't mean that a user is unable to reset their own password. It means that they didn't finish the registration process. If there is unverified data on the account that is correct (such as a phone number that hasn't been validated), it can still be used to reset the password even though it hasn't been verified.

## <a name="next-steps"></a>Next steps

* [SSPR and MFA usage and insights reporting](./howto-authentication-methods-activity.md)
* [How do I complete a successful rollout of SSPR?](howto-sspr-deployment.md)
* [Reset or change your password](../user-help/active-directory-passwords-update-your-own-password.md).
* [Register for self-service password reset](../user-help/active-directory-passwords-reset-register.md).
* [Do you have a licensing question?](concept-sspr-licensing.md)
* [What data does SSPR use, and what data should you fill in for your users?](howto-sspr-authenticationdata.md)
* [What authentication methods are available to users?](concept-sspr-howitworks.md#authentication-methods)
* [What are the policy options with SSPR?](concept-sspr-policy.md)
* [What is password writeback and why should I care about it?](./tutorial-enable-sspr-writeback.md)
* [What are all the options in SSPR and what do they mean?](concept-sspr-howitworks.md)
* [I think something is broken. How do I troubleshoot SSPR?](./troubleshoot-sspr.md)
* [I have a question that wasn't covered anywhere else](active-directory-passwords-faq.md)

[Reporting]: ./media/howto-sspr-reporting/sspr-reporting.png "Example of SSPR activity audit logs in Azure AD"
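
As a programmatic complement to the portal steps described earlier, the same audit events can be pulled from the Microsoft Graph audit log API. The following is a minimal sketch, not part of the original article: it assumes you already hold an access token with the `AuditLog.Read.All` permission and that your tenant supports `$filter` on the `loggedByService` property of `directoryAudits`; field names in the commented example are illustrative.

```python
# Minimal sketch: list Self-service Password Management audit events from Microsoft Graph.
# Assumptions: a valid access token with AuditLog.Read.All, and $filter support on
# loggedByService. Adjust the filter or fields to match your tenant.
import requests

GRAPH_AUDITS = "https://graph.microsoft.com/v1.0/auditLogs/directoryAudits"

def sspr_audit_events(access_token: str):
    """Yield audit events logged by the Self-service Password Management service."""
    headers = {"Authorization": f"Bearer {access_token}"}
    params = {"$filter": "loggedByService eq 'Self-service Password Management'"}
    url = GRAPH_AUDITS
    while url:
        resp = requests.get(url, headers=headers, params=params)
        resp.raise_for_status()
        body = resp.json()
        for event in body.get("value", []):
            yield event
        # Follow server-side paging; the nextLink already carries the filter.
        url = body.get("@odata.nextLink")
        params = None

# Example (illustrative field names):
# for e in sspr_audit_events(token):
#     print(e.get("activityDateTime"), e.get("activityDisplayName"))
```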
98.497326
788
0.797817
spa_Latn
0.99289
0c27c75340747b25fbc56f92824eceb1ec295f2d
20,823
md
Markdown
articles/fin-and-ops/hr/localizations/noam-usa-worker-position-payroll-tasks.md
gvrmohanreddy/dynamics-365-unified-operations-public
45b80248f2064fec162a8f35fa44cdb78e0e8855
[ "CC-BY-4.0", "MIT" ]
1
2019-08-12T07:57:20.000Z
2019-08-12T07:57:20.000Z
articles/fin-and-ops/hr/localizations/noam-usa-worker-position-payroll-tasks.md
gvrmohanreddy/dynamics-365-unified-operations-public
45b80248f2064fec162a8f35fa44cdb78e0e8855
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/fin-and-ops/hr/localizations/noam-usa-worker-position-payroll-tasks.md
gvrmohanreddy/dynamics-365-unified-operations-public
45b80248f2064fec162a8f35fa44cdb78e0e8855
[ "CC-BY-4.0", "MIT" ]
1
2019-07-08T19:49:19.000Z
2019-07-08T19:49:19.000Z
--- # required metadata title: Set up payroll for workers description: Before you can pay a worker, you must set up payroll information about the worker's position, taxes, and benefits. This information is used when you generate pay statements for the worker. In addition, if contribution and deduction amounts are changed on a benefit, that change must be made for each worker who is enrolled in that benefit. author: andreabichsel manager: AnnBe ms.date: 11/01/2017 ms.topic: article ms.prod: ms.service: dynamics-ax-applications ms.technology: # optional metadata ms.search.form: HcmBenefit, HcmBenefitElementSetup, PayrollWorkerTaxCode, PayrollWorkerTaxRegion # ROBOTS: audience: Application User # ms.devlang: ms.reviewer: kfend ms.search.scope: Core, Operations # ms.tgt_pltfrm: ms.custom: 248404 ms.assetid: d6530f02-bbee-4d8e-94e7-173aecb4452e ms.search.region: USA # ms.search.industry: ms.author: ryansand ms.search.validFrom: 2016-11-30 ms.dyn365.ops.version: Version 1611 --- # Set up payroll for workers [!include [banner](../../includes/banner.md)] Before you can pay a worker, you must set up payroll information about the worker's position, taxes, and benefits. This information is used when you generate pay statements for the worker. In addition, if contribution and deduction amounts are changed on a benefit, that change must be made for each worker who is enrolled in that benefit. This topic provides information about these tasks and the fields that are used to complete them. For more information, see [Payroll data updates FAQ](noam-usa-payroll-data-updates.md). [![Flow of worker and position tasks](./media/worker-tasks.gif)](./media/worker-tasks.gif) ## Adding payroll periods to positions You must specify payroll details and add them to a position before you can generate payroll for the position. The following table show the information that you must enter on the **Payroll** FastTab. <table> <thead> <tr> <th>Field</th> <th>Description</th> </tr> </thead> <tbody> <tr> <td>Pay cycle</td> <td>Select the pay cycle that specifies the payroll calculation frequency and pay date for the position.</td> </tr> <tr> <td>Work cycle</td> <td>For non-exempt positions only, select the work cycle that specifies the work periods for the position. Some earnings, such as the regular-rate-of-pay premiums that are required by the Fair Labor Standards Act (FLSA), are based on work periods instead of pay periods. For exempt positions, leave this field blank. <blockquote>[!IMPORTANT] For workers who have more than one position, make sure that all positions that are assigned to the worker have the same work cycle.</blockquote> </td> </tr> <tr> <td>Paid by</td> <td>Select the legal entity that is responsible for making the payroll payments for this position. The legal entity that is responsible for paying for the position must be assigned to the position before you can assign worker tax codes to workers. <blockquote>[!NOTE] You can use the <strong>Worker</strong> page to assign default tax codes for each position.</blockquote> </td> </tr> <tr> <td>Annual regular hours</td> <td>Enter the number of regularly paid hours that the position is expected to have each year. This value is used to determine salary adjustments. For example, you might enter <strong>2080</strong> for a regular salaried worker. This value represents 40 hours per week. 
If a worker has eight hours of sick time, the difference of 32 hours can be calculated automatically.</td> </tr> <tr> <td>Workers' compensation</td> <td>Click <strong>Add</strong>, and select a compensation state and a compensation code. Repeat these steps for any additional workers' compensation benefits.</td> </tr> <tr> <td>Earnings</td> <td>The fields in this group interact in the following ways to determine how earnings are generated and shown on earnings statements: <ul> <li>If neither <strong>Generate salary</strong> nor <strong>Generate earnings from schedule</strong> is selected, base earnings statement lines for the position aren't generated. Only the recurring earnings that are specified on the <strong>Worker</strong> page are generated.</li> <li>If <strong>Generate earnings from schedule</strong> is selected, but <strong>Generate salary</strong> isn't selected, the following information applies: <ul> <li>A schedule is required. You can select among the calendars that have been created for the legal entity that is selected in the <strong>Paid by</strong> field.</li> <li>A default earning code is required.</li> <li>A day-by-day breakout of earnings for the position appears on the worker's earnings statement.</li> </ul> This set of selections is typically used for hourly workers.</li> <li>If <strong>Generate salary</strong> is selected, but <strong>Generate earnings from schedule</strong> Isn't selected, the following information applies: <ul> <li>A default earning code is required.</li> <li>The worker is paid the standard position salary amount for each pay period, and a single line is included on the earnings statement. This line has the date of the last day in the pay period. <blockquote>[!NOTE] If earnings statement lines were entered manually before the earnings were generated, the salary might be split across multiple lines. The total of the manually entered lines and the single generated line is always the standard salary amount.</blockquote> </li> </ul> This set of selections is typically used for salaried workers.</li> <li>If both <strong>Generate salary</strong> and <strong>Generate earnings from schedule</strong> are selected, the following information applies: <ul> <li>A schedule is required. You can select among the calendars that have been created for the legal entity that is selected in the <strong>Paid by</strong> field.</li> <li>A default earning code is required.</li> <li>The worker is paid the standard position salary amount for each pay period.</li> <li>A day-by-day breakout of earnings for the position appears on the worker's earnings statement.</li> </ul> This set of selections is typically used for salaried workers when you want a day-by-day breakout of their time. This functionality might be useful if, for example, a worker's time is associated with a project on two out of five days.</li> </ul> </td> </tr> </tbody> </table> ## Adding earning codes to worker position agreements If a worker receives recurring earnings or an earning rate that differs from the default earning code, you must assign an earning code to the worker to make sure that earnings are generated correctly. For more information, see [Generate earnings](noam-usa-generate-earnings.md). > [!NOTE] > If you're setting up an earning code that is based on hours or pieces, the **Frequency** field isn't available. Earnings that are based on hours are generated based on the **Generate salary** and **Generate earnings from schedule** selections on the **Position** page. 
> Earnings that are based on pieces are entered manually.

## Setting up worker tax regions

When you assign a tax region to a worker, all worker tax codes that apply to the worker are set up automatically. All parameters for the worker tax codes are set to their default values. We recommend that you review the worker tax codes for each worker to verify their accuracy. For more information, see [Tax information tasks](noam-usa-tax-information-tasks.md).

To set up tax regions for workers, you must first create a list or spreadsheet that contains the following information for each worker that you're setting up tax regions for:

- The city and state where the worker claims residency
- The city and state of each location where the worker works

Additionally, for some states, you must specify the school district and municipality that are associated with the residence or work locations. To determine whether this information is required, consult the state tax office.

- On the **Worker tax regions** page, create a new tax region. Select the first tax region by selecting the tax region where the worker claims residency, or the tax region where the worker can work. The first region that is assigned to a worker is designated as the worker's resident tax region. You can change the worker's residency at any time. For more information, see the "Changing worker residency (if a change is required)" section in this topic.
- Most workers have only one worker tax region. However, a worker who resides in one tax region and works in another tax region has two worker tax regions. A worker who works in multiple locations has multiple tax regions.
- In most cases, the worker tax codes that are required for the tax region are automatically assigned to the worker when you close this page. However, if the worker isn't assigned to a position, or if the worker's position isn't assigned to a legal entity for payroll, the tax codes are assigned later, when the position and legal entity are assigned.
- You must then select a school district and municipality, if this information is required. If you use both fields, you must select a school district before you can select a municipality. To determine whether this information is required, consult the state tax office. If it isn't required, leave the fields blank.

## Changing worker residency (if a change is required)

The first tax region that is assigned to a worker is designated as the worker's resident tax region. A worker can have only one resident tax region at a time. If you know that a worker's resident tax region will change later, you can set up a new resident tax region that takes effect on a specified date. The current resident tax region automatically expires when the new resident tax region takes effect.

> [!TIP]
> If the resident tax region that you want to change has never been used in a payroll run, you can delete it.

## Assigning default tax regions

Tax regions are geographic areas where a specific set of payroll taxes applies. Tax regions generally correspond to the cities or towns where your workers reside or work. A worker tax region is a tax region that has been assigned to a specific worker. A default tax region is required for each position that a worker holds. A default tax region is a worker tax region that is used to generate earnings for a specific position that a worker holds. Therefore, if you don't assign a default tax region for the worker's position, earnings can't be generated for the position.
In this case, to pay the worker, you must manually enter the earnings and then manually enter the tax region on each earning statement line. If a worker's position requires different tax regions at different times, you must still select a single default tax region for the position. You can then change the tax region on individual earning statement lines after you generate earnings for the position. > [!NOTE] > Although the default tax region is used to generate earnings for a position, it's assigned to the worker who holds the position, not to the position itself. If the position is later reassigned to a different worker, a default tax region must be assigned to the new worker who holds the position. The default tax region isn't reassigned to the new worker when the position is reassigned. > > If the worker tax region that should be specified for the position isn't included in the list, close this page, and use the **Worker tax region** page to assign the tax region to the worker. Then return to this page to assign the default tax region. ## Setting up worker tax codes You can manage a worker's tax options, such as filing status and total allowances, on the **Worker tax codes** page. You don't have to create or assign the worker tax codes, because the codes are automatically created and assigned to the worker when you create worker tax regions. At first, all parameters for worker tax codes are set to their default values. We recommend that you review the worker tax codes for each worker to make sure that they're accurate. For more information, see [Tax information tasks](noam-usa-tax-information-tasks.md). You must first create a list or spreadsheet that contains the values of the tax options for all the tax codes that are assigned to each worker whose tax codes you're setting up. These values differ for each tax code. The information is typically collected on IRS Form W-4 or a similar form for the state. ### Tips for setting up worker tax codes - When the **Worker tax codes** page is opened, it shows the tax codes that are effective on the current date. To view the tax codes and tax code parameters that were effective or will be effective on another date, click **Maintain versions**. - You can also change the value of the parameter directly in the grid. In this case, the new version of the tax code parameter is stamped with the date but not the time. Therefore, the first change that is made during a given day creates a new version of the parameter. If you change the value later during the same day, the new change overwrites the previous change but doesn't create a new version of the parameter. Only the last change is saved in the date-effective version. ## Enrolling workers in benefits Payroll information for a benefit isn't available if the **Payroll impact** field on the **Benefit elements** page is set to **None** for the benefit plan. > [!NOTE] > To set up payroll information for a garnishment or tax levy, see [Garnishment and tax levy enrollment tasks](noam-usa-garnishment-tax-levy-enrollment-tasks.md). Benefit accrual plans, such as paid time off, aren't set up or calculated like other benefits. For information about how to enroll a worker in a benefit accrual plan, see [Benefit accrual plan tasks](noam-usa-benefit-accrual-plan-tasks.md). For more information about how to set up a benefit, see [Benefit setup tasks](noam-usa-benefit-set-up-tasks.md). The following table show the information that you must enter on the **Payroll** FastTab. 
The fields on this FastTab can vary, depending on the setting of the **Payroll impact** field on the **Benefit elements** page. <table> <thead> <tr> <th>Field</th> <th>Description</th> </tr> </thead> <tbody> <tr> <td>Paid by</td> <td>The legal entity that pays for the selected benefit for this worker. This legal entity is usually the same legal entity that the worker is employed by and that pays for the worker position.</td> </tr> <tr> <td>Position</td> <td>Leave this field blank, except when the total costs for the benefit must follow a specific position. Union dues are an example of a benefit that is assigned to a position. If you select a position, the benefit calculations are based only on the earnings from the position. If you don't select a position, the deductions and contributions for the benefit are calculated based on all the worker's earnings. The amounts are split among all the positions that the employee is currently assigned to. The distribution for the deductions and calculations uses the same distribution as the earnings for those positions.</td> </tr> <tr> <td>Calculation priority</td> <td>The order that deductions and contributions for the selected benefit are calculated in, relative to other benefits. The deductions and contributions for the benefit that has the lowest calculation priority are calculated first. The lowest calculation priority is 0 (zero). When multiple benefits have the same number, the calculations for those benefits are done in alphabetical order. The calculation order is important when the result of the calculation for one benefit is used in the calculation for another benefit. This behavior is especially likely for garnishments and tax levies. Your legal advisors should help you determine the correct calculation priority for all benefits.</td> </tr> <tr> <td>Deduction priority</td> <td>The order that deductions for the selected benefit are made in, relative to other deductions. The deduction for the benefit that has the lowest deduction priority is made first. The lowest deduction priority is 1. When multiple benefits have the same number, the deductions for those benefits are made in alphabetical order. Your legal advisors should help you determine the correct deduction priority for all benefits. The default value for this field is set up on the <strong>Benefit elements</strong> page.</td> </tr> <tr> <td>Rate source, Basis, Deduction</td> <td>Together, the basis option and the deduction amount are used to calculate the amount of the payroll deduction for the benefit. The default values for these fields are set in the <strong>Basis</strong> and <strong>Amount or rate</strong> fields on the <strong>Benefits</strong> page. The <strong>Rate source</strong> field determines whether these fields are changed to match the default values when the benefit rates are updated from the benefit. <ul> <li>If you select <strong>Benefit</strong>, the deduction amount and basis for the worker are updated automatically when you click <strong>Update benefit rates</strong> on the <strong>Benefits</strong> page.</li> <li>If you select <strong>Custom</strong>, the deduction amount and basis for the worker aren't changed when you click <strong>Update benefit rates</strong> on the <strong>Benefits</strong> page. Select this option when the contribution for a worker is specific to that worker. 
For example, after a rate change, contribution amounts might be grandfathered in for some workers.</li> </ul> <blockquote>[!NOTE] For contributions, the default value of the <strong>Rate source</strong> field is <strong>Benefit</strong>.</blockquote> </td> </tr> <tr> <td>Notes</td> <td>As a best practice, if you change the default values for any payroll fields on this page, you should enter an explanation in this field.</td> </tr> </tbody> </table> The following tables show the information that you must enter on the **Payroll limits** FastTab. This FastTab might contain a set of payroll limits for contributions, deductions, or both contributions and deductions, depending on the value of the **Payroll impact** field on the **Benefit elements** page. **Deductions** | Field | Description | |--------------|-------------| | Limit amount | The maximum amount that can be deducted from a worker's pay for the selected benefit. If there is no maximum amount, leave this field blank.<blockquote>[!IMPORTANT] The **Remaining** field shows the amount that can be deducted for the benefit in future pay periods before the end of the limit period is reached. The amount is automatically updated during each pay run. You can also manually change the amount. Because no change history is kept, we recommend that you not enter or change the value of this field.</blockquote> | | Limit period | The period that the deduction limits apply to. For example, the limit amount is 1,200.00, and the **Limit period** field is set to **Year**. In this case, when the cumulative deductions for the benefit reach 1,200.00, no additional deductions are allowed for that benefit for the rest of the year. The **Limit end** field shows the last day of the current limit period. When the current limit period ends, the value of this field is automatically reset to the end of the new limit period.<blockquote>[!NOTE] The limit period is based on the calendar.</blockquote> | **Contributions** | Field | Description | |--------------|-------------| | Limit amount | The maximum amount that the employer can contribute for the selected benefit. If there is no maximum amount, leave this field blank.<blockquote>[!IMPORTANT] The **Remaining** field shows the amount that can be contributed for the benefit in future pay periods before the end of the limit period is reached. The amount is automatically updated during each pay run. You can also manually change the amount. Because no change history is kept, we recommend that you not enter or change the value of this field.</blockquote> | | Limit period | The period that the contribution limits apply to. For example, the limit amount is 1,200.00, and the **Limit period** field is set to **Year**. In this case, when the cumulative contributions for the benefit reach 1,200.00, no additional contributions are allowed for that benefit for the rest of the year. The **Limit end** field shows the last day of the current limit period. When the current limit period ends, the value of this field is automatically reset to the end of the new limit period.<blockquote>[!NOTE] The limit period is based on the calendar.</blockquote> | ## Additional resources [Tax information tasks](noam-usa-tax-information-tasks.md) [Benefit setup tasks](noam-usa-benefit-set-up-tasks.md) [Garnishment and tax levy setup tasks](noam-usa-garnishment-tax-levy-set-up-tasks.md) [Garnishment and tax levy enrollment tasks](noam-usa-garnishment-tax-levy-enrollment-tasks.md)
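
As an aside, two of the calculation rules described above can be summarized in a few lines of code: the annual regular hours that drive salary adjustments, and the cap that deduction and contribution limits place on each pay run. This is an illustrative sketch only, not Dynamics 365 code, and all names in it are hypothetical.

```python
# Illustrative sketch of two rules from this topic (hypothetical names, not Dynamics 365 code).
from typing import Optional

ANNUAL_REGULAR_HOURS = 2080.0  # e.g. a regular salaried worker at 40 hours per week

def regular_hours_for_week(sick_hours: float,
                           weekly_hours: float = ANNUAL_REGULAR_HOURS / 52) -> float:
    """Remaining regular hours in a week after paid sick time, e.g. 40 - 8 = 32."""
    return max(weekly_hours - sick_hours, 0.0)

def apply_benefit_amount(requested: float, remaining_limit: Optional[float]) -> float:
    """Deduct or contribute up to the remaining limit for the period; None means no limit."""
    if remaining_limit is None:
        return requested
    return min(requested, max(remaining_limit, 0.0))

# Example: with a 1,200.00 yearly limit and 100.00 remaining, only 100.00 more is allowed.
# apply_benefit_amount(150.00, 100.00)  -> 100.0
```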
86.7625
852
0.779619
eng_Latn
0.999529
0c2826f4bad7277f140adea612086ff02c408212
4,112
md
Markdown
_posts/2017-06-07-chord.md
lucassant0s/lucassant0s.github.io
58dda6809082c5b4f8dbeca202fa58007cd7c744
[ "MIT" ]
null
null
null
_posts/2017-06-07-chord.md
lucassant0s/lucassant0s.github.io
58dda6809082c5b4f8dbeca202fa58007cd7c744
[ "MIT" ]
null
null
null
_posts/2017-06-07-chord.md
lucassant0s/lucassant0s.github.io
58dda6809082c5b4f8dbeca202fa58007cd7c744
[ "MIT" ]
null
null
null
---
layout: post
title: Chord, a scalable peer-to-peer lookup protocol for internet applications
subtitle: Chord is a peer-to-peer protocol used to locate data
bigimg: /img/chord.png
---

Chord is a peer-to-peer protocol used to locate data, considerably more efficient than the other existing lookup protocols. Chord's final evaluation results show that it is highly scalable, has a low communication cost, and that the state maintained by each node scales logarithmically with the number of nodes in the network.

It could replace the standard DNS protocol in use today: each key could correspond to an IP address, taking advantage of what Chord does best, which is mapping data across a network of servers. It could reduce the number of hops the DNS protocol has to make to resolve an application's IP address from a URL.

Some protocols, such as Plaxton and Pastry, have characteristics similar to Chord: they use a lookup algorithm that also maps a key to the corresponding values, assuming a low hop cost per lookup, but Chord's advantage is the way it handles failures in nodes that hold similar data.

Chord can be used as a protocol for looking up a value within a system, avoiding centralized points of failure and scalability bottlenecks, and simplifying some problems that are very common in distributed systems, such as:

- **Load balancing**: distributing the hashed keys uniformly among the nodes, which guarantees a natural degree of load balance within the cluster.
- **Decentralization**: being fully distributed, no node is more important to the system than any other, which makes the cluster more robust by removing centralized points of failure; perfect for peer-to-peer applications.
- **Scalability**: the cost of a lookup with Chord grows only slowly as the number of nodes in the cluster increases, so even an application with a very large number of nodes keeps a viable lookup cost, and this comes natively without any additional configuration.
- **Availability**: because data is continually re-linked across the cluster, a high level of availability is guaranteed; unless failures occur in the underlying network, the node holding the data being looked up can always be found, even while the system is in a continuous state of change.
- **Flexible naming**: Chord's structure is completely flexible, so there are several ways to work with the key/value scheme when mapping your lookups.

### The following examples show great uses of Chord:

Cooperative mirroring: when several developers are working on a piece of software and need to make updated versions of their work available to the others, a new version can simply be published across the network whenever it becomes available.

Time-shared storage: machines can share storage space while they are idle in order to store data from other nodes.

### Conclusion

Chord provides fast, distributed computation of a hash function that maps keys to the nodes responsible for them. Because nodes don't need to store much information about the other nodes in the network, lookups perform very well, since each node keeps only the information needed to map the network.
Lookups can be implemented using a ring strategy containing a bounded number of nodes; each node only needs to know how to contact its successor, which makes lookups more efficient. To guarantee correct lookups, all successor pointers must be kept up to date.

### References:

- http://sistemasdistribuidosucam.blogspot.com.br/2008/03/protocolo-chord.html
- https://en.wikipedia.org/wiki/Chord_(peer-to-peer)
- https://pt.slideshare.net/tristanpenman/chord-a-scalable-peertopeer-lookup-protocol-for-internet-applications
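
To make the lookup idea above concrete, here is a minimal sketch (not from the original post): keys and node names are hashed onto the same circular identifier space, and a key is stored on its successor, the first node whose identifier is equal to or follows the key on the ring. It deliberately omits finger tables, node joins, and failure handling.

```python
# Minimal, illustrative Chord-style ring: hash keys and nodes onto one circle,
# then find the successor node responsible for a key. No finger tables or joins.
import hashlib
from bisect import bisect_left

M = 2 ** 32  # size of the identifier circle

def chord_id(name: str) -> int:
    """Hash a key or node name onto the identifier circle."""
    return int(hashlib.sha1(name.encode()).hexdigest(), 16) % M

class ChordRing:
    def __init__(self, node_names):
        # Sorted (identifier, name) pairs form the ring.
        self.nodes = sorted((chord_id(n), n) for n in node_names)

    def successor(self, key: str) -> str:
        """Return the node responsible for a key (its successor on the ring)."""
        k = chord_id(key)
        ids = [node_id for node_id, _ in self.nodes]
        i = bisect_left(ids, k)
        if i == len(ids):  # wrap around the circle
            i = 0
        return self.nodes[i][1]

# Example usage:
# ring = ChordRing(["node-a", "node-b", "node-c"])
# print(ring.successor("some-key"))  # the node that would hold "some-key"
```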
102.8
342
0.814446
por_Latn
0.999928
0c2867cde58d94f9d571f7a42093e9898aa6e269
476
md
Markdown
kdocs/-kores/com.koresframework.kores.helper/-if-expression-helper/check-ref-equal.md
koresframework/Kores
b6ab31b1d376ab501fd9f481345c767cb0c37d04
[ "MIT-0", "MIT" ]
1
2019-09-17T16:36:51.000Z
2019-09-17T16:36:51.000Z
kdocs/-kores/com.koresframework.kores.helper/-if-expression-helper/check-ref-equal.md
koresframework/Kores
b6ab31b1d376ab501fd9f481345c767cb0c37d04
[ "MIT-0", "MIT" ]
8
2020-12-12T06:48:34.000Z
2021-08-15T22:34:49.000Z
kdocs/-kores/com.koresframework.kores.helper/-if-expression-helper/check-ref-equal.md
koresframework/Kores
b6ab31b1d376ab501fd9f481345c767cb0c37d04
[ "MIT-0", "MIT" ]
null
null
null
//[Kores](../../../index.md)/[com.koresframework.kores.helper](../index.md)/[IfExpressionHelper](index.md)/[checkRefEqual](check-ref-equal.md) # checkRefEqual [jvm]\ fun [checkRefEqual](check-ref-equal.md)(part1: [Instruction](../../com.koresframework.kores/-instruction/index.md), part2: [Instruction](../../com.koresframework.kores/-instruction/index.md)): [IfExpressionHelper](index.md) Compares [part1](check-ref-equal.md) and [part2](check-ref-equal.md) by reference.
52.888889
223
0.726891
deu_Latn
0.15235
0c28fb1fdcff93fd99c8dec8611880fc2fc6c71b
2,995
md
Markdown
README.md
hjdesigner/training-center.github.io
562ec3cfc294329b3d90d370a3ec66d29badb10e
[ "MIT" ]
null
null
null
README.md
hjdesigner/training-center.github.io
562ec3cfc294329b3d90d370a3ec66d29badb10e
[ "MIT" ]
null
null
null
README.md
hjdesigner/training-center.github.io
562ec3cfc294329b3d90d370a3ec66d29badb10e
[ "MIT" ]
null
null
null
# Training Center site

This community **has no source of income**; however, you can donate to our group monthly through [apoia.se](http://apoia.se/training-center) and help keep our tools up and running. ;P

<p align="center">
  <a href="http://apoia.se/training-center" title="Apoia.se do Training Center">
    <img src="https://raw.githubusercontent.com/training-center/sobre/master/assets/btn-apoiar.png" alt="Training Center support button">
  </a>
</p>

## About

This project is a static site generated automatically by [Jekyll](https://jekyllrb.com/), a static site generator, which makes it easier to configure the site and its pages and to maintain the code.

Besides Jekyll, we use [Netlify](https://netlify.com), a build, deploy, and project management service, for Continuous Deployment and HTTPS.

We use [dunders](https://github.com/woliveiras/__s), a starter theme, to speed up the creation of our Jekyll theme.

## Requirements

To run this site locally you will need to install the following dependencies:

- Ruby
- Jekyll

To install Ruby we recommend using a version manager such as [RVM](https://rvm.io/), and to install Jekyll just follow the [Jekyll documentation](https://jekyllrb.com/).

If you can't install any of the dependencies, you can open an issue in this repository so we can help you.

## Running locally

Clone our project to your development environment.

```shell
git clone git@github.com:training-center/training-center.github.io.git
```

Run the `bundle` command to install the Jekyll and dunders dependencies.

```shell
bundle
```

Now just run the command to bring the site up on your machine, `jekyll s`.

```shell
jekyll s
```

The site will be available at `localhost:4000`.

## Deploying to production

Every merge into our **master** branch triggers an automatic deploy on Netlify.

## File structure

**assets**

Our images, fonts, etc.

**_includes**

The site's `partials`, the "pieces" of the site.

- **analytics.html:** the Google Analytics configuration file
- **footer.html:** the template's **footer**
- **head.html:** the page **head**
- **header.html:** the template's **header**

**_layouts**

The `scaffolding` Jekyll uses to generate our site. More information about this [here](https://jekyllrb.com/docs/themes/#layouts-and-includes).

**_sass**

All of the site's Sass modules.

**css**

The main file that imports our Sass modules to generate the theme.

**Other files**

- **CNAME:** maps our domain to DNS
- **_config.yml:** the site's configuration file
- **index.html:** the site's `index`
- **robots.txt:** very important for SEO, [see here](http://www.robotstxt.org/robotstxt.html)

## How to contribute

See our [CONTRIBUTING.md](CONTRIBUTING.md)

## License

This site is governed by the [MIT](LICENSE-SITE) license.
30.252525
231
0.74424
por_Latn
0.99836
0c29570c07badd1fd9f40f1ad89d5a1364b9556a
79,613
md
Markdown
CHANGELOG.md
t0m4uk1991/github-api
1535817992bec2a594237532ab9a6e05542f0e8a
[ "MIT" ]
257
2020-05-09T23:52:53.000Z
2022-03-31T02:54:26.000Z
CHANGELOG.md
t0m4uk1991/github-api
1535817992bec2a594237532ab9a6e05542f0e8a
[ "MIT" ]
501
2020-05-09T08:39:12.000Z
2022-03-30T17:25:08.000Z
CHANGELOG.md
t0m4uk1991/github-api
1535817992bec2a594237532ab9a6e05542f0e8a
[ "MIT" ]
172
2020-05-12T16:55:52.000Z
2022-03-30T08:41:14.000Z
# Changelog For changes after v1.101 see the [GitHub Releases page](https://github.com/hub4j/github-api/releases) for the project. ## [github-api-1.101](https://github.com/hub4j/github-api/tree/github-api-1.101) (2019-11-27) [Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.100...github-api-1.101) ### Fixes - Fixed `ClassNotFoundException` when creating `okhttp3.OkHttpConnector` with `Cache` @alecharp [\#627](https://github.com/hub4j/github-api/issues/627) ## [github-api-1.100](https://github.com/hub4j/github-api/tree/github-api-1.100) (2019-11-26) [Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.99...github-api-1.100) ### Features and Fixes - Add method to set repository topics @martinvanzijl [\#594](https://github.com/hub4j/github-api/issues/594) - Adjust GHRateLimit to system time instead of depending on synchronization @bitwiseman [#595](https://github.com/hub4j/github-api/issues/595) - Add Functionality of OTP to support user 2fa @madhephaestus [\#603](https://github.com/hub4j/github-api/issues/603) - Implement Meta endpoint @PauloMigAlmeida [\#611](https://github.com/hub4j/github-api/issues/611) - fix and unit tests for issue #504 @siordache [\#620](https://github.com/hub4j/github-api/issues/620) - Fixed GHContent to allow spaces in path @bitwiseman [\#625](https://github.com/hub4j/github-api/issues/625) ### Internals - Bump okhttp3 from 3.14.2 to 4.2.2 @dependabot-preview [\#593](https://github.com/hub4j/github-api/issues/593) - jackson 2.10.1 @sullis [\#604](https://github.com/hub4j/github-api/issues/604) - Code style fixes @bitwiseman [\#609](https://github.com/hub4j/github-api/issues/609) - Javadoc fail on warning during CI build @bitwiseman [\#613](https://github.com/hub4j/github-api/issues/613) - Clean up Requester interface a bit @bitwiseman [\#614](https://github.com/hub4j/github-api/issues/614) - Branch missing @alexanderrtaylor [\#615](https://github.com/hub4j/github-api/issues/615) - Cleanup imports @bitwiseman [\#616](https://github.com/hub4j/github-api/issues/616) - Removed permission field in createTeam. 
It is deprecated in the API @asthinasthi [\#619](https://github.com/hub4j/github-api/issues/619) - Re-enable Lifecycle test @bitwiseman [\#621](https://github.com/hub4j/github-api/issues/621) ## [github-api-1.99](https://github.com/hub4j/github-api/tree/github-api-1.99) (2019-11-04) [Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.95...github-api-1.99) **Closed issues:** - Support all available endpoints for Github App with preview request [\#570](https://github.com/hub4j/github-api/issues/570) - Login details [\#560](https://github.com/hub4j/github-api/issues/560) - GHRepository.listReleases\(\) return empty always [\#535](https://github.com/hub4j/github-api/issues/535) - Unable to get deployment by id [\#529](https://github.com/hub4j/github-api/issues/529) - Malformed URL exception while accessing Enterprise Repository and fetching data [\#526](https://github.com/hub4j/github-api/issues/526) - Allow getting a repository by ID [\#515](https://github.com/hub4j/github-api/issues/515) - Methods to update milestones [\#512](https://github.com/hub4j/github-api/issues/512) - Add ETAG support to minimize API requests [\#505](https://github.com/hub4j/github-api/issues/505) - GitHub.connectUsingOAuth\(\) suddenly taking a really long time to connect [\#493](https://github.com/hub4j/github-api/issues/493) - GHTeam.add does not due to GHTeam.Role\(s\) been capitalized [\#489](https://github.com/hub4j/github-api/issues/489) - Reading file's content through GHContent.read\(\) returns previous version of file. [\#487](https://github.com/hub4j/github-api/issues/487) - Implement archive/unarchive functionality [\#472](https://github.com/hub4j/github-api/issues/472) - \[Gists\] Edit Gists Support [\#466](https://github.com/hub4j/github-api/issues/466) - Missing description field in GHTeam [\#460](https://github.com/hub4j/github-api/issues/460) - Bug: GHOrganization::createTeam does not regard argument repositories [\#457](https://github.com/hub4j/github-api/issues/457) - Null value for GHPullRequestReview created date and updated date [\#450](https://github.com/hub4j/github-api/issues/450) - Support for repository Projects [\#425](https://github.com/hub4j/github-api/issues/425) - create a little MockGitHub class for tests mocking out the github REST API [\#382](https://github.com/hub4j/github-api/issues/382) - Branch name is not being correctly URL encoded [\#381](https://github.com/hub4j/github-api/issues/381) - Issue events [\#376](https://github.com/hub4j/github-api/issues/376) - Not able to get the right file content [\#371](https://github.com/hub4j/github-api/issues/371) - Updating file is not possible [\#354](https://github.com/hub4j/github-api/issues/354) - Missing repository statistics [\#330](https://github.com/hub4j/github-api/issues/330) - Is there a way to make this library more test friendly? 
[\#316](https://github.com/hub4j/github-api/issues/316) - GitHub 2 factor login [\#292](https://github.com/hub4j/github-api/issues/292) - Unable to resolve github-api artifacts from Maven Central [\#195](https://github.com/hub4j/github-api/issues/195) **Merged pull requests:** - Bump maven-source-plugin from 3.1.0 to 3.2.0 [\#590](https://github.com/hub4j/github-api/pull/590) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Fix site errors [\#587](https://github.com/hub4j/github-api/pull/587) ([bitwiseman](https://github.com/bitwiseman)) - \[Documentation\] :: Add GitHub App Developer Guide [\#586](https://github.com/hub4j/github-api/pull/586) ([PauloMigAlmeida](https://github.com/PauloMigAlmeida)) - Create CODE\_OF\_CONDUCT.md [\#585](https://github.com/hub4j/github-api/pull/585) ([bitwiseman](https://github.com/bitwiseman)) - Convenience method to auth with app installation token && documentation examples [\#583](https://github.com/hub4j/github-api/pull/583) ([PauloMigAlmeida](https://github.com/PauloMigAlmeida)) - Add method to list repository topics [\#581](https://github.com/hub4j/github-api/pull/581) ([martinvanzijl](https://github.com/martinvanzijl)) - Fix for getting deployment by id [\#580](https://github.com/hub4j/github-api/pull/580) ([martinvanzijl](https://github.com/martinvanzijl)) - Add methods to update and delete milestones. [\#579](https://github.com/hub4j/github-api/pull/579) ([martinvanzijl](https://github.com/martinvanzijl)) - GHOrganization.createTeam now adds team to specified repositories [\#578](https://github.com/hub4j/github-api/pull/578) ([martinvanzijl](https://github.com/martinvanzijl)) - bump jackson-databind to 2.10.0 to avoid security alert [\#575](https://github.com/hub4j/github-api/pull/575) ([romani](https://github.com/romani)) - Bump wiremock-jre8-standalone from 2.25.0 to 2.25.1 [\#574](https://github.com/hub4j/github-api/pull/574) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump hamcrest.version from 2.1 to 2.2 [\#573](https://github.com/hub4j/github-api/pull/573) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - GitHub workflow: add JDK 13 to build matrix [\#572](https://github.com/hub4j/github-api/pull/572) ([sullis](https://github.com/sullis)) - More tests [\#568](https://github.com/hub4j/github-api/pull/568) ([bitwiseman](https://github.com/bitwiseman)) - Add merge options to GHRepository [\#567](https://github.com/hub4j/github-api/pull/567) ([jberglund-BSFT](https://github.com/jberglund-BSFT)) - Bump gson from 2.8.5 to 2.8.6 [\#565](https://github.com/hub4j/github-api/pull/565) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump okio from 2.4.0 to 2.4.1 [\#564](https://github.com/hub4j/github-api/pull/564) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Simplify creation of PagedIterables from requests [\#563](https://github.com/hub4j/github-api/pull/563) ([bitwiseman](https://github.com/bitwiseman)) - GitHub workflow: enable Java matrix \[ '1.8.0', '11.0.x' \] [\#562](https://github.com/hub4j/github-api/pull/562) ([sullis](https://github.com/sullis)) - Bump org.eclipse.jgit from 5.5.0.201909110433-r to 5.5.1.201910021850-r [\#561](https://github.com/hub4j/github-api/pull/561) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump okio from 2.2.2 to 2.4.0 [\#558](https://github.com/hub4j/github-api/pull/558) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump 
mockito-core from 3.0.0 to 3.1.0 [\#557](https://github.com/hub4j/github-api/pull/557) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump wiremock-jre8-standalone from 2.24.1 to 2.25.0 [\#556](https://github.com/hub4j/github-api/pull/556) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump commons-io from 1.4 to 2.6 [\#555](https://github.com/hub4j/github-api/pull/555) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump maven-surefire-plugin from 2.22.1 to 2.22.2 [\#554](https://github.com/hub4j/github-api/pull/554) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump commons-lang3 from 3.7 to 3.9 [\#552](https://github.com/hub4j/github-api/pull/552) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump commons-codec from 1.7 to 1.13 [\#551](https://github.com/hub4j/github-api/pull/551) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump spotbugs-maven-plugin from 3.1.11 to 3.1.12.2 [\#550](https://github.com/hub4j/github-api/pull/550) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump bridge-method-annotation from 1.17 to 1.18 [\#549](https://github.com/hub4j/github-api/pull/549) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump org.eclipse.jgit from 4.9.0.201710071750-r to 5.5.0.201909110433-r [\#547](https://github.com/hub4j/github-api/pull/547) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Support for projects [\#545](https://github.com/hub4j/github-api/pull/545) ([gskjold](https://github.com/gskjold))
- Adding possiblity to get ssh keys [\#544](https://github.com/hub4j/github-api/pull/544) ([arngrimur-seal](https://github.com/arngrimur-seal))
- Grammar [\#543](https://github.com/hub4j/github-api/pull/543) ([jsoref](https://github.com/jsoref))
- Improved OkHttpConnector caching behavior [\#542](https://github.com/hub4j/github-api/pull/542) ([bitwiseman](https://github.com/bitwiseman))
- Add GitHubApiWireMockRule [\#541](https://github.com/hub4j/github-api/pull/541) ([bitwiseman](https://github.com/bitwiseman))
- Add support for team pr review requests [\#532](https://github.com/hub4j/github-api/pull/532) ([farmdawgnation](https://github.com/farmdawgnation))
- Add GitHub API requests logging [\#530](https://github.com/hub4j/github-api/pull/530) ([bozaro](https://github.com/bozaro))
- Add support for draft pull requests [\#525](https://github.com/hub4j/github-api/pull/525) ([vbehar](https://github.com/vbehar))
- Implement GitHub App API methods [\#522](https://github.com/hub4j/github-api/pull/522) ([PauloMigAlmeida](https://github.com/PauloMigAlmeida))
- Added getUserPublicOrganizations method [\#510](https://github.com/hub4j/github-api/pull/510) ([awittha](https://github.com/awittha))
- Add support for editing Gists [\#484](https://github.com/hub4j/github-api/pull/484) ([martinvanzijl](https://github.com/martinvanzijl))
- Add method to invite user to organization [\#482](https://github.com/hub4j/github-api/pull/482) ([martinvanzijl](https://github.com/martinvanzijl))
- Added method to list authorizations [\#481](https://github.com/hub4j/github-api/pull/481) ([martinvanzijl](https://github.com/martinvanzijl))
- Escape special characters in branch URLs [\#480](https://github.com/hub4j/github-api/pull/480) ([martinvanzijl](https://github.com/martinvanzijl))
- Add issue events API [\#479](https://github.com/hub4j/github-api/pull/479) ([martinvanzijl](https://github.com/martinvanzijl))
- Added description field to GHTeam class. [\#478](https://github.com/hub4j/github-api/pull/478) ([martinvanzijl](https://github.com/martinvanzijl))
- Add statistics API. [\#477](https://github.com/hub4j/github-api/pull/477) ([martinvanzijl](https://github.com/martinvanzijl))
- Adding Label description property [\#475](https://github.com/hub4j/github-api/pull/475) ([immanuelqrw](https://github.com/immanuelqrw))
- Implemented GitHub.doArchive [\#473](https://github.com/hub4j/github-api/pull/473) ([joaoe](https://github.com/joaoe))

## [github-api-1.95](https://github.com/hub4j/github-api/tree/github-api-1.95) (2018-11-06)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.94...github-api-1.95)

**Closed issues:**

- Add ability to attach/detach issue label w/o side effects to other labels [\#456](https://github.com/hub4j/github-api/issues/456)
- \[feature request\] Add support to list commits that only affect a file path [\#454](https://github.com/hub4j/github-api/issues/454)
- Rate limit - catch the exception [\#447](https://github.com/hub4j/github-api/issues/447)
- GHRepository.listCommits\(\) returns java.net.SocketTimeoutException: Read timed out [\#433](https://github.com/hub4j/github-api/issues/433)
- java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.deser.SettableBeanProperty.\<init\> [\#419](https://github.com/hub4j/github-api/issues/419)
- mvn test fails to start on HEAD [\#415](https://github.com/hub4j/github-api/issues/415)
- NullPointerException in org.kohsuke.github.GHContent.read [\#414](https://github.com/hub4j/github-api/issues/414)

**Merged pull requests:**

- Added archived attribute in GHRepository [\#470](https://github.com/hub4j/github-api/pull/470) ([recena](https://github.com/recena))
- Fix memory leak. [\#468](https://github.com/hub4j/github-api/pull/468) ([KostyaSha](https://github.com/KostyaSha))
- add request reviewers as attribute of GHPullRequest [\#464](https://github.com/hub4j/github-api/pull/464) ([xeric](https://github.com/xeric))
- Add methods for adding/removing labels to GHIssue [\#461](https://github.com/hub4j/github-api/pull/461) ([evenh](https://github.com/evenh))

## [github-api-1.94](https://github.com/hub4j/github-api/tree/github-api-1.94) (2018-08-30)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.93...github-api-1.94)

**Closed issues:**

- Attachment download from issues [\#451](https://github.com/hub4j/github-api/issues/451)
- GHEventPayload.Issue [\#442](https://github.com/hub4j/github-api/issues/442)
- GHRelease.uploadAsset\(\) that takes InputStream instead of File [\#434](https://github.com/hub4j/github-api/issues/434)
- Implement the new invitations API [\#374](https://github.com/hub4j/github-api/issues/374)

**Merged pull requests:**

- Fix for issue \#426. Fix null pointer when deleting refs. [\#449](https://github.com/hub4j/github-api/pull/449) ([martinvanzijl](https://github.com/martinvanzijl))
- Fix pagination for APIs that supported it ad hoc [\#446](https://github.com/hub4j/github-api/pull/446) ([daniel-beck](https://github.com/daniel-beck))
- Adds the GHEventPayload.Issue class [\#443](https://github.com/hub4j/github-api/pull/443) ([Arrow768](https://github.com/Arrow768))
- support update content through createContent, passing sha of existing file [\#441](https://github.com/hub4j/github-api/pull/441) ([shirdoo](https://github.com/shirdoo))
- Add support for repository searching by "topic" [\#439](https://github.com/hub4j/github-api/pull/439) ([l3ender](https://github.com/l3ender))
- - added overloaded 'uploadAsset' method [\#438](https://github.com/hub4j/github-api/pull/438) ([jgangemi](https://github.com/jgangemi))
- \[feature\] implement Repository Invitations API [\#437](https://github.com/hub4j/github-api/pull/437) ([Rechi](https://github.com/Rechi))
- - remove unthrown IOException [\#436](https://github.com/hub4j/github-api/pull/436) ([jgangemi](https://github.com/jgangemi))
- - branch protection enhancements [\#435](https://github.com/hub4j/github-api/pull/435) ([jgangemi](https://github.com/jgangemi))
- Added release payload. [\#417](https://github.com/hub4j/github-api/pull/417) ([twcurrie](https://github.com/twcurrie))
- Add GHRepository.getRelease and GHRepository.getReleaseByTagName [\#411](https://github.com/hub4j/github-api/pull/411) ([tadfisher](https://github.com/tadfisher))

## [github-api-1.93](https://github.com/hub4j/github-api/tree/github-api-1.93) (2018-05-01)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.92...github-api-1.93)

**Closed issues:**

- InvalidFormatException parsing GHEventPayload.PullRequestReview [\#421](https://github.com/hub4j/github-api/issues/421)
- https://api.github.com/user response code -1 [\#418](https://github.com/hub4j/github-api/issues/418)
- Update commons-lang version [\#409](https://github.com/hub4j/github-api/issues/409)
- add a comment to a pull Request [\#380](https://github.com/hub4j/github-api/issues/380)

**Merged pull requests:**

- \[fix\] fetch labels with HTTP GET method [\#431](https://github.com/hub4j/github-api/pull/431) ([Rechi](https://github.com/Rechi))
- Added request reviewers function within GHPullRequest. [\#430](https://github.com/hub4j/github-api/pull/430) ([twcurrie](https://github.com/twcurrie))
- Add support for previous\_filename for file details in PR. [\#427](https://github.com/hub4j/github-api/pull/427) ([itepikin-smartling](https://github.com/itepikin-smartling))
- Fixes \#421 - Enum case doesn't match for Pull Request Reviews [\#422](https://github.com/hub4j/github-api/pull/422) ([ggrell](https://github.com/ggrell))
- OkHttpConnector: Enforce use of TLSv1.2 to match current Github and Github Enterprise TLS support. [\#420](https://github.com/hub4j/github-api/pull/420) ([randomvariable](https://github.com/randomvariable))
- Update commons-lang to 3.7 [\#410](https://github.com/hub4j/github-api/pull/410) ([Limess](https://github.com/Limess))

## [github-api-1.92](https://github.com/hub4j/github-api/tree/github-api-1.92) (2018-01-13)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.91...github-api-1.92)

## [github-api-1.91](https://github.com/hub4j/github-api/tree/github-api-1.91) (2018-01-13)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.90...github-api-1.91)

**Closed issues:**

- How to authenticate using oauth in code? [\#405](https://github.com/hub4j/github-api/issues/405)
- It is possible to read a github project wiki ? [\#404](https://github.com/hub4j/github-api/issues/404)
- gitHttpTransportUrl Rename [\#403](https://github.com/hub4j/github-api/issues/403)
- Do not throw new Error\(\) [\#400](https://github.com/hub4j/github-api/issues/400)
- GHPullRequest.getMergeable\(\) never returns True [\#399](https://github.com/hub4j/github-api/issues/399)
- NPE in GHPerson.populate [\#395](https://github.com/hub4j/github-api/issues/395)
- 64-bit id support [\#393](https://github.com/hub4j/github-api/issues/393)
- Numeric value out of range of int [\#387](https://github.com/hub4j/github-api/issues/387)
- Diff URL with auth [\#386](https://github.com/hub4j/github-api/issues/386)

**Merged pull requests:**

- Adding merge settings to GHCreateRepositoryBuilder [\#407](https://github.com/hub4j/github-api/pull/407) ([notsudo](https://github.com/notsudo))
- Improved Pull Request review and comments support [\#406](https://github.com/hub4j/github-api/pull/406) ([sns-seb](https://github.com/sns-seb))
- Add GHIssue\#setMilestone [\#397](https://github.com/hub4j/github-api/pull/397) ([mizoguche](https://github.com/mizoguche))
- \[fix\] GHPerson: check if root is null [\#396](https://github.com/hub4j/github-api/pull/396) ([Rechi](https://github.com/Rechi))
- Add get for all organizations [\#391](https://github.com/hub4j/github-api/pull/391) ([scotty-g](https://github.com/scotty-g))
- Add support for pr review/review comment events [\#384](https://github.com/hub4j/github-api/pull/384) ([mattnelson](https://github.com/mattnelson))
- Roles for team members [\#379](https://github.com/hub4j/github-api/pull/379) ([amberovsky](https://github.com/amberovsky))

## [github-api-1.90](https://github.com/hub4j/github-api/tree/github-api-1.90) (2017-10-28)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.89...github-api-1.90)

**Closed issues:**

- \<null\> fields yeld NPE on getX operations. [\#372](https://github.com/hub4j/github-api/issues/372)
- Add support for committing multiple files. [\#360](https://github.com/hub4j/github-api/issues/360)
- Update compiler [\#357](https://github.com/hub4j/github-api/issues/357)
- Extension mechanism? [\#356](https://github.com/hub4j/github-api/issues/356)
- Update GHPullRequest with missing properties [\#355](https://github.com/hub4j/github-api/issues/355)
- Refactor to allow for additional HTTP headers [\#353](https://github.com/hub4j/github-api/issues/353)
- Building failing due problematic repository server [\#344](https://github.com/hub4j/github-api/issues/344)
- Pull Request Reviews API [\#305](https://github.com/hub4j/github-api/issues/305)

**Merged pull requests:**

- Labels: add method to update color [\#390](https://github.com/hub4j/github-api/pull/390) ([batmat](https://github.com/batmat))
- Fixed OAuth connection to enterprise API [\#389](https://github.com/hub4j/github-api/pull/389) ([dorian808080](https://github.com/dorian808080))
- Fix for \#387: numeric value out of range of int [\#388](https://github.com/hub4j/github-api/pull/388) ([aburmeis](https://github.com/aburmeis))

## [github-api-1.89](https://github.com/hub4j/github-api/tree/github-api-1.89) (2017-09-09)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.88...github-api-1.89)

**Closed issues:**

- OkHttpConnector is broken [\#335](https://github.com/hub4j/github-api/issues/335)
- Support new merge methods \(squash/rebase\) for Github Pull Requests [\#326](https://github.com/hub4j/github-api/issues/326)
- Expose OAuth headers [\#303](https://github.com/hub4j/github-api/issues/303)

**Merged pull requests:**

- Added support for traffic statistics \(number of views and clones\) [\#368](https://github.com/hub4j/github-api/pull/368) ([adw1n](https://github.com/adw1n))
- issue \#360: Add support for committing multiple files [\#361](https://github.com/hub4j/github-api/pull/361) ([siordache](https://github.com/siordache))
- Expose Headers [\#339](https://github.com/hub4j/github-api/pull/339) ([KostyaSha](https://github.com/KostyaSha))
- Remove unused imports [\#337](https://github.com/hub4j/github-api/pull/337) ([sebkur](https://github.com/sebkur))
- Add support for MergeMethod on GHPullRequest [\#333](https://github.com/hub4j/github-api/pull/333) ([greggian](https://github.com/greggian))
- Add some level of synchronization to the root of the API [\#283](https://github.com/hub4j/github-api/pull/283) ([mmitche](https://github.com/mmitche))

## [github-api-1.88](https://github.com/hub4j/github-api/tree/github-api-1.88) (2017-09-09)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.87...github-api-1.88)

## [github-api-1.87](https://github.com/hub4j/github-api/tree/github-api-1.87) (2017-09-09)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.86...github-api-1.87)

**Closed issues:**

- Cannot merge newly created GHPullRequest [\#367](https://github.com/hub4j/github-api/issues/367)
- Unable to search thru search API [\#365](https://github.com/hub4j/github-api/issues/365)
- Unable to find documentation for issue search [\#364](https://github.com/hub4j/github-api/issues/364)
- Commit Search API implemented or not? [\#345](https://github.com/hub4j/github-api/issues/345)
- How can I get Latest Release of Repository? [\#341](https://github.com/hub4j/github-api/issues/341)

**Merged pull requests:**

- bridge-method-annotation should be an optional dep [\#378](https://github.com/hub4j/github-api/pull/378) ([jglick](https://github.com/jglick))
- Add basic support for tag objects [\#375](https://github.com/hub4j/github-api/pull/375) ([stephenc](https://github.com/stephenc))
- - improve branch protection support [\#369](https://github.com/hub4j/github-api/pull/369) ([jgangemi](https://github.com/jgangemi))
- Add missing event types used by repository webhooks [\#363](https://github.com/hub4j/github-api/pull/363) ([PauloMigAlmeida](https://github.com/PauloMigAlmeida))
- Add ping hook method [\#362](https://github.com/hub4j/github-api/pull/362) ([KostyaSha](https://github.com/KostyaSha))
- \[JENKINS-36240\] /repos/:owner/:repo/collaborators/:username/permission no longer requires korra preview [\#358](https://github.com/hub4j/github-api/pull/358) ([jglick](https://github.com/jglick))
- Add support for PR reviews preview [\#352](https://github.com/hub4j/github-api/pull/352) ([stephenc](https://github.com/stephenc))
- Add the Commit search API [\#351](https://github.com/hub4j/github-api/pull/351) ([mdeverdelhan](https://github.com/mdeverdelhan))
- add latest release [\#343](https://github.com/hub4j/github-api/pull/343) ([kamontat](https://github.com/kamontat))
- Ignore eclipse files [\#338](https://github.com/hub4j/github-api/pull/338) ([sebkur](https://github.com/sebkur))
- Fix a bug in the Javadocs \(due to copy and paste\) [\#332](https://github.com/hub4j/github-api/pull/332) ([sebkur](https://github.com/sebkur))

## [github-api-1.86](https://github.com/hub4j/github-api/tree/github-api-1.86) (2017-07-03)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.85...github-api-1.86)

**Merged pull requests:**

- \[JENKINS-45142\] Retry connections after getting SocketTimeoutException [\#359](https://github.com/hub4j/github-api/pull/359) ([jglick](https://github.com/jglick))

## [github-api-1.85](https://github.com/hub4j/github-api/tree/github-api-1.85) (2017-03-01)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.84...github-api-1.85)

**Closed issues:**

- getPusher\(\) returns null in call to getPusher\(\) [\#342](https://github.com/hub4j/github-api/issues/342)

**Merged pull requests:**

- Ensure that connections are closed for error responses [\#346](https://github.com/hub4j/github-api/pull/346) ([stephenc](https://github.com/stephenc))
- Correct algebra in \#327 [\#329](https://github.com/hub4j/github-api/pull/329) ([stephenc](https://github.com/stephenc))

## [github-api-1.84](https://github.com/hub4j/github-api/tree/github-api-1.84) (2017-01-10)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.83...github-api-1.84)

## [github-api-1.83](https://github.com/hub4j/github-api/tree/github-api-1.83) (2017-01-10)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.82...github-api-1.83)

**Closed issues:**

- Webhook creation response error [\#328](https://github.com/hub4j/github-api/issues/328)

**Merged pull requests:**

- Expose Rate Limit Headers [\#327](https://github.com/hub4j/github-api/pull/327) ([stephenc](https://github.com/stephenc))
- Branch protection attrs [\#325](https://github.com/hub4j/github-api/pull/325) ([jeffnelson](https://github.com/jeffnelson))
- \[JENKINS-36240\] Added GHRepository.getPermission\(String\) [\#324](https://github.com/hub4j/github-api/pull/324) ([jglick](https://github.com/jglick))

## [github-api-1.82](https://github.com/hub4j/github-api/tree/github-api-1.82) (2016-12-17)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.81...github-api-1.82)

**Closed issues:**

- API response time [\#322](https://github.com/hub4j/github-api/issues/322)
- getLabels\(\) call for a pull request results in downstream 404s [\#319](https://github.com/hub4j/github-api/issues/319)
- Gist Searching [\#318](https://github.com/hub4j/github-api/issues/318)
- Adding users to organization-owned repos not possible [\#317](https://github.com/hub4j/github-api/issues/317)
- GHSearchBuilder terms are cumulative when I expected them to overwrite previous one [\#314](https://github.com/hub4j/github-api/issues/314)
- Support ordering in searches [\#313](https://github.com/hub4j/github-api/issues/313)
- Update OkHttp usage [\#312](https://github.com/hub4j/github-api/issues/312)
- github.getRepository does not work for Enterprise github instance [\#263](https://github.com/hub4j/github-api/issues/263)

**Merged pull requests:**

- Fix syntactically malformed test JSON [\#323](https://github.com/hub4j/github-api/pull/323) ([jglick](https://github.com/jglick))
- Added ghRepo.getBlob\(String\) method [\#320](https://github.com/hub4j/github-api/pull/320) ([KostyaSha](https://github.com/KostyaSha))
- Fix typos in javadocs [\#315](https://github.com/hub4j/github-api/pull/315) ([davidxia](https://github.com/davidxia))

## [github-api-1.81](https://github.com/hub4j/github-api/tree/github-api-1.81) (2016-11-21)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.80...github-api-1.81)

**Closed issues:**

- Multiple assignee support [\#294](https://github.com/hub4j/github-api/issues/294)
- Missing support for determining if authenticated user is organization owner [\#293](https://github.com/hub4j/github-api/issues/293)

## [github-api-1.80](https://github.com/hub4j/github-api/tree/github-api-1.80) (2016-11-17)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.79...github-api-1.80)

**Closed issues:**

- Testing reaction [\#311](https://github.com/hub4j/github-api/issues/311)
- Is there a way to update contents in bulk? [\#310](https://github.com/hub4j/github-api/issues/310)
- GitHub\#listAllUsers\(\) demanded \(enterprise use-case\) [\#309](https://github.com/hub4j/github-api/issues/309)
- Delete OAuth Token [\#308](https://github.com/hub4j/github-api/issues/308)
- How to find a pull request using the Search API and get its details? [\#298](https://github.com/hub4j/github-api/issues/298)

**Merged pull requests:**

- Add portion of auth/application API. [\#307](https://github.com/hub4j/github-api/pull/307) ([KostyaSha](https://github.com/KostyaSha))
- Add offline support to the API to make parsing events easier [\#306](https://github.com/hub4j/github-api/pull/306) ([stephenc](https://github.com/stephenc))
- Fix fields of GHRepository [\#304](https://github.com/hub4j/github-api/pull/304) ([wolfogre](https://github.com/wolfogre))

## [github-api-1.79](https://github.com/hub4j/github-api/tree/github-api-1.79) (2016-10-25)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.78...github-api-1.79)

**Merged pull requests:**

- url encode hashes in ref names [\#299](https://github.com/hub4j/github-api/pull/299) ([bsheats](https://github.com/bsheats))

## [github-api-1.78](https://github.com/hub4j/github-api/tree/github-api-1.78) (2016-10-24)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.77...github-api-1.78)

**Closed issues:**

- Allow edit for maintainer support [\#297](https://github.com/hub4j/github-api/issues/297)
- run mvn install fail! Failure to find org.jenkins-ci:jenkins:pom:1.26 ?? [\#296](https://github.com/hub4j/github-api/issues/296)

**Merged pull requests:**

- Expose the commit dates [\#300](https://github.com/hub4j/github-api/pull/300) ([stephenc](https://github.com/stephenc))
- Use maximum permitted page size [\#295](https://github.com/hub4j/github-api/pull/295) ([jglick](https://github.com/jglick))

## [github-api-1.77](https://github.com/hub4j/github-api/tree/github-api-1.77) (2016-08-06)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.76...github-api-1.77)

**Closed issues:**

- weird format for get list of organizations [\#291](https://github.com/hub4j/github-api/issues/291)
- OkHttp is out of date [\#290](https://github.com/hub4j/github-api/issues/290)
- 400 from OKHttp getInputStream [\#288](https://github.com/hub4j/github-api/issues/288)
- pagination support search APIs [\#287](https://github.com/hub4j/github-api/issues/287)
- significant more API calls for same code [\#286](https://github.com/hub4j/github-api/issues/286)
- Excessive concurrent request rate limit not handled [\#285](https://github.com/hub4j/github-api/issues/285)
- team.add\(repo\) should accept permission flag [\#279](https://github.com/hub4j/github-api/issues/279)
- Pull request mergeability is boolean but should be trinary [\#275](https://github.com/hub4j/github-api/issues/275)
- Webhook with content type "application/json" [\#274](https://github.com/hub4j/github-api/issues/274)
- Disable rate\_limit check on GitHub Enterprise completely [\#273](https://github.com/hub4j/github-api/issues/273)
- java.lang.IllegalArgumentException: byteString == null [\#265](https://github.com/hub4j/github-api/issues/265)
- Failed to deserialize list of contributors when repo is empty [\#261](https://github.com/hub4j/github-api/issues/261)
- github-api does not distinguish between user and organisation [\#260](https://github.com/hub4j/github-api/issues/260)
- API Rate Limit Exceeding [\#258](https://github.com/hub4j/github-api/issues/258)

**Merged pull requests:**

- Implement an abuse handler [\#289](https://github.com/hub4j/github-api/pull/289) ([mmitche](https://github.com/mmitche))

## [github-api-1.76](https://github.com/hub4j/github-api/tree/github-api-1.76) (2016-06-03)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.75...github-api-1.76)

**Closed issues:**

- GitHub.getRateLimit hangs inside SocketInputStream.socketRead0 [\#271](https://github.com/hub4j/github-api/issues/271)
- \(re\)open method on GHMilestone [\#269](https://github.com/hub4j/github-api/issues/269)
- More meaning toString implementations [\#268](https://github.com/hub4j/github-api/issues/268)
- GHRepository.fork\(\) does not block on the async operation until it's complete [\#264](https://github.com/hub4j/github-api/issues/264)
- Add Support for the Protected Branches API [\#262](https://github.com/hub4j/github-api/issues/262)

**Merged pull requests:**

- related to JENKINS-34834. updating test for similar condition [\#282](https://github.com/hub4j/github-api/pull/282) ([apemberton](https://github.com/apemberton))
- Add Slug to GHTeam per v3 API: https://developer.github.com/v3/orgs/t… [\#281](https://github.com/hub4j/github-api/pull/281) ([apemberton](https://github.com/apemberton))
- Fixed broken link [\#278](https://github.com/hub4j/github-api/pull/278) ([jglick](https://github.com/jglick))
- Updated Date was wrong [\#277](https://github.com/hub4j/github-api/pull/277) ([KondaReddyR](https://github.com/KondaReddyR))
- Add support to delete a team [\#276](https://github.com/hub4j/github-api/pull/276) ([thug-gamer](https://github.com/thug-gamer))
- Added support for the extended stargazers API in Github V3 API [\#272](https://github.com/hub4j/github-api/pull/272) ([noctarius](https://github.com/noctarius))
- \[\#269\] Add reopen method on GHMilestone [\#270](https://github.com/hub4j/github-api/pull/270) ([szpak](https://github.com/szpak))

## [github-api-1.75](https://github.com/hub4j/github-api/tree/github-api-1.75) (2016-04-13)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.74...github-api-1.75)

## [github-api-1.74](https://github.com/hub4j/github-api/tree/github-api-1.74) (2016-03-19)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.73...github-api-1.74)

**Closed issues:**

- missing maven central dependencies in 1.72 [\#257](https://github.com/hub4j/github-api/issues/257)
- Not able to specify client Id and client secret to connect using OAuth [\#256](https://github.com/hub4j/github-api/issues/256)
- Stuck in Github connect [\#255](https://github.com/hub4j/github-api/issues/255)
- Infinite loop in `GHNotificationStream$1.fetch\(\)` [\#252](https://github.com/hub4j/github-api/issues/252)
- How to get statistics using this library [\#241](https://github.com/hub4j/github-api/issues/241)

**Merged pull requests:**

- Animal sniffer [\#259](https://github.com/hub4j/github-api/pull/259) ([Shredder121](https://github.com/Shredder121))
- Better error messages [\#254](https://github.com/hub4j/github-api/pull/254) ([cyrille-leclerc](https://github.com/cyrille-leclerc))
- Fix \#252: infinite loop because the "hypertext engine" generates invalid URLs [\#253](https://github.com/hub4j/github-api/pull/253) ([cyrille-leclerc](https://github.com/cyrille-leclerc))
- Improve checkApiUrlValidity\(\) method [\#251](https://github.com/hub4j/github-api/pull/251) ([recena](https://github.com/recena))

## [github-api-1.73](https://github.com/hub4j/github-api/tree/github-api-1.73) (2016-03-01)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.72...github-api-1.73)

**Closed issues:**

- traceback if webhook set to "send me everything". [\#250](https://github.com/hub4j/github-api/issues/250)
- Error on github pull request populate [\#246](https://github.com/hub4j/github-api/issues/246)
- How to avoid connection timeout while using github.listallpublicrepositories [\#243](https://github.com/hub4j/github-api/issues/243)
- myissues [\#242](https://github.com/hub4j/github-api/issues/242)
- Question - Stargazers and stars release. [\#239](https://github.com/hub4j/github-api/issues/239)
- GHEventInfo getPayload [\#238](https://github.com/hub4j/github-api/issues/238)

**Merged pull requests:**

- Added getHtmlUrl\(\) to GHCommit [\#249](https://github.com/hub4j/github-api/pull/249) ([zapelin](https://github.com/zapelin))
- Populate commit with data for getCommitShortInfo [\#248](https://github.com/hub4j/github-api/pull/248) ([daniel-beck](https://github.com/daniel-beck))
- Fix error when creating email service hook [\#245](https://github.com/hub4j/github-api/pull/245) ([daniel-beck](https://github.com/daniel-beck))
- Minor amendment to the documentation [\#244](https://github.com/hub4j/github-api/pull/244) ([benbek](https://github.com/benbek))
- Support for auto\_init [\#240](https://github.com/hub4j/github-api/pull/240) ([dlovera](https://github.com/dlovera))

## [github-api-1.72](https://github.com/hub4j/github-api/tree/github-api-1.72) (2015-12-10)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.71...github-api-1.72)

**Fixed bugs:**

- GHRepository.getCompare\(GHBranch, GHBranch\) does not allow for cross-fork compares [\#218](https://github.com/hub4j/github-api/issues/218)

**Closed issues:**

- Access to raw JSON for issues? [\#235](https://github.com/hub4j/github-api/issues/235)
- GHRepository.getPullRequests\(\) / listPullRequests\(\) does not support the sort parameter [\#234](https://github.com/hub4j/github-api/issues/234)
- Commit obtained by queryCommits does not contain files [\#230](https://github.com/hub4j/github-api/issues/230)
- get starred projects by the user [\#228](https://github.com/hub4j/github-api/issues/228)
- update of file in github [\#227](https://github.com/hub4j/github-api/issues/227)
- Add per\_page paramter to the search builders [\#221](https://github.com/hub4j/github-api/issues/221)
- RateLimitHandler Bug [\#220](https://github.com/hub4j/github-api/issues/220)
- Followers and following pagination [\#213](https://github.com/hub4j/github-api/issues/213)
- Some features of this plugin no longer work with the recent changes to api.github.com [\#202](https://github.com/hub4j/github-api/issues/202)
- Add an option to get the id of an event [\#199](https://github.com/hub4j/github-api/issues/199)
- Need documentation for how to clone a git repo to the disk [\#196](https://github.com/hub4j/github-api/issues/196)
- JDK Version [\#188](https://github.com/hub4j/github-api/issues/188)
- Obtain Pushed Commit using GitHub API [\#186](https://github.com/hub4j/github-api/issues/186)
- @WithBridgeMethods decorator in GHObject has no value adapterMethod [\#171](https://github.com/hub4j/github-api/issues/171)

## [github-api-1.71](https://github.com/hub4j/github-api/tree/github-api-1.71) (2015-12-01)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.70...github-api-1.71)

**Fixed bugs:**

- \#218 enable cross fork compare [\#219](https://github.com/hub4j/github-api/pull/219) ([if6was9](https://github.com/if6was9))
- \#215 fix read\(\) failure with private repos [\#216](https://github.com/hub4j/github-api/pull/216) ([if6was9](https://github.com/if6was9))

**Closed issues:**

- Push to repo [\#229](https://github.com/hub4j/github-api/issues/229)
- Can not find the .github file [\#223](https://github.com/hub4j/github-api/issues/223)
- Can not find the .github file [\#222](https://github.com/hub4j/github-api/issues/222)
- GHContent.read\(\) is broken due to incorrect HTTP Method [\#215](https://github.com/hub4j/github-api/issues/215)

**Merged pull requests:**

- Use default timeouts for URLConnections [\#237](https://github.com/hub4j/github-api/pull/237) ([olivergondza](https://github.com/olivergondza))
- Findbugs plugin has been upgraded [\#236](https://github.com/hub4j/github-api/pull/236) ([recena](https://github.com/recena))
- Add information about mirror url if it exist. [\#233](https://github.com/hub4j/github-api/pull/233) ([vparfonov](https://github.com/vparfonov))
- Added a new method to validate the GitHub API URL [\#232](https://github.com/hub4j/github-api/pull/232) ([recena](https://github.com/recena))
- Support for merge\_commit\_sha [\#231](https://github.com/hub4j/github-api/pull/231) ([recena](https://github.com/recena))
- Check builder result to either be a token or a user [\#226](https://github.com/hub4j/github-api/pull/226) ([Shredder121](https://github.com/Shredder121))
- Overzealous FindBugs changes. [\#225](https://github.com/hub4j/github-api/pull/225) ([Shredder121](https://github.com/Shredder121))
- Remove trailing slash when requesting directory content [\#224](https://github.com/hub4j/github-api/pull/224) ([Shredder121](https://github.com/Shredder121))
- Support Milestone closed\_at date [\#217](https://github.com/hub4j/github-api/pull/217) ([dblevins](https://github.com/dblevins))

## [github-api-1.70](https://github.com/hub4j/github-api/tree/github-api-1.70) (2015-08-15)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.69...github-api-1.70)

**Closed issues:**

- How to search all repos based on a keyword? [\#211](https://github.com/hub4j/github-api/issues/211)
- Missing: List private organizations a user belongs [\#209](https://github.com/hub4j/github-api/issues/209)

**Merged pull requests:**

- Added option to edit GitHub release once it is created [\#212](https://github.com/hub4j/github-api/pull/212) ([umajeric](https://github.com/umajeric))
- Cleanup issues discovered by FindBugs [\#210](https://github.com/hub4j/github-api/pull/210) ([oleg-nenashev](https://github.com/oleg-nenashev))

## [github-api-1.69](https://github.com/hub4j/github-api/tree/github-api-1.69) (2015-07-17)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.68...github-api-1.69)

**Closed issues:**

- NullPointerException in Requester.java [\#194](https://github.com/hub4j/github-api/issues/194)
- Dependency problems [\#191](https://github.com/hub4j/github-api/issues/191)
- Enable this to work with GitHub Enterprise [\#184](https://github.com/hub4j/github-api/issues/184)
- no way to list forks of a repository [\#183](https://github.com/hub4j/github-api/issues/183)
- Nothing is Fetched when calling repo.listNotifications\(\); [\#181](https://github.com/hub4j/github-api/issues/181)
- java.net.ProtocolException: DELETE does not support writing [\#180](https://github.com/hub4j/github-api/issues/180)

**Merged pull requests:**

- Fix potential NPE in the code [\#208](https://github.com/hub4j/github-api/pull/208) ([oleg-nenashev](https://github.com/oleg-nenashev))
- Enable FindBugs in the repo [\#207](https://github.com/hub4j/github-api/pull/207) ([oleg-nenashev](https://github.com/oleg-nenashev))
- Specified the GET [\#206](https://github.com/hub4j/github-api/pull/206) ([torodev](https://github.com/torodev))
- Fix NPE found when resolving issues from search api [\#205](https://github.com/hub4j/github-api/pull/205) ([stephenc](https://github.com/stephenc))
- add ping event to GH events enum [\#204](https://github.com/hub4j/github-api/pull/204) ([lanwen](https://github.com/lanwen))
- GitHub API have changed the semantics of /user/repos API [\#203](https://github.com/hub4j/github-api/pull/203) ([lucamilanesio](https://github.com/lucamilanesio))
- Add support for update/delete operations on issue comments [\#201](https://github.com/hub4j/github-api/pull/201) ([henryju](https://github.com/henryju))
- fix for unused json map when method with body, but body is null [\#200](https://github.com/hub4j/github-api/pull/200) ([lanwen](https://github.com/lanwen))
- fix for GH Enterprise which does not have rate limit reset field [\#198](https://github.com/hub4j/github-api/pull/198) ([lanwen](https://github.com/lanwen))
- added Page Build [\#197](https://github.com/hub4j/github-api/pull/197) ([treeduck](https://github.com/treeduck))
- Enable creation and retrieval of org webhooks [\#192](https://github.com/hub4j/github-api/pull/192) ([chrisrhut](https://github.com/chrisrhut))
- allow default branch to be set [\#190](https://github.com/hub4j/github-api/pull/190) ([if6was9](https://github.com/if6was9))
- fixed regression that caused POST operations to be sent as GET [\#189](https://github.com/hub4j/github-api/pull/189) ([if6was9](https://github.com/if6was9))
- Recognize previous\_file field in GHCommit.File [\#187](https://github.com/hub4j/github-api/pull/187) ([yegorius](https://github.com/yegorius))
- Fixes \#183: added a method listForks\(\) to GHRepository [\#185](https://github.com/hub4j/github-api/pull/185) ([marc-guenther](https://github.com/marc-guenther))
- Fix invalid URL for pull request comments update/delete [\#182](https://github.com/hub4j/github-api/pull/182) ([henryju](https://github.com/henryju))

## [github-api-1.68](https://github.com/hub4j/github-api/tree/github-api-1.68) (2015-04-20)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.67...github-api-1.68)

**Closed issues:**

- Merging with SHA1 [\#176](https://github.com/hub4j/github-api/issues/176)

**Merged pull requests:**

- Fix NullPointerException on RateLimitHandler when handling API errors. [\#179](https://github.com/hub4j/github-api/pull/179) ([lskillen](https://github.com/lskillen))

## [github-api-1.67](https://github.com/hub4j/github-api/tree/github-api-1.67) (2015-04-14)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.66...github-api-1.67)

**Merged pull requests:**

- Added getters for the objects notifications refer to [\#177](https://github.com/hub4j/github-api/pull/177) ([syniuhin](https://github.com/syniuhin))
- Added the source attribute to GHRepository [\#175](https://github.com/hub4j/github-api/pull/175) ([kickroot](https://github.com/kickroot))
- Improvements [\#170](https://github.com/hub4j/github-api/pull/170) ([KostyaSha](https://github.com/KostyaSha))
- Throw error for bad creds [\#169](https://github.com/hub4j/github-api/pull/169) ([KostyaSha](https://github.com/KostyaSha))

## [github-api-1.66](https://github.com/hub4j/github-api/tree/github-api-1.66) (2015-03-24)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.64...github-api-1.66)

**Closed issues:**

- Rate limiting causes silent freezing failures [\#172](https://github.com/hub4j/github-api/issues/172)
- Pluggable persistent cache support [\#168](https://github.com/hub4j/github-api/issues/168)
- Implement /search [\#158](https://github.com/hub4j/github-api/issues/158)
- Support notifications api [\#119](https://github.com/hub4j/github-api/issues/119)
- Consider committing to using OkHttp in preference to HttpURLConnection [\#104](https://github.com/hub4j/github-api/issues/104)

## [github-api-1.64](https://github.com/hub4j/github-api/tree/github-api-1.64) (2015-03-22)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.63...github-api-1.64)

**Closed issues:**

- SBT build is broken from version 1.53 [\#167](https://github.com/hub4j/github-api/issues/167)
- Reading a gist in anonymonous mode causes error [\#166](https://github.com/hub4j/github-api/issues/166)
- Add support for the Markdown API [\#165](https://github.com/hub4j/github-api/issues/165)
- GHContent\#content always returns master version [\#162](https://github.com/hub4j/github-api/issues/162)
- infinite Thread usage loop with handleApiError [\#159](https://github.com/hub4j/github-api/issues/159)
- /repositories? [\#157](https://github.com/hub4j/github-api/issues/157)
- 502 Bad Gateway error from GHRelease.uploadAsset [\#135](https://github.com/hub4j/github-api/issues/135)

**Merged pull requests:**

- Add method to get the list of languages using in repository [\#161](https://github.com/hub4j/github-api/pull/161) ([khoa-nd](https://github.com/khoa-nd))
- Picking endpoint from the properties file and environment variables [\#156](https://github.com/hub4j/github-api/pull/156) ([ashwanthkumar](https://github.com/ashwanthkumar))
- Implementing github trees [\#155](https://github.com/hub4j/github-api/pull/155) ([ddtxra](https://github.com/ddtxra))

## [github-api-1.63](https://github.com/hub4j/github-api/tree/github-api-1.63) (2015-03-02)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.62...github-api-1.63)

**Closed issues:**

- Github Trees support [\#153](https://github.com/hub4j/github-api/issues/153)
- getEvents fails periodically with odd exception [\#92](https://github.com/hub4j/github-api/issues/92)

## [github-api-1.62](https://github.com/hub4j/github-api/tree/github-api-1.62) (2015-02-15)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.61...github-api-1.62)

**Closed issues:**

- "Stars" and "Forks" parameters for Gist [\#154](https://github.com/hub4j/github-api/issues/154)
- NPE during tag.getCommit\(\).getLastStatus\(\) [\#152](https://github.com/hub4j/github-api/issues/152)
- Incorrect behavior of GHRepository.getReadme [\#150](https://github.com/hub4j/github-api/issues/150)
- Make public GHRepository.getOwnerName [\#149](https://github.com/hub4j/github-api/issues/149)
- Add information about thread-safety in Javadoc [\#148](https://github.com/hub4j/github-api/issues/148)
- Add API to retrieve list of contributors [\#147](https://github.com/hub4j/github-api/issues/147)
- Parsing a push event payload doesn't get the repository [\#144](https://github.com/hub4j/github-api/issues/144)
- GHRelease issue [\#138](https://github.com/hub4j/github-api/issues/138)
- GHRepository.getIssues\(GHIssueState.CLOSED\) also return pull requests [\#134](https://github.com/hub4j/github-api/issues/134)
- Feature: watched Repositories [\#130](https://github.com/hub4j/github-api/issues/130)
- Cannot create repository in organisation [\#118](https://github.com/hub4j/github-api/issues/118)
- Different ways of getting issue.getClosedby\(\) return different results. [\#113](https://github.com/hub4j/github-api/issues/113)
- NullPointerException in GHPerson [\#111](https://github.com/hub4j/github-api/issues/111)
- Suggested enhancement: GHPerson\#getAllRepositories\(\) [\#110](https://github.com/hub4j/github-api/issues/110)
- add support for proxy [\#109](https://github.com/hub4j/github-api/issues/109)
- Error while accessing rate limit API - No subject alternative DNS name matching api.github.com found. [\#108](https://github.com/hub4j/github-api/issues/108)
- Add support for retrieving repository available labels [\#105](https://github.com/hub4j/github-api/issues/105)
- getReadme its outdated [\#99](https://github.com/hub4j/github-api/issues/99)

## [github-api-1.61](https://github.com/hub4j/github-api/tree/github-api-1.61) (2015-02-14)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.60...github-api-1.61)

## [github-api-1.60](https://github.com/hub4j/github-api/tree/github-api-1.60) (2015-02-14)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.59...github-api-1.60)

**Closed issues:**

- GHTeam.getMembers\(\) response does not include all users if user count \>30 [\#145](https://github.com/hub4j/github-api/issues/145)

**Merged pull requests:**

- fix \#145 GHTeam.getMembers\(\) does not page properly [\#146](https://github.com/hub4j/github-api/pull/146) ([if6was9](https://github.com/if6was9))
- Complete api implementation for the new github deployment api [\#143](https://github.com/hub4j/github-api/pull/143) ([suryagaddipati](https://github.com/suryagaddipati))
- Add code for creating deployments for a repo [\#142](https://github.com/hub4j/github-api/pull/142) ([suryagaddipati](https://github.com/suryagaddipati))
- Trivial change to enable creating/updating binary content \(files\). [\#141](https://github.com/hub4j/github-api/pull/141) ([alvaro1728](https://github.com/alvaro1728))
- Put mockito in the test scope. [\#139](https://github.com/hub4j/github-api/pull/139) ([farmdawgnation](https://github.com/farmdawgnation))
- added 'diverged' constant to GHCompare.Status enum [\#137](https://github.com/hub4j/github-api/pull/137) ([simonecarriero](https://github.com/simonecarriero))
- Add paging support for Team's Repositories [\#136](https://github.com/hub4j/github-api/pull/136) ([rtyley](https://github.com/rtyley))

## [github-api-1.59](https://github.com/hub4j/github-api/tree/github-api-1.59) (2014-10-08)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.58...github-api-1.59)

**Merged pull requests:**

- Add GHCompare.getFiles\(\) method to be able to see the precise files chan... [\#132](https://github.com/hub4j/github-api/pull/132) ([mocleiri](https://github.com/mocleiri))
- Modify GitHubBuilder to resolve user credentials from the system environ... [\#131](https://github.com/hub4j/github-api/pull/131) ([mocleiri](https://github.com/mocleiri))
- Allow pullRequest.getHead\(\).getRepository\(\).getCommit\(headSha1\) to work [\#129](https://github.com/hub4j/github-api/pull/129) ([mocleiri](https://github.com/mocleiri))
- Update github scopes according to https://developer.github.com/v3/oauth/\#scopes [\#128](https://github.com/hub4j/github-api/pull/128) ([ndeloof](https://github.com/ndeloof))
- Allow to use custom HttpConnector when only OAuth token is given [\#124](https://github.com/hub4j/github-api/pull/124) ([ohtake](https://github.com/ohtake))
- Use issues endpoints for pull requests [\#123](https://github.com/hub4j/github-api/pull/123) ([rtyley](https://github.com/rtyley))
- Add missing field browser\_download\_url in GHAsset [\#122](https://github.com/hub4j/github-api/pull/122) ([tbruyelle](https://github.com/tbruyelle))

## [github-api-1.58](https://github.com/hub4j/github-api/tree/github-api-1.58) (2014-08-30)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.57...github-api-1.58)

**Merged pull requests:**

- All the refs worth knowing: Implementation of ref updating and deleting. [\#117](https://github.com/hub4j/github-api/pull/117) ([farmdawgnation](https://github.com/farmdawgnation))
- Remove getPath\(\) [\#116](https://github.com/hub4j/github-api/pull/116) ([DavidTanner](https://github.com/DavidTanner))
- Add missing GitHub event types. [\#115](https://github.com/hub4j/github-api/pull/115) ([bernd](https://github.com/bernd))
- get repository full name \(including owner\) [\#114](https://github.com/hub4j/github-api/pull/114) ([ndeloof](https://github.com/ndeloof))
- General pagination [\#107](https://github.com/hub4j/github-api/pull/107) ([msperisen](https://github.com/msperisen))

## [github-api-1.57](https://github.com/hub4j/github-api/tree/github-api-1.57) (2014-08-19)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.56...github-api-1.57)

**Merged pull requests:**

- Get all orgs/teams/permissions in a single GitHub API call [\#112](https://github.com/hub4j/github-api/pull/112) ([lucamilanesio](https://github.com/lucamilanesio))
- Implement pagination on list of private+public repos of a user. [\#106](https://github.com/hub4j/github-api/pull/106) ([lucamilanesio](https://github.com/lucamilanesio))

## [github-api-1.56](https://github.com/hub4j/github-api/tree/github-api-1.56) (2014-07-03)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.55...github-api-1.56)

**Closed issues:**

- Unable to access commit date through api. [\#100](https://github.com/hub4j/github-api/issues/100)
- Add support for commit status contexts. [\#96](https://github.com/hub4j/github-api/issues/96)

**Merged pull requests:**

- Update to OkHttp 2.0.0, which has a new OkUrlFactory [\#103](https://github.com/hub4j/github-api/pull/103) ([rtyley](https://github.com/rtyley))
- Better FNFE from delete\(\) [\#102](https://github.com/hub4j/github-api/pull/102) ([jglick](https://github.com/jglick))
- Un-finalize a handful of classes. [\#101](https://github.com/hub4j/github-api/pull/101) ([farmdawgnation](https://github.com/farmdawgnation))

## [github-api-1.55](https://github.com/hub4j/github-api/tree/github-api-1.55) (2014-06-08)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.54...github-api-1.55)

**Merged pull requests:**

- Add support for adding context to commit status. [\#97](https://github.com/hub4j/github-api/pull/97) ([suryagaddipati](https://github.com/suryagaddipati))

## [github-api-1.54](https://github.com/hub4j/github-api/tree/github-api-1.54) (2014-06-05)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.53...github-api-1.54)

**Closed issues:**

- Version 1.8 of bridge-method-annotation not available in maven central [\#91](https://github.com/hub4j/github-api/issues/91)
- Ability to specify both branch and sha parameters at same time in GHCommitQueryBuilder [\#90](https://github.com/hub4j/github-api/issues/90)

**Merged pull requests:**

- Add support for retriving a single ref [\#95](https://github.com/hub4j/github-api/pull/95) ([suryagaddipati](https://github.com/suryagaddipati))
- Add support for adding deploykeys to repo [\#94](https://github.com/hub4j/github-api/pull/94) ([suryagaddipati](https://github.com/suryagaddipati))
- Upgrading to 1.12 version for bridge-method-annotation and bridge-method-injector - fix for \#91 [\#93](https://github.com/hub4j/github-api/pull/93) ([vr100](https://github.com/vr100))

## [github-api-1.53](https://github.com/hub4j/github-api/tree/github-api-1.53) (2014-05-10)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.52...github-api-1.53)

**Closed issues:**

- GHUser.getRepositories\(\) does not pull private repositories [\#88](https://github.com/hub4j/github-api/issues/88)
- create pull requests? [\#79](https://github.com/hub4j/github-api/issues/79)
- getRateLimit\(\) fails for GitHub Enterprise [\#78](https://github.com/hub4j/github-api/issues/78)
- GHRepository assumes public github.com, won't work with github enterprise [\#64](https://github.com/hub4j/github-api/issues/64)
- Getting private repositories for a private organization [\#62](https://github.com/hub4j/github-api/issues/62)
- ClosedBy returns nothing for closed issues [\#60](https://github.com/hub4j/github-api/issues/60)
- class file for com.infradna.tool.bridge\_method\_injector.WithBridgeMethods not found [\#54](https://github.com/hub4j/github-api/issues/54)
- add support of Gists [\#53](https://github.com/hub4j/github-api/issues/53)
- GHUser is missing a getHTMLURL\(\) method [\#52](https://github.com/hub4j/github-api/issues/52)
- \[Feature Request\] get tags [\#40](https://github.com/hub4j/github-api/issues/40)
- GitHub.connectAnonymously\(\) fails because of a lack of credentials. [\#39](https://github.com/hub4j/github-api/issues/39)

**Merged pull requests:**

- create a Release & Branch [\#84](https://github.com/hub4j/github-api/pull/84) ([fanfansama](https://github.com/fanfansama))

## [github-api-1.52](https://github.com/hub4j/github-api/tree/github-api-1.52) (2014-05-10)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.51...github-api-1.52)

**Closed issues:**

- File size limited to 1MB [\#85](https://github.com/hub4j/github-api/issues/85)

**Merged pull requests:**

- Fix bug in GHMyself.getEmails due to API change [\#87](https://github.com/hub4j/github-api/pull/87) ([kellycampbell](https://github.com/kellycampbell))
- Using builder pattern to list commits in a repo by author, branch, etc [\#86](https://github.com/hub4j/github-api/pull/86) ([vr100](https://github.com/vr100))
- add tarball\_url and zipball\_url to GHRelease [\#83](https://github.com/hub4j/github-api/pull/83) ([antonkrasov](https://github.com/antonkrasov))

## [github-api-1.51](https://github.com/hub4j/github-api/tree/github-api-1.51) (2014-04-13)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.50...github-api-1.51)

**Closed issues:**

- Add support for setting explicit connection timeouts [\#81](https://github.com/hub4j/github-api/issues/81)
- Throwing an Error when an IOException occurs is overly fatal [\#65](https://github.com/hub4j/github-api/issues/65)

**Merged pull requests:**

- Cast url.openConnection\(\) to HttpURLConnection instead of HttpsURLConnec... [\#82](https://github.com/hub4j/github-api/pull/82) ([prazanna](https://github.com/prazanna))
- Add support for removing a user from an Organisation [\#80](https://github.com/hub4j/github-api/pull/80) ([rtyley](https://github.com/rtyley))

## [github-api-1.50](https://github.com/hub4j/github-api/tree/github-api-1.50) (2014-03-28)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.49...github-api-1.50)

**Closed issues:**

- publish 1.49 version to jenkins maven repo [\#63](https://github.com/hub4j/github-api/issues/63)

**Merged pull requests:**

- Add org public-members call, to complement the full members list [\#76](https://github.com/hub4j/github-api/pull/76) ([rtyley](https://github.com/rtyley))
- Support the check-user-team-membership API call [\#75](https://github.com/hub4j/github-api/pull/75) ([rtyley](https://github.com/rtyley))
- Fix GHIssue.setLabels\(\) [\#73](https://github.com/hub4j/github-api/pull/73) ([rtyley](https://github.com/rtyley))
- Un-finalize GHContent. [\#72](https://github.com/hub4j/github-api/pull/72) ([farmdawgnation](https://github.com/farmdawgnation))
- Created new method to automate the merge [\#69](https://github.com/hub4j/github-api/pull/69) ([vanjikumaran](https://github.com/vanjikumaran))
- Enable org member filtering [\#68](https://github.com/hub4j/github-api/pull/68) ([lindseydew](https://github.com/lindseydew))
- Support paging when fetching organization members [\#66](https://github.com/hub4j/github-api/pull/66) ([ryankennedy](https://github.com/ryankennedy))

## [github-api-1.49](https://github.com/hub4j/github-api/tree/github-api-1.49) (2014-01-06)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.48...github-api-1.49)

## [github-api-1.48](https://github.com/hub4j/github-api/tree/github-api-1.48) (2014-01-06)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.47...github-api-1.48)

**Closed issues:**

- Implement Contents API [\#46](https://github.com/hub4j/github-api/issues/46)

**Merged pull requests:**

- Fetching of user's verified keys through standard OAuth scope. [\#61](https://github.com/hub4j/github-api/pull/61) ([lucamilanesio](https://github.com/lucamilanesio))
- Contents API [\#59](https://github.com/hub4j/github-api/pull/59) ([farmdawgnation](https://github.com/farmdawgnation))

## [github-api-1.47](https://github.com/hub4j/github-api/tree/github-api-1.47) (2013-11-27)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.46...github-api-1.47)

**Closed issues:**

- github /user/orgs fails [\#57](https://github.com/hub4j/github-api/issues/57)
- GHRepository.getIssues\(\) limited to 30 issues [\#55](https://github.com/hub4j/github-api/issues/55)

**Merged pull requests:**

- Add support for PULL\_REQUEST\_REVIEW\_COMMENT event types. [\#58](https://github.com/hub4j/github-api/pull/58) ([rtholmes](https://github.com/rtholmes))
- Use `PagedIterator\<GHIssue\>` to retrieve repository issues [\#56](https://github.com/hub4j/github-api/pull/56) ([endeavor85](https://github.com/endeavor85))

## [github-api-1.46](https://github.com/hub4j/github-api/tree/github-api-1.46) (2013-11-13)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.45...github-api-1.46)

## [github-api-1.45](https://github.com/hub4j/github-api/tree/github-api-1.45) (2013-11-09)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.44...github-api-1.45)

**Closed issues:**

- Issue labels have multiple fields now [\#48](https://github.com/hub4j/github-api/issues/48)

**Merged pull requests:**

- add support \(most of\) the release-related endpoints [\#51](https://github.com/hub4j/github-api/pull/51) ([evanchooly](https://github.com/evanchooly))
- Updates Jackson to 2.2.3 [\#50](https://github.com/hub4j/github-api/pull/50) ([pescuma](https://github.com/pescuma))
- Use a proper Label in GHIssues [\#49](https://github.com/hub4j/github-api/pull/49) ([pescuma](https://github.com/pescuma))
- Allows to define page size for repository lists and other API enhancements [\#45](https://github.com/hub4j/github-api/pull/45) ([lucamilanesio](https://github.com/lucamilanesio))

## [github-api-1.44](https://github.com/hub4j/github-api/tree/github-api-1.44) (2013-09-07)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.43...github-api-1.44)

**Closed issues:**

- getMergeableState in GHPullRequest doesn't work [\#41](https://github.com/hub4j/github-api/issues/41)

**Merged pull requests:**

- GHMyself should allow accessing the private repos and orgs too [\#43](https://github.com/hub4j/github-api/pull/43) ([stephenc](https://github.com/stephenc))
- Commit's short info model [\#42](https://github.com/hub4j/github-api/pull/42) ([paulbutenko](https://github.com/paulbutenko))

## [github-api-1.43](https://github.com/hub4j/github-api/tree/github-api-1.43) (2013-07-07)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.42...github-api-1.43)

## [github-api-1.42](https://github.com/hub4j/github-api/tree/github-api-1.42) (2013-05-07)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.41...github-api-1.42)

**Merged pull requests:**

- add repository to Pull Request payload and wrap the PR with the repository [\#38](https://github.com/hub4j/github-api/pull/38) ([janinko](https://github.com/janinko))
- Force issues-based API route for PR comments [\#37](https://github.com/hub4j/github-api/pull/37) ([spiffxp](https://github.com/spiffxp))
- Allow oauthToken to be used without login [\#36](https://github.com/hub4j/github-api/pull/36) ([spiffxp](https://github.com/spiffxp))

## [github-api-1.41](https://github.com/hub4j/github-api/tree/github-api-1.41) (2013-04-23)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.40...github-api-1.41)

**Closed issues:**

- Closing pull request using Github API return FileNotFoundException [\#34](https://github.com/hub4j/github-api/issues/34)

**Merged pull requests:**

- Stop using deprecated API tokens for Enterprise [\#33](https://github.com/hub4j/github-api/pull/33) ([watsonian](https://github.com/watsonian))

## [github-api-1.40](https://github.com/hub4j/github-api/tree/github-api-1.40) (2013-04-16)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.39...github-api-1.40)

## [github-api-1.39](https://github.com/hub4j/github-api/tree/github-api-1.39) (2013-04-16)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.38...github-api-1.39)

**Merged pull requests:**

- Provide a way to determine if the connection is anonymous [\#44](https://github.com/hub4j/github-api/pull/44) ([stephenc](https://github.com/stephenc))
- Implement GHEventPayload.IssueComment [\#32](https://github.com/hub4j/github-api/pull/32) ([janinko](https://github.com/janinko))
- implement retrieving of access token [\#31](https://github.com/hub4j/github-api/pull/31) ([janinko](https://github.com/janinko))

## [github-api-1.38](https://github.com/hub4j/github-api/tree/github-api-1.38) (2013-04-07)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.37...github-api-1.38)

**Closed issues:**

- Error 500 - No Protocol [\#29](https://github.com/hub4j/github-api/issues/29)

## [github-api-1.37](https://github.com/hub4j/github-api/tree/github-api-1.37) (2013-03-15)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.36...github-api-1.37)

**Merged pull requests:**

- Adding Compare and Refs commands to API [\#30](https://github.com/hub4j/github-api/pull/30) ([mc1arke](https://github.com/mc1arke))

## [github-api-1.36](https://github.com/hub4j/github-api/tree/github-api-1.36) (2013-01-24)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.35...github-api-1.36)

## [github-api-1.35](https://github.com/hub4j/github-api/tree/github-api-1.35) (2013-01-07)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.34...github-api-1.35)

**Closed issues:**

- Support for organization check mebership [\#23](https://github.com/hub4j/github-api/issues/23)

**Merged pull requests:**

- Password is no longer required for api usage and fix for broken base64 encoding. [\#27](https://github.com/hub4j/github-api/pull/27) ([johnou](https://github.com/johnou))
- Removed web client and proprietary api usage. [\#26](https://github.com/hub4j/github-api/pull/26) ([johnou](https://github.com/johnou))

## [github-api-1.34](https://github.com/hub4j/github-api/tree/github-api-1.34) (2013-01-06)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.33...github-api-1.34)

**Closed issues:**

- Enterprise Github without HTTPS not supported [\#12](https://github.com/hub4j/github-api/issues/12)

**Merged pull requests:**

- JENKINS-13726: Github plugin should work with Github enterprise by allowing for overriding the github URL. [\#24](https://github.com/hub4j/github-api/pull/24) ([johnou](https://github.com/johnou))
- Retrieve repository directly. [\#22](https://github.com/hub4j/github-api/pull/22) ([janinko](https://github.com/janinko))

## [github-api-1.33](https://github.com/hub4j/github-api/tree/github-api-1.33) (2012-09-13)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.32...github-api-1.33)

**Closed issues:**

- GHIssue.getComments\(\) throws NoSuchElementException when there are no comments [\#20](https://github.com/hub4j/github-api/issues/20)
- scm in pom.xml is incorrect [\#14](https://github.com/hub4j/github-api/issues/14)
- support for: Create an issue POST /repos/:user/:repo/issues [\#13](https://github.com/hub4j/github-api/issues/13)

**Merged pull requests:**

- PagedIterable dosn't use authentication [\#19](https://github.com/hub4j/github-api/pull/19) ([janinko](https://github.com/janinko))
- When using lazy population, this is not deprecated [\#18](https://github.com/hub4j/github-api/pull/18) ([janinko](https://github.com/janinko))

## [github-api-1.32](https://github.com/hub4j/github-api/tree/github-api-1.32) (2012-09-06)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.31...github-api-1.32)

**Merged pull requests:**

- Issues pull requests apiv3 [\#17](https://github.com/hub4j/github-api/pull/17) ([janinko](https://github.com/janinko))

## [github-api-1.31](https://github.com/hub4j/github-api/tree/github-api-1.31) (2012-08-28)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.30...github-api-1.31)

**Merged pull requests:**

- Fixes for github api v3 [\#16](https://github.com/hub4j/github-api/pull/16) ([janinko](https://github.com/janinko))

## [github-api-1.30](https://github.com/hub4j/github-api/tree/github-api-1.30) (2012-08-28)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.29...github-api-1.30)

**Merged pull requests:**

- Using pagination when getting Pull Requests from a repository [\#15](https://github.com/hub4j/github-api/pull/15) ([athieriot](https://github.com/athieriot))

## [github-api-1.29](https://github.com/hub4j/github-api/tree/github-api-1.29) (2012-06-18)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.28...github-api-1.29)

**Closed issues:**

- NPE Crash on 1.27 [\#11](https://github.com/hub4j/github-api/issues/11)
- Github API V2 shuts down [\#8](https://github.com/hub4j/github-api/issues/8)

## [github-api-1.28](https://github.com/hub4j/github-api/tree/github-api-1.28) (2012-06-13)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.27...github-api-1.28)

## [github-api-1.27](https://github.com/hub4j/github-api/tree/github-api-1.27) (2012-06-12)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.26...github-api-1.27)

## [github-api-1.26](https://github.com/hub4j/github-api/tree/github-api-1.26) (2012-06-04)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.25...github-api-1.26)

## [github-api-1.25](https://github.com/hub4j/github-api/tree/github-api-1.25) (2012-05-22)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.24...github-api-1.25)

## [github-api-1.24](https://github.com/hub4j/github-api/tree/github-api-1.24) (2012-05-22)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.23...github-api-1.24)

## [github-api-1.23](https://github.com/hub4j/github-api/tree/github-api-1.23) (2012-04-25)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.22...github-api-1.23)

## [github-api-1.22](https://github.com/hub4j/github-api/tree/github-api-1.22) (2012-04-12)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.21...github-api-1.22)

## [github-api-1.21](https://github.com/hub4j/github-api/tree/github-api-1.21) (2012-04-12)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.20...github-api-1.21)

**Closed issues:**

- Link to Javadoc incorrect at http://github-api.kohsuke.org/ [\#4](https://github.com/hub4j/github-api/issues/4)

## [github-api-1.20](https://github.com/hub4j/github-api/tree/github-api-1.20) (2012-04-11)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.19...github-api-1.20)

## [github-api-1.19](https://github.com/hub4j/github-api/tree/github-api-1.19) (2012-04-06)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.18...github-api-1.19)

**Merged pull requests:**

- Listing of branches in a repository [\#7](https://github.com/hub4j/github-api/pull/7) ([derfred](https://github.com/derfred))

## [github-api-1.18](https://github.com/hub4j/github-api/tree/github-api-1.18) (2012-03-08)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.17...github-api-1.18)

**Merged pull requests:**

- milestone api via v3 [\#6](https://github.com/hub4j/github-api/pull/6) ([YusukeKokubo](https://github.com/YusukeKokubo))

## [github-api-1.17](https://github.com/hub4j/github-api/tree/github-api-1.17) (2012-02-12)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.16...github-api-1.17)

**Closed issues:**

- error on getRepositories\(\) [\#5](https://github.com/hub4j/github-api/issues/5)

## [github-api-1.16](https://github.com/hub4j/github-api/tree/github-api-1.16) (2012-01-03)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.15...github-api-1.16)

**Merged pull requests:**

- Fix for finding private repos on organizations [\#3](https://github.com/hub4j/github-api/pull/3) ([jkrall](https://github.com/jkrall))

## [github-api-1.15](https://github.com/hub4j/github-api/tree/github-api-1.15) (2012-01-01)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.14...github-api-1.15)

## [github-api-1.14](https://github.com/hub4j/github-api/tree/github-api-1.14) (2011-10-27)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.13...github-api-1.14)

## [github-api-1.13](https://github.com/hub4j/github-api/tree/github-api-1.13) (2011-09-15)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.12...github-api-1.13)

**Merged pull requests:**

- expose issue\_updated\_at. It looks like a better representation of update [\#2](https://github.com/hub4j/github-api/pull/2) ([lacostej](https://github.com/lacostej))

## [github-api-1.12](https://github.com/hub4j/github-api/tree/github-api-1.12) (2011-08-27)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.11...github-api-1.12)

## [github-api-1.11](https://github.com/hub4j/github-api/tree/github-api-1.11) (2011-08-27)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.10...github-api-1.11)

## [github-api-1.10](https://github.com/hub4j/github-api/tree/github-api-1.10) (2011-07-11)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.9...github-api-1.10)

**Merged pull requests:**

- Add support for oauth token and a way to see my organizations [\#1](https://github.com/hub4j/github-api/pull/1) ([mocleiri](https://github.com/mocleiri))

## [github-api-1.9](https://github.com/hub4j/github-api/tree/github-api-1.9) (2011-06-28)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.8...github-api-1.9)

## [github-api-1.8](https://github.com/hub4j/github-api/tree/github-api-1.8) (2011-06-17)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.7...github-api-1.8)

## [github-api-1.7](https://github.com/hub4j/github-api/tree/github-api-1.7) (2011-05-28)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.6...github-api-1.7)

## [github-api-1.6](https://github.com/hub4j/github-api/tree/github-api-1.6) (2011-03-16)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.5...github-api-1.6)

## [github-api-1.5](https://github.com/hub4j/github-api/tree/github-api-1.5) (2011-02-23)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.4...github-api-1.5)

## [github-api-1.4](https://github.com/hub4j/github-api/tree/github-api-1.4) (2010-12-14)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.3...github-api-1.4)

## [github-api-1.3](https://github.com/hub4j/github-api/tree/github-api-1.3) (2010-11-24)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.2...github-api-1.3)

## [github-api-1.2](https://github.com/hub4j/github-api/tree/github-api-1.2) (2010-04-19)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.1...github-api-1.2)

## [github-api-1.1](https://github.com/hub4j/github-api/tree/github-api-1.1) (2010-04-19)

[Full Changelog](https://github.com/hub4j/github-api/compare/github-api-1.0...github-api-1.1)

## [github-api-1.0](https://github.com/hub4j/github-api/tree/github-api-1.0) (2010-04-19)

[Full Changelog](https://github.com/hub4j/github-api/compare/ecbfdd7315ef2cf04b2be7f11a072ce0bd00c396...github-api-1.0)

\* *This Changelog was automatically generated by [github_changelog_generator](https://github.com/github-changelog-generator/github-changelog-generator)*
67.183966
208
0.730999
zul_Latn
0.190333
0c296a8b178b3606cb8bba34a32fedf0f81f2b57
2,466
md
Markdown
legal.md
ThomasFollender/MCW-Big-data-and-visualization
1beb58b3ae2419e0e6393054a13505f13302dcc2
[ "MIT" ]
140
2019-05-15T06:57:13.000Z
2021-11-16T22:28:34.000Z
legal.md
shikalaba/MCW-Big-data-analytics-and-visualization
28100cba602ffd58136645005421db738ff21ff1
[ "MIT" ]
34
2019-05-21T14:02:56.000Z
2021-11-18T23:14:02.000Z
legal.md
shikalaba/MCW-Big-data-analytics-and-visualization
28100cba602ffd58136645005421db738ff21ff1
[ "MIT" ]
93
2019-05-22T23:00:59.000Z
2021-11-15T20:10:25.000Z
# Legal notice Information in this document, including URL and other Internet Web site references, is subject to change without notice. Unless otherwise noted, the example companies, organizations, products, domain names, e-mail addresses, logos, people, places, and events depicted herein are fictitious, and no association with any real company, organization, product, domain name, e-mail address, logo, person, place or event is intended or should be inferred. Complying with all applicable copyright laws is the responsibility of the user. Without limiting the rights under copyright, no part of this document may be reproduced, stored in or introduced into a retrieval system, or transmitted in any form or by any means (electronic, mechanical, photocopying, recording, or otherwise), or for any purpose, without the express written permission of Microsoft Corporation. Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property rights covering subject matter in this document. Except as expressly provided in any written license agreement from Microsoft, the furnishing of this document does not give you any license to these patents, trademarks, copyrights, or other intellectual property. The names of manufacturers, products, or URLs are provided for informational purposes only and Microsoft makes no representations and warranties, either expressed, implied, or statutory, regarding these manufacturers or the use of the products with any Microsoft technologies. The inclusion of a manufacturer or product does not imply endorsement of Microsoft of the manufacturer or product. Links may be provided to third party sites. Such sites are not under the control of Microsoft and Microsoft is not responsible for the contents of any linked site or any link contained in a linked site, or any changes or updates to such sites. Microsoft is not responsible for webcasting or any other form of transmission received from any linked site. Microsoft is providing these links to you only as a convenience, and the inclusion of any link does not imply endorsement of Microsoft of the site or the products contained therein. © 2018 Microsoft Corporation. All rights reserved. Microsoft and the trademarks listed at https://www.microsoft.com/en-us/legal/intellectualproperty/Trademarks/Usage/General.aspx are trademarks of the Microsoft group of companies. All other trademarks are property of their respective owners.
224.181818
926
0.821573
eng_Latn
0.999798
0c29a54b66626b51a2e35e83ef65636f7974ebbf
47,232
md
Markdown
README.md
TimothyBJacobs/node-wpapi
fa7fc031456788229f3e5c4a7c59ed3d19da4262
[ "MIT" ]
null
null
null
README.md
TimothyBJacobs/node-wpapi
fa7fc031456788229f3e5c4a7c59ed3d19da4262
[ "MIT" ]
null
null
null
README.md
TimothyBJacobs/node-wpapi
fa7fc031456788229f3e5c4a7c59ed3d19da4262
[ "MIT" ]
null
null
null
A WordPress REST API client for JavaScript ========================================== This library is an isomorphic client for the [WordPress REST API](http://developer.wordpress.org/rest-api), designed to work with WordPress 4.7 or later. If you are using the older [WP REST API plugin](https://github.com/WP-API/WP-API), some commands will not work. [![Gitter](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/wp-api/node-wpapi?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge) [![Build Status](https://travis-ci.org/WP-API/node-wpapi.svg?branch=master)](https://travis-ci.org/WP-API/node-wpapi) **Index**: - [About](#about) - [Installation](#installation) - [Using the Client](#using-the-client) - [Auto-Discovery](#auto-discovery) - [Creating Posts](#creating-posts) - [Updating Posts](#updating-posts) - [Requesting Different Resources](#requesting-different-resources) - [API Query Parameters & Filtering Collections](#api-query-parameters) - [Uploading Media](#uploading-media) - [Custom Routes](#custom-routes) - [Setter Method Naming](#setter-method-naming-for-named-route-components) - [Query Parameters & Filtering](#query-parameters--filtering-custom-routes) - [Mixins](#mixins) - [Embedding Data](#embedding-data) - [Collection Pagination](#collection-pagination) - [Customizing HTTP Request Behavior](#customizing-http-request-behavior) - [Specifying HTTP Headers](#specifying-http-headers) - [Authentication](#authentication) - [API Documentation](#api-documentation) - [Issues](#issues) - [Contributing](#contributing) ## About `node-wpapi` is an isomorphic JavaScript client for the [WordPress REST API](https://developer.wordpress.org/rest-api) that makes it easy for your JavaScript application to request specific resources from a [WordPress](https://wordpress.org) website. It uses a query builder-style syntax to let you craft the request being made to REST API endpoints, then returns the API's response to your application as a JSON object. And don't let the name fool you: with [Webpack](https://webpack.github.io/) or [Browserify](http://browserify.org/), `node-wpapi` works just as well in the browser as it does on the server! This library is maintained by K. Adam White at [Bocoup](https://bocoup.com), with contributions from a [great community](https://github.com/WP-API/node-wpapi/graphs/contributors) of WordPress and JavaScript developers. To get started, `npm install wpapi` or [download the browser build](https://wp-api.github.io/node-wpapi/wpapi.zip) and check out "Installation" and "Using the Client" below. ## Installation `node-wpapi` works both on the server or in the browser. Node.js version 4.0 or higher is required. ### Install with NPM To use the library from Node, install it with [npm](http://npmjs.org): ```bash npm install --save wpapi ``` Then, within your application's script files, `require` the module to gain access to it: ```javascript var WPAPI = require( 'wpapi' ); ``` This library is designed to work in the browser as well, via a build system such as Browserify or Webpack; just install the package and `require( 'wpapi' )` from your application code. ### Download the UMD Bundle Alternatively, you may download a [ZIP archive of the bundled library code](https://wp-api.github.io/node-wpapi/wpapi.zip). These files are UMD modules, which may be included directly on a page using a regular `<script>` tag _or_ required via AMD or CommonJS module systems. 
In the absence of a module system, the UMD modules will export the browser global variable `WPAPI`, which can be used in place of `require( 'wpapi' )` to access the library from your code. ## Using the Client The module is a constructor, so you can create an instance of the API client bound to the endpoint for your WordPress install: ```javascript var WPAPI = require( 'wpapi' ); var wp = new WPAPI({ endpoint: 'http://src.wordpress-develop.dev/wp-json' }); ``` Once an instance is constructed, you can chain off of it to construct a specific request. (Think of it as a query-builder for WordPress!) We support requesting posts using either a callback-style or promise-style syntax: ```javascript // Callbacks wp.posts().get(function( err, data ) { if ( err ) { // handle err } // do something with the returned posts }); // Promises wp.posts().then(function( data ) { // do something with the returned posts }).catch(function( err ) { // handle error }); ``` The `wp` object has endpoint handler methods for every endpoint that ships with the default WordPress REST API plugin. Once you have used the chaining methods to describe a resource, you may call `.create()`, `.get()`, `.update()` or `.delete()` to send the API request to create, read, update or delete content within WordPress. These methods are documented in further detail below. ### Self-signed (Insecure) HTTPS Certificates In a case where you would want to connect to a HTTPS WordPress installation that has a self-signed certificate (insecure), you will need to force a connection by placing the following line before you make any `wp` calls. ```javascript process.env.NODE_TLS_REJECT_UNAUTHORIZED = "0"; ``` ### Auto-Discovery It is also possible to leverage the [capability discovery](https://developer.wordpress.org/rest-api/using-the-rest-api/discovery/) features of the API to automatically detect and add setter methods for your custom routes, or routes added by plugins. To utilize the auto-discovery functionality, call `WPAPI.discover()` with a URL within a WordPress REST API-enabled site: ```js var apiPromise = WPAPI.discover( 'http://my-site.com' ); ``` If auto-discovery succeeds this method returns a promise that will be resolved with a WPAPI client instance object configured specifically for your site. You can use that promise as the queue that your client instance is ready, then use the client normally within the `.then` callback. **Custom Routes** will be detected by this process, and registered on the client. To prevent name conflicts, only routes in the `wp/v2` namespace will be bound to your instance object itself. The rest can be accessed through the `.namespace` method on the WPAPI instance, as demonstrated below. ```js apiPromise.then(function( site ) { // If default routes were detected, they are now available site.posts().then(function( posts ) { console.log( posts ); }); // etc // If custom routes were detected, they can be accessed via .namespace() site.namespace( 'myplugin/v1' ).authors() .then(function( authors ) { /* ... */ }); // Namespaces can be saved out to variables: var myplugin = site.namespace( 'myplugin/v1' ); myplugin.authors() .id( 7 ) .then(function( author ) { /* ... */ }); }); ``` #### Authenticating with Auto-Discovery While using `WPAPI.discover( url )` to generate the handler for your site gets you up and running quickly, it does not provide the same level of customization as instantiating your own `new WPAPI` object. 
In order to specify authentication configuration when using autodiscovery, chain a `.then` onto the initial discovery query to call the `.auth` method on the returned site object with the relevant credentials (username & password, nonce, etc): ```js var apiPromise = WPAPI.discover( 'http://my-site.com' ).then(function( site ) { return site.auth({ username: 'admin', password: 'always use secure passwords' }); }); apiPromise.then(function( site ) { // site is now configured to use authentication }) ``` #### Cross-Origin Auto-Discovery When attempting auto-discovery against a remote server in a client-side environment, discovery will fail unless the server is configured for [Cross-Origin Resource Sharing](https://developer.mozilla.org/en-US/docs/Web/HTTP/Access_control_CORS) (CORS). CORS can be enabled by specifying a set of `Access-Control-` headers in your PHP code to instruct browsers that requests from remote clients are accepted; these headers also let you control what specific methods and links are exposed to those remote clients. The [WP-REST-Allow-All-Cors](https://github.com/ahmadawais/WP-REST-Allow-All-CORS) plugin will permit CORS requests for all API resources. Auto-discovery will still fail when using this plugin, however, because discovery depends on the presence of a `Link` header on WordPress pages outside of the root REST API endpoint. To permit your site to be auto-discovered by client-side REST API clients, add a filter to `send_headers` to explicitly whitelist the `Link` header for `HEAD` requests: ```php add_action( 'send_headers', function() { if ( ! did_action('rest_api_init') && $_SERVER['REQUEST_METHOD'] == 'HEAD' ) { header( 'Access-Control-Allow-Origin: *' ); header( 'Access-Control-Expose-Headers: Link' ); header( 'Access-Control-Allow-Methods: HEAD' ); } } ); ``` Enable CORS at your own discretion. Restricting `Access-Control-Allow-Origin` to a specific origin domain is often preferable to allowing all origins via `*`. ### Bootstrapping If you are building an application designed to interface with a specific site, it is possible to sidestep the additional asynchronous HTTP calls that are needed to bootstrap the client through auto-discovery. You can download the root API response, *i.e.* the JSON response when you hit the root endpoint such as `your-site.com/wp-json`, and save that JSON file locally; then, in your application code, just require in that JSON file and pass the routes property into the `WPAPI` constructor or the `WPAPI.site` method. Note that you must specify the endpoint URL as normal when using this approach. ```js var apiRootJSON = require( './my-endpoint-response.json' ); var site = new WPAPI({ endpoint: 'http://my-site.com/wp-json', routes: apiRootJSON.routes }); // site is now ready to be used with all methods defined in the // my-endpoint-response.json file, with no need to wait for a Promise. site.namespace( 'myplugin/v1' ).authors()... ``` To create a slimmed JSON file dedicated to this particular purpose, see the Node script [lib/data/update-default-routes-json.js](https://github.com/wp-api/node-wpapi/tree/master/lib/data/update-default-routes-json.js), which will let you download and save an endpoint response to your local project. 
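If you would rather capture that root response with a quick one-off script than use the bundled helper, a minimal sketch along these lines should work (the endpoint URL and output filename are placeholders, and only Node's built-in `http` and `fs` modules are assumed):

```js
// Sketch: download the root API response once and save it for later bootstrapping.
// 'http://my-site.com/wp-json' and 'my-endpoint-response.json' are example values.
var http = require( 'http' );
var fs = require( 'fs' );

http.get( 'http://my-site.com/wp-json', function( res ) {
  var body = '';
  res.setEncoding( 'utf8' );
  res.on( 'data', function( chunk ) { body += chunk; });
  res.on( 'end', function() {
    // Persist the JSON so it can be require()'d when constructing the client
    fs.writeFileSync( 'my-endpoint-response.json', body );
  });
});
```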
In addition to retrieving the specified resource with `.get()`, you can also `.create()`, `.update()` and `.delete()` resources: ### Creating Posts To create posts, use the `.create()` method on a query to POST (the HTTP verb for "create") a data object to the server: ```js // You must authenticate to be able to POST (create) a post var wp = new WPAPI({ endpoint: 'http://your-site.com/wp-json', // This assumes you are using basic auth, as described further below username: 'someusername', password: 'password' }); wp.posts().create({ // "title" and "content" are the only required properties title: 'Your Post Title', content: 'Your post content', // Post will be created as a draft by default if a specific "status" // is not specified status: 'publish' }).then(function( response ) { // "response" will hold all properties of your newly-created post, // including the unique `id` the post was assigned on creation console.log( response.id ); }) ``` This will work in the same manner for resources other than `post`: you can see the list of required data parameters for each resource on the [REST API Developer Handbook](https://developer.wordpress.org/rest-api/reference/). ### Updating Posts To update posts, use the `.update()` method on a single-item query to PUT (the HTTP verb for "update") a data object to the server: ```js // You must authenticate to be able to PUT (update) a post var wp = new WPAPI({ endpoint: 'http://your-site.com/wp-json', // This assumes you are using basic auth, as described further below username: 'someusername', password: 'password' }); // .id() must be used to specify the post we are updating wp.posts().id( 2501 ).update({ // Update the title title: 'A Better Title', // Set the post live (assuming it was "draft" before) status: 'publish' }).then(function( response ) { console.log( response ); }) ``` This will work in the same manner for resources other than `post`: you can see the list of required data parameters for each resource in the [REST API Developer Handbook](https://developer.wordpress.org/rest-api/reference/). ### Requesting Different Resources A WPAPI instance object provides the following basic request methods: * `wp.posts()...`: Request items from the `/posts` endpoints * `wp.pages()...`: Start a request for the `/pages` endpoints * `wp.types()...`: Get Post Type collections and objects from the `/types` endpoints * `wp.comments()...`: Start a request for the `/comments` endpoints * `wp.taxonomies()...`: Generate a request against the `/taxonomies` endpoints * `wp.tags()...`: Get or create tags with the `/tags` endpoint * `wp.categories()...`: Get or create categories with the `/categories` endpoint * `wp.statuses()...`: Get resources within the `/statuses` endpoints * `wp.users()...`: Get resources within the `/users` endpoints * `wp.media()...`: Get Media collections and objects from the `/media` endpoints * `wp.settings()...`: Read or update site settings from the `/settings` endpoint (always requires authentication) All of these methods return a customizable request object. The request object can be further refined with chaining methods, and/or sent to the server via `.get()`, `.create()`, `.update()`, `.delete()`, `.headers()`, or `.then()`. (Not all endpoints support all methods; for example, you cannot POST or PUT records on `/types`, as these are defined in WordPress plugin or theme code.)
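Deletion follows the same pattern. As a hedged sketch (the post ID below is only an example), an authenticated client can remove a record with `.delete()`:

```js
// Sketch: move a post to the trash; the request must be authenticated.
wp.posts().id( 2501 ).delete().then(function( response ) {
  // response contains the trashed post record
  console.log( response );
}).catch(function( err ) {
  // Handle permission or authentication errors here
  console.error( err );
});
```

If the endpoint supports it, chaining `.param( 'force', true )` before `.delete()` requests permanent deletion instead of trashing.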
Additional querying methods provided, by endpoint: * **posts** - `wp.posts()`: get a collection of posts (default query) - `wp.posts().id( n )`: get the post with ID *n* - `wp.posts().id( n ).revisions()`: get a collection of revisions for the post with ID *n* - `wp.posts().id( n ).revisions( rn )`: get revision *rn* for the post with ID *n* * **pages** - `wp.pages()`: get a collection of page items - `wp.pages().id( n )`: get the page with numeric ID *n* - `wp.pages().path( 'path/str' )`: get the page with the root-relative URL path `path/str` - `wp.pages().id( n ).revisions()`: get a collection of revisions for the page with ID *n* - `wp.pages().id( n ).revisions( rn )`: get revision *rn* for the page with ID *n* * **comments** - `wp.comments()`: get a collection of all public comments - `wp.comments().id( n )`: get the comment with ID *n* * **taxonomies** - `wp.taxonomies()`: retrieve all registered taxonomies - `wp.taxonomies().taxonomy( 'taxonomy_name' )`: get a specific taxonomy object with name *taxonomy_name* * **categories** - `wp.categories()`: retrieve all registered categories - `wp.categories().id( n )`: get a specific category object with id *n* * **tags** - `wp.tags()`: retrieve all registered tags - `wp.tags().id( n )`: get a specific tag object with id *n* * **custom taxonomy terms** - [Use `registerRoute()`](http://wp-api.org/node-wpapi/custom-routes/) or [route auto-discovery](http://wp-api.org/node-wpapi/using-the-client/#auto-discovery) to query for custom taxonomy terms * **types** - `wp.types()`: get a collection of all registered public post types - `wp.types().type( 'cpt_name' )`: get the object for the custom post type with the name *cpt_name* * **statuses** - `wp.statuses()`: get a collection of all registered public post statuses (if the query is authenticated&mdash;will just display "published" if unauthenticated) - `wp.statuses().status( 'slug' )`: get the object for the status with the slug *slug* * **users** - `wp.users()`: get a collection of users (will show only users with published content if request is not authenticated) - `wp.users().id( n )`: get the user with ID *n* (does not require authentication if that user is a published author within the blog) - `wp.users().me()`: get the authenticated user's record * **media** - `wp.media()`: get a collection of media objects (attachments) - `wp.media().id( n )`: get media object with ID *n* * **settings** - `wp.settings()`: get or update one or many site settings For security reasons, methods like `.revisions()` and `.settings()` require the request to be authenticated, and others such as `.users()` and `.posts()` will return only a subset of their information without authentication. #### toString() To get the URI of the resource _without_ making a request, call `.toString()` at the end of a query chain: ```js var uriString = wp.posts().id( 7 ).embed().toString(); ``` As the name implies `.toString()` is not a chaining method, and will return a string containing the full URI; this can then be used with alternative HTTP transports like `request`, Node's native `http`, `fetch`, or jQuery. ### API Query Parameters To set a query parameter on a request, use the `.param()` method: ```js // All posts by author w/ ID "7" published before Sept 22, 2016 wp.posts() .param( 'before', new Date( '2016-09-22' ) ) .param( 'author', 7 )... ``` You can continue to chain properties until you call `.then`, `.get`, `.create`, `.update`, or `.delete` on the request chain. 
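Putting those pieces together, a complete request might look like this sketch (the author ID, date, and page size are arbitrary examples):

```js
// Sketch: combine several query parameters, then dispatch the request with .get()
wp.posts()
  .param( 'before', new Date( '2016-09-22' ) )
  .param( 'author', 7 )
  .param( 'per_page', 5 )
  .get()
  .then(function( posts ) {
    // posts is an array of up to 5 matching post objects
    console.log( posts.length );
  });
```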
**Parameter Shortcut Methods** This library provides convenience methods for many of the most common parameters, like `search=` (search for a string in post title or content), `slug` (query for a post by slug), and `before` and `after` (find posts in a given date range): ```js // Find a page with a specific slug wp.pages().slug( 'about' )... // Find a post authored by the user with ID #42 wp.posts().author( 42 )... // Find trashed posts wp.posts().status( 'trash' )... // Find posts in status "future" or "draft" wp.posts().status([ 'draft', 'future' ])... // Find all categories containing the word "news" wp.categories().search( 'news' )... // Find posts from March 2013 (provide a Date object or full ISO-8601 date): wp.posts().before( '2013-04-01T00:00:00.000Z' ).after( new Date( 'March 01, 2013' ) )... // Return ONLY sticky posts wp.posts().sticky( true )... // Return NO sticky posts wp.posts().sticky( false )... // Supply the password for a password-protected post wp.posts().id( 2501 ).password( 'correct horse battery staple' )... ``` #### Paging & Sorting Convenience methods are also available to set paging & sorting properties like `page`, `per_page` (available as `.perPage()`), `offset`, `order` and `orderby`: ```js // perPage() sets the maximum number of posts to return. 20 latest posts: wp.posts().perPage( 20 )... // 21st through 40th latest posts (*i.e.* the second page of results): wp.posts().perPage( 20 ).page( 2 )... // Order posts alphabetically by title: wp.posts().order( 'asc' ).orderby( 'title' )... ``` See the section on collection pagination for more information. #### Filtering by Taxonomy Terms A variety of other methods are available to further modify which posts are returned from the API. For example, to restrict the returned posts to only those in category 7, pass that ID to the `.categories()` method: ```js wp.posts().categories( 7 )... ``` **Relationships in the REST API are always specified by ID.** The slug of a term may change, but the term ID associated with the underlying post will not. To find the ID of a tag or category for which the slug is known, you can query the associated collection with `.slug()` and use the ID of the returned object in a two-step process: ```js wp.categories().slug( 'fiction' ) .then(function( cats ) { // .slug() queries will always return as an array var fictionCat = cats[0]; return wp.posts().categories( fictionCat.id ); }) .then(function( postsInFiction ) { // These posts are all categorized "fiction": console.log( postsInFiction ); }); ``` To find posts in category 'fiction' and tagged either 'magical-realism' or 'historical', this process can be extended: note that this example uses the [`RSVP.hash` utility](https://github.com/tildeio/rsvp.js/#hash-of-promises) for convenience and parallelism, but the same result could easily be accomplished with `Promise.all` or by chaining each request. 
```js RSVP.hash({ categories: wp.categories().slug( 'fiction' ), tags1: wp.tags().slug('magical-realism'), tags2: wp.tags().slug('historical') }).then(function( results ) { // Combine & map .slug() results into arrays of IDs by taxonomy var tagIDs = results.tags1.concat( results.tags2 ) .map(function( tag ) { return tag.id; }); var categoryIDs = results.categories .map(function( cat ) { return cat.id; }); return wp.posts() .tags( tagIDs ) .categories( categoryIDs ); }).then(function( posts ) { // These posts are all fiction, either magical realism or historical: console.log( posts ); }); ``` This process may seem cumbersome, but it provides a more broadly reliable method of querying than querying by mutable slugs. The first requests may also be avoided entirely by pre-creating and storing a dictionary of term slugs and their associated IDs in your application; however, be aware that this dictionary must be updated whenever slugs change. It is also possible to add your own slug-oriented query parameters to a site that you control by creating a plugin that registers additional collection parameter arguments. **Excluding terms** Just as `.categories()` and `.tags()` can be used to return posts that are associated with one or more taxonomies, two methods exist to exclude posts by their term associations. - `.excludeCategories()` is a shortcut for `.param( 'categories_exclude', ... )` which excludes results associated with the provided category term IDs - `.excludeTags()` is a shortcut for `.param( 'tags_exclude', ... )` which excludes results associated with the provided tag term IDs **Custom Taxonomies** Just as the `?categories` and `?categories_exclude` parameters are available for use with the built-in taxonomies, any custom taxonomy that is registered with a `rest_base` argument has a `?{taxonomy rest_base}` and `?{taxonomy rest_base}_exclude` parameter available, which can be set directly using `.param`. For the custom taxonomy `genres`, for example: - `wp.posts().param( 'genres', [ array of genre term IDs ])`: return only records associated with any of the provided genres - `wp.posts().param( 'genres_exclude', [ array of genre term IDs ])`: return only records associated with none of the provided genres #### Retrieving posts by author The `.author()` method also exists to query for posts authored by a specific user (specified by ID). ```js // equivalent to .param( 'author', 42 ): wp.posts().author( 42 ).get(); // last value wins: this queries for author == 71 wp.posts().author( 42 ).author( 71 ).get(); ``` As with categories and tags, the `/users` endpoint may be queried by slug to retrieve the ID to use in this query, if needed. ### Password-Protected posts The `.password()` method (not to be confused with the password property of `.auth()`!) sets the password to use to view a password-protected post. Any post for which the content is protected will have `protected: true` set on its `content` and `excerpt` properties; `content.rendered` and `excerpt.rendered` will both be `''` until the password is provided by query string.
```js wp.posts().id( idOfProtectedPost ) .then(function( result ) { console.log( result.content.protected ); // true console.log( result.content.rendered ); // "" }); wp.posts().id( idOfProtectedPost ) // Provide the password string with the request .password( 'thepasswordstring' ) .then(function( result ) { console.log( result.content.rendered ); // "The post content" }); ``` #### Other Filters The `?filter` query parameter is not natively supported within the WordPress core REST API endpoints, but can be added to your site using the [rest-filter plugin](https://github.com/wp-api/rest-filter). `filter` is a special query parameter that lets you directly specify many WP_Query arguments, including `tag`, `author_name`, and other [public query vars](https://codex.wordpress.org/WordPress_Query_Vars). Even more parameters are available for use with `filter` once you [authenticate with the API](https://developer.wordpress.org/rest-api/using-the-rest-api/authentication/). If your environment supports this parameter, other filtering methods will be available if you initialize your site [using auto-discovery](http://wp-api.org/node-wpapi/using-the-client/#auto-discovery), which will auto-detect the availability of `filter`: ```js WPAPI.discover( 'http://mysite.com' ) .then(function( site ) { // Apply an arbitrary `filter` query parameter: // All posts belonging to author with nicename "jadenbeirne" wp.posts().filter( 'author_name', 'jadenbeirne' ).get(); // Query by the slug of a category or tag // Get all posts in category "islands" and tags "clouds" & "sunset" // (filter can either accept two parameters, as above where it's called with // a key and a value, or an object of parameter keys and values, as below) wp.posts().filter({ category_name: 'islands', tag: [ 'clouds', 'sunset' ] })... // Query for a page at a specific URL path wp.pages().filter( 'pagename', 'some/url/path' )... }); ``` **Date Filter Methods** `?before` and `?after` provide first-party support for querying by date, but should you have access to `filter` then three additional date query methods are available to return posts from a specific month, day or year: * `.year( year )`: find items published in the specified year * `.month( month )`: find items published in the specified month, designated by the month index (1&ndash;12) or name (*e.g.* "February") * `.day( day )`: find items published on the specified day ### Uploading Media Files may be uploaded to the WordPress media library by creating a media record using the `.media()` collection handler. The file to upload can be specified as - a `String` describing an image file path, _e.g._ `'/path/to/the/image.jpg'` - a `Buffer` with file content, _e.g._ `new Buffer()` - a file object from an `<input>` element, _e.g._ `document.getElementById( 'file-input' ).files[0]` The file is passed into the `.file()` method: ```js wp.media().file(content [, name])... ``` The optional second string argument specifies the file name to use for the uploaded media. If the name argument is omitted `file()` will try to infer a filename from the provided content. #### Adding Media to a Post If you wish to associate a newly-uploaded media record to a specific post, you must use two calls: one to first upload the file, then another to associate it with a post.
Example code: ```js wp.media() // Specify a path to the file you want to upload, or a Buffer .file( '/path/to/the/image.jpg' ) .create({ title: 'My awesome image', alt_text: 'an image of something awesome', caption: 'This is the caption text', description: 'More explanatory information' }) .then(function( response ) { // Your media is now uploaded: let's associate it with a post var newImageId = response.id; return wp.media().id( newImageId ).update({ post: associatedPostId }); }) .then(function( response ) { console.log( 'Media ID #' + response.id ); console.log( 'is now associated with Post ID #' + response.post ); }); ``` If you are uploading media from the client side, you can pass a reference to a file input's file list entry in place of the file path: ```js wp.media() .file( document.getElementById( 'file-input' ).files[0] ) .create()... ``` ## Custom Routes Support for Custom Post Types is provided via the `.registerRoute` method. This method returns a handler function which can be assigned to your site instance as a method, and takes the [same namespace and route string arguments as `rest_register_route`](https://developer.wordpress.org/rest-api/extending-the-rest-api/adding-custom-endpoints/): ```js var site = new WPAPI({ endpoint: 'http://www.yoursite.com/wp-json' }); site.myCustomResource = site.registerRoute( 'myplugin/v1', '/author/(?P<id>)' ); site.myCustomResource().id( 17 ); // => myplugin/v1/author/17 ``` The string `(?P<id>)` indicates that a level of the route for this resource is a dynamic property named ID. By default, properties identified in this fashion will not have any inherent validation. This is designed to give developers the flexibility to pass in anything, with the caveat that only valid IDs will be accepted on the WordPress end. You might notice that in the example from the official WP-API documentation, a pattern is specified with a different format: this is a [regular expression](http://www.regular-expressions.info/tutorial.html) designed to validate the values that may be used for this capture group. ```js var site = new WPAPI({ endpoint: 'http://www.yoursite.com/wp-json' }); site.myCustomResource = site.registerRoute( 'myplugin/v1', '/author/(?P<id>\\d+)' ); site.myCustomResource().id( 7 ); // => myplugin/v1/author/7 site.myCustomResource().id( 'foo' ); // => Error: Invalid path component: foo does not match (?P<a>\d+) ``` Adding the regular expression pattern (as a string) enabled validation for this component. In this case, the `\\d+` will cause only _numeric_ values to be accepted. **NOTE THE DOUBLE-SLASHES** in the route definition here, however: ``` '/author/(?P<id>\\d+)' ``` This is a JavaScript string, where `\` _must_ be written as `\\` to be parsed properly. A single backslash will break the route's validation. Each named group in the route will be converted into a named setter method on the route handler, as in `.id()` in the example above: that name is taken from the `<id>` in the route string. The route string `'pages/(?P<parentPage>[\\d]+)/revisions/(?P<id>[\\d]+)'` would create the setters `.parentPage()` and `id()`, permitting any permutation of the provided URL to be created. 
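To illustrate, a handler registered with that two-component route string could be used as in the following sketch (the namespace and the numeric values are hypothetical):

```js
// Sketch: a route with two named capture groups yields two setter methods
site.pageRevisions = site.registerRoute(
  'myplugin/v1',
  'pages/(?P<parentPage>[\\d]+)/revisions/(?P<id>[\\d]+)'
);

// Both setters provided:
site.pageRevisions().parentPage( 4 ).id( 9 ).toString();
// => <endpoint>/myplugin/v1/pages/4/revisions/9

// Omitting .id() targets the revisions collection for that page:
site.pageRevisions().parentPage( 4 ).toString();
// => <endpoint>/myplugin/v1/pages/4/revisions
```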
### Setter method naming for named route components In the example above, registering the route string `'/author/(?P<id>\\d+)'` results in the creation of an `.id()` method on the resulting resource handler: ```js site.myCustomResource().id( 7 ); // => myplugin/v1/author/7 ``` If a named route component (_e.g._ the "id" part in `(?P<id>\\d+)`, above) is in `snake_case`, then that setter will be converted to camelCase instead, as with `some_part` below: ```js site.myCustomResource = site.registerRoute( 'myplugin/v1', '/resource/(?P<some_part>\\d+)' ); site.myCustomResource().somePart( 7 ); // => myplugin/v1/resource/7 ``` Non-snake_cased route parameter names will be unaffected. ### Query Parameters & Filtering Custom Routes Many of the filtering methods available on the built-in collections are built in to custom-registered handlers, including `.page()`, `.perPage()`, `.search()`, `.include()`/`.exclude()` and `.slug()`; these parameters are supported across nearly all API endpoints, so they are made available automatically to custom endpoints as well. However not _every_ filtering method is available by default, so for convenience a configuration object may be passed to the `registerRoute` method with a `params` property specifying additional query parameters to support. This makes it very easy to add existing methods like `.before()` or `.after()` to your own endpoints: ```js site.handler = site.registerRoute( 'myplugin/v1', 'collection/(?P<id>)', { // Listing any of these parameters will assign the built-in // chaining method that handles the parameter: params: [ 'before', 'after', 'author', 'parent', 'post' ] }); // yields site.handler().post( 8 ).author( 92 ).before( dateObj )... ``` If you wish to set custom parameters, for example to query by the custom taxonomy `genre`, you can use the `.param()` method as usual: ```js site.handler().param( 'genre', genreTermId ); ``` but you can also specify additional query parameter names and a `.param()` wrapper function will be added automatically. _e.g._ here `.genre( x )` will be created as a shortcut for `.param( 'genre', x )`: ```js site.books = site.registerRoute( 'myplugin/v1', 'books/(?P<id>)', { params: [ 'genre' ] }); // yields site.books().genre([ genreId1, genreId2 ])... ``` ### Mixins To assign completely arbitrary custom methods for use with your custom endpoints, a configuration object may be passed to the `registerRoute` method with a `mixins` property defining any functions to add: ```js site.handler = site.registerRoute( 'myplugin/v1', 'collection/(?P<id>)', { mixins: { myParam: function( val ) { return this.param( 'my_param', val ); } } }); ``` This permits a developer to extend an endpoint with arbitrary parameters in the same manner as is done for the automatically-generated built-in route handlers. Note that mixins should always return `this` to support method chaining. ## Embedding Data Data types in WordPress are interrelated: A post has an author, some number of tags, some number of categories, *etc*. By default, the API responses will provide pointers to these related objects, but will not embed the full resources: so, for example, the `"author"` property would come back as just the author's ID, *e.g.* `"author": 4`. This functionality provides API consumers the flexibility to determine when and how they retrieve the related data. However, there are also times where an API consumer will want to get the most data in the fewest number of responses. 
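As a sketch of that default, non-embedded workflow (the post ID is hypothetical), the related author record has to be requested separately:

```js
// Sketch: without embedding, post.author is just a numeric user ID,
// so retrieving the author's details requires a second request.
wp.posts().id( 2501 ).then(function( post ) {
  return wp.users().id( post.author );
}).then(function( author ) {
  console.log( author.name );
});
```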
Certain resources (author, comments, tags, and categories, to name a few) support *embedding*, meaning that they can be included in the response if the `_embed` query parameter is set. To request that the API respond with embedded data, simply call `.embed()` as part of the request chain: `wp.posts().id( 2501 ).embed()`... This will include an `._embedded` object in the response JSON, which contains all of those embeddable objects: ```js { "_embedded": { "author": [ /* ... */ ], "replies": [ /* ... */ ], "wp:attachment": [ /* ... */ ], "wp:term": [ [ {}, {} /* category terms */ ], [ {} /* tag terms */ ], /* etc... */ ], "wp:meta": [ /* ... */ ] } } ``` For more on working with embedded data, [check out the WP-API documentation](https://developer.wordpress.org/rest-api/using-the-rest-api/linking-and-embedding/). ## Collection Pagination WordPress sites can have a lot of content&mdash;far more than you'd want to pull down in a single request. The API endpoints default to providing a limited number of items per request, the same way that a WordPress site will default to 10 posts per page in archive views. The number of objects you can get back can be adjusted by calling the `perPage` method, but `perPage` is capped at 100 items per request for performance reasons. To work around these restrictions, the API provides pagination headers with every collection response, since it will frequently be unable to fit all of your posts in a single query. ### Using Pagination Headers Paginated collection responses are augmented with a `_paging` property derived from the collection's pagination headers. That `_paging` property on the response object contains some useful metadata: - `.total`: The total number of records matching the provided query - `.totalPages`: The number of pages available (`total` / `perPage`) - `.next`: A WPRequest object pre-bound to the next page of results - `.prev`: A WPRequest object pre-bound to the previous page of results - `.links`: an object containing the parsed `link` HTTP header data (when present) The existence of the `_paging.links.prev` and `_paging.links.next` properties can be used as flags to conditionally show or hide your paging UI, if necessary, as they will only be present when an adjacent page of results is available. You can use the `next` and `prev` properties to traverse an entire collection, should you so choose. For example, this snippet will recursively request the next page of posts and concatenate it with existing results, in order to build up an array of every post on your site: ```javascript var _ = require( 'lodash' ); function getAll( request ) { return request.then(function( response ) { if ( ! response._paging || ! response._paging.next ) { return response; } // Request the next page and return both responses as one collection return Promise.all([ response, getAll( response._paging.next ) ]).then(function( responses ) { return _.flatten( responses ); }); }); } // Kick off the request getAll( wp.posts() ).then(function( allPosts ) { /* ... */ }); ``` Be aware that this sort of unbounded recursion can take a **very long time**: if you use this technique in your application, we strongly recommend caching the response objects in a local database rather than re-requesting from the WP remote every time you need them. Depending on the amount of content in your site, loading all posts into memory may also exceed Node's available memory, causing an exception. If this occurs, try to work with smaller subsets of your data at a time.
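If you only need a page or two rather than the entire collection, a lighter-weight sketch using the `_paging` metadata directly might look like this (nothing here is specific to any particular site):

```js
// Sketch: inspect pagination metadata, then request just the following page.
wp.posts().perPage( 10 ).then(function( firstPage ) {
  console.log( 'Total posts: ' + firstPage._paging.total );
  console.log( 'Total pages: ' + firstPage._paging.totalPages );

  if ( firstPage._paging.next ) {
    // _paging.next is a WPRequest pre-bound to the next page of results
    return firstPage._paging.next.then(function( secondPage ) {
      console.log( 'Second page holds ' + secondPage.length + ' posts' );
    });
  }
});
```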
### Requesting a Specific Page You can also use a `.page(pagenumber)` method on calls that support pagination to directly get that page. For example, to set the API to return 5 posts on every page of results, and to get the third page of results (posts 11 through 15), you would write ```js wp.posts().perPage( 5 ).page( 3 ).then(/* ... */); ``` ### Using `offset` If you prefer to think about your collections in terms of _offset_, or how many items "into" the collection you want to query, you can use the `offset` parameter (and parameter convenience method) instead of `page`. These are equivalent: ```js // With .page() wp.posts().perPage( 5 ).page( 3 )... // With .offset() wp.posts().perPage( 5 ).offset( 10 )... ``` ## Customizing HTTP Request Behavior By default `node-wpapi` uses the [superagent](https://www.npmjs.com/package/superagent) library internally to make HTTP requests against the API endpoints. Superagent is a flexible tool that works on both the client and the browser, but you may want to use a different HTTP library, or to get data from a cache when available instead of making an HTTP request. To facilitate this, `node-wpapi` lets you supply a `transport` object when instantiating a site client to specify custom functions to use for one (or all) of GET, POST, PUT, DELETE & HEAD requests. **This is advanced behavior; you will only need to utilize this functionality if your application has very specific HTTP handling or caching requirements.** In order to maintain consistency with the rest of the API, custom transport methods should take in a WordPress API route handler query object (_e.g._ the result of calling `wp.posts()...` or any of the other chaining resource handlers), a `data` object (for POST, PUT and DELETE requests), and an optional callback function (as `node-wpapi` transport methods both return Promise objects _and_ support traditional `function( err, response )` callbacks). The default HTTP transport methods are available as `WPAPI.transport` (a property of the constructor object) and may be called within your transports if you wish to extend the existing behavior, as in the example below. **Example:** Cache requests in a simple dictionary object, keyed by request URI. If a request's response is already available, serve from the cache; if not, use the default GET transport method to retrieve the data, save it in the cache, and return it to the consumer: ```js var site = new WPAPI({ endpoint: 'http://my-site.com/wp-json', transport: { // Only override the transport for the GET method, in this example // Transport methods should take a wpreq object and a callback: get: function( wpreq, cb ) { var result = cache[ wpreq ]; // If a cache hit is found, return it via the same callback/promise // signature as the default transport method: if ( result ) { if ( cb && typeof cb === 'function' ) { // Invoke the callback function, if one was provided cb( null, result ); } // Return the data as a promise return Promise.resolve( result ); } // Delegate to default transport if no cached data was found return WPAPI.transport.get( wpreq, cb ).then(function( result ) { cache[ wpreq ] = result; return result; }); } } }); ``` You may set one or many custom HTTP transport methods on an existing WP site client instance (for example one returned through [auto-discovery](#auto-discovery) by calling the `.transport()` method on the site client instance and passing an object of handler functions: ```js site.transport({ get: function( wpreq, callbackFn ) { /* ... 
*/}, put: function( wpreq, callbackFn ) { /* ... */} }); ``` Note that these transport methods are the internal methods used by `create` and `.update`, so the names of these methods therefore map to the HTTP verbs "get", "post", "put", "head" and "delete"; name your transport methods accordingly or they will not be used. ### Specifying HTTP Headers If you need to send additional HTTP headers along with your request (for example to provide a specific `Authorization` header for use with alternative authentication schemes), you can use the `.setHeaders()` method to specify one or more headers to send with the dispatched request: #### Set headers for a single request ```js // Specify a single header to send with the outgoing request wp.posts().setHeaders( 'Authorization', 'Bearer xxxxx.yyyyy.zzzzz' )... // Specify multiple headers to send with the outgoing request wp.posts().setHeaders({ Authorization: 'Bearer xxxxx.yyyyy.zzzzz', 'Accept-Language': 'pt-BR' })... ``` #### Set headers globally You can also set headers globally on the WPAPI instance itself, which will then be used for all subsequent requests created from that site instance: ```js // Specify a header to be used by all subsequent requests wp.setHeaders( 'Authorization', 'Bearer xxxxx.yyyyy.zzzzz' ); // These will now be sent with an Authorization header wp.users().me()... wp.posts().id( unpublishedPostId )... ``` ## Authentication You must be authenticated with WordPress to create, edit or delete resources via the API. Some WP-API endpoints additionally require authentication for GET requests in cases where the data being requested could be considered private: examples include any of the `/users` endpoints, requests where the `context` query parameter is `true`, and `/revisions` for posts and pages, among others. ### Basic Authentication This library currently supports [basic HTTP authentication](http://en.wikipedia.org/wiki/Basic_access_authentication). To authenticate with your WordPress install, 1. Download and install the [Basic Authentication handler plugin](https://github.com/WP-API/Basic-Auth) on your target WordPress site. *(Note that the basic auth handler is not currently available through the plugin repository: you must install it manually.)* 2. Activate the plugin. 3. Specify the username and password of an authorized user (a user with the `edit_posts` capability) when instantiating the WPAPI request object: ```javascript var wp = new WPAPI({ endpoint: 'http://www.website.com/wp-json', username: 'someusername', password: 'thepasswordforthatuser' }); ``` Now any requests generated from this WPAPI instance will use that username and password for basic authentication if the targeted endpoint requires it. As an example, `wp.users().me()` will automatically enable authentication to permit access to the `/users/me` endpoint. (If a username and password had not been provided, a 401 error would have been returned.) ### Manually forcing authentication Because authentication may not always be set when needed, an `.auth()` method is provided which can enable authentication for any request chain: ```javascript // This will authenticate the GET to /posts/id/817 wp.posts().id( 817 ).auth().get(... ``` This `.auth` method can also be used to manually specify a username and a password as part of a request chain: ```javascript // Use username "mcurie" and password "nobel" for this request wp.posts().id( 817 ).auth( {username: 'mcurie', password: 'nobel'} ).get(... ``` This will override any previously-set username or password values.
**Authenticate all requests for a WPAPI instance** It is possible to make all requests from a WPAPI instance use authentication by setting the `auth` option to `true` on instantiation: ```javascript var wp = new WPAPI({ endpoint: // ... username: // ... password: // ... auth: true }); ``` #### SECURITY WARNING Please be aware that basic authentication sends your username and password over the wire, in plain text. **We only recommend using basic authentication in production if you are securing your requests with SSL.** More robust authentication methods will hopefully be added; we would welcome contributions in this area! ### Cookie Authentication When the library is loaded from the frontend of the WordPress site you are querying against, you may authenticate your REST API requests using the built in WordPress [Cookie authentication](https://developer.wordpress.org/rest-api/using-the-rest-api/authentication/#cookie-authentication) by creating and passing a Nonce with your API requests. First localize your scripts with an object with root-url and nonce in your theme's `functions.php` or your plugin: ```php function my_enqueue_scripts() { wp_enqueue_script( 'app', get_template_directory_uri() . '/assets/dist/bundle.js', array(), false, true ); wp_localize_script( 'app', 'WP_API_Settings', array( 'endpoint' => esc_url_raw( rest_url() ), 'nonce' => wp_create_nonce( 'wp_rest' ) ) ); } add_action( 'wp_enqueue_scripts', 'my_enqueue_scripts' ); ``` And then use this nonce when initializing the library: ```javascript var WPAPI = require( 'wpapi' ); var wp = new WPAPI({ endpoint: window.WP_API_Settings.endpoint, nonce: window.WP_API_Settings.nonce }); ``` ## API Documentation In addition to the above getting-started guide, we have automatically-generated [API documentation](http://wp-api.org/node-wpapi/api-reference/wpapi/1.1.2/). ## Issues If you identify any errors in this module, or have an idea for an improvement, please [open an issue](https://github.com/wp-api/node-wpapi/issues). We're excited to see what the community thinks of this project, and we would love your input! ## Contributing We welcome contributions large and small. See our [contributor guide](CONTRIBUTING.md) for more information.
50.193411
610
0.717099
eng_Latn
0.988167
0c29a971445188a00bf2732b1d92a6dc60700251
10,882
md
Markdown
modules/azurecontainerinstances/aci.md
dballance/MTC_ContainerCamp
f8f5a009255311d85fd60bbd0f9203b2c7452c71
[ "MIT" ]
1
2021-01-31T17:38:34.000Z
2021-01-31T17:38:34.000Z
modules/azurecontainerinstances/aci.md
larryclaman/containercamp-jekyll
bc88b71110ebfd4a4d984ec4ec16682e9213708d
[ "MIT" ]
null
null
null
modules/azurecontainerinstances/aci.md
larryclaman/containercamp-jekyll
bc88b71110ebfd4a4d984ec4ec16682e9213708d
[ "MIT" ]
null
null
null
# Deploying container to an Azure Container Instance This lab will walk through creating an ASP.Net Core application as container and debugging and then finally deploying the container to the Azure Container Instance. In this lab you will build the app and create the container on your Windows or Mac machine. Make sure you have installed the following pre-requisites on your machine | Prerequisites | | | ------------- |:-------------| | .NET Core 2.0 | [Install](https://www.microsoft.com/net/download/core) | | Docker | Download and install: [Docker Windows](https://download.docker.com/win/stable/InstallDocker.msi) - [Docker Mac](https://download.docker.com/mac/stable/Docker.dmg)| | VS Code with C# Plugin | [Install](https://code.visualstudio.com/Download) | | Node.js | [Install](https://nodejs.org/en/download/) | | Yeoman | Run from command prompt: **npm install -g yo** | | Yeoman generator for Docker | Run from command prompt: **npm install -g generator-docker** ||On Windows run from the folder "Program Files/nodejs/node_modules" | | Bower-A package manager for the web | Run from command prompt: **npm install -g bower** | | Gulp | Run from command prompt: **npm install -g gulp** | ## Task 1: Create ASP.NET Core Hello World Application 1. Open a command prompt and change your directory to your code location. Then run following commands to create an ASP.Net Core app. ``` md helloworld cd helloworld dotnet new web dotnet restore code . ``` 2. Open *Program.cs* Add missing assets necessary to debug the application by selecting **Yes** on the notification bar. Add the following line of code before the line ```.Build();``` ```c-sharp .UseUrls("http://*:8080") ``` Example: ``` public static IWebHost BuildWebHost(string[] args) => WebHost.CreateDefaultBuilder(args) .UseStartup<Startup>() .UseUrls("http://*:8080") .Build(); ``` 3. Press **Ctrl+S** to save the changes. 4. Open *Startup.cs* and modify the following line of code to match what is shown in the Configure method: ``` await context.Response.WriteAsync($"Hello World v1! {DateTime.Now}"); ``` 5. Press **Ctrl-S** to save changes. 6. Press **F5** and confirm the app runs correctly by browsing to http://localhost:8080. 7. Stop the run before proceeding to Task 2. ## Task 2: Create a Docker Image To support different developer environments, the .NET Core application will be deployed to Linux. Modify the application you started in Task 1 as follows: 1. Open the VS Code Terminal windows by pressing **Ctrl-`**. Publish app using VS Code terminal window with the following command: ``` dotnet restore dotnet publish ``` 3. Create new file by clicking [**Ctrl-N**] in VS Code and re-name it to *dockerfile*. You can do this by first right-click->Save As->Right Click->Rename. Ensure that there is no extension. The file name should be simply 'dockerfile'. 4. Add the following. Make sure to replace **\<appname>** in **ENTRYPOINT** last line to match your application name. For example helloworld.dll ``` FROM microsoft/aspnetcore:2.0 RUN apt-get update RUN apt-get install -y curl unzip RUN curl -sSL https://aka.ms/getvsdbgsh | bash /dev/stdin -v latest -l ~/vsdbg WORKDIR /app COPY ./bin/Debug/netcoreapp2.0/publish . EXPOSE 8080/tcp ENV ASPNETCORE_URLS http://*:8080 ENTRYPOINT ["dotnet", "/app/helloworld.dll"] ``` 5. Press **Ctrl-S** to save changes. 6. In the Terminal windows, create the Docker image by running the following commands. Ensure to substitute <your_image_name> with for example helloworldfromlinux : ``` docker build -t <your_image_name> . 
## Task 2: Create a Docker Image

To support different developer environments, the .NET Core application will be deployed to Linux. Modify the application you started in Task 1 as follows:

1. Open the VS Code Terminal window by pressing **Ctrl-`**.

2. Publish the app from the terminal window with the following commands:

```
dotnet restore
dotnet publish
```

3. Create a new file by pressing **Ctrl-N** in VS Code and save it as *dockerfile* (right-click the file and choose Rename if needed). Ensure that there is no extension; the file name should be simply 'dockerfile'.

4. Add the following. Make sure to replace **\<appname>** in the last **ENTRYPOINT** line to match your application name, for example helloworld.dll. (An alternative multi-stage approach is sketched after this task's steps.)

```
FROM microsoft/aspnetcore:2.0
RUN apt-get update
RUN apt-get install -y curl unzip
RUN curl -sSL https://aka.ms/getvsdbgsh | bash /dev/stdin -v latest -l ~/vsdbg
WORKDIR /app
COPY ./bin/Debug/netcoreapp2.0/publish .
EXPOSE 8080/tcp
ENV ASPNETCORE_URLS http://*:8080
ENTRYPOINT ["dotnet", "/app/helloworld.dll"]
```

5. Press **Ctrl+S** to save changes.

6. In the Terminal window, create the Docker image and start a container by running the following commands. Be sure to substitute <your_image_name> with a name of your choice, for example helloworldfromlinux:

```
docker build -t <your_image_name> .
docker run -d -p 8080:8080 <your_image_name>
docker ps
```

7. Browse to http://localhost:8080 or the IP address listed in the container list (Mac).

8. Close the browser.

9. In VS Code, update *Startup.cs*:

```
await context.Response.WriteAsync($"Hello World v2! {DateTime.Now}");
```

10. Press **Ctrl+S** to save changes.

11. Run the following commands to publish the updated application, create an updated image, and run the image in a new container. If the first container is still running, stop it first with `docker stop <container_id>` (find the ID with `docker ps`); otherwise port 8080 will already be in use.

```
dotnet publish
docker build -t <your_image_name> .
docker run -d -p 8080:8080 <your_image_name>
```

12. Browse to http://localhost:8080 or the IP address listed in the container list (Mac).

13. Close the browser.
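The dockerfile in step 4 copies in the output of the `dotnet publish` you ran on your own machine. As an alternative, a multi-stage dockerfile can perform the restore and publish inside the image build itself, so the image does not depend on local build output. This is only an illustrative sketch, not part of the lab; the `microsoft/aspnetcore-build:2.0` tag is an assumption based on the 2.0-era build images:

```
# Hypothetical multi-stage alternative: restore and publish inside the image build
FROM microsoft/aspnetcore-build:2.0 AS build
WORKDIR /src
COPY . .
RUN dotnet restore
RUN dotnet publish -c Debug -o /out

# Runtime stage, same as the lab's dockerfile
FROM microsoft/aspnetcore:2.0
WORKDIR /app
COPY --from=build /out .
EXPOSE 8080/tcp
ENV ASPNETCORE_URLS http://*:8080
ENTRYPOINT ["dotnet", "/app/helloworld.dll"]
```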
## Task 3: Debug an Application in a Docker Container

**Note: This section is undergoing an update and may not work correctly at this time**

1. Publish a debug version of the application

```none
dotnet publish -c Debug
```

2. Build a new version of the image

```none
docker build -t <your_image_name> --rm .
```

3. Find the old container and kill it

```none
docker ps
docker kill <container_id>
```

4. Start the debug version

```none
docker run -d -p 8080:8080 --name debug <your_image_name>
```

5. Add the following as the last json object in the *configurations* array in *launch.json* (found in the *.vscode* folder). Make sure to update the *"pipeArgs"* value and the *"sourceFileMap"* values accordingly.

```
{
    "name": ".NET Core Remote Attach",
    "type": "coreclr",
    "request": "attach",
    "processId": "${command:pickRemoteProcess}",
    "pipeTransport": {
        "pipeProgram": "powershell.exe",
        "pipeArgs": [ "-c", "docker exec -i debug ${debuggerCommand}" ],
        "debuggerPath": "/root/vsdbg/vsdbg",
        "pipeCwd": "${workspaceRoot}",
        "quoteArgs": true
    },
    "sourceFileMap": {
        "<path to your application's root directory>": "${workspaceRoot}"
    },
    "justMyCode": true
}
```

6. In VS Code, click on the Debug icon in the left toolbar *insert image*

7. From the **Debug** dropdown, select **.NET Core Remote Attach**

8. Go to the *Startup.cs* file. Set a breakpoint on the following line by clicking in the left-hand gutter

```csharp
await context.Response.WriteAsync($"Hello World v2! {DateTime.Now}");
```

9. Press **F5** or click the Debug Run Icon *insert image*

10. Browse to http://localhost:8080 to hit the breakpoint, then press **F5** to complete the request.

## Task 4: Deploy to Azure Container Instance

1. Open a browser and go to [http://portal.azure.com](http://portal.azure.com)

2. Log in using your Azure credentials

3. Click on the Cloud Shell icon at the top of the page. The shell may take some time to be initialized.

4. In the terminal window, create an Azure Resource Group to hold your workshop resources. You can delete this Resource Group at the end of the workshop; deleting the resource group will delete all of the assets created. Enter the following command:

```none
az group create --name <your_group_name> --location eastus
```

5. Create an Azure Container Registry. The ACR is used to store Docker images in the cloud. Azure Container Instances will pull images from this registry when deploying a new container.

```
az acr create --resource-group <your_group_name> --name <registry_name> --sku Basic --admin-enabled true
```

6. You will need information about the registry to use when creating containers in ACI. In the response from the **az acr create** command, find the value for *"loginServer"*. This value will be referred to as *<reg_login_server>* in later commands.

7. Get the password for the registry using the following command:

```
az acr credential show --name <registry_name> --query passwords[0].value
```

8. Copy and paste the password into a text document. The password will be referred to as *<registry_password>* in later commands.

9. Return to VS Code.

10. In the Terminal window, log in to the Azure Container Registry using the following command:

```
docker login --username=<registry_name> --password=<registry_password> <reg_login_server>
```

11. List your Docker images using the following command:

```
docker images
```

12. We will tag the latest image as version one of our ACI image. Run the following command:

```
docker tag <image_name> <reg_login_server>/<image_name>:v1
```

13. Confirm that the tagged image was created with the following command:

```
docker images
```

14. Now it is time to push the image to the Azure Container Registry. Run the following command:

```
docker push <reg_login_server>/<image_name>:v1
```

15. To return a list of images that have been pushed to your Azure Container Registry, run the following command in the Cloud Shell in your browser:

```
az acr repository list --name <registry_name> --username <registry_name> --password <registry_password> --output table
```

16. If you need to see tags that have been applied to a specific image, run the following command:

```
az acr repository show-tags --name <registry_name> --username <registry_name> --password <registry_password> --repository <image_name> --output table
```

17. It is now time to deploy our app. To deploy your container image from the container registry with a resource request of 1 CPU core and 1GB of memory, run the following command:

```
az container create --name <container_name> --image <reg_login_server>/<image_name>:v1 --cpu 1 --memory 1 --registry-login-server <reg_login_server> --registry-username <registry_name> --registry-password <registry_password> --ip-address public -g <your_group_name> --port 8080
```

18. Within a few seconds, you should get a response to your request. Initially, the container will be in a Creating state, but it should start within a few seconds. You can check the status using the show command:

```none
az container show --name <container_name> --resource-group <your_group_name>
```

19. At the bottom of the output, you will see the container's provisioning state and its IP address:

```json
"ipAddress": {
    "ip": "13.88.8.148",
    "ports": [
        {
            "port": 80,
            "protocol": "TCP"
        }
    ]
},
"osType": "Linux",
"provisioningState": "Succeeded"
```

20. Once the container moves to the Succeeded state, you can reach it in your browser using the IP address provided (the app listens on port 8080, so include `:8080` in the URL).

21. When finished, you can delete the container (and save money). A condensed recap of the full push-and-deploy sequence appears at the end of this document.

```none
az container delete --name <container_name> --resource-group <your_group_name>
```

### Blog reference:

[Visual Studio, Docker and ACI Better Together!](https://blogs.msdn.microsoft.com/alimaz/2017/08/17/visual-studio-docker-azure-container-instances-better-together/)
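For quick reference, the Task 4 push-and-deploy flow can be condensed into a single sequence of commands. This is a sketch using the same placeholder names as the steps above (replace every `<...>` placeholder with your own values; the `docker` commands run locally and the `az` commands run in the Cloud Shell):

```
# Tag and push the local image to the Azure Container Registry
docker login --username=<registry_name> --password=<registry_password> <reg_login_server>
docker tag <image_name> <reg_login_server>/<image_name>:v1
docker push <reg_login_server>/<image_name>:v1

# Deploy to an Azure Container Instance and check its status
az container create --name <container_name> --image <reg_login_server>/<image_name>:v1 \
    --cpu 1 --memory 1 --registry-login-server <reg_login_server> \
    --registry-username <registry_name> --registry-password <registry_password> \
    --ip-address public -g <your_group_name> --port 8080
az container show --name <container_name> --resource-group <your_group_name>

# Clean up when finished
az container delete --name <container_name> --resource-group <your_group_name>
az group delete --name <your_group_name>
```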
40.303704
285
0.672946
eng_Latn
0.943878
0c29c1dde41035a33e4549c3e9e379d17008b638
104
md
Markdown
streams-metric-exporter/lib/README.md
IBMStreams/streamsx.jmxclients
e49662dc64a01498a6d4885edfd1aa05a861b3bd
[ "Apache-2.0" ]
3
2017-10-25T00:03:46.000Z
2018-05-31T17:54:44.000Z
streams-metric-exporter/lib/README.md
IBMStreams/streamsx.jmxclients
e49662dc64a01498a6d4885edfd1aa05a861b3bd
[ "Apache-2.0" ]
32
2017-10-02T12:27:16.000Z
2020-09-28T14:49:21.000Z
streams-metric-exporter/lib/README.md
IBMStreams/streamsx.jmxclients
e49662dc64a01498a6d4885edfd1aa05a861b3bd
[ "Apache-2.0" ]
3
2019-04-07T23:58:16.000Z
2021-09-08T21:36:56.000Z
# IBM Streams JMX API

## IBM Streams Version 4.3.0.2

* Allows for Streams domain status in DomainMXBean
26
50
0.759615
eng_Latn
0.461495
0c2af2a1c7991018b64d578572282e5a42eaba92
312
md
Markdown
README.md
luoyexunxue/tasty_spoon
ed4965bd3c276185f2cc8d67df00d58fb5f60ae8
[ "MIT" ]
null
null
null
README.md
luoyexunxue/tasty_spoon
ed4965bd3c276185f2cc8d67df00d58fb5f60ae8
[ "MIT" ]
null
null
null
README.md
luoyexunxue/tasty_spoon
ed4965bd3c276185f2cc8d67df00d58fb5f60ae8
[ "MIT" ]
null
null
null
This project is developed using the Tuya SDK, which enables you to quickly develop branded apps connecting and controlling smart scenarios of many devices. For more information, please check the Tuya Developer Website.

## Circuit Schematic

![Schematic](images/schema.png)

## PCB Design

![PCB](images/pcb.png)

## Enclosure Design

![Enclosure](images/shell.png)
31.2
210
0.766026
eng_Latn
0.973963
0c2afa38369d1a3a94560b838d431a9ef9193a2d
45,172
md
Markdown
docs/mfc/reference/colecontrolsite-class.md
Mdlglobal-atlassian-net/cpp-docs.cs-cz
803fe43d9332d0b8dda5fd4acfe7f1eb0da3a35e
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/mfc/reference/colecontrolsite-class.md
Mdlglobal-atlassian-net/cpp-docs.cs-cz
803fe43d9332d0b8dda5fd4acfe7f1eb0da3a35e
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/mfc/reference/colecontrolsite-class.md
Mdlglobal-atlassian-net/cpp-docs.cs-cz
803fe43d9332d0b8dda5fd4acfe7f1eb0da3a35e
[ "CC-BY-4.0", "MIT" ]
1
2020-05-28T15:53:26.000Z
2020-05-28T15:53:26.000Z
--- title: COleControlSite – třída ms.date: 11/04/2016 f1_keywords: - COleControlSite - AFXOCC/COleControlSite - AFXOCC/COleControlSite::COleControlSite - AFXOCC/COleControlSite::BindDefaultProperty - AFXOCC/COleControlSite::BindProperty - AFXOCC/COleControlSite::CreateControl - AFXOCC/COleControlSite::DestroyControl - AFXOCC/COleControlSite::DoVerb - AFXOCC/COleControlSite::EnableDSC - AFXOCC/COleControlSite::EnableWindow - AFXOCC/COleControlSite::FreezeEvents - AFXOCC/COleControlSite::GetDefBtnCode - AFXOCC/COleControlSite::GetDlgCtrlID - AFXOCC/COleControlSite::GetEventIID - AFXOCC/COleControlSite::GetExStyle - AFXOCC/COleControlSite::GetProperty - AFXOCC/COleControlSite::GetStyle - AFXOCC/COleControlSite::GetWindowText - AFXOCC/COleControlSite::InvokeHelper - AFXOCC/COleControlSite::InvokeHelperV - AFXOCC/COleControlSite::IsDefaultButton - AFXOCC/COleControlSite::IsWindowEnabled - AFXOCC/COleControlSite::ModifyStyle - AFXOCC/COleControlSite::ModifyStyleEx - AFXOCC/COleControlSite::MoveWindow - AFXOCC/COleControlSite::QuickActivate - AFXOCC/COleControlSite::SafeSetProperty - AFXOCC/COleControlSite::SetDefaultButton - AFXOCC/COleControlSite::SetDlgCtrlID - AFXOCC/COleControlSite::SetFocus - AFXOCC/COleControlSite::SetProperty - AFXOCC/COleControlSite::SetPropertyV - AFXOCC/COleControlSite::SetWindowPos - AFXOCC/COleControlSite::SetWindowText - AFXOCC/COleControlSite::ShowWindow - AFXOCC/COleControlSite::GetControlInfo - AFXOCC/COleControlSite::m_bIsWindowless - AFXOCC/COleControlSite::m_ctlInfo - AFXOCC/COleControlSite::m_dwEventSink - AFXOCC/COleControlSite::m_dwMiscStatus - AFXOCC/COleControlSite::m_dwPropNotifySink - AFXOCC/COleControlSite::m_dwStyle - AFXOCC/COleControlSite::m_hWnd - AFXOCC/COleControlSite::m_iidEvents - AFXOCC/COleControlSite::m_nID - AFXOCC/COleControlSite::m_pActiveObject - AFXOCC/COleControlSite::m_pCtrlCont - AFXOCC/COleControlSite::m_pInPlaceObject - AFXOCC/COleControlSite::m_pObject - AFXOCC/COleControlSite::m_pWindowlessObject - AFXOCC/COleControlSite::m_pWndCtrl - AFXOCC/COleControlSite::m_rect helpviewer_keywords: - COleControlSite [MFC], COleControlSite - COleControlSite [MFC], BindDefaultProperty - COleControlSite [MFC], BindProperty - COleControlSite [MFC], CreateControl - COleControlSite [MFC], DestroyControl - COleControlSite [MFC], DoVerb - COleControlSite [MFC], EnableDSC - COleControlSite [MFC], EnableWindow - COleControlSite [MFC], FreezeEvents - COleControlSite [MFC], GetDefBtnCode - COleControlSite [MFC], GetDlgCtrlID - COleControlSite [MFC], GetEventIID - COleControlSite [MFC], GetExStyle - COleControlSite [MFC], GetProperty - COleControlSite [MFC], GetStyle - COleControlSite [MFC], GetWindowText - COleControlSite [MFC], InvokeHelper - COleControlSite [MFC], InvokeHelperV - COleControlSite [MFC], IsDefaultButton - COleControlSite [MFC], IsWindowEnabled - COleControlSite [MFC], ModifyStyle - COleControlSite [MFC], ModifyStyleEx - COleControlSite [MFC], MoveWindow - COleControlSite [MFC], QuickActivate - COleControlSite [MFC], SafeSetProperty - COleControlSite [MFC], SetDefaultButton - COleControlSite [MFC], SetDlgCtrlID - COleControlSite [MFC], SetFocus - COleControlSite [MFC], SetProperty - COleControlSite [MFC], SetPropertyV - COleControlSite [MFC], SetWindowPos - COleControlSite [MFC], SetWindowText - COleControlSite [MFC], ShowWindow - COleControlSite [MFC], GetControlInfo - COleControlSite [MFC], m_bIsWindowless - COleControlSite [MFC], m_ctlInfo - COleControlSite [MFC], m_dwEventSink - COleControlSite [MFC], m_dwMiscStatus - COleControlSite 
[MFC], m_dwPropNotifySink - COleControlSite [MFC], m_dwStyle - COleControlSite [MFC], m_hWnd - COleControlSite [MFC], m_iidEvents - COleControlSite [MFC], m_nID - COleControlSite [MFC], m_pActiveObject - COleControlSite [MFC], m_pCtrlCont - COleControlSite [MFC], m_pInPlaceObject - COleControlSite [MFC], m_pObject - COleControlSite [MFC], m_pWindowlessObject - COleControlSite [MFC], m_pWndCtrl - COleControlSite [MFC], m_rect ms.assetid: 43970644-5eab-434a-8ba6-56d144ff1e3f ms.openlocfilehash: 90c41a1be1a66cdceebb3f045a98167e56b7cf4c ms.sourcegitcommit: 7a6116e48c3c11b97371b8ae4ecc23adce1f092d ms.translationtype: MT ms.contentlocale: cs-CZ ms.lasthandoff: 04/22/2020 ms.locfileid: "81753955" --- # <a name="colecontrolsite-class"></a>COleControlSite – třída Poskytuje podporu pro vlastní rozhraní řízení na straně klienta. ## <a name="syntax"></a>Syntaxe ``` class COleControlSite : public CCmdTarget ``` ## <a name="members"></a>Členové ### <a name="public-constructors"></a>Veřejné konstruktory |Název|Popis| |----------|-----------------| |[COleControlSite::COleControlSite](#colecontrolsite)|Vytvoří `COleControlSite` objekt.| ### <a name="public-methods"></a>Veřejné metody |Název|Popis| |----------|-----------------| |[COleControlSite::Vlastnost BindDefaultProperty](#binddefaultproperty)|Sváže výchozí vlastnost hostovaného ovládacího prvku se zdrojem dat.| |[COleControlSite::Vlastnost BindProperty](#bindproperty)|Sváže vlastnost hostovaného ovládacího prvku se zdrojem dat.| |[COleControlSite::CreateControl](#createcontrol)|Vytvoří hostovaný ovládací prvek ActiveX.| |[COleControlSite::DestroyControl](#destroycontrol)|Zničí hostovaný ovládací prvek.| |[COleControlSite::DoVerb](#doverb)|Provede konkrétní sloveso hostovaného ovládacího prvku.| |[COleControlSite::EnableDSC](#enabledsc)|Umožňuje získávání dat pro řídicí lokalitu.| |[COleControlSite::EnableWindow](#enablewindow)|Povolí řídicí lokalitu.| |[COleControlSite::FreezeEvents](#freezeevents)|Určuje, zda řídicí lokalita přijímá události.| |[COleControlSite::GetDefBtnCode](#getdefbtncode)|Načte výchozí kód tlačítka pro hostovaný ovládací prvek.| |[COleControlSite::GetDlgCtrlID](#getdlgctrlid)|Načte identifikátor ovládacího prvku.| |[COleControlSite::GeteventiID](#geteventiid)|Načte ID rozhraní události pro hostovaný ovládací prvek.| |[COleControlSite::GetExStyle](#getexstyle)|Načte rozšířené styly řídicího webu.| |[COleControlSite::Vlastnost GetProperty](#getproperty)|Načte určitou vlastnost hostovaného ovládacího prvku.| |[COleControlSite::GetStyle](#getstyle)|Načte styly řídicího webu.| |[COleControlSite::GetWindowText](#getwindowtext)|Načte text hostovaného ovládacího prvku.| |[COleControlSite::InvokeHelper](#invokehelper)|Vyvolat konkrétní metodu hostovaného ovládacího prvku.| |[COleControlSite::InvokeHelperV](#invokehelperv)|Vyvolat konkrétní metodu hostovaného ovládacího prvku s proměnnýseznam argumentů.| |[COleControlSite::Tlačítko IsDefaultButton](#isdefaultbutton)|Určuje, zda je ovládací prvek výchozím tlačítkem v okně.| |[COleControlSite::IsWindowEnabled COleControlSite::IsWindowEnabled COleControlSite::IsWindowEnabled COle](#iswindowenabled)|Zkontroluje viditelný stav řídicího webu.| |[COleControlSite::Změnit styl](#modifystyle)|Upraví aktuální rozšířené styly řídicího webu.| |[COleControlSite::ModifyStyleEx](#modifystyleex)|Upraví aktuální styly řídicího webu.| |[COleControlSite::MoveWindow](#movewindow)|Změní polohu řídicího místa.| |[COleControlSite::QuickActivate](#quickactivate)|Rychle aktivuje hostovaný ovládací prvek.| 
|[COleControlSite::SafeSetProperty](#safesetproperty)|Nastaví vlastnost nebo metodu ovládacího prvku bez možnosti vyvolání výjimky.| |[COleControlSite::SetDefaultButton](#setdefaultbutton)|Nastaví výchozí tlačítko v okně.| |[COleControlSite::SetDlgCtrlID](#setdlgctrlid)|Načte identifikátor ovládacího prvku.| |[COleControlSite::SetFocus](#setfocus)|Nastaví fokus na řídicí web.| |[COleControlSite::SetProperty](#setproperty)|Nastaví určitou vlastnost hostovaného ovládacího prvku.| |[COleControlSite::SetPropertyV](#setpropertyv)|Nastaví určitou vlastnost hostovaného ovládacího prvku se seznamem proměnných argumentů.| |[COleControlSite::SetWindowPos](#setwindowpos)|Nastaví pozici řídicího webu.| |[COleControlSite::SetWindowText](#setwindowtext)|Nastaví text hostovaného ovládacího prvku.| |[COleControlSite::Zobrazitokno](#showwindow)|Zobrazí nebo skryje řídicí web.| ### <a name="protected-methods"></a>Chráněné metody |Název|Popis| |----------|-----------------| |[COleControlSite::GetControlInfo](#getcontrolinfo)|Načte informace o klávesnici a mnemotechnické pomůcky pro hostované ovládací prvek.| ### <a name="public-data-members"></a>Veřejné datové členy |Název|Popis| |----------|-----------------| |[COleControlSite::m_bIsWindowless](#m_biswindowless)|Určuje, zda je hostovaný ovládací prvek ovládací prvek bez oken.| |[COleControlSite::m_ctlInfo](#m_ctlinfo)|Obsahuje informace o zpracování klávesnice pro ovládací prvek.| |[COleControlSite::m_dwEventSink](#m_dweventsink)|Soubor cookie spojovacího bodu ovládacího prvku.| |[COleControlSite::m_dwMiscStatus](#m_dwmiscstatus)|Různé stavy pro hostovaný ovládací prvek.| |[COleControlSite::m_dwPropNotifySink](#m_dwpropnotifysink)|Soubor `IPropertyNotifySink` cookie ovládacího prvku.| |[COleControlSite::m_dwStyle](#m_dwstyle)|Styly hostovaného ovládacího prvku.| |[COleControlSite::m_hWnd](#m_hwnd)|Popisovač řídicího místa.| |[COleControlSite::m_iidEvents](#m_iidevents)|ID rozhraní události pro hostovaný ovládací prvek.| |[COleControlSite::m_nID](#m_nid)|ID hostovaného ovládacího prvku.| |[COleControlSite::m_pActiveObject](#m_pactiveobject)|Ukazatel na `IOleInPlaceActiveObject` objekt hostovaného ovládacího prvku.| |[COleControlSite::m_pCtrlCont](#m_pctrlcont)|Kontejner hostovaného ovládacího prvku.| |[COleControlSite::m_pInPlaceObject](#m_pinplaceobject)|Ukazatel na `IOleInPlaceObject` objekt hostovaného ovládacího prvku.| |[COleControlSite::m_pObject](#m_pobject)|Ukazatel na `IOleObjectInterface` rozhraní ovládacího prvku.| |[COleControlSite::m_pWindowlessObject](#m_pwindowlessobject)|Ukazatel na `IOleInPlaceObjectWindowless` rozhraní ovládacího prvku.| |[COleControlSite::m_pWndCtrl](#m_pwndctrl)|Ukazatel na objekt okna pro hostovaný ovládací prvek.| |[COleControlSite::m_rect](#m_rect)|Rozměry řídicího místa.| ## <a name="remarks"></a>Poznámky Tato podpora je primárním prostředkem, kterým vložený ovládací prvek ActiveX získává informace o umístění a rozsahu jeho webu zobrazení, jeho zástupný název, jeho uživatelské rozhraní, jeho okolí vlastnosti a další prostředky poskytované jeho kontejneru. 
`COleControlSite`plně implementuje rozhraní [IOleControlSite](/windows/win32/api/ocidl/nn-ocidl-iolecontrolsite), [IOleInPlaceSite](/windows/win32/api/oleidl/nn-oleidl-ioleinplacesite), [IOleClientSite](/windows/win32/api/oleidl/nn-oleidl-ioleclientsite), [IPropertyNotifySink](/windows/win32/api/ocidl/nn-ocidl-ipropertynotifysink), `IBoundObjectSite`, `INotifyDBEvents`, [IRowSetNotify.](../../data/oledb/irowsetnotifyimpl-class.md) Kromě toho je implementováno také rozhraní IDispatch (poskytující podporu pro vlastnosti okolí a jímky událostí). Chcete-li vytvořit lokalitu `COleControlSite`ovládacího prvku `COleControlSite`ActiveX pomocí , odvoděte třídu z aplikace . Ve `CWnd`vaší odvozené třídě pro kontejner (například dialogové okno) přepište `CWnd::CreateControlSite` funkci. ## <a name="inheritance-hierarchy"></a>Hierarchie dědičnosti [CObjekt](../../mfc/reference/cobject-class.md) [CCmdCíl](../../mfc/reference/ccmdtarget-class.md) `COleControlSite` ## <a name="requirements"></a>Požadavky **Záhlaví:** afxocc.h ## <a name="colecontrolsitebinddefaultproperty"></a><a name="binddefaultproperty"></a>COleControlSite::Vlastnost BindDefaultProperty Sváže výchozí vlastnost simple bound volajícího objektu, jak je označena v knihovně typů, s podkladovým kurzorem, který je definován vlastnostmi DataSource, UserName, Password a SQL ovládacího prvku zdroje dat. ``` virtual void BindDefaultProperty( DISPID dwDispID, VARTYPE vtProp, LPCTSTR szFieldName, CWnd* pDSCWnd); ``` ### <a name="parameters"></a>Parametry *dwDispID*<br/> Určuje dispid vlastnosti na ovládací prvek vázaný na data, který má být vázán na ovládací prvek zdroje dat. *vtProp*<br/> Určuje typ vlastnosti, která má být vázána, například VT_BSTR, VT_VARIANT a tak dále. *szFieldName*<br/> Určuje název sloupce v kurzoru poskytovaném ovládacím prvkem zdroje dat, ke kterému bude vlastnost vázána. *pDSCWnd*<br/> Ukazatel na `CWnd`-derived objekt, který je hostitelem ovládacího prvku zdroj dat, ke kterému bude vlastnost vázána. ### <a name="remarks"></a>Poznámky Objekt, `CWnd` na kterém voláte tuto funkci, musí být ovládací prvek vázaný na data. ## <a name="colecontrolsitebindproperty"></a><a name="bindproperty"></a>COleControlSite::Vlastnost BindProperty Sváže vlastnost simple bound volajícího objektu označenou v knihovně typů s podkladovým kurzorem, který je definován vlastnostmi DataSource, UserName, Password a SQL ovládacího prvku zdroje dat. ``` virtual void BindProperty( DISPID dwDispId, CWnd* pWndDSC); ``` ### <a name="parameters"></a>Parametry *dwDispId*<br/> Určuje dispid vlastnosti na ovládací prvek vázaný na data, který má být vázán na ovládací prvek zdroje dat. *pWndDSC*<br/> Ukazatel na `CWnd`-derived objekt, který je hostitelem ovládacího prvku zdroj dat, ke kterému bude vlastnost vázána. ### <a name="remarks"></a>Poznámky Objekt, `CWnd` na kterém voláte tuto funkci, musí být ovládací prvek vázaný na data. ## <a name="colecontrolsitecolecontrolsite"></a><a name="colecontrolsite"></a>COleControlSite::COleControlSite Vytvoří nový `COleControlSite` objekt. ``` explicit COleControlSite(COleControlContainer* pCtrlCont); ``` ### <a name="parameters"></a>Parametry *pCtrlCont*<br/> Ukazatel na kontejner ovládacího prvku (který představuje okno, které hostuje ovládací prvek AtiveX). 
### <a name="remarks"></a>Poznámky Tato funkce je volána funkcí [COccManager::CreateContainer.](../../mfc/reference/coccmanager-class.md#createcontainer) Další informace o přizpůsobení vytváření kontejnerů naleznete v tématu [COccManager::CreateSite](../../mfc/reference/coccmanager-class.md#createsite). ## <a name="colecontrolsitecreatecontrol"></a><a name="createcontrol"></a>COleControlSite::CreateControl Vytvoří ovládací prvek ActiveX, `COleControlSite` který je hostován objektem. ``` virtual HRESULT CreateControl( CWnd* pWndCtrl, REFCLSID clsid, LPCTSTR lpszWindowName, DWORD dwStyle, const RECT& rect, UINT nID, CFile* pPersist = NULL, BOOL bStorage = FALSE, BSTR bstrLicKey = NULL); virtual HRESULT CreateControl( CWnd* pWndCtrl, REFCLSID clsid, LPCTSTR lpszWindowName, DWORD dwStyle, const POINT* ppt, const SIZE* psize, UINT nID, CFile* pPersist = NULL, BOOL bStorage = FALSE, BSTR bstrLicKey = NULL); ``` ### <a name="parameters"></a>Parametry *pWndCtrl*<br/> Ukazatel na objekt okna představující ovládací prvek. *Identifikátor clsid*<br/> Jedinečné ID třídy ovládacího prvku. *lpszNázev_okna*<br/> Ukazatel na text, který má být zobrazen v ovládacím prvku. Nastaví hodnotu vlastnosti Titulek nebo Text winodw (pokud existuje). *dwStyl*<br/> Styly systému Windows. Dostupné styly jsou uvedeny v části **Poznámky.** *Rect*<br/> Určuje velikost a umístění ovládacího prvku. Může to být `CRect` objekt `RECT` nebo struktura. *Nid*<br/> Určuje ID podřízeného okna ovládacího prvku. *pPřetrvávají*<br/> Ukazatel na `CFile` obsahující trvalý stav ovládacího prvku. Výchozí hodnota je NULL, označující, že ovládací prvek inicializuje sám bez obnovení jeho stavu z jakékoli trvalé úložiště. Pokud není NULL, by měl `CFile`být ukazatel na -odvozený objekt, který obsahuje trvalé data ovládacího prvku, ve formě datového proudu nebo úložiště. Tato data mohla být uložena při předchozí aktivaci klienta. Může `CFile` obsahovat jiná data, ale musí mít ukazatel pro čtení a zápis nastavený na `CreateControl`první bajt trvalých dat v době volání aplikace . *bÚložiště*<br/> Označuje, zda data v *pPersist* `IStorage` by `IStream` měla být interpretována jako nebo data. Pokud data v *pPersist* je úložiště, *bStorage* by měla být TRUE. Pokud data v *pPersist* je datový proud, *bStorage* by měl být FALSE. Výchozí hodnota je FALSE. *bstrLicKey*<br/> Volitelná data licenčního klíče. Tato data jsou potřebná pouze pro vytváření ovládacích prvků, které vyžadují licenční klíč za běhu. Pokud ovládací prvek podporuje licencování, je nutné zadat licenční klíč pro vytvoření ovládacího prvku úspěšné. Výchozí hodnota je NULL. *Ppt*<br/> Ukazatel na `POINT` strukturu, která obsahuje levý horní roh ovládacího prvku. Velikost ovládacího prvku je určena hodnotou *psize*. Hodnoty *ppt* a *psize* jsou volitelnou metodou určení velikosti a umístění opf ovládacího prvku. *pvelikost*<br/> Ukazatel na `SIZE` strukturu, která obsahuje velikost ovládacího prvku. Levý horní roh je určen hodnotou *ppt*. Hodnoty *ppt* a *psize* jsou volitelnou metodou určení velikosti a umístění opf ovládacího prvku. ### <a name="return-value"></a>Návratová hodnota Standardní hodnota HRESULT. ### <a name="remarks"></a>Poznámky Pouze podmnožinu příznaků *Windows dwStyle* jsou podporovány `CreateControl`: - WS_VISIBLE Vytvoří okno, které je zpočátku viditelné. Povinné, pokud chcete, aby byl ovládací prvek viditelný okamžitě, jako běžná okna. - WS_DISABLED Vytvoří okno, které je zpočátku zakázáno. Zakázané okno nemůže přijímat vstup od uživatele. 
Lze nastavit, pokud má ovládací prvek vlastnost Enabled. - WS_BORDER Vytvoří okno s tenkým ohraničením. Lze nastavit, pokud má ovládací prvek vlastnost BorderStyle. - WS_GROUP Určuje první ovládací prvek skupiny ovládacích prvků. Uživatel může pomocí směrových kláves změnit fokus klávesnice z jednoho ovládacího prvku ve skupině na další. Všechny ovládací prvky definované WS_GROUP stylu po prvním ovládacím prvku patří do stejné skupiny. Další ovládací prvek se stylem WS_GROUP ukončí skupinu a spustí další skupinu. - WS_TABSTOP Určuje ovládací prvek, který může přijímat fokus klávesnice, když uživatel stiskne klávesu TAB. Stisknutím klávesy TAB se změní zaostření klávesnice na další ovládací prvek WS_TABSTOP stylu. Druhé přetížení použijte k vytvoření ovládacích prvků výchozí velikosti. ## <a name="colecontrolsitedestroycontrol"></a><a name="destroycontrol"></a>COleControlSite::DestroyControl Zničí `COleControlSite` objekt. ``` virtual BOOL DestroyControl(); ``` ### <a name="return-value"></a>Návratová hodnota Nenulová, pokud je úspěšná, jinak 0. ### <a name="remarks"></a>Poznámky Po dokončení je objekt uvolněn z paměti a všechny ukazatele na objekt již nejsou platné. ## <a name="colecontrolsitedoverb"></a><a name="doverb"></a>COleControlSite::DoVerb Provede zadané sloveso. ``` virtual HRESULT DoVerb( LONG nVerb, LPMSG lpMsg = NULL); ``` ### <a name="parameters"></a>Parametry *nSVNože*<br/> Určuje sloveso, které má být spuštěno. Může obsahovat jednu z následujících možností: |Hodnota|Význam|Symbol| |-----------|-------------|------------| |0|primární požadavek|OLEIVERB_PRIMARY| |-1|Sekundární sloveso|(None)| |1|Zobrazí objekt pro úpravy.|OLEIVERB_SHOW| |-2|Upovená položku v samostatném okně.|OLEIVERB_OPEN| |-3|Skryje objekt.|OLEIVERB_HIDE| |-4|Aktivuje ovládací prvek na místě.|OLEIVERB_UIACTIVATE| |-5|Aktivuje ovládací prvek na místě, bez dalších prvků uživatelského rozhraní.|OLEIVERB_INPLACEACTIVATE| |-7|Zobrazení vlastností ovládacího prvku.|OLEIVERB_PROPERTIES| *lpMsg*<br/> Ukazatel na zprávu, která způsobila aktivaci položky. ### <a name="return-value"></a>Návratová hodnota Standardní hodnota HRESULT. ### <a name="remarks"></a>Poznámky Tato funkce přímo volá prostřednictvím rozhraní ovládacího `IOleObject` prvku ke spuštění zadaného slovesa. Pokud je vyvolána výjimka v důsledku tohoto volání funkce, je vrácen kód chyby HRESULT. Další informace naleznete v tématu [IOleObject::DoVerb](/windows/win32/api/oleidl/nf-oleidl-ioleobject-doverb) v sadě Windows SDK. ## <a name="colecontrolsiteenabledsc"></a><a name="enabledsc"></a>COleControlSite::EnableDSC Umožňuje získávání dat pro řídicí lokalitu. ``` virtual void EnableDSC(); ``` ### <a name="remarks"></a>Poznámky Volat rámci povolit a inicializovat získávání dat pro řídicí lokality. Přepsat tuto funkci poskytnout vlastní chování. ## <a name="colecontrolsiteenablewindow"></a><a name="enablewindow"></a>COleControlSite::EnableWindow Povolí nebo zakáže vstup myši a klávesnice na řídicí web. ``` virtual BOOL EnableWindow(BOOL bEnable); ``` ### <a name="parameters"></a>Parametry *bEnable*<br/> Určuje, zda má být okno povoleno nebo zakázáno: PRAVDA, má-li být povolen vstup okna, jinak NEPRAVDA. ### <a name="return-value"></a>Návratová hodnota Nenulová, pokud bylo okno dříve zakázáno, jinak 0. ## <a name="colecontrolsitefreezeevents"></a><a name="freezeevents"></a>COleControlSite::FreezeEvents Určuje, zda bude lokalita ovládacího prvku zpracovávat nebo ignorovat události vypálené z ovládacího prvku. 
```cpp void FreezeEvents(BOOL bFreeze); ``` ### <a name="parameters"></a>Parametry *bZmrazení*<br/> Určuje, zda si lokalita ovládacího prvku přeje zastavit přijímání událostí. Nenulová, pokud ovládací prvek nepřijímá události; jinak nula. ### <a name="remarks"></a>Poznámky Pokud *bFreeze* je PRAVDA, server ovládacího prvku požaduje ovládací prvek zastavit fring události. Pokud *bFreeze* je FALSE, server ovládacího prvku požaduje ovládací prvek pokračovat v spouštění událostí. > [!NOTE] > Ovládací prvek není nutné zastavit spouštění událostí, pokud o to kontrolní lokality. Může pokračovat v spouštění, ale všechny následné události budou ignorovány řídicím serverem. ## <a name="colecontrolsitegetcontrolinfo"></a><a name="getcontrolinfo"></a>COleControlSite::GetControlInfo Načte informace o ovládacím prvku klávesnice mnemotechnické pomůcky a chování klávesnice. ```cpp void GetControlInfo(); ``` ### <a name="remarks"></a>Poznámky Informace jsou uloženy v [COleControlSite::m_ctlInfo](#m_ctlinfo). ## <a name="colecontrolsitegetdefbtncode"></a><a name="getdefbtncode"></a>COleControlSite::GetDefBtnCode Určuje, zda je ovládací prvek výchozím tlačítkem. ``` DWORD GetDefBtnCode(); ``` ### <a name="return-value"></a>Návratová hodnota Může se na nich vyvěšovat jedna z následujících hodnot: - DLGC_DEFPUSHBUTTON Control je výchozí tlačítko v dialogovém okně. - DLGC_UNDEFPUSHBUTTON Control není v dialogovém okně výchozím tlačítkem. - **0** Ovládání není tlačítko. ## <a name="colecontrolsitegetdlgctrlid"></a><a name="getdlgctrlid"></a>COleControlSite::GetDlgCtrlID Načte identifikátor ovládacího prvku. ``` virtual int GetDlgCtrlID() const; ``` ### <a name="return-value"></a>Návratová hodnota Identifikátor položky dialogu ovládacího prvku. ## <a name="colecontrolsitegeteventiid"></a><a name="geteventiid"></a>COleControlSite::GeteventiID Načte ukazatel na výchozí rozhraní události ovládacího prvku. ``` BOOL GetEventIID(IID* piid); ``` ### <a name="parameters"></a>Parametry *piid*<br/> Ukazatel na ID rozhraní. ### <a name="return-value"></a>Návratová hodnota Nenulová, pokud je úspěšná, jinak 0. V případě *úspěchu, piid* obsahuje ID rozhraní pro výchozí rozhraní události ovládacího prvku. ## <a name="colecontrolsitegetexstyle"></a><a name="getexstyle"></a>COleControlSite::GetExStyle Načte rozšířené styly okna. ``` virtual DWORD GetExStyle() const; ``` ### <a name="return-value"></a>Návratová hodnota Rozšířené styly ovládacího okna. ### <a name="remarks"></a>Poznámky Chcete-li načíst běžné styly, zavolejte [COleControlSite::GetStyle](#getstyle). ## <a name="colecontrolsitegetproperty"></a><a name="getproperty"></a>COleControlSite::Vlastnost GetProperty Získá vlastnost ovládacího prvku určené *dwDispID*. ``` virtual void GetProperty( DISPID dwDispID, VARTYPE vtProp, void* pvProp) const; ``` ### <a name="parameters"></a>Parametry *dwDispID*<br/> Identifikuje ID odeslání vlastnosti, které lze nalézt `IDispatch` na výchozí rozhraní ovládacího prvku, které mají být načteny. *vtProp*<br/> Určuje typ vlastnosti, která má být načtena. Možné hodnoty naleznete v části Poznámky pro [COleDispatchDriver::InvokeHelper](../../mfc/reference/coledispatchdriver-class.md#invokehelper). *pvProp*<br/> Adresa proměnné, která obdrží hodnotu vlastnosti. Musí odpovídat typu určenému *vtProp*. ### <a name="remarks"></a>Poznámky Hodnota je vrácena prostřednictvím *pvProp*. ## <a name="colecontrolsitegetstyle"></a><a name="getstyle"></a>COleControlSite::GetStyle Načte styly řídicího webu. 
``` virtual DWORD GetStyle() const; ``` ### <a name="return-value"></a>Návratová hodnota Styly okna. ### <a name="remarks"></a>Poznámky Seznam možných hodnot naleznete v tématu [Styly systému Windows](../../mfc/reference/styles-used-by-mfc.md#window-styles). Chcete-li načíst rozšířené styly lokality ovládacího prvku, zavolejte [COleControlSite::GetExStyle](#getexstyle). ## <a name="colecontrolsitegetwindowtext"></a><a name="getwindowtext"></a>COleControlSite::GetWindowText Načte aktuální text ovládacího prvku. ``` virtual void GetWindowText(CString& str) const; ``` ### <a name="parameters"></a>Parametry *Str*<br/> Odkaz na `CString` objekt, který obsahuje aktuální text ovládacího prvku. ### <a name="remarks"></a>Poznámky Pokud ovládací prvek podporuje caption stock vlastnost, tato hodnota je vrácena. Pokud vlastnost Caption stock není podporována, je vrácena hodnota vlastnosti Text. ## <a name="colecontrolsiteinvokehelper"></a><a name="invokehelper"></a>COleControlSite::InvokeHelper Vyvolá metodu nebo vlastnost určenou *dwDispID*v kontextu určeném *wFlags*. ``` virtual void AFX_CDECL InvokeHelper( DISPID dwDispID, WORD wFlags, VARTYPE vtRet, void* pvRet, const BYTE* pbParamInfo, ...); ``` ### <a name="parameters"></a>Parametry *dwDispID*<br/> Identifikuje ID odeslání vlastnosti nebo metody, které se `IDispatch` nacházejí v rozhraní ovládacího prvku, které mají být vyvolány. *wPříznaky*<br/> Příznaky popisující kontext volání iDispatch::Invoke. Možné hodnoty *wFlags* `IDispatch::Invoke` naleznete v sadě Windows SDK. *vtRet*<br/> Určuje typ vrácené hodnoty. Možné hodnoty naleznete v části Poznámky pro [COleDispatchDriver::InvokeHelper](../../mfc/reference/coledispatchdriver-class.md#invokehelper). *pvRet*<br/> Adresa proměnné, která obdrží hodnotu vlastnosti nebo vrácenou hodnotu. Musí odpovídat typu určenému *vtRet*. *pbParamInfo*<br/> Ukazatel na řetězec bajtů ukončený nulou určující typy parametrů následujících *po pbParamInfo*. Možné hodnoty naleznete v části Poznámky pro [COleDispatchDriver::InvokeHelper](../../mfc/reference/coledispatchdriver-class.md#invokehelper). *...*<br/> Variabilní seznam parametrů, typů určených v *pbParamInfo*. ### <a name="remarks"></a>Poznámky Parametr *pbParamInfo* určuje typy parametrů předaných metodě nebo vlastnosti. Variabilní seznam argumentů je reprezentován ... v deklaraci syntaxe. Tato funkce převede parametry na hodnoty VARIANTARG `IDispatch::Invoke` a poté vyvolá metodu ovládacího prvku. Pokud volání `IDispatch::Invoke` nezdaří, tato funkce vyvolá výjimku. Pokud je stavový `IDispatch::Invoke` `DISP_E_EXCEPTION`kód vrácený , `COleDispatchException` tato funkce vyvolá `COleException`objekt, jinak vyvolá . ## <a name="colecontrolsiteinvokehelperv"></a><a name="invokehelperv"></a>COleControlSite::InvokeHelperV Vyvolá metodu nebo vlastnost určenou *dwDispID*v kontextu určeném *wFlags*. ``` virtual void InvokeHelperV( DISPID dwDispID, WORD wFlags, VARTYPE vtRet, void* pvRet, const BYTE* pbParamInfo, va_list argList); ``` ### <a name="parameters"></a>Parametry *dwDispID*<br/> Identifikuje ID odeslání vlastnosti nebo metody, které se `IDispatch` nacházejí v rozhraní ovládacího prvku, které mají být vyvolány. *wPříznaky*<br/> Příznaky popisující kontext volání iDispatch::Invoke. *vtRet*<br/> Určuje typ vrácené hodnoty. Možné hodnoty naleznete v části Poznámky pro [COleDispatchDriver::InvokeHelper](../../mfc/reference/coledispatchdriver-class.md#invokehelper). *pvRet*<br/> Adresa proměnné, která obdrží hodnotu vlastnosti nebo vrácenou hodnotu. 
Musí odpovídat typu určenému *vtRet*. *pbParamInfo*<br/> Ukazatel na řetězec bajtů ukončený nulou určující typy parametrů následujících *po pbParamInfo*. Možné hodnoty naleznete v části Poznámky pro [COleDispatchDriver::InvokeHelper](../../mfc/reference/coledispatchdriver-class.md#invokehelper). *seznam arglist*<br/> Ukazatel na seznam proměnných argumentů. ### <a name="remarks"></a>Poznámky Parametr *pbParamInfo* určuje typy parametrů předaných metodě nebo vlastnosti. Další parametry pro volanou metodu nebo vlastnost lze předat pomocí *parametru va_list.* Obvykle je tato funkce `COleControlSite::InvokeHelper`volána společností . ## <a name="colecontrolsiteisdefaultbutton"></a><a name="isdefaultbutton"></a>COleControlSite::Tlačítko IsDefaultButton Určuje, zda je ovládací prvek výchozím tlačítkem. ``` BOOL IsDefaultButton(); ``` ### <a name="return-value"></a>Návratová hodnota Nenulová, pokud je ovládací prvek výchozím tlačítkem v okně, jinak nula. ## <a name="colecontrolsiteiswindowenabled"></a><a name="iswindowenabled"></a>COleControlSite::IsWindowEnabled COleControlSite::IsWindowEnabled COleControlSite::IsWindowEnabled COle Určuje, zda je povolena lokalita ovládacího prvku. ``` virtual BOOL IsWindowEnabled() const; ``` ### <a name="return-value"></a>Návratová hodnota Nenulová, pokud je ovládací prvek povolen, jinak nula. ### <a name="remarks"></a>Poznámky Hodnota je načtena z vlastnosti Enabled stock ovládacího prvku. ## <a name="colecontrolsitem_biswindowless"></a><a name="m_biswindowless"></a>COleControlSite::m_bIsWindowless Určuje, zda je objekt ovládací prvek bez oken. ``` BOOL m_bIsWindowless; ``` ### <a name="remarks"></a>Poznámky Nenulová, pokud ovládací prvek nemá žádné okno, jinak nula. ## <a name="colecontrolsitem_ctlinfo"></a><a name="m_ctlinfo"></a>COleControlSite::m_ctlInfo Informace o tom, jak je ovládáním zpracováno zadávání montovny klávesnicí. ``` CONTROLINFO m_ctlInfo; ``` ### <a name="remarks"></a>Poznámky Tyto informace jsou uloženy ve struktuře [CONTROLINFO.](/windows/win32/api/ocidl/ns-ocidl-controlinfo) ## <a name="colecontrolsitem_dweventsink"></a><a name="m_dweventsink"></a>COleControlSite::m_dwEventSink Obsahuje soubor cookie spojovacího bodu z jímky událostí ovládacího prvku. ``` DWORD m_dwEventSink; ``` ## <a name="colecontrolsitem_dwmiscstatus"></a><a name="m_dwmiscstatus"></a>COleControlSite::m_dwMiscStatus Obsahuje různé informace o ovládacím prvku. ``` DWORD m_dwMiscStatus; ``` ### <a name="remarks"></a>Poznámky Další informace naleznete v [tématu OLEMISC](/windows/win32/api/oleidl/ne-oleidl-olemisc)v sadě Windows SDK. ## <a name="colecontrolsitem_dwpropnotifysink"></a><a name="m_dwpropnotifysink"></a>COleControlSite::m_dwPropNotifySink Obsahuje soubor cookie [IPropertyNotifySink.](/windows/win32/api/ocidl/nn-ocidl-ipropertynotifysink) ``` DWORD m_dwPropNotifySink; ``` ## <a name="colecontrolsitem_dwstyle"></a><a name="m_dwstyle"></a>COleControlSite::m_dwStyle Obsahuje window styly ovládacího prvku. ``` DWORD m_dwStyle; ``` ## <a name="colecontrolsitem_hwnd"></a><a name="m_hwnd"></a>COleControlSite::m_hWnd Obsahuje HWND ovládacího prvku nebo NULL, pokud je ovládací prvek bez oken. ``` HWND m_hWnd; ``` ## <a name="colecontrolsitem_iidevents"></a><a name="m_iidevents"></a>COleControlSite::m_iidEvents Obsahuje ID rozhraní výchozího rozhraní jímky událostí ovládacího prvku. ``` IID m_iidEvents; ``` ## <a name="colecontrolsitem_nid"></a><a name="m_nid"></a>COleControlSite::m_nID Obsahuje ID dialogového okna ovládacího prvku. 
``` UINT m_nID; ``` ## <a name="colecontrolsitem_pactiveobject"></a><a name="m_pactiveobject"></a>COleControlSite::m_pActiveObject Obsahuje rozhraní [IOleInPlaceActiveObject](/windows/win32/api/oleidl/nn-oleidl-ioleinplaceactiveobject) ovládacího prvku. ``` LPOLEINPLACEACTIVEOBJECT m_pActiveObject; ``` ## <a name="colecontrolsitem_pctrlcont"></a><a name="m_pctrlcont"></a>COleControlSite::m_pCtrlCont Obsahuje kontejner ovládacího prvku (představující formulář). ``` COleControlContainer* m_pCtrlCont; ``` ## <a name="colecontrolsitem_pinplaceobject"></a><a name="m_pinplaceobject"></a>COleControlSite::m_pInPlaceObject `IOleInPlaceObject` Obsahuje rozhraní [IOleInPlaceObject](/windows/win32/api/oleidl/nn-oleidl-ioleinplaceobject) ovládacího prvku. ``` LPOLEINPLACEOBJECT m_pInPlaceObject; ``` ## <a name="colecontrolsitem_pobject"></a><a name="m_pobject"></a>COleControlSite::m_pObject Obsahuje `IOleObjectInterface` rozhraní ovládacího prvku. ``` LPOLEOBJECT m_pObject; ``` ## <a name="colecontrolsitem_pwindowlessobject"></a><a name="m_pwindowlessobject"></a>COleControlSite::m_pWindowlessObject `IOleInPlaceObjectWindowless`Obsahuje [rozhraní IOleInPlaceObjectLess](/windows/win32/api/ocidl/nn-ocidl-ioleinplaceobjectwindowless) ovládacího prvku. ``` IOleInPlaceObjectWindowless* m_pWindowlessObject; ``` ## <a name="colecontrolsitem_pwndctrl"></a><a name="m_pwndctrl"></a>COleControlSite::m_pWndCtrl Obsahuje ukazatel na `CWnd` objekt, který představuje samotný ovládací prvek. ``` CWnd* m_pWndCtrl; ``` ## <a name="colecontrolsitem_rect"></a><a name="m_rect"></a>COleControlSite::m_rect Obsahuje hranice ovládacího prvku vzhledem k okno kontejneru. ``` CRect m_rect; ``` ## <a name="colecontrolsitemodifystyle"></a><a name="modifystyle"></a>COleControlSite::Změnit styl Upraví styly ovládacího prvku. ``` virtual BOOL ModifyStyle( DWORD dwRemove, DWORD dwAdd, UINT nFlags); ``` ### <a name="parameters"></a>Parametry *dwOdstranit*<br/> Styly, které mají být odebrány z aktuálních stylů oken. *dwAdd*<br/> Styly, které mají být přidány z aktuálních stylů oken. *nPříznaky*<br/> Příznaky umístění okna. Seznam možných hodnot naleznete v tématu [SetWindowPos](/windows/win32/api/winuser/nf-winuser-setwindowpos) funkce v sadě Windows SDK. ### <a name="return-value"></a>Návratová hodnota Nenulová, pokud se změní styly, jinak nula. ### <a name="remarks"></a>Poznámky Vlastnost Enabled ovládacího prvku bude upravena tak, aby odpovídala nastavení pro WS_DISABLED. Vlastnost Border Style ovládacího prvku bude upravena tak, aby odpovídala požadovanému nastavení pro WS_BORDER. Všechny ostatní styly jsou použity přímo na popisovač okna ovládacího prvku, pokud je k dispozici. Upraví styly oken ovládacího prvku. Styly, které mají být přidány nebo odebrány, lze kombinovat pomocí bitového operátoru OR ( &#124;). Informace o dostupných stylech oken naleznete v části [CreateWindow](/windows/win32/api/winuser/nf-winuser-createwindoww) v ksouboru Windows SDK. Pokud *nFlags* je `ModifyStyle` nenulová, volá `SetWindowPos`Win32 funkce a překreslí okno kombinací *nFlags* s následujícími čtyřmi příznaky: - SWP_NOSIZE Zachová aktuální velikost. - SWP_NOMOVE Zachová aktuální pozici. - SWP_NOZORDER Zachová aktuální pořadí Z. - SWP_NOACTIVATE Neaktivuje okno. Chcete-li upravit rozšířené styly okna, zavolejte [ModifyStyleEx](#modifystyleex). ## <a name="colecontrolsitemodifystyleex"></a><a name="modifystyleex"></a>COleControlSite::ModifyStyleEx Upraví rozšířené styly ovládacího prvku. 
``` virtual BOOL ModifyStyleEx( DWORD dwRemove, DWORD dwAdd, UINT nFlags); ``` ### <a name="parameters"></a>Parametry *dwOdstranit*<br/> Rozšířené styly, které mají být odebrány z aktuálních stylů oken. *dwAdd*<br/> Rozšířené styly, které mají být přidány z aktuálních stylů oken. *nPříznaky*<br/> Příznaky umístění okna. Seznam možných hodnot naleznete v tématu [SetWindowPos](/windows/win32/api/winuser/nf-winuser-setwindowpos) funkce v sadě Windows SDK. ### <a name="return-value"></a>Návratová hodnota Nenulová, pokud se změní styly, jinak nula. ### <a name="remarks"></a>Poznámky Vlastnost Vzhled ovládacího prvku bude upravena tak, aby odpovídala nastavení pro WS_EX_CLIENTEDGE. Všechny ostatní rozšířené styly okna jsou použity přímo na popisovač okna ovládacího prvku, pokud je k dispozici. Změní rozšířené styly okna objektu řídicího místa. Styly, které mají být přidány nebo odebrány, lze kombinovat pomocí bitového operátoru OR ( &#124;). Informace o dostupných stylech oken naleznete v části [CreateWindowEx](/windows/win32/api/winuser/nf-winuser-createwindowexw) v ksouboru Windows SDK. Pokud *nFlags* je `ModifyStyleEx` nenulová, volá `SetWindowPos`Win32 funkce a překreslí okno kombinací *nFlags* s následujícími čtyřmi příznaky: - SWP_NOSIZE Zachová aktuální velikost. - SWP_NOMOVE Zachová aktuální pozici. - SWP_NOZORDER Zachová aktuální pořadí Z. - SWP_NOACTIVATE Neaktivuje okno. Chcete-li upravit rozšířené styly okna, zavolejte [ModifyStyle](#modifystyle). ## <a name="colecontrolsitemovewindow"></a><a name="movewindow"></a>COleControlSite::MoveWindow Změní polohu ovládacího prvku. ``` virtual void MoveWindow( int x, int y, int nWidth, int nHeight); ``` ### <a name="parameters"></a>Parametry *X*<br/> Nová poloha levé strany okna. *Y*<br/> Nová pozice v horní části okna. *nŠířka*<br/> Nová šířka okna *nVýška*<br/> Nová výška okna. ## <a name="colecontrolsitequickactivate"></a><a name="quickactivate"></a>COleControlSite::QuickActivate Rychle aktivuje uzavřený ovládací prvek. ``` virtual BOOL QuickActivate(); ``` ### <a name="return-value"></a>Návratová hodnota Nenulová, pokud byla aktivována řídicí lokalita, jinak nula. ### <a name="remarks"></a>Poznámky Tato funkce by měla být volána pouze v případě, že uživatel přepíše proces vytváření ovládacího prvku. Metody `IPersist*::Load` `IPersist*::InitNew` a by měly být volány po rychlé aktivaci dojde. Ovládací prvek by měl navázat jeho připojení k jímky kontejneru během rychlé aktivace. Tato připojení však nejsou `IPersist*::Load` `IPersist*::InitNew` aktivní, dokud nebo byla volána. ## <a name="colecontrolsitesafesetproperty"></a><a name="safesetproperty"></a>COleControlSite::SafeSetProperty Nastaví vlastnost ovládacího prvku určenou *dwDispID*. ``` virtual BOOL AFX_CDECL SafeSetProperty( DISPID dwDispID, VARTYPE vtProp, ...); ``` ### <a name="parameters"></a>Parametry *dwDispID*<br/> Identifikuje ID odeslání vlastnosti nebo metody, které se `IDispatch` nacházejí v rozhraní ovládacího prvku, které mají být nastaveny. *vtProp*<br/> Určuje typ vlastnosti, která má být nastavena. Možné hodnoty naleznete v části Poznámky pro [COleDispatchDriver::InvokeHelper](../../mfc/reference/coledispatchdriver-class.md#invokehelper). *...*<br/> Jeden parametr typu určeného *vtProp*. ### <a name="return-value"></a>Návratová hodnota Nenulová, pokud je úspěšná; jinak nula. ### <a name="remarks"></a>Poznámky > [!NOTE] > Na `SetProperty` `SetPropertyV`rozdíl od a , pokud dojde k chybě (například při pokusu o nastavení neexistující vlastnost), není vyvolána žádná výjimka. 
## <a name="colecontrolsitesetdefaultbutton"></a><a name="setdefaultbutton"></a>COleControlSite::SetDefaultButton Nastaví ovládací prvek jako výchozí tlačítko. ```cpp void SetDefaultButton(BOOL bDefault); ``` ### <a name="parameters"></a>Parametry *bVýchozí*<br/> Nenulová, pokud by se ovládací prvek měl stát výchozím tlačítkem; jinak nula. ### <a name="remarks"></a>Poznámky > [!NOTE] > Ovládací prvek musí mít nastaven OLEMISC_ACTSLIKEBUTTON stavový bit. ## <a name="colecontrolsitesetdlgctrlid"></a><a name="setdlgctrlid"></a>COleControlSite::SetDlgCtrlID Změní hodnotu identifikátoru položky dialogového okna ovládacího prvku. ``` virtual int SetDlgCtrlID(int nID); ``` ### <a name="parameters"></a>Parametry *Nid*<br/> Nová hodnota identifikátoru. ### <a name="return-value"></a>Návratová hodnota Pokud je úspěšná, předchozí dialogová položka identifikátor okna; jinak 0. ### <a name="remarks"></a>Poznámky ## <a name="colecontrolsitesetfocus"></a><a name="setfocus"></a>COleControlSite::SetFocus Nastaví fokus na ovládací prvek. ``` virtual CWnd* SetFocus(); virtual CWnd* SetFocus(LPMSG lpmsg); ``` ### <a name="parameters"></a>Parametry *lpmsg*<br/> Ukazatel na [strukturu MSG](/windows/win32/api/winuser/ns-winuser-msg). Tato struktura obsahuje zprávu systému Windows, která spouští `SetFocus` požadavek na ovládací prvek obsažený v aktuální lokalitě ovládacího prvku. ### <a name="return-value"></a>Návratová hodnota Ukazatel na okno, které dříve mělo fokus. ## <a name="colecontrolsitesetproperty"></a><a name="setproperty"></a>COleControlSite::SetProperty Nastaví vlastnost ovládacího prvku určenou *dwDispID*. ``` virtual void AFX_CDECL SetProperty( DISPID dwDispID, VARTYPE vtProp, ...); ``` ### <a name="parameters"></a>Parametry *dwDispID*<br/> Identifikuje ID odeslání vlastnosti nebo metody, které se `IDispatch` nacházejí v rozhraní ovládacího prvku, které mají být nastaveny. *vtProp*<br/> Určuje typ vlastnosti, která má být nastavena. Možné hodnoty naleznete v části Poznámky pro [COleDispatchDriver::InvokeHelper](../../mfc/reference/coledispatchdriver-class.md#invokehelper). *...*<br/> Jeden parametr typu určeného *vtProp*. ### <a name="remarks"></a>Poznámky Pokud `SetProperty` dojde k chybě, je vyvolána výjimka. Typ výjimky je určen vrácenou hodnotou pokusu o nastavení vlastnosti nebo metody. Pokud je `DISP_E_EXCEPTION`vrácená `COleDispatchExcpetion` hodnota , je vyvolána; jinak `COleException`. ## <a name="colecontrolsitesetpropertyv"></a><a name="setpropertyv"></a>COleControlSite::SetPropertyV Nastaví vlastnost ovládacího prvku určenou *dwDispID*. ``` virtual void SetPropertyV( DISPID dwDispID, VARTYPE vtProp, va_list argList); ``` ### <a name="parameters"></a>Parametry *dwDispID*<br/> Identifikuje ID odeslání vlastnosti nebo metody, které se `IDispatch` nacházejí v rozhraní ovládacího prvku, které mají být nastaveny. *vtProp*<br/> Určuje typ vlastnosti, která má být nastavena. Možné hodnoty naleznete v části Poznámky pro [COleDispatchDriver::InvokeHelper](../../mfc/reference/coledispatchdriver-class.md#invokehelper). *seznam arglist*<br/> Ukazatel na seznam argumentů. ### <a name="remarks"></a>Poznámky Další parametry pro volanou metodu nebo vlastnost lze předat pomocí *parametru arg_list.* Pokud `SetProperty` dojde k chybě, je vyvolána výjimka. Typ výjimky je určen vrácenou hodnotou pokusu o nastavení vlastnosti nebo metody. Pokud je `DISP_E_EXCEPTION`vrácená `COleDispatchExcpetion` hodnota , je vyvolána; jinak `COleException`. 
## <a name="colecontrolsitesetwindowpos"></a><a name="setwindowpos"></a>COleControlSite::SetWindowPos Nastaví velikost, umístění a pořadí Z řídicího webu. ``` virtual BOOL SetWindowPos( const CWnd* pWndInsertAfter, int x, int y, int cx, int cy, UINT nFlags); ``` ### <a name="parameters"></a>Parametry *pWndInsertPo*<br/> Ukazatel na okno. *X*<br/> Nová poloha levé strany okna. *Y*<br/> Nová pozice v horní části okna. *Cx*<br/> Nová šířka okna *Cy*<br/> Nová výška okna. *nPříznaky*<br/> Určuje příznaky velikosti a umístění okna. Možné hodnoty naleznete v části Poznámky pro [setwindowpos](/windows/win32/api/winuser/nf-winuser-setwindowpos) v sadě Windows SDK. ### <a name="return-value"></a>Návratová hodnota Nenulová, pokud je úspěšná, jinak nula. ## <a name="colecontrolsitesetwindowtext"></a><a name="setwindowtext"></a>COleControlSite::SetWindowText Nastaví text pro řídicí web. ``` virtual void SetWindowText(LPCTSTR lpszString); ``` ### <a name="parameters"></a>Parametry *lpszString*<br/> Ukazatel na řetězec s nulovým ukončením, který má být použit jako nový text nadpisu nebo ovládacího prvku. ### <a name="remarks"></a>Poznámky Tato funkce se nejprve pokusí nastavit vlastnost Caption stock. Pokud vlastnost Caption stock není podporována, je místo toho nastavena vlastnost Text. ## <a name="colecontrolsiteshowwindow"></a><a name="showwindow"></a>COleControlSite::Zobrazitokno Nastaví stav zobrazení okna. ``` virtual BOOL ShowWindow(int nCmdShow); ``` ### <a name="parameters"></a>Parametry *nCmdZobrazit*<br/> Určuje způsob zobrazení řídicí lokality. Musí se jednat o jednu z následujících hodnot: - SW_HIDE Skryje toto okno a předá aktivaci do jiného okna. - SW_MINIMIZE Minimalizuje okno a aktivuje okno nejvyšší úrovně v seznamu systému. - SW_RESTORE Aktivuje a zobrazí okno. Pokud je okno minimalizováno nebo maximalizováno, systém Windows jej obnoví do původní velikosti a umístění. - SW_SHOW Aktivuje okno a zobrazí ho v aktuální velikosti a poloze. - SW_SHOWMAXIMIZED Aktivuje okno a zobrazí ho jako maximalizované okno. - SW_SHOWMINIMIZED Aktivuje okno a zobrazí ho jako ikonu. - SW_SHOWMINNOACTIVE Zobrazí okno jako ikonu. Okno, které je aktuálně aktivní, zůstane aktivní. - SW_SHOWNA Zobrazí okno v aktuálním stavu. Okno, které je aktuálně aktivní, zůstane aktivní. - SW_SHOWNOACTIVATE Zobrazí okno v nejnovější velikosti a poloze. Okno, které je aktuálně aktivní, zůstane aktivní. - SW_SHOWNORMAL Aktivuje a zobrazí okno. Pokud je okno minimalizováno nebo maximalizováno, systém Windows jej obnoví do původní velikosti a umístění. ### <a name="return-value"></a>Návratová hodnota Nenulová, pokud bylo okno dříve viditelné; 0, pokud bylo okno dříve skryto. ## <a name="see-also"></a>Viz také [CCmdTarget – třída](../../mfc/reference/ccmdtarget-class.md)<br/> [Graf hierarchie](../../mfc/hierarchy-chart.md)<br/> [COleControlContainer – třída](../../mfc/reference/colecontrolcontainer-class.md)
36.935405
803
0.772691
ces_Latn
0.990341
0c2b1c8661559b28acb4febdf8f36865027c7744
25
md
Markdown
README.md
goranjako/node.js-babel-mysql-jwt
f7ce9eee2c5cb23237586cebe049d2b9ef1063df
[ "MIT" ]
null
null
null
README.md
goranjako/node.js-babel-mysql-jwt
f7ce9eee2c5cb23237586cebe049d2b9ef1063df
[ "MIT" ]
null
null
null
README.md
goranjako/node.js-babel-mysql-jwt
f7ce9eee2c5cb23237586cebe049d2b9ef1063df
[ "MIT" ]
null
null
null
# node.js-babel-mysql-jwt
25
25
0.76
zul_Latn
0.479982
0c2b4fb76d89041efbea452c1b64cf0e24bbb645
85
md
Markdown
README.md
Cenfracee/DEP_webPOS_AdminLTE
394d516fbceacfa96ffd514f603b8ab176427147
[ "MIT" ]
null
null
null
README.md
Cenfracee/DEP_webPOS_AdminLTE
394d516fbceacfa96ffd514f603b8ab176427147
[ "MIT" ]
null
null
null
README.md
Cenfracee/DEP_webPOS_AdminLTE
394d516fbceacfa96ffd514f603b8ab176427147
[ "MIT" ]
null
null
null
## DEP Assignment - WEB POS

Based on AdminLTE v3

### Dependencies

- admin-lte 3.0.5
21.25
27
0.694118
eng_Latn
0.450058
0c2bea686c73fb0fbc592fa4b876d76669254149
2,329
md
Markdown
README.md
pierrefourgeaud/node-osx-theme
d27d2733f9f4ad0b6e92f2ef448aedf463e9ad2d
[ "Unlicense", "MIT" ]
1
2017-10-24T05:38:09.000Z
2017-10-24T05:38:09.000Z
README.md
pierrefourgeaud/node-osx-theme
d27d2733f9f4ad0b6e92f2ef448aedf463e9ad2d
[ "Unlicense", "MIT" ]
1
2017-10-24T05:40:56.000Z
2017-10-24T05:40:56.000Z
README.md
pierrefourgeaud/node-osx-theme
d27d2733f9f4ad0b6e92f2ef448aedf463e9ad2d
[ "Unlicense", "MIT" ]
null
null
null
# node-osx-theme

Retrieve information about the theme on OSX through a light and easy to use API. No external binary is executed :)

**Requires OS X 10.10 or later.**

**Manages `System Preferences` → `General` → `Use dark menu bar and Dock`.**

## Install

```sh
$ npm install --save osx-theme
```

## API

### theme.getMode()

Returns `'Light'` or `'Dark'` depending on which theme is activated.

### theme.setMode(mode)

Set the mode you want. Possible values are `'Light'` or `'Dark'`. If you make a typo, `'Light'` will be used.

### theme.toggleMode()

Toggle the theme you are on.

### theme.isDark()

Returns `true` if the theme is dark, `false` otherwise.

### theme.isLight()

Returns `true` if the theme is light, `false` otherwise.

## Example

```js
var theme = require('osx-theme');

console.log(theme.isDark())  // True if the dark mode is activated, false otherwise
console.log(theme.isLight()) // True if the light mode is activated, false otherwise
console.log(theme.getMode()) // 'Light' or 'Dark'

theme.setMode('Light') // or 'Dark'

theme.toggleMode() // easy to understand :)
```

## Support

* Bugs and feature requests: [Github issue tracker](https://github.com/pierrefourgeaud/node-osx-theme/issues?state=open)

## License

The MIT License (MIT)

Copyright (c) 2014 Pierre Fourgeaud

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
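As a follow-on to the Example section above, the documented calls can be combined to react to theme changes — for example, by polling the current mode. The sketch below uses only the API listed in this README; the polling approach and interval are illustrative, not something the module provides:

```js
var theme = require('osx-theme');

var lastMode = theme.getMode();

// Hypothetical watcher: poll every 2 seconds and report when the system theme flips.
setInterval(function () {
  var mode = theme.getMode(); // 'Light' or 'Dark'
  if (mode !== lastMode) {
    console.log('System theme changed to ' + mode);
    lastMode = mode;
    // Update your own UI here, e.g. swap stylesheets.
  }
}, 2000);
```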
33.271429
460
0.742808
eng_Latn
0.802946
0c2c1edab83870e9b8b5d0987e38ba229eb76acd
1,699
md
Markdown
README.md
andrewflbarnes/aflb
b79b9d5f1353b32fd8bc2cc93cf82a39629583fa
[ "MIT" ]
null
null
null
README.md
andrewflbarnes/aflb
b79b9d5f1353b32fd8bc2cc93cf82a39629583fa
[ "MIT" ]
null
null
null
README.md
andrewflbarnes/aflb
b79b9d5f1353b32fd8bc2cc93cf82a39629583fa
[ "MIT" ]
null
null
null
# andrewflbarnes [![Netlify Status](https://api.netlify.com/api/v1/badges/846fd94a-78ee-43c3-ba8a-426e9e0933fd/deploy-status)](https://app.netlify.com/sites/eloquent-bell-bf232f/deploys) A minimal static website designed to host my CV and contact me. The footer contains links to common services (GH, contact me, CV, etc.) ### Motivation I need somewhere to keep track of my CV and this seems like a good enough place to do it (version control, yay), but it also means I can host it online. The downside is it's not easily convertible to PDF/doc formats (maybe I'll change that at some point...). ### Local deployment Run the build script (pretty sure I don't need this anymore!) then open static/index.html in your favourite browser - simples :) ```bash make ``` Available recipes are - build - (default) moves source files to a static folder and replaces placeholders - clean - removes the static folder - loop - runs the build recipe once a second ### Production/Staging deployments Push your branch and Netlify will take care of the rest :) Note that the build agents on Netlify use dash by default. To verify the build will continue to work (in particular the Makefile) ensure dash is installed locally then run `./test.sh`. This will just loop through some common shells and run `make clean build`. ### AWS The contact functionality on the webpage is backed by some services running in AWS. These are - API gateway - hosts the actual API (https://app.swaggerhub.com/apis/andrewflbarnes/andrewflbarnes/1.0.0) - Lambda - a serverless function which integrates with SES to deliver the email (WIP) Some of these services are managed through Terraform, which can be found in the aws folder.
38.613636
169
0.76751
eng_Latn
0.996207
0c2cd1ffff192c5737f19dcfbd06e22a4c6c8c6d
822
md
Markdown
README.md
blueworx/bvr-samples
a3d931a2321dcfc411b9b620bfcd593084abff91
[ "Apache-2.0" ]
3
2018-11-29T10:37:21.000Z
2020-02-04T15:30:19.000Z
README.md
blueworx/bvr-samples
a3d931a2321dcfc411b9b620bfcd593084abff91
[ "Apache-2.0" ]
null
null
null
README.md
blueworx/bvr-samples
a3d931a2321dcfc411b9b620bfcd593084abff91
[ "Apache-2.0" ]
3
2018-12-18T15:16:55.000Z
2021-12-20T21:13:33.000Z
# bvr-samples Repository for BVR sample applications ## Pre-requisites All samples are designed to be used on the Blueworx platform of products including Blueworx Voice Response and Blueworx Resource Manager. Each sample will have specific version requirements / hosting platform requirements (i.e. BVR for AIX or BVR for Linux). Where there are specific requirements, see the sample comments / README.md. ## Using the samples Usage instructions for each sample can either be found in: * A subfolder README.md file * Comment at the top of the sample file ## License This project / repository is licensed under [Apache License Version 2.0](LICENSE) **Note**: All source code files contain individual copyright statements, which, in accordance with the license need to be retained in any derivative works (clause 4c)
41.1
196
0.790754
eng_Latn
0.998323
0c2e3298a8831b3adeba378d2b9327fedc37bba1
9,059
md
Markdown
readme.md
readwritetools/rwt-kanji
90135d71e30589311888228549993072ec693f61
[ "MIT" ]
null
null
null
readme.md
readwritetools/rwt-kanji
90135d71e30589311888228549993072ec693f61
[ "MIT" ]
null
null
null
readme.md
readwritetools/rwt-kanji
90135d71e30589311888228549993072ec693f61
[ "MIT" ]
null
null
null
<figure> <img src='/img/components/kanji/kanji-1500x750.jpg' width='100%' /> <figcaption></figcaption> </figure> ##### Open Source DOM Component # “Kanji” Designer Card ## Discover and reveal text over image <address> <img src='/img/48x48/rwtools.png' /> by <a href='https://readwritetools.com' title='Read Write Tools'>Read Write Tools</a> <time datetime=2020-04-24>Apr 24, 2020</time></address> <table> <tr><th>Abstract</th></tr> <tr><td>The <span class=product>rwt-kanji</span> DOM component displays a designer card which has a horizontal title, a vertically transformed subtitle, text that is formatted as a blockquote, and an image with a mouse-over explanation.</td></tr> </table> ### Motivation The <span>rwt-kanji</span> DOM component is intended for use on web pages where the reader's attention is only lightly engaged. Sometimes a reader is in <q>scan</q> mode, rather than comprehension mode. They are sniffing for the scent of things. When that's the case, this component can be used to present a complex idea as a simple visualization. When a user's interest is piqued, more information can be revealed to the reader by hovering the mouse over the card's prominently displayed image. The extra information fades in, revealing itself on top of the image. This component takes its name from its original use case: describing the inspiration behind the logos used in the desktop apps of Read Write Tools. #### In the wild To see an example of this component in use, visit the <a href='https://readwritetools.com'>READ WRITE TOOLS</a> home page. It uses several instances of this component. To understand what's going on under the hood, use the browser's inspector to view the HTML source code and network activity, and follow along as you read this documentation. ### Installation #### Prerequisites The <span>rwt-kanji</span> DOM component works in any browser that supports modern W3C standards. Templates are written using <span>BLUE</span><span> PHRASE</span> notation, which can be compiled into HTML using the free <a href='https://hub.readwritetools.com/desktop/rwview.blue'>Read Write View</a> desktop app. It has no other prerequisites. Distribution and installation are done with either NPM or via Github. #### Download <details> <summary>Download using NPM</summary> <p><b>OPTION 1:</b> Familiar with Node.js and the <code>package.json</code> file?<br />Great. Install the component with this command:</p> <pre lang=bash> npm install rwt-kanji<br /> </pre> <p><b>OPTION 2:</b> No prior experience using NPM?<br />Just follow these general steps:</p> <ul> <li>Install <a href='https://nodejs.org'>Node.js/NPM</a> on your development computer.</li> <li>Create a <code>package.json</code> file in the root of your web project using the command:</li> <pre lang=bash> npm init<br /> </pre> <li>Download and install the DOM component using the command:</li> <pre lang=bash> npm install rwt-kanji<br /> </pre> </ul> <p style='font-size:0.9em'>Important note: This DOM component uses Node.js and NPM and <code>package.json</code> as a convenient <i>distribution and installation</i> mechanism. 
The DOM component itself does not need them.</p> </details> <details> <summary>Download using Github</summary> <p>If you prefer using Github directly, simply follow these steps:</p> <ul> <li>Create a <code>node_modules</code> directory in the root of your web project.</li> <li>Clone the <span class=product>rwt-kanji</span> DOM component into it using the command:</li> <pre lang=bash> git clone https://github.com/readwritetools/rwt-kanji.git<br /> </pre> </ul> </details> ### Using the DOM component After installation, you need to add two things to your HTML page to make use of it. * Add a `script` tag to load the component's `rwt-kanji.js` file: ```html <script src='/node_modules/rwt-kanji/rwt-kanji.js' type=module></script> ``` * Add the component tag somewhere on the page, supplying five pieces of slotted text: 1. `span slot=main-title` The main title will be display horizontally across the top of the card in a sans-serif font. Maximum length is approximately 25 characters. 2. `span slot=side-title` The side title will be rotated and displayed vertically along the left-hand edge of the card in a sans-serif font. Maximum length is approximately 25 characters. 3. `span slot=slogan` A sentence that will be placed in the top half of the card. It will be captioned with quotation marks. Use <br> to break the sentence into multiple lines if desired. 4. `span slot=image-text` A sentence that will be hidden from the user until the mouse is hovered over the image. 5. `img slot=image src=URL` An image that should be square, ideally about 200 by 200 pixels. Here's an example: ```html <rwt-kanji role=contentinfo> <span slot=main-title>READ WRITE VIEW</span> <span slot=side-title>PLAIN TEXT</span> <span slot=slogan>Plain text reader.<br />Hypertext Markup writer.</span> <span slot=image-text>The Japanese kanji 見 (mi) is used in words meaning seeing, looking at, and viewing. RWVIEW has adopted 見 as its logo.</span> <img slot=image src='https://readwritetools.com/img/embossed/rwview-embossed.png' /> </rwt-kanji> ``` ### Customization #### Designer card size The designer card is square by default. Its size may be overridden using CSS by defining new values for `--width` and `--height`. Adjust the `--font-basis` to shrink or grow the entire card. ```css rwt-kanji { --font-basis: 1.0; --width: calc(22rem * var(--font-basis)); --height: calc(22rem * var(--font-basis)); --sidebar-width: calc(2rem * var(--font-basis)); --title-height: calc(2rem * var(--font-basis)); } ``` #### Dialog color scheme The default color palette for the dialog uses a dark mode theme. You can use CSS to override the variables' defaults: ```css rwt-kanji { --color: var(--white); --accent-color1: var(--pure-white); --background: var(--black); --accent-background1: var(--pure-black); --accent-background2: var(--nav-black); --accent-background3: var(--medium-black); --accent-background4: var(--gray); --accent-background5: rgba(0,0,0,0.4); } ``` ### Life-cycle events The component issues life-cycle events. <dl> <dt><code>component-loaded</code></dt> <dd>Sent when the component is fully loaded and ready to be used. As a convenience you can use the <code>waitOnLoading()</code> method which returns a promise that resolves when the <code>component-loaded</code> event is received. 
Call this asynchronously with <code>await</code>.</dd> </dl> --- ### Reference <table> <tr><td><img src='/img/48x48/read-write-hub.png' alt='DOM components logo' width=48 /></td> <td>Documentation</td> <td><a href='https://hub.readwritetools.com/components/kanji.blue'>READ WRITE HUB</a></td></tr> <tr><td><img src='/img/48x48/git.png' alt='git logo' width=48 /></td> <td>Source code</td> <td><a href='https://github.com/readwritetools/rwt-kanji'>github</a></td></tr> <tr><td><img src='/img/48x48/dom-components.png' alt='DOM components logo' width=48 /></td> <td>Component catalog</td> <td><a href='https://domcomponents.com/components/kanji.blue'>DOM COMPONENTS</a></td></tr> <tr><td><img src='/img/48x48/npm.png' alt='npm logo' width=48 /></td> <td>Package installation</td> <td><a href='https://www.npmjs.com/package/rwt-kanji'>npm</a></td></tr> <tr><td><img src='/img/48x48/read-write-stack.png' alt='Read Write Stack logo' width=48 /></td> <td>Publication venue</td> <td><a href='https://readwritestack.com/components/kanji.blue'>READ WRITE STACK</a></td></tr> </table> ### License The <span>rwt-kanji</span> DOM component is licensed under the MIT License. <img src='/img/blue-seal-mit.png' width=80 align=right /> <details> <summary>MIT License</summary> <p>Copyright © 2020 Read Write Tools.</p> <p>Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:</p> <p>The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.</p> <p>THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.</p> </details>
42.331776
468
0.717739
eng_Latn
0.925191
0c2e81371f5bded98e18c563ae37d28f5fb47229
1,688
md
Markdown
README.md
egison/backtracking
b8dd2f4f7ab98ba55a48c65ce2bfe3628eaf2322
[ "BSD-3-Clause" ]
14
2020-06-18T06:18:43.000Z
2020-07-03T22:37:12.000Z
README.md
egison/backtracking
b8dd2f4f7ab98ba55a48c65ce2bfe3628eaf2322
[ "BSD-3-Clause" ]
null
null
null
README.md
egison/backtracking
b8dd2f4f7ab98ba55a48c65ce2bfe3628eaf2322
[ "BSD-3-Clause" ]
null
null
null
# Backtracking Monad This library provides a backtracking monad following Spivey's paper "Algebras for combinatorial search". ## Getting Started The backtracking monad can be used in a similar way to the list monad. We only need to specify a search strategy (`dfs` or `bfs`) for the initial value and insert `fromList` and `toList` for conversion. ``` take 10 (toList (dfs [1..] >>= \ns -> fromList ns >>= \x -> fromList ns >>= \y -> pure (x, y))) -- [(1,1),(1,2),(1,3),(1,4),(1,5),(1,6),(1,7),(1,8),(1,9),(1,10)] take 10 (toList (bfs [1..] >>= \ns -> fromList ns >>= \x -> fromList ns >>= \y -> pure (x, y))) -- [(1,1),(1,2),(2,1),(1,3),(2,2),(3,1),(1,4),(2,3),(3,2),(4,1)] ``` ## Relationship with Egison Pattern Matching We created this library to implement [Sweet Egison](https://github.com/egison/sweet-egison), a shallow embedding of Egison pattern matching. For example, the match clause `[mc| $x : #(x + 10) : _ -> (x, x + 10) |]` is transformed as follows: ```haskell \ (mat_a5sV, tgt_a5sW) -> let (tmpM_a5sX, tmpM_a5sY) = (consM mat_a5sV) tgt_a5sW in ((fromList (((cons (GP, GP)) mat_a5sV) tgt_a5sW)) >>= (\ (tmpT_a5sZ, tmpT_a5t0) -> let x = tmpT_a5sZ in let (tmpM_a5t1, tmpM_a5t2) = (consM tmpM_a5sY) tmpT_a5t0 in ((fromList (((cons (GP, WC)) tmpM_a5sY) tmpT_a5t0)) >>= (\ (tmpT_a5t3, tmpT_a5t4) -> ((fromList ((((value (x + 10)) ()) tmpM_a5t1) tmpT_a5t3)) >>= (\ () -> pure (x, x + 10))))))) ```
45.621622
140
0.526066
eng_Latn
0.779863
0c2ebce839bf446b18d3976fbe42578e99520b71
2,424
md
Markdown
guide/russian/miscellaneous/getting-started-with-back-end-projects/index.md
SweeneyNew/freeCodeCamp
e24b995d3d6a2829701de7ac2225d72f3a954b40
[ "BSD-3-Clause" ]
10
2019-08-09T19:58:19.000Z
2019-08-11T20:57:44.000Z
guide/russian/miscellaneous/getting-started-with-back-end-projects/index.md
SweeneyNew/freeCodeCamp
e24b995d3d6a2829701de7ac2225d72f3a954b40
[ "BSD-3-Clause" ]
2,056
2019-08-25T19:29:20.000Z
2022-02-13T22:13:01.000Z
guide/russian/miscellaneous/getting-started-with-back-end-projects/index.md
SweeneyNew/freeCodeCamp
e24b995d3d6a2829701de7ac2225d72f3a954b40
[ "BSD-3-Clause" ]
5
2018-10-18T02:02:23.000Z
2020-08-25T00:32:41.000Z
--- title: Getting Started with Back End Projects localeTitle: Начало работы с проектами Back End --- The curriculum around the first Back End project is not exhaustive. Here are some general resources that other campers have found helpful. * Introduction to Yeoman - lots of useful tips and tricks for setting up Yeoman Angular Fullstack * [Angular generator](https://github.com/DaftMonk/generator-angular-fullstack#generators) - the generator used by Yeoman; you can find the syntax and which files it creates ## APIs * API for stock market charts: [https://www.quandl.com/help/api](https://www.quandl.com/help/api) ## MEAN Stack tutorials and videos * 5-part series on setting up the MEAN stack [https://www.youtube.com/watch?v=kHV7gOHvNdk](https://www.youtube.com/watch?v=kHV7gOHvNdk) * A MEAN tutorial that builds a simple Twitter clone [https://channel9.msdn.com/Series/MEAN-Stack-Jump-Start](https://channel9.msdn.com/Series/MEAN-Stack-Jump-Start) * Clementine is a stripped-down MEAN stack, great for learning the basics. [https://johnstonbl01.github.io/clementinejs/tutorials/tutorial-beginner.html](https://johnstonbl01.github.io/clementinejs/tutorials/tutorial-beginner.html) * Authentication with Passport for the MEAN stack: [https://vickev.com/#!/article/authentication-in-single-page-applications-node-js-passportjs-angularjs](https://vickev.com/#!/article/authentication-in-single-page-applications-node-js-passportjs-angularjs) * An awesome list of resources for learning the MEAN stack: [https://github.com/ericdouglas/MEAN-Learning](https://github.com/ericdouglas/MEAN-Learning) ## Scotch IO tutorials * [https://scotch.io/tutorials/setting-up-a-mean-stack-single-page-application](https://scotch.io/tutorials/setting-up-a-mean-stack-single-page-application) * [https://scotch.io/tutorials/node-and-angular-to-do-app-application-organization-and-structure](https://scotch.io/tutorials/node-and-angular-to-do-app-application-organization-and-structure) ## Node / Express * [Debugging Node.js / Express online](http://stackoverflow.com/a/16512303/1420506) ## Cloud 9 tricks ### Speed up browser reload 1. Open the gruntfile.js file and edit both instances of `livereload: true` to `livereload: false` . 2. Open server/config/express.js and comment out the line `app.use(require('connect-livereload')());`
59.121951
210
0.771452
rus_Cyrl
0.330961
0c2f631d6df81b121a13c8bc46e489cd660b1915
7,165
md
Markdown
doc/dsa.md
TinkerBoard-Android/external-wycheproof
1b808efc7c903b0eaae0da8f6788f55513e4e339
[ "Apache-2.0" ]
1
2022-02-22T22:59:30.000Z
2022-02-22T22:59:30.000Z
doc/dsa.md
TinkerBoard-Android/external-wycheproof
1b808efc7c903b0eaae0da8f6788f55513e4e339
[ "Apache-2.0" ]
null
null
null
doc/dsa.md
TinkerBoard-Android/external-wycheproof
1b808efc7c903b0eaae0da8f6788f55513e4e339
[ "Apache-2.0" ]
null
null
null
# DSA [TOC] The digital signature algorithm (DSA) is one of three signature schemes described in the digital signature standard [DSS]. ## Key generation 4.2 Selection of Parameter Sizes and Hash Functions for DSA The DSS specifies the following choices for the pair (L,N), where L is the size of p in bits and N is the size of q in bits: L | N ---:|----: 1024| 160 2048| 224 2048| 256 3072| 256 The tests expect the following properties of the parameters used during key generation: * If only the parameter L is specified by the caller then N should be one of the options proposed in [DSS]. * If no size is specified then L should be at least 2048. This is the minimal key size recommended by NIST for the period up to the year 2030. ## Signature generation The DSA signature algorithm requires that each signature is computed with a new one-time secret k. This secret value should be close to uniformly distributed. If that is not the case then DSA signatures can leak the private key that was used to generate the signature. Two methods for generating the one-time secrets are described in FIPS PUB 186-4, Section B.5.1 or B.5.2 [DSS]. There is also the possibility that the use of mismatched implementations for key generation and signature generation is leaking the private keys. ## Signature verification A DSA signature is a DER encoded tuple of two integers (r,s). To verify a signature the verifier first checks $$0 < r < q$$ and $$0 < s < q$$. The verifier then computes: $$ \begin{array}{l} w=s^{-1} \bmod q\\ u1 = w \cdot H(m) \bmod q\\ u2 = w \cdot r \bmod q\\ \end{array} $$ and then verifies that \\(r = (g^{u1}y^{u2} \bmod p) \bmod q\\) ## Incorrect computations and range checks. Some libraries return 0 as the modular inverse of 0 or q. This can happen if the library computes the modular inverse of s as \\(w=s^{q-2} \mod q\\) (gpg4browsers) or simply if the implementation is buggy (pycrypto). If, in addition to such a bug, the range of r and s is not tested or is tested incorrectly, then it might be feasible to forge signatures with the values (r=1, s=0) or (r=1, s=q). In particular, if a library can be forced to compute \\(s^{-1} \mod q = 0\\) then the verification would compute \\( w = u1 = u2 = 0 \\) and hence \\( (g^{u1}y^{u2} \mod p) \mod q = 1 .\\) ## Timing attacks TBD # Some notable failures of crypto libraries. ## JDK The jdk8 implementation of SHA1withDSA previously checked the key size as follows: ```java @Override protected void checkKey(DSAParams params) throws InvalidKeyException { int valueL = params.getP().bitLength(); if (valueL > 1024) { throw new InvalidKeyException("Key is too long for this algorithm"); } } ``` This check was reasonable: it partially ensures conformance with the NIST standard and in most cases would prevent the attack described above. However, Oracle released a patch that removed the length verification in DSA in jdk9: http://hg.openjdk.java.net/jdk9/dev/jdk/rev/edd7a67585a5 https://bugs.openjdk.java.net/browse/JDK-8039921 The new code is here: http://hg.openjdk.java.net/jdk9/dev/jdk/file/edd7a67585a5/src/java.base/share/classes/sun/security/provider/DSA.java The change was further backported to jdk8: http://hg.openjdk.java.net/jdk8u/jdk8u/jdk/rev/3212f1631643 Doing this was a serious mistake. It easily allowed incorrect implementations. While generating 2048 bit DSA keys in jdk7 was not yet supported, doing so in jdk8 is. To trigger this bug in jdk7 an application had to use a key generated by a third party library (e.g. OpenSSL).
Now, it is possible to trigger the bug just using JCE. Moreover, the excessive use of default values in JCE makes it easy to go wrong and rather difficult to spot the errors. The bug was, for example, triggered by the following code snippet: ```java KeyPairGenerator keygen = KeyPairGenerator.getInstance("DSA"); keygen.initialize(2048); KeyPair keypair = keygen.genKeyPair(); Signature s = Signature.getInstance("DSA"); s.initSign(keypair.getPrivate()); ``` The first three lines generate a 2048 bit DSA key. 2048 bits is currently the smallest key size recommended by NIST. ```java KeyPairGenerator keygen = KeyPairGenerator.getInstance("DSA"); keygen.initialize(2048); KeyPair keypair = keygen.genKeyPair(); ``` The key size specifies the size of p but not the size of q. The NIST standard allows either 224 or 256 bits for the size of q. The selection typically depends on the library. The Sun provider uses 224. Other libraries, e.g. OpenSSL, generate a 256 bit q by default for 2048 bit DSA keys. The next line relies on a default in the initialization: ```java    Signature s = Signature.getInstance("DSA"); ``` This line is equivalent to ```java    Signature s = Signature.getInstance("SHA1withDSA"); ``` Hence the code above uses SHA1 but with DSA parameters generated for SHA-224 or SHA-256 hashes. Allowing this combination by itself is already a mistake, but a flawed implementation made the situation even worse. The implementation of SHA1withDSA assumed that the parameter q is 160 bits long and used this assumption to generate a random 160-bit k when generating a signature instead of choosing it uniformly in the range (1,q-1). Hence, k was severely biased. Attacks against DSA with biased k are well known. Howgrave-Graham and Smart analyzed such a situation [HS99]. Their results show that about 4 signatures leak enough information to determine the private key in a few milliseconds. Nguyen analyzed a similar flaw in GPG [N04]. I.e., Section 3.2 of Nguyen's paper describes essentially the same attack as used here. More generally, attacks based on lattice reduction were developed to break a variety of cryptosystems such as the knapsack cryptosystem [O90]. ## Further notes The short algorithm name “DSA” is misleading, since it hides the fact that `Signature.getInstance(“DSA”)` is equivalent to `Signature.getInstance(“SHA1withDSA”)`. To reduce the chance of a misunderstanding, short algorithm names should be deprecated. In JCE the hash algorithm is defined by the algorithm name. I.e. depending on the hash algorithm to use, one would call one of: ```java Signature.getInstance(“SHA1withDSA”); Signature.getInstance(“SHA224withDSA”); Signature.getInstance(“SHA256withDSA”); ``` A possible way to push such a change is through code analysis tools. "DSA" is in good company with other algorithm names “RSA”, “AES”, “DES”, all of which default to weak algorithms. ## References [HS99]: N.A. Howgrave-Graham, N.P. Smart, “Lattice Attacks on Digital Signature Schemes” http://www.hpl.hp.com/techreports/1999/HPL-1999-90.pdf [N04]: Phong Nguyen, “Can we trust cryptographic software? Cryptographic flaws in Gnu privacy guard 1.2.3”, Eurocrypt 2004, https://www.iacr.org/archive/eurocrypt2004/30270550/ProcEC04.pdf [O90]: A. M. Odlyzko, "The rise and fall of knapsack cryptosystems", Cryptology and Computational Number Theory, pp.75-88, 1990 [DSS]: FIPS PUB 186-4, "Digital Signature Standard (DSS)", National Institute of Standards and Technology, July 2013 http://nvlpubs.nist.gov/nistpubs/FIPS/NIST.FIPS.186-4.pdf
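To make the verification equations and the zero-inverse forgery discussed above concrete, here is a minimal Python sketch (not part of the Wycheproof test suite; the toy group, private key, and hash value are hypothetical and far too small for real use):

```python
# Minimal sketch: textbook DSA verification plus the (r=1, s=0) forgery that
# slips through when a library returns 0 as the "inverse" of 0 and the
# 0 < r,s < q range checks are skipped. Toy parameters only.

def broken_modinv(s, q):
    # Mimics implementations that compute s^(q-2) mod q (Fermat) without
    # rejecting s == 0: the "inverse" of 0 comes out as 0.
    return pow(s, q - 2, q)

def verify(p, q, g, y, h, r, s, check_range=True):
    if check_range and not (0 < r < q and 0 < s < q):
        return False
    w = broken_modinv(s, q)
    u1 = (w * h) % q            # u1 = w * H(m) mod q
    u2 = (w * r) % q            # u2 = w * r mod q
    return r == (pow(g, u1, p) * pow(y, u2, p)) % p % q

# Toy group: q = 11 divides p - 1 = 22, and g = 2 has order 11 mod 23.
p, q, g = 23, 11, 2
x = 7                # hypothetical private key
y = pow(g, x, p)     # public key
h = 5                # "hash" of some message, already reduced mod q

# Forged pair (r=1, s=0): rejected with range checks, accepted without them.
print(verify(p, q, g, y, h, r=1, s=0))                     # False
print(verify(p, q, g, y, h, r=1, s=0, check_range=False))  # True
```

With the range checks in place the forged pair is rejected; without them the broken inverse yields w = u1 = u2 = 0, so the check degenerates to r == 1, which succeeds for any message and any public key.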
37.124352
116
0.751291
eng_Latn
0.988834
0c2fcbbc5919703d78504f5eab0968522d1cde86
12,394
md
Markdown
docs/release-notes/NuGet-2.8.md
NuGet/docs.microsoft.com-nuget.es-es
da58ee2206d7d12f4f16ead5b56d8f096a8384c0
[ "MIT" ]
3
2017-08-28T06:05:17.000Z
2019-03-12T02:16:58.000Z
docs/release-notes/NuGet-2.8.md
NuGet/docs.microsoft.com-nuget.es-es
da58ee2206d7d12f4f16ead5b56d8f096a8384c0
[ "MIT" ]
11
2018-01-16T09:14:42.000Z
2021-01-22T04:33:51.000Z
docs/release-notes/NuGet-2.8.md
NuGet/docs.microsoft.com-nuget.es-es
da58ee2206d7d12f4f16ead5b56d8f096a8384c0
[ "MIT" ]
12
2017-09-04T15:20:38.000Z
2021-11-26T17:51:30.000Z
--- title: NuGet 2.8 Release Notes description: Release notes for NuGet 2.8, including known issues, bug fixes, added features, and DCRs. author: JonDouglas ms.author: jodou ms.date: 11/11/2016 ms.topic: conceptual ms.openlocfilehash: cb77cf0f049b5b3cfe1039d83ab58e33457674bf ms.sourcegitcommit: ee6c3f203648a5561c809db54ebeb1d0f0598b68 ms.translationtype: MT ms.contentlocale: es-ES ms.lasthandoff: 01/26/2021 ms.locfileid: "98776712" --- # <a name="nuget-28-release-notes"></a>NuGet 2.8 Release Notes [NuGet 2.7.2 Release Notes](../release-notes/nuget-2.7.2.md) | [NuGet 2.8.1 Release Notes](../release-notes/nuget-2.8.1.md) NuGet 2.8 was released on January 29, 2014. ## <a name="acknowledgements"></a>Acknowledgements 1. [Llewellyn Pritchard](https://www.codeplex.com/site/users/view/leppie) ( [@leppie](https://twitter.com/leppie) ) - [#3466](https://nuget.codeplex.com/workitem/3466) : When packing packages, verify the ID of dependency packages. 2. [Maarten Balliauw](https://www.codeplex.com/site/users/view/maartenba) ( [@maartenballiauw](https://twitter.com/maartenballiauw) ) - [#2379](https://nuget.codeplex.com/workitem/2379) : Remove the $metadata suffix when persisting feed credentials. 3. [Filip de Vos](https://www.codeplex.com/site/users/view/FilipDeVos) ( [@foxtricks](https://twitter.com/foxtricks) ) - [#3538](http://nuget.codeplex.com/workitem/3538) : Support specifying the project file for the nuget.exe update command. 4. [Juan Gonzalez](https://www.codeplex.com/site/users/view/jjgonzalez) - [#3536](http://nuget.codeplex.com/workitem/3536) : Replacement tokens are not passed with -IncludeReferencedProjects. 5. [David Poole](https://www.codeplex.com/site/users/view/Sarkie) ( [@Sarkie_Dave](https://twitter.com/Sarkie_Dave) ) - [#3677](http://nuget.codeplex.com/workitem/3677) : Fix NuGet push throwing an OutOfMemoryException when pushing a large package. 6. [Wouter Ouwens](https://www.codeplex.com/site/users/view/Despotes) - [#3666](http://nuget.codeplex.com/workitem/3666) : Fix the incorrect target path when a project references another CLI or C++ project. 7. [Adam Ralph](http://www.codeplex.com/site/users/view/adamralph) ( [@adamralph](https://twitter.com/adamralph) ) - [#3639](https://nuget.codeplex.com/workitem/3639) : Allow packages to be installed as development dependencies by default 8. [David Fowler](https://www.codeplex.com/site/users/view/dfowler) ( [@davidfowl](https://twitter.com/davidfowl) ) - [#3717](https://nuget.codeplex.com/workitem/3717) : Remove implicit upgrades to the latest patch version 9. [Gregory Vandenbrouck](https://www.codeplex.com/site/users/view/vdbg) - Various bug fixes and improvements for NuGet.Server, the nuget.exe mirror command, and others. - This work was carried out over several months, with Gregory working with us on the right timing to integrate it into the 2.8 main branch. ## <a name="patch-resolution-for-dependencies"></a>Patch Resolution for Dependencies When resolving package dependencies, NuGet has historically implemented a strategy of selecting the lowest major and minor package version that satisfies the package's dependencies. Unlike the major and minor version, however, the patch version was always resolved to the highest version.
Although the behavior was well intentioned, it created a lack of determinism when installing packages with dependencies. Consider the following example: ``` [email protected] -[ >=1.0.0 ]-> [email protected] Developer1 installs [email protected]: installed [email protected] and [email protected] [email protected] is published Developer2 installs [email protected]: installed [email protected] and [email protected] ``` In this example, even though Developer1 and Developer2 both installed [email protected], each ended up with a different version of PackageB. NuGet 2.8 changes this default behavior so that the dependency resolution behavior for patch versions is consistent with the behavior for major and minor versions. In the example above, [email protected] would be installed as a result of installing [email protected], regardless of the newer patch version. ## <a name="-dependencyversion-switch"></a>-DependencyVersion switch Although NuGet 2.8 changes the _default_ behavior for resolving dependencies, it also adds more precise control over the dependency resolution process through the -DependencyVersion switch in the Package Manager Console. The switch allows dependencies to be resolved to the lowest possible version (default behavior), the highest possible version, or the highest minor or patch version. This switch only works for the Install-Package PowerShell command. ![DependencyVersion switch](./media/NuGet-2.8/dependencyversion.png) ## <a name="dependencyversion-attribute"></a>DependencyVersion attribute In addition to the -DependencyVersion switch described above, NuGet also allows setting a new attribute in the NuGet.Config file that defines the default value when the -DependencyVersion switch is not specified in an Install-Package invocation. This value is also honored by the NuGet Package Manager dialog for any package install operation. To set this value, add the following attribute to your NuGet.Config file: ```xml <config> <add key="dependencyversion" value="Highest" /> </config> ``` ## <a name="preview-nuget-operations-with--whatif"></a>Preview NuGet operations with -WhatIf Some NuGet packages can have deep dependency graphs, so it can be helpful during an install, uninstall, or update operation to first see what will happen. NuGet 2.8 adds the standard PowerShell -WhatIf switch to the Install-Package, Uninstall-Package, and Update-Package commands, making it possible to visualize the entire closure of packages the command will be applied to. For example, running `install-package Microsoft.AspNet.WebApi -whatif` in an empty ASP.NET web application produces the following. ``` PM> install-package Microsoft.AspNet.WebApi -whatif Attempting to resolve dependency 'Microsoft.AspNet.WebApi.WebHost (≥ 5.0.0)'. Attempting to resolve dependency 'Microsoft.AspNet.WebApi.Core (≥ 5.0.0)'. Attempting to resolve dependency 'Microsoft.AspNet.WebApi.Client (≥ 5.0.0)'. Attempting to resolve dependency 'Newtonsoft.Json (≥ 4.5.11)'. 
Install Newtonsoft.Json 4.5.11 Install Microsoft.AspNet.WebApi.Client 5.0.0 Install Microsoft.AspNet.WebApi.Core 5.0.0 Install Microsoft.AspNet.WebApi.WebHost 5.0.0 Install Microsoft.AspNet.WebApi 5.0.0 ``` ## <a name="downgrade-package"></a>Downgrade package It is not uncommon to install a prerelease version of a package in order to investigate new features and then decide to roll back to the latest stable version. Prior to NuGet 2.8, this was a multi-step process of uninstalling the prerelease package and its dependencies and then installing the earlier version. With NuGet 2.8, however, Update-Package will now roll back the entire package closure (e.g. the package's dependency tree) to the earlier version. ## <a name="development-dependencies"></a>Development dependencies Many different kinds of functionality can be delivered as NuGet packages, including tools used to optimize the development process. These components, while they can be instrumental in developing a new package, should not be considered a dependency of the new package when it is later published. NuGet 2.8 allows a package to identify itself in the `.nuspec` file as a developmentDependency. When installed, this metadata is also added to the `packages.config` file of the project the package was installed into. When that `packages.config` file is later analyzed for NuGet dependencies during `nuget.exe pack`, dependencies marked as development dependencies are excluded. ## <a name="individual-packagesconfig-files-for-different-platforms"></a>Individual packages.config files for different platforms When developing applications for multiple target platforms, it is common to have different project files for each of the respective build environments. It is also common to consume different NuGet packages in different project files, since packages have varying levels of support for different platforms. NuGet 2.8 provides improved support for this scenario by creating different `packages.config` files for different platform-specific project files. ![Multiple package.config files](./media/NuGet-2.8/multiple-packageconfigs.png) ## <a name="fallback-to-local-cache"></a>Fallback to local cache Although NuGet packages are typically consumed from a remote gallery such as [the NuGet gallery](http://www.nuget.org/) over a network connection, there are many scenarios where the client is not connected. Without a network connection, the NuGet client could not successfully install packages, even when those packages were already on the client's machine in the local NuGet cache. NuGet 2.8 adds automatic cache fallback to the Package Manager Console. For example, when disconnecting the network adapter and installing jQuery, the console shows the following: ``` PM> Install-Package jquery The source at nuget.org [https://www.nuget.org/api/v2/] is unreachable. Falling back to NuGet Local Cache at C:\Users\me\AppData\Local\NuGet\Cache Installing 'jQuery 2.0.3'. Successfully installed 'jQuery 2.0.3'. Adding 'jQuery 2.0.3' to WebApplication18. Successfully added 'jQuery 2.0.3' to WebApplication18. ``` The cache fallback feature requires no specific command arguments. 
In addition, cache fallback currently works only in the Package Manager Console: the behavior does not currently work in the Package Manager dialog. ## <a name="webmatrix-nuget-client-updates"></a>WebMatrix NuGet client updates Along with NuGet 2.8, the NuGet extension for WebMatrix was also updated to include many of the major features delivered with [NuGet 2.5](../release-notes/nuget-2.5.md). New capabilities include 'Update All', 'Minimum NuGet Version', and allowing content files to be overwritten. To update the NuGet Package Manager extension in WebMatrix 3: 1. Open WebMatrix 3 1. Click the Extensions icon in the ribbon. 1. Select the Updates tab. 1. Click to update the NuGet Package Manager to 2.5.0 1. Close and restart WebMatrix 3 This is the NuGet team's first release of the NuGet Package Manager extension for WebMatrix. Microsoft recently contributed the code to the open-source NuGet project. Previously, the NuGet integration was built into WebMatrix and could not be updated out of band from WebMatrix. We now have the ability to update it along with the rest of the NuGet client tools. ## <a name="bug-fixes"></a>Bug fixes One of the most significant bug fixes made was a performance improvement in the Update-Package -Reinstall command. In addition to these features and the performance fix mentioned above, this release of NuGet also includes many other bug fixes. A total of 181 issues were addressed in the release. For a full list of the work items fixed in NuGet 2.8, see the [NuGet issue tracker for this release](https://nuget.codeplex.com/workitem/list/advanced?release=NuGet%202.8&status=all).
88.528571
757
0.794255
spa_Latn
0.973549
0c2fe8b945746b5e73d7c897310258f3b07e26ff
1,587
md
Markdown
content/telegraf/v1.6/concepts/_index.md
muszbek/docs.influxdata.com
5689ba6dce8f223d49771bc7663bb8fcf47abc7a
[ "MIT" ]
254
2016-02-04T10:48:28.000Z
2020-08-17T03:35:57.000Z
content/telegraf/v1.6/concepts/_index.md
muszbek/docs.influxdata.com
5689ba6dce8f223d49771bc7663bb8fcf47abc7a
[ "MIT" ]
1,979
2016-01-04T23:55:01.000Z
2020-09-28T17:12:18.000Z
content/telegraf/v1.6/concepts/_index.md
muszbek/docs.influxdata.com
5689ba6dce8f223d49771bc7663bb8fcf47abc7a
[ "MIT" ]
430
2016-01-08T10:33:09.000Z
2020-08-29T05:59:30.000Z
--- title: Key Telegraf concepts description: This section discusses key concepts about Telegraf, including information on supported input data formats, output data formats, aggregator and processor plugins, and includes a glossary of important terms. menu: telegraf_1_6: name: Concepts weight: 30 --- This section discusses key concepts about Telegraf, the plug-in driven server agent component of the InfluxData time series platform. Topics covered include supported input data formats, output data formats, aggregator and processor plugins, and a glossary of important terms. ## [Telegraf input data formats](/telegraf/v1.6/concepts/data_formats_input/) [Telegraf input data formats](/telegraf/v1.6/concepts/data_formats_input/) supports parsing input data formats into metrics for InfluxDB Line Protocol, JSON, Graphite, Value, Nagios, Collectd, and Dropwizard. ## [Telegraf output data formats](/telegraf/v1.6/concepts/data_formats_output/) [Telegraf output data formats](/telegraf/v1.6/concepts/data_formats_output/) can serialize metrics into output data formats for InfluxDB Line Protocol, JSON, and Graphite. ## [Telegraf aggregator and processor plugins](/telegraf/v1.6/concepts/aggregator_processor_plugins/) [Telegraf aggregator and processor plugins](/telegraf/v1.6/concepts/aggregator_processor_plugins/) work between the input plugins and output plugins to aggregate and process metrics in Telegraf. ## [Glossary of terms (for Telegraf)](/telegraf/v1.6/concepts/glossary/) This section includes definitions of important terms for related to Telegraf.
61.038462
276
0.804663
eng_Latn
0.755504
0c3031db338a356cae42f885f0d40e5fa2e8279c
16,618
md
Markdown
vendor/jackiedo/dotenv-editor/README.md
ergauravarora/ezfolio
b2f750230e076680d5dc1da1f2f68fb22ea28fb3
[ "MIT" ]
170
2017-05-14T09:33:06.000Z
2022-03-26T04:30:57.000Z
vendor/jackiedo/dotenv-editor/README.md
ergauravarora/ezfolio
b2f750230e076680d5dc1da1f2f68fb22ea28fb3
[ "MIT" ]
25
2017-05-12T04:57:47.000Z
2022-03-09T19:06:56.000Z
vendor/jackiedo/dotenv-editor/README.md
ergauravarora/ezfolio
b2f750230e076680d5dc1da1f2f68fb22ea28fb3
[ "MIT" ]
60
2017-05-15T09:29:39.000Z
2022-02-21T12:06:19.000Z
# Laravel Dotenv Editor ![laravel-dotenv-editor](https://cloud.githubusercontent.com/assets/9862115/25982836/029612b2-370a-11e7-82c5-d9146dc914a1.png) [![Latest Stable Version](https://poser.pugx.org/jackiedo/dotenv-editor/v/stable)](https://packagist.org/packages/jackiedo/dotenv-editor) [![Total Downloads](https://poser.pugx.org/jackiedo/dotenv-editor/downloads)](https://packagist.org/packages/jackiedo/dotenv-editor) [![Latest Unstable Version](https://poser.pugx.org/jackiedo/dotenv-editor/v/unstable)](https://packagist.org/packages/jackiedo/dotenv-editor) [![License](https://poser.pugx.org/jackiedo/dotenv-editor/license)](https://packagist.org/packages/jackiedo/dotenv-editor) Laravel Dotenv Editor is the .env file editor (or files with same structure and syntax) for Laravel 5.8+. Now you can easily edit .env files with the following features: - Read raw content of file. - Read lines of file content. - Read setters (key-value-pair) of file content. - Determine one key name of existing setter. - Append empty lines to file. - Append comment lines to file. - Append new or update exists setter lines to file. - Delete existing setter line in file. - Backup and restore file. - Manage backup files. # Versions and compatibility Laravel Dotenv Editor is compatible with Laravel 5+ and above. Since the release of `1.2.0` onwards, this package only supports Laravel 5.8 and later. Previous versions of Laravel will no longer be supported. # Note for the release `1.2.0` and later Starting with the release `1.2.0`, the .gitignore file in the folder containing the backup file will no longer be created automatically. Developers will have to create this file manually if deemed necessary. # Documentation Look at one of the following topics to learn more about Laravel Dotenv Editor: - [Installation](#installation) - [Configuration](#configuration) - [Auto backup mode](#auto-backup-mode) - [Backup location](#backup-location) - [Always create backup folder](#always-create-backup-folder) - [Usage](#usage) - [Working with facade](#working-with-facade) - [Using dependency injection](#using-dependency-injection) - [Loading file for working](#loading-file-for-working) - [Reading file content](#reading-file-content) - [Writing content into file](#writing-content-into-file) - [Backing up and restoring file](#backing-up-and-restoring-file) - [Method chaining](#method-chaining) - [Working with Artisan CLI](#working-with-artisan-cli) - [Exceptions](#exceptions) ## Installation You can install this package through [Composer](https://getcomposer.org) with the following steps: #### Step 1 - Require package At the root of your application directory, run the following command (in any terminal client): ```shell $ composer require jackiedo/dotenv-editor ``` **Note:** Since Laravel 5.5, [service providers and aliases are automatically registered](https://laravel.com/docs/5.5/packages#package-discovery), so you can safely skip the following two steps: #### Step 2 - Register service provider Open `config/app.php`, and add a new line to the providers section: ```php Jackiedo\DotenvEditor\DotenvEditorServiceProvider::class, ``` #### Step 3 - Register facade Add the following line to the aliases section in file `config/app.php`: ```php 'DotenvEditor' => Jackiedo\DotenvEditor\Facades\DotenvEditor::class, ``` ## Configuration To start using the package, you should publish the configuration file so that you can configure the package as needed. 
To do that, run the following command (in any terminal client) at the root of your application: ```shell $ php artisan vendor:publish --provider="Jackiedo\DotenvEditor\DotenvEditorServiceProvider" --tag="config" ``` This will create a `config/dotenv-editor.php` file in your app that you can modify to set your configuration. Also, make sure you check for changes to the original config file in this package between releases. Currently there are the following settings: #### Auto backup mode The `autoBackup` setting allows your original file to be backed up automatically before saving. Set it to `true` to agree. #### Backup location The `backupPath` setting is used to specify where your file is backed up. This value is a sub path (sub-folder) from the root folder of the project application. #### Always create backup folder The `alwaysCreateBackupFolder` setting is used to request that the backup folder always be created, whether or not the backup is performed. ## Usage ### Working with facade Laravel Dotenv Editor has a facade with the name `Jackiedo\DotenvEditor\Facades\DotenvEditor`. You can perform all operations through this facade. **Example:** ```php <?php namespace Your\Namespace; // ... use Jackiedo\DotenvEditor\Facades\DotenvEditor; class YourClass { public function yourMethod() { DotenvEditor::doSomething(); } } ``` ### Using dependency injection This package also supports dependency injection. You can easily inject an instance of the `Jackiedo\DotenvEditor\DotenvEditor` class into your controller or other classes. **Example:** ```php <?php namespace App\Http\Controllers; // ... use Jackiedo\DotenvEditor\DotenvEditor; class TestDotenvEditorController extends Controller { protected $editor; public function __construct(DotenvEditor $editor) { $this->editor = $editor; } public function doSomething() { $editor = $this->editor->doSomething(); } } ``` ### Loading file for working By default, Laravel Dotenv Editor will load the `.env` file in the root of your project. Example: ```php $content = DotenvEditor::getContent(); // Get raw content of file .env in root folder ``` However, if you want to explicitly specify the files you are going to work with, you should use the `load()` method. **Method syntax:** ```php /** * Load file for working * * @param string|null $filePath The file path * @param boolean $restoreIfNotFound Restore this file from other file if it's not found * @param string|null $restorePath The file path you want to restore from * * @return DotenvEditor */ public function load($filePath = null, $restoreIfNotFound = false, $restorePath = null); ``` **Example:** ```php // Working with file .env in root folder $file = DotenvEditor::load(); // Working with file .env.example in root folder $file = DotenvEditor::load('.env.example'); // Working with file .env.backup in folder storage/dotenv-editor/backups/ $file = DotenvEditor::load(storage_path('dotenv-editor/backups/.env.backup')); ``` **Note:** The `load()` method has three parameters: - **`$filePath`**: The path to the file you want to work with. Set `null` to work with the file `.env` in the root folder. - **`$restoreIfNotFound`**: Allows to restore your file if it is not found. - **`$restorePath`**: The path to the file used to restoring. Set `null` to restore from an older backup file. ### Reading file content #### Reading raw content. 
**Method syntax:** ```php /** * Get raw content of file * * @return string */ public function getContent(); ``` **Example:** ```php $rawContent = DotenvEditor::getContent(); ``` #### Reading content by lines. **Method syntax:** ```php /** * Get all lines from file * * @return array */ public function getLines(); ``` **Example:** ```php $lines = DotenvEditor::getLines(); ``` **Note:** This will return an array. Each element in the array consists of the following items: - Number of the line. - Raw content of the line. - Parsed content of the line, including: type of line (empty, comment, setter...), key name of setter, value of setter, comment of setter... #### Reading content by keys **Method syntax:** ```php /** * Get all or exists given keys in file content * * @param array $keys * * @return array */ public function getKeys($keys = []); ``` **Example:** ```php // Get all keys $keys = DotenvEditor::getKeys(); // Only get two given keys if exists $keys = DotenvEditor::getKeys(['APP_DEBUG', 'APP_URL']); ``` **Note:** This will return an array. Each element in the array consists of the following items: - Number of the line. - Key name of the setter. - Value of the setter. - Comment of the setter. - If this key is used for the "export" command or not. #### Determine if a key exists **Method syntax:** ```php /** * Check, if a given key is exists in the file content * * @param string $keys * * @return bool */ public function keyExists($key); ``` **Example:** ```php $keyExists = DotenvEditor::keyExists('APP_URL'); ``` #### Get value of a key **Method syntax:** ```php /** * Return the value matching to a given key in the file content * * @param $key * * @throws \Jackiedo\DotenvEditor\Exceptions\KeyNotFoundException * * @return string */ public function getValue($key); ``` **Example:** ```php $value = DotenvEditor::getValue('APP_URL'); ``` ### Writing content into a file To edit file content, you have two jobs: - First is writing content into the buffer - Second is saving the buffer into the file #### Add an empty line into buffer **Method syntax:** ```php /** * Add empty line to buffer * * @return DotenvEditor */ public function addEmpty(); ``` **Example:** ```php $file = DotenvEditor::addEmpty(); ``` #### Add a comment line into buffer **Method syntax:** ```php /** * Add comment line to buffer * * @param object * * @return DotenvEditor */ public function addComment($comment); ``` **Example:** ```php $file = DotenvEditor::addComment('This is a comment line'); ``` #### Add or update a setter into buffer **Method syntax:** ```php /** * Set one key to buffer * * @param string $key Key name of setter * @param string|null $value Value of setter * @param string|null $comment Comment of setter * @param boolean $export Leading key name by "export " * * @return DotenvEditor */ public function setKey($key, $value = null, $comment = null, $export = false); ``` **Example:** ```php // Set key ENV_KEY with empty value $file = DotenvEditor::setKey('ENV_KEY'); // Set key ENV_KEY with none empty value $file = DotenvEditor::setKey('ENV_KEY', 'anything-you-want'); // Set key ENV_KEY with a value and comment $file = DotenvEditor::setKey('ENV_KEY', 'anything-you-want', 'your-comment'); // Update key ENV_KEY with a new value and keep earlier comment $file = DotenvEditor::setKey('ENV_KEY', 'new-value-1'); // Update key ENV_KEY with a new value, keep earlier comment and use 'export ' before key name $file = DotenvEditor::setKey('ENV_KEY', 'new-value', null, true); // Update key ENV_KEY with a new value and clear comment 
$file = DotenvEditor::setKey('ENV_KEY', 'new-value-2', '', false); ``` #### Add or update multi setter into buffer **Method syntax:** ```php /** * Set many keys to buffer * * @param array $data * * @return DotenvEditor */ public function setKeys($data); ``` **Example:** ```php $file = DotenvEditor::setKeys([ [ 'key' => 'ENV_KEY_1', 'value' => 'your-value-1', 'comment' => 'your-comment-1', 'export' => true ], [ 'key' => 'ENV_KEY_2', 'value' => 'your-value-2', 'export' => true ], [ 'key' => 'ENV_KEY_3', 'value' => 'your-value-3', ] ]); ``` Alternatively, you can also provide an associative array of keys and values: ```php $file = DotenvEditor::setKeys([ 'ENV_KEY_1' => 'your-value-1', 'ENV_KEY_2' => 'your-value-2', 'ENV_KEY_3' => 'your-value-3', ]); ``` #### Delete a setter line in buffer **Method syntax:** ```php /** * Delete on key in buffer * * @param string $key * * @return DotenvEditor */ public function deleteKey($key); ``` **Example:** ```php $file = DotenvEditor::deleteKey('ENV_KEY'); ``` #### Delete multi setter lines in buffer **Method syntax:** ```php /** * Delete many keys in buffer * * @param array $keys * * @return DotenvEditor */ public function deleteKeys($keys = []); ``` **Example:** ```php // Delete two keys $file = DotenvEditor::deleteKeys(['ENV_KEY_1', 'ENV_KEY_2']); ``` #### Save buffer into file **Method syntax:** ```php /** * Save buffer to file * * @return DotenvEditor */ public function save(); ``` **Example:** ```php $file = DotenvEditor::save(); ``` ### Backing up and restoring file #### Backup your file **Method syntax:** ```php /** * Create one backup of loaded file * * @return DotenvEditor */ public function backup(); ``` **Example:** ```php $file = DotenvEditor::backup(); ``` #### Get all backup versions **Method syntax:** ```php /** * Return an array with all available backups * * @return array */ public function getBackups(); ``` **Example:** ```php $backups = DotenvEditor::getBackups(); ``` #### Get latest backup version **Method syntax:** ```php /** * Return the information of the latest backup file * * @return array */ public function getLatestBackup(); ``` **Example:** ```php $latestBackup = DotenvEditor::getLatestBackup(); ``` #### Restore your file from latest backup or other file **Method syntax:** ```php /** * Restore the loaded file from latest backup file or from special file. 
* * @param string|null $filePath * * @return DotenvEditor */ public function restore($filePath = null); ``` **Example:** ```php // Restore from latest backup $file = DotenvEditor::restore(); // Restore from other file $file = DotenvEditor::restore(storage_path('dotenv-editor/backups/.env.backup_2017_04_10_152709')); ``` #### Delete one backup file **Method syntax:** ```php /** * Delete the given backup file * * @param string $filePath * * @return DotenvEditor */ public function deleteBackup($filePath); ``` **Example:** ```php $file = DotenvEditor::deleteBackup(storage_path('dotenv-editor/backups/.env.backup_2017_04_10_152709')); ``` #### Delete multi backup files **Method syntax:** ```php /** * Delete all or the given backup files * * @param array $filePaths * * @return DotenvEditor */ public function deleteBackups($filePaths = []); ``` **Example:** ```php // Delete two backup files $file = DotenvEditor::deleteBackups([ storage_path('dotenv-editor/backups/.env.backup_2017_04_10_152709'), storage_path('dotenv-editor/backups/.env.backup_2017_04_11_091552') ]); // Delete all backups $file = DotenvEditor::deleteBackups(); ``` #### Change auto backup mode **Method syntax:** ```php /** * Switch the auto backup mode on or off * * @param boolean $on * * @return DotenvEditor */ public function autoBackup($on = true); ``` **Example:** ```php // Enable auto backup $file = DotenvEditor::autoBackup(true); // Disable auto backup $file = DotenvEditor::autoBackup(false); ``` ### Method chaining Some of the loading, writing, backing up, and restoring functions support method chaining, so these functions can be chained together in a single statement. Example: ```php $file = DotenvEditor::load('.env.example')->backup()->setKey('APP_URL', 'http://example.com')->save(); return $file->getKeys(); ``` ### Working with Artisan CLI Now, Laravel Dotenv Editor has 6 commands which can be used easily with the Artisan CLI. These are: - `php artisan dotenv:backup` - `php artisan dotenv:get-backups` - `php artisan dotenv:restore` - `php artisan dotenv:get-keys` - `php artisan dotenv:set-key` - `php artisan dotenv:delete-key` Please use each of the commands with the `--help` option to learn more about their usage. **Example:** ```shell $ php artisan dotenv:get-backups --help ``` ### Exceptions This package will throw exceptions if something goes wrong. This way it's easier to debug your code using this package or to handle the error based on the type of exception. | Exception | Reason | | ---------------------------- | ---------------------------------------------- | | *FileNotFoundException* | When the file was not found. | | *InvalidValueException* | When the value of a setter is invalid. | | *KeyNotFoundException* | When the requested key does not exist in the file. | | *NoBackupAvailableException* | When no backup file exists. | | *UnableReadFileException* | When unable to read the file. | | *UnableWriteToFileException* | When unable to write to the file. | # Contributors This project exists thanks to all its [contributors](https://github.com/JackieDo/Laravel-Dotenv-Editor/graphs/contributors). # License [MIT](LICENSE) © Jackie Do
24.330893
253
0.683837
eng_Latn
0.926431
0c30aab8084dd10bcbd77aa02e25ab79b9a5e1a1
4,621
md
Markdown
src/docs/ftw-introduction/ftw-customization-checklist/index.md
TalAter/flex-docs
54c507a2b260940f7114dd7c321c1d889dc8c51c
[ "Apache-2.0" ]
null
null
null
src/docs/ftw-introduction/ftw-customization-checklist/index.md
TalAter/flex-docs
54c507a2b260940f7114dd7c321c1d889dc8c51c
[ "Apache-2.0" ]
null
null
null
src/docs/ftw-introduction/ftw-customization-checklist/index.md
TalAter/flex-docs
54c507a2b260940f7114dd7c321c1d889dc8c51c
[ "Apache-2.0" ]
null
null
null
--- title: Customization checklist slug: customization-checklist updated: 2019-10-23 category: ftw-introduction ingress: This guide lists the important things to go through when customizing Flex Template for Web (FTW). published: true --- Here is a list of things to update and check when starting to customize FTW. ## 1. Customize visual styles - Marketplace colors: [How to customize FTW styles](/ftw-styling/how-to-customize-ftw-styles/) - Favicon and application icons: [How to change FTW icons](/ftw-styling/how-to-change-ftw-icons/) - Social media sharing graphics in the [Page](https://github.com/sharetribe/flex-template-web/blob/master/src/components/Page/Page.js) component - [Logo](https://github.com/sharetribe/flex-template-web/blob/master/src/components/Logo/Logo.js) component change and check that it works on [Topbar](https://github.com/sharetribe/flex-template-web/tree/master/src/components/TopbarDesktop), [Footer](https://github.com/sharetribe/flex-template-web/tree/master/src/components/Footer), and [CheckoutPage](https://github.com/sharetribe/flex-template-web/blob/master/src/containers/CheckoutPage/CheckoutPage.js) - [Default background image](https://github.com/sharetribe/flex-template-web/blob/master/src/assets/background-1440.jpg) ## 2. Change text content - Update UI texts or change the language: [How to change FTW UI texts and translations](/ftw-styling/how-to-change-ftw-ui-texts-and-translations/) - [LandingPage](https://github.com/sharetribe/flex-template-web/blob/master/src/containers/LandingPage/LandingPage.js) component: update and create branded sections - [Footer](https://github.com/sharetribe/flex-template-web/blob/master/src/components/Footer/Footer.js) component - [AboutPage](https://github.com/sharetribe/flex-template-web/blob/master/src/containers/AboutPage/AboutPage.js) component - Update [Terms of Service](https://github.com/sharetribe/flex-template-web/blob/master/src/components/TermsOfService/TermsOfService.js) and [Privacy Policy](https://github.com/sharetribe/flex-template-web/blob/master/src/components/PrivacyPolicy/PrivacyPolicy.js) <extrainfo title="Locate Terms of Service and Privacy Policy"> ```shell └── src    └── components       ├── TermsOfService       └── PrivacyPolicy ``` </extrainfo> ## 3. Change configuration - Go through the [FTW Environment configuration variables](/ftw-configuration/ftw-env/) - [Config: siteTitle](https://github.com/sharetribe/flex-template-web/blob/master/src/config.js) for page schema (SEO) - [Config: marketplace address](https://github.com/sharetribe/flex-template-web/blob/master/src/config.js): contact details also improve SEO - [Config: social media pages](https://github.com/sharetribe/flex-template-web/blob/master/src/config.js) - [Marketplace custom config](https://github.com/sharetribe/flex-template-web/blob/master/src/marketplace-custom-config.js) ## 4. 
Other optional changes - Update [ListingPage](https://github.com/sharetribe/flex-template-web/blob/master/src/containers/ListingPage/ListingPage.js) to show extended data (aka publicData attribute) - Update [EditListingWizard](https://github.com/sharetribe/flex-template-web/blob/master/src/components/EditListingWizard/EditListingWizard.js) and panels to add extended data - Update [SearchPage](https://github.com/sharetribe/flex-template-web/blob/master/src/containers/SearchPage/SearchPage.js) to filter with extended data - Update [routeConfiguration](https://github.com/sharetribe/flex-template-web/blob/master/src/routeConfiguration.js) if needed - Update transaction email templates. For more information, see [Edit email templates with Flex CLI](/flex-cli/edit-email-templates-with-flex-cli/) tutorial and [Email templates](/references/email-templates/) reference article. - Update [config: bookingUnitType](https://github.com/sharetribe/flex-template-web/blob/master/src/config.js) if needed - If `line-item/units` is used, add quantity handling to [BookingDatesForm](https://github.com/sharetribe/flex-template-web/blob/master/src/forms/BookingDatesForm/BookingDatesForm.js), [ListingPage](https://github.com/sharetribe/flex-template-web/blob/master/src/containers/ListingPage/ListingPage.js), [CheckoutPage](https://github.com/sharetribe/flex-template-web/blob/master/src/containers/CheckoutPage/CheckoutPage.js) - Add more static pages: [How to add static pages in FTW](/ftw-styling/how-to-add-static-pages-in-ftw/) - Changes to existing pages - Changes to transaction process (API + Web app)
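As a companion to the configuration step above (section 3), here is a hypothetical `.env` sketch of the kind of environment variables FTW reads at startup. The exact variable names and required values depend on your template version, so treat this only as an illustration and check the environment configuration article linked above:

```shell
# .env (illustrative values only)
REACT_APP_SHARETRIBE_SDK_CLIENT_ID=change-me-client-id
REACT_APP_STRIPE_PUBLISHABLE_KEY=pk_test_change_me
REACT_APP_CANONICAL_ROOT_URL=http://localhost:3000
REACT_APP_SHARETRIBE_MARKETPLACE_CURRENCY=USD
```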
47.153061
136
0.769314
yue_Hant
0.499442
0c31bcb6f53b9b2a40c45c5514f8952e57800b73
3,166
md
Markdown
docs-archive-a/2014/integration-services/lesson-1-4-adding-a-data-flow-task-to-the-package.md
v-alji/sql-docs-archive-pr.es-es
410a49b0a08c22fd4bc973078b563238d69c8b44
[ "CC-BY-4.0", "MIT" ]
1
2021-11-25T21:09:51.000Z
2021-11-25T21:09:51.000Z
docs-archive-a/2014/integration-services/lesson-1-4-adding-a-data-flow-task-to-the-package.md
v-alji/sql-docs-archive-pr.es-es
410a49b0a08c22fd4bc973078b563238d69c8b44
[ "CC-BY-4.0", "MIT" ]
1
2021-11-25T02:22:05.000Z
2021-11-25T02:27:15.000Z
docs-archive-a/2014/integration-services/lesson-1-4-adding-a-data-flow-task-to-the-package.md
v-alji/sql-docs-archive-pr.es-es
410a49b0a08c22fd4bc973078b563238d69c8b44
[ "CC-BY-4.0", "MIT" ]
1
2021-09-29T08:53:04.000Z
2021-09-29T08:53:04.000Z
---
title: 'Step 4: Adding a Data Flow Task to the Package | Microsoft Docs'
ms.custom: ''
ms.date: 06/13/2017
ms.prod: sql-server-2014
ms.reviewer: ''
ms.technology: integration-services
ms.topic: conceptual
ms.assetid: 96af3073-8f11-4444-b934-fe8613a2d084
author: chugugrace
ms.author: chugu
ms.openlocfilehash: 4fda1de99e9fcaa683f9063e2feb1b71886a5cd1
ms.sourcegitcommit: ad4d92dce894592a259721a1571b1d8736abacdb
ms.translationtype: MT
ms.contentlocale: es-ES
ms.lasthandoff: 08/04/2020
ms.locfileid: "87673128"
---
# <a name="step-4-adding-a-data-flow-task-to-the-package"></a>Step 4: Adding a Data Flow Task to the Package

After you have created the connection managers for the source and destination data, the next task is to add a Data Flow task to the package. The Data Flow task encapsulates the data flow engine that moves data between sources and destinations, and provides the functionality to transform, clean, and modify the data as it is moved. The Data Flow task is where most of the work of an extract, transform, and load (ETL) process takes place.

> [!NOTE]
> [!INCLUDE[ssNoVersion](../includes/ssnoversion-md.md)] [!INCLUDE[ssISnoversion](../includes/ssisnoversion-md.md)] separates data flow from control flow.

### <a name="to-add-a-data-flow-task"></a>To add a Data Flow task

1. Click the **Control Flow** tab.

2. In the **SSIS Toolbox**, expand **Favorites**, and drag a **Data Flow Task** onto the design surface of the **Control Flow** tab.

   > [!NOTE]
   > If the SSIS Toolbox is not available, select SSIS on the main menu and then click SSIS Toolbox to display it.

3. On the **Control Flow** design surface, right-click the newly added **Data Flow Task**, click **Rename**, and change the name to `Extract Sample Currency Data`.

   It is a best practice to provide unique names to all components that you add to a design surface. For ease of use and maintainability, the names should describe the function that each component performs. Following these naming guidelines makes your [!INCLUDE[ssISnoversion](../includes/ssisnoversion-md.md)] packages self-documenting. Packages can also be documented by using annotations. For more information about how to use annotations, see [Use Annotations in Packages](use-annotations-in-packages.md).

4. Right-click the Data Flow task, click **Properties**, and in the Properties window verify that the `LocaleID` property is set to **English (United States)**.

## <a name="next-task-in-lesson"></a>Next task in lesson
[Step 5: Adding and Configuring the Flat File Source](lesson-1-5-adding-and-configuring-the-flat-file-source.md)

## <a name="see-also"></a>See Also
[Data Flow Task](control-flow/data-flow-task.md)
67.361702
573
0.756791
spa_Latn
0.978777
0c31cd1cbcf7eb1755397b7a1c520d79802a8f6e
24
md
Markdown
README.md
htlustboy/javaNIOProject
bb60bcbd53f1bec1e2bc471035dec3b1a104d036
[ "Unlicense" ]
null
null
null
README.md
htlustboy/javaNIOProject
bb60bcbd53f1bec1e2bc471035dec3b1a104d036
[ "Unlicense" ]
null
null
null
README.md
htlustboy/javaNIOProject
bb60bcbd53f1bec1e2bc471035dec3b1a104d036
[ "Unlicense" ]
null
null
null
# javaNIOProject

javaIO
8
16
0.833333
swe_Latn
0.188341
0c326819540f706189119033eade57b302d9f33c
11,476
md
Markdown
docs/content/any/guides/rbac/index.md
Kn99HN/oso
cb8e7dc40daf2b47a0fcce59c3026cc072c2989b
[ "Apache-2.0" ]
null
null
null
docs/content/any/guides/rbac/index.md
Kn99HN/oso
cb8e7dc40daf2b47a0fcce59c3026cc072c2989b
[ "Apache-2.0" ]
null
null
null
docs/content/any/guides/rbac/index.md
Kn99HN/oso
cb8e7dc40daf2b47a0fcce59c3026cc072c2989b
[ "Apache-2.0" ]
null
null
null
--- title: Build Role-Based Access Control (RBAC) metaTitle: Build Role-Based Access Control (RBAC) in $LANG weight: -1 description: | Build role-based access control (RBAC) with Oso's built-in authorization modeling features. aliases: - /guides/roles/index.html - /guides/roles/sqlalchemy/index.html --- # Build Role-Based Access Control (RBAC) in {{% lang %}} Role-based access control (RBAC) is so ubiquitous that Oso provides syntax for modeling RBAC. This syntax makes it easy to create a role-based authorization policy with roles and permissions -- for example, declaring that the `"maintainer"` role on a repository allows a user to `"push"` to that repository. In this guide, we'll walk through building an RBAC policy for [GitClub][]. [GitClub]: https://github.com/osohq/gitclub ## Declare application types as actors and resources Oso makes authorization decisions by determining if an **actor** can perform an **action** on a **resource**: - **Actor**: who is performing the action? `User("Ariana")` - **Action**: what are they trying to do? `"push"` - **Resource**: what are they doing it to? `Repository("Acme App")` The first step of building an RBAC policy is telling Oso which application types are **actors** and which are **resources**. Our example app has a pair of **resource** types that we want to control access to, `Organization` and `Repository`. We declare both as resources as follows: <!-- TODO(gj): I guess these only need to be dedented when you use angle bracket handlebars. --> {{< code file="main.polar" >}} resource Organization {} resource Repository {} {{< /code >}} Our app also has a `User` type that will be our lone type of **actor**: {{% literalInclude dynPath="policy_path" from="docs: begin-actor" to="docs: end-actor" %}} This piece of syntax is called a *resource block*, and it performs two functions: it identifies the type as an **actor** or a **resource**, and it provides a centralized place to declare roles and permissions for that particular type. {{% callout "Note" "blue" %}} For every resource block, we also need to register the type with Oso: {{< literalInclude dynPath="app_path" from="docs: begin-setup" to="docs: end-setup" hlFrom="docs: begin-register" hlTo="docs: end-register" >}} {{% /callout %}} ## Declare roles and permissions In GitClub, users can perform actions such as `"delete"`-ing an organization or `"push"`-ing to a repository. Users can also be assigned roles for either type of resource, such as the `"owner"` role for an `Organization` or the `"maintainer"` role for a `Repository`. Inside the curly braces of each `resource` block, we declare the roles and permissions for that resource: {{< code file="main.polar" >}} resource Organization { roles = ["owner"]; } resource Repository { permissions = ["read", "push"]; roles = ["contributor", "maintainer"]; } {{< /code >}} <!-- TODO(gj): transition --> ## Grant permissions to roles Next, we're going to write *shorthand rules* that grant permissions to roles. For example, if we grant the `"push"` permission to the `"maintainer"` role in the `Repository` resource block, then a user who's been assigned the `"maintainer"` role for a particular repository can `"push"` to that repository. Here's our `Repository` resource block with a few shorthand rules added: {{< code file="main.polar" hl_lines="5-10" >}} resource Repository { permissions = ["read", "push"]; roles = ["contributor", "maintainer"]; # An actor has the "read" permission if they have the "contributor" role. 
"read" if "contributor"; # An actor has the "read" permission if they have the "maintainer" role. "read" if "maintainer"; # An actor has the "push" permission if they have the "maintainer" role. "push" if "maintainer"; } {{< /code >}} Shorthand rules expand to regular [Polar rules](polar-syntax#rules) when a policy is loaded. The `"push" if "maintainer"` shorthand rule above expands to: ```polar has_permission(actor: Actor, "push", repository: Repository) if has_role(actor, "maintainer", repository); ``` {{% callout "Note" "blue" %}} Instances of our application's `User` type will match the `Actor` [specializer](polar-syntax#specialization) because of our `actor User {}` resource block declaration. {{% /callout %}} ## Grant roles to other roles All of the shorthand rules we've written so far have been in the `<permission> if <role>` form, but we can also write `<role1> if <role2>` rules. This type of rule is great for situations where you want to express that `<role2>` should be granted every permission you've granted to `<role1>`. In the previous snippet, the permissions granted to the `"maintainer"` role are a superset of those granted to the `"contributor"` role. If we replace the existing `"read" if "maintainer"` rule with `"contributor" if "maintainer"`, the `"maintainer"` role still grants the `"read"` permission: {{< code file="main.polar" hl_lines="10-11" >}} resource Repository { permissions = ["read", "push"]; roles = ["contributor", "maintainer"]; # An actor has the "read" permission if they have the "contributor" role. "read" if "contributor"; # An actor has the "push" permission if they have the "maintainer" role. "push" if "maintainer"; # An actor has the "contributor" role if they have the "maintainer" role. "contributor" if "maintainer"; } {{< /code >}} In addition, any permissions we grant the `"contributor"` role in the future will automatically propagate to the `"maintainer"` role. <!-- TODO(gj): better heading --> ## Access role assignments stored in the application An Oso policy contains authorization *logic*, but the application remains in control of all authorization *data*. For example, the logic that the `"maintainer"` role on a repository grants the `"push"` permission lives in the policy, but Oso doesn't manage the data of which users have been assigned the `"maintainer"` role for `Repository("Acme App")`. That data stays in the application, and Oso asks the application for it as needed. The main question Oso asks is: does `User("Ariana")` have the `"maintainer"` role on `Repository("Acme App")`? For Oso to be able to ask this question, we need to implement a `has_role()` rule in the policy: {{< literalInclude dynPath="policy_path" from="docs: begin-has_role" to="docs: end-has_role" >}} `role in user.{{% exampleGet "roles" %}}` iterates over a user's assigned roles and `role.{{% exampleGet "role_name" %}} = name and role.{{% exampleGet "role_resource" %}} = resource` succeeds if the user has been assigned the `name` role for `resource`. {{< callout "Note" "blue" >}} <!-- TODO(gj): spacing seems off in these callouts --> <div class="pb-4"></div> The body of this rule will vary according to the way roles are stored in your application. 
The data model for our GitClub example is as follows: <div class="pb-4"></div> {{< literalInclude dynPath="app_path" from="docs: begin-types" to="docs: end-types" >}} If, for example, repository roles and organization roles were stored separately instead of in a heterogeneous set, we might define a pair of `has_role()` rules, one for each role type: <!-- TODO(gj): why do I need to dedent this? --> {{< code codeLang="polar" >}} has_role(user: User, name: String, repository: Repository) if role in user.{{% exampleGet "repository_roles" %}} and role.{{% exampleGet "role_name" %}} = name and role.{{% exampleGet "role_repository" %}} = repository; has_role(user: User, name: String, organization: Organization) if role in user.{{% exampleGet "organization_roles" %}} and role.{{% exampleGet "role_name" %}} = name and role.{{% exampleGet "role_organization" %}} = organization; {{< /code >}} {{% /callout %}} Our `has_role()` rule can check role assignments on repositories and organizations, but so far we've only talked about repository roles. Let's change that and see how Oso can leverage parent-child relationships like the one between `Repository` and `Organization` to grant a role on a child resource to a role on the parent. <!-- TODO(gj): better heading --> ## Grant a role on a child resource to a role on the parent If you've used ~~GitHub~~ *GitClub* before, you know that having a role on an organization grants certain roles and permissions on that organization's repositories. For example, a user is granted the `"maintainer"` role on a repository if they're assigned the `"owner"` role on the repository's parent organization. This is how you write that rule with Oso: {{< literalInclude dynPath="policy_path" lines="22-26,37-45" hl_lines="4,8-9,12-13" ellipsis=" # ..." >}} First, we declare that every `Repository` has a `"parent"` relation that references an `Organization`: {{< literalInclude dynPath="policy_path" from="docs: begin-relations" to="docs: end-relations" >}} This is a dictionary where each key is the name of the relation and each value is the relation's type. Next, we write a `has_relation()` rule that tells Oso how to check if an organization has the `"parent"` relation with a repository: {{< literalInclude dynPath="policy_path" from="docs: begin-has_relation" to="docs: end-has_relation" >}} <!-- TODO(gj): better phrasing for the next sentence --> In this case, an organization is the `"parent"` of a repository if the repository's `organization` field points to it. {{% callout "Note" "blue" %}} Note that the resource where we declared the relationship, `Repository`, is the *third* parameter and the related resource, `Organization`, is the *first*. This ordering was chosen to mirror the ordering of the expanded forms for `has_role()` and `has_permission()`, where the resource for which the actor has the role or permission is the third argument: ```polar has_role(actor: Actor, name: String, resource: Resource) if ... has_permission(actor: Actor, name: String, resource: Resource) if ... has_relation(related_resource: Resource, name: String, resource: Resource) if ... ``` {{% /callout %}} Finally, we add a shorthand rule that involves the `"maintainer"` repository role, the `"owner"` organization role, and the `"parent"` relation between the two resource types: {{< literalInclude dynPath="policy_path" lines="22-26,37-39" hl_lines="4,8-9" ellipsis=" # ..." >}} ## Add an `allow()` rule At this point, the policy is almost fully functional. 
All that's left is adding an `allow()` rule: {{< literalInclude dynPath="policy_path" from="docs: begin-allow" to="docs: end-allow" >}} This is a typical `allow()` rule for a policy using resource blocks: an actor is allowed to perform an action on a resource if the actor *has permission* to perform the action on the resource. <!-- And an actor has permission to perform an action on a resource if the actor is assigned a role that grants that permission. --> This `allow()` rule serves as the entrypoint when we query our policy via Oso's enforcement methods like {{% apiDeepLink class="Oso" %}}{{% exampleGet "authorize_method_name" %}}{{% /apiDeepLink %}}: {{% exampleGet "authorize" %}} ## Baby Got RBAC Our complete policy looks like this: {{< literalInclude dynPath="policy_path" >}} If you'd like to play around with a more fully-featured version of this policy and application, check out the GitClub repository on [GitHub][GitClub].
35.094801
101
0.712705
eng_Latn
0.994996
0c32c9634f7424b985013deded6be0ae045a1197
6,449
md
Markdown
README.md
Zulqurnain/ZCalenderView
0fa01702abd9f29a628d78dbf17dd40ff4bf3487
[ "MIT", "Unlicense" ]
7
2017-08-28T00:28:01.000Z
2021-01-16T21:44:52.000Z
README.md
Zulqurnain/ZCalenderView
0fa01702abd9f29a628d78dbf17dd40ff4bf3487
[ "MIT", "Unlicense" ]
null
null
null
README.md
Zulqurnain/ZCalenderView
0fa01702abd9f29a628d78dbf17dd40ff4bf3487
[ "MIT", "Unlicense" ]
2
2018-01-09T16:59:30.000Z
2022-03-01T17:32:01.000Z
# ZCalenderView It's a custom calender inspired from iOS7 style calender , displays Months verticaly. you can include this library using gradle , maven and if you enjoy don't forget to follow me on facebook [@RajaJutt](https://www.facebook.com/Raja.jutt "joine on facebook") , twitter [@zulqurnainbro](https://twitter.com/zulqurnainbro "twitter") Thanks ;) ## ScreenShots <img src="https://github.com/Zulqurnain/ZCalenderView/raw/master/screenshots/1.png" width="200"> <img src="https://github.com/Zulqurnain/ZCalenderView/raw/master/screenshots/2.png" width="200"> <img src="https://github.com/Zulqurnain/ZCalenderView/raw/master/screenshots/3.png" width="200"> ## Features: - Support Up to API level 11+ - Show Months Verticaly - Support for Marking Events - Curreent day and Marked day selection color and text changing - fixed crashes - Kotlin Support :heartpulse: - Android 11 Support :scream: - JCenter Removed :ok_hand: ### Gradle [![](https://jitpack.io/v/Zulqurnain/ZCalenderView.svg)](https://jitpack.io/#Zulqurnain/ZCalenderView) **Step 1** Add the JitPack repository to your build file. Add it in your build.gradle at the end of repositories. ```java repositories { maven { url "https://jitpack.io" } } ``` **Step-2** Add the dependency in the form ```java dependencies { implement 'com.github.Zulqurnain:ZCalenderView:2.0' } ``` ### Maven ```xml <repository> <id>jitpack.io</id> <url>https://jitpack.io</url> </repository> ``` **Step 2** Add the dependency in the form ```xml <dependency> <groupId>com.github.Zulqurnain</groupId> <artifactId>ZCalenderView</artifactId> <version>2.0</version> </dependency> ``` ### Sbt **Step-1** Add it in your build.sbt at the end of resolvers: ```java resolvers += "jitpack" at "https://jitpack.io" ``` **Step-2** Add the dependency in the form ```java libraryDependencies += "com.github.Zulqurnain" % "ZCalenderView" % "2.0" ``` ### Usage Declare a ZCalenderView inside your layout XML file: ``` xml <jutt.com.zcalenderview.ZCalenderView android:id="@+id/calendar_view" xmlns:calendar="http://schemas.android.com/apk/res-auto" android:layout_width="match_parent" android:layout_height="match_parent" android:layout_below="@+id/ll_calender_month" android:background="@android:color/white" calendar:colorMonthName="@color/colorGreen" calendar:colorCurrentDayCircle="@color/colorLightRed" calendar:colorCurrentDayText="@android:color/white" calendar:calendarHeight="400dp" calendar:colorDayName="@color/colorPrimary" calendar:colorNormalDay="@color/colorPrimary" calendar:drawRoundRect="false"/> ``` Next, you have to implement `DatePickerController` in your Activity or your Fragment. 
calender originaly shows 3 years data current year previous one and next one ``` java @Override public void onDayOfMonthSelected(int year, int month, int day) { Log.e("Day Selected", day + " / " + month + " / " + year); } ``` Then you can use it like this: ``` calenderView = (ZCalenderView) findViewById(R.id.calendar_view); // on click of specific date calenderView.setController(this); // define if you want to show 1 month for this view only and vertical scroll is enable in this case calenderView.setEnableHeightResize(false); // events require hashmap HashMap<SimpleMonthAdapter.CalendarDay,Integer> eventsMap = new HashMap<>(); // this date is basically 1st august 2017 and it will show 1 event dot under this day eventsMap.put(new SimpleMonthAdapter.CalendarDay(2017,7,20),1); calenderView.setEventsHashMap(eventsMap); ``` --- ### Customization ZCalenderView is fully customizable: * app:colorSelectedDayBackground --> If you click on a day, a circle indicator or a rouded rectangle indicator will be draw. * app:colorSelectedDayText --> This is the text color of a selected day * app:colorPreviousDay [color def:#ff999999] --> In the current month you can choose to have a specific color for the past days * app:colorNormalDay [color def:#ff999999] --> Default text color for a day * app:colorMonthName [color def:#ff999999] --> Month name and year text color (three letter words) * app:colorDayName [color def:#ff999999] --> Day name text color * app:textSizeDay [dimension def:16sp] --> Font size for numeric day must be greater than or equal to given * app:drawRoundRect [boolean def:false] --> Draw a rounded rectangle for selected days instead of a circle * app:selectedDayRadius [dimension def:16dip] --> Set radius if you use default circle indicator * app:calendarHeight [dimension def:270dip] --> Height of each month/row * app:enablePreviousDay [boolean def:true] --> Enable past days in current month * app:currentDaySelected [boolean def:false] --> Select current day by default * app:colorCurrentDayCircle --> current day circle color * app:colorCurrentDayText --> current day text color ### Acknowledgements Thanks to: - [Robin Chutaux](https://github.com/traex) for his [CalendarListview](https://github.com/traex/CalendarListview). - [JunGuan Zhu](https://github.com/NLMartian) for further improving Robin's work as [SilkCal](https://github.com/NLMartian/SilkCal) ### MIT License ``` The MIT License (MIT) Copyright (c) 2014 Junguan Zhu Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ```
37.935294
340
0.7274
eng_Latn
0.707333
0c335d91d7f8e1c2199345fc137b73aca09de8ce
6,971
md
Markdown
README.md
QCOP/it-info
7f40fb0d799d51131867f8d7bcdbd7dc2904651e
[ "Apache-1.1" ]
null
null
null
README.md
QCOP/it-info
7f40fb0d799d51131867f8d7bcdbd7dc2904651e
[ "Apache-1.1" ]
null
null
null
README.md
QCOP/it-info
7f40fb0d799d51131867f8d7bcdbd7dc2904651e
[ "Apache-1.1" ]
null
null
null
# it-info Information for the current IT Executive. ## About This file is for the current IT Executive. This is a list of the how-to's to do your role in relation to the website. The website is hosted on Github pages because it's free and it's simpler. The websites are built in pure HTML, CSS, and JS. In this document, I'm assuming you have zero knowledge of Github or Github pages. If you have any questions, fire me a message on Facebook or chat to the previous years IT person. ## Using Github ### Why? Pretty simply: it's free, keeps a copy of all files and copies of previous years so it's easy to get past speaker and sponsor information. ### Creating a new website repository We have a QCOP organization. If you are not apart of this organization then you need to contact the co-chairs or the previous years co-chairs. They have `owner` permissions. All QCOP related code can be found [here](https://github.com/QCOP). Each year creates a new repository (In short: repo) for their website so following the naming convention `QCOP-20XX`. There is no copy of the [2016](https://github.com/qcop/qcop-2015) year however it was a fork of the 2015 in terms of looks, and the speaker and sponsor assets exists in [2017](https://github.com/qcop/qcop-2017). ![logo](https://raw.githubusercontent.com/QCOP/it-info/master/imgs/step1_1.png "Logo Title Text 2") As an owner, you can create the new repository: click the `+` in the top right corner, then `New Repository`. From the new repository page you can change the owner (using the dropdown) to be `QCOP`. Insert the name (`QCOP-20__`) and the description (`The Queen's Conference on Philanthropy website for 20__-20__`). Keep the repository public because website assets are public anyway (plus private costs money). Check `Initialize this repository with a README`, and add an `Apache License`. Then click `Create repository`. ![logo2](https://raw.githubusercontent.com/QCOP/it-info/master/imgs/step1_2.png "Logo Title Text 2") ### Cloning your repository Now open your new repository, so the URL would be something like: `https://github.com/qcop/qcop-20XX`. It will look like: ![logo3](https://raw.githubusercontent.com/QCOP/it-info/master/imgs/step1_3.png "Logo Title Text 2") Keep this open. Now you need `git`. Git and Github are two different things. Git is a versioning software for developers. And Github at a high level is a UI for public Git servers. I'm not going to explain it too much more but if you do some Googling, someone explains it. #### Installing Git Go to [https://git-scm.com/](https://git-scm.com/) and hit the download on the right side. ![logo4](https://raw.githubusercontent.com/QCOP/it-info/master/imgs/step1_4.png "Logo Title Text 2") Now install that. #### Cloning Open your repo Github page again. Under the green button, you'll see a URL. This should be `https`, if not click `Use HTTPS`. Copy it. It will be something like `https://github.com/qcop/qcop-20__.git`. Open a terminal (this is a program) on your computer. If you are on windows, use the program called GitBash and anytime I reference terminal, its Gitbash. Navigate to the desired folder using: ``` # Lists the current folders and files in your current directory ls # Navigates into a folder cd ______ # Navigates back a folder cd .. # Takes you to your desktop cd ~/Desktop ``` I'd probably do something like: ``` cd ~/Desktop ``` Then clone the repository using that `git` link: ``` git clone https://github.com/qcop/qcop-20__.git ``` And you should see a new folder in that directory (folder). 
Then navigate into that directory using: ``` cd qcop-20__ ``` You can also press `tab` to auto complete the folder if you are at a unique part. *Image with commands below* ### Making your commits Just as a test/example copy and paste a file into that folder (A file you don't mind being public). After doing that, in your terminal with the repo being your current directory (hence follow the `cd` into the correct folder). You should be able to type: ``` git status ``` and see the one file. *Instructions from above in my terminal* ![logo5](https://raw.githubusercontent.com/QCOP/it-info/master/imgs/step2_1.png "Logo Title Text 2") Then type: ``` git add --all ``` or to add each individual file: ``` git add FILE_NAME_LIKE_CHANGE_THIS # Example: git add ./example_file.txt # ./ <-- means current directory ``` Then type (description cannot be blank): ``` git commit -m "DESCRIBE WHAT YOU DID HERE" ``` Then push it to the cloud/Github: ``` git push ``` And you should see it on Github. ![logo6](https://raw.githubusercontent.com/QCOP/it-info/master/imgs/step2_3.png "Logo Title Text 2") So every time you want to update the website. Change the file, then follow the `add`, `commit`, `push` commands. If you ever have an issue, do `git status` and read the recommended commands. ### Updating the GH-Pages Branch On the repo, there is a settings tab. You'll have to open it. ![logo7](https://raw.githubusercontent.com/QCOP/it-info/master/imgs/step3_1.png "Logo Title Text 2") Scroll down to the `Github Pages` section. Use the `source` dropdown and select the `master branch`. This means all files on `master` will go to the website. ![logo28](https://raw.githubusercontent.com/QCOP/it-info/master/imgs/step3_2.png "Logo Title Text 2") Now you should be able to go to `https://qcop.github.io/qcop-20__` and see the website. It may take a few minutes. If you can't after 2 hours, you did something wrong. ### Pointing the domain To get the QCOP domain to work on the Github pages so you go to `qcop.ca` instead of `https://qcop.github.io...`. You'll need to add a `CNAME` file to your repo (put one in the folder and push it to the server). This file is `CNAME` only, no extensions or anything. Look at the previous years to see an example. In this file add `qcop.ca`. You can open a file like this using [Atom](https://atom.io/), Sublime Text 3, Notepad++, Notepad, etc. Next, QCOP's domain is registered with [CanHost](https://canhost.ca). The co-chairs should have the email and password. Click on `Domains`. ![logo9](https://raw.githubusercontent.com/QCOP/it-info/master/imgs/stepa.png "Logo Title Text 2") Then hit the little wrench next to `qcop.ca`. ![logo10](https://raw.githubusercontent.com/QCOP/it-info/master/imgs/stepb.png "Logo Title Text 2") On the left is a menu and at the bottom click `Manage DNS`. ![logo11](https://raw.githubusercontent.com/QCOP/it-info/master/imgs/stepc.png "Logo Title Text 2") Last step, change `CNAME` row, `rdata` column to be `qcop.github.io` and the `localhost.qcop.ca` row to be `http://qcop.github.io/qcop-20XX`. This will take a little while to kick in. ![logo12](https://raw.githubusercontent.com/QCOP/it-info/master/imgs/stepd.png "Logo Title Text 2") ## Collecting Applications You can use a few options around the internet. [TypeForm](https://typeform.com) is probably my recommendation.
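To tie the workflow above together, here is a typical update session, assuming the repo was cloned to your Desktop as described earlier (the folder name and commit message are just placeholders):

```
# Go to the repo you cloned earlier
cd ~/Desktop/qcop-20__

# See which files changed
git status

# Stage everything, describe the change, and publish it
git add --all
git commit -m "Update speaker bios"
git push
```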
37.681081
571
0.740783
eng_Latn
0.981474
0c33b5bf8f08904f287167d43523532ef99b2039
1,276
md
Markdown
README.md
matusf/pay-by-square
83013ac02acce5f27c9eebb27d3c3f5b773e20c1
[ "MIT" ]
9
2020-11-03T19:31:55.000Z
2021-12-26T14:21:56.000Z
README.md
matusf/pay-by-square
83013ac02acce5f27c9eebb27d3c3f5b773e20c1
[ "MIT" ]
3
2020-09-11T10:45:04.000Z
2021-07-30T13:22:10.000Z
README.md
matusf/pay-by-square
83013ac02acce5f27c9eebb27d3c3f5b773e20c1
[ "MIT" ]
4
2020-07-17T10:42:58.000Z
2021-11-16T17:40:20.000Z
# PAY by square

Generate codes for [by square](https://bysquare.com/) payments.

## Installation

Note: `pay-by-square` generates a string that can be passed to a QR code generator to create an image. To run the example below, you need to install the [qrcode module](https://github.com/lincolnloop/python-qrcode) as well.

```sh
pip install pay-by-square
```

## Usage

### API

```text
pay_by_square.generate(
    *,
    amount: float,
    iban: str,
    swift: str = '',
    date: Optional[date] = None,
    beneficiary_name: str = '',
    currency: str = 'EUR',
    variable_symbol: str = '',
    constant_symbol: str = '',
    specific_symbol: str = '',
    note: str = '',
    beneficiary_address_1: str = '',
    beneficiary_address_2: str = '',
) -> str:

    Generate a pay-by-square code that can be used to create a QR code for banking apps.

    When date is not provided, the current date will be used.
```

### Example

```python
import qrcode
import pay_by_square

code = pay_by_square.generate(
    amount=10,
    iban='SK7283300000009111111118',
    swift='FIOZSKBAXXX',
    variable_symbol='47',
)

print(code)
img = qrcode.make(code)
img.show()
```

## Testing

```sh
python -m unittest tests.py
```

---

Kudos to [guys from devel.cz](https://devel.cz/otazka/qr-kod-pay-by-square)
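Building on the documented `generate()` signature above, here is a hedged variation that fills in a few more of the optional fields and writes the QR image to disk instead of displaying it. The payee name, note, symbol, and output filename are made-up illustrative values:

```python
import qrcode
import pay_by_square

code = pay_by_square.generate(
    amount=42.50,
    iban='SK7283300000009111111118',
    swift='FIOZSKBAXXX',
    variable_symbol='2021001',
    beneficiary_name='Example s.r.o.',
    note='Invoice 2021001',
)

# qrcode.make() returns a PIL image, so it can be saved directly
qrcode.make(code).save('payment-qr.png')
```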
18.764706
88
0.662226
eng_Latn
0.842121
0c33bc1dbddad373e5fff350d570bb898fa1391d
1,449
md
Markdown
README.md
lehman-webdev/project-1-turtles
e84937e313bb82d363aa5897eee27e1af151b028
[ "MIT" ]
6
2019-03-25T22:21:12.000Z
2021-11-17T09:12:50.000Z
README.md
lehman-webdev/project-1-turtles
e84937e313bb82d363aa5897eee27e1af151b028
[ "MIT" ]
1
2019-04-02T15:25:13.000Z
2019-04-08T21:41:17.000Z
README.md
lehman-webdev/project-1-turtles
e84937e313bb82d363aa5897eee27e1af151b028
[ "MIT" ]
3
2019-03-25T22:18:09.000Z
2021-03-28T13:29:47.000Z
# Project 1

This project is intended to exercise everything we have learned in class up to (excluding) JavaScript. The task is to implement a multi-page static website without using any third-party libraries, toolkits, templates, or existing code.

A design may be created from scratch or adapted from any website/theme/template available online (but no code may be reused from any source: not your own, not a library or toolkit, not an existing website, not a template). The content may be anything—a fictitious business website, a personal website for yourself, a website for an organization you belong to, etc. Alternatively, you may clone an existing website (recreate a site that already exists). Think about your site content, layout, pages, design (read chapter 18 in our book).

## Goals

The purpose of this project is to practice using and demonstrate understanding of:

**Git**

* Code locally, push to GitHub
* Conflict resolution

**HTML**

* Create multi-page static website
* Demonstrate understanding of all major tags

**CSS**

* Implement responsive design
* Demonstrate understanding of common properties

## Requirements

* HTML correctness (validation)
* HTML variety (use of many tags)
* Accessibility
* Responsiveness (media queries)
* CSS correctness (validation)
* CSS variety (use of many properties)
* Team delegation (splitting up of work)
* Consistency between pages (header, footer)
* Code style (indentation)
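Since the requirements above call for responsiveness via media queries, here is a small illustrative CSS sketch of the idea. The class name is hypothetical and not part of the assignment:

```css
/* Two columns side by side on wide screens... */
.content {
  display: flex;
  gap: 1rem;
}

/* ...stacked into a single column on narrow screens */
@media (max-width: 600px) {
  .content {
    flex-direction: column;
  }
}
```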
42.617647
458
0.775017
eng_Latn
0.99824
0c33cafbe87a99309c6f785da6117d24a2cff61a
947
md
Markdown
docs/quick-start.md
martin7dev/n8n
0bdb9cecac846db285b0df768f0a8e7e4d9d43c6
[ "Apache-2.0" ]
1
2020-05-27T10:07:39.000Z
2020-05-27T10:07:39.000Z
docs/quick-start.md
martin7dev/n8n
0bdb9cecac846db285b0df768f0a8e7e4d9d43c6
[ "Apache-2.0" ]
19
2022-03-07T06:56:25.000Z
2022-03-11T19:47:52.000Z
docs/quick-start.md
martin7dev/n8n
0bdb9cecac846db285b0df768f0a8e7e4d9d43c6
[ "Apache-2.0" ]
1
2020-11-03T14:12:31.000Z
2020-11-03T14:12:31.000Z
# Quick Start

## Give n8n a spin

To spin up n8n, you can run:

```bash
npx n8n
```

It will download everything that is needed to start n8n.

You can then access n8n by opening:
[http://localhost:5678](http://localhost:5678)

## Start with docker

To play around with n8n, you can also start it using docker:

```bash
docker run -it --rm \
  --name n8n \
  -p 5678:5678 \
  n8nio/n8n
```

Be aware that all the data will be lost once the docker container gets removed. To persist the data mount the `~/.n8n` folder:

```bash
docker run -it --rm \
  --name n8n \
  -p 5678:5678 \
  -v ~/.n8n:/root/.n8n \
  n8nio/n8n
```

More information about the Docker setup can be found in the README file of the [Docker Image](https://github.com/n8n-io/n8n/blob/master/docker/images/n8n/README.md).

In case you run into issues, check out the [troubleshooting](troubleshooting.md) page or ask for help in the community [forum](https://community.n8n.io/).
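As an alternative to mounting a host folder, a named Docker volume also keeps the data across container restarts. This is a minimal sketch using only standard Docker commands; the volume name `n8n_data` is arbitrary:

```bash
# Create a named volume once
docker volume create n8n_data

# Run n8n with the volume mounted at the same path as above
docker run -it --rm \
  --name n8n \
  -p 5678:5678 \
  -v n8n_data:/root/.n8n \
  n8nio/n8n
```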
21.522727
154
0.697994
eng_Latn
0.94677
0c3428463d73bf53f20168319437b06fe55edda0
84
md
Markdown
README.md
wzmemo/Starting_Regression_STA3155
481fbfb96b17c9bc3ba6199e1eba68600ab85d49
[ "Apache-2.0" ]
null
null
null
README.md
wzmemo/Starting_Regression_STA3155
481fbfb96b17c9bc3ba6199e1eba68600ab85d49
[ "Apache-2.0" ]
null
null
null
README.md
wzmemo/Starting_Regression_STA3155
481fbfb96b17c9bc3ba6199e1eba68600ab85d49
[ "Apache-2.0" ]
null
null
null
# Starting Regression STA3155

Solving simple linear regression for class (STA 3155)
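Since the repository is about simple linear regression, here is a generic Python sketch of the least-squares formulas involved. It is illustrative only, is not code from this repository (whose implementation isn't shown here), and the data points are made up:

```python
import numpy as np

# Toy data: x is the predictor, y is the response
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Least-squares estimates for the model y = b0 + b1 * x
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

print(f"slope = {b1:.3f}, intercept = {b0:.3f}")
```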
28
53
0.821429
eng_Latn
0.935359
0c355ab945868932530e814b3b33327e0edd4482
1,310
md
Markdown
README.md
Rogiel/star-map
1883146837ebdf4de835315f09a70731961e4ec9
[ "BSD-2-Clause" ]
1
2019-05-27T01:41:37.000Z
2019-05-27T01:41:37.000Z
README.md
Rogiel/star-map
1883146837ebdf4de835315f09a70731961e4ec9
[ "BSD-2-Clause" ]
1
2020-10-11T22:26:04.000Z
2020-10-11T22:26:04.000Z
README.md
Rogiel/star-map
1883146837ebdf4de835315f09a70731961e4ec9
[ "BSD-2-Clause" ]
1
2020-10-11T22:20:42.000Z
2020-10-11T22:20:42.000Z
# Star Map

This library allows you to read StarCraft II map files from PHP. An object-oriented API is provided to browse through the metadata and the minimap image.

## Features

* Read .SC2Map files from all public game versions
* **Minimap**: Allows reading the embedded minimap image

## Installation

The recommended way of installing this library is using Composer.

    composer require "rogiel/star-map"

This library uses [php-mpq](https://github.com/Rogiel/php-mpq) to parse and extract compressed information inside the map file.

## Example

```php
use Rogiel\StarMap\Map;

// Parse the map
$map = new Map('Ruins of Seras.SC2Map');

// Get the map name in multiple locales
$documentHeader = $map->getDocumentHeader();
echo sprintf('Map name (English): %s', $documentHeader->getName()).PHP_EOL; // english is default
echo sprintf('Map name (French): %s', $documentHeader->getName('frFR')).PHP_EOL;

// Get the map size
$mapInfo = $map->getMapInfo();
$x = $mapInfo->getWidth();
$y = $mapInfo->getHeight();

echo sprintf('Map size: %sx%s', $x, $y).PHP_EOL;

// Export Minimap image as a PNG
$map->getMinimap()->toPNG('Minimap.png');
```

The output of the snippet above is the following:

```
Map name (English): Ruins of Seras
Map name (French): Ruines de Seras
Map size: 224x192
```

Have fun!
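Using only the API calls shown above, a small hypothetical batch script could export the minimap of every map in a folder. The `maps/` and `minimaps/` directory names are placeholders:

```php
<?php
use Rogiel\StarMap\Map;

require 'vendor/autoload.php';

foreach (glob('maps/*.SC2Map') as $path) {
    $map = new Map($path);

    // Name the PNG after the map's English document title
    $name = $map->getDocumentHeader()->getName();
    $map->getMinimap()->toPNG(sprintf('minimaps/%s.png', $name));
}
```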
25.686275
127
0.71374
eng_Latn
0.852352
0c35b6138719cb3c518a0ec74c5bfc712b203652
327
md
Markdown
README.md
shane-e945/TriangleSpirograph
014acb76e2e448c1af67b19c2cd15488b547fec2
[ "MIT" ]
3
2020-04-11T08:17:45.000Z
2021-09-19T20:46:19.000Z
README.md
shane-e945/TriangleSpirograph
014acb76e2e448c1af67b19c2cd15488b547fec2
[ "MIT" ]
null
null
null
README.md
shane-e945/TriangleSpirograph
014acb76e2e448c1af67b19c2cd15488b547fec2
[ "MIT" ]
1
2020-04-12T06:43:16.000Z
2020-04-12T06:43:16.000Z
# TriangleSpirograph

Triangle Spirograph Generator using Python and Python mode for Processing

Processing: https://processing.org/download/

To get Python Mode, click on the mode button in the top right corner and click "Add mode…"

![Spirograph GIF](https://github.com/shane-e945/TriangleSpirograph/blob/master/triangle.gif)
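For readers new to Python mode, here is a tiny illustrative Processing sketch in the same spirit. It is not the code from this repository, just a minimal rotating-triangle demo to confirm Python mode is working:

```python
def setup():
    size(400, 400)
    background(0)
    stroke(255, 40)  # faint white strokes accumulate into a spirograph-like pattern
    noFill()

def draw():
    translate(width / 2, height / 2)
    rotate(radians(frameCount))
    triangle(-80, 60, 80, 60, 0, -90)
```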
36.333333
92
0.801223
eng_Latn
0.507621
0c35e001ea6af2545d1eb49cea9aeb786bd73093
115,102
md
Markdown
articles/active-directory/saas-apps/workday-inbound-tutorial.md
changeworld/azure-docs.tr-tr
a6c8b9b00fe259a254abfb8f11ade124cd233fcb
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/active-directory/saas-apps/workday-inbound-tutorial.md
changeworld/azure-docs.tr-tr
a6c8b9b00fe259a254abfb8f11ade124cd233fcb
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/active-directory/saas-apps/workday-inbound-tutorial.md
changeworld/azure-docs.tr-tr
a6c8b9b00fe259a254abfb8f11ade124cd233fcb
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: 'Öğretici: Azure Active Directory ile otomatik kullanıcı sağlama için İş Gününü Yapılandırın | Microsoft Dokümanlar' description: Azure Active Directory'yi, kullanıcı hesaplarını İş Günü'ne otomatik olarak sağlamak ve sağlamadan çıkarmak için nasıl yapılandırılayarıştırmayı öğrenin. services: active-directory author: cmmdesai documentationcenter: na manager: daveba ms.assetid: 1a2c375a-1bb1-4a61-8115-5a69972c6ad6 ms.service: active-directory ms.subservice: saas-app-tutorial ms.devlang: na ms.topic: article ms.tgt_pltfrm: na ms.workload: identity ms.date: 05/16/2019 ms.author: chmutali ms.collection: M365-identity-device-management ms.openlocfilehash: d7eb01f3997ac4ab2e439c00f07990c51ec3e3d3 ms.sourcegitcommit: 2ec4b3d0bad7dc0071400c2a2264399e4fe34897 ms.translationtype: MT ms.contentlocale: tr-TR ms.lasthandoff: 03/27/2020 ms.locfileid: "80370357" --- # <a name="tutorial-configure-workday-for-automatic-user-provisioning"></a>Öğretici: Otomatik kullanıcı sağlama için İş Gününü Yapılandır Bu öğreticinin amacı, çalışma gününden hem Active Directory'ye hem de Azure Etkin Dizini'ne işçi profillerini içe aktarmanız için gerçekleştirmeniz gereken adımları ve e-posta adresinin ve kullanıcı adının İş Günü'ne isteğe bağlı olarak yazılmasıdır. ## <a name="overview"></a>Genel Bakış [Azure Active Directory kullanıcı sağlama hizmeti,](../app-provisioning/user-provisioning.md) kullanıcı hesaplarını sağlamak için [İş Günü İnsan Kaynakları API'siyle](https://community.workday.com/sites/default/files/file-hosting/productionapi/Human_Resources/v21.1/Get_Workers.html) tümleşir. Azure AD, aşağıdaki kullanıcı sağlama iş akışlarını etkinleştirmek için bu bağlantıyı kullanır: * **Kullanıcıları Active Directory'ye sağlama** - İş Günü'nden seçilen kullanıcı kümelerini bir veya daha fazla Active Directory etki alanına sağlama. * **Yalnızca bulut kullanıcıları Azure Etkin Dizini'ne sağlama** - Şirket içi Active Directory'nin kullanılmadığı senaryolarda, kullanıcılar Azure AD kullanıcı sağlama hizmetini kullanarak doğrudan İş Günü'nden Azure Etkin Dizin'ine kadar sağlanabilir. * **E-posta adresini ve kullanıcı adını İş Günü'ne geri yazın** - Azure AD kullanıcı sağlama hizmeti, e-posta adreslerini ve kullanıcı adını Azure AD'den Iş Gününe geri yazabilir. ### <a name="what-human-resources-scenarios-does-it-cover"></a>Hangi insan kaynakları senaryolarını kapsıyor? Azure AD kullanıcı sağlama hizmeti tarafından desteklenen İş Günü kullanıcı sağlama iş akışları, aşağıdaki insan kaynaklarının ve kimlik yaşam döngüsü yönetimi senaryolarının otomasyonuna olanak tanır: * **Yeni çalışanların işe alınması** - İş Günü'ne yeni bir çalışan eklendiğinde, Active Directory, Azure Active Directory ve isteğe bağlı olarak Office 365 ve [Azure AD tarafından desteklenen diğer SaaS uygulamalarında](../app-provisioning/user-provisioning.md)otomatik olarak bir kullanıcı hesabı oluşturulur ve e-posta adresinin İş Günü'ne geri yazılması yla birlikte. * **Çalışan özniteliği ve profil güncelleştirmeleri** - Bir çalışan kaydı İş Günü'nde (adı, unvanı veya yöneticisi gibi) güncelleştirildiğinde, kullanıcı hesapları Active Directory, Azure Active Directory ve isteğe bağlı olarak Office 365 ve [Azure AD tarafından desteklenen diğer SaaS uygulamalarında](../app-provisioning/user-provisioning.md)otomatik olarak güncelleştirilir. 
* **Çalışan sonlandırmaları** - Bir çalışanın İş Günü'nde sonlandırılması durumunda, kullanıcı hesabı Active Directory, Azure Active Directory ve isteğe bağlı olarak Office 365 ve [Azure AD tarafından desteklenen diğer SaaS uygulamalarında](../app-provisioning/user-provisioning.md)otomatik olarak devre dışı bırakılır. * **Çalışan yeniden işe alma -** Bir çalışan İş Günü'nde yeniden işe alındığında, eski hesabı otomatik olarak yeniden etkinleştirilebilir veya yeniden kullanılabilir (tercihinize bağlı olarak) Active Directory, Azure Active Directory ve isteğe bağlı olarak Office 365 ve Azure AD tarafından desteklenen diğer [SaaS uygulamalarına.](../app-provisioning/user-provisioning.md) ### <a name="who-is-this-user-provisioning-solution-best-suited-for"></a>Bu kullanıcı sağlama çözümü kimin için en uygun? Bu İş Günü kullanıcı sağlama çözümü aşağıdakiler için idealdir: * İş Günü kullanıcı sağlama için önceden oluşturulmuş, bulut tabanlı bir çözüm isteyen kuruluşlar * İş Günü'nden Etkin Dizin'e veya Azure Etkin Dizini'ne doğrudan kullanıcı sağlama gerektiren kuruluşlar * İş Günü HCM modülünden elde edilen veriler kullanılarak kullanıcıların sağlanmasını gerektiren kuruluşlar (bkz. [Get_Workers)](https://community.workday.com/sites/default/files/file-hosting/productionapi/Human_Resources/v21.1/Get_Workers.html) * Yalnızca İş Günü HCM modülünde algılanan değişiklik bilgilerine dayalı olarak bir veya daha fazla Etkin Dizin Ormanları, Etki Alanları ve OS'lere eşitlenmeleri için katılmayı, taşımayı ve kullanıcıların ayrılmasını gerektiren kuruluşlar (bkz. [Get_Workers)](https://community.workday.com/sites/default/files/file-hosting/productionapi/Human_Resources/v21.1/Get_Workers.html) * E-posta için Office 365 kullanan kuruluşlar ## <a name="solution-architecture"></a>Çözüm Mimarisi Bu bölümde, ortak karma ortamlar için uçtan uca kullanıcı sağlama çözüm mimarisi açıklanmaktadır. İlgili iki akış vardır: * **Yetkili İk Veri Akışı – İş Gününden şirket içi Aktif Dizine:** Bu akışta çalışan olaylar (Yeni İşe alımlar, aktarımlar, sonlandırmalar gibi) önce bulut İş Günü İk kiracısında oluşur ve ardından olay verileri Azure AD ve Geçici Aracı aracılığıyla şirket içi Active Directory'ye akar. Olaya bağlı olarak, AD'de oluşturma/güncelleştirme/etkinleştirme/devre dışı etme işlemlerine yol açabilir. * **E-posta ve Kullanıcı Adı Writeback Flow – şirket içi Aktif Dizinden İş Gününe:** Hesap oluşturma Etkin Dizin'de tamamlandıktan sonra Azure AD Connect aracılığıyla Azure AD ile senkronize edilir ve e-posta ve kullanıcı adı özniteliği İş Günü'ne geri yazılabilir. ![Genel Bakış](./media/workday-inbound-tutorial/wd_overview.png) ### <a name="end-to-end-user-data-flow"></a>Uçuça kullanıcı veri akışı 1. İk ekibi, Iş Günü HCM'de işçi işlemlerini (Joiners/Movers/Leavers veya New Hires/Transfers/Terminations) gerçekleştirir 2. Azure AD Sağlama Hizmeti, İş Günü İk'daki kimliklerin zamanlanmış eşitlemelerini çalıştırıyor ve şirket içi Active Directory ile eşitlenmesi için işlenmesi gereken değişiklikleri tanımlar. 3. Azure AD Sağlama Hizmeti, şirket içi Azure AD Bağlantı Sağlama Aracısını AD hesabı oluşturma/güncelleştirme/etkinleştirme/devre dışı etme işlemlerini içeren bir istek yüküyle çağırır. 4. Azure AD Bağlantı Sağlama Aracısı, AD hesabı verilerini eklemek/güncellemek için bir hizmet hesabı kullanır. 5. Azure AD Connect / AD Sync motoru, AD'deki güncelleştirmeleri çekmek için delta sync çalıştırın. 6. Etkin Dizin güncelleştirmeleri Azure Etkin Dizini ile senkronize edilir. 7. 
Workday Writeback bağlayıcısı yapılandırılırsa, kullanılan eşleşen özniteliğe bağlı olarak e-posta özniteliğini ve kullanıcı adını Workday'e geri yazar. ## <a name="planning-your-deployment"></a>Dağıtımınızı planlama İş Günü tümleştirmenize başlamadan önce, aşağıdaki ön koşulları kontrol edin ve geçerli Active Directory mimariniz ve kullanıcı sağlama gereksinimleriniz Azure Active Directory tarafından sağlanan çözüm(ler) ile nasıl eşleşip eşleştirilemeye ilişkin aşağıdaki kılavuzu okuyun. Çalışma sayfalarını planlamayı içeren kapsamlı bir [dağıtım planı,](https://docs.microsoft.com/azure/active-directory/fundamentals/active-directory-deployment-plans) İş Günü tümleştirme iş ortağınız ve İk paydaşlarınızla işbirliği nizde size yardımcı olmak için de kullanılabilir. Bu bölümde planlamanın aşağıdaki yönleri yer almaktadır: * [Ön koşullar](#prerequisites) * [Dağıtmak için sağlama bağlayıcı sıyrık uygulamalarını seçme](#selecting-provisioning-connector-apps-to-deploy) * [Azure AD Bağlantı Sağlama Aracısı'nın dağıtımını planlama](#planning-deployment-of-azure-ad-connect-provisioning-agent) * [Birden çok Active Directory etki alanıyla tümleştirme](#integrating-with-multiple-active-directory-domains) * [Active Directory Kullanıcı Öznitelik Eşleme ve Dönüşümler Için İş Günü Planlama](#planning-workday-to-active-directory-user-attribute-mapping-and-transformations) ### <a name="prerequisites"></a>Ön koşullar Bu öğreticide özetlenen senaryo, zaten aşağıdaki öğelere sahip olduğunuzu varsayar: * İş Günü'nden kaynaklanacak ve şirket içi Active Directory veya Azure Active Directory'de sağlanacak her kullanıcı için geçerli bir Azure AD Premium P1 veya daha yüksek abonelik lisansı. * Sağlama aracısını yapılandırmak için Azure AD global yönetici erişimi * Test ve tümleştirme amacıyla Bir İş Günü uygulama kiracı * Sistem tümleştirme kullanıcısı oluşturmak ve test amacıyla çalışan verilerini sınamak için değişiklik yapmak için İş Günü'nde yönetici izinleri * Active Directory'ye kullanıcı sağlama için, .NET 4.7.1+ çalışma süresine sahip Windows Server 2012 veya daha büyük bir sunucunun [şirket içi sağlama aracısını](https://go.microsoft.com/fwlink/?linkid=847801) barındırması gerekir * Kullanıcıları Active Directory ve Azure AD arasında senkronize etmek için [Azure AD Connect](../hybrid/whatis-hybrid-identity.md) ### <a name="selecting-provisioning-connector-apps-to-deploy"></a>Dağıtmak için sağlama bağlayıcı sıyrık uygulamalarını seçme Azure AD, İş Günü ve Etkin Dizin arasında iş akışı sağlamayı kolaylaştırmak için Azure AD uygulama galerisinden ekleyebileceğiniz birden çok kullanılabilir bağlayıcı uygulama sağlar: ![Azure AD Uygulama Galerisi](./media/workday-inbound-tutorial/wd_gallery.png) * **Active Directory User Provisioning için Iş günü** - Bu uygulama, Kullanıcı hesabının İş Günü'nden tek bir Active Directory etki alanına sağlanmasını kolaylaştırır. Birden çok etki alanınız varsa, sağlamanız gereken her Active Directory etki alanı için Azure AD uygulama galerisinden bu uygulamanın bir örneğini ekleyebilirsiniz. * **Azure AD Kullanıcı Sağlama** Için İş Günü - Azure AD Connect, Active Directory kullanıcılarını Azure Etkin Dizini ile senkronize etmek için kullanılması gereken bir araç olsa da, bu uygulama Yalnızca Bulut Kullanıcılarının İş Günü'nden tek bir Azure Etkin Dizin kiracısına sağlanmasını kolaylaştırmak için kullanılabilir. * **İş Günü Writeback** - Bu uygulama, Kullanıcının e-posta adreslerinin Azure Active Directory'den İş Günü'ne geri yazılmasını kolaylaştırır. 
> [!TIP] > Normal "İş Günü" uygulaması, İş Günü ve Azure Etkin Dizini arasında tek oturum açma ayarlamak için kullanılır. Hangi İş Günü sağlama uygulamalarının senaryonuzla alakalı olduğunu belirlemek için aşağıdaki karar akış şemasını kullanın. ![Karar Akış Şeması](./media/workday-inbound-tutorial/wday_app_flowchart.png "Karar Akış Şeması") Bu öğreticinin ilgili bölümüne gitmek için içindekiler tablosunu kullanın. ### <a name="planning-deployment-of-azure-ad-connect-provisioning-agent"></a>Azure AD Bağlantı Sağlama Aracısı'nın dağıtımını planlama > [!NOTE] > Bu bölüm yalnızca İş Gününü Active Directory Kullanıcı Sağlama Uygulamasına dağıtmayı planlıyorsanız geçerlidir. İş Günü Yazma Veya Çalışma Günü'nü Azure AD Kullanıcı Sağlama Uygulamasına dağıtıyorsanız bunu atlayabilirsiniz. AD Kullanıcı Sağlama çözümü için İş Günü, windows 2012 R2 veya daha fazla çalıştıran sunucularda en az 4 GB RAM ve .NET 4.7.1+ çalışma süresiyle bir veya daha fazla Sağlama Aracısı dağıtmayı gerektirir. Provizyon Aracısını yüklemeden önce aşağıdaki hususlar göz önünde bulundurulmalıdır: * Sağlama Aracısını çalıştıran ana bilgisayar sunucusunun hedef AD etki alanına ağ erişimine sahip olduğundan emin olun * Sağlama Aracısı Yapılandırma Sihirbazı aracıyı Azure AD kiracınıza kaydeder ve kayıt işlemi TLS bağlantı noktası 443 üzerinden *.msappproxy.net erişim gerektirir. Bu iletişimi sağlayan giden güvenlik duvarı kurallarının yerinde olduğundan emin olun. Aracı [giden HTTPS proxy yapılandırmasını](#how-do-i-configure-the-provisioning-agent-to-use-a-proxy-server-for-outbound-http-communication)destekler. * Sağlama Aracısı, şirket içi AD etki alanı(lar) ile iletişim kurmak için bir hizmet hesabı kullanır. Aracının yüklenmesinden önce, etki alanı yöneticisi izinleri ve süresi dolmayan bir parola içeren bir hizmet hesabı oluşturmanız önerilir. * Sağlama Aracısı yapılandırması sırasında, sağlama isteklerini işlemesi gereken etki alanı denetleyicilerini seçebilirsiniz. Coğrafi olarak dağıtılmış birkaç etki alanı denetleyiciniz varsa, uçtan uca çözümün güvenilirliğini ve performansını artırmak için Provisioning Agent'ı tercih ettiğiniz etki alanı denetleyicisi(ler) ile aynı sitede kurun * Yüksek kullanılabilirlik için, birden fazla Sağlama Aracısı dağıtabilir ve aynı şirket içi AD etki alanlarını işlemek için kaydedebilirsiniz. > [!IMPORTANT] > Üretim ortamlarında Microsoft, yüksek kullanılabilirlik için Azure AD kiracınızla yapılandırılan en az 3 Sağlama Aracınız olduğunu önerir. ### <a name="integrating-with-multiple-active-directory-domains"></a>Birden çok Active Directory etki alanıyla tümleştirme > [!NOTE] > Bu bölüm yalnızca İş Gününü Active Directory Kullanıcı Sağlama Uygulamasına dağıtmayı planlıyorsanız geçerlidir. İş Günü Yazma Veya Çalışma Günü'nü Azure AD Kullanıcı Sağlama Uygulamasına dağıtıyorsanız bunu atlayabilirsiniz. Active Directory topolojinize bağlı olarak, yapılandırmanız için Kullanıcı Sağlama BağlayıcıSı Uygulamalarının sayısına ve Sağlama Aracılarının sayısına karar vermeniz gerekir. Aşağıda, dağıtımınızı planlarken başvurabileceğiniz yaygın dağıtım desenlerinden bazıları listelenmiştir. #### <a name="deployment-scenario-1--single-workday-tenant---single-ad-domain"></a>Dağıtım Senaryosu #1 : Tek İş Günü Kiracı -> Tek AD etki alanı Bu senaryoda, bir İş günü kiracınız vardır ve kullanıcıları tek bir hedef AD etki alanına sağlamak istersiniz. Aşağıda, bu dağıtım için önerilen üretim yapılandırması ve önerilmesi önerilir. | | | | - | - | | Hayır. 
| Requirement | Recommendation |
| - | - |
| No. of provisioning agents to deploy on-premises | 3 (for high availability and failover) |
| No. of Workday to AD User Provisioning apps to configure in the Azure portal | 1 |

![Scenario 1](./media/workday-inbound-tutorial/dep_scenario1.png)

#### <a name="deployment-scenario-2--single-workday-tenant---multiple-child-ad-domains"></a>Deployment scenario #2: Single Workday tenant -> multiple child AD domains

This scenario involves provisioning users from Workday to multiple target child AD domains in a forest. Below is the recommended production configuration for this deployment.

| Requirement | Recommendation |
| - | - |
| No. of provisioning agents to deploy on-premises | 3 (for high availability and failover) |
| No. of Workday to AD User Provisioning apps to configure in the Azure portal | one app per child domain |

![Scenario 2](./media/workday-inbound-tutorial/dep_scenario2.png)

#### <a name="deployment-scenario-3--single-workday-tenant---disjoint-ad-forests"></a>Deployment scenario #3: Single Workday tenant -> disjoint AD forests

This scenario involves provisioning users from Workday to domains in disjoint AD forests. Below is the recommended production configuration for this deployment.

| Requirement | Recommendation |
| - | - |
| No. of provisioning agents to deploy on-premises | 3 per disjoint AD forest |
| No. of Workday to AD User Provisioning apps to configure in the Azure portal | one app per child domain |

![Scenario 3](./media/workday-inbound-tutorial/dep_scenario3.png)

### <a name="planning-workday-to-active-directory-user-attribute-mapping-and-transformations"></a>Planning Workday to Active Directory user attribute mapping and transformations

> [!NOTE]
> This section applies only if you plan to deploy the Workday to Active Directory User Provisioning app. You can skip it if you are deploying the Workday Writeback or Workday to Azure AD User Provisioning app.

Before you configure user provisioning to an Active Directory domain, consider the following questions. The answers to these questions determine how your scoping filters and attribute mappings need to be set.

* **Which users in Workday need to be provisioned to this Active Directory forest?**
  * *Example: users whose Workday "Company" attribute contains the value "Contoso" and whose "Worker_Type" attribute contains "Regular"*
* **How are users routed into different organizational units (OUs)?**
  * *Example: users are routed to OUs corresponding to an office location, as defined by the Workday "Municipality" and "Country_Region_Reference" attributes*
* **How should the following attributes be populated in Active Directory?**
  * Common Name (cn)
    * *Example: use the Workday User_ID value, as determined by HR*
  * Employee ID (employeeId)
    * *Example: use the Workday Worker_ID value*
  * SAM Account Name (sAMAccountName)
    * *Example: use the Workday User_ID value, filtered by an Azure AD provisioning expression that removes invalid characters*
  * User Principal Name (userPrincipalName)
    * *Example: use the Workday User_ID value, with an Azure AD provisioning expression that appends a domain name*
* **How should users be matched between Workday and Active Directory?**
  * *Example: users with a specific Workday "Worker_ID" value are matched with Active Directory users where "employeeID" has the same value.*
    *If a Worker_ID value is not found in Active Directory, create a new user.*
* **Does the Active Directory forest already contain the user IDs required for the matching logic to work?**
  * *Example: if this setup is a brand new Workday deployment, it is recommended to pre-populate Active Directory with the correct Workday Worker_ID values (or the unique ID values of your choice) to keep the matching logic as simple as possible.*

How to set up and configure these individual provisioning connector apps is the topic of the remaining sections of this tutorial. Which apps you choose to configure depends on which systems you need to provision to, and on how many Active Directory domains and Azure AD tenants are in your environment.

## <a name="configure-integration-system-user-in-workday"></a>Configure an integration system user in Workday

A common requirement of all the Workday provisioning connectors is that they require the credentials of a Workday integration system user to connect to the Workday Human Resources API. This section describes how to create an integration system user in Workday and has the following sections:

* [Creating an integration system user](#creating-an-integration-system-user)
* [Creating an integration security group](#creating-an-integration-security-group)
* [Configuring domain security policy permissions](#configuring-domain-security-policy-permissions)
* [Configuring business process security policy permissions](#configuring-business-process-security-policy-permissions)
* [Activating security policy changes](#activating-security-policy-changes)

> [!NOTE]
> It is possible to skip this procedure and instead use a Workday global administrator account as the system integration account. This may work fine for demos, but it is not recommended for production deployments.

### <a name="creating-an-integration-system-user"></a>Creating an integration system user

**To create an integration system user:**

1. Sign in to your Workday tenant using an administrator account. In the **Workday Application**, enter "create user" in the search box, and then click **Create Integration System User**.

   ![Create user](./media/workday-inbound-tutorial/wd_isu_01.png "Create user")

2. Complete the **Create Integration System User** task by supplying a user name and password for the new integration system user.
   * Leave the **Require New Password at Next Sign In** option unchecked, because this user will be signing in programmatically.
   * Leave **Session Timeout Minutes** at its default value of 0, which prevents the user's sessions from timing out prematurely.
   * Select the option **Do Not Allow UI Sessions**, because it provides an added layer of security that prevents a user with the password of the integration system account from signing in to Workday.

   ![Create Integration System User](./media/workday-inbound-tutorial/wd_isu_02.png "Create Integration System User")

### <a name="creating-an-integration-security-group"></a>Creating an integration security group

In this step, you create an unconstrained or constrained integration system security group in Workday and assign the integration system user created in the previous step to this group.

**To create a security group:**

1. Enter "create security group" in the search box, and then click **Create Security Group**.
   ![Create Security Group](./media/workday-inbound-tutorial/wd_isu_03.png "Create Security Group")

2. Complete the **Create Security Group** task.
   * There are two types of security groups in Workday:
     * **Unconstrained:** All members of the security group can access all data instances secured by the security group.
     * **Constrained:** All security group members have contextual access to a subset of data instances (rows) that the security group can access.
   * Please check with your Workday integration partner to select the appropriate security group type for the integration.
   * Once you know the group type, select **Integration System Security Group (Unconstrained)** or **Integration System Security Group (Constrained)** from the **Type of Tenanted Security Group** dropdown.

   ![Create Security Group](./media/workday-inbound-tutorial/wd_isu_04.png "Create Security Group")

3. After the security group is created successfully, you will see a page where you can assign members to the security group. Add the new integration system user created in the previous step to this security group. If you are using a *constrained* security group, you will also need to select the appropriate organization scope.

   ![Edit Security Group](./media/workday-inbound-tutorial/wd_isu_05.png "Edit Security Group")

### <a name="configuring-domain-security-policy-permissions"></a>Configuring domain security policy permissions

In this step, you grant "domain security" policy permissions for worker data to the security group.

**To configure domain security policy permissions:**

1. Enter **Domain Security Configuration** in the search box, and then click the link **Domain Security Configuration Report**.

   ![Domain Security Policies](./media/workday-inbound-tutorial/wd_isu_06.png "Domain Security Policies")

2. In the **Domain** text box, search for the following domains and add them to the filter one by one.
   * *External Account Provisioning*
   * *Worker Data: Public Worker Reports*
   * *Person Data: Work Contact Information*
   * *Worker Data: All Positions*
   * *Worker Data: Current Staffing Information*
   * *Worker Data: Business Title on Worker Profile*
   * *Workday Accounts*

   ![Domain Security Policies](./media/workday-inbound-tutorial/wd_isu_07.png "Domain Security Policies")

   ![Domain Security Policies](./media/workday-inbound-tutorial/wd_isu_08.png "Domain Security Policies")

   Click **OK**.

3. In the report that appears, select the ellipsis (...) next to **External Account Provisioning** and click the menu option **Domain > Edit Security Policy Permissions**.

   ![Domain Security Policies](./media/workday-inbound-tutorial/wd_isu_09.png "Domain Security Policies")

4. On the **Edit Domain Security Policy Permissions** page, scroll down to the **Integration Permissions** section.

   ![Edit Permissions](./media/workday-inbound-tutorial/wd_isu_10.png "Edit Permissions")

5. Click the "+" sign to add the integration system group to the list of security groups with **Get** and **Put** integration permissions.

   ![Edit Permissions](./media/workday-inbound-tutorial/wd_isu_11.png "Edit Permissions")
6. Repeat steps 3-5 above for each of these remaining security policies:

   | Operation | Domain Security Policy |
   | ---------- | ---------- |
   | Get and Put | Worker Data: Public Worker Reports |
   | Get and Put | Person Data: Work Contact Information |
   | Get | Worker Data: All Positions |
   | Get | Worker Data: Current Staffing Information |
   | Get | Worker Data: Business Title on Worker Profile |
   | Get and Put | Workday Accounts |

### <a name="configuring-business-process-security-policy-permissions"></a>Configuring business process security policy permissions

In this step, you grant "business process security" policy permissions for worker data to the security group. This step is required to set up the Workday Writeback app connector.

**To configure business process security policy permissions:**

1. Enter **Business Process Policy** in the search box, and then click the link for the **Edit Business Process Security Policy** task.

   ![Business Process Security Policies](./media/workday-inbound-tutorial/wd_isu_12.png "Business Process Security Policies")

2. In the **Business Process Type** text box, search for *Contact*, select the **Contact Change** business process, and then click **OK**.

   ![Business Process Security Policies](./media/workday-inbound-tutorial/wd_isu_13.png "Business Process Security Policies")

3. On the **Edit Business Process Security Policy** page, scroll to the **Maintain Contact Information (Web Service)** section.

   ![Business Process Security Policies](./media/workday-inbound-tutorial/wd_isu_14.png "Business Process Security Policies")

4. Select and add the new integration system security group to the list of security groups that can initiate the web services request. Click **Done**.

   ![Business Process Security Policies](./media/workday-inbound-tutorial/wd_isu_15.png "Business Process Security Policies")

### <a name="activating-security-policy-changes"></a>Activating security policy changes

**To activate the security policy changes:**

1. Enter "activate" in the search box, and then click the link **Activate Pending Security Policy Changes**.

   ![Activate](./media/workday-inbound-tutorial/wd_isu_16.png "Activate")

1. Begin the Activate Pending Security Policy Changes task by entering a comment for auditing purposes, and then click **OK**.

1. Complete the task on the next screen by checking the **Confirm** checkbox, and then click **OK**.

   ![Activate Pending Security](./media/workday-inbound-tutorial/wd_isu_18.png "Activate Pending Security")

## <a name="configuring-user-provisioning-from-workday-to-active-directory"></a>Configuring user provisioning from Workday to Active Directory

This section provides the steps for user account provisioning from Workday to each Active Directory domain within the scope of your integration.
* [Part 1: Add the provisioning connector app and download the provisioning agent](#part-1-add-the-provisioning-connector-app-and-download-the-provisioning-agent)
* [Part 2: Install and configure on-premises provisioning agent(s)](#part-2-install-and-configure-on-premises-provisioning-agents)
* [Part 3: In the provisioning app, configure connectivity to Workday and Active Directory](#part-3-in-the-provisioning-app-configure-connectivity-to-workday-and-active-directory)
* [Part 4: Configure attribute mappings](#part-4-configure-attribute-mappings)
* [Enable and launch user provisioning](#enable-and-launch-user-provisioning)

### <a name="part-1-add-the-provisioning-connector-app-and-download-the-provisioning-agent"></a>Part 1: Add the provisioning connector app and download the provisioning agent

**To configure Workday to Active Directory provisioning:**

1. Go to <https://portal.azure.com>.
2. In the Azure portal, search for and select **Azure Active Directory**.
3. Select **Enterprise Applications**, then **All Applications**.
4. Select **Add an application**, and select the **All** category.
5. Search for **Workday to Active Directory User Provisioning**, and add that app from the gallery.
6. After the app is added and the app details screen is shown, select **Provisioning**.
7. Change the **Provisioning Mode** to **Automatic**.
8. Click the information banner displayed to download the provisioning agent.

   ![Download Agent](./media/workday-inbound-tutorial/pa-download-agent.png "Download Agent Screen")

### <a name="part-2-install-and-configure-on-premises-provisioning-agents"></a>Part 2: Install and configure on-premises provisioning agent(s)

To provision to Active Directory on-premises, the provisioning agent must be installed on a server that has the .NET 4.7.1+ Framework and network access to the desired Active Directory domain(s).

> [!TIP]
> You can check the version of the .NET Framework on your server using the instructions provided [here](https://docs.microsoft.com/dotnet/framework/migration-guide/how-to-determine-which-versions-are-installed).
> If the server does not have .NET 4.7.1 or higher installed, you can download it from [here](https://support.microsoft.com/help/4033342/the-net-framework-4-7-1-offline-installer-for-windows).

Transfer the downloaded agent installer to the server host and follow the steps below to complete the agent configuration.

1. Sign in to the Windows Server where you want to install the new agent.

1. Launch the provisioning agent installer, agree to the terms, and click the **Install** button.

   ![Install Screen](./media/workday-inbound-tutorial/pa_install_screen_1.png "Install Screen")

1. After installation is complete, the wizard will launch and you will see the **Connect Azure AD** screen. Click the **Authenticate** button to connect to your Azure AD instance.

   ![Connect Azure AD](./media/workday-inbound-tutorial/pa_install_screen_2.png "Connect Azure AD")

1. Authenticate to your Azure AD instance using global administrator credentials.

   ![Admin Auth](./media/workday-inbound-tutorial/pa_install_screen_3.png "Admin Auth")

   > [!NOTE]
   > The Azure AD admin credentials are used only to connect to your Azure AD tenant. The agent does not store the credentials locally on the server.

1. After successful authentication with Azure AD, you will see the **Connect Active Directory** screen. In this step, enter your AD domain name and click the **Add Directory** button.

   ![Add Directory](./media/workday-inbound-tutorial/pa_install_screen_4.png "Add Directory")
1. You will now be prompted to enter the credentials required to connect to the AD domain. On the same screen, you can use the **Select domain controller priority** option to specify the domain controllers that the agent should use for sending provisioning requests.

   ![Domain Credentials](./media/workday-inbound-tutorial/pa_install_screen_5.png)

1. After the domain is configured, the installer displays a list of configured domains. On this screen, you can repeat steps #5 and #6 to add more domains, or click **Next** to proceed to agent registration.

   ![Configured Domains](./media/workday-inbound-tutorial/pa_install_screen_6.png "Configured Domains")

   > [!NOTE]
   > If you have multiple AD domains (for example, na.contoso.com, emea.contoso.com), please add each domain individually to the list.
   > Adding only the parent domain (for example, contoso.com) is not sufficient. You must register each child domain with the agent.

1. Review the configuration details and click **Confirm** to register the agent.

   ![Confirm Screen](./media/workday-inbound-tutorial/pa_install_screen_7.png "Confirm Screen")

1. The configuration wizard displays the progress of the agent registration.

   ![Agent Registration](./media/workday-inbound-tutorial/pa_install_screen_8.png "Agent Registration")

1. Once the agent registration is successful, you can click **Exit** to close the wizard.

   ![Exit Screen](./media/workday-inbound-tutorial/pa_install_screen_9.png "Exit Screen")

1. Verify the installation of the agent and make sure it is running by opening the "Services" snap-in and looking for the service named "Microsoft Azure AD Connect Provisioning Agent".

   ![Services](./media/workday-inbound-tutorial/services.png)

### <a name="part-3-in-the-provisioning-app-configure-connectivity-to-workday-and-active-directory"></a>Part 3: In the provisioning app, configure connectivity to Workday and Active Directory

In this step, we establish connectivity with Workday and Active Directory in the Azure portal.

1. In the Azure portal, go back to the Workday to Active Directory User Provisioning app created in [Part 1](#part-1-add-the-provisioning-connector-app-and-download-the-provisioning-agent).
1. Complete the **Admin Credentials** section as follows:

   * **Admin Username** – Enter the username of the Workday integration system account, with the tenant domain name appended. It should look something like: **username\@tenant_name**
   * **Admin password –** Enter the password of the Workday integration system account.
   * **Tenant URL –** Enter the URL to the Workday web services endpoint for your tenant. This value should look like: https://wd3-impl-services1.workday.com/ccx/service/contoso4, where *contoso4* is replaced with your correct tenant name and *wd3-impl* is replaced with the correct environment string.
   * **Active Directory Forest -** The "Name" of your Active Directory domain, as registered with the agent. Use the dropdown to select the target domain for provisioning. This value is typically a string like: *contoso.com*
   * **Active Directory Container -** Enter the container DN where the agent should create user accounts by default. Example: *OU=Standard Users,OU=Users,DC=contoso,DC=test*

     > [!NOTE]
     > This setting only comes into play for user account creations if the *parentDistinguishedName* attribute is not configured in the attribute mappings. It is not used for user search or update operations. The entire domain sub-tree falls within the scope of the search operation.
   * **Notification Email –** Enter your email address, and check the "Send email if failure occurs" checkbox.

     > [!NOTE]
     > The Azure AD provisioning service sends an email notification if the provisioning job goes into a [quarantine](https://docs.microsoft.com/azure/active-directory/manage-apps/application-provisioning-quarantine-status) state.

   * Click the **Test Connection** button. If the connection test succeeds, click the **Save** button at the top. If it fails, double-check that the Workday credentials and the AD credentials configured in the agent setup are valid.

     ![Azure portal](./media/workday-inbound-tutorial/wd_1.png)

   * Once the credentials are saved successfully, the **Mappings** section displays the default mapping **Synchronize Workday Workers to On Premises Active Directory**.

### <a name="part-4-configure-attribute-mappings"></a>Part 4: Configure attribute mappings

In this section, you will configure how user data flows from Workday to Active Directory.

1. On the Provisioning tab under **Mappings**, click **Synchronize Workday Workers to On Premises Active Directory**.
1. In the **Source Object Scope** field, you can select which sets of users in Workday should be in scope for provisioning to AD, by defining a set of attribute-based filters. The default scope is "all users in Workday". Example filters:

   * Example: Scope to users with Worker IDs between 1000000 and 2000000 (excluding 2000000)
     * Attribute: WorkerID
     * Operator: REGEX Match
     * Value: (1[0-9][0-9][0-9][0-9][0-9][0-9])
   * Example: Only employees and not contingent workers
     * Attribute: EmployeeID
     * Operator: IS NOT NULL

   > [!TIP]
   > When you are configuring the provisioning app for the first time, you will need to test and verify your attribute mappings and expressions to make sure that they give you the desired result. Microsoft recommends using the scoping filters under **Source Object Scope** to test your mappings with a few test users from Workday. Once you have verified that the mappings work, you can either remove the filter or gradually expand it to include more users.

   > [!CAUTION]
   > The default behavior of the provisioning engine is to disable/delete users that go out of scope. This may not be desirable in your Workday to AD integration. To override this default behavior, refer to the article [Skip deletion of user accounts that go out of scope](../app-provisioning/skip-out-of-scope-deletions.md).

1. In the **Target Object Actions** field, you can globally filter which actions are performed on Active Directory. **Create** and **Update** are the most common.
1. In the **Attribute mappings** section, you can define how individual Workday attributes map to Active Directory attributes.
1. Click on an existing attribute mapping to update it, or click **Add new mapping** at the bottom of the screen to add new mappings. An individual attribute mapping supports these properties:

   * **Mapping Type**
     * **Direct** – Writes the value of the Workday attribute to the AD attribute, with no changes
     * **Constant** - Writes a static, constant string value to the AD attribute
     * **Expression** – Allows you to write a custom value to the AD attribute, based on one or more Workday attributes. [For more info, see this article on expressions](../app-provisioning/functions-for-customizing-application-data.md).
   * **Source attribute** - The user attribute from Workday. If the attribute you are looking for is not present, see [Customizing the list of Workday user attributes](#customizing-the-list-of-workday-user-attributes).
   * **Default value** – Optional. If the source attribute has an empty value, the mapping will write this value instead. The most common configuration is to leave this blank.
   * **Target attribute** – The user attribute in Active Directory.
   * **Match objects using this attribute** – Whether this mapping should be used to uniquely identify users between Workday and Active Directory. This value is typically set on the Worker ID field for Workday, which is typically mapped to one of the Employee ID attributes in Active Directory.
   * **Matching precedence** – Multiple matching attributes can be set. When there are multiple, they are evaluated in the order defined by this field. As soon as a match is found, no further matching attributes are evaluated.
   * **Apply this mapping**
     * **Always** – Apply this mapping on both user creation and update actions
     * **Only during creation** - Apply this mapping only on user creation actions

1. To save your mappings, click **Save** at the top of the Attribute-Mapping section.

   ![Azure portal](./media/workday-inbound-tutorial/wd_2.png)

#### <a name="below-are-some-example-attribute-mappings-between-workday-and-active-directory-with-some-common-expressions"></a>Below are some example attribute mappings between Workday and Active Directory, with some common expressions

* The expression that maps to the *parentDistinguishedName* attribute is used to provision a user to different OUs based on one or more Workday source attributes. The example here places users in different OUs based on which city they are in.
* The *userPrincipalName* attribute in Active Directory is generated using the [SelectUniqueValue](../app-provisioning/functions-for-customizing-application-data.md#selectuniquevalue) de-duplication function, which checks for the existence of a generated value in the target AD domain and only sets it if it is unique.
* [Documentation on writing expressions is available here](../app-provisioning/functions-for-customizing-application-data.md). That section includes examples on how to remove special characters.

| WORKDAY ATTRIBUTE | ACTIVE DIRECTORY ATTRIBUTE | MATCHING ID? | CREATE / UPDATE |
| ---------- | ---------- | ---------- | ---------- |
| **WorkerID** | employeeID | **Yes** | Written on create only |
| **PreferredNameData** | cn | | Written on create only |
| **SelectUniqueValue( Join("@", [FirstName], "contoso.com"), Join("@", Join("", Mid([FirstName], 1, 1), [LastName]), "contoso.com"), Join("@", Join("", Mid([FirstName], 1, 2), [LastName]), "contoso.com"))** | userPrincipalName | | Written on create only |
| **Replace(Mid(Replace([UserID], , "([\\/\\\\\\[\\]\\:\\;\\\|\\=\\,\\+\\*\\?\\<\\>])", , "", , ), 1, 20), , "(\\.)*$", , "", , )** | sAMAccountName | | Written on create only |
*$)", , "", , )** | sAMAccountName | | Yalnızca oluşturma üzerine yazıldı | | **Switch(\[\]Etkin , , "0", "Doğru", "1", "False")** | hesapDevre Dışı | | Oluşturma + güncelleme | | **FirstName** | givenName | | Oluşturma + güncelleme | | **Soyadı** | sn | | Oluşturma + güncelleme | | **Tercih Edilen Ad Verileri** | displayName | | Oluşturma + güncelleme | | **Şirket** | şirket | | Oluşturma + güncelleme | | **Denetleyici Organizasyon** | bölüm | | Oluşturma + güncelleme | | **Yönetici Referans** | manager | | Oluşturma + güncelleme | | **BusinessTitle** | başlık | | Oluşturma + güncelleme | | **AdresLineData** | Streetaddress | | Oluşturma + güncelleme | | **Belediyesi** | l | | Oluşturma + güncelleme | | **ÜlkeReferansİki Mektubu** | Co | | Oluşturma + güncelleme | | **ÜlkeReferansİki Mektubu** | c | | Oluşturma + güncelleme | | **CountryRegionReference** | st | | Oluşturma + güncelleme | | **Çalışma AlanıBaşvurusu** | physicalDeliveryOfficeName | | Oluşturma + güncelleme | | **Posta Kodu** | Postakodu | | Oluşturma + güncelleme | | **Birincil Çalışma Telefonu** | Telephonenumber | | Oluşturma + güncelleme | | **Faks** | faksTelefon Numarası | | Oluşturma + güncelleme | | **Mobil** | mobil | | Oluşturma + güncelleme | | **Yerel Referans** | tercihDil | | Oluşturma + güncelleme | | **Switch\[(\]Belediye , "OU = Standart Kullanıcılar,OU=Kullanıcılar,OU=Default,OU=Locations,DC=contoso,DC=com", "Dallas", "OU=Standard Users,OU=Users,OU=Dallas,OU=Locations,DC=contoso,DC=com", "Austin", "OU=Standart Kullanıcılar,OU=Kullanıcılar,OU=Austin,Ou=Locations,DC=com", "Seattle", "OU=Standard Users,OU=Users,OU=Seattle,OU=Locations,DC=contoso,DC=com", "London", "OU=Standard Users,OU=London,OU=Locations,DC=contoso,DC=com")** | parentDistinguishedName | | Oluşturma + güncelleme | Öznitelik eşleme yapılandırmanız tamamlandıktan sonra, [artık kullanıcı sağlama hizmetini etkinleştirebilir ve başlatabilirsiniz.](#enable-and-launch-user-provisioning) ## <a name="configuring-user-provisioning-to-azure-ad"></a>Kullanıcı sağlamayı Azure AD olarak yapılandırma Aşağıdaki bölümlerde, yalnızca buluta yönelik dağıtımlar için İş Günü'nden Azure AD'ye kullanıcı sağlama yapılandırma adımları açıklanmaktadır. * [Azure AD sağlama bağlayıcısı uygulamasını ekleme ve İş Günü bağlantısı oluşturma](#part-1-adding-the-azure-ad-provisioning-connector-app-and-creating-the-connection-to-workday) * [İş Günü ve Azure AD öznitelik eşlemelerini yapılandırma](#part-2-configure-workday-and-azure-ad-attribute-mappings) * [Kullanıcı sağlamayı etkinleştirme ve başlatma](#enable-and-launch-user-provisioning) > [!IMPORTANT] > Yalnızca şirket içi Active Directory'ye değil, Azure AD'ye verilmesi gereken yalnızca bulut kullanıcılarınız varsa aşağıdaki yordamı uygulayın. ### <a name="part-1-adding-the-azure-ad-provisioning-connector-app-and-creating-the-connection-to-workday"></a>Bölüm 1: Azure AD sağlama bağlayıcısı uygulamasını ekleme ve İş Günü bağlantısı oluşturma **İş Günü'nü yalnızca bulut kullanıcıları için Azure Etkin Dizin sağlama için yapılandırmak için:** 1. <https://portal.azure.com> kısmına gidin. 2. Azure portalında Azure Etkin **Dizini'ni**arayın ve seçin. 3. **Kurumsal Uygulamaları**seçin, ardından **Tüm Uygulamalar**. 4. **Uygulama ekle'yi**seçin ve ardından **Tüm** ler kategorisini seçin. 5. İş **Günü'nden Azure AD sağlamasına**arama yapın ve bu uygulamayı galeriden ekleyin. 6. Uygulama eklendikten ve uygulama ayrıntıları ekranı gösterildikten **sonra, Sağlama'yı**seçin. 7. 
7. Change the **Provisioning Mode** to **Automatic**.
8. Complete the **Admin Credentials** section as follows:

   * **Admin Username** – Enter the username of the Workday integration system account, with the tenant domain name appended. It should look something like: username\@contoso4
   * **Admin password –** Enter the password of the Workday integration system account.
   * **Tenant URL –** Enter the URL to the Workday web services endpoint for your tenant. This value should look like: https://wd3-impl-services1.workday.com/ccx/service/contoso4/Human_Resources, where *contoso4* is replaced with your correct tenant name and *wd3-impl* is replaced with the correct environment string. If this URL is not known, please work with your Workday integration partner or support representative to determine the correct URL to use.
   * **Notification Email –** Enter your email address, and check the "Send email if failure occurs" checkbox.
   * Click the **Test Connection** button.
   * If the connection test succeeds, click the **Save** button at the top. If it fails, double-check that the Workday URL and credentials are valid in Workday.

### <a name="part-2-configure-workday-and-azure-ad-attribute-mappings"></a>Part 2: Configure Workday and Azure AD attribute mappings

In this section, you will configure how user data flows from Workday to Azure Active Directory for cloud-only users.

1. On the Provisioning tab under **Mappings**, click **Synchronize Workers to Azure AD**.
2. In the **Source Object Scope** field, you can select which sets of users in Workday should be in scope for provisioning to Azure AD, by defining a set of attribute-based filters. The default scope is "all users in Workday". Example filters:

   * Example: Scope to users with Worker IDs between 1000000 and 2000000
     * Attribute: WorkerID
     * Operator: REGEX Match
     * Value: (1[0-9][0-9][0-9][0-9][0-9][0-9])
   * Example: Only contingent workers and not regular employees
     * Attribute: ContingentID
     * Operator: IS NOT NULL

3. In the **Target Object Actions** field, you can globally filter which actions are performed on Azure AD. **Create** and **Update** are the most common.
4. In the **Attribute mappings** section, you can define how individual Workday attributes map to Azure AD attributes.
5. Click on an existing attribute mapping to update it, or click **Add new mapping** at the bottom of the screen to add new mappings. An individual attribute mapping supports these properties:

   * **Mapping Type**
     * **Direct** – Writes the value of the Workday attribute to the Azure AD attribute, with no changes
     * **Constant** - Writes a static, constant string value to the Azure AD attribute
     * **Expression** – Allows you to write a custom value to the Azure AD attribute, based on one or more Workday attributes. [For more info, see this article on expressions](../app-provisioning/functions-for-customizing-application-data.md).
   * **Source attribute** - The user attribute from Workday. If the attribute you are looking for is not present, see [Customizing the list of Workday user attributes](#customizing-the-list-of-workday-user-attributes).
   * **Default value** – Optional. If the source attribute has an empty value, the mapping will write this value instead. The most common configuration is to leave this blank.
   * **Target attribute** – The user attribute in Azure AD.
   * **Match objects using this attribute** – Whether this mapping should be used to uniquely identify users between Workday and Azure AD. This value is typically set on the Worker ID field for Workday, which is typically mapped to the (new) employee ID attribute or to an extension attribute in Azure AD.
   * **Matching precedence** – Multiple matching attributes can be set. When there are multiple, they are evaluated in the order defined by this field. As soon as a match is found, no further matching attributes are evaluated.
   * **Apply this mapping**
     * **Always** – Apply this mapping on both user creation and update actions
     * **Only during creation** - Apply this mapping only on user creation actions

6. To save your mappings, click **Save** at the top of the Attribute-Mapping section.

Once your attribute mapping configuration is complete, you can now [enable and launch the user provisioning service](#enable-and-launch-user-provisioning).

## <a name="configuring-azure-ad-attribute-writeback-to-workday"></a>Configuring Azure AD attribute writeback to Workday

Follow these instructions to configure writeback of user email addresses and usernames from Azure Active Directory to Workday.

* [Part 1: Adding the Writeback connector app and creating the connection to Workday](#part-1-adding-the-writeback-connector-app-and-creating-the-connection-to-workday)
* [Part 2: Configure writeback attribute mappings](#part-2-configure-writeback-attribute-mappings)
* [Enable and launch user provisioning](#enable-and-launch-user-provisioning)

### <a name="part-1-adding-the-writeback-connector-app-and-creating-the-connection-to-workday"></a>Part 1: Adding the Writeback connector app and creating the connection to Workday

**To configure the Workday Writeback connector:**

1. Go to <https://portal.azure.com>.
2. In the Azure portal, search for and select **Azure Active Directory**.
3. Select **Enterprise Applications**, then **All Applications**.
4. Select **Add an application**, and then select the **All** category.
5. Search for **Workday Writeback**, and add that app from the gallery.
6. After the app is added and the app details screen is shown, select **Provisioning**.
7. Change the **Provisioning Mode** to **Automatic**.
8. Complete the **Admin Credentials** section as follows:

   * **Admin Username** – Enter the username of the Workday integration system account, with the tenant domain name appended. It should look something like: *username\@contoso4*
   * **Admin password –** Enter the password of the Workday integration system account.
   * **Tenant URL –** Enter the URL to the Workday web services endpoint for your tenant. This value should look like: https://wd3-impl-services1.workday.com/ccx/service/contoso4/Human_Resources, where *contoso4* is replaced with your correct tenant name and *wd3-impl* is replaced with the correct environment string (if applicable).
   * **Notification Email –** Enter your email address, and check the "Send email if failure occurs" checkbox.
   * Click the **Test Connection** button. If the connection test succeeds, click the **Save** button at the top. If it fails, double-check that the Workday URL and credentials are valid in Workday.
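If the connection test keeps failing, it can help to confirm outside the portal that the tenant URL itself is reachable from your network. The snippet below is a minimal PowerShell sketch and not part of the official setup: it assumes the Workday tenant exposes its WSDL at the conventional `?wsdl` suffix, and the host, environment string, and tenant name shown are placeholders that you must replace with your own values.

```powershell
# Minimal sanity check of the Workday web services endpoint URL.
# Assumption: the tenant exposes a WSDL at the standard "?wsdl" path; replace
# the host, environment string, and tenant name with your own values.
$tenantUrl = "https://wd3-impl-services1.workday.com/ccx/service/contoso4/Human_Resources"

try {
    # An HTTP 200 with XML content indicates the endpoint is reachable and the
    # tenant/environment portion of the URL is likely correct.
    $response = Invoke-WebRequest -Uri "${tenantUrl}?wsdl" -UseBasicParsing -TimeoutSec 30
    Write-Host "Endpoint reachable. Status:" $response.StatusCode
}
catch {
    # A 404 usually points to a wrong tenant name or environment string.
    Write-Warning "Could not reach the Workday endpoint: $($_.Exception.Message)"
}
```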
### <a name="part-2-configure-writeback-attribute-mappings"></a>Bölüm 2: Writeback öznitelik eşlemelerini yapılandırma Bu bölümde, geri yazma özniteliklerinin Azure AD'den İş Günü'ne akışını nasıl yapılandıracağınız. Şu anda, bağlayıcı yalnızca E-posta adresi nin ve kullanıcı adının Workday'e yazılmasını destekler. 1. **Eşlemeler**altındaki Sağlama sekmesinde, **Azure Etkin Dizin Kullanıcılarını İş Günü'ne Senkronize Et'i**tıklatın. 2. Kaynak **Nesne Kapsamı** alanında isteğe bağlı olarak, Azure Etkin Dizini'ndeki hangi kullanıcı kümelerinin e-posta adreslerinin İş Günü'ne geri yazılması gerektiğini filtreleyebilirsiniz. Varsayılan kapsam "Azure AD'deki tüm kullanıcılar"tır. 3. **Atnitelik eşlemeleri** bölümünde, Çalışma Günü çalışan kimliğinin veya çalışan kimliğinin depolandığı Azure Etkin Dizini'ndeki özniteliği belirtmek için eşleşen kimliği güncelleştirin. Popüler bir eşleştirme yöntemi, İş Günü çalışan kimliğini veya çalışan kimliğini Azure AD'deki Attribute1-15 uzantısıyla eşitlemek ve ardından bu özniteliği İş Günü'ndeki kullanıcıları eşleştirmek için Azure AD'de kullanmaktır. 4. Genellikle Azure AD *userPrincipalName* özniteliğini İş Günü *UserID* özniteliğine göre eşlersiniz ve Azure AD posta özniteliğini İş Günü *E-posta Adresi* özniteliğiyle eşlersiniz. *mail* Eşlemelerinizi kaydetmek için, Atrit-Eşleme bölümünün üst **kısmındakaydet'i** tıklatın. Öznitelik eşleme yapılandırmanız tamamlandıktan sonra, [artık kullanıcı sağlama hizmetini etkinleştirebilir ve başlatabilirsiniz.](#enable-and-launch-user-provisioning) ## <a name="enable-and-launch-user-provisioning"></a>Kullanıcı sağlamayı etkinleştirme ve başlatma İş Günü sağlama uygulaması yapılandırmaları tamamlandıktan sonra, Azure portalındaki sağlama hizmetini açabilirsiniz. > [!TIP] > Varsayılan olarak, sağlama hizmetini açtığınızda, kapsamdaki tüm kullanıcılar için sağlama işlemleri başlatılır. Eşleme veya İş Günü veri sorunlarında hatalar varsa, sağlama işi başarısız olabilir ve karantina durumuna geçebilirsiniz. Bunu önlemek için, en iyi yöntem olarak, **Kaynak Nesne Kapsamı** filtresini yapılandırmanızı ve tüm kullanıcılar için tam eşitleme başlatmadan önce öznitelik eşlemelerinizi birkaç test kullanıcısıyla test etmenizi öneririz. Eşlemelerin işe yarayıp yaradığını doğruladıktan ve size istenen sonuçları verdikten sonra, filtreyi kaldırabilir veya daha fazla kullanıcı yı içerecek şekilde kademeli olarak genişletebilirsiniz. 1. **Sağlama** sekmesinde, Sağlama **Durumunu** **A.C.** olarak ayarlayın. 2. **Kaydet**'e tıklayın. 3. Bu işlem, İş Günü kiracısında kaç kullanıcı olduğuna bağlı olarak değişken saat sayısı alabilecek ilk eşitleme başlatılır. 4. İstediğinzaman, sağlama hizmetinin hangi eylemleri gerçekleştirdiğini görmek için Azure portalındaki **Denetim günlükleri** sekmesini kontrol edin. Denetim günlükleri, hangi kullanıcıların İş Günü dışında okunduğu ve daha sonra Etkin Dizin'e eklendiği veya güncelleştirildiği gibi, sağlama hizmeti tarafından gerçekleştirilen tüm bireysel eşitleme olaylarını listeler. Denetim günlüklerinin nasıl gözden geçirileceği ve sağlama hatalarını nasıl düzelteceğimizle ilgili yönergeler için Sorun Giderme bölümüne bakın. 5. İlk eşitleme tamamlandıktan sonra, aşağıda gösterildiği **gibi, Sağlama** sekmesine bir denetim özet raporu yazar. 
## <a name="frequently-asked-questions-faq"></a>Frequently Asked Questions (FAQ)

* **Solution capability questions**
  * [When processing a new hire from Workday, how does the solution set the password for the new user account in Active Directory?](#when-processing-a-new-hire-from-workday-how-does-the-solution-set-the-password-for-the-new-user-account-in-active-directory)
  * [Does the solution support sending email notifications after provisioning operations complete?](#does-the-solution-support-sending-email-notifications-after-provisioning-operations-complete)
  * [How do I manage delivery of passwords for new hires and securely provide a mechanism to reset their password?](#how-do-i-manage-delivery-of-passwords-for-new-hires-and-securely-provide-a-mechanism-to-reset-their-password)
  * [Does the solution cache Workday user profiles in the Azure AD cloud or at the provisioning agent layer?](#does-the-solution-cache-workday-user-profiles-in-the-azure-ad-cloud-or-at-the-provisioning-agent-layer)
  * [Does the solution support assigning on-premises AD groups to the user?](#does-the-solution-support-assigning-on-premises-ad-groups-to-the-user)
  * [Which Workday APIs does the solution use to query and update Workday worker profiles?](#which-workday-apis-does-the-solution-use-to-query-and-update-workday-worker-profiles)
  * [Can I configure my Workday HCM tenant with two Azure AD tenants?](#can-i-configure-my-workday-hcm-tenant-with-two-azure-ad-tenants)
  * [Why is the Workday to Azure AD user provisioning app not supported if we have deployed Azure AD Connect?](#why-workday-to-azure-ad-user-provisioning-app-is-not-supported-if-we-have-deployed-azure-ad-connect)
  * [How do I suggest improvements or request new features related to Workday and Azure AD integration?](#how-do-i-suggest-improvements-or-request-new-features-related-to-workday-and-azure-ad-integration)

* **Provisioning agent questions**
  * [What is the GA version of the provisioning agent?](#what-is-the-ga-version-of-the-provisioning-agent)
  * [How do I know the version of my provisioning agent?](#how-do-i-know-the-version-of-my-provisioning-agent)
  * [Does Microsoft automatically push provisioning agent updates?](#does-microsoft-automatically-push-provisioning-agent-updates)
  * [Can I install the provisioning agent on the same server running Azure AD Connect?](#can-i-install-the-provisioning-agent-on-the-same-server-running-azure-ad-connect)
  * [How do I configure the provisioning agent to use a proxy server for outbound HTTP communication?](#how-do-i-configure-the-provisioning-agent-to-use-a-proxy-server-for-outbound-http-communication)
  * [How do I ensure that the provisioning agent is able to communicate with the Azure AD tenant and that no firewalls are blocking ports required by the agent?](#how-do-i-ensure-that-the-provisioning-agent-is-able-to-communicate-with-the-azure-ad-tenant-and-no-firewalls-are-blocking-ports-required-by-the-agent)
  * [How do I de-register the domain associated with my provisioning agent?](#how-do-i-de-register-the-domain-associated-with-my-provisioning-agent)
  * [How do I uninstall the provisioning agent?](#how-do-i-uninstall-the-provisioning-agent)

* **Workday to AD attribute mapping and configuration questions**
  * [How do I back up or export a working copy of my Workday provisioning attribute mapping and schema?](#how-do-i-back-up-or-export-a-working-copy-of-my-workday-provisioning-attribute-mapping-and-schema)
  * [I have custom attributes in Workday and Active Directory. How do I configure the solution to work with my custom attributes?](#i-have-custom-attributes-in-workday-and-active-directory-how-do-i-configure-the-solution-to-work-with-my-custom-attributes)
  * [Can I provision the user's photo from Workday to Active Directory?](#can-i-provision-users-photo-from-workday-to-active-directory)
  * [How do I sync mobile numbers from Workday based on user consent for public usage?](#how-do-i-sync-mobile-numbers-from-workday-based-on-user-consent-for-public-usage)
  * [How do I format display names in AD based on the user's department/country/city attributes and handle regional variances?](#how-do-i-format-display-names-in-ad-based-on-the-users-departmentcountrycity-attributes-and-handle-regional-variances)
  * [How can I use SelectUniqueValue to generate unique values for the sAMAccountName attribute?](#how-can-i-use-selectuniquevalue-to-generate-unique-values-for-samaccountname-attribute)
  * [How do I remove characters with diacritics and convert them into normal English alphabets?](#how-do-i-remove-characters-with-diacritics-and-convert-them-into-normal-english-alphabets)

### <a name="solution-capability-questions"></a>Solution capability questions

#### <a name="when-processing-a-new-hire-from-workday-how-does-the-solution-set-the-password-for-the-new-user-account-in-active-directory"></a>When processing a new hire from Workday, how does the solution set the password for the new user account in Active Directory?

When the on-premises provisioning agent gets a request to create a new AD account, it automatically generates a complex random password designed to meet the password complexity requirements defined by the AD server and sets it on the user object. This password is not logged anywhere.

#### <a name="does-the-solution-support-sending-email-notifications-after-provisioning-operations-complete"></a>Does the solution support sending email notifications after provisioning operations complete?

No, sending email notifications after provisioning operations complete is not supported in the current release.

#### <a name="how-do-i-manage-delivery-of-passwords-for-new-hires-and-securely-provide-a-mechanism-to-reset-their-password"></a>How do I manage delivery of passwords for new hires and securely provide a mechanism to reset their password?

One of the final steps in new AD account provisioning is the delivery of the temporary password assigned to the user's AD account. Many enterprises still use the traditional approach of delivering the temporary password to the user's manager, who then hands it over to the new hire/contingent worker. This process has an inherent security flaw, and there is an option available to implement a better approach using Azure AD capabilities.

As part of the hiring process, HR teams usually run a background check and vet the mobile number of the new hire. With the Workday to AD User Provisioning integration, you can build on this fact and roll out a self-service password reset capability for the user on day one. This is accomplished by propagating the "Mobile Number" attribute of the new hire from Workday to AD, and then from AD to Azure AD using Azure AD Connect. Once the "Mobile Number" is present in Azure AD, you can enable [self-service password reset (SSPR)](../authentication/howto-sspr-authenticationdata.md) for the user's account, so that on day one, the new hire can use the registered and verified mobile number for authentication.

#### <a name="does-the-solution-cache-workday-user-profiles-in-the-azure-ad-cloud-or-at-the-provisioning-agent-layer"></a>Does the solution cache Workday user profiles in the Azure AD cloud or at the provisioning agent layer?

No, the solution does not maintain a cache of user profiles. The Azure AD provisioning service simply acts as a data processor, reading data from Workday and writing it to the target Active Directory or Azure AD. See the section [Managing personal data](#managing-personal-data) for details related to user privacy and data retention.

#### <a name="does-the-solution-support-assigning-on-premises-ad-groups-to-the-user"></a>Does the solution support assigning on-premises AD groups to the user?

This functionality is not currently supported. The recommended workaround is to deploy a PowerShell script that queries the Microsoft Graph API endpoint for [audit log data](https://docs.microsoft.com/graph/api/resources/azure-ad-auditlog-overview?view=graph-rest-beta) and uses it to trigger scenarios such as group assignment. This PowerShell script can be attached to a task scheduler and deployed on the same box running the provisioning agent. A sketch of what such a script might look like follows.
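The following is a hypothetical sketch of that workaround, not an official sample. It assumes you already have a Graph access token with the AuditLog.Read.All permission (placeholder below), that the RSAT ActiveDirectory PowerShell module is installed on the agent server, and it uses an illustrative group name and event filter that you would adapt to your own environment.

```powershell
# Hypothetical sketch of the documented workaround: read recent directory audit
# events from Microsoft Graph and add newly created users to an AD group.
# Assumptions: a valid Graph token with AuditLog.Read.All ($accessToken),
# the ActiveDirectory RSAT module on this server, and a placeholder group name.
Import-Module ActiveDirectory

$accessToken = "<paste-a-valid-Graph-access-token-here>"
$headers     = @{ Authorization = "Bearer $accessToken" }

# Pull recent audit events; adjust paging/filtering for production use.
$uri    = "https://graph.microsoft.com/v1.0/auditLogs/directoryAudits?`$top=50"
$events = (Invoke-RestMethod -Uri $uri -Headers $headers -Method Get).value

# Filter for "add user"-style events and add each target user to a default group.
foreach ($event in $events) {
    if ($event.activityDisplayName -like "*Add user*") {
        foreach ($resource in $event.targetResources) {
            if ($resource.userPrincipalName) {
                # The lookup filter and group name below are illustrative only.
                $user = Get-ADUser -Filter "UserPrincipalName -eq '$($resource.userPrincipalName)'"
                if ($user) {
                    Add-ADGroupMember -Identity "All-New-Hires" -Members $user
                }
            }
        }
    }
}
```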
Azure AD'de "Mobil Numara" bulunduğunda, kullanıcının hesabı için [Self Servis Parola Sıfırlama'yı (SSPR)](../authentication/howto-sspr-authenticationdata.md) etkinleştirebilirsiniz, böylece 1. #### <a name="does-the-solution-cache-workday-user-profiles-in-the-azure-ad-cloud-or-at-the-provisioning-agent-layer"></a>Çözüm, Azure AD bulutundaki veya sağlama aracısı katmanındaki İş Günü kullanıcı profillerini önbelleğe alıyor mu? Hayır, çözüm kullanıcı profillerinin önbelleğini korumaz. Azure AD sağlama hizmeti, yalnızca bir veri işlemcisi, İş Günü'nden verileri okuma ve hedef Active Directory veya Azure AD'ye yazma görevi görür. Bkz. Kullanıcı gizliliği ve veri saklama yla ilgili ayrıntılar için [kişisel verileri yönetme](#managing-personal-data) bölümüne bakın. #### <a name="does-the-solution-support-assigning-on-premises-ad-groups-to-the-user"></a>Çözüm, şirket içi REKLAM gruplarının kullanıcıya atanmasını destekliyor mu? Bu işlevsellik şu anda desteklenmez. Önerilen geçici çözüm, [denetim günlüğü verileri](https://docs.microsoft.com/graph/api/resources/azure-ad-auditlog-overview?view=graph-rest-beta) için Microsoft Graph API bitiş noktasını sorgulayan bir PowerShell komut dosyası dağıtmak ve bunu grup ataması gibi senaryoları tetiklemek için kullanmaktır. Bu PowerShell komut dosyası bir görev zamanlayıcısına eklenebilir ve sağlama aracısını çalıştıran aynı kutuda dağıtılabilir. #### <a name="which-workday-apis-does-the-solution-use-to-query-and-update-workday-worker-profiles"></a>Çözüm, İş Günü çalışanı profillerini sorgulamak ve güncelleştirmek için hangi İş Günü API'lerini kullanır? Çözüm şu anda aşağıdaki İş Günü API'lerini kullanır: * Get_Workers (v21.1) işçi bilgilerini almak için * İş E-posta Yazma özelliği için Maintain_Contact_Information (v26.1) * Kullanıcı adı Yazma özelliği için Update_Workday_Account (v31.2) #### <a name="can-i-configure-my-workday-hcm-tenant-with-two-azure-ad-tenants"></a>İş günü HCM kiracımı iki Azure AD kiracısıyla yapılandırabilir miyim? Evet, bu yapılandırma desteklenir. Bu senaryoyu yapılandırmak için üst düzey adımlar şunlardır: * Geçici aracı #1 dağıtın ve Azure AD kiracı #1 kaydedin. * Geçici aracı #2 dağıtın ve Azure AD kiracı #2 kaydedin. * Her Geçici Aracının yöneteceği "Alt Etki Alanları"na göre, her aracı etki alanı(lar) ile yapılandırın. Bir aracı birden çok etki alanı işleyebilir. * Azure portalında, İş Günü'nü her kiracıda AD Kullanıcı Sağlama Uygulamasına ayarla ve ilgili etki alanlarıyla yapılandırın. #### <a name="why-workday-to-azure-ad-user-provisioning-app-is-not-supported-if-we-have-deployed-azure-ad-connect"></a>Azure AD Connect'i dağıttıysak neden "Azure AD'ye Çalışma Günü" kullanıcı sağlama uygulaması desteklenmez? Azure AD karma modda kullanıldığında (bulut + şirket içi kullanıcıların bir karışımını içerdiği durumlarda), "yetki kaynağı" tanımının net bir tanımına sahip olmak önemlidir. Genellikle karma senaryolar Azure AD Connect'in dağıtımını gerektirir. Azure AD Connect dağıtıldığında, şirket içi AD yetki kaynağıdır. İş Günü'nü Azure AD bağlayıcısına tanıtmak, İş Günü öznitelik değerlerinin Azure AD Connect tarafından belirlenen değerlerin üzerine yazabileceği bir duruma yol açabilir. Bu nedenle Azure AD Connect etkinleştirildiğinde "Azure AD'ye Çalışma Günü" sağlama uygulamasının kullanımı desteklenmez. Bu gibi durumlarda, kullanıcıları şirket içi REKLAM'a sokmak ve Azure AD Connect'i kullanarak Azure AD'ye senkronize etmek için "AD Kullanıcıya Çalışma Günü" sağlama uygulamasını kullanmanızı öneririz. 
#### <a name="how-do-i-suggest-improvements-or-request-new-features-related-to-workday-and-azure-ad-integration"></a>İş Günü ve Azure REKLAM tümleştirmesi ile ilgili iyileştirmeleri nasıl önerirveya yeni özellikler isteyebilirim? Geri bildiriminiz, gelecekteki sürümler ve geliştirmelerin yönünü belirlememize yardımcı olduğu için son derece değerlidir. Tüm geri bildirimleri memnuniyetle karşılarız ve Azure [AD'nin geri bildirim forumunda](https://feedback.azure.com/forums/169401-azure-active-directory)fikrinizi veya geliştirme önerinizi göndermenizi öneririz. İş Günü tümleştirmesi ile ilgili belirli geri bildirimler için, İş Günü ile ilgili varolan geri bildirimleri bulmak için *SaaS Uygulamaları* kategorisini seçin ve *İş Günü* anahtar kelimelerini kullanarak arama yapın. ![UserVoice SaaS Uygulamaları](media/workday-inbound-tutorial/uservoice_saas_apps.png) ![UserVoice İş Günü](media/workday-inbound-tutorial/uservoice_workday_feedback.png) Yeni bir fikir önerirken, lütfen başka birinin benzer bir özellik önerip önerdiğini kontrol edin. Bu durumda, özelliği veya geliştirme isteğini oylayabilirsiniz. Ayrıca, fikre olan desteğinizi göstermek ve özelliğin sizin için de ne kadar değerli olacağını göstermek için özel kullanım durumunuzla ilgili bir yorum bırakabilirsiniz. ### <a name="provisioning-agent-questions"></a>Agent sorularını sağlama #### <a name="what-is-the-ga-version-of-the-provisioning-agent"></a>Provizyon Aracısı'nın GA sürümü nedir? * Provizyon Aracısı'nın GA sürümü 1.1.30 ve üzeridir. * Aracı sürümünüz 1.1.30'dan azsa, genel önizleme sürümünü çalıştırMışsınız dır ve aracıyı barındıran sunucuda .NET 4.7.1 çalışma süresi varsa, genel önizleme sürümünü çalıştırAbilirsiniz ve otomatik olarak GA sürümüne güncelleştirilir. * Sunucunuzda yüklü [olan .NET sürümünü kontrol](https://docs.microsoft.com/dotnet/framework/migration-guide/how-to-determine-which-versions-are-installed) edebilirsiniz. Sunucu .NET 4.7.1 çalışmıyorsa [,NET 4.7.1'i indirip yükleyebilirsiniz.](https://support.microsoft.com/help/4033342/the-net-framework-4-7-1-offline-installer-for-windows) .NET 4.7.1'i yükledikten sonra sağlama aracınız GA sürümüne otomatik olarak güncellenir. #### <a name="how-do-i-know-the-version-of-my-provisioning-agent"></a>Provizyon Ajanımın sürümünü nasıl bilirim? * Sağlama Aracısı'nın yüklü olduğu Windows sunucusunda oturum açın. * Denetim **Masası'na** -> git**Kaldır veya Programı Değiştir** menüsünü değiştir * Microsoft Azure AD Connect **Provisioning Aracısı** girişine karşılık gelen sürümü arayın ![Azure portalında](./media/workday-inbound-tutorial/pa_version.png) #### <a name="does-microsoft-automatically-push-provisioning-agent-updates"></a>Microsoft, Provisioning Agent güncelleştirmelerini otomatik olarak itiyor mu? Evet, Microsoft sağlama aracısını otomatik olarak güncelleştirir. Windows hizmetini durdurarak otomatik güncelleştirmeleri devre dışı kullanabilirsiniz **Microsoft Azure AD Connect Agent Updater.** #### <a name="can-i-install-the-provisioning-agent-on-the-same-server-running-azure-ad-connect"></a>Azure AD Connect çalıştıran aynı sunucuya Sağlama Aracısı yükleyebilir miyim? Evet, Sağlama Aracısını Azure AD Connect'i çalıştıran sunucuya yükleyebilirsiniz. #### <a name="at-the-time-of-configuration-the-provisioning-agent-prompts-for-azure-ad-admin-credentials-does-the-agent-store-the-credentials-locally-on-the-server"></a>Yapılandırma sırasında, Azure AD yönetici kimlik bilgileri için Sağlama Aracısı istemleri. Aracı, kimlik bilgilerini sunucuda yerel olarak depolar mı? 
#### <a name="does-microsoft-automatically-push-provisioning-agent-updates"></a>Does Microsoft automatically push provisioning agent updates?

Yes, Microsoft automatically updates the provisioning agent. You can disable automatic updates by stopping the Windows service **Microsoft Azure AD Connect Agent Updater**.

#### <a name="can-i-install-the-provisioning-agent-on-the-same-server-running-azure-ad-connect"></a>Can I install the provisioning agent on the same server running Azure AD Connect?

Yes, you can install the provisioning agent on the same server that runs Azure AD Connect.

#### <a name="at-the-time-of-configuration-the-provisioning-agent-prompts-for-azure-ad-admin-credentials-does-the-agent-store-the-credentials-locally-on-the-server"></a>At the time of configuration, the provisioning agent prompts for Azure AD admin credentials. Does the agent store the credentials locally on the server?

During configuration, the provisioning agent prompts for Azure AD admin credentials only to connect to your Azure AD tenant. It does not store those credentials locally on the server. However, it does retain the credentials used to connect to the *on-premises Active Directory domain* in a local Windows password vault.

#### <a name="how-do-i-configure-the-provisioning-agent-to-use-a-proxy-server-for-outbound-http-communication"></a>How do I configure the provisioning agent to use a proxy server for outbound HTTP communication?

The provisioning agent supports the use of an outbound proxy. You can configure it by editing the agent config file **C:\Program Files\Microsoft Azure AD Connect Provisioning Agent\AADConnectProvisioningAgent.exe.config**. Add the following lines towards the end of the file, just before the closing `</configuration>` tag. Replace the variables [proxy-server] and [proxy-port] with your proxy server name and port values.

```xml
    <system.net>
        <defaultProxy enabled="true" useDefaultCredentials="true">
            <proxy usesystemdefault="true" proxyaddress="http://[proxy-server]:[proxy-port]" bypassonlocal="true" />
        </defaultProxy>
    </system.net>
```

#### <a name="how-do-i-ensure-that-the-provisioning-agent-is-able-to-communicate-with-the-azure-ad-tenant-and-no-firewalls-are-blocking-ports-required-by-the-agent"></a>How do I ensure that the provisioning agent is able to communicate with the Azure AD tenant and that no firewalls are blocking ports required by the agent?

You can check whether all of the required ports are open by opening the [Connector Ports Test Tool](https://aadap-portcheck.connectorporttest.msappproxy.net/) from within your on-premises network. More green check marks mean greater resiliency.

To make sure that the tool gives you the right results, be sure to:

* Open the tool in a browser from the server on which you installed the provisioning agent.
* Ensure that any proxies or firewall rules applicable to your provisioning agent are also applied to this page. In Internet Explorer, this can be done by going to **Settings -> Internet Options -> Connections -> LAN Settings**. On this page, you will see the field "Use a Proxy Server for your LAN". Select this box, and put the proxy address into the "Address" field.

#### <a name="can-one-provisioning-agent-be-configured-to-provision-multiple-ad-domains"></a>Can one provisioning agent be configured to provision multiple AD domains?

Yes, one provisioning agent can be configured to handle multiple AD domains, as long as the agent has line of sight to the respective domain controllers. Microsoft recommends setting up a group of 3 provisioning agents serving the same set of AD domains to ensure high availability and provide failover support.

#### <a name="how-do-i-de-register-the-domain-associated-with-my-provisioning-agent"></a>How do I de-register the domain associated with my provisioning agent?

* From the Azure portal, get the *tenant ID* of your Azure AD tenant.
* Sign in to the Windows server running the provisioning agent.
* Open PowerShell as a Windows administrator.
* Change to the directory containing the registration scripts and run the following commands, replacing the \[tenant ID\] parameter with the value of your tenant ID.
```powershell
cd "C:\Program Files\Microsoft Azure AD Connect Provisioning Agent\RegistrationPowershell\Modules\PSModulesFolder"
Import-Module "C:\Program Files\Microsoft Azure AD Connect Provisioning Agent\RegistrationPowershell\Modules\PSModulesFolder\AppProxyPSModule.psd1"
Get-PublishedResources -TenantId "[tenant ID]"
```

* From the list of agents displayed, copy the value of the `id` field from the resource whose *resourceName* equals your AD domain name.
* Paste the ID value into the following command and run it in PowerShell.

```powershell
Remove-PublishedResource -ResourceId "[resource ID]" -TenantId "[tenant ID]"
```

* Re-run the provisioning agent configuration wizard.
* Any other agents previously assigned to this domain will need to be reconfigured.

#### <a name="how-do-i-uninstall-the-provisioning-agent"></a>How do I uninstall the provisioning agent?

* Sign in to the Windows server where the provisioning agent is installed.
* Go to the **Control Panel** -> **Uninstall or Change a Program** menu.
* Uninstall the following programs:
  * Microsoft Azure AD Connect Provisioning Agent
  * Microsoft Azure AD Connect Agent Updater
  * Microsoft Azure AD Connect Provisioning Agent Package

### <a name="workday-to-ad-attribute-mapping-and-configuration-questions"></a>Workday to AD attribute mapping and configuration questions

#### <a name="how-do-i-back-up-or-export-a-working-copy-of-my-workday-provisioning-attribute-mapping-and-schema"></a>How do I back up or export a working copy of my Workday provisioning attribute mapping and schema?

You can use the Microsoft Graph API to export your Workday user provisioning configuration. For details, see the steps in the section [Exporting and importing your configuration](#exporting-and-importing-your-configuration).

#### <a name="i-have-custom-attributes-in-workday-and-active-directory-how-do-i-configure-the-solution-to-work-with-my-custom-attributes"></a>I have custom attributes in Workday and Active Directory. How do I configure the solution to work with my custom attributes?

The solution supports custom Workday and Active Directory attributes. To add your custom attributes to the mapping schema, open the **Attribute Mapping** blade and scroll down to expand **Show advanced options**.

![Edit Attribute List](./media/workday-inbound-tutorial/wd_edit_attr_list.png)

To add your custom Workday attributes, select the option *Edit attribute list for Workday*, and to add your custom AD attributes, select the option *Edit attribute list for On Premises Active Directory*.

See also:

* [Customizing the list of Workday user attributes](#customizing-the-list-of-workday-user-attributes)

#### <a name="how-do-i-configure-the-solution-to-only-update-attributes-in-ad-based-on-workday-changes-and-not-create-any-new-ad-accounts"></a>How do I configure the solution to only update attributes in AD based on Workday changes and not create any new AD accounts?

This configuration can be achieved by setting the **Target Object Actions** in the **Attribute Mappings** blade as shown below:

![Update action](./media/workday-inbound-tutorial/wd_target_update_only.png)

Select only the "Update" checkbox so that only update operations flow from Workday to AD.

#### <a name="can-i-provision-users-photo-from-workday-to-active-directory"></a>Can I provision users' photo from Workday to Active Directory?
The solution currently does not support setting binary attributes such as *thumbnailPhoto* and *jpegPhoto* in Active Directory.

#### <a name="how-do-i-sync-mobile-numbers-from-workday-based-on-user-consent-for-public-usage"></a>How do I sync mobile numbers from Workday based on user consent for public usage?

* Go to the "Provisioning" blade of your Workday provisioning app.
* Click on Attribute Mappings.
* Under **Mappings**, select **Synchronize Workday Workers to On Premises Active Directory** (or **Synchronize Workday Workers to Azure AD**).
* On the Attribute Mappings page, scroll down and check the box "Show Advanced Options". Click on **Edit attribute list for Workday**.
* In the blade that opens, locate the "Mobile" attribute and click on the row so you can edit the **API Expression**. ![Mobile GDPR](./media/workday-inbound-tutorial/mobile_gdpr.png)
* Replace the **API Expression** with the following new expression, which retrieves the work mobile number only if the "Public Usage Flag" is set to "True" in Workday.

```
wd:Worker/wd:Worker_Data/wd:Personal_Data/wd:Contact_Data/wd:Phone_Data[translate(string(wd:Phone_Device_Type_Reference/@wd:Descriptor),'abcdefghijklmnopqrstuvwxyz','ABCDEFGHIJKLMNOPQRSTUVWXYZ')='MOBILE' and translate(string(wd:Usage_Data/wd:Type_Data/wd:Type_Reference/@wd:Descriptor),'abcdefghijklmnopqrstuvwxyz','ABCDEFGHIJKLMNOPQRSTUVWXYZ')='WORK' and string(wd:Usage_Data/@wd:Public)='1']/@wd:Formatted_Phone
```

* Save the attribute list.
* Save the attribute mapping.
* Clear the current state and restart a full synchronization.

#### <a name="how-do-i-format-display-names-in-ad-based-on-the-users-departmentcountrycity-attributes-and-handle-regional-variances"></a>How do I format display names in AD based on the user's department/country/city attributes and handle regional variances?

It is a common requirement to configure the *displayName* attribute in AD so that it also provides information about the user's department and country/region. For example, if John Smith works in the Marketing Department in the US, you might want his *displayName* to show up as *Smith, John (Marketing-US)*. Here is how you can handle such requirements for constructing *CN* or *displayName* so that it includes attributes such as company, business unit, city, or country/region.

* Each Workday attribute is retrieved using an underlying XPATH API expression, which is configurable in **Attribute Mapping -> Advanced Section -> Edit attribute list for Workday**. Here are the default XPATH API expressions for the Workday *PreferredFirstName*, *PreferredLastName*, *Company*, and *SupervisoryOrganization* attributes.
  | Workday attribute | API XPATH expression |
  | ----------------- | -------------------- |
  | PreferredFirstName | wd:Worker/wd:Worker_Data/wd:Personal_Data/wd:Name_Data/wd:Preferred_Name_Data/wd:Name_Detail_Data/wd:First_Name/text() |
  | PreferredLastName | wd:Worker/wd:Worker_Data/wd:Personal_Data/wd:Name_Data/wd:Preferred_Name_Data/wd:Name_Detail_Data/wd:Last_Name/text() |
  | Company | wd:Worker/wd:Worker_Data/wd:Organization_Data/wd:Worker_Organization_Data[wd:Organization_Data/wd:Organization_Type_Reference/wd:ID[@wd:type='Organization_Type_ID']='Company']/wd:Organization_Reference/@wd:Descriptor |
  | SupervisoryOrganization | wd:Worker/wd:Worker_Data/wd:Organization_Data/wd:Worker_Organization_Data/wd:Organization_Data[wd:Organization_Type_Reference/wd:ID[@wd:type='Organization_Type_ID']='Supervisory']/wd:Organization_Name/text() |

  Confirm with your Workday team that the API expressions above are valid for your Workday tenant configuration. If necessary, you can modify them as described in the section [Customizing the list of Workday user attributes](#customizing-the-list-of-workday-user-attributes).

* Similarly, the country information in Workday is retrieved using the following XPATH: *wd:Worker/wd:Worker_Data/wd:Employment_Data/wd:Position_Data/wd:Business_Site_Summary_Data/wd:Address_Data/wd:Country_Reference*

  There are five country-related attributes available in the Workday attribute list section.

  | Workday attribute | API XPATH expression |
  | ----------------- | -------------------- |
  | CountryReference | wd:Worker/wd:Worker_Data/wd:Employment_Data/wd:Position_Data/wd:Business_Site_Summary_Data/wd:Address_Data/wd:Country_Reference/wd:ID[@wd:type='ISO_3166-1_Alpha-3_Code']/text() |
  | CountryReferenceFriendly | wd:Worker/wd:Worker_Data/wd:Employment_Data/wd:Position_Data/wd:Business_Site_Summary_Data/wd:Address_Data/wd:Country_Reference/@wd:Descriptor |
  | CountryReferenceNumeric | wd:Worker/wd:Worker_Data/wd:Employment_Data/wd:Position_Data/wd:Business_Site_Summary_Data/wd:Address_Data/wd:Country_Reference/wd:ID[@wd:type='ISO_3166-1_Numeric-3_Code']/text() |
  | CountryReferenceTwoLetter | wd:Worker/wd:Worker_Data/wd:Employment_Data/wd:Position_Data/wd:Business_Site_Summary_Data/wd:Address_Data/wd:Country_Reference/wd:ID[@wd:type='ISO_3166-1_Alpha-2_Code']/text() |
  | CountryRegionReference | wd:Worker/wd:Worker_Data/wd:Employment_Data/wd:Position_Data/wd:Business_Site_Summary_Data/wd:Address_Data/wd:Country_Region_Reference/@wd:Descriptor |

  Confirm with your Workday team that the API expressions above are valid for your Workday tenant configuration. If necessary, you can modify them as described in the section [Customizing the list of Workday user attributes](#customizing-the-list-of-workday-user-attributes).

* To build the right attribute mapping expression, determine which Workday attributes "authoritatively" represent the user's first name, last name, country/region, and department. Let's say the attributes are *PreferredFirstName*, *PreferredLastName*, *CountryReferenceTwoLetter*, and *SupervisoryOrganization*, respectively. You can use these to build an expression for the AD *displayName* attribute as follows, to get a display name like *Smith, John (Marketing-US)*.
  ```
  Append(Join(", ",[PreferredLastName],[PreferredFirstName]), Join(""," (",[SupervisoryOrganization],"-",[CountryReferenceTwoLetter],")"))
  ```

  Once you have the right expression, edit the Attribute Mappings table and modify the *displayName* attribute mapping as shown below:

  ![DisplayName Mapping](./media/workday-inbound-tutorial/wd_displayname_map.png)

* Extending the example above, let's say you would like to convert city names coming from Workday into shorthand values and then use them to build display names such as *Smith, John (CHI)* or *Doe, Jane (NYC)*. This result can be achieved by using a Switch expression with the Workday *Municipality* attribute as the determinant variable.

  ```
  Switch
  (
    [Municipality],
    Join(", ", [PreferredLastName], [PreferredFirstName]),
    "Chicago", Append(Join(", ",[PreferredLastName], [PreferredFirstName]), "(CHI)"),
    "New York", Append(Join(", ",[PreferredLastName], [PreferredFirstName]), "(NYC)"),
    "Phoenix", Append(Join(", ",[PreferredLastName], [PreferredFirstName]), "(PHX)")
  )
  ```

See also:

* [Switch function syntax](../app-provisioning/functions-for-customizing-application-data.md#switch)
* [Join function syntax](../app-provisioning/functions-for-customizing-application-data.md#join)
* [Append function syntax](../app-provisioning/functions-for-customizing-application-data.md#append)

#### <a name="how-can-i-use-selectuniquevalue-to-generate-unique-values-for-samaccountname-attribute"></a>How can I use SelectUniqueValue to generate unique values for the samAccountName attribute?

Let's say you want to generate unique values for the *samAccountName* attribute using a combination of the *FirstName* and *LastName* attributes from Workday. You can start with an expression like the one given below:

```
SelectUniqueValue(
    Replace(Mid(Replace(NormalizeDiacritics(StripSpaces(Join("",  Mid([FirstName],1,1), [LastName]))), , "([\\/\\\\\\[\\]\\:\\;\\|\\=\\,\\+\\*\\?\\<\\>])", , "", , ), 1, 20), , "(\\.)*$", , "", , ),
    Replace(Mid(Replace(NormalizeDiacritics(StripSpaces(Join("",  Mid([FirstName],1,2), [LastName]))), , "([\\/\\\\\\[\\]\\:\\;\\|\\=\\,\\+\\*\\?\\<\\>])", , "", , ), 1, 20), , "(\\.)*$", , "", , ),
    Replace(Mid(Replace(NormalizeDiacritics(StripSpaces(Join("",  Mid([FirstName],1,3), [LastName]))), , "([\\/\\\\\\[\\]\\:\\;\\|\\=\\,\\+\\*\\?\\<\\>])", , "", , ), 1, 20), , "(\\.)*$", , "", , )
)
```

How the expression above works: if the user is John Smith, it first tries to generate JSmith; if JSmith already exists, it generates JoSmith; if that also exists, it generates JohSmith. The expression also ensures that the generated value meets the length restriction and the special-character restriction associated with *samAccountName*.

See also:

* [Mid function syntax](../app-provisioning/functions-for-customizing-application-data.md#mid)
* [Replace function syntax](../app-provisioning/functions-for-customizing-application-data.md#replace)
* [SelectUniqueValue function syntax](../app-provisioning/functions-for-customizing-application-data.md#selectuniquevalue)

#### <a name="how-do-i-remove-characters-with-diacritics-and-convert-them-into-normal-english-alphabets"></a>How do I remove characters with diacritics and convert them into normal English alphabets?

Use the [NormalizeDiacritics](../app-provisioning/functions-for-customizing-application-data.md#normalizediacritics) function to remove special characters from the user's first name and last name when constructing the email address or CN value for the user.
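For illustration, here is a minimal sketch of such an expression. It assumes hypothetical source attributes named *PreferredFirstName* and *PreferredLastName* and uses contoso.com as a placeholder domain; adjust the attribute names and domain to match your own mapping schema.

```
Join("@", NormalizeDiacritics(StripSpaces(Join(".", [PreferredFirstName], [PreferredLastName]))), "contoso.com")
```

With this sketch, a worker named "Zoë Müller" would, for example, resolve to `Zoe.Muller@contoso.com`, because NormalizeDiacritics replaces accented characters with their plain-ASCII equivalents and StripSpaces removes embedded spaces.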
## <a name="troubleshooting-tips"></a>Sorun giderme ipuçları Bu bölümde, Azure AD Denetim Günlükleri ve Windows Server Olay Görüntüleyicisi günlüklerini kullanarak Çalışma Günü tümleştirmenizdeki sağlama sorunlarını nasıl giderçalışacağınız konusunda özel kılavuzlar sağlanmaktadır. Öğretici'de yakalanan genel sorun giderme adımlarının ve kavramların üzerine inşa [eder: Otomatik kullanıcı hesabı sağlama hakkında raporlama](../app-provisioning/check-status-user-account-provisioning.md) Bu bölümde sorun giderme aşağıdaki yönlerini kapsar: * [Aracı sorun giderme için Windows Event Viewer ayarlama](#setting-up-windows-event-viewer-for-agent-troubleshooting) * [Hizmet sorun giderme için Azure portalı Denetim Günlükleri ayarlama](#setting-up-azure-portal-audit-logs-for-service-troubleshooting) * [AD Kullanıcı Hesabı için günlükleri anlama işlemleri oluşturmak](#understanding-logs-for-ad-user-account-create-operations) * [Yönetici güncelleştirme işlemleri için günlükleri anlama](#understanding-logs-for-manager-update-operations) * [Sık karşılaşılan hataları çözme](#resolving-commonly-encountered-errors) ### <a name="setting-up-windows-event-viewer-for-agent-troubleshooting"></a>Aracı sorun giderme için Windows Event Viewer ayarlama * Sağlama Aracısının dağıtıldığı Windows Server makinesinde oturum açma * **Windows Server Event Viewer** masaüstü uygulamasını açın. * **Windows Günlükleri > Uygulamasını**seçin. * Filtre **Geçerli Günlük'i kullanın...** kaynak AAD altında günlüğe kaydedilen tüm olayları görüntüleme **seçeneği. Connect.ProvisioningAgent** ve aşağıda gösterildiği gibi filtre "-5" belirterek Olay Kimliği "5" ile olayları hariç. ![Windows Olay Görüntüleyici](media/workday-inbound-tutorial/wd_event_viewer_01.png)) * **Tamam'ı** tıklatın ve sonuç görünümünü **Tarih ve Saat** sütununa göre sıralayın. ### <a name="setting-up-azure-portal-audit-logs-for-service-troubleshooting"></a>Hizmet sorun giderme için Azure portalı Denetim Günlükleri ayarlama * Azure [portalını](https://portal.azure.com)başlatın ve İş Günü sağlama uygulamanızın **Denetim günlükleri** bölümüne gidin. * Görünümde yalnızca aşağıdaki sütunları (Tarih, Etkinlik, Durum, Durum Nedeni) görüntülemek için Denetim Günlükleri sayfasındaki **Sütunlar** düğmesini kullanın. Bu yapılandırma, yalnızca sorun giderme yle ilgili verilere odaklanmanızı sağlar. ![Denetim günlüğü sütunları](media/workday-inbound-tutorial/wd_audit_logs_00.png) * Görünümü filtrelemek için **Hedef** ve **Tarih Aralığı** sorgu parametrelerini kullanın. * **Hedef** sorgu parametresini İş Günü çalışanı nesnesinin "İşçi Kimliği" veya "Çalışan Kimliği" olarak ayarlayın. * Tarih **Aralığını,** sağlamayla ilgili hatalar veya sorunlar için araştırmak istediğiniz uygun bir zaman aralığına ayarlayın. ![Denetim günlüğü filtreleri](media/workday-inbound-tutorial/wd_audit_logs_01.png) ### <a name="understanding-logs-for-ad-user-account-create-operations"></a>AD Kullanıcı Hesabı için günlükleri anlama işlemleri oluşturmak Çalışma Günü'nde yeni bir işe alım algılandığında (diyelim ki Çalışan Kimliği *21023*ile), Azure AD sağlama hizmeti çalışan için yeni bir AD kullanıcı hesabı oluşturmaya çalışır ve bu süreçte aşağıda açıklandığı gibi 4 denetim günlüğü kaydı oluşturur: [![Denetim günlüğü ops oluşturmak](media/workday-inbound-tutorial/wd_audit_logs_02.png)](media/workday-inbound-tutorial/wd_audit_logs_02.png#lightbox) Denetim günlüğü kayıtlarından herhangi birini tıklattığınızda, **Etkinlik Ayrıntıları** sayfası açılır. 
Here is what the **Activity Details** page displays for each type of log record.

* **Workday Import** record: This log record displays the worker information fetched from Workday. Use the information in the *Additional Details* section of the log record to troubleshoot issues with fetching data from Workday. A sample record is shown below, along with pointers on how to interpret each field.

  ```JSON
  ErrorCode : None  // Use the error code captured here to troubleshoot Workday issues
  EventName : EntryImportAdd  // For full sync, value is "EntryImportAdd" and for delta sync, value is "EntryImport"
  JoiningProperty : 21023  // Value of the Workday attribute that serves as the Matching ID (usually the Worker ID or Employee ID field)
  SourceAnchor : a071861412de4c2486eb10e5ae0834c3  // set to the WorkdayID (WID) associated with the record
  ```

* **AD Import** record: This log record displays the information of the account fetched from AD. Since there is no AD account during initial user creation, the *Activity Status Reason* indicates that no account with a matching ID attribute value was found in Active Directory. Use the information in the *Additional Details* section of the log record to troubleshoot issues with fetching data from Active Directory. A sample record is shown below, along with pointers on how to interpret each field.

  ```JSON
  ErrorCode : None  // Use the error code captured here to troubleshoot Workday issues
  EventName : EntryImportObjectNotFound  // Implies that the object was not found in AD
  JoiningProperty : 21023  // Value of the Workday attribute that serves as the Matching ID
  ```

  To find the provisioning agent log records corresponding to this AD import operation, open the Windows Event Viewer logs and use the **Find...** menu option to locate log entries containing the Matching ID / Joining Property attribute value (in this case *21023*).

  ![Find](media/workday-inbound-tutorial/wd_event_viewer_02.png)

  Look for the entry with *Event ID = 9*, which gives you the LDAP search filter used by the agent to retrieve the AD account. You can verify whether this is the right search filter for retrieving unique user entries.

  ![LDAP Search](media/workday-inbound-tutorial/wd_event_viewer_03.png)

  The record that immediately follows it, with *Event ID = 2*, captures the result of the search operation and whether it returned any results.

  ![LDAP Results](media/workday-inbound-tutorial/wd_event_viewer_04.png)

* **Synchronization rule action** record: This log record displays the results of the attribute mapping rules and the configured scoping filters, along with the provisioning action that will be taken to process the incoming Workday event. Use the information in the *Additional Details* section of the log record to troubleshoot issues with the synchronization action. A sample record is shown below, along with pointers on how to interpret each field.

  ```JSON
  ErrorCode : None  // Use the error code captured here to troubleshoot sync issues
  EventName : EntrySynchronizationAdd  // Implies that the object will be added
  JoiningProperty : 21023  // Value of the Workday attribute that serves as the Matching ID
  SourceAnchor : a071861412de4c2486eb10e5ae0834c3  // set to the WorkdayID (WID) associated with the profile in Workday
  ```

  If there are issues with your attribute mapping expressions or with the incoming Workday data (for example, an empty or null value for a required attribute), you will observe a failure at this stage, with an ErrorCode that gives the details of the failure.
* **AD Export** record: This log record displays the result of the AD account creation operation, along with the attribute values that were set in the operation. Use the information in the *Additional Details* section of the log record to troubleshoot issues with the account creation operation. A sample record is shown below, along with pointers on how to interpret each field. In the "Additional Details" section, "EventName" is set to "EntryExportAdd", "JoiningProperty" is set to the value of the Matching ID attribute, "SourceAnchor" is set to the WorkdayID (WID) associated with the record, and "TargetAnchor" is set to the value of the AD "ObjectGuid" attribute of the newly created user.

  ```JSON
  ErrorCode : None  // Use the error code captured here to troubleshoot AD account creation issues
  EventName : EntryExportAdd  // Implies that the object will be created
  JoiningProperty : 21023  // Value of the Workday attribute that serves as the Matching ID
  SourceAnchor : a071861412de4c2486eb10e5ae0834c3  // set to the WorkdayID (WID) associated with the profile in Workday
  TargetAnchor : 83f0156c-3222-407e-939c-56677831d525  // set to the value of the AD "objectGuid" attribute of the new user
  ```

  To find the provisioning agent log records corresponding to this AD export operation, open the Windows Event Viewer logs and use the **Find...** menu option to locate log entries containing the Matching ID / Joining Property attribute value (in this case *21023*).

  Look for an HTTP POST record corresponding to the timestamp of the export operation, with *Event ID = 2*. This record contains the attribute values sent by the provisioning service.

  [![SCIM Add request](media/workday-inbound-tutorial/wd_event_viewer_05.png)](media/workday-inbound-tutorial/wd_event_viewer_05.png#lightbox)

  Immediately following the event above, there should be another event that captures the response of the AD account create operation. This event returns the new objectGuid created in AD, which is set as the TargetAnchor attribute in the provisioning service.

  [![SCIM Add response](media/workday-inbound-tutorial/wd_event_viewer_06.png)](media/workday-inbound-tutorial/wd_event_viewer_06.png#lightbox)

### <a name="understanding-logs-for-manager-update-operations"></a>Understanding logs for manager update operations

The manager attribute is a reference attribute in AD. The provisioning service does not set the manager attribute as part of the user create operation. Rather, the manager attribute is set as part of an *update* operation after the AD account is created for the user. Expanding on the example above, let's say a new hire with Employee ID "21451" is activated in Workday and the new hire's manager (*21023*) already has an AD account. In this scenario, searching the Audit logs for user 21451 shows five entries.

[![Manager Update](media/workday-inbound-tutorial/wd_audit_logs_03.png)](media/workday-inbound-tutorial/wd_audit_logs_03.png#lightbox)

The first four records are like the ones we explored as part of the user create operation. The fifth record is the export associated with the manager attribute update. The log record displays the result of the AD account manager update operation, which is performed using the manager's *objectGuid* attribute.
```JSON
// Modified Properties
Name : manager
New Value : "83f0156c-3222-407e-939c-56677831d525"  // objectGuid of the user 21023

// Additional Details
ErrorCode : None  // Use the error code captured here to troubleshoot AD account update issues
EventName : EntryExportUpdate  // Implies that an existing object will be updated
JoiningProperty : 21451  // Value of the Workday attribute that serves as the Matching ID
SourceAnchor : 9603bf594b9901693f307815bf21870a  // WorkdayID of the user
TargetAnchor : 43b668e7-1d73-401c-a00a-fed14d31a1a8  // objectGuid of the user 21451
```

### <a name="resolving-commonly-encountered-errors"></a>Resolving commonly encountered errors

This section covers commonly seen errors with Workday user provisioning and how to resolve them. The errors are grouped as follows:

* [Provisioning agent errors](#provisioning-agent-errors)
* [Connectivity errors](#connectivity-errors)
* [AD user account creation errors](#ad-user-account-creation-errors)
* [AD user account update errors](#ad-user-account-update-errors)

#### <a name="provisioning-agent-errors"></a>Provisioning agent errors

|#|Error Scenario |Probable Causes|Recommended Resolution|
|--|---|---|---|
|1.| Error installing the provisioning agent with the message: *Service 'Microsoft Azure AD Connect Provisioning Agent' (AADConnectProvisioningAgent) failed to start. Verify that you have sufficient privileges to start the system.* | This error usually shows up if you are trying to install the provisioning agent on a domain controller and group policy prevents the service from starting. It is also seen if you have a previous version of the agent running and you have not uninstalled it before starting a new installation.| Install the provisioning agent on a server that is not a DC. Ensure that previous versions of the agent are uninstalled before installing the new agent.|
|2.| The Windows service 'Microsoft Azure AD Connect Provisioning Agent' is stuck in the *Starting* state and does not move to the *Running* state. | As part of the installation, the agent wizard creates a local account on the server (**NT Service\\AADConnectProvisioningAgent**), which is the logon account used to start the service. If a security policy on your Windows server prevents local accounts from running services, you will encounter this error. | Open the *Services* console. Right-click the Windows service 'Microsoft Azure AD Connect Provisioning Agent' and, on the logon tab, specify the account of a domain administrator to run the service. Restart the service. |
|3.| When configuring the provisioning agent with your AD domain in the *Connect Active Directory* step, the wizard takes a long time trying to load the AD schema and eventually times out. | This error usually shows up when the wizard is unable to contact the AD domain controller server because of firewall issues. | On the *Connect Active Directory* wizard screen, while providing the credentials for your AD domain, there is an option called *Select domain controller priority*. Use this option to select a domain controller that is in the same site as the agent server, and make sure there are no firewall rules blocking the communication. |

#### <a name="connectivity-errors"></a>Connectivity errors

If the provisioning service is unable to connect to Workday or Active Directory, it can cause provisioning to go into quarantine. Use the table below to troubleshoot connectivity issues.
|#|Error Scenario |Probable Causes|Recommended Resolution|
|--|---|---|---|
|1.| When you click **Test Connection**, you get the error message: *There was an error connecting to Active Directory. Please ensure that the on-premises Provisioning Agent is running and it is configured with the correct Active Directory domain.* | This error usually shows up if the provisioning agent is not running or if there is a firewall blocking communication between Azure AD and the provisioning agent. You might also see this error if the domain is not configured in the Agent Wizard. | Open the *Services* console on the Windows server to confirm that the agent is running. Open the provisioning agent wizard and confirm that the correct domain is registered with the agent. |
|2.| The provisioning job goes into quarantine over the weekend (Fri-Sat) and we get an email notification that there is an error with the synchronization. | One of the common causes for this error is planned Workday downtime. If you are using a Workday implementation tenant, note that Workday schedules downtime for its implementation tenants over the weekends (usually from Friday evening to Saturday morning), and during that period the Workday provisioning apps may go into quarantine because they cannot connect to Workday. They return to the normal state once the Workday implementation tenant is back online. In rare cases, you may also see this error if the password of the Integration System User changed because the tenant was refreshed, or if the account is locked or expired. | Check with your Workday administrator or integration partner to learn when Workday schedules downtime, ignore alert messages during the downtime period, and confirm availability once the Workday instance is back online. |

#### <a name="ad-user-account-creation-errors"></a>AD user account creation errors

|#|Error Scenario |Probable Causes|Recommended Resolution|
|--|---|---|---|
|1.| Export operation failures in the audit log with the message *Error: OperationsError-SvcErr: An operation error occurred. No superior reference has been configured for the directory service. The directory service is therefore unable to issue referrals to objects outside this forest.* | This error usually shows up when the *Active Directory Container* OU is not set correctly or when there are issues with the expression mapping used for *parentDistinguishedName*. | Check the *Active Directory Container* OU parameter for typos. If you are using *parentDistinguishedName* in the attribute mapping, make sure it always evaluates to a known container within the AD domain. Check the *Export* event in the audit logs to see the generated value. |
|2.| Export operation failures in the audit log with error code *SystemForCrossDomainIdentityManagementBadResponse* and message *Error: ConstraintViolation-AtrErr: A value in the request is invalid. A value for the attribute was not in the acceptable range of values. \nError Details: CONSTRAINT_ATT_TYPE - company*. | While this error is specific to the *company* attribute, you may also see it for other attributes such as *cn*. This error appears because of AD-enforced schema constraints. By default, attributes such as *company* and *CN* in AD have an upper limit of 64 characters. If the value coming from Workday is longer than 64 characters, you will see this error message. | Check the *Export* event in the audit logs to see the value of the attribute reported in the error message.
Consider truncating the value coming from Workday by using the [Mid](../app-provisioning/functions-for-customizing-application-data.md#mid) function, or change the mapping to an AD attribute that does not have similar length constraints. |

#### <a name="ad-user-account-update-errors"></a>AD user account update errors

During the AD user account update process, the provisioning service reads information from both Workday and AD, runs the attribute mapping rules, and determines whether any change needs to take effect. Accordingly, an update event is triggered. If any of these steps encounters a failure, it is logged in the audit logs. Use the table below to troubleshoot commonly encountered update errors.

|#|Error Scenario |Probable Causes|Recommended Resolution|
|--|---|---|---|
|1.| Synchronization rule action failures in the audit log with *EventName = EntrySynchronizationError* and *ErrorCode = EndpointUnavailable*. | This error shows up if the provisioning service is unable to retrieve user profile data from Active Directory due to a processing error encountered by the on-premises provisioning agent. | Check the provisioning agent Event Viewer logs for error events that indicate issues with the read operation (filter by Event ID #2). |
|2.| The manager attribute in AD does not get updated for certain users in AD. | The most likely cause of this error is that you are using scoping rules and the user's manager is not part of the scope. You may also run into this issue if the manager's Matching ID attribute (e.g. EmployeeID) is not found in the target AD domain or is not set to the correct value. | Review the scoping filter and add the manager user to the scope. Check the manager's profile in AD to make sure there is a value for the Matching ID attribute. |

## <a name="managing-your-configuration"></a>Managing your configuration

This section describes how you can further extend, customize, and manage your Workday-driven user provisioning configuration. It covers the following topics:

* [Customizing the list of Workday user attributes](#customizing-the-list-of-workday-user-attributes)
* [Exporting and importing your configuration](#exporting-and-importing-your-configuration)

### <a name="customizing-the-list-of-workday-user-attributes"></a>Customizing the list of Workday user attributes

The Workday provisioning apps for Active Directory and Azure AD include a default list of Workday user attributes you can pick from. However, these lists are not comprehensive. Workday supports many hundreds of possible user attributes, which can either be standard or unique to your Workday tenant. The Azure AD provisioning service supports customizing your list of Workday attributes to include any attributes exposed in the [Get_Workers](https://community.workday.com/sites/default/files/file-hosting/productionapi/Human_Resources/v21.1/Get_Workers.html) operation of the Human Resources API.

To make this change, you must use [Workday Studio](https://community.workday.com/studio-download) to extract the XPath expressions that represent the attributes you wish to use, and then add them to your provisioning configuration using the advanced attribute editor in the Azure portal.

**To retrieve an XPath expression for a Workday user attribute:**

1. Download and install [Workday Studio](https://community.workday.com/studio-download).
   You will need a Workday community account to access the installer.
2. Download the Workday Human_Resources WSDL file from this URL: https://community.workday.com/sites/default/files/file-hosting/productionapi/Human_Resources/v21.1/Human_Resources.wsdl
3. Launch Workday Studio.
4. From the command bar, select the **Workday > Test Web Service in Tester** option.
5. Select **External**, and select the Human_Resources WSDL file you downloaded in step 2.

   ![Workday Studio](./media/workday-inbound-tutorial/wdstudio1.png)

6. Set the **Location** field to `https://IMPL-CC.workday.com/ccx/service/TENANT/Human_Resources`, replacing "IMPL-CC" with your actual instance type and "TENANT" with your real tenant name.
7. Set **Operation** to **Get_Workers**.
8. Click the small **configure** link below the Request/Response panes to set your Workday credentials. Check **Authentication**, and then enter the user name and password for your Workday integration system account. Be sure to format the user name as name\@tenant, and select the **WS-Security UsernameToken** option.

   ![Workday Studio](./media/workday-inbound-tutorial/wdstudio2.png)

9. Select **OK**.
10. In the **Request** pane, paste in the XML below, setting **Employee_ID** to the employee ID of a real user in your Workday tenant. Select a user that has the attribute you wish to extract populated.

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <env:Envelope xmlns:env="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsd="https://www.w3.org/2001/XMLSchema">
        <env:Body>
            <wd:Get_Workers_Request xmlns:wd="urn:com.workday/bsvc" wd:version="v21.1">
            <wd:Request_References wd:Skip_Non_Existing_Instances="true">
                <wd:Worker_Reference>
                    <wd:ID wd:type="Employee_ID">21008</wd:ID>
                </wd:Worker_Reference>
            </wd:Request_References>
            <wd:Response_Group>
                <wd:Include_Reference>true</wd:Include_Reference>
                <wd:Include_Personal_Information>true</wd:Include_Personal_Information>
                <wd:Include_Employment_Information>true</wd:Include_Employment_Information>
                <wd:Include_Management_Chain_Data>true</wd:Include_Management_Chain_Data>
                <wd:Include_Organizations>true</wd:Include_Organizations>
                <wd:Include_Reference>true</wd:Include_Reference>
                <wd:Include_Transaction_Log_Data>true</wd:Include_Transaction_Log_Data>
                <wd:Include_Photo>true</wd:Include_Photo>
                <wd:Include_User_Account>true</wd:Include_User_Account>
                <wd:Include_Roles>true</wd:Include_Roles>
            </wd:Response_Group>
            </wd:Get_Workers_Request>
        </env:Body>
    </env:Envelope>
    ```

11. Click **Send Request** (the green arrow) to execute the command. If successful, the response should appear in the **Response** pane. Check the response to make sure it contains the data of the user ID you entered and no errors.
12. If successful, copy the XML from the **Response** pane and save it as an XML file.
13. In the command bar of Workday Studio, select **File > Open File...** and open the XML file you saved. This action opens the file in the Workday Studio XML editor.

    ![Workday Studio](./media/workday-inbound-tutorial/wdstudio3.png)

14. In the file tree, navigate through **/env:Envelope > env:Body > wd:Get_Workers_Response > wd:Response_Data > wd:Worker** to find your user's data.
15. Under **wd:Worker**, find and select the attribute that you wish to add.
16. Copy the **XPath** expression for your selected attribute out of the Document Path field.
17. Remove the **/env:Envelope/env:Body/wd:Get_Workers_Response/wd:Response_Data/** prefix from the copied expression.
18. If the last item in the copied expression is a node (example: "/wd:Birth_Date"), append **/text()** to the end of the expression. This is not necessary if the last item is an attribute (example: "/@wd:type").
19. The result should be something like `wd:Worker/wd:Worker_Data/wd:Personal_Data/wd:Birth_Date/text()`. This is the value you can copy into the Azure portal.

**To add your custom Workday user attribute to your provisioning configuration:**

1. Launch the [Azure portal](https://portal.azure.com), and navigate to the Provisioning section of your Workday provisioning application, as described earlier in this tutorial.
2. Set **Provisioning Status** to **Off**, and select **Save**. This step helps ensure your changes take effect only when you are ready.
3. Under **Mappings**, select **Synchronize Workday Workers to On Premises Active Directory** (or **Synchronize Workday Workers to Azure AD**).
4. Scroll to the bottom of the next screen, and select **Show advanced options**.
5. Select **Edit attribute list for Workday**.

   ![Edit attribute list for Workday](./media/workday-inbound-tutorial/wdstudio_aad1.png)

6. Scroll to the bottom of the attribute list, to where the input fields are.
7. For **Name**, enter a display name for your attribute.
8. For **Type**, select the type that corresponds to your attribute (**String** is the most common).
9. For **API Expression**, enter the XPath expression you copied from Workday Studio. Example: `wd:Worker/wd:Worker_Data/wd:Personal_Data/wd:Birth_Date/text()`
10. Select **Add Attribute**.

    ![Add attribute](./media/workday-inbound-tutorial/wdstudio_aad2.png)

11. Select **Save** above, and then **Yes** in the dialog. Close the Attribute Mapping screen if it is still open.
12. Back on the main **Provisioning** tab, select **Synchronize Workday Workers to On Premises Active Directory** (or **Synchronize Workday Workers to Azure AD**) again.
13. Select **Add new mapping**.
14. Your new attribute should now appear in the **Source attribute** list.
15. Add a mapping for your new attribute as desired.
16. When finished, remember to set **Provisioning Status** back to **On** and save.

### <a name="exporting-and-importing-your-configuration"></a>Exporting and importing your configuration

Refer to the article [Export and import your provisioning configuration](../app-provisioning/export-import-provisioning-configuration.md).

## <a name="managing-personal-data"></a>Managing personal data

The Workday provisioning solution for Active Directory requires a provisioning agent to be installed on an on-premises Windows server, and this agent creates logs in the Windows Event log which may contain personal data, depending on your Workday-to-AD attribute mappings. To comply with user privacy obligations, you can ensure that no data is retained in the Event logs beyond 48 hours by setting up a Windows scheduled task that clears the event log (a sample command for such a task is sketched below).

The Azure AD provisioning service falls under the **data processor** category of the GDPR classification. As a data processor pipeline, the service provides data processing services to key partners and end consumers. The Azure AD provisioning service does not generate user data and has no independent control over what personal data is collected and how it is used. Data retrieval, aggregation, analysis, and reporting in the Azure AD provisioning service are based on existing enterprise data.
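As one possible way to implement such a cleanup, the following PowerShell sketch registers a scheduled task that clears the Application event log every 48 hours. The task name is arbitrary, the commands assume an elevated PowerShell session on the agent server, and you should adjust the interval and target log to match your own retention policy.

```powershell
# Hypothetical example: register a task that clears the Application event log
# (where the provisioning agent writes its entries) every 48 hours.
$action  = New-ScheduledTaskAction -Execute 'wevtutil.exe' -Argument 'cl Application'
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Hours 48)
Register-ScheduledTask -TaskName 'ClearProvisioningAgentEventLog' -Action $action -Trigger $trigger -RunLevel Highest
```

Note that `wevtutil cl` clears the entire Application log, so make sure no other applications on that server rely on longer log retention before enabling such a task.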
[!INCLUDE [GDPR-related guidance](../../../includes/gdpr-hybrid-note.md)]

With respect to data retention, the Azure AD provisioning service does not generate reports, perform analytics, or provide insights beyond 30 days. Therefore, the Azure AD provisioning service does not store, process, or retain any data beyond 30 days. This design is compliant with the GDPR regulations, Microsoft privacy compliance regulations, and Azure AD data retention policies.

## <a name="next-steps"></a>Next steps

* [Learn how to review logs and get reports on provisioning activity](../app-provisioning/check-status-user-account-provisioning.md)
* [Learn how to configure single sign-on between Workday and Azure Active Directory](workday-tutorial.md)
* [Learn how to integrate other SaaS applications with Azure Active Directory](tutorial-list.md)
* [Learn how to use the Microsoft Graph APIs to manage provisioning configurations](https://developer.microsoft.com/graph/docs/api-reference/beta/resources/synchronization-overview)
84.077429
1,023
0.792488
tur_Latn
0.999754
0c373c95d61da9a27261b22b3547e6b97a3bc366
327
md
Markdown
README.md
TheSTL/Visualize-Sorting-Algorithm
2c58030d821ba78e7c155a4f4baacb86c14798aa
[ "MIT" ]
16
2020-04-15T16:40:54.000Z
2022-01-20T17:46:03.000Z
README.md
TheSTL/Visualize-Sorting-Algorithm
2c58030d821ba78e7c155a4f4baacb86c14798aa
[ "MIT" ]
2
2020-05-04T09:46:00.000Z
2020-05-04T13:40:41.000Z
README.md
TheSTL/Visualize-Sorting-Algorithm
2c58030d821ba78e7c155a4f4baacb86c14798aa
[ "MIT" ]
1
2020-05-04T10:04:39.000Z
2020-05-04T10:04:39.000Z
## Visualize-Sorting-Algorithm

This project is created to visualize sorting algorithms such as merge sort, quick sort, heap sort, etc. in a better way.

![vs](https://user-images.githubusercontent.com/50075905/86538405-85e6ca00-bf13-11ea-8fa4-5a83ea396041.gif)

### Task List

- [X] Merge Sort
- [X] Quick Sort
- [ ] Heap Sort
21.8
116
0.740061
eng_Latn
0.569287
0c37c3406209291ff4ceda4b1f3d45019e44870f
2,671
md
Markdown
README.md
zattoo/reviewers
1991c1bcb0f2a68e6de295f9a30d0235b0413588
[ "MIT" ]
null
null
null
README.md
zattoo/reviewers
1991c1bcb0f2a68e6de295f9a30d0235b0413588
[ "MIT" ]
22
2021-08-24T14:26:56.000Z
2022-01-14T13:36:24.000Z
README.md
zattoo/reviewers
1991c1bcb0f2a68e6de295f9a30d0235b0413588
[ "MIT" ]
null
null
null
# Reviewers

GitHub Action to recognize and assign reviewers and codeowners

## Inputs

### `token`

`string`

Required. GitHub token

### `source`

`string`

Required. Filename which contains the owners metadata to look for

### `ignore`

`multi-line string`

Optional. A list of files the action should ignore when assigning reviewers. If there are no changed files other than the ignored ones, the action will assign root-level owners.

### `labels`

`JSON string`

Optional. Record with `label` keys and regex path values. If a specified label is added to a PR, the action will assign reviewers according to the map.

## Where to see Information

The action provides information on which files need to be approved and who the codeowners of each file are.
The information is updated every time the action runs automatically.
It can be found in the Pull Request description as well as in the Workflow logs.
If the number of changed files exceeds 500, the action won't show per-file details in the Pull Request description.

### Example:

```
<!-- reviewers start -->
## Reviewers

2 files needs to be approved by: @not-a-user, @another-user

<details>
<summary>Details</summary>

| File | Owners |
| :--- | :--- |
| `src/index.jsx` | not-a-user |
| `.github/workflows/reviewers.yml` | not-a-user, another-user|
</details>
<!-- reviewers end -->
```

## Usage

### Metadata file

The metadata file contains a list of owners, one per line, which are assigned to all sub-paths.

```yml
# name: projects/common/.owners

nitzanashi
```

If the changed file was `projects/common/utils/time.js`, the action searches for the closest `source` file (e.g. `.owners`).
In the current example `projects/common/.owners` is the closest one, so all the owners listed in that file will be assigned.

### Workflow

````yaml
name: Reviewers

on:
  pull_request_review:
  pull_request:
    types: [
      opened,
      ready_for_review,
      reopened,
      synchronize,
      labeled,
      unlabeled,
    ]

jobs:
  reviewers:
    name: Reviewers
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: zattoo/reviewers@levels
        with:
          token: ${{secrets.TOKEN}}
          source: '.owners'
          ignore: |
            CHANGELOG.md
            Another file
          labels: |
            {
              "reviewers:projects": "**/projects/*",
              "reviewers:platform": "/"
            }
````
25.438095
167
0.618495
eng_Latn
0.991889
0c37c884a7a2cc5129427859eec262b560617059
5,157
md
Markdown
rc/shared.md
mrqa/mrqa.github.io
65f425390acd0e370af6d940cfdc23e68c8d2789
[ "MIT" ]
6
2018-11-14T08:36:20.000Z
2020-06-26T02:26:25.000Z
rc/shared.md
mrqa/mrqa.github.io
65f425390acd0e370af6d940cfdc23e68c8d2789
[ "MIT" ]
1
2019-06-20T13:07:11.000Z
2019-06-25T19:01:14.000Z
rc/shared.md
mrqa/mrqa.github.io
65f425390acd0e370af6d940cfdc23e68c8d2789
[ "MIT" ]
4
2019-03-16T12:05:18.000Z
2021-10-03T18:04:51.000Z
--- layout: main-rc title: Shared Task order: 2 collection: pages_2019_rc permalink: /rc/shared --- ## Shared Task Overview The 2019 MRQA Shared Task focuses on **generalization**. We release an official training dataset containing examples from existing QA datasets, and evaluate submitted models on ten hidden QA test datasets. Train and test datasets may differ in some of the following ways: - **Passage distribution**: Test examples may involve passages from different sources (e.g., science, news, novels, medical abstracts, etc) with pronounced syntactic and lexical differences. - **Question distribution**: Test examples may emphasize different styles of questions (e.g., entity-centric, relational, other tasks reformulated as QA, etc) which may come from different sources (e.g., crowdworkers, domain experts, exam writers, etc.) - **Joint distribution**: Test examples may vary according to the relationship of the question to the passage (e.g., collected independent vs. dependent of evidence, multi-hop, etc) Both train and test datasets have the same format and this year we focus on extractive question answering. That is, given a question and context passage, systems must find a segment of text, or span in the document that best answers the question. While this format is somewhat restrictive, it allows us to leverage many existing datasets, and its simplicity helps us focus on out-of-domain generalization, instead of other important but orthogonal challenges. Each participant will submit a single QA system trained on the provided training data. We will then privately evaluate each system on the hidden test data. ## Training Datasets All participants are required to use our official training corpus (see our [GitHub repository](https://github.com/mrqa/MRQA-Shared-Task-2019) for details), which consists of examples pooled from the following datasets: - [SQuAD](https://arxiv.org/abs/1606.05250) (Rajpurkar et al., 2016) - [NewsQA](https://arxiv.org/abs/1611.09830) (Trischler et al., 2016) - [TriviaQA](https://arxiv.org/abs/1705.03551) (Joshi et al., 2017) - [SearchQA](https://arxiv.org/abs/1704.05179) (Dunn et al., 2017) - [HotpotQA](https://arxiv.org/abs/1809.09600) (Yang, Qi, Zhang, et al., 2018) - [NaturalQuestions](https://ai.google/research/pubs/pub47761) (Kwiatkowski et al., 2019) **No other question answering data may be used for training.** We allow and encourage participants to use off-the-shelf tools for linguistic annotation (e.g. POS taggers, syntactic parsers), as well as any publicly available unlabeled data and models derived from these (e.g. word vectors, pre-trained language models). ## Dev & Test Datasets For development, we will release development datasets for **five** out of the ten test datasets: - TBA on May 13 We will keep the other five test datasets hidden until the conclusion of the shared task. We hope this will prevent teams from building solutions that are specific to our test datasets, but do not generalize to other datasets. Note: while the development data can be used for model selection, **participants should not train models directly on the development data**. ## Evaluation Systems will first be evaluated using automatic metrics: exact match score (EM) and word-level F1-score (F1). EM only gives credit for predictions that exactly match the gold answer(s), whereas F1 gives partial credit for partial word overlap with the gold answer(s). We will judge systems primarily on their (macro-) average F1 score across all test datasets. 
Time and resources permitting, we plan to run human evaluation on the top few systems with the highest overall score. Human evaluators will directly judge whether top systems’ predictions are good answers to the test questions. After models have been submitted, we will release anonymized, interactive web demos for high-performing models. Anyone will be able to pose their own questions to these models, in order to better understand their strengths and weaknesses. We will report on these findings at the workshop. ## Data Format and Submission Instructions We detail data format and submission instructions, along with our baseline models, in this [GitHub repository](https://github.com/mrqa/MRQA-Shared-Task-2019). For any inquiry about the shared task and the submission, please make a new **issue** in the repository. ## Important Dates - **May 2, 2019**: [Training datasets released](https://github.com/mrqa/MRQA-Shared-Task-2019#training-data) - **May 13, 2019**: Development datasets released - **July 29, 2019**: Deadline for model submission - **August 12, 2019**: Test results announced - **August 30, 2019**: System description paper submission deadline - **September 16, 2019**: Acceptance notification and reviews shared with authors - **September 30, 2019**: System description paper camera-ready deadline All submission deadlines are 11:59 PM GMT-12 (anywhere in the world). ## Questions? For any questions regarding our shared task, please use [Github issues](https://github.com/mrqa/MRQA-Shared-Task-2019/issues). We are here to answer your questions and looking forward to your submissions!
58.602273
459
0.779523
eng_Latn
0.992406
0c37e6dd9f8ec72790ec7f1f3bb812cd70800e29
455
md
Markdown
_posts/Baekjoon/2020-12-28-11022.md
limjunho/limjunho.github.io
f71e5b80fe56499789aac9d14285b9ebf542797f
[ "MIT" ]
null
null
null
_posts/Baekjoon/2020-12-28-11022.md
limjunho/limjunho.github.io
f71e5b80fe56499789aac9d14285b9ebf542797f
[ "MIT" ]
null
null
null
_posts/Baekjoon/2020-12-28-11022.md
limjunho/limjunho.github.io
f71e5b80fe56499789aac9d14285b9ebf542797f
[ "MIT" ]
null
null
null
---
title: Baekjoon 11022번
tags: Algorithm
---

[send me email](mailto:[email protected]) if you have any questions.

<!--more-->

---

### Problem

![Figure 1](/assets/Baekjoon/11022/1.PNG)

### Code

```cpp
#include <cstdio>

int main()
{
    int T;      // number of test cases
    int A, B;

    scanf("%d", &T);

    for (int i = 1; i <= T; i++)
    {
        scanf("%d %d", &A, &B);
        printf("Case #%d: %d + %d = %d\n", i, A, B, A + B);
    }

    return 0;
}
```
13.382353
70
0.481319
eng_Latn
0.261646
0c3819a9197f7654b04d8181c161c5843e0fee25
3,672
md
Markdown
docs/vs-2015/code-quality/ca1414-mark-boolean-p-invoke-arguments-with-marshalas.md
nuthrash/visualstudio-docs.zh-tw
a7dbb032182faf6f26c4b1b39a532591622acd16
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/vs-2015/code-quality/ca1414-mark-boolean-p-invoke-arguments-with-marshalas.md
nuthrash/visualstudio-docs.zh-tw
a7dbb032182faf6f26c4b1b39a532591622acd16
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/vs-2015/code-quality/ca1414-mark-boolean-p-invoke-arguments-with-marshalas.md
nuthrash/visualstudio-docs.zh-tw
a7dbb032182faf6f26c4b1b39a532591622acd16
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: 'CA1414: Mark boolean P/Invoke arguments with MarshalAs | Microsoft Docs'
ms.date: 11/15/2016
ms.prod: visual-studio-dev14
ms.technology: vs-ide-code-analysis
ms.topic: reference
f1_keywords:
- CA1414
- MarkBooleanPInvokeArgumentsWithMarshalAs
helpviewer_keywords:
- CA1414
- MarkBooleanPInvokeArgumentsWithMarshalAs
ms.assetid: c0c84cf5-7701-4897-9114-66fc4b895699
caps.latest.revision: 16
author: gewarren
ms.author: gewarren
manager: wpickett
ms.openlocfilehash: e444519c5a6d6d1547b782006d063e90d4a3b976
ms.sourcegitcommit: 8b538eea125241e9d6d8b7297b72a66faa9a4a47
ms.translationtype: MT
ms.contentlocale: zh-TW
ms.lasthandoff: 01/23/2019
ms.locfileid: "58944663"
---
# <a name="ca1414-mark-boolean-pinvoke-arguments-with-marshalas"></a>CA1414: Mark boolean P/Invoke arguments with MarshalAs
[!INCLUDE[vs2017banner](../includes/vs2017banner.md)]

|||
|-|-|
|TypeName|MarkBooleanPInvokeArgumentsWithMarshalAs|
|CheckId|CA1414|
|Category|Microsoft.Interoperability|
|Breaking Change|Breaking|

## <a name="cause"></a>Cause
 A platform invoke method declaration includes a <xref:System.Boolean?displayProperty=fullName> parameter or return value, but the <xref:System.Runtime.InteropServices.MarshalAsAttribute?displayProperty=fullName> attribute is not applied to the parameter or return value.

## <a name="rule-description"></a>Rule description
 A platform invoke method accesses unmanaged code and is defined by using the `Declare` keyword in [!INCLUDE[vbprvb](../includes/vbprvb-md.md)] or the <xref:System.Runtime.InteropServices.DllImportAttribute?displayProperty=fullName>. <xref:System.Runtime.InteropServices.MarshalAsAttribute> specifies the marshaling behavior used to convert data types between managed and unmanaged code. Many simple data types, such as <xref:System.Byte?displayProperty=fullName> and <xref:System.Int32?displayProperty=fullName>, have a single representation in unmanaged code and do not require their marshaling behavior to be specified; the common language runtime automatically supplies the correct behavior.

 The <xref:System.Boolean> data type has multiple representations in unmanaged code. When the <xref:System.Runtime.InteropServices.MarshalAsAttribute> is not specified, the default marshaling behavior for the <xref:System.Boolean> data type is UnmanagedType.Bool (see <xref:System.Runtime.InteropServices.UnmanagedType?displayProperty=fullName>). This is a 32-bit integer, which is not suitable in all circumstances. The Boolean representation that the unmanaged method requires should be determined and matched to the appropriate <xref:System.Runtime.InteropServices.UnmanagedType?displayProperty=fullName>. UnmanagedType.Bool is the Win32 BOOL type, which is always 4 bytes. UnmanagedType.U1 should be used for the C++ `bool` or other 1-byte types. For more information, see [Default marshaling for Boolean types](http://msdn.microsoft.com/d4c00537-70f7-4ca6-8197-bfc1ec037ff9).

## <a name="how-to-fix-violations"></a>How to fix violations
 To fix a violation of this rule, apply the <xref:System.Runtime.InteropServices.MarshalAsAttribute> to the <xref:System.Boolean> parameter or return value. Set the value of the attribute to the appropriate <xref:System.Runtime.InteropServices.UnmanagedType>.

## <a name="when-to-suppress-warnings"></a>When to suppress warnings
 Do not suppress a warning from this rule. Even when the default marshaling behavior is appropriate, the code is easier to maintain when the behavior is explicitly specified.

## <a name="example"></a>Example
 The following example shows two platform invoke methods that are marked with the appropriate <xref:System.Runtime.InteropServices.MarshalAsAttribute> attributes.

 [!code-cpp[FxCop.Interoperability.BoolMarshalAs#1](../snippets/cpp/VS_Snippets_CodeAnalysis/FxCop.Interoperability.BoolMarshalAs/cpp/FxCop.Interoperability.BoolMarshalAs.cpp#1)]
 [!code-csharp[FxCop.Interoperability.BoolMarshalAs#1](../snippets/csharp/VS_Snippets_CodeAnalysis/FxCop.Interoperability.BoolMarshalAs/cs/FxCop.Interoperability.BoolMarshalAs.cs#1)]
 [!code-vb[FxCop.Interoperability.BoolMarshalAs#1](../snippets/visualbasic/VS_Snippets_CodeAnalysis/FxCop.Interoperability.BoolMarshalAs/vb/FxCop.Interoperability.BoolMarshalAs.vb#1)]

## <a name="related-rules"></a>Related rules
 [CA1901: P/Invoke declarations should be portable](../code-quality/ca1901-p-invoke-declarations-should-be-portable.md)

 [CA2101: Specify marshaling for P/Invoke string arguments](../code-quality/ca2101-specify-marshaling-for-p-invoke-string-arguments.md)

## <a name="see-also"></a>See also
 <xref:System.Runtime.InteropServices.UnmanagedType?displayProperty=fullName>
 [Default Marshaling for Boolean Types](http://msdn.microsoft.com/d4c00537-70f7-4ca6-8197-bfc1ec037ff9)

 [Interoperating with Unmanaged Code](http://msdn.microsoft.com/library/ccb68ce7-b0e9-4ffb-839d-03b1cd2c1258)
58.285714
548
0.817538
yue_Hant
0.688939
0c3943c559d4c4dd031d1773f5740d43570aae57
510
md
Markdown
_shows/73_74/the_stronger.md
johnathan99j/history-project
4b65d28d804b03faddc6439aeef6b7cb66ea52de
[ "MIT" ]
17
2015-04-14T00:36:02.000Z
2019-10-10T14:37:42.000Z
_shows/73_74/the_stronger.md
johnathan99j/history-project
4b65d28d804b03faddc6439aeef6b7cb66ea52de
[ "MIT" ]
1,563
2015-04-14T02:06:03.000Z
2022-03-25T18:18:11.000Z
_shows/73_74/the_stronger.md
johnathan99j/history-project
4b65d28d804b03faddc6439aeef6b7cb66ea52de
[ "MIT" ]
27
2015-04-26T17:43:47.000Z
2022-03-14T15:43:46.000Z
---
title: The Stronger
playwright: August Strindberg
translator: John McRae
date_start: 1974-06-29
date_end: 1974-07-01
season: In House
venue: New Theatre
period: Spring
season_sort: 250
crew:
 - role: Director
   name: John McRae
---

Two actresses meet by chance on Christmas Eve in a cafe. One is married, the other is not. One talks, the other remains silent, but between the two a story unfolds of past conquests and present victories. Of the two, the silent one and the chatterer, who is the stronger?
26.842105
271
0.766667
eng_Latn
0.998383
0c39c96563dafbec7f416e42c38996fa808a8d2f
1,959
md
Markdown
README.md
uberpixel/commit-bot
f76b83306525f005d4b71827d3a9a4137381de68
[ "MIT" ]
2
2015-02-03T17:02:06.000Z
2021-07-02T13:57:39.000Z
README.md
uberpixel/commit-bot
f76b83306525f005d4b71827d3a9a4137381de68
[ "MIT" ]
null
null
null
README.md
uberpixel/commit-bot
f76b83306525f005d4b71827d3a9a4137381de68
[ "MIT" ]
null
null
null
A crude and simple post-receive hook that sends out commit notifications via Jabber/XMPP.

## Requirements

* git
* python
* sleekxmpp

## Setup

Create a new file named `config.json` in the bots directory, and add a dictionary named `sender` with `account` and `pass` keys containing the account name and password for the XMPP account that should deliver the notifications.

Example:

    {
        "sender": {
            "account": "[email protected]",
            "pass": "1234567890"
        }
    }

## Config.json

The `sender` dictionary is a **must**; additionally, you can specify the following keys:

`recipients`: An array of XMPP accounts that should receive all notifications

`repositories`: An array of dictionaries for every repository. Each entry must at least have a key called `name` with the name of the repository as value. Additionally you can set a `recipients` array, which works just like the global one, except that it's constrained to this repository only. You can also set an `exclude` key which can be either `true`, to completely exclude the repository, or an array of strings containing the names of branches which should be excluded.

`whitelist`: Defaults to `false` if not set; however, if set to `true`, the bot will only send notifications about repositories that it finds in the `repositories` list and which are not explicitly excluded.

Example config.json:

    {
        "sender": {
            "account": "[email protected]",
            "pass": "1234567890"
        },
        "whitelist": true,
        "recipients": [ "global@receiver" ],
        "repositories": [
            {
                "name": "some-repo",
                "exclude": true
            },
            {
                "name": "another-repo",
                "recipients": [ "local@receiver" ]
            },
            {
                "name": "third-repo",
                "exclude": [ "some-branch", "another-branch" ]
            }
        ]
    }

## Usage

Simply symlink the post-receive script into the hooks directory of the git repositories you want to receive notifications for

## License

MIT license. See LICENSE.md for the complete licensing text
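The repository contains the actual hook; the sketch below is only a rough, hypothetical illustration of how a Python script could read the `config.json` layout described above and decide who gets notified. The function names and exact field handling are assumptions, not the project's real code.

```python
# Hypothetical sketch of consuming config.json as documented above.
import json

def load_config(path="config.json"):
    with open(path) as fh:
        config = json.load(fh)
    # "sender" with "account" and "pass" is mandatory per the README
    sender = config["sender"]
    assert "account" in sender and "pass" in sender
    return config

def recipients_for(config, repo_name, branch):
    """Resolve who should be notified for a push to repo_name/branch."""
    recipients = list(config.get("recipients", []))
    repo_entries = {r["name"]: r for r in config.get("repositories", [])}
    entry = repo_entries.get(repo_name)
    if entry is None:
        # In whitelist mode, repositories not listed are skipped entirely.
        return [] if config.get("whitelist", False) else recipients
    exclude = entry.get("exclude", False)
    if exclude is True or (isinstance(exclude, list) and branch in exclude):
        return []
    return recipients + entry.get("recipients", [])

# Example: recipients_for(load_config(), "another-repo", "master")
```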
34.982143
474
0.711588
eng_Latn
0.998437
0c3a132e645c19dd0c434fa807f283f2a2d79902
298
md
Markdown
content/_index.md
pinkfloppydisk/edvinas-dev
ee834c71299f9880c6c1473e26114e3634f687e1
[ "MIT" ]
null
null
null
content/_index.md
pinkfloppydisk/edvinas-dev
ee834c71299f9880c6c1473e26114e3634f687e1
[ "MIT" ]
null
null
null
content/_index.md
pinkfloppydisk/edvinas-dev
ee834c71299f9880c6c1473e26114e3634f687e1
[ "MIT" ]
null
null
null
---
title:
---

This is a site where I write about tech, video games, random non-tech stuff, personal projects and the like.

Check out the links scattered around this site, you might find something useful.

At the moment I'm mainly focusing on posting content [here](/posts "Posts"), check it out.

Tests
29.8
181
0.748322
eng_Latn
0.999709
0c3a6a1c733b3ce305fddc0fc3d3936d881520d1
2,034
md
Markdown
_cases-test2/the-prosecutor-v-lubanga.md
ABA-Center-for-Human-Rights/aba-icc
f9133a6fb68c6c1fa51a997128a53bf1b423d526
[ "MIT" ]
null
null
null
_cases-test2/the-prosecutor-v-lubanga.md
ABA-Center-for-Human-Rights/aba-icc
f9133a6fb68c6c1fa51a997128a53bf1b423d526
[ "MIT" ]
61
2015-01-22T15:08:08.000Z
2016-12-14T18:01:05.000Z
_cases-test2/the-prosecutor-v-lubanga.md
isabella232/aba-icc
94e870caa4b377228e6f5e1b5753675db8623fb1
[ "MIT" ]
3
2015-02-19T16:25:25.000Z
2017-01-07T14:33:44.000Z
--- title: "The Prosecutor v. Lubanga" published: true country: democratic-republic-of-the-congo trial_opening_date: 2009-01-26T00:00:00.000Z case_status: "Trial Complete: Sentenced" long_name: "The Prosecutor v. Thomas Lubanga Dyilo" accuseds: - thomas-lubanga-dyilo key_actions: - event : event_date: action: accused: thomas-lubanga-dyilo description: "Lubanga was transferred to the Hague on March 16, 2006. Charges were confirmed against him on January 29, 2007. He was convicted on March 14, 2012. He was sentenced, on July 10, 2012, to 14 years of imprisonment from which his time already spent in ICC custody will be deducted." - event (2006-03-18): event_date: 2006-03-18 action: "Arrest Warrant " accused: thomas-lubanga-dyilo description: "[Made public](http://www.icc-cpi.int/iccdocs/doc/doc236258.pdf)" - event (2007-01-29): event_date: 2007-01-29 action: "Confirmation of Charges" accused: thomas-lubanga-dyilo description: "[Decision on the confirmation of charges](http://www.icc-cpi.int/iccdocs/doc/doc266175.PDF)" - event (2009-01-26): event_date: 2009-01-26 action: "Start of Trial " accused: thomas-lubanga-dyilo description: "" - event (2012-03-14): event_date: 2012-03-14 action: "Trial Judgement " accused: thomas-lubanga-dyilo description: "" - event (2012-07-10): event_date: 2012-07-10 action: "Sentencing" accused: thomas-lubanga-dyilo description: "" - event (2012-08-07): event_date: 2012-08-07 action: "Decision on Reparations" accused: thomas-lubanga-dyilo description: "" - event (2014-12-01): event_date: 2014-12-01 action: "Appeal Judgement " accused: thomas-lubanga-dyilo description: "" - event (2015-03-03): event_date: 2015-03-03 action: "Amended Decision on Reparations" accused: thomas-lubanga-dyilo description: "" slug: the-prosecutor-v-lubanga --- * * *
33.344262
299
0.669125
eng_Latn
0.657526
0c3ab571359ac1910a2bbfe7724e21b161ae86e1
72
md
Markdown
src/Python/GeometricObjects/LinearCellDemo.md
cvandijck/VTKExamples
b6bb89414522afc1467be8a1f0089a37d0c16883
[ "Apache-2.0" ]
309
2017-05-21T09:07:19.000Z
2022-03-15T09:18:55.000Z
src/Python/GeometricObjects/LinearCellDemo.md
yijianmingliu/VTKExamples
dc8aac47c4384f9a2de9facbdd1ab3249f62ec99
[ "Apache-2.0" ]
379
2017-05-21T09:06:43.000Z
2021-03-29T20:30:50.000Z
src/Python/GeometricObjects/LinearCellDemo.md
yijianmingliu/VTKExamples
dc8aac47c4384f9a2de9facbdd1ab3249f62ec99
[ "Apache-2.0" ]
170
2017-05-17T14:47:41.000Z
2022-03-31T13:16:26.000Z
### Description

This example displays all linear cells in the Toolkit.
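For readers unfamiliar with the term, "linear cells" are the cell types with straight edges and linear interpolation (vtkLine, vtkTriangle, vtkTetra, and so on). A minimal Python sketch, assuming the `vtk` package is installed (the real demo enumerates every linear cell type):

```python
# Minimal sketch only; the full LinearCellDemo builds and renders all linear cells.
import vtk

# A line: two point IDs, linear interpolation between them.
line = vtk.vtkLine()
line.GetPointIds().SetId(0, 0)
line.GetPointIds().SetId(1, 1)

# A triangle: three point IDs, straight edges.
triangle = vtk.vtkTriangle()
for i in range(3):
    triangle.GetPointIds().SetId(i, i)

print(line.GetNumberOfPoints(), triangle.GetNumberOfPoints())  # 2 3
```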
18
54
0.777778
eng_Latn
0.997459
0c3b6917a033fb9209ff2c1113d9a48c5f72842f
28,644
md
Markdown
docs/boards/work-items/guidance/work-item-field.md
leotsarev/vsts-docs
f05687a10743bfa3a9230c58dd4410f645395b3b
[ "CC-BY-4.0", "MIT" ]
2
2020-03-02T07:18:04.000Z
2020-03-20T22:25:25.000Z
docs/boards/work-items/guidance/work-item-field.md
leotsarev/vsts-docs
f05687a10743bfa3a9230c58dd4410f645395b3b
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/boards/work-items/guidance/work-item-field.md
leotsarev/vsts-docs
f05687a10743bfa3a9230c58dd4410f645395b3b
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: Index of default and system work item fields titleSuffix: Azure Boards description: Index to all fields used in the Agile, Scrum, and CMMI processes/process templates for Azure Boards, Azure DevOps, & Team Foundation Server ms.custom: work-items ms.technology: devops-agile ms.assetid: 9720b88e-474c-451b-b3fe-5253ba24a653 ms.topic: reference ms.author: kaelli author: KathrynEE monikerRange: '>= tfs-2013' ms.date: 10/03/2019 --- # Work item field index [!INCLUDE [temp](../includes/version-vsts-tfs-all-versions.md)] Use this index to look up a description of each field used to track work items. This reference includes all fields defined within the core system processes/process templates: [Basic](../../get-started/plan-track-work.md), [Agile](agile-process.md), [Scrum](scrum-process.md), and [CMMI](cmmi-process.md). The fields and work item types (WITs) available to you depend on the process you chose when you [created your project](../../../organizations/projects/create-project.md). ::: moniker range="azure-devops" To support additional tracking needs, you can [define your own custom work item fields](../../../organizations/settings/work/customize-process.md). ::: moniker-end ::: moniker range="azure-devops-2019" To support additional tracking needs, you can [define your own custom work item fields](../../../organizations/settings/work/customize-process.md) using the Inheritance process model, or if your project collection is configured to use the On-premises XML process model, then see [Modify or add a custom field](../../../reference/add-modify-field.md). ::: moniker-end ::: moniker range="<= tfs-2018" To support additional tracking needs, you can [modify or add a custom field](../../../reference/add-modify-field.md). ::: moniker-end ::: moniker range=">= azure-devops-2019" > [!NOTE] > The [Analytics Service](../../../report/powerbi/what-is-analytics.md) doesn't support reporting on plain text and HTML fields at this time. 
::: moniker-end ## Alphabetical index Values in parenthesis indicate the following: - **System**: Core system field assigned to all work item types for all processes - **Agile**: Used only by the [Agile process](agile-process.md) - **CMMI**: Used only by the [CMMI process](cmmi-process.md) - **Scrum**: Used only by the [Scrum process](scrum-process.md) - **TCM**: Used to support Test case management <table> <tbody valign="top"> <tr> <td width="33%"><h3>A</h3> <ul> <li><a href="../../queries/titles-ids-descriptions.md" data-raw-source="[Acceptance Criteria](../../queries/titles-ids-descriptions.md)">Acceptance Criteria</a> (Scrum)</li> <li><a href="guidance-code-review-feedback-field-reference.md" data-raw-source="[Accepted By](guidance-code-review-feedback-field-reference.md)">Accepted By</a> </li> <li><a href="guidance-code-review-feedback-field-reference.md" data-raw-source="[Accepted Date](guidance-code-review-feedback-field-reference.md)">Accepted Date</a></li> <li><a href="../../queries/query-by-workflow-changes.md" data-raw-source="[Activated By](../../queries/query-by-workflow-changes.md)">Activated By</a></li> <li><a href="../../queries/query-by-workflow-changes.md" data-raw-source="[Activated Date](../../queries/query-by-workflow-changes.md)">Activated Date</a></li> <li><a href="../../queries/query-numeric.md" data-raw-source="[Activity](../../queries/query-numeric.md)">Activity</a></li> <li><a href="cmmi/guidance-review-meeting-field-reference-cmmi.md" data-raw-source="[Actual Attendee 1-8](cmmi/guidance-review-meeting-field-reference-cmmi.md)">Actual Attendee 1-8</a> (CMMI)</li> <li><a href="cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md" data-raw-source="[Analysis](cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md)">Analysis</a> (CMMI)</li> <li><a href="guidance-code-review-feedback-field-reference.md" data-raw-source="[Application Launch Instructions](guidance-code-review-feedback-field-reference.md)">Application Launch Instructions</a></li> <li><a href="guidance-code-review-feedback-field-reference.md" data-raw-source="[Application Start Information](guidance-code-review-feedback-field-reference.md)">Application Start Information</a> </li> <li><a href="guidance-code-review-feedback-field-reference.md" data-raw-source="[Application Type](guidance-code-review-feedback-field-reference.md)">Application Type</a> </li> <li><a href="../../queries/query-by-area-iteration-path.md" data-raw-source="[Area ID](../../queries/query-by-area-iteration-path.md)">Area ID</a> (System)</li> <li><a href="../../queries/query-by-area-iteration-path.md" data-raw-source="[Area Path](../../queries/query-by-area-iteration-path.md)">Area Path</a> (System)</li> <li><a href="../../queries/query-by-workflow-changes.md" data-raw-source="[Assigned To](../../queries/query-by-workflow-changes.md)">Assigned To</a></li> <li><a href="guidance-code-review-feedback-field-reference.md" data-raw-source="[Associated Context](guidance-code-review-feedback-field-reference.md)">Associated Context</a></li> <li><a href="guidance-code-review-feedback-field-reference.md" data-raw-source="[Associated Context Code](guidance-code-review-feedback-field-reference.md)">Associated Context Code</a></li> <li><a href="guidance-code-review-feedback-field-reference.md" data-raw-source="[Associated Context Owner](guidance-code-review-feedback-field-reference.md)">Associated Context Owner</a></li> <li><a href="guidance-code-review-feedback-field-reference.md" data-raw-source="[Associated Context 
Type](guidance-code-review-feedback-field-reference.md)">Associated Context Type</a></li> <li><a href="../../queries/linking-attachments.md" data-raw-source="[Attached File Count](../../queries/linking-attachments.md)">Attached File Count</a></li> <li>Authorized As (Not used)</li> <li><a href="../../queries/build-test-integration.md" data-raw-source="[Automated Test Id](../../queries/build-test-integration.md)">Automated Test Id</a> (TCM)</li> <li><a href="../../queries/build-test-integration.md" data-raw-source="[Automated Test Name](../../queries/build-test-integration.md)">Automated Test Name</a> (TCM) </li> <li><a href="../../queries/build-test-integration.md" data-raw-source="[Automated Test Storage](../../queries/build-test-integration.md)">Automated Test Storage</a> (TCM)</li> <li><a href="../../queries/build-test-integration.md" data-raw-source="[Automated Test Type](../../queries/build-test-integration.md)">Automated Test Type</a> (TCM) </li> <li><a href="../../queries/build-test-integration.md" data-raw-source="[AutomatedTestId](../../queries/build-test-integration.md)">AutomatedTestId</a> (TCM) </li> <li><a href="../../queries/build-test-integration.md" data-raw-source="[AutomatedTestName](../../queries/build-test-integration.md)">AutomatedTestName</a> (TCM)</li> <li><a href="../../queries/build-test-integration.md" data-raw-source="[Automation Status](../../queries/build-test-integration.md)">Automation Status</a> (TCM)</li> </ul> <h3>B</h3> <ul><li><a href="../../queries/planning-ranking-priorities.md" data-raw-source="[Backlog Priority](../../queries/planning-ranking-priorities.md)">Backlog Priority</a> (Scrum)</li> <li><a href="../../queries/planning-ranking-priorities.md" data-raw-source="[Blocked](../../queries/planning-ranking-priorities.md)">Blocked</a></li> <li><a href="../../queries/query-by-workflow-changes.md">Board Column<sup>1</sup></a></li> <li><a href="../../queries/query-by-workflow-changes.md" data-raw-source="[Board Column Done](../../queries/query-by-workflow-changes.md)">Board Column Done<sup>1</sup></a></li> <li><a href="../../queries/query-by-workflow-changes.md" data-raw-source="[Board Lane](../../queries/query-by-workflow-changes.md)">Board Lane<sup>1</sup></a></li> <li><a href="../../queries/query-numeric.md" data-raw-source="[Business Value](../../queries/query-numeric.md)">Business Value</a></li> </ul> <h3>C</h3> <ul><li><a href="cmmi/guidance-review-meeting-field-reference-cmmi.md" data-raw-source="[Called By](cmmi/guidance-review-meeting-field-reference-cmmi.md)">Called By</a> (CMMI)</li> <li><a href="cmmi/guidance-review-meeting-field-reference-cmmi.md" data-raw-source="[Called Date](cmmi/guidance-review-meeting-field-reference-cmmi.md)">Called Date</a> (CMMI)</li> <li><a href="../../queries/history-and-auditing.md" data-raw-source="[Changed By](../../queries/history-and-auditing.md)">Changed By</a> (System)</li> <li><a href="../../queries/history-and-auditing.md" data-raw-source="[Changed Date](../../queries/history-and-auditing.md)">Changed Date</a> (System)</li> <li><a href="../../queries/query-by-workflow-changes.md" data-raw-source="[Closed By](../../queries/query-by-workflow-changes.md)">Closed By</a> (System)</li> <li><a href="../../queries/query-by-workflow-changes.md" data-raw-source="[Closed Date](../../queries/query-by-workflow-changes.md)">Closed Date</a> (System)</li> <li><a href="guidance-code-review-feedback-field-reference.md" data-raw-source="[Closed Status](guidance-code-review-feedback-field-reference.md)">Closed Status</a></li> 
<li><a href="guidance-code-review-feedback-field-reference.md" data-raw-source="[Closed Status Code](guidance-code-review-feedback-field-reference.md)">Closed Status Code</a></li> <li><a href="guidance-code-review-feedback-field-reference.md" data-raw-source="[Closing Comment](guidance-code-review-feedback-field-reference.md)">Closing Comment</a></li> <li><a href="../../queries/linking-attachments.md">Comment Count<sup>2</sup></a></li> <li><a href="cmmi/guidance-review-meeting-field-reference-cmmi.md" data-raw-source="[Comments](cmmi/guidance-review-meeting-field-reference-cmmi.md)">Comments</a> (CMMI)</li> <li><a href="../../queries/planning-ranking-priorities.md" data-raw-source="[Committed](../../queries/planning-ranking-priorities.md)">Committed</a> (CMMI)</li> <li><a href="../../queries/query-numeric.md" data-raw-source="[Completed Work](../../queries/query-numeric.md)">Completed Work</a></li> <li><a href="cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md" data-raw-source="[Contingency Plan](cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md)">Contingency Plan</a> (CMMI)</li> <li><a href="cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md" data-raw-source="[Corrective Action Actual Resolution](cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md)">Corrective Action Actual Resolution</a> (CMMI)</li> <li><a href="cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md" data-raw-source="[Corrective Action Plan](cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md)">Corrective Action Plan</a> (CMMI)</li> <li><a href="../../queries/query-by-workflow-changes.md" data-raw-source="[Created By](../../queries/query-by-workflow-changes.md)">Created By</a> (System)</li> <li><a href="../../queries/query-by-workflow-changes.md" data-raw-source="[Created Date](../../queries/query-by-workflow-changes.md)">Created Date</a> (System)</li> </ul> </td> <td width="33%"> <h3>D-E-F</h3> <ul> <li><a href="../../queries/query-numeric.md" data-raw-source="[Discipline](../../queries/query-numeric.md)">Discipline</a> (CMMI)</li> <li><a href="../../queries/titles-ids-descriptions.md" data-raw-source="[Description](../../queries/titles-ids-descriptions.md)">Description</a> (System)</li> <li><a href="../../queries/query-by-date-or-current-iteration.md" data-raw-source="[Due Date](../../queries/query-by-date-or-current-iteration.md)">Due Date</a></li> <li><a href="../../queries/query-numeric.md" data-raw-source="[Effort](../../queries/query-numeric.md)">Effort</a> </li> <li><a href="../../queries/planning-ranking-priorities.md" data-raw-source="[Escalate](../../queries/planning-ranking-priorities.md)">Escalate</a> (CMMI)</li> <li><a href="../../queries/linking-attachments.md#external-link-count" data-raw-source="[External Link Count](../../queries/linking-attachments.md#external-link-count)">External Link Count</a> </li> <li><a href="../../queries/query-by-date-or-current-iteration.md" data-raw-source="[Finish Date](../../queries/query-by-date-or-current-iteration.md)">Finish Date</a></li> <li><a href="../../queries/build-test-integration.md" data-raw-source="[Found In Build](../../queries/build-test-integration.md)">Found In Build</a> (TCM)</li> <li><a href="cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md" data-raw-source="[Found In Environment](cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md)">Found In Environment</a> (CMMI)</li> </ul> <h3>H</h3> <ul> <li><a href="../../queries/history-and-auditing.md" 
data-raw-source="[History](../../queries/history-and-auditing.md)">History</a> (System)</li> <li><a href="cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md" data-raw-source="[How Found](cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md)">How Found</a> (CMMI)</li> <li><a href="../../queries/linking-attachments.md#hyper-link-count" data-raw-source="[Hyperlink Count](../../queries/linking-attachments.md#hyper-link-count)">Hyperlink Count</a></li> </ul> <h3>I</h3> <ul> <li><a href="../../queries/titles-ids-descriptions.md" data-raw-source="[ID](../../queries/titles-ids-descriptions.md)">ID</a> (System)</li> <li><a href="cmmi/guidance-requirements-field-reference-cmmi.md" data-raw-source="[Impact Assessment](cmmi/guidance-requirements-field-reference-cmmi.md)">Impact Assessment</a> (CMMI)</li> <li><a href="cmmi/guidance-change-request-field-reference-cmmi.md" data-raw-source="[Impact on Architecture](cmmi/guidance-change-request-field-reference-cmmi.md)">Impact on Architecture</a> (CMMI)</li> <li><a href="cmmi/guidance-change-request-field-reference-cmmi.md" data-raw-source="[Impact on Development](cmmi/guidance-change-request-field-reference-cmmi.md)">Impact on Development</a> (CMMI)</li> <li><a href="cmmi/guidance-change-request-field-reference-cmmi.md" data-raw-source="[Impact on Technical Publications](cmmi/guidance-change-request-field-reference-cmmi.md)">Impact on Technical Publications</a> (CMMI)</li> <li><a href="cmmi/guidance-change-request-field-reference-cmmi.md" data-raw-source="[Impact on Test](cmmi/guidance-change-request-field-reference-cmmi.md)">Impact on Test</a> (CMMI)</li> <li><a href="cmmi/guidance-change-request-field-reference-cmmi.md" data-raw-source="[Impact on User Experience](cmmi/guidance-change-request-field-reference-cmmi.md)">Impact on User Experience</a> (CMMI)</li> <li><a href="../../queries/build-test-integration.md" data-raw-source="[Integrated in Build](../../queries/build-test-integration.md)">Integrated in Build</a> (TCM)</li> <li><a href="../../queries/build-test-integration.md" data-raw-source="[Issue](../../queries/build-test-integration.md)">Issue</a> (TCM)</li> <li><a href="../../queries/query-by-area-iteration-path.md" data-raw-source="[Iteration Id](../../queries/query-by-area-iteration-path.md)">Iteration Id</a> (System)</li> <li><a href="../../queries/query-by-area-iteration-path.md" data-raw-source="[Iteration Path](../../queries/query-by-area-iteration-path.md)">Iteration Path</a> (System)</li> </ul> <h3>J-L-M-N</h3> <ul> <li><a href="cmmi/guidance-change-request-field-reference-cmmi.md" data-raw-source="[Justification](cmmi/guidance-change-request-field-reference-cmmi.md)">Justification</a> (CMMI)</li> <li><a href="../../queries/linking-attachments.md" data-raw-source="[Link Comment](../../queries/linking-attachments.md)">Link Comment</a> (System)</li> <li><a href="../../queries/linking-attachments.md" data-raw-source="[Link Description](../../queries/linking-attachments.md)">Link Description</a> (System)</li> <li><a href="../../queries/build-test-integration.md" data-raw-source="[Local Data Source](../../queries/build-test-integration.md)">Local Data Source</a> (TCM)</li> <li><a href="cmmi/guidance-review-meeting-field-reference-cmmi.md" data-raw-source="[Meeting Type](cmmi/guidance-review-meeting-field-reference-cmmi.md)">Meeting Type</a> (CMMI)</li> <li><a href="cmmi/guidance-review-meeting-field-reference-cmmi.md" data-raw-source="[Minutes](cmmi/guidance-review-meeting-field-reference-cmmi.md)">Minutes</a> (CMMI) </li> <li><a 
href="cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md" data-raw-source="[Mitigation Plan](cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md)">Mitigation Plan</a> (CMMI) </li> <li><a href="cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md" data-raw-source="[Mitigation Triggers](cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md)">Mitigation Triggers</a> (CMMI)</li> <li><a href="../../queries/query-by-area-iteration-path.md" data-raw-source="[Node Name](../../queries/query-by-area-iteration-path.md)">Node Name</a> (System)</li> </ul> <h3>O-P-Q</h3> <ul> <li><a href="cmmi/guidance-review-meeting-field-reference-cmmi.md" data-raw-source="[Optional Attendee 1-8](cmmi/guidance-review-meeting-field-reference-cmmi.md)">Optional Attendee 1-8</a> (CMMI)</li> <li><a href="../../queries/query-numeric.md" data-raw-source="[Original Estimate](../../queries/query-numeric.md)">Original Estimate</a></li> <li><a href="../../queries/build-test-integration.md" data-raw-source="[Parameters](../../queries/build-test-integration.md)">Parameters</a> (TCM)</li> <li><a href="../../queries/linking-attachments.md#parent">Parent<sup>3</sup></a></li> <li><a href="../../queries/planning-ranking-priorities.md" data-raw-source="[Priority](../../queries/planning-ranking-priorities.md)">Priority</a> </li> <li><a href="cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md" data-raw-source="[Probability](cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md)">Probability</a> (CMMI)</li> <li><a href="cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md">Proposed Fix</a> (CMMI) </li> <li><a href="cmmi/guidance-review-meeting-field-reference-cmmi.md" data-raw-source="[Purpose](cmmi/guidance-review-meeting-field-reference-cmmi.md)">Purpose</a> (CMMI)</li> <li><a href="../../queries/build-test-integration.md" data-raw-source="[Query Text](../../queries/build-test-integration.md)">Query Text</a> (TCM)</li> </ul> </td> <td width="33%"><h3>R</h3> <ul> <li><a href="guidance-code-review-feedback-field-reference.md" data-raw-source="[Rating](guidance-code-review-feedback-field-reference.md)">Rating</a></li> <li><a href="../../queries/query-by-workflow-changes.md" data-raw-source="[Reason](../../queries/query-by-workflow-changes.md)">Reason</a> (System)</li> <li><a href="../../queries/linking-attachments.md" data-raw-source="[Related Link Count](../../queries/linking-attachments.md)">Related Link Count</a> (System)</li> <li><a href="../../queries/query-numeric.md" data-raw-source="[Remaining Work](../../queries/query-numeric.md)">Remaining Work</a> </li> <li><a href="../../queries/linking-attachments.md#remote-link-count">Remote Link Count<sup>3</sup></a> (System)</li> <li><a href="../../queries/titles-ids-descriptions.md" data-raw-source="[Repro Steps](../../queries/titles-ids-descriptions.md)">Repro Steps</a></li> <li><a href="cmmi/guidance-review-meeting-field-reference-cmmi.md" data-raw-source="[Required Attendee 1-8](cmmi/guidance-review-meeting-field-reference-cmmi.md)">Required Attendee 1-8</a> (CMMI)</li> <li><a href="cmmi/guidance-requirements-field-reference-cmmi.md" data-raw-source="[Requirement Type](cmmi/guidance-requirements-field-reference-cmmi.md)">Requirement Type</a> (CMMI)</li> <li><a href="../../queries/query-numeric.md" data-raw-source="[Requires Review](../../queries/query-numeric.md)">Requires Review</a> (CMMI)</li> <li><a href="../../queries/query-numeric.md" data-raw-source="[Requires Test](../../queries/query-numeric.md)">Requires Test</a> (CMMI)</li> <li><a 
href="../../queries/titles-ids-descriptions.md" data-raw-source="[Resolution](../../queries/titles-ids-descriptions.md)">Resolution</a> (Scrum)</li> <li><a href="../../queries/query-by-workflow-changes.md" data-raw-source="[Resolved By](../../queries/query-by-workflow-changes.md)">Resolved By</a></li> <li><a href="../../queries/query-by-workflow-changes.md" data-raw-source="[Resolved Date](../../queries/query-by-workflow-changes.md)">Resolved Date</a></li> <li><a href="../../queries/query-by-workflow-changes.md" data-raw-source="[Resolved Reason](../../queries/query-by-workflow-changes.md)">Resolved Reason</a></li> <li><a href="guidance-code-review-feedback-field-reference.md" data-raw-source="[Reviewed By](guidance-code-review-feedback-field-reference.md)">Reviewed By</a></li> <li><a href="guidance-code-review-feedback-field-reference.md" data-raw-source="[Reviewed Date](guidance-code-review-feedback-field-reference.md)">Reviewed Date</a></li> <li><a href="../../queries/history-and-auditing.md" data-raw-source="[Rev](../../queries/history-and-auditing.md)">Rev</a> (System)</li> <li><a href="../../queries/planning-ranking-priorities.md" data-raw-source="[Risk](../../queries/planning-ranking-priorities.md)">Risk</a> (Agile)</li> <li><a href="cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md" data-raw-source="[Root Cause](cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md)">Root Cause</a> (CMMI)</li> </ul> <h3>S</h3> <ul> <li><a href="../../queries/planning-ranking-priorities.md" data-raw-source="[Severity](../../queries/planning-ranking-priorities.md)">Severity</a></li> <li><a href="../../queries/query-numeric.md" data-raw-source="[Size](../../queries/query-numeric.md)">Size</a> (CMMI)</li> <li><a href="../../queries/planning-ranking-priorities.md" data-raw-source="[Stack Rank](../../queries/planning-ranking-priorities.md)">Stack Rank</a></li> <li><a href="../../queries/query-by-date-or-current-iteration.md" data-raw-source="[Start Date](../../queries/query-by-date-or-current-iteration.md)">Start Date</a></li> <li><a href="../../queries/query-by-workflow-changes.md" data-raw-source="[State](../../queries/query-by-workflow-changes.md)">State</a> (System)</li> <li><a href="../../queries/query-by-workflow-changes.md" data-raw-source="[State Change Date](../../queries/query-by-workflow-changes.md)">State Change Date</a></li> <li><a href="guidance-code-review-feedback-field-reference.md" data-raw-source="[State Code](guidance-code-review-feedback-field-reference.md)">State Code</a></li> <li><a href="../../queries/build-test-integration.md" data-raw-source="[Steps](../../queries/build-test-integration.md)">Steps</a> (TCM)</li> <li><a href="../../queries/titles-ids-descriptions.md" data-raw-source="[Steps to Reproduce](../../queries/titles-ids-descriptions.md)">Steps to Reproduce</a> (TCM)</li> <li><a href="../../queries/query-numeric.md" data-raw-source="[Story Points](../../queries/query-numeric.md)">Story Points</a> (Agile)</li> <li><a href="cmmi/guidance-requirements-field-reference-cmmi.md" data-raw-source="[Subject Matter Expert](cmmi/guidance-requirements-field-reference-cmmi.md)">Subject Matter Expert</a> (CMMI)</li> <li><a href="cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md" data-raw-source="[Symptom](cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md)">Symptom</a> (CMMI)</li> <li><a href="../../queries/titles-ids-descriptions.md" data-raw-source="[System Info](../../queries/titles-ids-descriptions.md)">System Info</a> (TCM) </li> </ul> <h3>T</h3> <ul> <li><a 
href="../../queries/add-tags-to-work-items.md" data-raw-source="[Tags](../../queries/add-tags-to-work-items.md)">Tags</a></li> <li><a href="../../queries/query-by-date-or-current-iteration.md" data-raw-source="[Target Date](../../queries/query-by-date-or-current-iteration.md)">Target Date</a></li> <li><a href="cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md" data-raw-source="[Target Resolve Date](cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md)">Target Resolve Date</a> (CMMI)</li> <li><a href="../../queries/query-numeric.md" data-raw-source="[Task Type](../../queries/query-numeric.md)">Task Type</a> (CMMI)</li> <li><a href="../../queries/titles-ids-descriptions.md" data-raw-source="[Team Project](../../queries/titles-ids-descriptions.md)">Team Project</a> (System) </li> <li><a href="../../queries/build-test-integration.md" data-raw-source="[Test Suite Audit](../../queries/build-test-integration.md)">Test Suite Audit</a> (TCM)</li> <li><a href="../../queries/build-test-integration.md" data-raw-source="[Test Suite Type](../../queries/build-test-integration.md)">Test Suite Type</a> (TCM)</li> <li><a href="../../queries/build-test-integration.md" data-raw-source="[Test Suite Type ID](../../queries/build-test-integration.md)">Test Suite Type ID</a> (TCM)</li> <li><a href="../../queries/planning-ranking-priorities.md" data-raw-source="[Time Criticality](../../queries/planning-ranking-priorities.md)">Time Criticality</a></li> <li><a href="../../queries/titles-ids-descriptions.md" data-raw-source="[Title](../../queries/titles-ids-descriptions.md)">Title</a> (System)</li> <li><a href="../../queries/planning-ranking-priorities.md" data-raw-source="[Triage](../../queries/planning-ranking-priorities.md)">Triage</a> (CMMI)</li> </ul> <h3>U-V-W</h3> <ul> <li><a href="cmmi/guidance-requirements-field-reference-cmmi.md" data-raw-source="[User Acceptance Test](cmmi/guidance-requirements-field-reference-cmmi.md)">User Acceptance Test</a> (CMMI)</li> <li><a href="../../queries/planning-ranking-priorities.md" data-raw-source="[Value Area](../../queries/planning-ranking-priorities.md)">Value Area</a></li> <li><a href="../../queries/history-and-auditing.md" data-raw-source="[Watermark](../../queries/history-and-auditing.md)">Watermark</a> (System)</li> <li><a href="../../queries/titles-ids-descriptions.md" data-raw-source="[Work Item Type](../../queries/titles-ids-descriptions.md)">Work Item Type</a> (System) </li> </ul> </td> </tr> </tbody> </table> <hr/> <br/> #### Notes: 1. These fields are available from Azure DevOps Services and TFS 2015.1 or later versions. 2. The Comment Count field is available from Azure DevOps Services and TFS 2017 or later versions. 3. These fields are available from Azure DevOps Services only at this time. By using the system fields or other fields you have added to your project collection, you can enable meaningful cross-project reports and queries. In addition, any non-system field that is referenced in the workflow or forms section of the work item type definition must have a **FIELD** element that defines it in the **FIELDS** section of the work item type definition XML file. Also, you must specify any non-system field that you might want to use to generate a query or report in the **FIELDS** section. ## Field reference topics The following articles describe fields that are used in common by several WITs, or those that are functionally specific to just one or a few WITs. 
### Fields common to many work types

- [Titles, IDs, and descriptive fields](../../queries/titles-ids-descriptions.md)
- [History and revision changes](../../queries/history-and-auditing.md)
- [Areas and iterations](../../../organizations/settings/set-area-paths.md)
- [Assignments and account-specific fields](../../queries/query-by-workflow-changes.md)
- [Planning, ranking, and priorities](../../queries/planning-ranking-priorities.md)
- [Work estimates, activity, and other numeric fields](../../queries/query-numeric.md)
- [Build and test integration fields](../../queries/build-test-integration.md)
- [Links and attachment related fields](../../queries/linking-attachments.md)

### Fields used by specific work item types

- [Code Review Request](guidance-code-review-feedback-field-reference.md)
- [Code Review Response](guidance-code-review-feedback-field-reference.md)
- [Feedback Request](guidance-code-review-feedback-field-reference.md)
- [Feedback Response](guidance-code-review-feedback-field-reference.md)
- [Shared Steps](../../queries/build-test-integration.md)
- [Test Case](../../queries/build-test-integration.md)

### Fields used to track CMMI work items

- [Requirements](cmmi/guidance-requirements-field-reference-cmmi.md)
- [Bugs](cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md)
- [Change Requests](cmmi/guidance-change-request-field-reference-cmmi.md)
- [Issues](cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md)
- [Review Meetings](cmmi/guidance-review-meeting-field-reference-cmmi.md)
- [Risks](cmmi/guidance-bugs-issues-risks-field-reference-cmmi.md)

## Related articles

- [About work item fields](../work-item-fields.md)
- [Create managed queries](../../queries/example-queries.md)
- [Define a query](../../queries/using-queries.md)
- [Choose a process](choose-process.md)
- [Reportable fields reference](../../../reference/xml/reportable-fields-reference.md) (on-premises TFS only)
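As a rough illustration of how the reference names indexed above are used in queries (this sketch is not part of the original article), a WIQL query can be posted to the Azure DevOps REST API. The organization, project, and personal access token below are placeholders:

```python
# Hedged sketch: run a WIQL query that uses a few of the fields listed above.
import base64
import requests

organization = "your-org"              # placeholder
project = "your-project"               # placeholder
pat = "your-personal-access-token"     # placeholder

wiql = {
    "query": (
        "SELECT [System.Id], [System.Title], [System.State], [Microsoft.VSTS.Common.Priority] "
        "FROM WorkItems WHERE [System.TeamProject] = @project "
        "AND [System.State] <> 'Closed' ORDER BY [System.ChangedDate] DESC"
    )
}

auth = base64.b64encode(f":{pat}".encode()).decode()
resp = requests.post(
    f"https://dev.azure.com/{organization}/{project}/_apis/wit/wiql?api-version=6.0",
    json=wiql,
    headers={"Authorization": f"Basic {auth}"},
)
resp.raise_for_status()
print([item["id"] for item in resp.json()["workItems"]])
```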
96.444444
510
0.717218
yue_Hant
0.176121
0c3b7625ad3149e248e196f6919650767b3e4733
5,391
md
Markdown
README.md
tydanel/amber
8eaff496ee9ef7c6016540da58113308f29f3d3a
[ "MIT" ]
null
null
null
README.md
tydanel/amber
8eaff496ee9ef7c6016540da58113308f29f3d3a
[ "MIT" ]
1
2018-06-21T12:49:37.000Z
2018-06-21T12:49:37.000Z
README.md
tydanel/amber
8eaff496ee9ef7c6016540da58113308f29f3d3a
[ "MIT" ]
null
null
null
![amber](https://github.com/amberframework/site-assets/raw/master/images/amber-horizontal.png) **Productivity. Performance. Happiness.** _Amber makes building web applications fast, simple, and enjoyable - with fewer bugs and blazing fast performance._ [![Build Status](https://travis-ci.org/amberframework/amber.svg?branch=master)](https://travis-ci.org/amberframework/amber) [![Version](https://img.shields.io/github/tag/amberframework/amber.svg?maxAge=360&label=version)](https://github.com/amberframework/amber/releases/latest) [![License](https://img.shields.io/github/license/amberframework/amber.svg)](https://github.com/amberframework/amber/blob/master/LICENSE) [![Liberapay patrons](https://img.shields.io/liberapay/patrons/amber-framework.svg?label=liberapay%20patrons)](https://liberapay.com/amber-framework/) [![Gitter](https://img.shields.io/gitter/room/amberframework/amber.svg)](https://gitter.im/amberframework/amber) # Welcome to Amber **Amber** is a web application framework written in [Crystal](https://crystal-lang.org/) inspired by Kemal, Rails, Phoenix and other popular application frameworks. The purpose of Amber is not to create yet another framework, but to take advantage of the beautiful Crystal language capabilities and provide engineers and the Crystal community with an efficient, cohesive, well maintained web framework that embraces the language philosophies, conventions, and guidelines. Amber borrows concepts that have already been battle tested and successful, and embraces new concepts through team and community collaboration and analysis, which also aligns with Crystal's philosophy. ## Community Questions? Join our IRC channel [#amber](https://webchat.freenode.net/?channels=#amber) or [Gitter room](https://gitter.im/amberframework/amber) or ask on Stack Overflow under the [amber-framework](https://stackoverflow.com/questions/tagged/amber-framework) tag. Guidelines? We have adopted the Contributor Covenant to be our [CODE OF CONDUCT](.github/CODE_OF_CONDUCT.md) for Amber. Our Philosophy? [Read Amber Philosophy H.R.T.](.github/AMBER_PHILOSOPHY.md) ## Documentation Read Amber documentation on https://docs.amberframework.org ## Benchmarks [Techempower Framework Benchmarks - Round 17 (2018-10-30)](https://www.techempower.com/benchmarks/#section=data-r17&hw=ph&test=json&c=6&d=9&o=e) * Filtered by Full Stack, Full ORM, Mysql or Pg for comparing similar frameworks. ## Installation & Usage #### macOS ``` brew tap amberframework/amber brew install amber ``` #### Linux ``` git clone https://github.com/amberframework/amber.git cd amber git checkout stable make sudo make install ``` If you're using ArchLinux or similar distro try: ``` yay -S amber ``` #### Common To compile a local `bin/amber` per project use `shards build amber` To use it as dependency, add this to your application's `shard.yml`: ```yaml dependencies: amber: github: amberframework/amber ``` [Read Amber quick start guide](https://docs.amberframework.org/amber/getting-started) [Read Amber CLI commands usage](https://docs.amberframework.org/amber/cli) [Read more about Amber CLI installation guide](https://docs.amberframework.org/amber/guides/installation) ## Have an Amber-based Project? 
Use Amber badge [![Amber Framework](https://img.shields.io/badge/using-amber_framework-orange.svg)](https://amberframework.org/) ```markdown [![Amber Framework](https://img.shields.io/badge/using-amber_framework-orange.svg)](https://amberframework.org/) ``` ## Contributing Contributing to Amber can be a rewarding way to learn, teach, and build experience in just about any skill you can imagine. You don’t have to become a lifelong contributor to enjoy participating in Amber. Tracking issues? Check our [project board](https://github.com/orgs/amberframework/projects/1?fullscreen=true). Code Triage? Join us on [codetriage](https://www.codetriage.com/amberframework/amber). [![Open Source Contributors](https://www.codetriage.com/amberframework/amber/badges/users.svg)](https://www.codetriage.com/amberframework/amber) Amber is a community effort and we want You to be part of it. [Join Amber Community!](https://github.com/amberframework/amber/blob/master/.github/CONTRIBUTING.md) 1. Fork it https://github.com/amberframework/amber/fork 2. Create your feature branch `git checkout -b my-new-feature` 3. Write and execute specs and formatting checks `./bin/amber_spec` 4. Commit your changes `git commit -am 'Add some feature'` 5. Push to the branch `git push origin my-new-feature` 6. Create a new Pull Request ## Contributors - [Dru Jensen](https://github.com/drujensen "drujensen") - [Elias Perez](https://github.com/eliasjpr "eliasjpr") - [Isaac Sloan](https://github.com/elorest "elorest") - [Faustino Aguilar](https://github.com/faustinoaq "faustinoaq") - [Nick Franken](https://github.com/fridgerator "fridgerator") - [Mark Siemers](https://github.com/marksiemers "marksiemers") - [Robert Carpenter](https://github.com/robacarp "robacarp") [See more Amber contributors](https://github.com/amberframework/amber/graphs/contributors) ## License This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details ## Acknowledgments * Inspired by [Kemal](https://kemalcr.com/), [Rails](https://rubyonrails.org/), [Phoenix](https://phoenixframework.org/), and [Hanami](https://hanamirb.org/)
41.152672
306
0.768874
eng_Latn
0.523288
0c3c168eccbff42c88ab5348c68f37f534206775
355
md
Markdown
_includes/modulators/pressuretocv.md
mthierman/BespokeSynthDocs
fc0b7be560c2c3ba33669622c15ee78e1db35e69
[ "CC0-1.0" ]
8
2021-10-04T00:25:12.000Z
2022-03-26T23:18:51.000Z
_includes/modulators/pressuretocv.md
mthierman/BespokeSynthDocs
fc0b7be560c2c3ba33669622c15ee78e1db35e69
[ "CC0-1.0" ]
1
2021-11-16T23:50:00.000Z
2021-11-16T23:50:00.000Z
_includes/modulators/pressuretocv.md
mthierman/BespokeSynthDocs
fc0b7be560c2c3ba33669622c15ee78e1db35e69
[ "CC0-1.0" ]
6
2021-11-17T15:00:54.000Z
2022-03-18T22:54:57.000Z
<a name=pressuretocv></a><br>

# <b>pressuretocv</b>

<img src="../images/pressuretocv.png"><br>

<div style="display:inline-block;margin-left:50px;">
take a note's pressure and convert it to a modulation value<br/><br/>
<b>max</b>: output for pressure 127<br>
<b>min</b>: output for pressure 0<br>
<br>accepts: <font color=orange>notes</font>
<br></div>
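The module's source isn't shown here, but the min/max controls suggest a simple linear mapping from the 0-127 pressure range onto [min, max]. A rough Python sketch of that presumed behavior:

```python
# Sketch of the presumed mapping (assumed linear between min and max);
# the module's actual implementation may differ.
def pressure_to_cv(pressure, cv_min=0.0, cv_max=1.0):
    """Map a MIDI-style pressure value (0..127) onto [cv_min, cv_max]."""
    pressure = max(0, min(127, pressure))
    return cv_min + (pressure / 127.0) * (cv_max - cv_min)

print(pressure_to_cv(0))    # -> min output
print(pressure_to_cv(127))  # -> max output
```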
29.583333
69
0.690141
eng_Latn
0.595646
0c3c8117642a6c775724e22131f63aaee7a6be42
5,201
md
Markdown
methods/transformers/model_cards/camembert/camembert-base-ccnet/README.md
INK-USC/RiddleSense
a3d57eaf084da9cf6b77692c608e2cd2870fbd97
[ "MIT" ]
3
2021-07-06T20:02:31.000Z
2022-03-27T13:13:01.000Z
methods/transformers/model_cards/camembert/camembert-base-ccnet/README.md
INK-USC/RiddleSense
a3d57eaf084da9cf6b77692c608e2cd2870fbd97
[ "MIT" ]
null
null
null
methods/transformers/model_cards/camembert/camembert-base-ccnet/README.md
INK-USC/RiddleSense
a3d57eaf084da9cf6b77692c608e2cd2870fbd97
[ "MIT" ]
null
null
null
---
language: fr
---

# CamemBERT: a Tasty French Language Model

## Introduction

[CamemBERT](https://arxiv.org/abs/1911.03894) is a state-of-the-art language model for French based on the RoBERTa model. It is now available on Hugging Face in 6 different versions with varying number of parameters, amount of pretraining data and pretraining data source domains. For further information or requests, please go to [Camembert Website](https://camembert-model.fr/)

## Pre-trained models

| Model | #params | Arch. | Training data |
|--------------------------------|--------------------------------|-------|-----------------------------------|
| `camembert-base` | 110M | Base | OSCAR (138 GB of text) |
| `camembert/camembert-large` | 335M | Large | CCNet (135 GB of text) |
| `camembert/camembert-base-ccnet` | 110M | Base | CCNet (135 GB of text) |
| `camembert/camembert-base-wikipedia-4gb` | 110M | Base | Wikipedia (4 GB of text) |
| `camembert/camembert-base-oscar-4gb` | 110M | Base | Subsample of OSCAR (4 GB of text) |
| `camembert/camembert-base-ccnet-4gb` | 110M | Base | Subsample of CCNet (4 GB of text) |

## How to use CamemBERT with HuggingFace

##### Load CamemBERT and its sub-word tokenizer :

```python
from transformers import CamembertModel, CamembertTokenizer

# You can replace "camembert-base" with any other model from the table, e.g. "camembert/camembert-large".
tokenizer = CamembertTokenizer.from_pretrained("camembert/camembert-base-ccnet")
camembert = CamembertModel.from_pretrained("camembert/camembert-base-ccnet")

camembert.eval()  # disable dropout (or leave in train mode to finetune)
```

##### Filling masks using pipeline

```python
from transformers import pipeline

camembert_fill_mask = pipeline("fill-mask", model="camembert/camembert-base-ccnet", tokenizer="camembert/camembert-base-ccnet")
results = camembert_fill_mask("Le camembert est <mask> :)")
# results
# [{'sequence': '<s> Le camembert est bon :)</s>', 'score': 0.14011502265930176, 'token': 305},
#  {'sequence': '<s> Le camembert est délicieux :)</s>', 'score': 0.13929404318332672, 'token': 11661},
#  {'sequence': '<s> Le camembert est excellent :)</s>', 'score': 0.07010319083929062, 'token': 3497},
#  {'sequence': '<s> Le camembert est parfait :)</s>', 'score': 0.025885622948408127, 'token': 2528},
#  {'sequence': '<s> Le camembert est top :)</s>', 'score': 0.025684962049126625, 'token': 2328}]
```

##### Extract contextual embedding features from Camembert output

```python
import torch

# Tokenize in sub-words with SentencePiece
tokenized_sentence = tokenizer.tokenize("J'aime le camembert !")
# ['▁J', "'", 'aime', '▁le', '▁cam', 'ember', 't', '▁!']

# 1-hot encode and add special starting and end tokens
encoded_sentence = tokenizer.encode(tokenized_sentence)
# [5, 133, 22, 1250, 16, 12034, 14324, 81, 76, 6]
# NB: Can be done in one step: tokenizer.encode("J'aime le camembert !")

# Feed tokens to Camembert as a torch tensor (batch dim 1)
encoded_sentence = torch.tensor(encoded_sentence).unsqueeze(0)
embeddings, _ = camembert(encoded_sentence)
# embeddings.detach()
# embeddings.size torch.Size([1, 10, 768])
# tensor([[[ 0.0667, -0.2467,  0.0954,  ...,  0.2144,  0.0279,  0.3621],
#          [-0.0472,  0.4092, -0.6602,  ...,  0.2095,  0.1391, -0.0401],
#          [ 0.1911, -0.2347, -0.0811,  ...,  0.4306, -0.0639,  0.1821],
#          ...,
```

##### Extract contextual embedding features from all Camembert layers

```python
from transformers import CamembertConfig

# (Need to reload the model with new config)
config = CamembertConfig.from_pretrained("camembert/camembert-base-ccnet", output_hidden_states=True)
camembert = CamembertModel.from_pretrained("camembert/camembert-base-ccnet", config=config)

embeddings, _, all_layer_embeddings = camembert(encoded_sentence)
# all_layer_embeddings list of len(all_layer_embeddings) == 13 (input embedding layer + 12 self attention layers)
all_layer_embeddings[5]
# layer 5 contextual embedding : size torch.Size([1, 10, 768])
# tensor([[[ 0.0057, -0.1022,  0.0163,  ..., -0.0675, -0.0360,  0.1078],
#          [-0.1096, -0.3344, -0.0593,  ...,  0.1625, -0.0432, -0.1646],
#          [ 0.3751, -0.3829,  0.0844,  ...,  0.1067, -0.0330,  0.3334],
#          ...,
```

## Authors

CamemBERT was trained and evaluated by Louis Martin\*, Benjamin Muller\*, Pedro Javier Ortiz Suárez\*, Yoann Dupont, Laurent Romary, Éric Villemonte de la Clergerie, Djamé Seddah and Benoît Sagot.

## Citation

If you use our work, please cite:

```bibtex
@inproceedings{martin2020camembert,
  title={CamemBERT: a Tasty French Language Model},
  author={Martin, Louis and Muller, Benjamin and Su{\'a}rez, Pedro Javier Ortiz and Dupont, Yoann and Romary, Laurent and de la Clergerie, {\'E}ric Villemonte and Seddah, Djam{\'e} and Sagot, Beno{\^\i}t},
  booktitle={Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics},
  year={2020}
}
```
46.855856
206
0.650067
eng_Latn
0.730826
0c3d1f7d08976e28e8230e8b179a189896017ad9
3,191
md
Markdown
memdocs/intune/configuration/wi-fi-settings-import-windows-8-1.md
eltociear/memdocs.ko-kr
2c0a72ceaf42a9c3a035d7348803d339abb3295d
[ "CC-BY-4.0", "MIT" ]
null
null
null
memdocs/intune/configuration/wi-fi-settings-import-windows-8-1.md
eltociear/memdocs.ko-kr
2c0a72ceaf42a9c3a035d7348803d339abb3295d
[ "CC-BY-4.0", "MIT" ]
null
null
null
memdocs/intune/configuration/wi-fi-settings-import-windows-8-1.md
eltociear/memdocs.ko-kr
2c0a72ceaf42a9c3a035d7348803d339abb3295d
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Import Wi-Fi settings for Windows devices in Microsoft Intune - Azure | Microsoft Docs
description: Use netsh wlan to export Wi-Fi settings from a Windows device to an XML file. Then, import the file into Intune to create a Wi-Fi profile for devices running Windows 8.1, Windows 10, and Windows Holographic for Business.
keywords: ''
author: MandiOhlinger
ms.author: mandia
manager: dougeby
ms.date: 12/18/2019
ms.topic: reference
ms.service: microsoft-intune
ms.subservice: configuration
ms.localizationpriority: medium
ms.technology: ''
ms.suite: ems
search.appverid: MET150
ms.custom: intune-azure
ms.collection: M365-identity-device-management
ms.openlocfilehash: b8cd8deb04dc939ed3dc742c9066c6dbfd4f51f3
ms.sourcegitcommit: 3d895be2844bda2177c2c85dc2f09612a1be5490
ms.translationtype: HT
ms.contentlocale: ko-KR
ms.lasthandoff: 03/13/2020
ms.locfileid: "79363813"
---
# <a name="import-wi-fi-settings-for-windows-devices-in-intune"></a>Import Wi-Fi settings for Windows devices in Intune

For devices running Windows, you can also import a Wi-Fi configuration profile that was previously exported to a file. **For Windows 10 and later devices, you can also [create a Wi-Fi profile](wi-fi-settings-windows.md) directly in Intune**.

Applies to:

- Windows 8.1 and later
- Windows 10 and later
- Windows 10 desktop or mobile
- Windows Holographic for Business

## <a name="export-wi-fi-settings-from-a-windows-device"></a>Export Wi-Fi settings from a Windows device

On Windows, use `netsh wlan` to export an existing Wi-Fi profile to an XML file that Intune can read. To use the profile, the key must be exported in plain text. On a Windows computer that already has the required WiFi profile installed, use the following steps:

1. Create a local folder for the exported Wi-Fi profiles, such as **c:\WiFi**.
2. Open a command prompt as an administrator.
3. Run the `netsh wlan show profiles` command. Note the name of the profile you want to export. In this example, the profile name is **WiFiName**.
4. Run the `netsh wlan export profile name="ProfileName" folder=c:\Wifi` command. This command creates a Wi-Fi profile file named **Wi-Fi-WiFiName.xml** in the destination folder.

## <a name="import-the-wi-fi-settings-into-intune"></a>Import the Wi-Fi settings into Intune

1. Sign in to the [Microsoft Endpoint Manager admin center](https://go.microsoft.com/fwlink/?linkid=2109431).
2. Select **Device configuration** > **Configuration profiles** > **Create profile**.
3. Enter the following settings:

   - **Name**: Enter a descriptive name for the profile. The name **must** be the same as the name property in the Wi-Fi profile xml; otherwise, it fails.
   - **Description**: Enter a description that gives an overview of the setting and any other important details.
   - **Platform**: Select **Windows 8.1 and later**.
   - **Profile type**: Select **Wi-Fi import**.

   > [!IMPORTANT]
   > - If you export a Wi-Fi profile that includes a pre-shared key, you **must** add `key=clear` to the command. For example, enter: `netsh wlan export profile name="ProfileName" key=clear folder=c:\Wifi`
   > - Using a pre-shared key with Windows 10 causes a remediation error to appear in Intune. When this happens, the Wi-Fi profile is assigned to the device correctly, and the profile works as expected.
   > - If you export a Wi-Fi profile that includes a pre-shared key, make sure the file is protected. The key is in plain text, so it is your responsibility to protect the key.

4. Enter the following settings:

   - **Connection name**: Enter the name of the Wi-Fi connection. This name is shown to users when they browse the available Wi-Fi networks.
   - **Profile XML**: Select the browse button, and choose the XML file that contains the Wi-Fi profile settings you want to import.
   - **File contents**: Shows the XML code for the configuration profile you selected.

5. Select **OK** to save your changes.
6. When you're done, select **OK** > **Create** to create the Intune profile. When finished, the profile is shown in the **Devices - Configuration profiles** list.

## <a name="next-steps"></a>Next steps

The profile is created, but it doesn't do anything yet. Next, [assign the profile](device-profile-assign.md) and [monitor its status](device-profile-monitor.md).

See the [Wi-Fi settings overview](wi-fi-settings-configure.md), including the other available platforms.
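Because the Intune profile name has to match the `name` property inside the exported XML exactly, it can help to read that value back before creating the profile. The following is a small, hedged Python sketch (not part of the official procedure); the file path is the example path used above, and the namespace is the standard WLAN profile schema:

```python
# Rough sketch: read the <name> element from an exported WLAN profile so you can
# confirm it matches the Intune profile name. Path and schema handling are
# assumptions based on the standard WLAN profile XML layout.
import xml.etree.ElementTree as ET

NS = {"wlan": "http://www.microsoft.com/networking/WLAN/profile/v1"}

def wifi_profile_name(path):
    root = ET.parse(path).getroot()          # <WLANProfile ...>
    return root.find("wlan:name", NS).text   # value the Intune profile name must match

print(wifi_profile_name(r"C:\WiFi\Wi-Fi-WiFiName.xml"))
```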
41.986842
186
0.708869
kor_Hang
1.000009
0c3d9121748b1def85aa095cd784fcb38586e584
2,362
md
Markdown
README.md
michael-joyner/cll2-gdx
7e382b85c9360a5b3e4fcd24466b19eba1ec1a66
[ "Apache-2.0" ]
1
2022-01-12T18:00:51.000Z
2022-01-12T18:00:51.000Z
README.md
michael-joyner/cll2-gdx
7e382b85c9360a5b3e4fcd24466b19eba1ec1a66
[ "Apache-2.0" ]
null
null
null
README.md
michael-joyner/cll2-gdx
7e382b85c9360a5b3e4fcd24466b19eba1ec1a66
[ "Apache-2.0" ]
null
null
null
# cll1-gdx

Cherokee Language Lessons 2 - Core Exercises (libGDX)

You can install the testing version for Android/Kindle Fire from here: (TBD)

You can install a recent desktop version for Microsoft/Apple/Linux from here: (TBD)

## About

This app is designed to help you acquire the Cherokee language through the use of pictures representing actions, states of being, and so forth. The app closely follows the material in the book "Cherokee Language Lessons 2". (Amazon link: TBD / Also available from LULU.)

- Author: Michael Joyner
- Font: FreeSerif [https://www.gnu.org/software/freefont/]
- Toolkit: libGDX
- Voice: Michael Joyner (Until someone better volunteers)
- Artwork: https://www.openclipart.org/
- Artwork: Michael Joyner

## How this app works

You will be challenged with a Cherokee word or phrase. Select the best matching picture. If you don't know the answer, do a best guess!

A training session lasts about 5 minutes. You must complete the session for your progress to be recorded!

Tap [?] to reveal the answer. Tap [AUDIO] to repeat the audio. Tap [QUIT] to pause or quit.

## Also published

### BOOKS - EITHER AUTHORED or TYPESET/EDITED

- The Tale of Peter Rabbit (Translator: Lawrence Panther)
- Na Usdi Agigage Jitaga Agisi - The Little Red Hen
- Na Anijoi Sigwa - The Three Pigs
- Na Anijoi Yona - The Three Bears
- Na Anijoi Wesa Anida ale Jitaga Usdi - The Three Kittens and Chicken Little
- The Three Kittens and Chicken Little / Cherokee-English Dictionary
- Cherokee Language Lessons 1 (http://amzn.to/2qkZ4Ze)
- ᎪᎸᏅᏱ ᏣᎳᎩ-ᏲᏁᎦ ᏗᏕᎶᏆᏍᏙᏗ / Raven Rock Cherokee-English Dictionary
- ᎹᎦᎵ ᎤᏤᎵ ᏣᎳᎩ ᎠᎪᎵᏰᏗ / Michael's Cherokee Reader
- Pilgrim's Progress Cherokee Only Junelodi Dunigisvsvi Jalagiha
- Genesis or the First Book of Moses - Dual Language - Cherokee / English
- Ije Kanohedv Dahlohisdv (Hardcover or Paperback) / Cherokee New Testament

### COMICS - EITHER AUTHORED or TYPESET/EDITED

- Michael's Cherokee Funny Papers
- Buster Bear - Rollo Raccoon and the Swim
- Buster Bear - Charles Chipmunk and the Photographer
- Juhli Wagani - Jalagi-Yonega - Digohweli Sagwu / Foxy Fagan - Cherokee-English (Translator: Lawrence Panther)
- Juhli Wagani - Jalagiha - Digohweli Sagwu / Foxy Fagan - Cherokee Only (Translator: Lawrence Panther)

### APPS

- Cherokee Language Animals
- Cherokee Language Syllabary
- Cherokee Bound Pronouns
42.178571
269
0.7663
eng_Latn
0.824853
0c3de93f96b574601a0dcd4581401aeac9691c27
588
md
Markdown
content/zh-cn/devops.md
WebShivam/glossary
dc453424eacf679ddf4507bdcf13101ef9000b61
[ "Apache-2.0" ]
1
2022-03-09T21:44:11.000Z
2022-03-09T21:44:11.000Z
content/zh-cn/devops.md
MrErlison/k8s-glossary
dc453424eacf679ddf4507bdcf13101ef9000b61
[ "Apache-2.0" ]
null
null
null
content/zh-cn/devops.md
MrErlison/k8s-glossary
dc453424eacf679ddf4507bdcf13101ef9000b61
[ "Apache-2.0" ]
null
null
null
---
title: DevOps
status: Completed
category: Concept
---

## What it is

DevOps is a methodology in which a team owns the entire process from application development to production operations, hence the name DevOps. It is more than implementing a set of technologies; it requires a thorough shift in culture and processes. DevOps calls for groups of engineers that work on small components (rather than an entire feature), which reduces handoffs, a common source of errors.

## Problem it addresses

Traditionally, in complex organizations with [tightly coupled](/tightly-coupled-architectures/) [monolithic applications](/zh-cn/monolithic-apps/), work was generally fragmented across multiple groups. This led to numerous handoffs and long lead times. Each time a component or update was ready, it was placed in a queue for the next team. Because individuals were only involved in a small part of the project, this approach led to a lack of ownership. Their goal was to pass the work on to the next group, not to deliver the right functionality to the customer, which is a clear misalignment of priorities. By the time the code finally made it into production, it had gone through so many developers and sat in so many queues that it was hard to trace the source of a problem if the code did not work. DevOps turns this approach on its head.

## How it helps

Having one team own the entire application lifecycle minimizes handoffs, reduces the risk of deploying to production, and improves code quality, because the team is also responsible for how the code performs in production. It also increases employee satisfaction thanks to greater autonomy and ownership.
29.4
224
0.82483
zho_Hans
0.634993
0c3e1e7aea874945833ba38b969a3a9db7b589f8
4,158
md
Markdown
docs/relational-databases/native-client-ole-db-tables-indexes/tables-and-indexes.md
AndersUP/sql-docs
3457f10fee7c7e8969300a549023a8f39dbe57b0
[ "CC-BY-4.0", "MIT" ]
4
2019-03-10T21:54:49.000Z
2022-03-09T09:08:21.000Z
docs/relational-databases/native-client-ole-db-tables-indexes/tables-and-indexes.md
AndersUP/sql-docs
3457f10fee7c7e8969300a549023a8f39dbe57b0
[ "CC-BY-4.0", "MIT" ]
10
2022-02-15T00:49:23.000Z
2022-02-23T22:10:33.000Z
docs/relational-databases/native-client-ole-db-tables-indexes/tables-and-indexes.md
AndersUP/sql-docs
3457f10fee7c7e8969300a549023a8f39dbe57b0
[ "CC-BY-4.0", "MIT" ]
8
2022-02-15T00:43:41.000Z
2022-02-23T20:06:52.000Z
--- description: "Tables and Indexes in SQL Server Native Client" title: Tables and indexes (Native Client OLE DB provider) ms.custom: "" ms.date: "03/14/2017" ms.prod: sql ms.prod_service: "database-engine, sql-database, sql-data-warehouse, pdw" ms.reviewer: "" ms.technology: native-client ms.topic: "reference" helpviewer_keywords: - "OLE DB, indexes" - "OLE DB, tables" - "ITableDefinition interface" - "tables [OLE DB]" - "IIndexDefinition interface" - "SQL Server Native Client OLE DB provider, tables" - "SQL Server Native Client OLE DB provider, indexes" - "indexes [OLE DB]" ms.assetid: 4217c6d8-8cd2-43dc-b36f-3cfd8a58fabc author: markingmyname ms.author: maghan monikerRange: ">=aps-pdw-2016||=azuresqldb-current||=azure-sqldw-latest||>=sql-server-2016||>=sql-server-linux-2017||=azuresqldb-mi-current" --- # Tables and Indexes in SQL Server Native Client [!INCLUDE[SQL Server Azure SQL Database Synapse Analytics PDW ](../../includes/applies-to-version/sql-asdb-asdbmi-asa-pdw.md)] The [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] Native Client OLE DB provider exposes the **IIndexDefinition** and **ITableDefinition** interfaces, allowing consumers to create, alter, and drop [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] tables and indexes. Valid table and index definitions depend on the version of [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)]. The ability to create or drop tables and indexes depends on the [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] access rights of the consumer-application user. Dropping a table can be further constrained by the presence of declarative referential integrity constraints or other factors. Most applications targeting [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] use SQL-DMO instead of these [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] Native Client OLE DB provider interfaces. SQL-DMO is a collection of OLE Automation objects that support all the administrative functions of [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)]. Applications targeting multiple OLE DB providers use these generic OLE DB interfaces that are supported by the various OLE DB providers. In the provider-specific property set DBPROPSET_SQLSERVERCOLUMN, [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] defines the following property. |Property ID|Description| |-----------------|-----------------| |SSPROP_COL_COLLATIONNAME|Type: VT_BSTR<br /><br /> R/W: Write<br /><br /> Default: Null<br /><br /> Description: This property is used only in **ITableDefinition**. 
## In This Section

- [Creating SQL Server Tables](../../relational-databases/native-client-ole-db-tables-indexes/creating-sql-server-tables.md)
- [Adding a Column to a SQL Server Table](../../relational-databases/native-client-ole-db-tables-indexes/adding-a-column-to-a-sql-server-table.md)
- [Removing a Column from a SQL Server Table](../../relational-databases/native-client-ole-db-tables-indexes/removing-a-column-from-a-sql-server-table.md)
- [Dropping a SQL Server Table](../../relational-databases/native-client-ole-db-tables-indexes/dropping-a-sql-server-table.md)
- [Creating SQL Server Indexes](../../relational-databases/native-client-ole-db-tables-indexes/creating-sql-server-indexes.md)
- [Dropping a SQL Server Index](../../relational-databases/native-client-ole-db-tables-indexes/dropping-a-sql-server-index.md)

## See Also

- [SQL Server Native Client &#40;OLE DB&#41;](../../relational-databases/native-client/ole-db/sql-server-native-client-ole-db.md)
- [DROP TABLE &#40;Transact-SQL&#41;](../../t-sql/statements/drop-table-transact-sql.md)
- [CREATE INDEX &#40;Transact-SQL&#41;](../../t-sql/statements/create-index-transact-sql.md)
- [DROP INDEX &#40;Transact-SQL&#41;](../../t-sql/statements/drop-index-transact-sql.md)
*Source: `README.md` from repository `odoralc/sonarqube-jenkins-vagrant` (MIT).*
# sonarqube-jenkins-vagrant

## Get Started

### Install vagrant-docker-compose

```
./setup
```

### Set up SonarQube with Jenkins using Vagrant

```
vagrant up
```

## How to get Jenkins' password

Log in to the box:

```
vagrant ssh
```

List the Docker containers:

```
sudo docker ps
```

Find the container ID (for example `7144f7a232cf`) of the Jenkins image and check its logs:

```
sudo docker logs 7144f7a232cf
```

The password is in the logs.
*Source: `docs/2014/relational-databases/native-client-odbc-table-valued-parameters/cross-version-compatibility.md` from repository `gmilani/sql-docs.pt-br` (CC-BY-4.0 / MIT), translated from Portuguese.*
---
title: Cross-Version Compatibility | Microsoft Docs
ms.custom: ''
ms.date: 03/06/2017
ms.prod: sql-server-2014
ms.reviewer: ''
ms.technology: native-client
ms.topic: reference
helpviewer_keywords:
  - table-valued parameters (ODBC), cross-version compatibility
ms.assetid: 5f14850b-b85c-41e2-8116-6f5b3f5e0856
author: MightyPen
ms.author: genemi
manager: craigg
ms.openlocfilehash: f23b0f43b32d737f03cb7c9b00368558e89e9288
ms.sourcegitcommit: 3026c22b7fba19059a769ea5f367c4f51efaf286
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 06/15/2019
ms.locfileid: "63199998"
---
# <a name="cross-version-compatibility"></a>Cross-Version Compatibility

Version conflicts can occur when client and server instances of [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] older than [!INCLUDE[ssKatmai](../../includes/sskatmai-md.md)] are expected to process table-valued parameters. In general, table-valued parameter functionality is available only to [!INCLUDE[ssKatmai](../../includes/sskatmai-md.md)] (or later) clients (using SQL Server Native Client 10.0 or later) that are connected to [!INCLUDE[ssKatmai](../../includes/sskatmai-md.md)] (or later) servers.

New columns in catalog-function result sets are present only when connected to a [!INCLUDE[ssKatmai](../../includes/sskatmai-md.md)] (or later) server.

If a client application compiled with an earlier version of [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] Native Client executes statements that expect table-valued parameters, the server detects this condition through a data conversion error, and ODBC returns SQLSTATE 07006 with the message "Restricted data type attribute violation".

If a client application compiled with [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] Native Client 10.0 or later attempts to use table-valued parameters when connected to a server instance earlier than [!INCLUDE[ssKatmai](../../includes/sskatmai-md.md)], [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] Native Client detects this, and calls to SQLBindCol, SQLBindParameter, SQLSetDescFields, and SQLSetDescRec fail with SQLSTATE 07006 and the message "Restricted data type attribute violation (the version of SQL Server for this connection does not support table-valued parameters)".

## <a name="see-also"></a>See Also

[Table-Valued Parameters &#40;ODBC&#41;](table-valued-parameters-odbc.md)
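To illustrate the client-side failure mode described above, here is a minimal, hedged C++ sketch that attempts to bind a table-valued parameter with the SQL Server Native Client ODBC driver and checks for SQLSTATE 07006. The statement handle is assumed to come from an already open connection; the table type name `dbo.MyTvpType` and the row count are purely illustrative, and the sketch follows the generally documented TVP binding pattern rather than any code from this article.

```cpp
// Sketch: detect a server that cannot accept table-valued parameters by
// attempting the TVP bind and inspecting the diagnostic record.
#include <windows.h>
#include <sql.h>
#include <sqlext.h>
#include <sqlncli.h>   // SQL_SS_TABLE (SQL Server Native Client SDK header)
#include <cstring>

bool BindTvpOrDetectOldServer(SQLHSTMT hstmt)
{
    SQLLEN rowsUsed = 0;   // number of TVP rows actually supplied at execute time

    SQLRETURN rc = SQLBindParameter(
        hstmt,
        1,                               // first parameter of the call
        SQL_PARAM_INPUT,
        SQL_C_DEFAULT,                   // C type must be SQL_C_DEFAULT for a TVP
        SQL_SS_TABLE,                    // SQL Server table-valued parameter type
        3,                               // maximum number of rows in the TVP
        0,
        (SQLPOINTER)L"dbo.MyTvpType",    // name of the table type on the server (illustrative)
        SQL_NTS,
        &rowsUsed);

    if (!SQL_SUCCEEDED(rc))
    {
        SQLWCHAR state[6] = {};
        SQLWCHAR msg[256] = {};
        SQLINTEGER nativeErr = 0;
        SQLSMALLINT msgLen = 0;

        // On Windows, SQLWCHAR and wchar_t are both 16-bit, so a raw compare works.
        if (SQLGetDiagRecW(SQL_HANDLE_STMT, hstmt, 1, state, &nativeErr,
                           msg, 256, &msgLen) == SQL_SUCCESS &&
            memcmp(state, L"07006", 5 * sizeof(SQLWCHAR)) == 0)
        {
            // Restricted data type attribute violation: the connected server
            // predates SQL Server 2008 and cannot accept table-valued parameters.
            return false;
        }
    }
    return SQL_SUCCEEDED(rc) != 0;
}
```

A caller could use the return value to fall back to a row-by-row insert path when connected to an older server.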
*Source: `README.md` from repository `bernatixer/ConnectedCars` (MIT).*
# ConnectedCars

A simple car-sharing application. Easy to use, made for sharing daily routes.

## How to deploy ConnectedCars

Download the files:

`git clone https://github.com/bernatixer/ConnectedCars.git`

Now, enter the directory:

`cd ConnectedCars`

Install the dependencies:

`npm install`

You also need to install [MongoDB](https://docs.mongodb.com/manual/installation/); once installed, start the database server.

Well done! Now start the application server by running:

```
npm start
```

## Licence

MIT
*Source: `content/posts/2021-08-26-ffb.md` from repository `JonathanBuchh/buchh.org` (MIT).*
---
title: Fantasy Football
date: 2021-08-26
description: A Beginner's Guide
url: ffb
---

I don't watch football, I don't follow football, and I don't understand America's infatuation with football. You might not think these characteristics describe someone who has played fantasy football successfully for the past several years, reaching two championship games last year, but it's true of me. A lack of passion for _American_ football has never stopped me from falling in love with _fantasy_ football. I love drafting players, managing my (albeit fake) teams, and pretending I played a part when one of my players has a big game.

This is intended to be a beginner's guide to fantasy football, something a new manager can read so they aren't embarrassed or taken advantage of by their coworkers, family, or friends when added to their first fantasy football league. This guide includes the basic principles that I have been using to successfully play fantasy football. I assume some general knowledge about American football and fantasy football, such as positions and fantasy football scoring, but nothing that the [fantasy football Wikipedia page](https://en.wikipedia.org/wiki/Fantasy_football_(gridiron)) can't provide. At a basic level, you assume the role of the General Manager of a National Football League (NFL) team, comprised of NFL players from different teams. You draft a team, pick starting lineups, and earn points based on each player's performance, competing against other teams in your league.

## Drafting

The draft is one of the most important parts of fantasy football, but that doesn't mean you need to spend hours researching or making a complex plan to have a good draft. When making picks, it is paramount to evaluate a player's value compared to other players you could draft. The easiest way to do this is to compare a given player's Average Draft Position (ADP) to the current pick. If a player has an ADP of 56, it means that, on average, they are selected with the 56th pick. If you have the 2nd pick, you probably shouldn't pick a player with an ADP of 56 without good reason, since more valuable players should be available, and the player with an ADP of 56 should still be available in later rounds. You should be able to see each player's ADP while drafting, but rankings can easily be found online. ADP is just a good rule of thumb to estimate a player's value. If you think it should be different than ADP, use your own judgment.

There are a few common mistakes that beginner fantasy football managers make. The first is picking a quarterback too early. If you pick for value using ADP, you won't run into this problem. Additionally, if your league has defenses and/or kickers, many new players try to go for the best kicker or best defense. These positions don't really make a difference in the outcome of fantasy games, so don't pick either of them outside of the last two rounds. These positions are easily replaceable and aren't very reliable.

Another common mistake made by new managers is picking too many players from their favorite team. While it's fine to have a few players from your favorite team, if you have too many, you won't have many players to put in your lineup during that team's bye week. Additionally, if that particular NFL team has a bad week, your fantasy team will suffer as well. For the same reason you diversify financial assets, you should diversify your players.

Watch mock drafts if you want to see how good drafters think about each pick. These can be found online.
## In-season

If you want to win, you need to manage your team. The number one reason that any fantasy football manager performs poorly is not paying attention to their team. "Managing your team" means monitoring updates on your players so that you know if they'll be playing this week. It's hard to win if you have inactive players in your starting lineup.

### Waiver wires

While it's possible to get a great player in the later rounds of the draft, more often than not, you'll be wanting to find replacements for these players. Luckily, there is something called the waiver wire that allows you to pick up players that no one else has on their team. Rules on how the waiver wire works vary from league to league, so ask your commissioner.

To pick up someone from the waiver wire, you need to drop someone on your team. You should do this if they sustain a season-ending injury like an ACL tear. You can also drop a player if they're severely underperforming their expectations and you can find someone better to add to your team. In both cases, check to see what fantasy experts say about the player. Sometimes players sustain a minor to mild injury that might have them out a few games, but are good enough to keep on your bench. Other times, players just have a bad game. Just because someone didn't score very many points in your last matchup doesn't mean they shouldn't be on your team. Make sure that fantasy experts agree that a player should be dropped before you do so yourself.

Many fantasy football experts post lists of good waiver wire pickups. You should read about these players before your waiver wire closes. Every year, there are a few elite players that show up out of nowhere and are available for free on the waiver wire. You don't want to miss out on the opportunity to get one of these players for free.

## Trades

Only trade when necessary. Estimate whether you will have a net gain in points if you make a trade. Some experienced managers like to prey on new managers and try to take advantage of them by sending them unbalanced trades. Some leagues have voting systems that prevent unfair trades from happening, but you need to be able to determine for yourself whether a trade is a good deal.

### Picking lineups

Believe it or not, there is some strategy in making the best lineup. If you are slated to win, put in players that are consistent. You don't need risky players that _might_ score you a lot of points. You need these types of players (often referred to as "boom or bust") when you are supposed to lose by a lot. You can find out if a player is "boom or bust" by researching them online.

## Appendix

You should now be prepared for a great fantasy season. If your season starts poorly, don't give up. Even the best fantasy football managers have bad years. Unexpected things happen. The season is long and a lot changes between the start and end of the season.
*Source: `content/post/Dabbling_with_react_@_React_India_2019.md` from repository `vipulgupta2048/ALiAS_Blog` (MIT).*
+++ title = "Dabbling with React @ React India 2019: A Tweet story" slug = "reactindia2019-vipulgupta2048" image = "https://user-images.githubusercontent.com/19930870/71670852-4dfd9f00-2d97-11ea-8731-12cbe620ca7b.png" date = "2019-10-07T10:10:00" description = "My experience of India's first React conf, React India 2019" disableComments = false author = "Vipul Gupta" categories = ["conferences"] tags = ['React', 'GraphQL', 'javascript', 'vipulgupta2048', 'community', 'FOSS', 'Redux', 'India', 'React-Native', 'conferences'] +++ Lush green trees, monsoon weather, soul cooling winds coming from the tumultuous waters of the beach where sounds of waves splashing over the dry sand can be heard by people walking by. Now, just imagine a tech conference on a piece of software that powers almost half of the web today at the place that I just described. How could you stop yourself? I couldn't. Here's a brief twitter story of my adventures at React India. React India was the first-ever international, community-led beach conference that provided a platform for developers to share, discuss their insights and experiences with React and React Native. The three-day conference was the first of its kind in India broken into workshops and conference days with community talks from 26th September - 28th September 2019 Before setting off, I planned to capture a picture of everyone I met at the conference & tweet about it, that way when the conference ends. I could write this blog much more easily. And now you know why exactly it's called a tweet story. ## The Journey The journey starts with me filling the diversity scholarship form for React India and in turn getting selected from a pool of hundreds. Happy to say, the other nine people who were selected were gems in their own fields. I had a marvelous time interacting with each and every one of them. I hopped on an early morning flight to Goa and was met with a brief patch of Goa rains. We boarded the airport shuttle arranged by React India, filled with fellow attendees. From there on, we started for the beautiful south side of Goa, along the narrow lanes of the countryside where we saw colorful houses, cottages behind tall coconut trees everywhere the eye could wander. <blockquote class="twitter-tweet"><p lang="en" dir="ltr">First up, catching up with <a href="https://twitter.com/bishnoi_pallavi?ref_src=twsrc%5Etfw">@bishnoi_pallavi</a> from <a href="https://twitter.com/CodingBlocksIn?ref_src=twsrc%5Etfw">@CodingBlocksIn</a> and <a href="https://twitter.com/manvisinghwal?ref_src=twsrc%5Etfw">@manvisinghwal</a> from <a href="https://twitter.com/Girlscript1?ref_src=twsrc%5Etfw">@Girlscript1</a> on our early morning flight to Goa. <br><br>Skies look clear, and time for takeoff. React India here we come 🐣🚨 <a href="https://t.co/Yt3ZvsGAGC">pic.twitter.com/Yt3ZvsGAGC</a></p>&mdash; Vipul Gupta 🐣 (@vipulgupta2048) <a href="https://twitter.com/vipulgupta2048/status/1177007858958753793?ref_src=twsrc%5Etfw">September 25, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script> ## Workshops Galore The first day of the React India was all about workshops and connecting with individuals. I met professionals from Accenture, Verizon, Paperphile and made new connections while touring the amazing property of Planet Hollywood. It was a beach conference, why not enjoy it? 
Many workshops were happening that day, most notably Siddharth's, "Building your first Design System: from scratch to production", and Vladimir Novik's workshop, "Practical ReasonML for React devs".

<blockquote class="twitter-tweet" data-conversation="none" data-lang="en" data-dnt="true"><p lang="en" dir="ltr"><a href="https://twitter.com/aravindballa?ref_src=twsrc%5Etfw">@aravindballa</a> joining us from Hyderabad at <a href="https://twitter.com/react_india?ref_src=twsrc%5Etfw">@react_india</a>, working at an awesome startup, <a href="https://twitter.com/paperpile?ref_src=twsrc%5Etfw">@paperpile</a> where they harness <a href="https://twitter.com/hashtag/React?src=hash&amp;ref_src=twsrc%5Etfw">#React</a> and <a href="https://twitter.com/hashtag/extjs?src=hash&amp;ref_src=twsrc%5Etfw">#extjs</a> to help researchers write better. Check them out! <a href="https://t.co/5kOEYHcVYC">pic.twitter.com/5kOEYHcVYC</a></p>&mdash; Vipul Gupta 🐣 (@vipulgupta2048) <a href="https://twitter.com/vipulgupta2048/status/1177098931433459713?ref_src=twsrc%5Etfw">September 26, 2019</a></blockquote>
<script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>

## Tell us more about the conference

The conference was extremely well organized. I have seen almost nothing like it: the entire main hall where the single-track conference took place was made out of a German hangar tent, completely waterproof, air-conditioned, and located out in the open. There were community talks, quizzes, games, company stalls, and even an after-party where [Ken Wheeler](https://twitter.com/ken_wheeler) performed remotely!

<blockquote class="twitter-tweet"><p lang="en" dir="ltr">Thinking of catching a break from all the amazing sessions at <a href="https://twitter.com/react_india?ref_src=twsrc%5Etfw">@react_india</a> to get some peace and quiet. <br><br>Well, just wander around. <a href="https://t.co/rxJMfwJVpY">pic.twitter.com/rxJMfwJVpY</a></p>&mdash; Vipul Gupta 🐣 (@vipulgupta2048) <a href="https://twitter.com/vipulgupta2048/status/1177120244940165120?ref_src=twsrc%5Etfw">September 26, 2019</a></blockquote>
<script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>

## Day 1 begins

The conference started out with the very talented and renowned [Sunil Pai](https://twitter.com/threepointone) giving the opening keynote. GraphQL was in the air, as a lot of talks focused on refactoring and DRY approaches for it, and even some great applications built with it, as Nader covered in his talk.

<blockquote class="twitter-tweet" data-conversation="none" data-dnt="true"><p lang="en" dir="ltr">Curious cases of <a href="https://twitter.com/GraphQL?ref_src=twsrc%5Etfw">@graphql</a> by <a href="https://twitter.com/dabit3?ref_src=twsrc%5Etfw">@dabit3</a> as he guides us through future ways of web and mobile application development with serverless, GraphQL transform Library and AWS.
<br><br>He works as a <a href="https://twitter.com/hashtag/DevRel?src=hash&amp;ref_src=twsrc%5Etfw">#DevRel</a> for <a href="https://twitter.com/awscloud?ref_src=twsrc%5Etfw">@awscloud</a> <a href="https://t.co/S1jSmzsoiD">pic.twitter.com/S1jSmzsoiD</a></p>&mdash; Vipul Gupta 🐣 (@vipulgupta2048) <a href="https://twitter.com/vipulgupta2048/status/1177453791840354304?ref_src=twsrc%5Etfw">September 27, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script> GraphQL is much more than efficiently pulling data, and I am glad Nader's talk helped make the case for it as he showcased the powers of AWS Amplify as well as some awesome problems he solved with GraphQL. Moving on, I took a break from GraphQL and attended talks by [Ives van Hoorne](https://twitter.com/compuives) about building [CodeSandbox](https://codesandbox.io), and React for teenagers by my roommate, Yash Gupta. Both very inspirational on how to build products for everyone and contributing back to the community. You all should really try CodeSandbox, it's like VsCode but online and more flexible. <blockquote class="twitter-tweet" data-conversation="none" data-dnt="true"><p lang="en" dir="ltr">First picture is is off <a href="https://twitter.com/yashguptaz?ref_src=twsrc%5Etfw">@yashguptaz</a> preparing for his session.<br><br>The other one, is the youngest speaker (Just 16) at <a href="https://twitter.com/react_india?ref_src=twsrc%5Etfw">@react_india</a> giving his session to help teenagers join the <a href="https://twitter.com/reactjs?ref_src=twsrc%5Etfw">@reactjs</a> community. <a href="https://t.co/S0Ownxdipi">pic.twitter.com/S0Ownxdipi</a></p>&mdash; Vipul Gupta 🐣 (@vipulgupta2048) <a href="https://twitter.com/vipulgupta2048/status/1177521013799579648?ref_src=twsrc%5Etfw">September 27, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script> ## Community is where the heart is Open-Source was a general theme behind the conference, where people talked from how we can build more inclusive communities to releasing new frameworks for blogging platforms to conflict resolution to this amazing quote by Carolyn in her talk about "Intuitive tooling". > "We mistake familiarity for simplicity" - Carolyn Stransky She argues on the fact that we as developers, once fluent with a framework or a technology being to introducing it as "simple". While its simple for you now, we must not forget it might not be that straightforward for a newcomer due to the steep learning curve or improper documentation. I loved [Jason Lengstorf](https://twitter.com/jlengstorf)'s talk as he asks the question **Is open-source really open?**. He highlights several issues he observed and resolved at the time of his working at Gatsby. Simple principles that communities and corporations can follow to eliminate those problems and be supportive of their contributors. <blockquote class="twitter-tweet"><p lang="en" dir="ltr"><a href="https://twitter.com/hashtag/5sec?src=hash&amp;ref_src=twsrc%5Etfw">#5sec</a> summary<br>How make <a href="https://twitter.com/hashtag/OpenSource?src=hash&amp;ref_src=twsrc%5Etfw">#OpenSource</a> more open? 
Not just by saying, but taking action.<br><br>- Invest in onboarding for new contributors.<br>- Fix your representation, by putting money where your mouth is.<br>- Show gratitude in meaningful ways<br>- Build trust &amp; safety <a href="https://t.co/RFtFjFfOqr">pic.twitter.com/RFtFjFfOqr</a></p>&mdash; Vipul Gupta 🐣 (@vipulgupta2048) <a href="https://twitter.com/vipulgupta2048/status/1177812220186521600?ref_src=twsrc%5Etfw">September 28, 2019</a></blockquote>
<script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>

I had the most insightful discussions with folks from the community, including this one talk from [Tanvi Bhakta](https://twitter.com/tanvibhakta_) on building a healthier, more inclusive workplace culture. She is also working on an awesome project that you all should check out and support: it's called [Desi Deck of Dames](https://desideckofdames.com/) and it's awesome!

<blockquote class="twitter-tweet" data-conversation="none" data-dnt="true"><p lang="en" dir="ltr">Finally met the amazing <a href="https://twitter.com/tanvibhakta_?ref_src=twsrc%5Etfw">@tanvibhakta_</a> from <a href="https://twitter.com/obvious_in?ref_src=twsrc%5Etfw">@obvious_in</a>, speaking about how to build communities for everyone and implement simple steps to improve workplace health at your company. <a href="https://t.co/PGax5YZfNZ">pic.twitter.com/PGax5YZfNZ</a></p>&mdash; Vipul Gupta 🐣 (@vipulgupta2048) <a href="https://twitter.com/vipulgupta2048/status/1177494903758049280?ref_src=twsrc%5Etfw">September 27, 2019</a></blockquote>
<script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>

Making connections is why we attend conferences, and I finally met these two in the halls. One is the organiser of Hackerspace Mumbai, an incredible community working at the forefront of tech, and the other is an extremely talented, newly appointed DevRel for Dgraph Labs!

<blockquote class="twitter-tweet" data-conversation="none" data-dnt="true"><p lang="en" dir="ltr">Forging new <a href="https://twitter.com/hashtag/connections?src=hash&amp;ref_src=twsrc%5Etfw">#connections</a> with the community at <a href="https://twitter.com/react_india?ref_src=twsrc%5Etfw">@react_india</a>, with the sustainable <a href="https://twitter.com/hackmum?ref_src=twsrc%5Etfw">@hackmum</a> ran by <a href="https://twitter.com/TalkOrTweets?ref_src=twsrc%5Etfw">@TalkOrTweets</a> and <a href="https://twitter.com/hackintoshrao?ref_src=twsrc%5Etfw">@hackintoshrao</a> who is <a href="https://twitter.com/dgraphlabs?ref_src=twsrc%5Etfw">@dgraphlabs</a>&#39;s newest <a href="https://twitter.com/hashtag/DevRel?src=hash&amp;ref_src=twsrc%5Etfw">#DevRel</a><br><br>Great meeting up with both of you. <a href="https://t.co/Om3eBxtYxk">https://t.co/Om3eBxtYxk</a> <a href="https://t.co/NmnqnS6UTR">pic.twitter.com/NmnqnS6UTR</a></p>&mdash; Vipul Gupta 🐣 (@vipulgupta2048) <a href="https://twitter.com/vipulgupta2048/status/1177663175174410241?ref_src=twsrc%5Etfw">September 27, 2019</a></blockquote>
<script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
*Source: `accumulo1.8/README.md` from repository `jingjunLi/YCSB` (Apache-2.0).*
<!-- Copyright (c) 2015 YCSB contributors. All rights reserved. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. See accompanying LICENSE file. --> ## Quick Start This section describes how to run YCSB on [Accumulo](https://accumulo.apache.org/). ### 1. Start Accumulo See the [Accumulo Documentation](https://accumulo.apache.org/1.8/accumulo_user_manual.html#_installation) for details on installing and running Accumulo. Before running the YCSB test you must create the Accumulo table. Again see the [Accumulo Documentation](https://accumulo.apache.org/1.8/accumulo_user_manual.html#_basic_administration) for details. The default table name is `ycsb`. ### 2. Set Up YCSB Git clone YCSB and compile: git clone http://github.com/brianfrankcooper/YCSB.git cd YCSB mvn -pl com.yahoo.ycsb:aerospike-binding -am clean package ### 3. Create the Accumulo table By default, YCSB uses a table with the name "usertable". Users must create this table before loading data into Accumulo. For maximum Accumulo performance, the Accumulo table must be pre-split. A simple Ruby script, based on the HBase README, can generate adequate split-point. 10's of Tablets per TabletServer is a good starting point. Unless otherwise specified, the following commands should run on any version of Accumulo. $ echo 'num_splits = 20; puts (1..num_splits).map {|i| "user#{1000+i*(9999-1000)/num_splits}"}' | ruby > /tmp/splits.txt $ accumulo shell -u <user> -p <password> -e "createtable usertable" $ accumulo shell -u <user> -p <password> -e "addsplits -t usertable -sf /tmp/splits.txt" $ accumulo shell -u <user> -p <password> -e "config -t usertable -s table.cache.block.enable=true" Additionally, there are some other configuration properties which can increase performance. These can be set on the Accumulo table via the shell after it is created. Setting the table durability to `flush` relaxes the constraints on data durability during hard power-outages (avoids calls to fsync). Accumulo defaults table compression to `gzip` which is not particularly fast; `snappy` is a faster and similarly-efficient option. The mutation queue property controls how many writes that Accumulo will buffer in memory before performing a flush; this property should be set relative to the amount of JVM heap the TabletServers are given. Please note that the `table.durability` and `tserver.total.mutation.queue.max` properties only exists for >=Accumulo-1.7. There are no concise replacements for these properties in earlier versions. accumulo> config -s table.durability=flush accumulo> config -s tserver.total.mutation.queue.max=256M accumulo> config -t usertable -s table.file.compress.type=snappy On repeated data loads, the following commands may be helpful to re-set the state of the table quickly. accumulo> createtable tmp --copy-splits usertable --copy-config usertable accumulo> deletetable --force usertable accumulo> renametable tmp usertable accumulo> compact --wait -t accumulo.metadata ### 4. 
Load the data:

    ./bin/ycsb load accumulo1.8 -s -P workloads/workloada \
        -p accumulo.zooKeepers=localhost \
        -p accumulo.columnFamily=ycsb \
        -p accumulo.instanceName=ycsb \
        -p accumulo.username=user \
        -p accumulo.password=supersecret \
        > outputLoad.txt

Run the workload test:

    ./bin/ycsb run accumulo1.8 -s -P workloads/workloada \
        -p accumulo.zooKeepers=localhost \
        -p accumulo.columnFamily=ycsb \
        -p accumulo.instanceName=ycsb \
        -p accumulo.username=user \
        -p accumulo.password=supersecret \
        > outputRun.txt

## Accumulo Configuration Parameters

- `accumulo.zooKeepers`
  - The Accumulo cluster's [ZooKeeper servers](https://accumulo.apache.org/1.8/accumulo_user_manual.html#_connecting).
  - Should contain a comma-separated list of hostname or hostname:port values.
  - No default value.
- `accumulo.columnFamily`
  - The name of the column family to use to store the data within the table.
  - No default value.
- `accumulo.instanceName`
  - Name of the Accumulo [instance](https://accumulo.apache.org/1.8/accumulo_user_manual.html#_connecting).
  - No default value.
- `accumulo.username`
  - The username to use when connecting to Accumulo.
  - No default value.
- `accumulo.password`
  - The password for the user connecting to Accumulo.
  - No default value.