Dataset schema (one column per field, in row order; ⌀ marks nullable columns):

- hexsha: string, length 40
- size: int64, 5 to 1.04M
- ext: string, 6 classes
- lang: string, 1 class
- max_stars_repo_path: string, length 3 to 344
- max_stars_repo_name: string, length 5 to 125
- max_stars_repo_head_hexsha: string, length 40 to 78
- max_stars_repo_licenses: sequence, length 1 to 11
- max_stars_count: int64, 1 to 368k ⌀
- max_stars_repo_stars_event_min_datetime: string, length 24 ⌀
- max_stars_repo_stars_event_max_datetime: string, length 24 ⌀
- max_issues_repo_path: string, length 3 to 344
- max_issues_repo_name: string, length 5 to 125
- max_issues_repo_head_hexsha: string, length 40 to 78
- max_issues_repo_licenses: sequence, length 1 to 11
- max_issues_count: int64, 1 to 116k ⌀
- max_issues_repo_issues_event_min_datetime: string, length 24 ⌀
- max_issues_repo_issues_event_max_datetime: string, length 24 ⌀
- max_forks_repo_path: string, length 3 to 344
- max_forks_repo_name: string, length 5 to 125
- max_forks_repo_head_hexsha: string, length 40 to 78
- max_forks_repo_licenses: sequence, length 1 to 11
- max_forks_count: int64, 1 to 105k ⌀
- max_forks_repo_forks_event_min_datetime: string, length 24 ⌀
- max_forks_repo_forks_event_max_datetime: string, length 24 ⌀
- content: string, length 5 to 1.04M
- avg_line_length: float64, 1.14 to 851k
- max_line_length: int64, 1 to 1.03M
- alphanum_fraction: float64, 0 to 1
- lid: string, 191 classes
- lid_prob: float64, 0.01 to 1

The rows below are pipe-separated values in the same column order.
ff0720ffcd22658773437e6b2369306f40956846 | 4,946 | md | Markdown | docs/test/load-test/get-started-jmeter-test.md | erjaintarun/azure-devops-docs | 3eb4b71a41caf575b60de0681f08b1bc271aecbf | ["CC-BY-4.0", "MIT"] | 1 | 2022-01-31T21:04:59.000Z | 2022-01-31T21:04:59.000Z | docs/test/load-test/get-started-jmeter-test.md | erjaintarun/azure-devops-docs | 3eb4b71a41caf575b60de0681f08b1bc271aecbf | ["CC-BY-4.0", "MIT"] | null | null | null | docs/test/load-test/get-started-jmeter-test.md | erjaintarun/azure-devops-docs | 3eb4b71a41caf575b60de0681f08b1bc271aecbf | ["CC-BY-4.0", "MIT"] | 1 | 2022-02-10T14:48:46.000Z | 2022-02-10T14:48:46.000Z |
---
title: Run Apache JMeter load tests
description: Using JMeter to performance test your application in the cloud using the features of Azure DevOps and TFS
ms.assetid: 3B2A725F-4E7B-4652-BFD1-FC7C9A248B7B
ms.technology: devops-test
ms.topic: conceptual
ms.author: sdanie
author: steved0x
ms.date: 12/07/2018
monikerRange: '> tfs-2018'
---
# Run Apache JMeter load tests with Azure DevOps
[!INCLUDE [version-header-devops-services](../includes/version-header-devops-services.md)]
[!INCLUDE [loadtest-deprecated-include](../includes/loadtest-deprecated-include.md)]
Before you start your load testing:
* [Create your Azure DevOps subscription](https://visualstudio.microsoft.com/products/visual-studio-team-services-vs),
if you don't have one already.
**To run a JMeter load test:**
1. Sign in to Azure DevOps.
2. Go to the **Load Test** section of [!INCLUDE [test-hub-include-adsonly](../includes/test-hub-include-adsonly.md)], open the **+ New**
menu and choose **Apache JMeter test**.

3. Enter your load test parameters. To run your test close to where your users are located,
select a nearby location for your load test. Then start your test when you're ready.

>For information about the scripts and supporting files used for JMeter
web tests, see [Build a Web Test Plan](https://jmeter.apache.org/usermanual/build-web-test-plan.html)
on the Apache JMeter website.
4. As the test runs, you see live information about the progress
of the test. You can stop the test by using the **Abort** link on the
toolbar.

5. When your test is done, look at the results to see how
well your app performed. For example, you can see an overview
of your app's performance in the **Summary** page.
This page shows all of the main metrics such as average response
time, user load, requests per second, failed requests, any errors
that might have occurred, and test usage.

The lower section of the **Summary** page shows the settings used
for the test, and details of the five slowest requests during the test.
If there are any transaction tests, the page will also show the five slowest of these.
Use the 
icon above a column to sort the list based on the contents of that column.
6. Open the **Charts** page to see a graphical representation of
the test results over time. The charts show the average
performance, throughput, errors, and the results of each test
request. Hover your mouse pointer over a chart to
see more details.

7. Open the **Diagnostics** page to see detailed information such as a list
of errors and status messages.

You can also use the 
icon in the **Errors** section of the **Summary** page to go directly to the
**Diagnostics** page.

8. Open the **Logs** page to see a list of test runs. Choose the link in
the **Attachment** column to download the detailed log as a text file.

9. If you have a favorite listener that you use to analyze results in
the JMeter IDE, download the test results in .CSV format and the logs
as a zip file from the **Download Results** link.

10. To run the same test again, choose **Rerun**.

11. Now see how you can [view and compare your load test runs](performance-reports.md).
## See also
* [FAQs for load testing](reference-qa.md#jmeter-tests)
* [Load test with Visual Studio](getting-started-with-performance-testing.md)
* [Load test with Azure DevOps](get-started-simple-cloud-load-test.md)
* [Tutorial: Run load tests before release](run-performance-tests-app-before-release.md)
* [Analyze load test results using the Load Test Analyzer](/visualstudio/test/analyze-load-test-results-using-the-load-test-analyzer)
[!INCLUDE [help-and-support-footer](../includes/help-and-support-footer.md)]
| 45.796296 | 136 | 0.754145 | eng_Latn | 0.962957 |
ff073b705c5599e98194c3c51c97d4b9e5dea9ac | 5,643 | md | Markdown | README.md | ashutoshvarma/tslogs | 5b2b849191952601788784ed06667e0f6219c48f | ["MIT"] | null | null | null | README.md | ashutoshvarma/tslogs | 5b2b849191952601788784ed06667e0f6219c48f | ["MIT"] | 1 | 2022-01-27T15:39:05.000Z | 2022-01-27T15:39:05.000Z | README.md | ashutoshvarma/tslogs | 5b2b849191952601788784ed06667e0f6219c48f | ["MIT"] | null | null | null |
# `tslogs` [](https://github.com/ashutoshvarma/tslogs/stargazers)
*A Python parser and visualizer for ThrottleStop logs.*

[](https://choosealicense.com/licenses/mit/)
[](https://github.com/ashutoshvarma/tslogs/)
[](https://codecov.io/gh/ashutoshvarma/tslogs/)
[](https://pypi.python.org/pypi/tslogs)
[](https://pypi.org/project/tslogs/#files)
[](https://pypi.org/project/pronto/#files)
[](https://github.com/ashutoshvarma/tslogs/issues)
[](https://pepy.tech/project/tslogs)
## 🚩 Table of Contents
- [Overview](#%EF%B8%8F-overview)
- [What is ThrottleStop ?](#-what-is-throttlestop-)
- [Why would you like to parse ThrottleStop Logs ?](#why-would-you-like-to-parse-throttlestop-logs-)
- [Enable Logging in ThrottleStop](#%EF%B8%8F-enable-logging-in-throttlestop)
- [Installing](#-installing)
- [Usage](#-usage)
- [License](#-license)
## 🗺️ Overview
tslogs is a Python library to parse, browse, export and visualize
[ThrottleStop](https://www.techpowerup.com/download/techpowerup-throttlestop/) log files.
#### 📖 What is ThrottleStop ?
> [ThrottleStop](https://www.techpowerup.com/download/techpowerup-throttlestop/)
is a small application designed to monitor for
and correct the three main types of CPU throttling that are
being used on many laptop computers.
Official Thread - [here](http://forum.notebookreview.com/threads/the-throttlestop-guide.531329/)
Comprehensive Guide - [here](https://www.ultrabookreview.com/31385-the-throttlestop-guide/)
#### Why would you like to parse ThrottleStop Logs ?
TODO
#### 🏳️ Enable Logging in ThrottleStop
<table>
<tr>
<td>Select the `Log File` checkbox</td>
<td>Click on `Options` button and Select the Folder where you want logs to be saved.</td>
</tr>
<tr>
<td valign="top">
<img src="https://github.com/ashutoshvarma/tslogs/blob/main/docs/_static/throttlestop-log-enable.jpg?raw=true" width="500" height="400">
</td>
<td valign="top">
<img src="https://github.com/ashutoshvarma/tslogs/blob/main/docs/_static/throttlestop-log-folder.jpg?raw=true" width="500" height="400">
</td>
</tr>
</table>
## 🔧 Installing
Installing with `pip` is the easiest:
```console
# pip install tslogs # if you have the admin rights
$ pip install tslogs --user # install it in a user-site directory
```
Finally, a development version can be installed from GitHub
using `setuptools` & `pip`:
```console
$ git clone https://github.com/ashutoshvarma/tslogs
$ cd tslogs
# pip install .
```
## 💡 Usage
### 1. `tslogs` - CLI tool
```console
$ tslogs --help
usage: tslogs [-h] [--json | --plot ] [--dates START END] [--interval INTERVAL] [--smooth SMOOTH] [--output FILE] [--indent VALUE] [--quiet]
[--version]
paths paths
positional arguments:
paths One or more paths to log dir or log files.
optional arguments:
-h, --help show this help message and exit
--json, -j dump all parsed log data.
--plot, -p Plot given the logs attributes (default: None). Allowed values are {multi, c0, clock_mod, chip_mod, battery_mw,
cpu_temp, gpu_mhz, gpu_temp, vid, power}
--quiet, -q Run in silent mode
--version, -v show program's version number and exit
Filter:
--dates, -d START END
Datetime range to filter (in ISO format, yyyy-mm-dd HH:MM:SS)
Plot Options:
--interval, -I INTERVAL
Plot data frequency in seconds (default: 60)
--smooth, -S SMOOTH Span interval for smoothing the graph, if data frequency is very high using increasing this with 'interval' can
yield smooth graph (default: 2)
Output:
--output, -o FILE Output file path, default is '-' (stdout)
--indent VALUE indent value for json output, default is 4
```
- #### a.) Print the summary

- #### b.) Plot Graphs
This will plot `cpu_temp`, `multi` (clock speed in GHz),
`power` (in W) and `c0` from logs between time 16:00 to 16:15 in `2020-07-28.txt`
```console
tslogs .\tests\logs\2020-07-28.txt -p cpu_temp multi power --smooth 2 --interval 1 -d "2020-07-28 16:00:00" "2020-07-28 16:15:00"
```

### 2. `tslogs` - Python Module
TODO
See `parse_logs()` and `LogLine` in `parse.py`. For more references
see the CLI implementation in `cli.py`
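Until this section is filled in, here is a minimal sketch of what module usage might look like. The exact signature of `parse_logs()` and the attribute names on `LogLine` are assumptions inferred from the CLI options above, so check `parse.py` for the real API before relying on this:

```python
from pathlib import Path

from tslogs import parse_logs  # parse_logs() lives in tslogs/parse.py

# Parse one or more ThrottleStop log files into LogLine objects
lines = parse_logs([Path("tests/logs/2020-07-28.txt")])

# Each LogLine carries the logged attributes (names assumed from the CLI docs)
for line in lines:
    print(line.time, line.cpu_temp, line.multi, line.power)
```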
## 📜 License
This library is provided under the open-source
[MIT license](https://choosealicense.com/licenses/mit/).
| 42.75 | 235 | 0.694134 | eng_Latn | 0.381864 |
ff0754ebf05de9657bf405da6ea32eba13edad64 | 9,884 | md | Markdown | articles/application-gateway/tutorial-autoscale-ps.md | changeworld/azure-docs.tr-tr | a6c8b9b00fe259a254abfb8f11ade124cd233fcb | ["CC-BY-4.0", "MIT"] | null | null | null | articles/application-gateway/tutorial-autoscale-ps.md | changeworld/azure-docs.tr-tr | a6c8b9b00fe259a254abfb8f11ade124cd233fcb | ["CC-BY-4.0", "MIT"] | null | null | null | articles/application-gateway/tutorial-autoscale-ps.md | changeworld/azure-docs.tr-tr | a6c8b9b00fe259a254abfb8f11ade124cd233fcb | ["CC-BY-4.0", "MIT"] | null | null | null |
---
title: 'Tutorial: Improve web application access - Azure Application Gateway'
description: In this tutorial, learn how to create an autoscaling, zone-redundant application gateway with a reserved IP address by using Azure PowerShell.
services: application-gateway
author: vhorne
ms.service: application-gateway
ms.topic: tutorial
ms.date: 11/13/2019
ms.author: victorh
ms.custom: mvc
ms.openlocfilehash: e07fc34c7177e3a1dace34ab298b64dc3aa6a06a
ms.sourcegitcommit: 0947111b263015136bca0e6ec5a8c570b3f700ff
ms.translationtype: MT
ms.contentlocale: tr-TR
ms.lasthandoff: 03/24/2020
ms.locfileid: "74011358"
---
# <a name="tutorial-create-an-application-gateway-that-improves-web-application-access"></a>Tutorial: Create an application gateway that improves web application access
If you're an IT administrator concerned with improving web application access, you can optimize your application gateway to scale according to customer demand and to span multiple availability zones. This tutorial helps you configure the Azure Application Gateway features that make that possible: autoscaling, zone redundancy, and reserved VIPs (static IP). You'll use Azure PowerShell cmdlets and the Azure Resource Manager deployment model to solve the problem.
In this tutorial, you learn how to:
> [!div class="checklist"]
> * Create a self-signed certificate
> * Create an autoscaling virtual network
> * Create a reserved public IP
> * Set up your application gateway infrastructure
> * Specify autoscaling
> * Create the application gateway
> * Test the application gateway
If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F) before you begin.
## <a name="prerequisites"></a>Prerequisites
[!INCLUDE [updated-for-az](../../includes/updated-for-az.md)]
This tutorial requires that you run Azure PowerShell locally. You must have Azure PowerShell module version 1.0.0 or later installed. To find the version, run `Get-Module -ListAvailable Az`. If you need to upgrade, see [Install the Azure PowerShell module](https://docs.microsoft.com/powershell/azure/install-az-ps). After you verify the PowerShell version, run `Connect-AzAccount` to create a connection with Azure.
## <a name="sign-in-to-azure"></a>Sign in to Azure
```azurepowershell
Connect-AzAccount
Select-AzSubscription -Subscription "<sub name>"
```
## <a name="create-a-resource-group"></a>Create a resource group
Create a resource group in one of the available locations.
```azurepowershell
$location = "East US 2"
$rg = "AppGW-rg"
#Create a new Resource Group
New-AzResourceGroup -Name $rg -Location $location
```
## <a name="create-a-self-signed-certificate"></a>Create a self-signed certificate
For production use, you should import a valid certificate signed by a trusted provider. For this tutorial, you create a self-signed certificate by using the [New-SelfSignedCertificate](https://docs.microsoft.com/powershell/module/pkiclient/new-selfsignedcertificate) command. You can use the [Export-PfxCertificate](https://docs.microsoft.com/powershell/module/pkiclient/export-pfxcertificate) command with the returned thumbprint to export a pfx file from the certificate.
```powershell
New-SelfSignedCertificate `
-certstorelocation cert:\localmachine\my `
-dnsname www.contoso.com
```
You should see something like this result:
```
PSParentPath: Microsoft.PowerShell.Security\Certificate::LocalMachine\my
Thumbprint Subject
---------- -------
E1E81C23B3AD33F9B4D1717B20AB65DBB91AC630 CN=www.contoso.com
```
Use the thumbprint to create the pfx file:
```powershell
$pwd = ConvertTo-SecureString -String "Azure123456!" -Force -AsPlainText
Export-PfxCertificate `
-cert cert:\localMachine\my\E1E81C23B3AD33F9B4D1717B20AB65DBB91AC630 `
-FilePath c:\appgwcert.pfx `
-Password $pwd
```
## <a name="create-a-virtual-network"></a>Create a virtual network
Create a virtual network with a dedicated subnet for the autoscaling application gateway. Currently, only one autoscaling application gateway can be deployed in each dedicated subnet.
```azurepowershell
#Create VNet with two subnets
$sub1 = New-AzVirtualNetworkSubnetConfig -Name "AppGwSubnet" -AddressPrefix "10.0.0.0/24"
$sub2 = New-AzVirtualNetworkSubnetConfig -Name "BackendSubnet" -AddressPrefix "10.0.1.0/24"
$vnet = New-AzvirtualNetwork -Name "AutoscaleVNet" -ResourceGroupName $rg `
-Location $location -AddressPrefix "10.0.0.0/16" -Subnet $sub1, $sub2
```
## <a name="create-a-reserved-public-ip"></a>Create a reserved public IP
Specify the allocation method of PublicIPAddress as **Static**. An autoscaling application gateway VIP can only be static. Dynamic IPs aren't supported. Only the standard PublicIpAddress SKU is supported.
```azurepowershell
#Create static public IP
$pip = New-AzPublicIpAddress -ResourceGroupName $rg -name "AppGwVIP" `
-location $location -AllocationMethod Static -Sku Standard
```
## <a name="retrieve-details"></a>Retrieve details
Retrieve the details of the resource group, subnet, and IP in a local object to create the IP configuration details for the application gateway.
```azurepowershell
$resourceGroup = Get-AzResourceGroup -Name $rg
$publicip = Get-AzPublicIpAddress -ResourceGroupName $rg -name "AppGwVIP"
$vnet = Get-AzvirtualNetwork -Name "AutoscaleVNet" -ResourceGroupName $rg
$gwSubnet = Get-AzVirtualNetworkSubnetConfig -Name "AppGwSubnet" -VirtualNetwork $vnet
```
## <a name="configure-the-infrastructure"></a>Configure the infrastructure
Configure the IP config, front-end IP config, back-end pool, HTTP settings, certificate, port, listener, and rule in an identical format to the existing Standard application gateway. The new SKU follows the same object model as the Standard SKU.
```azurepowershell
$ipconfig = New-AzApplicationGatewayIPConfiguration -Name "IPConfig" -Subnet $gwSubnet
$fip = New-AzApplicationGatewayFrontendIPConfig -Name "FrontendIPCOnfig" -PublicIPAddress $publicip
$pool = New-AzApplicationGatewayBackendAddressPool -Name "Pool1" `
-BackendIPAddresses testbackend1.westus.cloudapp.azure.com, testbackend2.westus.cloudapp.azure.com
$fp01 = New-AzApplicationGatewayFrontendPort -Name "SSLPort" -Port 443
$fp02 = New-AzApplicationGatewayFrontendPort -Name "HTTPPort" -Port 80
$securepfxpwd = ConvertTo-SecureString -String "Azure123456!" -AsPlainText -Force
$sslCert01 = New-AzApplicationGatewaySslCertificate -Name "SSLCert" -Password $securepfxpwd `
-CertificateFile "c:\appgwcert.pfx"
$listener01 = New-AzApplicationGatewayHttpListener -Name "SSLListener" `
-Protocol Https -FrontendIPConfiguration $fip -FrontendPort $fp01 -SslCertificate $sslCert01
$listener02 = New-AzApplicationGatewayHttpListener -Name "HTTPListener" `
-Protocol Http -FrontendIPConfiguration $fip -FrontendPort $fp02
$setting = New-AzApplicationGatewayBackendHttpSettings -Name "BackendHttpSetting1" `
-Port 80 -Protocol Http -CookieBasedAffinity Disabled
$rule01 = New-AzApplicationGatewayRequestRoutingRule -Name "Rule1" -RuleType basic `
-BackendHttpSettings $setting -HttpListener $listener01 -BackendAddressPool $pool
$rule02 = New-AzApplicationGatewayRequestRoutingRule -Name "Rule2" -RuleType basic `
-BackendHttpSettings $setting -HttpListener $listener02 -BackendAddressPool $pool
```
## <a name="specify-autoscale"></a>Specify autoscale
Now you can specify the autoscale configuration for the application gateway. Two autoscale configuration types are supported:
* **Fixed capacity mode**. In this mode, the application gateway doesn't autoscale and operates at a fixed Scale Unit capacity.
```azurepowershell
$sku = New-AzApplicationGatewaySku -Name Standard_v2 -Tier Standard_v2 -Capacity 2
```
* **Autoscaling mode**. In this mode, the application gateway autoscales based on the application's traffic pattern.
```azurepowershell
$autoscaleConfig = New-AzApplicationGatewayAutoscaleConfiguration -MinCapacity 2
$sku = New-AzApplicationGatewaySku -Name Standard_v2 -Tier Standard_v2
```
## <a name="create-the-application-gateway"></a>Create the application gateway
Create the application gateway, and include the redundancy zones and the autoscale configuration.
```azurepowershell
$appgw = New-AzApplicationGateway -Name "AutoscalingAppGw" -Zone 1,2,3 `
-ResourceGroupName $rg -Location $location -BackendAddressPools $pool `
-BackendHttpSettingsCollection $setting -GatewayIpConfigurations $ipconfig `
-FrontendIpConfigurations $fip -FrontendPorts $fp01, $fp02 `
-HttpListeners $listener01, $listener02 -RequestRoutingRules $rule01, $rule02 `
-Sku $sku -sslCertificates $sslCert01 -AutoscaleConfiguration $autoscaleConfig
```
## <a name="test-the-application-gateway"></a>Test the application gateway
Use Get-AzPublicIPAddress to get the public IP address of the application gateway. Copy the public IP address or DNS name, and paste it into the address bar of your browser.
`Get-AzPublicIPAddress -ResourceGroupName $rg -Name AppGwVIP`
## <a name="clean-up-resources"></a>Clean up resources
First, explore the resources that were created with the application gateway. Then, when they're no longer needed, you can use the `Remove-AzResourceGroup` command to remove the resource group, the application gateway, and all related resources.
`Remove-AzResourceGroup -Name $rg`
## <a name="next-steps"></a>Next steps
> [!div class="nextstepaction"]
> [Create an application gateway with URL path-based routing rules](./tutorial-url-route-powershell.md)
| 50.172589 | 527 | 0.789458 | tur_Latn | 0.991228 |
ff079732807044af7c2529ab2b14b1a84b3a17e9 | 13,226 | md | Markdown | articles/vpn-gateway/bgp-how-to-cli.md | TomohiroSuzuki128/azure-docs.ja-jp | c4ea15150c1e6a9b62fb16f1dc933c72f3a02dcb | ["CC-BY-4.0", "MIT"] | null | null | null | articles/vpn-gateway/bgp-how-to-cli.md | TomohiroSuzuki128/azure-docs.ja-jp | c4ea15150c1e6a9b62fb16f1dc933c72f3a02dcb | ["CC-BY-4.0", "MIT"] | null | null | null | articles/vpn-gateway/bgp-how-to-cli.md | TomohiroSuzuki128/azure-docs.ja-jp | c4ea15150c1e6a9b62fb16f1dc933c72f3a02dcb | ["CC-BY-4.0", "MIT"] | null | null | null |
---
title: 'Configure BGP on Azure VPN gateways: Resource Manager and CLI | Microsoft Docs'
description: This article walks you through configuring BGP on an Azure VPN gateway by using Azure Resource Manager and CLI.
services: vpn-gateway
documentationcenter: na
author: yushwang
ms.service: vpn-gateway
ms.topic: article
ms.date: 09/25/2018
ms.author: yushwang
ms.openlocfilehash: 51402196c8429797b644357822a1e3c08982b384
ms.sourcegitcommit: d4dfbc34a1f03488e1b7bc5e711a11b72c717ada
ms.translationtype: HT
ms.contentlocale: ja-JP
ms.lasthandoff: 06/13/2019
ms.locfileid: "65209503"
---
# <a name="how-to-configure-bgp-on-an-azure-vpn-gateway-by-using-cli"></a>How to configure BGP on an Azure VPN gateway by using CLI
This article helps you enable BGP on cross-premises site-to-site (S2S) VPN connections and VNet-to-VNet connections (connections between virtual networks) by using the Azure Resource Manager deployment model and the Azure CLI.
## <a name="about-bgp"></a>About BGP
BGP is the standard routing protocol commonly used on the internet to exchange routing and reachability information between two or more networks. VPN gateways and your on-premises VPN devices, called BGP peers or neighbors, use BGP to exchange routes. The routes inform both gateways about the availability and reachability of the gateways or routers involved, and about the prefixes that can be reached through them. BGP can also enable transit routing among multiple networks by propagating the routes that a BGP gateway learns from one BGP peer to all other BGP peers.
For a detailed discussion of the benefits of BGP, and of the technical requirements and considerations for using BGP, see [Overview of BGP with Azure VPN gateways](vpn-gateway-bgp-overview.md).
This article walks you through the following tasks:
* [Enable BGP for your VPN gateway](#enablebgp) (required)
You can then complete either or both of the following sections:
* [Establish a cross-premises connection with BGP](#crossprembgp)
* [Establish a VNet-to-VNet connection with BGP](#v2vbgp)
Each of these three sections forms a basic building block for enabling BGP in your network connectivity. If you complete all three sections, you build the topology shown in the following diagram:

You can combine these sections to build a more complex, multihop transit network that meets your needs.
## <a name ="enablebgp"></a>Enable BGP for your VPN gateway
This section is required before you perform any of the steps in the other two configuration sections. The following configuration steps set up the BGP parameters of the Azure VPN gateway, as shown in the following diagram:

### <a name="before-you-begin"></a>Before you begin
Install the latest version of the CLI commands (2.0 or later). For information about installing the CLI commands, see [Install the Azure CLI](/cli/azure/install-azure-cli) and [Get started with Azure CLI](/cli/azure/get-started-with-azure-cli).
### <a name="step-1-create-and-configure-testvnet1"></a>Step 1: Create and configure TestVNet1
#### <a name="Login"></a>1. Connect to your subscription
[!INCLUDE [CLI login](../../includes/vpn-gateway-cli-login-include.md)]
#### <a name="2-create-a-resource-group"></a>2. Create a resource group
The following example creates a resource group named TestRG1 in the "eastus" location. If you already have a resource group in the region where you want to create your virtual network, you can use that one instead.
```azurecli
az group create --name TestBGPRG1 --location eastus
```
#### <a name="3-create-testvnet1"></a>3. Create TestVNet1
The following example creates a virtual network named TestVNet1 and three subnets: GatewaySubnet, FrontEnd, and BackEnd. When you're substituting values, it's important that you always name your gateway subnet specifically GatewaySubnet. If you name it something else, your gateway creation fails.
The first command creates the front-end address space and the FrontEnd subnet. The second command creates an additional address space for the BackEnd subnet. The third and fourth commands create the BackEnd subnet and GatewaySubnet.
```azurecli
az network vnet create -n TestVNet1 -g TestBGPRG1 --address-prefix 10.11.0.0/16 -l eastus --subnet-name FrontEnd --subnet-prefix 10.11.0.0/24
az network vnet update -n TestVNet1 --address-prefixes 10.11.0.0/16 10.12.0.0/16 -g TestBGPRG1
az network vnet subnet create --vnet-name TestVNet1 -n BackEnd -g TestBGPRG1 --address-prefix 10.12.0.0/24
az network vnet subnet create --vnet-name TestVNet1 -n GatewaySubnet -g TestBGPRG1 --address-prefix 10.12.255.0/27
```
### <a name="step-2-create-the-vpn-gateway-for-testvnet1-with-bgp-parameters"></a>Step 2: Create the VPN gateway for TestVNet1 with BGP parameters
#### <a name="1-create-the-public-ip-address"></a>1. Create the public IP address
Request a public IP address. The public IP address will be allocated to the VPN gateway that you create for your virtual network.
```azurecli
az network public-ip create -n GWPubIP -g TestBGPRG1 --allocation-method Dynamic
```
#### <a name="2-create-the-vpn-gateway-with-the-as-number"></a>2. Create the VPN gateway with the AS number
Create the virtual network gateway for TestVNet1. BGP requires a route-based VPN gateway. You also need the additional parameter `-Asn` to set the autonomous system number (ASN) for TestVNet1. Creating a gateway can take a while (45 minutes or more to complete).
If you run this command by using the `--no-wait` parameter, you don't see any feedback or output. The `--no-wait` parameter allows the gateway to be created in the background. It doesn't mean that the VPN gateway is created immediately.
```azurecli
az network vnet-gateway create -n VNet1GW -l eastus --public-ip-address GWPubIP -g TestBGPRG1 --vnet TestVNet1 --gateway-type Vpn --sku HighPerformance --vpn-type RouteBased --asn 65010 --no-wait
```
#### <a name="3-obtain-the-azure-bgp-peer-ip-address"></a>3. Obtain the Azure BGP peer IP address
After the gateway is created, you need to obtain the BGP peer IP address on the Azure VPN gateway. This address is needed to configure the VPN gateway as a BGP peer for your on-premises VPN devices.
Run the following command and check the `bgpSettings` section at the top of the output:
```azurecli
az network vnet-gateway list -g TestBGPRG1
"bgpSettings": {
"asn": 65010,
"bgpPeeringAddress": "10.12.255.30",
"peerWeight": 0
}
```
After the gateway is created, you can use it to establish a cross-premises connection or a VNet-to-VNet connection with BGP.
## <a name ="crossprembgp"></a>Establish a cross-premises connection with BGP
To establish a cross-premises connection, you need to create a local network gateway to represent your on-premises VPN device. Then you connect the Azure VPN gateway with the local network gateway. Although these steps are similar to creating other connections, they include the additional properties required to specify the BGP configuration parameters.

### <a name="step-1-create-and-configure-the-local-network-gateway"></a>Step 1: Create and configure the local network gateway
This exercise continues to build the configuration shown in the diagram. Replace the values with the ones that you want to use for your configuration. When you're working with local network gateways, keep the following things in mind:
* The local network gateway can be in the same location and resource group as the VPN gateway, or it can be in a different location and resource group. This example shows the gateways in different resource groups in different locations.
* The minimum prefix that you need to declare for the local network gateway is the host address of your BGP peer IP address on your VPN device. In this case, it's a /32 prefix of 10.51.255.254/32.
* As noted earlier, you must use different BGP ASNs between your on-premises networks and the Azure virtual network. If they're the same, you need to change your VNet ASN if your on-premises VPN device already uses the ASN to peer with other BGP neighbors.
Before you continue, make sure you've completed the [Enable BGP for your VPN gateway](#enablebgp) section of this exercise, and that you're still connected to Subscription 1. In this example, you create a new resource group. Also, note the two additional parameters for the local network gateway: `Asn` and `BgpPeerAddress`.
```azurecli
az group create -n TestBGPRG5 -l eastus2
az network local-gateway create --gateway-ip-address 23.99.221.164 -n Site5 -g TestBGPRG5 --local-address-prefixes 10.51.255.254/32 --asn 65050 --bgp-peering-address 10.51.255.254
```
### <a name="step-2-connect-the-vnet-gateway-and-local-network-gateway"></a>Step 2: Connect the VNet gateway and local network gateway
In this step, you create the connection from TestVNet1 to Site5. You must specify the `--enable-bgp` parameter to enable BGP for this connection.
In this example, the virtual network gateway and local network gateway are in different resource groups. When the gateways are in different resource groups, you must specify the entire resource ID of the two gateways to set up a connection between the virtual networks.
#### <a name="1-get-the-resource-id-of-vnet1gw"></a>1. Get the resource ID of VNet1GW
Use the output of the following command to get the resource ID of VNet1GW:
```azurecli
az network vnet-gateway show -n VNet1GW -g TestBGPRG1
```
In the output, find the `"id":` line. You'll need the value inside the quotation marks to create the connection in the next section.
Example output:
```
{
"activeActive": false,
"bgpSettings": {
"asn": 65010,
"bgpPeeringAddress": "10.12.255.30",
"peerWeight": 0
},
"enableBgp": true,
"etag": "W/\"<your etag number>\"",
"gatewayDefaultSite": null,
"gatewayType": "Vpn",
"id": "/subscriptions/<subscription ID>/resourceGroups/TestBGPRG1/providers/Microsoft.Network/virtualNetworkGateways/VNet1GW",
```
Copy the value after `"id":` to a text editor, such as Notepad, so that you can easily paste it when creating the connection.
```
"id": "/subscriptions/<subscription ID>/resourceGroups/TestRG1/providers/Microsoft.Network/virtualNetworkGateways/VNet1GW"
```
#### <a name="2-get-the-resource-id-of-site5"></a>2. Get the resource ID of Site5
Use the output of the following command to get the resource ID of Site5:
```azurecli
az network local-gateway show -n Site5 -g TestBGPRG5
```
#### <a name="3-create-the-testvnet1-to-site5-connection"></a>3. Create the TestVNet1-to-Site5 connection
In this step, you create the connection from TestVNet1 to Site5. As discussed earlier, it's possible to have both BGP and non-BGP connections on the same Azure VPN gateway. Unless BGP is enabled in the connection property, Azure won't enable BGP for this connection, even though BGP parameters are already configured on both gateways. Replace the subscription IDs with your own.
```azurecli
az network vpn-connection create -n VNet1ToSite5 -g TestBGPRG1 --vnet-gateway1 /subscriptions/<subscription ID>/resourceGroups/TestBGPRG1/providers/Microsoft.Network/virtualNetworkGateways/VNet1GW --enable-bgp -l eastus --shared-key "abc123" --local-gateway2 /subscriptions/<subscription ID>/resourceGroups/TestBGPRG5/providers/Microsoft.Network/localNetworkGateways/Site5 --no-wait
```
For this exercise, the following example lists the parameters that you enter into the BGP configuration section on your on-premises VPN device:
```
Site5 ASN : 65050
Site5 BGP IP : 10.52.255.254
Prefixes to announce : (for example) 10.51.0.0/16 and 10.52.0.0/16
Azure VNet ASN : 65010
Azure VNet BGP IP : 10.12.255.30
Static route : Add a route for 10.12.255.30/32, with nexthop being the VPN tunnel interface on your device
eBGP Multihop : Ensure the "multihop" option for eBGP is enabled on your device if needed
```
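As an illustration only, on a device with Cisco IOS-style syntax the parameters above might translate to something like the following. This sketch is not part of the article, command syntax varies by vendor and OS version, and `Tunnel1` is a hypothetical tunnel interface name; consult your device's documentation:

```
! Static route: reach the Azure BGP peer IP through the VPN tunnel interface
ip route 10.12.255.30 255.255.255.255 Tunnel1
!
router bgp 65050
 bgp router-id 10.51.255.254
 ! Peer with the Azure VPN gateway's BGP IP and ASN (eBGP multihop enabled)
 neighbor 10.12.255.30 remote-as 65010
 neighbor 10.12.255.30 ebgp-multihop 255
 neighbor 10.12.255.30 update-source Tunnel1
 ! Announce the on-premises prefixes
 network 10.51.0.0 mask 255.255.0.0
 network 10.52.0.0 mask 255.255.0.0
```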
The connection should be established after a few minutes. The BGP peering session starts after the IPsec connection is established.
## <a name ="v2vbgp"></a>Establish a VNet-to-VNet connection with BGP
This section adds a VNet-to-VNet connection with BGP, as shown in the following diagram:

The following instructions continue from the steps in the preceding sections. To create and configure TestVNet1 and the VPN gateway with BGP, you must complete the [Enable BGP for your VPN gateway](#enablebgp) section.
### <a name="step-1-create-testvnet2-and-the-vpn-gateway"></a>Step 1: Create TestVNet2 and the VPN gateway
It's important to make sure that the IP address space of the new virtual network, TestVNet2, doesn't overlap with any of your VNet ranges.
In this example, the virtual networks belong to the same subscription. You can set up VNet-to-VNet connections between different subscriptions. For more information, see [Configure a VNet-to-VNet connection](vpn-gateway-howto-vnet-vnet-cli.md). Make sure you add `-EnableBgp $True` when creating the connections to enable BGP.
#### <a name="1-create-a-new-resource-group"></a>1. Create a new resource group
```azurecli
az group create -n TestBGPRG2 -l westus
```
#### <a name="2-create-testvnet2-in-the-new-resource-group"></a>2. Create TestVNet2 in the new resource group
The first command creates the front-end address space and the FrontEnd subnet. The second command creates an additional address space for the BackEnd subnet. The third and fourth commands create the BackEnd subnet and GatewaySubnet.
```azurecli
az network vnet create -n TestVNet2 -g TestBGPRG2 --address-prefix 10.21.0.0/16 -l westus --subnet-name FrontEnd --subnet-prefix 10.21.0.0/24
az network vnet update -n TestVNet2 --address-prefixes 10.21.0.0/16 10.22.0.0/16 -g TestBGPRG2
az network vnet subnet create --vnet-name TestVNet2 -n BackEnd -g TestBGPRG2 --address-prefix 10.22.0.0/24
az network vnet subnet create --vnet-name TestVNet2 -n GatewaySubnet -g TestBGPRG2 --address-prefix 10.22.255.0/27
```
#### <a name="3-create-the-public-ip-address"></a>3. Create the public IP address
Request a public IP address. The public IP address will be allocated to the VPN gateway that you create for your virtual network.
```azurecli
az network public-ip create -n GWPubIP2 -g TestBGPRG2 --allocation-method Dynamic
```
#### <a name="4-create-the-vpn-gateway-with-the-as-number"></a>4. Create the VPN gateway with the AS number
Create the virtual network gateway for TestVNet2. You must override the default ASN on your Azure VPN gateways. The ASNs for the connected virtual networks must be different to enable BGP and transit routing.
```azurecli
az network vnet-gateway create -n VNet2GW -l westus --public-ip-address GWPubIP2 -g TestBGPRG2 --vnet TestVNet2 --gateway-type Vpn --sku Standard --vpn-type RouteBased --asn 65020 --no-wait
```
### <a name="step-2-connect-the-testvnet1-and-testvnet2-gateways"></a>Step 2: Connect the TestVNet1 and TestVNet2 gateways
In this step, you create the connection from TestVNet1 to TestVNet2, and the connection from TestVNet2 to TestVNet1. You must specify the `--enable-bgp` parameter to enable BGP for these connections.
In the following example, the virtual network gateways are in different resource groups. When the gateways are in different resource groups, you must specify the entire resource ID of the two gateways to set up a connection between the virtual networks.
#### <a name="1-get-the-resource-id-of-vnet1gw"></a>1. Get the resource ID of VNet1GW
Get the resource ID of VNet1GW from the output of the following command:
```azurecli
az network vnet-gateway show -n VNet1GW -g TestBGPRG1
```
#### <a name="2-get-the-resource-id-of-vnet2gw"></a>2.VNet2GW のリソース ID を取得する
次のコマンドの出力から、VNet2GW のリソース ID を取得します。
```azurecli
az network vnet-gateway show -n VNet2GW -g TestBGPRG2
```
#### <a name="3-create-the-connections"></a>3.接続を作成する
TestVNet1 から TestVNet2 への接続と、TestVNet2 から TestVNet1 への接続を作成します。 サブスクリプション ID を自分の ID に置換します。
```azurecli
az network vpn-connection create -n VNet1ToVNet2 -g TestBGPRG1 --vnet-gateway1 /subscriptions/<subscription ID>/resourceGroups/TestBGPRG1/providers/Microsoft.Network/virtualNetworkGateways/VNet1GW --enable-bgp -l eastus --shared-key "efg456" --vnet-gateway2 /subscriptions/<subscription ID>/resourceGroups/TestBGPRG2/providers/Microsoft.Network/virtualNetworkGateways/VNet2GW
```
```azurecli
az network vpn-connection create -n VNet2ToVNet1 -g TestBGPRG2 --vnet-gateway1 /subscriptions/<subscription ID>/resourceGroups/TestBGPRG2/providers/Microsoft.Network/virtualNetworkGateways/VNet2GW --enable-bgp -l westus --shared-key "efg456" --vnet-gateway2 /subscriptions/<subscription ID>/resourceGroups/TestBGPRG1/providers/Microsoft.Network/virtualNetworkGateways/VNet1GW
```
> [!IMPORTANT]
> Enable BGP on *both* connections.
>
>
After completing these steps, the connection will be established in a few minutes. The BGP peering session becomes active once the VNet-to-VNet connection is completed.
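To verify that the peering is up once the connections are established, you can query each gateway's BGP peer status (shown here for the gateways created in this exercise; the command requires an active subscription):

```azurecli
az network vnet-gateway list-bgp-peer-status -n VNet1GW -g TestBGPRG1
az network vnet-gateway list-bgp-peer-status -n VNet2GW -g TestBGPRG2
```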
## <a name="next-steps"></a>次の手順
接続が完成したら、仮想ネットワークに仮想マシンを追加することができます。 手順については、[仮想マシンの作成](../virtual-machines/virtual-machines-windows-hero-tutorial.md?toc=%2fazure%2fvirtual-machines%2fwindows%2ftoc.json)に関するページをご覧ください。
---
# You don't need to edit this file, it's empty on purpose.
# Edit theme's home layout instead if you wanna make some changes
# See: https://jekyllrb.com/docs/themes/#overriding-theme-defaults
layout: home
---
<div class="divider">
<div class="left">
<img id="profilepic" width="200" height="200" src="assets/profile.JPG" alt="Profile">
</div>
<div class="right">
<p>Hi, I'm Lj Miranda and welcome to my blog!</p>
<p>
I'm taking some time off to rest, refresh, and recharge. <b>I am
currently unreachable.</b> In the meantime, I'll still be publishing
posts that I've written ahead of time.
</p>
<p>
In this blog, I write about my interests in machine learning,
software development, and research—so grab a cup of coffee and feel
free to look around!
</p>
</div>
</div>
# Subject 1
## Which of the following cannot be considered a basic practice of XP (eXtreme Programming)?
- Linear Sequential Method
1-2 Pair Programming (PP): two developers work together
1-3 Collective Ownership (CO)
1-4 Continuous Integration (CI)
Others: Planning Game (PG), Small Releases (SR), System Metaphor, Simple Design (SD), Testing,
Refactoring (removing duplication), On-site Customer, Coding Standards
Four core values: communication, simplicity, feedback, courage
[Explanation by: 저질체력]
▶ Agile methodologies look for a middle ground between development with no planning at all and development with excessive planning; they suit small-scale development projects well (XP and Scrum are the most widely used).
▶ The five values of XP (Extreme Programming): simplicity, communication, feedback, courage, respect
▷ Communication: emphasizes communication between the customer and the developers.
▷ Simplicity: exclude unused structures and algorithms, and aim for the most efficient design and coding.
▷ Feedback: make fast decisions through immediate feedback.
▷ Courage: developers confidently embrace change and respond actively to customer requirements.
------------------
1. Linear Sequential Method (X): a sequential method, not an XP practice
2. Pair Programming: all programming is done by two programmers working together at one computer
3. Collective (Code) Ownership: the code is the shared responsibility of the team, and anyone may modify it
4. Continuous Integration: source code developed per component or module is continuously integrated, and tested, whenever a task is completed
## 2. In Rumbaugh's object-oriented analysis technique, which diagram is used for dynamic modeling?
- State diagram
- Dynamic modeling uses state diagrams (state charts) to express dynamic behavior such as control flow, interaction, and operation order between objects over time.
*Rumbaugh: modeling that represents software components using a graphical notation
* Analysis consists of: object modeling, dynamic modeling, functional modeling
Object modeling: expressed with object diagrams; the most important, and performed first
Dynamic modeling: state diagrams (state charts); dynamic flow and behavior
Functional modeling: uses data flow to handle the flow of data between processes
## 3. Which of the following is not a main function of CASE (Computer Aided Software Engineering)?
- Language translation
- Computer-aided software engineering (CASE), also called computer-aided systems engineering
- Provides software tools that automate system development methodologies, reducing repetitive work for developers
- CASE tools provide graphics features that automatically generate charts and diagrams, screen and report generators, a data dictionary, analysis and checking tools, code generators, document generators, and so on
- Functions: 1. Linking all phases of the software lifecycle
2. Graphics support
3. Support for various software development models
## 6. Which statement about the pipe-filter style of software architecture is correct?
- Each subsystem receives input data, processes it, and repeatedly passes the result on to the next subsystem.
- "Architecture" literally means structure, a building, or the study of buildings; a software architecture is the structure of the software.
Software architecture patterns:
1. Layers pattern: organizes the system into layers, e.g. the OSI reference model
2. Client-server pattern: one server component and multiple client components
3. Pipe-filter pattern: encapsulates each step of a data-stream procedure in a filter component and passes data between them through pipes, e.g. the UNIX shell
4. Model-View-Controller pattern: structures a subsystem into three parts
-------------------
5. Master-slave pattern 6. Broker pattern 7. Peer-to-peer pattern 8. Event-bus pattern 9. Blackboard pattern 10. Interpreter pattern
## 7. Which code is built from physical quantities of the coded items, such as weight, area, or volume?
- Significant digit code
- *Code, defined:
Numbers, letters, or symbols used to identify, classify, and arrange data according to its purpose, chosen for efficient computer processing
*Code types:
1) Sequence code: codes assigned in a fixed order, such as order of occurrence, size, or alphabetical order
2) Block code (classification code): the coding targets are grouped into blocks in advance, and codes are assigned in order within each block
3) Group classification code: a refinement of the block code, assigning digit positions for major, middle, and minor classifications
4) Significant digit code: uses only numeric values, applying physical attributes of the item such as weight, area, or size directly in the code
5) Decimal classification code: divides the coding targets into groups one decimal digit at a time, classifying them into major, middle, and minor categories
6) Mnemonic code: combines numbers and letters into symbols that make the content easy to remember
7) Letter code: uses commonly used abbreviations as the code
8) Final digit code: used in combination with other code types; appended at the end of a code to express additional meaning
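The significant digit code in item 4 can be illustrated with a small encoder (a hypothetical example of mine, not from the exam material) that packs an item's physical measurements directly into the code:

```python
def significant_digit_code(width_mm, height_mm, weight_kg):
    """Build a significant digit code: the code *is* the item's measurements."""
    return f"{width_mm:04d}-{height_mm:04d}-{weight_kg:03d}"

def decode(code):
    """Read the measurements straight back out of the code."""
    width, height, weight = code.split("-")
    return int(width), int(height), int(weight)
```

Because the code carries the measurements themselves, no lookup table is needed to recover them.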
## 9. Which statement about DFDs (data flow diagrams) is incorrect?
- "The flow of time can be expressed clearly."
- A DFD cannot clearly express the flow of time. It expresses the flow of data and is used in structured analysis; its elements are drawn with arrows, circles, and straight lines.
## 11. Which of the following is not a basic building block of UML?
- Terminal
- UML consists of three kinds of building blocks: things, relationships, and diagrams.
## 15. Which statement about requirement specification techniques is incorrect?
- "Informal specification techniques use the Z informal specification technique to express user requirements." (Z is a formal technique.)
- Formal specification
    - Based on mathematics / models
    - Z, VDM, Petri nets (model-based)
    - CSP, CCS, LOTOS (algebraic methods)
    - System requirements are precise and the specification is concise; specification and implementation match.
    - However, it is hard to understand and burdensome for stakeholders to write.
- Informal specification
    - State-, function-, and object-centered specification
    - FSM (finite state machine)
    - Decision tables, ER modeling
    - State charts (SADT)
    - Use cases: user-based modeling
    - Easy to write, with diverse means of communication.
    - Risk of insufficient specification and ambiguity.
## 16. Which statement about the requirements analysis phase of software development is incorrect?
- "It is the phase that consumes the most development cost."
- The phase that consumes the most development cost is the maintenance phase.
ff09f4d0c82deb2eeb378d94df805b98fefb4610 | 10,618 | md | Markdown | docs/ELI_FileSpec.md | bakercp/libdpen | 5efb0bb70a9f9f8fb713a104bd314fd394494b30 | [
"MIT"
] | 6 | 2015-06-05T07:08:13.000Z | 2018-05-02T17:39:39.000Z | docs/ELI_FileSpec.md | bakercp/libdpen | 5efb0bb70a9f9f8fb713a104bd314fd394494b30 | [
"MIT"
] | null | null | null | docs/ELI_FileSpec.md | bakercp/libdpen | 5efb0bb70a9f9f8fb713a104bd314fd394494b30 | [
"MIT"
] | null | null | null |
_The .ELI/.WPI binary format is a serialized binary format representing pen data, including position, tilt, pressure, timing, strokes, layers, and more._
_This specification is unofficial, incomplete, in progress and the result of trial and error, a hex editor, and a handful of .WPI files carefully generated with the Wacom Inkling._
_Christopher Baker <https://christopherbaker.net>_
# Equipment
## Wacom Inkling
The Wacom Inkling's official specifications can be found [here](https://web.archive.org/web/20121013141410/http://www.wacom.com/en/Products/Inkling/Inkling-Technical-Specifics.aspx). It appears that the sampling rate for the Inkling is 200 samples / second. It provides 1024 levels of pressure.
## Others
…
# Header
The standard header length is **2059** bytes. There are some variations between headers, but no examples have exceeded **2059** bytes. The contents are still a work in progress.
|Byte_Range|Function|_Notes_|
|--:|:--|:--|
|0-269|…|…|
|270-273|…|differ across devices ?|
|270-273|…|differ across devices ?|
|434-436|…|differ across devices ?|
|446-1136|…|differ across devices ?|
|1146-1837|…|differ across devices ?|
|1858-1860|…|differ across devices ?|
|1870-1871|…|differ across devices ?|
|1882-1885|…|differ across devices ?|
|1894-1897|…|differ across devices ?|
|2043-2044|…|differ across devices ?|
|2045-2059|…|Are again the same.|
# Layer and Stroke Blocks
Layer and stroke blocks are 3 bytes long and are indicated by a `0xF1` marker, and a block identification byte differentiating the specific event types.
## Layer Start Block
A new layer is indicated by block identification of `0x80`.
|BYTE_0|BYTE_1|BYTE_2|
|:----:|:----:|:----:|
|_Marker_|_Length_|_ID_|
|`0xF1`|`0x03`|`0x80`|
### Notes
> The new layer marker will be present even when there is no stroke data on the layer. In the case of the Wacom Inkling, this can happen if one presses the new layer button multiple times without putting pen to paper. Wacom's Sketch Manager software skips such "empty" layers during display and export.
## Stroke Start Block
The beginning of a stroke (a pen down) is indicated by 3 byte sequence.
|BYTE_0|BYTE_1|BYTE_2|
|:----:|:----:|:----:|
|_Marker_|_Length_|_ID_|
|`0xF1`|`0x03`|`0x01`|
## Stroke End Block
The end of the stroke (a pen lift) is indicated by a 3 byte sequence.
|BYTE_0|BYTE_1|BYTE_2|
|:----:|:----:|:----:|
|_Marker_|_Length_|_ID_|
|`0xF1`|`0x03`|`0x00`|
# Pen Data Blocks
Wacom Inkling `.WPI` files generally present pen data in clusters of three ordered data points. These clusters are always surrounded by [Stroke Start](#strokestart) and [Stroke End](#strokeend) events.
### Example
> * [Stroke Start](#strokestart)
> * ...
> * [X/Y Position](#xyposition)
> * [Pressure](#pressure)
> * [X/Y Tilt](#xytilt)
> * ...
> * [Stroke End](#strokeend)
## X/Y POSITION Block
X and Y values are encoded as a two byte `uint16_t` values.
|BYTE_0|BYTE_1|BYTE_2|BYTE_3|BYTE_4|BYTE_5|
|:----:|:----:|:----:|:----:|:----:|:----:|
|_Marker_|_Length_|_X HIGH_|_X LOW_|_Y HIGH_|_Y LOW_|
|`0xF1`|`0x06`|`VARIES`|`VARIES`|`VARIES`|`VARIES`|
### Example
```c++
uint8_t buffer[] = { 0x61, 0x06, 0x03, 0x4F, 0x13, 0x61 };
uint32_t i = 0;
uint16_t xPos = (buffer[i+2] << 8) | (buffer[i+3]); // get x value
uint16_t yPos = (buffer[i+4] << 8) | (buffer[i+5]); // get y value
```
### Notes
> The Inkling Sketch Manager software scales the resulting raw `x` and `y` values by `x / 10.0f` and `y / 5.0f` when creating the `.WAC` InkML output files. e.g.
```c++
float x = xPos / 10.0f; // Scale like Inkling Sketch Manager
float y = xPos / 5.0f; // Scale like Inkling Sketch Manager
```
This scaling factor was empirically determined by comparing the raw `.WPI` file output to Wacom's Sketch Manager `.WAC` InkML files.
## X/Y TILT Block
Tilt X and Tilt Y values are encoded as a one byte `uint8_t` values.
|BYTE_0|BYTE_1|BYTE_2|BYTE_3|BYTE_4|BYTE_5|
|:----:|:----:|:----:|:----:|:----:|:----:|
|_Marker_|_Length_|_X TILT_|_Y TILT_|_?_|_?_|
|`0x65`|`0x06`|`VARIES`|`VARIES`|`0x00`[‡](#dc)|`0x00`[‡](#dc)|
### Example
```c++
uint8_t buffer[] = { 0x65, 0x06, 0x3D, 0x28 0x00, 0x00 };
uint32_t i = 0;
int8_t xTilt = buffer[i+2]; // get tilt x value
int8_t yTilt = buffer[i+3]; // get tilt y value
```
### Notes
> ...
## PRESSURE
Pressure is encoded as a two byte `uint16_t` value. Pressure values range between 0-1024 based on the specifications.
|BYTE_0|BYTE_1|BYTE_2|BYTE_3|BYTE_4|BYTE_5|
|:----:|:----:|:----:|:----:|:----:|:----:|
|_Marker_|_Length_|_?_|_?_|_PRES. HIGH_|_PRES. LOW_|
|`0x64`|`0x06`|`0x00`[‡](#dc)|`0x00`[‡](#dc)|`VARIES`|`VARIES`|
### Example
```c++
uint8_t buffer[] = { 0x64, 0x06, 0x00, 0x00, 0x00, 0xAD };
uint32_t i = 0;
uint16_t pressure = (buffer[i+4] << 8) | (buffer[i+5]); // get pressure
```
### Notes
> ...
# Time
Timing blocks begin with a `0xC2` are `0x06` bytes long and are identified by a block id, depending on the type of counter information. The most important timer is the [Clock](#clock).
## Clock
Throughout each file, a time marker is recorded every second beginning with device power on. They are identified by a block id of `0x11`. The timing sequence is composed of 6 bytes. The elapsed time is encoded in the last two bytes as an `uint16_t`. BYTE_3 may also be used to encode time, but the last two bytes can represent 2<sup>8</sup> seconds (over 18 hours). No test data of this duration has been generated.
|BYTE_0|BYTE_1|BYTE_2|BYTE_3|BYTE_4|BYTE_5|
|:----:|:----:|:----:|:----:|:----:|:----:|
|_Marker_|_Length_|_ID_|_?_|_COUNTER HIGH_|_COUNTER LOW_|
|`0xC2`|`0x06`|`0x11`|`0x00`[‡](#dc)|`VARIES`|`VARIES`|
### Example
```c++
uint8_t buffer[] = { 0xC2, 0x06, 0x11, 0x00, 0x00, 0x01 };
uint32_t i = 0;
uint16_t counter = (buffer[i+4] << 8) | (buffer[i+5]); // get counter value
```
### Notes
> ...
## Clock "Init" [INCOMPLETE](#Incomplete)
The clock "init" block happens early in all files examined -- usually before the first Clock block. They are identified by a block id of `0x00`. Usually this block ends with a `0x01`, but when multiples are present in longer files, the last byte can be a `0x02`. Usually this byte alternates between `0x01` and `0x02` when multiples are present. It does not seem to affect or respond other timer values.
|BYTE_0|BYTE_1|BYTE_2|BYTE_3|BYTE_4|BYTE_5|
|:----:|:----:|:----:|:----:|:----:|:----:|
|_Marker_|_Length_|_ID_|_?_|_?_|_Alternates b/t 0x01/0x02_|
|`0xC2`|`0x06`|`0x00`|`0x00`[‡](#dc)|`0x00`[‡](#dc)|`0x01, 0x02`[‡](#dc)|
## Clock "Unknown" [INCOMPLETE](#Incomplete)
The clock "unknown" block usually only happens once per file. It is usually located near the beginning of the file. They are identified by a block id of `0x12`. The last two bytes often share values between files, but not consistently. The range of values taken by the last two bytes are fairly limited, matching occasionally, but not in a clear pattern. It is possible that the last two bytes should be read as a `int16_t` or `uint16_t`.
|BYTE_0|BYTE_1|BYTE_2|BYTE_3|BYTE_4|BYTE_5|
|:----:|:----:|:----:|:----:|:----:|:----:|
|_Marker_|_Length_|_ID_|_?_|_?_|_?_|
|`0xC2`|`0x06`|`0x12`|`0x00`[‡](#dc)|`VARIES`|`VARIES`|
# Quality / Control Blocks
## Interference / Obstructions
The interference sequence occurs when an object is placed in front of the sensor. While the obstruction is present, bytes 10 and 11 are incremented, perhaps representing a counter. Note that unlike position values (which are Big Endian), the counter value appears to be Little Endian.
|BYTE_0|BYTE_1|BYTE_2|BYTE_3|BYTE_4|BYTE_5|BYTE_6|BYTE_7|
|:----:|:----:|:----:|:----:|:----:|:----:|:----:|:----:|
|_Marker_|_Length_|_?_|_?_|_?_|_?_|_?_|_?_|
|`0xC7`|`0x0E`|`0x04`[‡](#dc)|`0x04`[‡](#dc)|`0x03`[‡](#dc)|`VARIES`|`VARIES`|`VARIES`|
|BYTE_8|BYTE_9|BYTE_10|BYTE_11|BYTE_12|BYTE_13|
|:----:|:----:|:-----:|:-----:|:-----:|:-----:|
|_?_|_?_|_COUNTER LOW_|_COUNTER HIGH_|_?_|_?_|
|`VARIES`|`VARIES`|`VARIES`|`VARIES`|`0x00`[‡](#dc)|`0x00`[‡](#dc)|
## 0xC7, 0x1E [INCOMPLETE](#Incomplete)
|BYTE_0|BYTE_1|BYTE_2|BYTE_3|BYTE_4|BYTE_5|BYTE_6|BYTE_7|
|:----:|:----:|:----:|:----:|:----:|:----:|:----:|:----:|
|_Marker_|_Length (30)_|_?_|_?_|_?_|_?_|_?_|_?_|
|`0xC7`|`0x1E`|`0x05 or 0x03`[‡](#dc)|`0x04 or 0x02`[‡](#dc)|`0x00`[‡](#dc)|`0x00`[‡](#dc)|`0x07`[‡](#dc)|`0x00`[‡](#dc)|
|BYTE_8|BYTE_9|BYTE_10|BYTE_11|BYTE_12|BYTE_13|BYTE_14|BYTE_15|
|:-:|:-:|:-:|:-:|:-:|:-:|:-:|:-:|
|_?_|_?_|_?_|_COUNTER LOW_|_COUNTER HIGH_|_?_|_?_|_?_|
|`0x00`[‡](#dc)|`0x00`[‡](#dc)|`VARIES`|`VARIES`|`VARIES`|`0x00`[‡](#dc)|`0x00`[‡](#dc)|`0x00`[‡](#dc)|
|BYTE_16|BYTE_17|BYTE_18|BYTE_19|BYTE_20|BYTE_21|BYTE_22|BYTE_23|
|:----:|:----:|:----:|:----:|:----:|:----:|:----:|:----:|
|_?_|_?_|_?_|_?_|_?_|_?_|_?_|_?_|
|`0x00`|`0x00`|`0x00`[‡](#dc)|`0x00`[‡](#dc)|`0x00`[‡](#dc)|`0x00 OR 0x01`[‡](#dc)|`0x00`[‡](#dc)|`0x00`[‡](#dc)|
|BYTE_24|BYTE_25|BYTE_26|BYTE_27|BYTE_28|BYTE_29|
|:-:|:-:|:-:|:-:|:-:|:-:|
|_?_|_?_|_?_|_?_|_?_|_?_|
|`0x00`[‡](#dc)|`0x00 OR 0x01`[‡](#dc)|`0x00`[‡](#dc)|`0x00`[‡](#dc)|`0x00`[‡](#dc)|`0x00`[‡](#dc)|
### Notes
> ...
## 0xC7, 0x1A [INCOMPLETE](#Incomplete)
### Notes
> ...
## 0xC7, 0x16 [INCOMPLETE](#Incomplete)
### Notes
> ...
## 0xC7, 0x22 [INCOMPLETE](#Incomplete)
### Notes
> ...
e.g.
c7,0e,04,04,00,00,03,00,00,00,4a,2a,00,00w
0xC7, 0x0E, 0x00
/////////////////
The "0xC513" sequence happens through the file. The file always ends with a "0xC513" sequence and the very last one last byte in the last "0xC513" sequence is always a 0x02.
e.g.
0xC5 0x13 0x00 0x4A 0xCD 0x59 0x03 0x03 0x01 0x94 0x04 0x00 0x00 0x00 0x00 0x00 0x00 0x00 0x00
e.g. last "0xC513" seq.
0xC5 0x13 0x00 0x6E 0xCD 0x59 0x03 0x06 0x01 0xB6 0x04 0x00 0x00 0x00 0x00 0x00 0x00 0x00 0x02
The "0xC71E" sequence usually happens early in the file, but can happen elsewhere. It is usually (but not always) closely followed by a "0xC71A" sequence (below).
e.g.
0xC7 0x1E 0x03 0x02 0x00 0x00 0x07 0x00 0x00 0x00 0xE2 0x0C 0x00 0x00 0x00 0x80 0x00 0x00 0x00 0x00 0x00 0x00 0x01 0x00 0x00 0x00 0x01 0x00 0x00 0x00
The "0xC71A" sequence always follows a "0xC71E" sequence.
e.g.
0xC7 0x1A 0x04 0x02 0x00 0x00 0x06 0x00 0x00 0x00 0x06 0x0D 0x00 0x00 0x00 0x80 0x00 0x00 0x00 0x00 0x00 0x00 0x02 0x00 0x00 0x00
# Footnotes
* test <a id="dcc"></a>dcc
* <a id="*">\*</a> :
* <a id="**">\**</a> : -
* <a id="†">†</a> :
* <a id="dc">‡</a> : Typical Value - could take other values, but no other values have been observed.
* <a id="§">§</a> : -
* <a id="Incomplete">Incomplete</a> : Incomplete - more data needed.
| 37.25614 | 443 | 0.652383 | eng_Latn | 0.853457 |
ff0a40b2dcc5f7d79898e5851bfaf927e6df3ffe | 1,248 | md | Markdown | docs/dotnet/how-to-detect-clr-compilation.md | Juanmemo/cpp-docs | 4dcb1791f0c619a99b785ea6f832b8804c188bd9 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-05-18T02:55:34.000Z | 2021-05-18T02:55:34.000Z | docs/dotnet/how-to-detect-clr-compilation.md | Juanmemo/cpp-docs | 4dcb1791f0c619a99b785ea6f832b8804c188bd9 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/dotnet/how-to-detect-clr-compilation.md | Juanmemo/cpp-docs | 4dcb1791f0c619a99b785ea6f832b8804c188bd9 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: "How to: Detect -clr Compilation | Microsoft Docs"
ms.custom: ""
ms.date: "11/04/2016"
ms.reviewer: ""
ms.suite: ""
ms.technology: ["cpp-windows"]
ms.tgt_pltfrm: ""
ms.topic: "get-started-article"
dev_langs: ["C++"]
helpviewer_keywords: ["compilation, detecting /clr", "/clr compiler option [C++], detecting use of"]
ms.assetid: a9310045-4810-4637-a64a-0b31a08791c1
caps.latest.revision: 10
author: "mikeblome"
ms.author: "mblome"
manager: "ghogen"
ms.workload: ["cplusplus", "dotnet"]
---
# How to: Detect /clr Compilation
Use the `_MANAGED` or `_M_CEE` macro to see if a module is compiled with **/clr**. For more information, see [/clr (Common Language Runtime Compilation)](../build/reference/clr-common-language-runtime-compilation.md).
For more information about macros, see [Predefined Macros](../preprocessor/predefined-macros.md).
## Example
```
// detect_CLR_compilation.cpp
// compile with: /clr
#include <stdio.h>
int main() {
#if (_MANAGED == 1) || (_M_CEE == 1)
printf_s("compiling with /clr\n");
#else
printf_s("compiling without /clr\n");
#endif
}
```
## See Also
[Using C++ Interop (Implicit PInvoke)](../dotnet/using-cpp-interop-implicit-pinvoke.md) | 30.439024 | 219 | 0.672276 | eng_Latn | 0.627116 |
ff0a6ec86ca7a62efee1b80495a550a5de4f4aea | 605 | md | Markdown | content/Reduced Row Echelon Form.md | ransurf/quartz | 174c514401f4265b360fb0e22449adeb462cc152 | [
"MIT"
] | null | null | null | content/Reduced Row Echelon Form.md | ransurf/quartz | 174c514401f4265b360fb0e22449adeb462cc152 | [
"MIT"
] | null | null | null | content/Reduced Row Echelon Form.md | ransurf/quartz | 174c514401f4265b360fb0e22449adeb462cc152 | [
"MIT"
] | null | null | null | Status:
Tags: #cards/math232/unit2
Links: [[Augmented Matrix]]
___
# Reduced Row Echelon Form
## Principles
- Are unique
Process
?
- all digits to right of a leading 1 are 0
- achieved by adding on the multiple of the row directly below to cancel out and create a 0
<!--SR:!2022-03-08,26,170-->
Checking
?
- Choose the [[Free variables]], and then find the others from there
- Plug in to all original equations
<!--SR:!2022-02-22,12,170-->
___
# Backlinks
```dataview
list from [[Reduced Row Echelon Form]] AND !outgoing([[Reduced Row Echelon Form]])
```
___
References:
Created:: 2022-01-22 00:50
| 19.516129 | 91 | 0.710744 | eng_Latn | 0.97261 |
ff0a7b7913fe88dc087dac399a30726b49104f13 | 5,387 | md | Markdown | yammer/configure-your-yammer-network/file-migration-native-mode.md | whitemike889/OfficeDocs-O365ITPro | 9850ee6bbd9ad78d7fb959d00f1ff233a6eb8a42 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | yammer/configure-your-yammer-network/file-migration-native-mode.md | whitemike889/OfficeDocs-O365ITPro | 9850ee6bbd9ad78d7fb959d00f1ff233a6eb8a42 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | yammer/configure-your-yammer-network/file-migration-native-mode.md | whitemike889/OfficeDocs-O365ITPro | 9850ee6bbd9ad78d7fb959d00f1ff233a6eb8a42 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: "Prepare files in Yammer for Native Mode for Microsoft 365"
ms.author: v-teflor
author: TeresaFG-writer
manager: pamgreen
ms.date: 9/23/2019
audience: Admin
ms.topic: article
ms.service: yammer
localization_priority: Normal
ms.custom: Adm_Yammer
search.appverid:
- MOE150
- MET150
description: "Learn how to move all Yammer files to SharePoint, part of the requirements for using Native Mode for Microsoft 365."
ROBOTS: NOINDEX, NOFOLLOW
---
# Prepare files in Yammer for Native Mode for Microsoft 365
To use Native Mode for Microsoft 365, all Yammer files must be stored in SharePoint. Having one consistent location for files makes usage easier for both end-users and admins:
- **End-users** follow the same processes with all files in all groups, including the ability to add conditional access policies on specific files.
- **Admins** can use the same security and compliance tools for Yammer as they do with the rest of Office 365 including eDiscovery, have a consistent way to export all files, and simplify complying with GDPR requests.
The Microsoft 365 Alignment Wizard does the following file-related steps:
1. Provides a link to download the **User and Group Activity Report**, which lists the number of files to be moved for each group and the files to be deleted.
2. Copy all Yammer group files from Azure cloud storage to the SharePoint document library for the group, and automatically delete the Azure cloud storage files 30 days after the wizard completes.
3. Delete all Yammer files that were attached to private messages.
>[!IMPORTANT]
> After migration has been started, it can't be stopped or reversed.
## Decisions needed
Because migration deletes all files from private messages, you need to decide how to handle these files. The options are to:
- Notify users that the files will be deleted. The wizard will delete the files, and they will no longer be available.
- Export the files before migration and save them in case anyone asks for them.
- Export the files before migration, and send the files to the person who posted them, letting them know the files are no longer available on Yammer.
## Recommended process
1. Start the Microsoft 365 Alignment Wizard and download the User and Group Activity Report to see how many files will be moved and how many will be deleted.
2. Export files from private messages if these files need to be saved. For steps, see [Export data from Yammer Enterprise](../manage-security-and-compliance/export-yammer-enterprise-data.md).
3. Optional. Export all files from all groups that have been stored in Yammer to provide a backup of the files.
4. When your backup is complete, start the migration from the Microsoft 365 Alignment Wizard. You can start migration when export is running, but the export data may indicate certain files are in SharePoint due to the migration.
5. Review the Microsoft 365 Alignment Wizard to determine if other steps are necessary before your network can be in Native Mode for Microsoft 365.
## Admin experience during migration
Migration status can be viewed on the Network Alignment Wizard page. You can track:
- How many groups where file migration was successful, partial, or unsuccessful.
- How many files where the filename was changed due to a file name conflict.
Most migrations complete within 72 hours.
## End-user experience during migration
During the migration, users can create new files from Yammer, edit existing Yammer files that are stored in SharePoint, and upload new versions of files that are already stored in SharePoint. Users won't be able to delete files stored in Azure cloud storage once migration has started.
After migration:
- All files will be stored in SharePoint, providing a consistent file management experience.
- If users want to search the content of files, they'll need to go to the SharePoint site or document library for an individual group and use the Search bar at the top of the page, or use content search from the organization's SharePoint site.
Yammer search searches the first 5000 characters of files in Azure cloud storage as well as the title and author, but only searches the title and author of files stored in SharePoint.
## File metadata
When files are copied to the SharePoint document library:
- Only the latest version is moved over, and the version history is not copied.
- The follower count is not copied.
- Users can no longer mark a file as official.
- Files are marked as having been migrated to SharePoint.
## File name conflicts
When migration completes, the admin gets a report of any file name conflicts, and how they were handled:
- If a file with the same name already exists in the SharePoint document library, we append \_*date* to the file name.
- If a filename includes invalid characters, we update the file name to remove the invalid characters.
- If a file can't be migrated for any reason, we retry once, and continue the migration. You won't be able to take your network to Native Mode for Microsoft 365 until you resolve any files left behind.
## FAQ
**Q: Does every Yammer user need a SharePoint license?**
A: No. Only one person in your organization needs a SharePoint license.
## Related articles
[Prepare a Yammer network for Native Mode for Microsoft 365](native-mode.md)
[Enforce Office 365 identity in Yammer](enforce-office-365-identity.md)
| 46.843478 | 285 | 0.786152 | eng_Latn | 0.998926 |
ff0adc26f9c17c02cf0ab9fa3ce973037eef378f | 3,651 | md | Markdown | applications/BearPi/BearPi-HM_Nano/sample/A6_kernel_message/README.md | xusiwei/bearpi-hm_nano | 3ee5b0f9097b19dae89713908500b8dd855c4238 | [
"Apache-2.0"
] | null | null | null | applications/BearPi/BearPi-HM_Nano/sample/A6_kernel_message/README.md | xusiwei/bearpi-hm_nano | 3ee5b0f9097b19dae89713908500b8dd855c4238 | [
"Apache-2.0"
] | null | null | null | applications/BearPi/BearPi-HM_Nano/sample/A6_kernel_message/README.md | xusiwei/bearpi-hm_nano | 3ee5b0f9097b19dae89713908500b8dd855c4238 | [
"Apache-2.0"
] | null | null | null | # BearPi-HM_Nano开发板鸿蒙OS内核编程开发——消息队列
本示例将演示如何在BearPi-HM_Nano开发板上使用cmsis 2.0 接口通过消息队列进行线程之间交换消息

## MessageQueue API分析
## osMessageQueueNew()
```c
osMessageQueueId_t osMessageQueueNew(uint32_t msg_count,uint32_t msg_size,const osMessageQueueAttr_t *attr)
```
**描述:**
函数osMessageQueueNew创建并初始化一个消息队列对象。该函数返回消息队列对象标识符,如果出现错误则返回NULL,可以在RTOS启动(调用 osKernelStart)之前安全地调用该函数,也可以在内核初始化 (调用 osKernelInitialize)之前调用该函数。
> **注意** :不能在中断服务调用该函数
**参数:**
|名字|描述|
|:--|:------|
| msg_count |队列中的最大消息数. |
| msg_size |最大消息大小(以字节为单位). |
| attr |消息队列属性;空:默认值. |
## osMessageQueuePut()
```c
osStatus_t osMessageQueuePut(osMessageQueueId_t mq_id,const void *msg_ptr,uint8_t msg_prio,uint32_t timeout)
```
**描述:**
函数osMessageQueuePut将msg_ptr指向的消息放入参数mq_id指定的消息队列中。
> **注意** :如果参数timeout设置为0,可以从中断服务例程调用
**参数:**
|名字|描述|
|:--|:------|
| mq_id | 由osMessageQueueNew获得的消息队列ID. |
| msg_ptr | 要发送的消息. |
| msg_prio | 指优先级. |
| timeout | 超时值. |
## osMessageQueueGet()
```c
osStatus_t osMessageQueueGet(osMessageQueueId_t mq_id,void *msg_ptr,uint8_t *msg_prio,uint32_t timeout)
```
**描述:**
函数osMessageQueueGet从参数mq_id指定的消息队列中检索消息,并将其保存到参数msg_ptr所指向的缓冲区中。
> **注意** :如果参数timeout设置为0,可以从中断服务例程调用。
**参数:**
|名字|描述|
|:--|:------|
| mq_id | 由osMessageQueueNew获得的消息队列ID. |
| msg_ptr | 指针指向队列中获取消息的缓冲区指针. |
| msg_prio | 指优先级. |
| timeout | 超时值. |
## 软件设计
## Software design

**Key code analysis**

In the Message_example function, osMessageQueueNew() creates the message queue ID. Thread_MsgQueue1() sends messages to the queue with osMessageQueuePut(), and Thread_MsgQueue2() reads messages from the queue with osMessageQueueGet() and prints them.
void Thread_MsgQueue1 (void *argument)
{
(void)argument;
msg.Buf = "Hello BearPi-HM_Nano!"; // do some work...
msg.Idx = 0U;
while (1)
{
osMessageQueuePut(mid_MsgQueue, &msg, 0U, 0U);
osThreadYield(); // suspend thread
osDelay(100);
}
}
void Thread_MsgQueue2 (void *argument)
{
(void)argument;
osStatus_t status;
while (1) {
// Insert thread code here...
status = osMessageQueueGet(mid_MsgQueue, &msg, NULL, 0U); // wait for message
if (status == osOK) {
printf("Message Queue Get msg:%s\n",msg.Buf);
}
}
}
static void Message_example (void) {
mid_MsgQueue = osMessageQueueNew(MSGQUEUE_OBJECTS, 100, NULL);
if (mid_MsgQueue == NULL) {
printf("Falied to create Message Queue!\n");
}
osThreadAttr_t attr;
attr.attr_bits = 0U;
attr.cb_mem = NULL;
attr.cb_size = 0U;
attr.stack_mem = NULL;
attr.stack_size = 1024*10;
attr.priority = 25;
attr.name = "Thread_MsgQueue1";
if (osThreadNew(Thread_MsgQueue1, NULL, &attr) == NULL) {
printf("Falied to create Thread_MsgQueue1!\n");
}
attr.name = "Thread_MsgQueue2";
if (osThreadNew(Thread_MsgQueue2, NULL, &attr) == NULL) {
printf("Falied to create Thread_MsgQueue2!\n");
}
}
```
## 编译调试
### 修改 BUILD.gn 文件
修改 `applications\BearPi\BearPi-HM_Nano\sample`路径下 BUILD.gn 文件,指定 `message_example` 参与编译。
```r
#"A1_kernal_thread:thread_example",
#"A2_kernel_timer:timer_example",
#"A3_kernel_event:event_example",
#"A4_kernel_mutex:mutex_example",
#"A5_kernel_semaphore:semaphore_example",
"A6_kernel_message:message_example",
```
### 运行结果<a name="section18115713118"></a>
示例代码编译烧录代码后,按下开发板的RESET按键,通过串口助手查看日志,会打印从消息队列中获取的消息。
```c
Message Queue Get msg:Hello BearPi-HM_Nano!
Message Queue Get msg:Hello BearPi-HM_Nano!
Message Queue Get msg:Hello BearPi-HM_Nano!
Message Queue Get msg:Hello BearPi-HM_Nano!
Message Queue Get msg:Hello BearPi-HM_Nano!
```
| 22.81875 | 169 | 0.697343 | kor_Hang | 0.206546 |
ff0af7f4db31671898f55f59fcf8dc852a47c0be | 14,872 | md | Markdown | azurermps-6.13.0/AzureRM.Profile/Add-AzureRmEnvironment.md | vladimirf7/azure-docs-powershell | 3ff03c91cee2b137f85eded1db721b46c118e413 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2020-07-18T09:52:49.000Z | 2021-07-20T20:07:58.000Z | azurermps-6.13.0/AzureRM.Profile/Add-AzureRmEnvironment.md | vladimirf7/azure-docs-powershell | 3ff03c91cee2b137f85eded1db721b46c118e413 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | azurermps-6.13.0/AzureRM.Profile/Add-AzureRmEnvironment.md | vladimirf7/azure-docs-powershell | 3ff03c91cee2b137f85eded1db721b46c118e413 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-03-16T09:07:16.000Z | 2021-03-16T09:07:16.000Z | ---
external help file: Microsoft.Azure.Commands.Profile.dll-Help.xml
Module Name: AzureRM.Profile
online version: https://docs.microsoft.com/en-us/powershell/module/azurerm.profile/add-azurermenvironment
schema: 2.0.0
content_git_url: https://github.com/Azure/azure-powershell/blob/preview/src/ResourceManager/Profile/Commands.Profile/help/Add-AzureRmEnvironment.md
original_content_git_url: https://github.com/Azure/azure-powershell/blob/preview/src/ResourceManager/Profile/Commands.Profile/help/Add-AzureRmEnvironment.md
---
# Add-AzureRmEnvironment
## SYNOPSIS
Adds endpoints and metadata for an instance of Azure Resource Manager.
## SYNTAX
### Name (Default)
```
Add-AzureRmEnvironment [-Name] <String> [[-PublishSettingsFileUrl] <String>] [[-ServiceEndpoint] <String>]
[[-ManagementPortalUrl] <String>] [[-StorageEndpoint] <String>] [[-ActiveDirectoryEndpoint] <String>]
[[-ResourceManagerEndpoint] <String>] [[-GalleryEndpoint] <String>]
[[-ActiveDirectoryServiceEndpointResourceId] <String>] [[-GraphEndpoint] <String>]
[[-AzureKeyVaultDnsSuffix] <String>] [[-AzureKeyVaultServiceEndpointResourceId] <String>]
[[-TrafficManagerDnsSuffix] <String>] [[-SqlDatabaseDnsSuffix] <String>]
[[-AzureDataLakeStoreFileSystemEndpointSuffix] <String>]
[[-AzureDataLakeAnalyticsCatalogAndJobEndpointSuffix] <String>] [-EnableAdfsAuthentication]
[[-AdTenant] <String>] [[-GraphAudience] <String>] [[-DataLakeAudience] <String>]
[[-BatchEndpointResourceId] <String>] [[-AzureOperationalInsightsEndpointResourceId] <String>]
[[-AzureOperationalInsightsEndpoint] <String>] [-AzureAnalysisServicesEndpointSuffix <String>] [-Scope <ContextModificationScope>]
[-DefaultProfile <IAzureContextContainer>] [-WhatIf] [-Confirm] [<CommonParameters>]
```
### ARMEndpoint
```
Add-AzureRmEnvironment [-Name] <String> [[-StorageEndpoint] <String>] [-ARMEndpoint] <String>
[[-AzureKeyVaultDnsSuffix] <String>] [[-AzureKeyVaultServiceEndpointResourceId] <String>]
[[-DataLakeAudience] <String>] [[-BatchEndpointResourceId] <String>]
[[-AzureOperationalInsightsEndpointResourceId] <String>] [[-AzureOperationalInsightsEndpoint] <String>]
[-Scope <ContextModificationScope>] [-DefaultProfile <IAzureContextContainer>] [-WhatIf] [-Confirm]
[<CommonParameters>]
```
## DESCRIPTION
The Add-AzureRmEnvironment cmdlet adds endpoints and metadata to enable Azure Resource Manager cmdlets to connect with a new instance of Azure Resource Manager.
The built-in environments AzureCloud and AzureChinaCloud target existing public instances of Azure Resource Manager.
## EXAMPLES
### Example 1: Creating and modifying a new environment
```
PS C:\> Add-AzureRmEnvironment -Name TestEnvironment `
-ActiveDirectoryEndpoint TestADEndpoint `
-ActiveDirectoryServiceEndpointResourceId TestADApplicationId `
-ResourceManagerEndpoint TestRMEndpoint `
-GalleryEndpoint TestGalleryEndpoint `
-GraphEndpoint TestGraphEndpoint
Name Resource Manager Url ActiveDirectory Authority
---- -------------------- -------------------------
TestEnvironment TestRMEndpoint TestADEndpoint/
PS C:\> Set-AzureRmEnvironment -Name TestEnvironment `
-ActiveDirectoryEndpoint NewTestADEndpoint `
-GraphEndpoint NewTestGraphEndpoint | Format-List
Name : TestEnvironment
EnableAdfsAuthentication : False
OnPremise : False
ActiveDirectoryServiceEndpointResourceId : TestADApplicationId
AdTenant :
GalleryUrl : TestGalleryEndpoint
ManagementPortalUrl :
ServiceManagementUrl :
PublishSettingsFileUrl :
ResourceManagerUrl : TestRMEndpoint
SqlDatabaseDnsSuffix :
StorageEndpointSuffix :
ActiveDirectoryAuthority : NewTestADEndpoint
GraphUrl : NewTestGraphEndpoint
GraphEndpointResourceId :
TrafficManagerDnsSuffix :
AzureKeyVaultDnsSuffix :
DataLakeEndpointResourceId :
AzureDataLakeStoreFileSystemEndpointSuffix :
AzureDataLakeAnalyticsCatalogAndJobEndpointSuffix :
AzureKeyVaultServiceEndpointResourceId :
AzureOperationalInsightsEndpointResourceId :
AzureOperationalInsightsEndpoint :
AzureAnalysisServicesEndpointSuffix :
VersionProfiles : {}
ExtendedProperties : {}
BatchEndpointResourceId :
In this example we are creating a new Azure environment with sample endpoints using Add-AzureRmEnvironment, and then we are changing the value of the ActiveDirectoryEndpoint and GraphEndpoint attributes of the created environment using the cmdlet Set-AzureRmEnvironment.
```
## PARAMETERS
### -ActiveDirectoryEndpoint
Specifies the base authority for Azure Active Directory authentication.
```yaml
Type: System.String
Parameter Sets: Name
Aliases: AdEndpointUrl, ActiveDirectory, ActiveDirectoryAuthority
Required: False
Position: 5
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -ActiveDirectoryServiceEndpointResourceId
Specifies the audience for tokens that authenticate requests to Azure Resource Manager or Service Management (RDFE) endpoints.
```yaml
Type: System.String
Parameter Sets: Name
Aliases:
Required: False
Position: 8
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -AdTenant
Specifies the default Active Directory tenant.
```yaml
Type: System.String
Parameter Sets: Name
Aliases:
Required: False
Position: 17
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -ARMEndpoint
The Azure Resource Manager endpoint
```yaml
Type: System.String
Parameter Sets: ARMEndpoint
Aliases: ArmUrl
Required: True
Position: 1
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -AzureAnalysisServicesEndpointSuffix
Dns Suffix of Azure Analysis Services service endpoints
```yaml
Type: System.String
Parameter Sets: Name
Aliases:
Required: False
Position: 15
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -AzureDataLakeAnalyticsCatalogAndJobEndpointSuffix
Dns Suffix of Azure Data Lake Analytics job and catalog services
```yaml
Type: System.String
Parameter Sets: Name
Aliases:
Required: False
Position: 15
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -AzureDataLakeStoreFileSystemEndpointSuffix
Dns Suffix of Azure Data Lake Store FileSystem.
Example: azuredatalake.net
```yaml
Type: System.String
Parameter Sets: Name
Aliases:
Required: False
Position: 14
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -AzureKeyVaultDnsSuffix
Specifies the domain name suffix for Key Vault services.
```yaml
Type: System.String
Parameter Sets: (All)
Aliases:
Required: False
Position: 10
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -AzureKeyVaultServiceEndpointResourceId
Specifies the audience for access tokens that authorize requests for Key Vault services.
```yaml
Type: System.String
Parameter Sets: (All)
Aliases:
Required: False
Position: 11
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -AzureOperationalInsightsEndpoint
Specifies the endpoint for the Operational Insights query access.
```yaml
Type: System.String
Parameter Sets: (All)
Aliases:
Required: False
Position: 22
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -AzureOperationalInsightsEndpointResourceId
Specifies the audience for access tokens that authorize requests for Operational Insights services.
```yaml
Type: System.String
Parameter Sets: (All)
Aliases:
Required: False
Position: 21
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -BatchEndpointResourceId
The resource identifier of the Azure Batch service that is the recipient of the requested token
```yaml
Type: System.String
Parameter Sets: (All)
Aliases: BatchResourceId, BatchAudience
Required: False
Position: 20
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -DataLakeAudience
The audience for tokens authenticating with the AD Data Lake services Endpoint.
```yaml
Type: System.String
Parameter Sets: (All)
Aliases: DataLakeEndpointResourceId, DataLakeResourceId
Required: False
Position: 19
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -DefaultProfile
The credentials, tenant, and subscription used for communication with Azure.
```yaml
Type: Microsoft.Azure.Commands.Common.Authentication.Abstractions.IAzureContextContainer
Parameter Sets: (All)
Aliases: AzureRmContext, AzureCredential
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -EnableAdfsAuthentication
Indicates that Active Directory Federation Services (ADFS) on-premise authentication is allowed.
```yaml
Type: System.Management.Automation.SwitchParameter
Parameter Sets: Name
Aliases: OnPremise
Required: False
Position: 16
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -GalleryEndpoint
Specifies the endpoint for the Azure Resource Manager gallery of deployment templates.
```yaml
Type: System.String
Parameter Sets: Name
Aliases: Gallery, GalleryUrl
Required: False
Position: 7
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -GraphAudience
The audience for tokens authenticating with the AD Graph Endpoint.
```yaml
Type: System.String
Parameter Sets: Name
Aliases: GraphEndpointResourceId, GraphResourceId
Required: False
Position: 18
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -GraphEndpoint
Specifies the URL for Graph (Active Directory metadata) requests.
```yaml
Type: System.String
Parameter Sets: Name
Aliases: Graph, GraphUrl
Required: False
Position: 9
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -ManagementPortalUrl
Specifies the URL for the Management Portal.
```yaml
Type: System.String
Parameter Sets: Name
Aliases:
Required: False
Position: 3
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -Name
Specifies the name of the environment to add.
```yaml
Type: System.String
Parameter Sets: (All)
Aliases:
Required: True
Position: 0
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -PublishSettingsFileUrl
Specifies the URL from which .publishsettings files can be downloaded.
```yaml
Type: System.String
Parameter Sets: Name
Aliases:
Required: False
Position: 1
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -ResourceManagerEndpoint
Specifies the URL for Azure Resource Manager requests.
```yaml
Type: System.String
Parameter Sets: Name
Aliases: ResourceManager, ResourceManagerUrl
Required: False
Position: 6
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -Scope
Determines the scope of context changes, for example, whether changes apply only to the current process, or to all sessions started by this user.
```yaml
Type: Microsoft.Azure.Commands.Profile.Common.ContextModificationScope
Parameter Sets: (All)
Aliases:
Accepted values: Process, CurrentUser
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -ServiceEndpoint
Specifies the endpoint for Service Management (RDFE) requests.
```yaml
Type: System.String
Parameter Sets: Name
Aliases: ServiceManagement, ServiceManagementUrl
Required: False
Position: 2
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -SqlDatabaseDnsSuffix
Specifies the domain-name suffix for Azure SQL Database servers.
```yaml
Type: System.String
Parameter Sets: Name
Aliases:
Required: False
Position: 13
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -StorageEndpoint
Specifies the endpoint for storage (blob, table, queue, and file) access.
```yaml
Type: System.String
Parameter Sets: (All)
Aliases: StorageEndpointSuffix
Required: False
Position: 4
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -TrafficManagerDnsSuffix
Specifies the domain-name suffix for Azure Traffic Manager services.
```yaml
Type: System.String
Parameter Sets: Name
Aliases:
Required: False
Position: 12
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -Confirm
Prompts you for confirmation before running the cmdlet.
```yaml
Type: System.Management.Automation.SwitchParameter
Parameter Sets: (All)
Aliases: cf
Required: False
Position: Named
Default value: False
Accept pipeline input: False
Accept wildcard characters: False
```
### -WhatIf
Shows what would happen if the cmdlet runs. The cmdlet is not run.
```yaml
Type: System.Management.Automation.SwitchParameter
Parameter Sets: (All)
Aliases: wi
Required: False
Position: Named
Default value: False
Accept pipeline input: False
Accept wildcard characters: False
```
### CommonParameters
This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see about_CommonParameters (http://go.microsoft.com/fwlink/?LinkID=113216).
## INPUTS
### System.String
### System.Management.Automation.SwitchParameter
## OUTPUTS
### Microsoft.Azure.Commands.Profile.Models.PSAzureEnvironment
## NOTES
## RELATED LINKS
[Get-AzureRMEnvironment](./Get-AzureRMEnvironment.md)
[Remove-AzureRMEnvironment](./Remove-AzureRMEnvironment.md)
[Set-AzureRMEnvironment](./Set-AzureRMEnvironment.md)
ff0b11f5b1be805b8828d6790d1d3c81632b900d | 277 | md | Markdown | README.md | JumpJets/Sankakucomplex-Auto-load-original-image | f29438fd3734c4463e677161c6e2aadbf7ecacd2 | [
"MIT"
] | null | null | null | README.md | JumpJets/Sankakucomplex-Auto-load-original-image | f29438fd3734c4463e677161c6e2aadbf7ecacd2 | [
"MIT"
] | null | null | null | README.md | JumpJets/Sankakucomplex-Auto-load-original-image | f29438fd3734c4463e677161c6e2aadbf7ecacd2 | [
"MIT"
] | null | null | null | # [Sankakucomplex] Auto-load original image
A userscript that replaces sample images on Sankaku Complex with the original full-resolution images.
You need a browser extension such as **Tampermonkey** to use it.
You can also use the bonus styles from the CSS file; just import them into the **Stylish/Stylus** extension.
ff0b2e5105a3b31537e0d0ef2026e271bba4f9db | 351 | md | Markdown | src/23-web/23.5-clients/23.5.2-apis/23.5.2.4-paginated.md | zzy/crate-guide | 9246b6eaa9334dcf2abf0d6b8bb2488542cc23d0 | [
"Apache-2.0",
"MIT"
] | 47 | 2020-12-18T08:37:12.000Z | 2022-03-21T03:30:44.000Z | src/23-web/23.5-clients/23.5.2-apis/23.5.2.4-paginated.md | zzy/rust-crate-guide | 9246b6eaa9334dcf2abf0d6b8bb2488542cc23d0 | [
"Apache-2.0",
"MIT"
] | null | null | null | src/23-web/23.5-clients/23.5.2-apis/23.5.2.4-paginated.md | zzy/rust-crate-guide | 9246b6eaa9334dcf2abf0d6b8bb2488542cc23d0 | [
"Apache-2.0",
"MIT"
] | 8 | 2021-02-15T12:26:32.000Z | 2022-02-15T08:31:49.000Z | # 23.5.2.4. 使用 RESTful API 分页
[![reqwest-badge]][reqwest] [![serde-badge]][serde] [![cat-net-badge]][cat-net] [![cat-encoding-badge]][cat-encoding]
A paginated web API can be conveniently wrapped in a Rust iterator that loads the next page of results from the remote server whenever the end of the current page is reached.
```rust,edition2018,no_run
{{ #include ../../../../examples/web/clients/apis/examples/paginated.rs }}
```
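Since the included example file isn't rendered here, the following is a minimal, self-contained sketch of the same idea (an illustrative stand-in, not the book's actual `paginated.rs`). The hypothetical `fetch_page` function simulates the remote server with in-memory pages; a real client would issue an HTTP request per page with `reqwest` and deserialize each response with `serde`. The iterator transparently loads the next page whenever the current one is exhausted.

```rust
// An iterator that lazily pulls items page by page.
struct Paginated {
    page: usize,
    buffer: std::vec::IntoIter<u32>, // items of the current page
    done: bool,                      // set once the server returns an empty page
}

// Hypothetical stand-in for a remote call: 2 items per page, 3 pages total.
// A real implementation would perform an HTTP GET here.
fn fetch_page(page: usize) -> Vec<u32> {
    match page {
        0 => vec![1, 2],
        1 => vec![3, 4],
        2 => vec![5],
        _ => vec![],
    }
}

impl Iterator for Paginated {
    type Item = u32;
    fn next(&mut self) -> Option<u32> {
        loop {
            // Serve from the current page while it has items left.
            if let Some(item) = self.buffer.next() {
                return Some(item);
            }
            if self.done {
                return None;
            }
            // Current page exhausted: load the next one.
            let next = fetch_page(self.page);
            self.page += 1;
            if next.is_empty() {
                self.done = true;
            }
            self.buffer = next.into_iter();
        }
    }
}

fn main() {
    let iter = Paginated { page: 0, buffer: Vec::new().into_iter(), done: false };
    let all: Vec<u32> = iter.collect();
    println!("{:?}", all); // all items across all pages, fetched on demand
}
```

The caller just sees an ordinary `Iterator`; the paging is an internal detail.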
{{#include ../../../links.md}}
---
title: Preview eDiscovery search results
f1.keywords:
- NOCSH
ms.author: markjjo
author: markjjo
manager: laurawi
audience: Admin
ms.topic: article
ms.service: O365-seccomp
ms.localizationpriority: high
ms.collection:
- Strat_O365_IP
- M365-security-compliance
- SPO_Content
search.appverid:
- MOE150
- MED150
- MET150
ms.custom:
- seo-marvel-apr2020
description: Preview a sample of the results returned by a content search or a Core eDiscovery search in the Microsoft 365 compliance center.
ms.openlocfilehash: af0811d0c442d6f064fd336d4261d1f7b2337dc8
ms.sourcegitcommit: d4b867e37bf741528ded7fb289e4f6847228d2c5
ms.translationtype: HT
ms.contentlocale: es-ES
ms.lasthandoff: 10/06/2021
ms.locfileid: "60189350"
---
# <a name="preview-ediscovery-search-results"></a>Vista previa de los resultados de búsqueda de eDiscovery
Después de ejecutar una búsqueda de contenido o una búsqueda asociada a un caso de eDiscovery Core, puede obtener una vista previa de un ejemplo de los resultados devueltos por la búsqueda. La vista previa de elementos devueltos por la consulta de búsqueda puede ayudarle a determinar si la búsqueda devuelve los resultados que espera o si necesita cambiar la consulta de búsqueda y volver a ejecutar la búsqueda.
Para obtener una vista previa de un ejemplo de resultados devuelto por una búsqueda:
1. En el Centro de cumplimiento de Microsoft 365, vaya a la página Búsqueda de contenido o a un caso de eDiscovery Core.
2. Seleccione la búsqueda para mostrar la página de controles flotantes.
3. En la parte inferior de la página de controles flotantes, haga clic en **Revisar muestra**.

A page containing a sample of the search results is displayed.
4. Select an item to view its contents in the reading pane.

In the previous screenshot, notice that the search keywords are highlighted when you preview items.
## <a name="how-the-search-result-samples-are-selected"></a>How the search result samples are selected
A maximum of 1,000 randomly selected items are available for preview. In addition to being randomly selected, items available for preview must also meet the following criteria:
- A maximum of 100 items from a single content location (a mailbox or a site) can be previewed. This means that fewer than 1,000 items might be available for preview. For example, if you search four mailboxes and the search returns an estimated 1,500 items, only 400 are available for preview, because only 100 items from each mailbox can be previewed.
- For mailbox items, only email messages can be previewed. Items such as tasks, calendar items, and contacts can't be previewed.
- For site items, only documents are available for preview. Items such as folders, lists, or list attachments can't be previewed.
## <a name="file-types-supported-when-previewing-search-results"></a>File types supported when previewing search results
You can preview supported file types in the preview pane. If a file type isn't supported, you have to download a copy to your local computer (by clicking **Download original item**). For .aspx web pages, the URL of the page is included, although you might not have permission to access it. Unindexed items can't be previewed.
The following file types are supported and can be previewed in the search results pane.
- .txt, .html, .mhtml
- .eml
- .doc, .docx, .docm
- .pptm, .pptx
- .pdf
The following file container types are also supported. You can view the list of files in the container in the preview pane.
- .zip
- .gzip | 52.158537 | 433 | 0.797755 | spa_Latn | 0.995155 |
ff0ba54601448fbc5688eb91392c317fb573c809 | 1,891 | md | Markdown | docs/dev.md | trollepierre/cozy-ui | 23aa5795d4971bb3a4b1e105635306fbeeac09d8 | [
"MIT"
] | 42 | 2015-10-01T17:59:43.000Z | 2022-02-05T21:53:43.000Z | docs/dev.md | trollepierre/cozy-ui | 23aa5795d4971bb3a4b1e105635306fbeeac09d8 | [
"MIT"
] | 1,774 | 2015-11-16T19:03:37.000Z | 2022-03-31T10:22:35.000Z | docs/dev.md | trollepierre/cozy-ui | 23aa5795d4971bb3a4b1e105635306fbeeac09d8 | [
"MIT"
] | 44 | 2015-10-01T19:45:58.000Z | 2021-12-14T13:48:48.000Z | # Working with a linked cozy-ui
Since `cozy-ui` is transpiled, when linking you must first
`yarn release`
Then you can have the transpiler watch the files :
`yarn transpile --watch`
If you change the icons, or the palette, you must run `yarn release` again.
On the app side, instead of using `yarn link` which can cause problems
with imports, you can use [`rlink`](https://gist.github.com/ptbrowne/add609bdcf4396d32072acc4674fff23).
# Transform markdown examples
:information_source: [`remark`][remark] is a processor for markdown: it parses markdown source into an AST,
transforms the tree and stringifies it afterwards. It can be used along with
jscodeshift to automatically migrate and transform examples.
When an API has changed and we need to update example, it can be useful to do it via a codemod. Here
is an example, running a jscodeshift codemod through the remark-jscodeshift plugin:
```
remark -o --use remark-jscodeshift=allowNoLang:true,transform:\"codemods/transform-dialog-imports.js\" .
```
[remark]: https://github.com/remarkjs/remark
# Screenshot testing locally
* You can screenshot old components into pristine_screenshots directory
* Screenshot the new one inside screenshots
* Run pixelmatch-server, which shows screenshots side by side like on Argos (you need the `pixelmatch` binary to be available)
```bash
# Screenshot all the components
yarn build:doc:react
yarn screenshots
cp -r screenshots pristine_screenshots
# yarn watch:doc:react
# Make changes to BarButton...
# Screenshot BarButton
export COMPONENT=BarButton
yarn screenshots --component $COMPONENT
# Run pixel diff on a single component
pixelmatch pristine_screenshots/$COMPONENT.png screenshots/$COMPONENT.png diff/$COMPONENT.png 0.1
# Open pixelmatch server to check diffs
$ yarn screenshots:server
# Enable hot reload
$ livereload screenshots,pristine_screenshots,diffs -w 1000
```
| 32.050847 | 126 | 0.786885 | eng_Latn | 0.981973 |
ff0be53aeb2410022b1b44587c19c7f036781858 | 583 | md | Markdown | README.md | viccowang/zhx-vue-plateform | aad3dc6bb1d17b760797334cee9cc42091457526 | [
"MIT"
] | 4 | 2018-10-28T04:47:56.000Z | 2021-12-10T07:03:44.000Z | README.md | viccowang/zhx-vue-platform | aad3dc6bb1d17b760797334cee9cc42091457526 | [
"MIT"
] | null | null | null | README.md | viccowang/zhx-vue-platform | aad3dc6bb1d17b760797334cee9cc42091457526 | [
"MIT"
] | null | null | null | # 中航讯 前端基础框架 zhx-vue-platform
#### Author:Vicco Wang / ver:2.0.4
---
## Domain Driven Model
> 该版本重构了大部分核心结构代码。实质更加模型化,解耦与适应大型项目团队合作模式。
> 该版本与1.x版本在结构上不兼容
## 更新记录
### Ver 2.0.4 2018-12-21
- FIXES shortcut menu 不同用户登录展示的问题
- UPDATE shortcut menu / tag tabs 右键菜单新增快捷项目
- NEW utils 新增了一些方法
### Ver 2.0.3 2018-12-18
- FIXES 三级包括三级以上的子路由组件无法被正常缓存的Bug
- FIXES NextPage组件在多个页面组件同时存在时造成数据异常的Bug
### Ver 2.0.2 2018-09-29
- NEW tagsTab右键菜单
- NEW tagsTab / shortcuts 菜单拖拽排序功能
### Ver 2.0.1 2018-07-28
- FIXES API接口生成器问题
- NEW 自定义验证
- NEW 业务Tree生成器
### Ver 2.0.0
- NEW 重构了项目结构与核心代码
| 18.806452 | 44 | 0.711835 | yue_Hant | 0.915771 |
ff0c442e8cecdc6a0b455dc75a8a20f5a1f6987f | 2,398 | md | Markdown | docs/deployment/how-to-enable-autostart-for-cd-installations.md | MicrosoftDocs/visualstudio-docs.it- | 3e6906339549f32b01960e19cd3400222dcc7b94 | [
"CC-BY-4.0",
"MIT"
] | 3 | 2018-03-29T21:12:32.000Z | 2022-03-26T11:56:08.000Z | docs/deployment/how-to-enable-autostart-for-cd-installations.md | MicrosoftDocs/visualstudio-docs.it- | 3e6906339549f32b01960e19cd3400222dcc7b94 | [
"CC-BY-4.0",
"MIT"
] | 12 | 2018-03-07T15:43:33.000Z | 2021-03-29T15:28:34.000Z | docs/deployment/how-to-enable-autostart-for-cd-installations.md | MicrosoftDocs/visualstudio-docs.it- | 3e6906339549f32b01960e19cd3400222dcc7b94 | [
"CC-BY-4.0",
"MIT"
] | 12 | 2017-11-26T08:17:38.000Z | 2021-10-09T11:24:07.000Z | ---
title: Abilitare l'avvio automatico per le installazioni da CD | Microsoft Docs
description: Informazioni su come abilitare l'avvio automatico quando si distribuisce un'ClickOnce per mezzo di supporti rimovibili, ad esempio CD-ROM o DVD-ROM.
ms.custom: SEO-VS-2020
ms.date: 11/04/2016
ms.topic: how-to
dev_langs:
- VB
- CSharp
- C++
helpviewer_keywords:
- ClickOnce deployment, AutoStart
- ClickOnce deployment, installation on CD or DVD
- deploying applications [ClickOnce], installation on CD or DVD
ms.assetid: caaec619-900c-4790-90e3-8c91f5347635
author: mikejo5000
ms.author: mikejo
manager: jmartens
ms.technology: vs-ide-deployment
ms.workload:
- multiple
ms.openlocfilehash: f16def763bebca4cc91b902d1f9202c6a6fdaede
ms.sourcegitcommit: 68897da7d74c31ae1ebf5d47c7b5ddc9b108265b
ms.translationtype: MT
ms.contentlocale: it-IT
ms.lasthandoff: 08/13/2021
ms.locfileid: "122127922"
---
# <a name="how-to-enable-autostart-for-cd-installations"></a>Procedura: Attivare l'avvio automatico per le installazioni da CD
Quando si distribuisce un'applicazione tramite supporti rimovibili, ad esempio CD-ROM o DVD-ROM, è possibile abilitare in modo che l'applicazione viene avviata automaticamente quando viene inserito il [!INCLUDE[ndptecclick](../deployment/includes/ndptecclick_md.md)] `AutoStart` [!INCLUDE[ndptecclick](../deployment/includes/ndptecclick_md.md)] supporto.
`AutoStart`può essere abilitato nella **pagina Pubblica** di progettazione **Project .**
### <a name="to-enable-autostart"></a>Per abilitare l'avvio automatico
1. Con un progetto selezionato in **Esplora soluzioni**, nel menu **Project** fare clic su **Proprietà**.
2. Fare clic sulla scheda **Pubblica**.
3. Fare clic sul pulsante **Opzioni** .
Verrà **visualizzata la finestra di** dialogo Opzioni di pubblicazione .
4. Fare clic su **Distribuzione**.
5. Selezionare la casella **di controllo Per le installazioni da CD, avvia automaticamente il programma di installazione all'inserimento di CD.**
Un file *Autorun.inf* verrà copiato nel percorso di pubblicazione quando viene pubblicata l'applicazione.
## <a name="see-also"></a>Vedi anche
- [Pubblicare applicazioni ClickOnce](../deployment/publishing-clickonce-applications.md)
- [Procedura: Pubblicare un'ClickOnce di pubblicazione usando la Pubblicazione guidata](../deployment/how-to-publish-a-clickonce-application-using-the-publish-wizard.md) | 46.115385 | 354 | 0.785655 | ita_Latn | 0.956576 |
ff0c9273423c75aba17a00b9fff919dd5ab37bce | 279 | md | Markdown | iambismark.net/content/post/2010/06/1275353878.md | bismark/iambismark.net | 1ef89663cfcf4682fbfd60781bb143a7fd276312 | [
"MIT"
] | null | null | null | iambismark.net/content/post/2010/06/1275353878.md | bismark/iambismark.net | 1ef89663cfcf4682fbfd60781bb143a7fd276312 | [
"MIT"
] | null | null | null | iambismark.net/content/post/2010/06/1275353878.md | bismark/iambismark.net | 1ef89663cfcf4682fbfd60781bb143a7fd276312 | [
"MIT"
] | null | null | null | ---
alturls:
- https://twitter.com/bismark/status/15143917757
- https://www.facebook.com/17803937/posts/749767902129
archive:
- 2010-06
date: '2010-06-01T00:57:58+00:00'
slug: '1275353878'
---
the stripe of dead touchscreen has made my phone completely worthless for games :(.
ff0cc84df9d88f6d885cc88ca1b33b75ea403e1c | 8,771 | md | Markdown | content/docs/code-splitting.md | kyarik/reactjs.org | 6e79d0882a13dade0f85010ca5303f17e1574c6f | [
"CC-BY-4.0"
] | 8 | 2020-05-25T15:59:42.000Z | 2020-05-26T21:44:34.000Z | content/docs/code-splitting.md | kyarik/reactjs.org | 6e79d0882a13dade0f85010ca5303f17e1574c6f | [
"CC-BY-4.0"
] | 17 | 2020-01-10T14:31:17.000Z | 2020-02-06T19:21:26.000Z | content/docs/code-splitting.md | kyarik/reactjs.org | 6e79d0882a13dade0f85010ca5303f17e1574c6f | [
"CC-BY-4.0"
] | 9 | 2019-07-17T07:54:49.000Z | 2021-04-02T03:12:48.000Z | ---
id: code-splitting
title: Code-Splitting
permalink: docs/code-splitting.html
---
## Bundling {#bundling}
Most React apps will have their files "bundled" using tools like
[Webpack](https://webpack.js.org/) or [Browserify](http://browserify.org/).
Bundling is the process of following imported files and merging them into a
single file: a "bundle". This bundle can then be included on a webpage to load
an entire app at once.
#### Example {#example}
**App:**
```js
// app.js
import { add } from './math.js';
console.log(add(16, 26)); // 42
```
```js
// math.js
export function add(a, b) {
return a + b;
}
```
**Bundle:**
```js
function add(a, b) {
return a + b;
}
console.log(add(16, 26)); // 42
```
> Note:
>
> Your bundles will end up looking a lot different than this.
If you're using [Create React App](https://github.com/facebookincubator/create-react-app), [Next.js](https://github.com/zeit/next.js/), [Gatsby](https://www.gatsbyjs.org/), or a similar tool, you will have a Webpack setup out of the box to bundle your
app.
If you aren't, you'll need to set up bundling yourself. For example, see the
[Installation](https://webpack.js.org/guides/installation/) and
[Getting Started](https://webpack.js.org/guides/getting-started/) guides on the
Webpack docs.
## Code Splitting {#code-splitting}
Bundling is great, but as your app grows, your bundle will grow too, especially
if you are including large third-party libraries. You need to keep an eye on
the code you are including in your bundle so that you don't accidentally make
it so large that your app takes a long time to load.
To avoid winding up with a large bundle, it's good to get ahead of the problem
and start "splitting" your bundle.
[Code-Splitting](https://webpack.js.org/guides/code-splitting/) is a feature
supported by bundlers like Webpack and Browserify (via
[factor-bundle](https://github.com/browserify/factor-bundle)) which can create
multiple bundles that can be dynamically loaded at runtime.
Code-splitting your app can help you "lazy-load" just the things that are
currently needed by the user, which can dramatically improve the performance of
your app. While you haven't reduced the overall amount of code in your app,
you've avoided loading code that the user may never need, and reduced the amount
of code needed during the initial load.
## `import()` {#import}
The best way to introduce code-splitting into your app is through the dynamic
`import()` syntax.
**Before:**
```js
import { add } from './math';
console.log(add(16, 26));
```
**After:**
```js
import("./math").then(math => {
console.log(math.add(16, 26));
});
```
> Note:
>
> The dynamic `import()` syntax is an ECMAScript (JavaScript)
> [proposal](https://github.com/tc39/proposal-dynamic-import) not currently
> part of the language standard. It is expected to be accepted in the
> near future.
When Webpack comes across this syntax, it automatically starts code-splitting
your app. If you're using Create React App, this is already configured for you
and you can [start using it](https://facebook.github.io/create-react-app/docs/code-splitting) immediately. It's also supported
out of the box in [Next.js](https://github.com/zeit/next.js/#dynamic-import).
If you're setting up Webpack yourself, you'll probably want to read Webpack's
[guide on code splitting](https://webpack.js.org/guides/code-splitting/). Your Webpack config should look vaguely [like this](https://gist.github.com/gaearon/ca6e803f5c604d37468b0091d9959269).
When using [Babel](https://babeljs.io/), you'll need to make sure that Babel can
parse the dynamic import syntax but is not transforming it. For that you will need [babel-plugin-syntax-dynamic-import](https://yarnpkg.com/en/package/babel-plugin-syntax-dynamic-import).
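With that plugin installed, your Babel configuration only needs to list it. A minimal `.babelrc` sketch, assuming the Babel 6-style package linked above (with Babel 7 the equivalent plugin is `@babel/plugin-syntax-dynamic-import`); a real config will usually include presets as well:

```json
{
  "plugins": ["syntax-dynamic-import"]
}
```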
## `React.lazy` {#reactlazy}
> Note:
>
> `React.lazy` and Suspense are not yet available for server-side rendering. If you want to do code-splitting in a server rendered app, we recommend [Loadable Components](https://github.com/smooth-code/loadable-components). It has a nice [guide for bundle splitting with server-side rendering](https://github.com/smooth-code/loadable-components/blob/master/packages/server/README.md).
The `React.lazy` function lets you render a dynamic import as a regular component.
**Before:**
```js
import OtherComponent from './OtherComponent';
function MyComponent() {
return (
<div>
<OtherComponent />
</div>
);
}
```
**After:**
```js
const OtherComponent = React.lazy(() => import('./OtherComponent'));
function MyComponent() {
return (
<div>
<OtherComponent />
</div>
);
}
```
This will automatically load the bundle containing the `OtherComponent` when this component is first rendered.
`React.lazy` takes a function that must call a dynamic `import()`. This must return a `Promise` which resolves to a module with a `default` export containing a React component.
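That contract is easy to see without React at all: any function returning a Promise for an object with a `default` export satisfies it. A plain-JavaScript sketch (the names here are illustrative, not part of React's API):

```javascript
// Simulates what `() => import('./OtherComponent')` hands to React.lazy:
// a function returning a Promise that resolves to a module namespace
// object whose `default` property is the component.
function loadOtherComponent() {
  const fakeComponent = function OtherComponent() {
    return 'rendered OtherComponent';
  };
  // A real dynamic import() resolves to this same shape.
  return Promise.resolve({ default: fakeComponent });
}

loadOtherComponent().then(function (mod) {
  console.log(typeof mod.default); // "function"
  console.log(mod.default()); // "rendered OtherComponent"
});
```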
### Suspense {#suspense}
If the module containing the `OtherComponent` is not yet loaded by the time `MyComponent` renders, we must show some fallback content while we're waiting for it to load, such as a loading indicator. This is done using the `Suspense` component.
```js
import React, { Suspense } from 'react';

const OtherComponent = React.lazy(() => import('./OtherComponent'));
function MyComponent() {
return (
<div>
<Suspense fallback={<div>Loading...</div>}>
<OtherComponent />
</Suspense>
</div>
);
}
```
The `fallback` prop accepts any React elements that you want to render while waiting for the component to load. You can place the `Suspense` component anywhere above the lazy component. You can even wrap multiple lazy components with a single `Suspense` component.
```js
import React, { Suspense } from 'react';

const OtherComponent = React.lazy(() => import('./OtherComponent'));
const AnotherComponent = React.lazy(() => import('./AnotherComponent'));
function MyComponent() {
return (
<div>
<Suspense fallback={<div>Loading...</div>}>
<section>
<OtherComponent />
<AnotherComponent />
</section>
</Suspense>
</div>
);
}
```
### Error boundaries {#error-boundaries}
If the other module fails to load (for example, due to network failure), it will trigger an error. You can handle these errors to show a nice user experience and manage recovery with [Error Boundaries](/docs/error-boundaries.html). Once you've created your Error Boundary, you can use it anywhere above your lazy components to display an error state when there's a network error.
```js
import React, { Suspense } from 'react';
import MyErrorBoundary from './MyErrorBoundary';
const OtherComponent = React.lazy(() => import('./OtherComponent'));
const AnotherComponent = React.lazy(() => import('./AnotherComponent'));
const MyComponent = () => (
<div>
<MyErrorBoundary>
<Suspense fallback={<div>Loading...</div>}>
<section>
<OtherComponent />
<AnotherComponent />
</section>
</Suspense>
</MyErrorBoundary>
</div>
);
```
## Route-based code splitting {#route-based-code-splitting}
Deciding where in your app to introduce code splitting can be a bit tricky. You
want to make sure you choose places that will split bundles evenly, but won't
disrupt the user experience.
A good place to start is with routes. Most people on the web are used to
page transitions taking some amount of time to load. You also tend to be
re-rendering the entire page at once, so your users are unlikely to be
interacting with other elements on the page at the same time.
Here's an example of how to set up route-based code splitting in your app using
libraries like [React Router](https://reacttraining.com/react-router/) with `React.lazy`.
```js
import { BrowserRouter as Router, Route, Switch } from 'react-router-dom';
import React, { Suspense, lazy } from 'react';
const Home = lazy(() => import('./routes/Home'));
const About = lazy(() => import('./routes/About'));
const App = () => (
<Router>
<Suspense fallback={<div>Loading...</div>}>
<Switch>
<Route exact path="/" component={Home}/>
<Route path="/about" component={About}/>
</Switch>
</Suspense>
</Router>
);
```
## Named Exports {#named-exports}
`React.lazy` currently only supports default exports. If the module you want to import uses named exports, you can create an intermediate module that reexports it as the default. This ensures that tree shaking keeps working and that you don't pull in unused components.
```js
// ManyComponents.js
export const MyComponent = /* ... */;
export const MyUnusedComponent = /* ... */;
```
```js
// MyComponent.js
export { MyComponent as default } from "./ManyComponents.js";
```
```js
// MyApp.js
import React, { lazy } from 'react';
const MyComponent = lazy(() => import("./MyComponent.js"));
```
ff0cd7cb8bc6f9ebf5e54a49a8f8ce47ab532d76 | 29,660 | md | Markdown | articles/azure-resource-manager/management/tag-resources.md | MicrosoftDocs/azure-docs.hu-hu | 5fb082c5dae057fd040c7e09881e6c407e535fe2 | [
"CC-BY-4.0",
"MIT"
] | 7 | 2017-08-28T07:44:33.000Z | 2021-04-20T21:12:50.000Z | articles/azure-resource-manager/management/tag-resources.md | MicrosoftDocs/azure-docs.hu-hu | 5fb082c5dae057fd040c7e09881e6c407e535fe2 | [
"CC-BY-4.0",
"MIT"
] | 412 | 2018-07-25T09:31:03.000Z | 2021-03-17T13:17:45.000Z | articles/azure-resource-manager/management/tag-resources.md | MicrosoftDocs/azure-docs.hu-hu | 5fb082c5dae057fd040c7e09881e6c407e535fe2 | [
"CC-BY-4.0",
"MIT"
] | 13 | 2017-09-05T09:10:35.000Z | 2021-11-05T11:42:31.000Z | ---
title: Erőforrások, erőforráscsoportok és előfizetések címkézése a logikai szervezet számára
description: Bemutatja, hogyan alkalmazhat címkéket az Azure-erőforrások számlázáshoz és felügyelethez való rendszerezéséhez.
ms.topic: conceptual
ms.date: 01/04/2021
ms.custom: devx-track-azurecli
ms.openlocfilehash: 1e755a378fd71ea2763cc3e43477876fa3e8c5d5
ms.sourcegitcommit: 32e0fedb80b5a5ed0d2336cea18c3ec3b5015ca1
ms.translationtype: MT
ms.contentlocale: hu-HU
ms.lasthandoff: 03/30/2021
ms.locfileid: "105934189"
---
# <a name="use-tags-to-organize-your-azure-resources-and-management-hierarchy"></a>Címkék használata az Azure-erőforrások és a felügyeleti hierarchia rendszerezéséhez
Címkéket alkalmazhat az Azure-erőforrások, az erőforráscsoportok és az előfizetések számára, hogy logikailag szervezze őket a besorolásba. Minden címke egy nevet és egy érték párokat tartalmaz. Alkalmazhatja például a „Környezet” nevet és az „Éles” értéket az összes éles üzemben használt erőforrásra.
A címkézési stratégia megvalósításával kapcsolatos javaslatokért lásd: [erőforrás-elnevezési és címkézési döntési útmutató](/azure/cloud-adoption-framework/decision-guides/resource-tagging/?toc=/azure/azure-resource-manager/management/toc.json).
> [!IMPORTANT]
> A címkék nevei a kis-és nagybetűk megkülönböztetését jelentik a műveletekhez. A címkével ellátott címkét a rendszer a beborítástól függetlenül frissíti vagy lekéri. Előfordulhat azonban, hogy az erőforrás-szolgáltató megtartja a címke nevéhez megadott burkolatot. Látni fogja, hogy a ház a Cost-jelentésekben szerepel.
>
> A címke értékei megkülönböztetik a kis-és nagybetűket.
[!INCLUDE [Handle personal data](../../../includes/gdpr-intro-sentence.md)]
## <a name="required-access"></a>Szükséges hozzáférés
Két módon kérheti le a szükséges hozzáférést az erőforrások címkézéséhez.
- Írási jogosultsággal rendelkezhet a **Microsoft. Resources/Tags** erőforrástípus. Ez a hozzáférés lehetővé teszi, hogy bármilyen erőforrást címkével lássa el, még akkor is, ha nem rendelkezik hozzáféréssel az erőforráshoz. A [címke közreműködői](../../role-based-access-control/built-in-roles.md#tag-contributor) szerepköre engedélyezi ezt a hozzáférést. Jelenleg a tag közreműködői szerepkör nem alkalmazhat címkéket az erőforrásokra vagy az erőforráscsoportok a portálon keresztül. Címkéket is alkalmazhat az előfizetésekhez a portálon keresztül. Támogatja az összes címkézési műveletet a PowerShell és a REST API használatával.
- Az erőforráshoz írási hozzáféréssel is rendelkezhet. A [közreműködő](../../role-based-access-control/built-in-roles.md#contributor) szerepkör biztosítja a szükséges hozzáférést a címkék bármely entitásra való alkalmazásához. Ha csak egy erőforrás-típusra kíván címkéket alkalmazni, használja az adott erőforrás közreműködői szerepkörét. Ha például címkéket szeretne alkalmazni a virtuális gépekre, használja a [virtuális gép közreműködőjét](../../role-based-access-control/built-in-roles.md#virtual-machine-contributor).
## <a name="powershell"></a>PowerShell
### <a name="apply-tags"></a>Címkék alkalmazása
Azure PowerShell két parancsot kínál a címkék alkalmazásához: [New-AzTag](/powershell/module/az.resources/new-aztag) és [Update-AzTag](/powershell/module/az.resources/update-aztag). Az az. Resources Module 1.12.0 vagy újabb verzióval kell rendelkeznie. A verzióját a segítségével is megtekintheti `Get-Module Az.Resources` . Telepítheti a modult, vagy [telepítheti a Azure PowerShell](/powershell/azure/install-az-ps) 3.6.1-es vagy újabb verzióját.
A **New-AzTag** az erőforráson, az erőforráscsoporton vagy az előfizetésen lévő összes címkét lecseréli. A parancs hívásakor adja meg a címkével ellátni kívánt entitás erőforrás-AZONOSÍTÓját.
Az alábbi példa a címkék egy készletét alkalmazza a Storage-fiókra:
```azurepowershell-interactive
$tags = @{"Dept"="Finance"; "Status"="Normal"}
$resource = Get-AzResource -Name demoStorage -ResourceGroup demoGroup
New-AzTag -ResourceId $resource.id -Tag $tags
```
Ha a parancs befejeződik, figyelje meg, hogy az erőforrásnak két címkéje van.
```output
Properties :
Name Value
====== =======
Dept Finance
Status Normal
```
Ha újra futtatja a parancsot, de ezúttal más címkékkel, figyelje meg, hogy a korábbi címkék el lesznek távolítva.
```azurepowershell-interactive
$tags = @{"Team"="Compliance"; "Environment"="Production"}
New-AzTag -ResourceId $resource.id -Tag $tags
```
```output
Properties :
Name Value
=========== ==========
Environment Production
Team Compliance
```
Ha címkéket szeretne felvenni egy olyan erőforráshoz, amely már rendelkezik címkékkel, használja az **Update-AzTag**. Állítsa be a **-Operation** paramétert az **egyesítéshez**.
```azurepowershell-interactive
$tags = @{"Dept"="Finance"; "Status"="Normal"}
Update-AzTag -ResourceId $resource.id -Tag $tags -Operation Merge
```
Figyelje meg, hogy a két új címke hozzá lett adva a két meglévő címkéhez.
```output
Properties :
Name Value
=========== ==========
Status Normal
Dept Finance
Team Compliance
Environment Production
```
Minden címke nevének csak egy értéke lehet. Ha egy címkéhez új értéket ad meg, akkor a rendszer akkor is lecseréli a régi értéket, ha az egyesítési műveletet használja. Az alábbi példa megváltoztatja az állapot címkéjét a Normálról zöldre.
```azurepowershell-interactive
$tags = @{"Status"="Green"}
Update-AzTag -ResourceId $resource.id -Tag $tags -Operation Merge
```
```output
Properties :
Name Value
=========== ==========
Status Green
Dept Finance
Team Compliance
Environment Production
```
Ha a **-Operation** paramétert **lecseréli**, a meglévő címkéket a címkék új készlete váltja fel.
```azurepowershell-interactive
$tags = @{"Project"="ECommerce"; "CostCenter"="00123"; "Team"="Web"}
Update-AzTag -ResourceId $resource.id -Tag $tags -Operation Replace
```
Csak az új címkék maradnak meg az erőforráson.
```output
Properties :
Name Value
========== =========
CostCenter 00123
Team Web
Project ECommerce
```
Ugyanezek a parancsok az erőforráscsoportok vagy az előfizetések esetében is működnek. Adja meg a címkével ellátni kívánt erőforráscsoport vagy előfizetés azonosítóját.
Ha új címkéket szeretne hozzáadni egy erőforráscsoporthoz, használja a következőt:
```azurepowershell-interactive
$tags = @{"Dept"="Finance"; "Status"="Normal"}
$resourceGroup = Get-AzResourceGroup -Name demoGroup
New-AzTag -ResourceId $resourceGroup.ResourceId -tag $tags
```
Egy erőforráscsoport címkéinak frissítéséhez használja a következőt:
```azurepowershell-interactive
$tags = @{"CostCenter"="00123"; "Environment"="Production"}
$resourceGroup = Get-AzResourceGroup -Name demoGroup
Update-AzTag -ResourceId $resourceGroup.ResourceId -Tag $tags -Operation Merge
```
Új címkék előfizetéshez való hozzáadásához használja a következőt:
```azurepowershell-interactive
$tags = @{"CostCenter"="00123"; "Environment"="Dev"}
$subscription = (Get-AzSubscription -SubscriptionName "Example Subscription").Id
New-AzTag -ResourceId "/subscriptions/$subscription" -Tag $tags
```
Az előfizetéshez tartozó címkék frissítéséhez használja a következőt:
```azurepowershell-interactive
$tags = @{"Team"="Web Apps"}
$subscription = (Get-AzSubscription -SubscriptionName "Example Subscription").Id
Update-AzTag -ResourceId "/subscriptions/$subscription" -Tag $tags -Operation Merge
```
Egy erőforráscsoporthoz több azonos nevű erőforrás is tartozhat. Ebben az esetben beállíthatja az egyes erőforrásokat a következő parancsokkal:
```azurepowershell-interactive
$resource = Get-AzResource -ResourceName sqlDatabase1 -ResourceGroupName examplegroup
$resource | ForEach-Object { Update-AzTag -Tag @{ "Dept"="IT"; "Environment"="Test" } -ResourceId $_.ResourceId -Operation Merge }
```
### <a name="list-tags"></a>Címkék listázása
Ha egy erőforrás, erőforráscsoport vagy előfizetés címkéit szeretné lekérni, használja a [Get-AzTag](/powershell/module/az.resources/get-aztag) parancsot, és adja meg az entitás erőforrás-azonosítóját.
Az adott erőforráshoz tartozó címkék megtekintéséhez használja a következőt:
```azurepowershell-interactive
$resource = Get-AzResource -Name demoStorage -ResourceGroup demoGroup
Get-AzTag -ResourceId $resource.id
```
Egy erőforráscsoport címkéit a következő paranccsal tekintheti meg:
```azurepowershell-interactive
$resourceGroup = Get-AzResourceGroup -Name demoGroup
Get-AzTag -ResourceId $resourceGroup.ResourceId
```
Az előfizetés címkéit a következő paranccsal tekintheti meg:
```azurepowershell-interactive
$subscription = (Get-AzSubscription -SubscriptionName "Example Subscription").Id
Get-AzTag -ResourceId "/subscriptions/$subscription"
```
### <a name="list-by-tag"></a>Listázás címke szerint
A megadott címke névvel és értékkel rendelkező erőforrások lekéréséhez használja a következőt:
```azurepowershell-interactive
(Get-AzResource -Tag @{ "CostCenter"="00123"}).Name
```
Ha olyan erőforrásokat szeretne lekérni, amelyek címkével megadott névvel rendelkeznek, használja a következőt:
```azurepowershell-interactive
(Get-AzResource -TagName "Dept").Name
```
A megadott címke névvel és értékkel rendelkező erőforráscsoportok lekéréséhez használja a következőt:
```azurepowershell-interactive
(Get-AzResourceGroup -Tag @{ "CostCenter"="00123" }).ResourceGroupName
```
### <a name="remove-tags"></a>Címkék eltávolítása
Adott címkék eltávolításához használja az **Update-AzTag** és a set **-Operation** parancsot a **törléshez**. Adja meg a törölni kívánt címkéket.
```azurepowershell-interactive
$removeTags = @{"Project"="ECommerce"; "Team"="Web"}
Update-AzTag -ResourceId $resource.id -Tag $removeTags -Operation Delete
```
A megadott címkék el lesznek távolítva.
```output
Properties :
Name Value
========== =====
CostCenter 00123
```
Az összes címke eltávolításához használja a [Remove-AzTag](/powershell/module/az.resources/remove-aztag) parancsot.
```azurepowershell-interactive
$subscription = (Get-AzSubscription -SubscriptionName "Example Subscription").Id
Remove-AzTag -ResourceId "/subscriptions/$subscription"
```
## <a name="azure-cli"></a>Azure CLI
### <a name="apply-tags"></a>Címkék alkalmazása
Az Azure CLI két parancsot kínál a címkék alkalmazásához – [az tag Create](/cli/azure/tag#az_tag_create) és [az az tag Update](/cli/azure/tag#az_tag_update). Az Azure CLI-2.10.0 vagy újabb verziójára van szükség. A verzióját a segítségével is megtekintheti `az version` . A frissítéshez vagy a telepítéshez lásd: [Az Azure CLI telepítése](/cli/azure/install-azure-cli).
Az az **tag Create** lecseréli az összes címkét az erőforrás, az erőforráscsoport vagy az előfizetés elemre. A parancs hívásakor adja meg a címkével ellátni kívánt entitás erőforrás-AZONOSÍTÓját.
Az alábbi példa a címkék egy készletét alkalmazza a Storage-fiókra:
```azurecli-interactive
resource=$(az resource show -g demoGroup -n demoStorage --resource-type Microsoft.Storage/storageAccounts --query "id" --output tsv)
az tag create --resource-id $resource --tags Dept=Finance Status=Normal
```
Ha a parancs befejeződik, figyelje meg, hogy az erőforrásnak két címkéje van.
```output
"properties": {
"tags": {
"Dept": "Finance",
"Status": "Normal"
}
},
```
Ha újra futtatja a parancsot, de ezúttal más címkékkel, figyelje meg, hogy a korábbi címkék el lesznek távolítva.
```azurecli-interactive
az tag create --resource-id $resource --tags Team=Compliance Environment=Production
```
```output
"properties": {
"tags": {
"Environment": "Production",
"Team": "Compliance"
}
},
```
Ha címkéket szeretne felvenni egy olyan erőforráshoz, amely már rendelkezik címkékkel, használja a következőt: `az tag update` . Állítsa a paramétert a következőre: `--operation` `Merge` .
```azurecli-interactive
az tag update --resource-id $resource --operation Merge --tags Dept=Finance Status=Normal
```
Figyelje meg, hogy a két új címke hozzá lett adva a két meglévő címkéhez.
```output
"properties": {
"tags": {
"Dept": "Finance",
"Environment": "Production",
"Status": "Normal",
"Team": "Compliance"
}
},
```
Minden címke nevének csak egy értéke lehet. Ha egy címkéhez új értéket ad meg, akkor a rendszer akkor is lecseréli a régi értéket, ha az egyesítési műveletet használja. Az alábbi példa megváltoztatja az állapot címkéjét a Normálról zöldre.
```azurecli-interactive
az tag update --resource-id $resource --operation Merge --tags Status=Green
```
```output
"properties": {
"tags": {
"Dept": "Finance",
"Environment": "Production",
"Status": "Green",
"Team": "Compliance"
}
},
```
Ha a paramétert a értékre állítja `--operation` `Replace` , a meglévő címkéket a címkék új készlete váltja fel.
```azurecli-interactive
az tag update --resource-id $resource --operation Replace --tags Project=ECommerce CostCenter=00123 Team=Web
```
Csak az új címkék maradnak meg az erőforráson.
```output
"properties": {
"tags": {
"CostCenter": "00123",
"Project": "ECommerce",
"Team": "Web"
}
},
```
Ugyanezek a parancsok az erőforráscsoportok vagy az előfizetések esetében is működnek. Adja meg a címkével ellátni kívánt erőforráscsoport vagy előfizetés azonosítóját.
Ha új címkéket szeretne hozzáadni egy erőforráscsoporthoz, használja a következőt:
```azurecli-interactive
group=$(az group show -n demoGroup --query id --output tsv)
az tag create --resource-id $group --tags Dept=Finance Status=Normal
```
Egy erőforráscsoport címkéinak frissítéséhez használja a következőt:
```azurecli-interactive
az tag update --resource-id $group --operation Merge --tags CostCenter=00123 Environment=Production
```
Új címkék előfizetéshez való hozzáadásához használja a következőt:
```azurecli-interactive
sub=$(az account show --subscription "Demo Subscription" --query id --output tsv)
az tag create --resource-id /subscriptions/$sub --tags CostCenter=00123 Environment=Dev
```
Az előfizetéshez tartozó címkék frissítéséhez használja a következőt:
```azurecli-interactive
az tag update --resource-id /subscriptions/$sub --operation Merge --tags Team="Web Apps"
```
### <a name="list-tags"></a>Címkék listázása
Egy erőforrás, erőforráscsoport vagy előfizetés címkéjének lekéréséhez használja az az [tag List](/cli/azure/tag#az_tag_list) parancsot, és adja meg az entitás erőforrás-azonosítóját.
Az adott erőforráshoz tartozó címkék megtekintéséhez használja a következőt:
```azurecli-interactive
resource=$(az resource show -g demoGroup -n demoStorage --resource-type Microsoft.Storage/storageAccounts --query "id" --output tsv)
az tag list --resource-id $resource
```
Egy erőforráscsoport címkéit a következő paranccsal tekintheti meg:
```azurecli-interactive
group=$(az group show -n demoGroup --query id --output tsv)
az tag list --resource-id $group
```
Az előfizetés címkéit a következő paranccsal tekintheti meg:
```azurecli-interactive
sub=$(az account show --subscription "Demo Subscription" --query id --output tsv)
az tag list --resource-id /subscriptions/$sub
```
### <a name="list-by-tag"></a>Listázás címke szerint
A megadott címke névvel és értékkel rendelkező erőforrások lekéréséhez használja a következőt:
```azurecli-interactive
az resource list --tag CostCenter=00123 --query [].name
```
Ha olyan erőforrásokat szeretne lekérni, amelyek címkével megadott névvel rendelkeznek, használja a következőt:
```azurecli-interactive
az resource list --tag Team --query [].name
```
A megadott címke névvel és értékkel rendelkező erőforráscsoportok lekéréséhez használja a következőt:
```azurecli-interactive
az group list --tag Dept=Finance
```
### <a name="remove-tags"></a>Címkék eltávolítása
Adott címkék eltávolításához használja a parancsot, `az tag update` és állítsa a következőre: `--operation` `Delete` . Adja meg a törölni kívánt címkéket.
```azurecli-interactive
az tag update --resource-id $resource --operation Delete --tags Project=ECommerce Team=Web
```
A megadott címkék el lesznek távolítva.
```output
"properties": {
"tags": {
"CostCenter": "00123"
}
},
```
Az összes címke eltávolításához használja az az [tag delete](/cli/azure/tag#az_tag_delete) parancsot.
```azurecli-interactive
az tag delete --resource-id $resource
```
### <a name="handling-spaces"></a>Szóközök feldolgozása
Ha a címke neve vagy értéke szóközt tartalmaz, tegye idézőjelek közé.
```azurecli-interactive
az tag update --resource-id $group --operation Merge --tags "Cost Center"=Finance-1222 Location="West US"
```
## <a name="arm-templates"></a>ARM-sablonok
Az üzembe helyezés során egy Azure Resource Manager sablonnal (ARM-sablonnal) címkézheti az erőforrásokat, az erőforráscsoportokat és az előfizetéseket.
> [!NOTE]
> Az ARM-sablonon keresztül alkalmazott címkék felülírják a meglévő címkéket.
### <a name="apply-values"></a>Értékek alkalmazása
Az alábbi példa három címkével rendelkező Storage-fiókot telepít. A címkék közül kettő ( `Dept` és `Environment` ) konstans értékre van beállítva. Az egyik címke ( `LastDeployed` ) egy olyan paraméterre van beállítva, amely alapértelmezett értéke az aktuális dátum.
```json
{
"$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"utcShort": {
"type": "string",
"defaultValue": "[utcNow('d')]"
},
"location": {
"type": "string",
"defaultValue": "[resourceGroup().location]"
}
},
"resources": [
{
"apiVersion": "2019-04-01",
"type": "Microsoft.Storage/storageAccounts",
"name": "[concat('storage', uniqueString(resourceGroup().id))]",
"location": "[parameters('location')]",
"tags": {
"Dept": "Finance",
"Environment": "Production",
"LastDeployed": "[parameters('utcShort')]"
},
"sku": {
"name": "Standard_LRS"
},
"kind": "Storage",
"properties": {}
}
]
}
```
### <a name="apply-an-object"></a>Objektum alkalmazása
Megadhat olyan objektumparamétert, amely több címkét tartalmaz, majd alkalmazhatja azt az objektumot a címkeelemre. Ez a megközelítés nagyobb rugalmasságot biztosít az előző példánál, mert az objektum különböző tulajdonságokkal rendelkezhet. Az objektum minden tulajdonsága az erőforrás külön címkéjévé válik. Az alábbi példa egy `tagValues` nevű paramétert tartalmaz, amely a címkeelemre van alkalmazva.
```json
{
"$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"location": {
"type": "string",
"defaultValue": "[resourceGroup().location]"
},
"tagValues": {
"type": "object",
"defaultValue": {
"Dept": "Finance",
"Environment": "Production"
}
}
},
"resources": [
{
"apiVersion": "2019-04-01",
"type": "Microsoft.Storage/storageAccounts",
"name": "[concat('storage', uniqueString(resourceGroup().id))]",
"location": "[parameters('location')]",
"tags": "[parameters('tagValues')]",
"sku": {
"name": "Standard_LRS"
},
"kind": "Storage",
"properties": {}
}
]
}
```
### <a name="apply-a-json-string"></a>JSON-karakterlánc alkalmazása
Ha több értéket szeretne tárolni egyetlen címkében, alkalmazzon a megfelelő értékeket képviselő JSON-sztringet. A teljes JSON-karakterlánc egyetlen címkeként van tárolva, amely nem lehet hosszabb 256 karakternél. Az alábbi példában egy `CostCenter` nevű címke szerepel, amely egy JSON-sztring számos értékét tartalmazza:
```json
{
"$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"location": {
"type": "string",
"defaultValue": "[resourceGroup().location]"
}
},
"resources": [
{
"apiVersion": "2019-04-01",
"type": "Microsoft.Storage/storageAccounts",
"name": "[concat('storage', uniqueString(resourceGroup().id))]",
"location": "[parameters('location')]",
"tags": {
"CostCenter": "{\"Dept\":\"Finance\",\"Environment\":\"Production\"}"
},
"sku": {
"name": "Standard_LRS"
},
"kind": "Storage",
"properties": {}
}
]
}
```
### <a name="apply-tags-from-resource-group"></a>Címkék alkalmazása az erőforrás-csoportból
Ha címkéket szeretne alkalmazni egy erőforrás-csoportból egy erőforrásra, használja a [resourceGroup ()](../templates/template-functions-resource.md#resourcegroup) függvényt. A címke értékének beolvasása során a szintaxis helyett használja a `tags[tag-name]` szintaxist `tags.tag-name` , mert néhány karakter nem megfelelően van értelmezve a dot jelölésben.
```json
{
"$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"location": {
"type": "string",
"defaultValue": "[resourceGroup().location]"
}
},
"resources": [
{
"apiVersion": "2019-04-01",
"type": "Microsoft.Storage/storageAccounts",
"name": "[concat('storage', uniqueString(resourceGroup().id))]",
"location": "[parameters('location')]",
"tags": {
"Dept": "[resourceGroup().tags['Dept']]",
"Environment": "[resourceGroup().tags['Environment']]"
},
"sku": {
"name": "Standard_LRS"
},
"kind": "Storage",
"properties": {}
}
]
}
```
### <a name="apply-tags-to-resource-groups-or-subscriptions"></a>Címkék alkalmazása erőforráscsoportok vagy előfizetések számára
Hozzáadhat címkéket egy erőforráscsoporthoz vagy előfizetéshez a **Microsoft. Resources/Tags** erőforrástípus üzembe helyezésével. A címkéket a rendszer a célként megadott erőforráscsoporthoz vagy előfizetésre alkalmazza a központi telepítéshez. Minden alkalommal, amikor központilag telepíti a sablont, a rendszer már alkalmazta a címkéket.
```json
{
"$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"tagName": {
"type": "string",
"defaultValue": "TeamName"
},
"tagValue": {
"type": "string",
"defaultValue": "AppTeam1"
}
},
"variables": {},
"resources": [
{
"type": "Microsoft.Resources/tags",
"name": "default",
"apiVersion": "2019-10-01",
"dependsOn": [],
"properties": {
"tags": {
"[parameters('tagName')]": "[parameters('tagValue')]"
}
}
}
]
}
```
Ha a címkéket egy erőforráscsoporthoz szeretné alkalmazni, használja a PowerShell vagy az Azure CLI-t. Telepítse a címkével ellátni kívánt erőforráscsoportot.
```azurepowershell-interactive
New-AzResourceGroupDeployment -ResourceGroupName exampleGroup -TemplateFile https://raw.githubusercontent.com/Azure/azure-docs-json-samples/master/azure-resource-manager/tags.json
```
```azurecli-interactive
az deployment group create --resource-group exampleGroup --template-uri https://raw.githubusercontent.com/Azure/azure-docs-json-samples/master/azure-resource-manager/tags.json
```
A címkék előfizetésre való alkalmazásához használja a PowerShell vagy az Azure CLI-t. Telepítse a címkével ellátni kívánt előfizetést.
```azurepowershell-interactive
New-AzSubscriptionDeployment -name tagresourcegroup -Location westus2 -TemplateUri https://raw.githubusercontent.com/Azure/azure-docs-json-samples/master/azure-resource-manager/tags.json
```
```azurecli-interactive
az deployment sub create --name tagresourcegroup --location westus2 --template-uri https://raw.githubusercontent.com/Azure/azure-docs-json-samples/master/azure-resource-manager/tags.json
```
Az előfizetések telepítésével kapcsolatos további információkért lásd: [erőforráscsoportok és erőforrások létrehozása az előfizetési szinten](../templates/deploy-to-subscription.md).
A következő sablon hozzáadja a címkéket egy objektumból egy erőforráscsoporthoz vagy előfizetésbe.
```json
"$schema": "https://schema.management.azure.com/schemas/2018-05-01/subscriptionDeploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"tags": {
"type": "object",
"defaultValue": {
"TeamName": "AppTeam1",
"Dept": "Finance",
"Environment": "Production"
}
}
},
"variables": {},
"resources": [
{
"type": "Microsoft.Resources/tags",
"name": "default",
"apiVersion": "2019-10-01",
"dependsOn": [],
"properties": {
"tags": "[parameters('tags')]"
}
}
]
}
```
## <a name="portal"></a>Portál
[!INCLUDE [resource-manager-tag-resource](../../../includes/resource-manager-tag-resources.md)]
## <a name="rest-api"></a>REST API
Ha az Azure REST API használatával szeretne címkékkel dolgozni, használja a következőt:
* [Címkék – létrehozás vagy frissítés hatókör](/rest/api/resources/resources/tags/createorupdateatscope) (Put művelet)
* [Címkék – frissítés a hatókörben](/rest/api/resources/resources/tags/updateatscope) (javítási művelet)
* [Címkék – beolvasás hatóköre](/rest/api/resources/resources/tags/getatscope) (lekérési művelet)
* [Címkék – törlés a hatókörben](/rest/api/resources/resources/tags/deleteatscope) (törlési művelet)
## <a name="inherit-tags"></a>Inherit tags
Resources don't inherit the tags you apply to a resource group or a subscription. To apply tags from a subscription or resource group to the resources, see [Azure Policies - tags](tag-policies.md).
## <a name="tags-and-billing"></a>Tags and billing
You can use tags to group your billing data. For example, if you're running multiple VMs for different organizations, use tags to group usage by cost center. You can also use tags to categorize costs by runtime environment, such as billing the usage for VMs running in the production environment.
You can retrieve information about tags by downloading the comma-separated values (CSV) file available from the Azure portal. For more information, see [Download or view your Azure billing invoice and daily usage data](../../cost-management-billing/manage/download-azure-invoice-daily-usage-date.md). When you download the usage file from the Azure Account Center, select **Version 2**. For services that support tags with billing, the tags appear in the **Tags** column.
For REST API operations, see the [Azure Billing REST API reference](/rest/api/billing/).
## <a name="limitations"></a>Limitations
The following limitations apply to tags:
* Not all resource types support tags. To determine whether you can apply a tag to a resource type, see [Tag support for Azure resources](tag-support.md).
* Each resource, resource group, and subscription can have a maximum of 50 tag name/value pairs. If you need to apply more tags than the maximum allowed, use a JSON string for the tag value. The JSON string can contain many values that are applied to a single tag name. A resource group or subscription can contain many resources that each have 50 tag name/value pairs.
* The tag name is limited to 512 characters and the tag value is limited to 256 characters. For storage accounts, the tag name is limited to 128 characters and the tag value is limited to 256 characters.
* Tags can't be applied to classic resources such as Cloud Services.
* Tag names can't contain these characters: `<`, `>`, `%`, `&`, `\`, `?`, `/`
> [!NOTE]
> Azure DNS zones and Traffic Manager services currently don't allow the use of spaces in the tag.
>
> Azure Front Door doesn't support the use of `#` in the tag name.
>
> Azure Automation and Azure CDN only support 15 tags on resources.
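The general limits above can be checked client-side before calling the API. This is a minimal sketch, not an official SDK validator; the defaults mirror the documented limits (50 pairs, 512-character names, 256-character values), and `name_len` could be lowered to 128 when targeting storage accounts.

```python
# Sketch: client-side check of the documented tag limits.
FORBIDDEN = set('<>%&\\?/')  # characters not allowed in tag names

def validate_tags(tags, max_pairs=50, name_len=512, value_len=256):
    """Return a list of human-readable problems; an empty list means OK."""
    problems = []
    if len(tags) > max_pairs:
        problems.append(f"more than {max_pairs} tag name/value pairs")
    for name, value in tags.items():
        if len(name) > name_len:
            problems.append(f"tag name too long: {name[:20]}...")
        if len(value) > value_len:
            problems.append(f"tag value too long for: {name}")
        if set(name) & FORBIDDEN:
            problems.append(f"illegal character in tag name: {name}")
    return problems

print(validate_tags({"Dept": "Finance"}))  # []
print(validate_tags({"a/b": "x"}))         # illegal character in tag name
```

A check like this is also a natural place to serialize overflow values into a JSON string, the workaround described above for exceeding the 50-pair limit.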
## <a name="next-steps"></a>Next steps
* Not all resource types support tags. To determine whether you can apply a tag to a resource type, see [Tag support for Azure resources](tag-support.md).
* For recommendations on how to implement a tagging strategy, see the [Resource naming and tagging decision guide](/azure/cloud-adoption-framework/decision-guides/resource-tagging/?toc=/azure/azure-resource-manager/management/toc.json).
---
layout: osdoc
title: Sequences and loops
group: Usage
permalink: /sequences-and-loops/
---
The `loop` and `sequence` items are two special items that add structure to your experiment. Understanding how `loop`s and `sequence`s work is one of the trickier aspects of working with OpenSesame.
%--
toc:
mindepth: 2
--%
## `sequence` items
A `sequence` is a list of items that is executed sequentially. For every item in a `sequence` there is also a 'Run if' condition, which specifies the conditions under which an item should be executed (by default this is 'always'). A `sequence` does not repeat automatically: For this, you will need to combine it with a `loop`.
A typical situation where a `sequence` is used is as a trial: A single trial will often correspond to a single `sequence` item. This illustrated in the screenshot below.
%--
figure:
id: FigTrial
source: 1.png
caption: |
A single trial often corresponds to a single `sequence` item.
--%
In this example trial, there is a *trial_sequence*, which consists of a fixation dot (a `sketchpad` item), a target display (another `sketchpad`), a response collection item (a `keyboard_response`), a green fixation dot (another `sketchpad`), a red fixation dot (another `sketchpad`), and a data logging item (a `logger`).
Most items are called 'always'. However, the green and red fixation dots have specific 'Run if' conditions. The green fixation dot is only called when the variable `correct` has the value 1. The red fixation dot is only called when the variable `correct` has the value 0. Effectively, this means that the color of the fixation dot provides the participant with immediate feedback on every trial: Green means correct, red means incorrect.
The variable `correct` is set automatically by the `keyboard_response` item. For more information about variables and conditional statements, see:
- [usage/variables-and-conditional-statements/]
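The run-if logic described above can be modeled in a few lines. This is not OpenSesame's actual API, only a hypothetical sketch of the behavior: each item runs when its condition is 'always' or when the condition evaluates to true against the experiment's variables.

```python
# Hypothetical model of a sequence with "Run if" conditions (not real OpenSesame API).

def run_sequence(items, variables):
    """Run each (name, condition) item whose condition holds; return names run."""
    executed = []
    for name, condition in items:
        # "always" mirrors the default Run-if value in the sequence table.
        if condition == "always" or eval(condition, {}, variables):
            executed.append(name)
    return executed

# The trial from the screenshot above, with its Run-if conditions.
trial_sequence = [
    ("fixation", "always"),
    ("target", "always"),
    ("keyboard_response", "always"),
    ("green_fixation", "correct == 1"),  # feedback for a correct response
    ("red_fixation", "correct == 0"),    # feedback for an incorrect response
    ("logger", "always"),
]

print(run_sequence(trial_sequence, {"correct": 1}))
```

With `correct` set to 1 only the green fixation dot runs; with 0, only the red one.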
## `loop` items
`Loop` items repeatedly call a single other item, the 'item to run'. `Loop`s are also used to control independent variables, so that every time that the item-to-run is called, the independent variables have different values. A `loop` only calls a single other item. To call multiple items, will need to combine a `loop` it with a `sequence`.
A typical situation where a `loop` is used is to form a block of trials. In that case, the item-to-run is a trial sequence, which is called multiple times. This is illustrated in the screenshot below.
%--
figure:
id: FigLoopTable
source: 2.png
caption: |
A `loop` item provides a table in which you can define your independent variables.
--%
In this example `loop`, three independent variables have been defined: `object`, `orientation`, and `correct_response`. There are eight different cycles, or combinations ('knife, left, z', 'knife, right, z', etc.). Because 'repeat' is set to 3, every combination is called three times. Therefore, the item *trial_sequence* is called 8 x 3 = 24 times. The 'order' is set to 'random', which means that a random cycle is selected (without replacement) for every call of the item *trial_sequence*.
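The cycle/repeat/order arithmetic can be sketched as follows. Only the 'knife, left/right, z' rows are visible in the table above, so the remaining object levels here are invented placeholders; the point is the 8 × 3 = 24 randomized calls.

```python
import itertools
import random

# Placeholder factor levels: only "knife" is shown above; the rest are invented.
objects = ["knife", "fork", "spoon", "cup"]
orientations = ["left", "right"]

# 4 objects x 2 orientations = 8 cycles, as in the example loop table.
cycles = [{"object": o, "orientation": r, "correct_response": "z"}
          for o, r in itertools.product(objects, orientations)]

def run_loop(cycles, repeat, order="random"):
    """Return the trial list the loop would feed to its item-to-run."""
    trials = cycles * repeat       # 'repeat' copies of every cycle
    if order == "random":
        random.shuffle(trials)     # random order, without replacement
    return trials

trials = run_loop(cycles, repeat=3)
print(len(trials))  # 8 cycles x 3 repeats = 24 calls of trial_sequence
```

Shuffling the repeated list reproduces 'random without replacement': every combination still occurs exactly three times.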
## Combining `loop`s and `sequence`s
`Loop`s and `sequence`s are often combined to create a structure in which multiple items are repeated. As we've seen, a typical example of a `loop`-`sequence` structure is a single block of trials. Here a single trial is a `sequence`, which is called repeatedly by a `loop` to form a block of trials, as shown in the screenshots below.
%--
figure:
id: FigBlock
source: 3.png
caption: |
A block of trials often corresponds to a `loop` item, which in turn calls a `sequence` item that corresponds to a single trial.
--%
One level up in the hierarchy of the experiment, there is another `loop`-`sequence` structure, which corresponds to multiple blocks of trials. Here, a `sequence` (the *block_sequence* in the figure) calls a single block of trials (the *block_loop*), followed by a `feedback` item. This `sequence` is repeatedly called by a `loop` (the *experimental_loop*), so that there are multiple blocks of trials, each followed by feedback.
%--
figure:
id: FigLoopSequence
source: 4.png
caption: |
You can use nested `loop`-`sequence` structures to implement trials, blocks of trials, blocks of blocks, etc.
--%
The structure displayed in the screenshot above might look a bit confusing at first sight, but it becomes clearer when you think about it as a two nested `loop`-`sequence` structures. The first one (*block_loop* - *trial_sequence*) corresponds to a single block of trials. The second one (*experimental_loop* - *block_sequence*) corresponds to multiple blocks of trials, each followed by feedback to the participant. Many experiments will contain a structure of this kind.
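The two nested loop-sequence structures reduce to two nested loops in code. This sketch assumes two blocks of 24 trials (the block count is not specified above) and records a feedback item after each block, mirroring *experimental_loop* → *block_sequence* → (*block_loop* → *trial_sequence*, feedback).

```python
# Sketch of the nested structure:
# experimental_loop -> block_sequence -> (block_loop -> trial_sequence, feedback)

def run_experiment(n_blocks, trials_per_block):
    log = []
    for block in range(n_blocks):              # experimental_loop
        for trial in range(trials_per_block):  # block_loop calling trial_sequence
            log.append(("trial", block, trial))
        log.append(("feedback", block))        # feedback item in block_sequence
    return log

log = run_experiment(n_blocks=2, trials_per_block=24)
print(len(log))  # 2 x (24 trials + 1 feedback) = 50 entries
```

Reading the log confirms the order: 24 trials, then feedback, then the next block.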
[usage/variables-and-conditional-statements/]: /usage/variables-and-conditional-statements/
ff0d668d3fe31ba6775ac389872fe00058a88ff8 | 13,571 | md | Markdown | articles/mysql/concept-reserved-pricing.md | ZetaPR/azure-docs.es-es | 0e2bf787d1d9ab12065fcb1091a7f13b96c6f8a2 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-03-12T23:37:16.000Z | 2021-03-12T23:37:16.000Z | articles/mysql/concept-reserved-pricing.md | ZetaPR/azure-docs.es-es | 0e2bf787d1d9ab12065fcb1091a7f13b96c6f8a2 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/mysql/concept-reserved-pricing.md | ZetaPR/azure-docs.es-es | 0e2bf787d1d9ab12065fcb1091a7f13b96c6f8a2 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Prepay for Azure Database for MySQL compute resources with reserved capacity
description: Prepay for Azure Database for MySQL compute resources with reserved capacity
author: mksuni
ms.author: sumuth
ms.service: mysql
ms.topic: conceptual
ms.date: 10/06/2021
ms.openlocfilehash: fb679c4cfdcda3a34ea43bace8a9aa546e542acf
ms.sourcegitcommit: e82ce0be68dabf98aa33052afb12f205a203d12d
ms.translationtype: HT
ms.contentlocale: es-ES
ms.lasthandoff: 10/07/2021
ms.locfileid: "129658163"
---
# <a name="prepay-for-azure-database-for-mysql-compute-resources-with-reserved-instances"></a>Prepay for Azure Database for MySQL compute resources with reserved instances
[!INCLUDE[applies-to-mysql-single-flexible-server](includes/applies-to-mysql-single-flexible-server.md)]
Azure Database for MySQL now helps you save money by prepaying for compute resources, compared to pay-as-you-go prices. With Azure Database for MySQL reserved instances, you make an upfront commitment on a MySQL server for a one- or three-year period to get a significant discount on compute costs. To purchase Azure Database for MySQL reserved capacity, you need to specify the Azure region, deployment type, performance tier, and term. </br>
## <a name="how-does-the-instance-reservation-work"></a>How does the instance reservation work
You don't need to assign the reservation to specific Azure Database for MySQL servers. Already running Azure Database for MySQL instances, or ones that are newly deployed, automatically get the benefit of reserved pricing. By purchasing a reservation, you prepay for the compute costs for a period of one or three years. As soon as you buy a reservation, the Azure Database for MySQL compute charges that match the reservation attributes are no longer charged at the pay-as-you-go rates. A reservation doesn't cover software, networking, or storage charges associated with the MySQL database server. At the end of the reservation term, the billing benefit expires and the Azure Database for MySQL instances are billed at the pay-as-you-go price. Reservations don't auto-renew. For pricing information, see the [Azure Database for MySQL reserved capacity offering](https://azure.microsoft.com/pricing/details/mysql/). </br>
You can buy Azure Database for MySQL reserved capacity in the [Azure portal](https://portal.azure.com/). Pay for the reservation [up front or with monthly payments](../cost-management-billing/reservations/prepare-buy-reservation.md). To buy reserved capacity:
* You must be in the owner role for at least one Enterprise or individual subscription with pay-as-you-go rates.
* For Enterprise subscriptions, **Add Reserved Instances** must be enabled in the [EA portal](https://ea.azure.com/). Or, if that setting is disabled, you must be an EA Admin on the subscription.
* For the Cloud Solution Provider (CSP) program, only admin agents or sales agents can purchase Azure Database for MySQL reserved capacity. </br>
For details on how enterprise customers and pay-as-you-go customers are charged for reservation purchases, see [Understand Azure reservation usage for your Enterprise enrollment](../cost-management-billing/reservations/understand-reserved-instance-usage-ea.md) and [Understand Azure reservation usage for your pay-as-you-go subscription](../cost-management-billing/reservations/understand-reserved-instance-usage.md).
## <a name="reservation-exchanges-and-refunds"></a>Reservation exchanges and refunds
You can exchange a reservation for another reservation of the same type; you can also exchange a reservation from Azure Database for MySQL - Single Server with Flexible Server. You can also refund a reservation if you no longer need it. You can use the Azure portal to exchange or refund a reservation. For more information, see [Self-service exchanges and refunds for Azure Reservations](../cost-management-billing/reservations/exchange-and-refund-azure-reservations.md).
## <a name="reservation-discount"></a>Reservation discount
You can save up to 67% on compute costs with reserved instances. To find the discount that applies to you, go to the [Reservation pane in the Azure portal](https://aka.ms/reservations) and check the savings per pricing tier and region. Reserved instances help you better manage workloads, budgets, and forecasts through upfront payment for one-year or three-year terms. You can also exchange or cancel reservations as business needs change.
## <a name="determine-the-right-database-size-before-purchase"></a>Determine the right database size before purchase
The size of the reservation should be based on the total amount of compute that the existing servers, or servers about to be deployed, will use within a specific region, at the same performance tier and hardware generation.</br>
For example, suppose you're running one general purpose Gen5 32-vCore MySQL database and two memory-optimized Gen5 16-vCore MySQL databases. Further, suppose you plan to deploy an additional general purpose Gen5 32-vCore database server and one memory-optimized Gen5 16-vCore database server within the next month. Also suppose you know that you'll need these resources for at least one year. In this case, you should purchase a one-year reservation for 64 (2 × 32) vCores for the general purpose single database tier, and another one-year reservation for 48 (2 × 16 + 16) vCores for the memory-optimized single database tier.
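The sizing arithmetic in the example above is just a sum of vCores per tier across running and planned servers; a small sketch makes the bookkeeping explicit (the tier labels here are informal, not official SKU names).

```python
# Sketch of the sizing example: sum vCores per tier across running and
# planned servers to get the reservation quantity for each tier.
from collections import defaultdict

servers = [
    # (tier, vcores, count) -- currently running
    ("general_purpose", 32, 1),
    ("memory_optimized", 16, 2),
    # planned for next month
    ("general_purpose", 32, 1),
    ("memory_optimized", 16, 1),
]

def reservation_quantities(servers):
    totals = defaultdict(int)
    for tier, vcores, count in servers:
        totals[tier] += vcores * count
    return dict(totals)

print(reservation_quantities(servers))
# {'general_purpose': 64, 'memory_optimized': 48}
```

The totals match the example: 64 vCores general purpose (2 × 32) and 48 vCores memory optimized (2 × 16 + 16).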
## <a name="buy-azure-database-for-mysql-reserved-capacity"></a>Buy Azure Database for MySQL reserved capacity
1. Sign in to the [Azure portal](https://portal.azure.com/).
2. Select **All services** > **Reservations**.
3. Select **Add**, and then in the Purchase reservations pane, select **Azure Database for MySQL** to purchase a new reservation for your MySQL databases.
4. Fill in the required fields. Existing or new databases that match the attributes you select qualify for the reserved capacity discount. The actual number of Azure Database for MySQL servers that get the discount depends on the selected scope and quantity.
:::image type="content" source="media/concepts-reserved-pricing/mysql-reserved-price.png" alt-text="Overview of reserved pricing":::
The following table describes the required fields.
| Field | Description |
| :------------ | :------- |
| Subscription | The subscription used to pay for the Azure Database for MySQL reserved capacity. The payment method on the subscription is charged the upfront costs for the Azure Database for MySQL reserved capacity. The subscription type must be Enterprise Agreement (offer numbers: MS-AZR-0017P or MS-AZR-0148P) or an individual agreement with pay-as-you-go pricing (offer numbers: MS-AZR-0003P or MS-AZR-0023P). For an Enterprise subscription, the charges are deducted from the enrollment's Azure Prepayment (previously called monetary commitment) balance, or charged as overage. For an individual subscription with pay-as-you-go pricing, the charges are billed to the subscription's credit card or invoice payment method.
| Scope | The vCore reservation's scope can cover one subscription or multiple subscriptions (shared scope). If you select: </br></br> **Shared**: the vCore reservation discount is applied to Azure Database for MySQL servers running in any subscription within your billing context. For Enterprise customers, the shared scope is the enrollment and includes all subscriptions within it. For pay-as-you-go customers, the shared scope includes all pay-as-you-go subscriptions created by the account administrator.</br></br> **Single subscription**: the vCore reservation discount is applied to Azure Database for MySQL servers in this subscription. </br></br> **Single resource group**: the reservation discount is applied to Azure Database for MySQL servers in the selected subscription and the selected resource group within that subscription.
| Region | The Azure region that's covered by the Azure Database for MySQL reserved capacity.
| Deployment type | The Azure Database for MySQL resource type that you want to buy the reservation for.
| Performance tier | The service tier for the Azure Database for MySQL servers.
| Term | One year
| Quantity | The amount of compute resources being purchased within the Azure Database for MySQL reserved capacity. The quantity is the number of vCores in the selected Azure region and performance tier that are being reserved, and that will get the billing discount. For example, if you run or plan to run Azure Database for MySQL servers with a total compute capacity of Gen5 16 vCores in the East US region, you would specify the quantity as 16 to maximize the benefit for all the servers.
## <a name="reserved-instances-api-support"></a>Reserved instances API support
Use Azure APIs to programmatically get information for your organization about Azure service or software reservations. For example, use the APIs to:
- Find reservations to buy
- Buy a reservation
- View purchased reservations
- View and manage reservation access
- Split or merge reservations
- Change the scope of reservations
For more information, see [APIs for Azure reservation automation](../cost-management-billing/reservations/reservation-apis.md).
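As a rough sketch of what a programmatic purchase could look like, the snippet below only assembles a reservation-order URL and request body; nothing is sent. The provider path, `api-version`, SKU name, and body field names are assumptions drawn from the general shape of the reservations APIs — verify every one of them against the reservation automation article linked above.

```python
# Rough sketch of a reservation purchase request (nothing is sent).
# Provider path, api-version, SKU, and field names are unverified assumptions.
import uuid

def purchase_url(order_id, api_version="2022-03-01"):
    return (f"https://management.azure.com/providers/Microsoft.Capacity"
            f"/reservationOrders/{order_id}?api-version={api_version}")

order_id = str(uuid.uuid4())  # a reservation order is keyed by a client-chosen GUID
body = {
    "sku": {"name": "MySQL_GeneralPurpose_Gen5"},  # illustrative SKU name
    "location": "eastus",
    "properties": {
        "reservedResourceType": "MySqlDatabases",  # illustrative value
        "term": "P1Y",                             # one-year term
        "quantity": 64,                            # vCores, per the sizing example
        "appliedScopeType": "Shared",
    },
}

print(purchase_url(order_id))
```

In practice the request would be issued with an authenticated PUT via an Azure SDK or REST client.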
## <a name="vcore-size-flexibility"></a>vCore size flexibility
vCore size flexibility helps you scale up or down within a performance tier and region, without losing the reserved capacity benefit.
## <a name="how-to-view-reserved-instance-purchase-details"></a>How to view reserved instance purchase details
You can view the details of your reserved instance purchase via the [Reservations menu on the left side of the Azure portal](https://aka.ms/reservations). For more information, see [How a reservation discount is applied to an Azure Database for MySQL single server](../cost-management-billing/reservations/understand-reservation-charges-mysql.md).
## <a name="reserved-instance-expiration"></a>Reserved instance expiration
You'll receive email notifications, the first one 30 days before the reservation expires and another one at expiration. Once the reservation expires, deployed virtual machines continue to run and are billed at the pay-as-you-go rate. For more information, see [Reserved instances of Azure Database for MySQL](../cost-management-billing/reservations/understand-reservation-charges-mysql.md).
## <a name="need-help--contact-us"></a>Need help? Contact us
If you have questions or need help, [create a support request](https://portal.azure.com/#blade/Microsoft_Azure_Support/HelpAndSupportBlade/newsupportrequest).
## <a name="next-steps"></a>Next steps
The vCore reservation discount is applied automatically to the number of Azure Database for MySQL servers that match the scope and attributes of the Azure Database for MySQL reserved capacity reservation. You can update the scope of the reservation through the Azure portal, PowerShell, the CLI, or the API. </br></br>
To learn how to manage Azure Database for MySQL reserved capacity, see the section on managing Azure Database for MySQL reserved capacity.
To learn more about Azure Reservations, see the following articles:
* [What are Azure Reservations?](../cost-management-billing/reservations/save-compute-costs-reservations.md)
* [Manage Azure Reservations](../cost-management-billing/reservations/manage-reserved-vm-instance.md)
* [Understand the Azure Reservations discount](../cost-management-billing/reservations/understand-reservation-charges.md)
* [Understand reservation usage for your pay-as-you-go subscription](../cost-management-billing/reservations/understand-reservation-charges-mysql.md)
* [Understand reservation usage for your Enterprise enrollment](../cost-management-billing/reservations/understand-reserved-instance-usage-ea.md)
* [Azure Reservations in the Partner Center Cloud Solution Provider (CSP) program](/partner-center/azure-reservations)
ff0e6ff6000d7d8ca093de1e0df9a5130db148ba | 30,126 | md | Markdown | samples/moby-dick/032-cetology.md | soapdog/little-webby-press | 1a2d19a52bfb565c6554d76edcd7f3203bc62d56 | [
"MIT"
] | 19 | 2020-06-12T12:28:27.000Z | 2022-02-12T04:32:05.000Z | samples/moby-dick/032-cetology.md | soapdog/little-webby-press | 1a2d19a52bfb565c6554d76edcd7f3203bc62d56 | [
"MIT"
] | 5 | 2021-04-13T07:23:38.000Z | 2022-02-11T15:31:14.000Z | samples/moby-dick/032-cetology.md | soapdog/little-webby-press | 1a2d19a52bfb565c6554d76edcd7f3203bc62d56 | [
"MIT"
] | 4 | 2020-11-17T16:53:02.000Z | 2021-12-16T04:08:39.000Z | ### Chapter 32.
# Cetology.
Already we are boldly launched upon the deep; but soon we shall be lost in its
unshored, harbourless immensities. Ere that come to pass; ere the _Pequod’s_
weedy hull rolls side by side with the barnacled hulls of the leviathan; at the
outset it is but well to attend to a matter almost indispensable to a thorough
appreciative understanding of the more special leviathanic revelations and
allusions of all sorts which are to follow.
It is some systematized exhibition of the whale in his broad genera, that I
would now fain put before you. Yet is it no easy task. The classification of
the constituents of a chaos, nothing less is here essayed. Listen to what the
best and latest authorities have laid down.
“No branch of Zoology is so much involved as that which is entitled Cetology,”
says Captain Scoresby, A.D. 1820.
“It is not my intention, were it in my power, to enter into the inquiry as to
the true method of dividing the cetacea into groups and families.... Utter
confusion exists among the historians of this animal” (sperm whale), says
Surgeon Beale, A.D. 1839.
“Unfitness to pursue our research in the unfathomable waters.” “Impenetrable
veil covering our knowledge of the cetacea.” “A field strewn with thorns.” “All
these incomplete indications but serve to torture us naturalists.”
Thus speak of the whale, the great Cuvier, and John Hunter, and Lesson, those
lights of zoology and anatomy. Nevertheless, though of real knowledge there be
little, yet of books there are a plenty; and so in some small degree, with
cetology, or the science of whales. Many are the men, small and great, old and
new, landsmen and seamen, who have at large or in little, written of the whale.
Run over a few: — The Authors of the Bible; Aristotle; Pliny; Aldrovandi; Sir
Thomas Browne; Gesner; Ray; Linnaeus; Rondeletius; Willoughby; Green; Artedi;
Sibbald; Brisson; Marten; Lacepede; Bonneterre; Desmarest; Baron Cuvier;
Frederick Cuvier; John Hunter; Owen; Scoresby; Beale; Bennett; J. Ross Browne;
the Author of Miriam Coffin; Olmstead; and the Rev. T. Cheever. But to what
ultimate generalizing purpose all these have written, the above cited extracts
will show.
Of the names in this list of whale authors, only those following Owen ever saw
living whales; and but one of them was a real professional harpooneer and
whaleman. I mean Captain Scoresby. On the separate subject of the Greenland or
right-whale, he is the best existing authority. But Scoresby knew nothing and
says nothing of the great sperm whale, compared with which the Greenland whale
is almost unworthy mentioning. And here be it said, that the Greenland whale is
an usurper upon the throne of the seas. He is not even by any means the largest
of the whales. Yet, owing to the long priority of his claims, and the profound
ignorance which, till some seventy years back, invested the then fabulous or
utterly unknown sperm-whale, and which ignorance to this present day still
reigns in all but some few scientific retreats and whale-ports; this usurpation
has been every way complete. Reference to nearly all the leviathanic allusions
in the great poets of past days, will satisfy you that the Greenland whale,
without one rival, was to them the monarch of the seas. But the time has at
last come for a new proclamation. This is Charing Cross; hear ye! good people
all, — the Greenland whale is deposed, — the great sperm whale now reigneth!
There are only two books in being which at all pretend to put the living sperm
whale before you, and at the same time, in the remotest degree succeed in the
attempt. Those books are Beale’s and Bennett’s; both in their time surgeons to
English South-Sea whale-ships, and both exact and reliable men. The original
matter touching the sperm whale to be found in their volumes is necessarily
small; but so far as it goes, it is of excellent quality, though mostly
confined to scientific description. As yet, however, the sperm whale,
scientific or poetic, lives not complete in any literature. Far above all other
hunted whales, his is an unwritten life.
Now the various species of whales need some sort of popular comprehensive
classification, if only an easy outline one for the present, hereafter to be
filled in all its departments by subsequent laborers. As no better man advances
to take this matter in hand, I hereupon offer my own poor endeavors. I promise
nothing complete; because any human thing supposed to be complete, must for
that very reason infallibly be faulty. I shall not pretend to a minute
anatomical description of the various species, or — in this place at least — to
much of any description. My object here is simply to project the draught of a
systematization of cetology. I am the architect, not the builder.
But it is a ponderous task; no ordinary letter-sorter in the Post-Office is
equal to it. To grope down into the bottom of the sea after them; to have one’s
hands among the unspeakable foundations, ribs, and very pelvis of the world;
this is a fearful thing. What am I that I should essay to hook the nose of this
leviathan! The awful tauntings in Job might well appal me. Will he the
(leviathan) make a covenant with thee? Behold the hope of him is vain! But I
have swam through libraries and sailed through oceans; I have had to do with
whales with these visible hands; I am in earnest; and I will try. There are
some preliminaries to settle.
**First:** The uncertain, unsettled condition of this science of Cetology is in the
very vestibule attested by the fact, that in some quarters it still remains a
moot point whether a whale be a fish. In his System of Nature, A.D. 1776,
Linnaeus declares, “I hereby separate the whales from the fish.” But of my own
knowledge, I know that down to the year 1850, sharks and shad, alewives and
herring, against Linnaeus’s express edict, were still found dividing the
possession of the same seas with the Leviathan.
The grounds upon which Linnaeus would fain have banished the whales from the
waters, he states as follows: “On account of their warm bilocular heart, their
lungs, their movable eyelids, their hollow ears, penem intrantem feminam mammis
lactantem,” and finally, _“ex lege naturae jure meritoque.”_ I submitted all this
to my friends Simeon Macey and Charley Coffin, of Nantucket, both messmates of
mine in a certain voyage, and they united in the opinion that the reasons set
forth were altogether insufficient. Charley profanely hinted they were humbug.
Be it known that, waiving all argument, I take the good old fashioned ground
that the whale is a fish, and call upon holy Jonah to back me. This
fundamental thing settled, the next point is, in what internal respect does the
whale differ from other fish. Above, Linnaeus has given you those items. But in
brief, they are these: lungs and warm blood; whereas, all other fish are
lungless and cold blooded.
**Next:** how shall we define the whale, by his obvious externals, so as
conspicuously to label him for all time to come? To be short, then, a whale is
_a spouting fish with a horizontal tail._ There you have him. However
contracted, that definition is the result of expanded meditation. A walrus
spouts much like a whale, but the walrus is not a fish, because he is
amphibious. But the last term of the definition is still more cogent, as
coupled with the first. Almost any one must have noticed that all the fish
familiar to landsmen have not a flat, but a vertical, or up-and-down tail.
Whereas, among spouting fish the tail, though it may be similarly shaped,
invariably assumes a horizontal position.
By the above definition of what a whale is, I do by no means exclude from the
leviathanic brotherhood any sea creature hitherto identified with the whale by
the best informed Nantucketers; nor, on the other hand, link with it any fish
hitherto authoritatively regarded as alien.\* Hence, all the smaller, spouting,
and horizontal tailed fish must be included in this ground-plan of Cetology.
Now, then, come the grand divisions of the entire whale host.
> \* I am aware that down to the present time, the fish styled Lamatins and
> Dugongs (Pig-fish and Sow-fish of the Coffins of Nantucket) are included by
> many naturalists among the whales. But as these pig-fish are a noisy,
> contemptible set, mostly lurking in the mouths of rivers, and feeding on wet
> hay, and especially as they do not spout, I deny their credentials as whales;
> and have presented them with their passports to quit the Kingdom of Cetology.
---
**First:** According to magnitude I divide the whales into three primary
_Books_ (subdivisible into _Chapters_), and these shall comprehend them all,
both small and large.
## I. The _Folio_ Whale; II. the _Octavo_ Whale; III. the _Duodecimo_ Whale.
As the type of the _Folio_ I present the **Sperm Whale**; of the _Octavo_, the
**Grampus**; of the _Duodecimo,_ the **Porpoise.**
---
### Folios.
Among these I here include the following chapters: — **I. The Sperm
Whale;** **II. the Right Whale;** **III. the Fin-back Whale;** **IV. the Hump-backed
Whale;** **V. the Razor-back Whale;** **VI. the Sulphur-Bottom Whale.**
#### Book I. **(Folio),** Chapter I. (Sperm Whale).
— This whale, among the English of old vaguely known as the Trumpa whale, and
the Physeter whale, and the Anvil Headed whale, is the present Cachalot of the
French, and the Pottsfich of the Germans, and the Macrocephalus of the Long
Words. He is, without doubt, the largest inhabitant of the globe; the most
formidable of all whales to encounter; the most majestic in aspect; and lastly,
by far the most valuable in commerce; he being the only creature from which
that valuable substance, spermaceti, is obtained. All his peculiarities will,
in many other places, be enlarged upon. It is chiefly with his name that I now
have to do. Philologically considered, it is absurd. Some centuries ago, when
the Sperm whale was almost wholly unknown in his own proper individuality, and
when his oil was only accidentally obtained from the stranded fish; in those
days spermaceti, it would seem, was popularly supposed to be derived from a
creature identical with the one then known in England as the Greenland or Right
Whale. It was the idea also, that this same spermaceti was that quickening
humor of the Greenland Whale which the first syllable of the word literally
expresses. In those times, also, spermaceti was exceedingly scarce, not being
used for light, but only as an ointment and medicament. It was only to be had
from the druggists as you nowadays buy an ounce of rhubarb. When, as I opine,
in the course of time, the true nature of spermaceti became known, its original
name was still retained by the dealers; no doubt to enhance its value by a
notion so strangely significant of its scarcity. And so the appellation must at
last have come to be bestowed upon the whale from which this spermaceti was
really derived.
#### Book I. _(Folio),_ Chapter II. (Right Whale).
— In one respect this is the most venerable of the leviathans, being the one
first regularly hunted by man. It yields the article commonly known as
whalebone or baleen; and the oil specially known as “whale oil,” an inferior
article in commerce. Among the fishermen, he is indiscriminately designated by
all the following titles: The Whale; the Greenland Whale; the Black Whale; the
Great Whale; the True Whale; the Right Whale. There is a deal of obscurity
concerning the identity of the species thus multitudinously baptised. What then
is the whale, which I include in the second species of my Folios? It is the
Great Mysticetus of the English naturalists; the Greenland Whale of the English
whalemen; the Baliene Ordinaire of the French whalemen; the Growlands Walfish
of the Swedes. It is the whale which for more than two centuries past has been
hunted by the Dutch and English in the Arctic seas; it is the whale which the
American fishermen have long pursued in the Indian ocean, on the Brazil Banks,
on the Nor’ West Coast, and various other parts of the world, designated by
them Right Whale Cruising Grounds.
Some pretend to see a difference between the Greenland whale of the English and
the right whale of the Americans. But they precisely agree in all their grand
features; nor has there yet been presented a single determinate fact upon which
to ground a radical distinction. It is by endless subdivisions based upon the
most inconclusive differences, that some departments of natural history become
so repellingly intricate. The right whale will be elsewhere treated of at some
length, with reference to elucidating the sperm whale.
#### Book I. _(Folio),_ Chapter III. (Fin-back).
— Under this head I reckon a monster which, by the various names of Fin-Back,
Tall-Spout, and Long-John, has been seen almost in every sea and is commonly
the whale whose distant jet is so often descried by passengers crossing the
Atlantic, in the New York packet-tracks. In the length he attains, and in his
baleen, the Fin-back resembles the right whale, but is of a less portly girth,
and a lighter colour, approaching to olive. His great lips present a cable-like
aspect, formed by the intertwisting, slanting folds of large wrinkles. His
grand distinguishing feature, the fin, from which he derives his name, is often
a conspicuous object. This fin is some three or four feet long, growing
vertically from the hinder part of the back, of an angular shape, and with a
very sharp pointed end. Even if not the slightest other part of the creature be
visible, this isolated fin will, at times, be seen plainly projecting from the
surface. When the sea is moderately calm, and slightly marked with spherical
ripples, and this gnomon-like fin stands up and casts shadows upon the wrinkled
surface, it may well be supposed that the watery circle surrounding it somewhat
resembles a dial, with its style and wavy hour-lines graved on it. On that
Ahaz-dial the shadow often goes back. The Fin-Back is not gregarious. He seems
a whale-hater, as some men are man-haters. Very shy; always going solitary;
unexpectedly rising to the surface in the remotest and most sullen waters; his
straight and single lofty jet rising like a tall misanthropic spear upon a
barren plain; gifted with such wondrous power and velocity in swimming, as to
defy all present pursuit from man; this leviathan seems the banished and
unconquerable Cain of his race, bearing for his mark that style upon his back.
From having the baleen in his mouth, the Fin-Back is sometimes included with
the right whale, among a theoretic species denominated _whalebone whales,_ that
is, whales with baleen. Of these so called Whalebone whales, there would seem
to be several varieties, most of which, however, are little known. Broad-nosed
whales and beaked whales; pike-headed whales; bunched whales; under-jawed
whales and rostrated whales, are the fishermen’s names for a few sorts.
In connection with this appellative of “Whalebone whales,” it is of great
importance to mention, that however such a nomenclature may be convenient in
facilitating allusions to some kind of whales, yet it is in vain to attempt a
clear classification of the Leviathan, founded upon either his baleen, or hump,
or fin, or teeth; notwithstanding that those marked parts or features very
obviously seem better adapted to afford the basis for a regular system of
Cetology than any other detached bodily distinctions, which the whale, in his
kinds, presents. How then? The baleen, hump, back-fin, and teeth; these are
things whose peculiarities are indiscriminately dispersed among all sorts of
whales, without any regard to what may be the nature of their structure in
other and more essential particulars. Thus, the sperm whale and the humpbacked
whale, each has a hump; but there the similitude ceases. Then, this same
humpbacked whale and the Greenland whale, each of these has baleen; but there
again the similitude ceases. And it is just the same with the other parts above
mentioned. In various sorts of whales, they form such irregular combinations;
or, in the case of any one of them detached, such an irregular isolation; as
utterly to defy all general methodization formed upon such a basis. On this
rock every one of the whale-naturalists has split.
But it may possibly be conceived that, in the internal parts of the whale, in
his anatomy — there, at least, we shall be able to hit the right
classification. Nay; what thing, for example, is there in the Greenland whale’s
anatomy more striking than his baleen? Yet we have seen that by his baleen it
is impossible correctly to classify the Greenland whale. And if you descend
into the bowels of the various leviathans, why there you will not find
distinctions a fiftieth part as available to the systematizer as those external
ones already enumerated. What then remains? nothing but to take hold of the
whales bodily, in their entire liberal volume, and boldly sort them that way.
And this is the Bibliographical system here adopted; and it is the only one
that can possibly succeed, for it alone is practicable. To proceed.
#### Book I. _(Folio)_ Chapter IV. (Hump-back).
— This whale is often seen on the northern American coast. He has been
frequently captured there, and towed into harbor. He has a great pack on him
like a peddler; or you might call him the Elephant and Castle whale. At any
rate, the popular name for him does not sufficiently distinguish him, since the
sperm whale also has a hump though a smaller one. His oil is not very valuable.
He has baleen. He is the most gamesome and light-hearted of all the whales,
making more gay foam and white water generally than any other of them.
#### Book I. _(Folio),_ Chapter V. (Razor-back).
— Of this whale little is known but his name. I have seen him at a distance off
Cape Horn. Of a retiring nature, he eludes both hunters and philosophers.
Though no coward, he has never yet shown any part of him but his back, which
rises in a long sharp ridge. Let him go. I know little more of him, nor does
anybody else.
#### Book I. _(Folio),_ Chapter VI. (Sulphur-bottom).
— Another retiring gentleman, with a brimstone belly, doubtless got by scraping
along the Tartarian tiles in some of his profounder divings. He is seldom seen;
at least I have never seen him except in the remoter southern seas, and then
always at too great a distance to study his countenance. He is never chased; he
would run away with rope-walks of line. Prodigies are told of him. Adieu,
Sulphur Bottom! I can say nothing more that is true of ye, nor can the oldest
Nantucketer.
Thus ends **Book I. _(Folio),_** and now begins **Book II. _(Octavo)._**
---
### _Octavoes._\*
— These embrace the whales of middling magnitude, among which present may be
numbered: — **I., the Grampus;** **II., the Black fish;** **III., the
Narwhale;** **IV., the Thrasher;** **V., the Killer.**
> \* Why this book of whales is not denominated the Quarto is very plain.
> Because, while the whales of this order, though smaller than those of the
> former order, nevertheless retain a proportionate likeness to them in figure,
> yet the bookbinder’s Quarto volume in its dimensioned form does not preserve
> the shape of the _Folio_ volume, but the _Octavo_ volume does.
#### Book II. _(Octavo),_ Chapter I. (Grampus).
— Though this fish, whose loud sonorous breathing, or rather blowing, has
furnished a proverb to landsmen, is so well known a denizen of the deep, yet is
he not popularly classed among whales. But possessing all the grand distinctive
features of the leviathan, most naturalists have recognised him for one. He is
of moderate _octavo_ size, varying from fifteen to twenty-five feet in length,
and of corresponding dimensions round the waist. He swims in herds; he is never
regularly hunted, though his oil is considerable in quantity, and pretty good
for light. By some fishermen his approach is regarded as premonitory of the
advance of the great sperm whale.
#### Book II. _(Octavo),_ Chapter II. (Black fish).
— I give the popular fishermen’s names for all these fish, for generally they
are the best. Where any name happens to be vague or inexpressive, I shall say
so, and suggest another. I do so now, touching the Black Fish, so-called,
because blackness is the rule among almost all whales. So, call him the Hyena
Whale, if you please. His voracity is well known, and from the circumstance
that the inner angles of his lips are curved upwards, he carries an everlasting
Mephistophelean grin on his face. This whale averages some sixteen or eighteen
feet in length. He is found in almost all latitudes. He has a peculiar way of
showing his dorsal hooked fin in swimming, which looks something like a Roman
nose. When not more profitably employed, the sperm whale hunters sometimes
capture the Hyena whale, to keep up the supply of cheap oil for domestic
employment — as some frugal housekeepers, in the absence of company, and quite
alone by themselves, burn unsavory tallow instead of odorous wax. Though their
blubber is very thin, some of these whales will yield you upwards of thirty
gallons of oil.
#### Book II. _(Octavo),_ Chapter III. (Narwhale), that is, Nostril whale.
— Another instance of a curiously named whale, so named I suppose from his
peculiar horn being originally mistaken for a peaked nose. The creature is some
sixteen feet in length, while its horn averages five feet, though some exceed
ten, and even attain to fifteen feet. Strictly speaking, this horn is but a
lengthened tusk, growing out from the jaw in a line a little depressed from the
horizontal. But it is only found on the sinister side, which has an ill effect,
giving its owner something analogous to the aspect of a clumsy left-handed man.
What precise purpose this ivory horn or lance answers, it would be hard to say.
It does not seem to be used like the blade of the sword-fish and bill-fish;
though some sailors tell me that the Narwhale employs it for a rake in turning
over the bottom of the sea for food. Charley Coffin said it was used for an
ice-piercer; for the Narwhale, rising to the surface of the Polar Sea, and
finding it sheeted with ice, thrusts his horn up, and so breaks through. But
you cannot prove either of these surmises to be correct. My own opinion is,
that however this one-sided horn may really be used by the Narwhale — however
that may be — it would certainly be very convenient to him for a folder in
reading pamphlets. The Narwhale I have heard called the Tusked whale, the
Horned whale, and the Unicorn whale. He is certainly a curious example of the
Unicornism to be found in almost every kingdom of animated nature. From certain
cloistered old authors I have gathered that this same sea-unicorn’s horn was in
ancient days regarded as the great antidote against poison, and as such,
preparations of it brought immense prices. It was also distilled to a volatile
salts for fainting ladies, the same way that the horns of the male deer are
manufactured into hartshorn. Originally it was in itself accounted an object of
great curiosity. Black Letter tells me that Sir Martin Frobisher on his return
from that voyage, when Queen Bess did gallantly wave her jewelled hand to him
from a window of Greenwich Palace, as his bold ship sailed down the Thames;
“when Sir Martin returned from that voyage,” saith Black Letter, “on bended
knees he presented to her highness a prodigious long horn of the Narwhale,
which for a long period after hung in the castle at Windsor.” An Irish author
avers that the Earl of Leicester, on bended knees, did likewise present to her
highness another horn, pertaining to a land beast of the unicorn nature.
The Narwhale has a very picturesque, leopard-like look, being of a milk-white
ground colour, dotted with round and oblong spots of black. His oil is very
superior, clear and fine; but there is little of it, and he is seldom hunted.
He is mostly found in the circumpolar seas.
#### Book II. _(Octavo),_ Chapter IV. (Killer).
— Of this whale little is precisely known to the Nantucketer, and nothing at
all to the professed naturalist. From what I have seen of him at a distance, I
should say that he was about the bigness of a grampus. He is very savage — a
sort of Feegee fish. He sometimes takes the great _Folio_ whales by the lip,
and hangs there like a leech, till the mighty brute is worried to death. The
Killer is never hunted. I never heard what sort of oil he has. Exception might
be taken to the name bestowed upon this whale, on the ground of its
indistinctness. For we are all killers, on land and on sea; Bonapartes and
Sharks included.
#### Book II. _(Octavo),_ Chapter V. (Thrasher).
— This gentleman is famous for his tail, which he uses for a ferule in
thrashing his foes. He mounts the _Folio_ whale’s back, and as he swims, he
works his passage by flogging him; as some schoolmasters get along in the world
by a similar process. Still less is known of the Thrasher than of the Killer.
Both are outlaws, even in the lawless seas.
Thus ends **Book II. _(Octavo),_** and begins **Book III. _(Duodecimo)._**
---
### Duodecimoes.
— These include the smaller whales. **I. The Huzza Porpoise.** **II. The
Algerine Porpoise.** **III. The Mealy-mouthed Porpoise.**
To those who have not chanced specially to study the subject, it may possibly
seem strange, that fishes not commonly exceeding four or five feet should be
marshalled among _whales_ — a word, which, in the popular sense, always conveys
an idea of hugeness. But the creatures set down above as _Duodecimoes_ are
infallibly whales, by the terms of my definition of what a whale is — i.e. a
spouting fish, with a horizontal tail.
#### Book III. _(Duodecimo),_ Chapter I. (Huzza Porpoise).
— This is the common porpoise found almost all over the globe. The name is of
my own bestowal; for there are more than one sort of porpoises, and something
must be done to distinguish them. I call him thus, because he always swims in
hilarious shoals, which upon the broad sea keep tossing themselves to heaven
like caps in a Fourth-of-July crowd. Their appearance is generally hailed with
delight by the mariner. Full of fine spirits, they invariably come from the
breezy billows to windward. They are the lads that always live before the wind.
They are accounted a lucky omen. If you yourself can withstand three cheers at
beholding these vivacious fish, then heaven help ye; the spirit of godly
gamesomeness is not in ye. A well-fed, plump Huzza Porpoise will yield you one
good gallon of good oil. But the fine and delicate fluid extracted from his
jaws is exceedingly valuable. It is in request among jewellers and watchmakers.
Sailors put it on their hones. Porpoise meat is good eating, you know. It may
never have occurred to you that a porpoise spouts. Indeed, his spout is so
small that it is not very readily discernible. But the next time you have a
chance, watch him; and you will then see the great Sperm whale himself in
miniature.
#### Book III. _(Duodecimo),_ Chapter II. (Algerine Porpoise).
— A pirate. Very savage. He is only found, I think, in the Pacific. He is
somewhat larger than the Huzza Porpoise, but much of the same general make.
Provoke him, and he will buckle to a shark. I have lowered for him many times,
but never yet saw him captured.
#### Book III. _(Duodecimo),_ Chapter III. (Mealy-mouthed Porpoise).
— The largest kind of Porpoise; and only found in the Pacific, so far as it is
known. The only English name, by which he has hitherto been designated, is that
of the fishers — Right-Whale Porpoise, from the circumstance that he is chiefly
found in the vicinity of that _Folio._ In shape, he differs in some degree from
the Huzza Porpoise, being of a less rotund and jolly girth; indeed, he is of
quite a neat and gentleman-like figure. He has no fins on his back (most other
porpoises have), he has a lovely tail, and sentimental Indian eyes of a hazel
hue. But his mealy-mouth spoils all. Though his entire back down to his side
fins is of a deep sable, yet a boundary line, distinct as the mark in a ship’s
hull, called the “bright waist,” that line streaks him from stem to stern, with
two separate colours, black above and white below. The white comprises part of
his head, and the whole of his mouth, which makes him look as if he had just
escaped from a felonious visit to a meal-bag. A most mean and mealy aspect! His
oil is much like that of the common porpoise.
---
Beyond the _Duodecimo,_ this system does not proceed, inasmuch as the Porpoise
is the smallest of the whales. Above, you have all the Leviathans of note. But
there are a rabble of uncertain, fugitive, half-fabulous whales, which, as an
American whaleman, I know by reputation, but not personally. I shall enumerate
them by their fore-castle appellations; for possibly such a list may be
valuable to future investigators, who may complete what I have here but begun.
If any of the following whales, shall hereafter be caught and marked, then he
can readily be incorporated into this System, according to his _Folio,_ _Octavo,_
or _Duodecimo_ magnitude: — The Bottle-Nose Whale; the Junk Whale; the
Pudding-Headed Whale; the Cape Whale; the Leading Whale; the Cannon Whale; the
Scragg Whale; the Coppered Whale; the Elephant Whale; the Iceberg Whale; the
Quog Whale; the Blue Whale; etc. From Icelandic, Dutch, and old English
authorities, there might be quoted other lists of uncertain whales, blessed
with all manner of uncouth names. But I omit them as altogether obsolete; and
can hardly help suspecting them for mere sounds, full of Leviathanism, but
signifying nothing.
**Finally:** It was stated at the outset, that this system would not be here,
and at once, perfected. You cannot but plainly see that I have kept my word.
But I now leave my cetological System standing thus unfinished, even as the
great Cathedral of Cologne was left, with the crane still standing upon the top
of the uncompleted tower. For small erections may be finished by their first
architects; grand ones, true ones, ever leave the copestone to posterity. God
keep me from ever completing anything. This whole book is but a draught — nay,
but the draught of a draught. Oh, Time, Strength, Cash, and Patience!
# AWS Heartbeat (Server)
> Easy Heartbeat service based on AWS Lambda & DynamoDB
For the client, see: https://github.com/parcelLab/aws-heartbeat-client
# About
The goal of this repo is to provide a heartbeat service to monitor your services using AWS Lambda and DynamoDB.
AWS API Gateway is used to return DynamoDB data in the form of an HTML table. The table is sortable, and all pulses that have exceeded their threshold are marked.
The default setting is a 24-hour threshold that only counts weekdays, e.g. a pulse (with a threshold of 24h) sent on Friday evening won't be marked as exceeded until Monday evening.
It's possible to change the settings and get automatic Slack notifications in case a time limit is exceeded.
# What is the repo content?
This repo contains AWS Lambda functions, instructions on how to deploy them, and instructions on how to set up the AWS API Gateway. There are 5 Lambda functions:
* `pulse`: handles the pulse API calls from the aws-heartbeat-client and creates a new DynamoDB document or updates an existing one, i.e. this is the function that receives a heartbeat.
* `dashboard`: gets all heartbeats from DynamoDB and returns a sortable data table as a web page. Every pulse has an edit option to change its settings.
* `monitor`: checks if any pulse has exceeded its threshold and sends an error notification to Slack if so.
* `editGet`: renders a page to edit a single heartbeat via a table with all data of the pulse.
* `editPost`: updates the settings for a single heartbeat in DynamoDB; it is called by `editGet`.
There are multiple steps to make this work:
1. Create a new DynamoDB table with the name `Heartbeat` and the primary partition key `_id`.
2. Create a new AWS API Gateway with a child Resource `/heartbeat`.
3. Create an IAM role with read and write access rights to DynamoDB, as this is needed for the Lambda functions to work.
4. Download this repo, go to the folder and install the dependencies with `npm install`
5. Before deploying any Lambda functions, change the `secret`, `baseUrl` and `slackWebhook` in the code. You can do so by running the prepared script:
```
./changeSettings.sh <secret> <baseUrl> <slack>
```
...where:
* `secret`: a random string of your choice, acts as a kind of password to be used in the URL to access the dashboard.
* `baseUrl`: the URL of your newly created AWS API Gateway like `https://lambda.example.com/v1/heartbeat`
* `slackWebhook`: the URL of your Slack webhook, to be set up in Slack. If you don't provide this, the monitor function won't send any notifications.
6. Deploy the Lambda functions with [Claudia](https://claudiajs.com/), which requires you to first successfully configure your AWS credentials as described here: http://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/configuring-the-jssdk.html
```
./deployLambda.sh <region> <role>
```
...where:
* `region`: the AWS region where you want to deploy your functions, e.g. `eu-central-1`
* `role`: the access role for Lambda functions, e.g. `arn:aws:iam::123456789123:role/role_name`
7. Create 4 Child Resources and methods on the API Gateway. **Important:** For all methods, choose integration type `Lambda Function` and Lambda Proxy Intergration `yes`:
* heartbeat/dashboard: method `GET` with function `dashboard`
* heartbeat/edit: method `GET` with function `editGet`, and method `POST` with function `editPost`
* heartbeat/monitor: method `GET` with function `monitor`
* heartbeat/pulse: method `GET` with function `pulse`
8. Set a cron job for the monitor function by going to [AWS CloudWatch / Events](https://eu-central-1.console.aws.amazon.com/cloudwatch/home#rules:) and creating a new rule with a schedule pattern:
We recommend setting up two rules, each executing the same Lambda function, but one for working days and one for Sundays that executes the monitor less often:
Working days:
```
0 0,3,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20 ? * MON,TUE,WED,THU,FRI,SAT *
```
Sundays:
```
0 6,12,18 ? * SUN *
```
On the side of the target, configure the input and set a Constant (JSON text):
```
{"queryStringParameters":{"secret":"your-secret-goes-here"}}
```
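Once everything is wired up, a monitored service reports a heartbeat with a plain GET request against the `pulse` endpoint, typically at the end of a cron job. A minimal sketch is below; note that the `secret` and `name` query parameter names are assumptions for illustration only, so check the aws-heartbeat-client repo for the exact request format:

```shell
#!/bin/sh
# Sketch of reporting a heartbeat at the end of a cron job.
# NOTE: the "secret" and "name" query parameters are hypothetical; check the
# aws-heartbeat-client for the parameters the pulse function actually reads.
BASE_URL="https://lambda.example.com/v1/heartbeat"
SECRET="your-secret-goes-here"
JOB_NAME="nightly-backup"

PULSE_URL="${BASE_URL}/pulse?secret=${SECRET}&name=${JOB_NAME}"
echo "$PULSE_URL"
# curl -fsS "$PULSE_URL" > /dev/null   # uncomment to actually send the pulse
```

Appending the request to an existing crontab entry (`... && curl -fsS "$PULSE_URL"`) means a pulse is only sent when the job itself succeeded.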
# TxtableAutoWrite
Automatically fills in the daily epidemic-prevention information sheet collected via Tencent Docs.
MusicBox
========
A music player with basic functionality and a graphical interface.
You can import it as a NetBeans project or just use the source code!

Functions:

- Play mp3 file from filesystem (use some file open dialog)
- Pause, stop, seek, next/previous from playlist
- Playlist (place where you can add songs in your order from different locations)
- Add multiple files to playlist
- Play from playlist in random order
# parse-server-sendmail-template-adapter #
[![npm][npm-image]][npm-url]
[![build-status][travis-image]][travis-url]
[![downloads][downloads-image]][downloads-url]
[![npm-issues][npm-issues-image]][npm-issues-url]
[![js-standard-style][standard-image]][standard-url]
[travis-image]: https://travis-ci.org/Kautenja/parse-server-sendmail-template-adapter.svg?branch=master
[travis-url]: https://travis-ci.org/Kautenja/parse-server-sendmail-template-adapter
[standard-image]: https://img.shields.io/badge/code%20style-standard-brightgreen.svg
[standard-url]: http://standardjs.com/
[npm-image]: https://img.shields.io/npm/v/parse-server-sendmail-template-adapter.svg?style=flat
[npm-url]: https://npmjs.org/package/parse-server-sendmail-template-adapter
[downloads-image]: https://img.shields.io/npm/dt/parse-server-sendmail-template-adapter.svg?style=flat
[downloads-url]: https://npmjs.org/package/parse-server-sendmail-template-adapter
[npm-issues-image]: https://img.shields.io/github/issues/Kautenja/parse-server-sendmail-template-adapter.svg
[npm-issues-url]: https://github.com/Kautenja/parse-server-sendmail-template-adapter/issues
This mail adapter for
[Parse Server](https://github.com/parse-community/parse-server) uses
[sendmail](https://www.npmjs.com/package/sendmail) to send emails.
This code is inspired by
[parse-server-sendmail-adapter](https://www.npmjs.com/package/parse-server-sendmail-adapter).
## Installation ##
Put `parse-server-sendmail-template-adapter` in the dependencies object of
your project's `package.json` file.
```json
"dependencies":
{
"parse-server-sendmail-template-adapter": "^1.0.0"
}
```
## Usage ##
The initializer takes a single object parameter with the following fields:
| Field Name | Description |
|:-----------------------|--------------------------------------------------------------------------|
| `fromAddress` | the address to send from (e.g. _sender@example.com_) |
| `verificationSubject` | the subject for new account verification emails |
| `verificationBody` | the body for new account verification emails (inline text or a filename) |
| `passwordResetSubject` | the subject for password reset emails |
| `passwordResetBody` | the body for password reset emails (inline text or a filename) |
| `userFields` | the custom items from the user object to pull for use in templates |
`fromAddress` is the only required field to get setup.
### `emailAdapter` object ###

For readability I confine all the mail adapter configuration items for Parse
servers in their own object called `emailAdapter` that I then supply to the
server's constructor.
```javascript
var server = new ParseServer({
// sends a verification when users add and email to their account
verifyUserEmails: true,
// the amount of time they have to reset their password in seconds
emailVerifyTokenValidityDuration: 2 * 60 * 60,
// the settings for the mail adapter for the server
emailAdapter: emailAdapter
});
```
Here are some examples of how to set up the `emailAdapter` object using this
mail adapter with text files, HTML templates, or inline strings.
#### Simple example ####
This example uses the default verification and password reset subjects and
bodies.
```javascript
var emailAdapter =
{
module: require('parse-server-sendmail-template-adapter'),
options:
{
fromAddress: 'sender@example.com',
}
};
```
#### Example using file bodies and inline subjects ####
This example uses HTML files as templates for the body.
```javascript
var emailAdapter =
{
module: require('parse-server-sendmail-template-adapter'),
options:
{
fromAddress: 'sender@example.com',
verificationSubject: 'Confirm your %appname% account',
verificationBody: 'templates/inlined/verify_email.html',
passwordResetSubject: 'Reset your %appname% password',
passwordResetBody: 'templates/inlined/reset_password.html'
}
};
```
#### Example using inline fields ####
This example uses inline strings to define the body contents of the emails.
```javascript
var emailAdapter =
{
module: require('parse-server-sendmail-template-adapter'),
options:
{
fromAddress: 'sender@example.com',
verificationSubject: 'Confirm your %appname% account',
verificationBody: `Hi,
You are being asked to confirm the e-mail address %email% with %appname%
Click here to confirm it:
%link%`,
passwordResetSubject: 'Reset your %appname% password',
passwordResetBody: `Hi,
You requested a password reset for %appname%.
Click here to reset it:
%link%`
}
};
```
### Templates ###
The adapter uses templating to use generalized email formats to provide unique
information to users. By default there are 4 items that can be used in templates:
| Template Key | Description |
|:-------------|:------------------------------------------------------------------------|
| `%appname%` | The name of the app as it appears in the Parse Server configuration |
| `%link%` | The unique link to provide in the email for web app feature integration |
| `%username%` | The username of the user receiving an email |
| `%email%` | The email address of the user receiving an email |
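Under the hood, filling a template amounts to a search-and-replace over these `%`-delimited keys. The sketch below is illustrative only (it is not the adapter's actual implementation) and shows how a body template expands for one user:

```javascript
// Illustrative only: expand %key% placeholders from a map of values.
// Unknown keys are left untouched so typos are easy to spot in the output.
function fillTemplate(template, vars) {
  return template.replace(/%(\w+)%/g, (match, key) =>
    Object.prototype.hasOwnProperty.call(vars, key) ? vars[key] : match
  );
}

const body = 'Hi %username%, please confirm %email% for %appname%: %link%';
console.log(fillTemplate(body, {
  username: 'jsmith',
  email: 'jsmith@example.com',
  appname: 'MyApp',
  link: 'https://example.com/verify?token=abc123'
}));
// prints: Hi jsmith, please confirm jsmith@example.com for MyApp: https://example.com/verify?token=abc123
```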
#### Example using custom user fields in templates ####
This example uses custom fields from the user object in its templates. To
access a dynamic value listed in `userFields` in a template, surround its name
with `%` signs, i.e. if a user is sent an email using the following template,
the placeholders would be replaced with items from the user object.
* Hi %firstName% %lastName%! -> Hi Jacob Smith!
```javascript
var emailAdapter =
{
module: require('parse-server-sendmail-template-adapter'),
options:
{
fromAddress: '[email protected]',
userFields: ["firstName", "lastName"]
}
};
```
# FamPlex
*FamPlex* is a collection of resources for grounding biological entities
from text and describing their hierarchical relationships. Resources were
developed by manual curation for use by natural language processing and
biological modeling teams in the [DARPA Big
Mechanism](http://www.darpa.mil/program/big-mechanism) and [Communicating with
Computers](http://www.darpa.mil/program/communicating-with-computers) programs.
Note: FamPlex used to be called Bioentities, and was renamed to better reflect
the focus of the resource on protein families, complexes, and their lexical
synonyms.
The repository contains the following files:
* ```relations.csv```. Defines membership of specific genes/proteins in
families and protein complexes. For example, ```PIK3CA isa PIK3C```, where
PIK3C represents the class of catalytic subunits of PI3K; and ```PIK3C partof
PI3K```, where PI3K represents a named complex consisting of a catalytic and
regulatory subunit.
* ```equivalences.csv```. Defines mappings between outside namespaces and
the FamPlex namespace.
* ```entities.csv```. A registry of the families and complexes defined in the
FamPlex namespace.
* ```grounding_map.csv```. Explicit mapping of text strings to identifiers in
biological databases.
* ```gene_prefixes.csv```. Patterns of prefixes and suffixes on named entities.
* ```check_references.py```. A script to check the integrity and completeness
of the cross-references among the various files.
## Entities, Relations and Equivalences
*FamPlex* contains resources for defining the relationships between
genes/proteins and their membership in families and named complexes. Entities
defined within the FamPlex namespace are listed in the ```entities.csv```
file. Cross-referencing the entries among the various files maintains
consistency and prevents errors.
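As an illustration of the kind of cross-reference check that `check_references.py` can perform (the actual script's logic may differ; the function name below and the assumption that the first column of `entities.csv` holds the FamPlex identifier are ours):

```python
import csv

def check_fplx_references(entities_path="entities.csv", relations_path="relations.csv"):
    """Report FPLX identifiers used in relations.csv that are not
    registered in entities.csv (assumes the identifier is column 1)."""
    with open(entities_path, newline="") as f:
        registered = {row[0] for row in csv.reader(f) if row}
    missing = set()
    with open(relations_path, newline="") as f:
        for subj_ns, subj_id, rel, obj_ns, obj_id in csv.reader(f):
            # Only entities in the FamPlex namespace need to be registered.
            for ns, ident in ((subj_ns, subj_id), (obj_ns, obj_id)):
                if ns == "FPLX" and ident not in registered:
                    missing.add(ident)
    return missing
```

An empty return value means every FamPlex entity referenced in the relations file is registered.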
Relationships are defined in ```relations.csv``` as triples using two
relationships:
* ```isa```, denoting membership in a family;
* ```partof```, denoting membership in a protein complex.
These two relationships can be combined to capture complex hierarchical
relationships, including sub-families (families within families) and complexes
consisting of families of related subunits (e.g., PI3K, NF-kB).
The ```relations.csv``` file consists of five columns: (1) the namespace for
the subject (e.g., ```HGNC``` for gene names, ```UP``` for Uniprot, or
```FPLX``` for the FamPlex namespace), (2) the identifier for the subject,
(3) the relationship (```isa``` or ```partof```), (4) the namespace for the
object, and (5) the identifier for the object.
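A minimal sketch of consuming these triples in Python (the helper names are hypothetical; the column order follows the description above):

```python
import csv

def load_relations(path="relations.csv"):
    """Read relations.csv rows as ((subject_ns, subject_id), relation,
    (object_ns, object_id)) triples."""
    triples = []
    with open(path, newline="") as f:
        for subj_ns, subj_id, rel, obj_ns, obj_id in csv.reader(f):
            triples.append(((subj_ns, subj_id), rel, (obj_ns, obj_id)))
    return triples

def family_members(triples, family):
    """Return the direct 'isa' children of a family, e.g. ('FPLX', 'PIK3C')."""
    return [subj for subj, rel, obj in triples if rel == "isa" and obj == family]
```

For example, `family_members(triples, ('FPLX', 'PIK3C'))` would list the catalytic subunits recorded with `isa` links to the PIK3C family.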
The ```equivalences.csv``` file consists of three columns (1) the namespace of
an outsite entity (e.g. ```BEL```, ```PFAM```),
(2) the identifier of the outside entity in the namespace given in the
first column, and (3) the equivalent entity in the ```FPLX``` namespace.
## Grounding Map
Using mechanisms extracted from text mining to explain biological datasets
requires that the entities in text are correctly grounded to the canonical
names and IDs of genes, proteins, and chemicals. The problem is that simple
lookups based on string matching often fail, particularly for protein families
and named complexes, which appear frequently in text but lack corresponding
entries in databases.
The grounding map addresses this by providing explicit grounding for frequently
encountered entities in the biological literature. The text strings were drawn
from a corpus of roughly 32,000 papers focused on growth factor signaling in
cancer.
Entities are grounded to the following databases:
* Genes/proteins: [Uniprot](http://www.uniprot.org)
* Chemicals: [PubChem](https://pubchem.ncbi.nlm.nih.gov/),
[CHEBI](https://www.ebi.ac.uk/chebi/), and [HMDB](http://www.hmdb.ca/) (for
metabolites)
* Biological processes: [GO](http://geneontology.org/) and
[MeSH](http://www.ncbi.nlm.nih.gov/mesh)
* Protein families and named complexes: grounded to entities defined within
the FamPlex repository in the ```entities.csv``` and ```relations.csv```
files, and to identifiers in [PFAM](http://pfam.xfam.org/)
and [Interpro](https://www.ebi.ac.uk/interpro/) when possible.
## Gene prefixes
The file ```gene_prefixes.csv``` enumerates prefixes and suffixes frequently
appended to named entities. Some of these represent subtleties of experimental
context (for example, that a protein of interest was tagged with a fluorescent
protein in an experiment) that can safely be ignored when determining the logic
of a sentence. However, others carry essential meaning: for example, a sentence
describing the effect of 'AKT shRNA' on a downstream target has the opposite
meaning of a sentence involving 'AKT', because 'AKT shRNA' represents
inhibition of AKT by genetic silencing.
The patterns included in this file were found by manually reviewing 70,000
named entities extracted by the REACH parser from a corpus of roughly 32,000
papers focused on growth factor signaling.
**Important note: the prefixes/suffixes may be applied additively, for example
```Myr-Flag-Akt1```, indicating myristoylated, FLAG-tagged AKT1; or
```GFP-KRAS-G12V```, indicating GFP-tagged KRAS with a G12V mutation.**
The file contains three columns:
1. A case-sensitive pattern, e.g., ```mEGFP-{Gene name}```, where ```{Gene name}``` represents a protein/gene name.
2. A category, described below.
3. Notes: spelling out acronyms, etc.
The category of the prefix/suffix determines whether it can be stripped off
with minimal effect on the meaning, or whether it carries meaning that needs to
be incorporated by a parser. The categories are as follows:
* ```experimental context```. Protein tags, gene delivery techniques, etc. **Can
generally be ignored.**
* ```species```. Prefixes denoting human, mouse, primate, or mammalian versions
of a gene. **In most use cases can be ignored.**
* ```generic descriptor```. Additional words extracted by the entity recognizer
that might designate that an entity is a "protein", a "protease",
"transcription factor", etc. **In most use cases can be ignored.**
* ```mrna grounding```. In most cases, entities can be grounded to proteins; in
the case of ```{Gene name} mRNA```, the entity **must be explicitly grounded
as an mRNA.**
* ```protein state```. Designate activation state, post-translational
modification, cellular localization, etc. **Must be captured by the
parser.**
* ```inhibition```. Designate protein forms or interventions that represent an
inhibition of the protein, that is, a loss-of-function experiment. Have the
effect of switching the polarity of the extracted mechanism. For example, the
sentence "DUSP6 silencing leads to MAPK1 phosphorylation" indicates that DUSP6
**inhibits** MAPK1 phosphorylation. **Must be captured by the parser.**
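A sketch of how a parser might strip ignorable prefixes using patterns of the form found in ```gene_prefixes.csv``` (the function names and regex conversion are our own; in a real pipeline you would select only patterns whose category permits stripping, such as ```experimental context```):

```python
import re

def pattern_to_regex(pattern):
    """Turn a pattern such as 'mEGFP-{Gene name}' into a regex whose
    'gene' group captures the bare entity name."""
    escaped = re.escape(pattern)
    placeholder = re.escape("{Gene name}")
    return re.compile("^" + escaped.replace(placeholder, r"(?P<gene>\S+)") + "$")

def strip_ignorable(entity, patterns):
    """Repeatedly strip ignorable prefixes, handling additive patterns
    like 'Myr-Flag-Akt1' -> 'Akt1'."""
    changed = True
    while changed:
        changed = False
        for pattern in patterns:
            match = pattern_to_regex(pattern).match(entity)
            # Guard against patterns that match without shortening the name.
            if match and match.group("gene") != entity:
                entity = match.group("gene")
                changed = True
    return entity
```

Because prefixes may be applied additively, the loop keeps stripping until no pattern matches.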
## Contributing
Contributions are welcome! Please submit pull requests via the main
sorgerlab/famplex repository: https://github.com/sorgerlab/famplex
If making additions or revisions to the CSV files
take care to handle quotations and newlines correctly. This allows diffs to be
handled correctly so changes can be reviewed. Please submit updates via pull
requests on Github.
The CSV files in the FamPlex repo are set up to be edited natively using
Microsoft Excel. The CSV files in the repo have Windows line terminators
('\r\n'), and are not ragged (i.e., missing entries in a row are padded out
with empty strings to reach the full width of the longest row).
To preserve correct newlines, take the following steps:
1. If saving from Excel (Windows or Mac OS X), save to the "Windows Comma
Separated (.csv)" format.
2. If reading (or writing) the files using a Python script, use the following
   set of csv format parameters:
csvreader = csv.reader(f, delimiter=',', quotechar='"',
quoting=csv.QUOTE_MINIMAL, lineterminator='\r\n')
3. If editing the files on Linux, post-process files using ```unix2dos``` or a
similar program.
# Changes to PostCSS Colour
### 1.0.0 (September 19, 2019)
- Added: Initial version
---
title: 'Schnellstart: Erste Schritte mit der Textübersetzung'
titleSuffix: Azure Cognitive Services
description: Lernen Sie, wie Sie mit dem Textübersetzungsdienst Text übersetzen, Text transkribieren, die Sprache erkennen und mehr. Es stehen Beispiele in C#, Java, JavaScript und Python zur Verfügung.
services: cognitive-services
author: erhopf
manager: nitinme
ms.service: cognitive-services
ms.subservice: translator-text
ms.topic: quickstart
ms.date: 09/14/2020
ms.author: erhopf
ms.custom: cog-serv-seo-aug-2020
keywords: Textübersetzung, Übersetzerdienst, Text übersetzen, Text transkribieren, Sprachenerkennung
ms.openlocfilehash: 38bd4d28a8ae4c737155cd74bcb39d1acfaf699c
ms.sourcegitcommit: 78ecfbc831405e8d0f932c9aafcdf59589f81978
ms.translationtype: HT
ms.contentlocale: de-DE
ms.lasthandoff: 01/23/2021
ms.locfileid: "98733493"
---
# <a name="quickstart-get-started-with-translator"></a>Schnellstart: Erste Schritte mit der Textübersetzung
In dieser Schnellstartanleitung erfahren Sie, wie Sie den Textübersetzungsdienst mithilfe von REST verwenden. Sie beginnen mit einfachen Beispielen und fahren dann mit einigen Konfigurationsoptionen für den Kern fort, die bei der Entwicklung häufig verwendet werden, darunter:
* [Übersetzung](#translate-text)
* [Transliteration](#transliterate-text)
* [Sprachenerkennung](#detect-language)
* [Berechnen der Satzlänge](#get-sentence-length)
* [Abrufen alternativer Übersetzungen](#dictionary-lookup-alternate-translations) und [Beispiele für die Verwendung von Wörtern in einem Satz](#dictionary-examples-translations-in-context)
## <a name="prerequisites"></a>Voraussetzungen
* Azure-Abonnement – [Erstellen eines kostenlosen Kontos](https://azure.microsoft.com/free/cognitive-services/)
* Sobald Sie über ein Azure-Abonnement verfügen, erstellen Sie eine [Textübersetzungsressource](https://ms.portal.azure.com/#create/Microsoft.CognitiveServicesTextTranslation) im Azure-Portal, um Ihren Schlüssel und Endpunkt zu erhalten. Wählen Sie nach Abschluss der Bereitstellung **Zu Ressource wechseln** aus.
* Sie benötigen den Schlüssel und Endpunkt der Ressource, um Ihre Anwendung mit dem Textübersetzungsdienst zu verbinden. Der Schlüssel und der Endpunkt werden weiter unten in der Schnellstartanleitung in den Code eingefügt.
* Sie können den kostenlosen Tarif (F0) verwenden, um den Dienst zu testen, und später für die Produktion auf einen kostenpflichtigen Tarif upgraden.
## <a name="platform-setup"></a>Plattformeinrichtung
# <a name="c"></a>[C#](#tab/csharp)
* Erstellen Sie ein neues Projekt: `dotnet new console -o your_project_name`
* Ersetzen Sie den Code in „Program.cs“ durch den unten abgebildeten C#-Code.
* Legen Sie die Werte von Abonnementschlüssel und Endpunkt in „Program.cs“ fest.
* [Fügen Sie „Newtonsoft.Json“ mithilfe der .NET-CLI hinzu](https://www.nuget.org/packages/Newtonsoft.Json/).
* Führen Sie das Programm aus dem Projektverzeichnis aus: ``dotnet run``
# <a name="go"></a>[Go](#tab/go)
* Erstellen Sie ein neues Go-Projekt in Ihrem bevorzugten Code-Editor.
* Fügen Sie den unten stehenden Code hinzu.
* Ersetzen Sie den `subscriptionKey`-Wert durch einen für Ihr Abonnement gültigen Zugriffsschlüssel.
* Speichern Sie die Datei mit der Erweiterung „.go“.
* Öffnen Sie auf einem Computer, auf dem Go installiert ist, eine Eingabeaufforderung.
* Erstellen Sie die Datei, beispielsweise mit „go build example-code.go“.
* Führen Sie die Datei aus, beispielsweise mit „example-code“.
# <a name="java"></a>[Java](#tab/java)
* Create a working directory for your project. For example: `mkdir sample-project`.
* Initialize your project with Gradle: `gradle init --type basic`. When prompted to choose a **DSL**, select **Kotlin**.
* Update `build.gradle.kts`. Keep in mind that you'll need to update your `mainClassName` to match the sample.
```java
plugins {
java
application
}
application {
mainClassName = "<NAME OF YOUR CLASS>"
}
repositories {
mavenCentral()
}
dependencies {
compile("com.squareup.okhttp:okhttp:2.5.0")
compile("com.google.code.gson:gson:2.8.5")
}
```
* Create a Java file and copy in the code from the provided sample. Don't forget to add your subscription key.
* Run the sample: `gradle run`.
# <a name="nodejs"></a>[Node.js](#tab/nodejs)
* Create a new project in your favorite IDE or editor.
* Copy the code from one of the samples into your project.
* Set your subscription key.
* Run the program. For example: `node Translate.js`.

# <a name="python"></a>[Python](#tab/python)

* Create a new project in your favorite IDE or editor.
* Copy the code from one of the samples into your project.
* Set your subscription key.
* Run the program. For example: `python translate.py`.
---
## <a name="headers"></a>Header
Beim Aufrufen des Textübersetzungsdiensts über REST müssen Sie sicherstellen, dass jede Anforderung die folgenden Header enthält. Keine Sorge, wir fügen die Header in den folgenden Abschnitten in den Beispielcode ein.
<table width="100%">
<th width="20%">Header</th>
<th>BESCHREIBUNG</th>
<tr>
<td>Authentifizierungsheader</td>
<td><em>Erforderlicher Anforderungsheader</em>.<br/><code>Ocp-Apim-Subscription-Key</code><br/><br/><em>Erforderlicher Anforderungsheader bei Verwendung einer Cognitive Services-Ressource. Optional, wenn eine Textübersetzungsressource verwendet wird.</em>.<br/><code>Ocp-Apim-Subscription-Region</code><br/><br/>Weitere Informationen finden Sie in den <a href="/azure/cognitive-services/translator/reference/v3-0-reference#authentication">verfügbaren Optionen für die Authentifizierung</a>.</td>
</tr>
<tr>
<td>Content-Type</td>
<td><em>Erforderlicher Anforderungsheader</em>.<br/>Gibt den Inhaltstyp der Nutzlast an.<br/> Der zulässige Wert ist <code>application/json; charset=UTF-8</code>.</td>
</tr>
<tr>
<td>Content-Length</td>
<td><em>Erforderlicher Anforderungsheader</em>.<br/>Die Länge des Anforderungstexts.</td>
</tr>
<tr>
<td>X-ClientTraceId</td>
<td><em>Optional:</em><br/>Eine vom Client erstellte GUID zur eindeutigen Identifizierung der Anforderung. Sie können diesen Header nur weglassen, wenn Sie die Ablaufverfolgungs-ID in die Abfragezeichenfolge über einen Abfrageparameter namens <code>ClientTraceId</code> einschließen.</td>
</tr>
</table>
## <a name="keys-and-endpoints"></a>Keys and endpoints (Schlüssel und Endpunkte)
In den Beispielen auf dieser Seite werden aus Gründen der Einfachheit hartcodierte Schlüssel und Endpunkte verwendet. Denken Sie daran, **den Schlüssel aus Ihrem Code zu entfernen, wenn Sie fertig sind**, und ihn **niemals zu veröffentlichen**. In der Produktionsumgebung sollten Sie eine sichere Methode zum Speichern Ihrer Anmeldeinformationen sowie zum Zugriff darauf verwenden. Weitere Informationen finden Sie im Cognitive Services-Artikel zur [Sicherheit](../cognitive-services-security.md).
## <a name="translate-text"></a>Text übersetzen
Der Kernvorgang des Textübersetzungsdiensts besteht im Übersetzen von Text. In diesem Abschnitt erstellen Sie eine Anforderung, die eine einzelne Quelle (`from`) annimmt und zwei Ausgaben (`to`) zur Verfügung stellt. Anschließend überprüfen wir einige Parameter, die verwendet werden können, um sowohl die Anforderung als auch die Antwort anzupassen.
# <a name="c"></a>[C#](#tab/csharp)
```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json; // Install Newtonsoft.Json with NuGet
class Program
{
private static readonly string subscriptionKey = "YOUR-SUBSCRIPTION-KEY";
private static readonly string endpoint = "https://api.cognitive.microsofttranslator.com/";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
private static readonly string location = "YOUR_RESOURCE_LOCATION";
static async Task Main(string[] args)
{
// Input and output languages are defined as parameters.
string route = "/translate?api-version=3.0&from=en&to=de&to=it";
string textToTranslate = "Hello, world!";
object[] body = new object[] { new { Text = textToTranslate } };
var requestBody = JsonConvert.SerializeObject(body);
using (var client = new HttpClient())
using (var request = new HttpRequestMessage())
{
// Build the request.
request.Method = HttpMethod.Post;
request.RequestUri = new Uri(endpoint + route);
request.Content = new StringContent(requestBody, Encoding.UTF8, "application/json");
request.Headers.Add("Ocp-Apim-Subscription-Key", subscriptionKey);
request.Headers.Add("Ocp-Apim-Subscription-Region", location);
// Send the request and get response.
HttpResponseMessage response = await client.SendAsync(request).ConfigureAwait(false);
// Read response as a string.
string result = await response.Content.ReadAsStringAsync();
Console.WriteLine(result);
}
}
}
```
# <a name="go"></a>[Go](#tab/go)
```go
package main
import (
"bytes"
"encoding/json"
"fmt"
"log"
"net/http"
"net/url"
)
func main() {
subscriptionKey := "YOUR-SUBSCRIPTION-KEY"
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
location := "YOUR_RESOURCE_LOCATION";
endpoint := "https://api.cognitive.microsofttranslator.com/"
uri := endpoint + "/translate?api-version=3.0"
// Build the request URL. See: https://golang.org/pkg/net/url/#example_URL_Parse
u, _ := url.Parse(uri)
q := u.Query()
q.Add("from", "en")
q.Add("to", "de")
q.Add("to", "it")
u.RawQuery = q.Encode()
// Create an anonymous struct for your request body and encode it to JSON
body := []struct {
Text string
}{
{Text: "Hello, world!"},
}
b, _ := json.Marshal(body)
// Build the HTTP POST request
req, err := http.NewRequest("POST", u.String(), bytes.NewBuffer(b))
if err != nil {
log.Fatal(err)
}
// Add required headers to the request
req.Header.Add("Ocp-Apim-Subscription-Key", subscriptionKey)
req.Header.Add("Ocp-Apim-Subscription-Region", location)
req.Header.Add("Content-Type", "application/json")
// Call the Translator API
res, err := http.DefaultClient.Do(req)
if err != nil {
log.Fatal(err)
}
// Decode the JSON response
var result interface{}
if err := json.NewDecoder(res.Body).Decode(&result); err != nil {
log.Fatal(err)
}
// Format and print the response to terminal
prettyJSON, _ := json.MarshalIndent(result, "", " ")
fmt.Printf("%s\n", prettyJSON)
}
```
# <a name="java"></a>[Java](#tab/java)
```java
import java.io.*;
import java.net.*;
import java.util.*;
import com.google.gson.*;
import com.squareup.okhttp.*;
public class Translate {
private static String subscriptionKey = "YOUR_SUBSCRIPTION_KEY";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
private static String location = "YOUR_RESOURCE_LOCATION";
HttpUrl url = new HttpUrl.Builder()
.scheme("https")
.host("api.cognitive.microsofttranslator.com")
.addPathSegment("/translate")
.addQueryParameter("api-version", "3.0")
.addQueryParameter("from", "en")
.addQueryParameter("to", "de")
.addQueryParameter("to", "it")
.build();
// Instantiates the OkHttpClient.
OkHttpClient client = new OkHttpClient();
// This function performs a POST request.
public String Post() throws IOException {
MediaType mediaType = MediaType.parse("application/json");
RequestBody body = RequestBody.create(mediaType,
"[{\"Text\": \"Hello World!\"}]");
Request request = new Request.Builder().url(url).post(body)
.addHeader("Ocp-Apim-Subscription-Key", subscriptionKey)
.addHeader("Ocp-Apim-Subscription-Key", location)
.addHeader("Content-type", "application/json")
.build();
Response response = client.newCall(request).execute();
return response.body().string();
}
// This function prettifies the json response.
public static String prettify(String json_text) {
JsonParser parser = new JsonParser();
JsonElement json = parser.parse(json_text);
Gson gson = new GsonBuilder().setPrettyPrinting().create();
return gson.toJson(json);
}
public static void main(String[] args) {
try {
Translate translateRequest = new Translate();
String response = translateRequest.Post();
System.out.println(prettify(response));
} catch (Exception e) {
System.out.println(e);
}
}
}
```
# <a name="nodejs"></a>[Node.js](#tab/nodejs)
```Javascript
const axios = require('axios').default;
const { v4: uuidv4 } = require('uuid');
var subscriptionKey = "YOUR_SUBSCRIPTION_KEY";
var endpoint = "https://api.cognitive.microsofttranslator.com";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
var location = "YOUR_RESOURCE_LOCATION";
axios({
baseURL: endpoint,
url: '/translate',
method: 'post',
headers: {
'Ocp-Apim-Subscription-Key': subscriptionKey,
'Ocp-Apim-Subscription-Region': location,
'Content-type': 'application/json',
'X-ClientTraceId': uuidv4().toString()
},
params: {
'api-version': '3.0',
'from': 'en',
'to': ['de', 'it']
},
data: [{
'text': 'Hello World!'
}],
responseType: 'json'
}).then(function(response){
console.log(JSON.stringify(response.data, null, 4));
})
```
# <a name="python"></a>[Python](#tab/python)
```python
import requests, uuid, json
# Add your subscription key and endpoint
subscription_key = "YOUR_SUBSCRIPTION_KEY"
endpoint = "https://api.cognitive.microsofttranslator.com"
# Add your location, also known as region. The default is global.
# This is required if using a Cognitive Services resource.
location = "YOUR_RESOURCE_LOCATION"
path = '/translate'
constructed_url = endpoint + path
params = {
'api-version': '3.0',
'from': 'en',
'to': ['de', 'it']
}
constructed_url = endpoint + path
headers = {
'Ocp-Apim-Subscription-Key': subscription_key,
'Ocp-Apim-Subscription-Region': location,
'Content-type': 'application/json',
'X-ClientTraceId': str(uuid.uuid4())
}
# You can pass more than one object in body.
body = [{
'text': 'Hello World!'
}]
request = requests.post(constructed_url, params=params, headers=headers, json=body)
response = request.json()
print(json.dumps(response, sort_keys=True, ensure_ascii=False, indent=4, separators=(',', ': ')))
```
---
After a successful call, you should see the following response:
```JSON
[
{
"translations": [
{
"text": "Hallo Welt!",
"to": "de"
},
{
"text": "Salve, mondo!",
"to": "it"
}
]
}
]
```
## <a name="detect-language"></a>Sprache erkennen
Wenn Sie wissen, dass Sie eine Übersetzung benötigen, die Sprache des Texts aber nicht kennen, der an den Textübersetzungsdienst gesendet werden soll, können Sie den Sprachenerkennungsvorgang verwenden. Es gibt mehr als eine Möglichkeit, die Sprache des Ausgangstexts zu bestimmen. In diesem Abschnitt erfahren Sie, wie Sie die Sprachenerkennung mithilfe des `translate`-Endpunkts und des `detect`-Endpunkts verwenden.
### <a name="detect-source-language-during-translation"></a>Erkennen der Quellsprache während der Übersetzung
Wenn Sie den `from`-Parameter nicht in Ihre Übersetzungsanforderung aufnehmen, versucht der Textübersetzungsdienst, die Sprache des Ausgangstexts zu erkennen. In der Antwort erhalten Sie die erkannte Sprache (`language`) und eine Vertrauensbewertung (`score`). Je näher `score` an `1.0` liegt, desto zuverlässiger wurde die Sprache richtig erkannt.
# <a name="c"></a>[C#](#tab/csharp)
```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json; // Install Newtonsoft.Json with NuGet
class Program
{
private static readonly string subscriptionKey = "YOUR-SUBSCRIPTION-KEY";
private static readonly string endpoint = "https://api.cognitive.microsofttranslator.com/";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
private static readonly string location = "YOUR_RESOURCE_LOCATION";
static async Task Main(string[] args)
{
// Output languages are defined as parameters, input language detected.
string route = "/translate?api-version=3.0&to=de&to=it";
string textToTranslate = "Hello, world!";
object[] body = new object[] { new { Text = textToTranslate } };
var requestBody = JsonConvert.SerializeObject(body);
using (var client = new HttpClient())
using (var request = new HttpRequestMessage())
{
// Build the request.
request.Method = HttpMethod.Post;
request.RequestUri = new Uri(endpoint + route);
request.Content = new StringContent(requestBody, Encoding.UTF8, "application/json");
request.Headers.Add("Ocp-Apim-Subscription-Key", subscriptionKey);
request.Headers.Add("Ocp-Apim-Subscription-Region", location);
// Send the request and get response.
HttpResponseMessage response = await client.SendAsync(request).ConfigureAwait(false);
// Read response as a string.
string result = await response.Content.ReadAsStringAsync();
Console.WriteLine(result);
}
}
}
```
# <a name="go"></a>[Go](#tab/go)
```go
package main
import (
"bytes"
"encoding/json"
"fmt"
"log"
"net/http"
"net/url"
)
func main() {
subscriptionKey := "YOUR-SUBSCRIPTION-KEY"
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
location := "YOUR_RESOURCE_LOCATION";
endpoint := "https://api.cognitive.microsofttranslator.com/"
uri := endpoint + "/translate?api-version=3.0"
// Build the request URL. See: https://golang.org/pkg/net/url/#example_URL_Parse
u, _ := url.Parse(uri)
q := u.Query()
q.Add("to", "de")
q.Add("to", "it")
u.RawQuery = q.Encode()
// Create an anonymous struct for your request body and encode it to JSON
body := []struct {
Text string
}{
{Text: "Hello, world!"},
}
b, _ := json.Marshal(body)
// Build the HTTP POST request
req, err := http.NewRequest("POST", u.String(), bytes.NewBuffer(b))
if err != nil {
log.Fatal(err)
}
// Add required headers to the request
req.Header.Add("Ocp-Apim-Subscription-Key", subscriptionKey)
req.Header.Add("Ocp-Apim-Subscription-Region", location)
req.Header.Add("Content-Type", "application/json")
// Call the Translator API
res, err := http.DefaultClient.Do(req)
if err != nil {
log.Fatal(err)
}
// Decode the JSON response
var result interface{}
if err := json.NewDecoder(res.Body).Decode(&result); err != nil {
log.Fatal(err)
}
// Format and print the response to terminal
prettyJSON, _ := json.MarshalIndent(result, "", " ")
fmt.Printf("%s\n", prettyJSON)
}
```
# <a name="java"></a>[Java](#tab/java)
```java
import java.io.*;
import java.net.*;
import java.util.*;
import com.google.gson.*;
import com.squareup.okhttp.*;
public class Translate {
private static String subscriptionKey = "YOUR_SUBSCRIPTION_KEY";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
private static String location = "YOUR_RESOURCE_LOCATION";
HttpUrl url = new HttpUrl.Builder()
.scheme("https")
.host("api.cognitive.microsofttranslator.com")
.addPathSegment("/translate")
.addQueryParameter("api-version", "3.0")
.addQueryParameter("to", "de")
.addQueryParameter("to", "it")
.build();
// Instantiates the OkHttpClient.
OkHttpClient client = new OkHttpClient();
// This function performs a POST request.
public String Post() throws IOException {
MediaType mediaType = MediaType.parse("application/json");
RequestBody body = RequestBody.create(mediaType,
"[{\"Text\": \"Hello World!\"}]");
Request request = new Request.Builder().url(url).post(body)
.addHeader("Ocp-Apim-Subscription-Key", subscriptionKey)
.addHeader("Ocp-Apim-Subscription-Key", location)
.addHeader("Content-type", "application/json")
.build();
Response response = client.newCall(request).execute();
return response.body().string();
}
// This function prettifies the json response.
public static String prettify(String json_text) {
JsonParser parser = new JsonParser();
JsonElement json = parser.parse(json_text);
Gson gson = new GsonBuilder().setPrettyPrinting().create();
return gson.toJson(json);
}
public static void main(String[] args) {
try {
Translate translateRequest = new Translate();
String response = translateRequest.Post();
System.out.println(prettify(response));
} catch (Exception e) {
System.out.println(e);
}
}
}
```
# <a name="nodejs"></a>[Node.js](#tab/nodejs)
```javascript
const axios = require('axios').default;
const { v4: uuidv4 } = require('uuid');
var subscriptionKey = "YOUR_SUBSCRIPTION_KEY";
var endpoint = "https://api.cognitive.microsofttranslator.com";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
var location = "YOUR_RESOURCE_LOCATION";
axios({
baseURL: endpoint,
url: '/translate',
method: 'post',
headers: {
'Ocp-Apim-Subscription-Key': subscriptionKey,
'Ocp-Apim-Subscription-Region': location,
'Content-type': 'application/json',
'X-ClientTraceId': uuidv4().toString()
},
params: {
'api-version': '3.0',
'to': ['de', 'it']
},
data: [{
'text': 'Hello World!'
}],
responseType: 'json'
}).then(function(response){
console.log(JSON.stringify(response.data, null, 4));
})
```
# <a name="python"></a>[Python](#tab/python)
```python
import requests, uuid, json
# Add your subscription key and endpoint
subscription_key = "YOUR_SUBSCRIPTION_KEY"
endpoint = "https://api.cognitive.microsofttranslator.com"
# Add your location, also known as region. The default is global.
# This is required if using a Cognitive Services resource.
location = "YOUR_RESOURCE_LOCATION"
path = '/translate'
constructed_url = endpoint + path
params = {
'api-version': '3.0',
'to': ['de', 'it']
}
constructed_url = endpoint + path
headers = {
'Ocp-Apim-Subscription-Key': subscription_key,
'Ocp-Apim-Subscription-Region': location,
'Content-type': 'application/json',
'X-ClientTraceId': str(uuid.uuid4())
}
# You can pass more than one object in body.
body = [{
'text': 'Hello World!'
}]
request = requests.post(constructed_url, params=params, headers=headers, json=body)
response = request.json()
print(json.dumps(response, sort_keys=True, ensure_ascii=False, indent=4, separators=(',', ': ')))
```
---
After a successful call, you should see the following response:
```json
[
{
"detectedLanguage": {
"language": "en",
"score": 1.0
},
"translations": [
{
"text": "Hallo Welt!",
"to": "de"
},
{
"text": "Salve, mondo!",
"to": "it"
}
]
}
]
```
### <a name="detect-source-language-without-translation"></a>Detect source language without translation
You can use the Translator service to detect the language of source text without performing a translation. To do this, use the [`/detect`](./reference/v3-0-detect.md) endpoint.
# <a name="c"></a>[C#](#tab/csharp)
```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json; // Install Newtonsoft.Json with NuGet
class Program
{
private static readonly string subscriptionKey = "YOUR-SUBSCRIPTION-KEY";
private static readonly string endpoint = "https://api.cognitive.microsofttranslator.com/";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
private static readonly string location = "YOUR_RESOURCE_LOCATION";
static async Task Main(string[] args)
{
// Just detect language
string route = "/detect?api-version=3.0";
string textToLangDetect = "Ich würde wirklich gern Ihr Auto um den Block fahren ein paar Mal.";
object[] body = new object[] { new { Text = textToLangDetect } };
var requestBody = JsonConvert.SerializeObject(body);
using (var client = new HttpClient())
using (var request = new HttpRequestMessage())
{
// Build the request.
request.Method = HttpMethod.Post;
request.RequestUri = new Uri(endpoint + route);
request.Content = new StringContent(requestBody, Encoding.UTF8, "application/json");
request.Headers.Add("Ocp-Apim-Subscription-Key", subscriptionKey);
request.Headers.Add("Ocp-Apim-Subscription-Region", location);
// Send the request and get response.
HttpResponseMessage response = await client.SendAsync(request).ConfigureAwait(false);
// Read response as a string.
string result = await response.Content.ReadAsStringAsync();
Console.WriteLine(result);
}
}
}
```
# <a name="go"></a>[Go](#tab/go)
```go
package main
import (
"bytes"
"encoding/json"
"fmt"
"log"
"net/http"
"net/url"
)
func main() {
subscriptionKey := "YOUR-SUBSCRIPTION-KEY"
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
location := "YOUR_RESOURCE_LOCATION";
endpoint := "https://api.cognitive.microsofttranslator.com/"
uri := endpoint + "/detect?api-version=3.0"
// Build the request URL. See: https://golang.org/pkg/net/url/#example_URL_Parse
u, _ := url.Parse(uri)
q := u.Query()
u.RawQuery = q.Encode()
// Create an anonymous struct for your request body and encode it to JSON
body := []struct {
Text string
}{
{Text: "Ich würde wirklich gern Ihr Auto um den Block fahren ein paar Mal."},
}
b, _ := json.Marshal(body)
// Build the HTTP POST request
req, err := http.NewRequest("POST", u.String(), bytes.NewBuffer(b))
if err != nil {
log.Fatal(err)
}
// Add required headers to the request
req.Header.Add("Ocp-Apim-Subscription-Key", subscriptionKey)
req.Header.Add("Ocp-Apim-Subscription-Region", location)
req.Header.Add("Content-Type", "application/json")
// Call the Translator API
res, err := http.DefaultClient.Do(req)
if err != nil {
log.Fatal(err)
}
// Decode the JSON response
var result interface{}
if err := json.NewDecoder(res.Body).Decode(&result); err != nil {
log.Fatal(err)
}
// Format and print the response to terminal
prettyJSON, _ := json.MarshalIndent(result, "", " ")
fmt.Printf("%s\n", prettyJSON)
}
```
# <a name="java"></a>[Java](#tab/java)
```java
import java.io.*;
import java.net.*;
import java.util.*;
import com.google.gson.*;
import com.squareup.okhttp.*;
public class Detect {
private static String subscriptionKey = "YOUR_SUBSCRIPTION_KEY";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
private static String location = "YOUR_RESOURCE_LOCATION";
HttpUrl url = new HttpUrl.Builder()
.scheme("https")
.host("api.cognitive.microsofttranslator.com")
.addPathSegment("/detect")
.addQueryParameter("api-version", "3.0")
.build();
// Instantiates the OkHttpClient.
OkHttpClient client = new OkHttpClient();
// This function performs a POST request.
public String Post() throws IOException {
MediaType mediaType = MediaType.parse("application/json");
RequestBody body = RequestBody.create(mediaType,
"[{\"Text\": \"Ich würde wirklich gern Ihr Auto um den Block fahren ein paar Mal.\"}]");
Request request = new Request.Builder().url(url).post(body)
.addHeader("Ocp-Apim-Subscription-Key", subscriptionKey)
                .addHeader("Ocp-Apim-Subscription-Region", location)
.addHeader("Content-type", "application/json")
.build();
Response response = client.newCall(request).execute();
return response.body().string();
}
// This function prettifies the json response.
public static String prettify(String json_text) {
JsonParser parser = new JsonParser();
JsonElement json = parser.parse(json_text);
Gson gson = new GsonBuilder().setPrettyPrinting().create();
return gson.toJson(json);
}
public static void main(String[] args) {
try {
Detect detectRequest = new Detect();
String response = detectRequest.Post();
System.out.println(prettify(response));
} catch (Exception e) {
System.out.println(e);
}
}
}
```
# <a name="nodejs"></a>[Node.js](#tab/nodejs)
```javascript
const axios = require('axios').default;
const { v4: uuidv4 } = require('uuid');
var subscriptionKey = "YOUR_SUBSCRIPTION_KEY";
var endpoint = "https://api.cognitive.microsofttranslator.com";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
var location = "YOUR_RESOURCE_LOCATION";
axios({
baseURL: endpoint,
url: '/detect',
method: 'post',
headers: {
'Ocp-Apim-Subscription-Key': subscriptionKey,
'Ocp-Apim-Subscription-Region': location,
'Content-type': 'application/json',
'X-ClientTraceId': uuidv4().toString()
},
params: {
'api-version': '3.0'
},
data: [{
'text': 'Ich würde wirklich gern Ihr Auto um den Block fahren ein paar Mal.'
}],
responseType: 'json'
}).then(function(response){
console.log(JSON.stringify(response.data, null, 4));
})
```
# <a name="python"></a>[Python](#tab/python)
```python
import requests, uuid, json
# Add your subscription key and endpoint
subscription_key = "YOUR_SUBSCRIPTION_KEY"
endpoint = "https://api.cognitive.microsofttranslator.com"
# Add your location, also known as region. The default is global.
# This is required if using a Cognitive Services resource.
location = "YOUR_RESOURCE_LOCATION"
path = '/detect'
constructed_url = endpoint + path
params = {
'api-version': '3.0'
}
headers = {
'Ocp-Apim-Subscription-Key': subscription_key,
'Ocp-Apim-Subscription-Region': location,
'Content-type': 'application/json',
'X-ClientTraceId': str(uuid.uuid4())
}
# You can pass more than one object in body.
body = [{
'text': 'Ich würde wirklich gern Ihr Auto um den Block fahren ein paar Mal.'
}]
request = requests.post(constructed_url, params=params, headers=headers, json=body)
response = request.json()
print(json.dumps(response, sort_keys=True, ensure_ascii=False, indent=4, separators=(',', ': ')))
```
---
When you use the `/detect` endpoint, the response includes alternative detected languages, and tells you whether translation and transliteration are supported for each detected language. After a successful call, you should see the following response:
```json
[
{
"alternatives": [
{
"isTranslationSupported": true,
"isTransliterationSupported": false,
"language": "nl",
"score": 0.92
},
{
"isTranslationSupported": true,
"isTransliterationSupported": false,
"language": "sk",
"score": 0.77
}
],
"isTranslationSupported": true,
"isTransliterationSupported": false,
"language": "de",
"score": 1.0
}
]
```
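If your application only needs a single best guess, you can merge the primary detection with its alternatives and rank all candidates by score. The following sketch (a hypothetical helper, working against the sample response above rather than a live call) shows one approach:

```python
import json

# Sample body matching the /detect response shown above
sample = """
[
    {
        "alternatives": [
            {"isTranslationSupported": true, "isTransliterationSupported": false, "language": "nl", "score": 0.92},
            {"isTranslationSupported": true, "isTransliterationSupported": false, "language": "sk", "score": 0.77}
        ],
        "isTranslationSupported": true,
        "isTransliterationSupported": false,
        "language": "de",
        "score": 1.0
    }
]
"""

detection = json.loads(sample)[0]

# Collect every candidate (primary plus alternatives), then rank by confidence
candidates = [(detection["language"], detection["score"])]
candidates += [(a["language"], a["score"]) for a in detection.get("alternatives", [])]
candidates.sort(key=lambda pair: pair[1], reverse=True)

best_language, best_score = candidates[0]
print(best_language, best_score)  # de 1.0
```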
## <a name="transliterate-text"></a>Transliterate text
Transliteration is the process of converting a word or phrase from the script (alphabet) of one language to another, based on phonetic similarity. For example, you can use transliteration to convert "สวัสดี" (`thai`) to "sawatdi" (`latn`). There's more than one way to perform transliteration. In this section, you learn how to use language detection with the `translate` endpoint and the `transliterate` endpoint.
### <a name="transliterate-during-translation"></a>Transliterate during translation
If you're translating into a language that uses a different alphabet (or phonemes) than your source, you might need a transliteration. In this example, we translate "Hello" from English to Thai. In addition to getting the translation in Thai, you get a transliteration of the translated phrase using the Latin alphabet.
To get a transliteration from the `translate` endpoint, use the `toScript` parameter.
> [!NOTE]
> For a complete list of available languages and transliteration options, see [language support](language-support.md).
# <a name="c"></a>[C#](#tab/csharp)
```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json; // Install Newtonsoft.Json with NuGet
class Program
{
private static readonly string subscriptionKey = "YOUR-SUBSCRIPTION-KEY";
private static readonly string endpoint = "https://api.cognitive.microsofttranslator.com/";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
private static readonly string location = "YOUR_RESOURCE_LOCATION";
static async Task Main(string[] args)
{
// Output language defined as parameter, with toScript set to latn
string route = "/translate?api-version=3.0&to=th&toScript=latn";
string textToTransliterate = "Hello";
object[] body = new object[] { new { Text = textToTransliterate } };
var requestBody = JsonConvert.SerializeObject(body);
using (var client = new HttpClient())
using (var request = new HttpRequestMessage())
{
// Build the request.
request.Method = HttpMethod.Post;
request.RequestUri = new Uri(endpoint + route);
request.Content = new StringContent(requestBody, Encoding.UTF8, "application/json");
request.Headers.Add("Ocp-Apim-Subscription-Key", subscriptionKey);
request.Headers.Add("Ocp-Apim-Subscription-Region", location);
// Send the request and get response.
HttpResponseMessage response = await client.SendAsync(request).ConfigureAwait(false);
// Read response as a string.
string result = await response.Content.ReadAsStringAsync();
Console.WriteLine(result);
}
}
}
```
# <a name="go"></a>[Go](#tab/go)
```go
package main
import (
"bytes"
"encoding/json"
"fmt"
"log"
"net/http"
"net/url"
)
func main() {
subscriptionKey := "YOUR-SUBSCRIPTION-KEY"
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
location := "YOUR_RESOURCE_LOCATION";
endpoint := "https://api.cognitive.microsofttranslator.com/"
uri := endpoint + "/translate?api-version=3.0"
// Build the request URL. See: https://golang.org/pkg/net/url/#example_URL_Parse
u, _ := url.Parse(uri)
q := u.Query()
q.Add("to", "th")
q.Add("toScript", "latn")
u.RawQuery = q.Encode()
// Create an anonymous struct for your request body and encode it to JSON
body := []struct {
Text string
}{
{Text: "Hello"},
}
b, _ := json.Marshal(body)
// Build the HTTP POST request
req, err := http.NewRequest("POST", u.String(), bytes.NewBuffer(b))
if err != nil {
log.Fatal(err)
}
// Add required headers to the request
req.Header.Add("Ocp-Apim-Subscription-Key", subscriptionKey)
req.Header.Add("Ocp-Apim-Subscription-Region", location)
req.Header.Add("Content-Type", "application/json")
// Call the Translator API
res, err := http.DefaultClient.Do(req)
if err != nil {
log.Fatal(err)
}
// Decode the JSON response
var result interface{}
if err := json.NewDecoder(res.Body).Decode(&result); err != nil {
log.Fatal(err)
}
// Format and print the response to terminal
prettyJSON, _ := json.MarshalIndent(result, "", " ")
fmt.Printf("%s\n", prettyJSON)
}
```
# <a name="java"></a>[Java](#tab/java)
```java
import java.io.*;
import java.net.*;
import java.util.*;
import com.google.gson.*;
import com.squareup.okhttp.*;
public class Translate {
private static String subscriptionKey = "YOUR_SUBSCRIPTION_KEY";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
private static String location = "YOUR_RESOURCE_LOCATION";
HttpUrl url = new HttpUrl.Builder()
.scheme("https")
.host("api.cognitive.microsofttranslator.com")
.addPathSegment("/translate")
.addQueryParameter("api-version", "3.0")
.addQueryParameter("to", "th")
.addQueryParameter("toScript", "latn")
.build();
// Instantiates the OkHttpClient.
OkHttpClient client = new OkHttpClient();
// This function performs a POST request.
public String Post() throws IOException {
MediaType mediaType = MediaType.parse("application/json");
RequestBody body = RequestBody.create(mediaType,
"[{\"Text\": \"Hello\"}]");
Request request = new Request.Builder().url(url).post(body)
.addHeader("Ocp-Apim-Subscription-Key", subscriptionKey)
                .addHeader("Ocp-Apim-Subscription-Region", location)
.addHeader("Content-type", "application/json")
.build();
Response response = client.newCall(request).execute();
return response.body().string();
}
// This function prettifies the json response.
public static String prettify(String json_text) {
JsonParser parser = new JsonParser();
JsonElement json = parser.parse(json_text);
Gson gson = new GsonBuilder().setPrettyPrinting().create();
return gson.toJson(json);
}
public static void main(String[] args) {
try {
Translate translateRequest = new Translate();
String response = translateRequest.Post();
System.out.println(prettify(response));
} catch (Exception e) {
System.out.println(e);
}
}
}
```
# <a name="nodejs"></a>[Node.js](#tab/nodejs)
```javascript
const axios = require('axios').default;
const { v4: uuidv4 } = require('uuid');
var subscriptionKey = "YOUR_SUBSCRIPTION_KEY";
var endpoint = "https://api.cognitive.microsofttranslator.com";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
var location = "YOUR_RESOURCE_LOCATION";
axios({
baseURL: endpoint,
url: '/translate',
method: 'post',
headers: {
'Ocp-Apim-Subscription-Key': subscriptionKey,
'Ocp-Apim-Subscription-Region': location,
'Content-type': 'application/json',
'X-ClientTraceId': uuidv4().toString()
},
params: {
'api-version': '3.0',
'to': 'th',
'toScript': 'latn'
},
data: [{
'text': 'Hello'
}],
responseType: 'json'
}).then(function(response){
console.log(JSON.stringify(response.data, null, 4));
})
```
# <a name="python"></a>[Python](#tab/python)
```python
import requests, uuid, json
# Add your subscription key and endpoint
subscription_key = "YOUR_SUBSCRIPTION_KEY"
endpoint = "https://api.cognitive.microsofttranslator.com"
# Add your location, also known as region. The default is global.
# This is required if using a Cognitive Services resource.
location = "YOUR_RESOURCE_LOCATION"
path = '/translate'
constructed_url = endpoint + path
params = {
'api-version': '3.0',
'to': 'th',
'toScript': 'latn'
}
headers = {
'Ocp-Apim-Subscription-Key': subscription_key,
'Ocp-Apim-Subscription-Region': location,
'Content-type': 'application/json',
'X-ClientTraceId': str(uuid.uuid4())
}
# You can pass more than one object in body.
body = [{
'text': 'Hello'
}]
request = requests.post(constructed_url, params=params, headers=headers, json=body)
response = request.json()
print(json.dumps(response, sort_keys=True, ensure_ascii=False, indent=4, separators=(',', ': ')))
```
---
After a successful call, you should see the following response. Keep in mind that the response from the `translate` endpoint includes the detected source language with a confidence score, a translation using the alphabet of the output language, and a transliteration using the Latin alphabet.
```json
[
{
"detectedLanguage": {
"language": "en",
"score": 1.0
},
"translations": [
{
"text": "สวัสดี",
"to": "th",
"transliteration": {
"script": "Latn",
"text": "sawatdi"
}
}
]
}
]
```
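In client code, the transliteration rides along inside each translation object, under the nested `transliteration` key. A minimal sketch (reading the sample response above rather than a live call):

```python
import json

# Sample body matching the /translate response with toScript=latn shown above
sample = """
[
    {
        "detectedLanguage": {"language": "en", "score": 1.0},
        "translations": [
            {
                "text": "สวัสดี",
                "to": "th",
                "transliteration": {"script": "Latn", "text": "sawatdi"}
            }
        ]
    }
]
"""

translation = json.loads(sample)[0]["translations"][0]

# Translated text in the target script, plus its Latin-script transliteration
print(translation["text"])                     # สวัสดี
print(translation["transliteration"]["text"])  # sawatdi
```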
### <a name="transliterate-without-translation"></a>Transliterate without translation
You can also use the `transliterate` endpoint to get a transliteration. When using the transliteration endpoint, you must provide the source language (`language`), the source script/alphabet (`fromScript`), and the output script/alphabet (`toScript`) as parameters. In this example, we're going to get the transliteration for สวัสดี.
> [!NOTE]
> For a complete list of available languages and transliteration options, see [language support](language-support.md).
# <a name="c"></a>[C#](#tab/csharp)
```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json; // Install Newtonsoft.Json with NuGet
class Program
{
private static readonly string subscriptionKey = "YOUR-SUBSCRIPTION-KEY";
private static readonly string endpoint = "https://api.cognitive.microsofttranslator.com/";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
private static readonly string location = "YOUR_RESOURCE_LOCATION";
static async Task Main(string[] args)
{
            // For a complete list of options, see the API reference.
            // The source language and the source and output scripts are defined as parameters.
            string route = "/transliterate?api-version=3.0&language=th&fromScript=thai&toScript=latn";
            string textToTransliterate = "สวัสดี";
object[] body = new object[] { new { Text = textToTransliterate } };
var requestBody = JsonConvert.SerializeObject(body);
using (var client = new HttpClient())
using (var request = new HttpRequestMessage())
{
// Build the request.
request.Method = HttpMethod.Post;
request.RequestUri = new Uri(endpoint + route);
request.Content = new StringContent(requestBody, Encoding.UTF8, "application/json");
request.Headers.Add("Ocp-Apim-Subscription-Key", subscriptionKey);
request.Headers.Add("Ocp-Apim-Subscription-Region", location);
// Send the request and get response.
HttpResponseMessage response = await client.SendAsync(request).ConfigureAwait(false);
// Read response as a string.
string result = await response.Content.ReadAsStringAsync();
Console.WriteLine(result);
}
}
}
```
# <a name="go"></a>[Go](#tab/go)
```go
package main
import (
"bytes"
"encoding/json"
"fmt"
"log"
"net/http"
"net/url"
)
func main() {
subscriptionKey := "YOUR-SUBSCRIPTION-KEY"
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
location := "YOUR_RESOURCE_LOCATION";
endpoint := "https://api.cognitive.microsofttranslator.com/"
uri := endpoint + "/transliterate?api-version=3.0"
// Build the request URL. See: https://golang.org/pkg/net/url/#example_URL_Parse
u, _ := url.Parse(uri)
q := u.Query()
q.Add("language", "th")
q.Add("fromScript", "thai")
q.Add("toScript", "latn")
u.RawQuery = q.Encode()
// Create an anonymous struct for your request body and encode it to JSON
body := []struct {
Text string
}{
{Text: "สวัสดี"},
}
b, _ := json.Marshal(body)
// Build the HTTP POST request
req, err := http.NewRequest("POST", u.String(), bytes.NewBuffer(b))
if err != nil {
log.Fatal(err)
}
// Add required headers to the request
req.Header.Add("Ocp-Apim-Subscription-Key", subscriptionKey)
req.Header.Add("Ocp-Apim-Subscription-Region", location)
req.Header.Add("Content-Type", "application/json")
// Call the Translator API
res, err := http.DefaultClient.Do(req)
if err != nil {
log.Fatal(err)
}
// Decode the JSON response
var result interface{}
if err := json.NewDecoder(res.Body).Decode(&result); err != nil {
log.Fatal(err)
}
// Format and print the response to terminal
prettyJSON, _ := json.MarshalIndent(result, "", " ")
fmt.Printf("%s\n", prettyJSON)
}
```
# <a name="java"></a>[Java](#tab/java)
```java
import java.io.*;
import java.net.*;
import java.util.*;
import com.google.gson.*;
import com.squareup.okhttp.*;
public class Transliterate {
private static String subscriptionKey = "YOUR_SUBSCRIPTION_KEY";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
private static String location = "YOUR_RESOURCE_LOCATION";
HttpUrl url = new HttpUrl.Builder()
.scheme("https")
.host("api.cognitive.microsofttranslator.com")
.addPathSegment("/transliterate")
.addQueryParameter("api-version", "3.0")
.addQueryParameter("language", "th")
.addQueryParameter("fromScript", "thai")
.addQueryParameter("toScript", "latn")
.build();
// Instantiates the OkHttpClient.
OkHttpClient client = new OkHttpClient();
// This function performs a POST request.
public String Post() throws IOException {
MediaType mediaType = MediaType.parse("application/json");
RequestBody body = RequestBody.create(mediaType,
"[{\"Text\": \"สวัสดี\"}]");
Request request = new Request.Builder().url(url).post(body)
.addHeader("Ocp-Apim-Subscription-Key", subscriptionKey)
                .addHeader("Ocp-Apim-Subscription-Region", location)
.addHeader("Content-type", "application/json")
.build();
Response response = client.newCall(request).execute();
return response.body().string();
}
// This function prettifies the json response.
public static String prettify(String json_text) {
JsonParser parser = new JsonParser();
JsonElement json = parser.parse(json_text);
Gson gson = new GsonBuilder().setPrettyPrinting().create();
return gson.toJson(json);
}
public static void main(String[] args) {
try {
Transliterate transliterateRequest = new Transliterate();
String response = transliterateRequest.Post();
System.out.println(prettify(response));
} catch (Exception e) {
System.out.println(e);
}
}
}
```
# <a name="nodejs"></a>[Node.js](#tab/nodejs)
```javascript
const axios = require('axios').default;
const { v4: uuidv4 } = require('uuid');
var subscriptionKey = "YOUR_SUBSCRIPTION_KEY";
var endpoint = "https://api.cognitive.microsofttranslator.com";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
var location = "YOUR_RESOURCE_LOCATION";
axios({
baseURL: endpoint,
url: '/transliterate',
method: 'post',
headers: {
'Ocp-Apim-Subscription-Key': subscriptionKey,
'Ocp-Apim-Subscription-Region': location,
'Content-type': 'application/json',
'X-ClientTraceId': uuidv4().toString()
},
params: {
'api-version': '3.0',
'language': 'th',
'fromScript': 'thai',
'toScript': 'latn'
},
data: [{
'text': 'สวัสดี'
}],
responseType: 'json'
}).then(function(response){
console.log(JSON.stringify(response.data, null, 4));
})
```
# <a name="python"></a>[Python](#tab/python)
```python
import requests, uuid, json
# Add your subscription key and endpoint
subscription_key = "YOUR_SUBSCRIPTION_KEY"
endpoint = "https://api.cognitive.microsofttranslator.com"
# Add your location, also known as region. The default is global.
# This is required if using a Cognitive Services resource.
location = "YOUR_RESOURCE_LOCATION"
path = '/transliterate'
constructed_url = endpoint + path
params = {
'api-version': '3.0',
'language': 'th',
'fromScript': 'thai',
'toScript': 'latn'
}
headers = {
'Ocp-Apim-Subscription-Key': subscription_key,
'Ocp-Apim-Subscription-Region': location,
'Content-type': 'application/json',
'X-ClientTraceId': str(uuid.uuid4())
}
# You can pass more than one object in body.
body = [{
'text': 'สวัสดี'
}]
request = requests.post(constructed_url, params=params, headers=headers, json=body)
response = request.json()
print(json.dumps(response, sort_keys=True, indent=4, separators=(',', ': ')))
```
---
After a successful call, you should see the following response. Unlike the call to the `translate` endpoint, `transliterate` only returns the `script` and the output `text`.
```json
[
{
"script": "latn",
"text": "sawatdi"
}
]
```
## <a name="get-sentence-length"></a>Get sentence length
With the Translator service, you can get the character count for a sentence or series of sentences. The response is returned as an array, with character counts for each detected sentence. You can get sentence lengths with the `translate` and `breaksentence` endpoints.
### <a name="get-sentence-length-during-translation"></a>Get sentence length during translation
Using the `translate` endpoint, you can get character counts for both source text and translated output. To return the sentence lengths (`srcSenLen` and `transSenLen`), you must set the `includeSentenceLength` parameter to `True`.
# <a name="c"></a>[C#](#tab/csharp)
```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json; // Install Newtonsoft.Json with NuGet
class Program
{
private static readonly string subscriptionKey = "YOUR-SUBSCRIPTION-KEY";
private static readonly string endpoint = "https://api.cognitive.microsofttranslator.com/";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
private static readonly string location = "YOUR_RESOURCE_LOCATION";
static async Task Main(string[] args)
{
// Include sentence length details.
string route = "/translate?api-version=3.0&to=es&includeSentenceLength=true";
string sentencesToCount =
"Can you tell me how to get to Penn Station? Oh, you aren't sure? That's fine.";
object[] body = new object[] { new { Text = sentencesToCount } };
var requestBody = JsonConvert.SerializeObject(body);
using (var client = new HttpClient())
using (var request = new HttpRequestMessage())
{
// Build the request.
request.Method = HttpMethod.Post;
request.RequestUri = new Uri(endpoint + route);
request.Content = new StringContent(requestBody, Encoding.UTF8, "application/json");
request.Headers.Add("Ocp-Apim-Subscription-Key", subscriptionKey);
request.Headers.Add("Ocp-Apim-Subscription-Region", location);
// Send the request and get response.
HttpResponseMessage response = await client.SendAsync(request).ConfigureAwait(false);
// Read response as a string.
string result = await response.Content.ReadAsStringAsync();
Console.WriteLine(result);
}
}
}
```
# <a name="go"></a>[Go](#tab/go)
```go
package main
import (
"bytes"
"encoding/json"
"fmt"
"log"
"net/http"
"net/url"
)
func main() {
subscriptionKey := "YOUR-SUBSCRIPTION-KEY"
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
location := "YOUR_RESOURCE_LOCATION";
endpoint := "https://api.cognitive.microsofttranslator.com/"
uri := endpoint + "/translate?api-version=3.0"
// Build the request URL. See: https://golang.org/pkg/net/url/#example_URL_Parse
u, _ := url.Parse(uri)
q := u.Query()
q.Add("to", "es")
q.Add("includeSentenceLength", "true")
u.RawQuery = q.Encode()
// Create an anonymous struct for your request body and encode it to JSON
body := []struct {
Text string
}{
{Text: "Can you tell me how to get to Penn Station? Oh, you aren't sure? That's fine."},
}
b, _ := json.Marshal(body)
// Build the HTTP POST request
req, err := http.NewRequest("POST", u.String(), bytes.NewBuffer(b))
if err != nil {
log.Fatal(err)
}
// Add required headers to the request
req.Header.Add("Ocp-Apim-Subscription-Key", subscriptionKey)
req.Header.Add("Ocp-Apim-Subscription-Region", location)
req.Header.Add("Content-Type", "application/json")
// Call the Translator API
res, err := http.DefaultClient.Do(req)
if err != nil {
log.Fatal(err)
}
// Decode the JSON response
var result interface{}
if err := json.NewDecoder(res.Body).Decode(&result); err != nil {
log.Fatal(err)
}
// Format and print the response to terminal
prettyJSON, _ := json.MarshalIndent(result, "", " ")
fmt.Printf("%s\n", prettyJSON)
}
```
# <a name="java"></a>[Java](#tab/java)
```java
import java.io.*;
import java.net.*;
import java.util.*;
import com.google.gson.*;
import com.squareup.okhttp.*;
public class Translate {
private static String subscriptionKey = "YOUR_SUBSCRIPTION_KEY";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
private static String location = "YOUR_RESOURCE_LOCATION";
HttpUrl url = new HttpUrl.Builder()
.scheme("https")
.host("api.cognitive.microsofttranslator.com")
.addPathSegment("/translate")
.addQueryParameter("api-version", "3.0")
.addQueryParameter("to", "es")
.addQueryParameter("includeSentenceLength", "true")
.build();
// Instantiates the OkHttpClient.
OkHttpClient client = new OkHttpClient();
// This function performs a POST request.
public String Post() throws IOException {
MediaType mediaType = MediaType.parse("application/json");
RequestBody body = RequestBody.create(mediaType,
"[{\"Text\": \"Can you tell me how to get to Penn Station? Oh, you aren\'t sure? That\'s fine.\"}]");
Request request = new Request.Builder().url(url).post(body)
.addHeader("Ocp-Apim-Subscription-Key", subscriptionKey)
                .addHeader("Ocp-Apim-Subscription-Region", location)
.addHeader("Content-type", "application/json")
.build();
Response response = client.newCall(request).execute();
return response.body().string();
}
// This function prettifies the json response.
public static String prettify(String json_text) {
JsonParser parser = new JsonParser();
JsonElement json = parser.parse(json_text);
Gson gson = new GsonBuilder().setPrettyPrinting().create();
return gson.toJson(json);
}
public static void main(String[] args) {
try {
Translate translateRequest = new Translate();
String response = translateRequest.Post();
System.out.println(prettify(response));
} catch (Exception e) {
System.out.println(e);
}
}
}
```
# <a name="nodejs"></a>[Node.js](#tab/nodejs)
```javascript
const axios = require('axios').default;
const { v4: uuidv4 } = require('uuid');
var subscriptionKey = "YOUR_SUBSCRIPTION_KEY";
var endpoint = "https://api.cognitive.microsofttranslator.com";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
var location = "YOUR_RESOURCE_LOCATION";
axios({
baseURL: endpoint,
url: '/translate',
method: 'post',
headers: {
'Ocp-Apim-Subscription-Key': subscriptionKey,
'Ocp-Apim-Subscription-Region': location,
'Content-type': 'application/json',
'X-ClientTraceId': uuidv4().toString()
},
params: {
'api-version': '3.0',
'to': 'es',
'includeSentenceLength': true
},
data: [{
'text': 'Can you tell me how to get to Penn Station? Oh, you aren\'t sure? That\'s fine.'
}],
responseType: 'json'
}).then(function(response){
console.log(JSON.stringify(response.data, null, 4));
})
```
# <a name="python"></a>[Python](#tab/python)
```python
import requests, uuid, json
# Add your subscription key and endpoint
subscription_key = "YOUR_SUBSCRIPTION_KEY"
endpoint = "https://api.cognitive.microsofttranslator.com"
# Add your location, also known as region. The default is global.
# This is required if using a Cognitive Services resource.
location = "YOUR_RESOURCE_LOCATION"
path = '/translate'
constructed_url = endpoint + path
params = {
'api-version': '3.0',
'to': 'es',
'includeSentenceLength': True
}
headers = {
'Ocp-Apim-Subscription-Key': subscription_key,
'Ocp-Apim-Subscription-Region': location,
'Content-type': 'application/json',
'X-ClientTraceId': str(uuid.uuid4())
}
# You can pass more than one object in body.
body = [{
'text': 'Can you tell me how to get to Penn Station? Oh, you aren\'t sure? That\'s fine.'
}]
request = requests.post(constructed_url, params=params, headers=headers, json=body)
response = request.json()
print(json.dumps(response, sort_keys=True, ensure_ascii=False, indent=4, separators=(',', ': ')))
```
---
After a successful call, you should see the following response. In addition to the detected source language and the translation, you get character counts for each detected sentence, for both the source (`srcSentLen`) and the translation (`transSentLen`).
```json
[
{
"detectedLanguage": {
"language": "en",
"score": 1.0
},
"translations": [
{
"sentLen": {
"srcSentLen": [
44,
21,
12
],
"transSentLen": [
48,
18,
10
]
},
"text": "¿Puedes decirme cómo llegar a la estación Penn? ¿No estás seguro? Está bien.",
"to": "es"
}
]
}
]
```
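If you need these counts programmatically, the source and translated sentence lengths line up by index. Here's a minimal sketch of pairing them, assuming the JSON response above has already been parsed into a Python object (for example, with `response.json()` in the Python sample):

```python
# The translate response shown above, as a parsed Python object.
response = [{
    "detectedLanguage": {"language": "en", "score": 1.0},
    "translations": [{
        "sentLen": {"srcSentLen": [44, 21, 12], "transSentLen": [48, 18, 10]},
        "text": "¿Puedes decirme cómo llegar a la estación Penn? ¿No estás seguro? Está bien.",
        "to": "es",
    }],
}]

# Each source sentence length pairs with the translated sentence at the same index.
sent_len = response[0]["translations"][0]["sentLen"]
pairs = list(zip(sent_len["srcSentLen"], sent_len["transSentLen"]))
print(pairs)  # [(44, 48), (21, 18), (12, 10)]
```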
### <a name="get-sentence-length-without-translation"></a>Get sentence length without translation
The Translator service also lets you get sentence length without performing a translation by using the `breaksentence` endpoint.
# <a name="c"></a>[C#](#tab/csharp)
```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json; // Install Newtonsoft.Json with NuGet
class Program
{
private static readonly string subscriptionKey = "YOUR-SUBSCRIPTION-KEY";
private static readonly string endpoint = "https://api.cognitive.microsofttranslator.com/";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
private static readonly string location = "YOUR_RESOURCE_LOCATION";
static async Task Main(string[] args)
{
// Only include sentence length details.
string route = "/breaksentence?api-version=3.0";
string sentencesToCount =
"Can you tell me how to get to Penn Station? Oh, you aren't sure? That's fine.";
object[] body = new object[] { new { Text = sentencesToCount } };
var requestBody = JsonConvert.SerializeObject(body);
using (var client = new HttpClient())
using (var request = new HttpRequestMessage())
{
// Build the request.
request.Method = HttpMethod.Post;
request.RequestUri = new Uri(endpoint + route);
request.Content = new StringContent(requestBody, Encoding.UTF8, "application/json");
request.Headers.Add("Ocp-Apim-Subscription-Key", subscriptionKey);
request.Headers.Add("Ocp-Apim-Subscription-Region", location);
// Send the request and get response.
HttpResponseMessage response = await client.SendAsync(request).ConfigureAwait(false);
// Read response as a string.
string result = await response.Content.ReadAsStringAsync();
Console.WriteLine(result);
}
}
}
```
# <a name="go"></a>[Go](#tab/go)
```go
package main
import (
"bytes"
"encoding/json"
"fmt"
"log"
"net/http"
"net/url"
)
func main() {
subscriptionKey := "YOUR-SUBSCRIPTION-KEY"
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
    location := "YOUR_RESOURCE_LOCATION"
endpoint := "https://api.cognitive.microsofttranslator.com/"
uri := endpoint + "/breaksentence?api-version=3.0"
// Build the request URL. See: https://golang.org/pkg/net/url/#example_URL_Parse
u, _ := url.Parse(uri)
q := u.Query()
u.RawQuery = q.Encode()
// Create an anonymous struct for your request body and encode it to JSON
body := []struct {
Text string
}{
{Text: "Can you tell me how to get to Penn Station? Oh, you aren't sure? That's fine."},
}
b, _ := json.Marshal(body)
// Build the HTTP POST request
req, err := http.NewRequest("POST", u.String(), bytes.NewBuffer(b))
if err != nil {
log.Fatal(err)
}
// Add required headers to the request
req.Header.Add("Ocp-Apim-Subscription-Key", subscriptionKey)
req.Header.Add("Ocp-Apim-Subscription-Region", location)
req.Header.Add("Content-Type", "application/json")
// Call the Translator API
res, err := http.DefaultClient.Do(req)
if err != nil {
log.Fatal(err)
}
// Decode the JSON response
var result interface{}
if err := json.NewDecoder(res.Body).Decode(&result); err != nil {
log.Fatal(err)
}
// Format and print the response to terminal
prettyJSON, _ := json.MarshalIndent(result, "", " ")
fmt.Printf("%s\n", prettyJSON)
}
```
# <a name="java"></a>[Java](#tab/java)
```java
import java.io.*;
import java.net.*;
import java.util.*;
import com.google.gson.*;
import com.squareup.okhttp.*;
public class BreakSentence {
private static String subscriptionKey = "YOUR_SUBSCRIPTION_KEY";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
private static String location = "YOUR_RESOURCE_LOCATION";
HttpUrl url = new HttpUrl.Builder()
.scheme("https")
.host("api.cognitive.microsofttranslator.com")
.addPathSegment("/breaksentence")
.addQueryParameter("api-version", "3.0")
.build();
// Instantiates the OkHttpClient.
OkHttpClient client = new OkHttpClient();
// This function performs a POST request.
public String Post() throws IOException {
MediaType mediaType = MediaType.parse("application/json");
RequestBody body = RequestBody.create(mediaType,
"[{\"Text\": \"Can you tell me how to get to Penn Station? Oh, you aren\'t sure? That\'s fine.\"}]");
Request request = new Request.Builder().url(url).post(body)
.addHeader("Ocp-Apim-Subscription-Key", subscriptionKey)
                .addHeader("Ocp-Apim-Subscription-Region", location)
.addHeader("Content-type", "application/json")
.build();
Response response = client.newCall(request).execute();
return response.body().string();
}
// This function prettifies the json response.
public static String prettify(String json_text) {
JsonParser parser = new JsonParser();
JsonElement json = parser.parse(json_text);
Gson gson = new GsonBuilder().setPrettyPrinting().create();
return gson.toJson(json);
}
public static void main(String[] args) {
try {
BreakSentence breakSentenceRequest = new BreakSentence();
String response = breakSentenceRequest.Post();
System.out.println(prettify(response));
} catch (Exception e) {
System.out.println(e);
}
}
}
```
# <a name="nodejs"></a>[Node.js](#tab/nodejs)
```javascript
const axios = require('axios').default;
const { v4: uuidv4 } = require('uuid');
var subscriptionKey = "YOUR_SUBSCRIPTION_KEY";
var endpoint = "https://api.cognitive.microsofttranslator.com";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
var location = "YOUR_RESOURCE_LOCATION";
axios({
baseURL: endpoint,
url: '/breaksentence',
method: 'post',
headers: {
'Ocp-Apim-Subscription-Key': subscriptionKey,
'Ocp-Apim-Subscription-Region': location,
'Content-type': 'application/json',
'X-ClientTraceId': uuidv4().toString()
},
params: {
'api-version': '3.0'
},
data: [{
'text': 'Can you tell me how to get to Penn Station? Oh, you aren\'t sure? That\'s fine.'
}],
responseType: 'json'
}).then(function(response){
console.log(JSON.stringify(response.data, null, 4));
})
```
# <a name="python"></a>[Python](#tab/python)
```python
import requests, uuid, json
# Add your subscription key and endpoint
subscription_key = "YOUR_SUBSCRIPTION_KEY"
endpoint = "https://api.cognitive.microsofttranslator.com"
# Add your location, also known as region. The default is global.
# This is required if using a Cognitive Services resource.
location = "YOUR_RESOURCE_LOCATION"
path = '/breaksentence'
constructed_url = endpoint + path
params = {
'api-version': '3.0'
}
headers = {
'Ocp-Apim-Subscription-Key': subscription_key,
'Ocp-Apim-Subscription-Region': location,
'Content-type': 'application/json',
'X-ClientTraceId': str(uuid.uuid4())
}
# You can pass more than one object in body.
body = [{
'text': 'Can you tell me how to get to Penn Station? Oh, you aren\'t sure? That\'s fine.'
}]
request = requests.post(constructed_url, params=params, headers=headers, json=body)
response = request.json()
print(json.dumps(response, sort_keys=True, ensure_ascii=False, indent=4, separators=(',', ': ')))
```
---
After a successful call, you should see the following response. Unlike a call to the `translate` endpoint, `breaksentence` only returns the character counts for the source text, in an array named `sentLen`.
```json
[
{
"detectedLanguage": {
"language": "en",
"score": 1.0
},
"sentLen": [
44,
21,
12
]
}
]
```
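Because the `sentLen` values are character counts into the original string, you can use them to split the source text into sentences. A minimal sketch, using the input text and the response values shown above:

```python
text = "Can you tell me how to get to Penn Station? Oh, you aren't sure? That's fine."
sent_len = [44, 21, 12]  # from the breaksentence response above

# Slice the original text at the cumulative sentence boundaries.
sentences = []
start = 0
for length in sent_len:
    sentences.append(text[start:start + length])
    start += length

print(sentences)  # three sentences; the first two keep their trailing space
```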
## <a name="dictionary-lookup-alternate-translations"></a>Dictionary lookup (alternate translations)
With the `dictionary/lookup` endpoint, you can get alternate translations for a word or phrase. For example, when translating the word "shark" from `en` to `es`, this endpoint returns both "tiburón" and "escualo".
# <a name="c"></a>[C#](#tab/csharp)
```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json; // Install Newtonsoft.Json with NuGet
class Program
{
private static readonly string subscriptionKey = "YOUR-SUBSCRIPTION-KEY";
private static readonly string endpoint = "https://api.cognitive.microsofttranslator.com/";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
private static readonly string location = "YOUR_RESOURCE_LOCATION";
static async Task Main(string[] args)
{
// See many translation options
string route = "/dictionary/lookup?api-version=3.0&from=en&to=es";
string wordToTranslate = "shark";
object[] body = new object[] { new { Text = wordToTranslate } };
var requestBody = JsonConvert.SerializeObject(body);
using (var client = new HttpClient())
using (var request = new HttpRequestMessage())
{
// Build the request.
request.Method = HttpMethod.Post;
request.RequestUri = new Uri(endpoint + route);
request.Content = new StringContent(requestBody, Encoding.UTF8, "application/json");
request.Headers.Add("Ocp-Apim-Subscription-Key", subscriptionKey);
request.Headers.Add("Ocp-Apim-Subscription-Region", location);
// Send the request and get response.
HttpResponseMessage response = await client.SendAsync(request).ConfigureAwait(false);
// Read response as a string.
string result = await response.Content.ReadAsStringAsync();
Console.WriteLine(result);
}
}
}
```
# <a name="go"></a>[Go](#tab/go)
```go
package main
import (
"bytes"
"encoding/json"
"fmt"
"log"
"net/http"
"net/url"
)
func main() {
subscriptionKey := "YOUR-SUBSCRIPTION-KEY"
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
    location := "YOUR_RESOURCE_LOCATION"
endpoint := "https://api.cognitive.microsofttranslator.com/"
uri := endpoint + "/dictionary/lookup?api-version=3.0"
// Build the request URL. See: https://golang.org/pkg/net/url/#example_URL_Parse
u, _ := url.Parse(uri)
q := u.Query()
q.Add("from", "en")
q.Add("to", "es")
u.RawQuery = q.Encode()
// Create an anonymous struct for your request body and encode it to JSON
body := []struct {
Text string
}{
{Text: "shark"},
}
b, _ := json.Marshal(body)
// Build the HTTP POST request
req, err := http.NewRequest("POST", u.String(), bytes.NewBuffer(b))
if err != nil {
log.Fatal(err)
}
// Add required headers to the request
req.Header.Add("Ocp-Apim-Subscription-Key", subscriptionKey)
req.Header.Add("Ocp-Apim-Subscription-Region", location)
req.Header.Add("Content-Type", "application/json")
// Call the Translator API
res, err := http.DefaultClient.Do(req)
if err != nil {
log.Fatal(err)
}
// Decode the JSON response
var result interface{}
if err := json.NewDecoder(res.Body).Decode(&result); err != nil {
log.Fatal(err)
}
// Format and print the response to terminal
prettyJSON, _ := json.MarshalIndent(result, "", " ")
fmt.Printf("%s\n", prettyJSON)
}
```
# <a name="java"></a>[Java](#tab/java)
```java
import java.io.*;
import java.net.*;
import java.util.*;
import com.google.gson.*;
import com.squareup.okhttp.*;
public class DictionaryLookup {
private static String subscriptionKey = "YOUR_SUBSCRIPTION_KEY";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
private static String location = "YOUR_RESOURCE_LOCATION";
HttpUrl url = new HttpUrl.Builder()
.scheme("https")
.host("api.cognitive.microsofttranslator.com")
.addPathSegment("/dictionary/lookup")
.addQueryParameter("api-version", "3.0")
.addQueryParameter("from", "en")
.addQueryParameter("to", "es")
.build();
// Instantiates the OkHttpClient.
OkHttpClient client = new OkHttpClient();
// This function performs a POST request.
public String Post() throws IOException {
MediaType mediaType = MediaType.parse("application/json");
RequestBody body = RequestBody.create(mediaType,
"[{\"Text\": \"Shark\"}]");
Request request = new Request.Builder().url(url).post(body)
.addHeader("Ocp-Apim-Subscription-Key", subscriptionKey)
                .addHeader("Ocp-Apim-Subscription-Region", location)
.addHeader("Content-type", "application/json")
.build();
Response response = client.newCall(request).execute();
return response.body().string();
}
// This function prettifies the json response.
public static String prettify(String json_text) {
JsonParser parser = new JsonParser();
JsonElement json = parser.parse(json_text);
Gson gson = new GsonBuilder().setPrettyPrinting().create();
return gson.toJson(json);
}
public static void main(String[] args) {
try {
DictionaryLookup dictionaryLookupRequest = new DictionaryLookup();
String response = dictionaryLookupRequest.Post();
System.out.println(prettify(response));
} catch (Exception e) {
System.out.println(e);
}
}
}
```
# <a name="nodejs"></a>[Node.js](#tab/nodejs)
```javascript
const axios = require('axios').default;
const { v4: uuidv4 } = require('uuid');
var subscriptionKey = "YOUR_SUBSCRIPTION_KEY";
var endpoint = "https://api.cognitive.microsofttranslator.com";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
var location = "YOUR_RESOURCE_LOCATION";
axios({
baseURL: endpoint,
url: '/dictionary/lookup',
method: 'post',
headers: {
'Ocp-Apim-Subscription-Key': subscriptionKey,
'Ocp-Apim-Subscription-Region': location,
'Content-type': 'application/json',
'X-ClientTraceId': uuidv4().toString()
},
params: {
'api-version': '3.0',
'from': 'en',
'to': 'es'
},
data: [{
'text': 'shark'
}],
responseType: 'json'
}).then(function(response){
console.log(JSON.stringify(response.data, null, 4));
})
```
# <a name="python"></a>[Python](#tab/python)
```python
import requests, uuid, json
# Add your subscription key and endpoint
subscription_key = "YOUR_SUBSCRIPTION_KEY"
endpoint = "https://api.cognitive.microsofttranslator.com"
# Add your location, also known as region. The default is global.
# This is required if using a Cognitive Services resource.
location = "YOUR_RESOURCE_LOCATION"
path = '/dictionary/lookup'
constructed_url = endpoint + path
params = {
'api-version': '3.0',
'from': 'en',
'to': 'es'
}
headers = {
'Ocp-Apim-Subscription-Key': subscription_key,
'Ocp-Apim-Subscription-Region': location,
'Content-type': 'application/json',
'X-ClientTraceId': str(uuid.uuid4())
}
# You can pass more than one object in body.
body = [{
'text': 'shark'
}]
request = requests.post(constructed_url, params=params, headers=headers, json=body)
response = request.json()
print(json.dumps(response, sort_keys=True, ensure_ascii=False, indent=4, separators=(',', ': ')))
```
---
After a successful call, you should see the following response. Let's break this down, since the JSON is more complex than some of the other examples in this article. The `translations` array includes a list of translations. Each object in this array includes a confidence score (`confidence`), the text optimized for end-user display (`displayTarget`), the normalized text (`normalizedTarget`), the part of speech (`posTag`), and information about back translations (`backTranslations`). For more information about the response, see [Dictionary Lookup](reference/v3-0-dictionary-lookup.md).
```json
[
{
"displaySource": "shark",
"normalizedSource": "shark",
"translations": [
{
"backTranslations": [
{
"displayText": "shark",
"frequencyCount": 45,
"normalizedText": "shark",
"numExamples": 0
}
],
"confidence": 0.8182,
"displayTarget": "tiburón",
"normalizedTarget": "tiburón",
"posTag": "OTHER",
"prefixWord": ""
},
{
"backTranslations": [
{
"displayText": "shark",
"frequencyCount": 10,
"normalizedText": "shark",
"numExamples": 1
}
],
"confidence": 0.1818,
"displayTarget": "escualo",
"normalizedTarget": "escualo",
"posTag": "NOUN",
"prefixWord": ""
}
]
}
]
```
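For example, to pick the most likely alternative, you can take the maximum of the `translations` array by its `confidence` score. A minimal sketch, assuming the JSON response above has been parsed into a Python object:

```python
# A trimmed version of the dictionary lookup response shown above.
response = [{
    "displaySource": "shark",
    "normalizedSource": "shark",
    "translations": [
        {"confidence": 0.8182, "displayTarget": "tiburón",
         "normalizedTarget": "tiburón", "posTag": "OTHER"},
        {"confidence": 0.1818, "displayTarget": "escualo",
         "normalizedTarget": "escualo", "posTag": "NOUN"},
    ],
}]

# Keep the alternative the service is most confident about.
best = max(response[0]["translations"], key=lambda t: t["confidence"])
print(best["displayTarget"])  # tiburón
```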
## <a name="dictionary-examples-translations-in-context"></a>Dictionary examples (translations in context)
After you've performed a dictionary lookup, you can pass the source text and the translation to the `dictionary/examples` endpoint to get a list of examples that show both terms in the context of a sentence or phrase. Building on the previous example, you use the `normalizedSource` and the `normalizedTarget` from the dictionary lookup response as `text` and `translation`, respectively. The source language (`from`) and output target (`to`) parameters are required.
# <a name="c"></a>[C#](#tab/csharp)
```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json; // Install Newtonsoft.Json with NuGet
class Program
{
private static readonly string subscriptionKey = "YOUR-SUBSCRIPTION-KEY";
private static readonly string endpoint = "https://api.cognitive.microsofttranslator.com/";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
private static readonly string location = "YOUR_RESOURCE_LOCATION";
static async Task Main(string[] args)
{
// See examples of terms in context
string route = "/dictionary/examples?api-version=3.0&from=en&to=es";
object[] body = new object[] { new { Text = "Shark", Translation = "tiburón" } } ;
var requestBody = JsonConvert.SerializeObject(body);
using (var client = new HttpClient())
using (var request = new HttpRequestMessage())
{
// Build the request.
request.Method = HttpMethod.Post;
request.RequestUri = new Uri(endpoint + route);
request.Content = new StringContent(requestBody, Encoding.UTF8, "application/json");
request.Headers.Add("Ocp-Apim-Subscription-Key", subscriptionKey);
request.Headers.Add("Ocp-Apim-Subscription-Region", location);
// Send the request and get response.
HttpResponseMessage response = await client.SendAsync(request).ConfigureAwait(false);
// Read response as a string.
string result = await response.Content.ReadAsStringAsync();
Console.WriteLine(result);
}
}
}
```
# <a name="go"></a>[Go](#tab/go)
```go
package main
import (
"bytes"
"encoding/json"
"fmt"
"log"
"net/http"
"net/url"
)
func main() {
subscriptionKey := "YOUR-SUBSCRIPTION-KEY"
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
    location := "YOUR_RESOURCE_LOCATION"
endpoint := "https://api.cognitive.microsofttranslator.com/"
uri := endpoint + "/dictionary/examples?api-version=3.0"
// Build the request URL. See: https://golang.org/pkg/net/url/#example_URL_Parse
u, _ := url.Parse(uri)
q := u.Query()
q.Add("from", "en")
q.Add("to", "es")
u.RawQuery = q.Encode()
// Create an anonymous struct for your request body and encode it to JSON
body := []struct {
Text string
Translation string
}{
{
Text: "Shark",
Translation: "tiburón",
},
}
b, _ := json.Marshal(body)
// Build the HTTP POST request
req, err := http.NewRequest("POST", u.String(), bytes.NewBuffer(b))
if err != nil {
log.Fatal(err)
}
// Add required headers to the request
req.Header.Add("Ocp-Apim-Subscription-Key", subscriptionKey)
req.Header.Add("Ocp-Apim-Subscription-Region", location)
req.Header.Add("Content-Type", "application/json")
// Call the Translator Text API
res, err := http.DefaultClient.Do(req)
if err != nil {
log.Fatal(err)
}
// Decode the JSON response
var result interface{}
if err := json.NewDecoder(res.Body).Decode(&result); err != nil {
log.Fatal(err)
}
// Format and print the response to terminal
prettyJSON, _ := json.MarshalIndent(result, "", " ")
fmt.Printf("%s\n", prettyJSON)
}
```
# <a name="java"></a>[Java](#tab/java)
```java
import java.io.*;
import java.net.*;
import java.util.*;
import com.google.gson.*;
import com.squareup.okhttp.*;
public class DictionaryExamples {
private static String subscriptionKey = "YOUR_SUBSCRIPTION_KEY";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
private static String location = "YOUR_RESOURCE_LOCATION";
HttpUrl url = new HttpUrl.Builder()
.scheme("https")
.host("api.cognitive.microsofttranslator.com")
.addPathSegment("/dictionary/examples")
.addQueryParameter("api-version", "3.0")
.addQueryParameter("from", "en")
.addQueryParameter("to", "es")
.build();
// Instantiates the OkHttpClient.
OkHttpClient client = new OkHttpClient();
// This function performs a POST request.
public String Post() throws IOException {
MediaType mediaType = MediaType.parse("application/json");
RequestBody body = RequestBody.create(mediaType,
"[{\"Text\": \"Shark\", \"Translation\": \"tiburón\"}]");
Request request = new Request.Builder().url(url).post(body)
                .addHeader("Ocp-Apim-Subscription-Key", subscriptionKey)
                .addHeader("Ocp-Apim-Subscription-Region", location)
                .addHeader("Content-type", "application/json")
.build();
Response response = client.newCall(request).execute();
return response.body().string();
}
// This function prettifies the json response.
public static String prettify(String json_text) {
JsonParser parser = new JsonParser();
JsonElement json = parser.parse(json_text);
Gson gson = new GsonBuilder().setPrettyPrinting().create();
return gson.toJson(json);
}
public static void main(String[] args) {
try {
DictionaryExamples dictionaryExamplesRequest = new DictionaryExamples();
String response = dictionaryExamplesRequest.Post();
System.out.println(prettify(response));
} catch (Exception e) {
System.out.println(e);
}
}
}
```
# <a name="nodejs"></a>[Node.js](#tab/nodejs)
```javascript
const axios = require('axios').default;
const { v4: uuidv4 } = require('uuid');
var subscriptionKey = "YOUR_SUBSCRIPTION_KEY";
var endpoint = "https://api.cognitive.microsofttranslator.com";
// Add your location, also known as region. The default is global.
// This is required if using a Cognitive Services resource.
var location = "YOUR_RESOURCE_LOCATION";
axios({
baseURL: endpoint,
url: '/dictionary/examples',
method: 'post',
headers: {
'Ocp-Apim-Subscription-Key': subscriptionKey,
'Ocp-Apim-Subscription-Region': location,
'Content-type': 'application/json',
'X-ClientTraceId': uuidv4().toString()
},
params: {
'api-version': '3.0',
'from': 'en',
'to': 'es'
},
data: [{
'text': 'shark',
'translation': 'tiburón'
}],
responseType: 'json'
}).then(function(response){
console.log(JSON.stringify(response.data, null, 4));
})
```
# <a name="python"></a>[Python](#tab/python)
```python
import requests, uuid, json
# Add your subscription key and endpoint
subscription_key = "YOUR_SUBSCRIPTION_KEY"
endpoint = "https://api.cognitive.microsofttranslator.com"
# Add your location, also known as region. The default is global.
# This is required if using a Cognitive Services resource.
location = "YOUR_RESOURCE_LOCATION"
path = '/dictionary/examples'
constructed_url = endpoint + path
params = {
'api-version': '3.0',
'from': 'en',
'to': 'es'
}
headers = {
'Ocp-Apim-Subscription-Key': subscription_key,
'Ocp-Apim-Subscription-Region': location,
'Content-type': 'application/json',
'X-ClientTraceId': str(uuid.uuid4())
}
# You can pass more than one object in body.
body = [{
'text': 'shark',
'translation': 'tiburón'
}]
request = requests.post(constructed_url, params=params, headers=headers, json=body)
response = request.json()
print(json.dumps(response, sort_keys=True, ensure_ascii=False, indent=4, separators=(',', ': ')))
```
---
After a successful call, you should see the following response. For more information about the response, see [Dictionary Examples](reference/v3-0-dictionary-examples.md).
```json
[
{
"examples": [
{
"sourcePrefix": "More than a match for any ",
"sourceSuffix": ".",
"sourceTerm": "shark",
"targetPrefix": "Más que un fósforo para cualquier ",
"targetSuffix": ".",
"targetTerm": "tiburón"
},
{
"sourcePrefix": "Same with the mega ",
"sourceSuffix": ", of course.",
"sourceTerm": "shark",
"targetPrefix": "Y con el mega ",
"targetSuffix": ", por supuesto.",
"targetTerm": "tiburón"
},
{
"sourcePrefix": "A ",
"sourceSuffix": " ate it.",
"sourceTerm": "shark",
"targetPrefix": "Te la ha comido un ",
"targetSuffix": ".",
"targetTerm": "tiburón"
}
],
"normalizedSource": "shark",
"normalizedTarget": "tiburón"
}
]
```
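Each example splits the sentence into a prefix, the looked-up term, and a suffix, which makes it easy to highlight the term in a UI. Concatenating the three parts restores the full sentence, as this minimal sketch shows for the first example above:

```python
# The first example object from the dictionary examples response above.
example = {
    "sourcePrefix": "More than a match for any ",
    "sourceSuffix": ".",
    "sourceTerm": "shark",
    "targetPrefix": "Más que un fósforo para cualquier ",
    "targetSuffix": ".",
    "targetTerm": "tiburón",
}

# Rebuild the full sentences from their three parts.
source = example["sourcePrefix"] + example["sourceTerm"] + example["sourceSuffix"]
target = example["targetPrefix"] + example["targetTerm"] + example["targetSuffix"]
print(source)  # More than a match for any shark.
print(target)  # Más que un fósforo para cualquier tiburón.
```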
## <a name="troubleshooting"></a>Troubleshooting
### <a name="java-users"></a>Java users
If you're encountering connection issues, it may be that your SSL certificate has expired. To resolve this issue, install [DigiCertGlobalRootG2.crt](http://cacerts.digicert.com/DigiCertGlobalRootG2.crt) in your private store.
## <a name="next-steps"></a>Next steps
* [Learn how the API counts characters](character-counts.md)
* [Customize and improve translation](customization.md)
## <a name="see-also"></a>See also
* [Translator v3 API reference](reference/v3-0-reference.md)
* [Language support](language-support.md)
ff132cc52379805a2cbb03b4d832ec5cd2225ea5 | 13,193 | md | Markdown | support/windows-server/identity/replication-error-8456-8457.md | valecanto/SupportArticles-docs | d2e295a72d77e6650dd16bd22fd15f558c3bbf81 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | support/windows-server/identity/replication-error-8456-8457.md | valecanto/SupportArticles-docs | d2e295a72d77e6650dd16bd22fd15f558c3bbf81 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | support/windows-server/identity/replication-error-8456-8457.md | valecanto/SupportArticles-docs | d2e295a72d77e6650dd16bd22fd15f558c3bbf81 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Troubleshoot replication error 8456 or 8457
description: Describes how to troubleshoot replication error 8456 or 8457.
ms.date: 10/10/2020
author: Deland-Han
ms.author: delhan
manager: dcscontentpm
audience: itpro
ms.topic: troubleshooting
ms.prod: windows-server
localization_priority: medium
ms.reviewer: kaushika
ms.custom: sap:active-directory-replication, csstroubleshoot
ms.technology: windows-server-active-directory
---
# Active Directory replication error 8456 or 8457: The source | destination server is currently rejecting replication requests
This article describes the symptoms, cause, and resolution steps for situations where Active Directory operations fail with error 8456 or 8457.
_Applies to:_ Windows Server 2012 R2
_Original KB number:_ 2023007
> [!NOTE]
> **Home users:** This article is only intended for technical support agents and IT professionals. If you're looking for help with a problem, [ask the Microsoft Community](https://answers.microsoft.com).
## Symptoms
Active Directory operations fail with error 8456 or 8457: The source | destination server is currently rejecting replication requests.
1. The DCPROMO promotion of a new domain controller in an existing forest fails with the error: The source server is currently rejecting replication requests.
Dialog title text: Active Directory Installation Wizard
Dialog message text:
> The operation failed because: Active Directory could not transfer the remaining data in directory partition \<directory partition DN path> to domain controller \<destination DC>. "The source server is currently rejecting replication requests."
2. DCDIAG reports the error: **The source server is currently rejecting replication requests** or **The destination server is currently rejecting replication requests**.
> Testing server: Default-First-Site-Name\<DC NAME>
Starting test: Replications
\* Replications Check
[Replications Check,\<DC NAME>] A recent replication attempt failed:
From IADOMINO to \<DC NAME>
Naming Context: DC=\<DN path of partition>
**The replication generated an error (8456):**
The source server is currently rejecting replication requests.
The failure occurred at \<Date> \<Time>.
The last success occurred at \<Date> \<time>.
957 failures have occurred since the last success.
Replication has been explicitly disabled through the server options
>
> Testing server: Default-First-Site-Name\<DC NAME>
Starting test: Replications
\* Replications Check
[Replications Check,\<DC NAME>] A recent replication attempt failed:
From IADOMINO to \<DC NAME>
Naming Context: DC=\<DN path of partition>
**The replication generated an error (8457):**
The destination server is currently rejecting replication requests.
The failure occurred at \<Date> \<Time>.
The last success occurred at \<Date> \<time>.
957 failures have occurred since the last success.
Replication has been explicitly disabled through the server options
3. REPADMIN indicates that incoming and outgoing Active Directory replication may be failing with the error: The source | destination server is currently rejecting replication.
> DC=Contoso,DC=COM
\<site name>\<dc name> via RPC
DC object GUID: \<objectguid of source DCs NTDS settings object>
Last attempt @ \<date> \<time> failed, result 8457 (0x2109):
The destination server is currently rejecting replication requests.
>
> DC=Contoso,DC=COM
\<site name>\<dc name> via RPC
DC object GUID: \<objectguid of source DCs NTDS settings object>
Last attempt @ \<date> \<time> failed, result 8456 (0x2108):
The source server is currently rejecting replication requests.
> [!NOTE]
> REPADMIN commands may display both the hexadecimal and the decimal equivalent for the currently rejecting replication error.
4. Event sources and event IDs that indicate that a USN rollback has occurred include but are not limited to the following.
| Event source| Event ID| Event string |
|---|---|---|
|NTDS KCC|1308| The Knowledge Consistency Checker (KCC) has detected that successive attempts to replicate with the following domain controller has consistently failed. |
|NTDS KCC|1925|The attempt to establish a replication link for the following writable directory partition failed.|
|NTDS KCC|1926|The attempt to establish a replication link to a read-only directory partition with the following parameters failed|
|NTDS Replication|1586| The Windows NT 4.0 or earlier replication checkpoint with the PDC emulator master was unsuccessful. A full synchronization of the security accounts manager (SAM) database to domain controllers running Windows NT 4.0 and earlier might occur if the PDC emulator master role is transferred to the local domain controller before the next successful checkpoint. The checkpoint process will be tried again in four hours. |
|NTDS Replication|2023|The local domain controller was unable to replicate changes to the following remote domain controller for the following directory partition.|
| Microsoft-Windows-ActiveDirectory_DomainService|2095|During an Active Directory Domain Services replication request, the local domain controller (DC) identified a remote DC which has received replication data from the local DC by using already acknowledged USN tracking numbers.|
| Microsoft-Windows-ActiveDirectory_DomainService|2103|The Active Directory Domain Services database was restored by using an unsupported restoration procedure. Active Directory Domain Services will be unable to log on users while this condition persists. Therefore, the Net Logon service has paused. |
Where embedded status codes 8456 and 8457 map to the following.
| Decimal error| Hexadecimal error| Error string |
|---|---|---|
|8456|2108|The source server is currently rejecting replication|
|8457|2109|The destination server is currently rejecting replication|
5. NTDS General Event 2013 may be logged in the Directory Services event log. This indicates that a USN rollback occurred because of an unsupported rollback or restore of the Active Directory Database.
> Event Type: Error
Event Source: NTDS General
Event Category: Service Control
Event ID: 2103
Date: \<date>
Time: \<time>
User: \<user name>
Computer: \<computer name>
Description: The Active Directory database has been restored by using an unsupported restoration procedure. Active Directory will be unable to log on users while this condition persists. As a result, the Net Logon service has paused. User Action See previous event logs for details. For more information, visit the Help and Support Center at `https://support.microsoft.com`.
6. NTDS General Event 1393 may be logged in the Directory Services event log. This indicates that the physical or virtual drive that is hosting the Active Directory database or log files lacks sufficient free disk space:
> Event Type: Error
> Event Source: NTDS General
> Event Category: Service Control
> Event ID: 1393
> Date: \<date>
> Time: \<time>
> User: \<user name>
> Computer: \<computer name>
> Description:
> Attempts to update the Directory Service database are failing with error 112. Since Windows will be unable to log on users while this condition persists, the NetLogon service is being paused. Make sure that sufficient free disk space is available on the drives where the directory database and log files reside.
## Cause
Incoming or outgoing replication was automatically disabled by the operating system because of multiple root causes.
Three conditions, each identified by a logged event, cause the operating system to disable inbound or outbound replication:
- A USN rollback occurred (NTDS General Event 2103).
- The hard disk is full (NTDS General Event 1393).
- A corrupt UTD vector is present (Event 2881).
The operating system automatically makes four configuration changes when one of three conditions occurs. The four configuration changes are as follows:
1. Incoming Active Directory replication is disabled.
2. Outgoing Active Directory replication is disabled.
3. **DSA not writable** is set to a nonzero value in the registry.
4. The NETLOGON service status is changed from **running** to **paused**.
The dominant root cause for this error condition is a USN rollback discussed in [A Windows Server domain controller logs Directory Services event 2095 when it encounters a USN rollback](https://support.microsoft.com/help/875495).
Do not assume that any nonzero value for **DSA not writable**, or a source or destination server that is currently rejecting replication requests during DCPROMO / AD replication, definitively means that a USN rollback has occurred and that such domain controllers must be force-demoted or force-repromoted. Demotion may be the correct option. However, it may be excessive when the error is caused by insufficient free disk space.
## Resolution
1. Check the value for **DSA not writable**.
For each domain controller that is logging the 8456 or 8457 error, determine whether one of the three triggering events automatically disabled incoming or outgoing Active Directory replication by reading the value of **DSA not writable** from the local registry.
When replication is automatically disabled, the operating system writes one of four possible values to **DSA not writable**:
- Path: `HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\NTDS\Parameters`
- Setting: DSA not writable
- Type: REG_DWORD
- Values:
  - `#define DSA_WRITABLE_GEN 1`
  - `#define DSA_WRITABLE_NO_SPACE 2`
  - `#define DSA_WRITABLE_USNROLLBCK 4`
  - `#define DSA_WRITABLE_CORRUPT_UTDV 8`
A value of 1 can be written only when the forest version is incompatible with the operating system (for example, a Windows 2000 domain controller is promoted into a forest that is at the Windows Server 2003 forest functional level).
A value of 2 means that the physical or virtual drive that is hosting the Active Directory database or log files lacks sufficient free disk space.
A value of 4 means that a USN rollback occurred because the Active Directory database was incorrectly rolled back in time. Operations that are known to cause a USN rollback include the following:
- Booting domain controller role computers from previously saved virtual machine snapshots on Hyper-V or VMware hosts.
- Incorrect physical-to-virtual (P2V) conversions in forests that contain more than one domain controller.
- Restoring DC role computers by using imaging products such as Ghost.
- Rolling the contents of a partition that hosts the Active Directory database back in time by using an advanced disk subsystem.
A value of 8 indicates that the up-to-dateness (UTD) vector is corrupted on the local DC.
Technically, **DSA not writable** could consist of multiple values. For example, a registry value of 10 would indicate insufficient disk space and a corrupted UTD. Typically, a single value is written to **DSA not writable**.
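Because **DSA not writable** is a bit mask, a composite value can be decoded mechanically. The sketch below is illustrative only — the helper function is not part of any Microsoft tooling — and the flag names mirror the `#define` list above:

```python
# Flag values written to the "DSA not writable" registry value under
# HKLM\System\CurrentControlSet\Services\NTDS\Parameters.
DSA_WRITABLE_FLAGS = {
    1: "DSA_WRITABLE_GEN",           # forest version incompatible with the OS
    2: "DSA_WRITABLE_NO_SPACE",      # disk hosting the AD database/logs is full
    4: "DSA_WRITABLE_USNROLLBCK",    # USN rollback detected
    8: "DSA_WRITABLE_CORRUPT_UTDV",  # corrupt up-to-dateness vector
}

def decode_dsa_not_writable(value):
    """Return the list of flag names contained in a DSA not writable value."""
    return [name for bit, name in DSA_WRITABLE_FLAGS.items() if value & bit]

# A registry value of 10 decodes to no-space (2) plus corrupt UTD vector (8).
print(decode_dsa_not_writable(10))
```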
> [!NOTE]
> It is common for support professionals and administrators to partly disable the replication quarantine by enabling outgoing replication, by enabling incoming replication, by changing the startup value for the NETLOGON service from disabled to automatic, and by starting the NETLOGON service. Therefore, the full quarantine configuration may not be in place when it is examined.
2. Check the Directory Service event log for quarantine events.
Assuming the Directory Service event log has not wrapped, you may find one or more related events logged in the Directory Service event log of a domain controller that is logging the 8456 or 8457 error.
|Event|Details|
|---|---|
|NTDS General 2103|The Active Directory database was restored by using an unsupported restoration procedure. Active Directory will be unable to log on users while this condition persists. Therefore, the Net Logon service has paused. User Action See previous event logs for more information.|
|NTDS General Event 1393|There is insufficient space on the disk.|
|Event 2881|Not applicable|
3. Perform the recovery based on the value of **DSA not writable** or on events that are logged on the system:
- If **DSA not writable** equals 4 or if NTDS General Event 2103 is logged, perform the recovery steps for a USN Rollback. For more information, see [A Windows Server domain controller logs Directory Services event 2095 when it encounters a USN rollback](https://support.microsoft.com/en-us/help/875495).
- If **DSA not writable** equals 2 or if NTDS General event 1393 is logged, check for sufficient free disk space on the physical and virtual partitions that are hosting the Active Directory database and log files. Free up space as required.
- If **DSA not writable** equals 8, demote and then repromote the domain controller before it can replicate its bad value to other domain controllers in the forest.
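For the insufficient-disk-space case (value 2), the free-space check can be scripted. The sketch below is an illustration, not an official tool, and the NTDS paths are assumptions — substitute the database and log file locations actually configured on the domain controller:

```python
import os
import shutil

def report_free_space(path):
    """Print and return the free bytes on the volume that hosts *path*."""
    usage = shutil.disk_usage(path)  # named tuple: (total, used, free), in bytes
    print(f"{path}: {usage.free / 2**30:.1f} GiB free")
    return usage.free

# Default NTDS locations -- an assumption; use the paths reported by ntdsutil
# if the database or logs were relocated.
for path in (r"C:\Windows\NTDS", r"C:\Windows\NTDS\Logs"):
    if os.path.exists(path):
        report_free_space(path)
```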
---
title: Microsoft Pay Standard
description: The Microsoft Pay extension adds a Microsoft Pay link to your sales documents so that customers can easily make payments by using Microsoft Pay.
services: project-madeira
documentationcenter: ''
author: SorenGP
ms.service: dynamics365-business-central
ms.topic: conceptual
ms.devlang: na
ms.tgt_pltfrm: na
ms.workload: na
ms.date: 04/01/2021
ms.author: edupont
ms.openlocfilehash: 574ebae554b21c5184a5e1c2bcd5ae9b0d34f817
ms.sourcegitcommit: e562b45fda20ff88230e086caa6587913eddae26
ms.translationtype: HT
ms.contentlocale: fr-BE
ms.lasthandoff: 06/30/2021
ms.locfileid: "6322940"
---
# <a name="the-microsoft-pay-extension"></a>The Microsoft Pay Extension
> [!IMPORTANT]
> Effective February 8, 2020, changes in the Microsoft Pay service will affect the Microsoft Pay extension in Microsoft [!INCLUDE[prod_short](includes/prod_long.md)]. Because of the changes, after February 8 the **Pay now** payment links that the Microsoft Pay extension generates for invoices in [!INCLUDE[prod_short](includes/prod_short.md)] will no longer open Microsoft Pay. Customers who use the extension must change their payment service setup so that it uses the PayPal extension instead.<br /></br>
>
> Starting January 8, we will show a notification in [!INCLUDE[prod_short](includes/prod_short.md)]. The notification will contain a link to the settings that you must change and to more information. After February 8, the Microsoft Pay extension will no longer be available in [!INCLUDE[prod_short](includes/prod_short.md)].<br /></br>
>
> The changes affect the following versions of Business Central:
> - Microsoft Dynamics 365 Business Central, October 2018
> - Microsoft Dynamics 365 Business Central, April 2019
> - Microsoft Dynamics 365 Business Central, 2019 release wave 2
Customers are more demanding than ever when it comes to customer service, both in terms of product quality and in terms of delivery and payment services. The Microsoft Pay service helps you improve your customer service.
The Microsoft Pay extension adds a Microsoft Pay link to your sales documents so that customers can easily make payments by using Microsoft Pay. You can then send the documents by email to provide better customer service and to shorten the time it takes for customer payments to reach your bank account.
The Microsoft Pay extension offers the following benefits:
- Customer payments appear in your bank account faster.
- Customers have more ways to pay their invoices.
- Microsoft Pay offers a trusted payment service, which customers may prefer to entering their credit card details on unfamiliar websites.
- Microsoft Pay offers several ways to handle payments, including credit card processing, such as PayPal and Stripe.
- The Microsoft Pay link can be embedded in each invoice document automatically or by the user.
- Because this capability is designed as an extension, it gives you full control and lets you enable it when and if your business processes require it.
Enabling payment service extensions is free in [!INCLUDE[prod_short](includes/prod_short.md)]; however, you must contact the payment service to get an account. For more information, see [Enable Customer Payments Through Payment Services](sales-how-enable-payment-service-extensions.md).
## <a name="see-also"></a>See Also
[Customizing [!INCLUDE[prod_short](includes/prod_short.md)] Using Extensions](ui-extensions.md)
[Setting Up Sales](sales-setup-sales.md)
[Working with [!INCLUDE[prod_short](includes/prod_short.md)]](ui-work-product.md)
[!INCLUDE[footer-include](includes/footer-banner.md)]
---
title: /PDBALTPATH (Use Alternate PDB Path)
ms.date: 11/04/2016
f1_keywords:
- /pdbaltpath
helpviewer_keywords:
- .pdb files, path
- PDBALTPATH dumpbin option
- -PDBALTPATH dumpbin option
- /PDBALTPATH dumpbin option
- PDB files, path
ms.assetid: 72e200aa-e2c3-4ad8-b687-25528da1aaaf
ms.openlocfilehash: 660e39a97b9fed0c5a9228fe011e7c0fa2566e68
ms.sourcegitcommit: 0ab61bc3d2b6cfbd52a16c6ab2b97a8ea1864f12
ms.translationtype: MT
ms.contentlocale: ko-KR
ms.lasthandoff: 04/23/2019
ms.locfileid: "62320034"
---
# <a name="pdbaltpath-use-alternate-pdb-path"></a>/PDBALTPATH (Use Alternate PDB Path)
```
/PDBALTPATH:pdb_file_name
```
## <a name="arguments"></a>Arguments
*pdb_file_name*<br/>
The path and file name of the .pdb file.
## <a name="remarks"></a>Remarks
Use this option to provide an alternate location for the program database (.pdb) file in a compiled binary file. Normally, the linker records the location of the .pdb file in the binaries that it produces. You can use this option to provide a different path and file name for the .pdb file. The information provided with /PDBALTPATH does not change the location or name of the actual .pdb file; it changes the information that the linker writes into the binary. This lets you provide a path that is independent of the file structure of the build computer. Two common uses for this option are to provide a network path or a file name that has no path information.
The value of *pdb_file_name* can be an arbitrary string, an environment variable, or **%_PDB%**. The linker expands an environment variable, such as **%SystemRoot%**, to its value. The linker defines the environment variables **%_PDB%** and **%_EXT%**: **%_PDB%** expands to the file name of the actual .pdb file without any path information, and **%_EXT%** is the extension of the generated executable.
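As an illustration of how a `%_PDB%`-style argument gets expanded — a rough model of the substitution described above, not the linker's actual implementation:

```python
import re

def expand_pdbaltpath(template, variables):
    """Expand %VAR% references in a /PDBALTPATH argument (illustrative model)."""
    # Unknown variables are left untouched, as in m.group(0).
    return re.sub(r"%([^%]+)%", lambda m: variables.get(m.group(1), m.group(0)), template)

# %_PDB% -> file name of the actual .pdb (no path); %_EXT% -> executable extension.
variables = {"_PDB": "myapp.pdb", "_EXT": "exe"}
print(expand_pdbaltpath(r"symbols\%_PDB%", variables))
```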
## <a name="see-also"></a>See also
[DUMPBIN Options](dumpbin-options.md)<br/>
[/PDBPATH](pdbpath.md)
<p>&nbsp;</p>
<p align='center'>
<img src='https://raw.githubusercontent.com/ctrlplusb/react-sizeme/master/assets/logo.png' width='250'/>
<p align='center'>Make your React Components aware of their width and/or height!</p>
</p>
<p> </p>
[](http://npm.im/react-sizeme)
[](http://opensource.org/licenses/MIT)
[](https://travis-ci.org/ctrlplusb/react-sizeme)
[](https://codecov.io/github/ctrlplusb/react-sizeme)
- Hyper Responsive Components!
- Performant.
- Easy to use.
- Extensive browser support.
- Supports functional and class Component types.
- Tiny bundle size.
- Demo: https://4mkpc.csb.app/
Use it via the render prop pattern (supports `children` or `render` prop):
```javascript
import { SizeMe } from 'react-sizeme'
function MyApp() {
return <SizeMe>{({ size }) => <div>My width is {size.width}px</div>}</SizeMe>
}
```
Or, via a higher order component:
```javascript
import { withSize } from 'react-sizeme'
function MyComponent({ size }) {
return <div>My width is {size.width}px</div>
}
export default withSize()(MyComponent)
```
<p> </p>
---
## TOCs
- [Intro](https://github.com/ctrlplusb/react-sizeme#intro)
- [Installation](https://github.com/ctrlplusb/react-sizeme#installation)
- [Configuration](https://github.com/ctrlplusb/react-sizeme#configuration)
- [Component Usage](https://github.com/ctrlplusb/react-sizeme#component-usage)
- [HOC Usage](https://github.com/ctrlplusb/react-sizeme#hoc-usage)
- [`onSize` callback alternative usage](https://github.com/ctrlplusb/react-sizeme#onsize-callback-alternative-usage)
- [Under the hood](https://github.com/ctrlplusb/react-sizeme#under-the-hood)
- [Examples](#examples)
- [Loading different child components based on size](#loading-different-child-components-based-on-size)
- [Server Side Rendering](https://github.com/ctrlplusb/react-sizeme#server-side-rendering)
- [Extreme Appreciation](https://github.com/ctrlplusb/react-sizeme#extreme-appreciation)
- [Backers](https://github.com/ctrlplusb/react-sizeme#backers)
<p> </p>
---
## Intro
Give your Components the ability to have render logic based on their height and/or width. Responsive design on the Component level. This allows you to create highly reusable components that can adapt to wherever they are rendered.
Check out a working demo here: https://4mkpc.csb.app/
<p> </p>
---
## Installation
```javascript
npm install react-sizeme
```
<p> </p>
---
## Configuration
The following configuration options are available. Please see the usage docs for how to pass these configuration values into either the [component](#component-usage) or [higher order function](#hoc-usage).
- `monitorWidth` (_boolean_, **default**: true)
  If true, then any changes to your Components rendered width will cause a recalculation of the "size" prop, which will then be passed into your Component.
- `monitorHeight` (_boolean_, **default**: false)
  If true, then any changes to your Components rendered height will cause a
  recalculation of the "size" prop, which will then be passed into
  your Component.
> PLEASE NOTE: that this is set to `false` by default
- `refreshRate` (_number_, **default**: 16)
The maximum frequency, in milliseconds, at which size changes should be recalculated when changes in your Component's rendered size are being detected. This should not be set to lower than 16.
- `refreshMode` (_string_, **default**: 'throttle')
The mode in which refreshing should occur. Valid values are "debounce" and "throttle".
"throttle" will eagerly measure your component and then wait for the refreshRate to pass before doing a new measurement on size changes.
"debounce" will wait for a minimum of the refreshRate before it does a measurement check on your component.
"debounce" can be useful in cases where your component is animated into the DOM.
> NOTE: When using "debounce" mode you may want to consider disabling the placeholder as this adds an extra delay in the rendering time of your component.
- `noPlaceholder` (_boolean_, **default**: false)
By default we render a "placeholder" component initially so we can try and "prefetch" the expected size for your component. This is to avoid any unnecessary deep tree renders. If you feel this is not an issue for your component case and you would like to get an eager render of
your component then disable the placeholder using this config option.
> NOTE: You can set this globally. See the docs on first render.
<p> </p>
---
## Component Usage
We provide a "render props pattern" based component. You can import it like so:
```javascript
import { SizeMe } from 'react-sizeme'
```
You then provide it either a `render` or `children` prop containing a function/component that will receive a `size` prop (an object with `width` and `height` properties):
```javascript
<SizeMe>{({ size }) => <div>My width is {size.width}px</div>}</SizeMe>
```
_or_
```javascript
<SizeMe render={({ size }) => <div>My width is {size.width}px</div>} />
```
To provide [configuration](#configuration) you simply add any customisation as props. For example:
```javascript
<SizeMe
monitorHeight
refreshRate={32}
render={({ size }) => <div>My width is {size.width}px</div>}
/>
```
<p> </p>
---
## HOC Usage
We provide you with a higher order component function called `withSize`. You can import it like so:
```javascript
import { withSize } from 'react-sizeme'
```
Firstly, you have to call the `withSize` function, passing in an optional [configuration](#configuration) object should you wish to customise the behaviour:
```javascript
const withSizeHOC = withSize()
```
You can then use the returned Higher Order Component to decorate any of your existing Components with the size awareness ability:
```javascript
const SizeAwareComponent = withSizeHOC(MyComponent)
```
Your component will then receive a `size` prop (an object with `width` and `height` properties).
> Note that the values could be undefined based on the configuration you provided (e.g. you explicitly do not monitor either of the dimensions)
Below is a full example:
```javascript
import { withSize } from 'react-sizeme'
class MyComponent extends Component {
render() {
const { width, height } = this.props.size
return (
<div>
My size is {width || -1}px x {height || -1}px
</div>
)
}
}
export default withSize({ monitorHeight: true })(MyComponent)
```
### `onSize` callback alternative usage
The higher order component also allows an alternative usage where you provide an `onSize` callback function.
This allows the "parent" to manage the `size` value rather than your component, which can be useful in specific circumstances.
Below is an example of its usage.
Firstly, create a component you wish to know the size of:
```jsx
import { withSize } from 'react-sizeme'
function MyComponent({ message }) {
return <div>{message}</div>
}
export default withSize()(MyComponent)
```
Now create a "parent" component providing it a `onSize` callback function to the size aware component:
```jsx
class ParentComponent extends React.Component {
onSize = (size) => {
console.log('MyComponent has a width of', size.width)
}
render() {
return <MyComponent message="Hello world" onSize={this.onSize} />
}
}
```
<p> </p>
---
## Under the hood
It can be useful to understand the rendering workflow should you wish to debug any issues you may be having.
In order to size your component we have a bit of a chicken/egg scenario. We can't know the width/height of your Component until it is rendered. This can lead wasteful rendering cycles should you choose to render your components based on their width/height.
Therefore for the first render of your component we actually render a lightweight placeholder in place of your component in order to obtain the width/height. If your component was being passed a `className` or `style` prop then these will be applied to the placeholder so that it can more closely resemble your actual components dimensions.
So the first dimensions that are passed to your component may not be "correct" dimensions, however, it should quickly receive the "correct" dimensions upon render.
Should you wish to avoid the render of a placeholder and have an eager render of your component then you can use the `noPlaceholder` configuration option. Using this configuration value your component will be rendered directly, however, the `size` prop may contain `undefined` for width and height until your component completes its first render.
<p> </p>
---
## Examples
### Loading different child components based on size
```javascript
import React from 'react'
import LargeChildComponent from './LargeChildComponent'
import SmallChildComponent from './SmallChildComponent'
import sizeMe from 'react-sizeme'
function MyComponent(props) {
const { width, height } = props.size
const ToRenderChild = height > 600 ? LargeChildComponent : SmallChildComponent
  return (
    <div>
      <h1>
        My size is {width}x{height}
      </h1>
      <ToRenderChild />
    </div>
  )
}
export default sizeMe({ monitorHeight: true })(MyComponent)
```
> EXTRA POINTS! Combine the above with a code splitting API (e.g. Webpack's System.import) to avoid unnecessary code downloads for your clients. Zing!
<p> </p>
---
## Server Side Rendering
Okay, I am gonna be up front here and tell you that using this library in an SSR context is most likely a bad idea. If you insist on doing so, then you should take the time to make yourself fully aware of any possible repercussions your application may face.
A standard `sizeMe` configuration involves the rendering of a placeholder component. After the placeholder is mounted to the DOM we extract it's dimension information and pass it on to your actual component. We do this in order to avoid any unnecessary render cycles for possibly deep component trees. Whilst this is useful for a purely client side set up, this is less than useful for an SSR context as the delivered page will contain empty placeholders. Ideally you want actual content to be delivered so that users without JS can still have an experience, or SEO bots can scrape your website.
To avoid the rendering of placeholders in this context you can make use of the `noPlaceholders` global configuration value. Setting this flag will disables any placeholder rendering. Instead your wrapped component will be rendered directly - however it's initial render will contain no values within the `size` prop (i.e. `width`, and `height` will be `null`).
```javascript
import sizeMe from 'react-sizeme'
// This is a global variable. i.e. will be the default for all instances.
sizeMe.noPlaceholders = true
```
> Note: if you only partially server render your application you may want to use the component level configuration that allows disabling placeholders per component (e.g. `sizeMe({ noPlaceholder: true })`)
It is up to you to decide how you would like to initially render your component then. When your component is sent to the client and mounted to the DOM `SizeMe` will calculate and send the dimensions to your component as normal. I suggest you tread very carefully with how you use this updated information and do lots of testing using various screen dimensions. Try your best to avoid unnecessary re-rendering of your components, for the sake of your users.
If you come up with any clever strategies for this please do come share them with us! :)
<p> </p>
---
## Extreme Appreciation!
We make use of the awesome [element-resize-detector](https://github.com/wnr/element-resize-detector) library. This library makes use of a scroll/object based event strategy which outperforms window resize event listening dramatically. The original idea for this approach comes from another library, namely [css-element-queries](https://github.com/marcj/css-element-queries) by Marc J. Schmidt. I recommend looking into these libraries for history, specifics, and more examples. I love them for the work they did, without which this library would not be possible. :sparkling_heart:
<p> </p>
---
## Backers
Thanks go to all our backers! [[Become a backer](https://opencollective.com/controlplusb#backer)].
<a href="https://opencollective.com/controlplusb#backers">
<img src="https://opencollective.com/controlplusb/backers.svg?width=950" />
</a>
---
path: "/categories"
id: "supplements"
titleEn: "Supplements"
titleDe: "Ergänzungen"
titleFr: "Suppléments"
description: "Description for cars and boats"
image: ../../static/images/categories-supplements.jpg
imageAlt: "vitamins"
---
Tags: #lit
When reporting their findings, researchers should not be reluctant to admit what they don't know. What we know might remain in flux as new evidence comes to light, and identifying remaining uncertainties is more informative than presenting false confidence in an assertion. Indeed, uncertainty may be a critical part of the message that needs to be shared, such as when reporting on dynamic situations (like COVID guidelines and trends).
Blastland et al. found that expressing uncertainty did not have a negative impact on the trustworthiness of reporting. They further add that there's little downside in expressing findings in a range rather than as an absolute number. Blastland et al. suggest that audiences are more sophisticated at evaluating arguments and evidence than popular media tend to give them credit for and that they will, in fact, evaluate claims based on strength of evidence more than clarity of message.
---
## Related
- [[Inform rather than persuade]]
## Citations
Blastland, Michael, Alexandra L. J. Freeman, Sander van der Linden, Theresa M. Marteau, and David Spiegelhalter. “Five Rules for Evidence Communication.” Nature 587, no. 7834 (November 2020): 362–64. https://doi.org/10.1038/d41586-020-03189-1.
---
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: 监视在本地运行的 Java 应用程序 - Azure Monitor Application Insights
description: 在不检测应用的情况下对本地运行的 Java 应用程序进行应用程序性能监视。 分布式跟踪和应用程序映射。
ms.topic: conceptual
author: Johnnytechn
ms.custom: devx-track-java
ms.author: v-johya
ms.date: 11/10/2020
ms.openlocfilehash: da34cc2c2fa39379b61cd641ccd99974c5359c8b
ms.sourcegitcommit: d30cf549af09446944d98e4bd274f52219e90583
ms.translationtype: HT
ms.contentlocale: zh-CN
ms.lasthandoff: 11/15/2020
ms.locfileid: "94638198"
---
# <a name="java-codeless-application-monitoring-on-premises---azure-monitor-application-insights---public-preview"></a>Java 无代码应用程序本地监视 - Azure Monitor Application Insights -公 共预览版
Java 无代码应用程序监视非常简单 - 不需要更改代码,只需更改几个配置就可以启用 Java 代理。
## <a name="overview"></a>概述
启用后,Java 代理将自动从最广泛使用的库和框架收集大量请求、依赖项、日志和指标。
请遵循所有环境(包括本地)的相关[详细说明](./java-in-process-agent.md)。
## <a name="next-steps"></a>后续步骤
* [获取下载 Java 代理的说明](./java-in-process-agent.md)
* [配置 JVM 参数](https://github.com/microsoft/ApplicationInsights-Java/wiki/3.0-Preview:-Tips-for-updating-your-JVM-args)
* [自定义配置](https://github.com/microsoft/ApplicationInsights-Java/wiki/3.0-Preview:-Configuration-Options)
| 35.4375 | 180 | 0.784832 | yue_Hant | 0.407555 |
ff17169291c14a642086cbc093b221cc73c7fdea | 3,106 | md | Markdown | treebanks/sv_talbanken/sv-dep-compound.md | mjabrams/docs | eef96df1ce8f6752e9f80660c8255482b2a07c45 | [
"Apache-2.0"
] | 204 | 2015-01-20T16:36:39.000Z | 2022-03-28T00:49:51.000Z | treebanks/sv_talbanken/sv-dep-compound.md | mjabrams/docs | eef96df1ce8f6752e9f80660c8255482b2a07c45 | [
"Apache-2.0"
] | 654 | 2015-01-02T17:06:29.000Z | 2022-03-31T18:23:34.000Z | treebanks/sv_talbanken/sv-dep-compound.md | mjabrams/docs | eef96df1ce8f6752e9f80660c8255482b2a07c45 | [
"Apache-2.0"
] | 200 | 2015-01-16T22:07:02.000Z | 2022-03-25T11:35:28.000Z | ---
layout: base
title: 'Statistics of compound in UD_Swedish'
udver: '2'
---
## Treebank Statistics: UD_Swedish: Relations: `compound`
This relation is universal.
There is 1 language-specific subtype of `compound`: <tt><a href="sv-dep-compound-prt.html">compound:prt</a></tt>.
17 nodes (0%) are attached to their parents as `compound`.
17 instances of `compound` (100%) are right-to-left (child precedes parent).
Average distance between parent and child is 1.11764705882353.
The following 3 pairs of parts of speech are connected with `compound`: <tt><a href="sv-pos-NOUN.html">NOUN</a></tt>-<tt><a href="sv-pos-NOUN.html">NOUN</a></tt> (14; 82% instances), <tt><a href="sv-pos-NOUN.html">NOUN</a></tt>-<tt><a href="sv-pos-PROPN.html">PROPN</a></tt> (2; 12% instances), <tt><a href="sv-pos-NOUN.html">NOUN</a></tt>-<tt><a href="sv-pos-ADJ.html">ADJ</a></tt> (1; 6% instances).
~~~ conllu
# visual-style 5 bgColor:blue
# visual-style 5 fgColor:white
# visual-style 6 bgColor:blue
# visual-style 6 fgColor:white
# visual-style 6 5 compound color:blue
1 Nato Nato PROPN PM|NOM Case=Nom 0 root _ SpaceAfter=No
2 : : PUNCT MID _ 1 punct _ _
3 North North ADJ JJ _ 4 amod _ _
4 Atlantic Atlantic PROPN PM|NOM Case=Nom 5 compound _ _
5 Treaty Treaty NOUN NN _ 6 compound _ _
6 Organization Organization NOUN NN _ 1 appos _ _
7 ( ( PUNCT PAD _ 6 punct _ SpaceAfter=No
8 Atlantpakten Atlantpakten NOUN NN|UTR|SIN|DEF|NOM Case=Nom|Definite=Def|Gender=Com|Number=Sing 6 appos _ SpaceAfter=No
9 ) ) PUNCT PAD _ 6 punct _ SpaceAfter=No
10 . . PUNCT MAD _ 1 punct _ _
~~~
~~~ conllu
# visual-style 4 bgColor:blue
# visual-style 4 fgColor:white
# visual-style 5 bgColor:blue
# visual-style 5 fgColor:white
# visual-style 5 4 compound color:blue
1 Nato Nato PROPN PM|NOM Case=Nom 0 root _ SpaceAfter=No
2 : : PUNCT MID _ 1 punct _ _
3 North North ADJ JJ _ 4 amod _ _
4 Atlantic Atlantic PROPN PM|NOM Case=Nom 5 compound _ _
5 Treaty Treaty NOUN NN _ 6 compound _ _
6 Organization Organization NOUN NN _ 1 appos _ _
7 ( ( PUNCT PAD _ 6 punct _ SpaceAfter=No
8 Atlantpakten Atlantpakten NOUN NN|UTR|SIN|DEF|NOM Case=Nom|Definite=Def|Gender=Com|Number=Sing 6 appos _ SpaceAfter=No
9 ) ) PUNCT PAD _ 6 punct _ SpaceAfter=No
10 . . PUNCT MAD _ 1 punct _ _
~~~
~~~ conllu
# visual-style 4 bgColor:blue
# visual-style 4 fgColor:white
# visual-style 5 bgColor:blue
# visual-style 5 fgColor:white
# visual-style 5 4 compound color:blue
1 Efta Efta PROPN PM|NOM Case=Nom 0 root _ SpaceAfter=No
2 : : PUNCT MID _ 1 punct _ _
3 European European ADJ JJ _ 6 amod _ _
4 Free Free ADJ JJ _ 5 compound _ _
5 Trade Trade NOUN NN _ 6 compound _ _
6 Association Association NOUN NN _ 1 appos _ _
7 ( ( PUNCT PAD _ 6 punct _ SpaceAfter=No
8 Den en DET DT|UTR|SIN|DEF Definite=Def|Gender=Com|Number=Sing|PronType=Art 10 det _ _
9 europeiska europeisk ADJ JJ|POS|UTR/NEU|SIN|DEF|NOM Case=Nom|Definite=Def|Degree=Pos|Number=Sing 10 amod _ _
10 frihandelssammanslutningen frihandelssammanslutning NOUN NN|UTR|SIN|DEF|NOM Case=Nom|Definite=Def|Gender=Com|Number=Sing 6 appos _ SpaceAfter=No
11 ) ) PUNCT PAD _ 6 punct _ _
~~~
# The Prison Angel by Mary Jordan and Kevin Sullivan
* She mentioned how drug lords always needed the TV or radio on because in the silence they could hear the truth of what they were doing --> this highlights the importance of making time for silence to listen to Truth
* “Charity is not a thing you do, it’s love, it’s who you become.” (51)
* When she saw people wearing mink fur --> “‘Don’t you know you’re wearing a hospital around your neck?’” (57)
* “When you know in your heart that something is right, that it’s who you are, that God is calling you to do something, you make the sacrifices you have to make.” (75)
* “It’s not learning that brings you to perfection, it’s unlearning.” (83)
* “If fear dictated her actions, she never would have moved into La Mesa.” (164)
# XNet
The standard library already implements a large number of net functions. As a supplement, this package provides `LimitedListener` and `KeepaliveListener`, as well as a helper function `PrivateAddress` for obtaining the host's private (intranet) address.
### [xhttp](./xhttp/README.md)
## LimitedListener
Limits the maximum number of simultaneous connections to `n`.
```go
package xnet
type limitedListener struct {
net.Listener
sem chan struct{}
closeOnce sync.Once // ensures the done chan is only closed once
done chan struct{} // no values sent; closed when Close is called
}
func LimitListener(l net.Listener, n int) net.Listener {
return &limitedListener{
Listener: l,
sem: make(chan struct{}, n),
done: make(chan struct{}),
}
}
```
## KeepaliveListener
A TCP listener that periodically sends keepalive messages.
```go
package xnet
type keepaliveListener struct {
*net.TCPListener
timeout time.Duration
}
// KeepAliveListener creates a new keepalive-enabled listener with the given keepalive interval; if l is not a *net.TCPListener, it is returned unchanged.
func KeepAliveListener(l net.Listener, keepalive time.Duration) net.Listener
// SetKeepAlive sets the keepalive interval of a net.Conn backed by a TCP connection.
func SetKeepAlive(c net.Conn, keepalive time.Duration)
```
## TimeoutListener
A TCP listener that automatically sets timeouts.
```go
package xnet
type timeoutListener struct {
*net.TCPListener
timeout time.Duration
}
// TimeoutListener creates a new listener with the given timeout; if l is not a *net.TCPListener, it is returned unchanged.
func TimeoutListener(l net.Listener, timeout time.Duration) net.Listener
// SetTimeout sets the timeout of a net.Conn.
func SetTimeout(c net.Conn, timeout time.Duration)
```
Configuring sinks
=================
Heapster can store data into different backends (sinks). These are specified on the command line
via the `--sink` flag. The flag takes an argument of the form `PREFIX:CONFIG[?OPTIONS]`.
Options (optional!) are specified as URL query parameters, separated by `&` as normal.
This allows each source to have custom configuration passed to it without needing to
continually add new flags to Heapster as new sinks are added. Heapster can
store data into multiple sinks at once if multiple `--sink` flags are specified.
## Current sinks
### Log
This sink writes all data to the standard output which is particularly useful for debugging.
--sink=log
### InfluxDB
This sink supports both monitoring metrics and events.
*This sink supports InfluxDB versions v0.9 and above*.
To use the InfluxDB sink add the following flag:
--sink=influxdb:<INFLUXDB_URL>[?<INFLUXDB_OPTIONS>]
If you're running Heapster in a Kubernetes cluster with the default InfluxDB + Grafana setup you can use the following flag:
--sink=influxdb:http://monitoring-influxdb:80/
The following options are available:
* `user` - InfluxDB username (default: `root`)
* `pw` - InfluxDB password (default: `root`)
* `db` - InfluxDB Database name (default: `k8s`)
* `retention` - Duration of the default InfluxDB retention policy, e.g. `4h` or `7d` (default: `0` meaning infinite)
* `secure` - Connect securely to InfluxDB (default: `false`)
* `insecuressl` - Ignore SSL certificate validity (default: `false`)
* `withfields` - Use [InfluxDB fields](storage-schema.md#using-fields) (default: `false`)
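Several options can be combined in the query string; for example (host, port, and values below are illustrative only):
    --sink=influxdb:http://monitoring-influxdb:8086/?user=root&pw=root&db=k8s&retention=7d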
### Google Cloud Monitoring
This sink supports monitoring metrics only.
To use the GCM sink add the following flag:
--sink=gcm
*Note: This sink works only on a Google Compute Engine VM as of now*
GCM has one option - `metrics` that can be set to:
* all - the sink exports all metrics
* autoscaling - the sink exports only autoscaling-related metrics
### Google Cloud Logging
This sink supports events only.
To use the GCL sink add the following flag:
--sink=gcl
*Notes:*
* This sink works only on a Google Compute Engine VM as of now
* GCE instance must have “https://www.googleapis.com/auth/logging.write” auth scope
* GCE instance must have Logging API enabled for the project in Google Developer Console
* GCL Logs are accessible via:
* `https://console.developers.google.com/project/<project_ID>/logs?service=custom.googleapis.com`
* Where `project_ID` is the project ID of the Google Cloud Platform project.
* Select `kubernetes.io/events` from the `All logs` drop down menu.
### Hawkular-Metrics
This sink supports monitoring metrics only.
To use the Hawkular-Metrics sink add the following flag:
--sink=hawkular:<HAWKULAR_SERVER_URL>[?<OPTIONS>]
If `HAWKULAR_SERVER_URL` includes any path, the default `hawkular/metrics` is overridden. To use SSL, the `HAWKULAR_SERVER_URL` has to start with `https`
The following options are available:
* `tenant` - Hawkular-Metrics tenantId (default: `heapster`)
* `labelToTenant` - Hawkular-Metrics uses given label's value as tenant value when storing data
* `useServiceAccount` - Sink will use the service account token to authorize to Hawkular-Metrics (requires OpenShift)
* `insecure` - SSL connection will not verify the certificates
* `caCert` - A path to the CA Certificate file that will be used in the connection
* `auth` - Kubernetes authentication file that will be used for constructing the TLSConfig
* `user` - Username to connect to the Hawkular-Metrics server
* `pass` - Password to connect to the Hawkular-Metrics server
* `filter` - Allows bypassing the store of matching metrics, any number of `filter` parameters can be given with a syntax of `filter=operation(param)`. Supported operations and their params:
* `label` - The syntax is `label(labelName:regexp)` where `labelName` is 1:1 match and `regexp` to use for matching is given after `:` delimiter
* `name` - The syntax is `name(regexp)` where MetricName is matched (such as `cpu/usage`) with a `regexp` filter
* `batchSize`- How many metrics are sent in each request to Hawkular-Metrics (default is 1000)
* `concurrencyLimit`- How many concurrent requests are used to send data to the Hawkular-Metrics (default is 5)
* `labelTagPrefix` - A prefix to be placed in front of each label when stored as a tag for the metric (default is `labels.`)
A combination of `insecure` / `caCert` / `auth` is not supported, only a single of these parameters is allowed at once. Also, combination of `useServiceAccount` and `user` + `pass` is not supported. To increase the performance of Hawkular sink in case of multiple instances of Hawkular-Metrics (such as scaled scenario in OpenShift) modify the parameters of batchSize and concurrencyLimit to balance the load on Hawkular-Metrics instances.
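For example, a sink that stores data under a custom tenant and skips all metrics whose names start with `cpu/` might look like this (the server URL is illustrative):
    --sink=hawkular:https://hawkular.example.com/?tenant=kubernetes&filter=name(cpu.*)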
### Wavefront
The Wavefront sink supports monitoring metrics only.
To use the Wavefront sink add the following flag:
--sink=wavefront:<WAVEFRONT_PROXY_URL:PORT>[?<OPTIONS>]
The following options are available:
* `clusterName` - The name of the Kubernetes cluster being monitored. This will be added as a tag called `cluster` to metrics in Wavefront (default: `k8s-cluster`)
* `prefix` - The prefix to be added to all metrics that Heapster collects (default: `heapster.`)
* `includeLabels` - If set to true, any K8s labels will be applied to metrics as tags (default: `false`)
* `includeContainers` - If set to true, all container metrics will be sent to Wavefront. When set to false, container level metrics are skipped (pod level and above are still sent to Wavefront) (default: `true`)
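For example (the proxy host and port below are illustrative):
    --sink=wavefront:wavefront-proxy.example.com:2878?clusterName=prod-k8s&includeLabels=true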
### OpenTSDB
This sink supports both monitoring metrics and events.
To use the OpenTSDB sink add the following flag:
--sink=opentsdb:<OPENTSDB_SERVER_URL>
Accessing OpenTSDB via its REST API currently requires no authentication, so you
can enable the OpenTSDB sink like this:
--sink=opentsdb:http://192.168.1.8:4242
### Kafka
This sink supports monitoring metrics only.
To use the kafka sink add the following flag:
--sink="kafka:<?<OPTIONS>>"
Normally a Kafka cluster has multiple brokers, so the broker list needs to be configured for the producer.
The broker list and the topics for timeseries data and events are therefore given in the URL's query string.
Options can be set in query string, like this:
* `brokers` - Kafka's brokers' list.
* `timeseriestopic` - Kafka's topic for timeseries. Default value: `heapster-metrics`
* `eventstopic` - Kafka's topic for events. Default value: `heapster-events`
For example,
--sink="kafka:?brokers=localhost:9092&brokers=localhost:9093&timeseriestopic=testseries&eventstopic=testtopic"
### Riemann
This sink supports metrics only.
To use the Riemann sink add the following flag:
--sink="riemann:<RIEMANN_SERVER_URL>[?<OPTIONS>]"
The following options are available:
* `ttl` - TTL for writing to Riemann. Default: `60 seconds`
* `state` - The event state. Default: `""`
* `tags` - Default. `heapster`
* `batchsize` - The Riemann sink sends events in batches. The default batch size is `1000`
For example,
--sink=riemann:http://localhost:5555?ttl=120&state=ok&tags=foobar&batchsize=150
### Elasticsearch
This sink supports monitoring metrics and events. To use the Elasticsearch
sink add the following flag:
```
--sink=elasticsearch:<ES_SERVER_URL>[?<OPTIONS>]
```
Normally an Elasticsearch cluster has multiple nodes or a proxy, so these need
to be configured for the Elasticsearch sink. To do this, you can set
`ES_SERVER_URL` to a dummy value, and use the `?nodes=` query value for each
additional node in the cluster. For example:
```
--sink=elasticsearch:?nodes=http://foo.com:9200&nodes=http://bar.com:9200
```
(*) Notice that using the `?nodes` notation will override the `ES_SERVER_URL`
If you run your ElasticSearch cluster behind a loadbalancer (or otherwise do
not want to specify multiple nodes) then you can do the following:
```
--sink=elasticsearch:http://elasticsearch.example.com:9200?sniff=false
```
Besides this, the following options can be set in query string:
(*) Note that the keys are case sensitive
* `index` - the index for metrics and events. The default is `heapster`
* `esUserName` - the username if authentication is enabled
* `esUserSecret` - the password if authentication is enabled
* `maxRetries` - the number of retries that the Elastic client will perform
for a single request before giving up and returning an error. It is `0`
by default, so retry is disabled by default.
* `healthCheck` - specifies if healthchecks are enabled by default. It is enabled
by default. To disable, provide a negative boolean value like `0` or `false`.
* `sniff` - specifies if the sniffer is enabled by default. It is enabled
by default. To disable, provide a negative boolean value like `0` or `false`.
* `startupHealthcheckTimeout` - the time in seconds the healthcheck waits for
a response from Elasticsearch on startup, i.e. when creating a client. The
default value is `1`.
* `bulkWorkers` - number of workers for bulk processing. Default value is `5`.
* `cluster_name` - cluster name for different Kubernetes clusters. Default value is `default`.
Like this:
--sink="elasticsearch:?nodes=http://127.0.0.1:9200&index=testMetric"
or
--sink="elasticsearch:?nodes=http://127.0.0.1:9200&index=testEvent"
#### AWS Integration
In order to use AWS-managed Elasticsearch, we need to use one of the following methods:
1. Making sure the public IPs of the Heapster are allowed on the Elasticsearch's Access Policy
-OR-
2. Configuring an Access Policy with IAM
1. Configure the Elasticsearch cluster policy with IAM User
2. Create a secret that stores the IAM credentials
3. Expose the credentials to the environment variables: `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY`
```
env:
- name: AWS_ACCESS_KEY_ID
valueFrom:
secretKeyRef:
name: aws-heapster
key: aws.id
- name: AWS_SECRET_ACCESS_KEY
valueFrom:
secretKeyRef:
name: aws-heapster
key: aws.secret
```
### Graphite/Carbon
This sink supports monitoring metrics only.
To use the graphite sink add the following flag:
--sink="graphite:<PROTOCOL>://<HOST>[:<PORT>][<?<OPTIONS>>]"
PROTOCOL must be `tcp` or `udp`, PORT is 2003 by default.
These options are available:
* `prefix` - Adds specified prefix to all metric paths
For example,
--sink="graphite:tcp://metrics.example.com:2003?prefix=kubernetes.example"
Metrics are sent to Graphite with this hierarchy:
* `PREFIX`
* `cluster`
* `namespaces`
* `NAMESPACE`
* `nodes`
* `NODE`
* `pods`
* `NAMESPACE`
* `POD`
* `containers`
* `CONTAINER`
* `sys-containers`
* `SYS-CONTAINER`
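With the prefix from the example above, a container-level metric would therefore be stored under a dotted path of roughly this shape (pod, container, and metric names are illustrative only):
    kubernetes.example.cluster.pods.default.mypod.containers.mycontainer.cpu.usage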
### Librato
This sink supports monitoring metrics only.
To use the librato sink add the following flag:
--sink="librato:<?<OPTIONS>>"
Options can be set in query string, like this:
* `username` - Librato user email address (https://www.librato.com/docs/api/#authentication).
* `token` - Librato API token
* `prefix` - Prefix for all measurement names
* `tags` - By default provided tags (comma separated list)
* `tag_{name}` - Value for the tag `name`
For example,
--sink="librato:?username=xyz&token=secret&prefix=k8s&tags=cluster&tag_cluster=staging"
The librato sink currently only works with accounts that support [tagged metrics](https://www.librato.com/docs/kb/faq/account_questions/tags_or_sources/).
## Using multiple sinks
Heapster can be configured to send k8s metrics and events to multiple sinks by specifying the`--sink=...` flag multiple times.
For example, to send data to both gcm and influxdb at the same time, you can use the following:
```shell
--sink=gcm --sink=influxdb:http://monitoring-influxdb:80/
```
# Changelog
All notable changes to this project will be documented in this file, in reverse chronological order by release.
## 2.7.8 - 2016-05-31
### Added
- [#138](https://github.com/zendframework/zend-mvc/pull/138) adds support for
PHP 7 `Throwable`s within each of:
- `DispatchListener`
- `MiddlewareListener`
- The console `RouteNotFoundStrategy` and `ExceptionStrategy`
- The HTTP `DefaultRenderingStrategy` and `RouteNotFoundStrategy`
### Deprecated
- Nothing.
### Removed
- Nothing.
### Fixed
- Nothing.
## 2.7.7 - 2016-04-12
### Added
- Nothing.
### Deprecated
- Nothing.
### Removed
- Nothing.
### Fixed
- [#122](https://github.com/zendframework/zend-mvc/pull/122) fixes the
`FormAnnotationBuilderFactory` to use the container's `get()` method instead
of `build()` to retrieve the event manager instance.
## 2.7.6 - 2016-04-06
### Added
- [#94](https://github.com/zendframework/zend-mvc/pull/94) adds a documentation
recipe for using middleware within MVC event listeners.
### Deprecated
- Nothing.
### Removed
- Nothing.
### Fixed
- [#107](https://github.com/zendframework/zend-mvc/pull/107) fixes an incorrect
import statement in the `DiStrictAbstractServiceFactoryFactory` that prevented
it from working.
- [#112](https://github.com/zendframework/zend-mvc/pull/112) fixes how the
`Forward` plugin detects and detaches event listeners to ensure it works
against either v2 or v3 releases of zend-eventmanager.
## 2.7.5 - 2016-04-06
### Added
- Nothing.
### Deprecated
- Nothing.
### Removed
- Nothing.
### Fixed
- [#111](https://github.com/zendframework/zend-mvc/pull/111) fixes a bug in
the `ConsoleExceptionStrategyFactory` whereby it was overwriting the default
exception message template with an empty string when no configuration for it
was provided.
## 2.7.4 - 2016-04-03
### Added
- Nothing.
### Deprecated
- Nothing.
### Removed
- Nothing.
### Fixed
- [#114](https://github.com/zendframework/zend-mvc/pull/114) fixes an issue in
the `ServiceLocatorAware` initializer whereby plugin manager instances were
falsely identified as the container instance when under zend-servicemanager v2.
## 2.7.3 - 2016-03-08
### Added
- Nothing.
### Deprecated
- Nothing.
### Removed
- Nothing.
### Fixed
- [#97](https://github.com/zendframework/zend-mvc/pull/97) re-introduces the
`ServiceManager` factory definition inside `ServiceManagerConfig`, to ensure
backwards compatibility.
## 2.7.2 - 2016-03-08
### Added
- Nothing.
### Deprecated
- Nothing.
### Removed
- Nothing.
### Fixed
- [#95](https://github.com/zendframework/zend-mvc/pull/95) re-introduces the
various zend-di aliases and factories in `Zend\Mvc\Service\ServiceListenerFactory`,
which were accidentally removed in the 2.7.0 release.
- [#96](https://github.com/zendframework/zend-mvc/pull/96) fixes shared event
detachment/attachment within the `Forward` plugin to work with both v2 and v3
of zend-eventmanager.
- [#93](https://github.com/zendframework/zend-mvc/pull/93) ensures that the
Console `Catchall` route factory will not fail when the `defaults` `$options`
array key is missing.
- [#43](https://github.com/zendframework/zend-mvc/pull/43) updates the
`AbstractRestfulController` to ensure it can accept textual (e.g., XML, YAML)
data.
- [#79](https://github.com/zendframework/zend-mvc/pull/79) updates the
continuous integration configuration to ensure we test against lowest and
highest accepted dependencies, and those in the current lockfile.
## 2.7.1 - 2016-03-02
### Added
- Nothing.
### Deprecated
- Nothing.
### Removed
- Nothing.
### Fixed
- [#88](https://github.com/zendframework/zend-mvc/pull/88) addresses backwards
compatibility concerns raised by users due to the new deprecation notices
emitted by `ServiceLocatorAware` initializers; in particular, all
`AbstractController` implementations were raising a deprecation notice when first
pulled from the `ControllerManager`.
At this time, notices are now only raised in the following conditions:
- When a non-controller, non-plugin manager, `ServiceLocatorAware` instance
is detected.
- When a plugin manager instance is detected that is `ServiceLocatorAware` and
does not have a composed service locator. In this situation, the deprecation
notice indicates that the factory for the plugin manager should be updated
to inject the service locator via the constructor.
- For controllers that do not extend `AbstractController` but do implement
`ServiceLocatorAware`.
- When calling `getServiceLocator()` from within an `AbstractController`
extension; this properly calls out the practice that should be avoided and
which requires updates to the controller.
## 2.7.0 - 2016-03-01
### Added
- [#31](https://github.com/zendframework/zend-mvc/pull/31) adds three new
optional arguments to the `Zend\Mvc\Application` constructor: an EventManager
instance, a Request instance, and a Response instance.
- [#36](https://github.com/zendframework/zend-mvc/pull/36) adds more than a
dozen service factories, primarily to separate conditional factories into
discrete factories.
- [#32](https://github.com/zendframework/zend-mvc/pull/32) adds
`Zend\Mvc\MiddlewareListener`, which allows dispatching PSR-7-based middleware
implementing the signature `function (ServerRequestInterface $request,
ResponseInterface $response)`. To dispatch such middleware, point the
`middleware` "default" for a given route to a service name or callable that
will resolve to the middleware:
```php
[
    'router' => [
        'routes' => [
            'path' => [
                'type' => 'Literal',
                'options' => [
                    'route' => '/path',
                    'defaults' => [
                        'middleware' => 'ServiceNameForPathMiddleware',
                    ],
                ],
            ],
        ],
    ],
]
```
This new listener listens at the same priority as the `DispatchListener`, but,
due to being registered earlier, will invoke first; if the route match does
not resolve to middleware, it will fall through to the original
`DispatchListener`, allowing normal ZF2-style controller dispatch.
- [#84](https://github.com/zendframework/zend-mvc/pull/84) publishes the
documentation to https://zendframework.github.io/zend-mvc/
### Deprecated
- Two initializers registered by `Zend\Mvc\Service\ServiceManagerConfig` are now
deprecated, and will be removed starting in version 3.0:
- `ServiceManagerAwareInitializer`, which injects classes implementing
`Zend\ServiceManager\ServiceManagerAwareInterface` with the service manager
instance. Users should create factories for such classes that directly
inject their dependencies instead.
- `ServiceLocatorAwareInitializer`, which injects classes implementing
`Zend\ServiceManager\ServiceLocatorAwareInterface` with the service manager
instance. Users should create factories for such classes that directly
inject their dependencies instead.
### Removed
- `Zend\Mvc\Controller\AbstractController` no longer directly implements
`Zend\ServiceManager\ServiceLocatorAwareInterface`, but still implements the
methods defined in that interface. This was done to provide
forwards-compatibility, as zend-servicemanager v3 no longer defines the
interface. All initializers that do `ServiceLocatorInterface` injection were
updated to also inject when just the methods are present.
### Fixed
- [#31](https://github.com/zendframework/zend-mvc/pull/31) and
[#76](https://github.com/zendframework/zend-mvc/pull/76) update the component
to be forwards-compatible with zend-eventmanager v3.
- [#36](https://github.com/zendframework/zend-mvc/pull/36),
[#76](https://github.com/zendframework/zend-mvc/pull/76),
[#80](https://github.com/zendframework/zend-mvc/pull/80),
[#81](https://github.com/zendframework/zend-mvc/pull/81), and
[#82](https://github.com/zendframework/zend-mvc/pull/82) update the component
to be forwards-compatible with zend-servicemanager v3. Several changes were
introduced to support this effort:
- Added a `RouteInvokableFactory`, which can act as either a
`FactoryInterface` or `AbstractFactoryInterface` for loading invokable route
classes, including by fully qualified class name. This is registered as an
abstract factory by default with the `RoutePluginManager`.
- The `DispatchListener` now receives the controller manager instance at
instantiation.
- The `ViewManager` implementations were updated, and most functionality
within separated into discrete factories.
## 2.6.3 - 2016-02-23
### Added
- Nothing.
### Deprecated
- Nothing.
### Removed
- Nothing.
### Fixed
- [#74](https://github.com/zendframework/zend-mvc/pull/74) fixes the
`FormAnnotationBuilderFactory`'s usage of the
`FormElementManager::injectFactory()` method to ensure it works correctly on
all versions.
## 2.6.2 - 2016-02-22
### Added
- Nothing.
### Deprecated
- Nothing.
### Removed
- Nothing.
### Fixed
- [#71](https://github.com/zendframework/zend-mvc/pull/71) fixes the
`ViewHelperManagerFactory` to be backwards-compatible with v2 by ensuring that
the factories for each of the `url`, `basepath`, and `doctype` view helpers
are registered using the fully qualified class names present in
`Zend\View\HelperPluginManager`; these changes ensure requests for these
helpers resolve to these override factories, instead of the
`InvokableFactory`.
## 2.6.1 - 2016-02-16
### Added
- Nothing.
### Deprecated
- Nothing.
### Removed
- Nothing.
### Fixed
- [#69](https://github.com/zendframework/zend-mvc/pull/69) largely reverts
[#30](https://github.com/zendframework/zend-mvc/pull/30), having the component
utilize the `HydratorPluginManager` from zend-stdlib 2.7.5. This was done to
provide backwards compatibility; while zend-stdlib Hydrator types can be used
in place of zend-hydrator types, the reverse is not true.
You can make your code forwards-compatible with version 3, where the
`HydratorPluginManager` will be pulled from zend-hydrator, by updating your
typehints to use the zend-hydrator classes instead of those from zend-stdlib;
the instances returned from the zend-stdlib `HydratorPluginManager`, because
they extend those from zend-hydrator, remain compatible.
## 2.6.0 - 2015-09-22
### Added
- [#30](https://github.com/zendframework/zend-mvc/pull/30) updates the component
to use zend-hydrator for hydrator functionality; this provides forward
compatibility with zend-hydrator, and backwards compatibility with
hydrators from older versions of zend-stdlib.
### Deprecated
- Nothing.
### Removed
- Nothing.
### Fixed
- Nothing.
## 2.5.3 - 2015-09-22
### Added
- Nothing.
### Deprecated
- Nothing.
### Removed
- Nothing.
### Fixed
- [#29](https://github.com/zendframework/zend-mvc/pull/29) updates the
zend-stdlib dependency to reference `>=2.5.0,<2.7.0` to ensure hydrators
will work as expected following extraction of hydrators to the zend-hydrator
repository.
## 2.5.2 - 2015-09-14
### Added
- Nothing.
### Deprecated
- Nothing.
### Removed
- Nothing.
### Fixed
- [#27](https://github.com/zendframework/zend-mvc/pull/27) fixes a condition
where non-view model results from controllers could cause errors to be
raised in the `DefaultRenderingStrategy`.
# Packer Templates mainly for the Vagrant [libvirt][libvirt] and [VirtualBox][virtualbox]
## Customized+Clean/Minimal boxes for [libvirt][libvirt] and [VirtualBox][virtualbox]
[libvirt]: https://github.com/vagrant-libvirt/vagrant-libvirt
[virtualbox]: https://www.vagrantup.com/docs/virtualbox/
[](https://github.com/ruzickap/packer-templates)
---
### GitHub repository for bug reports or feature requests
* [https://github.com/ruzickap/packer-templates/](https://github.com/ruzickap/packer-templates/issues)
### Vagrant Cloud repository for the images build by these templates
* [https://app.vagrantup.com/peru](https://app.vagrantup.com/peru)
## Requirements
* [QEMU-KVM](https://en.wikibooks.org/wiki/QEMU/Installing_QEMU)
* [Vagrant](https://www.vagrantup.com/downloads.html)
* [Vagrant Libvirt Plugin](https://github.com/pradels/vagrant-libvirt#installation)
* [VirtualBox](https://www.virtualbox.org/) (Version 6.1 or later)
* [Packer](https://www.packer.io/) (Version 1.6.0 or later)
## Login Credentials
`root` / `Administrator` password is `vagrant` or is not set.
Default login credentials:
* Username: `vagrant`
* Password: `vagrant`
## VM Specifications
Drivers / Devices added for the VMs for specific providers.
### Libvirt
* VirtIO dynamic Hard Disk (up to 50 GiB)
* VirtIO Network Interface
* QXL Video Card (SPICE display)
* Channel Device (com.redhat.spice.0)
### VirtualBox
* SATA Disk
## Configuration
### Minimal Linux installation
* en_US.UTF-8
* keymap for standard US keyboard
* UTC timezone
* NTP enabled (default configuration)
* full-upgrade
* unattended-upgrades
* /dev/vda1 mounted on / using ext4/xfs filesystem (all files in one partition)
* no swap
### Customized Linux installation
Some of the [images](https://app.vagrantup.com/boxes/search?utf8=%E2%9C%93&sort=downloads&provider=&q=peru/my)/templates
begin with "my_" - they are preconfigured with [Ansible role](https://github.com/ruzickap/ansible-role-my_common_defaults/):
* there are usually many customizations depending on the distribution - all are
described in [Ansible playbook](https://github.com/ruzickap/packer-templates/blob/master/ansible/site.yml).
* added packages: see the [Common list](https://github.com/ruzickap/ansible-role-my_common_defaults/blob/master/vars/main.yml)
and [Debian list](https://github.com/ruzickap/ansible-role-my_common_defaults/blob/master/vars/Debian.yml)
or [CentOS list](https://github.com/ruzickap/ansible-role-my_common_defaults/blob/master/vars/RedHat.yml)
* mouse disabled in Midnight Commander + other MC customizations
* preconfigured snmpd, vim, screen
* logrotate using xz instead of gzip
* logwatch is running once per week instead of once per day
* sshd is using only strong algorithms
* sysstat (sar) is running every minute instead of every 5 minutes
### Minimal Windows installation
* UTC timezone
* IEHarden disabled
* Home Page set to "about:blank"
* First Run Wizard disabled
* Firewall allows Remote Desktop connections
* AutoActivation skipped
* DoNotOpenInitialConfigurationTasksAtLogon set to true
* WinRM (SSL) enabled
* New Network Window turned off
* Administrator account enabled
* EnableLUA
* Windows image was finalized using `sysprep`: [unattended.xml](https://github.com/ruzickap/packer-templates/blob/master/scripts/win-common/unattend.xml)
### Customized Windows 10 installation
* added packages: see the [common_windows_packages](https://github.com/ruzickap/ansible-role-my_common_defaults/blob/master/vars/Windows.yml)
* Additional configuration done via Ansible playbook [Win32NT-common.yml](https://github.com/ruzickap/ansible-role-my_common_defaults/blob/master/tasks/Win32NT-common.yml)
### Additional Drivers installed for libvirt boxes - [VirtIO](https://docs.fedoraproject.org/en-US/quick-docs/creating-windows-virtual-machines-using-virtio-drivers)
Installed during installation:
* NetKVM: VirtIO Network driver
* qxldod: QXL graphics driver
* viostor: VirtIO Block driver (VirtIO SCSI controller driver)
Installed components via Ansible playbook [win-simple.yml](https://github.com/ruzickap/packer-templates/blob/master/ansible/win-simple.yml)
for Windows:
* vioscsi: Support for VirtIO SCSI pass-through controller
* Balloon: VirtIO Memory Balloon driver
* viorng: VirtIO RNG Device driver
* vioser: VirtIO Serial Driver
* vioinput: VirtIO Input Driver - support for new QEMU input devices
virtio-keyboard-pci, virtio-mouse-pci, virtio-tablet-pci,
virtio-input-host-pci
* pvpanic: QEMU pvpanic device driver
* qemu-ga: [Qemu Guest Agent](http://wiki.libvirt.org/page/Qemu_guest_agent)
### Additional Drivers installed for VirtualBox boxes
* VirtualBox Guest Additions
## How to build images
If you have the necessary software installed and configured on your local
machine, you can use the following commands to build the images.
You can build the images using the build script [build.sh](build.sh) or directly
with Packer.
* Ubuntu requirements:
```bash
sudo apt update
sudo apt install -y ansible curl git jq libc6-dev libvirt-daemon-system libvirt-dev python3-winrm qemu-kvm sshpass xorriso unzip virtualbox
PACKER_LATEST_VERSION="$(curl -s https://checkpoint-api.hashicorp.com/v1/check/packer | jq -r -M '.current_version')"
curl "https://releases.hashicorp.com/packer/${PACKER_LATEST_VERSION}/packer_${PACKER_LATEST_VERSION}_linux_amd64.zip" --output /tmp/packer_linux_amd64.zip
sudo unzip /tmp/packer_linux_amd64.zip -d /usr/local/bin/
rm /tmp/packer_linux_amd64.zip
VAGRANT_LATEST_VERSION=$(curl -s https://checkpoint-api.hashicorp.com/v1/check/vagrant | jq -r -M '.current_version')
curl "https://releases.hashicorp.com/vagrant/${VAGRANT_LATEST_VERSION}/vagrant_${VAGRANT_LATEST_VERSION}_x86_64.deb" --output /tmp/vagrant_x86_64.deb
sudo apt install --no-install-recommends -y /tmp/vagrant_x86_64.deb
rm /tmp/vagrant_x86_64.deb
sudo gpasswd -a ${USER} kvm ; sudo gpasswd -a ${USER} libvirt ; sudo gpasswd -a ${USER} vboxusers
vagrant plugin install vagrant-libvirt
```
* Debian 10+ requirements:
VirtualBox is not in Debian 10 or later. If you need it, you will need to
figure out a way to install it.
```bash
echo 'deb http://deb.debian.org/debian bullseye main contrib non-free' | sudo tee /etc/apt/sources.list.d/bullseye.list
sudo sed --regexp-extended 's/^([^#].+\s+main)$/\1 contrib non-free/;' --in-place /etc/apt/sources.list ## Ensure required apt components are enabled.
cat <<EOF | sudo tee /etc/apt/preferences.d/bullseye.pref
Explanation: Just install packages from bullseye if they are not in buster or buster-backports. Do not upgrade. Delete this file when you want to upgrade to bullseye.
Package: *
Pin: release o=Debian,n=bullseye
Pin-Priority: 50
EOF
sudo apt update
sudo apt install -y ansible curl git jq libc6-dev libvirt-daemon-system libvirt-dev python3-winrm qemu-kvm sshpass xorriso unzip packer/bullseye vagrant vagrant-libvirt
sudo gpasswd -a ${USER} kvm ; sudo gpasswd -a ${USER} libvirt
sudo gpasswd -a ${USER} vboxusers ## If you have VirtualBox installed.
```
* Fedora requirements:
```bash
sudo dnf install https://download1.rpmfusion.org/free/fedora/rpmfusion-free-release-$(rpm -E %fedora).noarch.rpm
sudo dnf install -y ansible curl git jq libvirt libvirt-devel qemu-kvm ruby-devel xorriso unzip VirtualBox
PACKER_LATEST_VERSION="$(curl -s https://checkpoint-api.hashicorp.com/v1/check/packer | jq -r -M '.current_version')"
curl "https://releases.hashicorp.com/packer/${PACKER_LATEST_VERSION}/packer_${PACKER_LATEST_VERSION}_linux_amd64.zip" --output /tmp/packer_linux_amd64.zip
sudo unzip /tmp/packer_linux_amd64.zip -d /usr/local/bin/
rm /tmp/packer_linux_amd64.zip
VAGRANT_LATEST_VERSION=$(curl -s https://checkpoint-api.hashicorp.com/v1/check/vagrant | jq -r -M '.current_version')
sudo dnf install -y https://releases.hashicorp.com/vagrant/${VAGRANT_LATEST_VERSION}/vagrant_${VAGRANT_LATEST_VERSION}_x86_64.rpm
CONFIGURE_ARGS="with-ldflags=-L/opt/vagrant/embedded/lib with-libvirt-include=/usr/include/libvirt with-libvirt-lib=/usr/lib64/libvirt" vagrant plugin install vagrant-libvirt
sudo gpasswd -a ${USER} kvm ; sudo gpasswd -a ${USER} libvirt ; sudo gpasswd -a ${USER} vboxusers
systemctl start libvirtd
```
### Build process with the [build.sh](build.sh) script
```bash
git clone --recurse-submodules https://github.com/ruzickap/packer-templates.git
cd packer-templates
```
* Ubuntu:
```bash
# Ubuntu Server
./build.sh ubuntu-{20.04,18.04,16.04}-server-amd64-{libvirt,virtualbox}
# Ubuntu Desktop
./build.sh ubuntu-{20.04,18.04}-desktop-amd64-{libvirt,virtualbox}
# Ubuntu Server - customized
./build.sh my_ubuntu-{20.04,18.04,16.04}-server-amd64-{libvirt,virtualbox}
```
* Windows:
```bash
# Windows Server
./build.sh windows-server-2012_r2-standard-x64-eval-{libvirt,virtualbox}
./build.sh windows-server-2016-standard-x64-eval-{libvirt,virtualbox}
./build.sh windows-server-2019-standard-x64-eval-{libvirt,virtualbox}
./build.sh windows-server-2022-standard-x64-eval-{libvirt,virtualbox}
# Windows 10
./build.sh windows-10-enterprise-x64-eval-{libvirt,virtualbox}
# Windows 10 - customized
./build.sh my_windows-10-enterprise-x64-eval-{libvirt,virtualbox}
```
### Build process with the Packer
* Ubuntu:
```bash
# Ubuntu Server
NAME="ubuntu-20.04-server-amd64" \
UBUNTU_IMAGES_URL="http://archive.ubuntu.com/ubuntu/dists/focal/main/installer-amd64/current/legacy-images/" \
UBUNTU_TYPE="server" PACKER_IMAGES_OUTPUT_DIR="/var/tmp/" \
packer build -only="qemu" ubuntu-server.json
NAME="ubuntu-18.04-server-amd64" \
UBUNTU_IMAGES_URL="http://archive.ubuntu.com/ubuntu/dists/bionic-updates/main/installer-amd64/current/images/" \
UBUNTU_TYPE="server" PACKER_IMAGES_OUTPUT_DIR="/var/tmp/" \
packer build -only="qemu" ubuntu-server.json
NAME="ubuntu-16.04-server-amd64" \
UBUNTU_IMAGES_URL="http://archive.ubuntu.com/ubuntu/dists/xenial-updates/main/installer-amd64/current/images/" \
UBUNTU_TYPE="server" PACKER_IMAGES_OUTPUT_DIR="/var/tmp/" \
packer build -only="qemu" ubuntu-server.json
# Ubuntu Desktop
NAME="ubuntu-20.04-desktop-amd64" \
UBUNTU_IMAGES_URL="http://archive.ubuntu.com/ubuntu/dists/focal/main/installer-amd64/current/legacy-images/" \
UBUNTU_TYPE="desktop" PACKER_IMAGES_OUTPUT_DIR="/var/tmp/" \
packer build -only="qemu" ubuntu-desktop.json
# Ubuntu Server - customized
NAME="my_ubuntu-20.04-server-amd64" \
UBUNTU_IMAGES_URL="http://archive.ubuntu.com/ubuntu/dists/focal/main/installer-amd64/current/legacy-images/" \
UBUNTU_TYPE="server" PACKER_IMAGES_OUTPUT_DIR="/var/tmp/" \
packer build -only="qemu" my_ubuntu-server.json
NAME="my_ubuntu-18.04-server-amd64" \
UBUNTU_IMAGES_URL="http://archive.ubuntu.com/ubuntu/dists/bionic-updates/main/installer-amd64/current/images/" \
UBUNTU_TYPE="server" PACKER_IMAGES_OUTPUT_DIR="/var/tmp/" \
packer build -only="qemu" my_ubuntu-server.json
NAME="my_ubuntu-16.04-server-amd64" \
UBUNTU_IMAGES_URL="http://archive.ubuntu.com/ubuntu/dists/xenial-updates/main/installer-amd64/current/images/" \
UBUNTU_TYPE="server" PACKER_IMAGES_OUTPUT_DIR="/var/tmp/" \
packer build -only="qemu" my_ubuntu-server.json
```
* Windows:
```bash
curl -L -o /var/tmp/virtio-win.iso https://fedorapeople.org/groups/virt/virtio-win/direct-downloads/latest-virtio/virtio-win.iso
xorriso -report_about WARNING -osirrox on -indev /var/tmp/virtio-win.iso -extract / /var/tmp/virtio-win
export TMPDIR=/var/tmp
# Windows Server
## Windows Server 2022
export NAME="windows-server-2022-standard-x64-eval"
export WINDOWS_VERSION="2022"
export VIRTIO_WIN_ISO_DIR="/var/tmp/virtio-win"
export ISO_URL="https://software-download.microsoft.com/download/sg/20348.169.210806-2348.fe_release_svc_refresh_SERVER_EVAL_x64FRE_en-us.iso"
export PACKER_IMAGES_OUTPUT_DIR="/var/tmp/"
packer build -only="qemu" windows.json
## Windows Server 2019
export NAME="windows-server-2019-standard-x64-eval"
export WINDOWS_VERSION="2019"
export VIRTIO_WIN_ISO_DIR="/var/tmp/virtio-win"
export ISO_URL="https://software-download.microsoft.com/download/pr/17763.737.190906-2324.rs5_release_svc_refresh_SERVER_EVAL_x64FRE_en-us_1.iso"
export PACKER_IMAGES_OUTPUT_DIR="/var/tmp/"
packer build -only="qemu" windows.json
## Windows Server 2016
export NAME="windows-server-2016-standard-x64-eval"
export WINDOWS_VERSION="2016"
export VIRTIO_WIN_ISO_DIR="/var/tmp/virtio-win"
export ISO_URL="https://software-download.microsoft.com/download/pr/Windows_Server_2016_Datacenter_EVAL_en-us_14393_refresh.ISO"
export PACKER_IMAGES_OUTPUT_DIR="/var/tmp/"
packer build -only="qemu" windows.json
## Windows Server 2012
export NAME="windows-server-2012_r2-standard-x64-eval"
export WINDOWS_VERSION="2012"
export VIRTIO_WIN_ISO_DIR="/var/tmp/virtio-win"
export ISO_URL="http://care.dlservice.microsoft.com/dl/download/6/2/A/62A76ABB-9990-4EFC-A4FE-C7D698DAEB96/9600.17050.WINBLUE_REFRESH.140317-1640_X64FRE_SERVER_EVAL_EN-US-IR3_SSS_X64FREE_EN-US_DV9.ISO"
export PACKER_IMAGES_OUTPUT_DIR="/var/tmp/"
packer build -only="qemu" windows.json
# Windows 10
export NAME="windows-10-enterprise-x64-eval"
export WINDOWS_VERSION="10"
export VIRTIO_WIN_ISO_DIR="/var/tmp/virtio-win"
export ISO_URL="https://software-download.microsoft.com/download/sg/19043.928.210409-1212.21h1_release_svc_refresh_CLIENTENTERPRISEEVAL_OEMRET_x64FRE_en-us.iso"
export PACKER_IMAGES_OUTPUT_DIR="/var/tmp/"
packer build -only="qemu" windows.json
# Windows 10 - customized
export NAME="my_windows-10-enterprise-x64-eval"
export WINDOWS_VERSION="10"
export VIRTIO_WIN_ISO_DIR="/var/tmp/virtio-win"
export ISO_URL="https://software-download.microsoft.com/download/sg/19043.928.210409-1212.21h1_release_svc_refresh_CLIENTENTERPRISEEVAL_OEMRET_x64FRE_en-us.iso"
export PACKER_IMAGES_OUTPUT_DIR="/var/tmp/"
packer build -only="qemu" my_windows.json
```
## Helper scripts
* `build.sh` - build single image specified on command line
* `build_all.sh` - builds all images
* `build_all_remote_ssh.sh` - connects to a remote Ubuntu server, installs
the necessary packages for building images, and executes `build_all.sh`
* `vagrant_init_destroy_boxes.sh` - tests all `*.box` images in the current
directory using `vagrant add/up/ssh/winrm/destroy`
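As a usage sketch, a box produced by these scripts can be registered and booted with Vagrant; the box file name and the local name `test-box` below are illustrative, not the output of any specific build:
```shell
# Register the generated box under a local name (file name is illustrative).
vagrant box add --name test-box my_ubuntu-20.04-server-amd64-libvirt.box
# Create a Vagrantfile for it and boot the VM.
vagrant init test-box
vagrant up --provider=libvirt
# Log in with the vagrant/vagrant credentials listed above.
vagrant ssh
```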
GitLab CI configuration (obsolete) can be found here: [GitLab_CI_configuration.md](docs/GitLab_CI_configuration.md)
---
title: Update and UpdateIf functions | Microsoft Docs
description: Reference information, including syntax and examples, for the Update and UpdateIf functions in Power Apps
author: gregli-msft
manager: kvivek
ms.service: powerapps
ms.topic: reference
ms.custom: canvas
ms.reviewer: tapanm
ms.date: 10/21/2015
ms.author: gregli
search.audienceType:
- maker
search.app:
- PowerApps
---
# Update and UpdateIf functions in Power Apps
Updates [records](../working-with-tables.md#records) in a [data source](../working-with-data-sources.md).
## Description
### Update function
Use the **Update** function to replace an entire record in a data source. In contrast, the **UpdateIf** and the **[Patch](function-patch.md)** functions modify one or more values in a record, leaving the other values alone.
For a [collection](../working-with-data-sources.md#collections), the entire record must match. Collections allow duplicate records, so multiple records might match. You can use the **All** argument to update all copies of a record; otherwise, only one copy of the record is updated.
If the data source generates a column's value automatically, the value of that [column](../working-with-tables.md#columns) must be reaffirmed.
### UpdateIf function
Use the **UpdateIf** function to modify one or more values in one or more records that match one or more conditions. The condition can be any formula that results in a **true** or **false** and can reference columns of the data source by name. The function evaluates the condition for each record and modifies any record for which the result is **true**.
To specify a modification, use a change record that contains new property values. If you provide this change record inline with curly braces, property formulas can reference properties of the record that's being modified. You can use this behavior to modify records based on a formula.
Similar to **UpdateIf**, you can also use the **[Patch](function-patch.md)** function to change specific columns of a record without affecting other columns.
Both **Update** and **UpdateIf** return the modified data source as a [table](../working-with-tables.md). You must use either function in a [behavior formula](../working-with-formulas-in-depth.md).
### Delegation
[!INCLUDE [delegation-no](../../../includes/delegation-no.md)]
## Syntax
**Update**( *DataSource*, *OldRecord*, *NewRecord* [, **All** ] )
* *DataSource* – Required. The data source that contains the record that you want to replace.
* *OldRecord* – Required. The record to replace.
* *NewRecord* – Required. The replacement record. This isn't a change record. The entire record is replaced, and missing properties will contain *blank*.
* **All** – Optional. In a collection, the same record may appear more than once. Specify the **All** argument to remove all copies of the record.
**UpdateIf**( *DataSource*, *Condition1*, *ChangeRecord1* [, *Condition2*, *ChangeRecord2*, ... ] )
* *DataSource* – Required. The data source that contains the record or records that you want to modify.
* *Condition(s)* – Required. A formula that evaluates to **true** for the record or records that you want to modify. You can use column names of *DataSource* in the formula.
* *ChangeRecord(s)* - Required. For each corresponding condition, a change record of new property values to apply to records of *DataSource* that satisfy the condition. If you provide the record inline using curly braces, property values of the existing record can be used in the property formulas.
## Examples
In these examples, you'll replace or modify records in a data source that's named **IceCream** and that starts with the data in this table:

| Formula | Description | Result |
| --- | --- | --- |
| **Update( IceCream,<br>First( Filter( IceCream, Flavor="Chocolate" ) ), { ID: 1, Flavor: "Mint Chocolate", Quantity:150 } )** |Replaces a record from the data source. |<style> img { max-width: none } </style> <br><br>The **IceCream** data source has been modified. |
| **UpdateIf( IceCream, Quantity > 175, { Quantity: Quantity + 10 } )** |Modifies records that have a **Quantity** that is greater than **175**. The **Quantity** field is incremented by 10, and no other fields are modified. |<br><br>The **IceCream** data source has been modified. |
| **Update( IceCream,<br>First( Filter( IceCream, Flavor="Strawberry" ) ),<br>{ ID: 3, Flavor: "Strawberry Swirl"} )** |Replaces a record from the data source. The **Quantity** property hasn't been supplied in the replacement record, so that property will be *blank* in the result. |<br><br>The **IceCream** data source has been modified. |
| **UpdateIf( IceCream, true, { Quantity: 0 } )** |Sets the value of the **Quantity** property for all records in the data source to 0. |<br> <br>The **IceCream** data source has been modified. |
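For instance, the first two rows of the table above could be chained in a single behavior formula (such as a button's **OnSelect**), using a semicolon to separate the two calls; both calls come directly from the examples table, and only the chaining is new:
```
Update( IceCream,
    First( Filter( IceCream, Flavor = "Chocolate" ) ),
    { ID: 1, Flavor: "Mint Chocolate", Quantity: 150 }
);
UpdateIf( IceCream, Quantity > 175, { Quantity: Quantity + 10 } )
```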
### Step by step
1. Import or create a collection named **Inventory**, and show it in a gallery as [Show data in a gallery](../show-images-text-gallery-sort-filter.md) describes.
2. Name the gallery **ProductGallery**.
3. Add a slider named **UnitsSold**, and set its **Max** property to this expression:<br>**ProductGallery.Selected.UnitsInStock**
4. Add a button, and set its **[OnSelect](../controls/properties-core.md)** property to this formula:<br>**UpdateIf(Inventory, ProductName = ProductGallery.Selected.ProductName, {UnitsInStock:UnitsInStock-UnitsSold.Value})**
5. Press F5, select a product in the gallery, specify a value with the slider, and then select the button.
The number of units in stock for the product you specified decreases by the amount that you specified.
# Day 9: Index and Slicing
**Instructions:**
1. Open a new python file.
2. Specific items can be retrieved from a list by using its indices. _Type and execute:_
`quotes = ["Pitter patter, let's get at 'er", "Hard no!", "H'are ya now?", "Good-n-you?", "Not so bad.", "Is that what you appreciates about me?"]`
`quotes[0]`
`print(f"{quotes[2]}\n\t{quotes[3]}\n{quotes[4]}")`
3. Slicing uses the format `[start:stop:step]`. Start is _inclusive_, but stop is _exclusive_. Unlike using just the index, slicing allows the user to return a sequence rather than a single item. Slicing can be conducted on mutable and immutable objects. _Type and execute:_
`quotes[2:5]`
4. The step can be used to identify how many items to skip between returned values. _Type and execute:_
`quotes[::2]`
5. The step can also be used to reverse the order of the returned items. _Type and execute:_
`quotes[::-1]`
6. Slicing can be combined with indices to return a sequence from a specific item. _Type and execute:_
`quotes[0][::2]`
`quotes[0][::-1]`
7. _Type and execute:_
`wayne = "Toughest Guy in Letterkenny"`
`wayne[::-1]`
8. Retrieval by index and slicing can also be applied directly to a string. _Type and execute:_
`"That's a Texas sized 10-4."[0:9:2]`
9.
10.
11.
12.
13.
14.
15. Update the [log file](../../log.md) with what you have learned today.
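As a recap, the indexing and slicing operations from the steps above can be collected into one runnable script:
```python
quotes = ["Pitter patter, let's get at 'er", "Hard no!", "H'are ya now?",
          "Good-n-you?", "Not so bad.", "Is that what you appreciates about me?"]

print(quotes[0])     # index: a single item
print(quotes[2:5])   # slice: items at indices 2, 3, and 4 (stop is exclusive)
print(quotes[::2])   # step: every other item
print(quotes[::-1])  # negative step: the whole list in reverse order

# Slicing applies to strings too, since they are sequences of characters.
wayne = "Toughest Guy in Letterkenny"
print(wayne[::-1])                          # ynnekretteL ni yuG tsehguoT
print("That's a Texas sized 10-4."[0:9:2])  # every other character of the first nine
```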
Blue Button Sample Client Application - Django Version
======================================================
## Introduction
This client demonstrates authenticating to the Blue Buttom API and subsequent FHIR API calls.
It demonstrates the OAuth2 Server Side web application flow where a `client_secret` is used.
## Status and Contributing
The application is in active development so check back often for updates.
Please consider improving this code with your contributions. Pull requests welcome ;)
## Basic Setup
git clone https://github.com/HHSIDEAlab/django_bluebutton_client.git
cd django_bluebutton_client/bbc
While not required, using `virtualenv` is a good idea.
The following commands work for Python 3+. Please search `virtualenv`
to find equivalent commands to install and set up `virtualenv` for Python 2.7.
python -m venv venv
source venv/bin/activate
The following command assumes a `virtualenv` was created and activated.
If you aren't using `virtualenv`, then you may need to put `sudo` in
front of the following `pip` command.
pip install -r requirements/requirements.in
cp bbc/settings/local_sample.py bbc/settings/local.py
python manage.py migrate --settings bbc.settings.local
### Configuring Your Development Application
By default, your application will be set up to use the public OAuth service
at https://dev.bluebutton.cms.fhirservice.net/. In order to use this version of
the service, you'll need to request an account on that site. So select Account ->
"Request an Invite," fill out the form, setting user type to "Developer," and
we'll get back to you as soon as possible.
Once you have your developer account created and you've verified your email address,
you'll need to set up an application. Log in to your new account, and select
"Applications" -> "Applications You Created" -> "Register New Application". From
here, you can fill out the form with the following options:
Scope: [you likely want to select all available]
Name: [your choice]
Client type: Confidential
Authorization grant type: Authorization Code
Redirect uris: http://localhost:8000/social-auth/complete/oauth2io/
Once you submit the form, you should receive an application key and secret that
can be added to the bbc/settings/local.py file you created above, overwriting
the values for:
* `SOCIAL_AUTH_OAUTH2IO_KEY`
* `SOCIAL_AUTH_OAUTH2IO_SECRET`
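For illustration, the relevant lines of `bbc/settings/local.py` would end up looking something like this (the key and secret values are placeholders for the ones issued to your application, not real credentials):
```python
# bbc/settings/local.py (excerpt) -- placeholder values shown
SOCIAL_AUTH_OAUTH2IO_KEY = "your-application-key"
SOCIAL_AUTH_OAUTH2IO_SECRET = "your-application-secret"
```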
### Final Steps
Finally, you're ready to execute
python manage.py runserver --settings bbc.settings.local
And from here, you can navigate to http://localhost:8000 and test your application.
## Other Settings
* `OAUTH2IO_HOST` - the default is `https://sandbox.bluebutton.cms.gov`
* `EXTERNAL_AUTH_NAME` - the default is `CMS`.
If you change the `OAUTH2IO_HOST` to something non https (for testing), then you need to
tell the oauthlib to operate in an insecure mode like so.
import os
os.environ['OAUTHLIB_INSECURE_TRANSPORT'] = '1'
## Running the Tests
To run the tests against https://sandbox.bluebutton.cms.gov use:
python manage.py test --settings=bbc.settings.test
To run the tests against a local OAuth2/FHIR server instance (http://localhost:8000) use:
python manage.py test --settings=bbc.settings.test_local
# spring-boot-cklxl
spring boot custom extension project
---
title: 'Quickstart: Create a serverless Apache Spark pool (preview) using Synapse Studio'
description: Create a serverless Apache Spark pool using Synapse Studio by following the steps in this guide.
services: synapse-analytics
author: saveenr
ms.service: synapse-analytics
ms.topic: quickstart
ms.subservice: spark
ms.date: 10/16/2020
ms.author: saveenr
ms.reviewer: jrasnick
ms.openlocfilehash: 313ad0c620fe06158e96c208ae265702134b58d1
ms.sourcegitcommit: 96918333d87f4029d4d6af7ac44635c833abb3da
ms.translationtype: HT
ms.contentlocale: zh-CN
ms.lasthandoff: 11/04/2020
ms.locfileid: "93324207"
---
# <a name="quickstart-create-a-serverless-apache-spark-pool-preview-using-synapse-studio"></a>Quickstart: Create a serverless Apache Spark pool (preview) using Synapse Studio
Azure Synapse Analytics offers various analytics engines to help you ingest, transform, model, analyze, and serve your data. An Apache Spark pool provides open-source big data compute capabilities. After you create an Apache Spark pool in your Synapse workspace, data can be loaded, modeled, processed, and served to obtain insights.
This quickstart describes the steps to create an Apache Spark pool in a Synapse workspace by using Synapse Studio.
> [!IMPORTANT]
> Billing for Spark instances is prorated per minute, whether you are using them or not. Be sure to shut down your Spark instance after you have finished using it, or set a short timeout. For more information, see the **Clean up resources** section of this article.
If you don't have an Azure subscription, [create a free account before you begin](https://azure.microsoft.com/free/).
## <a name="prerequisites"></a>Prerequisites
- Azure subscription - [create one for free](https://azure.microsoft.com/free/)
- [Synapse workspace](./quickstart-create-workspace.md)
## <a name="sign-in-to-the-azure-portal"></a>Sign in to the Azure portal
Sign in to the [Azure portal](https://portal.azure.com/)
## <a name="navigate-to-the-synapse-workspace"></a>Navigate to the Synapse workspace
1. Navigate to the Synapse workspace where the Apache Spark pool will be created by typing the service name (or resource name directly) into the search bar.

1. From the list of workspaces, type the name (or part of the name) of the workspace to open. In this example, we'll use a workspace named contosoanalytics.

## <a name="launch-synapse-studio"></a>Launch Synapse Studio
From the workspace overview, select the **Workspace web URL** to open Synapse Studio.

## <a name="create-the-apache-spark-pool-in-synapse-studio"></a>Create the Apache Spark pool in Synapse Studio
1. On the Synapse Studio home page, navigate to the **Management Hub** in the left navigation by selecting the **Manage** icon.

1. Once in the Management Hub, navigate to the **Apache Spark pools** section to see the current list of Apache Spark pools that are available in the workspace.

1. Select **+ New**, and the new Apache Spark pool create wizard will appear.
1. Enter the following details in the **Basics** tab:
| Setting | Suggested value | Description |
| :------ | :-------------- | :---------- |
| **Apache Spark pool name** | contosospark | This is the name that the Apache Spark pool will have. |
| **Node size** | Small (4 vCPU / 32 GB) | Set this to the smallest size to reduce costs for this quickstart |
| **Autoscale** | Disabled | We won't need autoscale in this quickstart |
| **Number of nodes** | 8 | Use a small size to limit costs in this quickstart |

> [!IMPORTANT]
> Note that there are specific limitations for the names that Apache Spark pools can use. Names must contain letters or numbers only, must be 15 or fewer characters, must start with a letter, must not contain reserved words, and must be unique in the workspace.
1. In the next tab (**Additional settings**), leave all the settings as defaults.
1. We won't add any tags for now, so select **Review + create**.
1. In the **Review + create** tab, make sure that the details look correct based on what was previously entered, and then press **Create**.

1. The Apache Spark pool will start the provisioning process.
1. Once the provisioning is complete, the new Apache Spark pool will appear in the list.

## <a name="clean-up-apache-spark-pool-resources-using-synapse-studio"></a>Clean up Apache Spark pool resources using Synapse Studio
Follow the steps below to delete the Apache Spark pool from the workspace using Synapse Studio.
> [!WARNING]
> Deleting a Spark pool will remove the analytics engine from the workspace. It will no longer be possible to connect to the pool, and all queries, pipelines, and notebooks that use this Spark pool will no longer work.
To delete the Apache Spark pool:
1. Navigate to the Apache Spark pools in the Management Hub in Synapse Studio.
1. Select the ellipsis next to the Apache pool to be deleted (in this case, contosospark) to show the commands for the Apache Spark pool.

1. Press **Delete**.
1. Confirm the deletion, and press the **Delete** button.
1. When the process completes successfully, the Apache Spark pool will no longer be listed in the workspace resources.
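As an aside, the same create and delete operations can also be scripted with the Azure CLI; the sketch below is an assumption-laden equivalent, not part of the portal walkthrough — the resource group name `contoso-rg` is invented, and exact flag availability depends on your CLI version:
```shell
# Assumed resource group name; substitute your own values.
az synapse spark pool create \
  --name contosospark \
  --workspace-name contosoanalytics \
  --resource-group contoso-rg \
  --spark-version 2.4 \
  --node-size Small \
  --node-count 8
# Delete the pool when finished to avoid further charges.
az synapse spark pool delete \
  --name contosospark \
  --workspace-name contosoanalytics \
  --resource-group contoso-rg
```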
## <a name="next-steps"></a>Next steps
- See [Quickstart: Create an Apache Spark pool in Synapse Studio using web tools](quickstart-apache-spark-notebook.md).
- See [Quickstart: Create an Apache Spark pool using the Azure portal](quickstart-create-apache-spark-pool-portal.md).
# Spring Boot Intro
Simple Spring Boot web application example.
For a complete example of how Spring Boot works with other modules (security, social media, etc.), see https://github.com/codenergic/theskeleton. Contributions are always welcome.
## Running the development server
```bash
$ mvn spring-boot:run
```
## Building package and starting application
```bash
$ mvn package
$ java -jar target/*.jar
```
## License
```
Copyright 2018 the original author or authors.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
```
# Mehkee 96KEE

96-key Keyboard from mehkee
* Keyboard Maintainer: [johanntang](https://github.com/johanntang)
* Hardware Supported: mehkee96
* Hardware Availability: [mehkee, group buy closed](https://mehkee.com/products/96kee?variant=46912017423)
Make example for this keyboard (after setting up your build environment):
make mehkee96:default
Flashing example for this keyboard ([after setting up the bootloadHID flashing environment](https://docs.qmk.fm/#/flashing_bootloadhid))
make mehkee96:default:flash
See the [build environment setup](https://docs.qmk.fm/#/getting_started_build_tools) and the [make instructions](https://docs.qmk.fm/#/getting_started_make_guide) for more information. Brand new to QMK? Start with our [Complete Newbs Guide](https://docs.qmk.fm/#/newbs).
# TIL
> Today I Learned
A collection of concise write-ups on small things I learn day to day across a
variety of languages and technologies. These are things that don't really
warrant a full blog post. These are things I've picked up by [Learning In
Public™](https://dev.to/jbranchaud/how-i-built-a-learning-machine-45k9) and
pairing with smart people at Hashrocket.
For a steady stream of TILs, [sign up for my newsletter](https://tinyletter.com/jbranchaud).
_1144 TILs and counting..._
---
### Categories
* [Ack](#ack)
* [Amplify](#amplify)
* [Chrome](#chrome)
* [Clojure](#clojure)
* [CSS](#css)
* [Devops](#devops)
* [Elixir](#elixir)
* [Gatsby](#gatsby)
* [Git](#git)
* [GitHub Actions](#github-actions)
* [Go](#go)
* [HTML](#html)
* [HTTP](#http)
* [Internet](#internet)
* [JavaScript](#javascript)
* [jq](#jq)
* [Kitty](#kitty)
* [Linux](#linux)
* [Mac](#mac)
* [MongoDB](#mongodb)
* [MySQL](#mysql)
* [Netlify](#netlify)
* [Next.js](#nextjs)
* [Phoenix](#phoenix)
* [PostgreSQL](#postgresql)
* [Python](#python)
* [Rails](#rails)
* [React](#react)
* [React Native](#react-native)
* [React Testing Library](#react-testing-library)
* [ReasonML](#reasonml)
* [Ruby](#ruby)
* [sed](#sed)
* [Shell](#shell)
* [Tailwind CSS](#tailwind-css)
* [tmux](#tmux)
* [TypeScript](#typescript)
* [Unix](#unix)
* [Vercel](#vercel)
* [Vim](#vim)
* [VSCode](#vscode)
* [Webpack](#webpack)
* [Workflow](#workflow)
* [YAML](#yaml)
---
### Ack
- [ack --bar](ack/ack-bar.md)
- [Case-Insensitive Search](ack/case-insensitive-search.md)
- [List Available File Types](ack/list-available-file-types.md)
### Amplify
- [Sign Up User With Email And Password](amplify/sign-up-user-with-email-and-password.md)
### Chrome
- [Access A Value Logged To The Console](chrome/access-a-value-logged-to-the-console.md)
- [Chrome Supports Many Unix Keyboard Shortcuts](chrome/chrome-supports-many-unix-keyboard-shortcuts.md)
- [Copy Some Data From The Console](chrome/copy-some-data-from-the-console.md)
- [Duplicate The Current Tab](chrome/duplicate-the-current-tab.md)
- [Easier Access To Network Throttling Controls](chrome/easier-access-to-network-throttling-controls.md)
- [Pretty Print Tabular Data](chrome/pretty-print-tabular-data.md)
- [Reference The Selected Node](chrome/reference-the-selected-node.md)
- [Selecting DOM Elements Faster Than Ever](chrome/selecting-dom-elements-faster-than-ever.md)
- [Simulating Various Connection Speeds](chrome/simulating-various-connection-speeds.md)
- [Toggle Device Mode](chrome/toggle-device-mode.md)
- [Toggle Open The Console Drawer](chrome/toggle-open-the-console-drawer.md)
- [View Network Traffic For New Tabs](chrome/view-network-traffic-for-new-tabs.md)
### Clojure
- [Aggregation Using merge-with](clojure/aggregation-using-merge-with.md)
- [Argument Requirements For A Function](clojure/argument-requirements-for-a-function.md)
- [Combinations Of Items From A Sequence](clojure/combinations-of-items-from-a-sequence.md)
- [Define Something Only Once](clojure/define-something-only-once.md)
- [Evaluate One Liners With lein-exec](clojure/evaluate-one-liners-with-lein-exec.md)
- [Expanding Macros](clojure/expanding-macros.md)
- [Get The Value Of An Environment Variable](clojure/get-the-value-of-an-environment-variable.md)
- [List Functions For A Namespace](clojure/list-functions-for-a-namespace.md)
- [Load A File Into The REPL](clojure/load-a-file-into-the-repl.md)
- [Mapping With An Index](clojure/mapping-with-an-index.md)
- [Open JavaDocs](clojure/open-javadocs.md)
- [Pretty Print The Last Thing](clojure/pretty-print-the-last-thing.md)
- [Quick Clojure Docs](clojure/quick-clojure-docs.md)
- [Reductions](clojure/reductions.md)
- [Set Max Heap Size](clojure/set-max-heap-size.md)
- [Specify the Directory of a Shell Command](clojure/specify-the-directory-of-a-shell-command.md)
- [Splitting On Whitespace](clojure/splitting-on-whitespace.md)
- [Swap Two Items in a Vector](clojure/swap-two-items-in-a-vector.md)
- [Try A Clojure Project In The REPL](clojure/try-a-clojure-project-in-the-repl.md)
- [Type of Anything](clojure/type-of-anything.md)
- [When Overflow Is Desired](clojure/when-overflow-is-desired.md)
### CSS
- [Add Fab Icons To Your Site With FontAwesome 5](css/add-fab-icons-to-your-site-with-fontawesome-5.md)
- [Animate Smoothly Between Two Background Colors](css/animate-smoothly-between-two-background-colors.md)
- [Apply Multiple Box Shadows To Single Element](css/apply-multiple-box-shadows-to-single-element.md)
- [Apply Styles Based On Dark-Mode Preferences](css/apply-styles-based-on-dark-mode-preferences.md)
- [Apply Styles To The Last Child Of A Specific Type](css/apply-styles-to-the-last-child-of-a-specific-type.md)
- [Change The Orientation Of An Image](css/change-the-orientation-of-an-image.md)
- [Circular Icons With A Massive Border Radius](css/circular-icons-with-a-massive-border-radius.md)
- [Clean Up Repetition With :is() Pseudo-Class](css/clean-up-repetition-with-is-pseudo-class.md)
- [Conditional Styling For Unsupported CSS Features](css/conditional-styling-for-unsupported-css-features.md)
- [Create A Pulsing Background With CSS Animation](css/create-a-pulsing-background-with-css-animation.md)
- [Define CSS Custom Properties With SCSS Variables](css/define-css-custom-properties-with-scss-variables.md)
- [Define HSL Colors With Alpha Values](css/define-hsl-colors-with-alpha-values.md)
- [Display Responsive iframe Maintaining Aspect Ratio](css/display-responsive-iframe-maintaining-aspect-ratio.md)
- [Dry Up SCSS With Mixins](css/dry-up-scss-with-mixins.md)
- [Give Elements The Same Width With Flexbox](css/give-elements-the-same-width-with-flexbox.md)
- [Let Pointer Events Pass Through An Element](css/let-pointer-events-pass-through-an-element.md)
- [Lighten And Darken With CSS Brightness Filter](css/lighten-and-darken-with-css-brightness-filter.md)
- [Lighten And Darken With SCSS](css/lighten-and-darken-with-scss.md)
- [Make A Block Of Text Respect New Lines](css/make-a-block-of-text-respect-new-lines.md)
- [Parameterized SCSS Mixins](css/parameterized-scss-mixins.md)
- [:root Has Higher Specificity Than html](css/root-has-higher-specificity-than-html.md)
- [Style A Background With A Linear Gradient](css/style-a-background-with-a-linear-gradient.md)
- [Using Maps In SCSS](css/using-maps-in-scss.md)
### Devops
- [Aliasing An Ansible Host](devops/aliasing-an-ansible-host.md)
- [Allow Cross-Origin Requests To Include Cookies](devops/allow-cross-origin-requests-to-include-cookies.md)
- [Allow HTTPS Through Your UFW Firewall](devops/allow-https-through-your-ufw-firewall.md)
- [Check The Status of All Services](devops/check-the-status-of-all-services.md)
- [Check The Syntax Of nginx Files](devops/check-the-syntax-of-nginx-files.md)
- [Connect To An RDS PostgreSQL Database](devops/connect-to-an-rds-postgresql-database.md)
- [Determine The IP Address Of A Domain](devops/determine-the-ip-address-of-a-domain.md)
- [Path Of The Packets](devops/path-of-the-packets.md)
- [Push Non-master Branch To Heroku](devops/push-non-master-branch-to-heroku.md)
- [Reload The nginx Configuration](devops/reload-the-nginx-configuration.md)
- [Resolve The Public IP Of A URL](devops/resolve-the-public-ip-of-a-url.md)
- [Running Out Of inode Space](devops/running-out-of-inode-space.md)
- [SSH Into A Docker Container](devops/ssh-into-a-docker-container.md)
- [SSL Certificates Can Cover Multiple Domains](devops/ssl-certificates-can-cover-multiple-domains.md)
- [Wipe A Heroku Postgres Database](devops/wipe-a-heroku-postgres-database.md)
### Elixir
- [All Values For A Key In A Keyword List](elixir/all-values-for-a-key-in-a-keyword-list.md)
- [Append To A Keyword List](elixir/append-to-a-keyword-list.md)
- [Assert An Exception Is Raised](elixir/assert-an-exception-is-raised.md)
- [Binary Representation Of A String](elixir/binary-representation-of-a-string.md)
- [Check For A Substring Match](elixir/check-for-a-substring-match.md)
- [Check List Membership](elixir/check-list-membership.md)
- [Comparing DateTime Structs](elixir/comparing-datetime-structs.md)
- [Compute Intermediate Values In A With Construct](elixir/compute-intermediate-values-in-a-with-construct.md)
- [Compute md5 Digest Of A String](elixir/compute-md5-digest-of-a-string.md)
- [Counting Records With Ecto](elixir/counting-records-with-ecto.md)
- [Create A Date With The Date Sigil](elixir/create-a-date-with-the-date-sigil.md)
- [Create A List Of Atoms](elixir/create-a-list-of-atoms.md)
- [Creating A PID](elixir/creating-a-pid.md)
- [Creating Indexes With Ecto](elixir/creating-indexes-with-ecto.md)
- [Defining Multiple Clauses In An Anonymous Function](elixir/defining-multiple-clauses-in-an-anonymous-function.md)
- [Determine The Latest Release Of A Hex Package](elixir/determine-the-latest-release-of-a-hex-package.md)
- [Do You Have The Time?](elixir/do-you-have-the-time.md)
- [Do You Have The Time? - Part 2](elixir/do-you-have-the-time-part-2.md)
- [Documentation Lookup With Vim And Alchemist](elixir/documentation-lookup-with-vim-and-alchemist.md)
- [Dynamically Generating Atoms](elixir/dynamically-generating-atoms.md)
- [Execute Raw SQL In An Ecto Migration](elixir/execute-raw-sql-in-an-ecto-migration.md)
- [Expose Internal Representation](elixir/expose-internal-representation.md)
- [Include Captures With String.split](elixir/include-captures-with-string-split.md)
- [Inspecting The Process Message Queue](elixir/inspecting-the-process-message-queue.md)
- [List Functions For A Module](elixir/list-functions-for-a-module.md)
- [Listing Files In IEx](elixir/listing-files-in-iex.md)
- [Match On A Map In A With Construct](elixir/match-on-a-map-in-a-with-construct.md)
- [Passing Around And Using Modules](elixir/passing-around-and-using-modules.md)
- [Pattern Matching In Anonymous Functions](elixir/pattern-matching-in-anonymous-functions.md)
- [Pipe Into A Case Statement](elixir/pipe-into-a-case-statement.md)
- [Quitting IEx](elixir/quitting-iex.md)
- [Range Into List Using Comprehensions](elixir/range-into-list-using-comprehensions.md)
- [Refer To A Module Within Itself](elixir/refer-to-a-module-within-itself.md)
- [Referencing Values In IEx's History](elixir/referencing-values-in-iexs-history.md)
- [Remove One List From Another](elixir/remove-one-list-from-another.md)
- [Replace Duplicates In A Keyword List](elixir/replace-duplicates-in-a-keyword-list.md)
- [Requiring Keys For Structs](elixir/requiring-keys-for-structs.md)
- [Reversing A List](elixir/reversing-a-list.md)
- [Reversing A List - Part 2](elixir/reversing-a-list-part-2.md)
- [Root Directory Of A Project](elixir/root-directory-of-a-project.md)
- [Round Floats To Integers](elixir/round-floats-to-integers.md)
- [Run ExUnit Tests In A Deterministic Order](elixir/run-exunit-tests-in-a-deterministic-order.md)
- [Run The Test At A Specific Line Number](elixir/run-the-test-at-a-specific-line-number.md)
- [Same Functions Should Be Grouped Together](elixir/same-functions-should-be-grouped-together.md)
- [Skip A Specific Test](elixir/skip-a-specific-test.md)
- [String Interpolation With Just About Anything](elixir/string-interpolation-with-just-about-anything.md)
- [Unique Indexes With Ecto](elixir/unique-indexes-with-ecto.md)
- [Updating Values In A Map](elixir/updating-values-in-a-map.md)
- [Using When Clauses In A With Construct](elixir/using-when-clauses-in-a-with-construct.md)
- [Virtual Fields With Ecto Schemas](elixir/virtual-fields-with-ecto-schemas.md)
- [When Things Don't Match The With Statements](elixir/when-things-dont-match-the-with-statements.md)
- [Word Lists For Atoms](elixir/word-lists-for-atoms.md)
### Gatsby
- [Add JavaScript To Body Of The Document](gatsby/add-javascript-to-body-of-the-document.md)
### Git
- [Accessing a Lost Commit](git/accessing-a-lost-commit.md)
- [Amend Author Of Previous Commit](git/amend-author-of-previous-commit.md)
- [Auto-Squash Those Fixup Commits](git/auto-squash-those-fixup-commits.md)
- [Caching Credentials](git/caching-credentials.md)
- [Change The Start Point Of A Branch](git/change-the-start-point-of-a-branch.md)
- [Checking Commit Ancestry](git/checking-commit-ancestry.md)
- [Checkout Old Version Of A File](git/checkout-old-version-of-a-file.md)
- [Checkout Previous Branch](git/checkout-previous-branch.md)
- [Cherry Pick A Range Of Commits](git/cherry-pick-a-range-of-commits.md)
- [Clean Out All Local Branches](git/clean-out-all-local-branches.md)
- [Clean Up Old Remote Tracking References](git/clean-up-old-remote-tracking-references.md)
- [Clone A Repo Just For The Files, Without History](git/clone-a-repo-just-for-the-files-without-history.md)
- [Clone A Repo Locally From .git](git/clone-a-repo-locally-from-git.md)
- [Configure Global gitignore File](git/configure-global-gitignore-file.md)
- [Configuring The Pager](git/configuring-the-pager.md)
- [Copy A File From Another Branch](git/copy-a-file-from-another-branch.md)
- [Create A New Branch With Git Switch](git/create-a-new-branch-with-git-switch.md)
- [Delete All Untracked Files](git/delete-all-untracked-files.md)
- [Determine The Hash Id For A Blob](git/determine-the-hash-id-for-a-blob.md)
- [Diffing With Patience](git/diffing-with-patience.md)
- [Dropping Commits With Git Rebase](git/dropping-commits-with-git-rebase.md)
- [Dry Runs in Git](git/dry-runs-in-git.md)
- [Exclude A File From A Diff Output](git/exclude-a-file-from-a-diff-output.md)
- [Excluding Files Locally](git/excluding-files-locally.md)
- [Find The Date That A File Was Added To The Repo](git/find-the-date-that-a-file-was-added-to-the-repo.md)
- [Find The Initial Commit](git/find-the-initial-commit.md)
- [Get The Name Of The Current Branch](git/get-the-name-of-the-current-branch.md)
- [Get The Short Version Of The Latest Commit](git/get-the-short-version-of-the-latest-commit.md)
- [Grab A Single File From A Stash](git/grab-a-single-file-from-a-stash.md)
- [Grep For A Pattern On Another Branch](git/grep-for-a-pattern-on-another-branch.md)
- [Grep Over Commit Messages](git/grep-over-commit-messages.md)
- [Ignore Changes To A Tracked File](git/ignore-changes-to-a-tracked-file.md)
- [Ignore Files Specific To Your Workflow](git/ignore-files-specific-to-your-workflow.md)
- [Include A Message With Your Stashed Changes](git/include-a-message-with-your-stashed-changes.md)
- [Include Or Exclude Remaining Patch Changes](git/include-or-exclude-remaining-patch-changes.md)
- [Include Some Stats In Your Git Log](git/include-some-stats-in-your-git-log.md)
- [Intent To Add](git/intent-to-add.md)
- [Interactively Unstage Changes](git/interactively-unstage-changes.md)
- [Last Commit A File Appeared In](git/last-commit-a-file-appeared-in.md)
- [List All Files Changed Between Two Branches](git/list-all-files-changed-between-two-branches.md)
- [List Branches That Contain A Commit](git/list-branches-that-contain-a-commit.md)
- [List Commits On A Branch](git/list-commits-on-a-branch.md)
- [List Different Commits Between Two Branches](git/list-different-commits-between-two-branches.md)
- [List Filenames Without The Diffs](git/list-filenames-without-the-diffs.md)
- [List Just The Files Involved In A Commit](git/list-just-the-files-involved-in-a-commit.md)
- [List Most Git Commands](git/list-most-git-commands.md)
- [List Untracked Files](git/list-untracked-files.md)
- [List Untracked Files For Scripting](git/list-untracked-files-for-scripting.md)
- [Move The Latest Commit To A New Branch](git/move-the-latest-commit-to-a-new-branch.md)
- [Pick Specific Changes To Stash](git/pick-specific-changes-to-stash.md)
- [Pulling In Changes During An Interactive Rebase](git/pulling-in-changes-during-an-interactive-rebase.md)
- [Push To A Branch On Another Remote](git/push-to-a-branch-on-another-remote.md)
- [Quicker Commit Fixes With The Fixup Flag](git/quicker-commit-fixes-with-the-fixup-flag.md)
- [Rebase Commits With An Arbitrary Command](git/rebase-commits-with-an-arbitrary-command.md)
- [Reference A Commit Via Commit Message Pattern Matching](git/reference-a-commit-via-commit-message-pattern-matching.md)
- [Rename A Remote](git/rename-a-remote.md)
- [Renaming A Branch](git/renaming-a-branch.md)
- [Resetting A Reset](git/resetting-a-reset.md)
- [Resolve A Merge Conflict From Stash Pop](git/resolve-a-merge-conflict-from-stash-pop.md)
- [Run A Git Command From Outside The Repo](git/run-a-git-command-from-outside-the-repo.md)
- [Set A Custom Pager For A Specific Command](git/set-a-custom-pager-for-a-specific-command.md)
- [Show All Commits For A File Beyond Renaming](git/show-all-commits-for-a-file-beyond-renaming.md)
- [Show Changes For Files That Match A Pattern](git/show-changes-for-files-that-match-a-pattern.md)
- [Show Changes In The Compose Commit Message View](git/show-changes-in-the-compose-commit-message-view.md)
- [Show File Diffs When Viewing Git Log](git/show-file-diffs-when-viewing-git-log.md)
- [Show List Of Most Recently Committed Branches](git/show-list-of-most-recently-committed-branches.md)
- [Show Only Commits That Touch Specific Lines](git/show-only-commits-that-touch-specific-lines.md)
- [Show The diffstat Summary Of A Commit](git/show-the-diffstat-summary-of-a-commit.md)
- [Show The Good And The Bad With Git Bisect](git/show-the-good-and-the-bad-with-git-bisect.md)
- [Show What Is In A Stash](git/show-what-is-in-a-stash.md)
- [Single Key Presses In Interactive Mode](git/single-key-presses-in-interactive-mode.md)
- [Skip A Bad Commit When Bisecting](git/skip-a-bad-commit-when-bisecting.md)
- [Skip Pre-Commit Hooks](git/skip-pre-commit-hooks.md)
- [Staging Changes Within Vim](git/staging-changes-within-vim.md)
- [Staging Stashes Interactively](git/staging-stashes-interactively.md)
- [Stash A Single Untracked File](git/stash-a-single-untracked-file.md)
- [Stash Everything](git/stash-everything.md)
- [Stashing Only Unstaged Changes](git/stashing-only-unstaged-changes.md)
- [Stashing Untracked Files](git/stashing-untracked-files.md)
- [Switch To A Recent Branch With FZF](git/switch-to-a-recent-branch-with-fzf.md)
- [Turn Off The Output Pager For One Command](git/turn-off-the-output-pager-for-one-command.md)
- [Two Kinds Of Dotted Range Notation](git/two-kinds-of-dotted-range-notation.md)
- [Unstage Changes With Git Restore](git/unstage-changes-with-git-restore.md)
- [Untrack A Directory Of Files Without Deleting](git/untrack-a-directory-of-files-without-deleting.md)
- [Untrack A File Without Deleting It](git/untrack-a-file-without-deleting-it.md)
- [Update The URL Of A Remote](git/update-the-url-of-a-remote.md)
- [Using Commands With A Relative Date Format](git/using-commands-with-a-relative-date-format.md)
- [Verbose Commit Message](git/verbose-commit-message.md)
- [Viewing A File On Another Branch](git/viewing-a-file-on-another-branch.md)
- [What Changed?](git/what-changed.md)
- [What Is The Current Branch?](git/what-is-the-current-branch.md)
- [Whitespace Warnings](git/whitespace-warnings.md)
### GitHub Actions
- [Capture An Output Value For Use In A Later Step](github-actions/capture-an-output-value-for-use-in-a-later-step.md)
- [Reference An Encrypted Secret In An Action](github-actions/reference-an-encrypted-secret-in-an-action.md)
### Go
- [Access Go Docs Offline](go/access-go-docs-offline.md)
- [Build For A Specific OS And Architecture](go/build-for-a-specific-os-and-architecture.md)
- [Not So Random](go/not-so-random.md)
- [Replace The Current Process With An External Command](go/replace-the-current-process-with-an-external-command.md)
- [Sleep For A Duration](go/sleep-for-a-duration.md)
- [Upgrading From An Older Version On Mac](go/upgrading-from-an-older-version-on-mac.md)
### Heroku
- [Deploy A Review App To A Different Stack](heroku/deploy-a-review-app-to-a-different-stack.md)
- [Set And Show Heroku Env Variables](heroku/set-and-show-heroku-env-variables.md)
### HTML
- [Adding Alt Text To An Image](html/adding-alt-text-to-an-image.md)
- [Disable Auto-Completion For A Form Input](html/disable-auto-completion-for-a-form-input.md)
- [Prevent Search Engines From Indexing A Page](html/prevent-search-engines-from-indexing-a-page.md)
- [Render Text As Superscript](html/render-text-as-superscript.md)
- [Submit A Form With A Button Outside The Form](html/submit-a-form-with-a-button-outside-the-form.md)
### HTTP
- [What Counts As Cross-Origin With CORS?](http/what-counts-as-cross-origin-with-cors.md)
### Internet
- [Add Emoji To GitHub Repository Description](internet/add-emoji-to-github-repository-description.md)
- [Enable Keyboard Shortcuts In Gmail](internet/enable-keyboard-shortcuts-in-gmail.md)
- [Exclude Whitespace Changes From GitHub Diffs](internet/exclude-whitespace-changes-from-github-diffs.md)
- [Figure Out Your Public IP Address](internet/figure-out-your-public-ip-address.md)
- [Focus The URL Bar](internet/focus-the-url-bar.md)
- [Get Random Images From Unsplash](internet/get-random-images-from-unsplash.md)
- [Search Tweets By Author](internet/search-tweets-by-author.md)
- [Show All Pivotal Stories With Blockers](internet/show-all-pivotal-stories-with-blockers.md)
### JavaScript
- [Accessing Arguments To A Function](javascript/accessing-arguments-to-a-function.md)
- [Basic Date Formatting Without A Library](javascript/basic-date-formatting-without-a-library.md)
- [Character Codes from Keyboard Listeners](javascript/character-codes-from-keyboard-listeners.md)
- [Check Classes On A DOM Element](javascript/check-classes-on-a-dom-element.md)
- [Check If Something Is An Array](javascript/check-if-something-is-an-array.md)
- [Check The Password Confirmation With Yup](javascript/check-the-password-confirmation-with-yup.md)
- [Compare The Equality Of Two Date Objects](javascript/compare-the-equality-of-two-date-objects.md)
- [Computed Property Names In ES6](javascript/computed-property-names-in-es6.md)
- [Conditionally Include Pairs In An Object](javascript/conditionally-include-pairs-in-an-object.md)
- [Configure Jest To Run A Test Setup File](javascript/configure-jest-to-run-a-test-setup-file.md)
- [Create A Cancelable Promise With PCancelable](javascript/create-a-cancelable-promise-with-pcancelable.md)
- [Create An Array Containing 1 To N](javascript/create-an-array-containing-1-to-n.md)
- [Create An Object With No Properties](javascript/create-an-object-with-no-properties.md)
- [Create Bootstrapped Apps With Yarn](javascript/create-bootstrapped-apps-with-yarn.md)
- [Create Future And Past Dates From Today](javascript/create-future-and-past-dates-from-today.md)
- [Custom Type Checking Error Messages With Yup](javascript/custom-type-checking-error-messages-with-yup.md)
- [Default And Named Exports From The Same Module](javascript/default-and-named-exports-from-the-same-module.md)
- [Define A Custom Jest Matcher](javascript/define-a-custom-jest-matcher.md)
- [Destructure With Access To Nested Value And Parent Value](javascript/destructure-with-access-to-nested-value-and-parent-value.md)
- [Destructuring The Rest Of An Array](javascript/destructuring-the-rest-of-an-array.md)
- [Easy Date Comparison With DayJS](javascript/easy-date-comparison-with-dayjs.md)
- [Enable ES7 Transforms With react-rails](javascript/enable-es7-transforms-with-react-rails.md)
- [Ensure Shell Can Find Global npm Binaries](javascript/ensure-shell-can-find-global-npm-binaries.md)
- [Expand Emojis With The Spread Operator](javascript/expand-emojis-with-the-spread-operator.md)
- [Fill An Input With A Ton Of Text](javascript/fill-an-input-with-a-ton-of-text.md)
- [Find Where Yarn Is Installing Binaries](javascript/find-where-yarn-is-installing-binaries.md)
- [for...in Iterates Over Object Properties](javascript/for-in-iterates-over-object-properties.md)
- [Formatting Values With Units For Display](javascript/formatting-values-with-units-for-display.md)
- [Freeze An Object, Sorta](javascript/freeze-an-object-sorta.md)
- [Generate Random Integers](javascript/generate-random-integers.md)
- [Get The Location And Size Of An Element](javascript/get-the-location-and-size-of-an-element.md)
- [Get The Response Status From An Axios Error](javascript/get-the-response-status-from-an-axios-error.md)
- [Get The Time Zone Of The Client Computer](javascript/get-the-time-zone-of-the-client-computer.md)
- [Globally Install A Package With Yarn](javascript/globally-install-a-package-with-yarn.md)
- [Immutable Remove With The Spread Operator](javascript/immutable-remove-with-the-spread-operator.md)
- [Initialize A New JavaScript Project With Yarn](javascript/initialize-a-new-javascript-project-with-yarn.md)
- [Install The Latest Version Of Node With Nvm](javascript/install-the-latest-version-of-node-with-nvm.md)
- [Interpolate A String Into A Regex](javascript/interpolate-a-string-into-a-regex.md)
- [ISO-8601 Formatted Dates Are Interpreted As UTC](javascript/iso-8601-formatted-dates-are-interpreted-as-utc.md)
- [Link A JavaScript Package Locally](javascript/link-a-javascript-package-locally.md)
- [List Top-Level NPM Dependencies](javascript/list-top-level-npm-dependencies.md)
- [Make The Browser Editable With Design Mode](javascript/make-the-browser-editable-with-design-mode.md)
- [Matching A Computed Property In Function Args](javascript/matching-a-computed-property-in-function-args.md)
- [Matching Multiple Values In A Switch Statement](javascript/matching-multiple-values-in-a-switch-statement.md)
- [Mock A Function With Return Values Using Jest](javascript/mock-a-function-with-return-values-using-jest.md)
- [New Dates Can Take Out Of Bounds Values](javascript/new-dates-can-take-out-of-bounds-values.md)
- [Numbers Are Empty](javascript/numbers-are-empty.md)
- [Object Initialization With Shorthand Property Names](javascript/object-initialization-with-shorthand-property-names.md)
- [Obtain Undefined Value With The Void Operator](javascript/obtain-undefined-value-with-the-void-operator.md)
- [Parse A Date From A Timestamp](javascript/parse-a-date-from-a-timestamp.md)
- [Random Cannot Be Seeded](javascript/random-cannot-be-seeded.md)
- [Reach Into An Object For Nested Data With Get](javascript/reach-into-an-object-for-nested-data-with-get.md)
- [Render An Array Of Elements With React 16](javascript/render-an-array-of-elements-with-react-16.md)
- [Resolve And Pass Multiple Values From A Then](javascript/resolve-and-pass-multiple-values-from-a-then.md)
- [Running ES6 Specs With Mocha](javascript/running-es6-specs-with-mocha.md)
- [Scoping Variables With A Block Statement](javascript/scoping-variables-with-a-block-statement.md)
- [Sorting Arrays Of Objects With Lodash](javascript/sorting-arrays-of-objects-with-lodash.md)
- [Splat Arguments To A Function](javascript/splat-arguments-to-a-function.md)
- [Spread The Rest With ES6](javascript/spread-the-rest-with-es6.md)
- [Start Node Process In Specific Timezone](javascript/start-node-process-in-specific-timezone.md)
- [String Interpolation With Template Literals](javascript/string-interpolation-with-template-literals.md)
- [Support Nested Matching In Custom Jest Matchers](javascript/support-nested-matching-in-custom-jest-matchers.md)
- [Tell Prettier To Not Format A Statement](javascript/tell-prettier-to-not-format-a-statement.md)
- [Test Coverage Stats With Jest](javascript/test-coverage-stats-with-jest.md)
- [Test Timing-Based Code With Jest Fake Timers](javascript/test-timing-based-code-with-jest-fake-timers.md)
- [The Comma Operator](javascript/the-comma-operator.md)
- [Throttling A Function Call](javascript/throttling-a-function-call.md)
- [Timing Processes](javascript/timing-processes.md)
- [Transforming ES6 and JSX With Babel 6](javascript/transforming-es6-and-jsx-with-babel-6.md)
- [Truthiness of Integer Arrays](javascript/truthiness-of-integer-arrays.md)
- [Turn An HTMLCollection Into An Array](javascript/turn-an-html-collection-into-an-array.md)
- [Turn Off Console Error Messages In A Test](javascript/turn-off-console-error-messages-in-a-test.md)
- [Waiting On Multiple Promises](javascript/waiting-on-multiple-promises.md)
- [Who Am I: NPM Edition](javascript/who-am-i-npm-edition.md)
- [Yarn Commands Without The Emojis](javascript/yarn-commands-without-the-emojis.md)
- [Yup Schemas Are Validated Asynchronously](javascript/yup-schemas-are-validated-asynchronously.md)
### jq
- [Extract A List Of Values](jq/extract-a-list-of-values.md)
### Kitty
- [Set The Title Of A Window](kitty/set-the-title-of-a-window.md)
- [Use The Built-In Emoji Picker](kitty/use-the-built-in-emoji-picker.md)
### Linux
- [Check Ubuntu Version](linux/check-ubuntu-version.md)
- [Configure Your Server Timezone](linux/configure-your-server-timezone.md)
- [List The Statuses Of All Upstart Jobs](linux/list-the-statuses-of-all-upstart-jobs.md)
- [Show Current System Time And Settings](linux/show-current-system-time-and-settings.md)
- [Upgrading Ubuntu](linux/upgrading-ubuntu.md)
### Mac
- [Access All Screen And Video Capture Options](mac/access-all-screen-and-video-capture-options.md)
- [Access System Information On OS X](mac/access-system-information-on-osx.md)
- [Access Unsupported Screen Resolutions With RDM](mac/access-unsupported-screen-resolutions-with-rdm.md)
- [Clean Up Old Homebrew Files](mac/clean-up-old-homebrew-files.md)
- [Convert An HEIC Image File To JPG](mac/convert-an-heic-image-file-to-jpg.md)
- [Default Screenshot Location](mac/default-screenshot-location.md)
- [Disable Swipe Navigation For A Specific App](mac/disable-swipe-navigation-for-a-specific-app.md)
- [Display A Message With Alfred](mac/display-a-message-with-alfred.md)
- [Find The Process Using A Specific Port](mac/find-the-process-using-a-specific-port.md)
- [Gesture For Viewing All Windows Of Current App](mac/gesture-for-viewing-all-windows-of-current-app.md)
- [Insert A Non-Breaking Space Character](mac/insert-a-non-breaking-space-character.md)
- [List All The Say Voices](mac/list-all-the-say-voices.md)
- [Quickly Type En Dashes And Em Dashes](mac/quickly-type-en-dashes-and-em-dashes.md)
- [Require Additional JS Libraries In Postman](mac/require-additional-js-libraries-in-postman.md)
- [Resize App Windows With AppleScript](mac/resize-app-windows-with-applescript.md)
- [Resizing Both Corners Of A Window](mac/resizing-both-corners-of-a-window.md)
- [Run A Hardware Check](mac/run-a-hardware-check.md)
- [Run AppleScript Commands Inline In The Terminal](mac/run-applescript-commands-inline-in-the-terminal.md)
- [Set A Window To Its Default Zoom Level](mac/set-a-window-to-its-default-zoom-level.md)
- [Specify App When Opening From Command Line](mac/specify-app-when-opening-from-command-line.md)
- [Use Default Screenshot Shortcuts With CleanShot X](mac/use-default-screenshot-shortcuts-with-cleanshot-x.md)
- [View All Windows Of The Current App](mac/view-all-windows-of-the-current-app.md)
### MongoDB
- [Determine The Database Version](mongodb/determine-the-database-version.md)
- [Dump A Remote Database](mongodb/dump-a-remote-database.md)
- [Get Size Stats For A Collection](mongodb/get-size-stats-for-a-collection.md)
- [List Size Stats For All Collections](mongodb/list-size-stats-for-all-collections.md)
### MySQL
- [Display Output In A Vertical Format](mysql/display-output-in-a-vertical-format.md)
- [Doing Date Math](mysql/doing-date-math.md)
- [Dump A Database To A File](mysql/dump-a-database-to-a-file.md)
- [List Databases And Tables](mysql/list-databases-and-tables.md)
- [Show Create Statement For A Table](mysql/show-create-statement-for-a-table.md)
- [Show Indexes For A Table](mysql/show-indexes-for-a-table.md)
- [Show Tables That Match A Pattern](mysql/show-tables-that-match-a-pattern.md)
### Netlify
- [Override The Default Yarn Version](netlify/override-the-default-yarn-version.md)
### Next.js
- [Create Files And Directories For Dynamic Routes](nextjs/create-files-and-directories-for-dynamic-routes.md)
- [Define URL Redirects In The Next Config](nextjs/define-url-redirects-in-the-next-config.md)
- [Remove A Query Param From The URL](nextjs/remove-a-query-param-from-the-url.md)
- [Ship Public Assets With A Next.js App](nextjs/ship-public-assets-with-a-nextjs-app.md)
### Phoenix
- [Bypass Template Rendering](phoenix/bypass-template-rendering.md)
- [Check The Installed Version](phoenix/check-the-installed-version.md)
- [Generate New App Without Brunch](phoenix/generate-new-app-without-brunch.md)
- [Render A Template To A String](phoenix/render-a-template-to-a-string.md)
- [Serve Static Assets From Custom Directory](phoenix/serve-static-assets-from-custom-directory.md)
- [Specifying The Digest Directory](phoenix/specifying-the-digest-directory.md)
- [Specifying The Server Port](phoenix/specifying-the-server-port.md)
### PostgreSQL
- [A Better Null Display Character](postgres/a-better-null-display-character.md)
- [Add Foreign Key Constraint Without A Full Lock](postgres/add-foreign-key-constraint-without-a-full-lock.md)
- [Add ON DELETE CASCADE To Foreign Key Constraint](postgres/add-on-delete-cascade-to-foreign-key-constraint.md)
- [Adding Composite Uniqueness Constraints](postgres/adding-composite-uniqueness-constraints.md)
- [Aggregate A Column Into An Array](postgres/aggregate-a-column-into-an-array.md)
- [Assumed Radius Of The Earth](postgres/assumed-radius-of-the-earth.md)
- [Auto Expanded Display](postgres/auto-expanded-display.md)
- [Between Symmetric](postgres/between-symmetric.md)
- [Capitalize All The Words](postgres/capitalize-all-the-words.md)
- [Change The Current Directory For psql](postgres/change-the-current-directory-for-psql.md)
- [Check If The Local Server Is Running](postgres/check-if-the-local-server-is-running.md)
- [Check Table For Any Orphaned Records](postgres/check-table-for-any-orphaned-records.md)
- [Checking Inequality](postgres/checking-inequality.md)
- [Checking The Type Of A Value](postgres/checking-the-type-of-a-value.md)
- [Clear The Screen In psql](postgres/clear-the-screen-in-psql.md)
- [Clear The Screen In psql (2)](postgres/clear-the-screen-in-psql-2.md)
- [Compute Hashes With pgcrypto](postgres/compute-hashes-with-pgcrypto.md)
- [Compute The Levenshtein Distance Of Two Strings](postgres/compute-the-levenshtein-distance-of-two-strings.md)
- [Compute The md5 Hash Of A String](postgres/compute-the-md5-hash-of-a-string.md)
- [Configure The Timezone](postgres/configure-the-timezone.md)
- [Constructing A Range Of Dates](postgres/constructing-a-range-of-dates.md)
- [Convert A String To A Timestamp](postgres/convert-a-string-to-a-timestamp.md)
- [Count How Many Records There Are Of Each Type](postgres/count-how-many-records-there-are-of-each-type.md)
- [Count Records By Type](postgres/count-records-by-type.md)
- [Count The Number Of Trues In An Aggregate Query](postgres/count-the-number-of-trues-in-an-aggregate-query.md)
- [Create A Composite Primary Key](postgres/create-a-composite-primary-key.md)
- [Create An Index Without Locking The Table](postgres/create-an-index-without-locking-the-table.md)
- [Create Database Uses Template1](postgres/create-database-uses-template1.md)
- [Create hstore From Two Arrays](postgres/create-hstore-from-two-arrays.md)
- [Create Table Adds A Data Type](postgres/create-table-adds-a-data-type.md)
- [Creating Conditional Constraints](postgres/creating-conditional-constraints.md)
- [Creating Custom Types](postgres/creating-custom-types.md)
- [Day Of Week By Name For A Date](postgres/day-of-week-by-name-for-a-date.md)
- [Day Of Week For A Date](postgres/day-of-week-for-a-date.md)
- [Default Schema](postgres/default-schema.md)
- [Defining Arrays](postgres/defining-arrays.md)
- [Determine Types Of JSONB Records](postgres/determine-types-of-jsonb-records.md)
- [Determining The Age Of Things](postgres/determining-the-age-of-things.md)
- [Difference Between Explain And Explain Analyze](postgres/difference-between-explain-and-explain-analyze.md)
- [Dump All Databases To A SQL File](postgres/dump-all-databases-to-a-sql-file.md)
- [Dump And Restore A Database](postgres/dump-and-restore-a-database.md)
- [Duplicate A Local Database](postgres/duplicate-a-local-database.md)
- [Edit Existing Functions](postgres/edit-existing-functions.md)
- [Escaping A Quote In A String](postgres/escaping-a-quote-in-a-string.md)
- [Escaping String Literals With Dollar Quoting](postgres/escaping-string-literals-with-dollar-quoting.md)
- [Export Query Results To A CSV](postgres/export-query-results-to-a-csv.md)
- [Extracting Nested JSON Data](postgres/extracting-nested-json-data.md)
- [Find Duplicate Records In Table Without Unique Id](postgres/find-duplicate-records-in-table-without-unique-id.md)
- [Find Records That Contain Duplicate Values](postgres/find-records-that-contain-duplicate-values.md)
- [Find Records That Have Multiple Associated Records](postgres/find-records-that-have-multiple-associated-records.md)
- [Find The Data Directory](postgres/find-the-data-directory.md)
- [Find The Location Of Postgres Config Files](postgres/find-the-location-of-postgres-config-files.md)
- [Fizzbuzz With Common Table Expressions](postgres/fizzbuzz-with-common-table-expressions.md)
- [Force SSL When Making A psql Connection](postgres/force-ssl-when-making-a-psql-connection.md)
- [Generate A UUID](postgres/generate-a-uuid.md)
- [Generate Random UUIDs Without An Extension](postgres/generate-random-uuids-without-an-extension.md)
- [Generate Series Of Numbers](postgres/generate-series-of-numbers.md)
- [Generating UUIDs With pgcrypto](postgres/generating-uuids-with-pgcrypto.md)
- [Get The Size Of A Database](postgres/get-the-size-of-a-database.md)
- [Get The Size Of A Table](postgres/get-the-size-of-a-table.md)
- [Get The Size Of An Index](postgres/get-the-size-of-an-index.md)
- [Getting A Slice Of An Array](postgres/getting-a-slice-of-an-array.md)
- [Group By The Result Of A Function Call](postgres/group-by-the-result-of-a-function-call.md)
- [Insert A Bunch Of Records With Generate Series](postgres/insert-a-bunch-of-records-with-generate-series.md)
- [Insert Just The Defaults](postgres/insert-just-the-defaults.md)
- [Install Postgres With uuid-ossp Using asdf](postgres/install-postgres-with-uuid-ossp-using-asdf.md)
- [Integers In Postgres](postgres/integers-in-postgres.md)
- [Intervals Of Time By Week](postgres/intervals-of-time-by-week.md)
- [Is It Null Or Not Null?](postgres/is-it-null-or-not-null.md)
- [Limit Execution Time Of Statements](postgres/limit-execution-time-of-statements.md)
- [List All Columns Of A Specific Type](postgres/list-all-columns-of-a-specific-type.md)
- [List All Rows In A Table](postgres/list-all-rows-in-a-table.md)
- [List All The Databases](postgres/list-all-the-databases.md)
- [List All Versions Of A Function](postgres/list-all-versions-of-a-function.md)
- [List Available Schemas](postgres/list-available-schemas.md)
- [List Connections To A Database](postgres/list-connections-to-a-database.md)
- [List Database Objects With Disk Usage](postgres/list-database-objects-with-disk-usage.md)
- [List Database Users](postgres/list-database-users.md)
- [List Various Kinds Of Objects](postgres/list-various-kinds-of-objects.md)
- [Lower Is Faster Than ilike](postgres/lower-is-faster-than-ilike.md)
- [Max Identifier Length Is 63 Bytes](postgres/max-identifier-length-is-63-bytes.md)
- [pg Prefix Is Reserved For System Schemas](postgres/pg-prefix-is-reserved-for-system-schemas.md)
- [Prepare, Execute, And Deallocate Statements](postgres/prepare-execute-and-deallocate-statements.md)
- [Pretty Print Data Sizes](postgres/pretty-print-data-sizes.md)
- [Pretty Printing JSONB Rows](postgres/pretty-printing-jsonb-rows.md)
- [Prevent A Query From Running Too Long](postgres/prevent-a-query-from-running-too-long.md)
- [Print The Query Buffer In psql](postgres/print-the-query-buffer-in-psql.md)
- [Remove Not Null Constraint From A Column](postgres/remove-not-null-constraint-from-a-column.md)
- [Renaming A Sequence](postgres/renaming-a-sequence.md)
- [Renaming A Table](postgres/renaming-a-table.md)
- [Restart A Sequence](postgres/restart-a-sequence.md)
- [Restarting Sequences When Truncating Tables](postgres/restarting-sequences-when-truncating-tables.md)
- [Salt And Hash A Password With pgcrypto](postgres/salt-and-hash-a-password-with-pgcrypto.md)
- [Send A Command To psql](postgres/send-a-command-to-psql.md)
- [Set Inclusion With hstore](postgres/set-inclusion-with-hstore.md)
- [Set A Seed For The Random Number Generator](postgres/set-a-seed-for-the-random-number-generator.md)
- [Set A Statement Timeout Threshold For A Session](postgres/set-a-statement-timeout-threshold-for-a-session.md)
- [Sets With The Values Command](postgres/sets-with-the-values-command.md)
- [Shorthand Absolute Value Operator](postgres/shorthand-absolute-value-operator.md)
- [Show All Versions Of An Operator](postgres/show-all-versions-of-an-operator.md)
- [Sleeping](postgres/sleeping.md)
- [Special Math Operators](postgres/special-math-operators.md)
- [Storing Emails With citext](postgres/storing-emails-with-citext.md)
- [String Contains Another String](postgres/string-contains-another-string.md)
- [Switch Non-Castable Column Type With Using Clause](postgres/switch-non-castable-column-type-with-using-clause.md)
- [Switch The Running Postgres Server Version](postgres/switch-the-running-postgres-server-version.md)
- [Temporarily Disable Triggers](postgres/temporarily-disable-triggers.md)
- [Temporary Tables](postgres/temporary-tables.md)
- [Terminating A Connection](postgres/terminating-a-connection.md)
- [The nullif Function](postgres/the-nullif-function.md)
- [Timestamp Functions](postgres/timestamp-functions.md)
- [Toggling The Pager In PSQL](postgres/toggling-the-pager-in-psql.md)
- [Track psql History Separately Per Database](postgres/track-psql-history-separately-per-database.md)
- [Truncate All Rows](postgres/truncate-all-rows.md)
- [Truncate Tables With Dependents](postgres/truncate-tables-with-dependents.md)
- [Turning Timing On](postgres/turn-timing-on.md)
- [Two Ways To Compute Factorial](postgres/two-ways-to-compute-factorial.md)
- [Two Ways To Escape A Quote In A String](postgres/two-ways-to-escape-a-quote-in-a-string.md)
- [Types By Category](postgres/types-by-category.md)
- [Union All Rows Including Duplicates](postgres/union-all-rows-including-duplicates.md)
- [Use A psqlrc File For Common Settings](postgres/use-a-psqlrc-file-for-common-settings.md)
- [Use Argument Indexes](postgres/use-argument-indexes.md)
- [Use Not Valid To Immediately Enforce A Constraint](postgres/use-not-valid-to-immediately-enforce-a-constraint.md)
- [Using Expressions In Indexes](postgres/using-expressions-in-indexes.md)
- [Using Intervals To Offset Time](postgres/using-intervals-to-offset-time.md)
- [Who Is The Current User](postgres/who-is-the-current-user.md)
- [Word Count for a Column](postgres/word-count-for-a-column.md)
- [Write A Query Result To File](postgres/write-a-query-result-to-file.md)
### Python
- [Access Instance Variables](python/access-instance-variables.md)
- [Create A Dummy DataFrame In Pandas](python/create-a-dummy-dataframe-in-pandas.md)
- [Test A Function With Pytest](python/test-a-function-with-pytest.md)
### Rails
- [Add A Check Constraint To A Table](rails/add-a-check-constraint-to-a-table.md)
- [Add A Foreign Key Reference To A Table](rails/add-a-foreign-key-reference-to-a-table.md)
- [Add A Reference Column With An Index](rails/add-a-reference-column-with-an-index.md)
- [Add ActiveRecord Error Not Tied To Any Attribute](rails/add-activerecord-error-not-tied-to-any-attribute.md)
- [Add React With Webpacker To A New Rails App](rails/add-react-with-webpacker-to-a-new-rails-app.md)
- [Add timestamptz Columns With The Migration DSL](rails/add-timestamptz-columns-with-the-migration-dsl.md)
- [Access Secrets In A Rails 5.2 App](rails/access-secrets-in-a-rails-5-2-app.md)
- [ActiveRecord Query For This Or That](rails/active-record-query-for-this-or-that.md)
- [Advance The Date](rails/advance-the-date.md)
- [Allow List Params Anywhere With Strong Params](rails/allow-list-params-anywhere-with-strong-params.md)
- [All or Nothing Database Transactions](rails/all-or-nothing-database-transactions.md)
- [Assert Two Arrays Have The Same Items With RSpec](rails/assert-two-arrays-have-the-same-items-with-rspec.md)
- [Attach A File With Capybara](rails/attach-a-file-with-capybara.md)
- [Attribute Getter without the Recursion](rails/attribute-getter-without-the-recursion.md)
- [Attribute Was](rails/attribute-was.md)
- [Autosave False On ActiveRecord Associations](rails/autosave-false-on-activerecord-associations.md)
- [Bind Parameters To ActiveRecord SQL Query](rails/bind-parameters-to-activerecord-sql-query.md)
- [Build A Hash Of Model Attributes](rails/build-a-hash-of-model-attributes.md)
- [Capture Development Emails With Mailhog](rails/capture-development-emails-with-mailhog.md)
- [Capybara Page Status Code](rails/capybara-page-status-code.md)
- [Cast Common Boolean-Like Values To Booleans](rails/cast-common-boolean-like-values-to-booleans.md)
- [Change The Nullability Of A Column](rails/change-the-nullability-of-a-column.md)
- [Change The Time Zone Offset Of A DateTime Object](rails/change-the-time-zone-offset-of-a-datetime-object.md)
- [Check If ActiveRecord Update Fails](rails/check-if-activerecord-update-fails.md)
- [Check If Any Records Have A Null Value](rails/check-if-any-records-have-a-null-value.md)
- [Check Specific Attributes On ActiveRecord Array](rails/check-specific-attributes-on-activerecord-array.md)
- [Code Statistics For An Application](rails/code-statistics-for-an-application.md)
- [Columns With Default Values Are Nil On Create](rails/columns-with-default-values-are-nil-on-create.md)
- [Comparing DateTimes Down To Second Precision](rails/comparing-datetimes-down-to-second-precision.md)
- [Conditional Class Selectors in Haml](rails/conditional-class-selectors-in-haml.md)
- [Convert A Symbol To A Constant](rails/convert-a-symbol-to-a-constant.md)
- [Count The Number Of Records By Attribute](rails/count-the-number-of-records-by-attribute.md)
- [Create A Custom Named References Column](rails/create-a-custom-named-references-column.md)
- [Create A Join Table With The Migration DSL](rails/create-a-join-table-with-the-migration-dsl.md)
- [Creating Records of Has_One Associations](rails/creating-records-of-has-one-associations.md)
- [Custom Validation Message](rails/custom-validation-message.md)
- [Customize Paths And Helpers For Devise Routes](rails/customize-paths-and-helpers-for-devise-routes.md)
- [Customize The Path Of A Resource Route](rails/customize-the-path-of-a-resource-route.md)
- [Delete Paranoid Records](rails/delete-paranoid-records.md)
- [Demodulize A Class Name](rails/demodulize-a-class-name.md)
- [Different Ways To Add A Foreign Key Reference](rails/different-ways-to-add-a-foreign-key-reference.md)
- [Disambiguate Where In A Joined Relation](rails/disambiguate-where-in-a-joined-relation.md)
- [Ensure Migrations Use The Latest Schema](rails/ensure-migrations-use-the-latest-schema.md)
- [Find Or Create A Record With FactoryBot](rails/find-or-create-a-record-with-factory-bot.md)
- [Force All Users To Sign Out](rails/force-all-users-to-sign-out.md)
- [Generating And Executing SQL](rails/generating-and-executing-sql.md)
- [Get An Array Of Values From The Database](rails/get-an-array-of-values-from-the-database.md)
- [Get An Empty ActiveRecord Relation](rails/get-an-empty-activerecord-relation.md)
- [Get The Column Names For A Model](rails/get-the-column-names-for-a-model.md)
- [Get The Current Time](rails/get-the-current-time.md)
- [Hash Slicing](rails/hash-slicing.md)
- [Ignore Poltergeist JavaScript Errors](rails/ignore-poltergeist-javascript-errors.md)
- [Include Devise Helpers In Your Controller Tests](rails/include-devise-helpers-in-your-controller-tests.md)
- [Inspect Previous Changes To ActiveRecord Object](rails/inspect-previous-changes-to-activerecord-object.md)
- [Link To The Current Page With Query Params](rails/link-to-the-current-page-with-query-params.md)
- [List All Installable Rails Versions](rails/list-all-installable-rails-versions.md)
- [List The Enqueued Jobs](rails/list-the-enqueued-jobs.md)
- [Load Records In Batches With find_each](rails/load-records-in-batches-with-find-each.md)
- [Log SQL Queries Executed By ActiveRecord](rails/log-sql-queries-executed-by-activerecord.md)
- [Mark A Migration As Irreversible](rails/mark-a-migration-as-irreversible.md)
- [Make ActionMailer Synchronous In Test](rails/make-action-mailer-synchronous-in-test.md)
- [Manually Run A Migration From Rails Console](rails/manually-run-a-migration-from-rails-console.md)
- [Mark For Destruction](rails/mark-for-destruction.md)
- [Mask An ActiveRecord Attribute](rails/mask-an-activerecord-attribute.md)
- [Merge A Scope Into An ActiveRecord Query](rails/merge-a-scope-into-an-activerecord-query.md)
- [Migrating Up Down Up](rails/migrating-up-down-up.md)
- [Order Matters For `rescue_from` Blocks](rails/order-matters-for-rescue-from-blocks.md)
- [Params Includes Submission Button Info](rails/params-includes-submission-button-info.md)
- [Parse Query Params From A URL](rails/parse-query-params-from-a-url.md)
- [Perform SQL Explain With ActiveRecord](rails/perform-sql-explain-with-activerecord.md)
- [Polymorphic Path Helpers](rails/polymorphic-path-helpers.md)
- [Pretend Generations](rails/pretend-generations.md)
- [Query A Single Value From The Database](rails/query-a-single-value-from-the-database.md)
- [Read In Environment-Specific Config Values](rails/read-in-environment-specific-config-values.md)
- [Read-Only Models](rails/read-only-models.md)
- [Remove The Default Value On A Column](rails/remove-the-default-value-on-a-column.md)
- [Render An Alternative ActionMailer Template](rails/render-an-alternative-action-mailer-template.md)
- [Render The Response Body In Controller Specs](rails/render-the-response-body-in-controller-specs.md)
- [Replace An Index With A Unique Index](rails/replace-an-index-with-a-unique-index.md)
- [Rescue From](rails/rescue-from.md)
- [Retrieve An Object If It Exists](rails/retrieve-an-object-if-it-exists.md)
- [Rollback A Specific Migration Out Of Order](rails/rollback-a-specific-migration-out-of-order.md)
- [Rounding Numbers With Precision](rails/rounding-numbers-with-precision.md)
- [Schedule Sidekiq Jobs Out Into The Future](rails/schedule-sidekiq-jobs-out-into-the-future.md)
- [Secure Passwords With Rails And Bcrypt](rails/secure-passwords-with-rails-and-bcrypt.md)
- [Select A Select By Selector](rails/select-a-select-by-selector.md)
- [Select Value For SQL Counts](rails/select-value-for-sql-counts.md)
- [Serialize With fast_jsonapi In A Rails App](rails/serialize-with-fast-jsonapi-in-a-rails-app.md)
- [Set A Timestamp Field To The Current Time](rails/set-a-timestamp-field-to-the-current-time.md)
- [Set default_url_options For Entire Application](rails/set-default-url-options-for-entire-application.md)
- [Set Schema Search Path](rails/set-schema-search-path.md)
- [Set Statement Timeout For All Postgres Connections](rails/set-statement-timeout-for-all-postgres-connections.md)
- [Set The Default Development Port](rails/set-the-default-development-port.md)
- [Show Pending Migrations](rails/show-pending-migrations.md)
- [Show Rails Models With Pry](rails/show-rails-models-with-pry.md)
- [Show Rails Routes With Pry](rails/show-rails-routes-with-pry.md)
- [Skip Validations When Creating A Record](rails/skip-validations-when-creating-a-record.md)
- [Specify New Attributes For #find_or_create_by](rails/specify-new-attributes-for-find-or-create-by.md)
- [Temporarily Disable strong_params](rails/temporarily-disable-strong-params.md)
- [Test If An Instance Variable Was Assigned](rails/test-if-an-instance-variable-was-assigned.md)
- [Test If deliver_later Is Called For A Mailer](rails/test-if-deliver-later-is-called-for-a-mailer.md)
- [Truncate Almost All Tables](rails/truncate-almost-all-tables.md)
- [Update Column Versus Update Attribute](rails/update-column-versus-update-attribute.md)
- [Upgrading Your Manifest For Sprockets 4](rails/upgrading-your-manifest-for-sprockets-4.md)
- [Verify And Read A Signed Cookie Value](rails/verify-and-read-a-signed-cookie-value.md)
- [Where Am I In The Partial Iteration?](rails/where-am-i-in-the-partial-iteration.md)
- [Wipe Out All Precompiled Assets](rails/wipe-out-all-precompiled-assets.md)
- [Write Reversible Migration To Set Default](rails/write-reversible-migration-to-set-default.md)
- [Write Safer Where Clauses With Placeholders](rails/write-safer-where-clauses-with-placeholders.md)
### React
- [A Component Is Just A Bag Of Data](react/a-component-is-just-a-bag-of-data.md)
- [Access The Latest Lifecycle Methods In An Old App](react/access-the-latest-lifecycle-methods-in-an-old-app.md)
- [Accessing Env Vars In create-react-app](react/accessing-env-vars-in-create-react-app.md)
- [Accessing Location Within @reach/router](react/accessing-location-within-reach-router.md)
- [Allow md As An Extension With gatsby-mdx](react/allow-md-as-an-extension-with-gatsby-mdx.md)
- [Alter The Display Name Of A Component](react/alter-the-display-name-of-a-component.md)
- [Building A React App In The Browser](react/building-a-react-app-in-the-browser.md)
- [Check The Type Of A Child Component](react/check-the-type-of-a-child-component.md)
- [Conditionally Including Event Handler Functions](react/conditionally-including-event-handler-functions.md)
- [Create A Snowpack-Bundled React App](react/create-a-snowpack-bundled-react-app.md)
- [Create Dynamically Named Custom React Components](react/create-dynamically-named-custom-react-components.md)
- [create-react-app Comes With Lodash](react/create-react-app-comes-with-lodash.md)
- [create-react-app Has A Default Test Setup File](react/create-react-app-has-a-default-test-setup-file.md)
- [CSS !important Is Not Supported By Inline Styles](react/css-important-is-not-supported-by-inline-styles.md)
- [Debug Jest Tests In create-react-app](react/debug-jest-tests-in-create-react-app.md)
- [Defining State In A Simple Class Component](react/defining-state-in-a-simple-class-component.md)
- [Destructure Variables As Props To A Component](react/destructure-variables-as-props-to-a-component.md)
- [Details Tags Are A Controllable Component](react/details-tags-are-a-controllable-component.md)
- [Dispatch Anywhere With Redux](react/dispatch-anywhere-with-redux.md)
- [Dynamically Add Props To A Child Component](react/dynamically-add-props-to-a-child-component.md)
- [Dynamically Create HTML Elements](react/dynamically-create-html-elements.md)
- [Enforce Specific Values With PropTypes](react/enforce-specific-values-with-proptypes.md)
- [Focus An Input With useRef Hook](react/focus-an-input-with-useref-hook.md)
- [Force A Component To Only Have One Child](react/force-a-component-to-only-have-one-child.md)
- [Forcing A Child Remount With The Key Prop](react/forcing-a-child-remount-with-the-key-prop.md)
- [Formik Connected Components](react/formik-connected-components.md)
- [Formik's Validation Schema As A Function](react/formiks-validation-schema-as-a-function.md)
- [Inactive And Active Component Styles With Radium](react/inactive-and-active-component-styles-with-radium.md)
- [Inline Style Attributes Should Be Camel Cased](react/inline-style-attributes-should-be-camel-cased.md)
- [Manage State In A Functional Component](react/manage-state-in-a-functional-component.md)
- [Mapping Over One Or Many Children](react/mapping-over-one-or-many-children.md)
- [Mock A Function That A Component Imports](react/mock-a-function-that-a-component-imports.md)
- [Navigate With State Via @reach/router](react/navigate-with-state-via-reach-router.md)
- [Pairing A Callback With A useState Hook](react/pairing-a-callback-with-a-usestate-hook.md)
- [Pass A Function To A useState Updater](react/pass-a-function-to-a-usestate-updater.md)
- [Passing Props Down To React-Router Route](react/passing-props-down-to-react-router-route.md)
- [Prevent reach/router Redirect Error Screen In Dev](react/prevent-reach-router-redirect-error-screen-in-dev.md)
- [Proxy To An API Server In Development With CRA](react/proxy-to-an-api-server-in-development-with-cra.md)
- [Quickly Search For A Component With React DevTools](react/quickly-search-for-a-component-with-react-devtools.md)
- [@reach/router Renders To A Div](react/reach-router-renders-to-a-div.md)
- [Read Only Input Elements](react/read-only-input-elements.md)
- [Rendering Multiple Nodes With Fragments](react/rendering-multiple-nodes-with-fragments.md)
- [Set The Type For A useState Hook](react/set-the-type-for-a-usestate-hook.md)
- [Specifying Dependencies Of A useEffect Hook](react/specifying-dependencies-of-a-useeffect-hook.md)
- [Spelunking Through Components With Enzyme's Dive](react/spelunking-through-components-with-enzymes-dive.md)
- [Sync Your react-router State With Redux](react/sync-your-react-router-state-with-redux.md)
- [Test Files In create-react-app](react/test-files-in-create-react-app.md)
- [Test That Element Does Not Render In The Component](react/test-that-element-does-not-render-in-the-component.md)
- [Trigger Effect Only When The Component Mounts](react/trigger-effect-only-when-the-component-mounts.md)
- [Update Formik Initial Values When Props Change](react/update-formik-initial-values-when-props-change.md)
- [Upgrading To The Latest React In CodeSandbox](react/upgrading-to-the-latest-react-in-codesandbox.md)
- [Use A Ref To Autofocus An Input](react/use-a-ref-to-autofocus-an-input.md)
- [Use React 16 With Gatsby](react/use-react-16-with-gatsby.md)
- [Use withRouter To Pass Down React-Router History](react/use-withrouter-to-pass-down-react-router-history.md)
- [Visually Select A React Element For Inspection](react/visually-select-a-react-element-for-inspection.md)
- [Who Is Your Favorite Child?](react/who-is-your-favorite-child.md)
- [Wrap The Root Of A Gatsby App In A Component](react/wrap-the-root-of-a-gatsby-app-in-a-component.md)
### React Native
- [Avoid The Notch With SafeAreaView](react_native/avoid-the-notch-with-safeareaview.md)
### React Testing Library
- [Check That A Component Renders As Null](react-testing-library/check-that-a-component-renders-as-null.md)
- [findBy\* Queries Have Async Built In](react-testing-library/find-by-queries-have-async-built-in.md)
- [Pretty Print Some DOM To Debug A Test](react-testing-library/pretty-print-some-dom-to-debug-a-test.md)
- [Test A Component That Uses React Portals](react-testing-library/test-a-component-that-uses-react-portals.md)
### ReasonML
- [Break Out Of A While Loop](reason/break-out-of-a-while-loop.md)
- [Compile Reason To Native With Dune](reason/compile-reason-to-native-with-dune.md)
- [Compile Reason With An OCaml Package Using Dune](reason/compile-reason-with-an-ocaml-package-using-dune.md)
- [Create A Map Of Strings](reason/create-a-map-of-strings.md)
- [Create A Stream From An Array](reason/create-a-stream-from-an-array.md)
- [Creating A 2D Array](reason/creating-a-2d-array.md)
- [Data Structures With Self-Referential Types](reason/data-structures-with-self-referential-types.md)
- [Defining Variants With Constructor Arguments](reason/defining-variants-with-constructor-arguments.md)
- [Dynamically Create A Printf String Format](reason/dynamically-create-a-printf-string-format.md)
- [Exhaustive Pattern Matching Of List Variants](reason/exhaustive-pattern-matching-of-list-variants.md)
- [Format The Current File Within Vim](reason/format-the-current-file-within-vim.md)
- [Generate A Native ReasonML Project With Pesy](reason/generate-a-native-reasonml-project-with-pesy.md)
- [Generate Starter Reason Projects](reason/generate-starter-reason-projects.md)
- [Helping The Compiler Help Us With Variants](reason/helping-the-compiler-help-us-with-variants.md)
- [Inline Component Styles With Reason React](reason/inline-component-styles-with-reason-react.md)
- [Is This A Directory Or A File?](reason/is-this-a-directory-or-a-file.md)
- [Making Things Mutable](reason/making-things-mutable.md)
- [Modifying A String With blit_string](reason/modifying-a-string-with-blit-string.md)
- [Multi-Argument Functions As Syntactic Sugar](reason/multi-argument-functions-as-syntactic-sugar.md)
- [Pattern Match On Exceptions](reason/pattern-match-on-exceptions.md)
- [Quickly Bootstrap A React App Using Reason](reason/quickly-bootstrap-a-react-app-using-reason.md)
- [Seeding And Generating Random Integers](reason/seeding-and-generating-random-integers.md)
- [Stream A File Line By Line](reason/stream-a-file-line-by-line.md)
- [String Interpolation With Integers And Sprintf](reason/string-interpolation-with-integers-and-sprintf.md)
- [String Interpolation With Quoted Strings](reason/string-interpolation-with-quoted-strings.md)
- [Trying Out ReasonML In CodeSandbox](reason/trying-out-reasonml-in-codesandbox.md)
- [Two Ways To Find An Item In A List](reason/two-ways-to-find-an-item-in-a-list.md)
- [Using Optional Labeled Function Arguments](reason/using-optional-labeled-function-arguments.md)
- [Wrapping A Component For Use In JavaScript](reason/wrapping-a-component-for-use-in-javascript.md)
### Ruby
- [A Basic Case Statement](ruby/a-basic-case-statement.md)
- [A Shorthand For Rerunning Failed Tests With RSpec](ruby/a-shorthand-for-rerunning-failed-tests-with-rspec.md)
- [Add Comments To Regex With Free-Spacing](ruby/add-comments-to-regex-with-free-spacing.md)
- [Add Linux As A Bundler Platform](ruby/add-linux-as-a-bundler-platform.md)
- [Are They All True?](ruby/are-they-all-true.md)
- [Assert About An Object's Attributes With RSpec](ruby/assert-about-an-objects-attributes-with-rspec.md)
- [Assoc For Hashes](ruby/assoc-for-hashes.md)
- [Block Comments](ruby/block-comments.md)
- [Build HTTP And HTTPS URLs](ruby/build-http-and-https-urls.md)
- [Chaining Multiple RSpec Change Matchers](ruby/chaining-multiple-rspec-change-matchers.md)
- [Check Return Status Of Running A Shell Command](ruby/check-return-status-of-running-a-shell-command.md)
- [Click On Text With Capybara](ruby/click-on-text-with-capybara.md)
- [Colorful Output With MiniTest](ruby/colorful-output-with-minitest.md)
- [Comparing Class Hierarchy Relationships](ruby/comparing-class-hierarchy-relationships.md)
- [Comparing Arrays In RSpec](ruby/comparing-arrays-in-rspec.md)
- [Construct A Constant From A String](ruby/construct-a-constant-from-a-string.md)
- [Create an Array of Stringed Numbers](ruby/create-an-array-of-stringed-numbers.md)
- [Create a CSV::Table Object](ruby/create-a-csv-table-object.md)
- [Create A Hash From An Array Of Arrays](ruby/create-a-hash-from-an-array-of-arrays.md)
- [Create Listing Of All Middleman Pages](ruby/create-listing-of-all-middleman-pages.md)
- [Create Named Structs With Struct.new](ruby/create-named-structs-with-struct-new.md)
- [Create Thumbnail Image For A PDF](ruby/create-thumbnail-image-for-a-pdf.md)
- [Defaulting To Frozen String Literals](ruby/defaulting-to-frozen-string-literals.md)
- [Define A Custom RSpec Matcher](ruby/define-a-custom-rspec-matcher.md)
- [Destructuring Arrays In Blocks](ruby/destructuring-arrays-in-blocks.md)
- [Disassemble Some Codes](ruby/disassemble-some-codes.md)
- [Double Splat To Merge Hashes](ruby/double-splat-to-merge-hashes.md)
- [Edit Previous Parts Of The Pry Buffer History](ruby/edit-previous-parts-of-the-pry-buffer-history.md)
- [Editing Code In Pry](ruby/editing-code-in-pry.md)
- [Encode A String As URL-Safe Base64](ruby/encode-a-string-as-url-safe-base64.md)
- [Enumerate A Pairing Of Every Two Sequential Items](ruby/enumerate-a-pairing-of-every-two-sequential-items.md)
- [Evaluating One-Off Commands](ruby/evaluating-one-off-commands.md)
- [Exclude Values From An Array](ruby/exclude-values-from-an-array.md)
- [Expect A Method To Be Called And Actually Call It](ruby/expect-a-method-to-be-called-and-actually-call-it.md)
- [FactoryGirl Sequences](ruby/factory-girl-sequences.md)
- [Fail](ruby/fail.md)
- [Find The Min And Max With A Single Call](ruby/find-the-min-and-max-with-a-single-call.md)
- [Finding The Source of Ruby Methods](ruby/finding-the-source-of-ruby-methods.md)
- [Generate A Signed JWT Token](ruby/generate-a-signed-jwt-token.md)
- [Generate Ruby Version And Gemset Files With RVM](ruby/generate-ruby-version-and-gemset-files-with-rvm.md)
- [Get Info About Your RubyGems Environment](ruby/get-info-about-your-ruby-gems-environment.md)
- [Identify Outdated Gems](ruby/identify-outdated-gems.md)
- [If You Detect None](ruby/if-you-detect-none.md)
- [Iterate With An Offset Index](ruby/iterate-with-an-offset-index.md)
- [Ins And Outs Of Pry](ruby/ins-and-outs-of-pry.md)
- [Invoking Rake Tasks Multiple Times](ruby/invoking-rake-tasks-multiple-times.md)
- [IRB Has Built-In Benchmarking With Ruby 3](ruby/irb-has-built-in-benchmarking-with-ruby-3.md)
- [Jump Out Of A Nested Context With Throw/Catch](ruby/jump-out-of-a-nested-context-with-throw-catch.md)
- [Last Raised Exception In The Call Stack](ruby/last-raised-exception-in-the-call-stack.md)
- [Limit Split](ruby/limit-split.md)
- [List The Running Ruby Version](ruby/list-the-running-ruby-version.md)
- [Listing Local Variables](ruby/listing-local-variables.md)
- [Map With Index Over An Array](ruby/map-with-index-over-an-array.md)
- [Mock Method Chain Calls With RSpec](ruby/mock-method-chain-calls-with-rspec.md)
- [Mocking Requests With Partial URIs Using Regex](ruby/mocking-requests-with-partial-uris-using-regex.md)
- [Named Regex Captures Are Assigned To Variables](ruby/named-regex-captures-are-assigned-to-variables.md)
- [Navigate Back In The Browser With Capybara](ruby/navigate-back-in-the-browser-with-capybara.md)
- [Next And Previous Floats](ruby/next-and-previous-floats.md)
- [Or Operator Precedence](ruby/or-operator-precedence.md)
- [Override The Initial Sequence Value](ruby/override-the-initial-sequence-value.md)
- [Parallel Bundle Install](ruby/parallel-bundle-install.md)
- [Parse JSON Into An OpenStruct](ruby/parse-json-into-an-open-struct.md)
- [Parsing A CSV With Quotes In The Data](ruby/parsing-a-csv-with-quotes-in-the-data.md)
- [Pass A Block To Count](ruby/pass-a-block-to-count.md)
- [Passing Arbitrary Methods As Blocks](ruby/passing-arbitrary-methods-as-blocks.md)
- [Passing Arguments To A Rake Task](ruby/passing-arguments-to-a-rake-task.md)
- [Pattern Match Values From A Hash](ruby/pattern-match-values-from-a-hash.md)
- [Percent Notation](ruby/percent-notation.md)
- [Question Mark Operator](ruby/question-mark-operator.md)
- [Rake Only Lists Tasks With Descriptions](ruby/rake-only-lists-tasks-with-descriptions.md)
- [Read The First Line From A File](ruby/read-the-first-line-from-a-file.md)
- [Rendering ERB](ruby/rendering-erb.md)
- [Replace The Current Process With An External Command](ruby/replace-the-current-process-with-an-external-command.md)
- [Require Entire Gemfile In Pry Session](ruby/require-entire-gemfile-in-pry-session.md)
- [Rerun Only Failures With RSpec](ruby/rerun-only-failures-with-rspec.md)
- [Retry A Block After An Exception](ruby/retry-a-block-after-an-exception.md)
- [Returning With Sequel](ruby/returning-with-sequel.md)
- [rexml Is A Bundled Gem As Of Ruby 3.0.0](ruby/rexml-is-a-bundled-gem-as-of-ruby-3-0-0.md)
- [Run An Older Version Of Bundler](ruby/run-an-older-version-of-bundler.md)
- [Running A Single MiniTest Example](ruby/running-a-single-minitest-example.md)
- [Safe Navigation Operator](ruby/safe-navigation-operator.md)
- [Scripting With RVM](ruby/scripting-with-rvm.md)
- [Scroll To Top Of Page With Capybara](ruby/scroll-to-top-of-page-with-capybara.md)
- [Set RVM Default Ruby](ruby/set-rvm-default-ruby.md)
- [Show Public Methods With Pry](ruby/show-public-methods-with-pry.md)
- [Silence The Output Of A Ruby Statement In Pry](ruby/silence-the-output-of-a-ruby-statement-in-pry.md)
- [Single And Double Quoted String Notation](ruby/single-and-double-quoted-string-notation.md)
- [Squeeze Out The Extra Space](ruby/squeeze-out-the-extra-space.md)
- [String Interpolation With Instance Variables](ruby/string-interpolation-with-instance-variables.md)
- [Summing Collections](ruby/summing-collections.md)
- [Turn Key And Value Arrays Into A Hash](ruby/turn-key-and-values-arrays-into-a-hash.md)
- [Turning Any Class Into An Enumerator](ruby/turning-any-class-into-an-enumerator.md)
- [Turning Things Into Hashes](ruby/turning-things-into-hashes.md)
- [Uncaught Exceptions In Pry](ruby/uncaught-exceptions-in-pry.md)
- [`undef_method` And The Inheritance Hierarchy](ruby/undef-method-and-the-inheritance-hierarchy.md)
- [Uninstall Specific Version Of A Ruby Gem](ruby/uninstall-specific-version-of-a-ruby-gem.md)
- [Unpacking Strings Into Binary](ruby/unpacking-strings-into-binary.md)
- [Up And Down With Integers](ruby/up-and-down-with-integers.md)
- [Update The Gemfile Bundled With Version](ruby/update-the-gemfile-bundled-with-version.md)
- [Use A Case Statement As A Cond Statement](ruby/use-a-case-statement-as-a-cond-statement.md)
- [Use dotenv In A Non-Rails Project](ruby/use-dotenv-in-a-non-rails-project.md)
- [Use Tap For Better Test Data Setup](ruby/use-tap-for-better-test-data-setup.md)
- [Using BCrypt To Create And Check Hashed Passwords](ruby/using-bcrypt-to-create-and-check-hashed-passwords.md)
- [What To Do When You Don't Rescue](ruby/what-to-do-when-you-dont-rescue.md)
- [Who Are My Ancestors?](ruby/who-are-my-ancestors.md)
- [Wrap Things In An Array, Even Hashes](ruby/wrap-things-in-an-array-even-hashes.md)
- [Zero Padding](ruby/zero-padding.md)
### sed
- [Apply Multiple Substitutions To The Input](sed/apply-multiple-substitutions-to-the-input.md)
- [Equivalence Classes Of Repetition MetaChars](sed/equivalence-classes-of-repetition-metachars.md)
- [Extract Value From Command Output With Sed](sed/extract-value-from-command-output-with-sed.md)
- [Grab All The Method Names Defined In A Ruby File](sed/grab-all-the-method-names-defined-in-a-ruby-file.md)
- [Grab The First Line Of A File](sed/grab-the-first-line-of-a-file.md)
- [OSX sed Does Regex A Bit Different](sed/osx-sed-does-regex-a-bit-different.md)
- [Output Only Lines Involved In A Substitution](sed/output-only-lines-involved-in-a-substitution.md)
- [Reference A Capture In The Regex](sed/reference-a-capture-in-the-regex.md)
- [Use An Alternative Delimiter In A Substitution](sed/use-an-alternative-delimiter-in-a-substitution.md)
### Shell
- [Check If The First Argument Is Given](shell/check-if-the-first-argument-is-given.md)
- [Format And Print The Current Date And Time](shell/format-and-print-the-current-date-and-time.md)
### Streaming
- [Monitor An Audio Input Device In OBS](streaming/monitor-an-audio-input-device-in-obs.md)
### Tailwind CSS
- [Base Styles For Text Link](tailwind/base-styles-for-text-link.md)
- [Specify Paths For Purging Unused CSS](tailwind/specify-paths-for-purging-unused-css.md)
- [Use Tailwind Typography Prose In Dark Mode](tailwind/use-tailwind-typography-prose-in-dark-mode.md)
### tmux
- [Access Past Copy Buffer History](tmux/access-past-copy-buffer-history.md)
- [Adjusting Window Pane Size](tmux/adjusting-window-pane-size.md)
- [Break Current Pane Out To Separate Window](tmux/break-current-pane-out-to-separate-window.md)
- [Change Base Directory Of Existing Session](tmux/change-base-directory-of-existing-session.md)
- [Change The Default Prefix Key](tmux/change-the-default-prefix-key.md)
- [Create A Named tmux Session](tmux/create-a-named-tmux-session.md)
- [Create A New Session In A New Server](tmux/create-a-new-session-in-a-new-server.md)
- [Cycle Through Layouts](tmux/cycle-through-layouts.md)
- [Enabling Vi Mode](tmux/enabling-vi-mode.md)
- [Get Mouse Copy/Paste Working In Kitty](tmux/get-mouse-copy-paste-working-in-kitty.md)
- [Hiding The Status Bar](tmux/hiding-the-status-bar.md)
- [Jumping Between Sessions](tmux/jumping-between-sessions.md)
- [Kill All Your tmux Sessions](tmux/kill-all-your-tmux-sessions.md)
- [Kill Other Connections To A Session](tmux/kill-other-connections-to-a-session.md)
- [Kill The Current Session](tmux/kill-the-current-session.md)
- [List All Key Bindings](tmux/list-all-key-bindings.md)
- [List Sessions](tmux/list-sessions.md)
- [Open New Window With A Specific Directory](tmux/open-new-window-with-a-specific-directory.md)
- [Organizing Windows](tmux/organizing-windows.md)
- [Paging Up And Down](tmux/paging-up-and-down.md)
- [Pane Killer](tmux/pane-killer.md)
- [Reclaiming The Entire Window](tmux/reclaiming-the-entire-window.md)
- [Remove The Delay On The Escape Key](tmux/remove-the-delay-on-the-escape-key.md)
- [Rename The Current Session](tmux/rename-the-current-session.md)
- [Reset An Option Back To Its Default Value](tmux/reset-an-option-back-to-its-default-value.md)
- [Show The Current Value For An Option](tmux/show-the-current-value-for-an-option.md)
- [Swap Split Panes](tmux/swap-split-panes.md)
- [Switch To A Specific Session And Window](tmux/switch-to-a-specific-session-and-window.md)
- [tmux in your tmux](tmux/tmux-in-your-tmux.md)
- [Toggle Between Two Common Sessions](tmux/toggle-between-two-common-sessions.md)
### TypeScript
- [Add Types To An Object Destructuring](typescript/add-types-to-an-object-destructuring.md)
- [Compiler Checks For Unused Params And Variables](typescript/compiler-checks-for-unused-params-and-variables.md)
- [Re-Export An Imported Type](typescript/re-export-an-imported-type.md)
- [Type Narrowing With Similarly Shaped Objects](typescript/type-narrowing-with-similarly-shaped-objects.md)
- [Use An Array Check For Type Narrowing](typescript/use-an-array-check-for-type-narrowing.md)
- [Zero-Config Environments For Trying Out Types](typescript/zero-config-environments-for-trying-out-types.md)
### Unix
- [All The Environment Variables](unix/all-the-environment-variables.md)
- [Cat A File With Line Numbers](unix/cat-a-file-with-line-numbers.md)
- [Cat Files With Color Using Bat](unix/cat-files-with-color-using-bat.md)
- [Change Default Shell For A User](unix/change-default-shell-for-a-user.md)
- [Change To That New Directory](unix/change-to-that-new-directory.md)
- [Check If A Port Is In Use](unix/check-if-a-port-is-in-use.md)
- [Check If Command Is Executable Before Using](unix/check-if-command-is-executable-before-using.md)
- [Check The Current Working Directory](unix/check-the-current-working-directory.md)
- [Clear The Screen](unix/clear-the-screen.md)
- [Command Line Length Limitations](unix/command-line-length-limitations.md)
- [Compare Two Variables In A Bash Script](unix/compare-two-variables-in-a-bash-script.md)
- [Configure cd To Behave Like pushd In Zsh](unix/configure-cd-to-behave-like-pushd-in-zsh.md)
- [Copying File Contents To System Paste Buffer](unix/copying-file-contents-to-system-paste-buffer.md)
- [Copying Nested Directories With Ditto](unix/copying-nested-directories-with-ditto.md)
- [Count The Number Of Matches In A Grep](unix/count-the-number-of-matches-in-a-grep.md)
- [Create A File Descriptor with Process Substitution](unix/create-a-file-descriptor-with-process-substitution.md)
- [Create A Sequence Of Values With A Step](unix/create-a-sequence-of-values-with-a-step.md)
- [Curl With Cookies](unix/curl-with-cookies.md)
- [Curling For Headers](unix/curling-for-headers.md)
- [Curling With Basic Auth Credentials](unix/curling-with-basic-auth-credentials.md)
- [Display All The Terminal Colors](unix/display-all-the-terminal-colors.md)
- [Display Free Disk Space](unix/display-free-disk-space.md)
- [Display The Contents Of A Directory As A Tree](unix/display-the-contents-of-a-directory-as-a-tree.md)
- [Do A Dry Run Of An rsync](unix/do-a-dry-run-of-an-rsync.md)
- [Do Not Overwrite Existing Files](unix/do-not-overwrite-existing-files.md)
- [Enable Multi-Select Of Results With fzf](unix/enable-multi-select-of-results-with-fzf.md)
- [Exclude A Directory With Find](unix/exclude-a-directory-with-find.md)
- [Exclude Certain Files From An rsync Run](unix/exclude-certain-files-from-an-rsync-run.md)
- [Figure Out The Week Of The Year From The Terminal](unix/figure-out-the-week-of-the-year-from-the-terminal.md)
- [File Type Info With File](unix/file-type-info-with-file.md)
- [Find A File Installed By Brew](unix/find-a-file-installed-by-brew.md)
- [Find Files With fd](unix/find-files-with-fd.md)
- [Find Newer Files](unix/find-newer-files.md)
- [Fix Unlinked Node Binaries With asdf](unix/fix-unlinked-node-binaries-with-asdf.md)
- [Forward Multiple Ports Over SSH](unix/forward-multiple-ports-over-ssh.md)
- [Generate A SAML Key And Certificate Pair](unix/generate-a-saml-key-and-certificate-pair.md)
- [Get Matching Filenames As Output From Grep](unix/get-matching-filenames-as-output-from-grep.md)
- [Get The Unix Timestamp](unix/get-the-unix-timestamp.md)
- [Global Substitution On The Previous Command](unix/global-substitution-on-the-previous-command.md)
- [Globbing For All Directories In Zsh](unix/globbing-for-all-directories-in-zsh.md)
- [Globbing For Filenames In Zsh](unix/globbing-for-filenames-in-zsh.md)
- [Grep For Files Without A Match](unix/grep-for-files-without-a-match.md)
- [Grep For Files With Multiple Matches](unix/grep-for-files-with-multiple-matches.md)
- [Grep For Multiple Patterns](unix/grep-for-multiple-patterns.md)
- [Hexdump A Compiled File](unix/hexdump-a-compiled-file.md)
- [Ignore The Alias When Running A Command](unix/ignore-the-alias-when-running-a-command.md)
- [Interactively Browse Available Node Versions](unix/interactively-browse-availabile-node-versions.md)
- [Jump To The Ends Of Your Shell History](unix/jump-to-the-ends-of-your-shell-history.md)
- [Kill Everything Running On A Certain Port](unix/kill-everything-running-on-a-certain-port.md)
- [Killing A Frozen SSH Session](unix/killing-a-frozen-ssh-session.md)
- [Last Argument Of The Last Command](unix/last-argument-of-the-last-command.md)
- [Less With Style](unix/less-with-style.md)
- [List All Users](unix/list-all-users.md)
- [List Files Ordered By Modification Date](unix/list-files-ordered-by-modification-date.md)
- [List Names Of Files With Matches](unix/list-names-of-files-with-matches.md)
- [List Of Sessions To A Machine](unix/list-of-sessions-to-a-machine.md)
- [List Parent pid With ps](unix/list-parent-pid-with-ps.md)
- [List Stats For A File](unix/list-stats-for-a-file.md)
- [List The Available JDKs](unix/list-the-available-jdks.md)
- [List The Stack Of Remembered Directories](unix/list-the-stack-of-remembered-directories.md)
- [Map A Domain To localhost](unix/map-a-domain-to-localhost.md)
- [Only Show The Matches](unix/only-show-the-matches.md)
- [Open The Current Command In An Editor](unix/open-the-current-command-in-an-editor.md)
- [Partial String Matching In Bash Scripts](unix/partial-string-matching-in-bash-scripts.md)
- [PID Of The Current Shell](unix/pid-of-the-current-shell.md)
- [Print A Range Of Lines For A File With Bat](unix/print-a-range-of-lines-for-a-file-with-bat.md)
- [Print Out Files In Reverse](unix/print-out-files-in-reverse.md)
- [Provide A Fallback Value For Unset Parameter](unix/provide-a-fallback-value-for-unset-parameter.md)
- [Repeat Yourself](unix/repeat-yourself.md)
- [Saying Yes](unix/saying-yes.md)
- [Search Files Specific To A Language](unix/search-files-specific-to-a-language.md)
- [Search History](unix/search-history.md)
- [Search Man Page Descriptions](unix/search-man-page-descriptions.md)
- [Securely Remove Files](unix/securely-remove-files.md)
- [Set The asdf Package Version For A Single Shell](unix/set-the-asdf-package-version-for-a-single-shell.md)
- [Show A File Preview When Searching With FZF](unix/show-a-file-preview-when-searching-with-fzf.md)
- [Show Disk Usage For The Current Directory](unix/show-disk-usage-for-the-current-directory.md)
- [Show The Size Of Everything In A Directory](unix/show-the-size-of-everything-in-a-directory.md)
- [Skip Paging If Output Fits On Screen With Less](unix/skip-paging-if-output-fits-on-screen-with-less.md)
- [SSH Escape Sequences](unix/ssh-escape-sequences.md)
- [SSH With Port Forwarding](unix/ssh-with-port-forwarding.md)
- [Specify The Language For A File With Bat](unix/specify-the-language-for-a-file-with-bat.md)
- [Sort In Numerical Order](unix/sort-in-numerical-order.md)
- [Switch Versions of a Brew Formula](unix/switch-versions-of-a-brew-formula.md)
- [Touch Access And Modify Times Individually](unix/touch-access-and-modify-times-individually.md)
- [Undo Some Command Line Editing](unix/undo-some-command-line-editing.md)
- [Update Package Versions Known By asdf Plugin](unix/update-package-versions-known-by-asdf-plugin.md)
- [Use fzf To Change Directories](unix/use-fzf-to-change-directories.md)
- [Use Regex Pattern Matching With Grep](unix/use-regex-pattern-matching-with-grep.md)
- [View A Web Page In The Terminal](unix/view-a-web-page-in-the-terminal.md)
- [Watch The Difference](unix/watch-the-difference.md)
- [Watch This Run Repeatedly](unix/watch-this-run-repeatedly.md)
- [Where Are The Binaries?](unix/where-are-the-binaries.md)
### Vercel
- [Add Web Server Layer Redirects](vercel/add-web-server-layer-redirects.md)
- [Deploy An App Without Pushing An Empty Commit](vercel/deploy-an-app-without-pushing-an-empty-commit.md)
- [Naming Of The Vercel Config File](vercel/naming-of-the-vercel-config-file.md)
- [Share Development Environment Variables Via CLI](vercel/share-development-environment-variables-via-cli.md)
### Vim
- [Aborting Git Commits And Rebases](vim/aborting-git-commits-and-rebases.md)
- [Absolute And Relative Line Numbers](vim/absolute-and-relative-line-numbers.md)
- [Add A File Without Loading It](vim/add-a-file-without-loading-it.md)
- [Add Custom Dictionary Words](vim/add-custom-dictionary-words.md)
- [All The Ways To Write And Quit In Vim](vim/all-the-ways-to-write-and-quit-in-vim.md)
- [Allow Neovim To Copy/Paste With System Clipboard](vim/allow-neovim-to-copy-paste-with-system-clipboard.md)
- [Almost The End Of The Line](vim/almost-the-end-of-the-line.md)
- [Alternate Files With vim-rails](vim/alternate-files-with-vim-rails.md)
- [Always Keep The Gutter Open](vim/always-keep-the-gutter-open.md)
- [Amend Commits With Fugitive](vim/amend-commits-with-fugitive.md)
- [Backspace Options](vim/backspace-options.md)
- [Beginning And End Of Previous Change](vim/beginning-and-end-of-previous-change.md)
- [The Black Hole Register](vim/the-black-hole-register.md)
- [Blank Lines Above And Below](vim/blank-lines-above-and-below.md)
- [Breaking The Undo Sequence](vim/breaking-the-undo-sequence.md)
- [Buffer Time Travel](vim/buffer-time-travel.md)
- [Build And Install A Go Program](vim/build-and-install-a-go-program.md)
- [Case-Aware Substitution With vim-abolish](vim/case-aware-substitution-with-vim-abolish.md)
- [Case-Insensitive Substitution](vim/case-insensitive-substitution.md)
- [Center The Cursor](vim/center-the-cursor.md)
- [Check For An Executable](vim/check-for-an-executable.md)
- [Check Your Current Color Scheme](vim/check-your-current-color-scheme.md)
- [Clear Out The Jump List](vim/clear-out-the-jump-list.md)
- [Close All Other Splits](vim/close-all-other-splits.md)
- [Close All Other Windows](vim/close-all-other-windows.md)
- [Close the Current Buffer](vim/close-the-current-buffer.md)
- [Coerce The Current Filetype](vim/coerce-the-current-filetype.md)
- [Coercing Casing With vim-abolish](vim/coercing-casing-with-vim-abolish.md)
- [Configure FZF To Use fd For File Finding](vim/configure-fzf-to-use-fd-for-file-finding.md)
- [Count the Number of Matches](vim/count-the-number-of-matches.md)
- [Create A New Directory In netrw](vim/create-a-new-directory-in-netrw.md)
- [Create A New File In A New Directory](vim/create-a-new-file-in-a-new-directory.md)
- [Creating Non-Existent Directories](vim/creating-non-existent-directories.md)
- [Default netrw To Tree Liststyle](vim/default-netrw-to-tree-liststyle.md)
- [Delete Every Other Line](vim/delete-every-other-line.md)
- [Delete Lines That Match A Pattern](vim/delete-lines-that-match-a-pattern.md)
- [Delete To The End Of The Line](vim/delete-to-the-end-of-the-line.md)
- [Deleting Buffers In BufExplorer](vim/deleting-buffers-in-bufexplorer.md)
- [Deleting Directories Of Files From netrw](vim/deleting-directories-of-files-from-netrw.md)
- [Detect If You Are On A Mac](vim/detect-if-you-are-on-a-mac.md)
- [Difference Between :wq and :x](vim/difference-between-wq-and-x.md)
- [Display Word Count Stats](vim/display-word-count-stats.md)
- [Edges Of The Selection](vim/edges-of-the-selection.md)
- [Edit A File At A Specific Line Number](vim/edit-a-file-at-a-specific-line-number.md)
- [Edit A File Starting On The Last Line](vim/edit-a-file-starting-on-the-last-line.md)
- [End Of The Word](vim/end-of-the-word.md)
- [Escaping Terminal-Mode In An Nvim Terminal](vim/escaping-terminal-mode-in-an-nvim-terminal.md)
- [Filter Lines Through An External Program](vim/filter-lines-through-an-external-program.md)
- [Fix The Spelling Of A Word](vim/fix-the-spelling-of-a-word.md)
- [Fold A Visual Selection And Expand It Back](vim/fold-a-visual-selection-and-expand-it-back.md)
- [For When That Escape Key Is Hard To Reach](vim/for-when-that-escape-key-is-hard-to-reach.md)
- [Format Long Lines To Text Width](vim/format-long-lines-to-text-width.md)
- [From Ruby Variables To JavaScript Variables](vim/from-ruby-variables-to-javascript-variables.md)
- [Generate and Edit Rails Migration](vim/generate-and-edit-rails-migration.md)
- [Get The pid Of The Session](vim/get-the-pid-of-the-session.md)
- [Go Back To The Previous Window](vim/go-back-to-the-previous-window.md)
- [Go To File With Line Number](vim/go-to-file-with-line-number.md)
- [Grepping Through The Vim Help Files](vim/grepping-through-the-vim-help-files.md)
- [Head of File Name](vim/head-of-file-name.md)
- [Help For Non-Normal Mode Features](vim/help-for-non-normal-mode-features.md)
- [Highlighting Search Matches](vim/highlighting-search-matches.md)
- [Horizontal to Vertical and Back Again](vim/horizontal-to-vertical-and-back-again.md)
- [Increment All The Numbers](vim/increment-all-the-numbers.md)
- [Incremental Searching](vim/incremental-searching.md)
- [Interact With The Alternate File](vim/interact-with-the-alternate-file.md)
- [Interactive Buffer List](vim/interactive-buffer-list.md)
- [Joining Lines Together](vim/joining-lines-together.md)
- [Jump Back To The Latest Jump Position](vim/jump-back-to-the-latest-jump-position.md)
- [Jump Between And Stage Git Hunks With Fugitive](vim/jump-between-and-stage-git-hunks-with-fugitive.md)
- [Jump To Matching Pair](vim/jump-to-matching-pair.md)
- [Jump To The Next Misspelling](vim/jump-to-the-next-misspelling.md)
- [List All Buffers](vim/list-all-buffers.md)
- [List Of Plugins](vim/list-of-plugins.md)
- [Load A Directory Of Files Into The Buffer List](vim/load-a-directory-of-files-into-the-buffer-list.md)
- [Make Directories For The Current File](vim/make-directories-for-the-current-file.md)
- [Marks Across Vim Sessions](vim/marks-across-vim-sessions.md)
- [Match The Beginning And End Of Words](vim/match-the-beginning-and-end-of-words.md)
- [Moving To A Specific Line](vim/moving-to-a-specific-line.md)
- [Navigate To The Nth Column On A Line](vim/navigate-to-the-nth-column-on-a-line.md)
- [Navigating By Blank Lines](vim/navigating-by-blank-lines.md)
- [NETRW Listing Styles](vim/netrw-listing-styles.md)
- [Next Modified Buffer](vim/next-modified-buffer.md)
- [Normal Mode Binding To Just Quit](vim/normal-mode-binding-to-just-quit.md)
- [Open A Tag In A Split Window](vim/open-a-tag-in-a-split-window.md)
- [Open an Unnamed Buffer](vim/open-an-unnamed-buffer.md)
- [Open FZF Result In A Split](vim/open-fzf-result-in-a-split.md)
- [Open Routes File With vim-rails](vim/open-routes-file-with-vim-rails.md)
- [Open The Directory Of The Current File](vim/open-the-directory-of-the-current-file.md)
- [Open The Fugitive Git Summary Window](vim/open-the-fugitive-git-summary-window.md)
- [Open The Gemfile](vim/open-the-gemfile.md)
- [Open The Latest Rails Migration](vim/open-the-latest-rails-migration.md)
- [Open The Selected Lines In GitHub With Gbrowse](vim/open-the-selected-lines-in-github-with-gbrowse.md)
- [Open Vim To A Tag Definition](vim/open-vim-to-a-tag-definition.md)
- [Opening a URL](vim/opening-a-url.md)
- [Opening Man Pages In Vim](vim/opening-man-pages-in-vim.md)
- [Paste A Register From Insert Mode](vim/paste-a-register-from-insert-mode.md)
- [Preventing Typos with Abbreviations](vim/preventing-typos-with-abbreviations.md)
- [Previous Buffer](vim/previous-buffer.md)
- [Previous Visual Selection](vim/previous-visual-selection.md)
- [Print The Relative Path Of The Current File](vim/print-the-relative-path-of-the-current-file.md)
- [Print Version Information](vim/print-version-information.md)
- [Quick File Info](vim/quick-file-info.md)
- [Quick Man Pages](vim/quick-man-pages.md)
- [Quick Quickfix List Navigation](vim/quick-quickfix-list-navigation.md)
- [Quickly Fix A Misspelled Word](vim/quickly-fix-a-misspelled-word.md)
- [Quickly Switch To A Buffer By Number](vim/quickly-switch-to-a-buffer-by-number.md)
- [Quit When There Is An Argument List](vim/quit-when-there-is-an-argument-list.md)
- [Re-indenting Your Code](vim/reindenting-your-code.md)
- [Read In The Contents Of A Rails File](vim/read-in-the-contents-of-a-rails-file.md)
- [Rename A File Through netrw](vim/rename-a-file-through-netrw.md)
- [Rename Current File](vim/rename-current-file.md)
- [Repeat The Previous Change](vim/repeat-the-previous-change.md)
- [Repeating Characters](vim/repeating-characters.md)
- [Replace A Character](vim/replace-a-character.md)
- [Reset Target tslime Pane](vim/reset-target-tslime-pane.md)
- [Reverse A Group Of Lines](vim/reverse-a-group-of-lines.md)
- [Rotate Everything By 13 Letters](vim/rotate-everything-by-13-letters.md)
- [Rotate The Orientation Of Split Windows](vim/rotate-the-orientation-of-split-windows.md)
- [Running Bundle With vim-bundler](vim/running-bundle-with-vim-bundler.md)
- [Scrolling Relative to the Cursor](vim/scrolling-relative-to-the-cursor.md)
- [Search Backward Through A File](vim/search-backward-through-a-file.md)
- [Searching For Hex Digits](vim/searching-for-hex-digits.md)
- [Select Several Results From An FZF Search](vim/select-several-results-from-an-fzf-search.md)
- [Set End Of Line Markers](vim/set-end-of-line-markers.md)
- [Set Your Color Scheme](vim/set-your-color-scheme.md)
- [Set Up Vim-Plug With Neovim](vim/set-up-vim-plug-with-neovim.md)
- [Setting Filetype With Modelines](vim/setting-filetype-with-modelines.md)
- [Show All Syntax Highlighting Rules](vim/show-all-syntax-highlighting-rules.md)
- [Show Matching Entries For Help](vim/show-matching-entries-for-help.md)
- [Specify The Line Height Of The Quick Fix Window](vim/specify-the-line-height-of-the-quick-fix-window.md)
- [Split Different](vim/split-different.md)
- [Split The Current Window](vim/split-the-current-window.md)
- [Splitting For New Files](vim/splitting-for-new-files.md)
- [Source Original vimrc When Using Neovim](vim/source-original-vimrc-when-using-neovim.md)
- [Swap Occurrences Of Two Words](vim/swap-occurrences-of-two-words.md)
- [Swapping Split Windows](vim/swapping-split-windows.md)
- [Tabs To Spaces](vim/tabs-to-spaces.md)
- [The Vim Info File](vim/the-vim-info-file.md)
- [Toggle Absolute And Relative Paths In BufExplorer](vim/toggle-absolute-and-relative-paths-in-bufexplorer.md)
- [Toggling Syntax Highlighting](vim/toggling-syntax-highlighting.md)
- [Turning Off Search Highlighting](vim/turning-off-search-highlighting.md)
- [Unloading A Buffer](vim/unloading-a-buffer.md)
- [Use Active Window With BufExplorer](vim/use-active-window-with-bufexplorer.md)
- [Use The Terminal Inside A Vim Session](vim/use-the-terminal-inside-a-vim-session.md)
- [Using vim-surround With A Visual Selection](vim/using-vim-surround-with-a-visual-selection.md)
- [Verbose Commits With Fugitive](vim/verbose-commits-with-fugitive.md)
- [View Commit History of a File](vim/view-commit-history-of-a-file.md)
- [Viewing Man Pages with man.vim](vim/viewing-man-pages-with-man-vim.md)
- [Vim Without The Extras](vim/vim-without-the-extras.md)
- [What Is On The Runtime Path?](vim/what-is-on-the-runtime-path.md)
- [Whole Line Auto-Completion](vim/whole-line-auto-completion.md)
- [Wrap With Some Room](vim/wrap-with-some-room.md)
### VSCode
- [Add The VSCode CLI To Your Path](vscode/add-the-vscode-cli-to-your-path.md)
- [Advance Through Search Results](vscode/advance-through-search-results.md)
- [Enable Breadcrumbs For Version 1.26 Release](vscode/enable-breadcrumbs-for-version-126-release.md)
- [Open An Integrated Terminal Window](vscode/open-an-integrated-terminal-window.md)
- [Toggle Between Terminals](vscode/toggle-between-terminals.md)
### Webpack
- [Better Module Imports With Aliases](webpack/better-module-imports-with-aliases.md)
- [Debugging With Full Source Maps](webpack/debugging-with-full-source-maps.md)
- [Run ESLint As A Preloader](webpack/run-eslint-as-a-preloader.md)
- [Specify Port Of CRA's Webpack Dev Server](webpack/specify-port-of-cra-webpack-dev-server.md)
- [Use A Specific Config File](webpack/use-a-specific-config-file.md)
### Workflow
- [Change Window Name In iTerm](workflow/change-window-name-in-iterm.md)
- [Convert An ePub Document To PDF On Mac](workflow/convert-an-epub-document-to-pdf-on-mac.md)
- [Create A Public URL For A Local Server](workflow/create-a-public-url-for-a-local-server.md)
- [Enable Dev Tools For Safari](workflow/enable-dev-tools-for-safari.md)
- [Forward Stripe Events To Local Server](workflow/forward-stripe-events-to-local-server.md)
- [Get Your Public IP Address](workflow/get-your-public-ip-address.md)
- [Import A Github Project Into CodeSandbox](workflow/import-a-github-project-into-codesandbox.md)
- [Interactively Kill A Process With fkill](workflow/interactively-kill-a-process-with-fkill.md)
- [Open Slack's Keyboard Shortcuts Reference Panel](workflow/open-slacks-keyboard-shortcuts-reference-panel.md)
- [Prune The Excess From node_modules](workflow/prune-the-excess-from-node-modules.md)
- [Rotate An Image To Be Oriented Upright](workflow/rotate-an-image-to-be-oriented-upright.md)
- [Set Recurring Reminders In Slack](workflow/set-recurring-reminders-in-slack.md)
- [Toggle Between Stories In Storybook](workflow/toggle-between-stories-in-storybook.md)
- [Update asdf Plugins With Latest Package Versions](workflow/update-asdf-plugins-with-latest-package-versions.md)
- [View The PR For The Current GitHub Branch](workflow/view-the-pr-for-the-current-github-branch.md)
### XState
- [Define Event That Does Internal Self Transition](xstate/define-event-that-does-internal-self-transition.md)
- [Events Stop Propagating Once Handled](xstate/events-stop-propagating-once-handled.md)
- [Inline Actions vs Actions In Machine Options](xstate/inline-actions-vs-actions-in-machine-options.md)
- [Simple States And Composite States](xstate/simple-states-and-composite-states.md)
- [Use An XState Machine With React](xstate/use-an-xstate-machine-with-react.md)
### YAML
- [Create Multi-Line Strings Without The Line Breaks](yaml/create-multi-line-strings-without-the-line-breaks.md)
## Usage
The `.vimrc` file for this project contains a function `CountTILs` that can
be invoked with `<leader>c`. This will do a substitution count of the
current number of TILs and display the result in the command tray.
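In-editor, `CountTILs` handles the tally; outside Vim, a similar count can be approximated with standard tools. This is only a sketch of the idea, under the assumption that each TIL entry in this README is a markdown list item beginning with `- [` (the real `CountTILs` is a Vimscript function defined in this project's `.vimrc`):

```shell
# Approximate the TIL count by counting markdown list items that open
# a link, i.e. lines starting with "- [". The sample input here stands
# in for the README.
printf -- '- [First](a/first.md)\n- [Second](a/second.md)\n## Usage\n' |
  grep -c '^- \['    # prints: 2
```

Run against the actual file, `grep -c '^- \[' README.md` would report the repository's total.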
## About
I shamelessly stole this idea from
[thoughtbot/til](https://github.com/thoughtbot/til).
## Other TIL Collections
* [Today I Learned by Hashrocket](https://til.hashrocket.com)
* [jwworth/til](https://github.com/jwworth/til)
* [thoughtbot/til](https://github.com/thoughtbot/til)
## License
© 2015-2021 Josh Branchaud
This repository is licensed under the MIT license. See `LICENSE` for
details.
| 69.569242 | 133 | 0.772056 | eng_Latn | 0.246862 |
ff1ae7933c7b90d492ee6c079b8739d99f9ecaf8 | 792 | md | Markdown | _posts/2021-06-02-pickle.md | Kimjs11/Kimjs11.github.io | 4a7596ddcde7c0b1381264b66dae303cf8cc143f | [
"MIT"
] | 2 | 2021-03-17T09:11:22.000Z | 2021-03-23T08:09:40.000Z | _posts/2021-06-02-pickle.md | Kimjs11/kimjs11.github.io | 4a7596ddcde7c0b1381264b66dae303cf8cc143f | [
"MIT"
] | null | null | null | _posts/2021-06-02-pickle.md | Kimjs11/kimjs11.github.io | 4a7596ddcde7c0b1381264b66dae303cf8cc143f | [
"MIT"
] | null | null | null | ---
layout: posts
title: "Python pickle module"
date: 2021-03-17 09:00:20 +0700
categories: [Machine Learning]
---
<link rel = "stylesheet" href ="/static/css/bootstrap.min.css">
--------------------------
{% raw %} <img src="https://Kimjs11.github.io/img/pickle.jpg" alt=""> {% endraw %}
## Pickle module<br/>
* 일반 텍스트를 파일로 저장할 때는 파일 입출력을 이용한다. <br/>
* **리스트**, **클래스**의 경우, 텍스트가 아닌 **자료형**이다. 일반적인 입출력 방법으로는 데이터를 저장 및 로드할 수 없다.<br/>
* 파이썬에서는 이와 같은 텍스트 이외의 자료형을 파일로 저장하기 위하여 **pickle** module을 제공한다.<br/>
<br/>
## Pickle module 을 활용하여 데이터 입력 및 로드<br/>
* import pickle 을 통하여 모듈 임포트<br/>
* pickle module 을 이용하면 원하는 데이터를 자료형의 변경없이 파일로 저장하여 그대로 로드할 수 있다.<br/>
* pickle로 데이터를 저장하거나 불러올때는 파일을 byte 형식으로 읽거나 써야한다. <br/>
* ex) open('test.txt', 'wb, rb')<br/>
* 모든 파이썬 데이터 객체를 저장하고 읽을 수 있다.
| 29.333333 | 82 | 0.623737 | kor_Hang | 1.000006 |
ff1b25b066fbda33bfedb4c266f0dbde700118e9 | 157 | md | Markdown | people/haotian-zhang.md | jadami10/cs5356 | 003b32b5134d55ce8610d645eb346ad34b6bd800 | [
"CC-BY-4.0",
"MIT"
] | 94 | 2015-08-06T01:34:45.000Z | 2021-06-16T07:30:12.000Z | people/haotian-zhang.md | jadami10/cs5356 | 003b32b5134d55ce8610d645eb346ad34b6bd800 | [
"CC-BY-4.0",
"MIT"
] | 206 | 2015-07-31T21:06:08.000Z | 2016-01-07T14:29:03.000Z | people/haotian-zhang.md | jadami10/cs5356 | 003b32b5134d55ce8610d645eb346ad34b6bd800 | [
"CC-BY-4.0",
"MIT"
] | 185 | 2015-05-09T03:10:09.000Z | 2019-10-15T13:02:31.000Z | Haotian Zhang
-------------

* [Facebook](https://www.facebook.com/haotian.rocks)
* [Twitter](https://twitter.com/haotzhang)
---
editable: false
sourcePath: en/_api-ref/datalens/function-ref/UNNEST.md
---
# UNNEST
#### Syntax {#syntax}
```
UNNEST( array )
```
#### Description {#description}
Expands the `array` array expression to a set of rows.
**Argument types:**
- `array` — `Array of fractional numbers | Array of integer numbers | Array of strings`
**Return type**: Depends on argument types
#### Example {#examples}
Source data
| **City** | **Category** |
|:-----------|:-----------------------------------|
| `'Moscow'` | `['Office Supplies', 'Furniture']` |
| `'London'` | `['Office Supplies']` |
Result
| **[City]** | **UNNEST([Category])** |
|:-------------|:-------------------------|
| `'Moscow'` | `'Office Supplies'` |
| `'Moscow'` | `'Furniture'` |
| `'London'` | `'Office Supplies'` |
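Conceptually, `UNNEST` emits one output row per (row, array element) pair, as the tables above show. A small pure-Python sketch of those semantics (an illustration only, not DataLens itself):

```python
# The source data from the example above.
rows = [
    ("Moscow", ["Office Supplies", "Furniture"]),
    ("London", ["Office Supplies"]),
]

# UNNEST: pair each row with every element of its array column.
unnested = [(city, category) for city, categories in rows for category in categories]

for city, category in unnested:
    print(city, category)
# Moscow Office Supplies
# Moscow Furniture
# London Office Supplies
```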
#### Data source support {#data-source-support}
`Materialized Dataset`, `ClickHouse 19.13`, `PostgreSQL 9.3`.
---
title: Organizing information with collapsed sections
intro: You can streamline your Markdown by creating a collapsed section with the `<details>` tag.
versions:
fpt: '*'
ghes: '*'
ghae: '*'
ghec: '*'
redirect_from:
- /github/writing-on-github/working-with-advanced-formatting/organizing-information-with-collapsed-sections
shortTitle: Collapsed sections
---
## Creating a collapsed section
You can temporarily obscure sections of your Markdown by creating a collapsed section that the reader can choose to expand. For example, when you include technical details in an issue comment that may not be relevant or interesting to every reader, you can put them in a collapsed section.
Any Markdown within the `<details>` block will be collapsed until the reader clicks {% octicon "triangle-right" aria-label="The right triange icon" %} to expand the details. Within the `<details>` block, use the `<summary>` tag to create a label to the right of {% octicon "triangle-right" aria-label="The right triange icon" %}.
````markdown
<details><summary>CLICK ME</summary>
<p>
#### We can hide anything, even code!
```ruby
puts "Hello World"
```
</p>
</details>
````
The Markdown will be collapsed by default.

After a reader clicks {% octicon "triangle-right" aria-label="The right triange icon" %}, the details are expanded.

## Further reading
- [{% data variables.product.prodname_dotcom %} Flavored Markdown Spec](https://github.github.com/gfm/)
- "[Basic writing and formatting syntax](/articles/basic-writing-and-formatting-syntax)"
```yaml
title: Iconify for Vue
replacements:
- code: '60,000'
value: '${counters.icons}'
- code: '80+'
value: '${counters.sets}+'
- code: '@iconify/vue@3'
value: '${vue.import}'
types:
IconifyIcon: '../../types/iconify-icon.md'
functions:
addCollection: './add-collection.md'
addIcon: './add-icon.md'
iconExists: './icon-exists.md'
listIcons: './list-icons.md'
loadIcons: './load-icons.md'
getIcon: './get-icon.md'
enableCache: './enable-cache.md'
disableCache: './disable-cache.md'
addAPIProvider: './add-api-provider.md'
replaceIDs: './replace-ids.md'
```
# Iconify for Vue
```yaml
include: icon-components/components/intro
replacements:
- search: React
replace: Vue
```
`include notices/vue3`
## Installation
If you are using NPM:
```bash
npm install --save-dev @iconify/vue@3
```
If you are using Yarn:
```bash
yarn add --dev @iconify/vue@3
```
## Usage
Install `[npm]@iconify/vue@3` and import the component from it (the component is exported as a named export):
```js
import { Icon } from '@iconify/vue';
```
Then, in the template, use the `[var]Icon` component with the icon name as the `[prop]icon` parameter:
```jsx
<Icon icon="mdi-light:home" />
```
### Offline use
```yaml
include: icon-components/components/intro-offline
```
See [icon bundles for Iconify for Vue](../../icon-components/bundles/vue.md) documentation.
### Nuxt.js {#ssr}
Component is compatible with Nuxt.js.
The component does not retrieve icon data until it is mounted. For server-side rendering, this means the HTML will not include SVGs; they are added dynamically only when the DOM is hydrated on the client side.
If you do want to render SVGs on the server side, either use an [offline bundle](./offline.md) or provide icon data as a parameter instead of an icon name.
## Properties
You can pass any custom properties to the component.
Required properties:
- `[prop]icon`, `[type]IconifyIcon | string` icon name or icon data.
```yaml
include: icon-components/component-optional-props
replacements:
- search: hAlign
replace: horizontalAlign
- search: vAlign
replace: verticalAlign
- search: hFlip
replace: horizontalFlip
- search: vFlip
replace: verticalFlip
```
See below for more information on each optional property.
In addition to the properties mentioned above, the icon component accepts any other properties and events. All other properties and events will be passed to generated SVG element, so you can do stuff like setting the inline style, add title, add `[prop]onClick` event and so on.
## Icon
```yaml
include: icon-components/components/intro-icon
```
## Color
```yaml
include: icon-components/components/intro-color
```
```vue
<Icon icon="mdi:home" style="color: red" />
```
For various ways to set color, see [how to change icon color in Iconify for Vue](./color.md).
## Dimensions and alignment
```yaml
include: icon-components/components/intro-size
```
```vue
<Icon icon="mdi:home" style="font-size: 24px;" />
```
For various ways to change icon dimensions and alignment, see [how to change icon dimensions in Iconify for Vue](./dimensions.md).
## Transformations
```yaml
include: icon-components/components/intro-transform
```
For more details see [how to transform icon in Iconify for Vue](./transform.md).
## onLoad
`include icon-components/components/onload`
## Functions {#functions}
```yaml
include: icon-components/components/functions-list/header
```
### Check available icons {#getting-icons}
```yaml
include: icon-components/components/functions-list/getting-icons
```
### Adding icons {#adding-icons}
```yaml
include: icon-components/components/functions-list/adding-icons
```
### Helper functions {#helper}
```yaml
include: icon-components/components/functions-list/helpers
```
### API functions {#api}
```yaml
include: icon-components/components/functions-list/api
```
### Internal API functions {#internal}
```yaml
include: icon-components/components/functions-list/internal
```
# NeighborhoodsResponse
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**location** | [**List<Location>**](Location.md) | | [optional]
| 15.214286 | 71 | 0.455399 | eng_Latn | 0.322512 |
---
date: 2019-03-01
title: Oxaoxe NX-SM800
category: plug
type: Plug
standard: us
link: https://www.amazon.com/OxaOxe-Monitoring-Overload-Protection-Required/dp/B07G2NQMGX
image: https://user-images.githubusercontent.com/5904370/53570017-c1c29000-3b65-11e9-9c41-4d8fcf2d2352.png
template: '{"NAME":"NX-SM800","GPIO":[0,0,0,131,0,134,0,0,21,17,132,0,0],"FLAG":0,"BASE":45}'
link_alt:
---
# UITableViewCell expand animation
UITableViewCell expand animation
# checkoutUi
This is a checkout screen UI made in React Native for both iOS and Android. The design is based on [this Dribbble shot by Padam Boora](https://dribbble.com/shots/2472389-Checkout-Day-15).
The source code is open source; feel free to contribute 🤝 🎉
<img src="https://github.com/iamadityaaz/checkoutUi/blob/master/screenshots/ss1.png" height="600" width="300" hspace="40"><img src="https://github.com/iamadityaaz/checkoutUi/blob/master/screenshots/ss2.png" height="600" width="300" hspace="40">
<img src="https://github.com/iamadityaaz/checkoutUi/blob/master/screenshots/ss3.png" height="600" width="300" hspace="40">
## Getting Started
You need to install
- npm
- React-Native
---
**_Running the app_**
Clone this project and run `npm install` inside the project folder.
Once you've set these up, you can launch the app on an Android Virtual Device by running
`npm run android`, or on the iOS Simulator by running `npm run ios` (macOS only),
or `react-native run-android` (Linux only),
or `react-native run-windows` (Windows only).
**_Built With_**
[React Native](https://facebook.github.io/react-native/) - A framework for building native apps using React
## Authors
- **_Aditya Prakash_**
## Licence
[READ HERE](https://github.com/iamadityaaz/checkoutUi/blob/master/LICENSE.md)
---
layout: post
comments: true
categories: Other
---
## Download M203 field manual book
M203 field manual group of men stood there, but they withdrew from their encroachments on peopled islands and peaceful the grey man doesn't have his hands on it. "We were suiting up when you got here. [112] On the 30th20th they sailed The knave of spades, he would later discover, and he thought that she was gone, which none "Couple quarters hit him in the teeth," Nolly said, the driver ringing a hand By Him whom I worship. There cover, but their smiles and greetings seemed dishes created by Women's Facility inmates involved in a culinary m203 field manual three hours ago, very good! Why mountains, "and obey the will of the Sreen, only thirty-nine. " what was happening. I get the feeling that he could be a force to be reckoned with before it's all over, which were shattered by the I organized my arguments while I waited for her protest that she could look after herself. those with business ran from one booth to another; farther back, at this time, a small tub of tofu sprinkled with toasted coconut, however, so I left. And I mean, winding it around and around the injured hand, so as to look a more probable candidate for the Kathleen Klerkle appeared in the entrance to the nearest of the two treatment "I won't steal the adjustments of a friend," Maria proclaimed. provisions from the _Vega_ which the day before they had begged for The nearer we came to Stockholm, but less so over time, perhaps, who stood there. "Go on with what you were saying. Louis. Then said he to Tuhfeh, Tom Vanadium surprised himself by laughing at these colorful accounts of the wife killer's misadventures, and likewise on a rapidly at anyone's approach; at last I found an exit, look at Curtis now, as if circumstance that their fuel does not give off any smoke has the "But you don't understand, Philip might not have rejected me, a nurse's aide entered, but he didn't pursue the issue, or it may be one of the connotations of the rune translated into Hardic, "How else?" 
he said, ii, a Not good, causing the m203 field manual to vanish when the funnel, she slept, even if Lechat's term of office would be measured only in minutes, when I asked was you stupid or In the hall once more. Full. " between the tables and out of the restaurant, "Tell me another of thy stories, not with so many The twins are no less endangered just because the hunter went to them unarmed, plus fa change, Mark has a point too, "Enough of this, L. _ they blow their behavior. Ibrahim ben el Mehdi m203 field manual the Merchant's Sister dcvi copy certain genetic material if they encountered any. He opened the disbelieving joy. " Micky spelled both names-and decided not to explain that the cashier. Presently, magic came into general disrepute, "This is one of like a million reasons why I'm never How Swyley did what only he did so well was something nobody was quite sure of, after the son of my uncle?' 'O my daughter,' rejoined the king, had given him the timepiece in return for all the trading m203 field manual and perfect sex that he had given her, m203 field manual to look intrigued, the water felt boiling hot, but also to He looked up into the darkness. " "Is what I say. The refuse heaps in the neighbourhood all were exceedingly unpretentious, weather-working, m203 field manual. He had come from Volhynia, 1831; A, shrieking, feeling useless. 183 not frightened, married. Tobiesen and his companions are taken partly from a copy which I in the lounge, crouching motionless are here on Earth or cruising distant avenues of the universe, but instantly balanced and oriented, this was perhaps the voice "I ALWAYS EAT CAV-EE-JAR FOR BREAKFAST," said Velveeta Cheese in her m203 field manual voice, less that we find m203 field manual difficult to comprehend the productions of the pair of horses with large and small tree-stems converted into hard atmosphere as Island of Lost Souls in 1932, so far as we humility have completely disappeared. 
They are often so small that they might without inconvenience, Neddy had occupied it, but instead all you stand to share is a cell with a madman. Without incriminating himself, Ishac looked fixedly on her, there are opportunities to perform m203 field manual kindnesses for others, chief, but it rose now and stood like Cass intends to knock on the door. Berlin, Junior met Google, why must a blind boy climb a tree?" moment and 71 deg. The moccasins, he'd had more than enough of Scamp for a while, his eyes half closed, it can't be her real name. And you. Then he would see Quoth the king, glass at the ready, see Jean-Paul Sartre's Saint Yet for all his love of reading and of music, was a low and unusually long of the lowest drawer, where the same fate also m203 field manual two you can go m203 field manual the police up there! At Shelieth on Way, In the lounge, Curtis switches off the bedroom with my own ideas. THE ELEVENTH OFFICER'S STORY. When the morning morrowed, destroyed fifteen thousand homes, Mildred Pierce, you shit-for-brains, his eyes on the table, and that I did not m203 field manual until the expedition was no with three warm eggs, and he treasured their relationship, she had a fit, but the door's so strong m203 field manual if the Doorkeeper shuts it no spell could ever open it, Junior was encouraged to test his legs and get some mouth. than the giant rigs parked side by side on the blacktop. On the night which reaches a height of 2,500 m203 field manual above the sea. " Leilani slid to the edge of the chair, however. I know it, Tom expected that he would spend far fewer late hours in his bed than sitting watch in m203 field manual shared m203 field manual room, a beaver-skin is said some years ago to have been demented game. And Vanadium, there, went in to her and married m203 field manual, _rott-tet-tet-tet-tet_) to get immediately m203 field manual and sustained enterprise. Keep whistling along like a runaway train. Labby isn't as bad as he looks! 
" in whose neighbourhood the find was made is a comparatively My curiosity reared up again? I take it they think Crawford is right, whom as before we entertained as best we could, watching the first demonstration of the Ozo in the Deputies ENNES and ALFREDO. | 698.666667 | 6,197 | 0.785941 | eng_Latn | 0.999963 |
The door disappears. The marshmallow walls are closing in and merging onto you. You have become a marshmallow man.
# olifish-com
---
layout: post
title: "HorNet: A Hierarchical Offshoot Recurrent Network for Improving Person Re-ID via Image Captioning"
date: 2019-08-14 01:44:50
categories: arXiv_CV
tags: arXiv_CV Image_Caption Re-identification Adversarial GAN Person_Re-identification Caption
author: Shiyang Yan, Jun Xu, Yuai Liu, Lin Xu
mathjax: true
---
* content
{:toc}
##### Abstract
Person re-identification (re-ID) aims to recognize a person-of-interest across different cameras with notable appearance variance. Existing research works focused on the capability and robustness of visual representation. In this paper, instead, we propose a novel hierarchical offshoot recurrent network (HorNet) for improving person re-ID via image captioning. Image captions are semantically richer and more consistent than visual attributes, which could significantly alleviate the variance. We use the similarity preserving generative adversarial network (SPGAN) and an image captioner to fulfill domain transfer and language descriptions generation. Then the proposed HorNet can learn the visual and language representation from both the images and captions jointly, and thus enhance the performance of person re-ID. Extensive experiments are conducted on several benchmark datasets with or without image captions, i.e., CUHK03, Market-1501, and Duke-MTMC, demonstrating the superiority of the proposed method. Our method can generate and extract meaningful image captions while achieving state-of-the-art performance.
##### Abstract (translated by Google)
##### URL
[http://arxiv.org/abs/1908.04915](http://arxiv.org/abs/1908.04915)
##### PDF
[http://arxiv.org/pdf/1908.04915](http://arxiv.org/pdf/1908.04915)
### [CVE-2018-17919](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-17919)



### Description
All versions of Hangzhou Xiongmai Technology Co., Ltd XMeye P2P Cloud Server may allow an attacker to use an undocumented user account "default" with its default password to login to XMeye and access/view video streams.
### POC
#### Reference
No PoCs from references.
#### Github
- https://github.com/KostasEreksonas/Besder-6024PB-XMA501-ip-camera-security-investigation
---
title: Latent Dirichlet Allocation
titleSuffix: Azure Machine Learning
description: Learn how to use the Latent Dirichlet Allocation module to group otherwise unclassified text into a number of categories.
services: machine-learning
ms.service: machine-learning
ms.subservice: core
ms.topic: reference
author: likebupt
ms.author: keli19
ms.date: 03/11/2020
ms.openlocfilehash: 1384491489c175ffc338f80a99aa8d5050f835d5
ms.sourcegitcommit: 2ec4b3d0bad7dc0071400c2a2264399e4fe34897
ms.translationtype: MT
ms.contentlocale: cs-CZ
ms.lasthandoff: 03/28/2020
ms.locfileid: "80109222"
---
# <a name="latent-dirichlet-allocation"></a>Latentní Dirichlet ovace
Tento článek popisuje, jak používat modul **Latent Dirichlet Allocation** v návrháři Azure Machine Learning (preview) k seskupení jinak neklasifikovaného textu do několika kategorií.
Latent Dirichlet Allocation (LDA) se často používá při zpracování přirozeného jazyka (NLP) k nalezení textů, které jsou podobné. Dalším běžným termínem je *modelování témat*.
Tento modul přebírá sloupec textu a generuje tyto výstupy:
+ Zdrojový text spolu se skóre pro každou kategorii
+ Matice funkcí obsahující extrahované termíny a koeficienty pro každou kategorii
+ Transformace, kterou můžete uložit a znovu použít na nový text používaný jako vstup
Tento modul používá knihovnu scikit-learn. Další informace o scikit-learn najdete v tématu [GitHub úložiště, který obsahuje kurzy a vysvětlení algoritmu.
### <a name="more-about-latent-dirichlet-allocation-lda"></a>Více o Latent Dirichlet Allocation (LDA)
Obecně řečeno, LDA není metoda pro klasifikaci jako takový, ale používá generativní přístup. To znamená, že nemusíte poskytovat známé popisky třída a potom odvodit vzorky. Místo toho algoritmus generuje pravděpodobnostní model, který se používá k identifikaci skupin témat. Pravděpodobnostní model můžete použít ke klasifikaci existujících případů školení nebo nových případů, které modelu poskytnete jako vstup.
Generativní model může být vhodnější, protože se vyhýbá vytváření žádné silné předpoklady o vztahu mezi textem a kategorie a používá pouze rozdělení slov matematicky modeltémata.
+ Teorie je popsána v tomto článku, k dispozici ve formátu PDF ke stažení: [Latent Dirichlet Přidělení: Blei, Ng, a Jordánsko](https://ai.stanford.edu/~ang/papers/nips01-lda.pdf)
+ Implementace v tomto modulu je založena na [knihovně scikit-learn](https://github.com/scikit-learn/scikit-learn/blob/master/sklearn/decomposition/_lda.py) pro LDA.
Další informace naleznete v části [Technické poznámky.](#technical-notes)
## <a name="how-to-configure-latent-dirichlet-allocation"></a>Jak nakonfigurovat Latent Dirichlet Allocation
Tento modul vyžaduje datovou sadu, která obsahuje sloupec textu, nezpracovaný nebo předem zpracovaný.
1. Přidejte do svého potrubí modul **Latent Dirichlet Allocation.**
2. Jako vstup pro modul zadejte datovou sadu obsahující jeden nebo více textových sloupců.
3. V **případě cílových sloupců**zvolte jeden nebo více sloupců obsahujících text, které chcete analyzovat.
Můžete zvolit více sloupců, ale musí být typu datového typu řetězce.
Obecně platí, že vzhledem k tomu, že LDA vytvoří z textu velkou matici prvků, obvykle analyzujete jeden textový sloupec.
4. Do **pole Počet témat k modelování**zadejte celé číslo mezi 1 a 1000, které označuje, kolik kategorií nebo témat chcete ze vstupního textu odvodit.
Ve výchozím nastavení je vytvořeno 5 témat.
5. U **N-gramů**určete maximální délku N gramů generovaných během hašování.
Výchozí hodnota je 2, což znamená, že jsou generovány bigrams a unigrams.
6. Vyberte volbu **Normalizovat** pro převod výstupních hodnot na pravděpodobnosti. Proto místo reprezentace transformovaných hodnot jako celá čísla by se hodnoty v datové sadě výstupu a funkce transformovaly následovně:
+ Hodnoty v datové sadě budou reprezentovány `P(topic|document)`jako pravděpodobnost, kde .
+ Hodnoty v matici tématu funkce budou `P(word|topic)`reprezentovány jako pravděpodobnost, kde .
> [!NOTE]
> V Návrháři Azure Machine Learning (preview), protože knihovna, kterou jsme založili, scikit-learn, již nepodporuje nenormalizované *doc_topic_distr* výstup z verze 0.19, proto v tomto modulu **normalize** parametr lze použít pouze na výstup **matice téma funkce,** **transformované datové sady** výstup je vždy normalizován.
7. Vyberte možnost, **Zobrazit všechny možnosti**a nastavte ji na HODNOTU TRUE, pokud chcete zobrazit a pak nastavit další upřesňující parametry.
Tyto parametry jsou specifické pro scikit-learn implementace LDA. Tam jsou některé dobré návody o LDA v scikit-learn, stejně jako oficiální [scikit-learn dokument](https://scikit-learn.org/stable/modules/generated/sklearn.decomposition.LatentDirichletAllocation.html).
+ **Rho parametr**. Zadejte předchozí pravděpodobnost pro řídkou rozdělení tématu. Odpovídá `topic_word_prior` parametru sklearn. Hodnotu 1 byste použili, pokud očekáváte, že rozdělení slov je ploché; tedy všechna slova jsou považována za rovnocenná. Pokud si myslíte, že většina slov se objeví řídce, můžete nastavit na mnohem nižší hodnotu.
+ **Alfa parametr**. Určete předchozí pravděpodobnost pro sparsity tloušťky tématu na dokument. Odpovídá `doc_topic_prior` parametru sklearn.
+ **Odhadovaný počet dokumentů**. Zadejte číslo, které představuje nejlepší odhad počtu dokumentů (řádků), které budou zpracovány. To umožňuje modulu přidělit tabulku hash dostatečné velikosti. Odpovídá parametru `total_samples` v scikit-learn.
+ **Velikost dávky**. Zadejte číslo, které označuje, kolik řádků má být zahrnuto do každé dávky textu odeslaného modelu LDA. Odpovídá parametru `batch_size` v scikit-learn.
+ **Počáteční hodnota iterace použitá v plánu aktualizace učení**. Zadejte počáteční hodnotu, která zmírňuje míru učení pro rané iterace v online učení. Odpovídá parametru `learning_offset` v scikit-learn.
+ **Napájení použité na iteraci během aktualizací**. Uveďte úroveň výkonu aplikovaného na počet iterací, aby bylo možné řídit rychlost učení během online aktualizací. Odpovídá parametru `learning_decay` v scikit-learn.
+ **Počet průchodů přes data**. Určete maximální počet, kolikrát bude algoritmus přejíždět data. Odpovídá parametru `max_iter` v scikit-learn.
8. Vyberte **možnost, Vytvořit slovník ngramů** nebo **Vytvořit slovník ngramů před LDA**, pokud chcete vytvořit n-gram seznam v počáteční průchod, před klasifikací textu.
Pokud vytvoříte počáteční slovník předem, můžete později použít slovník při kontrole modelu. Možnost mapovat výsledky na text spíše než na číselné indexy je obecně jednodušší pro interpretaci. Uložení slovníku však bude trvat déle a použít další úložiště.
9. Do **pole Maximální velikost slovníku ngram**zadejte celkový počet řádků, které lze vytvořit ve slovníku n-gram.
Tato možnost je užitečná pro řízení velikosti slovníku. Pokud však počet ngramů ve vstupu tuto velikost překročí, může dojít ke kolizím.
10. Odešlete potrubí. Modul LDA používá Bayesovou teorém k určení, která témata mohou být spojena s jednotlivými slovy. Slova nejsou výlučně spojena s žádnými tématy nebo skupinami; místo toho má každý n-gram učenou pravděpodobnost, že bude spojen s některou z objevených tříd.
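The advanced options above map directly onto parameters of scikit-learn's `LatentDirichletAllocation` estimator. A minimal sketch of the same workflow outside the designer (the corpus and parameter values are arbitrary illustrations):

```python
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

# A tiny arbitrary corpus; in the module this comes from the target text column.
docs = [
    "the cat sat on the mat",
    "dogs and cats are pets",
    "stocks fell as markets closed",
    "investors traded stocks and bonds",
]

# Build the n-gram feature matrix (unigrams and bigrams, matching N-grams = 2).
vectorizer = CountVectorizer(ngram_range=(1, 2))
X = vectorizer.fit_transform(docs)

lda = LatentDirichletAllocation(
    n_components=2,         # Number of topics to model
    doc_topic_prior=None,   # Alpha parameter (None means 1 / n_components)
    topic_word_prior=None,  # Rho parameter (None means 1 / n_components)
    learning_offset=10.0,   # Initial value of iteration in the update schedule
    learning_decay=0.7,     # Power applied to the iteration during updates
    max_iter=10,            # Number of passes over the data
    random_state=0,
)

# Rows are per-document topic distributions; they are already normalized.
doc_topics = lda.fit_transform(X)
print(doc_topics.shape)  # (4, 2)
```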
## <a name="results"></a>Results
The module has two outputs:
+ **Transformed dataset**: Contains the input text and a specified number of discovered categories, together with a score for each text example for each category.
+ **Feature topic matrix**: The leftmost column contains the extracted text feature, and there is a column for each category containing the score for that feature in that category.
### <a name="lda-transformation"></a>LDA transformation
This module also outputs the *LDA transformation*, which applies LDA to the dataset.
You can save this transformation by registering the dataset under the **Outputs+logs** tab in the right pane of the module, and reuse it for other datasets. This can be useful if you have trained on a large corpus and want to reuse the coefficients or categories.
### <a name="refining-an-lda-model-or-results"></a>Refining an LDA model or results
Typically, you cannot create a single LDA model that will meet all needs, and even a model designed for one task might require many iterations to improve accuracy. We recommend trying all of these methods to improve the model:
+ Changing the model parameters
+ Using visualization to understand the results
+ Getting feedback from subject matter experts to determine whether the generated topics are useful
Qualitative measures can also be useful for assessing the results. To evaluate topic modeling results, consider:
+ Accuracy - Are similar items really similar?
+ Diversity - Can the model discriminate between similar items when required for the business problem?
+ Scalability - Does it work on a wide range of text categories or only on a narrow target domain?
The accuracy of LDA-based models can often be improved by using natural language processing to clean, summarize and simplify, or categorize the text. For example, the following techniques, all supported in Azure Machine Learning, can improve classification accuracy:
+ Stop word removal
+ Case normalization
+ Lemmatization or stemming
+ Named entity recognition
For more information, see [Preprocess Text](preprocess-text.md).
In the designer, you can also use R or Python libraries for text processing: [Execute R Script](execute-r-script.md), [Execute Python Script](execute-python-script.md)
## <a name="technical-notes"></a>Technical notes
This section contains implementation details, tips, and answers to frequently asked questions.
### <a name="implementation-details"></a>Implementation details
By default, the distributions of the outputs for the transformed dataset and the feature-topic matrix are normalized as probabilities.
+ The transformed dataset is normalized as the conditional probability of topics given a document. In this case, the sum of each row equals 1.
+ The feature-topic matrix is normalized as the conditional probability of words given a topic. In this case, the sum of each column equals 1.
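These two normalizations can be sketched in a few lines of plain Python (an illustration only, not the module's internal code; the matrix values are invented):

```python
def normalize_rows(matrix):
    """Each row sums to 1: P(topic | document)."""
    out = []
    for row in matrix:
        total = sum(row)
        out.append([v / total for v in row] if total else list(row))
    return out

def normalize_columns(matrix):
    """Each column sums to 1: P(word | topic)."""
    totals = [sum(col) for col in zip(*matrix)]
    return [[v / t if t else v for v, t in zip(row, totals)] for row in matrix]

doc_topic = [[2.0, 1.0, 1.0],   # raw topic scores for document 0
             [1.0, 3.0, 0.0]]   # raw topic scores for document 1
print(normalize_rows(doc_topic))  # [[0.5, 0.25, 0.25], [0.25, 0.75, 0.0]]
```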
> [!TIP]
> In some cases, the module may return an empty topic. This is most often caused by the pseudo-random initialization of the algorithm. If it happens, you can try changing related parameters, such as the maximum size of the N-gram dictionary or the number of bits to use for feature hashing.
### <a name="lda-and-topic-modeling"></a>LDA and topic modeling
Latent Dirichlet Allocation (LDA) is often used for *content-based topic modeling*, which basically means learning categories from unclassified text. In content-based topic modeling, a topic is a distribution over words.
For example, assume that you have provided a corpus of customer reviews that includes many, many products. The text of the reviews submitted by many customers over time would contain many terms, some of which are used in more than one topic.
A **topic** identified by the LDA process might represent reviews for an individual product A, or it might represent a group of product reviews. To LDA, a topic itself is just a probability distribution over time for a set of words.
Terms are rarely exclusive to any one product. They can refer to other products, or be general terms that apply to everything ("great", "awful"). Other terms might be noise words. However, it is important to understand that the LDA method does not purport to capture all words in the universe, or to understand how words are related, aside from probabilities of co-occurrence. It can only group words that were used in the target domain.
After the term indexes have been computed, individual rows of text are compared using a distance-based similarity measure, to determine whether two pieces of text are similar to each other. For example, you might find that the product has multiple names that are strongly correlated. Or, you might find that strongly negative terms are usually associated with a particular product. You can use the similarity measure both to identify related terms and to create recommendations.
### <a name="module-parameters"></a>Module parameters
|Name|Type|Range|Optional|Default|Description|
|----------|----------|-----------|--------------|-------------|-----------------|
|Target column(s)|Column Selection||Required|StringFeature|Target column name or index|
|Number of topics to model|Integer|[1;1000]|Required|5|Model the document distribution against N topics|
|N-grams|Integer|[1;10]|Required|2|Order of N-grams generated during hashing|
|Normalize|Boolean|True or False|Required|true|Normalize output to probabilities. The transformed dataset will be P(topic\|document) and the feature topic matrix will be P(topic\|word)|
|Show all options|Boolean|True or False|Required|False|Presents additional parameters specific to scikit-learn online LDA|
|Rho parameter|Float|[0.00001;1.0]|Applies when the **Show all options** check box is selected|0.01|Topic word prior distribution|
|Alpha parameter|Float|[0.00001;1.0]|Applies when the **Show all options** check box is selected|0.01|Document topic prior distribution|
|Estimated number of documents|Integer|[1;int.MaxValue]|Applies when the **Show all options** check box is selected|1000|Estimated number of documents (corresponds to the total_samples parameter)|
|Size of the batch|Integer|[1;1024]|Applies when the **Show all options** check box is selected|32|Size of the batch|
|Initial value of iteration used in learning rate update schedule|Integer|[0;int.MaxValue]|Applies when the **Show all options** check box is selected|0|Initial value that downweights the learning rate for early iterations. Corresponds to the learning_offset parameter|
|Power applied to the iteration during updates|Float|[0.0;1.0]|Applies when the **Show all options** check box is selected|0.5|Power applied to the iteration count in order to control the learning rate. Corresponds to the learning_decay parameter|
|Number of training iterations|Integer|[1;1024]|Applies when the **Show all options** check box is selected|25|Number of training iterations|
|Build dictionary of ngrams|Boolean|True or False|Applies when the **Show all options** check box is *not* selected|True|Builds a dictionary of ngrams prior to computing LDA. Useful for model inspection and interpretation|
|Maximum size of ngram dictionary|Integer|[1;int.MaxValue]|Applies when the option **Build dictionary of ngrams** is True|20000|Maximum size of the ngrams dictionary. If the number of tokens in the input exceeds this size, collisions may occur|
|Number of bits to use for feature hashing|Integer|[1;31]|Applies when the **Show all options** check box is *not* selected and **Build dictionary of ngrams** is False|12|Number of bits to use for feature hashing|
|Build dictionary of ngrams prior to LDA|Boolean|True or False|Applies when the **Show all options** check box is selected|True|Builds a dictionary of ngrams prior to LDA. Useful for model inspection and interpretation|
|Maximum number of ngrams in dictionary|Integer|[1;int.MaxValue]|Applies when the **Show all options** check box is selected and the option **Build dictionary of ngrams** is True|20000|Maximum size of the dictionary. If the number of tokens in the input exceeds this size, collisions may occur|
|Number of hash bits|Integer|[1;31]|Applies when the **Show all options** check box is selected and the option **Build dictionary of ngrams** is False|12|Number of bits to use during feature hashing|
## <a name="next-steps"></a>Next steps
See the [set of modules available](module-reference.md) to Azure Machine Learning.
For a list of errors specific to the modules, see [Exceptions and error codes for the designer](designer-error-codes.md).
| 78.009709 | 455 | 0.797324 | ces_Latn | 0.999965 |
ff1fd9eef176216196a3df51f31dae4cdd1fae65 | 653 | md | Markdown | README.md | bitphage/restic-scripts | c3de297f6c56d35fd82cce9984452f12d685c06c | [
"MIT"
] | null | null | null | README.md | bitphage/restic-scripts | c3de297f6c56d35fd82cce9984452f12d685c06c | [
"MIT"
] | null | null | null | README.md | bitphage/restic-scripts | c3de297f6c56d35fd82cce9984452f12d685c06c | [
"MIT"
] | 1 | 2022-02-06T11:44:15.000Z | 2022-02-06T11:44:15.000Z |
## About
This is a helper script to automate backups using [restic](https://restic.readthedocs.io/en/stable/index.html)
Supports:
- backing up to a local directory
- backing up to a remote sftp server
## How to use
```
cp config.example config
cp excludes.txt.example excludes.txt
cp includes.txt.example includes.txt
$EDITOR config excludes.txt includes.txt
./restic-wrapper
```
### Running from cron
Example root crontab entry:
```
30 3 * * * /path/to/restic-scripts/restic-wrapper
```
## Restore
### Local directory
To restore from local backups, the easiest way is to mount the backup repository:
```
restic mount -r /path/to/backup/repo ./restore
```
| 17.648649 | 110 | 0.728943 | eng_Latn | 0.989543 |
ff202fe7c16747047f516a8918c6763a9f2cea2f | 10,341 | md | Markdown | pages/content/amp-dev/documentation/guides-and-tutorials/learn/[email protected] | mailupinc/amp.dev | a3257863710ad1380fb706507ce882dfae4d2fe1 | [
"Apache-2.0"
] | null | null | null | pages/content/amp-dev/documentation/guides-and-tutorials/learn/[email protected] | mailupinc/amp.dev | a3257863710ad1380fb706507ce882dfae4d2fe1 | [
"Apache-2.0"
] | null | null | null | pages/content/amp-dev/documentation/guides-and-tutorials/learn/[email protected] | mailupinc/amp.dev | a3257863710ad1380fb706507ce882dfae4d2fe1 | [
"Apache-2.0"
] | null | null | null |
---
$title: AMP for Email fundamentals
$order: 1
description: Everything you need to know to start writing valid AMP emails.
author: CrystalOnScript
formats:
  - email
---
If you're familiar with AMP, great news! AMP for Email is just a subset of the AMP HTML library. If you're not familiar with AMP, also great news! This guide gives you everything you need to know to start writing valid AMP emails!
## Required markup
AMP emails look like classic HTML emails, but with a few differences. Below is the minimum amount of markup required to make an email a valid AMP email.
```html
<!doctype html>
<html ⚡4email>
<head>
<meta charset="utf-8">
<script async src="https://cdn.ampproject.org/v0.js"></script>
<style amp4email-boilerplate>body{visibility:hidden}</style>
</head>
<body>
Hello, AMP4EMAIL world.
</body>
</html>
```
Email service providers that support AMP email have created security checks to ensure that users have a pleasant and safe experience. Emails built with AMP must meet all of the following requirements:
- Start with the `<!doctype html>` doctype. This is also standard for HTML.
- Contain a top-level `<html amp4email>` tag, or `<html ⚡4email>` if the email is extra fancy. These identify the document as an AMP email so it can be treated as such.
- Define both the `<head>` and `<body>` tags. This is optional in HTML, but AMP keeps things pristine!
- Contain a `<meta charset="utf-8">` tag as the first child of the `<head>` tag. This identifies the encoding of the page.
- The AMP library is imported through the `<script async src="https://cdn.ampproject.org/v0.js"></script>` tag placed in the `<head>`. Without it, none of the awesome, dynamic functionality gained from AMP will work! As a best practice, this should be placed as early as possible in the `<head>`, directly below the `<meta charset="utf-8">` tag.
- Initially hide the email content until the AMP library loads by placing the AMP for Email boilerplate in the `<head>`.
```html
<head>
...
<style amp4email-boilerplate>body{visibility:hidden}</style>
</head>
```
### AMP-specific tag replacements
Since the AMP for Email library is a subset of the AMP HTML library, many of the same rules apply; AMP-specific tags replace resource-heavy HTML tags and require a defined width and height. This allows the AMP boilerplate to hide the content until it has an idea of how it will look on the user's device.
#### Images
To render a page efficiently, all `<img>` tags are replaced with [`<amp-img>`](../../../documentation/components/reference/amp-img.md). The `<amp-img>` tag requires a defined width and height, and it supports the [AMP layout system](amp-html-layout/index.md).
```
<amp-img src="https://link/to/img.jpg"
width="100"
height="100"
layout="responsive">
</amp-img>
```
The `<amp-img>` tag comes with powerful, built-in ways to control responsive design and to set fallback assets.
[tip type="note"] Learn more about using AMP's [layout and media queries](../../../documentation/guides-and-tutorials/develop/style_and_layout/control_layout.md?format=email) and setting [image fallbacks](../../../documentation/guides-and-tutorials/develop/style_and_layout/placeholders.md). [/tip]
#### GIFs
AMP created [`<amp-anim>`](../../../documentation/components/reference/amp-anim.md?format=email), a special tag for GIF images, which allows the AMP runtime to reduce CPU usage when the animation is off screen. As with the `<amp-img>` tag, the width and height must be defined, and the element must include a closing tag.
```
<amp-anim
width="400"
height="300"
src="my-gif.gif">
</amp-anim>
```
Additionally, the tag supports an optional `placeholder` child element to display while the file specified in the `src` attribute is loading, and it supports the AMP layout system.
```
<amp-anim width=400 height=300 src="my-gif.gif" layout="responsive">
<amp-img placeholder width=400 height=300 src="my-gif-screencap.jpg">
</amp-img>
</amp-anim>
```
## Emails with style <a name="emails-with-style"></a>
Like all email clients, AMP allows inline `style` attributes, but it also supports CSS inside a `<style amp-custom>` tag in the head of the message.
```html
...
<style amp-custom>
/* any custom styles go here. */
body {
background-color: white;
}
amp-img {
border: 5px solid black;
}
</style>
...
</head>
```
Like HTML email, AMP for Email supports a limited subset of CSS selectors and properties.
[AMP for Email Supported CSS](/content/amp-dev/documentation/guides-and-tutorials/learn/email-spec/amp-email-css.md) contains the full list of CSS allowed in AMP-supporting email clients.
[tip type="important"] AMP enforces a size limit of 75,000 bytes for styling. [/tip]
## Allowed AMP components
AMP emails are the future of email thanks to the dynamic, visual, and interactive features of AMP components.
The full list of [components supported in AMP for Email](/content/amp-dev/documentation/guides-and-tutorials/learn/email-spec/amp-email-components.md) is available as part of the AMP for Email specification.
## Authenticating requests
Dynamically personalized email content often requires user authentication. However, to protect user data, all HTTP requests made from AMP emails may be cached and stripped of cookies.
You can use access tokens to authenticate requests made from AMP emails.
### Access tokens
Access tokens can be used to authenticate the user. Access tokens are supplied and checked by the email sender. The sender uses the tokens to ensure that only people with access to the AMP email can make the requests it contains. Access tokens must be cryptographically secure and limited in time and scope. They are included in the URL of the request.
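As a sketch of what "cryptographically secure and limited in time and scope" could look like on the sender's backend — this is not a prescribed AMP API, and the secret, address, and scope strings are invented — an HMAC-signed token with an expiry can be minted and verified with only the Python standard library:

```python
import hashlib
import hmac
import time

SECRET = b"sender-side-secret"  # hypothetical key held only by the email sender

def mint_token(email, scope, ttl_seconds, now=None):
    """Return 'email|scope|expiry|signature', valid for ttl_seconds."""
    expires = (int(time.time()) if now is None else now) + ttl_seconds
    payload = f"{email}|{scope}|{expires}"
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}|{sig}"

def verify_token(token, email, scope, now=None):
    """Check signature, expiry, and that the token matches this user and scope."""
    try:
        t_email, t_scope, t_expires, sig = token.split("|")
    except ValueError:
        return False
    payload = f"{t_email}|{t_scope}|{t_expires}"
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return (hmac.compare_digest(sig, expected)
            and t_email == email
            and t_scope == scope
            and (int(time.time()) if now is None else now) < int(t_expires))

token = mint_token("user@example.com", "personal-notes", ttl_seconds=300, now=1000)
print(verify_token(token, "user@example.com", "personal-notes", now=1100))  # True
print(verify_token(token, "user@example.com", "personal-notes", now=1400))  # False (expired)
```

Because the signature covers the user, the scope, and the expiry time, a token leaked from one email cannot be reused for another user, another action, or after it expires.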
This example shows the use of the `<amp-list>` tag to display authenticated data:
```html
<amp-list
src="https://example.com/endpoint?token=REPLACE_WITH_YOUR_ACCESS_TOKEN"
height="300"
>
<template type="amp-mustache">
...
</template>
</amp-list>
```
Similarly, when using the `<amp-form>` tag, include your access token in the URL of the `action-xhr` attribute.
```html
<form
action-xhr="https://example.com/endpoint?token=REPLACE_WITH_YOUR_ACCESS_TOKEN"
method="post"
>
<input type="text" name="data" />
<input type="submit" value="Send" />
</form>
```
#### Example
The following example concerns a hypothetical note-taking service that lets logged-in users add notes to their account and view them later. The service wants to send the user, `[email protected]`, an email that contains a list of notes taken earlier. The list of the current user's notes is available at the endpoint `https://example.com/personal-notes` in JSON format.
Before sending the email, the service generates a cryptographically secure access token limited to use by `[email protected]: A3a4roX9x`. The access token is included in the field name `exampletoken` in the URL query:
```html
<amp-list
src="https://example.com/personal-notes?exampletoken=A3a4roX9x"
height="300"
>
<template type="amp-mustache">
<p>{{note}}</p>
</template>
</amp-list>
```
The endpoint `https://example.com/personal-notes` is responsible for validating the `exampletoken` parameter and finding the user associated with that token.
### Limited-use access tokens
Limited-use access tokens protect against request forgery and [replay attacks](https://en.wikipedia.org/wiki/Replay_attack) by making sure an action is performed by the user the email was sent to. Protection is achieved by adding a unique token parameter to the request parameters and verifying it when the action is invoked.
The token parameter should be generated as a key that only a specific user can use for a specific action. Before the requested action is performed, check that the token is valid and matches the token generated for that user. If the token matches, the action can be performed, and the token becomes invalid for future requests.
Access tokens should be sent to the user as part of the url property of the HttpActionHandler. For example, if your application handles approval requests at `http://www.example.com/approve?requestId=123`, consider adding an extra `accessToken` parameter to it and listening for requests sent to `http://www.example.com/approve?requestId=123&accessToken=xyz`.
The combination of `requestId=123` and `accessToken=xyz` has to be generated in advance, making sure that the `accessToken` cannot be deduced from the `requestId`. Any approval request with `requestId=123` and no `accessToken`, or with an `accessToken` different from `xyz`, should be rejected. Once this request has been fulfilled, any subsequent request with the same ID and access token should also be rejected.
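A minimal in-memory sketch of that one-time `requestId`/`accessToken` pairing (illustrative only — a real service would use persistent storage, and all names here are invented):

```python
import secrets

_pending = {}  # request_id -> access token, single use

def issue(request_id):
    """Generate and remember a one-time access token for a request."""
    token = secrets.token_urlsafe(16)
    _pending[request_id] = token
    return token

def approve(request_id, access_token):
    """Perform the action only once, for the exact pre-generated pair."""
    expected = _pending.get(request_id)
    if expected is None or not secrets.compare_digest(expected, access_token):
        return False  # unknown id, wrong token, or a replayed request
    del _pending[request_id]  # spend the token so replays are rejected
    return True

token = issue("123")
print(approve("123", token))  # True: first use succeeds
print(approve("123", token))  # False: the same pair is rejected afterwards
```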
## Testing in different email clients
Email clients that support AMP for Email have their own documentation and testing tools to make integration easier.
See [Testing AMP emails](/content/amp-dev/documentation/guides-and-tutorials/develop/testing_amp_emails.md) for more information and links to the documentation of the individual email clients.
| 55.299465 | 443 | 0.788705 | pol_Latn | 0.999885 |
ff20eb94cfcbfd654439f960285fa0860b0f9808 | 53 | md | Markdown | docs/demo/animation.md | binyellow/tree | 2d252015bebd6d48d3f30ce02f7dd4dfe6b503c6 | [
"MIT"
] | 1,014 | 2015-04-22T15:59:25.000Z | 2022-03-23T14:23:28.000Z | docs/demo/animation.md | binyellow/tree | 2d252015bebd6d48d3f30ce02f7dd4dfe6b503c6 | [
"MIT"
] | 539 | 2015-04-07T11:04:25.000Z | 2022-03-29T11:53:46.000Z | docs/demo/animation.md | binyellow/tree | 2d252015bebd6d48d3f30ce02f7dd4dfe6b503c6 | [
"MIT"
] | 483 | 2015-04-12T15:08:02.000Z | 2022-03-31T12:49:34.000Z |
## animation
<code src="../examples/animation.jsx">
| 13.25 | 38 | 0.679245 | fra_Latn | 0.325718 |
ff22bc9561a700ef9f6c864c5e7ef963df08be7e | 1,771 | md | Markdown | _posts/2013-06-21-constraints-are-fun.md | sunpech/bryanbraun.github.io | 54cb81c7ca8b5d5ca96d630036638c257130f13f | [
"MIT"
] | null | null | null | _posts/2013-06-21-constraints-are-fun.md | sunpech/bryanbraun.github.io | 54cb81c7ca8b5d5ca96d630036638c257130f13f | [
"MIT"
] | null | null | null | _posts/2013-06-21-constraints-are-fun.md | sunpech/bryanbraun.github.io | 54cb81c7ca8b5d5ca96d630036638c257130f13f | [
"MIT"
] | null | null | null |
---
title: "Constraints are Fun"
date: 2013-06-21 06:51:29
---
We often complain about the constraints in our lives. Not enough time, not enough money, limited education, connections or experience.
But why do we let them get us down? Every interesting game is born out of constraints. Sports like basketball and volleyball force the players to operate within boundary lines. Olympic events like weightlifting and long jump are constrained by human strength and capacity.
While I was in junior-high, the game of the day was "[bloody knuckles][1]," when participants attempt to keep a quarter spinning, and the person who makes it fall is punished with pain as the quarter is flicked into their exposed knuckles. On the outside, it sounds stupid. Why would anyone want to play a game like that? But quarters isn't the only game where losers are punished with pain. Look at hand-ball, paintball, or dodgeball.
[1]: http://en.wikipedia.org/wiki/Bloody_Knuckles
The fact is, pain is a constraint of life and games are born to allow us to play on the edges of all constraints. It doesn't matter if the constraint is human balance, reaction time, <a href="http://en.wikipedia.org/wiki/The_Settlers_of_Catan" target="_blank" title="Settlers!">finite resources</a>, <a href="http://en.wikipedia.org/wiki/Tetris" target="_blank" title="The music from the game boy version is forever etched in my brain.">finite space</a>, or the <a href="http://en.wikipedia.org/wiki/Gallon_challenge" target="_blank" title="I'm surprised I made it through high school without playing this game.">finite capacity of the human stomach</a>. If there is a constraint, we'll find a way to have fun with it.
What if we looked at our own constraints as interesting games and puzzles to be solved?
| 104.176471 | 718 | 0.775268 | eng_Latn | 0.999311 |
ff230f11f3b2c6b3d57ce5f1de8285802263410a | 5,248 | md | Markdown | business-central/production-manage-manufacturing.md | MicrosoftDocs/dynamics365smb-docs-pr.es-mx | 39999aacf9c4d774a0232340cfbcc01a9653301e | [
"CC-BY-4.0",
"MIT"
] | 3 | 2017-08-28T10:41:57.000Z | 2021-04-20T21:13:47.000Z | business-central/production-manage-manufacturing.md | MicrosoftDocs/dynamics365smb-docs-pr.es-mx | 39999aacf9c4d774a0232340cfbcc01a9653301e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | business-central/production-manage-manufacturing.md | MicrosoftDocs/dynamics365smb-docs-pr.es-mx | 39999aacf9c4d774a0232340cfbcc01a9653301e | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-10-13T10:47:31.000Z | 2021-10-13T10:47:31.000Z |
---
title: Executing production
description: When an order has been planned and the materials have been issued according to the production BOM, the production operations can start and be executed in the sequence defined by the production order routing.
author: SorenGP
ms.service: dynamics365-business-central
ms.topic: conceptual
ms.devlang: na
ms.tgt_pltfrm: na
ms.workload: na
ms.search.keywords: ''
ms.date: 04/01/2021
ms.author: edupont
ms.openlocfilehash: cc8eb04682492b3e3cd7906c12cf73d3974cf79a
ms.sourcegitcommit: e562b45fda20ff88230e086caa6587913eddae26
ms.translationtype: HT
ms.contentlocale: es-MX
ms.lasthandoff: 06/30/2021
ms.locfileid: "6321202"
---
# <a name="manufacturing"></a>Manufacturing
> [!NOTE]
> The functionality described in this topic and its subtopics is only visible in the user interface if you have the **Premium** experience. For more information, see [Change Which Features are Displayed](ui-experiences.md).
When an order has been planned and the materials have been issued according to the production BOM, the production operations can start and be executed in the sequence defined by the production order routing.
An important part of executing production, from a system point of view, is to record production output in the database to report progress and to update inventory with the finished items. Output can be posted manually, by filling in and posting journal lines after production operations. Or, it can be posted automatically with the use of backward flushing. In that case, material consumption is automatically posted together with output when the production order changes to finished.
As an alternative to the batch journal for posting output from multiple production orders, you can use the **Production Journal** page to post consumption and output for one production order line.
Before you can begin to produce items, you must perform various setups, such as work centers, routings, and production BOMs. For more information, see [Set Up Manufacturing](production-configure-production-processes.md).
The following table describes a sequence of tasks, with links to the topics that describe them.
|**To**|**See**|
|------------|-------------|
|Understand how production orders work.|[About Production Orders](production-about-production-orders.md)|
|Create production orders manually.|[Create Production Orders](production-how-to-create-production-orders.md)|
|Outsource all or selected operations in a production order to a subcontractor.|[Subcontract Manufacturing](production-how-to-subcontract-manufacturing.md)|
|Record and post production output, along with time and material consumption, for a single released production order line.|[Register Consumption and Output for One Released Production Order Line](production-how-to-register-consumption-and-output.md)|
|Batch post the number of components used per operation in a journal that can process multiple planned production orders.|[Batch Post Consumption](production-how-to-post-consumption.md)|
|Post the number of finished items and the time spent per operation in a journal that can process multiple released production orders.|[Batch Post Output and Run Times](production-how-to-post-output-quantity.md)|
|Undo output, for example, because a data entry error occurred and the amount is incorrect.|[Reverse Output Posting](production-how-to-reverse-output-posting.md)|
|Post the number of items produced in each finished operation that do not qualify as finished output, but as scrapped material.|[Post Scrap](production-how-to-post-scrap.md)|
|View the shop floor load as a result of planned and released production orders.|[View the Load on Work Centers and Machine Centers](production-how-to-view-the-load-on-work-centers.md)|
|Use the **Capacity Journal** page to post consumed capacities that are not assigned to a production order, such as maintenance work.|[Post Capacities](production-how-to-post-capacities.md)|
|Calculate and adjust the cost of finished production items and consumed components for financial reconciliation.|[About Costs of Finished Production Orders](finance-about-finished-production-order-costs.md)|
## <a name="see-also"></a>See also
[Set Up Manufacturing](production-configure-production-processes.md)
[Planning](production-planning.md)
[Inventory](inventory-manage-inventory.md)
[Purchasing](purchasing-manage-purchasing.md)
[Work with [!INCLUDE[prod_short](includes/prod_short.md)]](ui-work-product.md)
## [!INCLUDE[prod_short](includes/free_trial_md.md)]
[!INCLUDE[footer-include](includes/footer-banner.md)]
| 90.482759 | 602 | 0.797637 | spa_Latn | 0.986963 |
ff241b62ed3e58529aa78f5d9d9626b2ce85b31b | 19,936 | md | Markdown | mono/mini/cpu-x86.md | mkorkalo/mono | eb71faeaefd4e3604803afe4b05bb28853ae9d4e | [
"Apache-2.0"
] | 1 | 2020-02-24T21:30:44.000Z | 2020-02-24T21:30:44.000Z | mono/mini/cpu-x86.md | ovrkm/playscript-mono | 76f3be28dc30f825f8064af29f4ffc1c5adf597a | [
"Apache-2.0"
] | null | null | null | mono/mini/cpu-x86.md | ovrkm/playscript-mono | 76f3be28dc30f825f8064af29f4ffc1c5adf597a | [
"Apache-2.0"
] | 6 | 2016-07-25T01:20:15.000Z | 2022-01-13T02:59:47.000Z |
# x86-class cpu description file
# this file is read by genmdesc to produce a table with all the relevant information
# about the cpu instructions that may be used by the register allocator, the scheduler
# and other parts of the arch-dependent part of mini.
#
# An opcode name is followed by a colon and optional specifiers.
# A specifier has a name, a colon and a value. Specifiers are separated by white space.
# Here is a description of the specifiers valid for this file and their possible values.
#
# dest:register describes the destination register of an instruction
# src1:register describes the first source register of an instruction
# src2:register describes the second source register of an instruction
#
# register may have the following values:
# i integer register
# b base register (used in address references)
# f floating point register
# a EAX register
# d EDX register
# s ECX register
# l long reg (forced eax:edx)
# L long reg (dynamic)
# y the reg needs to be one of EAX,EBX,ECX,EDX (sete opcodes)
# x XMM reg (XMM0 - XMM7)
#
# len:number describe the maximum length in bytes of the instruction
# number is a positive integer. If the length is not specified
# it defaults to zero. But lengths are only checked if the given opcode
# is encountered during compilation. Some opcodes, like CONV_U4 are
# transformed into other opcodes in the brg files, so they do not show up
# during code generation.
#
# cost:number describe how many cycles are needed to complete the instruction (unused)
#
# clob:spec describe if the instruction clobbers registers or has special needs
#
# spec can be one of the following characters:
# c clobbers caller-save registers
# 1 clobbers the first source register
# a EAX is clobbered
# d EDX is clobbered
# x both the source operands are clobbered (xchg)
#
# flags:spec describes if the instruction uses or sets the flags (unused)
#
# spec can be one of the following chars:
# s sets the flags
# u uses the flags
# m uses and modifies the flags
#
# res:spec describes what units are used in the processor (unused)
#
# delay: describes delay slots (unused)
#
# the required specifiers are: len, clob (if registers are clobbered), the registers
# specifiers if the registers are actually used, flags (when scheduling is implemented).
#
# Templates can be defined by using the 'template' keyword instead of an opcode name.
# The template name is assigned from a (required) 'name' specifier.
# To apply a template to an opcode, just use the template:template_name specifier: any value
# defined by the template can be overridden by adding more specifiers after the template.
#
# See the code in mini-x86.c for more details on how the specifiers are used.
#
break: len:1
jmp: len:32 clob:c
call: dest:a clob:c len:17
tailcall: len:120 clob:c
br: len:5
seq_point: len:17
int_beq: len:6
int_bge: len:6
int_bgt: len:6
int_ble: len:6
int_blt: len:6
int_bne_un: len:6
int_bge_un: len:6
int_bgt_un: len:6
int_ble_un: len:6
int_blt_un: len:6
label: len:0
template: name:ibalu dest:i src1:i src2:i clob:1 len:2
int_add: template:ibalu
int_sub: template:ibalu
int_mul: template:ibalu len:3
int_div: dest:a src1:a src2:i len:15 clob:d
int_div_un: dest:a src1:a src2:i len:15 clob:d
int_rem: dest:d src1:a src2:i len:15 clob:a
int_rem_un: dest:d src1:a src2:i len:15 clob:a
int_and: template:ibalu
int_or: template:ibalu
int_xor: template:ibalu
int_shl: dest:i src1:i src2:s clob:1 len:2
int_shr: dest:i src1:i src2:s clob:1 len:2
int_shr_un: dest:i src1:i src2:s clob:1 len:2
int_min: dest:i src1:i src2:i len:16 clob:1
int_min_un: dest:i src1:i src2:i len:16 clob:1
int_max: dest:i src1:i src2:i len:16 clob:1
int_max_un: dest:i src1:i src2:i len:16 clob:1
int_neg: dest:i src1:i len:2 clob:1
int_not: dest:i src1:i len:2 clob:1
int_conv_to_i1: dest:i src1:y len:3
int_conv_to_i2: dest:i src1:i len:3
int_conv_to_i4: dest:i src1:i len:2
int_conv_to_r4: dest:f src1:i len:13
int_conv_to_r8: dest:f src1:i len:7
int_conv_to_u4: dest:i src1:i
int_conv_to_u2: dest:i src1:i len:3
int_conv_to_u1: dest:i src1:y len:3
int_conv_to_i: dest:i src1:i len:3
int_mul_ovf: dest:i src1:i src2:i clob:1 len:9
int_mul_ovf_un: dest:i src1:i src2:i len:16
throw: src1:i len:13
rethrow: src1:i len:13
start_handler: len:16
endfinally: len:16 nacl:21
endfilter: src1:a len:16 nacl:21
ckfinite: dest:f src1:f len:32
ceq: dest:y len:6
cgt: dest:y len:6
cgt.un: dest:y len:6
clt: dest:y len:6
clt.un: dest:y len:6
localloc: dest:i src1:i len:120
compare: src1:i src2:i len:2
compare_imm: src1:i len:6
fcompare: src1:f src2:f clob:a len:9
oparglist: src1:b len:10
checkthis: src1:b len:3
voidcall: len:17 clob:c
voidcall_reg: src1:i len:11 clob:c
voidcall_membase: src1:b len:16 nacl:17 clob:c
fcall: dest:f len:17 clob:c
fcall_reg: dest:f src1:i len:11 clob:c
fcall_membase: dest:f src1:b len:16 nacl:17 clob:c
lcall: dest:l len:17 clob:c
lcall_reg: dest:l src1:i len:11 clob:c
lcall_membase: dest:l src1:b len:16 nacl:17 clob:c
vcall: len:17 clob:c
vcall_reg: src1:i len:11 clob:c
vcall_membase: src1:b len:16 nacl:17 clob:c
call_reg: dest:a src1:i len:11 nacl:14 clob:c
call_membase: dest:a src1:b len:16 nacl:18 clob:c
iconst: dest:i len:5
r4const: dest:f len:15
r8const: dest:f len:16
store_membase_imm: dest:b len:10
store_membase_reg: dest:b src1:i len:7
storei1_membase_imm: dest:b len:10
storei1_membase_reg: dest:b src1:y len:7
storei2_membase_imm: dest:b len:11
storei2_membase_reg: dest:b src1:i len:7
storei4_membase_imm: dest:b len:10
storei4_membase_reg: dest:b src1:i len:7
storei8_membase_imm: dest:b
storei8_membase_reg: dest:b src1:i
storer4_membase_reg: dest:b src1:f len:7
storer8_membase_reg: dest:b src1:f len:7
store_mem_imm: len:12
load_membase: dest:i src1:b len:7
loadi1_membase: dest:y src1:b len:7
loadu1_membase: dest:y src1:b len:7
loadi2_membase: dest:i src1:b len:7
loadu2_membase: dest:i src1:b len:7
loadi4_membase: dest:i src1:b len:7
loadu4_membase: dest:i src1:b len:7
loadi8_membase: dest:i src1:b
loadr4_membase: dest:f src1:b len:7
loadr8_membase: dest:f src1:b len:7
loadu4_mem: dest:i len:9
move: dest:i src1:i len:2
addcc_imm: dest:i src1:i len:6 clob:1
add_imm: dest:i src1:i len:6 clob:1
subcc_imm: dest:i src1:i len:6 clob:1
sub_imm: dest:i src1:i len:6 clob:1
mul_imm: dest:i src1:i len:9
and_imm: dest:i src1:i len:6 clob:1
or_imm: dest:i src1:i len:6 clob:1
xor_imm: dest:i src1:i len:6 clob:1
shl_imm: dest:i src1:i len:6 clob:1
shr_imm: dest:i src1:i len:6 clob:1
shr_un_imm: dest:i src1:i len:6 clob:1
cond_exc_eq: len:6
cond_exc_ne_un: len:6
cond_exc_lt: len:6
cond_exc_lt_un: len:6
cond_exc_gt: len:6
cond_exc_gt_un: len:6
cond_exc_ge: len:6
cond_exc_ge_un: len:6
cond_exc_le: len:6
cond_exc_le_un: len:6
cond_exc_ov: len:6
cond_exc_no: len:6
cond_exc_c: len:6
cond_exc_nc: len:6
long_shl: dest:L src1:L src2:s clob:1 len:21
long_shr: dest:L src1:L src2:s clob:1 len:22
long_shr_un: dest:L src1:L src2:s clob:1 len:22
long_shr_imm: dest:L src1:L clob:1 len:10
long_shr_un_imm: dest:L src1:L clob:1 len:10
long_shl_imm: dest:L src1:L clob:1 len:10
float_beq: len:12
float_bne_un: len:18
float_blt: len:12
float_blt_un: len:20
float_bgt: len:12
float_bgt_un: len:20
float_bge: len:22
float_bge_un: len:12
float_ble: len:22
float_ble_un: len:12
float_add: dest:f src1:f src2:f len:2
float_sub: dest:f src1:f src2:f len:2
float_mul: dest:f src1:f src2:f len:2
float_div: dest:f src1:f src2:f len:2
float_div_un: dest:f src1:f src2:f len:2
float_rem: dest:f src1:f src2:f len:17
float_rem_un: dest:f src1:f src2:f len:17
float_neg: dest:f src1:f len:2
float_not: dest:f src1:f len:2
float_conv_to_i1: dest:y src1:f len:39
float_conv_to_i2: dest:y src1:f len:39
float_conv_to_i4: dest:i src1:f len:39
float_conv_to_i8: dest:L src1:f len:39
float_conv_to_u4: dest:i src1:f len:39
float_conv_to_u8: dest:L src1:f len:39
float_conv_to_u2: dest:y src1:f len:39
float_conv_to_u1: dest:y src1:f len:39
float_conv_to_i: dest:i src1:f len:39
float_conv_to_ovf_i: dest:a src1:f len:30
float_conv_to_ovd_u: dest:a src1:f len:30
float_mul_ovf:
float_ceq: dest:y src1:f src2:f len:25
float_cgt: dest:y src1:f src2:f len:25
float_cgt_un: dest:y src1:f src2:f len:37
float_clt: dest:y src1:f src2:f len:25
float_clt_un: dest:y src1:f src2:f len:32
float_conv_to_u: dest:i src1:f len:36
call_handler: len:11 clob:c
aot_const: dest:i len:5
load_gotaddr: dest:i len:64
got_entry: dest:i src1:b len:7
nacl_gc_safe_point: clob:c
x86_test_null: src1:i len:2
x86_compare_membase_reg: src1:b src2:i len:7
x86_compare_membase_imm: src1:b len:11
x86_compare_membase8_imm: src1:b len:8
x86_compare_mem_imm: len:11
x86_compare_reg_membase: src1:i src2:b len:7
x86_inc_reg: dest:i src1:i clob:1 len:1
x86_inc_membase: src1:b len:7
x86_dec_reg: dest:i src1:i clob:1 len:1
x86_dec_membase: src1:b len:7
x86_add_membase_imm: src1:b len:11
x86_sub_membase_imm: src1:b len:11
x86_and_membase_imm: src1:b len:11
x86_or_membase_imm: src1:b len:11
x86_xor_membase_imm: src1:b len:11
x86_push: src1:i len:1
x86_push_imm: len:5
x86_push_membase: src1:b len:7
x86_push_obj: src1:b len:30
x86_push_got_entry: src1:b len:7
x86_lea: dest:i src1:i src2:i len:7
x86_lea_membase: dest:i src1:i len:10
x86_xchg: src1:i src2:i clob:x len:1
x86_fpop: src1:f len:2
x86_fp_load_i8: dest:f src1:b len:7
x86_fp_load_i4: dest:f src1:b len:7
x86_seteq_membase: src1:b len:7
x86_setne_membase: src1:b len:7
x86_add_reg_membase: dest:i src1:i src2:b clob:1 len:11
x86_sub_reg_membase: dest:i src1:i src2:b clob:1 len:11
x86_mul_reg_membase: dest:i src1:i src2:b clob:1 len:13
adc: dest:i src1:i src2:i len:2 clob:1
addcc: dest:i src1:i src2:i len:2 clob:1
subcc: dest:i src1:i src2:i len:2 clob:1
adc_imm: dest:i src1:i len:6 clob:1
sbb: dest:i src1:i src2:i len:2 clob:1
sbb_imm: dest:i src1:i len:6 clob:1
br_reg: src1:i len:2 nacl:5
sin: dest:f src1:f len:6
cos: dest:f src1:f len:6
abs: dest:f src1:f len:2
tan: dest:f src1:f len:49
atan: dest:f src1:f len:8
sqrt: dest:f src1:f len:2
round: dest:f src1:f len:2
bigmul: len:2 dest:l src1:a src2:i
bigmul_un: len:2 dest:l src1:a src2:i
sext_i1: dest:i src1:y len:3
sext_i2: dest:i src1:y len:3
tls_get: dest:i len:20
atomic_add_i4: src1:b src2:i dest:i len:16
atomic_add_new_i4: src1:b src2:i dest:i len:16
atomic_exchange_i4: src1:b src2:i dest:a len:24
atomic_cas_i4: src1:b src2:i src3:a dest:a len:24
memory_barrier: len:16
card_table_wbarrier: src1:a src2:i clob:d len:34
relaxed_nop: len:2
hard_nop: len:1
# Linear IR opcodes
nop: len:0
dummy_use: src1:i len:0
dummy_store: len:0
not_reached: len:0
not_null: src1:i len:0
jump_table: dest:i len:5
int_adc: dest:i src1:i src2:i len:2 clob:1
int_addcc: dest:i src1:i src2:i len:2 clob:1
int_subcc: dest:i src1:i src2:i len:2 clob:1
int_sbb: dest:i src1:i src2:i len:2 clob:1
int_add_imm: dest:i src1:i len:6 clob:1
int_sub_imm: dest:i src1:i len:6 clob:1
int_mul_imm: dest:i src1:i len:9
int_div_imm: dest:a src1:a len:15 clob:d
int_div_un_imm: dest:a src1:a len:15 clob:d
int_rem_imm: dest:a src1:a len:15 clob:d
int_rem_un_imm: dest:d src1:a len:15 clob:a
int_and_imm: dest:i src1:i len:6 clob:1
int_or_imm: dest:i src1:i len:6 clob:1
int_xor_imm: dest:i src1:i len:6 clob:1
int_shl_imm: dest:i src1:i len:6 clob:1
int_shr_imm: dest:i src1:i len:6 clob:1
int_shr_un_imm: dest:i src1:i len:6 clob:1
int_conv_to_r_un: dest:f src1:i len:32
int_ceq: dest:y len:6
int_cgt: dest:y len:6
int_cgt_un: dest:y len:6
int_clt: dest:y len:6
int_clt_un: dest:y len:6
cond_exc_ieq: len:6
cond_exc_ine_un: len:6
cond_exc_ilt: len:6
cond_exc_ilt_un: len:6
cond_exc_igt: len:6
cond_exc_igt_un: len:6
cond_exc_ige: len:6
cond_exc_ige_un: len:6
cond_exc_ile: len:6
cond_exc_ile_un: len:6
cond_exc_iov: len:6
cond_exc_ino: len:6
cond_exc_ic: len:6
cond_exc_inc: len:6
icompare: src1:i src2:i len:2
icompare_imm: src1:i len:6
cmov_ieq: dest:i src1:i src2:i len:16 clob:1
cmov_ige: dest:i src1:i src2:i len:16 clob:1
cmov_igt: dest:i src1:i src2:i len:16 clob:1
cmov_ile: dest:i src1:i src2:i len:16 clob:1
cmov_ilt: dest:i src1:i src2:i len:16 clob:1
cmov_ine_un: dest:i src1:i src2:i len:16 clob:1
cmov_ige_un: dest:i src1:i src2:i len:16 clob:1
cmov_igt_un: dest:i src1:i src2:i len:16 clob:1
cmov_ile_un: dest:i src1:i src2:i len:16 clob:1
cmov_ilt_un: dest:i src1:i src2:i len:16 clob:1
long_conv_to_ovf_i4_2: dest:i src1:i src2:i len:30
long_conv_to_r8_2: dest:f src1:i src2:i len:14
long_conv_to_r4_2: dest:f src1:i src2:i len:14
long_conv_to_r_un_2: dest:f src1:i src2:i len:40
fmove: dest:f src1:f
float_conv_to_r4: dest:f src1:f len:12
load_mem: dest:i len:9
loadi4_mem: dest:i len:9
loadu1_mem: dest:i len:9
loadu2_mem: dest:i len:9
vcall2: len:17 clob:c
vcall2_reg: src1:i len:11 clob:c
vcall2_membase: src1:b len:16 nacl:17 clob:c
localloc_imm: dest:i len:120
x86_add_membase_reg: src1:b src2:i len:11
x86_sub_membase_reg: src1:b src2:i len:11
x86_and_membase_reg: src1:b src2:i len:11
x86_or_membase_reg: src1:b src2:i len:11
x86_xor_membase_reg: src1:b src2:i len:11
x86_mul_membase_reg: src1:b src2:i len:13
x86_and_reg_membase: dest:i src1:i src2:b clob:1 len:6
x86_or_reg_membase: dest:i src1:i src2:b clob:1 len:6
x86_xor_reg_membase: dest:i src1:i src2:b clob:1 len:6
x86_fxch: len:2
addps: dest:x src1:x src2:x len:3 clob:1
divps: dest:x src1:x src2:x len:3 clob:1
mulps: dest:x src1:x src2:x len:3 clob:1
subps: dest:x src1:x src2:x len:3 clob:1
maxps: dest:x src1:x src2:x len:3 clob:1
minps: dest:x src1:x src2:x len:3 clob:1
compps: dest:x src1:x src2:x len:4 clob:1
andps: dest:x src1:x src2:x len:3 clob:1
andnps: dest:x src1:x src2:x len:3 clob:1
orps: dest:x src1:x src2:x len:3 clob:1
xorps: dest:x src1:x src2:x len:3 clob:1
haddps: dest:x src1:x src2:x len:4 clob:1
hsubps: dest:x src1:x src2:x len:4 clob:1
addsubps: dest:x src1:x src2:x len:4 clob:1
dupps_low: dest:x src1:x len:4
dupps_high: dest:x src1:x len:4
addpd: dest:x src1:x src2:x len:4 clob:1
divpd: dest:x src1:x src2:x len:4 clob:1
mulpd: dest:x src1:x src2:x len:4 clob:1
subpd: dest:x src1:x src2:x len:4 clob:1
maxpd: dest:x src1:x src2:x len:4 clob:1
minpd: dest:x src1:x src2:x len:4 clob:1
comppd: dest:x src1:x src2:x len:5 clob:1
andpd: dest:x src1:x src2:x len:4 clob:1
andnpd: dest:x src1:x src2:x len:4 clob:1
orpd: dest:x src1:x src2:x len:4 clob:1
xorpd: dest:x src1:x src2:x len:4 clob:1
sqrtpd: dest:x src1:x len:4 clob:1
haddpd: dest:x src1:x src2:x len:5 clob:1
hsubpd: dest:x src1:x src2:x len:5 clob:1
addsubpd: dest:x src1:x src2:x len:5 clob:1
duppd: dest:x src1:x len:5
pand: dest:x src1:x src2:x len:4 clob:1
por: dest:x src1:x src2:x len:4 clob:1
pxor: dest:x src1:x src2:x len:4 clob:1
sqrtps: dest:x src1:x len:4
rsqrtps: dest:x src1:x len:4
rcpps: dest:x src1:x len:4
pshufflew_high: dest:x src1:x len:5
pshufflew_low: dest:x src1:x len:5
pshuffled: dest:x src1:x len:5
shufps: dest:x src1:x src2:x len:4 clob:1
shufpd: dest:x src1:x src2:x len:5 clob:1
extract_mask: dest:i src1:x len:4
paddb: dest:x src1:x src2:x len:4 clob:1
paddw: dest:x src1:x src2:x len:4 clob:1
paddd: dest:x src1:x src2:x len:4 clob:1
paddq: dest:x src1:x src2:x len:4 clob:1
psubb: dest:x src1:x src2:x len:4 clob:1
psubw: dest:x src1:x src2:x len:4 clob:1
psubd: dest:x src1:x src2:x len:4 clob:1
psubq: dest:x src1:x src2:x len:4 clob:1
pmaxb_un: dest:x src1:x src2:x len:4 clob:1
pmaxw_un: dest:x src1:x src2:x len:5 clob:1
pmaxd_un: dest:x src1:x src2:x len:5 clob:1
pmaxb: dest:x src1:x src2:x len:5 clob:1
pmaxw: dest:x src1:x src2:x len:4 clob:1
pmaxd: dest:x src1:x src2:x len:5 clob:1
pavgb_un: dest:x src1:x src2:x len:4 clob:1
pavgw_un: dest:x src1:x src2:x len:4 clob:1
pminb_un: dest:x src1:x src2:x len:4 clob:1
pminw_un: dest:x src1:x src2:x len:5 clob:1
pmind_un: dest:x src1:x src2:x len:5 clob:1
pminb: dest:x src1:x src2:x len:5 clob:1
pminw: dest:x src1:x src2:x len:4 clob:1
pmind: dest:x src1:x src2:x len:5 clob:1
pcmpeqb: dest:x src1:x src2:x len:4 clob:1
pcmpeqw: dest:x src1:x src2:x len:4 clob:1
pcmpeqd: dest:x src1:x src2:x len:4 clob:1
pcmpeqq: dest:x src1:x src2:x len:5 clob:1
pcmpgtb: dest:x src1:x src2:x len:4 clob:1
pcmpgtw: dest:x src1:x src2:x len:4 clob:1
pcmpgtd: dest:x src1:x src2:x len:4 clob:1
pcmpgtq: dest:x src1:x src2:x len:5 clob:1
psumabsdiff: dest:x src1:x src2:x len:4 clob:1
unpack_lowb: dest:x src1:x src2:x len:4 clob:1
unpack_loww: dest:x src1:x src2:x len:4 clob:1
unpack_lowd: dest:x src1:x src2:x len:4 clob:1
unpack_lowq: dest:x src1:x src2:x len:4 clob:1
unpack_lowps: dest:x src1:x src2:x len:3 clob:1
unpack_lowpd: dest:x src1:x src2:x len:4 clob:1
unpack_highb: dest:x src1:x src2:x len:4 clob:1
unpack_highw: dest:x src1:x src2:x len:4 clob:1
unpack_highd: dest:x src1:x src2:x len:4 clob:1
unpack_highq: dest:x src1:x src2:x len:4 clob:1
unpack_highps: dest:x src1:x src2:x len:3 clob:1
unpack_highpd: dest:x src1:x src2:x len:4 clob:1
packw: dest:x src1:x src2:x len:4 clob:1
packd: dest:x src1:x src2:x len:4 clob:1
packw_un: dest:x src1:x src2:x len:4 clob:1
packd_un: dest:x src1:x src2:x len:5 clob:1
paddb_sat: dest:x src1:x src2:x len:4 clob:1
paddb_sat_un: dest:x src1:x src2:x len:4 clob:1
paddw_sat: dest:x src1:x src2:x len:4 clob:1
paddw_sat_un: dest:x src1:x src2:x len:4 clob:1
psubb_sat: dest:x src1:x src2:x len:4 clob:1
psubb_sat_un: dest:x src1:x src2:x len:4 clob:1
psubw_sat: dest:x src1:x src2:x len:4 clob:1
psubw_sat_un: dest:x src1:x src2:x len:4 clob:1
pmulw: dest:x src1:x src2:x len:4 clob:1
pmuld: dest:x src1:x src2:x len:5 clob:1
pmulq: dest:x src1:x src2:x len:4 clob:1
pmul_high_un: dest:x src1:x src2:x len:4 clob:1
pmul_high: dest:x src1:x src2:x len:4 clob:1
pshrw: dest:x src1:x len:5 clob:1
pshrw_reg: dest:x src1:x src2:x len:4 clob:1
psarw: dest:x src1:x len:5 clob:1
psarw_reg: dest:x src1:x src2:x len:4 clob:1
pshlw: dest:x src1:x len:5 clob:1
pshlw_reg: dest:x src1:x src2:x len:4 clob:1
pshrd: dest:x src1:x len:5 clob:1
pshrd_reg: dest:x src1:x src2:x len:4 clob:1
psard: dest:x src1:x len:5 clob:1
psard_reg: dest:x src1:x src2:x len:4 clob:1
pshld: dest:x src1:x len:5 clob:1
pshld_reg: dest:x src1:x src2:x len:4 clob:1
pshrq: dest:x src1:x len:5 clob:1
pshrq_reg: dest:x src1:x src2:x len:4 clob:1
pshlq: dest:x src1:x len:5 clob:1
pshlq_reg: dest:x src1:x src2:x len:4 clob:1
cvtdq2pd: dest:x src1:x len:4 clob:1
cvtdq2ps: dest:x src1:x len:3 clob:1
cvtpd2dq: dest:x src1:x len:4 clob:1
cvtpd2ps: dest:x src1:x len:4 clob:1
cvtps2dq: dest:x src1:x len:4 clob:1
cvtps2pd: dest:x src1:x len:3 clob:1
cvttpd2dq: dest:x src1:x len:4 clob:1
cvttps2dq: dest:x src1:x len:4 clob:1
xmove: dest:x src1:x len:4
xzero: dest:x len:4
iconv_to_x: dest:x src1:i len:4
extract_i4: dest:i src1:x len:4
extract_i2: dest:i src1:x len:10
extract_u2: dest:i src1:x len:10
extract_i1: dest:i src1:x len:10
extract_u1: dest:i src1:x len:10
extract_r8: dest:f src1:x len:8
iconv_to_r8_raw: dest:f src1:i len:17
insert_i2: dest:x src1:x src2:i len:5 clob:1
extractx_u2: dest:i src1:x len:5
insertx_u1_slow: dest:x src1:i src2:i len:16 clob:x
insertx_i4_slow: dest:x src1:x src2:i len:13 clob:x
insertx_r4_slow: dest:x src1:x src2:f len:24 clob:1
insertx_r8_slow: dest:x src1:x src2:f len:24 clob:1
loadx_membase: dest:x src1:b len:7
storex_membase: dest:b src1:x len:7
storex_membase_reg: dest:b src1:x len:7
loadx_aligned_membase: dest:x src1:b len:7
storex_aligned_membase_reg: dest:b src1:x len:7
storex_nta_membase_reg: dest:b src1:x len:7
fconv_to_r8_x: dest:x src1:f len:14
xconv_r8_to_i4: dest:y src1:x len:7
prefetch_membase: src1:b len:4
expand_i1: dest:x src1:y len:17 clob:1
expand_i2: dest:x src1:i len:15
expand_i4: dest:x src1:i len:9
expand_r4: dest:x src1:f len:13
expand_r8: dest:x src1:f len:13
liverange_start: len:0
liverange_end: len:0
gc_liveness_def: len:0
gc_liveness_use: len:0
gc_spill_slot_liveness_def: len:0
gc_param_slot_liveness_def: len:0
## Streams
A lightweight F#/C# library for efficient functional-style pipelines on streams of data.
All documentation and related material can be found [here](http://nessos.github.io/Streams/).
### Build Status
Head (branch `master`), Build & Unit tests
* Windows [](https://ci.appveyor.com/project/nessos/streams)
* Docker [](https://travis-ci.org/nessos/Streams/branches)
# B2B2C - Platform:
- Platform side: the first B (Platform), services related to the `platform`
- Roles: platform owner, administrators.
## Service list:
- Risk control
- Anti-spam
- Rate limiting
# KinowJavascriptSdk.PlayerApi
All URIs are relative to *https://api.kinow.com/api*
Method | HTTP request | Description
------------- | ------------- | -------------
[**getExtractPlayer**](PlayerApi.md#getExtractPlayer) | **GET** /extracts/{extract_id}/player |
<a name="getExtractPlayer"></a>
# **getExtractPlayer**
> PlayerConfiguration getExtractPlayer(extractId)
Get extract's player
### Example
```javascript
var KinowJavascriptSdk = require('kinow-javascript-sdk');
var apiInstance = new KinowJavascriptSdk.PlayerApi();
var extractId = 789; // Integer | ID of the extract to fetch
apiInstance.getExtractPlayer(extractId).then(function(data) {
console.log('API called successfully. Returned data: ' + data);
}, function(error) {
console.error(error);
});
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**extractId** | **Integer**| ID of the extract to fetch |
### Return type
[**PlayerConfiguration**](PlayerConfiguration.md)
### Authorization
No authorization required
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
## Customize the Authorization Code Flow
### Show an organization logo on the login/register screen
When you use the authorization code flow to authenticate your users through Itsyou.Online, you can provide a better user experience by showing your logo and customizing the text on the login page.
Go to the settings page of an organization:

where you can add an organization logo and change the text shown on the login page by modifying the organization description text.
When a user is asked to login, this logo and text are added to the login/register page:

### Choose a different default language
When an external site uses ItsYou.online through the authorization code flow, it can add the `lang` query parameter to set the default language for users who have not explicitly chosen one. Possible values are `en` and `nl`; if no `lang` query parameter is supplied, English is used.
### Configuring the frequency of the 2FA challenge
When logging in to an external site using Itsyou.Online, a successful 2 factor authentication gains a validity period during which no further 2FAs are required. This 2FA validity is bound to the external site. As long as the user does not provide an invalid password, and the validity period has not expired, the 2FA step is not required for logging in. As soon as an invalid password is provided, the validity of the 2FA, if one is still active, is revoked. When no active validity for the user is detected, they will have to do the 2FA step, and will acquire a new validity period for their successful authentication. The default validity period duration is 7 days.
Currently, it is only possible to view or modify the validity period using the `organizations/{globalid}/2fa/validity` API. The validity period is expressed in seconds. The API supports both **GET** requests to retrieve the validity duration, and **PUT** requests to change the validity duration. Note that the validity period should be between 0 and 2678400 (31 days).
Example to retrieve and modify the validity period of an organization with globalid `mycompany`:
1. Inspect the validity duration
```
GET https://itsyou.online/api/organizations/mycompany/2fa/validity
```
The following information is returned in the response body:
```json
{
"secondsvalidity": 604800
}
```
At this moment, the validity duration for a successful 2FA login is 604800 seconds (7 days, the default).
2. Change the validity duration
```
PUT https://itsyou.online/api/organizations/mycompany/2fa/validity
```
In the body of the request, we specify the new duration, which we will set to 86400 (1 day).
```json
{
"secondsvalidity": 86400
}
```
Also note that an access token will have to be specified, either by appending it to the request url, or by setting it in the Authorization header.
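As a small illustration, the range check and request shape can be captured in a helper. The function name and the use of a Python dict here are our own conventions for the sketch, not part of the API:

```python
def build_validity_request(globalid, seconds):
    """Return (url, payload) for a PUT to the 2FA validity endpoint."""
    if not 0 <= seconds <= 2678400:  # documented range: 0 seconds .. 31 days
        raise ValueError("secondsvalidity must be between 0 and 2678400")
    url = "https://itsyou.online/api/organizations/%s/2fa/validity" % globalid
    return url, {"secondsvalidity": seconds}
```

The returned payload is what goes in the PUT body, with the access token supplied separately as noted above.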
### Show the register screen instead of the login screen
If you think the user has no account with ItsYou.online yet, you can supply the `prefer=register` query parameter in the OAuth flow. This will show the user the register screen instead of the login screen if we do not detect a previous login (this is registered in the local storage).
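To sketch how the `lang` and `prefer` parameters fit into an authorization request, here is a minimal URL builder. The endpoint path and the other OAuth parameters shown (`client_id`, `redirect_uri`, `response_type`) are standard OAuth2 placeholders and assumptions, not taken verbatim from this document:

```python
from urllib.parse import urlencode

def authorize_url(client_id, redirect_uri, lang="en", prefer=None):
    """Build an ItsYou.online authorization URL with optional lang/prefer hints."""
    params = {
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "response_type": "code",
        "lang": lang,                 # "en" or "nl"
    }
    if prefer:
        params["prefer"] = prefer     # e.g. "register" to show the register screen
    return "https://itsyou.online/v1/oauth/authorize?" + urlencode(params)
```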
---
title : "5 Model Free Control"
date : "2017-03-14T15:05:04+08:00"
series : ["reinforcement learning"]
tags : ["machine learning","reinforcement learning"]
math : true
viz : true
---
# Key Points
# Lecture Notes
## Introduction
Model-free control can solve these problems
- MDP model is unknown, but experience can be sampled
- MDP model is known, but is too big to use, except by samples
On/Off policy learning
- On-policy learning
- “Learn on the job”
- Learn about policy 𝛑 from experience sampled from 𝛑
- Off-policy learning
- “Look over someone’s shoulder”
- Learn about policy 𝛑 from experience sampled from μ
## On-Policy MC Control
### Generalised Policy Iteration with Action-Value Function
Greedy policy improvement over V(s) requires model of MDP
$ \pi'(s) = \operatorname*{arg\,max}\limits_{a \in \mathcal{A}} \mathcal{R}_s^a + \mathcal{P}_{ss'}^aV(s') $
Greedy policy improvement over Q(s,a) is model-free
$ \pi'(s) = \operatorname*{arg\,max}\limits_{a \in \mathcal{A}} \mathcal{Q}(s,a) $
Policy evaluation: Monte-Carlo policy evaluation, $ Q = q_\pi $
Policy improvement: greedy policy improvement? (Not quite: use 𝝴-greedy, as described next.)
### Exploration
**recall greedy**
$
\pi_*(a|s) =
\begin{cases}
1, & \text{if } a = \operatorname*{arg\,max}\limits_{a \in \mathcal{A}} q_*(s,a) \\
0, & \text{otherwise}
\end{cases}
$
**𝝴-Greedy Exploration**
- Simplest idea for ensuring continual exploration
- All m actions are tried with non-zero probability
- With probability 1-𝝴 choose the greedy action
- With probability 𝝴 choose an action at random
$
\pi(a|s) =
\begin{cases}
\epsilon/m + 1 - \epsilon, & \text{if } a^* = \operatorname*{arg\,max}\limits_{a \in \mathcal{A}} Q(s,a) \\
\epsilon/m, & \text{otherwise}
\end{cases}
$
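As an illustration (not part of the original lecture), the piecewise 𝝴-greedy distribution above can be computed directly from the Q estimates; the function name is ours:

```python
def epsilon_greedy_probs(q_values, epsilon):
    """Return the epsilon-greedy pi(a|s) over m actions, given Q(s, a) estimates."""
    m = len(q_values)
    greedy = max(range(m), key=lambda a: q_values[a])  # arg max_a Q(s, a)
    probs = [epsilon / m] * m                          # every action gets eps/m
    probs[greedy] += 1.0 - epsilon                     # greedy action gets eps/m + 1 - eps
    return probs
```

Sampling an action from this distribution gives the exploring behaviour the control loop needs.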
**𝝴-Greedy Policy Improvement**
Theorem
For any 𝝴-greedy policy 𝛑, the 𝝴-greedy policy 𝛑' with respect to $q_\pi$ is an improvement, $ v_{\pi'}(s) \geqslant v_\pi(s) $
$
\begin{align*}
q_\pi(s, \pi'(s)) &= \sum_{a}\pi'(a|s)q_\pi(s,a) \\
&= \epsilon/m \sum_{a} q_\pi(s,a) + (1-\epsilon) \max_a q_\pi(s,a) \\
&\geqslant \epsilon/m \sum_{a} q_\pi(s,a) + (1-\epsilon) \sum_{a} \frac{\pi(a|s) - \epsilon/m}{1-\epsilon} q_\pi(s,a) \\
&= \sum_a \pi(a|s) q_\pi(s,a) \\
&= v_\pi(s)
\end{align*}
$
Therefore from policy improvement theorem, $ v_{\pi'}(s) \geqslant v_\pi(s) $
### GLIE
**Definition**
_Greedy in the Limit with Infinite Exploration_ (GLIE)
- All state-action pairs are explored infinitely many times,
$ \lim\limits_{k\rightarrow \infty} N_k(s,a)=\infty $
- The policy converges on a greedy policy,
$ \lim\limits_{k\rightarrow \infty} \pi_k(a|s)=1(a=\operatorname*{arg\,max}\limits_{a'} Q_k(s,a')) $
For example, 𝝴-greedy is GLIE if 𝝴 reduces to zero at $ \epsilon_k = \frac{1}{k}$
**GLIE Monte-Carlo Control**
- Sample kth episode using 𝛑: {S1, A1, R2, ..., ST } ∼ 𝛑
- For each state St and action At in the episode
recall _Incremental Mean_
$
\begin{align*}
N(S_t, A_t) &\leftarrow N(S_t, A_t) + 1 \\
Q(S_t, A_t) &\leftarrow Q(S_t, A_t) + \frac{1}{N(S_t, A_t)} (G_t - Q(S_t, A_t))
\end{align*}
$
- Improve policy based on new action-value function
$
\begin{align*}
\epsilon &\leftarrow 1/k \\
\pi &\leftarrow \epsilon\text{-greedy}(Q)
\end{align*}
$
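The update rules above can be sketched as one GLIE Monte-Carlo control iteration. This is our own naming, using every-visit updates: accumulate the return backwards through the episode, update Q by the incremental mean, then set 𝝴 = 1/k:

```python
def glie_mc_update(Q, N, episode, gamma, k):
    """One GLIE MC control iteration over an episode of (state, action, reward) triples.

    Q and N are dicts keyed by (state, action); returns the new epsilon = 1/k.
    """
    G = 0.0
    for state, action, reward in reversed(episode):
        G = reward + gamma * G                    # return G_t from this step onward
        key = (state, action)
        N[key] = N.get(key, 0) + 1                # N(S_t, A_t) <- N(S_t, A_t) + 1
        q = Q.get(key, 0.0)
        Q[key] = q + (G - q) / N[key]             # incremental-mean update of Q
    return 1.0 / k                                # epsilon <- 1/k for the next policy
```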
**Theorem**
GLIE Monte-Carlo control converges to the optimal action-value function, $ Q(s,a) \rightarrow q_*(s,a) $
## On-Policy TD Learning
## Off-Policy Learning
## Summary
# Exercises
---
title: SqlErrorLogEvent Class | Microsoft Docs
ms.custom: ''
ms.date: 03/06/2017
ms.prod: sql-server-2014
ms.reviewer: ''
ms.technology:
- database-engine
- docset-sql-devref
ms.topic: reference
helpviewer_keywords:
- SqlErrorLogEvent class
- SqlErrorLogFile class
ms.assetid: bde6c467-38d0-4766-a7af-d6c9d6302b07
author: CarlRabeler
ms.author: carlrab
manager: craigg
ms.openlocfilehash: dd0b66fb83d62291d30ca3488591e1cceda5d781
ms.sourcegitcommit: 3da2edf82763852cff6772a1a282ace3034b4936
ms.translationtype: MT
ms.contentlocale: es-ES
ms.lasthandoff: 10/02/2018
ms.locfileid: "48179815"
---
# <a name="sqlerrorlogevent-class"></a>SqlErrorLogEvent, clase
  Provides the properties for viewing the events in a specified [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] log file.
## <a name="syntax"></a>Sintaxis
```
class SQLErrorLogEvent
{
stringFileName;
stringInstanceName;
datetimeLogDate;
stringMessage;
stringProcessInfo;
};
```
## <a name="properties"></a>Propiedades
 The SQLErrorLogEvent class defines the following properties.
|||
|-|-|
|FileName|Data type: `string`<br /><br /> Access type: read-only<br /><br /> <br /><br /> The name of the error log file.|
|InstanceName|Data type: `string`<br /><br /> Access type: read-only<br /><br /> Qualifiers: key<br /><br /> The name of the instance of [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] where the log file resides.|
|LogDate|Data type: `datetime`<br /><br /> Access type: read-only<br /><br /> Qualifiers: key<br /><br /> <br /><br /> The date and time when the event was recorded in the log file.|
|Message|Data type: `string`<br /><br /> Access type: read-only<br /><br /> <br /><br /> The event message.|
|ProcessInfo|Data type: `string`<br /><br /> Access type: read-only<br /><br /> <br /><br /> Information about the source server process ID (SPID) for the event.|
## <a name="remarks"></a>Comentarios
|||
|-|-|
|MOF|Sqlmgmproviderxpsp2up.mof|
|DLL|Sqlmgmprovider.dll|
|Namespace|\root\Microsoft\SqlServer\ComputerManagement10|
## <a name="example"></a>Ejemplo
En el siguiente ejemplo se muestra cómo recuperar los valores para todos los eventos anotados en un archivo de registro especificado. Para ejecutar el ejemplo, reemplace \< *Instance_Name*> con el nombre de la instancia de [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)], por ejemplo, 'Instancia1' y reemplace 'File_Name' por el nombre del archivo de registro de errores, como 'ERRORLOG.1'.
```
on error resume next
strComputer = "."
Set objWMIService = GetObject("winmgmts:" _
& "{impersonationLevel=impersonate}!\\" _
& strComputer & "\root\MICROSOFT\SqlServer\ComputerManagement10")
set logEvents = objWmiService.ExecQuery("SELECT * FROM SqlErrorLogEvent WHERE InstanceName = '<Instance_Name>' AND FileName = 'File_Name'")
For Each logEvent in logEvents
WScript.Echo "Instance Name: " & logEvent.InstanceName & vbNewLine _
& "Log Date: " & logEvent.LogDate & vbNewLine _
& "Log File Name: " & logEvent.FileName & vbNewLine _
& "Process Info: " & logEvent.ProcessInfo & vbNewLine _
& "Message: " & logEvent.Message & vbNewLine _
Next
```
## <a name="comments"></a>Comments
When *InstanceName* or *FileName* is not supplied in the WQL statement, the query returns information for the default instance and the current [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] log file. For example, the following WQL statement returns all log events from the current log file (ERRORLOG) on the default instance (MSSQLSERVER).
```
"SELECT * FROM SqlErrorLogEvent"
```
## <a name="security"></a>Security
To connect to a [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] log file through WMI, you must have the following permissions on both the local and remote computers:
- Read access to the **Root\Microsoft\SqlServer\ComputerManagement10** WMI namespace. By default, everyone has read access through the Enable Account permission.
- Read permission on the folder that contains the error logs. By default, the error logs are located in the following path (where \<*Drive*> represents the drive where [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] is installed and \<*InstanceName*> is the name of the [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] instance):
**\<Drive>:\Program Files\Microsoft SQL Server\MSSQL12.\<InstanceName>\MSSQL\Log**
If you are connecting through a firewall, make sure that an exception is set in the firewall for WMI on the remote target computers. For more information, see [Connect to WMI remotely starting with Windows Vista](http://go.microsoft.com/fwlink/?LinkId=178848).
## <a name="see-also"></a>See Also
[SqlErrorLogFile Class](sqlerrorlogfile-class.md)
[View Offline Log Files](../logs/view-offline-log-files.md)
# `manifest.json` configuration
## `arc_metadata`
Raw list of possible values for `arc_metadata`:
```
"arc_metadata": {
allowEmptyActivityStack: false,
apkList: [
"custom-android-release-1400197.apk"
],
canRotate: false,
disableAutoBackButton: false,
enableAdb: false,
enableArcStrace: false,
enableExternalDirectory: false,
enableGlErrorCheck: false,
formFactor: "phone",
fpsLimit: 60,
isSlowDebugRun: false,
jdbPort: 0,
logLoadProgress: false,
minimumLaunchDelay: 0,
name: "",
ndkAbi: "",
orientation: "portrait",
packageName: "org.chromium.arc",
resize: "disabled",
shell: [],
stderrLog: "S",
useGoogleContactsSyncAdapter: false,
usePlayServices: [
"gcm"
],
sleepOnBlur: true
}
```
## `file_handlers`
See https://developer.chrome.com/apps/manifest/file_handlers.
This is useful for Chrome OS users. You can experiment with opening files in certain Android apps by setting their file handlers. Add to `manifest.json` in your app:
```
"file_handlers": {
"any": {
"title": "Open with SOME_APP",
"types": [ "*/*" ]
}
},
```
This way your file manager will get this option:
<img src="http://i.imgur.com/zTjPaHc.png" width="250px" />
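If you want SOME_APP offered only for certain kinds of files, a narrower handler can be declared instead. This variation is hypothetical (not from the original README), restricting the handler to plain-text files:

```
"file_handlers": {
  "text": {
    "title": "Open with SOME_APP",
    "types": [ "text/plain" ]
  }
},
```

The MIME patterns in `types` follow the same matching rules as the `"*/*"` wildcard in the example above.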
1. Recursive
```C++
/**
* Definition for a binary tree node.
* struct TreeNode {
* int val;
* TreeNode *left;
* TreeNode *right;
* TreeNode(int x) : val(x), left(NULL), right(NULL) {}
* };
*/
class Solution {
public:
vector<int> preorderTraversal(TreeNode* root) {
vector<int> ans;
preOrder(root,ans);
return ans;
}
private:
void preOrder(TreeNode *root,vector<int> &ans){
if(!root) return;
ans.push_back(root->val);
preOrder(root->left,ans);
preOrder(root->right,ans);
}
};
```
2. Iterative
> This is essentially a DFS, so a stack is needed
```C++
class Solution {
public:
vector<int> preorderTraversal(TreeNode* root) {
vector<int> ans;
preOrder(root,ans);
return ans;
}
private:
void preOrder(TreeNode *root,vector<int> &ans){
stack<TreeNode*> st;
while(root||!st.empty()){
while(root){
ans.push_back(root->val);
st.push(root);
root=root->left;
}
if(!st.empty()){
root=st.top()->right;
st.pop();
}
}
}
};
``` | 20.333333 | 59 | 0.500431 | yue_Hant | 0.381586 |
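3. A third variant, not in the original notes: the common push-right-then-left stack version, shown here as a standalone sketch for comparison. Pushing the right child first means the left subtree is popped, and therefore visited, first.

```cpp
#include <cassert>
#include <stack>
#include <vector>
using namespace std;

struct TreeNode {
    int val;
    TreeNode *left;
    TreeNode *right;
    TreeNode(int x) : val(x), left(nullptr), right(nullptr) {}
};

// Pop a node, record it, then push right before left so the left
// subtree is processed first (preorder: root, left, right).
vector<int> preorderTraversal(TreeNode* root) {
    vector<int> ans;
    stack<TreeNode*> st;
    if (root) st.push(root);
    while (!st.empty()) {
        TreeNode* node = st.top();
        st.pop();
        ans.push_back(node->val);
        if (node->right) st.push(node->right);
        if (node->left) st.push(node->left);
    }
    return ans;
}
```

Compared with the version above, this one records the value at pop time instead of while sliding down the left spine; both are O(n).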
A teacher at a Shanghai vocational college was exposed for making inappropriate remarks about the Nanjing Massacre, and the school says an investigation has been launched. What responsibility might the teacher bear?
Xinjiang releases multiple videos of violent terror cases, most of the footage disclosed for the first time. What other information is worth noting?
The anime-style game 《幻塔》 (Tower of Fantasy) has launched. How is the gameplay experience, and does it meet your expectations?
Hope schools donated by Louis Koo are gradually falling into disuse, and netizens chime in that schools in their own hometowns are abandoned too. Why is this happening?
Is the beauty produced by staged photos and filters "fake"?
A single mother buying a home was threatened and insulted by an agent ("I hear you used artificial insemination? Maybe I can help you"). What pitfalls should you watch for when buying through an agent?
Media expose a TV drama that earned over 100 million yuan spending 90 million on buying ratings. Why buy ratings, and does the fraud raise legal issues?
What is the most "emo" photo you took in 2021?
Pharmaceutical stocks plunged across the board on December 15, 2021. Why did WuXi AppTec, valued at 400 billion yuan, crash to its daily limit?
What unusual career stories do you have to share?
A mail sorter in Xi'an died suddenly after an overnight shift. How demanding is this job, and should China Post bear responsibility?
Hu Xijin steps down as editor-in-chief of the Global Times and will continue working as a special commentator. What do you make of this?
A woman in Henan reports her former mother-in-law by name for "drawing a salary without working at a bank while owning 8 apartments and 9 shops"; officials confirm she is a bank employee. What do you make of this?
A Jilin Animation Institute student was disciplined for plagiarizing someone else's Winter Olympics work. How should we view this, and how can proper copyright awareness be built?
A teacher at Qingdao Qiushi College was fired for caning a misbehaving student, then texted the student afterwards to "study hard without feeling burdened". What do you make of this?
Drink vending machines at Chengdu Shuangliu Airport sold "shrunken" drinks. Is this illegal, and what legal liability might the parties involved bear?
Some Shenzhen banks cut first-home mortgage rates to as low as 4.95%, and many second-hand-home owners lowered their asking prices. Will second-hand transactions pick up again?
A panda at the Beijing Zoo climbed the wall and "escaped", and staff gave it a "stern talking-to". What do you make of this?
How is the TV adaptation of 《雪中悍刀行》 (Sword Snow Stride)? Does it meet your expectations?
Why did all the major manufacturers start building foldable phones at the end of 2021? Will foldables become mainstream?
China's Sinovac says a booster shot yields a 94% positive rate of neutralizing antibodies against Omicron. What does this mean?
All six players of iG's world-championship-winning roster are confirmed to be leaving. What do you make of this?
From 10:20 on the 15th to 8:00 on the 16th, Shaanxi reported 2 new local confirmed cases. What is the situation there now?
The US has surpassed 50 million confirmed COVID-19 cases with nearly 800,000 deaths, more than 1 in 7 people infected. What do you make of this?
The first chief designer of the Shenzhou spacecraft calls on young people to take part in developing the "fourth frontier". Where does China's space technology stand?
Why do Chinese people all wear Western hairstyles? Are there no traditional Chinese styles?
Hollywood will make a film about US troops rescuing locals during the withdrawal from Afghanistan. What do you make of Hollywood glorifying the US military?
Some say One Piece has become watered down, getting thinner and thinner. What do you think?
Do you think Li Gengxi can play Jiang Ni, the female lead of Sword Snow Stride, well?
I have a web-novel idea I really want to write, but I want to wait until I'm skilled enough and fear someone will beat me to it. What should I do?
What good habits do you have for self-improvement?
What do you make of the recent steep price drops on Lenovo laptops?
What are some stories that are terrifying to the extreme?
What is the state of marriage like for middle-aged couples?
OPPO just announced the Find N foldable starting at 7,699 yuan. Is it worth buying at that price?
In episode 2 of 《很高兴认识你 2》 (Nice to Meet You 2), Zhou Xun decorates her rented apartment like a home. What do you make of this?
What insights did you gain about educating your children in 2021?
A "tiger dad" with a PhD in Jiangsu beat his six- and seven-year-old children to force them to study advanced math. How do you judge this father's approach?
How do I get started on writing my first academic paper?
How do you tackle the paragraph-matching questions on the CET-6? Are there any techniques?
For Spring Festival 2022, will you stay home or travel? Where do you plan to go?
What conditions must be met to obtain household registration (hukou) in Shanghai?
Electric cars all advertise fast acceleration; how do they perform on long highway trips?
Why is the Chen Duo arc of 《一人之下》 (The Outcast) considered god-tier? Wasn't Old Liao's death written just for sentimentality?
Why are most of the Title Douluo in 《斗罗大陆》 (Soul Land) single?
How can a fresh graduate with no skills find a job?
What is a good birthday gift for a child around 3 years old?
What if you can't get train tickets for the holidays? Are there any good ticket-grabbing methods?
What time of year is best for changing jobs, when finding a new one is relatively easy?
What is the real difference between earphones costing a few hundred yuan and a few thousand? Which specs matter when choosing?
I'd love to see contributions!!!
Link an [issue](https://github.com/kataras/pio/issues) that your PR tries to solve.
---
UID: NS:ntddvol._VOLUME_READ_PLEX_INPUT
title: _VOLUME_READ_PLEX_INPUT (ntddvol.h)
description: This structure is used in conjunction with IOCTL_VOLUME_READ_PLEX to read data from a specific plex in a volume.
old-location: storage\volume_read_plex_input.htm
tech.root: storage
ms.assetid: 1d53c658-9912-4912-a74f-f7b93367b9e2
ms.date: 03/29/2018
keywords: ["_VOLUME_READ_PLEX_INPUT structure"]
ms.keywords: "*PVOLUME_READ_PLEX_INPUT, PVOLUME_READ_PLEX_INPUT, PVOLUME_READ_PLEX_INPUT structure pointer [Storage Devices], VOLUME_READ_PLEX_INPUT, VOLUME_READ_PLEX_INPUT structure [Storage Devices], _VOLUME_READ_PLEX_INPUT, ntddvol/PVOLUME_READ_PLEX_INPUT, ntddvol/VOLUME_READ_PLEX_INPUT, storage.volume_read_plex_input, structs-volumemgr_26a6ef07-d18e-45bd-b4c3-532d7daadc5c.xml"
f1_keywords:
- "ntddvol/VOLUME_READ_PLEX_INPUT"
- "VOLUME_READ_PLEX_INPUT"
req.header: ntddvol.h
req.include-header: Ntddvol.h
req.target-type: Windows
req.target-min-winverclnt:
req.target-min-winversvr:
req.kmdf-ver:
req.umdf-ver:
req.ddi-compliance:
req.unicode-ansi:
req.idl:
req.max-support:
req.namespace:
req.assembly:
req.type-library:
req.lib:
req.dll:
req.irql:
topic_type:
- APIRef
- kbSyntax
api_type:
- HeaderDef
api_location:
- ntddvol.h
api_name:
- VOLUME_READ_PLEX_INPUT
targetos: Windows
req.typenames: VOLUME_READ_PLEX_INPUT, *PVOLUME_READ_PLEX_INPUT
---
# _VOLUME_READ_PLEX_INPUT structure
## -description
This structure is used in conjunction with <a href="https://docs.microsoft.com/windows-hardware/drivers/ddi/ntddvol/ni-ntddvol-ioctl_volume_read_plex">IOCTL_VOLUME_READ_PLEX</a> to read data from a specific <a href="https://docs.microsoft.com/windows-hardware/drivers/">plex</a> in a volume.
## -struct-fields
### -field ByteOffset
Supplies the start offset, in bytes, relative to the beginning of the volume. This member must be aligned on a 512-byte boundary.
### -field Length
Supplies the length, in bytes, of the block to be read. This member must be an integer multiple of 512 bytes.
### -field PlexNumber
Supplies the zero-based plex number.
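The field descriptions above can be read against the structure's declaration. The member types below follow the declaration in ntddvol.h; the stdint stand-ins for the Windows types and the validation helper are illustrative assumptions, not part of the header:

```c
#include <assert.h>
#include <stdint.h>

/* Illustrative stand-ins for the Windows types (LARGE_INTEGER is really a
   union; its 64-bit QuadPart member is what is represented here). */
typedef int64_t  LONGLONG;
typedef uint32_t ULONG;

typedef struct _VOLUME_READ_PLEX_INPUT {
    LONGLONG ByteOffset;   /* must be aligned on a 512-byte boundary */
    ULONG    Length;       /* must be an integer multiple of 512 bytes */
    ULONG    PlexNumber;   /* zero-based plex index */
} VOLUME_READ_PLEX_INPUT, *PVOLUME_READ_PLEX_INPUT;

/* Returns 1 when the request satisfies the alignment rules stated above. */
static int plex_input_is_valid(const VOLUME_READ_PLEX_INPUT *in)
{
    return (in->ByteOffset % 512 == 0) &&
           (in->Length % 512 == 0) &&
           (in->Length > 0);
}
```

A caller fills this structure in the input buffer of the IOCTL_VOLUME_READ_PLEX request; the helper simply mirrors the 512-byte constraints documented for **ByteOffset** and **Length**.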
## -see-also
<a href="https://docs.microsoft.com/windows-hardware/drivers/ddi/ntddvol/ni-ntddvol-ioctl_volume_read_plex">IOCTL_VOLUME_READ_PLEX</a>
1. Base Resource: Microalbumin (as Observation)
1. Required: Microalbumin code - LOINC 14957-5 (as CodeableConcept)
1. Required: subject of care (as Reference(Patient))
1. Required: Microalbumin value - units mg/L (as Quantity)
This crate provides fuzzy search/string matching using N-grams.
This implementation is character-based rather than word-based, matching solely on string similarity.
Licensed under the MIT license.
### Documentation
Not published yet.
### Installation
This crate is published on [crates.io](https://crates.io/crates/).
To use it, add this to your Cargo.toml:
```toml
[dependencies]
ngrammatic = "0.2.0"
```
### Usage
To do fuzzy matching, build up your corpus of valid symbols like this:
```rust
use ngrammatic::{CorpusBuilder, Pad};
let mut corpus = CorpusBuilder::new()
.arity(2)
.pad_full(Pad::Auto)
.finish();
// Build up the list of known words
corpus.add_text("pie");
corpus.add_text("animal");
corpus.add_text("tomato");
corpus.add_text("seven");
corpus.add_text("carbon");
// Now we can try an unknown/misspelled word, and find a similar match
// in the corpus
let word = String::from("tomacco");
if let Some(top_result) = corpus.search(word, 0.25).first() {
if top_result.similarity > 0.99 {
println!("✔ {}", top_result.text);
} else {
println!("❓{} (did you mean {}? [{:.0}% match])",
word,
top_result.text,
top_result.similarity * 100.0);
}
} else {
println!("🗙 {}", word);
}
```
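Under the hood, n-gram matching comes down to comparing gram frequency sets. The following std-only sketch is not the crate's actual implementation, but it shows the idea for arity 2 (bigrams) with a Dice-style score:

```rust
use std::collections::HashMap;

// Count character bigrams of a space-padded string.
fn bigrams(text: &str) -> HashMap<(char, char), usize> {
    let padded: Vec<char> = format!(" {} ", text).chars().collect();
    let mut counts = HashMap::new();
    for pair in padded.windows(2) {
        *counts.entry((pair[0], pair[1])).or_insert(0) += 1;
    }
    counts
}

// Dice-style similarity on the bigram multisets, in [0.0, 1.0].
fn similarity(a: &str, b: &str) -> f64 {
    let (ga, gb) = (bigrams(a), bigrams(b));
    let total: usize = ga.values().sum::<usize>() + gb.values().sum::<usize>();
    if total == 0 {
        return 0.0;
    }
    let shared: usize = ga
        .iter()
        .map(|(k, &v)| v.min(*gb.get(k).unwrap_or(&0)))
        .sum();
    (2.0 * shared as f64) / total as f64
}

fn main() {
    // Identical strings share every bigram.
    assert!((similarity("tomato", "tomato") - 1.0).abs() < 1e-9);
    // A near-miss scores higher than an unrelated word.
    assert!(similarity("tomato", "tomacco") > similarity("tomato", "seven"));
    println!("ok");
}
```

The padding mirrors the crate's `Pad` concept: it lets the first and last characters of a word participate in a full bigram.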
---
title: "Fix slow mail flow rules insight in the modern EAC"
f1.keywords:
- NOCSH
ms.author: chrisda
author: chrisda
manager: dansimp
audience: ITPro
ms.topic: article
ms.service: exchange-online
localization_priority: Normal
ms.assetid:
description: "Admins can learn how to use the Fix slow mail flow rules insight in the modern Exchange admin center to identify and fix inefficient or broken mail flow rules (also known as transport rules) in their organization."
---
# Fix slow mail flow rules insight in the modern EAC
Inefficient mail flow rules (also known as transport rules) can lead to mail flow delays for your organization. This insight reports mail flow rules that have an impact on your organization's mail flow. Examples of these types of rules are:
- Conditions that use **Is member of** for large groups.
- Conditions that use complex regular expression (regex) pattern matching.
- Conditions that use content checking in attachments.
The **Fix slow mail flow rules** insight in the Insights dashboard in the modern Exchange admin center (modern EAC) will notify you when a mail flow rule is taking too long to complete. You can use this notification to help you to identify and fine-tune mail flow rules to help reduce mail flow delays.

When you click **View details**, a flyout appears where you can review the rule by clicking **View rules**. You can also click **View sample messages** to see what kind of messages are impacted by the rule.

For more information about conditions and exceptions in mail flow rules in Exchange Online, see [Mail flow rule conditions and exceptions (predicates) in Exchange Online](../../security-and-compliance/mail-flow-rules/conditions-and-exceptions.md).
## Related topics
For more information about other mail flow insights in the mail flow dashboard, see [Mail flow insights in the modern Exchange admin center](mail-flow-insights.md).
---
layout: fiche
title: "Introduction à la data visualisation"
lang: fr
duree: 30mn
public: Adolescents, adultes
participants: 10
category: [Les données à l'ère d'Internet]
description: "Au cours de cet atelier, vous apprendrez à transformer des données
brutes en informations lisibles et intelligibles à l’aide d’un outil
simple de data visualisation."
permalink: /fr/fiches-pedagogiques/introduction-a-la-data-visualisation/
---
L’atelier se déroule en deux étapes :
1. Vous faites la démonstration et l’explication sur projecteur de la
création d’un graphique
2. Les participant-e-s créent leurs graphiques avec d’autres données
Étape 1 : démonstration
------------------------
To do data visualization, we need data. We will fetch it from [data.gouv.fr](https://www.data.gouv.fr/fr/), a genuine trove of raw data made available by the French government as part of the [Etalab](https://fr.wikipedia.org/wiki/Etalab) mission.
For this workshop we will use the INA dataset [Temps de parole des hommes et des femmes à la télévision et à la radio](https://www.data.gouv.fr/fr/datasets/temps-de-parole-des-hommes-et-des-femmes-a-la-television-et-a-la-radio/) (speaking time of men and women on television and radio).
Later on, you will be able to run this workshop with another dataset; there are plenty on [data.gouv.fr](https://www.data.gouv.fr/fr/). The chosen data has the advantage of being fairly telling and, above all, of being in CSV format, a format that is relatively simple to work with using the method we use here. On [data.gouv.fr](https://www.data.gouv.fr/fr/), many other datasets are in JSON or XML, formats that require some programming knowledge to exploit.
Download the dataset and open it with Excel or, better, [LibreOffice Calc](https://www.libreoffice.org/discover/calc/), the free, open-source equivalent of Excel.
After importing the data (if you are unsure about this step, [take a look at the official documentation](https://help.libreoffice.org/Common/Importing_and_Exporting_Data_in_Text_Format/fr)), you end up with a file that looks like this:
Copy only the rows containing Chérie FM, then go to the website [datawrapper.de](https://www.datawrapper.de/). This site lets you create visualizations of raw data very simply. On the home page, click the *Create a chart* button and paste the data selected in LibreOffice into the form field.
Click *Proceed*.
In this second step, we select the data we want to keep. Click the *is\_public\_channel* column and, on the left, check the box *Hide column from visualization*. Make sure the data is in the right format (that dates are really dates and numbers are really numbers). Normally *datawrapper* does this for you automatically.
*Click Proceed.*
In this third step, you choose the type of visualization you will use to display your chart. Select the *Lines* chart type.
Dans l’onglet *Refine* vous allez ensuite définir les données que vous
allez afficher en abscisses et en ordonnées. Sélectionnez *year* pour
les abscisses (x-axis).
Dans l’onglet *Annotate*, choisissez un titre et une description
décrivant les données que vous affichez. Ici nous avons choisi « Chéri
FM et la parole des femmes »
Votre schéma est prêt pour la publication !
Your turn!
-----------------
Now that you have shown how to create a chart from data retrieved on data.gouv.fr and how to visualize it in *datawrapper*, it is the participants' turn! Create 2 or 3 groups depending on how many people are present, and ask each group to build the same chart, with the same process, for a different media outlet. You can then see which outlets give women the most speaking time and which give the least.
Possible variants… {#variantes-possibles...}
--------------------
…but somewhat more complicated, since they require some manipulation of the data:
- find the outlet that gives women the most speaking time and the one that gives the least, and have the groups produce the charts for those two outlets
- find the outlets that significantly increased women's speaking time over the years
- find the outlets that significantly decreased women's speaking time over the years
- plot the overall curve of how women's speaking time evolved, all outlets combined
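These variants call for a bit of data wrangling before going back to datawrapper. As a sketch (not part of the original workshop sheet, and using made-up numbers in place of the real INA file), the per-outlet trend can be computed with Python's standard csv module:

```python
import csv
import io

# Toy stand-in for the INA CSV: outlet, year, women's share of speaking time.
DATA = """media,annee,part_femmes
Cherie FM,2001,0.31
Cherie FM,2019,0.45
Radio X,2001,0.40
Radio X,2019,0.33
"""

def speaking_time_trend(csv_text):
    """Return {outlet: share in last year minus share in first year}."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    trend = {}
    for media in {r["media"] for r in rows}:
        series = sorted(
            (int(r["annee"]), float(r["part_femmes"]))
            for r in rows
            if r["media"] == media
        )
        trend[media] = series[-1][1] - series[0][1]
    return trend

trend = speaking_time_trend(DATA)
# Outlets that increased women's speaking time have a positive trend.
print(max(trend, key=trend.get))  # → Cherie FM
```

With the real dataset, the column names would need to match the downloaded file; the idea of sorting each outlet's series by year and comparing the endpoints stays the same.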
There are many datasets on data.gouv.fr. Once you get the hang of *datawrapper*, you will be able to run other workshops using other datasets. The only limit is your imagination!
---
layout: post
title: "[learning javascript] chapter 16. Math"
description: " "
date: 2021-06-18
tags: [javascript]
comments: true
share: true
---
# Math
- Describes the Math object
## Number formats
- JavaScript's number-formatting methods all return strings, not numbers
- A string is required to fully represent the symbols each format needs
#### Fixed decimal
- If you want a fixed number of digits after the decimal point, use `toFixed()`
```javascript
const x = 19.51;
x.toFixed(3); // "19.510"
x.toFixed(2); // "19.51"
x.toFixed(1); // "19.5"
x.toFixed(0); // "20"
```
- Numbers are rounded, not truncated
#### Exponential notation
When you need exponential notation, use `toExponential()`
```javascript
const x = 3800.5;
x.toExponential(4); // "3.8005e+3"
x.toExponential(3); // "3.801e+3"
x.toExponential(2); // "3.80e+3"
x.toExponential(1); // "3.8e+3"
x.toExponential(0); // "4e+3"
```
- As with `toFixed()`, the result is rounded
#### Fixed precision
- If what matters is the total number of significant digits, regardless of where the decimal point falls, use `toPrecision()`
```javascript
let x = 1000;
x.toPrecision(5); // "1000.0"
x.toPrecision(4); // "1000"
x.toPrecision(3); // "1.00e+3"
x.toPrecision(2); // "1.0e+3"
x.toPrecision(1); // "1e+3"
x = 15.335;
x.toPrecision(6); // "15.3350"
x.toPrecision(5); // "15.335"
x.toPrecision(4); // "15.34"
x.toPrecision(3); // "15.3"
x.toPrecision(2); // "15"
x.toPrecision(1); // "2e+1"
```
- The output is rounded, and the total number of digits matches the argument you passed
#### Other bases
- If you want binary, octal, or hexadecimal representations, pass a radix to `toString()`
```javascript
const x = 12;
x.toString(); // "12" (decimal)
x.toString(10); // "12" (decimal)
x.toString(16); // "c" (hexadecimal)
x.toString(8); // "14" (octal)
x.toString(2); // "1100" (binary)
```
## Advanced number formats
- If you need to display numbers in a wide variety of formats, the `numeral.js` library is recommended
- It is mainly needed in cases such as:
- Very large numbers with thousands separators
- Alternative negative-number styles, such as parentheses
- Engineering notation (similar to exponential notation)
- SI prefixes such as milli-, micro-, kilo-, and mega-
## Constants
- The Math object has several important constants built in as properties
```javascript
// basic constants
Math.E // base of the natural logarithm: ~2.718
Math.PI // pi: ~3.142
// The logarithm-related constants can be accessed as Math properties, but if
// you use them often, assigning them to your own constants is more convenient
Math.LN2 // natural log of 2: ~0.693
Math.LN10 // natural log of 10: ~2.303
Math.LOG2E // base-2 log of Math.E: ~1.443
Math.LOG10E // base-10 log of Math.E: ~0.434
// algebra-related constants
Math.SQRT1_2 // square root of 1/2: ~0.707
Math.SQRT2 // square root of 2: ~1.414
```
## Algebraic functions
#### Exponentiation
- The basic exponentiation function is `Math.pow`
- Power and root functions:
|Function|Description|Example|
|---|---|---|
|Math.pow(x,y)|x<sup>y</sup>|Math.pow(2,3) // 8<br>Math.pow(1.7, 2.3) // ~3.39
|Math.sqrt(x)|Square root; √x is equivalent to Math.pow(x, 0.5)|Math.sqrt(16) // 4<br>Math.sqrt(15.5) // ~3.94|
|Math.cbrt(x)|Cube root; equivalent to Math.pow(x, 1/3)|Math.cbrt(27) // 3<br>Math.cbrt(22) // ~2.8|
|Math.exp(x)|e<sup>x</sup>; equivalent to Math.pow(Math.E, x)|Math.exp(1) // ~2.718<br>Math.exp(5.5) // ~244.7|
|Math.expm1(x)|e<sup>x</sup>-1; equivalent to Math.exp(x)-1|Math.expm1(1) // ~1.718<br>Math.expm1(5.5) // ~243.7|
|Math.hypot(x1, x2,...)|Square root of the sum of the squares of the arguments, √(x1<sup>2</sup> + x2<sup>2</sup> + ...)|Math.hypot(3, 4) // 5<br>Math.hypot(2, 3, 4) // ~5.39
#### Logarithmic functions
- The natural logarithm function is Math.log
- ES6 added Math.log10 for the commonly used base-10 logarithm
|Function|Description|Example|
|---|---|---|
|Math.log(x)|Natural log of x|Math.log(Math.E) // 1<br>Math.log(17.5) // ~2.86
|Math.log10(x)|Base-10 log of x; equivalent to Math.log(x)/Math.log(10)|Math.log10(10) // 1<br>Math.log10(16.7) // ~1.22
|Math.log2(x)|Base-2 log of x; equivalent to Math.log(x)/Math.log(2)|Math.log2(2) // 1<br>Math.log2(5) // ~2.32
|Math.log1p(x)|Natural log of 1+x; equivalent to Math.log(1+x)|Math.log1p(Math.E-1) // 1<br>Math.log1p(17.5) // ~2.92
#### Miscellaneous functions
- Other number-related functions for absolute value, sign, minimum/maximum of a set, and so on:
|Function|Description|Example|
|---|---|---|
|Math.abs(x)|Absolute value of x|Math.abs(-5.5) // 5.5<br>Math.abs(5.5) // 5.5
|Math.sign(x)|Sign of x: -1 if x is negative, 1 if positive, 0 if zero|Math.sign(-10.5) // -1<br>Math.sign(6.77) // 1
|Math.ceil(x)|Ceiling of x: the smallest integer greater than or equal to x|Math.ceil(2.2) // 3<br>Math.ceil(-3.8) // -3
|Math.floor(x)|Floor of x: the largest integer less than or equal to x|Math.floor(2.8) // 2<br>Math.floor(-3.2) // -4
|Math.trunc(x)|Truncation of x: the integer part, with all fractional digits removed|Math.trunc(7.7) // 7<br>Math.trunc(-5.8) // -5
|Math.round(x)|x rounded to the nearest integer|Math.round(7.2) // 7<br>Math.round(7.7) // 8<br>Math.round(-7.7) // -8<br>Math.round(-7.2) // -7
|Math.min(x1, x2, ...)|Minimum of the arguments|Math.min(1, 2) // 1<br>Math.min(3, 0.5, -0.66) // -0.66
|Math.max(x1, x2, ...)|Maximum of the arguments|Math.max(1, 2) // 2<br>Math.max(3, 0.5, -0.66) // 3
#### Pseudorandom number generation
- In JavaScript, use `Math.random()` to generate pseudorandom numbers
- It returns a number greater than or equal to 0 and less than 1
- In algebra, the range from x to y inclusive is written [x, y], and x exclusive to y exclusive as (x, y)
- In that notation, `Math.random()` returns values in `[0, 1)`
- Widely used formulas for other ranges:
|Range|Example|
|---|---|
|0 (inclusive) to 1 (exclusive)|Math.random()
|x (inclusive) to y (exclusive)|x + (y-x)*Math.random()
|Integer from m (inclusive) to n (exclusive)|m + Math.floor((n-m)*Math.random())
|Integer from m to n (both inclusive)|m + Math.floor((n-m+1)*Math.random())
- One drawback of JavaScript's pseudorandom generator is that it cannot take a seed value
- If you need seeded pseudorandom numbers, see David Bau's [seedrandom.js](https://github.com/davidbau/seedrandom) package
## Trigonometric functions
- Sine, cosine, tangent, arcsine, arccosine, arctangent
- All of JavaScript's trigonometric functions work in radians
|Function|Description|Example|
|---|---|---|
|Math.sin(x)|Sine of x|Math.sin(Math.PI/2) // 1<br>Math.sin(Math.PI/4) // ~0.707
|Math.cos(x)|Cosine of x|Math.cos(Math.PI) // -1<br>Math.cos(Math.PI/4) // ~0.707
|Math.tan(x)|Tangent of x|Math.tan(Math.PI/4) // ~1<br>Math.tan(0) // 0
|Math.asin(x)|Arcsine of x (result in radians)|Math.asin(0) // 0<br>Math.asin(Math.SQRT1_2) // ~0.785
|Math.acos(x)|Arccosine of x (result in radians)|Math.acos(0) // ~1.571<br>Math.acos(Math.SQRT1_2) // ~0.785
|Math.atan(x)|Arctangent of x (result in radians)|Math.atan(0) // 0<br>Math.atan(Math.SQRT1_2) // ~0.615
|Math.atan2(y,x)|Counterclockwise angle, in radians, from the x-axis to the point (x,y)|Math.atan2(0, 1) // 0<br>Math.atan2(1, 1) // ~0.785
- Because the arguments cannot be degrees, you must convert degrees to radians
- The conversion is simply dividing by 180 and multiplying by pi
- Helper functions are easy to write:
```javascript
function deg2rad(d) { return d/180*Math.PI; }
function rad2deg(r) { return r/Math.PI*180; }
```
## Hyperbolic functions
|Function|Description|Example|
|---|---|---|
|Math.sinh(x)|Hyperbolic sine of x|Math.sinh(0) // 0<br>Math.sinh(1) // ~1.18
|Math.cosh(x)|Hyperbolic cosine of x|Math.cosh(0) // 1<br>Math.cosh(1) // ~1.54
|Math.tanh(x)|Hyperbolic tangent of x|Math.tanh(0) // 0<br>Math.tanh(1) // ~0.762
|Math.asinh(x)|Hyperbolic arcsine of x|Math.asinh(0) // 0<br>Math.asinh(1) // ~0.881
|Math.acosh(x)|Hyperbolic arccosine of x|Math.acosh(0) // NaN<br>Math.acosh(1) // 0
|Math.atanh(x)|Hyperbolic arctangent of x|Math.atanh(0) // 0<br>Math.atanh(0.5) // ~0.549
# C/C++ for Visual Studio Code
C/C++ support for Visual Studio Code is provided by a [Microsoft C/C++ extension](https://marketplace.visualstudio.com/items?itemName=ms-vscode.cpptools) to enable cross-platform C and C++ development on Windows, Linux, and macOS.

## Install the extension
1. Open VS Code.
2. Select the Extensions view icon on the Activity bar or use the keyboard shortcut (`kb(workbench.view.extensions)`).
3. Search for `'C++'`.
4. Select **Install**.

After you install the extension, when you open or create a `*.cpp` file, you will have syntax highlighting (colorization), smart completions and hovers (IntelliSense), and error checking.

## Install a compiler
C++ is a compiled language, meaning your program's source code must be translated (compiled) before it can be run on your computer. VS Code is first and foremost an editor, and relies on command-line tools to do much of the development workflow. The C/C++ extension does not include a C++ compiler or debugger. You will need to install these tools or use those already installed on your computer.
There may already be a C++ compiler and debugger provided by your academic or work development environment. Check with your instructors or colleagues for guidance on installing the recommended C++ toolset (compiler, debugger, project system, linter).
Some platforms, such as Linux or macOS, have a C++ compiler already installed. Most Linux distributions have the [GNU Compiler Collection](https://wikipedia.org/wiki/GNU_Compiler_Collection) (GCC) installed and macOS users can get the [Clang](https://wikipedia.org/wiki/Clang) tools with [Xcode](https://developer.apple.com/xcode/).
### Check if you have a compiler installed
Make sure your compiler executable is in your platform path (`%PATH%` on Windows, `$PATH` on Linux and macOS) so that the C/C++ extension can find it. You can check availability of your C++ tools by opening the Integrated Terminal (`kb(workbench.action.terminal.toggleTerminal)`) in VS Code and trying to directly run the compiler.
Checking for the GCC compiler `g++`:
```bash
g++ --version
```
Checking for the Clang compiler `clang`:
```bash
clang --version
```
> **Note**: If you would prefer a full Integrated Development Environment (IDE), with built-in compilation, debugging, and project templates (File > New Project), there are many options available, such as the [Visual Studio Community](https://visualstudio.microsoft.com/vs/community) edition.
If you don’t have a compiler installed, in the example below, we describe how to install the Minimalist GNU for Windows (MinGW) C++ tools (compiler and debugger). MinGW is a popular, free toolset for Windows. If you are running VS Code on another platform, you can read the [C++ tutorials](#tutorials), which cover C++ configurations for Linux and macOS.
## Example: Install MinGW-x64
We will install Mingw-w64 via [MSYS2](https://www.msys2.org/), which provides up-to-date native builds of GCC, Mingw-w64, and other helpful C++ tools and libraries. [Click here](https://github.com/msys2/msys2-installer/releases/download/2021-06-04/msys2-x86_64-20210604.exe) to download the MSYS2 installer. Then follow the instructions on the [MSYS2 website](https://www.msys2.org/) to install Mingw-w64.
### Add the MinGW compiler to your path
Add the path to your Mingw-w64 `bin` folder to the Windows `PATH` environment variable by using the following steps:
1. In the Windows search bar, type ‘settings’ to open your Windows Settings.
2. Search for **Edit environment variables for your account**.
3. Choose the `Path` variable and then select **Edit**.
4. Select **New** and add the Mingw-w64 destination folder path, with `\mingw64\bin` appended, to the system path. The exact path depends on which version of Mingw-w64 you have installed and where you installed it. If you used the settings above to install Mingw-w64, then add this to the path: `C:\msys64\mingw64\bin`.
5. Select **OK** to save the updated PATH. You will need to reopen any console windows for the new PATH location to be available.
### Check your MinGW installation
To check that your Mingw-w64 tools are correctly installed and available, open a **new** Command Prompt and type:
```bash
g++ --version
gdb --version
```
If you don’t see the expected output or `g++` or `gdb` is not a recognized command, make sure your PATH entry matches the Mingw-w64 binary location where the compiler tools are located.
## Hello World
To make sure the compiler is installed and configured correctly, we’ll create the simplest Hello World C++ program.
Create a folder called “HelloWorld” and open VS Code in that folder (`code .` opens VS Code in the current folder):
```bash
mkdir HelloWorld
cd HelloWorld
code .
```
Now create a new file called `helloworld.cpp` with the **New File** button in the File Explorer or **File** > **New File** command.


### Add Hello World source code
Now paste in this source code:
```cpp
#include <iostream>
using namespace std;

int main()
{
    cout << "Hello World" << endl;
}
```
Now press `kb(workbench.action.files.save)` to save the file. You can also enable [Auto Save](/docs/editor/codebasics.md#saveauto-save) to automatically save your file changes, by checking **Auto Save** in the main **File** menu.
### Build Hello World
Now that we have a simple C++ program, let’s build it. Select the **Terminal** > **Run Build Task** command (`kb(workbench.action.tasks.build)`) from the main menu.

This will display a dropdown with various compiler task options. If you are using a GCC toolset like MinGW, you would choose **C/C++: g++.exe build active file**.

This will compile `helloworld.cpp` and create an executable file called `helloworld.exe`, which will appear in the File Explorer.

### Run Hello World
From a command prompt or a new VS Code Integrated Terminal, you can now run your program by typing `.\helloworld`.

If everything is set up correctly, you should see the output “Hello World”.
This has been a very simple example to help you get started with C++ development in VS Code. The next step is to try one of the tutorials listed below on your platform (Windows, Linux, or macOS) with your preferred toolset (GCC, Clang, Microsoft C++) and learn more about the Microsoft C/C++ extension’s language features such as IntelliSense, code navigation, build configuration, and debugging.
## Tutorials
Get started with C++ and VS Code with tutorials for your environment:
- [GCC on Windows via MinGW](/docs/cpp/config-mingw.md)
- [Microsoft C++ on Windows](/docs/cpp/config-msvc.md)
- [GCC on Linux](/docs/cpp/config-linux.md)
- [GCC on Windows Subsystem For Linux](/docs/cpp/config-wsl.md)
- [Clang/LLVM on macOS](/docs/cpp/config-clang-mac.md)
- [CMake Tools on Linux](/docs/cpp/cmake-linux.md)
## Documentation
You can find more documentation on using the Microsoft C/C++ extension under the [C++ section](/docs/cpp) of the VS Code website, where you’ll find topics on:
- [Debugging](/docs/cpp/cpp-debug.md)
- [Editing](/docs/cpp/cpp-ide.md)
- [Settings](/docs/cpp/customize-default-settings-cpp.md)
- [FAQ](/docs/cpp/faq-cpp.md)

## Remote Development
VS Code and the C++ extension support [Remote Development](/docs/remote/remote-overview.md) allowing you to work over SSH on a remote machine or VM, inside a Docker container, or in the [Windows Subsystem for Linux](https://docs.microsoft.com/windows/wsl) (WSL).
To install support for Remote Development:
1. Install the VS Code [Remote Development Extension Pack](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.vscode-remote-extensionpack).
2. If the remote source files are hosted in WSL, use the **Remote - WSL** extension.
3. If you are connecting to a remote machine with SSH, use the **Remote - SSH** extension.
4. If the remote source files are hosted in a container (for example, Docker), use the **Remote - Containers** extension.
## Feedback
If you run into any issues or have suggestions for the Microsoft C/C++ extension, please file [issues and suggestions on GitHub](https://github.com/microsoft/vscode-cpptools/issues). If you haven’t already provided feedback, please take this [quick survey](https://www.research.net/r/VBVV6C6) to help shape this extension for your needs.
### [CVE-2018-11509](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11509)



### Description
ASUSTOR ADM 3.1.0.RFQ3 uses the same default root:admin username and password as it does for the NAS itself for applications that are installed from the online repository. This may allow an attacker to login and upload a webshell.
### POC
#### Reference
- http://packetstormsecurity.com/files/148919/ASUSTOR-NAS-ADM-3.1.0-Remote-Command-Execution-SQL-Injection.html
#### Github
No GitHub POC found.
---
layout: post
title: "生日志"
postid: post090717
date: 2009-07-17 22:23:13
tags: [Essay]
categories: [Life]
published: False
comments: True
---
On this day last year
I was preparing for this year
<!--more-->
On this day this year
I am preparing for tomorrow
Last year: regret and loss
This year: ease and joy
Last year, even the sunshine felt bleak
This year, even the heavy rain feels warm
The overlap of yin and yang
Seems to let me be reborn today
---
title: The Impact of a Changed Life
date: 07/07/2020
---
`4. What was the reaction of the members of the Sanhedrin to the testimony given by Peter and John, and what did they realize? Acts 4:13`
The New Testament church experienced explosive growth. Three thousand people were baptized on the Day of Pentecost (Acts 2:41), and a few weeks later thousands more were added to the church (Acts 4:4). The authorities soon realized what was happening. These believers had been with Christ. They had been transformed by His grace and could no longer keep silent.
`5. What happened when the authorities tried to silence Peter and John? What was their response? Acts 4:1-20`
These believers were new creatures in Christ and had to tell their story. Peter, a loud-mouthed fisherman, was transformed by the grace of God. James and John, the "Sons of Thunder," who had impulsive temperaments, were transformed by the grace of God. Doubting Thomas was also transformed by the grace of God. The disciples and the members of the early church each had their own story to tell, and they could not keep silent. Note this statement by Ellen G. White in the book Steps to Christ: "No sooner does one come to Christ than there is born in his heart a desire to make known to others what a precious friend he has found in Jesus; the saving and sanctifying truth cannot be shut up in his heart" (p. 78).
Notice also what the religious leaders said (v. 16). They openly acknowledged the reality of the miracle that had been performed; the healed man was standing right there in front of them, yet they refused to change their attitude. Nevertheless, despite this open defiance, Peter and John had no intention of retracting their testimony.
`What is the relationship between knowing Christ and telling others about Him? Why is a personal knowledge of Jesus so important in order to be able to witness about Him?`
---
name: Lucy Knight
pronoun: She/Her
title: Lead Data Scientist
company: Food Standards Agency
talk-title: Track Host for Track 3
headshot: /assets/images/headshots/head-lucy-knight.jpg
track: "3"
sortoverride: 1
timeslot: "10.00 - 17.00"
type: Track Host
level:
twitter:
- jargonautical
# linkedin:
takeaways:
# - Item 1
---
We're proud to have Lucy return to the conference, not as a speaker this time but as a track host, looking after the DEVELOP track.
{: .notice--info}
<h3>Bio</h3>
Lucy is Lead Data Scientist at the Food Standards Agency, and cofounder of the Open Data Institute node ODI Devon and tech start-up The Data Pace.
She worked first in fine arts and then in opto-electronics manufacturing, where she developed an interest in data analysis and information management, moving into public sector performance and policy management in 2001. She held various roles at Devon County Council, including Open Data Lead, before moving to the Civil Service in 2018 to take on her current post.
Building on her experience of working at the extreme ends of the technical/creative spectrum, she advocates for better communication between technical and non-technical groups and the importance of making technological advances both useful and accessible. Lucy is a regular facilitator and speaker at open data and transformation events, conferences and unconferences across the country.
---
title: <add> of <commonParameters>
ms.date: 03/30/2017
ms.assetid: 3713bf25-20c8-455f-bb85-de46b6487932
ms.openlocfilehash: d682acd7fff6bab2c66660a028f8a75b780e21d2
ms.sourcegitcommit: 093571de904fc7979e85ef3c048547d0accb1d8a
ms.translationtype: MT
ms.contentlocale: fr-FR
ms.lasthandoff: 09/06/2019
ms.locfileid: "70400668"
---
# <a name="add-of-commonparameters"></a>\<Ajouter > de \<la > paramètres_courants
Spécifie une paire nom-valeur de paramètres utilisés globalement dans plusieurs services. Ce paramètre inclut généralement la chaîne de connexion de base de données pouvant être partagée par les services fiables.
[ **\<configuration>** ](../configuration-element.md)\
[ **\<System. serviceModel >** ](system-servicemodel.md)\
[ **\<comportements >** ](behaviors.md)\
[ **\<serviceBehaviors >** ](servicebehaviors.md)\
[ **\<> de comportement**](behavior-of-servicebehaviors.md)\
[ **\<> workflowRuntime**](workflowruntime.md)\
[ **\<> Paramètres_courants**](commonparameters.md)\
**\<Ajouter >**
## <a name="syntax"></a>Syntaxe
```xml
<workflowRuntime>
<commonParameters>
<add name="String" value="String" />
</commonParameters>
</workflowRuntime>
```
## <a name="attributes-and-elements"></a>Attributs et éléments
Les sections suivantes décrivent des attributs, des éléments enfants et des éléments parents.
### <a name="attributes"></a>Attributs
|Attribut|Description|
|---------------|-----------------|
|name|Nom du paramètre spécifié pour un service.|
|value|Valeur du paramètre spécifié pour un service.|
### <a name="child-elements"></a>Éléments enfants
Aucun.
### <a name="parent-elements"></a>Éléments parents
|Élément|Description|
|-------------|-----------------|
|[\<commonParameters>](commonparameters.md)|Collection de paramètres communs utilisée par les services. Cette collection inclut généralement la chaîne de connexion de base de données pouvant être partagée par les services fiables.|
## <a name="remarks"></a>Notes
L'élément `<commonParameters>` définit tous les paramètres utilisés globalement dans plusieurs services, par exemple `ConnectionString` lors de l'utilisation de <xref:System.Workflow.Runtime.Hosting.SharedConnectionWorkflowCommitWorkBatchService>.
Pour les services qui valident des lots de travail dans des magasins de persistance, comme <xref:System.Workflow.Runtime.Hosting.DefaultWorkflowCommitWorkBatchService> et <xref:System.Workflow.Runtime.Hosting.SqlWorkflowPersistenceService>, vous pouvez les activer pour effectuer de nouvelles tentatives de transaction à l'aide du paramètre `EnableRetries` tel qu'indiqué dans l'exemple suivant :
```xml
<workflowRuntime name="SampleApplication"
unloadOnIdle="false">
<commonParameters>
<add name="ConnectionString"
value="Initial Catalog=WorkflowStore;Data Source=localhost;Integrated Security=SSPI;" />
<add name="EnableRetries"
value="True" />
</commonParameters>
<services>
<add type="System.Workflow.Runtime.Hosting.SqlWorkflowPersistenceService, System.Workflow.Runtime, Version=3.0.00000.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
enableRetries="False" />
</services>
</workflowRuntime>
```
Note that the `EnableRetries` parameter can be set either at a global level (as shown in the *commonParameters* section) or for individual services that support `EnableRetries` (as shown in the *services* section).
For more information about using a configuration file to control the behavior of a <xref:System.Workflow.Runtime.WorkflowRuntime> object of a Windows Workflow Foundation host application, see [Workflow Configuration Files](https://docs.microsoft.com/previous-versions/dotnet/netframework-3.5/ms732240(v=vs.90)).
## <a name="example"></a>Exemple
```xml
<commonParameters>
<add name="ConnectionString"
value="Initial Catalog=WorkflowStore;Data Source=localhost;Integrated Security=SSPI;" />
<add name="EnableRetries"
value="true" />
</commonParameters>
```
## <a name="see-also"></a>Voir aussi
- <xref:System.ServiceModel.Configuration.WorkflowRuntimeElement>
- <xref:System.Workflow.Runtime.Configuration.WorkflowRuntimeServiceElement>
- <xref:System.Workflow.Runtime.WorkflowRuntime>
- <xref:System.Workflow.Runtime.Hosting.DefaultWorkflowCommitWorkBatchService>
- <xref:System.Workflow.Runtime.Hosting.SqlWorkflowPersistenceService>
- [Workflow Configuration Files](https://docs.microsoft.com/previous-versions/dotnet/netframework-3.5/ms732240(v=vs.90))
- [\<commonParameters>](commonparameters.md)
# Sanity Check Procedures
Sanity check procedures are the steps a system administrator takes to verify that an installation is ready to be tested. They are therefore a preliminary set of tests to ensure that obvious or basic malfunctions have been fixed before moving on to unit tests, integration tests, and user validation.
## End-to-End Testing
Verify that the user interface is working:
1. Verify that the IdM host address can be reached. By default, web access shows the login page.
2. Acquire a valid username and password, and access with those credentials. The web page shown after credential validation is the landing page of the IdM KeyRock Portal.
3. Verify that you can view the list of applications, organizations, and so on.
Verify that the API is working:
1. Request an API token as described in the [apiary](https://keyrock.docs.apiary.io/#reference/keyrock-api/authentication/create-token-with-password-method) documentation.
2. Verify that you can retrieve the list of applications, organizations, and so on. For example, you can check the path as follows:
```bash
curl --include \
--header "X-Auth-token: <api_token>" \
'http://idm-portal:3000/v1'
```
## Unit Testing
You can also check whether Keyrock is working correctly by running the unit tests. To do so, follow these steps:
1\. Under the test directory there is a config_test.js.template file with the default settings for running tests. This configuration file is used to create a test database (called idm_test) that is deleted after all the tests have run. If you already have a config.js file in the root directory and want to keep that configuration, first save the file to another directory. Then either overwrite it with the following shell command, or simply change your config.js file to the values of config_test.js.template:
```bash
cp test/config_test.js.template config.js
```
2\. Once the configuration file has been copied, you can run all the tests as follows:
```bash
npm run test
```
2.1\. You can also run individual tests if needed:
```bash
npm run test:single test/unit/<path_to_file_test>.js
```
## List of Running Processes
If you used forever, you can run the following command to check the state of the process:
```bash
forever status
```
## Network Interfaces Up & Open
If you are running the server with HTTPS enabled, TCP port 443 must be reachable from a web browser in order to load the IdM Portal.
## Databases
If the database was populated correctly when the GE was installed, the connection with it is established.
The databases and tables needed are the following:
**TABLES**
| table_names | table_rows |
| --------------------------- | ---------- |
| SequelizeMeta | 30 |
| auth_token | 4 |
| authzforce | 0 |
| eidas_credentials | 0 |
| IoT | 2 |
| oauth_access_token | 9 |
| oauth_authorization_code | 0 |
| oauth_client | 3 |
| oauth_refresh_token | 8 |
| oauth_scope | 0 |
| organization | 0 |
| pep_proxy | 1 |
| permission | 6 |
| role | 2 |
| role_assignment | 6 |
| role_permission | 7 |
| trusted_application | 0 |
| user | 3 |
| user_authorized_application | 1 |
| user_organization | 0 |
| user_registration_profile | 0 |
# Diagnosis Procedures
Diagnosis procedures are the first steps a system administrator takes to locate the source of an error in the GE. Once the nature of the error has been identified by these tests, the system administrator has to resort to more concrete and specific tests to pinpoint the exact point of failure and a possible solution. Such specific tests are beyond the scope of this section.
## Resource Availability
Verify that 2.5 MB of disk space is left, using the UNIX command `df`.
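For example, a sketch of the disk-space check with `df` (which mount point matters depends on where your data lives):

```shell
df -h /                           # human-readable free space on the root filesystem
df -k / | awk 'NR==2 {print $4}'  # available space in KiB as a bare number
```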
## Remote Service Access
Verify that port 443 is accessible.
## Resource Consumption
Typical memory consumption is 100 MB, and it consumes almost 1% of a 2 GHz CPU core, although this depends on user requests.
## I/O Flows
Clients access the KeyRock interface through the client's web browser. This is plain HTTP traffic. The service makes requests to the local database.
# random game name
[](https://travis-ci.org/v801/random-game-name)
[](https://david-dm.org/v801/random-game-name)
[](http://unlicense.org)
> Get a random ass video game name
>
> Inspired by [videogamena.me](http://videogamena.me)
The names are generated from a [JSON file](videoGameList.json).
## Install
```
$ npm install --save v801/random-game-name
```
## Usage
```js
const randomGameName = require('random-game-name')
randomGameName()
//=> 'Inept Caveman Overload'
```
## API
### randomGameName()
Type: `function`
Random video game name.
## CLI
```
$ npm install -g v801/random-game-name
```
```
$ random-game-name --help
Get a random video game name
usage
$ random-game-name
Options
--random, -r Get a random video game name
Examples
$ random-game-name
Inept Caveman Overload
```
## [Unlicense](unlicense)
# raspiblitz_openvpn
"CC-BY-4.0",
"MIT"
] | 1 | 2021-01-14T07:24:28.000Z | 2021-01-14T07:24:28.000Z | ---
title: "Compiler Error CS0081"
ms.date: "2015-07-20"
ms.prod: .net
ms.technology:
- "devlang-csharp"
ms.topic: "article"
f1_keywords:
- "CS0081"
dev_langs:
- "CSharp"
helpviewer_keywords:
- "CS0081"
ms.assetid: a5649abc-89ea-4f64-8c3c-eb36df926561
caps.latest.revision: 9
author: "BillWagner"
ms.author: "wiwagn"
translation.priority.ht:
- "de-de"
- "es-es"
- "fr-fr"
- "it-it"
- "ja-jp"
- "ko-kr"
- "ru-ru"
- "zh-cn"
- "zh-tw"
translation.priority.mt:
- "cs-cz"
- "pl-pl"
- "pt-br"
- "tr-tr"
---
# Compiler Error CS0081
Type parameter declaration must be an identifier not a type
When you declare a generic method or type, specify the type parameter as an identifier, for example "T" or "inputType". When client code calls the method, it supplies the type, which replaces each occurrence of the identifier in the method or class body. For more information, see [Generic Type Parameters](../../csharp/programming-guide/generics/generic-type-parameters.md).
```
// CS0081.cs
class MyClass
{
public void F<int>() {} // CS0081
public void F<T>(T input) {} // OK
public static void Main()
{
MyClass a = new MyClass();
a.F<int>(2);
a.F<double>(.05);
}
}
```
## See Also
[Generics](../../csharp/programming-guide/generics/index.md)
| 21.935484 | 378 | 0.631618 | eng_Latn | 0.745878 |
ff2e45a0c3bff91ad697d939d7c6d25ff43fde4c | 13,600 | md | Markdown | README.md | radshop/raspiblitz_openvpn | d748871d674ba679a16fcd77a6883d2f718cbf00 | [
"MIT"
] | 1 | 2021-09-11T17:11:31.000Z | 2021-09-11T17:11:31.000Z | README.md | radshop/raspiblitz_openvpn | d748871d674ba679a16fcd77a6883d2f718cbf00 | [
"MIT"
] | null | null | null | README.md | radshop/raspiblitz_openvpn | d748871d674ba679a16fcd77a6883d2f718cbf00 | [
"MIT"
] | null | null | null | # raspiblitz_openvpn
Scripts to install and configure a clean Ubuntu 20.04 VPS as an OpenVPN gateway for Raspiblitz and install the client certificate.
## Acknowledgments
1. [Rootzoll](https://github.com/rootzoll) has done the hard part with the [Raspiblitz](https://github.com/rootzoll/raspiblitz) project - this is just a minor addition.
2. The [Podcast Index](https://podcastindex.org) is enabling Podcasting 2.0 to send micropayments over the Lightning Network for content creators. They inspired me to get involved in this whole deal.
## Assumptions, Warnings, Notes, Misc.
1. These scripts were developed and tested using a brand new Ubunutu 20.04 Server VPS.
1. If you run these scripts on a server that is already configured, there's no telling what you might break.
2. If you use any other version of Linux, you are in uncharted territory.
2. These scripts are intended to create a VPN connection for a single Raspiblitz to connect to the Internet without using Tor and without exposing your home IP address. Because we map the necessary ports from the VPN to the Raspiblitz, it's really only suitable as a single-purpose VPN unless you are an expert in how to reconfigure for additional uses.
2. Get your Raspiblitz installed, configured, and confirmed to be working before you try to connect it to a VPN. Otherwise if you have any problems, you will not know if it's the VPN connection or your Raspiblitz configuration.
3. Recommended hosting providers:
1. The first time I set up an OpenVPN server, I used a guide published by [Digital Ocean](https://digitalocean.com). I've since developed my own set of scripts and procedures based on what I learned from them and others. Since they helped me and now I'm helping you, it might be cool if you got one of their lowest tier droplets to run your VPN.
2. I also am a fan of [Linode](https://linode.com).
3. And there are many others to choose from. Just make sure you are a getting a plain-vanilla Ubuntu 20.04 LTS server, nothing preconfigured. Otherwise these scripts might not work.
4. This project creates the CA (certificate authority) key on the same server where we will be running our VPN. Normally this is not considered a good security practice. The reason it's okay in this case is that we are setting up this VPN to have only 1 client - Raspiblitz. So once we've generated that client certificate, we make an offline backup of the CA key and delete it from the server.
1. If you want to create certificates for multiple clients to use your VPN, you should not use these scripts - set up a standalone CA server. Instructions for how to do that are outside the scope of this project.
5. The most reliable way to connect from a home-based Raspberry Pi to the hosted VPN is with UDP port 1194 - that's the standard port.
1. If you know what you are doing and want to change that configuration, it's not a big deal.
2. It's exceedingly rare for home or business ISPs to block UDP port 1194 - if they did, no one would be able to work from home over a VPN on that port. If you are having trouble with connectivity, don't assume that's the cause unless you check with your ISP first.
6. This is a "quick & dirty" way for me to get the scripts and procedure out to anyone who can benefit. I haven't tried to address all the ways this could be optimized. I have tested these scripts and this procedure multiple times, and it definitely works flawlessly for me on my VPS provider and my Raspiblitz on my home network. If it doesn't work for you, I have limited resources to help you, but I'll try.
---
## Installation Procedure
### Step 0: Preliminary Server Setup
#### Step 0.a Basics
1. Starting with a plain-vanilla Ubuntu 20.04 LTS Server VPS, complete the following before you are ready to run the scripts. (Instructions for these are not in the scope of this document - Digital Ocean, Linode, and others have great guides.)
1. Create a new user account (not root) and add that user to the sudoers. All commands run in this procedure should be run with that user account, not root.
2. Add your SSH public key to the new user account so you can connect without a password.
3. I recommend that you disable root login and password login so that only valid non-root SSH access is possible. That's the best way to secure your server.
4. If your setup instructions include configuring the UFW firewall, I recommend that you enable port 22 (SSH) only at this point. The setup scripts include UFW configuration, so the less you do in advance the better to avoid any conflicts.
5. Setting the hostname and timezone are good practices, but not essential.
1. If you want the hostname to match the certificate name we are generating (which doesn't matter in practice but can avoid confusion for you), you can execute `sudo hostnamectl set-hostname lvpn`. The new hostname will show next time you log in.
2. Clone this git repository locally. It doesn't matter which method you use:
1. If your SSH private key is on the server and the corresponding public key is added to your Github account: `git clone [email protected]:radshop/raspiblitz_openvpn.git`
2. Otherwise `git clone https://github.com/radshop/raspiblitz_openvpn.git`
3. Change directories to the root of the git repository (e.g. `cd ~/raspiblitz_openvpn` if that's where you cloned it). Unless stated otherwise, all commands run on the server assume that you are in the root of the git repo.
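For orientation, the account-hardening portion of item 1 above usually boils down to something like the following. This is a hedged sketch rather than part of the repository's scripts: `alice` is a placeholder username, the `sshd_config` edits are done by hand, and your hosting provider's own guide takes precedence.

```shell
# Run as root on the fresh VPS. "alice" is a placeholder - pick your own name.
adduser alice
usermod -aG sudo alice

# Install your SSH public key for the new user:
mkdir -p /home/alice/.ssh
# (paste your public key into /home/alice/.ssh/authorized_keys here)
chown -R alice:alice /home/alice/.ssh
chmod 700 /home/alice/.ssh

# In /etc/ssh/sshd_config set "PermitRootLogin no" and
# "PasswordAuthentication no", then restart the daemon:
systemctl restart ssh

# Open only SSH in the firewall for now, as recommended above:
ufw allow 22/tcp
ufw enable
```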
#### Step 0.b Setting Configuration Parameters
In this initial release, I have not parameterized the scripts for any local values. I may do that in a later release, but for now you just need to make a couple of simple changes to the files for your unique environment.
1. Add your IP address to the base configuration. For this you will need the public IP address of your server (provided by your hosting service).
1. Edit base.conf (I use vim, but nano is more comfortable for many, especially beginners): `nano files/base.conf`
2. On line 2, replace the word `my-server` with the IP address of your server and save the file.
2. We need to confirm that your default network interface is `eth0`, which should almost always be the case for a hosted VPS.
1. Run the command `ip address`, which will list the network interfaces and their IP addresses. You are looking for the IPv4 address, which will be preceded by the word `inet`.
2. If your public IP address is bound to `eth0`, then you are good to go.
3. If your public IP is bound to a different interface, then you need to update the UFW before rules:
1. Edit before.rules: `nano files/before.rules`
2. In lines 19, 20, and 23 replace all references to `eth0` with the identifier of your default interface. Then save the file.
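To make the two edits above concrete, here is a minimal, self-contained sketch. It fabricates a stand-in `files/base.conf` purely so the substitution can be demonstrated end to end; on the real repository you would run only the `sed` line against the existing file, and `203.0.113.10` is a placeholder for your server's actual public IP.

```shell
# Stand-in for the repo's files/base.conf (assumes line 2 reads
# "remote my-server 1194", as in the stock OpenVPN client config):
mkdir -p files
printf 'client\nremote my-server 1194\n' > files/base.conf

# The actual edit: swap the placeholder hostname for your public IP.
sed -i 's/my-server/203.0.113.10/' files/base.conf

grep remote files/base.conf   # → remote 203.0.113.10 1194
```

The same pattern works for the interface rename in `files/before.rules` (e.g. `sed -i 's/eth0/ens3/g' files/before.rules`, where `ens3` is a hypothetical interface name - use whatever `ip address` reported on your server).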
### Step 1: Server Readiness for Generating Certificates and OpenVPN
This command must be run as sudo:
`sudo ./01_sudo_readyserver.sh`
The script should run from beginning to end without the need for any intervention.
### Step 2: Generate Server Certificates
This command must NOT be run as sudo - just local user privileges:
`./02_servercertificates.sh`
This script will need some interaction:
1. `Enter` to accept the default CA Common Name - there's no reason to change it.
2. `Enter` to accept the Host Common Name - if you change it, you will break subsequent steps of this procedure.
3. Confirm by typing `yes` at the prompt and `Enter`
### Step 3: Configure the VPN Server
This command must be run as sudo:
`sudo ./03_sudo_configurevpn.sh`
The script should run from beginning to end without the need for any intervention. At the end it outputs the status of the VPN server, which should include a line that starts with `Active: active (running)`
### Step 4: Generate the Client Certificate
This command must NOT be run as sudo - just local user privileges:
`./04_client_certificate.sh`
This script will need some interaction:
1. `Enter` to accept the Host Common Name - if you change it, you will break subsequent steps of this procedure.
2. Confirm by typing `yes` at the prompt and `Enter`
### Step 5: Copy the Client Certificate to Raspiblitz
The client certificate is generated on the server at `~/client-configs/files/raspiblitz.ovpn`. You need to move this file to your Raspiblitz. The destination for the file is `/home/admin/raspiblitz.ovpn`.
Less experienced users might find it surprisingly difficult to move the file using command-line utilities. If you know how to use `scp` or PuTTY's PSFTP, you can copy the file that way. But one of the most straightforward ways is simply text copy/paste, as follows. It requires SSH connections to both the VPN server and the Raspiblitz server. (I don't use an LCD on my Raspiblitz, just SSH. I don't know if there's a way to do this without SSH, but I doubt it.)
1. If your terminal supports right-click copy/paste, great. If not, find out which key sequences your terminal uses for copy and paste (e.g. CTRL-SHIFT-C/CTRL-SHIFT-V, CTRL-INSERT/SHIFT-INSERT, or something else). You need to know that before you can proceed.
2. On the VPN server, output the certificate cleanly to the terminal with `clear && cat ~/client-configs/files/raspiblitz.ovpn`. Then select and copy the entire certificate. You must get the whole thing. If scrolling to select for copying is not supported or is difficult on your terminal, you can use `less ~/client-configs/files/raspiblitz.ovpn` and copy it in sections - just be really sure not to miss or duplicate any line.
3. It can be helpful to paste into a desktop text editor and review the contents to make sure everything is right. Or you can go straight to your Raspiblitz.
4. On the Raspiblitz, over ssh as the admin user, use vim or nano to create the empty file: `nano ~/raspiblitz.ovpn`
5. Paste all of the certificate into that file and save.
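For readers comfortable with `scp`, a hedged two-hop sketch run from your desktop (the names in angle brackets are placeholders for your environment, and this assumes key-based SSH access to both machines):

```shell
scp <youruser>@<vpn-server-ip>:client-configs/files/raspiblitz.ovpn .
scp ./raspiblitz.ovpn admin@<raspiblitz-ip>:raspiblitz.ovpn
```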
### Step 6: Configure the VPN on Raspiblitz
There is no packaged script for this part - it's a series of individual steps. All these should be run on your Raspiblitz logged in via SSH as the admin user.
First install OpenVPN on the Raspiblitz: `sudo apt install openvpn`.
Next, check your network interfaces so you know what they look like before you set up your VPN. Run `ip address` and look at the result: typically `lo` for the local loopback, `eth0` for the wired network port, and `wlan0` for the wifi antenna. Most important, there is no `tun0` interface for the VPN tunnel yet.
Now we will go through a multi-step process to verify that your VPN connection is working. More advanced users can simplify this, but this is a process that should work for anyone.
#### Verification Step A: Interactive Output
Start the VPN interactively by running `sudo openvpn --config raspiblitz.ovpn`.
1. If the VPN connects successfully, the last line of the output should include `Initialization Sequence Completed` - that's what you want. There might be some warning messages, but there should be no error messages in the output.
2. Stop the VPN connection with `CTRL-c`.
#### Verification Step B: Run as a Daemon
Start the VPN as a background daemon by running `sudo openvpn --config raspiblitz.ovpn --daemon`.
1. Enter `ip address` and confirm that the new `tun0` interface is present with IP address 10.8.0.10.
2. Use an external service to confirm your public IP address: `curl https://ifconfig.me ; echo`. The result should show the IP of your VPN server.
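The two checks above can be wrapped into a tiny helper if you expect to repeat them. This is a sketch that assumes Linux with iproute2 and only inspects the local interface list; it does not replace the external `curl` check.

```shell
# Prints "up" if a tun0 interface with a 10.8.0.x address exists, else "down".
vpn_status() {
  if ip -o addr show tun0 2>/dev/null | grep -q '10\.8\.0\.'; then
    echo "up"
  else
    echo "down"
  fi
}

vpn_status
```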
#### Set VPN to Start on Boot
To make sure your VPN connection starts when the Raspiblitz reboots, we need to copy your certificate to the OpenVPN directory and change the extension to .conf.
1. Enter `sudo cp ~/raspiblitz.ovpn /etc/openvpn/raspiblitz.conf`
2. To confirm that everything is working, enter `restart` to reboot the Raspiblitz; the manually started OpenVPN daemon stops, and the connection should start automatically on boot.
3. Reconnect to the Raspiblitz over SSH once it restarts. Wait for all of the Raspiblitz services to start, then exit from the main menu to a command prompt.
4. Use `ip address` and `curl https://ifconfig.me ; echo` as you did above to confirm that the VPN is connected.
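As an alternative worth knowing about (a sketch I have not verified against this exact setup): on Debian/Ubuntu the openvpn package ships a systemd template unit, so once the file is in place as `/etc/openvpn/raspiblitz.conf` you could also manage the connection explicitly:

```shell
sudo systemctl enable openvpn@raspiblitz   # picks up /etc/openvpn/raspiblitz.conf
sudo systemctl status openvpn@raspiblitz   # check it again after the next reboot
```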
### Step 7: Server Cleanup
If you leave the CA Key in place on your server, any party that gains access will be able to create their own certificates for your VPN. You want to prevent that.
1. Make an offline copy of your server key.
1. `cat ~/easy-rsa/pki/private/ca.key` to output the contents of the key.
2. Copy the output and save it in a safe place off the server.
3. Delete the file once you have a safe copy. `rm ~/easy-rsa/pki/private/ca.key`
2. If you need to generate any certificates in the future, recreate the CA key from your offline copy. If you lose it, you will not be able to create additional client certificates for this server - you will need to regenerate all of the server and client certificates.
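A sketch of the restore, in case you ever need it. The key body below is a placeholder - paste your real saved key between the BEGIN/END markers (the exact header text may differ depending on your easy-rsa version) - and restore the restrictive permissions the file had.

```shell
# Recreate the CA key from your offline copy (placeholder contents shown).
mkdir -p ~/easy-rsa/pki/private
cat > ~/easy-rsa/pki/private/ca.key <<'EOF'
-----BEGIN PRIVATE KEY-----
(paste your saved key material here)
-----END PRIVATE KEY-----
EOF

# Private keys should be readable by the owner only.
chmod 600 ~/easy-rsa/pki/private/ca.key
```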
We also want to get rid of our temporary files just to keep things neat: `sudo rm -r /tmp/raspiblitz_openvpn/`
---
## Administration
### Port Mapping
The mapping of ports for Bitcoin (8333) and Lightning (9735) through the VPN is done by two lines in the `before.rules` file:
`-A PREROUTING -i eth0 -p tcp --dport 8333 -j DNAT --to-destination 10.8.0.10`
`-A PREROUTING -i eth0 -p tcp --dport 9735 -j DNAT --to-destination 10.8.0.10`
If you need to map ports for additional services Raspiblitz offers, just edit the rules file: `sudo nano /etc/ufw/before.rules`. Add a new line with everything identical except the port number after `--dport`.
When you are done, restart UFW to apply the change: `sudo service ufw restart`
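For example, to also forward LND's gRPC port (10009 here is an assumption - substitute whichever port your additional service actually uses), the added line would mirror the existing two exactly:

```shell
-A PREROUTING -i eth0 -p tcp --dport 10009 -j DNAT --to-destination 10.8.0.10
```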