# Python Complete Demo
Supported Python platforms include: Windows X86_CPU / macOS X86_CPU / Linux X86_CPU / Linux ARM_CPU (ARM Linux).
This chapter contains two parts: (1) [Python demo program](python_demo.html#id1); (2) [Python application development notes](python_demo.html#id6).
## Python Demo Program
All Python sample code shown in this chapter is located at [demo/python](https://github.com/PaddlePaddle/Paddle-Lite/tree/develop/lite/demo/python).
### 1. Environment setup
On the Windows X86_CPU / macOS X86_CPU / Linux X86_CPU platforms, no special environment setup is required.
On the ARM Linux platform, you need to compile Paddle-Lite; see the [documentation](../source_compile/compile_env) for environment configuration. Docker is recommended.
### 2. Install the Python inference library
The PyPI source currently provides pip packages only for the Windows X86_CPU / macOS X86_CPU / Linux X86_CPU platforms. Run the following command.
```shell
# The latest released version is 2.8
python -m pip install paddlelite==2.8
```
If you need Python inference on the ARM Linux platform, see [Compile from source (ARM Linux)](../source_compile/compile_linux) to build and install the Paddle-Lite Python package.
### 3. Prepare the model for inference deployment
(1) Download the model: download and extract the [mobilenet_v1](http://paddle-inference-dist.bj.bcebos.com/mobilenet_v1.tar.gz) model to obtain a Paddle model in non-combined form, located in the `mobilenet_v1` folder. You can open the `__model__` file in that folder with the model visualization tool [Netron](https://lutzroeder.github.io/netron/) to inspect the model structure.
```shell
wget http://paddle-inference-dist.bj.bcebos.com/mobilenet_v1.tar.gz
tar zxf mobilenet_v1.tar.gz
```
(2) Convert the model: native Paddle models must be converted by the [opt](../user_guides/model_optimize_tool) tool into the naive_buffer format supported by Paddle-Lite.
- Linux X86_CPU platform: installing paddlelite via pip provides the `paddle_lite_opt` command-line tool
```shell
paddle_lite_opt --model_dir=./mobilenet_v1 \
--optimize_out=mobilenet_v1_opt \
--optimize_out_type=naive_buffer \
--valid_targets=x86
```
- macOS X86_CPU platform: `paddle_lite_opt` is used the same way as on Linux.
- Windows X86_CPU platform: Windows does not yet support running the model converter directly from the command line, so you need to write a Python script
```python
import paddlelite.lite as lite
a=lite.Opt()
# non-combined model format
a.set_model_dir("D:\\YOU_MODEL_PATH\\mobilenet_v1")
# combined model format; adjust the model and parameter file names as needed
# a.set_model_file("D:\\YOU_MODEL_PATH\\mobilenet_v1\\__model__")
# a.set_param_file("D:\\YOU_MODEL_PATH\\mobilenet_v1\\__params__")
a.set_optimize_out("mobilenet_v1_opt")
a.set_valid_places("x86")
a.run()
```
- ARM Linux platform: write a Python script to convert the model
```python
import paddlelite.lite as lite
a=lite.Opt()
# non-combined model format
a.set_model_dir("D:\\YOU_MODEL_PATH\\mobilenet_v1")
# combined model format; adjust the model and parameter file names as needed
# a.set_model_file("D:\\YOU_MODEL_PATH\\mobilenet_v1\\__model__")
# a.set_param_file("D:\\YOU_MODEL_PATH\\mobilenet_v1\\__params__")
a.set_optimize_out("mobilenet_v1_opt")
a.set_valid_places("arm") # set the valid place to arm
a.run()
```
After the commands above succeed, an optimized model file named `mobilenet_v1_opt.nb` is generated in the same directory.
### 4. Download and run the inference demo programs
Download the demo files `mobilenetv1_light_api.py` and `mobilenetv1_full_api.py` from [demo/python](https://github.com/PaddlePaddle/Paddle-Lite/tree/develop/lite/demo/python), and run the Python inference programs.
```shell
# The light API takes the optimized model file mobilenet_v1_opt.nb as input
python mobilenetv1_light_api.py --model_dir=mobilenet_v1_opt.nb
# The full API takes the pre-optimization model folder mobilenet_v1 as input
python mobilenetv1_full_api.py --model_dir=./mobilenet_v1
# On success, the console prints output like the following
[1L, 1000L]
[0.00019130950386170298, 0.0005920541007071733, 0.00011230241216253489, 6.27333574811928e-05, 0.0001275067188544199, 0.0013214796781539917, 3.138116153422743e-05, 6.52207963867113e-05, 4.780858944286592e-05, 0.0002588215284049511]
```
## Python Application Development Notes
Calling the Paddle-Lite inference library from Python code takes only the following six steps:
(1) Set up the config
```python
from paddlelite.lite import *
import numpy as np
from PIL import Image
config = MobileConfig()
config.set_model_from_file("./mobilenet_v1_opt.nb")
```
(2) Create the predictor
```python
predictor = create_paddle_predictor(config)
```
(3) Read data from an image
```python
image = Image.open('./example.jpg')
# resize the input image into shape (224, 224)
resized_image = image.resize((224, 224), Image.BILINEAR)
# convert it from HWC to NCHW format
image_data = np.array(resized_image).transpose(2, 0, 1).reshape(1, 3, 224, 224)
```
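The `transpose(2, 0, 1)` call above is what moves the pixel data from HWC to NCHW order. As a framework-free sketch of that same rearrangement (plain Python, no NumPy; the function name is ours):

```python
def hwc_to_nchw(image, height, width, channels):
    """Flatten a nested H x W x C list into N x C x H x W order (N = 1).

    image[h][w][c] holds one channel value of one pixel; the output lists
    every value of channel 0 first, then channel 1, and so on.
    """
    out = []
    for c in range(channels):
        for h in range(height):
            for w in range(width):
                out.append(image[h][w][c])
    return out  # length = 1 * channels * height * width


# A 2x2 "RGB image": each pixel is [R, G, B]
pixels = [
    [[1, 2, 3], [4, 5, 6]],
    [[7, 8, 9], [10, 11, 12]],
]
print(hwc_to_nchw(pixels, 2, 2, 3))
# → [1, 4, 7, 10, 2, 5, 8, 11, 3, 6, 9, 12]
```

All R values come out first, then the G values, then the B values, which is the per-channel layout the predictor expects for a (1, 3, 224, 224) input.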
(4) Set the input data
```python
input_tensor = predictor.get_input(0)
input_tensor.from_numpy(image_data)
```
(5) Run inference
```python
predictor.run()
```
(6) Get the output data
```python
output_tensor = predictor.get_output(0)
print(output_tensor.shape())
print(output_tensor.numpy())
```
Detailed Python API documentation is available at [Python API](../api_reference/python_api_doc). For more Python inference application development, refer to the sample project code in [Paddle-Lite-Demo](https://github.com/PaddlePaddle/Paddle-Lite-Demo).
# Advent of Code 2021
## Julia
- [Day 1: Sonar Sweep](https://ljk233.github.io/AdventOfCode2021/d1.jl.html)
---
layout: section
---
## React Usage
```jsx
import AUlinkList from '@gold.au/link-list';
<AUlinkList items={[
{
link: 'link/one/',
text: 'Link 1',
},
{
link: 'link/two/',
text: 'Link 2',
className: 'is-active',
li: {
className: 'li-wrapping-class',
},
},
{
text: 'Link 3',
onClick: () => console.log('You clicked me!'),
},
]} />
```
# AX-Manager
A two-way file manager for the Android platform.
QQ discussion group: 782023441
Author's QQ: 1834661238
# Serving Models
## Introduction
TensorFlow Serving is a flexible, high-performance serving system for machine learning models, designed for production environments. TensorFlow Serving makes it easy to deploy new algorithms and experiments while keeping the same server architecture and APIs. TensorFlow Serving provides out-of-the-box integration with TensorFlow models, but can be easily extended to serve other types of models and data.
The following are detailed developer docs for TensorFlow Serving:
- [Architecture Overview](https://www.tensorflow.org/tfx/serving/architecture)
- [Server API](https://www.tensorflow.org/tfx/serving/api_docs/cc/)
- [REST Client API](https://www.tensorflow.org/tfx/serving/api_rest)
---
id: guide-testing
title: Testing
---
Testing is an important feature of any library. To aid in our own tests, we've developed [`MockBinding`](api-binding-mock.md), a fake hardware binding that doesn't actually need any hardware to run. This class passes all of the same tests as our hardware-based bindings and provides a few additional test-related interfaces.
```js
const SerialPort = require('@serialport/stream')
const MockBinding = require('@serialport/binding-mock')
SerialPort.Binding = MockBinding
// Create a port and enable the echo and recording.
MockBinding.createPort('/dev/ROBOT', { echo: true, record: true })
const port = new SerialPort('/dev/ROBOT')
```
The code can be found in the [`@serialport/binding-mock`](api-binding-mock.md) package.
## Virtual Serial Ports
Sometimes you need to develop and test your software without being able to use a real serial port or without having access to the target device. In those situations you can use virtual serial ports.
### Free Virtual Serial Ports
On Windows you can use e.g. [Free* Virtual Serial Ports](https://freevirtualserialports.com/) to create virtual ports and then [Free* Serial Analyzer](https://freeserialanalyzer.com/) to view and monitor the port traffic.
\**Free meaning trial mode in this case*
Use the "Create New Bridge" feature to create two virtual serial ports which are then linked together (you can even configure the wiring between the ports if you need to).

After you have bridged your new virtual ports, e.g. COM5 and COM6, you can write data to the COM5 port and monitor the traffic from port COM6 with the Free Serial Analyzer.

# scherzo-orm
Scherzo-orm is the object-relational mapper component used by the Scherzo educational web framework.
---
title: "BFK Warzone"
description: "BFK Warzone is a 2D NFT P2E shooter & Marketplace"
lead: "BFK Warzone is a 2D NFT P2E shooter & Marketplace"
date: 2022-02-28T12:48:34.966+0800
lastmod: 2022-02-28T12:48:34.966+0800
draft: false
featuredImage: ["100_bfk-warzone.png"]
score: "155"
status: "Beta"
blockchain: ["Binance"]
nft_support: "Yes"
free_to_play: "Crypto"
play_to_earn: ["NFT","Crypto"]
website: "https://warzone.bfkwarzone.com/?utm_source=PlayToEarn.net&utm_medium=organic&utm_campaign=gamepage"
twitter: "https://twitter.com/BFKWarzone"
discord: "https://discord.com/invite/CYPPTjgASm"
telegram: "https://t.me/bfkwarzone"
github: "https://github.com/BlockAudit-Report/BFK_Warzone"
youtube: "https://www.youtube.com/channel/UC4v5OMX38Ooit5dg71_jmew"
twitch:
facebook: "https://m.facebook.com/bfkwarzone/"
instagram: "https://www.instagram.com/bfkwarzonebsc/"
reddit:
medium:
steam:
gitbook:
googleplay:
appstore:
categories: ["games"]
games: ["Action","Battle-Royale","Shooter"]
toc: false
pinned: false
weight:
---
BFK Warzone is a specialized gaming ecosystem on the Binance Smart Chain that is completely powered by our native token $BFK. Certik audited.

We are currently developing our NFT Play-To-Earn shooter game, a 2D game with 3D isometric views, and an NFT Marketplace. We are a strictly gaming-focused project with simple 5% taxes for buying/selling, as well as 0% tax on wallet-to-wallet transactions.

The interactive NFT Marketplace will feature in-game characters called Fortis, which can be used to earn tokens in multiple game modes. Users can start playing right away with a free Legacy character called C. Draxis, a battle warrior from the army unit. This character comes equipped with two weapons, the M4 Carbine and the M4A16. Other characters can be purchased on the marketplace or minted using our unique minting algorithm, which retains rarity based on total users.

Soldiers range from Navy Seals, Paratroopers, Snipers, Medics, Machine Gunners and Special Forces. You can challenge your gaming buddies in several different modes, including 1v1, 2v2, 3v3 or 5v5. Fortis can join forces by reserving their slot and take down the enemy bosses in dungeon-style gameplay, where $BFK tokens and loot are dropped and rewarded based on contributions to the fight.
---
title: Remote Monitoring data access control - Azure | Microsoft Docs
description: This article provides information about how to configure access controls for the Time Series Insights telemetry explorer in the Remote Monitoring solution accelerator
author: dominicbetts
manager: timlt
ms.author: dobett
ms.service: iot-accelerators
services: iot-accelerators
ms.date: 08/06/2018
ms.topic: conceptual
ms.openlocfilehash: 9d5d572c3e32e3645e65ba8d6fc28b567b3c1e9a
ms.sourcegitcommit: 849bb1729b89d075eed579aa36395bf4d29f3bd9
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 04/28/2020
ms.locfileid: "65827183"
---
# <a name="configure-access-controls-for-the-time-series-insights-telemetry-explorer"></a>Configure access controls for the Time Series Insights telemetry explorer
This article provides information about how to configure access controls for the Time Series Insights explorer in the Remote Monitoring solution accelerator. To allow solution accelerator users to access the Time Series Insights explorer, you must grant each user access.
Data access policies grant permissions to issue data queries, manipulate reference data in the environment, and save shared queries and perspectives associated with the environment.
## <a name="grant-data-access"></a>Grant data access
Follow these steps to grant data access to a user principal:
1. Sign in to the [Azure portal](https://portal.azure.com).
2. Locate your Time Series Insights environment. Type **Time Series** in the **Search** box. Select **Time Series Environment** in the search results.
3. Select your Time Series Insights environment from the list.
4. Select **Data Access Policies**, and then select **+ Add**.
![Manage the Time Series Insights environment data source](media/iot-accelerators-remote-monitoring-rbac-tsi/data-access-policies.png)
5. Select **Select user**. Search for the user name or email address to find the user you want to add. Click **Select** to confirm the selection.
![Select user](media/iot-accelerators-remote-monitoring-rbac-tsi/select-user.png)
6. Select **Select role**. Choose the appropriate access role for the user:
    - Select **Contributor** if you want to allow the user to change reference data and share saved queries and perspectives with other users of the environment.
    - Otherwise, select **Reader** to allow the user to query data in the environment and save personal (unshared) queries in the environment.
    Select **OK** to confirm the role choice.
![Select role](media/iot-accelerators-remote-monitoring-rbac-tsi/select-role.png)
7. Select **OK** on the **Select User Role** page.
![Select OK](media/iot-accelerators-remote-monitoring-rbac-tsi/select-user-role-ok.png)
8. The **Data Access Policies** page lists the users and their roles.
![View data access policies](media/iot-accelerators-remote-monitoring-rbac-tsi/data-access-policies-done.png)
## <a name="next-steps"></a>Next steps
In this article, you learned how access controls are granted for the Time Series Insights explorer in the Remote Monitoring solution accelerator.
For more conceptual information about the Remote Monitoring solution accelerator, see [Remote Monitoring architecture](iot-accelerators-remote-monitoring-sample-walkthrough.md)
To learn more about customizing the Remote Monitoring solution, see [Customize and redeploy a microservice](iot-accelerators-microservices-example.md)
<!-- Next tutorials in the sequence -->
---
page_type: sample
languages:
- C++
products:
- cpp-build-insights
description: "This repository provides buildable and runnable samples for the C++ Build Insights SDK. Use it as a learning resource."
urlFragment: "cpp-build-insights-samples"
---
# C++ Build Insights SDK samples

This repository provides buildable and runnable samples for the C++ Build Insights SDK. Use it as a learning resource.
## Contents
| Sample | Description |
|-------------------|--------------------------------------------|
| BottleneckCompileFinder | Finds CL invocations that are bottlenecks and don't use /MP. |
| FunctionBottlenecks | Prints a list of functions that are code generation bottlenecks within their CL or Link invocation. |
| LongCodeGenFinder | Lists the functions that take more than 500 milliseconds to generate in your entire build. |
| RecursiveTemplateInspector | Identifies costly recursive template instantiations. |
| TopHeaders | Determines which headers you might want to precompile. |
## Prerequisites
In order to build and run the samples in this repository, you need:
- Visual Studio 2017 and above.
- Windows 8 and above.
## Build steps
1. Clone the repository on your machine.
1. Open the Visual Studio solution file. Each sample is a separate project within the solution.
1. All samples rely on the C++ Build Insights SDK NuGet package. Restore NuGet packages and accept the license for the SDK.
1. Build the desired configuration for the samples that interest you. Available platforms are x86 and x64, and available configurations are *Debug* and *Release*.
1. Samples will be built in their own directory following this formula: `{RepositoryRoot}\out\{SampleName}`.
## Running the samples
1. The samples require *CppBuildInsights.dll* and *KernelTraceControl.dll* to run. These files are available in the C++ Build Insights NuGet package. When building samples, these files are automatically copied next to them in their respective output directory. If you are going to move a sample around on your machine, please be sure to move these DLLs along with it.
1. Collect a trace of the build you want to analyze with the sample. You can do this using one of two methods:
    1. Use vcperf:
        1. Open an elevated x64 Native Tools Command Prompt for VS 2019.
        1. Run the following command: `vcperf /start MySessionName`
        1. Build your project. You do not need to use the same command prompt for building.
        1. Run the following command: `vcperf /stopnoanalyze MySessionName outputTraceFile.etl`
    1. Programmatically: see the [C++ Build Insights SDK](https://docs.microsoft.com/cpp/build-insights/reference/sdk/overview?view=vs-2019) documentation for details.
1. Invoke the sample, passing your trace as the first parameter.
## Contributing
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit [https://cla.opensource.microsoft.com](https://cla.opensource.microsoft.com).
When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [[email protected]](mailto:[email protected]) with any additional questions or comments.
---
author: tomarchermsft
ms.service: ansible
ms.topic: include
ms.date: 04/30/2019
ms.author: tarcher
ms.openlocfilehash: 3e99c24b17d19f94b8ba171cea5a16b7d0c0cd56
ms.sourcegitcommit: 3e98da33c41a7bbd724f644ce7dedee169eb5028
ms.translationtype: MT
ms.contentlocale: sv-SE
ms.lasthandoff: 06/18/2019
ms.locfileid: "67187362"
---
- **Install Ansible**: Do one of the following:
    - [Install](/azure/virtual-machines/linux/ansible-install-configure#install-ansible-on-an-azure-linux-virtual-machine) and [configure](/azure/virtual-machines/linux/ansible-install-configure#create-azure-credentials) Ansible on a Linux machine
    - [Configure Azure Cloud Shell](/azure/cloud-shell/quickstart)
<properties
pageTitle="App an Azure RemoteApp | Microsoft Azure"
description="Erfahren Sie mehr über die Vorschriften für apps in Azure RemoteApp verwenden möchten"
services="remoteapp"
documentationCenter=""
authors="lizap"
manager="mbaldwin" />
<tags
ms.service="remoteapp"
ms.workload="compute"
ms.tgt_pltfrm="na"
ms.devlang="na"
ms.topic="article"
ms.date="08/15/2016"
ms.author="elizapo" />
# <a name="app-requirements"></a>App-Vorschriften
> [AZURE.IMPORTANT]
> Azure RemoteApp wird eingestellt. Lesen Sie die [Ankündigung](https://go.microsoft.com/fwlink/?linkid=821148) für Details.
Azure RemoteApp unterstützt streaming 32-Bit- oder 64-Bit Windows-basierten Anwendung en von Windows Server 2012 R2 Bild. Die meisten vorhandenen 32-Bit- oder 64-Bit Windows-basierten Anwendung ausführen "wie besehen" in Azure RemoteApp (Remote Desktop Services oder früher als Terminaldienste bezeichnet) Umgebung. Jedoch gibt es ein Unterschied zwischen und und - einige Programme ordnungsgemäß und gut, andere nicht. Die folgende Informationen Hilfestellung zur Anwendungsentwicklung in einer Remotedesktopdienste-Umgebung testen, um die Kompatibilität sicherzustellen.
Tipp: Wir arbeiten einige Arbeitsbeispiele Apps für Sie zu erstellen. Sie sehen neue Themen, die mit Microsoft Access, QuickBooks und App-V in RemoteApp besprechen.
## <a name="requirements"></a>Vorschriften
Drei solcher helfen, wenn Ihre Anwendung in RemoteApp ausführen:
1. Programme, die alle [Zertifizierungsstellen an Windows desktop-apps](https://msdn.microsoft.com/library/windows/desktop/hh749939.aspx) und [Remotedesktopdienste Programmierungsrichtlinien](https://msdn.microsoft.com/library/aa383490.aspx) einhalten müssen vollständige Kompatibilität mit RemoteApp.
2. Anwendung sollte niemals Daten lokal speichern auf das Bild oder die RemoteApp-Instanzen, die verloren gehen können. Nach dem Erstellen einer Auflistung RemoteApp Instanzen werden geklont sind statusfrei und sollten nur Programme enthalten. Daten in einer externen Quelle oder im Profil des Benutzers gespeichert.
3. Benutzerdefinierte Bilder sollten niemals Daten, die verloren gehen können.
## <a name="testing-your-apps"></a>Testen Ihrer apps
Gehen Sie folgendermaßen vor um Anwendungstests:
1. Installieren Sie Windows Server 2012 R2 und der Anwendung
2. Aktivieren von Remotedesktop
3. Erstellen Sie zwei Benutzerkonten BenutzerA und BenutzerB Sicherheitsgruppe Remotedesktop beider Benutzerkonten hinzufügen.
4. Überprüfen Sie Multi-Session-Kompatibilität durch die Einrichtung von zwei gleichzeitiger RDS Sitzungen mit dem PC beim Starten der Anwendungdes.
5. Überprüfen von Anwendungsverhalten
## <a name="application-development-guidelines"></a>Richtlinien für die Entwicklung
Verwenden Sie die folgenden Richtlinien zur Anwendungsentwicklung für RemoteApp.
### <a name="multiple-users"></a>Mehrere Benutzer
- Installieren einer [Anwendung für einen einzelnen Benutzer ](https://msdn.microsoft.com/library/aa380661.aspx)kann Probleme in einer Mehrbenutzer-Umgebung erstellen.
- Programme sollten an benutzerspezifischen Speicherorten getrennt von globalen Informationen für alle Benutzer [Informationen speichern](https://msdn.microsoft.com/library/aa383452.aspx) .
- RemoteApp verwendet mehrere [Namespaces für Kernelobjekte](https://msdn.microsoft.com/library/aa382954.aspx). ein globaler Namespace wird hauptsächlich von Services in Client/Server-Anwendung verwendet.
- Es ist nicht davon ausgehen, dass den Computernamen oder die [IP-Adresse](https://msdn.microsoft.com/library/aa382942.aspx) dem Computer einen einzelnen Benutzer zugeordnet sind, da mehrere Benutzer gleichzeitig mit einem Remotedesktop-Sitzungshost (RD-Sitzungshost) Server anmelden können.
### <a name="performance"></a>Leistung
- Deaktivieren Sie [Grafikeffekte](https://msdn.microsoft.com/library/aa380822.aspx) RemoteApp Ihrer Anwendung hinzu.
- Zur Maximierung der CPU-Verfügbarkeit für alle Benutzer deaktivieren Sie [Hintergrundaufgaben](https://msdn.microsoft.com/library/aa380665.aspx) oder erstellen Sie effizienter Hintergrundaufgaben, die nicht ressourcenintensiv.
- Sie optimieren und Anwendung [Thread Verwendung](https://msdn.microsoft.com/library/aa383520.aspx) einer Mehrbenutzer Umgebung mit mehreren Prozessoren verteilen.
- Zur Optimierung der Leistung empfiehlt es Anwendung [erkennen](https://msdn.microsoft.com/library/aa380798.aspx) , ob sie in einer Clientsitzung ausgeführt werden.
| 73.95082 | 572 | 0.806916 | deu_Latn | 0.985092 |
---
title: PlayFab Multiplayer Unity plugin overview
description: Overview of the PlayFab Multiplayer SDK plugin for Unity
author: vicodex
ms.author: victorku
ms.date: 11/23/2021
ms.topic: article
ms.prod: playfab
keywords: playfab, multiplayer, lobby, matchmaking, unity, middleware
---
# PlayFab Multiplayer Unity plugin overview
> [!IMPORTANT]
> This feature is currently in public preview. It is provided to give you an early look at an upcoming feature, and to allow you to provide feedback while it is still in development.
The PlayFab Multiplayer Unity SDK plugin is a Unity C# wrapper on top of a native PlayFabMultiplayer C++ library created for the convenience of Unity game developers.
It enables you to make use of PlayFab Multiplayer services in your Unity game. Currently, this includes Lobby and Matchmaking. It is designed for developing games on multiple platforms.
PlayFab Multiplayer Unity plugin works alongside the PlayFab "core" Unity SDK. The PlayFab "core" Unity SDK provides other PlayFab functionalities such as economy, leaderboards, and more. For more information, see [PlayFab Unity SDK](https://github.com/PlayFab/UnitySDK) and [PlayFab Unity SDK documentation](https://docs.microsoft.com/gaming/playfab/sdks/unity3d/).
PlayFab Multiplayer Unity plugin is available for download as a Unity Asset package.
## What API features are provided by PlayFab Multiplayer Unity plugin?
- Lobby
- Matchmaking
- Support for the following platforms:
  - GDK:
    - Xbox Series X|S
    - Xbox One
    - PC
  - Windows
- Support for cross-play across the above platforms
## What is included in PlayFab Multiplayer Unity plugin?
- The top-level Multiplayer API written in Unity C#, provided by the `PlayFabMultiplayer` class, and a prefab for integrating your Unity game with the PlayFabMultiplayer library
- C# interop layer providing managed-code interface to the underlying native (C++) Multiplayer library API. It is used by the top-level C# API.
- Underlying native (C++) PlayFabMultiplayer binaries for each supported platform:
- Multiplayer DLL libraries for GDK
- Multiplayer DLL libraries for Windows
- PlayFab "core" Unity SDK plugin (can be updated independently if needed)
## PlayFab Multiplayer Unity plugin versions and compatibility between platforms
The PlayFab Multiplayer Unity plugin is published and available for download at several distribution points (Git repos), depending on the platform. Access to some distribution points is restricted; gaining access requires sending a request to your Microsoft representative and may involve additional steps.
To provide better guidance on compatibility between versions downloaded from different distribution points, and to tie each release to a specific version of the underlying native library, the PlayFab Multiplayer Unity plugin follows a custom versioning scheme.
### PlayFab Multiplayer Unity plugin versioning scheme
```
X.X.X.Y-(distribution-point-indicator).Z
```
For example, `1.2.0.3-gdk.0` (a version downloaded from the GDK repo with restricted access) or `1.2.0.3-ps5.0` (a version with Multiplayer binaries only for PS5, downloaded from the PS5 repo with restricted access).
Version components:
- `X.X.X` - the lowest version of the underlying PlayFabMultiplayer library across all supported platforms. This is used for general reference consistency with a version of the underlying C++ library. In the example above, the version of an included PlayFabMultiplayer library for each platform is `1.2.0` or higher.
- `Y` - an incremental index of any modifications in the Multiplayer Unity C# layer, for any given X.X.X part of the version.
- `(distribution-point-indicator)` - a mnemonic code for tracking which distribution point a particular PlayFab Multiplayer Unity plugin package was downloaded from. It differs by distribution point, for example, `gdk` (Microsoft Azure DevOps repo with restricted access for GDK developers), `ps5` (Microsoft Azure DevOps repo with restricted access for PS5 developers), etc.
- `Z` - an incremental index of any modifications unique to the distribution point (for example, Multiplayer binaries updated/patched for a specific platform only).
A higher number in any version component means a newer version, by significance from left to right.
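To make the scheme concrete, here is a small illustrative parser (not an official PlayFab tool; the function names are our own) that splits a version such as `1.2.0.3-gdk.0` into its components and applies the compatibility rule described below:

```python
import re

# Illustrative only -- not an official PlayFab utility.
# Parses "X.X.X.Y-(distribution-point-indicator).Z", e.g. "1.2.0.3-gdk.0".
VERSION_RE = re.compile(r"^(\d+)\.(\d+)\.(\d+)\.(\d+)-([a-z0-9]+)\.(\d+)$")

def parse_version(text):
    match = VERSION_RE.match(text)
    if not match:
        raise ValueError(f"not a plugin version: {text!r}")
    x1, x2, x3, y, indicator, z = match.groups()
    return {
        "native": (int(x1), int(x2), int(x3)),  # underlying C++ library version
        "wrapper": int(y),                      # C# layer revision
        "distribution": indicator,              # e.g. "gdk", "ps5"
        "local": int(z),                        # distribution-specific revision
    }

def compatible(a, b):
    """Plugins from different distribution points are compatible when
    the first four numbers (X.X.X.Y) of their versions are the same."""
    pa, pb = parse_version(a), parse_version(b)
    return (pa["native"], pa["wrapper"]) == (pb["native"], pb["wrapper"])
```

For example, `compatible("1.2.0.3-gdk.0", "1.2.0.3-ps5.0")` is true because only the distribution indicator differs.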
### Compatibility between versions from different distribution points
Regardless of a distribution point a PlayFab Multiplayer Unity plugin is downloaded from, it is guaranteed to be fully compatible with a PlayFab Multiplayer Unity plugin downloaded from any other distribution point, if **the first four numbers (`X.X.X.Y`) of their version are the same**. Compatible versions from different distribution points can be imported into user's Unity project in any order without a risk of overwriting/breaking one another, as their shared code should be identical. Though each of them may have some additional (not shared) files, specific to a particular platform, which shouldn't overlap.
For example, you can import all the following versions of PlayFab Multiplayer Unity plugin in your Unity project, in any sequential order, if you are targeting GDK, PS5 and Switch:
- `1.2.0.3-gdk.0` (imports Multiplayer binaries for GDK, among other files)
- `1.2.0.3-ps5.0` (imports Multiplayer binaries for PS5, among other files)
- `1.2.0.3-sw.0` (imports Multiplayer binaries for Switch, among other files)
The shared (cross-platform) Unity C# code included in each of these plugins will be the same.
## Which versions of Unity are supported?
We strive to support all recent versions of Unity starting with Unity 2017; however, your choice may be limited by the availability of a Unity development add-on for each particular platform (see the corresponding Unity documentation). That, in turn, may also limit your choice of the platform SDK.
In general, we test PlayFab Multiplayer Unity plugin with one of the most recent versions of Unity development add-ons available for each platform. We encourage you to report any build or runtime issues with any new version of Unity Editor, Unity add-on, or a platform SDK.
# Portfolio
#### A practice portfolio hosted on GitHub, 15 July 2020
#### By **Charles N. Mailu**
## Description
This is a practice portfolio created as a week-one assignment at Moringa School; its purpose is to practise creating an e-portfolio.
## Setup/Installation Requirements
* A text editor such as Atom or Visual Studio Code
* Git installed and a GitHub account
* A web browser such as Chrome or Firefox
## Known Bugs
No known bugs yet, but if you encounter one, kindly reach me <a href="https://charlesmaillu.github.io/contact-info">here</a>
## Technologies Used
* HTML
* CSS
## Support and contact details
If you run into any issues or have questions, ideas or concerns, click <a href="https://charlesmaillu.github.io/contact-info">here</a> to reach me
Want to contribute? Great!
To fix a bug or enhance an existing module, follow these steps:
* Fork the repo
* Create a new branch (git checkout -b improve-feature)
* Make the appropriate changes in the files
* Add changes to reflect the changes made
* Commit your changes (git commit -am 'Improve feature')
* Push to the branch (git push origin improve-feature)
* Create a Pull Request
### Setup
Clone this repo to your desktop to get started.
### License
You can check out the full license <a href="https://github.com/charlesmaillu/Portfolio/blob/master/LICENSE">here</a>
This project is licensed under the terms of the MIT license.
**Charles Mailu**
# Livrom
An app for cataloguing book titles (collection management)
---
layout: post
title: "ISBA 2016"
date: 2016-08-12 12:08:28 +0000
category: Work
tag: Graphic Design
thumbnail: /src/img/thumb-ISBA.png
description: "Agency: The Publishing Bureau. Conference programme and other collateral"
---
<img class="myImg" src="{{ site.baseurl }}/src/img/isba2016_final.jpg" alt="ISBA 2016 Conference Programme">
---
title: "SMS_UpdateCategoryInstance Class"
titleSuffix: "Configuration Manager"
ms.date: "09/20/2016"
ms.prod: "configuration-manager"
ms.technology: configmgr-sdk
ms.topic: conceptual
ms.assetid: bc441b19-52b2-4004-9af3-f37a5e0529dd
author: aczechowski
ms.author: aaroncz
manager: dougeby
---
# SMS_UpdateCategoryInstance Server WMI Class
The `SMS_UpdateCategoryInstance` Windows Management Instrumentation (WMI) class is an SMS Provider server class, in System Center Configuration Manager, that represents a software-update-specific instance of the `SMS_CategoryInstance` Server WMI Class available on the site.
The following syntax is simplified from Managed Object Format (MOF) code and includes all inherited properties.
## Syntax
```
Class SMS_UpdateCategoryInstance : SMS_CategoryInstanceBase
{
Boolean AllowSubscription;
String CategoryInstance_UniqueID;
UInt32 CategoryInstanceID;
String CategoryTypeName;
Boolean IsSubscribed;
String LocalizedCategoryInstanceName;
SMS_Category_LocalizedProperties LocalizedInformation[];
UInt32 LocalizedPropertyLocaleID;
UInt32 ParentCategoryInstanceID;
String SourceSite;
};
```
## Methods
The `SMS_UpdateCategoryInstance` class does not define any methods.
> [!WARNING]
> The `ResendObjectToAllSites` method in the `SMS_UpdateCategoryInstance` class has been deprecated in System Center Configuration Manager.
## Properties
`AllowSubscription`
Data type: `Boolean`
Access type: Read-only
Qualifiers: [read]
`true` if the category instance is enabled for subscription to the category metadata from the software update source. The default value is `false`. For more information, see [SMS_SoftwareUpdateSource Server WMI Class](../../../develop/reference/sum/sms_softwareupdatesource-server-wmi-class.md).
> [!NOTE]
> Not all categories can be marked for subscription.
`CategoryInstance_UniqueID`
Data type: `String`
Access type: Read/Write
Qualifiers: [unique, SizeLimit("512")]
See [SMS_CategoryInstanceBase Server WMI Class](../../../develop/reference/compliance/sms_categoryinstancebase-server-wmi-class.md).
`CategoryInstanceID`
Data type: `UInt32`
Access type: Read-only
Qualifiers: [key, read]
See [SMS_CategoryInstanceBase Server WMI Class](../../../develop/reference/compliance/sms_categoryinstancebase-server-wmi-class.md).
`CategoryTypeName`
Data type: `String`
Access type: Read/Write
Qualifiers: None
See [SMS_CategoryInstanceBase Server WMI Class](../../../develop/reference/compliance/sms_categoryinstancebase-server-wmi-class.md).
`IsSubscribed`
Data type: `Boolean`
Access type: Read/Write
Qualifiers: [read]
`true` if the category instance allows subscription. The default value is `false`. Set this property to `true` only if the `AllowSubscription` property is set to `true`.
`LocalizedCategoryInstanceName`
Data type: `String`
Access type: Read-only
Qualifiers: [read]
See [SMS_CategoryInstanceBase Server WMI Class](../../../develop/reference/compliance/sms_categoryinstancebase-server-wmi-class.md).
`LocalizedInformation`
Data type: `SMS_Category_LocalizedProperties` array
Access type: Read/Write
Qualifiers: [lazy]
See [SMS_CategoryInstanceBase Server WMI Class](../../../develop/reference/compliance/sms_categoryinstancebase-server-wmi-class.md).
`LocalizedPropertyLocaleID`
Data type: `UInt32`
Access type: Read-only
Qualifiers: [read]
See [SMS_CategoryInstanceBase Server WMI Class](../../../develop/reference/compliance/sms_categoryinstancebase-server-wmi-class.md).
`ParentCategoryInstanceID`
Data type: `UInt32`
Access type: Read-only
Qualifiers: [read]
See [SMS_CategoryInstanceBase Server WMI Class](../../../develop/reference/compliance/sms_categoryinstancebase-server-wmi-class.md).
`SourceSite`
Data type: `String`
Access type: Read-only
Qualifiers: [read]
See [SMS_CategoryInstanceBase Server WMI Class](../../../develop/reference/compliance/sms_categoryinstancebase-server-wmi-class.md).
## Remarks
Class qualifiers for this class include:
- Secured
- Read (read-only)
For more information about both the class qualifiers and the property qualifiers included in the Properties section, see [Configuration Manager Class and Property Qualifiers](../../../develop/reference/misc/class-and-property-qualifiers.md).
Your application uses the `SMS_UpdateCategoryInstance` class after creating or modifying a software update deployment using [SMS_UpdatesAssignment Server WMI Class](../../../develop/reference/sum/sms_updatesassignment-server-wmi-class.md). The application can use [SMS_CIAllCategories Server WMI Class](../../../develop/reference/sum/sms_ciallcategories-server-wmi-class.md) to query for all categories associated with the software updates configuration item or for all configuration items associated with a category.
To use this class, the application obtains an `SMS_SoftwareUpdateSource` object and sets the properties as required for the particular software update and the source.
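As a hedged illustration (the query text below is an example of WQL usage, not taken verbatim from the official documentation), an application could locate update categories that allow subscription but are not yet subscribed with a WQL query against the SMS Provider such as:

```
SELECT CategoryInstanceID, LocalizedCategoryInstanceName
FROM   SMS_UpdateCategoryInstance
WHERE  AllowSubscription = 1 AND IsSubscribed = 0
```

The results could then be updated by setting `IsSubscribed` to `true` on the desired instances, subject to the constraint on `AllowSubscription` noted in the Properties section.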
## Requirements
## Runtime Requirements
For more information, see [Configuration Manager Server Runtime Requirements](../../../develop/core/reqs/server-runtime-requirements.md).
## Development Requirements
For more information, see [Configuration Manager Server Development Requirements](../../../develop/core/reqs/server-development-requirements.md).
## See Also
[Software Updates Server WMI Classes](../../../develop/reference/sum/software-updates-server-wmi-classes.md)
[SMS_CIAllCategories Server WMI Class](../../../develop/reference/sum/sms_ciallcategories-server-wmi-class.md)
[Configuration Manager Software Updates](../../../develop/sum/software-updates.md)
---
- GuiCommand:/it
Name:Arch_SetMaterial
Name/it:Imposta Materiale
MenuLocation:Arch → Material tools → Material
Workbenches:[Arch](Arch_Workbench/it.md), [BIM](BIM_Workbench/it.md)
Shortcut:**M** **T**
SeeAlso:[Materials](Arch_CompSetMaterial/it.md), [Multi-Materials](Arch_MultiMaterial/it.md)
---
# Arch SetMaterial/it
</div>
## Description
<div class="mw-translate-fuzzy">
The Material tool allows you to add [materials](Material/it.md) to the active document, and to attribute a material to an [Arch](Arch_Workbench/it.md) object. Materials hold all the properties of a given material, and control the color of the objects they are attached to. Materials are stored in a **Materials** folder in the active document.
</div>

## Usage

1. Optionally, select one or more objects to which you wish to attribute a new material.
2. Invoke the command in one of these ways:
    - Press the **<img src="images/Arch_SetMaterial.svg" width=16px> [Set material](Arch_SetMaterial/it.md)** button in the toolbar.
    - Use the keyboard shortcut **M** then **T**.
    - Use the **Arch → Material tools → Material** command from the main menu.
3. Load a preset material, or create a new one by filling in the fields.
4. Press **OK**.
## Options

- When creating a new material, a panel allows you to set the different options:

<div class="mw-translate-fuzzy">
- **Choose preset**: Choose one of the preset materials, to use as-is or to adapt by editing the fields below
- **Name**: Choose a name for the material
- **Edit button**: Opens the current material in FreeCAD's [Material editor](Material_editor/it.md), which allows you to edit many additional properties and add your own custom ones
- **Description**: A more detailed description of the material
- **Color**: The display color for the material, which will be applied to all objects that use this material
- **Code**: A reference name and number from a specification system, such as [Masterformat](https://en.wikipedia.org/wiki/MasterFormat) or [Omniclass](http://www.omniclass.org/)
- **Code browser button**: Not yet implemented - will allow opening the reference in a web browser
- **URL**: A URL where more information about the material can be found
- **URL button**: Opens the URL in a web browser
</div>
## Relationship with IFC

This corresponds roughly to [IfcMaterial](https://standards.buildingsmart.org/IFC/DEV/IFC4_2/FINAL/HTML/link/ifcmaterial.htm).
<div class="mw-translate-fuzzy">
</div>
---
 [documentation index](../README.md) > [Arch](Arch_Workbench.md) > Arch SetMaterial/it
---
title: XML comment must be the first statement on a line
ms.date: 07/20/2015
f1_keywords:
- bc42302
- vbc42302
helpviewer_keywords:
- BC42302
ms.assetid: c3344328-adef-4c3d-b75e-e8a9e450e67c
ms.openlocfilehash: 15371d3bca436ad4dfe8b9975f1c479a4af28385
ms.sourcegitcommit: 3d5d33f384eeba41b2dff79d096f47ccc8d8f03d
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 05/04/2018
---
# <a name="xml-comment-must-be-the-first-statement-on-a-line"></a>XML comment must be the first statement on a line
XML comment must be the first statement on a line. The XML comment will be ignored.

An XML comment was placed after another language element, which is not allowed.

**Error ID:** BC42302
## <a name="to-correct-this-error"></a>To correct this error

- Move the XML comment to a new line.
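For illustration (this snippet is our own example, not taken from the official reference), the first declaration below triggers BC42302 because the XML comment follows other code on the same line, while the second shows the accepted placement:

```vb
' Triggers BC42302: the XML comment is not the first statement on its line,
' so the compiler ignores it and reports the warning.
Dim count As Integer ''' <summary>This comment is misplaced.</summary>

''' <summary>Correct: the XML comment occupies its own line(s),
''' immediately before the declaration it documents.</summary>
Sub DoSomething()
End Sub
```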
## <a name="see-also"></a>See also

[XML Comment Tags](../../visual-basic/language-reference/xmldoc/recommended-xml-tags-for-documentation-comments.md)
[Documenting Your Code with XML](../../visual-basic/programming-guide/program-structure/documenting-your-code-with-xml.md)
# Contributing to the SUMO challenge API
The SUMO challenge API is primarily a set of utilities to support
participants in the SUMO challenge. As such, we do not anticipate significant
ongoing development in this project. However, the software is not perfect,
and we encourage participants to provide bug fixes if they identify any. We
want to make contributing to this project as easy and transparent as possible.
## Pull Requests
If you would like to contribute to the sumo-api, please use the
standard GitHub pull request process.
1. Fork the repo and create your branch from `master`.
2. If you've added code that should be tested, add tests.
3. If you've changed APIs, update the documentation.
4. Ensure the test suite passes.
5. Make sure your code lints.
6. If you haven't already, complete the Contributor License Agreement ("CLA").
## Contributor License Agreement ("CLA")
In order to accept your pull request, we need you to submit a CLA. You only need
to do this once to work on any of Facebook's open source projects.
Complete your CLA here: <https://code.facebook.com/cla>
## Issues
We use GitHub issues to track public bugs. Please ensure your description is
clear and has sufficient instructions to be able to reproduce the issue.
Facebook has a [bounty program](https://www.facebook.com/whitehat/) for the safe
disclosure of security bugs. In those cases, please go through the process
outlined on that page and do not file a public issue.
## Coding Style
* For Python code, we use the [PEP8 coding style
guide](https://legacy.python.org/dev/peps/pep-0008/)
* For C++ code, please follow the style of the existing code.
## License
By contributing to the SUMO challenge API, you agree that your contributions
will be licensed under the LICENSE file in the root directory of this
source tree.
---
layout: page
title: About
permalink: /about/
---
<div class="mt50"></div>
My useful commands are here...
2018 01 preparing to become a full stack developer...<br>
2017 11 made this blog. <br>
2017 03 react&redux, ROR....<br>
# Tic Tac Toe Tech Test!
The purpose of this test is to build the business logic for a game of tic tac toe. It should be easy to implement a working game of tic tac toe by combining your code with any user interface, whether web or command line.
## The brief
### The rules of tic-tac-toe are as follows:
- There are two players in the game (X and O)
- Players take turns until the game is over
- A player can claim a field if it is not already taken
- A turn ends when a player claims a field
- A player wins if they claim all the fields in a row, column or diagonal
- A game is over if a player wins
- A game is over when all fields are taken
### Instructions
- Clone this repository
- Open the SpecRunner.html file - you will see the test coverage
- Open the console in the browser to interact with the application
## User Stories
```
As a player
To play tic-tac-toe
I want to choose a role to play with (X or O)
```
```
As a player
To play tic-tac-toe
I want to be able to claim a field if it is not already taken
```
```
As a player
To let another player play
I should not be allowed to claim a field straight after another claim
```
```
As a player
To win
I want to be able to claim all the fields in a row, column or diagonal
```
```
As a player
To finish playing
One of the players should win
```
```
As a player
To finish playing
All of the fields should be claimed
```
### Approach
- I decided to build my application based on the Single Responsibility Principle and gave each 'class' its own responsibility.
- **Game** constructor function - understands just how to play
- **Rulebook** understands the rules of the game. Encapsulating the behaviour of the rules in the Rulebook class will help avoid problems if the rules change, which would not be the case if all the information about the rules was inside the Game class.
- **Player** just knows the properties of the players.
- **Board** only knows the state of its fields and how to change them.
I tried to follow these design principles as a good practice which might be very helpful if the application is extended.
I plan to further add an interface jQuery layer to the game.
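To sketch how a Rulebook-style win check might look (this is a hypothetical illustration, not the repository's actual code; the names are our own), the board can be modelled as a flat array of 9 cells, each holding `'X'`, `'O'` or `null`:

```javascript
// Hypothetical sketch of Rulebook-style logic for a 3x3 tic-tac-toe board.
var WINNING_LINES = [
  [0, 1, 2], [3, 4, 5], [6, 7, 8], // rows
  [0, 3, 6], [1, 4, 7], [2, 5, 8], // columns
  [0, 4, 8], [2, 4, 6]             // diagonals
];

// True if `mark` ('X' or 'O') occupies every cell of some winning line.
function hasWinningLine(board, mark) {
  return WINNING_LINES.some(function (line) {
    return line.every(function (index) {
      return board[index] === mark;
    });
  });
}

// True if every field has been claimed.
function isBoardFull(board) {
  return board.every(function (cell) {
    return cell !== null;
  });
}

// The game is over if a player wins, or when all fields are taken.
function isGameOver(board) {
  return hasWinningLine(board, 'X') ||
         hasWinningLine(board, 'O') ||
         isBoardFull(board);
}
```

Keeping these checks in one place means a user interface only needs to ask the Rulebook whether the game is over after each claimed field.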
<!-- BEGIN MUNGE: UNVERSIONED_WARNING -->
<!-- BEGIN STRIP_FOR_RELEASE -->
<img src="http://kubernetes.io/img/warning.png" alt="WARNING"
width="25" height="25">
<img src="http://kubernetes.io/img/warning.png" alt="WARNING"
width="25" height="25">
<img src="http://kubernetes.io/img/warning.png" alt="WARNING"
width="25" height="25">
<img src="http://kubernetes.io/img/warning.png" alt="WARNING"
width="25" height="25">
<img src="http://kubernetes.io/img/warning.png" alt="WARNING"
width="25" height="25">
<h2>PLEASE NOTE: This document applies to the HEAD of the source tree</h2>
If you are using a released version of Kubernetes, you should
refer to the docs that go with that version.
<strong>
The latest 1.0.x release of this document can be found
[here](http://releases.k8s.io/release-1.0/docs/devel/faster_reviews.md).
Documentation for other releases can be found at
[releases.k8s.io](http://releases.k8s.io).
</strong>
--
<!-- END STRIP_FOR_RELEASE -->
<!-- END MUNGE: UNVERSIONED_WARNING -->
# How to get faster PR reviews
Most of what is written here is not at all specific to Kubernetes, but it bears
being written down in the hope that it will occasionally remind people of "best
practices" around code reviews.
You've just had a brilliant idea on how to make Kubernetes better. Let's call
that idea "FeatureX". Feature X is not even that complicated. You have a
pretty good idea of how to implement it. You jump in and implement it, fixing a
bunch of stuff along the way. You send your PR - this is awesome! And it sits.
And sits. A week goes by and nobody reviews it. Finally someone offers a few
comments, which you fix up and wait for more review. And you wait. Another
week or two goes by. This is horrible.
What went wrong? One particular problem that comes up frequently is this - your
PR is too big to review. You've touched 39 files and have 8657 insertions.
When your would-be reviewers pull up the diffs they run away - this PR is going
to take 4 hours to review and they don't have 4 hours right now. They'll get to it
later, just as soon as they have more free time (ha!).
Let's talk about how to avoid this.
## 0. Familiarize yourself with project conventions
* [Development guide](development.md)
* [Coding conventions](coding-conventions.md)
* [API conventions](api-conventions.md)
* [Kubectl conventions](kubectl-conventions.md)
## 1. Don't build a cathedral in one PR
Are you sure FeatureX is something the Kubernetes team wants or will accept, or
that it is implemented to fit with other changes in flight? Are you willing to
bet a few days or weeks of work on it? If you have any doubt at all about the
usefulness of your feature or the design - make a proposal doc (in docs/proposals;
for example [the QoS proposal](http://prs.k8s.io/11713)) or a sketch PR (e.g., just
the API or Go interface) or both. Write or code up just enough to express the idea
and the design and why you made those choices, then get feedback on this. Be clear
about what type of feedback you are asking for. Now, if we ask you to change a
bunch of facets of the design, you won't have to re-write it all.
## 2. Smaller diffs are exponentially better
Small PRs get reviewed faster and are more likely to be correct than big ones.
Let's face it - attention wanes over time. If your PR takes 60 minutes to
review, I almost guarantee that the reviewer's eye for details is not as keen in
the last 30 minutes as it was in the first. This leads to multiple rounds of
review when one might have sufficed. In some cases the review is delayed in its
entirety by the need for a large contiguous block of time to sit and read your
code.
Whenever possible, break up your PRs into multiple commits. Making a series of
discrete commits is a powerful way to express the evolution of an idea or the
different ideas that make up a single feature. There's a balance to be struck,
obviously. If your commits are too small they become more cumbersome to deal
with. Strive to group logically distinct ideas into commits.
For example, if you found that FeatureX needed some "prefactoring" to fit in,
make a commit that JUST does that prefactoring. Then make a new commit for
FeatureX. Don't lump unrelated things together just because you didn't think
about prefactoring. If you need to, fork a new branch, do the prefactoring
there and send a PR for that. If you can explain why you are doing seemingly
no-op work ("it makes the FeatureX change easier, I promise") we'll probably be
OK with it.
Obviously, a PR with 25 commits is still very cumbersome to review, so use
common sense.
## 3. Multiple small PRs are often better than multiple commits
If you can extract whole ideas from your PR and send those as PRs of their own,
you can avoid the painful problem of continually rebasing. Kubernetes is a
fast-moving codebase - lock in your changes ASAP, and make merges be someone
else's problem.
Obviously, we want every PR to be useful on its own, so you'll have to use
common sense in deciding what can be a PR vs what should be a commit in a larger
PR. Rule of thumb - if this commit or set of commits is directly related to
FeatureX and nothing else, it should probably be part of the FeatureX PR. If
you can plausibly imagine someone finding value in this commit outside of
FeatureX, try it as a PR.
Don't worry about flooding us with PRs. We'd rather have 100 small, obvious PRs
than 10 unreviewable monoliths.
## 4. Don't rename, reformat, comment, etc in the same PR
Often, as you are implementing FeatureX, you find things that are just wrong.
Bad comments, poorly named functions, bad structure, weak type-safety. You
should absolutely fix those things (or at least file issues, please) - but not
in this PR. See the above points - break unrelated changes out into different
PRs or commits. Otherwise your diff will have WAY too many changes, and your
reviewer won't see the forest because of all the trees.
## 5. Comments matter
Read up on GoDoc - follow those general rules. If you're writing code and you
think there is any possible chance that someone might not understand why you did
something (or that you won't remember what you yourself did), comment it. If
you think there's something pretty obvious that we could follow up on, add a
TODO. Many code-review comments are about this exact issue.
## 6. Tests are almost always required
Nothing is more frustrating than doing a review, only to find that the tests are
inadequate or even entirely absent. Very few PRs can touch code and NOT touch
tests. If you don't know how to test FeatureX - ask! We'll be happy to help
you design things for easy testing or to suggest appropriate test cases.
## 6. Look for opportunities to generify
If you find yourself writing something that touches a lot of modules, think hard
about the dependencies you are introducing between packages. Can some of what
you're doing be made more generic and moved up and out of the FeatureX package?
Do you need to use a function or type from an otherwise unrelated package? If
so, promote! We have places specifically for hosting more generic code.
Likewise if FeatureX is similar in form to FeatureW which was checked in last
month and it happens to exactly duplicate some tricky stuff from FeatureW,
consider prefactoring core logic out and using it in both FeatureW and FeatureX.
But do that in a different commit or PR, please.
## 7. Fix feedback in a new commit
Your reviewer has finally sent you some feedback on FeatureX. You make a bunch
of changes and ... what? You could patch those into your commits with git
"squash" or "fixup" logic. But that makes your changes hard to verify. Unless
your whole PR is pretty trivial, you should instead put your fixups into a new
commit and re-push. Your reviewer can then look at that commit on its own - so
much faster to review than starting over.
We might still ask you to clean up your commits at the very end, for the sake
of a more readable history, but don't do this until asked, typically at the point
where the PR would otherwise be tagged LGTM.
General squashing guidelines:
* Sausage => squash
When there are several commits to fix bugs in the original commit(s), address reviewer feedback, etc. Really we only want to see the end state and commit message for the whole PR.
* Layers => don't squash
When there are independent changes layered upon each other to achieve a single goal. For instance, writing a code munger could be one commit, applying it could be another, and adding a precommit check could be a third. One could argue they should be separate PRs, but there's really no way to test/review the munger without seeing it applied, and there needs to be a precommit check to ensure the munged output doesn't immediately get out of date.
A commit, as much as possible, should be a single logical change. Each commit should always have a good title line (<70 characters) and include an additional description paragraph describing in more detail the change intended. Do not link pull requests by `#` in a commit description, because GitHub creates lots of spam. Instead, reference other PRs via the PR your commit is in.
## 8. KISS, YAGNI, MVP, etc
Sometimes we need to remind each other of core tenets of software design - Keep
It Simple, You Aren't Gonna Need It, Minimum Viable Product, and so on. Adding
features "because we might need it later" is antithetical to software that
ships. Add the things you need NOW and (ideally) leave room for things you
might need later - but don't implement them now.
## 9. Push back
We understand that it is hard to imagine, but sometimes we make mistakes. It's
OK to push back on changes requested during a review. If you have a good reason
for doing something a certain way, you are absolutely allowed to debate the
merits of a requested change. You might be overruled, but you might also
prevail. We're mostly pretty reasonable people. Mostly.
## 10. I'm still getting stalled - help?!
So, you've done all that and you still aren't getting any PR love? Here's some
things you can do that might help kick a stalled process along:
* Make sure that your PR has an assigned reviewer (assignee in GitHub). If
this is not the case, reply to the PR comment stream asking for one to be
assigned.
* Ping the assignee (@username) on the PR comment stream asking for an
estimate of when they can get to it.
* Ping the assignee by email (many of us have email addresses that are well
published or are the same as our GitHub handle @google.com or @redhat.com).
* Ping the [team](https://github.com/orgs/kubernetes/teams) (via @team-name)
that works in the area you're submitting code.
If you think you have fixed all the issues in a round of review, and you haven't
heard back, you should ping the reviewer (assignee) on the comment stream with a
"please take another look" (PTAL) or similar comment indicating you are done and
you think it is ready for re-review. In fact, this is probably a good habit for
all PRs.
One phenomenon of open-source projects (where anyone can comment on any issue)
is the dog-pile - your PR gets so many comments from so many people it becomes
hard to follow. In this situation you can ask the primary reviewer
(assignee) whether they want you to fork a new PR to clear out all the comments.
Remember: you don't HAVE to fix every issue raised by every person who feels
like commenting, but you should at least answer reasonable comments with an
explanation.
## Final: Use common sense
Obviously, none of these points are hard rules. There is no document that can
take the place of common sense and good taste. Use your best judgment, but put
a bit of thought into how your work can be made easier to review. If you do
these things your PRs will flow much more easily.
---
title: IRP_MN_QUERY_ALL_DATA
description: Learn about the 'IRP_MN_QUERY_ALL_DATA' kernel-mode driver architecture. All drivers that support WMI must handle this IRP.
ms.date: 08/12/2017
keywords:
- IRP_MN_QUERY_ALL_DATA Kernel-Mode Driver Architecture
ms.localizationpriority: medium
---
# IRP\_MN\_QUERY\_ALL\_DATA
All drivers that support WMI must handle this IRP. A driver can handle WMI IRPs either by calling [**WmiSystemControl**](/windows-hardware/drivers/ddi/wmilib/nf-wmilib-wmisystemcontrol) or by handling the IRP itself, as described in [Handling WMI Requests](./handling-wmi-requests.md).
If a driver calls [**WmiSystemControl**](/windows-hardware/drivers/ddi/wmilib/nf-wmilib-wmisystemcontrol) to handle an **IRP\_MN\_QUERY\_ALL\_DATA** request, WMI in turn calls that driver's [*DpWmiQueryDataBlock*](/windows-hardware/drivers/ddi/wmilib/nc-wmilib-wmi_query_datablock_callback) routine.
Major Code
----------
[**IRP\_MJ\_SYSTEM\_CONTROL**](irp-mj-system-control.md)
When Sent
---------
WMI sends this IRP to query for all instances of a given data block.
WMI sends this IRP at IRQL = PASSIVE\_LEVEL in an arbitrary thread context.
## Input Parameters
**Parameters.WMI.ProviderId** in the driver's I/O stack location in the IRP points to the device object of the driver that should respond to the request.
**Parameters.WMI.DataPath** points to a GUID that identifies the data block.
**Parameters.WMI.BufferSize** indicates the maximum size of the nonpaged buffer at **Parameters.WMI.Buffer**, which receives output data from the request. The buffer size must be greater than or equal to **sizeof**(**WNODE\_ALL\_DATA**) plus the sizes of instance names and data for all instances to be returned.
## Output Parameters
If the driver handles WMI IRPs by calling [**WmiSystemControl**](/windows-hardware/drivers/ddi/wmilib/nf-wmilib-wmisystemcontrol), WMI fills in a **WNODE\_ALL\_DATA** by calling the driver's *DpWmiQueryDataBlock* routine once for each block registered by the driver.
Otherwise, the driver fills in a [**WNODE\_ALL\_DATA**](/windows-hardware/drivers/ddi/wmistr/ns-wmistr-tagwnode_all_data) structure at **Parameters.WMI.Buffer** as follows:
- Sets **WnodeHeader.BufferSize** to the number of bytes of the entire **WNODE\_ALL\_DATA** to be returned, sets **WnodeHeader.Timestamp** to the value returned by **KeQuerySystemTime**, and sets **WnodeHeader.Flags** as appropriate for the data to be returned.
- Sets **InstanceCount** to the number of instances to be returned.
- If the block uses dynamic instance names, sets **OffsetInstanceNameOffsets** to the offset in bytes from the beginning of the **WNODE\_ALL\_DATA** to where an array of ULONG offsets begins. Each element in this array is the offset from the **WNODE\_ALL\_DATA** to where each dynamic instance name is stored. Each dynamic instance name is stored as a counted Unicode string where the count is a USHORT followed by the Unicode string. The count does not include any terminating null character that may be part of the Unicode string. If the Unicode string does include a terminating null character, this null character must still fit within the size established in **WnodeHeader.BufferSize**.
- If all instances are the same size:
- Sets WNODE\_FLAG\_FIXED\_INSTANCE\_SIZE in **WnodeHeader.Flags** and sets **FixedInstanceSize** to that size, in bytes.
- Writes instance data starting at **DataBlockOffset**, with padding so that each instance is aligned to an 8-byte boundary. For example, if **FixedInstanceSize** is 6, the driver adds 2 bytes of padding between instances.
- If instances vary in size:
- Clears WNODE\_FLAG\_FIXED\_INSTANCE\_SIZE in **WnodeHeader.Flags** and writes an array of **InstanceCount** **OFFSETINSTANCEDATAANDLENGTH** structures starting at **OffsetInstanceDataAndLength**. Each **OFFSETINSTANCEDATAANDLENGTH** structure specifies the offset in bytes from the beginning of the **WNODE\_ALL\_DATA** structure to the beginning of the data for each instance, and the length of the data. **DataBlockOffset** is not used.
- Writes instance data following the last element of the **OffsetInstanceDataAndLength** array, plus padding so that each instance is aligned to an 8-byte boundary.
If the buffer at **Parameters.WMI.Buffer** is too small to receive all of the data, a driver fills in the needed size in a [**WNODE\_TOO\_SMALL**](/windows-hardware/drivers/ddi/wmistr/ns-wmistr-tagwnode_too_small) structure at **Parameters.WMI.Buffer**. If the buffer is smaller than **sizeof**(**WNODE\_TOO\_SMALL**), the driver fails the IRP and returns STATUS\_BUFFER\_TOO\_SMALL.
## I/O Status Block
If the driver handles the IRP by calling [**WmiSystemControl**](/windows-hardware/drivers/ddi/wmilib/nf-wmilib-wmisystemcontrol), WMI sets **Irp->IoStatus.Status** and **Irp->IoStatus.Information** in the I/O status block.
Otherwise, the driver sets **Irp->IoStatus.Status** to STATUS\_SUCCESS or to an appropriate error status such as the following:
STATUS\_BUFFER\_TOO\_SMALL
STATUS\_WMI\_GUID\_NOT\_FOUND
On success, a driver sets **Irp->IoStatus.Information** to the number of bytes written to the buffer at **Parameters.WMI.Buffer**.
Operation
---------
A driver can handle WMI IRPs either by calling [**WmiSystemControl**](/windows-hardware/drivers/ddi/wmilib/nf-wmilib-wmisystemcontrol) or by handling the IRP itself, as described in [Handling WMI Requests](./handling-wmi-requests.md).
If a driver handles WMI IRPs by calling [**WmiSystemControl**](/windows-hardware/drivers/ddi/wmilib/nf-wmilib-wmisystemcontrol), that routine calls the driver's [*DpWmiQueryDataBlock*](/windows-hardware/drivers/ddi/wmilib/nc-wmilib-wmi_query_datablock_callback) routine.
If a driver handles an **IRP\_MN\_QUERY\_ALL\_DATA** request, it should do so only if **Parameters.WMI.ProviderId** points to the same device object that the driver passed to [**IoWMIRegistrationControl**](/windows-hardware/drivers/ddi/wdm/nf-wdm-iowmiregistrationcontrol). Otherwise, the driver must forward the request to the next-lower driver.
Before handling the request, the driver must determine whether **Parameters.WMI.DataPath** points to a GUID that the driver supports. If not, the driver must fail the IRP and return STATUS\_WMI\_GUID\_NOT\_FOUND.
If the driver supports the data block, it must do the following:
- Verify that **Parameters.WMI.BufferSize** specifies a buffer that is large enough to receive all the data that the driver will return.
- Fill in a [**WNODE\_ALL\_DATA**](/windows-hardware/drivers/ddi/wmistr/ns-wmistr-tagwnode_all_data) structure at **Parameters.WMI.Buffer** with data for all instances of that data block.
Requirements
------------
<table>
<colgroup>
<col width="50%" />
<col width="50%" />
</colgroup>
<tbody>
<tr class="odd">
<td><p>Header</p></td>
<td>Wdm.h (include Wdm.h, Ntddk.h, or Ntifs.h)</td>
</tr>
</tbody>
</table>
## See also
[*DpWmiQueryDataBlock*](/windows-hardware/drivers/ddi/wmilib/nc-wmilib-wmi_query_datablock_callback)
[**IoWMIRegistrationControl**](/windows-hardware/drivers/ddi/wdm/nf-wdm-iowmiregistrationcontrol)
[**KeQuerySystemTime**](/windows-hardware/drivers/ddi/wdm/nf-wdm-kequerysystemtime)
[**WMILIB\_CONTEXT**](/windows-hardware/drivers/ddi/wmilib/ns-wmilib-_wmilib_context)
[**WmiSystemControl**](/windows-hardware/drivers/ddi/wmilib/nf-wmilib-wmisystemcontrol)
[**WNODE\_ALL\_DATA**](/windows-hardware/drivers/ddi/wmistr/ns-wmistr-tagwnode_all_data)
# 💰Take Home
Help folks figure out their take-home pay and taxes. PRs welcome, add yo' own state to the FY2020 library.
Check for your own state and federal constants.
# Basic Usage
```
$ gem install take_home
```
```
require 'take_home'
you = TaxablePerson.single("georgia", 100_000)
you.take_home
=> 71790.5
```
# Initializing
## Options
By default, a new TaxablePerson object accepts a dollar income and the following options:
```
:state_deductions # A number for state income deductions
:state_tax_rates # A hash where the key is the upper limit and the value is the percentage
:federal_deductions # A number for federal income deductions
:federal_tax_rates # A hash where the key is the upper limit and the value is the percentage
```
## Long Form
```
tchalla = TaxablePerson.single("wakanda", 48_000, opts = {
state_deductions: 4600,
state_tax_rates: {
750 => 0.01,
2250 => 0.02,
3750 => 0.03,
5250 => 0.04,
7000 => 0.05,
Float::INFINITY => 0.0575
},
federal_deductions: 12200,
federal_tax_rates: {
19400 => 0.10,
78950 => 0.12,
168400 => 0.22,
321450 => 0.24,
408200 => 0.32,
612350 => 0.35,
Float::INFINITY => 0.37
}
})
```
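The bracket hashes above map each bracket's upper bound to its marginal rate. As an illustration only (this is not the gem's internal implementation), such a table is typically applied like this:

```ruby
# Apply a marginal-rate bracket table of the form { upper_bound => rate }.
# Illustration only -- not the take_home gem's actual implementation.
def marginal_tax(income, brackets)
  tax = 0.0
  lower = 0
  brackets.sort_by { |upper, _| upper }.each do |upper, rate|
    break if income <= lower
    tax += ([income, upper].min - lower) * rate  # tax only this bracket's slice
    lower = upper
  end
  tax
end

georgia = {
  750 => 0.01,
  2250 => 0.02,
  3750 => 0.03,
  5250 => 0.04,
  7000 => 0.05,
  Float::INFINITY => 0.0575
}

puts marginal_tax(10_000, georgia) # => 402.5 (before deductions)
```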
## Short Form Helpers
```
me = TaxablePerson.single("georgia", 48_000)
us = TaxablePerson.married("georgia", 96_000)
```
# Methods
```
#take_home returns income less federal income tax, state income tax, and payroll tax (social security + medicare)
#payroll_taxes returns the amount withheld for payroll taxes (SS + medicare)
#social_security_taxes returns the amount withheld for social security taxes
#medicare_taxes returns the amount withheld for medicare taxes
#federal_income_taxes returns the amount withheld for federal income taxes
#state_income_taxes returns the amount withheld for state income taxes
#taxes returns the cumulative amount withheld for federal income, state income, and payroll taxes
#effective_tax_rate shows a percentage effective tax rate
```
# Contributors
[Patrick Wiseman](https://github.com/thephw)
# License
[MIT License](https://github.com/thephw/take_home/blob/master/LICENSE)
---
title: Azure Relay authentication and authorization
description: This article provides an overview of Shared Access Signature (SAS) authentication with the Azure Relay service.
ms.topic: article
origin.date: 06/23/2020
ms.date: 07/27/2020
ms.testscope: no
ms.testdate: ''
ms.author: v-yeche
author: rockboyfor
ms.openlocfilehash: 01f708d83c1da751345d819d63bc21a2f44bfad4
ms.sourcegitcommit: 091c672fa448b556f4c2c3979e006102d423e9d7
ms.translationtype: HT
ms.contentlocale: zh-CN
ms.lasthandoff: 07/24/2020
ms.locfileid: "87162160"
---
# <a name="azure-relay-authentication-and-authorization"></a>Azure Relay authentication and authorization
Applications can authenticate to Azure Relay using Shared Access Signature (SAS) authentication. SAS authentication enables applications to authenticate to the Azure Relay service using an access key configured on the Relay namespace. You can then use this key to generate a SAS token that clients can use to authenticate to the Relay service.
## <a name="shared-access-signature-authentication"></a>Shared Access Signature authentication
[SAS authentication](../service-bus-messaging/service-bus-sas.md) enables you to grant a user access to Azure Relay resources with specific rights. SAS authentication involves configuring a cryptographic key with associated rights on a resource. Clients can then gain access to that resource by presenting a SAS token, which consists of the resource URI being accessed and an expiry signed with the configured key.
You can configure keys for SAS on a Relay namespace. Unlike Service Bus messaging, [Relay Hybrid Connections](relay-hybrid-connections-protocol.md) supports unauthorized or anonymous senders. You can enable anonymous access for an entity when it is created, as shown in the following screenshot from the portal:
![][0]
To use SAS, you can configure a [SharedAccessAuthorizationRule](https://docs.azure.cn/dotnet/api/microsoft.servicebus.messaging.sharedaccessauthorizationrule?view=azure-dotnet) object on a Relay namespace, consisting of the following:
* *KeyName*, which identifies the rule.
* *PrimaryKey*, a cryptographic key used to sign/validate SAS tokens.
* *SecondaryKey*, a cryptographic key used to sign/validate SAS tokens.
* *Rights*, representing the collection of Listen, Send, or Manage rights granted.
Authorization rules configured at the namespace level can grant access to all Relay connections in a namespace for clients that present tokens signed with the corresponding key. Up to 12 such authorization rules can be configured on a Relay namespace. By default, a [SharedAccessAuthorizationRule](https://docs.azure.cn/dotnet/api/microsoft.servicebus.messaging.sharedaccessauthorizationrule?view=azure-dotnet) with all rights is configured for every namespace when it is first provisioned.
To access an entity, the client requires a SAS token generated using a specific [SharedAccessAuthorizationRule](https://docs.azure.cn/dotnet/api/microsoft.servicebus.messaging.sharedaccessauthorizationrule?view=azure-dotnet). The SAS token is generated using the HMAC-SHA256 hash of a resource string that consists of the resource URI being accessed and an expiry, signed with the cryptographic key associated with the authorization rule.
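The HMAC-SHA256 construction described above can be sketched in a few lines. This follows the widely documented Service Bus/Relay token format (`SharedAccessSignature sr=...&sig=...&se=...&skn=...`); the namespace URI and key below are placeholders:

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def generate_sas_token(resource_uri, key_name, key, ttl_seconds=3600):
    """Sign urlencode(uri), a newline, and the expiry with the rule's key."""
    expiry = str(int(time.time()) + ttl_seconds)
    encoded_uri = urllib.parse.quote_plus(resource_uri)
    to_sign = (encoded_uri + "\n" + expiry).encode("utf-8")
    sig = base64.b64encode(
        hmac.new(key.encode("utf-8"), to_sign, hashlib.sha256).digest())
    return "SharedAccessSignature sr=%s&sig=%s&se=%s&skn=%s" % (
        encoded_uri, urllib.parse.quote_plus(sig.decode("utf-8")),
        expiry, key_name)

# Placeholder namespace and key -- substitute your own values.
print(generate_sas_token(
    "https://contoso.servicebus.chinacloudapi.cn/hybrid-conn1",
    "RootManageSharedAccessKey",
    "<primary-or-secondary-key>"))
```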
SAS authentication support for Azure Relay is included in the Azure .NET SDK versions 2.0 and later. SAS includes support for a [SharedAccessAuthorizationRule](https://docs.azure.cn/dotnet/api/microsoft.servicebus.messaging.sharedaccessauthorizationrule?view=azure-dotnet). All APIs that accept a connection string as a parameter include support for SAS connection strings.
## <a name="next-steps"></a>Next steps
- For more information about SAS, see [Service Bus authentication with Shared Access Signatures](../service-bus-messaging/service-bus-sas.md).
- For more information about the Hybrid Connections capability, see the [Azure Relay Hybrid Connections protocol guide](relay-hybrid-connections-protocol.md).
- For corresponding information about Service Bus messaging authentication and authorization, see [Service Bus authentication and authorization](../service-bus-messaging/service-bus-authentication-and-authorization.md).
[0]: ./media/relay-authentication-and-authorization/hcanon.png
<!-- Update_Description: update meta properties, wording update, update link -->
# SantaCoin--6-
| 8 | 15 | 0.625 | gle_Latn | 0.496536 |
0c7215657c99544e299382201ee23b65f4a05d6d | 585 | md | Markdown | Task-6_Exploratory-Data-Analysis--Retail/README.md | avinashkranjan/GRIP-SparkFoundation | b3cc6d32b29a027d18389193c999b3cf33395822 | [
"MIT"
] | 1 | 2020-12-25T18:16:15.000Z | 2020-12-25T18:16:15.000Z | Task-6_Exploratory-Data-Analysis--Retail/README.md | avinashkranjan/GRIP-TheSparksFoundation | b3cc6d32b29a027d18389193c999b3cf33395822 | [
"MIT"
] | null | null | null | Task-6_Exploratory-Data-Analysis--Retail/README.md | avinashkranjan/GRIP-TheSparksFoundation | b3cc6d32b29a027d18389193c999b3cf33395822 | [
"MIT"
] | null | null | null | # Task 6: Prediction using Decision Tree Algorithm 👨🏻💻
### DecisionTree 📊
Decision Trees (DTs) are a non-parametric supervised learning method used for classification and regression. The goal is to create a model that predicts the value of a target variable by learning simple decision rules inferred from the data features.
For instance, in the example below, decision trees learn from data to approximate a sine curve with a set of if-then-else decision rules. The deeper the tree, the more complex the decision rules and the fitter the model.
#### _Author: Avinash Kr. Ranjan_
| 58.5 | 250 | 0.784615 | eng_Latn | 0.999174 |
0c72913098b26b68310a643d97c27f0f23d0c975 | 2,724 | md | Markdown | wdk-ddi-src/content/portcls/nf-portcls-iminiportstreamaudioenginenode-setstreamloopbackprotection.md | tianye606/windows-driver-docs-ddi | 23fec97f3ed3a0c99b117543982d34ee592501e7 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | wdk-ddi-src/content/portcls/nf-portcls-iminiportstreamaudioenginenode-setstreamloopbackprotection.md | tianye606/windows-driver-docs-ddi | 23fec97f3ed3a0c99b117543982d34ee592501e7 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | wdk-ddi-src/content/portcls/nf-portcls-iminiportstreamaudioenginenode-setstreamloopbackprotection.md | tianye606/windows-driver-docs-ddi | 23fec97f3ed3a0c99b117543982d34ee592501e7 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
UID: NF:portcls.IMiniportStreamAudioEngineNode.SetStreamLoopbackProtection
title: IMiniportStreamAudioEngineNode::SetStreamLoopbackProtection (portcls.h)
description: Sets the loopback protection status of the audio engine node.
old-location: audio\iminiportstreamaudioenginenode_setstreamloopbackprotection.htm
tech.root: audio
ms.assetid: FAC9AC9B-9C4B-4D53-A59A-8901EC8755BC
ms.date: 05/08/2018
ms.keywords: IMiniportStreamAudioEngineNode interface [Audio Devices],SetStreamLoopbackProtection method, IMiniportStreamAudioEngineNode.SetStreamLoopbackProtection, IMiniportStreamAudioEngineNode::SetStreamLoopbackProtection, SetStreamLoopbackProtection, SetStreamLoopbackProtection method [Audio Devices], SetStreamLoopbackProtection method [Audio Devices],IMiniportStreamAudioEngineNode interface, audio.iminiportstreamaudioenginenode_setstreamloopbackprotection, portcls/IMiniportStreamAudioEngineNode::SetStreamLoopbackProtection
ms.topic: method
f1_keywords:
- "portcls/IMiniportStreamAudioEngineNode.SetStreamLoopbackProtection"
req.header: portcls.h
req.include-header:
req.target-type: Universal
req.target-min-winverclnt: Windows 8
req.target-min-winversvr: Windows Server 2012
req.kmdf-ver:
req.umdf-ver:
req.ddi-compliance:
req.unicode-ansi:
req.idl:
req.max-support:
req.namespace:
req.assembly:
req.type-library:
req.lib:
req.dll:
req.irql:
topic_type:
- APIRef
- kbSyntax
api_type:
- COM
api_location:
- Portcls.h
api_name:
- IMiniportStreamAudioEngineNode.SetStreamLoopbackProtection
product:
- Windows
targetos: Windows
req.typenames:
---
# IMiniportStreamAudioEngineNode::SetStreamLoopbackProtection
## -description
Sets the loopback protection status of the audio engine node.
## -parameters
### -param ProtectionOption [in]
A CONSTRICTOR_OPTION enumeration value that indicates the status of the loopback protection option.
## -returns
<b>SetStreamLoopbackProtection</b> returns S_OK if the call was successful. Otherwise, the method returns an appropriate error code.
## -remarks
For more information about audio stream loopback protection, see <a href="https://docs.microsoft.com/windows-hardware/drivers/audio/ksproperty-audioengine-loopback-protection">KSPROPERTY_AUDIOENGINE_LOOPBACK_PROTECTION</a>.
## -see-also
CONSTRICTOR_OPTION
<a href="https://docs.microsoft.com/windows-hardware/drivers/ddi/portcls/nn-portcls-iminiportstreamaudioenginenode">IMiniportStreamAudioEngineNode</a>
<a href="https://docs.microsoft.com/windows-hardware/drivers/audio/ksproperty-audioengine-loopback-protection">KSPROPERTY_AUDIOENGINE_LOOPBACK_PROTECTION</a>
---
title: Changing View Settings by Using the Legacy API | Microsoft Docs
ms.custom: ''
ms.date: 11/15/2016
ms.prod: visual-studio-dev14
ms.reviewer: ''
ms.suite: ''
ms.technology:
- vs-ide-sdk
ms.tgt_pltfrm: ''
ms.topic: article
helpviewer_keywords:
- editors [Visual Studio SDK], legacy - changing view settings
ms.assetid: 12c9b300-0894-4124-96a1-764326176d77
caps.latest.revision: 19
ms.author: gregvanl
manager: ghogen
ms.openlocfilehash: df84fa92cb0da8dd408b1cc8717628afa3d5ba19
ms.sourcegitcommit: af428c7ccd007e668ec0dd8697c88fc5d8bca1e2
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 11/16/2018
ms.locfileid: "51730634"
---
# <a name="changing-view-settings-by-using-the-legacy-api"></a>Changing View Settings by Using the Legacy API
[!INCLUDE[vs2017banner](../includes/vs2017banner.md)]
Settings that affect core editor features, such as word wrap, the selection margin, and virtual space, can be changed by the user from the **Options** dialog box. However, it is also possible to change these settings programmatically.
## <a name="changing-settings-by-using-the-legacy-api"></a>Changing settings by using the legacy API
The <xref:Microsoft.VisualStudio.TextManager.Interop.IVsTextEditorPropertyCategoryContainer> interface exposes a set of properties for the text editor. The text view contains a category of properties (GUID_EditPropCategory_View_MasterSettings) that represents the set of programmatically changed settings for the text view. Once settings have been changed in this way, they cannot be changed from the **Options** dialog box until they are reset.
The following is the typical process for changing view settings for an instance of the core editor.
1. Call `QueryInterface` on the view object (<xref:Microsoft.VisualStudio.TextManager.Interop.VsTextView>) for the <xref:Microsoft.VisualStudio.TextManager.Interop.IVsTextEditorPropertyCategoryContainer> interface.
2. Call the <xref:Microsoft.VisualStudio.TextManager.Interop.IVsTextEditorPropertyCategoryContainer.GetPropertyCategory%2A> method, specifying a value of GUID_EditPropCategory_View_MasterSettings for the `rguidCategory` parameter.
   This returns a pointer to the <xref:Microsoft.VisualStudio.TextManager.Interop.IVsTextEditorPropertyContainer> interface, which contains the set of enforced properties for the view. All settings in this set are persistently enforced. If a setting is not in this set, the view follows the options specified in the **Options** dialog box or by the user's commands.
3. Call the <xref:Microsoft.VisualStudio.TextManager.Interop.IVsTextEditorPropertyContainer.SetProperty%2A> method, specifying the value of the appropriate setting in the `idprop` parameter.
   For example, to force word wrap on, call <xref:Microsoft.VisualStudio.TextManager.Interop.IVsTextEditorPropertyContainer.SetProperty%2A> and specify the value VSEDITPROPID_ViewLangOpt_WordWrap for the `idprop` parameter, with `vt` as the value. In this call, `vt` is a variant of type VT_BOOL and `vt.boolVal` is set to VARIANT_TRUE.
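Put together, the steps look roughly like the following C++ fragment. This is an illustrative, non-compilable sketch only (ATL smart pointers, no error handling); verify the exact signatures against textmgr.h before relying on it.

```cpp
// Sketch: pView is an IVsTextView* you already hold.
CComQIPtr<IVsTextEditorPropertyCategoryContainer> spPropCat(pView);   // step 1

CComPtr<IVsTextEditorPropertyContainer> spProps;
spPropCat->GetPropertyCategory(                                       // step 2
    GUID_EditPropCategory_View_MasterSettings, &spProps);

VARIANT vt;                                                           // step 3
VariantInit(&vt);
vt.vt = VT_BOOL;
vt.boolVal = VARIANT_TRUE;
spProps->SetProperty(VSEDITPROPID_ViewLangOpt_WordWrap, vt);          // force word wrap on
```

Calling `RemoveProperty(VSEDITPROPID_ViewLangOpt_WordWrap)` on the same property container later lets the setting float again.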
## <a name="resetting-changed-view-settings"></a>Resetting changed view settings
To reset any changed view setting for an instance of the core editor, call the <xref:Microsoft.VisualStudio.TextManager.Interop.IVsTextEditorPropertyContainer.RemoveProperty%2A> method, specifying the value of the appropriate setting in the `idprop` parameter.
For example, to allow word wrap to float again, you would remove it from the property category by calling <xref:Microsoft.VisualStudio.TextManager.Interop.IVsTextEditorPropertyContainer.RemoveProperty%2A> and specifying the value VSEDITPROPID_ViewLangOpt_WordWrap for the `idprop` parameter.
To remove all changed settings for the core editor at once, specify the value VSEDITPROPID_ViewComposite_AllCodeWindowDefaults for the `idprop` parameter. In this call, vt is a variant of type VT_BOOL and vt.boolVal is set to VARIANT_TRUE.
## <a name="see-also"></a>See also
[Inside the Core Editor](../extensibility/inside-the-core-editor.md)
[Accessing the Text View by Using the Legacy API](../extensibility/accessing-thetext-view-by-using-the-legacy-api.md)
[Options Dialog Box](../ide/reference/options-dialog-box-visual-studio.md)
{{{
"title": "Runner Service Account",
"date": "2-22-2018",
"author": "Justin Colbert",
"attachments": [],
"related-products" : [],
"contentIsHTML": false,
"sticky": true
}}}
### Audience
This article is to support customers of Runner, a product that enables teams, developers, and engineers to quickly provision, interact with, and modify their environments anywhere - CenturyLink Cloud, third-party cloud providers, and on-premises. Additionally, the responses in this FAQ document are specific to using the service through the Control Portal.
### Service Account Overview
The Service Account alias is a secure store of a user's credentials. These credentials are associated with a Job and a Schedule. When the schedule fires, these credentials will be used to authenticate to CenturyLink Cloud to retrieve the necessary tokens to execute the job. The credentials are stored in a secure format separate from the job definition and other account-identifying elements.
* [Create Service Account via Runner UI](#create-service-account-via-runner-ui)
* [Create Service Account via Runner API](#create-service-account-via-runner-api)
* [Get Service Account via Runner API](#get-service-account-via-runner-api)
* [Update Service Account via Runner API](#update-service-account-via-runner-api)
### Create Service Account via Runner UI
1. To create a Service Account through the Runner UI, first click the Help & Settings drop-down and select Settings.

2. On the left side, select Service Accounts to bring up a view showing all of your existing Service Accounts. Click the add button, and a form will appear where you can enter the required information.

3. Input the following information:
* The Service Account Alias, this can be anything you want but it should be descriptive.
* The CenturyLink Cloud User Name that you want associated with this Service Account.
* The password that corresponds with the User Name you entered above.
4. Click the Save button and your service account will be created.
### Create Service Account via Runner API
The Service Account alias is a secure store of a user's credentials. These credentials are associated with a Job and a Schedule. The credentials are stored in a secure format separate from the job definition and other account identifying elements. Calls to this operation must include a token acquired from the authentication endpoint. See the [Login API](https://www.ctl.io/api-docs/v2/#authentication-login) for information on acquiring this token.
**When to use it**
Use this API to create Service Account alias under your CenturyLink Cloud account and use this alias for executing scheduled jobs.
##### URL
**Structure**
`POST https://api.runner.ctl.io/serviceAccounts/{accountAlias}`
**Example**
`POST https://api.runner.ctl.io/serviceAccounts/XXXX`
##### Request
**URI Parameters**
| NAME | TYPE | DESCRIPTION | REQ.|
| --- | --- | --- | --- |
| accountAlias | string | Short code of your CenturyLink Cloud Account Alias. | Yes |
**Entity Definition**
| NAME | TYPE | DESCRIPTION | REQ.|
| --- | --- | --- | --- |
| alias | string | Enter your Service Account Alias name. | Yes |
| username | string | Your CenturyLink Cloud Account username. | Yes |
| password | string | Your CenturyLink Cloud Account password. | Yes |
##### Response
The response will list the details of the new Service Account alias created.
**Entity Definition**
| NAME | TYPE | DESCRIPTION |
| --- | --- | --- |
| id | string | ID of your new Service Account Alias. |
| accountAlias | string | Short code of your CenturyLink Cloud Account Alias. |
| alias | string | Your Service Account Alias name. |
| username | string | Your CenturyLink Cloud Account username. |
| password | string | Your CenturyLink Cloud Account password. |
**Example**
```
{
"id": "729e3136-c474-42fb-976e-53786f5fb000",
"accountAlias": "XXXX",
"alias": "demo-account-service-account1",
"username": "your-username",
"password": "your-password"
}
```
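As an illustration only (this helper is not part of the official Runner documentation), the request above could be prepared in Python; the function name and the use of the standard library are my own choices, and `XXXX` with the credentials are placeholders:

```python
import json

RUNNER_API = "https://api.runner.ctl.io"

def build_create_request(account_alias, alias, username, password):
    """Build the URL and JSON body for POST /serviceAccounts/{accountAlias}."""
    url = f"{RUNNER_API}/serviceAccounts/{account_alias}"
    body = json.dumps({"alias": alias, "username": username, "password": password})
    return url, body

url, body = build_create_request(
    "XXXX", "demo-account-service-account1", "your-username", "your-password")
# Send `body` as the POST payload to `url`, with your bearer token in the
# Authorization header (see the Login API above).
print(url)
```

From here, any HTTP client can send the payload; the point is only how the URI parameter and entity fields fit together.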
### Get Service Account via Runner API
The Service Account alias is a secure store of a user's credentials. These credentials are associated with a Job and a Schedule. The credentials are stored in a secure format separate from the job definition and other account identifying elements. Calls to this operation must include a token acquired from the authentication endpoint. See the [Login API](https://www.ctl.io/api-docs/v2/#authentication-login) for information on acquiring this token.
**When to use it**
Use this API to view all the Service Account alias under your CenturyLink Cloud account alias.
##### URL
**Structure**
`GET https://api.runner.ctl.io/serviceAccounts/{accountAlias}`
**Example**
`GET https://api.runner.ctl.io/serviceAccounts/XXXX`
##### Request
**URI Parameters**
| NAME | TYPE | DESCRIPTION | REQ.|
| --- | --- | --- | --- |
| accountAlias | string | Short code of your CenturyLink Cloud Account Alias. | Yes |
##### Response
The response will provide an array of all Service Account alias created under your CenturyLink Cloud Account.
**Entity Definition**
| NAME | TYPE | DESCRIPTION |
| --- | --- | --- |
| accounts | array | List of Service Account Alias created. |
**Example**
```
{
"accounts": [
"demo-account-service-account1",
"demo-account-service-account2",
"demo-account-service-account3",
"demo-account-service-account4"
]
}
```
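As a quick hedged sketch (the variable names are illustrative, and the payload is simply the example response above), the returned array can be consumed like any JSON document:

```python
import json

# The example response body from the docs above.
response_text = """
{
  "accounts": [
    "demo-account-service-account1",
    "demo-account-service-account2",
    "demo-account-service-account3",
    "demo-account-service-account4"
  ]
}
"""

# Extract the list of service account aliases.
accounts = json.loads(response_text)["accounts"]
print(len(accounts))  # 4
```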
### Update Service Account via Runner API
The Service Account alias is a secure store of a user's credentials. These credentials are associated with a Job and a Schedule. The credentials are stored in a secure format separate from the job definition and other account identifying elements. Calls to this operation must include a token acquired from the authentication endpoint. See the [Login API](https://www.ctl.io/api-docs/v2/#authentication-login) for information on acquiring this token.
**When to use it**
Use this API to update your existing Service Account alias credentials under your CenturyLink Cloud account.
##### URL
**Structure**
`PUT https://api.runner.ctl.io/serviceAccounts/{accountAlias}/{serviceAccountAlias}`
**Example**
`PUT https://api.runner.ctl.io/serviceAccounts/XXXX/demo-account-service-account1`
##### Request
**URI Parameters**
| NAME | TYPE | DESCRIPTION | REQ.|
| --- | --- | --- | --- |
| accountAlias | string | Short code of your CenturyLink Cloud Account Alias. | Yes |
| alias | string | Your Service Account Alias name to be updated. | Yes |
**Entity Definition**
| NAME | TYPE | DESCRIPTION | REQ.|
| --- | --- | --- | --- |
| username | string | Your CenturyLink Cloud Account username. | Yes |
| password | string | Your CenturyLink Cloud Account password. | Yes |
##### Response
The response will contain the updated Service Account alias details.
**Entity Definition**
| NAME | TYPE | DESCRIPTION |
| --- | --- | --- |
| id | string | ID of your new Service Account Alias. |
| accountAlias | string | Short code of your CenturyLink Cloud Account Alias. |
| alias | string | Your Service Account Alias name. |
| username | string | Your CenturyLink Cloud Account username. |
| password | string | Your CenturyLink Cloud Account password. |
**Example**
```
{
"id": "729e3136-c474-42fb-976e-53786f5fb000",
"accountAlias": "XXXX",
"alias": "demo-account-service-account1",
"username": "Updated - username",
"password": "Updated - password"
}
```
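As a hedged illustration (again not part of the official docs), the PUT URL for the update call can be assembled the same way; both aliases below are placeholders:

```python
def build_update_url(account_alias, service_account_alias):
    """Build the URL for PUT /serviceAccounts/{accountAlias}/{serviceAccountAlias}."""
    return (f"https://api.runner.ctl.io/serviceAccounts/"
            f"{account_alias}/{service_account_alias}")

url = build_update_url("XXXX", "demo-account-service-account1")
# PUT a JSON body with the new username/password to this URL,
# including your auth token in the Authorization header.
print(url)
```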
# Serverless REST API
## DESCRIPTION
I recently got curious about the possibility of having an API without an actual server. I got intrigued by a post I found on medium.
I decided to check out a tutorial on Serverless and here is the result. A RESTful API build with Serverless technology.
### GETTING STARTED
The application start process has been automated and the only thing you to worry about is ensuring you have a mongo DB database named `serverless` or better still update the `DATABASE_URL` in the `.env.example` file to the one you'd like to make use of.
Once this is done, run the application from your terminal with the shell script `run.sh`. Simply enter the command `./run.sh` in your terminal and follow the prompts.
The following routes are available in the application and can be accessed with any client of your choice. You can check out `POSTMAN`, `RESTMAN`.
| EndPoint | Functionality | Public Access|
| -----------------------------------------|:-----------------------------:|-------------:|
| **GET** /hello | Register a user | TRUE |
| **GET** /notes | GET all notes | TRUE |
| **POST** /notes | Create a new note | TRUE |
| **GET** /notes/:id | GET a note by ID | TRUE |
| **PUT** /notes/:id | UPDATE an existing note | TRUE |
| **DELETE** /notes/:id | DELETE an existing note | TRUE |
### LICENSE
MIT
---
title: "Federal Skilled Worker – Application for Permanent Residence"
summary: "The Federal Skilled Worker – Application for Permanent Residence service from Immigration, Refugees and Citizenship Canada is available end-to-end online, according to the GC Service Inventory."
url: "gc/ircc/1430"
department: "Immigration, Refugees and Citizenship Canada"
departmentAcronym: "ircc"
serviceId: "1430"
onlineEndtoEnd: 1
serviceDescription: "Permanent Residence granted and Permanent Resident card issued to skilled workers on the basis of their ability to become economically established in Canada and their intention to reside in a province other than Quebec."
serviceUrl: "https://www.canada.ca/en/immigration-refugees-citizenship/services/immigrate-canada/express-entry.html"
programDescription: "Federal Economic Immigration"
---
---
title: Determining the Default Namespace of a Project | Microsoft Docs
ms.custom: ''
ms.date: 2018-06-30
ms.prod: visual-studio-dev14
ms.reviewer: ''
ms.suite: ''
ms.technology:
- devlang-csharp
ms.tgt_pltfrm: ''
ms.topic: article
helpviewer_keywords:
- custom tools, computing default namespace
ms.assetid: 6d890676-7016-458c-8a6a-95cc0a068612
caps.latest.revision: 13
manager: douge
ms.openlocfilehash: 27919985c09356764533e736899dc6a7cb5d0090
ms.sourcegitcommit: 55f7ce2d5d2e458e35c45787f1935b237ee5c9f8
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 08/22/2018
ms.locfileid: "47511456"
---
# <a name="determining-the-default-namespace-of-a-project"></a>Determining the Default Namespace of a Project
For [!INCLUDE[vbprvb](../includes/vbprvb-md.md)], if the `CustomToolNamespace` property is set on the input file, then the value of `CustomToolNamespace` is the value of the default namespace parameter passed to the <xref:Microsoft.VisualStudio.Shell.Interop.IVsSingleFileGenerator.Generate%2A> method. Otherwise, the `wszDefaultNamespace` parameter passed to `Generate` always equals the root namespace. For more information about namespaces, see [Namespace Keywords](http://msdn.microsoft.com/library/091a66eb-b10d-4f54-9102-5ac0d4bdb84b).
[!INCLUDE[csprcs](../includes/csprcs-md.md)] uses folder-based namespaces. That is, the namespace consists of the root namespace plus the names of the folders that contain the custom tool. Folder names are converted into valid identifiers, and periods separate the names. For example, if the input file is FolderA\FolderB\FolderC\MyInput.txt and the root namespace is CL9, the computed default namespace would be **CL9.FolderA.FolderB.FolderC**.
An exception to this rule occurs when the folder chain contains Web references folders. For example:
- If FolderC were a Web references folder, the namespace would be **CL9.FolderC**.
- If FolderB were a Web references folder, the namespace would be **CL9.FolderB.FolderC**.
That is, the namespace uses the following format:
```
rootNamespace.webReferenceFolder.containedFolder.containedFolder ...
```
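As an illustrative sketch only (this is not part of the Visual Studio SDK, and the identifier-conversion rule shown is my assumption), the folder-based computation described above can be expressed as:

```python
import re

def default_namespace(root, folders, web_reference_folder=None):
    """Compute a folder-based default namespace from the root namespace.

    If web_reference_folder is given, folders before it are dropped,
    mirroring the Web references exception described above.
    """
    if web_reference_folder in folders:
        folders = folders[folders.index(web_reference_folder):]
    # Assumption: non-identifier characters in folder names become underscores.
    parts = [re.sub(r"\W", "_", f) for f in folders]
    return ".".join([root] + parts)

print(default_namespace("CL9", ["FolderA", "FolderB", "FolderC"]))
# CL9.FolderA.FolderB.FolderC
print(default_namespace("CL9", ["FolderA", "FolderB", "FolderC"], "FolderC"))
# CL9.FolderC
```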
## <a name="see-also"></a>See also
[Implementing Single File Generators](../extensibility/internals/implementing-single-file-generators.md)
[Registering Single File Generators](../extensibility/internals/registering-single-file-generators.md)
[Exposing Types to Visual Designers](../extensibility/internals/exposing-types-to-visual-designers.md)
# Mission
My life's mission is to
## **split the galaxy in two.**
I want to build something extraordinary that sends shockwaves through the universe, through infinity, and beyond.
I want to deliver it through products, through startups.
## Background
Growing up in Cupertino, the city where Apple's headquarters is located, I literally witnessed before my own eyes how great products change everything.
From [optimistic nihilism](https://www.youtube.com/watch?v=MBRqu0YOH14),
I strongly believe in the importance of what we choose to pursue during our short time on this earth.
### **Everything may ultimately return to nothingness, but moments of "impact" are engraved in people's hearts forever.**
If I can ship a product with enough impact to shake the world like that, nothing could matter more.
It would mean the world to me.
I consider this my mission, and I intend to pursue it for the rest of my life.
# @figma-export/core
> @figma-export core functionalities.
## Install
Using npm:
```sh
npm i -D @figma-export/core
```
or using yarn:
```sh
yarn add @figma-export/core --dev
```
I wanted to run an LSTM, but no tutorial matched my real-world task. Most tutorials work on a one-dimensional time series — sheet music, song lyrics, news articles, stock prices, and so on. I did find one tutorial that handled two-dimensional data covering several stocks at once. But I couldn't find a single tutorial for three-dimensional data (with an ID dimension added). Someone who codes well would have knocked this out quickly, but having lived a "just follow the tutorial" life, the lack of a reference felt like a giant wall. (If you know of a good reference, please recommend it.)
I'm not sure this is exactly the right approach, but it produced results, so I'm sharing it. (A better method surely exists...)
Adding an ID means each ID has its own sequence, so you have to build a separate table per ID, process it, and then combine the results into batches.
My first idea was to use a for loop to build a table for each ID. But the data for 2.5 million customers was apparently too large (and it grows with the time window): the for loop showed no sign of finishing even after a full day.
While searching for another approach, pandas MultiIndex caught my eye.
Here is the data we need to handle.
| ID | DATE | F1 | F2 | Y |
|---- |-------- |----- |---- |--- |
| a | 201812 | 43 | y | y |
| a | 201901 | 213 | n | y |
| b | 201811 | 123 | n | n |
| b | 201812 | 41 | y | y |
Before handling this data, we need to learn LSTM and MultiIndex.
Let's start with MultiIndex.
I referenced [this page](https://www.somebits.com/~nelson/pandas-multiindex-slice-demo.html).
```python
import pandas, io
```
## Create an unindexed DataFrame
```python
data = io.StringIO('''Day,Fruit,Color,Count,Price
1,Apple,Red,3,$1.29
2,Apple,Green,9,$0.99
3,Pear,Red,25,$2.59
4,Pear,Green,26,$2.79
5,Lime,Green,99,$0.39
''')
df_unindexed = pandas.read_csv(data)
df_unindexed
```
<div>
<style scoped>
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>Day</th>
<th>Fruit</th>
<th>Color</th>
<th>Count</th>
<th>Price</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>1</td>
<td>Apple</td>
<td>Red</td>
<td>3</td>
<td>$1.29</td>
</tr>
<tr>
<th>1</th>
<td>2</td>
<td>Apple</td>
<td>Green</td>
<td>9</td>
<td>$0.99</td>
</tr>
<tr>
<th>2</th>
<td>3</td>
<td>Pear</td>
<td>Red</td>
<td>25</td>
<td>$2.59</td>
</tr>
<tr>
<th>3</th>
<td>4</td>
<td>Pear</td>
<td>Green</td>
<td>26</td>
<td>$2.79</td>
</tr>
<tr>
<th>4</th>
<td>5</td>
<td>Lime</td>
<td>Green</td>
<td>99</td>
<td>$0.39</td>
</tr>
</tbody>
</table>
</div>
## Add a multi-index based on two columns
```python
df = df_unindexed.set_index(['Day','Fruit'])
df
```
<div>
<style scoped>
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th></th>
<th>Color</th>
<th>Count</th>
<th>Price</th>
</tr>
<tr>
<th>Day</th>
<th>Fruit</th>
<th></th>
<th></th>
<th></th>
</tr>
</thead>
<tbody>
<tr>
<th>1</th>
<th>Apple</th>
<td>Red</td>
<td>3</td>
<td>$1.29</td>
</tr>
<tr>
<th>2</th>
<th>Apple</th>
<td>Green</td>
<td>9</td>
<td>$0.99</td>
</tr>
<tr>
<th>3</th>
<th>Pear</th>
<td>Red</td>
<td>25</td>
<td>$2.59</td>
</tr>
<tr>
<th>4</th>
<th>Pear</th>
<td>Green</td>
<td>26</td>
<td>$2.79</td>
</tr>
<tr>
<th>5</th>
<th>Lime</th>
<td>Green</td>
<td>99</td>
<td>$0.39</td>
</tr>
</tbody>
</table>
</div>
```python
df.index
```
MultiIndex(levels=[[1, 2, 3, 4, 5], ['Apple', 'Lime', 'Pear']],
labels=[[0, 1, 2, 3, 4], [0, 0, 2, 2, 1]],
names=['Day', 'Fruit'])
The first level ([1, 2, 3, 4, 5]) is level 0 and the next one is level 1. Sorting and slicing follow this order.
## Slicing the data frame
```python
df.sort_index(inplace=True,level=0)
```
level=0 means we sort by the Day index.
Sorting the index is required before slicing.
```python
df
```
<div>
<style scoped>
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th></th>
<th>Color</th>
<th>Count</th>
<th>Price</th>
</tr>
<tr>
<th>Day</th>
<th>Fruit</th>
<th></th>
<th></th>
<th></th>
</tr>
</thead>
<tbody>
<tr>
<th>1</th>
<th>Apple</th>
<td>Red</td>
<td>3</td>
<td>$1.29</td>
</tr>
<tr>
<th>2</th>
<th>Apple</th>
<td>Green</td>
<td>9</td>
<td>$0.99</td>
</tr>
<tr>
<th>3</th>
<th>Pear</th>
<td>Red</td>
<td>25</td>
<td>$2.59</td>
</tr>
<tr>
<th>4</th>
<th>Pear</th>
<td>Green</td>
<td>26</td>
<td>$2.79</td>
</tr>
<tr>
<th>5</th>
<th>Lime</th>
<td>Green</td>
<td>99</td>
<td>$0.39</td>
</tr>
</tbody>
</table>
</div>
### 1. Slicing with loc
```python
df.loc[(slice(1,4),slice(None)),:]
```
<div>
<style scoped>
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th></th>
<th>Color</th>
<th>Count</th>
<th>Price</th>
</tr>
<tr>
<th>Day</th>
<th>Fruit</th>
<th></th>
<th></th>
<th></th>
</tr>
</thead>
<tbody>
<tr>
<th>1</th>
<th>Apple</th>
<td>Red</td>
<td>3</td>
<td>$1.29</td>
</tr>
<tr>
<th>2</th>
<th>Apple</th>
<td>Green</td>
<td>9</td>
<td>$0.99</td>
</tr>
<tr>
<th>3</th>
<th>Pear</th>
<td>Red</td>
<td>25</td>
<td>$2.59</td>
</tr>
<tr>
<th>4</th>
<th>Pear</th>
<td>Green</td>
<td>26</td>
<td>$2.79</td>
</tr>
</tbody>
</table>
</div>
Because this is a MultiIndex, loc takes a tuple:
.loc[(level-0 index, level-1 index), :]
The syntax is slice(start, stop), but note that stop is inclusive in the output.
### 2. Slicing with idx
```python
idx = pandas.IndexSlice
df.loc[idx[1:4,:], :]
```
<div>
<style scoped>
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th></th>
<th>Color</th>
<th>Count</th>
<th>Price</th>
</tr>
<tr>
<th>Day</th>
<th>Fruit</th>
<th></th>
<th></th>
<th></th>
</tr>
</thead>
<tbody>
<tr>
<th>1</th>
<th>Apple</th>
<td>Red</td>
<td>3</td>
<td>$1.29</td>
</tr>
<tr>
<th>2</th>
<th>Apple</th>
<td>Green</td>
<td>9</td>
<td>$0.99</td>
</tr>
<tr>
<th>3</th>
<th>Pear</th>
<td>Red</td>
<td>25</td>
<td>$2.59</td>
</tr>
<tr>
<th>4</th>
<th>Pear</th>
<td>Green</td>
<td>26</td>
<td>$2.79</td>
</tr>
</tbody>
</table>
</div>
## Change multiindex order
```python
df.index
```
MultiIndex(levels=[[1, 2, 3, 4, 5], ['Apple', 'Lime', 'Pear']],
labels=[[0, 1, 2, 3, 4], [0, 0, 2, 2, 1]],
names=['Day', 'Fruit'])
```python
df=df.reindex(['Apple', 'Pear', 'Lime'],level=1)
df.index
```
MultiIndex(levels=[[1, 2, 3, 4, 5], ['Apple', 'Pear', 'Lime']],
labels=[[0, 1, 2, 3, 4], [0, 0, 1, 1, 2]],
names=['Day', 'Fruit'])
```python
df.loc[idx[:,'Apple':'Lime'], :]
```
<div>
<style scoped>
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th></th>
<th>Color</th>
<th>Count</th>
<th>Price</th>
</tr>
<tr>
<th>Day</th>
<th>Fruit</th>
<th></th>
<th></th>
<th></th>
</tr>
</thead>
<tbody>
<tr>
<th>1</th>
<th>Apple</th>
<td>Red</td>
<td>3</td>
<td>$1.29</td>
</tr>
<tr>
<th>2</th>
<th>Apple</th>
<td>Green</td>
<td>9</td>
<td>$0.99</td>
</tr>
<tr>
<th>3</th>
<th>Pear</th>
<td>Red</td>
<td>25</td>
<td>$2.59</td>
</tr>
<tr>
<th>4</th>
<th>Pear</th>
<td>Green</td>
<td>26</td>
<td>$2.79</td>
</tr>
<tr>
<th>5</th>
<th>Lime</th>
<td>Green</td>
<td>99</td>
<td>$0.39</td>
</tr>
</tbody>
</table>
</div>
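Coming back to the original goal: here is a minimal sketch (my own illustration, with toy values matching the small ID/DATE table at the top) of turning a MultiIndexed (ID, DATE) frame into fixed-length, zero-padded sequences per ID, ready to batch into an LSTM:

```python
import numpy as np
import pandas as pd

# Toy version of the ID/DATE table from the introduction (one feature, F1).
df = pd.DataFrame({
    "ID":   ["a", "a", "b", "b"],
    "DATE": [201812, 201901, 201811, 201812],
    "F1":   [43, 213, 123, 41],
}).set_index(["ID", "DATE"]).sort_index(level=0)

SEQ_LEN = 2

def to_batches(frame, seq_len):
    """Collect one (seq_len, n_features) array per ID, zero-padding shorter IDs."""
    batches = []
    for _, group in frame.groupby(level=0):   # one sub-table per ID
        values = group.to_numpy()[-seq_len:]  # keep the most recent seq_len rows
        pad = np.zeros((seq_len - len(values), values.shape[1]))
        batches.append(np.vstack([pad, values]))
    return np.stack(batches)                  # shape: (n_ids, seq_len, n_features)

X = to_batches(df, SEQ_LEN)
print(X.shape)  # (2, 2, 1)
```

This avoids the per-ID for loop over raw tables: the sorted MultiIndex lets groupby hand you each ID's sequence, and the stacked array is exactly the (batch, timesteps, features) shape an LSTM expects.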
---
] | null | null | null | ---
title: Required DXGI formats
description: This topic presents the requirements that Microsoft Direct3D feature levels place on the user-mode display driver.
ms.assetid: 1CB419B9-DD5E-492F-AAAC-CFFFDE247F7F
ms.date: 04/20/2017
ms.localizationpriority: medium
---
# Required DXGI formats
This topic presents the requirements that Microsoft Direct3D feature levels place on the user-mode display driver.
The first and second columns of the first table show all Direct3D format types that the driver must support. The third column shows all associated constant values of the Direct3D [**D3D10\_FORMAT\_SUPPORT**](/windows/desktop/api/d3d10/ne-d3d10-d3d10_format_support) and/or [**D3D11\_FORMAT\_SUPPORT**](/windows/desktop/api/d3d11/ne-d3d11-d3d11_format_support) enumerations that the driver must support. The fourth column shows the minimum Direct3D feature level at which the driver must support each format.
The second table shows the Direct3D 10Level 9 support algorithm for each enumeration value.
<table>
<colgroup>
<col width="25%" />
<col width="25%" />
<col width="25%" />
<col width="25%" />
</colgroup>
<thead>
<tr class="header">
<th align="left">D3D9 format (D3DDDIFMT_* and/or D3DDECLTYPE_*)</th>
<th align="left">D3D10+ API equivalent (DXGI_FORMAT_*)</th>
<th align="left">Required D3D10_ or D3D11_ FORMAT_SUPPORT_* enumeration values</th>
<th align="left">Minimum required Direct3D level</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td align="left">A32B32G32R32F or D3DDECLTYPE_FLOAT4</td>
<td align="left">R32G32B32A32_FLOAT</td>
<td align="left"><p>IA_VERTEX_BUFFER</p>
<p>TEXTURE2D</p>
<p>TEXTURE3D</p>
<p>TEXTURECUBE</p>
<p>SHADER_LOAD</p>
<p>MIP</p>
<p>MIP_AUTOGEN</p>
<p>RENDER_TARGET</p>
<p>CPU_LOCKABLE</p></td>
<td align="left"><p>9_1</p>
<p>9_2</p>
<p>9_3</p>
<p>9_3</p>
<p>9_2</p>
<p>9_3</p>
<p>9_3</p>
<p>9_2</p>
<p>9_2</p></td>
</tr>
<tr class="even">
<td align="left">D3DDECLTYPE_FLOAT3</td>
<td align="left">R32G32B32_FLOAT</td>
<td align="left"><p>IA_VERTEX_BUFFER</p></td>
<td align="left"><p>9_1</p></td>
</tr>
<tr class="odd">
<td align="left">A16B16G16R16F or D3DDECLTYPE_FLOAT16_4</td>
<td align="left">R16G16B16A16_FLOAT</td>
<td align="left"><p>IA_VERTEX_BUFFER</p>
<p>TEXTURE2D</p>
<p>TEXTURE3D</p>
<p>TEXTURECUBE</p>
<p>SHADER_LOAD</p>
<p>MIP</p>
<p>MIP_AUTOGEN</p>
<p>RENDER_TARGET</p>
<p>BLENDABLE</p>
<p>CPU_LOCKABLE</p></td>
<td align="left"><p>9_3</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_3</p>
<p>9_2</p></td>
</tr>
<tr class="even">
<td align="left">A16B16G16R16 or D3DDECLTYPE_USHORT4N</td>
<td align="left">R16G16B16A16_UNORM</td>
<td align="left"><p>TEXTURE2D</p>
<p>TEXTURE3D</p>
<p>TEXTURECUBE</p>
<p>SHADER_LOAD</p>
<p>SHADER_SAMPLE</p>
<p>MIP</p>
<p>MIP_AUTOGEN</p>
<p>RENDER_TARGET</p>
<p>CPU_LOCKABLE</p></td>
<td align="left"><p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p></td>
</tr>
<tr class="odd">
<td align="left">Q16W16V16U16 or D3DDECLTYPE_SHORT4N</td>
<td align="left">R16G16B16A16_SNORM</td>
<td align="left"><p>IA_VERTEX_BUFFER</p></td>
<td align="left"><p>9_1</p></td>
</tr>
<tr class="even">
<td align="left">D3DDECLTYPE_SHORT4</td>
<td align="left">R16G16B16A16_SINT</td>
<td align="left"><p>IA_VERTEX_BUFFER</p></td>
<td align="left"><p>9_1</p></td>
</tr>
<tr class="odd">
<td align="left">G32R32F or D3DDECLTYPE_FLOAT2</td>
<td align="left">R32G32_FLOAT</td>
<td align="left"><p>IA_VERTEX_BUFFER</p>
<p>TEXTURE2D</p>
<p>TEXTURE3D</p>
<p>TEXTURECUBE</p>
<p>SHADER_LOAD</p>
<p>RENDER_TARGET</p>
<p>CPU_LOCKABLE</p></td>
<td align="left"><p>9_1</p>
<p>9_3</p>
<p>9_3</p>
<p>9_3</p>
<p>9_3</p>
<p>9_3</p>
<p>9_3</p></td>
</tr>
<tr class="even">
<td align="left">D3DDECLTYPE_UBYTE4</td>
<td align="left">R8G8B8A8_UINT</td>
<td align="left"><p>IA_VERTEX_BUFFER</p></td>
<td align="left"><p>9_1</p></td>
</tr>
<tr class="odd">
<td align="left">A8R8G8B8 or D3DDECLTYPE_UBYTE4N</td>
<td align="left">R8G8B8A8_UNORM</td>
<td align="left"><p>IA_VERTEX_BUFFER</p>
<p>TEXTURE2D</p>
<p>TEXTURE3D</p>
<p>TEXTURECUBE</p>
<p>SHADER_LOAD</p>
<p>SHADER_SAMPLE</p>
<p>MIP</p>
<p>MIP_AUTOGEN</p>
<p>RENDER_TARGET</p>
<p>BLENDABLE</p>
<p>CPU_LOCKABLE</p>
<p>DISPLAY</p>
<p>BACK_BUFFER_CAST</p></td>
<td align="left"><p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p></td>
</tr>
<tr class="even">
<td align="left">A8R8G8B8</td>
<td align="left">R8G8B8A8_UNORM_SRGB</td>
<td align="left"><p>TEXTURE2D</p>
<p>TEXTURE3D</p>
<p>TEXTURECUBE</p>
<p>SHADER_LOAD</p>
<p>SHADER_SAMPLE</p>
<p>MIP</p>
<p>MIP_AUTOGEN</p>
<p>RENDER_TARGET</p>
<p>BLENDABLE</p>
<p>CPU_LOCKABLE</p>
<p>DISPLAY</p>
<p>BACK_BUFFER_CAST</p></td>
<td align="left"><p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p></td>
</tr>
<tr class="odd">
<td align="left">Q8W8V8U8</td>
<td align="left">R8G8B8A8_SNORM</td>
<td align="left"><p>TEXTURE2D</p>
<p>TEXTURECUBE</p>
<p>SHADER_LOAD</p>
<p>SHADER_SAMPLE</p>
<p>MIP</p>
<p>CPU_LOCKABLE</p></td>
<td align="left"><p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p></td>
</tr>
<tr class="even">
<td align="left">A8R8G8B8</td>
<td align="left">B8G8R8A8_UNORM</td>
<td align="left"><p>TEXTURE2D</p>
<p>TEXTURE3D</p>
<p>TEXTURECUBE</p>
<p>SHADER_LOAD</p>
<p>SHADER_SAMPLE</p>
<p>MIP</p>
<p>MIP_AUTOGEN</p>
<p>RENDER_TARGET</p>
<p>BLENDABLE</p>
<p>CPU_LOCKABLE</p>
<p>DISPLAY</p>
<p>BACK_BUFFER_CAST</p></td>
<td align="left"><p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p></td>
</tr>
<tr class="odd">
<td align="left">X8R8G8B8</td>
<td align="left">B8G8R8X8_UNORM</td>
<td align="left"><p>TEXTURE2D</p>
<p>TEXTURE3D</p>
<p>TEXTURECUBE</p>
<p>SHADER_LOAD</p>
<p>SHADER_SAMPLE</p>
<p>MIP</p>
<p>MIP_AUTOGEN</p>
<p>RENDER_TARGET</p>
<p>BLENDABLE</p>
<p>CPU_LOCKABLE</p></td>
<td align="left"><p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p></td>
</tr>
<tr class="even">
<td align="left">A8R8G8B8</td>
<td align="left">B8G8R8A8_UNORM_SRGB</td>
<td align="left"><p>TEXTURE2D</p>
<p>TEXTURE3D</p>
<p>TEXTURECUBE</p>
<p>SHADER_LOAD</p>
<p>SHADER_SAMPLE</p>
<p>MIP</p>
<p>MIP_AUTOGEN</p>
<p>RENDER_TARGET</p>
<p>BLENDABLE</p>
<p>CPU_LOCKABLE</p>
<p>DISPLAY</p>
<p>BACK_BUFFER_CAST</p></td>
<td align="left"><p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p></td>
</tr>
<tr class="odd">
<td align="left">X8R8G8B8</td>
<td align="left">B8G8R8X8_UNORM_SRGB</td>
<td align="left"><p>TEXTURE2D</p>
<p>TEXTURE3D</p>
<p>TEXTURECUBE</p>
<p>SHADER_LOAD</p>
<p>SHADER_SAMPLE</p>
<p>MIP</p>
<p>MIP_AUTOGEN</p>
<p>RENDER_TARGET</p>
<p>BLENDABLE</p>
<p>CPU_LOCKABLE</p></td>
<td align="left"><p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p></td>
</tr>
<tr class="even">
<td align="left">G16R16F or D3DDECLTYPE_FLOAT16_2</td>
<td align="left">R16G16_FLOAT</td>
<td align="left"><p>IA_VERTEX_BUFFER</p>
<p>TEXTURE2D</p>
<p>TEXTURE3D</p>
<p>TEXTURECUBE</p>
<p>SHADER_LOAD</p>
<p>MIP</p>
<p>MIP_AUTOGEN</p>
<p>RENDER_TARGET</p>
<p>CPU_LOCKABLE</p></td>
<td align="left"><p>9_3</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p></td>
</tr>
<tr class="odd">
<td align="left">G16R16 or D3DDECLTYPE_USHORT2N</td>
<td align="left">R16G16_UNORM</td>
<td align="left"><p>TEXTURE2D</p>
<p>TEXTURE3D</p>
<p>TEXTURECUBE</p>
<p>SHADER_LOAD</p>
<p>SHADER_SAMPLE</p>
<p>MIP</p>
<p>MIP_AUTOGEN</p>
<p>RENDER_TARGET</p>
<p>CPU_LOCKABLE</p></td>
<td align="left"><p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p></td>
</tr>
<tr class="even">
<td align="left">V16U16 or D3DDECLTYPE_SHORT2N</td>
<td align="left">R16G16_SNORM</td>
<td align="left"><p>IA_VERTEX_BUFFER</p>
<p>TEXTURE2D</p>
<p>TEXTURE3D</p>
<p>TEXTURECUBE</p>
<p>SHADER_LOAD</p>
<p>SHADER_SAMPLE</p>
<p>MIP</p>
<p>CPU_LOCKABLE</p></td>
<td align="left"><p>9_1</p>
<p>9_1</p>
<p>9_2</p>
<p>9_2</p>
<p>9_1</p>
<p>9_2</p>
<p>9_1</p>
<p>9_1</p></td>
</tr>
<tr class="odd">
<td align="left">D3DDECLTYPE_SHORT2</td>
<td align="left">R16G16_SINT</td>
<td align="left"><p>IA_VERTEX_BUFFER</p></td>
<td align="left"><p>9_1</p></td>
</tr>
<tr class="even">
<td align="left">R32F or D3DDECLTYPE_FLOAT1</td>
<td align="left">R32_FLOAT</td>
<td align="left"><p>IA_VERTEX_BUFFER</p>
<p>TEXTURE2D</p>
<p>TEXTURE3D</p>
<p>TEXTURECUBE</p>
<p>SHADER_LOAD</p>
<p>MIP</p>
<p>MIP_AUTOGEN</p>
<p>RENDER_TARGET</p>
<p>CPU_LOCKABLE</p></td>
<td align="left"><p>9_1</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p></td>
</tr>
<tr class="odd">
<td align="left"></td>
<td align="left">R32_UINT</td>
<td align="left"><p>IA_INDEX_BUFFER</p></td>
<td align="left"><p>9_1</p></td>
</tr>
<tr class="even">
<td align="left">S8D24 or D24S8</td>
<td align="left">D24_UNORM_S8_UINT</td>
<td align="left"><p>TEXTURE2D</p>
<p>DEPTH_STENCIL</p></td>
<td align="left"><p>9_1</p>
<p>9_1</p></td>
</tr>
<tr class="odd">
<td align="left">L16</td>
<td align="left">R16_UNORM</td>
<td align="left"><p>TEXTURE2D</p>
<p>TEXTURE3D</p>
<p>TEXTURECUBE</p>
<p>SHADER_LOAD</p>
<p>SHADER_SAMPLE</p>
<p>MIP</p>
<p>CPU_LOCKABLE</p></td>
<td align="left"><p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p>
<p>9_2</p></td>
</tr>
<tr class="even">
<td align="left"></td>
<td align="left">R16_UINT</td>
<td align="left"><p>IA_INDEX_BUFFER</p></td>
<td align="left"><p>9_1</p></td>
</tr>
<tr class="odd">
<td align="left">D16 or D16_LOCKABLE</td>
<td align="left">D16_UNORM</td>
<td align="left"><p>TEXTURE2D</p>
<p>DEPTH_STENCIL</p></td>
<td align="left"><p>9_1</p>
<p>9_1</p></td>
</tr>
<tr class="even">
<td align="left">V8U8</td>
<td align="left">R8G8_SNORM</td>
<td align="left"><p>TEXTURE2D</p>
<p>SHADER_LOAD</p>
<p>SHADER_SAMPLE</p>
<p>MIP</p>
<p>CPU_LOCKABLE</p></td>
<td align="left"><p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p></td>
</tr>
<tr class="odd">
<td align="left">L8</td>
<td align="left">R8_UNORM</td>
<td align="left"><p>TEXTURE2D</p>
<p>TEXTURE3D</p>
<p>TEXTURECUBE</p>
<p>SHADER_LOAD</p>
<p>SHADER_SAMPLE</p>
<p>MIP</p>
<p>CPU_LOCKABLE</p></td>
<td align="left"><p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p></td>
</tr>
<tr class="even">
<td align="left">DXT1</td>
<td align="left">BC1_UNORM or BC1_UNORM_SRGB</td>
<td align="left"><p>TEXTURE2D</p>
<p>TEXTURECUBE</p>
<p>SHADER_LOAD</p>
<p>SHADER_SAMPLE</p>
<p>MIP</p>
<p>CPU_LOCKABLE</p></td>
<td align="left"><p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p></td>
</tr>
<tr class="odd">
<td align="left">DXT2</td>
<td align="left">BC2_UNORM or BC2_UNORM_SRGB</td>
<td align="left"><p>TEXTURE2D</p>
<p>TEXTURECUBE</p>
<p>SHADER_LOAD</p>
<p>SHADER_SAMPLE</p>
<p>MIP</p>
<p>CPU_LOCKABLE</p></td>
<td align="left"><p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p></td>
</tr>
<tr class="even">
<td align="left">DXT4</td>
<td align="left">BC3_UNORM or BC3_UNORM_SRGB</td>
<td align="left"><p>TEXTURE2D</p>
<p>TEXTURECUBE</p>
<p>SHADER_LOAD</p>
<p>SHADER_SAMPLE</p>
<p>MIP</p>
<p>CPU_LOCKABLE</p></td>
<td align="left"><p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p>
<p>9_1</p></td>
</tr>
</tbody>
</table>
<table>
<colgroup>
<col width="50%" />
<col width="50%" />
</colgroup>
<thead>
<tr class="header">
<th align="left">Required D3D10_ or D3D11_ FORMAT_SUPPORT_* enumeration values</th>
<th align="left">Support algorithm in Direct3D 10Level 9</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td align="left"><p>BACK_BUFFER_CAST</p></td>
<td align="left"><p>Assumed true for any format that supports DISPLAY.</p></td>
</tr>
<tr class="even">
<td align="left"><p>BLENDABLE</p></td>
<td align="left"><p>No FORMATOP_NOALPHABLEND</p></td>
</tr>
<tr class="odd">
<td align="left"><p>CPU_LOCKABLE</p></td>
<td align="left"><p>Assumed always true.</p></td>
</tr>
<tr class="even">
<td align="left"><p>DISPLAY</p></td>
<td align="left"><p>Hard-coded.</p></td>
</tr>
<tr class="odd">
<td align="left"><p>IA_VERTEX_BUFFER</p></td>
<td align="left"><p>D3DDTCAPS_* (See Note.)</p></td>
</tr>
<tr class="even">
<td align="left"><p>MIP</p></td>
<td align="left"><p>No FORMATOP_NOTEXCOORDWRAPNORMIP</p></td>
</tr>
<tr class="odd">
<td align="left"><p>MIP_AUTOGEN</p></td>
<td align="left"><p>(See Note.)</p></td>
</tr>
<tr class="even">
<td align="left"><p>RENDER_TARGET</p></td>
<td align="left"><p>FORMATOP_OFFSCREEN_RENDERTARGET</p></td>
</tr>
<tr class="odd">
<td align="left"><p>SHADER_LOAD</p></td>
<td align="left"><p>Assumed for all non-depth formats.</p></td>
</tr>
<tr class="even">
<td align="left"><p>SHADER_SAMPLE</p></td>
<td align="left"><p>(See Note.)</p></td>
</tr>
<tr class="odd">
<td align="left"><p>TEXTURE2D</p></td>
<td align="left"><p>FORMATOP_TEXTURE</p></td>
</tr>
<tr class="even">
<td align="left"><p>TEXTURE3D</p></td>
<td align="left"><p>FORMATOP_VOLUMETEXTURE</p></td>
</tr>
<tr class="odd">
<td align="left"><p>TEXTURECUBE</p></td>
<td align="left"><p>FORMATOP_CUBETEXTURE</p></td>
</tr>
</tbody>
</table>
**Note** These are further details on the support algorithm's requirements in Direct3D 10Level 9:
- The IA\_VERTEX\_BUFFER and/or IA\_INDEX\_BUFFER formats are supported by software vertex processing if there is no D3DDEVCAPS\_HWTRANSFORMANDLIGHT capability.
- The TEXTURE2D format can also be inferred from it being a depth-stencil format.
- For the SHADER\_SAMPLE format, the driver must support FORMATOP\_TEXTURE, FORMATOP\_VOLUMETEXTURE, or FORMATOP\_CUBETEXTURE, and it must not report FORMATOP\_NOFILTER.
- For the MIP\_AUTOGEN format, Direct3D 10Level 9 generates its own mip-maps, so it requires MIP, RENDER\_TARGET, and TEXTURE2D bits.
# Introduction
The line chart component is used to draw data trend charts:

- one or more curves can be drawn;
- the style of each curve can be configured independently;
- up to two axes (left and right) are supported, and each curve is free to use either one;
- almost every part of the component is configurable.
## Preview
<img :src="$withBase('/imgs/linechart.jpg')" width=600>
## Code example

1. Add the canvases in wxml
``` html
<canvas
canvas-id="linechart1"
style="width:414px;height:200px;margin:0;position: absolute; left: 0; top: 0"
bindtouchstart="bindtouchstart"
bindtouchmove="bindtouchmove"
bindtouchend="bindtouchend"
/>
<canvas
canvas-id="linechart2"
style="width:414px;height:200px;margin:0;position: absolute; left: 0; top: 0"
bindtouchstart="bindtouchstart"
bindtouchmove="bindtouchmove"
bindtouchend="bindtouchend"
/>
```
2. Instantiate the component in js
``` js
import LineChart from 'miniapp-charts';
Page({
onLoad() {
this.init();
},
init() {
let linechart = new LineChart(
wx.createCanvasContext('linechart1'),
{
height: 200,
},
wx.createCanvasContext('linechart2'),
);
this.linechart = linechart;
let points = [];
for ( let i = 0; i < 108;i++) {
points.push({
x: i + 1,
y: Math.ceil(Math.random()*30),
});
}
linechart.draw({
datasets: [
{
points : points,
lineName: 'test',
},
]
});
},
bindtouchstart(e) {
this.linechart.touch(e);
},
bindtouchmove(e) {
this.linechart.touch(e);
},
bindtouchend(e) {
this.linechart.touchEnd(e);
}
});
```
# nwg-panel
This application is a part of the [nwg-shell](https://github.com/nwg-piotr/nwg-shell) project.
I have been using sway since 2019 and find it the most comfortable working environment, but...
Have you ever missed all the graphical bells and whistles in your panel, we used to have in tint2? It happens to me.
That's why I decided to try to code a GTK-based panel, including best features from my two favourites:
[Waybar](https://github.com/Alexays/Waybar) and [tint2](https://gitlab.com/o9000/tint2). Many thanks to Developers
and Contributors of the both projects!
There are 12 modules available at the moment, and I don't plan on many more. Basis system controls are available in the
Controls module, and whatever else you may need,
[there's an executor for that](https://github.com/nwg-piotr/nwg-panel/wiki/Some-useful-executors).

[](https://repology.org/project/nwg-panel/versions)
## Modules
### Controls
Panel widget with a popup window, including sliders, some system info, user-defined rows
and customizable menu *(top right in the picture)*.
### SwayNC
Provides integration with Eric Reider's [Sway Notification Center](https://github.com/ErikReider/SwayNotificationCenter).
### Tray
Supports SNI based system tray.
### Clock
Just a label to show the `date` command output in the format of your choice *(top center)*.
### Playerctl
Set of buttons, and a label to control mpris media player with the
[Playerctl utility](https://github.com/altdesktop/playerctl) *(top left)*.
### SwayTaskbar
Shows tasks from a selected or all outputs, with the program icon and name; allows switching between them,
toggle the container layout (`tabbed stacking splitv splith`) with the mouse scroller, move to workspaces,
toggle floating and kill with the right-click menu *(bottom left)*;
### SwayWorkspaces
Set of textual buttons to switch between workspaces, and a label to see the current task icon and title.
### Scratchpad
Displays clickable icons representing windows moved to the sway scratchpad;
### DWL Tags
The DwlTags module displays tags (free, occupied, selected), layout and the active window name from the dwl Wayland
compositor. The `nwg-dwl-interface` command provides dwl -> panel communication. It also executes the autostart script,
if found.
### Executor
The Executor module displays an icon, and a label on the basis of a script output, in user-defined intervals.
### CustomButton
Simple Gtk.Button with an icon, and a command assigned to it *(top left corner)*;
### MenuStart
Allows defining settings for the Menu Start plugin.
See [Wiki](https://github.com/nwg-piotr/nwg-panel/wiki) for more information. You'll also find some useful
executor examples there.
| 37.987013 | 130 | 0.765812 | eng_Latn | 0.990042 |
] | null | null | null | # py-aws
Python sample code to connect to S3 & EC2.
## Prerequisites
- Python 3.4 or above
- boto3 https://github.com/boto/boto3
- Flask http://flask.pocoo.org/
## Set up a virtual environment
- pull code
- run the following command
```
#python3 -m venv py-aws
#cd py-aws && source bin/activate
#pip install -r requirements.txt
```
- Add some information for this simple application
```
#vi config.py
add your AWS access ID & Key and Region
if required, change the site information for Flask
```
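For reference, a `config.py` for a setup like this often looks something like the sketch below. The variable names here are illustrative assumptions, not necessarily the ones this project actually reads, so check the project's own `config.py` template.

```python
# Hypothetical config.py layout -- the names below are assumptions
# for illustration, not necessarily the ones this project expects.

# AWS credentials and region
AWS_ACCESS_KEY_ID = "YOUR_ACCESS_KEY_ID"
AWS_SECRET_ACCESS_KEY = "YOUR_SECRET_ACCESS_KEY"
AWS_REGION = "us-east-1"

# Flask site settings
SITE_ADDRESS = "0.0.0.0"   # listen on all interfaces
SITE_PORT = 5000
```

Keeping credentials in a module like this is convenient for a demo, but for anything shared, prefer environment variables or an AWS credentials profile so keys never land in version control.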
## How to run
- run S3 application
```
#python3 py-s3.py
test this with http://SITE_ADDRESS:SITE_PORT
```

- run EC2 application
```
#python3 py-ec2.py
test this with http://SITE_ADDRESS:SITE_PORT
```

---
title: materialized_view() (scope function) - Azure Data Explorer
description: This article describes materialized_view() function in Azure Data Explorer.
ms.reviewer: yifats
ms.topic: reference
ms.date: 08/30/2020
---
# materialized_view() function
References the materialized part of a [materialized view](../management/materialized-views/materialized-view-overview.md).
The `materialized_view()` function supports a way of querying the *materialized* part only of the view, while specifying the max latency the user is willing to tolerate. This option isn't guaranteed to return the most up-to-date records, but should always be more performant than querying the entire view. This function is useful for scenarios in which you're willing to sacrifice some freshness for performance, for example in telemetry dashboards.
<!--- csl --->
```
materialized_view('ViewName')
```
## Syntax
`materialized_view` `(`*ViewName*`,` [*max_age*] `)`
## Arguments
* *ViewName*: The name of the `materialized view`.
* *max_age*: Optional. If not provided, only the *materialized* part of the view is returned. If provided, the function will return the
_materialized_ part of the view if the last materialization time is greater than [@now - max_age]. Otherwise, the entire view is returned (identical
to querying *ViewName* directly).
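The *max_age* rule can be modeled outside of Kusto as well. The sketch below (Python, purely illustrative; the function name and shape are assumptions, not part of the Kusto API) shows the decision the engine makes:

```python
from datetime import datetime, timedelta

def serves_materialized_part(last_materialization: datetime,
                             max_age: timedelta,
                             now: datetime) -> bool:
    """Return True when only the materialized part would be served.

    Mirrors the documented rule: the materialized part is used when the
    last materialization happened after (now - max_age); otherwise the
    entire view is queried.
    """
    return last_materialization > now - max_age

# Example: view materialized 5 minutes ago vs. 30 minutes ago,
# with a caller that tolerates 10 minutes of staleness.
now = datetime(2020, 8, 30, 12, 0, 0)
recent = now - timedelta(minutes=5)
stale = now - timedelta(minutes=30)

assert serves_materialized_part(recent, timedelta(minutes=10), now) is True
assert serves_materialized_part(stale, timedelta(minutes=10), now) is False
```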
## Examples
Query the *materialized* part of the view only, independent on when it was last materialized.
<!-- csl -->
```
materialized_view("ViewName")
```
Query the *materialized* part only if it was materialized in the last 10 minutes. If the materialized part is older than 10 minutes, return the full view. This option is expected to be less performant than querying the materialized part.
<!-- csl -->
```
materialized_view("ViewName", 10m)
```
## Notes
* Once a view is created, it can be queried just as any other table in the database, including participate in cross-cluster / cross-database queries.
* Materialized views aren't included in wildcard unions or searches.
* Syntax for querying the view is the view name (like a table reference).
* Querying the materialized view will always return the most up-to-date results, based on all records ingested to the source table. The query combines the materialized part of the view with all unmaterialized records in the source table. For more information, see [behind the scenes](../management/materialized-views/materialized-view-overview.md#how-materialized-views-work) for details.
| 46.773585 | 449 | 0.768455 | eng_Latn | 0.996402 |
https://jenkins-zh.cn/
# Weibo Scraper
[](https://pypi.org/project/weibo-scraper/)
[](https://docs.python.org/3/whatsnew/3.6.html)
[](https://travis-ci.org/Xarrow/weibo-scraper)
[](https://codecov.io/gh/Xarrow/weibo-scraper)
----
A simple Weibo tweet scraper that crawls Weibo tweets without authorization.
There are many limitations in the official API.
In general, we can instead inspect the mobile site, which has its own API, using Chrome.
----
# Why
1. Crawl Weibo data for big-data research.
2. Back up data against Weibo's shameful blockade.
----
# Installation
### pip
```shell
$ pip install weibo-scraper
```
Or Upgrade it.
```shell
$ pip install --upgrade weibo-scraper
```
### pipenv
```shell
$ pipenv install weibo-scraper
```
Or Upgrade it.
```shell
$ pipenv update --outdated # show packages which are outdated
$ pipenv update weibo-scraper # just update weibo-scraper
```
Only Python 3.6+ is supported
----
# Usage
### CLI
```bash
$ weibo-scraper -h
usage: weibo-scraper [-h] [-u U] [-p P] [-o O] [-f FORMAT]
[-efn EXPORTED_FILE_NAME] [-s] [-d] [--more] [-v]
weibo-scraper-1.0.7-beta 🚀
optional arguments:
-h, --help show this help message and exit
-u U username [nickname] which want to exported
-p P pages which exported [ default 1 page ]
-o O output file path which expected [ default 'current
dir' ]
-f FORMAT, --format FORMAT
format which expected [ default 'txt' ]
-efn EXPORTED_FILE_NAME, --exported_file_name EXPORTED_FILE_NAME
file name which expected
-s, --simplify simplify available info
-d, --debug open debug mode
--more more
-v, --version weibo scraper version
```
### API
1. First, you can get a Weibo profile by `name` or `uid`.
```python
>>> from weibo_scraper import get_weibo_profile
>>> weibo_profile = get_weibo_profile(name='来去之间',)
>>> ....
```
You will get a Weibo profile response, which is of type `weibo_base.UserMeta`. This response includes the fields below:
field|chinese|type|sample|ext
---|---|---|---|---
id|用户id|str||
screen_name|微博昵称|Option[str]||
avatar_hd|高清头像|Option[str]|'https://ww2.sinaimg.cn/orj480/4242e8adjw8elz58g3kyvj20c80c8myg.jpg'|
cover_image_phone|手机版封面|Option[str]|'https://tva1.sinaimg.cn/crop.0.0.640.640.640/549d0121tw1egm1kjly3jj20hs0hsq4f.jpg'|
description| 描述|Option[str]||
follow_count|关注数|Option[int]|3568|
follower_count|被关注数|Option[int]|794803|
gender|性别|Option[str]|'m'/'f'|
raw_user_response|原始返回|Option[dict]||
2. Secondly , via `tweet_container_id` to get weibo tweets is a rare way to use but it also works well .
```python
>>> from weibo_scraper import get_weibo_tweets
>>> for tweet in get_weibo_tweets(tweet_container_id='1076033637346297',pages=1):
>>> print(tweet)
>>> ....
```
3. Of course, you can also get raw Weibo tweets by an existing nickname. The `pages` param is optional.
```python
>>> from weibo_scraper import get_weibo_tweets_by_name
>>> for tweet in get_weibo_tweets_by_name(name='嘻红豆', pages=1):
>>> print(tweet)
>>> ....
```
4. If you want to get all tweets, you can set the `pages` param to `None`:
```python
>>> from weibo_scraper import get_weibo_tweets_by_name
>>> for tweet in get_weibo_tweets_by_name(name='嘻红豆', pages=None):
>>>     print(tweet)
>>> ....
```
5. You can also get formatted tweets via the `weibo_scraper.get_formatted_weibo_tweets_by_name` API:
```python
>>> from weibo_scraper import get_formatted_weibo_tweets_by_name
>>> result_iterator = get_formatted_weibo_tweets_by_name(name='嘻红豆', pages=None)
>>> for user_meta in result_iterator:
>>>     if user_meta is not None:
>>>         for tweetMeta in user_meta.cards_node:
>>>             print(tweetMeta.mblog.text)
>>> ....
```
![get_formatted_weibo_tweets_by_name](https://github.com/Xarrow/weibo-scraper/blob/master/images/weibo_tweets_formatted_1.png?raw=true)
6. Get realtime hot words:
```python
hotwords = weibo_scraper.get_realtime_hotwords()
for hw in hotwords:
    print(str(hw))
```
7. Get realtime hot words at a fixed interval:
```python
wt = Timer(name="realtime_hotword_timer", fn=weibo_scraper.get_realtime_hotwords, interval=1)
wt.set_ignore_ex(True)
wt.scheduler()
```
<!-- ---- -->
<!-- # Weibo Flasgger -->
<!-- [Weibo Flasgger](https://github.com/Xarrow/weibo-scraper/blob/search_name/samples/weibo_flasgger/FLASGGER_README.md) is a web api document for weibo scraper , and powered by flasgger . -->
<!--  -->
<!-- # P.S -->
<!-- 1. Inspiration from [Twitter-Scraper](https://github.com/kennethreitz/twitter-scraper) . -->
<!-- 2. For 'XIHONGDOU' . -->
<!-- 3. Welcome To Fork Me . -->
<!-- ---- -->
# LICENSE
MIT
This Project Powered By Jetbrains OpenSource License
 | 26.04 | 193 | 0.678187 | eng_Latn | 0.700295 |
# Reactive Data Processing (mass-rdi)
The data-processing part of **mass-rdi** is designed around the ideas of Reactive Extensions, adopting **Akka Stream** as the implementation framework and building on **Alpakka** for the concrete implementations.

## Features

**Effectively improves productivity**

mass-rdi helps data engineers handle complex data, ingest data from all kinds of sources and formats more efficiently, and respond quickly to fast-changing business requirements.

**Rich data-source support**

mass-rdi supports processing structured and unstructured data, including text formats (CSV, XML, JSON, etc.). It supports traditional relational databases (PostgreSQL, MySQL, Oracle, SQL Server) and NoSQL data stores (HDFS, Hive, HBase, Cassandra, MongoDB). It also supports the Chinese domestic databases DM (Dameng) and GBase.

**Reactive architecture**

mass-rdi uses Reactive Streams to design its data-processing (ETL/ELT) features, providing high performance, high throughput, scalability, fault tolerance, and back-pressure (flow-rate control).

**Applicability**

As an all-round data-processing tool, mass-rdi fits a variety of business scenarios:

- data collection
- data cleansing
- task scheduling and management

mass-rdi can be used standalone or integrated into other business systems.

**Extensible**

mass-rdi provides complete interfaces and rich extension points, supporting secondary development or embedding it into specific application systems.
@@toc { depth=1 }
@@@ index
* [core](core.md)
* [workflow](workflow.md)
* [extension](extension.md)
@@@
# API
* [mount](./mount.md)
* [shallow](./shallow.md)
* [Mounting Options](./options.md)
- [context](./options.md#context)
- [slots](./options.md#slots)
- [stubs](./options.md#stubs)
- [mocks](./options.md#mocks)
- [localVue](./options.md#localvue)
- [attachToDocument](./options.md#attachtodocument)
- [attrs](./options.md#attrs)
- [listeners](./options.md#listeners)
- [provide](./options.md#provide)
  - [Other Options](./options.md#その他のオプション)
* [Wrapper](./wrapper/README.md)
* [attributes](./wrapper/attributes.md)
* [classes](./wrapper/classes.md)
* [contains](./wrapper/contains.md)
* [emitted](./wrapper/emitted.md)
* [emittedByOrder](./wrapper/emittedByOrder.md)
* [exists](./wrapper/exists.md)
* [destroy](./wrapper/destroy.md)
* [find](./wrapper/find.md)
* [findAll](./wrapper/findAll.md)
* [html](./wrapper/html.md)
* [is](./wrapper/is.md)
* [isEmpty](./wrapper/isEmpty.md)
* [isVueInstance](./wrapper/isVueInstance.md)
* [name](./wrapper/name.md)
* [props](./wrapper/props.md)
* [setData](./wrapper/setData.md)
* [setMethods](./wrapper/setMethods.md)
* [setProps](./wrapper/setProps.md)
* [text](./wrapper/text.md)
* [trigger](./wrapper/trigger.md)
* [update](./wrapper/update.md)
* [visible](./wrapper/visible.md)
* [WrapperArray](./wrapper-array/README.md)
* [at](./wrapper-array/at.md)
* [contains](./wrapper-array/contains.md)
* [exists](./wrapper/exists.md)
* [destroy](./wrapper-array/destroy.md)
* [filter](./wrapper-array/filter.md)
* [is](./wrapper-array/is.md)
* [isEmpty](./wrapper-array/isEmpty.md)
* [isVueInstance](./wrapper-array/isVueInstance.md)
* [setData](./wrapper-array/setData.md)
* [setMethods](./wrapper-array/setMethods.md)
* [setProps](./wrapper-array/setProps.md)
* [trigger](./wrapper-array/trigger.md)
* [update](./wrapper-array/update.md)
* [visible](./wrapper-array/visible.md)
* [Components](./components/README.md)
* [TransitionStub](./components/TransitionStub.md)
* [TransitionGroupStub](./components/TransitionGroupStub.md)
* [RouterLinkStub](./components/RouterLinkStub.md)
* [Selectors](./selectors.md)
* [createLocalVue](./createLocalVue.md)
* [config](./config.md)
| 36.04918 | 62 | 0.659845 | kor_Hang | 0.122314 |
# CMPSC 100-02 Lab Session 8: The Catnap Caper
* Assigned: 12 March 2020
* Due: 27 March 2020 by 11:59 PM
* Point value: 45
During this lab session, we engage with `if` statements, `conditions`, and `boolean` expressions.
* [Slack](https://cmpsc-100-02-sp-2020.slack.com)
* [GitHub](https://www.github.com)
* git
* Markdown
* [Atom](https://atom.io)
* [Docker](https://www.docker.com/products/docker-desktop)
* GatorGrader
* gradle
## Table of Contents
* [Accepting the assignment](#accepting-the-assignment)
* [Activity: HumanQuest](#the-catnap-caper)
* [Evaluation](#evaluation)
* [GatorGrader](#gatorgrader)
## General guidelines for laboratory sessions
* **Follow steps carefully.** Laboratory sessions often get a bit more complicated than their preceeding Practical sessions. Especially for early sessions which expose you to platforms with which you may not be familiar, take notes on `command`s you run and their corresponding effects/outputs. If you find yourself stuck on a step, let a TL or the professor know! Laboratory sessions do not mean that we won't help you in the same way we do during Practicals.
* **Regularly ask and answer questions.** Some of you may have more experience with the topics we're discussing than others. We can use this knowledge to our advantage. But, like in Practicals, let students try things for a while before offering help (**always _offer_ first**). To ask questions, use our [Slack](https://cmpsc-100-02-sp-2020.slack.com)'s `#labs` channel.
* **Store and transfer files using GitHub.** Various forms of file storage are more or less volatile. *You* are responsible for backing up and storing files. If you're unsure of files which have been changed, you can always type `git status` in the terminal for your working folder to determine what you need to back up.
* **Keep all of your files.** See above, but also remember that you're responsible for the files you create.
* **Back up your files regularly.** See above (and above-above).
* **Review the [Honor Code](https://sites.allegheny.edu/about/honor-code/) regularly when working.** If you're taking a solution from another student or the Internet at-large (_especially_ [Stack Overflow](https://stackoverflow.com)), bear in mind that using these solutions _can_ constitute a form of plagiarism that violates the Allegheny Honor Code. While it may seem easy and convenient to use these sources, it is equally easy and convenient to rely on them and create bad habits which include not attributing credit or relying exclusively on others to solve issues. Neither are productive uses of your intellect. Really.
## Further helpful reading for this assignment
I recommend reading the [GitHub Guides](https://guides.github.com) which GitHub makes available. In particular, the guides:
* [Mastering Markdown](https://guides.github.com/features/mastering-markdown/)
* [Documenting your projects on GitHub](https://guides.github.com/features/wikis/)
* [GitHub Handbook](https://guides.github.com/introduction/git-handbook/)
## The Catnap Caper

Hearing about G. Wiz's crystal ball (somebody in class must've let their secret slip!), authorities of the global Animal Kingdom have called in our magical reptile friend to help predict all of the various scenarios that could occur once they begin interrogation of the two Human catnappers! Of course, there's some nice money in it for G. Wiz's hat collection.
After consulting with G. Wiz to work out the possibilities, the detectives have narrowed down the following possibilities, confirming our suspicions from the other day:

It is up to you to write a program to simulate all the various ways that their interrogation could go, and the corresponding outcomes.
Overall, the output should look like the sample(s) below (with some variation based on the `true`/`false` values input).
### Example 1
```
Does Alice defect [true/false]: false
Alice stays quiet.
Does Bob defect [true/false]: true
Bob defects!
Alice receives 3 year(s).
Bob receives 0 year(s).
```
### Example 2
```
Does Alice defect [true/false]: true
Alice defects!
Does Bob defect [true/false]: true
Bob defects!
Alice receives 2 year(s).
Bob receives 2 year(s).
```
### Final notes
The prompts `Does Alice defect [true/false]:` and `Does Bob defect [true/false]:` let the user know that your program will expect input of either `true` or `false`.
#### Running multiple different scenarios
The command `gradle runScenario` randomly runs one of the four scenarios. This is the command I use to grade the exercise, but you can also use it to test against various random combinations!
## Evaluation
### `CatnapCaper.java`
- [ ] Uses a `Scanner` to take input from the keyboard
* These inputs should be `boolean` values
- [ ] Evaluates the `boolean` values to print whether a suspect defects or stays quiet.
* Each catnapper's decision should display as: `{NAME} defects!` or `{NAME} stays quiet`.
* This evaluation should take place using `if` statements
- [ ] Uses `if` statements to evaluate the number of years assessed to a given catnapper based on how cooperative they were
- [ ] Assesses if either catnapper received less than two years and prints the appropriate status message, described above
#### Statements to print
This lab is a little more persnickety about _exact_ matches for statements in the output. The following table displays the various inputs, outputs, and statements required:
| Input/outcome | Statement |
|---------------|-----------|
| Alice defects (input `true`) | Alice defects! |
| Alice doesn't defect (input `false`) | Alice stays quiet. |
| Bob defects (input `true`) | Bob defects! |
| Bob doesn't defect (input `false`) | Bob stays quiet. |
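Before writing the Java, it can help to check your understanding of the payoff rules against the examples. The sketch below models them in Python (illustrative only -- the assignment itself must be solved in `CatnapCaper.java` with `if` statements and a `Scanner`). The two outcomes not shown in the examples are assumptions drawn from the classic prisoner's dilemma.

```python
# Model of the interrogation payoffs. The (False, True) and (True, True)
# outcomes match Examples 1 and 2 above; the other two rows are
# assumptions based on the classic prisoner's dilemma.
def years(alice_defects, bob_defects):
    """Return (alice_years, bob_years) for one scenario."""
    if alice_defects and bob_defects:
        return (2, 2)        # both defect
    if alice_defects and not bob_defects:
        return (0, 3)        # Alice defects alone (assumed)
    if not alice_defects and bob_defects:
        return (3, 0)        # Bob defects alone (Example 1)
    return (1, 1)            # both stay quiet (assumed)

assert years(False, True) == (3, 0)   # Example 1
assert years(True, True) == (2, 2)    # Example 2
```

Each `if` tests one combination of the two `boolean` inputs, which is exactly the branching structure the Java version needs.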
### `reflection.md`
- [ ] Contains no:
* `TODO` markers
* `{Your Name Here}` markers
- [ ] Contains 300 words
- [ ] Written in correct markup (passes `mdl`)
- [ ] Responds to all questions
### Repository
- [ ] Has at least 3 `commit`s
## Accepting the assignment
- [ ] Log into the `#labs` channel in our class [Slack](https://cmpsc-100-02-sp-2020.slack.com)
- [ ] Click the link provided for the lab assignment and accept it in GitHub classroom
- [ ] Once the assignment finishes building, click the link to go to your personal repository assignment
## "Cloning" a repository
### Using the correct download link
- [ ] On this repository's page, click the `Clone or download` button in the upper right hand corner
* You may need to scroll up to see it
- [ ] In the upper right corner of the box that appears, click `Use SSH`
- [ ] Copy the link that appears in the textbox below the phrase "Use a password with a protected key."
### Cloning
- [ ] Type `ls` in your terminal window
* You should be in your `CMPSC100` directory
- [ ] Change directories (`cd`) to your `Labs` folder
- [ ] Once in the labs folder, "clone" the repository using the link copied above
* If I (the instructor) were to clone my own repository, I'd enter (your link will look different):
```
git clone [email protected]:allegheny-college-cmpsc-100-spring-2020/cmpsc-100-spring-2020-lab-08-dluman
```
## GatorGrader
### A note about `gradle`
`gradle` is a program which manages our program's many moving parts. It coordinates building, testing, compiling, and running our code -- tasks that will become more complex over the course of the semester in direct proportion to the complexity of our programs. There are three "tasks" that we will use extensively in this course.
#### `gradle build`
`gradle build` compares code against a set of conventions ("best practices") for creating legible code. There are many different standards for doing this, but our department chooses to follow the [Google Style Guide for the Java language](https://google.github.io/styleguide/javaguide.html). This includes such rules as:
* Each "level" of code being indented by 2 spaces
* Not using single-letter identifiers
* Enforcing consistent use of "Javadoc" (and other) comments
* ... and more!
These kinds of standards make reading code much easier, and help folks like our Technical Leaders (and me) read your code to figure out where something isn't going as planned.
#### `gradle run` (and its variants)
`gradle run` (and its counterpart `gradle -q --console plain run`) compile and run our Java programs. Once again, this will become more handy when we start to build projects that require _multiple_ files.
#### `gradle grade`
`gradle grade` runs the GatorGrader application. This application uses grading standards _specific to an assignment_. This means that the grading files created when you accept an assignment are the same ones by which I will evaluate your work: _once you've cloned an assignment, they do not change_.
You will use this command to grade your work before you turn it in, enabling you to know before I grade it what your grade will be.
#### Running GatorGrader directly on `container` start
- [ ] `cd` to your `CMPSC100` folder
- [ ] Locate the `cmpsc-100-spring-2020-lab-08` folder and `cd` to it.
* Remember that if you run `ls -la`, you should see a `.git` folder if you're in the main repository folder.
- [ ] To make sure you're in the right repository, type `pwd` and press `Enter`
* If you receive the expected path, you're in the right place!
```
docker run -it --mount type=bind,source="$(pwd)",target="/project" --hostname GatorGrader gatoreducator/dockagator
```
#### Run `gradle` commands in the container
```
docker run -it --mount type=bind,source="$(pwd)",target="/project" --hostname GatorGrader gatoreducator/dockagator bash
```
- [ ] To `build` your Java work, type `gradle build` at the `command` prompt and press the `Enter` key.
- [ ] To `grade` your Java work, type `gradle grade` at the `command` prompt and press the `Enter` key.
## Submitting the assignment/saving progress
The GitHub platform is a place to store your work. So, it makes some sense that we should be able to _clone_ (download) from it, and push back (upload) to it. Here, we'll learn this second part.
- [ ] `cd` to your `CMPSC100` folder
- [ ] Locate the `cmpsc-100-spring-2020-lab-08` folder and `cd` to it.
Once in this folder, we need to tell git that there have been changes.
- [ ] Type `git add .` and press `Enter`
* This tells git to look through the _entire_ folder structure for new changes and "stage" them
- [ ] Type `git commit -m "{Commit message}"` to "label" the commit
* This is typically something like `git commit -m "Fixing a typo"` -- the message in quotes should be as descriptive as possible, while still remaining somewhat short
- [ ] To send all changes to the server, type `git push`
- [ ] At the prompt, input the password associated with the `SSH Key` you created earlier in this exercise (yesterday)
Once the process finishes successfully, we're done!
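The whole add/commit sequence can be rehearsed safely in a throwaway repository first (illustrative commands only — for the actual assignment you run `git add`, `git commit`, and `git push` inside your cloned lab folder):

```shell
# Practice repository so the staging/commit workflow can be tried safely.
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email "[email protected]"
git config user.name "Example Student"
echo "lab notes" > notes.txt
git add .                            # stage every change in the folder
git commit -q -m "Adding lab notes"  # label the commit
git log --oneline                    # confirm the commit exists
# In the real assignment you would now run: git push
```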
# Repository Design Pattern for Laravel
[](https://styleci.io/repos/60332194)
[](https://travis-ci.org/recca0120/laravel-repository)
[](https://packagist.org/packages/recca0120/repository)
[](https://packagist.org/packages/recca0120/repository)
[](https://packagist.org/packages/recca0120/repository)
[](https://packagist.org/packages/recca0120/repository)
[](https://packagist.org/packages/recca0120/repository)
[](https://packagist.org/packages/recca0120/repository)
[](https://scrutinizer-ci.com/g/recca0120/laravel-repository/?branch=master)
[](https://scrutinizer-ci.com/g/recca0120/laravel-repository/?branch=master)
## Install
To get the latest version of this package, simply require the project using [Composer](https://getcomposer.org):
```bash
composer require recca0120/repository
```
Instead, you may of course manually update your require block and run `composer update` if you so choose:
```json
{
"require": {
"recca0120/repository": "~2.0.0"
}
}
```
## Methods
### Recca0120\Repository\EloquentRepository
- find($id, $columns = ['*']);
- findMany($ids, $columns = ['*']);
- findOrFail($id, $columns = ['*']);
- findOrNew($id, $columns = ['*']);
- firstOrNew(array $attributes, array $values = []);
- firstOrCreate(array $attributes, array $values = []);
- updateOrCreate(array $attributes, array $values = []);
- firstOrFail($criteria = [], $columns = ['*']);
- create($attributes);
- forceCreate($attributes);
- update($id, $attributes);
- forceUpdate($id, $attributes);
- delete($id);
- forceDelete($id);
- newInstance($attributes = [], $exists = false);
- get($criteria = [], $columns = ['*']);
- chunk($criteria, $count, callable $callback);
- each($criteria, callable $callback, $count = 1000);
- first($criteria = [], $columns = ['*']);
- paginate($criteria = [], $perPage = null, $columns = ['*'], $pageName = 'page', $page = null);
- simplePaginate($criteria = [], $perPage = null, $columns = ['*'], $pageName = 'page', $page = null);
- count($criteria = [], $columns = '*');
- min($criteria, $column);
- max($criteria, $column);
- sum($criteria, $column);
- avg($criteria, $column);
- average($criteria, $column);
- matching($criteria);
- getQuery($criteria = []);
- getModel();
- newQuery();
### Recca0120\Repository\Criteria
- static create()
- static expr($value)
- static raw($value)
- select($columns = ['*'])
- selectRaw($expression, array $bindings = [])
- selectSub($query, $as)
- addSelect($column)
- distinct()
- from($table)
- join($table, $first, $operator = null, $second = null, $type = 'inner', $where = false)
- joinWhere($table, $first, $operator, $second, $type = 'inner')
- leftJoin($table, $first, $operator = null, $second = null)
- leftJoinWhere($table, $first, $operator, $second)
- rightJoin($table, $first, $operator = null, $second = null)
- rightJoinWhere($table, $first, $operator, $second)
- crossJoin($table, $first = null, $operator = null, $second = null)
- mergeWheres($wheres, $bindings)
- tap($callback)
- where($column, $operator = null, $value = null, $boolean = 'and')
- orWhere($column, $operator = null, $value = null)
- whereColumn($first, $operator = null, $second = null, $boolean = 'and')
- orWhereColumn($first, $operator = null, $second = null)
- whereRaw($sql, $bindings = [], $boolean = 'and')
- orWhereRaw($sql, array $bindings = [])
- whereIn($column, $values, $boolean = 'and', $not = false)
- orWhereIn($column, $values)
- whereNotIn($column, $values, $boolean = 'and')
- orWhereNotIn($column, $values)
- whereNull($column, $boolean = 'and', $not = false)
- orWhereNull($column)
- whereNotNull($column, $boolean = 'and')
- whereBetween($column, array $values, $boolean = 'and', $not = false)
- orWhereBetween($column, array $values)
- whereNotBetween($column, array $values, $boolean = 'and')
- orWhereNotBetween($column, array $values)
- orWhereNotNull($column)
- whereDate($column, $operator, $value = null, $boolean = 'and')
- orWhereDate($column, $operator, $value)
- whereTime($column, $operator, $value, $boolean = 'and')
- orWhereTime($column, $operator, $value)
- whereDay($column, $operator, $value = null, $boolean = 'and')
- whereMonth($column, $operator, $value = null, $boolean = 'and')
- whereYear($column, $operator, $value = null, $boolean = 'and')
- whereNested(Closure $callback, $boolean = 'and')
- addNestedWhereQuery($query, $boolean = 'and')
- whereExists(Closure $callback, $boolean = 'and', $not = false)
- orWhereExists(Closure $callback, $not = false)
- whereNotExists(Closure $callback, $boolean = 'and')
- orWhereNotExists(Closure $callback)
- addWhereExistsQuery(Builder $query, $boolean = 'and', $not = false)
- dynamicWhere($method, $parameters)
- groupBy()
- having($column, $operator = null, $value = null, $boolean = 'and')
- orHaving($column, $operator = null, $value = null)
- havingRaw($sql, array $bindings = [], $boolean = 'and')
- orHavingRaw($sql, array $bindings = [])
- orderBy($column, $direction = 'asc')
- orderByDesc($column)
- latest($column = 'created_at')
- oldest($column = 'created_at')
- inRandomOrder($seed = '')
- orderByRaw($sql, $bindings = [])
- skip($value)
- offset($value)
- take($value)
- limit($value)
- forPage($page, $perPage = 15)
- forPageAfterId($perPage = 15, $lastId = 0, $column = 'id')
- union($query, $all = false)
- unionAll($query)
- lock($value = true)
- lockForUpdate()
- sharedLock()
- when($value, $callback, $default = null)
- unless($value, $callback, $default = null)
- whereKey($id)
- whereKeyNot($id)
- with($relations)
- without($relations)
- setQuery($query)
- setModel(Model $model)
- has($relation, $operator = '>=', $count = 1, $boolean = 'and', Closure $callback = null)
- orHas($relation, $operator = '>=', $count = 1)
- doesntHave($relation, $boolean = 'and', Closure $callback = null)
- whereHas($relation, Closure $callback = null, $operator = '>=', $count = 1)
- orWhereHas($relation, Closure $callback = null, $operator = '>=', $count = 1)
- whereDoesntHave($relation, Closure $callback = null)
- withCount($relations)
- mergeConstraintsFrom(Builder $from)
- withTrashed()
- withoutTrashed()
- onlyTrashed()
## Usage
### Eloquent
#### Create a Model
Create your model normally, but it is important to define the attributes that can be filled from the input form data.
```php
namespace App;
use Illuminate\Database\Eloquent\Model;
class Post extends Model
{
protected $fillable = [
'title',
'author',
];
}
```
#### Create a Contract
```php
namespace App\Repositories\Contracts;
interface PostRepository
{
}
```
#### Create a Repository
```php
namespace App\Repositories;
use App\Repositories\Contracts\PostRepository as PostRepositoryContract;
use App\Post;
use Recca0120\Repository\EloquentRepository;
class PostRepository extends EloquentRepository implements PostRepositoryContract
{
public function __construct(Post $model)
{
$this->model = $model;
}
}
```
#### Bind
```php
namespace App\Providers;
use Illuminate\Support\ServiceProvider;
use App\Repositories\Contracts\PostRepository as PostRepositoryContract;
use App\Repositories\PostRepository;
class AppServiceProvider extends ServiceProvider
{
public function register()
{
$this->app->singleton(PostRepositoryContract::class, PostRepository::class);
}
}
```
#### Controller
```php
namespace App\Http\Controllers;
use App\Repositories\Contracts\PostRepository;
class PostsController extends Controller
{
protected $repository;
public function __construct(PostRepository $repository)
{
$this->repository = $repository;
}
}
```
## Method Examples
Find all results in Repository
```php
$posts = $this->repository->get();
```
Find all results in Repository with pagination
```php
$posts = $this->repository->paginate();
```
Count results in Repository
```php
$posts = $this->repository->count();
```
Create new entry in Repository
```php
$post = $this->repository->create(request()->all());
```
Update entry in Repository
```php
$post = $this->repository->update($id, request()->all());
```
Delete entry in Repository
```php
$this->repository->delete($id);
```
New instance
```php
$post = $this->repository->newInstance([
'author' => 'author'
]);
```
Return Model With Conditions
```php
$model = $this->repository->matching(Criteria::create()->where('title', '=', 'title'));
```
Find result by id
```php
$post = $this->repository->find($id);
```
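The batch helpers from the method list (`chunk`, `each`) accept a criteria the same way. A hedged sketch — the loop body is a placeholder, not part of the package:

```php
use Recca0120\Repository\Criteria;

$criteria = Criteria::create()->where('author', '=', 'author');

// Process matching posts 500 at a time instead of loading them all at once.
$this->repository->chunk($criteria, 500, function ($posts) {
    foreach ($posts as $post) {
        // handle each $post here
    }
});
```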
### Find by conditions
#### Using the Criteria
Criteria supports all of the Eloquent query builder functions listed above.
##### Single Criteria
```php
use Recca0120\Repository\Criteria;
$criteria = Criteria::create()
->select('*')
->where('author', '=', 'author')
->orWhere('title', '=', 'title')
->orderBy('author', 'asc');
$this->repository->get($criteria);
$this->repository->paginate($criteria);
```
##### Multiple Criteria
```php
use Recca0120\Repository\Criteria;
$criteria = [];
$criteria[] = Criteria::create()
->orderBy('author', 'asc');
$criteria[] = Criteria::create()
->where('author', '=', 'author')
->orWhere('title', '=', 'title');
$this->repository->get($criteria);
// $this->repository->paginate($criteria);
```
##### With
```php
use Recca0120\Repository\Criteria;
$criteria = Criteria::create()
->with('author', function($criteria) {
$criteria->where('author', 'author');
});
$this->repository->get($criteria);
// $this->repository->paginate($criteria);
```
##### Join
```php
use Recca0120\Repository\Criteria;
$criteria = Criteria::create()
->join('author', function ($criteria) {
$criteria->on('posts.author_id', '=', 'author.id');
});
$this->repository->get($criteria);
// $this->repository->paginate($criteria);
```
##### Expression
```php
use Recca0120\Repository\Criteria;
$criteria = Criteria::create()
->where('created_at', '<=', Criteria::expr('NOW()'));
$this->repository->get($criteria);
// $this->repository->paginate($criteria);
```
##### Custom Criteria
```php
use Recca0120\Repository\Criteria;
class CustomCriteria extends Criteria
{
public function __construct($id)
{
$this->where('id', '=', $id);
}
}
$this->repository->get((new CustomCriteria(1))->where('author', 'author'));
```
## ToDo
- Cache
## Change Logs
#### From v0.3 to v0.4
- Move deviation computing functions to [StatsBase](https://github.com/JuliaStats/StatsBase.jl)
- Move ``countne`` and ``counteq`` to [StatsBase](https://github.com/JuliaStats/StatsBase.jl)
- Deprecate ``Standardize`` (in favor of [StatsBase](https://github.com/JuliaStats/StatsBase.jl)'s ``zscore``)
#### From v0.4 to v0.5
- Move documentation from Readme to [Sphinx Docs](http://mlbasejl.readthedocs.org/en/latest/)
- ``cross_validate`` now returns a vector of scores (see [here](http://mlbasejl.readthedocs.org/en/latest/crossval.html#cross_validate)).
- New function ``gridtune``: search for the best settings of parameters (see [here](http://mlbasejl.readthedocs.org/en/latest/modeltune.html#gridtune)).
- New function ``confusmat``: compute confusion matrix (see [here](http://mlbasejl.readthedocs.org/en/latest/perfeval.html#confusmat)).
# Memory Management
A GObject (or `glib::Object` in Rust terms) is a reference-counted, mutable object.
Let us see in a set of real-life examples what consequences this has.
```rust ,no_run,compile_fail
use gtk::prelude::*;
use gtk::{self, Application, ApplicationWindow, Button, Orientation};
fn main() {
// Create a new application
let app = Application::builder()
.application_id("org.gtk-rs.example")
.build();
// Connect to "activate" signal of `app`
app.connect_activate(build_ui);
// Run the application
app.run();
}
fn build_ui(application: &Application) {
// Create a window
let window = ApplicationWindow::new(application);
// Create two buttons
let button_increase = Button::builder()
.label("Increase")
.margin_top(12)
.margin_bottom(12)
.margin_start(12)
.margin_end(12)
.build();
let button_decrease = Button::builder()
.label("Decrease")
.margin_top(12)
.margin_bottom(12)
.margin_start(12)
.margin_end(12)
.build();
// A mutable integer
let mut number = 0;
// Connect callbacks
// When a button is clicked, `number` should be changed
button_increase.connect_clicked(|_| number += 1);
button_decrease.connect_clicked(|_| number -= 1);
// Add buttons
let gtk_box = gtk::Box::new(Orientation::Vertical, 0);
window.set_child(Some(>k_box));
gtk_box.append(&button_increase);
gtk_box.append(&button_decrease);
window.present();
}
```
Here we would like to create a simple app with two buttons.
If we click on one button, an integer number should be increased. If we press the other one, it should be decreased.
The Rust compiler refuses to compile it though.
For one, the borrow checker kicked in:
```console
error[E0499]: cannot borrow `number` as mutable more than once at a time
--> main.rs:27:37
|
26 | button_increase.connect_clicked(|_| number += 1);
| ------------------------------------------------
| | | |
| | | first borrow occurs due to use of `number` in closure
| | first mutable borrow occurs here
| argument requires that `number` is borrowed for `'static`
27 | button_decrease.connect_clicked(|_| number -= 1);
| ^^^ ------ second borrow occurs due to use of `number` in closure
| |
| second mutable borrow occurs here
```
Also, the compiler tells us that our closures may outlive `number`:
```console
error[E0373]: closure may outlive the current function, but it borrows `number`, which is owned by the current function
--> main.rs:26:37
|
26 | button_increase.connect_clicked(|_| number += 1);
| ^^^ ------ `number` is borrowed here
| |
| may outlive borrowed value `number`
|
note: function requires argument type to outlive `'static`
--> main.rs:26:5
|
26 | button_increase.connect_clicked(|_| number += 1);
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
help: to force the closure to take ownership of `number` (and any other referenced variables), use the `move` keyword
|
26 | button_increase.connect_clicked(move |_| number += 1);
| ^^^^^^^^
```
Thinking about the second error message, it makes sense that the closure requires the lifetimes of references to be `'static`.
The compiler cannot know when the user presses a button, so references must live forever.
And our `number` gets immediately deallocated after it reaches the end of its scope.
The error message is also suggesting that we could take ownership of `number`.
But is there actually a way that both closures could take ownership of the same value?
Yes! That is exactly what the [`Rc`](https://doc.rust-lang.org/std/rc/struct.Rc.html) type is there for.
The `Rc` counts the number of strong references created via `Clone::clone` and released via `Drop::drop`, and only deallocates it when this number drops to zero.
We call every object containing a strong reference a shared owner of the value.
If we want to modify the content of our [`Rc`](https://doc.rust-lang.org/std/rc/struct.Rc.html),
we can use the [`Cell`](https://doc.rust-lang.org/std/cell/struct.Cell.html) type.
> The `Cell` type is only suitable for objects that implement the [`Copy`](https://doc.rust-lang.org/core/marker/trait.Copy.html) trait.
> For other objects, [`RefCell`](https://doc.rust-lang.org/std/cell/struct.RefCell.html) is the way to go.
> You can learn more about the two cell types in this [section](https://doc.rust-lang.org/1.30.0/book/first-edition/choosing-your-guarantees.html#cell-types) of an older edition of the Rust book.
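Outside of GTK, the same `Rc<Cell<T>>` sharing pattern can be shown in a few lines of plain Rust (an illustrative sketch, not one of the book's listings):

```rust
use std::cell::Cell;
use std::rc::Rc;

// Build two closures that share ownership of one counter.
fn shared_counter() -> (Rc<Cell<i32>>, impl Fn(), impl Fn()) {
    let number = Rc::new(Cell::new(0));
    // `Rc::clone` only bumps the reference count, so every clone
    // is another shared owner of the same `Cell`.
    let increase = {
        let number = Rc::clone(&number);
        move || number.set(number.get() + 1)
    };
    let decrease = {
        let number = Rc::clone(&number);
        move || number.set(number.get() - 1)
    };
    (number, increase, decrease)
}

fn main() {
    let (number, increase, decrease) = shared_counter();
    increase();
    increase();
    decrease();
    // 0 + 1 + 1 - 1 = 1
    assert_eq!(number.get(), 1);
    println!("number = {}", number.get());
}
```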
<span class="filename">Filename: listings/gobject_memory_management/1/main.rs</span>
```rust ,no_run,noplayground
{{#rustdoc_include ../listings/gobject_memory_management/1/main.rs:callback}}
```
It is not very nice though to fill the scope with temporary variables like `number_copy`.
We can improve that by using the `glib::clone!` macro.
<span class="filename">Filename: listings/gobject_memory_management/2/main.rs</span>
```rust ,no_run,noplayground
{{#rustdoc_include ../listings/gobject_memory_management/2/main.rs:callback}}
```
Just like `Rc<Cell<T>>`, GObjects are reference-counted and mutable.
Therefore, we can pass the buttons the same way to the closure as we did with `number`.
<span class="filename">Filename: listings/gobject_memory_management/3/main.rs</span>
```rust ,no_run,noplayground
{{#rustdoc_include ../listings/gobject_memory_management/3/main.rs:callback}}
```
If we now click on one button, the other button's label gets changed.
But whoops!
Did we forget about one annoyance of reference-counted systems?
Yes we did: [reference cycles](https://doc.rust-lang.org/book/ch15-06-reference-cycles.html).
`button_increase` holds a strong reference to `button_decrease` and vice-versa.
A strong reference keeps the referenced value from being deallocated.
If this chain leads to a circle, none of the values in this cycle ever get deallocated.
With weak references we can break this cycle, because they do not keep their value alive but instead provide a way to retrieve a strong reference if the value is still alive.
Since we want our apps to free unneeded memory, we should use weak references for the buttons instead.
> In this simple example, GTK actually resolves the reference cycle on its own once you close the window.
> However, the general point to avoid strong references whenever possible remains valid.
<span class="filename">Filename: listings/gobject_memory_management/4/main.rs</span>
```rust ,no_run,noplayground
{{#rustdoc_include ../listings/gobject_memory_management/4/main.rs:callback}}
```
The reference cycle is broken.
Every time the button is clicked, `glib::clone` tries to upgrade the weak reference.
If we now for example click on one button and the other button is not there anymore, `upgrade` will return `None`.
By default, it immediately returns from the closure with `()` as the return value.
In case the closure expects a different return value, or a panic is preferred, use `@default-return` or `@default-panic` instead.
For more information about `glib::clone`, please have a look at the [docs](http://gtk-rs.org/gtk-rs-core/stable/latest/docs/glib/macro.clone.html).
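The upgrade-or-bail behavior inside those closures can be reproduced with plain `Rc`/`Weak` (a standalone sketch, independent of the GTK listings):

```rust
use std::rc::{Rc, Weak};

// Try to use the value behind a weak handle, like the upgraded
// widgets inside a `glib::clone!` closure.
fn label_len(handle: &Weak<String>) -> Option<usize> {
    // `upgrade` hands back a temporary strong reference only while
    // at least one strong owner is still alive.
    handle.upgrade().map(|strong| strong.len())
}

fn main() {
    let label = Rc::new(String::from("Decrease"));
    let weak_label: Weak<String> = Rc::downgrade(&label);

    // While a strong owner exists, the weak handle works...
    assert_eq!(label_len(&weak_label), Some(8));

    // ...and once the last strong owner is dropped, it yields `None`,
    // which is exactly the case the closure bails out on.
    drop(label);
    assert_eq!(label_len(&weak_label), None);
    println!("weak handle is dead: {}", label_len(&weak_label).is_none());
}
```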
Notice that we move `number` in the second closure.
If we had only moved _weak_ references into both closures, nothing would have kept `number` alive and the closure would never have been called.
Thinking about this, `button_increase` and `button_decrease` are also dropped at the end of the scope of `build_ui`.
Who then keeps the buttons alive?
<span class="filename">Filename: listings/gobject_memory_management/4/main.rs</span>
```rust ,no_run,noplayground
{{#rustdoc_include ../listings/gobject_memory_management/4/main.rs:box_append}}
```
When we append the buttons to the `gtk_box`, `gtk_box` keeps a strong reference to them.
<span class="filename">Filename: listings/gobject_memory_management/4/main.rs</span>
```rust ,no_run,noplayground
{{#rustdoc_include ../listings/gobject_memory_management/4/main.rs:window_child}}
```
When we set `gtk_box` as child of `window`, `window` keeps a strong reference to it.
Until we close the `window` it keeps `gtk_box` and with it the buttons alive.
Since our application has only one window, closing it also means exiting the application.
As long as you use weak references whenever possible you will find it perfectly doable to avoid memory cycles within your application.
If that is ensured, you can rely on GTK to properly manage the memory of GObjects you pass to it.
# DEV Tools Template Repo
---
[](https://app.netlify.com/sites/mystifying-keller-ab5658/deploys) [](https://travis-ci.com/skeptycal/.dotfiles) [](https://coveralls.io) [](https://bestpractices.coreinfrastructure.org/projects/3454)
[](https://www.python.org/) [](https://nuxtjs.org/) [](https://www.apple.com)
[](https://github.com/prettier/prettier) [](https://skeptycal.mit-license.org/1976/) [](CODE_OF_CONDUCT.md)
 
## System setup for MacBook Pro using Mojave, Bash, and Python Development
This is my software development setup for a MacBook Pro (mid-2015, 16g ram, 256g SSD). It is the setup I currently use and may change frequently. I am a dabbler in many arts ... and far from expert in most areas. Take it for what it is worth to you.
**Please feel free to offer suggestions and changes** (contribution instructions below). I have been coding for many years, but mostly as a side activity ... as a tool to assist me in other endeavors ... so I have not had the 'hard time' invested of constant coding that many of you have.
> Copyright © 2018-2019 [Michael Treanor](https://skeptycal.github.com)
> Many original settings © 2018 [Mathias Bynens](https://mathiasbynens.be/)
> [MIT License](https://opensource.org/licenses/MIT) - enjoy ...
## Installation
>**Warning:** If you want to use this setup, you should fork this repository, review the code, and make changes to suit your needs. If you aren't sure, don't use it!
>This setup works for me for what I do. Don’t blindly use my settings unless you know what is going on. If parts of it do not apply, commment them out or delete them. Misuse could make your system inoperable or at least very annoying to use! Use at your own risk!
### Repo Setup
I use python quite a bit and throw in a bit of flask and Vue / Nuxt. The python ecosystem has a very fiddly system to isolate the coding environment for each project. There are many [options available](), but as of now I have found this works best for my situation:
You can clone this GitHub repository as a starting point:
```sh
git clone https://github.com/skeptycal/repo_template.git && cd repo_template && translate_template.sh
```
This will automate most of the naming and prepping. I have included a list of steps used to create this environment from scratch so anyone can modify the process to their needs:
1. **Pick a repo name:** This name is used in **MANY LOCATIONS** so choose one
   you like a lot. I do not provide a way to automate changing the name.
For this example, I'm using ***MasterBlaster*** for the name.
2. **Location:** Make a directory and change to that directory:
mkdir MasterBlaster && cd MasterBlaster
3. **Virtual Environment:** create a python environment:
(I use python3.x (latest) and I have several aliases setup for that)
alias py='python3 -m '
   So to setup my environment, I use:
   ```sh
   git init # always have a git repo in place to track changes
   # this is the recommended way for python 3.6+
   # older versions used pyenv or virtualenv ... time marches on ...
   py venv venv # equivalent to python3 -m venv venv
   ```
   This should get you setup with something like this:
   ```
   ├── bin
   │   ├── activate
   │   ├── activate.csh
   │   ├── activate.fish
   │   ├── easy_install
   │   ├── easy_install-3.5
   │   ├── pip
   │   ├── pip3
   │   ├── pip3.5
   │   ├── python -> python3.5
   │   ├── python3 -> python3.5
   │   └── python3.5 -> /Library/Frameworks/Python.framework/Versions/3.5/bin/python3.5
   ├── include
   ├── lib
   │   └── python3.5
   │       └── site-packages
   └── pyvenv.cfg
   ```
   > I always use the same name for my virtual environment so all my scripts work in any location. If you prefer different names for each one, there is an environment variable that is set whenever a virtual environment is active:
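The variable in question is `VIRTUAL_ENV`, which `python -m venv` exports while an environment is active. A small illustrative check (not part of the original dotfiles):

```shell
# Report whether a Python virtual environment is currently active.
venv_status() {
    if [ -n "${VIRTUAL_ENV:-}" ]; then
        echo "active venv: $VIRTUAL_ENV"
    else
        echo "no virtual environment active"
    fi
}
venv_status
```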
You can clone the repository wherever you want (I like to keep it in `~/.dotfiles`). The bootstrapper script will pull in the latest version and copy the files to your home folder.
To update, `cd` into your local `.dotfiles` repository and then:
```bash
source bootstrap.sh
```
Alternatively, to update while avoiding the confirmation prompt:
```bash
set -- -f; source bootstrap.sh
```
### Specify the `$PATH`
If `~/.path` exists, it will be sourced along with the other files, before any feature testing (such as [detecting which version of `ls` is being used](https://github.com/mathiasbynens/dotfiles/blob/aff769fd75225d8f2e481185a71d5e05b76002dc/.aliases#L21-26)) takes place.
Here’s an example `~/.path` file that adds `/usr/local/bin` to the `$PATH`:
```bash
export PATH="/usr/local/bin:$PATH"
```
### Add custom commands without creating a new fork
If `~/.extra` exists, it will be sourced after the other files. You can use this to add a few custom commands without the need to fork this entire repository, or to add commands you don’t want to commit to a public repository.
My `~/.extra` looks something like this:
```bash
# Git credentials
# Not in the repository, to prevent people from accidentally committing under my name
GIT_AUTHOR_NAME="Michael Treanor"
GIT_COMMITTER_NAME="$GIT_AUTHOR_NAME"
git config --global user.name "$GIT_AUTHOR_NAME"
GIT_AUTHOR_EMAIL="[email protected]"
GIT_COMMITTER_EMAIL="$GIT_AUTHOR_EMAIL"
git config --global user.email "$GIT_AUTHOR_EMAIL"
```
Since it is sourced last, you could also use `~/.extra` to override settings, functions and aliases from my dotfiles repository. It’s probably better to [fork this repository](https://github.com/mathiasbynens/dotfiles/fork) instead, though.
### Sensible macOS defaults
When setting up a new Mac, you may want to set some sensible macOS defaults:
```bash
./.macos
```
### Install Homebrew formulae
When setting up a new Mac, you may want to install some common [Homebrew](https://brew.sh/) formulae (after installing Homebrew, of course):
```bash
./brew.sh
```
Some of the functionality of these dotfiles depends on formulae installed by `brew.sh`. If you don’t plan to run `brew.sh`, you should look carefully through the script and manually install any particularly important ones. A good example is Bash/Git completion: the dotfiles use a special version from Homebrew.
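A quick, illustrative way to confirm which Bash your shell will actually run (not part of the original scripts; the Homebrew build usually lives under `/usr/local/bin` or `/opt/homebrew/bin`):

```shell
# Print the bash that PATH resolves to, plus its version line.
command -v bash
bash --version | head -n 1
```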
## Feedback
Suggestions/improvements
[welcome](https://github.com/skeptycal/dotfiles/issues)!
## Author
[](http://twitter.com/skeptycal "Follow @skeptycal on Twitter")
[**Michael Treanor**](https://www.skeptycal.com)
## Thanks to…
- [Mathias Bynens](https://mathiasbynens.be/)
- @ptb and [his _macOS Setup_ repository](https://github.com/ptb/mac-setup)
- [Ben Alman](http://benalman.com/) and his [dotfiles repository](https://github.com/cowboy/dotfiles)
- [Cătălin Mariș](https://github.com/alrra) and his [dotfiles repository](https://github.com/alrra/dotfiles)
- [Gianni Chiappetta](https://butt.zone/) for sharing his [amazing collection of dotfiles](https://github.com/gf3/dotfiles)
- [Jan Moesen](http://jan.moesen.nu/) and his [ancient `.bash_profile`](https://gist.github.com/1156154) + [shiny _tilde_ repository](https://github.com/janmoesen/tilde)
- [Lauri ‘Lri’ Ranta](http://lri.me/) for sharing [loads of hidden preferences](http://osxnotes.net/defaults.html)
- [Matijs Brinkhuis](https://matijs.brinkhu.is/) and his [dotfiles repository](https://github.com/matijs/dotfiles)
- [Nicolas Gallagher](http://nicolasgallagher.com/) and his [dotfiles repository](https://github.com/necolas/dotfiles)
- [Sindre Sorhus](https://sindresorhus.com/)
- [Tom Ryder](https://sanctum.geek.nz/) and his [dotfiles repository](https://sanctum.geek.nz/cgit/dotfiles.git/about)
- [Kevin Suttle](http://kevinsuttle.com/) and his [dotfiles repository](https://github.com/kevinSuttle/dotfiles) and [macOS-Defaults project](https://github.com/kevinSuttle/macOS-Defaults), which aims to provide better documentation for [`~/.macos`](https://mths.be/macos)
- [Haralan Dobrev](https://hkdobrev.com/)
- Anyone who [contributed a patch](https://github.com/mathiasbynens/dotfiles/contributors) or [made a helpful suggestion](https://github.com/mathiasbynens/dotfiles/issues)
# weatherbroker
Weather broker for Python
License: Apache License 2.0
Warning: refer to your weather provider's license terms regarding caching, storing, and redistributing weather information.
Provides current weather information as a unified dict:
* time: monitoring time, unix timestamp
* lat: latitude
* lon: longitude
* clouds: cloud cover (%)
* temp: temperature
* hum: humidity (%)
* pres: pressure
* wind_spd: wind speed
* wind_deg: wind degree
* vis: visibility
* dewp: dew point
* uv: UV index
* icon: icon code (provider specific)
* description: human readable weather description
* precip_type: precip type
* precip_prob: precip probability (%)
* precip_int: precip intensity
* sunrise: HH:mm
* sunset: HH:mm
Weather forecast is not implemented (yet).
Note: some providers may not provide certain data fields, in this case they are
set to None.
The module contains providers for:
* openweathermap: https://openweathermap.org/
* weatherbit: https://www.weatherbit.io/
* darksky: https://darksky.net/
Usage example:
```python
from weatherbroker import WeatherEngine
w = WeatherEngine()
w.set_provider('darksky')
w.key = 'my secret api key'
w.set_location(lat=50.08804, lon=14.42076)
# the code below sets the same location, but DarkSky supports only coordinates,
# so we use them instead
# w.set_location(city_id=3067696)
# w.set_location(city='Prague', country='CZ')
# w.lang = 'cs'
print(w.get_current())
```
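Because some providers leave fields as None (as noted above), a consumer of the unified dict may want to filter out unsupplied fields before display. A small illustrative helper, not part of the library's API:

```python
# Minimal sketch: keep only the fields a provider actually supplied.
# The sample dict below is illustrative, not real API output.

def present_fields(weather: dict) -> dict:
    """Return only the fields whose values are not None."""
    return {key: value for key, value in weather.items() if value is not None}

sample = {"temp": 21.4, "hum": 55, "uv": None, "vis": None, "icon": "c02d"}
print(present_fields(sample))  # {'temp': 21.4, 'hum': 55, 'icon': 'c02d'}
```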
(c) 2018 Altertech Group, https://www.altertech.com/
| 22.890625 | 79 | 0.746758 | eng_Latn | 0.874868 |
0c7d489848a452d6720e4b85b05d4ef720034f64 | 1,331 | md | Markdown | _posts/2015-12-22-Eugenia-Couture-300-Portia-Short-Sleeves-FloorLength-AlinePrincess.md | retroiccolors/retroiccolors.github.io | 9fcec38e243fcc96a9862bab7dcbf8a153843117 | [
"MIT"
] | null | null | null | _posts/2015-12-22-Eugenia-Couture-300-Portia-Short-Sleeves-FloorLength-AlinePrincess.md | retroiccolors/retroiccolors.github.io | 9fcec38e243fcc96a9862bab7dcbf8a153843117 | [
"MIT"
] | null | null | null | _posts/2015-12-22-Eugenia-Couture-300-Portia-Short-Sleeves-FloorLength-AlinePrincess.md | retroiccolors/retroiccolors.github.io | 9fcec38e243fcc96a9862bab7dcbf8a153843117 | [
"MIT"
] | null | null | null | ---
layout: post
date: 2015-12-22
title: "Eugenia Couture 300- Portia Short Sleeves Floor-Length Aline/Princess"
category: Eugenia
tags: [Eugenia,Aline/Princess ,Bateau,Floor-Length,Short Sleeves]
---
### Eugenia Couture 300- Portia
Just **$419.99**
### Short Sleeves Floor-Length Aline/Princess
<table><tr><td>BRANDS</td><td>Eugenia</td></tr><tr><td>Silhouette</td><td>Aline/Princess </td></tr><tr><td>Neckline</td><td>Bateau</td></tr><tr><td>Hemline/Train</td><td>Floor-Length</td></tr><tr><td>Sleeve</td><td>Short Sleeves</td></tr></table>
<a href="https://www.readybrides.com/en/eugenia-couture-prom-dress-2015-spring-collection/26363-eugenia-couture-300-portia.html"><img src="//img.readybrides.com/59089/eugenia-couture-300-portia.jpg" alt="Eugenia Couture 300- Portia" style="width:100%;" /></a>
<!-- break --><a href="https://www.readybrides.com/en/eugenia-couture-prom-dress-2015-spring-collection/26363-eugenia-couture-300-portia.html"><img src="//img.readybrides.com/59088/eugenia-couture-300-portia.jpg" alt="Eugenia Couture 300- Portia" style="width:100%;" /></a>
Buy it: [https://www.readybrides.com/en/eugenia-couture-prom-dress-2015-spring-collection/26363-eugenia-couture-300-portia.html](https://www.readybrides.com/en/eugenia-couture-prom-dress-2015-spring-collection/26363-eugenia-couture-300-portia.html)
| 83.1875 | 273 | 0.741548 | yue_Hant | 0.173262 |
0c7d571da5331bb536cb71d6238c0ecb3c7d7894 | 762 | md | Markdown | backend/node_modules/disparity/CHANGELOG.md | FlaviuVadan/sick-fits | 51d9b05ae8796c614727879c1d8fbc0fb2b92bf0 | [
"MIT"
] | 141 | 2015-01-02T16:02:28.000Z | 2021-09-13T05:08:05.000Z | writers_studio-win32-ia32/resources/app/node_modules/disparity/CHANGELOG.md | Phalanstere/WritersStudio | ef1a460506001ae56d5f657e60a960e5ca3d8d8c | [
"MIT"
] | 34 | 2015-02-23T00:03:33.000Z | 2018-08-29T03:02:40.000Z | writers_studio-win32-ia32/resources/app/node_modules/disparity/CHANGELOG.md | Phalanstere/WritersStudio | ef1a460506001ae56d5f657e60a960e5ca3d8d8c | [
"MIT"
] | 23 | 2015-01-22T20:29:16.000Z | 2018-12-20T14:53:47.000Z | # disparity changelog
## v2.0.0 (2015/04/03)
- change API so all methods have same signature (str1, str2, opts).
- merge CLI options `--unified --no-color` into `--unified-no-color`.
- better error messages on the CLI.
- show paths on chars diff as well (if supplied) or fallback to
`disparity.removed` and `disparity.added`.
## v1.3.1 (2015/04/01)
- fix line number alignment.
## v1.3.0 (2015/04/01)
- add support for custom colors.
- update unified color scheme.
- simplify line splitting logic.
- improve way strings are colorized (avoids `\n` and `\r` chars).
## v1.2.0 (2015/04/01)
- add `unifiedNoColor` support.
## v1.1.0 (2015/04/01)
- allow override of `removed` and `added` strings.
## v1.0.0 (2015/04/01)
- initial release
| 23.090909 | 71 | 0.671916 | eng_Latn | 0.93637 |
0c7d5836fbcef9fc28d2382e103a26399bb24604 | 1,898 | md | Markdown | shift/tools.rc/doc/tools.rc.md | cspanier/shift | 5b3b9be310155fbc57d165d06259b723a5728828 | [
"Apache-2.0"
] | 2 | 2018-11-28T18:14:08.000Z | 2020-08-06T07:44:36.000Z | shift/tools.rc/doc/tools.rc.md | cspanier/shift | 5b3b9be310155fbc57d165d06259b723a5728828 | [
"Apache-2.0"
] | 4 | 2018-11-06T21:01:05.000Z | 2019-02-19T07:52:52.000Z | shift/tools.rc/doc/tools.rc.md | cspanier/shift | 5b3b9be310155fbc57d165d06259b723a5728828 | [
"Apache-2.0"
] | null | null | null | # shift.tools.rc - Resource Compiler
This console executable is a lightweight frontend for the [shift.rc](../../rc/doc/rc.md) library.
## Commandline Options
```
Allowed options:
--help Shows this help message.
--silent Disables logging to console.
--no-logfile Disables logging to file.
--log-level arg (=warn) Selects a log level (may be one of
'debug', 'info', 'warn', or 'error')
--log-arguments Writes all program arguments to the log.
--show-console arg (=1) Show or hide the console window
-i [ --input ] arg (="../resources") Base path to all source files to run
through the resource compiler.
-b [ --build ] arg (="../build-rc") Base path to write temporary resource
files to.
-o [ --output ] arg (=".") Base path to write compiled files to.
-r [ --rules ] arg (=.rc-rules.json) Name of rules json files to search for.
-c [ --cache ] arg (=.rc-cache.json) Name of a cache json file used to store
private data which is used to improve
performance of subsequent rc
invocations.
-v [ --verbose ] [=arg(=1)] (=0) Print more information.
--image-magick arg (=magick) Image Magick's command line executable.
--task-num-workers arg (=0) Number of worker threads to use to
process tasks. The default value of zero
lets the application automatically chose
the number of threads.
--floating-point-exceptions arg (=1) Enable floating-point exceptions.
```
| 57.515152 | 97 | 0.513699 | eng_Latn | 0.958132 |
0c7dd8f77d8cb58d476365e1255b9135df172728 | 3,399 | md | Markdown | _posts/2016-05-07-math-for-machine-learning-4999-course.md | Nexusdecode/nexusdecode.github.io | 521b49cec0abe4792e46db6103b047b746ce6b3c | [
"MIT"
] | null | null | null | _posts/2016-05-07-math-for-machine-learning-4999-course.md | Nexusdecode/nexusdecode.github.io | 521b49cec0abe4792e46db6103b047b746ce6b3c | [
"MIT"
] | 2 | 2020-03-08T13:42:17.000Z | 2020-03-08T13:56:13.000Z | _posts/2016-05-07-math-for-machine-learning-4999-course.md | Nexusdecode/nexusdecode.github.io | 521b49cec0abe4792e46db6103b047b746ce6b3c | [
"MIT"
] | null | null | null | ---
title: 'Math for Machine Learning | [ 49.99$ Course For Free ]'
date: 2018-12-04T14:30:00+01:00
draft: false
tags : [BUSINESS, MATH]
---
[](https://1.bp.blogspot.com/-VxItHUO7DjI/XAaAeBNXVAI/AAAAAAAAAgw/727MqcwsFNsXuXAEF0xB9w4F6xZYDPcPQCLcBGAs/s1600/Math-for-Machine-Learning.jpg)
### DESCRIPTION:
Would you like to learn a mathematics subject that is crucial for many high-demand lucrative career fields such as:
* Computer Science
* Data Science
* Artificial Intelligence
If you’re looking to gain a solid foundation in Machine Learning to further your career goals, in a way that allows you to study on your own schedule at a fraction of the cost it would take at a traditional university, this online course is for you. If you’re a working professional needing a refresher on machine learning or a complete beginner who needs to learn Machine Learning for the first time, this online course is for you.
**Why you should take this online course**: You need to refresh your knowledge of machine learning for your career to earn a higher salary. You need to learn machine learning because it is a required mathematical subject for your chosen career field such as data science or artificial intelligence. You intend to pursue a masters degree or PhD, and machine learning is a required or recommended subject.
**Why you should choose this instructor**: I earned my PhD in Mathematics from the University of California, Riverside. I have created many successful online math courses that students around the world have found invaluable—courses in linear algebra, discrete math, and calculus.
In this course, I cover the core concepts such as:
* Linear Regression
* Linear Discriminant Analysis
* Logistic Regression
* Artificial Neural Networks
* Support Vector Machines
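As a small taste of the first topic in that list, a one-feature least-squares fit can be written in a few lines. This is purely illustrative and not actual course material:

```python
# Least-squares fit of y = w*x + b on a tiny synthetic dataset,
# using the closed-form solution for a single feature.

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]  # exactly y = 2x + 1

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
w = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
b = mean_y - w * mean_x
print(w, b)  # 2.0 1.0
```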
After taking this course, you will feel CARE-FREE AND CONFIDENT. I will break it all down into bite-sized no-brainer chunks. I explain each definition and go through each example STEP BY STEP so that you understand each topic clearly. I will also be AVAILABLE TO ANSWER ANY QUESTIONS you might have on the lecture material or any other questions you are struggling with.
Practice problems are provided for you, and detailed solutions are also provided to check your understanding.
30 day full refund if not satisfied.
Grab a cup of coffee and start listening to the first lecture. I, and your peers, are here to help. We’re waiting for your insights and questions! **Enroll now!**
### Who is the target audience?
* Working Professionals
* Anyone interested in gaining mastery of machine learning
* Data Scientists
* AI professionals
* Adult Learners
* College Students
**File Size: 507.6MB**
**Direct Download Links:**
[Link1](http://turboagram.com/18521555/math-for-machine-link1) | [Mirror](http://turboagram.com/18521555/math-for-machine-link2)
**Torrent Download Links **
[Link1](http://turboagram.com/18521555/math-for-machine-torrent1) | [Mirror](http://turboagram.com/18521555/math-for-machine-torrent2)
**Source:** https://www.udemy.com/mathematics-for-machine-learning/
**NOTICE: **
**\- Turn Off Your Ad-Blocker **
**\- Highly Recommend to Use Torrent Links for Big Size Files** | 56.65 | 434 | 0.768167 | eng_Latn | 0.985654 |
0c7dfb8bbb556c23dfb4def09b1edc2ece76f47b | 4,600 | md | Markdown | book/docs/about.md | AvijeetPrasad/laputas-blog | 27d969e341b1d264ef4fe3a334c775ce631ba2f1 | [
"BSD-3-Clause"
] | null | null | null | book/docs/about.md | AvijeetPrasad/laputas-blog | 27d969e341b1d264ef4fe3a334c775ce631ba2f1 | [
"BSD-3-Clause"
] | null | null | null | book/docs/about.md | AvijeetPrasad/laputas-blog | 27d969e341b1d264ef4fe3a334c775ce631ba2f1 | [
"BSD-3-Clause"
] | null | null | null | # About Us
Team:
- Prof. C. Sivaram
- Arun Kenath
- Avijeet Prasad
**Prof. C. Sivaram** retired as senior professor, Indian Institute of Astrophysics, Bangalore, having received his PhD degree from the Indian Institute of Science in 1977. After a few years on the staff of IISc, he joined the Indian Institute of Astrophysics as a visiting fellow in 1980 and has moved up the academic ladder since. At IIA his research work has mainly been in several areas of theoretical astrophysics and cosmology.
His early work on the early universe resulted in some of the earliest work on inflation models dealing with a time-varying cosmological constant. Some of this work has led to insights into the unification of fundamental interactions in the early universe with energy-dependent coupling constants. He has also made several contributions to high-energy astrophysics, some of which include the origin and acceleration of the highest-energy cosmic rays, gamma-ray and neutrino production in quasars, effects of neutron-antineutron oscillations, decay of cosmic strings and D-branes, and COBE constraints on particle properties and theories. Contributions have also been made to the solar neutrino problem, pulsar physics, gravitational radiation, theories of galaxy formation, dark matter, etc., along with some new work on black hole entropy.
He has authored or co-authored several review articles which have also appeared in Physics Reports, Reports on Progress in Physics, etc. Recently he has been invited to contribute many such articles to several monographs. He was an invited speaker at more than 25 international conferences and also a plenary speaker at several NATO Advanced Study Institutes on topics ranging from black holes, cosmology and particle physics to gravitational waves, etc.
His recent studies on spin effects in gravitation led to a monograph on the subject (co-authored with de Sabbata and published by World Scientific). Over the years he has collaborated with leading scientists from several countries. He has held more than ten visiting professorships abroad (including two sabbaticals) over the past decade in universities in Europe, North and South America, Australia, Russia, etc. He was also a Joint Director of projects of the World Laboratory, Geneva, from 1989 to 1994.
His recent foreign collaborations include: work on a chromogravity approach to quark and gluon confinement with Nobel laureate Abdus Salam, later extended with Ne'eman; spin and torsion effects in gravitation with de Sabbata; new models of inflation with de Andrade; radiative effects of additional long-range forces in binary systems with Bertotti; and COBE constraints on galaxy correlations with Landsberg, plus many other topics ranging from interference in gravitational fields and the entropy of gravitating systems to new approaches to the quantization of gravity (these latter areas with noted German, Russian, Brazilian and Chinese scientists).
Apart from outside collaborations, he has collaborated with several of his colleagues in IIA on a whole spectrum of topics in astrophysics.
Among the international awards received are the Martin Forster Gold Medal and the Ettore Majorana Fellowship. Over twenty of his papers were selected for Honourable Mention in the Gravity Research Foundation Competition. He also received an international prize for a recent paper.
He has more than 250 publications in international journals. He is a fellow of the International Astronomical Union (IAU) and has been a member of IAU Commission No. 51 on Bioastronomy for the past several years.
He has authored/co-authored seven books on topics ranging from astrophysics, cosmology to astrobiology and rocket dynamics.
---
**Arun Kenath** is an avid physicist and academician. He is the recipient of six gold medals from Bangalore University. He has a PhD in Physics (Astrophysics). His primary area of expertise is cosmology and theoretical astrophysics. He has been an invited speaker to various institutions, on topics ranging from cosmology to artificial intelligence. He serves in the editorial board of three international peer-review journals/newsletters. He has published papers in various international journals, and has authored four books on astrophysics and astrobiology. He currently works as a lecturer at the Department of Physics, Christ Junior College, Bangalore.
---
**Avijeet Prasad** is currently a postdoctoral fellow at the University of Alabama in Huntsville. His research focuses on the magnetohydrodynamics of the solar atmosphere and high energetic phenomena like solar flares and coronal mass ejections.
| 153.333333 | 827 | 0.82087 | eng_Latn | 0.999545 |
0c7e1b359d29c05382ef9b988728c2e8b86d40eb | 314 | md | Markdown | abstract-factory/README.md | cxcoder/design-patterns | 0acf870c710b41e6f2927979b567e62ae05188ec | [
"MIT"
] | 2 | 2017-01-22T15:10:46.000Z | 2017-01-22T15:27:35.000Z | abstract-factory/README.md | cyhe/design-patterns | 0acf870c710b41e6f2927979b567e62ae05188ec | [
"MIT"
] | null | null | null | abstract-factory/README.md | cyhe/design-patterns | 0acf870c710b41e6f2927979b567e62ae05188ec | [
"MIT"
] | null | null | null | ### 抽象工厂 - Abstract Factory
提供一个创建一系列相关或相互依赖的对象的接口,而无需指定它们具体的类。

### 适用场景
- 一个系统要独立于其产品的创建、组合和表示时
- 一个系统要由多个产品系列中的一个来配置时
- 一系列相关的产品对象被设计成一起使用时
- 提供一个产品类库,只显示它们的接口而不是实现时
### 已知应用
TODO
| 20.933333 | 114 | 0.761146 | yue_Hant | 0.809214 |
d01db28493c63c6e37ef9cc5f28fe10f39a24739 | 1,431 | md | Markdown | README.md | CallitMez/PSbot | cbc6ed1aea11067fc42792ab3eed12334f5a62b5 | [
"MIT"
] | null | null | null | README.md | CallitMez/PSbot | cbc6ed1aea11067fc42792ab3eed12334f5a62b5 | [
"MIT"
] | null | null | null | README.md | CallitMez/PSbot | cbc6ed1aea11067fc42792ab3eed12334f5a62b5 | [
"MIT"
] | null | null | null | Pokemon Showdown Bot made in Python 3.4
Detailed information can be found in the respective files.
Structure
---------
The Showdown bot is built from three components:
- app.py, which contains PSBot(), the central network, and is where most of the connections to other pieces of the app are created.
- The class PSBot extends the base class PokemonShowdownBot found in robot.py, which contains almost all of the basic functions required for the bot to work. Most of the more general functions like join, leave and say are defined here.
- The third file this relies on is room.py, as every room joined creates a new Room object that stores important information for the bot, such as userlists and allowed users.
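A hypothetical sketch of how those three pieces fit together. Everything beyond the class and file names mentioned above (method bodies, attribute names) is illustrative and not the bot's actual code:

```python
# Illustrative sketch of the three-component layout described above.
# Method bodies are placeholders, not the real bot's implementation.

class Room:
    """room.py: per-room state created whenever a room is joined."""
    def __init__(self, name):
        self.name = name
        self.userlist = set()        # users currently in the room
        self.allowed_users = set()   # users permitted to use commands

class PokemonShowdownBot:
    """robot.py: base class with generic join/leave/say plumbing."""
    def __init__(self):
        self.rooms = {}

    def join(self, room_name):
        self.rooms[room_name] = Room(room_name)

    def leave(self, room_name):
        self.rooms.pop(room_name, None)

class PSBot(PokemonShowdownBot):
    """app.py: central class wiring the pieces together."""
    pass

bot = PSBot()
bot.join("lobby")
print(sorted(bot.rooms))  # ['lobby']
```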
Setting up
----------
#### Python version:
- Python 3.4.2 (Supposedly works with 2.7 versions of the libraries too, but cannot confirm)
#### Guide:
1. Clone the git repo to your desired location
2. Use `pip install requirements.txt` to get relevant modules for the project
3. Follow the instructions in `details-example.py` to configure it
4. Run using `python app.py` (or `python3 app.py` if you have both versions installed)
License
-------
This is distributed under the terms of the [MIT License][1].
[1]: https://github.com/QuiteQuiet/PokemonShowdownBot/blob/master/LICENCE
Credits
-------
This bot is based on PokemonShowdownBot by:
- Quite Quiet
https://github.com/QuiteQuiet/PokemonShowdownBot
| 36.692308 | 247 | 0.75891 | eng_Latn | 0.999147 |
d01f53a42446fea6d5f45130fa8694bc5b9cdc45 | 69 | md | Markdown | content/docs/user-guide/gen-tests/_index.md | httprunner/website | 6eddb7bad079808adacbbd005ac0c1983e7568da | [
"Apache-2.0"
] | null | null | null | content/docs/user-guide/gen-tests/_index.md | httprunner/website | 6eddb7bad079808adacbbd005ac0c1983e7568da | [
"Apache-2.0"
] | null | null | null | content/docs/user-guide/gen-tests/_index.md | httprunner/website | 6eddb7bad079808adacbbd005ac0c1983e7568da | [
"Apache-2.0"
] | null | null | null | ---
title: 准备测试用例
weight: 4
description: HttpRunner 初始化测试用例的几种方式
---
| 11.5 | 36 | 0.73913 | eng_Latn | 0.584828 |
d01f89f38f35413ab5d983d2ab2229d7dc8a96c3 | 3,855 | md | Markdown | articles/azure-stack/azure-stack-tools-paas-services.md | kanr/azure-content | 636e8035ef7b784da820688b21908f860aabae0e | [
"CC-BY-3.0"
] | null | null | null | articles/azure-stack/azure-stack-tools-paas-services.md | kanr/azure-content | 636e8035ef7b784da820688b21908f860aabae0e | [
"CC-BY-3.0"
] | null | null | null | articles/azure-stack/azure-stack-tools-paas-services.md | kanr/azure-content | 636e8035ef7b784da820688b21908f860aabae0e | [
"CC-BY-3.0"
] | null | null | null | <properties
pageTitle="Tools and PaaS services for Azure Stack | Microsoft Azure"
description="Learn how to get started with PaaS services in Azure Stack."
services="azure-stack"
documentationCenter=""
authors="ErikjeMS"
manager="byronr"
editor=""/>
<tags
ms.service="multiple"
ms.workload="na"
ms.tgt_pltfrm="na"
ms.devlang="na"
ms.topic="article"
ms.date="01/29/2016"
ms.author="erikje"/>
# Tools and PaaS services for Azure Stack
Azure Stack enables deploying Platform as a Service (PaaS) services from Microsoft and other 3rd party providers. You can also download the tools described below. If you want to be notified of new services and tools, follow #AzureStack on Twitter.
## Additional PaaS services
In Technical Preview 1, three PaaS resource providers are now available.
[Add a SQL Server resource provider to Azure Stack](azure-stack-sql-rp-deploy-short.md)
[Add a MySQL resource provider to Azure Stack](azure-stack-mysql-rp-deploy-short.md)
[Add a Web Apps resource provider to Azure Stack](azure-stack-webapps-deploy.md)
## Template tools
### Azure Stack Github templates
Explore the growing collection of [Azure Stack GitHub Templates](https://github.com/Azure/AzureStack-QuickStart-Templates). Just like [Azure](https://github.com/Azure/azure-quickstart-templates), these “Quick Start” Azure Resource Manager templates aim to get you started with simple building blocks and examples, ready to deploy on the Microsoft Azure Stack Technical Preview Proof of Concept Environment. Included are first party workload examples for [ad-non-ha](https://github.com/Azure/AzureStack-QuickStart-Templates/tree/master/ad-non-ha), [sql-2014-non-ha](https://github.com/Azure/AzureStack-QuickStart-Templates/tree/master/sql-2014-non-ha), [sharepoint-2013-non-ha](https://github.com/Azure/AzureStack-QuickStart-Templates/tree/master/sharepoint-2013-non-ha), as well as several simple 101 templates like [101-simple-windows-vm](https://github.com/Azure/AzureStack-QuickStart-Templates/tree/master/101-simple-windows-vm).
### Marketplace item packaging tool and sample
[Download and use the Packaging tool](http://www.aka.ms/azurestackmarketplaceitem) to create marketplace items for your own custom templates to add to the Azure Stack marketplace. Instructions on how to create a marketplace item and make it available to your tenants can be found in [Create Marketplace item](azure-stack-create-marketplace-item.md).
## Developer tools
### Visual Studio Cloud Tools
Use the Visual Studio Cloud Tools to quickly build new applications or deploy existing applications to Azure Stack.
[Download for Visual Studio 2015](http://go.microsoft.com/fwlink/?linkid=518003)
### Azure PowerShell SDK
Azure PowerShell is a module that provides cmdlets to manage Azure and Azure Stack with Windows PowerShell. You can use the cmdlets to create, test, deploy, and manage solutions and services delivered through the Azure Stack platform.
[Download Azure PowerShell SDK](http://aka.ms/azStackPsh)
> [AZURE.NOTE] If you work on the Client VM, you’ll need to first **uninstall** the existing Azure PowerShell module and then [download](http://aka.ms/azStackPsh) the latest Azure PowerShell SDK.
### Azure cross platform command line interfaces
Quickly install the Azure Command-Line Interface (Azure CLI) to use a set of open-source shell-based commands for creating and managing resources in Microsoft Azure Stack.
[Download the Windows CLI](http://aka.ms/azstack-windows-cli)
[Download the Mac CLI](http://aka.ms/azstack-linux-cli)
[Download the Linux CLI](http://aka.ms/azstack-mac-cli)
>[AZURE.NOTE]
>
> + If you’re on a Mac or Linux machine, you can also get the CLI by using the command `npm install -g [email protected]`</br>
> + If you're getting certificate validation issues, run the command `set NODE_TLS_REJECT_UNAUTHORIZED=0`
| 57.537313 | 932 | 0.783139 | eng_Latn | 0.829484 |
d020270579110d199136ccf3177792fe4ef5b2b9 | 92 | md | Markdown | README.md | codechimp/arcadian-wookie | 4aeb12608c0dbdb6b66f08f15c6d4b81b80507ec | [
"Apache-2.0"
] | null | null | null | README.md | codechimp/arcadian-wookie | 4aeb12608c0dbdb6b66f08f15c6d4b81b80507ec | [
"Apache-2.0"
] | null | null | null | README.md | codechimp/arcadian-wookie | 4aeb12608c0dbdb6b66f08f15c6d4b81b80507ec | [
"Apache-2.0"
] | null | null | null | # arcadian-wookie
CQRS framework for which we could not find a cool name (yet)
IN PROGRESS
| 18.4 | 60 | 0.771739 | eng_Latn | 0.99282 |
d020babd0ca7c057d0577ce81cd99bb9f47a3796 | 38 | md | Markdown | README.md | akashdathan/crypto-trader-bot | 241c6658dbb1b025c65c7446125e47e75c85efa1 | [
"MIT"
] | null | null | null | README.md | akashdathan/crypto-trader-bot | 241c6658dbb1b025c65c7446125e47e75c85efa1 | [
"MIT"
] | null | null | null | README.md | akashdathan/crypto-trader-bot | 241c6658dbb1b025c65c7446125e47e75c85efa1 | [
"MIT"
] | null | null | null | # crypto-trader-bot
Crypto Trader Bot
| 12.666667 | 19 | 0.789474 | ceb_Latn | 0.509341 |
d021e09bd6e1aac2106801c3370ad38a157d3aa9 | 1,089 | md | Markdown | TK_14.md | YingyingXie16/RDataScience | 6d4e73fb7c63a76720289aec5ab1943c7d16c039 | [
"MIT"
] | null | null | null | TK_14.md | YingyingXie16/RDataScience | 6d4e73fb7c63a76720289aec5ab1943c7d16c039 | [
"MIT"
] | null | null | null | TK_14.md | YingyingXie16/RDataScience | 6d4e73fb7c63a76720289aec5ab1943c7d16c039 | [
"MIT"
] | null | null | null | ---
title: Final Project 2nd Draft / Building and summarizing models
subtitle: Building a species distribution model
week: 14
type: Task
reading:
- Chapter [23-25 in R4DS](http://r4ds.had.co.nz)
tasks:
- Commit second draft of final project to GitHub for review
- Demonstrate a simple presence/absence model in spatial context.
- Model spatial dependence (autocorrelation) in the response.
---
<div class='extraswell'>
<button data-toggle='collapse' class='btn btn-link' data-target='#pres'>View Presentation </button> [Open presentation in a new tab](){target='_blank'}
<div id='pres' class='collapse'>
<div class='embed-responsive embed-responsive-16by9'>
<iframe class='embed-responsive-item' src='' allowfullscreen></iframe>
_Click on presentation and then use the space bar to advance to the next slide
or escape key to show an overview._
</div>
</div>
</div>
# Reading
- Chapter [23-25 in R4DS](http://r4ds.had.co.nz)
# Exercise
Work through the [exercise illustrated here](13_SDM_Exercise.html). You do not need to upload anything to github.
| 30.25 | 158 | 0.728191 | eng_Latn | 0.900461 |
d022a0fd5b8db856a2ca6482edefde9894997b59 | 77 | md | Markdown | contributing.md | CodeLongAndProsper90/nat | 80d74aa573de18e522b71b69648735ffd1dfa420 | [
"MIT"
] | 1,358 | 2020-10-22T05:00:17.000Z | 2022-03-20T00:29:56.000Z | contributing.md | CodeLongAndProsper90/nat | 80d74aa573de18e522b71b69648735ffd1dfa420 | [
"MIT"
] | 61 | 2020-10-23T06:54:52.000Z | 2021-11-22T01:11:45.000Z | contributing.md | CodeLongAndProsper90/nat | 80d74aa573de18e522b71b69648735ffd1dfa420 | [
"MIT"
] | 43 | 2020-10-23T06:58:10.000Z | 2021-09-21T09:26:35.000Z | ```bash
cargo fmt
```
```bash
rustup component add clippy
cargo clippy
```
| 7.7 | 27 | 0.662338 | eng_Latn | 0.55216 |
d0239771ac0187d67cfe9fba7b91360d0a122679 | 685 | md | Markdown | docs/markdown/02-project/02-code/03-review.md | sfeir-open-source/sfeir-school-azure-devops | cfd77d14a458092b6dfa34637747ee139ead0f17 | [
"Apache-2.0"
] | 1 | 2022-03-30T07:16:53.000Z | 2022-03-30T07:16:53.000Z | docs/markdown/02-project/02-code/03-review.md | sfeir-open-source/sfeir-school-azure-devops | cfd77d14a458092b6dfa34637747ee139ead0f17 | [
"Apache-2.0"
] | null | null | null | docs/markdown/02-project/02-code/03-review.md | sfeir-open-source/sfeir-school-azure-devops | cfd77d14a458092b6dfa34637747ee139ead0f17 | [
"Apache-2.0"
] | null | null | null | <!-- .slide: class="transition bg-pink" -->
# Code

**Review Code**
##--##
## Objectif: Collaboration
**Fondement de l'open source**
Pull request
- Partager à l'équipe les changements
- Demander l'avis des autres
- Discuter des modifications éventuelles
- Détecter les bugs (un peu)
- Detecter les problèmes
<div class="tip">
Un PR n'est pas une obligation de merger !
</div>
##--##
## It's all about feedback
- Assigner au bonnes personnes
- Description claire
- Feedback constructif, rapide
<div class="tip">Feedback POSITIF 💝</div>
Notes:
- Bonnes personnes => policy + manuel
- Changements out of scope => new work item
| 17.564103 | 48 | 0.70365 | fra_Latn | 0.83045 |
d023a94d2a0c026e099df98d50c15991204a066f | 4,597 | md | Markdown | README.md | l0wskilled/AliceGenerator | 5d59aa94a5590b47c94f0698fbaad652b0a046e3 | [
"MIT"
] | 48 | 2016-08-25T07:41:33.000Z | 2021-12-22T11:01:32.000Z | README.md | l0wskilled/AliceGenerator | 5d59aa94a5590b47c94f0698fbaad652b0a046e3 | [
"MIT"
] | 16 | 2016-08-30T14:48:12.000Z | 2020-06-05T13:08:25.000Z | README.md | l0wskilled/AliceGenerator | 5d59aa94a5590b47c94f0698fbaad652b0a046e3 | [
"MIT"
] | 23 | 2016-08-24T19:31:31.000Z | 2022-02-25T21:23:51.000Z | AliceGenerator [](https://travis-ci.org/trappar/AliceGenerator)
==============
Recursively convert existing objects into [Alice](https://github.com/nelmio/alice) Fixtures.
## Introduction
Sometimes you find yourself working on a large project with no existing fixtures.
In this case even though Alice makes fixtures much easier to write, that process can still be tedious.
This library proposes an alternate starting point - *automatically generate fixtures from your existing data.*
This opens up a whole new, much faster way to get your test data established... just enter it in your user interface!
#### Example
Let's say you have the following objects
```php
<?php
class Post {
public $title;
public $body;
public $postedBy;
}
class User {
public $username;
}
$user = new User();
$user->username = 'Trappar';
$post1 = new Post();
$post1->title = 'Is Making Fixtures Too Time Consuming?';
$post1->body = 'Check out Alice!';
$post1->postedBy = $user;
$post2 = new Post();
$post2->title = 'Too Much Data to Hand Write?';
$post2->body = 'Check out AliceGenerator!';
$post2->postedBy = $user;
```
This library lets you turn that directly into...
```yaml
Post:
Post-1:
title: 'Is Making Fixtures Too Time Consuming?'
body: 'Check out Alice!'
postedBy: '@User-1'
Post-2:
title: 'Too Much Data to Hand Write?'
body: 'Check out AliceGenerator!'
postedBy: '@User-1'
User:
User-1:
    username: Trappar
```
## Installation
You can use [Composer](https://getcomposer.org/) to install the library to your project:
```bash
composer require trappar/alice-generator
```
## Features
* Framework support
* Supports Symfony via [AliceGeneratorBundle](https://github.com/trappar/AliceGeneratorBundle) - Start generating fixtures immediately with *zero custom code required!*
* ORM support
* Supports Doctrine natively
* Can operate without any ORM
* Can be extended to support any ORM
* Many ways to make use of Faker providers
* Configure how your objects are serialized using annotations or YAML metadata
* Can serialize any object type using custom ObjectHandlers
* Supports multiple levels of recursion taming
* Handles circular references automatically
* Customizable maximum recursion depth
* Can restrict object traversal to only specific objects of a type
* Supports several methods of naming Alice references natively - fully customizable
## Table of Contents
* [Usage](doc/usage.md)
* [Basic usage](doc/usage.md#basic-usage)
* [Fixture Generation Contexts](doc/usage.md#fixture-generation-contexts)
* [Limiting Recursion](doc/usage.md#limiting-recursion)
* [Limiting Recursion Depth](doc/usage.md#limiting-recursion-depth)
* [Limiting Recursion With Object Constraints](doc/usage.md#limiting-recursion-with-object-constraints)
* [Customizing Reference Naming Strategy](doc/usage.md#customizing-reference-naming-strategy)
* [Force Inclusion of Properties at Default Values](doc/usage.md#force-inclusion-of-properties-at-default-values)
* [Property Metadata](doc/metadata.md)
* [Data](doc/metadata.md#data)
* [Ignore](doc/metadata.md#ignore)
* [Faker](doc/metadata.md#faker)
* [Built-in Faker Resolver Types](doc/metadata.md#built-in-faker-resolver-types)
* [array](doc/metadata.md#array)
* [value-as-arg](doc/metadata.md#value-as-arg)
* [callback](doc/metadata.md#callback)
* [Custom Object Handlers](doc/custom-object-handlers.md)
* [Configuration](doc/configuration.md)
* [Constructing a FixtureGenerator](doc/configuration.md#constructing-a-fixturegenerator)
* [Using with Doctrine](doc/configuration.md#using-with-doctrine)
* [Property Namer Strategy](doc/configuration.md#property-namer-strategy)
* [Adding Custom Object Handlers](doc/configuration.md#adding-custom-object-handlers)
* [Configuring Metadata Locations](doc/configuration.md#configuring-metadata-locations)
* [Disabling Strict Type Checking](doc/configuration.md#disabling-strict-type-checking)
## Resources
* [Changelog](CHANGELOG.md)
## Credits
This bundle was developed by [Jeff Way](https://github.com/trappar) with quite a lot of inspiration from:
* [nelmio/alice](https://github.com/nelmio/alice)
* [schmittjoh/serializer](https://github.com/schmittjoh/serializer)
[Other contributors](https://github.com/trappar/AliceGenerator/graphs/contributors).
## License
[](Resources/meta/LICENSE)
---
title: Use Azure Media Services to deliver DRM licenses or AES keys | Microsoft Docs
description: This article describes how you can use Azure Media Services to deliver PlayReady and/or Widevine licenses and AES keys, while doing the rest (encoding, encryption, streaming) with on-premises servers.
services: media-services
documentationcenter: ''
author: Juliako
manager: femila
editor: ''
ms.assetid: 8546c2c1-430b-4254-a88d-4436a83f9192
ms.service: media-services
ms.workload: media
ms.tgt_pltfrm: na
ms.devlang: na
ms.topic: article
ms.date: 03/18/2019
ms.author: juliako
ms.openlocfilehash: b1f8b158c511919a72e72629d72b0e5ff73ff7db
ms.sourcegitcommit: 2ec4b3d0bad7dc0071400c2a2264399e4fe34897
ms.translationtype: MT
ms.contentlocale: pl-PL
ms.lasthandoff: 03/28/2020
ms.locfileid: "78268110"
---
# <a name="use-media-services-to-deliver-drm-licenses-or-aes-keys"></a>Use Media Services to deliver DRM licenses or AES keys
> [!NOTE]
> No new features are being added to Media Services v2. <br/>Check out the latest version, [Media Services v3](https://docs.microsoft.com/azure/media-services/latest/). Also see [migration guidance from v2 to v3](../latest/migrate-from-v2-to-v3.md)
Azure Media Services lets you ingest, encode, add content protection to, and stream your content. For more information, see [Use PlayReady and/or Widevine dynamic common encryption](media-services-protect-with-playready-widevine.md). Some customers want to use Media Services only to deliver licenses and/or keys, and to encode, encrypt, and stream by using on-premises servers. This article describes how to use Media Services to deliver PlayReady and/or Widevine licenses while doing the rest with on-premises servers.
To complete the steps in this tutorial, you need an Azure account. For details, see [Azure free trial](https://azure.microsoft.com/pricing/free-trial/).
## <a name="overview"></a>Overview
Media Services provides a service for delivering PlayReady and Widevine digital rights management (DRM) licenses and AES-128 keys. Media Services also provides APIs that let you configure the rights and restrictions that you want the DRM runtime to enforce when a user plays back DRM-protected content. When a user requests protected content, the player application requests a license from the Media Services license service. If the license is authorized, the Media Services license service issues it to the player. The PlayReady and Widevine licenses contain the decryption key that the client player can use to decrypt and stream the content.
Media Services supports multiple ways of authorizing users who make license or key requests. You configure the content key's authorization policy, which can have one or more restrictions: open or token restriction. A token-restricted policy must be accompanied by a token issued by a security token service (STS). Media Services supports tokens in the simple web token (SWT) format and the JSON Web Token (JWT) format.
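For illustration only — this is not the Media Services SDK or its actual validation code — the SWT format mentioned above can be sketched in a few lines: an SWT is a form-encoded set of claims signed with an HMAC-SHA256 key, and the license server checks the signature plus the `Issuer`, `Audience`, and expiry claims. The helper names (`create_swt`, `verify_swt`) and the key value below are hypothetical:

```python
import base64
import hashlib
import hmac
import time
from urllib.parse import urlencode, parse_qs, quote

def create_swt(issuer, audience, key, lifetime_seconds=3600):
    # An SWT is a form-encoded claim set; the trailing HMACSHA256
    # parameter signs everything that precedes it.
    claims = {
        "Audience": audience,
        "Issuer": issuer,
        "ExpiresOn": str(int(time.time()) + lifetime_seconds),
    }
    unsigned = urlencode(claims)
    sig = base64.b64encode(
        hmac.new(key, unsigned.encode(), hashlib.sha256).digest()
    ).decode()
    return unsigned + "&HMACSHA256=" + quote(sig)

def verify_swt(token, key):
    # Split off the signature, recompute it, and check the expiry claim.
    unsigned, _, sig = token.rpartition("&HMACSHA256=")
    expected = quote(base64.b64encode(
        hmac.new(key, unsigned.encode(), hashlib.sha256).digest()
    ).decode())
    claims = parse_qs(unsigned)
    return (hmac.compare_digest(expected, sig)
            and int(claims["ExpiresOn"][0]) > time.time())

key = b"0123456789abcdef0123456789abcdef"  # hypothetical shared secret
token = create_swt("http://testissuer.com", "urn:test", key)
print(verify_swt(token, key))  # True
```

In a real deployment you would not hand-roll tokens like this; the sample later in this article configures the expected claims through `TokenRestrictionTemplate` and lets an STS issue the tokens.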
The following diagram shows the main steps you take to use Media Services to deliver PlayReady and/or Widevine licenses while doing the rest with on-premises servers:

## <a name="download-sample"></a>Pobieranie próbki
Aby pobrać przykład opisany w tym artykule, zobacz [Korzystanie z usługi Azure Media Services w celu dostarczania licencji PlayReady i/lub Widevine za pomocą platformy .NET](https://github.com/Azure/media-services-dotnet-deliver-drm-licenses).
## <a name="create-and-configure-a-visual-studio-project"></a>Tworzenie i konfigurowanie projektu programu Visual Studio
1. Skonfiguruj środowisko programistyczne i wypełnij plik app.config informacjami o połączeniu, zgodnie z opisem w programie [Media Services development za pomocą platformy .NET](media-services-dotnet-how-to-use.md).
2. Dodaj następujące elementy do węzła **appSettings** zdefiniowanego w pliku app.config:
```xml
<add key="Issuer" value="http://testissuer.com"/>
<add key="Audience" value="urn:test"/>
```
## <a name="net-code-example"></a>Przykład kodu platformy .NET
W poniższym przykładzie kodu pokazano, jak utworzyć wspólny klucz zawartości i uzyskać adresy URL nabycia licencji PlayReady lub Widevine. Aby skonfigurować serwer lokalny, potrzebny jest klucz zawartości, identyfikator klucza i adres URL nabycia licencji. Po skonfigurowaniu serwera lokalnego można przesyłać strumieniowo z własnego serwera przesyłania strumieniowego. Ponieważ zaszyfrowany strumień wskazuje serwer licencji usługi Media Services, odtwarzacz żąda licencji od usługi Media Services. Jeśli wybierzesz uwierzytelnianie tokenu, serwer licencji usługi Media Services sprawdza poprawność tokenu wysłanego za pośrednictwem protokołu HTTPS. Jeśli token jest prawidłowy, serwer licencji dostarcza licencję z powrotem do odtwarzacza. W poniższym przykładzie kodu pokazano tylko, jak utworzyć wspólny klucz zawartości i uzyskać adresy URL nabycia licencji PlayReady lub Widevine. Jeśli chcesz dostarczyć klucze AES-128, musisz utworzyć klucz zawartości koperty i uzyskać adres URL pozyskiwania klucza. Aby uzyskać więcej informacji, zobacz [Korzystanie z usługi szyfrowania dynamicznego aes-128 i dostarczania kluczy](media-services-protect-with-aes128.md).
```csharp
using System;
using System.Collections.Generic;
using System.Configuration;
using Microsoft.WindowsAzure.MediaServices.Client;
using Microsoft.WindowsAzure.MediaServices.Client.ContentKeyAuthorization;
using Microsoft.WindowsAzure.MediaServices.Client.Widevine;
using Newtonsoft.Json;
namespace DeliverDRMLicenses
{
class Program
{
// Read values from the App.config file.
private static readonly string _AADTenantDomain =
ConfigurationManager.AppSettings["AMSAADTenantDomain"];
private static readonly string _RESTAPIEndpoint =
ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
private static readonly string _AMSClientId =
ConfigurationManager.AppSettings["AMSClientId"];
private static readonly string _AMSClientSecret =
ConfigurationManager.AppSettings["AMSClientSecret"];
private static readonly Uri _sampleIssuer =
new Uri(ConfigurationManager.AppSettings["Issuer"]);
private static readonly Uri _sampleAudience =
new Uri(ConfigurationManager.AppSettings["Audience"]);
// Field for service context.
private static CloudMediaContext _context = null;
static void Main(string[] args)
{
AzureAdTokenCredentials tokenCredentials =
new AzureAdTokenCredentials(_AADTenantDomain,
new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
AzureEnvironments.AzureCloudEnvironment);
var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
_context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
bool tokenRestriction = true;
string tokenTemplateString = null;
IContentKey key = CreateCommonTypeContentKey();
// Print out the key ID and Key in base64 string format
Console.WriteLine("Created key {0} with key value {1} ",
key.Id, System.Convert.ToBase64String(key.GetClearKeyValue()));
Console.WriteLine("PlayReady License Key delivery URL: {0}",
key.GetKeyDeliveryUrl(ContentKeyDeliveryType.PlayReadyLicense));
Console.WriteLine("Widevine License Key delivery URL: {0}",
key.GetKeyDeliveryUrl(ContentKeyDeliveryType.Widevine));
if (tokenRestriction)
tokenTemplateString = AddTokenRestrictedAuthorizationPolicy(key);
else
AddOpenAuthorizationPolicy(key);
Console.WriteLine("Added authorization policy: {0}",
key.AuthorizationPolicyId);
Console.WriteLine();
Console.ReadLine();
}
static public void AddOpenAuthorizationPolicy(IContentKey contentKey)
{
// Create ContentKeyAuthorizationPolicy with Open restrictions
// and create authorization policy
List<ContentKeyAuthorizationPolicyRestriction> restrictions =
new List<ContentKeyAuthorizationPolicyRestriction>
{
new ContentKeyAuthorizationPolicyRestriction
{
Name = "Open",
KeyRestrictionType = (int)ContentKeyRestrictionType.Open,
Requirements = null
}
};
// Configure PlayReady and Widevine license templates.
string PlayReadyLicenseTemplate = ConfigurePlayReadyLicenseTemplate();
string WidevineLicenseTemplate = ConfigureWidevineLicenseTemplate();
IContentKeyAuthorizationPolicyOption PlayReadyPolicy =
_context.ContentKeyAuthorizationPolicyOptions.Create("",
ContentKeyDeliveryType.PlayReadyLicense,
restrictions, PlayReadyLicenseTemplate);
IContentKeyAuthorizationPolicyOption WidevinePolicy =
_context.ContentKeyAuthorizationPolicyOptions.Create("",
ContentKeyDeliveryType.Widevine,
restrictions, WidevineLicenseTemplate);
IContentKeyAuthorizationPolicy contentKeyAuthorizationPolicy = _context.
ContentKeyAuthorizationPolicies.
CreateAsync("Deliver Common Content Key with no restrictions").
Result;
contentKeyAuthorizationPolicy.Options.Add(PlayReadyPolicy);
contentKeyAuthorizationPolicy.Options.Add(WidevinePolicy);
// Associate the content key authorization policy with the content key.
contentKey.AuthorizationPolicyId = contentKeyAuthorizationPolicy.Id;
contentKey = contentKey.UpdateAsync().Result;
}
public static string AddTokenRestrictedAuthorizationPolicy(IContentKey contentKey)
{
string tokenTemplateString = GenerateTokenRequirements();
List<ContentKeyAuthorizationPolicyRestriction> restrictions =
new List<ContentKeyAuthorizationPolicyRestriction>
{
new ContentKeyAuthorizationPolicyRestriction
{
Name = "Token Authorization Policy",
KeyRestrictionType = (int)ContentKeyRestrictionType.TokenRestricted,
Requirements = tokenTemplateString,
}
};
// Configure PlayReady and Widevine license templates.
string PlayReadyLicenseTemplate = ConfigurePlayReadyLicenseTemplate();
string WidevineLicenseTemplate = ConfigureWidevineLicenseTemplate();
IContentKeyAuthorizationPolicyOption PlayReadyPolicy =
_context.ContentKeyAuthorizationPolicyOptions.Create("Token option",
ContentKeyDeliveryType.PlayReadyLicense,
restrictions, PlayReadyLicenseTemplate);
IContentKeyAuthorizationPolicyOption WidevinePolicy =
_context.ContentKeyAuthorizationPolicyOptions.Create("Token option",
ContentKeyDeliveryType.Widevine,
restrictions, WidevineLicenseTemplate);
IContentKeyAuthorizationPolicy contentKeyAuthorizationPolicy = _context.
ContentKeyAuthorizationPolicies.
CreateAsync("Deliver Common Content Key with token restrictions").
Result;
contentKeyAuthorizationPolicy.Options.Add(PlayReadyPolicy);
contentKeyAuthorizationPolicy.Options.Add(WidevinePolicy);
// Associate the content key authorization policy with the content key
contentKey.AuthorizationPolicyId = contentKeyAuthorizationPolicy.Id;
contentKey = contentKey.UpdateAsync().Result;
return tokenTemplateString;
}
static private string GenerateTokenRequirements()
{
TokenRestrictionTemplate template = new TokenRestrictionTemplate(TokenType.SWT);
template.PrimaryVerificationKey = new SymmetricVerificationKey();
template.AlternateVerificationKeys.Add(new SymmetricVerificationKey());
template.Audience = _sampleAudience.ToString();
template.Issuer = _sampleIssuer.ToString();
template.RequiredClaims.Add(TokenClaim.ContentKeyIdentifierClaim);
return TokenRestrictionTemplateSerializer.Serialize(template);
}
static private string ConfigurePlayReadyLicenseTemplate()
{
// The following code configures PlayReady License Template using .NET classes
// and returns the XML string.
//The PlayReadyLicenseResponseTemplate class represents the template
//for the response sent back to the end user.
//It contains a field for a custom data string between the license server
//and the application (may be useful for custom app logic)
//as well as a list of one or more license templates.
PlayReadyLicenseResponseTemplate responseTemplate =
new PlayReadyLicenseResponseTemplate();
// The PlayReadyLicenseTemplate class represents a license template
// for creating PlayReady licenses
// to be returned to the end users.
// It contains the data on the content key in the license
// and any rights or restrictions to be
// enforced by the PlayReady DRM runtime when using the content key.
PlayReadyLicenseTemplate licenseTemplate = new PlayReadyLicenseTemplate();
// Configure whether the license is persistent
// (saved in persistent storage on the client)
// or non-persistent (only held in memory while the player is using the license).
licenseTemplate.LicenseType = PlayReadyLicenseType.Nonpersistent;
// AllowTestDevices controls whether test devices can use the license or not.
// If true, the MinimumSecurityLevel property of the license
// is set to 150. If false (the default),
// the MinimumSecurityLevel property of the license is set to 2000.
licenseTemplate.AllowTestDevices = true;
// You can also configure the Play Right in the PlayReady license by using the PlayReadyPlayRight class.
// It grants the user the ability to play back the content subject to the zero or more restrictions
// configured in the license and on the PlayRight itself (for playback specific policy).
// Much of the policy on the PlayRight has to do with output restrictions
// which control the types of outputs that the content can be played over and
// any restrictions that must be put in place when using a given output.
// For example, if the DigitalVideoOnlyContentRestriction is enabled,
//then the DRM runtime will only allow the video to be displayed over digital outputs
            //(analog video outputs won't be allowed to pass the content).
// IMPORTANT: These types of restrictions can be very powerful
// but can also affect the consumer experience.
// If the output protections are configured too restrictive,
// the content might be unplayable on some clients.
// For more information, see the PlayReady Compliance Rules document.
// For example:
//licenseTemplate.PlayRight.AgcAndColorStripeRestriction = new AgcAndColorStripeRestriction(1);
responseTemplate.LicenseTemplates.Add(licenseTemplate);
return MediaServicesLicenseTemplateSerializer.Serialize(responseTemplate);
}
private static string ConfigureWidevineLicenseTemplate()
{
var template = new WidevineMessage
{
allowed_track_types = AllowedTrackTypes.SD_HD,
content_key_specs = new[]
{
new ContentKeySpecs
{
required_output_protection =
new RequiredOutputProtection { hdcp = Hdcp.HDCP_NONE},
security_level = 1,
track_type = "SD"
}
},
policy_overrides = new
{
can_play = true,
can_persist = true,
can_renew = false
}
};
string configuration = JsonConvert.SerializeObject(template);
return configuration;
}
static public IContentKey CreateCommonTypeContentKey()
{
// Create envelope encryption content key
Guid keyId = Guid.NewGuid();
byte[] contentKey = GetRandomBuffer(16);
IContentKey key = _context.ContentKeys.Create(
keyId,
contentKey,
"ContentKey",
ContentKeyType.CommonEncryption);
return key;
}
static private byte[] GetRandomBuffer(int length)
{
var returnValue = new byte[length];
using (var rng =
new System.Security.Cryptography.RNGCryptoServiceProvider())
{
rng.GetBytes(returnValue);
}
return returnValue;
}
}
}
```
## <a name="additional-notes"></a>Uwagi dodatkowe
* Widevine jest usługą świadczoną przez Google Inc. i podlega warunkom korzystania z usługi oraz Polityce prywatności Firmy Google, Inc.
## <a name="media-services-learning-paths"></a>Ścieżki szkoleniowe dotyczące usługi Media Services
[!INCLUDE [media-services-learning-paths-include](../../../includes/media-services-learning-paths-include.md)]
## <a name="provide-feedback"></a>Przekazywanie opinii
[!INCLUDE [media-services-user-voice-include](../../../includes/media-services-user-voice-include.md)]
## <a name="see-also"></a>Zobacz też
* [Używanie dynamicznego szyfrowania Common Encryption w usługach PlayReady i Widevine](media-services-protect-with-playready-widevine.md)
* [Używanie dynamicznego szyfrowania AES-128 i usługi dostarczania kluczy](media-services-protect-with-aes128.md)
# UCE-Handler
[](https://jitpack.io/#cuongnv219/UCE-Handler)
### Install
Download via **Gradle**:
Add this to the **project `build.gradle`** file:
```gradle
allprojects {
repositories {
...
maven { url "https://jitpack.io" }
}
}
```
And then add the dependency to the **module `build.gradle`** file:
```gradle
implementation 'com.github.cuongnv219:UCE-Handler:latest_version'
```
### License
```
Copyright (C) 2018
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
```
# data-structures-I-works
d025185b1b623a992553eceab96393441e0627b5 | 44,394 | md | Markdown | repos/elixir/remote/1.11.md | Michaelzp1/repo-info | 542b4a638ccca7bd3368d3131b0a9a71d2b8185d | [
"Apache-2.0"
] | 400 | 2016-08-11T10:14:00.000Z | 2022-03-24T09:41:03.000Z | repos/elixir/remote/1.11.md | Michaelzp1/repo-info | 542b4a638ccca7bd3368d3131b0a9a71d2b8185d | [
"Apache-2.0"
] | 47 | 2016-08-18T22:30:36.000Z | 2022-02-24T01:27:22.000Z | repos/elixir/remote/1.11.md | Michaelzp1/repo-info | 542b4a638ccca7bd3368d3131b0a9a71d2b8185d | [
"Apache-2.0"
] | 319 | 2016-08-24T06:35:11.000Z | 2022-03-22T17:07:28.000Z | ## `elixir:1.11`
```console
$ docker pull elixir@sha256:cf927a843bde578a62d07c44425da38b10ba8b5700f7fa5fc70420036915950e
```
- Manifest MIME: `application/vnd.docker.distribution.manifest.list.v2+json`
- Platforms: 6
- linux; amd64
- linux; arm variant v7
- linux; arm64 variant v8
- linux; 386
- linux; ppc64le
- linux; s390x
### `elixir:1.11` - linux; amd64
```console
$ docker pull elixir@sha256:77571f8a67059d9aa757258dd140d4bcccb35a7f41c065355495ddcc1dbf0adb
```
- Docker Version: 20.10.7
- Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json`
- Total Size: **483.2 MB (483239690 bytes)**
(compressed transfer size, not on-disk size)
- Image ID: `sha256:1d7dfd4fd47f1c94d368a65daefa123b953449fe46bb63dd2d1031ea3cfdd628`
- Default Command: `["iex"]`
```dockerfile
# Tue, 12 Oct 2021 01:20:53 GMT
ADD file:98c256057b79b141aea9a806a4538cf6c3f340d7e3b0d6e8c363699333f3406b in /
# Tue, 12 Oct 2021 01:20:53 GMT
CMD ["bash"]
# Tue, 12 Oct 2021 15:44:14 GMT
RUN set -eux; apt-get update; apt-get install -y --no-install-recommends ca-certificates curl netbase wget ; rm -rf /var/lib/apt/lists/*
# Tue, 12 Oct 2021 15:44:21 GMT
RUN set -ex; if ! command -v gpg > /dev/null; then apt-get update; apt-get install -y --no-install-recommends gnupg dirmngr ; rm -rf /var/lib/apt/lists/*; fi
# Tue, 12 Oct 2021 15:44:41 GMT
RUN apt-get update && apt-get install -y --no-install-recommends git mercurial openssh-client subversion procps && rm -rf /var/lib/apt/lists/*
# Tue, 12 Oct 2021 15:46:05 GMT
RUN set -ex; apt-get update; apt-get install -y --no-install-recommends autoconf automake bzip2 dpkg-dev file g++ gcc imagemagick libbz2-dev libc6-dev libcurl4-openssl-dev libdb-dev libevent-dev libffi-dev libgdbm-dev libglib2.0-dev libgmp-dev libjpeg-dev libkrb5-dev liblzma-dev libmagickcore-dev libmagickwand-dev libmaxminddb-dev libncurses5-dev libncursesw5-dev libpng-dev libpq-dev libreadline-dev libsqlite3-dev libssl-dev libtool libwebp-dev libxml2-dev libxslt-dev libyaml-dev make patch unzip xz-utils zlib1g-dev $( if apt-cache show 'default-libmysqlclient-dev' 2>/dev/null | grep -q '^Version:'; then echo 'default-libmysqlclient-dev'; else echo 'libmysqlclient-dev'; fi ) ; rm -rf /var/lib/apt/lists/*
# Tue, 19 Oct 2021 19:19:49 GMT
ENV OTP_VERSION=23.3.4.8 REBAR3_VERSION=3.17.0
# Tue, 19 Oct 2021 19:19:49 GMT
LABEL org.opencontainers.image.version=23.3.4.8
# Tue, 19 Oct 2021 19:31:13 GMT
RUN set -xe && OTP_DOWNLOAD_URL="https://github.com/erlang/otp/archive/OTP-${OTP_VERSION}.tar.gz" && OTP_DOWNLOAD_SHA256="e31cbfe0f83cc3e06df997551393fca0e7fa36da371cd2b64c5e196dd843469e" && runtimeDeps='libodbc1 libsctp1 libwxgtk3.0' && buildDeps='unixodbc-dev libsctp-dev libwxgtk3.0-dev' && apt-get update && apt-get install -y --no-install-recommends $runtimeDeps && apt-get install -y --no-install-recommends $buildDeps && curl -fSL -o otp-src.tar.gz "$OTP_DOWNLOAD_URL" && echo "$OTP_DOWNLOAD_SHA256 otp-src.tar.gz" | sha256sum -c - && export ERL_TOP="/usr/src/otp_src_${OTP_VERSION%%@*}" && mkdir -vp $ERL_TOP && tar -xzf otp-src.tar.gz -C $ERL_TOP --strip-components=1 && rm otp-src.tar.gz && ( cd $ERL_TOP && ./otp_build autoconf && gnuArch="$(dpkg-architecture --query DEB_HOST_GNU_TYPE)" && ./configure --build="$gnuArch" && make -j$(nproc) && make -j$(nproc) docs DOC_TARGETS=chunks && make install install-docs DOC_TARGETS=chunks ) && find /usr/local -name examples | xargs rm -rf && apt-get purge -y --auto-remove $buildDeps && rm -rf $ERL_TOP /var/lib/apt/lists/*
# Tue, 19 Oct 2021 19:31:14 GMT
CMD ["erl"]
# Tue, 19 Oct 2021 19:31:14 GMT
ENV REBAR_VERSION=2.6.4
# Tue, 19 Oct 2021 19:31:19 GMT
RUN set -xe && REBAR_DOWNLOAD_URL="https://github.com/rebar/rebar/archive/${REBAR_VERSION}.tar.gz" && REBAR_DOWNLOAD_SHA256="577246bafa2eb2b2c3f1d0c157408650446884555bf87901508ce71d5cc0bd07" && mkdir -p /usr/src/rebar-src && curl -fSL -o rebar-src.tar.gz "$REBAR_DOWNLOAD_URL" && echo "$REBAR_DOWNLOAD_SHA256 rebar-src.tar.gz" | sha256sum -c - && tar -xzf rebar-src.tar.gz -C /usr/src/rebar-src --strip-components=1 && rm rebar-src.tar.gz && cd /usr/src/rebar-src && ./bootstrap && install -v ./rebar /usr/local/bin/ && rm -rf /usr/src/rebar-src
# Tue, 19 Oct 2021 19:31:45 GMT
RUN set -xe && REBAR3_DOWNLOAD_URL="https://github.com/erlang/rebar3/archive/${REBAR3_VERSION}.tar.gz" && REBAR3_DOWNLOAD_SHA256="4c7f33a342bcab498f9bf53cc0ee5b698d9598b8fa9ef6a14bcdf44d21945c27" && mkdir -p /usr/src/rebar3-src && curl -fSL -o rebar3-src.tar.gz "$REBAR3_DOWNLOAD_URL" && echo "$REBAR3_DOWNLOAD_SHA256 rebar3-src.tar.gz" | sha256sum -c - && tar -xzf rebar3-src.tar.gz -C /usr/src/rebar3-src --strip-components=1 && rm rebar3-src.tar.gz && cd /usr/src/rebar3-src && HOME=$PWD ./bootstrap && install -v ./rebar3 /usr/local/bin/ && rm -rf /usr/src/rebar3-src
# Tue, 19 Oct 2021 21:25:24 GMT
ENV ELIXIR_VERSION=v1.11.4 LANG=C.UTF-8
# Tue, 19 Oct 2021 21:28:05 GMT
RUN set -xe && ELIXIR_DOWNLOAD_URL="https://github.com/elixir-lang/elixir/archive/${ELIXIR_VERSION}.tar.gz" && ELIXIR_DOWNLOAD_SHA256="85c7118a0db6007507313db5bddf370216d9394ed7911fe80f21e2fbf7f54d29" && curl -fSL -o elixir-src.tar.gz $ELIXIR_DOWNLOAD_URL && echo "$ELIXIR_DOWNLOAD_SHA256 elixir-src.tar.gz" | sha256sum -c - && mkdir -p /usr/local/src/elixir && tar -xzC /usr/local/src/elixir --strip-components=1 -f elixir-src.tar.gz && rm elixir-src.tar.gz && cd /usr/local/src/elixir && make install clean
# Tue, 19 Oct 2021 21:28:05 GMT
CMD ["iex"]
```
- Layers:
- `sha256:07471e81507f7cf1100827f10c60c3c0422d1222430e34e527d97ec72b14a193`
Last Modified: Tue, 12 Oct 2021 01:26:26 GMT
Size: 50.4 MB (50436692 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:c6cef1aa2170c001b320769bf8b018ed82d2c94a673e3010ea1ffe152e107419`
Last Modified: Tue, 12 Oct 2021 15:54:16 GMT
Size: 7.8 MB (7833862 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:13a51f13be8e69cfc526b671d0bbf621b985b0932acd1523050e2995777b5926`
Last Modified: Tue, 12 Oct 2021 15:54:17 GMT
Size: 10.0 MB (9997204 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:def39d67a1a77adaac93be02cc61a57145a5a6273cd061d97660f30ef1e09bc1`
Last Modified: Tue, 12 Oct 2021 15:54:37 GMT
Size: 51.8 MB (51840680 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:a8367252e08e761371f9573b3782f46abf9fc70ae38395ae9f3d3c232ced60d3`
Last Modified: Tue, 12 Oct 2021 15:55:14 GMT
Size: 192.4 MB (192425750 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:1d95411fa1e890fcd73e1bb3e03a1b3dc958d71f005039557f09073039b08be8`
Last Modified: Tue, 19 Oct 2021 21:06:33 GMT
Size: 162.6 MB (162593070 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:4d2c4ac6cc9070127abe7889bc6710fdeb889988271ae5067f0dfc2ebe628129`
Last Modified: Tue, 19 Oct 2021 21:06:04 GMT
Size: 196.3 KB (196267 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:31c465b55fa7a1306d00ab3eb03a8c981a892c9f1a75a7f3f5bccf02b56f34e9`
Last Modified: Tue, 19 Oct 2021 21:06:04 GMT
Size: 919.0 KB (918967 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:95da2e4f1603bd9964d171e76cc07f143fc43de5f0f9679a546dd5a85b58fd55`
Last Modified: Tue, 19 Oct 2021 21:49:15 GMT
Size: 7.0 MB (6997198 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
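As a quick sanity check, the "Total Size" reported for this platform is simply the sum of the compressed layer sizes listed above. The snippet below (illustrative only) adds them up:

```python
# Compressed layer sizes (bytes) for elixir:1.11 on linux/amd64, as listed above.
layer_sizes = [
    50436692, 7833862, 9997204, 51840680, 192425750,
    162593070, 196267, 918967, 6997198,
]

total = sum(layer_sizes)
print(total)                  # 483239690 bytes
print(round(total / 1e6, 1))  # 483.2 (MB), matching "Total Size" above
```

The same check works for every platform section in this file, since repo-info always reports the sum of compressed transfer sizes rather than on-disk sizes.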
### `elixir:1.11` - linux; arm variant v7
```console
$ docker pull elixir@sha256:0686c8c65d1845cc92fc288c40c71027818cafb9d4d8f46198a6fa32fb856bc1
```
- Docker Version: 20.10.7
- Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json`
- Total Size: **438.9 MB (438858319 bytes)**
(compressed transfer size, not on-disk size)
- Image ID: `sha256:e24b82aeda652e080c2ec9ec49d29ee15583cefd659f8d701600811479afd811`
- Default Command: `["iex"]`
```dockerfile
# Tue, 12 Oct 2021 01:29:06 GMT
ADD file:b1857dc4fe4fbc4af220829cc6af7169c4555d3f63a3a304db6fa0146b1d5539 in /
# Tue, 12 Oct 2021 01:29:08 GMT
CMD ["bash"]
# Tue, 12 Oct 2021 18:39:37 GMT
RUN set -eux; apt-get update; apt-get install -y --no-install-recommends ca-certificates curl netbase wget ; rm -rf /var/lib/apt/lists/*
# Tue, 12 Oct 2021 18:39:53 GMT
RUN set -ex; if ! command -v gpg > /dev/null; then apt-get update; apt-get install -y --no-install-recommends gnupg dirmngr ; rm -rf /var/lib/apt/lists/*; fi
# Tue, 12 Oct 2021 18:40:43 GMT
RUN apt-get update && apt-get install -y --no-install-recommends git mercurial openssh-client subversion procps && rm -rf /var/lib/apt/lists/*
# Tue, 12 Oct 2021 18:42:41 GMT
RUN set -ex; apt-get update; apt-get install -y --no-install-recommends autoconf automake bzip2 dpkg-dev file g++ gcc imagemagick libbz2-dev libc6-dev libcurl4-openssl-dev libdb-dev libevent-dev libffi-dev libgdbm-dev libglib2.0-dev libgmp-dev libjpeg-dev libkrb5-dev liblzma-dev libmagickcore-dev libmagickwand-dev libmaxminddb-dev libncurses5-dev libncursesw5-dev libpng-dev libpq-dev libreadline-dev libsqlite3-dev libssl-dev libtool libwebp-dev libxml2-dev libxslt-dev libyaml-dev make patch unzip xz-utils zlib1g-dev $( if apt-cache show 'default-libmysqlclient-dev' 2>/dev/null | grep -q '^Version:'; then echo 'default-libmysqlclient-dev'; else echo 'libmysqlclient-dev'; fi ) ; rm -rf /var/lib/apt/lists/*
# Tue, 19 Oct 2021 19:58:54 GMT
ENV OTP_VERSION=23.3.4.8 REBAR3_VERSION=3.17.0
# Tue, 19 Oct 2021 19:58:54 GMT
LABEL org.opencontainers.image.version=23.3.4.8
# Tue, 19 Oct 2021 20:12:58 GMT
RUN set -xe && OTP_DOWNLOAD_URL="https://github.com/erlang/otp/archive/OTP-${OTP_VERSION}.tar.gz" && OTP_DOWNLOAD_SHA256="e31cbfe0f83cc3e06df997551393fca0e7fa36da371cd2b64c5e196dd843469e" && runtimeDeps='libodbc1 libsctp1 libwxgtk3.0' && buildDeps='unixodbc-dev libsctp-dev libwxgtk3.0-dev' && apt-get update && apt-get install -y --no-install-recommends $runtimeDeps && apt-get install -y --no-install-recommends $buildDeps && curl -fSL -o otp-src.tar.gz "$OTP_DOWNLOAD_URL" && echo "$OTP_DOWNLOAD_SHA256 otp-src.tar.gz" | sha256sum -c - && export ERL_TOP="/usr/src/otp_src_${OTP_VERSION%%@*}" && mkdir -vp $ERL_TOP && tar -xzf otp-src.tar.gz -C $ERL_TOP --strip-components=1 && rm otp-src.tar.gz && ( cd $ERL_TOP && ./otp_build autoconf && gnuArch="$(dpkg-architecture --query DEB_HOST_GNU_TYPE)" && ./configure --build="$gnuArch" && make -j$(nproc) && make -j$(nproc) docs DOC_TARGETS=chunks && make install install-docs DOC_TARGETS=chunks ) && find /usr/local -name examples | xargs rm -rf && apt-get purge -y --auto-remove $buildDeps && rm -rf $ERL_TOP /var/lib/apt/lists/*
# Tue, 19 Oct 2021 20:13:00 GMT
CMD ["erl"]
# Tue, 19 Oct 2021 20:13:00 GMT
ENV REBAR_VERSION=2.6.4
# Tue, 19 Oct 2021 20:13:11 GMT
RUN set -xe && REBAR_DOWNLOAD_URL="https://github.com/rebar/rebar/archive/${REBAR_VERSION}.tar.gz" && REBAR_DOWNLOAD_SHA256="577246bafa2eb2b2c3f1d0c157408650446884555bf87901508ce71d5cc0bd07" && mkdir -p /usr/src/rebar-src && curl -fSL -o rebar-src.tar.gz "$REBAR_DOWNLOAD_URL" && echo "$REBAR_DOWNLOAD_SHA256 rebar-src.tar.gz" | sha256sum -c - && tar -xzf rebar-src.tar.gz -C /usr/src/rebar-src --strip-components=1 && rm rebar-src.tar.gz && cd /usr/src/rebar-src && ./bootstrap && install -v ./rebar /usr/local/bin/ && rm -rf /usr/src/rebar-src
# Tue, 19 Oct 2021 20:14:06 GMT
RUN set -xe && REBAR3_DOWNLOAD_URL="https://github.com/erlang/rebar3/archive/${REBAR3_VERSION}.tar.gz" && REBAR3_DOWNLOAD_SHA256="4c7f33a342bcab498f9bf53cc0ee5b698d9598b8fa9ef6a14bcdf44d21945c27" && mkdir -p /usr/src/rebar3-src && curl -fSL -o rebar3-src.tar.gz "$REBAR3_DOWNLOAD_URL" && echo "$REBAR3_DOWNLOAD_SHA256 rebar3-src.tar.gz" | sha256sum -c - && tar -xzf rebar3-src.tar.gz -C /usr/src/rebar3-src --strip-components=1 && rm rebar3-src.tar.gz && cd /usr/src/rebar3-src && HOME=$PWD ./bootstrap && install -v ./rebar3 /usr/local/bin/ && rm -rf /usr/src/rebar3-src
# Tue, 19 Oct 2021 22:21:37 GMT
ENV ELIXIR_VERSION=v1.11.4 LANG=C.UTF-8
# Tue, 19 Oct 2021 22:25:03 GMT
RUN set -xe && ELIXIR_DOWNLOAD_URL="https://github.com/elixir-lang/elixir/archive/${ELIXIR_VERSION}.tar.gz" && ELIXIR_DOWNLOAD_SHA256="85c7118a0db6007507313db5bddf370216d9394ed7911fe80f21e2fbf7f54d29" && curl -fSL -o elixir-src.tar.gz $ELIXIR_DOWNLOAD_URL && echo "$ELIXIR_DOWNLOAD_SHA256 elixir-src.tar.gz" | sha256sum -c - && mkdir -p /usr/local/src/elixir && tar -xzC /usr/local/src/elixir --strip-components=1 -f elixir-src.tar.gz && rm elixir-src.tar.gz && cd /usr/local/src/elixir && make install clean
# Tue, 19 Oct 2021 22:25:03 GMT
CMD ["iex"]
```
- Layers:
- `sha256:139a906e9407e96c50669130c30fe8fba2681b14aa838a3e52f8753b566ef1c8`
Last Modified: Tue, 12 Oct 2021 01:45:13 GMT
Size: 45.9 MB (45917899 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:3907051e2154d5bd4ad15141b8938bc09674039787512494a99b93fd4ebc088d`
Last Modified: Tue, 12 Oct 2021 19:00:49 GMT
Size: 7.1 MB (7125242 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:8651d4793d5f4d465002213d2d462c964aaf8665a87d5f9cc0de7384f10eeb10`
Last Modified: Tue, 12 Oct 2021 19:00:49 GMT
Size: 9.3 MB (9343819 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:b239d69f9df3bcbccb5272f18c9864a0646c74a277438c7cc9091914887d366f`
Last Modified: Tue, 12 Oct 2021 19:01:24 GMT
Size: 47.4 MB (47357395 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:927dd764beb11df813f3d3bf7e3a65f6751e588dca0cccf4066f6f9ce6f394bf`
Last Modified: Tue, 12 Oct 2021 19:02:29 GMT
Size: 168.6 MB (168608334 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:38ac3094b8dfa88f41283c3af4054a0c2662374ae1f125f017ecec7c4fc23782`
Last Modified: Tue, 19 Oct 2021 21:10:44 GMT
Size: 152.4 MB (152393137 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:971372fb3ed3d3bf9e3c6bb174887098acad677e0c357c691467eebcfca710e3`
Last Modified: Tue, 19 Oct 2021 21:09:15 GMT
Size: 196.2 KB (196234 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:4f9c5e5b06d999b8c4f7409da8780409f8a01583dd4e6f96da47329ce9d9b887`
Last Modified: Tue, 19 Oct 2021 21:09:15 GMT
Size: 919.0 KB (918973 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:53fdd44ac75235787f11a38c657b0d50f8b4346523042796b6988544628a7eca`
Last Modified: Tue, 19 Oct 2021 23:01:40 GMT
Size: 7.0 MB (6997286 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
### `elixir:1.11` - linux; arm64 variant v8
```console
$ docker pull elixir@sha256:2e290e052a8a8797792f5fcc5c4a8e819adcb3424a2a715872ed2d31c2844302
```
- Docker Version: 20.10.7
- Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json`
- Total Size: **466.2 MB (466153694 bytes)**
(compressed transfer size, not on-disk size)
- Image ID: `sha256:c8ef293b061e026440f4834e68ec08b3ff5daa81ecdcdf9e88a23867abf1ec7f`
- Default Command: `["iex"]`
```dockerfile
# Tue, 12 Oct 2021 01:41:28 GMT
ADD file:aed1709ccba6a81b9726b228fad7b81bcf4c16bafe723981ad37076322d78986 in /
# Tue, 12 Oct 2021 01:41:29 GMT
CMD ["bash"]
# Sat, 16 Oct 2021 02:59:51 GMT
RUN set -eux; apt-get update; apt-get install -y --no-install-recommends ca-certificates curl netbase wget ; rm -rf /var/lib/apt/lists/*
# Sat, 16 Oct 2021 02:59:56 GMT
RUN set -ex; if ! command -v gpg > /dev/null; then apt-get update; apt-get install -y --no-install-recommends gnupg dirmngr ; rm -rf /var/lib/apt/lists/*; fi
# Sat, 16 Oct 2021 03:00:12 GMT
RUN apt-get update && apt-get install -y --no-install-recommends git mercurial openssh-client subversion procps && rm -rf /var/lib/apt/lists/*
# Sat, 16 Oct 2021 03:00:55 GMT
RUN set -ex; apt-get update; apt-get install -y --no-install-recommends autoconf automake bzip2 dpkg-dev file g++ gcc imagemagick libbz2-dev libc6-dev libcurl4-openssl-dev libdb-dev libevent-dev libffi-dev libgdbm-dev libglib2.0-dev libgmp-dev libjpeg-dev libkrb5-dev liblzma-dev libmagickcore-dev libmagickwand-dev libmaxminddb-dev libncurses5-dev libncursesw5-dev libpng-dev libpq-dev libreadline-dev libsqlite3-dev libssl-dev libtool libwebp-dev libxml2-dev libxslt-dev libyaml-dev make patch unzip xz-utils zlib1g-dev $( if apt-cache show 'default-libmysqlclient-dev' 2>/dev/null | grep -q '^Version:'; then echo 'default-libmysqlclient-dev'; else echo 'libmysqlclient-dev'; fi ) ; rm -rf /var/lib/apt/lists/*
# Tue, 19 Oct 2021 19:43:41 GMT
ENV OTP_VERSION=23.3.4.8 REBAR3_VERSION=3.17.0
# Tue, 19 Oct 2021 19:43:42 GMT
LABEL org.opencontainers.image.version=23.3.4.8
# Tue, 19 Oct 2021 19:49:25 GMT
RUN set -xe && OTP_DOWNLOAD_URL="https://github.com/erlang/otp/archive/OTP-${OTP_VERSION}.tar.gz" && OTP_DOWNLOAD_SHA256="e31cbfe0f83cc3e06df997551393fca0e7fa36da371cd2b64c5e196dd843469e" && runtimeDeps='libodbc1 libsctp1 libwxgtk3.0' && buildDeps='unixodbc-dev libsctp-dev libwxgtk3.0-dev' && apt-get update && apt-get install -y --no-install-recommends $runtimeDeps && apt-get install -y --no-install-recommends $buildDeps && curl -fSL -o otp-src.tar.gz "$OTP_DOWNLOAD_URL" && echo "$OTP_DOWNLOAD_SHA256 otp-src.tar.gz" | sha256sum -c - && export ERL_TOP="/usr/src/otp_src_${OTP_VERSION%%@*}" && mkdir -vp $ERL_TOP && tar -xzf otp-src.tar.gz -C $ERL_TOP --strip-components=1 && rm otp-src.tar.gz && ( cd $ERL_TOP && ./otp_build autoconf && gnuArch="$(dpkg-architecture --query DEB_HOST_GNU_TYPE)" && ./configure --build="$gnuArch" && make -j$(nproc) && make -j$(nproc) docs DOC_TARGETS=chunks && make install install-docs DOC_TARGETS=chunks ) && find /usr/local -name examples | xargs rm -rf && apt-get purge -y --auto-remove $buildDeps && rm -rf $ERL_TOP /var/lib/apt/lists/*
# Tue, 19 Oct 2021 19:49:27 GMT
CMD ["erl"]
# Tue, 19 Oct 2021 19:49:28 GMT
ENV REBAR_VERSION=2.6.4
# Tue, 19 Oct 2021 19:49:38 GMT
RUN set -xe && REBAR_DOWNLOAD_URL="https://github.com/rebar/rebar/archive/${REBAR_VERSION}.tar.gz" && REBAR_DOWNLOAD_SHA256="577246bafa2eb2b2c3f1d0c157408650446884555bf87901508ce71d5cc0bd07" && mkdir -p /usr/src/rebar-src && curl -fSL -o rebar-src.tar.gz "$REBAR_DOWNLOAD_URL" && echo "$REBAR_DOWNLOAD_SHA256 rebar-src.tar.gz" | sha256sum -c - && tar -xzf rebar-src.tar.gz -C /usr/src/rebar-src --strip-components=1 && rm rebar-src.tar.gz && cd /usr/src/rebar-src && ./bootstrap && install -v ./rebar /usr/local/bin/ && rm -rf /usr/src/rebar-src
# Tue, 19 Oct 2021 19:49:58 GMT
RUN set -xe && REBAR3_DOWNLOAD_URL="https://github.com/erlang/rebar3/archive/${REBAR3_VERSION}.tar.gz" && REBAR3_DOWNLOAD_SHA256="4c7f33a342bcab498f9bf53cc0ee5b698d9598b8fa9ef6a14bcdf44d21945c27" && mkdir -p /usr/src/rebar3-src && curl -fSL -o rebar3-src.tar.gz "$REBAR3_DOWNLOAD_URL" && echo "$REBAR3_DOWNLOAD_SHA256 rebar3-src.tar.gz" | sha256sum -c - && tar -xzf rebar3-src.tar.gz -C /usr/src/rebar3-src --strip-components=1 && rm rebar3-src.tar.gz && cd /usr/src/rebar3-src && HOME=$PWD ./bootstrap && install -v ./rebar3 /usr/local/bin/ && rm -rf /usr/src/rebar3-src
# Tue, 19 Oct 2021 21:15:09 GMT
ENV ELIXIR_VERSION=v1.11.4 LANG=C.UTF-8
# Tue, 19 Oct 2021 21:16:20 GMT
RUN set -xe && ELIXIR_DOWNLOAD_URL="https://github.com/elixir-lang/elixir/archive/${ELIXIR_VERSION}.tar.gz" && ELIXIR_DOWNLOAD_SHA256="85c7118a0db6007507313db5bddf370216d9394ed7911fe80f21e2fbf7f54d29" && curl -fSL -o elixir-src.tar.gz $ELIXIR_DOWNLOAD_URL && echo "$ELIXIR_DOWNLOAD_SHA256 elixir-src.tar.gz" | sha256sum -c - && mkdir -p /usr/local/src/elixir && tar -xzC /usr/local/src/elixir --strip-components=1 -f elixir-src.tar.gz && rm elixir-src.tar.gz && cd /usr/local/src/elixir && make install clean
# Tue, 19 Oct 2021 21:16:22 GMT
CMD ["iex"]
```
- Layers:
- `sha256:2ff6d7a9e7d73e4a01b9417518d18c001728c45fa8109ed8f55aaa50e7981482`
Last Modified: Tue, 12 Oct 2021 01:48:38 GMT
Size: 49.2 MB (49222756 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:b7324ea4098419bc5fa2ac5a138522230bf12cef3996d1740dd00f9d4737d004`
Last Modified: Sat, 16 Oct 2021 03:15:37 GMT
Size: 7.7 MB (7695063 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:4e213c33a07316d84d829be685bd3b02e1e2bc135f7748c932050e6ed6a3a0d3`
Last Modified: Sat, 16 Oct 2021 03:15:37 GMT
Size: 9.8 MB (9767289 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:e7c82db586c3ef7a5c4716aeca2d6e779ec11c568c84c8ef7e6df7bd72512c80`
Last Modified: Sat, 16 Oct 2021 03:15:56 GMT
Size: 52.2 MB (52167277 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:859bcdbf920958cc1dcb903194056a8e4b6561668cfae85c6a0fe7a5c5caac14`
Last Modified: Sat, 16 Oct 2021 03:16:31 GMT
Size: 184.0 MB (183992615 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:6fb890240ae94bfabc3c0a81e233ee2b465451d6751fb71df89f160c68eb29a6`
Last Modified: Tue, 19 Oct 2021 20:15:48 GMT
Size: 155.2 MB (155195539 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:88d2c665f928070e1b6ca1bd1095f4944e3756647bd6e2cf0d7c816bcb6adafd`
Last Modified: Tue, 19 Oct 2021 20:15:28 GMT
Size: 196.2 KB (196166 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:8557ff63f038efd939964059355c8d406a36f1d6355c4e6ad1afdf049be5df56`
Last Modified: Tue, 19 Oct 2021 20:15:28 GMT
Size: 918.9 KB (918870 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:53793f4071102f4f6aa5103333b9fca0a3e4d79984d4652aeb09c84b1218ecc7`
Last Modified: Tue, 19 Oct 2021 21:29:57 GMT
Size: 7.0 MB (6998119 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
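As a quick sanity check, the per-layer compressed sizes listed above can be summed and compared against the manifest's stated total transfer size. The sketch below hardcodes the `linux/arm64 variant v8` figures from this page; the byte counts are copied verbatim from the layer list, while the step annotations are an informal reading of the Dockerfile history, not part of the manifest itself.

```python
# Per-layer compressed sizes (bytes) for elixir:1.11 on linux/arm64 variant v8,
# copied verbatim from the layer list above. The step annotations are an
# informal mapping to the Dockerfile history, not data from the manifest.
layer_sizes = [
    49222756,   # base image (ADD file:...)
    7695063,    # ca-certificates, curl, netbase, wget
    9767289,    # gnupg, dirmngr
    52167277,   # git, mercurial, openssh-client, subversion, procps
    183992615,  # compiler toolchain and *-dev packages
    155195539,  # Erlang/OTP 23.3.4.8 build
    196166,     # rebar 2.6.4
    918870,     # rebar3 3.17.0
    6998119,    # Elixir v1.11.4
]

total_reported = 466153694  # "Total Size" from the manifest summary above

# The layer sizes should account for the entire compressed transfer size.
assert sum(layer_sizes) == total_reported
print(f"{sum(layer_sizes)} bytes across {len(layer_sizes)} layers")
# → 466153694 bytes across 9 layers
```

The same check passes for the other architectures on this page; only the arm64 numbers are reproduced here to avoid repetition.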
### `elixir:1.11` - linux; 386
```console
$ docker pull elixir@sha256:7b76cc21536377ce63cbe4983cc7f903f17933b57cd08521d0b3953a7e0e974e
```
- Docker Version: 20.10.7
- Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json`
- Total Size: **493.6 MB (493570703 bytes)**
(compressed transfer size, not on-disk size)
- Image ID: `sha256:fd95985cd725c3cf437b01f2d22dc6e02bc1fb06399c7754a8fbbd8b5f5ee0e8`
- Default Command: `["iex"]`
```dockerfile
# Tue, 12 Oct 2021 01:40:01 GMT
ADD file:1461fa0362c70b5b7a677c57dd48633827fd9635a9cb136730aa7581cc523b46 in /
# Tue, 12 Oct 2021 01:40:02 GMT
CMD ["bash"]
# Tue, 12 Oct 2021 04:37:56 GMT
RUN set -eux; apt-get update; apt-get install -y --no-install-recommends ca-certificates curl netbase wget ; rm -rf /var/lib/apt/lists/*
# Tue, 12 Oct 2021 04:38:03 GMT
RUN set -ex; if ! command -v gpg > /dev/null; then apt-get update; apt-get install -y --no-install-recommends gnupg dirmngr ; rm -rf /var/lib/apt/lists/*; fi
# Tue, 12 Oct 2021 04:38:28 GMT
RUN apt-get update && apt-get install -y --no-install-recommends git mercurial openssh-client subversion procps && rm -rf /var/lib/apt/lists/*
# Tue, 12 Oct 2021 04:39:33 GMT
RUN set -ex; apt-get update; apt-get install -y --no-install-recommends autoconf automake bzip2 dpkg-dev file g++ gcc imagemagick libbz2-dev libc6-dev libcurl4-openssl-dev libdb-dev libevent-dev libffi-dev libgdbm-dev libglib2.0-dev libgmp-dev libjpeg-dev libkrb5-dev liblzma-dev libmagickcore-dev libmagickwand-dev libmaxminddb-dev libncurses5-dev libncursesw5-dev libpng-dev libpq-dev libreadline-dev libsqlite3-dev libssl-dev libtool libwebp-dev libxml2-dev libxslt-dev libyaml-dev make patch unzip xz-utils zlib1g-dev $( if apt-cache show 'default-libmysqlclient-dev' 2>/dev/null | grep -q '^Version:'; then echo 'default-libmysqlclient-dev'; else echo 'libmysqlclient-dev'; fi ) ; rm -rf /var/lib/apt/lists/*
# Tue, 19 Oct 2021 19:39:17 GMT
ENV OTP_VERSION=23.3.4.8 REBAR3_VERSION=3.17.0
# Tue, 19 Oct 2021 19:39:17 GMT
LABEL org.opencontainers.image.version=23.3.4.8
# Tue, 19 Oct 2021 20:01:08 GMT
RUN set -xe && OTP_DOWNLOAD_URL="https://github.com/erlang/otp/archive/OTP-${OTP_VERSION}.tar.gz" && OTP_DOWNLOAD_SHA256="e31cbfe0f83cc3e06df997551393fca0e7fa36da371cd2b64c5e196dd843469e" && runtimeDeps='libodbc1 libsctp1 libwxgtk3.0' && buildDeps='unixodbc-dev libsctp-dev libwxgtk3.0-dev' && apt-get update && apt-get install -y --no-install-recommends $runtimeDeps && apt-get install -y --no-install-recommends $buildDeps && curl -fSL -o otp-src.tar.gz "$OTP_DOWNLOAD_URL" && echo "$OTP_DOWNLOAD_SHA256 otp-src.tar.gz" | sha256sum -c - && export ERL_TOP="/usr/src/otp_src_${OTP_VERSION%%@*}" && mkdir -vp $ERL_TOP && tar -xzf otp-src.tar.gz -C $ERL_TOP --strip-components=1 && rm otp-src.tar.gz && ( cd $ERL_TOP && ./otp_build autoconf && gnuArch="$(dpkg-architecture --query DEB_HOST_GNU_TYPE)" && ./configure --build="$gnuArch" && make -j$(nproc) && make -j$(nproc) docs DOC_TARGETS=chunks && make install install-docs DOC_TARGETS=chunks ) && find /usr/local -name examples | xargs rm -rf && apt-get purge -y --auto-remove $buildDeps && rm -rf $ERL_TOP /var/lib/apt/lists/*
# Tue, 19 Oct 2021 20:01:09 GMT
CMD ["erl"]
# Tue, 19 Oct 2021 20:01:09 GMT
ENV REBAR_VERSION=2.6.4
# Tue, 19 Oct 2021 20:01:18 GMT
RUN set -xe && REBAR_DOWNLOAD_URL="https://github.com/rebar/rebar/archive/${REBAR_VERSION}.tar.gz" && REBAR_DOWNLOAD_SHA256="577246bafa2eb2b2c3f1d0c157408650446884555bf87901508ce71d5cc0bd07" && mkdir -p /usr/src/rebar-src && curl -fSL -o rebar-src.tar.gz "$REBAR_DOWNLOAD_URL" && echo "$REBAR_DOWNLOAD_SHA256 rebar-src.tar.gz" | sha256sum -c - && tar -xzf rebar-src.tar.gz -C /usr/src/rebar-src --strip-components=1 && rm rebar-src.tar.gz && cd /usr/src/rebar-src && ./bootstrap && install -v ./rebar /usr/local/bin/ && rm -rf /usr/src/rebar-src
# Tue, 19 Oct 2021 20:02:12 GMT
RUN set -xe && REBAR3_DOWNLOAD_URL="https://github.com/erlang/rebar3/archive/${REBAR3_VERSION}.tar.gz" && REBAR3_DOWNLOAD_SHA256="4c7f33a342bcab498f9bf53cc0ee5b698d9598b8fa9ef6a14bcdf44d21945c27" && mkdir -p /usr/src/rebar3-src && curl -fSL -o rebar3-src.tar.gz "$REBAR3_DOWNLOAD_URL" && echo "$REBAR3_DOWNLOAD_SHA256 rebar3-src.tar.gz" | sha256sum -c - && tar -xzf rebar3-src.tar.gz -C /usr/src/rebar3-src --strip-components=1 && rm rebar3-src.tar.gz && cd /usr/src/rebar3-src && HOME=$PWD ./bootstrap && install -v ./rebar3 /usr/local/bin/ && rm -rf /usr/src/rebar3-src
# Tue, 19 Oct 2021 22:04:27 GMT
ENV ELIXIR_VERSION=v1.11.4 LANG=C.UTF-8
# Tue, 19 Oct 2021 22:05:57 GMT
RUN set -xe && ELIXIR_DOWNLOAD_URL="https://github.com/elixir-lang/elixir/archive/${ELIXIR_VERSION}.tar.gz" && ELIXIR_DOWNLOAD_SHA256="85c7118a0db6007507313db5bddf370216d9394ed7911fe80f21e2fbf7f54d29" && curl -fSL -o elixir-src.tar.gz $ELIXIR_DOWNLOAD_URL && echo "$ELIXIR_DOWNLOAD_SHA256 elixir-src.tar.gz" | sha256sum -c - && mkdir -p /usr/local/src/elixir && tar -xzC /usr/local/src/elixir --strip-components=1 -f elixir-src.tar.gz && rm elixir-src.tar.gz && cd /usr/local/src/elixir && make install clean
# Tue, 19 Oct 2021 22:05:57 GMT
CMD ["iex"]
```
- Layers:
- `sha256:f4b233498baa64e956a5f70979351e206bd085bd547a3cf25c08b154348df726`
Last Modified: Tue, 12 Oct 2021 01:48:07 GMT
Size: 51.2 MB (51207606 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:ece122ff48522de249e81ee22f617bf84d59b003d3c79f44331163046a937e4c`
Last Modified: Tue, 12 Oct 2021 04:49:37 GMT
Size: 8.0 MB (8000221 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:378307f52e8eeddb964324804377cd9fd65b1bf7b848c5b690e63ef92f1fe3d5`
Last Modified: Tue, 12 Oct 2021 04:49:37 GMT
Size: 10.3 MB (10339916 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:559cdd7287c6a9a0f142216d3645f886b4a777073daae85c51de968330bb9f9d`
Last Modified: Tue, 12 Oct 2021 04:50:08 GMT
Size: 53.4 MB (53437801 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:325db7f1d3dd664f85cad29c0867f78c550d2b5c426ed460bf5283c008942bb6`
Last Modified: Tue, 12 Oct 2021 04:50:59 GMT
Size: 199.0 MB (198959424 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:d5c3e2091776ea44a389e118738fbf36de6b424adebdaf42a4e9e9554d8f86de`
Last Modified: Tue, 19 Oct 2021 21:44:58 GMT
Size: 163.5 MB (163513295 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:2ef35489c2e891f616629951f5b5ee6252a0fce4431ced69558df4084040369e`
Last Modified: Tue, 19 Oct 2021 21:44:25 GMT
Size: 196.2 KB (196208 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:9f5dba62b732d1f3c33f630a1e04c9c19836ece9dbbb6f888ddfe2760654a98d`
Last Modified: Tue, 19 Oct 2021 21:44:25 GMT
Size: 919.0 KB (918970 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:64b2f968ceca20376f0dff0f92d2398eed59f4b2fe0710fc778d0cdb33bf0cef`
Last Modified: Tue, 19 Oct 2021 22:24:59 GMT
Size: 7.0 MB (6997262 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
### `elixir:1.11` - linux; ppc64le
```console
$ docker pull elixir@sha256:aaa7047bdc0fa96969b343e2dfb95c1f3a44c0fcdc9bbf9827ea9fbcf5db285d
```
- Docker Version: 20.10.7
- Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json`
- Total Size: **505.4 MB (505400284 bytes)**
(compressed transfer size, not on-disk size)
- Image ID: `sha256:2bdbbc86b638c38348090a5889b0a0e53b833796c974e28da761b2982324db85`
- Default Command: `["iex"]`
```dockerfile
# Tue, 12 Oct 2021 01:26:10 GMT
ADD file:94a7157f0c578810dcc73fd2dbdcb4ce021626d9858288c970e007a590c71d44 in /
# Tue, 12 Oct 2021 01:26:18 GMT
CMD ["bash"]
# Tue, 12 Oct 2021 04:06:42 GMT
RUN set -eux; apt-get update; apt-get install -y --no-install-recommends ca-certificates curl netbase wget ; rm -rf /var/lib/apt/lists/*
# Tue, 12 Oct 2021 04:07:15 GMT
RUN set -ex; if ! command -v gpg > /dev/null; then apt-get update; apt-get install -y --no-install-recommends gnupg dirmngr ; rm -rf /var/lib/apt/lists/*; fi
# Tue, 12 Oct 2021 04:10:36 GMT
RUN apt-get update && apt-get install -y --no-install-recommends git mercurial openssh-client subversion procps && rm -rf /var/lib/apt/lists/*
# Tue, 12 Oct 2021 04:22:10 GMT
RUN set -ex; apt-get update; apt-get install -y --no-install-recommends autoconf automake bzip2 dpkg-dev file g++ gcc imagemagick libbz2-dev libc6-dev libcurl4-openssl-dev libdb-dev libevent-dev libffi-dev libgdbm-dev libglib2.0-dev libgmp-dev libjpeg-dev libkrb5-dev liblzma-dev libmagickcore-dev libmagickwand-dev libmaxminddb-dev libncurses5-dev libncursesw5-dev libpng-dev libpq-dev libreadline-dev libsqlite3-dev libssl-dev libtool libwebp-dev libxml2-dev libxslt-dev libyaml-dev make patch unzip xz-utils zlib1g-dev $( if apt-cache show 'default-libmysqlclient-dev' 2>/dev/null | grep -q '^Version:'; then echo 'default-libmysqlclient-dev'; else echo 'libmysqlclient-dev'; fi ) ; rm -rf /var/lib/apt/lists/*
# Tue, 19 Oct 2021 19:23:20 GMT
ENV OTP_VERSION=23.3.4.8 REBAR3_VERSION=3.17.0
# Tue, 19 Oct 2021 19:23:26 GMT
LABEL org.opencontainers.image.version=23.3.4.8
# Tue, 19 Oct 2021 19:42:27 GMT
RUN set -xe && OTP_DOWNLOAD_URL="https://github.com/erlang/otp/archive/OTP-${OTP_VERSION}.tar.gz" && OTP_DOWNLOAD_SHA256="e31cbfe0f83cc3e06df997551393fca0e7fa36da371cd2b64c5e196dd843469e" && runtimeDeps='libodbc1 libsctp1 libwxgtk3.0' && buildDeps='unixodbc-dev libsctp-dev libwxgtk3.0-dev' && apt-get update && apt-get install -y --no-install-recommends $runtimeDeps && apt-get install -y --no-install-recommends $buildDeps && curl -fSL -o otp-src.tar.gz "$OTP_DOWNLOAD_URL" && echo "$OTP_DOWNLOAD_SHA256 otp-src.tar.gz" | sha256sum -c - && export ERL_TOP="/usr/src/otp_src_${OTP_VERSION%%@*}" && mkdir -vp $ERL_TOP && tar -xzf otp-src.tar.gz -C $ERL_TOP --strip-components=1 && rm otp-src.tar.gz && ( cd $ERL_TOP && ./otp_build autoconf && gnuArch="$(dpkg-architecture --query DEB_HOST_GNU_TYPE)" && ./configure --build="$gnuArch" && make -j$(nproc) && make -j$(nproc) docs DOC_TARGETS=chunks && make install install-docs DOC_TARGETS=chunks ) && find /usr/local -name examples | xargs rm -rf && apt-get purge -y --auto-remove $buildDeps && rm -rf $ERL_TOP /var/lib/apt/lists/*
# Tue, 19 Oct 2021 19:42:49 GMT
CMD ["erl"]
# Tue, 19 Oct 2021 19:42:54 GMT
ENV REBAR_VERSION=2.6.4
# Tue, 19 Oct 2021 19:43:32 GMT
RUN set -xe && REBAR_DOWNLOAD_URL="https://github.com/rebar/rebar/archive/${REBAR_VERSION}.tar.gz" && REBAR_DOWNLOAD_SHA256="577246bafa2eb2b2c3f1d0c157408650446884555bf87901508ce71d5cc0bd07" && mkdir -p /usr/src/rebar-src && curl -fSL -o rebar-src.tar.gz "$REBAR_DOWNLOAD_URL" && echo "$REBAR_DOWNLOAD_SHA256 rebar-src.tar.gz" | sha256sum -c - && tar -xzf rebar-src.tar.gz -C /usr/src/rebar-src --strip-components=1 && rm rebar-src.tar.gz && cd /usr/src/rebar-src && ./bootstrap && install -v ./rebar /usr/local/bin/ && rm -rf /usr/src/rebar-src
# Tue, 19 Oct 2021 19:44:28 GMT
RUN set -xe && REBAR3_DOWNLOAD_URL="https://github.com/erlang/rebar3/archive/${REBAR3_VERSION}.tar.gz" && REBAR3_DOWNLOAD_SHA256="4c7f33a342bcab498f9bf53cc0ee5b698d9598b8fa9ef6a14bcdf44d21945c27" && mkdir -p /usr/src/rebar3-src && curl -fSL -o rebar3-src.tar.gz "$REBAR3_DOWNLOAD_URL" && echo "$REBAR3_DOWNLOAD_SHA256 rebar3-src.tar.gz" | sha256sum -c - && tar -xzf rebar3-src.tar.gz -C /usr/src/rebar3-src --strip-components=1 && rm rebar3-src.tar.gz && cd /usr/src/rebar3-src && HOME=$PWD ./bootstrap && install -v ./rebar3 /usr/local/bin/ && rm -rf /usr/src/rebar3-src
# Tue, 19 Oct 2021 21:46:15 GMT
ENV ELIXIR_VERSION=v1.11.4 LANG=C.UTF-8
# Tue, 19 Oct 2021 21:49:00 GMT
RUN set -xe && ELIXIR_DOWNLOAD_URL="https://github.com/elixir-lang/elixir/archive/${ELIXIR_VERSION}.tar.gz" && ELIXIR_DOWNLOAD_SHA256="85c7118a0db6007507313db5bddf370216d9394ed7911fe80f21e2fbf7f54d29" && curl -fSL -o elixir-src.tar.gz $ELIXIR_DOWNLOAD_URL && echo "$ELIXIR_DOWNLOAD_SHA256 elixir-src.tar.gz" | sha256sum -c - && mkdir -p /usr/local/src/elixir && tar -xzC /usr/local/src/elixir --strip-components=1 -f elixir-src.tar.gz && rm elixir-src.tar.gz && cd /usr/local/src/elixir && make install clean
# Tue, 19 Oct 2021 21:49:09 GMT
CMD ["iex"]
```
- Layers:
- `sha256:77e7cc3fe486cc9a5ddc4cee43979cbebb5e7c4f36b82ccaa61dbda5dd37dac8`
Last Modified: Tue, 12 Oct 2021 01:37:52 GMT
Size: 54.2 MB (54183476 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:7ef410353e2a31335f42fc4620f0d13cd6062c9ee6aa1dd0b300f7a8cbadedc5`
Last Modified: Tue, 12 Oct 2021 04:42:58 GMT
Size: 8.3 MB (8272912 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:9f783e122aabe6c06785c6c466429027a12dd9c8c4ca516dcebccf1d0186d751`
Last Modified: Tue, 12 Oct 2021 04:42:59 GMT
Size: 10.7 MB (10727675 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:2a9159589fd196e5a0fd8f448cda535ca5aa215e7d116e4be5c030a543f75d7f`
Last Modified: Tue, 12 Oct 2021 04:43:23 GMT
Size: 57.5 MB (57456920 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:f1c9269032b2fdb2d7b6c4fe0147551ee29bc1903576a13737d3f5c8d4767832`
Last Modified: Tue, 12 Oct 2021 04:44:09 GMT
Size: 203.3 MB (203300620 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:90bb094ac1ccbc2b1bb3fc8f06d761580221269a3ea28742684a5e890ba81ce0`
Last Modified: Tue, 19 Oct 2021 20:52:12 GMT
Size: 163.3 MB (163346229 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:70902f16b6c640363f3166c64c5af53f6f9748054cdb6a54cca535aa0b9f275c`
Last Modified: Tue, 19 Oct 2021 20:49:55 GMT
Size: 196.2 KB (196229 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:26b53c498308b04814b2038d3f239bde613970940f841b5af0b629f754e3ba54`
Last Modified: Tue, 19 Oct 2021 20:49:56 GMT
Size: 919.0 KB (918974 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:df1d5f15d28a39eda34306f813e2498a74ec61e99029731bb5abbf2e365c9a22`
Last Modified: Tue, 19 Oct 2021 22:17:38 GMT
Size: 7.0 MB (6997249 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
### `elixir:1.11` - linux; s390x
```console
$ docker pull elixir@sha256:f7e03f860b53a1a373d93dfa6edf8ca334af9cbb521bfc400b428230d04ceabf
```
- Docker Version: 20.10.7
- Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json`
- Total Size: **459.2 MB (459152292 bytes)**
(compressed transfer size, not on-disk size)
- Image ID: `sha256:278b6ae1d3d86723d2dd6a25efe48280c17c132d83bf3ec5063f33caae453dbf`
- Default Command: `["iex"]`
```dockerfile
# Tue, 12 Oct 2021 00:42:39 GMT
ADD file:91e4bb81a5308737580259a9213b02933901431aa2ea23f3f4f59321a6ccc301 in /
# Tue, 12 Oct 2021 00:42:41 GMT
CMD ["bash"]
# Tue, 12 Oct 2021 07:41:34 GMT
RUN set -eux; apt-get update; apt-get install -y --no-install-recommends ca-certificates curl netbase wget ; rm -rf /var/lib/apt/lists/*
# Tue, 12 Oct 2021 07:41:38 GMT
RUN set -ex; if ! command -v gpg > /dev/null; then apt-get update; apt-get install -y --no-install-recommends gnupg dirmngr ; rm -rf /var/lib/apt/lists/*; fi
# Tue, 12 Oct 2021 07:41:58 GMT
RUN apt-get update && apt-get install -y --no-install-recommends git mercurial openssh-client subversion procps && rm -rf /var/lib/apt/lists/*
# Tue, 12 Oct 2021 07:43:00 GMT
RUN set -ex; apt-get update; apt-get install -y --no-install-recommends autoconf automake bzip2 dpkg-dev file g++ gcc imagemagick libbz2-dev libc6-dev libcurl4-openssl-dev libdb-dev libevent-dev libffi-dev libgdbm-dev libglib2.0-dev libgmp-dev libjpeg-dev libkrb5-dev liblzma-dev libmagickcore-dev libmagickwand-dev libmaxminddb-dev libncurses5-dev libncursesw5-dev libpng-dev libpq-dev libreadline-dev libsqlite3-dev libssl-dev libtool libwebp-dev libxml2-dev libxslt-dev libyaml-dev make patch unzip xz-utils zlib1g-dev $( if apt-cache show 'default-libmysqlclient-dev' 2>/dev/null | grep -q '^Version:'; then echo 'default-libmysqlclient-dev'; else echo 'libmysqlclient-dev'; fi ) ; rm -rf /var/lib/apt/lists/*
# Tue, 19 Oct 2021 19:42:13 GMT
ENV OTP_VERSION=23.3.4.8 REBAR3_VERSION=3.17.0
# Tue, 19 Oct 2021 19:42:13 GMT
LABEL org.opencontainers.image.version=23.3.4.8
# Tue, 19 Oct 2021 19:47:59 GMT
RUN set -xe && OTP_DOWNLOAD_URL="https://github.com/erlang/otp/archive/OTP-${OTP_VERSION}.tar.gz" && OTP_DOWNLOAD_SHA256="e31cbfe0f83cc3e06df997551393fca0e7fa36da371cd2b64c5e196dd843469e" && runtimeDeps='libodbc1 libsctp1 libwxgtk3.0' && buildDeps='unixodbc-dev libsctp-dev libwxgtk3.0-dev' && apt-get update && apt-get install -y --no-install-recommends $runtimeDeps && apt-get install -y --no-install-recommends $buildDeps && curl -fSL -o otp-src.tar.gz "$OTP_DOWNLOAD_URL" && echo "$OTP_DOWNLOAD_SHA256 otp-src.tar.gz" | sha256sum -c - && export ERL_TOP="/usr/src/otp_src_${OTP_VERSION%%@*}" && mkdir -vp $ERL_TOP && tar -xzf otp-src.tar.gz -C $ERL_TOP --strip-components=1 && rm otp-src.tar.gz && ( cd $ERL_TOP && ./otp_build autoconf && gnuArch="$(dpkg-architecture --query DEB_HOST_GNU_TYPE)" && ./configure --build="$gnuArch" && make -j$(nproc) && make -j$(nproc) docs DOC_TARGETS=chunks && make install install-docs DOC_TARGETS=chunks ) && find /usr/local -name examples | xargs rm -rf && apt-get purge -y --auto-remove $buildDeps && rm -rf $ERL_TOP /var/lib/apt/lists/*
# Tue, 19 Oct 2021 19:48:06 GMT
CMD ["erl"]
# Tue, 19 Oct 2021 19:48:06 GMT
ENV REBAR_VERSION=2.6.4
# Tue, 19 Oct 2021 19:48:09 GMT
RUN set -xe && REBAR_DOWNLOAD_URL="https://github.com/rebar/rebar/archive/${REBAR_VERSION}.tar.gz" && REBAR_DOWNLOAD_SHA256="577246bafa2eb2b2c3f1d0c157408650446884555bf87901508ce71d5cc0bd07" && mkdir -p /usr/src/rebar-src && curl -fSL -o rebar-src.tar.gz "$REBAR_DOWNLOAD_URL" && echo "$REBAR_DOWNLOAD_SHA256 rebar-src.tar.gz" | sha256sum -c - && tar -xzf rebar-src.tar.gz -C /usr/src/rebar-src --strip-components=1 && rm rebar-src.tar.gz && cd /usr/src/rebar-src && ./bootstrap && install -v ./rebar /usr/local/bin/ && rm -rf /usr/src/rebar-src
# Tue, 19 Oct 2021 19:48:31 GMT
RUN set -xe && REBAR3_DOWNLOAD_URL="https://github.com/erlang/rebar3/archive/${REBAR3_VERSION}.tar.gz" && REBAR3_DOWNLOAD_SHA256="4c7f33a342bcab498f9bf53cc0ee5b698d9598b8fa9ef6a14bcdf44d21945c27" && mkdir -p /usr/src/rebar3-src && curl -fSL -o rebar3-src.tar.gz "$REBAR3_DOWNLOAD_URL" && echo "$REBAR3_DOWNLOAD_SHA256 rebar3-src.tar.gz" | sha256sum -c - && tar -xzf rebar3-src.tar.gz -C /usr/src/rebar3-src --strip-components=1 && rm rebar3-src.tar.gz && cd /usr/src/rebar3-src && HOME=$PWD ./bootstrap && install -v ./rebar3 /usr/local/bin/ && rm -rf /usr/src/rebar3-src
# Tue, 19 Oct 2021 20:30:58 GMT
ENV ELIXIR_VERSION=v1.11.4 LANG=C.UTF-8
# Tue, 19 Oct 2021 20:31:57 GMT
RUN set -xe && ELIXIR_DOWNLOAD_URL="https://github.com/elixir-lang/elixir/archive/${ELIXIR_VERSION}.tar.gz" && ELIXIR_DOWNLOAD_SHA256="85c7118a0db6007507313db5bddf370216d9394ed7911fe80f21e2fbf7f54d29" && curl -fSL -o elixir-src.tar.gz $ELIXIR_DOWNLOAD_URL && echo "$ELIXIR_DOWNLOAD_SHA256 elixir-src.tar.gz" | sha256sum -c - && mkdir -p /usr/local/src/elixir && tar -xzC /usr/local/src/elixir --strip-components=1 -f elixir-src.tar.gz && rm elixir-src.tar.gz && cd /usr/local/src/elixir && make install clean
# Tue, 19 Oct 2021 20:31:57 GMT
CMD ["iex"]
```
- Layers:
- `sha256:9df790508568720a3b71c02b057e4a119d9d2e8ed003ccba18d600e1ea44fa8a`
Last Modified: Tue, 12 Oct 2021 00:48:22 GMT
Size: 49.0 MB (49004847 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:b7b06d83ee66ef95b96d501c5e0636ce063e0b231fa90d5c4195b351c28dbe4b`
Last Modified: Tue, 12 Oct 2021 07:49:16 GMT
Size: 7.4 MB (7401291 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:d1d873816a9ec26e49ccf4e32a0457007016ac2f6492724888b36562b6dc3b27`
Last Modified: Tue, 12 Oct 2021 07:49:16 GMT
Size: 9.9 MB (9883050 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:631142a77b52394b9a9a4db420460aa022bad636ce01cf42b52e42dbac9f2663`
Last Modified: Tue, 12 Oct 2021 07:49:29 GMT
Size: 51.4 MB (51380285 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:ec44a0de9d7d6b063857100024f2764bb9b4ccf5cc360beb49a96f3a3fe969a9`
Last Modified: Tue, 12 Oct 2021 07:49:55 GMT
Size: 176.9 MB (176913954 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:e3101885d1808621028d54b8757a430f01876516f5d425f7b7839c1375cdb557`
Last Modified: Tue, 19 Oct 2021 20:13:32 GMT
Size: 156.5 MB (156456404 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:be0b30d021338affe502d209a351f744e4a7617324aa0b2c6caf19d58363ab30`
Last Modified: Tue, 19 Oct 2021 20:13:14 GMT
Size: 196.2 KB (196243 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:8188431a6d1558118a0e92cb4942cab7906ac223d71ee36f2fcfcc1fdee026b6`
Last Modified: Tue, 19 Oct 2021 20:13:14 GMT
Size: 919.0 KB (918973 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:04a43cb4ba5598ee1dc20c44d93f5aab505e149d1328c0b74c78a7cbc06d688d`
Last Modified: Tue, 19 Oct 2021 20:41:21 GMT
Size: 7.0 MB (6997245 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
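The human-readable sizes throughout this page use decimal (SI) units — e.g. `6997245 bytes` is rendered as `7.0 MB`, not the binary `6.67 MiB`. The helper below is my own sketch reproducing that convention, not code from the manifest generator; the `GB` branch is an extrapolation, since no entry on this page reaches a gigabyte.

```python
def human_size(n_bytes: int) -> str:
    """Format a byte count the way this page does: decimal units, one decimal.

    The GB branch is an assumed extension; every size on this page is < 1 GB.
    """
    for unit, factor in (("GB", 10**9), ("MB", 10**6), ("KB", 10**3)):
        if n_bytes >= factor:
            return f"{n_bytes / factor:.1f} {unit}"
    return f"{n_bytes} bytes"

# Spot-checks against entries from the linux/s390x listing above:
assert human_size(6997245) == "7.0 MB"      # final (Elixir) layer
assert human_size(196243) == "196.2 KB"     # rebar layer
assert human_size(918973) == "919.0 KB"     # rebar3 layer
assert human_size(459152292) == "459.2 MB"  # total compressed size
```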
| 83.60452 | 1,126 | 0.725391 | kor_Hang | 0.152132 |
d025493802f7ad471539f012f672e9d405886607 | 3,484 | md | Markdown | docs/src/main/tut/datatypes/eval.md | isomarcte/cats | ac2154d1a40da43f4726d85471a3becd524f9d28 | [
"MIT"
] | 1 | 2020-12-14T17:53:22.000Z | 2020-12-14T17:53:22.000Z | docs/src/main/tut/datatypes/eval.md | dabliuw22/cats | 2c0cefbebcad81a7a3963aa760c62de8f33e6bf8 | [
"MIT"
] | null | null | null | docs/src/main/tut/datatypes/eval.md | dabliuw22/cats | 2c0cefbebcad81a7a3963aa760c62de8f33e6bf8 | [
"MIT"
] | 1 | 2018-01-14T15:01:33.000Z | 2018-01-14T15:01:33.000Z | ---
layout: docs
title: "Eval"
section: "data"
source: "core/src/main/scala/cats/Eval.scala"
scaladoc: "#cats.Eval"
---
# Eval
Eval is a data type for controlling synchronous evaluation.
Its implementation is designed to provide stack-safety at all times using a technique called trampolining.
There are two different factors that play into evaluation: memoization and laziness.
Memoized evaluation evaluates an expression only once and then remembers (memoizes) that value.
Lazy evaluation refers to when the expression is evaluated.
We talk about eager evaluation if the expression is immediately evaluated when defined and about lazy evaluation if the expression is evaluated when it's first used.
For example, in Scala, a `lazy val` is both lazy and memoized, a method definition `def` is lazy, but not memoized, since the body will be evaluated on every call.
A normal `val` evaluates eagerly and also memoizes the result.
`Eval` is able to express all of these evaluation strategies and allows us to chain computations using its `Monad` instance.
#### Eval.now
First of the strategies is eager evaluation, we can construct an `Eval` eagerly using `Eval.now`:
```tut:book
import cats.Eval
import cats.implicits._
val eager = Eval.now {
println("Running expensive calculation...")
1 + 2 * 3
}
```
We can run the computation using the given evaluation strategy anytime by using the `value` method.
```tut:book
eager.value
```
#### Eval.later
If we want lazy evaluation, we can use `Eval.later`:
```tut:book
val lazyEval = Eval.later {
println("Running expensive calculation...")
1 + 2 * 3
}
lazyEval.value
lazyEval.value
```
Notice that "Running expensive calculation" is printed only once, since the value was memoized internally.
`Eval.later` is different from a `lazy val` in a few ways.
First, it allows the runtime to perform garbage collection of the thunk after evaluation, leading to more memory being freed earlier.
Secondly, when `lazy val`s are evaluated, in order to preserve thread-safety, the Scala compiler will lock the whole surrounding class, whereas `Eval` will only lock itself.
#### Eval.always
If we want lazy evaluation without memoization, akin to `Function0`, we can use `Eval.always`:
```tut:book
val always = Eval.always {
println("Running expensive calculation...")
1 + 2 * 3
}
always.value
always.value
```
Here we can see that the expression is evaluated every time we call `.value`.
### Chaining lazy computations
One of the most useful applications of `Eval` is its ability to chain together computations in a stack-safe way.
You can see one such usage when looking at the `foldRight` method found in [`Foldable`](../typeclasses/foldable.html).
Another great example is mutually tail-recursive calls:
```tut:book
object MutualRecursion {
def even(n: Int): Eval[Boolean] =
Eval.always(n == 0).flatMap {
case true => Eval.True
case false => odd(n - 1)
}
def odd(n: Int): Eval[Boolean] =
Eval.always(n == 0).flatMap {
case true => Eval.False
case false => even(n - 1)
}
}
MutualRecursion.odd(199999).value
```
Because `Eval` guarantees stack-safety, we can chain a lot of computations together using `flatMap` without fear of blowing up the stack.
You can also use `Eval.defer` to defer any computation that will return an `Eval[A]`.
This is useful, because nesting a call to `.value` inside any of the `Eval` creation methods can be unsafe.
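For example, a stack-safe factorial can be written by suspending each recursive step with `Eval.defer` (a small sketch in the same style as the examples above):

```tut:book
def factorial(n: BigInt): Eval[BigInt] =
  if (n <= 1) Eval.now(BigInt(1))
  else Eval.defer(factorial(n - 1).map(_ * n))

factorial(50000).value
```

Because each recursive call is wrapped in `Eval.defer`, the recursion is trampolined and does not grow the call stack.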
| 30.295652 | 173 | 0.739667 | eng_Latn | 0.995931 |
d0256c1795f4bfe6bf17cbfd3e44147879fe091a | 506 | md | Markdown | docs/Facility.GeneratorApi.Http/FacilityGeneratorApiHttpHandler/TryHandleGenerateAsync.md | FacilityApi/FacilityGeneratorApi | 9625a9e067cd0c74d83487324ae1368187148737 | [
"MIT"
] | 1 | 2020-05-22T15:31:35.000Z | 2020-05-22T15:31:35.000Z | docs/Facility.GeneratorApi.Http/FacilityGeneratorApiHttpHandler/TryHandleGenerateAsync.md | FacilityApi/FacilityGeneratorApi | 9625a9e067cd0c74d83487324ae1368187148737 | [
"MIT"
] | 4 | 2016-11-15T04:26:14.000Z | 2022-01-28T17:08:07.000Z | docs/Facility.GeneratorApi.Http/FacilityGeneratorApiHttpHandler/TryHandleGenerateAsync.md | FacilityApi/FacilityGeneratorApi | 9625a9e067cd0c74d83487324ae1368187148737 | [
"MIT"
] | 7 | 2016-11-08T00:01:08.000Z | 2022-01-27T23:13:35.000Z | # FacilityGeneratorApiHttpHandler.TryHandleGenerateAsync method
Generates code from a service definition.
```csharp
public Task<HttpResponseMessage?> TryHandleGenerateAsync(HttpRequestMessage httpRequest,
CancellationToken cancellationToken = default)
```
## See Also
* class [FacilityGeneratorApiHttpHandler](../FacilityGeneratorApiHttpHandler.md)
* namespace [Facility.GeneratorApi.Http](../../Facility.GeneratorApi.md)
<!-- DO NOT EDIT: generated by xmldocmd for Facility.GeneratorApi.dll -->
| 31.625 | 89 | 0.802372 | kor_Hang | 0.393591 |
d025b62f743b891c715dcf4193b7ca8d69e84f17 | 48 | md | Markdown | README.md | JaWitold/sente_recruitment | d606df7f87b585f44099589b02dcc7849efa4703 | [
"MIT"
] | null | null | null | README.md | JaWitold/sente_recruitment | d606df7f87b585f44099589b02dcc7849efa4703 | [
"MIT"
] | null | null | null | README.md | JaWitold/sente_recruitment | d606df7f87b585f44099589b02dcc7849efa4703 | [
"MIT"
] | null | null | null | # sente_recruitment
Recruitment task from Sente.
| 16 | 27 | 0.854167 | eng_Latn | 0.891987 |
d025db0884c1e44201077289154dc898e5b522e6 | 2,896 | md | Markdown | History.md | VBill/fc-java-sdk | ce6fdb28d1e0b7b55d36c5eb95166377b106daa4 | [
"MIT"
] | 68 | 2017-05-10T08:02:34.000Z | 2022-03-04T09:41:54.000Z | History.md | VBill/fc-java-sdk | ce6fdb28d1e0b7b55d36c5eb95166377b106daa4 | [
"MIT"
] | 73 | 2017-05-11T03:23:26.000Z | 2022-03-22T12:18:52.000Z | History.md | VBill/fc-java-sdk | ce6fdb28d1e0b7b55d36c5eb95166377b106daa4 | [
"MIT"
] | 30 | 2017-06-06T22:50:01.000Z | 2022-03-04T09:42:39.000Z | 1.8.23 / 2021-10-08
==================
* add logBeginRule for logConfig
1.8.21 / 2021-06-17
==================
* add ACR EE instance ID for customContainerConfig
1.8.18 / 2021-04-19
==================
* add instance metrics support
1.8.15 / 2021-01-09
==================
* add targetTrackingPolicy for provision
1.8.14 / 2021-01-07
==================
* add scheduledAction for provision
1.8.12 / 2020-12-16
==================
* add request metrics support
1.8.11 / 2020-10-27
==================
* add tracing support
1.8.10 / 2020-09-14
==================
* add asyncConfig support
1.8.8 / 2020-08-25
==================
* add config validator
1.8.7 / 2020-08-05
==================
* add getCaPort and getCustomContainerConfig in GetFunctionResponse
1.8.6 / 2020-07-27
==================
* add connection request timeout
1.8.5 / 2020-07-22
==================
* add connection request timeout
1.8.4 / 2020-06-22
==================
* add vpc binding api
1.8.2 / 2020-06-19
==================
* add customContainerConfig/caPort and instanceType
1.8.1 / 2020-03-16
==================
* add http method PATCH
1.8.0 / 2020-03-13
==================
* add SignURL for http trigger
1.7.1 / 2019-12-27
==================
* Add new field for prepaid list interface
1.7.0 / 2019-11-22
==================
* add async client; for now only the functions and services APIs are supported
1.6.0 / 2019-11-15
==================
* add on-demand-config API (#87)
* add instanceConcurrency for Create/Update function (#83)
* fix tag/untag response without body, requestId already in header
1.5.0 / 2019-09-04
==================
* Add tag support
1.4.0 / 2019-08-26
==================
* Add provision interface
1.3.4 / 2019-05-30
==================
* Encode all responses with UTF-8
1.3.3 / 2019-05-23
==================
* Add Custom Domain HTTPS Support
1.3.2 / 2019-05-23
==================
* Add Custom Domain HTTPS Support
1.3.1 / 2018-11-22
==================
* add Get Account Setting
1.3.0 / 2018-11-21
==================
* add Initializer support
* add Versioning support
1.2.1 / 2018-09-20
==================
* Add Custom Domain Support
1.2.0 / 2018-08-27
==================
* Add NAS Support
1.1.10 / 2018-06-01
==================
* Add vpc and cdn events trigger support
1.1.8 / 2018-05-04
==================
* Fix URL-encoding related problem
1.1.7 / 2018-04-27
==================
* Add http trigger support
1.1.6 / 2018-04-12
==================
* Add environment variables support
1.1.5 / 2018-03-16
==================
* Fixed a compilation problem for jdk 1.9
1.1.4 / 2018-03-09
==================
* Add time trigger support
1.1.3 / 2017-10-12
==================
* Fixed FC endpoints for newer regions
1.1.2 / 2017-09-29
==================
* Refactor log trigger config
| 16.548571 | 69 | 0.518301 | oci_Latn | 0.462641 |
d0262a82ab91cc34d3441b500830231e4df4bc8f | 1,108 | md | Markdown | content/post/2019/07/katex.md | vunb/blog | c75f85a62e12a8ae2f33b39ee930a8d1e8309a28 | [
"MIT"
] | null | null | null | content/post/2019/07/katex.md | vunb/blog | c75f85a62e12a8ae2f33b39ee930a8d1e8309a28 | [
"MIT"
] | null | null | null | content/post/2019/07/katex.md | vunb/blog | c75f85a62e12a8ae2f33b39ee930a8d1e8309a28 | [
"MIT"
] | null | null | null | ---
title: Katex
date: 2019-07-10T00:00:00Z
draft: false
share: false
ad: true
katex: true
markup: "mmark"
tags:
- Katex
- Latex
- Hugo
- Today I Learned
categories:
- TIL # Today I Learned
---
Configure Hugo to use `katex` and `mmark`. It is not too hard — about 15 minutes of work. Now that installation is done, let's look at how to use it :smile:
### Display Block
Type equation on a single line, with top and bottom spaces. Use double dollar signs as delimiter.
```md
The following
$$\int_{a}^{b} x^2 dx$$
Is an integral
```
The following
$$\int_{a}^{b} x^2 dx$$
Is an integral
### Display Inline
Type the equation as per usual, nothing extra needed. Use double dollar signs as delimiter.
```md
Integrate $$\int x^3 dx$$
```
Integrate $$\int x^3 dx$$
### Notice
For the snippets above to work, add the following configuration to each post's front matter:
```yaml
---
katex: true
markup: "mmark"
---
```
# References
A few configuration guides:
* https://eankeen.github.io/blog/render-latex-with-katex-in-hugo-blog/
* https://katex.org/docs/browser.html
* https://katex.org/docs/autorender.html
| 16.787879 | 156 | 0.690433 | vie_Latn | 0.981411 |
d02690bbf95d28f17d08f2a0f60b284b20818cff | 336 | md | Markdown | _posts/2021-07-08/2021-07-02-Ever-fucked-a-milf-20210702054720138445.md | ipussy/ipussy.github.io | 95d19a74e38bb54303cf18057a99a57c783e76bf | [
"Apache-2.0"
] | null | null | null | _posts/2021-07-08/2021-07-02-Ever-fucked-a-milf-20210702054720138445.md | ipussy/ipussy.github.io | 95d19a74e38bb54303cf18057a99a57c783e76bf | [
"Apache-2.0"
] | null | null | null | _posts/2021-07-08/2021-07-02-Ever-fucked-a-milf-20210702054720138445.md | ipussy/ipussy.github.io | 95d19a74e38bb54303cf18057a99a57c783e76bf | [
"Apache-2.0"
] | null | null | null | ---
title: "Ever fucked a milf?"
metadate: "hide"
categories: [ Pussy ]
image: "https://preview.redd.it/d0oxw55gho871.jpg?auto=webp&s=7403b9a416f799e1136504e48f686fd50e510830"
thumb: "https://preview.redd.it/d0oxw55gho871.jpg?width=640&crop=smart&auto=webp&s=fdf712ddbd346f074824e92ad3b1c6fff34afbc3"
visit: ""
---
Ever fucked a milf?
| 33.6 | 124 | 0.77381 | yue_Hant | 0.153042 |
d027404a6737be54947db2367c799289af8ca3a0 | 3,528 | md | Markdown | src/iterating.md | tetrane/api-cookbook | 973acc10d1643d8f5f97a80e35db90486f5ff7af | [
"X11"
] | null | null | null | src/iterating.md | tetrane/api-cookbook | 973acc10d1643d8f5f97a80e35db90486f5ff7af | [
"X11"
] | 2 | 2021-11-25T11:31:14.000Z | 2021-11-26T13:00:10.000Z | src/iterating.md | tetrane/api-cookbook | 973acc10d1643d8f5f97a80e35db90486f5ff7af | [
"X11"
] | null | null | null | {% import "templates/bulma.tera" as bulma %}
# Manipulating transitions and contexts
## Getting a transition or context from a transition id
{{ bulma::begin_bulma() }}
{{ bulma::tags(tags=[
bulma::reven_version(version="v2.2.0"),
]) }}
{{ bulma::end_bulma() }}
```py
tr = server.trace.transition(1234)
ctx_before = server.trace.context_before(1234)
ctx_after = server.trace.context_after(1234)
```
## Context <-> Transition
### Transition -> Context
{{ bulma::begin_bulma() }}
{{ bulma::tags(tags=[
bulma::reven_version(version="v2.2.0"),
]) }}
{{ bulma::end_bulma() }}
```py
ctx_before = tr.context_before()
ctx_after = tr.context_after()
```
### Context -> Transition
{{ bulma::begin_bulma() }}
{{ bulma::tags(tags=[
bulma::reven_version(version="v2.6.0"),
]) }}
{{ bulma::end_bulma() }}
```py
if ctx != server.trace.first_context:
    tr_before = ctx.transition_before()
if ctx != server.trace.last_context:
    tr_after = ctx.transition_after()
```
{{ bulma::begin_bulma() }}
{{ bulma::tags(tags=[
bulma::reven_version(version="v2.2.0"),
]) }}
{{ bulma::end_bulma() }}
```py
if ctx != server.trace.context_before(0):
    tr_before = ctx.transition_before()
if ctx != server.trace.context_after(server.trace.transition_count - 1):
    tr_after = ctx.transition_after()
```
{{ bulma::begin_bulma() }}
{{ bulma::begin_message(header="There are not always transitions around a context", class="is-warning") }}
<p>
While <code>transition.context_before/context_after()</code> always works, one must handle the case where a context is the first/last of the trace, in which case no transition before/after it can be accessed.
</p>
<p>
Trying to access the transition before the first context or after the last one will trigger an <code>IndexError</code>.
</p>
{{ bulma::end_message() }}
{{ bulma::end_bulma() }}
## Getting the next/previous context and transition
```py
next_tr = tr + 1
prev_tr = tr - 1
next_ctx = ctx + 1
prev_ctx = ctx - 1
next_next_tr = tr + 2
# ...
```
{{ bulma::begin_bulma() }}
{{ bulma::begin_message(header="There is not always a next/previous transition/context", class="is-warning") }}
<p>
Make sure that the resulting transition/context is in range when adding/subtracting an offset to generate a new transition/context.
</p>
<p>
Trying to access a transition/context out-of-range will trigger an <code>IndexError</code>.
</p>
{{ bulma::end_message() }}
{{ bulma::end_bulma() }}
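Since out-of-range accesses raise `IndexError`, a small defensive wrapper can return `None` instead of raising. This is a generic sketch rather than part of the REVEN API; `BoundedItem` below is a hypothetical stand-in for a transition/context, used only to keep the example self-contained:

```py
def safe_offset(item, offset):
    """Return `item + offset`, or None when the result is out of range."""
    try:
        return item + offset
    except IndexError:
        return None


class BoundedItem:
    """Hypothetical stand-in for a transition/context: supports `+`
    and raises IndexError outside the range [0, count)."""

    def __init__(self, index, count):
        if not 0 <= index < count:
            raise IndexError(index)
        self.index = index
        self.count = count

    def __add__(self, offset):
        return BoundedItem(self.index + offset, self.count)


tr = BoundedItem(0, 10)
print(safe_offset(tr, 5).index)  # 5
print(safe_offset(tr, -1))       # None
```

The same `try`/`except IndexError` pattern applies directly to real transition and context objects.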
## Iterating on a range of transitions/contexts
{{ bulma::begin_bulma() }}
{{ bulma::tags(tags=[
bulma::reven_version(version="v2.2.0"),
]) }}
{{ bulma::end_bulma() }}
```py
for tr in server.trace.transitions(0, 1000):
    print(tr)
for ctx in server.trace.contexts():
    print(ctx)
```
## Getting the first/last context/transition in the trace
{{ bulma::begin_bulma() }}
{{ bulma::tags(tags=[
bulma::reven_version(version="v2.6.0"),
]) }}
{{ bulma::end_bulma() }}
```py
first_tr = server.trace.first_transition
last_tr = server.trace.last_transition
first_ctx = server.trace.first_context
last_ctx = server.trace.last_context
```
{{ bulma::begin_bulma() }}
{{ bulma::tags(tags=[
bulma::reven_version(version="v2.2.0"),
]) }}
{{ bulma::end_bulma() }}
```py
first_tr = server.trace.transition(0)
last_tr = server.trace.transition(server.trace.transition_count - 1)
first_ctx = server.trace.context_before(0)
last_ctx = server.trace.context_after(server.trace.transition_count - 1)
```
| 25.565217 | 210 | 0.671485 | eng_Latn | 0.560121 |
d0274e09f5e6685dfe35c6b7f5fe021961d070f2 | 5,911 | md | Markdown | README.md | ripe-tech/pconvert-rust | cf9ffcfc59d5838cf4d74a2c6c666e3f94f7cdc3 | [
"Apache-2.0"
] | 6 | 2020-09-02T17:50:43.000Z | 2021-10-10T00:58:04.000Z | README.md | ripe-tech/pconvert-rust | cf9ffcfc59d5838cf4d74a2c6c666e3f94f7cdc3 | [
"Apache-2.0"
] | 20 | 2020-09-03T12:00:34.000Z | 2022-03-11T15:45:16.000Z | README.md | ripe-tech/pconvert-rust | cf9ffcfc59d5838cf4d74a2c6c666e3f94f7cdc3 | [
"Apache-2.0"
] | null | null | null | # P(NG)Convert Rust
The [Rust](https://www.rust-lang.org) version of the famous [P(NG)Convert](https://github.com/hivesolutions/pconvert) from Hive Solutions.
This Rust crate can be used as a **command line application**, as a **crate** in another Rust project, as a **WebAssembly module** (able to be used within JavaScript that targets web browsers or NodeJS) or as a **Python package**.
## Command Line Application
### Compiling & Executing
Build and run with:
```bash
cargo run
```
Alternatively, compile first with:
```bash
cargo build
```
and then run the binary with:
```bash
./target/debug/pconvert-rust
```
Additionally, for better code optimization, compile with the `--release` flag:
```bash
cargo build --release
```
and then run the release binary with:
```bash
./target/release/pconvert-rust
```
### Usage
```console
$ pconvert-rust
Usage: pconvert-rust <command> [args...]
where command can be one of the following: compose, convert, benchmark, version
```
```console
$ pconvert-rust compose <dir>
```
```console
$ pconvert-rust convert <file_in> <file_out>
```
```console
$ pconvert-rust benchmark <dir> [--parallel]
```
```console
$ pconvert-rust version
```
### Example
```rust
// blends the provided image as a new image to be used
// under the current instance
let top = pconvert_rust::utils::read_png_from_file("top.png".to_string(), false).unwrap();
let mut bottom = pconvert_rust::utils::read_png_from_file("bottom.png".to_string(), false).unwrap();
// gathers the mask top blending algorithm function and
// uses it to blend both images
let blending_fn = pconvert_rust::blending::get_blending_algorithm(
&pconvert_rust::blending::BlendAlgorithm::DestinationOver,
);
pconvert_rust::blending::blend_images(&mut bottom, &top, &blending_fn, &None);
// "outputs" the blended image contents to the `out.png` file
pconvert_rust::utils::write_png_to_file_d("out.png".to_string(), &bottom).unwrap();
```
## WebAssembly (WASM) Module
### Compiling & Executing
Follow [this guide](https://developer.mozilla.org/en-US/docs/WebAssembly/Rust_to_wasm) on how to install `wasm-pack`.
To build, use the `wasm-extension` feature:
```bash
wasm-pack build -- --features wasm-extension
```
To run the demo, follow [this](https://developer.mozilla.org/en-US/docs/WebAssembly/Rust_to_wasm#Making_our_package_availabe_to_npm).
### Usage
Check the [demo site](examples/wasm/index.js) to see how to use the PConvert WASM module.
JavaScript API exposed:
```javascript
// blends two File objects and returns a File object
blendImages(bot, top, target_file_name, algorithm, is_inline, options)
// blends two ImageData objects and returns an ImageData object
blendImagesData(bot, top, algorithm, is_inline, options)
// blends multiple File objects and returns a File object
blendMultiple(image_files, target_file_name, algorithm, algorithms, is_inline, options)
// blends multiple ImageData objects and returns an ImageData object
blendMultipleData(images, algorithm, algorithms, is_inline, options)
// returns a JSON of module constants (e.g. ALGORITHMS, FILTER_TYPES, COMPILER_VERSION, ...)
getModuleConstants()
// benchmarks and prints to console various times for different combinations of blending algorithms, compression algorithms and filters for `blendImages`
blendImagesBenchmarkAll(bot, top, is_inline)
// benchmarks and prints to console various times for different combinations of blending algorithms, compression algorithms and filters for `blendMultiple`
blendMultipleBenchmarkAll(image_files, is_inline)
```
## Python package
### Compiling & Executing
This crate can be installed as a Python package using `pip`. Simply run:
```bash
pip install .
```
### Usage
Check [this folder](examples/python/) for examples.
Import the python package with:
```python
import pconvert_rust
```
Python API exposed. The parameter `options` is a Python dictionary of optional parameters; if `num_threads` is specified with a value of 1 or more, the workload will be distributed across multiple threads (belonging to an internally managed thread pool).
```python
# blends two images read from the local file system and writes the result to the file system
blend_images(bot_path, top_path, target_path, algorithm, is_inline, options)
# blends multiple images read from the local file system and writes the result to the file system
blend_multiple(img_paths, out_path, algorithm, algorithms, is_inline, options)
# returns a python dict with summary information about the internal thread pool (size, active jobs, queued jobs)
get_thread_pool_status()
# access module constants (e.g. ALGORITHMS, FILTER_TYPES, COMPILER_VERSION, ...)
pconvert_rust.ALGORITHMS
pconvert_rust.FILTER_TYPES
pconvert_rust.COMPILER_VERSION
```
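For intuition, the destination-over rule used in the Rust example above can be sketched for a single pair of 8-bit RGBA pixels in pure Python. This is only a conceptual illustration of the compositing math (standard Porter-Duff destination-over), not PConvert's actual implementation:

```python
def destination_over(bottom, top):
    """Composite one 8-bit RGBA pixel of `top` *under* `bottom`.

    Porter-Duff destination-over: the bottom (destination) pixel wins,
    and the top pixel only shows through transparent areas.
    """
    br, bg, bb, ba = bottom
    tr, tg, tb, ta = top
    ab, at = ba / 255.0, ta / 255.0
    a_out = ab + at * (1.0 - ab)
    if a_out == 0:
        return (0, 0, 0, 0)

    def channel(cb, ct):
        return round((cb * ab + ct * at * (1.0 - ab)) / a_out)

    return (channel(br, tr), channel(bg, tg), channel(bb, tb), round(a_out * 255))


# a fully opaque bottom pixel completely hides the top pixel
print(destination_over((255, 0, 0, 255), (0, 255, 0, 255)))  # (255, 0, 0, 255)
```

PConvert applies this kind of per-pixel rule over whole images, with the algorithm chosen by name (see `ALGORITHMS` above).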
## Tests
For rust crate tests:
```bash
cargo test
```
For python API tests:
```bash
python setup.py test
```
For WASM API tests:
```bash
npm test
```
## Documentation
Generate documentation using:
```bash
cargo doc --lib --all-features
```
## License
P(NG)Convert Rust is currently licensed under the [Apache License, Version 2.0](http://www.apache.org/licenses/).
## Build Automation
[](https://travis-ci.com/github/ripe-tech/pconvert-rust)
[](https://github.com/ripe-tech/pconvert-rust/actions)
[](https://crates.io/crates/pconvert-rust)
[](https://pypi.python.org/pypi/pconvert-rust)
[](https://www.npmjs.com/package/pconvert-rust)
[](https://www.apache.org/licenses/)
| 28.834146 | 257 | 0.755033 | eng_Latn | 0.775119 |
d02867ccee22684f5a1bed6206693805c51efa69 | 1,545 | md | Markdown | operators/meetings/009-20191106.md | askhade/sigs | 22a7504efb800ba9ead8ba61dd6937cd9012c256 | [
"MIT"
] | null | null | null | operators/meetings/009-20191106.md | askhade/sigs | 22a7504efb800ba9ead8ba61dd6937cd9012c256 | [
"MIT"
] | null | null | null | operators/meetings/009-20191106.md | askhade/sigs | 22a7504efb800ba9ead8ba61dd6937cd9012c256 | [
"MIT"
] | null | null | null | # Wednesday November 6, 2019 at 9:00am PST
## Agenda
* Dynamic shape proposal:
* Link: https://1drv.ms/w/s!Aq5Cu4d76dsCjbFWbJajaHnH25CkXQ?e=Dabhup
* Link: https://github.com/goldgeisser/onnx/blob/master/docs/proposals/StaticShape4DynShapeOpsProposal.md
* Removing operator update
## Attendees
* Emad Barsoum (Microsoft)
* Michal Karzynski (Intel)
* Ganesan Ramalingam (Microsoft)
* Spandan Tiwari (Microsoft)
* Dilip Sequeira (NVidia)
* Itay Hubara (Habana)
* Scott Cyphers (Intel)
* Leonid Goldgeisser (Habana)
## Notes
* We discussed the dynamic versus static shape for accelerator.
* We focused on what need to be in the spec and what should be an implementation detail.
* Here the summary.
* We will add a hint attributed to specify the max size.
* This attribute is optional.
* It can be set by the model creator or afterward.
* The behavior of exceeding the max value is NOT defined by the standard.
* There is no concept of dynamic versus static in ONNX, ONNX is a file format, it is up to the runtime to respect the hint and create a fixed size graph in-memory.
* This attribute is optional and can be ignored by the converter.
* The detailed design TBD.
* For removed OP.
* Archive older operator.md for each release.
* Add remvoed/deprecated OPs at the bottem of operator.md.
## Action items
* Emad Barsoum - submit the operator removal requirements to ONNX master.
* Leonid Goldgeisser - to write a final draft document for the dynamic shape and share it with the rest of the group.
| 40.657895 | 167 | 0.744984 | eng_Latn | 0.985156 |
d02925604ead4e8259c90cbe31b844081c307e82 | 2,296 | md | Markdown | networks-seminar/09-pFabric.md | danalex97/perls | d311bef2156046b2420d16240e4a2b103269fc52 | [
"MIT"
] | null | null | null | networks-seminar/09-pFabric.md | danalex97/perls | d311bef2156046b2420d16240e4a2b103269fc52 | [
"MIT"
] | null | null | null | networks-seminar/09-pFabric.md | danalex97/perls | d311bef2156046b2420d16240e4a2b103269fc52 | [
"MIT"
] | null | null | null | # [pFabric](https://web.stanford.edu/~skatti/pubs/sigcomm13-pfabric.pdf)
### Paper summary
pFabric is data center transport design which decouples rate control and flow scheduling. The system uses an approximation of SRPT, ordering the packets based on a packet priority which is ideally set as the remaining flow size. The highest priority packet is scheduled when a port is idle, while the lowest priority packet is dropped if the queue overflows. Rate control does additive increase and resets the window at timeouts. The algorithm achieves low average flow completion times for simulations of spine-leaf topologies. Finally, the paper offers a flow priority assignment methodology for current routers to approximate pFabric.
### Key insight(s)
-SRPT is a 2-approximation of the optimal flow allocation that minimizes the FCT in the big switch model.
-pFabric offers similar guarantees to SRPT as long as at each switch port one of the highest priority packets is always available. A packet can be dropped as long as the queue takes more than an RTT to drain, supposing the host retransmits the packet.
-pFabric allows scheduling based on deadlines and flow sizes.
-Starvation can be used by up-capping the usage of the algorithm for big size flows.
-The algorithm can be approximated without changing the hardware by setting good thresholds for priorities based on flow sizes.
### What could be improved?
One of my concerns about this paper is about the deployment feasibility since it requires a completely new hardware for switches. Moreover, the incremental deployment optimizes the thresholds using the given workload, making the approach less generalizable. More research can be done on changing the thresholds dynamically, not based on a single workload distribution.
Another missed point from my point of view is not discussing the size of the packets in the algorithm for dropping packets. Dropping a packet of 1500bytes is not same as dropping a 20bytes packet.
pFabric also does not support streams and the paper suggests using a hierarchical approach to solve this problem: split the traffic then apply pFabric approach only on some of the traffic. How does this work in terms of hardware? How much of the data center has to get pFabric switches as opposed to usual ones?
| 99.826087 | 637 | 0.810976 | eng_Latn | 0.999649 |
d029314703a791eff8e446a5673714884159e13e | 1,405 | md | Markdown | content/2003/06/12/121.md | nobu666/nobu666.com | 917dc730b45c5b123aa830f6d3dcb712706abee4 | [
"MIT"
] | null | null | null | content/2003/06/12/121.md | nobu666/nobu666.com | 917dc730b45c5b123aa830f6d3dcb712706abee4 | [
"MIT"
] | null | null | null | content/2003/06/12/121.md | nobu666/nobu666.com | 917dc730b45c5b123aa830f6d3dcb712706abee4 | [
"MIT"
] | null | null | null | +++
date = "2003-06-12T00:00:00+09:00"
draft = false
title = "おもむろに的を得たことを言い放ち、すざまじい勢いで汚名挽回します"
categories = ["diary"]
+++
昨日のオトナ語読んでて思ったけど、日本語間違って使ってることって凄く多い。調べてみたら自分も間違っているものが沢山出てくる。「情けは人の為ならず」が有名だけど、他にも「的を得る」とか言ったりするけど「的を射る」のが正しいし、「ありえる」も「ありうる」の誤用だ。「おざなり」と「なおざり」をごちゃ混ぜに使っている人も多く見るし、「青田刈り」と「青田買い」も間違いが多い。「喧々囂々(けんけんごうごう)」と「侃々諤々(かんかんがくがく)」が混ざって「ケンケンガクガク」とかもありがち。「花に水をあげ」たり、「汚名挽回」してみたり、「舌づつみ」をうってみたり、「屈辱を果たし」たり「雪辱を晴らし」たり。「うる覚え」とか「すざまじい」とか。
他にも、これは間違って使うこと自分もあるし、周りも間違ってることが多いなぁと思うものをちょっとリストアップ。
<ul>
<li>おもむろに:(誤)唐突に (正)ゆっくりと</li>
<li>やおら:(誤)唐突に (正)ゆっくりと</li>
<li>檄を飛ばす:(誤)叱咤する (正)自分の主張や考えを広く人々に知らせ、人々に決起を促したりする</li>
<li>檄を送る:(誤)激励の言葉を掛ける (正)檄は送らない...</li>
<li>やぶさかでない:(誤)嫌ではない (正)?する努力を惜しまない、快く?する</li>
<li>草場の蔭で:(誤)隠れて (正)あの世で</li>
<li>元旦:(誤)元日 (正)元日の朝</li>
<li>役不足:(誤)力不足 (正)俳優などが割り当てられた役に不満を抱くこと。力量に比べて、役目が不相応に軽いこと。</li>
<li>歌のさわり:(誤)歌い出し (正)聞かせどころ</li>
</ul>
思ったよりいっぱいあるもんだ。中には明らかに間違っている方の意味が、一般的に使われてるものもあるし。最近の言葉では「逆ギレ」もなんか微妙な使われ方をしている気がする。たとえば6/11付けのZAKZAKの記事に、<a href="http://www.zakzak.co.jp/top/t-2003_06/1t2003061117.html">売春女子高生、“無銭セックス”に逆ギレ</a>とあるが、「金を払う約束でセックスしたのに『顔を殴るなどし、車から無理やり降し、タダ乗りを図った』ことに対して、女子高生が怒った」のが何故逆ギレなのか?ここで男の方が「なんで金払う必要があんだよ!フザケンナ!」とか怒り出したりするのが逆ギレなんじゃないのか?
まぁこれを機に、会社とかで恥をかかないように気をつけてみてはいかがか。ただ客先とかで、得意げに相手の間違いを指摘したりすると大変なことになるかも知れないので、その辺はご注意を(笑)。
<h3>:: Today's Music ::</h3>
<ul>
<li>HEADING FOR A STORM / VANDENBERG</li>
<li>Animetal Marathon / ANIMETAL</li>
</ul>
| 43.90625 | 323 | 0.76726 | jpn_Jpan | 0.75154 |
d029737c3e39a4e8f58e2a46e7677977a9c581d7 | 3,992 | md | Markdown | README.md | vardrop/mantle | b0e54a8c8648c3e78c0b7822ca21dec86efa38b0 | [
"Apache-2.0"
] | 1 | 2020-09-21T06:41:14.000Z | 2020-09-21T06:41:14.000Z | README.md | vardrop/mantle | b0e54a8c8648c3e78c0b7822ca21dec86efa38b0 | [
"Apache-2.0"
] | null | null | null | README.md | vardrop/mantle | b0e54a8c8648c3e78c0b7822ca21dec86efa38b0 | [
"Apache-2.0"
] | null | null | null | # Mantle

[](https://github.com/nektro/mantle/blob/master/LICENSE)
[](https://discord.gg/P6Y4zQC)
[](https://paypal.me/nektro)
[](https://circleci.com/gh/nektro/mantle)
[](https://github.com/nektro/mantle/releases/latest)
[](https://goreportcard.com/report/github.com/nektro/mantle)
[](https://www.codefactor.io/repository/github/nektro/skarn)
Easy and effective communication is the foundation of any successful team or community. That's where Mantle comes in, providing you the messaging and voice platform that puts you in charge of both the conversation and the data.
## Getting Started
These instructions will help you get the project up and running and are required before moving on.
### Creating External Auth Credentials
In order to allow users to log in to Mantle, you will need to create an app on your Identity Provider(s) of choice. See the [nektro/go.oauth2](https://github.com/nektro/go.oauth2#readme) docs for more detailed info on this process on where to go and what data you'll need.
Here you can also fill out a picture and description that will be displayed during the authorization of users on your chosen Identity Provider. When prompted for the "Redirect URI" during the app setup process, the URL to use will be `http://mantle/callback`, replacing `mantle` with any origins you wish Mantle to be usable from, such as `example.com` or `localhost:800`.
Once you have finished the app creation process you should now have a Client ID and Client Secret. These are passed into Mantle through flags as well.
| Name | Type | Default | Description |
|------|------|---------|-------------|
| `--auth-{IDP-ID}-id` | `string` | none. | Client ID. |
| `--auth-{IDP-ID}-secret` | `string` | none. | Client Secret. |
The Identity Provider IDs can be found from the table in the [nektro/go.oauth2](https://github.com/nektro/go.oauth2#readme) documentation.
## Development
### Prerequisites
- The Go Language 1.12+ (https://golang.org/dl/)
- GCC on your PATH (for the https://github.com/mattn/go-sqlite3 installation)
### Installing
Run
```
$ go get -u -v github.com/nektro/mantle
```
and then make your way to `$GOPATH/src/github.com/nektro/mantle/`.
Once there, run:
```
$ ./start.sh
```
## Deployment
Pre-compiled binaries can be obtained from https://github.com/nektro/mantle/releases/latest.
Or you can build from source:
```
$ ./build_all.sh
```
## Built With
- http://github.com/gorilla/sessions
- http://github.com/gorilla/websocket
- http://github.com/nektro/go-util
- http://github.com/nektro/go.etc
- http://github.com/nektro/go.oauth2
- http://github.com/satori/go.uuid
- http://github.com/spf13/pflag
## Contributing
[](https://github.com/nektro/mantle/issues)
[](https://github.com/nektro/mantle/pulls)
We listen to issues all the time right here on GitHub. Labels are extensively to show the progress through the fixing process. Question issues are okay but make sure to close the issue when it has been answered! Off-topic and '+1' comments will be deleted. Please use post/comment reactions for this purpose.
When making a pull request, please have it be associated with an issue and make a comment on the issue saying that you're working on it so everyone else knows what's going on :D
## Contact
- [email protected]
- Meghan#2032 on discordapp.com
- https://twitter.com/nektro
## License
Apache 2.0
| 49.9 | 372 | 0.74524 | eng_Latn | 0.861499 |
d02a498774d82fe80ef8481d87d75b0431a90781 | 2,691 | md | Markdown | _posts/gitblog/2020-10-14-jekyll-setting-postDate.md | heoseongh/heoseongh.github.io | f7e684e915695e2626b46a84805769900ba1b5b5 | [
"MIT"
] | null | null | null | _posts/gitblog/2020-10-14-jekyll-setting-postDate.md | heoseongh/heoseongh.github.io | f7e684e915695e2626b46a84805769900ba1b5b5 | [
"MIT"
] | 1 | 2020-11-09T17:05:42.000Z | 2020-11-09T19:16:22.000Z | _posts/gitblog/2020-10-14-jekyll-setting-postDate.md | heoseongh/heoseongh.github.io | f7e684e915695e2626b46a84805769900ba1b5b5 | [
"MIT"
] | 2 | 2021-07-14T10:34:22.000Z | 2022-03-28T07:32:17.000Z | ---
title: "jekyll 블로그 포스팅 날짜로 바꾸기"
excerpt: "포스팅 목록에 표시되는 소요시간을 날짜로 바꿔보자!"
categories:
- GitBlog
tags:
- Git
- GitBlog
- jekyll
- Custom
---
## 서론
한 블로거의 포스팅을 따라 열심히 블로그를 만들었다. 그러다 문득 포스트 제목 아래 표시되는 '포스팅을 읽는데 예상되는 소요시간(read-time)'이 필요 없다고 느껴졌다. 날짜로 바꾸려고 아무리 구글링을 해봐도 최신 버전의 방법이 나오지 않았기 때문에 결국 내가 직접 삽질을 했다.
우선 어떻게 바꿀 것인지 결과물을 보고 시작하자.

구글링을 해보면 포스팅된 글들의 목록을 보여주는 부분(`_includes/archive-single.html`)과 글을 보여주는 부분(`_includes/single.html`) 두 군데를 수정하면 된다고 하는데 아무리 건드려도 안된다.. 결국 이것저것 건드려보다가 방법을 찾아냈다.
포스팅을 읽을 시간이 없다면 [commit 95d60de](https://github.com/heoseongh/heoseongh.github.io/commit/95d60de577b8f90c7fab02ece706debaa403511e#diff-ecec67b0e1d7e17a83587c6d27b6baaaa133f42482b07bd3685c77f34b62d883)를 참고하여 변경 사항만 챙기고 넘어가자.
## step1.
`archive-single.html` 에서 <h2>태그 이후에 있는 `include page_meta.html` 코드를 이해해보자.
> archive-signle.html
```html
...
</h2>
{ % include page__meta.html type=include.type % }
{ % if post.excerpt % }
<p class="archive__item-excerpt" itemprop="description">
{ { post.excerpt | markdownify | strip_html | truncate: 160 } }
</p>
{ % endif % }
</article>
</div>
```
* { % include page_meta.html % } : page_meta.html 파일을 끼워넣는다.
> jekyll에서 사용하는 Liquid 언어는 루비 기반의 언어이다. `{ % 태그 % }` 와 같은 형태로 `if-else` 문과 같이 다양한 태그를 실행할 수 있는데 `include` 태그를 사용하면 `_includes` 폴더에 저장된 다른 파일의 내용을 포함시킬 수 있게 된다.
>
> 참고: [JekyllDocs](https://jekyllrb-ko.github.io/docs/includes/)
## step2.
이제 page_meta.html 파일을 살펴보자. 크게 조건문 3개가 보인다.
```ruby
{ % if document.read_time or document.show_date % }
{ % if document.read_time and document.show_date % }
{ % if document.read_time % }
```
1. If `read_time` or `show_date`?
2. If `read_time` and `show_date`?
3. If `read_time`?
Seeing this, you'll probably remember setting `read_time` to `true` in `_config.yml` earlier.
Go back to `_config.yml`, change `read_time` to `false` in the posts section, and add `show_date: true`. Let's add a tidy comment, too.
```yaml
defaults:
# _posts
- scope:
path: ""
type: posts
values:
layout: single
author_profile: true
      read_time: false # show estimated reading time
      show_date: true # show the post date
comments: true
share: true
related: true
```
Because we changed the configuration file like this, a simple refresh won't pick it up; you have to restart the server.
```bash
$ bundle exec jekyll serve # restart the server
```
With this setup, you can hide the reading time and display only the post date.
## Wrapping up..
Up to the previous versions, the configuration code for the post list went directly into `includes/archive-single.html`, but from recent versions the time and date meta information seems to have been grouped into `includes/page__meta.html` so it can be managed separately. Apparently, in the past you couldn't configure read_time and show_date (formerly post.date) separately in _config.yml.
*Source file: ce/customerengagement/on-premises/developer/org-service/sample-create-linq-query.md (mairaw/dynamics-365-customer-engagement, CC-BY-4.0, MIT)*
title: "Sample: Create a LINQ query (Developer Guide for Dynamics 365 Customer Engagement)| MicrosoftDocs"
description: "This sample shows how to create simple .NET Language-Integrated Query (LINQ) queries"
ms.custom:
ms.date: 02/05/2020
ms.reviewer: pehecke
ms.service: crm-online
ms.suite:
ms.tgt_pltfrm:
ms.topic: samples
applies_to:
- Dynamics 365 Customer Engagement (on-premises)
helpviewer_keywords:
- LINQ query examples and samples, creating simple LINQ queries sample
- simple LINQ queries sample
- creating simple LINQ queries sample
- sample for creating simple LINQ queries
ms.assetid: 43b26b09-636e-4781-8477-65454c4c5232
caps.latest.revision: 19
author: KumarVivek
ms.author: kvivek
manager: amyla
search.audienceType:
- developer
---
# Sample: Create a LINQ query
Download the complete sample from [Sample: Query data using LINQ](https://github.com/microsoft/PowerApps-Samples/tree/master/cds/orgsvc/C%23/QueriesUsingLINQ).
[!INCLUDE[cc-sample-note](../includes/cc-sample-note.md)]
## Prerequisites
[!INCLUDE[sdk-prerequisite](../../includes/sdk-prerequisite.md)]
## Requirements
[!INCLUDE[cc-how-to-run-PA-samples](../includes/cc-how-to-run-PA-samples.md)]
## Demonstrates
This sample shows how to create simple [!INCLUDE[pn_LINQ](../../includes/pn-linq.md)] queries. The following queries are demonstrated:
- Retrieve all accounts that the calling user has access to.
- Retrieve all accounts owned by the user who has read access rights to the accounts and where the last name of the user is not “Cannon.”
- Return a count of all accounts that have a county specified in their addresses.
- Return a count of states in which we have an account. This uses the `distinct` keyword that counts a state only once.
- Return contacts where the city equals “Redmond” AND the first name is “Joe” OR “John.”
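As a hedged illustration of the last query above (this snippet is not part of the downloadable sample; it assumes an early-bound `OrganizationServiceContext` subclass instance named `svcContext` with a generated `ContactSet` property, as produced by the code-generation tool), the query might be sketched as:

```csharp
// Contacts in Redmond whose first name is Joe or John (illustrative sketch).
var contacts = from c in svcContext.ContactSet
               where c.Address1_City == "Redmond"
                     && (c.FirstName == "Joe" || c.FirstName == "John")
               select new { c.FirstName, c.LastName };

foreach (var contact in contacts)
{
    Console.WriteLine($"{contact.FirstName} {contact.LastName}");
}
```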
## Example
[SampleProgram.cs](https://github.com/microsoft/PowerApps-Samples/blob/master/cds/orgsvc/C%23/QueriesUsingLINQ/CreateQuery/SampleProgram.cs)
### See also
[Build Queries with LINQ (.NET Language-Integrated Query)](build-queries-with-linq-net-language-integrated-query.md)
[Sample: Complex LINQ Queries](sample-complex-linq-queries.md)
<xref:Microsoft.Xrm.Sdk.Client.OrganizationServiceContext>
[!INCLUDE[footer-include](../../../../includes/footer-banner.md)]

*Source file: docs/error-messages/tool-errors/resource-compiler-error-rw2002.md (TimTalerJr/cpp-docs.de-de, CC-BY-4.0, MIT)*
title: 'Ressourcencompiler: Fehler RW2002'
ms.date: 11/04/2016
f1_keywords:
- RW2002
helpviewer_keywords:
- RW2002
ms.assetid: b1d1a49b-b50b-4b0b-9f09-c7762e2dbe8f
ms.openlocfilehash: 1726e6ce74dfd7b6b0c6e4b69771a826cdf07774
ms.sourcegitcommit: 389c559918d9bfaf303d262ee5430d787a662e92
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 09/24/2019
ms.locfileid: "71230409"
---
# <a name="resource-compiler-error-rw2002"></a>Resource compiler error RW2002
parse error
### <a name="to-fix-by-checking-the-following-possible-causes"></a>To fix, check the following possible causes
1. **Accelerator type required (ASCII or VIRTKEY)**
The `type` field of the **ACCELERATORS** statement must contain either the ASCII or the VIRTKEY value.
1. **BEGIN expected in accelerator table**
The **BEGIN** keyword must immediately follow the **ACCELERATORS** keyword.
1. **BEGIN expected in dialog**
The **BEGIN** keyword must immediately follow the **DIALOG** keyword.
1. **BEGIN expected in menu**
The **BEGIN** keyword must immediately follow the **MENU** keyword.
1. **BEGIN expected in RCData**
The **BEGIN** keyword must immediately follow the **RCDATA** keyword.
1. **BEGIN keyword expected in string table**
The **BEGIN** keyword must immediately follow the **STRINGTABLE** keyword.
1. **String constants cannot be reused**
The same value is used twice in a **STRINGTABLE** statement. Make sure you are not mixing overlapping decimal and hexadecimal values. Each ID in a **STRINGTABLE** must be unique. For maximum efficiency, use contiguous constants that start at a multiple of 16.
1. **Control character out of range [^A-^Z]**
A control character in the **ACCELERATORS** statement is invalid. The character following the caret (**^**) must be between A and Z, inclusive.
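As an illustrative sketch tying the accelerator-related causes together (the table name `MyAccelerators` and the identifiers `IDM_COPY` and `IDM_HELP` are invented for the example and would normally come from `#define` statements in a header), a well-formed **ACCELERATORS** resource looks like this:

```
MyAccelerators ACCELERATORS
BEGIN
    "^C",  IDM_COPY, ASCII    // control character: caret plus a letter A-Z
    VK_F1, IDM_HELP, VIRTKEY  // virtual key, so the type must be VIRTKEY
END
```

Note that **BEGIN** immediately follows the **ACCELERATORS** keyword, each entry separates its fields with commas, and every entry names a type of ASCII or VIRTKEY.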
1. **Empty menus not allowed**
An **END** keyword appears before any menu items are defined in the **MENU** statement. The Resource Compiler does not allow empty menus. Make sure there are no unmatched opening quotation marks in the **MENU** statement.
1. **END expected in dialog**
The **END** keyword must occur at the end of a **DIALOG** statement. Make sure there are no open quotation marks left over from the preceding statement.
1. **END expected in menu**
The **END** keyword must occur at the end of a **MENU** statement. Make sure there are no opening quotation marks or mismatched pairs of **BEGIN** and **END** statements.
1. **Comma expected in accelerator table**
The Resource Compiler requires a comma between the `event` field and the *idvalue* field in the **ACCELERATORS** statement.
1. **Control class name expected**
The `class` field of a **CONTROL** statement in the **DIALOG** statement must be one of the following types: BUTTON, COMBOBOX, EDIT, LISTBOX, SCROLLBAR, STATIC, or user-defined. Make sure the class is spelled correctly.
1. **Font face name expected**
The *typeface* field of the FONT option in the **DIALOG** statement must be an ASCII character string enclosed in double quotation marks. This field specifies the name of a font.
1. **ID value expected for menu item**
The **MENU** statement must have a *menuID* field specifying the name or number that identifies the menu resource.
1. **Menu string expected**
Each **MENUITEM** and **POPUP** statement must contain a *text* field, which is a string in double quotation marks specifying the name of the menu item or pop-up menu. A **MENUITEM SEPARATOR** statement does not require a quoted string.
1. **Numeric command value expected**
The Resource Compiler expected a numeric *idvalue* field in the **ACCELERATORS** statement. Make sure you used a `#define` constant to specify the value and that the constant is spelled correctly.
1. **Numeric constant expected in string table**
A numeric constant defined in a `#define` statement must immediately follow the **BEGIN** keyword in a **STRINGTABLE** statement.
1. **Numeric point size expected**
The *pointsize* field of the FONT option in the **DIALOG** statement must be an integer point-size value.
1. **Numeric dialog constant expected**
A **DIALOG** statement requires integer values for the *x, y, width*, and *height* fields. Make sure these values are present after the **DIALOG** keyword and that they are not negative.
1. **String expected in STRINGTABLE**
A string is expected after each *stringid* value in a **STRINGTABLE** statement.
1. **String or constant accelerator command expected**
The Resource Compiler could not determine what kind of key is being set up for the accelerator. The `event` field in the **ACCELERATORS** statement may be invalid.
1. **Number expected for the ID**
A number is expected for the `id` field of a control statement in the **DIALOG** statement. Make sure you have a number or a `#define` statement for the control ID.
1. **Quoted string expected in dialog class**
The `class` field of the CLASS option in the **DIALOG** statement must contain an integer or a string enclosed in double quotation marks.
1. **Quoted string expected in dialog title**
The `captiontext` field of the CAPTION option in the **DIALOG** statement must be an ASCII character string enclosed in double quotation marks.
1. **File not found: filename**
The file specified on the Resource Compiler command line was not found. Check whether the file has been moved to another directory and whether the file name and path were entered correctly. Files are located by using the **INCLUDE** environment variable or the Visual Studio setting, if available.
1. **Font names must be ordinals**
The *pointsize* field in the FONT statement must be an integer, not a string.
1. **Invalid accelerator**
An `event` field in the **ACCELERATORS** statement was not recognized or is more than two characters long.
1. **Invalid accelerator type (ASCII or VIRTKEY)**
The `type` field of the **ACCELERATORS** statement must contain either the ASCII or the VIRTKEY value.
1. **Invalid control character**
A control character in the **ACCELERATORS** statement is invalid. A valid control character consists of one letter (only) following a caret (^).
1. **Invalid control type**
Each control statement in a **DIALOG** statement must be one of the following: CHECKBOX, COMBOBOX, CONTROL, CTEXT, DEFPUSHBUTTON, EDITTEXT, GROUPBOX, ICON, LISTBOX, LTEXT, PUSHBUTTON, RADIOBUTTON, RTEXT, SCROLLBAR. Make sure these control statements are spelled correctly.
1. **Invalid type**
The resource type was not among the types defined in the WINDOWS.H file.
1. **Text string or ordinal expected in control**
The *text* field of a **CONTROL** statement in the **DIALOG** statement must be either a text string or an ordinal reference to the type of control. If you use an ordinal, make sure you have a `#define` statement for the control.
1. **Mismatched parentheses**
Make sure you have closed every open parenthesis in the **DIALOG** statement.
1. **Unexpected value in RCData**
The *raw-data* values in the **RCDATA** statement must be integers or strings, each separated by a comma. Make sure you have not left out a comma or a quotation mark around a string.
1. **Unknown menu subtype.**
The *item-definition* field of the **MENU** statement can contain only **MENUITEM** and **POPUP** statements.

*Source file: README.md (alsimoes/pyHE, CNRI-Python)*
Aplicativo multiplataforma para cálculo de horas extras para banco de horas. Desenvolvido em [Python](http://www.python.org/) com o *framework* [PySide](http://qt-project.org/wiki/PySide) que utiliza a biblioteca [Qt](http://qt.digia.com/) para design da *interface* gráfica. É compatível com os sistemas Windows, Linux e OSX.
#### Características
* Armazena marcações de horários.
* Calcula total de horas trabalhadas no mês.
* Parametrização de configuração de horas contratadas.
* Organização de meses e anos em uma *TreeView*.
* Exportação de dados consolidados do mês para conferência.
* Apresentação da consolidação do mês selecionada na tela principal.
* Marcação rápida do horário atual.
* Importação de arquivo CSV.

----
#### Informações e Copyright©
Autor: André Simões ([GitHub](https://github.com/alsimoes), [Twitter] (http://twitter.com/alsimoes))
Linguagem: [Python](http://www.python.org/)
Frameworks e Bibliotecas:
* [Qt](http://qt.digia.com/)
* [PySide](http://qt-project.org/wiki/PySide)
Licença: [GPLv3](http://www.gnu.org/licenses/gpl-3.0.html)
Plataformas: Windows, Linux e OSX.
----
## Atualizações
#### [Waffle.io](http://waffle.io/alsimoes/pyHE?label=ready "Kanban board") → [](http://waffle.io/alsimoes/pyHE)
#### 0.01 (alpha)
* Design da interface principal;
* Geração de arquivo de *log*;
| 38.948718 | 326 | 0.736669 | por_Latn | 0.965006 |
*Source file: CHANGELOG.md (hemberger/serializable-closure, MIT)*
## [Unreleased](https://github.com/laravel/serializable-closure/compare/v1.0.5...master)
## [v1.0.5 (2020-11-30)](https://github.com/laravel/serializable-closure/compare/v1.0.4...v1.0.5)
### Fixed
- Fixes serialisation of closures with named arguments code ([#29](https://github.com/laravel/serializable-closure/pull/29))
## [v1.0.4 (2020-11-16)](https://github.com/laravel/serializable-closure/compare/v1.0.3...v1.0.4)
### Fixed
- Fixes the serialization of Enum objects ([#28](https://github.com/laravel/serializable-closure/pull/28))
## [v1.0.3 (2020-10-07)](https://github.com/laravel/serializable-closure/compare/v1.0.2...v1.0.3)
### Fixed
- Possible stream protocol collision with `opis/closure` ([#23](https://github.com/laravel/serializable-closure/pull/23))
## [v1.0.2 (2020-09-29)](https://github.com/laravel/serializable-closure/compare/v1.0.1...v1.0.2)
### Fixed
- Fixes serialization of closures that got rebound ([#19](https://github.com/laravel/serializable-closure/pull/19))
## [v1.0.1 (2020-09-29)](https://github.com/laravel/serializable-closure/compare/v1.0.0...v1.0.1)
### Fixed
- Fixes null safe operator with properties ([#16](https://github.com/laravel/serializable-closure/pull/16))
## v1.0.0 (2020-09-14)
Initial release
| 32.846154 | 124 | 0.711944 | eng_Latn | 0.23869 |
*Source file: packages/nauth0/CHANGELOG.md (jamiedavenport/nauth0, MIT)*
All notable changes to this project will be documented in this file.
See [Conventional Commits](https://conventionalcommits.org) for commit guidelines.
## [0.8.1-canary.0](https://github.com/jamiedavenport/nauth0/compare/v0.8.0-canary.0...v0.8.1-canary.0) (2020-12-21)
**Note:** Version bump only for package nauth0
# [0.8.0-canary.0](https://github.com/jamiedavenport/nauth0/compare/v0.7.0-canary.0...v0.8.0-canary.0) (2020-12-21)
**Note:** Version bump only for package nauth0
# [0.7.0-canary.0](https://github.com/jamiedavenport/nauth0/compare/v0.6.0-canary.0...v0.7.0-canary.0) (2020-12-21)
### Features
- Refresh Tokens ([#37](https://github.com/jamiedavenport/nauth0/issues/37)) ([6244edc](https://github.com/jamiedavenport/nauth0/commit/6244edc48d2546c02cfd888bdb61d80a5c6a247c))
# [0.6.0-canary.0](https://github.com/jamiedavenport/nauth0/compare/v0.5.0-canary.0...v0.6.0-canary.0) (2020-12-21)
### Features
- Adds optional login redirect uri to options and redirects to it on callback if it exists. ([#40](https://github.com/jamiedavenport/nauth0/issues/40)) ([b1309d5](https://github.com/jamiedavenport/nauth0/commit/b1309d5a666fff606ee2e6b7294f38dd7de9e1aa))
# [0.5.0-canary.0](https://github.com/jamiedavenport/nauth0/compare/v0.4.1-canary.0...v0.5.0-canary.0) (2020-12-18)
### Features
- Logout route ([#34](https://github.com/jamiedavenport/nauth0/issues/34)) ([a88a46a](https://github.com/jamiedavenport/nauth0/commit/a88a46a86634ef98957cc1709d1ed6f9e9f074a5))
## [0.4.1-canary.0](https://github.com/jamiedavenport/nauth0/compare/v0.4.0-canary.0...v0.4.1-canary.0) (2020-12-17)
### Bug Fixes
- Remove baseUrl and rewrite imports as relative ([#33](https://github.com/jamiedavenport/nauth0/issues/33)) ([fc8a88a](https://github.com/jamiedavenport/nauth0/commit/fc8a88ae4ae303d24770efaf10e686f7898de29d))
# [0.4.0-canary.0](https://github.com/jamiedavenport/nauth0/compare/v0.3.3-canary.0...v0.4.0-canary.0) (2020-12-15)
### Features
- getSession method on server ([#29](https://github.com/jamiedavenport/nauth0/issues/29)) ([240a9f0](https://github.com/jamiedavenport/nauth0/commit/240a9f04c0616379e7357c4a49ca2a4041e1e059))
## [0.3.3-canary.0](https://github.com/jamiedavenport/nauth0/compare/v0.3.2-canary.0...v0.3.3-canary.0) (2020-12-14)
**Note:** Version bump only for package nauth0
## [0.3.2-canary.0](https://github.com/jamiedavenport/nauth0/compare/v0.3.1-canary.0...v0.3.2-canary.0) (2020-12-14)
**Note:** Version bump only for package nauth0
## [0.3.1-canary.0](https://github.com/jamiedavenport/nauth0/compare/v0.3.0-canary.0...v0.3.1-canary.0) (2020-12-14)
**Note:** Version bump only for package nauth0
# [0.3.0-canary.0](https://github.com/jamiedavenport/nauth0/compare/v0.2.0-canary.0...v0.3.0-canary.0) (2020-12-14)
### Features
- Add CI ([#10](https://github.com/jamiedavenport/nauth0/issues/10)) ([2aa7a7f](https://github.com/jamiedavenport/nauth0/commit/2aa7a7fcbbdfa2d0c50ef1d922e69ee0fe04f7dd))
- Example Application ([#7](https://github.com/jamiedavenport/nauth0/issues/7)) ([e1fe055](https://github.com/jamiedavenport/nauth0/commit/e1fe05593da8768b6fe03e6b4b02a01bfbaba585))
- Implement Callback route ([#19](https://github.com/jamiedavenport/nauth0/issues/19)) ([44664d4](https://github.com/jamiedavenport/nauth0/commit/44664d466c9a30ae5d97a7433d2520044adee9a8))
- Implement session route ([#21](https://github.com/jamiedavenport/nauth0/issues/21)) ([8f06d62](https://github.com/jamiedavenport/nauth0/commit/8f06d623b018e073e4e9a8a117a1e25b1310a4d9))
- Implement useSession hook ([#22](https://github.com/jamiedavenport/nauth0/issues/22)) ([e810f02](https://github.com/jamiedavenport/nauth0/commit/e810f0258256a9eb54b3f70d8a733a42302171e2))
- Login Route ([#17](https://github.com/jamiedavenport/nauth0/issues/17)) ([f1b6c18](https://github.com/jamiedavenport/nauth0/commit/f1b6c18b238b8ac08293b0c793a9d19c53a26415))
# [0.2.0-canary.0](https://github.com/jamiedavenport/nauth0/compare/v0.1.1-canary.1...v0.2.0-canary.0) (2020-12-03)
### Features
- test change ([444f3c8](https://github.com/jamiedavenport/nauth0/commit/444f3c8de100a140031b85fbb6dd38932a77a3ea))
## [0.1.1-canary.1](https://github.com/jamiedavenport/nauth0/compare/v0.1.1-canary.0...v0.1.1-canary.1) (2020-12-03)
**Note:** Version bump only for package nauth0
## [0.1.1-canary.0](https://github.com/jamiedavenport/nauth0/compare/v0.1.1-alpha.0...v0.1.1-canary.0) (2020-12-03)
**Note:** Version bump only for package nauth0
*Source file: Source/Tools/HistorianPlaybackUtility/README.md (SRM256/gsf, MIT)*
The openHistorian Playback Utility is used to stream and/or export data from a v1.0 openHistorian archive and is included with GPA synchrophasor products, e.g., the openPDC. The tool is used to export selected signals from a given time-range to a CSV file using a custom format and/or stream the data to a socket for replay purposes.
| 108.25 | 333 | 0.810624 | eng_Latn | 0.991336 |
d02ca321e37fe1151ec96c4b4bf27880634a13f9 | 560 | md | Markdown | reports/research/2022.4.3 research-hrx.md | OSH-2022/x-realism | 94a7bddf05d9fc079af81e0052043e1cffd363dc | [
"MIT"
] | 3 | 2022-03-20T13:15:34.000Z | 2022-03-20T13:16:05.000Z | reports/research/2022.4.3 research-hrx.md | OSH-2022/x-realism | 94a7bddf05d9fc079af81e0052043e1cffd363dc | [
"MIT"
] | null | null | null | reports/research/2022.4.3 research-hrx.md | OSH-2022/x-realism | 94a7bddf05d9fc079af81e0052043e1cffd363dc | [
"MIT"
] | null | null | null | # 完成一个操作系统需要哪些工作
## 操作系统做了什么事
管理计算机的资源;为用户/应用使用提供一层抽象。
### 从管理计算机的资源看
- 进程调度(资源分配)
- I/O操作
- 文件系统操作
- 通信
- 错误检测
- 保护机制
### 从提供用户使用环境看
- 命令行界面、命令解释程序
- 提供给用户调用系统调用的接口

## Planning how to write the operating system
0. Be clear about who our operating system serves
1. Plan the system's features (combining Rust language strengths with the needs of current research and market directions)
2. For each module and subsystem, decide which algorithms and models to use (group research; we can also draw on new directions proposed by current systems research)
In short, through it all: make a solid plan and carry it out concretely; don't treat the operating system as a castle in the air.
## References
"Operating System Concepts", Abraham Silberschatz
[rCore-tutorial-book](https://rcore-os.github.io/)
[Building my own operating system: a wrap-up](https://zhuanlan.zhihu.com/p/149465851)
*Source file: articles/mysql/connect-python.md (nsrau/azure-docs.it-it, CC-BY-4.0, MIT)*
title: 'Avvio rapido: Connettersi con Python - Database di Azure per MySQL'
description: Questa guida introduttiva fornisce diversi esempi di codice Python che è possibile usare per connettersi ai dati ed eseguire query da Database di Azure per MySQL.
author: savjani
ms.author: pariks
ms.service: mysql
ms.custom:
- mvc
- seo-python-october2019
- devx-track-python
ms.devlang: python
ms.topic: quickstart
ms.date: 10/28/2020
ms.openlocfilehash: 8aa0ea4b1e01cc7363f49d5897695c7c237b339b
ms.sourcegitcommit: 6ab718e1be2767db2605eeebe974ee9e2c07022b
ms.translationtype: HT
ms.contentlocale: it-IT
ms.lasthandoff: 11/12/2020
ms.locfileid: "94535589"
---
# <a name="quickstart-use-python-to-connect-and-query-data-in-azure-database-for-mysql"></a>Quickstart: Use Python to connect and query data in Azure Database for MySQL
In this quickstart, you connect to an Azure Database for MySQL instance by using Python. You can then use SQL statements to query, insert, update, and delete data in the database from Mac, Ubuntu Linux, and Windows platforms.
## <a name="prerequisites"></a>Prerequisites
For this quickstart you need:
- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free).
- An Azure Database for MySQL single server, created by using the [Azure portal](./quickstart-create-mysql-server-database-using-azure-portal.md) <br/> or the [Azure CLI](./quickstart-create-mysql-server-database-using-azure-cli.md).
- Based on whether you are using public or private access, complete **ONE** of the following actions to enable connectivity.
|Action| Connectivity method|How-to guide|
|:--------- |:--------- |:--------- |
| **Configure firewall rules** | Public | [Portal](./howto-manage-firewall-using-portal.md) <br/> [CLI](./howto-manage-firewall-using-cli.md)|
| **Configure the service endpoint** | Public | [Portal](./howto-manage-vnet-using-portal.md) <br/> [CLI](./howto-manage-vnet-using-cli.md)|
| **Configure private link** | Private | [Portal](./howto-configure-privatelink-portal.md) <br/> [CLI](./howto-configure-privatelink-cli.md) |
- [Create a database and a non-admin user](./howto-create-users.md)
## <a name="install-python-and-the-mysql-connector"></a>Install Python and the MySQL connector
Install Python and the MySQL connector for Python on your computer by using the following steps:
> [!NOTE]
> This quickstart uses the [MySQL Connector/Python Developer Guide](https://dev.mysql.com/doc/connector-python/en/).
1. Download and install [Python 3.7 or above](https://www.python.org/downloads/) for your operating system. Make sure to add Python to your `PATH`, because the MySQL connector requires that.
2. Open a command prompt or `bash` shell, and check your Python version by running `python -V` with the uppercase V switch.
3. The `pip` package installer is included in the latest versions of Python. Update `pip` to the latest version by running `pip install -U pip`.
   If `pip` isn't installed, you can download and install it with `get-pip.py`. For more information, see [Installation](https://pip.pypa.io/en/stable/installing/).
4. Use `pip` to install the MySQL connector for Python and its dependencies:
```bash
pip install mysql-connector-python
```
[Having issues? Let us know](https://aka.ms/mysql-doc-feedback)
## <a name="get-connection-information"></a>Get connection information
Get the connection information you need to connect to the Azure Database for MySQL from the Azure portal. You need the server name, database name, and login credentials.
1. Sign in to the [Azure portal](https://portal.azure.com/).
1. In the portal search bar, search for and select the Azure Database for MySQL server you created, such as **mydemoserver**.
   :::image type="content" source="./media/connect-python/1_server-overview-name-login.png" alt-text="Azure Database for MySQL server name":::
1. From the server's **Overview** page, make a note of the **Server name** and **Server admin login name**. If you forget your password, you can also reset it from this page.
   :::image type="content" source="./media/connect-python/azure-database-for-mysql-server-overview-name-login.png" alt-text="Azure Database for MySQL server name 2":::
## <a name="step-1-create-a-table-and-insert-data"></a>Step 1: Create a table and insert data
Use the following code to connect to the server and database, create a table, and load data by using an **INSERT** SQL statement. The code imports the mysql.connector library and uses the:
- [connect()](https://dev.mysql.com/doc/connector-python/en/connector-python-api-mysql-connector-connect.html) function to connect to Azure Database for MySQL using the [arguments](https://dev.mysql.com/doc/connector-python/en/connector-python-connectargs.html) in the config collection.
- [cursor.execute()](https://dev.mysql.com/doc/connector-python/en/connector-python-api-mysqlcursor-execute.html) method to run the SQL query against the MySQL database.
- [cursor.close()](https://dev.mysql.com/doc/connector-python/en/connector-python-api-mysqlcursor-close.html) method when you are done using a cursor.
- [conn.close()](https://dev.mysql.com/doc/connector-python/en/connector-python-api-mysqlconnection-close.html) method to close the connection.
> [!IMPORTANT]
> - SSL is enabled by default. You may need to download the [DigiCertGlobalRootG2 SSL certificate](https://cacerts.digicert.com/DigiCertGlobalRootG2.crt.pem) to connect from your local environment.
> - Replace the `<mydemoserver>`, `<myadmin>`, `<mypassword>`, and `<mydatabase>` placeholders with the values for your own MySQL server and database.
```python
import mysql.connector
from mysql.connector import errorcode
from mysql.connector.constants import ClientFlag  # required for the 'client_flags' entry below
# Obtain connection string information from the portal
config = {
'host':'<mydemoserver>.mysql.database.azure.com',
'user':'<myadmin>@<mydemoserver>',
'password':'<mypassword>',
'database':'<mydatabase>',
'client_flags': [ClientFlag.SSL],
'ssl_cert': '/var/wwww/html/DigiCertGlobalRootG2.crt.pem'
}
# Construct connection string
try:
conn = mysql.connector.connect(**config)
print("Connection established")
except mysql.connector.Error as err:
if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
print("Something is wrong with the user name or password")
elif err.errno == errorcode.ER_BAD_DB_ERROR:
print("Database does not exist")
else:
print(err)
else:
cursor = conn.cursor()
# Drop previous table of same name if one exists
cursor.execute("DROP TABLE IF EXISTS inventory;")
print("Finished dropping table (if existed).")
# Create table
cursor.execute("CREATE TABLE inventory (id serial PRIMARY KEY, name VARCHAR(50), quantity INTEGER);")
print("Finished creating table.")
# Insert some data into table
cursor.execute("INSERT INTO inventory (name, quantity) VALUES (%s, %s);", ("banana", 150))
print("Inserted",cursor.rowcount,"row(s) of data.")
cursor.execute("INSERT INTO inventory (name, quantity) VALUES (%s, %s);", ("orange", 154))
print("Inserted",cursor.rowcount,"row(s) of data.")
cursor.execute("INSERT INTO inventory (name, quantity) VALUES (%s, %s);", ("apple", 100))
print("Inserted",cursor.rowcount,"row(s) of data.")
# Cleanup
conn.commit()
cursor.close()
conn.close()
print("Done.")
```
[Having issues? Let us know.](https://aka.ms/mysql-doc-feedback)
## <a name="step-2-read-data"></a>Passaggio 2: Leggere i dati
Usare il codice seguente per connettersi e leggere i dati usando un'istruzione SQL **SELECT**. Il codice importa la libreria mysql.connector e usa il metodo [cursor.execute()](https://dev.mysql.com/doc/connector-python/en/connector-python-api-mysqlcursor-execute.html) per eseguire la query SQL sul database MySQL.
Il codice legge le righe di dati usando il metodo [fetchall()](https://dev.mysql.com/doc/connector-python/en/connector-python-api-mysqlcursor-fetchall.html), mantiene il set di risultati in una riga della raccolta e usa un iteratore `for` per scorrere le righe in un ciclo.
```python
# Read data
cursor.execute("SELECT * FROM inventory;")
rows = cursor.fetchall()
print("Read",cursor.rowcount,"row(s) of data.")
# Print all rows
for row in rows:
print("Data row = (%s, %s, %s)" %(str(row[0]), str(row[1]), str(row[2])))
```
## <a name="step-3-update-data"></a>Passaggio 3: Aggiornare i dati
Usare il codice seguente per connettersi e aggiornare i dati usando un'istruzione SQL **UPDATE**. Il codice importa la libreria mysql.connector e usa il metodo [cursor.execute()](https://dev.mysql.com/doc/connector-python/en/connector-python-api-mysqlcursor-execute.html) per eseguire la query SQL sul database MySQL.
```python
# Update a data row in the table
cursor.execute("UPDATE inventory SET quantity = %s WHERE name = %s;", (200, "banana"))
print("Updated",cursor.rowcount,"row(s) of data.")
```
## <a name="step-4-delete-data"></a>Passaggio 4: Eliminare i dati
Usare il codice seguente per connettersi e rimuovere i dati usando un'istruzione SQL **DELETE**. Il codice importa la libreria mysql.connector e usa il metodo [cursor.execute()](https://dev.mysql.com/doc/connector-python/en/connector-python-api-mysqlcursor-execute.html) per eseguire la query SQL sul database MySQL.
```python
# Delete a data row in the table
cursor.execute("DELETE FROM inventory WHERE name=%(param1)s;", {'param1':"orange"})
print("Deleted",cursor.rowcount,"row(s) of data.")
```
## <a name="clean-up-resources"></a>Pulire le risorse
Per pulire tutte le risorse usate in questo argomento di avvio rapido, eliminare il gruppo di risorse con il comando seguente:
```azurecli
az group delete \
--name $AZ_RESOURCE_GROUP \
--yes
```
## <a name="next-steps"></a>Passaggi successivi
> [!div class="nextstepaction"]
> [Gestire il server di Database di Azure per MySQL usando il portale](./howto-create-manage-server-portal.md)<br/>
> [!div class="nextstepaction"]
> [Gestire il server di Database di Azure per MySQL usando l'interfaccia della riga di comando](./how-to-manage-single-server-cli.md)
[Ci sono problemi a trovare le informazioni giuste? Segnalarli](https://aka.ms/mysql-doc-feedback).
---
ms.openlocfilehash: 1f65939493cf423a76c024607264acee6c7e9050
ms.sourcegitcommit: ef08f376688f0191a8d3d873b6a4386afd799373
ms.translationtype: HT
ms.contentlocale: es-ES
ms.lasthandoff: 05/28/2019
ms.locfileid: "66271559"
---
#### <a name="windows"></a>Windows
> [!Note]
> NuGet.exe 5.0 and later require .NET Framework 4.7.2 or later to run.
1. Visit [nuget.org/downloads](https://nuget.org/downloads) and select NuGet 3.3 or higher (2.8.6 is not compatible with Mono). The latest version is always recommended, and 4.1.0+ is required to publish packages to nuget.org.
1. Each download is the `nuget.exe` file itself. Instruct your browser to save the file to a folder of your choice. The file is *not* an installer; you won't see anything if you run it directly from the browser.
1. Add the folder where you placed `nuget.exe` to your PATH environment variable to use the CLI tool from anywhere.
#### <a name="macoslinux"></a>macOS y Linux
Los comportamientos pueden variar ligeramente según la distribución del sistema operativo.
1. Instale [Mono 4.4.2 o versiones posteriores](http://www.mono-project.com/docs/getting-started/install/).
1. Ejecute los comandos siguientes en un símbolo del sistema del shell:
```bash
# Download the latest stable `nuget.exe` to `/usr/local/bin`
sudo curl -o /usr/local/bin/nuget.exe https://dist.nuget.org/win-x86-commandline/latest/nuget.exe
```
1. Create an alias by adding the following script to the appropriate file for your OS (typically `~/.bash_aliases` or `~/.bash_profile`):
```bash
# Create an alias for nuget
alias nuget="mono /usr/local/bin/nuget.exe"
```
1. Reload the shell. Test the installation by entering `nuget` with no parameters; NuGet CLI help should display.
# earthwiggle-scraper
Scrapes earthquake data from the [Philippine Institute of Volcanology and Seismology website](https://www.phivolcs.dost.gov.ph/).
Data is cached to a local SQLite database and is exposed as a simple public API. New events trigger a notification to [Slack](https://slack.com/) and [Discord](https://discordapp.com/).
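The cache-then-notify flow described above could be sketched like this in Python. This is purely illustrative; the repo's actual language, code, and webhook handling are not shown in this README, and the function names here are made up:

```python
# Hypothetical sketch of the notification step, not the repo's real code.
import json
import urllib.request


def build_payload(magnitude, location, when):
    """Format a quake event as a simple Slack/Discord-style message."""
    return {"text": f"M{magnitude} earthquake near {location} at {when}"}


def notify(webhook_url, payload):
    """POST the payload to an incoming-webhook URL (Slack or Discord)."""
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)
```

In a real scraper this would be called only for events not already present in the SQLite cache.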
## Related Repos
- [earthwiggle-web](https://github.com/earthwiggle/earthwiggle-web): Front-end code for [earthwiggle.com](https://earthwiggle.com)
## Slack Webhook Example

## Discord Webhook Example

---
layout: post
title: "Why I Ride"
date: 2021-04-02
---
In summer 2019, I bought a brand new, mid-range adventure bike. It had drop bars, hydraulic brakes, and a carbon fiber fork. It was the best bike I had ever owned, replacing a steel mountain bike from the 80's built by a long-bankrupt company that I took for free when a lost and found sale failed to get rid of it. That mountain bike was acquired my freshman year in college, making it to a year after my last, in 2019. That firetruck red mountain bike I converted into a singlespeed became the "bike zero" of my biking passion.
That summer, I "bikepacked" for the first time. Crossing the 140 mile Denali highway with (at the time) a friend over the course of 3 days. There was no previous experience that I had that could have prepared me for what I felt. The second day required covering 85 miles; The most I had biked before that was barely 50 miles, and on only flat pavement. The [Denali Highway](https://www.alaska.org/guide/denali-highway) is a mostly gravel, incredibly potholed, and endlessly hilly road that goes from above the treeline to Denali's foothills. Whatever it was I felt after that second day at 2 am after 85 miles and a couple of rainstorms and darkness, I fell in love.
At the beginning of 2020, I acquired a [Surly Pugsley frame](https://surlybikes.com/bikes/pugsley) with the intention of building it up into a complete bike. That endeavor took almost a year and a half (That thing itself deserves its own post. Eventually.) Later that summer, I biked 2 one hundred plus mile routes, raced down double track, and trudged through muddy spruce forests. Hailed on a few times. There was a crash or two, but luckily no stitches were needed. The next winter I finished the fatbike I was building, and rode in the cold.
Why do I ride? Besides the fact that biking probably saved me a few dozen dollars on fuel (and CO2 emissions) and the fact that when a human is on a bike, they are the most efficient animal on earth, besides all of that, it might be boiled down to the idiom "more miles, more smiles." The scenery is great, the feel of sending it down a hill with some buddies is like no other, and a beer or two at the end is undoubtedly enjoyable. It could be that simple, but it doesn't explain what makes it worth it to go through physical pain like that from biting hail or numbing cold or the inevitable breathlessness of climbing a hill.
Sometimes why I ride is pushing myself for the eternal question of "what is possible?" The answer to that question could be why I pushed through the hail and climbed all those hills. It could explain why I wanted to ride just under 130 miles in 12 hours.
Sometimes why I ride is control. Many people today probably could ride over 100 miles in a day, provided a decent bike, a good saddle, and an okay padded pair of pants. What prevents that is the thought that 100 miles is somehow unachievable. It takes control of fluid and food intake, cadence, exertion, and many other minutiae. Without control, it could be very easy for oneself to tire out too quickly or experience painful cramps. If I am able to control what I have the capability to do, then I can go farther than ever before.
Sometimes why I ride is spite. Not very long ago, I was exiting an abusive relationship. I had heard, more than a few times, those hurtful words from my former partner, "you won't do much without me." There are times when, for whatever reason, I just want to prove that wrong.
Why I ride is to heal and become a better, stronger version of myself. There will be a day where I can ride without a thought of spite; I will have healed to the point where that is no longer how I feel. I ride to push myself to become stronger, both within the physical and the mindful. I ride to do my part to reduce CO2 emissions and to enjoy a beer. The story of hike-a-bike for several miles through mud is a good one to tell people. Sometimes, why I ride is a mix of everything.
What else would drive a person to bike 100 miles, anyway?
---
title: Creare un ruolo del server | Microsoft Docs
ms.custom: ''
ms.date: 06/13/2017
ms.prod: sql-server-2014
ms.reviewer: ''
ms.technology: security
ms.topic: conceptual
f1_keywords:
- sql12.swb.serverrole.members.f1
- SQL12.SWB.SERVERROLE.GENERAL.F1
- sql12.swb.serverrole.memberships.f1
helpviewer_keywords:
- SERVER ROLE, creating
ms.assetid: 74f19992-8082-4ed7-92a1-04fe676ee82d
author: VanMSFT
ms.author: vanto
manager: craigg
ms.openlocfilehash: 22e08b5eb0bccc02303201b7fae46b55f1012fd8
ms.sourcegitcommit: f7fced330b64d6616aeb8766747295807c92dd41
ms.translationtype: MT
ms.contentlocale: it-IT
ms.lasthandoff: 04/23/2019
ms.locfileid: "63011966"
---
# <a name="create-a-server-role"></a>Creazione di un ruolo del server
In questo argomento viene descritto come creare un nuovo ruolo server in [!INCLUDE[ssCurrent](../../../includes/sscurrent-md.md)] tramite [!INCLUDE[ssManStudioFull](../../../includes/ssmanstudiofull-md.md)] o [!INCLUDE[tsql](../../../includes/tsql-md.md)].
**Contenuto dell'argomento**
- **Prima di iniziare:**
[Limitazioni e restrizioni](#Restrictions)
[Sicurezza](#Security)
- **Per creare un nuovo ruolo del server utilizzando:**
[SQL Server Management Studio](#SSMSProcedure)
[Transact-SQL](#TsqlProcedure)
## <a name="BeforeYouBegin"></a> Prima di iniziare
### <a name="Restrictions"></a> Limitazioni e restrizioni
Non è possibile concedere ai ruoli del server l'autorizzazione sulle entità a protezione diretta a livello di database. Per creare ruoli del database, vedere [CREATE ROLE (Transact-SQL)](/sql/t-sql/statements/create-role-transact-sql).
### <a name="Security"></a> Sicurezza
#### <a name="Permissions"></a> Autorizzazioni
- È richiesta l'autorizzazione CREATE SERVER ROLE o l'appartenenza al ruolo predefinito del server sysadmin.
- È anche richiesta l'autorizzazione IMPERSONATE in *server_principal* per gli account di accesso, l'autorizzazione ALTER per i ruoli del server usati come *server_principal*o l'appartenenza a un gruppo di Windows usato come server_principal.
- Se si utilizza l'opzione AUTHORIZATION per assegnare la proprietà del ruolo del server, sono necessarie anche le autorizzazioni seguenti:
- Per assegnare la proprietà di un ruolo del server a un altro account di accesso, è richiesta l'autorizzazione IMPERSONATE per tale account di accesso.
- Per assegnare la proprietà di un ruolo del server a un altro ruolo del server, è richiesta l'appartenenza al ruolo del server destinatario oppure l'autorizzazione ALTER per tale ruolo.
## <a name="SSMSProcedure"></a> Utilizzo di SQL Server Management Studio
#### <a name="to-create-a-new-server-role"></a>Per creare un nuovo ruolo del server
1. In Esplora oggetti espandere il server in cui si desidera creare il nuovo ruolo del server.
2. Espandere la cartella **Sicurezza** .
3. Fare clic con il pulsante destro del mouse sulla cartella **Ruoli server** e selezionare **Nuovo ruolo server**.
4. Nel **nuovo ruolo del Server -**_nome_ruolo_server_ finestra di dialogo il **generali** pagina, immettere un nome per il nuovo ruolo del server nel **nome ruolo del Server**finestra.
5. Nella casella **Proprietario** immettere il nome dell'entità del server proprietaria del nuovo ruolo. In alternativa, fare clic sui puntini di sospensione **(...)** per aprire la finestra di dialogo **Seleziona account di accesso o ruolo server**.
6. In **Entità a protezione diretta**selezionare una o più entità a protezione diretta a livello di server. Quando è selezionata un'entità a protezione diretta, al ruolo del server è possibile concedere o negare autorizzazioni per tale entità.
7. Nel **autorizzazioni: Esplicita** , selezionare la casella di controllo per concedere, concedere con concessione o negare autorizzazioni al ruolo del server per l'entità a protezione diretta selezionate. Se un'autorizzazione non può essere concessa o negata a tutte le entità a protezione diretta selezionate, l'autorizzazione viene rappresentata come selezione parziale.
8. Nella pagina **Membri** utilizzare il pulsante **Aggiungi** per aggiungere account di accesso che rappresentano singoli utenti o gruppi al nuovo ruolo del server.
9. Un ruolo del server definito dall'utente può essere membro di un altro ruolo del server. Nella pagina **Appartenenze** selezionare una casella di controllo per impostare il ruolo del server definito dall'utente corrente come membro di un ruolo del server selezionato.
10. [!INCLUDE[clickOK](../../../includes/clickok-md.md)]
## <a name="TsqlProcedure"></a> Utilizzo di Transact-SQL
#### <a name="to-create-a-new-server-role"></a>Per creare un nuovo ruolo del server
1. In **Esplora oggetti**connettersi a un'istanza del [!INCLUDE[ssDE](../../../includes/ssde-md.md)].
2. Sulla barra Standard fare clic su **Nuova query**.
3. Copiare e incollare l'esempio seguente nella finestra Query, quindi fare clic su **Esegui**.
```
--Creates the server role auditors that is owned the securityadmin fixed server role.
USE master;
CREATE SERVER ROLE auditors AUTHORIZATION securityadmin;
GO
```
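The **Members** page in the SSMS steps above also has a Transact-SQL counterpart, `ALTER SERVER ROLE ... ADD MEMBER`. A minimal sketch that builds on the example above; the login name `[MyLogin]` is a placeholder, not part of the original topic:

```sql
--Adds a login as a member of the auditors role created above.
--[MyLogin] is a hypothetical login name.
USE master;
ALTER SERVER ROLE auditors ADD MEMBER [MyLogin];
GO
```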
For more information, see [CREATE SERVER ROLE (Transact-SQL)](/sql/t-sql/statements/create-server-role-transact-sql).
## django-gulp
`django-gulp` overrides `./manage.py runserver` and `./manage.py collectstatic`
so that they also run your gulp tasks.
I've used this in conjunction with watchify and livereload in gulp, so that my
simple unadorned runserver automatically watches and compiles new JavaScript
files with browserify and live reloads new CSS that's been automatically
compiled from SASS.
### Installation
Add `"django_gulp"` to your `INSTALLED_APPS` setting like this, making sure
that it comes before `django.contrib.staticfiles` (or other apps that override
`runserver` or `collectstatic` in the list if they're listed):
```
INSTALLED_APPS = (
'django_gulp',
...
)
```
Now when you run `./manage.py runserver` or `./manage.py collectstatic` your
`gulp` tasks will run as well!
### Settings
`GULP_CWD` defaults to the current working directory. Override it if your `gulpfile.js` does not reside within the Django project's toplevel directory.
`GULP_PRODUCTION_COMMAND` defaults to `gulp build --production`.
`GULP_DEVELOP_COMMAND` defaults to `gulp`. Note that when specifying this setting manually, `GULP_CWD` is ignored.
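A hedged example of overriding these defaults in a Django `settings.py`; the values shown (the `assets` directory and the `gulp watch` task) are illustrative assumptions, not requirements:

```python
# settings.py -- illustrative django-gulp overrides.

# Directory containing gulpfile.js, relative to where manage.py runs.
GULP_CWD = "assets"

# Command run by collectstatic (production build).
GULP_PRODUCTION_COMMAND = "gulp build --production"

# Command run by runserver; GULP_CWD is ignored when this is set.
GULP_DEVELOP_COMMAND = "gulp watch"
```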
### Heroku
`django-gulp` works on Heroku! You'll just need to use buildpack-multi and make
sure your `.buildpacks` file looks like this:
```
https://github.com/heroku/heroku-buildpack-nodejs.git
https://github.com/heroku/heroku-buildpack-python.git
```
To use buildpack-multi set your configuration like so:
```
$ heroku config:set BUILDPACK_URL=https://github.com/heroku/heroku-buildpack-multi.git
```
### Example output
```sh
$ ./manage.py runserver
>>> Starting gulp
>>> gulp process on pid 47863
Performing system checks...
System check identified no issues.
May 04, 2015 - 18:27:52
Django version 1.8.1, using settings 'example.settings'
Starting development server at http://127.0.0.1:8000/
Quit the server with CONTROL-C.
[18:27:53] Using gulpfile ~/p/example/gulpfile.js
[18:27:53] Starting 'bower-install'...
[18:27:54] Using cwd: /Users/beau/p/example
[18:27:54] Using bower dir: static/vendor
[18:27:54] Starting 'sass'...
[18:27:54] Starting 'watch'...
[18:27:54] Finished 'watch' after 19 ms
[18:27:54] Starting 'watchify'...
[18:28:08] Watching files required by bundle-about.js
[18:28:08] Bundling bundle-about.js...
[18:28:08] Watching files required by bundle-accounts-login.js
[18:28:08] Bundling bundle-accounts-login.js...
[18:28:08] Watching files required by bundle-accounts-signup.js
[18:28:08] Bundling bundle-accounts-signup.js...
[18:28:08] Watching files required by bundle-activities.js
[18:28:08] Bundling bundle-activities.js...
[18:28:08] Finished 'watchify' after 14 s
[18:28:09] Finished 'sass' after 15 s
^C>>> Closing gulp process
```
```sh
$ ./manage.py collectstatic
[18:32:54] Using gulpfile ~/p/example/gulpfile.js
[18:32:54] Starting 'bower-install'...
[18:32:55] Using cwd: /Users/beau/p/example
[18:32:55] Using bower dir: static/vendor
[18:32:55] Starting 'sass'...
[18:32:56] Starting 'browserify'...
[18:33:05] Bundling bundle-about.js...
[18:33:05] Bundling bundle-accounts-login.js...
[18:33:05] Bundling bundle-accounts-signup.js...
[18:33:05] Bundling bundle-activities.js...
[18:33:05] Finished 'browserify' after 9.39 s
[18:33:08] Finished 'sass' after 13 s
[18:33:14] Finished 'bower-install' after 19 s
[18:33:14] Starting 'bower-main-files'...
[18:33:14] Starting 'bower-detritus'...
[18:33:14] Finished 'bower-main-files' after 104 ms
[18:33:14] Finished 'bower-detritus' after 507 ms
[18:33:14] Starting 'bower'...
[18:33:14] Finished 'bower' after 18 μs
[18:33:14] Starting 'build'...
[18:33:14] Finished 'build' after 5 μs
You have requested to collect static files at the destination
location as specified in your settings:
/Users/beau/p/example/static-files
This will overwrite existing files!
Are you sure you want to do this?
Type 'yes' to continue, or 'no' to cancel: yes
Copying '/Users/beau/p/example/build/js/bundle-about.js'
Copying '/Users/beau/p/example/build/js/bundle-about.map.json'
Copying '/Users/beau/p/example/build/js/bundle-accounts-login.js'
Copying '/Users/beau/p/example/build/js/bundle-accounts-login.map.json'
Copying '/Users/beau/p/example/build/js/bundle-accounts-signup.js'
Copying '/Users/beau/p/example/build/js/bundle-accounts-signup.map.json'
Copying '/Users/beau/p/example/build/js/bundle-activities.js'
Copying '/Users/beau/p/example/build/js/bundle-activities.map.json'
```
---
title: Hinzufügen von Sprachanalysetools zu Zeichenfolgenfeldern
titleSuffix: Azure Cognitive Search
description: Mehrsprachige lexikalische Textanalyse für nicht englischsprachige Abfragen und Indizes in der kognitiven Azure-Suche.
manager: nitinme
author: Yahnoosh
ms.author: jlembicz
ms.service: cognitive-search
ms.topic: conceptual
ms.date: 12/10/2019
translation.priority.mt:
- de-de
- es-es
- fr-fr
- it-it
- ja-jp
- ko-kr
- pt-br
- ru-ru
- zh-cn
- zh-tw
ms.openlocfilehash: a97bee27b74aa211b4d4d56547726555edefa87a
ms.sourcegitcommit: 163be411e7cd9c79da3a3b38ac3e0af48d551182
ms.translationtype: HT
ms.contentlocale: de-DE
ms.lasthandoff: 02/21/2020
ms.locfileid: "77539398"
---
# <a name="add-language-analyzers-to-string-fields-in-an-azure-cognitive-search-index"></a>Hinzufügen von Sprachanalysetools zu Zeichenfolgenfeldern in einem Azure Cognitive Search-Index
Ein *Sprachanalysetool* ist eine bestimmte Art einer [Textanalyse](search-analyzers.md), die eine lexikalische Analyse mithilfe der linguistischen Regeln der Zielsprache durchführt. Jedes durchsuchbare Feld verfügt über die Eigenschaft **analyzer**. Wenn Ihr Index übersetzte Zeichenfolgen enthält, wie z.B. separate Felder für englischen und chinesischen Text, können Sie für jedes Feld Sprachanalysetools angeben, um auf die umfangreichen linguistischen Funktionen dieser Sprachanalysetools zuzugreifen.
Die kognitive Azure-Suche unterstützt 35 Analysetools auf Basis von Lucene und 50 Analysetools, die von Microsoft-Technologien zur Verarbeitung natürlicher Sprache unterstützt werden, die in Office und Bing zum Einsatz kommen.
## <a name="comparing-analyzers"></a>Vergleichen von Analysetools
Einige Entwickler bevorzugen möglicherweise die vertrautere, einfachere Open Source-Lösung von Lucene. Lucene-Sprachanalysetools sind schneller, während Microsoft-Sprachanalysetools erweiterte Funktionen bieten, wie z.B. Lemmatisierung, Wortzerlegung (in Sprachen wie Deutsch, Dänisch, Niederländisch, Schwedisch, Norwegisch, Estnisch, Finnisch, Ungarisch und Slowakisch) und Entitätserkennung (URLs, E-Mails, Datumsangaben und Zahlen). Vergleichen Sie nach Möglichkeit die Analyseprogramme von Microsoft und Lucene, um die für Ihre Anforderungen passendere Lösung zu ermitteln.
Die Indizierung mit Analyseprogrammen von Microsoft dauert je nach Sprache durchschnittlich zwei bis drei Mal länger als mit entsprechenden Analyseprogrammen von Lucene. Die Suchleistung sollte bei durchschnittlich großen Abfragen nicht wesentlich eingeschränkt sein.
### <a name="english-analyzers"></a>Analysetools für Englisch
Standardmäßig wird das Lucene-Standardanalysetool verwendet, das für Englisch gut funktioniert, vielleicht aber nicht ganz so gut wie das Analysetool für Englisch von Lucene oder das Analysetool für Englisch von Microsoft.
+ Das Analysetool für Englisch von Lucene ist beispielsweise eine Erweiterung des Standardanalysetools. Es entfernt Possessivformen (nachgestelltes -s) bei Wörtern, wendet gemäß dem Wortstammerkennungsalgorithmus von Porter die Wortstammerkennung an und entfernt englische Stoppwörter.
+ Das Analysetool für Englisch von Microsoft führt die Lemmatisierung anstelle der Wortstammerkennung durch. Dadurch können gebeugte und unregelmäßige Wortformen viel besser verarbeitet werden, was zu relevanteren Suchergebnissen führt.
## <a name="configuring-analyzers"></a>Konfigurieren von Analysetools
Sprachanalysen werden in der vorliegenden Form verwendet. Sie können für die Eigenschaft **analyzer** aller Felder in der Indexdefinition den Namen eines Analyseprogramms festlegen, das die Sprache und den Linguistikstapel (Microsoft oder Lucene) angibt. Bei der Indizierung dieses Felds und der Suche danach wird das gleiche Analyseprogramm verwendet. Beispiel: Sie können im selben Index separate Felder für englische, französische und spanische Hotelbeschreibungen verwenden.
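For instance, a fragment of an index definition that assigns analyzers to two hypothetical description fields might look like this (index and field names are illustrative, not from this article):

```json
{
  "name": "hotels",
  "fields": [
    { "name": "description_en", "type": "Edm.String",
      "searchable": true, "analyzer": "en.microsoft" },
    { "name": "description_fr", "type": "Edm.String",
      "searchable": true, "analyzer": "fr.lucene" }
  ]
}
```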
> [!NOTE]
> It is not possible to use a different language analyzer at indexing time than at query time for a field. That capability is reserved for [custom analyzers](index-add-custom-analyzers.md). For this reason, if you try to set the **searchAnalyzer** or **indexAnalyzer** properties to the name of a language analyzer, the REST API returns an error response. You must use the **analyzer** property instead.
Use the **searchFields** query parameter to specify which language-specific field to search on in your queries. You can review query examples that include the analyzer property in [Search Documents (Azure Search Service REST API)](https://docs.microsoft.com/rest/api/searchservice/search-documents).
For more information about index properties, see [Create Index (Azure Cognitive Search REST API)](https://docs.microsoft.com/rest/api/searchservice/create-index). For more information about analysis in Azure Cognitive Search, see [Analyzers in Azure Cognitive Search](https://docs.microsoft.com/azure/search/search-analyzers).
<a name="language-analyzer-list"></a>
## <a name="language-analyzer-list"></a>Liste der Sprachanalysen
Es folgt die Liste der unterstützten Sprachen sowie die Namen der Lucene- und Microsoft-Analyseprogramme.
|Sprache|Name des Microsoft-Analysetools|Name des Lucene-Analysetools|
|--------------|-----------------------------|--------------------------|
|Arabic|ar.microsoft|ar.lucene|
|Armenian||hy.lucene|
|Bangla|bn.microsoft||
|Basque||eu.lucene|
|Bulgarian|bg.microsoft|bg.lucene|
|Catalan|ca.microsoft|ca.lucene|
|Chinese (Simplified)|zh-Hans.microsoft|zh-Hans.lucene|
|Chinese (Traditional)|zh-Hant.microsoft|zh-Hant.lucene|
|Croatian|hr.microsoft||
|Czech|cs.microsoft|cs.lucene|
|Danish|da.microsoft|da.lucene|
|Dutch|nl.microsoft|nl.lucene|
|English|en.microsoft|en.lucene|
|Estonian|et.microsoft||
|Finnish|fi.microsoft|fi.lucene|
|French|fr.microsoft|fr.lucene|
|Galician||gl.lucene|
|German|de.microsoft|de.lucene|
|Greek|el.microsoft|el.lucene|
|Gujarati|gu.microsoft||
|Hebrew|he.microsoft||
|Hindi|hi.microsoft|hi.lucene|
|Hungarian|hu.microsoft|hu.lucene|
|Icelandic|is.microsoft||
|Indonesian (Bahasa)|id.microsoft|id.lucene|
|Irish||ga.lucene|
|Italian|it.microsoft|it.lucene|
|Japanese|ja.microsoft|ja.lucene|
|Kannada|kn.microsoft||
|Korean|ko.microsoft|ko.lucene|
|Latvian|lv.microsoft|lv.lucene|
|Lithuanian|lt.microsoft||
|Malayalam|ml.microsoft||
|Malay (Latin)|ms.microsoft||
|Marathi|mr.microsoft||
|Norwegian|nb.microsoft|no.lucene|
|Persian||fa.lucene|
|Polish|pl.microsoft|pl.lucene|
|Portuguese (Brazil)|pt-Br.microsoft|pt-Br.lucene|
|Portuguese (Portugal)|pt-Pt.microsoft|pt-Pt.lucene|
|Punjabi|pa.microsoft||
|Romanian|ro.microsoft|ro.lucene|
|Russian|ru.microsoft|ru.lucene|
|Serbian (Cyrillic)|sr-cyrillic.microsoft||
|Serbian (Latin)|sr-latin.microsoft||
|Slovak|sk.microsoft||
|Slovenian|sl.microsoft||
|Spanish|es.microsoft|es.lucene|
|Swedish|sv.microsoft|sv.lucene|
|Tamil|ta.microsoft||
|Telugu|te.microsoft||
|Thai|th.microsoft|th.lucene|
|Turkish|tr.microsoft|tr.lucene|
|Ukrainian|uk.microsoft||
|Urdu|ur.microsoft||
|Vietnamese|vi.microsoft||
All analyzers with **Lucene** in the name are powered by [Apache Lucene's language analyzers](https://lucene.apache.org/core/6_6_1/core/overview-summary.html).
## <a name="see-also"></a>See also
+ [Create Index (Azure Cognitive Search REST API)](https://docs.microsoft.com/rest/api/searchservice/create-index)
+ [AnalyzerName Class](https://docs.microsoft.com/dotnet/api/microsoft.azure.search.models.analyzername)
### Creating the project
Before we start, check whether Wrangler is already installed; usually it is not.
If not, we need to install Wrangler first, starting with the Rust toolchain:
```shell
curl https://sh.rustup.rs -sSf | sh
```
Then install Wrangler:
```shell
cargo install wrangler
```
Next, create a project; we'll call it test-worker:
```shell
cd ~
wrangler generate test-worker
cd test-worker
```
List the directory contents and you will see an index.js; edit it:
```shell
vim index.js
```
Save the file when you're done editing.
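The post doesn't show the contents of index.js. As an illustrative starting point (an assumption on my part, not the template `wrangler generate` actually produces), a minimal Worker could look like this:

```javascript
// Minimal Worker sketch; the generated template may differ.
function handleRequest() {
  return new Response("Hello from Workers!", {
    headers: { "content-type": "text/plain" },
  });
}

// Register the handler only when a global addEventListener exists,
// as it does inside the Workers runtime.
if (typeof addEventListener === "function") {
  addEventListener("fetch", (event) => {
    event.respondWith(handleRequest());
  });
}
```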
### Publishing the project
You need to configure your Cloudflare account information. First, open the Cloudflare website: https://www.cloudflare.com/
After logging in to the dashboard, click your avatar in the top-right corner, then click My Profile to open your profile page.
At the bottom there is an API Keys section. Click the View button next to the Global API Key row to reveal your API key; you will be asked to enter your password and complete a human verification (a Google captcha).

After confirming, your API key is displayed. Copy it and keep it safe; it is as important as your password.
Back in the shell, run the following command to configure Wrangler:
```shell
wrangler config --api-key
```
Next, edit wrangler.toml, the project's configuration file, which holds a few options for this Worker:
```
# Your Worker's name
name = "test-worker"
# Your Cloudflare account ID
# You can find the Account ID at the bottom right of the domain's Overview page
account_id = "1234567890abcdefghijklmn"
# Your Cloudflare zone ID
# The Zone ID is also shown at the bottom right of the Overview page
zone_id = "abcdefghijklmn1234567890"
# The domain you want to bind
# Remember to append /* at the end
route = "blog.natfrp.org/*"
# Application type; the default is webpack, no change needed
type = "webpack"
```
Then build the project and publish it to Cloudflare:
```shell
wrangler build
wrangler publish --release
```
The last step is to add a DNS record resolving your chosen subdomain prefix to your workers.dev domain.
For example, I bound blog.natfrp.org and my workers.dev domain is zerodream.workers.dev, so I point the blog prefix at zerodream.workers.dev with a CNAME record and enable CDN mode (turn on the orange cloud).

Done. You can now visit your domain to see the result.
# Observatorium Operator
[](https://circleci.com/gh/observatorium/operator)
Operator deploying the Observatorium project.
Currently, this includes:
1. the operator itself; and
1. example configuration for deploying Observatorium via the Observatorium Operator.
For more information on the operator, see the [Observatorium Operator documentation](./docs/deploy-operator.md).
## Development
We provide a few make commands that should help you getting started with development a bit quicker.
There is `make generate`, which generates new Kubernetes client files after updating the types for the Custom Resource Definition.
Then there is `make manifests`, which generates all manifest YAML files from jsonnet so that everything is kept up to date.
If you need to update the dependencies for jsonnet simply use `make jsonnet-update` or more specifically `make jsonnet-update-deployments`.