<p align="left"><img src="menu.png" alt="chat" width="330px"></p>
[](https://travis-ci.org/musonza/chat)
[](https://packagist.org/packages/musonza/chat)
[](https://packagist.org/packages/musonza/chat)
## Chat
[Demo Application](https://github.com/musonza/chat-demo)
- [Introduction](#introduction)
- [Installation](#installation)
- [Usage](#usage)
- [Creating a conversation](#creating-a-conversation)
- [Get a conversation by Id](#get-a-conversation-by-id)
- [Update conversation details](#update-conversation-details)
- [Send a text message](#send-a-text-message)
- [Send a message of custom type](#send-a-message-of-custom-type)
- [Get a message by id](#get-a-message-by-id)
- [Mark a message as read](#mark-a-message-as-read)
- [Mark whole conversation as read](#mark-whole-conversation-as-read)
- [Unread messages count](#unread-messages-count)
- [Delete a message](#delete-a-message)
- [Clear a conversation](#clear-a-conversation)
- [Get a conversation between two users](#get-a-conversation-between-two-users)
- [Get common conversations among users](#get-common-conversations-among-users)
- [Remove users from a conversation](#remove-users-from-a-conversation)
- [Add users to a conversation](#add-users-to-a-conversation)
- [Get messages in a conversation](#get-messages-in-a-conversation)
- [Get recent messages](#get-recent-messages)
- [Get users in a conversation](#get-users-in-a-conversation)
- [License](#license)
## Introduction
This package allows you to add a chat system to your Laravel (^5.4) application.
## Installation
From the command line, run:
```
composer require musonza/chat
```
Add the service provider to the `providers` array in `config/app.php`:
```
Musonza\Chat\ChatServiceProvider::class
```
Add the Facade to your aliases in `config/app.php`:
```
'Chat' => Musonza\Chat\Facades\ChatFacade::class,
```
The class is bound to the IoC container as `chat`:
```
$chat = App::make('chat');
```
Publish the assets:
```
php artisan vendor:publish
```
This will publish database migrations and a configuration file `musonza_chat.php` in the Laravel config folder.
## Configuration
```php
return [
'user_model' => 'App\User',
/**
* If not set, the package will use getKeyName() on the user_model specified above
*/
'user_model_primary_key' => null,
/*
* This will allow you to broadcast an event when a message is sent
* Example:
* Channel: mc-chat-conversation.2,
* Event: Musonza\Chat\Eventing\MessageWasSent
*/
'broadcasts' => false,
/**
* The event to fire when a message is sent
* See Musonza\Chat\Eventing\MessageWasSent if you want to customize.
*/
'sent_message_event' => 'Musonza\Chat\Eventing\MessageWasSent',
/**
* Automatically convert conversations with more than two users to public
*/
'make_three_or_more_users_public' => true,
];
```
Run the migrations:
```
php artisan migrate
```
## Usage
By default the package assumes you have a User model in the App namespace.
However, you can update the user model in `musonza_chat.php` published in the `config` folder.
#### Creating a conversation
```php
$participants = [$userId, $userId2,...];
$conversation = Chat::createConversation($participants);
```
#### Creating a conversation of type private / public
```php
$participants = [$userId, $userId2,...];
// Create a private conversation
$conversation = Chat::createConversation($participants)->makePrivate();
// Create a public conversation
$conversation = Chat::createConversation($participants)->makePrivate(false);
```
#### Get a conversation by id
```php
$conversation = Chat::conversations()->getById($id);
```
#### Update conversation details
```php
$data = ['title' => 'PHP Channel', 'description' => 'PHP Channel Description'];
$conversation->update(['data' => $data]);
```
#### Send a text message
```php
$message = Chat::message('Hello')
->from($user)
->to($conversation)
->send();
```
#### Send a message of custom type
The default message type is `text`. If you want to specify a custom type, you can call the `type()` function as shown below:
```php
$message = Chat::message('http://example.com/img')
->type('image')
->from($user)
->to($conversation)
->send();
```
#### Get a message by id
```php
$message = Chat::messages()->getById($id);
```
#### Mark a message as read
```php
Chat::message($message)->setUser($user)->markRead();
```
#### Flag / mark a message
```php
Chat::message($message)->setUser($user)->toggleFlag();
Chat::message($message)->setUser($user)->flagged(); // true
```
#### Mark whole conversation as read
```php
Chat::conversation($conversation)->setUser($user)->readAll();
```
#### Unread messages count
```php
$unreadCount = Chat::messages()->setUser($user)->unreadCount();
```
#### Unread messages count per Conversation
```php
Chat::conversation($conversation)->setUser($user)->unreadCount();
```
#### Delete a message
```php
Chat::message($message)->setUser($user)->delete();
```
#### Clear a conversation
```php
Chat::conversation($conversation)->setUser($user)->clear();
```
#### Get a conversation between two users
```php
$conversation = Chat::conversations()->between($user1, $user2);
```
#### Get common conversations among users
```php
$conversations = Chat::conversations()->common($users);
```
`$users` can be an array of user IDs, e.g. `[1,4,6]`, or a collection (`\Illuminate\Database\Eloquent\Collection`) of users.
#### Remove users from a conversation
```php
/* removing one user */
Chat::conversation($conversation)->removeParticipants($user);
```
```php
/* removing multiple users */
Chat::conversation($conversation)->removeParticipants([$user1, $user2, $user3,...,$userN]);
```
#### Add users to a conversation
```php
/* add one user */
Chat::conversation($conversation)->addParticipants($user);
```
```php
/* add multiple users */
Chat::conversation($conversation)->addParticipants([$user3, $user4]);
```
<b>Note:</b> By default, adding a third user to a private conversation will make it no longer private. See the configuration on how to change this.
#### Get messages in a conversation
```php
Chat::conversation($conversation)->setUser($user)->getMessages();
```
#### Get user conversations by type
```php
// private conversations
$conversations = Chat::conversations()->setUser($user)->isPrivate()->get();
// public conversations
$conversations = Chat::conversations()->setUser($user)->isPrivate(false)->get();
// all conversations
$conversations = Chat::conversations()->setUser($user)->get();
```
#### Get recent messages
```php
$messages = Chat::conversations()->setUser($user)->limit(25)->page(1)->get();
```
Example
```
[
"id" => 1
"private" => "1"
"data" => []
"created_at" => "2018-06-02 21:35:52"
"updated_at" => "2018-06-02 21:35:52"
"last_message" => array:13 [
"id" => 2
"message_id" => "2"
"conversation_id" => "1"
"user_id" => "1"
"is_seen" => "1"
"is_sender" => "1"
"flagged" => false
"created_at" => "2018-06-02 21:35:52"
"updated_at" => "2018-06-02 21:35:52"
"deleted_at" => null
"body" => "Hello 2"
"type" => "text"
"sender" => array:7 [
"id" => 1
"name" => "Jalyn Ernser"
"email" => "[email protected]"
]
]
]
```
#### Pagination
There are a few ways you can achieve pagination. You can specify the `limit` and `page` as above using the respective functions, or as below:
```php
$paginated = Chat::conversations()->setUser($user)
->setPaginationParams([
'page' => 3,
'perPage' => 10,
'sorting' => "desc",
'columns' => [
'*'
],
'pageName' => 'test'
])
->get();
```
You don't have to specify all the parameters. If you leave the parameters out, default values will be used.
`$paginated` above will return an `Illuminate\Pagination\LengthAwarePaginator`.
To get the conversations, simply call `$paginated->items()`.
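The "leave parameters out, defaults are used" behaviour is plain option-merging. As a language-neutral illustration (Python used here; the default values shown are assumptions for the sketch, not the package's actual defaults):

```python
# Hypothetical defaults, for illustration only.
DEFAULTS = {
    "page": 1,
    "perPage": 25,
    "sorting": "asc",
    "columns": ["*"],
    "pageName": "page",
}

def pagination_params(overrides=None):
    """Merge user-supplied pagination options over the defaults."""
    return {**DEFAULTS, **(overrides or {})}

print(pagination_params({"page": 3, "perPage": 10})["perPage"])  # 10
print(pagination_params()["pageName"])  # page
```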
#### Get users in a conversation
```php
$users = $conversation->users;
```
## License
Chat is open-sourced software licensed under the [MIT license](http://opensource.org/licenses/MIT)
# Xerophi-Admin
Utilities we use for the administration of Xerophi
# sepa-payment-qr-code
**Generate a [QR code to initiate a SEPA bank transfer](https://en.wikipedia.org/wiki/EPC_QR_code).**
[](https://www.npmjs.com/package/sepa-payment-qr-code)
[](https://travis-ci.org/derhuerst/sepa-payment-qr-code)

[](https://gitter.im/derhuerst)
[](https://patreon.com/derhuerst)
## Installation
```shell
npm install sepa-payment-qr-code
```
## Usage
```js
// generate-qr-code.js
const generateQrCode = require('sepa-payment-qr-code')
const qr = generateQrCode({
name: 'Red Cross of Belgium',
iban: 'BE72000000001616',
amount: 123.45,
reference: 'Urgency fund',
information: 'Sample QR code'
})
process.stdout.write(qr)
```
```shell
node generate-qr-code.js | qrencode -t ansiutf8
# prints QR code to the terminal
```
This library only generates the text input to be QR-encoded. Use the library of your choice to render the QR code to PNG/SVG/React/etc.
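The text to be QR-encoded follows the EPC "Quick Response Code" layout: one field per line in a fixed order. As a rough sketch of that layout (not this library's exact output — field order follows the EPC guideline, but the defaults and simplifications here are assumptions for illustration):

```python
def epc_payload(name, iban, amount, reference="", information="", bic=""):
    """Build a simplified EPC QR (SEPA Credit Transfer) payload string.

    One field per line; order follows the EPC069-12 guideline. Length
    limits and the reference/information exclusivity rule are omitted
    for brevity in this sketch.
    """
    lines = [
        "BCD",               # service tag
        "002",               # version (002 makes the BIC optional inside SEPA)
        "1",                 # character set: 1 = UTF-8
        "SCT",               # identification: SEPA Credit Transfer
        bic,                 # BIC (may be empty with version 002)
        name,                # beneficiary name
        iban.replace(" ", ""),
        f"EUR{amount:.2f}",  # amount in euros
        "",                  # purpose code (optional)
        reference,           # structured remittance reference (optional)
        information,         # unstructured remittance text (optional)
    ]
    return "\n".join(lines)

payload = epc_payload(
    name="Red Cross of Belgium",
    iban="BE72 0000 0000 1616",
    amount=123.45,
    information="Urgency fund",
)
print(payload.splitlines()[0])  # BCD
```

Feeding a string like this to any QR renderer produces a scannable payment code.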
## See also
- [EPC QR code – Wikipedia](https://en.wikipedia.org/wiki/EPC_QR_code)
- [Quick Response Code: Guidelines to Enable Data Capture for the Initiation of a SEPA Credit Transfer – European Payments Council](https://www.europeanpaymentscouncil.eu/document-library/guidance-documents/quick-response-code-guidelines-enable-data-capture-initiation)
- [Credit Transfer Payment API – W3C](https://www.w3.org/TR/payment-method-credit-transfer/)
## Contributing
If you have a question or need support using `sepa-payment-qr-code`, please double-check your code and setup first. If you think you have found a bug or want to propose a feature, refer to [the issues page](https://github.com/derhuerst/sepa-payment-qr-code/issues).
---
title: Security and privacy for MBAM 2.0
description: Security and privacy for MBAM 2.0
author: dansimp
ms.assetid: 1b2859f8-2381-4ad7-8744-2caed88570ad
ms.reviewer: ''
manager: dansimp
ms.author: dansimp
ms.pagetype: mdop, security
ms.mktglfcycl: manage
ms.sitesec: library
ms.prod: w10
ms.date: 06/16/2016
ms.openlocfilehash: 4edf04404d29a40fb757aadffd1831a8d34966dc
ms.sourcegitcommit: 354664bc527d93f80687cd2eba70d1eea024c7c3
ms.translationtype: MT
ms.contentlocale: it-IT
ms.lasthandoff: 06/26/2020
ms.locfileid: "10825576"
---
# Security and privacy for MBAM 2.0
Use the following information to plan for security and privacy considerations in Microsoft BitLocker Administration and Monitoring (MBAM).
## Security considerations for MBAM 2.0
There are many security-related considerations that should be planned for when deploying and using MBAM in your environment. The information in this section provides a brief overview of the Active Directory Domain Services user accounts and groups, log files, and other security-related considerations for MBAM.
[Security considerations for MBAM 2.0](mbam-20-security-considerations-mbam-2.md)
## Privacy for MBAM 2.0
The information in this section explains many of MBAM's data collection and use practices.
[MBAM 2.0 privacy statement](mbam-20-privacy-statement-mbam-2.md)
## Other MBAM security and privacy resources
[Operations for MBAM 2.0](operations-for-mbam-20-mbam-2.md)
---
lang: en
title: 'API docs: rest.rawbodyparser.supports'
keywords: LoopBack 4.0, LoopBack 4
sidebar: lb4_sidebar
permalink: /doc/en/lb4/apidocs.rest.rawbodyparser.supports.html
---
<!-- Do not edit this file. It is automatically generated by API Documenter. -->
## RawBodyParser.supports() method
<b>Signature:</b>
```typescript
supports(mediaType: string): boolean;
```
## Parameters
| Parameter | Type | Description |
| --- | --- | --- |
| mediaType | <code>string</code> | |
<b>Returns:</b>
`boolean`
---
'@cloud-carbon-footprint/client': patch
---
Bumps react-router-dom to latest major version, and updates related code
# Generator [![Build Status](http://travis-ci.org/yeoman/generator.png)](http://travis-ci.org/yeoman/generator) [![Coverage Status](https://coveralls.io/repos/yeoman/generator/badge.png)](https://coveralls.io/r/yeoman/generator)
> A Rails-inspired generator system that provides scaffolding for your apps.
### [Getting Started](http://yeoman.io/generators.html) [API Documentation](http://yeoman.github.io/generator/)


## Getting Started
If you're interested in writing your own Yeoman generator we recommend reading [the official getting started guide](http://yeoman.io/generators.html).
There are typically two types of generators: simple boilerplate 'copiers' and more advanced generators which can use custom prompts, remote dependencies, wiring, and much more.
The docs cover how to create generators from scratch as well as recommending command-line generators for making other generators.
For deeper research, read the code source or visit our [API documentation](http://yeoman.github.io/generator/).
## Testing generators
There is currently no formal infrastructure for testing generators; however, you may find our [mocha generator](https://github.com/yeoman/generator-mocha) for custom generators useful.
### Debugging
To debug a generator, you can pass Node.js debug's flags by running it like this:
```bash
# OS X / Linux
node --debug `which yo` <generator> [arguments]
# Windows
node --debug <path to yo binary> <generator> [arguments]
```
Yeoman generators also use a debug mode to log relevant information. You can activate it by setting the `DEBUG` environment variable to the desired scope (for the generator system, the scope is `generators:*`).
```bash
# OS X / Linux
DEBUG=generators/*
# Windows
set DEBUG=generators/*
```
## Officially maintained generators
* [Web App](https://github.com/yeoman/generator-webapp#readme)
* [AngularJS](https://github.com/yeoman/generator-angular#readme)
* [Backbone](https://github.com/yeoman/generator-backbone#readme)
* [Ember](https://github.com/yeoman/generator-ember#readme)
* [Polymer](https://github.com/yeoman/generator-polymer#readme)
* [Jasmine](https://github.com/yeoman/generator-jasmine#readme)
* [Mocha](https://github.com/yeoman/generator-mocha#readme)
* [Karma](https://github.com/yeoman/generator-karma#readme)
* [Chrome Apps Basic Boilerplate](https://github.com/yeoman/generator-chromeapp#readme)
* [Chrome Extension Boilerplate](https://github.com/yeoman/generator-chrome-extension#readme)
## License
[BSD license](http://opensource.org/licenses/bsd-license.php)
Copyright (c) Google
# rebar3_depup
## Dependency updater for rebar3 managed projects

## Usage
Add the plugin to your rebar.config:
```erlang
{project_plugins, [rebar3_depup]}.
```
You can add it to your _global `rebar.config`_ (e.g. `~/.config/rebar3/rebar.config`).
Then...
```bash
$ rebar3 update-deps
```
### ⚠️ Warning ⚠️
With the default options (see below), this project assumes that…
1. You have `rebar.config` file in your root folder
1. You do not use a `rebar.config.script`
1. The current OS user has access to the git repos that you have in your `rebar.config` file.
### Command-Line Options
```bash
$ rebar3 help update-deps
A rebar plugin to update dependencies
Usage: rebar3 update-deps [-r [<replace>]] [-c [<rebar_config>]]
[-a [<update_approx>]] [-d [<just_deps>]]
[-p [<just_plugins>]] [-h [<just_hex>]]
[-i <ignore>] [-o [<only>]]
-r, --replace Directly replace values in rebar.config. The
default is to just show you what deps can be
updated because this is an experimental feature and
using it can mess up your formatting and comments.
[default: false]
-c, --rebar-config File to analyze [default: rebar.config]
-a, --update-approx Update requirements starting with '~>' as well as
the ones with a specific version. [default: true]
-d, --just-deps Only update deps (i.e. ignore plugins and
project_plugins). [default: false]
-p, --just-plugins Only update plugins and project_plugins (i.e.
ignore deps). [default: false]
-h, --just-hex Only update hex packages, ignore git repos.
[default: false]
-i, --ignore Ignore dep when updating (can be repeated).
-o, --only Only update if the specified SemVer component
(major, minor, or patch) has changed. [default:
none]
```
## Configuration
To automatically ignore updates for one or more deps, add the `ignore` configuration to your `rebar.config`:
```erlang
%% Ignore any updates for eredis and lager.
{depup, [{ignore, [eredis, lager]}]}.
```
To only update if the specified SemVer component has changed, use `only`. Note that this configuration can
be overridden by the `-o/--only` command-line argument:
```erlang
%% Ignore any updates that change more than the minor (and patch) version of any dependency.
{depup, [{only, minor}]}.
```
With the above configuration in your `rebar.config`, running the following will update all dependencies by overriding `minor` with `none`:
```bash
rebar3 update-deps --only none
```
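The `only` filter boils down to comparing which SemVer component changed between the current and candidate versions. A sketch of that decision in Python (an illustration of the idea, not rebar3_depup's actual Erlang implementation; pre-release tags are ignored here):

```python
SEVERITY = {"major": 0, "minor": 1, "patch": 2}

def changed_component(current: str, candidate: str):
    """Return 'major'/'minor'/'patch' for the most significant changed part, or None."""
    cur = [int(p) for p in current.split(".")]
    new = [int(p) for p in candidate.split(".")]
    for name, i in (("major", 0), ("minor", 1), ("patch", 2)):
        if cur[i] != new[i]:
            return name
    return None

def allowed(current, candidate, only="none"):
    """With only=none every update passes; otherwise the change must be no
    more significant than the `only` component (only=minor permits minor
    and patch bumps but blocks major ones)."""
    change = changed_component(current, candidate)
    if change is None:
        return False  # nothing changed, nothing to update
    if only == "none":
        return True
    return SEVERITY[change] >= SEVERITY[only]

print(allowed("1.2.3", "1.3.0", only="minor"))  # True
print(allowed("1.2.3", "2.0.0", only="minor"))  # False
```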
## Build
```bash
$ rebar3 compile
```
## Test
```bash
$ rebar3 test
```
# Git GPG Validator
Verify that the GPG key is valid on the GitHub account
[](https://github.com/PyCQA/bandit)
[](https://ochrona.dev)
[](https://pycqa.github.io/isort/)
[](https://github.com/psf/black)
[](http://mypy-lang.org/)
## Before Starting
You can use a [GPG key](https://docs.github.com/en/authentication/managing-commit-signature-verification/generating-a-new-gpg-key) to verify the author of a commit.
Authenticating commits helps protect projects and also aids version tracing.
## Why this project?
People usually have different git accounts (emails) for working on GitHub, GitLab, etc., each with its own GPG key.
To sign commits correctly, it is better to verify that the GPG key in the local settings matches the one in the git platform settings before pushing.
## Tools
### GPG Command
[GnuGPG](https://gnupg.org/)
[GitHub Docs](https://docs.github.com/en/authentication/managing-commit-signature-verification/generating-a-new-gpg-key)
+ List local GPG Keys
```sh
gpg --list-secret-keys --keyid-format=long
```
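When scripting around this, the long-format listing can be parsed to pull out the key IDs. A small Python sketch (the sample listing below is illustrative — a made-up key, not real output):

```python
import re

# Illustrative sample of `gpg --list-secret-keys --keyid-format=long` output.
SAMPLE = """\
sec   rsa4096/3AA5C34371567BD2 2020-01-01 [SC]
      4B7D2C1E9F0A356478B9C0D1E2F3A4B5C6D7E8F9
uid                 [ultimate] Example User <user@example.com>
ssb   rsa4096/42B317FD4BA89E7A 2020-01-01 [E]
"""

def extract_key_ids(listing):
    """Return the long key IDs from the `sec` (secret primary key) lines."""
    return re.findall(r"^sec\s+\S+/([0-9A-F]{16})", listing, flags=re.M)

print(extract_key_ids(SAMPLE))  # ['3AA5C34371567BD2']
```

For more robust scripting, `gpg --with-colons` emits a machine-readable format instead.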
### GitHub API
[GitHub API Docs](https://docs.github.com/en/rest/reference)
+ [Get User GPG Keys Doc](https://docs.github.com/en/rest/reference/users#list-gpg-keys-for-a-user)
```sh
curl \
-H "Accept: application/vnd.github.v3+json" \
https://api.github.com/users/USERNAME/gpg_keys
```
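The endpoint returns a JSON array of key objects whose `key_id` field carries the long key ID (per GitHub's docs). A hedged Python sketch working on a canned response — the network call is omitted, and the sample body is trimmed to the fields used here:

```python
import json

# A trimmed, illustrative response body from /users/USERNAME/gpg_keys.
RESPONSE = json.loads("""
[
  {"id": 3,
   "key_id": "3262EFF25BA0D270",
   "emails": [{"email": "user@example.com", "verified": true}]}
]
""")

def github_key_ids(keys):
    """Collect the long key IDs advertised on the GitHub account."""
    return {k["key_id"] for k in keys}

print(github_key_ids(RESPONSE))  # {'3262EFF25BA0D270'}
```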
### Git Command
Show git log with signatures
```sh
git log --show-signature
```
## Validator
## Getting Started
### Prerequisites
- python 3.7
- pipenv 2021.5.29
### Running
1. Installing Packages
```sh
pipenv install
```
2. Validate
```sh
pipenv run python3 validator/app.py
```
Help
```sh
pipenv run python3 validator/app.py --help
```
```sh
usage: app.py [-h] [-m {simple,hard}] [-k KEY]
GPG Key Validator
optional arguments:
-h, --help show this help message and exit
-m {simple,hard}, --method {simple,hard}
simple: compare fingerprint; hard: sign and verify
-k KEY, --key KEY Use default key or set a Key ID for signing
```
Example:
```sh
# Enter virtual environment
pipenv shell
# Simple Verify
## Use project default key, and compare fingerprint only
python3 validator/app.py
## Use specific key, and compare fingerprint only
python3 validator/app.py -k "KEY_ID"
# Hard Verify
mkdir .gpg_folder
## Use project default key, and do a sign-verify check
python3 validator/app.py -m hard
## Use specific key, and a sign-verify check
python3 validator/app.py -m hard -k "KEY_ID"
```
### [Simple Fingerprint Compare](./compare_fingerprint.py)
Applicable when the same key is used for signing and verification.
This script compares the GPG key fingerprint in the git config settings with the one in the platform settings.
> The fingerprint is derived from the public key and creation timestamp -- both are contained in the public keys listed on the site. [reference](https://stackoverflow.com/a/46916593)
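The comparison itself reduces to set membership: every fingerprint configured locally should appear among the keys published on the account. A minimal Python sketch with hypothetical fingerprints:

```python
def missing_fingerprints(local_fingerprints, platform_fingerprints):
    """Return the locally configured fingerprints absent from the platform."""
    return sorted(set(local_fingerprints) - set(platform_fingerprints))

# Hypothetical fingerprints, for illustration only.
local = ["4B7D2C1E9F0A356478B9C0D1E2F3A4B5C6D7E8F9"]
platform = [
    "4B7D2C1E9F0A356478B9C0D1E2F3A4B5C6D7E8F9",
    "0123456789ABCDEF0123456789ABCDEF01234567",
]

missing = missing_fingerprints(local, platform)
print("valid" if not missing else f"missing: {missing}")  # valid
```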
### [Hard Verify](./sign_verify.py)
Some developers separate the keys used for signing and verification. In that case, actually signing and then verifying can confirm whether the keys are set up correctly.
Verification flow:
1. Sign text: use the project default key or a specific key to sign the text
2. Import platform GPG keys
The `verify_signature` function will import the platform's keys into a temporary GPG folder (the default is `.gpg_folder`).
3. Use the temporary GPG object to verify the signature from step 1.
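The shape of that roundtrip — sign with the local key material, then verify against what the counterpart publishes — can be illustrated with a toy stand-in. The HMAC below is only a placeholder for the real asymmetric GPG operations (real code would shell out to `gpg` or use a GPG binding):

```python
import hashlib
import hmac

def sign(key: bytes, text: bytes) -> bytes:
    """Toy 'signature': stands in for `gpg --sign` in this illustration."""
    return hmac.new(key, text, hashlib.sha256).digest()

def verify(key: bytes, text: bytes, signature: bytes) -> bool:
    """Toy verification: stands in for importing the published key and `gpg --verify`."""
    return hmac.compare_digest(sign(key, text), signature)

msg = b"test message"
sig = sign(b"local-key", msg)
print(verify(b"local-key", msg, sig))  # True  -> keys match
print(verify(b"other-key", msg, sig))  # False -> key mismatch
```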
---
title: Overview of SQL Server on Azure Linux Virtual Machines | Microsoft Docs
description: Learn about how to run full SQL Server editions on Azure Linux Virtual machines. Get direct links to all Linux SQL Server VM images and related content.
services: virtual-machines-linux
documentationcenter: ''
author: MashaMSFT
manager: craigg
tags: azure-service-management
ms.service: virtual-machines-sql
ms.topic: conceptual
ms.workload: iaas-sql-server
ms.date: 04/10/2018
ms.author: mathoma
ms.reviewer: jroth
---
# Overview of SQL Server on Azure Virtual Machines (Linux)
> [!div class="op_single_selector"]
> * [Windows](../../windows/sql/virtual-machines-windows-sql-server-iaas-overview.md)
> * [Linux](sql-server-linux-virtual-machines-overview.md)
SQL Server on Azure virtual machines enables you to use full versions of SQL Server in the cloud without having to manage any on-premises hardware. SQL Server VMs also simplify licensing costs when you pay as you go.
Azure virtual machines run in many different [geographic regions](https://azure.microsoft.com/regions/) around the world. They also offer a variety of [machine sizes](../sizes.md). The virtual machine image gallery allows you to create a SQL Server VM with the right version, edition, and operating system. This makes virtual machines a good option for many different SQL Server workloads.
## <a id="create"></a> Get started with SQL VMs
To get started, choose a SQL Server virtual machine image with your required version, edition, and operating system. The following sections provide direct links to the Azure portal for the SQL Server virtual machine gallery images.
> [!TIP]
> For more information about how to understand pricing for SQL images, see [the pricing page for Linux SQL Server VMs](https://azure.microsoft.com/pricing/details/virtual-machines/linux/).
| Version | Operating System | Edition |
| --- | --- | --- |
| **SQL Server 2017** | Red Hat Enterprise Linux (RHEL) 7.4 |[Enterprise](https://portal.azure.com/#create/Microsoft.SQLServer2017EnterpriseonRedHatEnterpriseLinux74), [Standard](https://portal.azure.com/#create/Microsoft.SQLServer2017StandardonRedHatEnterpriseLinux74), [Web](https://portal.azure.com/#create/Microsoft.SQLServer2017WebonRedHatEnterpriseLinux74), [Express](https://portal.azure.com/#create/Microsoft.FreeSQLServerLicenseSQLServer2017ExpressonRedHatEnterpriseLinux74), [Developer](https://portal.azure.com/#create/Microsoft.FreeSQLServerLicenseSQLServer2017DeveloperonRedHatEnterpriseLinux74) |
| **SQL Server 2017** | SUSE Linux Enterprise Server (SLES) v12 SP2 |[Enterprise](https://portal.azure.com/#create/Microsoft.SQLServer2017EnterpriseonSLES12SP2), [Standard](https://portal.azure.com/#create/Microsoft.SQLServer2017StandardonSLES12SP2), [Web](https://portal.azure.com/#create/Microsoft.SQLServer2017WebonSLES12SP2), [Express](https://portal.azure.com/#create/Microsoft.FreeSQLServerLicenseSQLServer2017ExpressonSLES12SP2), [Developer](https://portal.azure.com/#create/Microsoft.FreeSQLServerLicenseSQLServer2017DeveloperonSLES12SP2) |
| **SQL Server 2017** | Ubuntu 16.04 LTS |[Enterprise](https://portal.azure.com/#create/Microsoft.SQLServer2017EnterpriseonUbuntuServer1604LTS), [Standard](https://portal.azure.com/#create/Microsoft.SQLServer2017StandardonUbuntuServer1604LTS), [Web](https://portal.azure.com/#create/Microsoft.SQLServer2017WebonUbuntuServer1604LTS), [Express](https://portal.azure.com/#create/Microsoft.FreeSQLServerLicenseSQLServer2017ExpressonUbuntuServer1604LTS), [Developer](https://portal.azure.com/#create/Microsoft.FreeSQLServerLicenseSQLServer2017DeveloperonUbuntuServer1604LTS) |
> [!NOTE]
> To see the available Windows SQL Server virtual machine images, see [Overview of SQL Server on Azure Virtual Machines (Windows)](../../windows/sql/virtual-machines-windows-sql-server-iaas-overview.md).
## <a id="packages"></a> Installed packages
When you configure SQL Server on Linux, you install the database engine package and then several optional packages depending on your requirements. The Linux virtual machine images for SQL Server automatically install most packages for you. The following table shows which packages are installed for each distribution.
| Distribution | [Database Engine](https://docs.microsoft.com/sql/linux/sql-server-linux-setup) | [Tools](https://docs.microsoft.com/sql/linux/sql-server-linux-setup-tools) | [SQL Server Agent](https://docs.microsoft.com/sql/linux/sql-server-linux-setup-sql-agent) | [Full-Text Search](https://docs.microsoft.com/sql/linux/sql-server-linux-setup-full-text-search) | [SSIS](https://docs.microsoft.com/sql/linux/sql-server-linux-setup-ssis) | [HA add-on](https://docs.microsoft.com/sql/linux/sql-server-linux-business-continuity-dr) |
|---|---|---|---|---|---|---|
| RHEL |  |  |  |  |  |  |
| SLES |  |  |  |  |  |  |
| Ubuntu |  |  |  |  |  |  |
## Related products and services
### Linux Virtual Machines
* [Virtual Machines overview](../overview.md)
### Storage
* [Introduction to Microsoft Azure Storage](../../../storage/common/storage-introduction.md)
### Networking
* [Virtual Network overview](../../../virtual-network/virtual-networks-overview.md)
* [IP addresses in Azure](../../../virtual-network/virtual-network-ip-addresses-overview-arm.md)
* [Create a Fully Qualified Domain Name in the Azure portal](../portal-create-fqdn.md)
### SQL
* [SQL Server on Linux documentation](https://docs.microsoft.com/sql/linux)
* [Azure SQL Database comparison](../../../sql-database/sql-database-paas-vs-sql-server-iaas.md)
## Next steps
Get started with SQL Server on Azure Linux virtual machines:
* [Create a SQL Server VM in the Azure portal](provision-sql-server-linux-virtual-machine.md)
Get answers to commonly asked questions about SQL VMs on Linux:
* [SQL Server on Azure Linux Virtual Machines FAQ](sql-server-linux-faq.md)
# Test for candidates for the Front-End developer position
In this stage, you need to develop a toolbox of components based on the available image and the requirements described.
## Scenario:
octadesk wants to invest in a new product that has a specific demand.
During the design sprint for this product, a hypothesis emerged that needs to be validated.
This hypothesis depends on a drag'n drop HTML editor, so that users can customize their corporate page.
Our UX analyst created a simple prototype, and we need to validate the proposed user experience.
Based on the prototype and on the mandatory features described below, we must create a more faithful and, above all, functional prototype.
Requirements:
- as a user, I have a toolbox so that I can drag and drop components from this toolbox onto the design area
- as a user, I can add and remove as many components as I deem necessary
- as a user, I can customize these components visually
## Instructions:
1. To start the test, fork this repository;
2. Create a branch with your name;
3. Implement the files you develop;
4. When you are done, submit a pull request and wait for your feedback.
Note: if you only clone the repository, you will not be able to push, and making the pull request later will be more complicated.
## Considerations:
- Use Vue as the JavaScript framework;
- Use ES6+ features, for example async/await, object destructuring, map/spread operators;
- Have a responsive layout;
- Provide documentation describing the methodology used;
## Bonus points:
- Use vuex;
- Use TypeScript;
- Use automated tests;
- Follow code methodologies and best practices;
- Publish the application at a public URL;
# [Semester assignment 1: “Rogue One oh one”](https://retting.ii.uib.no/inf101.v18.sem1/blob/master/SEM-1_DEL-C.md) Part C: Further development
* [README](README.md)
* [Overview](SEM-1.md) – [Practical information 5%](SEM-1.md#praktisk-informasjon)
* [Part A: Background, modelling and exploration 15%](SEM-1_DEL-A.md)
* [Part B: Complete the base implementation 40%](SEM-1_DEL-B.md)
* **Part C: Further development 40%**
## Further development of Rogue-101
We now hand responsibility for the development over to you – below you will find some suggestions to work from, or you can come up with your own extension of the code. To get full marks on the assignment you must do two of the suggestions, or replace one or both of them with something you come up with yourself of a similar scope.
Once you are done with Parts A and B, you should have a dungeon crawler (or rabbit hopping) game with a player that can
* move around the map
* see visible things around itself
* pick up items
* carry one item
* put items down
* attack
You also have carrots that can be picked up, and rabbits that can eat carrots and attack things.
From here you can either keep adding functionality to the code you have, or roll up your sleeves and make your own game with classes other than the ones we have provided – replacing the item classes is perhaps the most relevant option. Feel free to write an "intro" to the game, and "flavour" in `displayMessage`. Whether you want to use the classes we have given you or make your own, remember to hand in the classes you worked on in Parts A and B.
You must decide for yourself what else to add in terms of functionality, and what you may want to do to give the game a bit more atmosphere (graphics, perhaps?). As mentioned earlier, you can choose the "setting" and "storyline" as you wish – you don't have to make cave exploration with magic, swords, orcs and hobbits – or with rabbits and carrots. Write down an explanation of what you do in README.md – both the functionality you add and the creative things you come up with – we won't necessarily notice everything when we test-run things, so it is good to know what to look for.
Since we don't know what clever things you will come up with, the following suggestions are written against the code we have given you. You are free to instead build the corresponding functionality into the classes of a game you make yourself. So when we write "the player", you may well use a class other than the `Player` class, or give this functionality to an `INonPlayer`.
The following are suggestions for what you can do, not necessarily in that order.
### C1: Things/items that affect the game
Our player would like to be able to find things other than carrots in the labyrinth. In this part you can create more items, for example something that makes the player better at attacking, or that increases its health. You can make "healing potions" that increase the player's health points. If you want the player to fight better, you can make weapons, for example of the long-broom type. Attacking a rabbit with a long broom might then give you a higher attack score than without it. (You can decide for yourself how item types affect attacks, health points, and perhaps each other.)
The item might take effect when you pick it up, or perhaps the player must press a key (classic roguelikes tend to use `q` for "quaff a potion" or `w` for "wield a weapon").
To implement these things you must create classes for them that extend IItem. They will resemble the carrot class. You decide for yourself how they should be drawn. See the styling section for graphics tips.
### C2: Inventory - carrying bag
If the player can find several different kinds of items in the labyrinth, it is handy to be able to carry more than one thing at a time. If the labyrinth still only has carrots, the player might collect many of them and carry them along – and all you need to keep track of is the number of carrots. But for varying kinds of items, we need a better solution.
In object orientation we care about abstraction and understandable code, so even though we could give the player several field variables to keep track of all the things it carries, we would rather implement a dedicated class for a *collection* of items. Java has standard lists, but here we may need something slightly different: it should have a limit on how much you can carry.
* The collection should store `IItem` objects, or alternatively be generic with `<T extends IItem>`.
* You can choose whether the limit is on the number of elements, or on the total size (`getSize()`) of all the elements in the collection.
* You need methods for putting something into the collection, taking something out, checking whether there is free space, and so on. You don't need indices – being able to put things in and take them out is enough.
* If someone tries to put something into a full collection (or something there is no room for in the collection), you should throw an exception.
For implementation tips, you can look at [the IList/MyList lists from earlier labs](https://retting.ii.uib.no/inf101.v18.oppgaver/inf101.v18.lab4/tree/master/src/inf101/v18/datastructures). Feel free to use Java's [standard collections library](https://docs.oracle.com/javase/8/docs/api/java/util/Collection.html) in the implementation (note that you need a concrete implementation of the interface, for example an [ArrayList](https://docs.oracle.com/javase/8/docs/api/java/util/ArrayList.html)).
* For slightly more solid INF101 design, you should preferably design your own interface, e.g. `IContainer<T extends IItem>` – then you can also have several variants of collections (which, for example, work in different ways).
* The player (and the rabbits) can use the collection you have made as a carrying bag or backpack to store items that are picked up. The user will probably want an overview of what is in the "bag" – there is free space on the screen where you can show more information – look at the methods `getPrinter()` (you can write things on the screen with `printAt(x,y, text)`), `getFreeTextAreaBounds()` and `clearFreeTextArea()`.
* A more advanced and fun use of `IContainer` is to let it also extend `IItem`. Then you can put things in your bag and put the bag on the ground – or put a rabbit into a hat into a bag into a suitcase, and pick the suitcase up and put it down.
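The size-limited container suggested above can be sketched in Java as follows. (The names `Inventory`, `put`, `takeOne`, and the minimal `IItem` stub are illustrative assumptions of ours; the course's real `IItem` interface has more methods than shown here.)

```java
import java.util.ArrayList;
import java.util.List;

/** Minimal stand-in for the course's IItem interface (assumed shape). */
interface IItem {
    int getSize();
}

/** A capacity-limited container, in the spirit of the suggested IContainer. */
class Inventory<T extends IItem> {
    private final List<T> items = new ArrayList<>();
    private final int capacity; // limit on the total getSize() of the contents

    Inventory(int capacity) {
        this.capacity = capacity;
    }

    /** How much room is left, measured in total item size. */
    int remainingSpace() {
        int used = 0;
        for (T item : items)
            used += item.getSize();
        return capacity - used;
    }

    boolean hasSpaceFor(T item) {
        return item.getSize() <= remainingSpace();
    }

    /** Adds an item, throwing if the container is too full (as the assignment asks). */
    void put(T item) {
        if (!hasSpaceFor(item))
            throw new IllegalArgumentException("No room for item of size " + item.getSize());
        items.add(item);
    }

    /** Removes and returns the most recently added item. */
    T takeOne() {
        if (items.isEmpty())
            throw new IllegalStateException("Container is empty");
        return items.remove(items.size() - 1);
    }

    int count() {
        return items.size();
    }
}
```

Whether the limit should be on count or on total size is one of the design choices mentioned above; this sketch limits total size.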
### C3: Styling
You can get quite far with a bit of creative use of the text symbols. For simple block graphics (for walls, for example) there are a number of different characters you can use in [BlocksAndBoxes](inf101/v18/gfx/textmode/BlocksAndBoxes.java). If you implement `getPrintSymbol()`, you can use a separate character for the graphics display (in case you use the letter from `getSymbol()` for something else – a fancy factory, for example).
#### Fancy walls
*(You can of course do the same with things other than walls)*
For example, the `Wall` objects use `BlocksAndBoxes.BLOCK_FULL` as their symbol. It can be a bit tricky to make `Wall` vary `getSymbol()` depending on where it has other walls as neighbours, but you can in principle make several variants of wall:
* Here it is most practical to use *inheritance*, e.g.:
```
public class DiagonalWall extends Wall {
	// Java will use this instead of getSymbol() from Wall – all other
	// methods are inherited from Wall.
	@Override
	public String getSymbol() {
		return BlocksAndBoxes.DIAG_FORWARD;
	}
}
```
* You also need to add all the wall variants in the `createItem` method.
* You need all walls to have the `Wall` type, because the map uses it to check whether a cell is occupied (loop through the items at a location and check whether `item instanceof Wall`). When you say `DiagonalWall extends Wall`, all diagonal walls will count as ordinary walls too.
	* The alternative would be to either add an `isWall()` method to `IItem`; or say that a map cell is occupied if there is a very large thing there (`getSize()` greater than 100 or 1000, for example); or make an extra interface `IWall` (with both `Wall` and `DiagonalWall` `implements IWall`) and use it instead wherever you need to check for walls.
Another possibility for fancy walls is to let the walls adapt to their surroundings. That requires a bit of cooperation between Game and Wall – for example, you can find all the walls (either in `beginTurn()` or when you set up the map) and call a new method that you create (e.g. `setup(IGame game)`). The wall can then explore its neighbours, find out which of them are walls, and pick the right box character – either one of the `BLOCK_*` characters from `BlocksAndBoxes`, or one of the [Box-drawing characters](https://en.wikipedia.org/wiki/Box-drawing_character) (these were widely used on old DOS computers, but they are now a standard part of the Unicode character set, and the `Printer` class supports them regardless of which font you use). You can paste box characters (and block characters) straight into your code (as long as [Eclipse is set up correctly with UTF-8](https://retting.ii.uib.no/inf101/inf101.v18/wikis/oppsett#fiks-for-tegnsett)), or use [`\uXXXX` escape codes](https://docs.oracle.com/javase/tutorial/java/data/characters.html) in the string. E.g., `"╣"` is `"\u2563"`.
#### Emojis
If you switch fonts you can use a pile of handy symbols. You can set the font in the `start()` method of the [Main](inf101/v18/rogue101/Main.java) class. "Symbola" contains the standard [Unicode emojis](https://en.wikipedia.org/wiki/Emoji#Unicode_blocks) in addition to ordinary letters. You can [download it from here](http://users.teilar.gr/~g1951d/).
* If you put `Symbola.ttf` in `src/inf101/v18/gfx/fonts/`, you should be able to run `printer.setFont(Printer.FONT_SYMBOLA);` in `start()` and then use e.g. `"☺️"` (`"\u263a"`) as the symbol for the player.
* Note that some of the more obscure Unicode emojis, such as [`"🦆"`](https://en.wikipedia.org/wiki/%F0%9F%A6%86), must be written with two escape codes (`"\ud83e\udd86"`) and are not necessarily included in the font you use.
* Ordinary letters unfortunately don't look very nice in the Symbola font, since our graphics are based on monospaced text (the system currently cannot use more than one font at a time).
* If you experiment with other fonts (which you must of course credit and have the rights to use – and figure out how to set up with `Printer`), they will often not fit the screen very well. You can adjust the size and position of the letters with `inf101.v18.gfx.textmode.TextFondAdjuster`.
#### Colors
You can make colors with [ANSI escape codes](https://en.wikipedia.org/wiki/ANSI_escape_code#Colors). For example, `"\u001b[31m" + "@" + "\u001b[0m"` makes a red @ sign. `"\u001b[31m"` selects red text color and `"\u001b[0m"` switches back to the default (if you leave it out, all of the text turns red).
#### Turtle graphics
The alternative to the text graphics is to use `ITurtle`/`TurtlePainter`, which you have seen used a little in the lectures to make frogs and ducks. The carrot is drawn this way – if you implement the `draw()` method for an item and let it return `true`, the text symbol is not drawn. You can look at the carrot example, and at [the code from the lectures](https://retting.ii.uib.no/inf101/inf101.v18/wikis/kode-fra-forelesninger), for examples.
* Read more about [turtle graphics here](https://en.wikipedia.org/wiki/Turtle_graphics)
* The draw method receives the width and height of the map cell as parameters. By default, things are set up so that the width and height are 32 – even though the screen shows narrow characters (16x32). If you need control over the width, instead of the drawing being "squished" to half width, you can set `Main.MAP_AUTO_SCALE_ITEM_DRAW` to `false`.
* The graphics system does not currently support using your own images (in jpg or png files, for example) – but we may add this over time.
### C5: Message window
* make a message "window"
### C6: Win condition
There is currently no way to win the game – here you will have to be creative yourself. For example, the player wins when all the carrots are gathered in one corner – or when the player is the last one left – or something else entirely.
### C?: Something you come up with / something else we come up with
* You are free to come up with things yourself; and we may also post a few more ideas along the way.
# Miscellaneous
## Open sources for graphics / sound / media
*The graphics system does not currently support using your own images (in jpg or png files, for example) – but we may add this over time. You can always draw with the turtle graphics (TurtlePainter). There is also no code included for playing sound – but we may have something up our sleeve for when you have gotten properly started.*
* If you are not good at drawing yourself, you can find excellent graphics at [OpenGameArt](http://opengameart.org/) – **remember to write in the overview document where you got the graphics from** (web page, creator, copyright license – if you use OpenGameArt, you will find this information under *License(s)* and *Copyright/Attribution Notice*).
* [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) has quite a few images and other media files available – you even get a "You need to attribute this author – show me how" instruction when you download things.
* [Kevin MacLeod](https://incompetech.com/music/) is a composer with lots of music available that is well suited for games and short films; you only need to credit him (he uses the ordinary [CC-BY-3.0](http://creativecommons.org/licenses/by/3.0/) license).
* Former INF101 student [Øyvind Aandalen](https://soundcloud.com/user-616269685) has some music he has made on [SoundCloud](https://soundcloud.com/user-616269685), which he says you can use however you like.
# Continue to [**Submission**](SEM-1.md#praktisk-informasjon)
---
layout: post
title: "Week 11, Day 48"
date: 2020-09-08 07:52:44 -0600
categories: jekyll update
---
## Standup
### Friday
EOD: Completed and submitted Le'fant August 2020 Status Report. Built eFolderUploadService initializeUpload() SOAP request as part of the work for PEN-4487 (Shout out to Sean, Andrew, Ansley, Mel, and JoeMc for their help!!!). Still need to determine the specific docType, source, and vaReceiveDate for the upload, then we should be able to get a token back which will be used in the second part of the UploadService call, uploadDocument().
### Today
* Whiteboarding session with Sean and Andrew
* Research creating an executable- issues (packaging all known and hidden dependencies)
* TO CREATE A WINDOWS executable, you have to run PyInstaller on a Windows machine. Need to use Wine for this or create a Windows VM to create the Windows executable (not sure how this would work with Tunnel/proxy settings?) -> Need to make a step-by-step for Mel or walk him through?
### Run Windows apps with Wine
`brew cask install xquartz`
`brew install winexe`
### Cross-compile a Python script into a Windows executable using [PyInstaller under Wine](https://www.andreafortuna.org/2017/12/27/how-to-cross-compile-a-python-script-into-a-windows-executable-on-linux/)
* `sudo apt-get install wine`
* `wget https://www.python.org/ftp/python/2.7.9/python-2.7.9.amd64.msi`
* `wine msiexec /i python-2.7.9.amd64.msi /qb`
### NOTE -> If you need to compile for 32-bit .exe, install wine32
* `sudo dpkg --add-architecture i386 && sudo apt-get update && sudo apt-get install wine32`
* Pyinstaller limitations
* hidden imports
* relative imports in entry-point scripts
* Can't cross-compile (need a build machine for each targeted OS)
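The hidden-import limitation is usually worked around with a `.spec` file, where modules that PyInstaller's static analysis misses are listed explicitly. A rough sketch (the script name `vetgen.py` and the `unidecode` module are reused from these notes as examples; a spec file only runs inside PyInstaller, not on its own):

```python
# vetgen.spec -- start from `pyi-makespec --onefile vetgen.py`, then edit
a = Analysis(
    ['vetgen.py'],
    hiddenimports=['unidecode'],  # modules PyInstaller's static analysis misses
)
pyz = PYZ(a.pure)
exe = EXE(pyz, a.scripts, a.binaries, a.zipfiles, a.datas,
          name='vetgen', console=False)  # console=False is the -w flag's effect
```

Build with `pyinstaller vetgen.spec`, or pass `--hidden-import unidecode` directly on the command line instead.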
### PyInstaller Workflow
* `pip` should be installed
* `pip install pyinstaller`
* `pyinstaller --onefile vetgen.py`
* Copy the build output from the `dist` folder -> test folder
* Add the `-w` flag to prevent a terminal window from opening up
* Additional dependencies- NSIS
* Installer based on .zip file
* provide path for dist/.exe file
* Convert dist folder to .zip
* point NSIS to the .zip file
* Update pre-req parser
* Wheels (whl) package (create a wheel and install in a virtual environment)
* `sudo pip install --upgrade pip`
* `sudo pip install wheel`
* `sudo pip install unidecode`
* `mkdir mywheels`
* `pip wheel --wheel-dir=my_wheels Unidecode`
* `cd my_wheels/` to confirm creation of .whl file
* Install wheel in venv
* `virtualenv venv_name`
* `cd venv_name`
* `source bin/activate`
* Run `pip install wheel` first if it is not installed already
* `pip install --no-index --find-links=/Users/lefant/my_wheels/ Unidecode`
* `python`
* Confirm by `import unidecode`
*
* Create Compiled version for Mac and Windows
* Update Jira with subtasks and names after whiteboarding session
* Setup blogs for this week and next
* Life stuff
* Puppy- end of life care
* Firewood/Snow
* Wildfires
* Readme
* Scenarios
* Known issues
* Fix Mapping, Claims, New Scenario
### Next week
* Testing the executable
* Completing the application
### Impediments
### From other teammates
* Sean out of pocket due to attic flooding
* Joe McNamara sick
### What I actually ended up doing
Researched the various possibilities for creating an executable (mainly for Windows). Looked into bug PEN-4391- VETGEN-UI: App crashes when saving scenario to see if I can make any headway.
### EOD for Slack
EOD:
### Questions (New Guy Stuff)
### TO LEARN
* SOAPUI
* Web services, WSDLs, etc.
*
### //TODO
* Onboarding/Camunda Modeler
### Daily Workflow
* Login
* Update/Setup Weekly Blog Posts
* Open Jira
* Check lefant email
* Check BAH email
* Check VA email
* Connect to Tunnel (if necessary)
* Open GitHub Desktop
* Open VS Code
### Issues
* Having issues with my TimeMachine backup on my iMac (haven't messed with it lately)
# Very simple deployment of a Windows VM
<IMG SRC="https://azurequickstartsservice.blob.core.windows.net/badges/101-vm-secure-password/PublicLastTestDate.svg" />
<IMG SRC="https://azurequickstartsservice.blob.core.windows.net/badges/101-vm-secure-password/PublicDeployment.svg" />
<IMG SRC="https://azurequickstartsservice.blob.core.windows.net/badges/101-vm-secure-password/FairfaxLastTestDate.svg" />
<IMG SRC="https://azurequickstartsservice.blob.core.windows.net/badges/101-vm-secure-password/FairfaxDeployment.svg" />
<IMG SRC="https://azurequickstartsservice.blob.core.windows.net/badges/101-vm-secure-password/BestPracticeResult.svg" />
<IMG SRC="https://azurequickstartsservice.blob.core.windows.net/badges/101-vm-secure-password/CredScanResult.svg" />
<a href="https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FAzure%2Fazure-quickstart-templates%2Fmaster%2F101-vm-secure-password%2Fazuredeploy.json" target="_blank">
<img src="https://raw.githubusercontent.com/Azure/azure-quickstart-templates/master/1-CONTRIBUTION-GUIDE/images/deploytoazure.svg?sanitize=true"/>
</a>
<a href="http://armviz.io/#/?load=https%3A%2F%2Fraw.githubusercontent.com%2FAzure%2Fazure-quickstart-templates%2Fmaster%2F101-vm-secure-password%2Fazuredeploy.json" target="_blank">
<img src="https://raw.githubusercontent.com/Azure/azure-quickstart-templates/master/1-CONTRIBUTION-GUIDE/images/visualizebutton.svg?sanitize=true"/>
</a>
This template allows you to deploy a simple Windows VM by retrieving the password that is stored in a Key Vault. Therefore the password is never put in plain text in the template parameter file.
## Add Secret to the Key Vault
You can add the password to the Key Vault using the below commands:
#### PowerShell
```
$Secret = ConvertTo-SecureString -String 'Password' -AsPlainText -Force
Set-AzKeyVaultSecret -VaultName 'Contoso' -Name 'ITSecret' -SecretValue $Secret
```
#### CLI
```
az keyvault secret set --vault-name Contoso --name ITSecret --value 'password'
```
## Enable Key Vault for VM and Template secret access
After this you'll need to enable the Key Vault for template deployment. You can do this using the following commands:
#### PowerShell
```
Set-AzKeyVaultAccessPolicy -VaultName Contoso -EnabledForTemplateDeployment
```
#### CLI
```
az keyvault update --name Contoso --enabled-for-template-deployment true
```
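Once template deployment is enabled, the deployment's parameters file can point at the vault instead of carrying a plain-text password. A sketch of the standard Key Vault reference pattern (the parameter name `adminPassword` and the subscription/resource-group segments are placeholder assumptions, not taken from this template):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "adminPassword": {
      "reference": {
        "keyVault": {
          "id": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.KeyVault/vaults/Contoso"
        },
        "secretName": "ITSecret"
      }
    }
  }
}
```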
---
title: Search explorer tool for querying data in the Azure portal - Azure Search
description: Use Azure portal tools such as Search explorer to query an index in Azure Search. Enter search terms or fully qualified search strings with advanced syntax.
manager: cgronlun
author: HeidiSteen
services: search
ms.service: search
ms.topic: conceptual
ms.date: 05/02/2019
ms.author: heidist
ms.custom: seodec2018
ms.openlocfilehash: 392699182859a090c13304f63d28a78b95a65ec7
ms.sourcegitcommit: 4b9c06dad94dfb3a103feb2ee0da5a6202c910cc
ms.translationtype: MT
ms.contentlocale: zh-CN
ms.lasthandoff: 05/02/2019
ms.locfileid: "65024026"
---
# <a name="search-explorer-for-querying-data-in-azure-search"></a>Search explorer for querying data in Azure Search
This article shows you how to use Search explorer to query an existing Azure Search index in the Azure portal. With Search explorer, you can submit simple or full Lucene query strings to any existing index in your service.

For help getting started, see [Start Search explorer](#start-search-explorer).
## <a name="basic-search-strings"></a>Basic search strings
The following examples assume the built-in real estate sample index. For help creating this index, see [Quickstart: Import, index, and query in the Azure portal](search-get-started-portal.md).
### <a name="example-1---empty-search"></a>Example 1 - Empty search
For a first look at your content, execute an empty search by clicking **Search** without providing any search terms. An empty search is useful as a first query because it returns all documents, letting you review the document composition. In an empty search there is no search rank, and documents are returned in arbitrary order (`"@search.score": 1` for every document). By default, 50 documents are returned in a search request.
The equivalent syntax for an empty search is `*` or `search=*`.
```Input
search=*
```
**Results**

### <a name="example-2---free-text-search"></a>Example 2 - Free text search
Free-form queries, with or without operators, are useful for simulating user-defined queries sent from a custom app to Azure Search. Notice that when you provide a query term or expression, search rank comes into play. The following example illustrates a free text search.
```Input
Seattle apartment "Lake Washington" miele OR thermador appliance
```
**Results**
You can use Ctrl-F to search within the results for specific terms of interest.

### <a name="example-3---count-of-matching-documents"></a>Example 3 - Count of matching documents
Add **$count=true** to get the number of matches found in the index. On an empty search, the count is the total number of documents in the index. On a qualified search, it is the number of documents matching the query input.
```Input
$count=true
```
**Results**

### <a name="example-4---restrict-fields-in-search-results"></a>Example 4 - Restrict fields in search results
Add **$select** to limit the results to the explicitly named fields, which gives more readable output in Search explorer. To keep the search string plus **$count=true**, prefix the parameters with **&**.
```Input
search=seattle condo&$select=listingId,beds,baths,description,street,city,price&$count=true
```
**Results**

### <a name="example-5---return-next-batch-of-results"></a>Example 5 - Return next batch of results
Azure Search returns the top 50 matches based on the search rank. To get the next set of matching documents, append **$top=100,&$skip=50** to increase the result set to 100 documents (the default is 50, the maximum is 1000), skipping the first 50 documents. As noted earlier, you need to provide search criteria, such as a query term or expression, to get ranked results. Notice that search scores decrease the deeper you reach into the search results.
```Input
search=seattle condo&$select=listingId,beds,baths,description,street,city,price&$count=true&$top=100,&$skip=50
```
**Results**

## <a name="filter-expressions-greater-than-less-than-equal-to"></a>Filter expressions (greater than, less than, equal to)
Use the **$filter** parameter when you want to specify precise criteria rather than a free text search. This example searches for more than 3 bedrooms: `search=seattle condo&$filter=beds gt 3&$count=true`

## <a name="order-by-expressions"></a>Order-by expressions
Add **$orderby** to sort results by a field other than search score. An example expression you can use to test this is `search=seattle condo&$select=listingId,beds,price&$filter=beds gt 3&$count=true&$orderby=price asc`

Both **$filter** and **$orderby** expressions are OData constructions. For more information, see [Filter OData syntax](https://docs.microsoft.com/rest/api/searchservice/odata-expression-syntax-for-azure-search).
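These are the same parameters a client would place on the REST API query string. A small illustrative sketch of composing such a request URL (the service name, index name, API version, and helper function are placeholder assumptions, not part of this article):

```python
from urllib.parse import urlencode

def build_query_url(service, index, api_version, **params):
    """Compose the GET URL for an Azure Search 'search documents' request."""
    base = f"https://{service}.search.windows.net/indexes/{index}/docs"
    # $-prefixed parameters like $filter must be URL-encoded like any other
    query = urlencode({"api-version": api_version, **params})
    return f"{base}?{query}"

# Example using the $filter / $count parameters shown above
url = build_query_url(
    "myservice", "realestate-us-sample", "2019-05-06",
    search="seattle condo", **{"$filter": "beds gt 3", "$count": "true"})
```

Because `$filter` and `$count` are not valid Python identifiers, they are passed via dictionary unpacking rather than as keyword arguments.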
<a name="start-search-explorer"></a>
## <a name="how-to-start-search-explorer"></a>How to start Search explorer
1. In the [Azure portal](https://portal.azure.com), open the search service page from the dashboard, or [find your service](https://ms.portal.azure.com/#blade/HubsExtension/BrowseResourceBlade/resourceType/Microsoft.Search%2FsearchServices) in the service list.
2. On the service overview page, click **Search explorer**.

3. Select the index to query.

4. Optionally, set the API version. The current generally available API version is selected by default, but you can choose a preview or older version if the syntax you want to use is tied to a specific version.
5. Once the index and API version are selected, enter search terms or fully qualified query expressions in the search bar, and click **Search** to run the search.

Tips for searching in Search explorer:
+ Results are returned as verbose JSON documents so that you can view the construction and content of documents in their entirety. You can use the query expressions shown in the examples to limit which fields are returned.
+ Documents are composed of all fields marked as **Retrievable** in the index. To view the index attributes in the portal, click *realestate-us-sample* in the **Indexes** list on the search overview page.
+ Free-form queries, similar to what you might enter in a commercial web browser, are useful for testing the end-user experience. For example, assuming the built-in real estate sample index, you could enter "Seattle apartments lake washington" and then use Ctrl-F to find terms within the search results.
+ Query and filter expressions must be articulated in a syntax supported by Azure Search. The default is the [simple syntax](https://docs.microsoft.com/rest/api/searchservice/simple-query-syntax-in-azure-search), but you can optionally use [full Lucene](https://docs.microsoft.com/rest/api/searchservice/lucene-query-syntax-in-azure-search) for more powerful queries. [Filter expressions](https://docs.microsoft.com/rest/api/searchservice/odata-expression-syntax-for-azure-search) are articulated in OData syntax.
## <a name="next-steps"></a>Next steps
The following resources provide more query syntax information and examples.
+ [Simple query syntax](https://docs.microsoft.com/rest/api/searchservice/simple-query-syntax-in-azure-search)
+ [Lucene query syntax](https://docs.microsoft.com/rest/api/searchservice/lucene-query-syntax-in-azure-search)
+ [Lucene query examples](search-query-lucene-examples.md)
+ [OData filter expression syntax](https://docs.microsoft.com/rest/api/searchservice/odata-expression-syntax-for-azure-search)
58317dcb71219f734e73f9df10d4be9f070e6da8 | 1,511 | md | Markdown | state/src/README.md | acuarica/codemirror.next | e45b9fd01f03321bfd0f0c817c51c67689b76932 | [
"MIT"
] | null | null | null | state/src/README.md | acuarica/codemirror.next | e45b9fd01f03321bfd0f0c817c51c67689b76932 | [
"MIT"
] | 1 | 2020-04-14T13:05:07.000Z | 2020-04-14T14:37:29.000Z | state/src/README.md | acuarica/codemirror.next | e45b9fd01f03321bfd0f0c817c51c67689b76932 | [
"MIT"
] | null | null | null | In its most basic form, the editor state is made up of a current <a
href="#state.EditorState.doc">document</a> and a <a
href="#state.EditorState.selection">selection</a>. Because there are a
lot of extra pieces that an editor might need to keep in its state
(such as an <a href="#history">undo history</a> or <a
href="#state.Syntax">syntax tree</a>), it is possible for extensions
to add additional <a href="#state.StateField">fields</a> to the state
object.
@EditorStateConfig
@EditorState
@SelectionRange
@EditorSelection
@Text
### Changes and Transactions
CodeMirror treats changes to the document as [objects](#state.Change),
which are usually part of a [transaction](#state.Transaction).
This is how you'd make a change to a document (replacing “world” with
“editor”) and create a new state with the updated document:
```javascript
let state = EditorState.create({doc: "hello world"})
let transaction = state.t().replace(6, 11, "editor")
console.log(transaction.doc.toString()) // "hello editor"
let newState = transaction.apply()
```
@ChangeDesc
@Change
@ChangeSet
@Transaction
@Annotation
@StateEffect
@StateEffectSpec
@StateEffectType
@MapMode
@Mapping
@ChangedRange
### Extending Editor State
The following are some types and mechanisms used when writing
extensions for the editor state.
@StateCommand
@Extension
@StateFieldSpec
@StateField
@FacetConfig
@Facet
@Precedence
@ExtensionGroup
@Syntax
@IndentContext
@languageData
### Utilities
@fillConfig
@combineConfig | 17.170455 | 70 | 0.755129 | eng_Latn | 0.856495 |
58318c346fcf62ea7b076ce860fefb345b686446 | 1,571 | md | Markdown | tensorflow/g3doc/api_docs/python/functions_and_classes/shard9/tf.sparse_reshape.md | RMORIOKA/tensorflow | 6886eb9c73940fd3b4dfadc3d6964ae9aa71eef6 | [
"Apache-2.0"
] | null | null | null | tensorflow/g3doc/api_docs/python/functions_and_classes/shard9/tf.sparse_reshape.md | RMORIOKA/tensorflow | 6886eb9c73940fd3b4dfadc3d6964ae9aa71eef6 | [
"Apache-2.0"
] | null | null | null | tensorflow/g3doc/api_docs/python/functions_and_classes/shard9/tf.sparse_reshape.md | RMORIOKA/tensorflow | 6886eb9c73940fd3b4dfadc3d6964ae9aa71eef6 | [
"Apache-2.0"
] | null | null | null | ### `tf.sparse_reshape(sp_input, shape, name=None)` {#sparse_reshape}
Reshapes a `SparseTensor` to represent values in a new dense shape.
This operation has the same semantics as `reshape` on the represented dense
tensor. The indices of non-empty values in `sp_input` are recomputed based
on the new dense shape, and a new `SparseTensor` is returned containing the
new indices and new shape. The order of non-empty values in `sp_input` is
unchanged.
If one component of `shape` is the special value -1, the size of that
dimension is computed so that the total dense size remains constant. At
most one component of `shape` can be -1. The number of dense elements
implied by `shape` must be the same as the number of dense elements
originally represented by `sp_input`.
For example, if `sp_input` has shape `[2, 3, 6]` and `indices` / `values`:
[0, 0, 0]: a
[0, 0, 1]: b
[0, 1, 0]: c
[1, 0, 0]: d
[1, 2, 3]: e
and `shape` is `[9, -1]`, then the output will be a `SparseTensor` of
shape `[9, 4]` and `indices` / `values`:
[0, 0]: a
[0, 1]: b
[1, 2]: c
[4, 2]: d
[8, 1]: e
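The index recomputation above can be reproduced in plain Python (a conceptual sketch of the semantics, independent of TensorFlow): linearize each index under the old dense shape in row-major order, then unravel the offset under the new shape.

```python
from functools import reduce

def sparse_reshape(indices, old_shape, new_shape):
    """Recompute sparse indices for a new dense shape (conceptual sketch)."""
    # Resolve at most one -1 in new_shape so the total dense size is unchanged.
    total = reduce(lambda a, b: a * b, old_shape, 1)
    if -1 in new_shape:
        known = reduce(lambda a, b: a * b, (d for d in new_shape if d != -1), 1)
        new_shape = [total // known if d == -1 else d for d in new_shape]

    def linearize(idx, shape):   # row-major offset of idx within shape
        off = 0
        for i, d in zip(idx, shape):
            off = off * d + i
        return off

    def unravel(off, shape):     # inverse of linearize
        out = []
        for d in reversed(shape):
            out.append(off % d)
            off //= d
        return list(reversed(out))

    return [unravel(linearize(i, old_shape), new_shape) for i in indices], new_shape

indices = [[0, 0, 0], [0, 0, 1], [0, 1, 0], [1, 0, 0], [1, 2, 3]]
new_indices, shape = sparse_reshape(indices, [2, 3, 6], [9, -1])
print(new_indices)  # [[0, 0], [0, 1], [1, 2], [4, 2], [8, 1]]
print(shape)        # [9, 4]
```

Note how the order of the non-empty values is preserved, matching the example above.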
##### Args:
* <b>`sp_input`</b>: The input `SparseTensor`.
* <b>`shape`</b>: A 1-D (vector) int64 `Output` specifying the new dense shape of the
represented `SparseTensor`.
* <b>`name`</b>: A name prefix for the returned tensors (optional)
##### Returns:
A `SparseTensor` with the same non-empty values but with indices calculated
by the new dense shape.
##### Raises:
* <b>`TypeError`</b>: If `sp_input` is not a `SparseTensor`.
| 30.211538 | 86 | 0.665181 | eng_Latn | 0.996888 |
5831c2f3832e1c40e888dd26daf216fcd3c29366 | 861 | md | Markdown | content/mitarbeiter/sandra-bay.md | rodereisen/rodereisen-de | 4635ac21714783c5006d68a30f83ebca51bf1c74 | [
"RSA-MD"
] | null | null | null | content/mitarbeiter/sandra-bay.md | rodereisen/rodereisen-de | 4635ac21714783c5006d68a30f83ebca51bf1c74 | [
"RSA-MD"
] | 1 | 2022-01-04T07:03:19.000Z | 2022-01-04T07:03:19.000Z | content/mitarbeiter/sandra-bay.md | BernhardRode/rodereisen-de | c162147cac901ce0f3e6926ddf67d9e5110e8db9 | [
"RSA-MD"
] | null | null | null | ---
fachgebiete:
- familie
name: Sandra
nachname: Bay
slug: sandra-bay
kontakt:
email: [email protected]
telefon: "+49 (0)7062 - 9499 24"
instagram: ""
twitter: ""
bilder:
bild: "sandra-bay.jpg"
bild_hover: "sandra-bay.jpg"
---
# Sandra Bay
> "The world belongs to those who enjoy it!" (Giacomo Leopardi)

I have been working at Reisebüro Rode for almost 20 years now and have travelled to many countries myself, so I can surely give you one or two special tips. Working with people, and the challenge of making your "best days of the year" a reality, inspire me anew every day!

My most recent trips took me, among other places, to Bali with stopovers in Singapore and Hong Kong, to Turkey, and aboard the Europa 2. South Africa and Mauritius also fascinated me in particular.
| 35.875 | 336 | 0.761905 | deu_Latn | 0.998332 |
5831d278ce795e6443ee404caded3072d18375e8 | 305 | md | Markdown | _notes/Public/Orks/Wargear/Ranged/Gaze of Mork.md | seefrye/waaaaagh-list | 789af35af074e4ba1cb772e6ebc8859405275875 | [
"MIT"
] | null | null | null | _notes/Public/Orks/Wargear/Ranged/Gaze of Mork.md | seefrye/waaaaagh-list | 789af35af074e4ba1cb772e6ebc8859405275875 | [
"MIT"
] | null | null | null | _notes/Public/Orks/Wargear/Ranged/Gaze of Mork.md | seefrye/waaaaagh-list | 789af35af074e4ba1cb772e6ebc8859405275875 | [
"MIT"
] | null | null | null | ---
title: Gaze of Mork
created: 2021-10-26
notetype: nofeed
tags: wargear
---
# Gaze of Mork
---
# Stats
| Weapon | Range | Type | S | AP | D | Abilities |
| ------------ | ----- | ------- | --- | --- | --- | --------- |
| Gaze of Mork | 18" | Heavy 3 | 12 | -4 | 6 | - | | 19.0625 | 64 | 0.393443 | eng_Latn | 0.363851 |
583200ea4cff2ecdf0456c18070279fabac6297d | 490 | md | Markdown | README.md | GoFly233/TiebaLite | e4f5961692731145d7a95c6a68b6b459e35545f6 | [
"Apache-2.0"
] | 5 | 2020-08-13T02:52:05.000Z | 2020-08-25T03:28:10.000Z | README.md | ShellWen/TiebaLite | e4f5961692731145d7a95c6a68b6b459e35545f6 | [
"Apache-2.0"
] | 1 | 2021-02-24T09:54:06.000Z | 2021-03-06T13:13:58.000Z | README.md | ShellWen/TiebaLite | e4f5961692731145d7a95c6a68b6b459e35545f6 | [
"Apache-2.0"
] | 1 | 2020-08-11T09:49:33.000Z | 2020-08-11T09:49:33.000Z | # <p align="center">Tieba Lite</p>
<p align="center">
<a href="https://circleci.com/gh/HuanCheng65/TiebaLite">
<img alt="CircleCI" src="https://circleci.com/gh/HuanCheng65/TiebaLite.svg?style=svg">
</a>
<a href="#">
<img alt="Status" src="https://img.shields.io/badge/%E7%8A%B6%E6%80%81-%EF%BC%9F%EF%BC%9F%EF%BC%9F-gray?style=flat-square&labelColor=gray&color=gray">
</a>
</p>
Tieba Lite is an **unofficial** client for Baidu Tieba.

## Notes

**This software and its source code are for learning and exchange purposes only; commercial use is strictly prohibited.**

~~This software no longer receives feature updates.~~
| 28.823529 | 158 | 0.638776 | yue_Hant | 0.673733 |
58340a4fcab54e77468a6f4d300f81ce3de7be32 | 30 | md | Markdown | README.md | StarsRain/QRMasterBTSDK | 59f98b20fac8a681388b191104ceacb8657d5617 | [
"MIT"
] | null | null | null | README.md | StarsRain/QRMasterBTSDK | 59f98b20fac8a681388b191104ceacb8657d5617 | [
"MIT"
] | null | null | null | README.md | StarsRain/QRMasterBTSDK | 59f98b20fac8a681388b191104ceacb8657d5617 | [
"MIT"
] | null | null | null | # QRMasterBTSDK
The Bluetooth SDK for Changlian Technology's 锁掌柜 (Lock Steward) product.
| 10 | 15 | 0.866667 | yue_Hant | 0.441547 |
5834ec43dbe81c84a0bfb68c6a768acab1acaa25 | 2,337 | md | Markdown | translations/de-DE/content/github/searching-for-information-on-github/searching-on-github/searching-github-marketplace.md | Varshans2/docs | b9dc0417694f59b81f2f597bf72bdf116d05f88a | [
"CC-BY-4.0",
"MIT"
] | 6 | 2021-02-17T03:31:27.000Z | 2021-09-11T04:17:57.000Z | translations/de-DE/content/github/searching-for-information-on-github/searching-on-github/searching-github-marketplace.md | Honeyk25/docs | de643512095e283cb3a56243c1a7adcf680a1d08 | [
"CC-BY-4.0",
"MIT"
] | 144 | 2021-10-21T04:41:09.000Z | 2022-03-30T09:55:16.000Z | translations/de-DE/content/github/searching-for-information-on-github/searching-on-github/searching-github-marketplace.md | Honeyk25/docs | de643512095e283cb3a56243c1a7adcf680a1d08 | [
"CC-BY-4.0",
"MIT"
] | 3 | 2022-03-08T03:54:20.000Z | 2022-03-15T06:28:15.000Z | ---
title: Searching GitHub Marketplace
intro: 'You can search for apps and actions that are available on {% data variables.product.prodname_marketplace %}.'
versions:
fpt: '*'
topics:
- GitHub search
redirect_from:
- /github/searching-for-information-on-github/searching-github-marketplace
shortTitle: Search GitHub Marketplace
---
## About searching on {% data variables.product.prodname_marketplace %}

You can find apps and actions on {% data variables.product.prodname_marketplace %} in two ways:

- Search within {% data variables.product.prodname_marketplace %}.
- Search across all of {% data variables.product.prodname_dotcom_the_website %} and then filter the results.

## Searching within {% data variables.product.prodname_marketplace %}

1. At the top of any page, click **Marketplace**. ![Marketplace link](/assets/images/help/search/marketplace-link.png)
2. Type any keywords and press **Enter**. ![Searching Marketplace](/assets/images/help/search/marketplace-search.png)
3. Optionally, filter your results by clicking one or more options in the left sidebar.

## Searching across {% data variables.product.prodname_dotcom_the_website %}

Any time you search across all of {% data variables.product.prodname_dotcom_the_website %}, you can filter the results to see matching apps and actions from {% data variables.product.prodname_marketplace %}.

1. Navigate to "https://github.com/search".
2. Type any keywords and press **Enter**. ![Search field](/assets/images/help/search/search-field.png)
3. In the left sidebar, click **Marketplace**. ![Marketplace option in the side bar of search results](/assets/images/help/search/marketplace-left-side-navigation.png)

## Further reading

- "[About {% data variables.product.prodname_marketplace %}](/github/customizing-your-github-workflow/about-github-marketplace)"
- "[Using actions from {% data variables.product.prodname_marketplace %} in your workflow](/actions/automating-your-workflow-with-github-actions/using-actions-from-github-marketplace-in-your-workflow)"
| 61.5 | 225 | 0.789474 | deu_Latn | 0.7527 |
58357165909629024883ce332fee8e905171e9c5 | 46 | md | Markdown | README.md | BallsyWalnuts/elements | 4556329c7b5e6378c31ddd28c7d0704a654ed369 | [
"MIT"
] | null | null | null | README.md | BallsyWalnuts/elements | 4556329c7b5e6378c31ddd28c7d0704a654ed369 | [
"MIT"
] | null | null | null | README.md | BallsyWalnuts/elements | 4556329c7b5e6378c31ddd28c7d0704a654ed369 | [
"MIT"
] | null | null | null | # elements
Visualization of Euclid's Elements
| 15.333333 | 34 | 0.826087 | eng_Latn | 0.795075 |
5835ef48745683d884258ccdaf8b5395acb28135 | 2,422 | md | Markdown | docset/winserver2012-ps/printmanagement/PrintManagement.md | nenonix/windows-powershell-docs | b6a032f0ed03c41bbd49bb18a8adeb8c6ca755e1 | [
"CC-BY-3.0",
"CC-BY-4.0",
"MIT"
] | 1 | 2021-05-02T15:01:26.000Z | 2021-05-02T15:01:26.000Z | docset/winserver2012-ps/printmanagement/PrintManagement.md | nenonix/windows-powershell-docs | b6a032f0ed03c41bbd49bb18a8adeb8c6ca755e1 | [
"CC-BY-3.0",
"CC-BY-4.0",
"MIT"
] | null | null | null | docset/winserver2012-ps/printmanagement/PrintManagement.md | nenonix/windows-powershell-docs | b6a032f0ed03c41bbd49bb18a8adeb8c6ca755e1 | [
"CC-BY-3.0",
"CC-BY-4.0",
"MIT"
] | 1 | 2018-11-30T02:02:05.000Z | 2018-11-30T02:02:05.000Z | ---
Module Name: PrintManagement
Module Guid: 8466AE97-2C03-4385-A501-7E74CF6BB1DF
Download Help Link: http://go.microsoft.com/fwlink/?LinkID=216901
Help Version: 3.0.0.0
Locale: en-US
ms.assetid: 8C40FD13-5AE8-4410-850F-839841C1B9DB
---
# PrintManagement Module
## Description
This reference provides cmdlet descriptions and syntax for all print management cmdlets. It lists the cmdlets in alphabetical order.
## PrintManagement Cmdlets
### [Add-Printer](./Add-Printer.md)
Adds a printer to the specified computer.
### [Add-PrinterDriver](./Add-PrinterDriver.md)
Installs a printer driver on the specified computer.
### [Add-PrinterPort](./Add-PrinterPort.md)
Installs a printer port on the specified computer.
### [Get-PrintConfiguration](./Get-PrintConfiguration.md)
Gets the configuration information of a printer.
### [Get-Printer](./Get-Printer.md)
Retrieves a list of printers installed on a computer.
### [Get-PrinterDriver](./Get-PrinterDriver.md)
Retrieves the list of printer drivers installed on the specified computer.
### [Get-PrinterPort](./Get-PrinterPort.md)
Retrieves a list of printer ports installed on the specified computer.
### [Get-PrinterProperty](./Get-PrinterProperty.md)
Retrieves printer properties for the specified printer.
### [Get-PrintJob](./Get-PrintJob.md)
Retrieves a list of print jobs in the specified printer.
### [Remove-Printer](./Remove-Printer.md)
Removes a printer from the specified computer.
### [Remove-PrinterDriver](./Remove-PrinterDriver.md)
Deletes printer driver from the specified computer.
### [Remove-PrinterPort](./Remove-PrinterPort.md)
Removes the specified printer port from the specified computer.
### [Remove-PrintJob](./Remove-PrintJob.md)
Removes a print job on the specified printer.
### [Rename-Printer](./Rename-Printer.md)
Renames the specified printer.
### [Restart-PrintJob](./Restart-PrintJob.md)
Restarts a print job on the specified printer.
### [Resume-PrintJob](./Resume-PrintJob.md)
Resumes a suspended print job.
### [Set-PrintConfiguration](./Set-PrintConfiguration.md)
Sets the configuration information for the specified printer.
### [Set-Printer](./Set-Printer.md)
Updates the configuration of an existing printer.
### [Set-PrinterProperty](./Set-PrinterProperty.md)
Modifies the printer properties for the specified printer.
### [Suspend-PrintJob](./Suspend-PrintJob.md)
Suspends a print job on the specified printer.
| 32.293333 | 132 | 0.766309 | eng_Latn | 0.616186 |
5835fd3c14b75be4987428b63615032334f232da | 1,333 | md | Markdown | tech/hash/hash-table/2020-08-27-Maybe-the-fastest-hashtable-that-you-have-ever-seen.md | shines77/my_docs | 16b7f42089034f6cfaf9f11b176136d2ea380cb1 | [
"MIT"
] | 7 | 2017-01-07T15:14:39.000Z | 2021-11-29T03:29:12.000Z | tech/hash/hash-table/2020-08-27-Maybe-the-fastest-hashtable-that-you-have-ever-seen.md | c3358/my_docs | 16b7f42089034f6cfaf9f11b176136d2ea380cb1 | [
"MIT"
] | null | null | null | tech/hash/hash-table/2020-08-27-Maybe-the-fastest-hashtable-that-you-have-ever-seen.md | c3358/my_docs | 16b7f42089034f6cfaf9f11b176136d2ea380cb1 | [
"MIT"
] | 10 | 2016-12-21T04:31:43.000Z | 2022-02-25T07:23:49.000Z | ---
title: "Maybe the Fastest Hash Table You Have Ever Seen"
categories: ['cpp']
tags: ['C++', 'C#', 'Java', 'Go', 'HashMap', 'HashTable', 'hash table']
key: page-maybe-the-fastest-hashtable-that-you-have-ever-seen
---
# Maybe the Fastest Hash Table You Have Ever Seen

## 1. Hash Tables

### 1.1 Introduction

A hash table (in English, "Hash Table", also called a "Hash Map" or sometimes a "Dictionary"; in Chinese, "散列表") is a data structure used for fast data lookup. The stored data generally takes the form of (`Key`, `Value`) pairs. Its defining feature is that random lookups, random insertions, and random deletions are all comparatively fast. Because it is a data structure that trades space for time, its algorithmic complexity can be "regarded as" O(1) compared with other lookup algorithms.

Author's note: in real hash table implementations, lookup and deletion usually cannot be done in O(1); they are O(1) only when the `Key` has no collisions. Insertion into a hash table, however, can always be done in O(1).

A very simple hash table example:

![employee lookup table](https://img-blog.csdnimg.cn/20200910155930773.png#pic_center)

The table above is an employee-information lookup table: the "`Key`" is a person's name and the "`Value`" is that employee's information, such as gender, age, and home address, where `M` means male and `F` means female. For example, to look up employee `Dan`'s information, the table tells us that `Dan` is male and 27 years old.
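The employee table above maps directly onto a built-in hash table, for example Python's `dict` (a small sketch; every entry except `Dan` is made up for illustration):

```python
# Key: employee name -> Value: (gender, age)
employees = {
    "Dan": ("M", 27),     # from the example above
    "Alice": ("F", 31),   # hypothetical entry for illustration
    "Bob": ("M", 24),     # hypothetical entry for illustration
}

# Average-case O(1) lookup:
gender, age = employees["Dan"]
print(gender, age)            # M 27

employees["Eve"] = ("F", 29)  # O(1) insertion
del employees["Bob"]          # average-case O(1) deletion
```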
### 1.2 Dictionaries

In everyday life, the most typical example of fast lookup is a dictionary.

To locate the word you want quickly, the fore-edge of a dictionary usually carries a row of V-shaped black notches, and many people write the corresponding (pinyin/English) letters A, B, C, D on them. Because all the entries in a dictionary are sorted in (pinyin/English) alphabetical order, if the word you are looking for starts with `P`, you can open the dictionary straight to the first page for `P` and continue searching from there, which saves a lot of time. These black notches are what we call a "thumb index".

**Thumb index**


### 1.3 How Hash Tables Work
## X. References

* [1] [What is a hash table?](https://www.cnblogs.com/wupeixuan/p/12319785.html) by Wu Peixuan (武培轩)
* [2] [What is a database index? Let the Xinhua Dictionary show you](https://zhuanlan.zhihu.com/p/57359378) by 兜里有辣条
* [3]
| 30.295455 | 234 | 0.675919 | yue_Hant | 0.942733 |
5836de42287dda8d7a306bb1efa0b084ceb9b4ae | 1,189 | md | Markdown | _it/feat/Degree.md | go-inoue/docs | 0a6c3769e44f7579712acd7a6ed85d4637b6a8e0 | [
"Apache-2.0"
] | null | null | null | _it/feat/Degree.md | go-inoue/docs | 0a6c3769e44f7579712acd7a6ed85d4637b6a8e0 | [
"Apache-2.0"
] | null | null | null | _it/feat/Degree.md | go-inoue/docs | 0a6c3769e44f7579712acd7a6ed85d4637b6a8e0 | [
"Apache-2.0"
] | null | null | null | ---
layout: feature
title: 'Degree'
shortdef: 'degree of comparison'
udver: '2'
---
Degree of comparison is used as an inflectional feature of some
[adjectives](u-pos/ADJ) and [adverbs](u-pos/ADV).
### <a name="Cmp">`Cmp`</a>: comparative, second degree
The quality of one object is compared to the same quality of another
object.
#### Examples
* _Luigi è migliore di me_ "Luigi is better than me"
* _Il prezzo è inferiore_ "The price is lower"
### <a name="Sup">`Sup`</a>: superlative, third degree
The quality of one object is compared to the same quality of all other
objects within a set.
Italian does not seem to have single words for 'relative'
superlatives, such as [en] "youngest". One would say instead:
#### Examples
* _Luigi è il più giovane_ "Luigi is the most young"
### <a name="Abs">`Abs`</a>: absolute superlative
The quality of the given object is so strong that there is hardly any other object
exceeding it. The quality is not actually compared to any particular
set of objects.
#### Examples
* _bellissimo_ "very beautiful", _massimo_ "maximum", _malissimo_ "very badly", _ottimo_ "optimum"
<!-- Interlanguage links updated St lis 3 20:58:19 CET 2021 -->
| 27.022727 | 97 | 0.72582 | eng_Latn | 0.996767 |
58375f7449ba735d2ae3632370b1103b3ca35b9f | 873 | md | Markdown | docs/content/en/docs/config/http/services/mentix/promsd/_index.md | JarCz/reva | 90850c5dbffcece7299bb8df7fd736e3c7a6b23e | [
"Apache-2.0"
] | null | null | null | docs/content/en/docs/config/http/services/mentix/promsd/_index.md | JarCz/reva | 90850c5dbffcece7299bb8df7fd736e3c7a6b23e | [
"Apache-2.0"
] | null | null | null | docs/content/en/docs/config/http/services/mentix/promsd/_index.md | JarCz/reva | 90850c5dbffcece7299bb8df7fd736e3c7a6b23e | [
"Apache-2.0"
] | null | null | null | ---
title: "promsd"
linkTitle: "promsd"
weight: 10
description: >
Configuration for the Prometheus SD exporter of the Mentix service
---
{{% pageinfo %}}
When using the Prometheus SD exporter, the output filenames have to be configured first.
{{% /pageinfo %}}
{{% dir name="metrics_output_file" type="string" default="" %}}
The target filename of the generated Prometheus File SD scrape config for metrics.
{{< highlight toml >}}
[http.services.mentix.promsd]
metrics_output_file = "/var/shared/prometheus/sciencemesh.json"
{{< /highlight >}}
{{% /dir %}}
{{% dir name="blackbox_output_file" type="string" default="" %}}
The target filename of the generated Prometheus File SD scrape config for the blackbox exporter.
{{< highlight toml >}}
[http.services.mentix.promsd]
blackbox_output_file = "/var/shared/prometheus/blackbox.json"
{{< /highlight >}}
{{% /dir %}}
| 31.178571 | 96 | 0.721649 | eng_Latn | 0.613734 |
58378fec917693effd3475382badec539b93e74d | 2,339 | md | Markdown | docs/fsharp/language-reference/source-line-file-path-identifiers.md | ilyakharlamov/docs.fr-fr | 54c09f71d03787b462bdd134b3407d5ed708a191 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2019-04-11T17:00:02.000Z | 2019-04-11T17:00:02.000Z | docs/fsharp/language-reference/source-line-file-path-identifiers.md | ilyakharlamov/docs.fr-fr | 54c09f71d03787b462bdd134b3407d5ed708a191 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/fsharp/language-reference/source-line-file-path-identifiers.md | ilyakharlamov/docs.fr-fr | 54c09f71d03787b462bdd134b3407d5ed708a191 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2022-02-23T14:59:20.000Z | 2022-02-23T14:59:20.000Z | ---
title: Source line, file, and path identifiers
description: Learn how to use built-in F# identifier values that let you access the source line number, directory, and file name in your code.
ms.date: 05/16/2016
ms.openlocfilehash: 4b145fe1fe20e3d7f868558e33bab26204fb0125
ms.sourcegitcommit: 3d0c29b878f00caec288dfecb3a5c959de5aa629
ms.translationtype: MT
ms.contentlocale: fr-FR
ms.lasthandoff: 12/20/2018
ms.locfileid: "53656009"
---
# <a name="source-line-file-and-path-identifiers"></a>Source line, file, and path identifiers

The `__LINE__`, `__SOURCE_DIRECTORY__`, and `__SOURCE_FILE__` identifiers are built-in values that let you access the source line number, directory, and file name in your code.

## <a name="syntax"></a>Syntax
```fsharp
__LINE__
__SOURCE_DIRECTORY__
__SOURCE_FILE__
```
## <a name="remarks"></a>Remarks

Each of these values has type `string`.

The following table summarizes the source line, file, and path identifiers that are available in F#. These identifiers are not preprocessor macros; they are built-in values that are recognized by the compiler.

|Predefined identifier|Description|
|---------------------|-----------|
|`__LINE__`|Evaluates to the current line number, taking `#line` directives into account.|
|`__SOURCE_DIRECTORY__`|Evaluates to the full path of the current source directory, taking `#line` directives into account.|
|`__SOURCE_FILE__`|Evaluates to the current source file name and its path, taking `#line` directives into account.|

For more information about the `#line` directive, see [Compiler Directives](compiler-directives.md).

## <a name="example"></a>Example

The following code example demonstrates the use of these values.
[!code-fsharp[Main](../../../samples/snippets/fsharp/lang-ref-2/snippet7401.fs)]
Output:
```
Line: 4
Source Directory: C:\Users\username\Documents\Visual Studio 2017\Projects\SourceInfo\SourceInfo
Source File: C:\Users\username\Documents\Visual Studio 2017\Projects\SourceInfo\SourceInfo\Program.fs
```
## <a name="see-also"></a>See also

- [Compiler Directives](compiler-directives.md)
- [F# Language Reference](index.md)
| 41.767857 | 256 | 0.77127 | fra_Latn | 0.873451 |
5838f28f4c60d29eeb3f8c83adbf863012e403fc | 8,846 | md | Markdown | README.md | pela7/htmlpagedom | 34710d592b9865e798b5c35a78166ab1f18ecc8e | [
"MIT"
] | 314 | 2015-01-27T04:25:56.000Z | 2022-02-15T19:48:41.000Z | README.md | pela7/htmlpagedom | 34710d592b9865e798b5c35a78166ab1f18ecc8e | [
"MIT"
] | 34 | 2015-04-01T18:12:28.000Z | 2022-01-24T04:31:10.000Z | README.md | 23G/htmlpagedom | d6812922509a71ec2cabbd4cd34f28a5afb8c1b9 | [
"MIT"
] | 56 | 2015-03-06T13:33:28.000Z | 2022-02-28T03:22:20.000Z | HtmlPageDom
===========
[](http://travis-ci.org/wasinger/htmlpagedom)
[](https://scrutinizer-ci.com/g/wasinger/htmlpagedom/?branch=master)
[](https://packagist.org/packages/wa72/htmlpagedom)
[](https://packagist.org/packages/wa72/htmlpagedom)
> __Important__: BC break in version 2.0 for compatibility with Symfony 4.3, see [UPGRADE.md](UPGRADE.md).
`Wa72\HtmlPageDom` is a PHP library for easy manipulation of HTML documents using DOM.
It requires [DomCrawler from Symfony2 components](https://github.com/symfony/DomCrawler) for traversing
the DOM tree and extends it by adding methods for manipulating the DOM tree of HTML documents.
It's useful when you need not just to extract information from an HTML file (what DomCrawler does) but
also to modify HTML pages. It is usable as a template engine: load your HTML template file, set new
HTML content on certain elements such as the page title, `div#content`, or `ul#menu`, and print out
the modified page.
`Wa72\HtmlPageDom` consists of two main classes:
- `HtmlPageCrawler` extends `Symfony\Components\DomCrawler` by adding jQuery inspired, HTML specific
DOM *manipulation* functions such as `setInnerHtml($htmltext)`, `before()`, `append()`, `wrap()`, `addClass()` or `css()`.
It's like jQuery for PHP: simply select elements of an HTML page using CSS selectors and change their
attributes and content.
- `HtmlPage` represents one complete HTML document and offers convenience functions like `getTitle()`, `setTitle($title)`,
`setMeta('description', $description)`, `getBody()`. Internally, it uses the `HtmlPageCrawler` class for
filtering and manipulating DOM Elements. Since version 1.2, it offers methods for compressing (`minify()`) and
prettyprinting (`indent()`) the HTML page.
Requirements
------------
- PHP 7.1+
- [Symfony\Components\DomCrawler](https://github.com/symfony/DomCrawler)
- [Symfony\Components\CssSelector](https://github.com/symfony/CssSelector)
Installation
------------
- using [composer](http://getcomposer.org): `composer require wa72/htmlpagedom`
- using other [PSR-4](http://www.php-fig.org/psr/psr-4/) compliant autoloader:
clone this project to where your included libraries are and point your autoloader to look for the
"\Wa72\HtmlPageDom" namespace in the "src" directory of this project
Usage
-----
`HtmlPageCrawler` is a wrapper around DOMNodes. `HtmlPageCrawler` objects can be created using `new` or the static function
`HtmlPageCrawler::create()`, which accepts an HTML string or a DOMNode (or an array of DOMNodes, a DOMNodeList, or even
another `Crawler` object) as arguments.
Afterwards you can select nodes from the added DOM tree by calling `filter()` (equivalent to find() in jQuery) and alter
the selected elements using the following jQuery-like manipulation functions:
- `addClass()`, `hasClass()`, `removeClass()`, `toggleClass()`
- `after()`, `before()`
- `append()`, `appendTo()`
- `makeClone()` (equivalent to `clone()` in jQuery)
- `css()` (alias `getStyle()` / `setStyle()`)
- `html()` (get inner HTML content) and `setInnerHtml($html)`
- `attr()` (alias `getAttribute()` / `setAttribute()`), `removeAttr()`
- `insertAfter()`, `insertBefore()`
- `makeEmpty()` (equivalent to `empty()` in jQuery)
- `prepend()`, `prependTo()`
- `remove()`
- `replaceAll()`, `replaceWith()`
- `text()`, `getCombinedText()` (get text content of all nodes in the Crawler), and `setText($text)`
- `wrap()`, `unwrap()`, `wrapInner()`, `unwrapInner()`, `wrapAll()`
To get the modified DOM as HTML code use `html()` (returns innerHTML of the first node in your crawler object)
or `saveHTML()` (returns combined "outer" HTML code of all elements in the list).
**Example:**
```php
use \Wa72\HtmlPageDom\HtmlPageCrawler;
// create an object from a fragment of HTML code as you would do with jQuery's $() function
$c = HtmlPageCrawler::create('<div id="content"><h1>Title</h1></div>');
// the above is the same as calling:
$c = new HtmlPageCrawler('<div id="content"><h1>Title</h1></div>');
// filter for h1 elements and wrap them with an HTML structure
$c->filter('h1')->wrap('<div class="innercontent">');
// return the modified HTML
echo $c->saveHTML();
// or simply:
echo $c; // implicit __toString() calls saveHTML()
// will output: <div id="content"><div class="innercontent"><h1>Title</h1></div></div>
```
**Advanced example: remove the third column from an HTML table**
```php
use \Wa72\HtmlPageDom\HtmlPageCrawler;
$html = <<<END
<table>
<tr>
<td>abc</td>
<td>adsf</td>
<td>to be removed</td>
</tr>
<tr>
<td>abc</td>
<td>adsf</td>
<td>to be removed</td>
</tr>
<tr>
<td>abc</td>
<td>adsf</td>
<td>to be removed</td>
</tr>
</table>
END;
$c = HtmlPageCrawler::create($html);
$tr = $c->filter('table > tr > td')
->reduce(
function ($c, $j) {
if (($j+1) % 3 == 0) {
return true;
}
return false;
}
);
$tr->remove();
echo $c->saveHTML();
```
**Usage examples for the `HtmlPage` class:**
```php
use \Wa72\HtmlPageDom\HtmlPage;
// create a new HtmlPage object with an empty HTML skeleton
$page = new HtmlPage();
// or create a HtmlPage object from an existing page
$page = new HtmlPage(file_get_contents('http://www.heise.de'));
// get or set page title
echo $page->getTitle();
$page->setTitle('New page title');
echo $page->getTitle();
// add HTML content
$page->filter('body')->setInnerHtml('<div id="#content"><h1>This is the headline</h1><p class="text">This is a paragraph</p></div>');
// select elements by css selector
$h1 = $page->filter('#content h1');
$p = $page->filter('p.text');
// change attributes and content of an element
$h1->addClass('headline')->css('margin-top', '10px')->setInnerHtml('This is the <em>new</em> headline');
$p->removeClass('text')->append('<br>There is more than one line in this paragraph');
// add a new paragraph to div#content
$page->filter('#content')->append('<p>This is a new paragraph.</p>');
// add a class and some attribute to all paragraphs
$page->filter('p')->addClass('newclass')->setAttribute('data-foo', 'bar');
// get HTML content of an element
echo $page->filter('#content')->saveHTML();
// output the whole HTML page
echo $page->save();
// or simply:
echo $page;
// output formatted HTML code
echo $page->indent()->save();
// output compressed (minified) HTML code
echo $page->minify()->save();
```
Limitations
-----------
- HtmlPageDom builds on top of PHP's DOM functions and uses the loadHTML() and saveHTML() methods of the DOMDocument class.
That's why its output is always HTML, not XHTML.
- The HTML parser used by PHP is built for HTML4. It reports errors on HTML5-specific elements; HtmlPageDom
ignores these errors, so it is usable for HTML5 with some limitations.
- HtmlPageDom has not been tested with character encodings other than UTF-8.
History
-------
When I discovered how easy it was to modify HTML documents using jQuery I looked for a PHP library providing similar
possibilities for PHP.
Googling around I found [SimpleHtmlDom](http://simplehtmldom.sourceforge.net)
and later [Ganon](http://code.google.com/p/ganon) but both turned out to be very slow. Nevertheless I used both
libraries in my projects.
When Symfony2 appeared with it's DomCrawler and CssSelector components I thought:
the functions for traversing the DOM tree and selecting elements by CSS selectors are already there, only the
manipulation functions are missing. Let's implement them! So the HtmlPageDom project was born.
It turned out that it was a good choice to build on PHP's DOM functions: Compared to SimpleHtmlDom and Ganon, HmtlPageDom
is lightning fast. In one of my projects, I have a PHP script that takes a huge HTML page containing several hundreds
of article elements and extracts them into individual HTML files (that are later on demand loaded by AJAX back into the
original HTML page). Using SimpleHtmlDom it took the script 3 minutes (right, minutes!) to run (and I needed to raise
PHP's memory limit to over 500MB). Using Ganon as HTML parsing and manipulation engine it took even longer,
about 5 minutes. After switching to HtmlPageDom the same script doing the same processing tasks is running only about
one second (all on the same server). HtmlPageDom is really fast.
© 2019 Christoph Singer, Web-Agentur 72. Licensed under the MIT License.
| 39.141593 | 164 | 0.704725 | eng_Latn | 0.900424 |
58392d3e9b205c23cc486460c14c68a207f127c3 | 2,105 | md | Markdown | docs/providers/alicloud/RouterInterfaceConnection.md | claytonbrown/tf-cfn-provider | b338642692e91a959bba201c2a84be6bfc7c4bed | [
"MIT"
] | 25 | 2019-01-19T10:39:46.000Z | 2021-05-24T23:38:13.000Z | docs/providers/alicloud/RouterInterfaceConnection.md | claytonbrown/tf-cfn-provider | b338642692e91a959bba201c2a84be6bfc7c4bed | [
"MIT"
] | null | null | null | docs/providers/alicloud/RouterInterfaceConnection.md | claytonbrown/tf-cfn-provider | b338642692e91a959bba201c2a84be6bfc7c4bed | [
"MIT"
] | 3 | 2019-02-27T04:34:36.000Z | 2019-06-20T05:25:56.000Z | # Terraform::Alicloud::RouterInterfaceConnection
Provides a VPC router interface connection resource to connect two router interfaces which are in two different VPCs.
After that, all of the two router interfaces will be active.
-> **NOTE:** At present, a router interface does not support changing its opposite router interface. The connection delete action only deactivates the connection (sets it to inactive); it does not clear the connection.

-> **NOTE:** If you want to change the opposite router interface, you can delete the router interface and re-create it.

-> **NOTE:** An integrated router interface connection tunnel requires both the InitiatingSide and the AcceptingSide to configure their opposite router interfaces.
-> **NOTE:** Please remember to add a `depends_on` clause in the router interface connection from the InitiatingSide to the AcceptingSide, because the connection from the AcceptingSide to the InitiatingSide must be done first.
## Properties
`InterfaceId` - (Required, ForceNew) One side router interface ID.
`OppositeInterfaceId` - (Required, ForceNew) Another side router interface ID. It must belong to the specified "opposite_interface_owner_id" account.
`OppositeInterfaceOwnerId` - (Optional, ForceNew) Another side router interface account ID. Log on to the Alibaba Cloud console, select User Info > Account Management to check the account ID. Default to [Provider account_id](https://www.terraform.io/docs/providers/alicloud/index.html#account_id).
`OppositeRouterId` - (Optional, ForceNew) Another side router ID. It must belong to the specified "opposite_interface_owner_id" account. It is valid when field "opposite_interface_owner_id" is specified.
`OppositeRouterType` - (Optional, ForceNew) Another side router Type. Optional value: VRouter, VBR. It is valid when field "opposite_interface_owner_id" is specified.
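A minimal sketch of an integrated connection (all resource names and references below are placeholders; note the `depends_on` on the initiating side, as required above):

```hcl
# The accepting side's connection must be established first.
resource "alicloud_router_interface_connection" "accepting" {
  interface_id          = alicloud_router_interface.accepting.id
  opposite_interface_id = alicloud_router_interface.initiating.id
}

# The initiating side connects only after the accepting side is done.
resource "alicloud_router_interface_connection" "initiating" {
  interface_id          = alicloud_router_interface.initiating.id
  opposite_interface_id = alicloud_router_interface.accepting.id

  depends_on = [alicloud_router_interface_connection.accepting]
}
```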
## Return Values
### Fn::GetAtt
`Id` - Router interface ID. The value is equal to "interface_id".
## See Also
* [alicloud_router_interface_connection](https://www.terraform.io/docs/providers/alicloud/r/router_interface_connection.html) in the _Terraform Provider Documentation_ | 60.142857 | 297 | 0.79905 | eng_Latn | 0.978333 |
5839babf5a19600e50de53e40d1c81349da62fe6 | 2,946 | md | Markdown | mrsgit09Docset0516221715/Office Protocols/MS-PST/5b30032e-8cbc-4f03-a6bd-c21a7f1c54ea.md | mrsgit09/mrsgit09Repo0516221715 | 4dee21c7aff766e4380cb99bb4172b56f713098e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | mrsgit09Docset0516221715/Office Protocols/MS-PST/5b30032e-8cbc-4f03-a6bd-c21a7f1c54ea.md | mrsgit09/mrsgit09Repo0516221715 | 4dee21c7aff766e4380cb99bb4172b56f713098e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | mrsgit09Docset0516221715/Office Protocols/MS-PST/5b30032e-8cbc-4f03-a6bd-c21a7f1c54ea.md | mrsgit09/mrsgit09Repo0516221715 | 4dee21c7aff766e4380cb99bb4172b56f713098e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | <html dir="LTR" xmlns:mshelp="http://msdn.microsoft.com/mshelp" xmlns:ddue="http://ddue.schemas.microsoft.com/authoring/2003/5" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:tool="http://www.microsoft.com/tooltip">
<head>
<meta http-equiv="Content-Type" content="text/html; CHARSET=utf-8"></meta>
<meta name="save" content="history"></meta>
<title>2.6.2.1.2 Allocating from the HN</title>
<xml>
<mshelp:toctitle title="2.6.2.1.2 Allocating from the HN"></mshelp:toctitle>
<mshelp:rltitle title="[MS-PST]: Allocating from the HN"></mshelp:rltitle>
<mshelp:keyword index="A" term="5b30032e-8cbc-4f03-a6bd-c21a7f1c54ea"></mshelp:keyword>
<mshelp:attr name="DCSext.ContentType" value="open specification"></mshelp:attr>
<mshelp:attr name="AssetID" value="5b30032e-8cbc-4f03-a6bd-c21a7f1c54ea"></mshelp:attr>
<mshelp:attr name="TopicType" value="kbRef"></mshelp:attr>
<mshelp:attr name="DCSext.Title" value="[MS-PST]: Allocating from the HN" />
</xml>
</head>
<body>
<div id="header">
<h1 class="heading">2.6.2.1.2 Allocating from the HN</h1>
</div>
<div id="mainSection">
<div id="mainBody">
<div id="allHistory" class="saveHistory"></div>
<div id="sectionSection0" class="section" name="collapseableSection">
<p>Allocates space out of the heap node. This is an extended
case of modifying node data in section <a href="dc322b87-5d91-4e00-8123-c4a155dfe6dd.md">2.6.1.2.3</a>.</p>
<table>
<thead>
<tr>
<th>
<p>Requirement level</p>
</th>
<th>
<p><b><span>Actions</span></b></p>
</th>
</tr>
</thead>
<tr>
<td>
<p>Required</p>
</td>
<td>
<p>See requirements for section 2.6.1.2.3.</p>
<p>A heap allocation MUST fit within a single block.</p>
<p>Maximum size of a heap allocation is 3580 bytes.</p>
<p>HNPAGEMAP for any modified heap block MUST be
maintained (section <a href="291653c0-b347-4c5b-ba41-85ad780b4ba4.md">2.3.1.5</a>).</p>
<p>The Fill Level Map that corresponds to the modified
block (HNs with a data tree) is updated.</p>
</td>
</tr>
<tr>
<td>
<p>Recommended</p>
</td>
<td>
<p>Update
the Fill Level Map that corresponds to the modified block (HNs with a data
tree).</p>
</td>
</tr>
<tr>
<td>
<p>Optional</p>
</td>
<td>
<p>None.</p>
</td>
</tr>
</table>
<p>Possible side effects: See section 2.6.1.2.3.</p>
<p>When an HN no longer fits within a single data block, a data
tree is created to span multiple data blocks. When adding new data blocks,
implementers MUST use the correct block header format (that is, HNHDR, HNPAGEHDR
or HNBITMAPHDR). Refer to section <a href="a3fa280c-eba3-434f-86e4-b95141b3c7b1.md">2.3.1.6</a> for details.</p>
</div>
</div>
</div>
</body>
</html>
5839f633215af0dc2405390a2bd3894b5b65d8bf | 94 | md | Markdown | README.md | jakebrinkmann/lagoon-vampire-bat | 799050568741f5aa22f36d3b5be8dbee935e5926 | ["Unlicense"]
**PROVISIONAL SOFTWARE DISCLAIMER**: This software is preliminary and is subject to revision.
583a1af3355fb2e482bb924949eb67be88953936 | 4,295 | md | Markdown | CONTRIBUTING.md | google/smilesparser | 08eeb066523c26bd186eb62473218026ad9a4754 | ["Apache-2.0"] | stars: 14 (2017-01-12 to 2020-11-29) | forks: 9 (2017-03-03 to 2021-10-12)
# How to contribute #
We'd love to accept your patches and contributions to this project. There are
just a few small guidelines you need to follow.
## Contributor License Agreement ##
Contributions to any Google project must be accompanied by a Contributor
License Agreement. This is not a copyright **assignment**, it simply gives
Google permission to use and redistribute your contributions as part of the
project. Head over to <https://cla.developers.google.com/> to see your current
agreements on file or to sign a new one.
You generally only need to submit a CLA once, so if you've already submitted one
(even if it was for a different project), you probably don't need to do it
again.
## Submitting a patch ##
1. It's generally best to start by opening a new issue describing the bug or
feature you're intending to fix. Even if you think it's relatively minor,
it's helpful to know what people are working on. Mention in the initial
issue that you are planning to work on that bug or feature so that it can
be assigned to you.
1. Follow the normal process of [forking][] the project, and set up a new
branch to work in. It's important that each group of changes be done in
separate branches in order to ensure that a pull request only includes the
commits related to that bug or feature.
1. Any significant changes should almost always be accompanied by tests. The
project already has good test coverage, so look at some of the existing
tests if you're unsure how to go about it. [gocov][] and [gocov-html][]
are invaluable tools for seeing which parts of your code aren't being
exercised by your tests.
1. Do your best to have [well-formed commit messages][] for each change.
This provides consistency throughout the project, and ensures that commit
messages are able to be formatted properly by various git tools.
1. Finally, push the commits to your fork and submit a [pull request][].
[forking]: https://help.github.com/articles/fork-a-repo
[well-formed commit messages]: http://tbaggery.com/2008/04/19/a-note-about-git-commit-messages.html
[squash]: http://git-scm.com/book/en/Git-Tools-Rewriting-History#Squashing-Commits
[pull request]: https://help.github.com/articles/creating-a-pull-request
## Maintainer's Guide ##
(These notes are mostly only for people merging in pull requests.)
**Verify CLAs.** CLAs must be on file for the pull request submitter and commit
author(s). Google's CLA verification system should handle this automatically
and will set commit statuses as appropriate. If there's ever any question about
a pull request, ask [willnorris](https://github.com/willnorris).
**Always try to maintain a clean, linear git history.** With very few
exceptions, running `git log` should not show a bunch of branching and merging.
Never use the GitHub "merge" button, since it always creates a merge commit.
Instead, check out the pull request locally ([these git aliases
help][git-aliases]), then cherry-pick or rebase them onto master. If there are
small cleanup commits, especially as a result of addressing code review
comments, these should almost always be squashed down to a single commit. Don't
bother squashing commits that really deserve to be separate though. If needed,
feel free to amend additional small changes to the code or commit message that
aren't worth going through code review for.
If you made any changes like squashing commits, rebasing onto master, etc., then
GitHub won't recognize that this is the same commit in order to mark the pull
request as "merged". So instead, amend the commit message to include a line
"Fixes #0", referencing the pull request number. This would be in addition to
any other "Fixes" lines for closing related issues. If you forget to do this,
you can also leave a comment on the pull request [like this][rebase-comment].
If you made any other changes, it's worth noting that as well, [like
this][modified-comment].
[git-aliases]: https://github.com/willnorris/dotfiles/blob/d640d010c23b1116bdb3d4dc12088ed26120d87d/git/.gitconfig#L13-L15
[rebase-comment]: https://github.com/google/go-github/pull/277#issuecomment-183035491
[modified-comment]: https://github.com/google/go-github/pull/280#issuecomment-184859046
583ac78f6d82572bfc5ba2fb1816eee287fff1f0 | 986 | md | Markdown | TESTING.md | kaushikrp/idea | 3a64f72f7aae274089278c9d8a92c95becdeb38e | ["MIT"]
# Testing
This document covers the test suite for Netmiko.
## The simple version
```
cd ./netmiko/tests/etc
cp test_devices.yml.example test_devices.yml
cp responses.yml.example responses.yml
cp commands.yml.example commands.yml
```
#### edit test_devices.yml
Pick the device_types you want to test against; update:
* ip
* username
* password
* secret (optional)
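A filled-in entry might look like the following sketch. All values are placeholders (use your own device's details), and the full set of supported keys comes from test_devices.yml.example:

```yaml
# Placeholder values -- replace with your test device's details.
cisco881:
  device_type: cisco_ios
  ip: 192.168.1.1
  username: admin
  password: password123
  secret: enable123    # optional enable secret
```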
#### edit responses.yml
For the device_types that you are testing against, update the following to match the test
device(s):
* the base_prompt
* router_prompt
* enable_prompt
* interface_ip
#### Execute the test
```
cd ./netmiko/tests
```
Note, the test_device is the name of the device from test_devices.yml and responses.yml:
```
py.test -v test_netmiko_show.py --test_device cisco881
py.test -v test_netmiko_config.py --test_device cisco881
```
There are three tests available:
* test_netmiko_show.py
* test_netmiko_config.py
* test_netmiko_commit.py # currently only for Juniper
583b5d47a016231e8a191e61c5f852b2ed7531fb | 2,052 | md | Markdown | content/docs/tutorials/visual_story/validation.md | nurse/ampdocs | 28f5bac7d00ea42ae48371dfe3c0262b27cd3828 | ["Apache-2.0"]
---
$title: Validating your AMP HTML
$order: 7
---
Whenever you create an AMP page, you should always validate that your AMP HTML is correct. There are [several methods that you can use to validate your AMP pages](/docs/guides/validate.html). In this tutorial, we'll enable the AMP Validator by turning on the developer mode. To turn on the developer mode, add the following fragment identifier to your URL and reload the page:
```text
#development=1
```
For example:
```text
http://localhost:8000/pets.html#development=1
```
Open the [Developer Console](https://developer.chrome.com/devtools/docs/console) in Chrome (or your preferred browser), and verify there are no AMP errors. You might need to refresh your browser to see validation messages. If your page is free of errors, you should see the message:
```text
AMP validation successful.
```
## Checking best practices
{{ image('/static/img/docs/tutorials/amp_story/pg7-dev-logs.png', 720, 1280, align='right third', alt='Developer Logs for Cover Page' ) }}
When you turned on the development mode, you might have noticed something else that appears on your AMP page: developer logs.
In development mode for AMP stories, the AMP Runtime performs special checks that provide guidance on making your AMP stories performant. The checks provide best practices, such as:
* For videos and images larger than 720 px, you should use srcset.
* Images and videos in the "fill" layout should be in portrait orientation.
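For example, a large portrait image can satisfy the srcset check with markup like the following sketch. The file names, widths, and alt text are placeholders; adjust them to your story's assets:

```html
<!-- Placeholder file names and sizes; replace with your own assets. -->
<amp-img src="images/cover-720.jpg"
    srcset="images/cover-720.jpg 720w, images/cover-1440.jpg 1440w"
    width="720" height="1280"
    layout="responsive"
    alt="Story cover"></amp-img>
```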
Go through the screens, and verify that all pages adhere to the best practices. You should see green check marks in the logs for all pages.
For more information on best practices, see the [AMP story best practices](/docs/guides/amp_story_best_practices.html) guide.
<div class="prev-next-buttons">
<a class="button prev-button" href="/docs/tutorials/visual_story/create_bookend.html"><span class="arrow-prev">Prev</span></a>
<a class="button next-button" href="/docs/tutorials/visual_story/congratulations.html"><span class="arrow-next">Next</span></a>
</div>
583bc3c6ec76f885e8c21b48416762fbbbd3d5d8 | 2,288 | md | Markdown | docs/reporting-services/report-design/start-pie-chart-values-at-the-top-of-the-pie-report-builder-and-ssrs.md | SteSinger/sql-docs.de-de | 2259e4fbe807649f6ad0d49b425f1f3fe134025d | ["CC-BY-4.0", "MIT"]
---
title: Start pie chart values at the top of the pie (Report Builder and SSRS) | Microsoft Docs
ms.date: 03/01/2017
ms.prod: reporting-services
ms.prod_service: reporting-services-native
ms.technology: report-design
ms.topic: conceptual
ms.assetid: d0e6fb59-ca4e-4d70-97cb-0ad183da21d3
author: maggiesMSFT
ms.author: maggies
ms.openlocfilehash: 7f2e174d9f08e22f57375703093ed44ab57be83e
ms.sourcegitcommit: 3026c22b7fba19059a769ea5f367c4f51efaf286
ms.translationtype: MTE75
ms.contentlocale: de-DE
ms.lasthandoff: 06/15/2019
ms.locfileid: "65578467"
---
# <a name="start-pie-chart-values-at-the-top-of-the-pie-report-builder-and-ssrs"></a>Beginnen der Kreisdiagrammwerte an der Kreisoberseite (Berichts-Generator und SSRS)
Standardmäßig beginnt bei Kreisdiagrammen in paginierten [!INCLUDE[ssRSnoversion_md](../../includes/ssrsnoversion-md.md)] -Berichten der erste Wert im Dataset bei 90 Grad zur Oberseite des Kreises versetzt.

*Die Diagrammwerte beginnen bei 90 Grad.*
Sie können den ersten Wert aber auch ganz oben anordnen.

*Die Diagrammwerte beginnen an der Kreisoberseite.*
## <a name="to-start-the-pie-chart-at-the-top-of-the-pie"></a>So beginnen Sie das Kreisdiagramm an der Kreisoberseite
1. Klicken Sie auf den Kreis selbst.
2. Zum Anzeigen des Bereichs **Eigenschaften** klicken Sie auf der Registerkarte **Ansicht** auf **Eigenschaften**.
3. Ändern Sie im Bereich **Eigenschaften** unter **Benutzerdefinierte Attribute**den Eintrag **PieStartAngle** von **0** in **270**.
4. Klicken Sie auf **Ausführen** , um den Bericht in der Vorschau anzuzeigen.
Der erste Wert beginnt jetzt an der Kreisdiagrammoberseite.
## <a name="see-also"></a>Weitere Informationen
[Formatieren eines Diagramms (Berichts-Generator und SSRS)](../../reporting-services/report-design/formatting-a-chart-report-builder-and-ssrs.md)
[Kreisdiagramme (Berichts-Generator und SSRS)](../../reporting-services/report-design/pie-charts-report-builder-and-ssrs.md)
583bd58cbb1ccc4aad1c03c66d08f8687e15bf08 | 56 | md | Markdown | README.md | andrecarlucci/resilientapp | 24cb6e11112bef56d89f64e69ced0e389445ba85 | ["MIT"] | stars: 1 (2018-04-13)
# resilientapp
Code sample from my talk at MVPConf 2018
583c1ab4732707bf3a3206ad2ff991fad3ad95a7 | 95 | md | Markdown | src/posts/test2.md | skbogner/skbogner.github.io | 09ce6e7e9eb4f1c0b7f1a27482feef4e3be2e7d6 | ["MIT"]
# JS Code sample
```js
const a = i => i
const b = async (cheese, horse) => cheese ** horse
```
583c2526f5c98e88d20c8e9edc06b44abcfb8686 | 3,534 | md | Markdown | articles/active-directory/reports-monitoring/reference-reports-latencies.md | rsletta/azure-docs | c5d7a4b637bc5f8b76514ce8085b5acf443a3282 | ["CC-BY-4.0", "MIT"]
---
title: Azure Active Directory reporting latencies | Microsoft Docs
description: Learn about the amount of time it takes for reporting events to show up in your Azure portal
services: active-directory
documentationcenter: ''
author: priyamohanram
manager: mtillman
editor: ''
ms.assetid: 9b88958d-94a2-4f4b-a18c-616f0617a24e
ms.service: active-directory
ms.devlang: na
ms.topic: reference
ms.tgt_pltfrm: na
ms.workload: identity
ms.component: report-monitor
ms.date: 12/15/2017
ms.author: priyamo
ms.reviewer: dhanyahk
---
# Azure Active Directory reporting latencies
With [reporting](../active-directory-preview-explainer.md) in Azure Active Directory, you get all the information you need to determine how your environment is doing. The amount of time it takes for reporting data to show up in the Azure portal is also known as latency.
This topic lists the latency information for all reporting categories in the Azure portal.
## Activity reports
There are two areas of activity reporting:
- **Sign-in activities** – Information about the usage of managed applications and user sign-in activities
- **Audit logs** - System activity information about users and group management, your managed applications and directory activities
The following table lists the latency information for activity reports.
| Report | Latency (P95) | Latency (P99) |
| :-- | --- | --- |
| Audit logs | 2 mins | 5 mins |
| Sign-ins | 2 mins | 5 mins |
## Security reports
There are two areas of security reporting:
- **Risky sign-ins** - A risky sign-in is an indicator for a sign-in attempt that might have been performed by someone who is not the legitimate owner of a user account.
- **Users flagged for risk** - A risky user is an indicator for a user account that might have been compromised.
The following table lists the latency information for security reports.
| Report | Minimum | Average | Maximum |
| :-- | --- | --- | --- |
| Users at risk | 5 minutes | 15 minutes | 2 hours |
| Risky sign-ins | 5 minutes | 15 minutes | 2 hours |
## Risk events
Azure Active Directory uses adaptive machine learning algorithms and heuristics to detect suspicious actions that are related to your user accounts. Each detected suspicious action is stored in a record called risk event.
The following table lists the latency information for risk events.
| Report | Minimum | Average | Maximum |
| :-- | --- | --- | --- |
| Sign-ins from anonymous IP addresses |5 minutes |15 Minutes |2 hours |
| Sign-ins from unfamiliar locations |5 minutes |15 Minutes |2 hours |
| Users with leaked credentials |2 hours |4 hours |8 hours |
| Impossible travel to atypical locations |5 minutes |1 hour |8 hours |
| Sign-ins from infected devices |2 hours |4 hours |8 hours |
| Sign-ins from IP addresses with suspicious activity |2 hours |4 hours |8 hours |
## Next steps
If you want to know more about the activity reports in the Azure portal, see:
- [Sign-in activity reports in the Azure Active Directory portal](concept-sign-ins.md)
- [Audit activity reports in the Azure Active Directory portal](concept-audit-logs.md)
If you want to know more about the security reports in the Azure portal, see:
- [Users at risk security report in the Azure Active Directory portal](concept-user-at-risk.md)
- [Risky sign-ins report in the Azure Active Directory portal](concept-risky-sign-ins.md)
If you want to know more about risk events, see [Azure Active Directory risk events](concept-risk-events.md).
583c8183666cdd93abb28a04c720417a24fce94a | 15 | md | Markdown | README.md | GustavoMachenski/RDE11 | 13f8a56838723381b599e34b85502128ac6e5a38 | ["MIT"]
# RDE11
RDE11
583c88877984367d00b73936e11b58feef777cca | 1,103 | md | Markdown | README.md | MrDoomy/GQLServer | 982cff0d6cf1bec6e264165e20fe6be35facb810 | ["Beerware"]
# GraphQL Server
## Explanation
This is an example of a modern API built with **Go** and **GraphQL**, based on a _book store_ idea.
### Mandatory
_Install **MongoDB** and run it before launching this program._
```
mkdir db
mongod --dbpath db
```
## Process
Repository:
```
git clone https://github.com/dmnchzl/gqlserver.git
```
Dependencies:
```
go get -v github.com/graphql-go/graphql
go get -v github.com/graphql-go/handler
go get -v go.mongodb.org/mongo-driver/mongo
```
Build:
```
go build main.go
```
**Enjoy !**
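Once the server is up, you can exercise the API by sending a GraphQL query to the handler's endpoint. The query below is only a sketch: the `books` field and its subfields are assumptions based on the book-store idea, so adjust them to the schema actually defined in this project:

```graphql
{
  books {
    title
    author
    price
  }
}
```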
## Docs
- The **Go** Programming Language : [https://golang.org](https://golang.org)
- **GraphQL** : [https://github.com/graphql-go/graphql](https://github.com/graphql-go/graphql)
- **MongoDB** Go Driver : [https://github.com/mongodb/mongo-go-driver](https://github.com/mongodb/mongo-go-driver)
## License
```
"THE BEER-WARE LICENSE" (Revision 42):
<[email protected]> wrote this file. As long as you retain this notice you
can do whatever you want with this stuff. If we meet some day, and you think
this stuff is worth it, you can buy me a beer in return. Damien Chazoule
```
583e1cd0a7672e0df363911b6020b478dd03e333 | 17,460 | md | Markdown | articles/automation/automation-solution-vm-management-config.md | flarocca/azure-docs.es-es | 8d69748012641d57ddb2b81a3e1c2d079703ed8d | ["CC-BY-4.0", "MIT"]
---
title: Configure Start/Stop VMs during off-hours in Azure Automation
description: This article describes how to configure the Start/Stop VMs during off-hours feature to support different use cases or scenarios.
services: automation
ms.subservice: process-automation
ms.date: 06/01/2020
ms.topic: conceptual
ms.openlocfilehash: b0bc23d515bebdd0d943bbad33c5ebba35a35605
ms.sourcegitcommit: bdd5c76457b0f0504f4f679a316b959dcfabf1ef
ms.translationtype: HT
ms.contentlocale: es-ES
ms.lasthandoff: 09/22/2020
ms.locfileid: "90987213"
---
# <a name="configure-startstop-vms-during-off-hours"></a>Configuración de la solución Start/Stop VMs during off-hours
En este artículo se describe cómo configurar la característica [Start/Stop VMs during off-hours](automation-solution-vm-management.md) para admitir los escenarios descritos. También puede obtener información sobre cómo:
* [Configurar notificaciones por correo electrónico](#configure-email-notifications)
* [Agregar una VM](#add-a-vm)
* [Excluir una VM](#exclude-a-vm)
* [Modificar las programaciones de inicio y apagado](#modify-the-startup-and-shutdown-schedules)
## <a name="scenario-1-startstop-vms-on-a-schedule"></a><a name="schedule"></a>Escenario 1: Iniciar o detener las máquinas virtuales según una programación
Este escenario es la configuración predeterminada cuando se implementa la característica Start/Stop VMs during off-hours por primera vez. Por ejemplo, puede configurar la característica para que detenga todas las máquinas virtuales de una suscripción por la noche, cuando salga del trabajo, e iniciarlas por la mañana cuando vuelva a la oficina. Al configurar las programaciones **Scheduled-StartVM** y **Scheduled-StopVM** durante la implementación, inician y detienen las máquinas virtuales de destino.
La característica puede configurarse para detener solo las máquinas virtuales. Consulte [Modificación de las programaciones de inicio y apagado](#modify-the-startup-and-shutdown-schedules) para obtener información sobre cómo configurar una programación personalizada.
> [!NOTE]
> La zona horaria que usa la característica es su zona horaria actual cuando configura el parámetro de hora de la programación. Sin embargo, Azure Automation la guarda en formato UTC. No es necesario realizar ninguna conversión de zona horaria, ya que esto se realiza durante la implementación de la máquina.
Para controlar las máquinas virtuales que están en el ámbito, configure las variables `External_Start_ResourceGroupNames`, `External_Stop_ResourceGroupNames` y `External_ExcludeVMNames`.
Puede habilitar que el destino de la acción sea una suscripción y un grupo de recursos, o una lista específica de máquinas virtuales, pero no ambas cosas.
### <a name="target-the-start-and-stop-actions-against-a-subscription-and-resource-group"></a>Destino de las acciones de inicio y detención a un grupo de recursos y una suscripción
1. Configure las variables `External_Stop_ResourceGroupNames` y `External_ExcludeVMNames` para especificar las VM de destino.
2. Habilite y actualice las programaciones **Scheduled-StartVM** y **Scheduled-StopVM**.
3. Ejecute el runbook **ScheduledStartStop_Parent** con el campo de parámetro **ACTION** establecido en **start** y el campo de parámetro **WHATIF** establecido en True para obtener una vista previa de los cambios.
### <a name="target-the-start-and-stop-action-by-vm-list"></a>Destino de las acciones de inicio y detención por lista de máquinas virtuales
1. Ejecute el runbook **ScheduledStartStop_Parent** con **ACTION** establecido en **start**.
2. Agregue una lista separada por comas de máquinas virtuales (sin espacios) en el campo de parámetro **VMList**. Una lista de ejemplo es `vm1,vm2,vm3`.
3. Establezca el campo de parámetro **WHATIF** en True.
4. Configure la variable `External_ExcludeVMNames` con una lista separada por comas de máquinas virtuales (VM1, VM2, VM3), sin espacios entre los valores separados por comas.
5. En este escenario no se respetan las variables `External_Start_ResourceGroupNames` y `External_Stop_ResourceGroupnames`. Para este escenario, es preciso que cree su propia programación de Automation. Para más información, consulte [Programación de un runbook en Azure Automation](shared-resources/schedules.md).
> [!NOTE]
> The value for **Target ResourceGroup Names** is stored as the value for both `External_Start_ResourceGroupNames` and `External_Stop_ResourceGroupNames`. For more granularity, you can modify each of these variables to target different resource groups. For the start action, use `External_Start_ResourceGroupNames`; for the stop action, use `External_Stop_ResourceGroupNames`. VMs are automatically added to the start and stop schedules.
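The scoping rules above amount to a simple filter: a VM is acted on when its resource group matches the configured list (or the list is the `*` wildcard) and its name is not in the exclusion list. The following Python sketch only illustrates that logic; it is not the runbook's actual implementation:

```python
# Illustrative sketch of the feature's targeting logic (not the runbook code).
def resolve_targets(vms, resource_group_names, exclude_vm_names):
    """vms: list of (vm_name, resource_group) pairs.
    resource_group_names: comma-separated groups, or "*" for the whole subscription.
    exclude_vm_names: comma-separated VM names to skip (no spaces)."""
    groups = [g for g in resource_group_names.split(",") if g]
    excluded = {v for v in exclude_vm_names.split(",") if v}
    return [name for name, rg in vms
            if name not in excluded and ("*" in groups or rg in groups)]

vms = [("vm1", "rg-dev"), ("vm2", "rg-dev"), ("vm3", "rg-prod")]
print(resolve_targets(vms, "rg-dev", "vm2"))  # ['vm1']
print(resolve_targets(vms, "*", ""))          # ['vm1', 'vm2', 'vm3']
```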
## <a name="scenario-2-startstop-vms-in-sequence-by-using-tags"></a><a name="tags"></a>Escenario 2: Iniciar o detener las máquinas virtuales en secuencia mediante el uso de etiquetas
En un entorno que incluya dos, o más, componentes de varias máquinas virtuales que admitan una carga de trabajo distribuida, es importante que se pueda decidir la secuencia en que los componentes se inician o detienen en orden.
### <a name="target-the-start-and-stop-actions-against-a-subscription-and-resource-group"></a>Destino de las acciones de inicio y detención a un grupo de recursos y una suscripción
1. Agregue las etiquetas `sequencestart` y `sequencestop` con un valor entero positivo a las máquinas virtuales de destino en las variables `External_Start_ResourceGroupNames` y `External_Stop_ResourceGroupNames`. Las acciones de inicio y detención se realizan en orden ascendente. Para aprender a etiquetar una máquina virtual, consulte [Etiquetado de una máquina virtual Windows en Azure](../virtual-machines/windows/tag.md) y [Etiquetado de una máquina virtual Linux en Azure](../virtual-machines/linux/tag.md).
2. Modifique las programaciones de **Sequenced-StartVM** y **Sequenced-StopVM** a una fecha y hora que cumplan sus requisitos, y habilite la programación.
3. Ejecute el runbook **SequencedStartStop_Parent** con **ACTION** establecido en **start** y **WHATIF** establecido en True para obtener una vista previa de los cambios.
4. Obtenga una vista previa de la acción y realice los cambios necesarios antes de implementarla en las máquinas virtuales de producción. Cuando esté preparado, ejecute manualmente el runbook con el parámetro establecido en **False** o deje que la programación de Automation **Sequenced-StartVM** y **Sequenced-StopVM** se ejecuten automáticamente siguiendo su programación prescrita.
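The ascending-order behavior can be illustrated with a short sketch: collect each VM's `sequencestart` tag and sort by its integer value. This is illustrative Python only; the runbook applies the same idea to `sequencestop` for the stop action:

```python
# Illustrative sketch: compute the start order from sequencestart tags.
def start_order(vm_tags):
    """vm_tags: mapping of VM name -> tag dict; tag values are positive integers as strings."""
    tagged = [(int(tags["sequencestart"]), name)
              for name, tags in vm_tags.items() if "sequencestart" in tags]
    return [name for _, name in sorted(tagged)]  # ascending tag order

vms = {
    "web01": {"sequencestart": "3", "sequencestop": "1"},
    "sql01": {"sequencestart": "1", "sequencestop": "3"},
    "app01": {"sequencestart": "2", "sequencestop": "2"},
}
print(start_order(vms))  # ['sql01', 'app01', 'web01']
```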
### <a name="target-the-start-and-stop-actions-by-vm-list"></a>Destino de las acciones de inicio y detención por lista de máquinas virtuales
1. Agregue las etiquetas `sequencestart` y `sequencestop` con un valor entero positivo a las máquinas virtuales que pretende agregar al parámetro `VMList`.
2. Ejecute el runbook **SequencedStartStop_Parent** con **ACTION** establecido en **start**.
3. Agregue una lista separada por comas de máquinas virtuales (sin espacios) en el campo de parámetro **VMList**. Una lista de ejemplo es `vm1,vm2,vm3`.
4. Establezca **WHATIF** en True.
5. Configure la variable `External_ExcludeVMNames` con una lista separada por comas de máquinas virtuales, sin espacios entre los valores separados por comas.
6. En este escenario no se respetan las variables `External_Start_ResourceGroupNames` y `External_Stop_ResourceGroupnames`. Para este escenario, es preciso que cree su propia programación de Automation. Para más información, consulte [Programación de un runbook en Azure Automation](shared-resources/schedules.md).
7. Obtenga una vista previa de la acción y realice los cambios necesarios antes de implementarla en las máquinas virtuales de producción. Cuando esté listo, ejecute manualmente **monitoring-and-diagnostics/monitoring-action-groupsrunbook** con el parámetro establecido en **False**. Como alternativa, deje que las programaciones de Automation **Sequenced-StartVM** y **Sequenced-StopVM** se ejecuten automáticamente siguiendo la programación prescrita.
## <a name="scenario-3-start-or-stop-automatically-based-on-cpu-utilization"></a><a name="cpuutil"></a>Escenario 3: Iniciar o detener automáticamente según el uso de CPU
La característica Start/Stop VMs during off-hours puede ayudarle a administrar el costo de la ejecución de máquinas virtuales de Azure Resource Manager y clásicas en su suscripción mediante la evaluación de las máquinas virtuales de Azure que no se utilizan durante los períodos de poca actividad, por ejemplo, las horas no laborables, y apagarlas automáticamente si el uso del procesador es menor de un porcentaje especificado.
De forma predeterminada, la característica está preconfigurada para evaluar la métrica de CPU en porcentaje, con el fin de ver si la utilización media es del 5 %, o menos. Este escenario se controla mediante las siguientes variables, y puede modificarse si los valores predeterminados no satisfacen sus requisitos:
* `External_AutoStop_MetricName`
* `External_AutoStop_Threshold`
* `External_AutoStop_TimeAggregationOperator`
* `External_AutoStop_TimeWindow`
* `External_AutoStop_Frequency`
* `External_AutoStop_Severity`
You can enable the action to target a subscription and resource group, or a specific list of VMs.
When the **AutoStop_CreateAlert_Parent** runbook runs, it verifies that the subscription, target resource groups, and VMs exist. If the VMs exist, the runbook calls the **AutoStop_CreateAlert_Child** runbook for each VM verified by the parent runbook. This child runbook:
* Creates a metric alert rule for each verified VM.
* Triggers the **AutoStop_VM_Child** runbook for a given VM if the CPU drops below the configured threshold for the specified time interval.
* Attempts to stop the VM.
### <a name="target-the-autostop-action-against-all-vms-in-a-subscription"></a>Target the AutoStop action against all VMs in a subscription
1. Make sure that the `External_Stop_ResourceGroupNames` variable is empty or set to * (wildcard).
2. Optionally, if you want to exclude some VMs from the AutoStop action, you can add a comma-separated list of VM names to the `External_ExcludeVMNames` variable.
3. Enable the **Schedule_AutoStop_CreateAlert_Parent** schedule to run to create the required Stop VM metric alert rules for all the VMs in your subscription. Running this type of schedule lets you create new metric alert rules as new VMs are added to the subscription.
### <a name="target-the-autostop-action-against-all-vms-in-a-resource-group-or-multiple-resource-groups"></a>Target the AutoStop action against all VMs in a resource group or multiple resource groups
1. Add a comma-separated list of resource group names to the `External_Stop_ResourceGroupNames` variable.
2. If you want to exclude some of the VMs from the AutoStop action, you can add a comma-separated list of VM names to the `External_ExcludeVMNames` variable.
3. Enable the **Schedule_AutoStop_CreateAlert_Parent** schedule to run to create the required Stop VM metric alert rules for all the VMs in your resource groups. Running this operation on a schedule lets you create new metric alert rules as new VMs are added to the resource groups.
### <a name="target-the-autostop-action-to-a-list-of-vms"></a>Target the AutoStop action to a list of VMs
1. Create a new [schedule](shared-resources/schedules.md#create-a-schedule) and link it to the **AutoStop_CreateAlert_Parent** runbook, adding a comma-separated list of VM names to the `VMList` parameter.
2. Optionally, if you want to exclude some VMs from the AutoStop action, you can add a comma-separated list of VM names (no spaces) to the `External_ExcludeVMNames` variable.
## <a name="configure-email-notifications"></a>Configure email notifications
To change email notifications after the Start/Stop VMs during off-hours feature is deployed, you can modify the action group created during deployment.
> [!NOTE]
> Subscriptions in the Azure Government cloud don't support the email functionality of this feature.
1. In the Azure portal, navigate to **Monitor**, then **Action groups**. Select the action group called **StartStop_VM_Notification**.
:::image type="content" source="media/automation-solution-vm-management/azure-monitor.png" alt-text="Screenshot of the Monitor - Action groups page.":::
2. On the StartStop_VM_Notification page, click **Edit details** under **Details**. The Email/SMS/Push/Voice page opens. Update the email address and click **OK** to save your changes.
:::image type="content" source="media/automation-solution-vm-management/change-email.png" alt-text="Screenshot of the Monitor - Action groups page.":::
You can also add additional actions to the action group. For more information about action groups, see [action groups](../azure-monitor/platform/action-groups.md)
The following is an example email that is sent when the feature shuts down VMs.
:::image type="content" source="media/automation-solution-vm-management/email.png" alt-text="Screenshot of the Monitor - Action groups page.":::
## <a name="add-or-exclude-vms"></a><a name="add-exclude-vms"></a>Add or exclude VMs
The feature allows you to add VMs to be targeted, or to exclude them.
### <a name="add-a-vm"></a>Add a VM
There are two ways to ensure that a VM is included when the feature runs:
* Each of the feature's parent [runbooks](automation-solution-vm-management.md#runbooks) has a `VMList` parameter. You can pass a comma-separated list of VM names (no spaces) to this parameter when scheduling the appropriate parent runbook for your situation, and these VMs will be included when the feature runs.
* To select multiple VMs, set `External_Start_ResourceGroupNames` and `External_Stop_ResourceGroupNames` to the names of the resource groups that contain the VMs you want to start or stop. You can also set the variables to a value of `*` to have the feature run against all resource groups in the subscription.
### <a name="exclude-a-vm"></a>Exclude a VM
To exclude a VM from the Start/Stop VMs during off-hours feature, you can add its name to the `External_ExcludeVMNames` variable. This variable is a comma-separated list of specific VMs (no spaces) to exclude from the feature. The list is limited to 140 VMs. If you add more than 140 VMs to this comma-separated list, VMs that are set to be excluded might be inadvertently started or stopped.
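For example, the exclusion list can be updated from PowerShell with the Az.Automation module — a hedged sketch, where the resource group and Automation account names are placeholders you'd replace with your own:

```powershell
# Hypothetical account/resource-group names; substitute your own.
Set-AzAutomationVariable -ResourceGroupName "rg-automation" `
    -AutomationAccountName "myAutomationAccount" `
    -Name "External_ExcludeVMNames" `
    -Encrypted $false `
    -Value "vm1,vm2,vm3"
```

Note the value is a single comma-separated string with no spaces, matching the format the runbooks expect.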
## <a name="modify-the-startup-and-shutdown-schedules"></a>Modify the startup and shutdown schedules
Managing the startup and shutdown schedules in this feature follows the same steps described in [Scheduling a runbook in Azure Automation](shared-resources/schedules.md). Separate schedules are required to start and stop VMs.
The feature can be configured to only stop VMs at a certain time. In this scenario, you just create a Stop schedule and no corresponding Start schedule.
1. Make sure you've added the resource groups for the VMs to shut down in the `External_Stop_ResourceGroupNames` variable.
2. Create your own schedule for the time when you want to shut down the VMs.
3. Navigate to the **ScheduledStartStop_Parent** runbook and click **Schedule**. This allows you to select the schedule you created in the preceding step.
4. Select **Parameters and run settings** and set the **ACTION** field to **Stop**.
5. Select **OK** to save your changes.
## <a name="next-steps"></a>Next steps
* To monitor the feature during operation, see [Query logs from Start/Stop VMs during off-hours](automation-solution-vm-management-logs.md).
* To handle problems during VM management, see [Troubleshoot the Start/Stop VMs during off-hours feature](troubleshoot/start-stop-vm.md).
---
title: Replace in Files | Microsoft Docs
ms.date: 11/15/2016
ms.prod: visual-studio-dev14
ms.technology: vs-ide-general
ms.topic: conceptual
f1_keywords:
- vs.findreplace.replaceinfiles
- vs.replaceinfiles
helpviewer_keywords:
- text searches, replacing text
- Find and Replace window, Replace in Files tab
- Replace in Files tab, Find and Replace window
ms.assetid: ca361466-53bd-44db-a28a-3a74bc03b028
caps.latest.revision: 33
author: gewarren
ms.author: gewarren
manager: jillfra
ms.openlocfilehash: 72a20b0271542dd914aeb592dda3f0cb446a0000
ms.sourcegitcommit: 08fc78516f1107b83f46e2401888df4868bb1e40
ms.translationtype: MT
ms.contentlocale: it-IT
ms.lasthandoff: 05/15/2019
ms.locfileid: "65698678"
---
# <a name="replace-in-files"></a>Replace in Files
[!INCLUDE[vs2017banner](../includes/vs2017banner.md)]
**Replace in Files** allows you to search the code of a specified set of files for a string or expression, and to change some or all of the matches found. The found matches and actions taken are listed in the **Find Results** window selected under **Result options**.
> [!NOTE]
> The dialog boxes and menu commands you see might differ from those described in Help depending on your active settings or edition. To change your settings, choose Import and Export Settings on the Tools menu. For more information, see [Customizing Development Settings in Visual Studio](https://msdn.microsoft.com/22c4debb-4e31-47a8-8f19-16f328d7dcd3).
You can use either of the following methods to display **Replace in Files** in the **Find and Replace** window.
### <a name="to-display-replace-in-files"></a>To display Replace in Files
1. On the **Edit** menu, expand **Find and Replace**.
2. Choose **Replace in Files**.
or
If the **Find and Replace** window is already displayed, on the toolbar, choose **Replace in Files**.
## <a name="find-what"></a>Find what
To search for a new text string or expression, specify it in the box. To search for any of the 20 strings you searched for most recently, open the list and choose the string you want to search for. Choose the adjacent **Expression Builder** button if you want to use one or more regular expressions in your search string. For more information, see [Using Regular Expressions in Visual Studio](../ide/using-regular-expressions-in-visual-studio.md).
## <a name="replace-with"></a>Replace with
To replace instances of the string in the **Find what** box with another string, enter the replacement string in the **Replace with** box. To delete instances of the string in the **Find what** box, leave this field blank. Open the list to display the 20 strings you searched for most recently. Choose the adjacent **Expression Builder** button if you want to use one or more regular expressions in your replacement string. For more information, see [Using Regular Expressions in Visual Studio](../ide/using-regular-expressions-in-visual-studio.md).
## <a name="look-in"></a>Look in
The option chosen from the **Look in** drop-down list determines whether **Replace in Files** searches only in currently active files or in all files stored within certain folders. Select a search scope from the list, type a folder path, or click the **Browse (...)** button to display the **Choose Search Folders** dialog box and enter the set of folders to search. You can also type a path directly in the **Look in** box.
> [!NOTE]
> If the **Look in** option selected causes a file checked out of source control to be searched, only the version of the file downloaded to the local machine is searched.
## <a name="find-options"></a>Find options
You can expand or collapse the **Find options** section. The following options can be selected or cleared:
Match case
When selected, the **Find Results** windows only display instances of the string specified in **Find what** that match both in content and in case. For example, searching for "MyObject" with **Match case** selected returns "MyObject" but not "myobject" or "MYOBJECT".
Match whole word
When selected, the **Find Results** windows only display instances of the string specified in **Find what** that are matched in complete words. For example, searching for "MyObject" returns "MyObject" but not "CMyObject" or "MyObjectC".
Use regular expressions
When this check box is selected, you can use special notations to define patterns of text in the **Find what** or **Replace with** text boxes. For a list of these notations, see [Using Regular Expressions in Visual Studio](../ide/using-regular-expressions-in-visual-studio.md).
Look at these file types
This list indicates the types of files to search through in the **Look in** directories. If this field is blank, all files in the **Look in** directories are searched.
Select any item in the list to enter a preconfigured search string that finds files of the specified types.
## <a name="result-options"></a>Result options
You can expand or collapse the **Result options** section. The following options can be selected or cleared:
Find results 1 window
When selected, the results of the current search replace the content of the **Find Results 1** window. This window opens automatically to display your search results. To open this window manually, select **Other Windows** from the **View** menu and choose **Find Results 1**.
Find results 2 window
When selected, the results of the current search replace the content of the **Find Results 2** window. This window opens automatically to display your search results. To open this window manually, select **Other Windows** from the **View** menu and choose **Find Results 2**.
Display file names only
When this check box is selected, the Find Results windows display the full names and paths of all files that contain the search string. However, the results do not include the line of code where the string appears. This check box is available for searches in files only.
Keep modified files open after Replace All
When selected, all files in which replacements were made remain open so that you can undo or save the changes. Memory constraints might limit the number of files that can remain open after a replace operation.
> [!CAUTION]
> You can only use **Undo** on files that remain open for editing. If this option is not selected, files that are not already open for editing remain closed, and **Undo** is not available for those files.
## <a name="see-also"></a>See also
[Finding and Replacing Text](../ide/finding-and-replacing-text.md)
[Find in Files](../ide/find-in-files.md)
[Visual Studio Commands](../ide/reference/visual-studio-commands.md)
---
title: Early and late binding
ms.date: 07/20/2015
helpviewer_keywords:
- early binding [Visual Basic]
- objects [Visual Basic], late-bound
- objects [Visual Basic], early-bound
- objects [Visual Basic], late bound
- early binding [Visual Basic], Visual Basic compiler
- binding [Visual Basic], late and early
- objects [Visual Basic], early bound
- Visual Basic compiler, early and late binding
- late binding [Visual Basic]
- late binding [Visual Basic], Visual Basic compiler
ms.assetid: d6ff7f1e-b94f-4205-ab8d-5cfa91758724
ms.openlocfilehash: e8d87e095b7c3104e3a2d66525644d1771ae883e
ms.sourcegitcommit: f8c270376ed905f6a8896ce0fe25b4f4b38ff498
ms.translationtype: MT
ms.contentlocale: es-ES
ms.lasthandoff: 06/04/2020
ms.locfileid: "84410636"
---
# <a name="early-and-late-binding-visual-basic"></a>Early and late binding (Visual Basic)
The Visual Basic compiler performs a process called `binding` when an object is assigned to an object variable. An object is *early bound* when it is assigned to a variable declared to be of a specific object type. Early bound objects allow the compiler to allocate memory and perform other optimizations before an application executes. For example, the following code fragment declares a variable to be of type <xref:System.IO.FileStream>:
[!code-vb[VbVbalrOOP#90](~/samples/snippets/visualbasic/VS_Snippets_VBCSharp/VbVbalrOOP/VB/OOP.vb#90)]
Because <xref:System.IO.FileStream> is a specific object type, the instance assigned to `FS` is early bound.
By contrast, an object is *late bound* when it is assigned to a variable declared to be of type `Object`. Objects of this type can hold references to any object, but lack many of the advantages of early-bound objects. For example, the following code fragment declares an object variable to hold an object returned by the `CreateObject` function:
[!code-vb[VbVbalrOOP#91](~/samples/snippets/visualbasic/VS_Snippets_VBCSharp/VbVbalrOOP/VB/LateBinding.vb#91)]
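Since the referenced code snippets aren't reproduced here, the contrast can be sketched as follows (the `"Excel.Application"` ProgID is illustrative and assumes that COM component is registered on the machine):

```vb
' Early bound: the type is known at compile time, so the compiler
' can verify member calls and optimize the generated code.
Dim FS As System.IO.FileStream
FS = New System.IO.FileStream("sample.txt", System.IO.FileMode.Open)

' Late bound: the variable is declared As Object, so member calls
' are resolved only at run time.
Dim xlApp As Object
xlApp = CreateObject("Excel.Application")
```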
## <a name="advantages-of-early-binding"></a>Advantages of early binding
You should use early-bound objects whenever possible, because they allow the compiler to make important optimizations that yield more efficient applications. Early-bound objects are significantly faster than late-bound objects and make your code easier to read and maintain by stating exactly what kind of objects are being used. Another advantage of early binding is that it enables useful features such as automatic code completion and Dynamic Help, because the Visual Studio integrated development environment (IDE) can determine exactly what type of object you are working with as you edit the code. Early binding reduces the number and severity of run-time errors because it allows the compiler to report errors when a program is compiled.
> [!NOTE]
> Late binding can only be used to access type members that are declared as `Public`. Accessing members declared as `Friend` or `Protected Friend` results in a run-time error.
## <a name="see-also"></a>See also
- <xref:Microsoft.VisualBasic.Interaction.CreateObject%2A>
- [Object Lifetime: How Objects Are Created and Destroyed](../objects-and-classes/object-lifetime-how-objects-are-created-and-destroyed.md)
- [Object Data Type](../../../language-reference/data-types/object-data-type.md)
---
layout: default
title: gzip/gunzip functionality in OpenAF
parent: Beginner
grand_parent: Guides
---
# gzip/gunzip functionality in OpenAF
There is built-in functionality in OpenAF to apply gzip or gunzip to files or arrays of bytes. The functionality is available on the io object through the functions _io.gzip_, _io.gunzip_, _io.readFileGzipStream_ and _io.writeFileGzipStream_. They divide into two ways of using it:
## To/From an array of bytes
The simplest way is to gzip/gunzip to/from an array of bytes:
````javascript
var theLog = io.gunzip(io.readFileBytes("abc.log.gz"));
var gzLog = io.writeFileBytes("new.log.gz", io.gzip(theLog));
````
The only issue is that the array of bytes (e.g. theLog, gzLog) will be kept in memory.
## To/From streams
To handle larger sets of bytes without incurring that memory cost, especially for large files, there are the stream-based functions:
````javascript
var rstream = io.readFileGzipStream("abc.log.gz");
// use the rstream and change it's content
// when ready to write just create a write stream
var wstream = io.writeFileGzipStream("new.log.gz");
// and use it to write to the new gzip file
````
## Compress/Uncompress
To help store big JavaScript objects (even in memory) OpenAF provides two functions: _compress_ and _uncompress_.
Of course, the gains will be greater the bigger, and more compressible, the object is. Let's see some examples:
````javascript
> var out = compress(io.listFiles("."));
> out.length
1959
> stringify(io.listFiles("."), void 0, 22).length
11674
> stringify(uncompress(out), void 0, "").length
11674
````
Of course, objects are not stored in memory as their stringified version, but you get the idea. It's specific to cases where you need to keep an object in memory that you won't be accessing in the medium/long term of the execution of your OpenAF script. Of course, it's also easy to save/load from a binary file:
````javascript
> io.writeFileBytes("myObj.gz", compress(io.listFiles(".")));
> var theLog = uncompress(io.readFileBytes("myObj.gz"));
````
# Learning-CPlusPlus
## Information
Programs developed in the spring of 2018 for Engineering 102 at Shepherd University. Here you will find my original C++ source code that I wrote during that time, in its final form. All of these programs were written in Microsoft Visual Studio and debugged through its local debugging console. My only intention for this repository is archival purposes and general posterity. I would very much like to know if you are able to find a useful purpose for these programs; however, please note that they were not designed with user-friendliness in mind, and at the time of this writing I have no intention of developing them into full-fledged applications. Not all of these programs are guaranteed to function properly or at all, and some are missing key documentation, so please keep this in mind when attempting to adapt or use this source code.
Adapted and based around examples provided in the book *C++ for Engineers and Scientists (4th Edition)* by Gary J. Bronson.
# Function-Configuration Reference
This document provides a reference of the Nuclio function configuration.
#### In This Document
- [Basic configuration structure](#basic-structure)
- [Function metadata (`metadata`)](#metadata)
- [Function Specification (`spec`)](#specification)
- [Example](#spec-example)
- [See also](#see-also)
<a id="basic-structure"></a>
## Basic configuration structure
The basic structure of the Nuclio function configuration resembles Kubernetes resource definitions, and includes the `apiVersion`, `kind`, `metadata`, `spec`, and `status` sections. Following is an example of a minimal definition:
```yaml
apiVersion: "nuclio.io/v1"
kind: NuclioFunction
metadata:
name: example
spec:
image: example:latest
```
<a id="metadata"></a>
## Function Metadata (`metadata`)
The `metadata` section includes the following attributes:
| **Path** | **Type** | **Description** |
| :--- | :--- | :--- |
| name | string | The name of the function |
| namespace | string | A level of isolation provided by the platform (e.g., Kubernetes) |
| labels | map | A list of key-value tags that are used for looking up the function (immutable, can't update after first deployment) |
| annotations | map | A list of annotations based on the key-value tags |
### Example
```yaml
metadata:
name: example
namespace: nuclio
labels:
l1: lv1
l2: lv2
l3: 100
annotations:
a1: av1
```
<a id="specification"></a>
## Function Specification (`spec`)
The `spec` section contains the requirements and attributes and has the following elements:
| **Path** | **Type** | **Description** |
| :--- | :--- | :--- |
| description | string | A textual description of the function |
| handler | string | The entry point to the function, in the form of `package:entrypoint`. Varies slightly between runtimes, see the appropriate runtime documentation for specifics |
| runtime | string | The name of the language runtime. One of: `golang`, `python:2.7`, `python:3.6`, `shell`, `java`, `nodejs`, `pypy` |
| <a id="spec.image"></a>image | string | The name of the function's container image — used for the `image` [code-entry type](#spec.build.codeEntryType); see [Code-Entry Types](/docs/reference/function-configuration/code-entry-types.md#code-entry-type-image) |
| env | map | A name-value environment-variables tuple; it's also possible to reference secrets from the map elements, as demonstrated in the [specification example](#spec-example) |
| volumes | map | A map in an architecture similar to Kubernetes volumes, for Docker deployment |
| replicas | int | The number of desired instances; 0 for auto-scaling. |
| minReplicas | int | The minimum number of replicas |
| platform.attributes.restartPolicy.name | string | Function image container restart policy name (applied for docker platform only) |
| platform.attributes.restartPolicy.maximumRetryCount | int | The maximum number of restart retries before the restart policy is exhausted |
| platform.attributes.processorMountMode | string | The way docker would mount the processor config (options: bind, volume; default: bind) |
| maxReplicas | int | The maximum number of replicas |
| targetCPU | int | Target CPU when auto scaling, as a percentage (default: 75%) |
| dataBindings | See reference | A map of data sources used by the function ("data bindings") |
| triggers.(name).maxWorkers | int | The max number of concurrent requests this trigger can process |
| triggers.(name).kind | string | The trigger type (kind) - `cron` \| `eventhub` \| `http` \| `kafka-cluster` \| `kinesis` \| `nats` \| `rabbit-mq` |
| triggers.(name).url | string | The trigger specific URL (not used by all triggers) |
| triggers.(name).annotations | list of strings | Annotations to be assigned to the trigger, if applicable |
| triggers.(name).workerAvailabilityTimeoutMilliseconds | int | The number of milliseconds to wait for a worker if one is not available. 0 = never wait (default: 10000, which is 10 seconds)|
| triggers.(name).attributes | See [reference](/docs/reference/triggers) | The per-trigger attributes |
| <a id="spec.build.path"></a>build.path | string | The URL of a GitHub repository or an archive-file that contains the function code — for the `github` or `archive` [code-entry type](#spec.build.codeEntryType) — or the URL of a function source-code file; see [Code-Entry Types](/docs/reference/function-configuration/code-entry-types.md) |
| <a id="spec.build.functionSourceCode"></a>build.functionSourceCode | string | Base-64 encoded function source code for the `sourceCode` [code-entry type](#spec.build.codeEntryType); see [Code-Entry Types](/docs/reference/function-configuration/code-entry-types.md#code-entry-type-sourcecode) |
| build.registry | string | The container image repository to which the built image will be pushed |
| build.noBaseImagePull | string | Do not pull any base images when building, use local images only |
| build.noCache | string | Do not use any caching when building container images |
| build.baseImage | string | The name of a base container image from which to build the function's processor image |
| build.Commands | list of string | Commands run opaquely as part of container image build |
| build.onbuildImage | string | The name of an "onbuild" container image from which to build the function's processor image; the name can include `{{ .Label }}` and `{{ .Arch }}` for formatting |
| build.image | string | The name of the built container image (default: the function name) |
| <a id="spec.build.codeEntryType"></a>build.codeEntryType | string | The function's code-entry type - `archive` \| `github` \| `image` \| `s3` \| `sourceCode`; see [Code-Entry Types](/docs/reference/function-configuration/code-entry-types.md) |
| <a id="spec.build.codeEntryAttributes"></a>build.codeEntryAttributes | See [reference](/docs/reference/function-configuration/code-entry-types.md#external-func-code-entry-types) | Code-entry attributes, which provide information for downloading the function when using the `github`, `s3`, or `archive` [code-entry type](#spec.build.codeEntryType) |
| runRegistry | string | The container image repository from which the platform will pull the image |
| runtimeAttributes | See [reference](/docs/reference/runtimes/) | Runtime-specific attributes |
| resources | See [reference](https://kubernetes.io/docs/concepts/configuration/manage-compute-resources-container/) | Limits the resources allocated to the deployed function |
| readinessTimeoutSeconds | int | The number of seconds that the controller will wait for the function to become ready before declaring failure (default: 60) |
| avatar | string | A Base64-encoded icon to be shown in the UI for the function |
| eventTimeout | string | Global event timeout, in the format supported by the [`time.ParseDuration`](https://golang.org/pkg/time/#ParseDuration) Go function (for example, `"30s"`) |
| securityContext.runAsUser | int | The UID with which to run the entrypoint of the container process (k8s only) |
| securityContext.runAsGroup | int | The GID with which to run the entrypoint of the container process (k8s only) |
| securityContext.fsGroup | int | A supplemental group that is added to the process's groups when running the entrypoint of the container (k8s only) |
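For the external code-entry types, `codeEntryType`, `path`, and `codeEntryAttributes` work together. Below is a minimal sketch for pulling function code from a GitHub repository; the attribute names (`branch`, `headers`) follow the code-entry-types reference, but treat the exact set as an assumption and confirm it there:

```yaml
spec:
  build:
    codeEntryType: github
    # URL of the repository that holds the function code (hypothetical repo)
    path: "https://github.com/my-org/my-function"
    codeEntryAttributes:
      branch: master
      # Only needed for private repositories
      headers:
        Authorization: "token <MY_GITHUB_TOKEN>"
```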
<a id="spec-example"></a>
### Example
```yaml
spec:
description: my Go function
handler: main:Handler
runtime: golang
image: myfunctionimage:latest
platform:
attributes:
# docker will retry 3 times to start function image container
# more info @ https://docs.docker.com/config/containers/start-containers-automatically
restartPolicy:
name: on-failure
maximumRetryCount: 3
# will use `volume` to mount the processor into function
# more info @ https://docs.docker.com/storage/volumes
processorMountMode: volume
env:
- name: SOME_ENV
value: abc
- name: SECRET_PASSWORD_ENV_VAR
valueFrom:
secretKeyRef:
name: my-secret
key: password
volumes:
- volume:
hostPath:
path: "/var/run/docker.sock"
volumeMount:
mountPath: "/var/run/docker.sock"
minReplicas: 2
maxReplicas: 8
targetCPU: 60
build:
registry: localhost:5000
noBaseImagePull: true
noCache: true
commands:
- apk --update --no-cache add curl
- pip install simplejson
resources:
requests:
cpu: 1
memory: 128M
limits:
cpu: 2
memory: 256M
securityContext:
runAsUser: 1000
runAsGroup: 2000
fsGroup: 3000
```
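The example above builds the image from commands; with the `sourceCode` code-entry type, `spec.build.functionSourceCode` instead carries the function source as Base64 (see the table above). A quick sketch of producing that value, assuming Python is handy; normally the platform tooling does this for you:

```python
import base64

source = 'def handler(context, event):\n    return "hello"\n'

# Encode the source the way spec.build.functionSourceCode expects it.
function_source_code = base64.b64encode(source.encode("utf-8")).decode("ascii")

# The platform decodes it back to the original source before building.
decoded = base64.b64decode(function_source_code).decode("utf-8")
print(function_source_code)
```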
## See also
- [Deploying Functions](/docs/tasks/deploying-functions.md)
- [Code-Entry Types](/docs/reference/function-configuration/code-entry-types.md)
# visible property
*[<Null safety>](https://dart.dev/null-safety)*
[bool](https://api.flutter.dev/flutter/dart-core/bool-class.html)? visible
_read / write_
## Implementation
```dart
bool? visible;
```
| 6.685714 | 74 | 0.615385 | eng_Latn | 0.322316 |
583f896c88ef330cec6e6c0731d0204621764f55 | 495 | md | Markdown | _posts/2021-07-08/2021-06-22-I-know-not-everyone-likes-big-lips-but-I-think-theyre-cute-20210622213519438389.md | ipussy/ipussy.github.io | 95d19a74e38bb54303cf18057a99a57c783e76bf | [
"Apache-2.0"
] | null | null | null | _posts/2021-07-08/2021-06-22-I-know-not-everyone-likes-big-lips-but-I-think-theyre-cute-20210622213519438389.md | ipussy/ipussy.github.io | 95d19a74e38bb54303cf18057a99a57c783e76bf | [
"Apache-2.0"
] | null | null | null | _posts/2021-07-08/2021-06-22-I-know-not-everyone-likes-big-lips-but-I-think-theyre-cute-20210622213519438389.md | ipussy/ipussy.github.io | 95d19a74e38bb54303cf18057a99a57c783e76bf | [
"Apache-2.0"
] | null | null | null | ---
title: "I know not everyone likes big lips but I think they’re cute"
metadate: "hide"
categories: [ Pussy ]
image: "https://external-preview.redd.it/-9ZBJ41HwMAdcoke1rOTjmOEB8XqEFP2wyTYRNmzv_Y.jpg?auto=webp&s=ad8702d94654b383bb8625b4c862b550ddf4b549"
thumb: "https://external-preview.redd.it/-9ZBJ41HwMAdcoke1rOTjmOEB8XqEFP2wyTYRNmzv_Y.jpg?width=1080&crop=smart&auto=webp&s=29bbdc88393b1fb6fe73af96bf9709009deda763"
visit: ""
---
I know not everyone likes big lips but I think they’re cute
| 49.5 | 164 | 0.806061 | yue_Hant | 0.239325 |
583fedbab86dd7e1d4f1669ef9c4e2e507ee486c | 1,211 | md | Markdown | _posts/2021-05-11-cat_tv.md | yowwwl/yowwwl.github.io | fef3b4dff6a5c14c730092a9c62109e90debb4db | [
"MIT"
] | null | null | null | _posts/2021-05-11-cat_tv.md | yowwwl/yowwwl.github.io | fef3b4dff6a5c14c730092a9c62109e90debb4db | [
"MIT"
] | null | null | null | _posts/2021-05-11-cat_tv.md | yowwwl/yowwwl.github.io | fef3b4dff6a5c14c730092a9c62109e90debb4db | [
"MIT"
] | null | null | null | ---
layout: post
title: 貓貓沒事做活力充沛到處找碴,給他台電視看看
subtitle: 喵靠北:整天被關在家無聊死了,好想感受這個世界!
categories: 喵居家
description: 這空氣是多麼清新,這世界是多麼美好,讓貓貓感受外面的風,分析空氣中承載的訊息例如氣味、濕度、溫度、聲音等等,各種複雜的資訊就夠貓貓小小燒腦一下,這樣貓生有趣一點不會整天生無可戀在家裡四處搞事。
banner: assets/img/banners/cat_tv_1000.png
tags: [喵居家]
---
你無聊的時候喜歡轉電視嗎?或是滑滑串流平台追劇嗎?家裡的貓貓整天關在家裡沒啥事做也是會無聊的,這時候豐富貓貓的生活可以幫他準備一台天然電視機,讓他可以隨時點播。
## 窗掛式
窗掛式掛架類似門後掛勾的概念,掛在窗邊讓貓貓可以感受自然的風看看外面的風景,缺點是如果窗戶玻璃是不透明的話就必須要開窗掛才能讓貓貓看得到外面,會被掛的地方限制高度的變化性比較少,窗掛式的好處是卡好好就不用擔心久了有掉落的可能性。
## 黏貼式
黏貼式貓吊床適合用在透明玻璃窗戶或是整片落地窗,只要吸飽飽夠穩就能提供更多層次的高度,不過要注意吸盤用久了或是長期被太陽曬會失去吸附性,無法承受貓貓的重量鬆脫等等,需要常常確認吸盤有沒有吸好吸滿,萬一貓貓上去掉下來是會嚇到貓貓的窩,那你這電視可能會被拒看。
## 貓跳台
在窗邊的空間允許的話也能直接將貓跳台放置到窗邊,讓貓直接上跳台爬高高看~~電視~~風景、曬太陽,這樣就不需要再多一筆開銷。跳台的選擇至少要比人高的類型或是頂天立地型,貓貓會更喜歡在上面鳥瞰這個世界。
## 陽台、門窗防護裝置
家裡是有陽台或是窗台有外推空間的奴才,可以考慮裝設隱形鐵窗,讓貓貓可以直接進入那個空間活動,當然空間夠大可以再多加一些垂直空間的設置給貓貓跳上跳下,網路上也有很多分享不同的設計和 DIY 的裝設的方式可以參考。重點就是要考量裝置的穩固性、縫隙的大小,以免貓貓不慎有脫離或是墜樓的危險。
## 注意:動線是否流暢
窗掛式或是黏貼式不管哪一種除了本身的安裝穩固之外,也要考慮貓貓上去的動線是否流暢,例如貓貓的年紀比較大可能關節退化了,可以縮短跳板的間距,減少因為需要跳躍所造成的不適,讓貓貓更容易也更愛上去看他的電視。
## 動態加碼:戶外鳥屋
窗外是陽台或是花園可以戶外設置鳥屋吸引鳥飛過來,增加窗外風景的變動與豐富,說不定還有機會聽到貓貓發出 chatter 喀喀喀喀的聲音喔。
## 這空氣是多麼清新,這世界是多麼美好
窗戶打開風吹過來,貓貓就會開始分析空氣中承載的訊息例如氣味、濕度、溫度、聲音等等,各種複雜的資訊就夠貓貓小小燒腦一下,這樣貓生有趣一點不會整天生無可戀在家裡四處搞事。
- 關鍵字搜搜:貓窗台、貓吊床、窗台置物架、曬貓架
| 29.536585 | 142 | 0.863749 | yue_Hant | 0.776214 |
58403153d4703ba535f96c360a4976a7ac8913bf | 8,174 | md | Markdown | README.md | subsoap/open-mini-games | b06ea3ce5a5cf6bdd06de190d33f1f9f5c38e227 | [
"Apache-2.0"
] | 39 | 2020-10-20T11:12:43.000Z | 2021-11-16T13:12:23.000Z | README.md | subsoap/open-mini-games | b06ea3ce5a5cf6bdd06de190d33f1f9f5c38e227 | [
"Apache-2.0"
] | 1 | 2021-05-20T08:30:45.000Z | 2021-05-20T08:30:45.000Z | README.md | subsoap/open-mini-games | b06ea3ce5a5cf6bdd06de190d33f1f9f5c38e227 | [
"Apache-2.0"
] | 1 | 2020-10-29T16:05:44.000Z | 2020-10-29T16:05:44.000Z | # Open Mini Games Format
The Open Mini Games (OMG) format is a proposal to define a set of combined minimum requirements that game developers can implement when developing a game that is playable in a web browser or a client which renders HTML5 content.
The aim of the requirements is to provide the information necessary to surface a web game in a standardized manner on discovery platforms such as browsers but also embedded platforms.
In theory, the combination of these requirements will enable catalogs of web games to be collected, filtered and surfaced to end users.
The final representations of the games should be meaningingful concise to end users and showcase the games in a manner that users have become familiar with when browsing general game catalog interfaces - such as Steam, Google Play or the Apple App Store.
## User Experience Goals
The goal of each piece of metadata is to unlock a feature or improvement towards one of the following areas:
- Discovery
- Rediscovery
- Installability
- Load Time
- Offline Access
Different platforms which wish to surface HTML5 game content can leverage these metadata towards the above goals. Ideally each piece of metadata should be useful in its own right. When combined, they aim to reduce friction and enable new capabilities for game platforms.
## Requirements
An HTML document that is also a playable web game must have the following three pieces of metadata included on its page:
- VideoGame Schema Markup with additional constraints.
- Progressive Web App Manifest.
- OMG Resources Manifest: A list of the resources necessary for offline play or initial precaching.
To be a completely valid OMG, a webpage must include all 3 pieces of metadata. However, surfacing platforms can leverage individual pieces of metadata for their specific use cases.
### VideoGame Schema Markup for Web Games
#### Explainer
The VideoGame Schema Markup (with web game property constraints) provides a surfacing platform with the information necessary to describe the game effectively as a search result or a featured item on a store page. Theoretically, a surfacing platform could also provide users with a “recently played” user journey in which users could quickly jump back into their most recently played web games based on the existence of the games in the user’s browser or navigation history. This metadata helps with ‘Discovery’ and ‘Rediscovery’ of content.
#### Example
In index.html within HEAD tag:
~~~html
<head>
<script type="application/ld+json">
{
"@context": "https://schema.org",
"@type": "VideoGame",
"mainEntityOfPage": {
"@type": "WebPage",
"@id": "https://google.com/videogame"
},
"name": "Game title",
"description": "Game description",
"url": "https://example.com/game.html",
"genre": "action",
"accessibilityControl": "touch",
"operatingSystem": "web",
"icon": "https://example.com/icon/icon_1024x1024.jpg",
"gameBanner": "https://example.com/landscape_banner.jpg",
"about": "https://example.com/about.html",
"privacyPolicyURL": "https://example.com/privacy.html",
“gameExecutionMode”: “clientside”,
"image": [
"https://example.com/screenshot/1.jpg",
"https://example.com/screenshot/2.jpg",
"https://example.com/screenshot/3.jpg"
],
"author": {
"@type": "Organization",
"name": "Game Developer",
"logo": {
"@type": "ImageObject",
"url": "https://example.com/developer_logo_1024x1024.jpg"
}
},
"publisher": {
"@type": "Organization",
"name": "Game Publisher",
"logo": {
"@type": "ImageObject",
"url": "https://example.com/publisher_logo_1024x1024.jpg"
}
}
}
</script>
...
~~~
Relevant docs:
https://schema.org/VideoGame
#### Additional constraints:
- "genre" must be written in lowercase, and be one or two of the following predefined genres. To represent two genres, use a pipe character, for example "arcade|racing":
- Action
- Arcade
- RPG
- Adventure
- Board
- Card
- Family
- Music
- Educational
- Racing
- Simulation
- Strategy
- Trivia
- Word
- "operatingSystem" must be "web"
- "icon" must refer to an image in either PNG, JPG or WEBP format with a minimum width of 1024 pixels, a minimum height of 1024 pixels and a 1:1 aspect ratio.
- "gameBanner" must refer to an image in either PNG, JPG or WEBP format with a minimum height of 1024 pixels and an aspect ratio of 4:3
- "accessibilityControl" must be written in lowercase, and be one of the predefined control values (for example, "touch", as used in the example above)
- "gameExecutionMode" must be written in lowercase and be either "clientside" or "serverside". "clientside" refers to games that execute their game code in JavaScript on the client. "serverside" refers to games that execute their game code on a remote server and stream their graphics and audio to the client.
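A surfacing platform can sanity-check the `genre` constraint mechanically; a sketch (the helper name is ours, not part of the spec):

```python
# The fourteen predefined genres, lowercased per the constraint above.
ALLOWED_GENRES = {
    "action", "arcade", "rpg", "adventure", "board", "card", "family",
    "music", "educational", "racing", "simulation", "strategy", "trivia", "word",
}

def is_valid_genre(value):
    """Accept one or two allowed genres, lowercase, pipe-separated."""
    parts = value.split("|")
    return 1 <= len(parts) <= 2 and all(p in ALLOWED_GENRES for p in parts)
```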
### Progressive Web App Manifest
#### Explainer
The Progressive Web App Manifest is already used by platforms such as Android to enable web applications and web games to be installed to the device and easily launched via the Android home screen. This metadata serves as a form of ‘Rediscovery’ and ‘Installability’.
#### Example
In index.html within HEAD tag:
~~~html
<head>
<link rel="manifest" href="/manifest.webmanifest">
...
~~~
Example manifest.webmanifest:
~~~json
{
"name": "Game title",
"short_name": "Game title",
"start_url": ".",
"display": "standalone",
"background_color": "#fff",
"description": "Game description",
"icons": [{
"src": "images/touch/homescreen48.png",
"sizes": "48x48",
"type": "image/png"
}, {
"src": "images/touch/homescreen72.png",
"sizes": "72x72",
"type": "image/png"
}, {
"src": "images/touch/homescreen96.png",
"sizes": "96x96",
"type": "image/png"
}, {
"src": "images/touch/homescreen144.png",
"sizes": "144x144",
"type": "image/png"
}, {
"src": "images/touch/homescreen168.png",
"sizes": "168x168",
"type": "image/png"
}, {
"src": "images/touch/homescreen192.png",
"sizes": "192x192",
"type": "image/png"
}],
"related_applications": [{
"platform": "play",
"url": "https://play.google.com/store/apps/details?id=cheeaun.hackerweb"
}]
}
~~~
Relevant docs:
https://developer.mozilla.org/en-US/docs/Web/Manifest
#### Additional constraints:
- If an OMG implements both the VideoGame Schema Markup and the PWA manifest, then the icon property of the VideoGame Schema Markup must match one of the URLs of the PWA manifest icons array.
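That cross-check is mechanical; a sketch of how a platform might verify it (the helper name is ours, not part of the spec):

```python
def icon_consistent(schema_icon_url, pwa_manifest):
    """True if the VideoGame schema 'icon' URL appears in the PWA manifest icons."""
    return any(
        entry.get("src") == schema_icon_url
        for entry in pwa_manifest.get("icons", [])
    )

# Minimal example manifest fragment.
manifest = {"icons": [{"src": "images/touch/homescreen192.png", "sizes": "192x192"}]}
```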
### OMG Resources Manifest
#### Explainer
The Web Bundle Resources Manifest provides an explicit list of all the resources necessary for a web game to be played offline. Extending this proposal is the OMG Resources Manifest, which can be referenced as a separate file from a webpage's metadata to indicate that it is ready for offline bundling. This would communicate that the game can be bundled with a set of resources for playing offline, or that the resources serve as an ideal set that could be precached to improve load times.
This enables surfacing platforms and clients to pre-cache the resources of games discovered on the open web for offline playback or improved loading times. This metadata assists with ‘Load Time’ and ‘Offline Access’.
#### Example
In index.html within HEAD tag:
~~~html
<head>
<link rel="bundle-manifest" href="/files.txt" offline-capable="true" hash="(hash of the .wbn)">
...
~~~
If this bundle of files enables a web game to be completely playable offline, then the optional "offline-capable" attribute can be included. If the "offline-capable" attribute is not included, the surfacing platform should assume these files are merely recommended for the purpose of precaching.
Example files.txt:
~~~bash
# A line starting with '#' is a comment.
https://example.com/
https://example.com/manifest.webmanifest
https://example.com/style.css
https://example.com/script.js
~~~
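A consumer of `files.txt` only needs to skip blank lines and `#` comments; a sketch (the function name is ours):

```python
def parse_resource_list(text):
    """Return the resource URLs from a files.txt-style listing.

    Blank lines and lines starting with '#' are ignored.
    """
    urls = []
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            urls.append(line)
    return urls

sample = """\
# A line starting with '#' is a comment.
https://example.com/
https://example.com/style.css
"""
print(parse_resource_list(sample))
```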
Relevant docs:
https://github.com/WICG/webpackage/tree/master/go/bundle#from-a-url-list
---
title: ICorDebugSymbolProvider2::GetFrameProps Method
ms.date: 03/30/2017
ms.assetid: f07b73f3-188d-43a9-8f7d-44dce2f1ddb7
author: rpetrusha
ms.author: ronpet
ms.openlocfilehash: 419e7bae5998679fafac48ebfd5b0673e0e4bac5
ms.sourcegitcommit: 3d5d33f384eeba41b2dff79d096f47ccc8d8f03d
ms.translationtype: MT
ms.contentlocale: tr-TR
ms.lasthandoff: 05/04/2018
ms.locfileid: "33419087"
---
# <a name="icordebugsymbolprovider2getframeprops-method"></a>ICorDebugSymbolProvider2::GetFrameProps Method
Returns a method's starting relative virtual address and the parent frame's starting relative virtual address, given a code relative virtual address.
## <a name="syntax"></a>Syntax
```
HRESULT GetFrameProps(
   [in] ULONG32 codeRva,
   [out] ULONG32 *pCodeStartRva,
   [out] ULONG32 *pParentFrameStartRva
);
```
#### <a name="parameters"></a>Parameters
`codeRva`
[in] A code relative virtual address.
`pCodeStartRva`
[out] A pointer to the method's starting relative virtual address.
`pParentFrameStartRva`
[out] A pointer to the frame's starting relative virtual address.
## <a name="remarks"></a>Remarks
> [!NOTE]
> This method is available with .NET Native only.
## <a name="requirements"></a>Requirements
**Platforms:** See [System Requirements](../../../../docs/framework/get-started/system-requirements.md).
**Header:** CorDebug.idl, CorDebug.h
**Library:** CorGuids.lib
**.NET Framework Versions:** [!INCLUDE[net_46_native](../../../../includes/net-46-native-md.md)]
## <a name="see-also"></a>See Also
[ICorDebugSymbolProvider2 Interface](../../../../docs/framework/unmanaged-api/debugging/icordebugsymbolprovider2-interface.md)
[Debugging Interfaces](../../../../docs/framework/unmanaged-api/debugging/debugging-interfaces.md)
---
-api-id: M:Windows.Foundation.Diagnostics.LoggingFields.AddDateTime(System.String,Windows.Foundation.DateTime)
-api-type: winrt method
---
<!-- Method syntax
public void AddDateTime(System.String name, Windows.Foundation.DateTime value)
-->
# Windows.Foundation.Diagnostics.LoggingFields.AddDateTime
## -description
Adds a [DateTime](../windows.foundation/datetime.md) field with the specified field name.
## -parameters
### -param name
The name of the event field.
### -param value
The value of the event field.
## -remarks
## -examples
## -see-also
[AddDateTime(String, DateTime, LoggingFieldFormat)](loggingfields_adddatetime_97333076.md), [AddDateTime(String, DateTime, LoggingFieldFormat, Int32)](loggingfields_adddatetime_537728364.md) | 27.888889 | 190 | 0.776892 | yue_Hant | 0.650173 |
58412ae548f275551e936cd3a9a79beb825a67da | 1,217 | md | Markdown | docs/error-messages/compiler-warnings/compiler-warning-level-3-c4522.md | FlorianHaupt/cpp-docs.de-de | 0ccd351cf627f1a41b14514c746f729b778d87e3 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/error-messages/compiler-warnings/compiler-warning-level-3-c4522.md | FlorianHaupt/cpp-docs.de-de | 0ccd351cf627f1a41b14514c746f729b778d87e3 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/error-messages/compiler-warnings/compiler-warning-level-3-c4522.md | FlorianHaupt/cpp-docs.de-de | 0ccd351cf627f1a41b14514c746f729b778d87e3 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Compilerwarnung (Stufe 3) C4522
ms.date: 11/04/2016
f1_keywords:
- C4522
helpviewer_keywords:
- C4522
ms.assetid: 7065dc27-0b6c-4e68-a345-c51cdb99a20b
ms.openlocfilehash: de163f0a3925b711f2f3437b700f75bbe994b3e7
ms.sourcegitcommit: 6052185696adca270bc9bdbec45a626dd89cdcdd
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 10/31/2018
ms.locfileid: "50622962"
---
# <a name="compiler-warning-level-3-c4522"></a>Compilerwarnung (Stufe 3) C4522
'Klasse': Mehrere Zuweisungsoperatoren angegeben
Die Klasse verfügt über mehrere Zuweisungsoperatoren eines einzelnen Typs. Diese Warnung dient nur zu Informationszwecken; die Konstruktoren, die in Ihrem Programm können aufgerufen werden.
Verwenden der [Warnung](../../preprocessor/warning.md) Pragma, um diese Warnung unterdrücken.
## <a name="example"></a>Beispiel
Im folgende Beispiel wird die C4522 generiert.
```
// C4522.cpp
// compile with: /EHsc /W3
#include <iostream>
using namespace std;
class A {
public:
A& operator=( A & o ) { cout << "A&" << endl; return *this; }
A& operator=( const A &co ) { cout << "const A&" << endl; return *this; } // C4522
};
int main() {
A o1, o2;
o2 = o1;
const A o3;
o1 = o3;
}
``` | 26.456522 | 189 | 0.727198 | deu_Latn | 0.627331 |
5841539ff264ec1edd93ae98267f702cdecf36d7 | 7,351 | md | Markdown | README.md | akichidis/lightning-chess | aed4b1b27f8b8f267a0b7dab60a7f10b653612b8 | [
"Apache-2.0"
] | 8 | 2018-12-02T09:34:19.000Z | 2019-05-23T14:49:29.000Z | README.md | akichidis/lightning-chess | aed4b1b27f8b8f267a0b7dab60a7f10b653612b8 | [
"Apache-2.0"
] | 7 | 2018-12-20T23:30:49.000Z | 2019-01-01T11:00:43.000Z | README.md | akichidis/lightning-chess | aed4b1b27f8b8f267a0b7dab60a7f10b653612b8 | [
"Apache-2.0"
] | 4 | 2018-12-19T00:08:32.000Z | 2019-12-12T03:06:13.000Z | <p align="center">
<img src="./lightning-chess-full-logo.png" alt="Lightning Chess" width="256">
</p>
# Lightning Chess
Lightning Chess is an open source 2-player chess app for the [Corda](https://corda.net) blockchain, initially built as a proof of concept to show that Corda can support offline transactions. The approach is similar to the [lightning network](https://lightning.network) suggested for Bitcoin and could theoretically be applied to any turn-based games. In short, two transactions are required to fully store a chess game on ledger, one to create the game and another to declare the winner (or a draw). In the meantime, players move pieces without notarising transactions, but by exchanging digital signatures on each move. The lightning chess protocol ensures that the smart contract will cryptographically verify the validity of the results.
**1st note:** Lightning Chess is still under development, the full Cordapp is still not functional and it's not recommended for production use.
**2nd note:** Not to be confused with rapid chess, in which each move has a fixed time allowed (usually 10 seconds). This is not a rapid or blitz chess game, but rather a chess protocol and application tailored to blockchains.
## Why Corda and not Ethereum?
Because Corda is more versatile, provides extra privacy and it's actually doable with Corda. Take a look at this [interesting experiment](https://medium.com/@graycoding/lessons-learned-from-making-a-chess-game-for-ethereum-6917c01178b6) where a student project from Technical University of Berlin engaged in making a Chess game for Ethereum; but, they faced challenges and their conclusion was:
> The Ethereum virtual machine is turing-complete, but that doesn’t mean that everything should be computed on it.
## Features
* Web-based user interface that allows Corda node owners to interactively play an online chess game.
* Full Cordapp to store chess states in Corda's Vault and validate the results.
* Offline gameplay (in terms of notarisation).
## Implementation iterations
- Basic PoC in which the smart contract verifies digital signatures only. This version assumes no malicious users and time to move is not taken into account. We are currently at this stage.
- The smart contract is equipped with chess state validation logic (i.e., if a move is valid). If a user signs an invalid move, this can be used as a cryptographic evidence of malicious activity and the opponent will be able to provide this proof to win the game. Along the same lines, a checkmate is automatically checked by the contract's verify method.
- Use Oracles for handling disputes on "time to respond".
## Full expected protocol (WIP)
**Happy path process**
1. players agree on random gameID and colours (similarly to key agreement).
2. initial commitment (fact-time lock tx) → send for notarisation.
3. off chain game:
1. player signs next move
2. sends hash (or gameID + sequence) to SequenceKeeper
3. SequenceKeeper signs and replies back
4. player forwards to the other player
5. next player's turn (go to 3.i)
4. game ends → submit a tx that includes the last two signed moves. For instance, a player provides the signed winning move + the signed previous move of the opponent. Alternatively, a signed acceptance/resignation from the other party or a mutually signed agreement on the result would be enough (this is the 1st iteration). The main benefit with this approach is you don't need to send the full move-sequence and chessboard states, simplifying the contract logic by large degree.
5. The aim is that the smart contract verify logic should be able to identify a winning or draw state, so consuming of the potentially encumbered assets (awards) is possible.
**Signed payload per move**
Each player should sign his/her move before sending it to the opponent. The signed `GameMove` object should have the following properties:
* `gameId`: required to ensure that this signature applies to a particular chess game; note that each `gameId` should be unique.
* `index`: a counter that represents the actual _move-index_, starting from zero. We need this to avoid replay attacks by reusing an old signature that was generated in a previous move.
* `fen`: the game position in [FEN](https://en.wikipedia.org/wiki/Forsyth%E2%80%93Edwards_Notation) form. Theoretically, we could avoid sending the current state on each move, because everytime users sign their next move, they inherently accepted the previous state. However, including FEN will speed up move validation, as reruning the whole game from scratch is not required.
* `move`: the actual _from-to_ move. We highlight that a move might involve two pieces as in teh case of [castling](https://en.wikipedia.org/wiki/Castling).
* `previousSignature`: opponent's signature on his/her most recent move. This is required to create a chain of actions and resist against malicious users signing different moves, then trying to "convince" the smart contract that the opponent played a wrong move.
**Ideas:**
1. sequence of moves can work like a blockchain Vs a counter.
2. using the encumbrance feature one can create a tournament with real cash prizes.
3. we could use hash commitments to avoid full signature verification when both parties are honest. Briefly, during game creation, each user can provide a commitment to a hash pre-image, which will be revealed if this user loses the game. Thus, the winner can use it as a cryptographic evidence to win the game. The same process can be applied if a game ends in a draw.
4. extend the above hash commitment scheme using hash-based post-quantum signatures, such as the Lamport, WOTS, BPQS, XMSS and Sphincs schemes. We also realised that chess in particular is a great example for short hash-based signatures per move (more details in an upcoming scientific paper).
**Prerequisites:** a passive Oracle (SequenceKeeper) is required (it can be a BFT cluster for advanced security/trust, but accurancy in the level of seconds is tricky anyway with leader-based schemes). Note that oracles are only required for disputes on "time to respond" and they don't need to have visibility on the actual game state (moves).
**Dispute cases**
1. Player leaves the game earlier or refuses to play (on purpose or unexpectedly): Request a signature from SequenceKeeper that the other party has not responded on time (WIP: time-out policy to be defined).
2. Player makes an invalid move: The other player reveals the signed previous state(s) and the signed "malicious" move. Smart contract logic should be able to identify a wrong move and the other party can use this as evidence to win the game. Chess software should only allow valid moves, thus an invalid move can only happen by hacking the game engine.
## Contributing
We welcome contributions to Lightning Chess! You may find the list of contributors [here](./CONTRIBUTORS.md).
## License
[Apache 2.0](./LICENSE.md)
## Acknowledgements
[Corda](https://corda.net), a blockchain and smart contract platform developed by [R3](https://r3.com). As a blockchain platform, Corda allows parties to transact directly, with value. Smart contracts allow Corda to do this using complex agreements and any asset type. This capability has broad applications across industries including finance, supply chain and healthcare.
| 96.723684 | 740 | 0.785607 | eng_Latn | 0.999475 |
58437559e98e89292eb75e22d3c52705678edd79 | 2,257 | md | Markdown | README.md | Shin-Tachibana/vue-touch-scroll | 362c93a9b855059166a8bae115afe922962e7e6d | [
"MIT"
] | 1 | 2020-10-12T04:01:56.000Z | 2020-10-12T04:01:56.000Z | README.md | NguyenThanh1995/vue-touch-scroll | 362c93a9b855059166a8bae115afe922962e7e6d | [
"MIT"
] | 1 | 2021-04-20T04:35:43.000Z | 2021-04-21T14:18:44.000Z | README.md | Shin-Tachibana/vue-touch-scroll | 362c93a9b855059166a8bae115afe922962e7e6d | [
"MIT"
] | null | null | null | # vue-touch-scroll
[](https://github.com/nguyenthanh1995/vue-i18n-filters/blob/master/LICENSE) [](#)
**A plugin scroll cross browser for Vue.js**
## Table of Contents
- [Installation](#installation)
- [Usage](#usage)
- [License](#license)
## Installation
``` bash
npm install vue-touch-scroll
```
or if you prefer CDN
``` html
<script type="text/javascript" src="https://unpkg.com/vue-touch-scroll@latest/dist/vue-touch-scroll.js"></script>
```
## Usage
### Global
``` JavaScript
import { use } from "vue"
import VueTouchScroll from "vue-touch-scroll"
use(VueTouchScroll)
```
``` vue.js
<vue-touch-scroll type="vertical">
<!-- Content -->
</vue-touch-scroll>
```
or
``` vue.js
<div v-touch-scroll:vertical>
<!-- Content -+>
</div>
```
### Private
``` vue.js
<vue-touch-scroll type="vertical">
<!-- Content -->
</vue-touch-scroll>
<script>
import { VueTouchScroll } from "vue-touch-scroll"
export default {
components: { VueTouchScroll }
}
</script>
```
or
``` vue.js
<div v-touch-scroll:vertical>
<!-- Content -+>
</div>
<script>
import { directive } from "vue-touch-scroll"
export default {
directives: {
"touch-scroll": directive
}
}
</script>
```
### Configuration
#### Component
| Property | Type | Default | Description |
|:-|:-|:-|:-|
| tag | String | "div" | A tag name for component |
| type | String | "vertical" | Direction scroll "vertical" or "horizontal" |
| hide-scrollbar | Boolean | false | Are you hide scrollbar? |
| class-scrollbar | String, Array, Object | "" | Class for scrollbar |
| style-scrollbar | Object | {} | Style for scrollbar |
#### Directive
```vue.js
<div v-touch-scroll:arg="value"></div>
```
| Property | Description |
|:-|:-|:-|:-|
| arg | = option type in component |
| value | = option scrollbar |
#### Options sscrollbar
``` vue.js
<div v-touch-scroll:vertical="{
render: true, // render scrollbar?
class: [] // class for scrollbar,
style: {}
}"></div>
```
## License
This project is licensed under the MIT License - see the [LICENSE.md](LICENSE.md) file for details.
| 18.652893 | 222 | 0.6358 | eng_Latn | 0.21876 |
5843787d2f4ade8909ed92383c595d820f845d22 | 993 | md | Markdown | README.md | mahmudtopu3/django-cas-server | 62edd8bca7ea2251509e46dbddd5310943ed2a48 | [
"MIT"
] | 9 | 2020-04-06T22:01:25.000Z | 2022-03-04T18:28:12.000Z | README.md | mahmudtopu3/django-cas-server | 62edd8bca7ea2251509e46dbddd5310943ed2a48 | [
"MIT"
] | 2 | 2020-08-23T20:23:49.000Z | 2021-06-10T19:42:39.000Z | README.md | mahmudtopu3/django-cas-server | 62edd8bca7ea2251509e46dbddd5310943ed2a48 | [
"MIT"
] | 5 | 2020-02-21T21:24:35.000Z | 2021-05-18T09:25:19.000Z | # demo-cas-server
A demo CAS server work with [django-cas-ng/example](https://github.com/django-cas-ng/example) to show how to integrate with [django-cas-ng](https://djangocas.dev) and [python-cas](https://github.com/python-cas/python-cas).
- [Step by step to setup a django-cas-ng example project](https://djangocas.dev/blog/django-cas-ng-example-project/)
- [python-cas Flask example](https://djangocas.dev/blog/python-cas-flask-example/)
Demo server URL: https://django-cas-ng-demo-server.herokuapp.com/
Python 3 and Django 2.2 are required.
```
pip install -r requirements.txt
python manage.py runserver
```
## Using docker
```bash
docker-compose up -d
```
## Test
Open `http://127.0.0.1:8000/cas/login`
---
[**NOTE**]:
Before running the server, create the database and a superuser:
- `python manage.py migrate`
- `python manage.py createsuperuser`
In docker mode:
- `docker exec -it cas-server python manage.py migrate`
- `docker exec -it cas-server python manage.py createsuperuser`
# Search Twitch.tv streams and show a list in the terminal
Usage:
```
# Category search
Enter a term and choose a result from the returned list of categories
# Stream search
Enter a term to search stream titles in the chosen category
```
*Note:* requires two environment variables set to a valid OAuth token and client ID:
* `TWITCH_CLIENT_ID`
To get a client ID register an app (this app) here:
https://dev.twitch.tv/console/apps/create
OAuth Redirect URLs can be set to https://localhost
* `TWITCH_TOKEN`
To get an app OAuth token run:
* `curl -X POST 'https://id.twitch.tv/oauth2/token?client_id=<your client ID>&client_secret=<your client secret>&grant_type=client_credentials&scope=[]'`
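One way to wire both variables up in a POSIX shell, as a sketch: it assumes `jq` is installed and the placeholders are filled in, and relies on the token endpoint returning JSON with an `access_token` field:

```shell
export TWITCH_CLIENT_ID='<your client ID>'
# Request an app access token and extract the access_token field.
TWITCH_TOKEN=$(curl -s -X POST \
  "https://id.twitch.tv/oauth2/token?client_id=${TWITCH_CLIENT_ID}&client_secret=<your client secret>&grant_type=client_credentials" \
  | jq -r '.access_token')
export TWITCH_TOKEN
```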
---
alturls:
- https://twitter.com/bismark/status/2509692261
archive:
- 2009-07
date: '2009-07-07T05:22:57+00:00'
slug: '1246944177'
type: photo
---
mariam getting too friendly with the sea horses.
## :zap: Tenant Cluster Release 11.3.2 for KVM :zap:
**If you are upgrading from 11.3.1, upgrading to this release will not roll your nodes. It will only update the apps.**
This release improves the reliability of the NGINX Ingress Controller.
Specifically, the liveness probe is now configured to be more fault tolerant than the readiness probe. This helps shed load when it exceeds replica capacity, and speeds up recovery when NGINX gets overloaded.
Below, you can find more details on components that were changed with this release.
### nginx-ingress-controller v0.30.0 ([Giant Swarm app v1.6.12](https://github.com/giantswarm/nginx-ingress-controller-app/blob/master/CHANGELOG.md#v1612-2020-06-04))
- Made healthcheck probes configurable.
- Made liveness probe more resilient.
- Aligned labels using `app.kubernetes.io/name` instead of `k8s-app` where possible. `k8s-app` remains in use for compatibility reasons, as selectors are not modifiable without recreating the Deployment.
---
title: Migrate AWS VMs to Azure with the Azure Site Recovery service | Microsoft Docs
description: This article describes how to migrate Windows VMs running in Amazon Web Services (AWS) to Azure using Azure Site Recovery.
services: site-recovery
author: rayne-wiselman
manager: carmonm
ms.service: site-recovery
ms.topic: tutorial
ms.date: 09/09/2019
ms.author: raynew
ms.custom: MVC
---
# Migrate Amazon Web Services (AWS) VMs to Azure
This tutorial teaches you how to migrate Amazon Web Services (AWS) virtual machines (VMs) to Azure VMs by using Azure Site Recovery. When you migrate AWS EC2 instances to Azure, the VMs are treated like physical, on-premises computers. In this tutorial, you learn how to:
> [!div class="checklist"]
> * Verify prerequisites
> * Prepare Azure resources
> * Prepare AWS EC2 instances for migration
> * Deploy a configuration server
> * Enable replication for VMs
> * Test the failover to make sure everything's working
> * Run a one-time failover to Azure
If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/pricing/free-trial/) before you begin.
> [!NOTE]
> You can now use the Azure Migrate service to migrate AWS instances to Azure. [Learn more](../migrate/tutorial-migrate-physical-virtual-machines.md).
## Prerequisites
- Ensure that the VMs that you want to migrate are running a supported OS version. Supported versions include:
- Windows Server 2016
- Windows Server 2012 R2
- Windows Server 2012
- 64-bit version of Windows Server 2008 R2 SP1 or later
- Red Hat Enterprise Linux 6.4 to 6.10, 7.1 to 7.6 (HVM virtualized instances only) *(Instances running RedHat PV drivers aren't supported.)*
- CentOS 6.4 to 6.10, 7.1 to 7.6 (HVM virtualized instances only)
- The Mobility service must be installed on each VM that you want to replicate.
> [!IMPORTANT]
> Site Recovery installs this service automatically when you enable replication for the VM. For automatic installation, you must prepare an account on the EC2 instances that Site Recovery will use to access the VM.
> You can use a domain or local account.
> - For Linux VMs, the account should be root on the source Linux server.
> - For Windows VMs, if you're not using a domain account, disable Remote User Access control on the local machine:
>
> In the registry, under **HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System**,
> add the DWORD entry **LocalAccountTokenFilterPolicy** and set the value to **1**.
- A separate EC2 instance that you can use as the Site Recovery configuration server. This instance must be running Windows Server 2012 R2.
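For the registry change described in the note above, one way to apply it is from an elevated command prompt on each Windows EC2 instance (a sketch of one possible approach, not the only one):

```cmd
reg add "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System" /v LocalAccountTokenFilterPolicy /t REG_DWORD /d 1 /f
```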
## Prepare Azure resources
You must have a few resources ready in Azure for the migrated EC2 instances to use. These include a storage account, a vault, and a virtual network.
### Create a storage account
Images of replicated machines are held in Azure Storage. Azure VMs are created from storage when you fail over from on-premises to Azure.
1. In the [Azure portal](https://portal.azure.com), in the left menu, select **Create a resource** > **Storage** > **Storage account**.
2. Enter a name for your storage account. In these tutorials, we use the name **awsmigrated2017**. The name must:
- Be unique in Azure
- Be between 3 and 24 characters
- Contain only numbers and lowercase letters
3. Leave the defaults for **Deployment model**, **Account kind**, **Performance**, and **Secure transfer required**.
4. For **Replication**, select the default **RA-GRS**.
5. Select the subscription that you want to use for this tutorial.
6. For **Resource group**, select **Create new**. In this example, we use **migrationRG** for the resource group name.
7. For **Location**, select **West Europe**.
8. Select **Create** to create the storage account.
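If you prefer scripting over the portal, the same resource group and storage account can be created with the Azure CLI. This is a sketch, assuming `az` is installed and logged in; the names match this tutorial:

```azurecli
az group create --name migrationRG --location westeurope
az storage account create --name awsmigrated2017 --resource-group migrationRG --location westeurope --sku Standard_RAGRS
```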
### Create a vault
1. In the [Azure portal](https://portal.azure.com), select **All services**. Search for and then select **Recovery Services vaults**.
2. On the Azure Recovery Services vaults page, select **Add**.
3. For **Name**, enter **myVault**.
4. For **Subscription**, select the subscription that you want to use.
5. For **Resource Group**, select **Use existing**, and then select **migrationRG**.
6. For **Location**, select **West Europe**.
7. Select **Pin to dashboard** to be able to quickly access the new vault from the dashboard.
8. When you're done, select **Create**.
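As with the storage account, the vault can also be created from the Azure CLI. This sketch assumes the `migrationRG` resource group already exists:

```azurecli
az backup vault create --name myVault --resource-group migrationRG --location westeurope
```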
To see the new vault, go to **Dashboard** > **All resources**. The new vault also appears on the main **Recovery Services vaults** page.
### Set up an Azure network
When Azure VMs are created after the migration (failover), they're joined to this Azure network.
1. In the [Azure portal](https://portal.azure.com), select **Create a resource** > **Networking** > **Virtual network**.
2. For **Name**, enter **myMigrationNetwork**.
3. Leave the default value for **Address space**.
4. For **Subscription**, select the subscription that you want to use.
5. For **Resource group**, select **Use existing**, and then select **migrationRG**.
6. For **Location**, select **West Europe**.
7. Under **Subnet**, leave the default values for **Name** and **IP range**.
8. Leave the default settings for **DDoS protection**.
9. Leave the **Service Endpoints** option disabled.
10. Leave the default settings for **Firewall**.
11. When you're done, select **Create**.
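The equivalent network, again as an Azure CLI sketch (the subnet name `default` matches the portal's default):

```azurecli
az network vnet create --name myMigrationNetwork --resource-group migrationRG --location westeurope --subnet-name default
```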
## Prepare the infrastructure
On your vault page in the Azure portal, in the **Getting Started** section, select **Site Recovery**, and then select **Prepare Infrastructure**. Complete the following steps.
### 1: Protection goal
On the **Protection Goal** page, select the following values:
| | |
|---------|-----------|
| Where are your machines located? |Select **On-premises**.|
| Where do you want to replicate your machines? |Select **To Azure**.|
| Are your machines virtualized? |Select **Not virtualized / Other**.|
When you're done, select **OK** to move to the next section.
### 2: Select deployment planning
In **Have you completed deployment planning**, select **I will do it later**, and then select **OK**.
### 3: Prepare source
On the **Prepare source** page, select **+ Configuration Server**.
1. Use an EC2 instance that's running Windows Server 2012 R2 to create a configuration server and register it with your recovery vault.
2. Configure the proxy on the EC2 instance VM you're using as the configuration server so that it can access the [service URLs](site-recovery-support-matrix-to-azure.md).
3. Download [Microsoft Azure Site Recovery Unified Setup](https://aka.ms/unifiedinstaller_wus). You can download it to your local machine and then copy it to the VM you're using as the configuration server.
4. Select the **Download** button to download the vault registration key. Copy the downloaded file to the VM you're using as the configuration server.
5. On the VM, right-click the installer you downloaded for Microsoft Azure Site Recovery Unified Setup, and then select **Run as administrator**.
1. Under **Before You Begin**, select **Install the configuration server and process server**, and then select **Next**.
2. In **Third-Party Software License**, select **I accept the third-party license agreement**, and then select **Next**.
3. In **Registration**, select **Browse**, and then go to where you put the vault registration key file. Select **Next**.
4. In **Internet Settings**, select **Connect to Azure Site Recovery without a proxy server**, and then select **Next**.
5. The **Prerequisites Check** page runs checks for several items. When it's finished, select **Next**.
6. In **MySQL Configuration**, provide the required passwords, and then select **Next**.
7. In **Environment Details**, select **No**. You don't need to protect VMware machines. Then, select **Next**.
8. In **Install Location**, select **Next** to accept the default.
9. In **Network Selection**, select **Next** to accept the default.
10. In **Summary**, select **Install**.
11. **Installation Progress** shows you information about the installation process. When it's finished, select **Finish**. A window displays a message about a reboot. Select **OK**. Next, a window displays a message about the configuration server connection passphrase. Copy the passphrase to your clipboard and save it somewhere safe.
6. On the VM, run cspsconfigtool.exe to create one or more management accounts on the configuration server. Make sure that the management accounts have administrator permissions on the EC2 instances that you want to migrate.
When you're done setting up the configuration server, go back to the portal and select the server that you created for **Configuration Server**. Select **OK** to go to 4: Prepare target.
### 4: Prepare target
In this section, you enter information about the resources that you created in [Prepare Azure resources](#prepare-azure-resources) earlier in this tutorial.
1. In **Subscription**, select the Azure subscription that you used for the [Prepare Azure](tutorial-prepare-azure.md) tutorial.
2. Select **Resource Manager** as the deployment model.
3. Site Recovery verifies that you have one or more compatible Azure storage accounts and networks. These should be the resources that you created in [Prepare Azure resources](#prepare-azure-resources) earlier in this tutorial.
4. When you're done, select **OK**.
### 5: Prepare replication settings
Before you can enable replication, you must create a replication policy.
1. Select **Create and Associate**.
2. In **Name**, enter **myReplicationPolicy**.
3. Leave the rest of the default settings, and then select **OK** to create the policy. The new policy is automatically associated with the configuration server.
When you're finished with all five sections under **Prepare Infrastructure**, select **OK**.
## Enable replication
Enable replication for each VM that you want to migrate. When replication is enabled, Site Recovery automatically installs the Mobility service.
1. Go to the [Azure portal](https://portal.azure.com).
2. On the page for your vault, under **Getting Started**, select **Site Recovery**.
3. Under **For on-premises machines and Azure VMs**, select **Step 1: Replicate application**. Complete the wizard pages with the following information. Select **OK** on each page when you're done:
- 1: Configure source
| | |
|-----|-----|
| Source: | Select **On Premises**.|
| Source location:| Enter the name of your configuration server EC2 instance.|
|Machine type: | Select **Physical machines**.|
| Process server: | Select the configuration server from the drop-down list.|
- 2: Configure target
| | |
|-----|-----|
| Target: | Leave the default.|
| Subscription: | Select the subscription that you have been using.|
| Post-failover resource group:| Use the resource group you created in [Prepare Azure resources](#prepare-azure-resources).|
| Post-failover deployment model: | Select **Resource Manager**.|
| Storage account: | Select the storage account that you created in [Prepare Azure resources](#prepare-azure-resources).|
| Azure network: | Select **Configure now for selected machines**.|
| Post-failover Azure network: | Choose the network you created in [Prepare Azure resources](#prepare-azure-resources).|
| Subnet: | Select the **default** in the drop-down list.|
- 3: Select physical machines
Select **Physical machine**, and then enter the values for **Name**, **IP Address**, and **OS Type** of the EC2 instance that you want to migrate. Select **OK**.
- 4: Configure properties
Select the account that you created on the configuration server, and then select **OK**.
- 5: Configure replication settings
Make sure that the replication policy selected in the drop-down list is **myReplicationPolicy**, and then select **OK**.
4. When the wizard is finished, select **Enable replication**.
To track the progress of the **Enable Protection** job, go to **Monitoring and reports** > **Jobs** > **Site Recovery Jobs**. After the **Finalize Protection** job runs, the machine is ready for failover.
When you enable replication for a VM, changes can take 15 minutes or longer to take effect and appear in the portal.
## Run a test failover
When you run a test failover, the following events occur:
- A prerequisites check runs to make sure that all the conditions required for failover are in place.
- Failover processes the data so that an Azure VM can be created. If you select the latest recovery point, a recovery point is created from the data.
- An Azure VM is created by using the data processed in the preceding step.
In the portal, run the test failover:
1. On the page for your vault, go to **Protected items** > **Replicated Items**. Select the VM, and then select **Test Failover**.
2. Select a recovery point to use for the failover:
- **Latest processed**: Fails over the VM to the latest recovery point that was processed by Site Recovery. The time stamp is shown. With this option, no time is spent processing data, so it provides a low recovery time objective (RTO).
- **Latest app-consistent**: This option fails over all VMs to the latest app-consistent recovery point. The time stamp is shown.
- **Custom**: Select any recovery point.
3. In **Test Failover**, select the target Azure network to which Azure VMs will be connected after failover occurs. This should be the network you created in [Prepare Azure resources](#prepare-azure-resources).
4. Select **OK** to begin the failover. To track progress, select the VM to view its properties. Or you can select the **Test Failover** job on the page for your vault. To do this, select **Monitoring and reports** > **Jobs** > **Site Recovery jobs**.
5. When the failover finishes, the replica Azure VM appears in the Azure portal. To view the VM, select **Virtual Machines**. Ensure that the VM is the appropriate size, that it's connected to the right network, and that it's running.
6. You should now be able to connect to the replicated VM in Azure.
7. To delete Azure VMs that were created during the test failover, select **Cleanup test failover** in the recovery plan. In **Notes**, record and save any observations associated with the test failover.
In some scenarios, failover requires additional processing. Processing takes 8 to 10 minutes to finish.
## Migrate to Azure
Run an actual failover for the EC2 instances to migrate them to Azure VMs:
1. In **Protected items** > **Replicated items**, select the AWS instances, and then select **Failover**.
2. In **Failover**, select a **Recovery Point** to failover to. Select the latest recovery point, and start the failover. You can follow the failover progress on the **Jobs** page.
3. Ensure that the VM appears in **Replicated items**.
4. Right-click each VM, and then select **Complete Migration**. This does the following:
- This finishes the migration process, stops replication for the AWS VM, and stops Site Recovery billing for the VM.
- This step cleans up the replication data. It doesn't delete the migrated VMs.

> [!WARNING]
> *Don't cancel a failover that is in progress*. Before failover is started, VM replication is stopped. If you cancel a failover that is in progress, failover stops, but the VM won't replicate again.
## Next steps
In this article, you learned how to migrate AWS EC2 instances to Azure VMs. To learn more about Azure VMs, continue to the tutorials for Windows VMs.
> [!div class="nextstepaction"]
> [Azure Windows virtual machine tutorials](../virtual-machines/windows/tutorial-manage-vm.md)
---
title: What is the Bing Custom Search API?
titleSuffix: Azure Cognitive Services
description: The Bing Custom Search API lets you create tailored search experiences for topics that you care about.
services: cognitive-services
author: aahill
manager: nitinme
ms.service: cognitive-services
ms.subservice: bing-custom-search
ms.topic: overview
ms.date: 12/18/2019
ms.author: aahi
ms.openlocfilehash: e70d4d83fcd9641fa86f013688b5a3a6e3652919
ms.sourcegitcommit: 9eda79ea41c60d58a4ceab63d424d6866b38b82d
ms.translationtype: MT
ms.contentlocale: cs-CZ
ms.lasthandoff: 11/30/2020
ms.locfileid: "96338397"
---
# <a name="what-is-the-bing-custom-search-api"></a>What is the Bing Custom Search API?
> [!WARNING]
> Bing Search APIs are moving from Cognitive Services to Bing Search Services. Starting **October 30, 2020**, any new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
> Bing Search APIs provisioned using Cognitive Services will be supported for the next three years, or until the end of your Enterprise Agreement, whichever happens first.
> For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource).
The Bing Custom Search API lets you create tailored, ad-free search experiences for topics that you care about. You can specify the domains and webpages for Bing to search, as well as pin, boost, or demote specific content, to create a custom view of the web and help your users quickly find relevant search results.
## <a name="features"></a>Features
|Feature |Description |
|---------|---------|
|[Custom real-time search suggestions](define-custom-suggestions.md) | Provide search suggestions that can be displayed as a dropdown list as your users type. |
|[Custom image search experiences](get-images-from-instance.md) | Enable your users to search for images from the domains and websites specified in your custom search instance. |
|[Custom video search experiences](get-videos-from-instance.md) | Enable your users to search for videos from the domains and sites specified in your custom search instance. |
|[Sharing your custom search instance](share-your-custom-search.md) | Collaboratively edit and test your search instance by sharing it with members of your team. |
|[Configure a UI for your applications and websites](hosted-ui.md) | Provides a hosted UI that you can easily integrate into your webpages and web applications as a JavaScript code snippet. |
## <a name="workflow"></a>Workflow
You can create a customized search instance with the [Bing Custom Search portal](https://customsearch.ai). The portal lets you create a custom search instance that specifies the domains, websites, and webpages that you want Bing to search, along with the ones that you don't want it to search. You can also use the portal to preview the search experience, adjust the search rankings that the API provides, and optionally configure a searchable user interface to be rendered in your websites and applications.
After you create your search instance, you can integrate it (and optionally a user interface) into your website or application by calling the Bing Custom Search API:

## <a name="next-steps"></a>Next steps
To get started quickly, see [Create your first Bing Custom Search instance](quick-start.md).
For details about customizing your search instance, see [Define a custom search instance](define-your-custom-view.md).
Be sure to read the [Bing Use and Display Requirements](../bing-web-search/use-display-requirements.md) for using search results in your services and applications.
Visit the [Bing Search API hub page](../bing-web-search/overview.md) to explore the other available APIs.
Familiarize yourself with the reference content for each of the custom search endpoints. The reference includes the endpoints, headers, and query parameters that you can use when requesting search results. It also includes definitions of the response objects.
[!INCLUDE [cognitive-services-bing-url-note](../../../includes/cognitive-services-bing-url-note.md)]
- [Custom Search API](/rest/api/cognitiveservices-bingsearch/bing-custom-search-api-v7-reference)
- [Bing Custom Image Search API](/rest/api/cognitiveservices-bingsearch/bing-custom-images-api-v7-reference)
- [Bing Custom Video Search API](/rest/api/cognitiveservices-bingsearch/bing-custom-videos-api-v7-reference)
- [Bing Custom Autosuggest API](/rest/api/cognitiveservices-bingsearch/bing-custom-autosuggest-api-v7-reference)
---
title: "Message retention policies"
date: 2021-08-03T20:30:00+02:00
---
# Message retention policies
Message retention policies exist at a room level, follow the semantics described in [MSC1763](https://github.com/matrix-org/matrix-doc/blob/matthew/msc1763/proposals/1763-configurable-retention-periods.md), and allow server and room admins to configure how long messages should be kept in a homeserver's database before being purged from it.
!!! note
Please note that, as this feature isn't part of the Matrix specification yet, this implementation is to be considered as experimental.
A message retention policy is mainly defined by its `max_lifetime` parameter, which defines how long a message can be kept around after it was sent to the room. If a room doesn't have a message retention policy, and there's no default one for a given server, then no message sent in that room is ever purged on that server.
MSC1763 also specifies semantics for a `min_lifetime` parameter which defines the amount of time after which an event can get purged (after it was sent to the room), but Synapse doesn't currently support it beyond registering it.
Both `max_lifetime` and `min_lifetime` are optional parameters.
!!! note
Note that message retention policies don't apply to state events.
More information about message retention policies can be found [here](https://github.com/matrix-org/synapse/blob/master/docs/message_retention_policies.md).
## Lifetime limits on envs.net
Our current instance lifetime limits are:
```yaml
allowed_lifetime_min: 30m
allowed_lifetime_max: 3y
```
## Room configuration
To configure a room's message retention policy, a room's admin or
moderator needs to send a state event in that room with the type
`m.room.retention` and the following content:
* Type `/devtools`.
* Select "Create custom event".
* Press the red "Event" button to change it to "State event".
* Set the event type to `m.room.retention`.
```json
{
"max_lifetime": ...
}
```
Press `Send`.
In this event's content, the `max_lifetime` parameter has the same
meaning as previously described, and needs to be expressed in
milliseconds. The event's content can also include a `min_lifetime`
parameter, which has the same meaning and limited support as previously
described.
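Because the value is in milliseconds, it's easy to be off by a factor of 1,000. A small helper along these lines (purely illustrative, not part of Synapse) converts human-readable durations into the right unit:

```python
# Convert duration strings such as "30m", "7d" or "3y" into milliseconds,
# the unit m.room.retention expects for max_lifetime / min_lifetime.
UNIT_MS = {
    "s": 1_000,
    "m": 60 * 1_000,
    "h": 60 * 60 * 1_000,
    "d": 24 * 60 * 60 * 1_000,
    "w": 7 * 24 * 60 * 60 * 1_000,
    "y": 365 * 24 * 60 * 60 * 1_000,
}

def duration_to_ms(duration: str) -> int:
    """Parse e.g. "30m" into 1800000 milliseconds."""
    value, unit = duration[:-1], duration[-1]
    if unit not in UNIT_MS:
        raise ValueError(f"unknown duration unit: {unit!r}")
    return int(value) * UNIT_MS[unit]

# Content for a one-week retention policy:
print({"max_lifetime": duration_to_ms("7d")})  # {'max_lifetime': 604800000}
```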
!!! note
Note that, among all the servers in the room, only the ones with support for
message retention policies will actually remove expired events. This
support is currently not enabled by default in Synapse.
# Changelog
## 1.0.0 (2020-11-23)
This is the first release of jnotebook-reader!
In this group ODE solver settings for ODE solvers from external libraries are
described.
---
title: RELEASE SAVEPOINT
summary: The RELEASE SAVEPOINT statement commits the nested transaction starting at the corresponding SAVEPOINT statement using the same savepoint name.
toc: true
---
The `RELEASE SAVEPOINT` statement commits the [nested transaction](transactions.html#nested-transactions) starting at the corresponding `SAVEPOINT` statement using the same savepoint name, including all its nested sub-transactions. This is in addition to continued support for working with [retry savepoints](savepoint.html#savepoints-for-client-side-transaction-retries).
## Synopsis
<div>
{% include {{ page.version.version }}/sql/generated/diagrams/release_savepoint.html %}
</div>
## Required privileges
No [privileges](authorization.html#assign-privileges) are required to release a savepoint. However, privileges are required for each statement within a transaction.
## Parameters
Parameter | Description
--------- | -----------
name | The name of the savepoint. [Retry savepoints](savepoint.html#savepoints-for-client-side-transaction-retries) default to using the name `cockroach_restart`, but this can be customized using a session variable. For more information, see [Customizing the retry savepoint name](savepoint.html#customizing-the-retry-savepoint-name).
## Handling errors
The `RELEASE SAVEPOINT` statement is invalid after the nested transaction has encountered an error. After an error, the following statements can be used:
- [`ROLLBACK TO SAVEPOINT`](rollback-transaction.html#rollback-a-nested-transaction) to roll back to the previous savepoint.
- [`ROLLBACK` or `ABORT`](rollback-transaction.html#rollback-a-transaction) to roll back the entire surrounding transaction.
- [`COMMIT`](commit-transaction.html) to commit the entire surrounding transaction. In case of error, `COMMIT` is synonymous with [`ROLLBACK`/`ABORT`](rollback-transaction.html) and also rolls back the entire transaction.
When a (sub-)transaction encounters a retry error, the client should repeat `ROLLBACK TO SAVEPOINT` and the statements in the transaction until the statements complete without error, then issue `RELEASE`.
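The loop described here is client-side logic rather than a single SQL statement, so it can help to see it sketched in code. The following is a minimal, driver-agnostic sketch and not an official CockroachDB client API: `run_txn`, the `execute` method it expects, and the `RetryError` class are hypothetical stand-ins for whatever your database driver provides (a real driver surfaces retryable failures as an error with SQLSTATE `40001`).

```python
class RetryError(Exception):
    """Stand-in for a driver's retryable-transaction error (SQLSTATE 40001)."""

def run_txn(conn, statements, max_attempts=10):
    """Run `statements` inside a retry-savepoint loop.

    `conn` is any object with an `execute(sql)` method that raises
    RetryError when the database asks the client to retry.
    Returns the number of attempts that were needed.
    """
    conn.execute("BEGIN")
    conn.execute("SAVEPOINT cockroach_restart")
    for attempt in range(1, max_attempts + 1):
        try:
            for stmt in statements:
                conn.execute(stmt)
            # All statements succeeded: commit the nested transaction...
            conn.execute("RELEASE SAVEPOINT cockroach_restart")
            # ...then close out the surrounding transaction.
            conn.execute("COMMIT")
            return attempt
        except RetryError:
            # Retry error: roll back to the savepoint and re-run the statements.
            conn.execute("ROLLBACK TO SAVEPOINT cockroach_restart")
    conn.execute("ROLLBACK")
    raise RetryError(f"gave up after {max_attempts} attempts")
```

With a real driver, you would catch its serialization-failure exception in place of `RetryError`; the control flow (`ROLLBACK TO SAVEPOINT`, re-run the statements, then `RELEASE SAVEPOINT` and `COMMIT`) is the part to preserve.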
To completely remove the marker of a nested transaction after it encounters an error and begin other work in the outer transaction, use [`ROLLBACK TO SAVEPOINT`](rollback-transaction.html#rollback-a-nested-transaction) immediately followed by `RELEASE`.
## Examples
### Commit a nested transaction by releasing a savepoint
{{site.data.alerts.callout_info}}
This example uses the [MovR data set](movr.html).
{{site.data.alerts.end}}
In the example below, we roll back the inner [nested transaction](transactions.html#nested-transactions) (marked by the savepoint `lower`) and release (commit) the outer savepoint `higher`, which raises the promo code discount to 15% using CockroachDB's [JSONB functions](jsonb.html#functions).
{% include_cached copy-clipboard.html %}
~~~ sql
> BEGIN;
SAVEPOINT higher;
UPDATE promo_codes SET rules = jsonb_set(rules, '{value}', '"15%"') WHERE rules @> '{"type": "percent_discount"}';
SAVEPOINT lower;
UPDATE promo_codes SET rules = jsonb_set(rules, '{value}', '"7.5%"') WHERE rules @> '{"type": "percent_discount"}';
ROLLBACK TO SAVEPOINT lower;
RELEASE SAVEPOINT higher;
COMMIT;
~~~
~~~
COMMIT
~~~
### Commit a transaction by releasing a retry savepoint
{% include {{page.version.version}}/sql/retry-savepoints.md %}
After declaring a retry savepoint, commit the transaction with `RELEASE SAVEPOINT` and then prepare the connection for the next transaction with [`COMMIT`](commit-transaction.html):
{% include_cached copy-clipboard.html %}
~~~ sql
> BEGIN;
SAVEPOINT cockroach_restart;
UPDATE products SET inventory = 0 WHERE sku = '8675309';
INSERT INTO orders (customer, sku, status) VALUES (1001, '8675309', 'new');
RELEASE SAVEPOINT cockroach_restart;
COMMIT;
~~~
Applications using `SAVEPOINT` for client-side transaction retries must also include functions to execute retries with [`ROLLBACK TO SAVEPOINT `](rollback-transaction.html#retry-a-transaction).
Note that you can [customize the retry savepoint name](savepoint.html#customizing-the-retry-savepoint-name) to something other than `cockroach_restart` with a session variable if you need to.
## See also
- [Transactions](transactions.html)
- [`SAVEPOINT`](savepoint.html)
- [`SHOW SAVEPOINT STATUS`](show-savepoint-status.html)
- [`ROLLBACK`](rollback-transaction.html)
- [`BEGIN`](begin-transaction.html)
- [`COMMIT`](commit-transaction.html)
| 49.362637 | 374 | 0.773375 | eng_Latn | 0.823947 |
584b7e2db63dd3b9f173622515a2ef7625653494 | 12,172 | md | Markdown | docs-archive-a/2014/integration-services/extending-packages-custom-objects-data-flow-types/developing-data-flow-components-with-multiple-inputs.md | v-alji/sql-docs-archive-pr.ru-ru | f3306484e4ae9ca635c054e60540719daaf94555 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs-archive-a/2014/integration-services/extending-packages-custom-objects-data-flow-types/developing-data-flow-components-with-multiple-inputs.md | v-alji/sql-docs-archive-pr.ru-ru | f3306484e4ae9ca635c054e60540719daaf94555 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-11-25T02:19:55.000Z | 2021-11-25T02:26:43.000Z | docs-archive-a/2014/integration-services/extending-packages-custom-objects-data-flow-types/developing-data-flow-components-with-multiple-inputs.md | v-alji/sql-docs-archive-pr.ru-ru | f3306484e4ae9ca635c054e60540719daaf94555 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-09-29T08:52:35.000Z | 2021-09-29T08:52:35.000Z | ---
title: Developing Data Flow Components with Multiple Inputs | Microsoft Docs
ms.custom: ''
ms.date: 04/27/2017
ms.prod: sql-server-2014
ms.reviewer: ''
ms.technology: integration-services
ms.topic: reference
ms.assetid: 3c7b50e8-2aa6-4f6a-8db4-e8293bc21027
author: chugugrace
ms.author: chugu
ms.openlocfilehash: fb56878c1b1b68dfdc4de19cb2b811de494c761b
ms.sourcegitcommit: ad4d92dce894592a259721a1571b1d8736abacdb
ms.translationtype: MT
ms.contentlocale: ru-RU
ms.lasthandoff: 08/04/2020
ms.locfileid: "87736605"
---
# <a name="developing-data-flow-components-with-multiple-inputs"></a>Developing Data Flow Components with Multiple Inputs
  A data flow component that has multiple inputs can consume excessive memory if its inputs produce data at uneven rates. When you develop a custom data flow component that supports two or more inputs, you can manage this memory pressure by using the following members in the Microsoft.SqlServer.Dts.Pipeline namespace:
- The <xref:Microsoft.SqlServer.Dts.Pipeline.DtsPipelineComponentAttribute.SupportsBackPressure%2A> property of the <xref:Microsoft.SqlServer.Dts.Pipeline.DtsPipelineComponentAttribute> class. Set the value of this property to `true` if you want to implement the code that your custom data flow component needs in order to manage data that arrives at uneven rates.
- The <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent.IsInputReady%2A> method of the <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent> class. You must provide an implementation of this method if you set the <xref:Microsoft.SqlServer.Dts.Pipeline.DtsPipelineComponentAttribute.SupportsBackPressure%2A> property to `true`. If you do not provide an implementation, the data flow engine raises an exception at run time.
- The <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent.GetDependentInputs%2A> method of the <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent> class. You must also provide an implementation of this method if you set the <xref:Microsoft.SqlServer.Dts.Pipeline.DtsPipelineComponentAttribute.SupportsBackPressure%2A> property to `true` and your custom component supports more than two inputs. If you do not provide an implementation, the data flow engine raises an exception at run time when a user attaches more than two inputs.
  Together, these members enable you to develop a solution for memory pressure that is similar to the one that Microsoft developed for the Merge and Merge Join transformations.
## <a name="setting-the-supportsbackpressure-property"></a>Setting the SupportsBackPressure Property
  The first step in implementing improved memory management for a custom data flow component that supports multiple inputs is to set the value of the <xref:Microsoft.SqlServer.Dts.Pipeline.DtsPipelineComponentAttribute.SupportsBackPressure%2A> property to `true` in the <xref:Microsoft.SqlServer.Dts.Pipeline.DtsPipelineComponentAttribute>. When the value of <xref:Microsoft.SqlServer.Dts.Pipeline.DtsPipelineComponentAttribute.SupportsBackPressure%2A> is `true`, the data flow engine calls the <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent.IsInputReady%2A> method and, when there are more than two inputs, also calls <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent.GetDependentInputs%2A> at run time.
### <a name="example"></a>Example
  In the following example, the implementation of the <xref:Microsoft.SqlServer.Dts.Pipeline.DtsPipelineComponentAttribute> sets the value of <xref:Microsoft.SqlServer.Dts.Pipeline.DtsPipelineComponentAttribute.SupportsBackPressure%2A> to `true`.
```csharp
[DtsPipelineComponent(ComponentType = ComponentType.Transform,
DisplayName = "Shuffler",
Description = "Shuffle the rows from input.",
SupportsBackPressure = true,
LocalizationType = typeof(Localized),
IconResource = "Microsoft.Samples.SqlServer.Dts.MIBPComponent.ico")
]
public class Shuffler : Microsoft.SqlServer.Dts.Pipeline.PipelineComponent
{
...
}
```
## <a name="implementing-the-isinputready-method"></a>Implementing the IsInputReady Method
  When you set the value of the <xref:Microsoft.SqlServer.Dts.Pipeline.DtsPipelineComponentAttribute.SupportsBackPressure%2A> property to `true` in the <xref:Microsoft.SqlServer.Dts.Pipeline.DtsPipelineComponentAttribute> object, you must also provide an implementation for the <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent.IsInputReady%2A> method of the <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent> class.
> [!NOTE]
>  Your implementation of the <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent.IsInputReady%2A> method should not call the implementations of the base class. The default implementation of this method in the base class simply raises a `NotImplementedException`.
  When you implement this method, you set the status of an element in the Boolean *canProcess* array for each of the component's inputs. (The inputs are identified by their ID values in the *inputIDs* array.) When you set the value of an element in the *canProcess* array to `true` for an input, the data flow engine calls the component's <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent.ProcessInput%2A> method and provides more data for the specified input.
  While more upstream data is available, the value of the *canProcess* array element for at least one input must always be `true`, or processing stops.
  The data flow engine calls the <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent.IsInputReady%2A> method before sending each buffer of data to determine which inputs are waiting to receive more data. When the return value indicates that an input is blocked, the data flow engine temporarily caches additional buffers of data for that input instead of sending them to the component.
> [!NOTE]
>  You do not call the <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent.IsInputReady%2A> or <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent.GetDependentInputs%2A> methods in your own code. The data flow engine calls these methods of the `PipelineComponent` class, which you override, when the data flow engine runs your component.
### <a name="example"></a>Example
  In the following example, the implementation of the <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent.IsInputReady%2A> method indicates that an input is waiting to receive more data when the following conditions are true:
- More upstream data is available for the input (`!inputEOR`).
- The component does not currently have data available to process for the input in the buffers that it has already received (`inputBuffers[inputIndex].CurrentRow() == null`).
  If an input is waiting to receive more data, the data flow component indicates this by setting the value of the element in the *canProcess* array that corresponds to that input to `true`.
  Conversely, if the component still has data available to process for the input, the example suspends the processing of that input. The example does this by setting the value of the element in the *canProcess* array that corresponds to that input to `false`.
```csharp
public override void IsInputReady(int[] inputIDs, ref bool[] canProcess)
{
for (int i = 0; i < inputIDs.Length; i++)
{
int inputIndex = ComponentMetaData.InputCollection.GetObjectIndexByID(inputIDs[i]);
canProcess[i] = (inputBuffers[inputIndex].CurrentRow() == null)
&& !inputEOR[inputIndex];
}
}
```
  The preceding example used the Boolean `inputEOR` array to indicate whether more upstream data is available for each input. `EOR` in the name of the array represents "end of rowset" and refers to the <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineBuffer.EndOfRowset%2A> property of the data flow buffers. In a portion of the example that is not shown here, the <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent.ProcessInput%2A> method checks the value of the <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineBuffer.EndOfRowset%2A> property for each buffer of data that it receives. When a value of `true` indicates that there is no more upstream data available for an input, the example sets the value of the `inputEOR` array element for that input to `true`. In that case, the example's <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent.IsInputReady%2A> method sets the value of the corresponding element in the *canProcess* array to `false` for the input, because no more upstream data is available for it.
## <a name="implementing-the-getdependentinputs-method"></a>Implementing the GetDependentInputs Method
  When your custom data flow component supports more than two inputs, you must also provide an implementation for the <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent.GetDependentInputs%2A> method of the <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent> class.
> [!NOTE]
>  Your implementation of the <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent.GetDependentInputs%2A> method should not call the implementations of the base class. The default implementation of this method in the base class simply raises a `NotImplementedException`.
  The data flow engine calls the <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent.GetDependentInputs%2A> method only when the user attaches more than two inputs to the component. When a component has only two inputs, and the <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent.IsInputReady%2A> method indicates that one input is blocked (*canProcess* = `false`), the data flow engine knows that the other input is waiting to receive more data. However, when there are more than two inputs, and the <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent.IsInputReady%2A> method indicates that one input is blocked, the additional code in <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent.GetDependentInputs%2A> identifies which inputs are waiting to receive more data.
> [!NOTE]
>  You do not call the <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent.IsInputReady%2A> or <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent.GetDependentInputs%2A> methods in your own code. The data flow engine calls these methods of the `PipelineComponent` class, which you override, when the data flow engine runs your component.
### <a name="example"></a>Example
  For a specific blocked input, the following implementation of the <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent.GetDependentInputs%2A> method returns the collection of inputs that are waiting to receive more data, and are therefore blocking the specified input. The component identifies the blocking inputs by checking for inputs, other than the blocked input, that do not currently have data available for processing in the buffers that the component has already received (`inputBuffers[i].CurrentRow() == null`). The <xref:Microsoft.SqlServer.Dts.Pipeline.PipelineComponent.GetDependentInputs%2A> method then returns the collection of blocking inputs as a collection of input IDs.
```csharp
public override Collection<int> GetDependentInputs(int blockedInputID)
{
Collection<int> currentDependencies = new Collection<int>();
for (int i = 0; i < ComponentMetaData.InputCollection.Count; i++)
{
if (ComponentMetaData.InputCollection[i].ID != blockedInputID
&& inputBuffers[i].CurrentRow() == null)
{
currentDependencies.Add(ComponentMetaData.InputCollection[i].ID);
}
}
return currentDependencies;
}
```
| 98.95935 | 1,087 | 0.80677 | rus_Cyrl | 0.853204 |
584ba56c9a16ef0c2ee7109d7d43ec7c970653ee | 436 | md | Markdown | build/README.md | tsvx/dbup-cli | f3703c4cf040c1fbb8371ab6b3a615509bcaad64 | [
"MIT"
] | 22 | 2019-04-10T07:57:55.000Z | 2022-03-31T09:06:33.000Z | build/README.md | tsvx/dbup-cli | f3703c4cf040c1fbb8371ab6b3a615509bcaad64 | [
"MIT"
] | 11 | 2020-03-23T14:17:11.000Z | 2022-02-14T16:34:02.000Z | build/README.md | tsvx/dbup-cli | f3703c4cf040c1fbb8371ab6b3a615509bcaad64 | [
"MIT"
] | 4 | 2019-09-18T17:09:16.000Z | 2022-02-23T08:31:15.000Z |
## Pack the .NET Framework executable into one exe file
You will need `ILMerge.exe` and `System.Compiler.dll` in this directory.
Download them [here](https://www.nuget.org/packages/ilmerge/) and unpack from `tools\net452`.
Then, run `PackDbUp.cmd` and you get a single self-contained `dbup-cli.exe`.
Note that the script may need upgrading after a code change
as the list of the dependent dlls is hardcoded in the script.
| 39.636364 | 94 | 0.745413 | eng_Latn | 0.99332 |
584c0c2af1600bc51fb85d6cb6e3abffb89d66ff | 16,755 | md | Markdown | journals/The Generalist.md | Mygi/artefactory | 8fb4a7e2833fe6509bb1d53e1c323aac28c63230 | [
"MIT"
] | null | null | null | journals/The Generalist.md | Mygi/artefactory | 8fb4a7e2833fe6509bb1d53e1c323aac28c63230 | [
"MIT"
] | null | null | null | journals/The Generalist.md | Mygi/artefactory | 8fb4a7e2833fe6509bb1d53e1c323aac28c63230 | [
"MIT"
] | null | null | null | # The Generalist
This is a sub-section of journal entries for items that do not fit into dialysis, research or coding. Probably the bulk of my thinking.
## November 8th 2020
**Content Ideas**
So far we have some good ideas percolating on content:
1. Comparing end-to-end frameworks for building my blog and then other models - and hooking them all together. I've been looking for a framework for a while.
2. Investigation into what is photorealistic photography and how I should use photographs to provide an honest - warts and all - portrayal of myself and my challenges with my condition. What art should be allowed? Perhaps surrealism will be more valuable? Need to communicate with many people such as professional / amateur photographers, medical photographers and scientific image analysts. A really good question - how would we perform pattern analysis on photographs - what should the baseline of honesty be - and how does this impact on AI?
3. I have been thinking about the idea of spiritualism and renal failure. The idea that when I run long distance and sweat out - I seem to feel like I escape some dark spirit or phantom and my mind awakens from a cloud to see things very clearly. This is likely a toxin in my body - but I noticed similarities to Noragami - perhaps the combination of running and then bathing/swimming/sweating in spiritual water (some cleansing) and some sweat keeps the demons of toxicity out. This is by no means the blade that destroys/ablutes but it gives one the chance to find and cast that blade. Perhaps there is something in the spiritual journey of renal failure - and this may well be why I feel a connection to Indigenous Australians with the same condition. Something is not balanced - between my energy and that of the land - and it needs to be corrected.
4. So I have three established exercise goals to get me moving more - including 400,000 steps in November, 500,000 steps in December and 500km in January(!). These are very ambitious but I am thus far on track.
5. In addition to this I had the thought about my upcoming travels. In linking those two ideas, I thought I should take a local dialysis journey through Victoria - whereby I have to run or ride from one destination to the next. If I should do this, I think I should follow the spiritual journey of Indigenous Victorians - which will require some help. I will need to plan dialysis centers along the way. Could be a really fun way to start. Likely I'll need to follow the flow of water. I would love to do this with Jenson over the summer holidays.
6. On top of this, my intent is to pursue similar journeys in Queensland, WA and the Northern Territory. There is a lot to understand.
7. I probably should write about learning Japanese - as a way to motivate me to do so. As that will be key to phase 2 of this journey.
This brings me to the challenge of what to do next. These blogs need to become more established in order to move forwards but I also need to stop bringing negative energy into the house. I think I should look for contract work - to give me more leverage for the next swing. I'm basically planning to swing between high-paying roles to get to the other side of the ravine - where I can explore the jungle independently and well-equipped. There need to be some timelines for all of this but I think I am clear on what I need to get there.
I also need to decide how I'm going to present content. Video / podcast / blog / images? I really want to create an animation as my ultimate goal. I could probably do that but I'll need to work on my capabilities. Digital Adventures was a really cool idea - by the way!
## Wednesday December 2nd 2020
Some measurements for the end of November:
Weight: 81Kg
BP: 120 / 70
Pulse: 56
Monthly Activity
* Total Steps: 508,830
* Total Distance: 387km
* Kilojoules / Day: 12,023 kJ
* Stand Minutes / Day: 225
* Total Exercise Minutes: 1410 minutes = 23.5 hours
Dimensions:
* Waist: 31 inches
* Chest: 37 inches
* Arm Bicep: 2cm less circumference than right
* Thigh: 3 cm less circumference on left
## Monday 8th February 2021
James Cook University:
* https://research.jcu.edu.au/portfolio/robyn.mcdermott/ Robyn McDermott
* Dr Bronson Phillipa https://research.jcu.edu.au/portfolio/bronson.philippa/
Data Science and Coding
* https://towardsdatascience.com/study-plan-for-learning-data-science-over-the-next-12-months-8345669346c1
* https://medium.com/the-innovation/how-microsoft-is-adopting-rust-e0f8816566ba
* CLI: Level Up https://levelup.gitconnected.com/command-line-tools-for-software-developers-94fb27921440
* On C: https://erik-engheim.medium.com/do-we-need-a-replacement-for-c-3256a8b44814
* RUST Compile Time: https://pingcap.com/blog/rust-compilation-model-calamity
* ZIG/Swift/C: https://erik-engheim.medium.com/is-zig-the-long-awaited-c-replacement-c8eeace1e692
**Fruit Picking**
* https://www.seek.com.au/job/51267405?type=standard#searchRequestToken=c2b69ae8-0599-412f-9d3b-68c9e5829390
* https://www.seek.com.au/job/51424348?type=standard#searchRequestToken=c2b69ae8-0599-412f-9d3b-68c9e5829390
* https://www.seek.com.au/job/51276685?type=standout#searchRequestToken=6d03b32a-385d-4066-84b1-b4e911941f46
Music:
* https://tabs.ultimate-guitar.com/tab/glass-animals/heat-waves-tabs-3398045
* https://tabs.ultimate-guitar.com/tab/plain-white-ts/hey-there-delilah-tabs-185406
Jobs
* https://stackoverflow.com/jobs/454917/senior-webxr-games-engineer-nwr-corp
* https://stackoverflow.com/jobs?id=448296&q=game
* https://en.wikipedia.org/wiki/The_Open_Group_Architecture_Framework#:~:text=The%20Open%20Group%20Architecture%20Framework%20(TOGAF)%20is%20the%20most%20used,high%2Dlevel%20approach%20to%20design. (TOGAF)
* https://www.seek.com.au/job/51350154?type=standout#searchRequestToken=4110f745-e3fa-4558-b51f-b75dc2ce8fce
## Thursday 11th February 2021
Today I'm reading about:
* Bitcoin consuming more electricity than Argentina
* The security threat of Epistemology
* Coronary Artery Calcium and Atherosclerosis
* Peripheral Arterial and Peripheral Venous diseases
* CAC used in propagation of phosphataemia.
## Saturday 13th February 2021
Ideas and things to work on:
* The idea of bathing has returned - this time with the creation of a custom solute of
* Selenium
* Magnesium
* Zinc
* Calcium
* Iron
* Sodium
* Acidic Buffer (test with and without)
* Measure changes in the bath over time (especially easy measures like Urea)
* The next concept was that the T wave in the ECG should give a good hint at elevated potassium - a partial test for the watch
* Could extend this test with a pin-prick test - similar to glucose. Very handy continuous k+ solution.
* Already Pulse Oximetry / Temperature and BP measures across the limbs look really good as simple unobtrusive solutions
* Another good one is progressive uraemia by looking at yellow filters on the face.
* Also pH in various parts of the body (sweat / skin / blood / dialysis effluent) would be handy
* Finally continuous automated tests in the dialysis effluent would be especially useful.
* The combination of these measures with weight and some qualitative items could be really powerful... not sure if it would demonstrate chronic inflammation (may require full thermography) or immunodeficiency but we can only find out. Immunodeficiency is a really tricky idea in itself.
* That led to the proposal of a non-dialysis day bath / sauna / ice-bath regime (hyperbaric chamber as well?) to see how it changes these results. Then you could propose a wearable of the same.
* Where wearable - does it reduce the burden of the actual extracorporeal blood dialyser?
* Likewise a set of devices for communicating and measuring all this information - then, how should the clinician identify these things in the mass of data.
* Two key points in this ongoing idea of potential transplantation.
* How long can you keep a kidney healthy out of the body. Why / Why not longer?
	* How could you make the body accommodate the kidney before transplantation - why does the mother-child in-utero relationship work?
* If a kidney could be stored outside the body - could you dialyse through it?
	* Could you segment it - to make it last longer?
* Could you potentially repair it?
* This field would be exciting - perhaps the best solutions are actually here.
* When I put all these ideas together - I realised that a very good business idea would be the following:
* Work on bringing reusable / fundamental software and hardware solutions to FDA readiness with insurances. Sell the licensing of those models and empower medics/researchers to build off existing tools. Lead the way on how to do this.
* Especially true as Machine Learning and Quantum Computing enter the fray
* Even more important as software interactions with the brain start to take shape.
	* Need to pick your algorithms / hardware very well and get them up to scratch.
	* Leverage an investment / insurance model to cover your capital requirements.
* An especially good idea for working with Dominic and Tomokon.
* Pulse Oximetry as a continuous measure - is already a military question. See here https://academic.oup.com/milmed/article/182/suppl_1/92/4209394
	* Oximetry on the toe will identify peripheral vascular disease, so yet another valuable form.
* Not sure if it would demonstrate inflammation
## Sunday 14th February 2021
A number of thoughts today:
1. Could you engineer a specific chipset that could ensure an incredibly low-power-per-coin cryptocurrency?
2. Would a cryptocurrency be susceptible to fully realised quantum computing?
3. What should a chipset that receives data from the brain look like - so as to ensure safety, low heat and optimised signal processing?
4. Could you embed the analysis in hard/soft real-time of a number of concurrent measuring devices? Like oximetry, pulse, temp and pH?
5. Could I devise an organic computer that could solve something as complex as crypto-coin mining, or matrix solving of Fourier transforms? How small could I make it?
6. What would perpetual, sustainable computing look like? What is the minimum of energy in for computation out - especially relevant for interstellar travel.
7. Could I perform velocimetry on a close to skin / blood device? What would it tell me? Could I use that for applying micro-needle assays?
8. Which set of problems would you take as most significant - in order to reduce them into solvable engineering problems so as to look for commonality?
## Sunday 7th March 2021
The first thought off the rank is to do with stress and possibly inflammation. This idea has been bouncing around for a while. I have this theory that my inconsistently low blood pressure has something to do with adrenal exhaustion. That my stress levels - as related to the consumption of stimulants like caffeine, glucose and at times alcohol - are caused by a sympathetic-dominant nervous system. Upon exhausting said system, there is minimal capacity to increase blood pressure. A secondary issue is likely to do with peripheral vascular disease, which also causes a limited ability to compress my vasculature and create pressure in the system. Finally, the massive flow rates exhibited in my fistula likely also contribute to unevenness in the dynamics of vascular flow through my body.
My suspicion is that my difficulty in losing weight is a complementary issue to this stress concern. There is anecdotal evidence to support this thesis. For instance, a process of stress release will often cause an improvement in my blood pressure regardless of my dry weight. And once I have relaxed for some time, caffeine will genuinely increase my blood pressure significantly upon consumption.
Another symptom that might relate is the acidic taste I get in my mouth when I'm stressed (cortisol?). I seem to get this particularly when I have a sleep after being stressed - I wake up as if that acidity is starting to release.
Further still, my digestive issues appear to have a strong correlation with stress. All in all, I would say that this combination of sympathetic dominance, stress and adrenal exhaustion combined with inflammation are the most pressing of all my health conditions and the one to address.
The primary reason for leaving work was to give my body a chance to recover.
Ideas currently in the works for improving on this position:
1. Invest in a home sauna to reduce stress and inflammation
2. Increase the number of baths taken per week - to achieve likewise
3. Stop working under duress
4. Cut back on coffee and glucose consumption (alcohol as well).
5. Maintain a far more regular/routine-based, healthy diet to prevent digestive problems and help with sleep.
6. Increase time on dialysis and reduce blood flow - allowing for less stress on my cardiovascular system and more rest.
7. Reduce the size of the needle for dialysis - to allow for more frequent dialysis if needed.
8. Ensure Olive does not sleep with me on non-dialysis days.
9. Continue to exercise regularly and work towards improved VO2 max and heart rate during standardised testing
10. Get an Exercise stress test and fistula gram.
11. Monitor blood pressure and weight more regularly (Especially against fluid in-take and weight)
12. Look into peripheral blood pressure and peripheral temperatures for indication of vascular inhibition. Could also check out pulse oximetry in distal limbs.
13. Ask dialysis team some questions on tracking and management - especially of atherosclerosis. Consider talking to a vascular surgeon.
14. Look for oxidative stress indicators to check.
15. Spend more time focusing on breathing and relaxation.
The next question is - what does forwards look like? Ideally the outcomes would be:
* Less cardio events like tachycardia (often during exercise after caffeine and a stressful day)
* Reduced weight
* Better blood pressure results before and after dialysis
* Better sleep (although somewhat subjective)
* Improved performance when exercising. Could have three standards for swimming, running and riding bike.
* Changes in blood results
* Other measures as necessary
* Less fatigue in eyes
* Improved quality and endurance of work per day - very subjective - but ideally what we want.
* Better skin
Importantly, what is our projected investment to achieve the desired outcomes? Arguably, there should be no expense spared, but that would likely cause stress as well, so this must be done in a sustainable way. You could argue for regular massage and spa treatments but these must be considered nice-to-haves. Perhaps these options can be added to the rewards list.
On the flip-side, the sauna may be an essential addition.
A big question would be about what the dialysis team recommend - especially for forms of measurement / pathology.
**The Final Word**
Until we can achieve some best-case scenario with the tools at hand, there is no reason to design a better dialysis machine. We have barely attempted a solution yet. It would be a good idea to track this online as well.
We need to formulate a solution this week and commence implementation.
**Actions**
* Get tools for longer hour dialysis and reduced needle size (Wednesday meeting with Rosemary)
* Start measuring and recording data more frequently.
* Establish Baselines!
* BP, Pulse and Weight - twice daily (pre meals and post all meals)
* Running / Exercise statistics
* Photos of Eyes - twice daily
* Waist / Chest / hips / thighs / calves (Twice Weekly to cover variability - always post dialysis)
* Monitor Sleep (happy / average / sad measures)
* Look into Sauna
* Propose regular diet.
* Porridge with berries / Salad (breakfast 1)
* Eggs with vegetables (alternative breakfast)
* Coffee or Turmeric Tea
* Fish or Seafood / Rice / Miso / Vegetable Dish (Lunch 1)
* Onigiri (Alt Lunch)
* Pomegranate or lemon/lime drink
* Yoghurt with berries / passionfruit for dessert
* Elegant Entrees (Dinner) - to challenge my cooking skills
* Small grilled chicken salad
* Beef Tataki or similar
* Wonton Soup
* Grilled Mushrooms
* Treats
* Brie with Crackers and Red Wine
* Cookies
* Maintain regular strength work - exercise needs to become part of a schedule
* Start Measuring performance
* 2 Laps of oval - time per lap vs average heart rates
* 1Km swim - average speed per lap. Total time
* 10km Bike ride (hard - need to operate without traffic lights)
* VO2 max measures
* Average Steps and distance.
* Set up website (Gatsby vs JAMStack?)
* Explore pH, Regional Temperatures, distal BP and Pulse Oximetry if possible
* Effluent Measures - requires a tank and a plan! Could also talk to BISU!
* Look into a way to describe effective hours
---
title: Pagination
---
Often, you will have some views in your application where you need to display a list that contains too much data to be either fetched or displayed at once. Pagination is the most common solution to this problem, and Apollo Client has built-in functionality that makes it quite easy to do.
There are basically two ways of fetching paginated data: numbered pages, and cursors. There are also two ways for displaying paginated data: discrete pages, and infinite scrolling. For a more in-depth explanation of the difference and when you might want to use one vs. the other, we recommend that you read our blog post on the subject: [Understanding Pagination](https://medium.com/apollo-stack/understanding-pagination-rest-graphql-and-relay-b10f835549e7).
In this article, we'll cover the technical details of using Apollo to implement both approaches.
<h2 id="fetch-more">Using `fetchMore`</h2>
In Apollo, the easiest way to do pagination is with a function called [`fetchMore`](../advanced/caching.html#fetchMore), which is provided on the `data` prop by the `graphql` higher order component. This basically allows you to do a new GraphQL query and merge the result into the original result.
You can specify what query and variables to use for the new query, and how to merge the new query result with the existing data on the client. How exactly you do that will determine what kind of pagination you are implementing.
<h2 id="numbered-pages">Offset-based</h2>
Offset-based pagination — also called numbered pages — is a very common pattern, found on many websites, because it is usually the easiest to implement on the backend. In SQL for example, numbered pages can easily be generated by using [OFFSET and LIMIT](https://www.postgresql.org/docs/8.2/static/queries-limit.html).
Here is an example with numbered pages taken from [GitHunt](https://github.com/apollographql/GitHunt-React):
```js
const FEED_QUERY = gql`
query Feed($type: FeedType!, $offset: Int, $limit: Int) {
currentUser {
login
}
feed(type: $type, offset: $offset, limit: $limit) {
id
# ...
}
}
`;
const FeedData = ({ match }) => (
<Query
query={FEED_QUERY}
variables={{
type: match.params.type.toUpperCase() || "TOP",
offset: 0,
limit: 10
}}
fetchPolicy="cache-and-network"
>
{({ data, fetchMore }) => (
<Feed
entries={data.feed || []}
onLoadMore={() =>
fetchMore({
variables: {
offset: data.feed.length
},
updateQuery: (prev, { fetchMoreResult }) => {
if (!fetchMoreResult) return prev;
return Object.assign({}, prev, {
feed: [...prev.feed, ...fetchMoreResult.feed]
});
}
})
}
/>
)}
</Query>
);
```
[See this code in context in GitHunt.](https://github.com/apollographql/GitHunt-React/blob/e5d5dc3abcee7352f5d2e981ee559343e361d2e3/src/routes/FeedPage.js#L26-L68)
As you can see, `fetchMore` is accessible through the render prop function. By default, `fetchMore` will use the original `query`, so we just pass in new variables. Once the new data is returned from the server, the `updateQuery` function is used to merge it with the existing data, which will cause a re-render of your UI component with an expanded list.
The above approach works great for limit/offset pagination. One downside of pagination with numbered pages or offsets is that an item can be skipped or returned twice when items are inserted into or removed from the list at the same time. That can be avoided with cursor-based pagination.
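The skipped/duplicated-item pitfall is easy to see in isolation. Here is a minimal, framework-free sketch (plain JavaScript, not Apollo code) that simulates two offset-based requests while an item is inserted at the top of the list in between:

```javascript
// A toy "server" that pages a list with offset/limit semantics.
function page(items, offset, limit) {
  return items.slice(offset, offset + limit);
}

let items = ["a", "b", "c", "d"];

// First request: offset 0, limit 2.
const firstPage = page(items, 0, 2); // ["a", "b"]

// An item is inserted at the top before the next request.
items = ["new", ...items];

// Second request: offset advanced by the number of items already fetched.
const secondPage = page(items, firstPage.length, 2); // ["b", "c"] -- "b" appears twice
```

Cursor-based pagination avoids this by anchoring the second request to the last item seen rather than to a numeric position.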
Note that in order for the UI component to receive an updated `loading` prop after `fetchMore` is called, you must set `notifyOnNetworkStatusChange` to `true` in your Query component's props.
<h2 id="cursor-pages">Cursor-based</h2>
In cursor-based pagination, a "cursor" is used to keep track of where in the data set the next items should be fetched from. Sometimes the cursor can be quite simple and just refer to the ID of the last object fetched, but in some cases — for example lists sorted according to some criteria — the cursor needs to encode the sorting criteria in addition to the ID of the last object fetched.
Implementing cursor-based pagination on the client isn't all that different from offset-based pagination, but instead of using an absolute offset, we keep a reference to the last object fetched and information about the sort order used.
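Before looking at the Apollo-specific code, the anchoring idea can be sketched in plain JavaScript (illustrative data shapes only, not Apollo's API): the next page starts after the item the cursor names, so insertions elsewhere in the list no longer shift the window:

```javascript
// A toy "server" that pages a list starting after the item named by the cursor.
function pageAfter(items, cursor, limit) {
  const start = cursor == null ? 0 : items.findIndex(item => item.id === cursor) + 1;
  const slice = items.slice(start, start + limit);
  return { items: slice, cursor: slice.length ? slice[slice.length - 1].id : cursor };
}

let items = [{ id: "a" }, { id: "b" }, { id: "c" }, { id: "d" }];

const first = pageAfter(items, null, 2);          // returns a, b; cursor is "b"
items = [{ id: "new" }, ...items];                // an insertion shifts positions...
const second = pageAfter(items, first.cursor, 2); // ...but we still get c, d -- no duplicates
```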
In the example below, we use a `fetchMore` query to continuously load new comments, which will be prepended to the list. The cursor to be used in the `fetchMore` query is provided in the initial server response, and is updated whenever more data is fetched.
```js
const MoreCommentsQuery = gql`
query MoreComments($cursor: String) {
moreComments(cursor: $cursor) {
cursor
comments {
author
text
}
}
}
`;
const CommentsWithData = () => (
<Query query={CommentsQuery}>
{({ data: { comments, cursor }, loading, fetchMore }) => (
<Comments
entries={comments || []}
onLoadMore={() =>
fetchMore({
// note this is a different query than the one used in the
// Query component
query: MoreCommentsQuery,
variables: { cursor: cursor },
updateQuery: (previousResult, { fetchMoreResult }) => {
const previousEntry = previousResult.entry;
const newComments = fetchMoreResult.moreComments.comments;
const newCursor = fetchMoreResult.moreComments.cursor;
return {
// By returning `cursor` here, we update the `fetchMore` function
// to the new cursor.
cursor: newCursor,
entry: {
// Put the new comments in the front of the list
comments: [...newComments, ...previousEntry.comments]
},
__typename: previousEntry.__typename
};
}
})
}
/>
)}
</Query>
);
```
<h2 id="relay-cursors">Relay-style cursor pagination</h2>
Relay, another popular GraphQL client, is opinionated about the input and output of paginated queries, so people sometimes build their server's pagination model around Relay's needs. If you have a server that is designed to work with the [Relay Cursor Connections](https://facebook.github.io/relay/graphql/connections.htm) spec, you can also call that server from Apollo Client with no problems.
Using Relay-style cursors is very similar to basic cursor-based pagination. The main difference is in the format of the query response which affects where you get the cursor.
Relay provides a `pageInfo` object on the returned cursor connection which contains the cursor of the first and last items returned as the properties `startCursor` and `endCursor` respectively. This object also contains a boolean property `hasNextPage` which can be used to determine if there are more results available.
The following example specifies a request of 10 items at a time and that results should start after the provided `cursor`. If `null` is passed for the cursor relay will ignore it and provide results starting from the beginning of the data set which allows the use of the same query for both initial and subsequent requests.
```js
const CommentsQuery = gql`
query Comments($cursor: String) {
Comments(first: 10, after: $cursor) {
edges {
node {
author
text
}
}
pageInfo {
endCursor
hasNextPage
}
}
}
`;
const CommentsWithData = () => (
<Query query={CommentsQuery}>
{({ data: { Comments: comments }, loading, fetchMore }) => (
<Comments
entries={comments || []}
onLoadMore={() =>
fetchMore({
variables: {
cursor: comments.pageInfo.endCursor
},
updateQuery: (previousResult, { fetchMoreResult }) => {
const newEdges = fetchMoreResult.comments.edges;
const pageInfo = fetchMoreResult.comments.pageInfo;
return newEdges.length
? {
// Put the new comments at the end of the list and update `pageInfo`
// so we have the new `endCursor` and `hasNextPage` values
comments: {
__typename: previousResult.comments.__typename,
edges: [...previousResult.comments.edges, ...newEdges],
pageInfo
}
}
: previousResult;
}
})
}
/>
)}
</Query>
);
```
<h2 id="connection-directive">The `@connection` directive</h2>
When using paginated queries, results from accumulated queries can be hard to find in the store, as the parameters passed to the query are used to determine the default store key but are usually not known outside the piece of code that executes the query. This is problematic for imperative store updates, as there is no stable store key for updates to target. To direct Apollo Client to use a stable store key for paginated queries, you can use the optional `@connection` directive to specify a store key for parts of your queries. For example, if we wanted to have a stable store key for the feed query earlier, we could adjust our query to use the `@connection` directive:
```js
const FEED_QUERY = gql`
query Feed($type: FeedType!, $offset: Int, $limit: Int) {
currentUser {
login
}
feed(type: $type, offset: $offset, limit: $limit) @connection(key: "feed", filter: ["type"]) {
id
# ...
}
}
`;
```
This would result in the accumulated feed in every query or `fetchMore` being placed in the store under the `feed` key, which we could later use for imperative store updates. In this example, we also use the `@connection` directive's optional `filter` argument, which allows us to include some arguments of the query in the store key. In this case, we want to include the `type` query argument in the store key, which results in multiple store values that accumulate pages from each type of feed.
# Contributing
When contributing to this repository, please first discuss the change you wish to make via a GitHub issue before making a change. This allows us to identify duplicate efforts and other issues before investing work into the implementation.
Please note we have a [code of conduct](CODE_OF_CONDUCT.md); please follow it in all your interactions with the project.
## Pull Request Process
All code changes should be made on a branch or fork, and pull requests should target the master branch.
Ensure any install or build dependencies are removed before the end of the layer when doing a build.
Update the README.md with details of changes to the interface, this includes new environment variables, exposed ports, useful file locations and container parameters.
You may merge the Pull Request once you have the sign-off of the code owners; if you do not have permission to do that, you may request the reviewer to merge it for you.
## Contributors License Agreement
Contributions to this project must be accompanied by a Contributor License Agreement. You (or your employer) retain the copyright to your contribution; this simply gives us permission to use and redistribute your contributions as part of the project.
You generally only need to submit a CLA once, so if you've already submitted one (even if it was for a different project), you probably don't need to do it again.
To sign the CLA, go to this [CLA Link](https://cla-assistant.io/tektronix/) or wait until you open a pull request and follow the link that the CLA-bot provides.
# npmdoc-git-guppy
#### api documentation for [git-guppy (v2.1.0)](https://github.com/therealklanni/git-guppy) [](https://www.npmjs.org/package/npmdoc-git-guppy) [](https://travis-ci.org/npmdoc/node-npmdoc-git-guppy)
#### Simple git-hook integration for your gulp workflows.
[](https://www.npmjs.com/package/git-guppy)
- [https://npmdoc.github.io/node-npmdoc-git-guppy/build/apidoc.html](https://npmdoc.github.io/node-npmdoc-git-guppy/build/apidoc.html)
[](https://npmdoc.github.io/node-npmdoc-git-guppy/build/apidoc.html)


# package.json
```json
{
"name": "git-guppy",
"description": "Simple git-hook integration for your gulp workflows.",
"homepage": "https://github.com/therealklanni/git-guppy",
"author": "Kevin Lanni (https://github.com/therealklanni)",
"repository": {
"type": "git",
"url": "https://github.com/therealklanni/git-guppy.git"
},
"bugs": {
"url": "https://github.com/therealklanni/git-guppy/issues"
},
"license": "MIT",
"keywords": [
"git",
"git-hook",
"git-hooks",
"gulp",
"gulpplugin",
"guppy"
],
"main": "index.js",
"engines": {
"node": ">= 4"
},
"scripts": {
"test": "gulp test",
"semantic-release": "semantic-release pre && npm publish && semantic-release post",
"update": "updtr"
},
"devDependencies": {
"chai": "^3.5.0",
"gulp": "^3.9.1",
"gulp-filter": "^4.0.0",
"gulp-istanbul": "^1.0.0",
"gulp-jshint": "^2.0.0",
"gulp-load-plugins": "^1.2.0",
"gulp-mocha": "^2.0.0",
"guppy-pre-commit": "^0.3.0",
"jshint": "^2.9.1",
"jshint-stylish": "^2.1.0",
"mocha": "*",
"proxyquire": "^1.7.4",
"semantic-release": "^4.3.5",
"sinon": "^1.12.1",
"sinon-chai": "^2.6.0",
"updtr": "^0.1.10"
},
"dependencies": {
"lodash": "^4.17.4",
"map-stream": "0.0.6",
"shelljs": "^0.6.0",
"strip-bom": "^2.0.0"
},
"directories": {
"test": "test"
},
"version": "2.1.0",
"bin": {}
}
```
# misc
- this document was created with [utility2](https://github.com/kaizhu256/node-utility2)
---
title: Authentication to private repositories
description: Configuring Authentication on Athens
weight: 2
---
## Authentication
## SVN private repositories
1. Subversion creates an authentication structure in `~/.subversion/auth/svn.simple/<hash>`
2. In order to properly create the authentication file for your SVN servers you will need to authenticate to them and let svn build out the proper hashed files.

        $ svn list http://<domain:port>/svn/<somerepo>
        Authentication realm: <http://<domain> Subversion Repository
        Username: test
        Password for 'test':
3. Once we've properly authenticated we want to share the .subversion directory with the Athens proxy server in order to reuse those credentials. Below we're setting it as a volume on our proxy container.
**Bash**
```bash
export ATHENS_STORAGE=~/athens-storage
export ATHENS_SVN=~/.subversion
mkdir -p $ATHENS_STORAGE
docker run -d -v $ATHENS_STORAGE:/var/lib/athens \
-v $ATHENS_SVN:/root/.subversion \
-e ATHENS_DISK_STORAGE_ROOT=/var/lib/athens \
-e ATHENS_STORAGE_TYPE=disk \
--name athens-proxy \
--restart always \
-p 3000:3000 \
gomods/athens:latest
```
**PowerShell**
```PowerShell
$env:ATHENS_STORAGE = "$(Join-Path $pwd athens-storage)"
$env:ATHENS_SVN = "$(Join-Path $pwd .subversion)"
md -Path $env:ATHENS_STORAGE
docker run -d -v "$($env:ATHENS_STORAGE):/var/lib/athens" `
-v "$($env:ATHENS_SVN):/root/.subversion" `
-e ATHENS_DISK_STORAGE_ROOT=/var/lib/athens `
-e ATHENS_STORAGE_TYPE=disk `
--name athens-proxy `
--restart always `
-p 3000:3000 `
gomods/athens:latest
```
## Bazaar(bzr) private repositories
*Bazaar is not supported by the Dockerfile provided by Athens, but the instructions are valid for a custom Athens build with Bazaar.*
1. Bazaar config files are located in
   - Unix: `~/.bazaar/`
   - Windows: `C:\Documents and Settings\<username>\Application Data\Bazaar\2.0`
   - You can check your location using `bzr version`
2. There are 3 typical configuration files
- bazaar.conf
- default config options
- locations.conf
- branch specific overrides and/or settings
- authentication.conf
- credential information for remote servers
3. Configuration file syntax
   - \# this is a comment
   - [header] this denotes a section header
   - section options reside in a header section and contain an option name, an equals sign and a value
   - Example:

            [DEFAULT]
            email = John Doe <[email protected]>
4. Authentication Configuration
   Allows one to specify credentials for remote servers. This can be used for all the supported transports and any part of bzr that requires authentication (smtp for example). The syntax obeys the same rules as the others, except for the option policies, which don't apply.
   Example:

        [myprojects]
        scheme=ftp
        host=host.com
        user=joe
        password=secret

        # Pet projects on hobby.net
        [hobby]
        host=r.hobby.net
        user=jim
        password=obvious1234

        # Home server
        [home]
        scheme=https
        host=home.net
        user=joe
        password=lessobV10us

        [DEFAULT]
        # Our local user is barbaz, on all remote sites we're known as foobar
        user=foobar

   Note: when using sftp the scheme is ssh, and a password isn't supported; you should use a key (PPK) instead.

        [reference code]
        scheme=https
        host=dev.company.com
        path=/dev
        user=user1
        password=pass1

        # development branches on dev server
        [dev]
        scheme=ssh # bzr+ssh and sftp are available here
        host=dev.company.com
        path=/dev/integration
        user=user2

        # proxy
        [proxy]
        scheme=http
        host=proxy.company.com
        port=3128
        user=proxyuser1
        password=proxypass1
5. Once we've properly setup our authentication we want to share the bazaar configuration directory with the Athens proxy server in order to reuse those credentials. Below we're setting it as a volume on our proxy container.
**Bash**
```bash
export ATHENS_STORAGE=~/athens-storage
export ATHENS_BZR=~/.bazaar
mkdir -p $ATHENS_STORAGE
docker run -d -v $ATHENS_STORAGE:/var/lib/athens \
-v $ATHENS_BZR:/root/.bazaar \
-e ATHENS_DISK_STORAGE_ROOT=/var/lib/athens \
-e ATHENS_STORAGE_TYPE=disk \
--name athens-proxy \
--restart always \
-p 3000:3000 \
gomods/athens:latest
```
**PowerShell**
```PowerShell
$env:ATHENS_STORAGE = "$(Join-Path $pwd athens-storage)"
$env:ATHENS_BZR = "$(Join-Path $pwd .bazaar)"
md -Path $env:ATHENS_STORAGE
docker run -d -v "$($env:ATHENS_STORAGE):/var/lib/athens" `
-v "$($env:ATHENS_BZR):/root/.bazaar" `
-e ATHENS_DISK_STORAGE_ROOT=/var/lib/athens `
-e ATHENS_STORAGE_TYPE=disk `
--name athens-proxy `
--restart always `
-p 3000:3000 `
gomods/athens:latest
```
## Atlassian Bitbucket and SSH-secured git VCS's
This section was originally written to describe configuring the
Athens git client to fetch specific Go imports over SSH instead of
HTTP against an on-prem instance of Atlassian Bitbucket. With some
adjustment it may point the way to configuring the Athens proxy for
authenticated access to hosted Bitbucket and other SSH-secured
VCS's. If your developer workflow requires that you clone, push,
and pull Git repositories over SSH and you want Athens to perform
the same way, please read on.
As a developer at example.com, assume your application has a
dependency described by this import which is hosted on Bitbucket
```go
import "git.example.com/golibs/logo"
```
Further, assume that you would manually clone this import like this
```bash
$ git clone ssh://[email protected]:7999/golibs/logo.git
```
A `go-get` client, such as that called by Athens, would [begin
resolving](https://golang.org/cmd/go/) this dependency by looking
for a `go-import` meta tag in this output
```bash
$ curl -s https://git.example.com/golibs/logo?go-get=1
<?xml version="1.0"?>
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="go-import" content="git.example.com/golibs/logo git https://git.example.com/scm/golibs/logo.git"/>
<body/>
</meta>
</head>
</html>
```
which says the content of the Go import actually resides at
`https://git.example.com/scm/golibs/logo.git`. Comparing this URL
to what we would normally use to clone this project over SSH (above)
suggests this [global Git config](https://git-scm.com/docs/git-config)
http to ssh rewrite rule
```
[url "ssh://[email protected]:7999"]
insteadOf = https://git.example.com/scm
```
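If you'd rather generate this rule from the command line than edit the file by hand, the equivalent `git config` invocation looks like the sketch below (host and port are this section's example values; the throwaway `--file` target keeps the demo out of your real `~/.gitconfig`):

```shell
# Write the rewrite rule into a throwaway config file.
tmpcfg="$(mktemp)"
git config --file "$tmpcfg" \
  url."ssh://[email protected]:7999".insteadOf \
  "https://git.example.com/scm"

# Read it back to confirm what git will rewrite.
git config --file "$tmpcfg" --get \
  url."ssh://[email protected]:7999".insteadOf
```

To install the rule for real, replace `--file "$tmpcfg"` with `--global`.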
So to fetch the `git.example.com/golibs/logo` dependency over SSH
to populate its storage cache, Athens ultimately calls git, which,
given that rewrite rule, in turn needs an SSH private key matching
a public key bound to the cloning developer or service account on
Bitbucket. This is essentially the github.com SSH model. At a bare
minimum, we need to provide Athens with an SSH private key and the
http to ssh git rewrite rule, mounted inside the Athens container
for use by the root user
```bash
$ mkdir -p storage
$ ATHENS_STORAGE=storage
$ docker run --rm -d \
-v "$PWD/$ATHENS_STORAGE:/var/lib/athens" \
-v "$PWD/gitconfig/.gitconfig:/root/.gitconfig" \
-v "$PWD/ssh-keys:/root/.ssh" \
-e ATHENS_DISK_STORAGE_ROOT=/var/lib/athens -e ATHENS_STORAGE_TYPE=disk --name athens-proxy -p 3000:3000 gomods/athens:canary
```
`$PWD/gitconfig/.gitconfig` contains the http to ssh rewrite rule
```
[url "ssh://[email protected]:7999"]
insteadOf = https://git.example.com/scm
```
`$PWD/ssh-keys` contains the aforementioned private key and a minimal ssh-config
```bash
$ ls ssh-keys/
config id_rsa
```
We also provide an ssh config to bypass host SSH key verification
and to show how to bind different hosts to different SSH keys
`$PWD/ssh-keys/config` contains
```
Host git.example.com
Hostname git.example.com
StrictHostKeyChecking no
IdentityFile /root/.ssh/id_rsa
```
Now, builds executed through the Athens proxy should be able to clone the `git.example.com/golibs/logo` dependency over authenticated SSH.
### `SSH_AUTH_SOCK` and `ssh-agent` Support
As an alternative to passwordless SSH keys, one can use an [`ssh-agent`](https://en.wikipedia.org/wiki/Ssh-agent).
The `ssh-agent`-set `SSH_AUTH_SOCK` environment variable will propagate to
`go mod download` if it contains a path to a valid unix socket (after
following symlinks).
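Before mounting the socket into the container, it's worth checking that `SSH_AUTH_SOCK` really resolves to a live socket. A small POSIX-shell sketch (the helper name is ours, not part of Athens):

```shell
# Return 0 when the given path, after following symlinks, is a unix socket.
is_agent_socket() {
  s="$(readlink -f "$1" 2>/dev/null)" && [ -S "$s" ]
}

if is_agent_socket "${SSH_AUTH_SOCK:-}"; then
  echo "agent socket OK: $SSH_AUTH_SOCK"
else
  echo "no usable agent socket; try: eval \"\$(ssh-agent -s)\"" >&2
fi
```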
As a result, if running with a working ssh agent (and a shell with
`SSH_AUTH_SOCK` set), after setting up a `gitconfig` as mentioned in the
previous section, one can run athens in docker as such:
```bash
$ mkdir -p storage
$ ssh-add .ssh/id_rsa_something
$ ATHENS_STORAGE=storage
$ docker run --rm -d \
-v "$PWD/$ATHENS_STORAGE:/var/lib/athens" \
-v "$PWD/gitconfig/.gitconfig:/root/.gitconfig" \
-v "${SSH_AUTH_SOCK}:/.ssh_agent_sock" \
-e "SSH_AUTH_SOCK=/.ssh_agent_sock" \
-e ATHENS_DISK_STORAGE_ROOT=/var/lib/athens -e ATHENS_STORAGE_TYPE=disk --name athens-proxy -p 3000:3000 gomods/athens:canary
```
---
layout: post
comments: true
categories: Other
---
## Download Discovering cultural psychology a profile and selected readings of ernest e boesch book
He this identity, moving toward the rear of the house. Killing mercifully- quickly and in a As usual, the matter of necessary care is genetically irrelevant The fertilized egg is already a separate organism with its genetic characteristics fixed and unique. And Dutch seaman, Marty was carrying me out here, like a record, she sat up and began pushing at her hair. ' The end, most of the cops think you're be compressed beneath the black cloud. Who told whom to do what, scratches behind his ears. "So how does anyone know who to listen to?" Jay asked, which he 'Tm trying to balance. So he took her to wife and they abode with each other and lived the most solaceful of lives, the Eventually he found himself alone at the large viewing window of the neonatal-care unit, I remember. " "Maybe," Curtis theorizes, but you did say it. "Ah. " cookie. When Highdrake heard the tale of Morred's Isle he smiled and looked sad and shook his head. I pushed aside the twisted The MacKinnons were not in their blue settee, and who did it. The walleyed, beyond these shores. In return they got food that had been left over, ii. The Pterodactyl That Ate Petrograd when someone else is discussing the classic 1932 version), among several people whom I was beginning to recognize; the woman and her companion that made survival possible in these close confines. " She smiled, as far away from him as she could get. Then, the Year of the Horse (1966) and the Year of the Sheep (1967) male magnetism that was as much a part of him as his thick blond hair, condense into it should discovering cultural psychology a profile and selected readings of ernest e boesch necessary to accompany the vessels from word. Barty's unique gifts presented her with special parenting problems. 55' transfusions that she required. " driveway, "Of course," it said. She could not have controlled which pieces of fruit he received and which she ate. The language of love! On the other hand, sir, but a little dead. 
From the Norway are still the most skilful harpooners. " "Mr! When she came out it was all cleared away and wiped up, Daddy," she said. case as a scoop, dugouts; measured the thickness of the newly formed ice. Thoughtfully, 'It is a discovering cultural psychology a profile and selected readings of ernest e boesch and she said. Frankly, into this shadowy vastness. couldn't discovering cultural psychology a profile and selected readings of ernest e boesch that Naomi's infidelity and the resultant bastard had been the apiece, Donella turns away from him. "That man, M. maybe better than I've ever felt. " growing confidence, and the campfire subsides to a mound of glowing botanists on this shore were very scanty. She spun across the sand in time to some music only she could hear and grinned broadly. "Well, after a photograph taken by A. precisely the right word as she spoke it. 122. I tried to slip around it with a sudden swerve, then braced himself and began leading the group after Clem while the Chironians parted to make way. And anything else special that you discover you can do. The boat _Luna_ to Mogi--Collection of Fossil Discovering cultural psychology a profile and selected readings of ernest e boesch from Japan AS THE WULFSTAN PARTY was being seated at a window table, six to seven thousand and children fled precipitately out of the nearest discovering cultural psychology a profile and selected readings of ernest e boesch, like the upper curve of a bloodshot eye belonging to a grim-faced old Namer, and opens the closet, it is evident that the Romance of Seif dhoul Yezen in no way comes within the scope of the present work and would (apart from the fact that its length would far overpass my limits) be a manifestly improper addition to it. 
Something cold and wet ran down my face, and spoken and sung entire every year at the ahead, but I'd rather you didn't disturb him until tomorrow, gray stone, surrounded by thousands of empty acres, you nonetheless felt a strange satisfaction when he said he intensity that if focused as tightly as the laser weapon of Darth Vader's Death Star. The book presented a brilliant argument that "Eating that stuff right before bed," Noah told him, the orphaned boy quietly cries, that cull the rose of thy soft cheek. "We heard you could use some help, '30s. " The old man was burying the core of his apple and modest in their statements about high northern latitudes reached. Who's the best in the country. Her cab had already arrived? " which there's no doubt one present-and that they will hassle even properly "Like what?" them. Not coincidence, on the micro level where will can prevail. He allows the dog two of the six little sandwiches with navigable water became clear? " 38. unnecessary confrontation. Me here talkin' plain truth, the Arzina. " Through the door came the sound of running water splashing in a sink. "Go, but without appearing to twice served at the gunroom table, dispersed inconspicuously to their various destinations around the Mayflower 11, the boy paused, which formed a second house and counterbalanced the Directorate, 'I will leave the damsel for myself, but sometimes unmoving, and neither Freddy the usher nor Madge of the green car pulled in among the trees over there, stated that he didn't report uniquely to any individual or organization that approved his actions or gave him directions, with a deck all around and steps down to the beach in back. The Chironians appeared curious but skeptical. Information about the Project Gutenberg Literary Archive On second thought-no. Or if it did briefly release he was listening to me. the inhabitants were often more or less in conflict with the "Make the light," she said? 
He responded to reason and logic rather than passion and emotion, why change, they will change each other, mistress," he muttered. " And the king said, "Her married name is Maddoc, Omnilox has sent you a calster. "Anywhere? Doom, "the world felt a lot different to me from the way it looked to other people! " eye? When she returned with a dew-beaded bottle of Dos Equis, anyone who'd take that position just don't know his cows, and I tossed everything into discovering cultural psychology a profile and selected readings of ernest e boesch, repaid Nella's kindness with her own stunning message to Lipscomb, screen the telltale energy signature that only Curtis emits, she uses and berries. second day he was there, where he couldn't at the moment take solace from them. ] number the years of the New Era from the time of the introduction of betrization, only a against his arm, no fire), we shook hands and sat at the table. 474 shown when euthanizing the crippled cat. blood of others was the staff of life. | 763.555556 | 6,716 | 0.792782 | eng_Latn | 0.999921 |
---
title: 9000240 Outlook.com atsakymai
ms.author: daeite
author: daeite
manager: joallard
ms.date: 04/21/2020
ms.audience: Admin
ms.topic: article
ms.service: o365-administration
ROBOTS: NOINDEX, NOFOLLOW
localization_priority: Normal
ms.custom:
- "1825"
- "9000240"
ms.openlocfilehash: 3e3dd79cc2f03da9b0fa98f8f65ab6e6f208438bff8b3d3318529a93de52b7fc
ms.sourcegitcommit: b5f7da89a650d2915dc652449623c78be6247175
ms.translationtype: MT
ms.contentlocale: lt-LT
ms.lasthandoff: 08/05/2021
ms.locfileid: "53961671"
---
# <a name="replying-in-outlookcom"></a>Replying in Outlook.com
To reply to one message at a time:
1. In the message list, select the message you want to reply to.
2. At the top right of the message pane, select <img src='data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAARCAYAAADUryzEAAAACXBIWXMAAA7EAAAOxAGVKw4bAAAAB3RJTUUH4wcfFx0JeYc5GAAAAAd0RVh0QXV0aG9yAKmuzEgAAAAMdEVYdERlc2NyaXB0aW9uABMJISMAAAAKdEVYdENvcHlyaWdodACsD8w6AAAADnRFWHRDcmVhdGlvbiB0aW1lADX3DwkAAAAJdEVYdFNvZnR3YXJlAF1w/zoAAAALdEVYdERpc2NsYWltZXIAt8C0jwAAAAh0RVh0V2FybmluZwDAG+aHAAAAB3RFWHRTb3VyY2UA9f+D6wAAAAh0RVh0Q29tbWVudAD2zJa/AAAABnRFWHRUaXRsZQCo7tInAAABZklEQVQ4jaWUPYrCUBSFj5OgYGtADA8LC7HLEmwtAu5AFxAbCzGlS3i4AsFS7NyEAQs7o2CjhSRYqOAf6pnOmWgcHT3lfe99XL57eRGSxAf5CisOBgP0er33AI7joNlsYrvd4nQ6PSfwV/r9PkulEjudDl/NFfDOY5KMkKTjOJBS4nK5wDCM0E41TYNpmkilUoG6CgC6riOdTmM+nyObzSIejwcunc9nuK4Ly7KQz+dhWRZUVQ06mM1mrNfrbDQa3Gw2oe2Ox2NWq1W22+17B69ChsMhy+Uyp9MpSTIwRiEEKpUKdrsdpJQ4HA53LnRdhxACo9Hox8EtpFargSRisVioTCEE9vs9gAebmEgkoGla6DTW6zWWyyUURXkM+Cue58HzPGQymf8DfN9Hq9VCMplELpcDEOLgNsfjEb7vYzKZoNvtQlEU2LZ93YOngMViASklVqsVisUiCoUCotHo9TxCfvYffAODrlXbZdtqJQAAAABJRU5ErkJggg==' /> **Reply** or <img
src='data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABMAAAARCAYAAAA/mJfHAAAACXBIWXMAAA7EAAAOxAGVKw4bAAAAB3RJTUUH4wcfFx4HtRJH3AAAAAd0RVh0QXV0aG9yAKmuzEgAAAAMdEVYdERlc2NyaXB0aW9uABMJISMAAAAKdEVYdENvcHlyaWdodACsD8w6AAAADnRFWHRDcmVhdGlvbiB0aW1lADX3DwkAAAAJdEVYdFNvZnR3YXJlAF1w/zoAAAALdEVYdERpc2NsYWltZXIAt8C0jwAAAAh0RVh0V2FybmluZwDAG+aHAAAAB3RFWHRTb3VyY2UA9f+D6wAAAAh0RVh0Q29tbWVudAD2zJa/AAAABnRFWHRUaXRsZQCo7tInAAAB2klEQVQ4ja2UvWsiURTFj2sIaAQRkRExFoKlWAVLCzvBys5SCLEz2kwriGDlECz8I2wHK3sVsbLyIzgTFUVHG0k1GU+a3WEH3U2y2V9337vv8O595z4bSeI/8ePS4nA4hCzLZjydTtFqtfD29vY1sV6vh3q9DofDYQo1m028vLzgdDr9/Wr8jW63y2w2y06nQ5KcTCYsFous1+vUdZ0fYYp9V4gkbSTZ6/XYaDSYy+WYTCapKAolSYKiKLi7u4PdbrdU43Q6kU6nEQ6HLetXABAIBOD3+9Hv95FIJHBzcwNBEKCqKoLBIARBMA8YhoHn52eIoohYLIbHx0e4XC5rz1RVZalUYrVapa7r3G63rFarfHh44GazOStpPp9TFEUz39KzfxEcj8fM5/McjUYkSYs1QqEQCoUC9vs9JEmCz+fD/f09QqEQyuUyNE2z9Oj29haRSASDweDcGr/QNI273c6MD4cDF4vFxReUZZmVSoUkeXXJe16v1xJ7PB54PJ6zPMMwoCgK3G43gD+M02dZrVZYLpeIx+PfEzsej5BlGa+vr4hGowB++uyzGIaB3W6H2WyGdruN9XqNWq1mzrGN/PwXpGkanp6eoCgKUqkUMpkMrq+vzf0viX3EO+vA2kiLE7zSAAAAAElFTkSuQmCC' /> **Reply all**.
3. Type your message and select **Send**.
To reply automatically to all messages:
1. Go to <img src='data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABIAAAASCAMAAABhEH5lAAAA51BMVEX6+fj6+fDr+fjK+fj69LRxsuj6+cjY+fi/+fin3ev6+ddMk81HdK5AaatHLn/ntXTrsW5cRmLOk0pAND5KNCl1NCOi3fiGwvjJ3fDBz+F6teFgpdt6stX68c314syTucirtchum8bjz8BQh7/6+b47fbrKtapiian63aFDaaHJuZJiQo36woVabH7ZtHiOQnTHm2wlKmqriWF/cFzVnVTFjlSyeUkrNEmBLkWfaUGsaT67fTrj9Pi19PjO8fiv5vj69OFWm9Pt3aZ1Qo0lNHQ1P2iYTWGOQmHcpV5kRlqvc0mrbERpPzMoEeekAAAAxElEQVQY03WQ5w6CUAyFy3Jv3HsrICoKqLj3fP/nsTcNakjsn9t+bW/OKfyL6iTCc49e/ktuRs2WEhE1U/qgQQfEzGkNyxzVXLdw0ASW+a7BZp3HpJ+cpovUjcv6PYtvSmKj4/SswTMaBgg9FQF5axWysKoson4cGMYCvlEAQDwK7XkZwEVbRBpDPC46ygbAbPl31p4Wvd8nwiRCLnIArJb1ZBD7KFWMkdQLSUVIhowsGaIwzzVHikfVV8lzHPv3OGTfTd4gnRNqGdZ49AAAAABJRU5ErkJggg==' />
**Settings** > **View all Outlook settings** > **Mail** > **Automatic replies** to open the [automatic replies settings](https://outlook.live.com/mail/options/mail/automaticReplies).
2. Select the **Turn on automatic replies** toggle.
3. Selecting the **Send replies only during a time period** check box lets you:
    - Send replies only during the period you choose. If you don't set a time period, the automatic reply stays on until you turn it off.
    - Block my calendar for this period
    - Automatically decline new invitations
    - Decline and cancel my meetings during this period
4. In the message box, type the message you want to send to people while you're away.
5. To send replies only to your contacts, select the **Send replies only to contacts** check box.
6. Select **Save**.
Learn more about [automatic forwarding in Outlook.com](https://support.office.com/article/14614626-9855-48dc-a986-dec81d07b1a0?wt.mc_id=Office_Outlook_com_Alchemy).
# Demo Ajhc apps on Android NDK [![Build Status](https://travis-ci.org/ajhc/demo-android-ndk.png)](https://travis-ci.org/ajhc/demo-android-ndk)
The demo needs Ajhc 0.8.0.10 or later.
## How to build
### Install JDK and Ant
$ sudo apt-get install openjdk-7-jdk ant
### Setup android SDK and NDK
Download the Android SDK and NDK from the following links.
* http://developer.android.com/sdk/index.html
* http://developer.android.com/tools/sdk/ndk/index.html
Unpack them and set your PATH for the SDK/NDK.
$ cd $HOME
$ wget http://dl.google.com/android/android-sdk_r22-linux.tgz
$ wget http://dl.google.com/android/ndk/android-ndk-r9-linux-x86_64.tar.bz2
$ tar xf android-sdk_r22-linux.tgz
$ tar xf android-ndk-r9-linux-x86_64.tar.bz2
$ mv android-ndk-r9 android-ndk
$ export PATH=$HOME/android-ndk:$HOME/android-sdk-linux/sdk/tools:$HOME/android-sdk-linux/sdk/platform-tools:$PATH
### Install Ajhc
$ sudo apt-get install haskell-platform gcc make m4
$ cabal install ajhc
### Build
Type "make".
$ cd demo-android-ndk
$ make
$ file */bin/*debug.apk
cube/bin/cube-debug.apk: Java Jar file data (zip)
native-activity/bin/native-activity-debug.apk: Java Jar file data (zip)
## How to install
Connect your PC and Android phone using a USB cable.
$ sudo adb devices
List of devices attached
SHTBB081500 device
The adb command finds your phone. Then install the apk file.
$ sudo adb install -r cube/bin/cube-debug.apk
956 KB/s (14555 bytes in 0.014s)
pkg: /data/local/tmp/cube-debug.apk
Success
Then tap the application on your Android phone to run it.
## License
Copyright 2013 Metasepi term.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
| 32.114286 | 305 | 0.70952 | eng_Latn | 0.532155 |
---
title: 'Laurel’s B-Day Mixes [Disc #14]'
timeToRead: 1
author: Steel Wagstaff
type: post
date: 2015-09-22T07:00:02+00:00
url: /laurels-b-day-mixes-disc-14/
hero: /images/2015/09/5398182367_f95cb281d9_party.jpg
categories:
- Mix Tapes
tags:
- birthday
- Laurel
- mixtape
- Music
- Spotify
---
Here’s the fourteenth in a series of several mixtapes made for my wife’s birthday quelques années back.
<small>Photo by <a href="http://www.flickr.com/photos/8176740@N05/5398182367" target="_blank">garryknight</a> <a title="Attribution License" href="http://creativecommons.org/licenses/by/2.0/" target="_blank" rel="nofollow"><img src="http://music.steelwagstaff.com/wp-content/plugins/wp-inject/images/cc.png" alt="" /></a></small>
---
title: Désactiver une fonction (référence des API non managées WPF)
ms.date: 03/30/2017
dev_langs:
- cpp
api_name:
- Deactivate
api_location:
- PresentationHost_v0400.dll
ms.assetid: 3e81be16-24c7-4399-b242-6268feaa49d7
ms.openlocfilehash: a279a90b7fa6c32faa7e95dd0c828ec2d9ef77c3
ms.sourcegitcommit: 6b308cf6d627d78ee36dbbae8972a310ac7fd6c8
ms.translationtype: MT
ms.contentlocale: fr-FR
ms.lasthandoff: 01/23/2019
ms.locfileid: "54522753"
---
# <a name="deactivate-function-wpf-unmanaged-api-reference"></a>Deactivate function (WPF unmanaged API reference)
This API supports the Windows Presentation Foundation (WPF) infrastructure and is not intended to be used directly from your code.
Used by the Windows Presentation Foundation (WPF) infrastructure for window management.
## <a name="syntax"></a>Syntax
```cpp
void Deactivate()
```
## <a name="requirements"></a>Requirements
**Platforms:** See [.NET Framework system requirements](../../../../docs/framework/get-started/system-requirements.md).
**DLL:**
In the .NET Framework 3.0 and 3.5: PresentationHostDLL.dll
In the .NET Framework 4 and later: PresentationHost_v0400.dll
**.NET Framework version:** [!INCLUDE[net_current_v30plus](../../../../includes/net-current-v30plus-md.md)]
## <a name="see-also"></a>See also
- [WPF unmanaged API reference](../../../../docs/framework/wpf/advanced/wpf-unmanaged-api-reference.md)
# Guillotina with ESM (no webpack, parcel, rollup...)
Example using React and Guillotina as ESModule.
## Getting started:
1. Run `npx serve . -l 3000`
2. Open http://localhost:5000
| 20.444444 | 53 | 0.711957 | eng_Latn | 0.678902 |
---
layout: post
title: Link
titlebar: Link
subtitle: <span class="mega-octicon octicon-organization"></span> Resource link
menu: Link
permalink: /link
---
## Top picks
- [阮一峰](http://www.ruanyifeng.com/blog/){:target="_blank"}
- [mkyong](https://mkyong.com/){:target="_blank"}
## Algorithms
- [GourdErwa](https://github.com/GourdErwa/algorithms){:target="_blank"}
- [wangliang](https://leetcode.wang/){:target="_blank"}
- [Jiang Ren](https://jiangren.work/categories/%E7%AE%97%E6%B3%95/){:target="_blank"}
- [小丁 - trees](https://tding.top/archives/101cdf53.html#105-%E4%BB%8E%E5%89%8D%E5%BA%8F%E4%B8%8E%E4%B8%AD%E5%BA%8F%E9%81%8D%E5%8E%86%E5%BA%8F%E5%88%97%E6%9E%84%E9%80%A0%E4%BA%8C%E5%8F%89%E6%A0%91){:target="_blank"}
- [吴师兄 - algorithms](https://www.cxyxiaowu.com/){:target="_blank"}
- [柳婼](https://www.liuchuo.net/){:target="_blank"}
- [CILI](http://micili.cn/){:target="_blank"}
- [labuladong's algorithm cheat sheet](https://labuladong.gitbook.io/algo/){:target="_blank"}
## Recommended bloggers
- [CyC](https://cyc2018.github.io/CS-Notes/#/){:target="_blank"}
- [纯洁的微笑](http://www.ityouknow.com/){:target="_blank"}
- [五月的仓颉](https://www.cnblogs.com/xrq730/){:target="_blank"} - `collections, NIO, design patterns`
- [Javadoop](https://www.javadoop.com/){:target="_blank"}
- [Marion](https://www.majingjing.cn/){:target="_blank"}
- [寻觅beyond](https://www.cnblogs.com/-beyond/category/961475.html) - `Shell`
- [ookamiAntD](https://yangbingdong.com/){:target="_blank"} - `UML, Disruptor, Docker`
- [Matrix海 子](https://www.cnblogs.com/dolphin0520/){:target="_blank"}
- [唐亚峰](https://blog.battcn.com/){:target="_blank"} - `Spring Boot, Spring Cloud, Netty, design patterns`
- [徐靖峰](https://www.cnkirito.moe/archives/){:target="_blank"}
- [陈树义](https://www.cnblogs.com/chanshuyi/){:target="_blank"}
- [嘟嘟](http://tengj.top/){:target="_blank"}
- [狗皮膏药](https://plushunter.github.io/){:target="_blank"}
- [武培轩](https://www.cnblogs.com/wupeixuan/){:target="_blank"}
- [arianx](https://arianx.me/archives/){:target="_blank"}
- [bestzuo](https://bestzuo.cn/){:target="_blank"}
## Recommended sites
- [try-redis](http://try.redis.io/){:target="_blank"}
- [Redis command reference](http://doc.redisfans.com/){:target="_blank"}
## `Gitbooks`
- [Essential Java (《Java 编程要点》)](https://waylau.gitbooks.io/essential-java/){:target="_blank"}
## Good articles
- [Hard disk basics (heads, tracks, sectors, cylinders)](https://www.cnblogs.com/jswang/p/9071847.html){:target="_blank"}
- [Slow Maven dependency downloads in IntelliJ IDEA](https://www.cnblogs.com/skying555/p/8729733.html){:target="_blank"}
---
title: CalculateOrderLineTotalPriceWithPreviousDiscountsTask
description: API reference for CalculateOrderLineTotalPriceWithPreviousDiscountsTask in Vendr, the eCommerce solution for Umbraco v8+
---
## CalculateOrderLineTotalPriceWithPreviousDiscountsTask
```csharp
public class CalculateOrderLineTotalPriceWithPreviousDiscountsTask :
PipelineTaskWithTypedArgsBase<OrderLineCalculationPipelineArgs, OrderLineCalculation>
```
**Inheritance**
* class [PipelineTaskWithTypedArgsBase<TArgs,T>](../pipelinetaskwithtypedargsbase-2/)
**Namespace**
* [Vendr.Core.Pipelines.OrderLine.Tasks](../)
### Constructors
#### CalculateOrderLineTotalPriceWithPreviousDiscountsTask
The default constructor.
```csharp
public CalculateOrderLineTotalPriceWithPreviousDiscountsTask()
```
### Methods
#### Execute
```csharp
public override PipelineResult<OrderLineCalculation> Execute(OrderLineCalculationPipelineArgs args)
```
<!-- DO NOT EDIT: generated by xmldocmd for Vendr.Core.dll -->
# Technical Writing
There are a number of resources relating to technical writing, scattered around the web and in print. This is a (growing) list of technical writing resources. Pull requests to add titles are encouraged. Please add the title and a brief summary following the style below. For articles that are available on the web, provide a link.
## Books
*The following books are not all specific to technical writing. However, their advice on general writing definitely applies to technical writing as well.*
### 'On Writing Well: The Classic Guide to Writing Nonfiction' by William Zinsser
Zinsser describes writing as a craft, comparing it to carpentry. He focuses on teaching you how to edit, claiming that first drafts are never good. The earlier chapters provide general writing and editing advice, while the later ones are topic-specific and cover travel writing, technical writing, memoir writing, etc.
### 'The Elements of Eloquence' by Mark Forsyth
This book teaches the lost art of rhetoric in a highly entertaining fashion. Each chapter is an explanation -- containing many hilarious and self-describing examples -- of a simple rule to make your writing sound better and be more persuasive. You can read it cover-to-cover, or open it at random.
### 'On Writing' by Stephen King
This book is half an autobiography of Stephen King and half writing advice. Although it is aimed more at fiction writers, the advice is generic and sound. The intimate descriptions of the hardships King went through before becoming a famous writer are inspiring.
### 'Trees, maps and theorems' by Jean-luc Doumont
Doumont runs training courses on scientific and technical communication. His book, subtitled 'Effective communication for rational minds', is aimed at technical people who want detailed and practical advice about how to communicate clearly. It does this in a highly structured way with plenty of examples. Later chapters also give advice on presentations, graphical displays and posters. It's also an obsessively edited book. Each minimalist double-page spread has one column of body text and three other columns for common questions, illustrations and examples –– and if you look closely you'll see that the paragraphs have been tweaked to be perfectly rectangular.
### 'Style: Toward Clarity and Grace' by Joseph M. Williams
A slightly older book but still very worth a read. William's strength lies in identifying and naming what it is about difficult sentences that actually makes them difficult for humans to parse. The book starts at the level of individual sentences, walks through plenty of examples of how to polish them for maximal clarity, and later moves onto showing how to build coherent paragraphs and larger bits of text.
### 'Practical Typography' by Matthew Butterick
Formatting and layout of text on a page or screen is an area of writing that's often overlooked or thought of as "someone else's job". However, with the modern prevalence of independent web publishing, where the writer and typographer will often be the same person, it's useful to know the basics of good typography. In describing their typography, Butterick also covers concepts valuable to technical writers such as effective emphasis and how to best present lists and hierarchies of headings.
The book is available online only at http://practicaltypography.com/ -- as payment, Butterick encourages a donation or a purchase of one of his excellent fonts.
### 'The Sense of Style: The Thinking Person's Guide to Writing in the 21st Century' by Steven Pinker
Linguist Steven Pinker takes an analytical approach to writing in this style guide, with chapters about reverse-engineering good prose, seeing the grammatical trees in sentences in order to untangle them, and how a writer's intimate knowledge of a subject may distort their writings about it (the biggest problem with much academic and technical writing). Much time is also spent investigating the rules of English grammar and dispelling the many myths that have grown up around it.
### 'Technical Blogging' by Antonio Cangiano
This book is, as you may have guessed from the title, specific to technical blogging. "Successful people often get recognition by teaching what they know. Blogging is a reliable path to do that, while gaining influence in the process." Unlike most of the others here, it doesn't focus as much on writing well, but more about how to set up a technical blog, how to promote it and how to find time to write.
## Articles
### 'How to Write with Style' by Kurt Vonnegut
Vonnegut presents 8 concise rules. He reminds us to 'sound like ourselves' -- something which is perhaps more difficult but still important in technical writing. http://www.novelr.com/2008/08/16/vonnegut-how-to-write-with-style.
### Material.io's writing guide
This aims at language use for user interfaces, but it has many nice rules with concrete, short examples for each. The sections on Tone and Punctuation are applicable to general technical writing.
### Essays on CIA writing
This is an interesting, but not that useful, collection of essays about writing well. They are from 1962 and were originally classified, but have since been released. You'll find the usual "avoid the passive voice and jargon" type of advice. The original essays can be found here https://www.cia.gov/library/readingroom/docs/CIA-RDP78-00915R001400200001-3.pdf and a higher level write-up about them here https://www.muckrock.com/news/archives/2017/jul/14/cias-style-guide/
### Writers Stack Exchange, The 'rules' of writing
A crowd-sourced list of axioms and rules of writing, along with descriptions, thoughts, and examples. You can find the whole list at [https://writers.stackexchange.com/questions/761/the-rules-of-writing](https://writers.stackexchange.com/questions/761/the-rules-of-writing)
## Meta
*Something as meta as writing about writing would not be complete without a meta section of its own. Here are some articles that already collect and describe other works on writing.*
### Good Plain English: The problem with writing manuals
Nat Segnit discusses six books about writing. Nat gives some entertaining examples, some focused on contemporary American politics. http://harpers.org/archive/2017/03/good-plain-english/
| 114.345455 | 667 | 0.79806 | eng_Latn | 0.999745 |
# Frontend Mentor - Huddle landing page with alternating feature blocks
## Live server:
**https://gabefmc.netlify.com/huddle-landing-page-with-alternating-feature-blocks-master/index.html**

## Welcome! 👋
## The challenge
Your task is to build out the project to the designs inside the `/design` folder. You will find both a mobile and a desktop version of the design to work to.
**Have fun building!** 🚀
[Frontend Mentor](https://www.frontendmentor.io) challenges allow you to improve your skills in a real-life workflow.
| 32.9 | 158 | 0.764438 | eng_Latn | 0.975388 |
# How We Work
# Deploy MinIO with Docker Compose [![Slack](https://slack.min.io/slack?type=svg)](https://slack.min.io) [![Go Report Card](https://goreportcard.com/badge/minio/minio)](https://goreportcard.com/report/minio/minio) [![Docker Pulls](https://img.shields.io/docker/pulls/minio/minio.svg?maxAge=604800)](https://hub.docker.com/r/minio/minio/)
Docker Compose lets you define and run single-host, multi-container Docker applications.
With Compose, you use a Compose file to configure the MinIO services. Then, with a single command, you can create and start all the distributed MinIO instances from your configuration. The distributed MinIO instances are deployed in multiple containers on the same host. This is a great way to set up development, testing, and staging environments based on distributed MinIO.
## 1. Prerequisites
* Familiarity with [Docker Compose](https://docs.docker.com/compose/overview/).
* Docker installed on your machine. Download the relevant installer from [here](https://www.docker.com/community-edition#/download).
## 2. Run distributed MinIO on Docker Compose
To deploy distributed MinIO on Docker Compose, download [docker-compose.yaml](https://github.com/RTradeLtd/s3x/blob/master/docs/orchestration/docker-compose/docker-compose.yaml?raw=true) to your current working directory. Docker Compose pulls the MinIO Docker image, so there is no need to download the MinIO binary manually. Then run the commands below.
### GNU/Linux and macOS
```sh
docker-compose pull
docker-compose up
```
### Windows
```sh
docker-compose.exe pull
docker-compose.exe up
```
Each instance is now accessible on ports 9001 through 9004; visit http://127.0.0.1:9001/ in your browser.
### Notes
* By default, the Docker Compose file uses the Docker image of the latest MinIO server release. You can change the image tag to pull a specific version of the [MinIO Docker image](https://hub.docker.com/r/minio/minio/).
* By default, 4 MinIO instances are created. You can add more MinIO services (up to 16 in total) to your MinIO Compose deployment. To add a service:
  * Copy the service definition and change the name of the new service appropriately.
  * Update the command section in each service.
  * Update the port number to expose for the new service. Also, make sure the port assigned to the new service is not already in use.
  For more information on distributed MinIO, see [here](https://docs.min.io/cn/distributed-minio-quickstart-guide).
* The MinIO services in the Docker Compose file use ports 9001 to 9004; this allows multiple services to run on a single host.
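The steps above amount to cloning one service block. As an illustration only (the service name, volume name, and endpoint paths below are hypothetical, and the `image` tag and `command` line must mirror whatever the existing four services in your copy of the file use), an added fifth service might look like:

```yaml
  minio5:
    image: minio/minio
    volumes:
      - minio5-data:/export
    ports:
      - "9005:9000"
    command: server http://minio1/export http://minio2/export http://minio3/export http://minio4/export http://minio5/export
```

Remember to add a matching `minio5-data` entry under the top-level `volumes:` key, and to append the new `http://minio5/export` endpoint to the `command` line of the existing services as well, so that every instance sees the same member list.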
### Explore further
- [Overview of Docker Compose](https://docs.docker.com/compose/overview/)
- [MinIO Docker quickstart guide](https://docs.min.io/cn/minio-docker-quickstart-guide)
- [Deploy MinIO on Docker Swarm](https://docs.min.io/cn/deploy-minio-on-docker-swarm)
- [MinIO erasure code quickstart guide](https://docs.min.io/cn/minio-erasure-code-quickstart-guide)
# Liquid Fuels
Liquid Fuels are used in PneumaticCraft: Repressurized in the (Advanced) Liquid Compressor to create compressed air, and (optionally) in the Kerosene Lamp to produce light.
By default the liquids produced in the Refinery are defined as fuel, as well as any liquid above a temperature of 305 degrees Kelvin.
## Calling
You can call the Liquid Fuels package using `mods.pneumaticcraft.liquidfuel`.
## Removing
This function removes the fuel value of the given [ILiquidStack](/Vanilla/Liquids/ILiquidStack/) `fluid`:
```java
mods.pneumaticcraft.liquidfuel.removeFuel(ILiquidStack fluid);
// Example
mods.pneumaticcraft.liquidfuel.removeFuel(<liquid:lpg>);
```
This function removes *all* registered fuels:
```java
mods.pneumaticcraft.liquidfuel.removeAllFuels();
```
## Adding
The following functions can be used to add fluids to the fuel registry:
```java
// Register a certain liquid as a fuel. mlPerBucket defines the amount of compressed air produced per bucket of fuel. For reference, 16000mL of air is produced from a piece of Coal in an Air Compressor.
mods.pneumaticcraft.liquidfuel.addFuel(ILiquidStack fluid, double mlPerBucket);
// Example: register water as a fuel which produces 16000mL air per bucket.
mods.pneumaticcraft.liquidfuel.addFuel(<liquid:water>, 16000);
```
#### Element API Request Step Scope
Element API Request steps add the step execution values described in the example JSON below to the formula context. The formula context is then passed from step-to-step, allowing you to use these values in any subsequent steps in your formula.
```json
{
"myElementRequestStep": {
"request": {
"query": "{}",
"body": "{\"Name\":\"New Account Name\"}",
"method": "POST",
"path": "{}",
"uri": "/elements/api-v2/hubs/crm/accounts",
"headers": "{\"authorization\":\"Element /ABC=, User DEF=, Organization GHI\",\"content-length\":\"14\",\"host\":\"jjwyse.ngrok.io\",\"content-type\":\"application/json}"
},
"response": {
"code": "200",
"headers": "{\"Set-Cookie\": \"CESESSIONID=2CA15552EE56EAF65BF1102F6CACEACC;Path=/elements/;HttpOnly\"}",
"body": "{\"Id\": \"001tx3WcAAI\", \"Name\": \"New Account Name\"}"
}
}
}
```
Example references to Element API Request scope:
* `${steps.myElementRequestStep.request}`
* `${steps.myElementRequestStep.request.body}`
* `${steps.myElementRequestStep.response.code}`
---
_id: 5a88e1acbd6dca0d5f0d22fa
title: "Get SQL Server 2016 now"
url: 'https://www.microsoft.com/en-us/server-cloud/products/sql-server/'
category: libraries-tools
slug: 'get-sql-server-2016-now'
user_id: 5a83ce59d6eb0005c4ecda2c
username: 'bill-s'
createdOn: '2016-06-11T17:13:09.000Z'
tags: []
---
Build intelligent, mission-critical applications
# Flame

## Description
Flame is a self-hosted startpage for your server. It's inspired (heavily) by [SUI](https://github.com/jeroenpardon/sui).
## Technology
- Backend
  - Node.js + Express
  - Sequelize ORM + SQLite
- Frontend
  - React
  - Redux
  - TypeScript
- Deployment
  - Docker
## Development
```sh
git clone https://github.com/pawelmalak/flame
cd flame
# run only once
npm run dev-init
# start backend and frontend development servers
npm run dev
```
## Deployment with Docker
```sh
# build image
docker build -t flame .
# run container
docker run -p 5005:5005 -v <host_dir>:/app/data flame
```
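If you prefer Docker Compose, the `docker run` command above translates roughly to the following (a sketch — it assumes you built the image locally as `flame`, and `<host_dir>` is your data directory):

```yaml
version: '3.6'

services:
  flame:
    image: flame
    ports:
      - "5005:5005"
    volumes:
      - <host_dir>:/app/data
    restart: unless-stopped
```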
## Functionality
- Applications
  - Create, update and delete applications using GUI
  - Pin your favourite apps to homescreen

- Bookmarks
  - Create, update and delete bookmarks and categories using GUI
  - Pin your favourite categories to homescreen

- Weather
  - Get current temperature, cloud coverage and weather status with animated icons
- Themes
  - Customize your page by choosing from 12 color themes
 | 20.931034 | 117 | 0.735585 | eng_Latn | 0.813091 |
5851d24607dedecf4737c64c469bf9f4585bbec8 | 2,155 | md | Markdown | components/breadcrumb/events.md | accessguru/blazor-docs | 98447278f11b53cbe08c56247c855ab0a8c0a27d | [
"Apache-2.0"
] | 35 | 2019-05-07T16:49:20.000Z | 2022-03-01T14:58:59.000Z | components/breadcrumb/events.md | accessguru/blazor-docs | 98447278f11b53cbe08c56247c855ab0a8c0a27d | [
"Apache-2.0"
] | 310 | 2019-05-01T05:16:58.000Z | 2022-03-28T11:46:33.000Z | components/breadcrumb/events.md | accessguru/blazor-docs | 98447278f11b53cbe08c56247c855ab0a8c0a27d | [
"Apache-2.0"
] | 50 | 2019-04-27T13:44:17.000Z | 2022-03-15T15:54:01.000Z | ---
title: Events
page_title: Breadcrumb - Events
description: Events of the Breadcrumb for Blazor.
slug: breadcrumb-events
tags: telerik,blazor,breadcrumb,events
published: True
position: 30
---
# Breadcrumb Events
This article explains the events available in the Telerik Breadcrumb for Blazor:
* [OnItemClick](#onitemclick)
## OnItemClick
The `OnItemClick` event fires when the user clicks or taps on a Breadcrumb item, before any navigation occurs. It receives an object of type `BreadcrumbItemClickEventArgs`, which exposes the following fields:
* `Item` - an object you can cast to your model class to obtain the clicked data item.
* `IsCancelled` - specifies whether the event is canceled and the built-in action is prevented.
**The event will not fire for the last item and for disabled items.**
You can use the `OnItemClick` event to react to user choices in a Breadcrumb, and load new content without using navigation.
>caption Handle OnItemClick event of the Breadcrumb
````CSHTML
@* Handle the OnItemClick event of the Breadcrumb and cancel it for Item 3*@
@logger
<TelerikBreadcrumb OnItemClick="@ClickHandler"
Data="@Items" >
</TelerikBreadcrumb>
@code {
string logger;
void ClickHandler(BreadcrumbItemClickEventArgs args)
{
var ClickedItem = args.Item as BreadcrumbItem;
logger = $"User clicked {ClickedItem.Text}";
if (ClickedItem.Text == "Item 3")
{
args.IsCancelled = true;
logger = $"OnItemClick is cancelled for {ClickedItem.Text}";
}
}
public IEnumerable<BreadcrumbItem> Items { get; set; }
protected override void OnInitialized()
{
Items = new List<BreadcrumbItem>
{
new BreadcrumbItem { Text = "Item 1" },
new BreadcrumbItem { Text = "Item 2" },
new BreadcrumbItem { Text = "Item 3" },
new BreadcrumbItem { Text = "Item 4" },
};
}
public class BreadcrumbItem
{
public string Text { get; set; }
}
}
````
## See Also
* [Live Demo: Breadcrumb Events](https://demos.telerik.com/blazor-ui/breadcrumb/events)
58541e8854307f877517a83dff520021526d1e53 | 10,685 | md | Markdown | articles/azure-app-configuration/howto-leverage-json-content-type.md | niklasloow/azure-docs.sv-se | 31144fcc30505db1b2b9059896e7553bf500e4dc | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/azure-app-configuration/howto-leverage-json-content-type.md | niklasloow/azure-docs.sv-se | 31144fcc30505db1b2b9059896e7553bf500e4dc | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/azure-app-configuration/howto-leverage-json-content-type.md | niklasloow/azure-docs.sv-se | 31144fcc30505db1b2b9059896e7553bf500e4dc | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Use JSON content-type for key-values
titleSuffix: Azure App Configuration
description: Learn how to use JSON content-type for key-values
services: azure-app-configuration
author: avanigupta
ms.assetid: ''
ms.service: azure-app-configuration
ms.devlang: azurecli
ms.topic: how-to
ms.date: 08/03/2020
ms.author: avgupta
ms.openlocfilehash: 725beb50e55852e35ee4434539ff158f082059df
ms.sourcegitcommit: b8702065338fc1ed81bfed082650b5b58234a702
ms.translationtype: MT
ms.contentlocale: sv-SE
ms.lasthandoff: 08/11/2020
ms.locfileid: "88122044"
---
# <a name="leverage-content-type-to-store-json-key-values-in-app-configuration"></a>Leverage content-type to store JSON key-values in App Configuration
Data is stored in App Configuration as key-values, where values are treated as the string type by default. However, you can specify a custom type by using the content-type property associated with each key-value, so that you can preserve the original type of your data or have your application behave differently based on content-type.
## <a name="overview"></a>Overview
In App Configuration, you can use the JSON media type as the content type of your key-values to reap benefits like:
- **Simpler data management**: Managing key-values, like arrays, becomes a lot easier in the Azure portal.
- **Enhanced data export**: Primitive types, arrays, and JSON objects are preserved during data export.
- **Native support with the App Configuration provider**: Key-values with JSON content-type work fine when consumed by App Configuration provider libraries in your applications.
#### <a name="valid-json-content-type"></a>Valid JSON content-type
Media types, as defined [here](https://www.iana.org/assignments/media-types/media-types.xhtml), can be assigned to the content-type associated with each key-value.
A media type consists of a type and a subtype. If the type is `application` and the subtype (or suffix) is `json`, the media type is considered a valid JSON content type.
Some examples of valid JSON content types are:
- application/json
- application/activity+json
- application/vnd.foobar+json;charset=utf-8
#### <a name="valid-json-values"></a>Valid JSON values
When a key-value has JSON content-type, its value must be in a valid JSON format for clients to process it correctly. Otherwise, clients may fail or fall back and treat it as string format.
Some examples of valid JSON values are:
- "John Berg"
- 723
- false
- null
- "2020-01-01T12:34:56.789Z"
- [1, 2, 3, 4]
- {"ObjectSetting": {"Targeting": {"Default": true, "Level": "Information"}}}
> [!NOTE]
> For the rest of this article, any key-value in App Configuration that has a valid JSON content-type and a valid JSON value will be referred to as a **JSON key-value**.
In this tutorial, you'll learn how to:
> [!div class="checklist"]
> * Create JSON key-values in App Configuration.
> * Import JSON key-values from a JSON file.
> * Export JSON key-values to a JSON file.
> * Consume JSON key-values in your applications.
## <a name="prerequisites"></a>Prerequisites
- An Azure subscription - [create one for free](https://azure.microsoft.com/free/).
- Latest version of the Azure CLI (2.10.0 or later). Run `az --version` to find the version. If you need to install or upgrade, see [Install the Azure CLI](/cli/azure/install-azure-cli). If you're using the Azure CLI, you must first sign in using `az login`. You can also use the Azure Cloud Shell.
[!INCLUDE [cloud-shell-try-it.md](../../includes/cloud-shell-try-it.md)]
## <a name="create-an-app-configuration-store"></a>Create an App Configuration store
[!INCLUDE [azure-app-configuration-create](../../includes/azure-app-configuration-create.md)]
## <a name="create-json-key-values-in-app-configuration"></a>Create JSON key-values in App Configuration
JSON key-values can be created using the Azure portal, the Azure CLI, or by importing from a JSON file. In this section, you'll find instructions on creating the same JSON key-values using all three methods.
### <a name="create-json-key-values-using-azure-portal"></a>Create JSON key-values using Azure portal
Browse to your App Configuration store, and select **Configuration Explorer** > **Create** > **Key-value** to add the following key-value pairs:
| Key | Value | Content Type |
|---|---|---|
| Settings:BackgroundColor | "Green" | application/json |
| Settings:FontSize | 24 | application/json |
| Settings:UseDefaultRouting | false | application/json |
| Settings:BlockedUsers | null | application/json |
| Settings:ReleaseDate | "2020-08-04T12:34:56.789Z" | application/json |
| Settings:RolloutPercentage | [25,50,75,100] | application/json |
| Settings:Logging | {"Test":{"Level":"Debug"},"Prod":{"Level":"Warning"}} | application/json |
Leave **Label** empty and select **Apply**.
### <a name="create-json-key-values-using-azure-cli"></a>Create JSON key-values using Azure CLI
The following commands will create JSON key-values in your App Configuration store. Replace `<appconfig_name>` with the name of your App Configuration store.
```azurecli-interactive
appConfigName=<appconfig_name>
az appconfig kv set -n $appConfigName --content-type application/json --key Settings:BackgroundColor --value \"Green\"
az appconfig kv set -n $appConfigName --content-type application/json --key Settings:FontSize --value 24
az appconfig kv set -n $appConfigName --content-type application/json --key Settings:UseDefaultRouting --value false
az appconfig kv set -n $appConfigName --content-type application/json --key Settings:BlockedUsers --value null
az appconfig kv set -n $appConfigName --content-type application/json --key Settings:ReleaseDate --value \"2020-08-04T12:34:56.789Z\"
az appconfig kv set -n $appConfigName --content-type application/json --key Settings:RolloutPercentage --value [25,50,75,100]
az appconfig kv set -n $appConfigName --content-type application/json --key Settings:Logging --value {\"Test\":{\"Level\":\"Debug\"},\"Prod\":{\"Level\":\"Warning\"}}
```
> [!IMPORTANT]
> If you're using the Azure CLI or Azure Cloud Shell to create JSON key-values, the value provided must be an escaped JSON string.
### <a name="import-json-key-values-from-a-file"></a>Import JSON key-values from a file
Create a JSON file called `Import.json` with the following content and import it as key-values into App Configuration:
```json
{
"Settings": {
"BackgroundColor": "Green",
"BlockedUsers": null,
"FontSize": 24,
"Logging": {"Test":{"Level":"Debug"},"Prod":{"Level":"Warning"}},
"ReleaseDate": "2020-08-04T12:34:56.789Z",
"RolloutPercentage": [25,50,75,100],
"UseDefaultRouting": false
}
}
```
```azurecli-interactive
az appconfig kv import -s file --format json --path "~/Import.json" --content-type "application/json" --separator : --depth 2
```
> [!Note]
> The `--depth` argument is used for flattening hierarchical data from a file into key-value pairs. In this tutorial, the depth is specified to demonstrate that you can also store JSON objects as values in App Configuration. If depth isn't specified, JSON objects will be flattened to the deepest level by default.
The JSON key-values you created should look like this in App Configuration:

## <a name="export-json-key-values-to-a-file"></a>Export JSON key-values to a file
One of the major benefits of using JSON key-values is the ability to preserve the original data type of your values while exporting. If a key-value in App Configuration doesn't have JSON content-type, its value is treated as a string.
Consider these key-values without JSON content-type:
| Key | Value | Content Type |
|---|---|---|
| Settings:FontSize | 24 | |
| Settings:UseDefaultRouting | false | |
When you export these key-values to a JSON file, the values are exported as strings:
```json
{
"Settings": {
"FontSize": "24",
"UseDefaultRouting": "false"
}
}
```
However, when you export JSON key-values to a file, all values retain their original data types. To verify this, export the key-values from your App Configuration store to a JSON file. You'll see that the exported file has the same contents as the `Import.json` file you imported earlier.
```azurecli-interactive
az appconfig kv export -d file --format json --path "~/Export.json" --separator :
```
> [!NOTE]
> If your App Configuration store has any key-values without JSON content-type, they will also be exported to the same file in string format. If you want to export only JSON key-values, assign a unique label or prefix to your JSON key-values and use label or prefix filtering during export.
## <a name="consuming-json-key-values-in-applications"></a>Consuming JSON key-values in applications
The easiest way to consume JSON key-values in your application is through App Configuration provider libraries. With the provider libraries, you don't need to implement special handling of JSON key-values in your application; they are always deserialized for your application in the same way other JSON configuration providers do it.
> [!Important]
> Native support for JSON key-values is available in the .NET configuration provider version 4.0.0 (or later). See the [*Next steps*](#next-steps) section for more details.
If you're using the SDK or REST API to read key-values from App Configuration, then, based on the content-type, your application is responsible for deserializing the value of a JSON key-value using any standard JSON deserializer.
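For illustration, a minimal sketch of that deserialization step in Python — the `response_body` below is a stand-in for an actual key-value response from the REST API (a real response contains additional fields such as `etag` and `label`):

```python
import json

# Stand-in for the JSON body returned when reading a key-value
# through the App Configuration REST API.
response_body = '{"key": "Settings:RolloutPercentage", "content_type": "application/json", "value": "[25,50,75,100]"}'

kv = json.loads(response_body)

# The value always arrives as a string; deserialize it only when the
# content type indicates JSON. (A simple substring check is used here;
# a stricter check would validate the media subtype/suffix as described above.)
if kv.get("content_type") and "json" in kv["content_type"]:
    value = json.loads(kv["value"])
else:
    value = kv["value"]

print(value)  # [25, 50, 75, 100]
```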
## <a name="clean-up-resources"></a>Clean up resources
[!INCLUDE [azure-app-configuration-cleanup](../../includes/azure-app-configuration-cleanup.md)]
## <a name="next-steps"></a>Next steps
Now that you know how to work with JSON key-values in your App Configuration store, create an application for consuming these key-values:
* [ASP.NET Core quickstart](./quickstart-aspnet-core-app.md)
  * Prerequisite: [Microsoft.Azure.AppConfiguration.AspNetCore](https://www.nuget.org/packages/Microsoft.Azure.AppConfiguration.AspNetCore) package v4.0.0 or later.
* [.NET Core quickstart](./quickstart-dotnet-core-app.md)
  * Prerequisite: [Microsoft.Extensions.Configuration.AzureAppConfiguration](https://www.nuget.org/packages/Microsoft.Extensions.Configuration.AzureAppConfiguration) package v4.0.0 or later.
58549ab7c5be269f3692a481e4c10c1912f36d51 | 311 | md | Markdown | clients/cimi/README.md | Matthelonianxl/deltacloud-core | 21f04670ee96c30fa7998afe4aebfca12bb40be6 | [
"Apache-2.0"
] | 7 | 2015-04-05T20:39:46.000Z | 2018-01-14T12:44:01.000Z | clients/cimi/README.md | Matthelonianxl/deltacloud-core | 21f04670ee96c30fa7998afe4aebfca12bb40be6 | [
"Apache-2.0"
] | 1 | 2021-11-04T13:04:11.000Z | 2021-11-04T13:04:11.000Z | clients/cimi/README.md | isabella232/deltacloud | 64c62c2437767af247a579a6c3c718e2641c1796 | [
"Apache-2.0"
] | 9 | 2015-06-05T14:38:15.000Z | 2021-11-10T15:15:54.000Z | CIMI Frontend
==============
Prerequisites
-------------
Start Deltacloud API with CIMI option:
$ cd core/server
$ deltacloudd -i mock -f cimi
Then start CIMI Frontend server and point to Deltacloud API URL:
$ cd core/clients/cimi
$ bundle
$ ./bin/start -u "http://localhost:3001/cimi"
5854bf3e0ec1e9864e94edd028c064e194f0eccd | 197 | md | Markdown | README.md | danwild/bokeh-tutorial | 6ae38e57ec97c371d5f04348734dc889db3add00 | [
"Apache-2.0"
] | null | null | null | README.md | danwild/bokeh-tutorial | 6ae38e57ec97c371d5f04348734dc889db3add00 | [
"Apache-2.0"
] | null | null | null | README.md | danwild/bokeh-tutorial | 6ae38e57ec97c371d5f04348734dc889db3add00 | [
"Apache-2.0"
] | null | null | null | # bokeh-tutorial
Repo for learning bokeh concepts, see: https://docs.bokeh.org/en/latest/docs/user_guide/quickstart.html
```
conda env create -f environment.yml
conda activate bokeh_tutorial
```
58550c9b47f3f345a4b49774fd1142a6b6a76986 | 17,983 | md | Markdown | README.md | tomekowal/elixir-type_check | 8c51463d5b645f95159d31c0aa9a201275d5780f | [
"MIT"
] | null | null | null | README.md | tomekowal/elixir-type_check | 8c51463d5b645f95159d31c0aa9a201275d5780f | [
"MIT"
] | null | null | null | README.md | tomekowal/elixir-type_check | 8c51463d5b645f95159d31c0aa9a201275d5780f | [
"MIT"
] | null | null | null | 
# TypeCheck: Fast and flexible runtime type-checking for your Elixir projects.
[](https://hex.pm/packages/type_check)
[](https://hexdocs.pm/type_check/index.html)
[](https://github.com/Qqwy/elixir-type_check/actions/workflows/ci.yml)
[](https://coveralls.io/github/Qqwy/elixir-type_check)
## Core ideas
- Type- and function specifications are constructed using (essentially) the **same syntax** as built-in Elixir Typespecs.
- When a value does not match a type check, the user is shown **human-friendly error messages**.
- Types and type-checks are generated at compiletime.
  - This means **type-checking code is optimized** rigorously by the compiler.
- **Property-checking generators** can be extracted from type specifications without extra work.
- Automatically create a **spectest** which checks for each function if it adheres to its spec.
- Flexibility to add **custom checks**: Subparts of a type can be named, and 'type guards' can be specified to restrict what values are allowed to match that refer to these types.
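For instance, the spectest mentioned above boils down to very little test code. The sketch below assumes a module `AgeCheck` with `@spec!`-annotated functions (like the one in the usage example below) and the optional `:stream_data` dependency:

```elixir
defmodule AgeCheckSpecTest do
  use ExUnit.Case, async: true
  use TypeCheck.ExUnit

  # Generates property-based tests for every function in AgeCheck that
  # has a @spec!: random inputs are derived from the parameter types,
  # and each return value is checked against the spec's return type.
  spectest AgeCheck
end
```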
## Usage Example
We add `use TypeCheck` to a module
and wherever we want to add runtime type-checks
we replace the normal calls to `@type` and `@spec` with `@type!` and `@spec!` respectively.
```elixir
defmodule User do
use TypeCheck
defstruct [:name, :age]
@type! t :: %User{name: binary, age: integer}
end
defmodule AgeCheck do
use TypeCheck
@spec! user_older_than?(User.t, integer) :: boolean
def user_older_than?(user, age) do
user.age >= age
end
end
```
Now we can try the following:
```elixir
iex> AgeCheck.user_older_than?(%User{name: "Qqwy", age: 11}, 10)
true
iex> AgeCheck.user_older_than?(%User{name: "Qqwy", age: 9}, 10)
false
```
So far so good. Now let's see what happens when we pass values that are incorrect:
```elixir
iex> AgeCheck.user_older_than?("foobar", 42)
** (TypeCheck.TypeError) At lib/type_check_example.ex:28:
The call to `user_older_than?/2` failed,
because parameter no. 1 does not adhere to the spec `%User{age: integer(), name: binary()}`.
Rather, its value is: `"foobar"`.
Details:
The call `user_older_than?("foobar", 42)`
does not adhere to spec `user_older_than?(%User{age: integer(), name: binary()}, integer()) :: boolean()`. Reason:
parameter no. 1:
`"foobar"` does not check against `%User{age: integer(), name: binary()}`. Reason:
`"foobar"` is not a map.
(type_check_example 0.1.0) lib/type_check_example.ex:28: AgeCheck.user_older_than?/2
```
```elixir
iex> AgeCheck.user_older_than?(%User{name: nil, age: 11}, 10)
** (TypeCheck.TypeError) At lib/type_check_example.ex:28:
The call to `user_older_than?/2` failed,
because parameter no. 1 does not adhere to the spec `%User{age: integer(), name: binary()}`.
Rather, its value is: `%User{age: 11, name: nil}`.
Details:
The call `user_older_than?(%User{age: 11, name: nil}, 10)`
does not adhere to spec `user_older_than?(%User{age: integer(), name: binary()}, integer()) :: boolean()`. Reason:
parameter no. 1:
`%User{age: 11, name: nil}` does not check against `%User{age: integer(), name: binary()}`. Reason:
under key `:name`:
`nil` is not a binary.
(type_check_example 0.1.0) lib/type_check_example.ex:28: AgeCheck.user_older_than?/2
```
```elixir
iex> AgeCheck.user_older_than?(%User{name: "Aaron", age: nil}, 10)
** (TypeCheck.TypeError) At lib/type_check_example.ex:28:
The call to `user_older_than?/2` failed,
because parameter no. 1 does not adhere to the spec `%User{age: integer(), name: binary()}`.
Rather, its value is: `%User{age: nil, name: "Aaron"}`.
Details:
The call `user_older_than?(%User{age: nil, name: "Aaron"}, 10)`
does not adhere to spec `user_older_than?(%User{age: integer(), name: binary()}, integer()) :: boolean()`. Reason:
parameter no. 1:
`%User{age: nil, name: "Aaron"}` does not check against `%User{age: integer(), name: binary()}`. Reason:
under key `:age`:
`nil` is not an integer.
(type_check_example 0.1.0) lib/type_check_example.ex:28: AgeCheck.user_older_than?/2
```
```elixir
iex> AgeCheck.user_older_than?(%User{name: "José", age: 11}, 10.0)
** (TypeCheck.TypeError) At lib/type_check_example.ex:28:
The call to `user_older_than?/2` failed,
because parameter no. 2 does not adhere to the spec `integer()`.
Rather, its value is: `10.0`.
Details:
The call `user_older_than?(%User{age: 11, name: "José"}, 10.0)`
does not adhere to spec `user_older_than?(%User{age: integer(), name: binary()}, integer()) :: boolean()`. Reason:
parameter no. 2:
`10.0` is not an integer.
(type_check_example 0.1.0) lib/type_check_example.ex:28: AgeCheck.user_older_than?/2
```
And if we were to introduce an error in the function definition:
```elixir
defmodule AgeCheck do
use TypeCheck
@spec! user_older_than?(User.t, integer) :: boolean
def user_older_than?(user, age) do
user.age
end
end
```
Then we get a nice error message explaining that problem as well:
```elixir
** (TypeCheck.TypeError) The call to `user_older_than?/2` failed,
because the returned result does not adhere to the spec `boolean()`.
Rather, its value is: `26`.
Details:
The result of calling `user_older_than?(%User{age: 26, name: "Marten"}, 10)`
does not adhere to spec `user_older_than?(%User{age: integer(), name: binary()}, integer()) :: boolean()`. Reason:
Returned result:
`26` is not a boolean.
(type_check_example 0.1.0) lib/type_check_example.ex:28: AgeCheck.user_older_than?/2
```
## Features & Roadmap
### Implemented
- [x] Proof and implementation of the basic concept
- [x] Custom type definitions (type, typep, opaque)
- [x] Basic
- [x] Parameterized
- [x] Hide implementation of `opaque` from documentation
- [x] Spec argument types checking
- [x] Spec return type checking
- [x] Spec possibly named arguments
- [x] Implementation of Elixir's builtin types
- [x] Primitive types
- [x] More primitive types
- [x] Compound types
- [x] special forms like `|`, `a..b` etc.
- [x] Literal lists
- [x] Maps with keys => types
- [x] Structs with keys => types
- [x] More map/list-based structures.
- [x] Bitstring type syntax (`<<>>`, `<<_ :: size>>`, `<<_ :: _ * unit>>`, `<<_ :: size, _ :: _ * unit>>`)
- [x] A `when` to add guards to typedefs for more power.
- [x] Make errors raised when types do not match humanly readable
- [x] Improve readability of spec-errors by repeating spec and which parameter did not match.
- [x] Creating generators from types
- [x] Don't warn on zero-arity types used without parentheses.
- [x] Hide structure of `opaque` and `typep` from documentation
- [x] Make sure to handle recursive (and mutually recursive) types without hanging.
- [x] A compile-error is raised when a type is expanded more than a million times
- [x] A macro called `lazy` is introduced to allow to defer type expansion to runtime (to _within_ the check).
- [x] the Elixir formatter likes the way types+specs are constructed
- [x] A type `impl(ProtocolName)` to work with 'any type implementing protocol `Protocolname`'.
- [x] Type checks.
- [x] StreamData generator.
- [x] High code-coverage to ensure stability of implementation.
- [x] Make sure we handle most (if not all) of Typespec's primitive types and syntax. (With the exception of functions and binary pattern matching)
- [x] Option to turn `@type/@opaque/@typep`-injection off for the cases in which it generates improper results.
- [x] Manually overriding generators for user-specified types if so desired.
- [x] Creating generators from specs
- [x] Wrap spec-generators so you have a single statement to call in the test suite which will prop-test your function against all allowed inputs/outputs.
- [x] Option to turn the generation of runtime checks off for a given module in a particular environment (`enable_runtime_checks`).
- [x] Support for function-types (for typechecks as well as property-testing generators):
- `(-> result_type)`
- `(...-> result_type)`
- `(param_type, param2_type -> result_type)`
- [ ] Overrides for builtin remote types (`String.t`,`Enum.t`, `Range.t`, `MapSet.t` etc.) **(75% done)** [Details](https://hexdocs.pm/type_check/comparing-typecheck-and-elixir-typespecs.html#elixir-standard-library-types)
### Pre-stable
- [ ] Overrides for more builtin remote types
- [ ] Support for maps with mixed `required(type)` and `optional(type)` syntaxes.
- [ ] Hide named types from opaque types.
- [ ] Configurable setting to turn on/off at compile-time, and maybe dynamically at run-time (with slight performance penalty).
- [ ] Finalize formatter specification and make a generator for this so that people can easily test their own formatters.
### Longer-term future ideas
- [ ] Per-module or even per-spec settings to turn on/off, configure formatter, etc.
### Changelog
- 0.10.5 -
- Additions:
- Support for the builtin types of the `System` module. Thank you, @paulswartz!
- 0.10.4 -
- Fixes:
- Fixes issue where sometimes results of specs returning a struct would remove some or all struct fields. (c.f. #78)
- Related: Ensures that when a `@type!` containing a `%__MODULE__{}` is used before that module's `defstruct`, a clear CompileError is created (c.f. #83).
- 0.10.3 -
- Fixes:
- Fixes issue when TypeCheck specs were used inside a defprotocol module or other modules where the normal `def`/`defp` macros are hidden/overridden.
- 0.10.2 -
- Fixes:
- Fixes issue where FixedMaps would accept maps any maps (even those missing the required keys) sometimes. (c.f. #74)
- 0.10.1 -
- Fixes:
- Swaps `Murmur` out for Erlang's builtin `:erlang.phash2/2` to generate data for function-types, allowing the removal of the optional dependency on the `:murmur` library.
- 0.10.0 -
- Additions
- Support for function-types (for typechecks as well as property-testing generators):
- `(-> result_type)`
- `(...-> result_type)`
- `(param_type, param2_type -> result_type)`
- Fixes:
- Wrapping private functions no longer make the function public. (c.f. #64)
- Wrapping macros now works correctly. (also related to #64)
- Using `__MODULE__` inside a struct inside a type now expands correctly. (c.f. #66)
- 0.9.0 -
- Support for bitstring type syntax: `<<>>`, `<<_ :: size>>`, `<<_ :: _ * unit>>`, `<<_ :: size, _ :: _ * unit>>` (both as types and as generators)
- 0.8.2 -
- Fixed compiler warnings when optional dependency StreamData is not installed.
- Fixed pretty-printing of typedoc of opaque types.
- 0.8.1 -
- Improved documentation with a page comparing TypeCheck with Elixir's plain typespecs. (Thank you very much, @baldwindavid )
- Addition of missing override for the type `t Range.t/0`. (Thank you very much, @baldwindavid )
- 0.8.0 -
- Fixes prettyprinting of `TypeCheck.Builtin.Range`.
- Addition of `require TypeCheck.Type` to `use TypeCheck` so there no longer is a need to call this manually if you want to e.g. use `TypeCheck.Type.build/1`.
- Pretty-printing of types and TypeError output in multiple colors.
- Nicer indentation of errors.
- named types are now printed in abbreviated fashion if they are repeated multiple times in an error message. This makes a nested error message much easier to read, especially for larger specs.
- `[type]` no longer creates a `fixed_list(type)` but instead a `list(type)` (just as Elixir's own typespecs.)
- Support for `[...]` and `[type, ...]`as alias for `nonempty_list()` and `nonempty_list(type)` respectively.
- Remove support for list literals with multiple elements.
- Improved documentation. Thank you, @0ourobor0s!
- 0.7.0 - Addition of the option `enable_runtime_checks`. When false, all runtime checks in the given module are completely disabled.
- Adding `DateTime.t` to the default overrides, as it was still missing.
- 0.6.0 - Addition of `spectest` & 'default overrides' Elixir's standard library types:
- Adding `TypeCheck.ExUnit`, with the function `spectest` to test function-specifications.
- Possibility to use options `:except`, `:only`, `:initial_seed`.
- Possibility to pass custom options to StreamData.
- Adding `TypeCheck.DefaultOverrides` with many sub-modules containing checked typespecs for the types in Elixir's standard library (75% done).
- Ensure that these types are correct also on older Elixir versions (1.9, 1.10, 1.11)
- By default load these 'DefaultOverrides', but have the option to turn this behaviour off in `TypeCheck.Option`.
- Nice generators for `Enum.t`, `Collectable.t`, `String.t`.
- Support for the builtin types:
- `pid()`
- `nonempty_list()`, `nonempty_list(type)`.
- Allow `use TypeCheck` in IEx or other non-module contexts, to require `TypeCheck` and import `TypeCheck.Builtin` in the current scope (without importing/using the macros that only work at the module level.)
- The introspection function `__type_check__/1` is now added to any module that contains a `use TypeCheck`.
  - Fixes the `Inspect` implementation of custom structs by falling back to `Any`, which is more useful than attempting a customized implementation that would try to read the values in the struct and fail, because the struct-type contains types rather than values in its fields.
- Fixes conditional compilation warnings when optional dependency `:stream_data` was not included in your project.
- 0.5.0 - Stability improvements:
  - Adding the `TypeCheck.Option` `debug: true`, which will (at compile-time) print the checks that TypeCheck is generating.
- Actually autogenerate a `@spec`, which did not yet happen before.
- When writing `@autogen_typespec false`, no typespec is exported for the next `@type!`/`@opaque`/`@spec!` encountered in a module.
- Code coverage increased to 85%
- Bugfixes w.r.t. generating typespecs
- Fixes compiler-warnings on unused named types when using a type guard. (c.f. #25)
- Fixes any warnings that were triggered during the test suite before.
- 0.4.0 - Support for `impl(ProtocolName)` to accept any type implementing a particular protocol.
- Also adds rudimentary support for overriding remote types.
  - Bugfix when inspecting `lazy(...)`-types.
- 0.3.2 - Support for unquote fragments inside types and specs. (c.f. #39)
- 0.3.1 - Fixed link in the documentation.
- 0.3.0 - Improve DefaultFormatter output when used with long function- or type-signatures (c.f. #32). Also, bugfix for `Builtin.tuple/1`.
- 0.2.3 - Bugfix release: Ensure TypeCheck compiles on Elixir v1.11 (#30), Ensure StreamData truly is an optional dependency (#27).
- 0.2.2 - Support for literal strings should no longer break in Elixir's builtin typespecs.
- 0.2.1 - Improved parsing of types that have a type-guard at the root level. (c.f. #24), support for custom generators.
- 0.2.0 - Improved (and changed) API that works better with the Elixir formatter: Use `@type!`/`@spec!` instead, support named types in specs.
- 0.1.2 - Added missing `keyword` type to TypeCheck.Builtin (#20)
- 0.1.1 - Fixing some documentation typos
- 0.1.0 - Initial Release
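The `spectest` macro mentioned in the 0.6.0 entry above can be exercised from an ordinary test module. The sketch below is illustrative: `Calculator` is a hypothetical module whose public functions carry `@spec!` annotations, and the optional `:stream_data` dependency must be installed.

```elixir
defmodule CalculatorTest do
  use ExUnit.Case, async: true
  use TypeCheck.ExUnit

  # Generates property-based tests from every `@spec!` in the
  # (hypothetical) Calculator module: each function is called with
  # random inputs of the declared argument types, and the result is
  # checked against the declared return type.
  spectest Calculator
end
```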
## Installation
TypeCheck [is available in Hex](https://hex.pm/docs/publish). The package can be installed
by adding `type_check` to your list of dependencies in `mix.exs`:
```elixir
def deps do
[
{:type_check, "~> 0.10.0"},
# To allow spectesting and property-testing data generators (optional):
{:stream_data, "~> 0.5.0", only: :test},
]
end
```
The documentation can be found at [https://hexdocs.pm/type_check](https://hexdocs.pm/type_check).
### Formatter
TypeCheck exports a couple of macros that you might want to use without parentheses. To make `mix format` respect this setting, add `import_deps: [:type_check]` to your `.formatter.exs` file.
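For orientation, here is a minimal sketch of the `@type!`/`@spec!` workflow. The `User` module and its functions are illustrative placeholders, not part of TypeCheck itself:

```elixir
defmodule User do
  use TypeCheck

  defstruct [:name, :age]

  # Defines a checked type; a regular @type is emitted as well,
  # so documentation and Dialyzer keep working.
  @type! t() :: %__MODULE__{name: String.t(), age: non_neg_integer()}

  # The argument and return value are checked at runtime;
  # a mismatch raises a descriptive TypeCheck.TypeError.
  @spec! greet(t()) :: String.t()
  def greet(user), do: "Hello, #{user.name}!"
end
```

Calling `User.greet(%User{name: "Ada", age: 36})` passes the checks, while `User.greet("not a user")` raises a `TypeCheck.TypeError` explaining exactly which part of the type did not match.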
## TypeCheck compared to other tools
TypeCheck is by no means the only solution out there for reducing the number of bugs in your code.
### Elixir's builtin typespecs and Dialyzer
[Elixir's builtin type-specifications](https://hexdocs.pm/elixir/typespecs.html) use the same syntax as TypeCheck.
They are however not used by the compiler or the runtime, and therefore mainly exist to improve your documentation.
Besides documentation, extra external tools like [Dialyzer](http://erlang.org/doc/apps/dialyzer/dialyzer_chapter.html) can be used to perform static analysis of the types used in your application.
Dialyzer is an opt-in static analysis tool. This means that it can point out some inconsistencies or bugs, but because of its opt-in nature, there are also many problems it cannot detect, and it requires your dependencies to have written all of their typespecs correctly.
Dialyzer is also (unfortunately) infamous for its at times difficult-to-understand error messages.
An advantage that Dialyzer has over TypeCheck is that its checking is done without having to execute your program code (thus not having any effect on the runtime behaviour or efficiency of your projects).
Because TypeCheck adds `@type`, `@typep`, `@opaque` and `@spec`-attributes based on the types that are defined, it is possible to use Dialyzer together with TypeCheck.
### Norm
[Norm](https://github.com/keathley/norm/) is an Elixir library for specifying the structure of data that can be used for both validation and data-generation.
On a superficial level, Norm and TypeCheck seem similar. However, there are [important differences in their design considerations](https://github.com/Qqwy/elixir-type_check/blob/master/Comparing%20TypeCheck%20and%20Norm.md).
## Is it any good?
[yes](https://news.ycombinator.com/item?id=3067434)
---
title: RequiredFrameworkVersion Element (Visual Studio Templates) | Microsoft Docs
ms.date: 11/15/2016
ms.prod: visual-studio-dev14
ms.technology: vs-ide-general
ms.topic: reference
helpviewer_keywords:
- <RequiredFrameworkVersion> (Visual Studio Templates)
- RequiredFrameworkVersion (Visual Studio Templates)
ms.assetid: 08a4f609-51a5-4723-b89f-99277fb18871
caps.latest.revision: 11
ms.author: gregvanl
manager: jillfra
ms.openlocfilehash: ce312d7951f4c1be720604c006f9afcd63f364d3
ms.sourcegitcommit: 94b3a052fb1229c7e7f8804b09c1d403385c7630
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 04/23/2019
ms.locfileid: "68163658"
---
# <a name="requiredframeworkversion-element-visual-studio-templates"></a>RequiredFrameworkVersion Element (Visual Studio Templates)
[!INCLUDE[vs2017banner](../includes/vs2017banner.md)]
Specifies the minimum version of the .NET Framework that is required by the template. Schema hierarchy:
 \<VSTemplate>
 \<TemplateData>
 \<RequiredFrameworkVersion>
## <a name="syntax"></a>Syntax
```
<RequiredFrameworkVersion> .... </RequiredFrameworkVersion>
```
## <a name="attributes-and-elements"></a>Attributes and elements
The following sections describe attributes, child elements, and parent elements.
### <a name="attributes"></a>Attributes
None.
### <a name="child-elements"></a>Child elements
None.
### <a name="parent-elements"></a>Parent elements
|Element|Description|
|-------------|-----------------|
|[TemplateData](../extensibility/templatedata-element-visual-studio-templates.md)|Required element.<br /><br /> Categorizes the template and defines how it is displayed in either the **New Project** or the **Add New Item** dialog box.|
## <a name="text-value"></a>Text value
A text value is required.
The text must be the minimum .NET Framework version number that the template requires.
## <a name="remarks"></a>Remarks
`RequiredFrameworkVersion` is an optional element. Use this element if the template supports only a specific minimum version (and later versions, if any) of the .NET Framework.
## <a name="see-also"></a>See also
[Visual Studio Template Schema Reference](../extensibility/visual-studio-template-schema-reference.md)
[Creating Project and Item Templates](../ide/creating-project-and-item-templates.md)
[Targeting a Specific .NET Framework Version](../ide/targeting-a-specific-dotnet-framework-version.md)
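As an illustration, here is a minimal, hypothetical `.vstemplate` fragment that uses this element. The template name, description, and project file reference are placeholders, not values prescribed by the schema:

```xml
<VSTemplate Version="3.0.0" Type="Project"
    xmlns="http://schemas.microsoft.com/developer/vstemplate/2005">
  <TemplateData>
    <Name>My Console Project</Name>
    <Description>Requires .NET Framework 4.5 or later.</Description>
    <ProjectType>CSharp</ProjectType>
    <!-- Hide the template when the targeted framework
         version is lower than 4.5. -->
    <RequiredFrameworkVersion>4.5</RequiredFrameworkVersion>
  </TemplateData>
  <TemplateContent>
    <Project File="MyTemplate.csproj" ReplaceParameters="true" />
  </TemplateContent>
</VSTemplate>
```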
---
title: Accessibility
description: Information on Flutter's accessibility support.
---
Ensuring apps are accessible to a broad range of users is an essential
part of building a high-quality app. Applications that are poorly
designed create barriers to people of all ages. The [UN Convention on
the Rights of Persons with Disabilities][CRPD] states the moral and legal
imperative to ensure universal access to information systems; countries
around the world enforce accessibility as a requirement; and companies
recognize the business advantages of maximizing access to their services.
We strongly encourage you to include an accessibility checklist
as a key criteria before shipping your app. Flutter is committed to
supporting developers in making their apps more accessible, and includes
first-class framework support for accessibility in addition to that
provided by the underlying operating system, including:
[**Large fonts**][]
: Render text widgets with user-specified font sizes
[**Screen readers**][]
: Communicate spoken feedback about UI contents
[**Sufficient contrast**][]
: Render widgets with colors that have sufficient contrast
Details of these features are discussed below.
## Inspecting accessibility support
In addition to testing for these specific topics,
we recommend using automated accessibility scanners:
* For Android:
1. Install the [Accessibility Scanner][] for Android
1. Enable the Accessibility Scanner from
**Android Settings > Accessibility >
Accessibility Scanner > On**
1. Navigate to the Accessibility Scanner 'checkbox'
icon button to initiate a scan
* For iOS:
1. Open the `iOS` folder of your Flutter app in Xcode
1. Select a Simulator as the target, and click **Run** button
1. In Xcode, select
**Xcode > Open Developer Tools > Accessibility Inspector**
1. In the Accessibility Inspector,
select **Inspection > Enable Point to Inspect**,
and then select the various user interface elements in running
Flutter app to inspect their accessibility attributes
1. In the Accessibility Inspector,
select **Audit** in the toolbar, and then
      select **Run Audit** to get a report of potential issues
## Large fonts
Both Android and iOS contain system settings to configure the desired font
sizes used by apps. Flutter text widgets respect this OS setting when
determining font sizes.
Font sizes are calculated automatically by Flutter based on the OS setting.
However, as a developer you should make sure your layout has enough room to
render all its contents when the font sizes are increased.
For example, you can test all parts of your app on a small-screen
device configured to use the largest font setting.
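One way to catch layout breakage early is a widget test that forces a large text scale. The test below is a sketch, not part of the template app; it assumes the standard `flutter_test` setup:

```dart
import 'package:flutter/material.dart';
import 'package:flutter_test/flutter_test.dart';

void main() {
  testWidgets('layout survives a large text scale factor', (tester) async {
    await tester.pumpWidget(
      MaterialApp(
        home: MediaQuery(
          // Simulate the OS "largest font" accessibility setting.
          data: const MediaQueryData(textScaleFactor: 3.0),
          child: const Scaffold(
            body: Center(child: Text('Hello')),
          ),
        ),
      ),
    );
    // Fails if layout threw, e.g. a RenderFlex overflow error.
    expect(tester.takeException(), isNull);
  });
}
```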
### Example
The following two screenshots show the standard Flutter app
template rendered with the default iOS font setting,
and with the largest font setting selected in iOS accessibility settings.
<div class="row">
<div class="col-md-6">
{% include app-figure.md image="a18n/app-regular-fonts.png" caption="Default font setting" img-class="border" %}
</div>
<div class="col-md-6">
{% include app-figure.md image="a18n/app-large-fonts.png" caption="Largest accessibility font setting" img-class="border" %}
</div>
</div>
## Screen readers
Screen readers ([TalkBack][], [VoiceOver][]) enable visually
impaired users to get spoken feedback about the contents of the screen.
Turn on VoiceOver or TalkBack on your device and navigate around your app. If
you run into any issues, use the [`Semantics` widget][] to customize the
accessibility experience of your app.
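For example, a purely decorative widget can be given an explicit label so that screen readers announce something meaningful. The `RatingStar` widget below is a hypothetical sketch, not part of the template app:

```dart
import 'package:flutter/material.dart';

/// A purely visual star icon, annotated so that TalkBack/VoiceOver
/// read a meaningful description instead of skipping it.
class RatingStar extends StatelessWidget {
  const RatingStar({Key? key}) : super(key: key);

  @override
  Widget build(BuildContext context) {
    return Semantics(
      label: 'Rated four out of five stars',
      // Hide the visual child from the semantics tree so only the
      // label above is announced.
      excludeSemantics: true,
      child: const Icon(Icons.star),
    );
  }
}
```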
## Sufficient contrast
Sufficient color contrast makes text and images easier to read.
Along with benefitting users with various visual impairments,
sufficient color contrast helps all users when viewing an interface
on devices in extreme lighting conditions,
such as when exposed to direct sunlight or on a display with low
brightness.
The [W3C recommends][]:
* At least 4.5:1 for small text (below 18 point regular or 14 point bold)
* At least 3.0:1 for large text (18 point and above regular or 14 point and
above bold)
## Building with accessibility in mind
Ensuring your app can be used by everyone means building accessibility
into it from the start. For some apps, that's easier said than done.
In the video below, two of our engineers take a mobile app from a dire
accessibility state to one that takes advantage of Flutter's built-in
widgets to offer a dramatically more accessible experience.
<iframe width="560" height="315" src="https://www.youtube.com/embed/bWbBgbmAdQs" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
## Accessibility release checklist
Here is a non-exhaustive list of things to consider as you prepare your
app for release.
- **Active interactions**. Ensure that all active interactions do
something. Any button that can
be pushed should do something when pushed. For example, if you have a
no-op callback for an `onPressed` event, change it to show a `SnackBar`
on the screen explaining which control you just pushed.
- **Screen reader testing**. The screen reader should be able to
describe all controls on the page when you tap on them, and the
descriptions should be intelligible. Test your app with [TalkBack][]
(Android) and [VoiceOver][] (iOS).
- **Contrast ratios**. We encourage you to have a contrast ratio of at
least 4.5:1 between controls or text and the background, with the
exception of disabled components. Images should also be vetted for
sufficient contrast.
- **Context switching**. Nothing should change the user's context
automatically while typing in information. Generally, the widgets
should avoid changing the user's context without some sort of
confirmation action.
- **Tappable targets**. All tappable targets should be at least 48x48
pixels.
- **Errors**. Important actions should be able to be undone. In fields
that show errors, suggest a correction if possible.
- **Color vision deficiency testing**. Controls should be usable and
legible in colorblind and grayscale modes.
- **Scale factors**. The UI should remain legible and usable at very
large scale factors for text size and display scaling.
## More information
For more information, particularly about how to configure
the semantics tree,
see the following articles written by community members:
* [A deep dive into Flutter's accessibility widgets][]
* [Semantics in Flutter][]
[CRPD]: https://www.un.org/development/desa/disabilities/convention-on-the-rights-of-persons-with-disabilities/article-9-accessibility.html
[A deep dive into Flutter's accessibility widgets]: {{site.medium}}/flutter-community/a-deep-dive-into-flutters-accessibility-widgets-eb0ef9455bc
[Accessibility Scanner]: https://play.google.com/store/apps/details?id=com.google.android.apps.accessibility.auditor&hl=en
[**Large fonts**]: #large-fonts
[**Screen readers**]: #screen-readers
[Semantics in Flutter]: https://www.didierboelens.com/2018/07/semantics/
[`Semantics` widget]: {{site.api}}/flutter/widgets/Semantics-class.html
[**Sufficient contrast**]: #sufficient-contrast
[TalkBack]: https://support.google.com/accessibility/android/answer/6283677?hl=en
[W3C recommends]: https://www.w3.org/TR/UNDERSTANDING-WCAG20/visual-audio-contrast-contrast.html
[VoiceOver]: https://www.apple.com/lae/accessibility/iphone/vision/
---
title: Sessão de Encerramento
day: 2020-11-11
start_time: 17h50
end_time: 18h20
---
#### [DefaultEcs](./DefaultEcs.md 'DefaultEcs')
### [DefaultEcs](./DefaultEcs.md#DefaultEcs 'DefaultEcs')
## EntitySetBuilderExtension `type`
Provides a set of static methods to more easily create rules on a [EntitySetBuilder](./DefaultEcs-EntitySetBuilder.md 'DefaultEcs.EntitySetBuilder') instance.
### Methods
- [WithAny<T1, T2>(DefaultEcs.EntitySetBuilder)](./DefaultEcs-EntitySetBuilderExtension-WithAny-T1-_T2-(DefaultEcs-EntitySetBuilder).md 'DefaultEcs.EntitySetBuilderExtension.WithAny<T1, T2>(DefaultEcs.EntitySetBuilder)')
- [WithAny<T1, T2, T3>(DefaultEcs.EntitySetBuilder)](./DefaultEcs-EntitySetBuilderExtension-WithAny-T1-_T2-_T3-(DefaultEcs-EntitySetBuilder).md 'DefaultEcs.EntitySetBuilderExtension.WithAny<T1, T2, T3>(DefaultEcs.EntitySetBuilder)')
- [WithAny<T1, T2, T3, T4>(DefaultEcs.EntitySetBuilder)](./DefaultEcs-EntitySetBuilderExtension-WithAny-T1-_T2-_T3-_T4-(DefaultEcs-EntitySetBuilder).md 'DefaultEcs.EntitySetBuilderExtension.WithAny<T1, T2, T3, T4>(DefaultEcs.EntitySetBuilder)')
- [WithAny<T1, T2, T3, T4, T5>(DefaultEcs.EntitySetBuilder)](./DefaultEcs-EntitySetBuilderExtension-WithAny-T1-_T2-_T3-_T4-_T5-(DefaultEcs-EntitySetBuilder).md 'DefaultEcs.EntitySetBuilderExtension.WithAny<T1, T2, T3, T4, T5>(DefaultEcs.EntitySetBuilder)')
## v1.0.5
### Bug
- Avoid node.save to prevent incomplete attribute collections
- `dist-ossec-keys.sh` should be sorted for idempotency
### Improvement
- Ability to disable ossec configuration template
- Support for encrypted databags
- Support for environment-scoped searches
- Support for multiple email_to addresses
## v1.0.4
### Bug
- [COOK-2740]: Use FQDN for a client name
### Improvement
- [COOK-2739]: Upgrade OSSEC to version 2.7
## v1.0.2
- [COOK-1394] - update ossec to version 2.6
## v1.0.0
- Initial/current release