It is not Mobile First, Voice First or AI First, it is just
Context First
"Kommando konnte nicht korrekt an den Master Server gesendet werden" (the command could not be sent correctly to the master server)
The problem is the mediDOK Bridge-Server service (medatiXX edition), which has to be restarted. You can do this either manually or with a simple script.
Create a new file and name it: medidok_reparieren.cmd
Open the file in an editor and paste in the following lines:
net stop BridgeServer
net start BridgeServer
Save it, and you are done. From now on you can fix the error at any time by double-clicking the file.
That's all.
You can see the error with this query:
index=_internal sourcetype=scheduler <YOUR-DATAMODEL>
Update everything
sudo apt-get update && sudo apt-get upgrade && sudo apt-get install git
Install MySQL
sudo apt-get install mysql-server
Secure it
mysql_secure_installation
(remove anonymous users, disallow remote root login, drop the test database, etc.)
Log in to MySQL
mysql -u root -p
In MySQL Shell
CREATE DATABASE gogs;
GRANT ALL PRIVILEGES ON gogs.* TO 'gogs'@'localhost' IDENTIFIED BY '<>';
FLUSH PRIVILEGES;
quit;
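The three statements above can also be prepared non-interactively. A minimal sketch, using 'changeme' as a placeholder password (substitute your own):

```shell
# Write the setup statements to a file; 'changeme' is a placeholder
# password -- replace it before running.
cat > gogs-setup.sql <<'EOF'
CREATE DATABASE gogs;
GRANT ALL PRIVILEGES ON gogs.* TO 'gogs'@'localhost' IDENTIFIED BY 'changeme';
FLUSH PRIVILEGES;
EOF

# Feed the file into MySQL (prompts for the root password):
# mysql -u root -p < gogs-setup.sql
```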
Create a Service User to run Gogs and store repositories (optional)
sudo useradd --system --create-home git
sudo passwd git
Log in as the git user (optional)
su git
cd ~
Download Gogs
Go to the Gogs website and copy the download link for the current version.
wget https://dl.gogs.io/0.11.34/linux_amd64.zip
unzip linux_amd64.zip
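Since the download link changes with every release, it can help to keep the version number in a variable; a small sketch (0.11.34 is the version used above and probably outdated by now):

```shell
# Build the Gogs download URL from a single version variable so that
# upgrades only require one change.
VERSION=0.11.34
URL="https://dl.gogs.io/${VERSION}/linux_amd64.zip"
echo "$URL"

# Download and unpack:
# wget "$URL"
# unzip linux_amd64.zip
```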
Systemd
nano ./gogs/scripts/systemd/gogs.service
Edit the file like this (for the git user; adjust the paths if you installed Gogs in another directory):
[Unit]
Description=Gogs
After=syslog.target
After=network.target
After=mysqld.service

[Service]
# Modify these two values and uncomment them if you have
# repos with lots of files and get an HTTP error 500 because
# of that
###
#LimitMEMLOCK=infinity
#LimitNOFILE=65535
Type=simple
User=git
Group=git
WorkingDirectory=/home/git/gogs
ExecStart=/home/git/gogs/gogs web
Restart=always
Environment=USER=git HOME=/home/git
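Instead of editing the unit by hand, you can also generate it in one step; a sketch that writes the file to the current directory first, assuming the paths used above:

```shell
# Generate the Gogs systemd unit non-interactively; paths assume the
# git user's home directory as in the guide.
cat > gogs.service <<'EOF'
[Unit]
Description=Gogs
After=syslog.target
After=network.target
After=mysqld.service

[Service]
Type=simple
User=git
Group=git
WorkingDirectory=/home/git/gogs
ExecStart=/home/git/gogs/gogs web
Restart=always
Environment=USER=git HOME=/home/git
EOF

# Then install it:
# sudo cp gogs.service /etc/systemd/system/gogs.service
```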
Copy and enable
sudo cp ./gogs/scripts/systemd/gogs.service /etc/systemd/system/gogs.service
sudo systemctl enable gogs
Start Gogs
sudo systemctl start gogs
Finally, open http://<<yourserver>>:3000 and enter the required data.
At last, register a new user. The first registered user becomes the admin by default.
Apache2 Reverse Proxy (Optional)
sudo apt-get install apache2
sudo a2enmod proxy
sudo a2enmod proxy_http
Edit conf
sudo nano /etc/apache2/sites-available/000-default.conf
Set reverse proxy
<VirtualHost *:80>
ProxyPreserveHost On
ProxyRequests off
ProxyPass / http://127.0.0.1:3000/
ProxyPassReverse / http://127.0.0.1:3000/
</VirtualHost>
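The vhost can likewise be generated instead of edited by hand; a minimal sketch that writes the file locally first (port 3000 is the Gogs default from above):

```shell
# Generate the reverse-proxy vhost for Gogs.
cat > 000-default.conf <<'EOF'
<VirtualHost *:80>
    ProxyPreserveHost On
    ProxyRequests off
    ProxyPass / http://127.0.0.1:3000/
    ProxyPassReverse / http://127.0.0.1:3000/
</VirtualHost>
EOF

# Then install it:
# sudo cp 000-default.conf /etc/apache2/sites-available/000-default.conf
```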
Restart Apache2
sudo systemctl restart apache2
Fin.
wget -q -O - https://pkg.jenkins.io/debian/jenkins-ci.org.key | sudo apt-key add -
echo deb https://pkg.jenkins.io/debian-stable binary/ | sudo tee /etc/apt/sources.list.d/jenkins.list
sudo apt-get update
sudo apt-get install jenkins
sudo systemctl start jenkins
sudo cat /var/lib/jenkins/secrets/initialAdminPassword
Do not forget to install the simple theme plugin.
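Right after installation the secrets file may not exist yet because Jenkins is still starting. A small helper sketch that waits for a file and then prints it (the function name is my own; the path in the usage comment is the Jenkins default):

```shell
# Wait until a file exists (up to a number of one-second tries),
# then print its contents.
wait_and_cat() {
    file=$1
    tries=${2:-30}
    i=0
    while [ "$i" -lt "$tries" ]; do
        if [ -f "$file" ]; then
            cat "$file"
            return 0
        fi
        sleep 1
        i=$((i + 1))
    done
    echo "file not found: $file" >&2
    return 1
}

# Usage on the Jenkins host (run as root):
# wait_and_cat /var/lib/jenkins/secrets/initialAdminPassword
```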
If a build fails with
/usr/include/curlpp/cURLpp.hpp:34:23: fatal error: curl/curl.h: No such file or directory
install the curl development headers:
sudo apt-get install libcurl3-dev libcurl3
Just a list of all Azure Services (Jan. 2017)
Advisor
AKS
Analysis Services
API Management
App Service
Application Gateway
Application Insights
Authorization
Automation
Batch AI
Batch Management
Batch Service
Billing
CDN
Cognitive Services
Compute
Consumption
Container Instances
Container Registry
Container Service
Cosmos DB
Cosmos DB Resource Provider
Data Catalog
Data Factory
Data Lake Analytics
Data Lake Store
Dev Test Labs
DNS
Event Grid
Event Hubs
ExpressRoute
Graph RBAC
HDInsight
HDInsight Spark
Intune
IoT Hub
IoT Hub Device Provisioning Service
Key Vault
Load balancer
Location Based Services Functional API
Location Based Services Management API
Log Analytics
Logic Apps
Machine Learning
Media Services
Monitor
MySQL
Network Gateway
Network Watcher
Networking Operations
Notification Hubs
PostgreSQL
Power BI Embedded
Power BI Workspace Collections
Recovery Services
Recovery Services – Backup
Recovery Services – Site Recovery
Redis Cache
Relay
Reserved VM Instances
Resource Health
Resource Management
Scheduler
Search Management
Search Service
Server Management
Service Bus
Service Fabric
Service Map
SQL Database
Storage Import-Export
Storage Resource Provider
Storage Services
StorSimple
Stream Analytics
Time Series Insights
Time Series Insights Management
Traffic Manager
Virtual Networks
Problem
Solution
Install the AzureRM PowerShell module as described here:
Install-Module AzureRM -AllowClobber
Set-ExecutionPolicy Unrestricted
Import-Module AzureRM
Log in to Azure using PowerShell:
Login-AzureRmAccount
Download the PowerShell script from s_lapointe and install it, or just execute it:
$ErrorActionPreference = 'Stop'

if(-not (Get-Module AzureRm.Profile)) {
    Import-Module AzureRm.Profile
}

$azureRmProfileModuleVersion = (Get-Module AzureRm.Profile).Version

# refactoring performed in AzureRm.Profile v3.0 or later
if($azureRmProfileModuleVersion.Major -ge 3) {
    $azureRmProfile = [Microsoft.Azure.Commands.Common.Authentication.Abstractions.AzureRmProfileProvider]::Instance.Profile
    if(-not $azureRmProfile.Accounts.Count) {
        Write-Error "Ensure you have logged in before calling this function."
    }
} else {
    # AzureRm.Profile < v3.0
    $azureRmProfile = [Microsoft.WindowsAzure.Commands.Common.AzureRmProfileProvider]::Instance.Profile
    if(-not $azureRmProfile.Context.Account.Count) {
        Write-Error "Ensure you have logged in before calling this function."
    }
}

$currentAzureContext = Get-AzureRmContext
$profileClient = New-Object Microsoft.Azure.Commands.ResourceManager.Common.RMProfileClient($azureRmProfile)
Write-Debug ("Getting access token for tenant " + $currentAzureContext.Subscription.TenantId)
$token = $profileClient.AcquireAccessToken($currentAzureContext.Subscription.TenantId)
$token.AccessToken
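Once the script has printed the token, you can use it against the Azure Resource Manager REST API from any tool; a hedged sketch with curl (AZURE_TOKEN is an assumed environment variable, and the command is only printed here as a dry run):

```shell
# Build an ARM REST call that lists subscriptions using a previously
# acquired access token. Dry run: the command is printed, not executed.
TOKEN="${AZURE_TOKEN:-<paste-token-here>}"
CMD="curl -s -H 'Authorization: Bearer ${TOKEN}' 'https://management.azure.com/subscriptions?api-version=2016-06-01'"
echo "$CMD"
```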
All credits to s_lapointe for this script. Thank you.
First, check which table uses the most space
select
    o.name,
    max(s.row_count) as 'Rows',
    sum(s.reserved_page_count) as 'PageCount',
    sum(s.reserved_page_count) * 8.0 / (1024 * 1024) as 'GB',
    (8 * 1024 * sum(s.reserved_page_count)) / (max(s.row_count)) as 'Bytes/Row'
from sys.dm_db_partition_stats s, sys.objects o
where o.object_id = s.object_id
group by o.name
having max(s.row_count) > 0
order by GB desc
The result should look like this:
As you can see, two tables use more than 10 GB each. You will also notice that the PageCount is very high because the tables are fragmented.
To defragment a table and enable row compression at the same time, you can use this:
ALTER TABLE <<TABLENAME>> REBUILD PARTITION = ALL
WITH (DATA_COMPRESSION = ROW)
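If several tables need the treatment, the statements can be generated in a loop; a small sketch where the table names are placeholders for your own large tables (review the output, then run it with sqlcmd or SSMS):

```shell
# Emit one rebuild-with-row-compression statement per table name.
for TABLE in BigTable1 BigTable2; do
    printf 'ALTER TABLE %s REBUILD PARTITION = ALL WITH (DATA_COMPRESSION = ROW);\n' "$TABLE"
done > rebuild.sql

# Review rebuild.sql, then run it, e.g.:
# sqlcmd -S <server> -d <database> -i rebuild.sql
```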