CA certificate somehow not accepted by browsers on Android

This is a follow-up question related to this answer.

In short: I am importing the root CA certificate into the Android system via

Settings -> Security -> Trusted Credentials -> install from SD

The path differs slightly between Android versions.

Then I point any browser (tested with Firefox, Chrome, and Opera) at the secure (JavaScript-based) resource and receive a socket error. The resource is an index.html with JS WebSocket logic that connects securely to a Mosquitto broker.
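For reference, the relevant part of the page looks roughly like this (a simplified sketch, not the exact code; the port placeholder is the same one as below):

// Simplified sketch: the wss:// connection to the Mosquitto broker that fails
var socket = new WebSocket("wss://myserver:<mysecure port>");
socket.onopen  = function ()  { console.log("connected to broker"); };
socket.onerror = function (e) { console.log("socket error", e); };  // this is the error I see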

If, on the other hand, I point the browser at "https://myserver:<mysecure port>", I receive a privacy warning; I can continue unsafely, and this somehow sets a cookie or other stored exception, so I am then able to make subsequent requests through the JS-based secure resource.

How can I accomplish browser-based requests on Android without accepting unsafe privacy risks?

Go to Source
Author: woodz

Fitting an equation to a set of data

I have a set of data:

1.12158 0.42563 0.07
1.12471 0.42112 0.07
1.12784 0.41685 0.07
1.13097 0.41283 0.07
1.13409 0.40907 0.07
1.13722 0.40556 0.07
1.14035 0.40231 0.07
1.14348 0.39933 0.07
1.1466 0.39661 0.07
1.14973 0.39417 0.07
1.15285 0.39201 0.07
1.15598 0.39012 0.07
1.15911 0.38852 0.07
1.16224 0.3872 0.07
1.16536 0.38618 0.07
1.16849 0.38544 0.07
1.17162 0.385 0.07
1.17474 0.38486 0.07
1.17787 0.38543 0.07
1.181 0.38714 0.07
1.18413 0.38994 0.07
1.18725 0.39378 0.07
1.19038 0.39858 0.07
1.19351 0.40426 0.07
1.19664 0.41071 0.07
1.19976 0.41786 0.07

The first column is the x-axis and the second column is the y-axis.

I want to fit this data to the equation:

Ax^2 + Bx + C

and find out the values of A, B, and C.

What program can I use?

I would be very glad if you could show me how to do it.
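For illustration, this is the kind of fit I mean; a minimal sketch assuming a Python/NumPy environment would be acceptable (the file name and column handling are placeholders):

import numpy as np

# Load the three-column data; "data.txt" is a placeholder file name.
data = np.loadtxt("data.txt")
x, y = data[:, 0], data[:, 1]

# Fit y = A*x^2 + B*x + C; np.polyfit returns coefficients from the highest power down.
A, B, C = np.polyfit(x, y, 2)
print(A, B, C)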

Thanks.

Go to Source
Author: physu

NGINX serving by IP only, not by server name

A Raspberry Pi running (ARM) Arch sits behind my router's NAT. The Pi has a static IP of 192.168.1.6 and nginx serving on port 8093. (nginx also listens on port 80 for another web page.)

The server_name is “pi.hole”, and the source machine correctly resolves it to 192.168.1.6.

The interface opens successfully in my browser at “http://192.168.1.6:8093”.

A “404 Not Found” pops up when I open “pi.hole”.
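For completeness, this is roughly how I am testing (a sketch; the exact commands vary):

getent hosts pi.hole                # resolves to 192.168.1.6 as expected
curl -I http://192.168.1.6:8093/    # the interface answers here
curl -I http://pi.hole/             # this is where I get the 404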

Below is my /etc/nginx/nginx.conf:

user http;

worker_processes auto;

worker_rlimit_nofile 10240;

events {
    # Use epoll on Linux 2.6+
    use epoll;
    # Max number of simultaneous connections per worker process
    worker_connections 2048;
    # Accept all new connections at one time
    multi_accept on;
}

http {

    server_tokens off;

    sendfile on;

    tcp_nopush on;

    tcp_nodelay off;
    
    send_timeout 30;

    keepalive_timeout 60;

    keepalive_requests 200;
    reset_timedout_connection on;
    
    types_hash_max_size 2048;

    server_names_hash_bucket_size 64;

    include /etc/nginx/mime.types;
    default_type text/html;
    charset UTF-8;

    access_log /var/log/nginx/access.log;
    error_log /var/log/nginx/error.log;

    gzip on;

    gzip_min_length 1000;

    gzip_disable "msie6";
    gzip_proxied any;

    gzip_comp_level 5;
    
    gzip_types
        text/plain
        text/css
        application/json
        application/x-javascript
        text/xml
        application/xml
        application/xml+rss
        text/javascript
        application/javascript
        application/octet-stream;


    open_file_cache max=1000 inactive=20s;
    open_file_cache_valid    30s;
    open_file_cache_min_uses 2;
    open_file_cache_errors   on;
    include /etc/nginx/conf.d/*.conf;
    include /etc/nginx/sites-enabled/*;
}


and /etc/nginx/conf.d/pihole.conf:

# /etc/nginx/conf.d/pihole.conf
#
# https://github.com/pi-hole/pi-hole/wiki/Nginx-Configuration
#

server {
    listen 192.168.1.6:8093 ;

    root /srv/http/pihole;
    server_name pi.hole;
    autoindex off;

    proxy_intercept_errors on;
    error_page 404 /pihole/index.php;

    index pihole/index.php index.php index.html index.htm;

    location / {
        expires max;
        try_files $uri $uri/ /pihole/index.php?$args =404;
        add_header X-Pi-hole "A black hole for Internet advertisements";
    }

    location ~ .php$ {
        include fastcgi.conf;
        fastcgi_intercept_errors on;
        fastcgi_pass unix:/run/php-fpm/php-fpm.sock;
        #fastcgi_pass 127.0.0.1:9000;
        #fastcgi_param VIRTUAL_HOST "pi.hole";
        #fastcgi_param SERVER_NAME $host;
        fastcgi_param SERVER_NAME "pi.hole";
    }
    
    location /admin {
        root /srv/http/pihole;
        index index.php index.html index.htm;
        add_header X-Pi-hole "The Pi-hole Web interface is working!";
        add_header X-Frame-Options "DENY";
    }
    
    location ~ /.ttf {
        add_header Access-Control-Allow-Origin "*";
    }

    location ~ /admin/. {
        deny all;
    }

    location ~ /.ht {
        deny all;
    }
}

I tried adding the IP to the listen directive and playing with the fastcgi_param for the host name, to no avail.

The user running nginx is the same as the one running php-fpm and has ownership and read-write permissions on the document root and everything below it.

What am I doing wrong?

Go to Source
Author: superAnnoyingUser

Ubuntu 18.04 LTS apache2: parse PHP in HTML

On Ubuntu 16.04 LTS, the following lines, added to apache2.conf via IncludeOptional, prevent indexing and allow parsing of PHP in HTML files:

<Directory />
    Options FollowSymLinks
    AllowOverride None
</Directory>

<Directory /var/www/>
    Options -Indexes +FollowSymLinks
    AllowOverride None
    Order allow,deny
    Allow from all
</Directory>

AddType application/x-httpd-php .php .htm .html

The same code added on an 18.04 LTS server prevents indexing but won't parse PHP in HTML.
It's a long time since I set up the 16.04 server. Is there a change in 18.04, or am I forgetting something?
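In case it is relevant, these are the kinds of checks I can run on the 18.04 box (a sketch; the module name assumes the default php7.2 packages):

apachectl -M | grep -i php    # is a PHP handler module loaded at all?
a2query -m php7.2             # Ubuntu's view of the php7.2 module
apachectl configtest          # any complaints about the added lines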

Go to Source
Author: FRANK POLAN

Derive Date Spans from Start and End Dates in SQL Server table

I am using SQL Server 2016

I have a table that contains one row per month in which a patient is assigned to a particular provider.

A patient can be assigned to multiple providers during the year.

How can I derive date spans (StartDate & EndDate) to represent the time a patient was assigned to each provider?

My table looks like this:

+----------+---------+-----------+----------+
| Provider | Patient | StartDate | EndDate  |
+----------+---------+-----------+----------+
| 1922157  | 12345   | 20191201  | 20191231 |
| 1904176  | 12345   | 20191101  | 20191201 |
| 1904176  | 12345   | 20191001  | 20191101 |
| 1904176  | 12345   | 20190901  | 20191001 |
| 1904176  | 12345   | 20190801  | 20190901 |
| 1904176  | 12345   | 20190701  | 20190801 |
| 1904176  | 12345   | 20190601  | 20190701 |
| 1904176  | 12345   | 20190501  | 20190601 |
| 1904176  | 12345   | 20190401  | 20190501 |
| 1904176  | 12345   | 20190301  | 20190401 |
| 1904176  | 12345   | 20190201  | 20190301 |
| 1922157  | 12345   | 20190101  | 20190201 |
| 1922157  | 56789   | 20190101  | 20190201 |
+----------+---------+-----------+----------+

In this case, patient 12345 was assigned to two different providers: one for two months (January and then December), and the other for the rest of the year (ten months, February through November). Patient 56789 was assigned to only one provider (1922157) for one month (in December).

I'm trying to make my output look like the table below, but I am running into issues, I think because the patient is assigned to the same PCP during two different parts of the year. I tried using the LAG function, but I only get correct results for some cases, not all, such as this particular one.

+----------+---------+-----------+----------+
| Provider | Patient | StartDate | EndDate  |
+----------+---------+-----------+----------+
| 1922157  | 12345   | 20190101  | 20190201 |
| 1904176  | 12345   | 20190201  | 20191201 |
| 1922157  | 12345   | 20191201  | 20191231 |
| 1922157  | 56789   | 20191201  | 20191231 |
+----------+---------+-----------+----------+
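For reference, the general LAG-based direction I mean is something like this (a simplified sketch with a made-up table name, not my actual query):

-- Flag rows where the provider changes for a patient, turn the running count of
-- those flags into a span id, then aggregate each span into one row.
WITH ordered AS (
    SELECT Provider, Patient, StartDate, EndDate,
           CASE WHEN LAG(Provider) OVER (PARTITION BY Patient ORDER BY StartDate) = Provider
                THEN 0 ELSE 1 END AS NewSpan
    FROM   dbo.PatientProvider        -- hypothetical table name
), grouped AS (
    SELECT *,
           SUM(NewSpan) OVER (PARTITION BY Patient ORDER BY StartDate
                              ROWS UNBOUNDED PRECEDING) AS SpanId
    FROM   ordered
)
SELECT Provider, Patient,
       MIN(StartDate) AS StartDate,
       MAX(EndDate)   AS EndDate
FROM   grouped
GROUP  BY Patient, Provider, SpanId
ORDER  BY Patient, MIN(StartDate);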

Go to Source
Author: Juan Velez

Is a transaction time of <10ms for an SQL database viable? If so, under what conditions?

I appreciate this is a rather odd question, so I will try to clarify as much as possible. Please also be assured that this is a question purely for my own education; I'm not about to rush off and do crazy things in our software on the back of it.

I have a customer requirement for a transaction time of <10ms on a system that is based around an SQL database – in our specific implementation it is Oracle DB. I’m aware that this is not a useful or meaningful requirement, so with my business hat on I’ll be dealing with that. I fully expect that the requirement will be revised to something more useful and achievable.

However, I am curious on a technical level. Could you squeeze the transaction time on an SQL DB down below 10ms? Let's be generous and say this is pure SQL execution time: no comms, no abstraction layers, etc. Right now, running select 1 from dual on one of our systems gives a reported execution time of 10-20ms, and I'd assume that's about the simplest query possible. What, if anything, might you do to reduce that time (a) within Oracle/SQL or the server environment, or (b) by making a different tech choice? I'd assume a higher clock speed on the CPU might help, but I wouldn't bet on it.
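For reference, the 10-20ms figure is the kind of thing reported by a simple timed query, roughly like this in SQL*Plus (a sketch; not necessarily exactly how our number was captured):

SET TIMING ON
SELECT 1 FROM dual;
-- Elapsed: 00:00:00.01 to 00:00:00.02 on our system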

Go to Source
Author: SimonN

How to get the rewritten name of a custom taxonomy?

I have registered a custom post taxonomy with a rewritten name:

register_taxonomy('behold_gallery-albums-subject', 'behold_gallery',array(
    'hierarchical'              => true,
    'labels'                    => $labels,
    'show_ui'                   => true,
    'show_admin_column'         => true,
    'rewrite'                   => array( 'slug' => 'kategorie-galerii', 'with_front' => true ),
    'update_count_callback'     => '_update_post_term_count',
    'query_var'                 => true,
));

So the name of my custom taxonomy is ‘behold_gallery-albums-subject’, but it is rewritten to ‘kategorie-galerii’.

If I want to get the name of this taxonomy, I can use this:

$postsTaxonomy = get_sub_field('archive__post-choose-taxonomy'); // ACF
$postsTerm = get_term_by( 'slug', get_query_var( 'term' ), get_query_var( 'taxonomy' ) ); 
$postsTerm_id = $postsTerm->term_id;
$PostsTaxName = get_taxonomy($postsTaxonomy)->labels->name;

But how can I get the rewritten name, not the ‘original’ one?
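Is something like the following the right direction? (Just a sketch of what I am after; I am guessing that the rewrite arguments are exposed on the taxonomy object.)

$tax = get_taxonomy( $postsTaxonomy );
// Guess: the rewrite array passed to register_taxonomy() might be available here.
$rewrittenName = ( $tax && ! empty( $tax->rewrite['slug'] ) )
    ? $tax->rewrite['slug']   // hoping for 'kategorie-galerii'
    : $postsTaxonomy;         // falls back to 'behold_gallery-albums-subject'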

Thanks!

Go to Source
Author: Damian P.

Deploy a .NET application from Jenkins on Linux to an MS Azure web service

We have a .NET application that is deployed to Azure Web Services. Now it is time to create a deployment pipeline for it.

I want to know, at a high level, how a .NET application can be deployed to Azure from Linux servers based on CentOS.

Do we need the AZ CLI and the Azure .NET SDK installed on the Linux server to deploy it?
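For context, the rough shape I have in mind is the AZ CLI route, something like the following shell step in Jenkins (a sketch; every name, credential, and resource identifier here is a placeholder):

# Hypothetical Jenkins shell step (placeholders throughout)
dotnet publish -c Release -o ./publish
(cd publish && zip -r ../app.zip .)
az login --service-principal -u "$AZ_APP_ID" -p "$AZ_SECRET" --tenant "$AZ_TENANT"
az webapp deployment source config-zip \
    --resource-group my-rg --name my-webapp --src app.zip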

Go to Source
Author: pleyades

How to turn on GTID on MariaDB if already using binary/position replication?

MariaDB 10.3

Current master config:

[mysqld]
log-bin
server_id=1
binlog-format=mixed
expire_logs_days=10

Current slave config:

[mysqld]
log-bin
server_id=2
binlog-format=mixed
expire_logs_days=10

I can stop both the master and the slave if needed to change the replication method.

So my questions are:

  1. Do I need to stop replication and the application and then execute “CHANGE MASTER TO MASTER_USE_GTID = slave_pos” (see the sketch after this list)? Is that enough? Should I purge the binary logs?
  2. If so, how does the slave know where the master server is?
  3. Should I change binlog-format to “row”?
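The sequence I have in mind on the slave is roughly this (a sketch; is it sufficient?):

STOP SLAVE;
CHANGE MASTER TO MASTER_USE_GTID = slave_pos;
START SLAVE;
-- then check Using_Gtid / Gtid_IO_Pos:
SHOW SLAVE STATUS\G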

Go to Source
Author: GarfieldCat

How do I get the current GPS location programmatically in ASP.NET MVC (C#)?

I am working on an ASP.NET MVC 5 application for getting live GT06N GPS data from a hosted server and storing it in MySQL.

Let's say I have a GPS device and I want to use the GPS location in my application; what API / code should I use?

For example, if I have a travel agency with two cars, and both cars have a GT06N GPS device, the condition is: “if my GPS-enabled car stops for more than 2.5 minutes on the road, I need a single ping from my GPS device.”

Go to Source
Author: Chandan

Installing a Certificate Authority

I have used this guide to install a two-tier PKI on Windows Server 2019 in my lab:
https://social.technet.microsoft.com/wiki/contents/articles/15037.ad-cs-step-by-step-guide-two-tier-pki-hierarchy-deployment.aspx

I know that guide is pretty old, but it seems to have been updated fairly recently. Some steps are slightly different in the newer Windows version, but nothing that can't be figured out. The only deviation from the guide is that I have combined the roles of the issuing CA (CA02) and the CDP/AIA publisher (SRV1). Other than that, I followed the guide step by step (or at least I think I have; there are a couple of parts that are not very clear). I have redone the whole thing a couple of times and keep winding up with the same issue:

I cannot validate the LDAP connections for AIA, CDP, or DeltaCRL in PKIView. I also notice that the share location I create during the initial setup of the issuing server has somehow changed to the CertEnroll folder under certsrv in system32, rather than C:\CertEnroll where I created it. How does that happen?!? I am not sure at what point in the process it changes; I only noticed it while troubleshooting the PKIView failure after completing all the setup steps. I am obviously most concerned with the PKIView failure, but I am really curious as to why that share location changes. Thanks for reading.

[Screenshot of PKIView]

Go to Source
Author: RobS

Help please – installed Ubuntu 18.04 LTS – can't get Ethernet/Wi-Fi working

I have a Dell OptiPlex 9010 desktop PC. I have just installed Ubuntu 18.04 alongside Windows 10, on my hard disk in a separate partition. All is OK except that I cannot establish an internet connection.
When I go to Settings > Wi-Fi, it says no Wi-Fi adapter was found. I have tried running an Ethernet cable from my laptop to the desktop, but it will not establish a connection. I then tried tethering my mobile phone, and this allowed me to go online; however, I still can't figure out how to install the drivers!

Using my laptop (a Packard Bell), I have looked online for drivers for these devices, because I am assuming the device drivers are not present. I downloaded from an Intel website what I think are the related drivers; the name of the file to be extracted was e1000e-3.8.4.tar.gz.
I have extracted these files to a USB stick, but I have no idea 1) whether this is the probable fix, and 2) if it is, how I can get these drivers onto my desktop (the OptiPlex 9010).

I am very new to Linux. Following some suggestions online, I used the terminal command ‘lspci’, and it showed the following, which I believe to be the Wi-Fi/Ethernet-related devices on the system:

  1. Ethernet controller [0200]: Intel Corp. 82579LM Gigabit Network Connection.
    Subsystem: Dell 82579LM Gigabit Network Connection.
    Kernel driver in use: e1000e.
    Kernel modules: e1000e
  2. Ethernet controller [0200]: Intel Corp. 82572EI Gigabit Ethernet Controller.
    Subsystem: Intel Corp. PRO/1000PT Server Adapter [8086:1082]
    Kernel driver in use: e1000e.
    Kernel modules: e1000e

I have two ethernet cards in my desktop.

  1. Intel PRO/1000PT server adapter 82572EI and…
  2. Intel 82579LM gigabit network connection…
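In case it helps, these are the kinds of checks I can run from a terminal (roughly, as suggested in the guides I found):

lspci -nnk | grep -iA3 ethernet   # shows both controllers and which driver is bound
ip link                           # lists the network interfaces the kernel exposes
dmesg | grep -i e1000e            # any driver messages or errors at boot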

Any help would be very much appreciated. Thanks in advance.

Go to Source
Author: zatta303

Trying to install a Wi-Fi adapter on Kali Linux manually

I have a Huawei MateBook D with VirtualBox and Kali Linux installed.
I bought a wireless adapter (TL-WN725N), and although it appears in my VM, it does not appear in my Kali.
I am trying to install it manually using these instructions (written out as commands after the list):
1) You will need to blacklist another driver in order to use this one.
2) echo "blacklist r8188eu" > /etc/modprobe.d/realtek.conf
3) make && make install
4) Reboot in order to blacklist the old driver and load the new driver/module.
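Written out as commands, I understand the full sequence like this (the source directory is my guess at where the driver's Makefile lives, which may be exactly where I am going wrong):

cd ~/rtl8188eu                      # driver source directory (my assumption)
echo "blacklist r8188eu" | sudo tee /etc/modprobe.d/realtek.conf
make && sudo make install
sudo reboot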

However, when I am on step 3, this error appears:
make: *** No rule to make target 'install'.  Stop.

Then it says: make: *** No targets specified and no makefile found.  Stop.

Any suggestions?

Go to Source
Author: Pedro