Search Results

Search found 14544 results on 582 pages for 'ssh config'.

Page 388 of 582

  • Configure firewalld for OpenVPN (server-bridge) in Fedora 20

    - by rsc1975
    I've installed an OpenVPN server (server-bridge) on Fedora 20, but I cannot get it to work; I'm almost certain it's a firewall issue. I'm trying to connect from an OS X client. Before the bridge is configured on the server I can connect (just to the VPN server, without access to anything else), but once I configure the bridge interface (using this script) I cannot connect any more. I've configured it as a server-bridge, following the HOW-TOs from Fedora and OpenVPN Ethernet-Bridge. The firewall config there is explained using iptables:

        iptables -A INPUT -i tap0 -j ACCEPT
        iptables -A INPUT -i br0 -j ACCEPT
        iptables -A FORWARD -i br0 -j ACCEPT

    However, Fedora 20 ships with firewalld by default, so can anyone tell me the equivalent commands using firewall-cmd? I've read the firewalld guide, but it's not clear to me how to achieve this (I'm a developer, not a sysadmin). I know I could install iptables instead, but I want it to work with firewalld.
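
    A hedged sketch of what the firewall-cmd equivalents might look like, either as direct rules (a near-literal translation of the iptables lines) or by placing both interfaces in the trusted zone; the interface names are taken from the question and the commands are untested on this setup:

        # near-literal translation using firewalld direct rules (made permanent)
        firewall-cmd --permanent --direct --add-rule ipv4 filter INPUT 0 -i tap0 -j ACCEPT
        firewall-cmd --permanent --direct --add-rule ipv4 filter INPUT 0 -i br0 -j ACCEPT
        firewall-cmd --permanent --direct --add-rule ipv4 filter FORWARD 0 -i br0 -j ACCEPT
        firewall-cmd --reload

        # simpler alternative: trust both interfaces entirely
        firewall-cmd --permanent --zone=trusted --add-interface=tap0
        firewall-cmd --permanent --zone=trusted --add-interface=br0
        firewall-cmd --reload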

    Read the article

  • How to stop Nginx sending static file requests to the CakePHP app controller when running Cake in a subdirectory?

    - by robotmay
    I'm trying to run a CakePHP app from within a subfolder on Nginx, but the static files are not being found and are instead being passed to the app controller. Here's my current config:

        location /uniquetv {
            index index.php index.html;
            if (-f $request_filename) { break; }
            if (!-e $request_filename) {
                rewrite ^/uniquetv(.+)$ /uniquetv/webroot/$1 last;
                break;
            }
        }
        location /uniquetv/webroot {
            index index.php;
            if (!-e $request_filename) {
                rewrite ^/uniquetv/webroot/(.+)$ /uniquetv/webroot/index.php?url=$1 last;
                break;
            }
        }

    Any ideas? :)
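
    A hedged sketch of one way to keep static assets away from the front controller: serve them through a dedicated regex location (regex locations win over prefix locations in nginx). The extension list is an assumption and the paths assume the site root contains uniquetv/webroot, as the rewrites above imply; untested:

        # serve common static files straight from the Cake webroot,
        # falling through to 404 rather than to the app controller
        location ~ ^/uniquetv/(.+\.(?:css|js|png|jpg|jpeg|gif|ico))$ {
            try_files /uniquetv/webroot/$1 =404;
        }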

    Read the article

  • Redirect non-www ssl traffic to www ssl (apache)

    - by The NinjaSysadmin
    Hello, I'm attempting to set up a redirect which is failing, and for some reason I can't see why today. I have a vHost file within HTTPD that listens on standard port 80 and on port 443. I'm attempting to redirect https://domain.com/(.*) to https://www.domain.com/$1 so that the rest of the URL remains intact. My config is as follows:

        ServerName www.domain.com
        ServerAlias tempdomain.testdomain.co.uk
        ServerAlias domain.com

    The rewrite rule I'm using is:

        RewriteCond %{HTTP_HOST} ^domain.com$
        RewriteRule ^(.*)$ https://www.domain.com$1 [R=301,L]

    I've also tried removing the . and $ but nothing changes. When I visit the URL https://domain.com/secure.page?action=comp it doesn't redirect to https://www.domain.com/secure.page?action=comp. I do also have other SSL pages; the above was just an example. Can anyone point out my stupidity?
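
    A hedged sketch of a commonly used form of this redirect (hostname from the question; assumes mod_rewrite is active inside the port-443 vhost; untested here). Using %{REQUEST_URI} sidesteps the leading-slash difference between server/vhost context and .htaccess context, which is a frequent reason a rule like this appears to match but never fires:

        RewriteEngine On
        # match the bare domain case-insensitively and carry the full request path across
        RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
        RewriteRule ^ https://www.domain.com%{REQUEST_URI} [R=301,L]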

    Read the article

  • Apache: lists of all directives for a context?

    - by ajsie
    In the Apache online documentation each directive belongs to one or more contexts, e.g. server config, virtual host, directory, .htaccess and so on. I wonder if there is a list of all directives belonging to each context, e.g. a list of all directives valid in the virtual host context, so I know exactly which ones I can use? Also, where can I find the directives provided by Apache modules: is there a combined page, or does each module have its own documentation page (e.g. mod_rewrite)?

    Read the article

  • How to enable/disable authentication without password when executing commands as superuser?

    - by 44taka
    On a Fedora 19 system which I set up for somebody a while ago, I noticed that no authentication is required when commands are executed as the superuser. So, for example, when running Yum Extender, configuring the firewall or running some command with sudo in the terminal, I am not asked to provide a password. (With graphical applications the authentication dialog pops up for a few milliseconds and disappears.) For better security I would like to disable this automatic, authentication-less assumption of superuser privileges. I do not remember if or how I enabled authentication without a password. I might have enabled it for the convenience of the non-pro user of this machine, but I did not do any "fancy" things (like editing config files) to do so. I did not edit the sudoers file; I just checked that. I might have ticked a "Do not ask for password again" checkbox or something similar. Whatever I did, I would like to undo it and enforce authentication for superuser tasks again.
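
    A hedged sketch of where such a setting usually lives on Fedora, assuming it comes from sudo or polkit rather than anything exotic (paths are the stock locations):

        # look for NOPASSWD entries in sudo's configuration
        sudo grep -rn NOPASSWD /etc/sudoers /etc/sudoers.d/

        # look for local polkit rules that answer admin actions with YES
        sudo grep -rn 'polkit.Result.YES' /etc/polkit-1/rules.d/ 2>/dev/null

    Removing or commenting out whichever entry turns up should bring the password prompts back.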

    Read the article

  • Running a script on startup before X starts in Ubuntu 9.10

    - by Epcylon
    I have a script that I want to run at startup to switch X-configs depending on location, but I can't seem to find out where to put it in order to get it to run before X is started. This results in me having to restart X to get it to run the correct config. Currently, my script is located in /etc/init.d/whereami, with symlinks in /etc/rc[2-5].d/S25whereami. I was trying to find out when X is started, in case the problem is simply the 25, but I can't seem to find the answer... Any help is appreciated.
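
    One hedged thing worth checking: on Karmic, gdm is started by Upstart rather than by the SysV rc symlinks, so an S25 link may not order the script ahead of X at all. A sketch of an Upstart job that runs the existing init script before gdm comes up (assumes gdm is the display manager and its job is named gdm; untested):

        # /etc/init/whereami.conf
        description "switch X config based on location, before gdm starts"
        start on starting gdm
        task
        exec /etc/init.d/whereami start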

    Read the article

  • Tricking Linux apps about current time with environment variables

    - by geek
    Sometimes it is possible to trick a Linux app by calling it like this:

        HOME=/tmp/foo myapp

    This makes myapp think /tmp/foo is the home directory; it won't look up the real one via getpwent(). This is useful when myapp must be forced to dump some of its config files into a non-standard location other than ~. A similar trick:

        LANG=foo LC_ALL=bar myapp

    This is useful when myapp needs to be run once with a different locale, without making the change persistent via the export bash built-in or by editing /etc/profile. Is it possible to pull the same trick with time and date? The goal is to make an app use a time other than the system one; the end goal is timestamps in logs/commit messages that are not tied to the system time.
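
    For what it's worth, a hedged sketch using libfaketime (a separate package, not part of the base system), which intercepts time-related calls via LD_PRELOAD in much the same spirit as the HOME/LANG tricks above; the library path varies by distribution:

        # run one command with a faked clock via libfaketime's wrapper
        faketime '2010-01-01 12:00:00' date

        # or preload the library by hand (path is distribution-dependent)
        LD_PRELOAD=/usr/lib/faketime/libfaketime.so.1 FAKETIME='2010-01-01 12:00:00' date

        # git commits also honour their own environment variables
        GIT_AUTHOR_DATE='2010-01-01T12:00:00' GIT_COMMITTER_DATE='2010-01-01T12:00:00' git commit -m "msg"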

    Read the article

  • Cross Domain access of self hosted WCF services?

    - by Macnique
    I have developed a WPF application which communicates with a set of self-hosted WCF services in the same domain, using the following setup in the config files:

        <security mode="Transport">
            <transport clientCredentialType="Windows" protectionLevel="EncryptAndSign" />
            <message clientCredentialType="Windows" />
        </security>

    I have now hosted the services on a server in a different domain. I can get communication working by setting the security mode to "None", which is not ideal. Is there any other setting I can use for cross-domain communication, or do I have to work with trusted certificates? I would be glad if someone could guide me, because all my searches on Google lead to Silverlight + crossdomain.xml + WCF, and I haven't seen anything covering WPF in a cross-domain environment.
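
    A hedged sketch of what certificate-based transport security might look like when Windows credentials cannot cross the domain boundary; the certificate store values are placeholders and the snippet is untested:

        <security mode="Transport">
            <!-- authenticate the client with a certificate instead of a Windows account -->
            <transport clientCredentialType="Certificate" protectionLevel="EncryptAndSign" />
        </security>

        <!-- the service side then needs a certificate configured in its behavior -->
        <serviceCredentials>
            <serviceCertificate storeLocation="LocalMachine" storeName="My"
                                x509FindType="FindBySubjectName" findValue="myservice" />
        </serviceCredentials>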

    Read the article

  • Limiting to the ServerName in Apache2

    - by David
    I have 2 sites defined in my Apache2. Each one has a ServerName. For example: server 1 (first in sites-enabled) responds to www.example.com, and server 2 (second in sites-enabled) responds to www.example2.com. The problem is that when I type the server's IP address in the URL, the first site responds. How can I limit each vhost to responding only to its own ServerName? I would like to block plain IP requests; if that is not possible, I would like the second site to respond to them rather than the first. I cannot change the order, because there are aliases defined in the second site that would override the first site's config.
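
    A hedged, untested sketch of one common approach (the ServerName and DocumentRoot below are placeholders): add a catch-all virtual host that sorts first in sites-enabled. Apache hands any request whose Host header (or bare IP) matches no ServerName/ServerAlias to the first vhost for that address, so the catch-all absorbs IP-only requests without touching the order of the real sites.

        # e.g. sites-available/000-catchall, enabled so it sorts before the real sites
        <VirtualHost *:80>
            ServerName catchall.invalid
            DocumentRoot /var/www/empty
            <Location />
                Order allow,deny
                Deny from all
            </Location>
        </VirtualHost>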

    Read the article

  • Synergy: How to change the screen positioning configuration while it is running?

    - by Brandon
    I am using Synergy between two MacBooks (10.6 & 10.7), installed using Homebrew, version 1.3.6p2. I sit in various places in relation to the secondary laptop, so sometimes I want the other screen to be to the right of my main screen and sometimes to the left. How can I reconfigure this without shutting down synergys, changing the config file, and restarting the server and the client? Ideally it would be a terminal command, so I can easily assign it to a keyboard shortcut. Thanks!
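
    If the running server turns out not to be reconfigurable in place, a hedged fallback is a tiny wrapper that restarts synergys against one of two pre-written layouts; the restart is quick and the client reconnects on its own (config paths are placeholders):

        #!/bin/sh
        # usage: synergy-side left|right
        SIDE="$1"
        killall synergys 2>/dev/null
        synergys -c "$HOME/.synergy-${SIDE}.conf"

    Bound to a keyboard shortcut, this gets close to the one-command switch the question asks for.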

    Read the article

  • Using a BTHomeHub as a wireless booster

    - by Ploo
    I currently have a BT HomeHub downstairs, but the wireless signal isn't strong enough on the second floor. I have an old BT HomeHub from a previous contract that I wanted to set up upstairs, and I currently have an ethernet cable leading up to the old router. I plugged my laptop into the old router in the hope of configuring it, but I cannot reach api.home, which is the address of the router. I cannot connect to the router wirelessly either, since the password was changed and I'd have to change it back through its config pages. At this point I am completely stumped; any ideas?

    Read the article

  • MySQL won't stop, mysqld_safe appears in top

    - by power4
    My server (CentOS) hosts lots of websites, which collect data from lots of sources via cron. The MySQL config is the default. Recently, PHP started failing to communicate with MySQL. At first I just restarted the server, but even after the restart PHP still failed to connect. I've tried:

        ps ax | grep mysql

    and then:

        kill -9 ####        # I've also tried killall -9 ####

    but that fails: ps ax | grep mysql shows the killed process ID is still there. Then:

        service mysqld start        # I've also tried /etc/init.d/mysqld start

    which replies "Timeout error occurred trying to start MySQL Daemon." When I run top, mysqld_safe is at the top with about 50% CPU usage. I don't know the size of all the databases. I'm really confused.
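
    A hedged sketch of the usual next steps on a stock CentOS install: stop the mysqld_safe wrapper first (it respawns mysqld if only the daemon is killed), check whether the surviving process is stuck in uninterruptible I/O (which even kill -9 cannot interrupt), and read the error log for the reason startup times out:

        # STAT column: Z = zombie, D = stuck in uninterruptible disk I/O
        ps axo pid,stat,cmd | grep -i mysql

        # stop the wrapper and then the daemon it supervises
        killall mysqld_safe
        killall mysqld

        # default error log location for the CentOS mysql-server package
        tail -n 100 /var/log/mysqld.log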

    Read the article

  • Hosting website when port 80 is taken?

    - by cinqoTimo
    A few months ago we purchased an R-HUB unit to replace WebEx for remote support. The device operates on port 80, which doesn't appear to be configurable. I know that in IIS you can specify a port other than 80, but the problem is the port forwarding: on our router we map an incoming port to a forwarded port, which then directs traffic to the node (web server). However, the incoming port for both the web server and the R-HUB is 80, and the router can only send it to one of them, so I can only get to the R-HUB, not the website. How can I expose both devices? Host headers? DNS config?
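
    One hedged option (hostnames and internal addresses below are placeholders) is to forward external port 80 to a single reverse proxy and let it route by Host header to either the R-HUB or the IIS site; note this only works cleanly if the R-HUB's traffic on port 80 is ordinary HTTP.

        # nginx (or any reverse proxy) as the single port-80 endpoint behind the router
        server {
            listen 80;
            server_name rhub.example.com;
            location / { proxy_pass http://192.168.1.10; }   # R-HUB's internal address
        }
        server {
            listen 80;
            server_name www.example.com;
            location / { proxy_pass http://192.168.1.20; }   # IIS web server
        }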

    Read the article

  • Apache RewriteRule ignoring RewriteCond?

    - by winsmith
    So I have Apache running on OS X Server 10.4 (don't ask) with multiple sites. In 0002_[example.com].conf I have this bit of config:

        <Directory "/Library/WebServer/Documents/secret/">
            RewriteEngine On
            RewriteCond %{REMOTE_ADDR} !^137\.250\.
            RewriteRule .* /messages/secret.html
        </Directory>

    However, in this configuration the RewriteCond always seems to evaluate to false: the secret directory gets shown even when the client's address does not begin with 137.250. If I change the config to this:

        <Directory "/Library/WebServer/Documents/secret/">
            RewriteEngine On
            RewriteRule .* /messages/secret.html
            RewriteCond %{REMOTE_ADDR} !^137\.250\.
        </Directory>

    the condition either does not get evaluated at all or always evaluates to true; either way, all clients get blocked. What am I doing wrong?
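
    One certain detail: a RewriteCond applies only to the RewriteRule that immediately follows it, so the first ordering (condition before rule) is the correct one, and in the second form the condition is simply ignored. Since the end goal is plain IP-based access control, a hedged alternative (a sketch for the Apache 1.3/2.0-era httpd shipped with OS X Server 10.4; untested) is to drop mod_rewrite here entirely:

        <Directory "/Library/WebServer/Documents/secret/">
            Order Deny,Allow
            Deny from all
            Allow from 137.250.
            # show the notice page instead of a bare 403 (path taken from the question)
            ErrorDocument 403 /messages/secret.html
        </Directory>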

    Read the article

  • MSSQL 2012 Error 26 and remote connection

    - by Rayfloyd
    I'm trying to set up MSSQL 2012 for a school project, and I need to be able to connect to it remotely as my teammates will also be working on it. I did a clean install of SQL Server 2012 Express. Knowing that remote connections don't work straight out of the box, I tweaked the settings that the internet says need tweaking. What I did:

        1. Made sure remote connections were allowed.
        2. Enabled TCP/IP.
        3. Removed the 0s from Dynamic Ports and set TCP Port to 1433.
        4. Enabled Named Pipes.
        5. Created inbound and outbound firewall rules for TCP port 1433 and UDP port 1434.
        6. Port-forwarded 1433 and 1434 to my "server".
        7. Made sure the machine is pingable.
        8. Enabled SQL Server authentication.
        9. Restarted the computer so the configuration changes are saved.

    Yet whenever I try to connect with Management Studio from another computer, using myusername.dyndns.org\SQLEXPRESS, I get error 26. I have been searching for different solutions for 3 hours with no luck.
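
    For what it's worth, error 26 specifically means the client could not resolve the named instance via the SQL Server Browser service (UDP 1434), so a hedged thing to try is bypassing the Browser altogether: since TCP 1433 is already pinned and forwarded, connect by port instead of by instance name (the login below is a placeholder).

        # server name to enter in Management Studio ("host,port" skips the Browser lookup):
        #     myusername.dyndns.org,1433
        # quick command-line check with sqlcmd and SQL authentication:
        sqlcmd -S tcp:myusername.dyndns.org,1433 -U myuser -P mypassword -Q "SELECT @@VERSION"

    If the instance-name form is still wanted, make sure the SQL Server Browser service itself is running on the server.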

    Read the article

  • Ubuntu 9.10 Karmic, NVIDIA Quadro NVS 280 PCI, Eizo S1921 Dual Screen (TwinView) Slow Window Draws

    - by Spasm
    I have been following this tutorial to get dual monitors working on my box: http://www.dwasifar.com/?p=862&cpage=1#comment-5727 It works! However, whenever I move a window, the redraw of that window takes 3-8 seconds; even moving the window takes the same amount of time, and the windows themselves do not respond. Is this being done in software rather than on the NVIDIA hardware? I have seen a few old threads but no relevant fixes; if anyone could suggest a fix I would very much appreciate it. I have tried:

        sudo nvidia-xconfig
        sudo nvidia-settings

    then configured TwinView and went to save the config, and got the error "Unable to parse xorg.conf file", with this in the console:

        VALIDATION ERROR: Data incomplete in file /etc/X11/xorg.conf.
        Undefined Device "null" referenced by Screen "Configured Screen Device"
        Segmentation fault
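
    The validation error suggests the existing xorg.conf has a Screen section pointing at a Device section that isn't defined. A hedged sketch of the minimal sections to add by hand (identifiers other than the quoted Screen name are placeholders; back up xorg.conf first), after which nvidia-settings should be able to parse and save the file. Slow, software-only window redraws are also a classic sign that the proprietary driver isn't actually being loaded by X:

        Section "Device"
            Identifier "Device0"
            Driver     "nvidia"
        EndSection

        Section "Screen"
            Identifier "Configured Screen Device"
            Device     "Device0"
        EndSection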

    Read the article

  • Tomcat 7: Include virtual host definitions from a directory?

    - by grog_7
    I'm setting up a new Tomcat 7.0.33 server, and I'm looking for a simple way to include a directory full of virtual host definition XML files for configuration. I've found that it's possible to do XML includes from server.xml, but this requires a line in server.xml for each file, and is no less work than just having the entire config right in server.xml. I'm looking for something similar to Apache's Include directive. The end result I'm looking for is to have a directory I can drop an XML file into that Tomcat will pick up on the next restart, without having to modify server.xml. Is there a way to do this? Thanks!
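
    For reference, the per-file include mentioned above is done with XML external entities; a hedged sketch (the filename and entity name are placeholders), which still costs one entity declaration per vhost file and so doesn't scale the way an Apache-style Include directory would:

        <?xml version="1.0" encoding="UTF-8"?>
        <!DOCTYPE Server [
            <!ENTITY vhost-example SYSTEM "vhosts/example.com.xml">
        ]>
        <Server port="8005" shutdown="SHUTDOWN">
            <!-- ... rest of the stock server.xml ... -->
            <Service name="Catalina">
                <Engine name="Catalina" defaultHost="localhost">
                    &vhost-example;
                </Engine>
            </Service>
        </Server>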

    Read the article

  • RAID 0 disk failure, how to recover the RAID?

    - by user7985
    The situation is this: a PC with 2 hard disks in a RAID 0 array. The controller electronics on one of the disks has failed, and I cannot find the same board for that disk (I have tried moving the board from the good disk onto the damaged one, and the damaged disk then works fine). I've made an image of it with "dd" on Linux onto a new hard drive (same size, not the same model), and now I get "Offline member" in the RAID config screen. Will I succeed in recovering the data stored on the drives? Any help, or any experience with this kind of problem, is welcome. And yes, I know it was stupid to put the disks in RAID 0 and store data on them :(
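
    If the controller keeps refusing the cloned disk, a hedged fallback (device names, device order and chunk size below are assumptions that must match the original array) is to bypass the BIOS RAID and assemble the stripe set read-only in Linux from the good disk plus the dd clone, then copy the data off:

        # build the stripe without writing any metadata; chunk size must match the controller's
        mdadm --build /dev/md0 --level=0 --raid-devices=2 --chunk=64 /dev/sdb /dev/sdc
        mount -o ro /dev/md0 /mnt/recovery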

    Read the article

  • nginx location pathing issue

    - by Michael Jefferys
    I've got a pretty much default sites-enabled setup in my nginx on Debian Squeeze, and I'm now trying to get it to serve my Munin graphs at myhost/munin/. Here's the location I've added to the config:

        location /munin {
            root /var/cache/munin/www/;
            index index.htm index.html;
        }

    And here is the error I receive:

        2012/07/09 23:52:03 [error] 3598#0: *13 "/var/cache/munin/www/munin/index.htm" is not found (2: No such file or directory), client: 93.*.*.*, server: , request: "GET /munin/ HTTP/1.1", host: ""

    This setup used to 'just work' in Apache. I'm new to nginx, so I'm a bit lost as to why it's adding the extra /munin when looking for the path. Any advice?
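
    The extra /munin comes from how root works: nginx appends the full request URI to the root path, so /munin/ maps to /var/cache/munin/www/munin/. A hedged sketch of the usual fix, using alias (which replaces the matched prefix instead of appending the whole URI); untested on this box:

        location /munin/ {
            alias /var/cache/munin/www/;
            index index.html index.htm;
        }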

    Read the article

  • OS won't boot during vs2010 install on vista 64

    - by Noam
    I am installing VS2010 on a Vista 64-bit machine. During the install it asked for a restart, and since then Vista won't boot. I tried restoring to the last known good configuration, which didn't help. I am only able to boot in safe mode with networking. When I did that, the Vista part of the install continued (the screen with the 3 out of 3 updates), but after restarting again it still fails to boot. HELP!!!

    Read the article

  • Can time skew on Windows be reduced to +/- 5ms?

    - by mbac32768
    A number of our Windows workstations, running ntpd, simply cannot keep time. Our Linux workstations and servers running the same ntpd config don't have this problem, they can stay within +/- 5ms of skew. The Windows hosts easily drift to seconds and sometimes minutes apart. This is a problem for us. The only common factor we have been able to isolate is that the hosts that can't keep time are running Windows. Is there something fundamentally impossible with what we're trying to do?
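
    A couple of hedged things to check on the Windows hosts: make sure the built-in Windows Time service isn't running alongside ntpd (the two will fight over the clock), and consider letting ntpd poll more often (e.g. adding minpoll 4 maxpoll 6 to the server lines in ntp.conf) so the comparatively coarse Windows clock is corrected before drift accumulates.

        :: run in an elevated command prompt on each affected workstation
        net stop w32time
        sc config w32time start= disabled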

    Read the article

  • When a new user is created on Centos 6, it takes a while (30 mins) before he can access his group folder

    - by Diepseun
    I created a new user and made him part of a certain group which has full access (777) to a folder. I checked the user in Samba (the password is the same as his Windows XP password) and rebooted his desktop, but he didn't have access to the folder. I checked the Samba group and config file, and the user was defined as a member of the group. It didn't make sense, so I did something else for a while. When I tried again, without doing anything further, the user had access to the folder. I did restart the Samba server after my original changes. Thanks in advance.
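
    A hedged guess at the cause: group membership is evaluated when a Samba session is set up, so an already-established connection from the XP box (and nscd's group cache, if nscd is installed) can hang on to the old membership for a while. Some things that may make the change visible immediately next time (the username below is a placeholder):

        # on the CentOS server: flush name-service caches if nscd is running
        nscd -i group

        # find and end the user's existing smbd session so a fresh one picks up the new group
        smbstatus | grep theuser
        kill <pid-of-that-smbd>

        # on the Windows XP client: drop cached connections to the share
        net use * /delete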

    Read the article

  • Apache 2.4 subdomain setup fails

    - by Grashopper
    I have been struggling with this all day, and found no answer here either. Please advise on how to properly set up the subdomain I need. My Apache config has 2 domains configured (on the same IP), and for domain2.com I need to set up a subdomain. Here is what I have so far, but the subdomain keeps redirecting me to domain2.com (the main site):

        <VirtualHost 11.11.11.11:80>
            ServerName domain1.com
            ServerAlias domain1.com *.domain1.com
            DocumentRoot "C:/wwwmap/domain1.com"
        </VirtualHost>
        <VirtualHost 11.11.11.11:80>
            ServerName domain2.com
            ServerAlias domain2.com *.domain2.com
            DocumentRoot "C:/wwwmap/domain2.com"
        </VirtualHost>
        <VirtualHost 46.4.24.4:80>
            ServerName projects.domain2.com
            DocumentRoot "C:/wwwmap/projects"
        </VirtualHost>

    The DNS entry is:

        projects    IN    CNAME    domain2.com

    Removing "ServerAlias domain2.com *.domain2.com" worked so far, but then domain2.com redirects to domain1.com. What am I doing wrong?
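
    A hedged reading of what's happening: the CNAME makes projects.domain2.com resolve to the same address as domain2.com, so requests arrive on 11.11.11.11 and never reach the vhost bound to 46.4.24.4; on 11.11.11.11 the ServerAlias *.domain2.com then matches first and serves the main site. A sketch of one possible fix (untested), binding the subdomain to the same address and listing it before the wildcard vhost so it wins the name-based match:

        <VirtualHost 11.11.11.11:80>
            ServerName projects.domain2.com
            DocumentRoot "C:/wwwmap/projects"
        </VirtualHost>
        <VirtualHost 11.11.11.11:80>
            ServerName domain2.com
            ServerAlias domain2.com *.domain2.com
            DocumentRoot "C:/wwwmap/domain2.com"
        </VirtualHost>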

    Read the article

  • How to make one CPU be used simultaneously by three different users

    - by beginning_steps
    As a bootstrapping start-up we are thinking of saving on IT hardware costs by making more use of the hardware we already have. As a solopreneur I have a laptop with this config: Intel Core 2 Duo processor, 3 GB RAM and a 250 GB hard drive. We are now planning to grow the team to three members. I would like your suggestions on the most cost-effective next step: using the computing power of the existing laptop as a kind of server, and buying two more monitors for the new recruits to do their daily work on. They need different login IDs and access rights, and they don't need access to all the files/applications on my laptop. We use the internet intensively in our day-to-day work. Please share your experience: is this a good ploy, or is there a more effective way of achieving the same result?

    Read the article

  • Puppetize everything or not?

    - by stderr
    Notice: there are a lot of theoretical questions here. Recently I have been reading about Puppet (and similar systems), which, I believe, can make my work a lot easier. But I am trying, and unfortunately failing, to understand what all I can "puppetize". I can imagine "clouds" or HA clusters, where the same config lives on many servers. But what about workstations? I have one PC (CentOS with KVM), one notebook (Fedora) and a personal server; can (or should) they be puppetized? What are the (dis)advantages? Also, in our company we have hundreds of servers (mainly CentOS), but each of them is a little bit different. I can't decide whether it's better to have all the configs in one place. (Dis)advantages? I will be happy for any opinions or links on this topic.
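
    For the "each server is a little bit different" concern, a hedged sketch of how that is usually expressed in Puppet: a shared baseline class applied everywhere, plus small per-node overrides (the class names and the parameter below are made up for illustration):

        # site.pp -- baseline applied to every node
        node default {
          include base            # ssh, ntp, users, monitoring, ...
        }

        # per-host differences live in small node blocks (or in Hiera data)
        node 'web01.example.com' {
          include base
          class { 'apache':
            listen_port => 8080,  # the one thing that differs on this box
          }
        }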

    Read the article
