Search Results

Search found 34056 results on 1363 pages for 'mod access'.


  • In Opera 11, how can I set up shortcuts to switch to specific tabs?

    - by gphilip
    In Firefox on Linux, we can use Alt+1, Alt+2, etc. to switch to the first, second, and subsequent tabs, up to the ninth. This is very useful for switching between tabs. In Opera 11 on Linux, I couldn't find a way to do this: the only way to switch among tabs seems to be to use Ctrl+Tab and cycle until we reach the tab we want. Needless to say, this "linear access" method is slower (and more annoying) than the random access available in Firefox. So my question is: how can I set up shortcuts in Opera 11 on Linux so that I can switch to one of the first nine tabs using a numbered shortcut?

    Read the article

  • Boot another OS e.g. Windows *once* on a dual-boot machine

    - by user974312
    I have a dual-boot machine with Windows and Linux on it. It isn't physically at hand; it sits in a datacenter that I have to access remotely. Most of the time I work on Linux, but on occasion I have to use the Windows installation. Here is the problem: I need to do all of the following remotely. Do some magic to GRUB. Reboot the machine from Linux. GRUB boots Windows. Access Windows remotely. Work done. Reboot the machine from Windows. GRUB boots Linux. So I wonder whether I can set the boot target for the next boot only, just once? Thanks.
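
    A minimal sketch of the usual GRUB 2 approach, assuming a Debian/Ubuntu-style setup; the Windows menu entry title below is a placeholder and must match the one in your grub.cfg, and GRUB_DEFAULT=saved has to be set for grub-reboot to take effect:

        # /etc/default/grub: let GRUB honour a saved / one-shot default entry
        GRUB_DEFAULT=saved
        # regenerate grub.cfg after editing the file above
        sudo update-grub        # elsewhere: grub-mkconfig -o /boot/grub/grub.cfg

        # boot the Windows entry on the *next* reboot only, then fall back to Linux
        sudo grub-reboot "Windows 7 (loader) (on /dev/sda1)"
        sudo reboot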

    Read the article

  • How to install Gitlab in a VM on a production server?

    - by Michaël Perrin
    I have a production server running Ubuntu 12.04 and I would like to install on it a VM with Gitlab (using Vagrant and VirtualBox). Let's say the address to access Gitlab is gitlab.mydomain.com. The DNS zone has been configured to point to the IP address of the server. I want users to be able to access Gitlab (either to push to a repository or to use the web interface) from the outside. The VM has been configured with its own IP address, which means that when someone browses http://gitlab.mydomain.com, for instance, the request has to be forwarded to the VM on the server, i.e. to the VM's IP address. What are the ways to configure this? Can Apache be used as a proxy? In that case, I guess it only works for HTTP requests, but not for pushing to a Git repository on the VM.
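
    A minimal sketch of an Apache reverse-proxy vhost for this, assuming the VM has a private address of 192.168.50.10 and Gitlab listens there on port 80 (both values are made up for illustration). Git pushes over HTTP(S) go through the same proxy; only SSH-based pushes would need a separately forwarded port:

        # requires: a2enmod proxy proxy_http
        <VirtualHost *:80>
            ServerName gitlab.mydomain.com

            ProxyPreserveHost On
            ProxyPass        / http://192.168.50.10/
            ProxyPassReverse / http://192.168.50.10/
        </VirtualHost>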

    Read the article

  • One vs. many domain user accounts in a server farm

    - by mjustin
    We are in the process of migrating a group of related computers (intranet servers, SQL servers, and the application servers of one application) to a new domain. In the past we used a separate domain user account for every computer (web1, web2, appserver1, appserver2, sql1, sqlbackup, ...) to access central Windows resources such as network shares. Every computer also has a local user account with the same name. I am not sure whether this is necessary, or whether it would be easier to configure and maintain a single domain user account. Are there key advantages or disadvantages of having one single user account vs. dedicated accounts per computer for this group of background servers? If I am not wrong, one advantage besides easier administration could be that moving installed applications and services between the computers would no longer require a check of the access rights (except where IP addresses or ports are used).

    Read the article

  • Why are some web clients requesting a page named "cache"?

    - by Toto
    We see errors like this in the Apache error log: [Thu May 17 14:32:35 2012] [error] [client 192.168.1.1] File does not exist: /home/www-data/mywebsite.com/r/cache, referer: http://www.mywebsite.com/r/1010 It is strange because: there is no reference in the code or URLs to a folder/file named "cache"; the folder/file "cache" does not exist; and the client randomly tries to access a "cache" folder all over the website. It always tries to access the folder/file "cache" following this pattern: /level1/.../levelwhatever/filename (referer), then /level1/.../levelwhatever/cache. We run LAMP (Debian stable; PHP 5.3.3-7+squeeze9; APC 3.1.3p1). We use Google Analytics and AdSense. We do not know how to reproduce the problem. Note: I replaced the client's IP in the log line above for privacy.

    Read the article

  • Why is my web service not live on the internet? [closed]

    - by blankon91
    I have a Windows server that is live on the internet (e.g. www.mysite.com). I want to create another site on a different port (e.g. www.mysite.com:502). I've created it, and it works when I access it from the local network, but from outside the local network (the internet) www.mysite.com:502 can't be reached, while www.mysite.com can. What should I do to make www.mysite.com:502 reachable online? I use Windows Server 2008 Standard.
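
    A minimal sketch of the usual first checks, assuming the site really listens on TCP port 502 and that the Windows Firewall (or an upstream NAT/firewall) is what blocks outside clients; the rule name is made up:

        rem run from an elevated command prompt: allow inbound TCP 502
        netsh advfirewall firewall add rule name="Site on port 502" dir=in action=allow protocol=TCP localport=502

        rem confirm the site is bound to all interfaces (0.0.0.0:502), not just 127.0.0.1
        netstat -ano | findstr :502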

    Read the article

  • Windows 2008 R2 TS printer security - can't take ownership

    - by Ian
    I have a Windows 2008 R2 server with the Terminal Server role installed. I'm seeing a problem with an ordinary user who is a member of the local Printer Operators group on the server. If the user opens a cmd window using 'run as administrator', they can run printmanager.msc without needing to enter their password again. In Print Management they can change the ownership of redirected (Easy Print) printers without problems. If, from the same cmd window, they use subinacl to try to change the ownership of the queue to themselves, they get access denied: >subinacl.exe /printer "_#MyPrinter (2 redirected)" /setowner="MyDom\MyUsr" Elapsed Time: 00 00:00:00 Done: 1, Modified 0, Failed 1, Syntax errors 0 Last Done : _#MyPrinter (2 redirected) Last Failed: _#MyPrinter (2 redirected) - OpenPrinter Error : 5 Access denied So: same context, same action, but one works and one doesn't. Any ideas about this odd behaviour? I'm using the x86 subinacl on an x64 server, as I can't find anything more up to date. I've tried icacls and others but couldn't get them to do anything with printers.

    Read the article

  • How to check use of userva boot option on Win 2K3 server

    - by Tim Sylvester
    I have some 32-bit Win2K3 servers running an application that fails now and then apparently due to heap fragmentation. (Process virtual bytes grows, private bytes does not) I do not have access to the source code or build process of this application. I have modified the boot.ini file on one of these servers to include /userva=2560, half way between the normal mode of operation and the /3GB option. Normally it takes weeks to reach the point of failure, but I'd like to see right away whether this has actually had any effect. As I understand it, this option limits the kernel to the remaining address space (1536MB instead of 2048), but does not necessarily give an application the extra address space, depending on the flags in the application's PE header. How can I determine whether the O/S is allowing a particular application, running in production, to access address space above 2GB? Additionally, what's the best way to monitor the system to ensure that the kernel is not starved for address space, and more generally how should I go about finding the optimal value for this setting?
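
    A minimal sketch of two quick checks, assuming the Visual Studio/SDK tools are available for dumpbin and with a placeholder executable name: the first shows whether the PE header opts in to large addresses at all, the second watches the kernel counters that suffer first when kernel address space gets squeezed:

        rem look for "Application can handle large (>2GB) addresses" in the FILE HEADER section
        dumpbin /headers myapp.exe | findstr /i large

        rem sample kernel address-space pressure once a minute
        typeperf "\Memory\Free System Page Table Entries" "\Memory\Pool Paged Bytes" "\Memory\Pool Nonpaged Bytes" -si 60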

    Read the article

  • Is it possible to know instantly when a user logs in on Ubuntu?

    - by Mustafa Orkun Acar
    In fact, I am trying to restrict access to some websites for different users. I asked the question "Restrict access to some websites for different users". The given answer is OK, but as its author says, it works only for users who are already logged in; that is, if a user logs out and logs in again, the restrictions are no longer valid. So I decided to run a script containing the iptables commands for the restrictions at every login event. I want to know whether it is possible to detect instantly that a user has logged in.
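
    A minimal sketch using pam_exec, with /usr/local/bin/apply-restrictions.sh as a made-up script path; PAM runs it every time a session opens (console, SSH, graphical login) and exports PAM_USER, so the script can apply per-user iptables rules:

        # /etc/pam.d/common-session -- add at the end
        session optional pam_exec.so /usr/local/bin/apply-restrictions.sh

        #!/bin/sh
        # /usr/local/bin/apply-restrictions.sh (hypothetical): re-apply rules on login
        if [ "$PAM_TYPE" = "open_session" ]; then
            logger "applying web restrictions for $PAM_USER"
            # iptables rules from the earlier answer would go here
        fi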

    Read the article

  • Install Trac Without Setting Up a VirtualHost in Apache?

    - by jobu1324
    I'm trying to set up Trac to test out its functionality, and the only guides I can find online talk about setting up a VirtualHost. Right now I am under the impression that I need access to a DNS server to properly use the VirtualHost directive, and for various reasons I don't have access to one. Is it possible to set up Trac without setting up a VirtualHost? I haven't had any luck. If I run the site with tracd, it works - which means that at least part of it is set up properly. Right now all I have is an Apache Directory directive pointing to /pathToTracSite/htdocs/, and all I get when viewing the site from a browser is an empty directory listing (which makes sense, because htdocs/ is empty). My server is running Apache2. I know I'm missing a lot here, because I don't understand Apache or the Trac system very well - any help would be appreciated.
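
    A minimal sketch of serving Trac under a URL path instead of a name-based VirtualHost, assuming mod_wsgi is installed and the Trac environment lives at /path/to/tracenv (placeholder paths); trac-admin generates the WSGI wrapper for you:

        # one-off: generate cgi-bin/trac.wsgi and the static files
        trac-admin /path/to/tracenv deploy /path/to/trac-deploy

        # Apache snippet -- site appears at http://yourserver/trac, no VirtualHost or DNS entry needed
        WSGIScriptAlias /trac /path/to/trac-deploy/cgi-bin/trac.wsgi
        <Directory /path/to/trac-deploy/cgi-bin>
            Order allow,deny
            Allow from all
            # Apache 2.4 would use "Require all granted" instead of the two lines above
        </Directory>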

    Read the article

  • Repeated requests on our server?

    - by pitty.platsch
    I encountered something strange in the access log of our Apache server which I cannot explain. Requests for web pages that I or my colleagues make from the office's Windows network get repeated by another IP (that we don't know) a couple of seconds later. The user agent repeating our requests is Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; .NET CLR 2.0.50727; .NET CLR 3.0.04506.648; .NET CLR 3.5.21022; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729; InfoPath.2) Does anyone have an idea? Update: I've got some more information now. The referer of the replicated request is set to the URL I requested before, and it's not the exact same request, as the protocol version changes from 'HTTP/1.1' to 'HTTP/1.0'. It's not just one IP either; the requests come from one subnet (80.40.134.*). Only the first request to a resource gets repeated, so it seems the "spy" is building up some kind of cache of visited places. The repeater is also picky. I tried random URLs with different HTTP status codes and different file patterns: 301s and 200s are redone, 404s are not, and image extensions seem to be ignored. While doing my tests I discovered that this behaviour seems to be common, as I found other clients visiting just after the first requests: 66.249.73.184 - - [25/Oct/2012:10:51:33 +0100] "GET /foobar/ HTTP/1.1" 200 10952 "-" "Mediapartners-Google" 50.17.125.180 - - [25/Oct/2012:10:51:33 +0100] "GET /foobar/ HTTP/1.1" 200 41312 "-" "Mozilla/5.0 (compatible; proximic; +http://www.proximic.com/info/spider.php)" I wasn't aware of this practice, so I don't see it as much of a threat anymore. I still want to find out who this is, so any further help is appreciated. I'll check later whether this also happens when I query another server where I have access to the access logs, and will update here then.

    Read the article

  • XAMPP server giving 404 error when requested by ipv4 connection

    - by boyb
    This is in reference to a previous question that I asked, which was answered by womble: http://serverfault.com/a/406280/127729 "So, now we have the real DNS records, we can do some diagnosis. dig for both A and AAAA on akosiboybastos.broker.freenet6.net gives a valid response, with an appropriate address. Good. dig for both A and AAAA on bastosforum.strangled.net gives the same responses (with a CNAME response thrown in). Also good. This means that the problem is not DNS-related, as those records are in order. wget -6 bastosforum.strangled.net/ gives a 200 OK response. wget -4 bastosforum.strangled.net/ gives a 404 Not Found response. This means that your webserver is misconfigured so that it's not serving the response you desire on IPv4. Given that the initial DNS problem asked in this question has been solved, I would recommend posting a new question with the relevant webserver-related configuration, if you can't determine the configuration error yourself." I am using XAMPP (latest version) running phpBB 3.0.10 via an IPv6 tunnel from Freenet6, and my domain is akosiboybastos.broker.freenet6.com; nothing fancy with the installation, just an out-of-the-box install (with a few cosmetic mods). Both IPv4 and IPv6 traffic can connect using that URL, but when I put a CNAME record on my test domain, bastosforum.strangled.net, pointing it to akosiboybastos.broker.freenet6.com, only IPv6 can connect. As womble suggested, this is a misconfigured webserver. To be honest I don't know where to start checking on the server, as it works fully if you use the domain given by Freenet6 (akosiboybastos.broker.freenet6.com). Any info on how to go about this server issue is welcome, as I'm really a noob when it comes to computers. Regards, boyb
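
    A minimal sketch of the usual fix, on the assumption that IPv4 requests are landing in XAMPP's default vhost (whose DocumentRoot doesn't hold the forum) while IPv6 requests happen to hit the right one; declaring the CNAME'd name as a ServerAlias on the vhost that actually serves phpBB should make both protocols behave the same (paths are placeholders):

        # httpd-vhosts.conf
        # NameVirtualHost is only needed on Apache 2.2; 2.4 ignores it with a warning
        NameVirtualHost *:80

        <VirtualHost *:80>
            ServerName  akosiboybastos.broker.freenet6.net
            ServerAlias bastosforum.strangled.net
            DocumentRoot "C:/xampp/htdocs/phpbb3"
        </VirtualHost>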

    Read the article

  • Lighttpd referer issue

    - by Chris
    I have a problem blocking files from being accessed from domains other than my own. I have added the following to my lighttpd config inside the "virtual host": $HTTP["referer"] !~ "^($|http://www\.my-site\.net)" { url.access-deny = ( "" ) } but the site www.example.com can still access http://player.my-site.net/player.swf, and it can also be accessed directly without a referrer. Any idea? //EDIT here is my old Apache .htaccess with a rewrite rule that works perfectly, but I don't know how to convert it for lighttpd: RewriteEngine on RewriteBase / RewriteCond %{HTTP_REFERER} !^http://my-site\.net/ [NC] RewriteCond %{HTTP_REFERER} !^http://www\.my-site\.net/ [NC] RewriteCond %{HTTP_REFERER} !^http://player\.my-site\.net/ [NC] RewriteCond %{HTTP_REFERER} !^http://stream\.my-site\.net/ [NC] RewriteRule .* - [L,R=404]
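
    A minimal sketch of a lighttpd conditional equivalent to that .htaccess, allowing an empty referer plus the listed my-site.net hosts and denying everything else; it has to live inside the $HTTP["host"] block that actually serves player.my-site.net (putting it under the wrong host is a common reason such a rule seems to have no effect):

        $HTTP["host"] == "player.my-site.net" {
            $HTTP["referer"] !~ "^($|http://(www\.|player\.|stream\.)?my-site\.net)" {
                url.access-deny = ( "" )
            }
        }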

    Read the article

  • How to protect Ruby on Rails code on external server?

    - by Phil Byobu
    I have to deploy a Ruby on Rails application on a client's server, and I do not want them to be able to view or modify the source code. How would you protect the code technically? I thought about building a Linux-based virtual machine with an encrypted filesystem where the application code resides. The client would have no root access, or any direct access to the system at all. All services start automatically and the application is ready to use. What would you suggest?

    Read the article

  • How SmartDNS Works

    - by Emad
    If you travel outside the US you'll notice that most streaming services like Netflix, Pandora, Hulu, etc. are blocked, usually by the service providers themselves. To get around that, people use VPN services. They basically tunnel your traffic through a US server so your requests seem to originate in the US. These VPN services fix the blocking problem, but make your connection slower than a normal, un-VPNed connection. Recently, however, I've come across something called SmartDNS, provided by overplay.net. You pay $5 a month and you get access to their DNS servers. After you switch to their DNS you get access to the blocked streaming sites, without slowing down your normal traffic such as email and browsing. What I'd like to know is the technical details of how this SmartDNS works. I've done some quick research but that didn't turn up anything of substance. Anybody out there know?
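
    For what it's worth, a minimal illustration of the underlying idea (the resolver name and all addresses below are made-up example values): the SmartDNS resolver answers queries for the handful of geo-checked hostnames with the address of its own US-based proxy and resolves everything else normally, so only the streaming services' traffic takes the detour while mail and ordinary browsing go direct:

        # a normal resolver returns the service's real (geo-checked) address
        dig +short netflix.com @8.8.8.8
        #   198.51.100.7        <- example value

        # the SmartDNS resolver returns its own US proxy's address instead
        dig +short netflix.com @dns.smartdns.example
        #   203.0.113.25        <- example value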

    Read the article

  • Do superuser things with a normal user

    - by OrangeTux
    I want to secure the SSH access to my server. One thing I read everywhere is to disable root logins. To still have access via SSH, I created another user with sudo adduser john. How can I do root things with this account? The sudo command asks for the user's password but then reports that john is not in the sudoers file and that the action will be reported. When I use su, I log in as root, which is exactly what I'm going to disable. How can I still do root things with the normal account john?
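
    A minimal sketch of the two usual Ubuntu fixes, run while still logged in as root (or via su) before root logins are disabled; the account name john comes from the question:

        # option 1: put john in the sudo group (called "admin" on older Ubuntu releases)
        usermod -aG sudo john

        # option 2: give him an explicit entry, edited safely with visudo
        visudo
        #   add the line:  john    ALL=(ALL:ALL) ALL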

    Read the article

  • Run a server and local wireless network off my laptop with no internet.

    - by greg
    I'm trying to run a wireless network from my computer so that people in range can connect to the network and hit a website running off my machine. I don't want to enable file sharing or remote access or anything else of that nature. I just want them to be able to connect to the network, type an IP or domain name into a browser, and be taken to a locally hosted website. No broader internet access is needed. Any ideas or links to good tutorials on the subject? Is this something I can achieve with just a wifi card, or will I need a router?
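
    A minimal sketch assuming the machine runs Windows 7 or later and its wifi driver supports the hosted-network feature (the SSID and key are placeholders); no router is needed, and people who join can simply browse to the laptop's IP address:

        rem create a software access point and start it (elevated command prompt)
        netsh wlan set hostednetwork mode=allow ssid=LocalSite key=ChangeMe123
        netsh wlan start hostednetwork

        rem find the IPv4 address of the hosted-network adapter to hand out to visitors
        ipconfig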

    Read the article

  • Connecting desktop computer to the internet through laptop wifi

    - by Josh
    Due to some home network complications I have had to move my router to a separate part of the house, so the wired network I had set up can no longer work. Until I find the time to go out and buy a wifi adapter for my desktop PC, I have a laptop that uses a built-in wifi card to connect to my router and can reach the internet, and I was wondering if I could somehow access the internet on my desktop PC via the laptop. I'm hoping for a not-so-complex solution, as this will only be set up for a few days, but it is quite vital that my desktop computer gets internet access. Does anyone have experience with this sort of thing and can help me out? Thanks.

    Read the article

  • some keyboard keys not working properly

    - by surfmadpig
    I'm using windows 7. All of a sudden, a couple of hours ago, this happened: My keyboard number keys [above the letters] stopped working properly, both as numbers and as symbols. Only 5 and 6 are functional. Also, I've noticed that the End key isn't working either, and perhaps a couple more from that group. I'm pretty sure it has something to do with those evil Sticky Keys/Filter Keys/ whatever those ease of access things are, BUT I've turned off all the ease of access keyboard options and nothing has changed. Is it possible that something is still turned on while I unchecked it? Are the on/off checkboxes to control WHEN it happens or IF it happens? I also tried rebooting and uninstalling/reinstalling keyboard from device manager, to no avail. It's certainly a software issue and not a hardware issue, as I've tried another keyboard and the problem persists. And, predictably enough, it's annoying. Any ideas?

    Read the article

  • Prevent folder deletes at top level only on Server 2008

    - by DomoDomo
    I'm trying to prevent folder moves (really folder deletes, in NTFS parlance) for a series of folders within a network share. So let's say I have FolderA, FolderB and FolderC. Each folder has various files and subfolders. I want the Domain Users group to have Modify access to all files and folders beneath FolderA, FolderB and FolderC. However, I don't want them to be able to delete these three top-level folders. The issue we are having right now is that people keep accidentally dragging one top-level folder into another. I've tried using advanced NTFS permissions to deny Domain Users the Delete right on these top-level folders, setting the permissions to apply to "This folder only", but it seems to only affect sub-folders and not the top level. The platform is Server 2008 Standard. Thanks in advance.
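
    A minimal sketch with icacls (the share path and domain name are placeholders): denying only the Delete right (DE) on each folder object itself, with no inheritance flags, leaves Modify intact underneath. One caveat worth checking: a "Delete subfolders and files" grant on the share root can still allow these folders to be removed, so it may need a matching deny there as well:

        icacls "D:\Share\FolderA" /deny "MYDOMAIN\Domain Users":(DE)
        icacls "D:\Share\FolderB" /deny "MYDOMAIN\Domain Users":(DE)
        icacls "D:\Share\FolderC" /deny "MYDOMAIN\Domain Users":(DE)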

    Read the article

  • DHCP client service won't start

    - by xyious
    I have a laptop with two network interfaces, and neither will get an IP address via DHCP. I found out that the DHCP Client service didn't start. Starting it manually gives error 2: File not found. I have checked that the files are there (both svchost.exe and dhcpcore.dll), the Local Service account has read access to the system32 folder, the path in the registry is correct, and I can access the file. I have tried netsh winsock reset and an IP reset. I have even added the Local Service account to the Administrators group. sfc /scannow also came up clean. I have no idea what else I can try; any suggestions are welcome. (Side note: it's Windows 7 32-bit with an Atheros WLAN card, and I uninstalled Avira before any of the other troubleshooting.)
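
    A minimal sketch of checks worth running from an elevated prompt; error 2 from a service usually means its ImagePath or ServiceDll registry value points at something that no longer resolves, so comparing these against a healthy Windows 7 machine is a quick way to spot the damage:

        rem configured binary path and dependencies of the DHCP Client service
        sc qc Dhcp

        rem the DLL the shared service host is told to load (should point at dhcpcore.dll)
        reg query HKLM\SYSTEM\CurrentControlSet\Services\Dhcp\Parameters /v ServiceDll

        rem services DHCP depends on must be able to start too
        sc query Afd
        sc query Tdx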

    Read the article

  • I bought a domain name at GoDaddy and hosting at Dreamhost but the first doesn't work!

    - by janooChen
    I added DreamHost's nameservers about 12 hours ago. In GoDaddy's panel I went to Nameservers -> Set Nameservers (I have specific nameservers for my domains) and added DreamHost's nameservers like this: Nameserver 1: NS1.DREAMHOST.COM Nameserver 2: NS2.DREAMHOST.COM Nameserver 3: NS3.DREAMHOST.COM So now in the admin panel I see this: Nameservers Nameservers: (Last Update 2/10/2011) NS1.DREAMHOST.COM NS2.DREAMHOST.COM NS3.DREAMHOST.COM But I get this when I run the analysis tools: "Attention Required! There are critical issues Accessing Your Web Site. Properly configuring your domain name and hosting account ensures that visitors can access your site." Did I do something wrong, or do I just have to wait 24 to 48 hours? (DreamHost does display my page, because I can access the other domain name I bought together with the hosting.) By the way, if everyone uses the same nameservers, how will GoDaddy know which hosting space I purchased among all the others? Thanks in advance.
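
    A minimal sketch of how to check whether the delegation change has propagated (the domain name is a placeholder); if the registry already hands out the DreamHost nameservers and DreamHost's servers answer for the domain, it really is just the 24-48 hour wait:

        # follow the delegation from the root; the final NS set shown is what the .com registry publishes
        dig +trace NS mydomain.com

        # ask DreamHost's own nameserver directly for the site's address
        dig +short A mydomain.com @ns1.dreamhost.com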

    Read the article

  • How to interconnect laptop, PC, Android & Symbian phones with WiFi?

    - by noquery
    What I have: a LevelOne wifi router/hub; a laptop with Windows 7 (wifi enabled); a desktop PC with Windows 7 (with no wifi, though maybe LAN cables can help); two Android and one Symbian phone, each wifi enabled; LAN cables; and a Hathway internet connection which allows only one login to access the internet. What I want: is it possible to access the internet on the PC, laptop and mobiles at the same time with only one connection? Since I don't know even the basics of networking, please suggest some software that requires minimal configuration. Let me know if I need any other hardware.

    Read the article

  • Windows 7 migration: How to move application settings?

    - by FrustratedWithFormsDesigner
    Migrating from Windows XP Home to Windows 7 Pro. The last bit that I'm stuck on is migrating application settings, such as the settings for Opera, Firefox, MSN Messenger, and others. On the XP system, this all seems to be in "user/Local Settings" and "user/Application Data". On the Windows 7 system, there is a "user/AppData" folder as well as "user/Application Data" and "user/Local Settings". When I try to access "Application Data" and "Local Settings" on Windows 7, I get an "Access Denied" error (even though my user is an admin). So... if I can't copy my application settings files to "Application Data" and "Local Settings" on Windows 7, where do I copy them to?
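
    For context, a minimal check: on Windows 7, "Application Data" and "Local Settings" inside a profile are hidden junction points kept only for backward compatibility, which is why even an admin gets "Access Denied" when browsing them; the real destinations are under AppData (the profile name below is a placeholder):

        rem list the compatibility junctions in the profile and their targets
        dir /aL "C:\Users\someuser"

        rem where the old XP folders map to on Windows 7:
        rem   XP "Application Data"                 ->  C:\Users\someuser\AppData\Roaming
        rem   XP "Local Settings\Application Data"  ->  C:\Users\someuser\AppData\Local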

    Read the article

  • Why is "chmod -R 777 /" destructive?

    - by samwise
    This is a canonical question about file permissions and why chmod -R 777 is "destructive". I'm not asking how to fix the problem, as there are a ton of references for that already on Server Fault (reinstall the OS). Why does it do anything destructive at all? If you've ever run this command, you pretty much immediately destroy your operating system. I'm not clear why removing restrictions has any impact on existing processes. For example, if I don't have read access to something and, after a quick mistype in the terminal, I suddenly do... why does that cause Linux to break?
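
    A minimal illustration of two of the failure modes, done on a throwaway file rather than on / (paths are placeholders): the numeric mode 777 carries no setuid bit, so applying it recursively strips setuid from binaries such as su and sudo, and security-sensitive programs additionally refuse to run when their files become world-writable:

        # create a file with the setuid bit set (as root), then watch chmod 777 remove it
        touch /tmp/demo && chmod 4755 /tmp/demo
        stat -c '%A' /tmp/demo     # -rwsr-xr-x  (the 's' is the setuid bit)
        chmod 777 /tmp/demo
        stat -c '%A' /tmp/demo     # -rwxrwxrwx  (setuid gone; on su/sudo/passwd this is fatal)

        # and even with correct bits restored, sudo refuses a world-writable /etc/sudoers,
        # sshd rejects key files with loose permissions, and so on -- hence the reinstall advice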

    Read the article
