Search Results

Search found 51125 results on 2045 pages for 'access point'.

  • How to figure out which directory is web server root?

    - by matt
    I want to view websites hosted on my Mac from Windows running under VMware Fusion. I have an entry in the Windows hosts file to enable the routing:

        # IP of my Mac, and the domain I use on the VM to access it
        192.168.1.70 mymac

    However, the request resolves to an empty directory and a 404 is generated. The access log on my Mac shows the requests arriving, so access-wise everything is OK. Firefox on the VM reports the following response header: Server: Apache/2.2.14 (Unix) mod_ssl/2.2.14 OpenSSL/0.9.8l DAV/2 PHP/5.3.1. Any ideas how I can figure out which directory is being served? I am lost in a maze of twisty httpd.conf passages. localhost on my Mac resolves to my ~/Sites directory; 192.168.1.70 resolves to the same empty directory/404. Thanks.
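
    One way to see which directory Apache serves for a given name is to dump the parsed virtual-host configuration; a sketch for a stock Apache install (config paths may differ on your Mac):

        # show every vhost Apache knows about, with the file and line defining it;
        # requests arriving by bare IP fall through to the default/first vhost
        apachectl -S

        # find all DocumentRoot directives, including files pulled in via Include
        grep -Ri 'documentroot' /etc/apache2/

    Whichever vhost `apachectl -S` lists as the default is the one answering for 192.168.1.70, so its DocumentRoot is the empty directory to track down.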

  • How can I protect files on my NGiNX server?

    - by Jean-Nicolas Boulay Desjardins
    I am trying to protect files of multiple types on my server with NGiNX and PHP. Basically, I want people to have to sign in to the website before they can access static files like images. Dropbox does this very well: it forces you to sign in before you can access any static file you put on their servers. I thought about using the NGiNX Perl module and writing a Perl script that checks the session to see whether the user is signed in before granting access to a static file. But I would prefer to use PHP, because all my code runs under PHP and I am not sure how to read a session created by PHP from Perl. So basically my question is: how can I protect static files of any type so that they can only be accessed by a user who has signed in and has a valid session created by a PHP script?
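
    A common pattern for this is nginx's X-Accel-Redirect: the static location is marked internal (unreachable directly), every download URL hits a PHP script that checks the session, and on success PHP hands the request back to nginx, which serves the file at full speed. A minimal sketch - the paths and the session key are assumptions, not your actual layout:

        # nginx side: real files can only be reached via an internal redirect
        location /protected/ {
            internal;
            alias /var/www/protected/;
        }

        <?php
        // auth.php?file=photo.jpg - minimal sketch; the session check is an assumption
        session_start();
        if (empty($_SESSION['user_id'])) {
            header('HTTP/1.1 403 Forbidden');
            exit('Please sign in.');
        }
        $file = basename($_GET['file']);   // strip path components to avoid traversal
        header('Content-Type: application/octet-stream');
        header('X-Accel-Redirect: /protected/' . rawurlencode($file));

    This keeps all the authentication logic in PHP (so it shares the normal PHP session) while nginx still does the actual file serving.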

  • Gnome 3 - Unable to change date and time

    - by Chris Harris
    I am running Arch Linux with Gnome 3. Although my time and date settings in /etc/rc.conf show HARDWARECLOCK='UTC' and TIMEZONE='America/LosAngeles', I keep getting the Europe/London timezone. Changing the date and time via the GUI requires root access; after authorizing as root I can change them, but as soon as I close the GUI window they revert to the previous, incorrect timezone. I am able to sync against pool.ntp.org to get the correct time, but that only lasts for the current session, and it is inconvenient since there is not always network access. What other solutions are available for this problem?
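
    One thing worth checking: the canonical zoneinfo name is America/Los_Angeles (with an underscore); if rc.conf really says LosAngeles, the zone will fail to load and the system can fall back to a default. A minimal sketch of verifying and setting the zone directly, assuming the standard zoneinfo layout:

        # verify the zone file actually exists
        ls /usr/share/zoneinfo/America/Los_Angeles

        # point /etc/localtime at it (as root)
        ln -sf /usr/share/zoneinfo/America/Los_Angeles /etc/localtime

        # confirm
        date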

  • Mounting LVM2 volume with XFS filesystem

    - by Chris
    Unfortunately I'm no longer able to access the data on my NAS, and I can't figure out why, as I haven't changed anything. So I plugged one of its hard disks into my computer to get at the data. What I did:

        kpartx -a /dev/sdc

    Now I should be able to access /dev/mapper/vg001-lv001, but when trying to mount it I get:

        sudo mount -t xfs /dev/mapper/vg001-lv001 /home/user/mnt
        mount: /dev/mapper/vg001-lv001: can't read superblock

    Then I ran parted -l, which gave me (output translated from German):

        Model: Linux device-mapper (linear) (dm)
        Disk /dev/mapper/vg001-lv001: 498GB
        Sector size (logical/physical): 512B/512B
        Partition table: loop

        Number  Begin  End    Size   Filesystem  Flags
         1      0,00B  498GB  498GB  xfs

    Does anybody have a solution for recovering the data?
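
    A few non-destructive checks worth running before anything else - a sketch, with device names taken from the mapping above:

        # make sure the LVM volume is actually active
        sudo vgscan
        sudo vgchange -ay vg001

        # dry-run check of the XFS filesystem (-n = no modify)
        sudo xfs_repair -n /dev/mapper/vg001-lv001

        # kernel messages often say why the superblock read failed
        dmesg | tail

    If xfs_repair -n reports a damaged primary superblock it can usually locate a backup copy; only run it without -n once you have a raw image of the volume (e.g. via dd) to fall back on.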

  • What is the best filesystem for storing thousands of files in one dictionary-like id-blob structure?

    - by Ivan
    What filesystem best suits my needs? Thousands or even millions of files in one directory. Good reliability (including fault tolerance) and access speed, at or close to the level of ext4 and NTFS. No directories actually needed, nor descriptive names; a dictionary-like structure of id-blob pairs is all I need. No links, attributes, or access-control features needed either. The purpose is a file store where all the metadata (data describing what each file contains and who can access it) lives in a MySQL database. As far as I know, common filesystems like NTFS and ext3/4 can become dead slow when too many files are placed in one directory - that's why I ask.
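
    If a flat directory turns out to be the bottleneck, the usual workaround is to fan the ids out over a couple of nested levels so no single directory grows too large. A minimal sketch in shell - the two-level layout and path scheme are an assumption for illustration, not a recommendation for a specific filesystem:

        #!/bin/sh
        # map an id to a sharded path: id "a1b2c3..." -> store/a1/b2/a1b2c3...
        id="$1"
        dir="store/$(printf '%s' "$id" | cut -c1-2)/$(printf '%s' "$id" | cut -c3-4)"
        mkdir -p "$dir"
        cp "$2" "$dir/$id"    # store blob file $2 under its id

    With hex ids this caps each level at 256 subdirectories, a size that ext3/4 and NTFS handle comfortably, while lookups by id remain a single computed path.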

  • Faster, secure protocol/client required for long-distance transfer

    - by Chopper3
    I've run into a problem and I'm looking for a new secure protocol/client/server that's faster over a 1Gb/s fibre link - let me tell you the story... I have a pair of redundant, diversely-routed 1Gb/s links over a distance of around 250 miles or so (not dark fibre, but a dedicated point-to-point link, not a mesh). At the 'client' end I have an HP DL380 G5 (2 x dual-core 2.66GHz Xeons, 4GB, Windows 2003EE 32-bit); at the 'server' end an HP BL460c G6 (2 x quad-core 2.53GHz Xeons, 48GB, Oracle Linux 5.3 64-bit). I need to transfer around 500 x 2GB files per week from the client to the server machines - but the transfer NEEDS to be secure. Using either iperf or regular FTP I can get ~80MB/s of transfer pretty consistently, which is great. Using WinSCP or Windows SFTP I can't seem to get more than ~3-4MB/s; at this point the server's CPU is 3% busy while CPU0 of the client goes to ~30% utilised. We've tried various TCP window sizes with little success. Both ends are connected to quite low-usage Cisco Cat6509s with Sup720s. I can replace the client machine with a newer machine and/or move it to Linux - but this will take time. Clearly these single-threaded secure Windows clients are introducing too much latency doing their encryption. So, a few questions/thoughts: 1) Are there any higher-performing secure protocols or client software for Windows that I could try? I'm pretty protocol-agnostic as long as it works between Windows and Linux. 2) Should I be using hardware to do the encryption, either in the client or in the network? If so, what would you recommend? 3) I'm not convinced that just swapping the server would be much faster - the CPU was only at 30%, but then again that's higher than I'd have expected given the load; moving the client end to Linux may be a better idea but would be quite disruptive. 4) Am I missing a trick? Thanks in advance.
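
    Since a single SFTP stream is CPU-bound on one client core, one cheap experiment is to run several transfers in parallel so each stream gets its own core. A sketch from any Unix-like shell (e.g. Cygwin on the Windows client); the paths, host, and cipher choice are assumptions:

        # push 4 files at a time; each scp is an independent cipher stream
        # (assumes filenames without spaces)
        ls /data/*.dat | xargs -P 4 -I{} scp -c aes128-cbc {} user@server:/ingest/

    Four streams at ~4MB/s each still falls short of the link, but throughput scales with client cores and needs no new software. The other commonly cited option is an OpenSSH build with the HPN patches, which enlarge SSH's internal buffers for high bandwidth-delay-product links like this one.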

  • Apache Virtual Host with directory aliases

    - by brechtvhb
    I'm trying to set up a dynamic virtual host in Apache with a directory alias pointing to a different path for every domain. Here's what I'm trying to achieve. Say I have 2 domains:

        * www.domain1.com
        * www.domain2.com

    I want both to point to the same index.php file (C:/cms/index.php). Now the hard part ... I want directories or certain file types to point to a different path for each domain. Example:

        * www.domain1.com/layout    -> C:/store/www.domain1.com/layout
        * www.domain2.com/layout    -> C:/store/www.domain2.com/layout
        * www.domain1.com/image.png -> C:/store/www.domain1.com/image.png
        * www.domain2.com/image.png -> C:/store/www.domain2.com/image.png

    However, the admin directory should point to the same path for all sites:

        * www.domain1.com/admin -> C:/cms/admin
        * www.domain2.com/admin -> C:/cms/admin

    Is there a way to achieve this kind of behaviour in Apache 2.2 without having to create a VirtualHost entry for each new domain?
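
    One way to avoid per-domain vhosts is mod_rewrite with %{HTTP_HOST} in the substitution, so the store path is derived from whatever host was requested. An untested sketch for httpd.conf (Apache 2.2; the directory layout comes from the question, everything else - file extensions, access stanzas - is an assumption):

        RewriteEngine On

        # /admin is shared by all sites
        RewriteRule ^/admin(/.*)?$ C:/cms/admin$1 [L]

        # /layout and direct static-file requests come from the per-domain store
        RewriteCond %{REQUEST_URI} ^/(layout/|[^/]+\.(png|jpg|gif|css|js)$)
        RewriteRule ^/(.*)$ C:/store/%{HTTP_HOST}/$1 [L]

        # everything else goes to the shared CMS entry point
        RewriteRule ^/ C:/cms/index.php [L]

        # Apache must be allowed into both trees (2.2 syntax)
        <Directory C:/store>
            Order allow,deny
            Allow from all
        </Directory>

    The rules have to live in the main server config (not .htaccess) so the substitutions can be absolute filesystem paths; adding a new domain then only requires creating its C:/store/ directory and DNS entry.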

  • Netgear genie says my Internet is off, while Windows claims I am connected

    - by Manu
    I'm connected via wifi to my ISP's router/modem. Windows says I'm always connected, but I keep getting messages from Netgear Genie that I've lost the connection to the Internet, and I cannot access web pages until it comes back. There are two other computers in the house - one connected to the router via ethernet, the other via wifi - and both seem to have no such problems. I wondered whether Netgear Genie itself was the problem, but I am regularly disconnected even if I uninstall it; and I'd rather keep it, since it accurately tells me whether I'm connected or not. Why does Windows say I'm online if I can't access any online game or website? Is Netgear Genie the problem? I've removed the connection and recreated it, and I've even copied the settings to a USB key from the computer that has working wifi access.

  • Client-side certificates in client browsers, with a Unix server for management

    - by user146253
    We currently run dedicated Unix servers for everything (web cluster, database, FTP, batch, ...) except for one Microsoft Active Directory Certificate Services box. The sole purpose of this Windows box is to issue client-side certificates for our clients' browsers. All our clients are required to install a client-side certificate in order to be able to access our website. Is there an alternative in the Unix space? The purpose is to make sure only the approved hardware of an approved client can access our website. I'm open to any solution that provides this level of security. We are, however, talking about thousands of certified computers, so please factor that into a proposed solution. Optionally, we would also like to be able to revoke access. With regards.
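
    On the Unix side the usual building blocks are an OpenSSL-based CA (or a wrapper such as easy-rsa) plus the web server's client-verification directives. A minimal sketch - filenames, subjects, and lifetimes are hypothetical, and a production CA for thousands of clients needs an issuance/revocation workflow around this:

        # one-time: create the CA
        openssl req -new -x509 -days 3650 -keyout ca.key -out ca.crt \
            -subj "/O=Example Corp/CN=Client CA"

        # per client: key + CSR + signed certificate
        openssl req -new -newkey rsa:2048 -nodes -keyout client.key -out client.csr \
            -subj "/O=Example Corp/CN=client-0001"
        openssl x509 -req -in client.csr -CA ca.crt -CAkey ca.key \
            -CAcreateserial -days 365 -out client.crt

        # bundle for browser import
        openssl pkcs12 -export -in client.crt -inkey client.key -out client.p12

    And in the Apache vhost (nginx has the equivalent ssl_verify_client / ssl_client_certificate directives):

        SSLVerifyClient require
        SSLCACertificateFile /etc/ssl/ca.crt
        SSLCARevocationFile  /etc/ssl/ca.crl

    Revocation - your optional requirement - is handled by maintaining a CRL (openssl ca -gencrl) or an OCSP responder that the web server checks.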

  • Wifi Works with Android and Windows 8 but not Linux and Win 7

    - by eramm
    Support has told me that our company-wide wifi network is set up to support mobile phones only. It doesn't make sense to me that they could identify a mobile device as such; rather, they have set up the access point to use a protocol that happens to be supported only on Android and Windows phones. Because the access point supports Windows mobile devices, laptops running Windows 8 can also connect to it (proven). So it stands to reason that, since Android is based on Linux, there must be a way to connect using Linux as well. iwlist shows:

        IEEE 802.11i/WPA2 Version 1
        Group Cipher : TKIP
        Pairwise Ciphers (2) : TKIP CCMP
        Authentication Suites (1) : 802.1x

    Wireshark seems to show that a connection is made to a website to get a certificate, and that a domain controller is used for authentication. Questions: 1) What protocol could they be using that is supported on Windows Mobile and Android but not on Windows 7 and Linux (Debian)? 2) What tools can I use to discover which protocol I need to support? I have used iwlist and Wireshark, but I was not able to glean much useful information from them; I can post the results if needed. 3) Is there an app I can use on my Android phone to find out what kind of network it is connecting to? I can provide more information if you tell me how to get it. I just don't know what I am looking for.
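
    The iwlist output (WPA2 Enterprise with 802.1x) suggests plain wpa_supplicant should be able to join once given the right EAP settings; PEAP/MSCHAPv2 against a domain controller is the most common combination and is what Android's 'PEAP' option speaks. A sketch of /etc/wpa_supplicant.conf under that assumption - SSID, identity, and CA path are placeholders:

        network={
            ssid="CorpWifi"
            key_mgmt=WPA-EAP
            eap=PEAP
            identity="DOMAIN\\username"
            password="secret"
            phase2="auth=MSCHAPV2"
            # ca_cert="/etc/ssl/certs/corp-ca.pem"   # add once you know the CA
        }

    Running it in the foreground with debugging (wpa_supplicant -i wlan0 -c /etc/wpa_supplicant.conf -d) prints the EAP method the AP actually proposes, which answers question 2 directly.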

  • Connect two devices with ethernet

    - by mofle
    I have a MacBook Pro, and I'm going to buy a Boxee Box. I have access to wireless internet, but it's slow and out of my control. So my question is: what is the best way for both devices to get internet access, and for the Boxee Box to get SMB access to the Mac, when only the Mac is connected to the wireless internet? I'm thinking an ethernet cable from the Boxee Box to my Mac, plus connection sharing on the Mac. Can you explain the setup - is there any configuring required?

  • Parking domains and avoiding so-called "search engine penalties"

    - by senthilkumar-c
    I have purchased two domains from one registrar and hosting from GoDaddy. Assume they are domain1.com and domain2.com, and that my hosting IP address is 111.111.111.111. I added both domain1.com and domain2.com in my domain management control panel and gave the same two nameservers for both domains in my registrar's control panel. So now both domains should show the same website. When I ping "domain1.com" or "domain2.com" the results say:

        Pinging domain1.com [111.111.111.111] with 32 bytes of data:
        Pinging domain2.com [111.111.111.111] with 32 bytes of data:

    respectively. So they both point to the same hosting IP. BUT, internally, I have configured IIS to point them to different folders so that different websites are shown. (My hosting plan is expensive and I intend to use the space and bandwidth for many websites.) Still, technically, both domains point to the same IP address. Is this a bad thing? Is this what is called "domain parking"? I have read search engine forum posts saying that two domains pointing to the same IP/website will be penalized by search engines, and also that simply "parking" domains won't attract a penalty. I don't know whether what I have done is parking or the so-called "wrong" thing. Can someone shed light on what I have done and what I should do? I don't want to be blacklisted by any search engine. P.S. I know this is not a search engine forum, but I am new to website hosting and domains, and weak in nearly all the related technical terms and concepts. I thought this would be a good place to understand these things.

  • Folder cannot be deleted

    - by Aaron
    I am using Windows XP Home Edition. When I try to delete a folder I have named cygwin - or any file or folder within it - there is a long pause, and then an error pops up saying:

        Cannot delete Cygwin: Access is denied.
        Make sure the disk is not full or read-write protected
        and that the file is not currently in use.

    I have tried deleting the folder and the files it contains with FileAssassin, and unlocking it with LockHunter. Neither reports any errors unless I actually have them delete the file or folder, in which case I get an Access Denied error. I have rebooted into Safe Mode to change the ownership, but I get Access Denied when I click OK or Apply. How can I delete this folder?
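
    Cygwin creates files with POSIX-style ACLs that can leave even Administrators without NTFS rights, which matches these symptoms. On XP the command-line route is often more reliable than the Security tab; a sketch (replace Aaron with your actual account name, and run from an administrator account, ideally in Safe Mode):

        rem grant yourself full control recursively, editing the existing ACLs
        cacls C:\cygwin /T /E /G Aaron:F

        rem then remove the tree
        rd /s /q C:\cygwin

    If cacls itself reports Access Denied, ownership has to be seized first (folder Properties > Security > Advanced > Owner, in Safe Mode) and the commands re-run.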

  • Is there a way to tell if a file is done copying?

    - by Mike Cooper
    The scenario is this: machine A has files I want to copy to machine C. Machine A can't access C directly, but can access machine B, which can access machine C. I am using scp to copy from A to B, and then from B to C. Machine B has limited storage space, so as files come in I need to copy them to C and delete them from B. The second copy is much faster, so bandwidth is no problem. I could do this by hand, but I am lazy. What I would like is to run a script on B or C that copies each file to C as it finishes arriving. The scp job runs from A. So what I need is a way to ask (preferably from a bash script) whether file X.avi is "done" copying. Each of these files is a different size, and I can't really predict the size or time of completion. Edit: by the way, the transfer times are about 1 hour from A to B and about 10 minutes from B to C, if the time scale matters at all.
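
    One reliable signal on B is whether any process (i.e. the sshd/scp receiving from A) still has the file open, and lsof exposes exactly that. A sketch of a polling loop for machine B - directories, host, and interval are hypothetical:

        #!/bin/bash
        # relay files from B to C as soon as the incoming scp has closed them
        INCOMING=/data/incoming
        DEST=user@machinec:/data/store

        while true; do
            for f in "$INCOMING"/*; do
                [ -f "$f" ] || continue
                # lsof exits non-zero when nothing holds the file open
                if ! lsof -- "$f" > /dev/null 2>&1; then
                    scp "$f" "$DEST" && rm -- "$f"
                fi
            done
            sleep 30
        done

    An alternative with the same idea, if lsof isn't available, is recording each file's size on two successive passes and only relaying files that have stopped growing.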

  • Troubleshooting an overheating CPU

    - by Jeff Fry
    My father and I just recently put together a new PC; specs below. From the very beginning, on boot it often complains that the CPU is too hot. If I sit in the BIOS and watch the CPU, it drops back down from red to blue (<72C), at which point I've tended to just boot into Windows - and haven't had any problems. In fact, I've played a couple of hours straight of Skyrim at max settings without any visible issues. That said, I've occasionally walked away and come back to find that the machine has crashed. Yesterday it crashed (while idle) twice in 12 hours, which shifted the balance from busy-with-life to nervous-I'm-about-to-melt-something. I just installed Core Temp, which shows my 4 cores fluctuating between 70-98C. I'm guessing at this point that the CPU fan may be incorrectly installed or defective. My first thought is to either (a) add water cooling (which the case supports) and/or (b) replace the CPU fan with an after-market one. That said, I'm very open to suggestions. A note: while I certainly don't want to burn money here, I have a baby coming any day now and am still unpacking from a recent move, so if I have a choice between an option that costs money and another that takes a while... I'll happily spend a bit extra. Side question: should I be nervous to even have this machine on at this point? Let me know if there's something useful I could add to my report. Otherwise, I'm looking forward to your suggestions! Thanks.

        CPU: Intel i7-2600 with stock fan
        Motherboard: ASUS P8Z68-V Pro
        Boot drive: 64GB SSD
        Storage: 4 older SATA HDs
        Video: GIGABYTE ATI Radeon HD6950 1GB GDDR5
        RAM: 8GB Kingston T1 Series
        PSU: Corsair 650W Gold Certified
        Case: Antec P280

  • ProFTPd: Multiple Domain VirtualHosts on one IP address

    - by Badger
    I have a webserver that we are giving a consultant FTP access to. For one domain hosted on that server he needs access to a "dev" directory, and for a different domain hosted on the same server he needs access to a different directory. I am trying to set this up with VirtualHosts, but I am having issues. Here is the VirtualHost section of my proftpd.conf file:

        <VirtualHost www.example2.com>
            ServerName "Example 2"
            DefaultRoot /var/www/example2/dev
        </VirtualHost>

        <VirtualHost www.example1.com>
            ServerName "Example 1"
            DefaultServer on
            DefaultRoot /var/www/example1
        </VirtualHost>

    When I FTP to either domain I always get the first VirtualHost, even if I FTP to the second domain.
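
    This behaviour is expected with plain FTP: unlike HTTP, the protocol never transmits the hostname the client typed, so ProFTPD can only distinguish <VirtualHost> sections by IP address or port - two names resolving to one IP always land in the same vhost. With a single IP, the usual alternatives are one vhost per port, or dropping vhosts entirely and steering users by group. A sketch of the group approach (the group names are hypothetical, and the assumption is that the first matching DefaultRoot wins - worth verifying against the ProFTPD docs for your version):

        # proftpd.conf - chroot each consultant group into its own tree
        DefaultRoot /var/www/example2/dev dev-consultants
        DefaultRoot /var/www/example1     example1-consultants
        DefaultRoot ~                     # everyone else stays in their home

    Each consultant then gets a system account in the appropriate group, and no per-domain FTP configuration is needed at all.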

  • Running a webserver behind a firewall, is it secure?

    - by i.am.intern
    Currently we have a Linux-based firewall NAT-ing our public IP address to give internet access to our staff's PCs, and a Windows Server 2003 box for internal file sharing. I want to host Redmine/SVN (a bug tracker) internally behind this firewall using a Linux server. This webserver will be accessed externally by our clients so they can post bug reports. That means I have to open ports 80 and 22 on the firewall, to give them access to the webserver and me SSH access from home. However, suppose I'm using password-based SSH on the webserver and somebody cracks it. Does that mean the cracker could then ping and access the other servers and PCs in the network?
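
    The usual way to shrink that risk is to disallow passwords on the exposed box entirely (so the brute-force scenario goes away) and, where possible, to put it on its own segment/DMZ so a compromise doesn't grant a LAN foothold. A sketch of the relevant /etc/ssh/sshd_config lines - the username is a placeholder:

        # keys only - password guessing becomes useless
        PasswordAuthentication no
        PubkeyAuthentication yes
        PermitRootLogin no
        # only this account may log in at all
        AllowUsers redmine-admin

    After editing, reload sshd from an existing session and verify that a new key-based login works before closing it.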

  • How to block all multicast traffic travelling through a Cisco Catalyst 3750

    - by TrueDuality
    Something changed today. I can't track down what, but one of our 3750s decided it was going to forward all the multicast traffic it saw from the ghost server across every VLAN it has. I tried writing a simple access group consisting of the following:

        access-list 100 deny ip any 224.0.0.10 0.0.0.255
        access-list 100 permit ip any any

    I apparently mistakenly assumed that, once applied to an interface, it would block all of the multicast traffic on that interface regardless of VLAN. I do not want any multicast traffic flowing through this particular switch to any VLAN, or even staying on the same VLAN beyond this switch. Does anyone have any ideas?
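
    One thing to note about the ACL as written: the wildcard 0.0.0.255 against 224.0.0.10 only matches 224.0.0.x, while multicast is the whole 224.0.0.0/4 block. A sketch that denies the entire range and applies it inbound on a port (the interface name is a placeholder; an ACL only filters the interface it's applied to, so it would need to go on each ingress port - or on the SVIs - to cover every VLAN):

        access-list 100 deny   ip any 224.0.0.0 15.255.255.255
        access-list 100 permit ip any any

        interface GigabitEthernet1/0/1
         ip access-group 100 in

    Separately, if the real symptom is the switch suddenly flooding multicast to every port, it is worth checking whether IGMP snooping got disabled (show ip igmp snooping), since that is the usual cause of exactly that change in behaviour.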

  • How to copy protected files as an Administrator in Vista (easily)

    - by earlz
    Hello, I have a hard drive I need to back up. On the hard drive are, of course, things like Documents and Settings, which is set not to allow other people to see inside anyone's personal folders. I am an administrator, though, and I cannot figure out how to mark these files so that I am permitted to access and copy them. When I double-click on My Documents, a dialog pops up saying I must have permission to access this, with options like OK or Cancel. I click OK, and then it says I do not have permission to access these files. I'm an administrator on the system, so I don't understand why Vista is locking me out. How can I set up Vista so that it will let me copy every file, even ones I don't have permission for?
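
    Vista ships with takeown and icacls, which make this scriptable from an elevated Command Prompt (right-click > Run as administrator). A sketch, assuming the old disk is mounted as D: - adjust the path to your layout:

        rem take ownership of the old profile tree recursively
        takeown /F "D:\Documents and Settings" /R /D Y

        rem grant Administrators full control so the copy can read everything
        icacls "D:\Documents and Settings" /grant Administrators:F /T

    Once both commands finish, an ordinary Explorer copy of the tree should succeed without further prompts.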

  • Terminal command to change permissions to my 'Sites' folder and apply change to enclosed items?

    - by Ryan
    Using Snow Leopard, I'm having issues with permissions in my Sites folder. While I can navigate to localhost/~username and read any files or folders there, the same permissions have not been applied to enclosed items, and I get a 403 error trying to access them in the browser. If I select one of these enclosed folders and Get Info in Finder, I see the user 'Everyone' is set to 'No Access', but I can't change that (this behavior seems buggy, actually). And if I select my Sites folder, the 'Apply to enclosed items' tool is grayed out... Is there a Terminal command I can use to grant 'Read Only' access to my Sites folder, and everything it contains, for the user 'Everyone'?
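
    A sketch of the usual fix - chmod with a capital X, which adds execute (traverse) permission to directories but not to plain files:

        # others get read on everything, plus search permission on directories
        chmod -R o+rX ~/Sites

    Apache also needs search permission on the path leading to Sites, so if the 403 persists it is worth checking the home directory itself with chmod o+x ~.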

  • How to secure svn+ssh checkout users?

    - by vvanscherpenseel
    All our SVN repositories are hosted on a dedicated machine to which all the developers have access. Every now and then we need to check out a repository on a machine we don't own or operate ourselves. Currently we all use our own system (SSH) accounts for this, but I would like to use a generic 'checkoutsvn' user instead. This user would only be used for checking out from a repository and should not be allowed to log in to the system (no shell access). I tried setting that account's default shell to /sbin/nologin, but then SVN fails - apparently svn+ssh requires shell access. How do you do this? Is there a good solution for this?
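
    svn+ssh does need to run a command over SSH (svnserve -t), but it does not need an interactive login; the standard trick is a forced command in the shared account's authorized_keys, which restricts each key to exactly that. A sketch - the repository root and tunnel user are assumptions:

        # ~checkoutsvn/.ssh/authorized_keys - one line per developer key
        command="/usr/bin/svnserve -t -r /var/svn --tunnel-user=alice",no-port-forwarding,no-X11-forwarding,no-agent-forwarding,no-pty ssh-rsa AAAA... alice@example

    The account keeps a normal shell (sshd uses it to exec the forced command, so /sbin/nologin would still break things), but every connection can only ever run svnserve; --tunnel-user makes commits record the real developer's name rather than 'checkoutsvn'.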

  • Nagios LDAP-group-based frontend login permission issues

    - by Eleven-Two
    I want to grant users access to the Nagios 3 core frontend by using an Active Directory group ("NagiosWebfrontend" in the code below). The login works fine like this:

        AuthType Basic
        AuthName "Nagios Access"
        AuthBasicProvider ldap
        AuthzLDAPAuthoritative on
        AuthLDAPURL "ldap://ip-address:389/OU=user-ou,DC=domain,DC=tld?sAMAccountName?sub?(objectClass=*)"
        AuthLDAPBindDN CN=LDAP-USER,OU=some-ou,DC=domain,DC=tld
        AuthLDAPBindPassword the_pass
        Require ldap-group CN=NagiosWebfrontend,OU=some-ou,DC=domain,DC=tld

    Unfortunately, every Nagios page just shows "It appears as though you do not have permission to view information for any of the services you requested...". I got the hint that I am missing a contact in the Nagios configuration matching my login, but creating one with the same name as the domain user had no effect on this issue. However, it would be great to find a solution without manually editing nagios.conf for every new user, so the admins could grant access to Nagios just by putting the user in the "NagiosWebfrontend" group. What would be the best way to solve it?
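
    Nagios authenticates via Apache but authorizes against its own lists in cgi.cfg, so AD group membership alone is never enough. If everyone who can log in may see everything, the blanket options in cgi.cfg avoid per-user contacts entirely - a sketch, with the caveat that whether this level of access is acceptable is a policy decision on your side:

        # /etc/nagios3/cgi.cfg
        use_authentication=1
        authorized_for_all_services=*
        authorized_for_all_hosts=*

    For finer-grained access, the REMOTE_USER name must match a contact associated with each host/service, so the one-contact-per-user route is unavoidable in that case.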

  • Getting started with the vCenter Web Client Administration tool

    - by Saariko
    I am trying to access a newly deployed vCenter. The documentation clearly says to access the web admin through:

        https://localhost:9443/admin-app

    but since there is no Windows OS underneath this vCenter (I use the vCenter Appliance), there is no localhost to run it from, and if I try with the host's IP I get an error. This PDF explains how to install an IIS component - but it covers ESX 4 and does not mention the appliance. So, a simple question: how can I access the web-app admin tool? I also found a similar question on the VMware forum, but I can't understand the solution, if any.

  • Computer not finding hard drives on boot - sometimes

    - by todd.pund
    Computer specs:

        Mobo: Gigabyte Ultra Durable 3 - GA-970A-UD3
        Processor: first-gen i7, 3.2GHz
        RAM: 8GB Kingston DDR3 1066
        Video card: EVGA NVidia GTX 460 1GB
        Hard drives: 2 x 500GB 7200rpm (can't remember the brand, sorry - I'm at work)

    Last week my Developer Preview of Windows 8 expired, so I put my copy of Windows 7 back on the computer. From that point the computer started freezing and crashing frequently. When I rebooted, sometimes it wouldn't find the system HD at all; the POST screen seemed to show it finding neither of the HDs. Then yesterday, when turning on the computer, I just got GRUB as a message (not a GRUB prompt, just "GRUB") - and I haven't had a dual boot with Linux for at least a year. I loaded the Windows 7 recovery console from the disc and ran:

        bootrec /fixboot
        bootrec /fixmbr

    which did not help. At that point I just installed Ubuntu 13.04 over the Windows 7 install and still got the GRUB message at POST. I went into the BIOS, switched the hard drive priorities, and then it loaded into Ubuntu fine. For several days everything was hunky-dory, until I installed the Ubuntu version of Steam, installed Portal, and tried to run it. At that point the computer froze, and after a hard reboot it couldn't find the hard disks again. After another restart the system loaded up fine, and there have been no issues since (I have not tried to launch Portal again). My next thought is to remove the system hard drive and try using the secondary as the master, to see whether the primary HD is bad. I'm sorry if this has been confusing; I'll answer any questions I can. Any thoughts?

  • How to set the network profile of Windows 7 via group policy?

    - by Ricket
    We are deploying client computers, and in testing we noticed that the first time a user logs into a computer, Windows asks whether the current location is a home, work, or public location. We are worried that some users in our workplace might misread it (or not read it at all) and click Public, thereby denying us access to the computer and messing up its security settings. Can we set our network to be a "Work network" location via group policy, or some other mechanism of our Windows domain, so that the user is not prompted when connected to our network? Also, these are laptops, so we don't want every network they connect to to be set as a work network; and we have several access points (wired and three wireless) which our users often switch between. I'm not yet sure whether Windows re-prompts for each access point, but I have the feeling it will, and I would like all of these to be set to the Work profile type.
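
    Two facts that usually resolve this: networks on which the machine can reach its domain controller are classified as Domain networks automatically (no prompt at all), and for everything else there is a dedicated GPO node - Computer Configuration > Windows Settings > Security Settings > Network List Manager Policies - where 'Unidentified Networks' and named networks can be forced to a location. The per-network choice itself is stored in the registry, which is handy for auditing what a machine has already decided; a sketch (the Category value is 0 = Public, 1 = Private/Work, 2 = Domain):

        reg query "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\NetworkList\Profiles" /s

    Profiles are keyed on the network's identity (default-gateway MAC address among other signals), not on the individual access point, so roaming between APs on the same LAN normally does not trigger a new prompt.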
