Search Results

Search found 34056 results on 1363 pages for 'mod access'.

  • Windows Server 2008 - one MAC address, assign multiple external IPs to VirtualBox guests running on the host

    - by Sise
    Couldn't find any help on Google or here. The scenario: Windows Server 2008 Std x64 on an i7-975 with 12 GB RAM. The server is running in a data centre. One hardware NIC (Realtek PCIe GBE), so one MAC address. The data centre provides us four static external IPs; the first is assigned to the host by default, of course. I have ordered all four IPs, but the data centre can only assign the available IPs to the physical MAC address of the given NIC. This means one NIC, one MAC address, four IPs. Everything works fine so far.

    Now, what I would like to have: VirtualBox installed with 1-3 guests running, each with its own external IP assigned. Each guest should be a standalone Windows Server 2008. It looks like the easiest way would be to put the guests into a virtual subnet and route all traffic arriving at the 2nd through 4th external IPs through to the guests via their subnet IPs. I have been through the networking chapter of the VirtualBox User Manual. What's not working:

    - I can't use bridged networking on its own, because the IPs are assigned to the one MAC address only.
    - I can't use NAT networking, because it does not allow access from outside or from the host to the guest, and I do not want to use port forwarding.
    - Host-only networking by itself would not allow internet access. By sharing the host's default internet connection, internet access is granted from the guest to the outside, but not from outside or from the host to the guest.
    - Internal networking is not really an option here.

    What I have tried is to create an additional MS Loopback adapter for a routed subnet containing the VBox guests; the idea was then to NAT the internet connection to the loopback "subnet". But I can't ping the gateway from the guests, and using the route command in the command shell, or RRAS (static route, NAT), I didn't get there either. Suggestions like the following work for one direction, but not for the way back:

        For your situation, it might be best to use the Host-Only adapter for ICS. Go to the preferences of VB itself and select network. There you can change the configuration for the interface. Set the IP address to 192.168.0.1, netmask 255.255.255.0. Disable the DHCP server if it isn't already, and that's it. Now the guest should get an IP from Windows itself and be able to get onto the internet, while you can also access the host.

    Slowly I'm getting pretty stuck on this topic. There is a possibility I've just overlooked something, or am simply not getting it, especially with RRAS, but it's quite hard to find useful howtos on this. Thanks in advance! Best regards, Simon
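
    (One mechanism that does ship with Server 2008, if per-port relaying turns out to be acceptable after all, is netsh interface portproxy, which relays TCP connections arriving on one of the extra external IPs to a guest's host-only address. A hedged sketch; all addresses here are made up:

        rem 2nd external IP assumed 203.0.113.12; guest's host-only IP assumed 192.168.56.101
        netsh interface portproxy add v4tov4 listenaddress=203.0.113.12 listenport=80 connectaddress=192.168.56.101 connectport=80
        rem list the configured relays
        netsh interface portproxy show all

    This has to be repeated per port and per IP, which is exactly the port forwarding the post hopes to avoid, but it needs no RRAS.)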

  • Apache2 doesn't serve PHP scripts correctly

    - by cmbrnt
    I've run into a problem with my Apache 2.2.16 configuration, running on Debian Squeeze: it has stopped serving PHP5 scripts completely. When I try to access the sites with Google Chrome, it instead downloads a file called "download", which contains the contents of the script. This is of course not a good thing. It does serve common HTML files perfectly... I've been at this for quite a while now, and after all the googling and troubleshooting I thought it would be a good time to ask you guys. Here's what I've got:

    - The php5 and libapache2-mod-php5 packages are installed.
    - /etc/apache2/mods-available contains both php5.load and php5.conf, and these are symlinked from the mods-enabled directory.
    - The /etc/php5/ directory is untouched since installation.

    Here's the contents of /etc/apache2/mods-available/php5.load:

        LoadModule php5_module /usr/lib/apache2/modules/libphp5.so

    And /etc/apache2/mods-available/php5.conf:

        <IfModule mod_php5.c>
          <FilesMatch "\.ph(p3?|tml)$">
            SetHandler application/x-httpd-php
          </FilesMatch>
          <FilesMatch "\.phps$">
            SetHandler application/x-httpd-php-source
          </FilesMatch>
          <IfModule mod_userdir.c>
            <Directory /home/*/public_html>
              php_admin_value engine Off
            </Directory>
          </IfModule>
        </IfModule>

    What am I missing? This is a server with modified virtual hosts and the like, so I might have changed some setting that causes this, but simply purging and reinstalling is not an option so far, since the configuration is quite extensive. Any help would be great. Thanks.
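
    (A hedged checklist for this symptom: PHP source being offered as a download usually means the php5 handler isn't actually being applied. These are stock Debian commands, not a guaranteed fix:

        # confirm the module is really loaded, not just symlinked
        apache2ctl -M 2>/dev/null | grep php5
        # re-enable it and restart, in case the symlinks are stale
        a2enmod php5
        /etc/init.d/apache2 restart
        # look for a vhost or conf file that disables the engine for the
        # affected sites; a stray "engine off" in a modified vhost
        # produces exactly this download-instead-of-execute behaviour
        grep -Ri "php_admin_flag engine" /etc/apache2/
        grep -Ri "php_admin_value engine" /etc/apache2/sites-enabled/

    The last two greps are the interesting ones on a server with heavily modified virtual hosts.)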

  • OWA draft folder doesn't sync up with Outlook 2007 (when using Citrix)

    - by George
    I have a user logging in to a Citrix server (on Windows 2003) to use Outlook 2007. In OWA he sees all his drafts in the Drafts folder and can easily access them, but in Citrix he can see the folder, not the messages. I had him check Normal and Favorite Folders under View - Navigation Pane, and also run outlook /cleanviews, to no avail. I should also clarify: we host Exchange locally and it syncs with Outlook 2007 in Citrix. Remote users either use OWA for access or log in to Citrix and use Outlook 2007. In his case ALL folders appear in Outlook 2007, but the Drafts folder doesn't show any saved messages, even though in OWA the messages are there and he can edit, delete and send them. Please help! Thanks!!!

  • Finding proof of server being compromised by Black Hole Toolkit exploit

    - by cosmicsafari
    I recently took over maintenance of a company server (Just Host, cPanel, Linux) hosting a ton of websites I know nothing about. It came to my attention that a client had attempted to access one of the websites hosted on this server and was met with a warning from Windows Defender: it blocked access because it said the website had been compromised by the Black Hole Toolkit, or something to that effect. Anyway, I went in, updated various plugins and deleted some old suspect websites. I have since run the website in question through a few online malware scanners and it comes up clean every time. However, I'm not convinced. Do any of you know more extensive ways I can check that the server isn't still compromised? I have no way to install malware scanners or antivirus programs on the server, as it is horribly locked down by Just Host.
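
    (Even without root or an AV install, a few shell checks run over the web roots can surface the usual Blackhole-style injections. A hedged sketch; paths are assumed:

        # PHP files changed in the last two weeks
        find ~/public_html -name '*.php' -mtime -14 -ls
        # common obfuscation markers injected by exploit kits
        grep -rl --include='*.php' -e 'base64_decode' -e 'eval(' ~/public_html
        # hidden iframes appended to templates and footers
        grep -rl --include='*.php' --include='*.html' '<iframe' ~/public_html
        # .htaccess redirects keyed off search-engine referrers
        find ~/public_html -name .htaccess -exec grep -l 'HTTP_REFERER' {} \;

    Hits aren't proof by themselves (some plugins legitimately use base64_decode), but files matching these patterns that you didn't change recently deserve a manual look.)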

  • Network connectivity issues with Windows Store

    - by Duy Tran
    I have Windows 8 Pro build 9200 installed on my Dell laptop. I want to install some new apps and updates from the Store, but there may be a network problem causing the download gauge to appear without ever actually running. I followed some instructions to switch from a local user to my Microsoft account, but a "Please wait" screen keeps showing and I don't really know why. I still have internet access and can use apps like People, Mail, etc. with my account logged in, and I can surf the net using Firefox, Chrome and Internet Explorer. I did another test from cmd with ping -t google.com, which showed that the laptop has internet access. Does anybody know a solution to make the Store work properly? Or is there any workaround to switch to the Microsoft account instead of a local user account?
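
    (Two hedged things worth trying from an elevated prompt on Windows 8: clearing the Store cache, and ruling out a stale system-wide proxy, since the Store uses the WinHTTP proxy rather than the browser's:

        :: reset the Windows Store cache (ships with Windows 8)
        wsreset.exe
        :: inspect and clear the WinHTTP proxy setting
        netsh winhttp show proxy
        netsh winhttp reset proxy

    Neither is guaranteed to be the cause, but both are quick to rule out.)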

  • Looking at desktop virtualization, but some users need 3D support. Is HP Remote Graphics a viable solution?

    - by Ryan Thompson
    My company is looking at desktop virtualization, planning to move all desktop compute resources into the server room or data center and provide users with thin clients for access. In most cases a simple VNC or Remote Desktop solution is adequate, but some users run visualizations that require 3D capability, something VNC and Remote Desktop cannot support. Rather than making an exception and providing desktop machines for these users, complicating our rollout and future operations, we are considering adding servers with GPUs and using HP's Remote Graphics to provide access from the thin client. The demo version appears to work acceptably, but there is a bit of a learning curve, it's not clear how well it would work for multiple simultaneous sessions, and it's not clear whether it would be a good solution for non-3D sessions. If possible, as with the hardware, we want to deploy a single software solution instead of a mishmash. If anyone has had experience managing a large installation of HP Remote Graphics, I would appreciate any feedback you can provide.

  • ASA 5505 8.4 open ports for subnet

    - by fwrawx
    I have an ASA 5505 running 8.4 with its outside interface plugged into our internal network. I want to open up access for hosts on our internal network to hosts on one of the VLANs behind the ASA. I was just starting to grasp NAT on our older PIX, but ASA 8.4 has me confused. Given a clean ASA with an outside VLAN of 10.0.0.1/24 and a test VLAN of 10.0.1.1/24, what's the basic configuration needed to allow any host on the outside network to access any host on the test network?
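
    (One way this is commonly done on 8.3+ code, sketched with assumed interface names ("outside", "test") and addresses: a static NAT for a test-VLAN host plus an inbound ACL on the outside interface. Note that post-8.3 ACLs reference the real address, not the mapped one:

        object network test-host
         host 10.0.1.10
         nat (test,outside) static 10.0.0.50
        !
        access-list OUTSIDE_IN extended permit ip any object test-host
        access-group OUTSIDE_IN in interface outside

    For the whole subnet rather than one host, static object NAT on a network object (subnet 10.0.1.0 255.255.255.0) works the same way, provided the internal network has a route to whatever mapped range is chosen.)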

  • How to configure Amazon Security Groups to achieve multi-tier architecture?

    - by ks78
    What is the preferred way to configure Amazon Security Groups to achieve a multi-tier architecture? Each of my instances has its own security group, which I only want to use for rules specific to that instance. I'd like to keep rules which apply to multiple instances in a separate security group, which can then be assigned to the instance security groups as necessary. As an example, I've set up a group called "admin", which allows administrative access from my IP. I added the "admin" group as the source to each of my instance security groups. However, I still can't access the instances from my IP without adding the rules directly to the instance's own group. Am I missing something? Although it seems a multi-tier security architecture should be possible, it doesn't seem to be working.
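
    (The behaviour described is actually expected: naming a security group as a source matches traffic originating from instances in that group, over their private addresses; it does not graft the "admin" group's rules onto the groups that reference it. Group nesting is not rule inheritance. Since an instance can belong to several groups at once, the usual pattern is to attach the shared "admin" group to every instance alongside its per-instance group. A hedged example in modern CLI syntax; the group name and IP are hypothetical:

        aws ec2 authorize-security-group-ingress \
            --group-name admin --protocol tcp --port 22 --cidr 203.0.113.7/32

    With that rule in place and "admin" attached directly to each instance, the per-instance groups stay free of shared rules.)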

  • TV network capabilities

    - by Narcolapser
    Question: Can the major TV manufacturers' internet TV systems access network resources? Info: I'm looking at large TVs for my company right now, to set up in conference rooms. We want the ability to load presentations without needing a computer to do so. Our hope is to put things on network drives and access and display them from there. I've heard that LG's TVs can do this if you convert the PowerPoint file into a show format; that's fine. I just need to get this information to the TV without a computer attached. Can anyone tell me if companies like LG, Vizio, Sony, Samsung, etc. have TVs that are capable of doing this? Thanks ~n

  • How to check use of userva boot option on Win 2K3 server

    - by Tim Sylvester
    I have some 32-bit Win2K3 servers running an application that fails now and then, apparently due to heap fragmentation (process virtual bytes grows, private bytes does not). I do not have access to the source code or build process of this application. I have modified the boot.ini file on one of these servers to include /userva=2560, halfway between the normal mode of operation and the /3GB option. Normally it takes weeks to reach the point of failure, but I'd like to see right away whether this has actually had any effect. As I understand it, this option limits the kernel to the remaining address space (1536 MB instead of 2048 MB), but does not necessarily give an application the extra address space, depending on the flags in the application's PE header. How can I determine whether the OS is allowing a particular application, running in production, to access address space above 2 GB? Additionally, what's the best way to monitor the system to ensure that the kernel is not starved for address space, and more generally, how should I go about finding the optimal value for this setting?
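
    (Whether a given executable may use the extra user address space is a bit in its PE header, IMAGE_FILE_LARGE_ADDRESS_AWARE. A hedged way to check it with the Visual Studio command-line tools, no source access needed; the binary name is assumed:

        :: prints "Application can handle large (>2GB) addresses" if the flag is set
        dumpbin /headers MyApp.exe | findstr /C:"large"
        :: the flag can also be set on an existing binary (at your own risk,
        :: and the next vendor update will undo it)
        editbin /LARGEADDRESSAWARE MyApp.exe

    For the kernel side, the Memory\Free System Page Table Entries performance counter is the usual early warning that /userva has squeezed the kernel too hard.)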

  • Move a screen session back to its original PID

    - by cron410
    Installed McMyAdmin (a Minecraft manager) on Ubuntu 12.04 32-bit. I wrote my own service to start McMyAdmin (a .NET app running in Mono) in its own screen session, and to be able to inject McMyAdmin commands into that session from the init.d script. It's been running great!

    Today I decided to start installing a Source dedicated server (a Counter-Strike pro mod). I determined it was going to be a long download, so I quit the process and fired up a fresh screen session called "source". I pasted the command in, but it had a space at the beginning and bash complained about ignoring semaphores or some such. I detached and reattached the session; everything was sliding like butter. I Ctrl-a d out of the session and went exploring the new folder structure to figure out where I needed to place a symbolic link. Then I went to resume that screen, and this is what I saw:

        $ screen -r source
        There are several suitable screens on:
                20091.source     (12/02/12 22:59:53)     (Detached)
                19972.source     (12/02/12 22:57:31)     (Detached)
                917.minecraft    (11/30/12 15:30:37)     (Attached)

    It appears I am connected to the minecraft screen?! So I attached to the other screens one at a time: minecraft is running in 19972.source and srcds is running in 20091.source. So how did I move the minecraft process to another session called "source", and why is my main terminal now "attached" to the minecraft screen?

    More: I used exit to quit the PuTTY session, then logged back in; it was still the same. I did that 3 more times and now the minecraft screen is gone, and everything is acting as it should except, of course, for the session name and start time of the "new" minecraft screen. Should I just submit this as a bug for GNU screen?
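
    (For untangling which process lives in which session, a hedged pair of commands; sessionname is a standard screen command, and renaming only changes the label, never moves a process:

        # list sessions, then relabel the misnamed ones
        screen -ls
        screen -S 19972.source -X sessionname minecraft
        screen -S 20091.source -X sessionname srcds
        # attach to verify contents before renaming
        screen -r 19972.source    # detach again with Ctrl-a d

    Since labels can lie, the pid prefix (917, 19972, 20091) is the only reliable identifier; matching those pids against ps output shows what each session really runs.)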

  • Primary/secondary ethernet interfaces via NetworkManager in Ubuntu 9.10

    - by Josh
    I have an Ubuntu 9.10 machine with three ethernet interfaces: eth0, eth1 and eth2. eth2 is connected to a private network; eth0 and eth1 are connected to two different LANs. Either of those will provide access to the internet. All three networks have DHCP servers.

    Using Ubuntu's default settings (and GNOME), when I boot up, all the interfaces are active and my system gets three IP addresses. However, any attempt to access the internet results in connection timeouts and other weirdness. I suspect that traffic is going out on one NIC (like eth0) and coming back in on another (like eth1); I'm not sure what's going on. The only way I can access the internet at the moment is to bring two of the devices down with ifdown. How can I configure eth0 as my primary interface, so all traffic goes out on it by default, while keeping the other two active? Also, I want to make sure Avahi broadcasts properly on all three IPs so that computers on eth1's LAN can still connect to myHostname.local...

    EDIT: Here's my routing table:

        Kernel IP routing table
        Destination     Gateway         Genmask         Flags   MSS Window  irtt Iface
        172.16.151.0    0.0.0.0         255.255.255.0   U         0 0          0 eth2
        172.16.30.0     0.0.0.0         255.255.255.0   U         0 0          0 eth0
        10.1.0.0        0.0.0.0         255.255.0.0     U         0 0          0 eth1
        169.254.0.0     0.0.0.0         255.255.0.0     U         0 0          0 eth1
        0.0.0.0         172.16.30.2     0.0.0.0         UG        0 0          0 eth0
        0.0.0.0         10.1.0.1        0.0.0.0         UG        0 0          0 eth1

    I want the 172.16.30.0 network to be the primary one and the 10.1.0.0 network to be the secondary one.

    EDIT 2: My nameservers are also incorrect. It seems Ubuntu brings the networks up in order (eth0, then eth1, then eth2), and the DHCP information from eth1 overrides eth0, and eth2 overrides eth1. How can I reverse this so the DHCP information from eth0 is the "master"?

    EDIT 3: This seems to be an issue with GNOME's NetworkManager.
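
    (With two default routes at the same metric, the kernel simply uses the first one it finds, which fits the "out one NIC, back on another" symptom. A hedged way to force eth0 to be primary while keeping eth1 up, using the addresses from the table above:

        # demote eth1's default route to a fallback metric
        sudo route del default gw 10.1.0.1
        sudo route add default gw 10.1.0.1 metric 100

    In NetworkManager itself, the per-connection equivalent is editing eth1's connection and ticking "Use this connection only for resources on its network" in the IPv4 routes dialog, which stops it installing a default route at all.)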

  • Using wildcard domains to serve images without HTTP blocking

    - by iopener
    I read that browsers sometimes block waiting for multiple images from the same host, and I'm trying to do everything I can to speed up page load times. One caveat: I need to serve files over HTTPS. Any opinions on whether this is feasible:

    - Set up a wildcard cert for *.domain.com.
    - Whenever I need an image, generate a number from a hash of the filename mod 5, and append it to an "img" subdomain (e.g. img1.domain.com, img4.domain.com, img3.domain.com, etc.); the hash makes a given filename always use the same subdomain, so the browser should be able to cache the images.
    - Configure a dynamic virtualhost record to point all img#.domain.com subdomains to /var/www/img.

    I am looking for feedback on this plan. My concerns are:

    - Will I get warnings when my page has https:// links to multiple subdomains?
    - Is the dynamic virtualhost record I'm talking about even possible?
    - Considering the amount of processing this would require, is it likely to produce any overall benefit? I'm probably averaging a half-dozen images per page, with only half changed on each page refresh.

    Thanks in advance for your feedback.
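
    (The virtualhost part is standard Apache: one vhost whose ServerAlias lists the img subdomains (or mod_vhost_alias) can point them all at the same document root, and a single *.domain.com certificate covers all the names, so no mixed-subdomain warnings are expected. A hedged sketch; paths and filenames are assumed:

        <VirtualHost *:443>
            ServerName  img.domain.com
            ServerAlias img1.domain.com img2.domain.com img3.domain.com img4.domain.com img5.domain.com
            DocumentRoot /var/www/img
            SSLEngine on
            SSLCertificateFile    /etc/ssl/certs/wildcard.domain.com.crt
            SSLCertificateKeyFile /etc/ssl/private/wildcard.domain.com.key
        </VirtualHost>

    The hashing itself is cheap (one modulo per filename); the real cost of this kind of domain sharding is the extra DNS lookups and TLS handshakes, which is why a handful of subdomains is usually the practical limit.)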

  • What's the easiest way to allow remote Exchange 2003 users (no Outlook client) to check their mailbox size?

    - by Myrddin Emrys
    We are migrating from Exchange 2003, which has no quota settings, to Exchange 2010 with limited mailbox sizes. We are trying to get users to clean out their mailboxes prior to the move, both to reduce the transfer load and to comply with the new quotas on the 2010 system. But many users access their mail through webmail only, and I cannot see a way for them to check their mail store size in that manner. Has anyone else run into this problem? Is there a good way to easily let users check their own mailbox size? The only workaround I've come up with is a report that IT generates and mail-merges out to users daily with their current mailbox size. That is cumbersome and time-consuming compared to letting them check their own mailbox size.
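
    (For the report half, Exchange 2003 exposes per-mailbox sizes over WMI, so the mail-merge data can at least be pulled automatically. A hedged PowerShell sketch; the server name is assumed:

        # Exchange 2003: mailbox sizes (KB) from the Exchange_Mailbox WMI class
        Get-WmiObject -ComputerName EXCH01 -Namespace root\MicrosoftExchangeV2 `
            -Class Exchange_Mailbox |
            Select-Object MailboxDisplayName, Size, TotalItems

    Once the 2010 side is live, Get-MailboxStatistics replaces the WMI query outright.)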

  • How to make an Apache web server reachable from other computers?

    - by Eric
    I have been using the Apache web server for a while now, and one thing I have noticed is that it cannot be reached from outside my computer. I use my Apache server for PHP development on my machine, but I would like to access what I have made from other computers. I am on a Linksys router network. I usually browse to http://localhost/ or http://127.0.0.1/. I ran ipconfig on my computer and got 192.168.1.105, so I went there with my browser and got the page just fine. I tried doing this from another computer on the same network, but it didn't work. How do I fix this? Sorry about the wording, I am in a hurry. Information you might need: Server: Apache 2.2. Operating system: Windows 7 Ultimate.
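
    (The two usual culprits, as a hedged checklist: Apache bound only to the loopback address, or Windows Firewall dropping inbound port 80. From an elevated prompt:

        :: in httpd.conf, make sure the directive is "Listen 80",
        :: not "Listen 127.0.0.1:80", then open the port and restart:
        netsh advfirewall firewall add rule name="Apache HTTP" dir=in action=allow protocol=TCP localport=80
        httpd -k restart

    If the page then loads from 192.168.1.105 on the other machine, it was the firewall.)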

  • Dial-in VPN routing issue when on a 192.168.x.x network range

    - by Ian
    I'm not an expert on networks, but I have a small office on the 192.168.x.x range, managed by a Vigor 2800 router. I have enabled the VPN dial-in option on the router so I can get to the server on 192.168.1.100. This works fine from my MacBook when I'm NOT on a local network that is also in the 192.168.x.x range, e.g. when I tether over my Android smartphone. But when I try to connect from my home network, the VPN connects and I can access the router (192.168.1.1), yet I cannot access 192.168.1.100, and traceroute doesn't hop via 192.168.1.1. I have enabled "send all traffic over VPN connection", but again, no joy... It feels like OS X isn't routing the traffic out to the VPN endpoint because the destination address is on the local subnet, though I'd expect it to. This works fine from a Windows PC on the same home network. Any thoughts on what the issue could be?
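
    (The self-diagnosis sounds right: OS X sees a destination inside its directly connected 192.168.1.0/24 and never hands the packet to the tunnel. A hedged way to confirm it once the VPN is up, assuming the tunnel interface is ppp0 (check with ifconfig):

        sudo route -n add -host 192.168.1.100 -interface ppp0

    If the server becomes reachable after that, the durable fix is to renumber one side, e.g. moving the office onto an uncommon range like 192.168.73.0/24, so home and office subnets stop overlapping.)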

  • How to forward a connection from one interface to another under Linux

    - by Daniel
    Hi, I have a Linux box with two network interfaces, eth0 and eth1. From eth1 I can access an internal website, say on port 8080; from outside the box, that network can't be reached. My question is: is there a way to set things up so that, from outside the box, there appears to be a web server running on port 8080, and connections to it are automatically forwarded out eth1 to the internal site? I tried enabling IP forwarding and adding a static route, but it doesn't work. Thanks.
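
    (Forwarding alone isn't enough here; the box also has to rewrite the destination address, and, if the internal server doesn't route replies back through this box, the source as well. A hedged iptables sketch, with the internal server's address assumed to be 10.0.0.5:

        # let the kernel forward between interfaces
        echo 1 > /proc/sys/net/ipv4/ip_forward
        # rewrite connections arriving on eth0:8080 to the internal host
        iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 8080 \
            -j DNAT --to-destination 10.0.0.5:8080
        # masquerade so the replies come back through this box
        iptables -t nat -A POSTROUTING -o eth1 -p tcp -d 10.0.0.5 --dport 8080 \
            -j MASQUERADE

    This is why the static route alone didn't work: without the DNAT, nothing ever changes the packets' destination to the internal site.)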

  • Hard drive caught malware and all folders turned into shortcuts

    - by Ammar
    I have an external hard drive from Seagate. I think it accidentally caught malware, since all the folders on it became shortcuts. I have very important folders there, and now I cannot access them at all. I did not have an antivirus program; I had just formatted the PC and forgot to install one. Recently I installed Avira and it caught the malware, but since Avira removed it, I can't access anything. Please help me with what I need to do. I am really lost.
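
    (With this family of malware, the real folders are usually still on the drive, just marked hidden+system, while the shortcuts pointed at the now-removed dropper. A hedged recovery pass from cmd, with the drive letter assumed to be E: here:

        :: strip the hidden/read-only/system attributes from everything on E:
        attrib -h -r -s /s /d E:\*.*
        :: then clear out the leftover shortcut files and any autorun.inf
        del /q E:\*.lnk
        del /f /q E:\autorun.inf

    If the folders reappear after the attrib pass, the data was never gone, only hidden.)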

  • How to use 2 or more internet connections on the same network?

    - by Rogue
    Living in a joint family, we have 3 internet connections: each floor has its own, shared between that floor's 4-5 computers using a switch. Each of these networks is independent of the others. What I want to achieve is a local network (for local messaging and file sharing) that combines all 3 independent networks; the problem is that whenever I try, the whole network tends to use just one internet connection. I have all the necessary hardware. How do I solve this? One approach could be to have one PC act as a server and bridge the internet connections, with the whole network accessing the internet through that server. Theoretically this could be possible, but I have never tried it in real life. Also, if certain computers need to be restricted from internet access, how would this be possible on the same network?
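
    (One low-tech arrangement that matches this: put every machine on a single subnet so local messaging and file sharing just work, and steer internet traffic by giving each floor's machines their own floor router as default gateway. A hedged Windows sketch, all addresses assumed:

        :: a floor-2 PC: same LAN as everyone, but floor 2's router as gateway
        netsh interface ip set address "Local Area Connection" static 192.168.0.42 255.255.255.0 192.168.0.2 1

    Restricting a machine from the internet then amounts to giving it no gateway at all; it keeps full access to the local network either way.)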

  • vsftpd with pam_winbind.so

    - by David
    I'm trying to set up vsftpd to use logins from our domain. I want FTP users to be able to log in with their Active Directory username/password and have full access to /media/storage/ftp/username. I set up PPTP using winbind and it is working fine, so I believe the issue is with vsftpd and PAM. The FTP server runs but returns 530 on login. I turned on debug for the PAM module, but I see nothing in syslog; vsftpd only logs a failed login in its own log.

    /etc/pam.d/vsftpd:

        auth required pam_winbind.so debug

    /etc/vsftpd.conf:

        listen=YES
        listen_ipv6=NO
        connect_from_port_20=YES
        anonymous_enable=NO
        local_enable=YES
        write_enable=YES
        xferlog_enable=YES
        idle_session_timeout=600
        data_connection_timeout=120
        nopriv_user=ftp
        ftpd_banner=Welcome to Scantiva! Authorized access only!
        local_umask=022
        local_root=/media/storage/ftp/$USER
        user_sub_token=$USER
        chroot_local_user=YES
        secure_chroot_dir=/var/run/vsftpd/empty
        pam_service_name=vsftpd
        guest_enable=YES
        guest_username=ftp
        ssl_enable=YES
        allow_anon_ssl=NO
        force_local_data_ssl=NO
        force_local_logins_ssl=NO
        ssl_tlsv1=YES
        ssl_sslv2=YES
        ssl_sslv3=YES
        rsa_cert_file=/etc/ssl/private/vsftpd.pem
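
    (One hedged thing to check first: vsftpd's PAM stack needs an account entry as well as auth; with only the auth line, the account phase fails and vsftpd reports a bare 530 exactly as described. A minimal winbind stack would be:

        # /etc/pam.d/vsftpd  (sketch: both phases handled by winbind)
        auth     required  pam_winbind.so  debug
        account  required  pam_winbind.so  debug

    Separately, note that guest_enable=YES maps every authenticated login to guest_username=ftp, so AD users authenticate as themselves but touch files as the local ftp user; that is what lets local_root/$USER work without local Unix accounts.)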

  • Windows 7 can't connect to Lion file sharing

    - by Automaton
    Trying to access my Mac from a Windows 7 computer, I fail with the infamous "error 86: incorrect password". This appears to be a well-known problem, with countless threads on the internet giving as many "solutions" as there are threads: mostly ranging from installing third-party commercial Samba servers, to switching to some other protocol, to compiling a plain-vanilla Samba installation (the latter is what I will probably do when I give up on this :) ).

    I am stubborn, and I believe there must be some problem here that can be solved or worked around, but there is surprisingly little detail about it. It appears to have something to do with a mismatch of authentication methods. Running smbd in debug mode:

        sudo /usr/sbin/smbd -debug -stdout

    gets me this output when I try to access it from Win 7:

        ... smb1_dispatch_one [smb_dispatch.cpp:377] dispatching SMB_COM_SESSION_SETUP_ANDX
        smb1_dispatch_session_setup [session_setup.cpp:261] FIXME erase existing sessions
        log_gss_error [gssapi_mechanism.cpp:97] gssapi: gss-code: Miscellaneous failure (see text)
        log_gss_error [gssapi_mechanism.cpp:113] gssapi: mech-code: unknown mech-code 22 for mech unknown

    What is the problem here, and how do I fix it?
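
    (The gssapi/mech errors suggest the session setup is falling through to an authentication flavour Lion's smbd won't accept, so the "incorrect password" is really a negotiation failure. One knob often involved on the Windows side is the LAN Manager authentication level; a hedged registry tweak, to be tried at your own risk and undone if it changes nothing:

        reg add HKLM\SYSTEM\CurrentControlSet\Control\Lsa /v LmCompatibilityLevel /t REG_DWORD /d 1 /f

    A value of 1 lets Windows 7 fall back from NTLMv2-only to older responses during negotiation; whether that matches what Lion's built-in smbd expects is exactly the uncertain part.)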

  • Single sign-on integration with SVN

    - by ramdaz
    I need to authenticate my Windows users against a Linux server which will act as the primary authentication source. Users need to be authenticated and then use that access to run SVN or Mercurial (with something like the TortoiseSVN client), or some other versioning system. The versioning system needs to authenticate against the Linux server's authentication source, and users need to use their Windows login username and password. Normally I would have attempted this with Samba, but is there a better choice? Also, how do you create a roaming profile? That is, anyone should be able to access their SVN from any PC, as long as they use the right Windows username and password.
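
    (A common pattern for this, sketched rather than prescribed: serve the repositories through Apache with mod_dav_svn, and authenticate directly against the Windows directory with mod_authnz_ldap. Users then type their Windows credentials into TortoiseSVN from any PC, which covers the "roaming" requirement, since nothing is stored per machine. The domain names and bind account below are hypothetical:

        <Location /svn>
            DAV svn
            SVNParentPath /var/svn
            AuthType Basic
            AuthName "Company Subversion"
            AuthBasicProvider ldap
            AuthLDAPURL "ldap://dc1.example.local/DC=example,DC=local?sAMAccountName?sub"
            AuthLDAPBindDN "CN=svc-svn,CN=Users,DC=example,DC=local"
            AuthLDAPBindPassword "secret"
            Require valid-user
        </Location>

    Basic auth should be wrapped in HTTPS, and this assumes the "primary authentication source" is, or fronts, an LDAP-speaking directory; with a plain Linux user database, Samba (as the post suggests) or a PAM-backed Apache auth module would be the alternatives.)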

  • Articles on 386 and later CPU-based systems

    - by user32569
    Hi there. I know this is a hard question, and possibly not one to be answered here, but if there is some article (or more) you know about, please post a link. As for books, it's sad, but many great computer books cannot be bought in my country. You can find many articles online describing how memory was mapped on pre-386 CPUs: how there were explicit holes reserved for the MMIO BIOS, video BIOS, etc., how there was the A20 line for allowing higher memory access, and so on. The problem is, times have changed. Today BIOSes are many times larger, and pure 16-bit x86 mode is used only for booting and ROM flashing; OSes ignore the BIOS, accessing everything through drivers. I just want to know how it works today. I know, not a very specific question, but I have read the OSDev wiki and many articles, and they all refer to the days before widespread use of pure 32-bit CPUs.

  • Alternative Windows Offline Files + Windows Backup + Previous Version Setup

    - by Herson
    Currently our documents are all hosted on a Windows 7 box. Users access the files over a Windows share, and the documents are available offline (a Windows 7 feature). The documents are backed up daily by the Windows 7 Backup and Restore utility, and users can access previous versions of a file (from the backups) using Windows Explorer's "previous versions" feature. This setup is currently working well, except for the following:

    - We would prefer to have access to hourly versions of each file, not daily.
    - The previous-versions mechanism is tied to the backup mechanism. Windows 7 performs a full backup every week and an incremental backup every day, and the previous versions of a file are simply what is available in the backups. If you have 20 GB of documents and want to keep at least three years of history, you will use at minimum 3 years * 52 weeks * 20 GB, or about 3 TB, even if the documents change little. That's a pretty inefficient use of space.
    - Looking up previous versions of a file is very slow (tens of minutes). This is probably related to the previous issue: Windows has to traverse all of its backups.

    I am considering using SVN plus auto-commit/auto-update with TortoiseSVN. It would have the following advantages:

    - Backups are easy and also capture the whole history of each document (just back up the repository).
    - Previous versions can be created frequently; I think an svn commit/update cycle can be run every two minutes or so.
    - Users can sync over the net.

    However, I can see the following issues:

    - More conflicts than the original setup, because multiple users can now edit the same file even when both are online, i.e. able to reach the SVN repo. Users can of course lock a file before editing, but that would mean they have to adjust.
    - Delay in propagation of file changes. With Windows 7 file sharing, changes made by one online user are instantly visible to other online users. With the SVN setup, changes only propagate when users run the svn add/commit/update sequence, a delay of probably a few minutes. This workflow would no longer work: "Hi, I just edited document X, can you have a quick look?"

    I would like to ask the community for alternative setups, or improvements on the above to work out the kinks.
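
    (For the hourly-versions requirement, the auto-commit side is simple enough to sketch as a cron job per share; a hedged shell script, paths assumed, and simplified in that it mishandles filenames with spaces:

        #!/bin/sh
        # autocommit.sh: snapshot a working copy into SVN history
        WC=/srv/documents
        cd "$WC" || exit 1
        svn add --force . -q                      # pick up new files
        svn status | awk '/^!/ {print $2}' \
            | xargs -r svn rm -q --force          # record deletions
        svn commit -q -m "autocommit $(date '+%F %T')"
        svn update -q                             # pull in others' commits

    Run every few minutes from cron, this gives history at whatever granularity the schedule allows, and the repository only grows by the deltas, which addresses the 3 TB full-backup arithmetic above.)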

  • ImageMagick apparently correctly configured, still won't work under IIS 7.5 + PHP 5.3.2

    - by Razor
    I have managed to get ImageMagick working (tested using the command-line example on their website). The PHP extension also appears to be correctly installed: I can see it listed in my phpinfo() output. However, when I try to run the following code:

        $im = new Imagick('examples.jpg');
        $im->thumbnailImage(200, 0);
        $im->writeImage('a_thumbnail.jpg');

    execution stops at the second line, because it cannot find the thumbnailImage method. What I can think of is that some user doesn't have enough privileges to access or run something, but I obviously cannot give the user access to everything. Another possibility is that the PHP extension I'm using is not the one suited to the latest ImageMagick distribution.
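
    (A hedged way to pin down whether the loaded extension really provides the class and method the script expects, run from cmd on the server:

        php -m | findstr /i imagick
        php -r "var_dump(class_exists('Imagick'), method_exists('Imagick','thumbnailImage'));"

    One caveat: the CLI may read a different php.ini than IIS's FastCGI does, so if this prints true/true while the site still fails, compare the Loaded Configuration File values reported by php -i and by phpinfo() in the browser.)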
