Search Results

Search found 10622 results on 425 pages for 'shared hosting'.


  • RemoteApps and Cached Credentials

    - by user66774
    I'm looking for guidance on an issue we're having. We are hosting an application over Terminal Services through RDWeb on Windows Server 2008. To give users the ability to change their password, we've exposed iisadmpwd. When users change their password, they are prompted to log into the broker server, even if they log off of the RDWeb page and log back in. What we've found is that the credentials seem to be cached in memory after logging in. Ending task on TSWBPRXY.EXE and WKSPRT.EXE, closing IE, logging back into the RDWeb page, and then launching the application allows the user to log into the application without additional credentials. I'm wondering if there is a better way to let users change their password from a web interface while still allowing them to re-establish their connection from the RDWeb login page rather than through the RDP login prompt that comes up.

    Read the article

  • Adobe software does not save to network share

    - by Bart van Heukelom
    I'm running Windows 7 inside VirtualBox on a Linux host. I have shared my Linux filesystem so it's accessible in Windows under \\vboxsvr\sharename, and I've mounted this share as S:. For most software it works fine. Adobe software like Photoshop has problems with it, though: I can read from S: just fine, but if I try to save something it gives me the message "There are no more files". How can I make it able to write to the share?

    Read the article

  • Two domains accessing same folder

    - by Liam Quinn
    I've just taken a new role in a school and am still familiarizing myself with their network; however, I have recently been given a task and I'm having a little trouble finding out the fundamentals of it. I have an admin network/domain 10.49.x.x and a classroom network/domain 192.168.1.x, both connected to a proxy server 10.49.202.231/192.168.1.51. Each domain has its own shared folders as you'd expect (files, software installs etc.); however, there is a folder "staff" on the classroom network that all the teachers on the classroom network can access. The users on the admin network would like to access this same folder. How do I go about making this happen?

    Read the article

  • Trouble logging into a Mac share from a Windows PC on the network

    - by villares
    I have a mixed network and usually log into the Macs from the Windows XP Home machines and vice versa. I have no real networking knowledge; things just seem to work, more or less, with the default settings. Now I've got a new Snow Leopard Mac with a shared folder (I added the user names of the Windows users in the Sharing preferences), and the trouble is that some machines can open the share and others can't. I can't see the difference. It feels like some Windows machines have a "cache" and won't ask for the share password, they just deny access. I can also see old shares offered in the Windows "Add Network Place" wizard.

    Read the article

  • Zeroing SSD drives

    - by jtnire
    We host VPSes for customers. Each customer VPS is given an LVM LV on a standard spindle hard disk. If the customer were to leave, we zero out this LV, ensuring that their data does not leak over to another customer. We are thinking of going with SSDs for our hosting business. Given that SSDs have "wear levelling" technology, does that make zeroing pointless? Does this make the SSD idea unfeasible, given we can't allow customer data to leak over to another customer? Thanks
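
    For reference, a minimal sketch of the kind of wipe step being discussed, assuming the LV path is known (the device path below is a hypothetical example) and that blkdiscard (util-linux) and dd are available. On an SSD, discarding the blocks tells the drive the data is gone; zeroing remains the fallback:

        #!/usr/bin/env python3
        # Minimal sketch: wipe a customer's LV before reuse. Needs root; the LV
        # path below is a hypothetical example.
        import subprocess

        LV_PATH = "/dev/vg0/customer42"  # hypothetical LV path

        def wipe_lv(path):
            # On SSDs, discarding (TRIM) marks the blocks as unused; blkdiscard
            # is part of util-linux.
            try:
                subprocess.run(["blkdiscard", path], check=True)
                return
            except (FileNotFoundError, subprocess.CalledProcessError):
                pass
            # Fallback: overwrite with zeros, as done today on spinning disks.
            # dd exits non-zero once the device is full; that is expected here.
            subprocess.run(["dd", "if=/dev/zero", f"of={path}", "bs=1M", "conv=fsync"])

        if __name__ == "__main__":
            wipe_lv(LV_PATH)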

    Read the article

  • Display CPU usage separately (without root privileges)

    - by synaptik
    I need to display the CPU usage for each processing core on a single shared-memory 12-core (SMP) machine. I don't have access to install htop, else I would simply use that. I don't need fancy graphs or meters, though they would be nice. For example, simply displaying: X X X X X X X X X X X X where each X is the percentage utilization of 1 of the 12 processing cores on my machine. FYI: I know I can simply look at the utilization in "top" and divide that number by the number of cores on my machine, but I prefer a solution that shows each core separately.
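
    If nothing like htop can be installed, per-core usage can still be computed without root by sampling /proc/stat twice; a rough sketch (column meanings per proc(5)):

        #!/usr/bin/env python3
        # Rough sketch: print per-core CPU utilisation by sampling /proc/stat
        # twice. Reading /proc/stat requires no root privileges.
        import time

        def read_cpu_times():
            times = {}
            with open("/proc/stat") as f:
                for line in f:
                    if line.startswith("cpu") and line[3].isdigit():
                        fields = line.split()
                        values = list(map(int, fields[1:]))
                        idle = values[3] + values[4]   # idle + iowait
                        times[fields[0]] = (idle, sum(values))
            return times

        def per_core_usage(interval=1.0):
            before = read_cpu_times()
            time.sleep(interval)
            after = read_cpu_times()
            usage = []
            for cpu in sorted(before, key=lambda c: int(c[3:])):
                idle = after[cpu][0] - before[cpu][0]
                total = after[cpu][1] - before[cpu][1]
                usage.append(100.0 * (total - idle) / total if total else 0.0)
            return usage

        if __name__ == "__main__":
            # Prints one percentage per core, e.g. "3 97 12 0 ..." for 12 cores.
            print(" ".join(f"{u:.0f}" for u in per_core_usage()))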

    Read the article

  • Execute Bash script on Ubuntu from remote Windows machine?

    - by John Isaacks
    I have a bash script on an Ubuntu 10.04 machine. It is shared and I can access it from my Win7 machine with \\LINUX-SERVER\bash_repo\make-live. However, when I do, Windows tries to open it. This is not what I want; I want to tell Ubuntu to execute it. I am actually hoping to be able to build a GUI app on Windows where the user clicks a button and it tells the bash script on the Ubuntu machine to execute. Is any of this possible?
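
    One approach (a sketch, not something tested against that exact setup) is to trigger the script over SSH instead of over the file share. From a Windows GUI app this could be done with the third-party paramiko library, assuming the Ubuntu box runs an SSH server; the host name, credentials and script path below are placeholders:

        #!/usr/bin/env python3
        # Sketch: run a script on the Ubuntu machine over SSH from Windows.
        # Requires the third-party paramiko package (pip install paramiko);
        # host, credentials and path are placeholders.
        import paramiko

        HOST = "linux-server"                    # placeholder
        USER = "deploy"                          # placeholder
        PASSWORD = "secret"                      # placeholder; key auth is better
        SCRIPT = "/srv/bash_repo/make-live"      # placeholder path on the server

        def run_remote(host, user, password, command):
            client = paramiko.SSHClient()
            client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
            client.connect(host, username=user, password=password)
            try:
                stdin, stdout, stderr = client.exec_command(command)
                exit_status = stdout.channel.recv_exit_status()  # wait for completion
                return exit_status, stdout.read().decode(), stderr.read().decode()
            finally:
                client.close()

        if __name__ == "__main__":
            status, out, err = run_remote(HOST, USER, PASSWORD, f"bash {SCRIPT}")
            print(status, out, err)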

    Read the article

  • Can you have a staging and production slot in Azure Websites

    - by Barry King
    I'm looking at hosting 3 websites (they will all use the same linked database resource, but I think I have to use 3 websites within Azure for this): www.website.com, provider.website.com and admin.website.com. Using Windows Azure Websites, can you have Staging and Production slots? I think this feature is only available to Azure Cloud Services, but there is little documentation on this. If it's not possible, other than spinning up 3 more sites to act as the staging sites, is there another way? I want the ability to "swap" from staging to production.

    Read the article

  • Compilation of Etherpad fails in an OpenVZ VE

    - by ulf
    Hi everyone. I'm almost giving up; this will be my last try. I am trying to compile Etherpad on my OpenVZ server. It's running Debian 5.0 as the host system; in the VE I've got Ubuntu 10.04. I installed Etherpad in this VE with the instructions from the official Ubuntu wiki: https://wiki.ubuntu.com/Etherpad. Everything runs fine until it comes to compilation. After calling bin/build.sh as described in the wiki, the first steps run fine, but then I run into a memory error:

        java.io.IOException: Cannot run program "cp": java.io.IOException: error=12, Cannot allocate memory

    Well, I understand the error message but don't see the cause. The command free tells me that there's plenty of memory left in this VE:

                     total       used       free     shared    buffers     cached
        Mem:       2415236    1140872    1274364          0          0          0
        -/+ buffers/cache:    1140872    1274364
        Swap:            0          0          0

    Beautiful. But even repeating the compilation process doesn't bring me any further. Any help would be appreciated.

    Read the article

  • MVC multi page form losing session

    - by Bryan
    I have a multi-page form that's used to collect leads. There are multiple versions of the same form that we call campaigns. Some campaigns are 3-page forms, others are 2 pages, some are 1 page. They all share the same lead model and campaign controller, etc. There is 1 action for controlling the flow of the campaigns, and a separate action for submitting all the lead information into the database. I cannot reproduce this locally, and there are checks in place to ensure users can't skip pages. Session mode is InProc. This runs after every POST action and stores the values in session:

        protected override void OnActionExecuted(ActionExecutedContext filterContext)
        {
            base.OnActionExecuted(filterContext);
            if (this.Request.RequestType == System.Net.WebRequestMethods.Http.Post && this._Lead != null)
                ParentStore.Lead = this._Lead;
        }

    This is the Lead property within the controller:

        private Lead _Lead;

        /// <summary>
        /// Gets the session stored Lead model.
        /// </summary>
        /// <value>The Lead model stored in session.</value>
        protected Lead Lead
        {
            get
            {
                if (this._Lead == null)
                    this._Lead = ParentStore.Lead;
                return this._Lead;
            }
        }

    ParentStore class:

        public static class ParentStore
        {
            internal static Lead Lead
            {
                get { return SessionStore.Get<Lead>(Constants.Session.Lead, new Lead()); }
                set { SessionStore.Set(Constants.Session.Lead, value); }
            }
        }

    Campaign POST action:

        [HttpPost]
        public virtual ActionResult Campaign(Lead lead, string campaign, int page)
        {
            if (this.Session.IsNewSession)
                return RedirectToAction("Campaign", new { campaign = campaign, page = 0 });

            if (ModelState.IsValid == false)
                return View(GetCampaignView(campaign, page), this.Lead);

            TrackLead(this.Lead, campaign, page, LeadType.Shared);

            return RedirectToAction("Campaign", new { campaign = campaign, page = ++page });
        }

    The problem is occurring between the above action and before the following Submit action executes:

        [HttpPost]
        public virtual ActionResult Submit(Lead lead, string campaign, int page)
        {
            if (this.Session.IsNewSession || this.Lead.Submitted || !this.LeadExists)
                return RedirectToAction("Campaign", new { campaign = campaign, page = 0 });

            lead.AddCustomQuestions();
            MergeLead(campaign, lead, this.AdditionalQuestionsType, false);

            if (ModelState.IsValid == false)
                return View(GetCampaignView(campaign, page), this.Lead);

            var sharedLead = this.Lead.ToSharedLead(Request.Form.ToQueryString(false));

            // Error occurs here and sends me an email with whatever values are in the form collection.
            EAUtility.ProcessLeadProxy.SubmitSharedLead(sharedLead);

            this.Lead.Submitted = true;
            VisitorTracker.DisplayConfirmationPixel = true;
            TrackLead(this.Lead, campaign, page, LeadType.Shared);

            return RedirectToAction(this.ConfirmationView);
        }

    Every visitor to our site gets a unique GUID visitorID, but when this error occurs there is a different visitorID between the Campaign POST and the Submit POST. Because we track each form submission via the TrackLead() method during the campaign and submit actions, I can see that session is being lost between calls, despite OnActionExecuted firing after every POST and storing the form in session. So when there are errors, we get half the form under one visitorID and the remainder of the form under a different visitorID. Luckily we use a third-party service which sends an API call every time a form value changes and which uses its own ID. Those IDs are consistent between the first half of the form and the remainder, and they are the only way I can save the leads from the lost-session issues.
    I should also note that this works fine 99% of the time. EDIT: I've modified my code to explicitly store my lead object in TempData and used the TempData.Keep() method to persist the object between subsequent requests. I've only deployed this behavior to 1 of my 3 sites, but so far so good. I had also tried storing my lead objects in Session directly in the controller action, i.e. Session.Add("lead", this._Lead); which uses HttpSessionStateBase, attempting to circumvent the wrapper class, instead of HttpContext.Current.Session which uses HttpSessionState. This modification made no difference on the issue, as expected.

    Read the article

  • Check the disk for problems on Debian Lenny

    - by Equ
    Hi guys! I just bought VPS hosting with Debian Lenny (I'm new to all this). I've managed to install and set up everything I need pretty well. My test website works as fast as expected most of the time, but sometimes it is really slow (response time is about 5-10 seconds). I checked everything and it seems that there may be some disk issues. How can I check the disk for problems/performance? What else could possibly cause such behaviour? Thank you!
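
    From inside a VPS the physical disk's SMART data is usually not visible, so the checks are mostly indirect: watch I/O latency and look for kernel I/O errors. A rough sketch of that, assuming the sysstat package is installed for iostat:

        #!/usr/bin/env python3
        # Rough sketch: quick disk checks on a VPS. Assumes iostat (sysstat
        # package); SMART data is normally not reachable from inside a VPS.
        import subprocess

        def run(cmd):
            print("$ " + " ".join(cmd))
            result = subprocess.run(cmd, capture_output=True, text=True)
            print(result.stdout)
            return result.stdout

        if __name__ == "__main__":
            # Extended per-device I/O statistics; high await / %util values
            # point at a slow or overloaded disk.
            run(["iostat", "-dx", "5", "2"])
            # Kernel ring buffer: look for I/O errors on the virtual disk.
            for line in run(["dmesg"]).splitlines():
                if "error" in line.lower() and "i/o" in line.lower():
                    print("possible disk problem:", line)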

    Read the article

  • RSS downloader script

    - by The Digital Ninja
    I have a Synology NAS, powered by Linux, at my house. I'm looking to set up a cron script to check a group of RSS feeds and auto-download new video podcasts to a shared folder. I can do most of the scripting, such as deleting files older than 3 weeks and the wget parts, but I'm not sure how to parse the RSS feed and check dates to only grab the latest. I figured it's best not to reinvent the wheel and surely someone out there has a command-line RSS downloader or some such script. Any ideas?
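
    In case it helps, a rough sketch of what such a cron script could look like in Python, assuming the third-party feedparser module is available on the NAS; feed URLs and the target folder are placeholders:

        #!/usr/bin/env python3
        # Sketch: download new enclosures (video podcasts) from RSS feeds into a
        # shared folder, skipping files already present and entries older than
        # three weeks. Assumes the third-party feedparser module; the URLs and
        # the target path are placeholders.
        import os
        import time
        import urllib.request

        import feedparser

        FEEDS = ["http://example.com/podcast.rss"]   # placeholder feed URLs
        TARGET = "/volume1/video/podcasts"           # placeholder shared folder
        MAX_AGE = 21 * 24 * 3600                     # three weeks in seconds

        def fetch_new_episodes():
            now = time.time()
            for url in FEEDS:
                feed = feedparser.parse(url)
                for entry in feed.entries:
                    published = entry.get("published_parsed")
                    if published and now - time.mktime(published) > MAX_AGE:
                        continue                     # too old, skip
                    for enclosure in entry.get("enclosures", []):
                        href = enclosure.get("href")
                        if not href:
                            continue
                        name = os.path.basename(href.split("?")[0])
                        dest = os.path.join(TARGET, name)
                        if not os.path.exists(dest): # only grab new files
                            urllib.request.urlretrieve(href, dest)

        def prune_old_files():
            # The "delete files older than 3 weeks" part, for completeness.
            now = time.time()
            for name in os.listdir(TARGET):
                path = os.path.join(TARGET, name)
                if os.path.isfile(path) and now - os.path.getmtime(path) > MAX_AGE:
                    os.remove(path)

        if __name__ == "__main__":
            fetch_new_episodes()
            prune_old_files()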

    Read the article

  • How to log Windows server share connections?

    - by sbussinger
    Can anyone make a suggestion for the best way to log connections and disconnections from Windows workstations to a Windows Server 2003 file share? We're having some issues with workstations that have a drive mapped to a server that seem to work fine for a while and then suddenly appear to get disconnected from the server (with files open). Needless to say, this causes some data corruption and error messages. It would help me to troubleshoot the problem if we could somehow monitor and log the session connections and disconnections, to attempt to correlate the connectivity issues with what actions the user is taking at the time and what the server is doing. I just haven't been able to find a way to do this. Specifically, I'm talking about the same information that is displayed in the Computer Management control panel applet on the "System Tools | Shared Folders | Sessions" page. Thanks!

    Read the article

  • Confusion about DNS for mail server

    - by Tyron Gower
    We have migrated to Office 365, and everything is working except that one company cannot email us, as its mail is connecting to our subdomain's email server. We have companionsoftware.com.au hosted through Office 365 with all the required DNS entries, and all seems to be working fine. We then have a web server hosting our website companionsoftware.com.au and our subdomain email server attachments.companionsoftware.com.au (POP3/SMTP). Now, for this one company, when they try to email [email protected] it connects to SMTP on attachments.companionsoftware.com.au. attachments.companionsoftware.com.au and companionsoftware.com.au have the same IP address, but this is only affecting one sender (that we know of) when they try to email us. Have I configured something wrong or is it their server?
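
    One quick check is whether the sender is delivering to the MX records (as they should) or to the host's A record. A small diagnostic sketch using the third-party dnspython package, with the domain from the question:

        #!/usr/bin/env python3
        # Diagnostic sketch: print where mail for the domain should go (MX)
        # versus where the bare hostname points (A). Requires the third-party
        # dnspython package (pip install dnspython).
        import dns.resolver

        DOMAIN = "companionsoftware.com.au"

        if __name__ == "__main__":
            print("MX records (where senders should deliver mail):")
            for rr in dns.resolver.resolve(DOMAIN, "MX"):
                print(f"  preference {rr.preference}: {rr.exchange}")
            print("A record (where the bare hostname points):")
            for rr in dns.resolver.resolve(DOMAIN, "A"):
                print(f"  {rr.address}")
            # If a sender connects to the A record instead of the MX hosts,
            # the problem is on their side; if the MX records still point at
            # the old server, the zone needs fixing.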

    Read the article

  • free -m output: should I be concerned about this server's low memory?

    - by Michael
    This is the output of free -m on a production database machine (MySQL). 83 MB free looks pretty bad, but I assume the buffers/cache will be used instead of swap?

        [admin@db1 www]$ free -m
                     total       used       free     shared    buffers     cached
        Mem:         16053      15970         83          0        122       5343
        -/+ buffers/cache:      10504       5549
        Swap:         2047          0       2047

    top output sorted by memory:

        top - 10:51:35 up 140 days, 7:58, 1 user, load average: 2.01, 1.47, 1.23
        Tasks: 129 total, 1 running, 128 sleeping, 0 stopped, 0 zombie
        Cpu(s): 6.5%us, 1.2%sy, 0.0%ni, 60.2%id, 31.5%wa, 0.2%hi, 0.5%si, 0.0%st
        Mem: 16439060k total, 16353940k used, 85120k free, 122056k buffers
        Swap: 2096472k total, 104k used, 2096368k free, 5461160k cached

          PID USER   PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+ COMMAND
        20757 mysql  15   0 10.2g 9.7g 5440 S 29.0 61.6 28588:24 mysqld
        16610 root   15   0  184m  18m 4340 S  0.0  0.1  0:32.89 sysshepd
         9394 root   15   0  154m 8336 4244 S  0.0  0.1  0:12.20 snmpd
        17481 ntp    15   0 23416 5044 3916 S  0.0  0.0  0:02.32 ntpd
         2000 root    5 -10 12652 4464 3184 S  0.0  0.0  0:00.00 iscsid
         8768 root   15   0 90164 3376 2644 S  0.0  0.0  0:00.01 sshd
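
    As a side note, the number that matters for applications is the free column on the "-/+ buffers/cache" line; a small sketch of how that figure is derived from /proc/meminfo (values converted to MB):

        #!/usr/bin/env python3
        # Sketch: compute the memory actually available to applications, i.e.
        # what free -m shows on the "-/+ buffers/cache" line.
        def meminfo_mb():
            values = {}
            with open("/proc/meminfo") as f:
                for line in f:
                    key, rest = line.split(":", 1)
                    values[key] = int(rest.split()[0]) // 1024   # kB -> MB
            return values

        if __name__ == "__main__":
            m = meminfo_mb()
            available = m["MemFree"] + m["Buffers"] + m["Cached"]
            print(f"MemFree {m['MemFree']} MB + Buffers {m['Buffers']} MB "
                  f"+ Cached {m['Cached']} MB = roughly {available} MB usable")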

    Read the article

  • How can I unobtrusively back up a few clients' email?

    - by tladuke
    This is a small office. Our web/email server is a shared host. In the office we have a Windows 2008 box that is up all the time and runs our NAS and a couple of other services. I don't have access to the ISP admin stuff, but I assume it has cPanel or something like that; I can get access if I ask. I want to get email backed up from the server to our NAS without the users having to do anything. I suppose I could set up Outlook on that server with everyone's account, but that's maybe a terrible idea (what about sent mail?). The boss uses Outlook, but we have Apple Mail and Thunderbird clients too. I guess the important thing is that Outlook can look at the backups, so the boss is happy. Then again, maybe it should be stored in whatever is the most portable format (that will work on NTFS). This is for about 10 users.
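
    A minimal sketch of the unattended-pull idea: a scheduled task on the Windows 2008 box that logs into each mailbox over IMAP and writes every message as a .eml file to the NAS. It uses only the Python standard library; the host, account list and share path are placeholders, and it assumes the shared host exposes IMAP over SSL:

        #!/usr/bin/env python3
        # Sketch: back up mailboxes over IMAP to .eml files on the NAS.
        # Standard library only; host, accounts and target path are placeholders
        # and the shared host is assumed to offer IMAP over SSL.
        import imaplib
        import os

        IMAP_HOST = "mail.example.com"                    # placeholder
        ACCOUNTS = [("alice@example.com", "password1")]   # placeholder credentials
        BACKUP_ROOT = r"\\NAS\backups\email"              # placeholder NAS share

        def backup_mailbox(user, password):
            target = os.path.join(BACKUP_ROOT, user)
            os.makedirs(target, exist_ok=True)
            conn = imaplib.IMAP4_SSL(IMAP_HOST)
            try:
                conn.login(user, password)
                conn.select("INBOX", readonly=True)       # never alters flags
                status, data = conn.search(None, "ALL")
                for num in data[0].split():
                    # Message sequence numbers as file names keep the sketch
                    # simple; UIDs would be more robust for incremental runs.
                    path = os.path.join(target, num.decode() + ".eml")
                    if os.path.exists(path):
                        continue                          # already saved
                    status, msg_data = conn.fetch(num, "(RFC822)")
                    with open(path, "wb") as f:
                        f.write(msg_data[0][1])
            finally:
                conn.logout()

        if __name__ == "__main__":
            for user, password in ACCOUNTS:
                backup_mailbox(user, password)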

    Read the article

  • Establish direct cable connection between Windows 8 PCs in home network

    - by Marie. P.
    I'm running two PCs, a desktop and a laptop, with Windows 8 Release Preview ("Build 8400"). They are connected to the same router in infrastructure mode, thereby having wireless internet. Because I often synchronize files between the machines, I want to establish a cable connection that allows direct file transfer without needing to use the wireless. When I plug in the cable (normal, not cross-over), I see in "Control Panel\Network and Internet\Network Connections": "Ethernet - unidentified Network" on both PCs. Transferring a file between the two still only uses the WiFi via the router. I noticed that when turning off the WiFi on one PC, I can set up a shared internet connection that will work via the Ethernet cable, but since sometimes only one PC runs and sometimes the other, I do not want the internet of one machine to depend on the other being switched on. I do not have a crossover cable, but since I have already connected the PCs successfully (just without both being on the internet), I'm sure this should also work with a normal Ethernet cable.

    Read the article

  • Type 1 Hypervisor on the desktop

    - by Blazemore
    I have a powerful home PC, and I've used VirtualBox to run Linux distros in Windows (and vice versa). I'm interested in trying out a lightweight type 1 hypervisor to run all my operating systems (Windows 7, Debian, Arch) and was looking for suggestions of which to pick and how to implement this. From what I gather, a type 1 hypervisor is a lightweight OS which simply provides VM management functionality. Will I get reasonable performance under each guest OS? Can all the guest OSes have access to a shared data drive, or is it best to have a storage server in another guest OS and mount it over the virtual network? What about gaming: is this feasible, or will I realistically need to run Win7 on bare metal? I'd appreciate any input.

    Read the article

  • Cannot umount: device is busy

    - by user132199
    Situation: I am running a RHEL server via a VM on my laptop. I have a Win7 desktop sharing out a folder, and the VM on my laptop running RHEL 6 has a CIFS Windows mount at /mnt/win. When I go to unmount the device I get a "device is busy" message. So I went to my laptop and checked to see if there were any users connected to the share; since it listed none, I turned off sharing. I went back to my RHEL instance and attempted another umount /mnt/win but received the same error. Question: What are other alternatives to unmounting a shared drive?
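
    For the record, a small sketch of the usual diagnosis: ask the kernel which processes still hold the mount point, then fall back to a lazy unmount. It only wraps the standard fuser and umount commands and needs suitable privileges:

        #!/usr/bin/env python3
        # Sketch: show what keeps /mnt/win busy, then try a lazy unmount.
        # fuser comes from the psmisc package.
        import subprocess

        MOUNT_POINT = "/mnt/win"

        if __name__ == "__main__":
            # -v: verbose listing, -m: every process using the mounted filesystem
            subprocess.run(["fuser", "-vm", MOUNT_POINT])
            # Try a normal unmount first; if that still fails, detach lazily (-l)
            # so the mount disappears now and is cleaned up once no longer used.
            if subprocess.run(["umount", MOUNT_POINT]).returncode != 0:
                subprocess.run(["umount", "-l", MOUNT_POINT])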

    Read the article

  • WordPress issues with htaccess causing 500 server error

    - by Scott B
    I have a few customers of my custom WordPress theme reporting that their sites have gone down over the past few weeks due to a 500 internal server error. In each case, it appears that the htaccess file has been to blame. In one case, the user's hosting company found a "_pvt/service.pwd" line in there that was apparently causing the problem. In another instance, the hosting company indicated that a cron job appeared to be causing the issue and sent the user the following as evidence:

        root@cherry [/home/login/public_html]# stat .htaccess
          File: `.htaccess.orig'
          Size: 587         Blocks: 8          IO Block: 4096   regular file
        Device: 811h/2065d  Inode: 590021607   Links: 1
        Access: (0644/-rw-r--r--)  Uid: ( 2234/login)   Gid: ( 2231/login)
        Access: 2010-03-07 16:42:01.000000000 -0600
        Modify: 2010-03-26 09:15:15.000000000 -0500
        Change: 2010-03-26 09:45:05.000000000 -0500

    In yet another instance, the user reported this as the cause: "The permissions on my .index file somehow got changed to 777 instead of 644". I'm just seeking to help these users understand what's going on, the likely cause, and how to prevent it. I also want to eliminate my theme as a potential contributing factor. There are two areas I want to submit here to make sure that they are not likely to cause such an issue: my permalink rewrite code and my upgrade script (which sets 755 on the destination folder, my theme folder). Here's the permalink rewrite code:

        if (file_exists(ABSPATH.'/wp-admin/includes/taxonomy.php')) {
            require_once(ABSPATH.'/wp-admin/includes/taxonomy.php');
            if (get_option('permalink_structure') !== "/%postname%/" || get_option('mycustomtheme_permalinks') !== "/%postname%/") {
                $mycustomtheme_permalinks = get_option('mycustomtheme_permalinks');
                require_once(ABSPATH . '/wp-admin/includes/misc.php');
                require_once(ABSPATH . '/wp-admin/includes/file.php');
                global $wp_rewrite;
                $wp_rewrite->set_permalink_structure($mycustomtheme_permalinks);
                $wp_rewrite->flush_rules();
            }
            if (!get_cat_ID('topMenu')) { wp_create_category('topMenu'); }
            if (!get_cat_ID('hidden')) { wp_create_category('hidden'); }
            if (!get_cat_ID('noads')) { wp_create_category('noads'); }
        }

        if (!is_dir(ABSPATH.'wp-content/uploads')) {
            mkdir(ABSPATH.'wp-content/uploads');
        }

    And here are the relevant lines from my uploader script:

        // permission settings for newly created folders
        $chmod = 0755;

        // Ensures that the correct file was chosen
        $accepted_types = array('application/zip', 'application/x-zip-compressed', 'multipart/x-zip', 'application/s-compressed');
        foreach ($accepted_types as $mime_type) {
            if ($mime_type == $type) {
                $okay = true;
                break;
            }
        }

        // Safari and Chrome don't register zip mime types. Something better could be used here.
        $okay = strtolower($name[1]) == 'zip' ? true : false;

        if (!$okay) {
            die("This upgrader requires a zip file. Please make sure your file is a valid zip file with a .zip extension");
        }

        //mkdir($target);
        $saved_file_location = $target . $filename;
        if (move_uploaded_file($source, $saved_file_location)) {
            openZip($saved_file_location);
        } else {
            die("There was a problem. Sorry!");
        }

    Read the article

  • Default documentroot apache does not work

    - by James Wise
    I have Apache version 2.2 and PHP 5.3.15 on a single server. I configured virtual hosting and a default vhost:

        0_default_.conf      - goes to /var/www/default
        sub.domain.com.conf  - goes to /var/www/sub.domain.com

    My question is: how could I set the default document root to sub.domain.com permanently? That means all requests should be redirected to sub.domain.com. I tried removing 0_default_.conf, but when viewing the page it displays the PHP source code of sub.domain.com. Here are my configurations -- http://pastebin.com/4e3awUJ4. I could create an index.php in /var/www/default and permanently redirect to the sub.domain.com site, but that's not a viable solution for me, because if I haven't pointed the IP address of sub.domain.com at the server, users cannot view that subdomain. I would appreciate it if anyone could share their knowledge and wisdom. Thanks. JamesW

    Read the article

  • Can't copy files from network drive

    - by user630320
    I have a weird problem with copying files. When I copy a file from the network drive into the C drive nothing happens, but when I copy a file from the network drive to the desktop it works. Copying files from the desktop into C also works fine. I have full local admin permissions on this PC and the network drive. I have tried these things: created a new profile, run Windows Update, run checkdisk. I'm using Windows XP Pro 32-bit. Update: Network path: \\server1\shared\folder. PC: C:\ (this doesn't work); C:\Documents and Settings\Userid\Desktop (this works fine).

    Read the article

  • How can I tell how many bits my ssh key is?

    - by yairchu
    I already created an ssh key for myself sometime in the past. I don't remember "how many bits" it is. How can I tell? I'm wondering because I'm using hosting at nearlyfreespeech.net and their faq says: Can I configure my ssh connection to use a public key? ... we will not install keys that have a length less than 1536 bits ... We prefer that you use a key at least 2048 bits in length, and if you are generating a new key, the recommended length is 4096 bits.
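
    For what it's worth, ssh-keygen itself reports the size: the first field of "ssh-keygen -l -f <keyfile>" is the bit length. A tiny sketch wrapping that, assuming the default key location:

        #!/usr/bin/env python3
        # Sketch: print the bit length of an ssh key by asking ssh-keygen for
        # its fingerprint line, whose first field is the key size.
        import os
        import subprocess

        KEY = os.path.expanduser("~/.ssh/id_rsa.pub")   # adjust to your key file

        if __name__ == "__main__":
            result = subprocess.run(
                ["ssh-keygen", "-l", "-f", KEY],
                capture_output=True, text=True, check=True,
            )
            bits = result.stdout.split()[0]   # e.g. "4096 SHA256:... (RSA)"
            print(f"{KEY}: {bits} bits")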

    Read the article

  • How do I scale EC2 and push out code / data to my instances?

    - by chris
    Unfortunately I have only a limited knowledge of server architecture; I come from a development background. I am looking to ensure my new app can scale properly using EC2. I currently have a t1.micro instance for development, running Windows with SQL Server 2008. The system allows students to come to our site to search for a mentor, update their profile with pictures and employment history, etc., roughly the same sort of data as a LinkedIn profile. I need this to be able to scale very quickly without wasted resources. I understand that separation of data, application, etc. is important, and I think I will achieve this by hosting images on S3, the database via RDS, and upgrading the EC2 instance. My main question is: how do I push data / code out to multiple EC2 / RDS instances seamlessly?

    Read the article

  • Trouble getting PHP, Apache, and Zend talking to each other (localhost)

    - by Joel
    Hi guys, I've searched through several other questions but haven't found my solution. The main reason is that I'm not even sure if I have all these things properly installed. I have a hosting account and have always just deployed everything to the internet, but I'm finally trying to figure out how to get my desktop set up right for learning Zend Framework. I have Apache Server 2.2, PHP, and Zend Framework installed. I'm trying to do this tutorial: http://akrabat.com/wp-content/uploads/Getting-Started-with-Zend-Framework.pdf. The problem is that when I click on the link http://localhost/zf-tutorial/public I get an Error 404. If I type in http://www.localhost I get "It Works!" in the browser. I'm thinking this means I have Apache installed correctly but am not pointing correctly to the Zend tutorial? Thanks for any help!

    Read the article
