Search Results

Search found 19928 results on 798 pages for 'multiple constructors'.

  • Copy/Paste including Hidden Columns when Filtering Rows in Excel 2010

    - by hudsonsedge
    I suspect the solution will be related to this question? I have a spreadsheet that comes to me pre-formatted with hidden columns sprinkled in multiple places (for viewing brevity's sake). I need to turn on filtering, apply a filter to one of the columns, and then paste the resulting rows to a new sheet - including the hidden columns (lather, rinse, repeat). I'd prefer not to unhide and re-hide the columns unless I have to. Is it possible to paste the hidden columns without adding those extra steps?

    Read the article

  • Windows Server 2008 - Setting Up DNS and Web Server (IIS) to host personal website?

    - by Car Trader
    Okay, I have a server (Windows Server 2008 R2, to be more precise) and I have installed PHP, MySQL and phpMyAdmin for web hosting purposes. I have set up a static IP address internally and installed the DNS and Web Server (IIS) roles. I then set up my forward lookup zone as my chosen domain, and set up the nameserver as ns1.domain.co.uk with the public IP address I found from whatismyip.org. However, when I browse to that IP address, the request times out. Am I doing something wrong? Am I missing something? Also, I have seen that most websites have multiple nameservers, which apparently mirror one another and all resolve to the same IP address. I can connect locally using the IP address 192.168.0.8, but I want to put my website live on the internet. Can anyone help me with this? -- Regards

    Read the article

  • Can't connect to memcached

    - by DMClark
    We currently have memcached running on CentOS. None of our PHP applications can connect; we have tried multiple applications attempting to establish access. The most informative PHP error we get is: "Memcache::get() [function.Memcache-get]: Server 127.0.0.1 (tcp 11211) failed with: Permission denied (13) in /var/www/.." (memcached 1.4.5, PECL Memcache 2.25). We can telnet to the port and it works, and iptables allows full access from lo to lo. We've tried this on two different servers, with both a compiled version and the CentOS 5.5 RPM, and get the same result. Is there anything fairly obvious that we are missing?
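
    A minimal PHP sketch that can help narrow down where the denial happens, assuming the stock PECL Memcache extension named in the error; the script name and key are illustrative. Run it once from the CLI and once through the web server: if the CLI run succeeds but the web request still fails with "Permission denied (13)", the restriction is likely tied to the web server's execution context rather than to memcached itself.

      <?php
      // memcache_probe.php (hypothetical name) - connect, set, and read back one key.
      $mc = new Memcache();
      if (!@$mc->connect('127.0.0.1', 11211)) {
          $err = error_get_last();
          die("connect failed: " . ($err ? $err['message'] : 'unknown error') . "\n");
      }
      $mc->set('probe_key', 'hello', 0, 60);    // no compression, 60-second TTL
      var_dump($mc->get('probe_key'));          // expect string(5) "hello"
      $mc->close();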

    Read the article

  • VGA passthrough and desktop virtualization

    - by Zacariaz
    In short, my dream is to have one machine with multiple paravirtualized desktop and server guests, one of which has to be a Windows desktop with powerful graphics. As Windows can't be paravirtualized in the normal sense of the word, I was quite happy when I heard about VGA passthrough, but then I read on. As I understand it, such a setup would mean that the graphics card is dedicated to one particular guest, so you wouldn't be able to switch between guests. If this is in fact so, would someone please explain to me what the purpose/use of VGA passthrough is? I can think of no real use for it. Yes, it's a cool technology, but to me it seems pointless. It's true that it's possible to pass separate GPUs through to individual VMs, which is also cool, but in the end I should think that two separate computers would make life a whole lot simpler. Again it seems rather pointless.

    Read the article

  • IE session (-nomerge) manager

    - by skrco
    I'm looking for an application that can manage (save/open) multiple Internet Explorer instances (to be precise, -nomerge sessions), host them in a single window and arrange these instances, e.g. in tabs, so that you end up with a double tab bar. In functionality it's similar to Remote Desktop Manager, where you can create a web session, but in embedded mode you cannot set the -nomerge option. I've been searching the web with no results, so I'm asking whether anyone knows of such an application or a workaround, or whether I have to write my own app.

    Read the article

  • Photoshop: Trim a photo so it contains no transparent pixels?

    - by nickf
    In Photoshop, I've put together some panorama photos using the Photomerge tool, and the resulting image contains a lot of transparent pixels. Also, because it's put together from multiple photos, the alignment can be off. What I'd like to do is cut the image down to the largest box that contains no transparent pixels at all. It's similar to the Trim tool, except that this would also remove a number of non-transparent pixels. Is there anything like this? Basically something that would automatically crop the image above to the selection box shown: a rectangle with no transparent pixels.

    Read the article

  • Iterative and Incremental Principle Series 5: Conclusion

    - by llowitz
    Thank you for joining me in the final segment in the Iterative and Incremental series. During yesterday's segment, I discussed Iteration Planning, and specifically how I planned my daily exercise (iteration) each morning by assessing multiple factors while following my overall implementation plan. As I mentioned in yesterday's blog, regardless of the type of exercise or how many increment sets I decide to complete each day, I apply the six-minute interval sets and a timebox approach. When the 6 minutes are up, I stop the interval, even if I have more to give, saving the extra energy to apply to my next interval set. Timeboxes are used to manage iterations. Once the pre-determined iteration duration is reached, whether it is 2 weeks or 6 weeks or somewhere in between, the iteration is complete. Iteration group items (requirements) not fully addressed, in relation to the iteration goal, are addressed in the next iteration. This approach helps eliminate the "rolling deadline" and allows the project manager to assess project progress earlier and more frequently than in traditional approaches. Not only do smaller, more frequent milestones allow project managers to better assess potential schedule risks and slips, but process improvement is encouraged. Even in my simple example, I learned, after a few interval sets, not to sprint uphill! Now I plan my route more efficiently to ensure that I sprint on a level surface, to reduce the risk of not completing my increment. Project managers have often told me that they used an iterative and incremental approach long before OUM. An effective project manager naturally organizes project work consistent with this principle, but a key benefit of OUM is that it formalizes the approach so it happens by design rather than by chance. I hope this series has encouraged you to think about additional ways you can incorporate the iterative and incremental principle into your daily and project life. I further hope that you will share your thoughts and experiences with the rest of us.

    Read the article

  • How can I fully automate the creation and configuration of a SharePoint virtual machine?

    - by vnat
    I typically require multiple SharePoint virtual machines for development purposes. I currently build these manually every time I need one, either starting from a fresh OS install or using sysprep when working with SharePoint 2010 and SQL Server 2008 R2. I currently use VMware, but am open to VirtualBox or Hyper-V. I'd like to be able to go from zero to a working VM with SharePoint, SQL and Visual Studio entirely through script. Is this a feasible task? Or are there more practical methods that would start from a VM with a fresh installation of an OS and then use more standard unattended installs? Although the question is general, I'd like to know where to focus my efforts. Thanks in advance, vnat

    Read the article

  • random hard disk errors

    - by AugB
    For the past 2 years or so (on a 4-year-old custom build) I've been getting random moments where everything stops responding (or takes a very long time to respond), followed by I/O and "HDD not detected" errors on restart. To fix it, all I usually need to do is unplug the SATA cables from the HDD and motherboard and plug them back in, and the problem disappears, at least for a little while (anywhere from a day to a few months). Sometimes even a startup repair does the job. I've done multiple reformats and have also run chkdsk more times than I can remember, and neither seems to help in the long run. Both drives seem to be exhibiting the same problem. Have both my HDDs been "dying" for the past couple of years, even though they are fully functional besides these occasional hiccups? Does the issue lie elsewhere? All feedback is appreciated. System specs: Biostar TPower I45 motherboard, 2x WD Caviar 640 GB HDDs, Zalman 750 W PSU, Radeon 5870 GPU, 2x2 GB G.Skill DDR2 RAM, Windows 7 64-bit.

    Read the article

  • website lookup extremely slow in ubuntu

    - by ubuntulover
    Hi, I have a wireless broadband connection through a router and wireless modem. Everything works fine in Windows. However, in Ubuntu on the same machine, websites seem to take longer to start loading; I think the DNS lookup is slow. HTTPS sites may be even slower, as I just can't log in to Gmail. I am also using a Mercurial repo with a remote origin, and it takes forever (like 5 minutes) to push one small change; I think it is because it has to communicate over HTTPS multiple times. Should I change my DNS server? I've seen that I don't have these problems on my work network (they have another DNS server). This happens with the IPv4 settings set to Automatic (DHCP). When I change it to Automatic (DHCP) addresses only and add Google's 8.8.8.8 to the DNS servers, it still takes forever. Why is this happening?

    Read the article

  • Ubuntu 12.10 and nVidia drivers don't like each other?

    - by mingos
    I decided to upgrade both my computers from Precise to Quantal. What a mistake that was. My laptop has an nVidia GT 330M card, while the desktop has an nVidia 9600 GT. In both cases everything goes great as long as I use the Nouveau driver (ugh!). I can't really play games (Amnesia... and hoping for Steam beta participation...), even though it's OK for work. Ever since 9.04 or so, I just installed nvidia-current and everything just worked. Since 12.10, after installing the nVidia drivers, Unity won't start at all (it hangs with only the wallpaper displayed, no cursor or widgets), and GNOME Shell is permanently in fallback mode. I have tried on both computers, with multiple clean installs of Ubuntu (two separate downloads, just in case), one of them Ubuntu GNOME Remix, and additionally Fedora 17, which seems to suffer from the same issue. I tried all the nVidia driver suggestions available in Software Sources, and even compiled the drivers myself, trying several versions to exclude an issue with the newest one. In my frustration, I have switched to Windows (which, ironically, "just works" with my hardware), but I still keep a dual-boot configuration on the desktop and would like to use Ubuntu again. So, can anyone point me to where the issue might lie?

    Read the article

  • Unable to move or delete files

    - by Erik
    Hi: Just today I got the following error while trying to move/delete several files: "The action can't be completed because the file is open in another program." The file wasn't open, but just in case, I closed all programs. When that failed to allow me to move or delete the file, I restarted the computer. When that also failed, I came here. Any suggestions? The files can be copied/pasted, but move/delete fails even after multiple restarts.

    Read the article

  • Is my laptop good enough to support my development needs? [closed]

    - by KodeSeeker
    I have an ASUS laptop with a Pentium dual-core CPU running at 2.20 GHz and 4 GB of built-in RAM, currently running 64-bit Windows 7. I just started graduate school and I'm wondering whether I should go in for a new laptop or just repair the nagging battery on my current one. My requirements include: the ability to support IDEs (I may end up running Eclipse, Visual Studio and the like to help with my work); the ability to run multiple VMs, though not concurrently (I'm currently running Ubuntu 12 and 9 as VMs, and I'm not sure if this is overloading the system); I'm a non-gamer, so I really don't care about minor glitches caused by running an uber-heavy game; in addition, I will make heavy use of office application software and will be using my computer to watch movies and stream media. Looking forward to your replies and suggestions!

    Read the article

  • Install Windows XP without disk

    - by Pearsonartphoto
    So, my kid's computer has Windows XP, with no disks. I'm pretty sure it has some viruses on it, of the kind that don't seem to come out despite trying multiple antivirus programs. I'm ready to just format it and start over again. I have a license sticker on the box, but no media to install from. I strongly suspect the license is OEM, but I don't have any proof. What suggestions would you have? I should say that the computer originally belonged to a business, is probably 6 years old, and I am willing to pay a small charge if required. I don't want to change the OS installed, either.

    Read the article

  • Firefox and Internet Explorer not responding, but Chrome works fine

    - by Mick
    I have been using multiple browsers successfully for years, but now I seem to have run into a problem. When I run Firefox or IE, they appear to start up, but the mouse pointer just permanently turns into an egg timer and I can do nothing else with them. Task Manager shows them as "not responding", and I then have to use Task Manager to end the task. I tried re-installing Firefox, but that made no difference. Google Chrome works fine. Any ideas what could be happening?

    Read the article

  • Best server for mailing application [closed]

    - by Cyber Junkie
    My application is similar to a reminder service: it reminds users of events that they scheduled. I'm sending emails to users through a PHP script. I'm not sending one email to multiple recipients; each recipient receives a different message. I plan to use cron jobs every minute and expect the application to send roughly 200 individual emails per hour (for a small user base that may grow). I don't have hosting experience with this type of application. I plan to start on a shared host and move up in the future to a VPS or dedicated server. Most shared hosts that I looked into allow 50-100 emails per hour, with delays between mailings. Please let me know what I should look for in a web host for this kind of application.
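
    For context, a rough sketch of the kind of cron-driven PHP mailer described here; the table name, column names, sender address and throttling value are all hypothetical, and mail() could be swapped for an SMTP library on hosts that restrict it. It mainly illustrates the workload a host would need to tolerate: individual messages sent one at a time, a few hundred per hour.

      <?php
      // Hypothetical reminder mailer, intended to be run from cron every minute.
      $pdo = new PDO('mysql:host=localhost;dbname=reminders', 'dbuser', 'dbpass');

      // Events that are due and have not been notified yet (schema is illustrative).
      $due = $pdo->query(
          "SELECT id, email, subject, body FROM reminders
           WHERE remind_at <= NOW() AND sent = 0 LIMIT 200"
      );

      foreach ($due as $row) {
          // One individual message per recipient, never a shared BCC.
          $headers = "From: reminders@example.com\r\nContent-Type: text/plain; charset=UTF-8";
          if (mail($row['email'], $row['subject'], $row['body'], $headers)) {
              $mark = $pdo->prepare("UPDATE reminders SET sent = 1 WHERE id = ?");
              $mark->execute(array($row['id']));
          }
          usleep(250000); // ~4 messages per second, to stay under host rate limits
      }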

    Read the article

  • Preparing a hyper-v VM image

    - by Anteru
    We have a Hyper-V Windows Server 2k3 machine, and we're hosting multiple VMs on it. However, right now we always do the VM creation right on the server, i.e. when preparing a new Ubuntu image, I just install it into a new VM, set it up, and when I'm happy we store the disk image. I wonder if there is a way to prepare a Hyper-V image locally on my desktop machine instead. I'm running Windows 7, and I would love to be able to set up a VM so that we can copy the image over to the server and be done with it. This is for Linux images only, and we definitely need the Hyper-V network integration. Is there a recommended way to prepare Hyper-V images without running a Hyper-V instance somewhere?

    Read the article

  • designing solution to dynamically load class

    - by dot
    Background information: I have a web app that allows end users to connect to SSH-enabled devices and manipulate them. Right now, I only support one version of firmware. The logic is something like this: (1) the user clicks a button to run some command on a device; (2) the web application looks up the class name containing the correct SSH interface for the device, using the device's model name (because the number of hardware models is so small, I have a hardcoded list in my web app); (3) the web app creates a new SSH object using the class loaded in step 2; (4) the SSH command is run and the session closed; (5) the command results are displayed on the web page. This all works fine. Now the end user wants me to be able to support multiple versions of firmware. But the catch is, they don't want to have to document the firmware version anywhere because of the amount of overhead this would create in maintaining the system database. In other words, I can't look up the firmware version based on the device. The good news is that it sounds like, at most, I'll have to support two different versions of firmware per device. One option is to name the classes like this: deviceX.1.php, deviceX.2.php, deviceY.1.php, deviceY.2.php, where "X" and "Y" represent the model names, and 1 and 2 represent the firmware versions. When a user runs a command, I will first try it with one of the class files; if it fails, I can try with the second. I would always try the newer version of firmware first, so in the above example I would load deviceX.2.php before deviceX.1.php. This will work, but it's not very efficient, and I can't think of another way around this. Any suggestions?
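
    A rough PHP sketch of the newest-first fallback described above, under a few assumptions: the file naming scheme proposed in the question (deviceX.2.php etc.), a class name derived from it (dots are not legal in PHP class names, so deviceX_2 is used here), and a hypothetical runCommand() method on the SSH wrapper classes.

      <?php
      // Try the newer firmware class first; fall back to the older one on failure.
      function runDeviceCommand($model, $host, $command)
      {
          foreach (array(2, 1) as $fw) {
              $file  = __DIR__ . "/{$model}.{$fw}.php";   // e.g. deviceX.2.php
              $class = "{$model}_{$fw}";                  // e.g. deviceX_2 (assumed)
              if (!is_file($file)) {
                  continue;
              }
              require_once $file;

              try {
                  $ssh = new $class($host);               // assumed constructor signature
                  return $ssh->runCommand($command);      // hypothetical interface method
              } catch (Exception $e) {
                  continue; // wrong firmware dialect, try the older class
              }
          }
          throw new RuntimeException("No firmware class could run '{$command}' on {$model}");
      }

    One refinement worth considering is caching, in memory or in the session, which firmware class last worked for a given device, so the failed first attempt is not repeated on every command.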

    Read the article

  • CPU Configuration Issue for 2 Servers (Server 2008 R2)

    - by Bill Moreland
    I have 2 servers running the exact same Classic ASP code with Access DBs (yes, not ideal, but it is what it is for now): 1) Xeon 5520 @ 2.27 GHz (6 GB memory), 2) Xeon E5-2620 @ 2.00 GHz (2 processors, 32 GB memory). For most pages the newer E5-2620 processes the pages 10-15% faster. On pages requiring heavy and/or multiple complicated Access stored procedures (queries), the older 5520 does a much better job. I believe the servers are configured nearly identically. My question: is it possible that the newer, multi-processor server is not as good at handling Classic ASP as the older single-processor one? Is there a configuration difference that I'm missing, since I'm shooting for identical implementations?

    Read the article

  • I suspect that my HDD is causing hardlocks, as all other components have been replaced. How can I check this theory and solve the potential cause?

    - by user867814
    I have had this problem for quite a while now, through multiple Linux kernel versions and distributions, as well as the replacement of all components aside from my main HDD: RAM, GPU (twice), motherboard, CPU, power supply. What happens is that at some point during the operation of the PC, it will hardlock: everything stops working, the external HDD is not shut down correctly and continues to spin until I unplug it and plug it back in, there are no system/kernel logs of any kind, and otherwise nothing that would suggest a cause. Another reason for my suspicion is that the failures happen almost exclusively during HDD read/write activity: shutdown (nearly 1/3 of the time so far, though it's only been a few days), launching programs, and once during the operation of apt. I hope the post is descriptive enough; if you need any additional info, ask (and tell me how to prepare/obtain it) and I will provide it. If I'm wrong, point me in the right direction. Thanks in advance.

    Read the article

  • Domain in a hosted environment

    - by cpgascho
    We have an application we host in a third-party data center for our clients. We have multiple clients running the same application on several racks of servers. Most of our clients require that our servers be SAS 70 compliant. Currently each server has its own set of users and security settings that need to be configured. We are creating scripts to do this, but what would be the risks/advantages of joining all the servers to a domain for user management and Group Policy for enforcing security settings? The rationale of some is that if the DC is hacked, the whole network would be compromised, whereas if one standalone hosted server is hacked, everything else should be safe.

    Read the article

  • Recommended laptops for system admins?

    - by 80skeys
    Hey all - I'm in the market for a new laptop and wanted to get some recommendations. I'm a Linux sysadmin and this laptop is primarily for work-related use: working from home, after hours, occasional trips to the data center, etc. We all know the drill. My typical setup includes various utilities and tools, plus multiple partitions for booting different OSes, including VMware. I need a serial port of course, a DVD-RW drive, and all the usual stuff we use in our daily routine. I'm kinda thinking a ThinkPad T510, but I'm open to other suggestions! A brief explanation of why you suggest a particular brand and configuration is appreciated.

    Read the article

  • What's an alternative to using public folders (in Outlook)?

    - by Ivo Flipse
    My colleagues abuse our mail server's public folders to store (old) emails so that everyone can read them using IMAP. I'm looking into good alternatives after reading the TechRepublic article "10 reasons why you should begin phasing out Exchange public folders". The most important thing they need is access to emails from multiple computers without overloading our network. So do you have any suggestions for alternatives? If there's a nice combination with some CRM system, that would be interesting too. Note: this doesn't have to be freeware; usability and efficiency are more important. The solution has to be Windows 32-bit only.

    Read the article

  • Equalizing Agent and Master Nagios on state change alone

    - by punith
    We have a setup where distributed Nagios instances run on multiple sites and "equalize" (sync) their data to the main Nagios server. The problem is that a slave sends the data back to the main Nagios server whether or not there is a state change in the host or service. Is it possible to configure the slave Nagios to check the service/host every 5 seconds but send back the data only if there is a state change? Currently it is implemented with Obsess Over Hosts/Services, which always runs the command that sends the data back. The Nagios version is 3. I am not an administrator but a developer, so I don't know the exact jargon; please bear with me.

    Read the article
