Search Results

Search found 30046 results on 1202 pages for 'document load'.


  • Linux : How to convert media files to DVD format and burn?

    - by James.Elsey
    I have a load of media files on my PC, mostly AVI/MKV plus some MPEGs. On Windows I would use ConvertXToDVD to convert these to DVD format and burn them to disc. That application also lets you save a bit of space on the DVD so the original AVI file can be included on the disc as well. How can I do this on Linux? What are the alternatives to this Windows application? I could try to run ConvertXToDVD under Wine, but I would prefer to find a native Linux solution. Thanks in advance!
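
    For reference, one possible native pipeline, sketched here as an untested example assuming ffmpeg, dvdauthor and growisofs are installed (file and device names are placeholders):

        # Convert to a DVD-compliant MPEG-2 stream (use ntsc-dvd for NTSC regions)
        ffmpeg -i input.avi -target pal-dvd output.mpg

        # Build the VIDEO_TS structure, then write the table of contents
        dvdauthor -o dvd/ -t output.mpg
        dvdauthor -o dvd/ -T

        # Burn the authored directory as a video DVD
        growisofs -dvd-compat -Z /dev/dvd -dvd-video dvd/

    GUI front-ends such as DeVeDe wrap roughly this same toolchain.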

  • Ubuntu install can't find hard drives

    - by Casey Hungler
    I recently got a Dell Inspiron Special Edition 7720 and I am trying to install Ubuntu alongside Windows. When I use the WUBI installer, the Ubuntu installation works as long as I do not boot into Windows; once I boot into Windows and then go back to Ubuntu, I get a variety of error messages claiming a corrupt or missing kernel, root directory, etc. I have been working on this problem for about a week and have reinstalled Ubuntu MANY times. So far I have ruled out all of the following: a corrupt WUBI download (downloaded multiple times, used on other systems); bad install media (I have tried both a CD and a flash drive, and both work on other computers); a program within Ubuntu creating the problem; and an unsupported host OS (others have successfully installed Ubuntu on a computer with my operating system, Windows 7 SP1). This is a much shortened version of the original question, which has been up for about 5 days and included a more detailed description of the problem, but left everyone clueless as to its source.

    When I spoke with the Dell service technician who came over today to replace my keyboard, he suggested that the driver for my HDD is so new that it is not compatible with the current version of Ubuntu. His reasoning is as follows:

    1) During an install from a flash drive or CD, where I am supposed to get the option to wipe my system or create a dual boot, I get a window that asks me to select a hard drive partition, but none are listed.

    2) This model of computer went on sale in June of this year, while this Ubuntu release came out in April.

    Adopting this theory, it would seem that the WUBI install fails after booting into Windows because Ubuntu can no longer find the files it needs to load. Does this theory seem at all plausible to anyone? I just want to install Ubuntu and have it stay on my computer. I don't care how I put it there, I just need it to work, so I would TRULY appreciate any advice or suggestions. Thanks so much for your time and support!!!

  • Advantages to upgrading from SharePoint Foundation 2010?

    - by sharepointQuestion
    I feel like this should be extremely obvious, but after staring at this document from Microsoft and Googling for a while, I'm still at a loss as to the advantages of SharePoint Server 2010 and SharePoint Enterprise 2010 over SharePoint Foundation 2010. My users currently use SharePoint Foundation 2010 to collaborate on a handful of Excel documents within the office. There is talk of expanding to a second and third SharePoint server at another plant and at our corporate offices. If there is a reason to upgrade, now would be a good time to ask for the money while we're talking expansion. Is it worth it from either an administrative or an end-user perspective? Or is the free version really just that wonderful?

  • Planning office network [closed]

    - by gakhov
    I'm planning to set up my office network from scratch and want to ask for professional opinions or tips. My office is connected to the Internet with a cable connection (100 Mb/s). The devices I would like to connect are a VoIP phone (RJ-11), a TV (WiFi/LAN), 3 laptops (WiFi), a few smartphones (WiFi), an iPad (WiFi), a Kindle (WiFi) and, probably, a media server (WiFi/LAN). As you can see, most of the load will be on WiFi connections (although even if the TV supports WiFi, it's probably better to connect it by LAN?). So I need help choosing the best combination of routers (or even just one?) to support stable connections for all these devices and minimize the total number of routers/adapters. Any thoughts? Thank you!

  • Is there a way for IE9 on a virtual machine to do AD auth when the VM isn't joined to the domain but the host machine is?

    - by Micah Armantrout
    I have a virtual machine running IE9 and Windows 7 with the latest updates that I want to use to test my intranet site (an ASP.NET application). I can't add the virtual machine to the domain, and I don't want to have to type my AD credentials every time I load the site. Is there a way for IE in the VirtualBox guest to authenticate with the AD credentials of the host machine, so I don't always have to put my username and password in? I guess I could just have IE on the virtual machine remember my username and password, but other than that, is there another way to do this?

  • Server installation logging / logbook / diary?

    - by The MYYN
    Are there field-tested ways to keep a kind of logbook for a server? Including:

    - software installations (and de-installations)
    - custom configurations (e.g. of a web server, ssh daemon, etc.)
    - personal notes
    - the big picture

    I am preparing a server and would like to extensively document its state and how that state was established over time, so that a new person can easily see what's going on and why. The setup is not too complicated, but I would like to do it anyway. I once used something like "Maintain /etc with mercurial" on Debian and it was nice, but I am looking for a slightly more flexible solution.

    Addendum: So I am interested in logging and documentation first. In an ideal world, however, I would like to have a command that, in a few steps, would take me from a bare, newly installed Unix system to a functional environment with all the components set up and in place, by means of, say, an 'executable' log. But that would be very ideal, I imagine.
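
    A minimal sketch of one way to combine both ideas, assuming a Debian-style system with git available (the package names and log entry below are illustrative only):

        # Version-control /etc so configuration changes are recorded automatically
        apt-get install etckeeper
        etckeeper init && etckeeper commit "baseline after fresh install"

        # Keep a human-readable (and re-runnable) logbook under git as well
        mkdir -p /root/logbook && cd /root/logbook && git init
        cat >> setup.sh <<'EOF'
        # 2014-06-01: web server install and configuration
        apt-get install -y nginx
        EOF
        git add setup.sh && git commit -m "log: nginx install"

    Replaying setup.sh on a fresh machine then approximates the 'executable log' described above.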

  • Dynamically changing one-node Cassandra cluster to two nodes

    - by Jason Axelson
    So I have an application that will be very dormant most of the time but will need to handle high bursts a few days out of the month. Since we are deploying on EC2, I would like to keep only one Cassandra server up most of the time, and then on burst days I want to bring one more server up (with more RAM and CPU than the first) to help serve the load. What is the best way to do this? Should I take a different approach? Some notes about what I plan to do:

    - Bring the node up and repair it immediately
    - After the burst time is over, decommission the powerful node
    - Use the always-on server as the seed node

    My main question is how to get the nodes to share all the data, since I want a replication factor of 2 (so both nodes have all the data), but that won't work while there is only one server. Should I bring up 2 extra servers instead of just one?
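
    For reference, a rough sketch of the burst-day workflow with standard Cassandra tooling (the host name is a placeholder, and this assumes auto_bootstrap is enabled and the keyspace is already defined with replication factor 2):

        # After the new node joins the ring, make sure it holds a full replica
        nodetool -h burst-node repair

        # When the burst period is over, stream its data back and leave the ring
        nodetool -h burst-node decommission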

  • Windows 7 – Fun with VHD

    - by guybarrette
    I'm teaching about TFS 2008 next week and I wanted to use TFS in a virtualized environment, so I downloaded the TFS + Team Suite VPC image from Microsoft's website. Working in Windows 7, I opened the VM with the built-in Windows Virtual PC. The VM loads fine, but the problems started when I tried to install the VM additions: I simply couldn't get them to install properly. I then looked at VMware and found that they have a product called VMware Player that can load Virtual PC VMs. Tried that, but VMware Player failed in converting the VHD. I then looked at VirtualBox. Created a new VM, attached the VHD and bingo! Worked like a charm. The only real caveat is that the guest Windows will ask for the OS CDs to install new drivers, so you must have either the CD/DVD or the ISO file (sweet!) to proceed. OK, I got it working in VirtualBox, but I'm curious why I couldn't install the additions from Windows 7 Virtual PC onto a Windows Server 2003 VM. Anyone have a clue? BTW, thanks to Rolly Perreaux, who pointed me to his blog where he goes into great detail explaining how to use VM images with VirtualBox. Good stuff!

  • High CPU Steal percentage on Amazon EC2 Instance

    - by Aditya Patawari
    I am experiencing a high CPU steal percentage on an Amazon EC2 large instance. I know it means that my virtual CPU is waiting on the real CPU of the machine for time. My question is: what can I do to reduce this percentage and get the maximum out of the CPU? The steal percentage is consistently at 20%, and the system load crosses 10 when this happens. I have checked memory and network and I am sure they are not the bottleneck. Is this normal for such an environment? Also, are there any system-level optimization techniques for reducing the steal percentage from inside the virtual instance?

        avg-cpu:  %user  %nice  %system  %iowait  %steal  %idle
                  52.38   0.00     8.23     0.00   21.21  18.18

  • Oracle Enterprise Manager 12c Testing-as-a-Service Solution

    - by user810030
    With organizations spending as much as 50 percent of their QA time on non-test-related activities like setting up hardware and deploying applications and test tools, the cloud brings obvious benefits. As a key component of Oracle Enterprise Manager, our current Application Quality Management products have been helping customers with application load testing, functional testing and test process management, but also test data management, data masking and real application testing. These products enable customers to thoroughly test applications and their underlying infrastructure to help ensure the best quality, scalability and availability prior to deployment. Today, Oracle announced the Oracle Enterprise Manager 12c Testing-as-a-Service Solution. This solution will allow users to significantly decrease the time needed to set up a complete test environment while enhancing testing efficiency. Please read the press release mentioned above and join us in our Enterprise Manager LinkedIn Group discussion on this topic (you need to be a member), or visit our booth this week during the EuroSTAR Software Testing conference in Amsterdam, where we can demo this solution. I hope you find this helpful.

  • PHP session files have permissions of 000 - They're unusable

    - by vanced
    I kept having issues with a Document Management System I'm trying to install: at the first step of the installation process, it would error with

        Warning: Unknown: open(/tmp/sess_d39cac7f80834b2ee069d0c867ac169c, O_RDWR) failed: Permission denied (13) in Unknown on line 0
        Warning: Unknown: Failed to write session data (files). Please verify that the current setting of session.save_path is correct (/tmp) in Unknown on line 0

    I looked in /tmp and saw that the sess_* files have the following permissions:

        ---------- 1 vanced vanced 1240 Jan 20 08:48 sess_d39cac7f80834b2ee069d0c867ac169c

    All the session files look like this, so obviously they're unusable by PHP and that's causing me lots of problems. How can I get PHP to set the correct permissions? I've tried changing the directory which php.ini uses to /tmp/phpsessions, and the same thing occurs. The directories are a+rwx.
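
    A few diagnostic steps that may help narrow this down (a sketch only; the right fix depends on what is zeroing the mode bits):

        # Confirm where PHP actually writes its session files
        php -i | grep session.save_path

        # /tmp itself should normally be mode 1777 (world-writable with sticky bit)
        ls -ld /tmp /tmp/phpsessions

        # Watch what mode a freshly created session file receives
        ls -l /tmp/sess_* | head

    Files created with mode 000 suggest an extreme umask (0777) inherited by the web server or PHP process, so checking how Apache or PHP-FPM is launched (init script, environment) is one place to start.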

  • glTexImage2D not loading my data

    - by Clyde
    Can anyone suggest why this code doesn't work? When I draw using this texture, all I get is black. If I use GLUtils.texImage2D() to load a PNG file, it works correctly.

        ByteBuffer bb = ByteBuffer.allocateDirect(128*128*4).order(ByteOrder.nativeOrder());
        bb.position(0);
        for(int row = 0; row != 128; row++) {
            for(int i = 0; i != 128; i++) {
                bb.put((byte)0x80);
                bb.put((byte)0xFF);
                bb.put((byte)0xFF);
                bb.put((byte)i);
            }
        }

        int[] handle = new int[1];
        GLES20.glEnable(GLES20.GL_TEXTURE_2D);
        GLES20.glGenTextures(1, handle, 0);
        DrawAdapter.checkGlError("Gen textures");
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, handle[0]);
        DrawAdapter.checkGlError("Bind textures");
        bb.position(0);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, 128, 128, 0,
                GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, bb);
        DrawAdapter.checkGlError("glTexImage2D");
        return handle[0];

  • Improve efficiency when using parallel to read from compressed stream

    - by Yoga
    This is another question extending an earlier one [1]. I have a compressed file and stream it into a python program, e.g.

        bzcat data.bz2 | parallel --no-notice -j16 --pipe python parse.py > result.txt

    parse.py reads from stdin continuously and prints to stdout. My EC2 instance has 16 cores, but the top command shows a load average of only 3 to 4. In ps, I am seeing a lot of entries like:

        sh -c 'dd bs=1 count=1 of=/tmp/7D_YxccfY7.chr 2>/dev/null';

    I know I could improve performance using -a in.txt, but in my case I am streaming from bz2 (I cannot extract it first since I don't have enough disk space). How can I improve efficiency in my case?

    [1] Gnu parallel not utilizing all the CPU
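
    One avenue worth trying, sketched under the assumption that parse.py can process records in any order: hand each worker larger blocks and distribute them round-robin, which cuts down on the one-byte-at-a-time dd reads visible in ps:

        # --block raises the chunk size per dispatch (the default is 1M);
        # --round-robin keeps all 16 workers fed from a single input reader
        bzcat data.bz2 | parallel --no-notice -j16 --pipe --block 10M --round-robin \
            python parse.py > result.txt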

  • Booting off a ZFS root in 14.04

    - by RJVB
    I've been running a Debian derivative (LMDE) on a ZFS root for half a year now. It was created by cloning a regular ext4-based install, with all the necessary packages, onto a ZFS pool, chrooting into that pool and recreating a grub menu and bootloader. The system uses a dedicated ext3 /boot partition. I would like to do the same with Ubuntu 14.04, but have encountered several obstacles:

    - There is no Trusty zfs-grub package.
    - The default grub package doesn't have ZFS support built in. I found a small bug in the build system responsible for that (report with patch created) and built my own grub packages.
    - The built-in ZFS support is dysfunctional: it does not add the proper arguments to the kernel command line, so I installed the ZoL grub package I also use on my LMDE system, which does give me a correct grub.cfg.
    - Even with that correct grub.cfg, the boot process apparently doesn't retrieve the bootfs parameter from the ZFS pool; instead, the variable that's supposed to receive the value remains empty. As a result, the initrd tries to import the default pool ("rpool"), which fails, of course. I can, however, import the pool by hand and complete the process manually.

    If memory serves me well, I also had to disable AppArmor to keep the boot process from blocking after importing the pool. Am I overlooking something? Just for comparison, I installed the Ubuntu 3.13 kernel on my LMDE system, and that works just fine (i.e. the identical kernel and grub binaries allow successful booting without glitches on LMDE but not on Ubuntu).
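
    One possible workaround, as an untested sketch (the pool and dataset names are placeholders for the actual root dataset): spell the root out explicitly on the kernel command line so the initramfs does not have to read bootfs from the pool at all:

        # in /etc/default/grub, followed by running update-grub
        GRUB_CMDLINE_LINUX="boot=zfs root=ZFS=rpool/ROOT/ubuntu-trusty"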

  • Apache routing vhosts to /var/www

    - by FHannes
    One user at my site has reported that he reaches the content of /var/www when browsing to any of the vhosts on my server. As far as I'm aware, my Apache configuration does not contain a document root that references this folder. On top of that, this user seems to be the only one experiencing the issue. According to his ISP, the issue isn't caused by them; yet on his mobile connection he can access the site. When browsing to my server's IP, he also receives the correct content from the default vhost. What could be the possible causes of this issue, and how can I get it to stop? I've explored pretty much every option I could think of.
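
    As a starting point, a quick diagnostic (not a fix) that shows exactly how Apache has parsed the vhosts, including which one is the default for each address and port:

        # Dump the resolved virtual host configuration
        apachectl -S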

  • Issue updating domain name servers from BlueHost to AWS

    - by cowls
    I am trying to migrate my site's hosting from Bluehost to an AWS cloud-based service. I have the site up and running on AWS with an Elastic IP configured, and it loads fine when I specify the IP address in the browser. I have gone into Route 53 in the AWS console and created a hosted zone for the domain, then created a new record set of type "A" using the IP address as the value. The domain name is registered with Bluehost; I've logged into the Bluehost account and updated the domain's name servers to point to those specified by Route 53 in the AWS console. When I hit the IP address directly, the site loads; however, it doesn't load when using the domain name (I get a Google Chrome "oops" error page saying the page was not found). I've tried using this site to debug: http://dns.squish.net/ but it seems to be giving me the correct results:

        fizaclegems.com 300 IN A 107.20.209.78

    where 107.20.209.78 matches the Elastic IP configured in the AWS console. This is the result it gives for all 4 name servers. Am I missing a step here? Does anyone know what else I should be doing or looking for?
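
    One way to verify each link of the chain by hand (a diagnostic sketch; the Route 53 server name below is a placeholder for one listed in the hosted zone):

        # Follow the delegation from the root servers down
        dig +trace fizaclegems.com A

        # Query one of the zone's Route 53 name servers directly
        dig @ns-1234.awsdns-56.org fizaclegems.com A

    If the second query answers correctly but the trace does not, the registrar-side name server change likely hasn't taken effect or propagated yet.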

  • Odd SVN Checkout failures occur frequenctly on VMWare virtual machines

    - by snowballhg
    We've recently been experiencing seemingly random SVN checkout failures on our Hudson build system. A Google search has failed me; I'm hoping the Super User community can help me out :-) We occasionally receive the following SVN error when our Hudson build jobs check out source via the Hudson Subversion plug-in (which uses SVNKit):

        ERROR: Failed to check out http://server/svnroot/trunk
        org.tmatesoft.svn.core.SVNException: svn: Processing REPORT request response failed: XML document structures must start and end within the same entity. (/svnroot/!svn/vcc/default)
        svn: REPORT request failed on '/svnroot/!svn/vcc/default'

    This issue seems to occur only when checking out from our virtual machines (Windows XP, Fedora 9, Fedora 12) using Hudson's SVN plug-in. Systems that use the traditional SVN client seem to work. SVN server version: 1.6.6. Hudson version: 1.377. Hudson SVN plugin version: 1.17. Has anyone dealt with this issue, or have any suggestions? Thanks

  • Virtualized Development Server for simulating 3-Tier Environment

    - by chris.cyvas
    Hello, I am thinking about buying a new server-based development box for development (redundantly redundant, I know ;)). Ideally, I want to run something like ESXi or the Xen hypervisor at the lowest level. Then I want to add (at least) 5 Linux VMs for the following uses:

    - 2 web servers
    - 2 application servers
    - 1 database server

    I want to load-balance the 2 web servers and the 2 application servers, and (somewhat obviously) they all need to be networked together to simulate a production environment. Also, it used to be the case that the recommendation was to put each VM on its own hard drive, but I'm not sure that holds water anymore. Does anyone have any advice on how to pull this off? Gotchas, look-outs, etc.? Thanks!

  • Help updating cron entry using regular expressions

    - by Uday
    Hi, I am trying to update a cron entry NOT by using crontab -e but with shell commands. For example, the cron entry looks like this:

        10 * * * * /home/localuser/foo.sh -b 1 -h 4 > foo_output.sh 2>&1

    Now I need to edit ONLY the command-line parameter part, i.e. -b 1 -h 4, replacing it with something else that will be coming in from the user. The first thing would be to write the crontab to a temp file and then manipulate that file. Now, is there an easy way to edit that line using sed or something? The crude way would be to delete the entire line, write a new line with the entire expression and then load that into cron. I am not very good with regular expressions. My system supports sed -i, so I was thinking this could be done in a single-line command. Thanks in advance
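
    A minimal sketch of the single-pipeline approach (the replacement values -b 2 -h 8 stand in for whatever the user supplies; | is used as the sed delimiter so the slashes in the path need no escaping):

        # Rewrite only foo.sh's options, then reload the whole table
        crontab -l | sed 's|\(foo\.sh\) -b [0-9]* -h [0-9]*|\1 -b 2 -h 8|' | crontab -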

  • Make an animation path separated by clicks

    - by Tomáš Zato
    I have a long text on a PowerPoint slide. Instead of splitting it across multiple slides, I made an animation that moves it up using an animation path, so that text hidden at the bottom appears while the text at the top goes off screen. However, I need more move animations to reveal more text (the text takes more than 2 screens). This means I need two (or more) animation paths (of the same length), and I want each of them to move the object from the position where the previous path left it. Instead, multiple animations always operate from the object's original position. That's useless. You can download a test document where I made an example of what I want: animation test.pptx

  • Broken screen on laptop?

    - by John
    I recently damaged the screen on my HP notebook. My main concern is that I have RoboForm installed and on my taskbar. The trouble is, because the screen is totaled, I can't see anything, and I discovered, to my horror, that I haven't backed any of the IDs and passwords up to a text document. Of course, I cannot access any of my content that requires an ID and password unless I can get at the RoboForm data on the notebook! I have now ordered a SATA/USB cable for the hard drive, which I have removed from the notebook. Will I be able to access RoboForm that way, along with ALL the login details?

  • CLR Profiler Allocated Bytes and XNA ContentManager

    - by Vackup
    I've been fighting with the XNA ContentManager and memory allocations for some weeks because I'm trying to port my game from XNA (Windows) to ExEn/MonoTouch (iPhone). The problem is that after playing a few levels, my game exits unexpectedly on a real iPhone device (not the simulator). Profiling memory usage on Windows with CLRProfiler, I found some useful stuff, but I also found something I don't understand. If I use 2 ContentManagers (1 for shared assets and 1 for level assets), then when profiling, "Allocated Bytes" grows and grows level after level, while memory consumption as measured by the Windows Task Manager stays constant (it goes down when I unload the content manager and up again when I load content). Obviously, I call contentManager.Unload() when a level ends. After a few levels, my game exits unexpectedly on the iPhone. If I use 1 content manager, "Allocated Bytes" in CLRProfiler stays constant on Windows and on the iPhone; I can play the game normally and it doesn't exit unexpectedly. I use the same assets level after level. It seems that on iOS (iPhone), loading and unloading the same assets allocates memory until all device memory is consumed, so iOS kills the app. Can anybody explain how this really works? I've read quite a bit, but I still don't understand what's going on.

  • Storing images in file system and returning URLs or virtually resizing and returning byte arrays?

    - by ismaelf
    I need to create a REST web service to manage user-submitted images and display them all on a website. Multiple websites are going to use this service to manage and display images. The requirement is to have 5 pre-defined image sizes available. The 2 options I see are the following:

    1. The web service creates the 5 images when the user submits the original, stores them in the file system, and stores the URLs in the database. When an image is requested, the web service returns an array of URLs. I see this option as a little hard on the hard drive: the estimates are 10,000 users per site and, let's say, 100 sites. The heavy processing is done when the user submits the image, and each image is pulled straight from the file system.

    2. The web service stores just the image the user submits in the file system, and its URL in the database. When a user requests images, the web service gets the info from the DB, loads the image into memory, creates its 5 instances and returns an object with 5 image arrays (I would probably cache the arrays). This option is harder on the processor and memory: the heavy processing is done when the images are requested.

    A plus I see for option 2 is that it would let me rewrite the image URLs and make them site-dependent (prettier), rather than having one image repository for all websites. But this is not a big deal. What do you think of these options? Do you have any other suggestions?

  • My home partition slowly fills up until the system is unable to complete even simple tasks

    - by user973810
    So I have an awesome configuration on my home PC: my /home directory has its own partition. That partition slowly fills up until the system is unable to complete even simple tasks. For example, when this issue has occurred, trying to load Firefox just pops up an error message saying it cannot be done. Rebooting solves the issue. The strange thing is, I've run baobab and it doesn't notice a problem: there should be hundreds of gigabytes of data somewhere, but it doesn't see them. Does anyone have any idea how I might troubleshoot this? I'm thinking I could run lsof, but I've always found its output to be too much information. Maybe my drive is, like, dying.

    Edit: is there a /home analog of /var that gets cleared out at boot time? Maybe I could check in there next time I notice this problem, to see if I can divine what's up.

    Update: I found the issue. My .xsession-errors file is filling up with:

        Authentication deferred - ignoring client message

    Is there a way I can see what is causing this error and fix it?
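
    A quick way to catch the writer in the act (a diagnostic sketch; it assumes the file is still being appended to when you look):

        # Which process has ~/.xsession-errors open for writing?
        lsof ~/.xsession-errors

        # Reclaim the space without deleting the open file
        :> ~/.xsession-errors

    Truncating only buys time, but the lsof output should point at the client generating the "Authentication deferred" messages.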

  • How to add entries to combobox in Word XP

    - by Kris C
    I'm trying to follow these instructions to add a combobox to a form and populate it with some values for the user to pick: http://office.microsoft.com/en-gb/help/create-forms-that-users-complete-in-word-HP005230270.aspx I've created a .dot and then dragged a combobox onto the document. When I double-click on it, though, it opens the VBA editor. Do I have to add the items programmatically, as per the question "How do I add a combobox in Word?", or is it possible to do this using the UI? The other question/answer refers to creating a form. Is there a step I've missed between creating a .dot and creating a form before adding form elements to it? Right-clicking gives the following menu options:

    - Cut
    - Copy
    - Paste
    - Properties
    - View code
    - Combo Box Object - Edit or Convert
    - Format Control
    - Hyperlink

    Choosing Edit lets me type some text onto the visible part of the control, but doesn't let me add multiple options. Right-clicking and selecting Properties opens a properties window (the original question included a screenshot here).
