Search Results

Search found 17990 results on 720 pages for 'virtualization option'.


  • Input not supported message when monitor is powered off then back on

    - by Jason Down
    I've been getting a message on my monitor where "Input not supported" is floating around. This only happens when I manually turn the monitor off and then later turn it back on. Leaving the monitor on and allowing it to go to the screen saver doesn't seem to cause the issue (but I prefer to turn the monitor off if I'm going to be away from the computer for any length of time). Any ideas what might cause this, only when the monitor is turned off manually? Specs: Acer X203w monitor, Radeon 9600 Pro video card, Linux Mint 8, resolution 1680x1050 (16:10, the monitor's preferred native resolution), refresh rate 60 Hz. Here is what is in my xorg.conf file:

        Section "Device"
            Identifier "Radeon 9600"
            Driver "ati"
            BusID "PCI:1:0:0"
            Option "XAANoOffscreenPixmaps"
            Option "AccelMethod" "XAA"
        EndSection

        Section "Screen"
            Identifier "Default Screen"
            Device "Radeon 9600"
            DefaultDepth 24
            SubSection "Display"
                Depth 24
                Modes "1680x1050" "1440x900" "1024x768"
            EndSubSection
        EndSection

        Section "DRI"
            Mode 0666
        EndSection

        Section "Extensions"
            Option "Composite" "Enable"
        EndSection
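    If the card is falling back to a mode the panel rejects after the power-cycle, one hedged workaround is to re-assert the native mode from a terminal once the message appears - the output name DVI-0 below is only a guess for this card, so check the list first:

        xrandr                                               # list the outputs and the modes the driver currently offers
        xrandr --output DVI-0 --mode 1680x1050 --rate 60     # re-apply the panel's native mode and refresh rate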

    Read the article

  • Problems installing Ubuntu on a vaio with SSD, GRUB installation failure

    - by Alberto
    I have installed and used Ubuntu on several computers. But now I have a problem that I don't know how to solve. I have a Vaio (product name: vpcz13c5e) with a 128 GB SSD. I decided to install Ubuntu (12.04, but I have tried older versions as well). First I tested with a live USB, and everything was fine, so I decided to go for the complete installation. Then everything went as follows: I chose to use the whole disk (first option, formatting everything), and I got the message "Executing 'grub-install /dev/sdb' failed. This is a fatal error." After clicking OK I got another window with three options: the first offers different devices to install the bootloader on (I tried all of them and none works); the second is to continue without a bootloader, in which case I got "You will need to manually install a bootloader in order to start Ubuntu"; the third is to cancel the installation. So I chose to continue without a bootloader. Then I restarted the computer (with the live CD) and in a terminal typed sudo fdisk -l, but I got "fdisk: unable to seek on /dev/sda: Invalid argument". What can I do? Any help will be appreciated.
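    One common recovery path, sketched here only under assumptions - the SSD may enumerate as /dev/sda with the root filesystem on /dev/sda1, which must be confirmed first - is to reinstall GRUB from the live USB via a chroot:

        sudo lsblk                                    # confirm which device is the SSD and which partition holds /
        sudo mount /dev/sda1 /mnt                     # mount the installed root filesystem
        for d in /dev /dev/pts /proc /sys; do sudo mount --bind "$d" "/mnt$d"; done
        sudo chroot /mnt grub-install /dev/sda        # write GRUB to the disk's MBR
        sudo chroot /mnt update-grub                  # regenerate grub.cfg
        for d in /sys /proc /dev/pts /dev; do sudo umount "/mnt$d"; done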

    Read the article

  • More Oracle VM templates for PeopleSoft and Oracle Enterprise Manager

    - by wcoekaer
    Just as I wrote up a blog promoting the Oracle VM E-Business Suite templates, we also pushed out two other products: Oracle Enterprise Manager Cloud Control 12c, and Oracle PeopleSoft FSCM 9.1 with PeopleTools 8.52.03. They can be downloaded from eDelivery. Same advantage... you download the template, import it, and you have a completely pre-installed set of products. That's application deployment, not just VM deployment. That's flexibility across the stack - not just a hypervisor, not just virtualization, but a complete solution stack.

    Read the article

  • Multiple instances of Intellitrace.exe process

    - by Vincent Grondin
    Not so long ago I was confronted with a very bizarre problem… I was using Visual Studio 2010, and whenever I opened up the Test Impact view my PC's performance would suddenly drop drastically…  Investigating this problem, I found out that hundreds of "Intellitrace.exe" processes had been started on my system, and I could not close them as they would restart as soon as I closed one.  That was very weird.  So I knew it had something to do with Test Impact, but how could this feature and Intellitrace.exe going crazy be related?  After a bit of thinking I remembered that a teammate (Etienne Tremblay, ALM MVP) had told me once that he had seen this issue before, just after installing a MOCKING FRAMEWORK that uses the .NET Profiler API…  Apparently there's a conflict between the Test Impact features of Visual Studio and some mocking products using the .NET Profiler API…  Maybe because VS 2010 also uses this feature for Test Impact purposes, I don't know… Anyway, here's the fix.  Go to VS 2010 and click the "Test" menu, then go to "Edit Test Settings" and, for EACH test settings file (normally 2 files, "Local" and "TraceAndTestImpact"), apply the following actions:
    - Select the Data and Diagnostics option on the left
    - Make sure that the "ASP.NET Client Proxy for IntelliTrace and Test Impact" option is NOT selected
    - Make sure that the "Test Impact" option is NOT selected
    - Save and close
    Problem solved…  For me, having to choose between the Test Impact features and the mocking framework was a no-brainer: bye bye, Test Impact…  I did not investigate much further on this subject, but I feel there might be a way to have them both working by enabling one after the other in a precise sequence…  Feel free to leave a comment if you know how to make them both work at the same time!  Hope this helps someone out there!

    Read the article

  • XAML RadControls Q1 2010 Official

    The Q1 2010 release focuses on strengthening three main aspects of RadControls for Silverlight and RadControls for WPF: ensuring first-class performance for all data-centric controls through various techniques, enhancing and polishing the RadControls themes, and providing highly advanced, enterprise-level features, especially for the data visualization controls. We know that performance is crucial for line-of-business applications. Therefore, we always make sure that RadControls can help you achieve unmatched performance, and this has always been our number one priority. RadControls achieve unbeatable performance through UI and data virtualization, data sampling and built-in load-on-demand features. Several of the major controls in the bundles have been enhanced with UI virtualization support: Scheduler, CoverFlow and Book. As part of Q1 2010 we also want to bring unparalleled visual richness to your applications. To achieve that we have done a major rework of all our themes. We used a uniform templating approach across all controls, streamlined naming conventions for resources and delivered a much more consistent look for the controls along the way. The RadControls for WPF bundle has been enriched with two new controls: Map and Book. Another new control has been included in the Q1 2010 release; however, it remains in a CTP stage. This is the Transition control. We decided that this is the better way to proceed, as we will need some more input from our community on how exactly to develop this control further. Therefore, we will be blogging regularly on the development progress so that we can clearly indicate the direction in which the control is evolving and gather your feedback on whether this is the best direction. Our charting controls for Silverlight and WPF have been advanced with major new features such as data sampling, zooming and scrolling, automatic SmartLabels positioning, sorting and filtering, and many more. The new built-in paging of the GridView control now allows you to page through your data, resulting in an even faster and more responsive grid that can easily handle enormously large datasets.

    Read the article

  • which server is best for me??

    - by mathew
    I am looking for the best hosting service for my website. My website is a PHP/MySQL-driven site which has site scraping for more than 10 websites, about 8-10 APIs to parse, reading a roughly 150 MB data file from the local hard drive, plus RSS parsing, live graphs from other sites, a geo map from Google, the Google Maps API and so on, and one widget that shows real-time results for anyone who chooses it. So my question is: which is the best option for me? Actually I am considering SoftLayer cloud, as they are pretty cheap and offer more facilities than Rackspace. Another option is a dedicated server with 2 single-core processors, 4 GB RAM and a 250 GB SATA II drive, with a 100 Mbps uplink. So please tell me which will be the best option. I heard that a dedicated server has a lot more limitations than the cloud, but the cloud uses a SAN for storage, so I am afraid the read performance for the database may be a bit slower... and their basic plan's RAM is only 1 GB.

    Read the article

  • Nomachine 4 for X forwarding

    - by Yair
    I have been using the NoMachine NX client to connect from my Mac to an Ubuntu server for a while now, and it has been a great experience. The most useful feature for me was the option to open up just one application on the remote machine, instead of a full remote desktop connection. I used it to open a terminal on the remote machine. Basically it was a much faster, much better replacement for ssh -X. All was great until I upgraded to the new version, NoMachine 4. In this version I cannot find that option. I have to run a full remote desktop session, which slows things down and is also much less convenient for my work. Was this option removed from the client? Or is it hiding somewhere in there and I just can't find it?
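    Until that single-application mode resurfaces, a minimal sketch of the plain X forwarding fallback mentioned above - assuming X11 forwarding is allowed on the server, an X server such as XQuartz is running on the Mac, and gnome-terminal is just an example remote program:

        ssh -X user@ubuntu-server gnome-terminal      # forward just one application's window over SSH
        ssh -XC user@ubuntu-server gnome-terminal     # add compression, which can help on slower links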

    Read the article

  • Flashback Data Archives: A good memory for DBAs and developers

    - by Heinz-Wilhelm Fabry (DBA Community)
    Data is stored and in some cases retained for a long time. Sometimes data is changed after it has first been stored, perhaps even several times. Depending on legal or business requirements, those changes may even have to be traceable. That in turn calls for mechanisms that ensure the sequence of versions has no gaps. And implicitly it also means that the versions themselves must be protected against deletion and modification. Versioning can be done by the application that captures the data in the first place, by triggers, or by special tools. Each of these solutions has its own weaknesses. On top of that, there is still the question of protecting versioned data against unauthorized deletion or modification. Flashback Data Archives solve this problem: they not only provide an effective mechanism for versioning records, they also protect those versions against modification and finally even delete them automatically once their retention period has expired. Originally, the archives were introduced under the name Total Recall as a separately licensed option to the Enterprise Edition of Oracle Database 11g. At the end of June 2012, Flashback Data Archives lost their status as a standalone option. Because the archives were always compressed, Oracle instead made them a feature of the Advanced Compression Option (ACO) of the Enterprise Edition. As of database version 11.2.0.4, however, compression is no longer mandatory for the archives but optional. That changes the licensing situation once again: anyone who uses compression still has to license ACO, whereas anyone who uses Flashback Data Archives without compression - developers, for example - gets them as part of every edition of the database from 11.2.0.4 onwards. This change is documented in the licensing manuals for database versions 11.2 and 12.1. Flashback Data Archives have been covered in the DBA Community before; the present article replaces all previous posts on the subject.
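    A minimal sketch of how such an archive is created and used - the connection, the tablespace name users, the table hr.employees and the retention period are all placeholder assumptions, and this is the uncompressed variant discussed above:

        # create an archive with a retention period, attach a table, then read back an old row version
        # (assumes an Oracle client, a DBA-capable account, and that the hr.employees demo table exists)
        printf '%s\n' \
          "CREATE FLASHBACK ARCHIVE fda_7y TABLESPACE users RETENTION 7 YEAR;" \
          "ALTER TABLE hr.employees FLASHBACK ARCHIVE fda_7y;" \
          "SELECT last_name FROM hr.employees AS OF TIMESTAMP (SYSTIMESTAMP - INTERVAL '1' HOUR);" \
          | sqlplus -s / as sysdba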

    Read the article

  • Hypervisor

    Datamation: "A hypervisor is a software technology used in virtualization, which allows several operating systems to run side-by-side on a given piece of hardware."

    Read the article

  • create video from jpg images using ffmpeg

    - by floppydisk
    I want to make a short timelapse video using ffmpeg under Ubuntu 12.04 LTS. I have a folder containing all the images, with names DSC_0000.jpg, DSC_0001.jpg and so on. I found the question "ffmpeg: create a video from images" and tried to run the same command mentioned there:

        ffmpeg -i DSC_%d.jpg -vcodec mpeg4 timelapse.avi

    and it fails with "DSC_%d.jpg: No such file or directory". I've also tried

        ffmpeg -i DSC_%04d.jpg -vcodec mpeg4 timelapse.avi

    and it fails with the same error. Also, for some reason my ffmpeg does not understand the -start_number option; if I run

        ffmpeg -start_number 0 -i DSC_%d.jpg -vcodec mpeg4 timelapse.avi

    I get the error "Unrecognized option 'start_number'. Failed to set value '0' for option 'start_number'". I would appreciate any help.
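    Since the frames are zero-padded to four digits, DSC_%04d.jpg should normally match when ffmpeg is started from the folder that actually contains them; a hedged workaround that also avoids the missing -start_number support in older builds is to symlink the frames to a fresh zero-based sequence first (the path and frame rate below are assumptions):

        cd /path/to/frames                            # run from the directory that holds the JPGs
        i=0
        for f in DSC_*.jpg; do                        # build frame_0000.jpg, frame_0001.jpg, ... without gaps
            ln -s "$f" "$(printf 'frame_%04d.jpg' "$i")"
            i=$((i + 1))
        done
        ffmpeg -r 24 -i frame_%04d.jpg -vcodec mpeg4 timelapse.avi   # 24 fps input rate; adjust to taste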

    Read the article

  • links for 2011-01-31

    - by Bob Rhubart
    - Do (Software) Architects Architect? -- "The first question is, 'Why is architect being used as a verb?' The Merriam-Webster dictionary does not contain a definition of architect as a verb, nor do many other recognized dictionaries." -- TheCPUWizard (tags: softwarearchitecture)
    - Oracle Business Intelligence Blog: Gartner Magic Quadrant for BI Platforms 2011 -- "Oracle customers indicate they deploy the Oracle Business Intelligence Suite Enterprise Edition (OBIEE) platform to support among the most complex deployments in our survey." -- Gartner (tags: oracle businessintelligence gartner)
    - Oracle BI Server Modeling, Part 1 - Designing a Query Factory (Oracle BI Foundation) -- Bob Ertl lays the groundwork for Business Intelligence modeling concepts with a look at "the big picture of how the BI Server fits into the system, and how the CEIM controls the query processing." (tags: oracle otn businessintelligence)
    - Tom Graves: Modelling people in enterprise-architecture -- Tom says: "One of the key characteristics of 'crossing the chasm' to a viable whole-of-enterprise architecture is the explicit inclusion of people. In short, we need to be able to model and map where people fit in relation to the architecture. But there's a catch. A big catch." (tags: entarch)
    - Java developer webcasts for customers and partners (SOA Partner Community Blog) -- Jurgen Kress shares info on several upcoming online events focused on WebLogic. (tags: weblogic oracle otn soa)
    - Business SOA: Data Services are bogus, Information services are real -- Steve Jones says: "The other day when I was talking about MDM a bright spark pointed out that I hated data services but wasn't MDM just about data services?" (tags: SOA MDM)
    - Andrejus Baranovskis's Blog: Configuring Missing Contribution Folders for Oracle UCM 11g and WebCenter 11g PS3 -- Andrejus says: "After doing some research on UCM, we found that the Folders_g component must be configured in UCM for Contribution Folders to be enabled." (tags: oracle otn oracleace UCM webcenter enterprise2.0)
    - Wim Coekaerts: Converting an Oracle VM VirtualBox VM into an Oracle VM Server image -- Wim Coekaerts offers a few simple steps to convert an existing Oracle VM VirtualBox image. (tags: oracle otn virtualization virtualbox)
    - Stefan Hinker: Secure Deployment of Oracle VM Server for SPARC -- This new paper from Stefan Hinker will help you understand the general security concerns in virtualized environments as well as the specific additional threats that arise out of them. (tags: oracle otn SPARC virtualization enterprisearchitecture)
    - The EA Roadmap to Rationalize, Standardize, and Consolidate the IT Portfolio -- Enterprise IT is in a state of constant evolution. As a result, business processes and technologies become increasingly more difficult to change and more costly to keep up-to-date. (tags: entarch oracle otn)

    Read the article

  • Lubuntu Full Install on USB Drive with Full Disk Encryption and Grub2

    - by vivi
    I apologise for the wall of text, but I want you to scrutinize my thought process to make sure there are no mistakes and no other way around it: I wish to have a full install of Lubuntu with full disk encryption on one of my USB drives. The laptop I would be booting it from also has Windows 7, and I want to keep that OS. From what I've read, I must place GRUB 2 on the USB drive so that: if I have the USB plugged in, the laptop starts Lubuntu (with USB HD in the BIOS boot options); if I don't have the USB plugged in, it starts Windows 7 normally. That's exactly what I want it to do. But if I install from the normal .iso: clicking "Install Lubuntu alongside them" would install it onto my normal HD; clicking "Erase disk and install Lubuntu" would delete all the stuff I have on my HD and install Lubuntu on it; and clicking "Something else" would allow me to install Lubuntu and GRUB 2 onto the USB drive, but would not provide encryption. So the normal .iso won't work for what I want. Then I found the alternate .iso and this tutorial: it allows me to install Lubuntu with all the options I want and gives me the option to choose where to place GRUB 2! Hopefully there are no flaws in my train of thought. If there aren't, I have a few questions regarding that tutorial: The author says that in his case answering "Yes" to "install GRUB to your MBR" installed GRUB to the USB drive's MBR. I can't rely on "in his case" - I need to be sure that's what it will do, so that it doesn't mess up the Windows boot loader. Answering "No" opens a window that lets me choose where I want to install GRUB, but unfortunately I don't understand which option I should type in the box to install it onto the USB drive (see the sketch below). Would removing my laptop's hard drive ensure that GRUB is installed onto the USB drive if I picked the first option, "Yes"? I apologise once again for the wall of text and appreciate any help you guys can offer me.
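    For the manual-device route, a hedged sketch of how the target is usually identified - /dev/sda for the internal disk and /dev/sdb for the USB stick are assumptions that must be verified against the lsblk output every time, since the ordering can change between boots:

        sudo lsblk -o NAME,SIZE,MODEL,TRAN            # the USB stick normally shows TRAN=usb and its known size
        # the GRUB device prompt usually expects the whole stick (e.g. /dev/sdb), not a partition like /dev/sdb1
        sudo grub-install /dev/sdb                    # run from the installed (or chrooted) Lubuntu system only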

    Read the article

  • /etc/profile.d and "ssh -t"

    - by petersohn
    I wanted to run a script on a remote machine. The simple solution is this:

        ssh remote1 some-script

    This works until the remote script wants to connect to another remote machine (remote2) that requires interactive authentication, as in this case (remote2 is only reachable through remote1 here):

        ssh remote1 "ssh remote2 some-script"

    The solution to that problem is to use the -t option for ssh:

        ssh -t remote1 "ssh remote2 some-script"

    This works, but I get problems when I use this form (where some-script may execute further ssh commands):

        ssh -t remote1 some-script

    I found that some environment variables are not set that are set when I don't use the -t option. These environment variables are set in scripts from /etc/profile.d. I guess that these scripts are not run for some reason when using the -t option, but are run when I don't use it. What's the reason for this? Is there any way to work around it? I am using SUSE Linux (version 10).
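    A hedged workaround - a sketch only, assuming the missing variables really are set by the /etc/profile.d scripts and the remote account uses a Bourne-compatible shell - is to load the profile explicitly inside the remote command instead of relying on the shell's startup behaviour:

        ssh -t remote1 'source /etc/profile; some-script'   # pull in /etc/profile (and profile.d) before the script
        ssh -t remote1 'bash -lc some-script'               # or force a login shell, which reads the same files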

    Read the article

  • New Virtual Compute Appliance Videos

    - by Cinzia Mascanzoni
    Watch the latest Virtual Compute Appliance videos to aid your conversations with partners and customers!
    - Virtual Compute Appliance Flash demo, showing your customers and partners the business benefits
    - VCA product demo
    - Tier1 customer testimonial video on using Oracle's Virtual Compute Appliance to build a private cloud virtualization platform to host its customers' Oracle Enterprise and Windows applications
    - Centroid partner testimonial video

    Read the article

  • About online game servers and how to handle data

    - by TreantBG
    So my question isn't about what technology to use or how to do this or that; it is a more general question. I'm currently developing an action third-person shooter with RPG elements - weapon and armor upgrades and items. Players will be able to create new games or join existing ones. So my question is how to set up the game server that players will play on. I have two ideas in mind. Option 1: the player who created the game is the server. All data passes through him, and he sends this data on to the server, updating the database of the players with their XP points, kills/deaths score and so on. Option 2: my host machine is the server; the player who created the game just opens a new instance on my host and acts like a client. All players send their input data to the host, and the host updates the game and sends responses back to the clients for any new changes, like where the enemy is. If I choose option 1, is there a chance for the host to change the game content and manipulate the game results? (I think there is, but I'm not sure.) And if I choose option 2, doesn't that raise the response time and potentially the game lag? Or maybe there is another option?

    Read the article

  • Configuring permissions with Bastille

    - by Lucio
    I was using Bastille to improve the security of the OS and I found the following question, which I don't know whether to answer YES or NO.
    Question: Would you like to set more restrictive permissions on the administration utilities?
    Explanation: In general, the default file permissions set by most vendors are fairly secure. To make them more secure, though, you can remove non-root user access to some administrator functions. If you choose this option, you'll be changing the permissions on some common system administration utilities so that they're not readable or executable by users other than root. These utilities (which include linuxconf, fsck, ipconfig, runlevel and portmap) are ones that most users could never have a need to access. This option will increase your system security, but there's a chance it will inconvenience your users.
    My users: When I installed Ubuntu I created a user (admin); then I was able to create another user (people), but I cannot change the permissions of this user.
    My questions: The user I am using as admin is not root, right? Will the effects of this option apply to both users (admin & people) or just to people?
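    A quick terminal check for the first question, plus a rough idea of what this kind of hardening does to one of the listed utilities - the path /sbin/ifconfig is just an illustrative example:

        id -u                       # prints 0 only for the real root account; a sudo-capable admin shows a normal UID
        groups                      # an Ubuntu administrator is typically in the admin/sudo group rather than being root
        ls -l /sbin/ifconfig        # after hardening, the "other" read/execute bits would be removed (roughly -rwxr-x---)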

    Read the article

  • " this kernel required an X86-64 CPU, but only detected a i686 CPU"

    - by jy19
    I recently decided to use VirtualBox to run Ubuntu, but I get the message "This kernel requires an x86-64 CPU, but only detected an i686 CPU". I've already enabled virtualization in the BIOS, but that doesn't seem to help. Many other answers suggest that I should download the 32-bit version instead of the 64-bit one. I'm not sure about that though, since my computer clearly says "64-bit operating system" under System. But I might just be mistaken.
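    If the host really is 64-bit and VT-x is enabled, the remaining usual suspect is that the VM itself was created as a 32-bit guest; a hedged sketch of checking and changing that with VBoxManage (the VM name "Ubuntu" is a placeholder, and the VM must be powered off for the change):

        VBoxManage showvminfo "Ubuntu" --machinereadable    # look for the ostype= line; a 32-bit type gives an i686 virtual CPU
        VBoxManage modifyvm "Ubuntu" --ostype Ubuntu_64     # switch the guest OS type to 64-bit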

    Read the article

  • Windows XP Task Management: no execution

    - by Ice09
    Hi, we used the following scenario successfully over a long period of time:
    - Remotely log onto a Win XP server, which is used by one user most/all of the time
    - Schedule a task using the "task planner"
    - The task ran at "almost" every scheduled time (it seldom failed to start, presumably when someone else was logged in)
    For some time now we have shared the server with several users. Even though I checked the option for running independently of the logged-in user, that option does not seem to work: now the task is seldom executed, rather than seldom not executed. So the question is: is there some other option I can't see that disables the execution, or, even better, is there some other tool we can use for task scheduling on Win XP servers with several different users?

    Read the article

  • SSIS Catalog: How to use environment in every type of package execution

    - by Kevin Shyr
    Here is a good blog post on how to create an SSIS Catalog and set up environments: http://sqlblog.com/blogs/jamie_thomson/archive/2010/11/13/ssis-server-catalogs-environments-environment-variables-in-ssis-in-denali.aspx. Here I will summarize the three ways I know of so far to execute a package while using variables set up in an SSIS Catalog environment.
    The first way: we have an SSIS project that references an environment, and one of the project parameters uses a value set up in the environment called "Development". With this setup you are limited to calling the packages by right-clicking them in the SSIS Catalog list and selecting Execute, but you are free to choose an absolute or relative path for the environment. The following screenshot shows the two available paths to your SSIS environments. Personally, I use the absolute path because of option 3, just to keep everything simple for myself.
    The second option is to call the package through a SQL Agent job. This does require you to configure your project to already reference an environment and use its variables. When a job step is set up, the configuration part will require you to select that reference again. This is more useful when you want to automate the same package so that it can be run in different environments.
    The third option is the most important to me, as I have an SSIS framework that calls hundreds of packages. The main part of the stored procedure is in this post (http://geekswithblogs.net/LifeLongTechie/archive/2012/11/14/time-to-stop-using-ldquoexecute-package-taskrdquondash-a-way-to.aspx), but the top part had to be modified to include the logic to use an environment reference:

        CREATE PROCEDURE [AUDIT].[LaunchPackageExecutionInSSISCatalog]
            @PackageName NVARCHAR(255)
            , @ProjectFolder NVARCHAR(255)
            , @ProjectName NVARCHAR(255)
            , @AuditKey INT
            , @DisableNotification BIT
            , @PackageExecutionLogID INT
            , @EnvironmentName NVARCHAR(128) = NULL
            , @Use32BitRunTime BIT = 0
        AS
        BEGIN TRY
            DECLARE @execution_id BIGINT = 0;
            -- Create a package execution
            IF @EnvironmentName IS NULL
            BEGIN
                EXEC [SSISDB].[catalog].[create_execution]
                    @package_name = @PackageName,
                    @execution_id = @execution_id OUTPUT,
                    @folder_name = @ProjectFolder,
                    @project_name = @ProjectName,
                    @use32bitruntime = @Use32BitRunTime;
            END
            ELSE
            BEGIN
                DECLARE @EnvironmentID AS INT
                SELECT @EnvironmentID = [reference_id]
                FROM SSISDB.[internal].[environment_references] WITH (NOLOCK)
                WHERE [environment_name] = @EnvironmentName
                    AND [environment_folder_name] = @ProjectFolder

                EXEC [SSISDB].[catalog].[create_execution]
                    @package_name = @PackageName,
                    @execution_id = @execution_id OUTPUT,
                    @folder_name = @ProjectFolder,
                    @project_name = @ProjectName,
                    @reference_id = @EnvironmentID,
                    @use32bitruntime = @Use32BitRunTime;
            END
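    For reference, a hedged sketch of what invoking this wrapper might look like once the rest of the body from the linked post is in place - the server, database and all parameter values below are placeholder assumptions:

        # call the framework procedure for one package, passing the environment by name (placeholders throughout)
        sqlcmd -S MySqlServer -d FrameworkDB -E -Q "EXEC [AUDIT].[LaunchPackageExecutionInSSISCatalog] @PackageName = N'LoadCustomers.dtsx', @ProjectFolder = N'Finance', @ProjectName = N'DW_Load', @AuditKey = 1, @DisableNotification = 0, @PackageExecutionLogID = 1, @EnvironmentName = N'Development', @Use32BitRunTime = 0"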

    Read the article
