Search Results

Search found 9439 results on 378 pages for 'telepath needed'.


  • Bluetooth adapter turned from working fine to unrecognized

    - by easoncxz
    I had been using Bluetooth fine, with devices working, but today when I turned on my computer Bluetooth strangely failed. There is a Bluetooth icon on the top bar showing "Bluetooth on", but if I click on the "Bluetooth settings" item, a System Settings window shows up with a Bluetooth on/off switch which is disabled (i.e. fixed to off). More information about my case: I am a new Linux user, coming from Windows, and do not know the supposedly-obvious commands. I am using a laptop. It initially didn't have Bluetooth; I bought a built-in type (instead of USB type) Bluetooth module and added it inside the laptop, hence I do not have a specific Fn+* key for Bluetooth. In Windows, I needed to install an additional driver that was intended for other machines in my laptop's series which have built-in (i.e. factory built-in) Bluetooth modules. The Fn+* key seemed to only affect Wi-Fi under Ubuntu. I have been successfully using a Magic Mouse with my later-added built-in Bluetooth module/adapter on both Windows and Ubuntu. I have been trying to tweak the Magic Mouse scrolling speed with commands like rmmod something, modprobe hid_magicmouse --scroll_speed=45 --scroll_acceleration=30 or something, then added a file /etc/modprobe.d/magicmouse.conf. The mouse seemed to be working fine with these changes. Now if I run commands like hcitool dev, the shell tells me that I do not have any "Devices" or "adapters". I seem to have BlueZ installed, because when I type "blue" and then tab-autocomplete, a bunch of commands like bluez-test-device pop up.
    Update -- some commands and their results:

        easoncxz@eason-Aspire-4741-ubuntu:/etc$ hcitool dev
        Devices:
        easoncxz@eason-Aspire-4741-ubuntu:/etc$ hcitool scan
        Device is not available: No such device
        easoncxz@eason-Aspire-4741-ubuntu:/etc$ rfkill list
        0: phy0: Wireless LAN
                Soft blocked: no
                Hard blocked: no
        1: acer-wireless: Wireless LAN
                Soft blocked: no
                Hard blocked: no
        2: acer-bluetooth: Bluetooth
                Soft blocked: no
                Hard blocked: no
        easoncxz@eason-Aspire-4741-ubuntu:/etc$ rfkill list
        0: phy0: Wireless LAN
                Soft blocked: yes
                Hard blocked: no
        1: acer-wireless: Wireless LAN
                Soft blocked: yes
                Hard blocked: no
        2: acer-bluetooth: Bluetooth
                Soft blocked: yes
                Hard blocked: no

    Read the article

  • Success Quote: A Hybrid Approach for Success

    - by Lauren Clark
    We recently received this quote from a project that successfully used OUM: “On our project, we applied a combination of the Oracle Unified Method (OUM) and the client's methodology. The project was organized by OUM's phases and a subset of OUM's processes, tasks, and templates. Using a hybrid of the two methods resulted in an implementation approach that was optimized for the client-specific requirements for this project." This hybrid approach is an excellent example of using OUM in the flexible and scalable manner in which it was intended. The project team was able to scale OUM to be fit-for-purpose for their given situation. It's great to see how merging what was needed out of OUM with the client’s methodology resulted in an implementation approach that more closely aligned to the business needs. Successfully scaling OUM is dependent on the needs of the particular project and/or engagement. The key is to use no more than is necessary to satisfy the requirements of the implementation and appropriately address risks. For more information, check out the "Tailoring OUM for Your Project" page, which can be accessed by first clicking on the "OUM should be scaled to fit your implementation" link on the OUM homepage and then drilling into the link on the subsequent page. Have you used OUM in conjunction with a partner or customer methodology? Please share your experiences with us.

    Read the article

  • Personal knowledge base [closed]

    - by AlexLocust
    Possible Duplicate: How do you manage your knowledge base? Hello. Right now I am using a simple writing pad for scratch notes while doing research, solving problems, or working out workarounds. But, you know, it is really difficult to remember all the details of a found solution and its alternatives after a few months, and the writing pad is not very useful for that. Usually the writing pad doesn't contain half of the needed information: links found on the internet, output of test commands, and so on. Now I am looking for a tool to store information about my research and its results. Basic requirements: ability to store files; ability to store formatted text (something HTML-like would be nice) with hyperlinks and code snippets; tags on notes; easy to use. I am thinking about some kind of file-system-organized storage, but I think it would be inconvenient. Another thought is a local wiki or blog. So, my question is: how do you organize your knowledge base? What tools do you use, and what are their pros and cons?

    Read the article

  • Java EE 7 JSR update

    - by Heather VanCura
    Java EE 7 JSR update...in case you missed the last few entries with JSR updates, there are 8 Java EE 7 JSRs currently in JCP milestone review stages. Your input is requested and needed!
    - JSR 342: Early Draft Review 2 – Java Platform, Enterprise Edition 7 (Java EE 7) Specification (review ends 30 November); Oracle
    - JSR 107: Early Draft Review – JCACHE - Java Temporary Caching API (review ends 22 November); Greg Luck, Oracle
    - JSR 236: Early Draft Review – Concurrency Utilities for Java EE (review ends 15 December); Oracle
    - JSR 338: Early Draft Review 2 – Java Persistence 2 (review ends 30 November); Oracle
    - JSR 346: Public Review – Contexts and Dependency Injection for Java EE 1.1 (EC ballot 4-17 December); RedHat
    - JSR 352: Public Review – Batch Applications for the Java Platform (EC ballot 4-17 December); IBM
    - JSR 349: Public Review – Bean Validation 1.1 (EC ballot 20-26 November); RedHat
    - JSR 339: Public Review – JAX-RS 2.0: The Java API for RESTful Web Services (review period ended, EC ballot ends 26 November); Oracle
    Also, check out the Java EE wiki with a specification and schedule update, including, most recently, the addition of JSR 236.

    Read the article

  • Ubuntu...I love you, I hate you

    - by gregarobinson
    I have been working on seeing if a .NET 3.5 application will port over to Linux, Ubuntu to be specific. I started with version 9.01, then 9.10 and now 10.04, as I find more and more that I need from Mono. I have a dual boot on a dev box, Windows 7 and Ubuntu. An upgrade from Ubuntu 9.01 to 9.10 caused my mouse and keyboard to lock up, but I was able to boot from a 9.10 CD. Then I upgraded to 10.04 as I needed Mono 2.2. The upgrade worked, but I lost my Windows boot; it seems GRUB somehow jumped in and messed up the Windows boot. After Googling like crazy and trying this and that, these two links finally got me my Windows boot back: http://sourceforge.net/apps/mediawiki/bootinfoscript/index.php?title=Boot_Problems:Boot_Sector and http://support.microsoft.com/kb/927392. So I am now thinking about trying SUSE instead, as I hear/read it's more stable. I think a lot of my pains have been related to learning and getting used to Linux.

    Read the article

  • Databases and the CI server

    - by mlk
    I have a CI server (Hudson) which merrily builds, runs unit tests and deploys to the development environment, but I'd now like to get it running the integration tests. The integration tests will hit a database, and that database will constantly be changed to contain the data relevant to the test in question. This, however, leads to a problem: how do I make sure the database is not being splatted with data for one test and then having that data overridden by a second project before the first set of tests completes? I am currently using the "hope" method, which is not working out too badly at the moment, but mostly due to the fact that we only have a small number of integration tests set up on CI. As I see it, I have the following options:
    - Test-local (in-memory) databases: I'm not sure if any in-memory databases handle all the scariness of Oracle's triggers and packages etc., and anything less I don't feel would be a worthwhile test.
    - CI executor-local databases: A fair amount of work would be needed to set this up and keep them up to date, but definitely an option (most of the work is already done to keep the current CI database up to date).
    - A single "integration test" executor: Likely the easiest to implement, but it would mean the integration tests could fall quite far behind.
    - Locking the database (or set of tables).
    I'm sure I've missed some ways (please add them). How do you run database-based integration tests on the CI server? What issues have you had and what method do you recommend? (Note: while I use Hudson, I'm happy to accept answers for any CI server; the ideas I'm sure will be portable, even if the details are not.) Cheers, Mlk

    Read the article

  • Quick Script for Adding Skype Groups

    - by Robert May
    So, I needed to add about 30 people to several different Skype groups today, and I didn't want to repeat the /add [skypename] thing over and over and over. Building the list was a pain . . . I couldn't find a good way to extract all of the users in an existing group. There's probably an API or something, but I just did that part by hand. Adding them to the groups was pretty easy with Windows Scripting Host. Basically, I just ran this:

        <package>
           <job id="vbs">
              <script language="VBScript">
                 set WshShell = WScript.CreateObject("WScript.Shell")
                 WshShell.AppActivate 4484
                 WScript.Sleep 100
                 WshShell.SendKeys "/add user1~"
                 WScript.Sleep 100
                 …
                 WshShell.SendKeys "/add usern~"
                 WScript.Sleep 100
              </script>
           </job>
        </package>

    Add as many users as you need by copying the SendKeys and Sleep lines. Then save the script to a .wsf file. The AppActivate line needs to be changed to use the process ID of Skype instead of the number shown there. To get that, open Task Manager, click on Processes, then find Skype.exe and note its PID. Before you double-click on the file in Windows Explorer, you'll need to have created the groups in Skype. For each group, open the group and click in the chat window of the group. Then double-click on the WSF file. If you don't click in the chat window, you will likely get the add-user dialog box instead of just adding the users. Technorati Tags: Skype,Script

    Read the article

  • Seeking .htaccess help: Converting multiple subdomains (both HTTP and HTTPS) to www.domain.com using .htaccess

    - by Joshua Dorkin
    I've been trying to get an answer to this question on other forums (the folks at SuperUser thought this was the place I needed to post) and via my connections, but I haven't gotten very far. Hopefully you guys can help me find an answer. I've got a dozen old subdomains that have been indexed by Google. These have been indexed as both HTTP AND HTTPS. I've managed to redirect all the subdomains properly, provided they are not HTTPS, but can't get any of the HTTPS subdomains to properly redirect. Here's the code I'm using:

        RewriteCond %{HTTP_HOST} ^subdomain1.mysite.com$ [NC]
        RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
        RewriteCond %{HTTP_HOST} ^subdomain2.mysite.com$ [NC]
        RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
        RewriteCond %{HTTP_HOST} ^subdomain3.mysite.com$ [NC]
        RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]

    This works great until someone goes to https://subdomain2.mysite.com, which is not redirected back to http://www.mysite.com. How can I get this to work? Additionally, I'm guessing there is an easier way to make it happen than setting up a dozen pairs of RewriteCond/RewriteRule. Is there any way to do this in just a few lines, including one where I list all the subdomains? I'd actually also like to redirect everything on https://www.mysite.com to http://www.mysite.com except for 3 folders. These are mysite.com/secure, mysite.com/store, and mysite.com/user. Is there any good way to add this to the .htaccess file?

    Read the article

  • If You Include the Groovy Editor...

    - by Geertjan
    ...in a NetBeans RCP application, what additional JARs will you need to include for the Groovy Editor to work? Leaving aside the debate on the current state & quality of the NetBeans Groovy Editor: assuming you need the Groovy support that the NetBeans Groovy Editor provides, you would check the Groovy Editor checkbox in the Project Properties dialog of your application. As you can see, however, the Groovy Editor depends on other modules, some of which, in turn, depend on yet other modules, and so on. So, I clicked the "Resolve" button above and then created a ZIP distribution, to see which additional JARs had been included. Until that point, I had only been using the "platform" cluster, which means that absolutely everything found in the ZIP's "ide" cluster and "java" cluster has been included only so that the Groovy Editor could be included, i.e., all thanks to clicking the "Resolve" button above. Let's first look at what that means for the "java" cluster: that's not so bad, and kind of a side effect of Groovy being Java, i.e., a lot of Java functionality is needed. Now let's look at the "ide" cluster: so, in answer to the original question, if all you want in your NetBeans Platform application, in terms of editor functionality, is the Groovy Editor, then you have a pretty high price to pay. At the very least, I would have assumed that the project support JARs and the debugger support JARs would not be so tightly coupled with the Groovy Editor. That would be a cool thing to separate out from the editor support.

    Read the article

  • SharePoint 2010 and Windows Server Backup

    - by Enrique Lima
    A couple of months ago, a friend found a bit of information on TechNet that has proven to be quite useful. See, I am of the opinion that SharePoint allows for smaller deployments to be made, and with that said, I am talking about SharePoint Foundation 2010 being used for the most part. But truly the point here is not to discuss whether a deployment of SharePoint Foundation 2010 or SharePoint Server 2010 is right or not. The fact is they do take place, and information will reside there. Now, the point of this post is to raise awareness of options available for companies that have implemented it and maybe are a bit "iffy" on how to protect the information being placed in libraries and lists. In many cases I have found SharePoint comes first and business continuity becomes an afterthought. The documentation piece from TechNet states: "You can register SharePoint Server 2010 with Windows Server Backup by using the stsadm.exe -o registerwsswriter operation to configure the Volume Shadow Copy Service (VSS) writer for SharePoint Server. Windows Server Backup then includes SharePoint Server 2010 in server-wide backups. When you restore from a Windows Server backup, you can select Microsoft SharePoint Foundation (no matter which version of SharePoint 2010 Products is installed), and all components reported by the VSS writer for SharePoint Server 2010 on that server at the time of the backup will be restored. Windows Server Backup is recommended only for use with single-server deployments." Even in the event of single-server deployments you will have options to safeguard your data. The process requires that after you have executed the stsadm command above, you then use Windows Server Backup to do a full server backup. Then, when a restore operation is needed, you will be able to select specifically the section that has the SharePoint technologies backup. Hope you find this to be a helpful post. I have found this to be especially handy in SharePoint deployments that are part of a Team Foundation Server deployment and that are isolated from any other SharePoint farm and such. Credits: Sean McDonough for passing along the information available on TechNet.

    Read the article

  • Reflecting on 2010 and Looking into 2011

    - by Sam Abraham
    In early 2010, I had blogged and shared my excitement as I was about to embark on a new journey relocating to South Florida.     As I settled down and adjusted to my new life, I was presented with an opportunity to get actively involved and volunteer in the local Florida .Net and Project Management communities.  I have since devoted a significant portion of my time to community initiatives, coordinating the West Palm Beach .Net User Group, volunteering as a member of the INETA Speaker’s Bureau and traveling to attend/speak at .Net code camps and user groups throughout the states of Florida and New York. I have also taken on various volunteer roles at the South Florida Chapter of the Project Management Institute starting as core team member on the chapter’s mentoring initiative and ending the year as Project Manager of the chapter’s mentoring program and as Director of Electronic Communications on the chapter’s IT team. I am also serving a one year term (2010-2011) as secretary and founding board member of Florida’s first official chapter of the International Association for Software Architects (IASA).   A big thank you is due for those who afforded me the opportunity and privilege to take part of these initiatives and those who provided guidance and encouragement when I needed them the most.   Looking ahead into 2011, I hope to continue my community involvement and volunteer activities. I will start by dedicating the first 5 weekends in the New Year to teach a free comprehensive Microsoft PowerPoint class at church. My goal will be to start from scratch and slowly cover the various available PowerPoint features that can be leveraged to create captivating presentations. Starting February, I will be resuming my user group/code camp speaking engagements at our South Florida .Net Code Camp and the West Palm Beach .Net User Group.   I look forward to continuing to meet, chat and share with our technical community members and to another active year in community service.   All the best, --Sam Abraham

    Read the article

  • Use Your Web Browser As A Calculator

    - by Gopinath
    Quite often most of us need a calculator application to evaluate percentages, divisions, etc. Whenever I needed a calculator application I launched the Windows Calculator application, as it's built into each and every version of Windows I use. But the moment I learned that almost all web browsers have a built-in calculator, I stopped using Windows Calculator.
    Google Search Box – Every Browser's Built-In Calculator
    The Google search box is the built-in calculator of every web browser. The search box is capable of evaluating simple expressions like 20/50+10 as well as complex arithmetic formulas that include functions like sin, cos, tan, etc. Almost every web browser has a Google search box by default; if not, you can install one very quickly. In the Google Chrome browser, the Google search box is built right into the address bar. In Firefox and Internet Explorer you can locate it in the top right corner. To perform calculations, why launch Calculator when we have a web browser open on our desktop most of the time? Join us on Facebook to read all our stories right inside your Facebook news feed.

    Read the article

  • MVVM - child windows and data contexts

    - by GlenH7
    Should a child window have its own data context (View-Model) or use the data context of the parent? More broadly, should each View have its own View-Model? Are there any rules to guide making that decision? What if the various View-Models will be accessing the same Model? I haven't been able to find any consistent guidance on my question. The MS definition of MVVM appears to be silent on child windows. For one example, I have created a warning message notification View. It really didn't need a data context since it was passed the message to display. But if I needed to fancy it up a bit, I would have tapped the parent's data context. I have run into another scenario that needs a child window and is more complicated than the notification box. The parent's View-Model is already getting cluttered, so I had planned on generating a dedicated VM for the child window. But I can't find any guidance on whether this is a good idea or what the potential consequences may be. FWIW, I happen to be working in Silverlight, but I don't know that this question is strictly a Silverlight issue.
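
    A minimal illustrative sketch (not from the post; the WarningViewModel and WarningWindow names are hypothetical) of giving a child window its own view-model, created by the parent view-model and handed the message it needs:

        // Hypothetical view-model for a warning/notification child window.
        public class WarningViewModel
        {
            public string Message { get; private set; }

            public WarningViewModel(string message)
            {
                Message = message;   // the child view binds its text to this property
            }
        }

        // Called from the parent view-model (or a window service it delegates to).
        public void ShowWarning(string message)
        {
            var childViewModel = new WarningViewModel(message);
            var childWindow = new WarningWindow();      // hypothetical XAML ChildWindow/Window
            childWindow.DataContext = childViewModel;   // the child gets its own data context
            childWindow.Show();
        }

    Whether this is worth doing rather than reusing the parent's data context mostly comes down to how much state and behavior the child view actually needs: a dumb message box rarely justifies its own view-model, while a cluttered parent view-model is a good sign the child deserves one.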

    Read the article

  • Resetup kernel for virtualbox and now ubuntu is in initramfs state

    - by UbuntuMan
    My 10.04 LTS became read-only, so I wanted to reboot this virtual machine. Then I couldn't get it back up and running. VirtualBox threw a Result Code: NS_ERROR_FAILURE (0x80004005) error. I did the kernel module re-setup:

        sudo /etc/init.d/vboxdrv setup
         * Stopping VirtualBox kernel modules                          [ OK ]
         * Uninstalling old VirtualBox DKMS kernel modules             [ OK ]
         * Trying to register the VirtualBox kernel modules using DKMS [ OK ]
         * Starting VirtualBox kernel modules                          [ OK ]

    I can't even use sudo in this state. What can I do?

        (initramfs) /dev/sda1
        /bin/sh: /dev/sda1: Permission denied
        (initramfs) /dev/sda2
        /bin/sh: /dev/sda3: Permission denied
        (initramfs) /dev/sda3
        /bin/sh: /dev/sda1: not found

    I have a similar image so I can check the disk settings if needed. Please help me. Thanks. I have an older version of VirtualBox: 4.0.8, something like that. But other VMs on the same VirtualBox are working fine.
    UPDATE: I can hold down the SHIFT key and see the GRUB menu. Only one kernel exists.

        Ubuntu(...)
        Ubuntu(recovery)
        master test
        master test

    Read the article

  • Story of success: MySQL Enterprise Backup (MEB) was successfully integrated with IBM Tivoli Storage Manager (TSM) via System Backup to Tape (SBT) interface.

    - by user13334359
    Since version 3.6, MEB supports backups to tape through the SBT interface. The officially supported tool for such backups to tape is Oracle Secure Backup (OSB). But there are a lot of other storage managers, and MEB allows using them through the SBT interface. Since version 3.7 it also has the option --sbt-environment, which allows passing environment variables not needed by OSB to third-party managers. At the same time MEB cannot guarantee it will work with all of them. This month we were contacted by a customer who wanted to use IBM Tivoli Storage Manager (TSM) with MEB. We could only tell them the same thing I wrote in the previous paragraph: this solution is supposed to work, but you have to be pioneers of this technology. And they agreed. They agreed to be the pioneers, and so the story begins. MEB requires the following options to be specified by those who want to connect it to the SBT interface:
    - --sbt-database-name: a name which should be handed over to the SBT interface. This can be any name. The default, MySQL, works for most cases, so the user is not required to specify this option.
    - --sbt-lib-path: path to the SBT library. For TSM this library comes with "Data Protection for Oracle", which, in its turn, interfaces with Oracle Recovery Manager (RMAN), which uses the SBT interface. So you need to install it even if you don't use Oracle.
    - --sbt-environment: environment for the third-party manager. This option is not needed when you use OSB, but it is almost always necessary for third-party SBT managers. TSM requires the variable TDPO_OPTFILE to be set and pointed at the TSM configuration file.
    - --backup-image=sbt:<image name>: path to the image. The prefix "sbt:" indicates that the image should be sent through the SBT interface.
    So the full command in our case looks like this:

        ./mysqlbackup --port=3307 --protocol=tcp --user=backup_user --password=foobar \
          --backup-image=sbt:my-first-backup --sbt-lib-path=/usr/lib/libobk.so \
          --sbt-environment="TDPO_OPTFILE=/path/to/my/tdpo.opt" --backup-dir=/path/to/my/dir backup-to-image

    And this command results in the following output log:

        MySQL Enterprise Backup version 3.7.1 [2012/02/16]
        Copyright (c) 2003, 2012, Oracle and/or its affiliates. All Rights Reserved.
        INFO: Starting with following command line ...
         ./mysqlbackup --port=3307 --protocol=tcp --user=backup_user
                --password=foobar --backup-image=sbt:my-first-backup
                --sbt-lib-path=/usr/lib/libobk.so
                --sbt-environment="TDPO_OPTFILE=/path/to/my/tdpo.opt"
                --backup-dir=/path/to/my/dir backup-to-image
        sbt-environment: 'TDPO_OPTFILE=/path/to/my/tdpo.opt'
        INFO: Got some server configuration information from running server.
        IMPORTANT: Please check that mysqlbackup run completes successfully.
                   At the end of a successful 'backup-to-image' run mysqlbackup
                   prints "mysqlbackup completed OK!".
        --------------------------------------------------------------------
                               Server Repository Options:
        --------------------------------------------------------------------
          datadir                          =  /path/to/data
          innodb_data_home_dir             =  /path/to/data
          innodb_data_file_path            =  ibdata1:2048M;ibdata2:2048M;ibdata3:64M:autoextend:max:2048M
          innodb_log_group_home_dir        =  /path/to/data
          innodb_log_files_in_group        =  2
          innodb_log_file_size             =  268435456
        --------------------------------------------------------------------
                               Backup Config Options:
        --------------------------------------------------------------------
          datadir                          =  /path/to/my/dir/datadir
          innodb_data_home_dir             =  /path/to/my/dir/datadir
          innodb_data_file_path            =  ibdata1:2048M;ibdata2:2048M;ibdata3:64M:autoextend:max:2048M
          innodb_log_group_home_dir        =  /path/to/my/dir/datadir
          innodb_log_files_in_group        =  2
          innodb_log_file_size             =  268435456
        Backup Image Path= sbt:my-first-backup
        mysqlbackup: INFO: Unique generated backup id for this is 13297406400663200
        120220 08:54:00 mysqlbackup: INFO: meb_sbt_session_open: MMS is 'Data Protection for Oracle: version 5.5.1.0'
        120220 08:54:00 mysqlbackup: INFO: meb_sbt_session_open: MMS version '5.5.1.0'
        mysqlbackup: INFO: Uses posix_fadvise() for performance optimization.
        mysqlbackup: INFO: System tablespace file format is Antelope.
        mysqlbackup: INFO: Found checkpoint at lsn 31668381.
        mysqlbackup: INFO: Starting log scan from lsn 31668224.
        120220  8:54:00 mysqlbackup: INFO: Copying log...
        120220  8:54:00 mysqlbackup: INFO: Log copied, lsn 31668381.
                  We wait 1 second before starting copying the data files...
        120220  8:54:01 mysqlbackup: INFO: Copying /path/to/ibdata/ibdata1 (Antelope file format).
        mysqlbackup: Progress in MB: 200 400 600 800 1000 1200 1400 1600 1800 2000
        120220  8:55:30 mysqlbackup: INFO: Copying /path/to/ibdata/ibdata2 (Antelope file format).
        mysqlbackup: Progress in MB: 200 400 600 800 1000 1200 1400 1600 1800 2000
        120220  8:57:18 mysqlbackup: INFO: Copying /path/to/ibdata/ibdata3 (Antelope file format).
        mysqlbackup: INFO: Preparing to lock tables: Connected to mysqld server.
        120220 08:57:22 mysqlbackup: INFO: Starting to lock all the tables....
        120220 08:57:22 mysqlbackup: INFO: All tables are locked and flushed to disk
        mysqlbackup: INFO: Opening backup source directory '/path/to/data/'
        120220 08:57:22 mysqlbackup: INFO: Starting to backup all files in subdirectories of '/path/to/data/'
        mysqlbackup: INFO: Backing up the database directory 'mysql'
        mysqlbackup: INFO: Backing up the database directory 'test'
        mysqlbackup: INFO: Copying innodb data and logs during final stage ...
        mysqlbackup: INFO: A copied database page was modified at 31668381.
                  (This is the highest lsn found on page)
                  Scanned log up to lsn 31670396.
                  Was able to parse the log up to lsn 31670396.
                  Maximum page number for a log record 328
        120220 08:57:23 mysqlbackup: INFO: All tables unlocked
        mysqlbackup: INFO: All MySQL tables were locked for 0.000 seconds
        120220 08:59:01 mysqlbackup: INFO: meb_sbt_backup_close: blocks: 4162  size: 1048576  bytes: 4363985063
        120220  8:59:01 mysqlbackup: INFO: Full backup completed!
        mysqlbackup: INFO: MySQL binlog position: filename bin_mysql.001453, position 2105
        mysqlbackup: WARNING: backup-image already closed
        mysqlbackup: INFO: Backup image created successfully.:
                   Image Path: 'sbt:my-first-backup'
        -------------------------------------------------------------
           Parameters Summary
        -------------------------------------------------------------
           Start LSN                  : 31668224
           End LSN                    : 31670396
        -------------------------------------------------------------
        mysqlbackup completed OK!

    Backup successfully completed. To restore it you should use the same commands as for any other MEB image, but you need to provide the sbt* options as well:

        $ ./mysqlbackup --backup-image=sbt:my-first-backup --sbt-lib-path=/usr/lib/libobk.so \
            --sbt-environment="TDPO_OPTFILE=/path/to/my/tdpo.opt" --backup-dir=/path/to/my/dir image-to-backup-dir

    Then apply the log as usual:

        $ ./mysqlbackup --backup-dir=/path/to/my/dir apply-log

    Then stop mysqld and finally copy back:

        $ ./mysqlbackup --defaults-file=path/to/my.cnf --backup-dir=/path/to/my/dir copy-back

    Disclaimer: this is only the story of one success, which may be useful for someone else. MEB is not regularly tested with and is not guaranteed to work with IBM TSM or any other third-party storage manager.

    Read the article

  • Advantages of Hudson and Sonar over manual process or homegrown scripts.

    - by Tom G
    My coworker and I recently got into a debate over a proposed plan at our workplace. We've more or less finished transitioning our Java codebase into one managed and built with Maven. Now, I'd like for us to integrate with Hudson and Sonar or something similar. My reasons for this are that it'll provide a 'zero-click' build step to provide testers with new experimental builds, that it will let us deploy applications to a server more easily, that tools such as Sonar will provide us with well-needed metrics on code coverage, Javadoc, package dependencies and the like. He thinks that the overhead of getting up to speed with two new frameworks is unacceptable, and that we should simply double down on documentation and create our own scripts for deployment. Since we plan on some aggressive rewrites to pay down the technical debt previous developers incurred (gratuitous use of Java's Serializable interface as a file storage mechanism that has predictably bit us in the ass) he argues that we can document as we go, and that we'll end up changing a large swath of code in the process anyways. I contend that having accurate metrics that Sonar (or fill in your favorite similar tool) provide gives us a good place to start for any refactoring efforts, not to mention general maintenance -- after all, knowing which classes are the most poorly documented, even if it's just a starting point, is better than seat-of-the-pants guessing. Am I wrong, and trying to introduce more overhead than we really need? Some more background: an alumni of our company is working at a Navy research lab now and suggested these two tools in particular as one they've had great success with using. My coworker and I have also had our share of friendly disagreements before -- he's more of the "CLI for all, compiles Gentoo in his spare time and uses Git" and I'm more of a "Give me an intuitive GUI, plays with XNA and is fine with SVN" type, so there's definitely some element of culture clash here.

    Read the article

  • Spirent Communications Improves Customer Experience with Knowledge Management

    - by Tony Berk
    Spirent Communications plc is a global leader in test and measurement, inspiring innovation within development labs, communication networks and IT organizations. The world's leading communications companies rely on Spirent to help design, develop, validate, and deliver world-class networks, devices, and services. Spirent's customers require high levels of support for a diverse and complex product portfolio, and the company is committed to delivering on this requirement. Spirent needed a solution to help its customers get the information they need quickly and at their convenience through its Web site. After evaluating several solutions, Spirent selected and deployed Oracle Knowledge for Web Self Service Enterprise Edition. Oracle Knowledge Management uses natural language processing to understand the true intent of each inquiry logged via the support portal's search function. The Spirent Knowledge Base on the company's Customer Support Center (CSC) finds the best possible answer using search enhancement features, such as communications industry-specific libraries and federation to search external sources. Spirent has reduced contact center call volume while better serving its customers. Each time a customer uses the knowledge base, they find answers faster than by calling, and it saves Spirent an average of US$210 per call, which is significant when multiplied across the thousands of calls received monthly. Oracle Knowledge also helps support engineers find answers more quickly, enabling the company to scale without adding additional support engineers. Oracle Knowledge is integrated with Spirent's Siebel Contact Center implementation to provide an integrated desktop for CRM and agent intelligence, avoiding the need for contact center personnel to toggle between various screens to address customer inquiries, thereby accelerating customer service. Click here to learn more about Spirent's use of Siebel CRM and Oracle Knowledge Management.

    Read the article

  • Dynamic endpoint binding in Oracle SOA Suite by Cattle Crew

    - by JuergenKress
    Why is dynamic endpoint binding needed? Sometimes a BPEL process instance has to determine at run-time which implementation of a web service interface is to be called. We'll show you how to achieve that using dynamic endpoint binding. Let's imagine the following scenario: we're running a car rental agency called RYLC (Rent Your Legacy Car) which operates different locations. The process of renting a car is basically identical for all locations except for the determination of which cars are currently available. This is depicted in the following diagram: there are three different implementations of the GetAvailableCars service. But how can we achieve calling them dynamically at run-time using Oracle SOA Suite?
    How to dynamically set the service endpoint
    There are just a couple of implementation steps we need to perform to enable dynamic endpoint binding:
    - create a new SOA project in JDeveloper
    - add a CarRental BPEL process
    - add an external reference to the GetAvailableCars service within the composite
    - create a DVM file containing the URIs by which the services for the different locations can be accessed
    - set the endpointURI property on the Invoke component calling the GetAvailableCars service (the value is taken from the DVM file)
    Read the complete article here. SOA & BPM Partner Community: For regular information on Oracle SOA Suite become a member in the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Facebook Wiki Technorati Tags: Cattle crew,SOA binding,SOA Community,Oracle SOA,Oracle BPM,Community,OPN,Jürgen Kress

    Read the article

  • Moving my sprite in XNA using classes

    - by Tom
    Hey, I'm a newbie at this programming lark and it's really frustrating me. I'm trying to move a snake in all directions while using classes. I've created a Vector2 for speed and I've attempted creating a method which moves the snake within the snake class. Now I'm confused and not sure what to do next. Appreciate any help. Thanks :D This is what I've done in terms of the method:

        public Vector2 direction()
        {
            Vector2 inputDirection = Vector2.Zero;

            if (Keyboard.GetState().IsKeyDown(Keys.Left))
                inputDirection.X -= -1;
            if (Keyboard.GetState().IsKeyDown(Keys.Right))
                inputDirection.X += 1;
            if (Keyboard.GetState().IsKeyDown(Keys.Up))
                inputDirection.Y -= -1;
            if (Keyboard.GetState().IsKeyDown(Keys.Down))
                inputDirection.Y += 1;

            return inputDirection * snakeSpeed;
        }

    EDIT: Well, let me make everything clear. I'm making a small, basic game for an assignment. The game is similar to the old snake game on the old Nokia phones. I've created a snake class (even though I'm not sure whether this is needed, because I'm only going to have one moving sprite within the game). After I wrote the code above (in the snake class), the game ran with no errors but I couldn't actually move the image :( EDIT2: Thanks so much for everyone's responses!!
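
    As a follow-up sketch (illustrative only; snakePosition is a hypothetical field, not from the post), the returned vector only moves anything once it is applied to a position every frame, typically in the game's Update method. Note also that inputDirection.X -= -1 is arithmetically the same as inputDirection.X += 1, so the Left and Up branches above push in the same direction as Right and Down:

        // Illustrative XNA Update override: applies the input direction to the sprite position.
        // Assumes fields: Vector2 snakePosition; float snakeSpeed; (hypothetical names).
        protected override void Update(GameTime gameTime)
        {
            float elapsed = (float)gameTime.ElapsedGameTime.TotalSeconds;

            Vector2 inputDirection = Vector2.Zero;
            if (Keyboard.GetState().IsKeyDown(Keys.Left))  inputDirection.X -= 1;
            if (Keyboard.GetState().IsKeyDown(Keys.Right)) inputDirection.X += 1;
            if (Keyboard.GetState().IsKeyDown(Keys.Up))    inputDirection.Y -= 1;   // screen Y grows downward
            if (Keyboard.GetState().IsKeyDown(Keys.Down))  inputDirection.Y += 1;

            // Scale by speed and elapsed time so movement is frame-rate independent;
            // Draw() then renders the sprite at snakePosition.
            snakePosition += inputDirection * snakeSpeed * elapsed;

            base.Update(gameTime);
        }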

    Read the article

  • Out-of-the-Box Integration Links Primavera Solutions with PeopleSoft Projects Applications

    - by Sylvie MacKenzie, PMP
    In a move that brings best-in-class enterprise project portfolio management to Oracle’s PeopleSoft enterprise resource planning customers, Oracle announced the integration of Oracle’s PeopleSoft projects applications and Oracle’s Primavera P6 Enterprise Project Portfolio Management. The combination of PeopleSoft financial controls and Primavera portfolio management capabilities brings greater oversight of end-to-end processes to help organizations improve the planning and execution efforts needed to deliver projects on time and within budget. “As an organization with many high-value, project-driven initiatives, we are very pleased to see Oracle’s investment in this important integration,” says Janardhanan Sankar, senior vice president for technology and quality at ITC Infotech India Ltd. Oracle’s PeopleSoft projects applications enable project-centric organizations and departments to establish core operational processes for full project lifecycle management across operations and finance. The integration with Primavera P6 Enterprise Project Portfolio Management means organizations can eliminate costly and difficult-to-maintain proprietary integrations. Organizations can also standardize on the Oracle technologies to Align back-office budgets and costs with project operations to help ensure accurate forecasting of costs, resources, and schedules Provide an accurate single source of truth to financial managers and analysts using Oracle’s PeopleSoft projects applications, and to project managers using Primavera P6 Enterprise Project Portfolio Management  Enhance project collaboration and execution by having all users utilizing common solutions to communicate, plan, and deliver projects “By bringing together Oracle’s PeopleSoft projects applications and Oracle’s Primavera P6 Enterprise Project Portfolio Management, we are able to provide customers with the infrastructure they need to achieve a single source of truth on the projects they are managing,” says Paco Aubrejuan, Oracle’s group vice president and general manager, PeopleSoft. “This real-time visibility drives profitability, increases productivity, and improves operations.” For more information, view the on-demand Webcast, “Bridging Business Processes for Optimal Portfolio Performance,” or read about the new integration.

    Read the article

  • Form Validation Options

    The steps involved in transmitting form data from the client to the Web server:
    1. User loads the web form.
    2. User enters data into the web form fields.
    3. User clicks submit.
    4. On submit, the page validates the fields using JavaScript. If validation errors are found, the validation script stops the browser from posting the data to the web server and displays error messages as needed.
    5. If the form passes the data validation process, the browser URL-encodes the values of every field and posts them to the server.
    6. The server reads the posted data from the query string and then validates the data again, both to ensure data consistency and to prevent non-validated data (for example, when JavaScript is turned off in the client's browser) from being inserted into a database or passed on to another process.
    7. If the data passes the second validation check, the server-side code continues with the requested processes.
    In my opinion, it is mandatory to validate data using both client-side and server-side validation as a failover process. Client-side validation allows users to correct any errors before they are sent to the web server for processing, and this allows for an immediate response back to the user regarding data that is not correct or not in the desired format. In addition, this prevents unnecessary interaction between the user and the web server and will free up the server over time compared to doing only server-side validation. Server-side validation is the last line of defense when it comes to validation, because you can check to ensure the user's data is correct before it is used in a business process or stored to a database. Honestly, I cannot foresee a scenario where I would want to use only one form of validation over the other, especially with the current cost of creating and maintaining data. In my opinion, the redundant validation is well worth the overhead.
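
    As an illustrative sketch only (the field names and rules here are hypothetical, not from the article), the server-side re-validation step might look something like this in C#, repeating the same checks the client-side JavaScript performed:

        using System.Collections.Generic;
        using System.Text.RegularExpressions;

        // Hypothetical server-side validator for posted form fields.
        public static class FormValidator
        {
            // Returns error messages; an empty list means the posted data passed validation.
            public static List<string> Validate(IDictionary<string, string> postedFields)
            {
                var errors = new List<string>();

                string email;
                if (!postedFields.TryGetValue("email", out email) ||
                    !Regex.IsMatch(email ?? "", @"^[^@\s]+@[^@\s]+\.[^@\s]+$"))
                {
                    errors.Add("A valid email address is required.");
                }

                string ageText;
                int age;
                if (!postedFields.TryGetValue("age", out ageText) ||
                    !int.TryParse(ageText, out age) || age < 0 || age > 150)
                {
                    errors.Add("Age must be a whole number between 0 and 150.");
                }

                return errors;
            }
        }

    Only if Validate returns an empty list would the request continue on to the business process or database insert, mirroring the flow described above.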

    Read the article

  • Engineering as a Service

    - by jgelhaus
    Oracle Exadata Database Machine is known for great compute performance, and over the past few years, it has also become known as a great platform for any type of Oracle Database workload, from data warehousing to online transaction processing (OLTP). But now organizations are turning to Oracle Exadata for business efficiencies and private cloud solutions—for consolidation and database as a service (DBaaS).
    University of Minnesota: For an inside look at how DBaaS is working in the real world, it's worth checking into the University of Minnesota's database hotel. With more than 50,000 students, the University of Minnesota in Minneapolis is one of the largest universities in the United States. The university's centralized IT group not only has to support all those students but also must provide support and services to more than 40 departments and colleges within the university. They have two Exadata Database Machine X2-2 half-rack systems from Oracle, with four database nodes each and roughly 30 terabytes of usable disk space for each of the Oracle Exadata systems. The university is using Oracle Real Application Clusters (Oracle RAC) for high availability and the Data Guard feature of Oracle Database, Enterprise Edition, for disaster recovery capabilities. The deployment has been live in production since May 2011.
    Overhead Door: When it comes to overhead, revolving, sliding, or other specialty residential and commercial doors, Overhead Door is the worldwide leader. But when they needed to open doors with their customers through a better, faster, and more agile IT infrastructure, Overhead Door turned to Oracle and Oracle Exadata. Oracle Exadata Database Machine plays an important part in Overhead Door's IT and business strategy. The organization has two Exadata Database Machine X2-2s deployed, one in production and one in development and testing.
    Read the full Oracle Magazine article, Engineering as a Service.

    Read the article

  • Long term plan of attack to learn math?

    - by zhenka
    I am a web developer with a desire to expand my skill set to mathematics relevant to programming. As a second career, I am stuck in college doing some of the requirements while working. I was hoping that my education would teach me the needed skills to apply math; however, I am quickly finding the easily-testable, breadth-based approach to be very inefficient for the time invested. For example, in my Calculus 2 class, the only remotely useful, mind-expanding experience I had was volumes and areas under the curve. The rest was just monotonous, glorified algebra, which, while it comes easily to me, could be done by software like Wolfram Alpha within seconds. This is not my idea of learning math. So here I am, a frustrated student looking for a way to improve my understanding of math in a way that focuses on application and understanding, with as little needless tedium as possible. However, I cannot find a good long-term study strategy with this approach in mind. So, for those of like mind, how would you go about learning the necessary math without worrying too much about stuff a computer can do much better?

    Read the article

  • Keeping game model and graphics/animation separate but in sync

    - by AJM
    Suppose I'm building a chess game where I want to have animations. Pieces glide to their new squares when moved. Pieces perform attack animations when capturing other pieces. I'm not sure how to effectively separate the data and logic needed for these animations and the actual game model (in the MVC sense). The pieces themselves should ideally not have to worry about their pixel coordinates or current animation frame. At the same time, many changes to the model are effectively driven by animations. A moved piece changes its position after (before?) its sprite is done gliding. A piece is removed from the board after the capturing piece is finished its attack animation. How would you suggest I manage the game model, the graphics and animations, and their relationships? For example, where would the animations "live"? How would animations be created and managed in response to player moves? How would animations drive updates to the game model, or how would the game model drive animations?
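
    One way to keep the two concerns apart (an illustrative sketch only; the class names are hypothetical and this is certainly not the only arrangement) is to treat a move as plain model data, let the view layer own a small animation object for it, and only apply the move to the game model when the animation reports that it has finished. The piece never sees pixels; the animation never mutates the board directly:

        using System;

        // Plain model data describing a move; knows nothing about pixels or frames.
        public class Move
        {
            public string Piece;      // e.g. "white knight" (hypothetical representation)
            public int FromSquare;    // 0..63
            public int ToSquare;      // 0..63
        }

        // View-side object: tracks the glide and fires a callback when it completes.
        public class MoveAnimation
        {
            private readonly Move move;
            private readonly Action<Move> applyToModel;   // e.g. board.Apply(move)
            private float elapsedSeconds;
            private const float DurationSeconds = 0.5f;   // how long the glide takes

            public bool IsFinished { get; private set; }

            public MoveAnimation(Move move, Action<Move> applyToModel)
            {
                this.move = move;
                this.applyToModel = applyToModel;
            }

            public void Update(float deltaSeconds)
            {
                if (IsFinished) return;

                elapsedSeconds += deltaSeconds;
                // Drawing code elsewhere interpolates the sprite between the pixel
                // positions of FromSquare and ToSquare using elapsedSeconds / DurationSeconds.

                if (elapsedSeconds >= DurationSeconds)
                {
                    IsFinished = true;
                    applyToModel(move);   // the model changes only now, when the glide ends
                }
            }
        }

    The same pattern works for capture animations: queue the attack animation and remove the captured piece from the model in the completion callback, or, if you prefer the model to lead, apply the move immediately and have the animation merely replay what the model already recorded.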

    Read the article

  • Problems with icons and shutting down Ubuntu 12.04

    - by Anders
    I recently upgraded from 11.10 to 12.04 as a dual boot (Windows 7) and am having problems with shutting down. I installed from a CD that I burned from the ISO file, and to get it to boot from the CD, I needed to install the special helper program from Windows that would then recognize the CD as the system booted. The installation gave me my own user account as well as a guest account. For the User account there is no "gear" icon (in the upper right-hand corner) from which to access the shutdown menu. Interestingly, the icons for the Home folder, the Ubuntu One folder, and the System Settings folder are missing, although there are blank places showing these names if the mouse cursor is positioned over these areas. They will even launch with the press of a key - but no icons are visible. For the Guest account, all of these icons are visible and usable. The problem with shutting down is that I need to leave the User account and move to the Guest account (so that I can access the top right "gear" icon that has the shut down menu item) and press the shut down button. The problem here is that when the shut down menu appears and I confirm that I wish to shut down the computer, the screen blanks (as one might hope in the process of closing down) and then pops up with the login menu, giving the option of logging in as a User or as a Guest. So the questions are:
    1) How do you reinstate the far top-right icon from which you can access the shutdown drop-down menu on the User account page?
    2) How do you get the icons to display properly on the left-hand side (Home, Ubuntu One, System Settings, and Workspace Switcher)?
    3) How do you get the system to turn off when you press the shut down button?
    Boy oh boy! Thanks a bunch for any help!

    Read the article
