Search Results

Search found 39047 results on 1562 pages for 'process control'.

  • Allowing users to create an email address

    - by user532887
    I am creating a website and would like to allow users to create their own email forwards. Basically, the site will allow groups to create pages, each of which will be able to have its own domain name, and I would like users to be able to automatically create an email address that forwards any incoming mail to their own email account. Right now I have to set these up manually in my hosting account's control panel, but I'm hoping there is a way to do this automatically. Does anyone have experience with doing something similar?
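    One way to automate this, sketched below under the assumption that the hosting account runs cPanel (which exposes forwarder creation through its UAPI Email::add_forwarder call): issue an authenticated HTTPS request whenever a group page is created. The host, account name, and token here are placeholders.

        import requests

        # Assumed cPanel details -- replace with your own host, account, and API token.
        CPANEL_HOST = "https://your-cpanel-host.example:2083"
        CPANEL_USER = "siteowner"
        API_TOKEN = "YOUR_API_TOKEN"

        def create_forwarder(local_part, domain, destination):
            """Create <local_part>@<domain> and forward it to the user's real mailbox."""
            resp = requests.get(
                f"{CPANEL_HOST}/execute/Email/add_forwarder",
                headers={"Authorization": f"cpanel {CPANEL_USER}:{API_TOKEN}"},
                params={
                    "domain": domain,                  # domain the forwarder lives on
                    "email": f"{local_part}@{domain}", # address to create
                    "fwdopt": "fwd",                   # forward to another address
                    "fwdemail": destination,           # the user's own mailbox
                },
                timeout=30,
            )
            resp.raise_for_status()
            return resp.json()

        # e.g. create_forwarder("contact", "somegroup.example.com", "member@example.org")

    Other control panels (Plesk, DirectAdmin) expose similar HTTP APIs, so the same flow applies with a different endpoint and parameters.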

  • Get Hands On with Raspberry Pi via Free OS-Building Course

    - by Jason Fitzpatrick
    Cambridge University is now offering a free 12-segment course that will guide you through building an OS from scratch for the tiny Raspberry Pi development board–learn the ins and outs of basic OS design on the cheap. You’ll need a Raspberry Pi board, a computer running Windows, OS X, or Linux, and an SD card, as well as a small amount of free software. The 12-part tutorial starts you off with basic OS theory and then walks you through basic control of the board, graphics manipulation, and, finally, creating a command line interface for your new operating system. Hit up the link below to read more and check out the lessons. Baking Pi – Operating Systems Development

  • How can test users access an unpublished iOS app?

    - by David
    I am considering outsourcing the development of an iOS app to various independent developers, and I will have various testers for the app. We all work for separate companies. Some of these testers will be customers whom I would like feedback from. As there are multiple developers involved, I expect there to be a new release on a daily basis. How can this be done? Would each of the testers need to buy some sort of license to avoid having to go through the app approval process? Is there any smooth way to do this so that it will not be a hassle for our friendly customers who are willing to test our app?

  • When should I tell my boss that I'm thinking about looking for another job?

    - by BeachRunnerJoe
    I'm thinking about looking around for another job, but I don't know when I should tell my boss, because I would like to see what kind of opportunities I can land before I even mention it. The reason I'm reluctant to tell him right away is that I'm afraid he'll begin the process of replacing me. But if I don't tell him while I'm looking around, then I can't use him as a reference, even though he'd most likely give a great recommendation. If I were to leave and go work for someone else, it wouldn't be until after I finish my current project, which ends in two months, because I don't want to screw anyone over. How would you approach him about this, and when? Thanks in advance for your wisdom!

  • Ripping DVD to iso - Accurately

    - by johnnyturbo3
    I have been using Brasero to rip my DVD collection to .iso. However, I've discovered errors in some of the rips during playback, e.g. VLC would just stop playing the iso file when it hits a bad section (half-way through a film). The worst thing is that no errors or warnings were thrown during the ripping process. Is there a method or application that will monitor DVD/file data integrity and avoid such scenarios in the future? Anything equivalent to Exact Audio Copy or cdparanoia for DVDs?
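    One tool worth looking at (my suggestion, not something named in the question) is GNU ddrescue: unlike a plain rip, it records which sectors could not be read in a map file, so a bad disc is reported instead of silently producing a broken .iso. A minimal Python wrapper sketch; the device path and file names are assumptions:

        import subprocess
        from pathlib import Path

        DEVICE = "/dev/sr0"          # assumed DVD drive
        ISO = Path("movie.iso")
        MAPFILE = Path("movie.map")  # ddrescue's record of good/bad sectors

        # First pass: copy everything readable (2048-byte DVD sectors, no scraping).
        subprocess.run(
            ["ddrescue", "-b", "2048", "-n", DEVICE, str(ISO), str(MAPFILE)],
            check=True,
        )
        # Second pass: go back and retry the bad areas a few times.
        subprocess.run(
            ["ddrescue", "-b", "2048", "-d", "-r", "3", DEVICE, str(ISO), str(MAPFILE)],
            check=True,
        )
        print("Done -- inspect", MAPFILE, "for any sectors that stayed unreadable.")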

  • How to customise window decoration whilst using Compiz on Xubuntu?

    - by Benjamin
    I have installed Compiz on Xubuntu 11.10 with

        sudo apt-get install compiz compizconfig-settings-manager
        compiz --replace ccp &

    In the process, the XFCE window decoration theme is overridden by Compiz's gtk-window-decorator, which uses the Adwaita theme instead of the Greybird theme. Since gtk-window-decorator is drawing the decorations, I cannot change them back using the XFCE settings. I only need Compiz for Scale and the window switcher, and I would like to return window decoration to XFCE (Xfwm4), or at least be able to change the Gtk window decoration theme. How can I do that? I have found part of the (workaround) answer already:

    1. download the Greybird Gtk theme
    2. install the theme (here is where I failed, I think)
    3. use dconf-editor to change the Gtk theme in org.gnome.desktop.interface

    The problem is really at stage 2: where do I place the theme? I tried ~/.themes/ and then changed the value of gtk-theme in the editor to Greybird, but I saw no change.
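    For what it's worth, a minimal sketch of steps 2-3 (my reading of the workaround, not a confirmed fix): the extracted theme folder has to sit directly under ~/.themes so that ~/.themes/Greybird/index.theme exists; a common mistake is an archive that unpacks into an extra nested folder. Once the layout is right, setting the key is equivalent to the dconf-editor step.

        import subprocess
        from pathlib import Path

        # Check the assumed layout: ~/.themes/Greybird/index.theme must exist.
        theme_dir = Path.home() / ".themes" / "Greybird"
        if not (theme_dir / "index.theme").exists():
            raise SystemExit(
                "Expected ~/.themes/Greybird/index.theme -- the archive may have "
                "unpacked into an extra nested folder."
            )

        # Same effect as editing org.gnome.desktop.interface in dconf-editor.
        subprocess.run(
            ["gsettings", "set", "org.gnome.desktop.interface", "gtk-theme", "Greybird"],
            check=True,
        )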

  • Is it worth it to switch from home-grown remote command interface to using JMX

    - by Sam Goldberg
    Without knowing too much about JMX, I've always assumed that it would be the best approach for building remote management into our standalone Java server application. Our server application has some minimal remote control capability, using text commands sent to it via a TCP/IP socket. Using the home-grown approach, it is fairly easy to add a new command (just create the new command text, and the code to handle it in the message receiver). On the other hand, we have hardly implemented any commands, even though there are many things we would like to be able to execute remotely. I am trying to weigh the value of incorporating JMX (learning it, and building the interfaces) versus just sticking with the home-grown approach. Does anyone have any experience or advice regarding changing an existing application to use JMX?

  • Ubuntu 12.04.1 LTS breaking down and sluggish

    - by ahamza
    I am in the process of switching from Windows to a Linux-based system and am currently deciding between distributions; right now I am trying Ubuntu via Wubi. I find that my experience is not very smooth and streamlined: crash reports, bugs, applications taking a long time to start, etc. (this in spite of the fact that I am very patient and am constantly researching solutions to different driver and application issues). I was wondering whether this is because I am running through NTFS right now, or whether it is just like this. I'm looking to switch to Linux because of its open-source nature, an interest in software development in college, and to maximize the potential of my machine. I am running an AMD quad-core x64 at 2.2GHz, 6GB RAM, and a 750GB HDD on an HP G6 notebook. I would appreciate any honest opinions.

  • Data-tier Applications in SQL Server 2008 R2

    - by BuckWoody
    I had the privilege of presenting to the Adelaide SQL Server User Group in Australia last evening, and I covered data-tier applications (DACs) and the Utility Control Point (UCP) from SQL Server 2008 R2. Here are some links from that presentation:

    Whitepaper: http://msdn.microsoft.com/en-us/library/ff381683.aspx
    Tutorials: http://msdn.microsoft.com/en-us/library/ee210554(SQL.105).aspx
    From Visual Studio: http://msdn.microsoft.com/en-us/library/dd193245(VS.100).aspx
    Restrictions and capabilities by Edition: http://msdn.microsoft.com/en-us/library/cc645993(SQL.105).aspx
    Glenn Berry's blog entry on scripts for UCP/DAC: http://www.sqlservercentral.com/blogs/glennberry/archive/2010/05/19/sql-server-utility-script-from-24-hours-of-pass.aspx
    Objects supported by a DAC: http://msdn.microsoft.com/en-us/library/ee210549(SQL.105).aspx

  • .NET development on a Retina MacBook Pro with Windows 8

    - by Jeff
    I remember sitting in Building 5 at Microsoft with some of my coworkers, when one of them came in with a shiny new 11” MacBook Air. It was nearly two years ago, and we found it pretty odd that the OEMs building Windows machines sucked at industrial design in a way that defied logic. While Dell and HP were in a race to the bottom building commodity crap, Apple was staying out of the low-end market completely and focusing on better design. Along the way, they managed to build machines people actually wanted, and maintain an insanely high margin in the process.

    I stopped buying the commodity crap and custom builds in 2006, when Apple went Intel. As a .NET guy, I was still in it for Microsoft’s stack of development tools, which I found awesome, but had back-to-back crappy laptops from HP and Dell. After that original 15” MacBook Pro, I also had a Mac Pro tower (that I sold after three years for $1,500!), a 27” iMac, and my favorite, a 17” MacBook Pro (the unibody style) with an SSD added from OWC. The 17” was a little much to carry around because it was heavy, but it sure was nice getting as much as eight hours of battery life, and the screen was amazing.

    When the rumors started about a 15” model with a “retina” screen inspired by the Air, I made up my mind I wanted one, and ordered it the day it came out. I sold my 17”, after three years, for $750 to a friend who is really enjoying it. I got the base model with the upgrade to 16 gigs of RAM. It feels solid for being so thin, and if you’ve used the third generation iPad or the newer iPhone, you’ll be just as thrilled with the screen resolution. I’m typically getting just over six hours of battery life while running a VM, but Parallels 8 allegedly makes some power improvements, so we’ll see what happens. (It was just released today.)

    The nice thing about VMs is that you can run more than one at a time. Primarily I run the Windows 8 VM with four cores (the laptop is quad-core, but has 8 logical cores due to hyperthreading or whatever Intel calls it) and 8 gigs of RAM. I also have a Windows Server 2008 R2 VM I spin up when I need to test stuff in a “real” server environment, and I give it two cores and 4 gigs of RAM. The Windows 8 VM spins up in about 8 seconds. Visual Studio 2012 takes a few more seconds, but count part of that as the “ReSharper tax” as it does its startup magic.

    The real beauty, the thing I looked most forward to, is that beautifully crisp C# text. Consolas has never looked as good as it does at 10pt. on this display. You know how it looks great at 80pt. when conference speakers demo stuff on a projector? Think that sharpness, only tiny. It’s just gorgeous.

    Beyond that, everything is just so responsive and fast. Builds of large projects happen in seconds, hundreds of unit tests run in seconds… you just don’t spend a lot of time waiting for stuff. It’s kind of painful to go back to my 27” iMac (which would be better if I put an SSD in it before its third birthday).

    Are there negatives? A few minor issues, yes. As is the case with OS X, not everything scales right. You’ll see some weirdness at times with splash screens and icons and such. Chrome’s text rendering (in Windows) is apparently not aware of how to deal with higher DPIs, so text is fuzzy (the OS X version is super sharp, however). You’ll also have to do some fiddling with keyboard settings to use the Windows 8 keyboard shortcuts.

    Overall, it’s as close to a no-compromise development experience as I’ve ever had. I’m not even going to bother with Boot Camp because the VM route already exceeds my expectations. You definitely get what you pay for. If this one also lasts three years and I can turn around and sell it, it’s worth it for something I use every day.

  • Installation of Ubuntu 13.04 seems to take forever

    - by Michel
    I am about to install Ubuntu 13.04 on my AMD x64 machine, and the installation of the lib packages seems to take forever. Is that a known issue? I know it can take some time to check for and download packages from the Internet, but that part went through very fast. In the terminal-like view I can see that the system is unpacking, installing, and configuring packages, all quite slowly. The opening screen reads something like "It just takes a few minutes". Again: is this normal behavior, or am I doing something wrong, and what could I do to speed up the process in case I run the installation again on another system?

  • JCP Calendar for 2013 - First EC Meeting 15-16 January

    - by Heather VanCura
    The JCP 2013 calendar and EC Meeting schedule has now been finalized and published :-). This year the EC will be holding meetings in the San Francisco Bay Area in January, and also in September, scheduled around the JavaOne San Francisco Conference; and Credit Suisse will host the May EC Meeting in Zurich, Switzerland. The first JCP EC Meeting of 2013 will be a Face to Face Meeting in Santa Clara, California USA, hosted by Intel. We are in the midst of preparing the agenda now, but it will include a 2012 annual review, JSR Spec Lead presentations & updates, as well as an Expert Group session on JSR 358, A Major Revision of the Java Community Process. You can view meeting materials and minutes from the JCP EC Meetings on JCP.org. The JCP also plans to meet with the Java User Group leaders attending the User Group Leaders Summit being held at Oracle's Redwood Shores location 14-16 January.

  • Can't boot freshly burned Ubuntu cd

    - by user89004
    So I just burned a Ubuntu 12.04.1 PowerPC .iso to a CD for my iMac G5 running Mac OS X 10.4.11, and it won't even recognize the CD. I burned it on my dad's Windows 7 laptop, as the process is way easier there (just two clicks). Mac OS X 10.4.11 gives me an error when it starts with the CD in, saying "the disk you inserted was not readable by this computer". What's funny is that I burned a Ubuntu Minimal .iso to a CD and it would totally read that, and even boot it, though it gave me some errors afterwards and I couldn't install. I even tried going into Open Firmware and typing boot cd:,\tbxi but I get the error "Warning sector size mismatch can't OPEN cd:,\tbxi Can't open device or file". Was there something wrong with the .iso I burned? Mac OS X 10.4.11 won't even mount the .iso; it tells me the HFS file system is corrupt or something, but I know the .iso doesn't contain an HFS file system. Any help? I downloaded the .iso from http://cdimage.ubuntu.com/releases/precise/release/ubuntu-12.04-desktop-powerpc.iso
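    Before blaming the burn, it is worth verifying the download itself; a mismatch against the MD5SUMS file published next to the image on cdimage.ubuntu.com means the .iso was corrupt before it ever reached a disc. A minimal check (the file name is the one from the question):

        import hashlib

        # Compute the MD5 of the downloaded image in 1 MiB chunks and compare it
        # against the matching line in the MD5SUMS file on the mirror.
        def md5_of(path, chunk_size=1 << 20):
            h = hashlib.md5()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(chunk_size), b""):
                    h.update(chunk)
            return h.hexdigest()

        print(md5_of("ubuntu-12.04-desktop-powerpc.iso"))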

  • Deferred Open Source licensing

    - by Thomas W.
    Are there established models for releasing an initially proprietary piece of software under FLOSS conditions after a defined period or at a certain point in time? The main problem here is that all parties involved must be able to trust that the Open Source licensing will actually take place at the defined time, and that no party can further defer or cancel this process. Clearly such a model has its problems; for example, it is legally and technically awkward to deal with contributions from "outside". Ghostscript is a prominent example where a deferred model has been used and abandoned. However, if certain parties involved insist on keeping the software proprietary, at least for a certain period of time, then the only options are a deferred Open Source licensing model or no Open Source licensing at all. I think I read about services that serve as trusted parties who take care of open-sourcing the software, but I was not successful in spotting any of those.

  • Annotation Processing Virtual Mini-Track at JavaOne 2012

    - by darcy
    Putting together the list of JavaOne talks I'm interested in attending, I noticed there is a virtual mini-track on annotation processing and related technology this year, with a combination of BOFs, sessions, and a hands-on lab:

    Monday
    - Multidevice Content Display and a Smart Use of Annotation Processing, Dimitri BAELI and Gilles Di Guglielmo

    Tuesday
    - Advanced Annotation Processing with JSR 269, Jaroslav Tulach
    - Build Your Own Type System for Fun and Profit, Werner Dietl and Michael Ernst

    Wednesday
    - Annotations and Annotation Processing: What’s New in JDK 8?, Joel Borggrén-Franck

    Thursday
    - Hack into Your Compiler!, Jaroslav Tulach
    - Writing Annotation Processors to Aid Your Development Process, Ian Robertson

    As the lead engineer on both apt (rest in peace) in JDK 5 and JSR 269 in JDK 6, I'd be heartened to see greater adoption and use of annotation processing by Java developers.

  • JavaOne 2012 Sunday Strategy Keynote

    - by Janice J. Heiss
    At the Sunday Strategy Keynote, held at the Masonic Auditorium, Hasan Rizvi, EVP, Middleware and Java Development, stated that the theme for this year's JavaOne is: “Make the future Java”--meaning that Java continues in its role as the most popular, complete, productive, secure, and innovative development platform. But it also means, he qualified, the process by which we make the future Java--an open, transparent, collaborative, and community-driven evolution. "Many of you have bet your businesses and your careers on Java, and we have bet our business on Java," he said.

    Rizvi detailed the three factors they consider critical to the success of Java--technology innovation, community participation, and Oracle's leadership/stewardship. He offered a scorecard in these three realms over the past year--with OS X and Linux ARM support on Java SE, open sourcing of JavaFX by the end of the year, the release of the Java Embedded Suite 7.0 middleware platform, and multiple releases on the Java EE side. The JCP process continues, with new JSR activity, and JUGs show a 25% increase in participation since last year. Oracle, meanwhile, continues its commitment to both technology and community development/outreach--with four regional JavaOne conferences last year in various parts of the world, as well as the release of Java Magazine, with over 120,000 current subscribers.

    Georges Saab, VP Development, Java SE, next reviewed features of Java SE 7--the first major revision to the platform under Oracle's stewardship, which has included near-monthly update releases offering hundreds of fixes, performance enhancements, and new features. Saab indicated that developers, ISVs, and hosting providers have all been rapid adopters of the platform. He also noted that Oracle's entire Fusion Middleware stack is supported on SE 7. The list of supported platforms for SE 7 has also grown--from Windows, Linux, and Solaris, to OS X, Linux ARM, and the emerging ARM micro-server market. "In the last year, we've added as many new platforms for Java as were added in the previous decade," said Saab.

    Saab also explored the upcoming JDK 8 release--including Project Lambda, Project Nashorn (a modern implementation of JavaScript running on the JVM), and others. He noted that Nashorn functionality had already been used internally in NetBeans 7.3, and announced that they were planning to contribute the implementation to OpenJDK.

    Nandini Ramani, VP Development, Java Client, ME and Card, discussed the latest news pertaining to JavaFX 2.0--releases on Windows, OS X, and Linux, release of the FX Scene Builder tool, the JavaFX WebView component in NetBeans 7.3, and an OpenJFX project in OpenJDK. Ramani announced, as of Sunday, the availability for download of JavaFX on Linux ARM (developer preview), as well as Scene Builder on Linux. She noted that for next year's JDK 8 release, JavaFX will offer 3D, as well as third-party component integration. Avinder Brar, Senior Software Engineer, Navis, and Dierk König, Canoo Fellow, next took the stage and demonstrated all that JavaFX offers, with a feature-rich, animation-rich, real-time cargo management application that employs Canoo's just open-sourced Dolphin technology.

    Saab also explored Java SE 9 and beyond--Jigsaw modularity, the Penrose Project for interoperability with OSGi, improved multi-tenancy for Java in the cloud, and Project Sumatra. Phil Rogers, HSA Foundation President and AMD Corporate Fellow, explored heterogeneous computing platforms that combine the CPU and the parallel processor of the GPU into a single piece of silicon and shared memory--a hardware technology driven by such advanced functionalities as HD video, face recognition, and cloud workloads. Project Sumatra is an OpenJDK project targeted at bringing Java to such heterogeneous platforms--with hardware and software experts working together to modify the JVM for these advanced applications and platforms.

    Ramani next discussed the latest with Java in the embedded space--"the Internet of things" and M2M--declaring this to be "the next IT revolution," with Java as the ideal technology for the ecosystem. Last week, Oracle released Java ME Embedded 3.2 (for micro-controllers and low-power devices) and Java Embedded Suite 7.0 (a middleware stack based on Java SE 7). Axel Hansmann, VP Strategy and Marketing, Cinterion, explored his company's use of Java in M2M and their new release of EHS5, the world's smallest 3G-capable M2M module, running Java ME Embedded. Hansmann explained that Java offers them the ability to create a "simple to use, scalable, coherent, end-to-end layer" for such diverse edge devices.

    Marc Brule, Chief Financial Officer, Royal Canadian Mint, also explored the fascinating use case of Java Card in his country's MintChip e-cash technology--deployable on smartphones, USB devices, computers, tablets, or the cloud. In parting, Ramani encouraged developers to download the latest releases of Java Embedded and try them out.

    Cameron Purdy, VP, Fusion Middleware Development and Java EE, summarized the latest developments and announcements in the enterprise space--greater developer productivity in Java EE 6 (with more on the way in EE 7), portability between platforms and vendors, and even cloud-to-cloud portability. The earliest version of the Java EE 7 SDK is now available for download--in GlassFish 4--with WebSocket support, better JSON support, and more. The final release is scheduled for April of 2013. Nicole Otto, Senior Director, Consumer Digital Technology, Nike, explored her company's Java-driven enterprise ecosystem for all things sports, including the Nike+ FuelBand accelerometer wristband. Looking beyond Java EE 7, Purdy mentioned NoSQL database functionality for EE 8, the concurrency utilities (possibly in EE 7), some of the Avatar projects in EE 7, some in EE 8, multi-tenancy for the cloud, supporting SaaS applications, and more.

    Rizvi ended by introducing Dr. Robert Ballard, oceanographer and National Geographic Explorer-in-Residence--part of Oracle's philanthropic relationship with the National Geographic Society to fund K-12 education around ocean science and conservation. Ballard is best known for having discovered the wreckage of the Titanic. He offered a fascinating video and overview of the cutting-edge technology used in such deep-sea explorations, noting that in his early days, high-bandwidth exploration meant that you’d go down in a submarine and "stick your face up against the window." Now, it's remotely operated technology telepresence: "I think of my Hercules vehicle as my equivalent of a Na'vi. When I go beneath the sea, I actually send my spirit." Using high-bandwidth satellite links, such amazing explorations can now occur via smartphone, laptop, or whatever platform. Ballard’s team regularly offers live feeds and programming out to schools and the world, spanning 188 countries--with embedded educators as part of the expeditions. It's technology at its finest, inspiring the next generation of scientists and explorers!

  • Groups page is blank in SharePoint 2010 [migrated]

    - by Murali Ramakrishnan
    Sometimes it's very confusing how SharePoint 2010 group creation works. Here's a scenario we have been facing for a long time with groups in SharePoint 2010. We had a requirement to create two custom groups and then create a custom site, all programmatically. Most of the time this works as expected, but in roughly 1 out of 100 site creations the group creation partially fails: we can still access the group and the users associated with it programmatically, but on the UI side, trying to open the specific group page from the site permissions page returns a BLANK WHITE page, nothing else. Is this a SharePoint 2010 issue, or has anybody had this problem and fixed it? Kindly share your thoughts.

  • Unity is using something that looks like Emerald instead of the correct theme (like Metacity)

    - by Scott Severance
    I just upgraded my netbook thus: Lucid - Maverick - Natty. (I skipped Maverick other than as an upgrade step due to issues with Unity.) Now, I seem to be stuck with something that looks like Emerald but apparently isn't (see note below). Compiz is running, as is gtk-window-decorator, but my title bars aren't following Ubuntu's theme. I was using the Ambiance theme with no problems until the upgrade to Natty. Here's a screenshot: How can I get the default theme? Note: I never installed Emerald, and as far as I can tell it isn't installed. There's no running process containing the string emerald. So I'm not sure where gtk-window-decorator is getting its configuration from.

  • Using Sql Server Change Data Capture with a frequently changing schema

    - by Pete
    We are looking into enabling SQL Server Change Data Capture for a new subsystem we are building. It's not really because we need it, but we are being pushed to provide complete history traceability, and CDC would nicely solve this requirement with minimum effort on our part. We follow an agile development process, which in this case means that we frequently make changes to the database schema, e.g. adding new columns, moving data to other columns, etc. We did a small test where we created a table, enabled CDC for that table, and then added a new column to the table. Changes to the new column are not registered in the CDC table. Is there a mechanism to update the CDC table to the new schema, and are there any best practices for how to deal with captured data when migrating the database schema?

  • When to do Code Review

    - by mcass20
    We have recently moved to a scrum process and are working on tasks and user stories inside of sprints. We would like to do code reviews frequently to make them less daunting, and we are thinking of doing them at the user-story level, but we are unsure how to branch our code to account for this. We are using VS and TFS 2010, and we are a team of 6. We currently branch for features but are working on changing to branching for scrum. We do not currently use shelvesets and don't really want to implement them if other techniques are available. How do you recommend we implement code review per user story?

  • Presenting at Roanoke Code Camp Saturday!

    - by andyleonard
    Introduction

    I am honored to once again be selected to present at Roanoke Code Camp!

    An Introductory Topic

    One of my presentations is titled "I See a Control Flow Tab. Now What?" It's a Level 100 talk for those wishing to learn how to build their very first SSIS package. This highly-interactive, demo-intense presentation is for beginners and developers just getting started with SSIS. Attend and learn how to build SSIS packages from the ground up.

    Designing an SSIS Framework

    I'm also presenting...(read more)

  • How to create an auto-grader in and for Python

    - by recluze
    I'm trying to create an auto-grader in Python for one of my beginning programming courses. From my online searching, I've come to understand that it is effectively a unit-test framework that tests the students' code rather than production code, but I'm not really sure how to structure the flow of the program. Can anyone provide a strategy for submission of code by students and for automating the whole process of marking? For instance, how would the student code be submitted and then stored/structured on disk, and how would the grades be stored/reported? I'm only looking for a broad strategy and will try on my own to fill in the blanks. (I asked this on stackoverflow.com initially, but it was considered off-topic and I was advised to ask here.)
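    To make the question concrete, here is a minimal sketch of one possible flow; every name in it (the submissions/<student_id>/hw1.py layout, the stdin/stdout test pairs, grades.csv) is an assumption for illustration, not a prescribed structure. Each submission runs in a subprocess with a timeout, so broken or looping student code cannot hang the grader.

        import csv
        import subprocess
        from pathlib import Path

        # Assumed layout: submissions/<student_id>/hw1.py
        # Each test is a (stdin, expected stdout) pair.
        TESTS = [
            ("3 4\n", "7\n"),
            ("10 -2\n", "8\n"),
        ]

        def grade_submission(script):
            """Run one student script against every test and count the passes."""
            passed = 0
            for stdin_data, expected in TESTS:
                try:
                    result = subprocess.run(
                        ["python3", str(script)],
                        input=stdin_data,
                        capture_output=True,
                        text=True,
                        timeout=5,  # stop infinite loops in student code
                    )
                    if result.stdout == expected:
                        passed += 1
                except subprocess.TimeoutExpired:
                    pass  # a hung test counts as a failure
            return passed

        def grade_all(root=Path("submissions")):
            """Grade every submission directory and write grades.csv."""
            with open("grades.csv", "w", newline="") as f:
                writer = csv.writer(f)
                writer.writerow(["student", "passed", "total"])
                for student_dir in sorted(root.iterdir()):
                    script = student_dir / "hw1.py"
                    score = grade_submission(script) if script.exists() else 0
                    writer.writerow([student_dir.name, score, len(TESTS)])

        if __name__ == "__main__":
            grade_all()

    Submission could be as simple as students dropping files into a shared directory or pushing to a per-student repository; the grader only cares that each submission ends up in its own folder.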

  • SQLite with two python processes accessing it: one reading, one writing

    - by BBnyc
    I'm developing a small system with two components: one polls data from an internet resource and translates it into SQL data to persist it locally; the second reads that SQL data from the local instance and serves it via JSON and a RESTful API. I was originally planning to persist the data with PostgreSQL, but because the application will have a very low volume of data to store and traffic to serve, I thought that was overkill. Is SQLite up to the job? I love the idea of the small footprint and of not needing to maintain yet another SQL server for this one task, but I am concerned about concurrency. It seems that with write-ahead logging enabled, concurrent reading and writing of a SQLite database can happen without locking either process out of the database. Can a single SQLite instance sustain two concurrent processes accessing it, if only one reads and the other writes? I started writing the code but was wondering if this is a misapplication of SQLite.
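    For what it's worth, here is a minimal sketch of the WAL setup described above; the file name and schema are made up for illustration. The two connections below stand in for the two separate processes (the poller and the API server), which would each open the database file the same way.

        import sqlite3

        DB = "data.db"

        # Writer side (the polling component). The WAL pragma is persistent:
        # once set, the file stays in WAL mode for every later connection.
        writer = sqlite3.connect(DB)
        writer.execute("PRAGMA journal_mode=WAL")
        writer.execute("CREATE TABLE IF NOT EXISTS readings (ts TEXT, value REAL)")
        writer.execute("INSERT INTO readings VALUES (datetime('now'), 42.0)")
        writer.commit()

        # Reader side (the REST/JSON component). Under WAL, readers see a
        # consistent snapshot and are not blocked while the writer commits.
        reader = sqlite3.connect(DB)
        for row in reader.execute("SELECT ts, value FROM readings"):
            print(row)

    With exactly one writer and one reader this stays well inside SQLite's limits; it is multiple concurrent writers that WAL does not help with.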

  • Disaster Recovery Discovery

    - by Rodney Landrum
    Last weekend I joined several of my IT staff on a mission to perform a DR test in our remote CoLo center in a large South East city of the US. Can I be more obtuse? The goal was simple for me as the sole DBA in a throng of Windows, Storage, Network and SAN admins: restore the databases and make them work. There were 4 applications that back-ended to 7 SQL Server databases on 4 different SQL Server instances. We would maintain the original server names, but beyond that it was fair game.

    We had time to prepare, so I was able to script out or otherwise automate the recovery process. I used sp_help_revlogin for three of the servers--a bit of a cheat, actually, because restoring the Master database on the target DR servers was the specified course of action according to the DR procedures (the caveat “IF REQUIRED” left it open to interpretation). I really wanted to avoid the step of restoring Master for a number of reasons, but mainly because I did not want to deal with issues starting SQL services afterward. Having to account for the location of TempDB and the version conflicts of the resource DBs were just two of the battles I chose not to fight, not to mention other system database location problems that might arise and prevent SQL from starting. I was going to have to restore all of the user databases anyway, so I would not really gain any benefit, outside of logins, from taking the time to restore the source Master database over the newly installed one on the fresh server.

    What I wanted was the ability to restore the Master database as a user database, call it Master_Mine, from a backup on the source system, and then use that restored database to script the SQL logins and passwords on the DR systems. While I did not attempt this on the trip, the thought stuck in my mind, and this past week I succeeded at scripting user accounts and passwords using only a restored copy of the Master database. Granted, there were several challenges to overcome. Also, as is usual for any work like this, the usual disclaimers apply: this is not something that I would imagine Microsoft would condone or support, and this was really only an experiment for me to learn if it was even possible. While I have tested the process with success, I do not know that I would use this technique in a documented procedure, because future updates for SQL Server will render this technique non-functional.

    I thought at first, incorrectly of course, that I could use sp_help_revlogin on a restored copy of the Master database I named Master_Mine. Since sp_help_revlogin uses system schema objects, sys.syslogins and sys.server_principals, this was not going to work, because all results would come from the main Master database. To test this I added a SQL login via SSMS, backed up Master, restored it as Master_Mine, and then deleted the login. Even though the test account I created should presumably still be in the Master_Mine database, I should be able to get to it and script out its creation with its password hash, so that I would not need to know the password, and any applications that stored that password would not have to be altered in the DR scenario. They would just work as expected.

    Once I realized that would not work, I began looking deeper. Knowing that sys.syslogins and sys.server_principals are system views, their underlying code should be available with sp_helptext, right? They were. And this led me to discover the two tables sys.sysxlgns and sys.sysprivs, where the data I needed was stored. These tables existed in both the real Master and the restored copy, Master_Mine. I used this information to tweak the sp_help_revlogin stored procedure to use these tables instead to create the logins cursor used in sp_help_revlogin.

    For the password hash, sp_help_revlogin uses the function LoginProperty(), which takes a user name and the option 'passwordhash' to return the hash for the user. Unfortunately, it requires the login to exist in the Master database, so this would not work. Another slight modification I had to make was to pull the password hash itself (pwdhash from sys.sysxlgns) into the logins cursor and comment out the section of sp_help_revlogin that uses LoginProperty. Instead, I pass the pwdhash value as the variable @PWD_varbinary to the sp_hexadecimal stored procedure, which is also created by and used within Microsoft's published code for sp_help_revlogin.

    The final challenge: sys.sysxlgns and sys.sysprivs are visible only within a Dedicated Administrator Connection (DAC) query window in SSMS or within SQLCMD. To open a DAC connection you have to be logged in on the SQL Server itself (via RDP in my case), and you preface the server name in the query connection with ADMIN:, so that the server connection looks like ADMIN:ServerName. From there you can create the modified stored procedure in the restored copy of a Master database from a source system, under whatever name you like, and then run it. I named my new stored procedure usp_help_revlogin_MyMaster. Upon execution I was happy to see the logins and password hashes that I needed to apply from the source Master database, without having to restore over the new Master system database and without needing access to the original server (assuming it was down due to whatever disaster put it in that state).

    You will note that I am not providing full code samples here of the modifications. I will say that it was a slight bit of work, and anyone who needed to do this for whatever reason could fairly easily roll their own solution with the information provided herein. My goal, as I said, was to prove that this could be done and to provide another option, if required, to ease the burden of getting SQL Servers up and available in an emergency situation where alternatives may be more challenging or otherwise unavailable.

  • Berkeley DB Java Edition 4.0.103 Available

    - by charles.lamb
    We'd like to let you know that JE 4.0.103 is now available at http://www.oracle.com/technology/software/products/berkeley-db/je/index.html. The patch release contains both small features and bug fixes, many of which were prompted by feedback on this forum. Some items to note:

    - New CacheMode values for more control over cache policies, and new statistics to enable better interpretation of caching behavior. These are one initial part of our continuing work in progress to make JE caching more efficient.
    - Fixes for proper cache utilization calculations when using the -XX:+UseCompressedOops JVM option.
    - A variety of other bug fixes.

    There are no file format or API changes. As always, we encourage users to move promptly to this new release.
