Search Results

Search found 2911 results on 117 pages for 'restore'.

Page 12/117 | < Previous Page | 8 9 10 11 12 13 14 15 16 17 18 19  | Next Page >

  • How do I restore GRUB 2?

    - by uahug
    I upgraded my laptop with an SSD, moving my old HDD to where the DVD drive was, so that I could have both speed and storage. Now, I have reinstalled Ubuntu on the SSD, deleting all the partitions on the old HDD to make space for a data partition. But now the laptop doesn't even get to GRUB 2 if the HDD is plugged in! If I take it out, everything works, but as soon as I plug it in and try to boot again, GRUB isn't found. At first, I thought it was the boot order, but the order was OK: first the notebook hard drive (SSD) and then the CD/DVD drive (which in reality is the HDD). How can I fix it? A simple grub-install /dev/sda doesn't work. The SSD is sda, and the HDD is sdb.

    Read the article

  • How to restore my D Drive

    - by buggi
    I need help recovering one of the drives on my Windows 7 machine. I used to have two drives, C and D, and I used D to save all my data. I have been using my laptop for 3 years, and when I looked today, I couldn't find the D drive on my computer :(. When I opened the partitioning wizard, it showed me that the space I allocated for D is there, but as unallocated space. I badly need the data from D. Can you please suggest whether I can use Ubuntu to recover D? Thanks in advance.

    Read the article

  • How to restore "working folders" upon restart?

    - by Smiles in a Jar
    I am not sure I am describing the problem correctly, but I will try my best. I am a newbie to Ubuntu and I am using Ubuntu 11.10. On a daily basis I have a set of 5-10 folders which I need to refer to for my work. I am looking for a way to create a "workspace" of these folders so that, after a restart, a single click opens all the folders in the workspace in different tabs. Another option that I currently plan to use is to create links to the folders, then select all of them and open them in different tabs. I was wondering if there is a cleaner option already provided in Ubuntu.

    Read the article

  • Ubuntu 14.04 Fatal Exception

    - by user286534
    I use Ubuntu 14.04, 64-bit. I installed VirtualBox and was testing another Linux OS (Deepin). My system froze and I could not get to a TTY session to reboot. I had to do a hard restart, and when Ubuntu restarted I got various error codes, one of which was "kernel panic - fatal exception in interrupt". Booting into advanced mode and attempting repairs did not work (fsck, GRUB repair, etc.). I reinstalled Ubuntu and chose the option to keep my files intact. I can now access my system, but many programs I had installed do not work. My question is: I have a Deja Dup backup (but only of my home directory); is it better to restore my backup files, or do I have to reinstall all of my programs? The weird thing is, the programs I checked in the Software Center are marked as installed, but they won't appear in the Dash.

    Read the article

  • Backup and restore Evolution

    Ghacks: "How many times have you migrated from one Linux box to another, only to say goodbye to your email and knowing you were going to have to set your email client up all over again"

    Read the article

  • Free eBook: SQL Server Backup and Restore

    You can download a free eBook from SQLServerCentral and Red Gate Software on the most important task a SQL Server DBA or developer needs to understand.

    Read the article

  • Error applying iptables rules using iptables-restore

    - by John Franic
    Hi, I'm using Ubuntu 9.04 on a VPS. I'm getting an error when I apply an iptables rule set. Here is what I have done.
    1. Saved the existing rules: iptables-save > /etc/iptables.up.rules
    2. Created iptables.test.rules and added some rules to it: nano /etc/iptables.test.rules
    These are the rules I added:
        *filter
        # Allows all loopback (lo0) traffic and drop all traffic to 127/8 that doesn't use lo0
        -A INPUT -i lo -j ACCEPT
        -A INPUT -i ! lo -d 127.0.0.0/8 -j REJECT
        # Accepts all established inbound connections
        -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
        # Allows all outbound traffic
        # You can modify this to only allow certain traffic
        -A OUTPUT -j ACCEPT
        # Allows HTTP and HTTPS connections from anywhere (the normal ports for websites)
        -A INPUT -p tcp --dport 80 -j ACCEPT
        -A INPUT -p tcp --dport 443 -j ACCEPT
        # Allows SSH connections
        #
        # THE -dport NUMBER IS THE SAME ONE YOU SET UP IN THE SSHD_CONFIG FILE
        #
        -A INPUT -p tcp -m state --state NEW --dport 22- j ACCEPT
        # Allow ping
        -A INPUT -p icmp -m icmp --icmp-type 8 -j ACCEPT
        # log iptables denied calls
        -A INPUT -m limit --limit 5/min -j LOG --log-prefix "iptables denied: " --log-level 7
        # Reject all other inbound - default deny unless explicitly allowed policy
        -A INPUT -j REJECT
        -A FORWARD -j REJECT
        COMMIT
    After editing, when I try to apply the rules with iptables-restore < /etc/iptables.test.rules, I get the following error:
        iptables-restore: line 42 failed
    Line 42 is COMMIT, and if I comment that out I get:
        iptables-restore: COMMIT expected at line 43
    I'm not sure what the problem is: it expects COMMIT, but if COMMIT is there it gives an error. Could it be due to the fact that I'm using a VPS? My provider uses OpenVZ for virtualization.

    Read the article

  • NetApp NDMP backup with BE 2010 R2 works, restore fails

    - by uuwe
    Hi, I'm having some issues with a new Backup Exec 2010 R2 installation. I configured a NetApp FAS2020 as an NDMP device and want to back up files from the NAS to a tape drive connected to my backup server. I set up ndmpd according to this document (http://www.symantec.com/business/support/index?page=content&id=TECH48957) and created a separate backup user (http://filers.blogspot.com/2006/09/setting-veritas-netbackup-with-non.html). Backup works perfectly, but restoring any file gives me an authentication failed error. The NDMP device has a "global" NDMP user configured in the device tab (tried this with the newly created ndmpd backup user and the NetApp root) and I can also configure separate resource credentials in the BE restore job. I have tried setting the same accounts for the "global" NDMP device and the restore credentials, and have also tried setting different accounts for them. NDMP debug level is at 5 and this is what shows up in /etc/messages; the session is closed immediately after it has been granted:
        16:12:07 PST [Java_Thread:info]: ndmpdserver: ndmpd.access allowed for version = 4, sessionId = 51, from src ip = 192.168.11.17, dst ip = FAS2020-1/192.168.11.75, src port = 50857, dst port = 10000
        16:12:07 PST [Java_Thread:info]: Ndmpd51: ndmpd session closed successfully for version = 4, sessionId = 51, from src ip = 192.168.11.17, dst ip = FAS2020-1/192.168.11.75, src port = 50857, dst port = 10000
    Running Wireshark on the backup server doesn't produce much. It shows a SYN - SYN/ACK - NDMP CONNECT_CLOSE Request from the backup server. The resource credentials for the restore job behave very oddly: if I enter NDMP credentials and do "Test All" it fails; if I use my regular domain backup account, it is successful. There are no failed or successful logons in the NetApp NDMP log, and tracing this check shows that it doesn't even connect to the NAS. This makes me think that this is more likely flaky BE behaviour than misconfiguration of the NAS. Here is the options ndmp output:
        FAS2020-1 options ndmp
        ndmpd.access                 all
        ndmpd.authtype               challenge
        ndmpd.connectlog.enabled     on
        ndmpd.enable                 on
        ndmpd.ignore_ctime.enabled   off
        ndmpd.offset_map.enable      on
        ndmpd.password_length        16
        ndmpd.preferred_interface    disable
        ndmpd.tcpnodelay.enable      off

    Read the article

  • Is it possible to restore a previous GL framebuffer?

    - by Rob
    Hi there, I'm working on an iPhone app that lets the user draw using GL. I used the GLPaint sample code project as a firm foundation, but now I want to add the ability for the user to load one of their previous drawings and continue working on it. I know how to get the framebuffer contents and save it as a UIImage. Is there a way for me to take the UIImage and tell GL to draw that? Any help is much appreciated.

    Read the article

  • Restore Mysql database query is not working in ASP.NET, C#

    - by santhosha
    We are using mysql.exe to restore a database with the following query string:
        string cmd ="-h" + ViewState["host"].ToString() + " " + "-u" + ViewState["user"].ToString() + " " + "-p" + ViewState["password"].ToString() + " " + ViewState["dbName"].ToString() + "<" + " " + Server.MapPath("BackupFiles/") + path;
    The same command works from the MySQL command prompt, but we are not able to restore using it from Visual Studio .NET. We have also tried mysqlimport.exe to do the restore, but it was no use. We are newbies to MySQL, so any help would be appreciated.

    Read the article

  • SQL 2005 - any way to restore/copy a diagram?

    - by NealWalters
    I used the Redgate packager (ran the MSI) to reset all the data in my database (i.e. I deleted everything and let it build the new database). Unfortunately, I discovered that it didn't retain my diagrams, which had a nice arrangement and several annotations. Is there any way to copy/migrate/script a diagram from one database to another (the databases have identical structures)? Thanks, Neal Walters
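    In SQL Server 2005, diagrams live in the dbo.sysdiagrams table of the database that owns them, so one possible workaround is to copy the rows across. A rough sketch only, assuming both databases are on the same instance and the target already has a sysdiagrams table (it is created the first time you open the Database Diagrams node); SourceDB and TargetDB are placeholder names:
        -- Sketch: copy diagram definitions between databases with identical structures.
        INSERT INTO TargetDB.dbo.sysdiagrams (name, principal_id, version, definition)
        SELECT name, principal_id, version, definition
        FROM SourceDB.dbo.sysdiagrams;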

    Read the article

  • SQL SERVER – Four Posts on Removing the Bookmark Lookup – Key Lookup

    - by pinaldave
    In recent times I have observed that not many people have a proper understanding of what a bookmark lookup or key lookup is. The increasing number of questions tells me that this is something developers encounter every single day but have no idea how to deal with. I have previously written three articles on this subject, and I want to point everyone looking for further information to those posts:
        SQL SERVER – Query Optimization – Remove Bookmark Lookup – Remove RID Lookup – Remove Key Lookup
        SQL SERVER – Query Optimization – Remove Bookmark Lookup – Remove RID Lookup – Remove Key Lookup – Part 2
        SQL SERVER – Query Optimization – Remove Bookmark Lookup – Remove RID Lookup – Remove Key Lookup – Part 3
        SQL SERVER – Interesting Observation – Execution Plan and Results of Aggregate Concatenation Queries
    In one of my recent classes we had an in-depth conversation about the alternatives to creating covering indexes to remove the bookmark lookup. I really want to open this question to all of you and see what the community thinks: is there any other way than creating a covering index or included index to remove this expensive key lookup? Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL Backup and Restore, SQL Index, SQL Optimization, SQL Performance, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, SQLAuthority News, SQLServer, T SQL, Technology
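    For readers new to the topic, the standard fix the post refers to looks roughly like this; the table and column names are hypothetical, purely for illustration:
        -- Sketch: a key lookup appears when the nonclustered index does not cover the query.
        -- Adding the needed columns with INCLUDE lets the index alone satisfy the query.
        CREATE NONCLUSTERED INDEX IX_Orders_CustomerID_Covering
        ON dbo.Orders (CustomerID)
        INCLUDE (OrderDate, TotalDue);
        GO
        -- A query like this can now be served without the key lookup:
        SELECT OrderDate, TotalDue FROM dbo.Orders WHERE CustomerID = 42;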

    Read the article

  • Webinar: MySQL Enterprise Backup - Online "Hot" Backup for MySQL

    - by mike.frank(at)oracle.com
    Online backup has been one of the most requested features for MySQL. With MySQL Enterprise Backup, developers and DBAs have the tools they need to safely and rapidly back up and restore their databases. In this webinar we will go into the advantages of hot "online" backups. We will show how MySQL Enterprise Backup supports full, incremental, partial, and compressed backups that allow you to perform consistent point-in-time recovery, as well as saving both time and money.
    In this free webinar you will learn:
        * Backup Strategies & Methods
        * Comparison of backup types for MySQL
        * MySQL Enterprise Backup: Features
        * MySQL Enterprise Backup: Performance
        * MySQL Enterprise Backup: Architecture
        * MySQL Enterprise Backup: How it Works
        * MySQL Enterprise Backup: Script Examples
    English webinar: Mike Frank and Alex Roedling, Thursday, January 20, 2011, 09:00 Pacific time.
    Italian webinar: Luca Olivari, Thursday, January 20, 2011, 10:00 Central European time.
    Register now: English, Italian. On-demand French and German versions are available as well.
    Related articles: Introducing our "Hot" MySQL Enterprise Backup (blogs.oracle.com)

    Read the article

  • SQLAuthority News – Microsoft Whitepaper – AlwaysOn Solution Guide: Offloading Read-Only Workloads to Secondary Replicas

    - by pinaldave
    SQL Server 2012 has many interesting features, but the most talked-about feature is AlwaysOn. Performance tuning is always a hot topic; I see a lot of need for it and a lot of business around it. However, many times when people talk about performance tuning they think of it as either query tuning, performance tuning, or server tuning. All are valid points, but a performance tuning expert usually understands the business workload and business logic before making suggestions. For example, if a performance tuning expert analyzes the workload and realizes that there are plenty of reports as well as read-only queries on the server, they can consider alternate options. If the read-only data is not required in real time, or slightly delayed data is acceptable, it makes sense to divide the workload. A secondary replica of the original data, which can serve all the read-only queries and reports, is a good idea in most cases where a large part of the workload does not depend on real-time data. SQL Server 2012 has introduced the AlwaysOn feature, which fits this scenario very well and provides a solution for read-only workloads. Microsoft has recently announced a white paper on exactly this subject. I recommend it to every SQL enthusiast who is going to implement a solution to offload read-only workloads to secondary replicas. Download white paper: AlwaysOn Solution Guide: Offloading Read-Only Workloads to Secondary Replicas. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Backup and Restore, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: AlwaysOn
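    As a rough illustration of the offloading idea (not taken from the whitepaper; the availability group and replica names are hypothetical), a secondary replica can be opened up for read-only connections like this:
        -- Sketch: allow read-only connections on an AlwaysOn secondary replica.
        ALTER AVAILABILITY GROUP [MyAG]
        MODIFY REPLICA ON N'SQLNODE2'
        WITH (SECONDARY_ROLE (ALLOW_CONNECTIONS = READ_ONLY));
        -- Reporting clients then add ApplicationIntent=ReadOnly to their connection strings.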

    Read the article

  • How to read default key value with dconf or gsettings?

    - by Zta
    I would like to know the default value of a dconf/gsettings key. My question is a follow-up to the question below: Where can I get a list of SCHEMA / PATH / KEY to use with gsettings? What I'm trying to do is create a script that reads all my personal preferences so I can back them up and restore them. I plan to iterate through all the keys, like the script above, see which keys have been changed from their default value, and make a note of these so they can be restored later. I see that dconf-editor displays a key's default value, but I'd very much like to script this. Also, I don't see how parsing the schemas in /usr/share/glib-2.0/schemas/ can be automated. Maybe someone can help? gsettings get-default|list-defaults would be nice =) (Geesh, it was much easier in the old days when you just kept your ~/.somethingrc in subversion ... =\ Based on the answer given below, I've updated the script to print schema, key, key's data type, default value, and actual value:
        #!/bin/bash
        for schema in $(gsettings list-schemas | sort); do
          for key in $(gsettings list-keys $schema | sort); do
            type="$(gsettings range $schema $key | tr "\n" " ")"
            default="$(XDG_CONFIG_HOME=/tmp/ gsettings get $schema $key | tr "\n" " ")"
            value="$(gsettings get $schema $key | tr "\n" " ")"
            echo "$schema :: $key :: $type :: $default :: $value"
          done
        done
    This workaround basically covers what I need. I'll continue working on the backup script from here.

    Read the article

  • Error when restoring database (Windows 7 test environment)

    - by Undh
    I have a Windows 7 operating system as a test environment. I have SQL Server EE installed with two instances, named test and production. I took a full backup of the AdventureWorks database from the test instance and tried to restore it into the production instance:
        RESTORE DATABASE [testikanta]
        FROM DISK = N'C:\Program Files\Microsoft SQL Server\MSSQL10.SQL2008TESTI\MSSQL\Backup\AdventureWorks.bak'
        WITH FILE = 1, NOUNLOAD, REPLACE, STATS = 10
        GO
    I got an error saying:
        Msg 3634, Level 16, State 1, Line 1
        The operating system returned the error '32(failed to retrieve text for this error. Reason: 15105)' while attempting 'RestoreContainer::ValidateTargetForCreation' on 'C:\Program Files\Microsoft SQL Server\MSSQL10.SQL2008TESTI\MSSQL\DATA\AdventureWorks_Data.mdf'.
        Msg 3156, Level 16, State 8, Line 1
        File 'AdventureWorks_Data' cannot be restored to 'C:\Program Files\Microsoft SQL Server\MSSQL10.SQL2008TESTI\MSSQL\DATA\AdventureWorks_Data.mdf'. Use WITH MOVE to identify a valid location for the file.
        Msg 3634, Level 16, State 1, Line 1
        The operating system returned the error '32(failed to retrieve text for this error. Reason: 15105)' while attempting 'RestoreContainer::ValidateTargetForCreation' on 'C:\Program Files\Microsoft SQL Server\MSSQL10.SQL2008TESTI\MSSQL\DATA\AdventureWorks_Log.ldf'.
        Msg 3156, Level 16, State 8, Line 1
        File 'AdventureWorks_Log' cannot be restored to 'C:\Program Files\Microsoft SQL Server\MSSQL10.SQL2008TESTI\MSSQL\DATA\AdventureWorks_Log.ldf'. Use WITH MOVE to identify a valid location for the file.
        Msg 3119, Level 16, State 1, Line 1
        Problems were identified while planning for the RESTORE statement. Previous messages provide details.
        Msg 3013, Level 16, State 1, Line 1
        RESTORE DATABASE is terminating abnormally.
    Where's the problem? I'm running these instances as the local machine administrator (the SQL Server services are running under the same account).
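    The error messages themselves point at the fix: the backup carries file paths that belong to the test instance, so the production instance needs WITH MOVE to relocate the data and log files. A sketch only; the logical file names can be confirmed with RESTORE FILELISTONLY, and the target folder below is an assumption:
        -- Check the logical file names stored in the backup:
        RESTORE FILELISTONLY
        FROM DISK = N'C:\Program Files\Microsoft SQL Server\MSSQL10.SQL2008TESTI\MSSQL\Backup\AdventureWorks.bak';
        -- Restore while moving the files to a folder the production instance can write to:
        RESTORE DATABASE [testikanta]
        FROM DISK = N'C:\Program Files\Microsoft SQL Server\MSSQL10.SQL2008TESTI\MSSQL\Backup\AdventureWorks.bak'
        WITH MOVE 'AdventureWorks_Data' TO N'C:\SQLData\testikanta.mdf',
             MOVE 'AdventureWorks_Log' TO N'C:\SQLData\testikanta_log.ldf',
             REPLACE, STATS = 10;
        GO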

    Read the article

  • What is the quickest and safest way to test new software and revert all changes, if needed?

    - by calbar
    I'm looking for Windows software that will allow me to quickly create a "checkpoint", do whatever I might need to do to my computer - install programs/drivers/updates, create/delete personal files, reboot the system multiple times, open questionable attachments - and then revert the entire system back to when the checkpoint was created. Essentially I want Windows Restore Points that save my personal files and partitions, too. It sounds like disk imaging might be the ticket, but creating them is much too slow and the restore process too involved... I'm hoping to sacrifice full disaster recovery for speed. Creating a checkpoint should be as close to one-click as possible, and rolling back should be a matter of selecting a restore point and rebooting. Ding! I'm familiar with Sandboxie, True Image Home "Try and Decide", Returnil, and a number of other "virtual system" apps that actively "catch" changes and allow you to commit or reject them. I'm not interested in these for a number of reasons - I prefer the "cut and dry" restore point approach. Finally, I'll note that I've just recently become aware of Comodo Time Machine. It sounds absolutely perfect, however, a quick skim through the user forums show more than a few horror stories of corrupted, unbootable systems. Any positive personal experience with the software to suppress my superstitions, or suggestions for more established alternatives would be greatly appreciated - Comodo Time Machine seems relatively new. Thanks for your help!

    Read the article

  • SQL SERVER – Log File Growing for Model Database – model Database Log File Grew Too Big

    - by pinaldave
    After reading my earlier article SQL SERVER – master Database Log File Grew Too Big, I received an email from another reader asking why the log file of the model database grows every day when he is not carrying out any operations in it. As per the email, he is absolutely sure that he is doing nothing in his model database; he had used Policy Management to catch any T-SQL operation in the model database and there were none. This was indeed surprising to me. I sent a request for access to his server, which he happily agreed to, and within a minute we figured out the issue. He was taking a backup of the model database every night. When I explained this to him, he did not believe it, so I quickly wrote down the following script. The results before and after running the script were very clear.
    What is the model database? The model database is used as the template for all databases created on an instance of SQL Server. Any object you create in the model database will be automatically created in subsequent user databases created on the server.
    NOTE: Do not run this in a production environment. During the demo, the model database was in the full recovery model and only full backup operations were performed (no log backups).
    Before Backup Script
    Backup Script in loop:
        DECLARE @FLAG INT
        SET @FLAG = 1
        WHILE (@FLAG < 1000)
        BEGIN
        BACKUP DATABASE [model] TO DISK = N'D:\model.bak'
        SET @FLAG = @FLAG + 1
        END
        GO
    After Backup Script
    Why did this happen? The model database was in the full recovery model, and taking a full backup is a logged operation. As there were no log backups and only full backups were performed on the model database, the size of the log file kept growing.
    Resolution: Change the recovery model of the model database from "Full" to "Simple", and take a full backup of the model database only when you change something in it. Let me know if you have encountered a situation like this. If so, how did you resolve it? It will be interesting to know about your experience. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Backup and Restore, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology
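    The resolution above boils down to two statements; a minimal sketch (take a fresh full backup of model only after you actually change something in it):
        -- Sketch: stop the model log from growing under repeated full backups.
        ALTER DATABASE [model] SET RECOVERY SIMPLE;
        GO
        BACKUP DATABASE [model] TO DISK = N'D:\model.bak' WITH INIT;
        GO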

    Read the article

  • SQL SERVER – Move Database Files MDF and LDF to Another Location

    - by pinaldave
    When a novice DBA or developer creates a database, they use SQL Server Management Studio to create the new database. Additionally, the T-SQL script to create a database is very easy as well: you can just write CREATE DATABASE DatabaseName and it will create a new database for you. The point to remember here is that it will create the database at the default location specified for the SQL Server instance (this default location can be changed, and we will see that in future blog posts). Now, once the database goes into production it will start to grow. It is not common to keep the database in the same location where the OS is installed. Usually database files are on a SAN, a separate disk array, or on SSDs. This is done for performance reasons and from a manageability perspective. The challenge comes up when a database that was installed at a non-preferred default location needs to be moved to a different location. Here is a quick tutorial on how you can do it. Let us assume we have two folders, loc1 and loc2, and we want to move the database files from loc1 to loc2.
        USE MASTER;
        GO
        -- Take database in single user mode -- if you are facing errors
        -- This may terminate your active transactions for database
        ALTER DATABASE TestDB
        SET SINGLE_USER
        WITH ROLLBACK IMMEDIATE;
        GO
        -- Detach DB
        EXEC MASTER.dbo.sp_detach_db @dbname = N'TestDB'
        GO
    Now move the files from loc1 to loc2. You can then reattach the files with their new locations.
        -- Move MDF File from Loc1 to Loc 2
        -- Re-Attached DB
        CREATE DATABASE [TestDB] ON
        ( FILENAME = N'F:\loc2\TestDB.mdf' ),
        ( FILENAME = N'F:\loc2\TestDB_log.ldf' )
        FOR ATTACH
        GO
    Well, we are done. There is a little warning here for you: if you use ROLLBACK IMMEDIATE you may terminate your active transactions, so do not use it casually. Use it only if you are confident the transactions are not needed, or if for some reason there is a connection to the database which you are not able to kill manually after review. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Backup and Restore, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • SQL SERVER – Transaction Log Full – Transaction Log Larger than Data File – Notes from Fields #001

    - by Pinal Dave
    I am very excited to announce a new series on this blog – Notes from Fields. I have been blogging for almost 7 years on this blog and it has been a wonderful experience. Though I have extensive experience with SQL and databases, it is always a good idea to consult experts for their advice and opinion. Following that thought process, I have started this new series, Notes from Fields. In this series we will have notes from various experts in the database world. My friends at Linchpin People have graciously decided to support me in my new initiative. Linchpin People are database coaches and wellness experts for a data driven world. In this very first episode of the Notes from Fields series, database expert Tim Radney (partner at Linchpin People) explains a very common issue DBAs and developers face in their careers: when the database log fills up your hard drive, or your database log is larger than your data file. Read Tim's experience in his own words.
    As a consultant, I encounter a number of common issues with clients. One of the more common things I encounter is finding a user database in the FULL recovery model that does not have regular transaction log backups, or has never had a transaction log backup. When I find this, usually the transaction log is several times larger than the data file. Finding this issue is very significant to me in that it allows me to discuss service level agreements with the client. I get to ask questions such as: are nightly full backups sufficient, or do they need point-in-time recovery? This conversation gets the customer thinking about their disaster recovery and high availability solutions. This issue is also very prominent on SQL Server forums and usually has a title like "Help, my transaction log has filled up my disk" or "Help, my transaction log is many times the size of my database". In cases where the client only needs the previous night's full backup, I am able to change the recovery model to SIMPLE and shrink the transaction log using DBCC SHRINKFILE (2,1) or by specifying the transaction log file name using DBCC SHRINKFILE (file_name, target_size). When the client needs point-in-time recovery, in most cases I will still end up switching the client to the SIMPLE recovery model to truncate the transaction log, followed by a full backup. I will then schedule a SQL Agent job to make the regular transaction log backups at an interval determined by the client to meet their service level agreements. It should also be noted that typically, when I find an overgrown transaction log, the virtual log file count is also out of control; my cleanup will always take that into account as well. That is a subject for a future blog post. If your SQL Server is facing any issue we can Fix Your SQL Server. Additional reading: Monitoring SQL Server Database Transaction Log Space Growth – DBCC SQLPERF(logspace); SQL SERVER – How to Stop Growing Log File Too Big; Shrinking Truncate Log File – Log Full. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Backup and Restore, SQL Query, SQL Server, SQL Tips and Tricks, T SQL
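    For reference, the sequence Tim describes (switch to SIMPLE to truncate, shrink the log file, switch back if point-in-time recovery is needed, then restart the backup chain) looks roughly like this; the database name, log file name, and target size are placeholders:
        -- Sketch only: placeholder names and sizes.
        ALTER DATABASE [UserDB] SET RECOVERY SIMPLE;    -- log is truncated at the next checkpoint
        GO
        DBCC SHRINKFILE (N'UserDB_log', 1024);          -- shrink the log file to ~1 GB
        GO
        ALTER DATABASE [UserDB] SET RECOVERY FULL;      -- only if point-in-time recovery is required
        GO
        BACKUP DATABASE [UserDB] TO DISK = N'D:\Backup\UserDB.bak';   -- restart the backup chain
        GO
        -- Then schedule regular BACKUP LOG [UserDB] TO DISK = N'...' jobs to keep the log in check.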

    Read the article

  • Can't restore backup from SQL Server 2008 R2 to SQL Server 2005 or 2008

    - by Erick
    Hi everyone, I'm trying to get a backup from SQL Server 2008 R2 restored to SQL Server 2008, but when we try to do the restore we get this: The database was backed up on a server running version 10.50.1092. That version is incompatible with this server, which is running version 10.00.2531. Either restore the database on a server that supports the backup, or use a backup that is compatible with this server. I can use the script wizard to generate a script, but that takes over an hour to run. I also tried just exporting the data from server to server, but it had issues with the primary keys/identity columns. I will be running into this issue with several other clients so any help you could offer about how to get around this would be great. Thanks for your help!
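    Restoring a backup to an older version of SQL Server is not supported, so the message is expected; the usual options are the script/export route or upgrading the target instance. If you want to confirm which build produced a backup file before attempting a restore, a small sketch (the path is a placeholder):
        -- Sketch: inspect the backup header; SoftwareVersionMajor/Minor/Build show the source server version.
        RESTORE HEADERONLY
        FROM DISK = N'C:\Backups\MyDatabase.bak';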

    Read the article

  • Restore using Time Machine from a MacBook to a MacBook Pro (first Intel)

    - by Anders Nørgaard
    Hello. My girlfriend has a MacBook running 10.6.3, the first plastic version. The screen broke and it's at the service store now. In the meantime, I have tried to restore from her Time Machine backup to my old MacBook Pro running 10.6.3 (the first Intel version). Everything seems to work out fine, but when it's finished, it says to reboot, and then nothing happens. When I hold down the power button to power down and start it again, it comes up with the grey roll-down screen saying "you need to restart your machine" in different languages. I have tried the restore procedure two more times, and every time it ends up like this... Anyone have a suggestion what to do? Thanks - Anders.

    Read the article

  • Demote 2003 DC from within Directory Services Restore Mode

    - by adam
    We've had a child DC fail on us, and can't get into Windows on it as Directory Services is failing. A restore of the backed-up Active Directory hasn't worked due to a corruption, so we've decided to demote the child DC and - for now - run AD from the PDC only. However, dcpromo /demote doesn't work from Safe Mode or Directory Services Restore Mode. We don't want to do a complete reinstall, as we have Exchange running on the child DC. Anyone know how (if?) we can demote the DC within Safe Mode or otherwise get into Windows? Thanks

    Read the article
