Search Results

Search found 56222 results on 2249 pages for 'error checking'.


  • No mapping between account names and security IDs was done

    - by ybbest
    When I tried to install SQL Server 2008 R2, I got the error “No mapping between account names and security IDs was done” while setting the SQL Server Database Engine services identity to a domain user account. The reason I got the error is that I had created a base VM but forgot to run sysprep on it before copying the VM and using the copies to install SQL Server. You need to run sysprep as follows: References: How to Sysprep in Windows Server 2008 R2 and Windows 7
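
    For reference, a minimal sysprep invocation on the base VM before cloning might look like the following (the /generalize, /oobe and /shutdown switches are the commonly used ones; this is my illustration, not quoted from the referenced article):

        REM Generalize the image so each clone receives a new machine SID
        C:\Windows\System32\sysprep\sysprep.exe /generalize /oobe /shutdown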

    Read the article

  • [SQLServer JDBC Driver][SQLServer]Could not find stored procedure 'master..xp_jdbc_open2'.

    - by Vijaya, Moderator - Oracle
    When connecting to a MS SQL Server database via a WebLogic datasource using the XA JDBC driver, the following error is thrown:

        <Jun 3, 2014 5:16:49 AM PDT> <Error> <Console> <BEA-240003> <Console encountered the following error java.sql.SQLException: [FMWGEN][SQLServer JDBC Driver][SQLServer]Could not find stored procedure 'master..xp_jdbc_open2'.
        at weblogic.jdbc.sqlserverbase.ddb_.b(Unknown Source)
        at weblogic.jdbc.sqlserverbase.ddb_.a(Unknown Source)
        at weblogic.jdbc.sqlserverbase.ddb9.b(Unknown Source)
        at weblogic.jdbc.sqlserverbase.ddb9.a(Unknown Source)
        at weblogic.jdbc.sqlserver.tds.ddr.v(Unknown Source)
        at weblogic.jdbc.sqlserver.tds.ddr.a(Unknown Source)
        at weblogic.jdbc.sqlserver.tds.ddq.a(Unknown Source)
        at weblogic.jdbc.sqlserver.tds.ddr.a(Unknown Source)
        at weblogic.jdbc.sqlserver.ddj.m(Unknown Source)
        at weblogic.jdbc.sqlserverbase.ddel.e(Unknown Source)
        at weblogic.jdbc.sqlserverbase.ddel.a(Unknown Source)

    The cause of the issue is that MS SQL Server was not installed with the stored procedures that enable JTA/XA.

    Solution: To connect to SQL Server via the XA driver from a WLS datasource, you need to install the stored procedures for JTA. To use JDBC distributed transactions through JTA, your system administrator should use the following procedure to install the Microsoft SQL Server JDBC XA procedures. This procedure must be repeated for each MS SQL Server installation that will be involved in a distributed transaction.

    To install stored procedures for JTA:

    1. Copy the appropriate sqljdbc.dll and instjdbc.sql files from the WL_HOME\server\lib directory to the SQL_Server_Root/bin directory of the MS SQL Server database server, where WL_HOME is the directory in which WebLogic Server is installed, typically c:\Oracle\Middleware\wlserver_10.x.

    Note: If you are installing stored procedures on a database server with multiple Microsoft SQL Server instances, each running SQL Server instance must be able to locate the sqljdbc.dll file. Therefore the sqljdbc.dll file needs to be either on the global PATH or on the application-specific path. For the application-specific path, place the sqljdbc.dll file into the :\Program Files\Microsoft SQL Server\MSSQL$\Binn directory for each instance.

    2. From the database server, use the ISQL utility to run the instjdbc.sql script. As a precaution, have your system administrator back up the master database before running instjdbc.sql. At a command prompt, use the following syntax to run instjdbc.sql:

        ISQL -Usa -Psa_password -Sserver_name -ilocation\instjdbc.sql

    where:
    sa_password is the password of the system administrator.
    server_name is the name of the server on which SQL Server resides.
    location is the full path to instjdbc.sql. (You copied this script to the SQL_Server_Root/bin directory in step 1.)

    The instjdbc.sql script generates many messages. In general, these messages can be ignored; however, the system administrator should scan the output for any messages that may indicate an execution error. The last message should indicate that instjdbc.sql ran successfully. The script fails when there is insufficient space available in the master database to store the JDBC XA procedures or to log changes to existing procedures.
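
    For example, with illustrative placeholder values filled in (the server name, password and script location below are assumptions, not from the article):

        REM Back up the master database first, then run from a command prompt on the database server
        ISQL -Usa -PmyStrongPassword -SSQLPROD01 -iC:\MSSQL\Binn\instjdbc.sql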

    Read the article

  • My nVidia GeForce 8600M GT card stopped working: "extension GLX missing on display 0:0"

    - by Wolter Hellmund
    So I turned on my computer and logged into my gnome-shell session, only to find that the shell was gone; in its place was a very thick panel with a vertically repeated pattern of the gradient the Unity panel displays. I ran some tests and found out that my graphics acceleration was failing. I have attached the output of some commands to help helpers help me.

    Some command outputs:

    Video card: nVidia GeForce 8600M GT (note that it is not an Optimus card).

    Output of lspci | grep VGA:

        01:00.0 VGA compatible controller: NVIDIA Corporation G84 [GeForce 8600M GT] (rev a1)

    Output of glxinfo:

        Xlib: extension "GLX" missing on display ":0.0".
        Xlib: extension "GLX" missing on display ":0.0".
        Xlib: extension "GLX" missing on display ":0.0".
        Xlib: extension "GLX" missing on display ":0.0".
        Xlib: extension "GLX" missing on display ":0.0".
        Error: couldn't find RGB GLX visual or fbconfig
        Xlib: extension "GLX" missing on display ":0.0".
        Xlib: extension "GLX" missing on display ":0.0".
        Xlib: extension "GLX" missing on display ":0.0".
        Xlib: extension "GLX" missing on display ":0.0".
        Xlib: extension "GLX" missing on display ":0.0".

    Contents of /var/log/Xorg.0.log:

        NVIDIA: Failed to load the NVIDIA kernel module. Please check your
        NVIDIA: system's kernel log for additional error messages.
        UnloadModule: "nvidia"
        Unloading nvidia
        Failed to load module "nvidia" (module-specific error, 0)

    If I use the Additional Drivers system application, I see none of the drivers active, and when I try to activate any, I get an error telling me to check the jockey.log file. The contents of /var/log/jockey.log are:

        2012-09-20 23:11:32,690 DEBUG: NVidia(nvidia_173).enabled(): target_alt None current_alt /usr/lib/nvidia-current/ld.so.conf other target alt None other current alt /usr/lib/nvidia-current/alt_ld.so.conf
        2012-09-20 23:11:32,690 DEBUG: nvidia_173 is not the alternative in use
        2012-09-20 23:11:34,606 DEBUG: BroadcomWLHandler enabled(): kmod disabled, bcm43xx: blacklisted, b43: enabled, b43legacy: enabled
        2012-09-20 23:11:34,657 DEBUG: NVidia(nvidia_173_updates).enabled(): target_alt None current_alt /usr/lib/nvidia-current/ld.so.conf other target alt None other current alt /usr/lib/nvidia-current/alt_ld.so.conf
        2012-09-20 23:11:34,657 DEBUG: nvidia_173_updates is not the alternative in use
        2012-09-20 23:11:36,647 DEBUG: NVidia(nvidia_current).enabled(): target_alt /usr/lib/nvidia-current/ld.so.conf current_alt /usr/lib/nvidia-current/ld.so.conf other target alt /usr/lib/nvidia-current/alt_ld.so.conf other current alt /usr/lib/nvidia-current/alt_ld.so.conf
        2012-09-20 23:11:36,648 DEBUG: KMH enabled: False
        2012-09-20 23:11:38,642 DEBUG: NVidia(nvidia_current_updates).enabled(): target_alt None current_alt /usr/lib/nvidia-current/ld.so.conf other target alt None other current alt /usr/lib/nvidia-current/alt_ld.so.conf
        2012-09-20 23:11:38,642 DEBUG: nvidia_current_updates is not the alternative in use
        2012-09-20 23:11:40,546 DEBUG: NVidia(nvidia_173).enabled(): target_alt None current_alt /usr/lib/nvidia-current/ld.so.conf other target alt None other current alt /usr/lib/nvidia-current/alt_ld.so.conf
        2012-09-20 23:11:40,546 DEBUG: nvidia_173 is not the alternative in use
        2012-09-20 23:11:40,620 DEBUG: NVidia(nvidia_173).enabled(): target_alt None current_alt /usr/lib/nvidia-current/ld.so.conf other target alt None other current alt /usr/lib/nvidia-current/alt_ld.so.conf
        2012-09-20 23:11:40,620 DEBUG: nvidia_173 is not the alternative in use
        2012-09-20 23:11:40,629 DEBUG: BroadcomWLHandler enabled(): kmod disabled, bcm43xx: blacklisted, b43: enabled, b43legacy: enabled
        2012-09-20 23:11:40,698 DEBUG: NVidia(nvidia_173_updates).enabled(): target_alt None current_alt /usr/lib/nvidia-current/ld.so.conf other target alt None other current alt /usr/lib/nvidia-current/alt_ld.so.conf
        2012-09-20 23:11:40,698 DEBUG: nvidia_173_updates is not the alternative in use
        2012-09-20 23:11:40,759 DEBUG: NVidia(nvidia_current).enabled(): target_alt /usr/lib/nvidia-current/ld.so.conf current_alt /usr/lib/nvidia-current/ld.so.conf other target alt /usr/lib/nvidia-current/alt_ld.so.conf other current alt /usr/lib/nvidia-current/alt_ld.so.conf
        2012-09-20 23:11:40,760 DEBUG: KMH enabled: False
        2012-09-20 23:11:40,826 DEBUG: NVidia(nvidia_current_updates).enabled(): target_alt None current_alt /usr/lib/nvidia-current/ld.so.conf other target alt None other current alt /usr/lib/nvidia-current/alt_ld.so.conf
        2012-09-20 23:11:40,826 DEBUG: nvidia_current_updates is not the alternative in use
        2012-09-20 23:11:45,415 DEBUG: NVidia(nvidia_current).enabled(): target_alt /usr/lib/nvidia-current/ld.so.conf current_alt /usr/lib/nvidia-current/ld.so.conf other target alt /usr/lib/nvidia-current/alt_ld.so.conf other current alt /usr/lib/nvidia-current/alt_ld.so.conf
        2012-09-20 23:11:45,415 DEBUG: KMH enabled: False
        2012-09-20 23:11:47,367 DEBUG: NVidia(nvidia_current).enabled(): target_alt /usr/lib/nvidia-current/ld.so.conf current_alt /usr/lib/nvidia-current/ld.so.conf other target alt /usr/lib/nvidia-current/alt_ld.so.conf other current alt /usr/lib/nvidia-current/alt_ld.so.conf
        2012-09-20 23:11:47,367 DEBUG: KMH enabled: False
        2012-09-20 23:11:55,961 WARNING: modinfo for module nvidia_current failed: ERROR: modinfo: could not find module nvidia_current
        2012-09-20 23:11:55,962 ERROR: XorgDriverHandler.enable(): package or module not installed, aborting
        2012-09-20 23:12:27,780 DEBUG: NVidia(nvidia_current).enabled(): target_alt /usr/lib/nvidia-current/ld.so.conf current_alt /usr/lib/nvidia-current/ld.so.conf other target alt /usr/lib/nvidia-current/alt_ld.so.conf other current alt /usr/lib/nvidia-current/alt_ld.so.conf
        2012-09-20 23:12:27,781 DEBUG: KMH enabled: False
        2012-09-20 23:12:27,840 DEBUG: NVidia(nvidia_current).enabled(): target_alt /usr/lib/nvidia-current/ld.so.conf current_alt /usr/lib/nvidia-current/ld.so.conf other target alt /usr/lib/nvidia-current/alt_ld.so.conf other current alt /usr/lib/nvidia-current/alt_ld.so.conf
        2012-09-20 23:12:27,840 DEBUG: KMH enabled: False
        2012-09-20 23:12:43,196 DEBUG: NVidia(nvidia_173).enabled(): target_alt None current_alt /usr/lib/nvidia-current/ld.so.conf other target alt None other current alt /usr/lib/nvidia-current/alt_ld.so.conf
        2012-09-20 23:12:43,197 DEBUG: nvidia_173 is not the alternative in use
        2012-09-20 23:12:43,208 DEBUG: BroadcomWLHandler enabled(): kmod disabled, bcm43xx: blacklisted, b43: enabled, b43legacy: enabled
        2012-09-20 23:12:43,269 DEBUG: NVidia(nvidia_173_updates).enabled(): target_alt None current_alt /usr/lib/nvidia-current/ld.so.conf other target alt None other current alt /usr/lib/nvidia-current/alt_ld.so.conf
        2012-09-20 23:12:43,269 DEBUG: nvidia_173_updates is not the alternative in use
        2012-09-20 23:12:43,333 DEBUG: NVidia(nvidia_current).enabled(): target_alt /usr/lib/nvidia-current/ld.so.conf current_alt /usr/lib/nvidia-current/ld.so.conf other target alt /usr/lib/nvidia-current/alt_ld.so.conf other current alt /usr/lib/nvidia-current/alt_ld.so.conf
        2012-09-20 23:12:43,333 DEBUG: KMH enabled: False
        2012-09-20 23:12:43,393 DEBUG: NVidia(nvidia_current).enabled(): target_alt /usr/lib/nvidia-current/ld.so.conf current_alt /usr/lib/nvidia-current/ld.so.conf other target alt /usr/lib/nvidia-current/alt_ld.so.conf other current alt /usr/lib/nvidia-current/alt_ld.so.conf
        2012-09-20 23:12:43,394 DEBUG: KMH enabled: False
        2012-09-20 23:12:43,465 DEBUG: NVidia(nvidia_current_updates).enabled(): target_alt None current_alt /usr/lib/nvidia-current/ld.so.conf other target alt None other current alt /usr/lib/nvidia-current/alt_ld.so.conf
        2012-09-20 23:12:43,465 DEBUG: nvidia_current_updates is not the alternative in use
        2012-09-20 23:12:43,531 DEBUG: NVidia(nvidia_current).enabled(): target_alt /usr/lib/nvidia-current/ld.so.conf current_alt /usr/lib/nvidia-current/ld.so.conf other target alt /usr/lib/nvidia-current/alt_ld.so.conf other current alt /usr/lib/nvidia-current/alt_ld.so.conf
        2012-09-20 23:12:43,531 DEBUG: KMH enabled: False
        2012-09-20 23:12:43,587 DEBUG: NVidia(nvidia_current).enabled(): target_alt /usr/lib/nvidia-current/ld.so.conf current_alt /usr/lib/nvidia-current/ld.so.conf other target alt /usr/lib/nvidia-current/alt_ld.so.conf other current alt /usr/lib/nvidia-current/alt_ld.so.conf
        2012-09-20 23:12:43,587 DEBUG: KMH enabled: False
        2012-09-20 23:12:43,663 DEBUG: NVidia(nvidia_173).enabled(): target_alt None current_alt /usr/lib/nvidia-current/ld.so.conf other target alt None other current alt /usr/lib/nvidia-current/alt_ld.so.conf
        2012-09-20 23:12:43,663 DEBUG: nvidia_173 is not the alternative in use
        2012-09-20 23:12:43,672 DEBUG: BroadcomWLHandler enabled(): kmod disabled, bcm43xx: blacklisted, b43: enabled, b43legacy: enabled
        2012-09-20 23:12:43,738 DEBUG: NVidia(nvidia_173_updates).enabled(): target_alt None current_alt /usr/lib/nvidia-current/ld.so.conf other target alt None other current alt /usr/lib/nvidia-current/alt_ld.so.conf
        2012-09-20 23:12:43,739 DEBUG: nvidia_173_updates is not the alternative in use
        2012-09-20 23:12:43,803 DEBUG: NVidia(nvidia_current).enabled(): target_alt /usr/lib/nvidia-current/ld.so.conf current_alt /usr/lib/nvidia-current/ld.so.conf other target alt /usr/lib/nvidia-current/alt_ld.so.conf other current alt /usr/lib/nvidia-current/alt_ld.so.conf
        2012-09-20 23:12:43,803 DEBUG: KMH enabled: False
        2012-09-20 23:12:43,863 DEBUG: NVidia(nvidia_current).enabled(): target_alt /usr/lib/nvidia-current/ld.so.conf current_alt /usr/lib/nvidia-current/ld.so.conf other target alt /usr/lib/nvidia-current/alt_ld.so.conf other current alt /usr/lib/nvidia-current/alt_ld.so.conf
        2012-09-20 23:12:43,863 DEBUG: KMH enabled: False
        2012-09-20 23:12:43,930 DEBUG: NVidia(nvidia_current_updates).enabled(): target_alt None current_alt /usr/lib/nvidia-current/ld.so.conf other target alt None other current alt /usr/lib/nvidia-current/alt_ld.so.conf
        2012-09-20 23:12:43,930 DEBUG: nvidia_current_updates is not the alternative in use
        2012-09-20 23:24:16,305 DEBUG: Shutting down

    Things I have tried:
    Checked the video card on a Windows session.
    Reran nvidia-xconfig.
    Purged all nvidia software from my computer and reinstalled it.
    Ran nvidia-xconfig.
    Tried a different kernel.

    Read the article

  • How can I install Age of Mythology on Ubuntu?

    - by edgarsalguero93
    I have a problem trying to install Age of Mythology Gold Edition (v 1.03). When I launch the installer with Wine (1.3.28) on my Ubuntu 11.10, after entering the serial and clicking Next, I get the error "Unable to load PidGen.dll" and I am returned to the previous window. In Configure Wine, under the Libraries tab, I added the DLL file it asks for, but everything stays the same. I have tried everything to get it to install, but it doesn't work. I have also tried installing it in a VMware Workstation 8.0.1 virtual machine running Microsoft Windows XP SP3, but I get a compatibility error with the video card: "Video card 0: vmx_fb.dll VMware SVGA II Vendor(0x15AD) Device(0x405)", and Age of Mythology closes. I think the solution in this case would be to increase the amount of video memory that VMware makes available to the virtual machine, or a driver for this card, but I don't know how to do that. Or maybe someone knows a way for the video card in my computer to be used by VMware as well, or to dedicate part of the video memory to the virtual machine. I have searched several sites, but none solved my problem. Please, if someone has already found the answer, answer this question. Thanks in advance, and greetings from Latin America.

    Read the article

  • Illegal characters for SharePoint 2010 Content Type name

    - by Kelly Jones
    Quick tip: you can’t include a backslash in the name of a SharePoint 2010 content type.  In fact, there are several illegal characters:  \  / : * ? " # % < > { } | ~ & , two consecutive periods (..), or special characters such as a tab. What, you didn’t notice after entering one of these characters in the name?  Is it because you saw the generic error screen instead? Oh, that’s right… you need to turn off custom errors in the layouts folder. See this blog post for details; you’ll also need to turn them off for the web application. Once you do that, you’ll see the real error. I wonder why the SharePoint team just doesn’t let the user know that the content type name contains illegal characters before the user hits the Create button. Here’s a copy of the complete error (for the search engines):

        Server Error in '/' Application.
        --------------------------------------------------------------------------------
        The content type name 'asdfadsf\asdfasf' cannot contain: \  / : * ? " # % < > { } | ~ & , two consecutive periods (..), or special characters such as a tab.
        Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
        Exception Details: Microsoft.SharePoint.SPInvalidContentTypeNameException: The content type name 'asdfadsf\asdfasf' cannot contain: \  / : * ? " # % < > { } | ~ & , two consecutive periods (..), or special characters such as a tab.
        Source Error: An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below.
        Stack Trace:
        [SPInvalidContentTypeNameException: The content type name 'asdfadsf\asdfasf' cannot contain: \  / : * ? " # % < > { } | ~ & , two consecutive periods (..), or special characters such as a tab.]
           Microsoft.SharePoint.SPContentType.ValidateName(String name) +27419522
           Microsoft.SharePoint.SPContentType.ValidateNameWithResource(String strVal, String& strLocalized) +423
           Microsoft.SharePoint.SPContentType.set_Name(String value) +151
           Microsoft.SharePoint.SPContentType.Initialize(SPContentType parentContentType, SPContentTypeCollection collection, String name) +112
           Microsoft.SharePoint.SPContentType..ctor(SPContentType parentContentType, SPContentTypeCollection collection, String name) +132
           Microsoft.SharePoint.ApplicationPages.ContentTypeCreatePage.BtnOK_Click(Object sender, EventArgs e) +497
           System.Web.UI.WebControls.Button.OnClick(EventArgs e) +115
           System.Web.UI.WebControls.Button.RaisePostBackEvent(String eventArgument) +140
           System.Web.UI.Page.RaisePostBackEvent(IPostBackEventHandler sourceControl, String eventArgument) +29
           System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) +2981
        --------------------------------------------------------------------------------
        Version Information: Microsoft .NET Framework Version:2.0.50727.4927; ASP.NET Version:2.0.50727.4927
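
    If you want to catch this before SharePoint throws, here is a minimal client-side sketch; the character list comes from the error message above, and the method name is my own, not part of the SharePoint API:

        // Hypothetical pre-check for content type names before calling create.
        // The illegal character list is taken from the SharePoint error message above.
        private static bool IsValidContentTypeName(string name)
        {
            char[] illegal = { '\\', '/', ':', '*', '?', '"', '#', '%', '<', '>', '{', '}', '|', '~', '&', ',', '\t' };
            if (string.IsNullOrEmpty(name)) return false;
            if (name.Contains("..")) return false; // two consecutive periods are also rejected
            return name.IndexOfAny(illegal) < 0;
        }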

    Read the article

  • AspNetCompatibility in WCF Services – easy to trip up

    - by Rick Strahl
    This isn’t the first time I’ve hit this particular wall. I’m creating a WCF REST service for AJAX callbacks and using the WebScriptServiceHostFactory host factory in the service:

        <%@ ServiceHost Language="C#" Service="WcfAjax.BasicWcfService" CodeBehind="BasicWcfService.cs" Factory="System.ServiceModel.Activation.WebScriptServiceHostFactory" %>

    to avoid all configuration. Because of the factory that creates the ASP.NET AJAX compatible format via the custom factory implementation, I can then remove all of the configuration settings that typically get dumped into the web.config file. However, I do want ASP.NET compatibility, so I still leave in:

        <system.serviceModel>
            <serviceHostingEnvironment aspNetCompatibilityEnabled="true"/>
        </system.serviceModel>

    in the web.config file. This option gives you access to the HttpContext.Current object, and with it access to most of the standard ASP.NET request and response features. This is not recommended as a primary practice, but it can be useful in some scenarios and in backwards compatibility scenarios with ASP.NET AJAX web services. Now, here’s where things get funky. Assuming you have the setting in web.config, if you now declare a service like this:

        [ServiceContract(Namespace = "DevConnections")]
        #if DEBUG
        [ServiceBehavior(IncludeExceptionDetailInFaults = true)]
        #endif
        public class BasicWcfService

    (or by using an interface that defines the service contract) you’ll find that the service will not work when an AJAX call is made against it. You’ll get a 500 error and a System.ServiceModel.ServiceActivationException system error. Worse, even with IncludeExceptionDetailInFaults enabled, you get absolutely no indication from WCF what the problem is. So what’s the problem? The issue is that once you specify aspNetCompatibilityEnabled="true" in the configuration, you *have to* specify the AspNetCompatibilityRequirements attribute with one of the modes that enables, or at least allows, compatibility. You need either Required or Allowed:

        [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Required)]

    Without it the service will simply fail without further warning. It will also fail if you set the attribute value to NotAllowed. The following also causes the service to fail as above:

        [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.NotAllowed)]

    This is not totally unreasonable, but it’s a difficult issue to debug, especially since the configuration setting is global: if you have more than one service and one requires traditional ASP.NET access and one doesn’t, then both must have the attribute specified. This is one reason why you’d want to avoid using this functionality unless absolutely necessary. WCF REST provides some basic access to some of the HTTP features after all, although what’s there is severely limited. I also wish that ServiceActivation errors would provide more error information. Getting an activation error without further info on what actually is wrong is pretty worthless, especially when it is a technicality like a mismatched configuration/attribute setting like this.

    © Rick Strahl, West Wind Technologies, 2005-2010. Posted in ASP.NET, WCF, AJAX
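
    Putting the pieces together, here is a minimal sketch of a service declaration that matches the aspNetCompatibilityEnabled="true" setting; the GetUserHost operation is my own illustration, not from the post:

        using System.ServiceModel;
        using System.ServiceModel.Activation;

        [ServiceContract(Namespace = "DevConnections")]
        [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Required)]
        public class BasicWcfService
        {
            [OperationContract]
            public string GetUserHost()
            {
                // HttpContext.Current is available because ASP.NET compatibility is required
                return System.Web.HttpContext.Current.Request.UserHostAddress;
            }
        }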

    Read the article

  • Crystal Reports for VS deployment to Web Application doesn't work (3 replies)

    I've followed the "documentation" for getting Crystal Reports to work on a production web site. I've migrated a VS 2003 web project to a VS 2008 web application. Everything works fine on my dev box. Publish the site out to the server (2003 x86) and no go on the reports; I get the infamous: ***** Error Type: System.IO.FileLoadException ***** Error Message: Could not load file or assembly 'CrystalDeci...

    Read the article

  • An Honest look at SharePoint Web Services

    - by juanlarios
    INTRODUCTION

    If you are a SharePoint developer you know that there are two basic ways to develop against SharePoint: 1) the object model and 2) web services. The SharePoint object model has the advantage of being quite rich. Anything you can do through the SharePoint UI as an administrator or end user, you can do through the object model. In fact, everything that is done through the UI is done through the object model behind the scenes. The major disadvantage to getting at SharePoint this way is that the code needs to run on the server. This means that all web parts, event receivers, features, etc… all of this is code that is deployed to the server. The second way to get to SharePoint is through the built-in web services. There are many articles on how to manipulate web services, how to authenticate to them and how to interact with them. The basic idea is that a remote application or process can contact SharePoint through a web service. Lots has been written about how great these web services are. This article is written to document the limitations and some of the issues and frustrations of working with SharePoint's built-in web services. Ultimately, for the tasks I was given, the built-in web services did not suffice. I evaluated them by comparing them against my own WCF services, written to do what I needed. The current project I'm working on involved several "integration points": a remote application, installed on a separate server, was to contact SharePoint and perform a task or operation. So I started up Visual Studio and built a DLL with basically two layers of logic: an integration layer and a data layer. A good friend of mine pointed me to the SOLID principles and referred me to some videos and tutorials about them. I decided to implement the methodology (although a lot of the principles are common sense and I had already incorporated them into my coding practices). I was to deliver this DLL to the application team; they would simply call the methods exposed by it and voila! It would do some task or operation in SharePoint.

    SOLUTION

    My integration layer implemented an interface that defined some of the basic integration tasks that I was to put together. My data layer was about the same: it implemented an interface with some of the tasks that I was going to develop. This gave me the opportunity to develop different data layers, and ultimately different ways to get at SharePoint if I needed to. This is a classic SOLID principle. In this case it proved to be quite helpful, because I wrote one data layer completely implementing the SharePoint built-in web services and another implementing my own WCF service. I should mention there is another layer underneath the data layer. In referencing SharePoint or WCF services in my Visual Studio project, I created a class for every web service call. So, for example, for Lists.asmx I created a class called "DocumentRetrieval"; this class would do the grunt work of connecting to the correct URL, performing the basic operation of contacting the service, and so on. For Views.asmx, I implemented a class called "ViewRetrieval" with the same idea as the last class, but interacting with all the operations in Views.asmx. This gave my data layer the ability to perform multiple calls without really worrying about the grunt work each class performs. This, again, is a classic SOLID principle. So, in order to compare them side by side, we can look at both data layers and what is involved in each.
    Let's take a look at the "Create Project" task or operation. The integration point is described as: "the dll is to provide a way to create a project in SharePoint". Projects, in this case, are basically document libraries. I am to implement a way in which a remote application can create a document library in SharePoint. Easy enough, right? Use the Lists.asmx web service in SharePoint. So here we go! Let's take a look at the code. I added the Lists.asmx web service reference to my project, and this is the class that contacts it:

        class DocumentRetrieval
        {
            private ListsSoapClient _service;
            private bool _impersonation;

            public DocumentRetrieval(bool impersonation, string endpt)
            {
                _service = new ListsSoapClient();
                this.SetEndPoint(string.Format("{0}/{1}", endpt, ConfigurationManager.AppSettings["List"]));
                _impersonation = impersonation;
                if (_impersonation)
                {
                    _service.ClientCredentials.Windows.ClientCredential.Password = ConfigurationManager.AppSettings["password"];
                    _service.ClientCredentials.Windows.ClientCredential.UserName = ConfigurationManager.AppSettings["username"];
                    _service.ClientCredentials.Windows.AllowedImpersonationLevel =
                        System.Security.Principal.TokenImpersonationLevel.Impersonation;
                }
            }

            private void SetEndPoint(string p)
            {
                _service.Endpoint.Address = new EndpointAddress(p);
            }

            /// <summary>
            /// Creates a document library with a specific name and template ID
            /// </summary>
            /// <param name="listName">New list name</param>
            /// <param name="templateID">Template ID</param>
            /// <returns></returns>
            public XmlElement CreateLibrary(string listName, int templateID, ref ExceptionContract exContract)
            {
                XmlDocument sample = new XmlDocument();
                XmlElement viewCol = sample.CreateElement("Empty");
                try
                {
                    _service.Open();
                    viewCol = _service.AddList(listName, "", templateID);
                }
                catch (Exception ex)
                {
                    exContract = new ExceptionContract("DocumentRetrieval/CreateLibrary", ex.GetType(), "Connection Error", ex.StackTrace, ExceptionContract.ExceptionCode.error);
                }
                finally
                {
                    _service.Close();
                }
                return viewCol;
            }
        }

    There was a lot more in this class (that I am not including) because I was reusing the grunt work and making other operations with Lists.asmx, for example updating content types and changing or configuring lists or document libraries. One of the first things I noticed about working with the built-in services is that you are really at the mercy of what is available to you. Before creating a document library (project) I wanted to expose an IsProjectExisting method, so the integration or data layer could recognize if a library already exists. Well, there is no service call or method available to do that check.
    So this is what I wrote:

        public bool DocLibExists(string listName, ref ExceptionContract exContract)
        {
            try
            {
                var allLists = _service.GetListCollection();
                return allLists.ChildNodes.OfType<XmlElement>().ToList().Exists(x => x.Attributes["Title"].Value == listName);
            }
            catch (Exception ex)
            {
                exContract = new ExceptionContract("DocumentRetrieval/GetList/GetListWSCall", ex.GetType(), "Unable to Retrieve List Collection", ex.StackTrace, ExceptionContract.ExceptionCode.error);
            }
            return false;
        }

    This really just gets an XmlElement with all the lists. It was then up to me to sift through the clutter and noise and see if the document library already existed. This took a little bit of getting used to: instead of working with code, you are working with the XmlElement response format from the web service. I wrote a LINQ query to go through and find whether the attribute "Title" existed with a value equal to the list name; if so, it would return true, if not, false. I didn't particularly like working this way, dealing with XmlElement responses and then having to manipulate them to get at the exact data I was looking for. Once the DocLibExists check was done, I would either create the document library or send back an error indicating that the document library already existed. Now let's examine the code that actually creates the document library. It does what you are really after: it creates a document library. Notice how the template ID is really an integer. Every document library template in SharePoint has an ID associated with it. Document libraries, image libraries, custom lists, project tasks, etc… they all have a unique integer associated with them. Well, that's great, but the client came back to me and gave me some specifics that each "project", or document library, should have. They specified they had three types of projects. Each project would have unique views, about 10 views for each project. Each project specified unique configurations (auditing, versioning, content types, etc…). So what started as a simple implementation of creating a document library as a repository for a project turned out to be quite involved. The first thing I thought of was to create a template for the document library. There are other ways you can do this too. Using the web service call, you could configure views, versioning, even content types, etc…; the only catch is that you have to work quite extensively with CAML. I am not fond of CAML. I can do it and work with it, I just don't like doing it. It is quite touchy, and at times it is quite tough to understand where errors were made in CAML statements. Working with web services and CAML proved to be quite annoying. The service call would return a generic error message that did not point me to a CAML syntax error, or even a CAML error at all. I was not sure if it was a security, performance or code-based issue. It was also difficult at times because of the way SharePoint handles metadata. There are "Name", "Display Name", and "StaticName" fields, and it was quite tough to understand at times which one to use. So it took a lot of trial and error. There are tools that can help with CAML generation. There is also now IntelliSense for CAML statements in Visual Studio that might help, but ultimately I'm not fond of CAML with web services. So I decided on the template.
    So my plan was to create a document library, configure it accordingly and then use the Template Builder that comes with the SharePoint SDK. This tool allows you to create site templates, list templates, etc… It is quite interesting because it does not generate an STP file; it actually generates an XML definition and a feature you can activate to make that template available on a site or site collection. The first issue I experienced with this is that one of the specifications for this template was that the "All Documents" view was to have two web parts on it. Well, it turns out that the template builder did not include the web parts as part of the list template definition it generated. It backed up the settings, the views and the content types, but not the custom web parts. I still decided to try this even without the web parts on the page. This new template defined a new document library definition with a unique ID. The problem was that the service call accepts an int, but it only has access to the built-in library int definitions. Any new ones added or created will not be available to create. So this made it impossible for me to approach the problem this way. I should also mention that one of the nice features of SharePoint is the ability to create list templates, back them up and then create lists based on those templates. It can all be done by end-user administrators. These templates are quite unique because they are saved as an STP file and not an XML definition. I also went this route and tried to see if there was another service call where I could create a document library based on a given template name. Nope! None. After some thinking I decided to implement a WCF service to do this creation for me. I was quite certain that the object model would allow me to create document libraries based on a template for which an ID was required, and also on templates saved as STP files. Now, I won't bother posting the code that contacts the WCF service because it's self-explanatory, but I will post the code that I used to create a list with a custom template:

        public ServiceResult CreateProject(string name, string templateName, string projectId)
        {
            string siteurl = SPContext.Current.Site.Url;
            Guid webguid = SPContext.Current.Web.ID;
            using (SPSite site = new SPSite(siteurl))
            {
                using (SPWeb rootweb = site.RootWeb)
                {
                    SPListTemplateCollection temps = site.GetCustomListTemplates(rootweb);
                    ProcessWeb(siteurl, webguid, web => Act_CreateProject(web, name, templateName, projectId, temps));
                } //SPWeb
            } //SPSite
            return _globalResult;
        }

        private void Act_CreateProject(SPWeb targetsite, string name, string templateName, string projectId, SPListTemplateCollection temps)
        {
            var temp = temps.Cast<SPListTemplate>().FirstOrDefault(x => x.Name.Equals(templateName));
            if (temp != null)
            {
                try
                {
                    Guid listGuid = targetsite.Lists.Add(name, "", temp);
                    SPList newList = targetsite.Lists[listGuid];
                    _globalResult = new ServiceResult(true, "Success", "Success");
                }
                catch (Exception ex)
                {
                    _globalResult = new ServiceResult(false, (string.IsNullOrEmpty(ex.Message) ? "None" : ex.Message + " " + templateName), ex.StackTrace.ToString());
                }
            }
        }

        private void ProcessWeb(string siteurl, Guid webguid, Action<SPWeb> action)
        {
            using (SPSite sitecollection = new SPSite(siteurl))
            {
                using (SPWeb web = sitecollection.AllWebs[webguid])
                {
                    action(web);
                }
            }
        }

    This is actually some of the code I implemented for the service; there was a lot more I did on project creation, which I will cover in my next blog post. I implemented an Action method to process the web. This allowed me to properly dispose of the SPWeb and SPSite objects and not rewrite this code over and over again. So I implemented a WCF service to create projects for me. This allowed me to do a lot more than just create a document library with a template; it now gave me the flexibility to do just about anything the client wanted at project creation. Once this was implemented, the client came back to me and said: "We reference all our projects with IDs in our application. We want SharePoint to do the same." This is something I have been doing for a little while now, but I do hope that SharePoint 2010 can have more of an answer to this and address it properly. I have been adding metadata to SPWebs through the property bag. I believe I have blogged about it before. This time it required metadata added to a document library. No problem!!! I also mentioned those web parts that were to go on the "All Documents" view. I took the opportunity to configure them to the appropriate settings. There were two settings that needed to be set on these web parts. One of them was a Project ID configured in the web part properties.
    The following code enhances and replaces the "Act_CreateProject" method above:

        private void Act_CreateProject(SPWeb targetsite, string name, string templateName, string projectId, SPListTemplateCollection temps)
        {
            var temp = temps.Cast<SPListTemplate>().FirstOrDefault(x => x.Name.Equals(templateName));
            if (temp != null)
            {
                SPLimitedWebPartManager wpmgr = null;
                try
                {
                    Guid listGuid = targetsite.Lists.Add(name, "", temp);
                    SPList newList = targetsite.Lists[listGuid];
                    SPFolder rootFolder = newList.RootFolder;
                    rootFolder.Properties.Add(KEY, projectId);
                    rootFolder.Update();
                    if (rootFolder.ParentWeb != targetsite)
                        rootFolder.ParentWeb.Dispose();
                    if (!templateName.Contains("Natural"))
                    {
                        SPView alldocumentsview = newList.Views.Cast<SPView>().FirstOrDefault(x => x.Title.Equals(ALLDOCUMENTS));
                        SPFile alldocfile = targetsite.GetFile(alldocumentsview.ServerRelativeUrl);
                        wpmgr = alldocfile.GetLimitedWebPartManager(PersonalizationScope.Shared);
                        ConfigureWebPart(wpmgr, projectId, CUSTOMWPNAME);
                        alldocfile.Update();
                    }
                    if (newList.ParentWeb != targetsite)
                        newList.ParentWeb.Dispose();
                    _globalResult = new ServiceResult(true, "Success", "Success");
                }
                catch (Exception ex)
                {
                    _globalResult = new ServiceResult(false, (string.IsNullOrEmpty(ex.Message) ? "None" : ex.Message + " " + templateName), ex.StackTrace.ToString());
                }
                finally
                {
                    if (wpmgr != null)
                    {
                        wpmgr.Web.Dispose();
                        wpmgr.Dispose();
                    }
                }
            }
        }

        private void ConfigureWebPart(SPLimitedWebPartManager mgr, string prjId, string webpartname)
        {
            var wp = mgr.WebParts.Cast<System.Web.UI.WebControls.WebParts.WebPart>().FirstOrDefault(x => x.DisplayTitle.Equals(webpartname));
            if (wp != null)
            {
                (wp as ListRelationshipWebPart.ListRelationshipWebPart).ProjectID = prjId;
                mgr.SaveChanges(wp);
            }
        }

    This shows you how I was able to set metadata on the document library. It has to be added to the RootFolder of the document library; unfortunately, the SPList does not have a property bag that I can add a key/value pair to, so it has to be done on the root folder. Now everything in the integration will reference projects by ID and will not care about names. My "DocLibExists" method will now need to be changed, because a web service is not set up to look at property bags. I had to write another method on the service to do the equivalent, but with IDs instead of names. The second thing you will notice about the code is the use of the web part manager. I have seen several examples online, and have also read a lot about memory leaks. The above code does not produce memory leaks: the web part manager creates an SPWeb, so just dispose of it like I did.
    CONCLUSION

    This is a long, long post, so I will stop here for now; I will continue with more comparisons and limitations in my next post. My conclusion for this example is that web services will do the trick if you can suffer through CAML and if you are doing some simple operations. For everything else, there's WCF! I apologize for the disorganization of this post: I was on a bus on a 12-hour trip to Iowa while I wrote it, half asleep and half awake; hopefully it makes enough sense to someone.

    Read the article

  • SSIS Catalog, Windows updates and deployment failures due to System.Core mismatch

    - by jamiet
    This is a heads-up for anyone doing development on SSIS. On my current project, where we are implementing a SQL Server Integration Services (SSIS) 2012 solution, we recently encountered a situation where we were unable to deploy any of our projects even though we had successfully deployed in the past. Any attempt to use the deployment wizard resulted in an error dialog. The text of the error (for all you search engine crawlers out there) was:

        A .NET Framework error occurred during execution of user-defined routine or aggregate "create_key_information": System.IO.FileLoadException: Could not load file or assembly 'System.Core, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference. (Exception from HRESULT: 0x80131040) ---> System.IO.FileLoadException: The located assembly's manifest definition does not match the assembly reference. (Exception from HRESULT: 0x80131040) System.IO.FileLoadException: System.IO.FileLoadException:
           at Microsoft.SqlServer.IntegrationServices.Server.Security.CryptoGraphy.CreateSymmetricKey(String algorithm)
           at Microsoft.SqlServer.IntegrationServices.Server.Security.CryptoGraphy.CreateKeyInformation(SqlString algorithmName, SqlBytes& key, SqlBytes& IV) . (Microsoft SQL Server, Error: 6522)

    After some investigation and a bit of back and forth with some very helpful members of the SSIS product team (hey Matt, Wee Hyong), it transpired that this was due to a .NET Framework fix that had been delivered via Windows Update. I took a look at the server update history and indeed there had been some recently applied .NET Framework updates. This fix had (in the words of Matt Masson) "somehow caused a mismatch on System.Core for SQLCLR" and, as you may know, SQLCLR is used heavily within the SSIS Catalog. The fix was pretty simple: restart SQL Server. This causes the assemblies to be upgraded automatically. If you are using Data Quality Services (DQS) you may have experienced similar problems, which are documented at Upgrade SQLCLR Assemblies After .NET Framework Update. I am hoping the SSIS team will follow up with a more thorough explanation on their blog soon. You DBAs out there may be questioning why Windows Update is set to automatically apply updates on our production servers. We're checking that out with our hosting provider right now. You have been warned! @Jamiet
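
    For reference, restarting a default SQL Server instance from an elevated command prompt looks like this (the service name MSSQLSERVER assumes a default instance; named instances use MSSQL$InstanceName instead):

        REM Restarting SQL Server forces the SQLCLR assemblies to be upgraded automatically
        net stop MSSQLSERVER
        net start MSSQLSERVER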

    Read the article

  • How to resolve broken dependencies of gnome-shell-extensions-user-theme package?

    - by swift
    After an unsuccessful upgrade of Gnome3 packages in a new Precise Pangolin 64-bit environment, I get this error:

        The following packages have unmet dependencies:
        gnome-shell-extensions : Conflicts: gnome-shell-extensions-user-theme but 3.2.0-2~webupd8~oneiric is to be installed

    I tried to remove it by running sudo apt-get purge gnome-shell-extensions-user-theme but get this:

        Package gnome-shell-extensions-user-theme is not installed, so not removed

    My Gnome Classic profile works well, but the Gnome3 session can't run. How do I resolve this error?
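
    One way to see where the conflicting 3.2.0-2~webupd8~oneiric candidate comes from is apt-cache policy; this is a diagnostic sketch on my part, not a confirmed fix for the question above:

        # Show which repository provides each candidate version
        apt-cache policy gnome-shell-extensions gnome-shell-extensions-user-theme
        # If the bad candidate comes from the WebUpd8 PPA, ppa-purge can roll it back:
        # sudo ppa-purge ppa:webupd8team/gnome3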

    Read the article

  • How to Run Apache Commands From Oracle HTTP Server 11g Home

    - by Daniel Mortimer
    Every now and then you come across a problem for which there is nothing in the "troubleshooting manual" that can help you. Instead you need to think outside the box. This happened to me two or three years back. Oracle HTTP Server (OHS) 11g did not start. The error reported back by OPMN was generic and gave no clue, and, worse, the HTTP Server error log was empty, and remained so even after I had increased the OPMN and HTTP Server log levels. After checking configuration files, operating system resources, etc., I was still no nearer the solution. And then the light bulb moment! OHS is based on Apache; what happens if I attempt to start HTTP Server using the native apache command? Trouble was, the OHS 11g solution has its binaries and configuration files in separate "home" directories:

    ORACLE_HOME contains the binaries
    ORACLE_INSTANCE contains the configuration files

    How to set the environment so that native apache commands run without error? Eventually, with help from a colleague, the knowledge article "How to Start Oracle HTTP Server 11g Without Using opmnctl" [ID 946532.1] was born! To be honest, I cannot remember the exact cause of, and solution to, that OHS problem two or three years ago. But I do remember that an attempt to start HTTP Server using the native apache command threw back an error to the console which led me to discover the culprit was some unusual filesystem fault. The other day, I was asked to review and publish a new knowledge article which described how to use the apache command to dump a list of static and shared loaded modules. This got me thinking that it was time [ID 946532.1] was given an update. The result: "How To Run Native Apache Commands in an Oracle HTTP Server 11g Environment" [ID 946532.1]. Highlights:

    Title change
    Improved environment setting scripts
    Interactive; there should be no need to manually edit the scripts (although readers are welcome to do so)
    Automatically dump out some diagnostic information
    Inclusion of some links to other troubleshooting collateral

    To view the knowledge article you need a My Oracle Support login. For convenience, you can obtain the scripts via the links below.

    MS Windows:
    Wrapper cmd script - calls main cmd script [After download, remove the ".txt" file extension]
    Main cmd script - sets OHS 11g environment to run Apache commands [After download, remove the ".txt" file extension]

    Unix:
    Shell script - sets OHS 11g environment to run Apache commands on Unix

    Please note: I cannot guarantee that the scripts held in the blog repository will be maintained. Any enhancements or fixes will be applied to the scripts attached to the knowledge article. Lastly, to find out more about native apache commands, refer to the Apache documentation:

    apachectl - Apache HTTP Server Control Interface [http://httpd.apache.org/docs/2.2/programs/apachectl.html]
    httpd - Apache Hypertext Transfer Protocol Server [http://httpd.apache.org/docs/2.2/programs/httpd.html]
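
    As a rough sketch of the kind of environment setup the knowledge article's scripts perform (all paths below are assumptions for a typical OHS 11g install; use the scripts attached to [ID 946532.1] for the real thing):

        # Point the native Apache binaries at the instance configuration
        export ORACLE_HOME=/u01/app/oracle/Middleware/Oracle_WT1
        export ORACLE_INSTANCE=$ORACLE_HOME/instances/instance1
        export LD_LIBRARY_PATH=$ORACLE_HOME/lib:$LD_LIBRARY_PATH
        # Run a native apache command (here: a config syntax check) against the OHS component's httpd.conf
        $ORACLE_HOME/ohs/bin/apachectl -t -f $ORACLE_INSTANCE/config/OHS/ohs1/httpd.conf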

    Read the article

  • Configure clean URLs using Laravel using a rewrite rule to index.php

    - by yannis hristofakis
    Recently I've started learning Laravel; I have no previous experience with frameworks. I'm encountering the following problem: I'm trying to configure the .htaccess file so I can have clean URLs, but the only thing I get is 404 Not Found error pages. I have created a virtual host (you can see the configuration file below) and changed the .htaccess file in the public directory.

    /etc/apache2/sites-available

        <VirtualHost *:80>
            ServerAdmin [email protected]
            ServerName laravel.lar
            DocumentRoot "/home/giannis/Desktop/laravel/public"
            <Directory "/home/giannis/Desktop/laravel/public">
                Options Indexes FollowSymLinks MultiViews
                AllowOverride All
            </Directory>
            <Directory /var/www/>
                Options Indexes FollowSymLinks MultiViews
                AllowOverride None
                Order allow,deny
                allow from all
            </Directory>
            ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
            <Directory "/usr/lib/cgi-bin">
                AllowOverride None
                Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
                Order allow,deny
                Allow from all
            </Directory>
            ErrorLog ${APACHE_LOG_DIR}/error.log
            # Possible values include: debug, info, notice, warn, error, crit, alert, emerg.
            LogLevel warn
            CustomLog ${APACHE_LOG_DIR}/access.log combined
            Alias /doc/ "/usr/share/doc/"
            <Directory "/usr/share/doc/">
                Options Indexes MultiViews FollowSymLinks
                AllowOverride None
                Order deny,allow
                Deny from all
                Allow from 127.0.0.0/255.0.0.0 ::1/128
            </Directory>
        </VirtualHost>

    .htaccess file: laravel/public

        # Apache configuration file
        # http://httpd.apache.org/docs/current/mod/quickreference.html
        # Note: ".htaccess" files are an overhead for each request. This logic should
        # be placed in your Apache config whenever possible.
        # http://httpd.apache.org/docs/current/howto/htaccess.html

        # Turning on the rewrite engine is necessary for the following rules and
        # features. "+FollowSymLinks" must be enabled for this to work symbolically.
        <IfModule mod_rewrite.c>
            Options +FollowSymLinks
            RewriteEngine On
        </IfModule>

        # For all files not found in the file system, reroute the request to the
        # "index.php" front controller, keeping the query string intact
        <IfModule mod_rewrite.c>
            RewriteEngine On
            RewriteBase /
            RewriteCond %{REQUEST_FILENAME} !-f
            RewriteCond %{REQUEST_FILENAME} !-d
            RewriteRule ^(.*)$ index.php?/$1 [L]
        </IfModule>

    In order to test it, I created a view named about and set up the proper routing. If I go to http://laravel.lar/index.php/about/ I am routed to the about page; but if I go to http://laravel.lar/about/ I get a 404 Not Found error. I'm using a Debian-based system.
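
    One thing worth ruling out first (my assumption, since the post doesn't say whether it was checked): AllowOverride All and the rewrite rules above only take effect if mod_rewrite is actually enabled on the server:

        # Enable the rewrite module and restart Apache (Debian/Ubuntu)
        sudo a2enmod rewrite
        sudo service apache2 restart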

    Read the article

  • Grub rescue: unknown filesystem error while booting from USB

    - by Dilip Kumar
    I'm currently using Ubuntu 12.04, which has GRUB 2, and wanted to install Windows 8 from my USB drive. I created a bootable USB flash drive using the Windows 7 USB/DVD Download Tool. The problem is that I formatted my hard drive, and now whenever I try to boot from the flash drive I get the error "error: unknown filesystem" and am dropped to the grub rescue prompt. Since my DVD drive is not working, the only way for me to install Windows 8 is from USB. Can anybody help me with this?

    Read the article

  • Atom feed validator keeps showing Self reference doesn't match document location

    - by Dino
    I am creating an Atom feed, but when I validate it I keep getting: "Self reference doesn't match document location", and the specific line that is causing the error is: <link rel="self" type="application/atom+xml" href="http://www.example.com/test.rss"/> Please can anyone advise what the error is? P.S. I noticed an up arrow just at the end of that line in the validator output (presumably something to do with that, but I'm not sure).

    Read the article

  • Unable to install Konstruktor program

    - by George Barnick
    I've recently switched to Linux, having a good knowledge of how it works before switching, since I've had experience working with Ubuntu-based servers for several months. I've been trying to install a LEGO CAD program that I found, based on an open source library. Information on that is here. The program I'm looking at is called Konstruktor, and for the most part it should be working. One dependency, however, does not seem to install. The missing dependency is "kdelibs5"; I know what it is and what it's for, but I can't get it to install. When trying to install it, I get this error:

        george@GEORGE-PC:~$ sudo apt-get install kdelibs5
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        Package kdelibs5 is not available, but is referred to by another package.
        This may mean that the package is missing, has been obsoleted, or is only available from another source
        E: Package 'kdelibs5' has no installation candidate

    When trying to install the proper package of Konstruktor for my system, I get this error:

        george@GEORGE-PC:~$ sudo dpkg -i '/home/george/Downloads/konstruktor_0.9-beta1-2_amd64.deb'
        Selecting previously unselected package konstruktor.
        (Reading database ... 224684 files and directories currently installed.)
        Unpacking konstruktor (from .../konstruktor_0.9-beta1-2_amd64.deb) ...
        dpkg: dependency problems prevent configuration of konstruktor:
        konstruktor depends on kdelibs5 (>= 4:4.4.5); however:
        Package kdelibs5 is not installed.
        dpkg: error processing konstruktor (--install):
        dependency problems - leaving unconfigured
        Processing triggers for bamfdaemon ...
        Rebuilding /usr/share/applications/bamf-2.index...
        Processing triggers for desktop-file-utils ...
        Processing triggers for gnome-menus ...
        Processing triggers for mime-support ...
        Processing triggers for hicolor-icon-theme ...
        Processing triggers for shared-mime-info ...
        Unknown media type in type 'all/all'
        Unknown media type in type 'all/allfiles'
        Unknown media type in type 'uri/mms'
        Unknown media type in type 'uri/mmst'
        Unknown media type in type 'uri/mmsu'
        Unknown media type in type 'uri/pnm'
        Unknown media type in type 'uri/rtspt'
        Unknown media type in type 'uri/rtspu'
        Errors were encountered while processing:
        konstruktor

    I have tried updating and adding apt repositories for KDE packages, but I continuously get the same error. I tried the package "kdelibs5-dev", which installed fine, but "kdelibs5" won't, and Konstruktor still won't install. I am on Ubuntu 13.10 with GNOME Shell on amd64 architecture. Any help would be much appreciated. (I originally posted my issue here.) Thanks ahead of time.

    Read the article

  • Diagnosing ADF Mobile iOS deployment problems

    - by Chris Muir
    From time to time I encounter customers who have taken possession of a brand new Apple Mac, have that excited "I've just spent more on a computer than I ever wanted to but it's okay" crazy gleam in their eye, but on pre-loading all the necessary software for Oracle's ADF Mobile to start their mobile campaign, following Oracle's setup instructions and deploying their first app to Apple's XCode iPhone Simulator, they hit this error message in the JDeveloper Log-Deployment window:

    [01:36:46 PM] Deployment cancelled.
    [01:36:46 PM] ----  Deployment incomplete  ----.
    [01:36:46 PM] Failed to build the iOS application bundle.
    [01:36:46 PM] Deployment failed due to one or more errors returned by '/Applications/Xcode.app/Contents/Developer/usr/bin/xcodebuild'.  The following is a summary of the returned error(s): Command-line execution failed (Return code: 69)

    "Oh, return code 69, I know that well" I hear you say. Admittedly the error code is less than useful, besides drawing some titters from the peanut gallery. Before explaining what's gone wrong, I think it's useful to teach customers how to diagnose these issues themselves. When ADF Mobile commences a deployment, be it to Apple's iOS or Google's Android platforms, JDeveloper and ADF Mobile do a good job in the Log window of showing you what the deployment process entails. In the case of deploying to iOS, the log window will literally include the XCode commands executed to complete the deployment cycle. As an example, here's the log output that was produced before the error message was raised... take the opportunity to read this line by line and note the command-line calls highlighted in blue:

    (Note: in the original blog post some of these lines were split over multiple lines for readability; each line is preceded by a timestamp. Ensure to check the exact commands from JDev.)

    [01:36:33 PM] Target platform is (iOS).
    [01:36:33 PM] Beginning deployment of ADF Mobile application 'LayoutDemo' to iOS using profile 'IOS_MOBILE_NATIVE_archive1'.
    [01:36:34 PM] Command-line executed: [/Applications/Xcode.app/Contents/Developer/usr/bin/xcodebuild, -version]
    [01:36:34 PM] Command-line execution succeeded.
    [01:36:34 PM] Running dependency analysis...
    [01:36:34 PM] Building...
    [01:36:34 PM] Deploying 3 profiles...
    [01:36:35 PM] Wrote Archive Module to /Users/chris/fmw/jdeveloper/jdev/extensions/oracle.adf.mobile/Samples/PublicSamples/LayoutDemo/ApplicationController/deploy/ApplicationController.jar
    [01:36:35 PM] WARNING: No Resource Catalog enabled ADF components found to package
    [01:36:36 PM] Wrote Archive Module to /Users/chris/fmw/jdeveloper/jdev/extensions/oracle.adf.mobile/Samples/PublicSamples/LayoutDemo/ViewController/deploy/ViewController.jar
    [01:36:36 PM] Verifying existence of the .adf source directory of the ADF Mobile application...
    [01:36:36 PM] Verifying Application Controller project exists...
    [01:36:36 PM] Verifying application dependencies...
    [01:36:36 PM] The application may not function correctly because the following dependent libraries are missing: /Users/chris/jdev/jdeveloper/jdeveloper/jdev/extensions/oracle.adf.mobile/lib/adfmf.springboard.jar
    [01:36:36 PM] Verifying project dependencies...
    [01:36:36 PM] Validating application XML files...
    [01:36:36 PM] Validating XML files in project ApplicationController...
    [01:36:36 PM] Validating XML files in project ViewController...
    [01:36:40 PM] Copying common javascript files...
    [01:36:41 PM] Copying FARs to the ADF Mobile Framework application...
    [01:36:41 PM] Extracting Feature Archive file, "ApplicationController.jar" to deployment folder, "ApplicationController".
    [01:36:42 PM] Extracting Feature Archive file, "ViewController.jar" to deployment folder, "ViewController".
    [01:36:42 PM] Deploying skinning files...
    [01:36:43 PM] Copying the CVM SDK files built for the x86 processor...
    [01:36:43 PM] Copying the CVM JDK files built for the x86 processor...
    [01:36:43 PM] Command-line executed: [cp, -R, -p, /Users/chris/fmw/jdeveloper/jdev/extensions/oracle.adf.mobile/iOS/jvmti/x86/, /Users/chris/fmw/jdeveloper/jdev/extensions/oracle.adf.mobile/Samples/PublicSamples/LayoutDemo/deploy/IOS_MOBILE_NATIVE_archive1/temporary_xcode_project/lib]
    [01:36:43 PM] Command-line execution succeeded.
    [01:36:43 PM] Command-line executed: [cp, -R, -p, /Users/chris/fmw/jdeveloper/jdev/extensions/oracle.adf.mobile/iOS/jvmti/jar/, /Users/chris/fmw/jdeveloper/jdev/extensions/oracle.adf.mobile/Samples/PublicSamples/LayoutDemo/deploy/IOS_MOBILE_NATIVE_archive1/temporary_xcode_project/lib]
    [01:36:43 PM] Command-line execution succeeded.
    [01:36:43 PM] Copying security related files to the ADF Mobile Framework application...
    [01:36:44 PM] Command-line executed from path: /Users/chris/fmw/jdeveloper/jdev/extensions/oracle.adf.mobile/Samples/PublicSamples/LayoutDemo/deploy/IOS_MOBILE_NATIVE_archive1/temporary_xcode_project/
    [01:36:44 PM] Command-line executed: /Applications/Xcode.app/Contents/Developer/usr/bin/xcodebuild clean install -configuration Debug -sdk /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator6.1.sdk DSTROOT=/Users/chris/fmw/jdeveloper/jdev/extensions/oracle.adf.mobile/Samples/PublicSamples/LayoutDemo/deploy/IOS_MOBILE_NATIVE_archive1/Destination_Root/ IPHONEOS_DEPLOYMENT_TARGET=5.0 TARGETED_DEVICE_FAMILY=1,2 PRODUCT_NAME=LayoutDemo ADD_SETTINGS_BUNDLE=NO

    As you can see, once JDeveloper finishes its own work it passes the job off in the last few lines to Apple's XCode to assemble and deploy the required .ipa file. From the original error message that followed, complaining about xcodebuild failing with return code 69, we can quickly see the exact command line used to call xcodebuild. As this is the exact command line call with all its options, you're free to open a Terminal window in Mac OS X and execute the same command by simply copying and pasting it. Via this you'll then find out what return code 69 actually means. Unfortunately it's not that exciting.

    For Macs that have just been installed and configured with XCode, XCode (and for that matter iTunes, which ADF Mobile also requires in order to deploy) must have been run at least once beforehand on your brand new Mac (to be clear, that's once ever, not once every restart). On doing so you will be presented with a license agreement from Apple that you must accept. Only once you've done this will the command-line calls work. They're currently failing because you haven't accepted the legal terms and conditions. (Arguably you can also accept the terms and conditions from the command line, but ADF Mobile cannot do this on your behalf, so it's just easier to open the tools and confirm the legal requirements that way.)

    Putting aside the error code and its meaning: watching the log window, watching what commands are executed, and learning what they do will help you diagnose these sorts of issues yourself and solve them relatively quickly.
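
    If you'd rather stay in the Terminal, recent versions of xcodebuild can also show and accept the license from the command line; the exact flag varies by Xcode version, so treat this as a sketch rather than the documented procedure:

    sudo xcodebuild -license    # page through the agreement, then type 'agree'

    After that, re-running the copied xcodebuild command should get past return code 69.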
    From my perspective as an Oracle Product Manager, it allows me to say "this is the stuff you don't need to worry about when you use ADF Mobile when it's configured correctly".... as you can see, my salesman qualities shine through. For anyone who is happily using ADF Mobile on a Mac and wondering why you didn't hit these issues, it's quite likely that you already accepted the license conditions before deploying via ADF Mobile. For instance, though I'm not a fan of iTunes itself, iTunes was one of the first things I loaded on my Mac to access my Justin Bieber albums.

    Read the article

  • Conversion of BizTalk Projects to Use the New WCF-SAP Adaptor

    - by Geordie
    We are in the process of upgrading our BizTalk environment from BizTalk 2006 R2 to BizTalk 2010. The SAP adaptor in BizTalk 2010 is an all-new and more powerful WCF-SAP adaptor. When my colleagues tested the new adaptor they discovered that the format of the data extracted from SAP was not identical to that of the old adaptor. This is not a big deal if the structure of the messages from SAP is simple. In our case we were receiving the delivery and invoice iDocs, and both structures are complex, especially the delivery document. Over the past few years I have tweaked the delivery mapping to remove bugs from the original mapping. The idea of redoing these maps did not appeal and, given the current workload, was not even an option. I opted for the rather crude alternative of pulling in the iDoc in the new typed format and then adding a static map at the start of the orchestration to convert the data to the old schema.

    Note the WCF-SAP data formats (on the binding tab of the configuration dialog box this is the 'ReceiveIdocFormat' field):

    Typed: returns an XML document with the hierarchy represented in XML and all fields represented by XML tags.
    RFC: returns an XML document with the hierarchy represented in XML but the iDoc lines in flat file format.
    String: returns the iDoc in a format that is closest to the original flat file format but still wrapped with some top-level XML tags. The files also contained some strange characters at the end of each line.

    I started with the invoice document, and it was quite straightforward to add the mapping, but this is where my problems started. The orchestrations for these documents are dynamic and so require the identity of the partner to correctly configure the orchestration. The partner identity is in the EDI_DC40 segment of the iDoc. In the old project the RECPRN node of the segment was promoted. The code to set a variable to the partner ID was now failing. After a lot of head scratching I discovered the problem was due to the addition of namespaces to the fields in the EDI_DC40 segment. To overcome this I needed to use an XPath query with a namespace manager, which had to be done in custom code.

    I then tried to repeat the process with the delivery document. Unfortunately, when we tried to get sample typed data from SAP, an exception was thrown:

    The adapter "WCF-SAP" raised an error message. Details "Microsoft.ServiceModel.Channels.Common.XmlReaderGenerationException: The segment or group definition E2EDKA1001 was not found in the IDoc metadata. The UniqueId of the IDoc type is: IDOCTYP/3/DESADV01/ZASNEXT1/640. For Receive operations, the SAP adapter does not support unreleased segments.

    Our guess is that when the WCF-SAP adaptor tries to download the data it retrieves a data schema from SAP, and for some reason the schema does not match the data. This may be due to the version of SAP we are running or due to a customization. Either way, resolving this problem did not look easy. While researching it I found an article showing how to get the data from SAP using the WCF-SAP adaptor without any XML tags: http://blogs.msdn.com/b/adapters/archive/2007/10/05/receiving-idocs-getting-the-raw-idoc-data.aspx

    Reproduction of Mustansir's blog: Since the WCF-based SAP Adapter is ... well, WCF based, all data flowing in and out of the adapter is encapsulated within a SOAP message. Which means there are those pesky XML tags all over the place.
    If you want to receive an Idoc from SAP, you can receive it in "Typed" format (in which case each column in each segment of the idoc appears within its own xml tag), or you can receive it in "String" format (in which case there are just 2 xml tags at the top, the raw idoc data in string/flat file format, and the 2 closing xml tags). In "String" format, an incoming idoc (for ORDERS05, containing 5 data records) would look like:

    <ReceiveIdoc ><idocData>EDI_DC40 8000000000001064985620 E2EDK01005 800000000000106498500000100000001 E2EDK14 8000000000001064985000002000000020111000 E2EDK14 8000000000001064985000003000000020081000 E2EDK14 80000000000010649850000040000000200710 E2EDK14 80000000000010649850000050000000200600</idocData></ReceiveIdoc>

    (I have trimmed part of the control record so that it fits cleanly here on one line.)

    Now, you're only interested in the IDOC data, and don't care much for the XML tags. It isn't that difficult to write your own pipeline component, or even some logic in the orchestration, to remove the tags, right? Well, you don't need to write any extra code at all - the WCF Adapter can help you here! During the configuration of your one-way Receive Location using WCF-Custom, navigate to the Messages tab. Under the section "Inbound BizTalk Message Body", select the "Path" radio button, and:

    (a) Enter the body path expression as: /*[local-name()='ReceiveIdoc']/*[local-name()='idocData']
    (b) Choose "String" for the Node Encoding.

    What we've done is use an XPath to pull out the value of the "idocData" node from the XML. Your Receive Location will now emit text containing only the idoc data. At this point you can, for example, use the Flat File Pipeline component to convert the flat text into a different xml format based on some other schema you already have, and receive your version of the xml-formatted message in your orchestration.

    This was potentially a much easier solution than adding the static maps to the orchestrations, and it overcame the issue with 'Typed' delivery documents. Not quite so fast...

    Note: when I followed Mustansir's blog, the characters at the end of each line disappeared. After configuring the adaptor and passing the iDoc data into the original flat file receive pipelines, I was receiving exceptions:

    There was a failure executing the receive pipeline: "PAPINETPipelines.DeliveryFlatFileReceive, CustomerIntegration2.PAPINET.Pipelines, Version=1.0.0.0, Culture=neutral, PublicKeyToken=4ca3635fbf092bbb" Source: "Pipeline " Receive Port: "recSAP_Delivery" URI: "D:\CustomerIntegration2\SAP\Delivery\*.xml" Reason: An error occurred when parsing the incoming document: "Unexpected data found while looking for: 'Z2EDPZ7' The current definition being parsed is E2EDP07GRP. The stream offset where the error occured is 8859. The line number where the error occured is 23. The column where the error occured is 0.".

    Although the new flat file looked the same as the old one, there was a difference: in the original file all lines were exactly 1064 characters long, whereas in the new file every line was truncated at the last alphanumeric character. The final piece of the puzzle was to add a custom pipeline component to pad all lines to 1064 characters. This component was added to the decode node of the custom delivery and invoice flat file disassembler pipelines.
    Execute method of the custom pipeline component:

    // Requires: System, System.IO, System.Text,
    // Microsoft.BizTalk.Message.Interop (IBaseMessage, IBaseMessagePart)
    // and Microsoft.BizTalk.Component.Interop (IPipelineContext).
    public IBaseMessage Execute(IPipelineContext pc, IBaseMessage inmsg)
    {
        // Read the message body as a stream.
        Stream s = null;
        IBaseMessagePart bodyPart = inmsg.BodyPart;

        // NOTE: inmsg.BodyPart.Data is implemented only as a setter in the HTTP adapter API
        // and as a getter and setter for the file adapter.
        // Use GetOriginalDataStream to get the data instead.
        if (bodyPart != null)
            s = bodyPart.GetOriginalDataStream();

        string newMsg = string.Empty;
        string strLine;
        try
        {
            StreamReader sr = new StreamReader(s);
            strLine = sr.ReadLine();
            while (strLine != null)
            {
                // Pad every line back out to the fixed iDoc record length of 1064 characters.
                strLine = strLine.PadRight(1064, ' ') + "\r\n";
                newMsg += strLine;
                strLine = sr.ReadLine();
            }
            sr.Close();
        }
        catch (IOException ex)
        {
            // Wrap the failure, preserving the original exception as the inner exception.
            throw new Exception("Error occurred trying to pad the message to 1064 characters", ex);
        }

        // Convert back to a stream, assign it to the Data property and rewind it.
        inmsg.BodyPart.Data = new MemoryStream(Encoding.UTF8.GetBytes(newMsg));
        inmsg.BodyPart.Data.Position = 0;
        return inmsg;
    }

    Read the article

  • Broadcom BCM4318 firmware missing

    - by Lucas
    I tried:

    sudo apt-get update && apt-cache search b43
    sudo apt-get install firmware-b43-installer

    After that I get this error: "Can't update. Some index files failed to download." I tried:

    cd /var/lib/apt/lists/partial
    sudo rm gb.archive.ubuntu.com_ubuntu_dists_natty_multiverse_source_Sources

    or:

    sudo rm /var/lib/apt/lists/partial/gb.archive.ubuntu.com_ubuntu_dists_natty_multiverse_source_Sources

    and got another error: cannot remove e_source_Sources
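
    For the failed index downloads, a commonly suggested approach (a sketch, not something the original question tried) is to clear the whole lists directory rather than picking files out of partial/, then refresh the indexes and retry the firmware install:

    sudo rm -r /var/lib/apt/lists/*    # apt recreates this directory on the next update
    sudo apt-get update
    sudo apt-get install firmware-b43-installer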

    Read the article

  • External USB 3.0 hard drive is not recognised when plugged into a USB 3.0 port (Ubuntu Natty 64-bit).

    - by kimangroo
    I have an Iomega Prestige Portable External Hard Drive 1TB USB 3.0. It works fine on Windows 7 as a USB 3.0 drive. It isn't detected on Ubuntu Natty 64-bit, 2.6.38-8-generic. fdisk -l cannot see it at all:

    Disk /dev/sda: 500.1 GB, 500107862016 bytes
    255 heads, 63 sectors/track, 60801 cylinders
    Units = cylinders of 16065 * 512 = 8225280 bytes
    Sector size (logical/physical): 512 bytes / 512 bytes
    I/O size (minimum/optimal): 512 bytes / 512 bytes
    Disk identifier: 0x1bed746b

    Device     Boot  Start    End    Blocks     Id  System
    /dev/sda1         1       1689   13560832   27  Unknown
    /dev/sda2  *      1689    1702   102400     7   HPFS/NTFS
    /dev/sda3         1702    19978  146805760  7   HPFS/NTFS
    /dev/sda4         19978   60802  327914497  5   Extended
    /dev/sda5         25555   60802  283120640  7   HPFS/NTFS
    /dev/sda6         19978   23909  31571968   83  Linux
    /dev/sda7         23909   25555  13218816   82  Linux swap / Solaris

    Partition table entries are not in disk order

    lsusb can see it:

    Bus 003 Device 003: ID 059b:0070 Iomega Corp.
    Bus 003 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
    Bus 002 Device 004: ID 05fe:0011 Chic Technology Corp. Browser Mouse
    Bus 002 Device 003: ID 0a12:0001 Cambridge Silicon Radio, Ltd Bluetooth Dongle (HCI mode)
    Bus 002 Device 002: ID 8087:0024 Intel Corp. Integrated Rate Matching Hub
    Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
    Bus 001 Device 005: ID 0489:e00f Foxconn / Hon Hai
    Bus 001 Device 004: ID 0c45:64b5 Microdia
    Bus 001 Device 003: ID 08ff:168f AuthenTec, Inc.
    Bus 001 Device 002: ID 8087:0024 Intel Corp. Integrated Rate Matching Hub
    Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub

    And dmesg | grep -i xhci (I may have unplugged the drive and plugged it back in again after booting):

    [    1.659060] pci 0000:04:00.0: xHCI HW did not halt within 2000 usec status = 0x0
    [   11.484971] xhci_hcd 0000:04:00.0: PCI INT A -> GSI 18 (level, low) -> IRQ 18
    [   11.484997] xhci_hcd 0000:04:00.0: setting latency timer to 64
    [   11.485002] xhci_hcd 0000:04:00.0: xHCI Host Controller
    [   11.485064] xhci_hcd 0000:04:00.0: new USB bus registered, assigned bus number 3
    [   11.636149] xhci_hcd 0000:04:00.0: irq 18, io mem 0xc5400000
    [   11.636241] xhci_hcd 0000:04:00.0: irq 43 for MSI/MSI-X
    [   11.636246] xhci_hcd 0000:04:00.0: irq 44 for MSI/MSI-X
    [   11.636251] xhci_hcd 0000:04:00.0: irq 45 for MSI/MSI-X
    [   11.636256] xhci_hcd 0000:04:00.0: irq 46 for MSI/MSI-X
    [   11.636261] xhci_hcd 0000:04:00.0: irq 47 for MSI/MSI-X
    [   11.639654] xHCI xhci_add_endpoint called for root hub
    [   11.639655] xHCI xhci_check_bandwidth called for root hub
    [   11.956366] usb 3-1: new SuperSpeed USB device using xhci_hcd and address 2
    [   12.001073] xhci_hcd 0000:04:00.0: WARN: short transfer on control ep
    [   12.007059] xhci_hcd 0000:04:00.0: WARN: short transfer on control ep
    [   12.012932] xhci_hcd 0000:04:00.0: WARN: short transfer on control ep
    [   12.018922] xhci_hcd 0000:04:00.0: WARN: short transfer on control ep
    [   12.049139] xhci_hcd 0000:04:00.0: WARN: short transfer on control ep
    [   12.056754] xhci_hcd 0000:04:00.0: WARN: short transfer on control ep
    [   12.131607] xhci_hcd 0000:04:00.0: WARN no SS endpoint bMaxBurst
    [   12.179717] xhci_hcd 0000:04:00.0: WARN: short transfer on control ep
    [   12.686876] xhci_hcd 0000:04:00.0: WARN: babble error on endpoint
    [   12.687058] xhci_hcd 0000:04:00.0: WARN Set TR Deq Ptr cmd invalid because of stream ID configuration
    [   12.687152] xhci_hcd 0000:04:00.0: ERROR Transfer event for disabled endpoint or incorrect stream ring
    [   43.330737] usb 3-1: reset SuperSpeed USB device using xhci_hcd and address 2
    [   43.422579] xhci_hcd 0000:04:00.0: WARN: short transfer on control ep
    [   43.422658] xhci_hcd 0000:04:00.0: xHCI xhci_drop_endpoint called with disabled ep ffff88014669af00
    [   43.422665] xhci_hcd 0000:04:00.0: xHCI xhci_drop_endpoint called with disabled ep ffff88014669af40
    [   43.422671] xhci_hcd 0000:04:00.0: xHCI xhci_drop_endpoint called with disabled ep ffff88014669af80
    [   43.422677] xhci_hcd 0000:04:00.0: xHCI xhci_drop_endpoint called with disabled ep ffff88014669afc0
    [   43.531159] xhci_hcd 0000:04:00.0: WARN no SS endpoint bMaxBurst
    [  125.160248] xhci_hcd 0000:04:00.0: WARN no SS endpoint bMaxBurst
    [  903.766466] usb 3-1: new SuperSpeed USB device using xhci_hcd and address 3
    [  903.807789] xhci_hcd 0000:04:00.0: WARN: short transfer on control ep
    [  903.813530] xhci_hcd 0000:04:00.0: WARN: short transfer on control ep
    [  903.819400] xhci_hcd 0000:04:00.0: WARN: short transfer on control ep
    [  903.825104] xhci_hcd 0000:04:00.0: WARN: short transfer on control ep
    [  903.855067] xhci_hcd 0000:04:00.0: WARN: short transfer on control ep
    [  903.862314] xhci_hcd 0000:04:00.0: WARN: short transfer on control ep
    [  903.862597] xhci_hcd 0000:04:00.0: WARN no SS endpoint bMaxBurst
    [  903.913211] xhci_hcd 0000:04:00.0: WARN: short transfer on control ep
    [  904.424416] xhci_hcd 0000:04:00.0: WARN: babble error on endpoint
    [  904.424599] xhci_hcd 0000:04:00.0: WARN Set TR Deq Ptr cmd invalid because of stream ID configuration
    [  904.424700] xhci_hcd 0000:04:00.0: ERROR Transfer event for disabled endpoint or incorrect stream ring
    [  935.139021] usb 3-1: reset SuperSpeed USB device using xhci_hcd and address 3
    [  935.226075] xhci_hcd 0000:04:00.0: WARN: short transfer on control ep
    [  935.226140] xhci_hcd 0000:04:00.0: xHCI xhci_drop_endpoint called with disabled ep ffff880148186b00
    [  935.226148] xhci_hcd 0000:04:00.0: xHCI xhci_drop_endpoint called with disabled ep ffff880148186b40
    [  935.226153] xhci_hcd 0000:04:00.0: xHCI xhci_drop_endpoint called with disabled ep ffff880148186b80
    [  935.226159] xhci_hcd 0000:04:00.0: xHCI xhci_drop_endpoint called with disabled ep ffff880148186bc0
    [  935.343339] xhci_hcd 0000:04:00.0: WARN no SS endpoint bMaxBurst

    I thought it might be that the firmware wasn't compatible with Linux or something, but when booting a live image of PartedMagic (2.6.38.4-pmagic), the drive was detected fine: I could mount it and got USB 3.0 speeds (at least double the speeds I got from plugging the same drive into USB 2.0 ports).

    dmesg in PartedMagic did say something about "no SuperSpeed endpoint", an error I also saw in a previous dmesg on Ubuntu:

    Jun 27 15:49:02 (none) user.info kernel: [ 2.978743] xhci_hcd 0000:04:00.0: PCI INT A -> GSI 18 (level, low) -> IRQ 18
    Jun 27 15:49:02 (none) user.debug kernel: [ 2.978771] xhci_hcd 0000:04:00.0: setting latency timer to 64
    Jun 27 15:49:02 (none) user.info kernel: [ 2.978781] xhci_hcd 0000:04:00.0: xHCI Host Controller
    Jun 27 15:49:02 (none) user.info kernel: [ 2.978856] xhci_hcd 0000:04:00.0: new USB bus registered, assigned bus number 3
    Jun 27 15:49:02 (none) user.info kernel: [ 3.089458] xhci_hcd 0000:04:00.0: irq 18, io mem 0xc5400000
    Jun 27 15:49:02 (none) user.debug kernel: [ 3.089541] xhci_hcd 0000:04:00.0: irq 42 for MSI/MSI-X
    Jun 27 15:49:02 (none) user.debug kernel: [ 3.089544] xhci_hcd 0000:04:00.0: irq 43 for MSI/MSI-X
    Jun 27 15:49:02 (none) user.debug kernel: [ 3.089546] xhci_hcd 0000:04:00.0: irq 44 for MSI/MSI-X
    Jun 27 15:49:02 (none) user.debug kernel: [ 3.089548] xhci_hcd 0000:04:00.0: irq 45 for MSI/MSI-X
    Jun 27 15:49:02 (none) user.debug kernel: [ 3.089550] xhci_hcd 0000:04:00.0: irq 46 for MSI/MSI-X
    Jun 27 15:49:02 (none) user.warn kernel: [ 3.092857] usb usb3: No SuperSpeed endpoint companion for config 1 interface 0 altsetting 0 ep 129: using minimum values
    Jun 27 15:49:02 (none) user.info kernel: [ 3.092864] usb usb3: New USB device found, idVendor=1d6b, idProduct=0003
    Jun 27 15:49:02 (none) user.info kernel: [ 3.092866] usb usb3: New USB device strings: Mfr=3, Product=2, SerialNumber=1
    Jun 27 15:49:02 (none) user.info kernel: [ 3.092867] usb usb3: Product: xHCI Host Controller
    Jun 27 15:49:02 (none) user.info kernel: [ 3.092869] usb usb3: Manufacturer: Linux 2.6.38.4-pmagic xhci_hcd
    Jun 27 15:49:02 (none) user.info kernel: [ 3.092870] usb usb3: SerialNumber: 0000:04:00.0
    Jun 27 15:49:02 (none) user.debug kernel: [ 3.092961] xHCI xhci_add_endpoint called for root hub
    Jun 27 15:49:02 (none) user.debug kernel: [ 3.092963] xHCI xhci_check_bandwidth called for root hub

    Well, I have no idea what's going wrong, and I haven't had much luck with Google and the forums so far - only a number of unanswered threads from people with similar error messages and problems. Hopefully someone here can help or point me in the right direction!

    Read the article

  • What are the GPU requirements for XNA 4.0?

    - by Nate Koppenhaver
    I tried to build a sample application using XNA, but I got an error saying that Pixel Shader 1.1 was required, so I got a used Radeon X300 GPU that supports it. I tried to build it again, but I got another error saying "Your current graphics card does not support the XNA HiDef profile", and it would not build. Since that card seems to be incompatible, I guess I need to buy another one. What features should I look for to make sure a card is compatible with XNA?
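
    For context: in XNA 4.0 the Reach profile targets DirectX 9 cards with shader model 2.0 (which the X300 is), while the HiDef profile requires a DirectX 10-class GPU, so any card advertising DirectX 10 support should satisfy it. If the goal is just to get the sample building on the existing card, a minimal sketch (assuming the stock XNA 4.0 Game project template) is to request the Reach profile instead:

    // Game1.cs - assumes the standard XNA 4.0 project template
    using Microsoft.Xna.Framework;
    using Microsoft.Xna.Framework.Graphics;

    public class Game1 : Game
    {
        GraphicsDeviceManager graphics;

        public Game1()
        {
            graphics = new GraphicsDeviceManager(this);
            // Target the Reach profile so DirectX 9 / shader model 2.0 cards work.
            graphics.GraphicsProfile = GraphicsProfile.Reach;
        }
    }

    The same choice is also exposed per-project in the project properties (Reach vs. HiDef), which governs how the content pipeline builds assets.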

    Read the article

  • Adobe Flash Player fails

    - by David Cole
    Using Ubuntu 11.10, the Firefox error message says "A plugin is needed to display this content: Adobe Flash Player Installer", so I install it. Then it says "Installed - restart Firefox". I restart Firefox and the same error message appears. This problem doesn't happen with Windows 7 (IE, Chrome and Firefox are fine) or my previous version of Ubuntu. The problem occurs when I access CallOfRoma.com. Thank you
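
    One commonly suggested workaround (a sketch, not something the original question tried; it assumes the multiverse repository is enabled) is to skip the in-browser installer and pull the plugin straight from the Ubuntu archive:

    sudo apt-get update
    sudo apt-get install flashplugin-installer

    then restart Firefox once the package has finished downloading the Flash binary.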

    Read the article

  • Gparted can't create partition table

    - by William
    Here's what the problem is. About a day or so ago I used the GParted live CD to create three NTFS primary partitions on my external 500 GB GoFlex, plus one extended partition with two logical partitions. I had planned to install Windows 8 on the first partition, then Ubuntu and Kubuntu on the other two.

    After I finished partitioning the drive with GParted, I booted into Windows Vista to make my bootable Windows 8 USB, and decided to check that all my partitions were working properly. I found they were, and they weren't: the 50 GB first partition I had planned to install Windows on showed up fine, and so did the 300 GB of space left in the extended partition, but the rest showed up as raw.

    So I figured something went wrong while making the partitions, and I booted up GParted once again. To my surprise GParted showed the entire drive as unallocated, and when I refreshed the list it showed all the partitions I had made earlier, but with an exclamation mark by each of them. I figured it might be a problem with the partition table, as I'd seen something similar in the past on a drive that was not partitioned at all, so I decided to create a new partition table and sit and wait again. Then I got a message saying GParted could not create the partition table, followed by it showing the entire drive as formatted NTFS.

    At that point I took a break and came back an hour later, in case it was something I had done. I booted up Windows and plugged the drive in to see if by some miracle Windows could access it. Disk Management would freeze attempting to read the drive, as I'd seen in the past with raw disks, yet when I unplugged it I got a glimpse of Disk Management showing a perfectly fine NTFS file system on the drive, followed by a "you must format disk K in order to use it". So I was again assured the disk was raw, as that is what had happened in the past, to be fixed by a new partition table through GParted and a 10-hour format in Windows.

    So I once again booted up GParted, only to get the message "error fsyncing/closing/dev/sdg: input/output error" followed by "error opening dev/sdg: No such file in directory" after I refreshed. Somehow GParted showed the disk as perfectly fine NTFS for a moment, and I tried to create a new partition table to wipe out all my problems and start over. Now GParted only shows the drive on about 1 in 10 refreshes; the rest of the time I get the directory error.

    If anybody can assist me in any way, shape or form I will be thankful.
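
    For anyone hitting the same wall: assuming the drive really is /dev/sdg and nothing on it needs to be kept, a commonly suggested last resort (a sketch, destructive by design) is to zero out the old partition table from a terminal and write a fresh label before going back to GParted:

    sudo dd if=/dev/zero of=/dev/sdg bs=1M count=1    # wipe the boot sector and old partition table
    sudo parted /dev/sdg mklabel msdos                # write a fresh MS-DOS disk label

    If even dd reports I/O errors, the intermittent detection points at a failing enclosure, cable or disk rather than a software problem.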

    Read the article

  • apport-collect fails with "certificate verify failed" when trying to report a bug on Launchpad

    - by Francesco
    I am trying to report a bug but I get:

    root@beagle:/usr/lib/python2.7/dist-packages/apport# apport-collect <bug_id>
    ERROR: connecting to Launchpad failed: [Errno 1] _ssl.c:504: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed
    You can reset the credentials by removing the file "/root/.cache/apport/launchpad.credentials"

    Moreover, Firefox tells me "Certificate is not currently valid for bugs.launchpad.net". What can I do?
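
    Since the browser rejects the certificate too, the usual first suspects are the system clock and the CA store rather than apport itself. A sketch of things to check (the credentials path is the one the error message itself suggests):

    date                                                # a wrong date/time makes valid certificates look invalid
    sudo rm /root/.cache/apport/launchpad.credentials   # reset the cached Launchpad credentials
    sudo update-ca-certificates                         # rebuild the system CA certificate store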

    Read the article

  • How can I get add-apt-repository to work?

    - by Kristopher Arens
    Whenever I try to add a repository via the command line, I get the following error message:

    Traceback (most recent call last):
      File "/usr/bin/add-apt-repository", line 125, in <module>
        ppa_info = get_ppa_info_from_lp(user, ppa_name)
      File "/usr/lib/python2.7/dist-packages/softwareproperties/ppa.py", line 80, in get_ppa_info_from_lp
        curl.perform()
    pycurl.error: (60, 'server certificate verification failed. CAfile: /etc/ssl/certs/ca-certificates.crt CRLfile: none')

    Is there a way to remedy this situation?
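
    curl error 60 means the Launchpad certificate could not be validated against the local CA bundle. A sketch of the usual checks (assumes nothing beyond stock Ubuntu packages):

    date                                            # make sure the system clock is right first
    sudo apt-get install --reinstall ca-certificates
    sudo update-ca-certificates --fresh             # rebuild /etc/ssl/certs/ca-certificates.crt from scratch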

    Read the article

< Previous Page | 591 592 593 594 595 596 597 598 599 600 601 602  | Next Page >