Search Results

Search found 3604 results on 145 pages for 'sfdc deployment'.


  • Error when deploying WAR file

    - by Deena
    Hi, I have a WAR file, and when deploying it through the admin console in WebSphere I get the following error after specifying the WAR file location and the context root:

    The EAR file might be corrupt or incomplete. org.eclipse.jst.j2ee.commonarchivecore.internal.exception.DeploymentDescriptorLoadException: WEB-INF/web.xml

    Any suggestions to resolve this issue? I also unpacked the WAR file and checked that the web.xml file is present in the WEB-INF folder. Cheers, Deena

    Read the article

  • Admin required for Visual Studio 2008 Setup Project

    - by user54064
    I have a VS 2008 Setup Project that installs a very simple application into the local user's application folder. When the setup runs, it requires an administrator login. How can I allow a standard user to run the installation? There are no prerequisites, and the MSI file is the only file to be run (no Setup.exe). I have signed the MSI with a certificate that is installed on the user's machine as trusted. I just can't get rid of the admin login requirement.
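    One avenue worth checking, offered as an assumption rather than a confirmed fix: Windows Installer prompts for elevation whenever a package is marked per-machine, and Visual Studio setup projects default to per-machine installs. Whether the package truly installs per-user can be tested from the command line (MyApp.msi is a placeholder name):

    rem ALLUSERS="" requests a per-user install, which should not demand admin rights
    rem as long as the package writes only to per-user locations.
    msiexec /i MyApp.msi ALLUSERS=""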

    Read the article

  • Manually filling opcode cache for entire app using apc_compile_file, then switching to new release.

    - by Ben
    Does anyone have a great system, or any ideas, for doing as the title says? I want to switch the production version of a web app -- written in PHP and served by Apache -- from release 1234 to release 1235, but before that happens, have all files already in the opcode cache (APC). Then, after the switch, remove the old cache entries for files from release 1234. As far as I can think of, there are three easy ways of atomically switching from one version to the next:

    1. Have a symbolic link, for example /live, that is always the document root but is changed to point from one version to the next.
    2. Similarly, have a directory /live that is always the document root, but use mv live oldversion && mv newversion live to switch to the new version.
    3. Edit the Apache configuration to change the document root to newversion, then restart Apache.

    I think it is preferable not to have to do 3, but I can't think of any way to precompile all PHP files AND use 1 or 2 to switch releases. So can someone either convince me it's okay to rely on option 3, or tell me how to work with 1 or 2, or reveal some other option I am not thinking of?
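    As an illustration of the precompile step (not from the original question): APC keeps a separate cache per SAPI, so a warm-up script has to run inside the web server (for example, requested once over HTTP), not from the CLI. A minimal PHP sketch, with a hypothetical release path:

    <?php
    // Warm the opcode cache for every .php file in the new release.
    $release = '/var/www/releases/1235';
    $it = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($release));
    foreach ($it as $file) {
        if ($file->isFile() && substr($file->getFilename(), -4) === '.php') {
            apc_compile_file($file->getPathname()); // compiles and caches without executing
        }
    }
    echo "cache warmed";

    On the switch itself, option 1 can be made atomic on Linux by creating the new link under a temporary name and renaming it over the old one, e.g. ln -s releases/1235 live.tmp && mv -T live.tmp live, since rename() is atomic.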

    Read the article

  • How do you manage your SQL Server database projects for new builds and migrations?

    - by Rory
    How do you manage your SQL Server database build/deploy/migrate for Visual Studio projects? We have a product that includes a reasonable database part (~100 tables, ~500 procs/functions/views), so we need to be able to deploy new databases of the current version as well as upgrade older databases up to the current version. Currently we maintain separate scripts for the creation of new databases and for migration between versions. Clearly not ideal, but how is anyone else dealing with this? This is complicated for us by having many customers who each have their own db instance, rather than, say, just having dev/test/live instances on our own web servers, but the processes around managing dev/test/live for others must be similar.
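    For comparison, one widely used pattern (a sketch, not the poster's setup; table and column names are invented) is to keep a schema-version table and write each migration as a guarded, re-runnable step, so the same scripts both build a new database and upgrade an old one:

    -- Create the version table on first run.
    IF NOT EXISTS (SELECT 1 FROM sys.tables WHERE name = 'SchemaVersion')
        CREATE TABLE SchemaVersion (
            Version int NOT NULL PRIMARY KEY,
            AppliedOn datetime NOT NULL DEFAULT GETDATE());

    -- Each migration runs only if its version has not yet been applied.
    IF NOT EXISTS (SELECT 1 FROM SchemaVersion WHERE Version = 42)
    BEGIN
        ALTER TABLE Customers ADD LoyaltyTier int NULL; -- hypothetical change
        INSERT INTO SchemaVersion (Version) VALUES (42);
    END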

    Read the article

  • Automated SSRS deployment with the RS utility

    - by Stacy Vicknair
    If you’re familiar with SSRS development, you are probably aware of the SSRS web services. The RS utility is a tool that ships with SSRS and allows scripts to be executed against the SSRS web service without needing to create an application to consume the service. One of the better benefits of using this format, rather than writing an application, is that the script can be modified by others who might be involved in the creation and addition of scripts or the management of the SSRS environment.

    Reporting Services Scripter

    Jasper Smith from http://www.sqldbatips.com created Reporting Services Scripter to assist with the creation of a batch process to deploy an entire SSRS environment. The helper scripts below were created through the modification of his generated scripts. Why not just use this tool? You certainly can. For me, the volume of scripts generated seems less maintainable than extracting some common methods from those scripts and creating the deployment in a single script file. I would, however, recommend the product if you do not expect your environment to change drastically, or if you do not need a higher level of control over the deployment. If you just need to replicate, this tool works great.

    Executing with RS.exe

    Executing a script against rs.exe is fairly simple (a sample command line appears after the script below).

    The Script

    Half the battle is having a starting point. For the scripting I needed to do, the script below is the starting point. A few notes:

    - This script assumes integrated security.
    - This script assumes your reports have one data source each.

    Both of the above are just what made sense for my scenario and are definitely modifiable to accommodate your needs. If you are unsure how to change the scripts to your needs, I recommend Reporting Services Scripter to help you understand the differences. The script has three main methods: CreateFolder, CreateDataSource and CreateReport. Scripting the server deployment is just a process of recreating all of the elements that you need through calls to these methods. If there are additional elements you need to deploy that aren’t covered by these methods, again I suggest using Reporting Services Scripter to get the code you would need, convert it to a repeatable method, and add it to this script!
    ' RS, the report server web service proxy, is provided by the rs.exe host environment.
    Public Sub Main()
        CreateFolder("/", "Data Sources")
        CreateFolder("/", "My Reports")
        CreateDataSource("/Data Sources", "myDataSource", _
            "Data Source=server\instance;Initial Catalog=myDatabase")
        CreateReport("/My Reports", "MyReport", "C:\myreport.rdl", True, _
            "/Data Sources", "myDataSource")
    End Sub

    Public Sub CreateFolder(parent As String, name As String)
        Dim fullpath As String = GetFullPath(parent, name)
        Try
            RS.CreateFolder(name, parent, GetCommonProperties())
            Console.WriteLine("Folder created: {0}", name)
        Catch e As SoapException
            If e.Detail.Item("ErrorCode").InnerText = "rsItemAlreadyExists" Then
                Console.WriteLine("Folder {0} already exists and cannot be overwritten", fullpath)
            Else
                Console.WriteLine("Error : " + e.Detail.Item("ErrorCode").InnerText + " (" + e.Detail.Item("Message").InnerText + ")")
            End If
        End Try
    End Sub

    Public Sub CreateDataSource(parent As String, name As String, connectionString As String)
        Try
            RS.CreateDataSource(name, parent, False, GetDataSourceDefinition(connectionString), GetCommonProperties())
            Console.WriteLine("DataSource {0} created successfully", name)
        Catch e As SoapException
            Console.WriteLine("Error : " + e.Detail.Item("ErrorCode").InnerText + " (" + e.Detail.Item("Message").InnerText + ")")
        End Try
    End Sub

    Public Sub CreateReport(parent As String, name As String, location As String, overwrite As Boolean, dataSourcePath As String, dataSourceName As String)
        Dim reportContents As Byte() = Nothing
        Dim warnings As Warning() = Nothing
        Dim fullpath As String = GetFullPath(parent, name)

        'Read RDL definition from disk
        Try
            Dim stream As FileStream = File.OpenRead(location)
            reportContents = New [Byte](stream.Length - 1) {}
            stream.Read(reportContents, 0, CInt(stream.Length))
            stream.Close()

            warnings = RS.CreateReport(name, parent, overwrite, reportContents, GetCommonProperties())

            If Not (warnings Is Nothing) Then
                Dim warning As Warning
                For Each warning In warnings
                    Console.WriteLine(warning.Message)
                Next warning
            Else
                Console.WriteLine("Report: {0} published successfully with no warnings", name)
            End If

            'Set report DataSource references
            Dim dataSources(0) As DataSource
            Dim dsr0 As New DataSourceReference
            dsr0.Reference = dataSourcePath
            Dim ds0 As New DataSource
            ds0.Item = CType(dsr0, DataSourceDefinitionOrReference)
            ds0.Name = dataSourceName
            dataSources(0) = ds0

            RS.SetItemDataSources(fullpath, dataSources)

            Console.WriteLine("Report DataSources set successfully")

        Catch e As IOException
            Console.WriteLine(e.Message)
        Catch e As SoapException
            Console.WriteLine("Error : " + e.Detail.Item("ErrorCode").InnerText + " (" + e.Detail.Item("Message").InnerText + ")")
        End Try
    End Sub

    Public Function GetCommonProperties() As [Property]()
        'Common CatalogItem properties
        Dim descprop As New [Property]
        descprop.Name = "Description"
        descprop.Value = ""
        Dim hiddenprop As New [Property]
        hiddenprop.Name = "Hidden"
        hiddenprop.Value = "False"

        Dim props(1) As [Property]
        props(0) = descprop
        props(1) = hiddenprop
        Return props
    End Function

    Public Function GetDataSourceDefinition(connectionString As String)
        Dim definition As New DataSourceDefinition
        definition.CredentialRetrieval = CredentialRetrievalEnum.Integrated
        definition.ConnectString = connectionString
        definition.Enabled = True
        definition.EnabledSpecified = True
        definition.Extension = "SQL"
        definition.ImpersonateUser = False
        definition.ImpersonateUserSpecified = True
        definition.Prompt = "Enter a user name and password to access the data source:"
        definition.WindowsCredentials = False
        definition.OriginalConnectStringExpressionBased = False
        definition.UseOriginalConnectString = False
        Return definition
    End Function

    Private Function GetFullPath(parent As String, name As String) As String
        If parent = "/" Then
            Return parent + name
        Else
            Return parent + "/" + name
        End If
    End Function
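    As a reference for the "Executing with RS.exe" step above, a typical invocation looks like the following; the script file name and server URL are placeholders, not values from the original post:

    rem Run the deployment script against the target report server
    rs.exe -i DeploySsrs.rss -s http://myserver/reportserver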

    Read the article

  • WiX, MSDeploy and an appealing configuration/deployment paradigm

    - by alexhildyard
    I do a lot of application and server configuration; I've done this for many years and have tended to view the complexity of this strictly in terms of the complexity of the ultimate configuration to be deployed. For example, specific APIs aside, I would tend to regard installing a server certificate as a more complex activity than, say, copying a file or adding a Registry entry.

    My prejudice revolved around the idea of a sequential deployment script that not only had the explicit prescription to apply a specific server configuration, but also made the implicit presumption that the server in question was in a good known state. Scripts like this fail for hundreds of reasons -- the Default Website didn't exist; the application had already been deployed; the application had already been partially deployed and failed to roll back fully, and so on. And so the problem is that the more complex the configuration activity, the more scope for error in any individual part of that activity, and therefore the greater the chance the server in question will not end up at exactly the desired configuration level.

    Recently I was introduced to a completely different mindset, which, for want of a better turn of phrase, I will call the "make it so" mindset. It's extremely simple both to explain and to implement. In place of the head-down, imperative script you used to use, you substitute a set of checks -- much like exception handlers -- around each configuration activity, starting with a check of the current system state. Thus the configuration logic becomes: "IF these services aren't started then start them, and IF XYZ website doesn't exist then create it, and IF these shares don't exist then create them, and IF these shares aren't permissioned in some particular way, then permission them so." This works. Really well, in my experience.

    Scenario 1: You want to get a system into a good known state; it's already in a good known state; you quickly realise there is nothing to do.
    Scenario 2: You want to get the system into a good known state; your script is flawed or the system is bust; it cannot be put into that state. You know exactly where (at least part of) the problem is and why.
    Scenario 3: You want to get the system into a good known state; people are fiddling around with the system just now. That's fine. You do what you can, and later you come back and try it again.
    Scenario 4: No one wants to deploy anything; they want you to prove that the previous deployment was successful. So you re-run the deployment script with the "-WhatIf" flag. It reports that there was nothing to change. There's your proof.

    I mentioned two technologies in the title -- WiX (i.e. MSI) and MSDeploy. I am thinking specifically of the conversation that took place here. Having worked with both technologies, I think Rob Mensching's response is appropriately nuanced, and in essence the difference is this: sometimes your target is either to achieve a specific new server state, or to roll back to a known good one. Then again, your target may be to configure what you can, and to understand what you can't. Implicitly, MSDeploy's "rollback" is simply to redeploy the previous version, whereas a well-crafted MSI will actively put your system into that state without further intervention. Either way, if all goes well an MSI will leave you with a system in one of two states, whereas MSDeploy could leave your system in one of many states.

    The key is that MSDeploy and MSI are complementary technologies; which suits you best depends as much on operational guidance as your configuration remit. What I wanted to say was that I have always been for atomic, transactional-based configuration, but having worked with the "make it so" paradigm, I have been favourably impressed by the actual results. I'm tempted to put a more technical post up on this in due course.
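    As a concrete illustration of the paradigm (a sketch, not code from the original post): each step inspects the current state before acting, and -WhatIf gives the "prove it" behaviour from Scenario 4.

    # Idempotent "make it so" step: create a directory only if it is missing.
    function Ensure-Directory {
        [CmdletBinding(SupportsShouldProcess = $true)]
        param([string]$Path)
        if (-not (Test-Path $Path)) {
            if ($PSCmdlet.ShouldProcess($Path, 'Create directory')) {
                New-Item -ItemType Directory -Path $Path | Out-Null
            }
        }
        # else: already in the desired state, so there is nothing to do
    }

    Ensure-Directory -Path 'D:\Sites\MyApp'          # makes it so
    Ensure-Directory -Path 'D:\Sites\MyApp' -WhatIf  # second run: state is correct, no pending change reported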

    Read the article

  • Webcast: Best Practices for Speeding Virtual Infrastructure Deployment with Oracle VM

    - by Honglin Su
    We announced the Oracle VM Blade Cluster Reference Configuration last month (see the blog). The new Oracle VM blade cluster reference configuration can help reduce the time to deploy virtual infrastructure by up to 98 percent when compared to multi-vendor configurations. Customers and partners have shown a lot of interest. Join Oracle's experts to learn the best practices for speeding virtual infrastructure deployment with Oracle VM; register for the webcast (1/25/2011) here.

    Virtualization has already been widely accepted as a means to increase IT flexibility and help IT services align better with changing business needs. The flexibility of a virtualized IT infrastructure enables new applications to be rapidly deployed, capacity to be easily scaled, and IT resources to be quickly redirected. The net result is that IT can bring greater value to the business, making virtualization an obvious win from a business perspective. However, building a virtualized infrastructure typically requires assembling and integrating multiple components (e.g. servers, storage, network, virtualization, and operating systems). This infrastructure must be deployed and tested before applications can even be installed. It can take weeks or months to plan, architect, configure, troubleshoot, and deploy a virtualized infrastructure. The process is not only time-consuming but also error-prone, making it hard to achieve a timely and profitable return on investment.

    Oracle is the only vendor that can offer a fully integrated virtualization infrastructure with all of the necessary hardware and software components. The Oracle VM blade cluster reference configuration is a single-vendor solution that addresses every layer of the virtualization stack with Oracle hardware and software components. It enables quick and easy deployment of the virtualized infrastructure using components that have been tested together and are all supported together by Oracle. To learn more about Oracle's virtualization offerings, visit http://oracle.com/virtualization.

    Read the article

  • SQLAuthority News – Deployment guide for Microsoft SharePoint Foundation 2010

    - by pinaldave
    SharePoint and SQL Server go together hand in hand. SharePoint installation is very interesting: at various organizations the installation is very different and has various needs. SQL Server installation with SharePoint is equally important, and I have often seen it neglected. Microsoft has published the Deployment Guide for SharePoint Foundation, and it talks about various database aspects as well.

    For an optimal SharePoint installation, the required version of SQL Server, including service packs and cumulative updates, must be installed on the database server. The installation must include any additional features, such as SQL Analysis Services, and the appropriate SharePoint Foundation logins have to be added and configured. The database server must be hardened and, if required, databases must be created by the DBA. For more information, see:

    - Hardware and software requirements (SharePoint Foundation 2010)
    - Harden SQL Server for SharePoint environments (SharePoint Foundation 2010)
    - Deploy by using DBA-created databases (SharePoint Foundation 2010)
    - Deployment guide for Microsoft SharePoint Foundation 2010

    Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: SharePoint

    Read the article

  • Financial institutions build predictive models using Oracle R Enterprise to speed model deployment

    - by Mark Hornick
    See the Oracle press release, Financial Institutions Leverage Metadata Driven Modeling Capability Built on the Oracle R Enterprise Platform to Accelerate Model Deployment and Streamline Governance, for a description of how a "unified environment for analytics data management and model lifecycle management brings the power and flexibility of the open source R statistical platform, delivered via the in-database Oracle R Enterprise engine to support open standards compliance." Through its integration with Oracle R Enterprise, Oracle Financial Services Analytical Applications provides "productivity, management, and governance benefits to financial institutions, including the ability to:

    - Centrally manage and control models in a single, enterprise model repository, allowing for consistent management and application of security and IT governance policies across enterprise assets
    - Reuse models and rapidly integrate with applications by exposing models as services
    - Accelerate development with seeded models and common modeling and statistical techniques available out-of-the-box
    - Cut risk and speed model deployment by testing and tuning models with production data while working within a safe sandbox
    - Support compliance with regulatory requirements by carrying out comprehensive stress testing, which captures the effects of adverse risk events that are not estimated by standard statistical and business models. This approach supplements the modeling process and supports compliance with the Pillar I and the Internal Capital Adequacy Assessment Process stress testing requirements of the Basel II Accord
    - Improve performance by deploying and running models co-resident with data. Oracle R Enterprise engines run in-database, virtually eliminating the need to move data to and from client machines, thereby reducing latency and improving security"

    Read the article

  • Speed up ADF Mobile Deployment to Android with Keystore

    - by Shay Shmeltzer
    As you might have noticed from my latest ADF Mobile entries, I'm doing most of my ADF Mobile development on a Windows machine and testing on an Android device. Unfortunately the Android/Windows experience is not as fast as the iOS/Mac one. However, there is one thing I learned today that can make this a bit less painful in terms of the speed to deploy and test your application, and this is to use "Release" mode instead of "Debug" mode when deploying your application. To do this you'll first need to define a keystore, but as Joe from our Mobile team showed me today, this is quite easy. Here are the steps:

    1. Open a command line in your JDK bin directory (I just used the JDK that comes with the JDeveloper install).
    2. Issue the following command (both the keystore name and alias are strings that you decide on):

    keytool -genkey -v -keystore <Keystore Name>.keystore -alias <Alias Name> -keyalg RSA -keysize 2048 -validity 10000

    The keytool utility will then prompt you with various questions that you'll need to answer.
    3. Once this is done, configure your JDeveloper Preferences->ADF Mobile to add this keystore under the Release tab.
    4. Then, for your application-specific deployment profile, switch the build mode from Debug to Release.

    The end result is a much smaller mobile application (for example, from 60 MB to 21 MB) and a much faster deployment cycle (for me it is about twice as fast as before).

    Read the article

  • obiee 10g teradata Solaris deployment

    - by user554629
    I have 3-4 years' worth of notes on proper Teradata deployment across multiple operating systems. The topic is too large to cover succinctly in a blog entry, so I'm trying something new: document a specific situation, consolidate the facts, document diagnostic procedures, and then clone the structure to cover other obiee deployments (11g and other operating systems). This blog entry may be revised frequently (no construction between June 6th and June 25th).

    Getting started

    - obiee 10g certification: pg 24-25, Teradata V2R5.1.x - V2R6.2, Client 13.10, certified 10.1.3.4.1
    - obiee 10g documentation: Deployment Guide, Server Administration, Install/Config Guide
    - obiee overview
    - teradata connectivity downloads ( requires registration )
    - solaris odbc drivers: sparc 13.10: choose 13.10.00.04 ( ReadMe ); sparc 14.00: probably would work, but not certified by Oracle on 10g

    I assume you have obiee 10.1.3.4.1 installed; 10.1.3.4.2 would be a better choice. The Teradata odbc install requires root for Solaris pkgadd. Only one version of Teradata odbc can be installed; symbolic links to the current version are created in /usr/lib at install.

    obiee implementation background

    Database access has two types of implementation: native and odbc. Native drivers use DB vendor client interfaces for access; odbc drivers are provided by the DB vendor for DB access. Teradata is an odbc interface database. odbc drivers require an ODBC Driver Manager; obiee uses the Merant DataDirect driver manager. obiee servers communicate with one another using odbc. The internal odbc driver is implemented by the obiee team and requires the Merant Driver Manager. Teradata supplies a Driver Manager, which is built by Merant, but it should not be used in obiee. The nqsserver shared library deployment looks like this:

    OBIEE Server <-> DataDirect Manager <-> Teradata Driver <-> Teradata Database

    nqsserver startup

    $ cd $BI/setup
    $ . ./sa-init64.sh
    $ run-sa.sh autorestart64

    The following files are referenced from setup: .variant.sh, user.sh, NQSConfig.INI, DBFeatures.INI, $ODBCINI ( odbc.ini ), sqlnet.ora

    How does nqsserver connect to Teradata?

    A teradata DSN is created in the RPD ( TD71 ). setup/odbc.ini contains:

    [ODBC Data Sources]
    TD71=tdata.so
    [TD71]
    Driver=/opt/tdodbc/odbc/drivers/tdata.so
    Description=Teradata V7.1.0
    DBCName=###.##.##.###
    LastUser=
    Username=northwind
    Password=northwind
    Database=
    DefaultDatabase=northwind

    setup/user.sh contains:

    LIBPATH=/opt/tdicu/lib_64:/usr/odbc/lib:/usr/odbc/drivers:/usr/lpp/tdodbc/odbc/drivers:$LIBPATH
    export LIBPATH

    setup/.variant.sh contains:

    if [ "$ANA_SERVER_64" = "1" ]; then
      ANA_BIN_DIR=${SAROOTDIR}/server/Bin64
      ANA_WEB_DIR=${SAROOTDIR}/web/bin64
      ANA_ODBC_DIR=${SAROOTDIR}/odbc/lib64

    setup/sa-run.sh contains:

    . ${ANA_INSTALL_DIR}/setup/.variant.sh
    . ${ANA_INSTALL_DIR}/setup/user.sh
    logfile="${SAROOTDIR}/server/Log/nqsserver.out.log"
    ${ANA_BIN_DIR}/nqsserver -quiet >> ${logfile} 2>&1 &

    nqsserver is running

    nqsserver produces $BI/server/nqsserver.log. At startup, the native database drivers connect and record DB versions. tdata.so is not loaded until a Teradata DB connection is attempted.

    Teradata odbc client installation

    Accept all the defaults for pkgadd. Install in /opt.

    $ mkdir odbc
    $ cd odbc
    $ gzip -dc ../tdodbc__solaris_sparc.13.10.00.04.tar.gz | tar -xf -
    $ sudo su
    # pkgadd -d . TeraGSS
    # pkgadd -d . tdicu1310
    # pkgadd -d . tdodbc1310

    Directory notes:

    - /opt/teradata/client/13.10/odbc_64/lib/tdata.so is the 64-bit obiee library loaded by nqsserver.
    - /opt/teradata/client/13.10/odbc_64/lib is not needed in LD_LIBRARY_PATH.
    - /opt/teradata/client/13.10/tdicu/lib64 is needed in LD_LIBRARY_PATH.
    - /usr/odbc should not be referenced; it is a link to 32-bit libraries.
    - LD_LIBRARY_PATH_64 should not be used.

    Useful bash functions and aliases

    export SAROOTDIR=/export/home/dw_adm/OracleBI
    export TERA_HOME=/opt/teradata/client/13.10
    export ORACLE_HOME=/export/home/oracle/product/10.2.0/client
    export ODBCINI=$SAROOTDIR/setup/odbc.ini
    export TD_ICU_DATA=$TERA_HOME/tdicu/lib64
    alias cds="alias | grep '^alias cd' | sed 's/^alias //' | sort"
    alias cdtd="cd $TERA_HOME; ls"
    alias cdtdodbc="cd $TERA_HOME/odbc_64; ls -l"
    alias cdtdicu="cd $TERA_HOME/tdicu/lib64; ls -l"
    alias cdbi="cd $SAROOTDIR; ls"
    alias cdbiodbc="cd $SAROOTDIR/odbc; ls -l"
    alias cdsetup="cd $SAROOTDIR/setup; ls -ltr"
    alias cdsvr="cd $SAROOTDIR/server; ls"
    alias cdrep="cd $SAROOTDIR/server/Repository; ls -ltr"
    alias cdsvrcfg="cd $SAROOTDIR/server/Config; ls -ltr"
    alias cdsvrlog="cd $SAROOTDIR/server/Log; ls -ltr"
    alias cdweb="cd $SAROOTDIR/web; ls"
    alias cdwebconfig="cd $SAROOTDIR/web/config; ls -ltr"
    alias cdoci="cd $ORACLE_HOME; ls"
    pkgfiles() { pkgchk -l $1 | awk '/^Pathname/ {print $2}'; }
    pkgfind()  { pkginfo | egrep -i $1 ; }

    Examples:

    $ pkgfind td
    $ pkgfiles tdodbc1310 | grep 64
    $ cds
    $ cdtdodbc
    $ cdsetup
    $ cdsvrlog
    $ cdweblog

    Read the article

  • Cloud Deployment Models

    - by B R Clouse
    As the cloud paradigm grows in depth and breadth, more readers are approaching the topic for the first time, or from a new perspective. This blog is a basic review of cloud deployment models, to help orient newcomers and neophytes.

    Most cloud deployments today are either private or public. It is also possible to connect a private cloud and a public cloud to form a hybrid cloud. A private cloud is for the exclusive use of an organization. Enterprises, universities and government agencies throughout the world are using private clouds. Some have designed, built and now manage their private clouds. Others use a private cloud that was built by and is now managed by a provider, hosted either onsite or at the provider’s datacenter. Because private clouds are for exclusive use, they are usually the option chosen by organizations with concerns about data security and guaranteed performance.

    Public clouds are open to anyone with an Internet connection. Because they require no capital investment from their users, they are particularly attractive to companies with limited resources in less regulated environments and for temporary workloads such as development and test environments. Public clouds offer a range of products, from end-user software packages to more basic services such as databases or operating environments. Public clouds may also offer cloud services such as disaster recovery for a private cloud, or the ability to "cloudburst" a temporary workload spike from a private cloud to a public cloud. These are examples of a hybrid cloud. These are most feasible when the private and public clouds are built with similar technologies.

    Usually people think of a public cloud in terms of a user role, e.g., "Which public cloud should I consider using?" But someone needs to own and manage that public cloud. The company who owns and operates a public cloud is known as a public cloud provider. Oracle Database Cloud Service, Amazon RDS, database.com and Savvis Symphony Database are examples of public cloud database services.

    When evaluating deployment models, be aware that you can use any or all of the available options. Some workloads may be best-suited for a private cloud, some for a public or hybrid cloud. And you might deploy multiple private clouds in your organization. If you are going to combine multiple clouds, then you want to make sure that each cloud is based on a consistent technology portfolio and architecture. This simplifies management and gives you the greatest flexibility in moving resources and workloads among your different clouds. Oracle’s portfolio of cloud products and services enables both deployment models. Oracle can manage either model.

    Universities, government agencies and companies in all types of business everywhere in the world are using clouds built with the Oracle portfolio. By employing a consistent portfolio, these customers are able to run all of their workloads -- from test and development to the most mission-critical -- in a consistent manner: One Enterprise Cloud, powered by Oracle.

    Read the article

  • Http Handler must be reset with each deployment. How can I add this functionality to the web.config?

    - by user42942
    My application is a .NET 4 hybrid - MVC in some areas, Web Forms in others. This application was recently upgraded to .NET 4 and includes a lot of older code and some mismatched parts. Unfortunately it includes a Telerik component that requires me to run the application pool in classic mode. In order to fix this (in IIS7) I have to add a handler mapping to the IIS configuration. This mapping is basically a wildcard mapping that points the wildcard path "*" to %windir%\Microsoft.NET\Framework64\v4.0.30319\aspnet_isapi.dll. The problem I am running into is this: for some reason this mapping gets dropped when deploying the site. So, can I add the functionality of this mapping to the web.config? If so, how? Or is there another solution to make this manually added mapping "sticky" so that it remains in place during and after a deployment? (I am also asking this on StackOverflow, as I'm not sure if this should be a coding question or a server question.)
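    Since the mapping keeps getting dropped from IIS configuration on deploy, one option is to declare it in the site's web.config so it travels with every deployment. The snippet below is a sketch not verified against this exact application (the name attribute is arbitrary, and the handlers section must be unlocked in IIS for it to apply):

    <configuration>
      <system.webServer>
        <handlers>
          <add name="AspNet4Wildcard" path="*" verb="*" modules="IsapiModule"
               scriptProcessor="%windir%\Microsoft.NET\Framework64\v4.0.30319\aspnet_isapi.dll"
               resourceType="Unspecified"
               preCondition="classicMode,runtimeVersionv4.0,bitness64" />
        </handlers>
      </system.webServer>
    </configuration>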

    Read the article

  • Trouble running setup package after Publishing in Visual Studio 2008

    - by Andrew Cooper
    I've got a small WinForms application that I've written that runs fine in the IDE. It builds with no errors or warnings, and it's not using any third-party controls. I'm coding in C# in Visual Studio 2008. When I Build -> Publish the application, everything seems to work fine. However, when I attempt to install the application via the setup.exe file I get an error message that says, "Application cannot be started." The error details are below:

    ERROR DETAILS
    Following errors were detected during this operation.
    * [3/18/2010 10:50:56 AM] System.Runtime.InteropServices.COMException - The referenced assembly is not installed on your system. (Exception from HRESULT: 0x800736B3)
    - Source: System.Deployment
    - Stack trace:
    at System.Deployment.Internal.Isolation.IStore.GetAssemblyInformation(UInt32 Flags, IDefinitionIdentity DefinitionIdentity, Guid& riid)
    at System.Deployment.Internal.Isolation.Store.GetAssemblyManifest(UInt32 Flags, IDefinitionIdentity DefinitionIdentity)
    at System.Deployment.Application.ComponentStore.GetAssemblyManifest(DefinitionIdentity asmId)
    at System.Deployment.Application.ComponentStore.GetSubscriptionStateInternal(DefinitionIdentity subId)
    at System.Deployment.Application.SubscriptionStore.GetSubscriptionStateInternal(SubscriptionState subState)
    at System.Deployment.Application.ComponentStore.CollectCrossGroupApplications(Uri codebaseUri, DefinitionIdentity deploymentIdentity, Boolean& identityGroupFound, Boolean& locationGroupFound, String& identityGroupProductName)
    at System.Deployment.Application.SubscriptionStore.CommitApplication(SubscriptionState& subState, CommitApplicationParams commitParams)
    at System.Deployment.Application.ApplicationActivator.InstallApplication(SubscriptionState& subState, ActivationDescription actDesc)
    at System.Deployment.Application.ApplicationActivator.PerformDeploymentActivation(Uri activationUri, Boolean isShortcut, String textualSubId, String deploymentProviderUrlFromExtension, BrowserSettings browserSettings, String& errorPageUrl)
    at System.Deployment.Application.ApplicationActivator.ActivateDeploymentWorker(Object state)

    I'm not sure what else to do. The only slightly odd thing I used in this application is SQL Server Compact. Any help would be appreciated. Thanks, Andrew

    Read the article

  • Oracle Service Bus Customer Panel - Choice Hotel's Deployment Description at OpenWorld

    - by Bruce Tierney
    Choice Hotels shared their Oracle Service Bus deployment during the recent Customer Panel on Oracle Service Bus. Charlie Taylor of Choice provided an excellent in-depth description of architectural guidelines, including project naming and project structure. A screenshot from the session highlighted the flow from proxy service to business service, transformation, orchestration and more. For more information about Oracle OpenWorld SOA & BPM sessions, please see the Focus on SOA and BPM document.

    Read the article

  • Error occurred in deployment step 'Recycle IIS Application Pool'

    - by shehan
    Encountered this error while trying to deploy a SharePoint 2010 project from Visual Studio 2010:

    Error occurred in deployment step 'Recycle IIS Application Pool': The open operation did not complete within the allotted timeout of 00:01:00. The time allotted to this operation may have been a portion of a longer timeout.

    All my other projects in the solution deploy just fine. To fix this, I had to retract the offending project (through Visual Studio) and re-deploy it.

    Read the article

  • Continuous Deployment to Azure powered by Git

    Today Scott Guthrie announced several updated capabilities for Azure Web Sites in his post Announcing: Great Improvements to Windows Azure Web Sites. I recommend you check out the full post; there are some really cool improvements. My favorite is the ability to enable Continuous Deployment from your CodePlex project into Azure. David Ebbo has a great video walk-through.

    Read the article

  • SQL in the City Seminar Portland 2013 – Deployment Stairway

    Join Red Gate for a free seminar on November 15 (the day before SQL Saturday Oregon). Steve Jones and Grant Fritchey, SQL Server MVPs, will present best practices for SQL Server version control, continuous integration and deployment, in addition to showing Red Gate tools in action.

    Read the article

  • Create App_Data and register Excel application on ASP.NET deployment? (IIS7.5)

    - by Francesco
    I am deploying an ASP.NET MVC3 application in IIS7. I have already deployed other applications, but they never made use of the App_Data folder or any additional component such as the Interop library. I used one-click deployment and I use the default application pool. When I launch the application I immediately get an error stating:

    [web access] Sorry, an error occurred while processing your request.
    [browse from IIS7] Could not find a part of the path 'D:\Data\Apps\OppUpdate\App_Data\Test.xlsx'.

    Then I manually added the App_Data folder inside the deployment directory and the application starts regularly. Then, when it comes to the task that uses the Interop library, I get the following error:

    [web access] Sorry, an error occurred while processing your request.
    [browse from IIS7] Retrieving the COM class factory for component with CLSID {00024500-0000-0000-C000-000000000046} failed due to the following error: 80040154 Class not registered (Exception from HRESULT: 0x80040154 (REGDB_E_CLASSNOTREG)).

    Is there any way to automatically add the App_Data folder when using one-click deploy? How can I register the Interop services? Thank you, Francesco

    Read the article

  • Zero downtime deployment (Tomcat), Nginx or HAProxy, behind hardware LB - how to "starve" old server?

    - by alexeypro
    Currently we have the following setup:

    - Hardware Load Balancer (LB)
    - Box A running Tomcat on 8080 (TA)
    - Box B running Tomcat on 8080 (TB)

    TA and TB are running behind LB. For now it's a pretty complicated and manual job to take Box A or Box B out of the LB to do a zero-downtime deployment. I am thinking of doing something like this:

    - Hardware Load Balancer (LB)
    - Box A running Nginx on 8080 (NA)
    - Box A running Tomcat on 8081 (TA1)
    - Box A running Tomcat on 8082 (TA2)
    - Box B running Nginx on 8080 (NB)
    - Box B running Tomcat on 8081 (TB1)
    - Box B running Tomcat on 8082 (TB2)

    Basically LB will be directing traffic between NA and NB now. On each Nginx we'll have TA1, TA2 and TB1, TB2 configured as upstream servers. Once one of the upstream's healthcheck pages is unresponsive (shutdown), the traffic goes to the other one (HttpHealthcheckModule on Nginx). So the deploy process is simple: say TA1 is active with version 0.1 of the app and its healthcheck is OK. We start TA2 with its healthcheck reporting ERROR, so Nginx is not talking to it. We deploy app version 0.2 to TA2 and make sure it works. Now we switch the healthcheck on TA2 to OK and the healthcheck on TA1 to ERROR. Nginx will start serving TA2 and will remove TA1 from rotation. Done! And now the same with the other box. While it all sounds cool and nice, how do we "starve" Nginx? Say we have pending connections and some users on TA1. If we just turn it off, sessions will break (we have cookie-based sessions). Not good. Any way to starve traffic to one of the upstream servers with Nginx? Thanks!
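    For reference, a sketch of what the Nginx side might look like, using stock passive checks (max_fails/fail_timeout) instead of the third-party HttpHealthcheckModule; the ports follow the question:

    # Box A nginx (NA): listen on 8080, balance the two local Tomcats.
    upstream tomcats {
        server 127.0.0.1:8081 max_fails=2 fail_timeout=10s;  # TA1
        server 127.0.0.1:8082 max_fails=2 fail_timeout=10s;  # TA2
    }
    server {
        listen 8080;
        location / {
            proxy_pass http://tomcats;
            proxy_set_header Host $host;
        }
    }

    On draining: marking the old upstream down and running nginx -s reload is graceful (old worker processes finish their in-flight requests before exiting), but cookie-bound sessions still need sticky routing (for example ip_hash) or shared session storage to survive the switchover.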

    Read the article

  • How to rollback a database deployment without losing new data?

    - by devlife
    My company uses virtual machines for our web/app servers. This allows for very easy rollbacks of a deployment if something goes wrong. However, if an app server deployment also requires a database deployment and we have to roll back, I'm kind of at a loss. How can you roll back database schema changes without losing data? The only thing I can think of is to write a script that will drop/revert tables/columns back to their original state. Is this really the best way?
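    One sketch of the revert-script idea (table and column names are invented): keep the "up" change additive, and have the "down" script park the new data in a side table before reverting the schema, so a rollback loses nothing:

    -- Up: additive change that the previous app version can safely ignore.
    ALTER TABLE Orders ADD DiscountCode nvarchar(20) NULL;

    -- Down: preserve the new data, then revert the schema.
    SELECT OrderId, DiscountCode
    INTO Orders_DiscountCode_Backup
    FROM Orders
    WHERE DiscountCode IS NOT NULL;
    ALTER TABLE Orders DROP COLUMN DiscountCode;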

    Read the article

  • Deploying workstations - best practices?

    - by V. Romanov
    Hi guys, I've been researching the subject of workstation deployment for a while, and found a ton of info and dozens of different methods and tools, but no "best practice" method that doesn't lack at least one feature that I consider required for the solution to be perfect. I'm currently interested in Windows workstation deployment, but if the tools can be extended to Linux, that's added value. I want the deployment tools I use to be able to do the following:

    - Hardware independent: I want my image or installation to have a minimum of hardware and driver dependency, so that I can use a single image/package for all workstations.
    - Easily updatable: I want to be able to update my image as easily as possible without redeploying/rebuilding/reimaging all configurations.
    - PXE bootable deployment: I want the tools to be bootable off the network so that I don't need a boot CD/DOK.
    - Scriptable for minimum human input: Ideally, the tool should run automatically after being booted and perform a "default" deployment (including partitioning) unless prompted otherwise, i.e. take a PC, hook it up, power on, PXE boot and forget about it until the OS is deployed.

    I found no single product or environment that does all this. The closest I came is the Windows Deployment Services/WIM image format. I also checked out numerous imaging and deployment tools including Clonezilla, Ghost, g4u, WPKG and others, but most of them lack the hardware independence and updatability features. We currently have a Symantec Ghost server setup that does imaging over the network, but I'm not satisfied with it as it has all the drawbacks listed above. Do you have suggestions on how to optimize the process of workstation deployment? How do you deploy them in your organization? Thanks! Vadim.

    Read the article
