Search Results

Search found 29622 results on 1185 pages for 'deployment project'.

Page 346/1185 | < Previous Page | 342 343 344 345 346 347 348 349 350 351 352 353  | Next Page >

  • TFS Changeset history using QueryHistory after branch rename

    - by bryantb
    I'm using the VersionControlServer.QueryHistory method to retrieve a list of files that have changed during a timespan that ranges from 5/1/2009 to 10/1/2009. My results unexpectedly only included items that had changed after 9/1/2009. I then realized that the path that I was using, $/Project/Reports/Main, didn't exist until 9/1/2009. Before 9/1/2009, there had been another node named $/Project/Main/Reports, which was renamed to $/Project/Reports/Main. When I query from Source Control Explorer I can see the entire history (5/1/2009 - 10/1/2009) that I expect to see. However, I can't get the same results via the API. I've tried to specify the branch that no longer exists because it was renamed, but not surprisingly I get zero results. Any ideas?
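    One possible way to get the full history through the API is the QueryHistory overload that takes a slotMode flag: passing false makes it follow the item across renames instead of the path slot, which would explain why a query against $/Project/Reports/Main otherwise starts at the 9/1/2009 rename. A minimal sketch, assuming the TFS 2010 client assemblies (with the 2008-era API the connection comes from TeamFoundationServerFactory instead); the server URL is a placeholder:

    ```csharp
    using System;
    using Microsoft.TeamFoundation.Client;
    using Microsoft.TeamFoundation.VersionControl.Client;

    class QueryHistoryAcrossRename
    {
        static void Main()
        {
            var tpc = new TfsTeamProjectCollection(new Uri("http://tfs:8080/tfs"));
            var vcs = tpc.GetService<VersionControlServer>();

            var history = vcs.QueryHistory(
                "$/Project/Reports/Main",                       // current (renamed) server path
                VersionSpec.Latest,                             // version at which to resolve the path
                0,                                              // deletion id
                RecursionType.Full,
                null,                                           // all users
                new DateVersionSpec(new DateTime(2009, 5, 1)),  // from
                new DateVersionSpec(new DateTime(2009, 10, 1)), // to
                int.MaxValue,
                true,                                           // include individual file changes
                false);                                         // slotMode: false follows renames

            foreach (Changeset cs in history)
                Console.WriteLine("{0} {1} {2}", cs.ChangesetId, cs.CreationDate, cs.Committer);
        }
    }
    ```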

    Read the article

  • Flex Builder 3 is not generating an HTML wrapper when targeting the Flex 4 SDK

    - by gonzohunter
    In Flex Builder 3, when I create a new Flex application targeting the Flex 4 SDK, it won't generate an HTML wrapper file. I have hunted around the web for answers, but with no success. I have made sure the box is checked in the project properties to generate the HTML wrapper. The only workaround is to target an older version of the SDK (i.e. 3.2), which causes the wrapper to be generated; then I can revert the project to SDK 4. This means I can never do a clean of my project, because that would delete the wrapper. Has anyone else come across this? Is this just a bug in Flex Builder 3?

    Read the article

  • Java POI 3.6 XWPF usage guidelines (reading content of docx file)

    - by Mr CooL
    I assume the following objects should be used to read the contents of a DOCX file: XWPFDocument and XWPFWordExtractor. However, the compiler warns me that the required libraries are not on the classpath. I'm somewhat lost as to which JAR file is the right one to include, since the POI distribution ships so many JARs. My project so far involves reading DOC and DOCX files. I've managed to read the contents of a DOC file; for DOCX files, however, I'm still having problems. Can anyone give guidelines, in terms of the code and the libraries (JAR files) needed, to read the content of a DOCX file? I'm trying to limit the libraries added to the project, since I only need to read DOC and DOCX. The following works for DOC:

    ```java
    POIFSFileSystem fs = new POIFSFileSystem(new FileInputStream(fileName));
    HWPFDocument doc = new HWPFDocument(fs);
    WordExtractor we = new WordExtractor(doc);
    String[] p = we.getParagraphText();
    ```

    Read the article

  • Android Eclipse Error "Android Packaging Problem"

    - by Peter Delaney
    Hello, I am getting an error in the Problems tab for my Android project in Eclipse. The error is "Android Packaging Problem" with an unknown location: Unknown Error NullPointerException. I cannot determine what the problem is. My project was working a few hours ago; the only change I made was to add a public interface, ITrackDao, to my project and implement it, and there are no errors associated with that. I am not even sure where to begin looking, and I cannot launch the application. Can someone give me an idea of what area to look into? Thanks, Peter

    Read the article

  • How to configure hibernate-tools with maven to generate hibernate.cfg.xml, *.hbm.xml, POJOs and DAOs

    - by mmm
    Hi, can anyone tell me how to force Maven to prefix the mapping .hbm.xml files in the automatically generated hibernate.cfg.xml with the package path? My general idea is that I'd like to use hibernate-tools via Maven to generate the persistence layer for my application. So I need the hibernate.cfg.xml, then all the my_table_name.hbm.xml files, and at the end the generated POJOs. Yet the hbm2java goal won't work: I put the *.hbm.xml files into the src/main/resources/package/path/ folder, but hbm2cfgxml references the mapping files by table name only, i.e.:

    ```xml
    <mapping resource="MyTableName.hbm.xml" />
    ```

    So the big question is: how can I configure hbm2cfgxml so that hibernate.cfg.xml looks like this instead:

    ```xml
    <mapping resource="package/path/MyTableName.hbm.xml" />
    ```

    My pom.xml looks like this at the moment:

    ```xml
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>hibernate3-maven-plugin</artifactId>
      <version>2.2</version>
      <executions>
        <execution>
          <id>hbm2cfgxml</id>
          <phase>generate-sources</phase>
          <goals>
            <goal>hbm2cfgxml</goal>
          </goals>
          <inherited>false</inherited>
          <configuration>
            <components>
              <component>
                <name>hbm2cfgxml</name>
                <implementation>jdbcconfiguration</implementation>
                <outputDirectory>src/main/resources/</outputDirectory>
              </component>
            </components>
            <componentProperties>
              <packagename>package.path</packagename>
              <configurationfile>src/main/resources/hibernate.cfg.xml</configurationfile>
            </componentProperties>
          </configuration>
        </execution>
      </executions>
    </plugin>
    ```

    And then the second question: is there a way to tell Maven to copy resources to the target folder before executing hbm2java? At the moment I'm using mvn clean resources:resources generate-sources for that, but there must be a better way. Thanks for any help.

    Update: @Pascal: Thank you for your help. The path to the mappings works fine now; I don't know what was wrong before, though. Maybe there is some issue with writing to hibernate.cfg.xml while reading the database config from it (though the file gets updated). I've deleted the hibernate.cfg.xml file, replaced it with database.properties, and run the goals hbm2cfgxml and hbm2hbmxml. I also no longer use outputDirectory or configurationfile in those goals. As a result, hibernate.cfg.xml and all the *.hbm.xml files are generated into my target/hibernate3/generated-mappings/ folder, which is the default value. Then I updated the hbm2java goal with the following:

    ```xml
    <componentProperties>
      <packagename>package.name</packagename>
      <configurationfile>target/hibernate3/generated-mappings/hibernate.cfg.xml</configurationfile>
    </componentProperties>
    ```

    But then I get the following:

    ```
    [INFO] --- hibernate3-maven-plugin:2.2:hbm2java (hbm2java) @ project.persistence ---
    [INFO] using configuration task.
    [INFO] Configuration XML file loaded: file:/C:/Documents%20and%20Settings/mmm/workspace/project.persistence/target/hibernate3/generated-mappings/hibernate.cfg.xml
    12:15:17,484 INFO org.hibernate.cfg.Configuration - configuring from url: file:/C:/Documents%20and%20Settings/mmm/workspace/project.persistence/target/hibernate3/generated-mappings/hibernate.cfg.xml
    12:15:19,046 INFO org.hibernate.cfg.Configuration - Reading mappings from resource : package.name/Messages.hbm.xml
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD FAILURE
    [INFO] ------------------------------------------------------------------------
    [ERROR] Failed to execute goal org.codehaus.mojo:hibernate3-maven-plugin:2.2:hbm2java (hbm2java) on project project.persistence: Execution hbm2java of goal org.codehaus.mojo:hibernate3-maven-plugin:2.2:hbm2java failed: resource: package/name/Messages.hbm.xml not found
    ```

    How do I deal with that? Of course I could add <outputDirectory>src/main/resources/package/name</outputDirectory> to the hbm2hbmxml goal, but I think that is not the best approach, or is it? Is there a way to keep all the generated code and resources away from the src/ folder? I assume the goal of this approach is not to generate any sources into my src/main/java or /resources folders, but to keep the generated code in the target folder. As I generally agree with this point of view, I'd like to continue with it, eventually executing hbm2dao and packaging the project to be used as a generated persistence-layer component by the business layer. Is this also what you meant?

    Read the article

  • The command "".\Bin\mt.exe" -nologo -manifest ... exited with error code 3 in CCNET

    - by soldieraman
    I am trying to build my VS 2008 project in CCNet and am getting the error below:

    ```xml
    <message level="high"><![CDATA[".\Bin\mt.exe" -nologo -manifest "C:\MyProject\MyFile.exe.manifest" -outputresource:"C:\MyProject\bin\Release\MyFile.exe;#1"]]></message>
    <message level="high"><![CDATA[The system cannot find the path specified.]]></message>
    <error code="MSB3073" file="C:\WINDOWS\Microsoft.NET\Framework\v3.5\Microsoft.Common.targets" line="3397" column="13"><![CDATA[The command "".\Bin\mt.exe" -nologo -manifest "C:\Work\CI\Abc20ServerTrunkCheckout\ScannerInterface\Abc.ScannerInterface.Tester.exe.manifest" -outputresource:"C:\MyProject\bin\Release\MyFile.exe;#1" exited with code 3.]]></error>
    ```

    This project builds happily on my local server. Also, there is no Bin folder in Microsoft.NET\Framework\v3.5. I ran msbuild on the project directly and got the same error. Any help will be awesome.

    Read the article

  • How can I set initial values when using Silverlight DataForm and .Net RIA Services DomainDataSource?

    - by TheDuke
    I'm experimenting with .NET RIA Services and Silverlight. I have a few related entities: Client, Project, and Job; a Client has many Projects, and a Project has many Jobs. In the Silverlight app I'm using a DomainDataSource and DataForm controls to perform the CRUD operations. When a Client is selected, a list of Projects appears, at which point the user can add a new Project for that Client. I'd like to be able to fill in the value for the Client automatically, but there doesn't seem to be any way to do that. While there is an AddingNewItem event on the DataForm control, it seems to fire before the DataForm has an instance of the new object, and I'm not sure trawling through the ChangeSet in the DomainDataSource's SubmittingChanges event is the best way to do this. I would have thought this would have been an obvious feature... does anyone know the best way to achieve this functionality?
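    One possible approach, sketched under assumptions: if the generated client entity exposes the OnCreated partial method (later WCF RIA Services code-gen emits one from the entity constructor; verify your preview build does), the default can be applied in a partial class so the DataForm's new item arrives pre-filled. ClientContext, CurrentClientId, and ClientId are hypothetical names for this example.

    ```csharp
    // Hypothetical holder for whichever client is currently selected in the UI.
    public static class ClientContext
    {
        public static int CurrentClientId { get; set; }
    }

    public partial class Project
    {
        // OnCreated is assumed to be emitted by the RIA Services client code-gen
        // and invoked from the generated entity constructor.
        partial void OnCreated()
        {
            // Pre-fill the parent client so a Project added via the DataForm
            // starts with the currently selected Client.
            this.ClientId = ClientContext.CurrentClientId;
        }
    }
    ```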

    Read the article

  • jquery strange flickering on mouseover/out

    - by Jonah
    The HTML:

    ```html
    <div id="timerList">
      ...
      <li rel="project" class="open">
        <a class="" style="" href=""><ins>&nbsp;</ins>Project C</a>
      </li>
      ...
    </div>
    ```

    The JavaScript/jQuery:

    ```javascript
    $('#timerList li[rel="project"]').mouseover(function(){
        $('a:first', this).after('<span class="addNew"><a href="#">Add Timer</a></span>');
    }).mouseout(function(){
        $('.addNew', this).remove();
    });
    ```

    When I hover my mouse over an li element, a span.addNew element is created within it. THE PROBLEM: when I put my mouse over the span.addNew, it flickers on and off. Perhaps the mouseout event is firing, but I don't understand why it would, or how to prevent it. Thanks!

    Read the article

  • Getting ANT to scp only new/changed files

    - by Artem
    I would like to optimize my scp deployment, which currently copies all files, to copy only the files that have changed since the last build. I believe it should be possible with the current setup somehow, but I don't know how to do this. I have the following:

    - Project/src/blah/blah/ <---- the files I am editing (mostly PHP in this case, some static assets)
    - Project/build <------- a local build step that I use to copy the files to here

    I have an scp task right now that copies all of Project/build out to a remote server when I need it. Is it possible to somehow take advantage of this extra "build" directory to accomplish what I want, meaning I only upload the "diff" between src/** and build/**? Is it possible to retrieve this as a fileset in ANT and then scp that? I do realize this means that if I somehow delete or mess around with files on the server in between, the ANT script will not notice, but for me this is okay.

    Read the article

  • Cannot find System.Web.Script.Service namespace error after upgrading to Visual Studio 2010

    - by Gavin
    I've just upgraded a VS 2008 project to VS 2010, converting the project but keeping the target as .NET 3.5 (SP1 is installed). The project worked without issue under VS 2008 on another machine. I've added references to System.Web.Extensions.dll, but I'm still getting the following errors from code in the App_Code folder:

    1) Cannot find System.Web.Script.Service namespace.
    2) Type 'System.Web.Script.Services.ScriptService' is not defined.
    3) Type 'System.Runtime.Serialization.Json.DataContractJsonSerializer' is not defined.

    Anyone have any ideas what the problem might be? I'm pretty stumped. :(
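    For reference, a minimal sketch (a hypothetical service, not the poster's code) of what should compile on .NET 3.5 once the references resolve: ScriptService and ScriptMethod live in the System.Web.Script.Services namespace in System.Web.Extensions.dll (3.5.0.0), while DataContractJsonSerializer additionally requires a reference to System.ServiceModel.Web.dll on .NET 3.5.

    ```csharp
    using System.Web.Services;
    using System.Web.Script.Services; // from System.Web.Extensions.dll

    [WebService(Namespace = "http://example.com/")]
    [ScriptService] // resolves only when System.Web.Extensions is referenced correctly
    public class PingService : WebService
    {
        [WebMethod]
        [ScriptMethod(ResponseFormat = ResponseFormat.Json)]
        public string Ping()
        {
            return "pong";
        }
    }
    ```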

    Read the article

  • ASP.NET MVC & Silverlight development - on Ubuntu

    - by queen3
    I recently moved my working environment from Windows 7 to Ubuntu, and enjoy it every minute of my working day. My work is currently to develop ASP.NET MVC and Silverlight applications, so the important Windows stuff still runs in VirtualBox (IIS, MSSQL, Silverlight 3, and legacy COM stuff). For now, I use Visual Studio under VirtualBox as editor/IDE/debugger. But since I much prefer Ubuntu's fonts and UI, I'd like to move at least the editor (and ideally the IDE) to native Ubuntu.

    Things I have already done:

    - I store project files in my home folder and run Visual Studio from a \vboxsvr share. With a few tricks for ASP.NET, it works.
    - I use svn on Ubuntu.
    - I test my ASP.NET MVC site using Firefox/Firebug on Ubuntu.

    What I need on Ubuntu:

    - A SQL client to manage MSSQL. I mostly need querying, but I would miss SQL IntelliSense support.
    - I enjoy command-line svn a lot, but there are times when it's not enough (e.g. viewing files, checking diffs, and selectively committing at the same time), so I wonder if there are any add-ons. I don't mean a replacement for svn, just add-ons for rare cases like the above.
    - An editor that can provide some C# IntelliSense. Yes, I know about MonoDevelop, but will it provide IntelliSense without compiling (since I'm going to compile remotely on the Windows box)?
    - And, a pretty big topic: what's the best way (editor/IDE) to do "folder-based" development? What I mean: the project is /trunk, and everything is there. I don't want to manually add files to a project or anything like that. The project is the folder and the files under it.
    - The main task is, of course, editing files. I need a quick way to open and search for files in the project. In ReSharper, I can hit Ctrl-Shift-T (IIRC) and just type a file name, and a list of matching files in the project folder appears. For example, gedit has a file tree browser, but I can't quickly type XYZ to find all XYZ files there; moreover, it doesn't automatically switch focus to and from the editor, so it's mouse-oriented. I need a 100% keyboard-driven way.
    - I need syntax highlighting for C#/HTML/JS. Most importantly, I need HTML tag autocompletion. I can live without it, but I'll be sorry.
    - I need to run compilation remotely (via ssh, I think, invoking a NAnt script which does MSBuild), grab results such as errors and warnings, and preferably jump quickly to the error line/file.

    In short, I need to edit/search/open/svn/compile/run files in some folder. It looks like a case for the command line, but imagine I'm in /trunk and want to open file.cs inside /trunk/foo/bar/boo/far; I wouldn't want to type all of that path even with bash autocompletion. I'd prefer to enter :open file.cs, and maybe then select from a list of file.cs and file1.cs.

    Actually, I don't have exact requirements. For example, I don't even know whether to ask for an IDE or an editor, or whether it should have svn support integrated or I should use svn from the console. Do people work with files from the console (search/commit/delete/etc.) and open the editor from there, or do they work from the editor/IDE and manage files (search/commit/delete) from there? Which is better? I have a feeling that vim might have everything I need; I'd like to confirm that before I spend a lot of time learning it. No, I don't want Emacs. Any other IDE? I like Eclipse, but is it good for this kind of work?

    And after all, do you feel this is a good way to go? I enjoy Ubuntu, enjoy learning new stuff, and Ubuntu + Windows in VirtualBox actually runs faster than Windows 7 on my machine, but maybe I should keep development (editor/file management/etc.) in Windows/VirtualBox only, leaving the other stuff for Ubuntu?

    Read the article

  • UITableView backgroundColor always gray on iPad

    - by rjobidon
    Hi, when I set the backgroundColor for my UITableView, it works fine on iPhone (device and simulator) but NOT on the iPad simulator. Instead I get a light gray background for any color I set, including groupTableViewBackgroundColor. Steps to reproduce:

    1. Create a new navigation-based project.
    2. Open RootViewController.xib and set the table view style to "Grouped".
    3. Add this to the RootViewController:

    ```objc
    - (void)viewDidLoad {
        [super viewDidLoad];
        self.view.backgroundColor = [UIColor blackColor];
    }
    ```

    4. Select Simulator SDK 3.2, build and run. You will get a black background (device and simulator).
    5. Select your target in the project tree and click Project : Upgrade Current Target for iPad.
    6. Build and run. You will get a light gray background. Revert the table view style to Plain and you will get a black background.

    Thanks for your help!

    Read the article

  • My delete function does not delete the targeted file

    - by Chester Sabado
    Basically, I can upload files for a project. Whenever I create a project, a new directory is created with the directory name taken from the project name, e.g. "this is a test" becomes this-is-a-test. My problem is that I can't delete a file in such a directory.

    ```php
    function delete_image($id)
    {
        $this->load->model(array('work_model', 'project_model'));
        $result = $this->work_model->get_work($id);
        $result = $this->project_model->get_project($result->project_id);
        $dir = str_replace(" ", "-", $result->project_name);
        $result = $this->work_model->delete($id);
        if (isset($result)) {
            unlink('./uploads/' . $dir . '/' . $result->full_path);
        }
        redirect('admin/project/view_project/' . $result->project_id);
    }
    ```

    Need help on this, thanks.

    Read the article

  • Getting started with zxing on android

    - by amitlicht
    Hi, I'm trying to add zxing to my project (a button which launches the scanner when pressed). I found this thread: http://groups.google.com/group/android-developers/browse_thread/thread/788eb52a765c28b5 and of course the zxing home site: http://code.google.com/p/zxing/, but I still couldn't figure out what to include in the project classpath to make it all work. For now, I copied the classes from the first link into my project (with some package name changes); it runs, but crashes after pressing the button, when it tries to install the barcode scanner. Some code:

    ```java
    private void setScanButton() {
        Button scan = (Button) findViewById(R.id.MainPageScanButton);
        scan.setOnClickListener(new OnClickListener() {
            @Override
            public void onClick(View v) {
                IntentIntegrator.initiateScan(MyActivity.this);
            }
        });
    }
    ```

    The resulting error (from logcat):

    ```
    06-13 15:26:01.540: ERROR/AndroidRuntime(1423): Uncaught handler: thread main exiting due to uncaught exception
    06-13 15:26:01.560: ERROR/AndroidRuntime(1423): android.content.ActivityNotFoundException: No Activity found to handle Intent { act=android.intent.action.VIEW dat=market://search?q=pname:com.google.zxing.client.android }
    ```

    Ideas?

    Read the article

  • Drupal 6 vs Drupal 7 performance

    - by lifecoder
    Hi all. I want to start a new project and am stuck without an idea of which version to use. I have huge experience with D6, and also one project (module development) on D7. It looks like D7 is slower, has higher memory consumption, and also lacks documentation at the moment. I don't need the new CCK, Views, and the like; it looks like I'll be coding all the needed features as modules. Does D7 have sweet parts now, or is the better way to develop the project under D6? Which way would you choose for yourself, and why?

    Read the article

  • How to draw a better looking Graph (A4 size) in Dot?

    - by Nazgulled
    Hi, I have this project that's due in a few hours and I still have a report to write... The project has nothing to do with Dot, but we were asked to draw a graph with Dot, which I did. It looks something like this: http://img683.imageshack.us/img683/9735/dotj.jpg. The longer arrows represent smaller weights and the shorter arrows represent bigger weights. There isn't any problem with submitting my project like this: it does what it's supposed to do, and this Dot thing is just an extra. But I would like to make it pretty, and I just don't have time to learn about Dot right now. Perhaps a bigger height for the page, like A4 paper size, and have the graph extend more toward the bottom rather than everything spreading to the side. What should I put in my .dot file to make it look better?

    Read the article

  • Big Data – Role of Cloud Computing in Big Data – Day 11 of 21

    - by Pinal Dave
    In yesterday's blog post we learned the importance of NewSQL. In this article we will understand the role of the cloud in the Big Data story.

    What is Cloud?

    Cloud is the biggest buzzword of the last few years. Everyone knows about the cloud, and it is extremely well defined online. In this article we will discuss the cloud in the context of Big Data. Cloud computing is a method of providing shared computing resources to applications that require dynamic resources. These resources include applications, computing, storage, networking, development, and various deployment platforms. The fundamental idea of cloud computing is that it pools pretty much all resources and delivers them to end users as a service. Examples of cloud computing and Big Data are Google and Amazon.com; both have fantastic Big Data offerings with the help of the cloud. We will discuss this later in this blog post.

    There are two different cloud deployment models: 1) the public cloud and 2) the private cloud.

    Public Cloud: The public cloud is cloud infrastructure built by commercial providers (Amazon, Rackspace, etc.) who create a highly scalable data center that hides the complex infrastructure from the consumer and provides various services.

    Private Cloud: The private cloud is cloud infrastructure built by a single organization that manages a highly scalable data center internally.

    Here is a quick comparison between the public cloud and the private cloud, from Wikipedia:

    |                | Public Cloud                       | Private Cloud           |
    |----------------|------------------------------------|-------------------------|
    | Initial cost   | Typically zero                     | Typically high          |
    | Running cost   | Unpredictable                      | Unpredictable           |
    | Customization  | Impossible                         | Possible                |
    | Privacy        | No (host has access to the data)   | Yes                     |
    | Single sign-on | Impossible                         | Possible                |
    | Scaling up     | Easy while within defined limits   | Laborious but no limits |

    Hybrid Cloud: A hybrid cloud is cloud infrastructure composed of two or more clouds, such as a public and a private cloud. The hybrid cloud gives the best of both worlds, as it combines multiple cloud deployment models.

    Cloud and Big Data – Common Characteristics

    There are many characteristics of cloud architecture and cloud computing that are also essential for Big Data. They overlap heavily, and in many places it simply makes sense to use the power of both architectures and build a highly scalable framework. Here is the list of cloud computing characteristics important to Big Data:

    - Scalability
    - Elasticity
    - Ad-hoc resource pooling
    - Low cost to set up infrastructure
    - Pay on use, or pay as you go
    - High availability

    Leading Big Data Cloud Providers

    There are many players in the Big Data cloud, but we will list a few of the known ones here.

    Amazon: Amazon is arguably the most popular Infrastructure as a Service (IaaS) provider. The history of how Amazon started in this business is very interesting. They started out with a massive infrastructure to support their own business. Gradually they figured out that their own resources were underutilized most of the time. They decided to get the maximum out of the resources they had, and hence launched the Amazon Elastic Compute Cloud (Amazon EC2) service in 2006. Their products have evolved a lot recently, and cloud services are now one of their primary businesses besides retail. Amazon also offers Big Data services under Amazon Web Services. Here is a list of the included services:

    - Amazon Elastic MapReduce: processes very high volumes of data
    - Amazon DynamoDB: a fully managed NoSQL (Not Only SQL) database service
    - Amazon Simple Storage Service (S3): a web-scale service designed to store and accommodate any amount of data
    - Amazon High Performance Computing: provides low-latency, tuned high performance computing clusters
    - Amazon Redshift: a petabyte-scale data warehousing service

    Google: Though Google is known for its search engine, we all know it is much more than that.

    - Google Compute Engine: offers secure, flexible computing from energy-efficient data centers
    - Google BigQuery: allows SQL-like queries to run against large datasets
    - Google Prediction API: a cloud-based machine learning tool

    Other Players: Besides Amazon and Google, we also have other players in the Big Data market. Microsoft is attempting Big Data in the cloud with Microsoft Azure. Additionally, Rackspace and NASA together have initiated OpenStack. The goal of OpenStack is to provide a massively scaled, multitenant cloud that can run on any hardware.

    Things to Watch

    Cloud-based solutions integrate very well with the Big Data story and are also very economical to implement. However, there are a few things one should watch carefully when deploying Big Data on cloud solutions:

    - Data integrity
    - Initial cost
    - Recurring cost
    - Performance
    - Data access security
    - Location
    - Compliance

    Every company has a different approach to Big Data and different rules and regulations. Based on various factors, one can implement their own custom Big Data solution on a cloud.

    Tomorrow: In tomorrow's blog post we will discuss the various operational databases supporting Big Data.

    Reference: Pinal Dave (http://blog.sqlauthority.com)

    Filed under: Big Data, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

    Read the article

  • Organizing source code in TFS 2010

    - by Rick
    We have just gotten TFS 2010 up and running. We will be migrating our source into TFS, but I have a question about how to organize the code. TFS 2010 has the new concept of project collections, so I have decided to give different groups within our organization their own collections. My team develops many different web applications, and we have several shared components. We also use a few third-party components (such as Telerik). Clearly each web application is its own project, but where do I put the shared components? Should each component be in its own project with separate builds and work items? Is there a best practice or recommended way to do this, specific to TFS 2010?

    Read the article

  • Building a Distributed Commerce Infrastructure in the Cloud using Azure and Commerce Server

    - by Lewis Benge
    One of the biggest questions I routinely get asked is how scalable Commerce Server is. Of course, the textbook answer is that the product has been around for 10 years and powers some of the largest e-commerce websites in the world, so it scales horizontally extremely well. One question remains, though: what if you can't predict the growth in demand on your commerce platform, or need the ability to scale up during busy seasons such as Christmas in a retail environment, but are hesitant to maintain the infrastructure on a year-round basis? The obvious answer is to utilise one of the many elastic cloud infrastructure providers establishing themselves in the ever-growing market. The problem, however, is that Commerce Server is still a product with a legacy tightly coupled dependency on Windows and IIS components.

    Commerce Server 2009, codename "R2", however introduced the concept of an n-tier deployment of Microsoft Commerce Server, meaning you are no longer tied to the core objects API but instead have serializable Commerce Entity objects and business logic, allowing Commerce Server to be built into a WCF-based SOA architecture. Presentation layers no longer need to remain on the same physical machine as the application server, meaning you can now build the user experience in multiple technologies and host it in multiple places, leveraging the transport benefits a WCF service brings, such as message queuing, security, and multiple endpoints.

    All of this logic will still need to remain in your internal infrastructure, for two reasons. Firstly, cloud-based computing infrastructure does not support PCI security requirements; secondly, even though many of the legacy Commerce Server dependencies have been abstracted away in this version of the application, it is still not fully supported to deploy it exclusively into the cloud. If you do wish to benefit from the scalability of the cloud, however, you can still achieve a great Commerce Server and Azure setup by utilising both the Azure AppFabric, in terms of the service bus and authentication services, and Windows Azure to host any online presence you may require. The architecture would be something similar to this: (architecture diagram in the original post)

    This setup allows you to construct your commerce services as part of your on-site infrastructure. These services contain all of the channel's custom business logic and provide the overall interface back into the underlying Commerce Server components. It is recommended that services are constructed around the specific business domains of the application, which, based on your business model, would usually consist of separate services for Catalogue, Orders, Search, Profiles, and Marketing.

    The AppFabric service bus is then used to abstract and aggregate the services further, making them available to the cloud and subsequently secured by AppFabric's authentication services. These services are now available for consumption by any client, using any supported technology, not just .NET. This means you are now able to construct apps for iPhone, integrate with Java-based POS devices, and cover many other potential uses. This aggregation is useful and forms the basis of the further strategy around diversifying and enhancing the e-commerce experience, but it also provides the foundation for the scalability we want to gain from utilising a cloud-based application platform.

    The Windows Azure application platform is Microsoft's solution for benefiting from the true economies of scale offered by the elasticity of the cloud. Just before the launch of the Azure platform, Domino's Pizza actually managed to run their whole Super Bowl operation on the scalability of Windows Azure, simply switching back to their traditional operation the next day with no residual infrastructure costs. The platform can also natively subscribe to services and messages exposed within the AppFabric service bus, making it an ideal solution for building and deploying a presentation layer that needs the support of scalable infrastructure, such as a high-demand public-facing e-commerce portal or the promotional element of a brand.

    Windows Azure has excellent support for ASP.NET, including its own caching providers, meaning expensive operations such as catalogue queries can persist in memory on the application server, reducing the demand on internal infrastructure and prioritising it for more business-critical operations such as receiving orders and processing payments. Windows Azure supports other languages too, meaning that with this approach you could technically build a Commerce Server presentation layer in Java, PHP, or Ruby, or equally in ASP.NET or Silverlight, without having to change any of the underlying business or Commerce Server implementation.

    This SOA-style architecture is one of the primary differentiators for Commerce Server as a product in the e-commerce market, and now, with the introduction of a WCF capability in Commerce Server 2009/2009 R2, the opportunities for extensibility of both the user experience and integration with third parties are drastically increased, all with no effect on the underlying channel logic. So if you are looking at deployment options for your e-commerce application to help support demand in a cost-effective way, I would highly recommend you consider Windows Azure, and if you have any questions in particular about this style of deployment, please feel free to get in touch!
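    To make the n-tier shape concrete, here is a minimal sketch of the kind of per-domain WCF service contract described above; the contract and DTO are hypothetical illustrations, not Commerce Server's actual API. Each business domain (catalogue, orders, search, profiles, marketing) would expose a contract like this, which can then be relayed through the AppFabric service bus to any client technology.

    ```csharp
    using System.ServiceModel;
    using System.Runtime.Serialization;

    [ServiceContract(Namespace = "http://example.com/commerce")]
    public interface ICatalogService
    {
        [OperationContract]
        ProductInfo GetProduct(string catalogName, string productId);
    }

    // Serializable entity carried across the tier boundary instead of
    // the legacy tightly coupled core objects.
    [DataContract]
    public class ProductInfo
    {
        [DataMember] public string Id { get; set; }
        [DataMember] public string DisplayName { get; set; }
        [DataMember] public decimal ListPrice { get; set; }
    }
    ```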

    Read the article

  • Eclipse Could not Delete error

    - by KáGé
    Hello, I'm working on a project with Eclipse, and until now everything was fine, but the last time I tried building it, it returned the error: The project was not built due to "Could not delete '/Torpedo/bin/bin'." Fix the problem, then try refreshing this project and building it since it may be inconsistent. Torpedo Unknown Java Problem. It also deleted my bin folder, which stores all the images and other resources needed by the program (fortunately I had a backup). I've tried googling it and tried every solution I found, but nothing helped; also, most of them suggest deleting the folder by hand, which I can't do. What should I do? Thanks.

    Read the article

  • Pass data to Master Page with ASP.NET MVC

    - by Brian David Berman
    I have a hybrid ASP.NET WebForms/MVC project. In my Master Page, I have a "menu" user control and a "footer" user control. I need to pass some data (two strings) to the "menu" user control on my Master Page (to select the current tab in my menu navigation, etc.). My views are strongly typed to my data model. How can I push data from my controller to my menu, or at least allow my Master Page to access some data pre-defined in my controller? Note: I understand this violates pure ASP.NET MVC, but as I said, it is a hybrid project; the main purpose of introducing ASP.NET MVC into the project was to have more control over my UI in certain situations.
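    One common approach, sketched with hypothetical names: a base controller pushes the two strings into ViewData, which the Master Page and its user controls can read on every request, independently of the strongly typed Model.

    ```csharp
    using System.Web.Mvc;

    public abstract class BaseController : Controller
    {
        protected override void OnActionExecuting(ActionExecutingContext filterContext)
        {
            // Values are placeholders; derived controllers or actions can override them.
            ViewData["CurrentTab"] = "Projects";
            ViewData["CurrentSubTab"] = "Active";
            base.OnActionExecuting(filterContext);
        }
    }
    ```

    The menu user control can then read <%= ViewData["CurrentTab"] %> from the Master Page; since the ViewData dictionary travels alongside the typed Model, the strongly typed views are unaffected.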

    Read the article

  • Error LNK1223 on ARM builds

    - by Seva Alekseyev
    eMbedded Visual C++ 3 project, building for PocketPC 2000. On the ARM build, the linker throws the following error:

    ```
    fatal error LNK1223: invalid or corrupt file: file contains invalid pdata contributions
    ```

    On SH3, the project compiles, links, and works. The project also works when built for ARM in Visual C++ 2005, but I need to test builds specifically from eVC3. Any ideas, please? What is a pdata contribution, and how do I affect (or disable) them? It's something to do with exception handling; I've tried disabling SEH by specifying /EHsc, to no effect.

    Read the article

  • How to refactor large projects in visual studio

    - by Aaron
    I always run into a problem where my projects in Visual Studio (2008) become huge monstrosities, with everything generally thrown into a single Web Application project. I know from checking out some open-source projects that they tend to have multiple projects within a solution, each with its own responsibilities. Does anyone have any advice on how to refactor this? What should be in a separate project vs. part of the web project? Can you point me to any reference material on the subject, or is it just something you become accustomed to with time?

    Read the article

  • How to Increment Visual Studio build number using C++?

    - by Brock Woolf
    I have a Visual Studio 2008 project that produces a file called Game-Release.exe. This was configured under Project Properties - Linker - General:

    ```
    $(OutDir)\$(ProjectName)-Release.exe
    ```

    I would like to take this a bit further by having an incrementing build number, so I would end up with something like Game-Release-Build-1002.exe, where the number on the end is an incrementing integer. I will be storing the built exes in Subversion, so I think I would find this useful (although it's not strictly necessary). Perhaps there is a built-in macro in Visual Studio that could handle this. I was thinking I could have a text file with the build number in it and have the build read, use, and increment the number in the file each time the project is built. My goal, however, is to make the process as automated as possible. What is the best way to accomplish this? If you offer an opinion, please also provide code we can all share. Thanks.
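    As far as I know there is no built-in auto-increment macro for this in Visual Studio 2008 C++ projects, so one workable sketch (an assumption, not a standard feature) is a small helper run as a pre-build event: it keeps the build number in a text file, increments it on every build, and prints the number so a post-build script can rename the output to Game-Release-Build-<n>.exe.

    ```csharp
    using System;
    using System.IO;

    class BumpBuildNumber
    {
        static void Main(string[] args)
        {
            // Path is hypothetical; pass it from the pre-build event, e.g.
            // BumpBuildNumber.exe "$(ProjectDir)buildnumber.txt"
            string counterFile = args.Length > 0 ? args[0] : "buildnumber.txt";

            // Read the previous number (defaulting to 0) and increment it.
            int build = File.Exists(counterFile)
                ? int.Parse(File.ReadAllText(counterFile).Trim()) + 1
                : 1;

            File.WriteAllText(counterFile, build.ToString());

            // Emit the number for a post-build step to consume.
            Console.WriteLine(build);
        }
    }
    ```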

    Read the article

  • Upgrading to Oracle Enterprise Manager 12c Release 2: Top Tips One Must Know

    - by AnkurGupta
    Recently Oracle announced an incremental release of Enterprise Manager 12c called Enterprise Manager 12c Release 2 (EM12c R2), which includes several exciting new features (press announcement). Right before the official release, we upgraded an internal production site from EM 12c R1 to EM 12c R2 and had an extremely pleasant experience. Let me share a few key takeaways, as well as a few tips, from this upgrade exercise.

    I - Why Should You Upgrade To Enterprise Manager 12c Release 2

    While an upgrade is usually recommended primarily to take advantage of the latest features (which is valid for this upgrade as well), I found several other compelling reasons purely from a deployment perspective.

    Standardize your EM deployment: Enterprise Manager comprises several different components (OMS, agents, plug-ins, etc.), and it is possible that these are at varied patch levels in your environment. For instance, in an environment containing Bundle Patch 1 (customer announcement), there is a good chance that you may not have all the components up to date. There are two possible reasons: Bundle Patch 1 involved patching different components (OMS, agents, plug-ins) with multiple one-off patches, which may not have been applied to all components yet; and Bundle Patch 1 for different platforms was not released at the same time, which means you may not have had the chance to patch all components on all platforms. (Note: BP1 patches are not mandatory to upgrade to the EM12c R2 release.) EM12c R2 provides an excellent opportunity to standardize your Cloud Control environment (OMS, repository, and agents) and plug-ins to the latest versions in a single shot.

    All platform releases are made available simultaneously: For the very first time in the history of EM releases, all platforms were released on day one, which means you do not need to wait for platform-specific binaries for the EM OMS or agent to perform installs or upgrades in a heterogeneous environment.

    Highly refined and automated process: The upgrade process is by far the smoothest and cleanest compared to previous releases of Enterprise Manager. The following stand out:

    - Automatic plug-in management: Plug-in upgrades, along with new plug-in deployment, are supported in the upgrade installer wizard, which means the bulk of the updates to the OMS and repository can be done in the same workflow. This saves time and minimizes user input.
    - Plug-in upgrade or migrate auto update: While doing the OMS and repository upgrade, you can use the Auto Update screen in Oracle Universal Installer to check for any updates/patches. That will help you avoid known issues and make sure your upgrade is successful.
    - Mass upgrade of EM agents: A new dedicated menu has been added in the EM console for agent upgrades. The agent upgrade workflow is extremely simple, requiring the agent name as the only input.
    - ADM / JVMD Manager/Agent upgrade: The complete process is supported via UI screens.
    - The EM12c R2 Upgrade Guide is much simpler to follow than those for earlier releases, which is attributed to the simpler upgrade process.

    Robust and performing platform: The EM12c R2 release not only includes several new features but also provides a more stable platform, incorporating several fixes and enhancements in the Enterprise Manager framework.

    II - Few Tips To Remember

    In my last post (blog link) I shared a few tips and tricks from my experience applying the Bundle Patch. Recently I upgraded the same site to EM 12c R2 and found a few points you must take note of while planning this upgrade. The tips below also apply to EM 12c R1 environments that do not have the Bundle Patch 1 patches applied.

    - Verify the monitored application certification: Specific targets like E-Business Suite have not yet been certified as managed targets in EM 12c R2, so make sure to recheck the Enterprise Manager certification matrix on My Oracle Support before planning the upgrade.
    - Plan downtime: Because EM 12c R2 is an incremental release of EM 12c, the EM 12c R1 to EM 12c R2 upgrade supports only the 1-system upgrade approach, which means there will be downtime.
    - OMS name change after upgrade: In multi-OMS environments, the additional OMS is renamed after the upgrade, which has a few implications when you upgrade JVMD and ADP agents on the OMS. This is well documented in the upgrade guide, but make sure you read through all the notes.
    - Upgrading BI Publisher: EM12c R2 is certified with BI Publisher 11.1.1.6.0 only. Therefore, if you are using EM 12c R1 integrated with BI Publisher 11.1.1.5.0, you must upgrade BI Publisher to 11.1.1.6.0, following the steps from the Advanced Installation and Configuration Guide.
    - Perform post-upgrade tasks: Make sure to perform the post-upgrade steps mentioned in the documentation. These include critical changes that must be done right after the upgrade to get the right configuration; for instance, the database plug-in should be upgraded to Revision 3 (12.1.0.2.0 [u120804]).
    - Delete the old OMS home: EM12c R1 to EM12c R2 is an out-of-place upgrade, which means it creates a new Oracle home for the OMS, plug-ins, etc. Therefore, ensure that you have sufficient extra space for the new OMS before starting the upgrade, and clean up the old OMS home after the upgrade (steps are in the documentation). Do NOT remove the agent home on the OMS host, because the agent is upgraded in place. If you have a standby OMS setup, look into the steps to upgrade the standby OMS in the upgrade guide before going ahead.
    - Read the right documentation: Make sure to follow the Upgrade Guide, which provides the most comprehensive information on the EM12c R2 upgrade process. Additionally, you can refer to other resources to get familiar with the upgrade concepts: the recorded webcast "Oracle Enterprise Manager Cloud Control 12c Release 2 Installation and Upgrade Overview" and the presentation "Understanding Enterprise Manager 12.1.0.2 Upgrade".

    We are very excited about this latest release and look forward to hearing feedback from your upgrade experience!

    Read the article
