Search Results

Search found 50722 results on 2029 pages for 'web config transformation'.


  • As a web designer, which language should I learn first for my future career? (PHP or JavaScript) [closed]

    - by kdevs3
    Possible Duplicates: Best Programming Language for Web Development How can I choose a web development language? What language will you choose if you are going to build something big? What is the right option of programming languages and tools for building our website? What is the easiest web programing language at....? Well, I'm more of a basic web designer. I know the easy stuff pretty well (ya know, HTML, CSS). But I've been trying to take it to the next step, and I'm contemplating what I should learn that will help me out the most in my future web design/programming career: should it be JavaScript, or should I try to learn a back-end programming language such as PHP? Lately I have been hearing a lot about how great and useful JavaScript is now, because of libraries such as jQuery and the possibilities opened up by Node.js and other frameworks. I've only learned the most basic JavaScript and used some jQuery (mostly plugins), so I wouldn't know at all what it can actually do. Would JavaScript being as popular and useful as it is now be a reason to stick with it and learn only it for now? Or, as a web designer, how important would it be to learn how to make a web application/website operational and functional, and know how to work with servers, etc.? (Such as getting forms to work and sending data to the server and back.) I've taken a look at frameworks such as CodeIgniter before, and it looks really simple to get started with if I try to learn PHP, but I'm not sure how important it is for my career and what I would gain out of it. I'm asking because I can't decide what I should learn first. When I select it, I really want to take my time and learn the language. I don't want to spend time on learning multiple languages at the same time, so I need to pick wisely. I'm trying to turn in the right direction so my career can hopefully be successful in the future. (If you're asking whether money/getting a job matters, then yes, it does a bit.) I'm hoping I can get opinions and suggestions on this question; thanks for giving me your thoughts.

    Read the article

  • Getting App.config to be configuration specific in VS2010

    - by MarkPearl
    I recently wanted to have a console application that had configuration-specific settings. For instance, if I had two configurations, “Debug” and “Release”, I wanted it to use a specific configuration file (either the debug or the release one) depending on the currently selected configuration. If you want to do something similar, here is a potential solution that worked for me. Setting up a demo app to illustrate the point First, let’s set up an application that will demonstrate the most basic concept. using System; using System.Configuration; namespace ConsoleSpecificConfiguration { class Program { static void Main(string[] args) { Console.WriteLine("Config"); Console.WriteLine(ConfigurationManager.AppSettings["Example Config"]); Console.ReadLine(); } } }   This does a really simple thing: it displays a config value when run. To do this, you also need a config file set up. My default looks as follows… <?xml version="1.0" encoding="utf-8" ?> <configuration> <appSettings> <add key="Example Config" value="Default"/> </appSettings> </configuration>   Your entire solution will look as follows… Running the project you will get the following amazing output…   Let’s now say that instead of having one config file, we want different config settings propagated across depending on whether the solution configuration is “Debug” or “Release”. You can do the following… Step 1 – Create alternate config files First, add additional config files to your solution. You should have some form of naming convention for these config files; I have decided to follow a convention similar to the one used for web.config, so in my instance I am going to add an App.Debug.config and an App.Release.config file, BUT you can follow any naming convention you want provided you wire up the rest of the approach to use it. My files look as follows… App.Debug.config <?xml version="1.0" encoding="utf-8" ?> <configuration> <appSettings> <add key="Example Config" value="Debug"/> </appSettings> </configuration>   App.Release.config <?xml version="1.0" encoding="utf-8" ?> <configuration> <appSettings> <add key="Example Config" value="Release"/> </appSettings> </configuration>   Your solution will now look as follows… Step 2 – Create a bat file that will overwrite files The next step is to create a bat file that will overwrite one file with another. If you right click on the solution in the solution explorer there will be a menu option to add new items to the solution. Create a text file called “copyifnewer.bat”, which will be our copy script. Its contents should look as follows… @echo off echo Comparing two files: %1 with %2 if not exist %1 goto File1NotFound if not exist %2 goto File2NotFound fc %1 %2 /A if %ERRORLEVEL%==0 GOTO NoCopy echo Files are not the same. Copying %1 over %2 copy %1 %2 /y & goto END :NoCopy echo Files are the same. Did nothing goto END :File1NotFound echo %1 not found. goto END :File2NotFound copy %1 %2 /y goto END :END echo Done. Your solution should now look as follows…   Step 3 – Customize the Post Build event command line We now need to wire everything up, which we will do using the post-build event command line in VS2010. Right click on your project and go to its properties. We are now going to wire up the script so that when we build our project it will overwrite the default App.config with whatever file we want.
The syntax goes as follows… call "$(SolutionDir)copyifnewer.bat" "$(ProjectDir)App.$(ConfigurationName).config" "$(ProjectDir)$(OutDir)\$(TargetFileName).config" Testing it If I now change my project configuration to Release and then run my application, I get the following output… Toggling between Release and Debug mode will show that the config file changes each time. And that is it!
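    As an aside, the same copy step can also be expressed as an MSBuild AfterBuild target inside the project file rather than a bat file. This is only a minimal sketch (it assumes the App.$(Configuration).config files from Step 1 and the standard VS2010 build targets):
        <Target Name="AfterBuild" Condition="Exists('$(ProjectDir)App.$(Configuration).config')">
          <!-- Overwrite the output config with the configuration-specific file -->
          <Copy SourceFiles="$(ProjectDir)App.$(Configuration).config"
                DestinationFiles="$(ProjectDir)$(OutDir)$(TargetFileName).config"
                OverwriteReadOnlyFiles="true" />
        </Target>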

    Read the article

  • Map external directory to web.xml

    - by prometheus
    Is there an easy way to map a directory in the web.xml or other deployment descriptor (jetty.xml, etc) files? For example, if I have a directory /opt/files/ is there a way that I can access its files and sub-directories by visiting http://localhost/some-mapping/? It strikes me that there should be some simple way of doing this, but I haven't been able to find out how (via google, stackoverflow, etc). All I've found are servlets that mimic fileservers, which is not what I would like. For reference I am using jetty on an AIX box.
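    One possible approach, sketched here with Jetty 7/8 class names (for Jetty 6 the package would be org.mortbay.jetty.handler, and this sketch is untested): drop a context XML file into Jetty's contexts directory that serves the directory with a plain ResourceHandler.
        <Configure class="org.eclipse.jetty.server.handler.ContextHandler">
          <Set name="contextPath">/some-mapping</Set>
          <Set name="handler">
            <New class="org.eclipse.jetty.server.handler.ResourceHandler">
              <!-- serve /opt/files and allow directory listings -->
              <Set name="resourceBase">/opt/files/</Set>
              <Set name="directoriesListed">true</Set>
            </New>
          </Set>
        </Configure>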

    Read the article

  • April 14th Links: ASP.NET, ASP.NET MVC, ASP.NET Web API and Visual Studio

    - by ScottGu
    Here is the latest in my link-listing blog series: ASP.NET Easily overlooked features in VS 11 Express for Web: Good post by Scott Hanselman that highlights a bunch of easily overlooked improvements that are coming to VS 11 (and specifically the free express editions) for web development: unit testing, browser chooser/launcher, IIS Express, CSS Color Picker, Image Preview in Solution Explorer and more. Get Started with ASP.NET 4.5 Web Forms: Good 5-part tutorial that walks through building an application using ASP.NET Web Forms and highlights some of the nice improvements coming with ASP.NET 4.5. What is New in Razor V2 and What Else is New in Razor V2: Great posts by Andrew Nurse, a dev on the ASP.NET team, about some of the new improvements coming with ASP.NET Razor v2. ASP.NET MVC 4 AllowAnonymous Attribute: Nice post from David Hayden that talks about the new [AllowAnonymous] filter introduced with ASP.NET MVC 4. Introduction to the ASP.NET Web API: Great tutorial by Stephen Walther that covers how to use the new ASP.NET Web API support built into ASP.NET 4.5 and ASP.NET MVC 4. Comprehensive List of ASP.NET Web API Tutorials and Articles: Tugberk Ugurlu links to a huge collection of articles, tutorials, and samples about the new ASP.NET Web API capability. Async Mashups using ASP.NET Web API: Nice post by Henrik on how you can use the new async language support coming with .NET 4.5 to easily and efficiently make asynchronous network requests that do not block threads within ASP.NET. ASP.NET and Front-End Web Development Visual Studio 11 and Front End Web Development - JavaScript/HTML5/CSS3: Nice post by Scott Hanselman that highlights some of the great improvements coming with VS 11 (including the free express edition) for front-end web development. HTML5 Drag/Drop and Async Multi-file Upload with ASP.NET Web API: Great post by Filip W. that demonstrates how to implement an async file drag/drop uploader using HTML5 and ASP.NET Web API. Device Emulator Guide for Mobile Development with ASP.NET: Good post from Rachel Appel that covers how to use various device emulators with ASP.NET and VS to develop cross platform mobile sites. Fixing these jQuery: A Guide to Debugging: Great presentation by Adam Sontag on debugging with JavaScript and jQuery.  Some really good tips, tricks and gotchas that can save a lot of time. ASP.NET and Open Source Getting Started with ASP.NET Web Stack Source on CodePlex: Fantastic post by Henrik (an architect on the ASP.NET team) that provides step by step instructions on how to work with the ASP.NET source code we recently open sourced. Contributing to ASP.NET Web Stack Source on CodePlex: Follow-on to the post above (also by Henrik) that walks through how you can submit a code contribution to the ASP.NET MVC, Web API and Razor projects. Overview of the WebApiContrib project: Nice post by Pedro Reys on the new open source WebApiContrib project that has been started to deliver cool extensions and libraries for use with ASP.NET Web API. Entity Framework Entity Framework 5 Performance Improvements and Performance Considerations for EF5:  Good articles that describe some of the big performance wins coming with EF5 (which will ship with both .NET 4.5 and ASP.NET MVC 4). Automatic compilation of LINQ queries will yield some significant performance wins (up to 600% faster). 
ASP.NET MVC 4 and EF Database Migrations: Good post by David Hayden that covers the new database migrations support within EF 4.3 which allows you to easily update your database schema during development - without losing any of the data within it. Visual Studio What's New in Visual Studio 11 Unit Testing: Nice post by Peter Provost (from the VS team) that talks about some of the great improvements coming to VS11 for unit testing - including built-in VS tooling support for a broad set of unit test frameworks (including NUnit, XUnit, Jasmine, QUnit and more) Hope this helps, Scott

    Read the article

  • Flash builder 4 - change output filename using external build-config.xml (not Ant)

    - by Casey
    I'm trying to change the output filename with a config file loaded via the compiler option -load-config. It looks like this in my compiler arguments: -load-config+=build-config.xml. I've tried the following: <flex-config> <o>absolute/path/to/filename</o> </flex-config> and <flex-config> <output>absolute/path/to/filename</output> </flex-config> and <flex-config> <compiler> <o>absolute/path/to/filename</o> </compiler> </flex-config> and <flex-config> <compiler> <output>absolute/path/to/filename</output> </compiler> </flex-config> but none have worked. I'm on a PC using Flash Builder 4. Has anyone else done this? Also, ideally, I want to use a relative path instead of absolute. I can't get this to work either, even if I do so in the "additional compiler arguments" field of the Project configuration. Thanks in advance!

    Read the article

  • Child web.config can't clear <pages><controls> from parent web.config

    - by Lance Rushing
    How can I "clear" the vendor defined <controls> in my child app's web.config? Parent Web Config. <system.web> <pages> <controls> <!-- START: Vendor Custom Control --> <add tagPrefix="asp" namespace="VENDOR.Web.UI.Base" assembly="System.Web.Extensions, Version=1.0.61025.0, Culture=neutral /> ... <!-- END: Vendor Custom Control --> ... </controls> <tagMapping> <add tagType="System.Web.UI.WebControls.WebParts.WebPartManager" mappedTagType="Microsoft.Web.Preview.UI.Controls.WebParts.WebPartManager" /> <add tagType="System.Web.UI.WebControls.WebParts.WebPartZone" mappedTagType="Microsoft.Web.Preview.UI.Controls.WebParts.WebPartZone" /> </tagMapping> </pages> </system.web> Child: <system.web> <pages> <controls> <add tagPrefix="asp" namespace="System.Web.UI" assembly="System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"/> <add tagPrefix="asp" namespace="System.Web.UI.WebControls" assembly="System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"/> </controls> <tagMapping> <clear/> </tagMapping> </pages> </system.web> I have it working for the <tagMapping> section, but <controls> does not support <clear/> (or ).

    Read the article

  • Mapping UrlEncoded POST Values in ASP.NET Web API

    - by Rick Strahl
    If there's one thing that's a bit unexpected in ASP.NET Web API, it's the limited support for mapping url encoded POST data values to simple parameters of ApiController methods. When I first looked at this I thought I was doing something wrong, because it seems mighty odd that you can bind query string values to parameters by name, but can't bind POST values to parameters in the same way. To demonstrate here's a simple example. If you have a Web API method like this:[HttpGet] public HttpResponseMessage Authenticate(string username, string password) { …} and then hit with a URL like this: http://localhost:88/samples/authenticate?Username=ricks&Password=sekrit it works just fine. The query string values are mapped to the username and password parameters of our API method. But if you now change the method to work with [HttpPost] instead like this:[HttpPost] public HttpResponseMessage Authenticate(string username, string password) { …} and hit it with a POST HTTP Request like this: POST http://localhost:88/samples/authenticate HTTP/1.1 Host: localhost:88 Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 Content-type: application/x-www-form-urlencoded Content-Length: 30 Username=ricks&Password=sekrit you'll find that while the request works, it doesn't actually receive the two string parameters. The username and password parameters are null and so the method is definitely going to fail. When I mentioned this over Twitter a few days ago I got a lot of responses back of why I'd want to do this in the first place - after all HTML Form submissions are the domain of MVC and not WebAPI which is a valid point. However, the more common use case is using POST Variables with AJAX calls. The following is quite common for passing simple values:$.post(url,{ Username: "Rick", Password: "sekrit" },function(result) {…}); but alas that doesn't work. How ASP.NET Web API handles Content Bodies Web API supports parsing content data in a variety of ways, but it does not deal with multiple posted content values. In effect you can only post a single content value to a Web API Action method. That one parameter can be very complex and you can bind it in a variety of ways, but ultimately you're tied to a single POST content value in your parameter definition. While it's possible to support multiple parameters on a POST/PUT operation, only one parameter can be mapped to the actual content - the rest have to be mapped to route values or the query string. Web API treats the whole request body as one big chunk of data that is sent to a Media Type Formatter that's responsible for de-serializing the content into whatever value the method requires. The restriction comes from async nature of Web API where the request data is read only once inside of the formatter that retrieves and deserializes it. Because it's read once, checking for content (like individual POST variables) first is not possible. However, Web API does provide a couple of ways to access the form POST data: Model Binding - object property mapping to bind POST values FormDataCollection - collection of POST keys/values ModelBinding POST Values - Binding POST data to Object Properties The recommended way to handle POST values in Web API is to use Model Binding, which maps individual urlencoded POST values to properties of a model object provided as the parameter. 
Model binding requires a single object as input to be bound to the POST data, with each POST key that matches a property name (including nested properties like Address.Street) being mapped and updated, including automatic type conversion of simple types. This is a very nice feature - and a familiar one from MVC - that makes it very easy to have model objects mapped directly from inbound data. The obvious drawback with Model Binding is that you need a model for it to work: You have to provide a strongly typed object that can receive the data and this object has to map the inbound data. To rewrite the example above to use ModelBinding I have to create a class that maps the properties that I need as parameters:public class LoginData { public string Username { get; set; } public string Password { get; set; } } and then accept the data like this in the API method:[HttpPost] public HttpResponseMessage Authenticate(LoginData login) { string username = login.Username; string password = login.Password; … } This works fine, mapping the POST values to the properties of the login object. As a side benefit of this method definition, the method now also allows posting of JSON or XML to the same endpoint. If I change my request to send JSON like this: POST http://localhost:88/samples/authenticate HTTP/1.1 Host: localhost:88 Accept: application/json Content-type: application/json Content-Length: 40 {"Username":"ricks","Password":"sekrit"} it works as well and transparently, courtesy of the nice Content Negotiation features of Web API. There's nothing wrong with using Model binding and in fact it's a common practice to use (view) model objects for inputs coming back from the client and mapping them into these models. But it can be kind of a hassle if you have AJAX applications with a ton of backend hits, especially if many methods are very atomic and focused and don't effectively require a model or view. Not always do you have to pass structured data, but sometimes there are just a couple of simple response values that need to be sent back. If all you need is to pass a couple of operational parameters, creating a view model object just for parameter purposes seems like overkill. Maybe you can use the query string instead (if that makes sense), but if you can't then you can often end up with a plethora of 'message objects' that serve no further purpose than to make Model Binding work. Note that you can accept multiple parameters with ModelBinding so the following would still work:[HttpPost] public HttpResponseMessage Authenticate(LoginData login, string loginDomain) but only the object will be bound to POST data. As long as loginDomain comes from the querystring or route data this will work. Collecting POST values with FormDataCollection Another more dynamic approach to handle POST values is to collect POST data into a FormDataCollection. FormDataCollection is a very basic key/value collection (like FormCollection in MVC and Request.Form in ASP.NET in general) and you then read the values out individually by querying each. [HttpPost] public HttpResponseMessage Authenticate(FormDataCollection form) { var username = form.Get("Username"); var password = form.Get("Password"); …} The downside to this approach is that it's not strongly typed, you have to handle type conversions on non-string parameters, and it gets a bit more complicated to test such a setup as you have to seed a FormDataCollection with data. On the other hand it's flexible and easy to use and especially with string parameters is easy to deal with. 
It's also dynamic, so if the client sends you a variety of combinations of values on which you make operating decisions, this is much easier to work with than a strongly typed object that would have to account for all possible values up front. The downside is that the code looks old school and isn't as self-documenting as a parameter list or object parameter would be. Nevertheless it's totally functional and a viable choice for collecting POST values. What about [FromBody]? Web API also has a [FromBody] attribute that can be assigned to parameters. If you have multiple parameters on a Web API method signature you can use [FromBody] to specify which one will be parsed from the POST content. Unfortunately it's not terribly useful as it only returns content in raw format and requires a totally non-standard format ("=content") to specify your content. For more info on how FromBody works and several related issues to how POST data is mapped, you can check out Mike Stall's post: How WebAPI does Parameter Binding Not really sure where the Web API team thought [FromBody] would really be a good fit other than a down and dirty way to send a full string buffer. Extending Web API to make multiple POST Vars work? Don't think so Clearly there's no native support for multiple POST variables being mapped to parameters, which is a bit of a bummer. I know in my own work on one project my customer actually found this to be a real sticking point in their AJAX backend work, and we ended up not using Web API and using MVC JSON features instead. That's kind of sad because Web API is supposed to be the proper solution for AJAX backends. With all of ASP.NET Web API's extensibility you'd think there would be some way to build this functionality on our own, but after spending a bit of time digging and asking some of the experts from the team and Web API community I didn't hear anything that even suggests that this is possible. From what I could find I'd say it's not possible primarily because Web API's Routing engine does not account for the POST variable mapping. This means [HttpPost] methods with url encoded POST buffers are not mapped to the parameters of the endpoint, and so the routes would never even trigger a request that could be intercepted. Once the routing doesn't work there's not much that can be done. If somebody has an idea how this could be accomplished I would love to hear about it. Do we really need multi-value POST mapping? I think that POST value mapping is a feature one would expect any API tool to have. If you look at common APIs out there like Flickr and Google Maps etc. they all work with POST data. POST data is very prominent, much more so than JSON inputs, and so supporting as many options that enable it would seem to be crucial. All that aside, Web API does provide very nice features with Model Binding that allows you to capture many POST variables easily enough, and logistically this will let you build whatever you need with POST data of all shapes as long as you map objects. But having to have an object for every operation that receives a data input is going to take its toll in heavy AJAX applications, with a lot of types created that do nothing more than act as parameter containers. I also think that POST variable mapping is an expected behavior and Web API's non-support will likely result in many, many questions like this one: How do I bind a simple POST value in ASP.NET WebAPI RC? with no clear answer to this question. 
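To make the [FromBody] remark above concrete, here is a minimal sketch of what it actually takes to bind a single simple value (assuming the RC-era behavior described in Mike Stall's post):
    [HttpPost]
    public HttpResponseMessage Authenticate([FromBody] string password)
    {
        // The raw urlencoded body must be literally "=sekrit" (note the bare leading '=');
        // a conventional "password=sekrit" body leaves the parameter null.
        return Request.CreateResponse(HttpStatusCode.OK, password);
    }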
I hope for V.next of WebAPI Microsoft will consider this a feature that's worth adding. Related Articles Passing multiple POST parameters to Web API Controller Methods Mike Stall's post: How Web API does Parameter Binding Where does ASP.NET Web API Fit? © Rick Strahl, West Wind Technologies, 2005-2012. Posted in Web Api

    Read the article

  • Row Number Transformation

    The Row Number Transformation calculates a row number for each row, and adds this as a new output column to the data flow. The row number is a sequential number, based on a seed value. Each row receives the next number in the sequence, based on the defined increment value. The final row number can be stored in a variable for later analysis, and can be used as part of a process to validate the integrity of the data movement. The Row Number transform has a variety of uses, such as generating surrogate keys, or as the basis for a data partitioning scheme when combined with the Conditional Split transformation. Properties Seed (Int32) – The first row number or seed value. Increment (Int32) – The value added to the previous row number to make the next row number. OutputVariable (String, optional) – The name of the variable into which the final row number is written post execution. The three properties have been configured to support expressions, or they can be set directly in the normal manner. Expressions on components are only visible on the hosting Data Flow task, not at the individual component level. Sometimes the data type of the property is incorrectly set when the properties are created; see the Troubleshooting section below for details on how to fix this. Installation The component is provided as an MSI file which you can download and run to install it. This simply places the files on disk in the correct locations and also installs the assemblies in the Global Assembly Cache as per Microsoft’s recommendations. You may need to restart the SQL Server Integration Services service, as this caches information about what components are installed, as well as restarting any open instances of Business Intelligence Development Studio (BIDS) / Visual Studio that you may be using to build your SSIS packages. For 2005/2008 only - finally, you will have to add the transformation to the Visual Studio toolbox manually. Right-click the toolbox, and select Choose Items.... Select the SSIS Data Flow Items tab, and then check the Row Number transformation in the Choose Toolbox Items window. This process has been described in detail in the related FAQ entry for How do I install a task or transform component? We recommend you follow best practice and apply the current Microsoft SQL Server Service Pack to your SQL Server servers and workstations; this component requires a minimum of SQL Server 2005 Service Pack 1. Downloads The Row Number Transformation is available for SQL Server 2005, SQL Server 2008 (includes R2) and SQL Server 2012. Please choose the version to match your SQL Server version, or you can install multiple versions and use them side by side if you have more than one version of SQL Server installed. Row Number Transformation for SQL Server 2005 Row Number Transformation for SQL Server 2008 Row Number Transformation for SQL Server 2012 Version History SQL Server 2012 Version 3.0.0.6 - SQL Server 2012 release. Includes upgrade support for both 2005 and 2008 packages to 2012. (5 Jun 2012) SQL Server 2008 Version 2.0.0.5 - SQL Server 2008 release. (15 Oct 2008) SQL Server 2005 Version 1.2.0.34 – Updated installer. (25 Jun 2008) Version 1.2.0.7 - SQL Server 2005 RTM Refresh. SP1 Compatibility Testing. 
Added the ability to reuse an existing column to hold the generated row number, as an alternative to the default of adding a new column to the output. (18 Jun 2006) Version 1.0.0.0 - Public Release for SQL Server 2005 IDW 15 June CTP (29 Aug 2005) Screenshot Code Sample The following code sample demonstrates using the Data Generator Source and Row Number Transformation programmatically in a very simple package. Package package = new Package(); package.Name = "Data Generator & Row Number"; // Add the Data Flow Task Executable taskExecutable = package.Executables.Add("STOCK:PipelineTask"); // Get the task host wrapper, and the Data Flow task TaskHost taskHost = taskExecutable as TaskHost; MainPipe dataFlowTask = (MainPipe)taskHost.InnerObject; // Add Data Generator Source IDTSComponentMetaData100 componentSource = dataFlowTask.ComponentMetaDataCollection.New(); componentSource.Name = "Data Generator"; componentSource.ComponentClassID = "Konesans.Dts.Pipeline.DataGenerator.DataGenerator, Konesans.Dts.Pipeline.DataGenerator, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b2ab4a111192992b"; CManagedComponentWrapper instanceSource = componentSource.Instantiate(); instanceSource.ProvideComponentProperties(); instanceSource.SetComponentProperty("RowCount", 10000); // Add Row Number Tx IDTSComponentMetaData100 componentRowNumber = dataFlowTask.ComponentMetaDataCollection.New(); componentRowNumber.Name = "Row Number Transformation"; componentRowNumber.ComponentClassID = "Konesans.Dts.Pipeline.RowNumberTransform.RowNumberTransform, Konesans.Dts.Pipeline.RowNumberTransform, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b2ab4a111192992b"; CManagedComponentWrapper instanceRowNumber = componentRowNumber.Instantiate(); instanceRowNumber.ProvideComponentProperties(); instanceRowNumber.SetComponentProperty("Increment", 10); // Connect the two components together IDTSPath100 path = dataFlowTask.PathCollection.New(); path.AttachPathAndPropagateNotifications(componentSource.OutputCollection[0], componentRowNumber.InputCollection[0]); #if DEBUG // Save package to disk, DEBUG only new Application().SaveToXml(String.Format(@"C:\Temp\{0}.dtsx", package.Name), package, null); #endif package.Execute(); foreach (DtsError error in package.Errors) { Console.WriteLine("ErrorCode : {0}", error.ErrorCode); Console.WriteLine(" SubComponent : {0}", error.SubComponent); Console.WriteLine(" Description : {0}", error.Description); } package.Dispose(); Troubleshooting Make sure you have downloaded the version that matches your version of SQL Server. We offer separate downloads for SQL Server 2005, SQL Server 2008 and SQL Server 2012. If you get an error when you try and use the component along the lines of The component could not be added to the Data Flow task. Please verify that this component is properly installed.  ... The data flow object "Konesans ..." is not installed correctly on this computer, this usually indicates that the internal cache of SSIS components needs to be updated. This is held by the SSIS service, so you need to restart the SQL Server Integration Services service. You can do this from the Services applet in Control Panel or Administrative Tools in Windows. You can also restart the computer if you prefer. You may also need to restart any current instances of Business Intelligence Development Studio (BIDS) / Visual Studio that you may be using to build your SSIS packages. 
Once installation is complete you need to manually add the task to the toolbox before you will see it and are able to add it to packages - How do I install a task or transform component? Please also make sure you have installed a minimum of SP1 for SQL 2005. The IDtsPipelineEnvironmentService was added in SQL Server 2005 Service Pack 1 (SP1) (see http://support.microsoft.com/kb/916940). If you get an error Could not load type 'Microsoft.SqlServer.Dts.Design.IDtsPipelineEnvironmentService' from assembly 'Microsoft.SqlServer.Dts.Design, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91'. when trying to open the user interface, it implies that your development machine has not had SP1 applied. Very occasionally we get a problem to do with the properties not being created with the correct data type. Since there is no way to programmatically define the data type of a pipeline component property, it can only infer it. Whilst we set an integer value as we create the property, sometimes SSIS decides to define it as a decimal. This is often highlighted when you use a property expression against the property and get an error similar to Cannot convert System.Int32 to System.Decimal. Unfortunately this is beyond our control and there appears to be no pattern as to when this happens. If you do have more information we would be happy to hear it. To fix this issue you can manually edit the package file. In Visual Studio right click the package file from the Solution Explorer and select View Code, which will open the package as raw XML. You can now search for the properties by name or the component name. You can then change the incorrect property data types highlighted below from Decimal to Int32. <component id="37" name="Row Number Transformation" componentClassID="{BF01D463-7089-41EE-8F05-0A6DC17CE633}" … >     <properties>         <property id="38" name="UserComponentTypeName" …>         <property id="41" name="Seed" dataType="System.Int32" ...>10</property>         <property id="42" name="Increment" dataType="System.Decimal" ...>10</property>         ... If you are still having issues then contact us, but please provide as much detail as possible about the error, as well as which version of the task you are using and details of the SSIS tools installed.

    Read the article

  • Is it typical for a provider of a web services to also provide client libraries?

    - by HDave
    My company is building a corporate Java web-app and we are leaning towards using GWT-RPC as the client-server protocol for performance reasons. However, in the future, we will need to provide an API for other enterprise systems to access our data as well. For this, we were thinking of a SOAP based web service. In my experience it is common for commercial providers of enterprise web applications to provide client libraries (Java, .NET, C#, etc.). Is this generally the case? I ask because if so, then why bother using SOAP or REST or any standard web services protocol at all? Why not just create a client libraries that communicate via GWT-RPC?

    Read the article

  • Split vector vs matrix notation for transformation

    - by seahorse
    Some rendering engines like Ogre prefer to use an individual vector-based notation for transformations, like the following Split vector notation: the net transformation is represented by Scale vector = sx, sy, sz Translation vector = tx, ty, tz Rotation quaternion = w,x,y,z Matrix notation: there are other engines which simply use a net combined transformation matrix. What are the advantages of the first notation over the second? Also, for animation interpolation, does it work in the first notation to interpolate across the individual components and use the interpolated parts to get the net transformation? Is this another advantage?
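    As a rough illustration of the interpolation point (using System.Numerics types for convenience, not Ogre's own API): interpolate each split component on its own, then compose the net matrix.
        // using System.Numerics;
        static Matrix4x4 InterpolateTransform(
            Vector3 s0, Quaternion r0, Vector3 t0,
            Vector3 s1, Quaternion r1, Vector3 t1, float u)
        {
            Vector3 s = Vector3.Lerp(s0, s1, u);        // scale: linear
            Quaternion r = Quaternion.Slerp(r0, r1, u); // rotation: spherical linear
            Vector3 t = Vector3.Lerp(t0, t1, u);        // translation: linear
            return Matrix4x4.CreateScale(s)
                 * Matrix4x4.CreateFromQuaternion(r)
                 * Matrix4x4.CreateTranslation(t);      // row-vector order: scale, rotate, translate
        }
        // Lerping the combined matrices element-wise would shear and shrink mid-animation,
        // which is the usual argument for keeping the components split.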

    Read the article

  • Avoid web.config inheritance in child web application using inheritInChildApplications

    - by Blankman
    I am trying to add <location inheritInChildApplications="false"> to my parent web application's web.config but it doesn't seem to be working. My parent's web.config has: <configuration> <configSections> </configSections> // 10 or so custom config sections like log4net, hibernate, <connectionStrings> </connectionStrings> <appSettings> </appSettings> <system.diagnostics> </system.diagnostics> <system.web> <webParts> </webParts> <membership> </membership> <compilation> </compilation> </system.web> <location ..> <system.web> </system.web> </location> <system.webServer> </system.webServer> My child web application is setup as an application in IIS, and is inheriting from the parent's web.config which is causing problems. Where exactly should I place the <location inheritInChildApplications="false"> so it ignores all the various web.config settings?
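    For reference, the placement usually suggested (sketched below; path="." meaning "this directory and below" is an assumption to verify against your IIS layout) keeps <configSections> outside, since section declarations cannot live inside <location>, and wraps everything the children should not inherit:
        <configuration>
          <configSections>
            <!-- section declarations must stay outside <location> -->
          </configSections>
          <location path="." inheritInChildApplications="false">
            <system.web>
              <!-- settings in here are not inherited by child applications -->
            </system.web>
          </location>
        </configuration>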

    Read the article

  • App.Config Ignored After Pre-Build Event

    - by Jake Hulse
    I have a solution with several projects and several developers, each with their own environment (of course). To manage connection strings between each environment, I have created several app.config files for each environment: app.config.dev, app.config.qa, etc. The pre-build event simply copies the app.config.$(ConfigurationName) to app.config. This pre-build event is done for each project in the solution, and the config string is included in each (including the test project). When I use the pre-build events to manage the app.config files, the connection string cannot be found. I can get the tests to run fine by 2 methods: 1. Do not use the pre-build events to manage the app.config file selection, and do it myself or 2. If I check out app.config and make it writable, then the pre-build events work just fine. We are using Visual Studio 2008. I'm down to my last grey hair here, any ideas? Thanks in advance!
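    Given observation 2 above (a writable app.config makes the pre-build event work), one tweak worth trying is clearing the read-only flag inside copyifnewer.bat just before the copy, e.g.:
        rem Clear the read-only attribute the source control client may have left on the target
        if exist %2 attrib -r %2
        copy %1 %2 /y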

    Read the article

  • Selecting Users For A/B (Champion/Challenger) Testing

    - by Gordon Guthrie
    We have a framework that offers A/B split testing. You have a 'champion' version of a page and you develop a 'challenger' version of it. Then you run the website and allocate some of your users the champion and some the challenger and measure their different responses. If the challenger is better than the champion at achieving your metrics then you dethrone the champion and develop a new challenger... My question is what mechanisms should I consider to allocate the versions? A number of options spring to mind: odd or even IP address (or sub-segments) ****.****.****.123 gets champion but .124 gets challenger cookie push - check for a champion/challenger cookie, if it doesn't exist randomly allocate the user to one and push the cookie Best practice? suggestions? comments? experience?
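    As a rough sketch of the "cookie push" option in ASP.NET (the cookie name, 50/50 split and 30-day lifetime are arbitrary choices):
        var cookie = Request.Cookies["ab-variant"];
        if (cookie == null)
        {
            // Randomly allocate once, then persist so the user keeps seeing the same version
            string variant = new Random().Next(2) == 0 ? "champion" : "challenger";
            cookie = new HttpCookie("ab-variant", variant) { Expires = DateTime.UtcNow.AddDays(30) };
            Response.Cookies.Add(cookie);
        }
        string assigned = cookie.Value;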

    Read the article

  • Web App for smartphones in ASPX

    - by Ryan Knoll
    I have been looking around recently to try and find a good place to learn how to start making my website more smartphone-friendly when it is visited by an iPhone, Android, or BlackBerry. This is something like what Digg.com does when it is visited by a smartphone. I have found a few tutorials for PHP but none for ASPX, and all I have is Windows servers. Could anyone point me in the right direction, and show me where to find a quick run-through on how to do something like this? I am a bit lost. :(

    Read the article

  • iis7 large worker process request queue creating process blocking aspnet.config & machine.config amended (bottleneck)

    - by scott_lotus
    ASP.net 2.0 app .net 2.0 framework IIS7 I am seeing a large queue of "requests" appear under the "worker process" option. The states recorded appear to be Authenticate Request and ExecuteRequestHandler more than anything else. I have amended aspnet.config in C:\Windows\Microsoft.NET\Framework64\v2.0.50727 (32 bit path and 64 bit path) to include: maxConcurrentRequestsPerCPU="50000" maxConcurrentThreadsPerCPU="0" requestQueueLimit="50000" I have amended machine.config in C:\Windows\Microsoft.NET\Framework64\v2.0.50727\CONFIG (32 bit and 64 bit path) to include: autoConfig="true" maxIoThreads="100" maxWorkerThreads="100" minIoThreads="50" minWorkerThreads="50" minFreeThreads="176" minLocalRequestFreeThreads="152" Still I get the issue. The issue manifests itself as a large number of processes in the worker process queue. The number of current connections to the website shows 500 when this issue occurs. I don't think I have seen concurrent connections over 500 without this issue occurring. The web application slows as the requests block. Recycling the app pool resolves it for a while (as expected) as the load is spread between the two pools. The application pool in question has FIXED REQUESTS set to recycle at 50000. Thank you for any help. Scott. Quick edit to say: hmm, my developers are telling me the project was built with the .NET 3.5 framework. Looking at C:\Windows\Microsoft.NET\Framework64\v3.5 there does not appear to be an ASPNET.CONFIG or a MACHINE.CONFIG... is there a 3.5 equivalent? After a little searching: apparently 3.5 uses the 2.0 framework's files for those it does not ship itself. So back to the original question, where is my bottleneck?
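    For reference, the attributes listed above belong on the applicationPool element in aspnet.config; a sketch of the expected shape (honoured in IIS7 integrated mode only, on .NET 3.5 SP1 and later):
        <configuration>
          <system.web>
            <!-- integrated-mode concurrency limits; ignored in classic mode -->
            <applicationPool maxConcurrentRequestsPerCPU="50000"
                             maxConcurrentThreadsPerCPU="0"
                             requestQueueLimit="50000" />
          </system.web>
        </configuration>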

    Read the article

  • Web.config encryption/decryption

    - by Akshay
    In my application's web.config file I have a connection string stored. I encrypted it using '---open the web.config file Dim config As Configuration = _ ConfigurationManager.OpenWebConfiguration( _ Request.ApplicationPath) '---indicate the section to protect Dim section As ConfigurationSection = _ config.Sections("connectionStrings") '---specify the protection provider section.SectionInformation.ProtectSection(protectionProvider) '---Apply the protection and update config.Save() Now I can decrypt it using the code Dim config As Configuration = _ ConfigurationManager.OpenWebConfiguration( _ Request.ApplicationPath) Dim section As ConfigurationSection = _ config.Sections("connectionStrings") section.SectionInformation.UnProtectSection() config.Save() I want to know where the key is stored. Also, if somehow my web.config file is stolen, will it be possible for someone to decrypt it using the code above?
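    For illustration, here is the same call with a concrete provider named (the snippet above leaves protectionProvider open):
        '---DataProtectionConfigurationProvider encrypts with the Windows DPAPI machine key,
        '---so a stolen web.config cannot be decrypted on another machine with the code above;
        '---RsaProtectedConfigurationProvider keeps its key in an RSA key container on the server instead.
        section.SectionInformation.ProtectSection("DataProtectionConfigurationProvider")
        config.Save()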

    Read the article

  • ASP.Net Web API in Visual Studio 2010

    - by sreejukg
    Recently, for one of my projects, it was necessary to create a couple of services. In the past I was using WCF; since my services are going to be utilized over HTTP, I was thinking of ASP.Net Web API. So I decided to create a Web API project. Now the real issue is that ASP.Net Web API launched after Visual Studio 2010, and I had to use ASP.Net Web API in VS 2010 itself. By default there is no template available for Web API in Visual Studio 2010. Microsoft has made available an update that installs ASP.Net MVC 4 with Web API in Visual Studio 2010. You can find the update at the URL below. http://www.microsoft.com/en-us/download/details.aspx?id=30683 Though the update denotes ASP.Net MVC 4, it also includes ASP.Net Web API. Download the installation media and start the installer. As usual for any update, you need to agree to the terms and conditions. The installation starts straight away once you click the Install button. If everything goes OK, you will see the success message. Now open Visual Studio 2010; you can see the ASP.Net MVC 4 project template is available for you. Now you can create an ASP.Net Web API project using Visual Studio 2010. When you create a new ASP.Net MVC 4 project, you can choose the Web API template. Further reading http://www.asp.net/web-api/overview/getting-started-with-aspnet-web-api/tutorial-your-first-web-api http://www.asp.net/mvc/mvc4

    Read the article

  • upgrading from MVC4 to MVC5 pre-Release

    - by Jack M
    I have made that dreadful error of upgrading from MVC4 to the MVC5 pre-release by updating the Razor and MVC/WebPages assemblies in my references. I have System.Web.Mvc, System.Web.WebPages, System.Web.WebPages.Razor and System.Web.Razor as version v4.0.30319; when I run my application I get [A]System.Web.WebPages.Razor.Configuration.HostSection cannot be cast to [B]System.Web.WebPages.Razor.Configuration.HostSection. Type A originates from 'System.Web.WebPages.Razor, Version=2.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' in the context 'Default' at location 'C:\Windows\Microsoft.Net\assembly\GAC_MSIL\System.Web.WebPages.Razor\v4.0_2.0.0.0__31bf3856ad364e35\System.Web.WebPages.Razor.dll'. Type B originates from 'System.Web.WebPages.Razor, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' in the context 'Default' at location 'C:\Windows\Microsoft.NET\Framework64\v4.0.30319\Temporary ASP.NET Files\membership\c70f06fe\9163b1ca\assembly\dl3\291c956e\73c25daa_cf74ce01\System.Web.WebPages.Razor.dll'. Is this the same as http://www.asp.net/whitepapers/mvc4-release-notes? Thanks. Adding a stacktrace:
System.Web.WebPages.Razor.WebRazorHostFactory.CreateHostFromConfig(String virtualPath, String physicalPath) +193 System.Web.WebPages.Razor.RazorBuildProvider.GetHostFromConfig() +51 System.Web.WebPages.Razor.RazorBuildProvider.CreateHost() +24 System.Web.WebPages.Razor.RazorBuildProvider.get_Host() +34 System.Web.WebPages.Razor.RazorBuildProvider.EnsureGeneratedCode() +85 System.Web.WebPages.Razor.RazorBuildProvider.get_CodeCompilerType() +34 System.Web.Compilation.BuildProvider.GetCompilerTypeFromBuildProvider(BuildProvider buildProvider) +189 System.Web.Compilation.BuildProvidersCompiler.ProcessBuildProviders() +265 System.Web.Compilation.BuildProvidersCompiler.PerformBuild() +21 System.Web.Compilation.BuildManager.CompileWebFile(VirtualPath virtualPath) +580 System.Web.Compilation.BuildManager.GetVPathBuildResultInternal(VirtualPath virtualPath, Boolean noBuild, Boolean allowCrossApp, Boolean allowBuildInPrecompile, Boolean throwIfNotFound, Boolean ensureIsUpToDate) +571 System.Web.Compilation.BuildManager.GetVPathBuildResultWithNoAssert(HttpContext context, VirtualPath virtualPath, Boolean noBuild, Boolean allowCrossApp, Boolean allowBuildInPrecompile, Boolean throwIfNotFound, Boolean ensureIsUpToDate) +203 System.Web.Compilation.BuildManager.GetVirtualPathObjectFactory(VirtualPath virtualPath, HttpContext context, Boolean allowCrossApp, Boolean throwIfNotFound) +249 System.Web.Compilation.BuildManager.GetCompiledType(VirtualPath virtualPath) +17 System.Web.Mvc.BuildManagerCompiledView.Render(ViewContext viewContext, TextWriter writer) +90 System.Web.Mvc.ViewResultBase.ExecuteResult(ControllerContext context) +380 System.Web.Mvc.ControllerActionInvoker.InvokeActionResultFilterRecursive(IList`1 filters, Int32 filterIndex, ResultExecutingContext preContext, ControllerContext controllerContext, ActionResult actionResult) +109 System.Web.Mvc.ControllerActionInvoker.InvokeActionResultFilterRecursive(IList`1 filters, Int32 filterIndex, ResultExecutingContext preContext, ControllerContext controllerContext, ActionResult actionResult) +890 System.Web.Mvc.ControllerActionInvoker.InvokeActionResultWithFilters(ControllerContext controllerContext, IList`1 filters, ActionResult actionResult) +97 System.Web.Mvc.Async.<>c__DisplayClass1e.<BeginInvokeAction>b__1b(IAsyncResult asyncResult) +241 System.Web.Mvc.Controller.<BeginExecuteCore>b__1d(IAsyncResult asyncResult, ExecuteCoreState innerState) +29 System.Web.Mvc.Async.WrappedAsyncVoid`1.CallEndDelegate(IAsyncResult asyncResult) +111 System.Web.Mvc.Controller.EndExecuteCore(IAsyncResult asyncResult) +53 System.Web.Mvc.Async.WrappedAsyncVoid`1.CallEndDelegate(IAsyncResult asyncResult) +19 System.Web.Mvc.MvcHandler.<BeginProcessRequest>b__4(IAsyncResult asyncResult, ProcessRequestState innerState) +51 System.Web.Mvc.Async.WrappedAsyncVoid`1.CallEndDelegate(IAsyncResult asyncResult) +111 System.Web.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute() +606 System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously) +288
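    The usual remedy (sketched below per the MVC 5 upgrade guidance; verify the version numbers against the packages actually installed) is to point both the root and Views web.config files at the 3.0.0.0 assemblies - for example with a binding redirect in the root web.config, plus updating the webpages:Version appSetting and the System.Web.WebPages.Razor section version in Views\web.config to 3.0.0.0:
        <runtime>
          <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
            <dependentAssembly>
              <assemblyIdentity name="System.Web.WebPages.Razor"
                                publicKeyToken="31bf3856ad364e35" culture="neutral" />
              <bindingRedirect oldVersion="0.0.0.0-3.0.0.0" newVersion="3.0.0.0" />
            </dependentAssembly>
          </assemblyBinding>
        </runtime>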

    Read the article

  • Where to get glib-config for Kubuntu?

    - by Carl Smotricz
    I'm trying to compile Midnight Commander on a KUbuntu 9.10 (Karmic) box with no root access. I've set up a directory under $HOME, downloaded the mc source package and various stuff required for building, such as autotools. I've unpacked the CONTENTS of all those packages into this working directory such that I have the usual ./usr, ./lib, ./etc hierarchy. I manage to get configure through a lot of tests, but I can't seem to fool it into finding glib. checking for glib-2.0... checking for glib-config... no checking for glib12-config... no checking for glib-config... no checking for GLIB - version >= 1.2.6... no *** The glib-config script installed by GLIB could not be found *** If GLIB was installed in PREFIX, make sure PREFIX/bin is in *** your path, or set the GLIB_CONFIG environment variable to the *** full path to glib-config. configure: error: Test for glib failed. GNU Midnight Commander requires glib 1.2.6 or above. My system has glib installed: /lib/libglib-2.0.so.0 /lib/libglib-2.0.so.0.2200.3 ... and I've also downloaded and unpacked the glib package into my working directory: libglib2.0-0_2.22.2-0ubuntu1_i386.deb libglib2.0-dev_2.22.2-0ubuntu1_i386.deb ... but still the elusive glib-config is nowhere to be found. It's not in any debian package for Karmic, either. So I'd appreciate any help getting over this hurdle. Please note, again, that I don't have root, so I can't just merrily apt-get stuff.
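    Worth knowing: glib-config only ever shipped with glib 1.x; glib 2.x (what Karmic carries) replaced it with pkg-config, which is why libglib2.0-dev does not contain it. Since the configure output above says a GLIB_CONFIG override is honoured, one untested workaround sketch is a tiny shim script that forwards the old interface to pkg-config ($HOME/mcbuild is a placeholder for the working directory):
        #!/bin/sh
        # glib-config shim: answer the glib 1.x questions using pkg-config and glib-2.0
        case "$1" in
          --version) exec pkg-config --modversion glib-2.0 ;;
          --cflags)  exec pkg-config --cflags glib-2.0 ;;
          --libs)    exec pkg-config --libs glib-2.0 ;;
        esac
    Then chmod +x glib-config and run GLIB_CONFIG=$PWD/glib-config PKG_CONFIG_PATH=$HOME/mcbuild/usr/lib/pkgconfig ./configure - whether mc's old glib 1.2 checks accept a 2.x answer is exactly the untested part.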

    Read the article

  • Google I/O 2010 - Developing web apps for Chrome Web Store

    Google I/O 2010 - Developing web apps for the Chrome Web Store (Chrome 101) - Erik Kay. Google Chrome is a powerful platform for developing web apps. With Chrome web apps, we're making it easier for users to discover and use these apps. Learn how to build and sell apps for the Chrome Web Store. For all I/O 2010 sessions, please go to code.google.com

    Read the article

  • Getting beyond basic web programming languages. How to be awesome?

    - by user73962
    I'm a web developer that's done a bunch of projects using PHP, jQuery/JS, MySQL using phpMyAdmin, CSS, HTML and a tiny bit of XML. Basically lots of work with CMSs and freehand coding. I'm looking to take things to the next level. I've done a lot of freelance and small contract work, but I'm dying to excel. I'm tired of acting as tech support for all these "non-tech" companies that barely know how to use their own computers..."really, you didn't think to back up your files before switching to a new server??". Think of potential employers as Amazon, Netflix, Twitter, Google, etc. I don't necessarily want to work for these guys specifically, but potentially organizations like this. I could be wrong, but I feel like a big company like this would laugh at me if I interviewed. For example, how helpful is knowing Ruby, SQL (commands without an interface), C++, APIs, Oracle, Java, debugging, QA, etc.? (I realize this is a very random list.) I use Notepad++, but have heard that the bigger boys use IDE interfaces. I'm not really interested in building desktop apps, only web-related stuff. I feel like I've reached my potential and want to really take it up a notch. I see a lot of projects on GitHub and I'm amazed at what people have created. Note - my degree is in economics, but I've done web dev since high school. I definitely wish I had taken more comp sci/programming courses in college. I'm 27 and want to be awesome at web dev before it's too late. Not just decent. Any advice? Book suggestions? Thanks

    Read the article

  • Suggested Web Application Framework and Database for Enterprise, “Big-Data” App?

    - by willOEM
    I have a web application that I have been developing for a small group within my company over the past few years, using Pipeline Pilot (plus jQuery and Python scripting) for web development and back-end computation, and Oracle 10g for my RDBMS. Users upload experimental genomic data, which is parsed into a database, and made available for querying, transformation, and reporting. Experimental data sets are large and have many layers of metadata. A given experimental data record might have a foreign key relationship with a table that describes this data point's assay. Assays can cover multiple genes, which can have multiple transcript, which can have multiple mutations, which can affect multiple signaling pathways, etc. Users need to approach this data from any point in those layers in the metadata. Since all data sets for a given data type can run over a billion rows, this results in some large, dynamic queries that are hard to predict. New data sets are added on a weekly basis (~1GB per set). Experimental data is never updated, but the associated metadata can be updated weekly for a few records and yearly for most others. For every data set insert the system sees, there will be between 10 and 100 selects run against it and associated data. It is okay for updates and inserts to run slow, so long as queries run quick and are as up-to-date as possible. The application continues to grow in size and scope and is already starting to run slower than I like. I am worried that we have about outgrown Pipeline Pilot, and perhaps Oracle (as the sole database). Would a NoSQL database or an OLAP system be appropriate here? What web application frameworks work well with systems like this? I'd like the solution to be something scalable, portable and supportable X-years down the road. Here is the current state of the application: Web Server/Data Processing: Pipeline Pilot on Windows Server + IIS Database: Oracle 10g, ~1TB of data, ~180 tables with several billion-plus row tables Network Storage: Isilon, ~50TB of low-priority raw data

    Read the article

  • Transformation of Product Management in Telecommunications for Rapid Launch of Next Generation Products

    - by raul.goycoolea
    The Telecom industry continues to evolve through disruptive products, uncertain markets, shorter product lifecycles and convergence of technologies. Today’s market has moved from network centric to consumer centric and focuses primarily on the customer experience. This has resulted in several product management challenges, such as increased complexity and volume of offerings, creating product variants, accelerating time-to-market, the ability to provide multiple product views for varied stakeholders, leveraging OSS intelligence at the BSS layer, product co-creation, and increasing audit and security concerns for service providers. This document discusses how enterprise product management, enabled by PLM-based product catalogue solutions, helps to launch next generation products rapidly in the context of the Telecommunications industry.   1.0. Introduction   Figure 1: Business Scenario   Modern business demands the launch of complex products in a very short timeframe and effecting changes in the price plan faster without IT intervention. One of the key transformation initiatives companies are focusing on is in the area of product management transformation and operational efficiency improvement. As part of these initiatives, companies are investing in best-in-class COTS-based Product Management solutions developed on industry-wide standards.   The new COTS packages are planned to integrate with existing or new B/OSS systems to provide a strategic end-to-end agile solution for reduced time-to-market and order journey time. In addition, system rationalization is being undertaken to phase out legacy systems and migrate to strategic systems.   2.0. An Overview of Product Management in Telecom   Product data in telecom is multi-dimensional and difficult to manage. It has increased significantly due to the complexity of the product, product offerings on the converged network, increased volume of offerings, bundled offering structures and ever increasing regulatory requirements.   In addition, the shrinking product lifecycle in telecom makes it difficult to manage the dynamic product data. Mergers and acquisitions coupled with organic growth pose major challenges in product portfolio management. It is a roadblock in the journey towards becoming an agile organization.
3.0 Transformation of Next Generation Product Management

Companies must focus their product management transformation journey on:

· Managing a single truth of product information across the organization and geographies, which is currently held in heterogeneous systems
· Managing the intellectual property (IP) in the product concept, and partnering in the design of discrete components to integrate into the system
· Leveraging structured and unstructured product data within the extended enterprise to extract consumer insights and drive innovation
· Managing effective operational separation to comply with regulatory bodies
· Reusing existing designs and adding relevant features, such as value-added services, to enable effective product bundling

Figure 4: Next Generation Needs

PLM-based Enterprise Product Catalogue solutions efficiently address the above requirements and act as an enabler of product management transformation and rapid product launch.

4.0 PLM-based Enterprise Product Management

Figure 5: PLM-based Enterprise Product Mastering

Enterprise Product Management (EPM) enables the business to manage complex product data in complex environments. Product mastering helps create a 'single view' of the product through a business-driven, IT-supported environment in which a global 'single truth record' is created, managed and reused.

4.1 The Business Case for Telco PLM-based Solutions for Enterprise Product Management

· Telco PLM-based product mastering solutions provide a centralized authoring environment for product definition and control of all product data and rules
· PLM packages are designed to support multiple perspectives of product data (ordering perspective, billing perspective, provisioning perspective)
· They maintain relationships/links between the different elements of the entire product definition
· Telco PLM packages specialize in next generation lifecycle management requirements of products, such as revision and state management, test and release management, role management and impact analysis
· They take into consideration all aspects of OSS product requirements, unlike CRM product catalogue solutions, where the product data managed is mostly order-oriented and transactional
· The new breed of Telco PLM packages is designed around 'open' standards such as SID and eTOM; they are interoperable and support integration frameworks such as subscription and notification
· Telco PLM packages have developed good collaboration frameworks to integrate suppliers and partners into the product development value chain

4.2 Various Architectures/Approaches for Product Mastering Using Telco PLM Systems

4.2.a Single Central Product Management (Mastering) Approach

Figure 6: Single Central Product Management (Master) Approach

This approach is implemented across verticals such as aerospace and automotive. It focuses on a physically centralized product master on which other sources depend. The product definition data (product bundles, service bundles, price plans, offers and discounts, product configuration rules and market campaigns) is created and maintained physically in a centralized environment. In addition, the product definition/authoring environment is centralized. The existing legacy product definition data held in the CRM product catalogue, the billing catalogue and the legacy product catalogue is migrated to the centralized PLM-based Enterprise Product Management solution.

Architectural changes must be made in the existing application landscape, because applications that create and revise data have to refer to the central repository for approval and validation of product configurations. This is achieved by modifying how the applications write data, or by adapting them to consume the rules managed and published centrally.

Complete product configuration validation is performed in the enterprise/central product catalogue, and the final configuration is sent to the B/OSS systems through an SOA-compliant product distribution architecture. The approach enables greater control over product data management and product data governance.
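As a rough illustration of the centralized pattern, here is a minimal sketch of downstream applications deferring configuration validation to a single product master. All class, rule and attribute names are invented for illustration; real PLM catalogues expose far richer models and rule engines.

from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class ProductConfig:
    product_id: str
    attributes: Dict[str, str]

class CentralProductMaster:
    """Single physical master: all product definitions and configuration
    rules live here. Downstream CRM/billing applications must validate
    against it before committing a configuration."""

    def __init__(self) -> None:
        self._definitions: Dict[str, ProductConfig] = {}
        self._rules: List[Callable[[ProductConfig], str]] = []

    def define(self, config: ProductConfig) -> None:
        self._definitions[config.product_id] = config

    def add_rule(self, rule: Callable[[ProductConfig], str]) -> None:
        # A rule returns an empty string when satisfied, or a violation message
        self._rules.append(rule)

    def validate(self, config: ProductConfig) -> List[str]:
        """Return the list of rule violations; an empty list means approved."""
        if config.product_id not in self._definitions:
            return [f"unknown product {config.product_id}"]
        return [msg for rule in self._rules if (msg := rule(config))]

# A downstream ordering system asks the master before activating an order
master = CentralProductMaster()
master.define(ProductConfig("P100", {"speed": "50Mb"}))
master.add_rule(lambda c: "" if "speed" in c.attributes else "missing speed attribute")

violations = master.validate(ProductConfig("P100", {"speed": "50Mb"}))
print("approved" if not violations else violations)

The point of the pattern is that the validate call is the only path to an approved configuration, which is what gives the central approach its stronger data governance.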
4.2.b Federated Product Management (Mastering) Architecture

Figure 7: Federated Product Management (Mastering) Architecture

In the federated product mastering approach, the basic, unique product definition data (product ID, description, product hierarchy, basic price plans and simple product design rules) is created and maintained centrally, while the advanced product definition (product bundling, promotions, offers and discount plans) is created and mastered in the respective downstream OSS systems.

For example, basic product definitions such as attributes, product hierarchy and basic price plans are created and maintained in the enterprise/central product reference catalogue and distributed to downstream OSS systems. The downstream OSS systems build product bundles, promotions and advanced price plans on top of the basic product definition and master that advanced definition. The central reference database accesses the respective source product masters and assembles a point-in-time consolidated view of the product. This approach is typically adopted in merger and acquisition scenarios where there is a low probability of a central physical authority managing the data. In addition, the migration effort is minimal and no major architectural changes to the application landscape are required. However, this approach does not deliver the same level of product data management and data governance.
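A minimal sketch of the federated pattern follows; the catalogue names and payloads are hypothetical. The central reference owns only the basic definition, each downstream system masters its own advanced definition, and a consolidated view is assembled on demand.

from typing import Dict

# Central reference catalogue: owns only the basic, unique product definition
central_reference: Dict[str, Dict] = {
    "P100": {"name": "Broadband 50Mb", "hierarchy": "Fixed/Data", "base_price": 29.99},
}

# Downstream OSS systems master their own advanced definitions
billing_catalogue = {"P100": {"promotions": ["3 months half price"]}}
crm_catalogue = {"P100": {"bundles": ["Triple Play"], "upsell": ["Broadband 100Mb"]}}

def assemble_view(product_id: str) -> Dict:
    """Point-in-time consolidated view: start from the central basic
    definition and merge in what each downstream master currently holds."""
    view = dict(central_reference[product_id])
    for source_name, source in (("billing", billing_catalogue), ("crm", crm_catalogue)):
        view[source_name] = source.get(product_id, {})
    return view

print(assemble_view("P100"))

Because the consolidated view is assembled at read time from sources the centre does not control, consistency depends on each downstream master's discipline, which is why this approach offers weaker governance than the central one.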
5.0 Customer Scenario – Before EPC Deployment

A leading global telecommunications service provider wanted to launch quad play and triple play service offerings in the shortest possible lead time. The service provider was offering broadband and VoIP services to customers, and wanted to reuse the majority of its broadband services and price plans and bundle them with new wireless and IPTV services for quad play and triple play.

Figure 8: Triple Play Plan

The challenges in launching the new service offerings were:

· Broadband product data was stored in multiple product catalogues (CRM catalogue, billing catalogue, spreadsheets)
· Product managers spent a lot of time on tasks involving duplication or re-keying of data; the manual effort caused errors, cost and time overruns
· There was no effective product and price data governance mechanism; price change issues arising from the lack of data consistency across systems resulted in leakage of customer value and revenue
· Product data had reusability issues and was not in a structured format, resulting in uncontrolled product portfolio creation and product management issues
· The lack of an enterprise product model led to product distribution challenges and thus delays in product launch
· Designers were constrained by the existing legacy product management solutions when modelling product/service requirements and product configuration rules such as upgrading, downgrading and cross-selling

5.1 Customer Scenario – After EPC Deployment

Figure 9: SOA-based End-to-End EPC Solution

After evaluating various product catalogues, the company deployed a PLM-based Enterprise Product Catalogue solution to launch the quad play service. The broadband product offering, service and price data were migrated to the new system, and the product and price plan hierarchies for the new offerings were created using the entities defined in the enterprise product model. Supplier product catalogue data, such as routers and set-top boxes, was loaded into the new solution through SOA-based web services. Price plans and configuration rules were built in the new system. The validated final product configurations were extracted from the product catalogue in SID format and distributed to the downstream B/OSS systems through exposed SOA-based web services. The transformations required for each B/OSS system were handled by the transformation layer that forms part of the solution.
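The extraction and distribution step might look like the sketch below, which serializes a validated configuration to an XML payload and routes it through a per-system transformation layer. The element names are illustrative only and are not the actual TM Forum SID schema, and the web-service call is replaced by a print statement.

import xml.etree.ElementTree as ET
from typing import Callable, Dict

def to_catalogue_xml(product: Dict[str, str]) -> str:
    """Serialize a validated product configuration to an XML payload.
    Element names are illustrative only -- not the real TM Forum SID schema."""
    root = ET.Element("ProductOffering", id=product["product_id"])
    ET.SubElement(root, "Name").text = product["name"]
    ET.SubElement(root, "PricePlan").text = product["price_plan"]
    return ET.tostring(root, encoding="unicode")

# Per-system transformation layer: each downstream B/OSS expects its own shape
transformations: Dict[str, Callable[[str], str]] = {
    "billing": lambda xml: xml.replace("ProductOffering", "BillingItem"),
    "provisioning": lambda xml: xml,  # already in the shape provisioning expects
}

def distribute(product: Dict[str, str]) -> None:
    payload = to_catalogue_xml(product)
    for system, transform in transformations.items():
        message = transform(payload)
        # In the real deployment this would be an SOA web-service call;
        # here we just print the message that would be sent.
        print(f"-> {system}: {message}")

distribute({"product_id": "B001", "name": "Quad Play", "price_plan": "Family 59.99"})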
6.0 How PLM Enabled Product Management Transformation

Figure 10: Product Management Transformation

The PLM-based product catalogue solution helped the customer reduce the product launch cycle time by 30% and enabled the transformation of product management for next generation services.

7.0 Conclusion

On the one hand, the telecom industry is undergoing change driven by disruption, uncertain product markets and the increased complexity of products; on the other, ARPU is decreasing year-on-year. Communications service providers are therefore embarking on convergence, bundled service offerings, the flexibility to cross-sell and up-sell, new value-added services, and leveraging Web 2.0 concepts and network capabilities. Consequently, large-scale IT transformation initiatives that improve ARPU by supporting network and business transformations have become a business imperative, and product management has become a focus area. Companies are investing in best-in-class COTS solutions to reduce time-to-market, ensure rapid service delivery and improve operational efficiency. An efficient PLM-based enterprise product mastering solution plays a key role in achieving zero-touch automation and rapid product launch.

