Search Results

Search found 25660 results on 1027 pages for 'dotnetnuke development'.


  • Laissez les bons temps rouler! (Microsoft BI Conference 2010)

    - by smisner
    "Laissez les bons temps rouler" is a Cajun phrase that I heard frequently when I lived in New Orleans in the mid-1990s. It means "Let the good times roll!" and encapsulates a feeling of happy expectation. As I met with many of my peers and new acquaintances at the Microsoft BI Conference last week, this phrase kept running through my mind as people spoke about their plans in their respective businesses, the benefits and opportunities that the recent releases in the BI stack are providing, and their expectations about the future of the BI stack. Notwithstanding some jabs here and there to point out the platform is neither perfect now nor will be anytime soon (along with admissions that the competitors are also not perfect), and notwithstanding several missteps by the event organizers (which I don't care to enumerate), the overarching mood at the conference was positive. It was a refreshing change from the doom and gloom hovering over several conferences that I attended in 2009. Although many people expect economic hardships to continue over the coming year or so, everyone I know in the BI field is busier than ever and expects to stay busy for quite a while. Self-Service BI Self-service was definitely a theme of the BI conference. In the keynote, Ted Kummert opened with a look back to a fairy tale vision of self-service BI that he told in 2008. At that time, the fairy tale future was a time when "every end user was able to use BI technologies within their job in order to move forward more effectively" and transitioned to the present time in which SQL Server 2008 R2, Office 2010, and SharePoint 2010 are available to deliver managed self-service BI. This set of technologies is presumably poised to address the needs of the 80% of users that Kummert said do not use BI today. He proceeded to outline a series of activities that users ought to be able to do themselves--from simple changes to a report like formatting or an addtional data visualization to integration of an additional data source. The keynote then continued with a series of demonstrations of both current and future technology in support of self-service BI. Some highlights that interested me: PowerPivot, of course, is the flagship product for self-service BI in the Microsoft BI stack. In the TechEd keynote, which was open to the BI conference attendees, Amir Netz (twitter) impressed the audience by demonstrating interactivity with a workbook containing 100 million rows. He upped the ante at the BI keynote with his demonstration of a future-state PowerPivot workbook containing over 2 billion records. It's important to note that this volume of data is being processed by a server engine, and not in the PowerPivot client engine. (Yes, I think it's impressive, but none of my clients are typically wrangling with 2 billion records at a time. Maybe they're thinking too small. This ability to work quickly with large data sets has greater implications for BI solutions than for self-service BI, in my opinion.) Amir also demonstrated KPIs for the future PowerPivot, which appeared to be easier to implement than in any other Microsoft product that supports KPIs, apart from simple KPIs in SharePoint. (My initial reaction is that we have one more place to build KPIs. Great. It's confusing enough. I haven't seen how well those KPIs integrate with other BI tools, which will be important for adoption.) One more PowerPivot feature that Amir showed was a graphical display of the lineage for calculations. 
    (This is hugely practical, especially if you build up calculations incrementally. You can more easily follow the logic from calculation to calculation. Furthermore, if you need to make a change to one calculation, you can assess the impact on other calculations.) Another product that was demonstrated will be available within the next 30 days--Pivot for Reporting Services. If you haven't seen this technology yet, check it out at www.getpivot.com. (It definitely has a wow factor, but I'm skeptical about its practicality. However, I'm looking forward to trying it out with data that I understand.) Michael Tejedor (twitter) demonstrated a feature that I think is really interesting and not emphasized nearly enough--overshadowed by PowerPivot, no doubt. That feature is the Microsoft Business Intelligence Indexing Connector, which enables search of the content of Excel workbooks and Reporting Services reports. (This capability existed in MOSS 2007, but was more cumbersome to implement. The search results in SharePoint 2010 are not only cooler, but more useful by describing whether the content is found in a table or a chart, for example.) This may yet be the dawning of the age of self-service BI - a phrase I've heard repeated from time to time over the last decade - but I think BI professionals are likely to stay busy for a long while, and need not start looking for a new line of work. Kummert repeatedly referenced strategic BI solutions in contrast to self-service BI to emphasize that self-service BI is not a replacement for the services that BI professionals provide. After all, self-service BI does not appear magically on user desktops (or whatever device they want to use). A supporting infrastructure is necessary, and grows in complexity in proportion to the need to simplify BI for users. It's one thing to hear the party line touted by Microsoft employees at the BI keynote, but it's another to hear from the people who are responsible for implementing and supporting it within an organization. Rob Collie (blog | twitter), Kasper de Jonge (blog | twitter), Vidas Matelis (site | twitter), and I were invited to join Andrew Brust (blog | twitter) as he led a Birds of a Feather session at TechEd entitled "PowerPivot: Is It the BI Deal-Changer for Developers and IT Pros?" I would single out the prevailing concern in this session as the issue of control. On one side of this issue were those who were concerned that they would lose control once PowerPivot is implemented. On the other side were those who believed that data should be freely accessible to users in PowerPivot, and there was even an acknowledgment that users will get the data they want even if it means they have to manually enter it into a workbook to have it ready for analysis. For another viewpoint on how PowerPivot played out at the conference, see Rob Collie's observations.

    Collaborative BI

    I have been intrigued by the notion of collaborative BI for a very long time. Before I discovered BI, I was a Lotus Notes developer and later a manager of developers, working in a software company that enabled collaboration in the legal industry. Not only did I help create collaborative systems for our clients, I created a complete project management system from the ground up to collaboratively manage our custom development work. In that case, collaboration involved my team, my client contacts, and me. I was also able to produce my own BI from that system, but didn't know that's what I was doing at the time.
    Only in recent years has SharePoint begun to catch up with the capabilities that I had with Lotus Notes more than a decade ago. Eventually, I had the opportunity at that job to formally investigate BI as another product offering for our software, and the rest - as they say - is history. I built my first data warehouse with Scott Cameron (who has also ventured into the authoring world by writing Analysis Services 2008 Step by Step and was at the BI Conference last week where I got to reminisce with him for a bit) and that began a career that I never imagined at the time. Fast forward to 2010, and I'm still lauding the virtues of collaborative BI, if only the tools will catch up to my vision! Thus, I was anxious to see what Donald Farmer (blog | twitter) and Rita Sallam of Gartner had to say on the subject in their session "Collaborative Decision Making." As I suspected, the tools aren't quite there yet, but the vendors are moving in the right direction. One thing I liked about this session was a non-Microsoft perspective of the state of the industry with regard to collaborative BI. In addition, this session included a better demonstration of SharePoint collaborative BI capabilities than appeared in the BI keynote. Check out the video in the link to the session to see the demonstration. One of the use cases that was demonstrated was linking from information to a person, because, as Donald put it, "People don't trust data, they trust people."

    The Microsoft BI Stack in General

    A question I hear all the time from students when I'm teaching is how to know what tools to use when there is overlap between products in the BI stack. I've never taken the time to codify my thoughts on the subject, but saw that my friend Dan Bulos provided good insight on this topic from a variety of perspectives in his session, "So Many BI Tools, So Little Time." I thought one of his best points was that ideally you should be able to design in your tool of choice, and then deploy to your tool of choice. Unfortunately, the ideal is yet to become real across the platform. The closest we come is with the RDL in Reporting Services, which can be produced from two different tools (Report Builder or Business Intelligence Development Studio's Report Designer), manually, or by a third-party or custom application. I have touted the idea for years (and publicly said so about 5 years ago) that eventually more products would be RDL producers or consumers, but we aren't there yet. Maybe in another 5 years. Another interesting session that covered the BI stack against a backdrop of competitive products was delivered by Andrew Brust. Andrew did a marvelous job of consolidating a lot of information in a way that clearly communicated how various vendors' offerings compared to the Microsoft BI stack. He also made a particularly compelling argument about how the existence of an ecosystem around the Microsoft BI stack provides innovation and opportunities lacking for other vendors. Check out his presentation, "How Does the Microsoft BI Stack...Stack Up?"

    Expo Hall

    I had planned to spend more time in the Expo Hall to see who was doing new things with the BI stack, but didn't manage to get very far. Each time I set out on an exploratory mission, I got caught up in some fascinating conversations with one or more of my peers. I find interacting with people that I meet at conferences just as important as attending sessions to learn something new. There were a couple of items that really caught my eye, however, that I'll share here.
    Pragmatic Works. Whether you develop SSIS packages, build SSAS cubes, or author SSRS reports (or all of the above), you really must take a look at BI Documenter. Brian Knight (twitter) walked me through the key features, and I must say I was impressed. Once you've seen what this product can do, you won't want to document your BI projects any other way. You can download a free single-user database edition, or choose from more feature-rich standard or professional editions.

    Microsoft Press ebooks. I also stopped by the O'Reilly Media booth to meet some folks that one of my acquisitions editors at Microsoft Press recommended. In case you haven't heard, Microsoft Press has partnered with O'Reilly Media for distribution and publishing. Apart from my interest in learning more about O'Reilly Media as an author, an advertisement in their booth caught my eye for what I think is a really great move. When you buy Microsoft Press ebooks through the O'Reilly web site, you can receive them in any (or all) of the following formats where possible: PDF, epub, .mobi for Kindle and .apk for Android. You also have lifetime DRM-free access to the ebooks. As someone who is an avid collector of books, I find myself running out of room for storage. In addition, I travel a lot, and it's hard to lug my reference library with me. Today's e-reader options make the move to digital books a more viable way to grow my library. Having a variety of formats means I am not limited to a single device, and lifetime access means I don't have to worry about keeping track of where I've stored my files. Because the e-books are DRM-free, I can copy and paste when I'm compiling notes, and I can print pages when necessary. That's a winning combination in my mind!

    Overall, I was pleased with the BI conference. There were many more sessions that I couldn't attend, either because the room was full when I got there or there were multiple sessions running concurrently that I wanted to see. Fortunately, many of the sessions are accessible for viewing online at http://www.msteched.com/2010/NorthAmerica along with the TechEd sessions. You can spot the BI sessions by the yellow skyline on the title slide of the presentation.

    Read the article

  • Strange exception when connecting to a WCF service via a proxy server

    - by Slauma
    The exception "This operation is not supported for a relative URI." occurs in the following situation: I have a WCF service: [ServiceContract(ProtectionLevel=ProtectionLevel.None)] public interface IMyService { [OperationContract] [FaultContract(typeof(MyFault))] List<MyDto> MyOperation(int param); // other operations } public class MyService : IMyService { public List<MyDto> MyOperation(int param) { // Do the business stuff and return a list of MyDto } // other implementations } MyFault and MyDto are two very simple classes marked with [DataContract] attribute and each only having three [DataMember] of type string, int and int?. This service is hosted in IIS 7.0 on a Win 2008 Server along with an ASP.NET application. I am using an SVC file MyService.svc which is located directly in the root of the web site. The service configuration in web.config is the following: <system.serviceModel> <services> <service name="MyServiceLib.MyService"> <endpoint address="" binding="wsHttpBinding" bindingConfiguration="wsHttpBindingConfig" contract="MyServiceLib.IMyService" /> </service> </services> <bindings> <wsHttpBinding> <binding name="wsHttpBindingConfig"> <security mode="None"> <transport clientCredentialType="None" /> </security> </binding> </wsHttpBinding> </bindings> <behaviors> <serviceBehaviors> <behavior> <serviceMetadata httpGetEnabled="false"/> <serviceDebug includeExceptionDetailInFaults="false" /> </behavior> </serviceBehaviors> </behaviors> </system.serviceModel> This seems to work so far as I can enter the address http://www.domain.com/MyService.svc in a browser and get the "This is a Windows Communication Foundation Service"-Welcome page. One of the clients consuming the service is a console application: MyServiceClient aChannel = new MyServiceClient("WSHttpBinding_IMyService"); List<MyDto> aMyDtoList = aChannel.MyOperation(1); It has the following configuration: <system.serviceModel> <bindings> <wsHttpBinding> <binding name="WSHttpBinding_IMyService" closeTimeout="00:01:00" openTimeout="00:01:00" receiveTimeout="00:10:00" sendTimeout="00:01:00" bypassProxyOnLocal="true" transactionFlow="false" hostNameComparisonMode="StrongWildcard" maxBufferPoolSize="524288" maxReceivedMessageSize="65536" messageEncoding="Text" textEncoding="utf-8" useDefaultWebProxy="false" proxyAddress="10.20.30.40:8080" allowCookies="false"> <readerQuotas maxDepth="32" maxStringContentLength="8192" maxArrayLength="16384" maxBytesPerRead="4096" maxNameTableCharCount="16384" /> <reliableSession ordered="true" inactivityTimeout="00:10:00" enabled="false" /> <security mode="None"> <transport clientCredentialType="Windows" proxyCredentialType="None" realm="" /> <message clientCredentialType="Windows" negotiateServiceCredential="true" /> </security> </binding> </wsHttpBinding> </bindings> <client> <endpoint address="http://www.domain.com/MyService.svc" binding="wsHttpBinding" bindingConfiguration="WSHttpBinding_IMyService" contract="MyService.IMyService" name="WSHttpBinding_IMyService" /> </client> </system.serviceModel> When I run this application at a production server at a customer site calling aChannel.MyOperation(1) throws the following exception: This operation is not supported for a relative URI. When I run this client application on my development PC with exactly the same config, with the exception that I remove proxyAddress="10.20.30.40:8080" from the bindings the operation works without problems. Now I really don't know what specifying a proxy server address might have to do with absolute or relative URIs. 
    Whether or not the proxy server is used is the only difference I can see between running the client on the production machine and on the development machine. Does someone have an idea what this exception might mean in this context, and how to solve the problem? Thanks in advance for your help!
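    A likely explanation, given the config above: WCF requires proxyAddress to be an absolute URI, and "10.20.30.40:8080" carries no scheme, so it parses as a relative URI. The error only surfaces on the production machine because that is the only place the proxy setting is actually present. A minimal sketch of the fix, expressed programmatically (the service address and binding settings are taken from the question; the http:// scheme prefix is the assumed correction):

        // Requires a reference to System.ServiceModel.
        var binding = new WSHttpBinding(SecurityMode.None)
        {
            UseDefaultWebProxy = false,
            BypassProxyOnLocal = true,
            // Absolute URI with a scheme -- not "10.20.30.40:8080".
            ProxyAddress = new Uri("http://10.20.30.40:8080")
        };
        var endpoint = new EndpointAddress("http://www.domain.com/MyService.svc");
        var factory = new ChannelFactory<IMyService>(binding, endpoint);
        IMyService channel = factory.CreateChannel();
        List<MyDto> result = channel.MyOperation(1);

    The equivalent one-attribute change in the client config would be proxyAddress="http://10.20.30.40:8080".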

    Read the article

  • .NET UserControls - Telerik, DevExpress, Infragistics, ComponentOne: who's best?

    - by petebob796
    I am considering the purchase of some .NET user controls, with interest in both WinForms and ASP.NET. I have trialed DevExpress in the past, when I needed a hierarchical data grid for a personal project, and I was impressed. Rather than just jump for them, I am interested in people's experience of different products such as:

    - Telerik
    - DevExpress
    - Infragistics
    - ComponentOne
    - Any others?

    I would like people's opinions on:

    - Feature set and number of controls
    - Installation and upgrade
    - Ease of use
    - Documentation
    - Price
    - License
    - Development (updates to the controls on their side)

    Also, if anyone has any links to reviews (hopefully side by side), please post them.

    Read the article

  • TFS 2010 RC does not run Visual Studio 2008 MSTest unit tests

    - by Bernard Vander Beken
    Steps: Run the build including unit tests. Expected result: the unit tests are executed and succeed. Actual result: the unit tests are built by the build, but this is the result:

        1 test run(s) completed - 0% average pass rate (0% total pass rate)
        0/4 test(s) passed, 0 failed, 4 inconclusive, View Test Results
        Other Errors and Warnings
        1 error(s), 0 warning(s)
        TF270015: 'MSTest.exe' returned an unexpected exit code. Expected '0'; actual '1'.

    All the tests are enumerated (four), but the result for each test is "Not Executed". Context: Team Foundation Server 2010 release candidate; a build definition that runs projects using the Visual Studio 2008 project format and .NET 3.5 SP1. The unit tests run on a development machine, within Visual Studio. The unit tests project references C:\Program Files (x86)\Microsoft Visual Studio 9.0\Common7\IDE\PublicAssemblies\Microsoft.VisualStudio.QualityTools.UnitTestFramework.dll. Typical test class:

        [TestClass]
        public class DemoTest
        {
            [TestMethod]
            public void DemoTestName()
            {
            }
            // etc
        }

    Read the article

  • Transformation of Product Management in Telecommunications for Rapid Launch of Next Generation Products

    - by raul.goycoolea
    The Telecom industry continues to evolve through disruptive products, uncertain markets, shorter product lifecycles and convergence of technologies. Today's market has moved from network centric to consumer centric and focuses primarily on the customer experience. This has resulted in several product management challenges, such as increased complexity and volume of offerings, creating product variants, accelerating time-to-market, the ability to provide multiple product views for varied stakeholders, leveraging OSS intelligence in the BSS layer, product co-creation, and increasing audit and security concerns for service providers. This document discusses how enterprise product management, enabled by PLM-based product catalogue solutions, helps launch next generation products rapidly in the context of the telecommunications industry.

    1.0 Introduction

    Figure 1: Business Scenario

    Modern business demands the launch of complex products in a very short timeframe, and effecting changes in the price plan faster without IT intervention. One of the key transformation initiatives companies are focusing on is product management transformation and operational efficiency improvement. As part of these initiatives, companies are investing in best-in-class COTS-based product management solutions developed on industry-wide standards.

    The new COTS packages are planned to integrate with existing or new B/OSS systems to provide a strategic end-to-end agile solution for reduced time-to-market and order journey time. In addition, system rationalization is being undertaken to phase out legacy systems and migrate to strategic systems.

    2.0 An Overview of Product Management in Telecom

    Product data in telecom is multi-dimensional and difficult to manage. It has increased significantly due to the complexity of the product, product offerings on the converged network, increased volume of offerings, bundled offering structures and ever increasing regulatory requirements.

    In addition, the shrinking product lifecycle in telecom makes it difficult to manage the dynamic product data. Mergers and acquisitions coupled with organic growth pose major challenges in product portfolio management. This is a roadblock in the journey towards becoming an agile organization.
    Figure 2: Complexity in Product Management

    'Network Technology' is the new dimension in telecom product management, where the same products are realized through different networks, i.e., moving from siloed networks to a converged network. Consequently, the product solution differs.

    Figure 3: Current Scenario - Pain Points in Product Management

    The major business implications arising out of the current scenario are slow time-to-market and an inefficient process that affects innovation.

    3.0 Transformation of Next Generation Product Management

    Companies must focus on their product management transformation journey in the areas of:

    - Management of a single truth of product information across the organization/geographies, which is currently managed in heterogeneous systems
    - Management of the Intellectual Property (IP) on the product concept, and partnership in the design of discrete components to integrate into the system
    - Leveraging structured and unstructured product data within the extended enterprise to extract consumer insights and drive innovation
    - Management of effective operational separation to comply with regulatory bodies
    - Reuse of existing designs, adding relevant features such as value-added services to enable effective product bundling

    Figure 4: Next Generation Needs

    PLM-based Enterprise Product Catalogue solutions efficiently address the above requirements and act as an enabler of product management transformation and rapid product launch.

    4.0 PLM-based Enterprise Product Management

    Figure 5: PLM-based Enterprise Product Mastering

    Enterprise Product Management (EPM) enables the business to manage complex product attributes and data in complex environments. Product Mastering helps create a 'single view' of the product by creating a business-driven, IT-supported environment where a global 'single truth record' is created, managed and reused.

    4.1 The Business Case for Telco PLM-based Solutions for Enterprise Product Management

    - Telco PLM-based Product Mastering solutions provide a centralized authoring environment for product definition and control of all product data and rules
    - PLM packages are designed to support multiple perspectives of product data (ordering, billing and provisioning perspectives)
    - They maintain relationships/links between different elements of the entire product definition
    - Telco PLM packages specialize in next generation lifecycle management requirements of products (revision and state management, test and release management, role management and impact analysis)
    - They take into consideration all aspects of OSS product requirements, compared to CRM product catalogue solutions where the product data managed is mostly order oriented and transactional
    - The new breed of Telco PLM packages is designed with 'open' standards such as SID and eTOM; they are interoperable and support integration frameworks such as subscription and notification
    - Telco PLM packages have developed good collaboration frameworks to integrate suppliers and partners into the product development value chain

    4.2 Various Architectures/Approaches for Product Mastering Using Telco PLM Systems

    4.2.a Single Central Product Management (Mastering) Approach

    Figure 6: Single Central Product Management (Master) Approach

    This approach is implemented across verticals such as aerospace and automotive.
    It focuses on a physically centralized product master on which other sources depend. The product definition data (product bundles, service bundles, price plans, offers and discounts, product configuration rules and market campaigns) is created and maintained physically in a centralized environment. In addition, the product definition/authoring environment is centralized. The existing legacy product definition data available in the CRM product catalogue, billing catalogue and the legacy product catalogue is migrated to the centralized PLM-based Enterprise Product Management solution.

    Architectural changes must be made in the existing business landscape of applications to create and revise data, because the applications have to refer to the central repository for approvals and validation of product configurations. This is achieved by modifying how the applications write data, or by adapting the applications to use the rules to be managed and published.

    Complete product configuration validation is done in the enterprise/central product catalogue, and the final configuration is sent to the B/OSS systems through the SOA-compliant product distribution architecture. This approach/architecture enables greater control in terms of product data management and product data governance.

    4.2.b Federated Product Management (Mastering) Architecture

    Figure 7: Federated Product Management (Mastering) Architecture

    In the federated product mastering approach, the basic unique product definition data (product id, description, product hierarchy, basic price plans and simple product design rules) is created and maintained centrally, while the advanced product definition (product bundling, promotions, offers and discount plans) is created in the respective downstream OSS systems.

    For example, basic product definitions such as attributes, product hierarchy and basic price plans are created and maintained in the enterprise/central product reference catalogue and distributed to downstream OSS systems. The respective downstream OSS systems build product bundles, promotions and advanced price plans over the basic product definition, and master the advanced product definition. The central reference database accesses the respective source product master data and assembles a point-in-time consolidated view of the product. This approach is typically adopted in some merger and acquisition scenarios where there is a low probability of a central physical authority managing the data. In addition, the migration effort in this case is minimal and there are no big architectural changes to the organization's application landscape. However, this approach will not result in better product data management and data governance.

    5.0 Customer Scenario - Before EPC Deployment

    A leading global telecommunications service provider wanted to launch quad play and triple play service offerings in the shortest possible lead time. The service provider was offering broadband and VoIP services to customers. The company wanted to reuse a majority of the broadband services and price plans and bundle them with new wireless and IPTV services for quad play and triple play.
    The challenges in launching the new service offerings were:

    Figure 8: Triple Play Plan

    - Broadband product data was stored in multiple product catalogues (CRM catalogue, billing catalogue, spreadsheets)
    - Product managers spent a lot of time performing tasks involving duplication or re-keying of data; manual effort caused errors, cost and time over-runs
    - There was no effective product and price data governance mechanism; price change issues arising from the lack of data consistency across systems resulted in leakage of customer value and revenue
    - Product data had re-usability issues and was not in a structured format, resulting in uncontrolled product portfolio creation and product management issues
    - The lack of an enterprise product model resulted in product distribution challenges, and thus delays in product launch
    - Designers were constrained by existing legacy product management solutions in modeling product/service requirements and product configuration rules such as upgrading, downgrading and cross selling

    5.1 Customer Scenario - After EPC Deployment

    Figure 9: SOA-based End-to-End EPC Solution

    The company deployed a PLM-based Enterprise Product Catalogue solution to launch the quad play service after evaluating various product catalogues. The broadband product offering, service and price data were migrated to the new system, and the product and price plan hierarchy for new offerings was created using the entities defined in the Enterprise Product Model. Supplier product catalogue data such as routers and set-top boxes was loaded onto the new solution through a SOA-based web service. Price plans and configuration rules were built in the new system. The validated final product configurations were extracted from the product catalogue in SID format and distributed to the downstream B/OSS systems through exposed SOA-based web services. The transformations required for the B/OSS systems were handled using the transformation layer that is part of the solution.

    6.0 How PLM Enabled Product Management Transformation

    Figure 10: Product Management Transformation

    The PLM-based Product Catalogue solution helped the customer reduce the product launch cycle time by 30% and enabled the transformation of product management for next generation services.

    7.0 Conclusion

    On the one hand, the telecom industry is undergoing changes due to disruptions, uncertain product markets and increased complexity of products. On the other hand, ARPU is decreasing year-on-year. Communications Service Providers are embarking on convergence, bundled service offerings, flexibility to cross-sell and up-sell, introducing new value-added services, and leveraging Web 2.0 concepts and network capabilities. Consequently, large scale IT transformation initiatives to improve ARPU, supporting network and business transformations, are a business imperative. Product management has become a focus area. Companies are investing in best-in-class COTS solutions to reduce time-to-market, ensure rapid service delivery and improve operational efficiency. An efficient PLM-based enterprise product mastering solution plays a key role in achieving zero touch automation and rapid product launch.

    References:

    1. Preston G. Smith and Donald G. Reinertsen, "Developing Products in Half the Time", Van Nostrand Reinhold.
    2. John G. Innes, "Achieving Successful Product Change", Pitman Publishing.
    3. D. T. Pham and R. M. Setchi (16 Jan 2001), "Authoring environment for documentation development", University of Wales Cardiff, U.K., Proceedings of the Institution of Mechanical Engineers, Vol. 215, Part B.
    4. Oracle Product Hub for Communications: http://www.oracle.com/us/products/applications/master-data-management/product-hub-082059.html

    Read the article

  • Problem running Thinking Sphinx with Rails 2.3.5

    - by benoror
    Hi, I just installed Sphinx (distro: archlinux) by downloading the source. Then I installed the "Thinking Sphinx" plugin for Rails. I followed the official setup page and the screencast from Ryan Bates, but when I try to index the models it gives me this error:

        $ rake thinking_sphinx:index
        (in /home/benoror/Dropbox/Proyectos/cotizahoy)
        Sphinx cannot be found on your system. You may need to configure the following
        settings in your config/sphinx.yml file:
          * bin_path
          * searchd_binary_name
          * indexer_binary_name
        For more information, read the documentation:
        http://freelancing-god.github.com/ts/en/advanced_config.html
        Generating Configuration to /home/benoror/Dropbox/Proyectos/cotizahoy/config/development.sphinx.conf
        sh: indexer: command not found

    I tried starting the daemon manually (/usr/bin/sphinx-searchd) and changing the config/sphinx.yml file:

        devlopment:
          searchd_binary_name: sphinx-searchd
          indexer_binary_name: sphinx-indexer

    But it shows the same error. Any ideas?
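    One detail worth checking in the config above: the environment key is spelled "devlopment", so Thinking Sphinx would never pick up those overrides when running in the development environment. A hedged sketch of a config/sphinx.yml for a source-built Sphinx (the /usr/local/bin path and the plain binary names are assumptions based on a default `make install`; adjust them to wherever `which indexer` points):

        development:
          bin_path: /usr/local/bin       # assumed install prefix for a source build
          searchd_binary_name: searchd   # source builds typically install the plain
          indexer_binary_name: indexer   # names, not the sphinx- prefixed ones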

    Read the article

  • Maven Plugin - Restart Jetty with new WAR?

    - by Walter White
    Hi all, What I would like to do is automatically test against several different Maven build profiles. I want to write a Maven plugin that iterates through each profile so I don't have to manually list them for the CI process. I just want to verify that the code works in all environments - development, testing, staging, and production - once deployed there. I want to automatically test against those profiles while keeping it part of the same Maven build. How would I best set that up to log those changes in Sonar or another tool? Walter
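    For what it's worth, one low-tech way to exercise every profile in CI without writing a plugin is a wrapper script around the build; a sketch, assuming the profile ids are development, testing, staging and production:

        #!/bin/sh
        # Run the full build and tests once per Maven profile; stop on first failure.
        for profile in development testing staging production; do
            mvn clean verify -P"$profile" || exit 1
        done

    Feeding each run into Sonar separately would then be a matter of invoking the analysis per run, though keeping the results distinct would likely require varying the project key by profile.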

    Read the article

  • Should I be afraid of Linux server administration?

    - by markle976
    I've been trying to figure out what to focus on. I finally realized that the root of my quandary is that I am unsure about learning Linux server administration. I have been getting pretty good with PHP/MySQL and web development, but I am not very familiar with Linux. Is it hard to learn? What would I need to know in order to manage a LAMP stack? Also, which distribution is most used in enterprises? I think I have also hesitated to dive in because it seems like it is mostly used in small companies, but I guess that could be a good thing.

    Read the article

  • Strange error occurring when using WCF to run a query against SQL Server

    - by vondip
    Hi all, I am building an ASP.NET application, using IIS 6 on Windows Server 2003 (VPS hosting). I am confronted with an error I didn't receive on my development machine (Windows 7, IIS 7.5, 64 bit). When my WCF service tries launching my query against a local SQL Server, this is the error I receive:

        Memory gates checking failed because the free memory (43732992 bytes) is less
        than 5% of total memory. As a result, the service will not be available for
        incoming requests. To resolve this, either reduce the load on the machine or
        adjust the value of minFreeMemoryPercentageToActivateService on the
        serviceHostingEnvironment config element.

    Any ideas?
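    The message itself names the knob: WCF (since around .NET 3.5 SP1) refuses to activate a service when less than 5% of memory is free, which a small VPS can hit under normal load. A hedged sketch of the web.config change it suggests (lowering the gate to 1% is a common workaround; the trade-off is allowing activation under memory pressure):

        <system.serviceModel>
          <!-- minFreeMemoryPercentageToActivateService is the setting named in the
               error message; the default is 5 (percent). -->
          <serviceHostingEnvironment minFreeMemoryPercentageToActivateService="1" />
        </system.serviceModel>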

    Read the article

  • Printing PDFs Server-side using Acrobat Reader from ASP.NET

    - by Chris Roberts
    Hi, I have been presented with a problem which requires me to print PDF files from a server as part of an ASP.NET web service. The problem is further complicated by the fact that the PDF files I have to print can ONLY be printed using Adobe Reader (they were created using Adobe LiveCycle and have some strange protection in them). This piece of code seems to do the trick in the Visual Studio development web server, but doesn't do anything when the site's running in IIS. I'm assuming this is probably some sort of permissions issue!? I know this is a FAR from ideal thing to be trying to do, but I haven't really got much choice! Any ideas would be greatly appreciated!

        Dim starter As ProcessStartInfo
        Dim Prc As Process

        ' Pass file path and arguments
        starter = New ProcessStartInfo("c:\program files\...\AcroRd32.exe", "/t ""test.pdf"" ""Printer""")
        starter.CreateNoWindow = True
        starter.RedirectStandardOutput = True
        starter.UseShellExecute = False

        ' Start Adobe Reader process
        Prc = New Process()
        Prc.StartInfo = starter
        Prc.Start()

    Read the article

  • I can't get onChange to fire for dijit.form.Select

    - by user306462
    I seem unable to correctly attach the onchange event to a dijit.form.Select widget. However, I am new to web development, so I could be doing something completely idiotic (although, as best I can tell (and I've read all the docs I could find), I'm not). I made sure the body class matches the dojo theme and that I dojo.require() all the widgets I use (and dojo.parser), and still, nada. The code I'm using is:

        dojo.addOnLoad(function () {
            var _query = dojo.query('.toggle');
            for (var i = 0; i < _query.length; i++) {
                dojo.connect(_query[i], 'onchange', function (ev) {
                    console.log(ev + ' fired onchange');
                });
            }
        });

    Any help at all would be appreciated.
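    Two things commonly bite here, and a hedged sketch of both fixes follows: dojo.query returns plain DOM nodes rather than widgets, and dijit widgets expose a camelCase onChange method as their extension point rather than the lowercase DOM onchange event. This assumes the .toggle class ends up on the widgets' top-level DOM nodes after parsing:

        dojo.addOnLoad(function () {
            dojo.query('.toggle').forEach(function (node) {
                // Resolve the dijit widget behind the DOM node.
                var widget = dijit.byNode(node);
                if (widget) {
                    // dijit events are camelCase methods on the widget object.
                    dojo.connect(widget, 'onChange', function (newValue) {
                        console.log(widget.id + ' fired onChange with ' + newValue);
                    });
                }
            });
        });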

    Read the article

  • application:didFinishLaunchingWithOptions doesn't execute, but RootViewController's viewDidLoad does

    - by BeachRunnerJoe
    I'm playing around with the iPad SplitView template and it was working fine before I started swapping out view objects in my RootViewController. When it was working fine, the application:didFinishLaunchingWithOptions method would be called and would set up my persistent store objects, then the RootViewController's viewDidLoad method would be called to populate my root view with data from my store. I opened up IB and started swapping out view objects in my RootView, and now the application:didFinishLaunchingWithOptions method never gets called, but the RootViewController's viewDidLoad method still does. Obviously, the app crashes because the viewDidLoad method depends on the successful execution of the didFinishLaunchingWithOptions method to set up the persistent store objects. Does anyone have any thoughts on what is causing this, or how I can go about investigating what's causing this? I'm obviously new to iPhone OS development, so I apologize if this question is absurd in any way. Thanks so much in advance for your help!

    Read the article

  • Is learning the Caché database hard coming from relational databases and object-oriented programming?

    - by Edelcom
    I am currently running the local version of Caché on my system in order to determine if I can (and will) take on a possible new project. The current project uses Delphi 7 as a front end calling a Caché dll, where the business logic is stored in the database. I have a background of SqlServer and Firebird (and before that, Access and Paradox) as databases. I use Delphi 7 for 95% of my Windows development, so I know about object programming. I would like to receive opinions from persons having used Caché and either SqlServer, Firebird or Oracle, and having developed in Delphi (or C++ or C# - an object oriented language). I have read the pros and cons from other questions, but I am not asking for this; I need input from Caché developers. Thanks in advance.

    Read the article

  • iPhone - Programmatically set (system-wide) proxy settings?

    - by Andrew
    I am new to iPhone development, so I'm sorry if this is a stupid question. I am developing an application whose purpose will be to route all iPhone activity through my company's proxy. Is there a way to programmatically set system-wide proxy settings in the iPhone (which will also take effect on the 3G connection)? I know there is a way to manually set proxy settings for each wifi connection. Detecting new networks and setting the proxy on them would be acceptable. However, I need to also be able to set the proxy on the 3G connection. Also, bonus: Is there a way to programmatically change the "Restrictions" settings? If anyone has any tips or can point me in the right direction, I would appreciate it. Thanks. EDIT: Please understand that this is for a legitimate purpose. Apple has to approve app store additions, so it's not like I'm trying to spread a virus. Please, constructive answers only.

    Read the article

  • Avoiding .NET versioning hell

    - by moogs
    So sometimes (oftentimes!) you want to target a specific .NET version (say 3.0), but then due to some .NET service packs you get into problems like: Dispatcher.BeginInvoke(Delegate, Object[]) <-- this was added in 3.0 SP2 (3.0.30618); System.Threading.WaitHandle.WaitOne(Int32) <-- this was added in 3.5 SP1, 3.0 SP2, 2.0 SP2. Now, these are detected by the JIT compiler, so building against .NET 3.0 in Visual Studio won't guarantee it will run on .NET 3.0-only systems. Short of confirming each and every function you use, or limiting your development environment to .NET 3.0 (which sucks since you have to develop for other projects too), what's the best way to avoid using these service-pack-only additions? Thanks!

    Read the article

  • Social Game Mechanics in Django

    - by oliland
    I want users to receive 'points' for completing various tasks in my application - ranging from tagging objects to making friends. I haven't yet found a Django application that simplifies this. At the moment I'm thinking that the best way to accumulate points is that each user action creates the equivalent of a "stream item", and the points are calculated by counting the value of each action published to their stream. Obviously social game mechanics is a huge area with a lot of research going on at the moment. But from a development perspective, what's the easiest way to get started? Am I on the wrong track, or are there better / simpler ways?
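    As a starting point for the "stream item" idea above, a single model plus an aggregate query goes a long way; a minimal sketch (model and field names are illustrative, not from an existing app):

        from django.contrib.auth.models import User
        from django.db import models
        from django.db.models import Sum

        class Action(models.Model):
            """One stream item: something a user did that is worth points."""
            user = models.ForeignKey(User, related_name='actions')
            verb = models.CharField(max_length=50)  # e.g. 'tagged', 'befriended'
            points = models.PositiveIntegerField(default=0)
            created_at = models.DateTimeField(auto_now_add=True)

        def total_points(user):
            """A user's score is just the sum over their stream."""
            return user.actions.aggregate(total=Sum('points'))['total'] or 0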

    Read the article

  • (Fluent)NHibernate: Mapping an IDictionary<MappedClass, MyEnum>

    - by anthony
    I've found a number of posts about this but none seem to help me directly. Also there seems to be confusion about solutions working or not working during different stages of FluentNHibernate's development. I have the following classes:

        public class MappedClass { ... }

        public enum MyEnum { One, Two }

        public class Foo
        {
            ...
            public virtual IDictionary<MappedClass, MyEnum> Values { get; set; }
        }

    My questions are: Will I need a separate (third) table of MyEnum? How can I map the MyEnum type? Should I? What should Foo's mapping look like? I've tried mapping HasMany(x => x.Values).AsMap("MappedClass")... This results in:

        NHibernate.MappingException : Association references unmapped class: MyEnum

    Read the article

  • Marshal a list of objects from VB6 to C#

    - by Andrew
    I have a development which requires the passing of objects between a VB6 application and a C# class library. The objects are defined in the C# class library and are used as parameters for methods exposed by other classes in the same library. The objects all contain simple string/numeric properties, so marshaling has been relatively painless. We now have a requirement to pass an object which contains a list of other objects. If I was coding this in VB6, I might have a class containing a collection as a member variable. In C#, I might have a class with a List member variable. Is it possible to construct a C# class in such a way that the VB6 application could populate this inner list and marshal it successfully? I don't have a lot of experience here, but I would guess I'd have to use an array of Object types.
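    One approach that keeps the marshaling simple is to avoid exposing List<T> (which is not COM-friendly) and instead give the C# class an Add method the VB6 side can call repeatedly, plus an array accessor. A sketch under that assumption (class and member names are illustrative):

        using System.Collections.Generic;
        using System.Runtime.InteropServices;

        [ComVisible(true)]
        [ClassInterface(ClassInterfaceType.AutoDual)] // convenient for VB6; has versioning trade-offs
        public class OrderItem
        {
            public string Name { get; set; }
            public int Quantity { get; set; }
        }

        [ComVisible(true)]
        [ClassInterface(ClassInterfaceType.AutoDual)]
        public class OrderBatch
        {
            private readonly List<OrderItem> items = new List<OrderItem>();

            // VB6 populates the inner list one element at a time.
            public void AddItem(OrderItem item) { items.Add(item); }

            // Expose a COM-friendly array view instead of List<OrderItem>.
            public OrderItem[] GetItems() { return items.ToArray(); }
        }

    The assembly would still need to be registered for COM interop (regasm /codebase, or the project's "Register for COM interop" option) before VB6 can create these classes.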

    Read the article

  • Why do I get this error when I try to push my SQLite3 to Postgresql (via Taps) on Cedar Stack?

    - by rhodee
    I've done quite a bit of research on Heroku Dev Center and I am now looking to the community for help. Here is my problem: I cannot push my db to Heroku Cedar stack. I am trying to migrate a sqlite database to postgresql via the Taps gem. When I am ready to deploy I run:

        bundle install --without production
        heroku run db:push

    I get the following result:

        Running db:seed attached to terminal... up, run.17
        sh: db:seed: not found

    And when I run the migration:

        heroku run rake db:migrate

    I get the following:

        Running rake db:migrate attached to terminal... up, run.18
        rake aborted!
        No Rakefile found (looking for: rakefile, Rakefile, rakefile.rb, Rakefile.rb)
        /usr/local/lib/ruby/1.9.1/rake.rb:2367:in `raw_load_rakefile'
        /usr/local/lib/ruby/1.9.1/rake.rb:2007:in `block in load_rakefile'
        /usr/local/lib/ruby/1.9.1/rake.rb:2058:in `standard_exception_handling'
        /usr/local/lib/ruby/1.9.1/rake.rb:2006:in `load_rakefile'
        /usr/local/lib/ruby/1.9.1/rake.rb:1991:in `run'
        /usr/local/bin/rake:31:in `<main>'

    Every time I push to Heroku (git push heroku master) it fails because my Gemfile attempts to install the sqlite3 gem, even though it is inside the development and test groups in my Gemfile. My database.yml production environment still points to the sqlite adapter, even after I have run the following command successfully:

        heroku config:add BUNDLE_WITHOUT="test development" --app app_name_on_heroku

    Out of ideas. Please help. If it's useful I can post the contents of my Gemfile, heroku ps and logs. Cheers

    UPDATE: After following @John's direction I now receive the following terminal message:

        Sending schema
        Schema:        100% |==========================================| Time: 00:00:07
        Sending indexes
        schema_migrat: 100% |==========================================| Time: 00:00:00
        Sending data
        4 tables, 6 records
        schema_migrat:   0% |                                          | ETA:  --:--:--
        Saving session to push_201111070749.dat..
        !!!
        Caught Server Exception
        HTTP CODE: 500
        Taps Server Error: LoadError: no such file to load -- sequel/adapters/

    And the following warnings:

        /app/.bundle/gems/ruby/1.9.1/gems/sequel-3.20.0/lib/sequel/core.rb:249:in `require'
        /app/.bundle/gems/ruby/1.9.1/gems/sequel-3.20.0/lib/sequel/core.rb:249:in `block in tsk_require'
        /app/.bundle/gems/ruby/1.9.1/gems/sequel-3.20.0/lib/sequel/core.rb:72:in `block in check_requiring_thread'
        <internal:prelude>:10:in `synchronize'
        /app/.bundle/gems/ruby/1.9.1/gems/sequel-3.20.0/lib/sequel/core.rb:69:in `check_requiring_thread'
        /app/.bundle/gems/ruby/1.9.1/gems/sequel-3.20.0/lib/sequel/core.rb:249:in `tsk_require'
        /app/.bundle/gems/ruby/1.9.1/gems/sequel-3.20.0/lib/sequel/database/connecting.rb:25:in `adapter_class'
        /app/.bundle/gems/ruby/1.9.1/gems/sequel-3.20.0/lib/sequel/database/connecting.rb:54:in `connect'
        /app/.bundle/gems/ruby/1.9.1/gems/sequel-3.20.0/lib/sequel/core.rb:119:in `connect'
        /app/lib/taps/db_session.rb:14:in `conn'
        /app/lib/taps/server.rb:91:in `block in <class:Server>'
        /app/.bundle/gems/ruby/1.9.1/gems/sinatra-1.0/lib/sinatra/base.rb:865:in `call'
        /app/.bundle/gems/ruby/1.9.1/gems/sinatra-1.0/lib/sinatra/base.rb:865:in `block in route'
        /app/.bundle/gems/ruby/1.9.1/gems/sinatra-1.0/lib/sinatra/base.rb:521:in `instance_eval'
        /app/.bundle/gems/ruby/1.9.1/gems/sinatra-1.0/lib/sinatra/base.rb:521:in `route_eval'
        /app/.bundle/gems/ruby/1.9.1/gems/sinatra-1.0/lib/sinatra/base.rb:500:in `block (2 levels) in route!'
        /app/.bundle/gems/ruby/1.9.1/gems/sinatra-1.0/lib/sinatra/base.rb:497:in `catch'
        /app/.bundle/gems/ruby/1.9.1/gems/sinatra-1.0/lib/sinatra/base.rb:497:in `block in route!'
        /app/.bundle/gems/ruby/1.9.1/gems/sinatra-1.0/lib/sinatra/base.rb:476:in `each'
        /app/.bundle/gems/ruby/1.9.1/gems/sinatra-1.0/lib/sinatra/base.rb:476:in `route!'
        /app/.bundle/gems/ruby/1.9.1/gems/sinatra-1.0/lib/sinatra/base.rb:601:in `dispatch!'
        /app/.bundle/gems/ruby/1.9.1/gems/sinatra-1.0/lib/sinatra/base.rb:411:in `block in call!'
        /app/.bundle/gems/ruby/1.9.1/gems/sinatra-1.0/lib/sinatra/base.rb:566:in `instance_eval'
        /app/.bundle/gems/ruby/1.9.1/gems/sinatra-1.0/lib/sinatra/base.rb:566:in `block in invoke'
        /app/.bundle/gems/ruby/1.9.1/gems/sinatra-1.0/lib/sinatra/base.rb:566:in `catch'
        /app/.bundle/gems/ruby/1.9.1/gems/sinatra-1.0/lib/sinatra/base.rb:566:in `invoke'
        /app/.bundle/gems/ruby/1.9.1/gems/sinatra-1.0/lib/sinatra/base.rb:411:in `call!'
        /app/.bundle/gems/ruby/1.9.1/gems/sinatra-1.0/lib/sinatra/base.rb:399:in `call'
        /app/.bundle/gems/ruby/1.9.1/gems/rack-1.2.1/lib/rack/auth/basic.rb:25:in `call'
        /app/.bundle/gems/ruby/1.9.1/gems/sinatra-1.0/lib/sinatra/base.rb:979:in `block in call'
        /app/.bundle/gems/ruby/1.9.1/gems/sinatra-1.0/lib/sinatra/base.rb:1005:in `synchronize'
        /app/.bundle/gems/ruby/1.9.1/gems/sinatra-1.0/lib/sinatra/base.rb:979:in `call'
        /home/heroku_rack/lib/static_assets.rb:9:in `call'
        /home/heroku_rack/lib/last_access.rb:15:in `call'
        /app/.bundle/gems/ruby/1.9.1/gems/rack-1.2.1/lib/rack/urlmap.rb:47:in `block in call'
        /app/.bundle/gems/ruby/1.9.1/gems/rack-1.2.1/lib/rack/urlmap.rb:41:in `each'
        /app/.bundle/gems/ruby/1.9.1/gems/rack-1.2.1/lib/rack/urlmap.rb:41:in `call'
        /home/heroku_rack/lib/date_header.rb:14:in `call'
        /app/.bundle/gems/ruby/1.9.1/gems/rack-1.2.1/lib/rack/builder.rb:77:in `call'
        /app/.bundle/gems/ruby/1.9.1/gems/thin-1.2.7/lib/thin/connection.rb:76:in `block in pre_process'
        /app/.bundle/gems/ruby/1.9.1/gems/thin-1.2.7/lib/thin/connection.rb:74:in `catch'
        /app/.bundle/gems/ruby/1.9.1/gems/thin-1.2.7/lib/thin/connection.rb:74:in `pre_process'
        /app/.bundle/gems/ruby/1.9.1/gems/thin-1.2.7/lib/thin/connection.rb:57:in `process'
        /app/.bundle/gems/ruby/1.9.1/gems/thin-1.2.7/lib/thin/connection.rb:42:in `receive_data'
        /app/.bundle/gems/ruby/1.9.1/gems/eventmachine-0.12.10/lib/eventmachine.rb:256:in `run_machine'
        /app/.bundle/gems/ruby/1.9.1/gems/eventmachine-0.12.10/lib/eventmachine.rb:256:in `run'
        /app/.bundle/gems/ruby/1.9.1/gems/thin-1.2.7/lib/thin/backends/base.rb:57:in `start'
        /app/.bundle/gems/ruby/1.9.1/gems/thin-1.2.7/lib/thin/server.rb:156:in `start'
        /app/.bundle/gems/ruby/1.9.1/gems/thin-1.2.7/lib/thin/controllers/controller.rb:80:in `start'
        /app/.bundle/gems/ruby/1.9.1/gems/thin-1.2.7/lib/thin/runner.rb:177:in `run_command'
        /app/.bundle/gems/ruby/1.9.1/gems/thin-1.2.7/lib/thin/runner.rb:143:in `run!'
        /app/.bundle/gems/ruby/1.9.1/gems/thin-1.2.7/bin/thin:6:in `<top (required)>'
        /usr/ruby1.9.2/bin/thin:19:in `load'
        /usr/ruby1.9.2/bin/thin:19:in `<main>'

    Read the article

  • Unit Testing iPhone - Linker Errors

    - by sliver
    I've followed the Unit Testing Applications guide from the iPhone Development documentation. I followed all the steps and it worked with the TestCase from the documentation. But as soon as I changed the TestCase to test real code from my project, I ended up with linker errors. All classes that are used in the TestCase are reported as missing. I've already searched the internet and found that the Bundle Loader property must be set to "$(BUILT_PRODUCTS_DIR)/MyApplication.app/Contents/MacOS/MyApplication". But this also fails because the file could not be found. Any ideas what I have to do to tell the linker where to search for the missing files?
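    For what it's worth, the path quoted above is the Mac OS X bundle layout; iPhone app bundles are flat, with the executable sitting directly inside the .app directory, which would explain why the file cannot be found. A sketch of the test bundle's build settings under that assumption (xcconfig style; MyApplication is the placeholder name from the question):

        // Application test target build settings (iPhone bundles are flat):
        BUNDLE_LOADER = $(BUILT_PRODUCTS_DIR)/MyApplication.app/MyApplication
        TEST_HOST = $(BUNDLE_LOADER)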

    Read the article

  • Win32: Is there a replacement GDI32.dll that uses hardware acceleration?

    - by Ian Boyd
    Has anyone out there created a version of GDI32.dll that takes advantage of the hardware acceleration available on the machine? Or of gdiplus.dll? Starting with Windows Vista, GDI is no longer hardware accelerated. (GDI+ was never hardware accelerated.) Without Microsoft fixing GDI (and GDI+) to be able to run well on the computer, native applications (C++ MFC, Delphi, etc.) and managed WinForms applications will continue to run poorly forever. While I could use Direct2D for business applications, I cannot control the fact that the development environment still creates controls, with decades of library support code, that assume the presence of GDI.

    Read the article

  • iPhone Example: Login Page and then Navigate to another Screen (UITableView)

    - by Smc
    Hi, I am a .NET expert but new to iPhone app development. Recently I started working with Objective-C. I need an example which shows a login screen and then navigates to another screen (UITableView) when the username and password are found correct. As of now I am assuming that the username and password are hardcoded in the app. I have tried to design the login screen UI, but I am unable to load the other ViewController; it won't work. Can someone help me out with examples or links?
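    A minimal sketch of the navigation part, assuming the login screen lives inside a UINavigationController and the credentials are hard-coded as described; the class and outlet names are illustrative (pre-ARC, matching the 2010-era SDK):

        - (IBAction)loginButtonTapped:(id)sender {
            // Hard-coded credentials, as assumed in the question.
            if ([usernameField.text isEqualToString:@"user"] &&
                [passwordField.text isEqualToString:@"secret"]) {
                MyTableViewController *listController = [[MyTableViewController alloc]
                    initWithNibName:@"MyTableViewController" bundle:nil];
                [self.navigationController pushViewController:listController animated:YES];
                [listController release];
            } else {
                UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Login failed"
                                                                message:@"Wrong username or password."
                                                               delegate:nil
                                                      cancelButtonTitle:@"OK"
                                                      otherButtonTitles:nil];
                [alert show];
                [alert release];
            }
        }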

    Read the article

  • C# [Mono]: MPAPI vs MPI.NET vs ?

    - by Olexandr
    Hi. I'm working on a college project: I have to develop a distributed computing system, and I decided to do some research to make this task fun :) I've found the MPAPI and MPI.NET libraries. Yes, they are .NET libraries (Mono, in my case). Why .NET? I was choosing between Ada, C++ and C#, and I chose C# because of lower development time. I have three goals: simplicity, performance, and cluster computing. So, what should I choose - MPAPI, MPI.NET, or something else?

    Read the article

  • why use branches in svn?

    - by ajsie
    I know that you can organize your files according to this structure in svn: trunk, branches, tags; and that you copy the trunk to a folder in branches if you want to have a separate development line, then later on merge this branch back to trunk. But I wonder why my group and I should do this. Why should one copy the trunk to a branch and work with this copy just to merge it back to the trunk, while meanwhile the branch is frequently updated/committed to stay in sync with the trunk? Why not just work with the trunk then? What are the benefits of creating a branch? Would be great if someone could shed a light on this topic. Thanks in advance.
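    Concretely, the benefit shows up in the commands: a branch is a cheap copy where risky or long-running work can proceed without ever breaking the trunk that everyone else (and your release process) depends on. A sketch with standard Subversion commands (the repository URL is a placeholder):

        # Create the branch as a cheap copy of trunk:
        svn copy http://svn.example.com/repo/trunk \
                 http://svn.example.com/repo/branches/new-feature \
                 -m "Create branch for the new feature"

        # While working on the branch, periodically pull trunk changes in
        # (run inside the branch working copy):
        svn merge http://svn.example.com/repo/trunk

        # When the feature is done, fold it back (run inside a trunk working copy):
        svn merge --reintegrate http://svn.example.com/repo/branches/new-feature
        svn commit -m "Merge new-feature branch back into trunk"

    The trunk stays releasable the whole time, which is the main argument for not doing the work directly on it.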

    Read the article

  • Poll: CSS3 color transparency

    - by lepe
    Many of you already know that you can express colors in your stylesheets like:

        color: #FFF;
        color: #FFFFFF;
        color: rgb(255,255,255);
        color: hsl(0,100%,100%);

    and that you can use rgba() and hsla() to express color transparency. Do you think it would be a good idea to be able to express color transparency in #RRGGBBAA or #RGBA notation? Why YES, and why NOT??? Also, if you want, give a little detail of how you perform your development (whether you use web-builder software, graphics software (for designs), vim, etc.).

    Read the article
