Search Results

Search found 21053 results on 843 pages for 'process'.


  • How do you dive into large code bases?

    - by miku
    What tools and techniques do you use for exploring and learning an unknown code base? I am thinking of tools like grep, ctags, unit tests, functional tests, class-diagram generators, call graphs, code metrics like sloccount, and so on. I'd be interested in your experiences, the helpers you used or wrote yourself, and the size of the codebases you worked with. I realize that this is also a process (happening over time) and that learning can mean anything from "can give a ten minute intro" to "can refactor and shrink this to 30% of the size". Let's leave that open for now.
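    A small bash sketch of the kind of first-pass toolbox the question mentions (the directory and the grep pattern are placeholders):

      # Size and language breakdown of the tree
      sloccount src/

      # Build a tags index so your editor can jump to definitions
      ctags -R src/

      # Find every reference to a symbol you are studying
      grep -rn "parse_config" src/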

    Read the article

  • Google I/O 2012 - What's New in the Google Drive SDK

    Josh Hudgins, John Day-Richter. In this talk, we will introduce a number of major new features and platforms in the Google Drive SDK. We will discuss what we feel is a revolution in the way developers write collaborative applications. We will also announce a new API to make managing files in Google Drive even easier for developers, replacing some legacy APIs in the process. For all I/O 2012 sessions, go to developers.google.com. (From: GoogleDevelopers; 55:14)

    Read the article

  • MOSS 2007 – WCM Blank WebPart Page Zone IDs

    - by Jeff Julian
    Here is the list of Zone IDs for the Blank WebPart Page (BlankWebPartPage.aspx) that is part of the Publishing Portal with MOSS 2007: TitleBar, Header, TopLeftRow, TopRightRow, CenterLeftColumn, CenterColumn, CenterRightColumn, Footer, RightColumn. I was in need of these and wasn't able to find them with a simple search on Google, so I wanted to share them with you. Getting a list of the WebPartZone objects for the page a web part lives on can be done with the following code:

      foreach (WebPartZone zone in this.WebPartManager.Zones)
      {
          this.Controls.Add(new LiteralControl(zone.ID + "<br />"));
      }

    Use this code in a web part that inherits from Microsoft.SharePoint.WebPartPages.WebPart. This is a simple way to do the equivalent of a Response.Write while having the output land in the web part zone your part resides in. It also saves you from attaching to the process and debugging with the watch or quick watch. Technorati Tags: MOSS, WebParts, BlankWebPartPage.aspx, Zones
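    A minimal sketch of such a debugging web part (the class name is illustrative; it assumes a reference to Microsoft.SharePoint.dll):

      using System.Web.UI;
      using System.Web.UI.WebControls.WebParts;

      // Illustrative part that lists every zone ID on its hosting page.
      public class ZoneListerPart : Microsoft.SharePoint.WebPartPages.WebPart
      {
          protected override void CreateChildControls()
          {
              // WebPartManager.Zones enumerates all zones declared on the page.
              foreach (WebPartZone zone in this.WebPartManager.Zones)
              {
                  this.Controls.Add(new LiteralControl(zone.ID + "<br />"));
              }
          }
      }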

    Read the article

  • Support / Maintenance documentation for development team

    - by benwebdev
    Hi, I'm working in the development dept (around 40 developers) for a large e-commerce company. We've grown quickly but have not evolved very well in the field of documenting our work. We work with an Agile/Scrum-like methodology for our development and testing, but documentation seems to be neglected. We need to be able to produce documentation that would aid a developer who hasn't worked on our project before or who is new to the company. We also have to create more high-level information for our support department, explaining any extra config settings and fixes for known issues that may arise, if any. Currently we put this in a badly put together wiki, based on an old SharePoint/TFS site. Can anyone suggest some useful links or advice on improving our documentation standard? What works in other companies? Has anyone got advice on developing documentation as part of an agile process? Many thanks, ben

    Read the article

  • Reference Data Management and Master Data: Are They Related?

    - by Mala Narasimharajan
    Submitted By: Rahul Kamath

    Oracle Data Relationship Management (DRM) has always been extremely powerful as an Enterprise Master Data Management (MDM) solution that can help manage changes to master data in ways that influence enterprise structure, whether it be mastering the chart of accounts to enable financial transformation, revamping organization structures to drive business transformation and operational efficiencies, restructuring sales territories to enable equitable distribution of leads to sales teams following the acquisition of new products, or adding additional cost centers to enable fine-grained control over expenses. Increasingly, DRM is also being utilized by Oracle customers for reference data management, an emerging solution space that deserves some explanation.

    What is reference data? How does it relate to master data? Reference data is a close cousin of master data. While master data is challenged with problems of unique identification, may be more rapidly changing, requires consensus building across stakeholders, and lends structure to business transactions, reference data is simpler and more slowly changing, but has semantic content that is used to categorize or group other information assets – including master data – and gives them contextual value. In fact, the creation of a new master data element may require new reference data to be created. For example, when a European company acquires a US business, chances are that it will now need to adapt its product line taxonomy to include a new category to describe the newly acquired US product line. Further, the cross-border transaction will also result in a revised geo hierarchy. The addition of new products represents changes to master data, while changes to product categories and the geo hierarchy are examples of reference data changes.[1]

    The following is an illustrative list of examples of reference data by type. Reference data types may include types and codes, business taxonomies, complex relationships and cross-domain mappings, or standards.

    Types & Codes: Transaction Codes; Lookup Tables (e.g., Gender, Marital Status); Status Codes; Role Codes; Domain Values

    Taxonomies: Industry Classification Categories and Codes (e.g., North American Industry Classification System, NAICS); Product Categories; Sales Territories (e.g., Geo, Industry Verticals, Named Accounts, Federal/State/Local/Defense); Market Segments; Universal Standard Products and Services Classification (UNSPSC); eCl@ss

    Relationships / Mappings: Product/Segment; Product/Geo; City -> State -> Postal Codes; Customer/Market Segment; Business Unit/Channel; Country Codes/Currency Codes/Financial Accounts; International Classification of Diseases (ICD) mappings, e.g., ICD-9 -> ICD-10

    Standards: Calendars (e.g., Gregorian, Fiscal, Manufacturing, Retail, ISO 8601); Currency Codes (e.g., ISO); Country Codes (e.g., ISO 3166, UN); Date/Time and Time Zones (e.g., ISO 8601); Tax Rates

    Why manage reference data? Reference data carries contextual value and meaning, and therefore its use can drive business logic that helps execute a business process, create a desired application behavior, or provide meaningful segmentation for analyzing transaction data. Further, mapping reference data often requires human judgment.

    Sample Use Cases of Reference Data Management

    Healthcare: Diagnostic Codes. The reference data challenges in the healthcare industry offer a case in point. Part of being HIPAA compliant requires medical practitioners to transition diagnosis codes from ICD-9 to ICD-10, a medical coding scheme used to classify diseases, signs and symptoms, causes, etc. The transition to ICD-10 has a significant impact on business processes, procedures, contracts, and IT systems. Since the two code sets, ICD-9 and ICD-10, offer diagnosis codes of very different levels of granularity, human judgment is required to map ICD-9 codes to ICD-10. The process requires collaboration and consensus building among stakeholders, much in the same way as master data management does. Moreover, to build reports that help understand utilization, frequency, and quality of diagnoses, medical practitioners may need to "cross-walk" mappings – either forward to ICD-10 or backwards to ICD-9, depending upon the reporting time horizon.

    Spend Management: Product, Service & Supplier Codes. Similarly, as an enterprise looks to rationalize suppliers and leverage their spend, conforming supplier codes as well as product and service codes requires supporting multiple classification schemes that may include industry standards (e.g., UNSPSC, eCl@ss) or enterprise taxonomies. Aberdeen Group estimates that 90% of companies rely on spreadsheets and manual reviews to aggregate, classify, and analyze spend data, and that data management activities account for 12-15% of the sourcing cycle and consume 30-50% of a commodity manager's time. Creating a common map across the extended enterprise to rationalize codes across procurement, accounts payable, general ledger, credit card, procurement card (P-card), as well as ACH and bank systems can cut sourcing costs, improve compliance, lower inventory stock, and free up talent to focus on value-added tasks.

    Change Management: Point of Sale Transaction Codes and Product Codes. In the specialty finance industry, enterprises are confronted with usury laws – governed at the state and local level – that regulate financial product innovation as it relates to consumer loans, check cashing, and pawn lending. To comply, it is important to demonstrate that transactions booked at the point of sale are posted against valid product codes that were on offer at the time of booking the sale. Since new products are being released in a steady stream, it is important to ensure timely and accurate mapping of point-of-sale transaction codes to the appropriate product and GL codes to comply with the changing regulations.

    Multi-National Companies: Industry Classification Schemes. As companies grow and expand across geographies, a typical challenge they encounter with reference data is reconciling the various versions of industry classification schemes in use across nations. While the United States, Mexico, and Canada conform to the North American Industry Classification System (NAICS) standard, European Union countries choose different variants of the NACE industry classification scheme. Multi-national companies must manage the individual national NACE schemes and reconcile the differences across countries. Enterprises must invest in a reference data change management application to address the challenge of distributing reference data changes to downstream applications and assessing which applications were impacted by a given change.

    References
    [1] Master Data versus Reference Data, Malcolm Chisholm, April 1, 2006.

    Read the article

  • Install using a Logitech keyboard and mouse with a Bluetooth dongle?

    - by Ryan
    I'm trying to install 12.10 on my system, but my mouse and keyboard are not working during installation. I use the Logitech MX5500 Bluetooth mouse and keyboard combo with a Bluetooth dongle. The keyboard and mouse work in my UEFI BIOS and during the Windows 7/8 installation. The keyboard also works in the Ubuntu boot screen that lets me set options, install, use the live CD, etc. I'm wondering if anyone knows a way to get this dongle working during the installation process so that I can actually install 12.10.

    Read the article

  • Unrecoverable error during WUBI installation

    - by john
    I tried to install 12.10 on my PC with Windows 7 installed, but I don't have a CD drive and my PC doesn't support USB booting, so I tried the Windows installer (WUBI). I had the ISO image mounted with Daemon Tools, and the Windows installer took those files and used them (I guess, because it didn't download any files). Everything went fine until it prompted me to reboot. After rebooting the PC it started to install, but when the installation process began, a message popped up saying "The installation found an unrecoverable error" and forced me to reboot. Now when I select Ubuntu in the operating system selection screen, it says that an error occurred.

    Read the article

  • Upgrading Agent Controllers in Oracle Enterprise Manager Ops Center 12c

    - by S Stelting
    Oracle Enterprise Manager Ops Center 12c recently released an upgrade for Solaris Agent Controllers. In this week's blog post, we'll show you how to upgrade agent controllers. Detailed instructions about upgrading Agent Controllers are available in the product documentation here. This blog post uses an Enterprise Controller which is configured for connected mode operation. If you'd like to apply the agent update in a disconnected installation, additional instructions are available here.

    Step 1: Download Agent Controller Updates. With a connected mode Ops Center installation, you can check for product updates at any time by selecting the Enterprise Controller from the left-hand Administration navigation tab. Select the right-hand Action link “Ops Center Downloads” to open a pop-up dialog displaying any new product updates. In this example, the Enterprise Controller has already been upgraded to the latest version (Update 1, also shown as build version 2076), so only the Agent Controller updates appear. There are three updates available: one for Solaris 10 X86, one for Solaris 8-10 SPARC, and one for all versions of Solaris 11. Note that the last update in the screen shot is the Solaris 11 update; for details on any of the downloads, place your mouse over the information icon under the details column for a pop-up text region. Select the software to download and click the Next button to display the Ops Center license agreement. Review and click the check box to accept the license agreement, then click the Next button to begin downloading the software. The status screen shows the current download status. If desired, you can perform the downloads as a background job: simply click the check box, then click the Next button to proceed to the summary screen. The summary screen shows the updates to be downloaded as well as the current status. Clicking the Finish button will close the dialog and return to the browser UI. The download job will continue to run in Ops Center, and progress can still be viewed from the jobs menu at the bottom of the browser window.

    Step 2: Check the Version of Existing Agent Controllers. After the download job completes, you can check the availability of agent updates as well as the current versions of your Agent Controllers from the left-hand Assets navigation tab. Select “Operating Systems” from the pull-down list to display only OS assets. Next, select “Solaris” in the left-hand tab to display the Solaris assets. Finally, select the Summary tab in the center display panel to show which versions of agent controllers are installed in your data center. Notice that a few of the OS assets are not displayed in the Agent Controllers tab. Ops Center will not display OS instances which do not have an Agent Controller installation; this includes Enterprise Controllers and Proxy Controllers (unless the agent has been activated on the OS instance) and OS instances using agentless management. For Agent Controllers which support an update, the version of agent software (in this example, 2083) appears to the right of the currently installed version.

    Step 3: Upgrade Your Agent Controllers. If desired, you can upgrade agent controllers from the previous screen by selecting the desired systems and clicking the Upgrade button. Alternatively, you can click the link “Upgrade All Agent Controllers” in the right-hand Actions menu. In either case, a pop-up dialog lets you start the upgrade process.
    The first screen in the dialog lets you choose the upgrade method. Ops Center provides three ways to upgrade agent controllers:

    - Automatic Upgrade: if Agent Controllers are running on all assets, Ops Center can automatically upgrade the software to the latest version without requiring any login credentials to the system.
    - SSH using a single set of credentials: if all assets use the same login credentials, you can apply a single set to all assets for the upgrade process. The login credentials are the same ones used for asset discovery and management, which are stored in the Plan Management navigation tab under Credentials.
    - SSH using individual credentials: if assets use different login credentials, you can select a different set for each asset.

    After selecting the upgrade method, click the Next button to proceed to the summary screen. Click the Finish button to close the pop-up dialog and start the upgrade job for the agent controllers. The upgrade job runs a series of tasks in parallel and will upgrade all agents which have been selected. Once the job completes, the OS instances in your data center will be upgraded and running the latest version of Agent Controller software.

    Read the article

  • SQL Table stored as a Heap - the dangers within

    - by MikeD
    Nearly all of the time I create a table, I include a primary key, and often that PK is implemented as a clustered index. Those two don't always have to go together, but in my world they almost always do. On a recent project, I was working on a data warehouse and a set of SSIS packages to import data from an OLTP database into my data warehouse. The data I was importing from the business database into the warehouse was mostly new rows, sometimes updates to existing rows, and sometimes deletes. I decided to use the MERGE statement to implement the insert, update, or delete in the data warehouse. I found it quite performant to have a stored procedure that extracted all the new, updated, and deleted rows from the source database and dumped them into a working table in my data warehouse, then run a stored proc in the warehouse that was the MERGE statement that took the rows from the working table and updated the real fact table.

      USE Warehouse

      CREATE TABLE Integration.MergePolicy (
          PolicyId int,
          PolicyTypeKey int,
          Premium money,
          Deductible money,
          EffectiveDate date,
          Operation varchar(5)
      )

      CREATE TABLE fact.Policy (
          PolicyKey int identity primary key,
          PolicyId int,
          PolicyTypeKey int,
          Premium money,
          Deductible money,
          EffectiveDate date
      )

      -- Proc name adjusted here: a procedure cannot share the work table's exact name.
      CREATE PROC Integration.usp_MergePolicy
      AS
      BEGIN
          BEGIN TRAN
          MERGE fact.Policy AS tgt
          USING Integration.MergePolicy AS src
          ON (tgt.PolicyId = src.PolicyId)
          WHEN NOT MATCHED BY TARGET THEN
              INSERT (PolicyId, PolicyTypeKey, Premium, Deductible, EffectiveDate)
              VALUES (src.PolicyId, src.PolicyTypeKey, src.Premium, src.Deductible, src.EffectiveDate)
          WHEN MATCHED AND src.Operation = 'U' THEN
              UPDATE SET PolicyTypeKey = src.PolicyTypeKey, Premium = src.Premium,
                         Deductible = src.Deductible, EffectiveDate = src.EffectiveDate
          WHEN MATCHED AND src.Operation = 'D' THEN
              DELETE;
          DELETE FROM Integration.MergePolicy
          COMMIT
      END

    Notice that my work table (Integration.MergePolicy) doesn't have any primary key or clustered index. I didn't think this would be a problem, since it was a relatively small table and was empty after each time I ran the stored proc. For one of the work tables, during the initial loads of the warehouse, about 1.5 million rows were getting inserted, processed, then deleted. Also, because of a bug in the extraction process, the same 1.5 million rows (plus a few hundred more each time) were getting inserted, processed, and deleted. This was being done on a fairly hefty server that was otherwise unused, and no one was paying any attention to the time it was taking. This week I received a backup of this database and loaded it on my laptop to troubleshoot the problem, and of course it took a good ten minutes or more to run the process. However, what seemed strange to me was that after I fixed the problem and happened to run the merge sproc when the work table was completely empty, it still took almost ten minutes to complete. I immediately looked back at the MERGE statement to see if I had some sort of outer join that meant it would be scanning the target table (which had about 2 million rows in it), then turned on the execution plan output to see what was happening under the hood. Running the stored procedure again took a long time, and the plan output didn't show me much - 55% on the MERGE statement and 45% on the DELETE statement, with table scans on the work table in both places. I was surprised at the relative cost of the DELETE statement, because there were really 0 rows to delete, but I was expecting to see the table scans.
    (I was beginning to suspect that my problem was because the work table was being stored as a heap.) Then I turned on STATISTICS IO (SET STATISTICS IO ON) and ran the sproc again. The output was quite interesting:

      Table 'Worktable'. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
      Table 'Policy'. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
      Table 'MergePolicy'. Scan count 1, logical reads 433276, physical reads 60, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

    I've reproduced the above from memory, so the details aren't exact, but the essential bit was the very high number of logical reads on the table stored as a heap. Even just doing a SELECT COUNT(*) FROM Integration.MergePolicy incurred that sort of output, even though the result was always 0. I suppose I should research more on the allocation and deallocation of pages to tables stored as a heap, but I haven't, and my original assumption that a table stored as a heap with no rows would only need to read one page to answer any query was definitely proven wrong. It's likely that some sort of physical defragmentation of the table may have cleaned that up, but it seemed that the easiest answer was to put a clustered index on the table. After doing so, the execution plan showed a clustered index scan, and the IO stats showed only a single page read. (I aborted my first attempt at adding a clustered index on the table because it was taking too long - instead I ran TRUNCATE TABLE Integration.MergePolicy first and then added the clustered index, both of which took very little time.) I suspect I may not have noticed this if I had used TRUNCATE TABLE Integration.MergePolicy instead of DELETE FROM Integration.MergePolicy, since I'm guessing that the truncate operation does some rather quick releasing of pages allocated to the heap table. In the future, I will likely be much more careful to have a clustered index on every table I use, even the working tables. Mike
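    A minimal sketch of the fix described above, assuming PolicyId is a reasonable clustering key for the work table (the index name and key column are illustrative):

      -- Empty the heap first; TRUNCATE releases its pages quickly.
      TRUNCATE TABLE Integration.MergePolicy;

      -- The clustered index keeps page allocation tidy from here on.
      CREATE CLUSTERED INDEX IX_MergePolicy_PolicyId
          ON Integration.MergePolicy (PolicyId);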

    Read the article

  • How do you balance documentation requirements with Agile development?

    - by Jeremy
    In our development group there are currently discussions around agile and waterfall methodology. No one has any practical experience with agile, but we are doing some reading. The agile manifesto lists 4 values: individuals and interactions over processes and tools; working software over comprehensive documentation; customer collaboration over contract negotiation; responding to change over following a plan. We are an internal development group developing applications for the consumption of other units in our enterprise. A team of 10 developers builds and releases multiple projects simultaneously, typically with one (rarely two) developers on each project. It seems that, from a supportability perspective, the organization needs to put some real value on documentation - without it, there are serious risks when resourcing changes. With agile favouring interactions and software deliverables over processes and documentation, how do you balance that with the requirements of supportable systems and maintaining knowledge and understanding of how those systems work? With a waterfall approach, which favours documentation (requirements before design, design specs before construction), it is easy to build a process that meets some of the organizational requirements - how do we do this with an agile approach?

    Read the article

  • How do you design an application for scaling?

    - by Muhammad
    I have one application which handles hardware events from devices connected to the same computer's PCIe slots. The motherboard has a maximum of two PCIe slots, and I have utilized both. Now, to scale the application, I need either more PCIe slots in the same computer or another computer. So consider that I am using another computer with the same application and hardware connected to its PCIe slots. My problem is that I want to design an application on top of this which can access both computers' hardware devices and process their events. The processed data should be sent back to the respective PC's hardware. Please refer to the attached diagram for the expansion.

    Read the article

  • Does onboard video affect the X Windows configuration?

    - by Timothy
    Does the onboard video on the motherboard affect the X Windows configuration? My system has onboard and PCIe video. The onboard video is an NVIDIA GeForce 7025 GPU (onboard graphics, max. shared memory up to 512MB under the OS via TurboCache). I have a PCIe dual-head video card installed with two monitors. The video card is a GeForce 8400 GS with 512MB memory. When installing Ubuntu 12.04, only one monitor worked. When pulling up System Settings > Displays, it shows a laptop - but this is a desktop PC. I did get both monitors to work using the NVIDIA driver with TwinView (a complicated process!). When checking the NVIDIA settings now, it shows the monitors disabled, although the NVIDIA X Server Settings tool does show the GPU and all the information. I was thinking it's seeing the onboard video on the motherboard. Why else would it show a laptop?
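    A hedged xorg.conf sketch for pinning X to the discrete card, assuming the GeForce 8400 GS sits at PCI bus 1 (the BusID is a placeholder; confirm it with lspci | grep -i vga):

      Section "Device"
          Identifier "DiscreteCard"
          Driver     "nvidia"
          BusID      "PCI:1:0:0"   # placeholder; use the bus address lspci reports
      EndSection

    With an explicit Device section like this, X should ignore the onboard GeForce 7025 entirely.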

    Read the article

  • Can a programmer get too smart for their own good?

    - by P.Brian.Mackey
    The more I learn about programming, the more things I see that could be improved by a great deal. Often, a company's process management is total SWAG, or they have frames-based websites written recently, .NET 1.1-based code, no separation of concerns, poor quality control... I could go on and on and on... Projects can succeed, but there tends to be so much waste that I am amazed at how much time and money a company can throw away. I've seen it happen at several companies. So is it that ignorance truly is bliss? UPDATE: How is it that top developers (I don't mean Jon Skeet level; I mean guys who are dedicated enough to hit a forum and try for self-improvement) even want to code anymore after they see the often insurmountable sociological and technical problems they are told to fix, but are then scolded for doing so?

    Read the article

  • Landed Cost Management Integration with OPM Financials

    - by Robert Story
    Upcoming Webcast. Title: Landed Cost Management Integration with OPM Financials. Date: April 21, 2010. Time: 11:00 am EDT, 9:00 am PDT, 8:00 am MDT. Product Family: EBS: Process Manufacturing. Summary: This one-hour session will present a setup overview and detailed steps for a test case, and is recommended for functional users who are using the OPM Financials module with an actual costing method. Topics will include:

    - Overview of Landed Cost Management functionality
    - Setup steps and a test case
    - Some technical considerations
    - Documentation and other reference materials available

    A short live demonstration (only if applicable) and a question and answer period will be included. Click here to register for this session. The above webcast is a service of the E-Business Suite Communities in My Oracle Support. For more information on other webcasts, please reference the Oracle Advisor Webcast Schedule. Click here to visit the E-Business Communities in My Oracle Support. Note that all links require access to My Oracle Support.

    Read the article

  • Doing a P2V in OVM 3.0.3

    - by Steen Schmidt
    The other day I was talking to a customer about how you can do a P2V in OVM. I had already written about this topic earlier in my blog, and there is also some good documentation on how to do this. But what about seeing the whole process from start to end? So I have included a link to a demo on the topic. The demo is divided into three steps: Step 1, Target System; Step 2, Import into OVM; and Step 3, Use the new Template.

    Read the article

  • File access slow after deletion of many files

    - by stefan
    I recently accidentally created millions of files in one folder (roughly 5 million) and, due to limitations, couldn't process them correctly (maximum argument count exceeded for wc/ls and such). So I deleted them, which took quite a while, but now they're gone. I deleted the files with a regular rm, and none of them were system files. So the files are definitely deleted, but the system is now very slow on file operations: ls, cat, and auto-complete by pressing Tab freeze the terminal for several seconds. Is this some sort of fragmentation issue? Is it an issue with the files still somehow being present?
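    One hedged way to check whether the directory's own index is the culprit (a common effect on ext3/ext4 after mass deletions; the path is a placeholder):

      # The directory inode can stay large even after its files are removed.
      ls -ld /path/to/folder    # a size in the MB range suggests a bloated directory index

      # Recreating the directory is the usual way to shrink it again:
      mkdir /path/to/folder.new
      mv /path/to/folder/* /path/to/folder.new/ 2>/dev/null   # move any remaining files
      rmdir /path/to/folder && mv /path/to/folder.new /path/to/folder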

    Read the article

  • Error opening .wmv file

    - by Dcm1405
    I tried to open a .wmv file in Thunderbird and was asked to download files in order to play it. At the end of the installation, an error message ("An error occurred - Location not found") was displayed, with just the OK button available. Now when I press the Play button, only the error message displays, so I am not sure which packages need to be installed or uninstalled. I have Ubuntu 12.04 (32-bit) installed with OpenJDK 7. The error was generated while installing the GStreamer ffmpeg video plugin; the process seemed to complete all steps of the download/install before the error message displayed.

    Read the article

  • Oracle OpenWorld 2012 Hands-on Lab: “Using Oracle AIA Foundation Pack for Oracle E-Business Suite Integration”

    - by Lionel Dubreuil
    Sharpen your Oracle skill sets and master Oracle technology in Oracle OpenWorld Hands-on Labs. In self-paced, practical learning sessions covering everything from business applications to middleware, database, storage, and enterprise management solutions, you'll discover new ways to derive maximum benefit from your Oracle hardware and software solutions. Oracle experts will be available in person to answer questions and guide you through each lab. Hands-on Labs fill up early, and seats are limited, so don't be late. HOL10233 - Using Oracle AIA Foundation Pack for Oracle E-Business Suite Integration is scheduled for: Date: Monday, Oct 1. Time: 10:45 AM - 11:45 AM. Location: Marriott Marquis - Nob Hill CD. In this Hands-on Lab, learn how to integrate Oracle E-Business Suite with third-party applications in your ecosystem using Oracle Application Integration Architecture (AIA) Foundation Pack. This hands-on lab focuses on SOA-based integration with Oracle E-Business Suite, using Oracle E-Business Suite Integrated SOA Gateway or the Oracle Applications Adapter to orchestrate a process across disparate applications in your ecosystem. Objectives for this session are to: learn how to design and build an integration, from functional definition to development; learn how to automatically build services with robust error-handling logic; and learn how to discover and reuse services available in Oracle E-Business Suite.

    Read the article

  • SQL Azure Data Sync

    - by kaleidoscope
    The Microsoft Sync Framework Power Pack for SQL Azure contains a series of components that improve the experience of synchronizing with SQL Azure. This includes runtime components that optimize performance and simplify the process of synchronizing with the cloud. SQL Azure Data Sync allows developers and DBAs to:

    - Link existing on-premises data stores to SQL Azure.
    - Create new applications in Windows Azure without abandoning existing on-premises applications.
    - Extend on-premises data to remote offices, retail stores and mobile workers via the cloud.
    - Take Windows Azure and SQL Azure based web applications offline to provide an "Outlook like" cached-mode experience.

    The Microsoft Sync Framework Power Pack for SQL Azure is comprised of the following:

    - SqlAzureSyncProvider
    - SQL Azure Offline Visual Studio Plug-In
    - SQL Azure Data Sync Tool for SQL Server
    - New SQL Azure Events
    - Automated Provisioning

    Geeta

    Read the article

  • Unable to set default gateway

    - by GrandMasterFlush
    I'm running Ubuntu Server via Hyper-V and have installed it successfully, but I am unable to ping the server, or to ping any other machines on the network from the server. After doing a bit of reading, I've noticed that the default gateway isn't set, but when I try to set it I keep getting error messages which I can't understand. From this article, I've tried:

      ip route add default via 10.0.10.200

    which reports:

      RTNETLINK answers: Operation not permitted

    If I run it prefixed with sudo, it instead reports:

      RTNETLINK answers: No such process

    I've edited /etc/network/interfaces, but when I start the machine and type netstat -nr there is nothing listed. The interfaces file contains:

      auto lo
      iface lo inet loopback

      auto eth0
      iface eth0 inet dhcp

    Can anyone tell me where I'm going wrong, please?
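    A hedged sketch of a static /etc/network/interfaces stanza that sets the gateway explicitly (the host address and netmask are placeholders; only the gateway value comes from the question):

      auto eth0
      iface eth0 inet static
          address 10.0.10.50      # placeholder host address on the same subnet
          netmask 255.255.255.0   # placeholder netmask
          gateway 10.0.10.200

    After saving, sudo ifdown eth0 && sudo ifup eth0 (or a reboot) should apply it. The "No such process" error typically means the gateway isn't reachable via any configured subnet, which a static address on 10.0.10.x would address.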

    Read the article

  • WebCenter Workshops and Seminars

    - by rituchhibber
    The following workshops and eSeminars are already scheduled. You are allowed to forward eSeminars and events with registration links to any interested Oracle partner or consulting employee. If links are missing, please contact the organizer to be invited to a workshop (and get a registration link).

    - Oracle WebCenter Content Foundation: October 16-18, 2012, Colombes, France
    - Oracle ADF Foundation: October 10-12, 2012, Colombes, Paris, France
    - WebCenter Content Management: WebCenter Content Manager 11g Workshop (3 days, to nominated partners); Oracle Imaging and Process Management (I/PM) Foundation Workshop (3 days, to nominated partners)
    - WebCenter Sites: November 20th-22nd, 2012, Madrid, Spain
    - ADF (Oracle Application Development Framework): ADF 11g Foundation Workshop (3 days, to nominated partners); ADF 11g Advanced Workshop (4 days, to nominated partners)

    Read the article

  • Is Java free/open source or isn't it?

    - by user1598390
    On November 13, 2006, Sun released much of Java as free and open source software (FOSS) under the terms of the GNU General Public License (GPL). On May 8, 2007, Sun finished the process, making all of Java's core code available under free software/open-source distribution terms, aside from a small portion of code to which Sun did not hold the copyright. OpenJDK (Open Java Development Kit) is a free and open source implementation of the Java programming language. It is the result of an effort Sun Microsystems began in 2006. The implementation is licensed under the GNU General Public License (GNU GPL) with a linking exception. Why are there still people who say Java is not open source, or not free as in free speech? Am I missing something? Is Java still proprietary?

    Read the article

  • Unknown filesystem: grub rescue

    - by Rahul Rodrigues
    I have a Lenovo Y560 laptop on which Windows 7 and Ubuntu 11.10 are installed as dual boot. For some reason I had to recover the boot loader using bootrec.exe /fixmbr and bootrec.exe /fixboot; this created one partition of size 198MB named "tet", and both OSes were working fine. Yesterday, while making some changes in the partition table, I deleted that "tet" partition, and after reboot I'm getting the following error: "error: unknown filesystem. grub rescue>". I tried to boot from the Windows installer CD, but it gets stuck at "Starting Windows", so I'm not able to run the commands I mentioned earlier. I also tried to boot from Ubuntu 11.10, but it gets stuck after showing the error "memory full, can't kill any more processes". Please help me out.
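    A hedged sketch of the usual grub rescue recovery sequence, assuming Ubuntu's /boot lives on the fifth partition of the first disk (the partition name is a placeholder; use ls to find the one containing /boot/grub):

      grub rescue> ls                                    # lists partitions, e.g. (hd0,msdos5)
      grub rescue> set prefix=(hd0,msdos5)/boot/grub     # placeholder partition
      grub rescue> set root=(hd0,msdos5)
      grub rescue> insmod normal
      grub rescue> normal                                # loads the normal GRUB menu

    Once booted into Ubuntu, sudo grub-install /dev/sda followed by sudo update-grub makes the repair permanent.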

    Read the article

  • Improving the performance of grepping over a huge file

    - by rogerio_marcio
    I have FILE_A, which has over 300K lines, and FILE_B, which has over 30M lines. I created a bash script that greps each line of FILE_A in FILE_B and writes the result of the grep to a new file. This whole process is taking over 5 hours. I'm looking for suggestions on whether you see any way of improving the performance of my script. I'm using grep -F -m 1 as the grep command. FILE_A looks like this:

      123456789
      123455321

    and FILE_B is like this:

      123456789,123456789,730025400149993,
      123455321,123455321,730025400126097,

    So with bash I have a while loop that picks the next line in FILE_A and greps it in FILE_B. When the pattern is found in FILE_B, I write it to result.txt:

      while read -r line; do
          grep -F -m1 "$line" 30MFile
      done < 300KFile

    Thanks a lot in advance for your help.
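    A sketch of a much faster approach, assuming the goal is simply to pull every FILE_B line that matches a FILE_A key: hand grep the whole pattern file at once instead of spawning one grep process per line (file names as in the question):

      # One grep invocation: loads all 300K fixed-string patterns, scans FILE_B once.
      grep -F -f 300KFile 30MFile > result.txt

    Note that -m 1 has no per-pattern equivalent in this form; if only the first match per key is required, a single awk pass with a hash lookup on the first comma-separated field is another option.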

    Read the article

  • Managing information/functionality in shared common project classes

    - by ilansch
    In my company, we have a common solution that contains shared projects (two projects so far: one for .NET 3.5 and one for .NET 4.5). My main problem is that over time a lot of code gets added - for example, a class called ServiceManagement hosts a process as a Windows service - but nobody except its developer knows it exists, so anyone who wants to use this shared class doesn't know it's there. I'm looking for a way to document and manage all the classes with tags - a third-party utility or web tool where I can search for tags and find common classes I can reuse (provided we keep all our code well documented). Is anyone familiar with this sort of tool?
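    One lightweight convention, sketched here in C#: put a searchable keyword line into each shared class's XML doc comments, then index them with any doc generator or plain grep (the "tags:" format is an invented convention, not a standard):

      /// <summary>
      /// Hosts an arbitrary process as a Windows service.
      /// </summary>
      /// <remarks>
      /// tags: hosting, windows-service, process   (invented keyword convention, searchable by tooling)
      /// </remarks>
      public class ServiceManagement
      {
          // ...
      }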

    Read the article
