Search Results

Search found 2128 results on 86 pages for 'ole automation'.

Page 10/86 | < Previous Page | 6 7 8 9 10 11 12 13 14 15 16 17  | Next Page >

  • How do I delete a signature from an Excel document using Office automation

    - by Guy Marom
    Hello, I have a signed Excel workbook and I want to delete the signature from it. The problem is that when I try to delete the signature there's a prompt for confirming the deletion, and I want the process to be fully automatic. Here's a code sample; the prompt appears when the last line executes:

        Dim source As String = "c:\temp\signed.xlsx"
        Dim app As New Application()
        app.Visible = True
        Dim book As Workbook = app.Workbooks.Open(source, UpdateLinks:=0)
        app.ShowToolTips = False
        Dim sig As Microsoft.Office.Core.Signature = book.Signatures.Item(1)
        sig.Delete()

    Thanks
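
    One possible direction, offered only as a hedged sketch (nothing below comes from the original post, and whether DisplayAlerts actually covers the signature-deletion prompt is an assumption to verify): Excel's Application.DisplayAlerts property suppresses most confirmation dialogs, so setting it to False before calling Delete may remove the prompt. The C# below keeps the signature access late-bound, mirroring the VB call above.

        // Sketch only: turn off Excel's alert dialogs before deleting the signature.
        using Excel = Microsoft.Office.Interop.Excel;

        var app = new Excel.Application { Visible = false };
        app.DisplayAlerts = false;   // assumption: this also suppresses the deletion prompt
        dynamic book = app.Workbooks.Open(@"c:\temp\signed.xlsx", UpdateLinks: 0);
        dynamic sig = book.Signatures.Item(1);   // late-bound, same call as the VB sample
        sig.Delete();
        book.Close(SaveChanges: true);
        app.Quit();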

    Read the article

  • Excel automation using C#

    - by tecnodude
    Hi, I have a folder with close to 400 Excel files. I need to copy the worksheets from all these Excel files into a single Excel file. Using the Interop and Reflection namespaces, here is what I have accomplished so far. I use folderBrowserDialog to browse to the folder and select it; this enables me to get the file names of the files within the folder and iterate through them. This is as far as I got; any help would be appreciated.

        if (result == DialogResult.OK)
        {
            string path = fbd1.SelectedPath; // get the path
            int pathLength = path.Length + 1;
            string[] files = Directory.GetFiles(fbd1.SelectedPath); // getting the names of files in that folder
            foreach (string i in files)
            {
                MessageBox.Show("1 " + i);
                myExcel.Application excelApp = new myExcel.ApplicationClass();
                excelApp.Visible = false;
                MessageBox.Show("2 " + i);
                myExcel.Workbook excelWorkbook = excelApp.Workbooks.Add(
                    excelApp.Workbooks._Open(i, 0, false, 5, "", "", false, myExcel.XlPlatform.xlWindows, "", true, false, 0, true));
                myExcel.Sheets excelSheets = excelWorkbook.Worksheets;
                MessageBox.Show("3 " + i);
                excelApp.Workbooks.Close();
                excelApp.Quit();
            }
            MessageBox.Show("Done!");
        }

    How do I append the copied sheets to the destination file? I hope the question is clear. Thanks.
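
    As a possible direction, here is a minimal sketch (an assumption-laden illustration, not a verified answer): Excel's Worksheet.Copy method accepts a Before/After argument, so each source sheet can be copied to the end of a single destination workbook that is kept open across the loop and saved once at the end. The folder and output paths are placeholders, and the example uses the usual Microsoft.Office.Interop.Excel alias rather than the question's myExcel alias.

        // Sketch only: copy every worksheet of every file in a folder to the end
        // of one destination workbook using Worksheet.Copy(After: ...).
        using System.IO;
        using Excel = Microsoft.Office.Interop.Excel;

        var excelApp = new Excel.Application { Visible = false, DisplayAlerts = false };
        Excel.Workbook destBook = excelApp.Workbooks.Add();              // assumed destination workbook
        foreach (string file in Directory.GetFiles(@"C:\temp\sheets"))   // hypothetical source folder
        {
            Excel.Workbook srcBook = excelApp.Workbooks.Open(file);
            foreach (Excel.Worksheet sheet in srcBook.Worksheets)
            {
                // Copy after whatever is currently the last sheet of the destination workbook.
                sheet.Copy(After: destBook.Sheets[destBook.Sheets.Count]);
            }
            srcBook.Close(SaveChanges: false);
        }
        destBook.SaveAs(@"C:\temp\combined.xlsx");                       // hypothetical output path
        excelApp.Quit();

    Note that the workbook created by Workbooks.Add starts with Excel's default blank sheets, which could be removed once the copies are in place, and that creating the Application object once outside the loop avoids the cost of starting Excel for every file.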

    Read the article

  • ASP.NET Forms automation/serialization/binding

    - by creo
    I need to implement many forms in an ASP.NET application (mostly for the IRS). There will be a lot of standard controls on each form (textboxes, dropdowns, checkboxes, radio buttons), with a business entity assigned to each form. What's the best solution to automate this process? I need to:

    - Have the layout stored in the DB (in XML); the layout must support several columns and a tabbed interface
    - Automatically bind business object values to the form
    - Automatically read form values and write them to the business object
    - Support automatic validation
    - Some basic workflow support would be good

    I used to work with TFS and saw how they implemented WorkItem templates (.wit files). In general this is all I need. But what framework did they build it on? How can I utilize this solution? I know about Dynamic Data only: http://www.asp.net/dynamicdata
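
    To make the two binding requirements concrete, here is a rough sketch (purely illustrative; the control choice, naming convention, and simple Convert.ChangeType conversion are all assumptions, and nothing here comes from the original question): reflection walks the entity's public properties, emits a TextBox per property, and later copies posted values back onto the entity.

        // A hedged sketch of reflection-based "bind object to form / read form back".
        // Only simple, string-convertible property types are handled here.
        using System;
        using System.Reflection;
        using System.Web.UI.WebControls;

        public static class SimpleFormBinder
        {
            // Create one TextBox per public property and show the current value.
            public static void BindToPlaceholder(object entity, PlaceHolder target)
            {
                foreach (PropertyInfo prop in entity.GetType().GetProperties())
                {
                    object value = prop.GetValue(entity, null);
                    TextBox box = new TextBox();
                    box.ID = "txt_" + prop.Name;                      // hypothetical naming convention
                    box.Text = value == null ? string.Empty : value.ToString();
                    target.Controls.Add(box);
                }
            }

            // Read the posted values back into the entity.
            public static void ReadBack(object entity, PlaceHolder target)
            {
                foreach (PropertyInfo prop in entity.GetType().GetProperties())
                {
                    TextBox box = target.FindControl("txt_" + prop.Name) as TextBox;
                    if (box != null && prop.CanWrite)
                    {
                        prop.SetValue(entity, Convert.ChangeType(box.Text, prop.PropertyType), null);
                    }
                }
            }
        }

    A real implementation would drive control creation and validation from the XML layout stored in the database rather than emitting a flat list of text boxes.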

    Read the article

  • Maven + SSDM Build and Runtime Environment Automation

    - by Randy
    Preface: My company, like most, has several run-time environments and several release versions, which are themselves composed of different versions of various jars. For example, let us consider release versions 1.1, 1.2, and 1.3 of Software X, which may be deployed to a developer computer, testing, or production. Software-x-1.1 is itself composed of jarA-0.9.1 and jarB-0.7.5, but software-x-1.3 is composed of jarA-1.7.31 and jarB-0.8.1. Currently we use Spring's PropertyPlaceholderConfigurer to configure run-time variables (such as database credentials); however, properties also change with release versions. We also use Maven 2 POM version 4 to specify which versions of our code need to be used. We place the version numbers of our jars as properties within profiles (dev, test, prod) inside the parent pom and then reference those version numbers in all project poms. As of right now, we have no way to specify which project versions pertain to a given release other than the most current one. Moreover, we deploy our run-time configurations to the SSDM pickup, which then configures and creates the services defined by the built versions of our software.

    Questions:

    - Is there any procedure/tool we can use to build our product by merely providing the run-time environment and version number, i.e. "build 1.1 dev"?
    - Is there any way we can store the required jar versions for each release build? We are currently versioning all files, including the parent pom, but merely versioning the parent pom does not record which release version pertains to that parent pom.
    - What else can we do to further automate the build process? For example, if we could manage run-time configurations within the parent pom, that would be a step in the right direction, but that seems like a violation of scope. Any tool outside of our framework is inconceivable at this point, but not in the far future.

    Summary: How can we automate our build process to the fullest extent without it being error-prone?

    Read the article

  • Software automation testing

    - by dotnet-practitioner
    I work in a .NET shop where we need to automate software testing. We write ASP.NET web apps, web services, Windows services, and scheduled console applications. The back end for all these applications is SQL Server. We would like to automate the testing of any bug fix, anywhere from a web UI change, to a middle-tier .NET code change, to a SQL code change. This tool would be used by programmers to do unit tests, and the tests would be played back in different test environments to ensure that the bug fix is tested correctly in all the environments, including the production environment. These tests would be executed by different teams such as QA, Build, and production site testers. What tool or approach do you recommend?

    Read the article

  • Visual Studio 2010 Data Compare Automation

    - by MicMit
    I noticed that the Premium edition has a Data menu with a Data Compare option, which does everything I need. I'm just wondering whether there is a way to automate, from my application, what's done in the GUI. Ideally I'd like to get collections of the different/left/right rows.

    Read the article

  • COM Interop - Wait for Excel to Complete Operation

    - by roygbiv
    Hello, I am doing some COM Interop work with Excel and other Office automation software. A co-worker has mentioned to me that I need to wait for these automation servers to become ready after issuing a command to them. I, however, cannot see the purpose of this, as don't all calls block until the automation server completes the given task? For example, I would normally write:

        Dim o as AutomationObject = AutomationServer.CreateObject(x, y, z)
        o.Property ' could throw COM exception!!????

    My co-worker says I need to sleep after that call because the Automation Server could still be working on creating and initializing the object:

        Dim o as AutomationObject = AutomationServer.CreateObject(x, y, z)
        Threading.Sleep(5000) ' wait for AutomationServer to "become ready"
        o.Property ' could still throw COM exception!!????

    The problem I have with this is that the AutomationServer calls should block until the AutomationServer finishes or completes what it was working on, or at the very least there should be a loop checking whether "o" is Nothing; but that makes no sense, because once the call returns it's done! My questions are: Is there any benefit to sleeping after an AutomationServer call? Is there a method to "wait" until the AutomationServer finishes (if it does not in fact block)?
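
    For context, a hedged aside that is not part of the original exchange: individual COM automation calls are indeed synchronous, but an out-of-process server such as Excel can still reject an incoming call while it is busy (for example, while a dialog is open), which surfaces as a COMException with HRESULT 0x80010001 (RPC_E_CALL_REJECTED, "Call was rejected by callee"). A retry loop around the call is one alternative to a fixed sleep; the helper name, retry count, and delay below are arbitrary choices, shown in C# for brevity.

        // A sketch: retry a COM call that a busy automation server rejects,
        // instead of sleeping for a fixed amount of time up front.
        using System;
        using System.Runtime.InteropServices;
        using System.Threading;

        static class ComRetry
        {
            // "Call was rejected by callee" -- raised when the out-of-process server is busy.
            const int RPC_E_CALL_REJECTED = unchecked((int)0x80010001);

            public static T CallWithRetry<T>(Func<T> call, int attempts = 10)
            {
                for (int i = 0; ; i++)
                {
                    try
                    {
                        return call(); // the COM call itself blocks until the server answers
                    }
                    catch (COMException ex) when (ex.ErrorCode == RPC_E_CALL_REJECTED && i < attempts)
                    {
                        Thread.Sleep(250); // brief back-off before retrying; the interval is arbitrary
                    }
                }
            }
        }

        // Hypothetical usage, mirroring the question's pseudo-code:
        //   var value = ComRetry.CallWithRetry(() => o.Property);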

    Read the article

  • Why is my code slower using #import "progid:typelib" than using "MFC Class From TypeLib"?

    - by Pakman
    I am writing an automation client in Visual C++ with MFC. If I right-click on my solution » Add » Class, I have the option to select MFC Class From TypeLib. Selecting this option generates source/header files for all interfaces. This allows me to write code such as:

        #include "CApplication.h"
        #include "CDocument.h"
        // ... connect to automation server ...
        CApplication *myApp = new CApplication(pDisp);
        CDocument myDoc = myApp->get_ActiveDocument();

    Using this method, my benchmarking function that makes about 12000 automation calls takes 1 second. Meanwhile, the following code:

        #import "progid:Library.Application"
        Library::IApplicationPtr myApp;
        // ... connect to automation server ...
        Library::IDocumentPtr myDoc = myApp->GetActiveDocument();

    takes about 2.4 seconds for the same benchmark. I assume the smart-pointer implementation is slowing me down, but I don't know why. Even worse, I'm not sure how to use the #import construct to achieve the speeds that the first method yields. Is this possible? How, or why not? Thanks for your time!

    Read the article

  • [OT] : Windows Activation, en masse

    - by AaronBertrand
    This weekend I discovered a minor issue in one of my virtual environments. I had built out 100 VMs based on a Hyper-V template, but I forgot to activate the original source before creating the template, so all of the machines were suddenly out of compliance. While easy enough on a one- or two-machine basis to just log into the machine and activate manually, there was no way I was even going to dream of repeating that process on 100 machines.

    My First Reaction: PowerShell

    Whenever I do anything with...(read more)

    Read the article

  • BAD ARCHIVE MIRROR using PXE BOOT method

    - by omkar
    I am trying to automatically install Ubuntu on a client PC by using the PXE boot method. I am following the steps given in this link: installation using PXE BOOT. My objectives are below:

    1. The server will have a KICKSTART config file which contains the parameters for the OS installation and the files which are required for the OS installation.
    2. The client will have to detect this configuration along with the setup files and complete the installation without any input from the user.

    On my server I have installed DHCP3-server, Apache2, and TFTP to help me with the installation. I have nearly achieved my first objective: I am able to boot my client using the files stored on the server, but during the installation stage it asks me to "CHOOSE A MIRROR of UBUNTU ARCHIVE". I gave the server's IP address and the path on the server where the files are located, but then it gives me the error "BAD ARCHIVE MIRROR". So, is it possible that instead of downloading all the files from the internet and storing them on my disk, I can use the files which come with the Ubuntu CD? And how should I store these files on the disk, and in what format (should I zip them)? Secondly, I am also generating the ks.cfg which I want to give to the client for automatic installation of the OS, so how should the configuration file be given to the installation process?

    Read the article

  • Cut Caseload Costs, Speed Service Delivery For Social Services

    - by michael.seback
    Lower Caseload Costs, Speedier Service Delivery with New Oracle Social Services Solution

    Oracle has just introduced a new solution for social services agencies that's designed to help case workers address the challenges of rising workloads and growing demands by citizens for additional services. In the past, IT departments developed custom software in an effort to meet program outcomes. "Because this capability is out of the box with the Oracle solution, there's less complexity for organizations and an overall lower total cost of ownership," says Kimberly Ellison-Taylor, Oracle's executive director of health and human services. "Self service brings costs down to just pennies per interaction and makes it possible for clients to receive government services more quickly," Ellison-Taylor says. read more

    Read the article

  • Automated unit testing, integration testing or acceptance testing

    - by bjarkef
    TDD and unit testing seem to be the big rave at the moment. But are they really that useful compared to other forms of automated testing? Intuitively I would guess that automated integration testing is way more useful than unit testing. In my experience most bugs seem to be in the interaction between modules, and not so much in the actual (usually limited) logic of each unit. Also, regressions often happened because of changing interfaces between modules (and changed pre- and post-conditions). Am I misunderstanding something, or why is unit testing getting so much focus compared to integration testing? Is it simply because it is assumed that integration testing is something you already have, and unit testing is the next thing we need to learn to apply as developers? Or maybe unit testing simply yields the highest gain compared to the complexity of automating it? What are your experiences with automated unit testing, automated integration testing, and automated acceptance testing, and which has yielded the highest ROI in your experience? And why? If you had to pick just one form of testing to be automated on your next project, which would it be? Thanks in advance.

    Read the article

  • The Next Wave of PeopleSoft Capabilities for the Staffing Industry Is Here

    - by Mark Rosenberg
    With the release of PeopleSoft Financials and Supply Chain Management 9.1 Feature Pack 2 in January this year, we introduced substantial new capabilities for our Staffing Industry customers. Through a co-development project with Infosys Limited, we have enriched Oracle's PeopleSoft Staffing Solution with new tools aimed at accelerating and improving the quality of job order fulfillment, increasing branch recruiter productivity, and driving profitable growth. Staffing industry firms succeed based on their ability to rapidly, cost-effectively, and continually fill their pipelines with new clients and job orders, recruit the best talent, and match orders with talent. Pressure to execute in each of these functional areas is even more acute on staffing firms as contingent labor becomes a more substantial and permanent part of the workforce mix. In an industry that creates value through speedy execution, there is little room for manual, inefficient processes and brittle, custom integrations, which throttle profitability and growth. The latest wave of investment in the PeopleSoft Staffing Solution focuses on generating efficiency and flexibility for our customers.

    Simplicity

    To operate profitably and continue growing, a Staffing enterprise needs its client management, recruiting, order fulfillment, and other processes to function in harmony. Most importantly, they need to be simple for recruiters, branch managers, and applicants to access and understand. The latest PeopleSoft Staffing Solution set of enhancements includes numerous automated defaulting mechanisms and information-rich dashboard pagelets that even a new employee can learn quickly. Pending Applicant, Agenda management, Search, and other pagelets are just a few of the newest, easy-to-use tools that not only aggregate and summarize information, but also provide instant access to applicants, tasks, and key reports for branch staff.

    Productivity

    The leading firms in the Staffing industry are those that can more efficiently orchestrate large numbers of candidates, clients, and orders than their competitors can. PeopleSoft Financials and Supply Chain Management 9.1 Feature Pack 2 delivers productivity boosters that Staffing firms can leverage to streamline tasks and processes for competitive advantage. For example, we enhanced the Recruiting Funnel, which manages the candidate on-boarding process, with a highly interactive user interface. It integrates disparate Staffing business processes and exploits new PeopleTools technologies to offer a superior on-boarding user experience. Automated creation of agenda items and assignment tasks for each candidate minimizes setup and organizes assignment steps for the on-boarding process. Mass updates of tasks and instant access to the candidate overview page (which we also expanded), candidate event status, event counts, and other key data enable recruiters to better serve clients and candidates.

    Lower TCO

    Constructing and maintaining an efficient yet flexible labor supply chain can be complicated, let alone expensive. Traditionally, Staffing firms have been challenged in controlling their technology cost of ownership because connecting candidate and client-facing tools involved building and integrating custom applications and technologies and managing staff turnover, placing heavy demands on IT and support staff. With PeopleSoft Financials and Supply Chain Management 9.1 Feature Pack 2, there are two major enhancements that aggressively tackle these challenges.
First, we added another integration framework to enable cost-effective linking of the Staffing firm’s PeopleSoft applications and its job board distributors. (The first PeopleSoft 9.1 Feature Pack released in March 2011 delivered an integration framework to connect to resume parsing providers.) Second, we introduced the teaming concept to enable work to be partitioned to groups, as well as individuals. These two capabilities, combined with a host of others, position Staffing firms to configure and grow their businesses without growing their IT and overhead expenditures. For our Staffing Industry customers, PeopleSoft Financials and Supply Chain Management 9.1 Feature Pack 2 is loaded with high-value tools aimed at enabling and sustaining a flexible labor supply chain. For more information, contact [email protected] or [email protected].

    Read the article

  • Thinking of Adopting the PRINCE2™ Project Management Methodology? Consider Using PeopleSoft Projects to Help

    - by Megan Boundey
    Ever wondered what the PRINCE2™ project management methodology is? Ever wondered if you could use PeopleSoft Projects (ESA) to manage your projects using PRINCE2™?  Published by the Office of Government Commerce in the UK, PRINCE2™ is a scalable, business case and product description-driven Project Management methodology based upon managing by exception. Project activities are organized around fulfilling and meeting the product description. Quality assurance, configuration control and risk management are all based upon ensuring that the product delivered accurately meets the product description. PRINCE2™ is built upon seven principles and seven themes, each underpinning the PRINCE2™project management processes. Important for today’s business environment, the focus throughout PRINCE2™ is on the Business Case, which describes the rationale and business justification for a project. The Business Case drives all the project management processes from initial project setup to successful finish. PRINCE2™, as a method and a certification, is adopted in many countries worldwide, including the UK, Western Europe and Australia. We’ve just released a new white paper, which provides you with an overview of the principles, themes and project management processes associated with PRINCE2™. It also shows how these map to the functionality available within PeopleSoft Projects (ESA). In the time it takes to drink a coffee, you can learn about PRINCE2™ and determine whether it might help you deliver better project results. We encourage you to take a look.

    Read the article

  • PeopleSoft Upgrades, Fusion, & BI for Leading European PeopleSoft Applications Customers

    - by Mark Rosenberg
    With so many industry-leading services firms around the globe managing their businesses with PeopleSoft, it’s always an adventure setting up times and meetings for us to keep in touch with them, especially those outside of North America who often do not get to join us at Oracle OpenWorld. Fortunately, during the first two weeks of May, Nigel Woodland (Oracle’s Service Industries Director for the EMEA region) and I successfully blocked off our calendars to visit seven different customers spanning four countries in Western Europe. We met executives and leaders at four Staffing industry firms, two Professional Services firms that engage in consulting and auditing, and a Financial Services firm. As we shared the latest information regarding product capabilities and plans, we also gained valuable insight into the hot technology topics facing these businesses. What we heard was both informative and inspiring, and I suspect other Oracle PeopleSoft applications customers can benefit from one or more of the following observations from our trip. Great IT Plans Get Executed When You Respect the Users Each of our visits followed roughly the same pattern. After introductions, Nigel outlined Oracle’s product and technology strategy, including a discussion of how we at Oracle invest in each layer of the “technology stack” to provide customers with unprecedented business management capabilities and choice. Then, I provided the specifics of the PeopleSoft product line’s investment strategy, detailing the dramatic number of rich usability and functionality enhancements added to release 9.1 since its general availability in 2009 and the game-changing capabilities slated for 9.2. What was most exciting about each of these discussions was that shortly after my talking about what customers can do with release 9.1 right now to drive up user productivity and satisfaction, I saw the wheels turning in the minds of our audiences. Business analyst and end user-configurable tools and technologies, such as WorkCenters and the Related Action Framework, that provide the ability to tailor a “central command center” to the exact needs of each recruiter, biller, and every other role in the organization were exactly what each of our customers had been looking for. Every one of our audiences agreed that these tools which demonstrate a respect for the user would finally help IT pole vault over the wall of resistance that users had often raised in the past. With these new user-focused capabilities, IT is positioned to definitively partner with the business, instead of drag the business along, to unlock the value of their investment in PeopleSoft. This topic of respecting the user emerged during our very first visit, which was at Vital Services Group at their Head Office “The Mill” in Manchester, England. (If you are a student of architecture and are ever in Manchester, you should stop in to see this amazingly renovated old mill building.) I had just finished explaining our PeopleSoft 9.2 roadmap, and Mike Code, PeopleSoft Systems Manager for this innovative staffing company, said, “Mark, the new features you’ve shown us in 9.1/9.2 are very relevant to our business. As we forge ahead with the 9.1 upgrade, the ability to configure a targeted user interface with WorkCenters, Related Actions, Pivot Grids, and Alerts will enable us to satisfy the business that this upgrade is for them and will deliver tangible benefits. 
In fact, you’ve highlighted that we need to start talking to the business to keep up the momentum to start reviewing the 9.2 upgrade after we get to 9.1, because as much as 9.1 and PeopleTools 8.52 offers, what you’ve shown us for 9.2 is what we’ve envisioned was ultimately possible with our investment in PeopleSoft applications.” We also received valuable feedback about our investment for the Staffing industry when we visited with Hans Wanders, CIO of Randstad (the second largest Staffing company in the world) in the Netherlands. After our visit, Hans noted, “It was very interesting to see how the PeopleSoft applications have developed. I was truly impressed by many of the new developments.” Hans and Mike, sincere thanks for the validation that our team’s hard work and dedication to “respecting the users” is worth the effort! Co-existence of PeopleSoft and Fusion Applications Just Makes Sense As a “product person,” one of the most rewarding things about visiting customers is that they actually want to talk to me. Sometimes, they want to discuss a product area that we need to enhance; other times, they are interested in learning how to extract more value from their applications; and still others, they want to tell me how they are using the applications to drive real value for the business. During this trip, I was very pleased to hear that several of our customers not only thought the co-existence of Fusion applications alongside PeopleSoft applications made sense in theory, but also that they were aggressively looking at how to deploy one or more Fusion applications alongside their PeopleSoft HCM and FSCM applications. The most common deployment plan in the works by three of the organizations is to upgrade to PeopleSoft 9.1 or 9.2, and then adopt one of the new Fusion HCM applications, such as Fusion Performance Management or the full suite of  Fusion Talent Management. For example, during an applications upgrade planning discussion with the staffing company Hays plc., Mark Thomas, who is Hays’ UK IT Director, commented, “We are very excited about where we can go with the latest versions of the PeopleSoft applications in conjunction with Fusion Talent Management.” Needless to say, this news was very encouraging, because it reiterated that our applications investment strategy makes good business sense for our customers. Next Generation Business Intelligence Is the Key to the Future The third, and perhaps most exciting, lesson I learned during this journey is that our audiences already know that the latest generation of Business Intelligence technologies will be the “secret sauce” for organizations to transform business in radical ways. While a number of the organizations we visited on the trip have deployed or are deploying Oracle Business Intelligence Enterprise Edition and the associated analytics applications to provide dashboards of easy-to-understand, user-configurable metrics that help optimize business performance according to current operating procedures, what’s most exciting to them is being able to use Business Intelligence to change the way an organization does business, grows revenue, and makes a profit. In particular, several executives we met asked whether we can help them minimize the need to have perfectly structured data and at the same time generate analytics that improve order fulfillment decision-making. 
To them, the path to future growth lies in having the ability to analyze unstructured data rapidly and intuitively and leveraging technology’s ability to detect patterns that a human cannot reasonably be expected to see. For illustrative purposes, here is a good example of a business problem where analyzing a combination of structured and unstructured data can produce better results. If you have a resource manager trying to decide which person would be the best fit for an assignment in terms of ensuring (a) client satisfaction, (b) the individual’s satisfaction with the work, (c) least travel distance, and (d) highest margin, you traditionally compare resource qualifications to assignment needs, calculate margins on past work with the client, and measure distances. To perform these comparisons, you are likely to need the organization to have profiles setup, people ranked against profiles, margin targets setup, margins measured, distances setup, distances measured, and more. As you can imagine, this requires organizations to plan and implement data setup, capture, and quality management initiatives to ensure that dependable information is available to support resourcing analysis and decisions. In the fast-paced, tight-budget world in which most organizations operate today, the effort and discipline required to maintain high-quality, structured data like those described in the above example are certainly not desirable and in some cases are not feasible. You can imagine how intrigued our audiences were when I informed them that we are ready to help them analyze volumes of unstructured data, detect trends, and produce recommendations. Our discussions delved into examples of how the firms could leverage Oracle’s Secure Enterprise Search and Endeca technologies to keyword search against, compare, and learn from unstructured resource and assignment data. We also considered examples of how they could employ Oracle Real-Time Decisions to generate statistically significant recommendations based on similar resourcing scenarios that have produced the desired satisfaction and profit margin results. --- Although I had almost no time for sight-seeing during this trip to Europe, I have to say that it may have been one of the most energizing and engaging trips of my career. Showing these dedicated customers how they can give every user a uniquely tailored set of tools and address business problems in ways that have to date been impossible made the journey across the Atlantic more than worth it. If any of these three topics intrigue you, I’d recommend you contact your Oracle applications representative to arrange for more detailed discussions with the appropriate members of our organization.

    Read the article

  • Oracle Consulting North America is now live on PeopleSoft Services Procurement and PeopleSoft Resource Management

    - by Howard Shaw
    Last month, Oracle's own internal consulting group (OCS North America) went live on PeopleSoft Services Procurement and PeopleSoft Resource Management to manage all aspects of identifying, recruiting, and deploying billable subcontractors on North America Applications customer consulting projects. The primary goals were to enhance the subcontractor staffing process, improve operational and informational processes, and improve collaboration between the Oracle NA Consulting Subcontractor Program and subcontractor suppliers. Over 200 registered external suppliers access the tool, review open needs and competitively bid their resources to work on NA Applications projects. This implementation highlights the usage of Oracle’s own solutions to streamline and enhance business operations, as the PeopleSoft 9.1 applications (Services Procurement and Resource Management) were deployed using Sun hardware, Oracle Enterprise Linux, and Oracle Virtual Machines.For more information, please navigate to the following web pages: PeopleSoft Services Procurement PeopleSoft Resource Management

    Read the article

  • Oracle's PeopleSoft Customer Advisory Boards Convene to Discuss Roadmap at Pleasanton Campus

    - by john.webb(at)oracle.com
    Last week we hosted all of the PeopleSoft CABs (Customer Advisory Boards) at our Pleasanton Development Center to review our detailed designs for future Feature Packs, PeopleSoft 9.2, and beyond. Over 150 customers from 79 companies attended representing a variety of industries, geographies, and company sizes. The PeopleSoft team relies heavily on this group to provide key input on our roadmap for applications as well as technology direction. A good product strategy is one part well thought out idea with many handfuls of customer validation, and very often our best ideas originate from these customer discussions. While the individual CABs have frequent interactions with our teams, it's always great to have all of them in one place and in person. Our attendance was up from last year which I attribute to two things: (1) More interest as a result of PeopleSoft 9.1 upgrade; (2) An improving economy allowing for more travel. Maybe we should index the second item meeting-to-meeting and use it as a market indicator - we'll see! We kicked off the day one session with an overview of the PeopleSoft Roadmap and I outlined our strategy around Feature Packs and PeopleSoft 9.2. Given the high adoption rate of PeopleSoft 9.1 (over 4x that of 9.0 given the same time lapse since the release date), there was a lot of interest around the 9.1 Feature Packs as a vehicle for continuous value. We provided examples of our 3 central design themes: Simplicity, Productivity, and lower TCO, including those already delivered via Feature Packs in 2010. A great example of this is the Company Directory feature in PeopleSoft HCM. The configuration capabilities and the new actionable links our CAB advised us on last Spring were made available to all customers late last year. We reviewed many more future Navigation changes that will fundamentally change the way users interact with PeopleSoft. Our old friend, the menu tree, is being relegated from center stage to a bit part, with new concepts like Activity Guides, Train Stops, Related Actions, Work Centers, Collaborative Workspaces, and Secure Enterprise Search bringing users what they need in a contextual, role based manner with fewer clicks. Paco Aubrejuan, our PeopleSoft GM, and Steve Miranda, the SVP for Fusion Applications, then discussed our plans around Oracle's Application Investment Strategy.  This included our continued investment in developing both PeopleSoft and Fusion as well as the co-existence strategy with new Fusion Apps integrating to PeopleSoft Apps. Should you want to view this presentation, a recording is available. Jeff Robbins, our lead PeopleTools Strategist, provided the roadmap for PeopleTools and discussed our continuing plan to deliver annual releases to further evolve the user experience. Numerous examples were highlighted with the Navigation techniques I mentioned previously. Jeff also provided a lot of food for thought around Lifecycle Management topics and how to remain current on releases with a  lower cost of ownership. Dennis Mesler, from Boise, was the guest speaker in this slot, who spoke about the new PeopleSoft Test Framework (PTF). Regression Testing is a key cost component when product updates are applied. This new tool (which is free to all PeopleSoft customers as part of PeopleTools 8.51) provides a meta data driven approach to recording and executing test scripts. 
Coupled with what our Usage Monitor enables, PTF provides our customers a powerful tool to lower costs and manage product updates more efficiently and at the time of their choosing. Beyond the general session, we broke out into the individual CABs: HCM, Financials, ESA/ALM, SRM, SCM, CRM, and PeopleTools/ Technology. A day and half of very engaging discussions around our plans took place for each product pillar. More about that to follow in future posts.      We capped the first day with a reception sponsored by our partners: InfoSys, SmartERP (represented by Doris Wong), and Grey Sparling  Solutions (represented by Chris Heller and Larry Grey). Great to see these old friends actively engaged in the very busy PeopleSoft ecosystem!   Jeff Robbins previews the roadmap for PeopleTools with the PeopleSoft CAB  

    Read the article

  • How to create scripts that create other scripts

    - by sfrj
    I am writing a script that needs to generate another script that will be used to shut down an app server... This is what my code looks like:

        echo "STEP 8: CREATE STOP SCRIPT"
        stopScriptContent="echo \"STOPING GLASSFISH PLEASE WAIT...\"\n
        cd glassfish4/bin\n
        chmod +x asadmin\n
        ./asadmin stop-domain\n
        #In order to work it is required that the original folder of glassfish don't contain already any
        #project, otherwise, there will be a conflict\n"
        ${stopScriptContent} > stop.sh
        chmod +x stop.sh

    But it is not being created correctly; this is what the output stop.sh looks like:

        "STOPING GLASSFISH PLEASE WAIT..."\n
        cd glassfish4/bin\n
        chmod +x asadmin\n
        ./asadmin stop-domain\n
        #In order to work it is required that the original folder of glassfish don't contain already any
        #project, otherwise, there will be a conflict\n

    As you see, lots of things are wrong:

    - there is no echo command
    - it is taking the \n literally, so there is no new line

    My doubts are: What is the correct way of making an .sh script create another .sh script? What do you think I am doing wrong?

    Read the article

  • Ubuntu Preseed set Norwegian Keyboard?

    - by Vangelis Tasoulas
    It's been a couple of days now that I have been trying to make a fully automated unattended installation. I managed to make it work with Ubuntu/Cobbler and a preseed file, but I cannot set the correct keyboard layout, which is Norwegian in this case. I am doing the tests on a virtual machine, and when I go with a normal manual installation (no preseed) everything works fine. When I use the preseed file, I always end up with an "English (US)" keyboard no matter which of the many different options I have tried. I can change it manually with the "dpkg-reconfigure keyboard-configuration" command, but that's not the point; it should be handled automatically using the preseed file. I am using DEBCONF_DEBUG=5 when GRUB is loading, and as I see in the "/var/log/installer/syslog" file after the installation has finished, the preseeding commands are accepted. Can anyone help with this? The preseed file I am using is the following:

        d-i debian-installer/country string NO
        d-i debian-installer/language string en_US:en
        d-i debian-installer/locale string en_US.UTF-8
        d-i console-setup/ask_detect boolean false
        d-i keyboard-configuration/layout select Norwegian
        d-i keyboard-configuration/variant select Norwegian
        d-i keyboard-configuration/modelcode string pc105
        d-i keyboard-configuration/layoutcode string no
        d-i keyboard-configuration/xkb-keymap select no
        d-i netcfg/choose_interface select auto
        d-i netcfg/get_hostname string myhostname
        d-i netcfg/get_domain string simula.no
        d-i hw-detect/load_firmware boolean true
        d-i mirror/country string manual
        d-i mirror/http/hostname string ftp.uninett.no
        d-i mirror/http/directory string /ubuntu
        d-i mirror/http/proxy string http://10.0.1.253:3142/
        d-i mirror/codename string precise
        d-i mirror/suite string precise
        d-i clock-setup/utc boolean true
        d-i time/zone string Europe/Oslo
        d-i clock-setup/ntp boolean true
        d-i clock-setup/ntp-server string 10.0.1.254
        d-i partman-auto/method string lvm
        partman-auto-lvm partman-auto-lvm/new_vg_name string vg0
        d-i partman-auto/purge_lvm_from_device boolean true
        d-i partman-lvm/device_remove_lvm boolean true
        d-i partman-md/device_remove_md boolean true
        d-i partman-lvm/confirm boolean true
        d-i partman-lvm/confirm_nooverwrite boolean true
        d-i partman-auto-lvm/guided_size string max
        d-i partman-auto/choose_recipe select 30atomic
        d-i partman/default_filesystem string ext4
        d-i partman-partitioning/confirm_write_new_label boolean true
        d-i partman/choose_partition select finish
        d-i partman/confirm boolean true
        d-i partman/confirm_nooverwrite boolean true
        d-i partman/mount_style select uuid
        d-i passwd/root-login boolean false
        d-i passwd/make-user boolean true
        d-i passwd/user-fullname string vangelis
        d-i passwd/username string vangelis
        d-i passwd/user-password-crypted password $6$asdafdsdfasdfasdf
        d-i passwd/user-uid string
        d-i user-setup/allow-password-weak boolean false
        d-i passwd/user-default-groups string adm cdrom dialout lpadmin plugdev sambashare
        d-i user-setup/encrypt-home boolean false
        d-i apt-setup/restricted boolean true
        d-i apt-setup/universe boolean true
        d-i apt-setup/backports boolean true
        d-i apt-setup/services-select multiselect security
        d-i apt-setup/security_host string security.ubuntu.com
        d-i apt-setup/security_path string /ubuntu
        tasksel tasksel/first multiselect Basic Ubuntu server, OpenSSH server
        d-i pkgsel/include string build-essential htop vim nmap ntp
        d-i pkgsel/upgrade select safe-upgrade
        d-i pkgsel/update-policy select none
        d-i pkgsel/updatedb boolean true
        d-i grub-installer/only_debian boolean true
        d-i grub-installer/with_other_os boolean true
        d-i finish-install/keep-consoles boolean false
        d-i finish-install/reboot_in_progress note
        d-i cdrom-detect/eject boolean true
        d-i debian-installer/exit/halt boolean false
        d-i debian-installer/exit/poweroff boolean false

    Read the article

  • Floppy Autoloader Automatically Archives Thousands of Floppies

    - by Jason Fitzpatrick
    The thought of hand-loading 5,000 floppy disks is more than enough to drive an inventive geek to create a better alternative, like this automated floppy disk archiver. DwellerTunes has several crates of floppy disks that contain old Amiga software and related material, personal programming projects, personal documents, and more. Realistically there's no way he could devote time to hand-loading and archiving thousands upon thousands of floppy disks, so he built an automated loader that accepts stacks of several hundred floppy disks at a time. The loader not only loads and archives the floppy disks, but also photographs the label of each disk so that each archive includes a picture of the original label. Watch the video above to see it in action and then hit up the link below for more information. Converting All My Amiga Disks [DwellerTunes via Make]

    Read the article

  • Automated build platform for .NET portfolio - best choice?

    - by jkohlhepp
    I am involved with maintaining a fairly large portfolio of .NET applications. Also in the portfolio are legacy applications built on top of other platforms - native C++, ECLIPS Forms, etc. I have a complex build framework on top of NAnt right now that manages the builds for all of these applications. The build framework uses NAnt to do a number of different things:

    - Pull code out of Subversion, as well as create tags in Subversion
    - Build the code, using MSBuild for .NET or other compilers for other platforms
    - Peek inside AssemblyInfo files to increment version numbers
    - Delete certain files that shouldn't be included in builds / releases
    - Release code to deployment folders
    - Zip code up for backup purposes
    - Deploy Windows services; start and stop them
    - Etc.

    Most of those things can be done with just NAnt by itself, but we did build a couple of extension tasks for NAnt to do some things that were specific to our environment. Also, most of those processes above are genericized and reused across a lot of our different application build scripts, so that we don't repeat logic. So it is not simple NAnt code, and not simple build scripts. There are dozens of NAnt files that come together to execute a build. Lately I've been dissatisfied with NAnt for a couple of reasons: (1) its syntax is just awful - programming languages on top of XML are really horrific to maintain; (2) the project seems to have died on the vine; there haven't been a ton of updates lately and it seems like no one is really at the helm. Trying to get it working with .NET 4 has caused some pain points due to this lack of activity. So, with all of that background out of the way, here's my question. Given some of the things that I want to accomplish based on that list above, and given that I am primarily in a .NET shop, but I also need to build non-.NET projects, is there an alternative to NAnt that I should consider switching to? Things on my radar include PowerShell (with or without psake), MSBuild by itself, and rake. These all have pros and cons. For example, is MSBuild powerful enough? I remember using it years ago and it didn't seem to have as much power as NAnt. Do I really want to have my team learn Ruby just to do builds using rake? Is psake really mature enough of a project to pin my portfolio to? Is PowerShell "too close to the metal", so that I'll end up having to write my own build library akin to psake to use it on its own? Are there other tools that I should consider? If you were involved with maintaining a .NET portfolio of significant complexity, what build tool would you be looking at? What does your team currently use?

    Read the article

  • Policy Implementation is Damaging Organizations: Economist Intelligence Unit

    - by michael.seback
    Read new research revealing the hidden risks of inefficient policy implementation

    The frenetic pace of regulatory and legislative change means public and private sector organizations must continuously update internal policies - in particular, those associated with decision making and disbursements. Yet with policy management efforts alarmingly under-resourced and under-funded, the risk and cost of non-compliance - and their associated implications - are growing daily. To find out how inefficient policy management could be putting your business at risk, read your complimentary copy of the full EIU paper - Enabling Efficient Policy Implementation - today.

    Read the article

< Previous Page | 6 7 8 9 10 11 12 13 14 15 16 17  | Next Page >