Search Results

Search found 49 results on 2 pages for 'iseries'.


  • Using text and ntext SQL data types in RPG

    - by David Stratton
    I'll preface this by saying that I'm a .NET developer, and am NOT an RPG developer. I'm working with one of our RPG developers to come up with a solution, so any suggestions you provide will get passed to him. We have a scenario where we want our iSeries to read from a SQL Server database. One of the columns is a TEXT column. In RPG, there is no equivalent data type to use for this. We've gone back and forth on this, and our current plan is to change course and have our SQL Server write out a text file, which the iSeries can pick up and parse. This is, however, a last resort, as the data in the file is sensitive, and we'd like to avoid the additional security overhead. We've already got the SQL Server locked down as tight as possible (only one user has read access to this, and that user is an iSeries user), and we don't want to have to worry about transferring files back and forth. However, at this point, we see no other option. We have no in-house Java developers, and need to do this in RPG. So I'm wondering if there are any RPG developers out there who have faced this situation and have any advice.
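
    One approach worth passing along to the RPG developer, sketched here only as an assumption (the view and column names below are hypothetical, not from the original post): expose the TEXT column on the SQL Server side through a view that casts it to VARCHAR, so the query coming from the iSeries only ever sees a data type RPG can declare.

        -- SQL Server side (hypothetical names): present the TEXT column as VARCHAR
        -- so a query from the iSeries returns a type RPG can map to a character field.
        CREATE VIEW dbo.SensitiveData_ForISeries AS
        SELECT id,
               CAST(long_text_col AS VARCHAR(8000)) AS long_text_varchar -- truncates beyond 8000 chars
        FROM dbo.SensitiveData;

    Anything longer than 8000 characters would have to be read in chunks (for example via SUBSTRING), which is one reason this is only a sketch and not a drop-in answer.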

    Read the article

  • Apache fop-0.95 error on FopFactory.newInstance() command

    - by FlexInfoSys
    I am using Apache fop-0.95 to build PDF files from a JSP web application on IBM iSeries V5R4 using WebSphere 6.0. Everything works perfectly in my development environment using WebSphere Development Studio Client. When I put the application on the server, I get an error at this line: FopFactory fopFactory = FopFactory.newInstance(); The error is: java.lang.UnsatisfiedLinkError: javax/imageio/ImageIO Does anyone know how I can fix this error? All of the FOP class files are part of the EAR file. The files were installed to the project's \WEB-INF\lib directory. I have added the FOP jar files to the classpath using the admin console. I am running IBM WebSphere Application Server - Express, 6.0.2.9 Build Number: cf90614.22 on IBM iSeries V5R4.

    Read the article

  • Centralized Credentials Service For Various Apps

    - by Vlad
    We are researching the possibility of building a centralized credentials store for internal applications. These apps (VB6, VB.NET, web apps in ASP.NET, etc.) use various instances of SQL Server and iSeries. We want to implement a central credentials facility that would act as a security broker. Basically it should work like this: a client app supplies an AppID ("I am the Sales application") and an EnvironmentID ("I am running in the QA environment") and in return will get either a connection object (preferred) or an encrypted connection string that will allow said application to connect to the resources it needs. There will be cases when an application needs to connect to two (or more) database resources (i.e. to SQL Server and iSeries). We are looking at DPAPI at the moment, but I am not convinced that DPAPI is the solution, as it is tied to the machine key. In our case using the machine key isn't feasible, so I want to know if there are other approaches available.

    Read the article

  • How can I connect to AS/400 through TN5250?

    - by Swati Sarkar
    How do I find out my iSeries server name? I checked with "nslookup"; it returns one IP address, but when I tried to connect a TN5250 session to it I could not connect. From the DOS command line:
    c:\nslookup
    default server : unknown
    ip address : 192.168.50.119
    Then I tried pinging this IP address, and it replies from the above address. But when I enter this IP in the TN5250 session, it says it cannot create a connection to the AS/400.

    Read the article

  • LDAP RBAC model

    - by typo
    Hi, can anybody tell me about best practices for modeling RBAC in LDAP? I'm very confused; I'm not sure if I should think of LDAP groups as roles, or just place users in some custom OU. Any real-life examples with a tasks/operations, roles, and users scheme (one user, multiple roles per user, multiple operations/tasks per role)? BTW: target systems are .NET, Java and iSeries.

    Read the article

  • Pros and Cons of ASNA Visual RPG (AVR)

    - by mga911
    Have you had any experience with ASNA Visual RPG for Visual Studio 2005/2008? I'm looking for some feedback on this product. I'm especially curious how it compares to other methods of accessing files and programs on IBM's System i (formerly known as iSeries, AS/400) server. Thanks!

    Read the article

  • Delphi TBytesField - How to see the text properly - Source is HIT OLEDB AS400

    - by myitanalyst
    We are connecting to a multi-member AS400 iSeries table via HIT OLEDB and HIT ODBC. You connect to this table via an alias to access a specific member. We create the alias on the AS400 this way: CREATE ALIAS aliasname FOR table(membername) We can then query each member of the table this way: SELECT * FROM aliasname We are testing this in Delphi 6 first, but will move it to D2010 later. We are using HIT OLEDB for the AS400. We are pulling down records from a table and the field is being seen as a tBytesField. I have also tried the ODBC driver and it sees the field as a tBytesField as well. Directly on the AS400 I can query the data and see readable text. I can use the iSeries Navigator tool and see readable text as well. However, when I bring it down to the Delphi client via HIT OLEDB or HIT ODBC and try to view it via asString, I just see unreadable text, something like this: ñðð@ðõñððððñ÷@õôððõñòøóóöøñðÂÁÕÒ@ÖÆ@ÁÔÅÙÉÃÁ@@@@@@@@ÂÈÙÉâãæÁðòñè@ÔK@k@ÉÕÃK@@@@@@@@@ç I jumbled up the text above, but those are the kinds of characters that show up. When I did a test in D2010 the text looks like Japanese or Chinese characters, but if I display it as AnsiString then it looks like it does in Delphi 6. I am thinking this may have something to do with code pages or character sets, but I have no experience in this area, so it is new to me. When I look at the Coded Character Set on the AS400 it is set to 65535. What do I need to do to make this text readable? We do have a third-party component (Delphi400) that makes things behave in a more native AS400 manner. When I use its AS400 connection and AS400 query components it shows the field as a tStringField and displays just fine. BUT we are phasing out this product (for a number of reasons) and would really like the OLEDB with the ADO components to work. Just for clarification, the HIT OLEDB with tADOQuery does show some fields as tStringFields for many of the other tables we use... not sure why it is showing as a tBytesField in this case. I am not an AS400 expert, but looking at the field definitions on the AS400, the ones showing up as tBytesFields look the same as the ones showing up as tStringFields... but there must be a difference. Maybe due to being a multi-member? So... does anyone have any guidance on how to get the correct string data that is readable? If you need more info please ask. Greg
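
    A CCSID of 65535 flags the field as binary, so the drivers skip EBCDIC-to-ASCII translation and hand back raw bytes, which matches the unreadable text above. Assuming the bytes really are character data, one option is to cast the column with an explicit CCSID in the query so the driver receives translatable text. The column name and CCSID below are assumptions (CCSID 37 is US English EBCDIC; substitute whatever the system actually uses):

        -- DB2 for i: treat the CCSID 65535 (binary) column as EBCDIC character data
        -- so the client driver can translate it for display.
        SELECT CAST(raw_column AS CHAR(40) CCSID 37) AS readable_column
        FROM aliasname;

    It is also worth checking whether the HIT or IBM drivers expose a "translate CCSID 65535 / binary data" connection option, which achieves the same translation without changing the query.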

    Read the article

  • WCF Test Client Service Failed to Invoke

    - by renjucool
    I'm connecting to an IBM iSeries database in a C# application using the ADO.NET iSeries drivers. My WCF service is throwing this error: "HTTP could not register URL http://+:80/Temporary_Listen_Addresses/771be107-8fa3-46f9-ac01-12c4d503d01e/ because TCP port 80 is being used by another application."
    <system.serviceModel>
      <client>
        <endpoint address="http://localhost:8080/Design_Time_Addresses/DataAccessLayer/POEngine/" binding="wsDualHttpBinding" bindingConfiguration="" contract="POServiceReference.IPOEngine" name="MyPoBindings">
          <identity>
            <dns value="localhost" />
          </identity>
        </endpoint>
        <endpoint address="http://localhost:8080/Design_Time_Addresses/DataAccessLayer/POEngine/" binding="wsDualHttpBinding" bindingConfiguration="PoBindings1" contract="POServiceReference.IPOEngine" name="PoBindings">
          <identity>
            <dns value="localhost" />
          </identity>
        </endpoint>
      </client>
      <bindings>
        <wsDualHttpBinding>
          <binding name="PoBindings1" closeTimeout="00:01:00" openTimeout="00:01:00" receiveTimeout="00:10:00" sendTimeout="00:01:00" bypassProxyOnLocal="false" transactionFlow="false" hostNameComparisonMode="StrongWildcard" maxBufferPoolSize="524288" maxReceivedMessageSize="65536" messageEncoding="Text" textEncoding="utf-8" useDefaultWebProxy="true">
            <readerQuotas maxDepth="32" maxStringContentLength="8192" maxArrayLength="16384" maxBytesPerRead="4096" maxNameTableCharCount="16384" />
            <reliableSession ordered="true" inactivityTimeout="00:10:00" />
            <security mode="Message">
              <message clientCredentialType="Windows" negotiateServiceCredential="true" algorithmSuite="Default" />
            </security>
          </binding>
        </wsDualHttpBinding>
        <wsHttpBinding>
          <binding name="PoBindings" />
        </wsHttpBinding>
      </bindings>
      <services>
        <service behaviorConfiguration="DataAccessLayer.POEngineBehavior" name="DataAccessLayer.POEngine">
          <endpoint address="" binding="wsDualHttpBinding" bindingConfiguration="" name="PoBindings" contract="DataAccessLayer.IPOEngine">
            <identity>
              <dns value="localhost" />
            </identity>
          </endpoint>
          <endpoint address="mex" binding="mexHttpBinding" contract="IMetadataExchange" />
          <host>
            <baseAddresses>
              <add baseAddress="http://localhost:8080/Design_Time_Addresses/DataAccessLayer/POEngine/" />
            </baseAddresses>
          </host>
        </service>
      </services>
      <behaviors>
        <endpointBehaviors>
          <behavior name="NewBehavior" />
        </endpointBehaviors>
        <serviceBehaviors>
          <behavior name="DataAccessLayer.Service1Behavior">
            <serviceMetadata httpGetEnabled="true" />
            <serviceDebug includeExceptionDetailInFaults="false" />
          </behavior>
          <behavior name="DataAccessLayer.POEngineBehavior">
            <serviceMetadata httpGetEnabled="true" />
            <serviceDebug includeExceptionDetailInFaults="false" />
          </behavior>
        </serviceBehaviors>
      </behaviors>
    </system.serviceModel>
    I also have the problem without using any IBM data access; a plain Hello World service throws the same error. Where do I need to change the port address?

    Read the article

  • DB2 UDF Permissions

    - by WernerCD
    I have a custom function that I'm working on... the problem I'm having is simple: permissions. Example function:
    drop function circle_area
    go
    CREATE FUNCTION circle_area (radius FLOAT)
    RETURNS FLOAT
    LANGUAGE SQL
    BEGIN
      DECLARE pi FLOAT DEFAULT 3.14;
      DECLARE area FLOAT;
      SET area = pi * radius * radius;
      RETURN area;
    END
    GO
    If I then log out of my "admin" account and log into a test account, I get a "Not authorized" error when I try to run something like "SELECT circle_area(foo) FROM library.bar". I can log into iSeries Navigator, navigate to the schema's function permissions, and change the permission for public from Exclude to All. Bam, it works. How do I grant permission to all, either in the CREATE FUNCTION or after?
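
    For the "after" case, the usual approach is a GRANT on the function. A minimal sketch, assuming the function lives in a schema called MYLIB (substitute the real schema, and use the SPECIFIC name if the function is overloaded):

        -- Let every user profile run the function.
        GRANT EXECUTE ON FUNCTION MYLIB.CIRCLE_AREA TO PUBLIC;

        -- If CIRCLE_AREA is overloaded, grant on the specific instance instead
        -- (this assumes the specific name is also CIRCLE_AREA).
        GRANT EXECUTE ON SPECIFIC FUNCTION MYLIB.CIRCLE_AREA TO PUBLIC;

    The same GRANT can be scripted immediately after the CREATE FUNCTION so a rebuild of the function picks the permission up automatically.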

    Read the article

  • Enable System Beep in Ubuntu

    - by Melissa W
    I have tried and tried to get the system beep working, but with no success. I have selected System--Sound--System Beep--Enable Audible Beep (from the GNOME desktop). I have tried, from a Terminal window, Edit--General Tab--selecting the terminal bell checkbox. I have tried entering modprobe pcspkr at the command line. Trying echo -e '\a' or using the beep application - nothing works! I know my hardware speaker works, because if the battery is low on startup it will beep. Update: It is a laptop computer, an IBM ThinkPad i Series. I did look at the modprobe blacklist, and pcspkr was not listed.

    Read the article

  • Aldon and .Net Development

    - by David Stratton
    I'm looking for feedback from .NET developers who have experience with Aldon as a lifecycle management platform. We're seriously considering using Aldon for lifecycle management, including source control, automated builds, etc. I know there are a lot of other options out there, but ours is primarily an AS/400 shop (with AS/400 programmers outnumbering .NET developers 6 to 1), and Aldon is already used by our iSeries team. The benefit we're looking for is having one lifecycle management suite. Basically, I'm looking for opinions from people who have used Aldon and another set of tools (perhaps TFS, or a combination of SVN, Cruise Control, etc.). If you've worked with both, do you have a recommendation on whether this is a good idea or a bad idea? It's obviously a big choice, so any feedback would be helpful.

    Read the article

  • Developing for Windows CE platform?

    - by grmbl
    I'm looking into creating some applications for workers to use on the shop floor. They'll be using Psion NEO devices running Windows CE 5.0. My skill set allows for C#, PHP, and ASP.NET (+ web services). Application requirements:
    - should connect to our ERP system running on IBM iSeries (AS400).
    - should run in fullscreen (effectively hiding the OS).
    - usability: touch functionality.
    I have tried the following. Full winform application run through an RDP session:
    [+] easy deployment using a .rdp file.
    [+] application can be run on a desktop environment too.
    [+] RDP host can easily access DB2 using IBM drivers.
    [+] GUI works OK on the small screen.
    [-] environment = terminal server (which is already under heavy use).
    Full winform application running on the device OS:
    [+] environment = local.
    [+] responsive.
    [-] must use a web service to access DB2.
    [-] deployment...
    [-] fixed platform (no desktop).
    Console application running on the device OS:
    [+] environment = local.
    [+] very responsive.
    [-] must use a web service to access DB2.
    [-] no fullscreen or other window options?
    [-] deployment...
    [-] fixed platform (no desktop).
    I'm considering creating a web application, but it seems the OS comes with IE 5? I don't want to alter the OS in any way (install other browsers, etc.). I would like an application that's responsive, easy to deploy, fullscreen, and optionally multi-platform. I have seen handheld devices using terminal (emulation?) with a console-like interface. This seems to be native to the device, but I'm afraid it requires modest knowledge of C++? It seems that using RDP is the way to go, but I came here for advice and am looking for people who have been in the same situation and are willing to share their experience. There do not seem to be many "best practices" on the web that could help me decide the best way of working. Greetings

    Read the article

  • Maximum Availability with Oracle GoldenGate

    - by Irem Radzik
    Oracle Database offers a variety of built-in and optional products for maximum availability, and it is well known for its robust high availability and disaster recovery solutions. With its heterogeneous, real-time, transactional data movement capabilities, Oracle GoldenGate is a key part of the Maximum Availability Architecture for Oracle Database. This week, on Thursday Dec. 13th, we will be presenting in a live webcast how Oracle GoldenGate fits into Oracle Database Maximum Availability Architecture (MAA). Joe Meeks from the Oracle Database High Availability team will discuss how Oracle GoldenGate complements other key products within MAA such as Active Data Guard. Nick Wagner from the GoldenGate PM team will present how to upgrade to the latest Oracle Database release without any downtime. Nick will also cover 2 new features of Oracle GoldenGate 11gR2: Integrated Capture for Oracle Database and Automated Conflict Detection and Resolution. Nick will provide an in-depth review of these new features with examples. Oracle GoldenGate also offers maximum availability for non-Oracle databases, such as HP NonStop, SQL Server, DB2 (LUW, iSeries, or zSeries) and more. The same robust, reliable real-time, bidirectional data movement capabilities apply to all supported databases. I'd like to invite you to join us on Thursday Dec. 13th 10am PT/1pm ET to hear from the product experts on how to use GoldenGate for maximizing database availability and to ask your questions. You can find the registration link below.
    Webcast: Maximum Availability with Oracle GoldenGate
    Thursday Dec. 13th 10am PT/1pm ET
    I look forward to another great webcast with lots of interaction with the audience.

    Read the article

  • php_ibm_db2.dll on IIS 7.5 using PHP 5.3 error message

    - by grmbl
    I'm trying to use the ibm_db2 extension to access an iSeries DB2 database. This is the test code (taken from here):
    <?php
    $database = 'ALI452BFAL'; // library
    $user = 'STN452';
    $password = '**********';
    $hostname = 'myserverip';
    $port = 50000;
    $conn_string = "DRIVER={IBM DB2 ODBC DRIVER};DATABASE=$database;" .
                   "HOSTNAME=$hostname;PORT=$port;PROTOCOL=TCPIP;UID=$user;PWD=$password;";
    $conn = db2_connect($conn_string, '', '');
    if ($conn) {
        print "ok";
        db2_close($conn);
    } else {
        echo db2_conn_error() . '<br>' . db2_conn_errormsg();
    }
    ?>
    I have installed a very basic package containing the DB2 driver and added it as an extension (IBM Data Server Driver for ODBC, CLI, and .NET.msi). This is my result:
    08001 [IBM][CLI Driver] SQL30081N A communication error has been detected. Communication protocol being used: "TCP/IP". Communication API being used: "SOCKETS". Location where the error was detected: "10.10.0.120". Communication function detecting the error: "connect". Protocol specific error code(s): "10061", "", "". SQLSTATE=08001 SQLCODE=-30081
    Has anybody tried this before?

    Read the article

  • How do I set up a Windows NFS share so that I can view its contents on Linux?

    - by hewhocutsdown
    My NFS server is a Windows XP SP3 box with the Microsoft Windows Services for Unix installed. I have a share configured under C:\NFS with the share name NFS and ANSI encoding. Anonymous access is enabled, with the anon UID/GID set to 0/0. Additionally, I've set ALL MACHINES to Read-Write, and checked the checkbox to Allow root access. My first NFS client is a Ubuntu 10.04 box, with nfs-common installed. Running sudo mount -t nfs 1.1.1.1:/NFS /home/user/NFS succeeds, but when I attempt to view the folder (even as root), it tells me that I do not have the permissions necessary to view the contents of the folder. My second NFS client is an IBM iSeries box running OS/400 V5R3. I used the mount command below: MOUNT TYPE(*NFS) MFS('1.1.1.1:/NFS') MNTOVRDIR('/PARENT/NFS') OPTIONS('rw,nosuid,retry=5,rsize=8096,wsize=8096,timeo=20,retrans=2,acregmin=30,acregmax=60,acdirmin=30,acdirmax=60,soft') CODEPAGE(*BINARY *ASCII) which also mounts successfully. Attempting to WRKLNK '/PARENT/NFS' and use Option 5 to enter the directory yields a Not authorized to object error - even though I am a security officer with the *ALLOBJ special authority. My gut says that it's a problem with the Windows share, but I don't know what it could be. Do you have any suggestions?

    Read the article

  • How do I install DB2 ODBC?

    - by Justin
    I have been trying, with no success, to install an IBM DB2 ODBC driver so that my PHP server can connect to a database. I've tried installing db2_connect and got all sorts of problems, and I tried installing iSeries Access for Linux, but the RPM did not install right, nor did using alien yield any useful results. I've also tried the DB2 Runtime v8.1, with no success. If I attempt to run the rpm, it claims I need dependencies that I can't find in apt-get. Yum is also not very helpful, as it appears I don't have any repositories or lists installed... Running the simple RPM gives me this result in the terminal:
    # rpm -ivh iSeriesAccess-7.1.0-1.0.x86_64.rpm
    rpm: RPM should not be used directly install RPM packages, use Alien instead!
    rpm: However assuming you know what you are doing...
    error: Failed dependencies:
    /bin/ln is needed by iSeriesAccess-7.1.0-1.0.x86_64
    /sbin/ldconfig is needed by iSeriesAccess-7.1.0-1.0.x86_64
    /bin/rm is needed by iSeriesAccess-7.1.0-1.0.x86_64
    /bin/sh is needed by iSeriesAccess-7.1.0-1.0.x86_64
    libc.so.6()(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64
    libc.so.6(GLIBC_2.2.5)(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64
    libc.so.6(GLIBC_2.3)(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64
    libdl.so.2()(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64
    libdl.so.2(GLIBC_2.2.5)(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64
    libgcc_s.so.1()(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64
    libm.so.6()(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64
    libm.so.6(GLIBC_2.2.5)(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64
    libodbcinst.so.1()(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64
    libodbc.so.1()(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64
    libpthread.so.0()(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64
    libpthread.so.0(GLIBC_2.2.5)(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64
    libpthread.so.0(GLIBC_2.3.2)(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64
    librt.so.1()(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64
    librt.so.1(GLIBC_2.2.5)(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64
    libstdc++.so.6()(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64
    libstdc++.so.6(CXXABI_1.3)(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64
    libstdc++.so.6(GLIBCXX_3.4)(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64
    Using alien and running dpkg gives me this headache:
    $ alien iSeriesAccess-7.1.0-1.0.x86_64.rpm --scripts
    # dpkg -i iseriesaccess_7.1.0-2_amd64.deb
    (Reading database ... 127664 files and directories currently installed.)
    Preparing to replace iseriesaccess 7.1.0-2 (using iseriesaccess_7.1.0-2_amd64.deb) ...
    Unpacking replacement iseriesaccess ...
    post uninstall processing for iSeriesAccess 1.0...upgrade
    /var/lib/dpkg/info/iseriesaccess.postrm: line 8: [: upgrade: integer expression expected
    Setting up iseriesaccess (7.1.0-2) ...
    post install processing for iSeriesAccess 1.0...configure
    iSeries Access ODBC Driver has been deleted (if it existed at all) because its usage count became zero
    odbcinst: Driver installed. Usage count increased to 1. Target directory is /etc
    odbcinst: Driver installed. Usage count increased to 3. Target directory is /etc
    Processing triggers for libc-bin ...
    ldconfig deferred processing now taking place
    So it seems the files installed right; my ODBC driver shows up, but db2cli.ini is nowhere to be found. So, several questions. Is there a better alternative for connecting PHP to DB2, say an Ubuntu package I can just install?
    Can someone direct me to the steps that make my Ubuntu server work with the RPM so I can build my DB2 instance? Also remember I'm connecting to an iSeries remotely. I'm not using DB2 Express-C, even though I did try it to get the db2 PHP functions to work. And I don't have Zend, but I think I have every other package from the Ubuntu repositories. Help, thank you!

    Read the article

  • It was worth the wait… Welcome Oracle GoldenGate 11g Release 2

    - by Irem Radzik
    It certainly was worth the wait to meet Oracle GoldenGate 11gR2, because it is full of new features on multiple fronts. In fact, this release has the longest and strongest list of new features in Oracle GoldenGate’s history. The new release brings GoldenGate closer to the Oracle Database while expanding the support for global implementations and heterogeneous systems. It is more secure, more flexible, and faster. We announced the availability of Oracle GoldenGate 11gR2 via a press release. If you haven’t seen it yet, please check it out. As covered in this announcement, there are a variety of improvements in the product:
    • Integrated Capture for Oracle Database: brings Oracle GoldenGate’s Capture process closer to the Oracle Database engine and enables support for Advanced Compression, among other benefits.
    • Enhanced Conflict Detection & Resolution: speeds and simplifies the conflict detection and resolution process for Active-Active deployments.
    • Globalization: Oracle GoldenGate can be deployed for databases that use multi-byte/Unicode character sets.
    • Security and Performance Improvements: includes support for the Federal Information Processing Standard (FIPS).
    • Increased Extensibility: kick off actions based on an event record in the transaction log or in the Trail file.
    • Integration with Oracle Enterprise Manager 12c, in addition to the Oracle GoldenGate Monitor product.
    • Expanded Heterogeneity: including capture from IBM DB2 for i on iSeries (AS/400) and delivery to Postgres.
    We will explain these new features in more detail at our upcoming launch webcast: Harness the Power of the New Release of Oracle GoldenGate 11g (Sept 12 8am/10am PT). In addition to learning more about these new features, the webcast will allow you to ask your questions to product management via a live Q&A section. So, I hope you will not miss this opportunity to explore the new release of Oracle GoldenGate 11g and see how it can deliver enterprise-class real-time data integration solutions. I look forward to a great webcast to unveil GoldenGate’s new capabilities.

    Read the article

  • Oracle GoldenGate 11g Release 2 Launch Webcast Replay Available

    - by Irem Radzik
    For those of you who missed the Oracle GoldenGate 11g Release 2 launch webcasts last week, the replay is now available from the following URL: Harnessing the Power of the New Release of Oracle GoldenGate 11g. I would highly recommend watching the webcast to meet many of the new features of the release and hear the product management team respond to questions from the audience in a nice, long Q&A section. In my blog last week I listed the media coverage for this new release. There is a new article published by ITJungle talking about Oracle GoldenGate’s heterogeneity and support for DB2 for iSeries: Oracle Completes DB2/400 Support in Data Replication Tool. As mentioned in last week’s blog, we received over 150 questions from the audience, and in this blog I'd like to continue to post some of the frequently asked questions and their answers:
    Question: What are the fundamental differences between classic data capture and integrated data capture? Do both use the redo logs in the source database?
    Answer: Yes, they both use redo logs. Classic capture parses the redo log data directly, whereas Integrated Capture lets the Oracle database parse the redo log record using an internal API.
    Question: Does the GoldenGate version need to match the Oracle Database version?
    Answer: No, they are not directly linked. Oracle GoldenGate 11g Release 2 supports Oracle Database version 10gR2 as well. For Oracle Database version 10gR1 and Oracle Database version 9i you will need GoldenGate 11g Release 1 or lower. And for Oracle Database 8i you need Oracle GoldenGate 10 or earlier versions.
    Question: If I already use Data Guard, do I need GoldenGate?
    Answer: Data Guard is designed as the best disaster recovery solution for Oracle Database. If you would like to implement a bidirectional Active-Active replication solution or need to move data between heterogeneous systems, you will need GoldenGate.
    Question: On compression and GoldenGate, if the source uses compression, is it required that the target also use compression?
    Answer: No, the source and target do not need to have the same compression settings.
    Question: Does GoldenGate support the Advanced Security Option on the source database?
    Answer: Yes, it does.
    Question: Can I use GoldenGate to upgrade the Oracle Database to 11g and do an OS migration at the same time?
    Answer: Yes, this is a very common project where GoldenGate can eliminate downtime, give flexibility to test the target as needed, and minimize risks with a fail-back option to the old environment. For more information on database upgrades please check out the following white papers: Best Practices for Migrating/Upgrading Oracle Database Using Oracle GoldenGate 11g and Zero-Downtime Database Upgrades Using Oracle GoldenGate.
    Question: Does GoldenGate create any trigger in the source database at table level or row level for real-time data integration?
    Answer: No, GoldenGate does not create triggers.
    Question: Can transformation be done after insert to the destination table, or does it need to be done before?
    Answer: It can happen in the Capture (Extract) process, in the Delivery (Replicat) process, or in the target database. For more resources on Oracle GoldenGate 11gR2 please check out our Oracle GoldenGate 11gR2 resource kit as well.

    Read the article

  • Switching from SourceSafe - What to look for in a product

    - by asp316
    We're looking to move off of SourceSafe and on to a more robust source control system for our .NET apps. We're also looking for scripted/automated deployments. I'm a .NET developer (web and WinForms). However, most of our development staff is RPG for the IBM iSeries, and the devs use Aldon's LMI for source control and deployment. Our manager would prefer to stick with Aldon so all of our products are in the same system. However, I don't have experience with Aldon's products on the .NET side. I've used TFS and Subversion with Tortoise a bit, but not enough to recommend one or the other, especially in comparison to Aldon's product. Does anybody have experience with Aldon's products? If so, thoughts, please? Also, other than the obvious things source control systems do, are there things I should avoid or are there must-haves? I'm open to any system. A bit of background: I'm the only .NET dev in our company, but I let operations do the deployments. I do want the ability to support concurrent checkouts if we hire a new dev.

    Read the article

  • Producing a Form from an Overlay from Reporting Services rdlc Reports

    - by Mike Wills
    I am not sure what you call it in other technologies; on the IBM i (or iSeries) we call them overlays. An overlay is an image of a form that is stored on the server; a program then generates the form with fields from the database so you can eliminate preprinted forms. I had a problem last year with the method I was trying at the time. It was a rush job, to be revisited at a later point, and the work-around was to export to PDF. So now it is "later" and once again it is a rush (imagine that). This is all done through a web-based interface. So how do you generate forms from something that was once a preprinted form? What method do you recommend? This is a legal form that must be filled out a certain way, and there can be many in a batch (up to 50 or so). I would prefer not to have them print one page at a time. Any ideas would be appreciated!

    Read the article

  • Generate A Simple Read-Only DAL?

    - by David
    I've been looking around for a simple solution to this, trying my best to lean towards something like NHibernate, but so far everything I've found seems to be trying to solve a slightly different problem. Here's what I'm looking at in my current project: We have an IBM iSeries database as the primary repository for a third-party software suite used for our core business (a financial institution). Part of what my team does is write applications that report on or key off of a lot of this data in some way. In the past, we've been manually creating ADO.NET connections (we're using .NET 3.5 and Visual Studio 2008, by the way) and manually writing queries, etc. Moving forward, I'd like to simplify the process of getting data from there for the development team. Rather than creating connections and queries and all that each time, I'd much rather a developer be able to simply do something like this:
    var something = (from t in TableName select t);
    And, ideally, they'd just get some IQueryable or IEnumerable of generated entities. This would be done inside a new domain core that I'm building, where these entities would live, and the applications would interface with it through a request/response service layer. A few things to note are:
    - The entities that correspond to the database tables should be generated once, and we'd prefer to manually keep them updated over time. That is, if columns/tables are added to the database then we shouldn't have to do anything. (If some are deleted, of course, it will break, but that's fine.) But if we need to use a new column, we should be able to just add it to the necessary class(es) without having to re-gen the whole thing.
    - The whole thing should be SELECT-only. We're not doing a full DAL here because we don't want to be able to break anything in the database (even accidentally).
    - We don't need any kind of mapping between our domain objects and the generated entity types. The domain barely covers a fraction of the data that's in there, most of it we'll never need, and we would rather just create re-usable maps manually over time. I already have a logical separation for the DAL where my "repository" classes return domain objects; I'm just looking for a better alternative to manual ADO to be used inside the repository classes.
    Any suggestions? It seems like what I'm doing is just enough outside the normal demand for DAL/ORM tools/tutorials online that I haven't been able to find anything. Or maybe I'm just overlooking something obvious?

    Read the article

  • GoldenGate 12c Trail Encryption and Credentials with Oracle Wallet

    - by hamsun
    I have been asked more than once whether the Oracle Wallet supports GoldenGate trail encryption. Although GoldenGate has supported encryption with the ENCKEYS file for years, Oracle GoldenGate 12c now also supports encryption using the Oracle Wallet. This helps improve security and makes it easier to administer. Two types of wallets can be configured in Oracle GoldenGate 12c:
    • The wallet that holds the master keys, used with trail or TCP/IP encryption and decryption, stored in the new 12c dirwlt/cwallet.sso file.
    • The wallet that holds the Oracle Database user IDs and passwords, stored in the ‘credential store’ in the new 12c dircrd/cwallet.sso file.
    A wallet can be created using a ‘create wallet’ command. Adding a master key to an existing wallet is easy using the ‘open wallet’ and ‘add masterkey’ commands.
    GGSCI (EDLVC3R27P0) 42> open wallet
    Opened wallet at location 'dirwlt'.
    GGSCI (EDLVC3R27P0) 43> add masterkey
    Master key 'OGG_DEFAULT_MASTERKEY' added to wallet at location 'dirwlt'.
    Existing GUI wallet utilities that come with other products, such as the Oracle Database “Oracle Wallet Manager”, do not work on this version of the wallet. The default Oracle Wallet location can be changed.
    GGSCI (EDLVC3R27P0) 44> sh ls -ltr ./dirwlt/*
    -rw-r----- 1 oracle oinstall 685 May 30 05:24 ./dirwlt/cwallet.sso
    GGSCI (EDLVC3R27P0) 45> info masterkey
    Masterkey Name:  OGG_DEFAULT_MASTERKEY
    Creation Date:   Fri May 30 05:24:04 2014
    Version:   Creation Date:              Status:
    1          Fri May 30 05:24:04 2014    Current
    The second wallet file is used for the credential used to connect to a database, without exposing the user ID or password. Once it is configured, this file can be copied so that credentials are available to connect to the source or target database.
    GGSCI (EDLVC3R27P0) 48> sh cp ./dircrd/cwallet.sso $GG_EURO_HOME/dircrd
    GGSCI (EDLVC3R27P0) 49> sh ls -ltr ./dircrd/*
    -rw-r----- 1 oracle oinstall 709 May 28 05:39 ./dircrd/cwallet.sso
    The encryption wallet file can also be copied to the target machine so the replicat has access to the master key to decrypt records that are encrypted in the trail. Similar to the old ENCKEYS file, the master key wallet created on the source host must either be stored on a centrally available disk or copied to all GoldenGate target hosts. The wallet is in a platform-independent format, although it is not certified for the iSeries, z/OS, and NonStop platforms.
    GGSCI (EDLVC3R27P0) 50> sh cp ./dirwlt/cwallet.sso $GG_EURO_HOME/dirwlt
    The new 12c UserIdAlias parameter is used to locate the credential in the wallet, so the source user ID and password do not need to be stored as a parameter as long as they are in the wallet.
    GGSCI (EDLVC3R27P0) 52> view param extwest
    extract extwest
    exttrail ./dirdat/ew
    useridalias gguamer
    table west.*;
    The EncryptTrail parameter is used to encrypt the trail using the Advanced Encryption Standard and can be used with a primary extract or pump extract.
    GGSCI (EDLVC3R27P0) 54> view param pwest
    extract pwest
    encrypttrail AES256
    rmthost easthost, mgrport 15001
    rmttrail ./dirdat/pe
    passthru
    table west.*;
    Once the extracts are running, records can be encrypted using the wallet.
    GGSCI (EDLVC3R27P0) 60> info extract *west
    EXTRACT    EXTWEST   Last Started 2014-05-30 05:26   Status RUNNING
    Checkpoint Lag       00:00:17 (updated 00:00:01 ago)
    Process ID           24982
    Log Read Checkpoint  Oracle Integrated Redo Logs
                         2014-05-30 05:25:53
                         SCN 0.0 (0)
    EXTRACT    PWEST     Last Started 2014-05-30 05:26   Status RUNNING
    Checkpoint Lag       24:02:32 (updated 00:00:05 ago)
    Process ID           24983
    Log Read Checkpoint  File ./dirdat/ew000004
                         2014-05-29 05:23:34.748949  RBA 1483
    The ‘info masterkey’ command is used to confirm the wallet contains the key after copying it to the target machine. The key is needed to decrypt the data in the trail before the replicat applies the changes to the target database.
    GGSCI (EDLVC3R27P0) 41> open wallet
    Opened wallet at location 'dirwlt'.
    GGSCI (EDLVC3R27P0) 42> info masterkey
    Masterkey Name:  OGG_DEFAULT_MASTERKEY
    Creation Date:   Fri May 30 05:24:04 2014
    Version:   Creation Date:              Status:
    1          Fri May 30 05:24:04 2014    Current
    Once the replicat is running, records can be decrypted using the wallet.
    GGSCI (EDLVC3R27P0) 44> info reast
    REPLICAT   REAST     Last Started 2014-05-30 05:28   Status RUNNING
    INTEGRATED
    Checkpoint Lag       00:00:00 (updated 00:00:02 ago)
    Process ID           25057
    Log Read Checkpoint  File ./dirdat/pe000004
                         2014-05-30 05:28:16.000000  RBA 1546
    There is no need for the DecryptTrail parameter when using the Oracle Wallet, unlike when using the ENCKEYS file.
    GGSCI (EDLVC3R27P0) 45> view params reast
    replicat reast
    assumetargetdefs
    discardfile ./dirrpt/reast.dsc, purge
    useridalias ggueuro
    map west.*, target east.*;
    Once a record is inserted into the source table and committed, the encryption can be verified using logdump and then querying the target table.
    AMER_SQL>insert into west.branch values (50, 80071);
    1 row created.
    AMER_SQL>commit;
    Commit complete.
    The following encrypted record can be found using logdump.
    Logdump 40 >n
    2014/05/30 05:28:30.001.154 Insert               Len    28 RBA 1546
    Name: WEST.BRANCH
    After  Image:                                             Partition 4   G  s
    0a3e 1ba3 d924 5c02 eade db3f 61a9 164d 8b53 4331 | .>...$\....?a..M.SC1
    554f e65a 5185 0257                               | UO.ZQ..W
    Bad compressed block, found length of  7075 (x1ba3), RBA 1546
    GGS tokens:
    TokenID x52 'R' ORAROWID         Info x00  Length   20
    4141 4157 7649 4141 4741 4141 4144 7541 4170 0001 | AAAWvIAAGAAAADuAAp..
    TokenID x4c 'L' LOGCSN           Info x00  Length    7
    3231 3632 3934 33                                 | 2162943
    TokenID x36 '6' TRANID           Info x00  Length   10
    3130 2e31 372e 3135 3031                          | 10.17.1501
    The replicat automatically decrypted this record from the trail and then inserted the row to the target table using the wallet. This select verifies the row was inserted into the target database and the data is not encrypted.
    EURO_SQL>select * from branch where branch_number=50;
    BRANCH_NUMBER   BRANCH_ZIP
    -------------   ----------
               50        80071
    Book a seat in an upcoming Oracle GoldenGate 12c: Fundamentals for Oracle course now to learn more about GoldenGate 12c new features, including how to use GoldenGate with the Oracle Wallet, credentials, integrated extracts, integrated replicats, the Oracle Universal Installer, and other new features. Looking for another course? View all Oracle GoldenGate training.
    Randy Richeson joined Oracle University as a Senior Principal Instructor in March 2005. He is an Oracle Certified Professional (10g-12c) and a GoldenGate Certified Implementation Specialist (10-11g). He has taught GoldenGate since 2010 and also has experience teaching other technical curricula, including GoldenGate Monitor, Veridata, JD Edwards, PeopleSoft, and the Oracle Application Server.

    Read the article
