Search Results

Search found 23901 results on 957 pages for 'deployment process'.

Page 351 of 957

  • Can’t connect to SQL Server 2008 - looks like Shared Memory problem

    - by Proposition Joe
    I am unable to connect to my local instance of SQL Server 2008 Express using SQL Server Management Studio. I believe the problem is related to a change I made to the connection protocols. Before the error occurred, I had Shared Memory enabled and Named Pipes and TCP/IP disabled. I then enabled both Named Pipes and TCP/IP, and this is when I started experiencing the problem. When I try to connect to the server with SSMS (with either my SQL Server sysadmin login or with Windows authentication), I get the following error message: A connection was successfully established with the server, but then an error occurred during the login process. (provider: Named Pipes Provider, error: 0 - No process is on the other end of the pipe.) (Microsoft SQL Server, Error: 233) Why is it returning a Named Pipes error? Why would it not just use Shared Memory, as this has a higher priority in the list of connection protocols? It seems as though it is not listening on Shared Memory for some reason. When I set Named Pipes to enabled and try to connect, I get the same error message. My Windows account does not have administrator privileges on this computer - perhaps that makes a difference in some way (as some of the discussions in this post about a "SuperSocketNetLib\Lpc" registry key seem to suggest). I have tried restarting the SQL Server service, by the way, and also had someone log onto the machine with an admin account to restart the SQL Server service. Still no luck.
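    A quick way to narrow this down is to force each protocol explicitly from the command line and see which one actually fails. This is only a diagnostic sketch; it assumes the instance is named .\SQLEXPRESS and that the sqlcmd client tools are installed:

        :: force Shared Memory (lpc:), Named Pipes (np:) and TCP/IP (tcp:) in turn
        sqlcmd -E -S lpc:.\SQLEXPRESS -Q "SELECT @@VERSION"
        sqlcmd -E -S np:.\SQLEXPRESS  -Q "SELECT @@VERSION"
        sqlcmd -E -S tcp:.\SQLEXPRESS -Q "SELECT @@VERSION"

    If the lpc: connection succeeds, Shared Memory itself is fine and the question becomes why SSMS falls through to Named Pipes; if it fails, at least the error will name the protocol that is actually broken.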

    Read the article

  • Using Monit to monitor Resque

    - by Alex
    I'm trying to use Resque as a job runner for Rails. I've tried this config, and many other ways of daemonizing the Resque task (because running rake resque:work leaves the terminal tied to that command). Unfortunately, their example configuration doesn't work for me. Does the configuration look correct? Or is there another way to turn the process into a daemon? Thank you :)

        check process resque_worker_QUEUE with pidfile /data/APP_NAME/current/tmp/pids/resque_worker_QUEUE.pid
          start program = "/bin/sh -c 'cd /data/APP_NAME/current; RAILS_ENV=production QUEUE=queue_name VERBOSE=1 nohup rake environment resque:work& > log/resque_worker_QUEUE.log && echo $! > tmp/pids/resque_worker_QUEUE.pid'" as uid deploy and gid deploy
          stop program = "/bin/sh -c 'cd /data/APP_NAME/current && kill -s QUIT `cat tmp/pids/resque_worker_QUEUE.pid` && rm -f tmp/pids/resque_worker_QUEUE.pid; exit 0;'"
          if totalmem is greater than 300 MB for 10 cycles then restart # eating up memory?
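    One detail worth checking in the start program line: the shell backgrounds rake before the redirection (resque:work& > log/...), so the redirect applies to an empty command and the worker's output goes to nohup.out instead of the log. A minimal corrected sketch, assuming the same paths and queue name as above:

        start program = "/bin/sh -c 'cd /data/APP_NAME/current && RAILS_ENV=production QUEUE=queue_name VERBOSE=1 nohup rake environment resque:work > log/resque_worker_QUEUE.log 2>&1 & echo $! > tmp/pids/resque_worker_QUEUE.pid'" as uid deploy and gid deploy

    The pidfile still receives $!, the PID of the backgrounded nohup/rake process, which is what monit needs.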

    Read the article

  • upstart scripts: run a task after networking goes up

    - by The Journeyman geek
    I'm working on moving my current server setup to newer hardware, and migrating from Ubuntu Karmic Koala to Lucid Lynx. Currently I'm using gw6c (compiled from the gogo6 website, as opposed to the version from the repositories) to get IPv6 access for my systems. On the Karmic Koala system, I used a simple init.d script to get the IPv6 client started:

        #! /bin/sh
        /usr/local/gw6c/bin/gw6c -f /usr/local/gw6c/bin/gw6c.conf

    I figured that since this runs at any runlevel, it should translate to:

        respawn
        console none
        start on startup
        stop on shutdown
        script
            exec /usr/local/gw6c/bin/gw6c -f /usr/local/gw6c/bin/gw6c.conf
            emit free6_ipv6_started
        end script

    This works fine when started from initctl, but it apparently fails to start at boot - its status being stop/waiting. It works fine (and respawns) when started otherwise. Any ideas on where I'm going wrong, and what would be the appropriate 'start on' argument? EDIT: the exact error is 'init: gw6c main process (xxx) ended with status 8' followed by the process respawning, with xxx being a PID, I suspect. I'm also half suspecting this is because gw6c starts before networking does, and I need my eth0 up before gw6c starts.
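    On Lucid, the usual way to express that dependency is to start the job on the event emitted when the interface comes up, rather than on startup (which fires very early in boot, before networking, and would match the status-8 symptom). A hedged sketch, assuming eth0 is brought up via /etc/network/interfaces so the net-device-up event is emitted:

        start on (local-filesystems and net-device-up IFACE=eth0)
        stop on runlevel [016]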

    Read the article

  • Unable to delete a file or take ownership on Win7x64

    - by Basic
    I'm a developer and as part of the build process, a Microsoft DLL is copied to a certain folder. That file copy is now failing as the target can't be overwritten. I decided to delete it by hand (using an admin account but a non-elevated Explorer), so I browsed to the folder and attempted a delete. This failed (Require permission from the Administrator). The same applies when using an elevated Explorer. So I tried Properties > Security > Advanced > Ownership. The current owner is showing as "Unable to display current owner". I can't take ownership (a simple Access Denied message with no elaboration). Elevated Command Prompt/PowerShell don't help either (both give an Access Denied in their own way). Process Explorer shows no open handles on the file. Eventually, I booted to Linux and deleted the file, but what I'd like to know is what caused it? Security Essentials had no issues with the file. It's digitally signed by MS and the signatures match.
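    For reference, the usual brute-force route from an elevated prompt is to take ownership explicitly and then grant full control; this is a generic sketch (the path is a placeholder), and given the "Unable to display current owner" symptom it may well fail with the same Access Denied:

        takeown /F "C:\build\target\some.dll" /A
        icacls "C:\build\target\some.dll" /grant Administrators:F
        del "C:\build\target\some.dll"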

    Read the article

  • Profile not loaded correctly (Cannot access registry)

    - by xaav
    Every so often, I log on and get the following message: "User profile was not loaded correctly. You have been logged on with a temporary profile. Changes you make to this profile will be lost when you log off. Please see the event log for details or contact your administrator." This almost always happens when somebody else has been on the computer for a while and then I log on. This never used to happen, but now it happens pretty often. My profile is not permanently corrupted; all I have to do is restart my computer, but this annoys me and I would like to fix it. I was curious about the cause, so I looked in the Event Log and found that the root of the problem was that the ntuser.dat file of the profile I was logging on to was locked at logon time. This resulted in the current user's registry not being loaded, which in turn caused the profile load to fail. I just found a Microsoft article that mentions this exact issue: http://support.microsoft.com/kb/960464/ The problem is that I do not want to delete this profile, and this issue does not come up every time I log on, only when somebody else has been on a long time before me. What could be locking this file? Is there any way to get a process list without logging on so that I can identify which process has the file locked? Any other suggestions?
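    One way to identify the locker without logging on under your own account is to run Sysinternals handle.exe from another admin session and search for open handles on that hive; a rough sketch, where the path is a placeholder for the affected profile:

        handle.exe -u "C:\Users\someuser\ntuser.dat"

    The -u switch adds the owning user to the output, which usually points straight at the service or backup/antivirus process holding the hive open.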

    Read the article

  • Should I completely turn off swap for a Linux webserver?

    - by Poma
    Recently my friend told me that it is a good idea to turn off swap on Linux webservers with enough memory. My server has 12 GB and currently uses 4 GB (not counting cache and buffers) under peak load. His argument was that in a normal situation the server will never use all of its RAM, so the only way it can hit an out-of-memory situation is due to some bug/DDoS/etc. If swap is turned off, the system will run out of memory, which will eventually crash the program hogging memory (most likely the web server process) and probably some other processes. If swap is turned on, it will eat both RAM and swap and eventually end in the same crash, but before that it will offload crucial processes like sshd to swap and start doing a lot of swap operations, resulting in a major slowdown. This way, when under DDoS, the system may go into a completely unusable state due to huge lags, and I will probably not be able to log in to kill the webserver process or deny all incoming traffic (all but SSH). Is this right? Am I missing something (like the fact that a swap partition is very useful in some way even if I have enough RAM)? Should I turn it off?
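    A common middle ground, rather than removing swap entirely, is to keep a small swap partition but tell the kernel to avoid it until memory pressure is real. A sketch (the value 10 is an illustrative low setting, not a recommendation from this thread):

        # check current swap usage and the swappiness setting
        free -m
        swapon -s
        cat /proc/sys/vm/swappiness

        # prefer reclaiming cache over swapping out application pages
        sysctl -w vm.swappiness=10
        echo 'vm.swappiness = 10' >> /etc/sysctl.conf   # persist across reboots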

    Read the article

  • Cisco 7206VXR: reducing CPU load

    - by naimson
    I have a 7206VXR (NPE-G2). At a rate of 140 kpps I hit 80% CPU, so I am looking for ways to reduce it. I could turn off NetFlow (but I don't want to - monitoring is highly important for me), and that would only give me 10-20%. At this moment, with 84 kpps, I have 58%. show processes cpu sorted gives me this:

        PID Runtime(ms) Invoked uSecs 5Sec 1Min 5Min TTY Process
        109 163534600 537236763 304 35.38% 32.83% 16.85% 0 IP Input
        67 829396 52280 15864 0.15% 0.01% 0.00% 0 Per-minute Jobs
        68 5542736 3053476 1815 0.15% 0.18% 0.16% 0 Per-Second Jobs
        51 635852 1116315 569 0.07% 0.03% 0.02% 0 Net Background
        329 120396 4607274 26 0.07% 0.00% 0.00% 0 EIGRP-IPv4 Hello
        105 50508 95032488 0 0.07% 0.05% 0.05% 0 IPAM Manager
        6 4068580 476916 8531 0.00% 0.07% 0.05% 0 Check heaps
        7 7768 3634 2137 0.00% 0.00% 0.00% 0 Pool Manager
        8 0 1 0 0.00% 0.00% 0.00% 0 DiscardQ Backgro
        10 8 708 11 0.00% 0.00% 0.00% 0 WATCH_AFS
        5 0 1 0 0.00% 0.00% 0.00% 0 RO Notify Timers
        12 0 2 0 0.00% 0.00% 0.00% 0 ATM VC Auto Crea
        9 0 2 0 0.00% 0.00% 0.00% 0 Timers
        11 0 2 0 0.00% 0.00% 0.00% 0 ATM AutoVC Perio
        13 296 610532 0 0.00% 0.00% 0.00% 0 IPC Event Notifi
        16 0 1 0 0.00% 0.00% 0.00% 0 IPC Zone Manager
        17 3584 2980311 1 0.00% 0.00% 0.00% 0 IPC Periodic Tim
        4 0 1 0 0.00% 0.00% 0.00% 0 EDDRI_MAIN
        19 0 1 0 0.00% 0.00% 0.00% 0 IPC Process leve
        20 0 1 0 0.00% 0.00% 0.00% 0 IPC Seat Manager
        21 96 174453 0 0.00% 0.00% 0.00% 0 IPC Check Queue
        14 4 50890 0 0.00% 0.00% 0.00% 0 IPC Dynamic Cach
        3 0 1 0 0.00% 0.00% 0.00% 0 cpf_process_tpQ
        24 756 305371 2 0.00% 0.00% 0.00% 0 IPC Keep Alive M
        25 2340 610561 3 0.00% 0.00% 0.00% 0 IPC Loadometer
        22 0 1 0 0.00% 0.00% 0.00% 0 IPC Seat RX Cont
        15 0 1 0 0.00% 0.00% 0.00% 0 IPC Session Serv
        18 1620 2980310 0 0.00% 0.00% 0.00% 0 IPC Deferred Por
        29 0 1 0 0.00% 0.00% 0.00% 0 Exception contro

    sh run (grepped): http://pastie.org/5483194

    Hardware:

        c7200p-adventerprisek9-mz.151-4.M1.bin
        Cisco 7206VXR (NPE-G2) processor (revision A) with 917504K/65536K bytes of memory.
        Processor board ID 2xxxxxxx
        MPC7448 CPU at 1666Mhz, Implementation 0, Rev 2.2
        6 slot VXR midplane, Version 2.1
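    Since nearly all of the load sits in IP Input (which handles process-switched traffic), it is worth confirming how much traffic is bypassing CEF before deciding about NetFlow. These are standard IOS show commands, listed here only as a diagnostic sketch:

        show ip cef summary
        show interfaces stats
        show ip traffic
        show processes cpu history

    show interfaces stats breaks out per-interface packets by switching path (Processor vs Route cache), which shows whether the 140 kpps is actually being CEF-switched or punted to the CPU.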

    Read the article

  • Linux SFTP and many local user accounts, limits with mount --bind?

    - by user123428
    I am in the process of building a solution to let many developers (possibly hundreds) work on their files via SFTP, each one jailed in their home directory. For our particular needs, we have a Samba mount point that contains all of the users' home directories. I have started developing the following solution and hit some walls:
    - I have configured an Ubuntu Lucid server as the SFTP server.
    - In order to jail each user in their home directory (without allowing them to browse a directory up and see all the other users' folders) I am using mount --bind and not a symbolic link (also, some FTP clients don't really work with symlinks).
    - The user accounts are local Unix accounts on the SFTP server (not using a directory service or anything) with an empty home folder created on the local machine; I then use mount --bind to bind the empty folder to the actual user's home directory on the Samba share.
    With this solution I am hitting a couple of problems. In the case of a server reboot, all the bind mounts are lost because they are not written in fstab, and I have read somewhere that the maximum number of entries in fstab is 400 (which does not really help us). I have thought about writing something that stores the mounts in a text file as a backup and, on server reboot, runs a script that remounts them. I am just really unsure about this whole process and was wondering if anyone has any insight on possibly a better solution for SFTP? (not FTP)
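    For what it's worth, bind mounts can be made persistent in fstab directly, so the reboot problem has a stock answer; a sketch with placeholder paths for the Samba-backed homes, plus a generated-at-boot alternative if the number of fstab lines becomes unwieldy:

        # /etc/fstab - one line per jailed user (paths are placeholders)
        /srv/samba/homes/alice   /home/alice   none   bind   0   0
        /srv/samba/homes/bob     /home/bob     none   bind   0   0

        # or, remount from a plain list at boot instead of hundreds of fstab entries
        while read user; do
            mount --bind "/srv/samba/homes/$user" "/home/$user"
        done < /etc/sftp-bind-mounts.list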

    Read the article

  • Server high CPU load issue! (cPanel + CentOS 5)

    - by kenby
    Our server CPU load has been high these days, sometimes reaching 560! We have the latest cPanel/WHM and the kernel is up to date, while the load average is:

        Load Averages: 39.05 75.01 45.33

    The Apache status output is:

        Current Time: Sunday, 30-Jan-2011 01:50:13 EST
        Restart Time: Saturday, 29-Jan-2011 21:51:20 EST
        Parent Server Generation: 2
        Server uptime: 3 hours 58 minutes 53 seconds
        Total accesses: 149493 - Total Traffic: 2.4 GB
        CPU Usage: u9.17 s10.66 cu42.82 cs0 - .437% CPU load
        10.4 requests/sec - 174.6 kB/second - 16.7 kB/request
        121 requests currently being processed, 42 idle workers

        W_WWW.W_..W.W_W_WCWW..W...W.WWW.WWWW.WW.C_W_.W.WW.WC..W.WW.WW
        .W.W.W...WWWW...WW.CC.C.._W.WC.WW_WW._W....W.WWW.W.WWW.W..W
        WW.....WW.W_WWWWW..WCRW..WWCW.WWW__.WWWWCW_W._._WW_W...W...W
        _W..W..WW.W...._W..._WW.W.WWW.._W.WWW.WWW....WW_.C...W._

        Scoreboard Key: "_" Waiting for Connection, "S" Starting up, "R" Reading Request, "W" Sending Reply, "K" Keepalive (read), "D" DNS Lookup, "C" Closing connection, "L" Logging, "G" Gracefully finishing, "I" Idle cleanup of worker, "." Open slot with no current process

    What causes this high CPU load while the Apache CPU usage is fine? The MySQL process is also fine. The CPU load is still high even if I stop the mail/HTTP/MySQL services!
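    Since Apache and MySQL themselves look fine, the next step is usually to find out what the load actually consists of - runnable processes or processes stuck in uninterruptible I/O wait; a generic diagnostic sketch:

        # processes sorted by CPU, then processes in state D (uninterruptible I/O wait)
        ps aux --sort=-%cpu | head -20
        ps -eo state,pid,cmd | grep '^D'

        # overall picture: CPU vs I/O wait vs memory pressure
        vmstat 1 10
        iostat -x 1 5        # from the sysstat package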

    Read the article

  • Why would my network slow down?

    - by monkthemighty
    The network at my work has about 40 computers on it and quite a few printers. When there are a lot of people working, the network will be slow. I can test the ping between my computer and the router and it will keep rising, sometimes to the point that it times out. The router we are using is running Ubuntu on an Atom processor and it has 4 GB of RAM. When the network slows, the process ksoftirqd will be using most if not all of the processing power. I have found that ksoftirqd is a process that handles IRQ requests. Also, when the network slows down, I have captured packets on the router using tshark and looked at them with Wireshark on my laptop. The capture shows a lot of packets with TCP Dup ACK and TCP Retransmissions. The destinations of the TCP Dup ACKs and TCP retransmissions are most of the computers on the network, but some get far more than others. What could this problem be caused by?
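    Because ksoftirqd time usually points at interrupt/packet-processing load rather than application load, a first pass is to look at interrupt distribution and NIC error counters on the router; a generic sketch (eth0 is a placeholder for the busy interface):

        # which device is generating the interrupts, and on which CPU they land
        cat /proc/interrupts

        # softirq breakdown (NET_RX / NET_TX counters)
        cat /proc/softirqs

        # NIC-level drops and errors that would explain the retransmissions
        ethtool -S eth0 | grep -iE 'err|drop|fifo'
        ifconfig eth0 | grep -iE 'errors|dropped'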

    Read the article

  • File Access/Rename Issue

    - by Moon .
    Guys, I am having a serious problem with media files (only video files). When I try to rename or delete a file, or a folder that contains media files, I get an error: "Access Denied .... Some other program might be using this file..... bla bla bla". Previously, for this kind of problem I had a freeware tool, "Unlocker", that used to work. What "Unlocker" did was simply kill the process that was using the file. But in my case, the process using the files is "Explorer.exe". W.T.F... A strange behavior: when I copy-paste the same file, the copied file I can rename, I can delete, I can do whatever I want with it.... but what is wrong with the original file? Check the following list. It will give you a better idea of my situation. I am using Win XP SP2. I have Div Player installed (latest). The media files are from both my hard drive (created by many of my previous Windows installations) and newly downloaded files. So file ownership or privacy is not an issue. I may not be able to rename/delete those files one moment, and the next moment, on the next try, I might succeed. The problem is a major pain and I can't figure it out. Please F1! It's getting on my nerves.
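    When Explorer itself holds the handle, it is usually the shell's thumbnail/metadata extraction for video files. Two commonly cited workarounds on XP, given as a sketch rather than a guaranteed fix: restart Explorer to drop the handle, and/or unregister the media-property shell extension so Explorer stops reading AVI metadata:

        taskkill /F /IM explorer.exe
        start explorer.exe

        regsvr32 /u shmedia.dll   :: often-cited XP tweak; re-register later with regsvr32 shmedia.dll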

    Read the article

  • Allow more websocket connections

    - by Switz
    I want to load balance my node.js (DerbyJS to be specific) application on a basic Linode (512 MB RAM). It can probably take more than one process running at once. The queries/database do not concern me, as I'm not doing anything intensive. The problem at the moment is that it can only handle up to ~40 websocket connections at once. I would love it if I could get that number into the few-hundred-plus range. I anticipate a lot of traffic on launch, due to the fact that it's a highly niche community with an engaged audience, but afterwards it should be fine with just ~20-40 connections at once, which it handles perfectly as of now. I don't mind spending a bit of money for a week or two worth of running, but I also don't want to switch production environments. How can I test the process to see how many instances I am able to run on the box? Will increasing the number of processes increase the number of websockets I can handle, or is that a limitation of the server's network? I have an old MacBook Pro running Linux sitting next to me that has 2 GB RAM and a 2.8 GHz dual-core processor. Could I use this to handle some of the extra load? I could probably load balance with nginx to its IP. I'm on a FiOS home network. If you have any suggestions, I'd really appreciate it. Thanks
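    Before adding hardware, it is worth checking whether the ~40-connection ceiling is a per-process file-descriptor limit rather than a network limit, and then measuring with a synthetic client; a rough sketch (the 10000 value and the deploy user are illustrative, not project requirements):

        # current limits
        ulimit -n
        cat /proc/sys/fs/file-max

        # raise the per-user open-file limit in /etc/security/limits.conf, e.g.
        #   deploy  soft  nofile  10000
        #   deploy  hard  nofile  10000

        # while driving test connections from another machine, watch the socket
        # totals and the node process's open descriptors
        ss -s
        ls /proc/$(pgrep -o node)/fd | wc -l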

    Read the article

  • Linux 'top' utility wildly inaccurate (more so for multi-CPU/core hardware)?

    - by amn
    Hi all. After using 'top' for a long time, albeit basically, I have grown to distrust its '%CPU' column. I have 8-core hardware (a quad-core Intel i7 920 with hyperthreading) and see some wild numbers when running a process that should not use more than 5% overall. top happily reports 50%, and I suspect that is not so. My question is: is it a known fact that it is inaccurate when several CPUs/cores are present? I used 'mpstat' from the 'sysstat' package, and its numbers are much more conservative, certainly within my expectations. I did press '1' in 'top' to switch it to show all the cores and the us/sy/io stats, but the numbers are substantially higher than with 'mpstat'... I know that my expectations may be unfounded as well, but my gut feeling tells me 'top' is wrong! :-) The reason I need to know is that the process I am monitoring only guarantees quality of service with CPU usage "less than 80%" (however vague that sounds), and I need to know how much headroom I have left. It's a streaming server.
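    Part of the discrepancy is usually just scaling: by default top's per-process %CPU is relative to a single core (Irix mode), so 50% of one core on an 8-thread box is roughly 6% of the whole machine, while mpstat averages over all CPUs. A comparison sketch:

        # per-CPU utilisation, 1-second samples
        mpstat -P ALL 1 5

        # top snapshot in batch mode; toggling Irix mode off (the 'I' key, or a
        # saved toprc) scales %CPU to the whole machine instead of one core
        top -b -n 1 | head -20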

    Read the article

  • How to ensure the local file is up-to-date or ahead (Dropbox sync) before TrueCrypt auto-mounts it?

    - by user620965
    There are a lot of tutorials out there stating that Dropbox's built-in encryption is not secure enough. Those tutorials recommend syncing a TrueCrypt container file so that all files in it are securely encrypted. This setup is known to be limited: you can NOT have the TrueCrypt container file mounted at the same time in more than one location - if you make changes to the contents of the container in more than one location at a time, this setup produces a conflict on the container file in Dropbox, resulting in one container file per location. In my case that issue is not relevant - I do not use my data in more than one location at a time. I want to use TrueCrypt's auto-mount feature on startup of Windows 7 to have a zero-configuration environment and start working right away. But I want to ensure that the local TrueCrypt container file is up to date before TrueCrypt mounts it automatically - imagine you updated the contents of the container at your primary location and your secondary location was off for a long time. In that case it can take "a long time" until the Dropbox sync is complete (e.g. depending on your internet connection and the size of the container file). There is an option in TrueCrypt that keeps it from updating the timestamp of the container file, which speeds up the sync because the Dropbox client then does a differential sync instead of a time-consuming full sync. That is an improvement to this setup, but it does not fix my issue. The question is how to make the auto-mount function wait for the container file to be up to date (updated by Dropbox). In contrast: if the file was changed locally but the remote file (in the Dropbox cloud) is still old (not yet updated by the sync process, or the sync is in progress), TrueCrypt should not be made to wait for the sync. Suggestions?
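    One way to approximate this without hooking into Dropbox itself is a small logon script that waits until the container file's timestamp has been stable for a while (i.e. the Dropbox client has finished writing it) and only then launches TrueCrypt. This is only a sketch under assumptions: the container path is a placeholder, it only detects a download already in progress (not one that has not started yet), and the TrueCrypt switches used (/v volume, /l drive letter, /q quiet) should be checked against the TrueCrypt documentation for your version:

        $container = "$env:USERPROFILE\Dropbox\secure.tc"    # placeholder path
        do {
            $before = (Get-Item $container).LastWriteTime
            Start-Sleep -Seconds 30
            $after = (Get-Item $container).LastWriteTime
        } while ($before -ne $after)                         # still being updated by Dropbox

        & "C:\Program Files\TrueCrypt\TrueCrypt.exe" /v $container /l X /q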

    Read the article

  • VMware Server: virtual machine start-up reaches 95% and hangs

    - by Magsol
    This problem cropped up today, after updating my eVGA motherboard's chipset in order to try and fix an unrelated issue. After installing the chipset update (which contained SATA and Ethernet drivers), every time I've tried to start my Ubuntu VM, it reaches 95% in the web interface and then just hangs. I'm using VMware Server 2.0.2, running it within Windows 7 64-bit. I haven't had any issues up until now, and I suspect it has something to do with the chipset update. I've already tried reinstalling VMware itself, and removing the VM from the inventory and re-adding it, and neither has proved successful. I'm also not sure how to kill the VMware Server process itself once the start-up hangs; I've only been able to try again by rebooting (as none of the VMware services listed kill the server process itself). Any insights? Edit: Uhhh... as an addendum: I have a cron job set up on my Ubuntu VM that runs every 20 minutes. The VM is still listed in the VMware web interface as at 95% of startup, and the start/stop buttons are still disabled, but the cron job just ran. I also tested SSH, and I was able to tunnel into the Ubuntu VM as well. Now I'm really confused. Edit #2: I just started a thread on the VMware Server support forums on this same topic. Hopefully between the two communities, we can come up with an answer: http://communities.vmware.com/thread/251033 Edit #3: In lieu of a specific fix, I've switched over to VirtualBox, and all is working just fine.
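    On the 'how do I kill the hung VM' point: on a Windows host each running VM is backed by its own vmware-vmx.exe process, separate from the VMware Server services, so it can usually be terminated without a reboot; a hedged sketch from an elevated prompt:

        tasklist | findstr /i vmware
        taskkill /F /IM vmware-vmx.exe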

    Read the article

  • 100% CPU use when a new USB device is plugged in - services.exe / Windows Server 2003

    - by Will3265
    On my server I am trying to install a new USB drive, but all that happens is that the system starts using huge amounts of processor cycles in services.exe. On closer inspection with Process Explorer, there is a thread in umpnpmgr.dll using most of the services.exe processor time. I left it for half an hour and still nothing happened. Rebooted and tried again, same result. Tried a different USB drive, then a flash drive, but still the same issue. Tried updating the driver, but it said the update function was already in action. I have used Process Explorer to kill the thread for now, so the server can still perform its intended functions. Any device that was previously installed before this began happening will still work, but any device new to the system will not. My question(s) is/are: Is there a way to manually install the device into the registry so Windows thinks it is a previously installed device? Or can this problem be repaired through anything other than a reinstall? A reinstall would mean backing up large amounts of data, which is hard with a USB drive and insufficient space on all the other network machines. Any help would be greatly appreciated. William
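    umpnpmgr.dll is the user-mode Plug and Play manager, so a reasonable next step is to see where device installation is getting stuck in its log; on Server 2003 that log is %windir%\setupapi.log. A quick sketch for pulling out the errors while reproducing the hang:

        findstr /i /n "error fail" %windir%\setupapi.log | more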

    Read the article

  • Adding splash screen code to my package

    - by Youssef
    Please i need support to added splash screen code to my package /* * T24_Transformer_FormView.java */ package t24_transformer_form; import org.jdesktop.application.Action; import org.jdesktop.application.ResourceMap; import org.jdesktop.application.SingleFrameApplication; import org.jdesktop.application.FrameView; import org.jdesktop.application.TaskMonitor; import java.awt.event.ActionEvent; import java.awt.event.ActionListener; import javax.swing.filechooser.FileNameExtensionFilter; import javax.swing.filechooser.FileFilter; // old T24 Transformer imports import java.io.File; import java.io.FileWriter; import java.io.StringWriter; import java.text.SimpleDateFormat; import java.util.ArrayList; import java.util.Date; import java.util.HashMap; import java.util.Iterator; //import java.util.Properties; import java.util.StringTokenizer; import javax.swing.; import javax.xml.parsers.DocumentBuilder; import javax.xml.parsers.DocumentBuilderFactory; import javax.xml.transform.Result; import javax.xml.transform.Source; import javax.xml.transform.Transformer; import javax.xml.transform.TransformerFactory; import javax.xml.transform.dom.DOMSource; import javax.xml.transform.stream.StreamResult; import org.apache.log4j.Logger; import org.apache.log4j.PropertyConfigurator; import org.w3c.dom.Document; import org.w3c.dom.DocumentFragment; import org.w3c.dom.Element; import org.w3c.dom.Node; import org.w3c.dom.NodeList; import com.ejada.alinma.edh.xsdtransform.util.ConfigKeys; import com.ejada.alinma.edh.xsdtransform.util.XSDElement; import com.sun.org.apache.xml.internal.serialize.OutputFormat; import com.sun.org.apache.xml.internal.serialize.XMLSerializer; /* * The application's main frame. */ public class T24_Transformer_FormView extends FrameView { /**} * static holders for application-level utilities * { */ //private static Properties appProps; private static Logger appLogger; /** * */ private StringBuffer columnsCSV = null; private ArrayList<String> singleValueTableColumns = null; private HashMap<String, String> multiValueTablesSQL = null; private HashMap<Object, HashMap<String, Object>> groupAttrs = null; private ArrayList<XSDElement> xsdElementsList = null; /** * initialization */ private void init() /*throws Exception*/ { // init the properties object //FileReader in = new FileReader(appConfigPropsPath); //appProps.load(in); // log4j.properties constant String PROP_LOG4J_CONFIG_FILE = "log4j.properties"; // init the logger if ((PROP_LOG4J_CONFIG_FILE != null) && (!PROP_LOG4J_CONFIG_FILE.equals(""))) { PropertyConfigurator.configure(PROP_LOG4J_CONFIG_FILE); if (appLogger == null) { appLogger = Logger.getLogger(T24_Transformer_FormView.class.getName()); } appLogger.info("Application initialization successful."); } columnsCSV = new StringBuffer(ConfigKeys.FIELD_TAG + "," + ConfigKeys.FIELD_NUMBER + "," + ConfigKeys.FIELD_DATA_TYPE + "," + ConfigKeys.FIELD_FMT + "," + ConfigKeys.FIELD_LEN + "," + ConfigKeys.FIELD_INPUT_LEN + "," + ConfigKeys.FIELD_GROUP_NUMBER + "," + ConfigKeys.FIELD_MV_GROUP_NUMBER + "," + ConfigKeys.FIELD_SHORT_NAME + "," + ConfigKeys.FIELD_NAME + "," + ConfigKeys.FIELD_COLUMN_NAME + "," + ConfigKeys.FIELD_GROUP_NAME + "," + ConfigKeys.FIELD_MV_GROUP_NAME + "," + ConfigKeys.FIELD_JUSTIFICATION + "," + ConfigKeys.FIELD_TYPE + "," + ConfigKeys.FIELD_SINGLE_OR_MULTI + System.getProperty("line.separator")); singleValueTableColumns = new ArrayList<String>(); singleValueTableColumns.add(ConfigKeys.COLUMN_XPK_ROW + ConfigKeys.DELIMITER_COLUMN_TYPE + 
ConfigKeys.DATA_TYPE_XSD_NUMERIC); multiValueTablesSQL = new HashMap<String, String>(); groupAttrs = new HashMap<Object, HashMap<String, Object>>(); xsdElementsList = new ArrayList<XSDElement>(); } /** * initialize the <code>DocumentBuilder</code> and read the XSD file * * @param docPath * @return the <code>Document</code> object representing the read XSD file */ private Document retrieveDoc(String docPath) { Document xsdDoc = null; File file = new File(docPath); try { DocumentBuilder builder = DocumentBuilderFactory.newInstance().newDocumentBuilder(); xsdDoc = builder.parse(file); } catch (Exception e) { appLogger.error(e.getMessage()); } return xsdDoc; } /** * perform the iteration/modification on the document * iterate to the level which contains all the elements (Single-Value, and Groups) and start processing each * * @param xsdDoc * @return */ private Document processDoc(Document xsdDoc) { ArrayList<Object> newElementsList = new ArrayList<Object>(); HashMap<String, Object> docAttrMap = new HashMap<String, Object>(); Element sequenceElement = null; Element schemaElement = null; // get document's root element NodeList nodes = xsdDoc.getChildNodes(); for (int i = 0; i < nodes.getLength(); i++) { if (ConfigKeys.TAG_SCHEMA.equals(nodes.item(i).getNodeName())) { schemaElement = (Element) nodes.item(i); break; } } // process the document (change single-value elements, collect list of new elements to be added) for (int i1 = 0; i1 < schemaElement.getChildNodes().getLength(); i1++) { Node childLevel1 = (Node) schemaElement.getChildNodes().item(i1); // <ComplexType> element if (childLevel1.getNodeName().equals(ConfigKeys.TAG_COMPLEX_TYPE)) { // first, get the main attributes and put it in the csv file for (int i6 = 0; i6 < childLevel1.getChildNodes().getLength(); i6++) { Node child6 = childLevel1.getChildNodes().item(i6); if (ConfigKeys.TAG_ATTRIBUTE.equals(child6.getNodeName())) { if (child6.getAttributes().getNamedItem(ConfigKeys.ATTR_NAME) != null) { String attrName = child6.getAttributes().getNamedItem(ConfigKeys.ATTR_NAME).getNodeValue(); if (((Element) child6).getElementsByTagName(ConfigKeys.TAG_SIMPLE_TYPE).getLength() != 0) { Node simpleTypeElement = ((Element) child6).getElementsByTagName(ConfigKeys.TAG_SIMPLE_TYPE) .item(0); if (((Element) simpleTypeElement).getElementsByTagName(ConfigKeys.TAG_RESTRICTION).getLength() != 0) { Node restrictionElement = ((Element) simpleTypeElement).getElementsByTagName( ConfigKeys.TAG_RESTRICTION).item(0); if (((Element) restrictionElement).getElementsByTagName(ConfigKeys.TAG_MAX_LENGTH).getLength() != 0) { Node maxLengthElement = ((Element) restrictionElement).getElementsByTagName( ConfigKeys.TAG_MAX_LENGTH).item(0); HashMap<String, String> elementProperties = new HashMap<String, String>(); elementProperties.put(ConfigKeys.FIELD_TAG, attrName); elementProperties.put(ConfigKeys.FIELD_NUMBER, "0"); elementProperties.put(ConfigKeys.FIELD_DATA_TYPE, ConfigKeys.DATA_TYPE_XSD_STRING); elementProperties.put(ConfigKeys.FIELD_FMT, ""); elementProperties.put(ConfigKeys.FIELD_NAME, attrName); elementProperties.put(ConfigKeys.FIELD_SHORT_NAME, attrName); elementProperties.put(ConfigKeys.FIELD_COLUMN_NAME, attrName); elementProperties.put(ConfigKeys.FIELD_SINGLE_OR_MULTI, "S"); elementProperties.put(ConfigKeys.FIELD_LEN, maxLengthElement.getAttributes().getNamedItem( ConfigKeys.ATTR_VALUE).getNodeValue()); elementProperties.put(ConfigKeys.FIELD_INPUT_LEN, maxLengthElement.getAttributes() .getNamedItem(ConfigKeys.ATTR_VALUE).getNodeValue()); 
constructElementRow(elementProperties); // add the attribute as a column in the single-value table singleValueTableColumns.add(attrName + ConfigKeys.DELIMITER_COLUMN_TYPE + ConfigKeys.DATA_TYPE_XSD_STRING + ConfigKeys.DELIMITER_COLUMN_TYPE + maxLengthElement.getAttributes().getNamedItem(ConfigKeys.ATTR_VALUE).getNodeValue()); // add the attribute as an element in the elements list addToElementsList(attrName, attrName); appLogger.debug("added attribute: " + attrName); } } } } } } // now, loop on the elements and process them for (int i2 = 0; i2 < childLevel1.getChildNodes().getLength(); i2++) { Node childLevel2 = (Node) childLevel1.getChildNodes().item(i2); // <Sequence> element if (childLevel2.getNodeName().equals(ConfigKeys.TAG_SEQUENCE)) { sequenceElement = (Element) childLevel2; for (int i3 = 0; i3 < childLevel2.getChildNodes().getLength(); i3++) { Node childLevel3 = (Node) childLevel2.getChildNodes().item(i3); // <Element> element if (childLevel3.getNodeName().equals(ConfigKeys.TAG_ELEMENT)) { // check if single element or group if (isGroup(childLevel3)) { processGroup(childLevel3, true, null, null, docAttrMap, xsdDoc, newElementsList); // insert a new comment node with the contents of the group tag sequenceElement.insertBefore(xsdDoc.createComment(serialize(childLevel3)), childLevel3); // remove the group tag sequenceElement.removeChild(childLevel3); } else { processElement(childLevel3); } } } } } } } // add new elements // this step should be after finishing processing the whole document. when you add new elements to the document // while you are working on it, those new elements will be included in the processing. We don't need that! for (int i = 0; i < newElementsList.size(); i++) { sequenceElement.appendChild((Element) newElementsList.get(i)); } // write the new required attributes to the schema element Iterator<String> attrIter = docAttrMap.keySet().iterator(); while(attrIter.hasNext()) { Element attr = (Element) docAttrMap.get(attrIter.next()); Element newAttrElement = xsdDoc.createElement(ConfigKeys.TAG_ATTRIBUTE); appLogger.debug("appending attr. [" + attr.getAttribute(ConfigKeys.ATTR_NAME) + "]..."); newAttrElement.setAttribute(ConfigKeys.ATTR_NAME, attr.getAttribute(ConfigKeys.ATTR_NAME)); newAttrElement.setAttribute(ConfigKeys.ATTR_TYPE, attr.getAttribute(ConfigKeys.ATTR_TYPE)); schemaElement.appendChild(newAttrElement); } return xsdDoc; } /** * add a new <code>XSDElement</code> with the given <code>name</code> and <code>businessName</code> to * the elements list * * @param name * @param businessName */ private void addToElementsList(String name, String businessName) { xsdElementsList.add(new XSDElement(name, businessName)); } /** * add the given <code>XSDElement</code> to the elements list * * @param element */ private void addToElementsList(XSDElement element) { xsdElementsList.add(element); } /** * check if the <code>element</code> sent is single-value element or group * element. the comparison depends on the children of the element. 
if found one of type * <code>ComplexType</code> then it's a group element, and if of type * <code>SimpleType</code> then it's a single-value element * * @param element * @return <code>true</code> if the element is a group element, * <code>false</code> otherwise */ private boolean isGroup(Node element) { for (int i = 0; i < element.getChildNodes().getLength(); i++) { Node child = (Node) element.getChildNodes().item(i); if (child.getNodeName().equals(ConfigKeys.TAG_COMPLEX_TYPE)) { // found a ComplexType child (Group element) return true; } else if (child.getNodeName().equals(ConfigKeys.TAG_SIMPLE_TYPE)) { // found a SimpleType child (Single-Value element) return false; } } return false; /* String attrName = null; if (element.getAttributes() != null) { Node attribute = element.getAttributes().getNamedItem(XSDTransformer.ATTR_NAME); if (attribute != null) { attrName = attribute.getNodeValue(); } } if (attrName.startsWith("g")) { // group element return true; } else { // single element return false; } */ } /** * process a group element. recursively, process groups till no more group elements are found * * @param element * @param isFirstLevelGroup * @param attrMap * @param docAttrMap * @param xsdDoc * @param newElementsList */ private void processGroup(Node element, boolean isFirstLevelGroup, Node parentGroup, XSDElement parentGroupElement, HashMap<String, Object> docAttrMap, Document xsdDoc, ArrayList<Object> newElementsList) { String elementName = null; HashMap<String, Object> groupAttrMap = new HashMap<String, Object>(); HashMap<String, Object> parentGroupAttrMap = new HashMap<String, Object>(); XSDElement groupElement = null; if (element.getAttributes().getNamedItem(ConfigKeys.ATTR_NAME) != null) { elementName = element.getAttributes().getNamedItem(ConfigKeys.ATTR_NAME).getNodeValue(); } appLogger.debug("processing group [" + elementName + "]..."); groupElement = new XSDElement(elementName, elementName); // get the attributes if a non-first-level-group // attributes are: groups's own attributes + parent group's attributes if (!isFirstLevelGroup) { // get the current element (group) attributes for (int i1 = 0; i1 < element.getChildNodes().getLength(); i1++) { if (ConfigKeys.TAG_COMPLEX_TYPE.equals(element.getChildNodes().item(i1).getNodeName())) { Node complexTypeNode = element.getChildNodes().item(i1); for (int i2 = 0; i2 < complexTypeNode.getChildNodes().getLength(); i2++) { if (ConfigKeys.TAG_ATTRIBUTE.equals(complexTypeNode.getChildNodes().item(i2).getNodeName())) { appLogger.debug("add group attr: " + ((Element) complexTypeNode.getChildNodes().item(i2)).getAttribute(ConfigKeys.ATTR_NAME)); groupAttrMap.put(((Element) complexTypeNode.getChildNodes().item(i2)).getAttribute(ConfigKeys.ATTR_NAME), complexTypeNode.getChildNodes().item(i2)); docAttrMap.put(((Element) complexTypeNode.getChildNodes().item(i2)).getAttribute(ConfigKeys.ATTR_NAME), complexTypeNode.getChildNodes().item(i2)); } } } } // now, get the parent's attributes parentGroupAttrMap = groupAttrs.get(parentGroup); if (parentGroupAttrMap != null) { Iterator<String> iter = parentGroupAttrMap.keySet().iterator(); while (iter.hasNext()) { String attrName = iter.next(); groupAttrMap.put(attrName, parentGroupAttrMap.get(attrName)); } } // add the attributes to the group element that will be added to the elements list Iterator<String> itr = groupAttrMap.keySet().iterator(); while(itr.hasNext()) { groupElement.addAttribute(itr.next()); } // put the attributes in the attributes map groupAttrs.put(element, groupAttrMap); } for (int i = 0; 
i < element.getChildNodes().getLength(); i++) { Node childLevel1 = (Node) element.getChildNodes().item(i); if (childLevel1.getNodeName().equals(ConfigKeys.TAG_COMPLEX_TYPE)) { for (int j = 0; j < childLevel1.getChildNodes().getLength(); j++) { Node childLevel2 = (Node) childLevel1.getChildNodes().item(j); if (childLevel2.getNodeName().equals(ConfigKeys.TAG_SEQUENCE)) { for (int k = 0; k < childLevel2.getChildNodes().getLength(); k++) { Node childLevel3 = (Node) childLevel2.getChildNodes().item(k); if (childLevel3.getNodeName().equals(ConfigKeys.TAG_ELEMENT)) { // check if single element or group if (isGroup(childLevel3)) { // another group element.. // unfortunately, a recursion is // needed here!!! :-( processGroup(childLevel3, false, element, groupElement, docAttrMap, xsdDoc, newElementsList); } else { // reached a single-value element.. copy it under the // main sequence and apply the name<>shorname replacement processGroupElement(childLevel3, element, groupElement, isFirstLevelGroup, xsdDoc, newElementsList); } } } } } } } if (isFirstLevelGroup) { addToElementsList(groupElement); } else { parentGroupElement.addChild(groupElement); } appLogger.debug("finished processing group [" + elementName + "]."); } /** * process the sent <code>element</code> to extract/modify required * information: * 1. replace the <code>name</code> attribute with the <code>shortname</code>. * * @param element */ private void processElement(Node element) { String fieldShortName = null; String fieldColumnName = null; String fieldDataType = null; String fieldFormat = null; String fieldInputLength = null; String elementName = null; HashMap<String, String> elementProperties = new HashMap<String, String>(); if (element.getAttributes().getNamedItem(ConfigKeys.ATTR_NAME) != null) { elementName = element.getAttributes().getNamedItem(ConfigKeys.ATTR_NAME).getNodeValue(); } appLogger.debug("processing element [" + elementName + "]..."); for (int i = 0; i < element.getChildNodes().getLength(); i++) { Node childLevel1 = (Node) element.getChildNodes().item(i); if (childLevel1.getNodeName().equals(ConfigKeys.TAG_ANNOTATION)) { for (int j = 0; j < childLevel1.getChildNodes().getLength(); j++) { Node childLevel2 = (Node) childLevel1.getChildNodes().item(j); if (childLevel2.getNodeName().equals(ConfigKeys.TAG_APP_INFO)) { for (int k = 0; k < childLevel2.getChildNodes().getLength(); k++) { Node childLevel3 = (Node) childLevel2.getChildNodes().item(k); if (childLevel3.getNodeName().equals(ConfigKeys.TAG_HAS_PROPERTY)) { if (childLevel3.getAttributes() != null) { String attrName = null; Node attribute = childLevel3.getAttributes().getNamedItem(ConfigKeys.ATTR_NAME); if (attribute != null) { attrName = attribute.getNodeValue(); elementProperties.put(attrName, childLevel3.getAttributes().getNamedItem(ConfigKeys.ATTR_VALUE) .getNodeValue()); if (attrName.equals(ConfigKeys.FIELD_SHORT_NAME)) { fieldShortName = childLevel3.getAttributes().getNamedItem(ConfigKeys.ATTR_VALUE) .getNodeValue(); } else if (attrName.equals(ConfigKeys.FIELD_COLUMN_NAME)) { fieldColumnName = childLevel3.getAttributes().getNamedItem(ConfigKeys.ATTR_VALUE) .getNodeValue(); } else if (attrName.equals(ConfigKeys.FIELD_DATA_TYPE)) { fieldDataType = childLevel3.getAttributes().getNamedItem(ConfigKeys.ATTR_VALUE) .getNodeValue(); } else if (attrName.equals(ConfigKeys.FIELD_FMT)) { fieldFormat = childLevel3.getAttributes().getNamedItem(ConfigKeys.ATTR_VALUE) .getNodeValue(); } else if (attrName.equals(ConfigKeys.FIELD_INPUT_LEN)) { fieldInputLength = 
childLevel3.getAttributes().getNamedItem(ConfigKeys.ATTR_VALUE) .getNodeValue(); } } } } } } } } } // replace the name attribute with the shortname if (element.getAttributes().getNamedItem(ConfigKeys.ATTR_NAME) != null) { element.getAttributes().getNamedItem(ConfigKeys.ATTR_NAME).setNodeValue(fieldShortName); } elementProperties.put(ConfigKeys.FIELD_SINGLE_OR_MULTI, "S"); constructElementRow(elementProperties); singleValueTableColumns.add(fieldShortName + ConfigKeys.DELIMITER_COLUMN_TYPE + fieldDataType + fieldFormat + ConfigKeys.DELIMITER_COLUMN_TYPE + fieldInputLength); // add the element to elements list addToElementsList(fieldShortName, fieldColumnName); appLogger.debug("finished processing element [" + elementName + "]."); } /** * process the sent <code>element</code> to extract/modify required * information: * 1. copy the element under the main sequence * 2. replace the <code>name</code> attribute with the <code>shortname</code>. * 3. add the attributes of the parent groups (if non-first-level-group) * * @param element */ private void processGroupElement(Node element, Node parentGroup, XSDElement parentGroupElement, boolean isFirstLevelGroup, Document xsdDoc, ArrayList<Object> newElementsList) { String fieldShortName = null; String fieldColumnName = null; String fieldDataType = null; String fieldFormat = null; String fieldInputLength = null; String elementName = null; Element newElement = null; HashMap<String, String> elementProperties = new HashMap<String, String>(); ArrayList<String> tableColumns = new ArrayList<String>(); HashMap<String, Object> groupAttrMap = null; if (element.getAttributes().getNamedItem(ConfigKeys.ATTR_NAME) != null) { elementName = element.getAttributes().getNamedItem(ConfigKeys.ATTR_NAME).getNodeValue(); } appLogger.debug("processing element [" + elementName + "]..."); // 1. copy the element newElement = (Element) element.cloneNode(true); newElement.setAttribute(ConfigKeys.ATTR_MAX_OCCURS, "unbounded"); // 2. if non-first-level-group, replace the element's SimpleType tag with a ComplexType tag if (!isFirstLevelGroup) { if (((Element) newElement).getElementsByTagName(ConfigKeys.TAG_SIMPLE_TYPE).getLength() != 0) { // there should be only one tag of SimpleType Node simpleTypeNode = ((Element) newElement).getElementsByTagName(ConfigKeys.TAG_SIMPLE_TYPE).item(0); // create the new ComplexType element Element complexTypeNode = xsdDoc.createElement(ConfigKeys.TAG_COMPLEX_TYPE); complexTypeNode.setAttribute(ConfigKeys.ATTR_MIXED, "true"); // get the list of attributes for the parent group groupAttrMap = groupAttrs.get(parentGroup); Iterator<String> attrIter = groupAttrMap.keySet().iterator(); while(attrIter.hasNext()) { Element attr = (Element) groupAttrMap.get(attrIter.next()); Element newAttrElement = xsdDoc.createElement(ConfigKeys.TAG_ATTRIBUTE); appLogger.debug("adding attr. [" + attr.getAttribute(ConfigKeys.ATTR_NAME) + "]..."); newAttrElement.setAttribute(ConfigKeys.ATTR_REF, attr.getAttribute(ConfigKeys.ATTR_NAME)); newAttrElement.setAttribute(ConfigKeys.ATTR_USE, "optional"); complexTypeNode.appendChild(newAttrElement); } // replace the old SimpleType node with the new ComplexType node newElement.replaceChild(complexTypeNode, simpleTypeNode); } } // 3. 
replace the name with the shortname in the new element for (int i = 0; i < newElement.getChildNodes().getLength(); i++) { Node childLevel1 = (Node) newElement.getChildNodes().item(i); if (childLevel1.getNodeName().equals(ConfigKeys.TAG_ANNOTATION)) { for (int j = 0; j < childLevel1.getChildNodes().getLength(); j++) { Node childLevel2 = (Node) childLevel1.getChildNodes().item(j); if (childLevel2.getNodeName().equals(ConfigKeys.TAG_APP_INFO)) { for (int k = 0; k < childLevel2.getChildNodes().getLength(); k++) { Node childLevel3 = (Node) childLevel2.getChildNodes().item(k); if (childLevel3.getNodeName().equals(ConfigKeys.TAG_HAS_PROPERTY)) { if (childLevel3.getAttributes() != null) { String attrName = null; Node attribute = childLevel3.getAttributes().getNamedItem(ConfigKeys.ATTR_NAME); if (attribute != null) { attrName = attribute.getNodeValue(); elementProperties.put(attrName, childLevel3.getAttributes().getNamedItem(ConfigKeys.ATTR_VALUE) .getNodeValue()); if (attrName.equals(ConfigKeys.FIELD_SHORT_NAME)) { fieldShortName = childLevel3.getAttributes().getNamedItem(ConfigKeys.ATTR_VALUE) .getNodeValue(); } else if (attrName.equals(ConfigKeys.FIELD_COLUMN_NAME)) { fieldColumnName = childLevel3.getAttributes().getNamedItem(ConfigKeys.ATTR_VALUE) .getNodeValue(); } else if (attrName.equals(ConfigKeys.FIELD_DATA_TYPE)) { fieldDataType = childLevel3.getAttributes().getNamedItem(ConfigKeys.ATTR_VALUE) .getNodeValue(); } else if (attrName.equals(ConfigKeys.FIELD_FMT)) { fieldFormat = childLevel3.getAttributes().getNamedItem(ConfigKeys.ATTR_VALUE) .getNodeValue(); } else if (attrName.equals(ConfigKeys.FIELD_INPUT_LEN)) { fieldInputLength = childLevel3.getAttributes().getNamedItem(ConfigKeys.ATTR_VALUE) .getNodeValue(); } } } } } } } } } if (newElement.getAttributes().getNamedItem(ConfigKeys.ATTR_NAME) != null) { newElement.getAttributes().getNamedItem(ConfigKeys.ATTR_NAME).setNodeValue(fieldShortName); } // 4. save the new element to be added to the sequence list newElementsList.add(newElement); elementProperties.put(ConfigKeys.FIELD_SINGLE_OR_MULTI, "M"); constructElementRow(elementProperties); // create the MULTI-VALUE table // 0. Primary Key tableColumns.add(ConfigKeys.COLUMN_XPK_ROW + ConfigKeys.DELIMITER_COLUMN_TYPE + ConfigKeys.DATA_TYPE_XSD_STRING + ConfigKeys.DELIMITER_COLUMN_TYPE + ConfigKeys.COLUMN_XPK_ROW_LENGTH); // 1. foreign key tableColumns.add(ConfigKeys.COLUMN_FK_ROW + ConfigKeys.DELIMITER_COLUMN_TYPE + ConfigKeys.DATA_TYPE_XSD_NUMERIC); // 2. field value tableColumns.add(fieldShortName + ConfigKeys.DELIMITER_COLUMN_TYPE + fieldDataType + fieldFormat + ConfigKeys.DELIMITER_COLUMN_TYPE + fieldInputLength); // 3. 
attributes if (groupAttrMap != null) { Iterator<String> attrIter = groupAttrMap.keySet().iterator(); while (attrIter.hasNext()) { Element attr = (Element) groupAttrMap.get(attrIter.next()); tableColumns.add(attr.getAttribute(ConfigKeys.ATTR_NAME) + ConfigKeys.DELIMITER_COLUMN_TYPE + ConfigKeys.DATA_TYPE_XSD_NUMERIC); } } multiValueTablesSQL.put(sub_table_prefix.getText() + fieldShortName, constructMultiValueTableSQL( sub_table_prefix.getText() + fieldShortName, tableColumns)); // add the element to it's parent group children parentGroupElement.addChild(new XSDElement(fieldShortName, fieldColumnName)); appLogger.debug("finished processing element [" + elementName + "]."); } /** * write resulted files * * @param xsdDoc * @param docPath */ private void writeResults(Document xsdDoc, String resultsDir, String newXSDFileName, String csvFileName) { String rsDir = resultsDir + File.separator + new SimpleDateFormat("yyyyMMdd-HHmm").format(new Date()); try { File resultsDirFile = new File(rsDir); if (!resultsDirFile.exists()) { resultsDirFile.mkdirs(); } // write the XSD doc appLogger.info("writing the transformed XSD..."); Source source = new DOMSource(xsdDoc); Result result = new StreamResult(rsDir + File.separator + newXSDFileName); Transformer xformer = TransformerFactory.newInstance().newTransformer(); // xformer.setOutputProperty("indent", "yes"); xformer.transform(source, result); appLogger.info("finished writing the transformed XSD."); // write the CSV columns file appLogger.info("writing the CSV file..."); FileWriter csvWriter = new FileWriter(rsDir + File.separator + csvFileName); csvWriter.write(columnsCSV.toString()); csvWriter.close(); appLogger.info("finished writing the CSV file."); // write the master single-value table appLogger.info("writing the creation script for master table (single-values)..."); FileWriter masterTableWriter = new FileWriter(rsDir + File.separator + main_edh_table_name.getText() + ".sql"); masterTableWriter.write(constructSingleValueTableSQL(main_edh_table_name.getText(), singleValueTableColumns)); masterTableWriter.close(); appLogger.info("finished writing the creation script for master table (single-values)."); // write the multi-value tables sql appLogger.info("writing the creation script for slave tables (multi-values)..."); Iterator<String> iter = multiValueTablesSQL.keySet().iterator(); while (iter.hasNext()) { String tableName = iter.next(); String sql = multiValueTablesSQL.get(tableName); FileWriter tableSQLWriter = new FileWriter(rsDir + File.separator + tableName + ".sql"); tableSQLWriter.write(sql); tableSQLWriter.close(); } appLogger.info("finished writing the creation script for slave tables (multi-values)."); // write the single-value view appLogger.info("writing the creation script for single-value selection view..."); FileWriter singleValueViewWriter = new FileWriter(rsDir + File.separator + view_name_single.getText() + ".sql"); singleValueViewWriter.write(constructViewSQL(ConfigKeys.SQL_VIEW_SINGLE)); singleValueViewWriter.close(); appLogger.info("finished writing the creation script for single-value selection view."); // debug for (int i = 0; i < xsdElementsList.size(); i++) { getMultiView(xsdElementsList.get(i)); /*// if (xsdElementsList.get(i).getAllDescendants() != null) { // for (int j = 0; j < xsdElementsList.get(i).getAllDescendants().size(); j++) { // appLogger.debug(main_edh_table_name.getText() + "." + ConfigKeys.COLUMN_XPK_ROW // + "=" + xsdElementsList.get(i).getAllDescendants().get(j).getName() + "." 
+ ConfigKeys.COLUMN_FK_ROW); // } // } */ } } catch (Exception e) { appLogger.error(e.getMessage()); } } private String getMultiView(XSDElement element)

    Read the article

  • I am having a problem with a class cast exception. Can anyone please help me out?

    - by Piyush
    This is my code: package com.example.userpage; import android.app.Activity; import android.content.Intent; import android.os.Bundle; import android.view.View; import android.widget.Button; import android.widget.EditText; import android.widget.TextView; public class UserPage extends Activity { String tv,tv1; EditText name,pass; TextView x,y; /** Called when the activity is first created. */ @Override public void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); setContentView(R.layout.main); Button button = (Button) findViewById(R.id.widget44); button.setOnClickListener(new View.OnClickListener() { public void onClick(View v) { name.setText(" "); pass.setText(" "); } }); x = (TextView) findViewById(R.id.widget46); y = (TextView) findViewById(R.id.widget47); name = (EditText)findViewById(R.id.widget41); pass = (EditText)findViewById(R.id.widget42); Button button1 = (Button) findViewById(R.id.widget45); button1.setOnClickListener(new View.OnClickListener() { public void onClick(View v) { tv= name.getText().toString(); tv1 = pass.getText().toString(); x.setText(tv); y.setText(tv1); } }); } } And this is my log cat: 02-16 12:24:30.488: DEBUG/AndroidRuntime(973): >>>>>>>>>>>>>> AndroidRuntime START <<<<<<<<<<<<<< 02-16 12:24:30.488: DEBUG/AndroidRuntime(973): CheckJNI is ON 02-16 12:24:31.208: DEBUG/AndroidRuntime(973): --- registering native functions --- 02-16 12:24:33.498: DEBUG/AndroidRuntime(973): Shutting down VM 02-16 12:24:33.537: DEBUG/dalvikvm(973): Debugger has detached; object registry had 1 entries 02-16 12:24:33.537: INFO/AndroidRuntime(973): NOTE: attach of thread 'Binder Thread #3' failed 02-16 12:24:34.917: DEBUG/AndroidRuntime(981): >>>>>>>>>>>>>> AndroidRuntime START <<<<<<<<<<<<<< 02-16 12:24:34.927: DEBUG/AndroidRuntime(981): CheckJNI is ON 02-16 12:24:35.617: DEBUG/AndroidRuntime(981): --- registering native functions --- 02-16 12:24:38.029: INFO/ActivityManager(67): Starting activity: Intent { act=android.intent.action.MAIN cat=[android.intent.category.LAUNCHER] flg=0x10000000 cmp=com.example.userpage/.UserPage } 02-16 12:24:38.129: DEBUG/AndroidRuntime(981): Shutting down VM 02-16 12:24:38.160: DEBUG/dalvikvm(981): Debugger has detached; object registry had 1 entries 02-16 12:24:38.168: INFO/AndroidRuntime(981): NOTE: attach of thread 'Binder Thread #3' failed 02-16 12:25:12.028: DEBUG/AndroidRuntime(990): >>>>>>>>>>>>>> AndroidRuntime START <<<<<<<<<<<<<< 02-16 12:25:12.038: DEBUG/AndroidRuntime(990): CheckJNI is ON 02-16 12:25:12.708: DEBUG/AndroidRuntime(990): --- registering native functions --- 02-16 12:25:15.178: DEBUG/dalvikvm(176): GC_EXPLICIT freed 114 objects / 5880 bytes in 115ms 02-16 12:25:15.318: DEBUG/PackageParser(67): Scanning package: /data/app/vmdl25170.tmp 02-16 12:25:15.588: INFO/PackageManager(67): Removing non-system package:com.example.userpage 02-16 12:25:15.597: INFO/ActivityManager(67): Force stopping package com.example.userpage uid=10036 02-16 12:25:15.648: INFO/Process(67): Sending signal. 
PID: 916 SIG: 9 02-16 12:25:15.877: INFO/UsageStats(67): Unexpected resume of com.android.launcher while already resumed in com.example.userpage 02-16 12:25:17.028: WARN/InputManagerService(67): Window already focused, ignoring focus gain of: com.android.internal.view.IInputMethodClient$Stub$Proxy@4400ecf8 02-16 12:25:17.928: DEBUG/PackageManager(67): Scanning package com.example.userpage 02-16 12:25:17.949: INFO/PackageManager(67): Package com.example.userpage codePath changed from /data/app/com.example.userpage-1.apk to /data/app/com.example.userpage-2.apk; Retaining data and using new 02-16 12:25:17.987: INFO/PackageManager(67): /data/app/com.example.userpage-2.apk changed; unpacking 02-16 12:25:18.037: DEBUG/installd(35): DexInv: --- BEGIN '/data/app/com.example.userpage-2.apk' --- 02-16 12:25:18.737: DEBUG/dalvikvm(997): DexOpt: load 81ms, verify 112ms, opt 6ms 02-16 12:25:18.768: DEBUG/installd(35): DexInv: --- END '/data/app/com.example.userpage-2.apk' (success) --- 02-16 12:25:18.799: INFO/ActivityManager(67): Force stopping package com.example.userpage uid=10036 02-16 12:25:18.808: WARN/PackageManager(67): Code path for pkg : com.example.userpage changing from /data/app/com.example.userpage-1.apk to /data/app/com.example.userpage-2.apk 02-16 12:25:18.839: WARN/PackageManager(67): Resource path for pkg : com.example.userpage changing from /data/app/com.example.userpage-1.apk to /data/app/com.example.userpage-2.apk 02-16 12:25:18.868: DEBUG/PackageManager(67): Activities: com.example.userpage.UserPage 02-16 12:25:19.297: INFO/installd(35): move /data/dalvik-cache/data@[email protected]@classes.dex -> /data/dalvik-cache/data@[email protected]@classes.dex 02-16 12:25:19.297: DEBUG/PackageManager(67): New package installed in /data/app/com.example.userpage-2.apk 02-16 12:25:19.598: DEBUG/dalvikvm(67): GC_FOR_MALLOC freed 7979 objects / 516856 bytes in 246ms 02-16 12:25:20.498: INFO/ActivityManager(67): Force stopping package com.example.userpage uid=10036 02-16 12:25:20.708: DEBUG/dalvikvm(129): GC_EXPLICIT freed 124 objects / 5672 bytes in 157ms 02-16 12:25:21.838: DEBUG/dalvikvm(67): GC_EXPLICIT freed 4208 objects / 236264 bytes in 419ms 02-16 12:25:21.918: WARN/RecognitionManagerService(67): no available voice recognition services found 02-16 12:25:22.127: INFO/installd(35): unlink /data/dalvik-cache/data@[email protected]@classes.dex 02-16 12:25:22.478: DEBUG/AndroidRuntime(990): Shutting down VM 02-16 12:25:22.488: DEBUG/dalvikvm(990): Debugger has detached; object registry had 1 entries 02-16 12:25:22.588: INFO/AndroidRuntime(990): NOTE: attach of thread 'Binder Thread #3' failed 02-16 12:25:24.137: DEBUG/AndroidRuntime(1003): >>>>>>>>>>>>>> AndroidRuntime START <<<<<<<<<<<<<< 02-16 12:25:24.147: DEBUG/AndroidRuntime(1003): CheckJNI is ON 02-16 12:25:24.817: DEBUG/AndroidRuntime(1003): --- registering native functions --- 02-16 12:25:27.450: INFO/ActivityManager(67): Starting activity: Intent { act=android.intent.action.MAIN cat=[android.intent.category.LAUNCHER] flg=0x10000000 cmp=com.example.userpage/.UserPage } 02-16 12:25:27.628: DEBUG/AndroidRuntime(1003): Shutting down VM 02-16 12:25:27.780: INFO/AndroidRuntime(1003): NOTE: attach of thread 'Binder Thread #3' failed 02-16 12:25:28.018: DEBUG/dalvikvm(1003): Debugger has detached; object registry had 1 entries 02-16 12:25:28.148: INFO/ActivityManager(67): Start proc com.example.userpage for activity com.example.userpage/.UserPage: pid=1010 uid=10036 gids={} 02-16 12:25:30.308: DEBUG/AndroidRuntime(1010): Shutting down VM 
02-16 12:25:30.308: WARN/dalvikvm(1010): threadid=1: thread exiting with uncaught exception (group=0x4001d800) 02-16 12:25:30.388: ERROR/AndroidRuntime(1010): FATAL EXCEPTION: main 02-16 12:25:30.388: ERROR/AndroidRuntime(1010): java.lang.RuntimeException: Unable to start activity ComponentInfo{com.example.userpage/com.example.userpage.UserPage}: java.lang.ClassCastException: android.widget.TextView 02-16 12:25:30.388: ERROR/AndroidRuntime(1010): at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2663) 02-16 12:25:30.388: ERROR/AndroidRuntime(1010): at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2679) 02-16 12:25:30.388: ERROR/AndroidRuntime(1010): at android.app.ActivityThread.access$2300(ActivityThread.java:125) 02-16 12:25:30.388: ERROR/AndroidRuntime(1010): at android.app.ActivityThread$H.handleMessage(ActivityThread.java:2033) 02-16 12:25:30.388: ERROR/AndroidRuntime(1010): at android.os.Handler.dispatchMessage(Handler.java:99) 02-16 12:25:30.388: ERROR/AndroidRuntime(1010): at android.os.Looper.loop(Looper.java:123) 02-16 12:25:30.388: ERROR/AndroidRuntime(1010): at android.app.ActivityThread.main(ActivityThread.java:4627) 02-16 12:25:30.388: ERROR/AndroidRuntime(1010): at java.lang.reflect.Method.invokeNative(Native Method) 02-16 12:25:30.388: ERROR/AndroidRuntime(1010): at java.lang.reflect.Method.invoke(Method.java:521) 02-16 12:25:30.388: ERROR/AndroidRuntime(1010): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:868) 02-16 12:25:30.388: ERROR/AndroidRuntime(1010): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:626) 02-16 12:25:30.388: ERROR/AndroidRuntime(1010): at dalvik.system.NativeStart.main(Native Method) 02-16 12:25:30.388: ERROR/AndroidRuntime(1010): Caused by: java.lang.ClassCastException: android.widget.TextView 02-16 12:25:30.388: ERROR/AndroidRuntime(1010): at com.example.userpage.UserPage.onCreate(UserPage.java:35) 02-16 12:25:30.388: ERROR/AndroidRuntime(1010): at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1047) 02-16 12:25:30.388: ERROR/AndroidRuntime(1010): at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2627) 02-16 12:25:30.388: ERROR/AndroidRuntime(1010): ... 11 more 02-16 12:25:30.438: WARN/ActivityManager(67): Force finishing activity com.example.userpage/.UserPage 02-16 12:25:31.088: WARN/ActivityManager(67): Activity pause timeout for HistoryRecord{43f164f8 com.example.userpage/.UserPage} 02-16 12:25:32.588: DEBUG/dalvikvm(292): GC_EXPLICIT freed 46 objects / 2240 bytes in 6282ms 02-16 12:25:35.267: INFO/Process(1010): Sending signal. PID: 1010 SIG: 9 02-16 12:25:35.468: WARN/InputManagerService(67): Window already focused, ignoring focus gain of: com.android.internal.view.IInputMethodClient$Stub$Proxy@43e60a90 02-16 12:25:35.900: INFO/ActivityManager(67): Process com.example.userpage (pid 1010) has died. 
02-16 12:25:38.278: DEBUG/dalvikvm(176): GC_EXPLICIT freed 172 objects / 12280 bytes in 127ms 02-16 12:25:43.011: WARN/ActivityManager(67): Activity destroy timeout for HistoryRecord{43f164f8 com.example.userpage/.UserPage} 02-16 12:28:12.698: DEBUG/AndroidRuntime(1019): >>>>>>>>>>>>>> AndroidRuntime START <<<<<<<<<<<<<< 02-16 12:28:12.711: DEBUG/AndroidRuntime(1019): CheckJNI is ON 02-16 12:28:13.367: DEBUG/AndroidRuntime(1019): --- registering native functions --- 02-16 12:28:15.998: DEBUG/dalvikvm(176): GC_EXPLICIT freed 114 objects / 5888 bytes in 183ms 02-16 12:28:16.539: DEBUG/PackageParser(67): Scanning package: /data/app/vmdl25171.tmp 02-16 12:28:16.867: INFO/PackageManager(67): Removing non-system package:com.example.userpage 02-16 12:28:16.867: INFO/ActivityManager(67): Force stopping package com.example.userpage uid=10036 02-16 12:28:17.277: DEBUG/PackageManager(67): Scanning package com.example.userpage 02-16 12:28:17.308: INFO/PackageManager(67): Package com.example.userpage codePath changed from /data/app/com.example.userpage-2.apk to /data/app/com.example.userpage-1.apk; Retaining data and using new 02-16 12:28:17.328: INFO/PackageManager(67): /data/app/com.example.userpage-1.apk changed; unpacking 02-16 12:28:17.367: DEBUG/installd(35): DexInv: --- BEGIN '/data/app/com.example.userpage-1.apk' --- 02-16 12:28:18.357: DEBUG/dalvikvm(1026): DexOpt: load 85ms, verify 114ms, opt 6ms 02-16 12:28:18.398: DEBUG/installd(35): DexInv: --- END '/data/app/com.example.userpage-1.apk' (success) --- 02-16 12:28:18.428: INFO/ActivityManager(67): Force stopping package com.example.userpage uid=10036 02-16 12:28:18.438: WARN/PackageManager(67): Code path for pkg : com.example.userpage changing from /data/app/com.example.userpage-2.apk to /data/app/com.example.userpage-1.apk 02-16 12:28:18.477: WARN/PackageManager(67): Resource path for pkg : com.example.userpage changing from /data/app/com.example.userpage-2.apk to /data/app/com.example.userpage-1.apk 02-16 12:28:18.477: DEBUG/PackageManager(67): Activities: com.example.userpage.UserPage 02-16 12:28:18.977: INFO/installd(35): move /data/dalvik-cache/data@[email protected]@classes.dex -> /data/dalvik-cache/data@[email protected]@classes.dex 02-16 12:28:18.988: DEBUG/PackageManager(67): New package installed in /data/app/com.example.userpage-1.apk 02-16 12:28:19.528: DEBUG/dalvikvm(67): GC_FOR_MALLOC freed 6733 objects / 459728 bytes in 211ms 02-16 12:28:20.138: INFO/ActivityManager(67): Force stopping package com.example.userpage uid=10036 02-16 12:28:20.368: DEBUG/dalvikvm(129): GC_EXPLICIT freed 892 objects / 48744 bytes in 175ms 02-16 12:28:21.317: WARN/RecognitionManagerService(67): no available voice recognition services found 02-16 12:28:22.827: DEBUG/dalvikvm(67): GC_EXPLICIT freed 3877 objects / 241128 bytes in 452ms 02-16 12:28:22.979: INFO/installd(35): unlink /data/dalvik-cache/data@[email protected]@classes.dex 02-16 12:28:23.277: DEBUG/AndroidRuntime(1019): Shutting down VM 02-16 12:28:23.307: DEBUG/dalvikvm(1019): Debugger has detached; object registry had 1 entries 02-16 12:28:23.328: INFO/AndroidRuntime(1019): NOTE: attach of thread 'Binder Thread #3' failed 02-16 12:28:24.989: DEBUG/AndroidRuntime(1032): >>>>>>>>>>>>>> AndroidRuntime START <<<<<<<<<<<<<< 02-16 12:28:24.989: DEBUG/AndroidRuntime(1032): CheckJNI is ON 02-16 12:28:25.888: DEBUG/AndroidRuntime(1032): --- registering native functions --- 02-16 12:28:28.588: INFO/ActivityManager(67): Starting activity: Intent { act=android.intent.action.MAIN 
cat=[android.intent.category.LAUNCHER] flg=0x10000000 cmp=com.example.userpage/.UserPage } 02-16 12:28:28.888: DEBUG/AndroidRuntime(1032): Shutting down VM 02-16 12:28:28.997: DEBUG/dalvikvm(1032): Debugger has detached; object registry had 1 entries 02-16 12:28:29.038: INFO/AndroidRuntime(1032): NOTE: attach of thread 'Binder Thread #3' failed 02-16 12:28:30.417: INFO/ActivityManager(67): Start proc com.example.userpage for activity com.example.userpage/.UserPage: pid=1039 uid=10036 gids={} 02-16 12:28:32.588: DEBUG/AndroidRuntime(1039): Shutting down VM 02-16 12:28:32.598: WARN/dalvikvm(1039): threadid=1: thread exiting with uncaught exception (group=0x4001d800) 02-16 12:28:32.648: ERROR/AndroidRuntime(1039): FATAL EXCEPTION: main 02-16 12:28:32.648: ERROR/AndroidRuntime(1039): java.lang.RuntimeException: Unable to start activity ComponentInfo{com.example.userpage/com.example.userpage.UserPage}: java.lang.ClassCastException: android.widget.TextView 02-16 12:28:32.648: ERROR/AndroidRuntime(1039): at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2663) 02-16 12:28:32.648: ERROR/AndroidRuntime(1039): at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2679) 02-16 12:28:32.648: ERROR/AndroidRuntime(1039): at android.app.ActivityThread.access$2300(ActivityThread.java:125) 02-16 12:28:32.648: ERROR/AndroidRuntime(1039): at android.app.ActivityThread$H.handleMessage(ActivityThread.java:2033) 02-16 12:28:32.648: ERROR/AndroidRuntime(1039): at android.os.Handler.dispatchMessage(Handler.java:99) 02-16 12:28:32.648: ERROR/AndroidRuntime(1039): at android.os.Looper.loop(Looper.java:123) 02-16 12:28:32.648: ERROR/AndroidRuntime(1039): at android.app.ActivityThread.main(ActivityThread.java:4627) 02-16 12:28:32.648: ERROR/AndroidRuntime(1039): at java.lang.reflect.Method.invokeNative(Native Method) 02-16 12:28:32.648: ERROR/AndroidRuntime(1039): at java.lang.reflect.Method.invoke(Method.java:521) 02-16 12:28:32.648: ERROR/AndroidRuntime(1039): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:868) 02-16 12:28:32.648: ERROR/AndroidRuntime(1039): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:626) 02-16 12:28:32.648: ERROR/AndroidRuntime(1039): at dalvik.system.NativeStart.main(Native Method) 02-16 12:28:32.648: ERROR/AndroidRuntime(1039): Caused by: java.lang.ClassCastException: android.widget.TextView 02-16 12:28:32.648: ERROR/AndroidRuntime(1039): at com.example.userpage.UserPage.onCreate(UserPage.java:34) 02-16 12:28:32.648: ERROR/AndroidRuntime(1039): at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1047) 02-16 12:28:32.648: ERROR/AndroidRuntime(1039): at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2627) 02-16 12:28:32.648: ERROR/AndroidRuntime(1039): ... 11 more 02-16 12:28:32.698: WARN/ActivityManager(67): Force finishing activity com.example.userpage/.UserPage 02-16 12:28:32.967: DEBUG/dalvikvm(292): GC_EXPLICIT freed 46 objects / 2240 bytes in 6840ms 02-16 12:28:33.247: WARN/ActivityManager(67): Activity pause timeout for HistoryRecord{43ee7b70 com.example.userpage/.UserPage} 02-16 12:28:36.947: INFO/Process(1039): Sending signal. PID: 1039 SIG: 9 02-16 12:28:37.017: INFO/ActivityManager(67): Process com.example.userpage (pid 1039) has died. 
02-16 12:28:37.128: WARN/InputManagerService(67): Window already focused, ignoring focus gain of: com.android.internal.view.IInputMethodClient$Stub$Proxy@43e872f8 02-16 12:28:42.087: DEBUG/dalvikvm(176): GC_EXPLICIT freed 156 objects / 11488 bytes in 145ms 02-16 12:28:45.391: WARN/ActivityManager(67): Activity destroy timeout for HistoryRecord{43ee7b70 com.example.userpage/.UserPage} 02-16 12:28:47.177: DEBUG/SntpClient(67): request time failed: java.net.SocketException: Address family not supported by protocol
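A note on reading the trace: both crashes bottom out in a ClassCastException raised from UserPage.onCreate (UserPage.java lines 34-35), i.e. at the (TextView) casts on widget46/widget47, which suggests main.xml declares those ids as some other widget type. Purely as a hedged diagnostic sketch (the helper name is invented, and it assumes the ids exist in main.xml), the cast site could be instrumented like this before deciding whether to change the cast or the layout tag:

    import android.util.Log;
    import android.view.View;
    import android.widget.TextView;

    // Hypothetical helper inside UserPage: logs the concrete class that main.xml
    // resolves an id to, then performs the same cast the activity already does.
    private TextView findTextView(int id) {
        View v = findViewById(id);
        Log.d("UserPage", getResources().getResourceEntryName(id)
                + " resolved to " + v.getClass().getName());
        return (TextView) v; // still throws if main.xml does not declare a <TextView> for this id
    }

Calling x = findTextView(R.id.widget46); and y = findTextView(R.id.widget47); in onCreate would show in LogCat which widget type is actually being inflated for those ids.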

    Read the article

  • Java Cloud Service Integration to REST Service

    - by Jani Rautiainen
    Java Cloud Service (JCS) provides a platform to develop and deploy business applications in the cloud. In Fusion Applications Cloud deployments customers do not have the option to deploy custom applications developed with JDeveloper, to ensure the integrity and supportability of the hosted application service. Instead, custom applications can be deployed to JCS and integrated with the Fusion Applications Cloud instance. This series of articles will go through the features of JCS, provide end-to-end examples on how to develop and deploy applications on JCS, and show how to integrate them with the Fusion Applications instance. In this article a custom application integrating with a REST service will be implemented. We will use the REST services provided by Taleo as an example; however, the same approach will work with any REST service. In this example the data from the REST service is used to populate a dynamic table.

    Pre-requisites

    Access to a Cloud instance
    In order to deploy the application, access to a JCS instance is needed; a free trial JCS instance can be obtained from the Oracle Cloud site. To register you will need a credit card, even if the credit card will not be charged. To register simply click "Try it" and choose the "Java" option. The confirmation email will contain the connection details. See this video for an example of the registration. Once the request is processed you will be assigned two service instances: Java and Database. Applications deployed to JCS must use Oracle Database Cloud Service as their underlying database, so when the JCS instance is created a database instance is associated with it using a JDBC data source. The cloud services can be monitored and managed through the web UI. For details refer to Getting Started with Oracle Cloud.

    JDeveloper
    JDeveloper contains Cloud-specific features related to e.g. connection and deployment. To use these features, download JDeveloper from the JDeveloper download site by clicking the "Download JDeveloper 11.1.1.7.1 for ADF deployment on Oracle Cloud" link; this version of JDeveloper has the JCS integration features that will be used in this article. For versions that do not include the Cloud integration features, the Oracle Java Cloud Service SDK or the JCS Java Console can be used for deployment. For details on installing and configuring JDeveloper refer to the installation guide. For details on the SDK refer to Using the Command-Line Interface to Monitor Oracle Java Cloud Service and Using the Command-Line Interface to Manage Oracle Java Cloud Service.

    Access to a local database
    The database associated with the JCS instance cannot be connected to with JDBC. Since creating ADFbc business components requires a JDBC connection, we will need access to a local database.

    3rd party libraries
    This example will use some 3rd party libraries for implementing the REST service call and processing the input / output content. Other libraries may also be used; however, these are tested to work.

    Jersey 1.x
    The Jersey library will be used as a client to make the call to the REST service. The JCS documentation for supported specifications states: Java API for RESTful Web Services (JAX-RS) 1.1. So Jersey 1.x will be used. Download the single-JAR Jersey bundle; in this example the Jersey 1.18 JAR bundle is used.

    Json-simple
    The json-simple library will be used to process the JSON objects. Download the JAR file; in this example json-simple-1.1.1.jar is used.

    Accessing data in Taleo
    Before implementing the application it is beneficial to familiarize oneself with the data in Taleo. 
Easiest way to do this is by using a RESTClient on your browser. Once added to the browser you can access the UI: The client can be used to call the REST services to test the URLs and data before adding them into the application. First derive the base URL for the service this can be done with: Method: GET URL: https://tbe.taleo.net/MANAGER/dispatcher/api/v1/serviceUrl/<company name> The response will contain the base URL to be used for the service calls for the company. Next obtain authentication token with: Method: POST URL: https://ch.tbe.taleo.net/CH07/ats/api/v1/login?orgCode=<company>&userName=<user name>&password=<password> The response includes an authentication token that can be used for few hours to authenticate with the service: {   "response": {     "authToken": "webapi26419680747505890557"   },   "status": {     "detail": {},     "success": true   } } To authenticate the service calls navigate to "Headers -> Custom Header": And add a new request header with: Name: Cookie Value: authToken=webapi26419680747505890557 Once authentication token is defined the tool can be used to invoke REST services; for example: Method: GET URL: https://ch.tbe.taleo.net/CH07/ats/api/v1/object/candidate/search.xml?status=16 This data will be used on the application to be created. For details on the Taleo REST services refer to the Taleo Business Edition REST API Guide. Create Application First Fusion Web Application is created and configured. Start JDeveloper and click "New Application": Application Name: JcsRestDemo Application Package Prefix: oracle.apps.jcs.test Application Template: Fusion Web Application (ADF) Configure Local Cloud Connection Follow the steps documented in the "Java Cloud Service ADF Web Application" article to configure a local database connection needed to create the ADFbc objects. Configure Libraries Add the 3rd party libraries into the class path. Create the following directory and copy the jar files into it: <JDEV_USER_HOME>/JcsRestDemo/lib  Select the "Model" project, navigate "Application -> Project Properties -> Libraries and Classpath -> Add JAR / Directory" and add the 2 3rd party libraries: Accessing Data from Taleo To access data from Taleo using the REST service the 3rd party libraries will be used. 2 Java classes are implemented, one representing the Candidate object and another for accessing the Taleo repository Candidate Candidate object is a POJO object used to represent the candidate data obtained from the Taleo repository. The data obtained will be used to populate the ADFbc object used to display the data on the UI. The candidate object contains simply the variables we obtain using the REST services and the getters / setters for them: Navigate "New -> General -> Java -> Java Class", enter "Candidate" as the name and create it in the package "oracle.apps.jcs.test.model".  
Copy / paste the following as the content: import oracle.jbo.domain.Number; public class Candidate { private Number candId; private String firstName; private String lastName; public Candidate() { super(); } public Candidate(Number candId, String firstName, String lastName) { super(); this.candId = candId; this.firstName = firstName; this.lastName = lastName; } public void setCandId(Number candId) { this.candId = candId; } public Number getCandId() { return candId; } public void setFirstName(String firstName) { this.firstName = firstName; } public String getFirstName() { return firstName; } public void setLastName(String lastName) { this.lastName = lastName; } public String getLastName() { return lastName; } } Taleo Repository Taleo repository class will interact with the Taleo REST services. The logic will query data from Taleo and populate Candidate objects with the data. The Candidate object will then be used to populate the ADFbc object used to display data on the UI. Navigate "New -> General -> Java -> Java Class", enter "TaleoRepository" as the name and create it in the package "oracle.apps.jcs.test.model".  Copy / paste the following as the content (for details of the implementation refer to the documentation in the code): import com.sun.jersey.api.client.Client; import com.sun.jersey.api.client.ClientResponse; import com.sun.jersey.api.client.WebResource; import com.sun.jersey.core.util.MultivaluedMapImpl; import java.io.StringReader; import java.util.ArrayList; import java.util.Iterator; import java.util.List; import java.util.Map; import javax.ws.rs.core.MediaType; import javax.ws.rs.core.MultivaluedMap; import oracle.jbo.domain.Number; import org.json.simple.JSONArray; import org.json.simple.JSONObject; import org.json.simple.parser.JSONParser; /** * This class interacts with the Taleo REST services */ public class TaleoRepository { /** * Connection information needed to access the Taleo services */ String _company = null; String _userName = null; String _password = null; /** * Jersey client used to access the REST services */ Client _client = null; /** * Parser for processing the JSON objects used as * input / output for the services */ JSONParser _parser = null; /** * The base url for constructing the REST URLs. This is obtained * from Taleo with a service call */ String _baseUrl = null; /** * Authentication token obtained from Taleo using a service call. * The token can be used to authenticate on subsequent * service calls. The token will expire in 4 hours */ String _authToken = null; /** * Static url that can be used to obtain the url used to construct * service calls for a given company */ private static String _taleoUrl = "https://tbe.taleo.net/MANAGER/dispatcher/api/v1/serviceUrl/"; /** * Default constructor for the repository * Authentication details are passed as parameters and used to generate * authentication token. Note that each service call will * generate its own token. This is done to avoid dealing with the expiry * of the token. Also only 20 tokens are allowed per user simultaneously. * So instead for each call there is login / logout. * * @param company the company for which the service calls are made * @param userName the user name to authenticate with * @param password the password to authenticate with. 
*/ public TaleoRepository(String company, String userName, String password) { super(); _company = company; _userName = userName; _password = password; _client = Client.create(); _parser = new JSONParser(); _baseUrl = getBaseUrl(); } /** * This obtains the base url for a company to be used * to construct the urls for service calls * @return base url for the service calls */ private String getBaseUrl() { String result = null; if (null != _baseUrl) { result = _baseUrl; } else { try { String company = _company; WebResource resource = _client.resource(_taleoUrl + company); ClientResponse response = resource.type(MediaType.APPLICATION_FORM_URLENCODED_TYPE).get(ClientResponse.class); String entity = response.getEntity(String.class); JSONObject jsonObject = (JSONObject)_parser.parse(new StringReader(entity)); JSONObject jsonResponse = (JSONObject)jsonObject.get("response"); result = (String)jsonResponse.get("URL"); } catch (Exception ex) { ex.printStackTrace(); } } return result; } /** * Generates authentication token, that can be used to authenticate on * subsequent service calls. Note that each service call will * generate its own token. This is done to avoid dealing with the expiry * of the token. Also only 20 tokens are allowed per user simultaneously. * So instead for each call there is login / logout. * @return authentication token that can be used to authenticate on * subsequent service calls */ private String login() { String result = null; try { MultivaluedMap<String, String> formData = new MultivaluedMapImpl(); formData.add("orgCode", _company); formData.add("userName", _userName); formData.add("password", _password); WebResource resource = _client.resource(_baseUrl + "login"); ClientResponse response = resource.type(MediaType.APPLICATION_FORM_URLENCODED_TYPE).post(ClientResponse.class, formData); String entity = response.getEntity(String.class); JSONObject jsonObject = (JSONObject)_parser.parse(new StringReader(entity)); JSONObject jsonResponse = (JSONObject)jsonObject.get("response"); result = (String)jsonResponse.get("authToken"); } catch (Exception ex) { throw new RuntimeException("Unable to login ", ex); } if (null == result) throw new RuntimeException("Unable to login "); return result; } /** * Releases a authentication token. Each call to login must be followed * by call to logout after the processing is done. This is required as * the tokens are limited to 20 per user and if not released the tokens * will only expire after 4 hours. * @param authToken */ private void logout(String authToken) { WebResource resource = _client.resource(_baseUrl + "logout"); resource.header("cookie", "authToken=" + authToken).post(ClientResponse.class); } /** * This method is used to obtain a list of candidates using a REST * service call. At this example the query is hard coded to query * based on status. 
The url constructed to access the service is: * <_baseUrl>/object/candidate/search.xml?status=16 * @return List of candidates obtained with the service call */ public List<Candidate> getCandidates() { List<Candidate> result = new ArrayList<Candidate>(); try { // First login, note that in finally block we must have logout _authToken = "authToken=" + login(); /** * Construct the URL, the resulting url will be: * <_baseUrl>/object/candidate/search.xml?status=16 */ MultivaluedMap<String, String> formData = new MultivaluedMapImpl(); formData.add("status", "16"); JSONArray searchResults = (JSONArray)getTaleoResource("object/candidate/search", "searchResults", formData); /** * Process the results, the resulting JSON object is something like * this (simplified for readability): * * { * "response": * { * "searchResults": * [ * { * "candidate": * { * "candId": 211, * "firstName": "Mary", * "lastName": "Stochi", * logic here will find the candidate object(s), obtain the desired * data from them, construct a Candidate object based on the data * and add it to the results. */ for (Object object : searchResults) { JSONObject temp = (JSONObject)object; JSONObject candidate = (JSONObject)findObject(temp, "candidate"); Long candIdTemp = (Long)candidate.get("candId"); Number candId = (null == candIdTemp ? null : new Number(candIdTemp)); String firstName = (String)candidate.get("firstName"); String lastName = (String)candidate.get("lastName"); result.add(new Candidate(candId, firstName, lastName)); } } catch (Exception ex) { ex.printStackTrace(); } finally { if (null != _authToken) logout(_authToken); } return result; } /** * Convenience method to construct url for the service call, invoke the * service and obtain a resource from the response * @param path the path for the service to be invoked. This is combined * with the base url to construct a url for the service * @param resource the key for the object in the response that will be * obtained * @param parameters any parameters used for the service call. The call * is slightly different depending whether parameters exist or not. * @return the resource from the response for the service call */ private Object getTaleoResource(String path, String resource, MultivaluedMap<String, String> parameters) { Object result = null; try { WebResource webResource = _client.resource(_baseUrl + path); ClientResponse response = null; if (null == parameters) response = webResource.header("cookie", _authToken).get(ClientResponse.class); else response = webResource.queryParams(parameters).header("cookie", _authToken).get(ClientResponse.class); String entity = response.getEntity(String.class); JSONObject jsonObject = (JSONObject)_parser.parse(new StringReader(entity)); result = findObject(jsonObject, resource); } catch (Exception ex) { ex.printStackTrace(); } return result; } /** * Convenience method to recursively find a object with an key * traversing down from a given root object. This will traverse a * JSONObject / JSONArray recursively to find a matching key, if found * the object with the key is returned. 
* @param root root object which contains the key searched for * @param key the key for the object to search for * @return the object matching the key */ private Object findObject(Object root, String key) { Object result = null; if (root instanceof JSONObject) { JSONObject rootJSON = (JSONObject)root; if (rootJSON.containsKey(key)) { result = rootJSON.get(key); } else { Iterator children = rootJSON.entrySet().iterator(); while (children.hasNext()) { Map.Entry entry = (Map.Entry)children.next(); Object child = entry.getValue(); if (child instanceof JSONObject || child instanceof JSONArray) { result = findObject(child, key); if (null != result) break; } } } } else if (root instanceof JSONArray) { JSONArray rootJSON = (JSONArray)root; for (Object child : rootJSON) { if (child instanceof JSONObject || child instanceof JSONArray) { result = findObject(child, key); if (null != result) break; } } } return result; } }   Creating Business Objects While JCS application can be created without a local database, the local database is required when using ADFbc objects even if database objects are not referred. For this example we will create a "Transient" view object that will be programmatically populated based the data obtained from Taleo REST services. Creating ADFbc objects Choose the "Model" project and navigate "New -> Business Tier : ADF Business Components : View Object". On the "Initialize Business Components Project" choose the local database connection created in previous step. On Step 1 enter "JcsRestDemoVO" on the "Name" and choose "Rows populated programmatically, not based on query": On step 2 create the following attributes: CandId Type: Number Updatable: Always Key Attribute: checked Name Type: String Updatable: Always On steps 3 and 4 accept defaults and click "Next".  On step 5 check the "Application Module" checkbox and enter "JcsRestDemoAM" as the name: Click "Finish" to generate the objects. Populating the VO To display the data on the UI the "transient VO" is populated programmatically based on the data obtained from the Taleo REST services. Open the "JcsRestDemoVOImpl.java". Copy / paste the following as the content (for details of the implementation refer to the documentation in the code): import java.sql.ResultSet; import java.util.List; import java.util.ListIterator; import oracle.jbo.server.ViewObjectImpl; import oracle.jbo.server.ViewRowImpl; import oracle.jbo.server.ViewRowSetImpl; // --------------------------------------------------------------------- // --- File generated by Oracle ADF Business Components Design Time. // --- Tue Feb 18 09:40:25 PST 2014 // --- Custom code may be added to this class. // --- Warning: Do not modify method signatures of generated methods. // --------------------------------------------------------------------- public class JcsRestDemoVOImpl extends ViewObjectImpl { /** * This is the default constructor (do not remove). */ public JcsRestDemoVOImpl() { } @Override public void executeQuery() { /** * For some reason we need to reset everything, otherwise * 2nd entry to the UI screen may fail with * "java.util.NoSuchElementException" in createRowFromResultSet * call to "candidates.next()". I am not sure why this is happening * as the Iterator is new and "hasNext" is true at the point * of the execution. My theory is that since the iterator object is * exactly the same the VO cache somehow reuses the iterator including * the pointer that has already exhausted the iterable elements on the * previous run. 
Working around the issue * here by cleaning out everything on the VO every time before query * is executed on the VO. */ getViewDef().setQuery(null); getViewDef().setSelectClause(null); setQuery(null); this.reset(); this.clearCache(); super.executeQuery(); } /** * executeQueryForCollection - overridden for custom java data source support. */ protected void executeQueryForCollection(Object qc, Object[] params, int noUserParams) { /** * Integrate with the Taleo REST services using TaleoRepository class. * A list of candidates matching a hard coded query is obtained. */ TaleoRepository repository = new TaleoRepository(<company>, <username>, <password>); List<Candidate> candidates = repository.getCandidates(); /** * Store iterator for the candidates as user data on the collection. * This will be used in createRowFromResultSet to create rows based on * the custom iterator. */ ListIterator<Candidate> candidatescIterator = candidates.listIterator(); setUserDataForCollection(qc, candidatescIterator); super.executeQueryForCollection(qc, params, noUserParams); } /** * hasNextForCollection - overridden for custom java data source support. */ protected boolean hasNextForCollection(Object qc) { boolean result = false; /** * Determines whether there are candidates for which to create a row */ ListIterator<Candidate> candidates = (ListIterator<Candidate>)getUserDataForCollection(qc); result = candidates.hasNext(); /** * If all candidates to be created indicate that processing is done */ if (!result) { setFetchCompleteForCollection(qc, true); } return result; } /** * createRowFromResultSet - overridden for custom java data source support. */ protected ViewRowImpl createRowFromResultSet(Object qc, ResultSet resultSet) { /** * Obtain the next candidate from the collection and create a row * for it. */ ListIterator<Candidate> candidates = (ListIterator<Candidate>)getUserDataForCollection(qc); ViewRowImpl row = createNewRowForCollection(qc); try { Candidate candidate = candidates.next(); row.setAttribute("CandId", candidate.getCandId()); row.setAttribute("Name", candidate.getFirstName() + " " + candidate.getLastName()); } catch (Exception e) { e.printStackTrace(); } return row; } /** * getQueryHitCount - overridden for custom java data source support. */ public long getQueryHitCount(ViewRowSetImpl viewRowSet) { /** * For this example this is not implemented rather we always return 0. */ return 0; } } Creating UI Choose the "ViewController" project and navigate "New -> Web Tier : JSF : JSF Page". On the "Create JSF Page" enter "JcsRestDemo" as name and ensure that the "Create as XML document (*.jspx)" is checked.  Open "JcsRestDemo.jspx" and navigate to "Data Controls -> JcsRestDemoAMDataControl -> JcsRestDemoVO1" and drag & drop the VO to the "<af:form> " as a "ADF Read-only Table": Accept the defaults in "Edit Table Columns". To execute the query navigate to to "Data Controls -> JcsRestDemoAMDataControl -> JcsRestDemoVO1 -> Operations -> Execute" and drag & drop the operation to the "<af:form> " as a "Button": Deploying to JCS Follow the same steps as documented in previous article"Java Cloud Service ADF Web Application". Once deployed the application can be accessed with URL: https://java-[identity domain].java.[data center].oraclecloudapps.com/JcsRestDemo-ViewController-context-root/faces/JcsRestDemo.jspx The UI displays a list of candidates obtained from the Taleo REST Services: Summary In this article we learned how to integrate with REST services using Jersey library in JCS. 
In future articles various other integration techniques will be covered.
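As a quick way to exercise the TaleoRepository class defined above outside the ADF layer (for example while checking credentials and connectivity), a small standalone driver along these lines could be used; the company code, user name and password below are placeholders, not real values:

    import java.util.List;

    // Minimal sketch: runs the repository's hard-coded candidate search and prints
    // the results. Assumes it lives in the same package as TaleoRepository/Candidate.
    public class TaleoRepositoryDriver {
        public static void main(String[] args) {
            TaleoRepository repository =
                new TaleoRepository("MYCOMPANY", "integration.user", "secret"); // placeholders
            List<Candidate> candidates = repository.getCandidates();
            for (Candidate candidate : candidates) {
                System.out.println(candidate.getCandId() + " "
                    + candidate.getFirstName() + " " + candidate.getLastName());
            }
        }
    }

Since getCandidates() performs its own login and logout for each call, the driver needs no session management of its own.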

    Read the article

  • Postgres cannot connect to server

    - by user1408935
    Super stumped by why Postgres isn't working on a new app I just started. I've got it working for one app already. I'm using postgres.app, and it's running. I started a new app with rails new depot -d postgresql and then I went into the database.yml file and changed username to my $USER (which is what it is for the other app, which is working). So now my database.yml file has this development section: development: adapter: postgresql encoding: unicode database: depot_development pool: 5 username: <username> password: But when I run "rake db:create" or "rake db:create:all" I still got this error (in full, cause I don't know what's relevant): Couldn't create database for {"adapter"=>"postgresql", "encoding"=>"unicode", "database"=>"depot_development", "pool"=>5, "username"=>"<username>", "password"=>nil} could not connect to server: Permission denied Is the server running locally and accepting connections on Unix domain socket "/var/pgsql_socket/.s.PGSQL.5432"? /Users/<username>/.rvm/gems/ruby-1.9.3-p194/gems/activerecord-3.2.8/lib/active_record/connection_adapters/postgresql_adapter.rb:1213:in `initialize' /Users/<username>/.rvm/gems/ruby-1.9.3-p194/gems/activerecord-3.2.8/lib/active_record/connection_adapters/postgresql_adapter.rb:1213:in `new' /Users/<username>/.rvm/gems/ruby-1.9.3-p194/gems/activerecord-3.2.8/lib/active_record/connection_adapters/postgresql_adapter.rb:1213:in `connect' /Users/<username>/.rvm/gems/ruby-1.9.3-p194/gems/activerecord-3.2.8/lib/active_record/connection_adapters/postgresql_adapter.rb:329:in `initialize' /Users/<username>/.rvm/gems/ruby-1.9.3-p194/gems/activerecord-3.2.8/lib/active_record/connection_adapters/postgresql_adapter.rb:28:in `new' /Users/<username>/.rvm/gems/ruby-1.9.3-p194/gems/activerecord-3.2.8/lib/active_record/connection_adapters/postgresql_adapter.rb:28:in `postgresql_connection' /Users/<username>/.rvm/gems/ruby-1.9.3-p194/gems/activerecord-3.2.8/lib/active_record/connection_adapters/abstract/connection_pool.rb:309:in `new_connection' /Users/<username>/.rvm/gems/ruby-1.9.3-p194/gems/activerecord-3.2.8/lib/active_record/connection_adapters/abstract/connection_pool.rb:319:in `checkout_new_connection' /Users/<username>/.rvm/gems/ruby-1.9.3-p194/gems/activerecord-3.2.8/lib/active_record/connection_adapters/abstract/connection_pool.rb:241:in `block (2 levels) in checkout' /Users/<username>/.rvm/gems/ruby-1.9.3-p194/gems/activerecord-3.2.8/lib/active_record/connection_adapters/abstract/connection_pool.rb:236:in `loop' /Users/<username>/.rvm/gems/ruby-1.9.3-p194/gems/activerecord-3.2.8/lib/active_record/connection_adapters/abstract/connection_pool.rb:236:in `block in checkout' /Users/<username>/.rvm/rubies/ruby-1.9.3-p194/lib/ruby/1.9.1/monitor.rb:211:in `mon_synchronize' /Users/<username>/.rvm/gems/ruby-1.9.3-p194/gems/activerecord-3.2.8/lib/active_record/connection_adapters/abstract/connection_pool.rb:233:in `checkout' /Users/<username>/.rvm/gems/ruby-1.9.3-p194/gems/activerecord-3.2.8/lib/active_record/connection_adapters/abstract/connection_pool.rb:96:in `block in connection' /Users/<username>/.rvm/rubies/ruby-1.9.3-p194/lib/ruby/1.9.1/monitor.rb:211:in `mon_synchronize' /Users/<username>/.rvm/gems/ruby-1.9.3-p194/gems/activerecord-3.2.8/lib/active_record/connection_adapters/abstract/connection_pool.rb:95:in `connection' /Users/<username>/.rvm/gems/ruby-1.9.3-p194/gems/activerecord-3.2.8/lib/active_record/connection_adapters/abstract/connection_pool.rb:404:in `retrieve_connection' 
/Users/<username>/.rvm/gems/ruby-1.9.3-p194/gems/activerecord-3.2.8/lib/active_record/connection_adapters/abstract/connection_specification.rb:170:in `retrieve_connection' /Users/<username>/.rvm/gems/ruby-1.9.3-p194/gems/activerecord-3.2.8/lib/active_record/connection_adapters/abstract/connection_specification.rb:144:in `connection' /Users/<username>/.rvm/gems/ruby-1.9.3-p194/gems/activerecord-3.2.8/lib/active_record/railties/databases.rake:107:in `rescue in create_database' /Users/<username>/.rvm/gems/ruby-1.9.3-p194/gems/activerecord-3.2.8/lib/active_record/railties/databases.rake:51:in `create_database' /Users/<username>/.rvm/gems/ruby-1.9.3-p194/gems/activerecord-3.2.8/lib/active_record/railties/databases.rake:40:in `block (3 levels) in <top (required)>' /Users/<username>/.rvm/gems/ruby-1.9.3-p194/gems/activerecord-3.2.8/lib/active_record/railties/databases.rake:40:in `each' /Users/<username>/.rvm/gems/ruby-1.9.3-p194/gems/activerecord-3.2.8/lib/active_record/railties/databases.rake:40:in `block (2 levels) in <top (required)>' /Users/<username>/.rvm/gems/ruby-1.9.3-p194@global/gems/rake-0.9.2.2/lib/rake/task.rb:205:in `call' /Users/<username>/.rvm/gems/ruby-1.9.3-p194@global/gems/rake-0.9.2.2/lib/rake/task.rb:205:in `block in execute' /Users/<username>/.rvm/gems/ruby-1.9.3-p194@global/gems/rake-0.9.2.2/lib/rake/task.rb:200:in `each' /Users/<username>/.rvm/gems/ruby-1.9.3-p194@global/gems/rake-0.9.2.2/lib/rake/task.rb:200:in `execute' /Users/<username>/.rvm/gems/ruby-1.9.3-p194@global/gems/rake-0.9.2.2/lib/rake/task.rb:158:in `block in invoke_with_call_chain' /Users/<username>/.rvm/rubies/ruby-1.9.3-p194/lib/ruby/1.9.1/monitor.rb:211:in `mon_synchronize' /Users/<username>/.rvm/gems/ruby-1.9.3-p194@global/gems/rake-0.9.2.2/lib/rake/task.rb:151:in `invoke_with_call_chain' /Users/<username>/.rvm/gems/ruby-1.9.3-p194@global/gems/rake-0.9.2.2/lib/rake/task.rb:144:in `invoke' /Users/<username>/.rvm/gems/ruby-1.9.3-p194@global/gems/rake-0.9.2.2/lib/rake/application.rb:116:in `invoke_task' /Users/<username>/.rvm/gems/ruby-1.9.3-p194@global/gems/rake-0.9.2.2/lib/rake/application.rb:94:in `block (2 levels) in top_level' /Users/<username>/.rvm/gems/ruby-1.9.3-p194@global/gems/rake-0.9.2.2/lib/rake/application.rb:94:in `each' /Users/<username>/.rvm/gems/ruby-1.9.3-p194@global/gems/rake-0.9.2.2/lib/rake/application.rb:94:in `block in top_level' /Users/<username>/.rvm/gems/ruby-1.9.3-p194@global/gems/rake-0.9.2.2/lib/rake/application.rb:133:in `standard_exception_handling' /Users/<username>/.rvm/gems/ruby-1.9.3-p194@global/gems/rake-0.9.2.2/lib/rake/application.rb:88:in `top_level' /Users/<username>/.rvm/gems/ruby-1.9.3-p194@global/gems/rake-0.9.2.2/lib/rake/application.rb:66:in `block in run' /Users/<username>/.rvm/gems/ruby-1.9.3-p194@global/gems/rake-0.9.2.2/lib/rake/application.rb:133:in `standard_exception_handling' /Users/<username>/.rvm/gems/ruby-1.9.3-p194@global/gems/rake-0.9.2.2/lib/rake/application.rb:63:in `run' /Users/<username>/.rvm/gems/ruby-1.9.3-p194@global/gems/rake-0.9.2.2/bin/rake:33:in `<top (required)>' /Users/<username>/.rvm/gems/ruby-1.9.3-p194@global/bin/rake:19:in `load' /Users/<username>/.rvm/gems/ruby-1.9.3-p194@global/bin/rake:19:in `<main>' /Users/<username>/.rvm/gems/ruby-1.9.3-p194/bin/ruby_noexec_wrapper:14:in `eval' /Users/<username>/.rvm/gems/ruby-1.9.3-p194/bin/ruby_noexec_wrapper:14:in `<main>' Couldn't create database for {"adapter"=>"postgresql", "encoding"=>"unicode", "database"=>"depot_test", "pool"=>5, "username"=>"<username>", "password"=>nil} I have tried 
createdb depot_development I have tried going into the psql environment and listing users (which included my username among them). In the same psql environment, I tried CREATE DATABASE depot; I've made sure that the pg gem is installed with bundle install, I've run "pg_ctl start", to which I got this response: pg_ctl: no database directory specified and environment variable PGDATA unset I ran "ps aux | grep postgres" to make sure postgres was running, to which I got this in return (which looks like it's doing OK, right?): <username> 10390 0.4 0.0 2425480 180 s000 R+ 6:15PM 0:00.00 grep postgres <username> 2907 0.0 0.0 2441604 464 ?? Ss 6:17PM 0:02.31 postgres: stats collector process <username> 2906 0.0 0.0 2445520 1664 ?? Ss 6:17PM 0:02.33 postgres: autovacuum launcher process <username> 2905 0.0 0.0 2445388 600 ?? Ss 6:17PM 0:09.25 postgres: wal writer process <username> 2904 0.0 0.0 2445388 1252 ?? Ss 6:17PM 0:12.08 postgres: writer process <username> 2902 0.0 0.0 2445388 3688 ?? S 6:17PM 0:00.54 /Applications/Postgres.app/Contents/MacOS/bin/postgres -D /Users/<username>/Library/Application Support/Postgres/var -p5432 The short of it, is I've been troubleshooting for a WHILE and have NO idea what's wrong. Any ideas? I'd really appreciate it, cause I'm pretty new to Rails, and this is a pretty disheartening roadblock. Thanks! EDIT -- Per request, posting the successful database.yml . It seems the difference is the inclusion of a password: development: adapter: postgresql encoding: unicode database: *******_development pool: 5 username: ******* password: ******* EDIT2 -- When I add a password to the .yml file, then run rake db:create again, I get this error. rake aborted! No Rakefile found (looking for: rakefile, Rakefile, rakefile.rb, Rakefile.rb)
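For what it is worth, the "/var/pgsql_socket/.s.PGSQL.5432" path in the error is OS X's built-in PostgreSQL socket directory, while the ps output shows Postgres.app listening on port 5432; a commonly suggested workaround is to force a TCP connection by adding host and port to database.yml. A sketch against the question's own settings (username left as a placeholder):

    development:
      adapter: postgresql
      encoding: unicode
      database: depot_development
      pool: 5
      username: <username>
      password:
      host: localhost
      port: 5432

The EDIT2 failure looks unrelated: "No Rakefile found" usually just means the rake command was run outside the Rails application directory, so it could not locate the app's Rakefile.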

    Read the article

  • javax.validation.ConstraintViolationException: validation failed for classes during update time for groups

    - by Tim
    Hello all! I have a Java / Spring MVC 3 application, using Hibernate and a MySQL database. In my controller, I have this source code:

    Set<ConstraintViolation<Person>> failures = validator.validate(p);
    if (failures.isEmpty()) {
        Project project = this.projectService.findProjectById(projectid);
        Person newPerson = this.personService.addPerson(p);
        Set<Person> persons = this.personService.getAllPersonsByProjectId(projectid);
        persons.add(newPerson);
        project.setPersons(persons);
        Set<ConstraintViolation<Project>> failures1 = validator.validate(project);
        if (!failures1.isEmpty()) {
            System.out.println("ERROR");
        } else {
            System.out.println("NO ERROR");
        }
        this.projectService.updateProject(project);
        return Collections.singletonMap("person", newPerson);
    }

    Project and Person are in a many-to-many relation annotated with @ManyToMany, and Project is the mapping owner. The new Person is added, but on the line with this.projectService.updateProject(project); I get an error. What it does is this, in a Hibernate DAO implementation:

    public void updateProject(Project p) {
        SessionFactory sessionFactory = HibernateUtil.getSessionFactory();
        Session sess = sessionFactory.getCurrentSession();
        Transaction tx = sess.beginTransaction();
        sess.update(p);
        tx.commit();
    }

    It fails on the line tx.commit();. My check with if (!failures1.isEmpty()) { tells me that there are no errors in my project. So what's wrong here? And why is there a validation of my Project at all? I did not call a validation method... so why is org.hibernate.cfg.beanvalidation.BeanValidationEventListener.validate() being invoked? I hope someone can help me fix this! Best Regards, Tim. Here is the full error stack trace: 13.01.2011 00:06:36 org.apache.catalina.core.ApplicationDispatcher invoke SERVE: Servlet.service() for servlet project3 threw exception javax.validation.ConstraintViolationException: validation failed for classes [com.mydomain.myproject.domain.Person] during update time for groups [javax.validation.groups.Default, ] at org.hibernate.cfg.beanvalidation.BeanValidationEventListener.validate(BeanValidationEventListener.java:155) at org.hibernate.cfg.beanvalidation.BeanValidationEventListener.onPreUpdate(BeanValidationEventListener.java:102) at org.hibernate.action.EntityUpdateAction.preUpdate(EntityUpdateAction.java:235) at org.hibernate.action.EntityUpdateAction.execute(EntityUpdateAction.java:86) at org.hibernate.engine.ActionQueue.execute(ActionQueue.java:273) at org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java:265) at org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java:185) at org.hibernate.event.def.AbstractFlushingEventListener.performExecutions(AbstractFlushingEventListener.java:321) at org.hibernate.event.def.DefaultFlushEventListener.onFlush(DefaultFlushEventListener.java:51) at org.hibernate.impl.SessionImpl.flush(SessionImpl.java:1216) at org.hibernate.impl.SessionImpl.managedFlush(SessionImpl.java:383) at org.hibernate.transaction.JDBCTransaction.commit(JDBCTransaction.java:133) at com.mydomain.myproject.dao.impl.ProjectDaoImplHibernate.updateProject(ProjectDaoImplHibernate.java:44) at com.mydomain.myproject.service.impl.ProjectServiceImpl.updateProject(ProjectServiceImpl.java:39) at com.mydomain.myproject.controller.ProjectPersonController.addPerson(ProjectPersonController.java:189) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) at java.lang.reflect.Method.invoke(Method.java:597) at org.springframework.web.bind.annotation.support.HandlerMethodInvoker.invokeHandlerMethod(HandlerMethodInvoker.java:176) at org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter.invokeHandlerMethod(AnnotationMethodHandlerAdapter.java:426) at org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter.handle(AnnotationMethodHandlerAdapter.java:414) at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:790) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:719) at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:644) at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:560) at javax.servlet.http.HttpServlet.service(HttpServlet.java:637) at javax.servlet.http.HttpServlet.service(HttpServlet.java:717) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) at org.apache.catalina.core.ApplicationDispatcher.invoke(ApplicationDispatcher.java:646) at org.apache.catalina.core.ApplicationDispatcher.processRequest(ApplicationDispatcher.java:436) at org.apache.catalina.core.ApplicationDispatcher.doForward(ApplicationDispatcher.java:374) at org.apache.catalina.core.ApplicationDispatcher.forward(ApplicationDispatcher.java:302) at org.tuckey.web.filters.urlrewrite.NormalRewrittenUrl.doRewrite(NormalRewrittenUrl.java:195) at org.tuckey.web.filters.urlrewrite.RuleChain.handleRewrite(RuleChain.java:159) at org.tuckey.web.filters.urlrewrite.RuleChain.doRules(RuleChain.java:141) at org.tuckey.web.filters.urlrewrite.UrlRewriter.processRequest(UrlRewriter.java:90) at org.tuckey.web.filters.urlrewrite.UrlRewriteFilter.doFilter(UrlRewriteFilter.java:417) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:88) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:76) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233) at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191) at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127) at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102) at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109) at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298) at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:857) at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588) at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489) at java.lang.Thread.run(Thread.java:619) 13.01.2011 00:06:36 org.apache.catalina.core.StandardWrapperValve invoke SERVE: Servlet.service() for servlet default threw exception 
javax.validation.ConstraintViolationException: validation failed for classes [com.mydomain.myproject.domain.Person] during update time for groups [javax.validation.groups.Default, ] at org.hibernate.cfg.beanvalidation.BeanValidationEventListener.validate(BeanValidationEventListener.java:155) at org.hibernate.cfg.beanvalidation.BeanValidationEventListener.onPreUpdate(BeanValidationEventListener.java:102) at org.hibernate.action.EntityUpdateAction.preUpdate(EntityUpdateAction.java:235) at org.hibernate.action.EntityUpdateAction.execute(EntityUpdateAction.java:86) at org.hibernate.engine.ActionQueue.execute(ActionQueue.java:273) at org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java:265) at org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java:185) at org.hibernate.event.def.AbstractFlushingEventListener.performExecutions(AbstractFlushingEventListener.java:321) at org.hibernate.event.def.DefaultFlushEventListener.onFlush(DefaultFlushEventListener.java:51) at org.hibernate.impl.SessionImpl.flush(SessionImpl.java:1216) at org.hibernate.impl.SessionImpl.managedFlush(SessionImpl.java:383) at org.hibernate.transaction.JDBCTransaction.commit(JDBCTransaction.java:133) at com.mydomain.myproject.dao.impl.ProjectDaoImplHibernate.updateProject(ProjectDaoImplHibernate.java:44) at com.mydomain.myproject.service.impl.ProjectServiceImpl.updateProject(ProjectServiceImpl.java:39) at com.mydomain.myproject.controller.ProjectPersonController.addPerson(ProjectPersonController.java:189) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) at java.lang.reflect.Method.invoke(Method.java:597) at org.springframework.web.bind.annotation.support.HandlerMethodInvoker.invokeHandlerMethod(HandlerMethodInvoker.java:176) at org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter.invokeHandlerMethod(AnnotationMethodHandlerAdapter.java:426) at org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter.handle(AnnotationMethodHandlerAdapter.java:414) at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:790) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:719) at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:644) at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:560) at javax.servlet.http.HttpServlet.service(HttpServlet.java:637) at javax.servlet.http.HttpServlet.service(HttpServlet.java:717) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) at org.apache.catalina.core.ApplicationDispatcher.invoke(ApplicationDispatcher.java:646) at org.apache.catalina.core.ApplicationDispatcher.processRequest(ApplicationDispatcher.java:436) at org.apache.catalina.core.ApplicationDispatcher.doForward(ApplicationDispatcher.java:374) at org.apache.catalina.core.ApplicationDispatcher.forward(ApplicationDispatcher.java:302) at org.tuckey.web.filters.urlrewrite.NormalRewrittenUrl.doRewrite(NormalRewrittenUrl.java:195) at org.tuckey.web.filters.urlrewrite.RuleChain.handleRewrite(RuleChain.java:159) at org.tuckey.web.filters.urlrewrite.RuleChain.doRules(RuleChain.java:141) at 
org.tuckey.web.filters.urlrewrite.UrlRewriter.processRequest(UrlRewriter.java:90) at org.tuckey.web.filters.urlrewrite.UrlRewriteFilter.doFilter(UrlRewriteFilter.java:417) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:88) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:76) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233) at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191) at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127) at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102) at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109) at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298) at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:857) at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588) at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489) at java.lang.Thread.run(Thread.java:619) UPDATE Before updating the Project where the error occurs, I add a person which have this annotated: @NotNull @Size(min = 1, max = 255) @Pattern(regexp="(?:[a-z0-9!#$%&'*+/=?^_`{|}~-]+(?:\\.[a-z0-9!#$%&'*+/=?^_`{|}~-]+)*|\"(?:[\\x01-\\x08\\x0b\\x0c\\x0e-\\x1f\\x21\\x23-\\x5b\\x5d-\\x7f]|\\\\[\\x01-\\x09\\x0b\\x0c\\x0e-\\x7f])*\")@(?:(?:[a-z0-9](?:[a-z0-9-]*[a-z0-9])?\\.)+[a-z0-9](?:[a-z0-9-]*[a-z0-9])?|\\[(?:(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\\.){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?|[a-z0-9-]*[a-z0-9]:(?:[\\x01-\\x08\\x0b\\x0c\\x0e-\\x1f\\x21-\\x5a\\x53-\\x7f]|\\\\[\\x01-\\x09\\x0b\\x0c\\x0e-\\x7f])+)\\])", message="{my.email.error.message}") private String email; Without the @Pattern no error... So, what's wrong here? UPDATE-2: I use Hibernate 3.6.0.Final and I have these in my Maven pom.xml: <!-- JSR 303 with Hibernate Validator --> <dependency> <groupId>javax.validation</groupId> <artifactId>validation-api</artifactId> <version>1.0.0.GA</version> </dependency> <dependency> <groupId>org.hibernate</groupId> <artifactId>hibernate-validator</artifactId> <version>4.1.0.Final</version> </dependency>
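One hedged way to narrow this down: the stack trace shows Hibernate's BeanValidationEventListener running Default-group validation on every entity flushed at tx.commit(), so the Project update also re-validates any Person it flushes, and the failing class is Person, not Project. Validating the associated persons up front in the controller would surface the offending entity before the commit. A sketch, assuming the usual getPersons() accessor exists on Project:

    // Hedged sketch: run the same Default-group check Hibernate applies at flush
    // time, but in application code, so the offending Person (e.g. a stored email
    // value that does not match the new @Pattern) is reported before tx.commit().
    for (Person person : project.getPersons()) {
        Set<ConstraintViolation<Person>> personFailures = validator.validate(person);
        if (!personFailures.isEmpty()) {
            System.out.println("Validation failed for person: " + personFailures);
        }
    }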

    Read the article

  • My App crashes when launched on my Iphone

    - by Miky Mike
    hi guys, I have a problem here : my app crashed on my Iphone (JB) though Xcode doesn't complain about anything. The app works fine on the simulator though. However, there is this in the device logs : Thread 0 Crashed: 0 libSystem.B.dylib 0x00078ac8 kill + 8 1 libSystem.B.dylib 0x00078ab8 kill + 4 2 libSystem.B.dylib 0x00078aaa raise + 10 3 libSystem.B.dylib 0x0008d03a abort + 50 4 libstdc++.6.dylib 0x00044a20 __gnu_cxx::__verbose_terminate_handler() + 376 5 libobjc.A.dylib 0x00005958 _objc_terminate + 104 6 libstdc++.6.dylib 0x00042df2 _cxxabiv1::_terminate(void (*)()) + 46 7 libstdc++.6.dylib 0x00042e46 std::terminate() + 10 8 libstdc++.6.dylib 0x00042f16 __cxa_throw + 78 9 libobjc.A.dylib 0x00004838 objc_exception_throw + 64 10 CoreFoundation 0x0009fd0e +[NSException raise:format:arguments:] + 62 11 CoreFoundation 0x0009fd48 +[NSException raise:format:] + 28 12 Foundation 0x000125d8 -[NSURL(NSURL) initFileURLWithPath:] + 64 13 Foundation 0x000371e0 +[NSURL(NSURL) fileURLWithPath:] + 24 14 TheLearningMachine 0x00002d08 0x1000 + 7432 15 TheLearningMachine 0x00002e8c 0x1000 + 7820 16 TheLearningMachine 0x00002be4 0x1000 + 7140 17 TheLearningMachine 0x000029b6 0x1000 + 6582 18 UIKit 0x0000e47a -[UIApplication _callInitializationDelegatesForURL:payload:suspended:] + 766 19 UIKit 0x000049e0 -[UIApplication _runWithURL:payload:launchOrientation:statusBarStyle:statusBarHidden:] + 200 20 UIKit 0x0005dfd6 -[UIApplication handleEvent:withNewEvent:] + 1390 21 UIKit 0x0005d8fa -[UIApplication sendEvent:] + 38 22 UIKit 0x0005d330 _UIApplicationHandleEvent + 5104 23 GraphicsServices 0x00005044 PurpleEventCallback + 660 24 CoreFoundation 0x00034cdc __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE1_PERFORM_FUNCTION + 20 25 CoreFoundation 0x00034ca0 __CFRunLoopDoSource1 + 160 26 CoreFoundation 0x00027566 __CFRunLoopRun + 514 27 CoreFoundation 0x00027270 CFRunLoopRunSpecific + 224 28 CoreFoundation 0x00027178 CFRunLoopRunInMode + 52 29 UIKit 0x000040fc -[UIApplication _run] + 364 30 UIKit 0x00002128 UIApplicationMain + 664 31 TheLearningMachine 0x00002948 0x1000 + 6472 32 TheLearningMachine 0x000028fc 0x1000 + 6396 Thread 1: 0 libSystem.B.dylib 0x0002d330 kevent + 24 1 libSystem.B.dylib 0x000d6b6c _dispatch_mgr_invoke + 88 2 libSystem.B.dylib 0x000d65bc _dispatch_queue_invoke + 96 3 libSystem.B.dylib 0x000d675c _dispatch_worker_thread2 + 120 4 libSystem.B.dylib 0x0007a67a _pthread_wqthread + 258 5 libSystem.B.dylib 0x00073190 start_wqthread + 0 Thread 2: 0 libSystem.B.dylib 0x0007b19c __workq_kernreturn + 8 1 libSystem.B.dylib 0x0007a790 _pthread_wqthread + 536 2 libSystem.B.dylib 0x00073190 start_wqthread + 0 Thread 3: 0 libSystem.B.dylib 0x00000c98 mach_msg_trap + 20 1 libSystem.B.dylib 0x00002d64 mach_msg + 44 2 CoreFoundation 0x00027c38 __CFRunLoopServiceMachPort + 88 3 CoreFoundation 0x000274c2 __CFRunLoopRun + 350 4 CoreFoundation 0x00027270 CFRunLoopRunSpecific + 224 5 CoreFoundation 0x00027178 CFRunLoopRunInMode + 52 6 WebCore 0x000024e2 RunWebThread(void*) + 362 7 libSystem.B.dylib 0x0007a27e _pthread_start + 242 8 libSystem.B.dylib 0x0006f2a8 thread_start + 0 Thread 0 crashed with ARM Thread State: r0: 0x00000000 r1: 0x00000000 r2: 0x00000001 r3: 0x3e0862b4 r4: 0x00000006 r5: 0x0015a2ec r6: 0x2fffe090 r7: 0x2fffe0a0 r8: 0x3e1a378c r9: 0x00000065 r10: 0x33028e5a r11: 0x3e1ab89c ip: 0x00000025 sp: 0x2fffe0a0 lr: 0x30277abf pc: 0x30277ac8 cpsr: 0x000f0010 Any idea what the problem can be ? I've already spent my whole day on that, but... I'm stuck. Thanks in advance... 
    Miky Mike: OK, here is some more. From the console I get this:

    This GDB was configured as "--host=i386-apple-darwin --target=arm-apple-darwin".tty /dev/ttys002
    Loading program into debugger…
    Program loaded.
    target remote-mobile /tmp/.XcodeGDBRemote-17280-65
    Switching to remote-macosx protocol
    mem 0x1000 0x3fffffff cache
    mem 0x40000000 0xffffffff none
    mem 0x00000000 0x0fff none
    run
    Running…
    Error launching remote program: failed to get the task for process 456.
    Error launching remote program: failed to get the task for process 456.
    The program being debugged is not being run.
    The program being debugged is not being run.
    [Session started at 2010-12-23 20:33:33 +0100.]
    GNU gdb 6.3.50-20050815 (Apple version gdb-1472) (Thu Aug 5 05:54:10 UTC 2010)
    Copyright 2004 Free Software Foundation, Inc.
    GDB is free software, covered by the GNU General Public License, and you are welcome to change it and/or distribute copies of it under certain conditions.
    Type "show copying" to see the conditions.
    There is absolutely no warranty for GDB. Type "show warranty" for details.
    This GDB was configured as "--host=i386-apple-darwin --target=arm-apple-darwin".tty /dev/ttys004
    Loading program into debugger…
    Program loaded.
    target remote-mobile /tmp/.XcodeGDBRemote-17280-72
    Switching to remote-macosx protocol
    mem 0x1000 0x3fffffff cache
    mem 0x40000000 0xffffffff none
    mem 0x00000000 0x0fff none
    run
    Running…
    Error launching remote program: failed to get the task for process 508.
    Error launching remote program: failed to get the task for process 508.
    The program being debugged is not being run.
    The program being debugged is not being run.

    And here is the code page that creates the URL (the app delegate):

    #import "TheLearningMachineAppDelegate.h"
    #import "RootViewController.h"

    @implementation TheLearningMachineAppDelegate

    @synthesize window;
    @synthesize navigationController;

    #pragma mark -
    #pragma mark Application lifecycle

    - (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
        RootViewController *rootViewController = (RootViewController *)[navigationController topViewController];
        rootViewController.managedObjectContext = self.managedObjectContext;
        [window addSubview:[navigationController view]];
        [window makeKeyAndVisible];
        return YES;
    }

    - (void)applicationWillResignActive:(UIApplication *)application {
        /* Sent when the application is about to move from active to inactive state. This can occur for certain types of temporary interruptions (such as an incoming phone call or SMS message) or when the user quits the application and it begins the transition to the background state. Use this method to pause ongoing tasks, disable timers, and throttle down OpenGL ES frame rates. Games should use this method to pause the game. */
    }

    - (void)applicationDidEnterBackground:(UIApplication *)application {
        [self saveContext];
    }

    - (void)applicationWillEnterForeground:(UIApplication *)application {
        /* Called as part of the transition from the background to the inactive state: here you can undo many of the changes made on entering the background. */
    }

    - (void)applicationDidBecomeActive:(UIApplication *)application {
        /* Restart any tasks that were paused (or not yet started) while the application was inactive. If the application was previously in the background, optionally refresh the user interface. */
    }

    // Method that saves the managed object context before the application terminates.
    - (void)applicationWillTerminate:(UIApplication *)application {
        [self saveContext];
    }

    - (void)saveContext {
        NSError *error = nil;
        if (managedObjectContext != nil) {
            if ([managedObjectContext hasChanges] && ![managedObjectContext save:&error]) {
                NSLog(@"Unresolved error %@, %@", error, [error userInfo]);
                abort();
                // Replace this implementation with code to handle the error appropriately.
                // abort() causes the application to generate a crash log and terminate. You should not use this function in a shipping application, although it may be useful during development. If it is not possible to recover from the error, display an alert panel that instructs the user to quit the application by pressing the Home button.
            }
        }
    }

    #pragma mark -
    #pragma mark Core Data stack

    // Returns the managed object context for the application.
    - (NSManagedObjectContext *)managedObjectContext {
        if (managedObjectContext != nil) {
            return managedObjectContext;
        }
        NSPersistentStoreCoordinator *coordinator = [self persistentStoreCoordinator];
        if (coordinator != nil) {
            managedObjectContext = [[NSManagedObjectContext alloc] init];
            [managedObjectContext setPersistentStoreCoordinator:coordinator];
        }
        return managedObjectContext;
    }

    // Returns the managed object model for the application.
    - (NSManagedObjectModel *)managedObjectModel {
        if (managedObjectModel != nil) {
            return managedObjectModel;
        }
        NSString *modelPath = [[NSBundle mainBundle] pathForResource:@"TheLearningMachine" ofType:@"momd"];
        NSURL *modelURL = [NSURL fileURLWithPath:modelPath];
        managedObjectModel = [[NSManagedObjectModel alloc] initWithContentsOfURL:modelURL];
        return managedObjectModel;
    }

    #pragma mark -
    #pragma mark Application's Documents directory

    // Returns the path to the application's Documents directory.
    - (NSString *)applicationDocumentsDirectory {
        return [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
    }

    // Returns the persistent store coordinator for the application.
    - (NSPersistentStoreCoordinator *)persistentStoreCoordinator {
        if (persistentStoreCoordinator != nil) {
            return persistentStoreCoordinator;
        }
        NSURL *storeURL = [NSURL fileURLWithPath:[[self applicationDocumentsDirectory] stringByAppendingPathComponent:@"TheLearningMachine.sqlite"]];
        NSError *error = nil;
        persistentStoreCoordinator = [[NSPersistentStoreCoordinator alloc] initWithManagedObjectModel:[self managedObjectModel]];
        if (![persistentStoreCoordinator addPersistentStoreWithType:NSSQLiteStoreType configuration:nil URL:storeURL options:nil error:&error]) {
            NSLog(@"Unresolved error %@, %@", error, [error userInfo]);
            abort();
        }
        return persistentStoreCoordinator;
    }

    #pragma mark -
    #pragma mark Memory management

    - (void)applicationDidReceiveMemoryWarning:(UIApplication *)application {
        /* Free up as much memory as possible by purging cached data objects that can be recreated (or reloaded from disk) later. */
    }

    - (void)dealloc {
        [managedObjectContext release];
        [managedObjectModel release];
        [persistentStoreCoordinator release];
        [window release];
        [super dealloc];
    }

    @end
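    Editor's note: going by frames 12 and 13 of the backtrace (+[NSURL fileURLWithPath:] calling -[NSURL initFileURLWithPath:] and then throwing), the likeliest explanation is that fileURLWithPath: was handed nil. In the code above that can only happen in managedObjectModel or persistentStoreCoordinator, and pathForResource:@"TheLearningMachine" ofType:@"momd" returning nil on the device but not in the simulator is a classic cause: the compiled model may actually be a .mom, or the resource name's case may differ, since the device file system is case-sensitive while the simulator's usually is not. This is only a guess from the backtrace, not something the post confirms; a minimal defensive sketch of the accessor, assuming the rest of the Core Data stack stays as posted:

    - (NSManagedObjectModel *)managedObjectModel {
        if (managedObjectModel != nil) {
            return managedObjectModel;
        }
        // Look for the versioned model first, then the single-version .mom, instead of
        // passing a possibly-nil path straight into fileURLWithPath: (which throws on nil).
        NSString *modelPath = [[NSBundle mainBundle] pathForResource:@"TheLearningMachine" ofType:@"momd"];
        if (modelPath == nil) {
            modelPath = [[NSBundle mainBundle] pathForResource:@"TheLearningMachine" ofType:@"mom"];
        }
        if (modelPath == nil) {
            // Still nothing in the bundle: merge whatever models Core Data can find rather than crashing.
            NSLog(@"Data model not found in the app bundle; falling back to mergedModelFromBundles:");
            managedObjectModel = [[NSManagedObjectModel mergedModelFromBundles:nil] retain];
            return managedObjectModel;
        }
        NSURL *modelURL = [NSURL fileURLWithPath:modelPath];
        managedObjectModel = [[NSManagedObjectModel alloc] initWithContentsOfURL:modelURL];
        return managedObjectModel;
    }

    The repeated "failed to get the task for process …" lines from GDB look like a separate problem: the debugger failing to attach, which on a device is usually a code-signing/provisioning issue (or a side effect of the jailbreak setup) rather than the crash itself.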

    Read the article

  • Use a single FreeMarker template to display tables of arbitrary pojos

    - by Kevin Pauli
    Attention advanced FreeMarker gurus: I want to use a single FreeMarker template to output tables of arbitrary pojos, with the columns to display defined separately from the data. The problem is that I can't figure out how to get a handle to a function on a pojo at runtime and then have FreeMarker invoke that function (lambda style). From skimming the docs it seems that FreeMarker supports functional programming, but I can't seem to formulate the proper incantation. I whipped up a simplistic concrete example. Let's say I have two lists: a list of people with a firstName and lastName, and a list of cars with a make and model. I would like to output these two tables:

    <table>
      <tr>
        <th>firstName</th>
        <th>lastName</th>
      </tr>
      <tr>
        <td>Joe</td>
        <td>Blow</td>
      </tr>
      <tr>
        <td>Mary</td>
        <td>Jane</td>
      </tr>
    </table>

    and

    <table>
      <tr>
        <th>make</th>
        <th>model</th>
      </tr>
      <tr>
        <td>Toyota</td>
        <td>Tundra</td>
      </tr>
      <tr>
        <td>Honda</td>
        <td>Odyssey</td>
      </tr>
    </table>

    But I want to use the same template, since this is part of a framework that has to deal with dozens of different pojo types. Given the following code:

    public class FreemarkerTest {

        public static class Table {
            private final List<Column> columns = new ArrayList<Column>();
            public Table(Column[] columns) { this.columns.addAll(Arrays.asList(columns)); }
            public List<Column> getColumns() { return columns; }
        }

        public static class Column {
            private final String name;
            public Column(String name) { this.name = name; }
            public String getName() { return name; }
        }

        public static class Person {
            private final String firstName;
            private final String lastName;
            public Person(String firstName, String lastName) { this.firstName = firstName; this.lastName = lastName; }
            public String getFirstName() { return firstName; }
            public String getLastName() { return lastName; }
        }

        public static class Car {
            String make;
            String model;
            public Car(String make, String model) { this.make = make; this.model = model; }
            public String getMake() { return make; }
            public String getModel() { return model; }
        }

        public static void main(String[] args) throws Exception {
            final Table personTableDefinition = new Table(new Column[] { new Column("firstName"), new Column("lastName") });
            final List<Person> people = Arrays.asList(new Person[] { new Person("Joe", "Blow"), new Person("Mary", "Jane") });
            final Table carTable = new Table(new Column[] { new Column("make"), new Column("model") });
            final List<Car> cars = Arrays.asList(new Car[] { new Car("Toyota", "Tundra"), new Car("Honda", "Odyssey") });
            final Configuration cfg = new Configuration();
            cfg.setClassForTemplateLoading(FreemarkerTest.class, "");
            cfg.setObjectWrapper(new DefaultObjectWrapper());
            final Template template = cfg.getTemplate("test.ftl");
            process(template, personTableDefinition, people);
            process(template, carTable, cars);
        }

        private static void process(Template template, Table tableDefinition, List<? extends Object> data) throws Exception {
            final Map<String, Object> dataMap = new HashMap<String, Object>();
            dataMap.put("tableDefinition", tableDefinition);
            dataMap.put("data", data);
            final Writer out = new OutputStreamWriter(System.out);
            template.process(dataMap, out);
            out.flush();
        }
    }

    All the above is a given for this problem. So here is the template I have been hacking on. Note the comment where I am having trouble.

    <table>
      <tr>
        <#list tableDefinition.columns as col>
        <th>${col.name}</th>
        </#list>
      </tr>
      <#list data as pojo>
      <tr>
        <#list tableDefinition.columns as col>
        <td><#-- what goes here? --></td>
        </#list>
      </tr>
      </#list>
    </table>

    So col.name holds the name of the property I want to access from the pojo. I have tried a few things, such as pojo.col.name and <#assign property = col.name/> ${pojo.property}, but of course these don't work; I just included them to help convey my intent. I am looking for a way to get a handle to a function and have FreeMarker invoke it, or perhaps some kind of "evaluate" feature that can take an arbitrary expression as a string and evaluate it at runtime.
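    Editor's note: one approach that normally solves this (a suggestion, not something from the original post): with the DefaultObjectWrapper configured in the Java code, FreeMarker exposes bean properties through its hash syntax, so the pojo can simply be indexed with the column name as a string key; no function handle or eval trick is needed. A sketch of the template body under that assumption:

    <table>
      <tr>
        <#list tableDefinition.columns as col>
        <th>${col.name}</th>
        </#list>
      </tr>
      <#list data as pojo>
      <tr>
        <#list tableDefinition.columns as col>
        <#-- hash-style lookup: "firstName" resolves to getFirstName(), "make" to getMake(), and so on -->
        <td>${pojo[col.name]}</td>
        </#list>
      </tr>
      </#list>
    </table>

    If a column ever needs a genuinely arbitrary expression rather than a plain bean property, FreeMarker's ?eval built-in can evaluate a string as an expression, but for property access by name the lookup above keeps the template simple.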

    Read the article

  • delete function and upload

    - by Jesper Petersen
    First off: the database connection and all my functions live in a class, and so far I have been quite happy with how those functions turned out. I'm building a gallery where, if the user id stored with an uploaded picture matches the session id of the logged-in user, that user gets a delete option. After deleting the picture from the folder and from the database it should simply go back to /nyeste-billeder/ (the latest-pictures page), but instead it fails with this error:

    Fatal error: Call to a member function bind_param() on a non-object in /home/jesperbo/public_html/mebe.dk/function/function.php on line 411

    I'm also building the upload side, which stores the picture in the database and resizes it to the sizes I've configured; once those two things are done it should also send me back to /nyeste-billeder/. The upload itself works (the picture and its resized copies end up on the server), but the redirect back to /nyeste-billeder/ never happens at all.

    So where does the delete go wrong? The error points at this line: $stm1->bind_param('i', $id_gallery);

    function img_slet_indhold() {
        if($_SESSION["logged_in"] = true && $_SESSION["rank"] == '1' || $_SESSION["rank"] == 2) {
            if($stmt = $this->mysqli->prepare('SELECT `title` FROM `gallery` WHERE `id_gallery` = ?')) {
                $stm1->bind_param('i', $id_gallery);
                $id_gallery = $_GET["id_gallery"];
                $stm1->execute();
                $stm1->store_result();
                $stm1->bind_result($title);
                $UploadDir = "/gallery/"; // sits at the top of the document, possibly as a define
                if($stm1->fetch()) {
                    $tmpfile = $UploadDir . "" . $title;
                    if(file_exists($tmpfile)) {
                        unlink($tmpfile);
                    }
                    $tmpfile = $UploadDir . "lille/" . $title;
                    if(file_exists($tmpfile)) {
                        unlink($tmpfile);
                    }
                    $tmpfile = $UploadDir . "store/" . $title;
                    if(file_exists($tmpfile)) {
                        unlink($tmpfile);
                    }
                }
                $stm1->close();
            } else {
                /* An error occurred */
                echo 'Der opstod en fejl i erklæringen: ' . $mysqli->error;
            }
        }
        if($stmt = $this->mysqli->prepare('DELETE FROM `gallery` WHERE `id_gallery` = ?' )) {
            $stmt->bind_param('i', $id);
            $id = $_GET["id_gallery"];
            $stmt->execute();
            header('Location: /nyeste-billeder/');
            $stmt->close();
        } else {
            /* An error occurred */
            echo 'Der opstod en fejl i erklæringen: ' . $mysqli->error;
        }
    }

    In the file that does the deleting, I call it like this:

    <?php
    session_start();
    require_once ("function/function.php");
    $mebe = new mebe;
    $db = $mebe->db_c();
    error_reporting(E_ERROR);
    $img_slet_indhold = $mebe->img_slet_indhold();
    ?>

    And this is the function that uploads the picture to the folder and the database and should redirect right afterwards:

    function img_indhold() {
        if($_SESSION["logged_in"] = true && $_SESSION["rank"] == '1' || $_SESSION["rank"] == 2) {
            include "function/class.upload.php";
            $handle = new Upload($_FILES["filename"]);
            if($handle->uploaded) {
                // slightly larger images
                $handle->image_resize = true;
                $handle->image_ratio_y = true;
                $handle->image_x = 220;
                $handle->Process("gallery/store");
                // for profile pictures and the like
                $handle->image_resize = true;
                $handle->image_ratio_crop = true;
                $handle->image_y = 115;
                $handle->image_x = 100;
                $handle->Process("gallery");
                // for profile pictures and the like
                $handle->image_resize = true;
                $handle->image_ratio_crop = true;
                $handle->image_y = 75;
                $handle->image_x = 75;
                $handle->Process("gallery/lille");
                $pb = $handle->file_dst_name;
            }
            if($stmt = $this->mysqli->prepare('INSERT INTO `gallery` (`title`, `id_bruger`) VALUES (?, ?)')) {
                $stmt->bind_param('si', $title, $id_bruger);
                $title = $pb;
                $id_bruger = $_SESSION["id"];
                $stmt->execute();
                header('Location: /nyeste-billeder/');
                $stmt->close();
            }
        }
    }

    When I call it on the page where it is needed, I do it like this:

    <?php
    session_start();
    require_once ("function/function.php");
    $mebe = new mebe;
    $db = $mebe->db_c();
    error_reporting(E_ERROR);
    $img_slet_indhold = $mebe->img_slet_indhold();
    ?>

    And here is how I show the gallery pictures on the page, plus the upload form:

    function vise_img() {
        if ($stmt = $this->mysqli->prepare('SELECT `id_gallery`, `title`, `id_bruger` FROM `gallery` ORDER BY `gallery`.`id_gallery` DESC')) {
            $stmt->execute();
            $stmt->store_result();
            $stmt->bind_result($id_gallery, $title, $id_bruger);
            while ($stmt->fetch()) {
                echo "<div id=\"gallery_box\">";
                echo "<a href=\"/profil/$id_bruger/\"><img src=\"/gallery/$title\" alt=\"\" height=\"115\" width=\"100\" border=\"0\"></a>";
                if($_SESSION["logged_in"]) {
                    if($id_bruger == $_SESSION["id"]) {
                        echo "<ul>";
                        echo "<li><a href=\"/nyeste-billeder-slet/$id_gallery/\">Slet</a></li>";
                        echo "</ul>";
                    }
                }
                echo "</div>";
            }
            /* Close the statement */
            $stmt->close();
        } else {
            /* An error occurred */
            echo 'Der opstod en fejl i erklæringen: ' . $mysqli->error;
        }
    }

    function upload_img() {
        if($_SESSION["logged_in"] = true && $_SESSION["rank"] == '1' || $_SESSION["rank"] == 2) {
    ?>
    <form name="opslag" method="post" action="/nyeste-ok/" enctype="multipart/form-data">
        <input type="file" name="filename" id="filename" onchange="checkFileExt(this)">
        <input name="upload" value="Upload" id="background_indhold" onclick="return check()" type="submit">
    </form>
    <?php
        } elseif ($_SESSION["logged_in"] != true && $_SESSION["rank"] != '1' || $_SESSION["rank"] != 2) {
            echo "<p>Du har ingen mulighed for at upload billeder på siden</p>";
        }
    }

    Really hope you are able to help me further!
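    Editor's note: two things in the code above stand out (a guess at the cause, not something stated in the post). First, in img_slet_indhold() the prepared statement is assigned to $stmt in if($stmt = $this->mysqli->prepare(...)), while every following call uses $stm1, which is never set; calling bind_param() on that undefined variable is exactly what produces "Call to a member function bind_param() on a non-object". Second, a header('Location: ...') redirect is easy to lose: give bind_param() variables that already hold their values before execute(), and follow header() with exit so nothing else runs and nothing has been echoed first. A sketch of the SELECT part with one consistent variable name, assuming the rest of the class stays as posted:

    function img_slet_indhold() {
        // == for comparison, and parentheses around the rank check
        if($_SESSION["logged_in"] == true && ($_SESSION["rank"] == '1' || $_SESSION["rank"] == 2)) {
            if($stmt = $this->mysqli->prepare('SELECT `title` FROM `gallery` WHERE `id_gallery` = ?')) {
                $id_gallery = (int) $_GET["id_gallery"];
                $stmt->bind_param('i', $id_gallery);   // same variable the prepare() was assigned to
                $stmt->execute();
                $stmt->store_result();
                $stmt->bind_result($title);
                $UploadDir = "/gallery/";
                if($stmt->fetch()) {
                    // delete the original and both resized copies
                    foreach(array("", "lille/", "store/") as $size) {
                        $tmpfile = $UploadDir . $size . $title;
                        if(file_exists($tmpfile)) {
                            unlink($tmpfile);
                        }
                    }
                }
                $stmt->close();
            }
            // ... DELETE statement as in the original ...
            header('Location: /nyeste-billeder/');
            exit; // stop the script so the redirect is actually sent
        }
    }

    The same exit-after-header() point applies to the upload path: if the /nyeste-ok/ page prints anything (even a blank line before <?php) before img_indhold() calls header(), PHP logs a "headers already sent" warning and the browser never receives the redirect; checking the error log for that warning is the quickest way to confirm it.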

    Read the article

< Previous Page | 347 348 349 350 351 352 353 354 355 356 357 358  | Next Page >