After I got home I recorded my entire session from this year’s DevCon. Here it is, along with all the files that I used in the session. Hope you enjoy. The video player below should allow you to play the video in HD and in full screen. Click “HD” for HD and the little monitor icon to go full screen.
Apologies, unfortunately the demo files have gone missing.
Thank you Todd, this is just what I was looking for. So simple, yet so useful.
Hi Todd
Excellent presentation. Thanks for sharing your Devcon presentation with those of us who couldn’t make it.
Lee
Hi Todd,
How could I download the files used in video as you mentioned at video description above?
Marek
Hi Marek,
Sorry about that. I have added the Download link to the post. Thanks for pointing this out.
Todd
I’ve looked at the video and also at your demo files. Really enlightening, thank you. Does the commit record step work if the related table is in a different database hosted on the same server, rather than in the same database as in your demo files?
Thanks
Hi Brian,
Yes it works the same way. So the separation model does work 🙂
Thanks
Todd
Hi!
Very nice presentation, one of the best ones I attended at DevCon last year.
I have a question about how this works with field validation. Using your file, I marked the field “Date” of the expenses table as required. Doing so will prevent the record from being committed until I insert a date. Unfortunately, no error message is shown, no FM dialog popping up. The most I could do was trap error number 509 and show an error dialog with a simple message like “Something is wrong” (since I don’t know if it’s possible to tell which field was wrong and get its corresponding error message). How would you deal with field validation? Do I have to manually validate the fields inside a script?
Thanks!!
Thanks for the video and demo files; they were very clear and informative.
I noticed that in your “Force Commit Record” script, it exits with Result: Get(LastError), but the “Save…” script(s) that call the “Force Commit Record” script also test for Get(LastError), rather than Get(ScriptResult). I hadn’t realized that Get(LastError) would persist all the way up to the calling script, so it was cool to see that.
My question: is there a reason you also exit with the last error, since you don’t test for that value in your calling scripts (in this demo, at least)?
Hi Dan,
Congrats, you found a bug :-). What you see in SimpleDemo is wrong! That script step should be Get(ScriptResult), not Get(LastError). You are the first person to notice. 🙂
Look at the “Expensed” demo file and the “Save New Expense Report” script; you will see that I use Get(ScriptResult) to retrieve the error from the Force Commit Record script. That is how I typically do it.
Sorry for the confusion.
Thanks for finding that.
Todd
Thank you! Glad you liked it.
So yeah Field Validation is an issue. If you have turned on error trapping, no message is shown, as you correctly point out. But you can modify the script to handle field validations if you use them.
Here is how you can fix it. In the “Force Commit Record” script, trap for error 509 right after the commit. Instead of exiting, like the demo does, do this:

Set Error Capture Off
Commit Record <- this will again trigger the validation, but now that Error Capture is off it will show the field validation dialog
Set Error Capture On

You only need to do this if the error is 509, so wrap the above in the appropriate “If”. That will do it!

Thanks
Todd
Thanks for the fix. It worked, although I needed one more line:
Set Error Capture Off
Set Variable[$$AllowTheCommit; Value: 1] <— in the first call to “Commit Record” this variable is set to 0, so I needed to set it back to 1 before committing again
Commit Record
Set Error Capture On
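Putting Todd’s fix and Alfonso’s extra line together, the relevant fragment of the “Force Commit Record” script might look like this (a sketch only; the $$AllowTheCommit flag comes from the demo file, and your own script may differ):

```
Commit Record
If [ Get ( LastError ) = 509 ]                    # a field failed validation
    Set Error Capture Off
    Set Variable [ $$AllowTheCommit ; Value: 1 ]  # re-arm the commit guard from the demo
    Commit Record                                 # validation fires again; the dialog now shows
    Set Error Capture On
End If
```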
Thanks again for your help.
Regards,
Alfonso.
Right. Sorry about leaving that part out.
Glad you got it working
Todd
Why set error capture on at all? It doesn’t seem necessary to me.
Hi Dan,
I suppose you could skip it, if you didn’t care about controlling the messaging that comes up when errors occur. I like to control all the error messages, so I almost always leave capture on. But it’s up to you.
Todd
Hi Todd,
I am late to the party. Great presentation. I do have a question. I am using your Expense Report demo
When creating a new Expense Report, the expense splits are entered directly into the portal. This works. But if I use a script to enter the expense split (which I do most of the time), it will not show up in the portal unless I force a commit. In that case, I will not be able to revert the record, only delete it. Is there a way to make the expense show up without the forced commit?
Cheers
Joseph
Hello,
If your script is working on the same portal, then it should show up without a commit. If, however, your script is going off and entering splits through some other relationship, then no, there is no way to see it without a commit.
Hope that helps
Thanks
Todd
Great presentation!!!!!
It seems that FM 12, with its window controls, would change the commit requirements. Do you plan on making a video using FM 12?
Hi Ron,
Thanks!
The new windowing model in FM 12 doesn’t fundamentally change how commit records works. It still works the same way. Is there something specific about the presentation that you think might be different with FM 12?
Thanks
Todd
FM 12 has more window ‘Styles’: Document, Floating, and Dialog. It ‘seems’ like using these styles would, at the very least, prevent the focus from going to the main window…. Anyway, for what it’s worth, it would be nice to have these ‘styles’ and their effects brought into your excellent Commit video.
On the other hand, I can screw around with it and try and figure it out…. 8)
Hi Ron,
Although we didn’t have a specific modal window type until 12, we were able to get the same modal behavior using a loop pause. The new Modal window style certainly makes it easier to achieve, but it doesn’t really change what we could or could not do with regard to committing records. If I make a new video, perhaps I can just point out that the new window styles don’t really affect commit record behavior. 🙂
If you do come across anything that you think is different please let me know. I’d be happy to take a look.
Thanks again for commenting
Todd
Hey Todd,
Here is the crux of my problem. I have a script that opens a ‘new’ window which contains the fields in the main window’s portal. I then loop through each record in the ‘new’ layout, running a script on each record. This works.
The weirdness is that after the script loops through about 20 records (I can watch it in debug), I get the dreaded “This record can not be modified because it is being modified in another window” message.
Question: “Why after 20 records”? Should this condition be evident from record 1?
Loop
Go to field (Dues::Date)
Perform Script “Reset Dues PaidYear”
Go to Record/Request/Page (Exit after last)
End Loop
I even put a ‘commit’ before or after the Perform Script and it did not help.
Ideas?
Thanks
Hi Ron,
The error is telling you exactly what the problem is. You have that record open in the other window. Try Committing the record before you open the new window.
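In script-step terms, Todd’s suggestion is to commit the open record before the New Window step. A rough sketch against Ron’s loop (the window name is a guess, not from the thread):

```
Commit Record                               # release the record held open in the main window
New Window [ "Dues" ]                       # hypothetical window name
Loop
    Go to Field [ Dues::Date ]
    Perform Script [ "Reset Dues PaidYear" ]
    Go to Record/Request/Page [ Exit after last ]
End Loop
```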
Todd
I just used this method to create 5000 records in a single commit (in a local file) and it took ~180 seconds. Surprised by the performance hit, I added a commit step to my script after every 50 records, then it only took ~40 seconds. After playing around with a few things, it seemed like one/all of: field validation, auto-enter calcs, field indexing was causing the script to take as long as it did to commit the records.
Have you seen similar performance issues using this method? Are there any tricks to improve performance?
Thanks,
Dan
Hi Dan,
Usually single commits are faster, at least up to the point of filling up local memory, which can happen on FileMaker Go. The difference is usually amplified when committing over the LAN.
BUT there are times when it is slower. And often that is because of the nature of the auto-enter calcs and field validations that are in use. For example, if you have a field validation set to unique (which I almost never do, for this reason), it will get very slow when using single commits. This is because the field validation still has to examine all those records that are NOT committed and indexed yet to look for duplicates. So calcs that have to look at many records are going to cause slowdowns.
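For what it’s worth, the batching Dan describes can be sketched in script steps like this (the batch size of 50 and the variable names are assumptions, not from the demo files):

```
Set Variable [ $i ; Value: 0 ]
Loop
    Exit Loop If [ $i ≥ $total ]    # $total = number of records to create
    # ...create the next record and set its fields through the relationship...
    Set Variable [ $i ; Value: $i + 1 ]
    If [ Mod ( $i ; 50 ) = 0 ]      # commit every 50 records
        Commit Record
    End If
End Loop
Commit Record                        # commit the remainder
```

Keep in mind that each intermediate commit ends the transaction up to that point, so only the records created since the last commit can still be reverted.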
Hope that helps
Todd
Yes that helps, thanks for the reply.
Hi Todd,
Have you any experience with using the transactions approach with multiple files (e.g. interface file/data separation or FileMaker Go file on the device syncing to hosted FileMaker Server file)? I’ve noticed that there are certain circumstances where I need to add a commit records step immediately after creating the new record, otherwise I will end up with 2 records once I set additional fields after first creating the record.
Just wondering if you had encountered this before? It seems to prevent the use of the transactions technique in this scenario as you have to commit the record to prevent duplicate records being created, so you can’t revert at a later stage.
thanks,
Steve
Hi Steve,
Yes we use this with file/data separation and with FileMaker Go syncing to a server. In general the technique works well. But there are lots of little silly things you have to watch out for. One that may account for what you are seeing, is that if you
1. create a new record through a relationship
2. break the relationship
3. try to restore the relationship to the record you previously created
it won’t work. The new record is there in memory, and will be committed with everything else, but you have lost access to it and you can’t ever get it back.
So if you then proceed to edit more fields on that record, you’ll likely create another record, because you can’t reach the original one.
So whenever you create new records make sure you do everything you need to do to the record before letting go of the relationship.
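As a sketch of the ordering Todd describes, assuming the related record is created by setting fields through an allow-creation relationship (table and field names are made up for illustration):

```
# Safe: do all edits while the relationship still points at the new record
Set Field [ Splits::Amount ; $amount ]       # creates the related record
Set Field [ Splits::Category ; $category ]   # still reachable: edit it now

# Unsafe: break the link, then try to come back
Set Field [ Report::ActiveSplitID ; "" ]           # relationship broken
Set Field [ Report::ActiveSplitID ; $newSplitID ]  # cannot find the uncommitted,
                                                   # unindexed record
# editing Splits fields here would create a SECOND record
```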
Hope that helps
Todd