Hmmmm... Let me try to reply to each question in order.
So wait... you say the keyboard works fine now[?]
I gave up on trying to get the iOS onscreen keyboard working because I realized that my application was drawing over it on every screen refresh. In this application, I need the entire screen to update the whole time the application is running. (I have "plans".)
The iOS keyboard and TouchHit and TouchDown are completely separate things. They have nothing to do with each other.
When the keyboard is enabled, you should not try to read touches imho.
Yes, exactly. Since I had to create my own onscreen keyboard, I needed some form of touch recognition.
Thus, I used TouchDown() at first*.
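To show what I mean, here's a stripped-down sketch of that first approach - not my actual project code, just one placeholder key rectangle checked against TouchDown() every update:

```
Import mojo

Class KeyboardTest Extends App
	' One placeholder key rectangle, flush against the left edge of the screen.
	Field keyX:Float = 0
	Field keyY:Float = 400
	Field keyW:Float = 100
	Field keyH:Float = 100

	Method OnCreate:Int()
		SetUpdateRate 60
		Return 0
	End

	Method OnUpdate:Int()
		' The TouchDown()-based check I started with.
		If TouchDown(0)
			Local tx:Float = TouchX(0)
			Local ty:Float = TouchY(0)
			If tx >= keyX And tx < keyX + keyW And ty >= keyY And ty < keyY + keyH
				Print "key pressed"
			Endif
		Endif
		Return 0
	End

	Method OnRender:Int()
		Cls
		DrawRect keyX, keyY, keyW, keyH
		Return 0
	End
End

Function Main:Int()
	New KeyboardTest
	Return 0
End
```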
Do you use any scaling?
I do. But not on anything involving the keyboard that I created.
What framework do you use to render? mojo or mojo2?
I use mojo and ignition-x.
Is the problem only showing on the right and bottom edges, or on all edges?
It's happening on all edges, on all iPhones. (This issue shows up throughout the application, not just when the keyboard is up.)
Do you have a real splash-screen set for iOS or do you run your own?
I think you're referring to the black image that Xcode provides when first installing onto an iPhone, right? If so, I am using my own, set to 1242 x 2208. (This was my attempt to build the application at the same resolution as an iPhone 6s+, so it wouldn't be blurred by stretching the canvas to fit the screen.)
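For what it's worth, this is roughly what I mean by building at 1242 x 2208 and letting it stretch - a minimal sketch, not my actual code, and the keyboard I created doesn't go through this scaling:

```
Import mojo

Class ScaledApp Extends App
	Const VIRTUAL_W:Float = 1242.0
	Const VIRTUAL_H:Float = 2208.0

	Method OnCreate:Int()
		SetUpdateRate 60
		Return 0
	End

	Method OnUpdate:Int()
		' Touch coordinates come back in device pixels, so they would need to be
		' converted into the virtual 1242 x 2208 space before any hit-testing.
		Local vx:Float = TouchX(0) * VIRTUAL_W / DeviceWidth()
		Local vy:Float = TouchY(0) * VIRTUAL_H / DeviceHeight()
		Return 0
	End

	Method OnRender:Int()
		Cls
		PushMatrix
		' Stretch the virtual canvas to fill whatever screen the app runs on.
		Scale DeviceWidth() / VIRTUAL_W, DeviceHeight() / VIRTUAL_H
		' ...all drawing here uses 1242 x 2208 coordinates...
		DrawRect 0, 0, VIRTUAL_W, 100
		PopMatrix
		Return 0
	End
End

Function Main:Int()
	New ScaledApp
	Return 0
End
```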
One of my girlfriends told me that I'm terrible at explaining things, so I figured that this "recap" might help.
* When I first started making my replacement onscreen keyboard, I built it using ignition-x's prefab buttons. That's when I discovered that any buttons placed on the edge of the screen weren't triggering.
After many iterations of trying to solve this issue, and frantically searching for help on this forum, I decided to rework ignition-x's buttons... not realizing until today that the buttons were triggered by TouchDown() and not TouchHit().
It seems that TouchHit() only registers intermittently around the edges on an iPhone. And since TouchDown() is triggered by some preset number of registered TouchHit() detections, TouchDown() doesn't trigger either. That might explain why I'm able to get TouchX() and TouchY() coordinates without any issues, no matter where I touch the screen, even though I didn't get any TouchDown() triggers.
This morning, I rearranged some code and switched the TouchDown() call to a TouchHit() as a test... and it seems to work.
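In terms of the sketch further up, the change was literally one line inside OnUpdate() (keeping in mind that TouchHit(0) is only non-zero on the update where the touch begins, whereas TouchDown(0) stays true while the finger is held):

```
	Method OnUpdate:Int()
		' Was: If TouchDown(0)
		If TouchHit(0)
			Local tx:Float = TouchX(0)
			Local ty:Float = TouchY(0)
			If tx >= keyX And tx < keyX + keyW And ty >= keyY And ty < keyY + keyH
				Print "key pressed"
			Endif
		Endif
		Return 0
	End
```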
Maybe I'm wrong? I'm just not certain yet. I only thought to check for things I wouldn't have considered on my own because you were on here, so I'm grateful for your help. I'm not sure if providing a small app to demonstrate the touch issue will be useful, since the problem only appears on an actual device - not in the simulator, and not on the desktop either. I can, however, provide a zipped video of what was happening at the edge of the screen BEFORE switching TouchDown() to TouchHit().