Forum Replies Created
In reply to: Importing from phpBB 3
Watch out for the lack of this: "Strip phpBB bbcode, bbcode_uid and magic urls during import".
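For the record, the manual cleanup people usually suggest is something along these lines (a rough sketch against the stock phpBB3 schema — run it on a backup only, since it blindly strips every ":uid" occurrence from post_text):

> -- strip the per-post bbcode uid, so e.g. [b:3f7a9]text[/b:3f7a9] becomes [b]text[/b]
> UPDATE phpbb_posts
> SET post_text = REPLACE(post_text, CONCAT(':', bbcode_uid), '')
> WHERE bbcode_uid <> '';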
I feel like a caveman in a spaceship with this SQL stuff… Ignore my worries about the logic; I have now sent netweb the dumps of the reply rows. Realizing I can do "Export, Dump some row(s), Number of rows: 1, Row to begin at: 470, View output as text" helped A LOT! Farewell to uncertainty.
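(For anyone else fumbling in phpMyAdmin: that export dialog is just a point-and-click way of running something like

> -- dump a single row at offset 470, same as the export dialog above
> SELECT * FROM phpbb_posts LIMIT 470, 1;

though, per my worries below, the converter's offset really counts rows of its joined query rather than of phpbb_posts alone.)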
Damn, I just realized phpMyAdmin's browsing interface was confusing me and I was giving netweb the wrong topic! Now I have sent him the correct one plus its first post.
Yes, I’m trying to locate some blocking replies to send to netweb for testing.
JJJ: does this logic mean it skips the first post in each topic? LEFT JOIN phpbb_topics AS topics USING (topic_id) WHERE posts.post_id != topics.topic_first_post_id LIMIT 470, 1
I think if this is true, I can't just look at row 470 of phpbb_posts in phpMyAdmin, but would have to run a modified query to display the exact offending post.
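Something like this is what I mean — the converter's replies query at the stalled offset, but selecting the identifying columns instead (my guess at the full query, pieced together from the snippet above):

> -- re-run the converter's replies selection at offset 470,
> -- but pull the ids so the offending post can be identified
> SELECT posts.post_id, posts.topic_id
> FROM phpbb_posts AS posts
> LEFT JOIN phpbb_topics AS topics USING (topic_id)
> WHERE posts.post_id != topics.topic_first_post_id
> LIMIT 470, 1;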
Ok, restarting my web server did the trick and now it continues with the replies.
Sent you an email. The next stop was the replies, at LIMIT 470. Managed to skip it and left it running overnight. When I came to check, it had stalled at LIMIT 6874 and had output a huge line of dashes. I pressed Stop and went to bump bbp_converter_start. "Starting conversion," it said, and started to output more dashes without actually doing anything.
If you can confirm that 470 and 6874 mean row numbers in phpbb_posts, I will send those rows to you. But then I will wait for a bbPress update with better debugging, because I can't keep doing conversions 1 row at a time for weeks on end hoping it won't crash.
Can you contact me privately?
To add to my debugging wish: it would be nice to have the specific info even when running with more than 1 row at a time.

660 does not mean the topic_id but the row number. I finally managed to pinpoint the correct offending topic. Having to deal with row numbers, I was a bit confused about which topic I had to skip. There really should be some debugging feature that displays the exact topic_id, or whatever the first (= most important) field of the given table is.
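The workaround I ended up with, for anyone stuck the same way, was re-running the converter's own topics query but selecting just the ids (a sketch, with offset 660 as the example):

> -- row 660 of the converter's topics pass -> the actual topic_id
> SELECT topics.topic_id, topics.topic_title
> FROM phpbb_topics AS topics
> INNER JOIN phpbb_posts AS posts USING (topic_id)
> WHERE posts.post_id = topics.topic_first_post_id
> LIMIT 660, 1;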
First I deleted a couple of topics (not the right ones) in phpBB. Then when it got stuck again, I realized which of them was the offender. Then I used the trick of bumping the value in bbp_converter_start by 1 and pressing Start to restart the conversion from where it left off.
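The bump can also be done straight in the database instead of through the form. Roughly like this, though I'm only guessing at exactly how bbPress names the option in wp_options, so verify the row exists first:

> -- nudge the converter one row past the stuck one; option name is a guess, check wp_options
> UPDATE wp_options
> SET option_value = option_value + 1
> WHERE option_name LIKE '%bbp_converter_start%';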
The topic, as far as I can see, does not contain anything out of the ordinary: it's by an admin, so his name is colored red, but otherwise the fields are fine.
Did those and it still gets stuck. Can you confirm that the topic is on row 660?
It is a clean WP install on my local machine. phpBB is 3.0.9.
I finally ran it 1 row at a time and this is the content of bbp_converter_query:
> SELECT convert(topics.topic_id USING "utf8") AS topic_id,
> convert(topics.forum_id USING "utf8") AS forum_id,
> convert(topics.topic_poster USING "utf8") AS topic_poster,
> convert(posts.post_text USING "utf8") AS post_text,
> convert(topics.topic_title USING "utf8") AS topic_title,
> convert(topics.topic_time USING "utf8") AS topic_time
> FROM phpbb_topics AS topics
> INNER JOIN phpbb_posts AS posts USING (topic_id)
> WHERE posts.post_id = topics.topic_first_post_id
> LIMIT 660, 1

How can I determine the correct id in phpMyAdmin? Is it row 660? If I set "Show 30 rows starting from 660", will it show the correct topic id as the first one? The topic in question doesn't seem to have anything out of the ordinary in it.
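My fear is that these are two different orderings:

> -- what phpMyAdmin's "Show 30 rows starting from 660" runs on the raw table
> SELECT * FROM phpbb_topics LIMIT 660, 30;
> -- vs. the converter's offset, which applies to its joined result above;
> -- with no ORDER BY, nothing guarantees the two row orders line up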
bbPress 2.1.1 was released and the notes said “Fixed Invision, phpBB, and vBulletin importers”.
Didn’t help with my problem! Is there any way I can debug this?
Tried with 10 rows; 1 sec and 5 sec delays both result in a crash: Converting topics (650 – 659)

I’m starting to suspect there is something fishy about this particular group of topics: "Converting topics (500 – 599)" crashes every time. I had a delay of 5 seconds and could see in Resource Monitor that disk use dropped to zero for a good while.
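If it helps anyone reproduce this, I'm peeking at the suspect range roughly like this (a sketch; that one of those first posts is abnormally large is only my best guess at the cause):

> -- eyeball rows 500-599 of the topics pass for oversized first posts
> SELECT topics.topic_id, LENGTH(posts.post_text) AS post_bytes
> FROM phpbb_topics AS topics
> INNER JOIN phpbb_posts AS posts USING (topic_id)
> WHERE posts.post_id = topics.topic_first_post_id
> LIMIT 500, 100;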
Thanks, I’ll try decreasing the rows, increasing the delay, and watching my disk in Resource Monitor. I already had memory_limit = 1000M.