08-Jan-2022 08:45:20

padua_test():
  MATLAB/Octave version 9.8.0.1380330 (R2020a) Update 2
  Test padua().

PADUA_ORDER_TEST
  PADUA_ORDER converts the level L into the order N of any Padua rule.

     L     N
     0     1
     1     3
     2     6
     3    10
     4    15
     5    21
     6    28
     7    36
     8    45
     9    55
    10    66
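  The order column above is the triangular number N = (L+1)*(L+2)/2.  A
  minimal Octave/MATLAB sketch (independent of the library code) reproduces
  the table directly from that formula:

    % Sketch: reproduce the L -> N table above from the triangular-number
    % formula for the number of Padua points of level L.
    for l = 0 : 10
      n = ( l + 1 ) * ( l + 2 ) / 2;
      fprintf ( '  %4d  %8d\n', l, n );
    end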
PADUA_PLOT_TEST
  PADUA_PLOT plots the Padua points.

  Plot file stored as "padua_00.png"
  Plot file stored as "padua_01.png"
  Plot file stored as "padua_02.png"
  Plot file stored as "padua_03.png"
  Plot file stored as "padua_04.png"
  Plot file stored as "padua_05.png"
  Plot file stored as "padua_06.png"
  Plot file stored as "padua_07.png"
  Plot file stored as "padua_08.png"
  Plot file stored as "padua_09.png"
  Plot file stored as "padua_10.png"

PADUA_POINTS_TEST
  PADUA_POINTS returns the points of a Padua rule.

  Level 0 Padua points:

  Row:        1           2
  Col
    1:  0  0

  Level 1 Padua points:

  Row:        1           2
  Col
    1:  1  0
    2:  -1  1
    3:  -1  -1

  Level 2 Padua points:

  Row:        1           2
  Col
    1:  1  0.5
    2:  1  -1
    3:  0  1
    4:  0  -0.5
    5:  -1  0.5
    6:  -1  -1

  Level 3 Padua points:

  Row:        1           2
  Col
    1:  1  0.707107
    2:  1  -0.707107
    3:  0.5  1
    4:  0.5  0
    5:  0.5  -1
    6:  -0.5  0.707107
    7:  -0.5  -0.707107
    8:  -1  1
    9:  -1  0
   10:  -1  -1

  Level 4 Padua points:

  Row:        1           2
  Col
    1:  1  0.809017
    2:  1  -0.309017
    3:  1  -1
    4:  0.707107  1
    5:  0.707107  0.309017
    6:  0.707107  -0.809017
    7:  0  0.809017
    8:  0  -0.309017
    9:  0  -1
   10:  -0.707107  1
   11:  -0.707107  0.309017
   12:  -0.707107  -0.809017
   13:  -1  0.809017
   14:  -1  -0.309017
   15:  -1  -1

  Level 5 Padua points:

  Row:        1           2
  Col
    1:  1  0.866025
    2:  1  0
    3:  1  -0.866025
    4:  0.809017  1
    5:  0.809017  0.5
    6:  0.809017  -0.5
    7:  0.809017  -1
    8:  0.309017  0.866025
    9:  0.309017  0
   10:  0.309017  -0.866025
   11:  -0.309017  1
   12:  -0.309017  0.5
   13:  -0.309017  -0.5
   14:  -0.309017  -1
   15:  -0.809017  0.866025
   16:  -0.809017  0
   17:  -0.809017  -0.866025
   18:  -1  1
   19:  -1  0.5
   20:  -1  -0.5
   21:  -1  -1

  Level 6 Padua points:

  Row:        1           2
  Col
    1:  1  0.900969
    2:  1  0.222521
    3:  1  -0.62349
    4:  1  -1
    5:  0.866025  1
    6:  0.866025  0.62349
    7:  0.866025  -0.222521
    8:  0.866025  -0.900969
    9:  0.5  0.900969
   10:  0.5  0.222521
   11:  0.5  -0.62349
   12:  0.5  -1
   13:  0  1
   14:  0  0.62349
   15:  0  -0.222521
   16:  0  -0.900969
   17:  -0.5  0.900969
   18:  -0.5  0.222521
   19:  -0.5  -0.62349
   20:  -0.5  -1
   21:  -0.866025  1
   22:  -0.866025  0.62349
   23:  -0.866025  -0.222521
   24:  -0.866025  -0.900969
   25:  -1  0.900969
   26:  -1  0.222521
   27:  -1  -0.62349
   28:  -1  -1

  Level 7 Padua points:

  Row:        1           2
  Col
    1:  1  0.92388
    2:  1  0.382683
    3:  1  -0.382683
    4:  1  -0.92388
    5:  0.900969  1
    6:  0.900969  0.707107
    7:  0.900969  0
    8:  0.900969  -0.707107
    9:  0.900969  -1
   10:  0.62349  0.92388
   11:  0.62349  0.382683
   12:  0.62349  -0.382683
   13:  0.62349  -0.92388
   14:  0.222521  1
   15:  0.222521  0.707107
   16:  0.222521  0
   17:  0.222521  -0.707107
   18:  0.222521  -1
   19:  -0.222521  0.92388
   20:  -0.222521  0.382683
   21:  -0.222521  -0.382683
   22:  -0.222521  -0.92388
   23:  -0.62349  1
   24:  -0.62349  0.707107
   25:  -0.62349  0
   26:  -0.62349  -0.707107
   27:  -0.62349  -1
   28:  -0.900969  0.92388
   29:  -0.900969  0.382683
   30:  -0.900969  -0.382683
   31:  -0.900969  -0.92388
   32:  -1  1
   33:  -1  0.707107
   34:  -1  0
   35:  -1  -0.707107
   36:  -1  -1

  Level 8 Padua points:

  Row:        1           2
  Col
    1:  1  0.939693
    2:  1  0.5
    3:  1  -0.173648
    4:  1  -0.766044
    5:  1  -1
    6:  0.92388  1
    7:  0.92388  0.766044
    8:  0.92388  0.173648
    9:  0.92388  -0.5
   10:  0.92388  -0.939693
   11:  0.707107  0.939693
   12:  0.707107  0.5
   13:  0.707107  -0.173648
   14:  0.707107  -0.766044
   15:  0.707107  -1
   16:  0.382683  1
   17:  0.382683  0.766044
   18:  0.382683  0.173648
   19:  0.382683  -0.5
   20:  0.382683  -0.939693
   21:  0  0.939693
   22:  0  0.5
   23:  0  -0.173648
   24:  0  -0.766044
   25:  0  -1
   26:  -0.382683  1
   27:  -0.382683  0.766044
   28:  -0.382683  0.173648
   29:  -0.382683  -0.5
   30:  -0.382683  -0.939693
   31:  -0.707107  0.939693
   32:  -0.707107  0.5
   33:  -0.707107  -0.173648
   34:  -0.707107  -0.766044
   35:  -0.707107  -1
   36:  -0.92388  1
   37:  -0.92388  0.766044
   38:  -0.92388  0.173648
   39:  -0.92388  -0.5
   40:  -0.92388  -0.939693
   41:  -1  0.939693
   42:  -1  0.5
   43:  -1  -0.173648
   44:  -1  -0.766044
   45:  -1  -1

  Level 9 Padua points:

  Row:        1           2
  Col
    1:  1  0.951057
    2:  1  0.587785
    3:  1  0
    4:  1  -0.587785
    5:  1  -0.951057
    6:  0.939693  1
    7:  0.939693  0.809017
    8:  0.939693  0.309017
    9:  0.939693  -0.309017
   10:  0.939693  -0.809017
   11:  0.939693  -1
   12:  0.766044  0.951057
   13:  0.766044  0.587785
   14:  0.766044  0
   15:  0.766044  -0.587785
   16:  0.766044  -0.951057
   17:  0.5  1
   18:  0.5  0.809017
   19:  0.5  0.309017
   20:  0.5  -0.309017
   21:  0.5  -0.809017
   22:  0.5  -1
   23:  0.173648  0.951057
   24:  0.173648  0.587785
   25:  0.173648  0
   26:  0.173648  -0.587785
   27:  0.173648  -0.951057
   28:  -0.173648  1
   29:  -0.173648  0.809017
   30:  -0.173648  0.309017
   31:  -0.173648  -0.309017
   32:  -0.173648  -0.809017
   33:  -0.173648  -1
   34:  -0.5  0.951057
   35:  -0.5  0.587785
   36:  -0.5  0
   37:  -0.5  -0.587785
   38:  -0.5  -0.951057
   39:  -0.766044  1
   40:  -0.766044  0.809017
   41:  -0.766044  0.309017
   42:  -0.766044  -0.309017
   43:  -0.766044  -0.809017
   44:  -0.766044  -1
   45:  -0.939693  0.951057
   46:  -0.939693  0.587785
   47:  -0.939693  0
   48:  -0.939693  -0.587785
   49:  -0.939693  -0.951057
   50:  -1  1
   51:  -1  0.809017
   52:  -1  0.309017
   53:  -1  -0.309017
   54:  -1  -0.809017
   55:  -1  -1

  Level 10 Padua points:

  Row:        1           2
  Col
    1:  1  0.959493
    2:  1  0.654861
    3:  1  0.142315
    4:  1  -0.415415
    5:  1  -0.841254
    6:  1  -1
    7:  0.951057  1
    8:  0.951057  0.841254
    9:  0.951057  0.415415
   10:  0.951057  -0.142315
   11:  0.951057  -0.654861
   12:  0.951057  -0.959493
   13:  0.809017  0.959493
   14:  0.809017  0.654861
   15:  0.809017  0.142315
   16:  0.809017  -0.415415
   17:  0.809017  -0.841254
   18:  0.809017  -1
   19:  0.587785  1
   20:  0.587785  0.841254
   21:  0.587785  0.415415
   22:  0.587785  -0.142315
   23:  0.587785  -0.654861
   24:  0.587785  -0.959493
   25:  0.309017  0.959493
   26:  0.309017  0.654861
   27:  0.309017  0.142315
   28:  0.309017  -0.415415
   29:  0.309017  -0.841254
   30:  0.309017  -1
   31:  0  1
   32:  0  0.841254
   33:  0  0.415415
   34:  0  -0.142315
   35:  0  -0.654861
   36:  0  -0.959493
   37:  -0.309017  0.959493
   38:  -0.309017  0.654861
   39:  -0.309017  0.142315
   40:  -0.309017  -0.415415
   41:  -0.309017  -0.841254
   42:  -0.309017  -1
   43:  -0.587785  1
   44:  -0.587785  0.841254
   45:  -0.587785  0.415415
   46:  -0.587785  -0.142315
   47:  -0.587785  -0.654861
   48:  -0.587785  -0.959493
   49:  -0.809017  0.959493
   50:  -0.809017  0.654861
   51:  -0.809017  0.142315
   52:  -0.809017  -0.415415
   53:  -0.809017  -0.841254
   54:  -0.809017  -1
   55:  -0.951057  1
   56:  -0.951057  0.841254
   57:  -0.951057  0.415415
   58:  -0.951057  -0.142315
   59:  -0.951057  -0.654861
   60:  -0.951057  -0.959493
   61:  -1  0.959493
   62:  -1  0.654861
   63:  -1  0.142315
   64:  -1  -0.415415
   65:  -1  -0.841254
   66:  -1  -1
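  The coordinates above are consistent with the usual description of the
  Padua points as the "odd" half of a product of two Chebyshev-Lobatto
  grids: ( cos(j*pi/L), cos(k*pi/(L+1)) ) for 0 <= j <= L, 0 <= k <= L+1
  with j + k odd, with level 0 reduced to the single point (0,0).  The
  sketch below generates the points that way; it reproduces the tables
  above up to floating-point roundoff, but it is only an illustration and
  not necessarily the construction used inside padua_points.m.

    function xy = padua_points_sketch ( l )
    % Sketch (illustrative only): Padua points of level L as the pairs
    % ( cos(j*pi/l), cos(k*pi/(l+1)) ) with 0 <= j <= l, 0 <= k <= l+1
    % and j + k odd.  Returns a 2 x N array, N = (l+1)*(l+2)/2, in the
    % same ordering as the PADUA_POINTS_TEST tables above.
      if ( l == 0 )
        xy = [ 0.0; 0.0 ];
        return
      end
      n = ( l + 1 ) * ( l + 2 ) / 2;
      xy = zeros ( 2, n );
      p = 0;
      for j = 0 : l
        for k = 0 : l + 1
          if ( mod ( j + k, 2 ) == 1 )
            p = p + 1;
            xy(1,p) = cos ( j * pi / l );
            xy(2,p) = cos ( k * pi / ( l + 1 ) );
          end
        end
      end
    end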
PADUA_POINTS_SET_TEST
  PADUA_POINTS_SET looks the Padua points up in a table.

  Level 3 Padua points

    1  1  0.707107  1  0.707107
    2  1  -0.707107  1  -0.707107
    3  0.5  1  0.5  1
    4  0.5  0  0.5  0
    5  0.5  -1  0.5  -1
    6  -0.5  0.707107  -0.5  0.707107
    7  -0.5  -0.707107  -0.5  -0.707107
    8  -1  1  -1  1
    9  -1  0  -1  0
   10  -1  -1  -1  -1

  Level 4 Padua points

    1  1  0.809017  1  0.809017
    2  1  -0.309017  1  -0.309017
    3  1  -1  1  -1
    4  0.707107  1  0.707107  1
    5  0.707107  0.309017  0.707107  0.309017
    6  0.707107  -0.809017  0.707107  -0.809017
    7  0  0.809017  0  0.809017
    8  0  -0.309017  0  -0.309017
    9  0  -1  0  -1
   10  -0.707107  1  -0.707107  1
   11  -0.707107  0.309017  -0.707107  0.309017
   12  -0.707107  -0.809017  -0.707107  -0.809017
   13  -1  0.809017  -1  0.809017
   14  -1  -0.309017  -1  -0.309017
   15  -1  -1  -1  -1

PADUA_WEIGHTS_TEST
  PADUA_WEIGHTS returns the weights of a Padua rule.

  Level 0 Padua weights:

    1:  4

  Level 1 Padua weights:

    1:  2
    2:  1
    3:  1

  Level 2 Padua weights:

    1:  0.666667
    2:  3.70074e-17
    3:  0.444444
    4:  2.22222
    5:  0.666667
    6:  3.70074e-17

  Level 3 Padua weights:

    1:  0.111111
    2:  0.111111
    3:  0.222222
    4:  1.33333
    5:  0.222222
    6:  0.888889
    7:  0.888889
    8:  -0.0555556
    9:  0.333333
   10:  -0.0555556

  Level 4 Padua weights:

    1:  0.061173
    2:  0.0810492
    3:  -0.00888889
    4:  0.0533333
    5:  0.625924
    6:  0.38741
    7:  0.545807
    8:  0.983082
    9:  0.0711111
   10:  0.0533333
   11:  0.625924
   12:  0.38741
   13:  0.061173
   14:  0.0810492
   15:  -0.00888889

  Level 5 Padua weights:

    1:  0.0207407
    2:  0.0385185
    3:  0.0207407
    4:  0.0318931
    5:  0.32885
    6:  0.32885
    7:  0.0318931
    8:  0.280452
    9:  0.63761
   10:  0.280452
   11:  0.0451439
   12:  0.554113
   13:  0.554113
   14:  0.0451439
   15:  0.187696
   16:  0.346093
   17:  0.187696
   18:  -0.0103704
   19:  0.0503704
   20:  0.0503704
   21:  -0.0103704

  Level 6 Padua weights:

    1:  0.0142543
    2:  0.0263393
    3:  0.0195727
    4:  -0.00302343
    5:  0.0120937
    6:  0.174314
    7:  0.22089
    8:  0.100638
    9:  0.167723
   10:  0.402706
   11:  0.324506
   12:  0.01935
   13:  0.0247921
   14:  0.372182
   15:  0.448963
   16:  0.195332
   17:  0.167723
   18:  0.402706
   19:  0.324506
   20:  0.01935
   21:  0.0120937
   22:  0.174314
   23:  0.22089
   24:  0.100638
   25:  0.0142543
   26:  0.0263393
   27:  0.0195727
   28:  -0.00302343

  Level 7 Padua weights:

    1:  0.00635588
    2:  0.0140523
    3:  0.0140523
    4:  0.00635588
    5:  0.00864992
    6:  0.106201
    7:  0.150581
    8:  0.106201
    9:  0.00864992
   10:  0.0964699
   11:  0.255773
   12:  0.255773
   13:  0.0964699
   14:  0.0177137
   15:  0.249093
   16:  0.340804
   17:  0.249093
   18:  0.0177137
   19:  0.122182
   20:  0.315027
   21:  0.315027
   22:  0.122182
   23:  0.0126387
   24:  0.197994
   25:  0.283218
   26:  0.197994
   27:  0.0126387
   28:  0.0594932
   29:  0.130648
   30:  0.130648
   31:  0.0594932
   32:  -0.00328798
   33:  0.0133787
   34:  0.0206349
   35:  0.0133787
   36:  -0.00328798

  Level 8 Padua weights:

    1:  0.00493068
    2:  0.0102646
    3:  0.0111146
    4:  0.00670609
    5:  -0.00126984
    6:  0.00406163
    7:  0.0640019
    8:  0.0996591
    9:  0.0883832
   10:  0.0363315
   11:  0.063426
   12:  0.169947
   13:  0.19275
   14:  0.125834
   15:  0.00677249
   16:  0.0100477
   17:  0.16481
   18:  0.247731
   19:  0.217084
   20:  0.0837633
   21:  0.0893669
   22:  0.236402
   23:  0.272941
   24:  0.178433
   25:  0.0101587
   26:  0.0100477
   27:  0.16481
   28:  0.247731
   29:  0.217084
   30:  0.0837633
   31:  0.063426
   32:  0.169947
   33:  0.19275
   34:  0.125834
   35:  0.00677249
   36:  0.00406163
   37:  0.0640019
   38:  0.0996591
   39:  0.0883832
   40:  0.0363315
   41:  0.00493068
   42:  0.0102646
   43:  0.0111146
   44:  0.00670609
   45:  -0.00126984

  Level 9 Padua weights:

    1:  0.00255069
    2:  0.00608423
    3:  0.00742152
    4:  0.00608423
    5:  0.00255069
    6:  0.00323908
    7:  0.0431759
    8:  0.0701525
    9:  0.0701525
   10:  0.0431759
   11:  0.00323908
   12:  0.040319
   13:  0.114298
   14:  0.141335
   15:  0.114298
   16:  0.040319
   17:  0.00750617
   18:  0.114216
   19:  0.180218
   20:  0.180218
   21:  0.114216
   22:  0.00750617
   23:  0.0623497
   24:  0.173042
   25:  0.216942
   26:  0.173042
   27:  0.0623497
   28:  0.00818618
   29:  0.129536
   30:  0.206141
   31:  0.206141
   32:  0.129536
   33:  0.00818618
   34:  0.0556455
   35:  0.152439
   36:  0.187711
   37:  0.152439
   38:  0.0556455
   39:  0.0046594
   40:  0.0835452
   41:  0.13708
   42:  0.13708
   43:  0.0835452
   44:  0.0046594
   45:  0.0242529
   46:  0.0572733
   47:  0.0700826
   48:  0.0572733
   49:  0.0242529
   50:  -0.00136861
   51:  0.00483798
   52:  0.00887631
   53:  0.00887631
   54:  0.00483798
   55:  -0.00136861

  Level 10 Padua weights:

    1:  0.00213535
    2:  0.00470574
    3:  0.00589175
    4:  0.00525003
    5:  0.00284323
    6:  -0.000624076
    7:  0.00171962
    8:  0.0284347
    9:  0.0486334
   10:  0.053265
   11:  0.040996
   12:  0.0161094
   13:  0.0285577
   14:  0.0806621
   15:  0.105321
   16:  0.0965987
   17:  0.0572471
   18:  0.00288377
   19:  0.0046107
   20:  0.0793285
   21:  0.131369
   22:  0.143017
   23:  0.109039
   24:  0.0398129
   25:  0.0460306
   26:  0.128358
   27:  0.169606
   28:  0.156232
   29:  0.0931136
   30:  0.0050865
   31:  0.00570484
   32:  0.0982275
   33:  0.163504
   34:  0.176419
   35:  0.134728
   36:  0.0489489
   37:  0.0460306
   38:  0.128358
   39:  0.169606
   40:  0.156232
   41:  0.0931136
   42:  0.0050865
   43:  0.0046107
   44:  0.0793285
   45:  0.131369
   46:  0.143017
   47:  0.109039
   48:  0.0398129
   49:  0.0285577
   50:  0.0806621
   51:  0.105321
   52:  0.0965987
   53:  0.0572471
   54:  0.00288377
   55:  0.00171962
   56:  0.0284347
   57:  0.0486334
   58:  0.053265
   59:  0.040996
   60:  0.0161094
   61:  0.00213535
   62:  0.00470574
   63:  0.00589175
   64:  0.00525003
   65:  0.00284323
   66:  -0.000624076
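  A few of the weights above are tiny or slightly negative (for example
  3.70074e-17 at level 2 and -0.0555556 at level 3); that is normal for an
  interpolatory rule.  One basic consistency check, already visible at
  levels 0 and 1, is that the weights of every level sum to 4, the area of
  the square [-1,1] x [-1,1].  A minimal sketch of that check, assuming the
  calling sequence w = padua_weights ( l ) suggested by the test above:

    % Sketch: the Padua weights of each level should sum to 4, the area
    % of [-1,1] x [-1,1].  Assumes w = padua_weights ( l ) returns the
    % weight vector, as exercised by PADUA_WEIGHTS_TEST.
    for l = 0 : 10
      w = padua_weights ( l );
      fprintf ( '  level %2d:  sum(w) = %.12f\n', l, sum ( w ) );
    end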
PADUA_WEIGHTS_SET_TEST
  PADUA_WEIGHTS_SET looks up Padua weights in a table.

  Level 3 Padua weights

    1  0.111111  0.111111
    2  0.111111  0.111111
    3  0.222222  0.222222
    4  1.33333  1.33333
    5  0.222222  0.222222
    6  0.888889  0.888889
    7  0.888889  0.888889
    8  -0.0555556  -0.0555556
    9  0.333333  0.333333
   10  -0.0555556  -0.0555556

  Maximum difference = 7.77156e-16

  Level 4 Padua weights

    1  0.061173  0.061173
    2  0.0810492  0.0810492
    3  -0.00888889  -0.00888889
    4  0.0533333  0.0533333
    5  0.625924  0.625924
    6  0.38741  0.38741
    7  0.545807  0.545807
    8  0.983082  0.983082
    9  0.0711111  0.0711111
   10  0.0533333  0.0533333
   11  0.625924  0.625924
   12  0.38741  0.38741
   13  0.061173  0.061173
   14  0.0810492  0.0810492
   15  -0.00888889  -0.00888889

  Maximum difference = 1.22125e-15

padua_test():
  Normal end of execution.

08-Jan-2022 08:45:27
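  As a closing usage note, the points and weights above form a cubature
  rule for the square [-1,1] x [-1,1].  The sketch below applies the rule
  to f(x,y) = exp(x+y), whose exact integral over the square is
  (e - 1/e)^2, and should show the error shrinking rapidly as the level
  increases.  It assumes the calling sequences xy = padua_points ( l ),
  returning a 2 x N coordinate array as the Row/Col printouts above
  suggest, and w = padua_weights ( l ):

    % Sketch: apply the Padua rule of level L as a cubature rule on the
    % square.  Assumes xy = padua_points ( l ) is 2 x N and
    % w = padua_weights ( l ) is the matching weight vector, as in the
    % tests above.
    f = @( x, y ) exp ( x + y );
    exact = ( exp ( 1.0 ) - exp ( - 1.0 ) )^2;
    for l = 2 : 2 : 10
      xy = padua_points ( l );
      w = padua_weights ( l );
      q = sum ( w(:) .* f ( xy(1,:)', xy(2,:)' ) );
      fprintf ( '  level %2d:  Q = %.12f  error = %.3e\n', ...
        l, q, abs ( q - exact ) );
    end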