Apr 19 12:27:59.350141 ip-10-0-142-55 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 19 12:27:59.350155 ip-10-0-142-55 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 19 12:27:59.350165 ip-10-0-142-55 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 19 12:27:59.350615 ip-10-0-142-55 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 19 12:28:09.586516 ip-10-0-142-55 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 19 12:28:09.586539 ip-10-0-142-55 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 4c05acaccaf041f899ffcf0a23982cde --
Apr 19 12:30:31.719452 ip-10-0-142-55 systemd[1]: Starting Kubernetes Kubelet...
Apr 19 12:30:32.196919 ip-10-0-142-55 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 19 12:30:32.196919 ip-10-0-142-55 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 19 12:30:32.196919 ip-10-0-142-55 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 19 12:30:32.196919 ip-10-0-142-55 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 19 12:30:32.196919 ip-10-0-142-55 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 19 12:30:32.200791 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.200692 2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 19 12:30:32.205762 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205732 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 12:30:32.205762 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205752 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 12:30:32.205762 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205755 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 12:30:32.205762 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205759 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 12:30:32.205762 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205763 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 12:30:32.205762 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205765 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 12:30:32.205762 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205771 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 12:30:32.206007 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205775 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 12:30:32.206007 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205779 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 12:30:32.206007 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205782 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 12:30:32.206007 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205784 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 12:30:32.206007 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205787 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 12:30:32.206007 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205790 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 19 12:30:32.206007 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205793 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 12:30:32.206007 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205795 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 12:30:32.206007 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205798 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 12:30:32.206007 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205801 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 12:30:32.206007 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205803 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 12:30:32.206007 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205806 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 12:30:32.206007 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205808 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 12:30:32.206007 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205811 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 12:30:32.206007 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205813 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 12:30:32.206007 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205816 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 12:30:32.206007 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205819 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 12:30:32.206007 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205822 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 12:30:32.206007 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205824 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 12:30:32.206007 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205827 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 12:30:32.206539 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205837 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 12:30:32.206539 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205840 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 12:30:32.206539 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205843 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 12:30:32.206539 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205845 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 12:30:32.206539 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205848 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 12:30:32.206539 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205850 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 12:30:32.206539 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205853 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 12:30:32.206539 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205856 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 12:30:32.206539 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205858 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 12:30:32.206539 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205861 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 12:30:32.206539 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205863 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 12:30:32.206539 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205866 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 12:30:32.206539 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205869 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 12:30:32.206539 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205873 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 12:30:32.206539 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205877 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 12:30:32.206539 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205880 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 12:30:32.206539 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205883 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 12:30:32.206539 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205887 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 12:30:32.206539 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205890 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 12:30:32.207008 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205893 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 12:30:32.207008 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205896 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 12:30:32.207008 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205899 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 12:30:32.207008 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205903 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 12:30:32.207008 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205905 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 12:30:32.207008 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205909 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 12:30:32.207008 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205911 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 19 12:30:32.207008 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205914 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 12:30:32.207008 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205916 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 19 12:30:32.207008 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205919 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 12:30:32.207008 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205922 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 12:30:32.207008 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205924 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 12:30:32.207008 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205928 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 12:30:32.207008 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205931 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 12:30:32.207008 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205933 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 12:30:32.207008 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205936 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 12:30:32.207008 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205938 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 12:30:32.207008 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205941 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 12:30:32.207008 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205944 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 12:30:32.207008 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205946 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 12:30:32.207508 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205949 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 12:30:32.207508 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205953 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 12:30:32.207508 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205956 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 12:30:32.207508 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205958 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 12:30:32.207508 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205960 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 12:30:32.207508 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205963 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 12:30:32.207508 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205967 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 12:30:32.207508 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205970 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 12:30:32.207508 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205972 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 12:30:32.207508 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205975 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 12:30:32.207508 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205978 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 12:30:32.207508 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205980 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 12:30:32.207508 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205985 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 12:30:32.207508 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205987 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 12:30:32.207508 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205990 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 12:30:32.207508 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205992 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 12:30:32.207508 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205995 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 12:30:32.207508 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.205998 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 12:30:32.207508 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.206001 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 12:30:32.207508 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.206003 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 12:30:32.207989 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207081 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 12:30:32.207989 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207089 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 12:30:32.207989 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207094 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 12:30:32.207989 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207098 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 12:30:32.207989 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207101 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 12:30:32.207989 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207104 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 12:30:32.207989 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207107 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 12:30:32.207989 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207110 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 12:30:32.207989 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207113 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 12:30:32.207989 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207116 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 12:30:32.207989 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207119 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 12:30:32.207989 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207122 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 12:30:32.207989 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207125 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 12:30:32.207989 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207128 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 12:30:32.207989 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207130 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 12:30:32.207989 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207133 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 12:30:32.207989 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207136 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 12:30:32.207989 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207139 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 12:30:32.207989 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207143 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 12:30:32.207989 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207145 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 12:30:32.208530 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207148 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 12:30:32.208530 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207150 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 12:30:32.208530 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207153 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 12:30:32.208530 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207156 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 12:30:32.208530 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207187 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 12:30:32.208530 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207192 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 19 12:30:32.208530 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207194 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 12:30:32.208530 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207197 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 12:30:32.208530 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207200 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 12:30:32.208530 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207202 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 12:30:32.208530 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207204 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 12:30:32.208530 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207207 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 12:30:32.208530 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207210 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 12:30:32.208530 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207213 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 12:30:32.208530 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207215 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 12:30:32.208530 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207219 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 12:30:32.208530 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207222 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 12:30:32.208530 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207225 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 12:30:32.208530 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207228 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 12:30:32.208530 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207231 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 12:30:32.209013 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207234 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 12:30:32.209013 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207237 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 12:30:32.209013 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207240 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 12:30:32.209013 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207242 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 12:30:32.209013 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207245 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 12:30:32.209013 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207248 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 19 12:30:32.209013 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207251 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 12:30:32.209013 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207253 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 12:30:32.209013 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207256 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 12:30:32.209013 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207259 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 12:30:32.209013 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207262 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 12:30:32.209013 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207265 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 12:30:32.209013 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207267 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 12:30:32.209013 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207270 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 12:30:32.209013 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207272 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 12:30:32.209013 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207275 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 12:30:32.209013 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207277 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 12:30:32.209013 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207280 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 12:30:32.209013 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207283 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 12:30:32.209013 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207286 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 19 12:30:32.209503 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207288 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 12:30:32.209503 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207291 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 12:30:32.209503 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207293 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 12:30:32.209503 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207295 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 12:30:32.209503 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207298 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 12:30:32.209503 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207300 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 12:30:32.209503 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207303 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 12:30:32.209503 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207305 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 12:30:32.209503 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207309 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 12:30:32.209503 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207311 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 12:30:32.209503 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207313 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 12:30:32.209503 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207316 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 12:30:32.209503 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207318 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 12:30:32.209503 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207321 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 12:30:32.209503 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207323 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 12:30:32.209503 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207327 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 12:30:32.209503 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207330 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 12:30:32.209503 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207332 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 12:30:32.209503 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207335 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 12:30:32.209972 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207338 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 12:30:32.209972 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207340 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 12:30:32.209972 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207343 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 12:30:32.209972 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207346 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 12:30:32.209972 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207348 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 12:30:32.209972 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207351 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 12:30:32.209972 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.207353 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 12:30:32.209972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207428 2567 flags.go:64] FLAG: --address="0.0.0.0"
Apr 19 12:30:32.209972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207435 2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 19 12:30:32.209972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207442 2567 flags.go:64] FLAG: --anonymous-auth="true"
Apr 19 12:30:32.209972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207447 2567 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 19 12:30:32.209972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207453 2567 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 19 12:30:32.209972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207456 2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 19 12:30:32.209972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207461 2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 19 12:30:32.209972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207465 2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 19 12:30:32.209972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207469 2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 19 12:30:32.209972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207472 2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 19 12:30:32.209972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207475 2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 19 12:30:32.209972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207479 2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 19 12:30:32.209972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207482 2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 19 12:30:32.209972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207485 2567 flags.go:64] FLAG: --cgroup-root=""
Apr 19 12:30:32.209972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207488 2567 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 19 12:30:32.210523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207491 2567 flags.go:64] FLAG: --client-ca-file=""
Apr 19 12:30:32.210523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207494 2567 flags.go:64] FLAG: --cloud-config=""
Apr 19 12:30:32.210523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207497 2567 flags.go:64] FLAG: --cloud-provider="external"
Apr 19 12:30:32.210523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207500 2567 flags.go:64] FLAG: --cluster-dns="[]"
Apr 19 12:30:32.210523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207504 2567 flags.go:64] FLAG: --cluster-domain=""
Apr 19 12:30:32.210523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207507 2567 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 19 12:30:32.210523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207511 2567 flags.go:64] FLAG: --config-dir=""
Apr 19 12:30:32.210523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207513 2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 19 12:30:32.210523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207517 2567 flags.go:64] FLAG: --container-log-max-files="5"
Apr 19 12:30:32.210523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207521 2567 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 19 12:30:32.210523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207525 2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 19 12:30:32.210523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207528 2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 19 12:30:32.210523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207532 2567 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 19 12:30:32.210523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207535 2567 flags.go:64] FLAG: --contention-profiling="false"
Apr 19 12:30:32.210523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207538 2567 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 19 12:30:32.210523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207540 2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 19 12:30:32.210523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207544 2567 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 19 12:30:32.210523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207547 2567 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 19 12:30:32.210523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207551 2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 19 12:30:32.210523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207554 2567 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 19 12:30:32.210523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207557 2567 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 19 12:30:32.210523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207560 2567 flags.go:64] FLAG: --enable-load-reader="false"
Apr 19 12:30:32.210523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207564 2567 flags.go:64] FLAG: --enable-server="true"
Apr 19 12:30:32.210523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207567 2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 19 12:30:32.210523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207571 2567 flags.go:64] FLAG: --event-burst="100"
Apr 19 12:30:32.211149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207574 2567 flags.go:64] FLAG: --event-qps="50"
Apr 19 12:30:32.211149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207578 2567 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 19 12:30:32.211149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207582 2567 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 19 12:30:32.211149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207585 2567 flags.go:64] FLAG: --eviction-hard=""
Apr 19 12:30:32.211149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207589 2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 19 12:30:32.211149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207592 2567 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 19 12:30:32.211149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207595 2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 19 12:30:32.211149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207599 2567 flags.go:64] FLAG: --eviction-soft=""
Apr 19 12:30:32.211149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207602 2567 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 19 12:30:32.211149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207605 2567 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 19 12:30:32.211149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207608 2567 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 19 12:30:32.211149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207610 2567 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 19 12:30:32.211149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207613 2567 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 19 12:30:32.211149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207616 2567 flags.go:64] FLAG: --fail-swap-on="true"
Apr 19 12:30:32.211149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207619 2567 flags.go:64] FLAG: --feature-gates=""
Apr 19 12:30:32.211149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207623 2567 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 19 12:30:32.211149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207626 2567 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 19 12:30:32.211149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207629 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 19 12:30:32.211149 ip-10-0-142-55 kubenswrapper[2567]: I0419
12:30:32.207632 2567 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 19 12:30:32.211149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207636 2567 flags.go:64] FLAG: --healthz-port="10248" Apr 19 12:30:32.211149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207638 2567 flags.go:64] FLAG: --help="false" Apr 19 12:30:32.211149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207641 2567 flags.go:64] FLAG: --hostname-override="ip-10-0-142-55.ec2.internal" Apr 19 12:30:32.211149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207644 2567 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 19 12:30:32.211149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207648 2567 flags.go:64] FLAG: --http-check-frequency="20s" Apr 19 12:30:32.211747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207651 2567 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 19 12:30:32.211747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207654 2567 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 19 12:30:32.211747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207657 2567 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 19 12:30:32.211747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207660 2567 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 19 12:30:32.211747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207663 2567 flags.go:64] FLAG: --image-service-endpoint="" Apr 19 12:30:32.211747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207665 2567 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 19 12:30:32.211747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207669 2567 flags.go:64] FLAG: --kube-api-burst="100" Apr 19 12:30:32.211747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207672 2567 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 19 
12:30:32.211747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207675 2567 flags.go:64] FLAG: --kube-api-qps="50" Apr 19 12:30:32.211747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207678 2567 flags.go:64] FLAG: --kube-reserved="" Apr 19 12:30:32.211747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207681 2567 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 19 12:30:32.211747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207684 2567 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 19 12:30:32.211747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207687 2567 flags.go:64] FLAG: --kubelet-cgroups="" Apr 19 12:30:32.211747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207689 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 19 12:30:32.211747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207692 2567 flags.go:64] FLAG: --lock-file="" Apr 19 12:30:32.211747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207695 2567 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 19 12:30:32.211747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207698 2567 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 19 12:30:32.211747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207701 2567 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 19 12:30:32.211747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207707 2567 flags.go:64] FLAG: --log-json-split-stream="false" Apr 19 12:30:32.211747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207709 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 19 12:30:32.211747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207712 2567 flags.go:64] FLAG: --log-text-split-stream="false" Apr 19 12:30:32.211747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207715 2567 flags.go:64] FLAG: --logging-format="text" Apr 19 12:30:32.211747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207718 2567 flags.go:64] FLAG: 
--machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 19 12:30:32.212335 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207724 2567 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 19 12:30:32.212335 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207727 2567 flags.go:64] FLAG: --manifest-url="" Apr 19 12:30:32.212335 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207730 2567 flags.go:64] FLAG: --manifest-url-header="" Apr 19 12:30:32.212335 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207734 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 19 12:30:32.212335 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207738 2567 flags.go:64] FLAG: --max-open-files="1000000" Apr 19 12:30:32.212335 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207742 2567 flags.go:64] FLAG: --max-pods="110" Apr 19 12:30:32.212335 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207745 2567 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 19 12:30:32.212335 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207748 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 19 12:30:32.212335 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207751 2567 flags.go:64] FLAG: --memory-manager-policy="None" Apr 19 12:30:32.212335 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207753 2567 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 19 12:30:32.212335 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207756 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 19 12:30:32.212335 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207759 2567 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 19 12:30:32.212335 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207762 2567 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 19 12:30:32.212335 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207770 2567 flags.go:64] FLAG: --node-status-max-images="50" Apr 
19 12:30:32.212335 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207773 2567 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 19 12:30:32.212335 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207776 2567 flags.go:64] FLAG: --oom-score-adj="-999" Apr 19 12:30:32.212335 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207779 2567 flags.go:64] FLAG: --pod-cidr="" Apr 19 12:30:32.212335 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207782 2567 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 19 12:30:32.212335 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207788 2567 flags.go:64] FLAG: --pod-manifest-path="" Apr 19 12:30:32.212335 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207791 2567 flags.go:64] FLAG: --pod-max-pids="-1" Apr 19 12:30:32.212335 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207794 2567 flags.go:64] FLAG: --pods-per-core="0" Apr 19 12:30:32.212335 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207797 2567 flags.go:64] FLAG: --port="10250" Apr 19 12:30:32.212335 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207800 2567 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 19 12:30:32.212335 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207803 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02ff57a6b3ad4acd5" Apr 19 12:30:32.212902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207806 2567 flags.go:64] FLAG: --qos-reserved="" Apr 19 12:30:32.212902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207809 2567 flags.go:64] FLAG: --read-only-port="10255" Apr 19 12:30:32.212902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207812 2567 flags.go:64] FLAG: --register-node="true" Apr 19 12:30:32.212902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207815 2567 flags.go:64] FLAG: --register-schedulable="true" Apr 19 12:30:32.212902 ip-10-0-142-55 
kubenswrapper[2567]: I0419 12:30:32.207818 2567 flags.go:64] FLAG: --register-with-taints="" Apr 19 12:30:32.212902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207822 2567 flags.go:64] FLAG: --registry-burst="10" Apr 19 12:30:32.212902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207824 2567 flags.go:64] FLAG: --registry-qps="5" Apr 19 12:30:32.212902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207827 2567 flags.go:64] FLAG: --reserved-cpus="" Apr 19 12:30:32.212902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207832 2567 flags.go:64] FLAG: --reserved-memory="" Apr 19 12:30:32.212902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207836 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 19 12:30:32.212902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207839 2567 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 19 12:30:32.212902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207842 2567 flags.go:64] FLAG: --rotate-certificates="false" Apr 19 12:30:32.212902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207845 2567 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 19 12:30:32.212902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207848 2567 flags.go:64] FLAG: --runonce="false" Apr 19 12:30:32.212902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207851 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 19 12:30:32.212902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207854 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 19 12:30:32.212902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207856 2567 flags.go:64] FLAG: --seccomp-default="false" Apr 19 12:30:32.212902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207859 2567 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 19 12:30:32.212902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207862 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 19 12:30:32.212902 
ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207865 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 19 12:30:32.212902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207869 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 19 12:30:32.212902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207872 2567 flags.go:64] FLAG: --storage-driver-password="root" Apr 19 12:30:32.212902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207875 2567 flags.go:64] FLAG: --storage-driver-secure="false" Apr 19 12:30:32.212902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207878 2567 flags.go:64] FLAG: --storage-driver-table="stats" Apr 19 12:30:32.212902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207881 2567 flags.go:64] FLAG: --storage-driver-user="root" Apr 19 12:30:32.212902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207884 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 19 12:30:32.213566 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207887 2567 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 19 12:30:32.213566 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207890 2567 flags.go:64] FLAG: --system-cgroups="" Apr 19 12:30:32.213566 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207893 2567 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 19 12:30:32.213566 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207898 2567 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 19 12:30:32.213566 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207901 2567 flags.go:64] FLAG: --tls-cert-file="" Apr 19 12:30:32.213566 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207903 2567 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 19 12:30:32.213566 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207908 2567 flags.go:64] FLAG: --tls-min-version="" Apr 19 12:30:32.213566 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207911 2567 flags.go:64] FLAG: 
--tls-private-key-file="" Apr 19 12:30:32.213566 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207913 2567 flags.go:64] FLAG: --topology-manager-policy="none" Apr 19 12:30:32.213566 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207916 2567 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 19 12:30:32.213566 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207919 2567 flags.go:64] FLAG: --topology-manager-scope="container" Apr 19 12:30:32.213566 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207922 2567 flags.go:64] FLAG: --v="2" Apr 19 12:30:32.213566 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207926 2567 flags.go:64] FLAG: --version="false" Apr 19 12:30:32.213566 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207930 2567 flags.go:64] FLAG: --vmodule="" Apr 19 12:30:32.213566 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207936 2567 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 19 12:30:32.213566 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.207939 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 19 12:30:32.213566 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208039 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 19 12:30:32.213566 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208043 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 19 12:30:32.213566 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208046 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 19 12:30:32.213566 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208049 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 19 12:30:32.213566 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208051 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 19 12:30:32.213566 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208054 2567 feature_gate.go:328] 
unrecognized feature gate: AutomatedEtcdBackup Apr 19 12:30:32.213566 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208056 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 19 12:30:32.214150 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208059 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 19 12:30:32.214150 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208062 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 19 12:30:32.214150 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208064 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 19 12:30:32.214150 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208067 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 19 12:30:32.214150 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208069 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 19 12:30:32.214150 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208072 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 19 12:30:32.214150 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208074 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 19 12:30:32.214150 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208077 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 19 12:30:32.214150 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208079 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 19 12:30:32.214150 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208083 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 19 12:30:32.214150 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208087 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 19 12:30:32.214150 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208090 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 19 12:30:32.214150 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208093 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 19 12:30:32.214150 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208095 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 19 12:30:32.214150 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208098 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 19 12:30:32.214150 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208100 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 19 12:30:32.214150 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208104 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 19 12:30:32.214150 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208106 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 19 12:30:32.214150 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208109 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 19 12:30:32.214661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208111 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 19 12:30:32.214661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208114 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 19 12:30:32.214661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208116 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 19 12:30:32.214661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208119 2567 feature_gate.go:328] 
unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 19 12:30:32.214661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208123 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 19 12:30:32.214661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208125 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 19 12:30:32.214661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208128 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 19 12:30:32.214661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208130 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 19 12:30:32.214661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208133 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 19 12:30:32.214661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208136 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 19 12:30:32.214661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208138 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 19 12:30:32.214661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208141 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 19 12:30:32.214661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208143 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 19 12:30:32.214661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208146 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 19 12:30:32.214661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208148 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 19 12:30:32.214661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208151 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 19 12:30:32.214661 ip-10-0-142-55 
kubenswrapper[2567]: W0419 12:30:32.208153 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 19 12:30:32.214661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208156 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 19 12:30:32.214661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208173 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 19 12:30:32.214661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208177 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 19 12:30:32.215152 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208179 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 19 12:30:32.215152 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208182 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 19 12:30:32.215152 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208184 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 19 12:30:32.215152 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208188 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 19 12:30:32.215152 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208190 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 19 12:30:32.215152 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208193 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 19 12:30:32.215152 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208195 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 19 12:30:32.215152 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208198 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 19 12:30:32.215152 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208201 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 19 12:30:32.215152 ip-10-0-142-55 
kubenswrapper[2567]: W0419 12:30:32.208206 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 19 12:30:32.215152 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208209 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 19 12:30:32.215152 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208211 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 19 12:30:32.215152 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208214 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 19 12:30:32.215152 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208218 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 19 12:30:32.215152 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208221 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 19 12:30:32.215152 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208224 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 19 12:30:32.215152 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208228 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 19 12:30:32.215152 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208231 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 19 12:30:32.215152 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208234 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 19 12:30:32.215152 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208237 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 19 12:30:32.215661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208240 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 19 12:30:32.215661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208243 2567 feature_gate.go:328] unrecognized feature 
gate: AWSServiceLBNetworkSecurityGroup Apr 19 12:30:32.215661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208245 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 19 12:30:32.215661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208248 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 19 12:30:32.215661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208250 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 19 12:30:32.215661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208253 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 19 12:30:32.215661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208255 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 19 12:30:32.215661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208258 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 19 12:30:32.215661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208261 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 19 12:30:32.215661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208263 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 19 12:30:32.215661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208266 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 19 12:30:32.215661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208269 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 19 12:30:32.215661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208271 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 19 12:30:32.215661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208274 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 19 12:30:32.215661 ip-10-0-142-55 
kubenswrapper[2567]: W0419 12:30:32.208276 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 19 12:30:32.215661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208279 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 19 12:30:32.215661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208282 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 19 12:30:32.215661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208284 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 19 12:30:32.215661 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208287 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 19 12:30:32.216131 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.208289 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 19 12:30:32.216131 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.208958 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 19 12:30:32.216131 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.215264 2567 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 19 12:30:32.216131 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.215280 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 19 12:30:32.216131 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215331 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 19 
12:30:32.216131 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215335 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 19 12:30:32.216131 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215339 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 19 12:30:32.216131 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215341 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 19 12:30:32.216131 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215344 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 19 12:30:32.216131 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215347 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 19 12:30:32.216131 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215350 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 19 12:30:32.216131 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215353 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 19 12:30:32.216131 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215355 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 19 12:30:32.216131 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215358 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 19 12:30:32.216131 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215360 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 19 12:30:32.216634 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215364 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 19 12:30:32.216634 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215368    2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 12:30:32.216634 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215371    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 12:30:32.216634 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215374    2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 12:30:32.216634 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215376    2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 12:30:32.216634 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215379    2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 12:30:32.216634 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215381    2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 12:30:32.216634 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215384    2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 12:30:32.216634 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215387    2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 12:30:32.216634 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215389    2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 12:30:32.216634 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215392    2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 12:30:32.216634 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215394    2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 12:30:32.216634 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215397    2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 12:30:32.216634 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215400    2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 12:30:32.216634 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215403    2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 12:30:32.216634 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215406    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 12:30:32.216634 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215408    2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 12:30:32.216634 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215411    2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 12:30:32.216634 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215413    2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 12:30:32.217100 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215416    2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 12:30:32.217100 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215420    2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 12:30:32.217100 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215422    2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 12:30:32.217100 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215425    2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 12:30:32.217100 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215428    2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 12:30:32.217100 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215430    2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 12:30:32.217100 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215433    2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 12:30:32.217100 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215436    2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 12:30:32.217100 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215439    2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 12:30:32.217100 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215441    2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 12:30:32.217100 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215444    2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 12:30:32.217100 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215446    2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 12:30:32.217100 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215449    2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 12:30:32.217100 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215451    2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 12:30:32.217100 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215454    2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 12:30:32.217100 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215456    2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 12:30:32.217100 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215459    2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 12:30:32.217100 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215461    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 12:30:32.217100 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215464    2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 12:30:32.217100 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215467    2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 12:30:32.217599 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215469    2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 12:30:32.217599 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215472    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 12:30:32.217599 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215474    2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 12:30:32.217599 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215477    2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 12:30:32.217599 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215480    2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 12:30:32.217599 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215482    2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 12:30:32.217599 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215484    2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 12:30:32.217599 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215487    2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 12:30:32.217599 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215490    2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 12:30:32.217599 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215493    2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 12:30:32.217599 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215495    2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 12:30:32.217599 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215498    2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 12:30:32.217599 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215501    2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 12:30:32.217599 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215504    2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 12:30:32.217599 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215507    2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 12:30:32.217599 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215509    2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 12:30:32.217599 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215511    2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 12:30:32.217599 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215514    2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 12:30:32.217599 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215516    2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 12:30:32.217599 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215519    2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 12:30:32.218091 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215521    2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 19 12:30:32.218091 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215524    2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 12:30:32.218091 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215526    2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 12:30:32.218091 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215529    2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 12:30:32.218091 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215531    2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 12:30:32.218091 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215533    2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 12:30:32.218091 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215536    2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 12:30:32.218091 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215538    2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 12:30:32.218091 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215541    2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 12:30:32.218091 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215544    2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 12:30:32.218091 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215546    2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 12:30:32.218091 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215550    2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 12:30:32.218091 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215553    2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 12:30:32.218091 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215556    2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 12:30:32.218091 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215559    2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 12:30:32.218091 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215561    2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 12:30:32.218575 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.215566    2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 19 12:30:32.218575 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215663    2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 19 12:30:32.218575 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215668    2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 12:30:32.218575 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215671    2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 12:30:32.218575 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215674    2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 12:30:32.218575 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215677    2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 12:30:32.218575 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215680    2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 12:30:32.218575 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215683    2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 12:30:32.218575 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215686    2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 12:30:32.218575 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215689    2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 12:30:32.218575 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215692    2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 12:30:32.218575 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215695    2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 12:30:32.218575 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215698    2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 12:30:32.218575 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215701    2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 12:30:32.218575 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215704    2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 12:30:32.218575 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215707    2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 12:30:32.218980 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215709    2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 12:30:32.218980 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215712    2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 12:30:32.218980 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215715    2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 12:30:32.218980 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215717    2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 12:30:32.218980 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215720    2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 19 12:30:32.218980 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215723    2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 12:30:32.218980 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215726    2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 12:30:32.218980 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215729    2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 12:30:32.218980 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215732    2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 12:30:32.218980 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215735    2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 12:30:32.218980 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215738    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 12:30:32.218980 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215741    2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 12:30:32.218980 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215743    2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 19 12:30:32.218980 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215746    2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 12:30:32.218980 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215748    2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 12:30:32.218980 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215751    2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 12:30:32.218980 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215753    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 12:30:32.218980 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215756    2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 12:30:32.218980 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215758    2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 12:30:32.219449 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215761    2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 12:30:32.219449 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215763    2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 12:30:32.219449 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215766    2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 12:30:32.219449 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215769    2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 12:30:32.219449 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215772    2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 12:30:32.219449 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215774    2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 12:30:32.219449 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215777    2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 12:30:32.219449 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215780    2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 12:30:32.219449 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215783    2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 12:30:32.219449 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215785    2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 12:30:32.219449 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215788    2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 12:30:32.219449 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215790    2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 12:30:32.219449 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215794    2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 12:30:32.219449 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215798    2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 12:30:32.219449 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215801    2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 12:30:32.219449 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215805    2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 12:30:32.219449 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215808    2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 12:30:32.219449 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215810    2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 12:30:32.219449 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215813    2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 12:30:32.219926 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215815    2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 12:30:32.219926 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215817    2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 12:30:32.219926 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215820    2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 12:30:32.219926 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215822    2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 12:30:32.219926 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215825    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 12:30:32.219926 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215827    2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 12:30:32.219926 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215830    2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 12:30:32.219926 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215832    2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 12:30:32.219926 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215834    2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 12:30:32.219926 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215837    2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 12:30:32.219926 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215839    2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 12:30:32.219926 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215841    2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 12:30:32.219926 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215844    2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 12:30:32.219926 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215846    2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 12:30:32.219926 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215849    2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 12:30:32.219926 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215851    2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 12:30:32.219926 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215854    2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 12:30:32.219926 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215857    2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 12:30:32.219926 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215859    2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 12:30:32.219926 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215862    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 12:30:32.220484 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215864    2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 12:30:32.220484 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215867    2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 12:30:32.220484 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215869    2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 12:30:32.220484 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215872    2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 12:30:32.220484 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215874    2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 12:30:32.220484 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215877    2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 12:30:32.220484 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215880    2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 12:30:32.220484 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215882    2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 12:30:32.220484 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215884    2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 12:30:32.220484 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215887    2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 12:30:32.220484 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215889    2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 12:30:32.220484 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215892    2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 12:30:32.220484 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:32.215894    2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 12:30:32.220484 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.215899    2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 19 12:30:32.220484 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.216592    2567 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 19 12:30:32.220860 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.219208    2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 19 12:30:32.220860 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.220128    2567 server.go:1019] "Starting client certificate rotation"
Apr 19 12:30:32.220860 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.220222    2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 19 12:30:32.220860 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.220264    2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 19 12:30:32.250076 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.250053    2567 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 19 12:30:32.253528 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.253508    2567 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 19 12:30:32.270401 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.270373    2567 log.go:25] "Validated CRI v1 runtime API"
Apr 19 12:30:32.276005 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.275988    2567 log.go:25] "Validated CRI v1 image API"
Apr 19 12:30:32.277278 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.277254    2567 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 19 12:30:32.281374 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.281354    2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 19 12:30:32.284674 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.284651    2567 fs.go:135] Filesystem UUIDs: map[2b38651f-651f-4b14-b258-dc54879bf9a2:/dev/nvme0n1p3 3d28ca81-06a3-40a9-a0eb-67a69a6f4d0f:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 19 12:30:32.284733 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.284674    2567 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 19 12:30:32.290403 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.290275    2567 manager.go:217] Machine: {Timestamp:2026-04-19 12:30:32.288451491 +0000 UTC m=+0.442384963 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100070 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2bb04b5dbdf1543e98fe7dae9e3e9c SystemUUID:ec2bb04b-5dbd-f154-3e98-fe7dae9e3e9c BootID:4c05acac-caf0-41f8-99ff-cf0a23982cde Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:5b:1a:14:f1:fd Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:5b:1a:14:f1:fd Speed:0 Mtu:9001} {Name:ovs-system MacAddress:06:48:0d:9d:cf:50 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 19 12:30:32.290403 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.290396    2567 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 19 12:30:32.290507 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.290484    2567 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 19 12:30:32.293169 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.293128    2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 19 12:30:32.293319 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.293171    2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-55.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManage
rPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 19 12:30:32.293369 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.293329 2567 topology_manager.go:138] "Creating topology manager with none policy" Apr 19 12:30:32.293369 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.293338 2567 container_manager_linux.go:306] "Creating device plugin manager" Apr 19 12:30:32.293369 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.293355 2567 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 19 12:30:32.294122 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.294112 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 19 12:30:32.294875 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.294865 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 19 12:30:32.294985 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.294977 2567 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 19 12:30:32.297598 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.297587 2567 kubelet.go:491] "Attempting to sync node with API server" Apr 19 12:30:32.297653 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.297606 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 19 12:30:32.297653 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.297619 2567 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 19 12:30:32.297653 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.297628 2567 kubelet.go:397] "Adding apiserver pod source" Apr 19 12:30:32.297653 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.297637 2567 
apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 19 12:30:32.299324 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.299311 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 19 12:30:32.299395 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.299331 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 19 12:30:32.302862 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.302843 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 19 12:30:32.304244 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.304231 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 19 12:30:32.306212 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.306199 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 19 12:30:32.306284 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.306217 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 19 12:30:32.306284 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.306223 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 19 12:30:32.306284 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.306230 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 19 12:30:32.306284 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.306236 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 19 12:30:32.306284 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.306243 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 19 12:30:32.306284 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.306249 2567 plugins.go:616] "Loaded volume 
plugin" pluginName="kubernetes.io/iscsi" Apr 19 12:30:32.306284 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.306254 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 19 12:30:32.306284 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.306261 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 19 12:30:32.306284 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.306267 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 19 12:30:32.306284 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.306281 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 19 12:30:32.306534 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.306290 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 19 12:30:32.307111 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.307102 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 19 12:30:32.307147 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.307111 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 19 12:30:32.310818 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.310804 2567 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 19 12:30:32.310907 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.310841 2567 server.go:1295] "Started kubelet" Apr 19 12:30:32.310965 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.310887 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 19 12:30:32.312807 ip-10-0-142-55 systemd[1]: Started Kubernetes Kubelet. 
Apr 19 12:30:32.313475 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:32.313387 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-55.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 19 12:30:32.313538 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:32.313470 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 19 12:30:32.313715 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.313694 2567 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-55.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 19 12:30:32.313776 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.310988 2567 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 19 12:30:32.313920 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.313784 2567 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 19 12:30:32.315239 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.315219 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 19 12:30:32.315432 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.315413 2567 server.go:317] "Adding debug handlers to kubelet server" Apr 19 12:30:32.320981 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.320962 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 19 12:30:32.320981 ip-10-0-142-55 
kubenswrapper[2567]: I0419 12:30:32.320972 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 19 12:30:32.321208 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:32.320130 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-55.ec2.internal.18a7c1f42231bde7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-55.ec2.internal,UID:ip-10-0-142-55.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-55.ec2.internal,},FirstTimestamp:2026-04-19 12:30:32.310816231 +0000 UTC m=+0.464749703,LastTimestamp:2026-04-19 12:30:32.310816231 +0000 UTC m=+0.464749703,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-55.ec2.internal,}" Apr 19 12:30:32.321561 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.321539 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mr5xn" Apr 19 12:30:32.321635 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.321599 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 19 12:30:32.321635 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.321605 2567 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 19 12:30:32.321635 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.321624 2567 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 19 12:30:32.321779 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.321763 2567 factory.go:221] Registration of the containerd container factory failed: unable 
to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 19 12:30:32.321850 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.321766 2567 reconstruct.go:97] "Volume reconstruction finished" Apr 19 12:30:32.321850 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.321791 2567 factory.go:55] Registering systemd factory Apr 19 12:30:32.321850 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.321801 2567 factory.go:223] Registration of the systemd container factory successfully Apr 19 12:30:32.321850 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.321794 2567 reconciler.go:26] "Reconciler: start to sync state" Apr 19 12:30:32.322042 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:32.321940 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-55.ec2.internal\" not found" Apr 19 12:30:32.322042 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.322032 2567 factory.go:153] Registering CRI-O factory Apr 19 12:30:32.322139 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.322048 2567 factory.go:223] Registration of the crio container factory successfully Apr 19 12:30:32.322139 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.322073 2567 factory.go:103] Registering Raw factory Apr 19 12:30:32.322139 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.322090 2567 manager.go:1196] Started watching for new ooms in manager Apr 19 12:30:32.322750 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.322731 2567 manager.go:319] Starting recovery of all containers Apr 19 12:30:32.325843 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:32.325798 2567 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 19 12:30:32.329154 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.329132 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mr5xn" Apr 19 12:30:32.331222 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:32.331190 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 19 12:30:32.331334 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:32.331209 2567 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-142-55.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 19 12:30:32.331840 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.331823 2567 manager.go:324] Recovery completed Apr 19 12:30:32.337405 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.337393 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 12:30:32.341742 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.341726 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-55.ec2.internal" event="NodeHasSufficientMemory" Apr 19 12:30:32.341801 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.341759 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 12:30:32.341801 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.341770 2567 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-142-55.ec2.internal" event="NodeHasSufficientPID" Apr 19 12:30:32.342334 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.342321 2567 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 19 12:30:32.342388 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.342334 2567 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 19 12:30:32.342388 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.342370 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 19 12:30:32.345556 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.345545 2567 policy_none.go:49] "None policy: Start" Apr 19 12:30:32.345596 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.345560 2567 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 19 12:30:32.345596 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.345570 2567 state_mem.go:35] "Initializing new in-memory state store" Apr 19 12:30:32.382727 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.382707 2567 manager.go:341] "Starting Device Plugin manager" Apr 19 12:30:32.397723 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:32.382745 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 19 12:30:32.397723 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.382755 2567 server.go:85] "Starting device plugin registration server" Apr 19 12:30:32.397723 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.383022 2567 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 19 12:30:32.397723 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.383035 2567 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 19 12:30:32.397723 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.383132 2567 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 19 12:30:32.397723 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.383233 
2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 19 12:30:32.397723 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.383243 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 19 12:30:32.397723 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:32.383810 2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 19 12:30:32.397723 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:32.383847 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-55.ec2.internal\" not found" Apr 19 12:30:32.452217 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.452110 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 19 12:30:32.453286 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.453272 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 19 12:30:32.453372 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.453301 2567 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 19 12:30:32.453372 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.453335 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 19 12:30:32.453372 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.453345 2567 kubelet.go:2451] "Starting kubelet main sync loop" Apr 19 12:30:32.453497 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:32.453385 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 19 12:30:32.460041 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.460019 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 19 12:30:32.484157 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.484135 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 12:30:32.485029 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.485007 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-55.ec2.internal" event="NodeHasSufficientMemory" Apr 19 12:30:32.485132 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.485040 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 12:30:32.485132 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.485052 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-55.ec2.internal" event="NodeHasSufficientPID" Apr 19 12:30:32.485132 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.485076 2567 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-55.ec2.internal" Apr 19 12:30:32.494546 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.494527 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-55.ec2.internal" Apr 19 12:30:32.494622 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:32.494552 2567 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-55.ec2.internal\": node \"ip-10-0-142-55.ec2.internal\" not found" Apr 19 12:30:32.508686 
ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:32.508663 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-55.ec2.internal\" not found" Apr 19 12:30:32.554324 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.554295 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-142-55.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-55.ec2.internal"] Apr 19 12:30:32.554448 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.554369 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 12:30:32.555317 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.555300 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-55.ec2.internal" event="NodeHasSufficientMemory" Apr 19 12:30:32.555408 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.555329 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 12:30:32.555408 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.555339 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-55.ec2.internal" event="NodeHasSufficientPID" Apr 19 12:30:32.557649 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.557637 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 12:30:32.557785 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.557770 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-55.ec2.internal" Apr 19 12:30:32.557819 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.557802 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 12:30:32.558336 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.558319 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-55.ec2.internal" event="NodeHasSufficientMemory" Apr 19 12:30:32.558447 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.558345 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 12:30:32.558447 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.558322 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-55.ec2.internal" event="NodeHasSufficientMemory" Apr 19 12:30:32.558447 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.558389 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 12:30:32.558447 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.558402 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-55.ec2.internal" event="NodeHasSufficientPID" Apr 19 12:30:32.558447 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.558355 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-55.ec2.internal" event="NodeHasSufficientPID" Apr 19 12:30:32.560532 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.560518 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-55.ec2.internal" Apr 19 12:30:32.560586 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.560544 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 12:30:32.561150 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.561136 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-55.ec2.internal" event="NodeHasSufficientMemory" Apr 19 12:30:32.561259 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.561176 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 12:30:32.561259 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.561187 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-55.ec2.internal" event="NodeHasSufficientPID" Apr 19 12:30:32.583754 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:32.583738 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-55.ec2.internal\" not found" node="ip-10-0-142-55.ec2.internal" Apr 19 12:30:32.587017 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:32.587003 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-55.ec2.internal\" not found" node="ip-10-0-142-55.ec2.internal" Apr 19 12:30:32.609617 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:32.609576 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-55.ec2.internal\" not found" Apr 19 12:30:32.624582 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.624560 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/023b8927dfe46c9ec0872b191f59109d-config\") pod 
\"kube-apiserver-proxy-ip-10-0-142-55.ec2.internal\" (UID: \"023b8927dfe46c9ec0872b191f59109d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-55.ec2.internal" Apr 19 12:30:32.624634 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.624588 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/137e787a6b92ec93f0dd935c9a5cf7fd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-55.ec2.internal\" (UID: \"137e787a6b92ec93f0dd935c9a5cf7fd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-55.ec2.internal" Apr 19 12:30:32.624634 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.624607 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/137e787a6b92ec93f0dd935c9a5cf7fd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-55.ec2.internal\" (UID: \"137e787a6b92ec93f0dd935c9a5cf7fd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-55.ec2.internal" Apr 19 12:30:32.710738 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:32.710656 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-55.ec2.internal\" not found" Apr 19 12:30:32.725007 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.724987 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/023b8927dfe46c9ec0872b191f59109d-config\") pod \"kube-apiserver-proxy-ip-10-0-142-55.ec2.internal\" (UID: \"023b8927dfe46c9ec0872b191f59109d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-55.ec2.internal" Apr 19 12:30:32.725091 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.725012 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/137e787a6b92ec93f0dd935c9a5cf7fd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-55.ec2.internal\" (UID: \"137e787a6b92ec93f0dd935c9a5cf7fd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-55.ec2.internal" Apr 19 12:30:32.725091 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.725030 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/137e787a6b92ec93f0dd935c9a5cf7fd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-55.ec2.internal\" (UID: \"137e787a6b92ec93f0dd935c9a5cf7fd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-55.ec2.internal" Apr 19 12:30:32.725091 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.725072 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/137e787a6b92ec93f0dd935c9a5cf7fd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-55.ec2.internal\" (UID: \"137e787a6b92ec93f0dd935c9a5cf7fd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-55.ec2.internal" Apr 19 12:30:32.725200 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.725089 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/023b8927dfe46c9ec0872b191f59109d-config\") pod \"kube-apiserver-proxy-ip-10-0-142-55.ec2.internal\" (UID: \"023b8927dfe46c9ec0872b191f59109d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-55.ec2.internal" Apr 19 12:30:32.725200 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.725110 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/137e787a6b92ec93f0dd935c9a5cf7fd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-55.ec2.internal\" (UID: \"137e787a6b92ec93f0dd935c9a5cf7fd\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-55.ec2.internal" Apr 19 12:30:32.811477 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:32.811436 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-55.ec2.internal\" not found" Apr 19 12:30:32.886023 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.885990 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-55.ec2.internal" Apr 19 12:30:32.889545 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:32.889526 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-55.ec2.internal" Apr 19 12:30:32.912186 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:32.912141 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-55.ec2.internal\" not found" Apr 19 12:30:33.012788 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:33.012704 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-55.ec2.internal\" not found" Apr 19 12:30:33.113225 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:33.113194 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-55.ec2.internal\" not found" Apr 19 12:30:33.213866 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:33.213837 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-55.ec2.internal\" not found" Apr 19 12:30:33.219994 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:33.219971 2567 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 19 12:30:33.220141 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:33.220124 2567 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 19 12:30:33.314010 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:33.313923 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-55.ec2.internal\" not found" Apr 19 12:30:33.321394 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:33.321369 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 19 12:30:33.331175 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:33.331133 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-18 12:25:32 +0000 UTC" deadline="2027-10-14 01:08:39.728866578 +0000 UTC" Apr 19 12:30:33.331175 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:33.331178 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13020h38m6.397691632s" Apr 19 12:30:33.331457 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:33.331444 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 19 12:30:33.353130 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:33.353100 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-c4x59" Apr 19 12:30:33.364314 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:33.364287 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-c4x59" Apr 19 12:30:33.378361 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:33.378336 2567 reflector.go:430] "Caches populated" type="*v1.Service" 
reflector="k8s.io/client-go/informers/factory.go:160" Apr 19 12:30:33.414921 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:33.414894 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-55.ec2.internal\" not found" Apr 19 12:30:33.451494 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:33.451476 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 19 12:30:33.515155 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:33.515123 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-55.ec2.internal\" not found" Apr 19 12:30:33.580960 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:33.580923 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod137e787a6b92ec93f0dd935c9a5cf7fd.slice/crio-89eb64b405ea8a9e8a2dffea24b5432852abbaa07a503ca81677a2f7fdd73ce0 WatchSource:0}: Error finding container 89eb64b405ea8a9e8a2dffea24b5432852abbaa07a503ca81677a2f7fdd73ce0: Status 404 returned error can't find the container with id 89eb64b405ea8a9e8a2dffea24b5432852abbaa07a503ca81677a2f7fdd73ce0 Apr 19 12:30:33.581443 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:33.581426 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod023b8927dfe46c9ec0872b191f59109d.slice/crio-5ec047dc1f0029777bdc76408c637913b67da9ad29a360765c87fb15fcb38420 WatchSource:0}: Error finding container 5ec047dc1f0029777bdc76408c637913b67da9ad29a360765c87fb15fcb38420: Status 404 returned error can't find the container with id 5ec047dc1f0029777bdc76408c637913b67da9ad29a360765c87fb15fcb38420 Apr 19 12:30:33.585102 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:33.585085 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 19 12:30:33.615766 
ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:33.615729 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-55.ec2.internal\" not found" Apr 19 12:30:33.716225 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:33.716199 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-55.ec2.internal\" not found" Apr 19 12:30:33.728811 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:33.728783 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 19 12:30:33.821552 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:33.821492 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-55.ec2.internal" Apr 19 12:30:33.831371 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:33.831348 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 19 12:30:33.832350 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:33.832337 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-55.ec2.internal" Apr 19 12:30:33.837409 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:33.837395 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 19 12:30:34.284828 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.284797 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 19 12:30:34.299664 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.299417 2567 apiserver.go:52] "Watching apiserver" Apr 19 12:30:34.305865 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.305662 2567 reflector.go:430] "Caches populated" 
type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 19 12:30:34.307755 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.307668 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wh942","kube-system/kube-apiserver-proxy-ip-10-0-142-55.ec2.internal","openshift-cluster-node-tuning-operator/tuned-nxpx9","openshift-dns/node-resolver-rd647","openshift-image-registry/node-ca-n7t66","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-55.ec2.internal","openshift-multus/multus-additional-cni-plugins-6xjjk","openshift-multus/multus-jcjln","openshift-network-diagnostics/network-check-target-txq8s","kube-system/konnectivity-agent-jtfsx","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x","openshift-multus/network-metrics-daemon-98bqr","openshift-network-operator/iptables-alerter-7hcbn"] Apr 19 12:30:34.310860 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.310835 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6xjjk" Apr 19 12:30:34.313543 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.313044 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 19 12:30:34.313543 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.313128 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 19 12:30:34.313543 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.313312 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 19 12:30:34.313543 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.313406 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-97fhp\"" Apr 19 12:30:34.313543 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.313312 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 19 12:30:34.313543 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.313457 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 19 12:30:34.315132 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.314618 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" Apr 19 12:30:34.316589 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.316240 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 19 12:30:34.316589 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.316487 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 19 12:30:34.316589 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.316528 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-v8krd\"" Apr 19 12:30:34.319898 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.319877 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rd647" Apr 19 12:30:34.320003 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.319986 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-n7t66" Apr 19 12:30:34.321498 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.321479 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-kxnwd\"" Apr 19 12:30:34.321852 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.321837 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 19 12:30:34.321928 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.321910 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-qpff5\"" Apr 19 12:30:34.321987 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.321932 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 19 12:30:34.321987 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.321934 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 19 12:30:34.322236 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.322219 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 19 12:30:34.322327 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.322312 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 19 12:30:34.325695 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.325675 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.327347 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.327329 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 19 12:30:34.327724 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.327664 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 19 12:30:34.327839 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.327790 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 19 12:30:34.328685 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.328246 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vbmv6\"" Apr 19 12:30:34.328685 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.328318 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 19 12:30:34.328685 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.328398 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 19 12:30:34.328685 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.328516 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 19 12:30:34.330018 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.329672 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.330018 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.329734 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-txq8s" Apr 19 12:30:34.330018 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:34.329822 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-txq8s" podUID="54214d38-325e-4791-8e44-1bb6aac2fb3f" Apr 19 12:30:34.331362 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.331345 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 19 12:30:34.331586 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.331570 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-czpz7\"" Apr 19 12:30:34.332743 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.332723 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-host-run-ovn-kubernetes\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.332837 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.332763 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-ovnkube-config\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.332837 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.332790 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5ktv\" (UniqueName: \"kubernetes.io/projected/3c634550-95fa-4405-a478-4ce4ac61b034-kube-api-access-n5ktv\") pod \"multus-additional-cni-plugins-6xjjk\" (UID: \"3c634550-95fa-4405-a478-4ce4ac61b034\") " pod="openshift-multus/multus-additional-cni-plugins-6xjjk" Apr 19 12:30:34.332946 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.332853 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d3c440e9-6d1f-46b9-b5c1-3ee825d6e2a4-hosts-file\") pod \"node-resolver-rd647\" (UID: \"d3c440e9-6d1f-46b9-b5c1-3ee825d6e2a4\") " pod="openshift-dns/node-resolver-rd647" Apr 19 12:30:34.332946 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.332875 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c634550-95fa-4405-a478-4ce4ac61b034-cnibin\") pod \"multus-additional-cni-plugins-6xjjk\" (UID: \"3c634550-95fa-4405-a478-4ce4ac61b034\") " pod="openshift-multus/multus-additional-cni-plugins-6xjjk" Apr 19 12:30:34.332946 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.332899 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-run-systemd\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.332946 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.332936 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-run-ovn\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.333132 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.332960 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-node-log\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.333132 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.332976 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c634550-95fa-4405-a478-4ce4ac61b034-os-release\") pod \"multus-additional-cni-plugins-6xjjk\" (UID: \"3c634550-95fa-4405-a478-4ce4ac61b034\") " pod="openshift-multus/multus-additional-cni-plugins-6xjjk" Apr 19 12:30:34.333132 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.332998 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c634550-95fa-4405-a478-4ce4ac61b034-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6xjjk\" (UID: \"3c634550-95fa-4405-a478-4ce4ac61b034\") " pod="openshift-multus/multus-additional-cni-plugins-6xjjk" Apr 19 12:30:34.333132 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333022 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncg8j\" (UniqueName: \"kubernetes.io/projected/d3c440e9-6d1f-46b9-b5c1-3ee825d6e2a4-kube-api-access-ncg8j\") pod \"node-resolver-rd647\" (UID: \"d3c440e9-6d1f-46b9-b5c1-3ee825d6e2a4\") " pod="openshift-dns/node-resolver-rd647" Apr 19 12:30:34.333132 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333054 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxlp9\" (UniqueName: 
\"kubernetes.io/projected/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-kube-api-access-cxlp9\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.333132 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333079 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dc6754e8-c6ea-48e0-9ad2-435a13e54b61-serviceca\") pod \"node-ca-n7t66\" (UID: \"dc6754e8-c6ea-48e0-9ad2-435a13e54b61\") " pod="openshift-image-registry/node-ca-n7t66" Apr 19 12:30:34.333132 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333108 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-systemd-units\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.333464 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333145 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-log-socket\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.333464 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333186 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3c634550-95fa-4405-a478-4ce4ac61b034-cni-binary-copy\") pod \"multus-additional-cni-plugins-6xjjk\" (UID: \"3c634550-95fa-4405-a478-4ce4ac61b034\") " pod="openshift-multus/multus-additional-cni-plugins-6xjjk" Apr 19 12:30:34.333464 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333196 2567 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jtfsx" Apr 19 12:30:34.333464 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333211 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-etc-modprobe-d\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" Apr 19 12:30:34.333464 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333233 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-etc-sysctl-conf\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" Apr 19 12:30:34.333464 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333256 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-host-slash\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.333464 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333278 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/aa115001-5add-4a0d-b7bb-9137e988a754-etc-tuned\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" Apr 19 12:30:34.333464 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333305 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/aa115001-5add-4a0d-b7bb-9137e988a754-tmp\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" Apr 19 12:30:34.333464 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333330 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dprfn\" (UniqueName: \"kubernetes.io/projected/aa115001-5add-4a0d-b7bb-9137e988a754-kube-api-access-dprfn\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" Apr 19 12:30:34.333464 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333351 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d3c440e9-6d1f-46b9-b5c1-3ee825d6e2a4-tmp-dir\") pod \"node-resolver-rd647\" (UID: \"d3c440e9-6d1f-46b9-b5c1-3ee825d6e2a4\") " pod="openshift-dns/node-resolver-rd647" Apr 19 12:30:34.333464 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333395 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq54m\" (UniqueName: \"kubernetes.io/projected/dc6754e8-c6ea-48e0-9ad2-435a13e54b61-kube-api-access-mq54m\") pod \"node-ca-n7t66\" (UID: \"dc6754e8-c6ea-48e0-9ad2-435a13e54b61\") " pod="openshift-image-registry/node-ca-n7t66" Apr 19 12:30:34.333464 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333424 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-host-kubelet\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.333464 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333445 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-var-lib-openvswitch\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.333464 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333467 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-env-overrides\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.334092 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333493 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3c634550-95fa-4405-a478-4ce4ac61b034-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6xjjk\" (UID: \"3c634550-95fa-4405-a478-4ce4ac61b034\") " pod="openshift-multus/multus-additional-cni-plugins-6xjjk" Apr 19 12:30:34.334092 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333519 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-etc-kubernetes\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" Apr 19 12:30:34.334092 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333542 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-run\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " 
pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" Apr 19 12:30:34.334092 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333565 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c634550-95fa-4405-a478-4ce4ac61b034-system-cni-dir\") pod \"multus-additional-cni-plugins-6xjjk\" (UID: \"3c634550-95fa-4405-a478-4ce4ac61b034\") " pod="openshift-multus/multus-additional-cni-plugins-6xjjk" Apr 19 12:30:34.334092 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333618 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-host-run-netns\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.334092 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333796 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-etc-openvswitch\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.334092 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333862 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-host-cni-bin\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.334092 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333880 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-host-cni-netd\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.334092 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333900 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.334092 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333936 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-var-lib-kubelet\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" Apr 19 12:30:34.334092 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333954 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc6754e8-c6ea-48e0-9ad2-435a13e54b61-host\") pod \"node-ca-n7t66\" (UID: \"dc6754e8-c6ea-48e0-9ad2-435a13e54b61\") " pod="openshift-image-registry/node-ca-n7t66" Apr 19 12:30:34.334092 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.333977 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-ovnkube-script-lib\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.334092 ip-10-0-142-55 kubenswrapper[2567]: I0419 
12:30:34.334028 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3c634550-95fa-4405-a478-4ce4ac61b034-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6xjjk\" (UID: \"3c634550-95fa-4405-a478-4ce4ac61b034\") " pod="openshift-multus/multus-additional-cni-plugins-6xjjk" Apr 19 12:30:34.334092 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.334065 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-etc-sysctl-d\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" Apr 19 12:30:34.334092 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.334089 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-etc-systemd\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" Apr 19 12:30:34.334801 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.334113 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-run-openvswitch\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.334801 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.334142 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-ovn-node-metrics-cert\") pod \"ovnkube-node-wh942\" (UID: 
\"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.334801 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.334192 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-etc-sysconfig\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" Apr 19 12:30:34.334801 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.334222 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-sys\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" Apr 19 12:30:34.334801 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.334244 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-lib-modules\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" Apr 19 12:30:34.334801 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.334270 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-host\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" Apr 19 12:30:34.334801 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.334741 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 19 12:30:34.335124 ip-10-0-142-55 kubenswrapper[2567]: I0419 
12:30:34.335033 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-hjddv\"" Apr 19 12:30:34.335253 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.335238 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 19 12:30:34.337081 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.336325 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x" Apr 19 12:30:34.339047 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.338774 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98bqr" Apr 19 12:30:34.339047 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:34.338870 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-98bqr" podUID="720d8932-1617-465d-a213-ebb1e99e6bc6" Apr 19 12:30:34.341322 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.341297 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 19 12:30:34.341418 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.341363 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 19 12:30:34.341769 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.341753 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 19 12:30:34.341887 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.341871 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-8lhqs\"" Apr 19 12:30:34.345564 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.345532 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-7hcbn" Apr 19 12:30:34.347462 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.347445 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 19 12:30:34.347545 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.347442 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 19 12:30:34.347806 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.347789 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ltct2\"" Apr 19 12:30:34.348067 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.348049 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 19 12:30:34.366308 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.366235 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-18 12:25:33 +0000 UTC" deadline="2027-12-20 06:28:16.0040503 +0000 UTC" Apr 19 12:30:34.366308 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.366263 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14633h57m41.637790083s" Apr 19 12:30:34.423311 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.423286 2567 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 19 12:30:34.435104 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435062 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.435265 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435147 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7b7d9484-9728-48a5-a61f-08f5dbf3b0b7-iptables-alerter-script\") pod \"iptables-alerter-7hcbn\" (UID: \"7b7d9484-9728-48a5-a61f-08f5dbf3b0b7\") " pod="openshift-network-operator/iptables-alerter-7hcbn" Apr 19 12:30:34.435265 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435181 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.435265 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435192 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-hostroot\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.435265 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435241 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0ab8ee79-a102-4394-8421-c7c0b8a462c3-sys-fs\") pod \"aws-ebs-csi-driver-node-4qn8x\" (UID: \"0ab8ee79-a102-4394-8421-c7c0b8a462c3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x" Apr 19 12:30:34.435476 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435273 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc6754e8-c6ea-48e0-9ad2-435a13e54b61-host\") pod \"node-ca-n7t66\" (UID: \"dc6754e8-c6ea-48e0-9ad2-435a13e54b61\") " pod="openshift-image-registry/node-ca-n7t66" Apr 19 12:30:34.435476 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435304 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3c634550-95fa-4405-a478-4ce4ac61b034-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6xjjk\" (UID: \"3c634550-95fa-4405-a478-4ce4ac61b034\") " pod="openshift-multus/multus-additional-cni-plugins-6xjjk" Apr 19 12:30:34.435476 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435317 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc6754e8-c6ea-48e0-9ad2-435a13e54b61-host\") pod \"node-ca-n7t66\" (UID: \"dc6754e8-c6ea-48e0-9ad2-435a13e54b61\") " pod="openshift-image-registry/node-ca-n7t66" Apr 19 12:30:34.435476 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435332 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-multus-socket-dir-parent\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.435476 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435358 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-host-run-netns\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.435476 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435381 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-host-run-multus-certs\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.435476 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435410 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c2abf95c-a107-4a6c-93cd-802a48e2976c-agent-certs\") pod \"konnectivity-agent-jtfsx\" (UID: \"c2abf95c-a107-4a6c-93cd-802a48e2976c\") " pod="kube-system/konnectivity-agent-jtfsx" Apr 19 12:30:34.435476 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435437 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c2abf95c-a107-4a6c-93cd-802a48e2976c-konnectivity-ca\") pod \"konnectivity-agent-jtfsx\" (UID: \"c2abf95c-a107-4a6c-93cd-802a48e2976c\") " pod="kube-system/konnectivity-agent-jtfsx" Apr 19 12:30:34.435476 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435464 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-run-openvswitch\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.435874 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435490 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-ovn-node-metrics-cert\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.435874 ip-10-0-142-55 
kubenswrapper[2567]: I0419 12:30:34.435515 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-etc-sysconfig\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" Apr 19 12:30:34.435874 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435538 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-lib-modules\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" Apr 19 12:30:34.435874 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435563 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-host-var-lib-cni-bin\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.435874 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435618 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-host-var-lib-kubelet\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.435874 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435644 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0ab8ee79-a102-4394-8421-c7c0b8a462c3-registration-dir\") pod \"aws-ebs-csi-driver-node-4qn8x\" (UID: \"0ab8ee79-a102-4394-8421-c7c0b8a462c3\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x" Apr 19 12:30:34.435874 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435681 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0ab8ee79-a102-4394-8421-c7c0b8a462c3-device-dir\") pod \"aws-ebs-csi-driver-node-4qn8x\" (UID: \"0ab8ee79-a102-4394-8421-c7c0b8a462c3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x" Apr 19 12:30:34.435874 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435719 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-host-run-ovn-kubernetes\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.435874 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435750 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-ovnkube-config\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.435874 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435773 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d3c440e9-6d1f-46b9-b5c1-3ee825d6e2a4-hosts-file\") pod \"node-resolver-rd647\" (UID: \"d3c440e9-6d1f-46b9-b5c1-3ee825d6e2a4\") " pod="openshift-dns/node-resolver-rd647" Apr 19 12:30:34.435874 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435799 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-node-log\") pod 
\"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.435874 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435829 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c634550-95fa-4405-a478-4ce4ac61b034-os-release\") pod \"multus-additional-cni-plugins-6xjjk\" (UID: \"3c634550-95fa-4405-a478-4ce4ac61b034\") " pod="openshift-multus/multus-additional-cni-plugins-6xjjk" Apr 19 12:30:34.435874 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435857 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-multus-cni-dir\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.436483 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435883 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0ab8ee79-a102-4394-8421-c7c0b8a462c3-etc-selinux\") pod \"aws-ebs-csi-driver-node-4qn8x\" (UID: \"0ab8ee79-a102-4394-8421-c7c0b8a462c3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x" Apr 19 12:30:34.436483 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435911 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-systemd-units\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.436483 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435938 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-log-socket\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.436483 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435963 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0ab8ee79-a102-4394-8421-c7c0b8a462c3-socket-dir\") pod \"aws-ebs-csi-driver-node-4qn8x\" (UID: \"0ab8ee79-a102-4394-8421-c7c0b8a462c3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x" Apr 19 12:30:34.436483 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.435981 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3c634550-95fa-4405-a478-4ce4ac61b034-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6xjjk\" (UID: \"3c634550-95fa-4405-a478-4ce4ac61b034\") " pod="openshift-multus/multus-additional-cni-plugins-6xjjk" Apr 19 12:30:34.436483 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436018 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-host-slash\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.436483 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436077 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-node-log\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.436483 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436120 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-system-cni-dir\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.436483 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436147 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ddw8\" (UniqueName: \"kubernetes.io/projected/0ab8ee79-a102-4394-8421-c7c0b8a462c3-kube-api-access-2ddw8\") pod \"aws-ebs-csi-driver-node-4qn8x\" (UID: \"0ab8ee79-a102-4394-8421-c7c0b8a462c3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x" Apr 19 12:30:34.436483 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436151 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c634550-95fa-4405-a478-4ce4ac61b034-os-release\") pod \"multus-additional-cni-plugins-6xjjk\" (UID: \"3c634550-95fa-4405-a478-4ce4ac61b034\") " pod="openshift-multus/multus-additional-cni-plugins-6xjjk" Apr 19 12:30:34.436483 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436203 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mq54m\" (UniqueName: \"kubernetes.io/projected/dc6754e8-c6ea-48e0-9ad2-435a13e54b61-kube-api-access-mq54m\") pod \"node-ca-n7t66\" (UID: \"dc6754e8-c6ea-48e0-9ad2-435a13e54b61\") " pod="openshift-image-registry/node-ca-n7t66" Apr 19 12:30:34.436483 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436225 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d3c440e9-6d1f-46b9-b5c1-3ee825d6e2a4-hosts-file\") pod \"node-resolver-rd647\" (UID: \"d3c440e9-6d1f-46b9-b5c1-3ee825d6e2a4\") " pod="openshift-dns/node-resolver-rd647" Apr 19 12:30:34.436483 ip-10-0-142-55 
kubenswrapper[2567]: I0419 12:30:34.436236 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-host-kubelet\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.436483 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436257 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-host-run-ovn-kubernetes\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.436483 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436262 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-run\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" Apr 19 12:30:34.436483 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436292 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-host-slash\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.436483 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436293 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs\") pod \"network-metrics-daemon-98bqr\" (UID: \"720d8932-1617-465d-a213-ebb1e99e6bc6\") " pod="openshift-multus/network-metrics-daemon-98bqr" Apr 19 12:30:34.437264 
ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436343 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-host-kubelet\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.437264 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436345 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-cnibin\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.437264 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436381 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-run-openvswitch\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.437264 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436381 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h4q5\" (UniqueName: \"kubernetes.io/projected/54214d38-325e-4791-8e44-1bb6aac2fb3f-kube-api-access-5h4q5\") pod \"network-check-target-txq8s\" (UID: \"54214d38-325e-4791-8e44-1bb6aac2fb3f\") " pod="openshift-network-diagnostics/network-check-target-txq8s" Apr 19 12:30:34.437264 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436418 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-host-run-netns\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.437264 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436432 2567 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 19 12:30:34.437264 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436459 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-systemd-units\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.437264 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436522 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-lib-modules\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" Apr 19 12:30:34.437264 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436526 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-var-lib-kubelet\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" Apr 19 12:30:34.437264 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436559 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0badcc58-d388-42cd-aff8-8b79d1693727-multus-daemon-config\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.437264 ip-10-0-142-55 kubenswrapper[2567]: I0419 
12:30:34.436567 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-host-run-netns\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.437264 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436587 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-ovnkube-script-lib\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.437264 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436595 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-ovnkube-config\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.437264 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436609 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-run\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" Apr 19 12:30:34.437264 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436614 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-etc-sysctl-d\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" Apr 19 12:30:34.437264 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436665 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-etc-sysconfig\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9"
Apr 19 12:30:34.437264 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436704 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-etc-systemd\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9"
Apr 19 12:30:34.437264 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436722 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-etc-sysctl-d\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9"
Apr 19 12:30:34.438058 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436775 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b5cs\" (UniqueName: \"kubernetes.io/projected/7b7d9484-9728-48a5-a61f-08f5dbf3b0b7-kube-api-access-6b5cs\") pod \"iptables-alerter-7hcbn\" (UID: \"7b7d9484-9728-48a5-a61f-08f5dbf3b0b7\") " pod="openshift-network-operator/iptables-alerter-7hcbn"
Apr 19 12:30:34.438058 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436780 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-var-lib-kubelet\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9"
Apr 19 12:30:34.438058 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436837 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ab8ee79-a102-4394-8421-c7c0b8a462c3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4qn8x\" (UID: \"0ab8ee79-a102-4394-8421-c7c0b8a462c3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x"
Apr 19 12:30:34.438058 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436864 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-sys\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9"
Apr 19 12:30:34.438058 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436862 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-etc-systemd\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9"
Apr 19 12:30:34.438058 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436878 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-log-socket\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942"
Apr 19 12:30:34.438058 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436913 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-sys\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9"
Apr 19 12:30:34.438058 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436940 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-host\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9"
Apr 19 12:30:34.438058 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.436981 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0badcc58-d388-42cd-aff8-8b79d1693727-cni-binary-copy\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln"
Apr 19 12:30:34.438058 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437014 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-host\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9"
Apr 19 12:30:34.438058 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437055 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-etc-kubernetes\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln"
Apr 19 12:30:34.438058 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437095 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5ktv\" (UniqueName: \"kubernetes.io/projected/3c634550-95fa-4405-a478-4ce4ac61b034-kube-api-access-n5ktv\") pod \"multus-additional-cni-plugins-6xjjk\" (UID: \"3c634550-95fa-4405-a478-4ce4ac61b034\") " pod="openshift-multus/multus-additional-cni-plugins-6xjjk"
Apr 19 12:30:34.438058 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437178 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c634550-95fa-4405-a478-4ce4ac61b034-cnibin\") pod \"multus-additional-cni-plugins-6xjjk\" (UID: \"3c634550-95fa-4405-a478-4ce4ac61b034\") " pod="openshift-multus/multus-additional-cni-plugins-6xjjk"
Apr 19 12:30:34.438058 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437230 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-run-systemd\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942"
Apr 19 12:30:34.438058 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437232 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c634550-95fa-4405-a478-4ce4ac61b034-cnibin\") pod \"multus-additional-cni-plugins-6xjjk\" (UID: \"3c634550-95fa-4405-a478-4ce4ac61b034\") " pod="openshift-multus/multus-additional-cni-plugins-6xjjk"
Apr 19 12:30:34.438058 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437276 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-run-ovn\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942"
Apr 19 12:30:34.438058 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437308 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c634550-95fa-4405-a478-4ce4ac61b034-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6xjjk\" (UID: \"3c634550-95fa-4405-a478-4ce4ac61b034\") " pod="openshift-multus/multus-additional-cni-plugins-6xjjk"
Apr 19 12:30:34.438842 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437349 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ncg8j\" (UniqueName: \"kubernetes.io/projected/d3c440e9-6d1f-46b9-b5c1-3ee825d6e2a4-kube-api-access-ncg8j\") pod \"node-resolver-rd647\" (UID: \"d3c440e9-6d1f-46b9-b5c1-3ee825d6e2a4\") " pod="openshift-dns/node-resolver-rd647"
Apr 19 12:30:34.438842 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437360 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-run-systemd\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942"
Apr 19 12:30:34.438842 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437376 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-host-run-k8s-cni-cncf-io\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln"
Apr 19 12:30:34.438842 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437431 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-host-var-lib-cni-multus\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln"
Apr 19 12:30:34.438842 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437461 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dprfn\" (UniqueName: \"kubernetes.io/projected/aa115001-5add-4a0d-b7bb-9137e988a754-kube-api-access-dprfn\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9"
Apr 19 12:30:34.438842 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437458 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-run-ovn\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942"
Apr 19 12:30:34.438842 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437484 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c634550-95fa-4405-a478-4ce4ac61b034-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6xjjk\" (UID: \"3c634550-95fa-4405-a478-4ce4ac61b034\") " pod="openshift-multus/multus-additional-cni-plugins-6xjjk"
Apr 19 12:30:34.438842 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437495 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d3c440e9-6d1f-46b9-b5c1-3ee825d6e2a4-tmp-dir\") pod \"node-resolver-rd647\" (UID: \"d3c440e9-6d1f-46b9-b5c1-3ee825d6e2a4\") " pod="openshift-dns/node-resolver-rd647"
Apr 19 12:30:34.438842 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437558 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b7d9484-9728-48a5-a61f-08f5dbf3b0b7-host-slash\") pod \"iptables-alerter-7hcbn\" (UID: \"7b7d9484-9728-48a5-a61f-08f5dbf3b0b7\") " pod="openshift-network-operator/iptables-alerter-7hcbn"
Apr 19 12:30:34.438842 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437585 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxlp9\" (UniqueName: \"kubernetes.io/projected/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-kube-api-access-cxlp9\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942"
Apr 19 12:30:34.438842 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437615 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dc6754e8-c6ea-48e0-9ad2-435a13e54b61-serviceca\") pod \"node-ca-n7t66\" (UID: \"dc6754e8-c6ea-48e0-9ad2-435a13e54b61\") " pod="openshift-image-registry/node-ca-n7t66"
Apr 19 12:30:34.438842 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437632 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3c634550-95fa-4405-a478-4ce4ac61b034-cni-binary-copy\") pod \"multus-additional-cni-plugins-6xjjk\" (UID: \"3c634550-95fa-4405-a478-4ce4ac61b034\") " pod="openshift-multus/multus-additional-cni-plugins-6xjjk"
Apr 19 12:30:34.438842 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437668 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-etc-modprobe-d\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9"
Apr 19 12:30:34.438842 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437684 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-etc-sysctl-conf\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9"
Apr 19 12:30:34.438842 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437747 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/aa115001-5add-4a0d-b7bb-9137e988a754-etc-tuned\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9"
Apr 19 12:30:34.438842 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437788 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aa115001-5add-4a0d-b7bb-9137e988a754-tmp\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9"
Apr 19 12:30:34.438842 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437822 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsx8z\" (UniqueName: \"kubernetes.io/projected/720d8932-1617-465d-a213-ebb1e99e6bc6-kube-api-access-gsx8z\") pod \"network-metrics-daemon-98bqr\" (UID: \"720d8932-1617-465d-a213-ebb1e99e6bc6\") " pod="openshift-multus/network-metrics-daemon-98bqr"
Apr 19 12:30:34.439620 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437857 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmh6q\" (UniqueName: \"kubernetes.io/projected/0badcc58-d388-42cd-aff8-8b79d1693727-kube-api-access-wmh6q\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln"
Apr 19 12:30:34.439620 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437891 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-var-lib-openvswitch\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942"
Apr 19 12:30:34.439620 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437926 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-env-overrides\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942"
Apr 19 12:30:34.439620 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437947 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-ovnkube-script-lib\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942"
Apr 19 12:30:34.439620 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.437959 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d3c440e9-6d1f-46b9-b5c1-3ee825d6e2a4-tmp-dir\") pod \"node-resolver-rd647\" (UID: \"d3c440e9-6d1f-46b9-b5c1-3ee825d6e2a4\") " pod="openshift-dns/node-resolver-rd647"
Apr 19 12:30:34.439620 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.438002 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3c634550-95fa-4405-a478-4ce4ac61b034-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6xjjk\" (UID: \"3c634550-95fa-4405-a478-4ce4ac61b034\") " pod="openshift-multus/multus-additional-cni-plugins-6xjjk"
Apr 19 12:30:34.439620 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.438034 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-var-lib-openvswitch\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942"
Apr 19 12:30:34.439620 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.438081 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-etc-sysctl-conf\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9"
Apr 19 12:30:34.439620 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.438110 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-etc-kubernetes\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9"
Apr 19 12:30:34.439620 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.438128 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-os-release\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln"
Apr 19 12:30:34.439620 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.438143 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-multus-conf-dir\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln"
Apr 19 12:30:34.439620 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.438183 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c634550-95fa-4405-a478-4ce4ac61b034-system-cni-dir\") pod \"multus-additional-cni-plugins-6xjjk\" (UID: \"3c634550-95fa-4405-a478-4ce4ac61b034\") " pod="openshift-multus/multus-additional-cni-plugins-6xjjk"
Apr 19 12:30:34.439620 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.438236 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-etc-openvswitch\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942"
Apr 19 12:30:34.439620 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.438293 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-host-cni-bin\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942"
Apr 19 12:30:34.439620 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.438330 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-host-cni-netd\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942"
Apr 19 12:30:34.439620 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.438447 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-host-cni-netd\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942"
Apr 19 12:30:34.439620 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.438453 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-env-overrides\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942"
Apr 19 12:30:34.440421 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.438482 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c634550-95fa-4405-a478-4ce4ac61b034-system-cni-dir\") pod \"multus-additional-cni-plugins-6xjjk\" (UID: \"3c634550-95fa-4405-a478-4ce4ac61b034\") " pod="openshift-multus/multus-additional-cni-plugins-6xjjk"
Apr 19 12:30:34.440421 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.438493 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-etc-kubernetes\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9"
Apr 19 12:30:34.440421 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.438504 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-etc-openvswitch\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942"
Apr 19 12:30:34.440421 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.438536 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-host-cni-bin\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942"
Apr 19 12:30:34.440421 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.438593 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/aa115001-5add-4a0d-b7bb-9137e988a754-etc-modprobe-d\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9"
Apr 19 12:30:34.440421 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.438933 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3c634550-95fa-4405-a478-4ce4ac61b034-cni-binary-copy\") pod \"multus-additional-cni-plugins-6xjjk\" (UID: \"3c634550-95fa-4405-a478-4ce4ac61b034\") " pod="openshift-multus/multus-additional-cni-plugins-6xjjk"
Apr 19 12:30:34.440421 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.439065 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3c634550-95fa-4405-a478-4ce4ac61b034-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6xjjk\" (UID: \"3c634550-95fa-4405-a478-4ce4ac61b034\") " pod="openshift-multus/multus-additional-cni-plugins-6xjjk"
Apr 19 12:30:34.440421 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.439220 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dc6754e8-c6ea-48e0-9ad2-435a13e54b61-serviceca\") pod \"node-ca-n7t66\" (UID: \"dc6754e8-c6ea-48e0-9ad2-435a13e54b61\") " pod="openshift-image-registry/node-ca-n7t66"
Apr 19 12:30:34.440421 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.440219 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-ovn-node-metrics-cert\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942"
Apr 19 12:30:34.441430 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.441387 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/aa115001-5add-4a0d-b7bb-9137e988a754-etc-tuned\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9"
Apr 19 12:30:34.442819 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.442796 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aa115001-5add-4a0d-b7bb-9137e988a754-tmp\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9"
Apr 19 12:30:34.445586 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.445533 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dprfn\" (UniqueName: \"kubernetes.io/projected/aa115001-5add-4a0d-b7bb-9137e988a754-kube-api-access-dprfn\") pod \"tuned-nxpx9\" (UID: \"aa115001-5add-4a0d-b7bb-9137e988a754\") " pod="openshift-cluster-node-tuning-operator/tuned-nxpx9"
Apr 19 12:30:34.453231 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.449903 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5ktv\" (UniqueName: \"kubernetes.io/projected/3c634550-95fa-4405-a478-4ce4ac61b034-kube-api-access-n5ktv\") pod \"multus-additional-cni-plugins-6xjjk\" (UID: \"3c634550-95fa-4405-a478-4ce4ac61b034\") " pod="openshift-multus/multus-additional-cni-plugins-6xjjk"
Apr 19 12:30:34.453231 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.450283 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxlp9\" (UniqueName: \"kubernetes.io/projected/f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9-kube-api-access-cxlp9\") pod \"ovnkube-node-wh942\" (UID: \"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-wh942"
Apr 19 12:30:34.456685 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.456656 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq54m\" (UniqueName: \"kubernetes.io/projected/dc6754e8-c6ea-48e0-9ad2-435a13e54b61-kube-api-access-mq54m\") pod \"node-ca-n7t66\" (UID: \"dc6754e8-c6ea-48e0-9ad2-435a13e54b61\") " pod="openshift-image-registry/node-ca-n7t66"
Apr 19 12:30:34.458939 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.458913 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncg8j\" (UniqueName: \"kubernetes.io/projected/d3c440e9-6d1f-46b9-b5c1-3ee825d6e2a4-kube-api-access-ncg8j\") pod \"node-resolver-rd647\" (UID: \"d3c440e9-6d1f-46b9-b5c1-3ee825d6e2a4\") " pod="openshift-dns/node-resolver-rd647"
Apr 19 12:30:34.459629 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.459583 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-55.ec2.internal" event={"ID":"137e787a6b92ec93f0dd935c9a5cf7fd","Type":"ContainerStarted","Data":"89eb64b405ea8a9e8a2dffea24b5432852abbaa07a503ca81677a2f7fdd73ce0"}
Apr 19 12:30:34.461041 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.461004 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-55.ec2.internal" event={"ID":"023b8927dfe46c9ec0872b191f59109d","Type":"ContainerStarted","Data":"5ec047dc1f0029777bdc76408c637913b67da9ad29a360765c87fb15fcb38420"}
Apr 19 12:30:34.539294 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539199 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-multus-cni-dir\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln"
Apr 19 12:30:34.539294 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539245 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0ab8ee79-a102-4394-8421-c7c0b8a462c3-etc-selinux\") pod \"aws-ebs-csi-driver-node-4qn8x\" (UID: \"0ab8ee79-a102-4394-8421-c7c0b8a462c3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x"
Apr 19 12:30:34.539294 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539269 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0ab8ee79-a102-4394-8421-c7c0b8a462c3-socket-dir\") pod \"aws-ebs-csi-driver-node-4qn8x\" (UID: \"0ab8ee79-a102-4394-8421-c7c0b8a462c3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x"
Apr 19 12:30:34.539294 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539294 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-system-cni-dir\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln"
Apr 19 12:30:34.539592 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539318 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ddw8\" (UniqueName: \"kubernetes.io/projected/0ab8ee79-a102-4394-8421-c7c0b8a462c3-kube-api-access-2ddw8\") pod \"aws-ebs-csi-driver-node-4qn8x\" (UID: \"0ab8ee79-a102-4394-8421-c7c0b8a462c3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x"
Apr 19 12:30:34.539592 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539343 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs\") pod \"network-metrics-daemon-98bqr\" (UID: \"720d8932-1617-465d-a213-ebb1e99e6bc6\") " pod="openshift-multus/network-metrics-daemon-98bqr"
Apr 19 12:30:34.539592 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539342 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-multus-cni-dir\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln"
Apr 19 12:30:34.539592 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539365 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-cnibin\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln"
Apr 19 12:30:34.539592 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539390 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5h4q5\" (UniqueName: \"kubernetes.io/projected/54214d38-325e-4791-8e44-1bb6aac2fb3f-kube-api-access-5h4q5\") pod \"network-check-target-txq8s\" (UID: \"54214d38-325e-4791-8e44-1bb6aac2fb3f\") " pod="openshift-network-diagnostics/network-check-target-txq8s"
Apr 19 12:30:34.539592 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539415 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0badcc58-d388-42cd-aff8-8b79d1693727-multus-daemon-config\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln"
Apr 19 12:30:34.539592 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539423 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-system-cni-dir\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln"
Apr 19 12:30:34.539592 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539439 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6b5cs\" (UniqueName: \"kubernetes.io/projected/7b7d9484-9728-48a5-a61f-08f5dbf3b0b7-kube-api-access-6b5cs\") pod \"iptables-alerter-7hcbn\" (UID: \"7b7d9484-9728-48a5-a61f-08f5dbf3b0b7\") " pod="openshift-network-operator/iptables-alerter-7hcbn"
Apr 19 12:30:34.539592 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539462 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ab8ee79-a102-4394-8421-c7c0b8a462c3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4qn8x\" (UID: \"0ab8ee79-a102-4394-8421-c7c0b8a462c3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x"
Apr 19 12:30:34.539592 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539526 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0ab8ee79-a102-4394-8421-c7c0b8a462c3-socket-dir\") pod \"aws-ebs-csi-driver-node-4qn8x\" (UID: \"0ab8ee79-a102-4394-8421-c7c0b8a462c3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x"
Apr 19 12:30:34.539592 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539528 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0ab8ee79-a102-4394-8421-c7c0b8a462c3-etc-selinux\") pod \"aws-ebs-csi-driver-node-4qn8x\" (UID: \"0ab8ee79-a102-4394-8421-c7c0b8a462c3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x"
Apr 19 12:30:34.539592 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539528 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-cnibin\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln"
Apr 19 12:30:34.539592 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539555 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0badcc58-d388-42cd-aff8-8b79d1693727-cni-binary-copy\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln"
Apr 19 12:30:34.539592 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539528 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ab8ee79-a102-4394-8421-c7c0b8a462c3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4qn8x\" (UID: \"0ab8ee79-a102-4394-8421-c7c0b8a462c3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x"
Apr 19 12:30:34.539592 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539587 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-etc-kubernetes\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln"
Apr 19 12:30:34.540267 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539622 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-host-run-k8s-cni-cncf-io\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln"
Apr 19 12:30:34.540267 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539646 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-host-var-lib-cni-multus\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln"
Apr 19 12:30:34.540267 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539671 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b7d9484-9728-48a5-a61f-08f5dbf3b0b7-host-slash\") pod \"iptables-alerter-7hcbn\" (UID: \"7b7d9484-9728-48a5-a61f-08f5dbf3b0b7\") " pod="openshift-network-operator/iptables-alerter-7hcbn"
Apr 19 12:30:34.540267 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539701 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gsx8z\" (UniqueName: \"kubernetes.io/projected/720d8932-1617-465d-a213-ebb1e99e6bc6-kube-api-access-gsx8z\") pod \"network-metrics-daemon-98bqr\" (UID: \"720d8932-1617-465d-a213-ebb1e99e6bc6\") " pod="openshift-multus/network-metrics-daemon-98bqr"
Apr 19 12:30:34.540267 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539724 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wmh6q\" (UniqueName: \"kubernetes.io/projected/0badcc58-d388-42cd-aff8-8b79d1693727-kube-api-access-wmh6q\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln"
Apr 19 12:30:34.540267 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539749 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-os-release\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln"
Apr 19 12:30:34.540267 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539771 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-multus-conf-dir\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln"
Apr 19 12:30:34.540267 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539774 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-host-run-k8s-cni-cncf-io\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln"
Apr 19 12:30:34.540267 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539800 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7b7d9484-9728-48a5-a61f-08f5dbf3b0b7-iptables-alerter-script\") pod \"iptables-alerter-7hcbn\" (UID: \"7b7d9484-9728-48a5-a61f-08f5dbf3b0b7\") " pod="openshift-network-operator/iptables-alerter-7hcbn"
Apr 19 12:30:34.540267 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:34.539898 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:30:34.540267 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.539953 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0badcc58-d388-42cd-aff8-8b79d1693727-multus-daemon-config\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln"
Apr 19 12:30:34.540267 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:34.539964 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs podName:720d8932-1617-465d-a213-ebb1e99e6bc6 nodeName:}" failed. No retries permitted until 2026-04-19 12:30:35.039941114 +0000 UTC m=+3.193874575 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs") pod "network-metrics-daemon-98bqr" (UID: "720d8932-1617-465d-a213-ebb1e99e6bc6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:30:34.540267 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540055 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-host-var-lib-cni-multus\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.540267 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540106 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b7d9484-9728-48a5-a61f-08f5dbf3b0b7-host-slash\") pod \"iptables-alerter-7hcbn\" (UID: \"7b7d9484-9728-48a5-a61f-08f5dbf3b0b7\") " pod="openshift-network-operator/iptables-alerter-7hcbn" Apr 19 12:30:34.540267 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540142 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-hostroot\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.540267 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540186 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-hostroot\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.540267 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540245 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-os-release\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.541084 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540250 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0ab8ee79-a102-4394-8421-c7c0b8a462c3-sys-fs\") pod \"aws-ebs-csi-driver-node-4qn8x\" (UID: \"0ab8ee79-a102-4394-8421-c7c0b8a462c3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x" Apr 19 12:30:34.541084 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540284 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-multus-socket-dir-parent\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.541084 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540309 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-host-run-netns\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.541084 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540334 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-host-run-multus-certs\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.541084 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540362 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/c2abf95c-a107-4a6c-93cd-802a48e2976c-agent-certs\") pod \"konnectivity-agent-jtfsx\" (UID: \"c2abf95c-a107-4a6c-93cd-802a48e2976c\") " pod="kube-system/konnectivity-agent-jtfsx" Apr 19 12:30:34.541084 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540386 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c2abf95c-a107-4a6c-93cd-802a48e2976c-konnectivity-ca\") pod \"konnectivity-agent-jtfsx\" (UID: \"c2abf95c-a107-4a6c-93cd-802a48e2976c\") " pod="kube-system/konnectivity-agent-jtfsx" Apr 19 12:30:34.541084 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540414 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-host-var-lib-cni-bin\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.541084 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540420 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7b7d9484-9728-48a5-a61f-08f5dbf3b0b7-iptables-alerter-script\") pod \"iptables-alerter-7hcbn\" (UID: \"7b7d9484-9728-48a5-a61f-08f5dbf3b0b7\") " pod="openshift-network-operator/iptables-alerter-7hcbn" Apr 19 12:30:34.541084 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540441 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-host-var-lib-kubelet\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.541084 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540145 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-etc-kubernetes\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.541084 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540464 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-multus-conf-dir\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.541084 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540470 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0ab8ee79-a102-4394-8421-c7c0b8a462c3-registration-dir\") pod \"aws-ebs-csi-driver-node-4qn8x\" (UID: \"0ab8ee79-a102-4394-8421-c7c0b8a462c3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x" Apr 19 12:30:34.541084 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540512 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0ab8ee79-a102-4394-8421-c7c0b8a462c3-sys-fs\") pod \"aws-ebs-csi-driver-node-4qn8x\" (UID: \"0ab8ee79-a102-4394-8421-c7c0b8a462c3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x" Apr 19 12:30:34.541084 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540518 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-multus-socket-dir-parent\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.541084 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540526 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/0ab8ee79-a102-4394-8421-c7c0b8a462c3-registration-dir\") pod \"aws-ebs-csi-driver-node-4qn8x\" (UID: \"0ab8ee79-a102-4394-8421-c7c0b8a462c3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x" Apr 19 12:30:34.541084 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540531 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0badcc58-d388-42cd-aff8-8b79d1693727-cni-binary-copy\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.541084 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540547 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0ab8ee79-a102-4394-8421-c7c0b8a462c3-device-dir\") pod \"aws-ebs-csi-driver-node-4qn8x\" (UID: \"0ab8ee79-a102-4394-8421-c7c0b8a462c3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x" Apr 19 12:30:34.541731 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540576 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-host-run-netns\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.541731 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540590 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-host-var-lib-kubelet\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.541731 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540613 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-host-run-multus-certs\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.541731 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540619 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0ab8ee79-a102-4394-8421-c7c0b8a462c3-device-dir\") pod \"aws-ebs-csi-driver-node-4qn8x\" (UID: \"0ab8ee79-a102-4394-8421-c7c0b8a462c3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x" Apr 19 12:30:34.541731 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540632 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0badcc58-d388-42cd-aff8-8b79d1693727-host-var-lib-cni-bin\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.541731 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.540970 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c2abf95c-a107-4a6c-93cd-802a48e2976c-konnectivity-ca\") pod \"konnectivity-agent-jtfsx\" (UID: \"c2abf95c-a107-4a6c-93cd-802a48e2976c\") " pod="kube-system/konnectivity-agent-jtfsx" Apr 19 12:30:34.543854 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.543820 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c2abf95c-a107-4a6c-93cd-802a48e2976c-agent-certs\") pod \"konnectivity-agent-jtfsx\" (UID: \"c2abf95c-a107-4a6c-93cd-802a48e2976c\") " pod="kube-system/konnectivity-agent-jtfsx" Apr 19 12:30:34.545506 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:34.545483 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 12:30:34.545599 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:34.545510 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 12:30:34.545599 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:34.545524 2567 projected.go:194] Error preparing data for projected volume kube-api-access-5h4q5 for pod openshift-network-diagnostics/network-check-target-txq8s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:30:34.545714 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:34.545605 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54214d38-325e-4791-8e44-1bb6aac2fb3f-kube-api-access-5h4q5 podName:54214d38-325e-4791-8e44-1bb6aac2fb3f nodeName:}" failed. No retries permitted until 2026-04-19 12:30:35.045578288 +0000 UTC m=+3.199511754 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5h4q5" (UniqueName: "kubernetes.io/projected/54214d38-325e-4791-8e44-1bb6aac2fb3f-kube-api-access-5h4q5") pod "network-check-target-txq8s" (UID: "54214d38-325e-4791-8e44-1bb6aac2fb3f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:30:34.547942 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.547920 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b5cs\" (UniqueName: \"kubernetes.io/projected/7b7d9484-9728-48a5-a61f-08f5dbf3b0b7-kube-api-access-6b5cs\") pod \"iptables-alerter-7hcbn\" (UID: \"7b7d9484-9728-48a5-a61f-08f5dbf3b0b7\") " pod="openshift-network-operator/iptables-alerter-7hcbn" Apr 19 12:30:34.548589 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.548540 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ddw8\" (UniqueName: \"kubernetes.io/projected/0ab8ee79-a102-4394-8421-c7c0b8a462c3-kube-api-access-2ddw8\") pod \"aws-ebs-csi-driver-node-4qn8x\" (UID: \"0ab8ee79-a102-4394-8421-c7c0b8a462c3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x" Apr 19 12:30:34.548922 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.548884 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmh6q\" (UniqueName: \"kubernetes.io/projected/0badcc58-d388-42cd-aff8-8b79d1693727-kube-api-access-wmh6q\") pod \"multus-jcjln\" (UID: \"0badcc58-d388-42cd-aff8-8b79d1693727\") " pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.550416 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.550380 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsx8z\" (UniqueName: \"kubernetes.io/projected/720d8932-1617-465d-a213-ebb1e99e6bc6-kube-api-access-gsx8z\") pod \"network-metrics-daemon-98bqr\" (UID: 
\"720d8932-1617-465d-a213-ebb1e99e6bc6\") " pod="openshift-multus/network-metrics-daemon-98bqr" Apr 19 12:30:34.626886 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.626843 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6xjjk" Apr 19 12:30:34.634147 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.634125 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" Apr 19 12:30:34.637265 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:34.637234 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c634550_95fa_4405_a478_4ce4ac61b034.slice/crio-013964638ccde01715fa751e7aac466e7105587d9b5381b239602b56c5fc7edc WatchSource:0}: Error finding container 013964638ccde01715fa751e7aac466e7105587d9b5381b239602b56c5fc7edc: Status 404 returned error can't find the container with id 013964638ccde01715fa751e7aac466e7105587d9b5381b239602b56c5fc7edc Apr 19 12:30:34.641964 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:34.641938 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa115001_5add_4a0d_b7bb_9137e988a754.slice/crio-e23054f8a559b301858fc43fa3d787a483d48947c6b34be900409e3e0eff5ec6 WatchSource:0}: Error finding container e23054f8a559b301858fc43fa3d787a483d48947c6b34be900409e3e0eff5ec6: Status 404 returned error can't find the container with id e23054f8a559b301858fc43fa3d787a483d48947c6b34be900409e3e0eff5ec6 Apr 19 12:30:34.650110 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.650032 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rd647" Apr 19 12:30:34.657216 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.657174 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-n7t66" Apr 19 12:30:34.657508 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:34.657485 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3c440e9_6d1f_46b9_b5c1_3ee825d6e2a4.slice/crio-64af8473808b2ef0558d20fe5ae72beabd18ea9fd3cb517aec6df5f7cff3f48f WatchSource:0}: Error finding container 64af8473808b2ef0558d20fe5ae72beabd18ea9fd3cb517aec6df5f7cff3f48f: Status 404 returned error can't find the container with id 64af8473808b2ef0558d20fe5ae72beabd18ea9fd3cb517aec6df5f7cff3f48f Apr 19 12:30:34.665587 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.665563 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wh942" Apr 19 12:30:34.667437 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:34.667408 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc6754e8_c6ea_48e0_9ad2_435a13e54b61.slice/crio-c80645435fb3a59ed90fb891ee1994ec7458bda140eb1f22001e8a8a8af6a4ac WatchSource:0}: Error finding container c80645435fb3a59ed90fb891ee1994ec7458bda140eb1f22001e8a8a8af6a4ac: Status 404 returned error can't find the container with id c80645435fb3a59ed90fb891ee1994ec7458bda140eb1f22001e8a8a8af6a4ac Apr 19 12:30:34.672509 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:34.672484 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7f5fa31_f41a_42f0_8ea0_450aafc2a4a9.slice/crio-f43cc261f6d79ed1b1837a20af114046be4e5cca3d6588467c9765c8ba6ba3d7 WatchSource:0}: Error finding container f43cc261f6d79ed1b1837a20af114046be4e5cca3d6588467c9765c8ba6ba3d7: Status 404 returned error can't find the container with id f43cc261f6d79ed1b1837a20af114046be4e5cca3d6588467c9765c8ba6ba3d7 Apr 19 12:30:34.673727 ip-10-0-142-55 
kubenswrapper[2567]: I0419 12:30:34.673707 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jcjln" Apr 19 12:30:34.681552 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:34.681532 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0badcc58_d388_42cd_aff8_8b79d1693727.slice/crio-2be186cd0d285d8b1b3adf242ee950cde7bb1dcad6ded2206276f00ca0c3a8e6 WatchSource:0}: Error finding container 2be186cd0d285d8b1b3adf242ee950cde7bb1dcad6ded2206276f00ca0c3a8e6: Status 404 returned error can't find the container with id 2be186cd0d285d8b1b3adf242ee950cde7bb1dcad6ded2206276f00ca0c3a8e6 Apr 19 12:30:34.683428 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.683150 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jtfsx" Apr 19 12:30:34.692489 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.692464 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x" Apr 19 12:30:34.700994 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:34.700967 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-7hcbn" Apr 19 12:30:35.043855 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:35.043824 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs\") pod \"network-metrics-daemon-98bqr\" (UID: \"720d8932-1617-465d-a213-ebb1e99e6bc6\") " pod="openshift-multus/network-metrics-daemon-98bqr" Apr 19 12:30:35.044006 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:35.043951 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:30:35.044065 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:35.044018 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs podName:720d8932-1617-465d-a213-ebb1e99e6bc6 nodeName:}" failed. No retries permitted until 2026-04-19 12:30:36.044001619 +0000 UTC m=+4.197935079 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs") pod "network-metrics-daemon-98bqr" (UID: "720d8932-1617-465d-a213-ebb1e99e6bc6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:30:35.144456 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:35.144426 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5h4q5\" (UniqueName: \"kubernetes.io/projected/54214d38-325e-4791-8e44-1bb6aac2fb3f-kube-api-access-5h4q5\") pod \"network-check-target-txq8s\" (UID: \"54214d38-325e-4791-8e44-1bb6aac2fb3f\") " pod="openshift-network-diagnostics/network-check-target-txq8s" Apr 19 12:30:35.144620 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:35.144600 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 12:30:35.144673 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:35.144626 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 12:30:35.144673 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:35.144639 2567 projected.go:194] Error preparing data for projected volume kube-api-access-5h4q5 for pod openshift-network-diagnostics/network-check-target-txq8s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:30:35.144751 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:35.144700 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54214d38-325e-4791-8e44-1bb6aac2fb3f-kube-api-access-5h4q5 podName:54214d38-325e-4791-8e44-1bb6aac2fb3f nodeName:}" failed. 
No retries permitted until 2026-04-19 12:30:36.144681968 +0000 UTC m=+4.298615437 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-5h4q5" (UniqueName: "kubernetes.io/projected/54214d38-325e-4791-8e44-1bb6aac2fb3f-kube-api-access-5h4q5") pod "network-check-target-txq8s" (UID: "54214d38-325e-4791-8e44-1bb6aac2fb3f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:30:35.367434 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:35.367331 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-18 12:25:33 +0000 UTC" deadline="2027-11-09 20:59:33.952279926 +0000 UTC" Apr 19 12:30:35.367434 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:35.367373 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13664h28m58.58490999s" Apr 19 12:30:35.454111 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:35.454082 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98bqr" Apr 19 12:30:35.454295 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:35.454228 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-98bqr" podUID="720d8932-1617-465d-a213-ebb1e99e6bc6"
Apr 19 12:30:35.463191 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:35.463143 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jcjln" event={"ID":"0badcc58-d388-42cd-aff8-8b79d1693727","Type":"ContainerStarted","Data":"2be186cd0d285d8b1b3adf242ee950cde7bb1dcad6ded2206276f00ca0c3a8e6"}
Apr 19 12:30:35.464283 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:35.464256 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wh942" event={"ID":"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9","Type":"ContainerStarted","Data":"f43cc261f6d79ed1b1837a20af114046be4e5cca3d6588467c9765c8ba6ba3d7"}
Apr 19 12:30:35.465369 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:35.465334 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n7t66" event={"ID":"dc6754e8-c6ea-48e0-9ad2-435a13e54b61","Type":"ContainerStarted","Data":"c80645435fb3a59ed90fb891ee1994ec7458bda140eb1f22001e8a8a8af6a4ac"}
Apr 19 12:30:35.466391 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:35.466370 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rd647" event={"ID":"d3c440e9-6d1f-46b9-b5c1-3ee825d6e2a4","Type":"ContainerStarted","Data":"64af8473808b2ef0558d20fe5ae72beabd18ea9fd3cb517aec6df5f7cff3f48f"}
Apr 19 12:30:35.467410 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:35.467386 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" event={"ID":"aa115001-5add-4a0d-b7bb-9137e988a754","Type":"ContainerStarted","Data":"e23054f8a559b301858fc43fa3d787a483d48947c6b34be900409e3e0eff5ec6"}
Apr 19 12:30:35.468430 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:35.468408 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6xjjk" event={"ID":"3c634550-95fa-4405-a478-4ce4ac61b034","Type":"ContainerStarted","Data":"013964638ccde01715fa751e7aac466e7105587d9b5381b239602b56c5fc7edc"}
Apr 19 12:30:35.554658 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:35.554625 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ab8ee79_a102_4394_8421_c7c0b8a462c3.slice/crio-6d6da174ae1a23f6ebe30a46372ce1ff54301d4853b7c318c5f6b3a3fcca8c1c WatchSource:0}: Error finding container 6d6da174ae1a23f6ebe30a46372ce1ff54301d4853b7c318c5f6b3a3fcca8c1c: Status 404 returned error can't find the container with id 6d6da174ae1a23f6ebe30a46372ce1ff54301d4853b7c318c5f6b3a3fcca8c1c
Apr 19 12:30:35.557180 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:35.556944 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b7d9484_9728_48a5_a61f_08f5dbf3b0b7.slice/crio-ec4d9a6483b5555857b1253bbee05e6230914d69b92b1076d35640d17388358e WatchSource:0}: Error finding container ec4d9a6483b5555857b1253bbee05e6230914d69b92b1076d35640d17388358e: Status 404 returned error can't find the container with id ec4d9a6483b5555857b1253bbee05e6230914d69b92b1076d35640d17388358e
Apr 19 12:30:35.557689 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:30:35.557665 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2abf95c_a107_4a6c_93cd_802a48e2976c.slice/crio-ace1c8ed9d9c1df707e0634165b900d6b1b79d3b22eb5e594eeda612e421d391 WatchSource:0}: Error finding container ace1c8ed9d9c1df707e0634165b900d6b1b79d3b22eb5e594eeda612e421d391: Status 404 returned error can't find the container with id ace1c8ed9d9c1df707e0634165b900d6b1b79d3b22eb5e594eeda612e421d391
Apr 19 12:30:36.053108 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:36.052861 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs\") pod \"network-metrics-daemon-98bqr\" (UID: \"720d8932-1617-465d-a213-ebb1e99e6bc6\") " pod="openshift-multus/network-metrics-daemon-98bqr"
Apr 19 12:30:36.053317 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:36.053280 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:30:36.053388 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:36.053345 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs podName:720d8932-1617-465d-a213-ebb1e99e6bc6 nodeName:}" failed. No retries permitted until 2026-04-19 12:30:38.053327119 +0000 UTC m=+6.207260590 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs") pod "network-metrics-daemon-98bqr" (UID: "720d8932-1617-465d-a213-ebb1e99e6bc6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:30:36.154679 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:36.154022 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5h4q5\" (UniqueName: \"kubernetes.io/projected/54214d38-325e-4791-8e44-1bb6aac2fb3f-kube-api-access-5h4q5\") pod \"network-check-target-txq8s\" (UID: \"54214d38-325e-4791-8e44-1bb6aac2fb3f\") " pod="openshift-network-diagnostics/network-check-target-txq8s"
Apr 19 12:30:36.154679 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:36.154227 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 19 12:30:36.154679 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:36.154250 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 19 12:30:36.154679 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:36.154269 2567 projected.go:194] Error preparing data for projected volume kube-api-access-5h4q5 for pod openshift-network-diagnostics/network-check-target-txq8s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:30:36.154679 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:36.154333 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54214d38-325e-4791-8e44-1bb6aac2fb3f-kube-api-access-5h4q5 podName:54214d38-325e-4791-8e44-1bb6aac2fb3f nodeName:}" failed. No retries permitted until 2026-04-19 12:30:38.154313354 +0000 UTC m=+6.308246819 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-5h4q5" (UniqueName: "kubernetes.io/projected/54214d38-325e-4791-8e44-1bb6aac2fb3f-kube-api-access-5h4q5") pod "network-check-target-txq8s" (UID: "54214d38-325e-4791-8e44-1bb6aac2fb3f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:30:36.456519 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:36.456403 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-txq8s"
Apr 19 12:30:36.456967 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:36.456550 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-txq8s" podUID="54214d38-325e-4791-8e44-1bb6aac2fb3f"
Apr 19 12:30:36.489608 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:36.488938 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-55.ec2.internal" event={"ID":"023b8927dfe46c9ec0872b191f59109d","Type":"ContainerStarted","Data":"dc8d4c1a84c3372ac59754c25aa7548dfeb9e758f7348a3be2d551a5c2916110"}
Apr 19 12:30:36.502496 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:36.502427 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-55.ec2.internal" podStartSLOduration=3.5024080939999997 podStartE2EDuration="3.502408094s" podCreationTimestamp="2026-04-19 12:30:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:30:36.501801504 +0000 UTC m=+4.655734985" watchObservedRunningTime="2026-04-19 12:30:36.502408094 +0000 UTC m=+4.656341575"
Apr 19 12:30:36.505719 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:36.504308 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7hcbn" event={"ID":"7b7d9484-9728-48a5-a61f-08f5dbf3b0b7","Type":"ContainerStarted","Data":"ec4d9a6483b5555857b1253bbee05e6230914d69b92b1076d35640d17388358e"}
Apr 19 12:30:36.511447 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:36.510107 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x" event={"ID":"0ab8ee79-a102-4394-8421-c7c0b8a462c3","Type":"ContainerStarted","Data":"6d6da174ae1a23f6ebe30a46372ce1ff54301d4853b7c318c5f6b3a3fcca8c1c"}
Apr 19 12:30:36.522199 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:36.522119 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jtfsx" event={"ID":"c2abf95c-a107-4a6c-93cd-802a48e2976c","Type":"ContainerStarted","Data":"ace1c8ed9d9c1df707e0634165b900d6b1b79d3b22eb5e594eeda612e421d391"}
Apr 19 12:30:37.279780 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:37.279748 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-qlgx9"]
Apr 19 12:30:37.282767 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:37.282735 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qlgx9"
Apr 19 12:30:37.282887 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:37.282818 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qlgx9" podUID="ddd4693b-4c32-466e-be78-8808310a5f1f"
Apr 19 12:30:37.366100 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:37.364977 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ddd4693b-4c32-466e-be78-8808310a5f1f-kubelet-config\") pod \"global-pull-secret-syncer-qlgx9\" (UID: \"ddd4693b-4c32-466e-be78-8808310a5f1f\") " pod="kube-system/global-pull-secret-syncer-qlgx9"
Apr 19 12:30:37.366100 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:37.365060 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ddd4693b-4c32-466e-be78-8808310a5f1f-dbus\") pod \"global-pull-secret-syncer-qlgx9\" (UID: \"ddd4693b-4c32-466e-be78-8808310a5f1f\") " pod="kube-system/global-pull-secret-syncer-qlgx9"
Apr 19 12:30:37.366100 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:37.365095 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ddd4693b-4c32-466e-be78-8808310a5f1f-original-pull-secret\") pod \"global-pull-secret-syncer-qlgx9\" (UID: \"ddd4693b-4c32-466e-be78-8808310a5f1f\") " pod="kube-system/global-pull-secret-syncer-qlgx9"
Apr 19 12:30:37.453984 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:37.453938 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98bqr"
Apr 19 12:30:37.454196 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:37.454068 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98bqr" podUID="720d8932-1617-465d-a213-ebb1e99e6bc6"
Apr 19 12:30:37.465488 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:37.465445 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ddd4693b-4c32-466e-be78-8808310a5f1f-dbus\") pod \"global-pull-secret-syncer-qlgx9\" (UID: \"ddd4693b-4c32-466e-be78-8808310a5f1f\") " pod="kube-system/global-pull-secret-syncer-qlgx9"
Apr 19 12:30:37.465942 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:37.465502 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ddd4693b-4c32-466e-be78-8808310a5f1f-original-pull-secret\") pod \"global-pull-secret-syncer-qlgx9\" (UID: \"ddd4693b-4c32-466e-be78-8808310a5f1f\") " pod="kube-system/global-pull-secret-syncer-qlgx9"
Apr 19 12:30:37.465942 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:37.465603 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ddd4693b-4c32-466e-be78-8808310a5f1f-kubelet-config\") pod \"global-pull-secret-syncer-qlgx9\" (UID: \"ddd4693b-4c32-466e-be78-8808310a5f1f\") " pod="kube-system/global-pull-secret-syncer-qlgx9"
Apr 19 12:30:37.465942 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:37.465676 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ddd4693b-4c32-466e-be78-8808310a5f1f-dbus\") pod \"global-pull-secret-syncer-qlgx9\" (UID: \"ddd4693b-4c32-466e-be78-8808310a5f1f\") " pod="kube-system/global-pull-secret-syncer-qlgx9"
Apr 19 12:30:37.465942 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:37.465692 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 19 12:30:37.465942 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:37.465724 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ddd4693b-4c32-466e-be78-8808310a5f1f-kubelet-config\") pod \"global-pull-secret-syncer-qlgx9\" (UID: \"ddd4693b-4c32-466e-be78-8808310a5f1f\") " pod="kube-system/global-pull-secret-syncer-qlgx9"
Apr 19 12:30:37.465942 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:37.465787 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddd4693b-4c32-466e-be78-8808310a5f1f-original-pull-secret podName:ddd4693b-4c32-466e-be78-8808310a5f1f nodeName:}" failed. No retries permitted until 2026-04-19 12:30:37.965766472 +0000 UTC m=+6.119699931 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ddd4693b-4c32-466e-be78-8808310a5f1f-original-pull-secret") pod "global-pull-secret-syncer-qlgx9" (UID: "ddd4693b-4c32-466e-be78-8808310a5f1f") : object "kube-system"/"original-pull-secret" not registered
Apr 19 12:30:37.537173 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:37.537075 2567 generic.go:358] "Generic (PLEG): container finished" podID="137e787a6b92ec93f0dd935c9a5cf7fd" containerID="edb9ffc3ea5209a2dfc209e2281e689d89589a9aacc58e20e5b4c90f1d018949" exitCode=0
Apr 19 12:30:37.537459 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:37.537434 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-55.ec2.internal" event={"ID":"137e787a6b92ec93f0dd935c9a5cf7fd","Type":"ContainerDied","Data":"edb9ffc3ea5209a2dfc209e2281e689d89589a9aacc58e20e5b4c90f1d018949"}
Apr 19 12:30:37.970527 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:37.970466 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ddd4693b-4c32-466e-be78-8808310a5f1f-original-pull-secret\") pod \"global-pull-secret-syncer-qlgx9\" (UID: \"ddd4693b-4c32-466e-be78-8808310a5f1f\") " pod="kube-system/global-pull-secret-syncer-qlgx9"
Apr 19 12:30:37.970698 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:37.970593 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 19 12:30:37.970698 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:37.970658 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddd4693b-4c32-466e-be78-8808310a5f1f-original-pull-secret podName:ddd4693b-4c32-466e-be78-8808310a5f1f nodeName:}" failed. No retries permitted until 2026-04-19 12:30:38.970639116 +0000 UTC m=+7.124572574 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ddd4693b-4c32-466e-be78-8808310a5f1f-original-pull-secret") pod "global-pull-secret-syncer-qlgx9" (UID: "ddd4693b-4c32-466e-be78-8808310a5f1f") : object "kube-system"/"original-pull-secret" not registered
Apr 19 12:30:38.071887 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:38.071238 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs\") pod \"network-metrics-daemon-98bqr\" (UID: \"720d8932-1617-465d-a213-ebb1e99e6bc6\") " pod="openshift-multus/network-metrics-daemon-98bqr"
Apr 19 12:30:38.071887 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:38.071409 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:30:38.071887 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:38.071474 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs podName:720d8932-1617-465d-a213-ebb1e99e6bc6 nodeName:}" failed. No retries permitted until 2026-04-19 12:30:42.07145597 +0000 UTC m=+10.225389445 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs") pod "network-metrics-daemon-98bqr" (UID: "720d8932-1617-465d-a213-ebb1e99e6bc6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:30:38.172187 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:38.172137 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5h4q5\" (UniqueName: \"kubernetes.io/projected/54214d38-325e-4791-8e44-1bb6aac2fb3f-kube-api-access-5h4q5\") pod \"network-check-target-txq8s\" (UID: \"54214d38-325e-4791-8e44-1bb6aac2fb3f\") " pod="openshift-network-diagnostics/network-check-target-txq8s"
Apr 19 12:30:38.172358 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:38.172310 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 19 12:30:38.172358 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:38.172327 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 19 12:30:38.172358 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:38.172341 2567 projected.go:194] Error preparing data for projected volume kube-api-access-5h4q5 for pod openshift-network-diagnostics/network-check-target-txq8s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:30:38.172557 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:38.172398 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54214d38-325e-4791-8e44-1bb6aac2fb3f-kube-api-access-5h4q5 podName:54214d38-325e-4791-8e44-1bb6aac2fb3f nodeName:}" failed. No retries permitted until 2026-04-19 12:30:42.1723803 +0000 UTC m=+10.326313776 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-5h4q5" (UniqueName: "kubernetes.io/projected/54214d38-325e-4791-8e44-1bb6aac2fb3f-kube-api-access-5h4q5") pod "network-check-target-txq8s" (UID: "54214d38-325e-4791-8e44-1bb6aac2fb3f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:30:38.455200 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:38.454911 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-txq8s"
Apr 19 12:30:38.455200 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:38.455034 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-txq8s" podUID="54214d38-325e-4791-8e44-1bb6aac2fb3f"
Apr 19 12:30:38.977509 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:38.977414 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ddd4693b-4c32-466e-be78-8808310a5f1f-original-pull-secret\") pod \"global-pull-secret-syncer-qlgx9\" (UID: \"ddd4693b-4c32-466e-be78-8808310a5f1f\") " pod="kube-system/global-pull-secret-syncer-qlgx9"
Apr 19 12:30:38.977991 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:38.977559 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 19 12:30:38.977991 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:38.977631 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddd4693b-4c32-466e-be78-8808310a5f1f-original-pull-secret podName:ddd4693b-4c32-466e-be78-8808310a5f1f nodeName:}" failed. No retries permitted until 2026-04-19 12:30:40.977611569 +0000 UTC m=+9.131545050 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ddd4693b-4c32-466e-be78-8808310a5f1f-original-pull-secret") pod "global-pull-secret-syncer-qlgx9" (UID: "ddd4693b-4c32-466e-be78-8808310a5f1f") : object "kube-system"/"original-pull-secret" not registered
Apr 19 12:30:39.454717 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:39.454154 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98bqr"
Apr 19 12:30:39.454717 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:39.454318 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98bqr" podUID="720d8932-1617-465d-a213-ebb1e99e6bc6"
Apr 19 12:30:39.454717 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:39.454154 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qlgx9"
Apr 19 12:30:39.454717 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:39.454453 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qlgx9" podUID="ddd4693b-4c32-466e-be78-8808310a5f1f"
Apr 19 12:30:40.454391 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:40.454357 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-txq8s"
Apr 19 12:30:40.454835 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:40.454495 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-txq8s" podUID="54214d38-325e-4791-8e44-1bb6aac2fb3f"
Apr 19 12:30:40.995779 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:40.995738 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ddd4693b-4c32-466e-be78-8808310a5f1f-original-pull-secret\") pod \"global-pull-secret-syncer-qlgx9\" (UID: \"ddd4693b-4c32-466e-be78-8808310a5f1f\") " pod="kube-system/global-pull-secret-syncer-qlgx9"
Apr 19 12:30:40.996118 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:40.995904 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 19 12:30:40.996118 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:40.995986 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddd4693b-4c32-466e-be78-8808310a5f1f-original-pull-secret podName:ddd4693b-4c32-466e-be78-8808310a5f1f nodeName:}" failed. No retries permitted until 2026-04-19 12:30:44.995965783 +0000 UTC m=+13.149899255 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ddd4693b-4c32-466e-be78-8808310a5f1f-original-pull-secret") pod "global-pull-secret-syncer-qlgx9" (UID: "ddd4693b-4c32-466e-be78-8808310a5f1f") : object "kube-system"/"original-pull-secret" not registered
Apr 19 12:30:41.454357 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:41.454321 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qlgx9"
Apr 19 12:30:41.454522 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:41.454437 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qlgx9" podUID="ddd4693b-4c32-466e-be78-8808310a5f1f"
Apr 19 12:30:41.454884 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:41.454856 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98bqr"
Apr 19 12:30:41.454985 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:41.454957 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98bqr" podUID="720d8932-1617-465d-a213-ebb1e99e6bc6"
Apr 19 12:30:42.105229 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:42.105193 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs\") pod \"network-metrics-daemon-98bqr\" (UID: \"720d8932-1617-465d-a213-ebb1e99e6bc6\") " pod="openshift-multus/network-metrics-daemon-98bqr"
Apr 19 12:30:42.105418 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:42.105359 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:30:42.105490 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:42.105445 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs podName:720d8932-1617-465d-a213-ebb1e99e6bc6 nodeName:}" failed. No retries permitted until 2026-04-19 12:30:50.105424214 +0000 UTC m=+18.259357689 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs") pod "network-metrics-daemon-98bqr" (UID: "720d8932-1617-465d-a213-ebb1e99e6bc6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:30:42.205821 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:42.205778 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5h4q5\" (UniqueName: \"kubernetes.io/projected/54214d38-325e-4791-8e44-1bb6aac2fb3f-kube-api-access-5h4q5\") pod \"network-check-target-txq8s\" (UID: \"54214d38-325e-4791-8e44-1bb6aac2fb3f\") " pod="openshift-network-diagnostics/network-check-target-txq8s"
Apr 19 12:30:42.205989 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:42.205966 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 19 12:30:42.206048 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:42.205998 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 19 12:30:42.206048 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:42.206012 2567 projected.go:194] Error preparing data for projected volume kube-api-access-5h4q5 for pod openshift-network-diagnostics/network-check-target-txq8s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:30:42.206137 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:42.206079 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54214d38-325e-4791-8e44-1bb6aac2fb3f-kube-api-access-5h4q5 podName:54214d38-325e-4791-8e44-1bb6aac2fb3f nodeName:}" failed. No retries permitted until 2026-04-19 12:30:50.206059653 +0000 UTC m=+18.359993117 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-5h4q5" (UniqueName: "kubernetes.io/projected/54214d38-325e-4791-8e44-1bb6aac2fb3f-kube-api-access-5h4q5") pod "network-check-target-txq8s" (UID: "54214d38-325e-4791-8e44-1bb6aac2fb3f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:30:42.455238 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:42.454722 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-txq8s"
Apr 19 12:30:42.455238 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:42.454832 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-txq8s" podUID="54214d38-325e-4791-8e44-1bb6aac2fb3f"
Apr 19 12:30:43.453721 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:43.453681 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qlgx9"
Apr 19 12:30:43.453944 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:43.453685 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98bqr"
Apr 19 12:30:43.453944 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:43.453867 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qlgx9" podUID="ddd4693b-4c32-466e-be78-8808310a5f1f"
Apr 19 12:30:43.453944 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:43.453931 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98bqr" podUID="720d8932-1617-465d-a213-ebb1e99e6bc6"
Apr 19 12:30:44.454221 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:44.454184 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-txq8s"
Apr 19 12:30:44.454714 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:44.454314 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-txq8s" podUID="54214d38-325e-4791-8e44-1bb6aac2fb3f"
Apr 19 12:30:44.552763 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:44.552726 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-55.ec2.internal" event={"ID":"137e787a6b92ec93f0dd935c9a5cf7fd","Type":"ContainerStarted","Data":"f0db211126968b845b4a7b5cafd6a0375aa86617da57e9ae3556ffb2b6e202b3"}
Apr 19 12:30:44.565201 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:44.565129 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-55.ec2.internal" podStartSLOduration=11.565110572 podStartE2EDuration="11.565110572s" podCreationTimestamp="2026-04-19 12:30:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:30:44.564329479 +0000 UTC m=+12.718262960" watchObservedRunningTime="2026-04-19 12:30:44.565110572 +0000 UTC m=+12.719044053"
Apr 19 12:30:45.028515 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:45.028472 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ddd4693b-4c32-466e-be78-8808310a5f1f-original-pull-secret\") pod \"global-pull-secret-syncer-qlgx9\" (UID: \"ddd4693b-4c32-466e-be78-8808310a5f1f\") " pod="kube-system/global-pull-secret-syncer-qlgx9"
Apr 19 12:30:45.028707 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:45.028587 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 19 12:30:45.028707 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:45.028671 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddd4693b-4c32-466e-be78-8808310a5f1f-original-pull-secret podName:ddd4693b-4c32-466e-be78-8808310a5f1f nodeName:}" failed. No retries permitted until 2026-04-19 12:30:53.028651917 +0000 UTC m=+21.182585382 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ddd4693b-4c32-466e-be78-8808310a5f1f-original-pull-secret") pod "global-pull-secret-syncer-qlgx9" (UID: "ddd4693b-4c32-466e-be78-8808310a5f1f") : object "kube-system"/"original-pull-secret" not registered
Apr 19 12:30:45.454015 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:45.453981 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qlgx9"
Apr 19 12:30:45.454234 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:45.454101 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qlgx9" podUID="ddd4693b-4c32-466e-be78-8808310a5f1f"
Apr 19 12:30:45.454234 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:45.454155 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98bqr"
Apr 19 12:30:45.454583 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:45.454295 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98bqr" podUID="720d8932-1617-465d-a213-ebb1e99e6bc6"
Apr 19 12:30:46.454440 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:46.454395 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-txq8s"
Apr 19 12:30:46.454887 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:46.454529 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-txq8s" podUID="54214d38-325e-4791-8e44-1bb6aac2fb3f"
Apr 19 12:30:47.454072 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:47.454038 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98bqr"
Apr 19 12:30:47.454297 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:47.454036 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qlgx9"
Apr 19 12:30:47.454297 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:47.454203 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98bqr" podUID="720d8932-1617-465d-a213-ebb1e99e6bc6"
Apr 19 12:30:47.454297 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:47.454253 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-qlgx9" podUID="ddd4693b-4c32-466e-be78-8808310a5f1f" Apr 19 12:30:48.453775 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:48.453738 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-txq8s" Apr 19 12:30:48.454240 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:48.453853 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-txq8s" podUID="54214d38-325e-4791-8e44-1bb6aac2fb3f" Apr 19 12:30:49.453945 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:49.453907 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98bqr" Apr 19 12:30:49.454398 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:49.453910 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qlgx9" Apr 19 12:30:49.454398 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:49.454036 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-98bqr" podUID="720d8932-1617-465d-a213-ebb1e99e6bc6" Apr 19 12:30:49.454398 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:49.454107 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qlgx9" podUID="ddd4693b-4c32-466e-be78-8808310a5f1f" Apr 19 12:30:50.165136 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:50.165080 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs\") pod \"network-metrics-daemon-98bqr\" (UID: \"720d8932-1617-465d-a213-ebb1e99e6bc6\") " pod="openshift-multus/network-metrics-daemon-98bqr" Apr 19 12:30:50.165310 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:50.165272 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:30:50.165364 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:50.165354 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs podName:720d8932-1617-465d-a213-ebb1e99e6bc6 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:06.165332223 +0000 UTC m=+34.319265688 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs") pod "network-metrics-daemon-98bqr" (UID: "720d8932-1617-465d-a213-ebb1e99e6bc6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:30:50.266422 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:50.266383 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5h4q5\" (UniqueName: \"kubernetes.io/projected/54214d38-325e-4791-8e44-1bb6aac2fb3f-kube-api-access-5h4q5\") pod \"network-check-target-txq8s\" (UID: \"54214d38-325e-4791-8e44-1bb6aac2fb3f\") " pod="openshift-network-diagnostics/network-check-target-txq8s" Apr 19 12:30:50.266575 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:50.266546 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 12:30:50.266575 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:50.266563 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 12:30:50.266575 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:50.266576 2567 projected.go:194] Error preparing data for projected volume kube-api-access-5h4q5 for pod openshift-network-diagnostics/network-check-target-txq8s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:30:50.266718 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:50.266632 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54214d38-325e-4791-8e44-1bb6aac2fb3f-kube-api-access-5h4q5 podName:54214d38-325e-4791-8e44-1bb6aac2fb3f nodeName:}" failed. 
No retries permitted until 2026-04-19 12:31:06.266617373 +0000 UTC m=+34.420550835 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-5h4q5" (UniqueName: "kubernetes.io/projected/54214d38-325e-4791-8e44-1bb6aac2fb3f-kube-api-access-5h4q5") pod "network-check-target-txq8s" (UID: "54214d38-325e-4791-8e44-1bb6aac2fb3f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:30:50.454557 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:50.454474 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-txq8s" Apr 19 12:30:50.454988 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:50.454620 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-txq8s" podUID="54214d38-325e-4791-8e44-1bb6aac2fb3f" Apr 19 12:30:51.454156 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:51.454118 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98bqr" Apr 19 12:30:51.454355 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:51.454118 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qlgx9" Apr 19 12:30:51.454355 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:51.454291 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98bqr" podUID="720d8932-1617-465d-a213-ebb1e99e6bc6" Apr 19 12:30:51.454462 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:51.454362 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qlgx9" podUID="ddd4693b-4c32-466e-be78-8808310a5f1f" Apr 19 12:30:52.454998 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:52.454588 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-txq8s" Apr 19 12:30:52.455733 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:52.455116 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-txq8s" podUID="54214d38-325e-4791-8e44-1bb6aac2fb3f" Apr 19 12:30:52.566769 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:52.566742 2567 generic.go:358] "Generic (PLEG): container finished" podID="3c634550-95fa-4405-a478-4ce4ac61b034" containerID="ec5ef34507e441ae525464663e10ecb1181b6ef9db367526b5987c80975fed9f" exitCode=0 Apr 19 12:30:52.566870 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:52.566826 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6xjjk" event={"ID":"3c634550-95fa-4405-a478-4ce4ac61b034","Type":"ContainerDied","Data":"ec5ef34507e441ae525464663e10ecb1181b6ef9db367526b5987c80975fed9f"} Apr 19 12:30:52.568201 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:52.568152 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jtfsx" event={"ID":"c2abf95c-a107-4a6c-93cd-802a48e2976c","Type":"ContainerStarted","Data":"135546c77471ff48b56da3b026b62a56ecb27c5dadd91d9751b027c8a55dd078"} Apr 19 12:30:52.569648 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:52.569594 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n7t66" event={"ID":"dc6754e8-c6ea-48e0-9ad2-435a13e54b61","Type":"ContainerStarted","Data":"9651c9e01d7bd8d759d403b46a26c93a26410175952cd497bcaadb6c260fca2b"} Apr 19 12:30:52.571137 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:52.571110 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rd647" event={"ID":"d3c440e9-6d1f-46b9-b5c1-3ee825d6e2a4","Type":"ContainerStarted","Data":"f6a9aeef141929633b128b1bbd7c1653db18d52f8a30c449b28d038f84c45dc4"} Apr 19 12:30:52.572476 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:52.572453 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x" 
event={"ID":"0ab8ee79-a102-4394-8421-c7c0b8a462c3","Type":"ContainerStarted","Data":"18fc88b8ad9486e03bacaa7588e490c5a70d02121c2a88913c725089be768edb"} Apr 19 12:30:52.573757 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:52.573734 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jcjln" event={"ID":"0badcc58-d388-42cd-aff8-8b79d1693727","Type":"ContainerStarted","Data":"0454060129cc5e2a35768765ae0831ba83275deda0413f126847ab99c7e5fe6e"} Apr 19 12:30:52.575914 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:52.575894 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wh942" event={"ID":"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9","Type":"ContainerStarted","Data":"e5d5d05db84b295f1bec5d40cd8ff92db223c3d87fcc7001fc225dc721c35826"} Apr 19 12:30:52.576004 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:52.575921 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wh942" event={"ID":"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9","Type":"ContainerStarted","Data":"69c5d97e9f211103e87960c96274e0d80a594de99f013daae071d76425d154cd"} Apr 19 12:30:52.576004 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:52.575935 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wh942" event={"ID":"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9","Type":"ContainerStarted","Data":"8142c97ed0bb8b0d117a99081d3b0b07bcbc3c5e4caa608917c5dd935770d2fc"} Apr 19 12:30:52.577206 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:52.577185 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" event={"ID":"aa115001-5add-4a0d-b7bb-9137e988a754","Type":"ContainerStarted","Data":"362ca4798887d204e475a88473b00f0609705fc782b70feb718f7788198c1154"} Apr 19 12:30:52.599914 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:52.599207 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-node-tuning-operator/tuned-nxpx9" podStartSLOduration=3.192523912 podStartE2EDuration="20.599183361s" podCreationTimestamp="2026-04-19 12:30:32 +0000 UTC" firstStartedPulling="2026-04-19 12:30:34.644220549 +0000 UTC m=+2.798154006" lastFinishedPulling="2026-04-19 12:30:52.050879996 +0000 UTC m=+20.204813455" observedRunningTime="2026-04-19 12:30:52.59822815 +0000 UTC m=+20.752161631" watchObservedRunningTime="2026-04-19 12:30:52.599183361 +0000 UTC m=+20.753116839" Apr 19 12:30:52.611848 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:52.611810 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jcjln" podStartSLOduration=3.228013093 podStartE2EDuration="20.611797608s" podCreationTimestamp="2026-04-19 12:30:32 +0000 UTC" firstStartedPulling="2026-04-19 12:30:34.683322429 +0000 UTC m=+2.837255888" lastFinishedPulling="2026-04-19 12:30:52.067106939 +0000 UTC m=+20.221040403" observedRunningTime="2026-04-19 12:30:52.611685462 +0000 UTC m=+20.765618942" watchObservedRunningTime="2026-04-19 12:30:52.611797608 +0000 UTC m=+20.765731087" Apr 19 12:30:52.629141 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:52.629100 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-jtfsx" podStartSLOduration=12.085814975 podStartE2EDuration="20.629087579s" podCreationTimestamp="2026-04-19 12:30:32 +0000 UTC" firstStartedPulling="2026-04-19 12:30:35.559963622 +0000 UTC m=+3.713897099" lastFinishedPulling="2026-04-19 12:30:44.103236229 +0000 UTC m=+12.257169703" observedRunningTime="2026-04-19 12:30:52.628574293 +0000 UTC m=+20.782507794" watchObservedRunningTime="2026-04-19 12:30:52.629087579 +0000 UTC m=+20.783021085" Apr 19 12:30:52.643421 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:52.643373 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rd647" podStartSLOduration=11.177869951 
podStartE2EDuration="20.643358805s" podCreationTimestamp="2026-04-19 12:30:32 +0000 UTC" firstStartedPulling="2026-04-19 12:30:34.661536888 +0000 UTC m=+2.815470347" lastFinishedPulling="2026-04-19 12:30:44.127025736 +0000 UTC m=+12.280959201" observedRunningTime="2026-04-19 12:30:52.642714226 +0000 UTC m=+20.796647705" watchObservedRunningTime="2026-04-19 12:30:52.643358805 +0000 UTC m=+20.797292284" Apr 19 12:30:52.654403 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:52.654361 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-n7t66" podStartSLOduration=11.223873206 podStartE2EDuration="20.654346368s" podCreationTimestamp="2026-04-19 12:30:32 +0000 UTC" firstStartedPulling="2026-04-19 12:30:34.670458114 +0000 UTC m=+2.824391582" lastFinishedPulling="2026-04-19 12:30:44.100931279 +0000 UTC m=+12.254864744" observedRunningTime="2026-04-19 12:30:52.654140491 +0000 UTC m=+20.808073970" watchObservedRunningTime="2026-04-19 12:30:52.654346368 +0000 UTC m=+20.808279848" Apr 19 12:30:53.087207 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:53.087097 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ddd4693b-4c32-466e-be78-8808310a5f1f-original-pull-secret\") pod \"global-pull-secret-syncer-qlgx9\" (UID: \"ddd4693b-4c32-466e-be78-8808310a5f1f\") " pod="kube-system/global-pull-secret-syncer-qlgx9" Apr 19 12:30:53.087443 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:53.087272 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 19 12:30:53.087443 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:53.087353 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddd4693b-4c32-466e-be78-8808310a5f1f-original-pull-secret podName:ddd4693b-4c32-466e-be78-8808310a5f1f nodeName:}" failed. 
No retries permitted until 2026-04-19 12:31:09.087334292 +0000 UTC m=+37.241267754 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ddd4693b-4c32-466e-be78-8808310a5f1f-original-pull-secret") pod "global-pull-secret-syncer-qlgx9" (UID: "ddd4693b-4c32-466e-be78-8808310a5f1f") : object "kube-system"/"original-pull-secret" not registered Apr 19 12:30:53.453757 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:53.453726 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98bqr" Apr 19 12:30:53.453918 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:53.453725 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qlgx9" Apr 19 12:30:53.453918 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:53.453863 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98bqr" podUID="720d8932-1617-465d-a213-ebb1e99e6bc6" Apr 19 12:30:53.453918 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:53.453907 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-qlgx9" podUID="ddd4693b-4c32-466e-be78-8808310a5f1f" Apr 19 12:30:53.580466 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:53.580426 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7hcbn" event={"ID":"7b7d9484-9728-48a5-a61f-08f5dbf3b0b7","Type":"ContainerStarted","Data":"c7f6f7dc4a7ec6abca4c5656c210e58f6e31d820e142439c8091aa0c74ca8776"} Apr 19 12:30:53.582901 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:53.582867 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wh942" event={"ID":"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9","Type":"ContainerStarted","Data":"5e84adcb964ce7294159a84100110d56fb391dbd13c291e015bcfd19d6e6a357"} Apr 19 12:30:53.582901 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:53.582903 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wh942" event={"ID":"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9","Type":"ContainerStarted","Data":"4d0e1c9be54c0840ee3ce82ef98894f342dad34a86dc3e99bb0d4e10a218b19d"} Apr 19 12:30:53.583090 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:53.582914 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wh942" event={"ID":"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9","Type":"ContainerStarted","Data":"97e18c90bd826934d93a52586ab2b526180769f41df4c9d5011b8e0f8cf27b8d"} Apr 19 12:30:53.886572 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:53.886543 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 19 12:30:54.394532 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:54.394427 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-19T12:30:53.88656382Z","UUID":"f3452f9a-22c9-438d-b6ad-dea11eb1041f","Handler":null,"Name":"","Endpoint":""} Apr 19 12:30:54.396499 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:54.396474 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 19 12:30:54.396499 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:54.396505 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 19 12:30:54.454308 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:54.454274 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-txq8s" Apr 19 12:30:54.454479 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:54.454417 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-txq8s" podUID="54214d38-325e-4791-8e44-1bb6aac2fb3f" Apr 19 12:30:54.587730 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:54.587682 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x" event={"ID":"0ab8ee79-a102-4394-8421-c7c0b8a462c3","Type":"ContainerStarted","Data":"e5a1cf63465a6d46d6e3cf19d6b44414be0bcbfffe0eaf287f1493188e7edffc"} Apr 19 12:30:55.454202 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:55.453938 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qlgx9" Apr 19 12:30:55.454361 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:55.454001 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98bqr" Apr 19 12:30:55.454361 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:55.454321 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qlgx9" podUID="ddd4693b-4c32-466e-be78-8808310a5f1f" Apr 19 12:30:55.454463 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:55.454392 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-98bqr" podUID="720d8932-1617-465d-a213-ebb1e99e6bc6" Apr 19 12:30:55.479780 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:55.479750 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-jtfsx" Apr 19 12:30:55.480517 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:55.480492 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-jtfsx" Apr 19 12:30:55.506639 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:55.506597 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-7hcbn" podStartSLOduration=14.934603139 podStartE2EDuration="23.506584096s" podCreationTimestamp="2026-04-19 12:30:32 +0000 UTC" firstStartedPulling="2026-04-19 12:30:35.558787131 +0000 UTC m=+3.712720603" lastFinishedPulling="2026-04-19 12:30:44.130768088 +0000 UTC m=+12.284701560" observedRunningTime="2026-04-19 12:30:53.591677743 +0000 UTC m=+21.745611222" watchObservedRunningTime="2026-04-19 12:30:55.506584096 +0000 UTC m=+23.660517575" Apr 19 12:30:55.592917 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:55.592879 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wh942" event={"ID":"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9","Type":"ContainerStarted","Data":"fa47c071eef432277783acd19ad42f1aec9dce8a4e185c7d91053d76cea507e8"} Apr 19 12:30:55.595061 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:55.595026 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x" event={"ID":"0ab8ee79-a102-4394-8421-c7c0b8a462c3","Type":"ContainerStarted","Data":"63b3983025c76108ebbcf9faa485ece6dd9644e032dd9dc756c33dc5ee5bd2b1"} Apr 19 12:30:55.595418 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:55.595382 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kube-system/konnectivity-agent-jtfsx" Apr 19 12:30:55.595885 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:55.595870 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-jtfsx" Apr 19 12:30:55.611040 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:55.610993 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4qn8x" podStartSLOduration=3.7771772710000002 podStartE2EDuration="23.610975508s" podCreationTimestamp="2026-04-19 12:30:32 +0000 UTC" firstStartedPulling="2026-04-19 12:30:35.557749195 +0000 UTC m=+3.711682670" lastFinishedPulling="2026-04-19 12:30:55.391547448 +0000 UTC m=+23.545480907" observedRunningTime="2026-04-19 12:30:55.610878494 +0000 UTC m=+23.764811974" watchObservedRunningTime="2026-04-19 12:30:55.610975508 +0000 UTC m=+23.764908989" Apr 19 12:30:56.456655 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:56.456625 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-txq8s" Apr 19 12:30:56.456816 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:56.456736 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-txq8s" podUID="54214d38-325e-4791-8e44-1bb6aac2fb3f" Apr 19 12:30:57.454419 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:57.454149 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qlgx9" Apr 19 12:30:57.455209 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:57.454150 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-98bqr"
Apr 19 12:30:57.455209 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:57.454512 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qlgx9" podUID="ddd4693b-4c32-466e-be78-8808310a5f1f"
Apr 19 12:30:57.455209 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:57.454837 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98bqr" podUID="720d8932-1617-465d-a213-ebb1e99e6bc6"
Apr 19 12:30:57.602902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:57.602870 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wh942" event={"ID":"f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9","Type":"ContainerStarted","Data":"458b087d641361b225b1db69a03f66acdac27fb356cbec5d19151730cb9780ec"}
Apr 19 12:30:57.603339 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:57.603286 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wh942"
Apr 19 12:30:57.603339 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:57.603321 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wh942"
Apr 19 12:30:57.619224 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:57.619195 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wh942"
Apr 19 12:30:57.626155 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:57.626110 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wh942" podStartSLOduration=8.170206222 podStartE2EDuration="25.626093775s" podCreationTimestamp="2026-04-19 12:30:32 +0000 UTC" firstStartedPulling="2026-04-19 12:30:34.674587175 +0000 UTC m=+2.828520637" lastFinishedPulling="2026-04-19 12:30:52.13047472 +0000 UTC m=+20.284408190" observedRunningTime="2026-04-19 12:30:57.625652249 +0000 UTC m=+25.779585753" watchObservedRunningTime="2026-04-19 12:30:57.626093775 +0000 UTC m=+25.780027256"
Apr 19 12:30:58.457257 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:58.457228 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-txq8s"
Apr 19 12:30:58.457675 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:58.457343 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-txq8s" podUID="54214d38-325e-4791-8e44-1bb6aac2fb3f"
Apr 19 12:30:58.605770 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:58.605676 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wh942"
Apr 19 12:30:58.621974 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:58.621941 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wh942"
Apr 19 12:30:58.989731 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:58.989690 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-txq8s"]
Apr 19 12:30:58.989897 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:58.989815 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-txq8s"
Apr 19 12:30:58.989964 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:58.989927 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-txq8s" podUID="54214d38-325e-4791-8e44-1bb6aac2fb3f"
Apr 19 12:30:58.992877 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:58.992680 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-98bqr"]
Apr 19 12:30:58.992877 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:58.992773 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98bqr"
Apr 19 12:30:58.992877 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:58.992873 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98bqr" podUID="720d8932-1617-465d-a213-ebb1e99e6bc6"
Apr 19 12:30:58.993896 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:58.993871 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qlgx9"]
Apr 19 12:30:58.993998 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:58.993964 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qlgx9"
Apr 19 12:30:58.994067 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:30:58.994052 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qlgx9" podUID="ddd4693b-4c32-466e-be78-8808310a5f1f"
Apr 19 12:30:59.608290 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:59.608251 2567 generic.go:358] "Generic (PLEG): container finished" podID="3c634550-95fa-4405-a478-4ce4ac61b034" containerID="9db7997f258eafa9d58d97cc932c50a9e092daee616c0d31d8aa7dcb85344551" exitCode=0
Apr 19 12:30:59.608713 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:30:59.608345 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6xjjk" event={"ID":"3c634550-95fa-4405-a478-4ce4ac61b034","Type":"ContainerDied","Data":"9db7997f258eafa9d58d97cc932c50a9e092daee616c0d31d8aa7dcb85344551"}
Apr 19 12:31:00.456695 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:00.456667 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-txq8s"
Apr 19 12:31:00.456830 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:00.456668 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qlgx9"
Apr 19 12:31:00.456830 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:00.456770 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-txq8s" podUID="54214d38-325e-4791-8e44-1bb6aac2fb3f"
Apr 19 12:31:00.456921 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:00.456677 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98bqr"
Apr 19 12:31:00.456921 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:00.456852 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qlgx9" podUID="ddd4693b-4c32-466e-be78-8808310a5f1f"
Apr 19 12:31:00.456989 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:00.456929 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98bqr" podUID="720d8932-1617-465d-a213-ebb1e99e6bc6"
Apr 19 12:31:00.612298 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:00.612267 2567 generic.go:358] "Generic (PLEG): container finished" podID="3c634550-95fa-4405-a478-4ce4ac61b034" containerID="eba140291c9e87f11a271d7aa8dafb54b5f51b153af2f0245071f6c801e89b96" exitCode=0
Apr 19 12:31:00.612719 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:00.612344 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6xjjk" event={"ID":"3c634550-95fa-4405-a478-4ce4ac61b034","Type":"ContainerDied","Data":"eba140291c9e87f11a271d7aa8dafb54b5f51b153af2f0245071f6c801e89b96"}
Apr 19 12:31:02.456454 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:02.456418 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-txq8s"
Apr 19 12:31:02.456969 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:02.456418 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qlgx9"
Apr 19 12:31:02.456969 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:02.456507 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-txq8s" podUID="54214d38-325e-4791-8e44-1bb6aac2fb3f"
Apr 19 12:31:02.456969 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:02.456612 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qlgx9" podUID="ddd4693b-4c32-466e-be78-8808310a5f1f"
Apr 19 12:31:02.456969 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:02.456430 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98bqr"
Apr 19 12:31:02.456969 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:02.456705 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98bqr" podUID="720d8932-1617-465d-a213-ebb1e99e6bc6"
Apr 19 12:31:02.618367 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:02.618337 2567 generic.go:358] "Generic (PLEG): container finished" podID="3c634550-95fa-4405-a478-4ce4ac61b034" containerID="5d5a5e4b39332198b6f2555f3414f2c3b1ab359262ca3889fe11de8de347582a" exitCode=0
Apr 19 12:31:02.618533 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:02.618397 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6xjjk" event={"ID":"3c634550-95fa-4405-a478-4ce4ac61b034","Type":"ContainerDied","Data":"5d5a5e4b39332198b6f2555f3414f2c3b1ab359262ca3889fe11de8de347582a"}
Apr 19 12:31:04.453571 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.453533 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-txq8s"
Apr 19 12:31:04.454137 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:04.453636 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-txq8s" podUID="54214d38-325e-4791-8e44-1bb6aac2fb3f"
Apr 19 12:31:04.454137 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.453647 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98bqr"
Apr 19 12:31:04.454137 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.453661 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qlgx9"
Apr 19 12:31:04.454137 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:04.453759 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qlgx9" podUID="ddd4693b-4c32-466e-be78-8808310a5f1f"
Apr 19 12:31:04.454137 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:04.453836 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98bqr" podUID="720d8932-1617-465d-a213-ebb1e99e6bc6"
Apr 19 12:31:04.695574 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.695537 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-55.ec2.internal" event="NodeReady"
Apr 19 12:31:04.695759 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.695709 2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 19 12:31:04.727919 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.727831 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"]
Apr 19 12:31:04.730900 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.730874 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-bns5n"]
Apr 19 12:31:04.731070 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.731046 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:31:04.733299 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.733148 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 19 12:31:04.733456 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.733432 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 19 12:31:04.733456 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.733450 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-v66mr\""
Apr 19 12:31:04.733575 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.733473 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 19 12:31:04.733851 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.733829 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bns5n"
Apr 19 12:31:04.735553 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.735529 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 19 12:31:04.735720 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.735599 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-6mqh9\""
Apr 19 12:31:04.735946 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.735924 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 19 12:31:04.738887 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.738867 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 19 12:31:04.742237 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.742216 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-29mkc"]
Apr 19 12:31:04.745190 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.745146 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-29mkc"
Apr 19 12:31:04.745904 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.745881 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-bns5n"]
Apr 19 12:31:04.746621 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.746599 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"]
Apr 19 12:31:04.747605 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.747588 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xbkh5\""
Apr 19 12:31:04.747756 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.747738 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 19 12:31:04.747831 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.747772 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 19 12:31:04.747889 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.747844 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 19 12:31:04.754381 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.754340 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-29mkc"]
Apr 19 12:31:04.838371 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.838340 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-nb68t"]
Apr 19 12:31:04.841308 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.841291 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nb68t"
Apr 19 12:31:04.843072 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.843054 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 19 12:31:04.843193 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.843074 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 19 12:31:04.843193 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.843074 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-62vx7\""
Apr 19 12:31:04.849694 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.849675 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nb68t"]
Apr 19 12:31:04.879631 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.879606 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5svm\" (UniqueName: \"kubernetes.io/projected/f68c88a5-5b83-4fd0-92df-327974a7cc96-kube-api-access-j5svm\") pod \"ingress-canary-29mkc\" (UID: \"f68c88a5-5b83-4fd0-92df-327974a7cc96\") " pod="openshift-ingress-canary/ingress-canary-29mkc"
Apr 19 12:31:04.879768 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.879635 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/343439fa-d125-4243-ac43-c00e012201b9-image-registry-private-configuration\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:31:04.879768 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.879659 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vwnp\" (UniqueName: \"kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-kube-api-access-6vwnp\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:31:04.879768 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.879682 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert\") pod \"ingress-canary-29mkc\" (UID: \"f68c88a5-5b83-4fd0-92df-327974a7cc96\") " pod="openshift-ingress-canary/ingress-canary-29mkc"
Apr 19 12:31:04.879768 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.879701 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/343439fa-d125-4243-ac43-c00e012201b9-registry-certificates\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:31:04.879768 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.879720 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/343439fa-d125-4243-ac43-c00e012201b9-trusted-ca\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:31:04.879768 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.879735 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-bound-sa-token\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:31:04.879768 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.879753 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:31:04.879995 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.879823 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e88a41fd-9d7d-457b-af51-169ee562d266-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-bns5n\" (UID: \"e88a41fd-9d7d-457b-af51-169ee562d266\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bns5n"
Apr 19 12:31:04.879995 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.879842 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/343439fa-d125-4243-ac43-c00e012201b9-ca-trust-extracted\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:31:04.879995 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.879859 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/343439fa-d125-4243-ac43-c00e012201b9-installation-pull-secrets\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:31:04.879995 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.879877 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bns5n\" (UID: \"e88a41fd-9d7d-457b-af51-169ee562d266\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bns5n"
Apr 19 12:31:04.981095 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.981010 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5svm\" (UniqueName: \"kubernetes.io/projected/f68c88a5-5b83-4fd0-92df-327974a7cc96-kube-api-access-j5svm\") pod \"ingress-canary-29mkc\" (UID: \"f68c88a5-5b83-4fd0-92df-327974a7cc96\") " pod="openshift-ingress-canary/ingress-canary-29mkc"
Apr 19 12:31:04.981095 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.981058 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/343439fa-d125-4243-ac43-c00e012201b9-image-registry-private-configuration\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:31:04.981095 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.981091 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vwnp\" (UniqueName: \"kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-kube-api-access-6vwnp\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:31:04.981487 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.981117 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls\") pod \"dns-default-nb68t\" (UID: \"c83beaf5-2d24-4163-855a-f4c6d55b0311\") " pod="openshift-dns/dns-default-nb68t"
Apr 19 12:31:04.981487 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.981182 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert\") pod \"ingress-canary-29mkc\" (UID: \"f68c88a5-5b83-4fd0-92df-327974a7cc96\") " pod="openshift-ingress-canary/ingress-canary-29mkc"
Apr 19 12:31:04.981487 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.981209 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/343439fa-d125-4243-ac43-c00e012201b9-registry-certificates\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:31:04.981487 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.981235 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/343439fa-d125-4243-ac43-c00e012201b9-trusted-ca\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:31:04.981487 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.981261 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-bound-sa-token\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:31:04.981487 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:04.981274 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:31:04.981487 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.981295 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7t6p\" (UniqueName: \"kubernetes.io/projected/c83beaf5-2d24-4163-855a-f4c6d55b0311-kube-api-access-d7t6p\") pod \"dns-default-nb68t\" (UID: \"c83beaf5-2d24-4163-855a-f4c6d55b0311\") " pod="openshift-dns/dns-default-nb68t"
Apr 19 12:31:04.981487 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.981324 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:31:04.981487 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:04.981339 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert podName:f68c88a5-5b83-4fd0-92df-327974a7cc96 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:05.481318091 +0000 UTC m=+33.635251548 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert") pod "ingress-canary-29mkc" (UID: "f68c88a5-5b83-4fd0-92df-327974a7cc96") : secret "canary-serving-cert" not found
Apr 19 12:31:04.981487 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:04.981391 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 19 12:31:04.981487 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:04.981405 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cb67c5f8f-b7rwf: secret "image-registry-tls" not found
Apr 19 12:31:04.981487 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.981411 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e88a41fd-9d7d-457b-af51-169ee562d266-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-bns5n\" (UID: \"e88a41fd-9d7d-457b-af51-169ee562d266\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bns5n"
Apr 19 12:31:04.981487 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.981437 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/343439fa-d125-4243-ac43-c00e012201b9-ca-trust-extracted\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:31:04.981487 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:04.981459 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls podName:343439fa-d125-4243-ac43-c00e012201b9 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:05.481442907 +0000 UTC m=+33.635376379 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls") pod "image-registry-5cb67c5f8f-b7rwf" (UID: "343439fa-d125-4243-ac43-c00e012201b9") : secret "image-registry-tls" not found
Apr 19 12:31:04.981487 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.981492 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/343439fa-d125-4243-ac43-c00e012201b9-installation-pull-secrets\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:31:04.982311 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.981519 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c83beaf5-2d24-4163-855a-f4c6d55b0311-config-volume\") pod \"dns-default-nb68t\" (UID: \"c83beaf5-2d24-4163-855a-f4c6d55b0311\") " pod="openshift-dns/dns-default-nb68t"
Apr 19 12:31:04.982311 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.981549 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bns5n\" (UID: \"e88a41fd-9d7d-457b-af51-169ee562d266\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bns5n"
Apr 19 12:31:04.982311 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.981579 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c83beaf5-2d24-4163-855a-f4c6d55b0311-tmp-dir\") pod \"dns-default-nb68t\" (UID: \"c83beaf5-2d24-4163-855a-f4c6d55b0311\") " pod="openshift-dns/dns-default-nb68t"
Apr 19 12:31:04.982311 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:04.981773 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 19 12:31:04.982311 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:04.981827 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert podName:e88a41fd-9d7d-457b-af51-169ee562d266 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:05.481810719 +0000 UTC m=+33.635744177 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bns5n" (UID: "e88a41fd-9d7d-457b-af51-169ee562d266") : secret "networking-console-plugin-cert" not found
Apr 19 12:31:04.982311 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.982000 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/343439fa-d125-4243-ac43-c00e012201b9-ca-trust-extracted\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:31:04.982311 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.982212 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/343439fa-d125-4243-ac43-c00e012201b9-registry-certificates\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:31:04.982647 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.982377 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e88a41fd-9d7d-457b-af51-169ee562d266-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-bns5n\" (UID: \"e88a41fd-9d7d-457b-af51-169ee562d266\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bns5n"
Apr 19 12:31:04.982647 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.982451 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/343439fa-d125-4243-ac43-c00e012201b9-trusted-ca\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:31:04.986510 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.986487 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/343439fa-d125-4243-ac43-c00e012201b9-image-registry-private-configuration\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:31:04.986606 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.986491 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/343439fa-d125-4243-ac43-c00e012201b9-installation-pull-secrets\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:31:04.993005 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.992956 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5svm\" (UniqueName: \"kubernetes.io/projected/f68c88a5-5b83-4fd0-92df-327974a7cc96-kube-api-access-j5svm\") pod \"ingress-canary-29mkc\" (UID: \"f68c88a5-5b83-4fd0-92df-327974a7cc96\") " pod="openshift-ingress-canary/ingress-canary-29mkc"
Apr 19
12:31:04.993178 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.993142 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vwnp\" (UniqueName: \"kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-kube-api-access-6vwnp\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf" Apr 19 12:31:04.993405 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:04.993386 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-bound-sa-token\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf" Apr 19 12:31:05.082664 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:05.082623 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls\") pod \"dns-default-nb68t\" (UID: \"c83beaf5-2d24-4163-855a-f4c6d55b0311\") " pod="openshift-dns/dns-default-nb68t" Apr 19 12:31:05.082813 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:05.082694 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7t6p\" (UniqueName: \"kubernetes.io/projected/c83beaf5-2d24-4163-855a-f4c6d55b0311-kube-api-access-d7t6p\") pod \"dns-default-nb68t\" (UID: \"c83beaf5-2d24-4163-855a-f4c6d55b0311\") " pod="openshift-dns/dns-default-nb68t" Apr 19 12:31:05.082813 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:05.082784 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 12:31:05.082813 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:05.082805 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/c83beaf5-2d24-4163-855a-f4c6d55b0311-config-volume\") pod \"dns-default-nb68t\" (UID: \"c83beaf5-2d24-4163-855a-f4c6d55b0311\") " pod="openshift-dns/dns-default-nb68t" Apr 19 12:31:05.082970 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:05.082851 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls podName:c83beaf5-2d24-4163-855a-f4c6d55b0311 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:05.582835863 +0000 UTC m=+33.736769321 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls") pod "dns-default-nb68t" (UID: "c83beaf5-2d24-4163-855a-f4c6d55b0311") : secret "dns-default-metrics-tls" not found Apr 19 12:31:05.082970 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:05.082875 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c83beaf5-2d24-4163-855a-f4c6d55b0311-tmp-dir\") pod \"dns-default-nb68t\" (UID: \"c83beaf5-2d24-4163-855a-f4c6d55b0311\") " pod="openshift-dns/dns-default-nb68t" Apr 19 12:31:05.083204 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:05.083189 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c83beaf5-2d24-4163-855a-f4c6d55b0311-tmp-dir\") pod \"dns-default-nb68t\" (UID: \"c83beaf5-2d24-4163-855a-f4c6d55b0311\") " pod="openshift-dns/dns-default-nb68t" Apr 19 12:31:05.083412 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:05.083396 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c83beaf5-2d24-4163-855a-f4c6d55b0311-config-volume\") pod \"dns-default-nb68t\" (UID: \"c83beaf5-2d24-4163-855a-f4c6d55b0311\") " pod="openshift-dns/dns-default-nb68t" Apr 
19 12:31:05.092222 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:05.092195 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7t6p\" (UniqueName: \"kubernetes.io/projected/c83beaf5-2d24-4163-855a-f4c6d55b0311-kube-api-access-d7t6p\") pod \"dns-default-nb68t\" (UID: \"c83beaf5-2d24-4163-855a-f4c6d55b0311\") " pod="openshift-dns/dns-default-nb68t" Apr 19 12:31:05.485987 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:05.485950 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf" Apr 19 12:31:05.486675 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:05.486047 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bns5n\" (UID: \"e88a41fd-9d7d-457b-af51-169ee562d266\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bns5n" Apr 19 12:31:05.486675 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:05.486105 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 19 12:31:05.486675 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:05.486122 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cb67c5f8f-b7rwf: secret "image-registry-tls" not found Apr 19 12:31:05.486675 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:05.486130 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert\") pod \"ingress-canary-29mkc\" (UID: \"f68c88a5-5b83-4fd0-92df-327974a7cc96\") " pod="openshift-ingress-canary/ingress-canary-29mkc" Apr 19 12:31:05.486675 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:05.486199 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls podName:343439fa-d125-4243-ac43-c00e012201b9 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:06.48618099 +0000 UTC m=+34.640114456 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls") pod "image-registry-5cb67c5f8f-b7rwf" (UID: "343439fa-d125-4243-ac43-c00e012201b9") : secret "image-registry-tls" not found Apr 19 12:31:05.486675 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:05.486220 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 19 12:31:05.486675 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:05.486234 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 12:31:05.486675 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:05.486269 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert podName:f68c88a5-5b83-4fd0-92df-327974a7cc96 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:06.486258859 +0000 UTC m=+34.640192323 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert") pod "ingress-canary-29mkc" (UID: "f68c88a5-5b83-4fd0-92df-327974a7cc96") : secret "canary-serving-cert" not found Apr 19 12:31:05.486675 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:05.486294 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert podName:e88a41fd-9d7d-457b-af51-169ee562d266 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:06.486276307 +0000 UTC m=+34.640209773 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bns5n" (UID: "e88a41fd-9d7d-457b-af51-169ee562d266") : secret "networking-console-plugin-cert" not found Apr 19 12:31:05.586835 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:05.586796 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls\") pod \"dns-default-nb68t\" (UID: \"c83beaf5-2d24-4163-855a-f4c6d55b0311\") " pod="openshift-dns/dns-default-nb68t" Apr 19 12:31:05.587020 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:05.586907 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 12:31:05.587020 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:05.586976 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls podName:c83beaf5-2d24-4163-855a-f4c6d55b0311 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:06.586956008 +0000 UTC m=+34.740889471 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls") pod "dns-default-nb68t" (UID: "c83beaf5-2d24-4163-855a-f4c6d55b0311") : secret "dns-default-metrics-tls" not found Apr 19 12:31:06.190803 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:06.190762 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs\") pod \"network-metrics-daemon-98bqr\" (UID: \"720d8932-1617-465d-a213-ebb1e99e6bc6\") " pod="openshift-multus/network-metrics-daemon-98bqr" Apr 19 12:31:06.191001 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:06.190909 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:31:06.191001 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:06.190971 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs podName:720d8932-1617-465d-a213-ebb1e99e6bc6 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:38.190956873 +0000 UTC m=+66.344890330 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs") pod "network-metrics-daemon-98bqr" (UID: "720d8932-1617-465d-a213-ebb1e99e6bc6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:31:06.291463 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:06.291421 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5h4q5\" (UniqueName: \"kubernetes.io/projected/54214d38-325e-4791-8e44-1bb6aac2fb3f-kube-api-access-5h4q5\") pod \"network-check-target-txq8s\" (UID: \"54214d38-325e-4791-8e44-1bb6aac2fb3f\") " pod="openshift-network-diagnostics/network-check-target-txq8s" Apr 19 12:31:06.291679 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:06.291613 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 12:31:06.291679 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:06.291641 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 12:31:06.291679 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:06.291654 2567 projected.go:194] Error preparing data for projected volume kube-api-access-5h4q5 for pod openshift-network-diagnostics/network-check-target-txq8s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:31:06.291861 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:06.291721 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54214d38-325e-4791-8e44-1bb6aac2fb3f-kube-api-access-5h4q5 podName:54214d38-325e-4791-8e44-1bb6aac2fb3f nodeName:}" failed. 
No retries permitted until 2026-04-19 12:31:38.291702343 +0000 UTC m=+66.445635807 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-5h4q5" (UniqueName: "kubernetes.io/projected/54214d38-325e-4791-8e44-1bb6aac2fb3f-kube-api-access-5h4q5") pod "network-check-target-txq8s" (UID: "54214d38-325e-4791-8e44-1bb6aac2fb3f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:31:06.454657 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:06.454572 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-txq8s" Apr 19 12:31:06.454820 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:06.454581 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98bqr" Apr 19 12:31:06.454989 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:06.454582 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qlgx9" Apr 19 12:31:06.457543 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:06.457286 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 19 12:31:06.457543 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:06.457303 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4sh8l\"" Apr 19 12:31:06.457543 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:06.457295 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 19 12:31:06.457543 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:06.457355 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nbhvx\"" Apr 19 12:31:06.457543 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:06.457355 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 19 12:31:06.457543 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:06.457401 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 19 12:31:06.492846 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:06.492817 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert\") pod \"ingress-canary-29mkc\" (UID: \"f68c88a5-5b83-4fd0-92df-327974a7cc96\") " pod="openshift-ingress-canary/ingress-canary-29mkc" Apr 19 12:31:06.493227 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:06.492869 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf" Apr 19 12:31:06.493227 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:06.492942 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bns5n\" (UID: \"e88a41fd-9d7d-457b-af51-169ee562d266\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bns5n" Apr 19 12:31:06.493227 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:06.492963 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 12:31:06.493227 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:06.493025 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert podName:f68c88a5-5b83-4fd0-92df-327974a7cc96 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:08.493010013 +0000 UTC m=+36.646943495 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert") pod "ingress-canary-29mkc" (UID: "f68c88a5-5b83-4fd0-92df-327974a7cc96") : secret "canary-serving-cert" not found Apr 19 12:31:06.493227 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:06.493074 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 19 12:31:06.493227 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:06.493097 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cb67c5f8f-b7rwf: secret "image-registry-tls" not found Apr 19 12:31:06.493227 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:06.493146 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls podName:343439fa-d125-4243-ac43-c00e012201b9 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:08.49312916 +0000 UTC m=+36.647062624 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls") pod "image-registry-5cb67c5f8f-b7rwf" (UID: "343439fa-d125-4243-ac43-c00e012201b9") : secret "image-registry-tls" not found Apr 19 12:31:06.493227 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:06.493076 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 19 12:31:06.493227 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:06.493233 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert podName:e88a41fd-9d7d-457b-af51-169ee562d266 nodeName:}" failed. 
No retries permitted until 2026-04-19 12:31:08.493220657 +0000 UTC m=+36.647154143 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bns5n" (UID: "e88a41fd-9d7d-457b-af51-169ee562d266") : secret "networking-console-plugin-cert" not found Apr 19 12:31:06.593491 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:06.593452 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls\") pod \"dns-default-nb68t\" (UID: \"c83beaf5-2d24-4163-855a-f4c6d55b0311\") " pod="openshift-dns/dns-default-nb68t" Apr 19 12:31:06.593652 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:06.593611 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 12:31:06.593725 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:06.593678 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls podName:c83beaf5-2d24-4163-855a-f4c6d55b0311 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:08.59365911 +0000 UTC m=+36.747592575 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls") pod "dns-default-nb68t" (UID: "c83beaf5-2d24-4163-855a-f4c6d55b0311") : secret "dns-default-metrics-tls" not found Apr 19 12:31:08.507834 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:08.507597 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert\") pod \"ingress-canary-29mkc\" (UID: \"f68c88a5-5b83-4fd0-92df-327974a7cc96\") " pod="openshift-ingress-canary/ingress-canary-29mkc" Apr 19 12:31:08.508290 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:08.507862 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf" Apr 19 12:31:08.508290 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:08.507769 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 12:31:08.508290 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:08.507932 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bns5n\" (UID: \"e88a41fd-9d7d-457b-af51-169ee562d266\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bns5n" Apr 19 12:31:08.508290 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:08.507979 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert 
podName:f68c88a5-5b83-4fd0-92df-327974a7cc96 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:12.507956341 +0000 UTC m=+40.661889815 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert") pod "ingress-canary-29mkc" (UID: "f68c88a5-5b83-4fd0-92df-327974a7cc96") : secret "canary-serving-cert" not found Apr 19 12:31:08.508290 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:08.508032 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 19 12:31:08.508290 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:08.508063 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 19 12:31:08.508290 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:08.508081 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cb67c5f8f-b7rwf: secret "image-registry-tls" not found Apr 19 12:31:08.508290 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:08.508092 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert podName:e88a41fd-9d7d-457b-af51-169ee562d266 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:12.508077034 +0000 UTC m=+40.662010494 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bns5n" (UID: "e88a41fd-9d7d-457b-af51-169ee562d266") : secret "networking-console-plugin-cert" not found Apr 19 12:31:08.508290 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:08.508124 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls podName:343439fa-d125-4243-ac43-c00e012201b9 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:12.508110748 +0000 UTC m=+40.662044214 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls") pod "image-registry-5cb67c5f8f-b7rwf" (UID: "343439fa-d125-4243-ac43-c00e012201b9") : secret "image-registry-tls" not found Apr 19 12:31:08.608404 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:08.608366 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls\") pod \"dns-default-nb68t\" (UID: \"c83beaf5-2d24-4163-855a-f4c6d55b0311\") " pod="openshift-dns/dns-default-nb68t" Apr 19 12:31:08.608588 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:08.608506 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 12:31:08.608652 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:08.608590 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls podName:c83beaf5-2d24-4163-855a-f4c6d55b0311 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:12.60856848 +0000 UTC m=+40.762501955 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls") pod "dns-default-nb68t" (UID: "c83beaf5-2d24-4163-855a-f4c6d55b0311") : secret "dns-default-metrics-tls" not found
Apr 19 12:31:09.112747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:09.112701 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ddd4693b-4c32-466e-be78-8808310a5f1f-original-pull-secret\") pod \"global-pull-secret-syncer-qlgx9\" (UID: \"ddd4693b-4c32-466e-be78-8808310a5f1f\") " pod="kube-system/global-pull-secret-syncer-qlgx9"
Apr 19 12:31:09.116040 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:09.116002 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ddd4693b-4c32-466e-be78-8808310a5f1f-original-pull-secret\") pod \"global-pull-secret-syncer-qlgx9\" (UID: \"ddd4693b-4c32-466e-be78-8808310a5f1f\") " pod="kube-system/global-pull-secret-syncer-qlgx9"
Apr 19 12:31:09.181648 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:09.181613 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qlgx9"
Apr 19 12:31:10.031301 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:10.031266 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qlgx9"]
Apr 19 12:31:10.045030 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:31:10.044988 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddd4693b_4c32_466e_be78_8808310a5f1f.slice/crio-7fe0be6f09d50a1efc6ea3d34126e03622494c422b4399830690bab403297116 WatchSource:0}: Error finding container 7fe0be6f09d50a1efc6ea3d34126e03622494c422b4399830690bab403297116: Status 404 returned error can't find the container with id 7fe0be6f09d50a1efc6ea3d34126e03622494c422b4399830690bab403297116
Apr 19 12:31:10.640950 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:10.640909 2567 generic.go:358] "Generic (PLEG): container finished" podID="3c634550-95fa-4405-a478-4ce4ac61b034" containerID="5605cc6049e976d93cc8cc6528b85e41f951f21ce6f2e510c6428b00e3ff9d4b" exitCode=0
Apr 19 12:31:10.641097 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:10.640981 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6xjjk" event={"ID":"3c634550-95fa-4405-a478-4ce4ac61b034","Type":"ContainerDied","Data":"5605cc6049e976d93cc8cc6528b85e41f951f21ce6f2e510c6428b00e3ff9d4b"}
Apr 19 12:31:10.642010 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:10.641990 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qlgx9" event={"ID":"ddd4693b-4c32-466e-be78-8808310a5f1f","Type":"ContainerStarted","Data":"7fe0be6f09d50a1efc6ea3d34126e03622494c422b4399830690bab403297116"}
Apr 19 12:31:11.646621 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:11.646436 2567 generic.go:358] "Generic (PLEG): container finished" podID="3c634550-95fa-4405-a478-4ce4ac61b034" containerID="8f3d8ea15d488aef7dd2c20c0c956e8381651f89b281dcda1cb755c2c9180e54" exitCode=0
Apr 19 12:31:11.646621 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:11.646514 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6xjjk" event={"ID":"3c634550-95fa-4405-a478-4ce4ac61b034","Type":"ContainerDied","Data":"8f3d8ea15d488aef7dd2c20c0c956e8381651f89b281dcda1cb755c2c9180e54"}
Apr 19 12:31:12.542151 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:12.542115 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bns5n\" (UID: \"e88a41fd-9d7d-457b-af51-169ee562d266\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bns5n"
Apr 19 12:31:12.542379 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:12.542228 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert\") pod \"ingress-canary-29mkc\" (UID: \"f68c88a5-5b83-4fd0-92df-327974a7cc96\") " pod="openshift-ingress-canary/ingress-canary-29mkc"
Apr 19 12:31:12.542379 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:12.542263 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:31:12.542379 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:12.542269 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 19 12:31:12.542379 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:12.542363 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 19 12:31:12.542379 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:12.542381 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cb67c5f8f-b7rwf: secret "image-registry-tls" not found
Apr 19 12:31:12.542621 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:12.542308 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:31:12.542621 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:12.542381 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert podName:e88a41fd-9d7d-457b-af51-169ee562d266 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:20.542363272 +0000 UTC m=+48.696296747 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bns5n" (UID: "e88a41fd-9d7d-457b-af51-169ee562d266") : secret "networking-console-plugin-cert" not found
Apr 19 12:31:12.542621 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:12.542454 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls podName:343439fa-d125-4243-ac43-c00e012201b9 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:20.542440149 +0000 UTC m=+48.696373611 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls") pod "image-registry-5cb67c5f8f-b7rwf" (UID: "343439fa-d125-4243-ac43-c00e012201b9") : secret "image-registry-tls" not found
Apr 19 12:31:12.542621 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:12.542464 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert podName:f68c88a5-5b83-4fd0-92df-327974a7cc96 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:20.542458392 +0000 UTC m=+48.696391850 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert") pod "ingress-canary-29mkc" (UID: "f68c88a5-5b83-4fd0-92df-327974a7cc96") : secret "canary-serving-cert" not found
Apr 19 12:31:12.642739 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:12.642698 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls\") pod \"dns-default-nb68t\" (UID: \"c83beaf5-2d24-4163-855a-f4c6d55b0311\") " pod="openshift-dns/dns-default-nb68t"
Apr 19 12:31:12.642908 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:12.642866 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:31:12.642950 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:12.642941 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls podName:c83beaf5-2d24-4163-855a-f4c6d55b0311 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:20.642921338 +0000 UTC m=+48.796854802 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls") pod "dns-default-nb68t" (UID: "c83beaf5-2d24-4163-855a-f4c6d55b0311") : secret "dns-default-metrics-tls" not found
Apr 19 12:31:12.652026 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:12.651997 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6xjjk" event={"ID":"3c634550-95fa-4405-a478-4ce4ac61b034","Type":"ContainerStarted","Data":"23a85d8e6f513fc4bfda13f7c72a2ef3a68e14f4a61abd2b756515ee5a4d9fcc"}
Apr 19 12:31:12.672200 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:12.672135 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6xjjk" podStartSLOduration=5.134838584 podStartE2EDuration="40.672117739s" podCreationTimestamp="2026-04-19 12:30:32 +0000 UTC" firstStartedPulling="2026-04-19 12:30:34.639559637 +0000 UTC m=+2.793493095" lastFinishedPulling="2026-04-19 12:31:10.176838788 +0000 UTC m=+38.330772250" observedRunningTime="2026-04-19 12:31:12.670713722 +0000 UTC m=+40.824647202" watchObservedRunningTime="2026-04-19 12:31:12.672117739 +0000 UTC m=+40.826051219"
Apr 19 12:31:15.659810 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:15.659774 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qlgx9" event={"ID":"ddd4693b-4c32-466e-be78-8808310a5f1f","Type":"ContainerStarted","Data":"95d3434f35ae1e4fec7c1d7567f0fd2cf6c65819c61e57df45881cb25683448b"}
Apr 19 12:31:15.673254 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:15.673150 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-qlgx9" podStartSLOduration=33.988925027 podStartE2EDuration="38.673132993s" podCreationTimestamp="2026-04-19 12:30:37 +0000 UTC" firstStartedPulling="2026-04-19 12:31:10.154940949 +0000 UTC m=+38.308874407" lastFinishedPulling="2026-04-19 12:31:14.839148911 +0000 UTC m=+42.993082373" observedRunningTime="2026-04-19 12:31:15.672355779 +0000 UTC m=+43.826289259" watchObservedRunningTime="2026-04-19 12:31:15.673132993 +0000 UTC m=+43.827066474"
Apr 19 12:31:20.606003 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:20.605960 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert\") pod \"ingress-canary-29mkc\" (UID: \"f68c88a5-5b83-4fd0-92df-327974a7cc96\") " pod="openshift-ingress-canary/ingress-canary-29mkc"
Apr 19 12:31:20.606541 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:20.606016 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:31:20.606541 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:20.606080 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bns5n\" (UID: \"e88a41fd-9d7d-457b-af51-169ee562d266\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bns5n"
Apr 19 12:31:20.606541 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:20.606138 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 19 12:31:20.606541 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:20.606185 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cb67c5f8f-b7rwf: secret "image-registry-tls" not found
Apr 19 12:31:20.606541 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:20.606232 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 19 12:31:20.606541 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:20.606137 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:31:20.606541 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:20.606251 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls podName:343439fa-d125-4243-ac43-c00e012201b9 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:36.606236186 +0000 UTC m=+64.760169648 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls") pod "image-registry-5cb67c5f8f-b7rwf" (UID: "343439fa-d125-4243-ac43-c00e012201b9") : secret "image-registry-tls" not found
Apr 19 12:31:20.606541 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:20.606346 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert podName:e88a41fd-9d7d-457b-af51-169ee562d266 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:36.606328297 +0000 UTC m=+64.760261759 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bns5n" (UID: "e88a41fd-9d7d-457b-af51-169ee562d266") : secret "networking-console-plugin-cert" not found
Apr 19 12:31:20.606541 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:20.606359 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert podName:f68c88a5-5b83-4fd0-92df-327974a7cc96 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:36.606352643 +0000 UTC m=+64.760286101 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert") pod "ingress-canary-29mkc" (UID: "f68c88a5-5b83-4fd0-92df-327974a7cc96") : secret "canary-serving-cert" not found
Apr 19 12:31:20.707010 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:20.706976 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls\") pod \"dns-default-nb68t\" (UID: \"c83beaf5-2d24-4163-855a-f4c6d55b0311\") " pod="openshift-dns/dns-default-nb68t"
Apr 19 12:31:20.707222 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:20.707124 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:31:20.707222 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:20.707213 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls podName:c83beaf5-2d24-4163-855a-f4c6d55b0311 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:36.707193692 +0000 UTC m=+64.861127150 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls") pod "dns-default-nb68t" (UID: "c83beaf5-2d24-4163-855a-f4c6d55b0311") : secret "dns-default-metrics-tls" not found
Apr 19 12:31:30.622784 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:30.622754 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wh942"
Apr 19 12:31:36.629747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:36.629697 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bns5n\" (UID: \"e88a41fd-9d7d-457b-af51-169ee562d266\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bns5n"
Apr 19 12:31:36.630319 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:36.629777 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert\") pod \"ingress-canary-29mkc\" (UID: \"f68c88a5-5b83-4fd0-92df-327974a7cc96\") " pod="openshift-ingress-canary/ingress-canary-29mkc"
Apr 19 12:31:36.630319 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:36.629802 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:31:36.630319 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:36.629864 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 19 12:31:36.630319 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:36.629885 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 19 12:31:36.630319 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:36.629896 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cb67c5f8f-b7rwf: secret "image-registry-tls" not found
Apr 19 12:31:36.630319 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:36.629899 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:31:36.630319 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:36.629942 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls podName:343439fa-d125-4243-ac43-c00e012201b9 nodeName:}" failed. No retries permitted until 2026-04-19 12:32:08.62992702 +0000 UTC m=+96.783860478 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls") pod "image-registry-5cb67c5f8f-b7rwf" (UID: "343439fa-d125-4243-ac43-c00e012201b9") : secret "image-registry-tls" not found
Apr 19 12:31:36.630319 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:36.629955 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert podName:f68c88a5-5b83-4fd0-92df-327974a7cc96 nodeName:}" failed. No retries permitted until 2026-04-19 12:32:08.629948657 +0000 UTC m=+96.783882114 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert") pod "ingress-canary-29mkc" (UID: "f68c88a5-5b83-4fd0-92df-327974a7cc96") : secret "canary-serving-cert" not found
Apr 19 12:31:36.630319 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:36.629965 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert podName:e88a41fd-9d7d-457b-af51-169ee562d266 nodeName:}" failed. No retries permitted until 2026-04-19 12:32:08.629960122 +0000 UTC m=+96.783893580 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bns5n" (UID: "e88a41fd-9d7d-457b-af51-169ee562d266") : secret "networking-console-plugin-cert" not found
Apr 19 12:31:36.730780 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:36.730734 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls\") pod \"dns-default-nb68t\" (UID: \"c83beaf5-2d24-4163-855a-f4c6d55b0311\") " pod="openshift-dns/dns-default-nb68t"
Apr 19 12:31:36.730958 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:36.730900 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:31:36.730997 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:36.730971 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls podName:c83beaf5-2d24-4163-855a-f4c6d55b0311 nodeName:}" failed. No retries permitted until 2026-04-19 12:32:08.730954825 +0000 UTC m=+96.884888282 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls") pod "dns-default-nb68t" (UID: "c83beaf5-2d24-4163-855a-f4c6d55b0311") : secret "dns-default-metrics-tls" not found
Apr 19 12:31:38.243651 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:38.243611 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs\") pod \"network-metrics-daemon-98bqr\" (UID: \"720d8932-1617-465d-a213-ebb1e99e6bc6\") " pod="openshift-multus/network-metrics-daemon-98bqr"
Apr 19 12:31:38.245789 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:38.245771 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 19 12:31:38.254645 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:38.254626 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 19 12:31:38.254726 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:31:38.254694 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs podName:720d8932-1617-465d-a213-ebb1e99e6bc6 nodeName:}" failed. No retries permitted until 2026-04-19 12:32:42.254671194 +0000 UTC m=+130.408604655 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs") pod "network-metrics-daemon-98bqr" (UID: "720d8932-1617-465d-a213-ebb1e99e6bc6") : secret "metrics-daemon-secret" not found
Apr 19 12:31:38.344634 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:38.344589 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5h4q5\" (UniqueName: \"kubernetes.io/projected/54214d38-325e-4791-8e44-1bb6aac2fb3f-kube-api-access-5h4q5\") pod \"network-check-target-txq8s\" (UID: \"54214d38-325e-4791-8e44-1bb6aac2fb3f\") " pod="openshift-network-diagnostics/network-check-target-txq8s"
Apr 19 12:31:38.346527 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:38.346509 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 19 12:31:38.356780 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:38.356755 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 19 12:31:38.369039 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:38.369004 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h4q5\" (UniqueName: \"kubernetes.io/projected/54214d38-325e-4791-8e44-1bb6aac2fb3f-kube-api-access-5h4q5\") pod \"network-check-target-txq8s\" (UID: \"54214d38-325e-4791-8e44-1bb6aac2fb3f\") " pod="openshift-network-diagnostics/network-check-target-txq8s"
Apr 19 12:31:38.569696 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:38.569619 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4sh8l\""
Apr 19 12:31:38.578247 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:38.578225 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-txq8s"
Apr 19 12:31:38.697396 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:38.697362 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-txq8s"]
Apr 19 12:31:38.700835 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:31:38.700806 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54214d38_325e_4791_8e44_1bb6aac2fb3f.slice/crio-51a0bfa62120697d92c2214edd586d12774dc879ec05c407f513137978a40442 WatchSource:0}: Error finding container 51a0bfa62120697d92c2214edd586d12774dc879ec05c407f513137978a40442: Status 404 returned error can't find the container with id 51a0bfa62120697d92c2214edd586d12774dc879ec05c407f513137978a40442
Apr 19 12:31:38.704019 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:38.703990 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-txq8s" event={"ID":"54214d38-325e-4791-8e44-1bb6aac2fb3f","Type":"ContainerStarted","Data":"51a0bfa62120697d92c2214edd586d12774dc879ec05c407f513137978a40442"}
Apr 19 12:31:41.710902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:41.710818 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-txq8s" event={"ID":"54214d38-325e-4791-8e44-1bb6aac2fb3f","Type":"ContainerStarted","Data":"b44e1e15143d96ccd15a98c27bd27c48458b962ced241a48ea6d91c6d89b0b7f"}
Apr 19 12:31:41.711283 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:41.710928 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-txq8s"
Apr 19 12:31:41.726261 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:31:41.726222 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-txq8s" podStartSLOduration=67.027528111 podStartE2EDuration="1m9.726208511s" podCreationTimestamp="2026-04-19 12:30:32 +0000 UTC" firstStartedPulling="2026-04-19 12:31:38.702615719 +0000 UTC m=+66.856549177" lastFinishedPulling="2026-04-19 12:31:41.401296119 +0000 UTC m=+69.555229577" observedRunningTime="2026-04-19 12:31:41.726067194 +0000 UTC m=+69.880000674" watchObservedRunningTime="2026-04-19 12:31:41.726208511 +0000 UTC m=+69.880141990"
Apr 19 12:32:08.677632 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:08.677561 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bns5n\" (UID: \"e88a41fd-9d7d-457b-af51-169ee562d266\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bns5n"
Apr 19 12:32:08.678196 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:08.677766 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 19 12:32:08.678196 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:08.677853 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert podName:e88a41fd-9d7d-457b-af51-169ee562d266 nodeName:}" failed. No retries permitted until 2026-04-19 12:33:12.677832373 +0000 UTC m=+160.831765846 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bns5n" (UID: "e88a41fd-9d7d-457b-af51-169ee562d266") : secret "networking-console-plugin-cert" not found
Apr 19 12:32:08.678196 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:08.677915 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert\") pod \"ingress-canary-29mkc\" (UID: \"f68c88a5-5b83-4fd0-92df-327974a7cc96\") " pod="openshift-ingress-canary/ingress-canary-29mkc"
Apr 19 12:32:08.678196 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:08.677964 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:32:08.678196 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:08.678088 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 19 12:32:08.678196 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:08.678102 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cb67c5f8f-b7rwf: secret "image-registry-tls" not found
Apr 19 12:32:08.678196 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:08.678148 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls podName:343439fa-d125-4243-ac43-c00e012201b9 nodeName:}" failed. No retries permitted until 2026-04-19 12:33:12.678136338 +0000 UTC m=+160.832069799 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls") pod "image-registry-5cb67c5f8f-b7rwf" (UID: "343439fa-d125-4243-ac43-c00e012201b9") : secret "image-registry-tls" not found
Apr 19 12:32:08.678577 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:08.678191 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:32:08.678625 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:08.678604 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert podName:f68c88a5-5b83-4fd0-92df-327974a7cc96 nodeName:}" failed. No retries permitted until 2026-04-19 12:33:12.678464262 +0000 UTC m=+160.832397730 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert") pod "ingress-canary-29mkc" (UID: "f68c88a5-5b83-4fd0-92df-327974a7cc96") : secret "canary-serving-cert" not found
Apr 19 12:32:08.778871 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:08.778788 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls\") pod \"dns-default-nb68t\" (UID: \"c83beaf5-2d24-4163-855a-f4c6d55b0311\") " pod="openshift-dns/dns-default-nb68t"
Apr 19 12:32:08.779016 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:08.778897 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:32:08.779016 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:08.778963 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls podName:c83beaf5-2d24-4163-855a-f4c6d55b0311 nodeName:}" failed. No retries permitted until 2026-04-19 12:33:12.778948836 +0000 UTC m=+160.932882294 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls") pod "dns-default-nb68t" (UID: "c83beaf5-2d24-4163-855a-f4c6d55b0311") : secret "dns-default-metrics-tls" not found
Apr 19 12:32:12.715313 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:12.715284 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-txq8s"
Apr 19 12:32:42.331601 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:42.331562 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs\") pod \"network-metrics-daemon-98bqr\" (UID: \"720d8932-1617-465d-a213-ebb1e99e6bc6\") " pod="openshift-multus/network-metrics-daemon-98bqr"
Apr 19 12:32:42.332072 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:42.331696 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 19 12:32:42.332072 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:42.331768 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs podName:720d8932-1617-465d-a213-ebb1e99e6bc6 nodeName:}" failed. No retries permitted until 2026-04-19 12:34:44.331744487 +0000 UTC m=+252.485677944 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs") pod "network-metrics-daemon-98bqr" (UID: "720d8932-1617-465d-a213-ebb1e99e6bc6") : secret "metrics-daemon-secret" not found
Apr 19 12:32:52.368375 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.368341 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-lnbtf"]
Apr 19 12:32:52.371198 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.371179 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9qxjw"]
Apr 19 12:32:52.371320 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.371300 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf"
Apr 19 12:32:52.373421 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.373396 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 19 12:32:52.373552 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.373440 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-zgg5x\""
Apr 19 12:32:52.373552 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.373475 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 19 12:32:52.373846 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.373826 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9qxjw"
Apr 19 12:32:52.373944 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.373923 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 19 12:32:52.374012 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.373982 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 19 12:32:52.376551 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.376530 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 19 12:32:52.376848 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.376816 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-rpnw7\""
Apr 19 12:32:52.376848 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.376838 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 19 12:32:52.376996 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.376899 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 19 12:32:52.381189 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.381144 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 19 12:32:52.381957 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.381936 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9qxjw"] Apr 19 12:32:52.383385 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.383364 2567 kubelet.go:2544]
"SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-lnbtf"] Apr 19 12:32:52.401665 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.401638 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h7fx\" (UniqueName: \"kubernetes.io/projected/333b8588-b9fe-47b5-aaa3-5657ab8f1e9a-kube-api-access-9h7fx\") pod \"cluster-samples-operator-6dc5bdb6b4-9qxjw\" (UID: \"333b8588-b9fe-47b5-aaa3-5657ab8f1e9a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9qxjw" Apr 19 12:32:52.401823 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.401739 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/333b8588-b9fe-47b5-aaa3-5657ab8f1e9a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-9qxjw\" (UID: \"333b8588-b9fe-47b5-aaa3-5657ab8f1e9a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9qxjw" Apr 19 12:32:52.401891 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.401863 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjm86\" (UniqueName: \"kubernetes.io/projected/2b04d28c-3eb3-44e1-b431-d6b75f3850fe-kube-api-access-tjm86\") pod \"console-operator-9d4b6777b-lnbtf\" (UID: \"2b04d28c-3eb3-44e1-b431-d6b75f3850fe\") " pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf" Apr 19 12:32:52.401947 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.401925 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2b04d28c-3eb3-44e1-b431-d6b75f3850fe-trusted-ca\") pod \"console-operator-9d4b6777b-lnbtf\" (UID: \"2b04d28c-3eb3-44e1-b431-d6b75f3850fe\") " pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf" Apr 19 
12:32:52.401999 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.401975 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b04d28c-3eb3-44e1-b431-d6b75f3850fe-serving-cert\") pod \"console-operator-9d4b6777b-lnbtf\" (UID: \"2b04d28c-3eb3-44e1-b431-d6b75f3850fe\") " pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf" Apr 19 12:32:52.402109 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.402012 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b04d28c-3eb3-44e1-b431-d6b75f3850fe-config\") pod \"console-operator-9d4b6777b-lnbtf\" (UID: \"2b04d28c-3eb3-44e1-b431-d6b75f3850fe\") " pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf" Apr 19 12:32:52.468984 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.468954 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9vjds"] Apr 19 12:32:52.471925 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.471895 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9vjds" Apr 19 12:32:52.472629 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.472606 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-6mzzf"] Apr 19 12:32:52.474028 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.474004 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 19 12:32:52.474137 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.474007 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 19 12:32:52.474202 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.474084 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 19 12:32:52.474296 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.474279 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 19 12:32:52.474683 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.474669 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-b6m6n\"" Apr 19 12:32:52.475439 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.475420 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mzzf" Apr 19 12:32:52.477063 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.477039 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-r4mtm\"" Apr 19 12:32:52.477351 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.477338 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 19 12:32:52.478090 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.478065 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 19 12:32:52.478529 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.478512 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 19 12:32:52.478599 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.478578 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 19 12:32:52.483328 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.483307 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9vjds"] Apr 19 12:32:52.487208 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.487190 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-6mzzf"] Apr 19 12:32:52.502333 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.502311 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b04d28c-3eb3-44e1-b431-d6b75f3850fe-config\") pod \"console-operator-9d4b6777b-lnbtf\" (UID: \"2b04d28c-3eb3-44e1-b431-d6b75f3850fe\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf" Apr 19 12:32:52.502434 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.502348 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k74wv\" (UniqueName: \"kubernetes.io/projected/ac3748db-049f-4448-a55c-ed08dd605a59-kube-api-access-k74wv\") pod \"kube-storage-version-migrator-operator-6769c5d45-9vjds\" (UID: \"ac3748db-049f-4448-a55c-ed08dd605a59\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9vjds" Apr 19 12:32:52.502434 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.502392 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac3748db-049f-4448-a55c-ed08dd605a59-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-9vjds\" (UID: \"ac3748db-049f-4448-a55c-ed08dd605a59\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9vjds" Apr 19 12:32:52.502514 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.502499 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac3748db-049f-4448-a55c-ed08dd605a59-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-9vjds\" (UID: \"ac3748db-049f-4448-a55c-ed08dd605a59\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9vjds" Apr 19 12:32:52.502555 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.502531 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/418c073b-0032-49b0-84ad-cf233dc4778b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6mzzf\" (UID: 
\"418c073b-0032-49b0-84ad-cf233dc4778b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mzzf" Apr 19 12:32:52.502632 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.502561 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2b04d28c-3eb3-44e1-b431-d6b75f3850fe-trusted-ca\") pod \"console-operator-9d4b6777b-lnbtf\" (UID: \"2b04d28c-3eb3-44e1-b431-d6b75f3850fe\") " pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf" Apr 19 12:32:52.502632 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.502607 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b04d28c-3eb3-44e1-b431-d6b75f3850fe-serving-cert\") pod \"console-operator-9d4b6777b-lnbtf\" (UID: \"2b04d28c-3eb3-44e1-b431-d6b75f3850fe\") " pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf" Apr 19 12:32:52.502783 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.502633 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/418c073b-0032-49b0-84ad-cf233dc4778b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-6mzzf\" (UID: \"418c073b-0032-49b0-84ad-cf233dc4778b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mzzf" Apr 19 12:32:52.502783 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.502670 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9h7fx\" (UniqueName: \"kubernetes.io/projected/333b8588-b9fe-47b5-aaa3-5657ab8f1e9a-kube-api-access-9h7fx\") pod \"cluster-samples-operator-6dc5bdb6b4-9qxjw\" (UID: \"333b8588-b9fe-47b5-aaa3-5657ab8f1e9a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9qxjw" Apr 19 12:32:52.502783 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.502699 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tjm86\" (UniqueName: \"kubernetes.io/projected/2b04d28c-3eb3-44e1-b431-d6b75f3850fe-kube-api-access-tjm86\") pod \"console-operator-9d4b6777b-lnbtf\" (UID: \"2b04d28c-3eb3-44e1-b431-d6b75f3850fe\") " pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf" Apr 19 12:32:52.502783 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.502771 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/333b8588-b9fe-47b5-aaa3-5657ab8f1e9a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-9qxjw\" (UID: \"333b8588-b9fe-47b5-aaa3-5657ab8f1e9a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9qxjw" Apr 19 12:32:52.502979 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.502812 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlrdl\" (UniqueName: \"kubernetes.io/projected/418c073b-0032-49b0-84ad-cf233dc4778b-kube-api-access-xlrdl\") pod \"cluster-monitoring-operator-75587bd455-6mzzf\" (UID: \"418c073b-0032-49b0-84ad-cf233dc4778b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mzzf" Apr 19 12:32:52.503095 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.503071 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b04d28c-3eb3-44e1-b431-d6b75f3850fe-config\") pod \"console-operator-9d4b6777b-lnbtf\" (UID: \"2b04d28c-3eb3-44e1-b431-d6b75f3850fe\") " pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf" Apr 19 12:32:52.503209 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:52.503192 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 19 12:32:52.503281 ip-10-0-142-55 kubenswrapper[2567]: 
E0419 12:32:52.503270 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333b8588-b9fe-47b5-aaa3-5657ab8f1e9a-samples-operator-tls podName:333b8588-b9fe-47b5-aaa3-5657ab8f1e9a nodeName:}" failed. No retries permitted until 2026-04-19 12:32:53.003249076 +0000 UTC m=+141.157182535 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/333b8588-b9fe-47b5-aaa3-5657ab8f1e9a-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-9qxjw" (UID: "333b8588-b9fe-47b5-aaa3-5657ab8f1e9a") : secret "samples-operator-tls" not found Apr 19 12:32:52.503846 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.503831 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2b04d28c-3eb3-44e1-b431-d6b75f3850fe-trusted-ca\") pod \"console-operator-9d4b6777b-lnbtf\" (UID: \"2b04d28c-3eb3-44e1-b431-d6b75f3850fe\") " pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf" Apr 19 12:32:52.505128 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.505108 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b04d28c-3eb3-44e1-b431-d6b75f3850fe-serving-cert\") pod \"console-operator-9d4b6777b-lnbtf\" (UID: \"2b04d28c-3eb3-44e1-b431-d6b75f3850fe\") " pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf" Apr 19 12:32:52.510854 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.510836 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h7fx\" (UniqueName: \"kubernetes.io/projected/333b8588-b9fe-47b5-aaa3-5657ab8f1e9a-kube-api-access-9h7fx\") pod \"cluster-samples-operator-6dc5bdb6b4-9qxjw\" (UID: \"333b8588-b9fe-47b5-aaa3-5657ab8f1e9a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9qxjw" Apr 19 12:32:52.520775 ip-10-0-142-55 kubenswrapper[2567]: 
I0419 12:32:52.520752 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjm86\" (UniqueName: \"kubernetes.io/projected/2b04d28c-3eb3-44e1-b431-d6b75f3850fe-kube-api-access-tjm86\") pod \"console-operator-9d4b6777b-lnbtf\" (UID: \"2b04d28c-3eb3-44e1-b431-d6b75f3850fe\") " pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf" Apr 19 12:32:52.604270 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.604231 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k74wv\" (UniqueName: \"kubernetes.io/projected/ac3748db-049f-4448-a55c-ed08dd605a59-kube-api-access-k74wv\") pod \"kube-storage-version-migrator-operator-6769c5d45-9vjds\" (UID: \"ac3748db-049f-4448-a55c-ed08dd605a59\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9vjds" Apr 19 12:32:52.604270 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.604282 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac3748db-049f-4448-a55c-ed08dd605a59-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-9vjds\" (UID: \"ac3748db-049f-4448-a55c-ed08dd605a59\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9vjds" Apr 19 12:32:52.604500 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.604323 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac3748db-049f-4448-a55c-ed08dd605a59-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-9vjds\" (UID: \"ac3748db-049f-4448-a55c-ed08dd605a59\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9vjds" Apr 19 12:32:52.604500 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.604339 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/418c073b-0032-49b0-84ad-cf233dc4778b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6mzzf\" (UID: \"418c073b-0032-49b0-84ad-cf233dc4778b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mzzf" Apr 19 12:32:52.604500 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.604374 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/418c073b-0032-49b0-84ad-cf233dc4778b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-6mzzf\" (UID: \"418c073b-0032-49b0-84ad-cf233dc4778b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mzzf" Apr 19 12:32:52.604500 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.604419 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xlrdl\" (UniqueName: \"kubernetes.io/projected/418c073b-0032-49b0-84ad-cf233dc4778b-kube-api-access-xlrdl\") pod \"cluster-monitoring-operator-75587bd455-6mzzf\" (UID: \"418c073b-0032-49b0-84ad-cf233dc4778b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mzzf" Apr 19 12:32:52.604500 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:52.604474 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 19 12:32:52.604737 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:52.604564 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/418c073b-0032-49b0-84ad-cf233dc4778b-cluster-monitoring-operator-tls podName:418c073b-0032-49b0-84ad-cf233dc4778b nodeName:}" failed. No retries permitted until 2026-04-19 12:32:53.104541783 +0000 UTC m=+141.258475255 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/418c073b-0032-49b0-84ad-cf233dc4778b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6mzzf" (UID: "418c073b-0032-49b0-84ad-cf233dc4778b") : secret "cluster-monitoring-operator-tls" not found Apr 19 12:32:52.605126 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.605107 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/418c073b-0032-49b0-84ad-cf233dc4778b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-6mzzf\" (UID: \"418c073b-0032-49b0-84ad-cf233dc4778b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mzzf" Apr 19 12:32:52.605481 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.605466 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac3748db-049f-4448-a55c-ed08dd605a59-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-9vjds\" (UID: \"ac3748db-049f-4448-a55c-ed08dd605a59\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9vjds" Apr 19 12:32:52.606511 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.606495 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac3748db-049f-4448-a55c-ed08dd605a59-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-9vjds\" (UID: \"ac3748db-049f-4448-a55c-ed08dd605a59\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9vjds" Apr 19 12:32:52.612312 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.612286 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k74wv\" (UniqueName: 
\"kubernetes.io/projected/ac3748db-049f-4448-a55c-ed08dd605a59-kube-api-access-k74wv\") pod \"kube-storage-version-migrator-operator-6769c5d45-9vjds\" (UID: \"ac3748db-049f-4448-a55c-ed08dd605a59\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9vjds" Apr 19 12:32:52.612427 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.612352 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlrdl\" (UniqueName: \"kubernetes.io/projected/418c073b-0032-49b0-84ad-cf233dc4778b-kube-api-access-xlrdl\") pod \"cluster-monitoring-operator-75587bd455-6mzzf\" (UID: \"418c073b-0032-49b0-84ad-cf233dc4778b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mzzf" Apr 19 12:32:52.685180 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.685139 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf" Apr 19 12:32:52.782884 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.782851 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9vjds" Apr 19 12:32:52.797007 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.796979 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-lnbtf"] Apr 19 12:32:52.800341 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:32:52.800307 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b04d28c_3eb3_44e1_b431_d6b75f3850fe.slice/crio-e11c5506686f0f4ffc4b43ef077354b8b4f46df0c36b42093eac4312ba59a3d5 WatchSource:0}: Error finding container e11c5506686f0f4ffc4b43ef077354b8b4f46df0c36b42093eac4312ba59a3d5: Status 404 returned error can't find the container with id e11c5506686f0f4ffc4b43ef077354b8b4f46df0c36b42093eac4312ba59a3d5 Apr 19 12:32:52.847212 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.847178 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf" event={"ID":"2b04d28c-3eb3-44e1-b431-d6b75f3850fe","Type":"ContainerStarted","Data":"e11c5506686f0f4ffc4b43ef077354b8b4f46df0c36b42093eac4312ba59a3d5"} Apr 19 12:32:52.897247 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:52.897212 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9vjds"] Apr 19 12:32:52.902067 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:32:52.902035 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac3748db_049f_4448_a55c_ed08dd605a59.slice/crio-58336e4a8214948a4f04c717367d29cc8086d1a947d1c54c0dc47af1f2155d16 WatchSource:0}: Error finding container 58336e4a8214948a4f04c717367d29cc8086d1a947d1c54c0dc47af1f2155d16: Status 404 returned error can't find the container with id 
58336e4a8214948a4f04c717367d29cc8086d1a947d1c54c0dc47af1f2155d16 Apr 19 12:32:53.008029 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:53.007932 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/333b8588-b9fe-47b5-aaa3-5657ab8f1e9a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-9qxjw\" (UID: \"333b8588-b9fe-47b5-aaa3-5657ab8f1e9a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9qxjw" Apr 19 12:32:53.008207 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:53.008048 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 19 12:32:53.008207 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:53.008107 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333b8588-b9fe-47b5-aaa3-5657ab8f1e9a-samples-operator-tls podName:333b8588-b9fe-47b5-aaa3-5657ab8f1e9a nodeName:}" failed. No retries permitted until 2026-04-19 12:32:54.008090156 +0000 UTC m=+142.162023615 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/333b8588-b9fe-47b5-aaa3-5657ab8f1e9a-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-9qxjw" (UID: "333b8588-b9fe-47b5-aaa3-5657ab8f1e9a") : secret "samples-operator-tls" not found Apr 19 12:32:53.108656 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:53.108617 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/418c073b-0032-49b0-84ad-cf233dc4778b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6mzzf\" (UID: \"418c073b-0032-49b0-84ad-cf233dc4778b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mzzf" Apr 19 12:32:53.108818 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:53.108721 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 19 12:32:53.108818 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:53.108775 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/418c073b-0032-49b0-84ad-cf233dc4778b-cluster-monitoring-operator-tls podName:418c073b-0032-49b0-84ad-cf233dc4778b nodeName:}" failed. No retries permitted until 2026-04-19 12:32:54.108761581 +0000 UTC m=+142.262695039 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/418c073b-0032-49b0-84ad-cf233dc4778b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6mzzf" (UID: "418c073b-0032-49b0-84ad-cf233dc4778b") : secret "cluster-monitoring-operator-tls" not found Apr 19 12:32:53.415313 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:53.415281 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6gh7r"] Apr 19 12:32:53.418459 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:53.418436 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6gh7r" Apr 19 12:32:53.420181 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:53.420143 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-q8gzc\"" Apr 19 12:32:53.426705 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:53.426664 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6gh7r"] Apr 19 12:32:53.511889 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:53.511847 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqt4x\" (UniqueName: \"kubernetes.io/projected/433c7a3b-e695-4024-b759-a0f506fc3aa5-kube-api-access-nqt4x\") pod \"network-check-source-8894fc9bd-6gh7r\" (UID: \"433c7a3b-e695-4024-b759-a0f506fc3aa5\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6gh7r" Apr 19 12:32:53.613490 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:53.613448 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqt4x\" (UniqueName: \"kubernetes.io/projected/433c7a3b-e695-4024-b759-a0f506fc3aa5-kube-api-access-nqt4x\") pod 
\"network-check-source-8894fc9bd-6gh7r\" (UID: \"433c7a3b-e695-4024-b759-a0f506fc3aa5\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6gh7r" Apr 19 12:32:53.621545 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:53.621498 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqt4x\" (UniqueName: \"kubernetes.io/projected/433c7a3b-e695-4024-b759-a0f506fc3aa5-kube-api-access-nqt4x\") pod \"network-check-source-8894fc9bd-6gh7r\" (UID: \"433c7a3b-e695-4024-b759-a0f506fc3aa5\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6gh7r" Apr 19 12:32:53.729299 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:53.729220 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6gh7r" Apr 19 12:32:53.851150 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:53.851108 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9vjds" event={"ID":"ac3748db-049f-4448-a55c-ed08dd605a59","Type":"ContainerStarted","Data":"58336e4a8214948a4f04c717367d29cc8086d1a947d1c54c0dc47af1f2155d16"} Apr 19 12:32:53.866047 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:53.866014 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6gh7r"] Apr 19 12:32:53.870688 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:32:53.870656 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod433c7a3b_e695_4024_b759_a0f506fc3aa5.slice/crio-80f52cc486c85c5cc233d9cc5ee3bfb9824db1c2679c56efd0c3298d7c8f403c WatchSource:0}: Error finding container 80f52cc486c85c5cc233d9cc5ee3bfb9824db1c2679c56efd0c3298d7c8f403c: Status 404 returned error can't find the container with id 
80f52cc486c85c5cc233d9cc5ee3bfb9824db1c2679c56efd0c3298d7c8f403c Apr 19 12:32:54.016283 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:54.016140 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/333b8588-b9fe-47b5-aaa3-5657ab8f1e9a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-9qxjw\" (UID: \"333b8588-b9fe-47b5-aaa3-5657ab8f1e9a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9qxjw" Apr 19 12:32:54.016450 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:54.016296 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 19 12:32:54.016450 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:54.016379 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333b8588-b9fe-47b5-aaa3-5657ab8f1e9a-samples-operator-tls podName:333b8588-b9fe-47b5-aaa3-5657ab8f1e9a nodeName:}" failed. No retries permitted until 2026-04-19 12:32:56.016356912 +0000 UTC m=+144.170290373 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/333b8588-b9fe-47b5-aaa3-5657ab8f1e9a-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-9qxjw" (UID: "333b8588-b9fe-47b5-aaa3-5657ab8f1e9a") : secret "samples-operator-tls" not found Apr 19 12:32:54.116779 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:54.116745 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/418c073b-0032-49b0-84ad-cf233dc4778b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6mzzf\" (UID: \"418c073b-0032-49b0-84ad-cf233dc4778b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mzzf" Apr 19 12:32:54.116987 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:54.116903 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 19 12:32:54.117057 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:54.116990 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/418c073b-0032-49b0-84ad-cf233dc4778b-cluster-monitoring-operator-tls podName:418c073b-0032-49b0-84ad-cf233dc4778b nodeName:}" failed. No retries permitted until 2026-04-19 12:32:56.116970006 +0000 UTC m=+144.270903463 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/418c073b-0032-49b0-84ad-cf233dc4778b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6mzzf" (UID: "418c073b-0032-49b0-84ad-cf233dc4778b") : secret "cluster-monitoring-operator-tls" not found Apr 19 12:32:54.854591 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:54.854559 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6gh7r" event={"ID":"433c7a3b-e695-4024-b759-a0f506fc3aa5","Type":"ContainerStarted","Data":"f05b5080570b5c787609dffb13a5650afdb96fc522f298f812b15f873d1aa455"} Apr 19 12:32:54.854940 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:54.854602 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6gh7r" event={"ID":"433c7a3b-e695-4024-b759-a0f506fc3aa5","Type":"ContainerStarted","Data":"80f52cc486c85c5cc233d9cc5ee3bfb9824db1c2679c56efd0c3298d7c8f403c"} Apr 19 12:32:54.868861 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:54.868797 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6gh7r" podStartSLOduration=1.868778305 podStartE2EDuration="1.868778305s" podCreationTimestamp="2026-04-19 12:32:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:32:54.867197619 +0000 UTC m=+143.021131099" watchObservedRunningTime="2026-04-19 12:32:54.868778305 +0000 UTC m=+143.022711786" Apr 19 12:32:55.857822 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:55.857730 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9vjds" 
event={"ID":"ac3748db-049f-4448-a55c-ed08dd605a59","Type":"ContainerStarted","Data":"e0e9d03ebee34cbdac31bc775ac12fa817df662852e0a8c8705d163377778003"} Apr 19 12:32:55.859101 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:55.859082 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnbtf_2b04d28c-3eb3-44e1-b431-d6b75f3850fe/console-operator/0.log" Apr 19 12:32:55.859267 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:55.859117 2567 generic.go:358] "Generic (PLEG): container finished" podID="2b04d28c-3eb3-44e1-b431-d6b75f3850fe" containerID="f5a21032811f05a241fa9032dcd3089f271ab407bed12a324ce877e6ce03894e" exitCode=255 Apr 19 12:32:55.859267 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:55.859196 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf" event={"ID":"2b04d28c-3eb3-44e1-b431-d6b75f3850fe","Type":"ContainerDied","Data":"f5a21032811f05a241fa9032dcd3089f271ab407bed12a324ce877e6ce03894e"} Apr 19 12:32:55.863015 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:55.859784 2567 scope.go:117] "RemoveContainer" containerID="f5a21032811f05a241fa9032dcd3089f271ab407bed12a324ce877e6ce03894e" Apr 19 12:32:55.875726 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:55.875681 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9vjds" podStartSLOduration=1.239155608 podStartE2EDuration="3.875666127s" podCreationTimestamp="2026-04-19 12:32:52 +0000 UTC" firstStartedPulling="2026-04-19 12:32:52.903344403 +0000 UTC m=+141.057277861" lastFinishedPulling="2026-04-19 12:32:55.539854914 +0000 UTC m=+143.693788380" observedRunningTime="2026-04-19 12:32:55.874298337 +0000 UTC m=+144.028231816" watchObservedRunningTime="2026-04-19 12:32:55.875666127 +0000 UTC m=+144.029599609" Apr 19 12:32:56.034794 ip-10-0-142-55 
kubenswrapper[2567]: I0419 12:32:56.034755 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/333b8588-b9fe-47b5-aaa3-5657ab8f1e9a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-9qxjw\" (UID: \"333b8588-b9fe-47b5-aaa3-5657ab8f1e9a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9qxjw" Apr 19 12:32:56.034963 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:56.034919 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 19 12:32:56.035045 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:56.034999 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333b8588-b9fe-47b5-aaa3-5657ab8f1e9a-samples-operator-tls podName:333b8588-b9fe-47b5-aaa3-5657ab8f1e9a nodeName:}" failed. No retries permitted until 2026-04-19 12:33:00.034979334 +0000 UTC m=+148.188912816 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/333b8588-b9fe-47b5-aaa3-5657ab8f1e9a-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-9qxjw" (UID: "333b8588-b9fe-47b5-aaa3-5657ab8f1e9a") : secret "samples-operator-tls" not found Apr 19 12:32:56.135500 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:56.135413 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/418c073b-0032-49b0-84ad-cf233dc4778b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6mzzf\" (UID: \"418c073b-0032-49b0-84ad-cf233dc4778b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mzzf" Apr 19 12:32:56.135640 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:56.135532 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 19 12:32:56.135640 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:56.135594 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/418c073b-0032-49b0-84ad-cf233dc4778b-cluster-monitoring-operator-tls podName:418c073b-0032-49b0-84ad-cf233dc4778b nodeName:}" failed. No retries permitted until 2026-04-19 12:33:00.135578751 +0000 UTC m=+148.289512208 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/418c073b-0032-49b0-84ad-cf233dc4778b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6mzzf" (UID: "418c073b-0032-49b0-84ad-cf233dc4778b") : secret "cluster-monitoring-operator-tls" not found Apr 19 12:32:56.863498 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:56.863472 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnbtf_2b04d28c-3eb3-44e1-b431-d6b75f3850fe/console-operator/1.log" Apr 19 12:32:56.863923 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:56.863849 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnbtf_2b04d28c-3eb3-44e1-b431-d6b75f3850fe/console-operator/0.log" Apr 19 12:32:56.863923 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:56.863882 2567 generic.go:358] "Generic (PLEG): container finished" podID="2b04d28c-3eb3-44e1-b431-d6b75f3850fe" containerID="a8dec56c21e3e1bad4030c2cb3cf16a14792114da2ba5ae1a200b8f9b1270148" exitCode=255 Apr 19 12:32:56.864040 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:56.863913 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf" event={"ID":"2b04d28c-3eb3-44e1-b431-d6b75f3850fe","Type":"ContainerDied","Data":"a8dec56c21e3e1bad4030c2cb3cf16a14792114da2ba5ae1a200b8f9b1270148"} Apr 19 12:32:56.864040 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:56.863964 2567 scope.go:117] "RemoveContainer" containerID="f5a21032811f05a241fa9032dcd3089f271ab407bed12a324ce877e6ce03894e" Apr 19 12:32:56.864247 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:56.864233 2567 scope.go:117] "RemoveContainer" containerID="a8dec56c21e3e1bad4030c2cb3cf16a14792114da2ba5ae1a200b8f9b1270148" Apr 19 12:32:56.864444 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:56.864424 2567 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-lnbtf_openshift-console-operator(2b04d28c-3eb3-44e1-b431-d6b75f3850fe)\"" pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf" podUID="2b04d28c-3eb3-44e1-b431-d6b75f3850fe" Apr 19 12:32:57.866997 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:57.866971 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnbtf_2b04d28c-3eb3-44e1-b431-d6b75f3850fe/console-operator/1.log" Apr 19 12:32:57.867460 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:57.867346 2567 scope.go:117] "RemoveContainer" containerID="a8dec56c21e3e1bad4030c2cb3cf16a14792114da2ba5ae1a200b8f9b1270148" Apr 19 12:32:57.867549 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:32:57.867530 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-lnbtf_openshift-console-operator(2b04d28c-3eb3-44e1-b431-d6b75f3850fe)\"" pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf" podUID="2b04d28c-3eb3-44e1-b431-d6b75f3850fe" Apr 19 12:32:58.804875 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:58.804844 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rd647_d3c440e9-6d1f-46b9-b5c1-3ee825d6e2a4/dns-node-resolver/0.log" Apr 19 12:32:59.407257 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:59.407223 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-j74c9"] Apr 19 12:32:59.411565 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:59.411547 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-j74c9" Apr 19 12:32:59.413426 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:59.413403 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 19 12:32:59.413645 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:59.413629 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 19 12:32:59.413720 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:59.413629 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 19 12:32:59.413953 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:59.413937 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 19 12:32:59.414044 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:59.413996 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-hwnwh\"" Apr 19 12:32:59.416810 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:59.416785 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-j74c9"] Apr 19 12:32:59.464232 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:59.464191 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ea7da589-a369-4c81-95f4-1ffd0af37713-signing-key\") pod \"service-ca-865cb79987-j74c9\" (UID: \"ea7da589-a369-4c81-95f4-1ffd0af37713\") " pod="openshift-service-ca/service-ca-865cb79987-j74c9" Apr 19 12:32:59.464413 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:59.464289 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/ea7da589-a369-4c81-95f4-1ffd0af37713-signing-cabundle\") pod \"service-ca-865cb79987-j74c9\" (UID: \"ea7da589-a369-4c81-95f4-1ffd0af37713\") " pod="openshift-service-ca/service-ca-865cb79987-j74c9" Apr 19 12:32:59.464413 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:59.464389 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghlnv\" (UniqueName: \"kubernetes.io/projected/ea7da589-a369-4c81-95f4-1ffd0af37713-kube-api-access-ghlnv\") pod \"service-ca-865cb79987-j74c9\" (UID: \"ea7da589-a369-4c81-95f4-1ffd0af37713\") " pod="openshift-service-ca/service-ca-865cb79987-j74c9" Apr 19 12:32:59.565752 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:59.565711 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ea7da589-a369-4c81-95f4-1ffd0af37713-signing-key\") pod \"service-ca-865cb79987-j74c9\" (UID: \"ea7da589-a369-4c81-95f4-1ffd0af37713\") " pod="openshift-service-ca/service-ca-865cb79987-j74c9" Apr 19 12:32:59.565911 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:59.565796 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ea7da589-a369-4c81-95f4-1ffd0af37713-signing-cabundle\") pod \"service-ca-865cb79987-j74c9\" (UID: \"ea7da589-a369-4c81-95f4-1ffd0af37713\") " pod="openshift-service-ca/service-ca-865cb79987-j74c9" Apr 19 12:32:59.565911 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:59.565901 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghlnv\" (UniqueName: \"kubernetes.io/projected/ea7da589-a369-4c81-95f4-1ffd0af37713-kube-api-access-ghlnv\") pod \"service-ca-865cb79987-j74c9\" (UID: \"ea7da589-a369-4c81-95f4-1ffd0af37713\") " pod="openshift-service-ca/service-ca-865cb79987-j74c9" Apr 19 12:32:59.566444 ip-10-0-142-55 kubenswrapper[2567]: I0419 
12:32:59.566424 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ea7da589-a369-4c81-95f4-1ffd0af37713-signing-cabundle\") pod \"service-ca-865cb79987-j74c9\" (UID: \"ea7da589-a369-4c81-95f4-1ffd0af37713\") " pod="openshift-service-ca/service-ca-865cb79987-j74c9" Apr 19 12:32:59.568010 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:59.567989 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ea7da589-a369-4c81-95f4-1ffd0af37713-signing-key\") pod \"service-ca-865cb79987-j74c9\" (UID: \"ea7da589-a369-4c81-95f4-1ffd0af37713\") " pod="openshift-service-ca/service-ca-865cb79987-j74c9" Apr 19 12:32:59.573179 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:59.573147 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghlnv\" (UniqueName: \"kubernetes.io/projected/ea7da589-a369-4c81-95f4-1ffd0af37713-kube-api-access-ghlnv\") pod \"service-ca-865cb79987-j74c9\" (UID: \"ea7da589-a369-4c81-95f4-1ffd0af37713\") " pod="openshift-service-ca/service-ca-865cb79987-j74c9" Apr 19 12:32:59.721317 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:59.721217 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-j74c9" Apr 19 12:32:59.832515 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:59.832485 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-j74c9"] Apr 19 12:32:59.835371 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:32:59.835339 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea7da589_a369_4c81_95f4_1ffd0af37713.slice/crio-d8d13867ecc3c8f3fb3ff46739eab2e796bdf856b3a01242a2aa112756348e54 WatchSource:0}: Error finding container d8d13867ecc3c8f3fb3ff46739eab2e796bdf856b3a01242a2aa112756348e54: Status 404 returned error can't find the container with id d8d13867ecc3c8f3fb3ff46739eab2e796bdf856b3a01242a2aa112756348e54 Apr 19 12:32:59.871084 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:32:59.871050 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-j74c9" event={"ID":"ea7da589-a369-4c81-95f4-1ffd0af37713","Type":"ContainerStarted","Data":"d8d13867ecc3c8f3fb3ff46739eab2e796bdf856b3a01242a2aa112756348e54"} Apr 19 12:33:00.008575 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:00.008502 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-n7t66_dc6754e8-c6ea-48e0-9ad2-435a13e54b61/node-ca/0.log" Apr 19 12:33:00.070136 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:00.070070 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/333b8588-b9fe-47b5-aaa3-5657ab8f1e9a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-9qxjw\" (UID: \"333b8588-b9fe-47b5-aaa3-5657ab8f1e9a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9qxjw" Apr 19 12:33:00.070298 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:33:00.070239 2567 secret.go:189] Couldn't get secret 
openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 19 12:33:00.070340 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:33:00.070310 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/333b8588-b9fe-47b5-aaa3-5657ab8f1e9a-samples-operator-tls podName:333b8588-b9fe-47b5-aaa3-5657ab8f1e9a nodeName:}" failed. No retries permitted until 2026-04-19 12:33:08.070294074 +0000 UTC m=+156.224227536 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/333b8588-b9fe-47b5-aaa3-5657ab8f1e9a-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-9qxjw" (UID: "333b8588-b9fe-47b5-aaa3-5657ab8f1e9a") : secret "samples-operator-tls" not found Apr 19 12:33:00.171313 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:00.171277 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/418c073b-0032-49b0-84ad-cf233dc4778b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6mzzf\" (UID: \"418c073b-0032-49b0-84ad-cf233dc4778b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mzzf" Apr 19 12:33:00.171465 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:33:00.171387 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 19 12:33:00.171465 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:33:00.171439 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/418c073b-0032-49b0-84ad-cf233dc4778b-cluster-monitoring-operator-tls podName:418c073b-0032-49b0-84ad-cf233dc4778b nodeName:}" failed. No retries permitted until 2026-04-19 12:33:08.171426054 +0000 UTC m=+156.325359513 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/418c073b-0032-49b0-84ad-cf233dc4778b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6mzzf" (UID: "418c073b-0032-49b0-84ad-cf233dc4778b") : secret "cluster-monitoring-operator-tls" not found Apr 19 12:33:01.877194 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:01.877138 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-j74c9" event={"ID":"ea7da589-a369-4c81-95f4-1ffd0af37713","Type":"ContainerStarted","Data":"0b739df32976980989161ccab186a0cd988fc399907c82e1b3288a37ed42272b"} Apr 19 12:33:01.890353 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:01.890311 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-j74c9" podStartSLOduration=1.3818437129999999 podStartE2EDuration="2.890294031s" podCreationTimestamp="2026-04-19 12:32:59 +0000 UTC" firstStartedPulling="2026-04-19 12:32:59.83720385 +0000 UTC m=+147.991137308" lastFinishedPulling="2026-04-19 12:33:01.345654169 +0000 UTC m=+149.499587626" observedRunningTime="2026-04-19 12:33:01.889383547 +0000 UTC m=+150.043317026" watchObservedRunningTime="2026-04-19 12:33:01.890294031 +0000 UTC m=+150.044227512" Apr 19 12:33:02.685863 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:02.685825 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf" Apr 19 12:33:02.685863 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:02.685866 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf" Apr 19 12:33:02.686345 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:02.686331 2567 scope.go:117] "RemoveContainer" containerID="a8dec56c21e3e1bad4030c2cb3cf16a14792114da2ba5ae1a200b8f9b1270148" Apr 19 
12:33:02.686572 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:33:02.686552 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-lnbtf_openshift-console-operator(2b04d28c-3eb3-44e1-b431-d6b75f3850fe)\"" pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf" podUID="2b04d28c-3eb3-44e1-b431-d6b75f3850fe" Apr 19 12:33:07.744174 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:33:07.744118 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf" podUID="343439fa-d125-4243-ac43-c00e012201b9" Apr 19 12:33:07.752015 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:33:07.751980 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bns5n" podUID="e88a41fd-9d7d-457b-af51-169ee562d266" Apr 19 12:33:07.759145 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:33:07.759110 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-29mkc" podUID="f68c88a5-5b83-4fd0-92df-327974a7cc96" Apr 19 12:33:07.850898 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:33:07.850837 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-nb68t" podUID="c83beaf5-2d24-4163-855a-f4c6d55b0311" Apr 19 12:33:07.892646 ip-10-0-142-55 kubenswrapper[2567]: I0419 
12:33:07.892618 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nb68t" Apr 19 12:33:07.892646 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:07.892636 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bns5n" Apr 19 12:33:07.892816 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:07.892751 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-29mkc" Apr 19 12:33:07.892926 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:07.892912 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf" Apr 19 12:33:08.140280 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:08.140155 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/333b8588-b9fe-47b5-aaa3-5657ab8f1e9a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-9qxjw\" (UID: \"333b8588-b9fe-47b5-aaa3-5657ab8f1e9a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9qxjw" Apr 19 12:33:08.142598 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:08.142573 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/333b8588-b9fe-47b5-aaa3-5657ab8f1e9a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-9qxjw\" (UID: \"333b8588-b9fe-47b5-aaa3-5657ab8f1e9a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9qxjw" Apr 19 12:33:08.240937 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:08.240892 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/418c073b-0032-49b0-84ad-cf233dc4778b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6mzzf\" (UID: \"418c073b-0032-49b0-84ad-cf233dc4778b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mzzf" Apr 19 12:33:08.241090 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:33:08.241048 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 19 12:33:08.241185 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:33:08.241116 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/418c073b-0032-49b0-84ad-cf233dc4778b-cluster-monitoring-operator-tls podName:418c073b-0032-49b0-84ad-cf233dc4778b nodeName:}" failed. No retries permitted until 2026-04-19 12:33:24.241099605 +0000 UTC m=+172.395033063 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/418c073b-0032-49b0-84ad-cf233dc4778b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6mzzf" (UID: "418c073b-0032-49b0-84ad-cf233dc4778b") : secret "cluster-monitoring-operator-tls" not found Apr 19 12:33:08.290367 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:08.290332 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9qxjw" Apr 19 12:33:08.402776 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:08.402745 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9qxjw"] Apr 19 12:33:08.898898 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:08.898865 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9qxjw" event={"ID":"333b8588-b9fe-47b5-aaa3-5657ab8f1e9a","Type":"ContainerStarted","Data":"0e3cf65bd9224eb2869b8a04b4dcd5af84f11a36ea71429c8a27408d00d02e5c"} Apr 19 12:33:09.475286 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:33:09.475241 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-98bqr" podUID="720d8932-1617-465d-a213-ebb1e99e6bc6" Apr 19 12:33:10.905712 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:10.905679 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9qxjw" event={"ID":"333b8588-b9fe-47b5-aaa3-5657ab8f1e9a","Type":"ContainerStarted","Data":"3fc4b65d3dcef2282ecc1143e097bc32f8495559135ab47dbcd2a31efcd39450"} Apr 19 12:33:10.905712 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:10.905712 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9qxjw" event={"ID":"333b8588-b9fe-47b5-aaa3-5657ab8f1e9a","Type":"ContainerStarted","Data":"f6b775f85111bc597b7a33c02169f15de2f03bd54d9e0d326b3a3d77b86967b2"} Apr 19 12:33:10.920095 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:10.920050 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9qxjw" podStartSLOduration=16.963341057 podStartE2EDuration="18.920036382s" podCreationTimestamp="2026-04-19 12:32:52 +0000 UTC" firstStartedPulling="2026-04-19 12:33:08.443153302 +0000 UTC m=+156.597086760" lastFinishedPulling="2026-04-19 12:33:10.399848622 +0000 UTC m=+158.553782085" observedRunningTime="2026-04-19 12:33:10.919065375 +0000 UTC m=+159.072998869" watchObservedRunningTime="2026-04-19 12:33:10.920036382 +0000 UTC m=+159.073969861"
Apr 19 12:33:12.680340 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:12.680286 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:33:12.680779 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:12.680414 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert\") pod \"ingress-canary-29mkc\" (UID: \"f68c88a5-5b83-4fd0-92df-327974a7cc96\") " pod="openshift-ingress-canary/ingress-canary-29mkc"
Apr 19 12:33:12.680779 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:12.680472 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bns5n\" (UID: \"e88a41fd-9d7d-457b-af51-169ee562d266\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bns5n"
Apr 19 12:33:12.683008 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:12.682981 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e88a41fd-9d7d-457b-af51-169ee562d266-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bns5n\" (UID: \"e88a41fd-9d7d-457b-af51-169ee562d266\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bns5n"
Apr 19 12:33:12.683145 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:12.682981 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f68c88a5-5b83-4fd0-92df-327974a7cc96-cert\") pod \"ingress-canary-29mkc\" (UID: \"f68c88a5-5b83-4fd0-92df-327974a7cc96\") " pod="openshift-ingress-canary/ingress-canary-29mkc"
Apr 19 12:33:12.683145 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:12.683127 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls\") pod \"image-registry-5cb67c5f8f-b7rwf\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") " pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:33:12.695340 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:12.695313 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-6mqh9\""
Apr 19 12:33:12.695791 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:12.695756 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-v66mr\""
Apr 19 12:33:12.695985 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:12.695769 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xbkh5\""
Apr 19 12:33:12.703101 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:12.703077 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-29mkc"
Apr 19 12:33:12.703301 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:12.703283 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:33:12.703365 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:12.703307 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bns5n"
Apr 19 12:33:12.781989 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:12.781476 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls\") pod \"dns-default-nb68t\" (UID: \"c83beaf5-2d24-4163-855a-f4c6d55b0311\") " pod="openshift-dns/dns-default-nb68t"
Apr 19 12:33:12.785777 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:12.785712 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c83beaf5-2d24-4163-855a-f4c6d55b0311-metrics-tls\") pod \"dns-default-nb68t\" (UID: \"c83beaf5-2d24-4163-855a-f4c6d55b0311\") " pod="openshift-dns/dns-default-nb68t"
Apr 19 12:33:12.878988 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:12.878947 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-29mkc"]
Apr 19 12:33:12.880359 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:12.880332 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"]
Apr 19 12:33:12.882558 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:33:12.882536 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf68c88a5_5b83_4fd0_92df_327974a7cc96.slice/crio-4b85c5c56a1c8c3ec7b0851a187e0c8b2d07ada36510422ead20cba557490413 WatchSource:0}: Error finding container 4b85c5c56a1c8c3ec7b0851a187e0c8b2d07ada36510422ead20cba557490413: Status 404 returned error can't find the container with id 4b85c5c56a1c8c3ec7b0851a187e0c8b2d07ada36510422ead20cba557490413
Apr 19 12:33:12.883136 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:33:12.883089 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod343439fa_d125_4243_ac43_c00e012201b9.slice/crio-bb860e06de79e8c999c9b3861c086226e981ed5db9f5e0acf59aec8ec9e4075e WatchSource:0}: Error finding container bb860e06de79e8c999c9b3861c086226e981ed5db9f5e0acf59aec8ec9e4075e: Status 404 returned error can't find the container with id bb860e06de79e8c999c9b3861c086226e981ed5db9f5e0acf59aec8ec9e4075e
Apr 19 12:33:12.890567 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:12.890544 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-bns5n"]
Apr 19 12:33:12.895003 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:33:12.894978 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode88a41fd_9d7d_457b_af51_169ee562d266.slice/crio-cf6322207fba3b4450bbf0dd31c7ed7e95fd1d292bc8c20cbf7dd642a6ba9ee2 WatchSource:0}: Error finding container cf6322207fba3b4450bbf0dd31c7ed7e95fd1d292bc8c20cbf7dd642a6ba9ee2: Status 404 returned error can't find the container with id cf6322207fba3b4450bbf0dd31c7ed7e95fd1d292bc8c20cbf7dd642a6ba9ee2
Apr 19 12:33:12.913748 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:12.913720 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-29mkc" event={"ID":"f68c88a5-5b83-4fd0-92df-327974a7cc96","Type":"ContainerStarted","Data":"4b85c5c56a1c8c3ec7b0851a187e0c8b2d07ada36510422ead20cba557490413"}
Apr 19 12:33:12.914885 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:12.914862 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf" event={"ID":"343439fa-d125-4243-ac43-c00e012201b9","Type":"ContainerStarted","Data":"bb860e06de79e8c999c9b3861c086226e981ed5db9f5e0acf59aec8ec9e4075e"}
Apr 19 12:33:12.915855 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:12.915836 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bns5n" event={"ID":"e88a41fd-9d7d-457b-af51-169ee562d266","Type":"ContainerStarted","Data":"cf6322207fba3b4450bbf0dd31c7ed7e95fd1d292bc8c20cbf7dd642a6ba9ee2"}
Apr 19 12:33:12.995688 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:12.995653 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-62vx7\""
Apr 19 12:33:13.004223 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:13.004187 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nb68t"
Apr 19 12:33:13.134105 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:13.133990 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nb68t"]
Apr 19 12:33:13.136501 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:33:13.136473 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc83beaf5_2d24_4163_855a_f4c6d55b0311.slice/crio-df62ba4b847387da9c9cf83ee5fa0dc79c8919cf67eb2aff0c0aa1939598e81e WatchSource:0}: Error finding container df62ba4b847387da9c9cf83ee5fa0dc79c8919cf67eb2aff0c0aa1939598e81e: Status 404 returned error can't find the container with id df62ba4b847387da9c9cf83ee5fa0dc79c8919cf67eb2aff0c0aa1939598e81e
Apr 19 12:33:13.920708 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:13.920659 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nb68t" event={"ID":"c83beaf5-2d24-4163-855a-f4c6d55b0311","Type":"ContainerStarted","Data":"df62ba4b847387da9c9cf83ee5fa0dc79c8919cf67eb2aff0c0aa1939598e81e"}
Apr 19 12:33:13.924156 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:13.923638 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf" event={"ID":"343439fa-d125-4243-ac43-c00e012201b9","Type":"ContainerStarted","Data":"1576119ee5e6cc346494340100b69be4c6ca3f37198512f9b610fe72385c6e47"}
Apr 19 12:33:13.924156 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:13.923853 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:33:13.941766 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:13.941381 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf" podStartSLOduration=141.94136501 podStartE2EDuration="2m21.94136501s" podCreationTimestamp="2026-04-19 12:30:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:33:13.940583109 +0000 UTC m=+162.094516587" watchObservedRunningTime="2026-04-19 12:33:13.94136501 +0000 UTC m=+162.095298487"
Apr 19 12:33:14.928050 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:14.927997 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bns5n" event={"ID":"e88a41fd-9d7d-457b-af51-169ee562d266","Type":"ContainerStarted","Data":"64318ef9f68ea7a71d0c70ce2ac00f10664b17d4edaa552619beaa4bf08bd854"}
Apr 19 12:33:14.942335 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:14.942273 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bns5n" podStartSLOduration=155.637353433 podStartE2EDuration="2m36.942259183s" podCreationTimestamp="2026-04-19 12:30:38 +0000 UTC" firstStartedPulling="2026-04-19 12:33:12.896754295 +0000 UTC m=+161.050687767" lastFinishedPulling="2026-04-19 12:33:14.201660051 +0000 UTC m=+162.355593517" observedRunningTime="2026-04-19 12:33:14.940747576 +0000 UTC m=+163.094681056" watchObservedRunningTime="2026-04-19 12:33:14.942259183 +0000 UTC m=+163.096192662"
Apr 19 12:33:15.932228 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:15.932189 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-29mkc" event={"ID":"f68c88a5-5b83-4fd0-92df-327974a7cc96","Type":"ContainerStarted","Data":"7387ff3c2f7a2a88921f773c8a20016dd4df7ea497c0852e34438f6f0a77417d"}
Apr 19 12:33:15.934213 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:15.934188 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nb68t" event={"ID":"c83beaf5-2d24-4163-855a-f4c6d55b0311","Type":"ContainerStarted","Data":"2078279ab0ff20e5c10f8c8ca945f025e0ecd49eef5bbf017bc7e20baf00acfb"}
Apr 19 12:33:15.934213 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:15.934216 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nb68t" event={"ID":"c83beaf5-2d24-4163-855a-f4c6d55b0311","Type":"ContainerStarted","Data":"37bf1002b10549519559871b0b959295c989756ebe4b7bdb4a4548fd58caf6d6"}
Apr 19 12:33:15.934406 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:15.934335 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-nb68t"
Apr 19 12:33:15.947135 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:15.947089 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-29mkc" podStartSLOduration=129.511761859 podStartE2EDuration="2m11.947075584s" podCreationTimestamp="2026-04-19 12:31:04 +0000 UTC" firstStartedPulling="2026-04-19 12:33:12.884491864 +0000 UTC m=+161.038425322" lastFinishedPulling="2026-04-19 12:33:15.319805588 +0000 UTC m=+163.473739047" observedRunningTime="2026-04-19 12:33:15.946148874 +0000 UTC m=+164.100082353" watchObservedRunningTime="2026-04-19 12:33:15.947075584 +0000 UTC m=+164.101009064"
Apr 19 12:33:15.966982 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:15.966921 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-nb68t" podStartSLOduration=129.78953959 podStartE2EDuration="2m11.966901308s" podCreationTimestamp="2026-04-19 12:31:04 +0000 UTC" firstStartedPulling="2026-04-19 12:33:13.138360437 +0000 UTC m=+161.292293910" lastFinishedPulling="2026-04-19 12:33:15.315722156 +0000 UTC m=+163.469655628" observedRunningTime="2026-04-19 12:33:15.965849309 +0000 UTC m=+164.119782789" watchObservedRunningTime="2026-04-19 12:33:15.966901308 +0000 UTC m=+164.120834790"
Apr 19 12:33:17.454175 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:17.454142 2567 scope.go:117] "RemoveContainer" containerID="a8dec56c21e3e1bad4030c2cb3cf16a14792114da2ba5ae1a200b8f9b1270148"
Apr 19 12:33:17.941365 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:17.941337 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnbtf_2b04d28c-3eb3-44e1-b431-d6b75f3850fe/console-operator/1.log"
Apr 19 12:33:17.941519 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:17.941425 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf" event={"ID":"2b04d28c-3eb3-44e1-b431-d6b75f3850fe","Type":"ContainerStarted","Data":"505d6c9c8d4d558e0fda57f00efa0a88e32df005b2fe0d1fbc33430afd7f5d1a"}
Apr 19 12:33:17.941766 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:17.941747 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf"
Apr 19 12:33:17.957320 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:17.957274 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf" podStartSLOduration=23.94775921 podStartE2EDuration="25.957262265s" podCreationTimestamp="2026-04-19 12:32:52 +0000 UTC" firstStartedPulling="2026-04-19 12:32:52.802393419 +0000 UTC m=+140.956326878" lastFinishedPulling="2026-04-19 12:32:54.811896472 +0000 UTC m=+142.965829933" observedRunningTime="2026-04-19 12:33:17.956477472 +0000 UTC m=+166.110410953" watchObservedRunningTime="2026-04-19 12:33:17.957262265 +0000 UTC m=+166.111195745"
Apr 19 12:33:18.016805 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:18.016776 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-lnbtf"
Apr 19 12:33:21.512538 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.512507 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-trxd7"]
Apr 19 12:33:21.515659 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.515640 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-trxd7"
Apr 19 12:33:21.518144 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.518126 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 19 12:33:21.518374 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.518125 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8lgzb\""
Apr 19 12:33:21.518456 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.518150 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 19 12:33:21.518456 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.518197 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 19 12:33:21.518562 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.518229 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 19 12:33:21.531565 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.531294 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-trxd7"]
Apr 19 12:33:21.544928 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.544901 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-rk698"]
Apr 19 12:33:21.551746 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.551720 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-rk698"
Apr 19 12:33:21.553341 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.553317 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 19 12:33:21.553512 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.553497 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-d8sj6\""
Apr 19 12:33:21.553589 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.553514 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 19 12:33:21.560582 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.560561 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-rk698"]
Apr 19 12:33:21.653030 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.653002 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b78d3726-337a-4dd9-9db1-1835392e8376-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-trxd7\" (UID: \"b78d3726-337a-4dd9-9db1-1835392e8376\") " pod="openshift-insights/insights-runtime-extractor-trxd7"
Apr 19 12:33:21.653223 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.653063 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqbwv\" (UniqueName: \"kubernetes.io/projected/b78d3726-337a-4dd9-9db1-1835392e8376-kube-api-access-hqbwv\") pod \"insights-runtime-extractor-trxd7\" (UID: \"b78d3726-337a-4dd9-9db1-1835392e8376\") " pod="openshift-insights/insights-runtime-extractor-trxd7"
Apr 19 12:33:21.653223 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.653086 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b78d3726-337a-4dd9-9db1-1835392e8376-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-trxd7\" (UID: \"b78d3726-337a-4dd9-9db1-1835392e8376\") " pod="openshift-insights/insights-runtime-extractor-trxd7"
Apr 19 12:33:21.653223 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.653140 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkv6d\" (UniqueName: \"kubernetes.io/projected/3df8278a-cafc-490d-ad79-bf55ce74b38e-kube-api-access-qkv6d\") pod \"downloads-6bcc868b7-rk698\" (UID: \"3df8278a-cafc-490d-ad79-bf55ce74b38e\") " pod="openshift-console/downloads-6bcc868b7-rk698"
Apr 19 12:33:21.653223 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.653205 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b78d3726-337a-4dd9-9db1-1835392e8376-crio-socket\") pod \"insights-runtime-extractor-trxd7\" (UID: \"b78d3726-337a-4dd9-9db1-1835392e8376\") " pod="openshift-insights/insights-runtime-extractor-trxd7"
Apr 19 12:33:21.653352 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.653238 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b78d3726-337a-4dd9-9db1-1835392e8376-data-volume\") pod \"insights-runtime-extractor-trxd7\" (UID: \"b78d3726-337a-4dd9-9db1-1835392e8376\") " pod="openshift-insights/insights-runtime-extractor-trxd7"
Apr 19 12:33:21.753808 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.753769 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b78d3726-337a-4dd9-9db1-1835392e8376-crio-socket\") pod \"insights-runtime-extractor-trxd7\" (UID: \"b78d3726-337a-4dd9-9db1-1835392e8376\") " pod="openshift-insights/insights-runtime-extractor-trxd7"
Apr 19 12:33:21.753808 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.753808 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b78d3726-337a-4dd9-9db1-1835392e8376-data-volume\") pod \"insights-runtime-extractor-trxd7\" (UID: \"b78d3726-337a-4dd9-9db1-1835392e8376\") " pod="openshift-insights/insights-runtime-extractor-trxd7"
Apr 19 12:33:21.754052 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.753832 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b78d3726-337a-4dd9-9db1-1835392e8376-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-trxd7\" (UID: \"b78d3726-337a-4dd9-9db1-1835392e8376\") " pod="openshift-insights/insights-runtime-extractor-trxd7"
Apr 19 12:33:21.754052 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.753901 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b78d3726-337a-4dd9-9db1-1835392e8376-crio-socket\") pod \"insights-runtime-extractor-trxd7\" (UID: \"b78d3726-337a-4dd9-9db1-1835392e8376\") " pod="openshift-insights/insights-runtime-extractor-trxd7"
Apr 19 12:33:21.754052 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.753957 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqbwv\" (UniqueName: \"kubernetes.io/projected/b78d3726-337a-4dd9-9db1-1835392e8376-kube-api-access-hqbwv\") pod \"insights-runtime-extractor-trxd7\" (UID: \"b78d3726-337a-4dd9-9db1-1835392e8376\") " pod="openshift-insights/insights-runtime-extractor-trxd7"
Apr 19 12:33:21.754052 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.754002 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b78d3726-337a-4dd9-9db1-1835392e8376-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-trxd7\" (UID: \"b78d3726-337a-4dd9-9db1-1835392e8376\") " pod="openshift-insights/insights-runtime-extractor-trxd7"
Apr 19 12:33:21.754294 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.754099 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkv6d\" (UniqueName: \"kubernetes.io/projected/3df8278a-cafc-490d-ad79-bf55ce74b38e-kube-api-access-qkv6d\") pod \"downloads-6bcc868b7-rk698\" (UID: \"3df8278a-cafc-490d-ad79-bf55ce74b38e\") " pod="openshift-console/downloads-6bcc868b7-rk698"
Apr 19 12:33:21.754294 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.754134 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b78d3726-337a-4dd9-9db1-1835392e8376-data-volume\") pod \"insights-runtime-extractor-trxd7\" (UID: \"b78d3726-337a-4dd9-9db1-1835392e8376\") " pod="openshift-insights/insights-runtime-extractor-trxd7"
Apr 19 12:33:21.754472 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.754450 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b78d3726-337a-4dd9-9db1-1835392e8376-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-trxd7\" (UID: \"b78d3726-337a-4dd9-9db1-1835392e8376\") " pod="openshift-insights/insights-runtime-extractor-trxd7"
Apr 19 12:33:21.756887 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.756868 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b78d3726-337a-4dd9-9db1-1835392e8376-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-trxd7\" (UID: \"b78d3726-337a-4dd9-9db1-1835392e8376\") " pod="openshift-insights/insights-runtime-extractor-trxd7"
Apr 19 12:33:21.761541 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.761517 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqbwv\" (UniqueName: \"kubernetes.io/projected/b78d3726-337a-4dd9-9db1-1835392e8376-kube-api-access-hqbwv\") pod \"insights-runtime-extractor-trxd7\" (UID: \"b78d3726-337a-4dd9-9db1-1835392e8376\") " pod="openshift-insights/insights-runtime-extractor-trxd7"
Apr 19 12:33:21.761816 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.761802 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkv6d\" (UniqueName: \"kubernetes.io/projected/3df8278a-cafc-490d-ad79-bf55ce74b38e-kube-api-access-qkv6d\") pod \"downloads-6bcc868b7-rk698\" (UID: \"3df8278a-cafc-490d-ad79-bf55ce74b38e\") " pod="openshift-console/downloads-6bcc868b7-rk698"
Apr 19 12:33:21.825908 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.825801 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-trxd7"
Apr 19 12:33:21.861193 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.860828 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-rk698"
Apr 19 12:33:21.956902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.956872 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-trxd7"]
Apr 19 12:33:21.958620 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:33:21.958596 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb78d3726_337a_4dd9_9db1_1835392e8376.slice/crio-a0081f7683227cb70525228c69d0cd5fe7b9c102bf5be0118f3ded1b6290ccbc WatchSource:0}: Error finding container a0081f7683227cb70525228c69d0cd5fe7b9c102bf5be0118f3ded1b6290ccbc: Status 404 returned error can't find the container with id a0081f7683227cb70525228c69d0cd5fe7b9c102bf5be0118f3ded1b6290ccbc
Apr 19 12:33:21.992833 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:21.992807 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-rk698"]
Apr 19 12:33:21.995659 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:33:21.995628 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3df8278a_cafc_490d_ad79_bf55ce74b38e.slice/crio-43f6085c551a711e7fb3595aa44a071f445bb1578d693f45816e564f5ab0b88f WatchSource:0}: Error finding container 43f6085c551a711e7fb3595aa44a071f445bb1578d693f45816e564f5ab0b88f: Status 404 returned error can't find the container with id 43f6085c551a711e7fb3595aa44a071f445bb1578d693f45816e564f5ab0b88f
Apr 19 12:33:22.958180 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:22.958129 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-trxd7" event={"ID":"b78d3726-337a-4dd9-9db1-1835392e8376","Type":"ContainerStarted","Data":"9025fb8f4a5b902051b1ad5f9056b2656919ff72d87bbfd73cc75834cc1e3751"}
Apr 19 12:33:22.958637 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:22.958192 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-trxd7" event={"ID":"b78d3726-337a-4dd9-9db1-1835392e8376","Type":"ContainerStarted","Data":"40d6da650b838e2fa05cec930e806eb30805f16fac6c56b60e6029a133ef9d86"}
Apr 19 12:33:22.958637 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:22.958209 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-trxd7" event={"ID":"b78d3726-337a-4dd9-9db1-1835392e8376","Type":"ContainerStarted","Data":"a0081f7683227cb70525228c69d0cd5fe7b9c102bf5be0118f3ded1b6290ccbc"}
Apr 19 12:33:22.959265 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:22.959241 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-rk698" event={"ID":"3df8278a-cafc-490d-ad79-bf55ce74b38e","Type":"ContainerStarted","Data":"43f6085c551a711e7fb3595aa44a071f445bb1578d693f45816e564f5ab0b88f"}
Apr 19 12:33:24.275395 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:24.275364 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/418c073b-0032-49b0-84ad-cf233dc4778b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6mzzf\" (UID: \"418c073b-0032-49b0-84ad-cf233dc4778b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mzzf"
Apr 19 12:33:24.278636 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:24.278600 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/418c073b-0032-49b0-84ad-cf233dc4778b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6mzzf\" (UID: \"418c073b-0032-49b0-84ad-cf233dc4778b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mzzf"
Apr 19 12:33:24.287348 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:24.287322 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mzzf"
Apr 19 12:33:24.454376 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:24.454345 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98bqr"
Apr 19 12:33:24.523177 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:24.523131 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-6mzzf"]
Apr 19 12:33:24.528906 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:33:24.528868 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod418c073b_0032_49b0_84ad_cf233dc4778b.slice/crio-9c648773675dec725bd0ac787b163814c040d9b50e02564698fb804ff6490d89 WatchSource:0}: Error finding container 9c648773675dec725bd0ac787b163814c040d9b50e02564698fb804ff6490d89: Status 404 returned error can't find the container with id 9c648773675dec725bd0ac787b163814c040d9b50e02564698fb804ff6490d89
Apr 19 12:33:24.967520 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:24.967473 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-trxd7" event={"ID":"b78d3726-337a-4dd9-9db1-1835392e8376","Type":"ContainerStarted","Data":"9a495817e671df279d3a3494a97534777c829c80f1fe62f8f0f1eb4db8556f4f"}
Apr 19 12:33:24.968692 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:24.968657 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mzzf" event={"ID":"418c073b-0032-49b0-84ad-cf233dc4778b","Type":"ContainerStarted","Data":"9c648773675dec725bd0ac787b163814c040d9b50e02564698fb804ff6490d89"}
Apr 19 12:33:24.988117 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:24.988056 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-trxd7" podStartSLOduration=1.566186147 podStartE2EDuration="3.988016474s" podCreationTimestamp="2026-04-19 12:33:21 +0000 UTC" firstStartedPulling="2026-04-19 12:33:22.021688647 +0000 UTC m=+170.175622105" lastFinishedPulling="2026-04-19 12:33:24.443518957 +0000 UTC m=+172.597452432" observedRunningTime="2026-04-19 12:33:24.986412976 +0000 UTC m=+173.140346479" watchObservedRunningTime="2026-04-19 12:33:24.988016474 +0000 UTC m=+173.141949957"
Apr 19 12:33:25.939454 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:25.939081 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-nb68t"
Apr 19 12:33:26.977236 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:26.977146 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mzzf" event={"ID":"418c073b-0032-49b0-84ad-cf233dc4778b","Type":"ContainerStarted","Data":"9732f8916d16485df9e75f5cc4be904bfd64bbddce06348bd95aa2ce5786db9e"}
Apr 19 12:33:26.995708 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:26.995648 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mzzf" podStartSLOduration=32.92400307 podStartE2EDuration="34.99562997s" podCreationTimestamp="2026-04-19 12:32:52 +0000 UTC" firstStartedPulling="2026-04-19 12:33:24.531507485 +0000 UTC m=+172.685440943" lastFinishedPulling="2026-04-19 12:33:26.60313437 +0000 UTC m=+174.757067843" observedRunningTime="2026-04-19 12:33:26.994047299 +0000 UTC m=+175.147980803" watchObservedRunningTime="2026-04-19 12:33:26.99562997 +0000 UTC m=+175.149563454"
Apr 19 12:33:27.169087 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:27.169045 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zwq6n"]
Apr 19 12:33:27.172591 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:27.172569 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zwq6n"
Apr 19 12:33:27.174442 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:27.174418 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 19 12:33:27.174736 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:27.174718 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-nbrsw\""
Apr 19 12:33:27.183401 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:27.183361 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zwq6n"]
Apr 19 12:33:27.303461 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:27.303375 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/39ea9fea-2574-42e5-ad5a-9622c776c3f7-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-zwq6n\" (UID: \"39ea9fea-2574-42e5-ad5a-9622c776c3f7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zwq6n"
Apr 19 12:33:27.404749 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:27.404703 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/39ea9fea-2574-42e5-ad5a-9622c776c3f7-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-zwq6n\" (UID: \"39ea9fea-2574-42e5-ad5a-9622c776c3f7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zwq6n"
Apr 19 12:33:27.404944 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:33:27.404875 2567 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 19 12:33:27.405006 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:33:27.404961 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39ea9fea-2574-42e5-ad5a-9622c776c3f7-tls-certificates podName:39ea9fea-2574-42e5-ad5a-9622c776c3f7 nodeName:}" failed. No retries permitted until 2026-04-19 12:33:27.904938776 +0000 UTC m=+176.058872234 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/39ea9fea-2574-42e5-ad5a-9622c776c3f7-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-zwq6n" (UID: "39ea9fea-2574-42e5-ad5a-9622c776c3f7") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 19 12:33:27.909224 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:27.909185 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/39ea9fea-2574-42e5-ad5a-9622c776c3f7-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-zwq6n\" (UID: \"39ea9fea-2574-42e5-ad5a-9622c776c3f7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zwq6n"
Apr 19 12:33:27.911865 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:27.911842 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/39ea9fea-2574-42e5-ad5a-9622c776c3f7-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-zwq6n\" (UID: \"39ea9fea-2574-42e5-ad5a-9622c776c3f7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zwq6n"
Apr 19 12:33:28.085024 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:28.084989 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zwq6n" Apr 19 12:33:28.214258 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:28.214225 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zwq6n"] Apr 19 12:33:28.217725 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:33:28.217691 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39ea9fea_2574_42e5_ad5a_9622c776c3f7.slice/crio-2ac14ee64d61341f56daa52889d73ea24830bea30dc9a395a8e1fb233072db85 WatchSource:0}: Error finding container 2ac14ee64d61341f56daa52889d73ea24830bea30dc9a395a8e1fb233072db85: Status 404 returned error can't find the container with id 2ac14ee64d61341f56daa52889d73ea24830bea30dc9a395a8e1fb233072db85 Apr 19 12:33:28.985092 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:28.985051 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zwq6n" event={"ID":"39ea9fea-2574-42e5-ad5a-9622c776c3f7","Type":"ContainerStarted","Data":"2ac14ee64d61341f56daa52889d73ea24830bea30dc9a395a8e1fb233072db85"} Apr 19 12:33:29.989014 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:29.988976 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zwq6n" event={"ID":"39ea9fea-2574-42e5-ad5a-9622c776c3f7","Type":"ContainerStarted","Data":"cc0ad048518e53a6031a26f58d140c194cdfe68e7029e255ee6df6e6ebda2a7c"} Apr 19 12:33:29.989495 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:29.989188 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zwq6n" Apr 19 12:33:29.994639 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:29.994611 2567 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zwq6n" Apr 19 12:33:30.004317 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:30.004203 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zwq6n" podStartSLOduration=1.679830077 podStartE2EDuration="3.004183589s" podCreationTimestamp="2026-04-19 12:33:27 +0000 UTC" firstStartedPulling="2026-04-19 12:33:28.219836918 +0000 UTC m=+176.373770377" lastFinishedPulling="2026-04-19 12:33:29.544190426 +0000 UTC m=+177.698123889" observedRunningTime="2026-04-19 12:33:30.002492408 +0000 UTC m=+178.156425894" watchObservedRunningTime="2026-04-19 12:33:30.004183589 +0000 UTC m=+178.158117068" Apr 19 12:33:30.242334 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:30.242247 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-k5dpj"] Apr 19 12:33:30.245917 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:30.245894 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-k5dpj" Apr 19 12:33:30.248182 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:30.248137 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 19 12:33:30.248182 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:30.248145 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 19 12:33:30.248372 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:30.248176 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 19 12:33:30.248372 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:30.248175 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-fnvvz\"" Apr 19 12:33:30.252786 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:30.252743 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-k5dpj"] Apr 19 12:33:30.330114 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:30.330078 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/bdd21a10-2e10-4f33-af8b-8504779ff325-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-k5dpj\" (UID: \"bdd21a10-2e10-4f33-af8b-8504779ff325\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k5dpj" Apr 19 12:33:30.330114 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:30.330119 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djntl\" (UniqueName: \"kubernetes.io/projected/bdd21a10-2e10-4f33-af8b-8504779ff325-kube-api-access-djntl\") pod \"prometheus-operator-5676c8c784-k5dpj\" (UID: 
\"bdd21a10-2e10-4f33-af8b-8504779ff325\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k5dpj" Apr 19 12:33:30.330407 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:30.330178 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bdd21a10-2e10-4f33-af8b-8504779ff325-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-k5dpj\" (UID: \"bdd21a10-2e10-4f33-af8b-8504779ff325\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k5dpj" Apr 19 12:33:30.330407 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:30.330286 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bdd21a10-2e10-4f33-af8b-8504779ff325-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-k5dpj\" (UID: \"bdd21a10-2e10-4f33-af8b-8504779ff325\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k5dpj" Apr 19 12:33:30.431071 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:30.431028 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bdd21a10-2e10-4f33-af8b-8504779ff325-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-k5dpj\" (UID: \"bdd21a10-2e10-4f33-af8b-8504779ff325\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k5dpj" Apr 19 12:33:30.431302 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:30.431083 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bdd21a10-2e10-4f33-af8b-8504779ff325-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-k5dpj\" (UID: \"bdd21a10-2e10-4f33-af8b-8504779ff325\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k5dpj" 
Apr 19 12:33:30.431302 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:30.431148 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/bdd21a10-2e10-4f33-af8b-8504779ff325-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-k5dpj\" (UID: \"bdd21a10-2e10-4f33-af8b-8504779ff325\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k5dpj" Apr 19 12:33:30.431302 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:30.431190 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djntl\" (UniqueName: \"kubernetes.io/projected/bdd21a10-2e10-4f33-af8b-8504779ff325-kube-api-access-djntl\") pod \"prometheus-operator-5676c8c784-k5dpj\" (UID: \"bdd21a10-2e10-4f33-af8b-8504779ff325\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k5dpj" Apr 19 12:33:30.431302 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:33:30.431281 2567 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 19 12:33:30.431516 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:33:30.431362 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdd21a10-2e10-4f33-af8b-8504779ff325-prometheus-operator-tls podName:bdd21a10-2e10-4f33-af8b-8504779ff325 nodeName:}" failed. No retries permitted until 2026-04-19 12:33:30.931340195 +0000 UTC m=+179.085273653 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/bdd21a10-2e10-4f33-af8b-8504779ff325-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-k5dpj" (UID: "bdd21a10-2e10-4f33-af8b-8504779ff325") : secret "prometheus-operator-tls" not found Apr 19 12:33:30.431875 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:30.431852 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bdd21a10-2e10-4f33-af8b-8504779ff325-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-k5dpj\" (UID: \"bdd21a10-2e10-4f33-af8b-8504779ff325\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k5dpj" Apr 19 12:33:30.433580 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:30.433553 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bdd21a10-2e10-4f33-af8b-8504779ff325-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-k5dpj\" (UID: \"bdd21a10-2e10-4f33-af8b-8504779ff325\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k5dpj" Apr 19 12:33:30.439417 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:30.439394 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djntl\" (UniqueName: \"kubernetes.io/projected/bdd21a10-2e10-4f33-af8b-8504779ff325-kube-api-access-djntl\") pod \"prometheus-operator-5676c8c784-k5dpj\" (UID: \"bdd21a10-2e10-4f33-af8b-8504779ff325\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k5dpj" Apr 19 12:33:30.935938 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:30.935900 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/bdd21a10-2e10-4f33-af8b-8504779ff325-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-k5dpj\" (UID: 
\"bdd21a10-2e10-4f33-af8b-8504779ff325\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k5dpj" Apr 19 12:33:30.938607 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:30.938578 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/bdd21a10-2e10-4f33-af8b-8504779ff325-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-k5dpj\" (UID: \"bdd21a10-2e10-4f33-af8b-8504779ff325\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-k5dpj" Apr 19 12:33:31.159309 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:31.159273 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-k5dpj" Apr 19 12:33:31.285237 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:31.285205 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-k5dpj"] Apr 19 12:33:31.288429 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:33:31.288396 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdd21a10_2e10_4f33_af8b_8504779ff325.slice/crio-fae7b3b71d5f11e47c15e3e2f36bf44796624f06f9140fdb36066096264c15bf WatchSource:0}: Error finding container fae7b3b71d5f11e47c15e3e2f36bf44796624f06f9140fdb36066096264c15bf: Status 404 returned error can't find the container with id fae7b3b71d5f11e47c15e3e2f36bf44796624f06f9140fdb36066096264c15bf Apr 19 12:33:31.995540 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:31.995505 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-k5dpj" event={"ID":"bdd21a10-2e10-4f33-af8b-8504779ff325","Type":"ContainerStarted","Data":"fae7b3b71d5f11e47c15e3e2f36bf44796624f06f9140fdb36066096264c15bf"} Apr 19 12:33:32.116650 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.116612 2567 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-console/console-56cfb9f996-2cn9l"] Apr 19 12:33:32.120092 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.120071 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56cfb9f996-2cn9l" Apr 19 12:33:32.122835 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.122269 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 19 12:33:32.122835 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.122339 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 19 12:33:32.122835 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.122406 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 19 12:33:32.122835 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.122346 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 19 12:33:32.123101 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.122892 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-8nsl6\"" Apr 19 12:33:32.123101 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.122910 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 19 12:33:32.127821 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.127784 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56cfb9f996-2cn9l"] Apr 19 12:33:32.249315 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.249227 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-console-config\") pod 
\"console-56cfb9f996-2cn9l\" (UID: \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\") " pod="openshift-console/console-56cfb9f996-2cn9l" Apr 19 12:33:32.249315 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.249299 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-oauth-serving-cert\") pod \"console-56cfb9f996-2cn9l\" (UID: \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\") " pod="openshift-console/console-56cfb9f996-2cn9l" Apr 19 12:33:32.249803 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.249341 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-console-oauth-config\") pod \"console-56cfb9f996-2cn9l\" (UID: \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\") " pod="openshift-console/console-56cfb9f996-2cn9l" Apr 19 12:33:32.249803 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.249441 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-console-serving-cert\") pod \"console-56cfb9f996-2cn9l\" (UID: \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\") " pod="openshift-console/console-56cfb9f996-2cn9l" Apr 19 12:33:32.249803 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.249544 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt7jp\" (UniqueName: \"kubernetes.io/projected/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-kube-api-access-pt7jp\") pod \"console-56cfb9f996-2cn9l\" (UID: \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\") " pod="openshift-console/console-56cfb9f996-2cn9l" Apr 19 12:33:32.249803 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.249579 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-service-ca\") pod \"console-56cfb9f996-2cn9l\" (UID: \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\") " pod="openshift-console/console-56cfb9f996-2cn9l" Apr 19 12:33:32.350086 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.350047 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pt7jp\" (UniqueName: \"kubernetes.io/projected/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-kube-api-access-pt7jp\") pod \"console-56cfb9f996-2cn9l\" (UID: \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\") " pod="openshift-console/console-56cfb9f996-2cn9l" Apr 19 12:33:32.350302 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.350221 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-service-ca\") pod \"console-56cfb9f996-2cn9l\" (UID: \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\") " pod="openshift-console/console-56cfb9f996-2cn9l" Apr 19 12:33:32.350302 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.350265 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-console-config\") pod \"console-56cfb9f996-2cn9l\" (UID: \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\") " pod="openshift-console/console-56cfb9f996-2cn9l" Apr 19 12:33:32.350423 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.350320 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-oauth-serving-cert\") pod \"console-56cfb9f996-2cn9l\" (UID: \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\") " pod="openshift-console/console-56cfb9f996-2cn9l" Apr 19 12:33:32.350423 ip-10-0-142-55 
kubenswrapper[2567]: I0419 12:33:32.350351 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-console-oauth-config\") pod \"console-56cfb9f996-2cn9l\" (UID: \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\") " pod="openshift-console/console-56cfb9f996-2cn9l" Apr 19 12:33:32.350423 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.350377 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-console-serving-cert\") pod \"console-56cfb9f996-2cn9l\" (UID: \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\") " pod="openshift-console/console-56cfb9f996-2cn9l" Apr 19 12:33:32.352320 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.352288 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 19 12:33:32.352450 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.352380 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 19 12:33:32.352450 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.352409 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 19 12:33:32.352579 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.352457 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 19 12:33:32.352579 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.352504 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 19 12:33:32.358388 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.358367 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt7jp\" 
(UniqueName: \"kubernetes.io/projected/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-kube-api-access-pt7jp\") pod \"console-56cfb9f996-2cn9l\" (UID: \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\") " pod="openshift-console/console-56cfb9f996-2cn9l" Apr 19 12:33:32.361611 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.361589 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-service-ca\") pod \"console-56cfb9f996-2cn9l\" (UID: \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\") " pod="openshift-console/console-56cfb9f996-2cn9l" Apr 19 12:33:32.361760 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.361664 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-oauth-serving-cert\") pod \"console-56cfb9f996-2cn9l\" (UID: \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\") " pod="openshift-console/console-56cfb9f996-2cn9l" Apr 19 12:33:32.361824 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.361676 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-console-config\") pod \"console-56cfb9f996-2cn9l\" (UID: \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\") " pod="openshift-console/console-56cfb9f996-2cn9l" Apr 19 12:33:32.363386 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.363348 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-console-oauth-config\") pod \"console-56cfb9f996-2cn9l\" (UID: \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\") " pod="openshift-console/console-56cfb9f996-2cn9l" Apr 19 12:33:32.363992 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.363974 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-console-serving-cert\") pod \"console-56cfb9f996-2cn9l\" (UID: \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\") " pod="openshift-console/console-56cfb9f996-2cn9l" Apr 19 12:33:32.433224 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.433194 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-8nsl6\"" Apr 19 12:33:32.441716 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.441681 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56cfb9f996-2cn9l" Apr 19 12:33:32.708662 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.708623 2567 patch_prober.go:28] interesting pod/image-registry-5cb67c5f8f-b7rwf container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 19 12:33:32.708852 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:32.708686 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf" podUID="343439fa-d125-4243-ac43-c00e012201b9" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 19 12:33:34.932999 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:34.932968 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf" Apr 19 12:33:38.689399 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:38.689369 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56cfb9f996-2cn9l"] Apr 19 12:33:38.704247 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:33:38.704205 2567 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e18b3f4_af11_4f42_9d62_f9c0e8e36606.slice/crio-10e094424fbf7bc7f5ebbda5be4abea5366008528a7afd9df4bda79b8ff7ea0b WatchSource:0}: Error finding container 10e094424fbf7bc7f5ebbda5be4abea5366008528a7afd9df4bda79b8ff7ea0b: Status 404 returned error can't find the container with id 10e094424fbf7bc7f5ebbda5be4abea5366008528a7afd9df4bda79b8ff7ea0b
Apr 19 12:33:39.016483 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:39.016443 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-rk698" event={"ID":"3df8278a-cafc-490d-ad79-bf55ce74b38e","Type":"ContainerStarted","Data":"76b25cb63226025c40bc9ef93115b58a894878b456fb1b4404bc749ce5dc5424"}
Apr 19 12:33:39.016674 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:39.016630 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-rk698"
Apr 19 12:33:39.018042 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:39.017994 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56cfb9f996-2cn9l" event={"ID":"5e18b3f4-af11-4f42-9d62-f9c0e8e36606","Type":"ContainerStarted","Data":"10e094424fbf7bc7f5ebbda5be4abea5366008528a7afd9df4bda79b8ff7ea0b"}
Apr 19 12:33:39.028622 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:39.028564 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-rk698"
Apr 19 12:33:39.036117 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:39.036060 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-rk698" podStartSLOduration=1.37092324 podStartE2EDuration="18.036043562s" podCreationTimestamp="2026-04-19 12:33:21 +0000 UTC" firstStartedPulling="2026-04-19 12:33:21.997849883 +0000 UTC m=+170.151783341" lastFinishedPulling="2026-04-19 12:33:38.662970172 +0000 UTC m=+186.816903663" observedRunningTime="2026-04-19 12:33:39.03475871 +0000 UTC m=+187.188692191" watchObservedRunningTime="2026-04-19 12:33:39.036043562 +0000 UTC m=+187.189977037"
Apr 19 12:33:40.031186 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:40.030743 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-k5dpj" event={"ID":"bdd21a10-2e10-4f33-af8b-8504779ff325","Type":"ContainerStarted","Data":"b010b97362a12c20e172da70e35f8aef23d7d9b84aaddf2fa07e8587ba05fa2a"}
Apr 19 12:33:40.031186 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:40.030791 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-k5dpj" event={"ID":"bdd21a10-2e10-4f33-af8b-8504779ff325","Type":"ContainerStarted","Data":"265cdfdf24eff07b1d67a66e791ccdbed0ce347f357ea060d0da3f3a536694ae"}
Apr 19 12:33:40.048292 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:40.048234 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-k5dpj" podStartSLOduration=1.586097906 podStartE2EDuration="10.048214125s" podCreationTimestamp="2026-04-19 12:33:30 +0000 UTC" firstStartedPulling="2026-04-19 12:33:31.290747289 +0000 UTC m=+179.444680753" lastFinishedPulling="2026-04-19 12:33:39.75286351 +0000 UTC m=+187.906796972" observedRunningTime="2026-04-19 12:33:40.046995832 +0000 UTC m=+188.200929312" watchObservedRunningTime="2026-04-19 12:33:40.048214125 +0000 UTC m=+188.202147606"
Apr 19 12:33:41.582858 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.582177 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-pkwpt"]
Apr 19 12:33:41.620568 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.620515 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-pkwpt"]
Apr 19 12:33:41.620755 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.620703 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pkwpt"
Apr 19 12:33:41.623001 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.622980 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-z5fb9\""
Apr 19 12:33:41.625732 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.625714 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-gbc2d"]
Apr 19 12:33:41.627083 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.627060 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 19 12:33:41.627321 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.627306 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 19 12:33:41.637042 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.637019 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/07cd686e-db42-41d2-8704-20287a4d6ba5-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-pkwpt\" (UID: \"07cd686e-db42-41d2-8704-20287a4d6ba5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pkwpt"
Apr 19 12:33:41.637141 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.637058 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/07cd686e-db42-41d2-8704-20287a4d6ba5-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-pkwpt\" (UID: \"07cd686e-db42-41d2-8704-20287a4d6ba5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pkwpt"
Apr 19 12:33:41.637141 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.637080 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ncqg\" (UniqueName: \"kubernetes.io/projected/07cd686e-db42-41d2-8704-20287a4d6ba5-kube-api-access-7ncqg\") pod \"openshift-state-metrics-9d44df66c-pkwpt\" (UID: \"07cd686e-db42-41d2-8704-20287a4d6ba5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pkwpt"
Apr 19 12:33:41.637141 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.637098 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/07cd686e-db42-41d2-8704-20287a4d6ba5-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-pkwpt\" (UID: \"07cd686e-db42-41d2-8704-20287a4d6ba5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pkwpt"
Apr 19 12:33:41.645535 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.645353 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.646220 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.646198 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-bjprv"]
Apr 19 12:33:41.648384 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.648360 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 19 12:33:41.648779 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.648600 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 19 12:33:41.648779 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.648631 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-rhpt4\""
Apr 19 12:33:41.648920 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.648850 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 19 12:33:41.676759 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.676725 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-bjprv"]
Apr 19 12:33:41.677065 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.677053 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv"
Apr 19 12:33:41.682811 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.681798 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-6pdq7\""
Apr 19 12:33:41.682811 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.682551 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 19 12:33:41.683412 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.683391 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 19 12:33:41.683487 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.683433 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 19 12:33:41.738972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.737790 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-node-exporter-textfile\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.738972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.737845 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-node-exporter-accelerators-collector-config\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.738972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.737875 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12d3e553-2bcd-4923-9074-65406b6c1644-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-bjprv\" (UID: \"12d3e553-2bcd-4923-9074-65406b6c1644\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv"
Apr 19 12:33:41.738972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.737908 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.738972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.737946 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-node-exporter-tls\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.738972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.737969 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-metrics-client-ca\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.738972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.737996 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/12d3e553-2bcd-4923-9074-65406b6c1644-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-bjprv\" (UID: \"12d3e553-2bcd-4923-9074-65406b6c1644\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv"
Apr 19 12:33:41.738972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.738120 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-sys\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.738972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.738181 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/12d3e553-2bcd-4923-9074-65406b6c1644-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-bjprv\" (UID: \"12d3e553-2bcd-4923-9074-65406b6c1644\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv"
Apr 19 12:33:41.738972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.738234 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-root\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.738972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.738328 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf6gg\" (UniqueName: \"kubernetes.io/projected/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-kube-api-access-mf6gg\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.738972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.738363 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-node-exporter-wtmp\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.738972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.738398 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/12d3e553-2bcd-4923-9074-65406b6c1644-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-bjprv\" (UID: \"12d3e553-2bcd-4923-9074-65406b6c1644\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv"
Apr 19 12:33:41.738972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.738438 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/07cd686e-db42-41d2-8704-20287a4d6ba5-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-pkwpt\" (UID: \"07cd686e-db42-41d2-8704-20287a4d6ba5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pkwpt"
Apr 19 12:33:41.738972 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:33:41.738534 2567 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 19 12:33:41.739862 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:33:41.738597 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07cd686e-db42-41d2-8704-20287a4d6ba5-openshift-state-metrics-tls podName:07cd686e-db42-41d2-8704-20287a4d6ba5 nodeName:}" failed. No retries permitted until 2026-04-19 12:33:42.238577537 +0000 UTC m=+190.392510996 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/07cd686e-db42-41d2-8704-20287a4d6ba5-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-pkwpt" (UID: "07cd686e-db42-41d2-8704-20287a4d6ba5") : secret "openshift-state-metrics-tls" not found
Apr 19 12:33:41.739862 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.738893 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9h5b\" (UniqueName: \"kubernetes.io/projected/12d3e553-2bcd-4923-9074-65406b6c1644-kube-api-access-j9h5b\") pod \"kube-state-metrics-69db897b98-bjprv\" (UID: \"12d3e553-2bcd-4923-9074-65406b6c1644\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv"
Apr 19 12:33:41.739862 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.738949 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/07cd686e-db42-41d2-8704-20287a4d6ba5-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-pkwpt\" (UID: \"07cd686e-db42-41d2-8704-20287a4d6ba5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pkwpt"
Apr 19 12:33:41.739862 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.738980 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/12d3e553-2bcd-4923-9074-65406b6c1644-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-bjprv\" (UID: \"12d3e553-2bcd-4923-9074-65406b6c1644\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv"
Apr 19 12:33:41.739862 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.739022 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ncqg\" (UniqueName: \"kubernetes.io/projected/07cd686e-db42-41d2-8704-20287a4d6ba5-kube-api-access-7ncqg\") pod \"openshift-state-metrics-9d44df66c-pkwpt\" (UID: \"07cd686e-db42-41d2-8704-20287a4d6ba5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pkwpt"
Apr 19 12:33:41.739862 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.739054 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/07cd686e-db42-41d2-8704-20287a4d6ba5-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-pkwpt\" (UID: \"07cd686e-db42-41d2-8704-20287a4d6ba5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pkwpt"
Apr 19 12:33:41.739862 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.739738 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/07cd686e-db42-41d2-8704-20287a4d6ba5-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-pkwpt\" (UID: \"07cd686e-db42-41d2-8704-20287a4d6ba5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pkwpt"
Apr 19 12:33:41.743694 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.743665 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/07cd686e-db42-41d2-8704-20287a4d6ba5-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-pkwpt\" (UID: \"07cd686e-db42-41d2-8704-20287a4d6ba5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pkwpt"
Apr 19 12:33:41.754787 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.754732 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ncqg\" (UniqueName: \"kubernetes.io/projected/07cd686e-db42-41d2-8704-20287a4d6ba5-kube-api-access-7ncqg\") pod \"openshift-state-metrics-9d44df66c-pkwpt\" (UID: \"07cd686e-db42-41d2-8704-20287a4d6ba5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pkwpt"
Apr 19 12:33:41.843259 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.841607 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-sys\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.843259 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.841660 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/12d3e553-2bcd-4923-9074-65406b6c1644-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-bjprv\" (UID: \"12d3e553-2bcd-4923-9074-65406b6c1644\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv"
Apr 19 12:33:41.843259 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.841687 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-root\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.843259 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.841729 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mf6gg\" (UniqueName: \"kubernetes.io/projected/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-kube-api-access-mf6gg\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.843259 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.841756 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-node-exporter-wtmp\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.843259 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.841786 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/12d3e553-2bcd-4923-9074-65406b6c1644-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-bjprv\" (UID: \"12d3e553-2bcd-4923-9074-65406b6c1644\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv"
Apr 19 12:33:41.843259 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.841826 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9h5b\" (UniqueName: \"kubernetes.io/projected/12d3e553-2bcd-4923-9074-65406b6c1644-kube-api-access-j9h5b\") pod \"kube-state-metrics-69db897b98-bjprv\" (UID: \"12d3e553-2bcd-4923-9074-65406b6c1644\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv"
Apr 19 12:33:41.843259 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.841866 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/12d3e553-2bcd-4923-9074-65406b6c1644-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-bjprv\" (UID: \"12d3e553-2bcd-4923-9074-65406b6c1644\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv"
Apr 19 12:33:41.843259 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.841913 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-node-exporter-textfile\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.843259 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.841943 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-node-exporter-accelerators-collector-config\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.843259 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.841968 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12d3e553-2bcd-4923-9074-65406b6c1644-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-bjprv\" (UID: \"12d3e553-2bcd-4923-9074-65406b6c1644\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv"
Apr 19 12:33:41.843259 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.842003 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.843259 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.842041 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-node-exporter-tls\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.843259 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.842064 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-metrics-client-ca\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.843259 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.842091 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/12d3e553-2bcd-4923-9074-65406b6c1644-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-bjprv\" (UID: \"12d3e553-2bcd-4923-9074-65406b6c1644\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv"
Apr 19 12:33:41.843259 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:33:41.842266 2567 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 19 12:33:41.844308 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:33:41.842348 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12d3e553-2bcd-4923-9074-65406b6c1644-kube-state-metrics-tls podName:12d3e553-2bcd-4923-9074-65406b6c1644 nodeName:}" failed. No retries permitted until 2026-04-19 12:33:42.342327081 +0000 UTC m=+190.496260545 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/12d3e553-2bcd-4923-9074-65406b6c1644-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-bjprv" (UID: "12d3e553-2bcd-4923-9074-65406b6c1644") : secret "kube-state-metrics-tls" not found
Apr 19 12:33:41.844308 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.842590 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-sys\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.844308 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.843592 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-root\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.844308 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:33:41.843756 2567 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 19 12:33:41.844308 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:33:41.843810 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-node-exporter-tls podName:35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6 nodeName:}" failed. No retries permitted until 2026-04-19 12:33:42.343792872 +0000 UTC m=+190.497726342 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-node-exporter-tls") pod "node-exporter-gbc2d" (UID: "35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6") : secret "node-exporter-tls" not found
Apr 19 12:33:41.844771 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.844608 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-node-exporter-wtmp\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.844771 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.844734 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-node-exporter-textfile\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.845414 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.845386 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-node-exporter-accelerators-collector-config\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.845818 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.845795 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-metrics-client-ca\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.859020 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.847255 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/12d3e553-2bcd-4923-9074-65406b6c1644-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-bjprv\" (UID: \"12d3e553-2bcd-4923-9074-65406b6c1644\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv"
Apr 19 12:33:41.859020 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.847572 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/12d3e553-2bcd-4923-9074-65406b6c1644-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-bjprv\" (UID: \"12d3e553-2bcd-4923-9074-65406b6c1644\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv"
Apr 19 12:33:41.859020 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.847980 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.859020 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.855656 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12d3e553-2bcd-4923-9074-65406b6c1644-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-bjprv\" (UID: \"12d3e553-2bcd-4923-9074-65406b6c1644\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv"
Apr 19 12:33:41.859020 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.858458 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9h5b\" (UniqueName: \"kubernetes.io/projected/12d3e553-2bcd-4923-9074-65406b6c1644-kube-api-access-j9h5b\") pod \"kube-state-metrics-69db897b98-bjprv\" (UID: \"12d3e553-2bcd-4923-9074-65406b6c1644\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv"
Apr 19 12:33:41.859020 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.858875 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf6gg\" (UniqueName: \"kubernetes.io/projected/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-kube-api-access-mf6gg\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:41.867113 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:41.862699 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/12d3e553-2bcd-4923-9074-65406b6c1644-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-bjprv\" (UID: \"12d3e553-2bcd-4923-9074-65406b6c1644\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv"
Apr 19 12:33:42.246132 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.246029 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/07cd686e-db42-41d2-8704-20287a4d6ba5-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-pkwpt\" (UID: \"07cd686e-db42-41d2-8704-20287a4d6ba5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pkwpt"
Apr 19 12:33:42.250603 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.250569 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/07cd686e-db42-41d2-8704-20287a4d6ba5-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-pkwpt\" (UID: \"07cd686e-db42-41d2-8704-20287a4d6ba5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pkwpt"
Apr 19 12:33:42.347174 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.347123 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-node-exporter-tls\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:42.347348 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.347193 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/12d3e553-2bcd-4923-9074-65406b6c1644-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-bjprv\" (UID: \"12d3e553-2bcd-4923-9074-65406b6c1644\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv"
Apr 19 12:33:42.350811 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.350755 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/12d3e553-2bcd-4923-9074-65406b6c1644-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-bjprv\" (UID: \"12d3e553-2bcd-4923-9074-65406b6c1644\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv"
Apr 19 12:33:42.350986 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.350862 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6-node-exporter-tls\") pod \"node-exporter-gbc2d\" (UID: \"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6\") " pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:42.538507 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.538431 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pkwpt"
Apr 19 12:33:42.561384 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.560241 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-gbc2d"
Apr 19 12:33:42.595560 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.595520 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv"
Apr 19 12:33:42.686371 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.686339 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 19 12:33:42.723250 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.721725 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:33:42.724231 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.724188 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 19 12:33:42.727821 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.726069 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 19 12:33:42.727821 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.726328 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 19 12:33:42.727821 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.726601 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-2lbj7\""
Apr 19 12:33:42.727821 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.726771 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 19 12:33:42.727821 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.726933 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 19 12:33:42.727821 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.726079 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 19 12:33:42.727821 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.727193 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 19 12:33:42.727821 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.727305 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 19 12:33:42.727821 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.727195 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 19 12:33:42.727821 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.727518 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 19 12:33:42.751815 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.751727 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:33:42.752236 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.751780 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName:
\"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-config-volume\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.752941 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.752376 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/effa5082-f6a7-4251-9183-c99891f7e9e3-config-out\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.752941 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.752425 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/effa5082-f6a7-4251-9183-c99891f7e9e3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.752941 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.752458 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.752941 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.752493 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 
19 12:33:42.752941 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.752527 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-web-config\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.752941 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.752550 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbjmg\" (UniqueName: \"kubernetes.io/projected/effa5082-f6a7-4251-9183-c99891f7e9e3-kube-api-access-wbjmg\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.752941 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.752616 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.752941 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.752655 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.752941 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.752713 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/effa5082-f6a7-4251-9183-c99891f7e9e3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.752941 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.752738 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/effa5082-f6a7-4251-9183-c99891f7e9e3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.752941 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.752778 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/effa5082-f6a7-4251-9183-c99891f7e9e3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.854820 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.853808 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.854820 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.853890 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/effa5082-f6a7-4251-9183-c99891f7e9e3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.854820 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.853924 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/effa5082-f6a7-4251-9183-c99891f7e9e3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.854820 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.853965 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/effa5082-f6a7-4251-9183-c99891f7e9e3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.854820 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.853994 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.854820 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.854031 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-config-volume\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.854820 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.854055 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/effa5082-f6a7-4251-9183-c99891f7e9e3-config-out\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.854820 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.854092 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/effa5082-f6a7-4251-9183-c99891f7e9e3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.854820 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.854123 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.854820 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.854156 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.854820 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.854211 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-web-config\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.854820 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.854238 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbjmg\" (UniqueName: \"kubernetes.io/projected/effa5082-f6a7-4251-9183-c99891f7e9e3-kube-api-access-wbjmg\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.854820 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.854311 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.858050 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.857053 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/effa5082-f6a7-4251-9183-c99891f7e9e3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.858050 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:33:42.857185 2567 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 19 12:33:42.858050 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:33:42.857251 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-main-tls podName:effa5082-f6a7-4251-9183-c99891f7e9e3 nodeName:}" failed. No retries permitted until 2026-04-19 12:33:43.357229965 +0000 UTC m=+191.511163424 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "effa5082-f6a7-4251-9183-c99891f7e9e3") : secret "alertmanager-main-tls" not found Apr 19 12:33:42.858050 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.857721 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/effa5082-f6a7-4251-9183-c99891f7e9e3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.863147 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.858961 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.863147 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.861155 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-bjprv"] Apr 19 12:33:42.863738 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.863679 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/effa5082-f6a7-4251-9183-c99891f7e9e3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.866201 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.866143 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.869857 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.869810 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/effa5082-f6a7-4251-9183-c99891f7e9e3-config-out\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.883680 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.883634 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbjmg\" (UniqueName: \"kubernetes.io/projected/effa5082-f6a7-4251-9183-c99891f7e9e3-kube-api-access-wbjmg\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.885197 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.885115 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.889536 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.886207 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-config-volume\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.889536 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.886759 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.889536 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.889476 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/effa5082-f6a7-4251-9183-c99891f7e9e3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.890937 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.890700 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-web-config\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:42.909174 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:42.908593 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-pkwpt"] Apr 19 12:33:43.045624 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:43.045535 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv" event={"ID":"12d3e553-2bcd-4923-9074-65406b6c1644","Type":"ContainerStarted","Data":"61843878b6cd44f8734a2be1219d8cf567512b8007579787ff29d8654ccd216e"} Apr 19 12:33:43.046732 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:43.046701 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gbc2d" event={"ID":"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6","Type":"ContainerStarted","Data":"40c8be06e9cab7ba29c2aa80f8bbc3c08351b23cd4e2783250baeade8baf89fc"} Apr 19 12:33:43.068480 ip-10-0-142-55 
kubenswrapper[2567]: W0419 12:33:43.068445 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07cd686e_db42_41d2_8704_20287a4d6ba5.slice/crio-52e7110c3c6f43c8851abbb1cec1b7f945fb4b266f5187f1a865befc11c67845 WatchSource:0}: Error finding container 52e7110c3c6f43c8851abbb1cec1b7f945fb4b266f5187f1a865befc11c67845: Status 404 returned error can't find the container with id 52e7110c3c6f43c8851abbb1cec1b7f945fb4b266f5187f1a865befc11c67845 Apr 19 12:33:43.360977 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:43.360809 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:43.364151 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:43.363969 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:43.364277 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:43.364207 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:33:43.520881 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:43.520820 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 19 12:33:43.660744 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:43.660711 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"] Apr 19 12:33:43.729044 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:33:43.728992 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeffa5082_f6a7_4251_9183_c99891f7e9e3.slice/crio-a5c09c9286019d901bd873243be78b9bf840463fe142ad70ae0b746721ceba07 WatchSource:0}: Error finding container a5c09c9286019d901bd873243be78b9bf840463fe142ad70ae0b746721ceba07: Status 404 returned error can't find the container with id a5c09c9286019d901bd873243be78b9bf840463fe142ad70ae0b746721ceba07 Apr 19 12:33:44.051459 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:44.051368 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"effa5082-f6a7-4251-9183-c99891f7e9e3","Type":"ContainerStarted","Data":"a5c09c9286019d901bd873243be78b9bf840463fe142ad70ae0b746721ceba07"} Apr 19 12:33:44.053106 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:44.053078 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56cfb9f996-2cn9l" event={"ID":"5e18b3f4-af11-4f42-9d62-f9c0e8e36606","Type":"ContainerStarted","Data":"e6e50a9a0a7922f532a80c752f1de4c99f456d00eaa7f848b5a91915af5e8753"} Apr 19 12:33:44.055328 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:44.055302 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pkwpt" 
event={"ID":"07cd686e-db42-41d2-8704-20287a4d6ba5","Type":"ContainerStarted","Data":"81c012836d9c7085c9597f75157e2af189da79a206be96cac19244a5390eb04e"} Apr 19 12:33:44.055447 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:44.055331 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pkwpt" event={"ID":"07cd686e-db42-41d2-8704-20287a4d6ba5","Type":"ContainerStarted","Data":"7b4f7c05b2343840a76da3ead6c81f39a1611f69cbfc780a0b48b63fc2c104ef"} Apr 19 12:33:44.055447 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:44.055346 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pkwpt" event={"ID":"07cd686e-db42-41d2-8704-20287a4d6ba5","Type":"ContainerStarted","Data":"52e7110c3c6f43c8851abbb1cec1b7f945fb4b266f5187f1a865befc11c67845"} Apr 19 12:33:44.069448 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:44.069343 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56cfb9f996-2cn9l" podStartSLOduration=7.679924318 podStartE2EDuration="12.069321981s" podCreationTimestamp="2026-04-19 12:33:32 +0000 UTC" firstStartedPulling="2026-04-19 12:33:38.706354728 +0000 UTC m=+186.860288190" lastFinishedPulling="2026-04-19 12:33:43.09575238 +0000 UTC m=+191.249685853" observedRunningTime="2026-04-19 12:33:44.067601736 +0000 UTC m=+192.221535211" watchObservedRunningTime="2026-04-19 12:33:44.069321981 +0000 UTC m=+192.223255462" Apr 19 12:33:46.063960 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:46.063893 2567 generic.go:358] "Generic (PLEG): container finished" podID="35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6" containerID="822ec8f0797283bb2866e71c59a45690d72fb838c3c74e29c2fe0d3176f21815" exitCode=0 Apr 19 12:33:46.064396 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:46.063982 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gbc2d" 
event={"ID":"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6","Type":"ContainerDied","Data":"822ec8f0797283bb2866e71c59a45690d72fb838c3c74e29c2fe0d3176f21815"} Apr 19 12:33:46.066281 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:46.066254 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pkwpt" event={"ID":"07cd686e-db42-41d2-8704-20287a4d6ba5","Type":"ContainerStarted","Data":"2c41dd057df2a5056701e718de20ae870181b2d290327eeda176792523760bf7"} Apr 19 12:33:46.068150 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:46.068122 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv" event={"ID":"12d3e553-2bcd-4923-9074-65406b6c1644","Type":"ContainerStarted","Data":"47e9ef4b5d0df90b29270120660b0fe95d52fc8e3d9d19616eaea090aa68a5e3"} Apr 19 12:33:46.068285 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:46.068181 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv" event={"ID":"12d3e553-2bcd-4923-9074-65406b6c1644","Type":"ContainerStarted","Data":"1c4ec8db20fd12d97492d4e42095179e8868019ad308b771a35176de1976be8d"} Apr 19 12:33:46.097836 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:46.097781 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pkwpt" podStartSLOduration=3.159330453 podStartE2EDuration="5.097761926s" podCreationTimestamp="2026-04-19 12:33:41 +0000 UTC" firstStartedPulling="2026-04-19 12:33:43.356687257 +0000 UTC m=+191.510620716" lastFinishedPulling="2026-04-19 12:33:45.295118718 +0000 UTC m=+193.449052189" observedRunningTime="2026-04-19 12:33:46.09664766 +0000 UTC m=+194.250581142" watchObservedRunningTime="2026-04-19 12:33:46.097761926 +0000 UTC m=+194.251695429" Apr 19 12:33:46.389062 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:46.389028 2567 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["openshift-console/console-56cfb9f996-2cn9l"]
Apr 19 12:33:46.400848 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:46.400776 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-22cmq"]
Apr 19 12:33:46.441237 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:46.440241 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-22cmq"]
Apr 19 12:33:46.441237 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:46.440386 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-22cmq"
Apr 19 12:33:46.442687 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:46.442515 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 19 12:33:46.442687 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:46.442555 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-lwszb\""
Apr 19 12:33:46.494354 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:46.494316 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7e516893-242c-4e32-94d8-70ccb92ef46e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-22cmq\" (UID: \"7e516893-242c-4e32-94d8-70ccb92ef46e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-22cmq"
Apr 19 12:33:46.595505 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:46.595462 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7e516893-242c-4e32-94d8-70ccb92ef46e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-22cmq\" (UID: \"7e516893-242c-4e32-94d8-70ccb92ef46e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-22cmq"
Apr 19 12:33:46.598355 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:46.598333 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7e516893-242c-4e32-94d8-70ccb92ef46e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-22cmq\" (UID: \"7e516893-242c-4e32-94d8-70ccb92ef46e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-22cmq"
Apr 19 12:33:46.774537 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:46.774449 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-22cmq"
Apr 19 12:33:46.916204 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:46.916153 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-22cmq"]
Apr 19 12:33:46.919059 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:33:46.919032 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e516893_242c_4e32_94d8_70ccb92ef46e.slice/crio-08939802c5293f6f54f4a66c875922a3c66092a8fbe7a36efddaff8acb2c4fab WatchSource:0}: Error finding container 08939802c5293f6f54f4a66c875922a3c66092a8fbe7a36efddaff8acb2c4fab: Status 404 returned error can't find the container with id 08939802c5293f6f54f4a66c875922a3c66092a8fbe7a36efddaff8acb2c4fab
Apr 19 12:33:47.073490 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:47.073408 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv" event={"ID":"12d3e553-2bcd-4923-9074-65406b6c1644","Type":"ContainerStarted","Data":"56d2b0b851568486ef6ceb8c86422808fd507446e908c9b724b33a38fcec006e"}
Apr 19 12:33:47.074998 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:47.074968 2567 generic.go:358] "Generic (PLEG): container finished" podID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerID="c0622ba8ab959bea7840e3177b7d9d7c774973447484c7fc556327c4c9121bcc" exitCode=0
Apr 19 12:33:47.075133 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:47.075058 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"effa5082-f6a7-4251-9183-c99891f7e9e3","Type":"ContainerDied","Data":"c0622ba8ab959bea7840e3177b7d9d7c774973447484c7fc556327c4c9121bcc"}
Apr 19 12:33:47.077177 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:47.077135 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-22cmq" event={"ID":"7e516893-242c-4e32-94d8-70ccb92ef46e","Type":"ContainerStarted","Data":"08939802c5293f6f54f4a66c875922a3c66092a8fbe7a36efddaff8acb2c4fab"}
Apr 19 12:33:47.079325 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:47.079300 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gbc2d" event={"ID":"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6","Type":"ContainerStarted","Data":"f50b31578f5531f754c7a06be1d5f64746ea5d9f4c5b7104b1665586eb44a3eb"}
Apr 19 12:33:47.079418 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:47.079333 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gbc2d" event={"ID":"35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6","Type":"ContainerStarted","Data":"b7068fe60a2772d75280f6fd1447eb2356e8d6937f9a526004f26786b4511179"}
Apr 19 12:33:47.090203 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:47.089710 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-bjprv" podStartSLOduration=3.689665283 podStartE2EDuration="6.089694583s" podCreationTimestamp="2026-04-19 12:33:41 +0000 UTC" firstStartedPulling="2026-04-19 12:33:42.892437309 +0000 UTC m=+191.046370780" lastFinishedPulling="2026-04-19 12:33:45.292466617 +0000 UTC m=+193.446400080" observedRunningTime="2026-04-19 12:33:47.089204168 +0000 UTC m=+195.243137652" watchObservedRunningTime="2026-04-19 12:33:47.089694583 +0000 UTC m=+195.243628054"
Apr 19 12:33:47.134509 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:47.134440 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-gbc2d" podStartSLOduration=3.541819452 podStartE2EDuration="6.134419203s" podCreationTimestamp="2026-04-19 12:33:41 +0000 UTC" firstStartedPulling="2026-04-19 12:33:42.67311059 +0000 UTC m=+190.827044050" lastFinishedPulling="2026-04-19 12:33:45.265710324 +0000 UTC m=+193.419643801" observedRunningTime="2026-04-19 12:33:47.132823069 +0000 UTC m=+195.286756573" watchObservedRunningTime="2026-04-19 12:33:47.134419203 +0000 UTC m=+195.288352685"
Apr 19 12:33:50.100520 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:50.100482 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"effa5082-f6a7-4251-9183-c99891f7e9e3","Type":"ContainerStarted","Data":"ed2a9bc3826f45907928f0b1d2a9d5cfb322b96549adf0337e509d17cb2ad47f"}
Apr 19 12:33:50.100952 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:50.100525 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"effa5082-f6a7-4251-9183-c99891f7e9e3","Type":"ContainerStarted","Data":"586c59b7fd9917340deeb4a1389b04f7a52d537475c2bdbf6f24b3127c1ee21d"}
Apr 19 12:33:50.101791 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:50.101763 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-22cmq" event={"ID":"7e516893-242c-4e32-94d8-70ccb92ef46e","Type":"ContainerStarted","Data":"9cc87e46968a61cf13fd94c4c2e145177e69985d647982a1e90f67543e99a37f"}
Apr 19 12:33:50.102004 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:50.101984 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-22cmq"
Apr 19 12:33:50.107247 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:50.107226 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-22cmq"
Apr 19 12:33:50.114516 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:50.114472 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-22cmq" podStartSLOduration=1.3057154340000001 podStartE2EDuration="4.11445686s" podCreationTimestamp="2026-04-19 12:33:46 +0000 UTC" firstStartedPulling="2026-04-19 12:33:46.92128042 +0000 UTC m=+195.075213879" lastFinishedPulling="2026-04-19 12:33:49.730021846 +0000 UTC m=+197.883955305" observedRunningTime="2026-04-19 12:33:50.113665838 +0000 UTC m=+198.267599330" watchObservedRunningTime="2026-04-19 12:33:50.11445686 +0000 UTC m=+198.268390342"
Apr 19 12:33:51.107993 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:51.107957 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"effa5082-f6a7-4251-9183-c99891f7e9e3","Type":"ContainerStarted","Data":"37eec079bfc56c2ea87dee7811ae890e110d171734b2bd92016cc6255608cac2"}
Apr 19 12:33:51.107993 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:51.107997 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"effa5082-f6a7-4251-9183-c99891f7e9e3","Type":"ContainerStarted","Data":"38eb3876372b7c694f7b46aae5ba5ba0f717fe1dfb2cd3bd376291147488b8b3"}
Apr 19 12:33:51.108499 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:51.108009 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"effa5082-f6a7-4251-9183-c99891f7e9e3","Type":"ContainerStarted","Data":"66616e9bc0f163ef5494e2eb7a44f83f4be480a11daab2a248c9103ca757d5d3"}
Apr 19 12:33:52.114588 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:52.114543 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"effa5082-f6a7-4251-9183-c99891f7e9e3","Type":"ContainerStarted","Data":"402f639348ee896295a3db62a30a4f56114916c5fa031209e3f28386a6751d1e"}
Apr 19 12:33:52.140227 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:52.140152 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.44845163 podStartE2EDuration="10.140129135s" podCreationTimestamp="2026-04-19 12:33:42 +0000 UTC" firstStartedPulling="2026-04-19 12:33:43.731658239 +0000 UTC m=+191.885591699" lastFinishedPulling="2026-04-19 12:33:51.423335746 +0000 UTC m=+199.577269204" observedRunningTime="2026-04-19 12:33:52.138921539 +0000 UTC m=+200.292855032" watchObservedRunningTime="2026-04-19 12:33:52.140129135 +0000 UTC m=+200.294062615"
Apr 19 12:33:52.442727 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:33:52.442695 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-56cfb9f996-2cn9l"
Apr 19 12:34:08.687540 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:08.687499 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf" podUID="343439fa-d125-4243-ac43-c00e012201b9" containerName="registry" containerID="cri-o://1576119ee5e6cc346494340100b69be4c6ca3f37198512f9b610fe72385c6e47" gracePeriod=30
Apr 19 12:34:08.926765 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:08.926742 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:34:09.105045 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.104954 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/343439fa-d125-4243-ac43-c00e012201b9-trusted-ca\") pod \"343439fa-d125-4243-ac43-c00e012201b9\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") "
Apr 19 12:34:09.105045 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.105038 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/343439fa-d125-4243-ac43-c00e012201b9-image-registry-private-configuration\") pod \"343439fa-d125-4243-ac43-c00e012201b9\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") "
Apr 19 12:34:09.105310 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.105080 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/343439fa-d125-4243-ac43-c00e012201b9-registry-certificates\") pod \"343439fa-d125-4243-ac43-c00e012201b9\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") "
Apr 19 12:34:09.105310 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.105107 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-bound-sa-token\") pod \"343439fa-d125-4243-ac43-c00e012201b9\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") "
Apr 19 12:34:09.105310 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.105142 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/343439fa-d125-4243-ac43-c00e012201b9-installation-pull-secrets\") pod \"343439fa-d125-4243-ac43-c00e012201b9\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") "
Apr 19 12:34:09.105310 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.105217 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls\") pod \"343439fa-d125-4243-ac43-c00e012201b9\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") "
Apr 19 12:34:09.105310 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.105242 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/343439fa-d125-4243-ac43-c00e012201b9-ca-trust-extracted\") pod \"343439fa-d125-4243-ac43-c00e012201b9\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") "
Apr 19 12:34:09.105310 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.105272 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vwnp\" (UniqueName: \"kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-kube-api-access-6vwnp\") pod \"343439fa-d125-4243-ac43-c00e012201b9\" (UID: \"343439fa-d125-4243-ac43-c00e012201b9\") "
Apr 19 12:34:09.105750 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.105684 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/343439fa-d125-4243-ac43-c00e012201b9-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "343439fa-d125-4243-ac43-c00e012201b9" (UID: "343439fa-d125-4243-ac43-c00e012201b9"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 19 12:34:09.105750 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.105687 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/343439fa-d125-4243-ac43-c00e012201b9-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "343439fa-d125-4243-ac43-c00e012201b9" (UID: "343439fa-d125-4243-ac43-c00e012201b9"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 19 12:34:09.108181 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.108121 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-kube-api-access-6vwnp" (OuterVolumeSpecName: "kube-api-access-6vwnp") pod "343439fa-d125-4243-ac43-c00e012201b9" (UID: "343439fa-d125-4243-ac43-c00e012201b9"). InnerVolumeSpecName "kube-api-access-6vwnp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:34:09.108181 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.108129 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "343439fa-d125-4243-ac43-c00e012201b9" (UID: "343439fa-d125-4243-ac43-c00e012201b9"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:34:09.108181 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.108125 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/343439fa-d125-4243-ac43-c00e012201b9-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "343439fa-d125-4243-ac43-c00e012201b9" (UID: "343439fa-d125-4243-ac43-c00e012201b9"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 19 12:34:09.108407 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.108286 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/343439fa-d125-4243-ac43-c00e012201b9-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "343439fa-d125-4243-ac43-c00e012201b9" (UID: "343439fa-d125-4243-ac43-c00e012201b9"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 19 12:34:09.108464 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.108407 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "343439fa-d125-4243-ac43-c00e012201b9" (UID: "343439fa-d125-4243-ac43-c00e012201b9"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:34:09.113786 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.113752 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/343439fa-d125-4243-ac43-c00e012201b9-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "343439fa-d125-4243-ac43-c00e012201b9" (UID: "343439fa-d125-4243-ac43-c00e012201b9"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 19 12:34:09.167759 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.167724 2567 generic.go:358] "Generic (PLEG): container finished" podID="343439fa-d125-4243-ac43-c00e012201b9" containerID="1576119ee5e6cc346494340100b69be4c6ca3f37198512f9b610fe72385c6e47" exitCode=0
Apr 19 12:34:09.167920 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.167775 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf" event={"ID":"343439fa-d125-4243-ac43-c00e012201b9","Type":"ContainerDied","Data":"1576119ee5e6cc346494340100b69be4c6ca3f37198512f9b610fe72385c6e47"}
Apr 19 12:34:09.167920 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.167779 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"
Apr 19 12:34:09.167920 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.167801 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cb67c5f8f-b7rwf" event={"ID":"343439fa-d125-4243-ac43-c00e012201b9","Type":"ContainerDied","Data":"bb860e06de79e8c999c9b3861c086226e981ed5db9f5e0acf59aec8ec9e4075e"}
Apr 19 12:34:09.167920 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.167816 2567 scope.go:117] "RemoveContainer" containerID="1576119ee5e6cc346494340100b69be4c6ca3f37198512f9b610fe72385c6e47"
Apr 19 12:34:09.175825 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.175808 2567 scope.go:117] "RemoveContainer" containerID="1576119ee5e6cc346494340100b69be4c6ca3f37198512f9b610fe72385c6e47"
Apr 19 12:34:09.176063 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:34:09.176045 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1576119ee5e6cc346494340100b69be4c6ca3f37198512f9b610fe72385c6e47\": container with ID starting with 1576119ee5e6cc346494340100b69be4c6ca3f37198512f9b610fe72385c6e47 not found: ID does not exist" containerID="1576119ee5e6cc346494340100b69be4c6ca3f37198512f9b610fe72385c6e47"
Apr 19 12:34:09.176115 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.176072 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1576119ee5e6cc346494340100b69be4c6ca3f37198512f9b610fe72385c6e47"} err="failed to get container status \"1576119ee5e6cc346494340100b69be4c6ca3f37198512f9b610fe72385c6e47\": rpc error: code = NotFound desc = could not find container \"1576119ee5e6cc346494340100b69be4c6ca3f37198512f9b610fe72385c6e47\": container with ID starting with 1576119ee5e6cc346494340100b69be4c6ca3f37198512f9b610fe72385c6e47 not found: ID does not exist"
Apr 19 12:34:09.186610 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.186587 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"]
Apr 19 12:34:09.190181 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.190135 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5cb67c5f8f-b7rwf"]
Apr 19 12:34:09.205924 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.205898 2567 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/343439fa-d125-4243-ac43-c00e012201b9-image-registry-private-configuration\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:34:09.205924 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.205923 2567 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/343439fa-d125-4243-ac43-c00e012201b9-registry-certificates\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:34:09.206071 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.205934 2567 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-bound-sa-token\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:34:09.206071 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.205944 2567 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/343439fa-d125-4243-ac43-c00e012201b9-installation-pull-secrets\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:34:09.206071 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.205954 2567 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-registry-tls\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:34:09.206071 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.205962 2567 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/343439fa-d125-4243-ac43-c00e012201b9-ca-trust-extracted\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:34:09.206071 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.205970 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6vwnp\" (UniqueName: \"kubernetes.io/projected/343439fa-d125-4243-ac43-c00e012201b9-kube-api-access-6vwnp\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:34:09.206071 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:09.205979 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/343439fa-d125-4243-ac43-c00e012201b9-trusted-ca\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:34:10.457932 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:10.457889 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="343439fa-d125-4243-ac43-c00e012201b9" path="/var/lib/kubelet/pods/343439fa-d125-4243-ac43-c00e012201b9/volumes"
Apr 19 12:34:11.430517 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:11.430475 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-56cfb9f996-2cn9l" podUID="5e18b3f4-af11-4f42-9d62-f9c0e8e36606" containerName="console" containerID="cri-o://e6e50a9a0a7922f532a80c752f1de4c99f456d00eaa7f848b5a91915af5e8753" gracePeriod=15
Apr 19 12:34:11.748969 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:11.748946 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56cfb9f996-2cn9l_5e18b3f4-af11-4f42-9d62-f9c0e8e36606/console/0.log"
Apr 19 12:34:11.749313 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:11.749011 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56cfb9f996-2cn9l"
Apr 19 12:34:11.930945 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:11.930910 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-console-config\") pod \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\" (UID: \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\") "
Apr 19 12:34:11.931107 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:11.930972 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-console-serving-cert\") pod \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\" (UID: \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\") "
Apr 19 12:34:11.931211 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:11.931153 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-console-oauth-config\") pod \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\" (UID: \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\") "
Apr 19 12:34:11.931346 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:11.931235 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-service-ca\") pod \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\" (UID: \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\") "
Apr 19 12:34:11.931346 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:11.931266 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-oauth-serving-cert\") pod \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\" (UID: \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\") "
Apr 19 12:34:11.931346 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:11.931291 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt7jp\" (UniqueName: \"kubernetes.io/projected/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-kube-api-access-pt7jp\") pod \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\" (UID: \"5e18b3f4-af11-4f42-9d62-f9c0e8e36606\") "
Apr 19 12:34:11.931468 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:11.931340 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-console-config" (OuterVolumeSpecName: "console-config") pod "5e18b3f4-af11-4f42-9d62-f9c0e8e36606" (UID: "5e18b3f4-af11-4f42-9d62-f9c0e8e36606"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 19 12:34:11.931566 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:11.931548 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-console-config\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:34:11.931656 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:11.931637 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5e18b3f4-af11-4f42-9d62-f9c0e8e36606" (UID: "5e18b3f4-af11-4f42-9d62-f9c0e8e36606"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 19 12:34:11.931737 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:11.931637 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-service-ca" (OuterVolumeSpecName: "service-ca") pod "5e18b3f4-af11-4f42-9d62-f9c0e8e36606" (UID: "5e18b3f4-af11-4f42-9d62-f9c0e8e36606"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 19 12:34:11.933284 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:11.933265 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5e18b3f4-af11-4f42-9d62-f9c0e8e36606" (UID: "5e18b3f4-af11-4f42-9d62-f9c0e8e36606"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 19 12:34:11.933363 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:11.933312 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-kube-api-access-pt7jp" (OuterVolumeSpecName: "kube-api-access-pt7jp") pod "5e18b3f4-af11-4f42-9d62-f9c0e8e36606" (UID: "5e18b3f4-af11-4f42-9d62-f9c0e8e36606"). InnerVolumeSpecName "kube-api-access-pt7jp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:34:11.933363 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:11.933320 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5e18b3f4-af11-4f42-9d62-f9c0e8e36606" (UID: "5e18b3f4-af11-4f42-9d62-f9c0e8e36606"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 19 12:34:12.032530 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:12.032433 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-console-serving-cert\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:34:12.032530 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:12.032466 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-console-oauth-config\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:34:12.032530 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:12.032482 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-service-ca\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:34:12.032530 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:12.032494 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-oauth-serving-cert\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:34:12.032530 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:12.032507 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pt7jp\" (UniqueName: \"kubernetes.io/projected/5e18b3f4-af11-4f42-9d62-f9c0e8e36606-kube-api-access-pt7jp\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:34:12.178482 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:12.178453 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56cfb9f996-2cn9l_5e18b3f4-af11-4f42-9d62-f9c0e8e36606/console/0.log"
Apr 19 12:34:12.178624 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:12.178495 2567 generic.go:358] "Generic (PLEG): container finished" podID="5e18b3f4-af11-4f42-9d62-f9c0e8e36606" containerID="e6e50a9a0a7922f532a80c752f1de4c99f456d00eaa7f848b5a91915af5e8753" exitCode=2
Apr 19 12:34:12.178624 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:12.178562 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56cfb9f996-2cn9l"
Apr 19 12:34:12.178723 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:12.178565 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56cfb9f996-2cn9l" event={"ID":"5e18b3f4-af11-4f42-9d62-f9c0e8e36606","Type":"ContainerDied","Data":"e6e50a9a0a7922f532a80c752f1de4c99f456d00eaa7f848b5a91915af5e8753"}
Apr 19 12:34:12.178723 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:12.178661 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56cfb9f996-2cn9l" event={"ID":"5e18b3f4-af11-4f42-9d62-f9c0e8e36606","Type":"ContainerDied","Data":"10e094424fbf7bc7f5ebbda5be4abea5366008528a7afd9df4bda79b8ff7ea0b"}
Apr 19 12:34:12.178723 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:12.178678 2567 scope.go:117] "RemoveContainer" containerID="e6e50a9a0a7922f532a80c752f1de4c99f456d00eaa7f848b5a91915af5e8753"
Apr 19 12:34:12.187053 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:12.187032 2567 scope.go:117] "RemoveContainer" containerID="e6e50a9a0a7922f532a80c752f1de4c99f456d00eaa7f848b5a91915af5e8753"
Apr 19 12:34:12.187351 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:34:12.187329 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6e50a9a0a7922f532a80c752f1de4c99f456d00eaa7f848b5a91915af5e8753\": container with ID starting with e6e50a9a0a7922f532a80c752f1de4c99f456d00eaa7f848b5a91915af5e8753 not found: ID does not exist" containerID="e6e50a9a0a7922f532a80c752f1de4c99f456d00eaa7f848b5a91915af5e8753"
Apr 19 12:34:12.187404 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:12.187362 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6e50a9a0a7922f532a80c752f1de4c99f456d00eaa7f848b5a91915af5e8753"} err="failed to get container status \"e6e50a9a0a7922f532a80c752f1de4c99f456d00eaa7f848b5a91915af5e8753\": rpc error: code = NotFound desc = could not find container \"e6e50a9a0a7922f532a80c752f1de4c99f456d00eaa7f848b5a91915af5e8753\": container with ID starting with e6e50a9a0a7922f532a80c752f1de4c99f456d00eaa7f848b5a91915af5e8753 not found: ID does not exist"
Apr 19 12:34:12.197070 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:12.197048 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56cfb9f996-2cn9l"]
Apr 19 12:34:12.200730 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:12.200711 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-56cfb9f996-2cn9l"]
Apr 19 12:34:12.458882 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:12.458848 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e18b3f4-af11-4f42-9d62-f9c0e8e36606" path="/var/lib/kubelet/pods/5e18b3f4-af11-4f42-9d62-f9c0e8e36606/volumes"
Apr 19 12:34:27.223589 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:27.223553 2567 generic.go:358] "Generic (PLEG): container finished" podID="ac3748db-049f-4448-a55c-ed08dd605a59" containerID="e0e9d03ebee34cbdac31bc775ac12fa817df662852e0a8c8705d163377778003" exitCode=0
Apr 19 12:34:27.223995 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:27.223623 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9vjds" event={"ID":"ac3748db-049f-4448-a55c-ed08dd605a59","Type":"ContainerDied","Data":"e0e9d03ebee34cbdac31bc775ac12fa817df662852e0a8c8705d163377778003"}
Apr 19 12:34:27.223995 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:27.223933 2567 scope.go:117] "RemoveContainer" containerID="e0e9d03ebee34cbdac31bc775ac12fa817df662852e0a8c8705d163377778003"
Apr 19 12:34:28.227455 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:28.227416 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9vjds" event={"ID":"ac3748db-049f-4448-a55c-ed08dd605a59","Type":"ContainerStarted","Data":"c69b906d9318dd7df777f0513b2067c0aaf6e10704082caa79bede50543e691f"}
Apr 19 12:34:44.417960 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:44.417917 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs\") pod \"network-metrics-daemon-98bqr\" (UID: \"720d8932-1617-465d-a213-ebb1e99e6bc6\") " pod="openshift-multus/network-metrics-daemon-98bqr"
Apr 19 12:34:44.420259 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:44.420236 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/720d8932-1617-465d-a213-ebb1e99e6bc6-metrics-certs\") pod \"network-metrics-daemon-98bqr\" (UID: \"720d8932-1617-465d-a213-ebb1e99e6bc6\") " pod="openshift-multus/network-metrics-daemon-98bqr"
Apr 19 12:34:44.557394 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:44.557364 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nbhvx\""
Apr 19 12:34:44.565707 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:44.565678 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98bqr"
Apr 19 12:34:44.682854 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:44.682818 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-98bqr"]
Apr 19 12:34:44.685614 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:34:44.685571 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod720d8932_1617_465d_a213_ebb1e99e6bc6.slice/crio-992943aa60233f6429513ffbb977e301712eee84df02e90b444cd19daab50df2 WatchSource:0}: Error finding container 992943aa60233f6429513ffbb977e301712eee84df02e90b444cd19daab50df2: Status 404 returned error can't find the container with id 992943aa60233f6429513ffbb977e301712eee84df02e90b444cd19daab50df2
Apr 19 12:34:45.282027 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:45.281989 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-98bqr" event={"ID":"720d8932-1617-465d-a213-ebb1e99e6bc6","Type":"ContainerStarted","Data":"992943aa60233f6429513ffbb977e301712eee84df02e90b444cd19daab50df2"}
Apr 19 12:34:46.286593 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:46.286505 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-98bqr" event={"ID":"720d8932-1617-465d-a213-ebb1e99e6bc6","Type":"ContainerStarted","Data":"6fb0b51eb143a4104ea6726d465b14155c06b716a1008ff1689c7cd86b2c3450"}
Apr 19 12:34:46.286593 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:46.286542 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-98bqr" event={"ID":"720d8932-1617-465d-a213-ebb1e99e6bc6","Type":"ContainerStarted","Data":"ca1f771d1d20c5ebae6aeec519c98445eb57fa630722089452b0e2216aee0f67"}
Apr 19 12:34:46.300474 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:34:46.300415 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-multus/network-metrics-daemon-98bqr" podStartSLOduration=253.044985242 podStartE2EDuration="4m14.300399227s" podCreationTimestamp="2026-04-19 12:30:32 +0000 UTC" firstStartedPulling="2026-04-19 12:34:44.687817029 +0000 UTC m=+252.841750487" lastFinishedPulling="2026-04-19 12:34:45.943231011 +0000 UTC m=+254.097164472" observedRunningTime="2026-04-19 12:34:46.299848007 +0000 UTC m=+254.453781489" watchObservedRunningTime="2026-04-19 12:34:46.300399227 +0000 UTC m=+254.454332711" Apr 19 12:35:01.939510 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:01.939421 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 19 12:35:01.939970 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:01.939888 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="alertmanager" containerID="cri-o://586c59b7fd9917340deeb4a1389b04f7a52d537475c2bdbf6f24b3127c1ee21d" gracePeriod=120 Apr 19 12:35:01.939970 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:01.939947 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="kube-rbac-proxy-metric" containerID="cri-o://37eec079bfc56c2ea87dee7811ae890e110d171734b2bd92016cc6255608cac2" gracePeriod=120 Apr 19 12:35:01.940074 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:01.939981 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="kube-rbac-proxy-web" containerID="cri-o://66616e9bc0f163ef5494e2eb7a44f83f4be480a11daab2a248c9103ca757d5d3" gracePeriod=120 Apr 19 12:35:01.940074 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:01.940024 2567 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/alertmanager-main-0" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="prom-label-proxy" containerID="cri-o://402f639348ee896295a3db62a30a4f56114916c5fa031209e3f28386a6751d1e" gracePeriod=120 Apr 19 12:35:01.940074 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:01.939988 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="config-reloader" containerID="cri-o://ed2a9bc3826f45907928f0b1d2a9d5cfb322b96549adf0337e509d17cb2ad47f" gracePeriod=120 Apr 19 12:35:01.940261 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:01.940152 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="kube-rbac-proxy" containerID="cri-o://38eb3876372b7c694f7b46aae5ba5ba0f717fe1dfb2cd3bd376291147488b8b3" gracePeriod=120 Apr 19 12:35:02.337739 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:02.337658 2567 generic.go:358] "Generic (PLEG): container finished" podID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerID="402f639348ee896295a3db62a30a4f56114916c5fa031209e3f28386a6751d1e" exitCode=0 Apr 19 12:35:02.337739 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:02.337683 2567 generic.go:358] "Generic (PLEG): container finished" podID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerID="38eb3876372b7c694f7b46aae5ba5ba0f717fe1dfb2cd3bd376291147488b8b3" exitCode=0 Apr 19 12:35:02.337739 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:02.337690 2567 generic.go:358] "Generic (PLEG): container finished" podID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerID="ed2a9bc3826f45907928f0b1d2a9d5cfb322b96549adf0337e509d17cb2ad47f" exitCode=0 Apr 19 12:35:02.337739 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:02.337696 2567 generic.go:358] "Generic (PLEG): container finished" 
podID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerID="586c59b7fd9917340deeb4a1389b04f7a52d537475c2bdbf6f24b3127c1ee21d" exitCode=0 Apr 19 12:35:02.337739 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:02.337726 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"effa5082-f6a7-4251-9183-c99891f7e9e3","Type":"ContainerDied","Data":"402f639348ee896295a3db62a30a4f56114916c5fa031209e3f28386a6751d1e"} Apr 19 12:35:02.337998 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:02.337758 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"effa5082-f6a7-4251-9183-c99891f7e9e3","Type":"ContainerDied","Data":"38eb3876372b7c694f7b46aae5ba5ba0f717fe1dfb2cd3bd376291147488b8b3"} Apr 19 12:35:02.337998 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:02.337768 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"effa5082-f6a7-4251-9183-c99891f7e9e3","Type":"ContainerDied","Data":"ed2a9bc3826f45907928f0b1d2a9d5cfb322b96549adf0337e509d17cb2ad47f"} Apr 19 12:35:02.337998 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:02.337776 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"effa5082-f6a7-4251-9183-c99891f7e9e3","Type":"ContainerDied","Data":"586c59b7fd9917340deeb4a1389b04f7a52d537475c2bdbf6f24b3127c1ee21d"} Apr 19 12:35:03.168196 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.168149 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:35:03.271709 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.271621 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-main-tls\") pod \"effa5082-f6a7-4251-9183-c99891f7e9e3\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " Apr 19 12:35:03.271709 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.271668 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-web-config\") pod \"effa5082-f6a7-4251-9183-c99891f7e9e3\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " Apr 19 12:35:03.271709 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.271704 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/effa5082-f6a7-4251-9183-c99891f7e9e3-metrics-client-ca\") pod \"effa5082-f6a7-4251-9183-c99891f7e9e3\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " Apr 19 12:35:03.271960 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.271725 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"effa5082-f6a7-4251-9183-c99891f7e9e3\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " Apr 19 12:35:03.271960 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.271745 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/effa5082-f6a7-4251-9183-c99891f7e9e3-alertmanager-main-db\") pod \"effa5082-f6a7-4251-9183-c99891f7e9e3\" (UID: 
\"effa5082-f6a7-4251-9183-c99891f7e9e3\") " Apr 19 12:35:03.271960 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.271783 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/effa5082-f6a7-4251-9183-c99891f7e9e3-alertmanager-trusted-ca-bundle\") pod \"effa5082-f6a7-4251-9183-c99891f7e9e3\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " Apr 19 12:35:03.271960 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.271821 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-config-volume\") pod \"effa5082-f6a7-4251-9183-c99891f7e9e3\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " Apr 19 12:35:03.271960 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.271847 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/effa5082-f6a7-4251-9183-c99891f7e9e3-config-out\") pod \"effa5082-f6a7-4251-9183-c99891f7e9e3\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " Apr 19 12:35:03.271960 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.271886 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-kube-rbac-proxy-web\") pod \"effa5082-f6a7-4251-9183-c99891f7e9e3\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " Apr 19 12:35:03.271960 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.271912 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/effa5082-f6a7-4251-9183-c99891f7e9e3-tls-assets\") pod \"effa5082-f6a7-4251-9183-c99891f7e9e3\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " Apr 19 12:35:03.271960 ip-10-0-142-55 
kubenswrapper[2567]: I0419 12:35:03.271937 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-kube-rbac-proxy\") pod \"effa5082-f6a7-4251-9183-c99891f7e9e3\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " Apr 19 12:35:03.272403 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.271973 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbjmg\" (UniqueName: \"kubernetes.io/projected/effa5082-f6a7-4251-9183-c99891f7e9e3-kube-api-access-wbjmg\") pod \"effa5082-f6a7-4251-9183-c99891f7e9e3\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " Apr 19 12:35:03.272403 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.272001 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-cluster-tls-config\") pod \"effa5082-f6a7-4251-9183-c99891f7e9e3\" (UID: \"effa5082-f6a7-4251-9183-c99891f7e9e3\") " Apr 19 12:35:03.272403 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.272118 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/effa5082-f6a7-4251-9183-c99891f7e9e3-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "effa5082-f6a7-4251-9183-c99891f7e9e3" (UID: "effa5082-f6a7-4251-9183-c99891f7e9e3"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:35:03.272403 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.272278 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/effa5082-f6a7-4251-9183-c99891f7e9e3-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "effa5082-f6a7-4251-9183-c99891f7e9e3" (UID: "effa5082-f6a7-4251-9183-c99891f7e9e3"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:35:03.272403 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.272292 2567 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/effa5082-f6a7-4251-9183-c99891f7e9e3-alertmanager-main-db\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:35:03.272652 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.272435 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/effa5082-f6a7-4251-9183-c99891f7e9e3-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "effa5082-f6a7-4251-9183-c99891f7e9e3" (UID: "effa5082-f6a7-4251-9183-c99891f7e9e3"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:35:03.274555 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.274526 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/effa5082-f6a7-4251-9183-c99891f7e9e3-kube-api-access-wbjmg" (OuterVolumeSpecName: "kube-api-access-wbjmg") pod "effa5082-f6a7-4251-9183-c99891f7e9e3" (UID: "effa5082-f6a7-4251-9183-c99891f7e9e3"). InnerVolumeSpecName "kube-api-access-wbjmg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:35:03.274830 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.274769 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "effa5082-f6a7-4251-9183-c99891f7e9e3" (UID: "effa5082-f6a7-4251-9183-c99891f7e9e3"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:35:03.274975 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.274859 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "effa5082-f6a7-4251-9183-c99891f7e9e3" (UID: "effa5082-f6a7-4251-9183-c99891f7e9e3"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:35:03.275591 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.275564 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/effa5082-f6a7-4251-9183-c99891f7e9e3-config-out" (OuterVolumeSpecName: "config-out") pod "effa5082-f6a7-4251-9183-c99891f7e9e3" (UID: "effa5082-f6a7-4251-9183-c99891f7e9e3"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:35:03.275695 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.275570 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/effa5082-f6a7-4251-9183-c99891f7e9e3-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "effa5082-f6a7-4251-9183-c99891f7e9e3" (UID: "effa5082-f6a7-4251-9183-c99891f7e9e3"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:35:03.275843 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.275820 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "effa5082-f6a7-4251-9183-c99891f7e9e3" (UID: "effa5082-f6a7-4251-9183-c99891f7e9e3"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:35:03.276283 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.276257 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-config-volume" (OuterVolumeSpecName: "config-volume") pod "effa5082-f6a7-4251-9183-c99891f7e9e3" (UID: "effa5082-f6a7-4251-9183-c99891f7e9e3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:35:03.276283 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.276272 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "effa5082-f6a7-4251-9183-c99891f7e9e3" (UID: "effa5082-f6a7-4251-9183-c99891f7e9e3"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:35:03.279827 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.279805 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "effa5082-f6a7-4251-9183-c99891f7e9e3" (UID: "effa5082-f6a7-4251-9183-c99891f7e9e3"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:35:03.285549 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.285528 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-web-config" (OuterVolumeSpecName: "web-config") pod "effa5082-f6a7-4251-9183-c99891f7e9e3" (UID: "effa5082-f6a7-4251-9183-c99891f7e9e3"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:35:03.344377 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.344338 2567 generic.go:358] "Generic (PLEG): container finished" podID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerID="37eec079bfc56c2ea87dee7811ae890e110d171734b2bd92016cc6255608cac2" exitCode=0 Apr 19 12:35:03.344377 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.344370 2567 generic.go:358] "Generic (PLEG): container finished" podID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerID="66616e9bc0f163ef5494e2eb7a44f83f4be480a11daab2a248c9103ca757d5d3" exitCode=0 Apr 19 12:35:03.344609 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.344425 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"effa5082-f6a7-4251-9183-c99891f7e9e3","Type":"ContainerDied","Data":"37eec079bfc56c2ea87dee7811ae890e110d171734b2bd92016cc6255608cac2"} Apr 19 12:35:03.344609 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.344459 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:35:03.344609 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.344474 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"effa5082-f6a7-4251-9183-c99891f7e9e3","Type":"ContainerDied","Data":"66616e9bc0f163ef5494e2eb7a44f83f4be480a11daab2a248c9103ca757d5d3"} Apr 19 12:35:03.344609 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.344490 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"effa5082-f6a7-4251-9183-c99891f7e9e3","Type":"ContainerDied","Data":"a5c09c9286019d901bd873243be78b9bf840463fe142ad70ae0b746721ceba07"} Apr 19 12:35:03.344609 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.344510 2567 scope.go:117] "RemoveContainer" containerID="402f639348ee896295a3db62a30a4f56114916c5fa031209e3f28386a6751d1e" Apr 19 12:35:03.351830 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.351818 2567 scope.go:117] "RemoveContainer" containerID="37eec079bfc56c2ea87dee7811ae890e110d171734b2bd92016cc6255608cac2" Apr 19 12:35:03.358941 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.358925 2567 scope.go:117] "RemoveContainer" containerID="38eb3876372b7c694f7b46aae5ba5ba0f717fe1dfb2cd3bd376291147488b8b3" Apr 19 12:35:03.365918 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.365465 2567 scope.go:117] "RemoveContainer" containerID="66616e9bc0f163ef5494e2eb7a44f83f4be480a11daab2a248c9103ca757d5d3" Apr 19 12:35:03.371367 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.369465 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 19 12:35:03.373040 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.372981 2567 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/effa5082-f6a7-4251-9183-c99891f7e9e3-alertmanager-trusted-ca-bundle\") on 
node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:35:03.373040 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.373010 2567 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-config-volume\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:35:03.373040 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.373027 2567 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/effa5082-f6a7-4251-9183-c99891f7e9e3-config-out\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:35:03.373283 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.373048 2567 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:35:03.373283 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.373063 2567 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/effa5082-f6a7-4251-9183-c99891f7e9e3-tls-assets\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:35:03.373283 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.373077 2567 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:35:03.373283 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.373132 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wbjmg\" (UniqueName: \"kubernetes.io/projected/effa5082-f6a7-4251-9183-c99891f7e9e3-kube-api-access-wbjmg\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 
12:35:03.373283 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.373154 2567 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-cluster-tls-config\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:35:03.373283 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.373189 2567 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-main-tls\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:35:03.373283 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.373226 2567 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-web-config\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:35:03.373283 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.373254 2567 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/effa5082-f6a7-4251-9183-c99891f7e9e3-metrics-client-ca\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:35:03.373549 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.373316 2567 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/effa5082-f6a7-4251-9183-c99891f7e9e3-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:35:03.374966 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.374945 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 19 12:35:03.377460 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.377444 2567 scope.go:117] "RemoveContainer" 
containerID="ed2a9bc3826f45907928f0b1d2a9d5cfb322b96549adf0337e509d17cb2ad47f" Apr 19 12:35:03.384275 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.384257 2567 scope.go:117] "RemoveContainer" containerID="586c59b7fd9917340deeb4a1389b04f7a52d537475c2bdbf6f24b3127c1ee21d" Apr 19 12:35:03.392255 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.392232 2567 scope.go:117] "RemoveContainer" containerID="c0622ba8ab959bea7840e3177b7d9d7c774973447484c7fc556327c4c9121bcc" Apr 19 12:35:03.397581 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.397560 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 19 12:35:03.397923 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.397910 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="init-config-reloader" Apr 19 12:35:03.397970 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.397935 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="init-config-reloader" Apr 19 12:35:03.397970 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.397943 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="kube-rbac-proxy-metric" Apr 19 12:35:03.397970 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.397949 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="kube-rbac-proxy-metric" Apr 19 12:35:03.397970 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.397964 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e18b3f4-af11-4f42-9d62-f9c0e8e36606" containerName="console" Apr 19 12:35:03.397970 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.397970 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e18b3f4-af11-4f42-9d62-f9c0e8e36606" 
containerName="console"
Apr 19 12:35:03.398120 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.397978 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="prom-label-proxy"
Apr 19 12:35:03.398120 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.397985 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="prom-label-proxy"
Apr 19 12:35:03.398120 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.397993 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="alertmanager"
Apr 19 12:35:03.398120 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.397999 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="alertmanager"
Apr 19 12:35:03.398120 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.398009 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="kube-rbac-proxy"
Apr 19 12:35:03.398120 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.398014 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="kube-rbac-proxy"
Apr 19 12:35:03.398120 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.398023 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="343439fa-d125-4243-ac43-c00e012201b9" containerName="registry"
Apr 19 12:35:03.398120 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.398029 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="343439fa-d125-4243-ac43-c00e012201b9" containerName="registry"
Apr 19 12:35:03.398120 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.398040 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="config-reloader"
Apr 19 12:35:03.398120 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.398047 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="config-reloader"
Apr 19 12:35:03.398120 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.398056 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="kube-rbac-proxy-web"
Apr 19 12:35:03.398120 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.398061 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="kube-rbac-proxy-web"
Apr 19 12:35:03.398120 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.398102 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="config-reloader"
Apr 19 12:35:03.398120 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.398113 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="alertmanager"
Apr 19 12:35:03.398120 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.398122 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e18b3f4-af11-4f42-9d62-f9c0e8e36606" containerName="console"
Apr 19 12:35:03.398556 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.398129 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="kube-rbac-proxy"
Apr 19 12:35:03.398556 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.398135 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="343439fa-d125-4243-ac43-c00e012201b9" containerName="registry"
Apr 19 12:35:03.398556 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.398141 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="kube-rbac-proxy-metric"
Apr 19 12:35:03.398556 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.398146 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="prom-label-proxy"
Apr 19 12:35:03.398556 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.398152 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" containerName="kube-rbac-proxy-web"
Apr 19 12:35:03.400726 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.400705 2567 scope.go:117] "RemoveContainer" containerID="402f639348ee896295a3db62a30a4f56114916c5fa031209e3f28386a6751d1e"
Apr 19 12:35:03.400993 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:35:03.400972 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"402f639348ee896295a3db62a30a4f56114916c5fa031209e3f28386a6751d1e\": container with ID starting with 402f639348ee896295a3db62a30a4f56114916c5fa031209e3f28386a6751d1e not found: ID does not exist" containerID="402f639348ee896295a3db62a30a4f56114916c5fa031209e3f28386a6751d1e"
Apr 19 12:35:03.401041 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.401001 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"402f639348ee896295a3db62a30a4f56114916c5fa031209e3f28386a6751d1e"} err="failed to get container status \"402f639348ee896295a3db62a30a4f56114916c5fa031209e3f28386a6751d1e\": rpc error: code = NotFound desc = could not find container \"402f639348ee896295a3db62a30a4f56114916c5fa031209e3f28386a6751d1e\": container with ID starting with 402f639348ee896295a3db62a30a4f56114916c5fa031209e3f28386a6751d1e not found: ID does not exist"
Apr 19 12:35:03.401041 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.401018 2567 scope.go:117] "RemoveContainer" containerID="37eec079bfc56c2ea87dee7811ae890e110d171734b2bd92016cc6255608cac2"
Apr 19 12:35:03.401285 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:35:03.401264 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37eec079bfc56c2ea87dee7811ae890e110d171734b2bd92016cc6255608cac2\": container with ID starting with 37eec079bfc56c2ea87dee7811ae890e110d171734b2bd92016cc6255608cac2 not found: ID does not exist" containerID="37eec079bfc56c2ea87dee7811ae890e110d171734b2bd92016cc6255608cac2"
Apr 19 12:35:03.401345 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.401290 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37eec079bfc56c2ea87dee7811ae890e110d171734b2bd92016cc6255608cac2"} err="failed to get container status \"37eec079bfc56c2ea87dee7811ae890e110d171734b2bd92016cc6255608cac2\": rpc error: code = NotFound desc = could not find container \"37eec079bfc56c2ea87dee7811ae890e110d171734b2bd92016cc6255608cac2\": container with ID starting with 37eec079bfc56c2ea87dee7811ae890e110d171734b2bd92016cc6255608cac2 not found: ID does not exist"
Apr 19 12:35:03.401345 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.401305 2567 scope.go:117] "RemoveContainer" containerID="38eb3876372b7c694f7b46aae5ba5ba0f717fe1dfb2cd3bd376291147488b8b3"
Apr 19 12:35:03.401531 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:35:03.401495 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38eb3876372b7c694f7b46aae5ba5ba0f717fe1dfb2cd3bd376291147488b8b3\": container with ID starting with 38eb3876372b7c694f7b46aae5ba5ba0f717fe1dfb2cd3bd376291147488b8b3 not found: ID does not exist" containerID="38eb3876372b7c694f7b46aae5ba5ba0f717fe1dfb2cd3bd376291147488b8b3"
Apr 19 12:35:03.401568 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.401540 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38eb3876372b7c694f7b46aae5ba5ba0f717fe1dfb2cd3bd376291147488b8b3"} err="failed to get container status \"38eb3876372b7c694f7b46aae5ba5ba0f717fe1dfb2cd3bd376291147488b8b3\": rpc error: code = NotFound desc = could not find container \"38eb3876372b7c694f7b46aae5ba5ba0f717fe1dfb2cd3bd376291147488b8b3\": container with ID starting with 38eb3876372b7c694f7b46aae5ba5ba0f717fe1dfb2cd3bd376291147488b8b3 not found: ID does not exist"
Apr 19 12:35:03.401568 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.401556 2567 scope.go:117] "RemoveContainer" containerID="66616e9bc0f163ef5494e2eb7a44f83f4be480a11daab2a248c9103ca757d5d3"
Apr 19 12:35:03.401743 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:35:03.401728 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66616e9bc0f163ef5494e2eb7a44f83f4be480a11daab2a248c9103ca757d5d3\": container with ID starting with 66616e9bc0f163ef5494e2eb7a44f83f4be480a11daab2a248c9103ca757d5d3 not found: ID does not exist" containerID="66616e9bc0f163ef5494e2eb7a44f83f4be480a11daab2a248c9103ca757d5d3"
Apr 19 12:35:03.401786 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.401746 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66616e9bc0f163ef5494e2eb7a44f83f4be480a11daab2a248c9103ca757d5d3"} err="failed to get container status \"66616e9bc0f163ef5494e2eb7a44f83f4be480a11daab2a248c9103ca757d5d3\": rpc error: code = NotFound desc = could not find container \"66616e9bc0f163ef5494e2eb7a44f83f4be480a11daab2a248c9103ca757d5d3\": container with ID starting with 66616e9bc0f163ef5494e2eb7a44f83f4be480a11daab2a248c9103ca757d5d3 not found: ID does not exist"
Apr 19 12:35:03.401786 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.401757 2567 scope.go:117] "RemoveContainer" containerID="ed2a9bc3826f45907928f0b1d2a9d5cfb322b96549adf0337e509d17cb2ad47f"
Apr 19 12:35:03.401927 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:35:03.401910 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed2a9bc3826f45907928f0b1d2a9d5cfb322b96549adf0337e509d17cb2ad47f\": container with ID starting with ed2a9bc3826f45907928f0b1d2a9d5cfb322b96549adf0337e509d17cb2ad47f not found: ID does not exist" containerID="ed2a9bc3826f45907928f0b1d2a9d5cfb322b96549adf0337e509d17cb2ad47f"
Apr 19 12:35:03.401980 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.401930 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed2a9bc3826f45907928f0b1d2a9d5cfb322b96549adf0337e509d17cb2ad47f"} err="failed to get container status \"ed2a9bc3826f45907928f0b1d2a9d5cfb322b96549adf0337e509d17cb2ad47f\": rpc error: code = NotFound desc = could not find container \"ed2a9bc3826f45907928f0b1d2a9d5cfb322b96549adf0337e509d17cb2ad47f\": container with ID starting with ed2a9bc3826f45907928f0b1d2a9d5cfb322b96549adf0337e509d17cb2ad47f not found: ID does not exist"
Apr 19 12:35:03.401980 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.401941 2567 scope.go:117] "RemoveContainer" containerID="586c59b7fd9917340deeb4a1389b04f7a52d537475c2bdbf6f24b3127c1ee21d"
Apr 19 12:35:03.402117 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:35:03.402101 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"586c59b7fd9917340deeb4a1389b04f7a52d537475c2bdbf6f24b3127c1ee21d\": container with ID starting with 586c59b7fd9917340deeb4a1389b04f7a52d537475c2bdbf6f24b3127c1ee21d not found: ID does not exist" containerID="586c59b7fd9917340deeb4a1389b04f7a52d537475c2bdbf6f24b3127c1ee21d"
Apr 19 12:35:03.402151 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.402121 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"586c59b7fd9917340deeb4a1389b04f7a52d537475c2bdbf6f24b3127c1ee21d"} err="failed to get container status \"586c59b7fd9917340deeb4a1389b04f7a52d537475c2bdbf6f24b3127c1ee21d\": rpc error: code = NotFound desc = could not find container \"586c59b7fd9917340deeb4a1389b04f7a52d537475c2bdbf6f24b3127c1ee21d\": container with ID starting with 586c59b7fd9917340deeb4a1389b04f7a52d537475c2bdbf6f24b3127c1ee21d not found: ID does not exist"
Apr 19 12:35:03.402151 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.402133 2567 scope.go:117] "RemoveContainer" containerID="c0622ba8ab959bea7840e3177b7d9d7c774973447484c7fc556327c4c9121bcc"
Apr 19 12:35:03.402348 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:35:03.402333 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0622ba8ab959bea7840e3177b7d9d7c774973447484c7fc556327c4c9121bcc\": container with ID starting with c0622ba8ab959bea7840e3177b7d9d7c774973447484c7fc556327c4c9121bcc not found: ID does not exist" containerID="c0622ba8ab959bea7840e3177b7d9d7c774973447484c7fc556327c4c9121bcc"
Apr 19 12:35:03.402387 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.402351 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0622ba8ab959bea7840e3177b7d9d7c774973447484c7fc556327c4c9121bcc"} err="failed to get container status \"c0622ba8ab959bea7840e3177b7d9d7c774973447484c7fc556327c4c9121bcc\": rpc error: code = NotFound desc = could not find container \"c0622ba8ab959bea7840e3177b7d9d7c774973447484c7fc556327c4c9121bcc\": container with ID starting with c0622ba8ab959bea7840e3177b7d9d7c774973447484c7fc556327c4c9121bcc not found: ID does not exist"
Apr 19 12:35:03.402387 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.402362 2567 scope.go:117] "RemoveContainer" containerID="402f639348ee896295a3db62a30a4f56114916c5fa031209e3f28386a6751d1e"
Apr 19 12:35:03.402537 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.402523 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"402f639348ee896295a3db62a30a4f56114916c5fa031209e3f28386a6751d1e"} err="failed to get container status \"402f639348ee896295a3db62a30a4f56114916c5fa031209e3f28386a6751d1e\": rpc error: code = NotFound desc = could not find container \"402f639348ee896295a3db62a30a4f56114916c5fa031209e3f28386a6751d1e\": container with ID starting with 402f639348ee896295a3db62a30a4f56114916c5fa031209e3f28386a6751d1e not found: ID does not exist"
Apr 19 12:35:03.402580 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.402537 2567 scope.go:117] "RemoveContainer" containerID="37eec079bfc56c2ea87dee7811ae890e110d171734b2bd92016cc6255608cac2"
Apr 19 12:35:03.402778 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.402762 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37eec079bfc56c2ea87dee7811ae890e110d171734b2bd92016cc6255608cac2"} err="failed to get container status \"37eec079bfc56c2ea87dee7811ae890e110d171734b2bd92016cc6255608cac2\": rpc error: code = NotFound desc = could not find container \"37eec079bfc56c2ea87dee7811ae890e110d171734b2bd92016cc6255608cac2\": container with ID starting with 37eec079bfc56c2ea87dee7811ae890e110d171734b2bd92016cc6255608cac2 not found: ID does not exist"
Apr 19 12:35:03.402829 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.402779 2567 scope.go:117] "RemoveContainer" containerID="38eb3876372b7c694f7b46aae5ba5ba0f717fe1dfb2cd3bd376291147488b8b3"
Apr 19 12:35:03.402956 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.402942 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38eb3876372b7c694f7b46aae5ba5ba0f717fe1dfb2cd3bd376291147488b8b3"} err="failed to get container status \"38eb3876372b7c694f7b46aae5ba5ba0f717fe1dfb2cd3bd376291147488b8b3\": rpc error: code = NotFound desc = could not find container \"38eb3876372b7c694f7b46aae5ba5ba0f717fe1dfb2cd3bd376291147488b8b3\": container with ID starting with 38eb3876372b7c694f7b46aae5ba5ba0f717fe1dfb2cd3bd376291147488b8b3 not found: ID does not exist"
Apr 19 12:35:03.403011 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.402956 2567 scope.go:117] "RemoveContainer" containerID="66616e9bc0f163ef5494e2eb7a44f83f4be480a11daab2a248c9103ca757d5d3"
Apr 19 12:35:03.403098 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.403083 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66616e9bc0f163ef5494e2eb7a44f83f4be480a11daab2a248c9103ca757d5d3"} err="failed to get container status \"66616e9bc0f163ef5494e2eb7a44f83f4be480a11daab2a248c9103ca757d5d3\": rpc error: code = NotFound desc = could not find container \"66616e9bc0f163ef5494e2eb7a44f83f4be480a11daab2a248c9103ca757d5d3\": container with ID starting with 66616e9bc0f163ef5494e2eb7a44f83f4be480a11daab2a248c9103ca757d5d3 not found: ID does not exist"
Apr 19 12:35:03.403369 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.403098 2567 scope.go:117] "RemoveContainer" containerID="ed2a9bc3826f45907928f0b1d2a9d5cfb322b96549adf0337e509d17cb2ad47f"
Apr 19 12:35:03.403369 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.403355 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed2a9bc3826f45907928f0b1d2a9d5cfb322b96549adf0337e509d17cb2ad47f"} err="failed to get container status \"ed2a9bc3826f45907928f0b1d2a9d5cfb322b96549adf0337e509d17cb2ad47f\": rpc error: code = NotFound desc = could not find container \"ed2a9bc3826f45907928f0b1d2a9d5cfb322b96549adf0337e509d17cb2ad47f\": container with ID starting with ed2a9bc3826f45907928f0b1d2a9d5cfb322b96549adf0337e509d17cb2ad47f not found: ID does not exist"
Apr 19 12:35:03.403481 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.403371 2567 scope.go:117] "RemoveContainer" containerID="586c59b7fd9917340deeb4a1389b04f7a52d537475c2bdbf6f24b3127c1ee21d"
Apr 19 12:35:03.403592 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.403567 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"586c59b7fd9917340deeb4a1389b04f7a52d537475c2bdbf6f24b3127c1ee21d"} err="failed to get container status \"586c59b7fd9917340deeb4a1389b04f7a52d537475c2bdbf6f24b3127c1ee21d\": rpc error: code = NotFound desc = could not find container \"586c59b7fd9917340deeb4a1389b04f7a52d537475c2bdbf6f24b3127c1ee21d\": container with ID starting with 586c59b7fd9917340deeb4a1389b04f7a52d537475c2bdbf6f24b3127c1ee21d not found: ID does not exist"
Apr 19 12:35:03.403592 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.403590 2567 scope.go:117] "RemoveContainer" containerID="c0622ba8ab959bea7840e3177b7d9d7c774973447484c7fc556327c4c9121bcc"
Apr 19 12:35:03.403777 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.403761 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.403826 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.403803 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0622ba8ab959bea7840e3177b7d9d7c774973447484c7fc556327c4c9121bcc"} err="failed to get container status \"c0622ba8ab959bea7840e3177b7d9d7c774973447484c7fc556327c4c9121bcc\": rpc error: code = NotFound desc = could not find container \"c0622ba8ab959bea7840e3177b7d9d7c774973447484c7fc556327c4c9121bcc\": container with ID starting with c0622ba8ab959bea7840e3177b7d9d7c774973447484c7fc556327c4c9121bcc not found: ID does not exist"
Apr 19 12:35:03.406029 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.406009 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 19 12:35:03.406131 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.406105 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 19 12:35:03.406221 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.406131 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 19 12:35:03.406221 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.406139 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 19 12:35:03.406221 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.406145 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 19 12:35:03.406365 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.406226 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 19 12:35:03.406442 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.406422 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 19 12:35:03.406489 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.406474 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 19 12:35:03.406666 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.406650 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-2lbj7\""
Apr 19 12:35:03.411199 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.411181 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 19 12:35:03.414277 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.414250 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 19 12:35:03.575350 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.575249 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2bb33194-d022-4b4f-8510-d23f793f4a39-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.575350 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.575296 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2bb33194-d022-4b4f-8510-d23f793f4a39-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.575350 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.575319 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2bb33194-d022-4b4f-8510-d23f793f4a39-web-config\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.575350 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.575347 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2bb33194-d022-4b4f-8510-d23f793f4a39-config-volume\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.575668 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.575369 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2bb33194-d022-4b4f-8510-d23f793f4a39-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.575668 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.575392 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2bb33194-d022-4b4f-8510-d23f793f4a39-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.575668 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.575426 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2bb33194-d022-4b4f-8510-d23f793f4a39-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.575668 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.575446 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2bb33194-d022-4b4f-8510-d23f793f4a39-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.575668 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.575462 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2bb33194-d022-4b4f-8510-d23f793f4a39-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.575668 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.575518 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2bb33194-d022-4b4f-8510-d23f793f4a39-config-out\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.575668 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.575537 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlvb6\" (UniqueName: \"kubernetes.io/projected/2bb33194-d022-4b4f-8510-d23f793f4a39-kube-api-access-hlvb6\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.575668 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.575564 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2bb33194-d022-4b4f-8510-d23f793f4a39-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.575668 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.575586 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2bb33194-d022-4b4f-8510-d23f793f4a39-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.676226 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.676189 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2bb33194-d022-4b4f-8510-d23f793f4a39-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.676226 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.676229 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2bb33194-d022-4b4f-8510-d23f793f4a39-web-config\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.676424 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.676249 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2bb33194-d022-4b4f-8510-d23f793f4a39-config-volume\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.676424 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.676267 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2bb33194-d022-4b4f-8510-d23f793f4a39-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.676424 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.676299 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2bb33194-d022-4b4f-8510-d23f793f4a39-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.676424 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.676328 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2bb33194-d022-4b4f-8510-d23f793f4a39-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.676424 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.676359 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2bb33194-d022-4b4f-8510-d23f793f4a39-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.676424 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.676387 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2bb33194-d022-4b4f-8510-d23f793f4a39-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.676664 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.676426 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2bb33194-d022-4b4f-8510-d23f793f4a39-config-out\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.676664 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.676450 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlvb6\" (UniqueName: \"kubernetes.io/projected/2bb33194-d022-4b4f-8510-d23f793f4a39-kube-api-access-hlvb6\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.676664 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.676482 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2bb33194-d022-4b4f-8510-d23f793f4a39-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.676664 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.676506 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2bb33194-d022-4b4f-8510-d23f793f4a39-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.676664 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.676560 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2bb33194-d022-4b4f-8510-d23f793f4a39-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.677195 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.677153 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2bb33194-d022-4b4f-8510-d23f793f4a39-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.677303 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.677264 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2bb33194-d022-4b4f-8510-d23f793f4a39-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.679474 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.679448 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2bb33194-d022-4b4f-8510-d23f793f4a39-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.679601 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.679452 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2bb33194-d022-4b4f-8510-d23f793f4a39-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.679601 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.679531 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2bb33194-d022-4b4f-8510-d23f793f4a39-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.679601 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.679539 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2bb33194-d022-4b4f-8510-d23f793f4a39-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.679601 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.679591 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2bb33194-d022-4b4f-8510-d23f793f4a39-config-volume\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.679764 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.679687 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2bb33194-d022-4b4f-8510-d23f793f4a39-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.679764 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.679754 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2bb33194-d022-4b4f-8510-d23f793f4a39-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.679961 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.679945 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2bb33194-d022-4b4f-8510-d23f793f4a39-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.679998 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.679962 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2bb33194-d022-4b4f-8510-d23f793f4a39-config-out\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.681220 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.681193 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2bb33194-d022-4b4f-8510-d23f793f4a39-web-config\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.683533 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.683517 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlvb6\" (UniqueName: \"kubernetes.io/projected/2bb33194-d022-4b4f-8510-d23f793f4a39-kube-api-access-hlvb6\") pod \"alertmanager-main-0\" (UID: \"2bb33194-d022-4b4f-8510-d23f793f4a39\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.715609 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.715578 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:35:03.840482 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:03.840388 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 19 12:35:03.843859 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:35:03.843829 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bb33194_d022_4b4f_8510_d23f793f4a39.slice/crio-9e7dcea939b19e242e9da3f032f1b258f3a1d333d7f61db967b058055431598c WatchSource:0}: Error finding container 9e7dcea939b19e242e9da3f032f1b258f3a1d333d7f61db967b058055431598c: Status 404 returned error can't find the container with id 9e7dcea939b19e242e9da3f032f1b258f3a1d333d7f61db967b058055431598c
Apr 19 12:35:04.349583 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:04.349550 2567 generic.go:358] "Generic (PLEG): container finished" podID="2bb33194-d022-4b4f-8510-d23f793f4a39" containerID="fcc55b4abe6af79135d5ecd74a29f26a3896b9e583a8482c4ee45de69277dd1a" exitCode=0
Apr 19 12:35:04.350121 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:04.349622 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2bb33194-d022-4b4f-8510-d23f793f4a39","Type":"ContainerDied","Data":"fcc55b4abe6af79135d5ecd74a29f26a3896b9e583a8482c4ee45de69277dd1a"}
Apr 19 12:35:04.350121 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:04.349645
2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2bb33194-d022-4b4f-8510-d23f793f4a39","Type":"ContainerStarted","Data":"9e7dcea939b19e242e9da3f032f1b258f3a1d333d7f61db967b058055431598c"} Apr 19 12:35:04.459254 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:04.459217 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="effa5082-f6a7-4251-9183-c99891f7e9e3" path="/var/lib/kubelet/pods/effa5082-f6a7-4251-9183-c99891f7e9e3/volumes" Apr 19 12:35:05.356364 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:05.356328 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2bb33194-d022-4b4f-8510-d23f793f4a39","Type":"ContainerStarted","Data":"aeb0b04807e5ad89448752bdfbf187e63957b3f56db6fcebdc6ba18f89638cbe"} Apr 19 12:35:05.356364 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:05.356365 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2bb33194-d022-4b4f-8510-d23f793f4a39","Type":"ContainerStarted","Data":"1dd7c76d1daf57a89cb45a09d4de80ebcd4eae59e4c38818c0d7dd3fc7eace9a"} Apr 19 12:35:05.356759 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:05.356375 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2bb33194-d022-4b4f-8510-d23f793f4a39","Type":"ContainerStarted","Data":"ab0874891aba992373ae540ec94b84447abfa16248188543e008886eb36d4b89"} Apr 19 12:35:05.356759 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:05.356383 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2bb33194-d022-4b4f-8510-d23f793f4a39","Type":"ContainerStarted","Data":"2f4e72601b339c150cd6eb6b8b8126af3a8ef047382cf3947560d84faf4b66ae"} Apr 19 12:35:05.356759 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:05.356391 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2bb33194-d022-4b4f-8510-d23f793f4a39","Type":"ContainerStarted","Data":"be504b42b2ef8668ccf2fc5462c61954658f8d9401cc73c424b875061d030c4c"} Apr 19 12:35:05.356759 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:05.356399 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2bb33194-d022-4b4f-8510-d23f793f4a39","Type":"ContainerStarted","Data":"f7145b38e39cb862d374b99a9adb2e207abf1f5baaaaf39102c84374bd893b38"} Apr 19 12:35:05.384055 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:05.380780 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.380758381 podStartE2EDuration="2.380758381s" podCreationTimestamp="2026-04-19 12:35:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:35:05.378334022 +0000 UTC m=+273.532267502" watchObservedRunningTime="2026-04-19 12:35:05.380758381 +0000 UTC m=+273.534691862" Apr 19 12:35:05.973248 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:05.973207 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5688fbc97b-n56sh"] Apr 19 12:35:05.977435 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:05.977411 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:05.979441 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:05.979410 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 19 12:35:05.979441 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:05.979432 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 19 12:35:05.979441 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:05.979421 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 19 12:35:05.979708 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:05.979415 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 19 12:35:05.979708 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:05.979580 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 19 12:35:05.979708 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:05.979593 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-fxg9b\"" Apr 19 12:35:05.985120 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:05.985093 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 19 12:35:05.989763 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:05.989737 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5688fbc97b-n56sh"] Apr 19 12:35:06.096903 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.096865 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1c35bac-a14c-4d36-8232-340f4f8b34be-serving-certs-ca-bundle\") pod \"telemeter-client-5688fbc97b-n56sh\" (UID: \"e1c35bac-a14c-4d36-8232-340f4f8b34be\") " pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:06.096903 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.096900 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/e1c35bac-a14c-4d36-8232-340f4f8b34be-secret-telemeter-client\") pod \"telemeter-client-5688fbc97b-n56sh\" (UID: \"e1c35bac-a14c-4d36-8232-340f4f8b34be\") " pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:06.097106 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.096936 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/e1c35bac-a14c-4d36-8232-340f4f8b34be-federate-client-tls\") pod \"telemeter-client-5688fbc97b-n56sh\" (UID: \"e1c35bac-a14c-4d36-8232-340f4f8b34be\") " pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:06.097106 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.097031 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e1c35bac-a14c-4d36-8232-340f4f8b34be-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5688fbc97b-n56sh\" (UID: \"e1c35bac-a14c-4d36-8232-340f4f8b34be\") " pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:06.097106 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.097076 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e1c35bac-a14c-4d36-8232-340f4f8b34be-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5688fbc97b-n56sh\" (UID: \"e1c35bac-a14c-4d36-8232-340f4f8b34be\") " pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:06.097244 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.097109 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/e1c35bac-a14c-4d36-8232-340f4f8b34be-telemeter-client-tls\") pod \"telemeter-client-5688fbc97b-n56sh\" (UID: \"e1c35bac-a14c-4d36-8232-340f4f8b34be\") " pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:06.097244 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.097141 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rkdp\" (UniqueName: \"kubernetes.io/projected/e1c35bac-a14c-4d36-8232-340f4f8b34be-kube-api-access-4rkdp\") pod \"telemeter-client-5688fbc97b-n56sh\" (UID: \"e1c35bac-a14c-4d36-8232-340f4f8b34be\") " pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:06.097244 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.097192 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e1c35bac-a14c-4d36-8232-340f4f8b34be-metrics-client-ca\") pod \"telemeter-client-5688fbc97b-n56sh\" (UID: \"e1c35bac-a14c-4d36-8232-340f4f8b34be\") " pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:06.198372 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.198331 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e1c35bac-a14c-4d36-8232-340f4f8b34be-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5688fbc97b-n56sh\" 
(UID: \"e1c35bac-a14c-4d36-8232-340f4f8b34be\") " pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:06.198541 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.198386 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1c35bac-a14c-4d36-8232-340f4f8b34be-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5688fbc97b-n56sh\" (UID: \"e1c35bac-a14c-4d36-8232-340f4f8b34be\") " pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:06.198541 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.198417 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/e1c35bac-a14c-4d36-8232-340f4f8b34be-telemeter-client-tls\") pod \"telemeter-client-5688fbc97b-n56sh\" (UID: \"e1c35bac-a14c-4d36-8232-340f4f8b34be\") " pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:06.198541 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.198447 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rkdp\" (UniqueName: \"kubernetes.io/projected/e1c35bac-a14c-4d36-8232-340f4f8b34be-kube-api-access-4rkdp\") pod \"telemeter-client-5688fbc97b-n56sh\" (UID: \"e1c35bac-a14c-4d36-8232-340f4f8b34be\") " pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:06.198541 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.198488 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e1c35bac-a14c-4d36-8232-340f4f8b34be-metrics-client-ca\") pod \"telemeter-client-5688fbc97b-n56sh\" (UID: \"e1c35bac-a14c-4d36-8232-340f4f8b34be\") " pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:06.198541 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.198532 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1c35bac-a14c-4d36-8232-340f4f8b34be-serving-certs-ca-bundle\") pod \"telemeter-client-5688fbc97b-n56sh\" (UID: \"e1c35bac-a14c-4d36-8232-340f4f8b34be\") " pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:06.198805 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.198555 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/e1c35bac-a14c-4d36-8232-340f4f8b34be-secret-telemeter-client\") pod \"telemeter-client-5688fbc97b-n56sh\" (UID: \"e1c35bac-a14c-4d36-8232-340f4f8b34be\") " pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:06.198805 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.198617 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/e1c35bac-a14c-4d36-8232-340f4f8b34be-federate-client-tls\") pod \"telemeter-client-5688fbc97b-n56sh\" (UID: \"e1c35bac-a14c-4d36-8232-340f4f8b34be\") " pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:06.199375 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.199339 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1c35bac-a14c-4d36-8232-340f4f8b34be-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5688fbc97b-n56sh\" (UID: \"e1c35bac-a14c-4d36-8232-340f4f8b34be\") " pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:06.199539 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.199516 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e1c35bac-a14c-4d36-8232-340f4f8b34be-metrics-client-ca\") pod \"telemeter-client-5688fbc97b-n56sh\" (UID: 
\"e1c35bac-a14c-4d36-8232-340f4f8b34be\") " pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:06.199910 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.199885 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1c35bac-a14c-4d36-8232-340f4f8b34be-serving-certs-ca-bundle\") pod \"telemeter-client-5688fbc97b-n56sh\" (UID: \"e1c35bac-a14c-4d36-8232-340f4f8b34be\") " pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:06.201261 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.201232 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/e1c35bac-a14c-4d36-8232-340f4f8b34be-federate-client-tls\") pod \"telemeter-client-5688fbc97b-n56sh\" (UID: \"e1c35bac-a14c-4d36-8232-340f4f8b34be\") " pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:06.201453 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.201420 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e1c35bac-a14c-4d36-8232-340f4f8b34be-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5688fbc97b-n56sh\" (UID: \"e1c35bac-a14c-4d36-8232-340f4f8b34be\") " pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:06.201700 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.201679 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/e1c35bac-a14c-4d36-8232-340f4f8b34be-telemeter-client-tls\") pod \"telemeter-client-5688fbc97b-n56sh\" (UID: \"e1c35bac-a14c-4d36-8232-340f4f8b34be\") " pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:06.201892 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.201871 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/e1c35bac-a14c-4d36-8232-340f4f8b34be-secret-telemeter-client\") pod \"telemeter-client-5688fbc97b-n56sh\" (UID: \"e1c35bac-a14c-4d36-8232-340f4f8b34be\") " pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:06.206813 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.206789 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rkdp\" (UniqueName: \"kubernetes.io/projected/e1c35bac-a14c-4d36-8232-340f4f8b34be-kube-api-access-4rkdp\") pod \"telemeter-client-5688fbc97b-n56sh\" (UID: \"e1c35bac-a14c-4d36-8232-340f4f8b34be\") " pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:06.292401 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.292312 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" Apr 19 12:35:06.434790 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:06.434757 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5688fbc97b-n56sh"] Apr 19 12:35:06.437557 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:35:06.437530 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1c35bac_a14c_4d36_8232_340f4f8b34be.slice/crio-6df472ec8e4501b079494c8b0f9fffaadaf82e8fbe1dcc4b406243858647fb05 WatchSource:0}: Error finding container 6df472ec8e4501b079494c8b0f9fffaadaf82e8fbe1dcc4b406243858647fb05: Status 404 returned error can't find the container with id 6df472ec8e4501b079494c8b0f9fffaadaf82e8fbe1dcc4b406243858647fb05 Apr 19 12:35:07.366117 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:07.366082 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" 
event={"ID":"e1c35bac-a14c-4d36-8232-340f4f8b34be","Type":"ContainerStarted","Data":"6df472ec8e4501b079494c8b0f9fffaadaf82e8fbe1dcc4b406243858647fb05"} Apr 19 12:35:09.375048 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:09.375013 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" event={"ID":"e1c35bac-a14c-4d36-8232-340f4f8b34be","Type":"ContainerStarted","Data":"101376d89be700d07b21d3e5b153e67d2a8335a4689b697413079c003d57de89"} Apr 19 12:35:09.375048 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:09.375050 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" event={"ID":"e1c35bac-a14c-4d36-8232-340f4f8b34be","Type":"ContainerStarted","Data":"a799439b9ce9914a1954a3f5c3128f00e01d2b7fc2331c2981a71e0c8889a3c9"} Apr 19 12:35:09.375494 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:09.375063 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" event={"ID":"e1c35bac-a14c-4d36-8232-340f4f8b34be","Type":"ContainerStarted","Data":"e4a86e5d00eb269dc720db3bc86e44626042c8b54271c8a4e468fdebcf3e3a74"} Apr 19 12:35:09.394909 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:09.394852 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5688fbc97b-n56sh" podStartSLOduration=2.385067217 podStartE2EDuration="4.394838392s" podCreationTimestamp="2026-04-19 12:35:05 +0000 UTC" firstStartedPulling="2026-04-19 12:35:06.439913993 +0000 UTC m=+274.593847451" lastFinishedPulling="2026-04-19 12:35:08.449685163 +0000 UTC m=+276.603618626" observedRunningTime="2026-04-19 12:35:09.39368868 +0000 UTC m=+277.547622160" watchObservedRunningTime="2026-04-19 12:35:09.394838392 +0000 UTC m=+277.548771871" Apr 19 12:35:11.034566 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.034533 2567 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-69f59dc65c-5cnwl"] Apr 19 12:35:11.038133 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.038106 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:11.043487 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.043448 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 19 12:35:11.043637 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.043557 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 19 12:35:11.043704 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.043640 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 19 12:35:11.043765 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.043710 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 19 12:35:11.045955 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.045924 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 19 12:35:11.046319 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.046287 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-8nsl6\"" Apr 19 12:35:11.055714 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.055686 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 19 12:35:11.062371 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.062344 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69f59dc65c-5cnwl"] Apr 19 12:35:11.139307 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.139274 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-console-serving-cert\") pod \"console-69f59dc65c-5cnwl\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:11.139481 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.139319 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-console-config\") pod \"console-69f59dc65c-5cnwl\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:11.139481 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.139409 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-trusted-ca-bundle\") pod \"console-69f59dc65c-5cnwl\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:11.139481 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.139449 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-console-oauth-config\") pod \"console-69f59dc65c-5cnwl\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:11.139481 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.139468 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-service-ca\") pod \"console-69f59dc65c-5cnwl\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " 
pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:11.139613 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.139489 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-oauth-serving-cert\") pod \"console-69f59dc65c-5cnwl\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:11.139613 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.139520 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkpdg\" (UniqueName: \"kubernetes.io/projected/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-kube-api-access-pkpdg\") pod \"console-69f59dc65c-5cnwl\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:11.240281 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.240246 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-oauth-serving-cert\") pod \"console-69f59dc65c-5cnwl\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:11.240281 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.240292 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pkpdg\" (UniqueName: \"kubernetes.io/projected/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-kube-api-access-pkpdg\") pod \"console-69f59dc65c-5cnwl\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:11.240536 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.240334 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-console-serving-cert\") pod \"console-69f59dc65c-5cnwl\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:11.240536 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.240364 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-console-config\") pod \"console-69f59dc65c-5cnwl\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:11.240536 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.240401 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-trusted-ca-bundle\") pod \"console-69f59dc65c-5cnwl\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:11.240536 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.240428 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-console-oauth-config\") pod \"console-69f59dc65c-5cnwl\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:11.240536 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.240449 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-service-ca\") pod \"console-69f59dc65c-5cnwl\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:11.241041 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.241013 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-oauth-serving-cert\") pod \"console-69f59dc65c-5cnwl\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:11.241205 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.241144 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-console-config\") pod \"console-69f59dc65c-5cnwl\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:11.241205 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.241187 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-service-ca\") pod \"console-69f59dc65c-5cnwl\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:11.241341 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.241326 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-trusted-ca-bundle\") pod \"console-69f59dc65c-5cnwl\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:11.242850 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.242833 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-console-oauth-config\") pod \"console-69f59dc65c-5cnwl\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:11.242957 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.242935 2567 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-console-serving-cert\") pod \"console-69f59dc65c-5cnwl\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:11.247304 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.247286 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkpdg\" (UniqueName: \"kubernetes.io/projected/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-kube-api-access-pkpdg\") pod \"console-69f59dc65c-5cnwl\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:11.349394 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.349304 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:11.466415 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:11.466387 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69f59dc65c-5cnwl"] Apr 19 12:35:11.469351 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:35:11.469309 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6df0a60e_2a8e_4591_bed9_c3ffc7a67101.slice/crio-2fc65fb77e5a821917dc952d88710a95117f2de45fb12dd0d76bdce0805bdcc9 WatchSource:0}: Error finding container 2fc65fb77e5a821917dc952d88710a95117f2de45fb12dd0d76bdce0805bdcc9: Status 404 returned error can't find the container with id 2fc65fb77e5a821917dc952d88710a95117f2de45fb12dd0d76bdce0805bdcc9 Apr 19 12:35:12.386783 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:12.386749 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69f59dc65c-5cnwl" event={"ID":"6df0a60e-2a8e-4591-bed9-c3ffc7a67101","Type":"ContainerStarted","Data":"8a3f1fcda0b9696502453569a1e050687b87832196f89fb30a2b749a7bb8e28d"} Apr 19 12:35:12.386783 
ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:12.386781 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69f59dc65c-5cnwl" event={"ID":"6df0a60e-2a8e-4591-bed9-c3ffc7a67101","Type":"ContainerStarted","Data":"2fc65fb77e5a821917dc952d88710a95117f2de45fb12dd0d76bdce0805bdcc9"} Apr 19 12:35:12.404049 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:12.403996 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69f59dc65c-5cnwl" podStartSLOduration=2.4039805530000002 podStartE2EDuration="2.403980553s" podCreationTimestamp="2026-04-19 12:35:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:35:12.401838334 +0000 UTC m=+280.555771814" watchObservedRunningTime="2026-04-19 12:35:12.403980553 +0000 UTC m=+280.557914033" Apr 19 12:35:21.349668 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:21.349627 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:21.350119 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:21.349685 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:21.354471 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:21.354445 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:21.418351 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:21.418320 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:35:32.342232 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:32.342205 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnbtf_2b04d28c-3eb3-44e1-b431-d6b75f3850fe/console-operator/1.log" 
Apr 19 12:35:32.342664 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:32.342211 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnbtf_2b04d28c-3eb3-44e1-b431-d6b75f3850fe/console-operator/1.log" Apr 19 12:35:32.355084 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:35:32.355058 2567 kubelet.go:1628] "Image garbage collection succeeded" Apr 19 12:36:18.210804 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.210749 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-58fff8c4c5-tqlfs"] Apr 19 12:36:18.214565 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.214543 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:18.223935 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.223911 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58fff8c4c5-tqlfs"] Apr 19 12:36:18.297846 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.297804 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr9sq\" (UniqueName: \"kubernetes.io/projected/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-kube-api-access-sr9sq\") pod \"console-58fff8c4c5-tqlfs\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:18.298047 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.297867 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-console-oauth-config\") pod \"console-58fff8c4c5-tqlfs\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:18.298047 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.297933 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-service-ca\") pod \"console-58fff8c4c5-tqlfs\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:18.298047 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.297973 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-console-serving-cert\") pod \"console-58fff8c4c5-tqlfs\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:18.298047 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.298005 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-trusted-ca-bundle\") pod \"console-58fff8c4c5-tqlfs\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:18.298047 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.298028 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-console-config\") pod \"console-58fff8c4c5-tqlfs\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:18.298268 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.298106 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-oauth-serving-cert\") pod \"console-58fff8c4c5-tqlfs\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " 
pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:18.398810 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.398767 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-trusted-ca-bundle\") pod \"console-58fff8c4c5-tqlfs\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:18.398810 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.398814 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-console-config\") pod \"console-58fff8c4c5-tqlfs\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:18.399097 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.398839 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-oauth-serving-cert\") pod \"console-58fff8c4c5-tqlfs\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:18.399097 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.398878 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sr9sq\" (UniqueName: \"kubernetes.io/projected/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-kube-api-access-sr9sq\") pod \"console-58fff8c4c5-tqlfs\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:18.399097 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.398907 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-console-oauth-config\") pod 
\"console-58fff8c4c5-tqlfs\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:18.399097 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.398948 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-service-ca\") pod \"console-58fff8c4c5-tqlfs\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:18.399097 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.398998 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-console-serving-cert\") pod \"console-58fff8c4c5-tqlfs\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:18.399674 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.399646 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-console-config\") pod \"console-58fff8c4c5-tqlfs\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:18.399778 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.399679 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-oauth-serving-cert\") pod \"console-58fff8c4c5-tqlfs\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:18.399834 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.399770 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-trusted-ca-bundle\") pod \"console-58fff8c4c5-tqlfs\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:18.399834 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.399792 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-service-ca\") pod \"console-58fff8c4c5-tqlfs\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:18.401430 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.401412 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-console-oauth-config\") pod \"console-58fff8c4c5-tqlfs\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:18.401533 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.401517 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-console-serving-cert\") pod \"console-58fff8c4c5-tqlfs\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:18.406446 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.406421 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr9sq\" (UniqueName: \"kubernetes.io/projected/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-kube-api-access-sr9sq\") pod \"console-58fff8c4c5-tqlfs\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:18.525288 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.525190 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:18.665956 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.665927 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58fff8c4c5-tqlfs"] Apr 19 12:36:18.668641 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:36:18.668610 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6bb8c7f_a442_43c7_88a2_aedc6cfd9347.slice/crio-046bc0dd9999499dbdbc5df5219aca321ddef981179e173fc85746141b0c511e WatchSource:0}: Error finding container 046bc0dd9999499dbdbc5df5219aca321ddef981179e173fc85746141b0c511e: Status 404 returned error can't find the container with id 046bc0dd9999499dbdbc5df5219aca321ddef981179e173fc85746141b0c511e Apr 19 12:36:18.670894 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:18.670873 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 19 12:36:19.598799 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:19.598762 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58fff8c4c5-tqlfs" event={"ID":"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347","Type":"ContainerStarted","Data":"f17fc991c8763f151a0dd0719e7782c205b391c647518bc413c70c642686557c"} Apr 19 12:36:19.598799 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:19.598800 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58fff8c4c5-tqlfs" event={"ID":"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347","Type":"ContainerStarted","Data":"046bc0dd9999499dbdbc5df5219aca321ddef981179e173fc85746141b0c511e"} Apr 19 12:36:28.525292 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:28.525250 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:28.525693 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:28.525557 2567 kubelet.go:2658] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:28.530463 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:28.530438 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:28.546665 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:28.546619 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-58fff8c4c5-tqlfs" podStartSLOduration=10.546606614 podStartE2EDuration="10.546606614s" podCreationTimestamp="2026-04-19 12:36:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:36:19.614856519 +0000 UTC m=+347.768790011" watchObservedRunningTime="2026-04-19 12:36:28.546606614 +0000 UTC m=+356.700540094" Apr 19 12:36:28.629366 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:28.629341 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:36:28.668356 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:28.668325 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69f59dc65c-5cnwl"] Apr 19 12:36:53.688126 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:53.688063 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-69f59dc65c-5cnwl" podUID="6df0a60e-2a8e-4591-bed9-c3ffc7a67101" containerName="console" containerID="cri-o://8a3f1fcda0b9696502453569a1e050687b87832196f89fb30a2b749a7bb8e28d" gracePeriod=15 Apr 19 12:36:53.927007 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:53.926980 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69f59dc65c-5cnwl_6df0a60e-2a8e-4591-bed9-c3ffc7a67101/console/0.log" Apr 19 12:36:53.927144 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:53.927055 2567 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:36:54.108056 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.107973 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-console-config\") pod \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " Apr 19 12:36:54.108056 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.108019 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-trusted-ca-bundle\") pod \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " Apr 19 12:36:54.108056 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.108046 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkpdg\" (UniqueName: \"kubernetes.io/projected/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-kube-api-access-pkpdg\") pod \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " Apr 19 12:36:54.108381 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.108074 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-oauth-serving-cert\") pod \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " Apr 19 12:36:54.108381 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.108100 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-console-serving-cert\") pod \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\" (UID: 
\"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " Apr 19 12:36:54.108381 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.108118 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-service-ca\") pod \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " Apr 19 12:36:54.108381 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.108145 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-console-oauth-config\") pod \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\" (UID: \"6df0a60e-2a8e-4591-bed9-c3ffc7a67101\") " Apr 19 12:36:54.108585 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.108455 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-console-config" (OuterVolumeSpecName: "console-config") pod "6df0a60e-2a8e-4591-bed9-c3ffc7a67101" (UID: "6df0a60e-2a8e-4591-bed9-c3ffc7a67101"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:36:54.108585 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.108486 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6df0a60e-2a8e-4591-bed9-c3ffc7a67101" (UID: "6df0a60e-2a8e-4591-bed9-c3ffc7a67101"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:36:54.108585 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.108561 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6df0a60e-2a8e-4591-bed9-c3ffc7a67101" (UID: "6df0a60e-2a8e-4591-bed9-c3ffc7a67101"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:36:54.108691 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.108616 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-service-ca" (OuterVolumeSpecName: "service-ca") pod "6df0a60e-2a8e-4591-bed9-c3ffc7a67101" (UID: "6df0a60e-2a8e-4591-bed9-c3ffc7a67101"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:36:54.110412 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.110382 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6df0a60e-2a8e-4591-bed9-c3ffc7a67101" (UID: "6df0a60e-2a8e-4591-bed9-c3ffc7a67101"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:36:54.110548 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.110459 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6df0a60e-2a8e-4591-bed9-c3ffc7a67101" (UID: "6df0a60e-2a8e-4591-bed9-c3ffc7a67101"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:36:54.110548 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.110471 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-kube-api-access-pkpdg" (OuterVolumeSpecName: "kube-api-access-pkpdg") pod "6df0a60e-2a8e-4591-bed9-c3ffc7a67101" (UID: "6df0a60e-2a8e-4591-bed9-c3ffc7a67101"). InnerVolumeSpecName "kube-api-access-pkpdg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:36:54.209260 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.209225 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-console-config\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:36:54.209260 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.209254 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-trusted-ca-bundle\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:36:54.209260 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.209266 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pkpdg\" (UniqueName: \"kubernetes.io/projected/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-kube-api-access-pkpdg\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:36:54.209486 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.209275 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-oauth-serving-cert\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:36:54.209486 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.209287 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-console-serving-cert\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:36:54.209486 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.209297 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-service-ca\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:36:54.209486 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.209305 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6df0a60e-2a8e-4591-bed9-c3ffc7a67101-console-oauth-config\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:36:54.712863 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.712831 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69f59dc65c-5cnwl_6df0a60e-2a8e-4591-bed9-c3ffc7a67101/console/0.log" Apr 19 12:36:54.713311 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.712876 2567 generic.go:358] "Generic (PLEG): container finished" podID="6df0a60e-2a8e-4591-bed9-c3ffc7a67101" containerID="8a3f1fcda0b9696502453569a1e050687b87832196f89fb30a2b749a7bb8e28d" exitCode=2 Apr 19 12:36:54.713311 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.712931 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69f59dc65c-5cnwl" event={"ID":"6df0a60e-2a8e-4591-bed9-c3ffc7a67101","Type":"ContainerDied","Data":"8a3f1fcda0b9696502453569a1e050687b87832196f89fb30a2b749a7bb8e28d"} Apr 19 12:36:54.713311 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.712955 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69f59dc65c-5cnwl" event={"ID":"6df0a60e-2a8e-4591-bed9-c3ffc7a67101","Type":"ContainerDied","Data":"2fc65fb77e5a821917dc952d88710a95117f2de45fb12dd0d76bdce0805bdcc9"} Apr 19 12:36:54.713311 ip-10-0-142-55 
kubenswrapper[2567]: I0419 12:36:54.712970 2567 scope.go:117] "RemoveContainer" containerID="8a3f1fcda0b9696502453569a1e050687b87832196f89fb30a2b749a7bb8e28d" Apr 19 12:36:54.713311 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.712974 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69f59dc65c-5cnwl" Apr 19 12:36:54.720870 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.720849 2567 scope.go:117] "RemoveContainer" containerID="8a3f1fcda0b9696502453569a1e050687b87832196f89fb30a2b749a7bb8e28d" Apr 19 12:36:54.721121 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:36:54.721103 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a3f1fcda0b9696502453569a1e050687b87832196f89fb30a2b749a7bb8e28d\": container with ID starting with 8a3f1fcda0b9696502453569a1e050687b87832196f89fb30a2b749a7bb8e28d not found: ID does not exist" containerID="8a3f1fcda0b9696502453569a1e050687b87832196f89fb30a2b749a7bb8e28d" Apr 19 12:36:54.721199 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.721129 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a3f1fcda0b9696502453569a1e050687b87832196f89fb30a2b749a7bb8e28d"} err="failed to get container status \"8a3f1fcda0b9696502453569a1e050687b87832196f89fb30a2b749a7bb8e28d\": rpc error: code = NotFound desc = could not find container \"8a3f1fcda0b9696502453569a1e050687b87832196f89fb30a2b749a7bb8e28d\": container with ID starting with 8a3f1fcda0b9696502453569a1e050687b87832196f89fb30a2b749a7bb8e28d not found: ID does not exist" Apr 19 12:36:54.728339 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.728316 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69f59dc65c-5cnwl"] Apr 19 12:36:54.732087 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:54.732068 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-69f59dc65c-5cnwl"] Apr 19 12:36:56.458460 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:36:56.458428 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6df0a60e-2a8e-4591-bed9-c3ffc7a67101" path="/var/lib/kubelet/pods/6df0a60e-2a8e-4591-bed9-c3ffc7a67101/volumes" Apr 19 12:37:11.291703 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:11.291666 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w"] Apr 19 12:37:11.292098 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:11.292028 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6df0a60e-2a8e-4591-bed9-c3ffc7a67101" containerName="console" Apr 19 12:37:11.292098 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:11.292040 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6df0a60e-2a8e-4591-bed9-c3ffc7a67101" containerName="console" Apr 19 12:37:11.292187 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:11.292114 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="6df0a60e-2a8e-4591-bed9-c3ffc7a67101" containerName="console" Apr 19 12:37:11.296725 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:11.296704 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w"
Apr 19 12:37:11.298735 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:11.298705 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 19 12:37:11.298869 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:11.298811 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 19 12:37:11.299191 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:11.299177 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-98xhq\""
Apr 19 12:37:11.302705 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:11.302678 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w"]
Apr 19 12:37:11.348872 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:11.348837 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e513d05e-0d9a-49d1-84a3-9764176f9512-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w\" (UID: \"e513d05e-0d9a-49d1-84a3-9764176f9512\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w"
Apr 19 12:37:11.348872 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:11.348868 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e513d05e-0d9a-49d1-84a3-9764176f9512-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w\" (UID: \"e513d05e-0d9a-49d1-84a3-9764176f9512\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w"
Apr 19 12:37:11.349101 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:11.348991 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rgfm\" (UniqueName: \"kubernetes.io/projected/e513d05e-0d9a-49d1-84a3-9764176f9512-kube-api-access-8rgfm\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w\" (UID: \"e513d05e-0d9a-49d1-84a3-9764176f9512\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w"
Apr 19 12:37:11.449794 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:11.449759 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rgfm\" (UniqueName: \"kubernetes.io/projected/e513d05e-0d9a-49d1-84a3-9764176f9512-kube-api-access-8rgfm\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w\" (UID: \"e513d05e-0d9a-49d1-84a3-9764176f9512\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w"
Apr 19 12:37:11.449953 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:11.449813 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e513d05e-0d9a-49d1-84a3-9764176f9512-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w\" (UID: \"e513d05e-0d9a-49d1-84a3-9764176f9512\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w"
Apr 19 12:37:11.449953 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:11.449832 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e513d05e-0d9a-49d1-84a3-9764176f9512-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w\" (UID: \"e513d05e-0d9a-49d1-84a3-9764176f9512\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w"
Apr 19 12:37:11.450236 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:11.450220 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e513d05e-0d9a-49d1-84a3-9764176f9512-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w\" (UID: \"e513d05e-0d9a-49d1-84a3-9764176f9512\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w"
Apr 19 12:37:11.450293 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:11.450247 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e513d05e-0d9a-49d1-84a3-9764176f9512-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w\" (UID: \"e513d05e-0d9a-49d1-84a3-9764176f9512\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w"
Apr 19 12:37:11.457532 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:11.457508 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rgfm\" (UniqueName: \"kubernetes.io/projected/e513d05e-0d9a-49d1-84a3-9764176f9512-kube-api-access-8rgfm\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w\" (UID: \"e513d05e-0d9a-49d1-84a3-9764176f9512\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w"
Apr 19 12:37:11.607870 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:11.607763 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w"
Apr 19 12:37:11.728594 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:11.728565 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w"]
Apr 19 12:37:11.731307 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:37:11.731278 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode513d05e_0d9a_49d1_84a3_9764176f9512.slice/crio-5e9850971c1bfec0b83bb97572bcb4539db1464e14bb1cd28ac2fb98ef095048 WatchSource:0}: Error finding container 5e9850971c1bfec0b83bb97572bcb4539db1464e14bb1cd28ac2fb98ef095048: Status 404 returned error can't find the container with id 5e9850971c1bfec0b83bb97572bcb4539db1464e14bb1cd28ac2fb98ef095048
Apr 19 12:37:11.773950 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:11.773911 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w" event={"ID":"e513d05e-0d9a-49d1-84a3-9764176f9512","Type":"ContainerStarted","Data":"5e9850971c1bfec0b83bb97572bcb4539db1464e14bb1cd28ac2fb98ef095048"}
Apr 19 12:37:18.799337 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:18.799296 2567 generic.go:358] "Generic (PLEG): container finished" podID="e513d05e-0d9a-49d1-84a3-9764176f9512" containerID="4ef9b749667b224cb3e36a5faa87cdb8075a2993866ed085518c82250f2cb4a6" exitCode=0
Apr 19 12:37:18.799716 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:18.799350 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w" event={"ID":"e513d05e-0d9a-49d1-84a3-9764176f9512","Type":"ContainerDied","Data":"4ef9b749667b224cb3e36a5faa87cdb8075a2993866ed085518c82250f2cb4a6"}
Apr 19 12:37:21.811490 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:21.811450 2567 generic.go:358] "Generic (PLEG): container finished" podID="e513d05e-0d9a-49d1-84a3-9764176f9512" containerID="46c01b7251c259bf03d1ce4770d6e332d85e2f9663a759fe284915266c094081" exitCode=0
Apr 19 12:37:21.811863 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:21.811500 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w" event={"ID":"e513d05e-0d9a-49d1-84a3-9764176f9512","Type":"ContainerDied","Data":"46c01b7251c259bf03d1ce4770d6e332d85e2f9663a759fe284915266c094081"}
Apr 19 12:37:29.842756 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:29.842723 2567 generic.go:358] "Generic (PLEG): container finished" podID="e513d05e-0d9a-49d1-84a3-9764176f9512" containerID="6af14d967bf39a75fa90cfc4f4f67a79e22935625cec8313fc150cc8687903f0" exitCode=0
Apr 19 12:37:29.843154 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:29.842812 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w" event={"ID":"e513d05e-0d9a-49d1-84a3-9764176f9512","Type":"ContainerDied","Data":"6af14d967bf39a75fa90cfc4f4f67a79e22935625cec8313fc150cc8687903f0"}
Apr 19 12:37:30.975863 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:30.975837 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w"
Apr 19 12:37:31.021911 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:31.021870 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e513d05e-0d9a-49d1-84a3-9764176f9512-util\") pod \"e513d05e-0d9a-49d1-84a3-9764176f9512\" (UID: \"e513d05e-0d9a-49d1-84a3-9764176f9512\") "
Apr 19 12:37:31.022100 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:31.021948 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e513d05e-0d9a-49d1-84a3-9764176f9512-bundle\") pod \"e513d05e-0d9a-49d1-84a3-9764176f9512\" (UID: \"e513d05e-0d9a-49d1-84a3-9764176f9512\") "
Apr 19 12:37:31.022100 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:31.021974 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rgfm\" (UniqueName: \"kubernetes.io/projected/e513d05e-0d9a-49d1-84a3-9764176f9512-kube-api-access-8rgfm\") pod \"e513d05e-0d9a-49d1-84a3-9764176f9512\" (UID: \"e513d05e-0d9a-49d1-84a3-9764176f9512\") "
Apr 19 12:37:31.022595 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:31.022561 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e513d05e-0d9a-49d1-84a3-9764176f9512-bundle" (OuterVolumeSpecName: "bundle") pod "e513d05e-0d9a-49d1-84a3-9764176f9512" (UID: "e513d05e-0d9a-49d1-84a3-9764176f9512"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 19 12:37:31.024187 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:31.024151 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e513d05e-0d9a-49d1-84a3-9764176f9512-kube-api-access-8rgfm" (OuterVolumeSpecName: "kube-api-access-8rgfm") pod "e513d05e-0d9a-49d1-84a3-9764176f9512" (UID: "e513d05e-0d9a-49d1-84a3-9764176f9512"). InnerVolumeSpecName "kube-api-access-8rgfm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:37:31.026770 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:31.026749 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e513d05e-0d9a-49d1-84a3-9764176f9512-util" (OuterVolumeSpecName: "util") pod "e513d05e-0d9a-49d1-84a3-9764176f9512" (UID: "e513d05e-0d9a-49d1-84a3-9764176f9512"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 19 12:37:31.122863 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:31.122771 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e513d05e-0d9a-49d1-84a3-9764176f9512-util\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:37:31.122863 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:31.122806 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e513d05e-0d9a-49d1-84a3-9764176f9512-bundle\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:37:31.122863 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:31.122816 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8rgfm\" (UniqueName: \"kubernetes.io/projected/e513d05e-0d9a-49d1-84a3-9764176f9512-kube-api-access-8rgfm\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:37:31.850998 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:31.850962 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w" event={"ID":"e513d05e-0d9a-49d1-84a3-9764176f9512","Type":"ContainerDied","Data":"5e9850971c1bfec0b83bb97572bcb4539db1464e14bb1cd28ac2fb98ef095048"}
Apr 19 12:37:31.850998 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:31.850990 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dk89w"
Apr 19 12:37:31.850998 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:31.850999 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e9850971c1bfec0b83bb97572bcb4539db1464e14bb1cd28ac2fb98ef095048"
Apr 19 12:37:38.377980 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:38.377940 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-g7bt6"]
Apr 19 12:37:38.378523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:38.378481 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e513d05e-0d9a-49d1-84a3-9764176f9512" containerName="extract"
Apr 19 12:37:38.378523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:38.378500 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e513d05e-0d9a-49d1-84a3-9764176f9512" containerName="extract"
Apr 19 12:37:38.378633 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:38.378529 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e513d05e-0d9a-49d1-84a3-9764176f9512" containerName="util"
Apr 19 12:37:38.378633 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:38.378538 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e513d05e-0d9a-49d1-84a3-9764176f9512" containerName="util"
Apr 19 12:37:38.378633 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:38.378550 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e513d05e-0d9a-49d1-84a3-9764176f9512" containerName="pull"
Apr 19 12:37:38.378633 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:38.378559 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e513d05e-0d9a-49d1-84a3-9764176f9512" containerName="pull"
Apr 19 12:37:38.378824 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:38.378646 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="e513d05e-0d9a-49d1-84a3-9764176f9512" containerName="extract"
Apr 19 12:37:38.383367 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:38.383347 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-g7bt6"
Apr 19 12:37:38.385178 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:38.385143 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 19 12:37:38.385314 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:38.385146 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-tlf87\""
Apr 19 12:37:38.385314 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:38.385144 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 19 12:37:38.392674 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:38.392630 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-g7bt6"]
Apr 19 12:37:38.485107 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:38.485074 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff807976-15c5-4100-bf67-6415a777e536-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-g7bt6\" (UID: \"ff807976-15c5-4100-bf67-6415a777e536\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-g7bt6"
Apr 19 12:37:38.485107 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:38.485106 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f6z5\" (UniqueName: \"kubernetes.io/projected/ff807976-15c5-4100-bf67-6415a777e536-kube-api-access-7f6z5\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-g7bt6\" (UID: \"ff807976-15c5-4100-bf67-6415a777e536\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-g7bt6"
Apr 19 12:37:38.586181 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:38.586119 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff807976-15c5-4100-bf67-6415a777e536-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-g7bt6\" (UID: \"ff807976-15c5-4100-bf67-6415a777e536\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-g7bt6"
Apr 19 12:37:38.586364 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:38.586196 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7f6z5\" (UniqueName: \"kubernetes.io/projected/ff807976-15c5-4100-bf67-6415a777e536-kube-api-access-7f6z5\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-g7bt6\" (UID: \"ff807976-15c5-4100-bf67-6415a777e536\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-g7bt6"
Apr 19 12:37:38.586609 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:38.586585 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff807976-15c5-4100-bf67-6415a777e536-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-g7bt6\" (UID: \"ff807976-15c5-4100-bf67-6415a777e536\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-g7bt6"
Apr 19 12:37:38.608756 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:38.608724 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f6z5\" (UniqueName: \"kubernetes.io/projected/ff807976-15c5-4100-bf67-6415a777e536-kube-api-access-7f6z5\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-g7bt6\" (UID: \"ff807976-15c5-4100-bf67-6415a777e536\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-g7bt6"
Apr 19 12:37:38.693374 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:38.693337 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-g7bt6"
Apr 19 12:37:38.821243 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:38.821215 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-g7bt6"]
Apr 19 12:37:38.823992 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:37:38.823963 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff807976_15c5_4100_bf67_6415a777e536.slice/crio-6b8ae321a1406fe0d43def94452318e0c5b1c9caeead3ac164bdc768a255ec3f WatchSource:0}: Error finding container 6b8ae321a1406fe0d43def94452318e0c5b1c9caeead3ac164bdc768a255ec3f: Status 404 returned error can't find the container with id 6b8ae321a1406fe0d43def94452318e0c5b1c9caeead3ac164bdc768a255ec3f
Apr 19 12:37:38.874319 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:38.874286 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-g7bt6" event={"ID":"ff807976-15c5-4100-bf67-6415a777e536","Type":"ContainerStarted","Data":"6b8ae321a1406fe0d43def94452318e0c5b1c9caeead3ac164bdc768a255ec3f"}
Apr 19 12:37:40.883520 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:40.883434 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-g7bt6" event={"ID":"ff807976-15c5-4100-bf67-6415a777e536","Type":"ContainerStarted","Data":"28cc2b68cad434510208e3abafd39b08efd5c1ef59d0b5820981fd09740ce1e1"}
Apr 19 12:37:40.904524 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:40.904475 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-g7bt6" podStartSLOduration=1.268257142 podStartE2EDuration="2.904459449s" podCreationTimestamp="2026-04-19 12:37:38 +0000 UTC" firstStartedPulling="2026-04-19 12:37:38.826546329 +0000 UTC m=+426.980479788" lastFinishedPulling="2026-04-19 12:37:40.462748632 +0000 UTC m=+428.616682095" observedRunningTime="2026-04-19 12:37:40.903493326 +0000 UTC m=+429.057426839" watchObservedRunningTime="2026-04-19 12:37:40.904459449 +0000 UTC m=+429.058392929"
Apr 19 12:37:42.083348 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:42.083308 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt"]
Apr 19 12:37:42.086946 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:42.086926 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt"
Apr 19 12:37:42.088750 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:42.088733 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 19 12:37:42.089205 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:42.089180 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-98xhq\""
Apr 19 12:37:42.089205 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:42.089201 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 19 12:37:42.093609 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:42.093586 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt"]
Apr 19 12:37:42.221684 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:42.221647 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d7c2a15d-f450-4094-beff-d2d2ea145c33-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt\" (UID: \"d7c2a15d-f450-4094-beff-d2d2ea145c33\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt"
Apr 19 12:37:42.221684 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:42.221698 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gp6s\" (UniqueName: \"kubernetes.io/projected/d7c2a15d-f450-4094-beff-d2d2ea145c33-kube-api-access-5gp6s\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt\" (UID: \"d7c2a15d-f450-4094-beff-d2d2ea145c33\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt"
Apr 19 12:37:42.221913 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:42.221734 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d7c2a15d-f450-4094-beff-d2d2ea145c33-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt\" (UID: \"d7c2a15d-f450-4094-beff-d2d2ea145c33\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt"
Apr 19 12:37:42.322428 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:42.322388 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d7c2a15d-f450-4094-beff-d2d2ea145c33-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt\" (UID: \"d7c2a15d-f450-4094-beff-d2d2ea145c33\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt"
Apr 19 12:37:42.322596 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:42.322437 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gp6s\" (UniqueName: \"kubernetes.io/projected/d7c2a15d-f450-4094-beff-d2d2ea145c33-kube-api-access-5gp6s\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt\" (UID: \"d7c2a15d-f450-4094-beff-d2d2ea145c33\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt"
Apr 19 12:37:42.322596 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:42.322476 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d7c2a15d-f450-4094-beff-d2d2ea145c33-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt\" (UID: \"d7c2a15d-f450-4094-beff-d2d2ea145c33\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt"
Apr 19 12:37:42.322778 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:42.322758 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d7c2a15d-f450-4094-beff-d2d2ea145c33-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt\" (UID: \"d7c2a15d-f450-4094-beff-d2d2ea145c33\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt"
Apr 19 12:37:42.322819 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:42.322802 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d7c2a15d-f450-4094-beff-d2d2ea145c33-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt\" (UID: \"d7c2a15d-f450-4094-beff-d2d2ea145c33\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt"
Apr 19 12:37:42.329686 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:42.329666 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gp6s\" (UniqueName: \"kubernetes.io/projected/d7c2a15d-f450-4094-beff-d2d2ea145c33-kube-api-access-5gp6s\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt\" (UID: \"d7c2a15d-f450-4094-beff-d2d2ea145c33\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt"
Apr 19 12:37:42.397706 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:42.397590 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt"
Apr 19 12:37:42.528413 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:42.528377 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt"]
Apr 19 12:37:42.531131 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:37:42.531103 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7c2a15d_f450_4094_beff_d2d2ea145c33.slice/crio-67fa8f3b0a486e7b7214d637fa1813429d99620525354f9595a3dcb928700653 WatchSource:0}: Error finding container 67fa8f3b0a486e7b7214d637fa1813429d99620525354f9595a3dcb928700653: Status 404 returned error can't find the container with id 67fa8f3b0a486e7b7214d637fa1813429d99620525354f9595a3dcb928700653
Apr 19 12:37:42.891419 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:42.891385 2567 generic.go:358] "Generic (PLEG): container finished" podID="d7c2a15d-f450-4094-beff-d2d2ea145c33" containerID="4400bce851fa146b1dcb3bb8da2b1d7b9e0f7e0cfaae4bdd118686d2682a457b" exitCode=0
Apr 19 12:37:42.891612 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:42.891472 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt" event={"ID":"d7c2a15d-f450-4094-beff-d2d2ea145c33","Type":"ContainerDied","Data":"4400bce851fa146b1dcb3bb8da2b1d7b9e0f7e0cfaae4bdd118686d2682a457b"}
Apr 19 12:37:42.891612 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:42.891512 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt" event={"ID":"d7c2a15d-f450-4094-beff-d2d2ea145c33","Type":"ContainerStarted","Data":"67fa8f3b0a486e7b7214d637fa1813429d99620525354f9595a3dcb928700653"}
Apr 19 12:37:43.937152 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:43.937113 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-h78dc"]
Apr 19 12:37:43.940540 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:43.940524 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-h78dc"
Apr 19 12:37:43.942629 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:43.942603 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 19 12:37:43.942931 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:43.942912 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 19 12:37:43.943286 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:43.943250 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-6dcct\""
Apr 19 12:37:43.947996 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:43.947881 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-h78dc"]
Apr 19 12:37:44.041490 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:44.041454 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/620845ac-07f2-4fe5-b292-9d099bc69df3-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-h78dc\" (UID: \"620845ac-07f2-4fe5-b292-9d099bc69df3\") " pod="cert-manager/cert-manager-webhook-597b96b99b-h78dc"
Apr 19 12:37:44.041490 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:44.041500 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl9xm\" (UniqueName: \"kubernetes.io/projected/620845ac-07f2-4fe5-b292-9d099bc69df3-kube-api-access-cl9xm\") pod \"cert-manager-webhook-597b96b99b-h78dc\" (UID: \"620845ac-07f2-4fe5-b292-9d099bc69df3\") " pod="cert-manager/cert-manager-webhook-597b96b99b-h78dc"
Apr 19 12:37:44.142664 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:44.142633 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/620845ac-07f2-4fe5-b292-9d099bc69df3-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-h78dc\" (UID: \"620845ac-07f2-4fe5-b292-9d099bc69df3\") " pod="cert-manager/cert-manager-webhook-597b96b99b-h78dc"
Apr 19 12:37:44.142664 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:44.142671 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cl9xm\" (UniqueName: \"kubernetes.io/projected/620845ac-07f2-4fe5-b292-9d099bc69df3-kube-api-access-cl9xm\") pod \"cert-manager-webhook-597b96b99b-h78dc\" (UID: \"620845ac-07f2-4fe5-b292-9d099bc69df3\") " pod="cert-manager/cert-manager-webhook-597b96b99b-h78dc"
Apr 19 12:37:44.151149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:44.151118 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/620845ac-07f2-4fe5-b292-9d099bc69df3-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-h78dc\" (UID: \"620845ac-07f2-4fe5-b292-9d099bc69df3\") " pod="cert-manager/cert-manager-webhook-597b96b99b-h78dc"
Apr 19 12:37:44.151301 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:44.151280 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl9xm\" (UniqueName: \"kubernetes.io/projected/620845ac-07f2-4fe5-b292-9d099bc69df3-kube-api-access-cl9xm\") pod \"cert-manager-webhook-597b96b99b-h78dc\" (UID: \"620845ac-07f2-4fe5-b292-9d099bc69df3\") " pod="cert-manager/cert-manager-webhook-597b96b99b-h78dc"
Apr 19 12:37:44.258198 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:44.258102 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-h78dc"
Apr 19 12:37:44.908944 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:44.908919 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-h78dc"]
Apr 19 12:37:44.911069 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:37:44.911036 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod620845ac_07f2_4fe5_b292_9d099bc69df3.slice/crio-8a191e0acc857dfc446e5adba11d643ec393739addc38674c4f5fe8ae1ec3815 WatchSource:0}: Error finding container 8a191e0acc857dfc446e5adba11d643ec393739addc38674c4f5fe8ae1ec3815: Status 404 returned error can't find the container with id 8a191e0acc857dfc446e5adba11d643ec393739addc38674c4f5fe8ae1ec3815
Apr 19 12:37:45.903553 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:45.903510 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-h78dc" event={"ID":"620845ac-07f2-4fe5-b292-9d099bc69df3","Type":"ContainerStarted","Data":"8a191e0acc857dfc446e5adba11d643ec393739addc38674c4f5fe8ae1ec3815"}
Apr 19 12:37:45.905361 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:45.905330 2567 generic.go:358] "Generic (PLEG): container finished" podID="d7c2a15d-f450-4094-beff-d2d2ea145c33" containerID="b479cea6e2f5ede575dd7496fb7719ba70355455da926dc6e205e1caea10ea65" exitCode=0
Apr 19 12:37:45.905510 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:45.905387 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt" event={"ID":"d7c2a15d-f450-4094-beff-d2d2ea145c33","Type":"ContainerDied","Data":"b479cea6e2f5ede575dd7496fb7719ba70355455da926dc6e205e1caea10ea65"}
Apr 19 12:37:46.662783 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:46.662752 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-pms8f"]
Apr 19 12:37:46.666614 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:46.666546 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-pms8f"
Apr 19 12:37:46.668242 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:46.668207 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-8pl2x\""
Apr 19 12:37:46.673490 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:46.673452 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-pms8f"]
Apr 19 12:37:46.767665 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:46.767630 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9bc54865-9f95-4504-90f3-b930abb56106-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-pms8f\" (UID: \"9bc54865-9f95-4504-90f3-b930abb56106\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-pms8f"
Apr 19 12:37:46.767869 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:46.767694 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rql4f\" (UniqueName: \"kubernetes.io/projected/9bc54865-9f95-4504-90f3-b930abb56106-kube-api-access-rql4f\") pod \"cert-manager-cainjector-8966b78d4-pms8f\" (UID: \"9bc54865-9f95-4504-90f3-b930abb56106\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-pms8f"
Apr 19 12:37:46.868497 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:46.868460 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9bc54865-9f95-4504-90f3-b930abb56106-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-pms8f\" (UID: \"9bc54865-9f95-4504-90f3-b930abb56106\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-pms8f"
Apr 19 12:37:46.868669 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:46.868513 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rql4f\" (UniqueName: \"kubernetes.io/projected/9bc54865-9f95-4504-90f3-b930abb56106-kube-api-access-rql4f\") pod \"cert-manager-cainjector-8966b78d4-pms8f\" (UID: \"9bc54865-9f95-4504-90f3-b930abb56106\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-pms8f"
Apr 19 12:37:46.877353 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:46.877320 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9bc54865-9f95-4504-90f3-b930abb56106-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-pms8f\" (UID: \"9bc54865-9f95-4504-90f3-b930abb56106\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-pms8f"
Apr 19 12:37:46.877503 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:46.877460 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rql4f\" (UniqueName: \"kubernetes.io/projected/9bc54865-9f95-4504-90f3-b930abb56106-kube-api-access-rql4f\") pod \"cert-manager-cainjector-8966b78d4-pms8f\" (UID: \"9bc54865-9f95-4504-90f3-b930abb56106\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-pms8f"
Apr 19 12:37:46.912707 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:46.912662 2567 generic.go:358] "Generic (PLEG): container finished" podID="d7c2a15d-f450-4094-beff-d2d2ea145c33" containerID="cf83f8dbee7e9ecf5f599eb83247611127ab1aecd455f888a8763606cd954c2b" exitCode=0
Apr 19 12:37:46.913123 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:46.912740 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt" event={"ID":"d7c2a15d-f450-4094-beff-d2d2ea145c33","Type":"ContainerDied","Data":"cf83f8dbee7e9ecf5f599eb83247611127ab1aecd455f888a8763606cd954c2b"}
Apr 19 12:37:46.978994 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:46.978958 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-pms8f"
Apr 19 12:37:47.530624 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:47.530577 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-pms8f"]
Apr 19 12:37:47.533344 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:37:47.533312 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bc54865_9f95_4504_90f3_b930abb56106.slice/crio-626a320f78cf7676c157394a9ec9a75f5a9b6877d39eaa7772e11d8f7e01a47c WatchSource:0}: Error finding container 626a320f78cf7676c157394a9ec9a75f5a9b6877d39eaa7772e11d8f7e01a47c: Status 404 returned error can't find the container with id 626a320f78cf7676c157394a9ec9a75f5a9b6877d39eaa7772e11d8f7e01a47c
Apr 19 12:37:47.918380 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:47.918344 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-h78dc" event={"ID":"620845ac-07f2-4fe5-b292-9d099bc69df3","Type":"ContainerStarted","Data":"9bf7d628aa725a29bb6106dd3807f44a2da88a091d9a41c77fb392ba9b9f59af"}
Apr 19 12:37:47.918826 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:47.918435 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-h78dc"
Apr 19 12:37:47.919698 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:47.919677 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-pms8f"
event={"ID":"9bc54865-9f95-4504-90f3-b930abb56106","Type":"ContainerStarted","Data":"570244761eff6ad3846d2edfdc881bb3235ea26cba1afdb7e425367e484c0c96"} Apr 19 12:37:47.919803 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:47.919703 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-pms8f" event={"ID":"9bc54865-9f95-4504-90f3-b930abb56106","Type":"ContainerStarted","Data":"626a320f78cf7676c157394a9ec9a75f5a9b6877d39eaa7772e11d8f7e01a47c"} Apr 19 12:37:47.933562 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:47.933497 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-h78dc" podStartSLOduration=2.3711632480000002 podStartE2EDuration="4.933478675s" podCreationTimestamp="2026-04-19 12:37:43 +0000 UTC" firstStartedPulling="2026-04-19 12:37:44.913042653 +0000 UTC m=+433.066976110" lastFinishedPulling="2026-04-19 12:37:47.475358076 +0000 UTC m=+435.629291537" observedRunningTime="2026-04-19 12:37:47.932333361 +0000 UTC m=+436.086266842" watchObservedRunningTime="2026-04-19 12:37:47.933478675 +0000 UTC m=+436.087412155" Apr 19 12:37:47.951484 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:47.951052 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-pms8f" podStartSLOduration=1.951031773 podStartE2EDuration="1.951031773s" podCreationTimestamp="2026-04-19 12:37:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:37:47.949701376 +0000 UTC m=+436.103634933" watchObservedRunningTime="2026-04-19 12:37:47.951031773 +0000 UTC m=+436.104965254" Apr 19 12:37:48.040408 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:48.040386 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt" Apr 19 12:37:48.183524 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:48.183433 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d7c2a15d-f450-4094-beff-d2d2ea145c33-util\") pod \"d7c2a15d-f450-4094-beff-d2d2ea145c33\" (UID: \"d7c2a15d-f450-4094-beff-d2d2ea145c33\") " Apr 19 12:37:48.183524 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:48.183475 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d7c2a15d-f450-4094-beff-d2d2ea145c33-bundle\") pod \"d7c2a15d-f450-4094-beff-d2d2ea145c33\" (UID: \"d7c2a15d-f450-4094-beff-d2d2ea145c33\") " Apr 19 12:37:48.183737 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:48.183534 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gp6s\" (UniqueName: \"kubernetes.io/projected/d7c2a15d-f450-4094-beff-d2d2ea145c33-kube-api-access-5gp6s\") pod \"d7c2a15d-f450-4094-beff-d2d2ea145c33\" (UID: \"d7c2a15d-f450-4094-beff-d2d2ea145c33\") " Apr 19 12:37:48.183910 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:48.183882 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7c2a15d-f450-4094-beff-d2d2ea145c33-bundle" (OuterVolumeSpecName: "bundle") pod "d7c2a15d-f450-4094-beff-d2d2ea145c33" (UID: "d7c2a15d-f450-4094-beff-d2d2ea145c33"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:37:48.185592 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:48.185568 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7c2a15d-f450-4094-beff-d2d2ea145c33-kube-api-access-5gp6s" (OuterVolumeSpecName: "kube-api-access-5gp6s") pod "d7c2a15d-f450-4094-beff-d2d2ea145c33" (UID: "d7c2a15d-f450-4094-beff-d2d2ea145c33"). InnerVolumeSpecName "kube-api-access-5gp6s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:37:48.284980 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:48.284946 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d7c2a15d-f450-4094-beff-d2d2ea145c33-bundle\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:37:48.284980 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:48.284979 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5gp6s\" (UniqueName: \"kubernetes.io/projected/d7c2a15d-f450-4094-beff-d2d2ea145c33-kube-api-access-5gp6s\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:37:48.295584 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:48.295524 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7c2a15d-f450-4094-beff-d2d2ea145c33-util" (OuterVolumeSpecName: "util") pod "d7c2a15d-f450-4094-beff-d2d2ea145c33" (UID: "d7c2a15d-f450-4094-beff-d2d2ea145c33"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:37:48.386433 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:48.386403 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d7c2a15d-f450-4094-beff-d2d2ea145c33-util\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:37:48.925790 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:48.925751 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt" event={"ID":"d7c2a15d-f450-4094-beff-d2d2ea145c33","Type":"ContainerDied","Data":"67fa8f3b0a486e7b7214d637fa1813429d99620525354f9595a3dcb928700653"} Apr 19 12:37:48.925790 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:48.925788 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67fa8f3b0a486e7b7214d637fa1813429d99620525354f9595a3dcb928700653" Apr 19 12:37:48.925790 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:48.925768 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7txmt" Apr 19 12:37:53.928760 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:37:53.928729 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-h78dc" Apr 19 12:38:03.083678 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.083642 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv"] Apr 19 12:38:03.084075 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.083990 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7c2a15d-f450-4094-beff-d2d2ea145c33" containerName="extract" Apr 19 12:38:03.084075 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.084001 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c2a15d-f450-4094-beff-d2d2ea145c33" containerName="extract" Apr 19 12:38:03.084075 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.084025 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7c2a15d-f450-4094-beff-d2d2ea145c33" containerName="util" Apr 19 12:38:03.084075 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.084031 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c2a15d-f450-4094-beff-d2d2ea145c33" containerName="util" Apr 19 12:38:03.084075 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.084040 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7c2a15d-f450-4094-beff-d2d2ea145c33" containerName="pull" Apr 19 12:38:03.084075 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.084045 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c2a15d-f450-4094-beff-d2d2ea145c33" containerName="pull" Apr 19 12:38:03.084280 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.084108 2567 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="d7c2a15d-f450-4094-beff-d2d2ea145c33" containerName="extract" Apr 19 12:38:03.138887 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.138850 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv"] Apr 19 12:38:03.139047 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.138978 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv" Apr 19 12:38:03.141047 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.141021 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 19 12:38:03.141518 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.141500 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 19 12:38:03.141574 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.141501 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-98xhq\"" Apr 19 12:38:03.199065 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.199026 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt5kg\" (UniqueName: \"kubernetes.io/projected/58595de2-9d87-4611-8dd8-7e859185d579-kube-api-access-gt5kg\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv\" (UID: \"58595de2-9d87-4611-8dd8-7e859185d579\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv" Apr 19 12:38:03.199277 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.199139 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/58595de2-9d87-4611-8dd8-7e859185d579-util\") pod 
\"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv\" (UID: \"58595de2-9d87-4611-8dd8-7e859185d579\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv" Apr 19 12:38:03.199277 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.199200 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/58595de2-9d87-4611-8dd8-7e859185d579-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv\" (UID: \"58595de2-9d87-4611-8dd8-7e859185d579\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv" Apr 19 12:38:03.300267 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.300230 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/58595de2-9d87-4611-8dd8-7e859185d579-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv\" (UID: \"58595de2-9d87-4611-8dd8-7e859185d579\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv" Apr 19 12:38:03.300267 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.300272 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/58595de2-9d87-4611-8dd8-7e859185d579-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv\" (UID: \"58595de2-9d87-4611-8dd8-7e859185d579\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv" Apr 19 12:38:03.300472 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.300312 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gt5kg\" (UniqueName: \"kubernetes.io/projected/58595de2-9d87-4611-8dd8-7e859185d579-kube-api-access-gt5kg\") pod 
\"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv\" (UID: \"58595de2-9d87-4611-8dd8-7e859185d579\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv" Apr 19 12:38:03.300684 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.300661 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/58595de2-9d87-4611-8dd8-7e859185d579-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv\" (UID: \"58595de2-9d87-4611-8dd8-7e859185d579\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv" Apr 19 12:38:03.300725 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.300671 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/58595de2-9d87-4611-8dd8-7e859185d579-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv\" (UID: \"58595de2-9d87-4611-8dd8-7e859185d579\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv" Apr 19 12:38:03.307701 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.307673 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt5kg\" (UniqueName: \"kubernetes.io/projected/58595de2-9d87-4611-8dd8-7e859185d579-kube-api-access-gt5kg\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv\" (UID: \"58595de2-9d87-4611-8dd8-7e859185d579\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv" Apr 19 12:38:03.448472 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.448432 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv" Apr 19 12:38:03.575875 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.575849 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv"] Apr 19 12:38:03.578049 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:38:03.578019 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58595de2_9d87_4611_8dd8_7e859185d579.slice/crio-2a8bd676f44bb27a2cb245099b193cdcda74f0f703a2d5846b5ce63b62afda71 WatchSource:0}: Error finding container 2a8bd676f44bb27a2cb245099b193cdcda74f0f703a2d5846b5ce63b62afda71: Status 404 returned error can't find the container with id 2a8bd676f44bb27a2cb245099b193cdcda74f0f703a2d5846b5ce63b62afda71 Apr 19 12:38:03.978408 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.978313 2567 generic.go:358] "Generic (PLEG): container finished" podID="58595de2-9d87-4611-8dd8-7e859185d579" containerID="7588a53dd9ac4da582ce6b672484c728a8fd8536db974d244107be1db7cd6f66" exitCode=0 Apr 19 12:38:03.978408 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.978395 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv" event={"ID":"58595de2-9d87-4611-8dd8-7e859185d579","Type":"ContainerDied","Data":"7588a53dd9ac4da582ce6b672484c728a8fd8536db974d244107be1db7cd6f66"} Apr 19 12:38:03.978592 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:03.978423 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv" event={"ID":"58595de2-9d87-4611-8dd8-7e859185d579","Type":"ContainerStarted","Data":"2a8bd676f44bb27a2cb245099b193cdcda74f0f703a2d5846b5ce63b62afda71"} Apr 19 12:38:04.983285 ip-10-0-142-55 kubenswrapper[2567]: I0419 
12:38:04.983248 2567 generic.go:358] "Generic (PLEG): container finished" podID="58595de2-9d87-4611-8dd8-7e859185d579" containerID="46285553a750c05b926d98db25afaa0ac4007a87ede36d55521e2499c8cf1fdd" exitCode=0 Apr 19 12:38:04.983675 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:04.983313 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv" event={"ID":"58595de2-9d87-4611-8dd8-7e859185d579","Type":"ContainerDied","Data":"46285553a750c05b926d98db25afaa0ac4007a87ede36d55521e2499c8cf1fdd"} Apr 19 12:38:05.988728 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:05.988693 2567 generic.go:358] "Generic (PLEG): container finished" podID="58595de2-9d87-4611-8dd8-7e859185d579" containerID="6f74d0d753f7b8c46b58f99e2305e1e0cc843d0a14da91036054ffece3a04894" exitCode=0 Apr 19 12:38:05.989096 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:05.988743 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv" event={"ID":"58595de2-9d87-4611-8dd8-7e859185d579","Type":"ContainerDied","Data":"6f74d0d753f7b8c46b58f99e2305e1e0cc843d0a14da91036054ffece3a04894"} Apr 19 12:38:07.113045 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:07.113021 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv" Apr 19 12:38:07.126642 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:07.126612 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt5kg\" (UniqueName: \"kubernetes.io/projected/58595de2-9d87-4611-8dd8-7e859185d579-kube-api-access-gt5kg\") pod \"58595de2-9d87-4611-8dd8-7e859185d579\" (UID: \"58595de2-9d87-4611-8dd8-7e859185d579\") " Apr 19 12:38:07.126782 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:07.126663 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/58595de2-9d87-4611-8dd8-7e859185d579-bundle\") pod \"58595de2-9d87-4611-8dd8-7e859185d579\" (UID: \"58595de2-9d87-4611-8dd8-7e859185d579\") " Apr 19 12:38:07.126782 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:07.126723 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/58595de2-9d87-4611-8dd8-7e859185d579-util\") pod \"58595de2-9d87-4611-8dd8-7e859185d579\" (UID: \"58595de2-9d87-4611-8dd8-7e859185d579\") " Apr 19 12:38:07.128123 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:07.128032 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58595de2-9d87-4611-8dd8-7e859185d579-bundle" (OuterVolumeSpecName: "bundle") pod "58595de2-9d87-4611-8dd8-7e859185d579" (UID: "58595de2-9d87-4611-8dd8-7e859185d579"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:38:07.129086 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:07.129062 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58595de2-9d87-4611-8dd8-7e859185d579-kube-api-access-gt5kg" (OuterVolumeSpecName: "kube-api-access-gt5kg") pod "58595de2-9d87-4611-8dd8-7e859185d579" (UID: "58595de2-9d87-4611-8dd8-7e859185d579"). InnerVolumeSpecName "kube-api-access-gt5kg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:38:07.133408 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:07.133380 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58595de2-9d87-4611-8dd8-7e859185d579-util" (OuterVolumeSpecName: "util") pod "58595de2-9d87-4611-8dd8-7e859185d579" (UID: "58595de2-9d87-4611-8dd8-7e859185d579"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:38:07.227631 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:07.227582 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/58595de2-9d87-4611-8dd8-7e859185d579-util\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:38:07.227631 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:07.227625 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gt5kg\" (UniqueName: \"kubernetes.io/projected/58595de2-9d87-4611-8dd8-7e859185d579-kube-api-access-gt5kg\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:38:07.227631 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:07.227641 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/58595de2-9d87-4611-8dd8-7e859185d579-bundle\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:38:07.996927 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:07.996901 2567 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv" Apr 19 12:38:07.996927 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:07.996912 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h9bgv" event={"ID":"58595de2-9d87-4611-8dd8-7e859185d579","Type":"ContainerDied","Data":"2a8bd676f44bb27a2cb245099b193cdcda74f0f703a2d5846b5ce63b62afda71"} Apr 19 12:38:07.997134 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:07.996949 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a8bd676f44bb27a2cb245099b193cdcda74f0f703a2d5846b5ce63b62afda71" Apr 19 12:38:09.689113 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.689082 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-844f57dbd6-fbxvz"] Apr 19 12:38:09.689544 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.689457 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58595de2-9d87-4611-8dd8-7e859185d579" containerName="pull" Apr 19 12:38:09.689544 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.689474 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="58595de2-9d87-4611-8dd8-7e859185d579" containerName="pull" Apr 19 12:38:09.689544 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.689493 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58595de2-9d87-4611-8dd8-7e859185d579" containerName="extract" Apr 19 12:38:09.689544 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.689502 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="58595de2-9d87-4611-8dd8-7e859185d579" containerName="extract" Apr 19 12:38:09.689544 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.689514 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="58595de2-9d87-4611-8dd8-7e859185d579" containerName="util" Apr 19 12:38:09.689544 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.689522 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="58595de2-9d87-4611-8dd8-7e859185d579" containerName="util" Apr 19 12:38:09.689742 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.689659 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="58595de2-9d87-4611-8dd8-7e859185d579" containerName="extract" Apr 19 12:38:09.693858 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.693838 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-fbxvz" Apr 19 12:38:09.696246 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.696218 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 19 12:38:09.696246 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.696242 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 19 12:38:09.696397 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.696261 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 19 12:38:09.696397 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.696275 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-dtr48\"" Apr 19 12:38:09.696397 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.696218 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 19 12:38:09.696397 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.696218 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" 
Apr 19 12:38:09.703699 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.703679 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-844f57dbd6-fbxvz"]
Apr 19 12:38:09.745906 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.745877 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/afb23281-487e-4df8-b8e5-d550ae2013f7-manager-config\") pod \"lws-controller-manager-844f57dbd6-fbxvz\" (UID: \"afb23281-487e-4df8-b8e5-d550ae2013f7\") " pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-fbxvz"
Apr 19 12:38:09.746098 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.745929 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbwdm\" (UniqueName: \"kubernetes.io/projected/afb23281-487e-4df8-b8e5-d550ae2013f7-kube-api-access-sbwdm\") pod \"lws-controller-manager-844f57dbd6-fbxvz\" (UID: \"afb23281-487e-4df8-b8e5-d550ae2013f7\") " pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-fbxvz"
Apr 19 12:38:09.746098 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.745949 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afb23281-487e-4df8-b8e5-d550ae2013f7-cert\") pod \"lws-controller-manager-844f57dbd6-fbxvz\" (UID: \"afb23281-487e-4df8-b8e5-d550ae2013f7\") " pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-fbxvz"
Apr 19 12:38:09.746098 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.746002 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/afb23281-487e-4df8-b8e5-d550ae2013f7-metrics-cert\") pod \"lws-controller-manager-844f57dbd6-fbxvz\" (UID: \"afb23281-487e-4df8-b8e5-d550ae2013f7\") " pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-fbxvz"
Apr 19 12:38:09.847045 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.847003 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbwdm\" (UniqueName: \"kubernetes.io/projected/afb23281-487e-4df8-b8e5-d550ae2013f7-kube-api-access-sbwdm\") pod \"lws-controller-manager-844f57dbd6-fbxvz\" (UID: \"afb23281-487e-4df8-b8e5-d550ae2013f7\") " pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-fbxvz"
Apr 19 12:38:09.847045 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.847048 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afb23281-487e-4df8-b8e5-d550ae2013f7-cert\") pod \"lws-controller-manager-844f57dbd6-fbxvz\" (UID: \"afb23281-487e-4df8-b8e5-d550ae2013f7\") " pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-fbxvz"
Apr 19 12:38:09.847340 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.847079 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/afb23281-487e-4df8-b8e5-d550ae2013f7-metrics-cert\") pod \"lws-controller-manager-844f57dbd6-fbxvz\" (UID: \"afb23281-487e-4df8-b8e5-d550ae2013f7\") " pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-fbxvz"
Apr 19 12:38:09.847340 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.847152 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/afb23281-487e-4df8-b8e5-d550ae2013f7-manager-config\") pod \"lws-controller-manager-844f57dbd6-fbxvz\" (UID: \"afb23281-487e-4df8-b8e5-d550ae2013f7\") " pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-fbxvz"
Apr 19 12:38:09.847808 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.847783 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/afb23281-487e-4df8-b8e5-d550ae2013f7-manager-config\") pod \"lws-controller-manager-844f57dbd6-fbxvz\" (UID: \"afb23281-487e-4df8-b8e5-d550ae2013f7\") " pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-fbxvz"
Apr 19 12:38:09.849682 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.849662 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afb23281-487e-4df8-b8e5-d550ae2013f7-cert\") pod \"lws-controller-manager-844f57dbd6-fbxvz\" (UID: \"afb23281-487e-4df8-b8e5-d550ae2013f7\") " pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-fbxvz"
Apr 19 12:38:09.849774 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.849663 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/afb23281-487e-4df8-b8e5-d550ae2013f7-metrics-cert\") pod \"lws-controller-manager-844f57dbd6-fbxvz\" (UID: \"afb23281-487e-4df8-b8e5-d550ae2013f7\") " pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-fbxvz"
Apr 19 12:38:09.855056 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:09.855026 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbwdm\" (UniqueName: \"kubernetes.io/projected/afb23281-487e-4df8-b8e5-d550ae2013f7-kube-api-access-sbwdm\") pod \"lws-controller-manager-844f57dbd6-fbxvz\" (UID: \"afb23281-487e-4df8-b8e5-d550ae2013f7\") " pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-fbxvz"
Apr 19 12:38:10.003339 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:10.003260 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-fbxvz"
Apr 19 12:38:10.140355 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:10.140332 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-844f57dbd6-fbxvz"]
Apr 19 12:38:10.142588 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:38:10.142564 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafb23281_487e_4df8_b8e5_d550ae2013f7.slice/crio-8b6046f4ccbe7274dd042d614f181f6e92caeb0033fe38602ec17befe4d09896 WatchSource:0}: Error finding container 8b6046f4ccbe7274dd042d614f181f6e92caeb0033fe38602ec17befe4d09896: Status 404 returned error can't find the container with id 8b6046f4ccbe7274dd042d614f181f6e92caeb0033fe38602ec17befe4d09896
Apr 19 12:38:11.008642 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:11.008599 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-fbxvz" event={"ID":"afb23281-487e-4df8-b8e5-d550ae2013f7","Type":"ContainerStarted","Data":"8b6046f4ccbe7274dd042d614f181f6e92caeb0033fe38602ec17befe4d09896"}
Apr 19 12:38:12.013224 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:12.013113 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-fbxvz" event={"ID":"afb23281-487e-4df8-b8e5-d550ae2013f7","Type":"ContainerStarted","Data":"71392ea9a7fffb337309f6c3619074c1e1f49edea5c74f28388bba7cbff58ba8"}
Apr 19 12:38:12.013224 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:12.013171 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-fbxvz"
Apr 19 12:38:12.040206 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:12.040137 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-fbxvz" podStartSLOduration=1.425801261 podStartE2EDuration="3.040122223s" podCreationTimestamp="2026-04-19 12:38:09 +0000 UTC" firstStartedPulling="2026-04-19 12:38:10.144458939 +0000 UTC m=+458.298392397" lastFinishedPulling="2026-04-19 12:38:11.758779887 +0000 UTC m=+459.912713359" observedRunningTime="2026-04-19 12:38:12.038583784 +0000 UTC m=+460.192517265" watchObservedRunningTime="2026-04-19 12:38:12.040122223 +0000 UTC m=+460.194055702"
Apr 19 12:38:12.482519 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:12.482486 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw"]
Apr 19 12:38:12.486122 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:12.486103 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw"
Apr 19 12:38:12.487944 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:12.487923 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 19 12:38:12.488014 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:12.487944 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 19 12:38:12.488319 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:12.488305 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-98xhq\""
Apr 19 12:38:12.497260 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:12.497238 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw"]
Apr 19 12:38:12.567370 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:12.567331 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/39dad83b-43e9-4bf6-bd09-fa4d34a2ad89-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw\" (UID: \"39dad83b-43e9-4bf6-bd09-fa4d34a2ad89\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw"
Apr 19 12:38:12.567563 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:12.567399 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd6sf\" (UniqueName: \"kubernetes.io/projected/39dad83b-43e9-4bf6-bd09-fa4d34a2ad89-kube-api-access-bd6sf\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw\" (UID: \"39dad83b-43e9-4bf6-bd09-fa4d34a2ad89\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw"
Apr 19 12:38:12.567563 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:12.567423 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/39dad83b-43e9-4bf6-bd09-fa4d34a2ad89-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw\" (UID: \"39dad83b-43e9-4bf6-bd09-fa4d34a2ad89\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw"
Apr 19 12:38:12.668632 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:12.668589 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/39dad83b-43e9-4bf6-bd09-fa4d34a2ad89-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw\" (UID: \"39dad83b-43e9-4bf6-bd09-fa4d34a2ad89\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw"
Apr 19 12:38:12.668813 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:12.668668 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bd6sf\" (UniqueName: \"kubernetes.io/projected/39dad83b-43e9-4bf6-bd09-fa4d34a2ad89-kube-api-access-bd6sf\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw\" (UID: \"39dad83b-43e9-4bf6-bd09-fa4d34a2ad89\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw"
Apr 19 12:38:12.668813 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:12.668692 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/39dad83b-43e9-4bf6-bd09-fa4d34a2ad89-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw\" (UID: \"39dad83b-43e9-4bf6-bd09-fa4d34a2ad89\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw"
Apr 19 12:38:12.669038 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:12.669023 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/39dad83b-43e9-4bf6-bd09-fa4d34a2ad89-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw\" (UID: \"39dad83b-43e9-4bf6-bd09-fa4d34a2ad89\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw"
Apr 19 12:38:12.669076 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:12.669026 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/39dad83b-43e9-4bf6-bd09-fa4d34a2ad89-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw\" (UID: \"39dad83b-43e9-4bf6-bd09-fa4d34a2ad89\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw"
Apr 19 12:38:12.676349 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:12.676315 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd6sf\" (UniqueName: \"kubernetes.io/projected/39dad83b-43e9-4bf6-bd09-fa4d34a2ad89-kube-api-access-bd6sf\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw\" (UID: \"39dad83b-43e9-4bf6-bd09-fa4d34a2ad89\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw"
Apr 19 12:38:12.795302 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:12.795215 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw"
Apr 19 12:38:12.916928 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:12.916900 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw"]
Apr 19 12:38:12.919034 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:38:12.919004 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39dad83b_43e9_4bf6_bd09_fa4d34a2ad89.slice/crio-77af3baa4c4aca08610fd33504de488a08a738189ebee6250b3394b49f750caa WatchSource:0}: Error finding container 77af3baa4c4aca08610fd33504de488a08a738189ebee6250b3394b49f750caa: Status 404 returned error can't find the container with id 77af3baa4c4aca08610fd33504de488a08a738189ebee6250b3394b49f750caa
Apr 19 12:38:13.017894 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:13.017861 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw" event={"ID":"39dad83b-43e9-4bf6-bd09-fa4d34a2ad89","Type":"ContainerStarted","Data":"3db1ed7b8a0c6afae50769e2b2812bb22df1bebf309c77900088467e76119d39"}
Apr 19 12:38:13.017894 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:13.017898 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw" event={"ID":"39dad83b-43e9-4bf6-bd09-fa4d34a2ad89","Type":"ContainerStarted","Data":"77af3baa4c4aca08610fd33504de488a08a738189ebee6250b3394b49f750caa"}
Apr 19 12:38:14.023048 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:14.023016 2567 generic.go:358] "Generic (PLEG): container finished" podID="39dad83b-43e9-4bf6-bd09-fa4d34a2ad89" containerID="3db1ed7b8a0c6afae50769e2b2812bb22df1bebf309c77900088467e76119d39" exitCode=0
Apr 19 12:38:14.023455 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:14.023053 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw" event={"ID":"39dad83b-43e9-4bf6-bd09-fa4d34a2ad89","Type":"ContainerDied","Data":"3db1ed7b8a0c6afae50769e2b2812bb22df1bebf309c77900088467e76119d39"}
Apr 19 12:38:14.579649 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:14.579617 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9ff869b6b-zfnmt"]
Apr 19 12:38:14.583303 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:14.583285 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-zfnmt"
Apr 19 12:38:14.585641 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:14.585611 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 19 12:38:14.586859 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:14.586839 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 19 12:38:14.586999 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:14.586976 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 19 12:38:14.587722 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:14.587682 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 19 12:38:14.593723 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:14.593705 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-scmp2\""
Apr 19 12:38:14.616955 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:14.616926 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9ff869b6b-zfnmt"]
Apr 19 12:38:14.683375 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:14.683329 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfwqx\" (UniqueName: \"kubernetes.io/projected/71198d00-829c-44c3-a38a-dfde254a8d7d-kube-api-access-zfwqx\") pod \"opendatahub-operator-controller-manager-9ff869b6b-zfnmt\" (UID: \"71198d00-829c-44c3-a38a-dfde254a8d7d\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-zfnmt"
Apr 19 12:38:14.683536 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:14.683391 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/71198d00-829c-44c3-a38a-dfde254a8d7d-webhook-cert\") pod \"opendatahub-operator-controller-manager-9ff869b6b-zfnmt\" (UID: \"71198d00-829c-44c3-a38a-dfde254a8d7d\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-zfnmt"
Apr 19 12:38:14.683616 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:14.683555 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/71198d00-829c-44c3-a38a-dfde254a8d7d-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9ff869b6b-zfnmt\" (UID: \"71198d00-829c-44c3-a38a-dfde254a8d7d\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-zfnmt"
Apr 19 12:38:14.784109 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:14.784073 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zfwqx\" (UniqueName: \"kubernetes.io/projected/71198d00-829c-44c3-a38a-dfde254a8d7d-kube-api-access-zfwqx\") pod \"opendatahub-operator-controller-manager-9ff869b6b-zfnmt\" (UID: \"71198d00-829c-44c3-a38a-dfde254a8d7d\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-zfnmt"
Apr 19 12:38:14.784314 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:14.784124 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/71198d00-829c-44c3-a38a-dfde254a8d7d-webhook-cert\") pod \"opendatahub-operator-controller-manager-9ff869b6b-zfnmt\" (UID: \"71198d00-829c-44c3-a38a-dfde254a8d7d\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-zfnmt"
Apr 19 12:38:14.784314 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:14.784198 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/71198d00-829c-44c3-a38a-dfde254a8d7d-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9ff869b6b-zfnmt\" (UID: \"71198d00-829c-44c3-a38a-dfde254a8d7d\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-zfnmt"
Apr 19 12:38:14.786743 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:14.786718 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/71198d00-829c-44c3-a38a-dfde254a8d7d-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9ff869b6b-zfnmt\" (UID: \"71198d00-829c-44c3-a38a-dfde254a8d7d\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-zfnmt"
Apr 19 12:38:14.786868 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:14.786739 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/71198d00-829c-44c3-a38a-dfde254a8d7d-webhook-cert\") pod \"opendatahub-operator-controller-manager-9ff869b6b-zfnmt\" (UID: \"71198d00-829c-44c3-a38a-dfde254a8d7d\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-zfnmt"
Apr 19 12:38:14.792595 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:14.792568 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfwqx\" (UniqueName: \"kubernetes.io/projected/71198d00-829c-44c3-a38a-dfde254a8d7d-kube-api-access-zfwqx\") pod \"opendatahub-operator-controller-manager-9ff869b6b-zfnmt\" (UID: \"71198d00-829c-44c3-a38a-dfde254a8d7d\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-zfnmt"
Apr 19 12:38:14.894536 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:14.894455 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-zfnmt"
Apr 19 12:38:15.022590 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:15.022562 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9ff869b6b-zfnmt"]
Apr 19 12:38:15.025411 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:38:15.025382 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71198d00_829c_44c3_a38a_dfde254a8d7d.slice/crio-1e32f0415c2ebd74b00127ef53fa730e161d002249c8e99501c1af2e98ccabe0 WatchSource:0}: Error finding container 1e32f0415c2ebd74b00127ef53fa730e161d002249c8e99501c1af2e98ccabe0: Status 404 returned error can't find the container with id 1e32f0415c2ebd74b00127ef53fa730e161d002249c8e99501c1af2e98ccabe0
Apr 19 12:38:15.035090 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:15.035062 2567 generic.go:358] "Generic (PLEG): container finished" podID="39dad83b-43e9-4bf6-bd09-fa4d34a2ad89" containerID="c9d1fbafd6db1f3acbdaa22e2d423db25e76ca4e0fdf26e306c7ce2e3be603e9" exitCode=0
Apr 19 12:38:15.035250 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:15.035147 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw" event={"ID":"39dad83b-43e9-4bf6-bd09-fa4d34a2ad89","Type":"ContainerDied","Data":"c9d1fbafd6db1f3acbdaa22e2d423db25e76ca4e0fdf26e306c7ce2e3be603e9"}
Apr 19 12:38:16.042138 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:16.042102 2567 generic.go:358] "Generic (PLEG): container finished" podID="39dad83b-43e9-4bf6-bd09-fa4d34a2ad89" containerID="dab0abea7ae8f1d0c69964e574252f585b97e97f954652806bc9e03d241783f7" exitCode=0
Apr 19 12:38:16.042624 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:16.042203 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw" event={"ID":"39dad83b-43e9-4bf6-bd09-fa4d34a2ad89","Type":"ContainerDied","Data":"dab0abea7ae8f1d0c69964e574252f585b97e97f954652806bc9e03d241783f7"}
Apr 19 12:38:16.043853 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:16.043828 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-zfnmt" event={"ID":"71198d00-829c-44c3-a38a-dfde254a8d7d","Type":"ContainerStarted","Data":"1e32f0415c2ebd74b00127ef53fa730e161d002249c8e99501c1af2e98ccabe0"}
Apr 19 12:38:17.493884 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:17.493861 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw"
Apr 19 12:38:17.505885 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:17.505856 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/39dad83b-43e9-4bf6-bd09-fa4d34a2ad89-bundle\") pod \"39dad83b-43e9-4bf6-bd09-fa4d34a2ad89\" (UID: \"39dad83b-43e9-4bf6-bd09-fa4d34a2ad89\") "
Apr 19 12:38:17.506017 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:17.505895 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd6sf\" (UniqueName: \"kubernetes.io/projected/39dad83b-43e9-4bf6-bd09-fa4d34a2ad89-kube-api-access-bd6sf\") pod \"39dad83b-43e9-4bf6-bd09-fa4d34a2ad89\" (UID: \"39dad83b-43e9-4bf6-bd09-fa4d34a2ad89\") "
Apr 19 12:38:17.506017 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:17.505944 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/39dad83b-43e9-4bf6-bd09-fa4d34a2ad89-util\") pod \"39dad83b-43e9-4bf6-bd09-fa4d34a2ad89\" (UID: \"39dad83b-43e9-4bf6-bd09-fa4d34a2ad89\") "
Apr 19 12:38:17.506538 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:17.506512 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39dad83b-43e9-4bf6-bd09-fa4d34a2ad89-bundle" (OuterVolumeSpecName: "bundle") pod "39dad83b-43e9-4bf6-bd09-fa4d34a2ad89" (UID: "39dad83b-43e9-4bf6-bd09-fa4d34a2ad89"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 19 12:38:17.508015 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:17.507989 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39dad83b-43e9-4bf6-bd09-fa4d34a2ad89-kube-api-access-bd6sf" (OuterVolumeSpecName: "kube-api-access-bd6sf") pod "39dad83b-43e9-4bf6-bd09-fa4d34a2ad89" (UID: "39dad83b-43e9-4bf6-bd09-fa4d34a2ad89"). InnerVolumeSpecName "kube-api-access-bd6sf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:38:17.511038 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:17.510967 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39dad83b-43e9-4bf6-bd09-fa4d34a2ad89-util" (OuterVolumeSpecName: "util") pod "39dad83b-43e9-4bf6-bd09-fa4d34a2ad89" (UID: "39dad83b-43e9-4bf6-bd09-fa4d34a2ad89"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 19 12:38:17.606780 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:17.606740 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/39dad83b-43e9-4bf6-bd09-fa4d34a2ad89-bundle\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:38:17.606780 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:17.606767 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bd6sf\" (UniqueName: \"kubernetes.io/projected/39dad83b-43e9-4bf6-bd09-fa4d34a2ad89-kube-api-access-bd6sf\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:38:17.606780 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:17.606776 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/39dad83b-43e9-4bf6-bd09-fa4d34a2ad89-util\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:38:18.054821 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:18.054783 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw" event={"ID":"39dad83b-43e9-4bf6-bd09-fa4d34a2ad89","Type":"ContainerDied","Data":"77af3baa4c4aca08610fd33504de488a08a738189ebee6250b3394b49f750caa"}
Apr 19 12:38:18.055004 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:18.054833 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77af3baa4c4aca08610fd33504de488a08a738189ebee6250b3394b49f750caa"
Apr 19 12:38:18.055004 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:18.054795 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q2rqw"
Apr 19 12:38:18.056396 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:18.056368 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-zfnmt" event={"ID":"71198d00-829c-44c3-a38a-dfde254a8d7d","Type":"ContainerStarted","Data":"56fcea93e77e939e414ab63875a90fc0a47afe0cc6ef22f22181c8e398348953"}
Apr 19 12:38:18.056597 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:18.056566 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-zfnmt"
Apr 19 12:38:18.073300 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:18.073117 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-zfnmt" podStartSLOduration=1.576201752 podStartE2EDuration="4.073102057s" podCreationTimestamp="2026-04-19 12:38:14 +0000 UTC" firstStartedPulling="2026-04-19 12:38:15.027802838 +0000 UTC m=+463.181736311" lastFinishedPulling="2026-04-19 12:38:17.524703144 +0000 UTC m=+465.678636616" observedRunningTime="2026-04-19 12:38:18.072005251 +0000 UTC m=+466.225938730" watchObservedRunningTime="2026-04-19 12:38:18.073102057 +0000 UTC m=+466.227035537"
Apr 19 12:38:23.019908 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:23.019878 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-fbxvz"
Apr 19 12:38:29.062901 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:29.062866 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-zfnmt"
Apr 19 12:38:42.964821 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:42.964785 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t"]
Apr 19 12:38:42.965338 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:42.965322 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39dad83b-43e9-4bf6-bd09-fa4d34a2ad89" containerName="pull"
Apr 19 12:38:42.965391 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:42.965341 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="39dad83b-43e9-4bf6-bd09-fa4d34a2ad89" containerName="pull"
Apr 19 12:38:42.965391 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:42.965362 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39dad83b-43e9-4bf6-bd09-fa4d34a2ad89" containerName="util"
Apr 19 12:38:42.965391 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:42.965370 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="39dad83b-43e9-4bf6-bd09-fa4d34a2ad89" containerName="util"
Apr 19 12:38:42.965391 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:42.965383 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39dad83b-43e9-4bf6-bd09-fa4d34a2ad89" containerName="extract"
Apr 19 12:38:42.965517 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:42.965393 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="39dad83b-43e9-4bf6-bd09-fa4d34a2ad89" containerName="extract"
Apr 19 12:38:42.965517 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:42.965487 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="39dad83b-43e9-4bf6-bd09-fa4d34a2ad89" containerName="extract"
Apr 19 12:38:42.971234 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:42.971211 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t"
Apr 19 12:38:42.973382 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:42.973184 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-98xhq\""
Apr 19 12:38:42.973382 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:42.973231 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 19 12:38:42.973702 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:42.973684 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 19 12:38:42.974743 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:42.974722 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t"]
Apr 19 12:38:43.027920 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.027884 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b04c5281-27a0-4793-927d-5188d81666e3-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t\" (UID: \"b04c5281-27a0-4793-927d-5188d81666e3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t"
Apr 19 12:38:43.028098 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.027944 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lr2q\" (UniqueName: \"kubernetes.io/projected/b04c5281-27a0-4793-927d-5188d81666e3-kube-api-access-2lr2q\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t\" (UID: \"b04c5281-27a0-4793-927d-5188d81666e3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t"
Apr 19 12:38:43.028098 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.027983 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b04c5281-27a0-4793-927d-5188d81666e3-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t\" (UID: \"b04c5281-27a0-4793-927d-5188d81666e3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t"
Apr 19 12:38:43.043455 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.043428 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-874cdfcc7-8tz49"]
Apr 19 12:38:43.048461 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.048437 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-874cdfcc7-8tz49"
Apr 19 12:38:43.050705 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.050679 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-sxwn7\""
Apr 19 12:38:43.050843 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.050804 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 19 12:38:43.050843 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.050815 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 19 12:38:43.050954 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.050917 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 19 12:38:43.051045 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.051026 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 19 12:38:43.057105 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.057082 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-874cdfcc7-8tz49"]
Apr 19 12:38:43.128796 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.128764 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bsgk\" (UniqueName: \"kubernetes.io/projected/c30d05da-556b-4dc1-9029-9cdc54c02ca8-kube-api-access-2bsgk\") pod \"kube-auth-proxy-874cdfcc7-8tz49\" (UID: \"c30d05da-556b-4dc1-9029-9cdc54c02ca8\") " pod="openshift-ingress/kube-auth-proxy-874cdfcc7-8tz49"
Apr 19 12:38:43.128979 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.128816 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c30d05da-556b-4dc1-9029-9cdc54c02ca8-tls-certs\") pod \"kube-auth-proxy-874cdfcc7-8tz49\" (UID: \"c30d05da-556b-4dc1-9029-9cdc54c02ca8\") " pod="openshift-ingress/kube-auth-proxy-874cdfcc7-8tz49"
Apr 19 12:38:43.128979 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.128878 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c30d05da-556b-4dc1-9029-9cdc54c02ca8-tmp\") pod \"kube-auth-proxy-874cdfcc7-8tz49\" (UID: \"c30d05da-556b-4dc1-9029-9cdc54c02ca8\") " pod="openshift-ingress/kube-auth-proxy-874cdfcc7-8tz49"
Apr 19 12:38:43.128979 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.128909 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b04c5281-27a0-4793-927d-5188d81666e3-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t\" (UID: \"b04c5281-27a0-4793-927d-5188d81666e3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t"
Apr 19 12:38:43.128979 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.128934 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lr2q\" (UniqueName: \"kubernetes.io/projected/b04c5281-27a0-4793-927d-5188d81666e3-kube-api-access-2lr2q\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t\" (UID: \"b04c5281-27a0-4793-927d-5188d81666e3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t"
Apr 19 12:38:43.129228 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.128986 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b04c5281-27a0-4793-927d-5188d81666e3-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t\" (UID: \"b04c5281-27a0-4793-927d-5188d81666e3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t"
Apr 19 12:38:43.129407 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.129386 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b04c5281-27a0-4793-927d-5188d81666e3-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t\" (UID: \"b04c5281-27a0-4793-927d-5188d81666e3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t"
Apr 19 12:38:43.129407 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.129399 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b04c5281-27a0-4793-927d-5188d81666e3-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t\" (UID: \"b04c5281-27a0-4793-927d-5188d81666e3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t"
Apr 19
12:38:43.136341 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.136324 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lr2q\" (UniqueName: \"kubernetes.io/projected/b04c5281-27a0-4793-927d-5188d81666e3-kube-api-access-2lr2q\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t\" (UID: \"b04c5281-27a0-4793-927d-5188d81666e3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t" Apr 19 12:38:43.229699 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.229612 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c30d05da-556b-4dc1-9029-9cdc54c02ca8-tls-certs\") pod \"kube-auth-proxy-874cdfcc7-8tz49\" (UID: \"c30d05da-556b-4dc1-9029-9cdc54c02ca8\") " pod="openshift-ingress/kube-auth-proxy-874cdfcc7-8tz49" Apr 19 12:38:43.229699 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.229663 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c30d05da-556b-4dc1-9029-9cdc54c02ca8-tmp\") pod \"kube-auth-proxy-874cdfcc7-8tz49\" (UID: \"c30d05da-556b-4dc1-9029-9cdc54c02ca8\") " pod="openshift-ingress/kube-auth-proxy-874cdfcc7-8tz49" Apr 19 12:38:43.229885 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.229726 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bsgk\" (UniqueName: \"kubernetes.io/projected/c30d05da-556b-4dc1-9029-9cdc54c02ca8-kube-api-access-2bsgk\") pod \"kube-auth-proxy-874cdfcc7-8tz49\" (UID: \"c30d05da-556b-4dc1-9029-9cdc54c02ca8\") " pod="openshift-ingress/kube-auth-proxy-874cdfcc7-8tz49" Apr 19 12:38:43.231964 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.231937 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c30d05da-556b-4dc1-9029-9cdc54c02ca8-tmp\") pod 
\"kube-auth-proxy-874cdfcc7-8tz49\" (UID: \"c30d05da-556b-4dc1-9029-9cdc54c02ca8\") " pod="openshift-ingress/kube-auth-proxy-874cdfcc7-8tz49" Apr 19 12:38:43.232135 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.232119 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c30d05da-556b-4dc1-9029-9cdc54c02ca8-tls-certs\") pod \"kube-auth-proxy-874cdfcc7-8tz49\" (UID: \"c30d05da-556b-4dc1-9029-9cdc54c02ca8\") " pod="openshift-ingress/kube-auth-proxy-874cdfcc7-8tz49" Apr 19 12:38:43.236519 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.236501 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bsgk\" (UniqueName: \"kubernetes.io/projected/c30d05da-556b-4dc1-9029-9cdc54c02ca8-kube-api-access-2bsgk\") pod \"kube-auth-proxy-874cdfcc7-8tz49\" (UID: \"c30d05da-556b-4dc1-9029-9cdc54c02ca8\") " pod="openshift-ingress/kube-auth-proxy-874cdfcc7-8tz49" Apr 19 12:38:43.282360 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.282330 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t" Apr 19 12:38:43.361563 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.361527 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-874cdfcc7-8tz49" Apr 19 12:38:43.415399 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.415354 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t"] Apr 19 12:38:43.418273 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:38:43.418235 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb04c5281_27a0_4793_927d_5188d81666e3.slice/crio-f0665b1a1c7b651000dfc173cd4e9c11b7f101b4c5a6393d7f384d7155c0832e WatchSource:0}: Error finding container f0665b1a1c7b651000dfc173cd4e9c11b7f101b4c5a6393d7f384d7155c0832e: Status 404 returned error can't find the container with id f0665b1a1c7b651000dfc173cd4e9c11b7f101b4c5a6393d7f384d7155c0832e Apr 19 12:38:43.497282 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:43.497258 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-874cdfcc7-8tz49"] Apr 19 12:38:43.500321 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:38:43.500292 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc30d05da_556b_4dc1_9029_9cdc54c02ca8.slice/crio-92c28078c50cf49ef3e8f761e8806da271a8177ef117824a231ffce28ce6ce90 WatchSource:0}: Error finding container 92c28078c50cf49ef3e8f761e8806da271a8177ef117824a231ffce28ce6ce90: Status 404 returned error can't find the container with id 92c28078c50cf49ef3e8f761e8806da271a8177ef117824a231ffce28ce6ce90 Apr 19 12:38:44.151523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:44.151472 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-874cdfcc7-8tz49" event={"ID":"c30d05da-556b-4dc1-9029-9cdc54c02ca8","Type":"ContainerStarted","Data":"92c28078c50cf49ef3e8f761e8806da271a8177ef117824a231ffce28ce6ce90"} Apr 19 12:38:44.153266 ip-10-0-142-55 
kubenswrapper[2567]: I0419 12:38:44.153234 2567 generic.go:358] "Generic (PLEG): container finished" podID="b04c5281-27a0-4793-927d-5188d81666e3" containerID="9a6ead3ac34dff0eee4dd4995742927bde1e837d538a842cd2fa4c9882e7fe25" exitCode=0 Apr 19 12:38:44.153465 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:44.153281 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t" event={"ID":"b04c5281-27a0-4793-927d-5188d81666e3","Type":"ContainerDied","Data":"9a6ead3ac34dff0eee4dd4995742927bde1e837d538a842cd2fa4c9882e7fe25"} Apr 19 12:38:44.153465 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:44.153308 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t" event={"ID":"b04c5281-27a0-4793-927d-5188d81666e3","Type":"ContainerStarted","Data":"f0665b1a1c7b651000dfc173cd4e9c11b7f101b4c5a6393d7f384d7155c0832e"} Apr 19 12:38:46.163790 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:46.163750 2567 generic.go:358] "Generic (PLEG): container finished" podID="b04c5281-27a0-4793-927d-5188d81666e3" containerID="889b34c149c40008d3422b1f868e1f48fa1684d3692b6236c32ea31fe6155d77" exitCode=0 Apr 19 12:38:46.164285 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:46.163797 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t" event={"ID":"b04c5281-27a0-4793-927d-5188d81666e3","Type":"ContainerDied","Data":"889b34c149c40008d3422b1f868e1f48fa1684d3692b6236c32ea31fe6155d77"} Apr 19 12:38:47.168512 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:47.168476 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-874cdfcc7-8tz49" event={"ID":"c30d05da-556b-4dc1-9029-9cdc54c02ca8","Type":"ContainerStarted","Data":"877273d52da6470e453ed79ede329e9080e80dd304deea25712c17084cd14028"} Apr 19 
12:38:47.170238 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:47.170213 2567 generic.go:358] "Generic (PLEG): container finished" podID="b04c5281-27a0-4793-927d-5188d81666e3" containerID="5b4e4791ce2b5da1548ea0c7b62e5b6ad45ca48b5f208f23f16986b4c6d0b483" exitCode=0 Apr 19 12:38:47.170378 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:47.170297 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t" event={"ID":"b04c5281-27a0-4793-927d-5188d81666e3","Type":"ContainerDied","Data":"5b4e4791ce2b5da1548ea0c7b62e5b6ad45ca48b5f208f23f16986b4c6d0b483"} Apr 19 12:38:47.184452 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:47.184412 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-874cdfcc7-8tz49" podStartSLOduration=1.099955457 podStartE2EDuration="4.184400488s" podCreationTimestamp="2026-04-19 12:38:43 +0000 UTC" firstStartedPulling="2026-04-19 12:38:43.501985561 +0000 UTC m=+491.655919020" lastFinishedPulling="2026-04-19 12:38:46.586430589 +0000 UTC m=+494.740364051" observedRunningTime="2026-04-19 12:38:47.18183531 +0000 UTC m=+495.335768791" watchObservedRunningTime="2026-04-19 12:38:47.184400488 +0000 UTC m=+495.338333967" Apr 19 12:38:48.296442 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:48.296419 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t" Apr 19 12:38:48.380275 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:48.380240 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b04c5281-27a0-4793-927d-5188d81666e3-util\") pod \"b04c5281-27a0-4793-927d-5188d81666e3\" (UID: \"b04c5281-27a0-4793-927d-5188d81666e3\") " Apr 19 12:38:48.380275 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:48.380281 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b04c5281-27a0-4793-927d-5188d81666e3-bundle\") pod \"b04c5281-27a0-4793-927d-5188d81666e3\" (UID: \"b04c5281-27a0-4793-927d-5188d81666e3\") " Apr 19 12:38:48.380472 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:48.380316 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lr2q\" (UniqueName: \"kubernetes.io/projected/b04c5281-27a0-4793-927d-5188d81666e3-kube-api-access-2lr2q\") pod \"b04c5281-27a0-4793-927d-5188d81666e3\" (UID: \"b04c5281-27a0-4793-927d-5188d81666e3\") " Apr 19 12:38:48.381182 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:48.381135 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b04c5281-27a0-4793-927d-5188d81666e3-bundle" (OuterVolumeSpecName: "bundle") pod "b04c5281-27a0-4793-927d-5188d81666e3" (UID: "b04c5281-27a0-4793-927d-5188d81666e3"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:38:48.382445 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:48.382422 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b04c5281-27a0-4793-927d-5188d81666e3-kube-api-access-2lr2q" (OuterVolumeSpecName: "kube-api-access-2lr2q") pod "b04c5281-27a0-4793-927d-5188d81666e3" (UID: "b04c5281-27a0-4793-927d-5188d81666e3"). InnerVolumeSpecName "kube-api-access-2lr2q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:38:48.385494 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:48.385461 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b04c5281-27a0-4793-927d-5188d81666e3-util" (OuterVolumeSpecName: "util") pod "b04c5281-27a0-4793-927d-5188d81666e3" (UID: "b04c5281-27a0-4793-927d-5188d81666e3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:38:48.481537 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:48.481511 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b04c5281-27a0-4793-927d-5188d81666e3-util\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:38:48.481537 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:48.481535 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b04c5281-27a0-4793-927d-5188d81666e3-bundle\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:38:48.481674 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:48.481545 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2lr2q\" (UniqueName: \"kubernetes.io/projected/b04c5281-27a0-4793-927d-5188d81666e3-kube-api-access-2lr2q\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:38:49.179366 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:49.179326 2567 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t" event={"ID":"b04c5281-27a0-4793-927d-5188d81666e3","Type":"ContainerDied","Data":"f0665b1a1c7b651000dfc173cd4e9c11b7f101b4c5a6393d7f384d7155c0832e"} Apr 19 12:38:49.179366 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:49.179352 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835cs88t" Apr 19 12:38:49.179366 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:49.179367 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0665b1a1c7b651000dfc173cd4e9c11b7f101b4c5a6393d7f384d7155c0832e" Apr 19 12:38:56.988817 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:56.988786 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q"] Apr 19 12:38:56.989240 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:56.989137 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b04c5281-27a0-4793-927d-5188d81666e3" containerName="pull" Apr 19 12:38:56.989240 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:56.989147 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="b04c5281-27a0-4793-927d-5188d81666e3" containerName="pull" Apr 19 12:38:56.989240 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:56.989176 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b04c5281-27a0-4793-927d-5188d81666e3" containerName="util" Apr 19 12:38:56.989240 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:56.989182 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="b04c5281-27a0-4793-927d-5188d81666e3" containerName="util" Apr 19 12:38:56.989240 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:56.989194 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="b04c5281-27a0-4793-927d-5188d81666e3" containerName="extract" Apr 19 12:38:56.989240 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:56.989201 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="b04c5281-27a0-4793-927d-5188d81666e3" containerName="extract" Apr 19 12:38:56.989464 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:56.989270 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="b04c5281-27a0-4793-927d-5188d81666e3" containerName="extract" Apr 19 12:38:56.992443 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:56.992422 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q" Apr 19 12:38:56.995264 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:56.995238 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-98xhq\"" Apr 19 12:38:56.995769 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:56.995749 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 19 12:38:56.995865 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:56.995749 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 19 12:38:57.008530 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:57.008506 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q"] Apr 19 12:38:57.159086 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:57.159052 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb2q7\" (UniqueName: \"kubernetes.io/projected/3f149812-9639-44ed-b634-62b9c3f17126-kube-api-access-mb2q7\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q\" (UID: 
\"3f149812-9639-44ed-b634-62b9c3f17126\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q" Apr 19 12:38:57.159273 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:57.159137 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f149812-9639-44ed-b634-62b9c3f17126-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q\" (UID: \"3f149812-9639-44ed-b634-62b9c3f17126\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q" Apr 19 12:38:57.159273 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:57.159177 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f149812-9639-44ed-b634-62b9c3f17126-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q\" (UID: \"3f149812-9639-44ed-b634-62b9c3f17126\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q" Apr 19 12:38:57.260283 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:57.260191 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f149812-9639-44ed-b634-62b9c3f17126-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q\" (UID: \"3f149812-9639-44ed-b634-62b9c3f17126\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q" Apr 19 12:38:57.260283 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:57.260239 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f149812-9639-44ed-b634-62b9c3f17126-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q\" (UID: \"3f149812-9639-44ed-b634-62b9c3f17126\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q" Apr 19 12:38:57.260523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:57.260324 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mb2q7\" (UniqueName: \"kubernetes.io/projected/3f149812-9639-44ed-b634-62b9c3f17126-kube-api-access-mb2q7\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q\" (UID: \"3f149812-9639-44ed-b634-62b9c3f17126\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q" Apr 19 12:38:57.260724 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:57.260701 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f149812-9639-44ed-b634-62b9c3f17126-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q\" (UID: \"3f149812-9639-44ed-b634-62b9c3f17126\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q" Apr 19 12:38:57.260818 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:57.260758 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f149812-9639-44ed-b634-62b9c3f17126-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q\" (UID: \"3f149812-9639-44ed-b634-62b9c3f17126\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q" Apr 19 12:38:57.268037 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:57.268006 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb2q7\" (UniqueName: \"kubernetes.io/projected/3f149812-9639-44ed-b634-62b9c3f17126-kube-api-access-mb2q7\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q\" (UID: \"3f149812-9639-44ed-b634-62b9c3f17126\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q" Apr 19 12:38:57.302119 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:57.302079 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q" Apr 19 12:38:57.440460 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:57.440421 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q"] Apr 19 12:38:57.443872 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:38:57.443840 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f149812_9639_44ed_b634_62b9c3f17126.slice/crio-236d068a83d1241d46399dfa4d42fbf8c20d78e637e92ac8f241fa27e8d87988 WatchSource:0}: Error finding container 236d068a83d1241d46399dfa4d42fbf8c20d78e637e92ac8f241fa27e8d87988: Status 404 returned error can't find the container with id 236d068a83d1241d46399dfa4d42fbf8c20d78e637e92ac8f241fa27e8d87988 Apr 19 12:38:58.213267 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:58.213234 2567 generic.go:358] "Generic (PLEG): container finished" podID="3f149812-9639-44ed-b634-62b9c3f17126" containerID="fe19389cc40647f7c0e8908d8506ef29de9422536a62824058bf0ee07f0440ff" exitCode=0 Apr 19 12:38:58.213660 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:58.213292 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q" event={"ID":"3f149812-9639-44ed-b634-62b9c3f17126","Type":"ContainerDied","Data":"fe19389cc40647f7c0e8908d8506ef29de9422536a62824058bf0ee07f0440ff"} Apr 19 12:38:58.213660 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:58.213315 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q" event={"ID":"3f149812-9639-44ed-b634-62b9c3f17126","Type":"ContainerStarted","Data":"236d068a83d1241d46399dfa4d42fbf8c20d78e637e92ac8f241fa27e8d87988"} Apr 19 12:38:59.225120 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:59.225084 2567 generic.go:358] "Generic (PLEG): container finished" podID="3f149812-9639-44ed-b634-62b9c3f17126" containerID="ec1bbf4e669cc2864acf69f5bbea532237358f7e12a3d221ea82f42bb5352285" exitCode=0 Apr 19 12:38:59.225546 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:38:59.225188 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q" event={"ID":"3f149812-9639-44ed-b634-62b9c3f17126","Type":"ContainerDied","Data":"ec1bbf4e669cc2864acf69f5bbea532237358f7e12a3d221ea82f42bb5352285"} Apr 19 12:39:00.232428 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:00.232395 2567 generic.go:358] "Generic (PLEG): container finished" podID="3f149812-9639-44ed-b634-62b9c3f17126" containerID="7ef4d287659df32f7597dbd121420ea183d3dab379987fb50e350b8d0b032656" exitCode=0 Apr 19 12:39:00.232783 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:00.232481 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q" event={"ID":"3f149812-9639-44ed-b634-62b9c3f17126","Type":"ContainerDied","Data":"7ef4d287659df32f7597dbd121420ea183d3dab379987fb50e350b8d0b032656"} Apr 19 12:39:01.373198 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:01.373153 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q" Apr 19 12:39:01.503280 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:01.503202 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f149812-9639-44ed-b634-62b9c3f17126-bundle\") pod \"3f149812-9639-44ed-b634-62b9c3f17126\" (UID: \"3f149812-9639-44ed-b634-62b9c3f17126\") " Apr 19 12:39:01.503280 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:01.503268 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f149812-9639-44ed-b634-62b9c3f17126-util\") pod \"3f149812-9639-44ed-b634-62b9c3f17126\" (UID: \"3f149812-9639-44ed-b634-62b9c3f17126\") " Apr 19 12:39:01.503467 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:01.503298 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mb2q7\" (UniqueName: \"kubernetes.io/projected/3f149812-9639-44ed-b634-62b9c3f17126-kube-api-access-mb2q7\") pod \"3f149812-9639-44ed-b634-62b9c3f17126\" (UID: \"3f149812-9639-44ed-b634-62b9c3f17126\") " Apr 19 12:39:01.504262 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:01.504225 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f149812-9639-44ed-b634-62b9c3f17126-bundle" (OuterVolumeSpecName: "bundle") pod "3f149812-9639-44ed-b634-62b9c3f17126" (UID: "3f149812-9639-44ed-b634-62b9c3f17126"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 19 12:39:01.505526 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:01.505496 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f149812-9639-44ed-b634-62b9c3f17126-kube-api-access-mb2q7" (OuterVolumeSpecName: "kube-api-access-mb2q7") pod "3f149812-9639-44ed-b634-62b9c3f17126" (UID: "3f149812-9639-44ed-b634-62b9c3f17126"). InnerVolumeSpecName "kube-api-access-mb2q7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:39:01.509246 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:01.509208 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f149812-9639-44ed-b634-62b9c3f17126-util" (OuterVolumeSpecName: "util") pod "3f149812-9639-44ed-b634-62b9c3f17126" (UID: "3f149812-9639-44ed-b634-62b9c3f17126"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 19 12:39:01.604490 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:01.604446 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mb2q7\" (UniqueName: \"kubernetes.io/projected/3f149812-9639-44ed-b634-62b9c3f17126-kube-api-access-mb2q7\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:39:01.604490 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:01.604482 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f149812-9639-44ed-b634-62b9c3f17126-bundle\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:39:01.604490 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:01.604492 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f149812-9639-44ed-b634-62b9c3f17126-util\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:39:02.241443 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:02.241415 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q"
Apr 19 12:39:02.241443 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:02.241423 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29j87q" event={"ID":"3f149812-9639-44ed-b634-62b9c3f17126","Type":"ContainerDied","Data":"236d068a83d1241d46399dfa4d42fbf8c20d78e637e92ac8f241fa27e8d87988"}
Apr 19 12:39:02.241443 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:02.241450 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="236d068a83d1241d46399dfa4d42fbf8c20d78e637e92ac8f241fa27e8d87988"
Apr 19 12:39:47.339593 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.339553 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd"]
Apr 19 12:39:47.340059 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.339941 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f149812-9639-44ed-b634-62b9c3f17126" containerName="extract"
Apr 19 12:39:47.340059 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.339952 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f149812-9639-44ed-b634-62b9c3f17126" containerName="extract"
Apr 19 12:39:47.340059 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.339962 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f149812-9639-44ed-b634-62b9c3f17126" containerName="util"
Apr 19 12:39:47.340059 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.339967 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f149812-9639-44ed-b634-62b9c3f17126" containerName="util"
Apr 19 12:39:47.340059 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.339979 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f149812-9639-44ed-b634-62b9c3f17126" containerName="pull"
Apr 19 12:39:47.340059 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.339984 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f149812-9639-44ed-b634-62b9c3f17126" containerName="pull"
Apr 19 12:39:47.340059 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.340051 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f149812-9639-44ed-b634-62b9c3f17126" containerName="extract"
Apr 19 12:39:47.344718 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.344700 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd"
Apr 19 12:39:47.346603 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.346581 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-4cwbr\""
Apr 19 12:39:47.346718 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.346589 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 19 12:39:47.346718 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.346589 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 19 12:39:47.352564 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.352511 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd"]
Apr 19 12:39:47.506565 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.506535 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d21cef59-75f6-4713-81fd-7593fc3bb1c3-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd\" (UID: \"d21cef59-75f6-4713-81fd-7593fc3bb1c3\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd"
Apr 19 12:39:47.506765 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.506579 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzmf7\" (UniqueName: \"kubernetes.io/projected/d21cef59-75f6-4713-81fd-7593fc3bb1c3-kube-api-access-xzmf7\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd\" (UID: \"d21cef59-75f6-4713-81fd-7593fc3bb1c3\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd"
Apr 19 12:39:47.506765 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.506673 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d21cef59-75f6-4713-81fd-7593fc3bb1c3-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd\" (UID: \"d21cef59-75f6-4713-81fd-7593fc3bb1c3\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd"
Apr 19 12:39:47.608182 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.608085 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d21cef59-75f6-4713-81fd-7593fc3bb1c3-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd\" (UID: \"d21cef59-75f6-4713-81fd-7593fc3bb1c3\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd"
Apr 19 12:39:47.608182 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.608122 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzmf7\" (UniqueName: \"kubernetes.io/projected/d21cef59-75f6-4713-81fd-7593fc3bb1c3-kube-api-access-xzmf7\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd\" (UID: \"d21cef59-75f6-4713-81fd-7593fc3bb1c3\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd"
Apr 19 12:39:47.608410 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.608197 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d21cef59-75f6-4713-81fd-7593fc3bb1c3-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd\" (UID: \"d21cef59-75f6-4713-81fd-7593fc3bb1c3\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd"
Apr 19 12:39:47.608548 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.608527 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d21cef59-75f6-4713-81fd-7593fc3bb1c3-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd\" (UID: \"d21cef59-75f6-4713-81fd-7593fc3bb1c3\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd"
Apr 19 12:39:47.608626 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.608605 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d21cef59-75f6-4713-81fd-7593fc3bb1c3-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd\" (UID: \"d21cef59-75f6-4713-81fd-7593fc3bb1c3\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd"
Apr 19 12:39:47.615674 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.615643 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzmf7\" (UniqueName: \"kubernetes.io/projected/d21cef59-75f6-4713-81fd-7593fc3bb1c3-kube-api-access-xzmf7\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd\" (UID: \"d21cef59-75f6-4713-81fd-7593fc3bb1c3\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd"
Apr 19 12:39:47.655468 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.655432 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd"
Apr 19 12:39:47.782109 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.782078 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd"]
Apr 19 12:39:47.784018 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:39:47.783983 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd21cef59_75f6_4713_81fd_7593fc3bb1c3.slice/crio-2bc6d7bc206f776a6f50b69ed8e448b1779b852a0a43ceaa84bf9ac338ed4fb2 WatchSource:0}: Error finding container 2bc6d7bc206f776a6f50b69ed8e448b1779b852a0a43ceaa84bf9ac338ed4fb2: Status 404 returned error can't find the container with id 2bc6d7bc206f776a6f50b69ed8e448b1779b852a0a43ceaa84bf9ac338ed4fb2
Apr 19 12:39:47.935963 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.935932 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47"]
Apr 19 12:39:47.939609 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.939592 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47"
Apr 19 12:39:47.946426 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:47.946401 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47"]
Apr 19 12:39:48.011743 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.011707 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4rg2\" (UniqueName: \"kubernetes.io/projected/bf78201c-89b2-4894-8db8-f90bd134e341-kube-api-access-f4rg2\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47\" (UID: \"bf78201c-89b2-4894-8db8-f90bd134e341\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47"
Apr 19 12:39:48.011923 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.011755 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf78201c-89b2-4894-8db8-f90bd134e341-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47\" (UID: \"bf78201c-89b2-4894-8db8-f90bd134e341\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47"
Apr 19 12:39:48.011923 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.011836 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf78201c-89b2-4894-8db8-f90bd134e341-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47\" (UID: \"bf78201c-89b2-4894-8db8-f90bd134e341\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47"
Apr 19 12:39:48.112819 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.112787 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4rg2\" (UniqueName: \"kubernetes.io/projected/bf78201c-89b2-4894-8db8-f90bd134e341-kube-api-access-f4rg2\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47\" (UID: \"bf78201c-89b2-4894-8db8-f90bd134e341\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47"
Apr 19 12:39:48.112986 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.112826 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf78201c-89b2-4894-8db8-f90bd134e341-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47\" (UID: \"bf78201c-89b2-4894-8db8-f90bd134e341\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47"
Apr 19 12:39:48.112986 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.112858 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf78201c-89b2-4894-8db8-f90bd134e341-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47\" (UID: \"bf78201c-89b2-4894-8db8-f90bd134e341\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47"
Apr 19 12:39:48.113230 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.113211 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf78201c-89b2-4894-8db8-f90bd134e341-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47\" (UID: \"bf78201c-89b2-4894-8db8-f90bd134e341\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47"
Apr 19 12:39:48.113270 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.113234 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf78201c-89b2-4894-8db8-f90bd134e341-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47\" (UID: \"bf78201c-89b2-4894-8db8-f90bd134e341\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47"
Apr 19 12:39:48.121838 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.121814 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4rg2\" (UniqueName: \"kubernetes.io/projected/bf78201c-89b2-4894-8db8-f90bd134e341-kube-api-access-f4rg2\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47\" (UID: \"bf78201c-89b2-4894-8db8-f90bd134e341\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47"
Apr 19 12:39:48.259338 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.259296 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47"
Apr 19 12:39:48.405965 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.405934 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47"]
Apr 19 12:39:48.409315 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:39:48.409290 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf78201c_89b2_4894_8db8_f90bd134e341.slice/crio-f304eed2ebeb60065b6df63b5a3640129ac124eed8722504a55811ae877731e6 WatchSource:0}: Error finding container f304eed2ebeb60065b6df63b5a3640129ac124eed8722504a55811ae877731e6: Status 404 returned error can't find the container with id f304eed2ebeb60065b6df63b5a3640129ac124eed8722504a55811ae877731e6
Apr 19 12:39:48.413635 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.413610 2567 generic.go:358] "Generic (PLEG): container finished" podID="d21cef59-75f6-4713-81fd-7593fc3bb1c3" containerID="93c9d189da962ceb779335648ce9db2ab3b81f55e84574838b431ca8e97ca1f9" exitCode=0
Apr 19 12:39:48.413735 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.413683 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd" event={"ID":"d21cef59-75f6-4713-81fd-7593fc3bb1c3","Type":"ContainerDied","Data":"93c9d189da962ceb779335648ce9db2ab3b81f55e84574838b431ca8e97ca1f9"}
Apr 19 12:39:48.413735 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.413711 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd" event={"ID":"d21cef59-75f6-4713-81fd-7593fc3bb1c3","Type":"ContainerStarted","Data":"2bc6d7bc206f776a6f50b69ed8e448b1779b852a0a43ceaa84bf9ac338ed4fb2"}
Apr 19 12:39:48.542149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.542039 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s"]
Apr 19 12:39:48.545791 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.545771 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s"
Apr 19 12:39:48.552677 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.552649 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s"]
Apr 19 12:39:48.720750 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.720707 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx6tv\" (UniqueName: \"kubernetes.io/projected/09ae2876-b22e-4fb6-b7e0-d93d4d478df5-kube-api-access-rx6tv\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s\" (UID: \"09ae2876-b22e-4fb6-b7e0-d93d4d478df5\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s"
Apr 19 12:39:48.720950 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.720770 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09ae2876-b22e-4fb6-b7e0-d93d4d478df5-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s\" (UID: \"09ae2876-b22e-4fb6-b7e0-d93d4d478df5\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s"
Apr 19 12:39:48.720950 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.720826 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09ae2876-b22e-4fb6-b7e0-d93d4d478df5-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s\" (UID: \"09ae2876-b22e-4fb6-b7e0-d93d4d478df5\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s"
Apr 19 12:39:48.822270 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.822148 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rx6tv\" (UniqueName: \"kubernetes.io/projected/09ae2876-b22e-4fb6-b7e0-d93d4d478df5-kube-api-access-rx6tv\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s\" (UID: \"09ae2876-b22e-4fb6-b7e0-d93d4d478df5\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s"
Apr 19 12:39:48.822270 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.822245 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09ae2876-b22e-4fb6-b7e0-d93d4d478df5-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s\" (UID: \"09ae2876-b22e-4fb6-b7e0-d93d4d478df5\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s"
Apr 19 12:39:48.822508 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.822316 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09ae2876-b22e-4fb6-b7e0-d93d4d478df5-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s\" (UID: \"09ae2876-b22e-4fb6-b7e0-d93d4d478df5\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s"
Apr 19 12:39:48.822698 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.822674 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09ae2876-b22e-4fb6-b7e0-d93d4d478df5-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s\" (UID: \"09ae2876-b22e-4fb6-b7e0-d93d4d478df5\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s"
Apr 19 12:39:48.822698 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.822691 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09ae2876-b22e-4fb6-b7e0-d93d4d478df5-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s\" (UID: \"09ae2876-b22e-4fb6-b7e0-d93d4d478df5\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s"
Apr 19 12:39:48.829978 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.829955 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx6tv\" (UniqueName: \"kubernetes.io/projected/09ae2876-b22e-4fb6-b7e0-d93d4d478df5-kube-api-access-rx6tv\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s\" (UID: \"09ae2876-b22e-4fb6-b7e0-d93d4d478df5\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s"
Apr 19 12:39:48.874783 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.874750 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s"
Apr 19 12:39:48.995821 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:48.995797 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s"]
Apr 19 12:39:48.997672 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:39:48.997640 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09ae2876_b22e_4fb6_b7e0_d93d4d478df5.slice/crio-8681e560d189ecaf553a9fb2ab34f718daa99bf53e6d2d6e73acef04600b0a84 WatchSource:0}: Error finding container 8681e560d189ecaf553a9fb2ab34f718daa99bf53e6d2d6e73acef04600b0a84: Status 404 returned error can't find the container with id 8681e560d189ecaf553a9fb2ab34f718daa99bf53e6d2d6e73acef04600b0a84
Apr 19 12:39:49.146966 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:49.143537 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv"]
Apr 19 12:39:49.151095 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:49.151066 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv"
Apr 19 12:39:49.153410 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:49.153380 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv"]
Apr 19 12:39:49.327139 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:49.327109 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a498dcf-155d-4306-9c8f-18c1f4dde823-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv\" (UID: \"2a498dcf-155d-4306-9c8f-18c1f4dde823\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv"
Apr 19 12:39:49.327283 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:49.327239 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49mfh\" (UniqueName: \"kubernetes.io/projected/2a498dcf-155d-4306-9c8f-18c1f4dde823-kube-api-access-49mfh\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv\" (UID: \"2a498dcf-155d-4306-9c8f-18c1f4dde823\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv"
Apr 19 12:39:49.327326 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:49.327296 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a498dcf-155d-4306-9c8f-18c1f4dde823-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv\" (UID: \"2a498dcf-155d-4306-9c8f-18c1f4dde823\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv"
Apr 19 12:39:49.422523 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:49.422489 2567 generic.go:358] "Generic (PLEG): container finished" podID="d21cef59-75f6-4713-81fd-7593fc3bb1c3" containerID="44f01499d654531e4bce619542afff4d867a57aa7b5b9570fb6d146bf83a7d0f" exitCode=0
Apr 19 12:39:49.422975 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:49.422584 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd" event={"ID":"d21cef59-75f6-4713-81fd-7593fc3bb1c3","Type":"ContainerDied","Data":"44f01499d654531e4bce619542afff4d867a57aa7b5b9570fb6d146bf83a7d0f"}
Apr 19 12:39:49.424013 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:49.423989 2567 generic.go:358] "Generic (PLEG): container finished" podID="09ae2876-b22e-4fb6-b7e0-d93d4d478df5" containerID="96cac228fc5bfd0d1c4dab0b675eff8506bfdfc3b53b57fb41346599d67817e8" exitCode=0
Apr 19 12:39:49.424102 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:49.424076 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s" event={"ID":"09ae2876-b22e-4fb6-b7e0-d93d4d478df5","Type":"ContainerDied","Data":"96cac228fc5bfd0d1c4dab0b675eff8506bfdfc3b53b57fb41346599d67817e8"}
Apr 19 12:39:49.424177 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:49.424109 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s" event={"ID":"09ae2876-b22e-4fb6-b7e0-d93d4d478df5","Type":"ContainerStarted","Data":"8681e560d189ecaf553a9fb2ab34f718daa99bf53e6d2d6e73acef04600b0a84"}
Apr 19 12:39:49.425531 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:49.425508 2567 generic.go:358] "Generic (PLEG): container finished" podID="bf78201c-89b2-4894-8db8-f90bd134e341" containerID="fc6f82aa92cb77384ad22d826ad05f3e06a90adeea732218e5b958a4a7d2ae35" exitCode=0
Apr 19 12:39:49.425650 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:49.425578 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47" event={"ID":"bf78201c-89b2-4894-8db8-f90bd134e341","Type":"ContainerDied","Data":"fc6f82aa92cb77384ad22d826ad05f3e06a90adeea732218e5b958a4a7d2ae35"}
Apr 19 12:39:49.425650 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:49.425607 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47" event={"ID":"bf78201c-89b2-4894-8db8-f90bd134e341","Type":"ContainerStarted","Data":"f304eed2ebeb60065b6df63b5a3640129ac124eed8722504a55811ae877731e6"}
Apr 19 12:39:49.427753 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:49.427734 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49mfh\" (UniqueName: \"kubernetes.io/projected/2a498dcf-155d-4306-9c8f-18c1f4dde823-kube-api-access-49mfh\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv\" (UID: \"2a498dcf-155d-4306-9c8f-18c1f4dde823\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv"
Apr 19 12:39:49.427834 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:49.427791 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a498dcf-155d-4306-9c8f-18c1f4dde823-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv\" (UID: \"2a498dcf-155d-4306-9c8f-18c1f4dde823\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv"
Apr 19 12:39:49.427897 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:49.427846 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a498dcf-155d-4306-9c8f-18c1f4dde823-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv\" (UID: \"2a498dcf-155d-4306-9c8f-18c1f4dde823\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv"
Apr 19 12:39:49.428156 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:49.428138 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a498dcf-155d-4306-9c8f-18c1f4dde823-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv\" (UID: \"2a498dcf-155d-4306-9c8f-18c1f4dde823\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv"
Apr 19 12:39:49.428271 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:49.428192 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a498dcf-155d-4306-9c8f-18c1f4dde823-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv\" (UID: \"2a498dcf-155d-4306-9c8f-18c1f4dde823\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv"
Apr 19 12:39:49.436385 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:49.436364 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49mfh\" (UniqueName: \"kubernetes.io/projected/2a498dcf-155d-4306-9c8f-18c1f4dde823-kube-api-access-49mfh\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv\" (UID: \"2a498dcf-155d-4306-9c8f-18c1f4dde823\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv"
Apr 19 12:39:49.462425 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:49.462398 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv"
Apr 19 12:39:49.591692 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:49.591662 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv"]
Apr 19 12:39:49.609947 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:39:49.609914 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a498dcf_155d_4306_9c8f_18c1f4dde823.slice/crio-9bdf4bf56791bc103b75f337573dc321e6c897699052fcbbb2f09f3b9181e334 WatchSource:0}: Error finding container 9bdf4bf56791bc103b75f337573dc321e6c897699052fcbbb2f09f3b9181e334: Status 404 returned error can't find the container with id 9bdf4bf56791bc103b75f337573dc321e6c897699052fcbbb2f09f3b9181e334
Apr 19 12:39:50.431477 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:50.431386 2567 generic.go:358] "Generic (PLEG): container finished" podID="bf78201c-89b2-4894-8db8-f90bd134e341" containerID="2fb6e31bd28758130505790169d25b03c31bca161ad1b6e98d8e6c5659a5f888" exitCode=0
Apr 19 12:39:50.431885 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:50.431467 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47" event={"ID":"bf78201c-89b2-4894-8db8-f90bd134e341","Type":"ContainerDied","Data":"2fb6e31bd28758130505790169d25b03c31bca161ad1b6e98d8e6c5659a5f888"}
Apr 19 12:39:50.432825 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:50.432801 2567 generic.go:358] "Generic (PLEG): container finished" podID="2a498dcf-155d-4306-9c8f-18c1f4dde823" containerID="73f88f55e3bb050962a64c265b11d1d6330017a4cde57f6432fb3f6fd7776624" exitCode=0
Apr 19 12:39:50.432952 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:50.432875 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv" event={"ID":"2a498dcf-155d-4306-9c8f-18c1f4dde823","Type":"ContainerDied","Data":"73f88f55e3bb050962a64c265b11d1d6330017a4cde57f6432fb3f6fd7776624"}
Apr 19 12:39:50.432952 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:50.432906 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv" event={"ID":"2a498dcf-155d-4306-9c8f-18c1f4dde823","Type":"ContainerStarted","Data":"9bdf4bf56791bc103b75f337573dc321e6c897699052fcbbb2f09f3b9181e334"}
Apr 19 12:39:50.434857 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:50.434830 2567 generic.go:358] "Generic (PLEG): container finished" podID="d21cef59-75f6-4713-81fd-7593fc3bb1c3" containerID="ebddfc739be38a64bec6a09585cd297d6815e2517dd0dc9d7b1aef3412cd626f" exitCode=0
Apr 19 12:39:50.434857 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:50.434853 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd" event={"ID":"d21cef59-75f6-4713-81fd-7593fc3bb1c3","Type":"ContainerDied","Data":"ebddfc739be38a64bec6a09585cd297d6815e2517dd0dc9d7b1aef3412cd626f"}
Apr 19 12:39:50.436580 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:50.436558 2567 generic.go:358] "Generic (PLEG): container finished" podID="09ae2876-b22e-4fb6-b7e0-d93d4d478df5" containerID="02ea1204e271030f796dee37c55def90d10394657b0e57ff4fc46007241658ad" exitCode=0
Apr 19 12:39:50.436665 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:50.436581 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s" event={"ID":"09ae2876-b22e-4fb6-b7e0-d93d4d478df5","Type":"ContainerDied","Data":"02ea1204e271030f796dee37c55def90d10394657b0e57ff4fc46007241658ad"}
Apr 19 12:39:51.442066 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:51.442034 2567 generic.go:358] "Generic (PLEG): container finished" podID="bf78201c-89b2-4894-8db8-f90bd134e341" containerID="c06f21e18989f950dd3d674d5c308052cc1a28921a37df091173479cd0b93b65" exitCode=0
Apr 19 12:39:51.442514 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:51.442107 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47" event={"ID":"bf78201c-89b2-4894-8db8-f90bd134e341","Type":"ContainerDied","Data":"c06f21e18989f950dd3d674d5c308052cc1a28921a37df091173479cd0b93b65"}
Apr 19 12:39:51.443695 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:51.443668 2567 generic.go:358] "Generic (PLEG): container finished" podID="2a498dcf-155d-4306-9c8f-18c1f4dde823" containerID="d03b7846f9ecf5c8018fa578771f53acd9e6006f637a5b3555672ee5229b29d6" exitCode=0
Apr 19 12:39:51.443833 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:51.443751 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv" event={"ID":"2a498dcf-155d-4306-9c8f-18c1f4dde823","Type":"ContainerDied","Data":"d03b7846f9ecf5c8018fa578771f53acd9e6006f637a5b3555672ee5229b29d6"}
Apr 19 12:39:51.445801 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:51.445778 2567 generic.go:358] "Generic (PLEG): container finished" podID="09ae2876-b22e-4fb6-b7e0-d93d4d478df5" containerID="5d4046f382a836771733e4ed20133c24950430fb89b85ed46083cf4cab5fb27d" exitCode=0
Apr 19 12:39:51.445885 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:51.445863 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s" event={"ID":"09ae2876-b22e-4fb6-b7e0-d93d4d478df5","Type":"ContainerDied","Data":"5d4046f382a836771733e4ed20133c24950430fb89b85ed46083cf4cab5fb27d"}
Apr 19 12:39:51.586017 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:51.585991 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd"
Apr 19 12:39:51.748485 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:51.748394 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzmf7\" (UniqueName: \"kubernetes.io/projected/d21cef59-75f6-4713-81fd-7593fc3bb1c3-kube-api-access-xzmf7\") pod \"d21cef59-75f6-4713-81fd-7593fc3bb1c3\" (UID: \"d21cef59-75f6-4713-81fd-7593fc3bb1c3\") "
Apr 19 12:39:51.748485 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:51.748444 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d21cef59-75f6-4713-81fd-7593fc3bb1c3-bundle\") pod \"d21cef59-75f6-4713-81fd-7593fc3bb1c3\" (UID: \"d21cef59-75f6-4713-81fd-7593fc3bb1c3\") "
Apr 19 12:39:51.748485 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:51.748465 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d21cef59-75f6-4713-81fd-7593fc3bb1c3-util\") pod \"d21cef59-75f6-4713-81fd-7593fc3bb1c3\" (UID: \"d21cef59-75f6-4713-81fd-7593fc3bb1c3\") "
Apr 19 12:39:51.748913 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:51.748882 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d21cef59-75f6-4713-81fd-7593fc3bb1c3-bundle" (OuterVolumeSpecName: "bundle") pod "d21cef59-75f6-4713-81fd-7593fc3bb1c3" (UID: "d21cef59-75f6-4713-81fd-7593fc3bb1c3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 19 12:39:51.750839 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:51.750813 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d21cef59-75f6-4713-81fd-7593fc3bb1c3-kube-api-access-xzmf7" (OuterVolumeSpecName: "kube-api-access-xzmf7") pod "d21cef59-75f6-4713-81fd-7593fc3bb1c3" (UID: "d21cef59-75f6-4713-81fd-7593fc3bb1c3"). InnerVolumeSpecName "kube-api-access-xzmf7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:39:51.753972 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:51.753945 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d21cef59-75f6-4713-81fd-7593fc3bb1c3-util" (OuterVolumeSpecName: "util") pod "d21cef59-75f6-4713-81fd-7593fc3bb1c3" (UID: "d21cef59-75f6-4713-81fd-7593fc3bb1c3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 19 12:39:51.850157 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:51.850123 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d21cef59-75f6-4713-81fd-7593fc3bb1c3-bundle\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:39:51.850157 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:51.850152 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d21cef59-75f6-4713-81fd-7593fc3bb1c3-util\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:39:51.850431 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:51.850184 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xzmf7\" (UniqueName: \"kubernetes.io/projected/d21cef59-75f6-4713-81fd-7593fc3bb1c3-kube-api-access-xzmf7\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:39:52.456389 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:52.456355 2567 generic.go:358] "Generic (PLEG):
container finished" podID="2a498dcf-155d-4306-9c8f-18c1f4dde823" containerID="3017831e822d4b8b777ca405d595a1c15f82a51fe38470a59f2a85625f54ed51" exitCode=0 Apr 19 12:39:52.459890 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:52.459857 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd" Apr 19 12:39:52.460286 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:52.460252 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd" event={"ID":"d21cef59-75f6-4713-81fd-7593fc3bb1c3","Type":"ContainerDied","Data":"2bc6d7bc206f776a6f50b69ed8e448b1779b852a0a43ceaa84bf9ac338ed4fb2"} Apr 19 12:39:52.460430 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:52.460290 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bc6d7bc206f776a6f50b69ed8e448b1779b852a0a43ceaa84bf9ac338ed4fb2" Apr 19 12:39:52.460430 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:52.460304 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv" event={"ID":"2a498dcf-155d-4306-9c8f-18c1f4dde823","Type":"ContainerDied","Data":"3017831e822d4b8b777ca405d595a1c15f82a51fe38470a59f2a85625f54ed51"} Apr 19 12:39:52.592561 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:52.592535 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47" Apr 19 12:39:52.617467 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:52.617446 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s" Apr 19 12:39:52.758514 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:52.758427 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09ae2876-b22e-4fb6-b7e0-d93d4d478df5-util\") pod \"09ae2876-b22e-4fb6-b7e0-d93d4d478df5\" (UID: \"09ae2876-b22e-4fb6-b7e0-d93d4d478df5\") " Apr 19 12:39:52.758514 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:52.758485 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf78201c-89b2-4894-8db8-f90bd134e341-bundle\") pod \"bf78201c-89b2-4894-8db8-f90bd134e341\" (UID: \"bf78201c-89b2-4894-8db8-f90bd134e341\") " Apr 19 12:39:52.758720 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:52.758520 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf78201c-89b2-4894-8db8-f90bd134e341-util\") pod \"bf78201c-89b2-4894-8db8-f90bd134e341\" (UID: \"bf78201c-89b2-4894-8db8-f90bd134e341\") " Apr 19 12:39:52.758720 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:52.758558 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4rg2\" (UniqueName: \"kubernetes.io/projected/bf78201c-89b2-4894-8db8-f90bd134e341-kube-api-access-f4rg2\") pod \"bf78201c-89b2-4894-8db8-f90bd134e341\" (UID: \"bf78201c-89b2-4894-8db8-f90bd134e341\") " Apr 19 12:39:52.758720 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:52.758574 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09ae2876-b22e-4fb6-b7e0-d93d4d478df5-bundle\") pod \"09ae2876-b22e-4fb6-b7e0-d93d4d478df5\" (UID: \"09ae2876-b22e-4fb6-b7e0-d93d4d478df5\") " Apr 19 12:39:52.758720 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:52.758596 2567 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx6tv\" (UniqueName: \"kubernetes.io/projected/09ae2876-b22e-4fb6-b7e0-d93d4d478df5-kube-api-access-rx6tv\") pod \"09ae2876-b22e-4fb6-b7e0-d93d4d478df5\" (UID: \"09ae2876-b22e-4fb6-b7e0-d93d4d478df5\") " Apr 19 12:39:52.759101 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:52.759067 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf78201c-89b2-4894-8db8-f90bd134e341-bundle" (OuterVolumeSpecName: "bundle") pod "bf78201c-89b2-4894-8db8-f90bd134e341" (UID: "bf78201c-89b2-4894-8db8-f90bd134e341"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:39:52.759334 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:52.759311 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09ae2876-b22e-4fb6-b7e0-d93d4d478df5-bundle" (OuterVolumeSpecName: "bundle") pod "09ae2876-b22e-4fb6-b7e0-d93d4d478df5" (UID: "09ae2876-b22e-4fb6-b7e0-d93d4d478df5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:39:52.760797 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:52.760773 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae2876-b22e-4fb6-b7e0-d93d4d478df5-kube-api-access-rx6tv" (OuterVolumeSpecName: "kube-api-access-rx6tv") pod "09ae2876-b22e-4fb6-b7e0-d93d4d478df5" (UID: "09ae2876-b22e-4fb6-b7e0-d93d4d478df5"). InnerVolumeSpecName "kube-api-access-rx6tv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:39:52.760903 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:52.760885 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf78201c-89b2-4894-8db8-f90bd134e341-kube-api-access-f4rg2" (OuterVolumeSpecName: "kube-api-access-f4rg2") pod "bf78201c-89b2-4894-8db8-f90bd134e341" (UID: "bf78201c-89b2-4894-8db8-f90bd134e341"). InnerVolumeSpecName "kube-api-access-f4rg2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:39:52.767287 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:52.767261 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf78201c-89b2-4894-8db8-f90bd134e341-util" (OuterVolumeSpecName: "util") pod "bf78201c-89b2-4894-8db8-f90bd134e341" (UID: "bf78201c-89b2-4894-8db8-f90bd134e341"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:39:52.767507 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:52.767491 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09ae2876-b22e-4fb6-b7e0-d93d4d478df5-util" (OuterVolumeSpecName: "util") pod "09ae2876-b22e-4fb6-b7e0-d93d4d478df5" (UID: "09ae2876-b22e-4fb6-b7e0-d93d4d478df5"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:39:52.859429 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:52.859400 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09ae2876-b22e-4fb6-b7e0-d93d4d478df5-util\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:39:52.859429 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:52.859426 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf78201c-89b2-4894-8db8-f90bd134e341-bundle\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:39:52.859429 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:52.859435 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf78201c-89b2-4894-8db8-f90bd134e341-util\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:39:52.859649 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:52.859444 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f4rg2\" (UniqueName: \"kubernetes.io/projected/bf78201c-89b2-4894-8db8-f90bd134e341-kube-api-access-f4rg2\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:39:52.859649 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:52.859454 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09ae2876-b22e-4fb6-b7e0-d93d4d478df5-bundle\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:39:52.859649 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:52.859466 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rx6tv\" (UniqueName: \"kubernetes.io/projected/09ae2876-b22e-4fb6-b7e0-d93d4d478df5-kube-api-access-rx6tv\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:39:53.461545 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:53.461519 2567 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s" Apr 19 12:39:53.461949 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:53.461511 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s" event={"ID":"09ae2876-b22e-4fb6-b7e0-d93d4d478df5","Type":"ContainerDied","Data":"8681e560d189ecaf553a9fb2ab34f718daa99bf53e6d2d6e73acef04600b0a84"} Apr 19 12:39:53.461949 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:53.461635 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8681e560d189ecaf553a9fb2ab34f718daa99bf53e6d2d6e73acef04600b0a84" Apr 19 12:39:53.463328 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:53.463309 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47" Apr 19 12:39:53.463446 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:53.463339 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47" event={"ID":"bf78201c-89b2-4894-8db8-f90bd134e341","Type":"ContainerDied","Data":"f304eed2ebeb60065b6df63b5a3640129ac124eed8722504a55811ae877731e6"} Apr 19 12:39:53.463446 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:53.463370 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f304eed2ebeb60065b6df63b5a3640129ac124eed8722504a55811ae877731e6" Apr 19 12:39:53.588859 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:53.588832 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv" Apr 19 12:39:53.766776 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:53.766673 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a498dcf-155d-4306-9c8f-18c1f4dde823-bundle\") pod \"2a498dcf-155d-4306-9c8f-18c1f4dde823\" (UID: \"2a498dcf-155d-4306-9c8f-18c1f4dde823\") " Apr 19 12:39:53.766776 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:53.766716 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a498dcf-155d-4306-9c8f-18c1f4dde823-util\") pod \"2a498dcf-155d-4306-9c8f-18c1f4dde823\" (UID: \"2a498dcf-155d-4306-9c8f-18c1f4dde823\") " Apr 19 12:39:53.766999 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:53.766793 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49mfh\" (UniqueName: \"kubernetes.io/projected/2a498dcf-155d-4306-9c8f-18c1f4dde823-kube-api-access-49mfh\") pod \"2a498dcf-155d-4306-9c8f-18c1f4dde823\" (UID: \"2a498dcf-155d-4306-9c8f-18c1f4dde823\") " Apr 19 12:39:53.767264 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:53.767241 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a498dcf-155d-4306-9c8f-18c1f4dde823-bundle" (OuterVolumeSpecName: "bundle") pod "2a498dcf-155d-4306-9c8f-18c1f4dde823" (UID: "2a498dcf-155d-4306-9c8f-18c1f4dde823"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:39:53.768885 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:53.768856 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a498dcf-155d-4306-9c8f-18c1f4dde823-kube-api-access-49mfh" (OuterVolumeSpecName: "kube-api-access-49mfh") pod "2a498dcf-155d-4306-9c8f-18c1f4dde823" (UID: "2a498dcf-155d-4306-9c8f-18c1f4dde823"). InnerVolumeSpecName "kube-api-access-49mfh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:39:53.771959 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:53.771935 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a498dcf-155d-4306-9c8f-18c1f4dde823-util" (OuterVolumeSpecName: "util") pod "2a498dcf-155d-4306-9c8f-18c1f4dde823" (UID: "2a498dcf-155d-4306-9c8f-18c1f4dde823"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:39:53.867560 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:53.867526 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a498dcf-155d-4306-9c8f-18c1f4dde823-bundle\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:39:53.867560 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:53.867554 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a498dcf-155d-4306-9c8f-18c1f4dde823-util\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:39:53.867560 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:53.867563 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-49mfh\" (UniqueName: \"kubernetes.io/projected/2a498dcf-155d-4306-9c8f-18c1f4dde823-kube-api-access-49mfh\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:39:54.468622 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:54.468598 2567 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv" Apr 19 12:39:54.468943 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:54.468611 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv" event={"ID":"2a498dcf-155d-4306-9c8f-18c1f4dde823","Type":"ContainerDied","Data":"9bdf4bf56791bc103b75f337573dc321e6c897699052fcbbb2f09f3b9181e334"} Apr 19 12:39:54.468943 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:39:54.468649 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bdf4bf56791bc103b75f337573dc321e6c897699052fcbbb2f09f3b9181e334" Apr 19 12:40:22.346668 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:22.346634 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58fff8c4c5-tqlfs"] Apr 19 12:40:23.585517 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.585469 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-zs9pb"] Apr 19 12:40:23.586064 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586047 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d21cef59-75f6-4713-81fd-7593fc3bb1c3" containerName="util" Apr 19 12:40:23.586117 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586068 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="d21cef59-75f6-4713-81fd-7593fc3bb1c3" containerName="util" Apr 19 12:40:23.586117 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586086 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf78201c-89b2-4894-8db8-f90bd134e341" containerName="util" Apr 19 12:40:23.586117 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586095 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf78201c-89b2-4894-8db8-f90bd134e341" containerName="util" Apr 19 12:40:23.586117 
ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586107 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d21cef59-75f6-4713-81fd-7593fc3bb1c3" containerName="pull" Apr 19 12:40:23.586117 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586114 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="d21cef59-75f6-4713-81fd-7593fc3bb1c3" containerName="pull" Apr 19 12:40:23.586314 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586124 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a498dcf-155d-4306-9c8f-18c1f4dde823" containerName="util" Apr 19 12:40:23.586314 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586132 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a498dcf-155d-4306-9c8f-18c1f4dde823" containerName="util" Apr 19 12:40:23.586314 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586143 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09ae2876-b22e-4fb6-b7e0-d93d4d478df5" containerName="extract" Apr 19 12:40:23.586314 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586151 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="09ae2876-b22e-4fb6-b7e0-d93d4d478df5" containerName="extract" Apr 19 12:40:23.586314 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586180 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09ae2876-b22e-4fb6-b7e0-d93d4d478df5" containerName="util" Apr 19 12:40:23.586314 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586188 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="09ae2876-b22e-4fb6-b7e0-d93d4d478df5" containerName="util" Apr 19 12:40:23.586314 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586200 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf78201c-89b2-4894-8db8-f90bd134e341" containerName="extract" Apr 19 12:40:23.586314 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586230 2567 
state_mem.go:107] "Deleted CPUSet assignment" podUID="bf78201c-89b2-4894-8db8-f90bd134e341" containerName="extract" Apr 19 12:40:23.586314 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586241 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a498dcf-155d-4306-9c8f-18c1f4dde823" containerName="pull" Apr 19 12:40:23.586314 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586249 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a498dcf-155d-4306-9c8f-18c1f4dde823" containerName="pull" Apr 19 12:40:23.586314 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586273 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d21cef59-75f6-4713-81fd-7593fc3bb1c3" containerName="extract" Apr 19 12:40:23.586314 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586280 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="d21cef59-75f6-4713-81fd-7593fc3bb1c3" containerName="extract" Apr 19 12:40:23.586314 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586298 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf78201c-89b2-4894-8db8-f90bd134e341" containerName="pull" Apr 19 12:40:23.586314 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586306 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf78201c-89b2-4894-8db8-f90bd134e341" containerName="pull" Apr 19 12:40:23.586719 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586320 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09ae2876-b22e-4fb6-b7e0-d93d4d478df5" containerName="pull" Apr 19 12:40:23.586719 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586329 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="09ae2876-b22e-4fb6-b7e0-d93d4d478df5" containerName="pull" Apr 19 12:40:23.586719 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586341 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="2a498dcf-155d-4306-9c8f-18c1f4dde823" containerName="extract" Apr 19 12:40:23.586719 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586349 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a498dcf-155d-4306-9c8f-18c1f4dde823" containerName="extract" Apr 19 12:40:23.586719 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586429 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="d21cef59-75f6-4713-81fd-7593fc3bb1c3" containerName="extract" Apr 19 12:40:23.586719 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586441 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="09ae2876-b22e-4fb6-b7e0-d93d4d478df5" containerName="extract" Apr 19 12:40:23.586719 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586454 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf78201c-89b2-4894-8db8-f90bd134e341" containerName="extract" Apr 19 12:40:23.586719 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.586462 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="2a498dcf-155d-4306-9c8f-18c1f4dde823" containerName="extract" Apr 19 12:40:23.591481 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.591457 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-zs9pb" Apr 19 12:40:23.593715 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.593693 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 19 12:40:23.593842 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.593737 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 19 12:40:23.594092 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.594072 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 19 12:40:23.594190 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.594073 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 19 12:40:23.594529 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.594270 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-4cwbr\"" Apr 19 12:40:23.594974 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.594948 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-zs9pb"] Apr 19 12:40:23.717779 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.717744 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqmh4\" (UniqueName: \"kubernetes.io/projected/036649a7-fe91-49cc-ba36-5caaceef8d42-kube-api-access-wqmh4\") pod \"kuadrant-console-plugin-6cb54b5c86-zs9pb\" (UID: \"036649a7-fe91-49cc-ba36-5caaceef8d42\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-zs9pb" Apr 19 12:40:23.717947 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.717808 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/036649a7-fe91-49cc-ba36-5caaceef8d42-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-zs9pb\" (UID: \"036649a7-fe91-49cc-ba36-5caaceef8d42\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-zs9pb" Apr 19 12:40:23.717947 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.717902 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/036649a7-fe91-49cc-ba36-5caaceef8d42-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-zs9pb\" (UID: \"036649a7-fe91-49cc-ba36-5caaceef8d42\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-zs9pb" Apr 19 12:40:23.819342 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.819308 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/036649a7-fe91-49cc-ba36-5caaceef8d42-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-zs9pb\" (UID: \"036649a7-fe91-49cc-ba36-5caaceef8d42\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-zs9pb" Apr 19 12:40:23.819513 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.819487 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqmh4\" (UniqueName: \"kubernetes.io/projected/036649a7-fe91-49cc-ba36-5caaceef8d42-kube-api-access-wqmh4\") pod \"kuadrant-console-plugin-6cb54b5c86-zs9pb\" (UID: \"036649a7-fe91-49cc-ba36-5caaceef8d42\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-zs9pb" Apr 19 12:40:23.819665 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.819528 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/036649a7-fe91-49cc-ba36-5caaceef8d42-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-zs9pb\" (UID: \"036649a7-fe91-49cc-ba36-5caaceef8d42\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-zs9pb" Apr 19 12:40:23.820145 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.820125 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/036649a7-fe91-49cc-ba36-5caaceef8d42-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-zs9pb\" (UID: \"036649a7-fe91-49cc-ba36-5caaceef8d42\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-zs9pb" Apr 19 12:40:23.822283 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.822259 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/036649a7-fe91-49cc-ba36-5caaceef8d42-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-zs9pb\" (UID: \"036649a7-fe91-49cc-ba36-5caaceef8d42\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-zs9pb" Apr 19 12:40:23.826601 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.826577 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqmh4\" (UniqueName: \"kubernetes.io/projected/036649a7-fe91-49cc-ba36-5caaceef8d42-kube-api-access-wqmh4\") pod \"kuadrant-console-plugin-6cb54b5c86-zs9pb\" (UID: \"036649a7-fe91-49cc-ba36-5caaceef8d42\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-zs9pb" Apr 19 12:40:23.902813 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:23.902782 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-zs9pb" Apr 19 12:40:24.032782 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:24.032744 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-zs9pb"] Apr 19 12:40:24.034884 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:40:24.034858 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod036649a7_fe91_49cc_ba36_5caaceef8d42.slice/crio-3c73712eadbcb7d110de87c1b54094f89bcdab14bebea890150489f422be92c7 WatchSource:0}: Error finding container 3c73712eadbcb7d110de87c1b54094f89bcdab14bebea890150489f422be92c7: Status 404 returned error can't find the container with id 3c73712eadbcb7d110de87c1b54094f89bcdab14bebea890150489f422be92c7 Apr 19 12:40:24.588756 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:24.588713 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-zs9pb" event={"ID":"036649a7-fe91-49cc-ba36-5caaceef8d42","Type":"ContainerStarted","Data":"3c73712eadbcb7d110de87c1b54094f89bcdab14bebea890150489f422be92c7"} Apr 19 12:40:45.264226 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:45.264194 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnbtf_2b04d28c-3eb3-44e1-b431-d6b75f3850fe/console-operator/1.log" Apr 19 12:40:45.264747 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:45.264498 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnbtf_2b04d28c-3eb3-44e1-b431-d6b75f3850fe/console-operator/1.log" Apr 19 12:40:45.688621 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:45.688584 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-zs9pb" 
event={"ID":"036649a7-fe91-49cc-ba36-5caaceef8d42","Type":"ContainerStarted","Data":"ea91928054a652ddb1c63cb85259a6b41b2fb4644e75ed659ffbc133f070062b"} Apr 19 12:40:45.705716 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:45.705668 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-zs9pb" podStartSLOduration=1.416670621 podStartE2EDuration="22.705654412s" podCreationTimestamp="2026-04-19 12:40:23 +0000 UTC" firstStartedPulling="2026-04-19 12:40:24.036563817 +0000 UTC m=+592.190497274" lastFinishedPulling="2026-04-19 12:40:45.325547604 +0000 UTC m=+613.479481065" observedRunningTime="2026-04-19 12:40:45.703422421 +0000 UTC m=+613.857355901" watchObservedRunningTime="2026-04-19 12:40:45.705654412 +0000 UTC m=+613.859587891" Apr 19 12:40:47.367092 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.367049 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-58fff8c4c5-tqlfs" podUID="c6bb8c7f-a442-43c7-88a2-aedc6cfd9347" containerName="console" containerID="cri-o://f17fc991c8763f151a0dd0719e7782c205b391c647518bc413c70c642686557c" gracePeriod=15 Apr 19 12:40:47.698369 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.698338 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58fff8c4c5-tqlfs_c6bb8c7f-a442-43c7-88a2-aedc6cfd9347/console/0.log" Apr 19 12:40:47.698511 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.698385 2567 generic.go:358] "Generic (PLEG): container finished" podID="c6bb8c7f-a442-43c7-88a2-aedc6cfd9347" containerID="f17fc991c8763f151a0dd0719e7782c205b391c647518bc413c70c642686557c" exitCode=2 Apr 19 12:40:47.698511 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.698457 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58fff8c4c5-tqlfs" 
event={"ID":"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347","Type":"ContainerDied","Data":"f17fc991c8763f151a0dd0719e7782c205b391c647518bc413c70c642686557c"} Apr 19 12:40:47.734451 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.734429 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58fff8c4c5-tqlfs_c6bb8c7f-a442-43c7-88a2-aedc6cfd9347/console/0.log" Apr 19 12:40:47.734575 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.734538 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:40:47.862771 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.862737 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-console-serving-cert\") pod \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " Apr 19 12:40:47.862951 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.862796 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-console-oauth-config\") pod \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " Apr 19 12:40:47.862951 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.862926 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr9sq\" (UniqueName: \"kubernetes.io/projected/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-kube-api-access-sr9sq\") pod \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " Apr 19 12:40:47.863225 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.862967 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-service-ca\") pod \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " Apr 19 12:40:47.863225 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.863008 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-oauth-serving-cert\") pod \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " Apr 19 12:40:47.863225 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.863053 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-trusted-ca-bundle\") pod \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " Apr 19 12:40:47.863225 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.863204 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-console-config\") pod \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\" (UID: \"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347\") " Apr 19 12:40:47.863519 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.863440 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-service-ca" (OuterVolumeSpecName: "service-ca") pod "c6bb8c7f-a442-43c7-88a2-aedc6cfd9347" (UID: "c6bb8c7f-a442-43c7-88a2-aedc6cfd9347"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:40:47.863519 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.863454 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c6bb8c7f-a442-43c7-88a2-aedc6cfd9347" (UID: "c6bb8c7f-a442-43c7-88a2-aedc6cfd9347"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:40:47.863519 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.863468 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c6bb8c7f-a442-43c7-88a2-aedc6cfd9347" (UID: "c6bb8c7f-a442-43c7-88a2-aedc6cfd9347"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:40:47.863738 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.863714 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-service-ca\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:40:47.863805 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.863742 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-oauth-serving-cert\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:40:47.863805 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.863756 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-trusted-ca-bundle\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:40:47.863805 ip-10-0-142-55 kubenswrapper[2567]: I0419 
12:40:47.863782 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-console-config" (OuterVolumeSpecName: "console-config") pod "c6bb8c7f-a442-43c7-88a2-aedc6cfd9347" (UID: "c6bb8c7f-a442-43c7-88a2-aedc6cfd9347"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:40:47.865021 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.865000 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c6bb8c7f-a442-43c7-88a2-aedc6cfd9347" (UID: "c6bb8c7f-a442-43c7-88a2-aedc6cfd9347"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:40:47.865126 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.865020 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c6bb8c7f-a442-43c7-88a2-aedc6cfd9347" (UID: "c6bb8c7f-a442-43c7-88a2-aedc6cfd9347"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:40:47.865126 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.865037 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-kube-api-access-sr9sq" (OuterVolumeSpecName: "kube-api-access-sr9sq") pod "c6bb8c7f-a442-43c7-88a2-aedc6cfd9347" (UID: "c6bb8c7f-a442-43c7-88a2-aedc6cfd9347"). InnerVolumeSpecName "kube-api-access-sr9sq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:40:47.964902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.964827 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-console-config\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:40:47.964902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.964856 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-console-serving-cert\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:40:47.964902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.964866 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-console-oauth-config\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:40:47.964902 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:47.964875 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sr9sq\" (UniqueName: \"kubernetes.io/projected/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347-kube-api-access-sr9sq\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:40:48.703760 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:48.703729 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58fff8c4c5-tqlfs_c6bb8c7f-a442-43c7-88a2-aedc6cfd9347/console/0.log" Apr 19 12:40:48.704290 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:48.703869 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58fff8c4c5-tqlfs" Apr 19 12:40:48.704290 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:48.703867 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58fff8c4c5-tqlfs" event={"ID":"c6bb8c7f-a442-43c7-88a2-aedc6cfd9347","Type":"ContainerDied","Data":"046bc0dd9999499dbdbc5df5219aca321ddef981179e173fc85746141b0c511e"} Apr 19 12:40:48.704290 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:48.703913 2567 scope.go:117] "RemoveContainer" containerID="f17fc991c8763f151a0dd0719e7782c205b391c647518bc413c70c642686557c" Apr 19 12:40:48.720621 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:48.720564 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58fff8c4c5-tqlfs"] Apr 19 12:40:48.725886 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:48.725863 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-58fff8c4c5-tqlfs"] Apr 19 12:40:50.459112 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:40:50.459077 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6bb8c7f-a442-43c7-88a2-aedc6cfd9347" path="/var/lib/kubelet/pods/c6bb8c7f-a442-43c7-88a2-aedc6cfd9347/volumes" Apr 19 12:41:18.802595 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:18.802557 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:41:18.803125 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:18.802974 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6bb8c7f-a442-43c7-88a2-aedc6cfd9347" containerName="console" Apr 19 12:41:18.803125 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:18.802990 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6bb8c7f-a442-43c7-88a2-aedc6cfd9347" containerName="console" Apr 19 12:41:18.803125 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:18.803063 2567 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="c6bb8c7f-a442-43c7-88a2-aedc6cfd9347" containerName="console" Apr 19 12:41:18.848576 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:18.848535 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:41:18.848576 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:18.848568 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:41:18.848839 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:18.848674 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-fvgtn" Apr 19 12:41:18.850651 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:18.850628 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 19 12:41:18.927283 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:18.927250 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws6cp\" (UniqueName: \"kubernetes.io/projected/f144adca-9201-4b23-b7de-f5e209e3797f-kube-api-access-ws6cp\") pod \"limitador-limitador-78c99df468-fvgtn\" (UID: \"f144adca-9201-4b23-b7de-f5e209e3797f\") " pod="kuadrant-system/limitador-limitador-78c99df468-fvgtn" Apr 19 12:41:18.927460 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:18.927349 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/f144adca-9201-4b23-b7de-f5e209e3797f-config-file\") pod \"limitador-limitador-78c99df468-fvgtn\" (UID: \"f144adca-9201-4b23-b7de-f5e209e3797f\") " pod="kuadrant-system/limitador-limitador-78c99df468-fvgtn" Apr 19 12:41:19.028193 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:19.028133 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" 
(UniqueName: \"kubernetes.io/configmap/f144adca-9201-4b23-b7de-f5e209e3797f-config-file\") pod \"limitador-limitador-78c99df468-fvgtn\" (UID: \"f144adca-9201-4b23-b7de-f5e209e3797f\") " pod="kuadrant-system/limitador-limitador-78c99df468-fvgtn" Apr 19 12:41:19.028372 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:19.028242 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ws6cp\" (UniqueName: \"kubernetes.io/projected/f144adca-9201-4b23-b7de-f5e209e3797f-kube-api-access-ws6cp\") pod \"limitador-limitador-78c99df468-fvgtn\" (UID: \"f144adca-9201-4b23-b7de-f5e209e3797f\") " pod="kuadrant-system/limitador-limitador-78c99df468-fvgtn" Apr 19 12:41:19.028715 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:19.028696 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/f144adca-9201-4b23-b7de-f5e209e3797f-config-file\") pod \"limitador-limitador-78c99df468-fvgtn\" (UID: \"f144adca-9201-4b23-b7de-f5e209e3797f\") " pod="kuadrant-system/limitador-limitador-78c99df468-fvgtn" Apr 19 12:41:19.035463 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:19.035442 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws6cp\" (UniqueName: \"kubernetes.io/projected/f144adca-9201-4b23-b7de-f5e209e3797f-kube-api-access-ws6cp\") pod \"limitador-limitador-78c99df468-fvgtn\" (UID: \"f144adca-9201-4b23-b7de-f5e209e3797f\") " pod="kuadrant-system/limitador-limitador-78c99df468-fvgtn" Apr 19 12:41:19.127717 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:19.127639 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-mftht"] Apr 19 12:41:19.131511 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:19.131489 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-mftht" Apr 19 12:41:19.133279 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:19.133259 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-svsgq\"" Apr 19 12:41:19.137122 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:19.137099 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-mftht"] Apr 19 12:41:19.159177 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:19.159131 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-fvgtn" Apr 19 12:41:19.236007 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:19.233992 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96nhm\" (UniqueName: \"kubernetes.io/projected/7ff96edb-9596-4d27-b33e-9c26df365548-kube-api-access-96nhm\") pod \"authorino-f99f4b5cd-mftht\" (UID: \"7ff96edb-9596-4d27-b33e-9c26df365548\") " pod="kuadrant-system/authorino-f99f4b5cd-mftht" Apr 19 12:41:19.304304 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:19.304277 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:41:19.305730 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:41:19.305699 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf144adca_9201_4b23_b7de_f5e209e3797f.slice/crio-29ed85c42085257247505189370d45f9d75a50447ded223f68a2b10bf5b9ecb9 WatchSource:0}: Error finding container 29ed85c42085257247505189370d45f9d75a50447ded223f68a2b10bf5b9ecb9: Status 404 returned error can't find the container with id 29ed85c42085257247505189370d45f9d75a50447ded223f68a2b10bf5b9ecb9 Apr 19 12:41:19.307563 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:19.307543 2567 provider.go:93] Refreshing 
cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 19 12:41:19.335477 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:19.335446 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96nhm\" (UniqueName: \"kubernetes.io/projected/7ff96edb-9596-4d27-b33e-9c26df365548-kube-api-access-96nhm\") pod \"authorino-f99f4b5cd-mftht\" (UID: \"7ff96edb-9596-4d27-b33e-9c26df365548\") " pod="kuadrant-system/authorino-f99f4b5cd-mftht" Apr 19 12:41:19.342947 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:19.342922 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96nhm\" (UniqueName: \"kubernetes.io/projected/7ff96edb-9596-4d27-b33e-9c26df365548-kube-api-access-96nhm\") pod \"authorino-f99f4b5cd-mftht\" (UID: \"7ff96edb-9596-4d27-b33e-9c26df365548\") " pod="kuadrant-system/authorino-f99f4b5cd-mftht" Apr 19 12:41:19.443277 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:19.443244 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-mftht" Apr 19 12:41:19.561668 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:19.561643 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-mftht"] Apr 19 12:41:19.563941 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:41:19.563914 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ff96edb_9596_4d27_b33e_9c26df365548.slice/crio-c1f1abde5a0d26657deab7299966dbf1e017d14aa710b2e442faed987eae06bb WatchSource:0}: Error finding container c1f1abde5a0d26657deab7299966dbf1e017d14aa710b2e442faed987eae06bb: Status 404 returned error can't find the container with id c1f1abde5a0d26657deab7299966dbf1e017d14aa710b2e442faed987eae06bb Apr 19 12:41:19.823305 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:19.823206 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-fvgtn" event={"ID":"f144adca-9201-4b23-b7de-f5e209e3797f","Type":"ContainerStarted","Data":"29ed85c42085257247505189370d45f9d75a50447ded223f68a2b10bf5b9ecb9"} Apr 19 12:41:19.824759 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:19.824725 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-mftht" event={"ID":"7ff96edb-9596-4d27-b33e-9c26df365548","Type":"ContainerStarted","Data":"c1f1abde5a0d26657deab7299966dbf1e017d14aa710b2e442faed987eae06bb"} Apr 19 12:41:22.839792 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:22.839748 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-fvgtn" event={"ID":"f144adca-9201-4b23-b7de-f5e209e3797f","Type":"ContainerStarted","Data":"56e2fbebd2579669d656ac60220f412382290f21819235cf864dda06298d4fbd"} Apr 19 12:41:22.840237 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:22.839827 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="kuadrant-system/limitador-limitador-78c99df468-fvgtn" Apr 19 12:41:22.841193 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:22.841154 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-mftht" event={"ID":"7ff96edb-9596-4d27-b33e-9c26df365548","Type":"ContainerStarted","Data":"1f2f355c6f66aa18419ed7dbf034f26f7e124d269d337474a5fda315b80891ce"} Apr 19 12:41:22.854342 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:22.854297 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-fvgtn" podStartSLOduration=1.456613866 podStartE2EDuration="4.854285808s" podCreationTimestamp="2026-04-19 12:41:18 +0000 UTC" firstStartedPulling="2026-04-19 12:41:19.307672624 +0000 UTC m=+647.461606083" lastFinishedPulling="2026-04-19 12:41:22.70534456 +0000 UTC m=+650.859278025" observedRunningTime="2026-04-19 12:41:22.852831754 +0000 UTC m=+651.006765244" watchObservedRunningTime="2026-04-19 12:41:22.854285808 +0000 UTC m=+651.008219285" Apr 19 12:41:22.865605 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:22.865562 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-mftht" podStartSLOduration=0.780546472 podStartE2EDuration="3.865544011s" podCreationTimestamp="2026-04-19 12:41:19 +0000 UTC" firstStartedPulling="2026-04-19 12:41:19.565273571 +0000 UTC m=+647.719207029" lastFinishedPulling="2026-04-19 12:41:22.650271099 +0000 UTC m=+650.804204568" observedRunningTime="2026-04-19 12:41:22.865039671 +0000 UTC m=+651.018973151" watchObservedRunningTime="2026-04-19 12:41:22.865544011 +0000 UTC m=+651.019477491" Apr 19 12:41:23.568179 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:23.568136 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-mftht"] Apr 19 12:41:24.849124 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:24.849060 2567 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-mftht" podUID="7ff96edb-9596-4d27-b33e-9c26df365548" containerName="authorino" containerID="cri-o://1f2f355c6f66aa18419ed7dbf034f26f7e124d269d337474a5fda315b80891ce" gracePeriod=30 Apr 19 12:41:25.100112 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:25.100051 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-mftht" Apr 19 12:41:25.197271 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:25.197232 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96nhm\" (UniqueName: \"kubernetes.io/projected/7ff96edb-9596-4d27-b33e-9c26df365548-kube-api-access-96nhm\") pod \"7ff96edb-9596-4d27-b33e-9c26df365548\" (UID: \"7ff96edb-9596-4d27-b33e-9c26df365548\") " Apr 19 12:41:25.199363 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:25.199330 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ff96edb-9596-4d27-b33e-9c26df365548-kube-api-access-96nhm" (OuterVolumeSpecName: "kube-api-access-96nhm") pod "7ff96edb-9596-4d27-b33e-9c26df365548" (UID: "7ff96edb-9596-4d27-b33e-9c26df365548"). InnerVolumeSpecName "kube-api-access-96nhm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:41:25.298217 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:25.298185 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-96nhm\" (UniqueName: \"kubernetes.io/projected/7ff96edb-9596-4d27-b33e-9c26df365548-kube-api-access-96nhm\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:41:25.853723 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:25.853686 2567 generic.go:358] "Generic (PLEG): container finished" podID="7ff96edb-9596-4d27-b33e-9c26df365548" containerID="1f2f355c6f66aa18419ed7dbf034f26f7e124d269d337474a5fda315b80891ce" exitCode=0 Apr 19 12:41:25.854128 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:25.853745 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-mftht" Apr 19 12:41:25.854128 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:25.853771 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-mftht" event={"ID":"7ff96edb-9596-4d27-b33e-9c26df365548","Type":"ContainerDied","Data":"1f2f355c6f66aa18419ed7dbf034f26f7e124d269d337474a5fda315b80891ce"} Apr 19 12:41:25.854128 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:25.853814 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-mftht" event={"ID":"7ff96edb-9596-4d27-b33e-9c26df365548","Type":"ContainerDied","Data":"c1f1abde5a0d26657deab7299966dbf1e017d14aa710b2e442faed987eae06bb"} Apr 19 12:41:25.854128 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:25.853831 2567 scope.go:117] "RemoveContainer" containerID="1f2f355c6f66aa18419ed7dbf034f26f7e124d269d337474a5fda315b80891ce" Apr 19 12:41:25.862809 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:25.862794 2567 scope.go:117] "RemoveContainer" containerID="1f2f355c6f66aa18419ed7dbf034f26f7e124d269d337474a5fda315b80891ce" Apr 19 12:41:25.863029 ip-10-0-142-55 kubenswrapper[2567]: E0419 
12:41:25.863012 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f2f355c6f66aa18419ed7dbf034f26f7e124d269d337474a5fda315b80891ce\": container with ID starting with 1f2f355c6f66aa18419ed7dbf034f26f7e124d269d337474a5fda315b80891ce not found: ID does not exist" containerID="1f2f355c6f66aa18419ed7dbf034f26f7e124d269d337474a5fda315b80891ce" Apr 19 12:41:25.863083 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:25.863037 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f2f355c6f66aa18419ed7dbf034f26f7e124d269d337474a5fda315b80891ce"} err="failed to get container status \"1f2f355c6f66aa18419ed7dbf034f26f7e124d269d337474a5fda315b80891ce\": rpc error: code = NotFound desc = could not find container \"1f2f355c6f66aa18419ed7dbf034f26f7e124d269d337474a5fda315b80891ce\": container with ID starting with 1f2f355c6f66aa18419ed7dbf034f26f7e124d269d337474a5fda315b80891ce not found: ID does not exist" Apr 19 12:41:25.874586 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:25.874562 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-mftht"] Apr 19 12:41:25.877664 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:25.877641 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-mftht"] Apr 19 12:41:26.459321 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:26.459289 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ff96edb-9596-4d27-b33e-9c26df365548" path="/var/lib/kubelet/pods/7ff96edb-9596-4d27-b33e-9c26df365548/volumes" Apr 19 12:41:33.846583 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:33.846550 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-fvgtn" Apr 19 12:41:48.556625 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.556587 2567 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["kuadrant-system/authorino-8b475cf9f-kpzql"]
Apr 19 12:41:48.557013 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.556967 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ff96edb-9596-4d27-b33e-9c26df365548" containerName="authorino"
Apr 19 12:41:48.557013 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.556978 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ff96edb-9596-4d27-b33e-9c26df365548" containerName="authorino"
Apr 19 12:41:48.557094 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.557036 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ff96edb-9596-4d27-b33e-9c26df365548" containerName="authorino"
Apr 19 12:41:48.561534 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.561515 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-kpzql"
Apr 19 12:41:48.563427 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.563401 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-svsgq\""
Apr 19 12:41:48.566051 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.566025 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-kpzql"]
Apr 19 12:41:48.595488 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.595454 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5g6m\" (UniqueName: \"kubernetes.io/projected/65b7d3e4-b87e-4f9f-9a71-f5d95151c120-kube-api-access-q5g6m\") pod \"authorino-8b475cf9f-kpzql\" (UID: \"65b7d3e4-b87e-4f9f-9a71-f5d95151c120\") " pod="kuadrant-system/authorino-8b475cf9f-kpzql"
Apr 19 12:41:48.696190 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.696121 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5g6m\" (UniqueName: \"kubernetes.io/projected/65b7d3e4-b87e-4f9f-9a71-f5d95151c120-kube-api-access-q5g6m\") pod \"authorino-8b475cf9f-kpzql\" (UID: \"65b7d3e4-b87e-4f9f-9a71-f5d95151c120\") " pod="kuadrant-system/authorino-8b475cf9f-kpzql"
Apr 19 12:41:48.703632 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.703597 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5g6m\" (UniqueName: \"kubernetes.io/projected/65b7d3e4-b87e-4f9f-9a71-f5d95151c120-kube-api-access-q5g6m\") pod \"authorino-8b475cf9f-kpzql\" (UID: \"65b7d3e4-b87e-4f9f-9a71-f5d95151c120\") " pod="kuadrant-system/authorino-8b475cf9f-kpzql"
Apr 19 12:41:48.796072 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.796035 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-kpzql"]
Apr 19 12:41:48.796300 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.796288 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-kpzql"
Apr 19 12:41:48.819418 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.819340 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-57895fd864-v96sm"]
Apr 19 12:41:48.823739 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.823714 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-57895fd864-v96sm"
Apr 19 12:41:48.828537 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.828514 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-57895fd864-v96sm"]
Apr 19 12:41:48.846275 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.846239 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-57895fd864-v96sm"]
Apr 19 12:41:48.846532 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:41:48.846507 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-zcwtn], unattached volumes=[], failed to process volumes=[kube-api-access-zcwtn]: context canceled" pod="kuadrant-system/authorino-57895fd864-v96sm" podUID="3ea37fae-b494-4df6-89f3-05d7ffce710e"
Apr 19 12:41:48.872285 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.872249 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-5fc5db7fdd-xz8x4"]
Apr 19 12:41:48.875787 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.875762 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5fc5db7fdd-xz8x4"
Apr 19 12:41:48.877716 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.877688 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 19 12:41:48.885073 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.884773 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5fc5db7fdd-xz8x4"]
Apr 19 12:41:48.897579 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.897542 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5zvb\" (UniqueName: \"kubernetes.io/projected/1fc11394-db33-4fd2-b247-03e309576df3-kube-api-access-g5zvb\") pod \"authorino-5fc5db7fdd-xz8x4\" (UID: \"1fc11394-db33-4fd2-b247-03e309576df3\") " pod="kuadrant-system/authorino-5fc5db7fdd-xz8x4"
Apr 19 12:41:48.897708 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.897626 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1fc11394-db33-4fd2-b247-03e309576df3-tls-cert\") pod \"authorino-5fc5db7fdd-xz8x4\" (UID: \"1fc11394-db33-4fd2-b247-03e309576df3\") " pod="kuadrant-system/authorino-5fc5db7fdd-xz8x4"
Apr 19 12:41:48.897767 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.897752 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcwtn\" (UniqueName: \"kubernetes.io/projected/3ea37fae-b494-4df6-89f3-05d7ffce710e-kube-api-access-zcwtn\") pod \"authorino-57895fd864-v96sm\" (UID: \"3ea37fae-b494-4df6-89f3-05d7ffce710e\") " pod="kuadrant-system/authorino-57895fd864-v96sm"
Apr 19 12:41:48.932860 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.932829 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-kpzql"]
Apr 19 12:41:48.934339 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:41:48.934310 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65b7d3e4_b87e_4f9f_9a71_f5d95151c120.slice/crio-bd4699d7002660e2321452954e98c29af79676103c3f8bac266ad2fdde8ec275 WatchSource:0}: Error finding container bd4699d7002660e2321452954e98c29af79676103c3f8bac266ad2fdde8ec275: Status 404 returned error can't find the container with id bd4699d7002660e2321452954e98c29af79676103c3f8bac266ad2fdde8ec275
Apr 19 12:41:48.945851 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.945822 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-kpzql" event={"ID":"65b7d3e4-b87e-4f9f-9a71-f5d95151c120","Type":"ContainerStarted","Data":"bd4699d7002660e2321452954e98c29af79676103c3f8bac266ad2fdde8ec275"}
Apr 19 12:41:48.945975 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.945883 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-57895fd864-v96sm"
Apr 19 12:41:48.951099 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.951079 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-57895fd864-v96sm"
Apr 19 12:41:49.000251 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.998828 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5zvb\" (UniqueName: \"kubernetes.io/projected/1fc11394-db33-4fd2-b247-03e309576df3-kube-api-access-g5zvb\") pod \"authorino-5fc5db7fdd-xz8x4\" (UID: \"1fc11394-db33-4fd2-b247-03e309576df3\") " pod="kuadrant-system/authorino-5fc5db7fdd-xz8x4"
Apr 19 12:41:49.000251 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.998932 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1fc11394-db33-4fd2-b247-03e309576df3-tls-cert\") pod \"authorino-5fc5db7fdd-xz8x4\" (UID: \"1fc11394-db33-4fd2-b247-03e309576df3\") " pod="kuadrant-system/authorino-5fc5db7fdd-xz8x4"
Apr 19 12:41:49.000251 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:48.999086 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zcwtn\" (UniqueName: \"kubernetes.io/projected/3ea37fae-b494-4df6-89f3-05d7ffce710e-kube-api-access-zcwtn\") pod \"authorino-57895fd864-v96sm\" (UID: \"3ea37fae-b494-4df6-89f3-05d7ffce710e\") " pod="kuadrant-system/authorino-57895fd864-v96sm"
Apr 19 12:41:49.006268 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:49.006229 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1fc11394-db33-4fd2-b247-03e309576df3-tls-cert\") pod \"authorino-5fc5db7fdd-xz8x4\" (UID: \"1fc11394-db33-4fd2-b247-03e309576df3\") " pod="kuadrant-system/authorino-5fc5db7fdd-xz8x4"
Apr 19 12:41:49.008452 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:49.008428 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5zvb\" (UniqueName: \"kubernetes.io/projected/1fc11394-db33-4fd2-b247-03e309576df3-kube-api-access-g5zvb\") pod \"authorino-5fc5db7fdd-xz8x4\" (UID: \"1fc11394-db33-4fd2-b247-03e309576df3\") " pod="kuadrant-system/authorino-5fc5db7fdd-xz8x4"
Apr 19 12:41:49.008580 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:49.008524 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcwtn\" (UniqueName: \"kubernetes.io/projected/3ea37fae-b494-4df6-89f3-05d7ffce710e-kube-api-access-zcwtn\") pod \"authorino-57895fd864-v96sm\" (UID: \"3ea37fae-b494-4df6-89f3-05d7ffce710e\") " pod="kuadrant-system/authorino-57895fd864-v96sm"
Apr 19 12:41:49.188396 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:49.188364 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5fc5db7fdd-xz8x4"
Apr 19 12:41:49.201266 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:49.201238 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcwtn\" (UniqueName: \"kubernetes.io/projected/3ea37fae-b494-4df6-89f3-05d7ffce710e-kube-api-access-zcwtn\") pod \"3ea37fae-b494-4df6-89f3-05d7ffce710e\" (UID: \"3ea37fae-b494-4df6-89f3-05d7ffce710e\") "
Apr 19 12:41:49.203203 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:49.203174 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ea37fae-b494-4df6-89f3-05d7ffce710e-kube-api-access-zcwtn" (OuterVolumeSpecName: "kube-api-access-zcwtn") pod "3ea37fae-b494-4df6-89f3-05d7ffce710e" (UID: "3ea37fae-b494-4df6-89f3-05d7ffce710e"). InnerVolumeSpecName "kube-api-access-zcwtn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:41:49.302533 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:49.302499 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zcwtn\" (UniqueName: \"kubernetes.io/projected/3ea37fae-b494-4df6-89f3-05d7ffce710e-kube-api-access-zcwtn\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:41:49.311500 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:49.311469 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5fc5db7fdd-xz8x4"]
Apr 19 12:41:49.313112 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:41:49.313084 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fc11394_db33_4fd2_b247_03e309576df3.slice/crio-d2267c758a1f3c646414cfdd5a20314b8c86bae7369477190eda6806fbe950fb WatchSource:0}: Error finding container d2267c758a1f3c646414cfdd5a20314b8c86bae7369477190eda6806fbe950fb: Status 404 returned error can't find the container with id d2267c758a1f3c646414cfdd5a20314b8c86bae7369477190eda6806fbe950fb
Apr 19 12:41:49.951581 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:49.951472 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-kpzql" event={"ID":"65b7d3e4-b87e-4f9f-9a71-f5d95151c120","Type":"ContainerStarted","Data":"8d3ccc369aa2698df720d0e4143735ba18c31070be9d2656a6951b6557a9264a"}
Apr 19 12:41:49.951581 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:49.951554 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-kpzql" podUID="65b7d3e4-b87e-4f9f-9a71-f5d95151c120" containerName="authorino" containerID="cri-o://8d3ccc369aa2698df720d0e4143735ba18c31070be9d2656a6951b6557a9264a" gracePeriod=30
Apr 19 12:41:49.952994 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:49.952961 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5fc5db7fdd-xz8x4" event={"ID":"1fc11394-db33-4fd2-b247-03e309576df3","Type":"ContainerStarted","Data":"28c29dd02fa3c687c1fdd89f204238cb507453b3aeab8cf87042fc1d4550f554"}
Apr 19 12:41:49.953101 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:49.953002 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5fc5db7fdd-xz8x4" event={"ID":"1fc11394-db33-4fd2-b247-03e309576df3","Type":"ContainerStarted","Data":"d2267c758a1f3c646414cfdd5a20314b8c86bae7369477190eda6806fbe950fb"}
Apr 19 12:41:49.953101 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:49.953038 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-57895fd864-v96sm"
Apr 19 12:41:49.965395 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:49.965347 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-kpzql" podStartSLOduration=1.469074838 podStartE2EDuration="1.965330372s" podCreationTimestamp="2026-04-19 12:41:48 +0000 UTC" firstStartedPulling="2026-04-19 12:41:48.935606269 +0000 UTC m=+677.089539727" lastFinishedPulling="2026-04-19 12:41:49.4318618 +0000 UTC m=+677.585795261" observedRunningTime="2026-04-19 12:41:49.964646098 +0000 UTC m=+678.118579580" watchObservedRunningTime="2026-04-19 12:41:49.965330372 +0000 UTC m=+678.119263854"
Apr 19 12:41:49.978115 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:49.978062 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-5fc5db7fdd-xz8x4" podStartSLOduration=1.64837821 podStartE2EDuration="1.978047979s" podCreationTimestamp="2026-04-19 12:41:48 +0000 UTC" firstStartedPulling="2026-04-19 12:41:49.314506391 +0000 UTC m=+677.468439850" lastFinishedPulling="2026-04-19 12:41:49.644176146 +0000 UTC m=+677.798109619" observedRunningTime="2026-04-19 12:41:49.976545384 +0000 UTC m=+678.130478858" watchObservedRunningTime="2026-04-19 12:41:49.978047979 +0000 UTC m=+678.131981756"
Apr 19 12:41:50.032107 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:50.032062 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-57895fd864-v96sm"]
Apr 19 12:41:50.034411 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:50.034382 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-57895fd864-v96sm"]
Apr 19 12:41:50.205392 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:50.205333 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-kpzql"
Apr 19 12:41:50.210384 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:50.210365 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5g6m\" (UniqueName: \"kubernetes.io/projected/65b7d3e4-b87e-4f9f-9a71-f5d95151c120-kube-api-access-q5g6m\") pod \"65b7d3e4-b87e-4f9f-9a71-f5d95151c120\" (UID: \"65b7d3e4-b87e-4f9f-9a71-f5d95151c120\") "
Apr 19 12:41:50.212345 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:50.212324 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65b7d3e4-b87e-4f9f-9a71-f5d95151c120-kube-api-access-q5g6m" (OuterVolumeSpecName: "kube-api-access-q5g6m") pod "65b7d3e4-b87e-4f9f-9a71-f5d95151c120" (UID: "65b7d3e4-b87e-4f9f-9a71-f5d95151c120"). InnerVolumeSpecName "kube-api-access-q5g6m". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:41:50.311714 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:50.311669 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q5g6m\" (UniqueName: \"kubernetes.io/projected/65b7d3e4-b87e-4f9f-9a71-f5d95151c120-kube-api-access-q5g6m\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\""
Apr 19 12:41:50.459260 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:50.459182 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ea37fae-b494-4df6-89f3-05d7ffce710e" path="/var/lib/kubelet/pods/3ea37fae-b494-4df6-89f3-05d7ffce710e/volumes"
Apr 19 12:41:50.958067 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:50.958031 2567 generic.go:358] "Generic (PLEG): container finished" podID="65b7d3e4-b87e-4f9f-9a71-f5d95151c120" containerID="8d3ccc369aa2698df720d0e4143735ba18c31070be9d2656a6951b6557a9264a" exitCode=0
Apr 19 12:41:50.958572 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:50.958086 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-kpzql"
Apr 19 12:41:50.958572 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:50.958121 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-kpzql" event={"ID":"65b7d3e4-b87e-4f9f-9a71-f5d95151c120","Type":"ContainerDied","Data":"8d3ccc369aa2698df720d0e4143735ba18c31070be9d2656a6951b6557a9264a"}
Apr 19 12:41:50.958572 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:50.958154 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-kpzql" event={"ID":"65b7d3e4-b87e-4f9f-9a71-f5d95151c120","Type":"ContainerDied","Data":"bd4699d7002660e2321452954e98c29af79676103c3f8bac266ad2fdde8ec275"}
Apr 19 12:41:50.958572 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:50.958192 2567 scope.go:117] "RemoveContainer" containerID="8d3ccc369aa2698df720d0e4143735ba18c31070be9d2656a6951b6557a9264a"
Apr 19 12:41:50.970094 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:50.970061 2567 scope.go:117] "RemoveContainer" containerID="8d3ccc369aa2698df720d0e4143735ba18c31070be9d2656a6951b6557a9264a"
Apr 19 12:41:50.970420 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:41:50.970396 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d3ccc369aa2698df720d0e4143735ba18c31070be9d2656a6951b6557a9264a\": container with ID starting with 8d3ccc369aa2698df720d0e4143735ba18c31070be9d2656a6951b6557a9264a not found: ID does not exist" containerID="8d3ccc369aa2698df720d0e4143735ba18c31070be9d2656a6951b6557a9264a"
Apr 19 12:41:50.970532 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:50.970432 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d3ccc369aa2698df720d0e4143735ba18c31070be9d2656a6951b6557a9264a"} err="failed to get container status \"8d3ccc369aa2698df720d0e4143735ba18c31070be9d2656a6951b6557a9264a\": rpc error: code = NotFound desc = could not find container \"8d3ccc369aa2698df720d0e4143735ba18c31070be9d2656a6951b6557a9264a\": container with ID starting with 8d3ccc369aa2698df720d0e4143735ba18c31070be9d2656a6951b6557a9264a not found: ID does not exist"
Apr 19 12:41:50.972505 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:50.972476 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-kpzql"]
Apr 19 12:41:50.975804 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:50.975778 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-kpzql"]
Apr 19 12:41:52.460176 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:52.460127 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65b7d3e4-b87e-4f9f-9a71-f5d95151c120" path="/var/lib/kubelet/pods/65b7d3e4-b87e-4f9f-9a71-f5d95151c120/volumes"
Apr 19 12:41:57.555504 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:41:57.555470 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"]
Apr 19 12:42:39.228377 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:42:39.228341 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"]
Apr 19 12:42:49.737452 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:42:49.737410 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"]
Apr 19 12:42:58.824489 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:42:58.824448 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"]
Apr 19 12:43:02.429985 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:02.429954 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"]
Apr 19 12:43:06.108730 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.108695 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6"]
Apr 19 12:43:06.109337 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.109291 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65b7d3e4-b87e-4f9f-9a71-f5d95151c120" containerName="authorino"
Apr 19 12:43:06.109337 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.109311 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b7d3e4-b87e-4f9f-9a71-f5d95151c120" containerName="authorino"
Apr 19 12:43:06.109482 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.109396 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="65b7d3e4-b87e-4f9f-9a71-f5d95151c120" containerName="authorino"
Apr 19 12:43:06.112790 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.112769 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6"
Apr 19 12:43:06.115222 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.115193 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 19 12:43:06.115222 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.115216 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\""
Apr 19 12:43:06.115417 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.115231 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 19 12:43:06.115417 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.115313 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-v7ph6\""
Apr 19 12:43:06.121629 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.121608 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6"]
Apr 19 12:43:06.174321 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.174287 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b45e71f2-9724-437b-8ce5-78daa954cca5-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6\" (UID: \"b45e71f2-9724-437b-8ce5-78daa954cca5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6"
Apr 19 12:43:06.174498 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.174333 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b45e71f2-9724-437b-8ce5-78daa954cca5-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6\" (UID: \"b45e71f2-9724-437b-8ce5-78daa954cca5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6"
Apr 19 12:43:06.174498 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.174428 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b45e71f2-9724-437b-8ce5-78daa954cca5-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6\" (UID: \"b45e71f2-9724-437b-8ce5-78daa954cca5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6"
Apr 19 12:43:06.174498 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.174477 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b45e71f2-9724-437b-8ce5-78daa954cca5-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6\" (UID: \"b45e71f2-9724-437b-8ce5-78daa954cca5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6"
Apr 19 12:43:06.174607 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.174506 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqpzm\" (UniqueName: \"kubernetes.io/projected/b45e71f2-9724-437b-8ce5-78daa954cca5-kube-api-access-zqpzm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6\" (UID: \"b45e71f2-9724-437b-8ce5-78daa954cca5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6"
Apr 19 12:43:06.174607 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.174595 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b45e71f2-9724-437b-8ce5-78daa954cca5-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6\" (UID: \"b45e71f2-9724-437b-8ce5-78daa954cca5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6"
Apr 19 12:43:06.276082 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.276038 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b45e71f2-9724-437b-8ce5-78daa954cca5-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6\" (UID: \"b45e71f2-9724-437b-8ce5-78daa954cca5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6"
Apr 19 12:43:06.276289 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.276102 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zqpzm\" (UniqueName: \"kubernetes.io/projected/b45e71f2-9724-437b-8ce5-78daa954cca5-kube-api-access-zqpzm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6\" (UID: \"b45e71f2-9724-437b-8ce5-78daa954cca5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6"
Apr 19 12:43:06.276289 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.276137 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b45e71f2-9724-437b-8ce5-78daa954cca5-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6\" (UID: \"b45e71f2-9724-437b-8ce5-78daa954cca5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6"
Apr 19 12:43:06.276289 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.276229 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b45e71f2-9724-437b-8ce5-78daa954cca5-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6\" (UID: \"b45e71f2-9724-437b-8ce5-78daa954cca5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6"
Apr 19 12:43:06.276289 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.276260 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b45e71f2-9724-437b-8ce5-78daa954cca5-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6\" (UID: \"b45e71f2-9724-437b-8ce5-78daa954cca5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6"
Apr 19 12:43:06.276556 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.276316 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b45e71f2-9724-437b-8ce5-78daa954cca5-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6\" (UID: \"b45e71f2-9724-437b-8ce5-78daa954cca5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6"
Apr 19 12:43:06.276638 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.276617 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b45e71f2-9724-437b-8ce5-78daa954cca5-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6\" (UID: \"b45e71f2-9724-437b-8ce5-78daa954cca5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6"
Apr 19 12:43:06.276688 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.276618 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b45e71f2-9724-437b-8ce5-78daa954cca5-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6\" (UID: \"b45e71f2-9724-437b-8ce5-78daa954cca5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6"
Apr 19 12:43:06.276688 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.276664 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b45e71f2-9724-437b-8ce5-78daa954cca5-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6\" (UID: \"b45e71f2-9724-437b-8ce5-78daa954cca5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6"
Apr 19 12:43:06.278311 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.278290 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b45e71f2-9724-437b-8ce5-78daa954cca5-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6\" (UID: \"b45e71f2-9724-437b-8ce5-78daa954cca5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6"
Apr 19 12:43:06.278608 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.278592 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b45e71f2-9724-437b-8ce5-78daa954cca5-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6\" (UID: \"b45e71f2-9724-437b-8ce5-78daa954cca5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6"
Apr 19 12:43:06.289492 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.289470 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqpzm\" (UniqueName: \"kubernetes.io/projected/b45e71f2-9724-437b-8ce5-78daa954cca5-kube-api-access-zqpzm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6\" (UID: \"b45e71f2-9724-437b-8ce5-78daa954cca5\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6"
Apr 19 12:43:06.423872 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.423838 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6"
Apr 19 12:43:06.548699 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:06.548670 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6"]
Apr 19 12:43:06.550342 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:43:06.550316 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb45e71f2_9724_437b_8ce5_78daa954cca5.slice/crio-1b68af4a8677da5ce7f200c70ee1e96d7db0b5f103c77eaeb9a7780537294222 WatchSource:0}: Error finding container 1b68af4a8677da5ce7f200c70ee1e96d7db0b5f103c77eaeb9a7780537294222: Status 404 returned error can't find the container with id 1b68af4a8677da5ce7f200c70ee1e96d7db0b5f103c77eaeb9a7780537294222
Apr 19 12:43:07.147692 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:07.147654 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"]
Apr 19 12:43:07.259887 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:07.259841 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6" event={"ID":"b45e71f2-9724-437b-8ce5-78daa954cca5","Type":"ContainerStarted","Data":"1b68af4a8677da5ce7f200c70ee1e96d7db0b5f103c77eaeb9a7780537294222"}
Apr 19 12:43:12.283454 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:12.283417 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6" event={"ID":"b45e71f2-9724-437b-8ce5-78daa954cca5","Type":"ContainerStarted","Data":"5539b2c90a1aa7eec49df61a567a1444519a59e56871f3d0c9a9f82305ae68ab"}
Apr 19 12:43:15.334850 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:15.334814 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"]
Apr 19 12:43:20.317330 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:20.317290 2567 generic.go:358] "Generic (PLEG): container finished" podID="b45e71f2-9724-437b-8ce5-78daa954cca5" containerID="5539b2c90a1aa7eec49df61a567a1444519a59e56871f3d0c9a9f82305ae68ab" exitCode=0
Apr 19 12:43:20.317694 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:20.317345 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6" event={"ID":"b45e71f2-9724-437b-8ce5-78daa954cca5","Type":"ContainerDied","Data":"5539b2c90a1aa7eec49df61a567a1444519a59e56871f3d0c9a9f82305ae68ab"}
Apr 19 12:43:22.327122 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:22.327086 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6" event={"ID":"b45e71f2-9724-437b-8ce5-78daa954cca5","Type":"ContainerStarted","Data":"499d31057b4b8a002e4040c5301b24b195c5c475354d6a7c9e229057e4d48183"}
Apr 19 12:43:22.327546 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:22.327303 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6"
Apr 19 12:43:22.344401 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:22.344353 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6" podStartSLOduration=1.594940904 podStartE2EDuration="16.344339102s" podCreationTimestamp="2026-04-19 12:43:06 +0000 UTC" firstStartedPulling="2026-04-19 12:43:06.552129797 +0000 UTC m=+754.706063258" lastFinishedPulling="2026-04-19 12:43:21.301527996 +0000 UTC m=+769.455461456" observedRunningTime="2026-04-19 12:43:22.342090668 +0000 UTC m=+770.496024172" watchObservedRunningTime="2026-04-19 12:43:22.344339102 +0000 UTC m=+770.498272573"
Apr 19 12:43:33.343855 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:43:33.343824 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6"
Apr 19 12:44:15.888320 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:15.888283 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7965cf5b7c-xrtcd"]
Apr 19 12:44:15.891862 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:15.891845 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7965cf5b7c-xrtcd"
Apr 19 12:44:15.898562 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:15.898538 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7965cf5b7c-xrtcd"]
Apr 19 12:44:16.026640 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:16.026603 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clspc\" (UniqueName: \"kubernetes.io/projected/01efe377-9d07-46ed-bc1a-e9100937cb81-kube-api-access-clspc\") pod \"authorino-7965cf5b7c-xrtcd\" (UID: \"01efe377-9d07-46ed-bc1a-e9100937cb81\") " pod="kuadrant-system/authorino-7965cf5b7c-xrtcd"
Apr 19 12:44:16.026823 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:16.026682 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/01efe377-9d07-46ed-bc1a-e9100937cb81-tls-cert\") pod \"authorino-7965cf5b7c-xrtcd\" (UID: \"01efe377-9d07-46ed-bc1a-e9100937cb81\") " pod="kuadrant-system/authorino-7965cf5b7c-xrtcd"
Apr 19 12:44:16.127147 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:16.127108 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/01efe377-9d07-46ed-bc1a-e9100937cb81-tls-cert\") pod \"authorino-7965cf5b7c-xrtcd\" (UID: \"01efe377-9d07-46ed-bc1a-e9100937cb81\") " pod="kuadrant-system/authorino-7965cf5b7c-xrtcd"
Apr 19 12:44:16.127379 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:16.127224 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clspc\" (UniqueName: \"kubernetes.io/projected/01efe377-9d07-46ed-bc1a-e9100937cb81-kube-api-access-clspc\") pod \"authorino-7965cf5b7c-xrtcd\" (UID: \"01efe377-9d07-46ed-bc1a-e9100937cb81\") " pod="kuadrant-system/authorino-7965cf5b7c-xrtcd"
Apr 19 12:44:16.129638 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:16.129611 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/01efe377-9d07-46ed-bc1a-e9100937cb81-tls-cert\") pod \"authorino-7965cf5b7c-xrtcd\" (UID: \"01efe377-9d07-46ed-bc1a-e9100937cb81\") " pod="kuadrant-system/authorino-7965cf5b7c-xrtcd"
Apr 19 12:44:16.134181 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:16.134138 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clspc\" (UniqueName: \"kubernetes.io/projected/01efe377-9d07-46ed-bc1a-e9100937cb81-kube-api-access-clspc\") pod \"authorino-7965cf5b7c-xrtcd\" (UID: \"01efe377-9d07-46ed-bc1a-e9100937cb81\") " pod="kuadrant-system/authorino-7965cf5b7c-xrtcd"
Apr 19 12:44:16.201504 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:16.201471 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7965cf5b7c-xrtcd" Apr 19 12:44:16.325812 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:16.325784 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7965cf5b7c-xrtcd"] Apr 19 12:44:16.327743 ip-10-0-142-55 kubenswrapper[2567]: W0419 12:44:16.327716 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01efe377_9d07_46ed_bc1a_e9100937cb81.slice/crio-0406de105378a7ec70d85d0bbfd0cd037c495eb98abb0851b8a01bb744579721 WatchSource:0}: Error finding container 0406de105378a7ec70d85d0bbfd0cd037c495eb98abb0851b8a01bb744579721: Status 404 returned error can't find the container with id 0406de105378a7ec70d85d0bbfd0cd037c495eb98abb0851b8a01bb744579721 Apr 19 12:44:16.538366 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:16.538279 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7965cf5b7c-xrtcd" event={"ID":"01efe377-9d07-46ed-bc1a-e9100937cb81","Type":"ContainerStarted","Data":"0406de105378a7ec70d85d0bbfd0cd037c495eb98abb0851b8a01bb744579721"} Apr 19 12:44:17.544502 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:17.544464 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7965cf5b7c-xrtcd" event={"ID":"01efe377-9d07-46ed-bc1a-e9100937cb81","Type":"ContainerStarted","Data":"9884ba26a66699b98cb60077db58d3adf8f489ea78be0783d96cf1195e59fe88"} Apr 19 12:44:17.559255 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:17.559198 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7965cf5b7c-xrtcd" podStartSLOduration=2.197596646 podStartE2EDuration="2.559180993s" podCreationTimestamp="2026-04-19 12:44:15 +0000 UTC" firstStartedPulling="2026-04-19 12:44:16.329038408 +0000 UTC m=+824.482971866" lastFinishedPulling="2026-04-19 12:44:16.690622756 +0000 UTC m=+824.844556213" 
observedRunningTime="2026-04-19 12:44:17.556938253 +0000 UTC m=+825.710871733" watchObservedRunningTime="2026-04-19 12:44:17.559180993 +0000 UTC m=+825.713114471" Apr 19 12:44:17.582868 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:17.582829 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5fc5db7fdd-xz8x4"] Apr 19 12:44:17.583149 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:17.583099 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-5fc5db7fdd-xz8x4" podUID="1fc11394-db33-4fd2-b247-03e309576df3" containerName="authorino" containerID="cri-o://28c29dd02fa3c687c1fdd89f204238cb507453b3aeab8cf87042fc1d4550f554" gracePeriod=30 Apr 19 12:44:17.823536 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:17.823510 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5fc5db7fdd-xz8x4" Apr 19 12:44:17.843065 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:17.843034 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5zvb\" (UniqueName: \"kubernetes.io/projected/1fc11394-db33-4fd2-b247-03e309576df3-kube-api-access-g5zvb\") pod \"1fc11394-db33-4fd2-b247-03e309576df3\" (UID: \"1fc11394-db33-4fd2-b247-03e309576df3\") " Apr 19 12:44:17.843237 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:17.843077 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1fc11394-db33-4fd2-b247-03e309576df3-tls-cert\") pod \"1fc11394-db33-4fd2-b247-03e309576df3\" (UID: \"1fc11394-db33-4fd2-b247-03e309576df3\") " Apr 19 12:44:17.845455 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:17.845427 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fc11394-db33-4fd2-b247-03e309576df3-kube-api-access-g5zvb" (OuterVolumeSpecName: "kube-api-access-g5zvb") pod 
"1fc11394-db33-4fd2-b247-03e309576df3" (UID: "1fc11394-db33-4fd2-b247-03e309576df3"). InnerVolumeSpecName "kube-api-access-g5zvb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:44:17.853783 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:17.853757 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc11394-db33-4fd2-b247-03e309576df3-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "1fc11394-db33-4fd2-b247-03e309576df3" (UID: "1fc11394-db33-4fd2-b247-03e309576df3"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:44:17.943743 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:17.943712 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g5zvb\" (UniqueName: \"kubernetes.io/projected/1fc11394-db33-4fd2-b247-03e309576df3-kube-api-access-g5zvb\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:44:17.943743 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:17.943739 2567 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1fc11394-db33-4fd2-b247-03e309576df3-tls-cert\") on node \"ip-10-0-142-55.ec2.internal\" DevicePath \"\"" Apr 19 12:44:18.549465 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:18.549432 2567 generic.go:358] "Generic (PLEG): container finished" podID="1fc11394-db33-4fd2-b247-03e309576df3" containerID="28c29dd02fa3c687c1fdd89f204238cb507453b3aeab8cf87042fc1d4550f554" exitCode=0 Apr 19 12:44:18.549942 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:18.549489 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5fc5db7fdd-xz8x4" Apr 19 12:44:18.549942 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:18.549504 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5fc5db7fdd-xz8x4" event={"ID":"1fc11394-db33-4fd2-b247-03e309576df3","Type":"ContainerDied","Data":"28c29dd02fa3c687c1fdd89f204238cb507453b3aeab8cf87042fc1d4550f554"} Apr 19 12:44:18.549942 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:18.549540 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5fc5db7fdd-xz8x4" event={"ID":"1fc11394-db33-4fd2-b247-03e309576df3","Type":"ContainerDied","Data":"d2267c758a1f3c646414cfdd5a20314b8c86bae7369477190eda6806fbe950fb"} Apr 19 12:44:18.549942 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:18.549557 2567 scope.go:117] "RemoveContainer" containerID="28c29dd02fa3c687c1fdd89f204238cb507453b3aeab8cf87042fc1d4550f554" Apr 19 12:44:18.559222 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:18.559199 2567 scope.go:117] "RemoveContainer" containerID="28c29dd02fa3c687c1fdd89f204238cb507453b3aeab8cf87042fc1d4550f554" Apr 19 12:44:18.559503 ip-10-0-142-55 kubenswrapper[2567]: E0419 12:44:18.559483 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28c29dd02fa3c687c1fdd89f204238cb507453b3aeab8cf87042fc1d4550f554\": container with ID starting with 28c29dd02fa3c687c1fdd89f204238cb507453b3aeab8cf87042fc1d4550f554 not found: ID does not exist" containerID="28c29dd02fa3c687c1fdd89f204238cb507453b3aeab8cf87042fc1d4550f554" Apr 19 12:44:18.559585 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:18.559511 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28c29dd02fa3c687c1fdd89f204238cb507453b3aeab8cf87042fc1d4550f554"} err="failed to get container status \"28c29dd02fa3c687c1fdd89f204238cb507453b3aeab8cf87042fc1d4550f554\": rpc error: code = 
NotFound desc = could not find container \"28c29dd02fa3c687c1fdd89f204238cb507453b3aeab8cf87042fc1d4550f554\": container with ID starting with 28c29dd02fa3c687c1fdd89f204238cb507453b3aeab8cf87042fc1d4550f554 not found: ID does not exist" Apr 19 12:44:18.564496 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:18.564471 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5fc5db7fdd-xz8x4"] Apr 19 12:44:18.572349 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:18.572327 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-5fc5db7fdd-xz8x4"] Apr 19 12:44:20.459708 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:20.459670 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fc11394-db33-4fd2-b247-03e309576df3" path="/var/lib/kubelet/pods/1fc11394-db33-4fd2-b247-03e309576df3/volumes" Apr 19 12:44:44.331921 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:44.331889 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:44:55.135290 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:44:55.135249 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:45:04.032911 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:45:04.032873 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:45:13.835260 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:45:13.835229 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:45:23.223097 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:45:23.223007 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:45:33.531261 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:45:33.531224 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:45:45.301910 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:45:45.301881 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnbtf_2b04d28c-3eb3-44e1-b431-d6b75f3850fe/console-operator/1.log" Apr 19 12:45:45.302903 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:45:45.302876 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnbtf_2b04d28c-3eb3-44e1-b431-d6b75f3850fe/console-operator/1.log" Apr 19 12:46:35.534428 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:46:35.534391 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:46:50.726003 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:46:50.725961 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:47:28.635797 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:47:28.635760 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:47:45.627891 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:47:45.627853 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:47:59.632140 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:47:59.632104 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:48:15.535082 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:48:15.535047 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:49:13.431057 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:49:13.431025 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:49:22.326177 
ip-10-0-142-55 kubenswrapper[2567]: I0419 12:49:22.326115 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:49:40.131793 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:49:40.131751 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:49:48.634248 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:49:48.634195 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:50:05.029217 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:50:05.029136 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:50:13.729725 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:50:13.729685 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:50:45.334187 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:50:45.334132 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnbtf_2b04d28c-3eb3-44e1-b431-d6b75f3850fe/console-operator/1.log" Apr 19 12:50:45.337864 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:50:45.337837 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnbtf_2b04d28c-3eb3-44e1-b431-d6b75f3850fe/console-operator/1.log" Apr 19 12:50:46.032190 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:50:46.032136 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:50:54.028632 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:50:54.028593 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:51:02.723785 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:51:02.723745 2567 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:51:10.722629 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:51:10.722589 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:51:19.624115 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:51:19.624076 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:51:36.025961 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:51:36.025923 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:51:49.428342 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:51:49.428309 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:52:35.633424 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:52:35.633390 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:52:43.939552 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:52:43.939521 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:52:53.229040 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:52:53.228959 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:53:00.830356 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:53:00.830318 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:53:10.438318 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:53:10.438278 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:53:18.729423 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:53:18.729379 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:53:27.931402 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:53:27.931367 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:53:36.230257 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:53:36.230222 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:53:45.126056 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:53:45.126016 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:53:53.730206 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:53:53.730139 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:54:02.526355 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:54:02.526318 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:54:11.828578 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:54:11.828545 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:54:20.732381 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:54:20.732342 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:54:28.537078 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:54:28.537040 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:54:37.328739 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:54:37.328702 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:54:46.327717 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:54:46.327680 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:54:55.431711 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:54:55.431671 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:55:02.931931 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:55:02.931889 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:55:45.369142 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:55:45.369111 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnbtf_2b04d28c-3eb3-44e1-b431-d6b75f3850fe/console-operator/1.log" Apr 19 12:55:45.374629 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:55:45.374602 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnbtf_2b04d28c-3eb3-44e1-b431-d6b75f3850fe/console-operator/1.log" Apr 19 12:57:19.027707 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:57:19.027668 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:57:26.634758 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:57:26.634675 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:57:51.035296 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:57:51.035262 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:57:55.630379 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:57:55.630341 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:58:05.767212 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:58:05.767173 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:58:16.225958 
ip-10-0-142-55 kubenswrapper[2567]: I0419 12:58:16.225916 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:58:25.327473 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:58:25.327435 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:58:35.641339 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:58:35.641302 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:58:44.529403 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:58:44.529365 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:58:55.026866 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:58:55.026783 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:59:02.928427 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:59:02.928388 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:59:14.227822 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:59:14.227777 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:59:23.628286 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:59:23.628243 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 12:59:55.631668 ip-10-0-142-55 kubenswrapper[2567]: I0419 12:59:55.631626 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:00:38.329242 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:00:38.329182 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:00:45.402401 ip-10-0-142-55 kubenswrapper[2567]: 
I0419 13:00:45.402373 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnbtf_2b04d28c-3eb3-44e1-b431-d6b75f3850fe/console-operator/1.log" Apr 19 13:00:45.409178 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:00:45.409146 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnbtf_2b04d28c-3eb3-44e1-b431-d6b75f3850fe/console-operator/1.log" Apr 19 13:00:47.046035 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:00:47.046000 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:00:55.928413 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:00:55.928379 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:01:03.829326 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:01:03.829289 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:01:13.631235 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:01:13.631191 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:01:23.230864 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:01:23.230826 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:01:32.630184 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:01:32.630125 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:01:40.636445 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:01:40.636404 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:01:48.432629 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:01:48.432593 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:01:56.219698 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:01:56.219617 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:02:05.527991 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:02:05.527945 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:02:15.528615 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:02:15.528572 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:02:33.526433 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:02:33.526396 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:02:42.029412 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:02:42.029375 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:02:50.524146 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:02:50.524113 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:02:58.229566 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:02:58.229529 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:03:15.626393 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:03:15.626356 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:03:24.038227 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:03:24.038121 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:03:32.528056 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:03:32.528018 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:03:41.527833 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:03:41.527793 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:03:50.732379 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:03:50.732334 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:03:58.823950 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:03:58.823911 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:04:07.934998 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:04:07.934958 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:04:21.125774 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:04:21.125737 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:04:29.324118 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:04:29.324082 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:04:41.231379 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:04:41.231344 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:04:50.225439 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:04:50.225400 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:04:57.919566 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:04:57.919485 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:05:05.726984 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:05:05.726946 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:05:12.918835 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:05:12.918798 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:05:30.527734 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:05:30.527694 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:05:40.026006 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:05:40.025968 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:05:45.435526 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:05:45.435499 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnbtf_2b04d28c-3eb3-44e1-b431-d6b75f3850fe/console-operator/1.log" Apr 19 13:05:45.443401 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:05:45.443377 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnbtf_2b04d28c-3eb3-44e1-b431-d6b75f3850fe/console-operator/1.log" Apr 19 13:05:47.933409 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:05:47.933375 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:05:55.827762 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:05:55.827731 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:06:20.022411 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:20.022373 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:06:31.106271 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:31.106232 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_authorino-7965cf5b7c-xrtcd_01efe377-9d07-46ed-bc1a-e9100937cb81/authorino/0.log" Apr 19 13:06:32.028686 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:32.028649 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fvgtn"] Apr 19 13:06:35.371086 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:35.371049 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-9ff869b6b-zfnmt_71198d00-829c-44c3-a38a-dfde254a8d7d/manager/0.log" Apr 19 13:06:36.189493 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:36.189462 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47_bf78201c-89b2-4894-8db8-f90bd134e341/pull/0.log" Apr 19 13:06:36.195708 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:36.195684 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47_bf78201c-89b2-4894-8db8-f90bd134e341/extract/0.log" Apr 19 13:06:36.201350 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:36.201334 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47_bf78201c-89b2-4894-8db8-f90bd134e341/util/0.log" Apr 19 13:06:36.307442 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:36.307379 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd_d21cef59-75f6-4713-81fd-7593fc3bb1c3/util/0.log" Apr 19 13:06:36.313824 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:36.313799 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd_d21cef59-75f6-4713-81fd-7593fc3bb1c3/pull/0.log" Apr 19 13:06:36.319747 ip-10-0-142-55 
kubenswrapper[2567]: I0419 13:06:36.319705 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd_d21cef59-75f6-4713-81fd-7593fc3bb1c3/extract/0.log" Apr 19 13:06:36.425141 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:36.425106 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s_09ae2876-b22e-4fb6-b7e0-d93d4d478df5/util/0.log" Apr 19 13:06:36.431340 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:36.431299 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s_09ae2876-b22e-4fb6-b7e0-d93d4d478df5/pull/0.log" Apr 19 13:06:36.437427 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:36.437408 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s_09ae2876-b22e-4fb6-b7e0-d93d4d478df5/extract/0.log" Apr 19 13:06:36.543174 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:36.543077 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv_2a498dcf-155d-4306-9c8f-18c1f4dde823/util/0.log" Apr 19 13:06:36.549198 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:36.549176 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv_2a498dcf-155d-4306-9c8f-18c1f4dde823/pull/0.log" Apr 19 13:06:36.555222 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:36.555197 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv_2a498dcf-155d-4306-9c8f-18c1f4dde823/extract/0.log" Apr 19 13:06:36.668079 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:36.668049 2567 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kuadrant-system_authorino-7965cf5b7c-xrtcd_01efe377-9d07-46ed-bc1a-e9100937cb81/authorino/0.log" Apr 19 13:06:37.013268 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:37.013242 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-zs9pb_036649a7-fe91-49cc-ba36-5caaceef8d42/kuadrant-console-plugin/0.log" Apr 19 13:06:37.386368 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:37.386283 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-fvgtn_f144adca-9201-4b23-b7de-f5e209e3797f/limitador/0.log" Apr 19 13:06:38.047577 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:38.047546 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-874cdfcc7-8tz49_c30d05da-556b-4dc1-9029-9cdc54c02ca8/kube-auth-proxy/0.log" Apr 19 13:06:39.236733 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:39.236695 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6_b45e71f2-9724-437b-8ce5-78daa954cca5/storage-initializer/0.log" Apr 19 13:06:39.243702 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:39.243675 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-9zfk6_b45e71f2-9724-437b-8ce5-78daa954cca5/main/0.log" Apr 19 13:06:42.890715 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:42.890680 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f5849/must-gather-zjq9d"] Apr 19 13:06:42.891339 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:42.891319 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1fc11394-db33-4fd2-b247-03e309576df3" containerName="authorino" Apr 19 13:06:42.891339 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:42.891341 2567 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1fc11394-db33-4fd2-b247-03e309576df3" containerName="authorino" Apr 19 13:06:42.891498 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:42.891445 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="1fc11394-db33-4fd2-b247-03e309576df3" containerName="authorino" Apr 19 13:06:42.894828 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:42.894811 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f5849/must-gather-zjq9d" Apr 19 13:06:42.896969 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:42.896947 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-f5849\"/\"kube-root-ca.crt\"" Apr 19 13:06:42.897081 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:42.897034 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-f5849\"/\"openshift-service-ca.crt\"" Apr 19 13:06:42.897781 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:42.897760 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-f5849\"/\"default-dockercfg-fxnnq\"" Apr 19 13:06:42.908452 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:42.908426 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f5849/must-gather-zjq9d"] Apr 19 13:06:43.049935 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:43.049899 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb4nk\" (UniqueName: \"kubernetes.io/projected/71877add-170d-40a2-a729-efe68e03ab13-kube-api-access-zb4nk\") pod \"must-gather-zjq9d\" (UID: \"71877add-170d-40a2-a729-efe68e03ab13\") " pod="openshift-must-gather-f5849/must-gather-zjq9d" Apr 19 13:06:43.050140 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:43.049988 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/71877add-170d-40a2-a729-efe68e03ab13-must-gather-output\") pod \"must-gather-zjq9d\" (UID: \"71877add-170d-40a2-a729-efe68e03ab13\") " pod="openshift-must-gather-f5849/must-gather-zjq9d" Apr 19 13:06:43.151092 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:43.151061 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zb4nk\" (UniqueName: \"kubernetes.io/projected/71877add-170d-40a2-a729-efe68e03ab13-kube-api-access-zb4nk\") pod \"must-gather-zjq9d\" (UID: \"71877add-170d-40a2-a729-efe68e03ab13\") " pod="openshift-must-gather-f5849/must-gather-zjq9d" Apr 19 13:06:43.151276 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:43.151110 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/71877add-170d-40a2-a729-efe68e03ab13-must-gather-output\") pod \"must-gather-zjq9d\" (UID: \"71877add-170d-40a2-a729-efe68e03ab13\") " pod="openshift-must-gather-f5849/must-gather-zjq9d" Apr 19 13:06:43.151436 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:43.151410 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/71877add-170d-40a2-a729-efe68e03ab13-must-gather-output\") pod \"must-gather-zjq9d\" (UID: \"71877add-170d-40a2-a729-efe68e03ab13\") " pod="openshift-must-gather-f5849/must-gather-zjq9d" Apr 19 13:06:43.158465 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:43.158440 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb4nk\" (UniqueName: \"kubernetes.io/projected/71877add-170d-40a2-a729-efe68e03ab13-kube-api-access-zb4nk\") pod \"must-gather-zjq9d\" (UID: \"71877add-170d-40a2-a729-efe68e03ab13\") " pod="openshift-must-gather-f5849/must-gather-zjq9d" Apr 19 13:06:43.204310 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:43.204274 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5849/must-gather-zjq9d" Apr 19 13:06:43.331063 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:43.331021 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f5849/must-gather-zjq9d"] Apr 19 13:06:43.333903 ip-10-0-142-55 kubenswrapper[2567]: W0419 13:06:43.333871 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71877add_170d_40a2_a729_efe68e03ab13.slice/crio-f3d1204f8bce8f3da4a3609053292dac4e0c2b0bc68f3a333f726f241f37e733 WatchSource:0}: Error finding container f3d1204f8bce8f3da4a3609053292dac4e0c2b0bc68f3a333f726f241f37e733: Status 404 returned error can't find the container with id f3d1204f8bce8f3da4a3609053292dac4e0c2b0bc68f3a333f726f241f37e733 Apr 19 13:06:43.335733 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:43.335709 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 19 13:06:43.842650 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:43.842610 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5849/must-gather-zjq9d" event={"ID":"71877add-170d-40a2-a729-efe68e03ab13","Type":"ContainerStarted","Data":"f3d1204f8bce8f3da4a3609053292dac4e0c2b0bc68f3a333f726f241f37e733"} Apr 19 13:06:44.849086 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:44.849033 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5849/must-gather-zjq9d" event={"ID":"71877add-170d-40a2-a729-efe68e03ab13","Type":"ContainerStarted","Data":"9ecbe65b3e0d79091feb004819e3b2e9b682b49d3ca66d62f3327dd7c9f0ab3a"} Apr 19 13:06:44.849086 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:44.849070 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5849/must-gather-zjq9d" 
event={"ID":"71877add-170d-40a2-a729-efe68e03ab13","Type":"ContainerStarted","Data":"44ff15425a97dac3a7db519b3e54876ab3123c00d2107b983ed72b58ee6ec36c"} Apr 19 13:06:44.864465 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:44.864375 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f5849/must-gather-zjq9d" podStartSLOduration=1.850059226 podStartE2EDuration="2.864359755s" podCreationTimestamp="2026-04-19 13:06:42 +0000 UTC" firstStartedPulling="2026-04-19 13:06:43.335840356 +0000 UTC m=+2171.489773826" lastFinishedPulling="2026-04-19 13:06:44.350140878 +0000 UTC m=+2172.504074355" observedRunningTime="2026-04-19 13:06:44.861229808 +0000 UTC m=+2173.015163287" watchObservedRunningTime="2026-04-19 13:06:44.864359755 +0000 UTC m=+2173.018293288" Apr 19 13:06:45.975857 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:45.975824 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-qlgx9_ddd4693b-4c32-466e-be78-8808310a5f1f/global-pull-secret-syncer/0.log" Apr 19 13:06:46.019288 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:46.019257 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-jtfsx_c2abf95c-a107-4a6c-93cd-802a48e2976c/konnectivity-agent/0.log" Apr 19 13:06:46.136681 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:46.136650 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-55.ec2.internal_023b8927dfe46c9ec0872b191f59109d/haproxy/0.log" Apr 19 13:06:49.782564 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:49.782529 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47_bf78201c-89b2-4894-8db8-f90bd134e341/extract/0.log" Apr 19 13:06:49.806580 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:49.806551 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47_bf78201c-89b2-4894-8db8-f90bd134e341/util/0.log" Apr 19 13:06:49.827579 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:49.827513 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nbl47_bf78201c-89b2-4894-8db8-f90bd134e341/pull/0.log" Apr 19 13:06:49.853898 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:49.853851 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd_d21cef59-75f6-4713-81fd-7593fc3bb1c3/extract/0.log" Apr 19 13:06:49.879943 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:49.879909 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd_d21cef59-75f6-4713-81fd-7593fc3bb1c3/util/0.log" Apr 19 13:06:49.903353 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:49.903319 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nj7hd_d21cef59-75f6-4713-81fd-7593fc3bb1c3/pull/0.log" Apr 19 13:06:49.938066 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:49.938033 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s_09ae2876-b22e-4fb6-b7e0-d93d4d478df5/extract/0.log" Apr 19 13:06:49.964314 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:49.964287 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s_09ae2876-b22e-4fb6-b7e0-d93d4d478df5/util/0.log" Apr 19 13:06:49.985436 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:49.985411 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73nnx6s_09ae2876-b22e-4fb6-b7e0-d93d4d478df5/pull/0.log" Apr 19 13:06:50.016634 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:50.016602 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv_2a498dcf-155d-4306-9c8f-18c1f4dde823/extract/0.log" Apr 19 13:06:50.042272 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:50.042172 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv_2a498dcf-155d-4306-9c8f-18c1f4dde823/util/0.log" Apr 19 13:06:50.063605 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:50.063558 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w65bv_2a498dcf-155d-4306-9c8f-18c1f4dde823/pull/0.log" Apr 19 13:06:50.334254 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:50.334173 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7965cf5b7c-xrtcd_01efe377-9d07-46ed-bc1a-e9100937cb81/authorino/0.log" Apr 19 13:06:50.412768 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:50.412733 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-zs9pb_036649a7-fe91-49cc-ba36-5caaceef8d42/kuadrant-console-plugin/0.log" Apr 19 13:06:50.541447 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:50.541417 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-fvgtn_f144adca-9201-4b23-b7de-f5e209e3797f/limitador/0.log" Apr 19 13:06:51.968745 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:51.968707 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_2bb33194-d022-4b4f-8510-d23f793f4a39/alertmanager/0.log" Apr 19 
13:06:51.996721 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:51.996688 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_2bb33194-d022-4b4f-8510-d23f793f4a39/config-reloader/0.log" Apr 19 13:06:52.021376 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:52.021277 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_2bb33194-d022-4b4f-8510-d23f793f4a39/kube-rbac-proxy-web/0.log" Apr 19 13:06:52.047375 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:52.047345 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_2bb33194-d022-4b4f-8510-d23f793f4a39/kube-rbac-proxy/0.log" Apr 19 13:06:52.068850 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:52.068808 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_2bb33194-d022-4b4f-8510-d23f793f4a39/kube-rbac-proxy-metric/0.log" Apr 19 13:06:52.094390 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:52.094364 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_2bb33194-d022-4b4f-8510-d23f793f4a39/prom-label-proxy/0.log" Apr 19 13:06:52.115804 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:52.115767 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_2bb33194-d022-4b4f-8510-d23f793f4a39/init-config-reloader/0.log" Apr 19 13:06:52.157887 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:52.157840 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-6mzzf_418c073b-0032-49b0-84ad-cf233dc4778b/cluster-monitoring-operator/0.log" Apr 19 13:06:52.182772 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:52.182739 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-bjprv_12d3e553-2bcd-4923-9074-65406b6c1644/kube-state-metrics/0.log" Apr 19 13:06:52.204699 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:52.204667 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-bjprv_12d3e553-2bcd-4923-9074-65406b6c1644/kube-rbac-proxy-main/0.log" Apr 19 13:06:52.228488 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:52.228448 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-bjprv_12d3e553-2bcd-4923-9074-65406b6c1644/kube-rbac-proxy-self/0.log" Apr 19 13:06:52.278825 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:52.278744 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-22cmq_7e516893-242c-4e32-94d8-70ccb92ef46e/monitoring-plugin/0.log" Apr 19 13:06:52.392679 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:52.392641 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gbc2d_35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6/node-exporter/0.log" Apr 19 13:06:52.414061 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:52.414002 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gbc2d_35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6/kube-rbac-proxy/0.log" Apr 19 13:06:52.435445 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:52.435414 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gbc2d_35ba0b8a-9d06-4de7-9d84-f266f1f8e9a6/init-textfile/0.log" Apr 19 13:06:52.538690 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:52.538610 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-pkwpt_07cd686e-db42-41d2-8704-20287a4d6ba5/kube-rbac-proxy-main/0.log" Apr 19 13:06:52.563658 ip-10-0-142-55 
kubenswrapper[2567]: I0419 13:06:52.563624 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-pkwpt_07cd686e-db42-41d2-8704-20287a4d6ba5/kube-rbac-proxy-self/0.log" Apr 19 13:06:52.584735 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:52.584705 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-pkwpt_07cd686e-db42-41d2-8704-20287a4d6ba5/openshift-state-metrics/0.log" Apr 19 13:06:52.772651 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:52.772617 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-k5dpj_bdd21a10-2e10-4f33-af8b-8504779ff325/prometheus-operator/0.log" Apr 19 13:06:52.792804 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:52.792728 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-k5dpj_bdd21a10-2e10-4f33-af8b-8504779ff325/kube-rbac-proxy/0.log" Apr 19 13:06:52.824571 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:52.824546 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-zwq6n_39ea9fea-2574-42e5-ad5a-9622c776c3f7/prometheus-operator-admission-webhook/0.log" Apr 19 13:06:52.854125 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:52.854097 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5688fbc97b-n56sh_e1c35bac-a14c-4d36-8232-340f4f8b34be/telemeter-client/0.log" Apr 19 13:06:52.883765 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:52.883674 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5688fbc97b-n56sh_e1c35bac-a14c-4d36-8232-340f4f8b34be/reload/0.log" Apr 19 13:06:52.907086 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:52.907060 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_telemeter-client-5688fbc97b-n56sh_e1c35bac-a14c-4d36-8232-340f4f8b34be/kube-rbac-proxy/0.log" Apr 19 13:06:54.119906 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:54.119881 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-bns5n_e88a41fd-9d7d-457b-af51-169ee562d266/networking-console-plugin/0.log" Apr 19 13:06:54.626710 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:54.626636 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnbtf_2b04d28c-3eb3-44e1-b431-d6b75f3850fe/console-operator/1.log" Apr 19 13:06:54.633072 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:54.633043 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lnbtf_2b04d28c-3eb3-44e1-b431-d6b75f3850fe/console-operator/2.log" Apr 19 13:06:54.745818 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:54.745775 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f5849/perf-node-gather-daemonset-tggm8"] Apr 19 13:06:54.753318 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:54.753290 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5849/perf-node-gather-daemonset-tggm8" Apr 19 13:06:54.756275 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:54.756247 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f5849/perf-node-gather-daemonset-tggm8"] Apr 19 13:06:54.769986 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:54.769511 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f47d88a6-51b8-43c6-8db3-8b390b35f806-podres\") pod \"perf-node-gather-daemonset-tggm8\" (UID: \"f47d88a6-51b8-43c6-8db3-8b390b35f806\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-tggm8" Apr 19 13:06:54.769986 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:54.769562 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f47d88a6-51b8-43c6-8db3-8b390b35f806-sys\") pod \"perf-node-gather-daemonset-tggm8\" (UID: \"f47d88a6-51b8-43c6-8db3-8b390b35f806\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-tggm8" Apr 19 13:06:54.769986 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:54.769603 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f47d88a6-51b8-43c6-8db3-8b390b35f806-lib-modules\") pod \"perf-node-gather-daemonset-tggm8\" (UID: \"f47d88a6-51b8-43c6-8db3-8b390b35f806\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-tggm8" Apr 19 13:06:54.769986 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:54.769637 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdxmn\" (UniqueName: \"kubernetes.io/projected/f47d88a6-51b8-43c6-8db3-8b390b35f806-kube-api-access-wdxmn\") pod \"perf-node-gather-daemonset-tggm8\" (UID: 
\"f47d88a6-51b8-43c6-8db3-8b390b35f806\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-tggm8" Apr 19 13:06:54.769986 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:54.769667 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f47d88a6-51b8-43c6-8db3-8b390b35f806-proc\") pod \"perf-node-gather-daemonset-tggm8\" (UID: \"f47d88a6-51b8-43c6-8db3-8b390b35f806\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-tggm8" Apr 19 13:06:54.870995 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:54.870964 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f47d88a6-51b8-43c6-8db3-8b390b35f806-podres\") pod \"perf-node-gather-daemonset-tggm8\" (UID: \"f47d88a6-51b8-43c6-8db3-8b390b35f806\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-tggm8" Apr 19 13:06:54.871199 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:54.871005 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f47d88a6-51b8-43c6-8db3-8b390b35f806-sys\") pod \"perf-node-gather-daemonset-tggm8\" (UID: \"f47d88a6-51b8-43c6-8db3-8b390b35f806\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-tggm8" Apr 19 13:06:54.871199 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:54.871036 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f47d88a6-51b8-43c6-8db3-8b390b35f806-lib-modules\") pod \"perf-node-gather-daemonset-tggm8\" (UID: \"f47d88a6-51b8-43c6-8db3-8b390b35f806\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-tggm8" Apr 19 13:06:54.871199 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:54.871062 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wdxmn\" (UniqueName: 
\"kubernetes.io/projected/f47d88a6-51b8-43c6-8db3-8b390b35f806-kube-api-access-wdxmn\") pod \"perf-node-gather-daemonset-tggm8\" (UID: \"f47d88a6-51b8-43c6-8db3-8b390b35f806\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-tggm8"
Apr 19 13:06:54.871199 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:54.871083 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f47d88a6-51b8-43c6-8db3-8b390b35f806-proc\") pod \"perf-node-gather-daemonset-tggm8\" (UID: \"f47d88a6-51b8-43c6-8db3-8b390b35f806\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-tggm8"
Apr 19 13:06:54.871199 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:54.871111 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f47d88a6-51b8-43c6-8db3-8b390b35f806-sys\") pod \"perf-node-gather-daemonset-tggm8\" (UID: \"f47d88a6-51b8-43c6-8db3-8b390b35f806\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-tggm8"
Apr 19 13:06:54.871199 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:54.871132 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f47d88a6-51b8-43c6-8db3-8b390b35f806-podres\") pod \"perf-node-gather-daemonset-tggm8\" (UID: \"f47d88a6-51b8-43c6-8db3-8b390b35f806\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-tggm8"
Apr 19 13:06:54.871199 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:54.871139 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f47d88a6-51b8-43c6-8db3-8b390b35f806-proc\") pod \"perf-node-gather-daemonset-tggm8\" (UID: \"f47d88a6-51b8-43c6-8db3-8b390b35f806\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-tggm8"
Apr 19 13:06:54.871674 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:54.871312 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f47d88a6-51b8-43c6-8db3-8b390b35f806-lib-modules\") pod \"perf-node-gather-daemonset-tggm8\" (UID: \"f47d88a6-51b8-43c6-8db3-8b390b35f806\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-tggm8"
Apr 19 13:06:54.879290 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:54.879220 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdxmn\" (UniqueName: \"kubernetes.io/projected/f47d88a6-51b8-43c6-8db3-8b390b35f806-kube-api-access-wdxmn\") pod \"perf-node-gather-daemonset-tggm8\" (UID: \"f47d88a6-51b8-43c6-8db3-8b390b35f806\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-tggm8"
Apr 19 13:06:55.066287 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:55.066261 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f5849/perf-node-gather-daemonset-tggm8"
Apr 19 13:06:55.106097 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:55.106066 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-rk698_3df8278a-cafc-490d-ad79-bf55ce74b38e/download-server/0.log"
Apr 19 13:06:55.241900 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:55.240790 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f5849/perf-node-gather-daemonset-tggm8"]
Apr 19 13:06:55.245622 ip-10-0-142-55 kubenswrapper[2567]: W0419 13:06:55.243029 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf47d88a6_51b8_43c6_8db3_8b390b35f806.slice/crio-56df5f76c975cbb3635c5ec27e608f3be2369239c533bc9549b2eccc33b76684 WatchSource:0}: Error finding container 56df5f76c975cbb3635c5ec27e608f3be2369239c533bc9549b2eccc33b76684: Status 404 returned error can't find the container with id 56df5f76c975cbb3635c5ec27e608f3be2369239c533bc9549b2eccc33b76684
Apr 19 13:06:55.923248 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:55.923211 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5849/perf-node-gather-daemonset-tggm8" event={"ID":"f47d88a6-51b8-43c6-8db3-8b390b35f806","Type":"ContainerStarted","Data":"edf11301a5f2b56941db0a7370756a46befe9d27d23fd4a33b7684f66383f324"}
Apr 19 13:06:55.923248 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:55.923252 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5849/perf-node-gather-daemonset-tggm8" event={"ID":"f47d88a6-51b8-43c6-8db3-8b390b35f806","Type":"ContainerStarted","Data":"56df5f76c975cbb3635c5ec27e608f3be2369239c533bc9549b2eccc33b76684"}
Apr 19 13:06:55.923466 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:55.923340 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-f5849/perf-node-gather-daemonset-tggm8"
Apr 19 13:06:55.951915 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:55.951850 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f5849/perf-node-gather-daemonset-tggm8" podStartSLOduration=1.951830673 podStartE2EDuration="1.951830673s" podCreationTimestamp="2026-04-19 13:06:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 13:06:55.946823779 +0000 UTC m=+2184.100757261" watchObservedRunningTime="2026-04-19 13:06:55.951830673 +0000 UTC m=+2184.105764155"
Apr 19 13:06:56.431195 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:56.431131 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nb68t_c83beaf5-2d24-4163-855a-f4c6d55b0311/dns/0.log"
Apr 19 13:06:56.451622 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:56.451595 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nb68t_c83beaf5-2d24-4163-855a-f4c6d55b0311/kube-rbac-proxy/0.log"
Apr 19 13:06:56.563151 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:56.563128 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rd647_d3c440e9-6d1f-46b9-b5c1-3ee825d6e2a4/dns-node-resolver/0.log"
Apr 19 13:06:57.072921 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:57.072880 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-n7t66_dc6754e8-c6ea-48e0-9ad2-435a13e54b61/node-ca/0.log"
Apr 19 13:06:57.968136 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:57.968104 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-874cdfcc7-8tz49_c30d05da-556b-4dc1-9029-9cdc54c02ca8/kube-auth-proxy/0.log"
Apr 19 13:06:58.554350 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:58.554316 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-29mkc_f68c88a5-5b83-4fd0-92df-327974a7cc96/serve-healthcheck-canary/0.log"
Apr 19 13:06:59.100028 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:59.099984 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-trxd7_b78d3726-337a-4dd9-9db1-1835392e8376/kube-rbac-proxy/0.log"
Apr 19 13:06:59.120213 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:59.120189 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-trxd7_b78d3726-337a-4dd9-9db1-1835392e8376/exporter/0.log"
Apr 19 13:06:59.140868 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:06:59.140839 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-trxd7_b78d3726-337a-4dd9-9db1-1835392e8376/extractor/0.log"
Apr 19 13:07:01.212924 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:07:01.212889 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-9ff869b6b-zfnmt_71198d00-829c-44c3-a38a-dfde254a8d7d/manager/0.log"
Apr 19 13:07:01.937889 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:07:01.937862 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-f5849/perf-node-gather-daemonset-tggm8"
Apr 19 13:07:02.350596 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:07:02.350519 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-844f57dbd6-fbxvz_afb23281-487e-4df8-b8e5-d550ae2013f7/manager/0.log"
Apr 19 13:07:07.091278 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:07:07.091218 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-9vjds_ac3748db-049f-4448-a55c-ed08dd605a59/kube-storage-version-migrator-operator/1.log"
Apr 19 13:07:07.092257 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:07:07.092235 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-9vjds_ac3748db-049f-4448-a55c-ed08dd605a59/kube-storage-version-migrator-operator/0.log"
Apr 19 13:07:08.032360 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:07:08.032333 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6xjjk_3c634550-95fa-4405-a478-4ce4ac61b034/kube-multus-additional-cni-plugins/0.log"
Apr 19 13:07:08.055652 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:07:08.055627 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6xjjk_3c634550-95fa-4405-a478-4ce4ac61b034/egress-router-binary-copy/0.log"
Apr 19 13:07:08.077370 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:07:08.077346 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6xjjk_3c634550-95fa-4405-a478-4ce4ac61b034/cni-plugins/0.log"
Apr 19 13:07:08.103504 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:07:08.103478 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6xjjk_3c634550-95fa-4405-a478-4ce4ac61b034/bond-cni-plugin/0.log"
Apr 19 13:07:08.127881 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:07:08.127857 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6xjjk_3c634550-95fa-4405-a478-4ce4ac61b034/routeoverride-cni/0.log"
Apr 19 13:07:08.154890 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:07:08.154864 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6xjjk_3c634550-95fa-4405-a478-4ce4ac61b034/whereabouts-cni-bincopy/0.log"
Apr 19 13:07:08.198312 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:07:08.198275 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6xjjk_3c634550-95fa-4405-a478-4ce4ac61b034/whereabouts-cni/0.log"
Apr 19 13:07:08.865576 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:07:08.865542 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jcjln_0badcc58-d388-42cd-aff8-8b79d1693727/kube-multus/0.log"
Apr 19 13:07:08.983968 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:07:08.983939 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-98bqr_720d8932-1617-465d-a213-ebb1e99e6bc6/network-metrics-daemon/0.log"
Apr 19 13:07:09.022563 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:07:09.022536 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-98bqr_720d8932-1617-465d-a213-ebb1e99e6bc6/kube-rbac-proxy/0.log"
Apr 19 13:07:10.408790 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:07:10.408762 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wh942_f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9/ovn-controller/0.log"
Apr 19 13:07:10.436320 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:07:10.436286 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wh942_f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9/ovn-acl-logging/0.log"
Apr 19 13:07:10.454426 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:07:10.454399 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wh942_f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9/kube-rbac-proxy-node/0.log"
Apr 19 13:07:10.474315 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:07:10.474280 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wh942_f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9/kube-rbac-proxy-ovn-metrics/0.log"
Apr 19 13:07:10.491801 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:07:10.491777 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wh942_f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9/northd/0.log"
Apr 19 13:07:10.512886 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:07:10.512857 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wh942_f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9/nbdb/0.log"
Apr 19 13:07:10.537989 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:07:10.537961 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wh942_f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9/sbdb/0.log"
Apr 19 13:07:10.654436 ip-10-0-142-55 kubenswrapper[2567]: I0419 13:07:10.654401 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wh942_f7f5fa31-f41a-42f0-8ea0-450aafc2a4a9/ovnkube-controller/0.log"