Apr 17 14:33:53.511679 ip-10-0-129-134 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 14:33:53.982769 ip-10-0-129-134 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:33:53.982769 ip-10-0-129-134 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 14:33:53.982769 ip-10-0-129-134 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:33:53.982769 ip-10-0-129-134 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 14:33:53.982769 ip-10-0-129-134 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
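The deprecation warnings above say these flags belong in the file passed via the kubelet's --config flag (the FLAG dump later in this log shows --config="/etc/kubernetes/kubelet.conf"). A minimal sketch of such a KubeletConfiguration, written to a hypothetical scratch path; the runtime endpoint is copied from this log's FLAG dump, while the reserved and eviction values are placeholders, not values from this node:

```shell
# Sketch only: generate an example KubeletConfiguration covering the
# deprecated flags from the warnings above (v1beta1 field names).
cat > /tmp/kubelet.conf.example <<'EOF'
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (value from this log's FLAG dump)
containerRuntimeEndpoint: /var/run/crio/crio.sock
# replaces --volume-plugin-dir (placeholder path)
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (placeholder values)
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction settings (placeholder)
evictionHard:
  memory.available: 100Mi
EOF
```

The actual file on the node may differ; this only illustrates which config-file fields correspond to the deprecated flags.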
Apr 17 14:33:53.986961 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.986860 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 14:33:53.992867 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992840 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:33:53.992867 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992863 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:33:53.992867 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992867 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:33:53.992867 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992870 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:33:53.992867 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992875 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:33:53.993079 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992881 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:33:53.993079 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992885 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:33:53.993079 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992888 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:33:53.993079 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992892 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:33:53.993079 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992896 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:33:53.993079 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992899 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:33:53.993079 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992902 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:33:53.993079 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992905 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:33:53.993079 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992907 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:33:53.993079 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992910 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:33:53.993079 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992913 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:33:53.993079 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992915 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:33:53.993079 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992918 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:33:53.993079 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992920 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:33:53.993079 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992923 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:33:53.993079 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992926 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:33:53.993079 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992928 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:33:53.993079 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992931 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:33:53.993079 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992934 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:33:53.993079 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992936 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:33:53.993574 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992939 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:33:53.993574 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992942 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:33:53.993574 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992945 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:33:53.993574 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992947 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:33:53.993574 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992950 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:33:53.993574 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992953 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:33:53.993574 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992955 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:33:53.993574 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992958 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:33:53.993574 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992960 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:33:53.993574 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992963 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:33:53.993574 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992966 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:33:53.993574 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992968 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:33:53.993574 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992970 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:33:53.993574 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992974 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:33:53.993574 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992976 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:33:53.993574 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992979 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:33:53.993574 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992982 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:33:53.993574 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992984 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:33:53.993574 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992987 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:33:53.993574 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992990 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:33:53.994094 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992993 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:33:53.994094 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992995 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:33:53.994094 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.992998 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:33:53.994094 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993000 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:33:53.994094 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993003 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:33:53.994094 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993006 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:33:53.994094 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993009 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:33:53.994094 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993012 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:33:53.994094 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993015 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:33:53.994094 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993019 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:33:53.994094 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993024 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:33:53.994094 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993028 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:33:53.994094 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993032 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:33:53.994094 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993035 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:33:53.994094 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993038 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:33:53.994094 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993041 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:33:53.994094 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993043 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:33:53.994094 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993046 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:33:53.994094 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993049 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:33:53.994094 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993052 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:33:53.994613 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993055 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:33:53.994613 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993057 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:33:53.994613 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993060 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:33:53.994613 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993062 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:33:53.994613 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993065 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:33:53.994613 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993068 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:33:53.994613 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993071 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:33:53.994613 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993075 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:33:53.994613 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993077 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:33:53.994613 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993080 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:33:53.994613 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993083 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:33:53.994613 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993086 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:33:53.994613 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993088 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:33:53.994613 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993093 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:33:53.994613 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993096 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:33:53.994613 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993099 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:33:53.994613 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993102 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:33:53.994613 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993105 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:33:53.994613 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993107 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:33:53.994613 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993110 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:33:53.995092 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993112 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:33:53.995092 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993569 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:33:53.995092 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993575 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:33:53.995092 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993579 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:33:53.995092 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993582 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:33:53.995092 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993585 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:33:53.995092 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993588 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:33:53.995092 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993591 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:33:53.995092 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993594 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:33:53.995092 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993598 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:33:53.995092 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993600 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:33:53.995092 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993604 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:33:53.995092 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993606 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:33:53.995092 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993609 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:33:53.995092 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993612 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:33:53.995092 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993614 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:33:53.995092 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993617 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:33:53.995092 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993621 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:33:53.995092 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993623 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:33:53.995092 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993626 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:33:53.995600 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993629 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:33:53.995600 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993631 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:33:53.995600 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993634 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:33:53.995600 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993637 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:33:53.995600 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993639 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:33:53.995600 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993643 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:33:53.995600 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993645 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:33:53.995600 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993648 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:33:53.995600 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993650 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:33:53.995600 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993653 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:33:53.995600 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993655 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:33:53.995600 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993659 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:33:53.995600 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993663 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:33:53.995600 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993666 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:33:53.995600 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993669 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:33:53.995600 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993671 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:33:53.995600 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993674 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:33:53.995600 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993678 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:33:53.995600 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993681 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:33:53.996154 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993683 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:33:53.996154 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993686 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:33:53.996154 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993688 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:33:53.996154 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993692 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:33:53.996154 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993696 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:33:53.996154 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993699 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:33:53.996154 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993701 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:33:53.996154 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993704 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:33:53.996154 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993707 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:33:53.996154 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993710 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:33:53.996154 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993714 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:33:53.996154 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993717 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:33:53.996154 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993720 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:33:53.996154 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993723 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:33:53.996154 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993726 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:33:53.996154 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993729 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:33:53.996154 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993731 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:33:53.996154 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993734 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:33:53.996154 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993737 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:33:53.996154 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993740 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:33:53.996704 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993743 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:33:53.996704 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993745 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:33:53.996704 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993748 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:33:53.996704 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993751 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:33:53.996704 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993753 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:33:53.996704 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993756 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:33:53.996704 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993760 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:33:53.996704 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993762 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:33:53.996704 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993765 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:33:53.996704 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993767 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:33:53.996704 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993770 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:33:53.996704 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993773 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:33:53.996704 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993775 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:33:53.996704 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993778 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:33:53.996704 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993781 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:33:53.996704 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993784 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:33:53.996704 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993787 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:33:53.996704 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993789 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:33:53.996704 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993792 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:33:53.996704 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993794 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:33:53.997207 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993797 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:33:53.997207 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993800 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:33:53.997207 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993802 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:33:53.997207 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993805 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:33:53.997207 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993808 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:33:53.997207 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993810 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:33:53.997207 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993813 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:33:53.997207 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.993815 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:33:53.997207 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995123 2570 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 14:33:53.997207 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995134 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 14:33:53.997207 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995141 2570 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 14:33:53.997207 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995150 2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 14:33:53.997207 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995158 2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 14:33:53.997207 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995161 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 14:33:53.997207 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995166 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 14:33:53.997207 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995171 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 14:33:53.997207 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995174 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 14:33:53.997207 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995178 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 14:33:53.997207 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995181 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 14:33:53.997207 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995185 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 14:33:53.997207 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995188 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 14:33:53.997207 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995191 2570 flags.go:64] FLAG: --cgroup-root=""
Apr 17 14:33:53.997759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995194 2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 14:33:53.997759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995197 2570 flags.go:64] FLAG: --client-ca-file=""
Apr 17 14:33:53.997759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995201 2570 flags.go:64] FLAG: --cloud-config=""
Apr 17 14:33:53.997759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995203 2570 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 14:33:53.997759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995207 2570 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 14:33:53.997759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995212 2570 flags.go:64] FLAG: --cluster-domain=""
Apr 17 14:33:53.997759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995215 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 14:33:53.997759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995218 2570 flags.go:64] FLAG: --config-dir=""
Apr 17 14:33:53.997759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995221 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 14:33:53.997759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995225 2570 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 14:33:53.997759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995229 2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 14:33:53.997759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995232 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 14:33:53.997759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995251 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 14:33:53.997759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995256 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 14:33:53.997759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995259 2570 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 14:33:53.997759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995262 2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 14:33:53.997759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995265 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 14:33:53.997759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995268 2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 14:33:53.997759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995271 2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 14:33:53.997759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995276 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 14:33:53.997759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995279 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 14:33:53.997759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995282 2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 14:33:53.997759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995285 2570 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 14:33:53.997759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995289 2570 flags.go:64] FLAG: --enable-server="true"
Apr 17 14:33:53.997759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995292 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 14:33:53.998360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995296 2570 flags.go:64] FLAG: --event-burst="100"
Apr 17 14:33:53.998360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995299 2570 flags.go:64] FLAG: --event-qps="50"
Apr 17 14:33:53.998360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995303 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 14:33:53.998360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995306 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 14:33:53.998360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995309 2570 flags.go:64] FLAG: --eviction-hard=""
Apr 17 14:33:53.998360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995313 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 14:33:53.998360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995316 2570 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 14:33:53.998360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995319 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 14:33:53.998360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995322 2570 flags.go:64] FLAG: --eviction-soft=""
Apr 17 14:33:53.998360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995325 2570 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 14:33:53.998360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995328 2570 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 14:33:53.998360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995331 2570 flags.go:64] FLAG:
--experimental-allocatable-ignore-eviction="false" Apr 17 14:33:53.998360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995334 2570 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 14:33:53.998360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995337 2570 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 14:33:53.998360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995340 2570 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 14:33:53.998360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995343 2570 flags.go:64] FLAG: --feature-gates="" Apr 17 14:33:53.998360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995347 2570 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 14:33:53.998360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995349 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 14:33:53.998360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995352 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 14:33:53.998360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995356 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 14:33:53.998360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995359 2570 flags.go:64] FLAG: --healthz-port="10248" Apr 17 14:33:53.998360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995362 2570 flags.go:64] FLAG: --help="false" Apr 17 14:33:53.998360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995365 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-129-134.ec2.internal" Apr 17 14:33:53.998360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995368 2570 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 14:33:53.998360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995371 2570 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 14:33:53.998977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995374 2570 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 14:33:53.998977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995378 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 14:33:53.998977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995381 2570 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 14:33:53.998977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995384 2570 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 14:33:53.998977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995387 2570 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 14:33:53.998977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995389 2570 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 14:33:53.998977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995393 2570 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 14:33:53.998977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995396 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 14:33:53.998977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995399 2570 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 14:33:53.998977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995402 2570 flags.go:64] FLAG: --kube-reserved="" Apr 17 14:33:53.998977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995404 2570 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 14:33:53.998977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995408 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 14:33:53.998977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995411 2570 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 14:33:53.998977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995414 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 14:33:53.998977 ip-10-0-129-134 
kubenswrapper[2570]: I0417 14:33:53.995417 2570 flags.go:64] FLAG: --lock-file="" Apr 17 14:33:53.998977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995419 2570 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 14:33:53.998977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995422 2570 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 14:33:53.998977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995426 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 14:33:53.998977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995431 2570 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 14:33:53.998977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995434 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 14:33:53.998977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995437 2570 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 14:33:53.998977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995440 2570 flags.go:64] FLAG: --logging-format="text" Apr 17 14:33:53.998977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995443 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 14:33:53.999558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995447 2570 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 14:33:53.999558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995450 2570 flags.go:64] FLAG: --manifest-url="" Apr 17 14:33:53.999558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995453 2570 flags.go:64] FLAG: --manifest-url-header="" Apr 17 14:33:53.999558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995457 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 14:33:53.999558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995460 2570 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 14:33:53.999558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995465 2570 flags.go:64] FLAG: --max-pods="110" Apr 17 
14:33:53.999558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995468 2570 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 14:33:53.999558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995471 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 14:33:53.999558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995474 2570 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 14:33:53.999558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995477 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 14:33:53.999558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995480 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 14:33:53.999558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995483 2570 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 14:33:53.999558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995486 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 14:33:53.999558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995493 2570 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 14:33:53.999558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995496 2570 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 14:33:53.999558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995500 2570 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 14:33:53.999558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995503 2570 flags.go:64] FLAG: --pod-cidr="" Apr 17 14:33:53.999558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995506 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 14:33:53.999558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995512 2570 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 14:33:53.999558 ip-10-0-129-134 kubenswrapper[2570]: I0417 
14:33:53.995515 2570 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 14:33:53.999558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995518 2570 flags.go:64] FLAG: --pods-per-core="0" Apr 17 14:33:53.999558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995521 2570 flags.go:64] FLAG: --port="10250" Apr 17 14:33:53.999558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995524 2570 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 14:33:53.999558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995527 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a047c0c35c688070" Apr 17 14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995531 2570 flags.go:64] FLAG: --qos-reserved="" Apr 17 14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995533 2570 flags.go:64] FLAG: --read-only-port="10255" Apr 17 14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995536 2570 flags.go:64] FLAG: --register-node="true" Apr 17 14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995539 2570 flags.go:64] FLAG: --register-schedulable="true" Apr 17 14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995542 2570 flags.go:64] FLAG: --register-with-taints="" Apr 17 14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995546 2570 flags.go:64] FLAG: --registry-burst="10" Apr 17 14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995548 2570 flags.go:64] FLAG: --registry-qps="5" Apr 17 14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995554 2570 flags.go:64] FLAG: --reserved-cpus="" Apr 17 14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995557 2570 flags.go:64] FLAG: --reserved-memory="" Apr 17 14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995561 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995564 2570 
flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995567 2570 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995570 2570 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995573 2570 flags.go:64] FLAG: --runonce="false" Apr 17 14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995576 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995579 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995582 2570 flags.go:64] FLAG: --seccomp-default="false" Apr 17 14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995585 2570 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995587 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995591 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995594 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995597 2570 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995600 2570 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995603 2570 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995606 2570 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 
14:33:54.000147 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995609 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 14:33:54.000800 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995613 2570 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 14:33:54.000800 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995616 2570 flags.go:64] FLAG: --system-cgroups="" Apr 17 14:33:54.000800 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995618 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 14:33:54.000800 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995624 2570 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 14:33:54.000800 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995627 2570 flags.go:64] FLAG: --tls-cert-file="" Apr 17 14:33:54.000800 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995630 2570 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 14:33:54.000800 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995635 2570 flags.go:64] FLAG: --tls-min-version="" Apr 17 14:33:54.000800 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995638 2570 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 14:33:54.000800 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995640 2570 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 14:33:54.000800 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995643 2570 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 14:33:54.000800 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995646 2570 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 14:33:54.000800 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995649 2570 flags.go:64] FLAG: --v="2" Apr 17 14:33:54.000800 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995654 2570 flags.go:64] FLAG: --version="false" Apr 17 14:33:54.000800 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995660 2570 flags.go:64] FLAG: --vmodule="" 
Apr 17 14:33:54.000800 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995664 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 14:33:54.000800 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.995668 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 14:33:54.000800 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995795 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 14:33:54.000800 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995799 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 14:33:54.000800 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995802 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 14:33:54.000800 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995805 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 14:33:54.000800 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995809 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 14:33:54.000800 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995812 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 14:33:54.000800 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995815 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 14:33:54.001364 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995817 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 14:33:54.001364 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995820 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 14:33:54.001364 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995823 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 17 14:33:54.001364 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995825 2570 feature_gate.go:328] 
unrecognized feature gate: GatewayAPI Apr 17 14:33:54.001364 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995828 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 14:33:54.001364 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995830 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 14:33:54.001364 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995833 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 14:33:54.001364 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995835 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 14:33:54.001364 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995838 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 14:33:54.001364 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995841 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 14:33:54.001364 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995844 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 14:33:54.001364 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995847 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 14:33:54.001364 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995854 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 14:33:54.001364 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995857 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 14:33:54.001364 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995860 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 14:33:54.001364 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995862 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 14:33:54.001364 ip-10-0-129-134 
kubenswrapper[2570]: W0417 14:33:53.995865 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 14:33:54.001364 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995868 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 14:33:54.001364 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995870 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 14:33:54.001364 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995873 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 14:33:54.001909 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995875 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 14:33:54.001909 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995878 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 14:33:54.001909 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995883 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 14:33:54.001909 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995887 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 14:33:54.001909 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995890 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 14:33:54.001909 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995893 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 14:33:54.001909 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995896 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 14:33:54.001909 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995898 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 14:33:54.001909 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995901 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 14:33:54.001909 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995904 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 14:33:54.001909 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995906 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 14:33:54.001909 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995909 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 14:33:54.001909 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995912 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 14:33:54.001909 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995915 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 14:33:54.001909 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995917 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 14:33:54.001909 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995920 2570 
feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 14:33:54.001909 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995922 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 14:33:54.001909 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995925 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 14:33:54.001909 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995928 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 14:33:54.002412 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995930 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 14:33:54.002412 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995933 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 14:33:54.002412 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995935 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 14:33:54.002412 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995939 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 14:33:54.002412 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995941 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 14:33:54.002412 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995945 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 14:33:54.002412 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995948 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 14:33:54.002412 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995951 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 14:33:54.002412 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995953 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 
14:33:54.002412 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995956 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 14:33:54.002412 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995959 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 14:33:54.002412 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995962 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 14:33:54.002412 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995964 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 14:33:54.002412 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995967 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 14:33:54.002412 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995969 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 14:33:54.002412 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995973 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 14:33:54.002412 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995976 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 14:33:54.002412 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995979 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 14:33:54.002412 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995981 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 14:33:54.002893 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995983 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 14:33:54.002893 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995987 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 14:33:54.002893 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995991 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:33:54.002893 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995994 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:33:54.002893 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995997 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:33:54.002893 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.995999 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:33:54.002893 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.996002 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:33:54.002893 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.996004 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:33:54.002893 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.996007 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:33:54.002893 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.996009 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:33:54.002893 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.996012 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:33:54.002893 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.996014 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:33:54.002893 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.996017 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:33:54.002893 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.996020 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:33:54.002893 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.996022 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:33:54.002893 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.996025 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:33:54.002893 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.996027 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:33:54.002893 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.996031 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:33:54.002893 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.996035 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:33:54.002893 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.996037 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:33:54.003413 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:53.996040 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:33:54.003413 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:53.996838 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 14:33:54.003950 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.003927 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 14:33:54.003981 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.003952 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 14:33:54.004011 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004003 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:33:54.004011 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004009 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:33:54.004011 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004012 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:33:54.004086 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004016 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:33:54.004086 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004020 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:33:54.004086 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004024 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:33:54.004086 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004029 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:33:54.004086 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004032 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:33:54.004086 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004036 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:33:54.004086 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004039 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:33:54.004086 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004042 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:33:54.004086 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004045 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:33:54.004086 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004047 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:33:54.004086 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004050 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:33:54.004086 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004053 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:33:54.004086 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004056 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:33:54.004086 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004060 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:33:54.004086 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004063 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:33:54.004086 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004066 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:33:54.004086 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004069 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:33:54.004086 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004072 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:33:54.004086 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004074 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:33:54.004585 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004077 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:33:54.004585 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004080 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:33:54.004585 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004082 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:33:54.004585 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004092 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:33:54.004585 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004095 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:33:54.004585 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004098 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:33:54.004585 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004100 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:33:54.004585 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004103 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:33:54.004585 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004106 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:33:54.004585 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004108 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:33:54.004585 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004111 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:33:54.004585 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004114 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:33:54.004585 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004116 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:33:54.004585 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004119 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:33:54.004585 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004122 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:33:54.004585 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004125 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:33:54.004585 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004128 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:33:54.004585 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004130 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:33:54.004585 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004133 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:33:54.004585 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004135 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:33:54.005072 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004138 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:33:54.005072 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004141 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:33:54.005072 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004146 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:33:54.005072 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004149 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:33:54.005072 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004151 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:33:54.005072 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004154 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:33:54.005072 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004156 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:33:54.005072 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004159 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:33:54.005072 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004161 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:33:54.005072 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004164 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:33:54.005072 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004166 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:33:54.005072 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004169 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:33:54.005072 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004171 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:33:54.005072 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004175 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:33:54.005072 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004178 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:33:54.005072 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004181 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:33:54.005072 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004184 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:33:54.005072 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004187 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:33:54.005072 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004190 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:33:54.005558 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004192 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:33:54.005558 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004195 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:33:54.005558 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004197 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:33:54.005558 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004200 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:33:54.005558 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004202 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:33:54.005558 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004205 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:33:54.005558 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004207 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:33:54.005558 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004210 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:33:54.005558 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004212 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:33:54.005558 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004215 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:33:54.005558 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004218 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:33:54.005558 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004220 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:33:54.005558 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004223 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:33:54.005558 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004225 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:33:54.005558 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004228 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:33:54.005558 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004231 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:33:54.005558 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004233 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:33:54.005558 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004257 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:33:54.005558 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004261 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:33:54.005558 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004264 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:33:54.006090 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004267 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:33:54.006090 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004270 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:33:54.006090 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004273 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:33:54.006090 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004275 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:33:54.006090 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004278 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:33:54.006090 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.004283 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 14:33:54.006090 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004400 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:33:54.006090 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004406 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:33:54.006090 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004410 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:33:54.006090 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004413 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:33:54.006090 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004416 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:33:54.006090 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004419 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:33:54.006090 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004422 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:33:54.006090 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004425 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:33:54.006090 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004428 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:33:54.006476 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004431 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:33:54.006476 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004433 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:33:54.006476 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004436 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:33:54.006476 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004439 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:33:54.006476 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004441 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:33:54.006476 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004444 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:33:54.006476 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004446 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:33:54.006476 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004449 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:33:54.006476 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004451 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:33:54.006476 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004454 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:33:54.006476 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004456 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:33:54.006476 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004459 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:33:54.006476 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004462 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:33:54.006476 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004465 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:33:54.006476 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004467 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:33:54.006476 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004471 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:33:54.006476 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004475 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:33:54.006476 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004478 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:33:54.006476 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004481 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:33:54.006938 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004484 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:33:54.006938 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004487 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:33:54.006938 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004489 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:33:54.006938 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004492 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:33:54.006938 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004496 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:33:54.006938 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004498 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:33:54.006938 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004501 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:33:54.006938 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004503 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:33:54.006938 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004506 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:33:54.006938 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004509 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:33:54.006938 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004512 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:33:54.006938 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004514 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:33:54.006938 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004517 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:33:54.006938 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004519 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:33:54.006938 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004522 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:33:54.006938 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004524 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:33:54.006938 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004527 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:33:54.006938 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004530 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:33:54.006938 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004532 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:33:54.007430 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004535 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:33:54.007430 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004537 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:33:54.007430 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004539 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:33:54.007430 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004542 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:33:54.007430 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004545 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:33:54.007430 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004548 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:33:54.007430 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004550 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:33:54.007430 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004553 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:33:54.007430 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004555 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:33:54.007430 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004558 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:33:54.007430 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004560 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:33:54.007430 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004563 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:33:54.007430 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004565 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:33:54.007430 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004568 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:33:54.007430 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004570 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:33:54.007430 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004573 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:33:54.007430 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004575 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:33:54.007430 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004578 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:33:54.007430 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004581 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:33:54.007430 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004584 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:33:54.007912 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004586 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:33:54.007912 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004589 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:33:54.007912 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004592 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:33:54.007912 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004595 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:33:54.007912 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004598 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:33:54.007912 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004601 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:33:54.007912 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004603 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:33:54.007912 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004606 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:33:54.007912 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004608 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:33:54.007912 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004611 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:33:54.007912 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004613 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:33:54.007912 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004616 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:33:54.007912 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004619 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:33:54.007912 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004621 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:33:54.007912 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004624 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:33:54.007912 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004626 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:33:54.007912 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004629 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:33:54.007912 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004632 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:33:54.007912 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:54.004634 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:33:54.008412 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.004639 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 14:33:54.008412 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.005375 2570 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 14:33:54.008412 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.007547 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 14:33:54.008592 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.008579 2570 server.go:1019] "Starting client certificate rotation"
Apr 17 14:33:54.008697 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.008678 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 14:33:54.008730 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.008722 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 14:33:54.034349 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.034323 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 14:33:54.040683 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.040650 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 14:33:54.058612 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.058588 2570 log.go:25] "Validated CRI v1 runtime API"
Apr 17 14:33:54.064629 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.064605 2570 log.go:25] "Validated CRI v1 image API"
Apr 17 14:33:54.065517 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.065497 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 14:33:54.066298 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.066278 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 14:33:54.069901 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.069871 2570 fs.go:135] Filesystem UUIDs: map[122e43e7-2140-409e-99dd-ca4c7497b527:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 9376864b-9c0f-4892-94c5-fefd7b154146:/dev/nvme0n1p4]
Apr 17 14:33:54.070003 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.069898 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 14:33:54.075305 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.075148 2570 manager.go:217] Machine: {Timestamp:2026-04-17 14:33:54.073883963 +0000 UTC m=+0.430611590 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3161385 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2da954dc22fcdd189d4f7b23b1ef40 SystemUUID:ec2da954-dc22-fcdd-189d-4f7b23b1ef40 BootID:b292407e-928c-4cbd-b944-68793d6bf7d6 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:d1:a2:16:31:97 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:d1:a2:16:31:97 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:22:6d:47:bc:cd:31 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 14:33:54.075305 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.075288 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 14:33:54.075493 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.075415 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 14:33:54.077279 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.077252 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 14:33:54.077480 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.077279 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-134.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinRe
claim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 14:33:54.077558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.077497 2570 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 14:33:54.077558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.077510 2570 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 14:33:54.077558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.077528 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 14:33:54.078391 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.078378 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 14:33:54.079726 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.079713 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 17 14:33:54.079861 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.079850 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 14:33:54.082353 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.082341 2570 kubelet.go:491] "Attempting to sync node with API server" Apr 17 14:33:54.082419 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.082366 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 14:33:54.082419 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.082383 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 
14:33:54.082419 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.082398 2570 kubelet.go:397] "Adding apiserver pod source" Apr 17 14:33:54.082419 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.082412 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 14:33:54.083526 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.083496 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 14:33:54.083605 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.083541 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 14:33:54.086676 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.086660 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 14:33:54.088568 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.088556 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 14:33:54.089949 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.089928 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 14:33:54.089949 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.089948 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 14:33:54.090054 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.089955 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 14:33:54.090054 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.089961 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 14:33:54.090054 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.089966 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 14:33:54.090054 ip-10-0-129-134 kubenswrapper[2570]: 
I0417 14:33:54.089975 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 14:33:54.090054 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.089981 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 14:33:54.090054 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.089989 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 14:33:54.090054 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.090000 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 14:33:54.090054 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.090007 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 14:33:54.090054 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.090016 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 14:33:54.090054 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.090025 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 14:33:54.091075 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.091064 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 14:33:54.091075 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.091075 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 14:33:54.094793 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.094780 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 14:33:54.094859 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.094816 2570 server.go:1295] "Started kubelet" Apr 17 14:33:54.094940 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.094894 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 14:33:54.094996 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.094933 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 
14:33:54.095041 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.095021 2570 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 14:33:54.096556 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:54.096486 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-134.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 14:33:54.096962 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.096948 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 14:33:54.097071 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.097055 2570 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-134.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 14:33:54.097189 ip-10-0-129-134 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 14:33:54.098429 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.098280 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rbr6x" Apr 17 14:33:54.098504 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:54.098456 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 14:33:54.098612 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.098598 2570 server.go:317] "Adding debug handlers to kubelet server" Apr 17 14:33:54.103844 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.103822 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 14:33:54.103844 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.103836 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 14:33:54.104475 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:54.103252 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-134.ec2.internal.18a72b865c481e85 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-134.ec2.internal,UID:ip-10-0-129-134.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-134.ec2.internal,},FirstTimestamp:2026-04-17 14:33:54.094792325 +0000 UTC m=+0.451519940,LastTimestamp:2026-04-17 14:33:54.094792325 +0000 UTC m=+0.451519940,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-134.ec2.internal,}" Apr 17 14:33:54.104593 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.104567 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 14:33:54.104593 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.104567 2570 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 14:33:54.104703 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.104598 2570 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 14:33:54.104703 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:54.104632 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-134.ec2.internal\" not found" Apr 17 14:33:54.104703 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.104672 2570 reconstruct.go:97] "Volume reconstruction finished" Apr 17 14:33:54.104703 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.104684 2570 reconciler.go:26] "Reconciler: start to sync state" Apr 17 14:33:54.106030 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.106008 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rbr6x" Apr 17 14:33:54.106229 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.106210 2570 factory.go:55] Registering systemd factory Apr 17 14:33:54.106316 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.106249 2570 factory.go:223] Registration of the systemd container factory successfully Apr 17 14:33:54.106703 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.106688 2570 factory.go:153] Registering CRI-O factory Apr 17 14:33:54.106812 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.106803 2570 factory.go:223] Registration of the crio container factory successfully Apr 17 14:33:54.106937 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.106926 2570 factory.go:221] Registration of the containerd 
container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 14:33:54.107035 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.107026 2570 factory.go:103] Registering Raw factory Apr 17 14:33:54.107108 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.107101 2570 manager.go:1196] Started watching for new ooms in manager Apr 17 14:33:54.107816 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.107802 2570 manager.go:319] Starting recovery of all containers Apr 17 14:33:54.114624 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.114603 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:33:54.117023 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:54.116840 2570 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-134.ec2.internal\" not found" node="ip-10-0-129-134.ec2.internal" Apr 17 14:33:54.118777 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.118763 2570 manager.go:324] Recovery completed Apr 17 14:33:54.120049 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:54.120025 2570 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 17 14:33:54.123122 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.123105 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:33:54.125679 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.125664 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-134.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:33:54.125749 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.125695 2570 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-134.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:33:54.125749 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.125709 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-134.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:33:54.126171 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.126156 2570 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 14:33:54.126171 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.126171 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 14:33:54.126304 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.126188 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 17 14:33:54.128364 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.128352 2570 policy_none.go:49] "None policy: Start" Apr 17 14:33:54.128410 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.128367 2570 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 14:33:54.128410 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.128377 2570 state_mem.go:35] "Initializing new in-memory state store" Apr 17 14:33:54.173883 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.173862 2570 manager.go:341] "Starting Device Plugin manager" Apr 17 14:33:54.178447 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:54.173905 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 14:33:54.178447 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.173921 2570 server.go:85] "Starting device plugin registration server" Apr 17 14:33:54.178447 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.174178 2570 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 14:33:54.178447 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.174189 2570 container_log_manager.go:189] "Initializing container log rotate 
workers" workers=1 monitorPeriod="10s" Apr 17 14:33:54.178447 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.174287 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 14:33:54.178447 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.174376 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 14:33:54.178447 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.174387 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 14:33:54.178447 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:54.174909 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 14:33:54.178447 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:54.174950 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-134.ec2.internal\" not found" Apr 17 14:33:54.232860 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.232776 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 14:33:54.234052 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.234033 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 14:33:54.234111 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.234070 2570 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 14:33:54.234111 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.234095 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 14:33:54.234111 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.234106 2570 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 14:33:54.234250 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:54.234149 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 14:33:54.236291 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.236272 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:33:54.275108 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.275074 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:33:54.278183 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.278164 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-134.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:33:54.278308 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.278200 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-134.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:33:54.278308 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.278215 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-134.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:33:54.278308 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.278261 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-134.ec2.internal" Apr 17 14:33:54.285811 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.285791 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-134.ec2.internal" Apr 17 14:33:54.285925 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:54.285819 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-134.ec2.internal\": node \"ip-10-0-129-134.ec2.internal\" not found" Apr 17 
14:33:54.299815 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:54.299790 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-134.ec2.internal\" not found" Apr 17 14:33:54.334854 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.334806 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-134.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-134.ec2.internal"] Apr 17 14:33:54.334926 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.334916 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:33:54.337288 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.337271 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-134.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:33:54.337367 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.337303 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-134.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:33:54.337367 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.337319 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-134.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:33:54.338654 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.338629 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:33:54.338833 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.338820 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-134.ec2.internal" Apr 17 14:33:54.338870 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.338850 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:33:54.339537 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.339518 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-134.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:33:54.339625 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.339544 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-134.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:33:54.339625 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.339554 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-134.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:33:54.339625 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.339586 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-134.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:33:54.339625 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.339604 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-134.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:33:54.339625 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.339614 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-134.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:33:54.340855 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.340840 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-134.ec2.internal" Apr 17 14:33:54.340911 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.340865 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:33:54.341620 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.341601 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-134.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:33:54.341705 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.341627 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-134.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:33:54.341705 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.341637 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-134.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:33:54.370268 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:54.370228 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-134.ec2.internal\" not found" node="ip-10-0-129-134.ec2.internal" Apr 17 14:33:54.374877 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:54.374856 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-134.ec2.internal\" not found" node="ip-10-0-129-134.ec2.internal" Apr 17 14:33:54.400550 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:54.400519 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-134.ec2.internal\" not found" Apr 17 14:33:54.405853 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.405835 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/53a7243a4603a87d58406d96759acbb6-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-129-134.ec2.internal\" (UID: \"53a7243a4603a87d58406d96759acbb6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-134.ec2.internal" Apr 17 14:33:54.405928 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.405860 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53a7243a4603a87d58406d96759acbb6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-134.ec2.internal\" (UID: \"53a7243a4603a87d58406d96759acbb6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-134.ec2.internal" Apr 17 14:33:54.405928 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.405879 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6856fb5778dd8b62676b041b83b334bb-config\") pod \"kube-apiserver-proxy-ip-10-0-129-134.ec2.internal\" (UID: \"6856fb5778dd8b62676b041b83b334bb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-134.ec2.internal" Apr 17 14:33:54.500826 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:54.500742 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-134.ec2.internal\" not found" Apr 17 14:33:54.506092 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.506060 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/53a7243a4603a87d58406d96759acbb6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-134.ec2.internal\" (UID: \"53a7243a4603a87d58406d96759acbb6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-134.ec2.internal" Apr 17 14:33:54.506232 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.506013 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/53a7243a4603a87d58406d96759acbb6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-134.ec2.internal\" (UID: \"53a7243a4603a87d58406d96759acbb6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-134.ec2.internal" Apr 17 14:33:54.506232 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.506125 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53a7243a4603a87d58406d96759acbb6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-134.ec2.internal\" (UID: \"53a7243a4603a87d58406d96759acbb6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-134.ec2.internal" Apr 17 14:33:54.506232 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.506141 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6856fb5778dd8b62676b041b83b334bb-config\") pod \"kube-apiserver-proxy-ip-10-0-129-134.ec2.internal\" (UID: \"6856fb5778dd8b62676b041b83b334bb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-134.ec2.internal" Apr 17 14:33:54.506232 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.506173 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6856fb5778dd8b62676b041b83b334bb-config\") pod \"kube-apiserver-proxy-ip-10-0-129-134.ec2.internal\" (UID: \"6856fb5778dd8b62676b041b83b334bb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-134.ec2.internal" Apr 17 14:33:54.506232 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.506193 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53a7243a4603a87d58406d96759acbb6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-134.ec2.internal\" (UID: \"53a7243a4603a87d58406d96759acbb6\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-134.ec2.internal" Apr 17 14:33:54.601486 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:54.601447 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-134.ec2.internal\" not found" Apr 17 14:33:54.671927 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.671901 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-134.ec2.internal" Apr 17 14:33:54.677558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:54.677541 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-134.ec2.internal" Apr 17 14:33:54.702156 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:54.702126 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-134.ec2.internal\" not found" Apr 17 14:33:54.802744 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:54.802660 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-134.ec2.internal\" not found" Apr 17 14:33:54.903188 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:54.903153 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-134.ec2.internal\" not found" Apr 17 14:33:55.003760 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:55.003727 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-134.ec2.internal\" not found" Apr 17 14:33:55.007971 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:55.007953 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 14:33:55.008096 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:55.008080 2570 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 14:33:55.008157 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:55.008116 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 14:33:55.104358 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:55.104295 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-134.ec2.internal\" not found" Apr 17 14:33:55.104358 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:55.104299 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 14:33:55.110355 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:55.110330 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 14:28:54 +0000 UTC" deadline="2027-10-19 14:41:43.731435434 +0000 UTC" Apr 17 14:33:55.110404 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:55.110358 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13200h7m48.621081329s" Apr 17 14:33:55.118676 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:55.118660 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 14:33:55.136554 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:55.136530 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" 
csr="csr-q7dhx" Apr 17 14:33:55.145403 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:55.145383 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-q7dhx" Apr 17 14:33:55.149549 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:55.149521 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53a7243a4603a87d58406d96759acbb6.slice/crio-9f769fbb7dc2691f57e0b311aae0294875ca375021fb5f9ae32ba804f3a32212 WatchSource:0}: Error finding container 9f769fbb7dc2691f57e0b311aae0294875ca375021fb5f9ae32ba804f3a32212: Status 404 returned error can't find the container with id 9f769fbb7dc2691f57e0b311aae0294875ca375021fb5f9ae32ba804f3a32212 Apr 17 14:33:55.150065 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:55.150045 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6856fb5778dd8b62676b041b83b334bb.slice/crio-20f7d7b759dcd855e12af083cb90996a64e465dc51250980c23f35a4ecee9ff4 WatchSource:0}: Error finding container 20f7d7b759dcd855e12af083cb90996a64e465dc51250980c23f35a4ecee9ff4: Status 404 returned error can't find the container with id 20f7d7b759dcd855e12af083cb90996a64e465dc51250980c23f35a4ecee9ff4 Apr 17 14:33:55.154544 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:55.154529 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:33:55.205086 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:55.205048 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-134.ec2.internal\" not found" Apr 17 14:33:55.237278 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:55.237214 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-134.ec2.internal" 
event={"ID":"53a7243a4603a87d58406d96759acbb6","Type":"ContainerStarted","Data":"9f769fbb7dc2691f57e0b311aae0294875ca375021fb5f9ae32ba804f3a32212"} Apr 17 14:33:55.238145 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:55.238121 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-134.ec2.internal" event={"ID":"6856fb5778dd8b62676b041b83b334bb","Type":"ContainerStarted","Data":"20f7d7b759dcd855e12af083cb90996a64e465dc51250980c23f35a4ecee9ff4"} Apr 17 14:33:55.259962 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:55.259940 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:33:55.304978 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:55.304951 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-134.ec2.internal" Apr 17 14:33:55.315318 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:55.315294 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 14:33:55.317209 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:55.317192 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-134.ec2.internal" Apr 17 14:33:55.327211 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:55.327195 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 14:33:55.709229 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:55.709046 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:33:55.831798 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:55.831760 2570 reflector.go:430] "Caches populated" 
type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:33:56.084034 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.083948 2570 apiserver.go:52] "Watching apiserver" Apr 17 14:33:56.092426 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.092396 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 14:33:56.092807 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.092782 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-2pbpg","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-134.ec2.internal","openshift-multus/multus-additional-cni-plugins-lr2qq","openshift-network-diagnostics/network-check-target-2jb96","openshift-network-operator/iptables-alerter-jglh7","kube-system/kube-apiserver-proxy-ip-10-0-129-134.ec2.internal","openshift-cluster-node-tuning-operator/tuned-tr472","openshift-dns/node-resolver-s9w25","openshift-image-registry/node-ca-pzwpd","openshift-multus/multus-s7m27","openshift-multus/network-metrics-daemon-j6cgs","openshift-ovn-kubernetes/ovnkube-node-lr966"] Apr 17 14:33:56.095683 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.095648 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pzwpd" Apr 17 14:33:56.096851 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.096835 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9" Apr 17 14:33:56.098005 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.097978 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 14:33:56.098456 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.098160 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 14:33:56.098456 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.098169 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 14:33:56.098456 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.098178 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jb96" Apr 17 14:33:56.098456 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:56.098258 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2jb96" podUID="651b7208-91cf-42ee-b675-0a50ef1389f0" Apr 17 14:33:56.098456 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.098325 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-njt8v\"" Apr 17 14:33:56.099214 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.099192 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-vbldb\"" Apr 17 14:33:56.099315 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.099271 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 14:33:56.099370 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.099347 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 14:33:56.099422 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.099399 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 14:33:56.099515 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.099476 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lr2qq" Apr 17 14:33:56.101192 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.101033 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-jglh7" Apr 17 14:33:56.101830 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.101773 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 14:33:56.101830 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.101801 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 14:33:56.102012 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.101990 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 14:33:56.102157 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.102136 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-jrs7q\"" Apr 17 14:33:56.102268 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.102167 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 14:33:56.102268 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.102169 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tr472" Apr 17 14:33:56.103070 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.103052 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 14:33:56.103339 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.103321 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:33:56.103534 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.103519 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-s9w25" Apr 17 14:33:56.103620 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.103606 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 14:33:56.105099 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.105079 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jxqxc\"" Apr 17 14:33:56.105358 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.105344 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2pbpg" Apr 17 14:33:56.105500 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.105481 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 14:33:56.105827 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.105810 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.106794 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.106458 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 14:33:56.106794 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.106498 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-2cbqb\"" Apr 17 14:33:56.107427 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.107001 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:33:56.107427 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.107064 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 14:33:56.107427 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.107134 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 14:33:56.107427 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.107001 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-hthmd\"" Apr 17 14:33:56.108116 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.107769 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 14:33:56.108116 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.108062 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 14:33:56.108549 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.108528 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6cgs" Apr 17 14:33:56.108855 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:56.108665 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6cgs" podUID="4807a6e2-14df-4ba4-8aee-7422a65508f2" Apr 17 14:33:56.109598 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.109580 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-tgpzr\"" Apr 17 14:33:56.110225 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.110010 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8f7nj\"" Apr 17 14:33:56.110225 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.110010 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 14:33:56.110502 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.110480 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.112668 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.112652 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 14:33:56.113815 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.113797 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4790e8ee-77ab-402e-a1df-7e728d62db98-cni-binary-copy\") pod \"multus-additional-cni-plugins-lr2qq\" (UID: \"4790e8ee-77ab-402e-a1df-7e728d62db98\") " pod="openshift-multus/multus-additional-cni-plugins-lr2qq" Apr 17 14:33:56.113910 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.113829 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g94kh\" (UniqueName: \"kubernetes.io/projected/4790e8ee-77ab-402e-a1df-7e728d62db98-kube-api-access-g94kh\") pod \"multus-additional-cni-plugins-lr2qq\" (UID: \"4790e8ee-77ab-402e-a1df-7e728d62db98\") " pod="openshift-multus/multus-additional-cni-plugins-lr2qq" Apr 17 14:33:56.113910 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.113857 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/70bf124d-898a-4e10-aece-902d90ea13ac-agent-certs\") pod \"konnectivity-agent-2pbpg\" (UID: \"70bf124d-898a-4e10-aece-902d90ea13ac\") " pod="kube-system/konnectivity-agent-2pbpg" Apr 17 14:33:56.113910 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.113882 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bd98c614-168e-496f-adb3-06763ea0075c-etc-selinux\") pod \"aws-ebs-csi-driver-node-rr8s9\" (UID: \"bd98c614-168e-496f-adb3-06763ea0075c\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9" Apr 17 14:33:56.114052 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.113919 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-var-lib-kubelet\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472" Apr 17 14:33:56.114052 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.113941 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-host-run-netns\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.114052 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.113963 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1610e8f2-d397-4d82-a851-2e17443a44e4-host-slash\") pod \"iptables-alerter-jglh7\" (UID: \"1610e8f2-d397-4d82-a851-2e17443a44e4\") " pod="openshift-network-operator/iptables-alerter-jglh7" Apr 17 14:33:56.114052 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114018 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-cnibin\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.114195 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114057 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-multus-socket-dir-parent\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.114195 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114085 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-host-var-lib-kubelet\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.114195 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114111 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml2t7\" (UniqueName: \"kubernetes.io/projected/e599dbf6-a663-42a7-82bb-12ed438c2ba8-kube-api-access-ml2t7\") pod \"node-resolver-s9w25\" (UID: \"e599dbf6-a663-42a7-82bb-12ed438c2ba8\") " pod="openshift-dns/node-resolver-s9w25" Apr 17 14:33:56.114195 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114134 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-host\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472" Apr 17 14:33:56.114195 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114155 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e6899357-3e39-480b-ab64-ec01da0d8a4f-etc-tuned\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472" Apr 17 14:33:56.114454 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114196 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns97l\" (UniqueName: \"kubernetes.io/projected/e6899357-3e39-480b-ab64-ec01da0d8a4f-kube-api-access-ns97l\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472" Apr 17 14:33:56.114454 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114224 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-host-run-k8s-cni-cncf-io\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.114454 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114275 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-host-run-multus-certs\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.114454 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114304 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p9fk\" (UniqueName: \"kubernetes.io/projected/bd98c614-168e-496f-adb3-06763ea0075c-kube-api-access-8p9fk\") pod \"aws-ebs-csi-driver-node-rr8s9\" (UID: \"bd98c614-168e-496f-adb3-06763ea0075c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9" Apr 17 14:33:56.114454 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114326 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-etc-modprobe-d\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " 
pod="openshift-cluster-node-tuning-operator/tuned-tr472" Apr 17 14:33:56.114454 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114372 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4790e8ee-77ab-402e-a1df-7e728d62db98-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lr2qq\" (UID: \"4790e8ee-77ab-402e-a1df-7e728d62db98\") " pod="openshift-multus/multus-additional-cni-plugins-lr2qq" Apr 17 14:33:56.114454 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114414 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x52rp\" (UniqueName: \"kubernetes.io/projected/651b7208-91cf-42ee-b675-0a50ef1389f0-kube-api-access-x52rp\") pod \"network-check-target-2jb96\" (UID: \"651b7208-91cf-42ee-b675-0a50ef1389f0\") " pod="openshift-network-diagnostics/network-check-target-2jb96" Apr 17 14:33:56.114454 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114445 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-etc-sysctl-d\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472" Apr 17 14:33:56.114818 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114467 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-sys\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472" Apr 17 14:33:56.114818 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114485 2570 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-g6m69\"" Apr 17 14:33:56.114818 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114499 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-cni-binary-copy\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.114818 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114524 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4790e8ee-77ab-402e-a1df-7e728d62db98-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lr2qq\" (UID: \"4790e8ee-77ab-402e-a1df-7e728d62db98\") " pod="openshift-multus/multus-additional-cni-plugins-lr2qq" Apr 17 14:33:56.114818 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114546 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c66a6c5d-9d68-42d2-ac7d-1ef80b44ed2f-host\") pod \"node-ca-pzwpd\" (UID: \"c66a6c5d-9d68-42d2-ac7d-1ef80b44ed2f\") " pod="openshift-image-registry/node-ca-pzwpd" Apr 17 14:33:56.114818 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114568 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 14:33:56.114818 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114568 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 14:33:56.114818 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114568 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-etc-sysconfig\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472" Apr 17 14:33:56.114818 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114703 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-multus-cni-dir\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.114818 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114718 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 14:33:56.114818 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114730 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-etc-kubernetes\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.114818 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114758 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bd98c614-168e-496f-adb3-06763ea0075c-socket-dir\") pod \"aws-ebs-csi-driver-node-rr8s9\" (UID: \"bd98c614-168e-496f-adb3-06763ea0075c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9" Apr 17 14:33:56.114818 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114770 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 14:33:56.114818 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114782 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-run\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472" Apr 17 14:33:56.114818 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114794 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 14:33:56.114818 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114807 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-system-cni-dir\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.115499 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114831 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-multus-conf-dir\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.115499 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114856 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-host-var-lib-cni-multus\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.115499 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114881 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-multus-daemon-config\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.115499 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114914 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld642\" (UniqueName: \"kubernetes.io/projected/1610e8f2-d397-4d82-a851-2e17443a44e4-kube-api-access-ld642\") pod \"iptables-alerter-jglh7\" (UID: \"1610e8f2-d397-4d82-a851-2e17443a44e4\") " pod="openshift-network-operator/iptables-alerter-jglh7" Apr 17 14:33:56.115499 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114938 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e6899357-3e39-480b-ab64-ec01da0d8a4f-tmp\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472" Apr 17 14:33:56.115499 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114955 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4790e8ee-77ab-402e-a1df-7e728d62db98-os-release\") pod \"multus-additional-cni-plugins-lr2qq\" (UID: \"4790e8ee-77ab-402e-a1df-7e728d62db98\") " pod="openshift-multus/multus-additional-cni-plugins-lr2qq" Apr 17 14:33:56.115499 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114975 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e599dbf6-a663-42a7-82bb-12ed438c2ba8-hosts-file\") pod \"node-resolver-s9w25\" (UID: \"e599dbf6-a663-42a7-82bb-12ed438c2ba8\") " pod="openshift-dns/node-resolver-s9w25" Apr 17 14:33:56.115499 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.114999 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bd98c614-168e-496f-adb3-06763ea0075c-registration-dir\") pod \"aws-ebs-csi-driver-node-rr8s9\" (UID: \"bd98c614-168e-496f-adb3-06763ea0075c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9" Apr 17 14:33:56.115499 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.115023 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1610e8f2-d397-4d82-a851-2e17443a44e4-iptables-alerter-script\") pod \"iptables-alerter-jglh7\" (UID: \"1610e8f2-d397-4d82-a851-2e17443a44e4\") " pod="openshift-network-operator/iptables-alerter-jglh7" Apr 17 14:33:56.115499 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.115038 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-lib-modules\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472" Apr 17 14:33:56.115499 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.115053 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-host-var-lib-cni-bin\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.115499 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.115067 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/70bf124d-898a-4e10-aece-902d90ea13ac-konnectivity-ca\") pod \"konnectivity-agent-2pbpg\" (UID: \"70bf124d-898a-4e10-aece-902d90ea13ac\") " 
pod="kube-system/konnectivity-agent-2pbpg" Apr 17 14:33:56.115499 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.115087 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd98c614-168e-496f-adb3-06763ea0075c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rr8s9\" (UID: \"bd98c614-168e-496f-adb3-06763ea0075c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9" Apr 17 14:33:56.115499 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.115131 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4790e8ee-77ab-402e-a1df-7e728d62db98-system-cni-dir\") pod \"multus-additional-cni-plugins-lr2qq\" (UID: \"4790e8ee-77ab-402e-a1df-7e728d62db98\") " pod="openshift-multus/multus-additional-cni-plugins-lr2qq" Apr 17 14:33:56.115499 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.115169 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4790e8ee-77ab-402e-a1df-7e728d62db98-cnibin\") pod \"multus-additional-cni-plugins-lr2qq\" (UID: \"4790e8ee-77ab-402e-a1df-7e728d62db98\") " pod="openshift-multus/multus-additional-cni-plugins-lr2qq" Apr 17 14:33:56.115499 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.115213 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bd98c614-168e-496f-adb3-06763ea0075c-device-dir\") pod \"aws-ebs-csi-driver-node-rr8s9\" (UID: \"bd98c614-168e-496f-adb3-06763ea0075c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9" Apr 17 14:33:56.116111 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.115265 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" 
(UniqueName: \"kubernetes.io/host-path/bd98c614-168e-496f-adb3-06763ea0075c-sys-fs\") pod \"aws-ebs-csi-driver-node-rr8s9\" (UID: \"bd98c614-168e-496f-adb3-06763ea0075c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9" Apr 17 14:33:56.116111 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.115290 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-etc-kubernetes\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472" Apr 17 14:33:56.116111 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.115314 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-hostroot\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.116111 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.115336 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4790e8ee-77ab-402e-a1df-7e728d62db98-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lr2qq\" (UID: \"4790e8ee-77ab-402e-a1df-7e728d62db98\") " pod="openshift-multus/multus-additional-cni-plugins-lr2qq" Apr 17 14:33:56.116111 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.115371 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c66a6c5d-9d68-42d2-ac7d-1ef80b44ed2f-serviceca\") pod \"node-ca-pzwpd\" (UID: \"c66a6c5d-9d68-42d2-ac7d-1ef80b44ed2f\") " pod="openshift-image-registry/node-ca-pzwpd" Apr 17 14:33:56.116111 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.115409 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtjxg\" (UniqueName: \"kubernetes.io/projected/c66a6c5d-9d68-42d2-ac7d-1ef80b44ed2f-kube-api-access-jtjxg\") pod \"node-ca-pzwpd\" (UID: \"c66a6c5d-9d68-42d2-ac7d-1ef80b44ed2f\") " pod="openshift-image-registry/node-ca-pzwpd" Apr 17 14:33:56.116111 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.115456 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-etc-systemd\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472" Apr 17 14:33:56.116111 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.115487 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8964s\" (UniqueName: \"kubernetes.io/projected/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-kube-api-access-8964s\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.116111 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.115530 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e599dbf6-a663-42a7-82bb-12ed438c2ba8-tmp-dir\") pod \"node-resolver-s9w25\" (UID: \"e599dbf6-a663-42a7-82bb-12ed438c2ba8\") " pod="openshift-dns/node-resolver-s9w25" Apr 17 14:33:56.116111 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.115590 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-etc-sysctl-conf\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472" Apr 
17 14:33:56.116111 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.115614 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-os-release\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.146093 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.146063 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 14:28:55 +0000 UTC" deadline="2027-09-15 15:05:53.821355596 +0000 UTC" Apr 17 14:33:56.146207 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.146095 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12384h31m57.675264841s" Apr 17 14:33:56.149154 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.149132 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:33:56.205632 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.205598 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 14:33:56.216373 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.216348 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-multus-cni-dir\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.216534 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.216380 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-etc-kubernetes\") pod \"multus-s7m27\" (UID: 
\"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.216534 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.216413 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-host-cni-netd\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.216534 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.216439 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bd98c614-168e-496f-adb3-06763ea0075c-socket-dir\") pod \"aws-ebs-csi-driver-node-rr8s9\" (UID: \"bd98c614-168e-496f-adb3-06763ea0075c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9" Apr 17 14:33:56.216534 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.216462 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-run\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472" Apr 17 14:33:56.216534 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.216478 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-etc-kubernetes\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.216534 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.216486 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-system-cni-dir\") pod \"multus-s7m27\" (UID: 
\"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.216774 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.216538 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-multus-conf-dir\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.216774 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.216568 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-env-overrides\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.216774 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.216583 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-run\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472" Apr 17 14:33:56.216774 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.216592 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-ovnkube-script-lib\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.216774 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.216593 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-multus-cni-dir\") pod \"multus-s7m27\" (UID: 
\"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.216774 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.216598 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bd98c614-168e-496f-adb3-06763ea0075c-socket-dir\") pod \"aws-ebs-csi-driver-node-rr8s9\" (UID: \"bd98c614-168e-496f-adb3-06763ea0075c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9" Apr 17 14:33:56.216774 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.216539 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-system-cni-dir\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.216774 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.216643 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-host-var-lib-cni-multus\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.216774 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.216663 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-multus-daemon-config\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.216774 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.216636 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-multus-conf-dir\") pod \"multus-s7m27\" (UID: 
\"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.216774 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.216692 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-run-systemd\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.216774 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.216714 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-host-var-lib-cni-multus\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.216774 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.216723 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ld642\" (UniqueName: \"kubernetes.io/projected/1610e8f2-d397-4d82-a851-2e17443a44e4-kube-api-access-ld642\") pod \"iptables-alerter-jglh7\" (UID: \"1610e8f2-d397-4d82-a851-2e17443a44e4\") " pod="openshift-network-operator/iptables-alerter-jglh7" Apr 17 14:33:56.217366 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.216783 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e6899357-3e39-480b-ab64-ec01da0d8a4f-tmp\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472" Apr 17 14:33:56.217366 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.216815 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4790e8ee-77ab-402e-a1df-7e728d62db98-os-release\") pod 
\"multus-additional-cni-plugins-lr2qq\" (UID: \"4790e8ee-77ab-402e-a1df-7e728d62db98\") " pod="openshift-multus/multus-additional-cni-plugins-lr2qq" Apr 17 14:33:56.217366 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217192 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-multus-daemon-config\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.217366 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217198 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 14:33:56.217366 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217267 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e599dbf6-a663-42a7-82bb-12ed438c2ba8-hosts-file\") pod \"node-resolver-s9w25\" (UID: \"e599dbf6-a663-42a7-82bb-12ed438c2ba8\") " pod="openshift-dns/node-resolver-s9w25" Apr 17 14:33:56.217366 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217297 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnh6k\" (UniqueName: \"kubernetes.io/projected/4807a6e2-14df-4ba4-8aee-7422a65508f2-kube-api-access-wnh6k\") pod \"network-metrics-daemon-j6cgs\" (UID: \"4807a6e2-14df-4ba4-8aee-7422a65508f2\") " pod="openshift-multus/network-metrics-daemon-j6cgs" Apr 17 14:33:56.217366 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217357 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e599dbf6-a663-42a7-82bb-12ed438c2ba8-hosts-file\") pod \"node-resolver-s9w25\" (UID: \"e599dbf6-a663-42a7-82bb-12ed438c2ba8\") " 
pod="openshift-dns/node-resolver-s9w25" Apr 17 14:33:56.217710 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217387 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-var-lib-openvswitch\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.217710 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217415 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-log-socket\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.217710 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217417 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4790e8ee-77ab-402e-a1df-7e728d62db98-os-release\") pod \"multus-additional-cni-plugins-lr2qq\" (UID: \"4790e8ee-77ab-402e-a1df-7e728d62db98\") " pod="openshift-multus/multus-additional-cni-plugins-lr2qq" Apr 17 14:33:56.217710 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217453 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bd98c614-168e-496f-adb3-06763ea0075c-registration-dir\") pod \"aws-ebs-csi-driver-node-rr8s9\" (UID: \"bd98c614-168e-496f-adb3-06763ea0075c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9" Apr 17 14:33:56.217710 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217485 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/1610e8f2-d397-4d82-a851-2e17443a44e4-iptables-alerter-script\") pod \"iptables-alerter-jglh7\" (UID: \"1610e8f2-d397-4d82-a851-2e17443a44e4\") " pod="openshift-network-operator/iptables-alerter-jglh7" Apr 17 14:33:56.217710 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217509 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-lib-modules\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472" Apr 17 14:33:56.217710 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217532 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-host-var-lib-cni-bin\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.217710 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217555 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/70bf124d-898a-4e10-aece-902d90ea13ac-konnectivity-ca\") pod \"konnectivity-agent-2pbpg\" (UID: \"70bf124d-898a-4e10-aece-902d90ea13ac\") " pod="kube-system/konnectivity-agent-2pbpg" Apr 17 14:33:56.217710 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217581 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd98c614-168e-496f-adb3-06763ea0075c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rr8s9\" (UID: \"bd98c614-168e-496f-adb3-06763ea0075c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9" Apr 17 14:33:56.217710 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217555 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bd98c614-168e-496f-adb3-06763ea0075c-registration-dir\") pod \"aws-ebs-csi-driver-node-rr8s9\" (UID: \"bd98c614-168e-496f-adb3-06763ea0075c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9" Apr 17 14:33:56.217710 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217607 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4790e8ee-77ab-402e-a1df-7e728d62db98-system-cni-dir\") pod \"multus-additional-cni-plugins-lr2qq\" (UID: \"4790e8ee-77ab-402e-a1df-7e728d62db98\") " pod="openshift-multus/multus-additional-cni-plugins-lr2qq" Apr 17 14:33:56.217710 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217635 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4790e8ee-77ab-402e-a1df-7e728d62db98-cnibin\") pod \"multus-additional-cni-plugins-lr2qq\" (UID: \"4790e8ee-77ab-402e-a1df-7e728d62db98\") " pod="openshift-multus/multus-additional-cni-plugins-lr2qq" Apr 17 14:33:56.217710 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217658 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bd98c614-168e-496f-adb3-06763ea0075c-device-dir\") pod \"aws-ebs-csi-driver-node-rr8s9\" (UID: \"bd98c614-168e-496f-adb3-06763ea0075c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9" Apr 17 14:33:56.217710 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217678 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-lib-modules\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472" Apr 17 14:33:56.217710 ip-10-0-129-134 kubenswrapper[2570]: I0417 
14:33:56.217683 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bd98c614-168e-496f-adb3-06763ea0075c-sys-fs\") pod \"aws-ebs-csi-driver-node-rr8s9\" (UID: \"bd98c614-168e-496f-adb3-06763ea0075c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9" Apr 17 14:33:56.217710 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217706 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd98c614-168e-496f-adb3-06763ea0075c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rr8s9\" (UID: \"bd98c614-168e-496f-adb3-06763ea0075c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9" Apr 17 14:33:56.218506 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217705 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-host-var-lib-cni-bin\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.218506 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217728 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-etc-kubernetes\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472" Apr 17 14:33:56.218506 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217744 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4790e8ee-77ab-402e-a1df-7e728d62db98-cnibin\") pod \"multus-additional-cni-plugins-lr2qq\" (UID: \"4790e8ee-77ab-402e-a1df-7e728d62db98\") " pod="openshift-multus/multus-additional-cni-plugins-lr2qq" Apr 17 14:33:56.218506 ip-10-0-129-134 
kubenswrapper[2570]: I0417 14:33:56.217756 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-hostroot\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27"
Apr 17 14:33:56.218506 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217767 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4790e8ee-77ab-402e-a1df-7e728d62db98-system-cni-dir\") pod \"multus-additional-cni-plugins-lr2qq\" (UID: \"4790e8ee-77ab-402e-a1df-7e728d62db98\") " pod="openshift-multus/multus-additional-cni-plugins-lr2qq"
Apr 17 14:33:56.218506 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217766 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bd98c614-168e-496f-adb3-06763ea0075c-device-dir\") pod \"aws-ebs-csi-driver-node-rr8s9\" (UID: \"bd98c614-168e-496f-adb3-06763ea0075c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9"
Apr 17 14:33:56.218506 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217783 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4790e8ee-77ab-402e-a1df-7e728d62db98-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lr2qq\" (UID: \"4790e8ee-77ab-402e-a1df-7e728d62db98\") " pod="openshift-multus/multus-additional-cni-plugins-lr2qq"
Apr 17 14:33:56.218506 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217796 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-etc-kubernetes\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472"
Apr 17 14:33:56.218506 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217806 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-hostroot\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27"
Apr 17 14:33:56.218506 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217813 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-host-kubelet\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966"
Apr 17 14:33:56.218506 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217840 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-host-run-netns\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966"
Apr 17 14:33:56.218506 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217729 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bd98c614-168e-496f-adb3-06763ea0075c-sys-fs\") pod \"aws-ebs-csi-driver-node-rr8s9\" (UID: \"bd98c614-168e-496f-adb3-06763ea0075c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9"
Apr 17 14:33:56.218506 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217865 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-ovnkube-config\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966"
Apr 17 14:33:56.218506 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217901 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4790e8ee-77ab-402e-a1df-7e728d62db98-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lr2qq\" (UID: \"4790e8ee-77ab-402e-a1df-7e728d62db98\") " pod="openshift-multus/multus-additional-cni-plugins-lr2qq"
Apr 17 14:33:56.218506 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217938 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c66a6c5d-9d68-42d2-ac7d-1ef80b44ed2f-serviceca\") pod \"node-ca-pzwpd\" (UID: \"c66a6c5d-9d68-42d2-ac7d-1ef80b44ed2f\") " pod="openshift-image-registry/node-ca-pzwpd"
Apr 17 14:33:56.218506 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.217987 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jtjxg\" (UniqueName: \"kubernetes.io/projected/c66a6c5d-9d68-42d2-ac7d-1ef80b44ed2f-kube-api-access-jtjxg\") pod \"node-ca-pzwpd\" (UID: \"c66a6c5d-9d68-42d2-ac7d-1ef80b44ed2f\") " pod="openshift-image-registry/node-ca-pzwpd"
Apr 17 14:33:56.218506 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218017 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-etc-systemd\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472"
Apr 17 14:33:56.219313 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218041 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8964s\" (UniqueName: \"kubernetes.io/projected/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-kube-api-access-8964s\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27"
Apr 17 14:33:56.219313 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218066 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e599dbf6-a663-42a7-82bb-12ed438c2ba8-tmp-dir\") pod \"node-resolver-s9w25\" (UID: \"e599dbf6-a663-42a7-82bb-12ed438c2ba8\") " pod="openshift-dns/node-resolver-s9w25"
Apr 17 14:33:56.219313 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218091 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-etc-sysctl-conf\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472"
Apr 17 14:33:56.219313 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218116 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-os-release\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27"
Apr 17 14:33:56.219313 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218125 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/70bf124d-898a-4e10-aece-902d90ea13ac-konnectivity-ca\") pod \"konnectivity-agent-2pbpg\" (UID: \"70bf124d-898a-4e10-aece-902d90ea13ac\") " pod="kube-system/konnectivity-agent-2pbpg"
Apr 17 14:33:56.219313 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218141 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4790e8ee-77ab-402e-a1df-7e728d62db98-cni-binary-copy\") pod \"multus-additional-cni-plugins-lr2qq\" (UID: \"4790e8ee-77ab-402e-a1df-7e728d62db98\") " pod="openshift-multus/multus-additional-cni-plugins-lr2qq"
Apr 17 14:33:56.219313 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218166 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g94kh\" (UniqueName: \"kubernetes.io/projected/4790e8ee-77ab-402e-a1df-7e728d62db98-kube-api-access-g94kh\") pod \"multus-additional-cni-plugins-lr2qq\" (UID: \"4790e8ee-77ab-402e-a1df-7e728d62db98\") " pod="openshift-multus/multus-additional-cni-plugins-lr2qq"
Apr 17 14:33:56.219313 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218197 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-etc-systemd\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472"
Apr 17 14:33:56.219313 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218195 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/70bf124d-898a-4e10-aece-902d90ea13ac-agent-certs\") pod \"konnectivity-agent-2pbpg\" (UID: \"70bf124d-898a-4e10-aece-902d90ea13ac\") " pod="kube-system/konnectivity-agent-2pbpg"
Apr 17 14:33:56.219313 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218233 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-host-slash\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966"
Apr 17 14:33:56.219313 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218282 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-os-release\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27"
Apr 17 14:33:56.219313 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218335 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c66a6c5d-9d68-42d2-ac7d-1ef80b44ed2f-serviceca\") pod \"node-ca-pzwpd\" (UID: \"c66a6c5d-9d68-42d2-ac7d-1ef80b44ed2f\") " pod="openshift-image-registry/node-ca-pzwpd"
Apr 17 14:33:56.219313 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218285 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-run-ovn\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966"
Apr 17 14:33:56.219313 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218384 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bd98c614-168e-496f-adb3-06763ea0075c-etc-selinux\") pod \"aws-ebs-csi-driver-node-rr8s9\" (UID: \"bd98c614-168e-496f-adb3-06763ea0075c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9"
Apr 17 14:33:56.219313 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218413 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-var-lib-kubelet\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472"
Apr 17 14:33:56.219313 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218439 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-host-run-netns\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27"
Apr 17 14:33:56.219313 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218463 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-etc-sysctl-conf\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472"
Apr 17 14:33:56.220064 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218467 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-systemd-units\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966"
Apr 17 14:33:56.220064 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218510 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1610e8f2-d397-4d82-a851-2e17443a44e4-host-slash\") pod \"iptables-alerter-jglh7\" (UID: \"1610e8f2-d397-4d82-a851-2e17443a44e4\") " pod="openshift-network-operator/iptables-alerter-jglh7"
Apr 17 14:33:56.220064 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218535 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-cnibin\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27"
Apr 17 14:33:56.220064 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218548 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-var-lib-kubelet\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472"
Apr 17 14:33:56.220064 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218558 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-multus-socket-dir-parent\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27"
Apr 17 14:33:56.220064 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218569 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bd98c614-168e-496f-adb3-06763ea0075c-etc-selinux\") pod \"aws-ebs-csi-driver-node-rr8s9\" (UID: \"bd98c614-168e-496f-adb3-06763ea0075c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9"
Apr 17 14:33:56.220064 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218600 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-host-run-netns\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27"
Apr 17 14:33:56.220064 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218618 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-host-var-lib-kubelet\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27"
Apr 17 14:33:56.220064 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218635 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e599dbf6-a663-42a7-82bb-12ed438c2ba8-tmp-dir\") pod \"node-resolver-s9w25\" (UID: \"e599dbf6-a663-42a7-82bb-12ed438c2ba8\") " pod="openshift-dns/node-resolver-s9w25"
Apr 17 14:33:56.220064 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218646 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ml2t7\" (UniqueName: \"kubernetes.io/projected/e599dbf6-a663-42a7-82bb-12ed438c2ba8-kube-api-access-ml2t7\") pod \"node-resolver-s9w25\" (UID: \"e599dbf6-a663-42a7-82bb-12ed438c2ba8\") " pod="openshift-dns/node-resolver-s9w25"
Apr 17 14:33:56.220064 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218648 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-cnibin\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27"
Apr 17 14:33:56.220064 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218675 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966"
Apr 17 14:33:56.220064 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218685 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1610e8f2-d397-4d82-a851-2e17443a44e4-host-slash\") pod \"iptables-alerter-jglh7\" (UID: \"1610e8f2-d397-4d82-a851-2e17443a44e4\") " pod="openshift-network-operator/iptables-alerter-jglh7"
Apr 17 14:33:56.220064 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218702 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-host-var-lib-kubelet\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27"
Apr 17 14:33:56.220064 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218704 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5csw\" (UniqueName: \"kubernetes.io/projected/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-kube-api-access-t5csw\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966"
Apr 17 14:33:56.220064 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218734 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-multus-socket-dir-parent\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27"
Apr 17 14:33:56.220064 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218741 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-host\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472"
Apr 17 14:33:56.220064 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218768 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e6899357-3e39-480b-ab64-ec01da0d8a4f-etc-tuned\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472"
Apr 17 14:33:56.220795 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218795 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ns97l\" (UniqueName: \"kubernetes.io/projected/e6899357-3e39-480b-ab64-ec01da0d8a4f-kube-api-access-ns97l\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472"
Apr 17 14:33:56.220795 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218845 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-host-run-k8s-cni-cncf-io\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27"
Apr 17 14:33:56.220795 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218865 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-host\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472"
Apr 17 14:33:56.220795 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218870 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-host-run-multus-certs\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27"
Apr 17 14:33:56.220795 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218908 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4790e8ee-77ab-402e-a1df-7e728d62db98-cni-binary-copy\") pod \"multus-additional-cni-plugins-lr2qq\" (UID: \"4790e8ee-77ab-402e-a1df-7e728d62db98\") " pod="openshift-multus/multus-additional-cni-plugins-lr2qq"
Apr 17 14:33:56.220795 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218930 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-etc-openvswitch\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966"
Apr 17 14:33:56.220795 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.218974 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-run-openvswitch\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966"
Apr 17 14:33:56.220795 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.219003 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-host-run-ovn-kubernetes\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966"
Apr 17 14:33:56.220795 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.219030 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8p9fk\" (UniqueName: \"kubernetes.io/projected/bd98c614-168e-496f-adb3-06763ea0075c-kube-api-access-8p9fk\") pod \"aws-ebs-csi-driver-node-rr8s9\" (UID: \"bd98c614-168e-496f-adb3-06763ea0075c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9"
Apr 17 14:33:56.220795 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.219078 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-etc-modprobe-d\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472"
Apr 17 14:33:56.220795 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.219108 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4790e8ee-77ab-402e-a1df-7e728d62db98-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lr2qq\" (UID: \"4790e8ee-77ab-402e-a1df-7e728d62db98\") " pod="openshift-multus/multus-additional-cni-plugins-lr2qq"
Apr 17 14:33:56.220795 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.219126 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-host-run-multus-certs\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27"
Apr 17 14:33:56.220795 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.219156 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-node-log\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966"
Apr 17 14:33:56.220795 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.219045 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1610e8f2-d397-4d82-a851-2e17443a44e4-iptables-alerter-script\") pod \"iptables-alerter-jglh7\" (UID: \"1610e8f2-d397-4d82-a851-2e17443a44e4\") " pod="openshift-network-operator/iptables-alerter-jglh7"
Apr 17 14:33:56.220795 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.219192 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-host-cni-bin\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966"
Apr 17 14:33:56.220795 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.219112 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-host-run-k8s-cni-cncf-io\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27"
Apr 17 14:33:56.220795 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.219269 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-ovn-node-metrics-cert\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966"
Apr 17 14:33:56.221639 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.219301 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x52rp\" (UniqueName: \"kubernetes.io/projected/651b7208-91cf-42ee-b675-0a50ef1389f0-kube-api-access-x52rp\") pod \"network-check-target-2jb96\" (UID: \"651b7208-91cf-42ee-b675-0a50ef1389f0\") " pod="openshift-network-diagnostics/network-check-target-2jb96"
Apr 17 14:33:56.221639 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.219312 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-etc-modprobe-d\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472"
Apr 17 14:33:56.221639 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.219349 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-etc-sysctl-d\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472"
Apr 17 14:33:56.221639 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.219375 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-sys\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472"
Apr 17 14:33:56.221639 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.219426 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-cni-binary-copy\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27"
Apr 17 14:33:56.221639 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.219449 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-sys\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472"
Apr 17 14:33:56.221639 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.219453 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4790e8ee-77ab-402e-a1df-7e728d62db98-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lr2qq\" (UID: \"4790e8ee-77ab-402e-a1df-7e728d62db98\") " pod="openshift-multus/multus-additional-cni-plugins-lr2qq"
Apr 17 14:33:56.221639 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.219501 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs\") pod \"network-metrics-daemon-j6cgs\" (UID: \"4807a6e2-14df-4ba4-8aee-7422a65508f2\") " pod="openshift-multus/network-metrics-daemon-j6cgs"
Apr 17 14:33:56.221639 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.219525 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c66a6c5d-9d68-42d2-ac7d-1ef80b44ed2f-host\") pod \"node-ca-pzwpd\" (UID: \"c66a6c5d-9d68-42d2-ac7d-1ef80b44ed2f\") " pod="openshift-image-registry/node-ca-pzwpd"
Apr 17 14:33:56.221639 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.219585 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-etc-sysconfig\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472"
Apr 17 14:33:56.221639 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.219593 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-etc-sysctl-d\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472"
Apr 17 14:33:56.221639 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.219673 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4790e8ee-77ab-402e-a1df-7e728d62db98-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lr2qq\" (UID: \"4790e8ee-77ab-402e-a1df-7e728d62db98\") " pod="openshift-multus/multus-additional-cni-plugins-lr2qq"
Apr 17 14:33:56.221639 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.219678 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e6899357-3e39-480b-ab64-ec01da0d8a4f-etc-sysconfig\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472"
Apr 17 14:33:56.221639 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.219705 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c66a6c5d-9d68-42d2-ac7d-1ef80b44ed2f-host\") pod \"node-ca-pzwpd\" (UID: \"c66a6c5d-9d68-42d2-ac7d-1ef80b44ed2f\") " pod="openshift-image-registry/node-ca-pzwpd"
Apr 17 14:33:56.221639 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.220099 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-cni-binary-copy\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27"
Apr 17 14:33:56.221639 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.220128 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4790e8ee-77ab-402e-a1df-7e728d62db98-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lr2qq\" (UID: \"4790e8ee-77ab-402e-a1df-7e728d62db98\") " pod="openshift-multus/multus-additional-cni-plugins-lr2qq"
Apr 17 14:33:56.221639 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.220865 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e6899357-3e39-480b-ab64-ec01da0d8a4f-tmp\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472"
Apr 17 14:33:56.221639 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.221334 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/70bf124d-898a-4e10-aece-902d90ea13ac-agent-certs\") pod \"konnectivity-agent-2pbpg\" (UID: \"70bf124d-898a-4e10-aece-902d90ea13ac\") " pod="kube-system/konnectivity-agent-2pbpg"
Apr 17 14:33:56.222501 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.221844 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e6899357-3e39-480b-ab64-ec01da0d8a4f-etc-tuned\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472"
Apr 17 14:33:56.224855 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:56.224831 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 14:33:56.224855 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:56.224857 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 14:33:56.224855 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:56.224870 2570 projected.go:194] Error preparing data for projected volume kube-api-access-x52rp for pod openshift-network-diagnostics/network-check-target-2jb96: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:33:56.224855 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:56.224953 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/651b7208-91cf-42ee-b675-0a50ef1389f0-kube-api-access-x52rp podName:651b7208-91cf-42ee-b675-0a50ef1389f0 nodeName:}" failed. No retries permitted until 2026-04-17 14:33:56.724932355 +0000 UTC m=+3.081659978 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-x52rp" (UniqueName: "kubernetes.io/projected/651b7208-91cf-42ee-b675-0a50ef1389f0-kube-api-access-x52rp") pod "network-check-target-2jb96" (UID: "651b7208-91cf-42ee-b675-0a50ef1389f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:33:56.226298 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.226275 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld642\" (UniqueName: \"kubernetes.io/projected/1610e8f2-d397-4d82-a851-2e17443a44e4-kube-api-access-ld642\") pod \"iptables-alerter-jglh7\" (UID: \"1610e8f2-d397-4d82-a851-2e17443a44e4\") " pod="openshift-network-operator/iptables-alerter-jglh7"
Apr 17 14:33:56.227125 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.227101 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtjxg\" (UniqueName: \"kubernetes.io/projected/c66a6c5d-9d68-42d2-ac7d-1ef80b44ed2f-kube-api-access-jtjxg\") pod \"node-ca-pzwpd\" (UID: \"c66a6c5d-9d68-42d2-ac7d-1ef80b44ed2f\") " pod="openshift-image-registry/node-ca-pzwpd"
Apr 17 14:33:56.227560 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.227535 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g94kh\" (UniqueName: \"kubernetes.io/projected/4790e8ee-77ab-402e-a1df-7e728d62db98-kube-api-access-g94kh\") pod \"multus-additional-cni-plugins-lr2qq\" (UID: \"4790e8ee-77ab-402e-a1df-7e728d62db98\") " pod="openshift-multus/multus-additional-cni-plugins-lr2qq"
Apr 17 14:33:56.227704 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.227661 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8964s\" (UniqueName: \"kubernetes.io/projected/29dc8572-3cfc-4d9d-b915-1e8a137c2e00-kube-api-access-8964s\") pod \"multus-s7m27\" (UID: \"29dc8572-3cfc-4d9d-b915-1e8a137c2e00\") " pod="openshift-multus/multus-s7m27"
Apr 17 14:33:56.227984 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.227965 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p9fk\" (UniqueName: \"kubernetes.io/projected/bd98c614-168e-496f-adb3-06763ea0075c-kube-api-access-8p9fk\") pod \"aws-ebs-csi-driver-node-rr8s9\" (UID: \"bd98c614-168e-496f-adb3-06763ea0075c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9"
Apr 17 14:33:56.228524 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.228475 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml2t7\" (UniqueName: \"kubernetes.io/projected/e599dbf6-a663-42a7-82bb-12ed438c2ba8-kube-api-access-ml2t7\") pod \"node-resolver-s9w25\" (UID: \"e599dbf6-a663-42a7-82bb-12ed438c2ba8\") " pod="openshift-dns/node-resolver-s9w25"
Apr 17 14:33:56.228821 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.228793 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns97l\" (UniqueName: \"kubernetes.io/projected/e6899357-3e39-480b-ab64-ec01da0d8a4f-kube-api-access-ns97l\") pod \"tuned-tr472\" (UID: \"e6899357-3e39-480b-ab64-ec01da0d8a4f\") " pod="openshift-cluster-node-tuning-operator/tuned-tr472"
Apr 17 14:33:56.320449 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320412 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-env-overrides\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966"
Apr 17 14:33:56.320614 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320456 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-ovnkube-script-lib\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966"
Apr 17 14:33:56.320614 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320483 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-run-systemd\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966"
Apr 17 14:33:56.320614 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320513 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wnh6k\" (UniqueName: \"kubernetes.io/projected/4807a6e2-14df-4ba4-8aee-7422a65508f2-kube-api-access-wnh6k\") pod \"network-metrics-daemon-j6cgs\" (UID: \"4807a6e2-14df-4ba4-8aee-7422a65508f2\") " pod="openshift-multus/network-metrics-daemon-j6cgs"
Apr 17 14:33:56.320614 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320541 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-var-lib-openvswitch\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966"
Apr 17 14:33:56.320614 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320564 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-log-socket\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966"
Apr 17 14:33:56.320614 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320593 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for
volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-run-systemd\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.320890 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320599 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-host-kubelet\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.320890 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320641 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-host-kubelet\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.320890 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320660 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-host-run-netns\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.320890 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320681 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-var-lib-openvswitch\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.320890 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320689 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-ovnkube-config\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.320890 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320715 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-host-run-netns\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.320890 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320724 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-host-slash\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.320890 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320741 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-log-socket\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.320890 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320759 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-run-ovn\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.320890 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320786 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" 
(UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-systemd-units\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.320890 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320797 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-host-slash\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.320890 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320820 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.320890 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320839 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-run-ovn\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.320890 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320847 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5csw\" (UniqueName: \"kubernetes.io/projected/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-kube-api-access-t5csw\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.320890 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320867 2570 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-systemd-units\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.320890 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320875 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-etc-openvswitch\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.320890 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320876 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.321645 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320901 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-run-openvswitch\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.321645 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320914 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-etc-openvswitch\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.321645 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320939 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-run-openvswitch\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.321645 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320927 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-host-run-ovn-kubernetes\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.321645 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320976 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-env-overrides\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.321645 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.320977 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-node-log\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.321645 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.321018 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-node-log\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.321645 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.321019 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-host-cni-bin\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.321645 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.321053 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-host-cni-bin\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.321645 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.321057 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-ovn-node-metrics-cert\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.321645 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.321066 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-host-run-ovn-kubernetes\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.321645 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.321104 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs\") pod \"network-metrics-daemon-j6cgs\" (UID: \"4807a6e2-14df-4ba4-8aee-7422a65508f2\") " pod="openshift-multus/network-metrics-daemon-j6cgs" Apr 17 14:33:56.321645 ip-10-0-129-134 kubenswrapper[2570]: 
I0417 14:33:56.321134 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-host-cni-netd\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.321645 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:56.321170 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:33:56.321645 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.321193 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-host-cni-netd\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.321645 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.321160 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-ovnkube-script-lib\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.321645 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:56.321230 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs podName:4807a6e2-14df-4ba4-8aee-7422a65508f2 nodeName:}" failed. No retries permitted until 2026-04-17 14:33:56.821214892 +0000 UTC m=+3.177942496 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs") pod "network-metrics-daemon-j6cgs" (UID: "4807a6e2-14df-4ba4-8aee-7422a65508f2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:33:56.322353 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.321518 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-ovnkube-config\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.323460 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.323440 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-ovn-node-metrics-cert\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.329691 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.329667 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5csw\" (UniqueName: \"kubernetes.io/projected/0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c-kube-api-access-t5csw\") pod \"ovnkube-node-lr966\" (UID: \"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.330359 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.330333 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnh6k\" (UniqueName: \"kubernetes.io/projected/4807a6e2-14df-4ba4-8aee-7422a65508f2-kube-api-access-wnh6k\") pod \"network-metrics-daemon-j6cgs\" (UID: \"4807a6e2-14df-4ba4-8aee-7422a65508f2\") " pod="openshift-multus/network-metrics-daemon-j6cgs" Apr 17 14:33:56.409609 ip-10-0-129-134 kubenswrapper[2570]: I0417 
14:33:56.407574 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pzwpd" Apr 17 14:33:56.415932 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.415905 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9" Apr 17 14:33:56.424466 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.424447 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lr2qq" Apr 17 14:33:56.430051 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.430026 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-jglh7" Apr 17 14:33:56.435635 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.435615 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tr472" Apr 17 14:33:56.443184 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.443165 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-s9w25" Apr 17 14:33:56.449721 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.449702 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2pbpg" Apr 17 14:33:56.457248 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.457220 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-s7m27" Apr 17 14:33:56.462787 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.462772 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:33:56.813872 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:56.813840 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode599dbf6_a663_42a7_82bb_12ed438c2ba8.slice/crio-cd35d24581d36eb0c4fdbc86f2d724120a84874f3c042feaadf8e3c6e22afad4 WatchSource:0}: Error finding container cd35d24581d36eb0c4fdbc86f2d724120a84874f3c042feaadf8e3c6e22afad4: Status 404 returned error can't find the container with id cd35d24581d36eb0c4fdbc86f2d724120a84874f3c042feaadf8e3c6e22afad4 Apr 17 14:33:56.815218 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:56.815155 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6899357_3e39_480b_ab64_ec01da0d8a4f.slice/crio-d6f4c8bbe0c7e3324ff79cb84e9befa27f680fbf9a996e497847709c4343ec4b WatchSource:0}: Error finding container d6f4c8bbe0c7e3324ff79cb84e9befa27f680fbf9a996e497847709c4343ec4b: Status 404 returned error can't find the container with id d6f4c8bbe0c7e3324ff79cb84e9befa27f680fbf9a996e497847709c4343ec4b Apr 17 14:33:56.816054 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:56.816010 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70bf124d_898a_4e10_aece_902d90ea13ac.slice/crio-368c5dcd23679b8ba36f5f5d931d569147017b6d2b04618dd065eaff501ffdd9 WatchSource:0}: Error finding container 368c5dcd23679b8ba36f5f5d931d569147017b6d2b04618dd065eaff501ffdd9: Status 404 returned error can't find the container with id 368c5dcd23679b8ba36f5f5d931d569147017b6d2b04618dd065eaff501ffdd9 Apr 17 14:33:56.816675 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:56.816612 2570 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29dc8572_3cfc_4d9d_b915_1e8a137c2e00.slice/crio-e42180ccd0c89618402cfe773dd0f4005c60eb5df37288e91f961e422485fa0a WatchSource:0}: Error finding container e42180ccd0c89618402cfe773dd0f4005c60eb5df37288e91f961e422485fa0a: Status 404 returned error can't find the container with id e42180ccd0c89618402cfe773dd0f4005c60eb5df37288e91f961e422485fa0a Apr 17 14:33:56.818947 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:56.818926 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc66a6c5d_9d68_42d2_ac7d_1ef80b44ed2f.slice/crio-092683440c23d240753818b13733d250aca30bfc11d1069a7af3b02a1338b9fb WatchSource:0}: Error finding container 092683440c23d240753818b13733d250aca30bfc11d1069a7af3b02a1338b9fb: Status 404 returned error can't find the container with id 092683440c23d240753818b13733d250aca30bfc11d1069a7af3b02a1338b9fb Apr 17 14:33:56.820172 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:56.820152 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd98c614_168e_496f_adb3_06763ea0075c.slice/crio-0183f67d929784baf866ded45142d324d439a7f45d965bef61c3c064880c61d0 WatchSource:0}: Error finding container 0183f67d929784baf866ded45142d324d439a7f45d965bef61c3c064880c61d0: Status 404 returned error can't find the container with id 0183f67d929784baf866ded45142d324d439a7f45d965bef61c3c064880c61d0 Apr 17 14:33:56.821208 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:56.821168 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1610e8f2_d397_4d82_a851_2e17443a44e4.slice/crio-de0bcac0394787da38a47512f4f9e31f44e7ae9323cf188de99acf2eb7f4c281 WatchSource:0}: Error finding container de0bcac0394787da38a47512f4f9e31f44e7ae9323cf188de99acf2eb7f4c281: Status 404 returned error can't find 
the container with id de0bcac0394787da38a47512f4f9e31f44e7ae9323cf188de99acf2eb7f4c281 Apr 17 14:33:56.823516 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:56.823038 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4790e8ee_77ab_402e_a1df_7e728d62db98.slice/crio-5b1b4b097b4b79daa3e96e7de8c4c56c5d5baf248252978784e67289ade6f509 WatchSource:0}: Error finding container 5b1b4b097b4b79daa3e96e7de8c4c56c5d5baf248252978784e67289ade6f509: Status 404 returned error can't find the container with id 5b1b4b097b4b79daa3e96e7de8c4c56c5d5baf248252978784e67289ade6f509 Apr 17 14:33:56.823516 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:33:56.823346 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cf20fd9_1bae_45f5_af6f_5f39f00c8f3c.slice/crio-d152b57f2f67ff3664da7cb86d377b25f007cb449d52d653be1676968b621fcc WatchSource:0}: Error finding container d152b57f2f67ff3664da7cb86d377b25f007cb449d52d653be1676968b621fcc: Status 404 returned error can't find the container with id d152b57f2f67ff3664da7cb86d377b25f007cb449d52d653be1676968b621fcc Apr 17 14:33:56.823967 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.823941 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x52rp\" (UniqueName: \"kubernetes.io/projected/651b7208-91cf-42ee-b675-0a50ef1389f0-kube-api-access-x52rp\") pod \"network-check-target-2jb96\" (UID: \"651b7208-91cf-42ee-b675-0a50ef1389f0\") " pod="openshift-network-diagnostics/network-check-target-2jb96" Apr 17 14:33:56.824019 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:56.823983 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs\") pod \"network-metrics-daemon-j6cgs\" (UID: \"4807a6e2-14df-4ba4-8aee-7422a65508f2\") " 
pod="openshift-multus/network-metrics-daemon-j6cgs" Apr 17 14:33:56.824352 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:56.824088 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:33:56.824352 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:56.824104 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:33:56.824352 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:56.824118 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:33:56.824352 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:56.824130 2570 projected.go:194] Error preparing data for projected volume kube-api-access-x52rp for pod openshift-network-diagnostics/network-check-target-2jb96: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:33:56.824352 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:56.824138 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs podName:4807a6e2-14df-4ba4-8aee-7422a65508f2 nodeName:}" failed. No retries permitted until 2026-04-17 14:33:57.824120971 +0000 UTC m=+4.180848594 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs") pod "network-metrics-daemon-j6cgs" (UID: "4807a6e2-14df-4ba4-8aee-7422a65508f2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:33:56.824352 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:56.824165 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/651b7208-91cf-42ee-b675-0a50ef1389f0-kube-api-access-x52rp podName:651b7208-91cf-42ee-b675-0a50ef1389f0 nodeName:}" failed. No retries permitted until 2026-04-17 14:33:57.824153265 +0000 UTC m=+4.180880881 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-x52rp" (UniqueName: "kubernetes.io/projected/651b7208-91cf-42ee-b675-0a50ef1389f0-kube-api-access-x52rp") pod "network-check-target-2jb96" (UID: "651b7208-91cf-42ee-b675-0a50ef1389f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:33:57.148119 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:57.147734 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 14:28:55 +0000 UTC" deadline="2027-09-10 15:06:29.867941317 +0000 UTC"
Apr 17 14:33:57.148119 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:57.148010 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12264h32m32.719941513s"
Apr 17 14:33:57.237341 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:57.236570 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6cgs"
Apr 17 14:33:57.237341 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:57.236705 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6cgs" podUID="4807a6e2-14df-4ba4-8aee-7422a65508f2"
Apr 17 14:33:57.258635 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:57.258545 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-134.ec2.internal" event={"ID":"6856fb5778dd8b62676b041b83b334bb","Type":"ContainerStarted","Data":"ac2ff7230e8fd8b1ca193d8bb116b540e109d78f52fc78b7b88ffcc7e15c24ec"}
Apr 17 14:33:57.271251 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:57.270937 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-134.ec2.internal" podStartSLOduration=2.270918736 podStartE2EDuration="2.270918736s" podCreationTimestamp="2026-04-17 14:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:33:57.270808061 +0000 UTC m=+3.627535687" watchObservedRunningTime="2026-04-17 14:33:57.270918736 +0000 UTC m=+3.627646342"
Apr 17 14:33:57.279028 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:57.278995 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9" event={"ID":"bd98c614-168e-496f-adb3-06763ea0075c","Type":"ContainerStarted","Data":"0183f67d929784baf866ded45142d324d439a7f45d965bef61c3c064880c61d0"}
Apr 17 14:33:57.280474 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:57.280441 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lr2qq" event={"ID":"4790e8ee-77ab-402e-a1df-7e728d62db98","Type":"ContainerStarted","Data":"5b1b4b097b4b79daa3e96e7de8c4c56c5d5baf248252978784e67289ade6f509"}
Apr 17 14:33:57.282216 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:57.282171 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jglh7" event={"ID":"1610e8f2-d397-4d82-a851-2e17443a44e4","Type":"ContainerStarted","Data":"de0bcac0394787da38a47512f4f9e31f44e7ae9323cf188de99acf2eb7f4c281"}
Apr 17 14:33:57.284930 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:57.284907 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s7m27" event={"ID":"29dc8572-3cfc-4d9d-b915-1e8a137c2e00","Type":"ContainerStarted","Data":"e42180ccd0c89618402cfe773dd0f4005c60eb5df37288e91f961e422485fa0a"}
Apr 17 14:33:57.292877 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:57.292828 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s9w25" event={"ID":"e599dbf6-a663-42a7-82bb-12ed438c2ba8","Type":"ContainerStarted","Data":"cd35d24581d36eb0c4fdbc86f2d724120a84874f3c042feaadf8e3c6e22afad4"}
Apr 17 14:33:57.295650 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:57.294992 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lr966" event={"ID":"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c","Type":"ContainerStarted","Data":"d152b57f2f67ff3664da7cb86d377b25f007cb449d52d653be1676968b621fcc"}
Apr 17 14:33:57.299748 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:57.297778 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pzwpd" event={"ID":"c66a6c5d-9d68-42d2-ac7d-1ef80b44ed2f","Type":"ContainerStarted","Data":"092683440c23d240753818b13733d250aca30bfc11d1069a7af3b02a1338b9fb"}
Apr 17 14:33:57.302272 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:57.302231 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2pbpg" event={"ID":"70bf124d-898a-4e10-aece-902d90ea13ac","Type":"ContainerStarted","Data":"368c5dcd23679b8ba36f5f5d931d569147017b6d2b04618dd065eaff501ffdd9"}
Apr 17 14:33:57.312508 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:57.312418 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tr472" event={"ID":"e6899357-3e39-480b-ab64-ec01da0d8a4f","Type":"ContainerStarted","Data":"d6f4c8bbe0c7e3324ff79cb84e9befa27f680fbf9a996e497847709c4343ec4b"}
Apr 17 14:33:57.844073 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:57.842318 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x52rp\" (UniqueName: \"kubernetes.io/projected/651b7208-91cf-42ee-b675-0a50ef1389f0-kube-api-access-x52rp\") pod \"network-check-target-2jb96\" (UID: \"651b7208-91cf-42ee-b675-0a50ef1389f0\") " pod="openshift-network-diagnostics/network-check-target-2jb96"
Apr 17 14:33:57.844073 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:57.842368 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs\") pod \"network-metrics-daemon-j6cgs\" (UID: \"4807a6e2-14df-4ba4-8aee-7422a65508f2\") " pod="openshift-multus/network-metrics-daemon-j6cgs"
Apr 17 14:33:57.844073 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:57.842520 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:33:57.844073 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:57.842581 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs podName:4807a6e2-14df-4ba4-8aee-7422a65508f2 nodeName:}" failed. No retries permitted until 2026-04-17 14:33:59.842562834 +0000 UTC m=+6.199290443 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs") pod "network-metrics-daemon-j6cgs" (UID: "4807a6e2-14df-4ba4-8aee-7422a65508f2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:33:57.844073 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:57.842980 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 14:33:57.844073 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:57.843001 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 14:33:57.844073 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:57.843014 2570 projected.go:194] Error preparing data for projected volume kube-api-access-x52rp for pod openshift-network-diagnostics/network-check-target-2jb96: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:33:57.844073 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:57.843057 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/651b7208-91cf-42ee-b675-0a50ef1389f0-kube-api-access-x52rp podName:651b7208-91cf-42ee-b675-0a50ef1389f0 nodeName:}" failed. No retries permitted until 2026-04-17 14:33:59.84304334 +0000 UTC m=+6.199770947 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-x52rp" (UniqueName: "kubernetes.io/projected/651b7208-91cf-42ee-b675-0a50ef1389f0-kube-api-access-x52rp") pod "network-check-target-2jb96" (UID: "651b7208-91cf-42ee-b675-0a50ef1389f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:33:58.235554 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:58.235452 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jb96"
Apr 17 14:33:58.235984 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:58.235585 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jb96" podUID="651b7208-91cf-42ee-b675-0a50ef1389f0"
Apr 17 14:33:58.322418 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:58.322379 2570 generic.go:358] "Generic (PLEG): container finished" podID="53a7243a4603a87d58406d96759acbb6" containerID="4c83bbce975bf28d68e818158bca5ac714560a08b60c7697159ddba74fe116d2" exitCode=0
Apr 17 14:33:58.323543 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:58.323294 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-134.ec2.internal" event={"ID":"53a7243a4603a87d58406d96759acbb6","Type":"ContainerDied","Data":"4c83bbce975bf28d68e818158bca5ac714560a08b60c7697159ddba74fe116d2"}
Apr 17 14:33:59.235278 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:59.235201 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6cgs"
Apr 17 14:33:59.235482 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:59.235385 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6cgs" podUID="4807a6e2-14df-4ba4-8aee-7422a65508f2"
Apr 17 14:33:59.338444 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:59.337937 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-134.ec2.internal" event={"ID":"53a7243a4603a87d58406d96759acbb6","Type":"ContainerStarted","Data":"6ebd67361e252b15b0ba3b45add44dd3123e103e54d5b917b53903815691a738"}
Apr 17 14:33:59.351292 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:59.350649 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-134.ec2.internal" podStartSLOduration=4.350632941 podStartE2EDuration="4.350632941s" podCreationTimestamp="2026-04-17 14:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:33:59.350191488 +0000 UTC m=+5.706919114" watchObservedRunningTime="2026-04-17 14:33:59.350632941 +0000 UTC m=+5.707360567"
Apr 17 14:33:59.865131 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:59.865073 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x52rp\" (UniqueName: \"kubernetes.io/projected/651b7208-91cf-42ee-b675-0a50ef1389f0-kube-api-access-x52rp\") pod \"network-check-target-2jb96\" (UID: \"651b7208-91cf-42ee-b675-0a50ef1389f0\") " pod="openshift-network-diagnostics/network-check-target-2jb96"
Apr 17 14:33:59.865131 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:33:59.865125 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs\") pod \"network-metrics-daemon-j6cgs\" (UID: \"4807a6e2-14df-4ba4-8aee-7422a65508f2\") " pod="openshift-multus/network-metrics-daemon-j6cgs"
Apr 17 14:33:59.865426 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:59.865260 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:33:59.865426 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:59.865347 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs podName:4807a6e2-14df-4ba4-8aee-7422a65508f2 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:03.865326508 +0000 UTC m=+10.222054116 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs") pod "network-metrics-daemon-j6cgs" (UID: "4807a6e2-14df-4ba4-8aee-7422a65508f2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:33:59.865426 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:59.865259 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 14:33:59.865426 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:59.865375 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 14:33:59.865426 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:59.865391 2570 projected.go:194] Error preparing data for projected volume kube-api-access-x52rp for pod openshift-network-diagnostics/network-check-target-2jb96: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:33:59.865769 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:33:59.865444 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/651b7208-91cf-42ee-b675-0a50ef1389f0-kube-api-access-x52rp podName:651b7208-91cf-42ee-b675-0a50ef1389f0 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:03.865427629 +0000 UTC m=+10.222155246 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-x52rp" (UniqueName: "kubernetes.io/projected/651b7208-91cf-42ee-b675-0a50ef1389f0-kube-api-access-x52rp") pod "network-check-target-2jb96" (UID: "651b7208-91cf-42ee-b675-0a50ef1389f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:34:00.234745 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:00.234658 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jb96"
Apr 17 14:34:00.234904 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:00.234777 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jb96" podUID="651b7208-91cf-42ee-b675-0a50ef1389f0"
Apr 17 14:34:01.235111 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:01.235070 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6cgs"
Apr 17 14:34:01.235601 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:01.235232 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6cgs" podUID="4807a6e2-14df-4ba4-8aee-7422a65508f2"
Apr 17 14:34:02.235266 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:02.235076 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jb96"
Apr 17 14:34:02.235266 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:02.235195 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jb96" podUID="651b7208-91cf-42ee-b675-0a50ef1389f0"
Apr 17 14:34:03.235321 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:03.235159 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6cgs"
Apr 17 14:34:03.235819 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:03.235342 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6cgs" podUID="4807a6e2-14df-4ba4-8aee-7422a65508f2"
Apr 17 14:34:03.899856 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:03.898917 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x52rp\" (UniqueName: \"kubernetes.io/projected/651b7208-91cf-42ee-b675-0a50ef1389f0-kube-api-access-x52rp\") pod \"network-check-target-2jb96\" (UID: \"651b7208-91cf-42ee-b675-0a50ef1389f0\") " pod="openshift-network-diagnostics/network-check-target-2jb96"
Apr 17 14:34:03.899856 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:03.898975 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs\") pod \"network-metrics-daemon-j6cgs\" (UID: \"4807a6e2-14df-4ba4-8aee-7422a65508f2\") " pod="openshift-multus/network-metrics-daemon-j6cgs"
Apr 17 14:34:03.899856 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:03.899152 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:34:03.899856 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:03.899217 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs podName:4807a6e2-14df-4ba4-8aee-7422a65508f2 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:11.899197209 +0000 UTC m=+18.255924875 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs") pod "network-metrics-daemon-j6cgs" (UID: "4807a6e2-14df-4ba4-8aee-7422a65508f2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:34:03.899856 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:03.899696 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 14:34:03.899856 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:03.899714 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 14:34:03.899856 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:03.899727 2570 projected.go:194] Error preparing data for projected volume kube-api-access-x52rp for pod openshift-network-diagnostics/network-check-target-2jb96: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:34:03.899856 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:03.899813 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/651b7208-91cf-42ee-b675-0a50ef1389f0-kube-api-access-x52rp podName:651b7208-91cf-42ee-b675-0a50ef1389f0 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:11.89979755 +0000 UTC m=+18.256525176 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-x52rp" (UniqueName: "kubernetes.io/projected/651b7208-91cf-42ee-b675-0a50ef1389f0-kube-api-access-x52rp") pod "network-check-target-2jb96" (UID: "651b7208-91cf-42ee-b675-0a50ef1389f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:34:04.236696 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:04.236146 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jb96"
Apr 17 14:34:04.236696 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:04.236276 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jb96" podUID="651b7208-91cf-42ee-b675-0a50ef1389f0"
Apr 17 14:34:05.235446 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:05.235342 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6cgs"
Apr 17 14:34:05.235616 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:05.235508 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6cgs" podUID="4807a6e2-14df-4ba4-8aee-7422a65508f2"
Apr 17 14:34:06.235364 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:06.235327 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jb96"
Apr 17 14:34:06.235820 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:06.235453 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jb96" podUID="651b7208-91cf-42ee-b675-0a50ef1389f0"
Apr 17 14:34:07.234937 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:07.234900 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6cgs"
Apr 17 14:34:07.235110 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:07.235030 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6cgs" podUID="4807a6e2-14df-4ba4-8aee-7422a65508f2"
Apr 17 14:34:07.917075 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:07.917039 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-thst5"]
Apr 17 14:34:07.919990 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:07.919963 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-thst5"
Apr 17 14:34:07.920114 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:07.920048 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-thst5" podUID="2404238c-49db-4bc3-b328-96679c365761"
Apr 17 14:34:08.027891 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:08.027859 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2404238c-49db-4bc3-b328-96679c365761-dbus\") pod \"global-pull-secret-syncer-thst5\" (UID: \"2404238c-49db-4bc3-b328-96679c365761\") " pod="kube-system/global-pull-secret-syncer-thst5"
Apr 17 14:34:08.028081 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:08.027908 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2404238c-49db-4bc3-b328-96679c365761-original-pull-secret\") pod \"global-pull-secret-syncer-thst5\" (UID: \"2404238c-49db-4bc3-b328-96679c365761\") " pod="kube-system/global-pull-secret-syncer-thst5"
Apr 17 14:34:08.028081 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:08.028023 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2404238c-49db-4bc3-b328-96679c365761-kubelet-config\") pod \"global-pull-secret-syncer-thst5\" (UID: \"2404238c-49db-4bc3-b328-96679c365761\") " pod="kube-system/global-pull-secret-syncer-thst5"
Apr 17 14:34:08.128929 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:08.128900 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2404238c-49db-4bc3-b328-96679c365761-original-pull-secret\") pod \"global-pull-secret-syncer-thst5\" (UID: \"2404238c-49db-4bc3-b328-96679c365761\") " pod="kube-system/global-pull-secret-syncer-thst5"
Apr 17 14:34:08.129118 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:08.128988 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2404238c-49db-4bc3-b328-96679c365761-kubelet-config\") pod \"global-pull-secret-syncer-thst5\" (UID: \"2404238c-49db-4bc3-b328-96679c365761\") " pod="kube-system/global-pull-secret-syncer-thst5"
Apr 17 14:34:08.129118 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:08.129014 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2404238c-49db-4bc3-b328-96679c365761-dbus\") pod \"global-pull-secret-syncer-thst5\" (UID: \"2404238c-49db-4bc3-b328-96679c365761\") " pod="kube-system/global-pull-secret-syncer-thst5"
Apr 17 14:34:08.129118 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:08.129043 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 14:34:08.129118 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:08.129101 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2404238c-49db-4bc3-b328-96679c365761-original-pull-secret podName:2404238c-49db-4bc3-b328-96679c365761 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:08.629085393 +0000 UTC m=+14.985813011 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2404238c-49db-4bc3-b328-96679c365761-original-pull-secret") pod "global-pull-secret-syncer-thst5" (UID: "2404238c-49db-4bc3-b328-96679c365761") : object "kube-system"/"original-pull-secret" not registered
Apr 17 14:34:08.129335 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:08.129134 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2404238c-49db-4bc3-b328-96679c365761-kubelet-config\") pod \"global-pull-secret-syncer-thst5\" (UID: \"2404238c-49db-4bc3-b328-96679c365761\") " pod="kube-system/global-pull-secret-syncer-thst5"
Apr 17 14:34:08.129335 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:08.129176 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2404238c-49db-4bc3-b328-96679c365761-dbus\") pod \"global-pull-secret-syncer-thst5\" (UID: \"2404238c-49db-4bc3-b328-96679c365761\") " pod="kube-system/global-pull-secret-syncer-thst5"
Apr 17 14:34:08.234408 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:08.234328 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jb96"
Apr 17 14:34:08.234574 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:08.234440 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jb96" podUID="651b7208-91cf-42ee-b675-0a50ef1389f0"
Apr 17 14:34:08.631884 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:08.631799 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2404238c-49db-4bc3-b328-96679c365761-original-pull-secret\") pod \"global-pull-secret-syncer-thst5\" (UID: \"2404238c-49db-4bc3-b328-96679c365761\") " pod="kube-system/global-pull-secret-syncer-thst5"
Apr 17 14:34:08.632056 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:08.631947 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 14:34:08.632056 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:08.632014 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2404238c-49db-4bc3-b328-96679c365761-original-pull-secret podName:2404238c-49db-4bc3-b328-96679c365761 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:09.63199837 +0000 UTC m=+15.988725994 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2404238c-49db-4bc3-b328-96679c365761-original-pull-secret") pod "global-pull-secret-syncer-thst5" (UID: "2404238c-49db-4bc3-b328-96679c365761") : object "kube-system"/"original-pull-secret" not registered
Apr 17 14:34:09.235133 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:09.235101 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-thst5"
Apr 17 14:34:09.235558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:09.235099 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6cgs"
Apr 17 14:34:09.235558 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:09.235218 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-thst5" podUID="2404238c-49db-4bc3-b328-96679c365761"
Apr 17 14:34:09.235558 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:09.235347 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6cgs" podUID="4807a6e2-14df-4ba4-8aee-7422a65508f2"
Apr 17 14:34:09.642124 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:09.642087 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2404238c-49db-4bc3-b328-96679c365761-original-pull-secret\") pod \"global-pull-secret-syncer-thst5\" (UID: \"2404238c-49db-4bc3-b328-96679c365761\") " pod="kube-system/global-pull-secret-syncer-thst5"
Apr 17 14:34:09.642324 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:09.642274 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 14:34:09.642385 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:09.642356 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2404238c-49db-4bc3-b328-96679c365761-original-pull-secret podName:2404238c-49db-4bc3-b328-96679c365761 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:11.642335113 +0000 UTC m=+17.999062718 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2404238c-49db-4bc3-b328-96679c365761-original-pull-secret") pod "global-pull-secret-syncer-thst5" (UID: "2404238c-49db-4bc3-b328-96679c365761") : object "kube-system"/"original-pull-secret" not registered
Apr 17 14:34:10.235360 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:10.235322 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jb96"
Apr 17 14:34:10.235815 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:10.235478 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jb96" podUID="651b7208-91cf-42ee-b675-0a50ef1389f0"
Apr 17 14:34:11.234908 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:11.234870 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-thst5"
Apr 17 14:34:11.235072 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:11.235048 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-thst5" podUID="2404238c-49db-4bc3-b328-96679c365761"
Apr 17 14:34:11.235072 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:11.235055 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6cgs"
Apr 17 14:34:11.235177 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:11.235154 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6cgs" podUID="4807a6e2-14df-4ba4-8aee-7422a65508f2"
Apr 17 14:34:11.659231 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:11.659143 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2404238c-49db-4bc3-b328-96679c365761-original-pull-secret\") pod \"global-pull-secret-syncer-thst5\" (UID: \"2404238c-49db-4bc3-b328-96679c365761\") " pod="kube-system/global-pull-secret-syncer-thst5"
Apr 17 14:34:11.659589 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:11.659291 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 14:34:11.659589 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:11.659363 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2404238c-49db-4bc3-b328-96679c365761-original-pull-secret podName:2404238c-49db-4bc3-b328-96679c365761 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:15.659346793 +0000 UTC m=+22.016074396 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2404238c-49db-4bc3-b328-96679c365761-original-pull-secret") pod "global-pull-secret-syncer-thst5" (UID: "2404238c-49db-4bc3-b328-96679c365761") : object "kube-system"/"original-pull-secret" not registered
Apr 17 14:34:11.960987 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:11.960898 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x52rp\" (UniqueName: \"kubernetes.io/projected/651b7208-91cf-42ee-b675-0a50ef1389f0-kube-api-access-x52rp\") pod \"network-check-target-2jb96\" (UID: \"651b7208-91cf-42ee-b675-0a50ef1389f0\") " pod="openshift-network-diagnostics/network-check-target-2jb96"
Apr 17 14:34:11.960987 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:11.960942 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs\") pod \"network-metrics-daemon-j6cgs\" (UID: \"4807a6e2-14df-4ba4-8aee-7422a65508f2\") " pod="openshift-multus/network-metrics-daemon-j6cgs"
Apr 17 14:34:11.961207 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:11.961070 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:34:11.961207 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:11.961086 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 14:34:11.961207 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:11.961115 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 14:34:11.961207 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:11.961126 2570 projected.go:194]
Error preparing data for projected volume kube-api-access-x52rp for pod openshift-network-diagnostics/network-check-target-2jb96: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:34:11.961207 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:11.961139 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs podName:4807a6e2-14df-4ba4-8aee-7422a65508f2 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:27.961119182 +0000 UTC m=+34.317846799 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs") pod "network-metrics-daemon-j6cgs" (UID: "4807a6e2-14df-4ba4-8aee-7422a65508f2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:34:11.961207 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:11.961171 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/651b7208-91cf-42ee-b675-0a50ef1389f0-kube-api-access-x52rp podName:651b7208-91cf-42ee-b675-0a50ef1389f0 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:27.961155248 +0000 UTC m=+34.317882850 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-x52rp" (UniqueName: "kubernetes.io/projected/651b7208-91cf-42ee-b675-0a50ef1389f0-kube-api-access-x52rp") pod "network-check-target-2jb96" (UID: "651b7208-91cf-42ee-b675-0a50ef1389f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:34:12.235292 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:12.235192 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jb96" Apr 17 14:34:12.235443 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:12.235337 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jb96" podUID="651b7208-91cf-42ee-b675-0a50ef1389f0" Apr 17 14:34:13.234412 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:13.234378 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6cgs" Apr 17 14:34:13.234412 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:13.234413 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-thst5" Apr 17 14:34:13.234903 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:13.234495 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6cgs" podUID="4807a6e2-14df-4ba4-8aee-7422a65508f2" Apr 17 14:34:13.234903 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:13.234629 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-thst5" podUID="2404238c-49db-4bc3-b328-96679c365761" Apr 17 14:34:14.235341 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:14.235015 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jb96" Apr 17 14:34:14.236122 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:14.235405 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jb96" podUID="651b7208-91cf-42ee-b675-0a50ef1389f0" Apr 17 14:34:14.365744 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:14.365714 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pzwpd" event={"ID":"c66a6c5d-9d68-42d2-ac7d-1ef80b44ed2f","Type":"ContainerStarted","Data":"86cf4c0a107fbeb81d578475b3e85b7a84e1d79614c4539a20aa5d80355dd00f"} Apr 17 14:34:14.367121 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:14.367095 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2pbpg" event={"ID":"70bf124d-898a-4e10-aece-902d90ea13ac","Type":"ContainerStarted","Data":"31e724f41e0b2d2d3cd5a07d18e05af14b577e0a168b039b2175cf42e9671dbb"} Apr 17 14:34:14.368504 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:14.368479 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tr472" event={"ID":"e6899357-3e39-480b-ab64-ec01da0d8a4f","Type":"ContainerStarted","Data":"0923ac5af23d0b8c0c4c73183d8aa81f3ac5e4911ba8985f01cfcb92d606c739"} Apr 17 14:34:14.369751 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:14.369723 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9" event={"ID":"bd98c614-168e-496f-adb3-06763ea0075c","Type":"ContainerStarted","Data":"0a4cf6ce8cfaf6ba63c65e53b20de287363824aeead1e8be9c3b2876e509d012"} Apr 17 14:34:14.373296 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:14.373271 2570 generic.go:358] "Generic (PLEG): container finished" podID="4790e8ee-77ab-402e-a1df-7e728d62db98" containerID="46212fdce818aeca4e2b663cac13475146b6ab59febe9fb28e569de6db2316e6" exitCode=0 Apr 17 14:34:14.373391 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:14.373340 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lr2qq" event={"ID":"4790e8ee-77ab-402e-a1df-7e728d62db98","Type":"ContainerDied","Data":"46212fdce818aeca4e2b663cac13475146b6ab59febe9fb28e569de6db2316e6"} Apr 17 14:34:14.375219 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:14.375175 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s7m27" event={"ID":"29dc8572-3cfc-4d9d-b915-1e8a137c2e00","Type":"ContainerStarted","Data":"c5499bd364f8a431da99220fe8b4ca84aeadf4dd916f25d226f2f82e9e6f2cab"} Apr 17 14:34:14.376668 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:14.376645 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s9w25" event={"ID":"e599dbf6-a663-42a7-82bb-12ed438c2ba8","Type":"ContainerStarted","Data":"cc3abea9571ed3f68c3f02fbf02a4bf4081dbe5418fafd6ca921413b408f6b42"} Apr 17 14:34:14.379205 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:14.379186 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lr966_0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c/ovn-acl-logging/0.log" Apr 17 14:34:14.379561 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:14.379541 2570 generic.go:358] "Generic (PLEG): container finished" podID="0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c" 
containerID="a6d115f72b54e021147cccf01e4d774e2a3d95d92572dce5ab464e9c1b3dba45" exitCode=1 Apr 17 14:34:14.379648 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:14.379569 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lr966" event={"ID":"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c","Type":"ContainerStarted","Data":"e6f7a95bb24f51ec655e2af031a9dc79aa8e31273e06a50a98153a72055777c5"} Apr 17 14:34:14.379648 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:14.379584 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lr966" event={"ID":"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c","Type":"ContainerStarted","Data":"15e71c6d28edc014b4127472023ab3588850f6dc29a884e48d12eaf9b0ff3790"} Apr 17 14:34:14.379648 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:14.379593 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lr966" event={"ID":"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c","Type":"ContainerStarted","Data":"e53a99f5589765a6e9ef103bade254593b64a3f364ed6b739db57081468552bd"} Apr 17 14:34:14.379648 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:14.379607 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lr966" event={"ID":"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c","Type":"ContainerDied","Data":"a6d115f72b54e021147cccf01e4d774e2a3d95d92572dce5ab464e9c1b3dba45"} Apr 17 14:34:14.379648 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:14.379620 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lr966" event={"ID":"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c","Type":"ContainerStarted","Data":"c8505787652fef0776f1a4e9bfeaf9ad9344c405831bf41d0afe9ffcacc4098a"} Apr 17 14:34:14.391355 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:14.391299 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pzwpd" 
podStartSLOduration=3.692555667 podStartE2EDuration="20.391283828s" podCreationTimestamp="2026-04-17 14:33:54 +0000 UTC" firstStartedPulling="2026-04-17 14:33:56.820765666 +0000 UTC m=+3.177493281" lastFinishedPulling="2026-04-17 14:34:13.519493822 +0000 UTC m=+19.876221442" observedRunningTime="2026-04-17 14:34:14.39077606 +0000 UTC m=+20.747503686" watchObservedRunningTime="2026-04-17 14:34:14.391283828 +0000 UTC m=+20.748011467" Apr 17 14:34:14.406491 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:14.406437 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-tr472" podStartSLOduration=3.7026481589999998 podStartE2EDuration="20.406420357s" podCreationTimestamp="2026-04-17 14:33:54 +0000 UTC" firstStartedPulling="2026-04-17 14:33:56.817174546 +0000 UTC m=+3.173902153" lastFinishedPulling="2026-04-17 14:34:13.520946741 +0000 UTC m=+19.877674351" observedRunningTime="2026-04-17 14:34:14.406223161 +0000 UTC m=+20.762950810" watchObservedRunningTime="2026-04-17 14:34:14.406420357 +0000 UTC m=+20.763147979" Apr 17 14:34:14.422397 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:14.422278 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-s9w25" podStartSLOduration=3.717860754 podStartE2EDuration="20.422260925s" podCreationTimestamp="2026-04-17 14:33:54 +0000 UTC" firstStartedPulling="2026-04-17 14:33:56.815555801 +0000 UTC m=+3.172283412" lastFinishedPulling="2026-04-17 14:34:13.519955973 +0000 UTC m=+19.876683583" observedRunningTime="2026-04-17 14:34:14.421946478 +0000 UTC m=+20.778674104" watchObservedRunningTime="2026-04-17 14:34:14.422260925 +0000 UTC m=+20.778988550" Apr 17 14:34:14.438597 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:14.438537 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-s7m27" podStartSLOduration=3.698433584 podStartE2EDuration="20.43852113s" 
podCreationTimestamp="2026-04-17 14:33:54 +0000 UTC" firstStartedPulling="2026-04-17 14:33:56.819034705 +0000 UTC m=+3.175762309" lastFinishedPulling="2026-04-17 14:34:13.559122252 +0000 UTC m=+19.915849855" observedRunningTime="2026-04-17 14:34:14.43825572 +0000 UTC m=+20.794983340" watchObservedRunningTime="2026-04-17 14:34:14.43852113 +0000 UTC m=+20.795248756" Apr 17 14:34:14.471738 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:14.471677 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-2pbpg" podStartSLOduration=3.770243716 podStartE2EDuration="20.471657534s" podCreationTimestamp="2026-04-17 14:33:54 +0000 UTC" firstStartedPulling="2026-04-17 14:33:56.817999181 +0000 UTC m=+3.174726784" lastFinishedPulling="2026-04-17 14:34:13.519412994 +0000 UTC m=+19.876140602" observedRunningTime="2026-04-17 14:34:14.471297039 +0000 UTC m=+20.828024663" watchObservedRunningTime="2026-04-17 14:34:14.471657534 +0000 UTC m=+20.828385161" Apr 17 14:34:15.169771 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:15.169738 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-2pbpg" Apr 17 14:34:15.234679 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:15.234646 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6cgs" Apr 17 14:34:15.234679 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:15.234675 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-thst5" Apr 17 14:34:15.234844 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:15.234778 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6cgs" podUID="4807a6e2-14df-4ba4-8aee-7422a65508f2" Apr 17 14:34:15.234941 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:15.234916 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-thst5" podUID="2404238c-49db-4bc3-b328-96679c365761" Apr 17 14:34:15.242114 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:15.242088 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 14:34:15.383191 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:15.383151 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jglh7" event={"ID":"1610e8f2-d397-4d82-a851-2e17443a44e4","Type":"ContainerStarted","Data":"7317822dee9c5f740f9dadf653e8ef62fbc9f1551fff026d1a9f61707a5c4e8f"} Apr 17 14:34:15.386149 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:15.386124 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lr966_0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c/ovn-acl-logging/0.log" Apr 17 14:34:15.386646 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:15.386615 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lr966" event={"ID":"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c","Type":"ContainerStarted","Data":"79ec453186564c671603fa476008a0c5ed5fee81fdc4ee1b2bd651e38a47fd74"} Apr 17 14:34:15.388299 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:15.388274 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9" 
event={"ID":"bd98c614-168e-496f-adb3-06763ea0075c","Type":"ContainerStarted","Data":"1092792bc56022b6469560eb17240ad18601e936dcbea0fc5b5659917347ff52"} Apr 17 14:34:15.396577 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:15.396543 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-jglh7" podStartSLOduration=4.700432172 podStartE2EDuration="21.396530233s" podCreationTimestamp="2026-04-17 14:33:54 +0000 UTC" firstStartedPulling="2026-04-17 14:33:56.823316243 +0000 UTC m=+3.180043845" lastFinishedPulling="2026-04-17 14:34:13.519414295 +0000 UTC m=+19.876141906" observedRunningTime="2026-04-17 14:34:15.396277083 +0000 UTC m=+21.753004710" watchObservedRunningTime="2026-04-17 14:34:15.396530233 +0000 UTC m=+21.753257857" Apr 17 14:34:15.680344 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:15.680270 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-2pbpg" Apr 17 14:34:15.680906 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:15.680885 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-2pbpg" Apr 17 14:34:15.694549 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:15.694517 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2404238c-49db-4bc3-b328-96679c365761-original-pull-secret\") pod \"global-pull-secret-syncer-thst5\" (UID: \"2404238c-49db-4bc3-b328-96679c365761\") " pod="kube-system/global-pull-secret-syncer-thst5" Apr 17 14:34:15.694690 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:15.694672 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 14:34:15.694777 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:15.694745 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2404238c-49db-4bc3-b328-96679c365761-original-pull-secret podName:2404238c-49db-4bc3-b328-96679c365761 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:23.69472497 +0000 UTC m=+30.051452590 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2404238c-49db-4bc3-b328-96679c365761-original-pull-secret") pod "global-pull-secret-syncer-thst5" (UID: "2404238c-49db-4bc3-b328-96679c365761") : object "kube-system"/"original-pull-secret" not registered Apr 17 14:34:16.184827 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:16.184696 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T14:34:15.242108121Z","UUID":"6b8b8fe4-c8d5-43cd-9719-3209cd119f63","Handler":null,"Name":"","Endpoint":""} Apr 17 14:34:16.186609 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:16.186585 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 14:34:16.186750 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:16.186618 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 14:34:16.234705 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:16.234674 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jb96" Apr 17 14:34:16.234909 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:16.234792 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jb96" podUID="651b7208-91cf-42ee-b675-0a50ef1389f0" Apr 17 14:34:16.390888 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:16.390855 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-2pbpg" Apr 17 14:34:17.235142 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:17.234914 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6cgs" Apr 17 14:34:17.235323 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:17.234914 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-thst5" Apr 17 14:34:17.235323 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:17.235211 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6cgs" podUID="4807a6e2-14df-4ba4-8aee-7422a65508f2" Apr 17 14:34:17.235323 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:17.235311 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-thst5" podUID="2404238c-49db-4bc3-b328-96679c365761" Apr 17 14:34:17.395250 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:17.395220 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lr966_0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c/ovn-acl-logging/0.log" Apr 17 14:34:17.395703 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:17.395642 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lr966" event={"ID":"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c","Type":"ContainerStarted","Data":"6980ced5916ba0fb142b7f42b9deafc2c4138fa2e46e814c94d4204d80d6ef0d"} Apr 17 14:34:17.397741 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:17.397699 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9" event={"ID":"bd98c614-168e-496f-adb3-06763ea0075c","Type":"ContainerStarted","Data":"71eb1b88e09c883a281f7008cdb6d0759320ecabf5763e1263fc3b8d1d4a19c9"} Apr 17 14:34:17.415390 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:17.415341 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rr8s9" podStartSLOduration=3.5493033819999997 podStartE2EDuration="23.415326697s" podCreationTimestamp="2026-04-17 14:33:54 +0000 UTC" firstStartedPulling="2026-04-17 14:33:56.822185048 +0000 UTC m=+3.178912661" lastFinishedPulling="2026-04-17 14:34:16.688208368 +0000 UTC m=+23.044935976" observedRunningTime="2026-04-17 14:34:17.414833177 +0000 UTC m=+23.771560802" watchObservedRunningTime="2026-04-17 14:34:17.415326697 +0000 UTC m=+23.772054323" Apr 17 14:34:18.234695 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:18.234609 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jb96" Apr 17 14:34:18.234865 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:18.234726 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jb96" podUID="651b7208-91cf-42ee-b675-0a50ef1389f0" Apr 17 14:34:19.235165 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:19.234986 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-thst5" Apr 17 14:34:19.236049 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:19.234986 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6cgs" Apr 17 14:34:19.236049 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:19.235516 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-thst5" podUID="2404238c-49db-4bc3-b328-96679c365761" Apr 17 14:34:19.236049 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:19.235684 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j6cgs" podUID="4807a6e2-14df-4ba4-8aee-7422a65508f2" Apr 17 14:34:19.404085 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:19.404057 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lr966_0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c/ovn-acl-logging/0.log" Apr 17 14:34:19.404473 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:19.404440 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lr966" event={"ID":"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c","Type":"ContainerStarted","Data":"b1a2106ceaa3d020d3f4846a243cf6776214a32f35d5b0f95cceeaf857d02ab2"} Apr 17 14:34:19.404840 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:19.404779 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:34:19.404840 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:19.404807 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:34:19.404840 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:19.404820 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:34:19.405018 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:19.404916 2570 scope.go:117] "RemoveContainer" containerID="a6d115f72b54e021147cccf01e4d774e2a3d95d92572dce5ab464e9c1b3dba45" Apr 17 14:34:19.406369 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:19.406342 2570 generic.go:358] "Generic (PLEG): container finished" podID="4790e8ee-77ab-402e-a1df-7e728d62db98" containerID="5c149a0580826e8b3890acda0811cccaea57ce1438c891b82bc716624535fade" exitCode=0 Apr 17 14:34:19.406469 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:19.406381 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lr2qq" 
event={"ID":"4790e8ee-77ab-402e-a1df-7e728d62db98","Type":"ContainerDied","Data":"5c149a0580826e8b3890acda0811cccaea57ce1438c891b82bc716624535fade"}
Apr 17 14:34:19.420339 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:19.420318 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lr966"
Apr 17 14:34:19.420449 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:19.420434 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lr966"
Apr 17 14:34:20.237471 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:20.237435 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jb96"
Apr 17 14:34:20.237972 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:20.237557 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jb96" podUID="651b7208-91cf-42ee-b675-0a50ef1389f0"
Apr 17 14:34:20.413530 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:20.413450 2570 generic.go:358] "Generic (PLEG): container finished" podID="4790e8ee-77ab-402e-a1df-7e728d62db98" containerID="fa2480ac4dfc46cadfb3d04eee91377e101370427dcd07d1b886ff7b9d7bf038" exitCode=0
Apr 17 14:34:20.413682 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:20.413536 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lr2qq" event={"ID":"4790e8ee-77ab-402e-a1df-7e728d62db98","Type":"ContainerDied","Data":"fa2480ac4dfc46cadfb3d04eee91377e101370427dcd07d1b886ff7b9d7bf038"}
Apr 17 14:34:20.417792 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:20.417772 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lr966_0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c/ovn-acl-logging/0.log"
Apr 17 14:34:20.420700 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:20.420426 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lr966" event={"ID":"0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c","Type":"ContainerStarted","Data":"77a7a9eef697e096516bcca014dc6e6751146ce80e6c7813946d9c344d5f3ee2"}
Apr 17 14:34:20.460661 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:20.460611 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lr966" podStartSLOduration=9.49796878 podStartE2EDuration="26.460597094s" podCreationTimestamp="2026-04-17 14:33:54 +0000 UTC" firstStartedPulling="2026-04-17 14:33:56.825057531 +0000 UTC m=+3.181785134" lastFinishedPulling="2026-04-17 14:34:13.787685845 +0000 UTC m=+20.144413448" observedRunningTime="2026-04-17 14:34:20.45876441 +0000 UTC m=+26.815492034" watchObservedRunningTime="2026-04-17 14:34:20.460597094 +0000 UTC m=+26.817324718"
Apr 17 14:34:20.882656 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:20.882449 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2jb96"]
Apr 17 14:34:20.882656 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:20.882615 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jb96"
Apr 17 14:34:20.882918 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:20.882729 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jb96" podUID="651b7208-91cf-42ee-b675-0a50ef1389f0"
Apr 17 14:34:20.885992 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:20.885963 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-thst5"]
Apr 17 14:34:20.886144 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:20.886096 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-thst5"
Apr 17 14:34:20.886413 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:20.886358 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-thst5" podUID="2404238c-49db-4bc3-b328-96679c365761"
Apr 17 14:34:20.886886 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:20.886861 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j6cgs"]
Apr 17 14:34:20.887395 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:20.886982 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6cgs"
Apr 17 14:34:20.887395 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:20.887080 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6cgs" podUID="4807a6e2-14df-4ba4-8aee-7422a65508f2"
Apr 17 14:34:21.424199 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:21.424166 2570 generic.go:358] "Generic (PLEG): container finished" podID="4790e8ee-77ab-402e-a1df-7e728d62db98" containerID="334c1ffe224b8def32d0d580168bd1acfacd67208f77f972d2900125def10eb4" exitCode=0
Apr 17 14:34:21.424541 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:21.424205 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lr2qq" event={"ID":"4790e8ee-77ab-402e-a1df-7e728d62db98","Type":"ContainerDied","Data":"334c1ffe224b8def32d0d580168bd1acfacd67208f77f972d2900125def10eb4"}
Apr 17 14:34:22.238741 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:22.238563 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jb96"
Apr 17 14:34:22.238894 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:22.238563 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-thst5"
Apr 17 14:34:22.238894 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:22.238844 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jb96" podUID="651b7208-91cf-42ee-b675-0a50ef1389f0"
Apr 17 14:34:22.239017 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:22.238902 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-thst5" podUID="2404238c-49db-4bc3-b328-96679c365761"
Apr 17 14:34:23.234521 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:23.234488 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6cgs"
Apr 17 14:34:23.234889 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:23.234614 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6cgs" podUID="4807a6e2-14df-4ba4-8aee-7422a65508f2"
Apr 17 14:34:23.758037 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:23.757937 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2404238c-49db-4bc3-b328-96679c365761-original-pull-secret\") pod \"global-pull-secret-syncer-thst5\" (UID: \"2404238c-49db-4bc3-b328-96679c365761\") " pod="kube-system/global-pull-secret-syncer-thst5"
Apr 17 14:34:23.758192 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:23.758094 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 14:34:23.758192 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:23.758175 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2404238c-49db-4bc3-b328-96679c365761-original-pull-secret podName:2404238c-49db-4bc3-b328-96679c365761 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:39.758156643 +0000 UTC m=+46.114884248 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2404238c-49db-4bc3-b328-96679c365761-original-pull-secret") pod "global-pull-secret-syncer-thst5" (UID: "2404238c-49db-4bc3-b328-96679c365761") : object "kube-system"/"original-pull-secret" not registered
Apr 17 14:34:24.235194 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:24.235160 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jb96"
Apr 17 14:34:24.235777 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:24.235282 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jb96" podUID="651b7208-91cf-42ee-b675-0a50ef1389f0"
Apr 17 14:34:24.235777 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:24.235653 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-thst5"
Apr 17 14:34:24.235777 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:24.235759 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-thst5" podUID="2404238c-49db-4bc3-b328-96679c365761"
Apr 17 14:34:25.234397 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:25.234359 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6cgs"
Apr 17 14:34:25.234576 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:25.234522 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6cgs" podUID="4807a6e2-14df-4ba4-8aee-7422a65508f2"
Apr 17 14:34:26.238343 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:26.238305 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-thst5"
Apr 17 14:34:26.238343 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:26.238346 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jb96"
Apr 17 14:34:26.238893 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:26.238434 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-thst5" podUID="2404238c-49db-4bc3-b328-96679c365761"
Apr 17 14:34:26.238893 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:26.238565 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jb96" podUID="651b7208-91cf-42ee-b675-0a50ef1389f0"
Apr 17 14:34:27.234855 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.234828 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6cgs"
Apr 17 14:34:27.234992 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:27.234948 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6cgs" podUID="4807a6e2-14df-4ba4-8aee-7422a65508f2"
Apr 17 14:34:27.433496 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.433467 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-134.ec2.internal" event="NodeReady"
Apr 17 14:34:27.434048 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.433668 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 14:34:27.440404 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.440380 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lr2qq" event={"ID":"4790e8ee-77ab-402e-a1df-7e728d62db98","Type":"ContainerStarted","Data":"c4c8d4501087e506b6bebdb23f96f18c1e14d645fb7813f36b54fb3a8f9d1291"}
Apr 17 14:34:27.465598 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.465564 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"]
Apr 17 14:34:27.475761 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.475735 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-jngmx"]
Apr 17 14:34:27.475899 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.475875 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:27.478537 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.478511 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-qzqq8\""
Apr 17 14:34:27.478666 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.478539 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 14:34:27.478666 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.478556 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 14:34:27.478846 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.478711 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 14:34:27.488036 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.488007 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 14:34:27.488293 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.488261 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-jngmx"]
Apr 17 14:34:27.488413 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.488299 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"]
Apr 17 14:34:27.488413 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.488315 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wbd2m"]
Apr 17 14:34:27.488413 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.488371 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jngmx"
Apr 17 14:34:27.491209 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.491046 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-6sm76\""
Apr 17 14:34:27.491209 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.491055 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 17 14:34:27.491209 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.491069 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 17 14:34:27.504269 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.504192 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wbd2m"]
Apr 17 14:34:27.504380 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.504290 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wbd2m"
Apr 17 14:34:27.506850 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.506828 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fr4dp\""
Apr 17 14:34:27.506850 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.506847 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 14:34:27.507000 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.506828 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 14:34:27.580868 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.580832 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-l9shq"]
Apr 17 14:34:27.588492 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.588465 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74d427e0-c76f-43e0-8769-00df0351072e-trusted-ca\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:27.588492 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.588494 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jngmx\" (UID: \"42b16b82-9513-445d-b01e-228784a51e88\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jngmx"
Apr 17 14:34:27.588666 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.588605 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jqbs\" (UniqueName: \"kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-kube-api-access-6jqbs\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:27.588666 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.588641 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-bound-sa-token\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:27.588758 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.588739 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/74d427e0-c76f-43e0-8769-00df0351072e-ca-trust-extracted\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:27.588804 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.588769 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/74d427e0-c76f-43e0-8769-00df0351072e-installation-pull-secrets\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:27.588804 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.588798 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:27.588867 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.588828 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/74d427e0-c76f-43e0-8769-00df0351072e-image-registry-private-configuration\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:27.588867 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.588848 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/42b16b82-9513-445d-b01e-228784a51e88-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-jngmx\" (UID: \"42b16b82-9513-445d-b01e-228784a51e88\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jngmx"
Apr 17 14:34:27.588929 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.588897 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/74d427e0-c76f-43e0-8769-00df0351072e-registry-certificates\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:27.595183 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.595157 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-l9shq"]
Apr 17 14:34:27.595325 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.595251 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l9shq"
Apr 17 14:34:27.597900 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.597876 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 14:34:27.597900 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.597891 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 14:34:27.598105 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.597956 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 14:34:27.598105 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.597990 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dtcwn\""
Apr 17 14:34:27.689979 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.689901 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jqbs\" (UniqueName: \"kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-kube-api-access-6jqbs\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:27.689979 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.689937 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1beaeea-4ae8-452c-b3be-201c1ad4568e-config-volume\") pod \"dns-default-wbd2m\" (UID: \"d1beaeea-4ae8-452c-b3be-201c1ad4568e\") " pod="openshift-dns/dns-default-wbd2m"
Apr 17 14:34:27.689979 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.689960 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-bound-sa-token\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:27.689979 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.689983 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/74d427e0-c76f-43e0-8769-00df0351072e-ca-trust-extracted\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:27.690355 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.690009 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/74d427e0-c76f-43e0-8769-00df0351072e-installation-pull-secrets\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:27.690355 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.690036 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjwzr\" (UniqueName: \"kubernetes.io/projected/d1beaeea-4ae8-452c-b3be-201c1ad4568e-kube-api-access-cjwzr\") pod \"dns-default-wbd2m\" (UID: \"d1beaeea-4ae8-452c-b3be-201c1ad4568e\") " pod="openshift-dns/dns-default-wbd2m"
Apr 17 14:34:27.690355 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.690066 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:27.690355 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.690096 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/74d427e0-c76f-43e0-8769-00df0351072e-image-registry-private-configuration\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:27.690355 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:27.690165 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 14:34:27.690355 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:27.690185 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c57cd6dc9-2bkl9: secret "image-registry-tls" not found
Apr 17 14:34:27.690355 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:27.690259 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls podName:74d427e0-c76f-43e0-8769-00df0351072e nodeName:}" failed. No retries permitted until 2026-04-17 14:34:28.190219012 +0000 UTC m=+34.546946631 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls") pod "image-registry-7c57cd6dc9-2bkl9" (UID: "74d427e0-c76f-43e0-8769-00df0351072e") : secret "image-registry-tls" not found
Apr 17 14:34:27.690355 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.690285 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/42b16b82-9513-445d-b01e-228784a51e88-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-jngmx\" (UID: \"42b16b82-9513-445d-b01e-228784a51e88\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jngmx"
Apr 17 14:34:27.690355 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.690323 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d1beaeea-4ae8-452c-b3be-201c1ad4568e-tmp-dir\") pod \"dns-default-wbd2m\" (UID: \"d1beaeea-4ae8-452c-b3be-201c1ad4568e\") " pod="openshift-dns/dns-default-wbd2m"
Apr 17 14:34:27.690720 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.690364 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/74d427e0-c76f-43e0-8769-00df0351072e-registry-certificates\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:27.690720 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.690486 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74d427e0-c76f-43e0-8769-00df0351072e-trusted-ca\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:27.690720 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.690524 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jngmx\" (UID: \"42b16b82-9513-445d-b01e-228784a51e88\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jngmx"
Apr 17 14:34:27.690720 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.690548 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls\") pod \"dns-default-wbd2m\" (UID: \"d1beaeea-4ae8-452c-b3be-201c1ad4568e\") " pod="openshift-dns/dns-default-wbd2m"
Apr 17 14:34:27.690720 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:27.690643 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 14:34:27.690720 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:27.690716 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert podName:42b16b82-9513-445d-b01e-228784a51e88 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:28.190700602 +0000 UTC m=+34.547428224 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jngmx" (UID: "42b16b82-9513-445d-b01e-228784a51e88") : secret "networking-console-plugin-cert" not found
Apr 17 14:34:27.690969 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.690950 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/42b16b82-9513-445d-b01e-228784a51e88-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-jngmx\" (UID: \"42b16b82-9513-445d-b01e-228784a51e88\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jngmx"
Apr 17 14:34:27.695371 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.695352 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/74d427e0-c76f-43e0-8769-00df0351072e-ca-trust-extracted\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:27.695627 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.695609 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/74d427e0-c76f-43e0-8769-00df0351072e-installation-pull-secrets\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:27.695698 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.695613 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/74d427e0-c76f-43e0-8769-00df0351072e-image-registry-private-configuration\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:27.701752 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.701727 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-bound-sa-token\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:27.702645 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.702625 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jqbs\" (UniqueName: \"kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-kube-api-access-6jqbs\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:27.704624 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.704605 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/74d427e0-c76f-43e0-8769-00df0351072e-registry-certificates\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:27.705051 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.705020 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74d427e0-c76f-43e0-8769-00df0351072e-trusted-ca\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:27.791840 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.791806 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls\") pod \"dns-default-wbd2m\" (UID: \"d1beaeea-4ae8-452c-b3be-201c1ad4568e\") " pod="openshift-dns/dns-default-wbd2m"
Apr 17 14:34:27.791977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.791880 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1beaeea-4ae8-452c-b3be-201c1ad4568e-config-volume\") pod \"dns-default-wbd2m\" (UID: \"d1beaeea-4ae8-452c-b3be-201c1ad4568e\") " pod="openshift-dns/dns-default-wbd2m"
Apr 17 14:34:27.791977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.791909 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert\") pod \"ingress-canary-l9shq\" (UID: \"05c803a3-345a-4b4a-b40b-575211301efb\") " pod="openshift-ingress-canary/ingress-canary-l9shq"
Apr 17 14:34:27.791977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.791944 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjwzr\" (UniqueName: \"kubernetes.io/projected/d1beaeea-4ae8-452c-b3be-201c1ad4568e-kube-api-access-cjwzr\") pod \"dns-default-wbd2m\" (UID: \"d1beaeea-4ae8-452c-b3be-201c1ad4568e\") " pod="openshift-dns/dns-default-wbd2m"
Apr 17 14:34:27.791977 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:27.791959 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:34:27.791977 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.791971 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bm4l\" (UniqueName: \"kubernetes.io/projected/05c803a3-345a-4b4a-b40b-575211301efb-kube-api-access-8bm4l\") pod \"ingress-canary-l9shq\" (UID: \"05c803a3-345a-4b4a-b40b-575211301efb\") " pod="openshift-ingress-canary/ingress-canary-l9shq"
Apr 17 14:34:27.792201 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:27.792027 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls podName:d1beaeea-4ae8-452c-b3be-201c1ad4568e nodeName:}" failed. No retries permitted until 2026-04-17 14:34:28.292010507 +0000 UTC m=+34.648738113 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls") pod "dns-default-wbd2m" (UID: "d1beaeea-4ae8-452c-b3be-201c1ad4568e") : secret "dns-default-metrics-tls" not found
Apr 17 14:34:27.792201 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.792057 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d1beaeea-4ae8-452c-b3be-201c1ad4568e-tmp-dir\") pod \"dns-default-wbd2m\" (UID: \"d1beaeea-4ae8-452c-b3be-201c1ad4568e\") " pod="openshift-dns/dns-default-wbd2m"
Apr 17 14:34:27.796049 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.796031 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d1beaeea-4ae8-452c-b3be-201c1ad4568e-tmp-dir\") pod \"dns-default-wbd2m\" (UID: \"d1beaeea-4ae8-452c-b3be-201c1ad4568e\") " pod="openshift-dns/dns-default-wbd2m"
Apr 17 14:34:27.796157 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.796140 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1beaeea-4ae8-452c-b3be-201c1ad4568e-config-volume\") pod \"dns-default-wbd2m\" (UID: \"d1beaeea-4ae8-452c-b3be-201c1ad4568e\") " pod="openshift-dns/dns-default-wbd2m"
Apr 17 14:34:27.799588 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.799570 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"kube-api-access-cjwzr\" (UniqueName: \"kubernetes.io/projected/d1beaeea-4ae8-452c-b3be-201c1ad4568e-kube-api-access-cjwzr\") pod \"dns-default-wbd2m\" (UID: \"d1beaeea-4ae8-452c-b3be-201c1ad4568e\") " pod="openshift-dns/dns-default-wbd2m" Apr 17 14:34:27.892865 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.892829 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert\") pod \"ingress-canary-l9shq\" (UID: \"05c803a3-345a-4b4a-b40b-575211301efb\") " pod="openshift-ingress-canary/ingress-canary-l9shq" Apr 17 14:34:27.893073 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.892874 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bm4l\" (UniqueName: \"kubernetes.io/projected/05c803a3-345a-4b4a-b40b-575211301efb-kube-api-access-8bm4l\") pod \"ingress-canary-l9shq\" (UID: \"05c803a3-345a-4b4a-b40b-575211301efb\") " pod="openshift-ingress-canary/ingress-canary-l9shq" Apr 17 14:34:27.893073 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:27.892985 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:34:27.893073 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:27.893057 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert podName:05c803a3-345a-4b4a-b40b-575211301efb nodeName:}" failed. No retries permitted until 2026-04-17 14:34:28.393038779 +0000 UTC m=+34.749766395 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert") pod "ingress-canary-l9shq" (UID: "05c803a3-345a-4b4a-b40b-575211301efb") : secret "canary-serving-cert" not found
Apr 17 14:34:27.902759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.902733 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bm4l\" (UniqueName: \"kubernetes.io/projected/05c803a3-345a-4b4a-b40b-575211301efb-kube-api-access-8bm4l\") pod \"ingress-canary-l9shq\" (UID: \"05c803a3-345a-4b4a-b40b-575211301efb\") " pod="openshift-ingress-canary/ingress-canary-l9shq"
Apr 17 14:34:27.993887 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.993808 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x52rp\" (UniqueName: \"kubernetes.io/projected/651b7208-91cf-42ee-b675-0a50ef1389f0-kube-api-access-x52rp\") pod \"network-check-target-2jb96\" (UID: \"651b7208-91cf-42ee-b675-0a50ef1389f0\") " pod="openshift-network-diagnostics/network-check-target-2jb96"
Apr 17 14:34:27.993887 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:27.993843 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs\") pod \"network-metrics-daemon-j6cgs\" (UID: \"4807a6e2-14df-4ba4-8aee-7422a65508f2\") " pod="openshift-multus/network-metrics-daemon-j6cgs"
Apr 17 14:34:27.994126 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:27.993947 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:34:27.994126 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:27.993968 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 14:34:27.994126 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:27.993986 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 14:34:27.994126 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:27.993993 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs podName:4807a6e2-14df-4ba4-8aee-7422a65508f2 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:59.993980422 +0000 UTC m=+66.350708024 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs") pod "network-metrics-daemon-j6cgs" (UID: "4807a6e2-14df-4ba4-8aee-7422a65508f2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:34:27.994126 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:27.994001 2570 projected.go:194] Error preparing data for projected volume kube-api-access-x52rp for pod openshift-network-diagnostics/network-check-target-2jb96: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:34:27.994126 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:27.994050 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/651b7208-91cf-42ee-b675-0a50ef1389f0-kube-api-access-x52rp podName:651b7208-91cf-42ee-b675-0a50ef1389f0 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:59.994035393 +0000 UTC m=+66.350762998 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-x52rp" (UniqueName: "kubernetes.io/projected/651b7208-91cf-42ee-b675-0a50ef1389f0-kube-api-access-x52rp") pod "network-check-target-2jb96" (UID: "651b7208-91cf-42ee-b675-0a50ef1389f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:34:28.195524 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:28.195485 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jngmx\" (UID: \"42b16b82-9513-445d-b01e-228784a51e88\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jngmx"
Apr 17 14:34:28.195671 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:28.195598 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:28.195671 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:28.195627 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 14:34:28.195740 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:28.195688 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert podName:42b16b82-9513-445d-b01e-228784a51e88 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:29.195673258 +0000 UTC m=+35.552400862 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jngmx" (UID: "42b16b82-9513-445d-b01e-228784a51e88") : secret "networking-console-plugin-cert" not found
Apr 17 14:34:28.195740 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:28.195697 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 14:34:28.195740 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:28.195708 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c57cd6dc9-2bkl9: secret "image-registry-tls" not found
Apr 17 14:34:28.195836 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:28.195760 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls podName:74d427e0-c76f-43e0-8769-00df0351072e nodeName:}" failed. No retries permitted until 2026-04-17 14:34:29.195744535 +0000 UTC m=+35.552472139 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls") pod "image-registry-7c57cd6dc9-2bkl9" (UID: "74d427e0-c76f-43e0-8769-00df0351072e") : secret "image-registry-tls" not found
Apr 17 14:34:28.238080 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:28.238046 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-thst5"
Apr 17 14:34:28.238227 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:28.238050 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jb96"
Apr 17 14:34:28.240758 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:28.240734 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 14:34:28.240876 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:28.240793 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 14:34:28.240876 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:28.240810 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-mnbtt\""
Apr 17 14:34:28.241047 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:28.241032 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 14:34:28.296906 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:28.296865 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls\") pod \"dns-default-wbd2m\" (UID: \"d1beaeea-4ae8-452c-b3be-201c1ad4568e\") " pod="openshift-dns/dns-default-wbd2m"
Apr 17 14:34:28.297076 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:28.297021 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:34:28.297126 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:28.297096 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls podName:d1beaeea-4ae8-452c-b3be-201c1ad4568e nodeName:}" failed. No retries permitted until 2026-04-17 14:34:29.297067594 +0000 UTC m=+35.653795197 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls") pod "dns-default-wbd2m" (UID: "d1beaeea-4ae8-452c-b3be-201c1ad4568e") : secret "dns-default-metrics-tls" not found
Apr 17 14:34:28.398189 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:28.398159 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert\") pod \"ingress-canary-l9shq\" (UID: \"05c803a3-345a-4b4a-b40b-575211301efb\") " pod="openshift-ingress-canary/ingress-canary-l9shq"
Apr 17 14:34:28.398347 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:28.398325 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:34:28.398388 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:28.398381 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert podName:05c803a3-345a-4b4a-b40b-575211301efb nodeName:}" failed. No retries permitted until 2026-04-17 14:34:29.398364534 +0000 UTC m=+35.755092155 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert") pod "ingress-canary-l9shq" (UID: "05c803a3-345a-4b4a-b40b-575211301efb") : secret "canary-serving-cert" not found
Apr 17 14:34:28.444592 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:28.444559 2570 generic.go:358] "Generic (PLEG): container finished" podID="4790e8ee-77ab-402e-a1df-7e728d62db98" containerID="c4c8d4501087e506b6bebdb23f96f18c1e14d645fb7813f36b54fb3a8f9d1291" exitCode=0
Apr 17 14:34:28.445069 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:28.444625 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lr2qq" event={"ID":"4790e8ee-77ab-402e-a1df-7e728d62db98","Type":"ContainerDied","Data":"c4c8d4501087e506b6bebdb23f96f18c1e14d645fb7813f36b54fb3a8f9d1291"}
Apr 17 14:34:29.204404 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:29.204208 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jngmx\" (UID: \"42b16b82-9513-445d-b01e-228784a51e88\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jngmx"
Apr 17 14:34:29.204561 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:29.204329 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 14:34:29.204561 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:29.204469 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:34:29.204561 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:29.204519 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert podName:42b16b82-9513-445d-b01e-228784a51e88 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:31.204502606 +0000 UTC m=+37.561230208 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jngmx" (UID: "42b16b82-9513-445d-b01e-228784a51e88") : secret "networking-console-plugin-cert" not found
Apr 17 14:34:29.204561 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:29.204555 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 14:34:29.204697 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:29.204567 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c57cd6dc9-2bkl9: secret "image-registry-tls" not found
Apr 17 14:34:29.204697 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:29.204612 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls podName:74d427e0-c76f-43e0-8769-00df0351072e nodeName:}" failed. No retries permitted until 2026-04-17 14:34:31.204596905 +0000 UTC m=+37.561324527 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls") pod "image-registry-7c57cd6dc9-2bkl9" (UID: "74d427e0-c76f-43e0-8769-00df0351072e") : secret "image-registry-tls" not found
Apr 17 14:34:29.235192 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:29.235166 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6cgs"
Apr 17 14:34:29.237933 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:29.237912 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 14:34:29.238023 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:29.237912 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tg92k\""
Apr 17 14:34:29.305428 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:29.305394 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls\") pod \"dns-default-wbd2m\" (UID: \"d1beaeea-4ae8-452c-b3be-201c1ad4568e\") " pod="openshift-dns/dns-default-wbd2m"
Apr 17 14:34:29.305594 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:29.305519 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:34:29.305594 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:29.305580 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls podName:d1beaeea-4ae8-452c-b3be-201c1ad4568e nodeName:}" failed. No retries permitted until 2026-04-17 14:34:31.305565945 +0000 UTC m=+37.662293549 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls") pod "dns-default-wbd2m" (UID: "d1beaeea-4ae8-452c-b3be-201c1ad4568e") : secret "dns-default-metrics-tls" not found
Apr 17 14:34:29.405822 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:29.405791 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert\") pod \"ingress-canary-l9shq\" (UID: \"05c803a3-345a-4b4a-b40b-575211301efb\") " pod="openshift-ingress-canary/ingress-canary-l9shq"
Apr 17 14:34:29.405974 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:29.405935 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:34:29.406014 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:29.405992 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert podName:05c803a3-345a-4b4a-b40b-575211301efb nodeName:}" failed. No retries permitted until 2026-04-17 14:34:31.405977715 +0000 UTC m=+37.762705317 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert") pod "ingress-canary-l9shq" (UID: "05c803a3-345a-4b4a-b40b-575211301efb") : secret "canary-serving-cert" not found
Apr 17 14:34:29.448567 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:29.448539 2570 generic.go:358] "Generic (PLEG): container finished" podID="4790e8ee-77ab-402e-a1df-7e728d62db98" containerID="d054aaed74c9ea0255ff24415ea33d03c6748c6f413c91cbb37241b75f445b8a" exitCode=0
Apr 17 14:34:29.448917 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:29.448601 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lr2qq" event={"ID":"4790e8ee-77ab-402e-a1df-7e728d62db98","Type":"ContainerDied","Data":"d054aaed74c9ea0255ff24415ea33d03c6748c6f413c91cbb37241b75f445b8a"}
Apr 17 14:34:30.453426 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.453384 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lr2qq" event={"ID":"4790e8ee-77ab-402e-a1df-7e728d62db98","Type":"ContainerStarted","Data":"dda0331fb2a58d104df19bb4ded6c0a534caf8abbab4662660e514481b336164"}
Apr 17 14:34:30.476820 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.476774 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-lr2qq" podStartSLOduration=6.142109804 podStartE2EDuration="36.476759577s" podCreationTimestamp="2026-04-17 14:33:54 +0000 UTC" firstStartedPulling="2026-04-17 14:33:56.824188049 +0000 UTC m=+3.180915657" lastFinishedPulling="2026-04-17 14:34:27.158837828 +0000 UTC m=+33.515565430" observedRunningTime="2026-04-17 14:34:30.474890192 +0000 UTC m=+36.831617818" watchObservedRunningTime="2026-04-17 14:34:30.476759577 +0000 UTC m=+36.833487202"
Apr 17 14:34:30.674006 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.673974 2570 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2"]
Apr 17 14:34:30.693794 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.693729 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2"]
Apr 17 14:34:30.693904 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.693798 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2"
Apr 17 14:34:30.696537 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.696511 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 17 14:34:30.696537 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.696511 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 17 14:34:30.696715 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.696550 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 17 14:34:30.696715 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.696577 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 17 14:34:30.696715 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.696511 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 17 14:34:30.696715 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.696657 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 17 14:34:30.696890 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.696867 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 17 14:34:30.714703 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.714678 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/08f10719-7fb4-455b-88d5-84e0bf2877d1-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7b955f578f-6kfg2\" (UID: \"08f10719-7fb4-455b-88d5-84e0bf2877d1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2"
Apr 17 14:34:30.714786 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.714707 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/08f10719-7fb4-455b-88d5-84e0bf2877d1-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7b955f578f-6kfg2\" (UID: \"08f10719-7fb4-455b-88d5-84e0bf2877d1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2"
Apr 17 14:34:30.714786 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.714725 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsz9m\" (UniqueName: \"kubernetes.io/projected/08f10719-7fb4-455b-88d5-84e0bf2877d1-kube-api-access-gsz9m\") pod \"cluster-proxy-proxy-agent-7b955f578f-6kfg2\" (UID: \"08f10719-7fb4-455b-88d5-84e0bf2877d1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2"
Apr 17 14:34:30.714786 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.714770 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/08f10719-7fb4-455b-88d5-84e0bf2877d1-ca\") pod \"cluster-proxy-proxy-agent-7b955f578f-6kfg2\" (UID: \"08f10719-7fb4-455b-88d5-84e0bf2877d1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2"
Apr 17 14:34:30.714887 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.714821 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/08f10719-7fb4-455b-88d5-84e0bf2877d1-hub\") pod \"cluster-proxy-proxy-agent-7b955f578f-6kfg2\" (UID: \"08f10719-7fb4-455b-88d5-84e0bf2877d1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2"
Apr 17 14:34:30.714887 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.714856 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/08f10719-7fb4-455b-88d5-84e0bf2877d1-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7b955f578f-6kfg2\" (UID: \"08f10719-7fb4-455b-88d5-84e0bf2877d1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2"
Apr 17 14:34:30.815295 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.815266 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/08f10719-7fb4-455b-88d5-84e0bf2877d1-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7b955f578f-6kfg2\" (UID: \"08f10719-7fb4-455b-88d5-84e0bf2877d1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2"
Apr 17 14:34:30.815295 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.815297 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/08f10719-7fb4-455b-88d5-84e0bf2877d1-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7b955f578f-6kfg2\" (UID: 
\"08f10719-7fb4-455b-88d5-84e0bf2877d1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2"
Apr 17 14:34:30.815434 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.815315 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gsz9m\" (UniqueName: \"kubernetes.io/projected/08f10719-7fb4-455b-88d5-84e0bf2877d1-kube-api-access-gsz9m\") pod \"cluster-proxy-proxy-agent-7b955f578f-6kfg2\" (UID: \"08f10719-7fb4-455b-88d5-84e0bf2877d1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2"
Apr 17 14:34:30.815434 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.815389 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/08f10719-7fb4-455b-88d5-84e0bf2877d1-ca\") pod \"cluster-proxy-proxy-agent-7b955f578f-6kfg2\" (UID: \"08f10719-7fb4-455b-88d5-84e0bf2877d1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2"
Apr 17 14:34:30.815506 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.815459 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/08f10719-7fb4-455b-88d5-84e0bf2877d1-hub\") pod \"cluster-proxy-proxy-agent-7b955f578f-6kfg2\" (UID: \"08f10719-7fb4-455b-88d5-84e0bf2877d1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2"
Apr 17 14:34:30.815506 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.815483 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/08f10719-7fb4-455b-88d5-84e0bf2877d1-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7b955f578f-6kfg2\" (UID: \"08f10719-7fb4-455b-88d5-84e0bf2877d1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2"
Apr 17 14:34:30.816175 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.816154 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/08f10719-7fb4-455b-88d5-84e0bf2877d1-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7b955f578f-6kfg2\" (UID: \"08f10719-7fb4-455b-88d5-84e0bf2877d1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2"
Apr 17 14:34:30.818685 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.818656 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/08f10719-7fb4-455b-88d5-84e0bf2877d1-hub\") pod \"cluster-proxy-proxy-agent-7b955f578f-6kfg2\" (UID: \"08f10719-7fb4-455b-88d5-84e0bf2877d1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2"
Apr 17 14:34:30.818769 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.818690 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/08f10719-7fb4-455b-88d5-84e0bf2877d1-ca\") pod \"cluster-proxy-proxy-agent-7b955f578f-6kfg2\" (UID: \"08f10719-7fb4-455b-88d5-84e0bf2877d1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2"
Apr 17 14:34:30.818769 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.818738 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/08f10719-7fb4-455b-88d5-84e0bf2877d1-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7b955f578f-6kfg2\" (UID: \"08f10719-7fb4-455b-88d5-84e0bf2877d1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2"
Apr 17 14:34:30.818838 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.818769 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/08f10719-7fb4-455b-88d5-84e0bf2877d1-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7b955f578f-6kfg2\" (UID: \"08f10719-7fb4-455b-88d5-84e0bf2877d1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2"
Apr 17 14:34:30.823046 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:30.823024 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsz9m\" (UniqueName: \"kubernetes.io/projected/08f10719-7fb4-455b-88d5-84e0bf2877d1-kube-api-access-gsz9m\") pod \"cluster-proxy-proxy-agent-7b955f578f-6kfg2\" (UID: \"08f10719-7fb4-455b-88d5-84e0bf2877d1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2"
Apr 17 14:34:31.019426 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:31.019343 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2"
Apr 17 14:34:31.192142 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:31.192106 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2"]
Apr 17 14:34:31.196937 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:34:31.196771 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08f10719_7fb4_455b_88d5_84e0bf2877d1.slice/crio-d038ca031806f9f3f9ce67885b8d0c57eb69a69eb144f8feaa0928224924616d WatchSource:0}: Error finding container d038ca031806f9f3f9ce67885b8d0c57eb69a69eb144f8feaa0928224924616d: Status 404 returned error can't find the container with id d038ca031806f9f3f9ce67885b8d0c57eb69a69eb144f8feaa0928224924616d
Apr 17 14:34:31.219159 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:31.219121 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert\") pod 
\"networking-console-plugin-cb95c66f6-jngmx\" (UID: \"42b16b82-9513-445d-b01e-228784a51e88\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jngmx" Apr 17 14:34:31.219348 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:31.219284 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 14:34:31.219348 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:31.219331 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9" Apr 17 14:34:31.219467 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:31.219353 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert podName:42b16b82-9513-445d-b01e-228784a51e88 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:35.219335255 +0000 UTC m=+41.576062857 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jngmx" (UID: "42b16b82-9513-445d-b01e-228784a51e88") : secret "networking-console-plugin-cert" not found Apr 17 14:34:31.219467 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:31.219419 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:34:31.219467 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:31.219434 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c57cd6dc9-2bkl9: secret "image-registry-tls" not found Apr 17 14:34:31.219591 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:31.219501 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls podName:74d427e0-c76f-43e0-8769-00df0351072e nodeName:}" failed. No retries permitted until 2026-04-17 14:34:35.219482463 +0000 UTC m=+41.576210081 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls") pod "image-registry-7c57cd6dc9-2bkl9" (UID: "74d427e0-c76f-43e0-8769-00df0351072e") : secret "image-registry-tls" not found Apr 17 14:34:31.320231 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:31.320199 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls\") pod \"dns-default-wbd2m\" (UID: \"d1beaeea-4ae8-452c-b3be-201c1ad4568e\") " pod="openshift-dns/dns-default-wbd2m" Apr 17 14:34:31.320405 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:31.320360 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:34:31.320463 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:31.320451 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls podName:d1beaeea-4ae8-452c-b3be-201c1ad4568e nodeName:}" failed. No retries permitted until 2026-04-17 14:34:35.320428405 +0000 UTC m=+41.677156027 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls") pod "dns-default-wbd2m" (UID: "d1beaeea-4ae8-452c-b3be-201c1ad4568e") : secret "dns-default-metrics-tls" not found Apr 17 14:34:31.420844 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:31.420805 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert\") pod \"ingress-canary-l9shq\" (UID: \"05c803a3-345a-4b4a-b40b-575211301efb\") " pod="openshift-ingress-canary/ingress-canary-l9shq" Apr 17 14:34:31.421020 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:31.420951 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:34:31.421020 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:31.421015 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert podName:05c803a3-345a-4b4a-b40b-575211301efb nodeName:}" failed. No retries permitted until 2026-04-17 14:34:35.420998851 +0000 UTC m=+41.777726454 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert") pod "ingress-canary-l9shq" (UID: "05c803a3-345a-4b4a-b40b-575211301efb") : secret "canary-serving-cert" not found Apr 17 14:34:31.456151 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:31.456117 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2" event={"ID":"08f10719-7fb4-455b-88d5-84e0bf2877d1","Type":"ContainerStarted","Data":"d038ca031806f9f3f9ce67885b8d0c57eb69a69eb144f8feaa0928224924616d"} Apr 17 14:34:34.463613 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:34.463531 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2" event={"ID":"08f10719-7fb4-455b-88d5-84e0bf2877d1","Type":"ContainerStarted","Data":"a926fa7616ff7c176418b8dc846311fa82db95a542b94b2b49cecc7334f34873"} Apr 17 14:34:35.251293 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:35.251214 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9" Apr 17 14:34:35.251499 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:35.251350 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jngmx\" (UID: \"42b16b82-9513-445d-b01e-228784a51e88\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jngmx" Apr 17 14:34:35.251499 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:35.251357 2570 projected.go:264] 
Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:34:35.251499 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:35.251379 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c57cd6dc9-2bkl9: secret "image-registry-tls" not found Apr 17 14:34:35.251499 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:35.251444 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls podName:74d427e0-c76f-43e0-8769-00df0351072e nodeName:}" failed. No retries permitted until 2026-04-17 14:34:43.251423376 +0000 UTC m=+49.608151003 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls") pod "image-registry-7c57cd6dc9-2bkl9" (UID: "74d427e0-c76f-43e0-8769-00df0351072e") : secret "image-registry-tls" not found Apr 17 14:34:35.251726 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:35.251511 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 14:34:35.251726 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:35.251581 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert podName:42b16b82-9513-445d-b01e-228784a51e88 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:43.251563177 +0000 UTC m=+49.608290780 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jngmx" (UID: "42b16b82-9513-445d-b01e-228784a51e88") : secret "networking-console-plugin-cert" not found Apr 17 14:34:35.352049 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:35.352017 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls\") pod \"dns-default-wbd2m\" (UID: \"d1beaeea-4ae8-452c-b3be-201c1ad4568e\") " pod="openshift-dns/dns-default-wbd2m" Apr 17 14:34:35.352220 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:35.352169 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:34:35.352306 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:35.352260 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls podName:d1beaeea-4ae8-452c-b3be-201c1ad4568e nodeName:}" failed. No retries permitted until 2026-04-17 14:34:43.352224677 +0000 UTC m=+49.708952281 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls") pod "dns-default-wbd2m" (UID: "d1beaeea-4ae8-452c-b3be-201c1ad4568e") : secret "dns-default-metrics-tls" not found Apr 17 14:34:35.453539 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:35.453499 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert\") pod \"ingress-canary-l9shq\" (UID: \"05c803a3-345a-4b4a-b40b-575211301efb\") " pod="openshift-ingress-canary/ingress-canary-l9shq" Apr 17 14:34:35.453724 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:35.453639 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:34:35.453724 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:35.453715 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert podName:05c803a3-345a-4b4a-b40b-575211301efb nodeName:}" failed. No retries permitted until 2026-04-17 14:34:43.453696013 +0000 UTC m=+49.810423630 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert") pod "ingress-canary-l9shq" (UID: "05c803a3-345a-4b4a-b40b-575211301efb") : secret "canary-serving-cert" not found Apr 17 14:34:36.468887 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:36.468852 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2" event={"ID":"08f10719-7fb4-455b-88d5-84e0bf2877d1","Type":"ContainerStarted","Data":"54d5b337152998c1490bbb95dad315e1570b0ed7107c437f72790e25891b6be3"} Apr 17 14:34:36.469233 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:36.468893 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2" event={"ID":"08f10719-7fb4-455b-88d5-84e0bf2877d1","Type":"ContainerStarted","Data":"dc187f3387f75586160323886cb48a89c24010c89b05c00052ec405f7aed2fc7"} Apr 17 14:34:36.485373 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:36.485329 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2" podStartSLOduration=1.366715742 podStartE2EDuration="6.485314751s" podCreationTimestamp="2026-04-17 14:34:30 +0000 UTC" firstStartedPulling="2026-04-17 14:34:31.199016308 +0000 UTC m=+37.555743916" lastFinishedPulling="2026-04-17 14:34:36.317615322 +0000 UTC m=+42.674342925" observedRunningTime="2026-04-17 14:34:36.48460939 +0000 UTC m=+42.841337014" watchObservedRunningTime="2026-04-17 14:34:36.485314751 +0000 UTC m=+42.842042376" Apr 17 14:34:39.789366 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:39.789334 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2404238c-49db-4bc3-b328-96679c365761-original-pull-secret\") pod \"global-pull-secret-syncer-thst5\" (UID: 
\"2404238c-49db-4bc3-b328-96679c365761\") " pod="kube-system/global-pull-secret-syncer-thst5" Apr 17 14:34:39.792587 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:39.792566 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2404238c-49db-4bc3-b328-96679c365761-original-pull-secret\") pod \"global-pull-secret-syncer-thst5\" (UID: \"2404238c-49db-4bc3-b328-96679c365761\") " pod="kube-system/global-pull-secret-syncer-thst5" Apr 17 14:34:39.947279 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:39.947226 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-thst5" Apr 17 14:34:40.079478 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:40.079400 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-thst5"] Apr 17 14:34:40.082187 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:34:40.082158 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2404238c_49db_4bc3_b328_96679c365761.slice/crio-cfd393e3473c9300bea9b9c3eba90ef1c607f73793eb18a4db05735c3249ef11 WatchSource:0}: Error finding container cfd393e3473c9300bea9b9c3eba90ef1c607f73793eb18a4db05735c3249ef11: Status 404 returned error can't find the container with id cfd393e3473c9300bea9b9c3eba90ef1c607f73793eb18a4db05735c3249ef11 Apr 17 14:34:40.477905 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:40.477824 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-thst5" event={"ID":"2404238c-49db-4bc3-b328-96679c365761","Type":"ContainerStarted","Data":"cfd393e3473c9300bea9b9c3eba90ef1c607f73793eb18a4db05735c3249ef11"} Apr 17 14:34:43.317370 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:43.317327 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9" Apr 17 14:34:43.317874 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:43.317393 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jngmx\" (UID: \"42b16b82-9513-445d-b01e-228784a51e88\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jngmx" Apr 17 14:34:43.317874 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:43.317488 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:34:43.317874 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:43.317514 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c57cd6dc9-2bkl9: secret "image-registry-tls" not found Apr 17 14:34:43.317874 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:43.317545 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 14:34:43.317874 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:43.317597 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls podName:74d427e0-c76f-43e0-8769-00df0351072e nodeName:}" failed. No retries permitted until 2026-04-17 14:34:59.317574365 +0000 UTC m=+65.674301990 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls") pod "image-registry-7c57cd6dc9-2bkl9" (UID: "74d427e0-c76f-43e0-8769-00df0351072e") : secret "image-registry-tls" not found Apr 17 14:34:43.317874 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:43.317611 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert podName:42b16b82-9513-445d-b01e-228784a51e88 nodeName:}" failed. No retries permitted until 2026-04-17 14:34:59.317605328 +0000 UTC m=+65.674332931 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jngmx" (UID: "42b16b82-9513-445d-b01e-228784a51e88") : secret "networking-console-plugin-cert" not found Apr 17 14:34:43.418102 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:43.418065 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls\") pod \"dns-default-wbd2m\" (UID: \"d1beaeea-4ae8-452c-b3be-201c1ad4568e\") " pod="openshift-dns/dns-default-wbd2m" Apr 17 14:34:43.418324 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:43.418267 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:34:43.418393 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:43.418341 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls podName:d1beaeea-4ae8-452c-b3be-201c1ad4568e nodeName:}" failed. No retries permitted until 2026-04-17 14:34:59.418323696 +0000 UTC m=+65.775051299 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls") pod "dns-default-wbd2m" (UID: "d1beaeea-4ae8-452c-b3be-201c1ad4568e") : secret "dns-default-metrics-tls" not found Apr 17 14:34:43.518954 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:43.518916 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert\") pod \"ingress-canary-l9shq\" (UID: \"05c803a3-345a-4b4a-b40b-575211301efb\") " pod="openshift-ingress-canary/ingress-canary-l9shq" Apr 17 14:34:43.519115 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:43.519095 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:34:43.519179 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:43.519164 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert podName:05c803a3-345a-4b4a-b40b-575211301efb nodeName:}" failed. No retries permitted until 2026-04-17 14:34:59.519144348 +0000 UTC m=+65.875871962 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert") pod "ingress-canary-l9shq" (UID: "05c803a3-345a-4b4a-b40b-575211301efb") : secret "canary-serving-cert" not found Apr 17 14:34:45.488945 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:45.488907 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-thst5" event={"ID":"2404238c-49db-4bc3-b328-96679c365761","Type":"ContainerStarted","Data":"3e56ff296fcad5a3ebd555f6898fa76fbc2afd382795a7c3546c24ea2c9ae075"} Apr 17 14:34:45.503071 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:45.503025 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-thst5" podStartSLOduration=34.009761914 podStartE2EDuration="38.503013287s" podCreationTimestamp="2026-04-17 14:34:07 +0000 UTC" firstStartedPulling="2026-04-17 14:34:40.084090361 +0000 UTC m=+46.440817977" lastFinishedPulling="2026-04-17 14:34:44.577341743 +0000 UTC m=+50.934069350" observedRunningTime="2026-04-17 14:34:45.502832611 +0000 UTC m=+51.859560236" watchObservedRunningTime="2026-04-17 14:34:45.503013287 +0000 UTC m=+51.859740912" Apr 17 14:34:51.434439 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:51.434410 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lr966" Apr 17 14:34:59.337636 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:59.337578 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9" Apr 17 14:34:59.338108 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:59.337667 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jngmx\" (UID: \"42b16b82-9513-445d-b01e-228784a51e88\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jngmx" Apr 17 14:34:59.338108 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:59.337752 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:34:59.338108 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:59.337765 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 14:34:59.338108 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:59.337774 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c57cd6dc9-2bkl9: secret "image-registry-tls" not found Apr 17 14:34:59.338108 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:59.337844 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert podName:42b16b82-9513-445d-b01e-228784a51e88 nodeName:}" failed. No retries permitted until 2026-04-17 14:35:31.33782969 +0000 UTC m=+97.694557293 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jngmx" (UID: "42b16b82-9513-445d-b01e-228784a51e88") : secret "networking-console-plugin-cert" not found Apr 17 14:34:59.338108 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:59.337859 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls podName:74d427e0-c76f-43e0-8769-00df0351072e nodeName:}" failed. No retries permitted until 2026-04-17 14:35:31.337853357 +0000 UTC m=+97.694580959 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls") pod "image-registry-7c57cd6dc9-2bkl9" (UID: "74d427e0-c76f-43e0-8769-00df0351072e") : secret "image-registry-tls" not found Apr 17 14:34:59.438029 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:59.437996 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls\") pod \"dns-default-wbd2m\" (UID: \"d1beaeea-4ae8-452c-b3be-201c1ad4568e\") " pod="openshift-dns/dns-default-wbd2m" Apr 17 14:34:59.438188 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:59.438135 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:34:59.438253 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:59.438194 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls podName:d1beaeea-4ae8-452c-b3be-201c1ad4568e nodeName:}" failed. No retries permitted until 2026-04-17 14:35:31.438178742 +0000 UTC m=+97.794906346 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls") pod "dns-default-wbd2m" (UID: "d1beaeea-4ae8-452c-b3be-201c1ad4568e") : secret "dns-default-metrics-tls" not found
Apr 17 14:34:59.538625 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:34:59.538557    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert\") pod \"ingress-canary-l9shq\" (UID: \"05c803a3-345a-4b4a-b40b-575211301efb\") " pod="openshift-ingress-canary/ingress-canary-l9shq"
Apr 17 14:34:59.538813 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:59.538701    2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:34:59.538813 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:34:59.538759    2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert podName:05c803a3-345a-4b4a-b40b-575211301efb nodeName:}" failed. No retries permitted until 2026-04-17 14:35:31.538745382 +0000 UTC m=+97.895472990 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert") pod "ingress-canary-l9shq" (UID: "05c803a3-345a-4b4a-b40b-575211301efb") : secret "canary-serving-cert" not found
Apr 17 14:35:00.042789 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:35:00.042739    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x52rp\" (UniqueName: \"kubernetes.io/projected/651b7208-91cf-42ee-b675-0a50ef1389f0-kube-api-access-x52rp\") pod \"network-check-target-2jb96\" (UID: \"651b7208-91cf-42ee-b675-0a50ef1389f0\") " pod="openshift-network-diagnostics/network-check-target-2jb96"
Apr 17 14:35:00.042789 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:35:00.042793    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs\") pod \"network-metrics-daemon-j6cgs\" (UID: \"4807a6e2-14df-4ba4-8aee-7422a65508f2\") " pod="openshift-multus/network-metrics-daemon-j6cgs"
Apr 17 14:35:00.045607 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:35:00.045574    2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 14:35:00.045746 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:35:00.045644    2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 14:35:00.053795 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:35:00.053774    2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 14:35:00.053866 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:35:00.053857    2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs podName:4807a6e2-14df-4ba4-8aee-7422a65508f2 nodeName:}" failed. No retries permitted until 2026-04-17 14:36:04.053839964 +0000 UTC m=+130.410567566 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs") pod "network-metrics-daemon-j6cgs" (UID: "4807a6e2-14df-4ba4-8aee-7422a65508f2") : secret "metrics-daemon-secret" not found
Apr 17 14:35:00.055966 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:35:00.055950    2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 14:35:00.066790 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:35:00.066773    2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x52rp\" (UniqueName: \"kubernetes.io/projected/651b7208-91cf-42ee-b675-0a50ef1389f0-kube-api-access-x52rp\") pod \"network-check-target-2jb96\" (UID: \"651b7208-91cf-42ee-b675-0a50ef1389f0\") " pod="openshift-network-diagnostics/network-check-target-2jb96"
Apr 17 14:35:00.354994 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:35:00.354910    2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-mnbtt\""
Apr 17 14:35:00.363070 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:35:00.363054    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jb96"
Apr 17 14:35:00.471503 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:35:00.471472    2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2jb96"]
Apr 17 14:35:00.474573 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:35:00.474546    2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod651b7208_91cf_42ee_b675_0a50ef1389f0.slice/crio-974c905f8db15a5b4623941d5a1c05fb073ee208f4837892166971e60c9025f4 WatchSource:0}: Error finding container 974c905f8db15a5b4623941d5a1c05fb073ee208f4837892166971e60c9025f4: Status 404 returned error can't find the container with id 974c905f8db15a5b4623941d5a1c05fb073ee208f4837892166971e60c9025f4
Apr 17 14:35:00.518825 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:35:00.518792    2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2jb96" event={"ID":"651b7208-91cf-42ee-b675-0a50ef1389f0","Type":"ContainerStarted","Data":"974c905f8db15a5b4623941d5a1c05fb073ee208f4837892166971e60c9025f4"}
Apr 17 14:35:03.528433 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:35:03.528403    2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2jb96" event={"ID":"651b7208-91cf-42ee-b675-0a50ef1389f0","Type":"ContainerStarted","Data":"7acca085b9de3095775976cc2a4f537b69857e887062479b0698c2e7591072c1"}
Apr 17 14:35:03.528862 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:35:03.528521    2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-2jb96"
Apr 17 14:35:03.544221 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:35:03.544177    2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-2jb96" podStartSLOduration=66.980238843 podStartE2EDuration="1m9.544161678s" podCreationTimestamp="2026-04-17 14:33:54 +0000 UTC" firstStartedPulling="2026-04-17 14:35:00.476835475 +0000 UTC m=+66.833563078" lastFinishedPulling="2026-04-17 14:35:03.040758309 +0000 UTC m=+69.397485913" observedRunningTime="2026-04-17 14:35:03.54378692 +0000 UTC m=+69.900514546" watchObservedRunningTime="2026-04-17 14:35:03.544161678 +0000 UTC m=+69.900889303"
Apr 17 14:35:31.382839 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:35:31.382794    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:35:31.383350 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:35:31.382850    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jngmx\" (UID: \"42b16b82-9513-445d-b01e-228784a51e88\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jngmx"
Apr 17 14:35:31.383350 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:35:31.382952    2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 14:35:31.383350 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:35:31.382961    2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 14:35:31.383350 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:35:31.382986    2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c57cd6dc9-2bkl9: secret "image-registry-tls" not found
Apr 17 14:35:31.383350 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:35:31.383013    2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert podName:42b16b82-9513-445d-b01e-228784a51e88 nodeName:}" failed. No retries permitted until 2026-04-17 14:36:35.38299836 +0000 UTC m=+161.739725963 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jngmx" (UID: "42b16b82-9513-445d-b01e-228784a51e88") : secret "networking-console-plugin-cert" not found
Apr 17 14:35:31.383350 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:35:31.383051    2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls podName:74d427e0-c76f-43e0-8769-00df0351072e nodeName:}" failed. No retries permitted until 2026-04-17 14:36:35.38303374 +0000 UTC m=+161.739761345 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls") pod "image-registry-7c57cd6dc9-2bkl9" (UID: "74d427e0-c76f-43e0-8769-00df0351072e") : secret "image-registry-tls" not found
Apr 17 14:35:31.484166 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:35:31.484130    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls\") pod \"dns-default-wbd2m\" (UID: \"d1beaeea-4ae8-452c-b3be-201c1ad4568e\") " pod="openshift-dns/dns-default-wbd2m"
Apr 17 14:35:31.484340 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:35:31.484298    2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:35:31.484388 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:35:31.484375    2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls podName:d1beaeea-4ae8-452c-b3be-201c1ad4568e nodeName:}" failed. No retries permitted until 2026-04-17 14:36:35.484358371 +0000 UTC m=+161.841085979 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls") pod "dns-default-wbd2m" (UID: "d1beaeea-4ae8-452c-b3be-201c1ad4568e") : secret "dns-default-metrics-tls" not found
Apr 17 14:35:31.585382 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:35:31.585347    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert\") pod \"ingress-canary-l9shq\" (UID: \"05c803a3-345a-4b4a-b40b-575211301efb\") " pod="openshift-ingress-canary/ingress-canary-l9shq"
Apr 17 14:35:31.585544 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:35:31.585452    2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:35:31.585544 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:35:31.585500    2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert podName:05c803a3-345a-4b4a-b40b-575211301efb nodeName:}" failed. No retries permitted until 2026-04-17 14:36:35.585486375 +0000 UTC m=+161.942213978 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert") pod "ingress-canary-l9shq" (UID: "05c803a3-345a-4b4a-b40b-575211301efb") : secret "canary-serving-cert" not found
Apr 17 14:35:34.533375 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:35:34.533343    2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-2jb96"
Apr 17 14:36:04.128798 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:04.128761    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs\") pod \"network-metrics-daemon-j6cgs\" (UID: \"4807a6e2-14df-4ba4-8aee-7422a65508f2\") " pod="openshift-multus/network-metrics-daemon-j6cgs"
Apr 17 14:36:04.129321 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:36:04.128886    2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 14:36:04.129321 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:36:04.128940    2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs podName:4807a6e2-14df-4ba4-8aee-7422a65508f2 nodeName:}" failed. No retries permitted until 2026-04-17 14:38:06.128925608 +0000 UTC m=+252.485653211 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs") pod "network-metrics-daemon-j6cgs" (UID: "4807a6e2-14df-4ba4-8aee-7422a65508f2") : secret "metrics-daemon-secret" not found
Apr 17 14:36:23.217926 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:23.217896    2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-s9w25_e599dbf6-a663-42a7-82bb-12ed438c2ba8/dns-node-resolver/0.log"
Apr 17 14:36:24.018350 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:24.018320    2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pzwpd_c66a6c5d-9d68-42d2-ac7d-1ef80b44ed2f/node-ca/0.log"
Apr 17 14:36:30.489869 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:36:30.489828    2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9" podUID="74d427e0-c76f-43e0-8769-00df0351072e"
Apr 17 14:36:30.496978 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:36:30.496929    2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jngmx" podUID="42b16b82-9513-445d-b01e-228784a51e88"
Apr 17 14:36:30.512156 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:36:30.512122    2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-wbd2m" podUID="d1beaeea-4ae8-452c-b3be-201c1ad4568e"
Apr 17 14:36:30.604268 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:36:30.604217    2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-l9shq" podUID="05c803a3-345a-4b4a-b40b-575211301efb"
Apr 17 14:36:30.698271 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:30.698168    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:36:30.698271 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:30.698168    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wbd2m"
Apr 17 14:36:30.698465 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:30.698168    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l9shq"
Apr 17 14:36:30.698465 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:30.698179    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jngmx"
Apr 17 14:36:32.243775 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:36:32.243736    2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-j6cgs" podUID="4807a6e2-14df-4ba4-8aee-7422a65508f2"
Apr 17 14:36:35.467110 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:35.467067    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:36:35.467522 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:35.467132    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jngmx\" (UID: \"42b16b82-9513-445d-b01e-228784a51e88\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jngmx"
Apr 17 14:36:35.467522 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:36:35.467264    2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 14:36:35.467522 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:36:35.467345    2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert podName:42b16b82-9513-445d-b01e-228784a51e88 nodeName:}" failed. No retries permitted until 2026-04-17 14:38:37.467326891 +0000 UTC m=+283.824054496 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jngmx" (UID: "42b16b82-9513-445d-b01e-228784a51e88") : secret "networking-console-plugin-cert" not found
Apr 17 14:36:35.469628 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:35.469600    2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls\") pod \"image-registry-7c57cd6dc9-2bkl9\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") " pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:36:35.501067 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:35.501038    2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-qzqq8\""
Apr 17 14:36:35.509075 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:35.509046    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:36:35.568317 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:35.568286    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls\") pod \"dns-default-wbd2m\" (UID: \"d1beaeea-4ae8-452c-b3be-201c1ad4568e\") " pod="openshift-dns/dns-default-wbd2m"
Apr 17 14:36:35.571669 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:35.571603    2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1beaeea-4ae8-452c-b3be-201c1ad4568e-metrics-tls\") pod \"dns-default-wbd2m\" (UID: \"d1beaeea-4ae8-452c-b3be-201c1ad4568e\") " pod="openshift-dns/dns-default-wbd2m"
Apr 17 14:36:35.625106 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:35.625072    2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"]
Apr 17 14:36:35.628230 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:36:35.628201    2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74d427e0_c76f_43e0_8769_00df0351072e.slice/crio-3118461d3ce51bb232521f1868a73dee331ed69879a38959f671035d5792ce5c WatchSource:0}: Error finding container 3118461d3ce51bb232521f1868a73dee331ed69879a38959f671035d5792ce5c: Status 404 returned error can't find the container with id 3118461d3ce51bb232521f1868a73dee331ed69879a38959f671035d5792ce5c
Apr 17 14:36:35.670088 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:35.670065    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert\") pod \"ingress-canary-l9shq\" (UID: \"05c803a3-345a-4b4a-b40b-575211301efb\") " pod="openshift-ingress-canary/ingress-canary-l9shq"
Apr 17 14:36:35.672187 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:35.672168    2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05c803a3-345a-4b4a-b40b-575211301efb-cert\") pod \"ingress-canary-l9shq\" (UID: \"05c803a3-345a-4b4a-b40b-575211301efb\") " pod="openshift-ingress-canary/ingress-canary-l9shq"
Apr 17 14:36:35.710926 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:35.710893    2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9" event={"ID":"74d427e0-c76f-43e0-8769-00df0351072e","Type":"ContainerStarted","Data":"10f010d80a3213479d2718e071d42d792cb9855bfbb0cc5d72a23c46dd7eba97"}
Apr 17 14:36:35.710926 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:35.710931    2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9" event={"ID":"74d427e0-c76f-43e0-8769-00df0351072e","Type":"ContainerStarted","Data":"3118461d3ce51bb232521f1868a73dee331ed69879a38959f671035d5792ce5c"}
Apr 17 14:36:35.711127 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:35.711024    2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:36:35.730320 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:35.730205    2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9" podStartSLOduration=163.730189042 podStartE2EDuration="2m43.730189042s" podCreationTimestamp="2026-04-17 14:33:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:36:35.728437968 +0000 UTC m=+162.085165589" watchObservedRunningTime="2026-04-17 14:36:35.730189042 +0000 UTC m=+162.086916666"
Apr 17 14:36:35.801388 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:35.801353    2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dtcwn\""
Apr 17 14:36:35.801388 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:35.801353    2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fr4dp\""
Apr 17 14:36:35.809547 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:35.809516    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l9shq"
Apr 17 14:36:35.809547 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:35.809540    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wbd2m"
Apr 17 14:36:35.938060 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:35.938018    2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-l9shq"]
Apr 17 14:36:35.942594 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:36:35.942557    2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05c803a3_345a_4b4a_b40b_575211301efb.slice/crio-9ecc8c5f9a2c83bc8e528b7beb8f7d0fda0370af263cd2bea08cba71e1a57d96 WatchSource:0}: Error finding container 9ecc8c5f9a2c83bc8e528b7beb8f7d0fda0370af263cd2bea08cba71e1a57d96: Status 404 returned error can't find the container with id 9ecc8c5f9a2c83bc8e528b7beb8f7d0fda0370af263cd2bea08cba71e1a57d96
Apr 17 14:36:35.957733 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:35.957702    2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wbd2m"]
Apr 17 14:36:35.961009 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:36:35.960982    2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1beaeea_4ae8_452c_b3be_201c1ad4568e.slice/crio-7b3e984471d43fc06f566a8c1a866ad45fd16ed08981ecf8a808f7d3c4bdbf91 WatchSource:0}: Error finding container 7b3e984471d43fc06f566a8c1a866ad45fd16ed08981ecf8a808f7d3c4bdbf91: Status 404 returned error can't find the container with id 7b3e984471d43fc06f566a8c1a866ad45fd16ed08981ecf8a808f7d3c4bdbf91
Apr 17 14:36:36.716064 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:36.716029    2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wbd2m" event={"ID":"d1beaeea-4ae8-452c-b3be-201c1ad4568e","Type":"ContainerStarted","Data":"7b3e984471d43fc06f566a8c1a866ad45fd16ed08981ecf8a808f7d3c4bdbf91"}
Apr 17 14:36:36.717310 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:36.717267    2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-l9shq" event={"ID":"05c803a3-345a-4b4a-b40b-575211301efb","Type":"ContainerStarted","Data":"9ecc8c5f9a2c83bc8e528b7beb8f7d0fda0370af263cd2bea08cba71e1a57d96"}
Apr 17 14:36:38.723314 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:38.723275    2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wbd2m" event={"ID":"d1beaeea-4ae8-452c-b3be-201c1ad4568e","Type":"ContainerStarted","Data":"da73f84e77bb31dec7e908ebce07450e73d845cf9ee15b7c710e9f130c9a7777"}
Apr 17 14:36:38.723788 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:38.723313    2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wbd2m" event={"ID":"d1beaeea-4ae8-452c-b3be-201c1ad4568e","Type":"ContainerStarted","Data":"99b04ce184872ba3ce2120182fd627cbbfcb4de3c77fdedb86c06c20080954be"}
Apr 17 14:36:38.726718 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:38.724263    2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-wbd2m"
Apr 17 14:36:38.728073 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:38.728051    2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-l9shq" event={"ID":"05c803a3-345a-4b4a-b40b-575211301efb","Type":"ContainerStarted","Data":"b54bf03a9ebd86370914a689cd415861feefcec0aa682dc10cbd2e4e50c70e9e"}
Apr 17 14:36:38.740563 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:38.740525    2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wbd2m" podStartSLOduration=129.460911971 podStartE2EDuration="2m11.740514118s" podCreationTimestamp="2026-04-17 14:34:27 +0000 UTC" firstStartedPulling="2026-04-17 14:36:35.962841769 +0000 UTC m=+162.319569372" lastFinishedPulling="2026-04-17 14:36:38.242443916 +0000 UTC m=+164.599171519" observedRunningTime="2026-04-17 14:36:38.738941459 +0000 UTC m=+165.095669084" watchObservedRunningTime="2026-04-17 14:36:38.740514118 +0000 UTC m=+165.097241743"
Apr 17 14:36:38.752932 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:38.752898    2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-l9shq" podStartSLOduration=129.451832014 podStartE2EDuration="2m11.75288726s" podCreationTimestamp="2026-04-17 14:34:27 +0000 UTC" firstStartedPulling="2026-04-17 14:36:35.94479207 +0000 UTC m=+162.301519673" lastFinishedPulling="2026-04-17 14:36:38.2458473 +0000 UTC m=+164.602574919" observedRunningTime="2026-04-17 14:36:38.751974499 +0000 UTC m=+165.108702173" watchObservedRunningTime="2026-04-17 14:36:38.75288726 +0000 UTC m=+165.109614885"
Apr 17 14:36:41.856211 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:41.856130    2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-v5hj8"]
Apr 17 14:36:41.860743 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:41.860727    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-v5hj8"
Apr 17 14:36:41.863476 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:41.863446    2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 14:36:41.865988 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:41.865969    2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 14:36:41.866149 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:41.866137    2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 14:36:41.866333 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:41.866314    2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qsgzg\""
Apr 17 14:36:41.866414 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:41.866318    2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 14:36:41.870732 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:41.870713    2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-v5hj8"]
Apr 17 14:36:42.020365 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:42.020328    2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb7ln\" (UniqueName: \"kubernetes.io/projected/254b8d3e-f0ea-4092-933e-966862da913d-kube-api-access-bb7ln\") pod \"insights-runtime-extractor-v5hj8\" (UID: \"254b8d3e-f0ea-4092-933e-966862da913d\") " pod="openshift-insights/insights-runtime-extractor-v5hj8"
Apr 17 14:36:42.020537 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:42.020377    2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/254b8d3e-f0ea-4092-933e-966862da913d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v5hj8\" (UID: \"254b8d3e-f0ea-4092-933e-966862da913d\") " pod="openshift-insights/insights-runtime-extractor-v5hj8"
Apr 17 14:36:42.020537 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:42.020401    2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/254b8d3e-f0ea-4092-933e-966862da913d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v5hj8\" (UID: \"254b8d3e-f0ea-4092-933e-966862da913d\") " pod="openshift-insights/insights-runtime-extractor-v5hj8"
Apr 17 14:36:42.020537 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:42.020462    2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/254b8d3e-f0ea-4092-933e-966862da913d-data-volume\") pod \"insights-runtime-extractor-v5hj8\" (UID: \"254b8d3e-f0ea-4092-933e-966862da913d\") " pod="openshift-insights/insights-runtime-extractor-v5hj8"
Apr 17 14:36:42.020537 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:42.020510    2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/254b8d3e-f0ea-4092-933e-966862da913d-crio-socket\") pod \"insights-runtime-extractor-v5hj8\" (UID: \"254b8d3e-f0ea-4092-933e-966862da913d\") " pod="openshift-insights/insights-runtime-extractor-v5hj8"
Apr 17 14:36:42.121186 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:42.121104    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/254b8d3e-f0ea-4092-933e-966862da913d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v5hj8\" (UID: \"254b8d3e-f0ea-4092-933e-966862da913d\") " pod="openshift-insights/insights-runtime-extractor-v5hj8"
Apr 17 14:36:42.121186 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:42.121144    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/254b8d3e-f0ea-4092-933e-966862da913d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v5hj8\" (UID: \"254b8d3e-f0ea-4092-933e-966862da913d\") " pod="openshift-insights/insights-runtime-extractor-v5hj8"
Apr 17 14:36:42.121186 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:42.121173    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/254b8d3e-f0ea-4092-933e-966862da913d-data-volume\") pod \"insights-runtime-extractor-v5hj8\" (UID: \"254b8d3e-f0ea-4092-933e-966862da913d\") " pod="openshift-insights/insights-runtime-extractor-v5hj8"
Apr 17 14:36:42.121455 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:42.121198    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/254b8d3e-f0ea-4092-933e-966862da913d-crio-socket\") pod \"insights-runtime-extractor-v5hj8\" (UID: \"254b8d3e-f0ea-4092-933e-966862da913d\") " pod="openshift-insights/insights-runtime-extractor-v5hj8"
Apr 17 14:36:42.121455 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:42.121292    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bb7ln\" (UniqueName: \"kubernetes.io/projected/254b8d3e-f0ea-4092-933e-966862da913d-kube-api-access-bb7ln\") pod \"insights-runtime-extractor-v5hj8\" (UID: \"254b8d3e-f0ea-4092-933e-966862da913d\") " pod="openshift-insights/insights-runtime-extractor-v5hj8"
Apr 17 14:36:42.121455 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:42.121403    2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/254b8d3e-f0ea-4092-933e-966862da913d-crio-socket\") pod \"insights-runtime-extractor-v5hj8\" (UID: \"254b8d3e-f0ea-4092-933e-966862da913d\") " pod="openshift-insights/insights-runtime-extractor-v5hj8"
Apr 17 14:36:42.121759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:42.121737    2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/254b8d3e-f0ea-4092-933e-966862da913d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v5hj8\" (UID: \"254b8d3e-f0ea-4092-933e-966862da913d\") " pod="openshift-insights/insights-runtime-extractor-v5hj8"
Apr 17 14:36:42.122139 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:42.122124    2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/254b8d3e-f0ea-4092-933e-966862da913d-data-volume\") pod \"insights-runtime-extractor-v5hj8\" (UID: \"254b8d3e-f0ea-4092-933e-966862da913d\") " pod="openshift-insights/insights-runtime-extractor-v5hj8"
Apr 17 14:36:42.123566 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:42.123548    2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/254b8d3e-f0ea-4092-933e-966862da913d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v5hj8\" (UID: \"254b8d3e-f0ea-4092-933e-966862da913d\") " pod="openshift-insights/insights-runtime-extractor-v5hj8"
Apr 17 14:36:42.129194 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:42.129174    2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb7ln\" (UniqueName: \"kubernetes.io/projected/254b8d3e-f0ea-4092-933e-966862da913d-kube-api-access-bb7ln\") pod \"insights-runtime-extractor-v5hj8\" (UID: \"254b8d3e-f0ea-4092-933e-966862da913d\") " pod="openshift-insights/insights-runtime-extractor-v5hj8"
Apr 17 14:36:42.169319 ip-10-0-129-134 kubenswrapper[2570]:
I0417 14:36:42.169290 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-v5hj8" Apr 17 14:36:42.278585 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:42.278545 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-v5hj8"] Apr 17 14:36:42.281539 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:36:42.281502 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod254b8d3e_f0ea_4092_933e_966862da913d.slice/crio-e454e4b886f677fd36063587c6cfd7041b44b7a0527cf7db9d06b45a20ec2a3d WatchSource:0}: Error finding container e454e4b886f677fd36063587c6cfd7041b44b7a0527cf7db9d06b45a20ec2a3d: Status 404 returned error can't find the container with id e454e4b886f677fd36063587c6cfd7041b44b7a0527cf7db9d06b45a20ec2a3d Apr 17 14:36:42.740234 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:42.740198 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v5hj8" event={"ID":"254b8d3e-f0ea-4092-933e-966862da913d","Type":"ContainerStarted","Data":"fc1db52866817caf3d8aafca26602db2dba5e9f1b121b20e58456d4148720c48"} Apr 17 14:36:42.740234 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:42.740251 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v5hj8" event={"ID":"254b8d3e-f0ea-4092-933e-966862da913d","Type":"ContainerStarted","Data":"e454e4b886f677fd36063587c6cfd7041b44b7a0527cf7db9d06b45a20ec2a3d"} Apr 17 14:36:43.747122 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:43.747088 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v5hj8" event={"ID":"254b8d3e-f0ea-4092-933e-966862da913d","Type":"ContainerStarted","Data":"8af263649a6ab24c9a435d5e3034cf3119fe7f20584ece93fd40aa36bf792efe"} Apr 17 14:36:45.752844 ip-10-0-129-134 
kubenswrapper[2570]: I0417 14:36:45.752808 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v5hj8" event={"ID":"254b8d3e-f0ea-4092-933e-966862da913d","Type":"ContainerStarted","Data":"72ce9d1a798ac1118648b749a4d50786071188d122610335cd80e465cadbe3c4"} Apr 17 14:36:45.768765 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:45.768722 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-v5hj8" podStartSLOduration=2.315963294 podStartE2EDuration="4.768709825s" podCreationTimestamp="2026-04-17 14:36:41 +0000 UTC" firstStartedPulling="2026-04-17 14:36:42.33635806 +0000 UTC m=+168.693085662" lastFinishedPulling="2026-04-17 14:36:44.789104577 +0000 UTC m=+171.145832193" observedRunningTime="2026-04-17 14:36:45.768486838 +0000 UTC m=+172.125214464" watchObservedRunningTime="2026-04-17 14:36:45.768709825 +0000 UTC m=+172.125437450" Apr 17 14:36:47.234718 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:47.234681 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6cgs" Apr 17 14:36:48.732958 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:48.732928 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wbd2m" Apr 17 14:36:55.512826 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:55.512787 2570 patch_prober.go:28] interesting pod/image-registry-7c57cd6dc9-2bkl9 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 14:36:55.513194 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:55.512844 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9" podUID="74d427e0-c76f-43e0-8769-00df0351072e" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 14:36:56.722017 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:36:56.721987 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9" Apr 17 14:37:00.382430 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.382391 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-csjdg"] Apr 17 14:37:00.387077 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.387055 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-csjdg" Apr 17 14:37:00.389892 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.389866 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 14:37:00.389892 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.389868 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 14:37:00.391080 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.391055 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 17 14:37:00.391198 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.391180 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 17 14:37:00.391274 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.391226 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-mbd5z\"" Apr 17 14:37:00.391329 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.391307 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 14:37:00.398779 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.398757 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-csjdg"] Apr 17 14:37:00.402386 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.402365 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-pvlsg"] Apr 17 14:37:00.405689 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.405667 2570 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/node-exporter-jsqmw"] Apr 17 14:37:00.405837 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.405821 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" Apr 17 14:37:00.408513 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.408490 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 17 14:37:00.408630 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.408607 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 17 14:37:00.408864 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.408846 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 17 14:37:00.408958 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.408909 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-gwxd4\"" Apr 17 14:37:00.408958 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.408853 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.411167 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.411146 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 14:37:00.411296 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.411176 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-zfm9f\"" Apr 17 14:37:00.411296 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.411269 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 14:37:00.411809 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.411783 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 14:37:00.419291 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.419266 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-pvlsg"] Apr 17 14:37:00.462258 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.462216 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ed45cab8-95d1-4284-9dc1-2d602984c4e1-node-exporter-textfile\") pod \"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.462442 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.462299 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ed45cab8-95d1-4284-9dc1-2d602984c4e1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jsqmw\" (UID: 
\"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.462442 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.462342 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ed45cab8-95d1-4284-9dc1-2d602984c4e1-node-exporter-tls\") pod \"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.462442 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.462359 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jsnp\" (UniqueName: \"kubernetes.io/projected/ed45cab8-95d1-4284-9dc1-2d602984c4e1-kube-api-access-8jsnp\") pod \"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.462442 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.462378 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54chl\" (UniqueName: \"kubernetes.io/projected/12e54482-f4d9-41a0-ba0c-533f43bca23f-kube-api-access-54chl\") pod \"kube-state-metrics-69db897b98-pvlsg\" (UID: \"12e54482-f4d9-41a0-ba0c-533f43bca23f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" Apr 17 14:37:00.462442 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.462406 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6aedddde-b37e-4862-ac07-19ecebf2ca41-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-csjdg\" (UID: \"6aedddde-b37e-4862-ac07-19ecebf2ca41\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-csjdg" Apr 17 14:37:00.462717 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.462471 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed45cab8-95d1-4284-9dc1-2d602984c4e1-metrics-client-ca\") pod \"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.462717 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.462517 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/12e54482-f4d9-41a0-ba0c-533f43bca23f-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-pvlsg\" (UID: \"12e54482-f4d9-41a0-ba0c-533f43bca23f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" Apr 17 14:37:00.462717 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.462543 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skqzx\" (UniqueName: \"kubernetes.io/projected/6aedddde-b37e-4862-ac07-19ecebf2ca41-kube-api-access-skqzx\") pod \"openshift-state-metrics-9d44df66c-csjdg\" (UID: \"6aedddde-b37e-4862-ac07-19ecebf2ca41\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-csjdg" Apr 17 14:37:00.462717 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.462566 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/12e54482-f4d9-41a0-ba0c-533f43bca23f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-pvlsg\" (UID: \"12e54482-f4d9-41a0-ba0c-533f43bca23f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" Apr 17 14:37:00.462717 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.462624 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/12e54482-f4d9-41a0-ba0c-533f43bca23f-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-pvlsg\" (UID: \"12e54482-f4d9-41a0-ba0c-533f43bca23f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" Apr 17 14:37:00.462717 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.462647 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12e54482-f4d9-41a0-ba0c-533f43bca23f-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-pvlsg\" (UID: \"12e54482-f4d9-41a0-ba0c-533f43bca23f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" Apr 17 14:37:00.462717 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.462702 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6aedddde-b37e-4862-ac07-19ecebf2ca41-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-csjdg\" (UID: \"6aedddde-b37e-4862-ac07-19ecebf2ca41\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-csjdg" Apr 17 14:37:00.463055 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.462732 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/12e54482-f4d9-41a0-ba0c-533f43bca23f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-pvlsg\" (UID: \"12e54482-f4d9-41a0-ba0c-533f43bca23f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" Apr 17 14:37:00.463055 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.462766 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ed45cab8-95d1-4284-9dc1-2d602984c4e1-root\") pod 
\"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.463055 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.462788 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ed45cab8-95d1-4284-9dc1-2d602984c4e1-node-exporter-wtmp\") pod \"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.463055 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.462811 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ed45cab8-95d1-4284-9dc1-2d602984c4e1-sys\") pod \"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.463055 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.462827 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ed45cab8-95d1-4284-9dc1-2d602984c4e1-node-exporter-accelerators-collector-config\") pod \"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.463055 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.462846 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6aedddde-b37e-4862-ac07-19ecebf2ca41-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-csjdg\" (UID: \"6aedddde-b37e-4862-ac07-19ecebf2ca41\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-csjdg" Apr 17 14:37:00.563423 ip-10-0-129-134 
kubenswrapper[2570]: I0417 14:37:00.563374 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/12e54482-f4d9-41a0-ba0c-533f43bca23f-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-pvlsg\" (UID: \"12e54482-f4d9-41a0-ba0c-533f43bca23f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" Apr 17 14:37:00.563423 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.563425 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skqzx\" (UniqueName: \"kubernetes.io/projected/6aedddde-b37e-4862-ac07-19ecebf2ca41-kube-api-access-skqzx\") pod \"openshift-state-metrics-9d44df66c-csjdg\" (UID: \"6aedddde-b37e-4862-ac07-19ecebf2ca41\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-csjdg" Apr 17 14:37:00.563678 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.563452 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/12e54482-f4d9-41a0-ba0c-533f43bca23f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-pvlsg\" (UID: \"12e54482-f4d9-41a0-ba0c-533f43bca23f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" Apr 17 14:37:00.563678 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.563476 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/12e54482-f4d9-41a0-ba0c-533f43bca23f-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-pvlsg\" (UID: \"12e54482-f4d9-41a0-ba0c-533f43bca23f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" Apr 17 14:37:00.563678 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.563497 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12e54482-f4d9-41a0-ba0c-533f43bca23f-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-pvlsg\" (UID: \"12e54482-f4d9-41a0-ba0c-533f43bca23f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" Apr 17 14:37:00.563678 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.563521 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6aedddde-b37e-4862-ac07-19ecebf2ca41-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-csjdg\" (UID: \"6aedddde-b37e-4862-ac07-19ecebf2ca41\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-csjdg" Apr 17 14:37:00.563678 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.563537 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/12e54482-f4d9-41a0-ba0c-533f43bca23f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-pvlsg\" (UID: \"12e54482-f4d9-41a0-ba0c-533f43bca23f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" Apr 17 14:37:00.563678 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.563567 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ed45cab8-95d1-4284-9dc1-2d602984c4e1-root\") pod \"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.563678 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.563584 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ed45cab8-95d1-4284-9dc1-2d602984c4e1-node-exporter-wtmp\") pod \"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" 
Apr 17 14:37:00.563678 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.563608 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ed45cab8-95d1-4284-9dc1-2d602984c4e1-sys\") pod \"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.563678 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:37:00.563619 2570 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 17 14:37:00.564100 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:37:00.563699 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12e54482-f4d9-41a0-ba0c-533f43bca23f-kube-state-metrics-tls podName:12e54482-f4d9-41a0-ba0c-533f43bca23f nodeName:}" failed. No retries permitted until 2026-04-17 14:37:01.063677051 +0000 UTC m=+187.420404944 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/12e54482-f4d9-41a0-ba0c-533f43bca23f-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-pvlsg" (UID: "12e54482-f4d9-41a0-ba0c-533f43bca23f") : secret "kube-state-metrics-tls" not found Apr 17 14:37:00.564100 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.563631 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ed45cab8-95d1-4284-9dc1-2d602984c4e1-node-exporter-accelerators-collector-config\") pod \"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.564100 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.563812 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/6aedddde-b37e-4862-ac07-19ecebf2ca41-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-csjdg\" (UID: \"6aedddde-b37e-4862-ac07-19ecebf2ca41\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-csjdg" Apr 17 14:37:00.564100 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.563859 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ed45cab8-95d1-4284-9dc1-2d602984c4e1-node-exporter-textfile\") pod \"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.564100 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.563896 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ed45cab8-95d1-4284-9dc1-2d602984c4e1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.564100 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.563946 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ed45cab8-95d1-4284-9dc1-2d602984c4e1-node-exporter-tls\") pod \"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.564100 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.563958 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ed45cab8-95d1-4284-9dc1-2d602984c4e1-root\") pod \"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.564100 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.563974 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jsnp\" (UniqueName: \"kubernetes.io/projected/ed45cab8-95d1-4284-9dc1-2d602984c4e1-kube-api-access-8jsnp\") pod \"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.564100 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.563897 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/12e54482-f4d9-41a0-ba0c-533f43bca23f-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-pvlsg\" (UID: \"12e54482-f4d9-41a0-ba0c-533f43bca23f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" Apr 17 14:37:00.564100 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.564006 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54chl\" (UniqueName: \"kubernetes.io/projected/12e54482-f4d9-41a0-ba0c-533f43bca23f-kube-api-access-54chl\") pod \"kube-state-metrics-69db897b98-pvlsg\" (UID: \"12e54482-f4d9-41a0-ba0c-533f43bca23f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" Apr 17 14:37:00.564100 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.564045 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6aedddde-b37e-4862-ac07-19ecebf2ca41-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-csjdg\" (UID: \"6aedddde-b37e-4862-ac07-19ecebf2ca41\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-csjdg" Apr 17 14:37:00.564100 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.564074 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed45cab8-95d1-4284-9dc1-2d602984c4e1-metrics-client-ca\") pod \"node-exporter-jsqmw\" (UID: 
\"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.564714 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.564289 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12e54482-f4d9-41a0-ba0c-533f43bca23f-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-pvlsg\" (UID: \"12e54482-f4d9-41a0-ba0c-533f43bca23f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" Apr 17 14:37:00.564714 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.564357 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ed45cab8-95d1-4284-9dc1-2d602984c4e1-sys\") pod \"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.564714 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.564428 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6aedddde-b37e-4862-ac07-19ecebf2ca41-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-csjdg\" (UID: \"6aedddde-b37e-4862-ac07-19ecebf2ca41\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-csjdg" Apr 17 14:37:00.564714 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:37:00.564525 2570 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 17 14:37:00.564714 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:37:00.564581 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6aedddde-b37e-4862-ac07-19ecebf2ca41-openshift-state-metrics-tls podName:6aedddde-b37e-4862-ac07-19ecebf2ca41 nodeName:}" failed. No retries permitted until 2026-04-17 14:37:01.064564019 +0000 UTC m=+187.421291640 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/6aedddde-b37e-4862-ac07-19ecebf2ca41-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-csjdg" (UID: "6aedddde-b37e-4862-ac07-19ecebf2ca41") : secret "openshift-state-metrics-tls" not found Apr 17 14:37:00.564714 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:37:00.564667 2570 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 14:37:00.564714 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.564423 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/12e54482-f4d9-41a0-ba0c-533f43bca23f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-pvlsg\" (UID: \"12e54482-f4d9-41a0-ba0c-533f43bca23f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" Apr 17 14:37:00.564714 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.564707 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ed45cab8-95d1-4284-9dc1-2d602984c4e1-node-exporter-wtmp\") pod \"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.564714 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:37:00.564717 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed45cab8-95d1-4284-9dc1-2d602984c4e1-node-exporter-tls podName:ed45cab8-95d1-4284-9dc1-2d602984c4e1 nodeName:}" failed. No retries permitted until 2026-04-17 14:37:01.064701188 +0000 UTC m=+187.421428796 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/ed45cab8-95d1-4284-9dc1-2d602984c4e1-node-exporter-tls") pod "node-exporter-jsqmw" (UID: "ed45cab8-95d1-4284-9dc1-2d602984c4e1") : secret "node-exporter-tls" not found Apr 17 14:37:00.565005 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.564857 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed45cab8-95d1-4284-9dc1-2d602984c4e1-metrics-client-ca\") pod \"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.565005 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.564956 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ed45cab8-95d1-4284-9dc1-2d602984c4e1-node-exporter-textfile\") pod \"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.565339 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.565318 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ed45cab8-95d1-4284-9dc1-2d602984c4e1-node-exporter-accelerators-collector-config\") pod \"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.566345 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.566320 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6aedddde-b37e-4862-ac07-19ecebf2ca41-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-csjdg\" (UID: \"6aedddde-b37e-4862-ac07-19ecebf2ca41\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-csjdg" Apr 17 14:37:00.566345 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.566335 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ed45cab8-95d1-4284-9dc1-2d602984c4e1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.566489 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.566380 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/12e54482-f4d9-41a0-ba0c-533f43bca23f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-pvlsg\" (UID: \"12e54482-f4d9-41a0-ba0c-533f43bca23f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" Apr 17 14:37:00.575028 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.575004 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-skqzx\" (UniqueName: \"kubernetes.io/projected/6aedddde-b37e-4862-ac07-19ecebf2ca41-kube-api-access-skqzx\") pod \"openshift-state-metrics-9d44df66c-csjdg\" (UID: \"6aedddde-b37e-4862-ac07-19ecebf2ca41\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-csjdg" Apr 17 14:37:00.576111 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.576085 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jsnp\" (UniqueName: \"kubernetes.io/projected/ed45cab8-95d1-4284-9dc1-2d602984c4e1-kube-api-access-8jsnp\") pod \"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:00.576604 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:00.576587 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-54chl\" (UniqueName: \"kubernetes.io/projected/12e54482-f4d9-41a0-ba0c-533f43bca23f-kube-api-access-54chl\") pod \"kube-state-metrics-69db897b98-pvlsg\" (UID: \"12e54482-f4d9-41a0-ba0c-533f43bca23f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" Apr 17 14:37:01.069195 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.069159 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ed45cab8-95d1-4284-9dc1-2d602984c4e1-node-exporter-tls\") pod \"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:01.069350 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.069202 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6aedddde-b37e-4862-ac07-19ecebf2ca41-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-csjdg\" (UID: \"6aedddde-b37e-4862-ac07-19ecebf2ca41\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-csjdg" Apr 17 14:37:01.069350 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.069263 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/12e54482-f4d9-41a0-ba0c-533f43bca23f-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-pvlsg\" (UID: \"12e54482-f4d9-41a0-ba0c-533f43bca23f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" Apr 17 14:37:01.071598 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.071569 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ed45cab8-95d1-4284-9dc1-2d602984c4e1-node-exporter-tls\") pod \"node-exporter-jsqmw\" (UID: \"ed45cab8-95d1-4284-9dc1-2d602984c4e1\") " pod="openshift-monitoring/node-exporter-jsqmw" Apr 
17 14:37:01.071692 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.071629 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/12e54482-f4d9-41a0-ba0c-533f43bca23f-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-pvlsg\" (UID: \"12e54482-f4d9-41a0-ba0c-533f43bca23f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" Apr 17 14:37:01.072311 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.072290 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6aedddde-b37e-4862-ac07-19ecebf2ca41-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-csjdg\" (UID: \"6aedddde-b37e-4862-ac07-19ecebf2ca41\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-csjdg" Apr 17 14:37:01.296801 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.296771 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-csjdg" Apr 17 14:37:01.322282 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.322212 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" Apr 17 14:37:01.327932 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.327906 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-jsqmw" Apr 17 14:37:01.336929 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:37:01.336905 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded45cab8_95d1_4284_9dc1_2d602984c4e1.slice/crio-b1c2dae0acc72795ea4b1f98effb8a3304b6f468f7b402b709438a44ea5410f0 WatchSource:0}: Error finding container b1c2dae0acc72795ea4b1f98effb8a3304b6f468f7b402b709438a44ea5410f0: Status 404 returned error can't find the container with id b1c2dae0acc72795ea4b1f98effb8a3304b6f468f7b402b709438a44ea5410f0 Apr 17 14:37:01.446424 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.446389 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-csjdg"] Apr 17 14:37:01.449164 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:37:01.449137 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6aedddde_b37e_4862_ac07_19ecebf2ca41.slice/crio-3da2a387015a974e90a8a8ef26a6646ea32127b2a68824028b23b33b72032910 WatchSource:0}: Error finding container 3da2a387015a974e90a8a8ef26a6646ea32127b2a68824028b23b33b72032910: Status 404 returned error can't find the container with id 3da2a387015a974e90a8a8ef26a6646ea32127b2a68824028b23b33b72032910 Apr 17 14:37:01.461190 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.461143 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-pvlsg"] Apr 17 14:37:01.466655 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:37:01.466627 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12e54482_f4d9_41a0_ba0c_533f43bca23f.slice/crio-10be73dbe47b2e95e07fe80aefccbed54327ff3ba255bf388814f9186f36ffe9 WatchSource:0}: Error finding container 
10be73dbe47b2e95e07fe80aefccbed54327ff3ba255bf388814f9186f36ffe9: Status 404 returned error can't find the container with id 10be73dbe47b2e95e07fe80aefccbed54327ff3ba255bf388814f9186f36ffe9 Apr 17 14:37:01.498660 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.498632 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 14:37:01.506680 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.506660 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.509403 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.509386 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 14:37:01.509503 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.509383 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 14:37:01.509503 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.509388 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 14:37:01.510004 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.509741 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 14:37:01.510004 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.509777 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 14:37:01.510004 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.509791 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 14:37:01.510004 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.509812 2570 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 14:37:01.510004 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.509843 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-rh2rm\"" Apr 17 14:37:01.510361 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.510144 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 14:37:01.515043 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.515018 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 14:37:01.516188 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.516167 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 14:37:01.573352 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.573284 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6116e800-819e-48e7-8646-3890fe0067e1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.573352 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.573326 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6116e800-819e-48e7-8646-3890fe0067e1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.573520 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.573414 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-config-volume\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.573520 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.573442 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.573520 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.573469 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6116e800-819e-48e7-8646-3890fe0067e1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.573520 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.573491 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-web-config\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.573665 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.573576 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.573665 
ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.573613 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6116e800-819e-48e7-8646-3890fe0067e1-config-out\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.573665 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.573648 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbk7p\" (UniqueName: \"kubernetes.io/projected/6116e800-819e-48e7-8646-3890fe0067e1-kube-api-access-pbk7p\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.573769 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.573677 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.573769 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.573711 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.573769 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.573758 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6116e800-819e-48e7-8646-3890fe0067e1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.573865 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.573777 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.674384 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.674354 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6116e800-819e-48e7-8646-3890fe0067e1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.674552 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.674394 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6116e800-819e-48e7-8646-3890fe0067e1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.674704 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.674682 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-config-volume\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.674779 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.674719 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.674779 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.674760 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6116e800-819e-48e7-8646-3890fe0067e1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.674874 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.674792 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-web-config\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.674874 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.674847 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.675035 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.674873 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6116e800-819e-48e7-8646-3890fe0067e1-config-out\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.675035 ip-10-0-129-134 
kubenswrapper[2570]: I0417 14:37:01.674925 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbk7p\" (UniqueName: \"kubernetes.io/projected/6116e800-819e-48e7-8646-3890fe0067e1-kube-api-access-pbk7p\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.675035 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.674955 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.675035 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.674984 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.675035 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.675019 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6116e800-819e-48e7-8646-3890fe0067e1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.675284 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.675048 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-secret-alertmanager-main-tls\") 
pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.675284 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.675120 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6116e800-819e-48e7-8646-3890fe0067e1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.675284 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.675120 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6116e800-819e-48e7-8646-3890fe0067e1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.676742 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.676709 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6116e800-819e-48e7-8646-3890fe0067e1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.677713 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.677684 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.677894 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.677875 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/6116e800-819e-48e7-8646-3890fe0067e1-config-out\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.678335 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.678314 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-config-volume\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.678666 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.678648 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6116e800-819e-48e7-8646-3890fe0067e1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.679401 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.679381 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.679401 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.679394 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.679563 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.679512 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.679824 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.679803 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-web-config\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.680628 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.680611 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.682906 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.682871 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbk7p\" (UniqueName: \"kubernetes.io/projected/6116e800-819e-48e7-8646-3890fe0067e1-kube-api-access-pbk7p\") pod \"alertmanager-main-0\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:37:01.792329 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.792292 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-csjdg" event={"ID":"6aedddde-b37e-4862-ac07-19ecebf2ca41","Type":"ContainerStarted","Data":"75dd5f7e785720f9ad185b82c33e6cfa43c8830dc95e4f2618d1f63427bd544d"} Apr 17 14:37:01.792329 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.792328 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-csjdg" event={"ID":"6aedddde-b37e-4862-ac07-19ecebf2ca41","Type":"ContainerStarted","Data":"f4d5cac1f15bf6cc685441e556c871c81c50455b325d11679266e2366e386c05"}
Apr 17 14:37:01.792535 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.792338 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-csjdg" event={"ID":"6aedddde-b37e-4862-ac07-19ecebf2ca41","Type":"ContainerStarted","Data":"3da2a387015a974e90a8a8ef26a6646ea32127b2a68824028b23b33b72032910"}
Apr 17 14:37:01.793233 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.793211 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jsqmw" event={"ID":"ed45cab8-95d1-4284-9dc1-2d602984c4e1","Type":"ContainerStarted","Data":"b1c2dae0acc72795ea4b1f98effb8a3304b6f468f7b402b709438a44ea5410f0"}
Apr 17 14:37:01.794136 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.794114 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" event={"ID":"12e54482-f4d9-41a0-ba0c-533f43bca23f","Type":"ContainerStarted","Data":"10be73dbe47b2e95e07fe80aefccbed54327ff3ba255bf388814f9186f36ffe9"}
Apr 17 14:37:01.816389 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.816365 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:37:01.957509 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:01.957487 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 14:37:01.959598 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:37:01.959571 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6116e800_819e_48e7_8646_3890fe0067e1.slice/crio-c452dad8dd4b4a9f19b4e98d00d8c8e2efeea060ba703f3d6d7d0cc4ef9f4309 WatchSource:0}: Error finding container c452dad8dd4b4a9f19b4e98d00d8c8e2efeea060ba703f3d6d7d0cc4ef9f4309: Status 404 returned error can't find the container with id c452dad8dd4b4a9f19b4e98d00d8c8e2efeea060ba703f3d6d7d0cc4ef9f4309
Apr 17 14:37:02.798450 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:02.798403 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6116e800-819e-48e7-8646-3890fe0067e1","Type":"ContainerStarted","Data":"c452dad8dd4b4a9f19b4e98d00d8c8e2efeea060ba703f3d6d7d0cc4ef9f4309"}
Apr 17 14:37:02.800034 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:02.800005 2570 generic.go:358] "Generic (PLEG): container finished" podID="ed45cab8-95d1-4284-9dc1-2d602984c4e1" containerID="4aa8dbfa787dcc41c9bd977969eece136ef97f63ebd885855554ad83d187cdee" exitCode=0
Apr 17 14:37:02.800173 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:02.800064 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jsqmw" event={"ID":"ed45cab8-95d1-4284-9dc1-2d602984c4e1","Type":"ContainerDied","Data":"4aa8dbfa787dcc41c9bd977969eece136ef97f63ebd885855554ad83d187cdee"}
Apr 17 14:37:03.804679 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:03.804583 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" event={"ID":"12e54482-f4d9-41a0-ba0c-533f43bca23f","Type":"ContainerStarted","Data":"10eee9e2518d5512491d8de4449736414834f3fe7caa0db952a48e6998335995"}
Apr 17 14:37:03.804679 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:03.804621 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" event={"ID":"12e54482-f4d9-41a0-ba0c-533f43bca23f","Type":"ContainerStarted","Data":"df0a2c8c49c797443212d6bf8bed290ef10899717c7c28c61b5f96d9482cb635"}
Apr 17 14:37:03.804679 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:03.804635 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" event={"ID":"12e54482-f4d9-41a0-ba0c-533f43bca23f","Type":"ContainerStarted","Data":"35f517632ec1b512996a5e5e4e86e768207dae88fb2dfa95001d5c0289741e91"}
Apr 17 14:37:03.806316 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:03.806289 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-csjdg" event={"ID":"6aedddde-b37e-4862-ac07-19ecebf2ca41","Type":"ContainerStarted","Data":"24a803f863dd411df505b5371d6bf0d0a7e5eca64aef33201464d0a274c6c228"}
Apr 17 14:37:03.807530 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:03.807500 2570 generic.go:358] "Generic (PLEG): container finished" podID="6116e800-819e-48e7-8646-3890fe0067e1" containerID="7041eb1c0446d5e6f34d7288b5acf4934a37f1a34d96a55bd7a97f0297de64bf" exitCode=0
Apr 17 14:37:03.807637 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:03.807576 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6116e800-819e-48e7-8646-3890fe0067e1","Type":"ContainerDied","Data":"7041eb1c0446d5e6f34d7288b5acf4934a37f1a34d96a55bd7a97f0297de64bf"}
Apr 17 14:37:03.809426 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:03.809404 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jsqmw" event={"ID":"ed45cab8-95d1-4284-9dc1-2d602984c4e1","Type":"ContainerStarted","Data":"36032499f0370d624825249faae66f290585422ddcb6e8414f4129f4528362ab"}
Apr 17 14:37:03.809494 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:03.809435 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jsqmw" event={"ID":"ed45cab8-95d1-4284-9dc1-2d602984c4e1","Type":"ContainerStarted","Data":"ec2809e46d05d36da5890e426a6ff7e0f5feccd3283d4d89f6dc0cf10ee66b09"}
Apr 17 14:37:03.822786 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:03.822730 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-pvlsg" podStartSLOduration=1.994774066 podStartE2EDuration="3.822715425s" podCreationTimestamp="2026-04-17 14:37:00 +0000 UTC" firstStartedPulling="2026-04-17 14:37:01.468657972 +0000 UTC m=+187.825385575" lastFinishedPulling="2026-04-17 14:37:03.296599312 +0000 UTC m=+189.653326934" observedRunningTime="2026-04-17 14:37:03.821077323 +0000 UTC m=+190.177804950" watchObservedRunningTime="2026-04-17 14:37:03.822715425 +0000 UTC m=+190.179443050"
Apr 17 14:37:03.859488 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:03.859436 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-csjdg" podStartSLOduration=2.170878612 podStartE2EDuration="3.859422003s" podCreationTimestamp="2026-04-17 14:37:00 +0000 UTC" firstStartedPulling="2026-04-17 14:37:01.6074591 +0000 UTC m=+187.964186703" lastFinishedPulling="2026-04-17 14:37:03.296002476 +0000 UTC m=+189.652730094" observedRunningTime="2026-04-17 14:37:03.858660966 +0000 UTC m=+190.215388595" watchObservedRunningTime="2026-04-17 14:37:03.859422003 +0000 UTC m=+190.216149628"
Apr 17 14:37:03.875113 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:03.875065 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-jsqmw" podStartSLOduration=2.8918331840000002 podStartE2EDuration="3.875050832s" podCreationTimestamp="2026-04-17 14:37:00 +0000 UTC" firstStartedPulling="2026-04-17 14:37:01.339081162 +0000 UTC m=+187.695808765" lastFinishedPulling="2026-04-17 14:37:02.32229879 +0000 UTC m=+188.679026413" observedRunningTime="2026-04-17 14:37:03.873808144 +0000 UTC m=+190.230535768" watchObservedRunningTime="2026-04-17 14:37:03.875050832 +0000 UTC m=+190.231778457"
Apr 17 14:37:04.323962 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:04.323934 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"]
Apr 17 14:37:05.818841 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:05.818808 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6116e800-819e-48e7-8646-3890fe0067e1","Type":"ContainerStarted","Data":"d84c640984e239c20a6e30281869fb8e073592cb0ee3aa59098e5b18be881fb3"}
Apr 17 14:37:05.818841 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:05.818847 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6116e800-819e-48e7-8646-3890fe0067e1","Type":"ContainerStarted","Data":"e03f69216fbb885d8a19bf3d7dfdff25169b105efedfef0944dbbd422a0ee569"}
Apr 17 14:37:05.819361 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:05.818857 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6116e800-819e-48e7-8646-3890fe0067e1","Type":"ContainerStarted","Data":"63564e72a45d0888abfafb1c15972b0293eb9f8ce0c693cdd1f7628a600316c7"}
Apr 17 14:37:05.819361 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:05.818867 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6116e800-819e-48e7-8646-3890fe0067e1","Type":"ContainerStarted","Data":"8a12eeb79b9ed8c8795620178cf84d798a55c76172f9f84e5d3b9a486be7a313"}
Apr 17 14:37:05.819361 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:05.818875 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6116e800-819e-48e7-8646-3890fe0067e1","Type":"ContainerStarted","Data":"1da48737a631031c9b273eba7923076d9f814119014b2b83456e9e2d0d4b9621"}
Apr 17 14:37:11.020989 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:11.020946 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2" podUID="08f10719-7fb4-455b-88d5-84e0bf2877d1" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 17 14:37:15.733747 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:37:15.733696 2570 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5522f37104f3fac57567fa2e9ec65601f60b8cea3603b12dcda26db8c481f404: reading manifest sha256:64d3bd02eb22d4058f1dc9f4fb060f50418c0c39dc0d9e3ddd8662cda5f7e1b0 in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5522f37104f3fac57567fa2e9ec65601f60b8cea3603b12dcda26db8c481f404"
Apr 17 14:37:15.734116 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:37:15.733882 2570 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:prom-label-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5522f37104f3fac57567fa2e9ec65601f60b8cea3603b12dcda26db8c481f404,Command:[],Args:[--insecure-listen-address=127.0.0.1:9096 --upstream=http://127.0.0.1:9093 --label=namespace --error-on-replace],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{1 -3} {} 1m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pbk7p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[KILL MKNOD SETGID SETUID],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod alertmanager-main-0_openshift-monitoring(6116e800-819e-48e7-8646-3890fe0067e1): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5522f37104f3fac57567fa2e9ec65601f60b8cea3603b12dcda26db8c481f404: reading manifest sha256:64d3bd02eb22d4058f1dc9f4fb060f50418c0c39dc0d9e3ddd8662cda5f7e1b0 in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" logger="UnhandledError"
Apr 17 14:37:15.735057 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:37:15.735027 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prom-label-proxy\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5522f37104f3fac57567fa2e9ec65601f60b8cea3603b12dcda26db8c481f404: reading manifest sha256:64d3bd02eb22d4058f1dc9f4fb060f50418c0c39dc0d9e3ddd8662cda5f7e1b0 in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-monitoring/alertmanager-main-0" podUID="6116e800-819e-48e7-8646-3890fe0067e1"
Apr 17 14:37:15.849964 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:37:15.849928 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prom-label-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5522f37104f3fac57567fa2e9ec65601f60b8cea3603b12dcda26db8c481f404\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5522f37104f3fac57567fa2e9ec65601f60b8cea3603b12dcda26db8c481f404: reading manifest sha256:64d3bd02eb22d4058f1dc9f4fb060f50418c0c39dc0d9e3ddd8662cda5f7e1b0 in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-monitoring/alertmanager-main-0" podUID="6116e800-819e-48e7-8646-3890fe0067e1"
Apr 17 14:37:21.020932 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:21.020887 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2" podUID="08f10719-7fb4-455b-88d5-84e0bf2877d1" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 17 14:37:29.346057 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.345986 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9" podUID="74d427e0-c76f-43e0-8769-00df0351072e" containerName="registry" containerID="cri-o://10f010d80a3213479d2718e071d42d792cb9855bfbb0cc5d72a23c46dd7eba97" gracePeriod=30
Apr 17 14:37:29.573043 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.573017 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:37:29.715806 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.715714 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls\") pod \"74d427e0-c76f-43e0-8769-00df0351072e\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") "
Apr 17 14:37:29.715806 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.715760 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/74d427e0-c76f-43e0-8769-00df0351072e-installation-pull-secrets\") pod \"74d427e0-c76f-43e0-8769-00df0351072e\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") "
Apr 17 14:37:29.715806 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.715785 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74d427e0-c76f-43e0-8769-00df0351072e-trusted-ca\") pod \"74d427e0-c76f-43e0-8769-00df0351072e\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") "
Apr 17 14:37:29.716093 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.715826 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jqbs\" (UniqueName: \"kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-kube-api-access-6jqbs\") pod \"74d427e0-c76f-43e0-8769-00df0351072e\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") "
Apr 17 14:37:29.716093 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.715858 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/74d427e0-c76f-43e0-8769-00df0351072e-image-registry-private-configuration\") pod \"74d427e0-c76f-43e0-8769-00df0351072e\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") "
Apr 17 14:37:29.716093 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.715887 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/74d427e0-c76f-43e0-8769-00df0351072e-registry-certificates\") pod \"74d427e0-c76f-43e0-8769-00df0351072e\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") "
Apr 17 14:37:29.716265 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.716111 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-bound-sa-token\") pod \"74d427e0-c76f-43e0-8769-00df0351072e\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") "
Apr 17 14:37:29.716265 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.716167 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/74d427e0-c76f-43e0-8769-00df0351072e-ca-trust-extracted\") pod \"74d427e0-c76f-43e0-8769-00df0351072e\" (UID: \"74d427e0-c76f-43e0-8769-00df0351072e\") "
Apr 17 14:37:29.716391 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.716282 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74d427e0-c76f-43e0-8769-00df0351072e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "74d427e0-c76f-43e0-8769-00df0351072e" (UID: "74d427e0-c76f-43e0-8769-00df0351072e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:37:29.716391 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.716360 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74d427e0-c76f-43e0-8769-00df0351072e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "74d427e0-c76f-43e0-8769-00df0351072e" (UID: "74d427e0-c76f-43e0-8769-00df0351072e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:37:29.716575 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.716548 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74d427e0-c76f-43e0-8769-00df0351072e-trusted-ca\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\""
Apr 17 14:37:29.716575 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.716571 2570 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/74d427e0-c76f-43e0-8769-00df0351072e-registry-certificates\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\""
Apr 17 14:37:29.718446 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.718415 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-kube-api-access-6jqbs" (OuterVolumeSpecName: "kube-api-access-6jqbs") pod "74d427e0-c76f-43e0-8769-00df0351072e" (UID: "74d427e0-c76f-43e0-8769-00df0351072e"). InnerVolumeSpecName "kube-api-access-6jqbs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:37:29.718564 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.718509 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d427e0-c76f-43e0-8769-00df0351072e-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "74d427e0-c76f-43e0-8769-00df0351072e" (UID: "74d427e0-c76f-43e0-8769-00df0351072e"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:37:29.718636 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.718609 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "74d427e0-c76f-43e0-8769-00df0351072e" (UID: "74d427e0-c76f-43e0-8769-00df0351072e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:37:29.718636 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.718618 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "74d427e0-c76f-43e0-8769-00df0351072e" (UID: "74d427e0-c76f-43e0-8769-00df0351072e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:37:29.718713 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.718631 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d427e0-c76f-43e0-8769-00df0351072e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "74d427e0-c76f-43e0-8769-00df0351072e" (UID: "74d427e0-c76f-43e0-8769-00df0351072e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:37:29.724735 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.724712 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74d427e0-c76f-43e0-8769-00df0351072e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "74d427e0-c76f-43e0-8769-00df0351072e" (UID: "74d427e0-c76f-43e0-8769-00df0351072e"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 14:37:29.817102 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.817071 2570 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-bound-sa-token\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\""
Apr 17 14:37:29.817102 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.817100 2570 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/74d427e0-c76f-43e0-8769-00df0351072e-ca-trust-extracted\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\""
Apr 17 14:37:29.817102 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.817110 2570 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-registry-tls\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\""
Apr 17 14:37:29.817352 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.817119 2570 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/74d427e0-c76f-43e0-8769-00df0351072e-installation-pull-secrets\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\""
Apr 17 14:37:29.817352 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.817129 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6jqbs\" (UniqueName: \"kubernetes.io/projected/74d427e0-c76f-43e0-8769-00df0351072e-kube-api-access-6jqbs\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\""
Apr 17 14:37:29.817352 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.817140 2570 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/74d427e0-c76f-43e0-8769-00df0351072e-image-registry-private-configuration\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\""
Apr 17 14:37:29.885948 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.885918 2570 generic.go:358] "Generic (PLEG): container finished" podID="74d427e0-c76f-43e0-8769-00df0351072e" containerID="10f010d80a3213479d2718e071d42d792cb9855bfbb0cc5d72a23c46dd7eba97" exitCode=0
Apr 17 14:37:29.886090 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.885976 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9" event={"ID":"74d427e0-c76f-43e0-8769-00df0351072e","Type":"ContainerDied","Data":"10f010d80a3213479d2718e071d42d792cb9855bfbb0cc5d72a23c46dd7eba97"}
Apr 17 14:37:29.886090 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.885979 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"
Apr 17 14:37:29.886090 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.886002 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7c57cd6dc9-2bkl9" event={"ID":"74d427e0-c76f-43e0-8769-00df0351072e","Type":"ContainerDied","Data":"3118461d3ce51bb232521f1868a73dee331ed69879a38959f671035d5792ce5c"}
Apr 17 14:37:29.886090 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.886017 2570 scope.go:117] "RemoveContainer" containerID="10f010d80a3213479d2718e071d42d792cb9855bfbb0cc5d72a23c46dd7eba97"
Apr 17 14:37:29.894302 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.894277 2570 scope.go:117] "RemoveContainer" containerID="10f010d80a3213479d2718e071d42d792cb9855bfbb0cc5d72a23c46dd7eba97"
Apr 17 14:37:29.894544 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:37:29.894528 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10f010d80a3213479d2718e071d42d792cb9855bfbb0cc5d72a23c46dd7eba97\": container with ID starting with 10f010d80a3213479d2718e071d42d792cb9855bfbb0cc5d72a23c46dd7eba97 not found: ID does not exist" containerID="10f010d80a3213479d2718e071d42d792cb9855bfbb0cc5d72a23c46dd7eba97"
Apr 17 14:37:29.894601 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.894553 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10f010d80a3213479d2718e071d42d792cb9855bfbb0cc5d72a23c46dd7eba97"} err="failed to get container status \"10f010d80a3213479d2718e071d42d792cb9855bfbb0cc5d72a23c46dd7eba97\": rpc error: code = NotFound desc = could not find container \"10f010d80a3213479d2718e071d42d792cb9855bfbb0cc5d72a23c46dd7eba97\": container with ID starting with 10f010d80a3213479d2718e071d42d792cb9855bfbb0cc5d72a23c46dd7eba97 not found: ID does not exist"
Apr 17 14:37:29.906555 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.906527 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"]
Apr 17 14:37:29.910569 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:29.910541 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7c57cd6dc9-2bkl9"]
Apr 17 14:37:30.238678 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:30.238646 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74d427e0-c76f-43e0-8769-00df0351072e" path="/var/lib/kubelet/pods/74d427e0-c76f-43e0-8769-00df0351072e/volumes"
Apr 17 14:37:31.020480 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.020442 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2" podUID="08f10719-7fb4-455b-88d5-84e0bf2877d1" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 17 14:37:31.020931 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.020516 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2"
Apr 17 14:37:31.021097 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.021072 2570 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"54d5b337152998c1490bbb95dad315e1570b0ed7107c437f72790e25891b6be3"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 17 14:37:31.021173 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.021125 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2" podUID="08f10719-7fb4-455b-88d5-84e0bf2877d1" containerName="service-proxy" containerID="cri-o://54d5b337152998c1490bbb95dad315e1570b0ed7107c437f72790e25891b6be3" gracePeriod=30
Apr 17 14:37:31.342162 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.342133 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6cb955f7f9-mzd8p"]
Apr 17 14:37:31.342429 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.342417 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="74d427e0-c76f-43e0-8769-00df0351072e" containerName="registry"
Apr 17 14:37:31.342481 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.342431 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d427e0-c76f-43e0-8769-00df0351072e" containerName="registry"
Apr 17 14:37:31.342481 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.342479 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="74d427e0-c76f-43e0-8769-00df0351072e" containerName="registry"
Apr 17 14:37:31.345425 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.345404 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cb955f7f9-mzd8p"
Apr 17 14:37:31.347863 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.347833 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 17 14:37:31.347986 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.347906 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 17 14:37:31.347986 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.347947 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 17 14:37:31.348281 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.348258 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 17 14:37:31.348281 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.348272 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 17 14:37:31.348501 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.348294 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 17 14:37:31.348501 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.348427 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 17 14:37:31.348652 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.348635 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-lw4t2\""
Apr 17 14:37:31.352470 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.352448 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 17 14:37:31.355426 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.355404 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cb955f7f9-mzd8p"]
Apr 17 14:37:31.429634 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.429599 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/302f6ead-8fbd-4c95-8899-66acff56ed81-console-config\") pod \"console-6cb955f7f9-mzd8p\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " pod="openshift-console/console-6cb955f7f9-mzd8p"
Apr 17 14:37:31.429833 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.429646 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h29wg\" (UniqueName: \"kubernetes.io/projected/302f6ead-8fbd-4c95-8899-66acff56ed81-kube-api-access-h29wg\") pod \"console-6cb955f7f9-mzd8p\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " pod="openshift-console/console-6cb955f7f9-mzd8p"
Apr 17 14:37:31.429833 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.429675 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/302f6ead-8fbd-4c95-8899-66acff56ed81-trusted-ca-bundle\") pod \"console-6cb955f7f9-mzd8p\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " pod="openshift-console/console-6cb955f7f9-mzd8p"
Apr 17 14:37:31.429833 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.429702 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/302f6ead-8fbd-4c95-8899-66acff56ed81-console-serving-cert\") pod \"console-6cb955f7f9-mzd8p\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " pod="openshift-console/console-6cb955f7f9-mzd8p"
Apr 17 14:37:31.429833 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.429745 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/302f6ead-8fbd-4c95-8899-66acff56ed81-service-ca\") pod \"console-6cb955f7f9-mzd8p\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " pod="openshift-console/console-6cb955f7f9-mzd8p"
Apr 17 14:37:31.429833 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.429771 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/302f6ead-8fbd-4c95-8899-66acff56ed81-console-oauth-config\") pod \"console-6cb955f7f9-mzd8p\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " pod="openshift-console/console-6cb955f7f9-mzd8p"
Apr 17 14:37:31.429833 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.429794 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/302f6ead-8fbd-4c95-8899-66acff56ed81-oauth-serving-cert\") pod \"console-6cb955f7f9-mzd8p\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " pod="openshift-console/console-6cb955f7f9-mzd8p"
Apr 17 14:37:31.531208 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.531170 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/302f6ead-8fbd-4c95-8899-66acff56ed81-console-config\") pod \"console-6cb955f7f9-mzd8p\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " pod="openshift-console/console-6cb955f7f9-mzd8p"
Apr 17 14:37:31.531208 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.531209 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h29wg\" (UniqueName: \"kubernetes.io/projected/302f6ead-8fbd-4c95-8899-66acff56ed81-kube-api-access-h29wg\") pod \"console-6cb955f7f9-mzd8p\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " pod="openshift-console/console-6cb955f7f9-mzd8p"
Apr 17 14:37:31.531504 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.531229 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/302f6ead-8fbd-4c95-8899-66acff56ed81-trusted-ca-bundle\") pod \"console-6cb955f7f9-mzd8p\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " pod="openshift-console/console-6cb955f7f9-mzd8p"
Apr 17 14:37:31.531504 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.531270 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/302f6ead-8fbd-4c95-8899-66acff56ed81-console-serving-cert\") pod \"console-6cb955f7f9-mzd8p\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " pod="openshift-console/console-6cb955f7f9-mzd8p"
Apr 17 14:37:31.531504 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.531480 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/302f6ead-8fbd-4c95-8899-66acff56ed81-service-ca\") pod \"console-6cb955f7f9-mzd8p\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " pod="openshift-console/console-6cb955f7f9-mzd8p"
Apr 17 14:37:31.531659 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.531523 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/302f6ead-8fbd-4c95-8899-66acff56ed81-console-oauth-config\") pod \"console-6cb955f7f9-mzd8p\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " pod="openshift-console/console-6cb955f7f9-mzd8p"
Apr 17 14:37:31.531711 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.531660 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/302f6ead-8fbd-4c95-8899-66acff56ed81-oauth-serving-cert\") pod \"console-6cb955f7f9-mzd8p\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " pod="openshift-console/console-6cb955f7f9-mzd8p"
Apr 17 14:37:31.531961 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.531936 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/302f6ead-8fbd-4c95-8899-66acff56ed81-console-config\") pod \"console-6cb955f7f9-mzd8p\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " pod="openshift-console/console-6cb955f7f9-mzd8p"
Apr 17 14:37:31.532164 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.532142 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/302f6ead-8fbd-4c95-8899-66acff56ed81-trusted-ca-bundle\") pod \"console-6cb955f7f9-mzd8p\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " pod="openshift-console/console-6cb955f7f9-mzd8p"
Apr 17 14:37:31.532233 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.532164 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/302f6ead-8fbd-4c95-8899-66acff56ed81-service-ca\") pod \"console-6cb955f7f9-mzd8p\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " pod="openshift-console/console-6cb955f7f9-mzd8p"
Apr 17 14:37:31.532317 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.532300 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/302f6ead-8fbd-4c95-8899-66acff56ed81-oauth-serving-cert\") pod \"console-6cb955f7f9-mzd8p\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " pod="openshift-console/console-6cb955f7f9-mzd8p"
Apr 17 14:37:31.533827 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.533800 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName:
\"kubernetes.io/secret/302f6ead-8fbd-4c95-8899-66acff56ed81-console-oauth-config\") pod \"console-6cb955f7f9-mzd8p\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " pod="openshift-console/console-6cb955f7f9-mzd8p" Apr 17 14:37:31.533957 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.533913 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/302f6ead-8fbd-4c95-8899-66acff56ed81-console-serving-cert\") pod \"console-6cb955f7f9-mzd8p\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " pod="openshift-console/console-6cb955f7f9-mzd8p" Apr 17 14:37:31.538655 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.538630 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h29wg\" (UniqueName: \"kubernetes.io/projected/302f6ead-8fbd-4c95-8899-66acff56ed81-kube-api-access-h29wg\") pod \"console-6cb955f7f9-mzd8p\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " pod="openshift-console/console-6cb955f7f9-mzd8p" Apr 17 14:37:31.655449 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.655363 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6cb955f7f9-mzd8p" Apr 17 14:37:31.809200 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.809163 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cb955f7f9-mzd8p"] Apr 17 14:37:31.811958 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:37:31.811932 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod302f6ead_8fbd_4c95_8899_66acff56ed81.slice/crio-bfc71ecba35975233830b46134d9c07b5397591e57f9b8d14c19635973ae1541 WatchSource:0}: Error finding container bfc71ecba35975233830b46134d9c07b5397591e57f9b8d14c19635973ae1541: Status 404 returned error can't find the container with id bfc71ecba35975233830b46134d9c07b5397591e57f9b8d14c19635973ae1541 Apr 17 14:37:31.894830 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.894786 2570 generic.go:358] "Generic (PLEG): container finished" podID="08f10719-7fb4-455b-88d5-84e0bf2877d1" containerID="54d5b337152998c1490bbb95dad315e1570b0ed7107c437f72790e25891b6be3" exitCode=2 Apr 17 14:37:31.894992 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.894862 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2" event={"ID":"08f10719-7fb4-455b-88d5-84e0bf2877d1","Type":"ContainerDied","Data":"54d5b337152998c1490bbb95dad315e1570b0ed7107c437f72790e25891b6be3"} Apr 17 14:37:31.894992 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.894902 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b955f578f-6kfg2" event={"ID":"08f10719-7fb4-455b-88d5-84e0bf2877d1","Type":"ContainerStarted","Data":"d696c26262af7a9f6761bdeb74a177774fa0f05cd52231c91c57bb11cb2ff459"} Apr 17 14:37:31.896081 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:31.896058 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-6cb955f7f9-mzd8p" event={"ID":"302f6ead-8fbd-4c95-8899-66acff56ed81","Type":"ContainerStarted","Data":"bfc71ecba35975233830b46134d9c07b5397591e57f9b8d14c19635973ae1541"} Apr 17 14:37:32.902776 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:32.902737 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6116e800-819e-48e7-8646-3890fe0067e1","Type":"ContainerStarted","Data":"09a13a719956ab7cc431a5d7f0e38745ddd29f0bb6673a5441d8514012ddcf5c"} Apr 17 14:37:32.929949 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:32.929420 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.877806821 podStartE2EDuration="31.929400988s" podCreationTimestamp="2026-04-17 14:37:01 +0000 UTC" firstStartedPulling="2026-04-17 14:37:01.961410549 +0000 UTC m=+188.318138153" lastFinishedPulling="2026-04-17 14:37:32.013004716 +0000 UTC m=+218.369732320" observedRunningTime="2026-04-17 14:37:32.927820814 +0000 UTC m=+219.284548440" watchObservedRunningTime="2026-04-17 14:37:32.929400988 +0000 UTC m=+219.286128614" Apr 17 14:37:34.828499 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:34.828417 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-l9shq_05c803a3-345a-4b4a-b40b-575211301efb/serve-healthcheck-canary/0.log" Apr 17 14:37:34.909783 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:34.909747 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cb955f7f9-mzd8p" event={"ID":"302f6ead-8fbd-4c95-8899-66acff56ed81","Type":"ContainerStarted","Data":"b39d40f53c4c229874b84d5e9fd60a6d6cbaa4811424279ba926e3bfecb1b65f"} Apr 17 14:37:34.926364 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:34.926317 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6cb955f7f9-mzd8p" 
podStartSLOduration=1.196423325 podStartE2EDuration="3.926303453s" podCreationTimestamp="2026-04-17 14:37:31 +0000 UTC" firstStartedPulling="2026-04-17 14:37:31.814159082 +0000 UTC m=+218.170886684" lastFinishedPulling="2026-04-17 14:37:34.544039205 +0000 UTC m=+220.900766812" observedRunningTime="2026-04-17 14:37:34.925476943 +0000 UTC m=+221.282204569" watchObservedRunningTime="2026-04-17 14:37:34.926303453 +0000 UTC m=+221.283031081" Apr 17 14:37:41.655981 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:41.655923 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6cb955f7f9-mzd8p" Apr 17 14:37:41.655981 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:41.655989 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6cb955f7f9-mzd8p" Apr 17 14:37:41.660762 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:41.660738 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6cb955f7f9-mzd8p" Apr 17 14:37:41.929952 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:37:41.929873 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6cb955f7f9-mzd8p" Apr 17 14:38:06.207759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:06.207725 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs\") pod \"network-metrics-daemon-j6cgs\" (UID: \"4807a6e2-14df-4ba4-8aee-7422a65508f2\") " pod="openshift-multus/network-metrics-daemon-j6cgs" Apr 17 14:38:06.210055 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:06.210021 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4807a6e2-14df-4ba4-8aee-7422a65508f2-metrics-certs\") pod \"network-metrics-daemon-j6cgs\" (UID: 
\"4807a6e2-14df-4ba4-8aee-7422a65508f2\") " pod="openshift-multus/network-metrics-daemon-j6cgs" Apr 17 14:38:06.437809 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:06.437775 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tg92k\"" Apr 17 14:38:06.445835 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:06.445816 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6cgs" Apr 17 14:38:06.559269 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:06.559229 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j6cgs"] Apr 17 14:38:06.561960 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:38:06.561933 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4807a6e2_14df_4ba4_8aee_7422a65508f2.slice/crio-5dd47189a35601e63aee0994d81edcc7ec042cf88cce3a8c348e50d4b8abb4fe WatchSource:0}: Error finding container 5dd47189a35601e63aee0994d81edcc7ec042cf88cce3a8c348e50d4b8abb4fe: Status 404 returned error can't find the container with id 5dd47189a35601e63aee0994d81edcc7ec042cf88cce3a8c348e50d4b8abb4fe Apr 17 14:38:06.996804 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:06.996763 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j6cgs" event={"ID":"4807a6e2-14df-4ba4-8aee-7422a65508f2","Type":"ContainerStarted","Data":"5dd47189a35601e63aee0994d81edcc7ec042cf88cce3a8c348e50d4b8abb4fe"} Apr 17 14:38:08.001151 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:08.001058 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j6cgs" event={"ID":"4807a6e2-14df-4ba4-8aee-7422a65508f2","Type":"ContainerStarted","Data":"45ca29db94753ad52044a0b223b855b1eaecaea7d3143b8c0a87ac2ffafde3fd"} Apr 17 14:38:08.001151 ip-10-0-129-134 
kubenswrapper[2570]: I0417 14:38:08.001093 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j6cgs" event={"ID":"4807a6e2-14df-4ba4-8aee-7422a65508f2","Type":"ContainerStarted","Data":"01ff913db17db47d96d13a4f57c424b84deeb6c733c98ddabdc9077c0b569a75"} Apr 17 14:38:08.016983 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:08.016942 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-j6cgs" podStartSLOduration=252.998538821 podStartE2EDuration="4m14.016928158s" podCreationTimestamp="2026-04-17 14:33:54 +0000 UTC" firstStartedPulling="2026-04-17 14:38:06.564167373 +0000 UTC m=+252.920894976" lastFinishedPulling="2026-04-17 14:38:07.582556709 +0000 UTC m=+253.939284313" observedRunningTime="2026-04-17 14:38:08.015546751 +0000 UTC m=+254.372274375" watchObservedRunningTime="2026-04-17 14:38:08.016928158 +0000 UTC m=+254.373655783" Apr 17 14:38:20.646694 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:20.646604 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 14:38:20.647187 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:20.647078 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="alertmanager" containerID="cri-o://1da48737a631031c9b273eba7923076d9f814119014b2b83456e9e2d0d4b9621" gracePeriod=120 Apr 17 14:38:20.647273 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:20.647169 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="kube-rbac-proxy-metric" containerID="cri-o://d84c640984e239c20a6e30281869fb8e073592cb0ee3aa59098e5b18be881fb3" gracePeriod=120 Apr 17 14:38:20.647273 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:20.647187 
2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="kube-rbac-proxy-web" containerID="cri-o://63564e72a45d0888abfafb1c15972b0293eb9f8ce0c693cdd1f7628a600316c7" gracePeriod=120 Apr 17 14:38:20.647273 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:20.647198 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="kube-rbac-proxy" containerID="cri-o://e03f69216fbb885d8a19bf3d7dfdff25169b105efedfef0944dbbd422a0ee569" gracePeriod=120 Apr 17 14:38:20.647273 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:20.647205 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="config-reloader" containerID="cri-o://8a12eeb79b9ed8c8795620178cf84d798a55c76172f9f84e5d3b9a486be7a313" gracePeriod=120 Apr 17 14:38:20.647489 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:20.647227 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="prom-label-proxy" containerID="cri-o://09a13a719956ab7cc431a5d7f0e38745ddd29f0bb6673a5441d8514012ddcf5c" gracePeriod=120 Apr 17 14:38:21.037945 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.037910 2570 generic.go:358] "Generic (PLEG): container finished" podID="6116e800-819e-48e7-8646-3890fe0067e1" containerID="09a13a719956ab7cc431a5d7f0e38745ddd29f0bb6673a5441d8514012ddcf5c" exitCode=0 Apr 17 14:38:21.037945 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.037934 2570 generic.go:358] "Generic (PLEG): container finished" podID="6116e800-819e-48e7-8646-3890fe0067e1" 
containerID="d84c640984e239c20a6e30281869fb8e073592cb0ee3aa59098e5b18be881fb3" exitCode=0 Apr 17 14:38:21.037945 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.037942 2570 generic.go:358] "Generic (PLEG): container finished" podID="6116e800-819e-48e7-8646-3890fe0067e1" containerID="e03f69216fbb885d8a19bf3d7dfdff25169b105efedfef0944dbbd422a0ee569" exitCode=0 Apr 17 14:38:21.037945 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.037948 2570 generic.go:358] "Generic (PLEG): container finished" podID="6116e800-819e-48e7-8646-3890fe0067e1" containerID="8a12eeb79b9ed8c8795620178cf84d798a55c76172f9f84e5d3b9a486be7a313" exitCode=0 Apr 17 14:38:21.037945 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.037953 2570 generic.go:358] "Generic (PLEG): container finished" podID="6116e800-819e-48e7-8646-3890fe0067e1" containerID="1da48737a631031c9b273eba7923076d9f814119014b2b83456e9e2d0d4b9621" exitCode=0 Apr 17 14:38:21.038222 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.037980 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6116e800-819e-48e7-8646-3890fe0067e1","Type":"ContainerDied","Data":"09a13a719956ab7cc431a5d7f0e38745ddd29f0bb6673a5441d8514012ddcf5c"} Apr 17 14:38:21.038222 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.038011 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6116e800-819e-48e7-8646-3890fe0067e1","Type":"ContainerDied","Data":"d84c640984e239c20a6e30281869fb8e073592cb0ee3aa59098e5b18be881fb3"} Apr 17 14:38:21.038222 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.038021 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6116e800-819e-48e7-8646-3890fe0067e1","Type":"ContainerDied","Data":"e03f69216fbb885d8a19bf3d7dfdff25169b105efedfef0944dbbd422a0ee569"} Apr 17 14:38:21.038222 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.038030 
2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6116e800-819e-48e7-8646-3890fe0067e1","Type":"ContainerDied","Data":"8a12eeb79b9ed8c8795620178cf84d798a55c76172f9f84e5d3b9a486be7a313"} Apr 17 14:38:21.038222 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.038040 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6116e800-819e-48e7-8646-3890fe0067e1","Type":"ContainerDied","Data":"1da48737a631031c9b273eba7923076d9f814119014b2b83456e9e2d0d4b9621"} Apr 17 14:38:21.885448 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.885421 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:38:21.934754 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.934727 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6116e800-819e-48e7-8646-3890fe0067e1-tls-assets\") pod \"6116e800-819e-48e7-8646-3890fe0067e1\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " Apr 17 14:38:21.934880 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.934759 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-secret-alertmanager-main-tls\") pod \"6116e800-819e-48e7-8646-3890fe0067e1\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " Apr 17 14:38:21.934880 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.934789 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-config-volume\") pod \"6116e800-819e-48e7-8646-3890fe0067e1\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " Apr 17 14:38:21.934880 ip-10-0-129-134 kubenswrapper[2570]: I0417 
14:38:21.934822 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6116e800-819e-48e7-8646-3890fe0067e1-alertmanager-trusted-ca-bundle\") pod \"6116e800-819e-48e7-8646-3890fe0067e1\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " Apr 17 14:38:21.934880 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.934852 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-cluster-tls-config\") pod \"6116e800-819e-48e7-8646-3890fe0067e1\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " Apr 17 14:38:21.935106 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.934889 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-secret-alertmanager-kube-rbac-proxy-web\") pod \"6116e800-819e-48e7-8646-3890fe0067e1\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " Apr 17 14:38:21.935106 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.934929 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-web-config\") pod \"6116e800-819e-48e7-8646-3890fe0067e1\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " Apr 17 14:38:21.935106 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.934965 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6116e800-819e-48e7-8646-3890fe0067e1-metrics-client-ca\") pod \"6116e800-819e-48e7-8646-3890fe0067e1\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " Apr 17 14:38:21.935106 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.934988 2570 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-secret-alertmanager-kube-rbac-proxy\") pod \"6116e800-819e-48e7-8646-3890fe0067e1\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " Apr 17 14:38:21.935106 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.935026 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6116e800-819e-48e7-8646-3890fe0067e1-config-out\") pod \"6116e800-819e-48e7-8646-3890fe0067e1\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " Apr 17 14:38:21.935106 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.935055 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbk7p\" (UniqueName: \"kubernetes.io/projected/6116e800-819e-48e7-8646-3890fe0067e1-kube-api-access-pbk7p\") pod \"6116e800-819e-48e7-8646-3890fe0067e1\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " Apr 17 14:38:21.935106 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.935083 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6116e800-819e-48e7-8646-3890fe0067e1-alertmanager-main-db\") pod \"6116e800-819e-48e7-8646-3890fe0067e1\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " Apr 17 14:38:21.935532 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.935119 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"6116e800-819e-48e7-8646-3890fe0067e1\" (UID: \"6116e800-819e-48e7-8646-3890fe0067e1\") " Apr 17 14:38:21.935904 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.935871 2570 operation_generator.go:781] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/6116e800-819e-48e7-8646-3890fe0067e1-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "6116e800-819e-48e7-8646-3890fe0067e1" (UID: "6116e800-819e-48e7-8646-3890fe0067e1"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:38:21.936007 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.935871 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6116e800-819e-48e7-8646-3890fe0067e1-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "6116e800-819e-48e7-8646-3890fe0067e1" (UID: "6116e800-819e-48e7-8646-3890fe0067e1"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:38:21.936294 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.936269 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6116e800-819e-48e7-8646-3890fe0067e1-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "6116e800-819e-48e7-8646-3890fe0067e1" (UID: "6116e800-819e-48e7-8646-3890fe0067e1"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:38:21.937715 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.937665 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6116e800-819e-48e7-8646-3890fe0067e1-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "6116e800-819e-48e7-8646-3890fe0067e1" (UID: "6116e800-819e-48e7-8646-3890fe0067e1"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:38:21.937715 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.937669 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-config-volume" (OuterVolumeSpecName: "config-volume") pod "6116e800-819e-48e7-8646-3890fe0067e1" (UID: "6116e800-819e-48e7-8646-3890fe0067e1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:38:21.937881 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.937769 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "6116e800-819e-48e7-8646-3890fe0067e1" (UID: "6116e800-819e-48e7-8646-3890fe0067e1"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:38:21.937991 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.937971 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6116e800-819e-48e7-8646-3890fe0067e1-config-out" (OuterVolumeSpecName: "config-out") pod "6116e800-819e-48e7-8646-3890fe0067e1" (UID: "6116e800-819e-48e7-8646-3890fe0067e1"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:38:21.939546 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.939520 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "6116e800-819e-48e7-8646-3890fe0067e1" (UID: "6116e800-819e-48e7-8646-3890fe0067e1"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:38:21.939905 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.939873 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "6116e800-819e-48e7-8646-3890fe0067e1" (UID: "6116e800-819e-48e7-8646-3890fe0067e1"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:38:21.940221 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.940193 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6116e800-819e-48e7-8646-3890fe0067e1-kube-api-access-pbk7p" (OuterVolumeSpecName: "kube-api-access-pbk7p") pod "6116e800-819e-48e7-8646-3890fe0067e1" (UID: "6116e800-819e-48e7-8646-3890fe0067e1"). InnerVolumeSpecName "kube-api-access-pbk7p". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:38:21.941208 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.941181 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "6116e800-819e-48e7-8646-3890fe0067e1" (UID: "6116e800-819e-48e7-8646-3890fe0067e1"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:38:21.943931 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.943905 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "6116e800-819e-48e7-8646-3890fe0067e1" (UID: "6116e800-819e-48e7-8646-3890fe0067e1"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:38:21.949860 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:21.949836 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-web-config" (OuterVolumeSpecName: "web-config") pod "6116e800-819e-48e7-8646-3890fe0067e1" (UID: "6116e800-819e-48e7-8646-3890fe0067e1"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:38:22.036470 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.036443 2570 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-cluster-tls-config\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\""
Apr 17 14:38:22.036470 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.036469 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\""
Apr 17 14:38:22.036611 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.036479 2570 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-web-config\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\""
Apr 17 14:38:22.036611 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.036490 2570 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6116e800-819e-48e7-8646-3890fe0067e1-metrics-client-ca\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\""
Apr 17 14:38:22.036611 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.036499 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\""
Apr 17 14:38:22.036611 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.036507 2570 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6116e800-819e-48e7-8646-3890fe0067e1-config-out\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\""
Apr 17 14:38:22.036611 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.036516 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pbk7p\" (UniqueName: \"kubernetes.io/projected/6116e800-819e-48e7-8646-3890fe0067e1-kube-api-access-pbk7p\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\""
Apr 17 14:38:22.036611 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.036525 2570 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6116e800-819e-48e7-8646-3890fe0067e1-alertmanager-main-db\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\""
Apr 17 14:38:22.036611 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.036533 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\""
Apr 17 14:38:22.036611 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.036542 2570 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6116e800-819e-48e7-8646-3890fe0067e1-tls-assets\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\""
Apr 17 14:38:22.036611 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.036551 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-secret-alertmanager-main-tls\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\""
Apr 17 14:38:22.036611 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.036559 2570 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6116e800-819e-48e7-8646-3890fe0067e1-config-volume\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\""
Apr 17 14:38:22.036611 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.036568 2570 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6116e800-819e-48e7-8646-3890fe0067e1-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\""
Apr 17 14:38:22.042984 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.042961 2570 generic.go:358] "Generic (PLEG): container finished" podID="6116e800-819e-48e7-8646-3890fe0067e1" containerID="63564e72a45d0888abfafb1c15972b0293eb9f8ce0c693cdd1f7628a600316c7" exitCode=0
Apr 17 14:38:22.043072 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.043005 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6116e800-819e-48e7-8646-3890fe0067e1","Type":"ContainerDied","Data":"63564e72a45d0888abfafb1c15972b0293eb9f8ce0c693cdd1f7628a600316c7"}
Apr 17 14:38:22.043072 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.043029 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6116e800-819e-48e7-8646-3890fe0067e1","Type":"ContainerDied","Data":"c452dad8dd4b4a9f19b4e98d00d8c8e2efeea060ba703f3d6d7d0cc4ef9f4309"}
Apr 17 14:38:22.043072 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.043044 2570 scope.go:117] "RemoveContainer" containerID="09a13a719956ab7cc431a5d7f0e38745ddd29f0bb6673a5441d8514012ddcf5c"
Apr 17 14:38:22.043182 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.043076 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.050930 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.050913 2570 scope.go:117] "RemoveContainer" containerID="d84c640984e239c20a6e30281869fb8e073592cb0ee3aa59098e5b18be881fb3"
Apr 17 14:38:22.057916 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.057897 2570 scope.go:117] "RemoveContainer" containerID="e03f69216fbb885d8a19bf3d7dfdff25169b105efedfef0944dbbd422a0ee569"
Apr 17 14:38:22.063764 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.063748 2570 scope.go:117] "RemoveContainer" containerID="63564e72a45d0888abfafb1c15972b0293eb9f8ce0c693cdd1f7628a600316c7"
Apr 17 14:38:22.068264 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.068230 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 14:38:22.070707 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.070686 2570 scope.go:117] "RemoveContainer" containerID="8a12eeb79b9ed8c8795620178cf84d798a55c76172f9f84e5d3b9a486be7a313"
Apr 17 14:38:22.072825 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.072805 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 14:38:22.076929 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.076913 2570 scope.go:117] "RemoveContainer" containerID="1da48737a631031c9b273eba7923076d9f814119014b2b83456e9e2d0d4b9621"
Apr 17 14:38:22.082826 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.082811 2570 scope.go:117] "RemoveContainer" containerID="7041eb1c0446d5e6f34d7288b5acf4934a37f1a34d96a55bd7a97f0297de64bf"
Apr 17 14:38:22.088826 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.088806 2570 scope.go:117] "RemoveContainer" containerID="09a13a719956ab7cc431a5d7f0e38745ddd29f0bb6673a5441d8514012ddcf5c"
Apr 17 14:38:22.089054 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:38:22.089033 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09a13a719956ab7cc431a5d7f0e38745ddd29f0bb6673a5441d8514012ddcf5c\": container with ID starting with 09a13a719956ab7cc431a5d7f0e38745ddd29f0bb6673a5441d8514012ddcf5c not found: ID does not exist" containerID="09a13a719956ab7cc431a5d7f0e38745ddd29f0bb6673a5441d8514012ddcf5c"
Apr 17 14:38:22.089125 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.089065 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09a13a719956ab7cc431a5d7f0e38745ddd29f0bb6673a5441d8514012ddcf5c"} err="failed to get container status \"09a13a719956ab7cc431a5d7f0e38745ddd29f0bb6673a5441d8514012ddcf5c\": rpc error: code = NotFound desc = could not find container \"09a13a719956ab7cc431a5d7f0e38745ddd29f0bb6673a5441d8514012ddcf5c\": container with ID starting with 09a13a719956ab7cc431a5d7f0e38745ddd29f0bb6673a5441d8514012ddcf5c not found: ID does not exist"
Apr 17 14:38:22.089125 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.089092 2570 scope.go:117] "RemoveContainer" containerID="d84c640984e239c20a6e30281869fb8e073592cb0ee3aa59098e5b18be881fb3"
Apr 17 14:38:22.089344 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:38:22.089327 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d84c640984e239c20a6e30281869fb8e073592cb0ee3aa59098e5b18be881fb3\": container with ID starting with d84c640984e239c20a6e30281869fb8e073592cb0ee3aa59098e5b18be881fb3 not found: ID does not exist" containerID="d84c640984e239c20a6e30281869fb8e073592cb0ee3aa59098e5b18be881fb3"
Apr 17 14:38:22.089387 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.089351 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d84c640984e239c20a6e30281869fb8e073592cb0ee3aa59098e5b18be881fb3"} err="failed to get container status \"d84c640984e239c20a6e30281869fb8e073592cb0ee3aa59098e5b18be881fb3\": rpc error: code = NotFound desc = could not find container \"d84c640984e239c20a6e30281869fb8e073592cb0ee3aa59098e5b18be881fb3\": container with ID starting with d84c640984e239c20a6e30281869fb8e073592cb0ee3aa59098e5b18be881fb3 not found: ID does not exist"
Apr 17 14:38:22.089387 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.089370 2570 scope.go:117] "RemoveContainer" containerID="e03f69216fbb885d8a19bf3d7dfdff25169b105efedfef0944dbbd422a0ee569"
Apr 17 14:38:22.089602 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:38:22.089584 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e03f69216fbb885d8a19bf3d7dfdff25169b105efedfef0944dbbd422a0ee569\": container with ID starting with e03f69216fbb885d8a19bf3d7dfdff25169b105efedfef0944dbbd422a0ee569 not found: ID does not exist" containerID="e03f69216fbb885d8a19bf3d7dfdff25169b105efedfef0944dbbd422a0ee569"
Apr 17 14:38:22.089642 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.089606 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e03f69216fbb885d8a19bf3d7dfdff25169b105efedfef0944dbbd422a0ee569"} err="failed to get container status \"e03f69216fbb885d8a19bf3d7dfdff25169b105efedfef0944dbbd422a0ee569\": rpc error: code = NotFound desc = could not find container \"e03f69216fbb885d8a19bf3d7dfdff25169b105efedfef0944dbbd422a0ee569\": container with ID starting with e03f69216fbb885d8a19bf3d7dfdff25169b105efedfef0944dbbd422a0ee569 not found: ID does not exist"
Apr 17 14:38:22.089642 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.089621 2570 scope.go:117] "RemoveContainer" containerID="63564e72a45d0888abfafb1c15972b0293eb9f8ce0c693cdd1f7628a600316c7"
Apr 17 14:38:22.089821 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:38:22.089806 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63564e72a45d0888abfafb1c15972b0293eb9f8ce0c693cdd1f7628a600316c7\": container with ID starting with 63564e72a45d0888abfafb1c15972b0293eb9f8ce0c693cdd1f7628a600316c7 not found: ID does not exist" containerID="63564e72a45d0888abfafb1c15972b0293eb9f8ce0c693cdd1f7628a600316c7"
Apr 17 14:38:22.089885 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.089829 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63564e72a45d0888abfafb1c15972b0293eb9f8ce0c693cdd1f7628a600316c7"} err="failed to get container status \"63564e72a45d0888abfafb1c15972b0293eb9f8ce0c693cdd1f7628a600316c7\": rpc error: code = NotFound desc = could not find container \"63564e72a45d0888abfafb1c15972b0293eb9f8ce0c693cdd1f7628a600316c7\": container with ID starting with 63564e72a45d0888abfafb1c15972b0293eb9f8ce0c693cdd1f7628a600316c7 not found: ID does not exist"
Apr 17 14:38:22.089885 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.089850 2570 scope.go:117] "RemoveContainer" containerID="8a12eeb79b9ed8c8795620178cf84d798a55c76172f9f84e5d3b9a486be7a313"
Apr 17 14:38:22.090078 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:38:22.090063 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a12eeb79b9ed8c8795620178cf84d798a55c76172f9f84e5d3b9a486be7a313\": container with ID starting with 8a12eeb79b9ed8c8795620178cf84d798a55c76172f9f84e5d3b9a486be7a313 not found: ID does not exist" containerID="8a12eeb79b9ed8c8795620178cf84d798a55c76172f9f84e5d3b9a486be7a313"
Apr 17 14:38:22.090111 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.090081 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a12eeb79b9ed8c8795620178cf84d798a55c76172f9f84e5d3b9a486be7a313"} err="failed to get container status \"8a12eeb79b9ed8c8795620178cf84d798a55c76172f9f84e5d3b9a486be7a313\": rpc error: code = NotFound desc = could not find container \"8a12eeb79b9ed8c8795620178cf84d798a55c76172f9f84e5d3b9a486be7a313\": container with ID starting with 8a12eeb79b9ed8c8795620178cf84d798a55c76172f9f84e5d3b9a486be7a313 not found: ID does not exist"
Apr 17 14:38:22.090111 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.090094 2570 scope.go:117] "RemoveContainer" containerID="1da48737a631031c9b273eba7923076d9f814119014b2b83456e9e2d0d4b9621"
Apr 17 14:38:22.090302 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:38:22.090286 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1da48737a631031c9b273eba7923076d9f814119014b2b83456e9e2d0d4b9621\": container with ID starting with 1da48737a631031c9b273eba7923076d9f814119014b2b83456e9e2d0d4b9621 not found: ID does not exist" containerID="1da48737a631031c9b273eba7923076d9f814119014b2b83456e9e2d0d4b9621"
Apr 17 14:38:22.090350 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.090307 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1da48737a631031c9b273eba7923076d9f814119014b2b83456e9e2d0d4b9621"} err="failed to get container status \"1da48737a631031c9b273eba7923076d9f814119014b2b83456e9e2d0d4b9621\": rpc error: code = NotFound desc = could not find container \"1da48737a631031c9b273eba7923076d9f814119014b2b83456e9e2d0d4b9621\": container with ID starting with 1da48737a631031c9b273eba7923076d9f814119014b2b83456e9e2d0d4b9621 not found: ID does not exist"
Apr 17 14:38:22.090350 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.090322 2570 scope.go:117] "RemoveContainer" containerID="7041eb1c0446d5e6f34d7288b5acf4934a37f1a34d96a55bd7a97f0297de64bf"
Apr 17 14:38:22.090521 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:38:22.090506 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7041eb1c0446d5e6f34d7288b5acf4934a37f1a34d96a55bd7a97f0297de64bf\": container with ID starting with 7041eb1c0446d5e6f34d7288b5acf4934a37f1a34d96a55bd7a97f0297de64bf not found: ID does not exist" containerID="7041eb1c0446d5e6f34d7288b5acf4934a37f1a34d96a55bd7a97f0297de64bf"
Apr 17 14:38:22.090580 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.090539 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7041eb1c0446d5e6f34d7288b5acf4934a37f1a34d96a55bd7a97f0297de64bf"} err="failed to get container status \"7041eb1c0446d5e6f34d7288b5acf4934a37f1a34d96a55bd7a97f0297de64bf\": rpc error: code = NotFound desc = could not find container \"7041eb1c0446d5e6f34d7288b5acf4934a37f1a34d96a55bd7a97f0297de64bf\": container with ID starting with 7041eb1c0446d5e6f34d7288b5acf4934a37f1a34d96a55bd7a97f0297de64bf not found: ID does not exist"
Apr 17 14:38:22.097635 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.097616 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 14:38:22.097857 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.097846 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="kube-rbac-proxy"
Apr 17 14:38:22.097903 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.097859 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="kube-rbac-proxy"
Apr 17 14:38:22.097903 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.097867 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="prom-label-proxy"
Apr 17 14:38:22.097903 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.097873 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="prom-label-proxy"
Apr 17 14:38:22.097903 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.097884 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="init-config-reloader"
Apr 17 14:38:22.097903 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.097892 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="init-config-reloader"
Apr 17 14:38:22.097903 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.097901 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="kube-rbac-proxy-metric"
Apr 17 14:38:22.098070 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.097906 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="kube-rbac-proxy-metric"
Apr 17 14:38:22.098070 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.097917 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="kube-rbac-proxy-web"
Apr 17 14:38:22.098070 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.097922 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="kube-rbac-proxy-web"
Apr 17 14:38:22.098070 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.097931 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="config-reloader"
Apr 17 14:38:22.098070 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.097936 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="config-reloader"
Apr 17 14:38:22.098070 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.097944 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="alertmanager"
Apr 17 14:38:22.098070 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.097949 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="alertmanager"
Apr 17 14:38:22.098070 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.097993 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="kube-rbac-proxy"
Apr 17 14:38:22.098070 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.098001 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="config-reloader"
Apr 17 14:38:22.098070 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.098007 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="kube-rbac-proxy-web"
Apr 17 14:38:22.098070 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.098013 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="prom-label-proxy"
Apr 17 14:38:22.098070 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.098019 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="alertmanager"
Apr 17 14:38:22.098070 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.098025 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="6116e800-819e-48e7-8646-3890fe0067e1" containerName="kube-rbac-proxy-metric"
Apr 17 14:38:22.102930 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.102916 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.105518 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.105499 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 17 14:38:22.105619 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.105525 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 17 14:38:22.105619 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.105561 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 17 14:38:22.105956 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.105935 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 17 14:38:22.106034 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.105940 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-rh2rm\""
Apr 17 14:38:22.106034 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.105983 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 17 14:38:22.106034 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.105957 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 17 14:38:22.106184 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.105944 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 17 14:38:22.106266 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.106253 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 17 14:38:22.111172 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.111125 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 17 14:38:22.112878 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.112858 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 14:38:22.237500 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.237472 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.237500 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.237511 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.237706 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.237530 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-config-out\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.237706 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.237556 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-config-volume\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.237706 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.237578 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.237706 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.237654 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.237706 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.237703 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.237886 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.237780 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.237886 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.237808 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.237886 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.237843 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-web-config\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.237886 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.237860 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.238070 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.237926 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.238070 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.237971 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg8tp\" (UniqueName: \"kubernetes.io/projected/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-kube-api-access-gg8tp\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.238629 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.238611 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6116e800-819e-48e7-8646-3890fe0067e1" path="/var/lib/kubelet/pods/6116e800-819e-48e7-8646-3890fe0067e1/volumes"
Apr 17 14:38:22.339212 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.339135 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.339212 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.339172 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.339418 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.339222 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.339418 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.339268 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.339418 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.339332 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-web-config\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.339418 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.339359 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.339418 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.339391 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.339677 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.339419 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gg8tp\" (UniqueName: \"kubernetes.io/projected/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-kube-api-access-gg8tp\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.339677 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.339464 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.339677 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.339502 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.339677 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.339527 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-config-out\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.339677 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.339554 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-config-volume\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.339677 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.339589 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.339980 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.339901 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.340068 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.340044 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.340149 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.340124 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.342450 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.342387 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-web-config\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.342450 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.342404 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.342450 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.342425 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-config-out\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.342700 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.342483 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.342700 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.342571 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.342700 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.342610 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-config-volume\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.342700 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.342654 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 14:38:22.342907 ip-10-0-129-134
kubenswrapper[2570]: I0417 14:38:22.342891 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:38:22.344228 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.344208 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:38:22.347175 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.347158 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg8tp\" (UniqueName: \"kubernetes.io/projected/f759a44c-4c49-40f7-9ca4-77917fc3f9d4-kube-api-access-gg8tp\") pod \"alertmanager-main-0\" (UID: \"f759a44c-4c49-40f7-9ca4-77917fc3f9d4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:38:22.412648 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.412603 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:38:22.532736 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:22.532712 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 14:38:22.534854 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:38:22.534822 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf759a44c_4c49_40f7_9ca4_77917fc3f9d4.slice/crio-c2c126d2a1635ba2b2891e1d3f849552b77bcfb6491d36ab4f11cf4ee88919ff WatchSource:0}: Error finding container c2c126d2a1635ba2b2891e1d3f849552b77bcfb6491d36ab4f11cf4ee88919ff: Status 404 returned error can't find the container with id c2c126d2a1635ba2b2891e1d3f849552b77bcfb6491d36ab4f11cf4ee88919ff Apr 17 14:38:23.047600 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:23.047566 2570 generic.go:358] "Generic (PLEG): container finished" podID="f759a44c-4c49-40f7-9ca4-77917fc3f9d4" containerID="05c60e18d6c68950884b814d13e7df1fd23a73f7dafdf8987d9ef41e46fa750a" exitCode=0 Apr 17 14:38:23.047940 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:23.047648 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f759a44c-4c49-40f7-9ca4-77917fc3f9d4","Type":"ContainerDied","Data":"05c60e18d6c68950884b814d13e7df1fd23a73f7dafdf8987d9ef41e46fa750a"} Apr 17 14:38:23.047940 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:23.047678 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f759a44c-4c49-40f7-9ca4-77917fc3f9d4","Type":"ContainerStarted","Data":"c2c126d2a1635ba2b2891e1d3f849552b77bcfb6491d36ab4f11cf4ee88919ff"} Apr 17 14:38:24.053951 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.053916 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"f759a44c-4c49-40f7-9ca4-77917fc3f9d4","Type":"ContainerStarted","Data":"7a1e78c93d8bc8d910fbf24a8cb5535f419de04f6abca51d38eb9b2e6fa93ef1"} Apr 17 14:38:24.053951 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.053951 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f759a44c-4c49-40f7-9ca4-77917fc3f9d4","Type":"ContainerStarted","Data":"c85997c5d61d3dca56f1d1bcf1a89ac902b7da6d755768ba11170d8a0bff72e9"} Apr 17 14:38:24.053951 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.053961 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f759a44c-4c49-40f7-9ca4-77917fc3f9d4","Type":"ContainerStarted","Data":"5908e4e29c4b970986227119a66c86d092789e607e8409bea63ba851f63ac5f7"} Apr 17 14:38:24.054413 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.053969 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f759a44c-4c49-40f7-9ca4-77917fc3f9d4","Type":"ContainerStarted","Data":"12293ede710c2ccef8e362848a3fdab99255dcb1cd78079c63fc02c72a128191"} Apr 17 14:38:24.054413 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.053977 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f759a44c-4c49-40f7-9ca4-77917fc3f9d4","Type":"ContainerStarted","Data":"74de8aefd036f0ce5105f0d41a0492b6b26a92410f6f2d683d8a0e9db7c80e54"} Apr 17 14:38:24.054413 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.053987 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f759a44c-4c49-40f7-9ca4-77917fc3f9d4","Type":"ContainerStarted","Data":"27f0cc984a26a1c04bc29c5bf74054ed47faf32816a3dff2961e064ca961b849"} Apr 17 14:38:24.080097 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.080049 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.080034704 podStartE2EDuration="2.080034704s" podCreationTimestamp="2026-04-17 14:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:38:24.078738478 +0000 UTC m=+270.435466107" watchObservedRunningTime="2026-04-17 14:38:24.080034704 +0000 UTC m=+270.436762329" Apr 17 14:38:24.673329 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.673291 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-768cb8d9f7-s7czf"] Apr 17 14:38:24.676538 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.676523 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" Apr 17 14:38:24.678945 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.678922 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-rsqbp\"" Apr 17 14:38:24.679175 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.679156 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 17 14:38:24.679260 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.679166 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 17 14:38:24.679260 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.679228 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 17 14:38:24.679423 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.679273 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 17 14:38:24.680265 ip-10-0-129-134 
kubenswrapper[2570]: I0417 14:38:24.680229 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 17 14:38:24.685829 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.685810 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 17 14:38:24.686323 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.686298 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-768cb8d9f7-s7czf"] Apr 17 14:38:24.758258 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.758199 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/536cadba-ee0d-4cfa-bd2e-2f09daa67ea7-secret-telemeter-client\") pod \"telemeter-client-768cb8d9f7-s7czf\" (UID: \"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7\") " pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" Apr 17 14:38:24.758258 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.758263 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/536cadba-ee0d-4cfa-bd2e-2f09daa67ea7-serving-certs-ca-bundle\") pod \"telemeter-client-768cb8d9f7-s7czf\" (UID: \"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7\") " pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" Apr 17 14:38:24.758471 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.758286 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/536cadba-ee0d-4cfa-bd2e-2f09daa67ea7-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-768cb8d9f7-s7czf\" (UID: \"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7\") " 
pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" Apr 17 14:38:24.758471 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.758312 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/536cadba-ee0d-4cfa-bd2e-2f09daa67ea7-telemeter-trusted-ca-bundle\") pod \"telemeter-client-768cb8d9f7-s7czf\" (UID: \"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7\") " pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" Apr 17 14:38:24.758471 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.758351 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/536cadba-ee0d-4cfa-bd2e-2f09daa67ea7-telemeter-client-tls\") pod \"telemeter-client-768cb8d9f7-s7czf\" (UID: \"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7\") " pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" Apr 17 14:38:24.758471 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.758373 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/536cadba-ee0d-4cfa-bd2e-2f09daa67ea7-federate-client-tls\") pod \"telemeter-client-768cb8d9f7-s7czf\" (UID: \"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7\") " pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" Apr 17 14:38:24.758471 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.758392 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4m8n\" (UniqueName: \"kubernetes.io/projected/536cadba-ee0d-4cfa-bd2e-2f09daa67ea7-kube-api-access-h4m8n\") pod \"telemeter-client-768cb8d9f7-s7czf\" (UID: \"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7\") " pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" Apr 17 14:38:24.758471 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.758416 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/536cadba-ee0d-4cfa-bd2e-2f09daa67ea7-metrics-client-ca\") pod \"telemeter-client-768cb8d9f7-s7czf\" (UID: \"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7\") " pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" Apr 17 14:38:24.859052 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.859009 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/536cadba-ee0d-4cfa-bd2e-2f09daa67ea7-secret-telemeter-client\") pod \"telemeter-client-768cb8d9f7-s7czf\" (UID: \"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7\") " pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" Apr 17 14:38:24.859052 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.859059 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/536cadba-ee0d-4cfa-bd2e-2f09daa67ea7-serving-certs-ca-bundle\") pod \"telemeter-client-768cb8d9f7-s7czf\" (UID: \"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7\") " pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" Apr 17 14:38:24.859383 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.859100 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/536cadba-ee0d-4cfa-bd2e-2f09daa67ea7-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-768cb8d9f7-s7czf\" (UID: \"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7\") " pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" Apr 17 14:38:24.859383 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.859133 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/536cadba-ee0d-4cfa-bd2e-2f09daa67ea7-telemeter-trusted-ca-bundle\") pod \"telemeter-client-768cb8d9f7-s7czf\" (UID: \"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7\") " pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" Apr 17 14:38:24.859383 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.859336 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/536cadba-ee0d-4cfa-bd2e-2f09daa67ea7-telemeter-client-tls\") pod \"telemeter-client-768cb8d9f7-s7czf\" (UID: \"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7\") " pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" Apr 17 14:38:24.859542 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.859384 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/536cadba-ee0d-4cfa-bd2e-2f09daa67ea7-federate-client-tls\") pod \"telemeter-client-768cb8d9f7-s7czf\" (UID: \"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7\") " pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" Apr 17 14:38:24.859542 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.859416 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4m8n\" (UniqueName: \"kubernetes.io/projected/536cadba-ee0d-4cfa-bd2e-2f09daa67ea7-kube-api-access-h4m8n\") pod \"telemeter-client-768cb8d9f7-s7czf\" (UID: \"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7\") " pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" Apr 17 14:38:24.859542 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.859456 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/536cadba-ee0d-4cfa-bd2e-2f09daa67ea7-metrics-client-ca\") pod \"telemeter-client-768cb8d9f7-s7czf\" (UID: \"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7\") " pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" 
Apr 17 14:38:24.859963 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.859934 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/536cadba-ee0d-4cfa-bd2e-2f09daa67ea7-serving-certs-ca-bundle\") pod \"telemeter-client-768cb8d9f7-s7czf\" (UID: \"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7\") " pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" Apr 17 14:38:24.860365 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.860340 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/536cadba-ee0d-4cfa-bd2e-2f09daa67ea7-metrics-client-ca\") pod \"telemeter-client-768cb8d9f7-s7czf\" (UID: \"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7\") " pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" Apr 17 14:38:24.860643 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.860620 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/536cadba-ee0d-4cfa-bd2e-2f09daa67ea7-telemeter-trusted-ca-bundle\") pod \"telemeter-client-768cb8d9f7-s7czf\" (UID: \"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7\") " pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" Apr 17 14:38:24.861860 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.861830 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/536cadba-ee0d-4cfa-bd2e-2f09daa67ea7-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-768cb8d9f7-s7czf\" (UID: \"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7\") " pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" Apr 17 14:38:24.861972 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.861830 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: 
\"kubernetes.io/secret/536cadba-ee0d-4cfa-bd2e-2f09daa67ea7-telemeter-client-tls\") pod \"telemeter-client-768cb8d9f7-s7czf\" (UID: \"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7\") " pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" Apr 17 14:38:24.861972 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.861960 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/536cadba-ee0d-4cfa-bd2e-2f09daa67ea7-secret-telemeter-client\") pod \"telemeter-client-768cb8d9f7-s7czf\" (UID: \"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7\") " pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" Apr 17 14:38:24.862063 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.861961 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/536cadba-ee0d-4cfa-bd2e-2f09daa67ea7-federate-client-tls\") pod \"telemeter-client-768cb8d9f7-s7czf\" (UID: \"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7\") " pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" Apr 17 14:38:24.868209 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.868184 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4m8n\" (UniqueName: \"kubernetes.io/projected/536cadba-ee0d-4cfa-bd2e-2f09daa67ea7-kube-api-access-h4m8n\") pod \"telemeter-client-768cb8d9f7-s7czf\" (UID: \"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7\") " pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" Apr 17 14:38:24.988059 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:24.987979 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" Apr 17 14:38:25.110214 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:25.110193 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-768cb8d9f7-s7czf"] Apr 17 14:38:25.112414 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:38:25.112388 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod536cadba_ee0d_4cfa_bd2e_2f09daa67ea7.slice/crio-b8a36f20bbaa35acb8da1e89c1e7b001e15f7151117bf8a54a03f79bb9942aee WatchSource:0}: Error finding container b8a36f20bbaa35acb8da1e89c1e7b001e15f7151117bf8a54a03f79bb9942aee: Status 404 returned error can't find the container with id b8a36f20bbaa35acb8da1e89c1e7b001e15f7151117bf8a54a03f79bb9942aee Apr 17 14:38:26.062140 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:26.062095 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" event={"ID":"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7","Type":"ContainerStarted","Data":"b8a36f20bbaa35acb8da1e89c1e7b001e15f7151117bf8a54a03f79bb9942aee"} Apr 17 14:38:28.069424 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:28.069383 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" event={"ID":"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7","Type":"ContainerStarted","Data":"facb5df39316b15e71f415e94744656182e7cc79739242d0c81acedb8a73cc27"} Apr 17 14:38:28.069424 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:28.069425 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" event={"ID":"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7","Type":"ContainerStarted","Data":"7a766372f192a857bbfe09b66eb2dc3e5f0ba59ba003d7eccc425c20fd6421fc"} Apr 17 14:38:28.069940 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:28.069435 2570 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" event={"ID":"536cadba-ee0d-4cfa-bd2e-2f09daa67ea7","Type":"ContainerStarted","Data":"7597761f54322a926a4c1e7a405e45a7fd07e3ba607099f1f0cec8ee41261372"} Apr 17 14:38:28.091023 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:28.090968 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-768cb8d9f7-s7czf" podStartSLOduration=1.698453689 podStartE2EDuration="4.090953622s" podCreationTimestamp="2026-04-17 14:38:24 +0000 UTC" firstStartedPulling="2026-04-17 14:38:25.114691502 +0000 UTC m=+271.471419105" lastFinishedPulling="2026-04-17 14:38:27.507191432 +0000 UTC m=+273.863919038" observedRunningTime="2026-04-17 14:38:28.089405486 +0000 UTC m=+274.446133123" watchObservedRunningTime="2026-04-17 14:38:28.090953622 +0000 UTC m=+274.447681267" Apr 17 14:38:33.699065 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:38:33.699022 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jngmx" podUID="42b16b82-9513-445d-b01e-228784a51e88" Apr 17 14:38:34.085586 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:34.085557 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jngmx" Apr 17 14:38:37.563894 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:37.563838 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jngmx\" (UID: \"42b16b82-9513-445d-b01e-228784a51e88\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jngmx" Apr 17 14:38:37.566214 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:37.566187 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/42b16b82-9513-445d-b01e-228784a51e88-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jngmx\" (UID: \"42b16b82-9513-445d-b01e-228784a51e88\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jngmx" Apr 17 14:38:37.689471 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:37.689440 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-6sm76\"" Apr 17 14:38:37.696536 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:37.696513 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jngmx" Apr 17 14:38:37.810585 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:37.810498 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-jngmx"] Apr 17 14:38:37.813178 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:38:37.813155 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42b16b82_9513_445d_b01e_228784a51e88.slice/crio-37d4a80d33d904c2cf7f32d5ec0da9244fd91e4c185792018fbfded660efa0a3 WatchSource:0}: Error finding container 37d4a80d33d904c2cf7f32d5ec0da9244fd91e4c185792018fbfded660efa0a3: Status 404 returned error can't find the container with id 37d4a80d33d904c2cf7f32d5ec0da9244fd91e4c185792018fbfded660efa0a3 Apr 17 14:38:38.096821 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:38.096739 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jngmx" event={"ID":"42b16b82-9513-445d-b01e-228784a51e88","Type":"ContainerStarted","Data":"37d4a80d33d904c2cf7f32d5ec0da9244fd91e4c185792018fbfded660efa0a3"} Apr 17 14:38:39.100852 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:39.100813 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jngmx" event={"ID":"42b16b82-9513-445d-b01e-228784a51e88","Type":"ContainerStarted","Data":"18a360c9954a401b02be0eeacb12469e600ba5ba911e5d83b16126b58cf586e6"} Apr 17 14:38:39.116670 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:39.116631 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jngmx" podStartSLOduration=274.216870828 podStartE2EDuration="4m35.116619297s" podCreationTimestamp="2026-04-17 14:34:04 +0000 UTC" firstStartedPulling="2026-04-17 14:38:37.815106782 
+0000 UTC m=+284.171834385" lastFinishedPulling="2026-04-17 14:38:38.71485524 +0000 UTC m=+285.071582854" observedRunningTime="2026-04-17 14:38:39.115270104 +0000 UTC m=+285.471997720" watchObservedRunningTime="2026-04-17 14:38:39.116619297 +0000 UTC m=+285.473346921" Apr 17 14:38:39.695772 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:39.695740 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6cb955f7f9-mzd8p"] Apr 17 14:38:54.131588 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:54.131566 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lr966_0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c/ovn-acl-logging/0.log" Apr 17 14:38:54.132384 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:38:54.132361 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lr966_0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c/ovn-acl-logging/0.log" Apr 17 14:39:04.714393 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:04.714330 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6cb955f7f9-mzd8p" podUID="302f6ead-8fbd-4c95-8899-66acff56ed81" containerName="console" containerID="cri-o://b39d40f53c4c229874b84d5e9fd60a6d6cbaa4811424279ba926e3bfecb1b65f" gracePeriod=15 Apr 17 14:39:04.956921 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:04.956896 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cb955f7f9-mzd8p_302f6ead-8fbd-4c95-8899-66acff56ed81/console/0.log" Apr 17 14:39:04.957039 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:04.956962 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6cb955f7f9-mzd8p" Apr 17 14:39:04.977893 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:04.977822 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/302f6ead-8fbd-4c95-8899-66acff56ed81-oauth-serving-cert\") pod \"302f6ead-8fbd-4c95-8899-66acff56ed81\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " Apr 17 14:39:04.977893 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:04.977859 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/302f6ead-8fbd-4c95-8899-66acff56ed81-trusted-ca-bundle\") pod \"302f6ead-8fbd-4c95-8899-66acff56ed81\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " Apr 17 14:39:04.977893 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:04.977893 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/302f6ead-8fbd-4c95-8899-66acff56ed81-console-oauth-config\") pod \"302f6ead-8fbd-4c95-8899-66acff56ed81\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " Apr 17 14:39:04.978138 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:04.977932 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/302f6ead-8fbd-4c95-8899-66acff56ed81-console-serving-cert\") pod \"302f6ead-8fbd-4c95-8899-66acff56ed81\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " Apr 17 14:39:04.978138 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:04.977969 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h29wg\" (UniqueName: \"kubernetes.io/projected/302f6ead-8fbd-4c95-8899-66acff56ed81-kube-api-access-h29wg\") pod \"302f6ead-8fbd-4c95-8899-66acff56ed81\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " Apr 17 
14:39:04.978138 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:04.978013 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/302f6ead-8fbd-4c95-8899-66acff56ed81-service-ca\") pod \"302f6ead-8fbd-4c95-8899-66acff56ed81\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " Apr 17 14:39:04.978138 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:04.978072 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/302f6ead-8fbd-4c95-8899-66acff56ed81-console-config\") pod \"302f6ead-8fbd-4c95-8899-66acff56ed81\" (UID: \"302f6ead-8fbd-4c95-8899-66acff56ed81\") " Apr 17 14:39:04.978402 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:04.978355 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/302f6ead-8fbd-4c95-8899-66acff56ed81-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "302f6ead-8fbd-4c95-8899-66acff56ed81" (UID: "302f6ead-8fbd-4c95-8899-66acff56ed81"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:39:04.978622 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:04.978395 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/302f6ead-8fbd-4c95-8899-66acff56ed81-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "302f6ead-8fbd-4c95-8899-66acff56ed81" (UID: "302f6ead-8fbd-4c95-8899-66acff56ed81"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:39:04.978989 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:04.978662 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/302f6ead-8fbd-4c95-8899-66acff56ed81-console-config" (OuterVolumeSpecName: "console-config") pod "302f6ead-8fbd-4c95-8899-66acff56ed81" (UID: "302f6ead-8fbd-4c95-8899-66acff56ed81"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:39:04.979274 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:04.979234 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/302f6ead-8fbd-4c95-8899-66acff56ed81-service-ca" (OuterVolumeSpecName: "service-ca") pod "302f6ead-8fbd-4c95-8899-66acff56ed81" (UID: "302f6ead-8fbd-4c95-8899-66acff56ed81"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:39:04.980574 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:04.980549 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/302f6ead-8fbd-4c95-8899-66acff56ed81-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "302f6ead-8fbd-4c95-8899-66acff56ed81" (UID: "302f6ead-8fbd-4c95-8899-66acff56ed81"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:39:04.980679 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:04.980653 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/302f6ead-8fbd-4c95-8899-66acff56ed81-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "302f6ead-8fbd-4c95-8899-66acff56ed81" (UID: "302f6ead-8fbd-4c95-8899-66acff56ed81"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:39:04.980815 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:04.980792 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/302f6ead-8fbd-4c95-8899-66acff56ed81-kube-api-access-h29wg" (OuterVolumeSpecName: "kube-api-access-h29wg") pod "302f6ead-8fbd-4c95-8899-66acff56ed81" (UID: "302f6ead-8fbd-4c95-8899-66acff56ed81"). InnerVolumeSpecName "kube-api-access-h29wg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:39:05.079546 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:05.079496 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/302f6ead-8fbd-4c95-8899-66acff56ed81-service-ca\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\"" Apr 17 14:39:05.079546 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:05.079541 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/302f6ead-8fbd-4c95-8899-66acff56ed81-console-config\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\"" Apr 17 14:39:05.079546 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:05.079553 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/302f6ead-8fbd-4c95-8899-66acff56ed81-oauth-serving-cert\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\"" Apr 17 14:39:05.079546 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:05.079563 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/302f6ead-8fbd-4c95-8899-66acff56ed81-trusted-ca-bundle\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\"" Apr 17 14:39:05.079814 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:05.079572 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/302f6ead-8fbd-4c95-8899-66acff56ed81-console-oauth-config\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\"" Apr 17 14:39:05.079814 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:05.079582 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/302f6ead-8fbd-4c95-8899-66acff56ed81-console-serving-cert\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\"" Apr 17 14:39:05.079814 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:05.079591 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h29wg\" (UniqueName: \"kubernetes.io/projected/302f6ead-8fbd-4c95-8899-66acff56ed81-kube-api-access-h29wg\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\"" Apr 17 14:39:05.173018 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:05.172987 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cb955f7f9-mzd8p_302f6ead-8fbd-4c95-8899-66acff56ed81/console/0.log" Apr 17 14:39:05.173170 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:05.173030 2570 generic.go:358] "Generic (PLEG): container finished" podID="302f6ead-8fbd-4c95-8899-66acff56ed81" containerID="b39d40f53c4c229874b84d5e9fd60a6d6cbaa4811424279ba926e3bfecb1b65f" exitCode=2 Apr 17 14:39:05.173170 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:05.173093 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6cb955f7f9-mzd8p" Apr 17 14:39:05.173170 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:05.173126 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cb955f7f9-mzd8p" event={"ID":"302f6ead-8fbd-4c95-8899-66acff56ed81","Type":"ContainerDied","Data":"b39d40f53c4c229874b84d5e9fd60a6d6cbaa4811424279ba926e3bfecb1b65f"} Apr 17 14:39:05.173338 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:05.173171 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cb955f7f9-mzd8p" event={"ID":"302f6ead-8fbd-4c95-8899-66acff56ed81","Type":"ContainerDied","Data":"bfc71ecba35975233830b46134d9c07b5397591e57f9b8d14c19635973ae1541"} Apr 17 14:39:05.173338 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:05.173192 2570 scope.go:117] "RemoveContainer" containerID="b39d40f53c4c229874b84d5e9fd60a6d6cbaa4811424279ba926e3bfecb1b65f" Apr 17 14:39:05.181525 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:05.181504 2570 scope.go:117] "RemoveContainer" containerID="b39d40f53c4c229874b84d5e9fd60a6d6cbaa4811424279ba926e3bfecb1b65f" Apr 17 14:39:05.181786 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:39:05.181768 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b39d40f53c4c229874b84d5e9fd60a6d6cbaa4811424279ba926e3bfecb1b65f\": container with ID starting with b39d40f53c4c229874b84d5e9fd60a6d6cbaa4811424279ba926e3bfecb1b65f not found: ID does not exist" containerID="b39d40f53c4c229874b84d5e9fd60a6d6cbaa4811424279ba926e3bfecb1b65f" Apr 17 14:39:05.181866 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:05.181793 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b39d40f53c4c229874b84d5e9fd60a6d6cbaa4811424279ba926e3bfecb1b65f"} err="failed to get container status \"b39d40f53c4c229874b84d5e9fd60a6d6cbaa4811424279ba926e3bfecb1b65f\": rpc error: code = 
NotFound desc = could not find container \"b39d40f53c4c229874b84d5e9fd60a6d6cbaa4811424279ba926e3bfecb1b65f\": container with ID starting with b39d40f53c4c229874b84d5e9fd60a6d6cbaa4811424279ba926e3bfecb1b65f not found: ID does not exist" Apr 17 14:39:05.192601 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:05.192575 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6cb955f7f9-mzd8p"] Apr 17 14:39:05.196741 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:05.196719 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6cb955f7f9-mzd8p"] Apr 17 14:39:06.238405 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:06.238371 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="302f6ead-8fbd-4c95-8899-66acff56ed81" path="/var/lib/kubelet/pods/302f6ead-8fbd-4c95-8899-66acff56ed81/volumes" Apr 17 14:39:36.435202 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.435168 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f8756f569-wqvjz"] Apr 17 14:39:36.435635 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.435444 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="302f6ead-8fbd-4c95-8899-66acff56ed81" containerName="console" Apr 17 14:39:36.435635 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.435456 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="302f6ead-8fbd-4c95-8899-66acff56ed81" containerName="console" Apr 17 14:39:36.435635 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.435508 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="302f6ead-8fbd-4c95-8899-66acff56ed81" containerName="console" Apr 17 14:39:36.440445 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.440423 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:36.443150 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.443127 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-lw4t2\"" Apr 17 14:39:36.443264 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.443127 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 14:39:36.448269 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.448213 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 14:39:36.448614 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.448321 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 14:39:36.448666 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.448610 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 14:39:36.448666 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.448629 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 14:39:36.448754 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.448689 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 14:39:36.449425 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.449362 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 14:39:36.453201 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.452732 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 14:39:36.453614 ip-10-0-129-134 
kubenswrapper[2570]: I0417 14:39:36.453594 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f8756f569-wqvjz"] Apr 17 14:39:36.512408 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.512371 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b10bac76-d2a9-42fe-a0b9-20f63c877990-console-config\") pod \"console-5f8756f569-wqvjz\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:36.512581 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.512414 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b10bac76-d2a9-42fe-a0b9-20f63c877990-console-oauth-config\") pod \"console-5f8756f569-wqvjz\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:36.512581 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.512442 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b10bac76-d2a9-42fe-a0b9-20f63c877990-service-ca\") pod \"console-5f8756f569-wqvjz\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:36.512581 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.512505 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b10bac76-d2a9-42fe-a0b9-20f63c877990-console-serving-cert\") pod \"console-5f8756f569-wqvjz\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:36.512581 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.512557 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b10bac76-d2a9-42fe-a0b9-20f63c877990-trusted-ca-bundle\") pod \"console-5f8756f569-wqvjz\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:36.512581 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.512578 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b10bac76-d2a9-42fe-a0b9-20f63c877990-oauth-serving-cert\") pod \"console-5f8756f569-wqvjz\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:36.512743 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.512594 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg4kt\" (UniqueName: \"kubernetes.io/projected/b10bac76-d2a9-42fe-a0b9-20f63c877990-kube-api-access-dg4kt\") pod \"console-5f8756f569-wqvjz\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:36.613455 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.613423 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b10bac76-d2a9-42fe-a0b9-20f63c877990-console-serving-cert\") pod \"console-5f8756f569-wqvjz\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:36.613595 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.613469 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b10bac76-d2a9-42fe-a0b9-20f63c877990-trusted-ca-bundle\") pod \"console-5f8756f569-wqvjz\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " 
pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:36.613595 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.613493 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b10bac76-d2a9-42fe-a0b9-20f63c877990-oauth-serving-cert\") pod \"console-5f8756f569-wqvjz\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:36.613595 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.613511 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dg4kt\" (UniqueName: \"kubernetes.io/projected/b10bac76-d2a9-42fe-a0b9-20f63c877990-kube-api-access-dg4kt\") pod \"console-5f8756f569-wqvjz\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:36.613595 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.613562 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b10bac76-d2a9-42fe-a0b9-20f63c877990-console-config\") pod \"console-5f8756f569-wqvjz\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:36.613595 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.613582 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b10bac76-d2a9-42fe-a0b9-20f63c877990-console-oauth-config\") pod \"console-5f8756f569-wqvjz\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:36.613815 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.613597 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b10bac76-d2a9-42fe-a0b9-20f63c877990-service-ca\") pod 
\"console-5f8756f569-wqvjz\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:36.614249 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.614217 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b10bac76-d2a9-42fe-a0b9-20f63c877990-oauth-serving-cert\") pod \"console-5f8756f569-wqvjz\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:36.614399 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.614294 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b10bac76-d2a9-42fe-a0b9-20f63c877990-console-config\") pod \"console-5f8756f569-wqvjz\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:36.614399 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.614324 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b10bac76-d2a9-42fe-a0b9-20f63c877990-service-ca\") pod \"console-5f8756f569-wqvjz\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:36.614399 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.614387 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b10bac76-d2a9-42fe-a0b9-20f63c877990-trusted-ca-bundle\") pod \"console-5f8756f569-wqvjz\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:36.616066 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.616034 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/b10bac76-d2a9-42fe-a0b9-20f63c877990-console-oauth-config\") pod \"console-5f8756f569-wqvjz\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:36.616155 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.616091 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b10bac76-d2a9-42fe-a0b9-20f63c877990-console-serving-cert\") pod \"console-5f8756f569-wqvjz\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:36.621579 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.621558 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg4kt\" (UniqueName: \"kubernetes.io/projected/b10bac76-d2a9-42fe-a0b9-20f63c877990-kube-api-access-dg4kt\") pod \"console-5f8756f569-wqvjz\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:36.755495 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.755399 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:36.874648 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.874617 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f8756f569-wqvjz"] Apr 17 14:39:36.877406 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:39:36.877378 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb10bac76_d2a9_42fe_a0b9_20f63c877990.slice/crio-8fc7552671b52292bb0c62fe826b959b7b5c512ec7e95b1f6674d3467643993f WatchSource:0}: Error finding container 8fc7552671b52292bb0c62fe826b959b7b5c512ec7e95b1f6674d3467643993f: Status 404 returned error can't find the container with id 8fc7552671b52292bb0c62fe826b959b7b5c512ec7e95b1f6674d3467643993f Apr 17 14:39:36.879161 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:36.879144 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:39:37.262425 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:37.262389 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f8756f569-wqvjz" event={"ID":"b10bac76-d2a9-42fe-a0b9-20f63c877990","Type":"ContainerStarted","Data":"47c558a145c58452c22daf162a3fab222e04b6543957713c1f26a68678a0982d"} Apr 17 14:39:37.262425 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:37.262427 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f8756f569-wqvjz" event={"ID":"b10bac76-d2a9-42fe-a0b9-20f63c877990","Type":"ContainerStarted","Data":"8fc7552671b52292bb0c62fe826b959b7b5c512ec7e95b1f6674d3467643993f"} Apr 17 14:39:37.278774 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:37.278727 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f8756f569-wqvjz" podStartSLOduration=1.278713014 podStartE2EDuration="1.278713014s" podCreationTimestamp="2026-04-17 14:39:36 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:39:37.277668483 +0000 UTC m=+343.634396113" watchObservedRunningTime="2026-04-17 14:39:37.278713014 +0000 UTC m=+343.635440642" Apr 17 14:39:46.755987 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:46.755891 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:46.755987 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:46.755959 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:46.760575 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:46.760550 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:39:47.294143 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:39:47.294114 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:41:18.572635 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:18.572538 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pvtjc"] Apr 17 14:41:18.576056 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:18.576031 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pvtjc" Apr 17 14:41:18.578754 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:18.578727 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:41:18.578882 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:18.578727 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-z87gh\"" Apr 17 14:41:18.579044 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:18.579031 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 17 14:41:18.584103 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:18.584080 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pvtjc"] Apr 17 14:41:18.626361 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:18.626320 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/82d0723e-4ff8-4938-bbc5-2eb5383ec6c7-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-pvtjc\" (UID: \"82d0723e-4ff8-4938-bbc5-2eb5383ec6c7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pvtjc" Apr 17 14:41:18.626531 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:18.626405 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qssl\" (UniqueName: \"kubernetes.io/projected/82d0723e-4ff8-4938-bbc5-2eb5383ec6c7-kube-api-access-8qssl\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-pvtjc\" (UID: \"82d0723e-4ff8-4938-bbc5-2eb5383ec6c7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pvtjc" 
Apr 17 14:41:18.727135 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:18.727102 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/82d0723e-4ff8-4938-bbc5-2eb5383ec6c7-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-pvtjc\" (UID: \"82d0723e-4ff8-4938-bbc5-2eb5383ec6c7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pvtjc" Apr 17 14:41:18.727333 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:18.727163 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8qssl\" (UniqueName: \"kubernetes.io/projected/82d0723e-4ff8-4938-bbc5-2eb5383ec6c7-kube-api-access-8qssl\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-pvtjc\" (UID: \"82d0723e-4ff8-4938-bbc5-2eb5383ec6c7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pvtjc" Apr 17 14:41:18.727510 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:18.727492 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/82d0723e-4ff8-4938-bbc5-2eb5383ec6c7-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-pvtjc\" (UID: \"82d0723e-4ff8-4938-bbc5-2eb5383ec6c7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pvtjc" Apr 17 14:41:18.734941 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:18.734921 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qssl\" (UniqueName: \"kubernetes.io/projected/82d0723e-4ff8-4938-bbc5-2eb5383ec6c7-kube-api-access-8qssl\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-pvtjc\" (UID: \"82d0723e-4ff8-4938-bbc5-2eb5383ec6c7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pvtjc" Apr 17 14:41:18.885518 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:18.885418 2570 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pvtjc"
Apr 17 14:41:19.010578 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:19.010545 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pvtjc"]
Apr 17 14:41:19.013548 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:41:19.013521 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82d0723e_4ff8_4938_bbc5_2eb5383ec6c7.slice/crio-8fb370606abda3c0ae99e8a909c220223f312a85de1ed5c3913ccd795ecdfcbc WatchSource:0}: Error finding container 8fb370606abda3c0ae99e8a909c220223f312a85de1ed5c3913ccd795ecdfcbc: Status 404 returned error can't find the container with id 8fb370606abda3c0ae99e8a909c220223f312a85de1ed5c3913ccd795ecdfcbc
Apr 17 14:41:19.538392 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:19.538358 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pvtjc" event={"ID":"82d0723e-4ff8-4938-bbc5-2eb5383ec6c7","Type":"ContainerStarted","Data":"8fb370606abda3c0ae99e8a909c220223f312a85de1ed5c3913ccd795ecdfcbc"}
Apr 17 14:41:22.548974 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:22.548937 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pvtjc" event={"ID":"82d0723e-4ff8-4938-bbc5-2eb5383ec6c7","Type":"ContainerStarted","Data":"31635edd5d7d5137831f76572fb3808a68946614ebec9831b4153b2832e1e5ab"}
Apr 17 14:41:22.569531 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:22.569481 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-pvtjc" podStartSLOduration=2.044942081 podStartE2EDuration="4.569464923s" podCreationTimestamp="2026-04-17 14:41:18 +0000 UTC" firstStartedPulling="2026-04-17 14:41:19.016092673 +0000 UTC m=+445.372820289" lastFinishedPulling="2026-04-17 14:41:21.540615524 +0000 UTC m=+447.897343131" observedRunningTime="2026-04-17 14:41:22.566930509 +0000 UTC m=+448.923658154" watchObservedRunningTime="2026-04-17 14:41:22.569464923 +0000 UTC m=+448.926192548"
Apr 17 14:41:56.280070 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:56.280028 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c585549bc-zqv2x"]
Apr 17 14:41:56.284594 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:56.284571 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-zqv2x"
Apr 17 14:41:56.287256 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:56.287209 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-bfw6w\""
Apr 17 14:41:56.287458 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:56.287439 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 17 14:41:56.287550 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:56.287433 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 17 14:41:56.287550 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:56.287495 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 17 14:41:56.287692 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:56.287579 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 17 14:41:56.296288 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:56.296262 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c585549bc-zqv2x"]
Apr 17 14:41:56.413949 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:56.413907 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b392ed97-9ee8-4818-8103-31efb7f94c80-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c585549bc-zqv2x\" (UID: \"b392ed97-9ee8-4818-8103-31efb7f94c80\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-zqv2x"
Apr 17 14:41:56.413949 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:56.413953 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b392ed97-9ee8-4818-8103-31efb7f94c80-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c585549bc-zqv2x\" (UID: \"b392ed97-9ee8-4818-8103-31efb7f94c80\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-zqv2x"
Apr 17 14:41:56.414166 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:56.414072 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqwbd\" (UniqueName: \"kubernetes.io/projected/b392ed97-9ee8-4818-8103-31efb7f94c80-kube-api-access-jqwbd\") pod \"opendatahub-operator-controller-manager-6c585549bc-zqv2x\" (UID: \"b392ed97-9ee8-4818-8103-31efb7f94c80\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-zqv2x"
Apr 17 14:41:56.515098 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:56.515060 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqwbd\" (UniqueName: \"kubernetes.io/projected/b392ed97-9ee8-4818-8103-31efb7f94c80-kube-api-access-jqwbd\") pod \"opendatahub-operator-controller-manager-6c585549bc-zqv2x\" (UID: \"b392ed97-9ee8-4818-8103-31efb7f94c80\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-zqv2x"
Apr 17 14:41:56.515308 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:56.515123 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b392ed97-9ee8-4818-8103-31efb7f94c80-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c585549bc-zqv2x\" (UID: \"b392ed97-9ee8-4818-8103-31efb7f94c80\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-zqv2x"
Apr 17 14:41:56.515308 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:56.515161 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b392ed97-9ee8-4818-8103-31efb7f94c80-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c585549bc-zqv2x\" (UID: \"b392ed97-9ee8-4818-8103-31efb7f94c80\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-zqv2x"
Apr 17 14:41:56.517711 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:56.517686 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b392ed97-9ee8-4818-8103-31efb7f94c80-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c585549bc-zqv2x\" (UID: \"b392ed97-9ee8-4818-8103-31efb7f94c80\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-zqv2x"
Apr 17 14:41:56.517826 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:56.517697 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b392ed97-9ee8-4818-8103-31efb7f94c80-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c585549bc-zqv2x\" (UID: \"b392ed97-9ee8-4818-8103-31efb7f94c80\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-zqv2x"
Apr 17 14:41:56.522889 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:56.522866 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqwbd\" (UniqueName: \"kubernetes.io/projected/b392ed97-9ee8-4818-8103-31efb7f94c80-kube-api-access-jqwbd\") pod \"opendatahub-operator-controller-manager-6c585549bc-zqv2x\" (UID: \"b392ed97-9ee8-4818-8103-31efb7f94c80\") " pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-zqv2x"
Apr 17 14:41:56.595065 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:56.594972 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-zqv2x"
Apr 17 14:41:56.727991 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:56.727957 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c585549bc-zqv2x"]
Apr 17 14:41:56.731775 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:41:56.731747 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb392ed97_9ee8_4818_8103_31efb7f94c80.slice/crio-19b4a7098f5f07acad9231c1e564a7da1e723cd14810e1690dbea4ad1f390a02 WatchSource:0}: Error finding container 19b4a7098f5f07acad9231c1e564a7da1e723cd14810e1690dbea4ad1f390a02: Status 404 returned error can't find the container with id 19b4a7098f5f07acad9231c1e564a7da1e723cd14810e1690dbea4ad1f390a02
Apr 17 14:41:57.660257 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:57.660202 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-zqv2x" event={"ID":"b392ed97-9ee8-4818-8103-31efb7f94c80","Type":"ContainerStarted","Data":"19b4a7098f5f07acad9231c1e564a7da1e723cd14810e1690dbea4ad1f390a02"}
Apr 17 14:41:59.670043 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:59.670002 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-zqv2x" event={"ID":"b392ed97-9ee8-4818-8103-31efb7f94c80","Type":"ContainerStarted","Data":"a906abda71d79204ee5ebce971d309b8df506df076ffebac04db01ac11af9344"}
Apr 17 14:41:59.670447 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:59.670119 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-zqv2x"
Apr 17 14:41:59.691402 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:41:59.691352 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-zqv2x" podStartSLOduration=1.356389037 podStartE2EDuration="3.691338557s" podCreationTimestamp="2026-04-17 14:41:56 +0000 UTC" firstStartedPulling="2026-04-17 14:41:56.733584202 +0000 UTC m=+483.090311805" lastFinishedPulling="2026-04-17 14:41:59.068533722 +0000 UTC m=+485.425261325" observedRunningTime="2026-04-17 14:41:59.689500086 +0000 UTC m=+486.046227744" watchObservedRunningTime="2026-04-17 14:41:59.691338557 +0000 UTC m=+486.048066257"
Apr 17 14:42:10.675721 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:10.675688 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6c585549bc-zqv2x"
Apr 17 14:42:13.877213 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:13.877173 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-596bc867c-54qdn"]
Apr 17 14:42:13.880863 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:13.880841 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-596bc867c-54qdn"
Apr 17 14:42:13.884672 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:13.884649 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-hdxjw\""
Apr 17 14:42:13.884810 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:13.884689 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 14:42:13.884810 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:13.884689 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 14:42:13.884810 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:13.884758 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 17 14:42:13.885034 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:13.884971 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 17 14:42:13.890584 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:13.890562 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-596bc867c-54qdn"]
Apr 17 14:42:13.970918 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:13.970884 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpsvs\" (UniqueName: \"kubernetes.io/projected/b247dd44-4d34-4a95-a9a7-8ff7b84ab721-kube-api-access-tpsvs\") pod \"kube-auth-proxy-596bc867c-54qdn\" (UID: \"b247dd44-4d34-4a95-a9a7-8ff7b84ab721\") " pod="openshift-ingress/kube-auth-proxy-596bc867c-54qdn"
Apr 17 14:42:13.971077 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:13.970947 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b247dd44-4d34-4a95-a9a7-8ff7b84ab721-tls-certs\") pod \"kube-auth-proxy-596bc867c-54qdn\" (UID: \"b247dd44-4d34-4a95-a9a7-8ff7b84ab721\") " pod="openshift-ingress/kube-auth-proxy-596bc867c-54qdn"
Apr 17 14:42:13.971077 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:13.971013 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b247dd44-4d34-4a95-a9a7-8ff7b84ab721-tmp\") pod \"kube-auth-proxy-596bc867c-54qdn\" (UID: \"b247dd44-4d34-4a95-a9a7-8ff7b84ab721\") " pod="openshift-ingress/kube-auth-proxy-596bc867c-54qdn"
Apr 17 14:42:14.072143 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:14.072102 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpsvs\" (UniqueName: \"kubernetes.io/projected/b247dd44-4d34-4a95-a9a7-8ff7b84ab721-kube-api-access-tpsvs\") pod \"kube-auth-proxy-596bc867c-54qdn\" (UID: \"b247dd44-4d34-4a95-a9a7-8ff7b84ab721\") " pod="openshift-ingress/kube-auth-proxy-596bc867c-54qdn"
Apr 17 14:42:14.072359 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:14.072165 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b247dd44-4d34-4a95-a9a7-8ff7b84ab721-tls-certs\") pod \"kube-auth-proxy-596bc867c-54qdn\" (UID: \"b247dd44-4d34-4a95-a9a7-8ff7b84ab721\") " pod="openshift-ingress/kube-auth-proxy-596bc867c-54qdn"
Apr 17 14:42:14.072359 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:14.072191 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b247dd44-4d34-4a95-a9a7-8ff7b84ab721-tmp\") pod \"kube-auth-proxy-596bc867c-54qdn\" (UID: \"b247dd44-4d34-4a95-a9a7-8ff7b84ab721\") " pod="openshift-ingress/kube-auth-proxy-596bc867c-54qdn"
Apr 17 14:42:14.074392 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:14.074370 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b247dd44-4d34-4a95-a9a7-8ff7b84ab721-tmp\") pod \"kube-auth-proxy-596bc867c-54qdn\" (UID: \"b247dd44-4d34-4a95-a9a7-8ff7b84ab721\") " pod="openshift-ingress/kube-auth-proxy-596bc867c-54qdn"
Apr 17 14:42:14.074522 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:14.074496 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b247dd44-4d34-4a95-a9a7-8ff7b84ab721-tls-certs\") pod \"kube-auth-proxy-596bc867c-54qdn\" (UID: \"b247dd44-4d34-4a95-a9a7-8ff7b84ab721\") " pod="openshift-ingress/kube-auth-proxy-596bc867c-54qdn"
Apr 17 14:42:14.079866 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:14.079843 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpsvs\" (UniqueName: \"kubernetes.io/projected/b247dd44-4d34-4a95-a9a7-8ff7b84ab721-kube-api-access-tpsvs\") pod \"kube-auth-proxy-596bc867c-54qdn\" (UID: \"b247dd44-4d34-4a95-a9a7-8ff7b84ab721\") " pod="openshift-ingress/kube-auth-proxy-596bc867c-54qdn"
Apr 17 14:42:14.190883 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:14.190803 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-596bc867c-54qdn"
Apr 17 14:42:14.314215 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:14.314181 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-596bc867c-54qdn"]
Apr 17 14:42:14.316528 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:42:14.316502 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb247dd44_4d34_4a95_a9a7_8ff7b84ab721.slice/crio-7f0bb88521b4765159865a3ba37e74e62f85d439c780a06c34be1151421bee5e WatchSource:0}: Error finding container 7f0bb88521b4765159865a3ba37e74e62f85d439c780a06c34be1151421bee5e: Status 404 returned error can't find the container with id 7f0bb88521b4765159865a3ba37e74e62f85d439c780a06c34be1151421bee5e
Apr 17 14:42:14.713543 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:14.713506 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-596bc867c-54qdn" event={"ID":"b247dd44-4d34-4a95-a9a7-8ff7b84ab721","Type":"ContainerStarted","Data":"7f0bb88521b4765159865a3ba37e74e62f85d439c780a06c34be1151421bee5e"}
Apr 17 14:42:16.877118 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:16.877082 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-k6gvc"]
Apr 17 14:42:16.880896 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:16.880870 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-k6gvc"
Apr 17 14:42:16.883661 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:16.883628 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\""
Apr 17 14:42:16.883661 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:16.883652 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-mcw8d\""
Apr 17 14:42:16.888678 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:16.888638 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-k6gvc"]
Apr 17 14:42:16.995602 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:16.995550 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf35ee29-5a30-4277-a1af-d999fbdb36de-cert\") pod \"odh-model-controller-858dbf95b8-k6gvc\" (UID: \"cf35ee29-5a30-4277-a1af-d999fbdb36de\") " pod="opendatahub/odh-model-controller-858dbf95b8-k6gvc"
Apr 17 14:42:16.995790 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:16.995627 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7tks\" (UniqueName: \"kubernetes.io/projected/cf35ee29-5a30-4277-a1af-d999fbdb36de-kube-api-access-k7tks\") pod \"odh-model-controller-858dbf95b8-k6gvc\" (UID: \"cf35ee29-5a30-4277-a1af-d999fbdb36de\") " pod="opendatahub/odh-model-controller-858dbf95b8-k6gvc"
Apr 17 14:42:17.096708 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:17.096676 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf35ee29-5a30-4277-a1af-d999fbdb36de-cert\") pod \"odh-model-controller-858dbf95b8-k6gvc\" (UID: \"cf35ee29-5a30-4277-a1af-d999fbdb36de\") " pod="opendatahub/odh-model-controller-858dbf95b8-k6gvc"
Apr 17 14:42:17.096881 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:17.096734 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7tks\" (UniqueName: \"kubernetes.io/projected/cf35ee29-5a30-4277-a1af-d999fbdb36de-kube-api-access-k7tks\") pod \"odh-model-controller-858dbf95b8-k6gvc\" (UID: \"cf35ee29-5a30-4277-a1af-d999fbdb36de\") " pod="opendatahub/odh-model-controller-858dbf95b8-k6gvc"
Apr 17 14:42:17.096881 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:42:17.096838 2570 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 17 14:42:17.096951 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:42:17.096904 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf35ee29-5a30-4277-a1af-d999fbdb36de-cert podName:cf35ee29-5a30-4277-a1af-d999fbdb36de nodeName:}" failed. No retries permitted until 2026-04-17 14:42:17.59688589 +0000 UTC m=+503.953613493 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf35ee29-5a30-4277-a1af-d999fbdb36de-cert") pod "odh-model-controller-858dbf95b8-k6gvc" (UID: "cf35ee29-5a30-4277-a1af-d999fbdb36de") : secret "odh-model-controller-webhook-cert" not found
Apr 17 14:42:17.107555 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:17.107533 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7tks\" (UniqueName: \"kubernetes.io/projected/cf35ee29-5a30-4277-a1af-d999fbdb36de-kube-api-access-k7tks\") pod \"odh-model-controller-858dbf95b8-k6gvc\" (UID: \"cf35ee29-5a30-4277-a1af-d999fbdb36de\") " pod="opendatahub/odh-model-controller-858dbf95b8-k6gvc"
Apr 17 14:42:17.601644 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:17.601599 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf35ee29-5a30-4277-a1af-d999fbdb36de-cert\") pod \"odh-model-controller-858dbf95b8-k6gvc\" (UID: \"cf35ee29-5a30-4277-a1af-d999fbdb36de\") " pod="opendatahub/odh-model-controller-858dbf95b8-k6gvc"
Apr 17 14:42:17.603939 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:17.603905 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf35ee29-5a30-4277-a1af-d999fbdb36de-cert\") pod \"odh-model-controller-858dbf95b8-k6gvc\" (UID: \"cf35ee29-5a30-4277-a1af-d999fbdb36de\") " pod="opendatahub/odh-model-controller-858dbf95b8-k6gvc"
Apr 17 14:42:17.795503 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:17.795460 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-k6gvc"
Apr 17 14:42:18.448825 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:18.448764 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-k6gvc"]
Apr 17 14:42:18.453202 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:42:18.453102 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf35ee29_5a30_4277_a1af_d999fbdb36de.slice/crio-915f7b4769c5c748d87b9c20336007b481e3a8fff4fc7a1c594473f39b15712d WatchSource:0}: Error finding container 915f7b4769c5c748d87b9c20336007b481e3a8fff4fc7a1c594473f39b15712d: Status 404 returned error can't find the container with id 915f7b4769c5c748d87b9c20336007b481e3a8fff4fc7a1c594473f39b15712d
Apr 17 14:42:18.728872 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:18.728755 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-596bc867c-54qdn" event={"ID":"b247dd44-4d34-4a95-a9a7-8ff7b84ab721","Type":"ContainerStarted","Data":"16812e05038aaf97bf51e98705f6f940d3743f3e1f20f131fc06ef987b5d92b9"}
Apr 17 14:42:18.729807 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:18.729780 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-k6gvc" event={"ID":"cf35ee29-5a30-4277-a1af-d999fbdb36de","Type":"ContainerStarted","Data":"915f7b4769c5c748d87b9c20336007b481e3a8fff4fc7a1c594473f39b15712d"}
Apr 17 14:42:18.745420 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:18.745372 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-596bc867c-54qdn" podStartSLOduration=1.695155133 podStartE2EDuration="5.745355371s" podCreationTimestamp="2026-04-17 14:42:13 +0000 UTC" firstStartedPulling="2026-04-17 14:42:14.318169263 +0000 UTC m=+500.674896866" lastFinishedPulling="2026-04-17 14:42:18.368369495 +0000 UTC m=+504.725097104" observedRunningTime="2026-04-17 14:42:18.744028178 +0000 UTC m=+505.100755814" watchObservedRunningTime="2026-04-17 14:42:18.745355371 +0000 UTC m=+505.102082998"
Apr 17 14:42:19.127649 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:19.127614 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-66f567c4b6-skq9g"]
Apr 17 14:42:19.131133 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:19.131108 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-skq9g"
Apr 17 14:42:19.135673 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:19.135050 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 17 14:42:19.135673 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:19.135115 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 17 14:42:19.135673 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:19.135142 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 17 14:42:19.135673 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:19.135494 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 17 14:42:19.137760 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:19.135903 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 17 14:42:19.138124 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:19.137867 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-bbn7n\""
Apr 17 14:42:19.139371 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:19.139325 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-66f567c4b6-skq9g"]
Apr 17 14:42:19.316234 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:19.316191 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2e17f585-e743-4564-8174-9b44434f5f66-manager-config\") pod \"lws-controller-manager-66f567c4b6-skq9g\" (UID: \"2e17f585-e743-4564-8174-9b44434f5f66\") " pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-skq9g"
Apr 17 14:42:19.316441 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:19.316268 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e17f585-e743-4564-8174-9b44434f5f66-cert\") pod \"lws-controller-manager-66f567c4b6-skq9g\" (UID: \"2e17f585-e743-4564-8174-9b44434f5f66\") " pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-skq9g"
Apr 17 14:42:19.316441 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:19.316296 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f96sr\" (UniqueName: \"kubernetes.io/projected/2e17f585-e743-4564-8174-9b44434f5f66-kube-api-access-f96sr\") pod \"lws-controller-manager-66f567c4b6-skq9g\" (UID: \"2e17f585-e743-4564-8174-9b44434f5f66\") " pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-skq9g"
Apr 17 14:42:19.316441 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:19.316375 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2e17f585-e743-4564-8174-9b44434f5f66-metrics-cert\") pod \"lws-controller-manager-66f567c4b6-skq9g\" (UID: \"2e17f585-e743-4564-8174-9b44434f5f66\") " pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-skq9g"
Apr 17 14:42:19.417448 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:19.417371 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2e17f585-e743-4564-8174-9b44434f5f66-manager-config\") pod \"lws-controller-manager-66f567c4b6-skq9g\" (UID: \"2e17f585-e743-4564-8174-9b44434f5f66\") " pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-skq9g"
Apr 17 14:42:19.417448 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:19.417427 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e17f585-e743-4564-8174-9b44434f5f66-cert\") pod \"lws-controller-manager-66f567c4b6-skq9g\" (UID: \"2e17f585-e743-4564-8174-9b44434f5f66\") " pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-skq9g"
Apr 17 14:42:19.417659 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:19.417454 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f96sr\" (UniqueName: \"kubernetes.io/projected/2e17f585-e743-4564-8174-9b44434f5f66-kube-api-access-f96sr\") pod \"lws-controller-manager-66f567c4b6-skq9g\" (UID: \"2e17f585-e743-4564-8174-9b44434f5f66\") " pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-skq9g"
Apr 17 14:42:19.417659 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:19.417507 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2e17f585-e743-4564-8174-9b44434f5f66-metrics-cert\") pod \"lws-controller-manager-66f567c4b6-skq9g\" (UID: \"2e17f585-e743-4564-8174-9b44434f5f66\") " pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-skq9g"
Apr 17 14:42:19.418587 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:19.418504 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2e17f585-e743-4564-8174-9b44434f5f66-manager-config\") pod \"lws-controller-manager-66f567c4b6-skq9g\" (UID: \"2e17f585-e743-4564-8174-9b44434f5f66\") " pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-skq9g"
Apr 17 14:42:19.420900 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:19.420829 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2e17f585-e743-4564-8174-9b44434f5f66-metrics-cert\") pod \"lws-controller-manager-66f567c4b6-skq9g\" (UID: \"2e17f585-e743-4564-8174-9b44434f5f66\") " pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-skq9g"
Apr 17 14:42:19.421017 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:19.420972 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e17f585-e743-4564-8174-9b44434f5f66-cert\") pod \"lws-controller-manager-66f567c4b6-skq9g\" (UID: \"2e17f585-e743-4564-8174-9b44434f5f66\") " pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-skq9g"
Apr 17 14:42:19.425912 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:19.425889 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f96sr\" (UniqueName: \"kubernetes.io/projected/2e17f585-e743-4564-8174-9b44434f5f66-kube-api-access-f96sr\") pod \"lws-controller-manager-66f567c4b6-skq9g\" (UID: \"2e17f585-e743-4564-8174-9b44434f5f66\") " pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-skq9g"
Apr 17 14:42:19.453026 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:19.452987 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-skq9g"
Apr 17 14:42:19.599787 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:19.599734 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-66f567c4b6-skq9g"]
Apr 17 14:42:19.602728 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:42:19.602694 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e17f585_e743_4564_8174_9b44434f5f66.slice/crio-81c68eefb7c98dba656b42acee0065c7034fae6543eedb1293928016f7a5f88b WatchSource:0}: Error finding container 81c68eefb7c98dba656b42acee0065c7034fae6543eedb1293928016f7a5f88b: Status 404 returned error can't find the container with id 81c68eefb7c98dba656b42acee0065c7034fae6543eedb1293928016f7a5f88b
Apr 17 14:42:19.735629 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:19.735526 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-skq9g" event={"ID":"2e17f585-e743-4564-8174-9b44434f5f66","Type":"ContainerStarted","Data":"81c68eefb7c98dba656b42acee0065c7034fae6543eedb1293928016f7a5f88b"}
Apr 17 14:42:22.747722 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:22.747619 2570 generic.go:358] "Generic (PLEG): container finished" podID="cf35ee29-5a30-4277-a1af-d999fbdb36de" containerID="b3897b2e7993b60987eb979c0522a7d794a3f65ee83d44eade2157e5f389df12" exitCode=1
Apr 17 14:42:22.748176 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:22.747708 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-k6gvc" event={"ID":"cf35ee29-5a30-4277-a1af-d999fbdb36de","Type":"ContainerDied","Data":"b3897b2e7993b60987eb979c0522a7d794a3f65ee83d44eade2157e5f389df12"}
Apr 17 14:42:22.748176 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:22.747950 2570 scope.go:117] "RemoveContainer" containerID="b3897b2e7993b60987eb979c0522a7d794a3f65ee83d44eade2157e5f389df12"
Apr 17 14:42:22.749382 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:22.749357 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-skq9g" event={"ID":"2e17f585-e743-4564-8174-9b44434f5f66","Type":"ContainerStarted","Data":"173326602a0cccc5f08d5b4fe5b3d05f5b9ec2f706cf1471c13e8a42185a4035"}
Apr 17 14:42:22.749628 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:22.749490 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-skq9g"
Apr 17 14:42:22.809764 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:22.809712 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-skq9g" podStartSLOduration=0.989029507 podStartE2EDuration="3.809696444s" podCreationTimestamp="2026-04-17 14:42:19 +0000 UTC" firstStartedPulling="2026-04-17 14:42:19.605289836 +0000 UTC m=+505.962017441" lastFinishedPulling="2026-04-17 14:42:22.425956772 +0000 UTC m=+508.782684378" observedRunningTime="2026-04-17 14:42:22.808620443 +0000 UTC m=+509.165348068" watchObservedRunningTime="2026-04-17 14:42:22.809696444 +0000 UTC m=+509.166424070"
Apr 17 14:42:23.753942 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:23.753832 2570 generic.go:358] "Generic (PLEG): container finished" podID="cf35ee29-5a30-4277-a1af-d999fbdb36de" containerID="f39326395160f427e8c7f477f272f5305ee0cff6eb0b4286d4af4cfb47033cba" exitCode=1
Apr 17 14:42:23.753942 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:23.753915 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-k6gvc" event={"ID":"cf35ee29-5a30-4277-a1af-d999fbdb36de","Type":"ContainerDied","Data":"f39326395160f427e8c7f477f272f5305ee0cff6eb0b4286d4af4cfb47033cba"}
Apr 17 14:42:23.754496 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:23.753963 2570 scope.go:117] "RemoveContainer" containerID="b3897b2e7993b60987eb979c0522a7d794a3f65ee83d44eade2157e5f389df12"
Apr 17 14:42:23.754496 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:23.754174 2570 scope.go:117] "RemoveContainer" containerID="f39326395160f427e8c7f477f272f5305ee0cff6eb0b4286d4af4cfb47033cba"
Apr 17 14:42:23.754496 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:42:23.754470 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-k6gvc_opendatahub(cf35ee29-5a30-4277-a1af-d999fbdb36de)\"" pod="opendatahub/odh-model-controller-858dbf95b8-k6gvc" podUID="cf35ee29-5a30-4277-a1af-d999fbdb36de"
Apr 17 14:42:24.759488 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:24.759457 2570 scope.go:117] "RemoveContainer" containerID="f39326395160f427e8c7f477f272f5305ee0cff6eb0b4286d4af4cfb47033cba"
Apr 17 14:42:24.759846 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:42:24.759644 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-k6gvc_opendatahub(cf35ee29-5a30-4277-a1af-d999fbdb36de)\"" pod="opendatahub/odh-model-controller-858dbf95b8-k6gvc" podUID="cf35ee29-5a30-4277-a1af-d999fbdb36de"
Apr 17 14:42:27.795591 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:27.795553 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-k6gvc"
Apr 17 14:42:27.795983 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:27.795929 2570 scope.go:117] "RemoveContainer" containerID="f39326395160f427e8c7f477f272f5305ee0cff6eb0b4286d4af4cfb47033cba"
Apr 17 14:42:27.796112 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:42:27.796093 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-k6gvc_opendatahub(cf35ee29-5a30-4277-a1af-d999fbdb36de)\"" pod="opendatahub/odh-model-controller-858dbf95b8-k6gvc" podUID="cf35ee29-5a30-4277-a1af-d999fbdb36de"
Apr 17 14:42:33.756904 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:33.756870 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-66f567c4b6-skq9g"
Apr 17 14:42:37.795960 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:37.795923 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-k6gvc"
Apr 17 14:42:37.796471 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:37.796354 2570 scope.go:117] "RemoveContainer" containerID="f39326395160f427e8c7f477f272f5305ee0cff6eb0b4286d4af4cfb47033cba"
Apr 17 14:42:38.804406 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:38.804371 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-k6gvc" event={"ID":"cf35ee29-5a30-4277-a1af-d999fbdb36de","Type":"ContainerStarted","Data":"c7bcb82882eae51af89739716a607d1b7541b51f2164927a0e9aced68b0ff90f"}
Apr 17 14:42:38.804791 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:38.804572 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-k6gvc"
Apr 17 14:42:49.810521 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:49.810442 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-k6gvc"
Apr 17 14:42:49.827314 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:42:49.827264 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-k6gvc"
podStartSLOduration=14.193260162 podStartE2EDuration="33.827233301s" podCreationTimestamp="2026-04-17 14:42:16 +0000 UTC" firstStartedPulling="2026-04-17 14:42:18.454441054 +0000 UTC m=+504.811168659" lastFinishedPulling="2026-04-17 14:42:38.088414179 +0000 UTC m=+524.445141798" observedRunningTime="2026-04-17 14:42:38.822201604 +0000 UTC m=+525.178929230" watchObservedRunningTime="2026-04-17 14:42:49.827233301 +0000 UTC m=+536.183960925" Apr 17 14:43:45.787461 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:43:45.787426 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f8756f569-wqvjz"] Apr 17 14:43:54.159038 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:43:54.158981 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lr966_0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c/ovn-acl-logging/0.log" Apr 17 14:43:54.161455 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:43:54.161435 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lr966_0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c/ovn-acl-logging/0.log" Apr 17 14:44:02.732706 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:02.732668 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xj54s"] Apr 17 14:44:02.735978 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:02.735958 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xj54s" Apr 17 14:44:02.738551 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:02.738519 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 17 14:44:02.738551 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:02.738526 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 17 14:44:02.738813 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:02.738574 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 14:44:02.738813 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:02.738637 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 14:44:02.739706 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:02.739692 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-f6hxm\"" Apr 17 14:44:02.745493 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:02.745468 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xj54s"] Apr 17 14:44:02.807011 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:02.806970 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/97f76fcc-9592-4845-9cef-a42c1d09ddca-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-xj54s\" (UID: \"97f76fcc-9592-4845-9cef-a42c1d09ddca\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xj54s" Apr 17 14:44:02.807198 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:02.807022 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/97f76fcc-9592-4845-9cef-a42c1d09ddca-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-xj54s\" (UID: \"97f76fcc-9592-4845-9cef-a42c1d09ddca\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xj54s" Apr 17 14:44:02.807198 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:02.807092 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn5sm\" (UniqueName: \"kubernetes.io/projected/97f76fcc-9592-4845-9cef-a42c1d09ddca-kube-api-access-zn5sm\") pod \"kuadrant-console-plugin-6cb54b5c86-xj54s\" (UID: \"97f76fcc-9592-4845-9cef-a42c1d09ddca\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xj54s" Apr 17 14:44:02.908211 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:02.908174 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/97f76fcc-9592-4845-9cef-a42c1d09ddca-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-xj54s\" (UID: \"97f76fcc-9592-4845-9cef-a42c1d09ddca\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xj54s" Apr 17 14:44:02.908211 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:02.908219 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/97f76fcc-9592-4845-9cef-a42c1d09ddca-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-xj54s\" (UID: \"97f76fcc-9592-4845-9cef-a42c1d09ddca\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xj54s" Apr 17 14:44:02.908447 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:02.908274 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zn5sm\" (UniqueName: \"kubernetes.io/projected/97f76fcc-9592-4845-9cef-a42c1d09ddca-kube-api-access-zn5sm\") pod \"kuadrant-console-plugin-6cb54b5c86-xj54s\" (UID: \"97f76fcc-9592-4845-9cef-a42c1d09ddca\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xj54s" Apr 17 14:44:02.908447 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:44:02.908337 2570 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 17 14:44:02.908447 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:44:02.908413 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97f76fcc-9592-4845-9cef-a42c1d09ddca-plugin-serving-cert podName:97f76fcc-9592-4845-9cef-a42c1d09ddca nodeName:}" failed. No retries permitted until 2026-04-17 14:44:03.4083953 +0000 UTC m=+609.765122903 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/97f76fcc-9592-4845-9cef-a42c1d09ddca-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-xj54s" (UID: "97f76fcc-9592-4845-9cef-a42c1d09ddca") : secret "plugin-serving-cert" not found Apr 17 14:44:02.908865 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:02.908845 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/97f76fcc-9592-4845-9cef-a42c1d09ddca-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-xj54s\" (UID: \"97f76fcc-9592-4845-9cef-a42c1d09ddca\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xj54s" Apr 17 14:44:02.919261 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:02.919211 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn5sm\" (UniqueName: \"kubernetes.io/projected/97f76fcc-9592-4845-9cef-a42c1d09ddca-kube-api-access-zn5sm\") pod \"kuadrant-console-plugin-6cb54b5c86-xj54s\" (UID: \"97f76fcc-9592-4845-9cef-a42c1d09ddca\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xj54s" Apr 17 14:44:03.411710 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:03.411671 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/97f76fcc-9592-4845-9cef-a42c1d09ddca-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-xj54s\" (UID: \"97f76fcc-9592-4845-9cef-a42c1d09ddca\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xj54s" Apr 17 14:44:03.413981 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:03.413959 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/97f76fcc-9592-4845-9cef-a42c1d09ddca-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-xj54s\" (UID: \"97f76fcc-9592-4845-9cef-a42c1d09ddca\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xj54s" Apr 17 14:44:03.646265 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:03.646195 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xj54s" Apr 17 14:44:03.764712 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:03.764686 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xj54s"] Apr 17 14:44:03.766984 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:44:03.766952 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97f76fcc_9592_4845_9cef_a42c1d09ddca.slice/crio-3297596802c833bf4c11a9a118fbb9565a9478dfb0a09136f4092646c1f83c56 WatchSource:0}: Error finding container 3297596802c833bf4c11a9a118fbb9565a9478dfb0a09136f4092646c1f83c56: Status 404 returned error can't find the container with id 3297596802c833bf4c11a9a118fbb9565a9478dfb0a09136f4092646c1f83c56 Apr 17 14:44:04.089699 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:04.089661 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xj54s" 
event={"ID":"97f76fcc-9592-4845-9cef-a42c1d09ddca","Type":"ContainerStarted","Data":"3297596802c833bf4c11a9a118fbb9565a9478dfb0a09136f4092646c1f83c56"} Apr 17 14:44:10.806423 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:10.806352 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5f8756f569-wqvjz" podUID="b10bac76-d2a9-42fe-a0b9-20f63c877990" containerName="console" containerID="cri-o://47c558a145c58452c22daf162a3fab222e04b6543957713c1f26a68678a0982d" gracePeriod=15 Apr 17 14:44:11.067015 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.066957 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f8756f569-wqvjz_b10bac76-d2a9-42fe-a0b9-20f63c877990/console/0.log" Apr 17 14:44:11.067126 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.067024 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:44:11.082933 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.082875 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b10bac76-d2a9-42fe-a0b9-20f63c877990-console-serving-cert\") pod \"b10bac76-d2a9-42fe-a0b9-20f63c877990\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " Apr 17 14:44:11.082933 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.082908 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg4kt\" (UniqueName: \"kubernetes.io/projected/b10bac76-d2a9-42fe-a0b9-20f63c877990-kube-api-access-dg4kt\") pod \"b10bac76-d2a9-42fe-a0b9-20f63c877990\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " Apr 17 14:44:11.082933 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.082929 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b10bac76-d2a9-42fe-a0b9-20f63c877990-trusted-ca-bundle\") pod \"b10bac76-d2a9-42fe-a0b9-20f63c877990\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " Apr 17 14:44:11.083166 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.082959 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b10bac76-d2a9-42fe-a0b9-20f63c877990-console-config\") pod \"b10bac76-d2a9-42fe-a0b9-20f63c877990\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " Apr 17 14:44:11.083307 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.083278 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b10bac76-d2a9-42fe-a0b9-20f63c877990-console-oauth-config\") pod \"b10bac76-d2a9-42fe-a0b9-20f63c877990\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " Apr 17 14:44:11.083431 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.083341 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b10bac76-d2a9-42fe-a0b9-20f63c877990-service-ca\") pod \"b10bac76-d2a9-42fe-a0b9-20f63c877990\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " Apr 17 14:44:11.083431 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.083372 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b10bac76-d2a9-42fe-a0b9-20f63c877990-oauth-serving-cert\") pod \"b10bac76-d2a9-42fe-a0b9-20f63c877990\" (UID: \"b10bac76-d2a9-42fe-a0b9-20f63c877990\") " Apr 17 14:44:11.083431 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.083319 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b10bac76-d2a9-42fe-a0b9-20f63c877990-console-config" (OuterVolumeSpecName: "console-config") pod "b10bac76-d2a9-42fe-a0b9-20f63c877990" 
(UID: "b10bac76-d2a9-42fe-a0b9-20f63c877990"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:44:11.083608 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.083496 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b10bac76-d2a9-42fe-a0b9-20f63c877990-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b10bac76-d2a9-42fe-a0b9-20f63c877990" (UID: "b10bac76-d2a9-42fe-a0b9-20f63c877990"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:44:11.083699 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.083683 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b10bac76-d2a9-42fe-a0b9-20f63c877990-trusted-ca-bundle\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\"" Apr 17 14:44:11.083768 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.083703 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b10bac76-d2a9-42fe-a0b9-20f63c877990-console-config\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\"" Apr 17 14:44:11.083768 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.083714 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b10bac76-d2a9-42fe-a0b9-20f63c877990-service-ca" (OuterVolumeSpecName: "service-ca") pod "b10bac76-d2a9-42fe-a0b9-20f63c877990" (UID: "b10bac76-d2a9-42fe-a0b9-20f63c877990"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:44:11.083884 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.083775 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b10bac76-d2a9-42fe-a0b9-20f63c877990-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b10bac76-d2a9-42fe-a0b9-20f63c877990" (UID: "b10bac76-d2a9-42fe-a0b9-20f63c877990"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:44:11.085762 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.085718 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b10bac76-d2a9-42fe-a0b9-20f63c877990-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b10bac76-d2a9-42fe-a0b9-20f63c877990" (UID: "b10bac76-d2a9-42fe-a0b9-20f63c877990"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:44:11.085762 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.085750 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b10bac76-d2a9-42fe-a0b9-20f63c877990-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b10bac76-d2a9-42fe-a0b9-20f63c877990" (UID: "b10bac76-d2a9-42fe-a0b9-20f63c877990"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:44:11.086364 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.086322 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b10bac76-d2a9-42fe-a0b9-20f63c877990-kube-api-access-dg4kt" (OuterVolumeSpecName: "kube-api-access-dg4kt") pod "b10bac76-d2a9-42fe-a0b9-20f63c877990" (UID: "b10bac76-d2a9-42fe-a0b9-20f63c877990"). InnerVolumeSpecName "kube-api-access-dg4kt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:44:11.124618 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.124595 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f8756f569-wqvjz_b10bac76-d2a9-42fe-a0b9-20f63c877990/console/0.log" Apr 17 14:44:11.124749 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.124635 2570 generic.go:358] "Generic (PLEG): container finished" podID="b10bac76-d2a9-42fe-a0b9-20f63c877990" containerID="47c558a145c58452c22daf162a3fab222e04b6543957713c1f26a68678a0982d" exitCode=2 Apr 17 14:44:11.124749 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.124730 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f8756f569-wqvjz" event={"ID":"b10bac76-d2a9-42fe-a0b9-20f63c877990","Type":"ContainerDied","Data":"47c558a145c58452c22daf162a3fab222e04b6543957713c1f26a68678a0982d"} Apr 17 14:44:11.124843 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.124745 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f8756f569-wqvjz" Apr 17 14:44:11.124843 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.124768 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f8756f569-wqvjz" event={"ID":"b10bac76-d2a9-42fe-a0b9-20f63c877990","Type":"ContainerDied","Data":"8fc7552671b52292bb0c62fe826b959b7b5c512ec7e95b1f6674d3467643993f"} Apr 17 14:44:11.124843 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.124789 2570 scope.go:117] "RemoveContainer" containerID="47c558a145c58452c22daf162a3fab222e04b6543957713c1f26a68678a0982d" Apr 17 14:44:11.134027 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.134012 2570 scope.go:117] "RemoveContainer" containerID="47c558a145c58452c22daf162a3fab222e04b6543957713c1f26a68678a0982d" Apr 17 14:44:11.134553 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:44:11.134530 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47c558a145c58452c22daf162a3fab222e04b6543957713c1f26a68678a0982d\": container with ID starting with 47c558a145c58452c22daf162a3fab222e04b6543957713c1f26a68678a0982d not found: ID does not exist" containerID="47c558a145c58452c22daf162a3fab222e04b6543957713c1f26a68678a0982d" Apr 17 14:44:11.134660 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.134558 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47c558a145c58452c22daf162a3fab222e04b6543957713c1f26a68678a0982d"} err="failed to get container status \"47c558a145c58452c22daf162a3fab222e04b6543957713c1f26a68678a0982d\": rpc error: code = NotFound desc = could not find container \"47c558a145c58452c22daf162a3fab222e04b6543957713c1f26a68678a0982d\": container with ID starting with 47c558a145c58452c22daf162a3fab222e04b6543957713c1f26a68678a0982d not found: ID does not exist" Apr 17 14:44:11.155210 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.155176 2570 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f8756f569-wqvjz"] Apr 17 14:44:11.159330 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.159286 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5f8756f569-wqvjz"] Apr 17 14:44:11.184404 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.184379 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dg4kt\" (UniqueName: \"kubernetes.io/projected/b10bac76-d2a9-42fe-a0b9-20f63c877990-kube-api-access-dg4kt\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\"" Apr 17 14:44:11.184404 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.184404 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b10bac76-d2a9-42fe-a0b9-20f63c877990-console-oauth-config\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\"" Apr 17 14:44:11.184596 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.184415 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b10bac76-d2a9-42fe-a0b9-20f63c877990-service-ca\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\"" Apr 17 14:44:11.184596 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.184424 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b10bac76-d2a9-42fe-a0b9-20f63c877990-oauth-serving-cert\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\"" Apr 17 14:44:11.184596 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:11.184433 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b10bac76-d2a9-42fe-a0b9-20f63c877990-console-serving-cert\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\"" Apr 17 14:44:12.240482 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:12.240444 2570 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="b10bac76-d2a9-42fe-a0b9-20f63c877990" path="/var/lib/kubelet/pods/b10bac76-d2a9-42fe-a0b9-20f63c877990/volumes" Apr 17 14:44:28.188771 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:28.188739 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xj54s" event={"ID":"97f76fcc-9592-4845-9cef-a42c1d09ddca","Type":"ContainerStarted","Data":"39d33f27f6e9b81911d757b08c898d25c7bae9f2ac4d342fca13c7fdd0a3b07e"} Apr 17 14:44:28.206037 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:28.205994 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xj54s" podStartSLOduration=2.813695046 podStartE2EDuration="26.205982815s" podCreationTimestamp="2026-04-17 14:44:02 +0000 UTC" firstStartedPulling="2026-04-17 14:44:03.76817066 +0000 UTC m=+610.124898262" lastFinishedPulling="2026-04-17 14:44:27.160458415 +0000 UTC m=+633.517186031" observedRunningTime="2026-04-17 14:44:28.203939539 +0000 UTC m=+634.560667165" watchObservedRunningTime="2026-04-17 14:44:28.205982815 +0000 UTC m=+634.562710439" Apr 17 14:44:49.335321 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:49.335286 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-mm5sn"] Apr 17 14:44:49.335744 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:49.335581 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b10bac76-d2a9-42fe-a0b9-20f63c877990" containerName="console" Apr 17 14:44:49.335744 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:49.335592 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="b10bac76-d2a9-42fe-a0b9-20f63c877990" containerName="console" Apr 17 14:44:49.335744 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:49.335656 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="b10bac76-d2a9-42fe-a0b9-20f63c877990" containerName="console" 
Apr 17 14:44:49.461958 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:49.461923 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-mm5sn"] Apr 17 14:44:49.461958 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:49.461963 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-mm5sn"] Apr 17 14:44:49.462192 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:49.461981 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-mm5sn" Apr 17 14:44:49.464932 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:49.464912 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 17 14:44:49.516909 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:49.516877 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/46a95ade-4532-4f45-80c5-72c44ed44e09-config-file\") pod \"limitador-limitador-7d549b5b-mm5sn\" (UID: \"46a95ade-4532-4f45-80c5-72c44ed44e09\") " pod="kuadrant-system/limitador-limitador-7d549b5b-mm5sn" Apr 17 14:44:49.517045 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:49.516965 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75sd8\" (UniqueName: \"kubernetes.io/projected/46a95ade-4532-4f45-80c5-72c44ed44e09-kube-api-access-75sd8\") pod \"limitador-limitador-7d549b5b-mm5sn\" (UID: \"46a95ade-4532-4f45-80c5-72c44ed44e09\") " pod="kuadrant-system/limitador-limitador-7d549b5b-mm5sn" Apr 17 14:44:49.617605 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:49.617533 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75sd8\" (UniqueName: 
\"kubernetes.io/projected/46a95ade-4532-4f45-80c5-72c44ed44e09-kube-api-access-75sd8\") pod \"limitador-limitador-7d549b5b-mm5sn\" (UID: \"46a95ade-4532-4f45-80c5-72c44ed44e09\") " pod="kuadrant-system/limitador-limitador-7d549b5b-mm5sn" Apr 17 14:44:49.617605 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:49.617594 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/46a95ade-4532-4f45-80c5-72c44ed44e09-config-file\") pod \"limitador-limitador-7d549b5b-mm5sn\" (UID: \"46a95ade-4532-4f45-80c5-72c44ed44e09\") " pod="kuadrant-system/limitador-limitador-7d549b5b-mm5sn" Apr 17 14:44:49.618136 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:49.618119 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/46a95ade-4532-4f45-80c5-72c44ed44e09-config-file\") pod \"limitador-limitador-7d549b5b-mm5sn\" (UID: \"46a95ade-4532-4f45-80c5-72c44ed44e09\") " pod="kuadrant-system/limitador-limitador-7d549b5b-mm5sn" Apr 17 14:44:49.625459 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:49.625437 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75sd8\" (UniqueName: \"kubernetes.io/projected/46a95ade-4532-4f45-80c5-72c44ed44e09-kube-api-access-75sd8\") pod \"limitador-limitador-7d549b5b-mm5sn\" (UID: \"46a95ade-4532-4f45-80c5-72c44ed44e09\") " pod="kuadrant-system/limitador-limitador-7d549b5b-mm5sn" Apr 17 14:44:49.772808 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:49.772771 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-mm5sn" Apr 17 14:44:49.888408 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:49.888382 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-mm5sn"] Apr 17 14:44:49.890807 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:44:49.890779 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46a95ade_4532_4f45_80c5_72c44ed44e09.slice/crio-6c4782b336304a0fb633d969d48d7a3ff033c6937892e80b1722a96113f94d8e WatchSource:0}: Error finding container 6c4782b336304a0fb633d969d48d7a3ff033c6937892e80b1722a96113f94d8e: Status 404 returned error can't find the container with id 6c4782b336304a0fb633d969d48d7a3ff033c6937892e80b1722a96113f94d8e Apr 17 14:44:49.892411 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:49.892394 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:44:50.148074 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:50.147993 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-5f7nm"] Apr 17 14:44:50.196342 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:50.196306 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-5f7nm"] Apr 17 14:44:50.196495 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:50.196457 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-5f7nm" Apr 17 14:44:50.200256 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:50.200219 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-6slcb\"" Apr 17 14:44:50.223155 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:50.223129 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfs7v\" (UniqueName: \"kubernetes.io/projected/f63e8147-9064-422b-bab4-031af9edfa24-kube-api-access-bfs7v\") pod \"authorino-f99f4b5cd-5f7nm\" (UID: \"f63e8147-9064-422b-bab4-031af9edfa24\") " pod="kuadrant-system/authorino-f99f4b5cd-5f7nm" Apr 17 14:44:50.257512 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:50.257480 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-mm5sn" event={"ID":"46a95ade-4532-4f45-80c5-72c44ed44e09","Type":"ContainerStarted","Data":"6c4782b336304a0fb633d969d48d7a3ff033c6937892e80b1722a96113f94d8e"} Apr 17 14:44:50.324218 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:50.324187 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfs7v\" (UniqueName: \"kubernetes.io/projected/f63e8147-9064-422b-bab4-031af9edfa24-kube-api-access-bfs7v\") pod \"authorino-f99f4b5cd-5f7nm\" (UID: \"f63e8147-9064-422b-bab4-031af9edfa24\") " pod="kuadrant-system/authorino-f99f4b5cd-5f7nm" Apr 17 14:44:50.332843 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:50.332815 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfs7v\" (UniqueName: \"kubernetes.io/projected/f63e8147-9064-422b-bab4-031af9edfa24-kube-api-access-bfs7v\") pod \"authorino-f99f4b5cd-5f7nm\" (UID: \"f63e8147-9064-422b-bab4-031af9edfa24\") " pod="kuadrant-system/authorino-f99f4b5cd-5f7nm" Apr 17 14:44:50.506858 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:50.506769 
2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-5f7nm" Apr 17 14:44:50.662959 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:50.662899 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-5f7nm"] Apr 17 14:44:50.665403 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:44:50.665369 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf63e8147_9064_422b_bab4_031af9edfa24.slice/crio-8fd72d564d021d5e4bcf4a3b720fd9133bba1a08f89122f6b108bb7004de2adf WatchSource:0}: Error finding container 8fd72d564d021d5e4bcf4a3b720fd9133bba1a08f89122f6b108bb7004de2adf: Status 404 returned error can't find the container with id 8fd72d564d021d5e4bcf4a3b720fd9133bba1a08f89122f6b108bb7004de2adf Apr 17 14:44:51.262654 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:51.262611 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-5f7nm" event={"ID":"f63e8147-9064-422b-bab4-031af9edfa24","Type":"ContainerStarted","Data":"8fd72d564d021d5e4bcf4a3b720fd9133bba1a08f89122f6b108bb7004de2adf"} Apr 17 14:44:54.066704 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:54.066666 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-5f7nm"] Apr 17 14:44:55.278859 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:55.278819 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-mm5sn" event={"ID":"46a95ade-4532-4f45-80c5-72c44ed44e09","Type":"ContainerStarted","Data":"54f1bec14fa478c359e30647efaa0b29f84c7f0ecae4f73c7064e90392a2c50d"} Apr 17 14:44:55.279335 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:55.278877 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-mm5sn" Apr 17 14:44:55.280136 ip-10-0-129-134 
kubenswrapper[2570]: I0417 14:44:55.280113 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-5f7nm" event={"ID":"f63e8147-9064-422b-bab4-031af9edfa24","Type":"ContainerStarted","Data":"746e7b1567b75dd7a8c3b834f5dbff23fce9e7769ae21731b3bb35070d847948"} Apr 17 14:44:55.280208 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:55.280187 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-5f7nm" podUID="f63e8147-9064-422b-bab4-031af9edfa24" containerName="authorino" containerID="cri-o://746e7b1567b75dd7a8c3b834f5dbff23fce9e7769ae21731b3bb35070d847948" gracePeriod=30 Apr 17 14:44:55.293891 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:55.293838 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-mm5sn" podStartSLOduration=1.258679162 podStartE2EDuration="6.293824744s" podCreationTimestamp="2026-04-17 14:44:49 +0000 UTC" firstStartedPulling="2026-04-17 14:44:49.892519813 +0000 UTC m=+656.249247416" lastFinishedPulling="2026-04-17 14:44:54.927665385 +0000 UTC m=+661.284392998" observedRunningTime="2026-04-17 14:44:55.293672001 +0000 UTC m=+661.650399627" watchObservedRunningTime="2026-04-17 14:44:55.293824744 +0000 UTC m=+661.650552578" Apr 17 14:44:55.307780 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:55.307739 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-5f7nm" podStartSLOduration=1.184126263 podStartE2EDuration="5.307727717s" podCreationTimestamp="2026-04-17 14:44:50 +0000 UTC" firstStartedPulling="2026-04-17 14:44:50.667457229 +0000 UTC m=+657.024184836" lastFinishedPulling="2026-04-17 14:44:54.791058684 +0000 UTC m=+661.147786290" observedRunningTime="2026-04-17 14:44:55.306085655 +0000 UTC m=+661.662813281" watchObservedRunningTime="2026-04-17 14:44:55.307727717 +0000 UTC m=+661.664455342" Apr 17 14:44:55.525737 
ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:55.525711 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-5f7nm" Apr 17 14:44:55.569463 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:55.569382 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfs7v\" (UniqueName: \"kubernetes.io/projected/f63e8147-9064-422b-bab4-031af9edfa24-kube-api-access-bfs7v\") pod \"f63e8147-9064-422b-bab4-031af9edfa24\" (UID: \"f63e8147-9064-422b-bab4-031af9edfa24\") " Apr 17 14:44:55.571682 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:55.571655 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f63e8147-9064-422b-bab4-031af9edfa24-kube-api-access-bfs7v" (OuterVolumeSpecName: "kube-api-access-bfs7v") pod "f63e8147-9064-422b-bab4-031af9edfa24" (UID: "f63e8147-9064-422b-bab4-031af9edfa24"). InnerVolumeSpecName "kube-api-access-bfs7v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:44:55.670234 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:55.670191 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bfs7v\" (UniqueName: \"kubernetes.io/projected/f63e8147-9064-422b-bab4-031af9edfa24-kube-api-access-bfs7v\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\"" Apr 17 14:44:56.283959 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:56.283925 2570 generic.go:358] "Generic (PLEG): container finished" podID="f63e8147-9064-422b-bab4-031af9edfa24" containerID="746e7b1567b75dd7a8c3b834f5dbff23fce9e7769ae21731b3bb35070d847948" exitCode=0 Apr 17 14:44:56.284382 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:56.283975 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-5f7nm" Apr 17 14:44:56.284382 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:56.284009 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-5f7nm" event={"ID":"f63e8147-9064-422b-bab4-031af9edfa24","Type":"ContainerDied","Data":"746e7b1567b75dd7a8c3b834f5dbff23fce9e7769ae21731b3bb35070d847948"} Apr 17 14:44:56.284382 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:56.284043 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-5f7nm" event={"ID":"f63e8147-9064-422b-bab4-031af9edfa24","Type":"ContainerDied","Data":"8fd72d564d021d5e4bcf4a3b720fd9133bba1a08f89122f6b108bb7004de2adf"} Apr 17 14:44:56.284382 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:56.284058 2570 scope.go:117] "RemoveContainer" containerID="746e7b1567b75dd7a8c3b834f5dbff23fce9e7769ae21731b3bb35070d847948" Apr 17 14:44:56.291829 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:56.291803 2570 scope.go:117] "RemoveContainer" containerID="746e7b1567b75dd7a8c3b834f5dbff23fce9e7769ae21731b3bb35070d847948" Apr 17 14:44:56.292103 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:44:56.292081 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"746e7b1567b75dd7a8c3b834f5dbff23fce9e7769ae21731b3bb35070d847948\": container with ID starting with 746e7b1567b75dd7a8c3b834f5dbff23fce9e7769ae21731b3bb35070d847948 not found: ID does not exist" containerID="746e7b1567b75dd7a8c3b834f5dbff23fce9e7769ae21731b3bb35070d847948" Apr 17 14:44:56.292141 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:56.292114 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"746e7b1567b75dd7a8c3b834f5dbff23fce9e7769ae21731b3bb35070d847948"} err="failed to get container status \"746e7b1567b75dd7a8c3b834f5dbff23fce9e7769ae21731b3bb35070d847948\": rpc error: code = 
NotFound desc = could not find container \"746e7b1567b75dd7a8c3b834f5dbff23fce9e7769ae21731b3bb35070d847948\": container with ID starting with 746e7b1567b75dd7a8c3b834f5dbff23fce9e7769ae21731b3bb35070d847948 not found: ID does not exist" Apr 17 14:44:56.299523 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:56.299500 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-5f7nm"] Apr 17 14:44:56.303332 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:56.303312 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-5f7nm"] Apr 17 14:44:58.238920 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:44:58.238883 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f63e8147-9064-422b-bab4-031af9edfa24" path="/var/lib/kubelet/pods/f63e8147-9064-422b-bab4-031af9edfa24/volumes" Apr 17 14:45:03.784657 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:03.784621 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-mm5sn"] Apr 17 14:45:03.785102 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:03.784843 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-mm5sn" podUID="46a95ade-4532-4f45-80c5-72c44ed44e09" containerName="limitador" containerID="cri-o://54f1bec14fa478c359e30647efaa0b29f84c7f0ecae4f73c7064e90392a2c50d" gracePeriod=30 Apr 17 14:45:03.786565 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:03.786544 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-mm5sn" Apr 17 14:45:04.310798 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:04.310706 2570 generic.go:358] "Generic (PLEG): container finished" podID="46a95ade-4532-4f45-80c5-72c44ed44e09" containerID="54f1bec14fa478c359e30647efaa0b29f84c7f0ecae4f73c7064e90392a2c50d" exitCode=0 Apr 17 14:45:04.310929 ip-10-0-129-134 
kubenswrapper[2570]: I0417 14:45:04.310784 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-mm5sn" event={"ID":"46a95ade-4532-4f45-80c5-72c44ed44e09","Type":"ContainerDied","Data":"54f1bec14fa478c359e30647efaa0b29f84c7f0ecae4f73c7064e90392a2c50d"} Apr 17 14:45:04.310929 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:04.310829 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-mm5sn" event={"ID":"46a95ade-4532-4f45-80c5-72c44ed44e09","Type":"ContainerDied","Data":"6c4782b336304a0fb633d969d48d7a3ff033c6937892e80b1722a96113f94d8e"} Apr 17 14:45:04.310929 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:04.310844 2570 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c4782b336304a0fb633d969d48d7a3ff033c6937892e80b1722a96113f94d8e" Apr 17 14:45:04.322946 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:04.322925 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-mm5sn" Apr 17 14:45:04.443377 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:04.443284 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/46a95ade-4532-4f45-80c5-72c44ed44e09-config-file\") pod \"46a95ade-4532-4f45-80c5-72c44ed44e09\" (UID: \"46a95ade-4532-4f45-80c5-72c44ed44e09\") " Apr 17 14:45:04.443377 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:04.443353 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75sd8\" (UniqueName: \"kubernetes.io/projected/46a95ade-4532-4f45-80c5-72c44ed44e09-kube-api-access-75sd8\") pod \"46a95ade-4532-4f45-80c5-72c44ed44e09\" (UID: \"46a95ade-4532-4f45-80c5-72c44ed44e09\") " Apr 17 14:45:04.443713 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:04.443689 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46a95ade-4532-4f45-80c5-72c44ed44e09-config-file" (OuterVolumeSpecName: "config-file") pod "46a95ade-4532-4f45-80c5-72c44ed44e09" (UID: "46a95ade-4532-4f45-80c5-72c44ed44e09"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:45:04.445376 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:04.445355 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46a95ade-4532-4f45-80c5-72c44ed44e09-kube-api-access-75sd8" (OuterVolumeSpecName: "kube-api-access-75sd8") pod "46a95ade-4532-4f45-80c5-72c44ed44e09" (UID: "46a95ade-4532-4f45-80c5-72c44ed44e09"). InnerVolumeSpecName "kube-api-access-75sd8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:45:04.544615 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:04.544585 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-75sd8\" (UniqueName: \"kubernetes.io/projected/46a95ade-4532-4f45-80c5-72c44ed44e09-kube-api-access-75sd8\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\"" Apr 17 14:45:04.544615 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:04.544613 2570 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/46a95ade-4532-4f45-80c5-72c44ed44e09-config-file\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\"" Apr 17 14:45:05.314332 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:05.314295 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-mm5sn" Apr 17 14:45:05.334000 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:05.333967 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-rr2w2"] Apr 17 14:45:05.334336 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:05.334320 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="46a95ade-4532-4f45-80c5-72c44ed44e09" containerName="limitador" Apr 17 14:45:05.334336 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:05.334335 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="46a95ade-4532-4f45-80c5-72c44ed44e09" containerName="limitador" Apr 17 14:45:05.334494 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:05.334345 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f63e8147-9064-422b-bab4-031af9edfa24" containerName="authorino" Apr 17 14:45:05.334494 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:05.334351 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63e8147-9064-422b-bab4-031af9edfa24" containerName="authorino" Apr 17 14:45:05.334494 
ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:05.334410 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="46a95ade-4532-4f45-80c5-72c44ed44e09" containerName="limitador" Apr 17 14:45:05.334494 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:05.334421 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="f63e8147-9064-422b-bab4-031af9edfa24" containerName="authorino" Apr 17 14:45:05.336543 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:05.336521 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-rr2w2" Apr 17 14:45:05.339182 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:05.339163 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 17 14:45:05.339385 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:05.339164 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-9dpnj\"" Apr 17 14:45:05.344271 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:05.344234 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-mm5sn"] Apr 17 14:45:05.347750 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:05.347726 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-rr2w2"] Apr 17 14:45:05.349805 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:05.349781 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-mm5sn"] Apr 17 14:45:05.451470 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:05.451428 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdb5z\" (UniqueName: \"kubernetes.io/projected/4952c4ad-a6d9-4658-b9e4-3bcd9971f71a-kube-api-access-wdb5z\") pod \"postgres-868db5846d-rr2w2\" (UID: \"4952c4ad-a6d9-4658-b9e4-3bcd9971f71a\") " 
pod="opendatahub/postgres-868db5846d-rr2w2" Apr 17 14:45:05.451470 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:05.451477 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4952c4ad-a6d9-4658-b9e4-3bcd9971f71a-data\") pod \"postgres-868db5846d-rr2w2\" (UID: \"4952c4ad-a6d9-4658-b9e4-3bcd9971f71a\") " pod="opendatahub/postgres-868db5846d-rr2w2" Apr 17 14:45:05.552594 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:05.552548 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wdb5z\" (UniqueName: \"kubernetes.io/projected/4952c4ad-a6d9-4658-b9e4-3bcd9971f71a-kube-api-access-wdb5z\") pod \"postgres-868db5846d-rr2w2\" (UID: \"4952c4ad-a6d9-4658-b9e4-3bcd9971f71a\") " pod="opendatahub/postgres-868db5846d-rr2w2" Apr 17 14:45:05.552775 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:05.552603 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4952c4ad-a6d9-4658-b9e4-3bcd9971f71a-data\") pod \"postgres-868db5846d-rr2w2\" (UID: \"4952c4ad-a6d9-4658-b9e4-3bcd9971f71a\") " pod="opendatahub/postgres-868db5846d-rr2w2" Apr 17 14:45:05.553058 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:05.553037 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4952c4ad-a6d9-4658-b9e4-3bcd9971f71a-data\") pod \"postgres-868db5846d-rr2w2\" (UID: \"4952c4ad-a6d9-4658-b9e4-3bcd9971f71a\") " pod="opendatahub/postgres-868db5846d-rr2w2" Apr 17 14:45:05.560428 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:05.560406 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdb5z\" (UniqueName: \"kubernetes.io/projected/4952c4ad-a6d9-4658-b9e4-3bcd9971f71a-kube-api-access-wdb5z\") pod \"postgres-868db5846d-rr2w2\" (UID: \"4952c4ad-a6d9-4658-b9e4-3bcd9971f71a\") " 
pod="opendatahub/postgres-868db5846d-rr2w2" Apr 17 14:45:05.649756 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:05.649653 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-rr2w2" Apr 17 14:45:05.766656 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:05.766623 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-rr2w2"] Apr 17 14:45:05.768960 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:45:05.768933 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4952c4ad_a6d9_4658_b9e4_3bcd9971f71a.slice/crio-fdb85e5e22360b026d798c51f3d6e9b9f52d31218ac46644659ae792c1e65752 WatchSource:0}: Error finding container fdb85e5e22360b026d798c51f3d6e9b9f52d31218ac46644659ae792c1e65752: Status 404 returned error can't find the container with id fdb85e5e22360b026d798c51f3d6e9b9f52d31218ac46644659ae792c1e65752 Apr 17 14:45:06.239379 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:06.239345 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46a95ade-4532-4f45-80c5-72c44ed44e09" path="/var/lib/kubelet/pods/46a95ade-4532-4f45-80c5-72c44ed44e09/volumes" Apr 17 14:45:06.318444 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:06.318407 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-rr2w2" event={"ID":"4952c4ad-a6d9-4658-b9e4-3bcd9971f71a","Type":"ContainerStarted","Data":"fdb85e5e22360b026d798c51f3d6e9b9f52d31218ac46644659ae792c1e65752"} Apr 17 14:45:11.337456 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:11.337422 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-rr2w2" event={"ID":"4952c4ad-a6d9-4658-b9e4-3bcd9971f71a","Type":"ContainerStarted","Data":"8864052eacb6ccedb2b4e2eace49da478f78273caa65a8f711d8ded6f1a682b5"} Apr 17 14:45:11.337880 ip-10-0-129-134 kubenswrapper[2570]: I0417 
14:45:11.337544 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-rr2w2" Apr 17 14:45:11.352301 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:11.352230 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-rr2w2" podStartSLOduration=0.959675081 podStartE2EDuration="6.352212597s" podCreationTimestamp="2026-04-17 14:45:05 +0000 UTC" firstStartedPulling="2026-04-17 14:45:05.770266182 +0000 UTC m=+672.126993785" lastFinishedPulling="2026-04-17 14:45:11.162803698 +0000 UTC m=+677.519531301" observedRunningTime="2026-04-17 14:45:11.351611513 +0000 UTC m=+677.708339139" watchObservedRunningTime="2026-04-17 14:45:11.352212597 +0000 UTC m=+677.708940223" Apr 17 14:45:17.370199 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:17.370160 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-rr2w2" Apr 17 14:45:18.154912 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.154879 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-7b85867767-4cxkh"] Apr 17 14:45:18.158378 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.158357 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7b85867767-4cxkh" Apr 17 14:45:18.161450 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.161422 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 17 14:45:18.161567 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.161548 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 17 14:45:18.161676 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.161660 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-6gnrl\"" Apr 17 14:45:18.169824 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.169796 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7b85867767-4cxkh"] Apr 17 14:45:18.181422 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.181399 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-7cb75c8495-dnv2h"] Apr 17 14:45:18.185091 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.185070 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7cb75c8495-dnv2h" Apr 17 14:45:18.187821 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.187802 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-7pfsd\"" Apr 17 14:45:18.194400 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.194381 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7cb75c8495-dnv2h"] Apr 17 14:45:18.266983 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.266952 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7b32ea11-1e79-4868-8c2b-a5b441595373-maas-api-tls\") pod \"maas-api-7b85867767-4cxkh\" (UID: \"7b32ea11-1e79-4868-8c2b-a5b441595373\") " pod="opendatahub/maas-api-7b85867767-4cxkh" Apr 17 14:45:18.267146 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.267007 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz9n5\" (UniqueName: \"kubernetes.io/projected/7b32ea11-1e79-4868-8c2b-a5b441595373-kube-api-access-hz9n5\") pod \"maas-api-7b85867767-4cxkh\" (UID: \"7b32ea11-1e79-4868-8c2b-a5b441595373\") " pod="opendatahub/maas-api-7b85867767-4cxkh" Apr 17 14:45:18.267146 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.267088 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7wv8\" (UniqueName: \"kubernetes.io/projected/60f19f03-1ddf-4ad4-8118-b0a21cdc32aa-kube-api-access-n7wv8\") pod \"maas-controller-7cb75c8495-dnv2h\" (UID: \"60f19f03-1ddf-4ad4-8118-b0a21cdc32aa\") " pod="opendatahub/maas-controller-7cb75c8495-dnv2h" Apr 17 14:45:18.367776 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.367741 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hz9n5\" (UniqueName: 
\"kubernetes.io/projected/7b32ea11-1e79-4868-8c2b-a5b441595373-kube-api-access-hz9n5\") pod \"maas-api-7b85867767-4cxkh\" (UID: \"7b32ea11-1e79-4868-8c2b-a5b441595373\") " pod="opendatahub/maas-api-7b85867767-4cxkh" Apr 17 14:45:18.367982 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.367801 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7wv8\" (UniqueName: \"kubernetes.io/projected/60f19f03-1ddf-4ad4-8118-b0a21cdc32aa-kube-api-access-n7wv8\") pod \"maas-controller-7cb75c8495-dnv2h\" (UID: \"60f19f03-1ddf-4ad4-8118-b0a21cdc32aa\") " pod="opendatahub/maas-controller-7cb75c8495-dnv2h" Apr 17 14:45:18.367982 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.367972 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7b32ea11-1e79-4868-8c2b-a5b441595373-maas-api-tls\") pod \"maas-api-7b85867767-4cxkh\" (UID: \"7b32ea11-1e79-4868-8c2b-a5b441595373\") " pod="opendatahub/maas-api-7b85867767-4cxkh" Apr 17 14:45:18.370435 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.370406 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7b32ea11-1e79-4868-8c2b-a5b441595373-maas-api-tls\") pod \"maas-api-7b85867767-4cxkh\" (UID: \"7b32ea11-1e79-4868-8c2b-a5b441595373\") " pod="opendatahub/maas-api-7b85867767-4cxkh" Apr 17 14:45:18.375992 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.375966 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz9n5\" (UniqueName: \"kubernetes.io/projected/7b32ea11-1e79-4868-8c2b-a5b441595373-kube-api-access-hz9n5\") pod \"maas-api-7b85867767-4cxkh\" (UID: \"7b32ea11-1e79-4868-8c2b-a5b441595373\") " pod="opendatahub/maas-api-7b85867767-4cxkh" Apr 17 14:45:18.376106 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.375964 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-n7wv8\" (UniqueName: \"kubernetes.io/projected/60f19f03-1ddf-4ad4-8118-b0a21cdc32aa-kube-api-access-n7wv8\") pod \"maas-controller-7cb75c8495-dnv2h\" (UID: \"60f19f03-1ddf-4ad4-8118-b0a21cdc32aa\") " pod="opendatahub/maas-controller-7cb75c8495-dnv2h" Apr 17 14:45:18.469599 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.469506 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7b85867767-4cxkh" Apr 17 14:45:18.497466 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.497428 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7cb75c8495-dnv2h" Apr 17 14:45:18.632919 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.632843 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7b85867767-4cxkh"] Apr 17 14:45:18.642162 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:45:18.642112 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b32ea11_1e79_4868_8c2b_a5b441595373.slice/crio-2e6ff7b842adfa636e229c0e1e44dffdf2727c2d8f3bfb7f85d7200610285e6a WatchSource:0}: Error finding container 2e6ff7b842adfa636e229c0e1e44dffdf2727c2d8f3bfb7f85d7200610285e6a: Status 404 returned error can't find the container with id 2e6ff7b842adfa636e229c0e1e44dffdf2727c2d8f3bfb7f85d7200610285e6a Apr 17 14:45:18.661561 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.661539 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7cb75c8495-dnv2h"] Apr 17 14:45:18.665018 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:45:18.664981 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60f19f03_1ddf_4ad4_8118_b0a21cdc32aa.slice/crio-2c0b17dbdd238b96f13613817c2f88bb094a686816d07cd5ad50b2c84d932e65 WatchSource:0}: Error finding container 
2c0b17dbdd238b96f13613817c2f88bb094a686816d07cd5ad50b2c84d932e65: Status 404 returned error can't find the container with id 2c0b17dbdd238b96f13613817c2f88bb094a686816d07cd5ad50b2c84d932e65 Apr 17 14:45:18.843629 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.843594 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-mz5jz"] Apr 17 14:45:18.848327 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.848303 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-mz5jz" Apr 17 14:45:18.850863 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.850840 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-6slcb\"" Apr 17 14:45:18.855508 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.855468 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-mz5jz"] Apr 17 14:45:18.973056 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.973017 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmhwr\" (UniqueName: \"kubernetes.io/projected/1892b6a8-4bbc-4b41-a42f-c5ceaa67b142-kube-api-access-fmhwr\") pod \"authorino-8b475cf9f-mz5jz\" (UID: \"1892b6a8-4bbc-4b41-a42f-c5ceaa67b142\") " pod="kuadrant-system/authorino-8b475cf9f-mz5jz" Apr 17 14:45:18.988306 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.983068 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-678768785c-74wq8"] Apr 17 14:45:18.989012 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:18.988978 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-678768785c-74wq8" Apr 17 14:45:19.003798 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.003767 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-678768785c-74wq8"] Apr 17 14:45:19.074013 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.073965 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmhwr\" (UniqueName: \"kubernetes.io/projected/1892b6a8-4bbc-4b41-a42f-c5ceaa67b142-kube-api-access-fmhwr\") pod \"authorino-8b475cf9f-mz5jz\" (UID: \"1892b6a8-4bbc-4b41-a42f-c5ceaa67b142\") " pod="kuadrant-system/authorino-8b475cf9f-mz5jz" Apr 17 14:45:19.074204 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.074074 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/70d01211-a2a6-4b53-ac2a-08255eb342d8-maas-api-tls\") pod \"maas-api-678768785c-74wq8\" (UID: \"70d01211-a2a6-4b53-ac2a-08255eb342d8\") " pod="opendatahub/maas-api-678768785c-74wq8" Apr 17 14:45:19.074204 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.074098 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khgd4\" (UniqueName: \"kubernetes.io/projected/70d01211-a2a6-4b53-ac2a-08255eb342d8-kube-api-access-khgd4\") pod \"maas-api-678768785c-74wq8\" (UID: \"70d01211-a2a6-4b53-ac2a-08255eb342d8\") " pod="opendatahub/maas-api-678768785c-74wq8" Apr 17 14:45:19.081891 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.081864 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmhwr\" (UniqueName: \"kubernetes.io/projected/1892b6a8-4bbc-4b41-a42f-c5ceaa67b142-kube-api-access-fmhwr\") pod \"authorino-8b475cf9f-mz5jz\" (UID: \"1892b6a8-4bbc-4b41-a42f-c5ceaa67b142\") " pod="kuadrant-system/authorino-8b475cf9f-mz5jz" Apr 17 14:45:19.085351 ip-10-0-129-134 kubenswrapper[2570]: 
I0417 14:45:19.085323 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-mz5jz"] Apr 17 14:45:19.085665 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.085652 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-mz5jz" Apr 17 14:45:19.167900 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:45:19.167857 2570 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: parsing image configuration: fetching blob: received unexpected HTTP status: 503 Service Unavailable; artifact err: provided artifact is a container image" image="quay.io/opendatahub/maas-api@sha256:1e0ceed47cc0b43982aa9dd20e8ba1b464e9e776337381c854f7497ab1b738dd" Apr 17 14:45:19.168224 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:45:19.168154 2570 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:maas-api,Image:quay.io/opendatahub/maas-api@sha256:1e0ceed47cc0b43982aa9dd20e8ba1b464e9e776337381c854f7497ab1b738dd,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:GATEWAY_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:maas-parameters,},Key:gateway-namespace,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:GATEWAY_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:maas-parameters,},Key:gateway-name,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:API_KEY_MAX_EXPIRATION_DAYS,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:maas-parameters,},Key:api-key-max-expiration-days,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:SECURE,Value:true,ValueFrom:nil,},EnvVar{Name:TLS_CERT,Value:/etc/maas-api/tls/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY,Value:/etc/maas-api/tls/tls.key,ValueFrom:nil,},EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:MAAS_SUBSCRIPTION_NAMESPACE,Value:models-as-a-service,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:maas-api-tls,ReadOnly:true,MountPath:/etc/maas-api/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hz9n5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{1 0 https},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{1 0 https},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:3,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod maas-api-7b85867767-4cxkh_opendatahub(7b32ea11-1e79-4868-8c2b-a5b441595373): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: parsing image configuration: fetching blob: received unexpected HTTP status: 503 Service Unavailable; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 17 
14:45:19.169396 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:45:19.169363 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"maas-api\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: parsing image configuration: fetching blob: received unexpected HTTP status: 503 Service Unavailable; artifact err: provided artifact is a container image\"" pod="opendatahub/maas-api-7b85867767-4cxkh" podUID="7b32ea11-1e79-4868-8c2b-a5b441595373" Apr 17 14:45:19.175036 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.175007 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/70d01211-a2a6-4b53-ac2a-08255eb342d8-maas-api-tls\") pod \"maas-api-678768785c-74wq8\" (UID: \"70d01211-a2a6-4b53-ac2a-08255eb342d8\") " pod="opendatahub/maas-api-678768785c-74wq8" Apr 17 14:45:19.175142 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.175060 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khgd4\" (UniqueName: \"kubernetes.io/projected/70d01211-a2a6-4b53-ac2a-08255eb342d8-kube-api-access-khgd4\") pod \"maas-api-678768785c-74wq8\" (UID: \"70d01211-a2a6-4b53-ac2a-08255eb342d8\") " pod="opendatahub/maas-api-678768785c-74wq8" Apr 17 14:45:19.177615 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.177589 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/70d01211-a2a6-4b53-ac2a-08255eb342d8-maas-api-tls\") pod \"maas-api-678768785c-74wq8\" (UID: \"70d01211-a2a6-4b53-ac2a-08255eb342d8\") " pod="opendatahub/maas-api-678768785c-74wq8" Apr 17 14:45:19.182335 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.182312 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khgd4\" (UniqueName: 
\"kubernetes.io/projected/70d01211-a2a6-4b53-ac2a-08255eb342d8-kube-api-access-khgd4\") pod \"maas-api-678768785c-74wq8\" (UID: \"70d01211-a2a6-4b53-ac2a-08255eb342d8\") " pod="opendatahub/maas-api-678768785c-74wq8" Apr 17 14:45:19.205194 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.205169 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-mz5jz"] Apr 17 14:45:19.207403 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:45:19.207373 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1892b6a8_4bbc_4b41_a42f_c5ceaa67b142.slice/crio-7d3eb8958edbc3baba7933b5c0e32e1d62da645ca5d0aad9b044bd8f243877bb WatchSource:0}: Error finding container 7d3eb8958edbc3baba7933b5c0e32e1d62da645ca5d0aad9b044bd8f243877bb: Status 404 returned error can't find the container with id 7d3eb8958edbc3baba7933b5c0e32e1d62da645ca5d0aad9b044bd8f243877bb Apr 17 14:45:19.299796 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.299756 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-678768785c-74wq8" Apr 17 14:45:19.313854 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.313825 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-676889fd78-94jbq"] Apr 17 14:45:19.317589 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.317569 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-676889fd78-94jbq" Apr 17 14:45:19.320754 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.320561 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 17 14:45:19.323063 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.323039 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-676889fd78-94jbq"] Apr 17 14:45:19.371045 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.370999 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7b85867767-4cxkh" event={"ID":"7b32ea11-1e79-4868-8c2b-a5b441595373","Type":"ContainerStarted","Data":"2e6ff7b842adfa636e229c0e1e44dffdf2727c2d8f3bfb7f85d7200610285e6a"} Apr 17 14:45:19.372142 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:45:19.371979 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"maas-api\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/maas-api@sha256:1e0ceed47cc0b43982aa9dd20e8ba1b464e9e776337381c854f7497ab1b738dd\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: parsing image configuration: fetching blob: received unexpected HTTP status: 503 Service Unavailable; artifact err: provided artifact is a container image\"" pod="opendatahub/maas-api-7b85867767-4cxkh" podUID="7b32ea11-1e79-4868-8c2b-a5b441595373" Apr 17 14:45:19.377710 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.375260 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-mz5jz" event={"ID":"1892b6a8-4bbc-4b41-a42f-c5ceaa67b142","Type":"ContainerStarted","Data":"7d3eb8958edbc3baba7933b5c0e32e1d62da645ca5d0aad9b044bd8f243877bb"} Apr 17 14:45:19.377710 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.376526 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/6c45910e-b654-460b-ab94-a9f965045cb7-tls-cert\") pod \"authorino-676889fd78-94jbq\" (UID: \"6c45910e-b654-460b-ab94-a9f965045cb7\") " pod="kuadrant-system/authorino-676889fd78-94jbq" Apr 17 14:45:19.377710 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.376673 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8qwm\" (UniqueName: \"kubernetes.io/projected/6c45910e-b654-460b-ab94-a9f965045cb7-kube-api-access-l8qwm\") pod \"authorino-676889fd78-94jbq\" (UID: \"6c45910e-b654-460b-ab94-a9f965045cb7\") " pod="kuadrant-system/authorino-676889fd78-94jbq" Apr 17 14:45:19.378953 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.378907 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7cb75c8495-dnv2h" event={"ID":"60f19f03-1ddf-4ad4-8118-b0a21cdc32aa","Type":"ContainerStarted","Data":"2c0b17dbdd238b96f13613817c2f88bb094a686816d07cd5ad50b2c84d932e65"} Apr 17 14:45:19.451357 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.451331 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-678768785c-74wq8"] Apr 17 14:45:19.454719 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:45:19.454665 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70d01211_a2a6_4b53_ac2a_08255eb342d8.slice/crio-afda9da1d1fb97b7aead9ed2783f95a7938ce12decf6eec81637d33e2a66dedf WatchSource:0}: Error finding container afda9da1d1fb97b7aead9ed2783f95a7938ce12decf6eec81637d33e2a66dedf: Status 404 returned error can't find the container with id afda9da1d1fb97b7aead9ed2783f95a7938ce12decf6eec81637d33e2a66dedf Apr 17 14:45:19.477716 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.477671 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" 
(UniqueName: \"kubernetes.io/secret/6c45910e-b654-460b-ab94-a9f965045cb7-tls-cert\") pod \"authorino-676889fd78-94jbq\" (UID: \"6c45910e-b654-460b-ab94-a9f965045cb7\") " pod="kuadrant-system/authorino-676889fd78-94jbq" Apr 17 14:45:19.477930 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.477732 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8qwm\" (UniqueName: \"kubernetes.io/projected/6c45910e-b654-460b-ab94-a9f965045cb7-kube-api-access-l8qwm\") pod \"authorino-676889fd78-94jbq\" (UID: \"6c45910e-b654-460b-ab94-a9f965045cb7\") " pod="kuadrant-system/authorino-676889fd78-94jbq" Apr 17 14:45:19.481086 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.481048 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/6c45910e-b654-460b-ab94-a9f965045cb7-tls-cert\") pod \"authorino-676889fd78-94jbq\" (UID: \"6c45910e-b654-460b-ab94-a9f965045cb7\") " pod="kuadrant-system/authorino-676889fd78-94jbq" Apr 17 14:45:19.486223 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.486189 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8qwm\" (UniqueName: \"kubernetes.io/projected/6c45910e-b654-460b-ab94-a9f965045cb7-kube-api-access-l8qwm\") pod \"authorino-676889fd78-94jbq\" (UID: \"6c45910e-b654-460b-ab94-a9f965045cb7\") " pod="kuadrant-system/authorino-676889fd78-94jbq" Apr 17 14:45:19.630803 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.630771 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-676889fd78-94jbq" Apr 17 14:45:19.776428 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:19.776401 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-676889fd78-94jbq"] Apr 17 14:45:20.385497 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:20.385460 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-mz5jz" event={"ID":"1892b6a8-4bbc-4b41-a42f-c5ceaa67b142","Type":"ContainerStarted","Data":"9246378c0631089eeb198e7b1ba7fe3c34f243556b2d8ec8385a8d08051aad85"} Apr 17 14:45:20.386390 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:20.386012 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-mz5jz" podUID="1892b6a8-4bbc-4b41-a42f-c5ceaa67b142" containerName="authorino" containerID="cri-o://9246378c0631089eeb198e7b1ba7fe3c34f243556b2d8ec8385a8d08051aad85" gracePeriod=30 Apr 17 14:45:20.387032 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:20.386993 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-676889fd78-94jbq" event={"ID":"6c45910e-b654-460b-ab94-a9f965045cb7","Type":"ContainerStarted","Data":"d9b040fcade4a2479b4a0cf8ce10f6b1e7117aab5247394f54d0acaa6c652901"} Apr 17 14:45:20.388991 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:45:20.388959 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"maas-api\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/maas-api@sha256:1e0ceed47cc0b43982aa9dd20e8ba1b464e9e776337381c854f7497ab1b738dd\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: parsing image configuration: fetching blob: received unexpected HTTP status: 503 Service Unavailable; artifact err: provided artifact is a container image\"" pod="opendatahub/maas-api-7b85867767-4cxkh" 
podUID="7b32ea11-1e79-4868-8c2b-a5b441595373" Apr 17 14:45:20.389119 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:20.389014 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-678768785c-74wq8" event={"ID":"70d01211-a2a6-4b53-ac2a-08255eb342d8","Type":"ContainerStarted","Data":"afda9da1d1fb97b7aead9ed2783f95a7938ce12decf6eec81637d33e2a66dedf"} Apr 17 14:45:20.400791 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:20.400736 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-mz5jz" podStartSLOduration=2.01133396 podStartE2EDuration="2.400716684s" podCreationTimestamp="2026-04-17 14:45:18 +0000 UTC" firstStartedPulling="2026-04-17 14:45:19.208619175 +0000 UTC m=+685.565346777" lastFinishedPulling="2026-04-17 14:45:19.598001888 +0000 UTC m=+685.954729501" observedRunningTime="2026-04-17 14:45:20.398582584 +0000 UTC m=+686.755310200" watchObservedRunningTime="2026-04-17 14:45:20.400716684 +0000 UTC m=+686.757444311" Apr 17 14:45:21.229823 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:21.229795 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-mz5jz" Apr 17 14:45:21.297227 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:21.297191 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmhwr\" (UniqueName: \"kubernetes.io/projected/1892b6a8-4bbc-4b41-a42f-c5ceaa67b142-kube-api-access-fmhwr\") pod \"1892b6a8-4bbc-4b41-a42f-c5ceaa67b142\" (UID: \"1892b6a8-4bbc-4b41-a42f-c5ceaa67b142\") " Apr 17 14:45:21.299508 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:21.299479 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1892b6a8-4bbc-4b41-a42f-c5ceaa67b142-kube-api-access-fmhwr" (OuterVolumeSpecName: "kube-api-access-fmhwr") pod "1892b6a8-4bbc-4b41-a42f-c5ceaa67b142" (UID: "1892b6a8-4bbc-4b41-a42f-c5ceaa67b142"). InnerVolumeSpecName "kube-api-access-fmhwr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:45:21.394083 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:21.394037 2570 generic.go:358] "Generic (PLEG): container finished" podID="1892b6a8-4bbc-4b41-a42f-c5ceaa67b142" containerID="9246378c0631089eeb198e7b1ba7fe3c34f243556b2d8ec8385a8d08051aad85" exitCode=0 Apr 17 14:45:21.394581 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:21.394097 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-mz5jz" Apr 17 14:45:21.394581 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:21.394135 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-mz5jz" event={"ID":"1892b6a8-4bbc-4b41-a42f-c5ceaa67b142","Type":"ContainerDied","Data":"9246378c0631089eeb198e7b1ba7fe3c34f243556b2d8ec8385a8d08051aad85"} Apr 17 14:45:21.394581 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:21.394184 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-mz5jz" event={"ID":"1892b6a8-4bbc-4b41-a42f-c5ceaa67b142","Type":"ContainerDied","Data":"7d3eb8958edbc3baba7933b5c0e32e1d62da645ca5d0aad9b044bd8f243877bb"} Apr 17 14:45:21.394581 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:21.394206 2570 scope.go:117] "RemoveContainer" containerID="9246378c0631089eeb198e7b1ba7fe3c34f243556b2d8ec8385a8d08051aad85" Apr 17 14:45:21.398137 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:21.398111 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fmhwr\" (UniqueName: \"kubernetes.io/projected/1892b6a8-4bbc-4b41-a42f-c5ceaa67b142-kube-api-access-fmhwr\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\"" Apr 17 14:45:21.416865 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:21.416833 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-mz5jz"] Apr 17 14:45:21.422748 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:21.422722 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-mz5jz"] Apr 17 14:45:21.558110 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:21.558087 2570 scope.go:117] "RemoveContainer" containerID="9246378c0631089eeb198e7b1ba7fe3c34f243556b2d8ec8385a8d08051aad85" Apr 17 14:45:21.558454 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:45:21.558422 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"9246378c0631089eeb198e7b1ba7fe3c34f243556b2d8ec8385a8d08051aad85\": container with ID starting with 9246378c0631089eeb198e7b1ba7fe3c34f243556b2d8ec8385a8d08051aad85 not found: ID does not exist" containerID="9246378c0631089eeb198e7b1ba7fe3c34f243556b2d8ec8385a8d08051aad85" Apr 17 14:45:21.558545 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:21.558460 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9246378c0631089eeb198e7b1ba7fe3c34f243556b2d8ec8385a8d08051aad85"} err="failed to get container status \"9246378c0631089eeb198e7b1ba7fe3c34f243556b2d8ec8385a8d08051aad85\": rpc error: code = NotFound desc = could not find container \"9246378c0631089eeb198e7b1ba7fe3c34f243556b2d8ec8385a8d08051aad85\": container with ID starting with 9246378c0631089eeb198e7b1ba7fe3c34f243556b2d8ec8385a8d08051aad85 not found: ID does not exist" Apr 17 14:45:22.239826 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:22.239788 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1892b6a8-4bbc-4b41-a42f-c5ceaa67b142" path="/var/lib/kubelet/pods/1892b6a8-4bbc-4b41-a42f-c5ceaa67b142/volumes" Apr 17 14:45:22.405466 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:22.405429 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7cb75c8495-dnv2h" event={"ID":"60f19f03-1ddf-4ad4-8118-b0a21cdc32aa","Type":"ContainerStarted","Data":"2f080d50bf42350bd50a1bed109b98bb846c04590b5f84a23f1a8b22439ae2cc"} Apr 17 14:45:22.405902 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:22.405576 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-7cb75c8495-dnv2h" Apr 17 14:45:22.406893 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:22.406865 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-676889fd78-94jbq" 
event={"ID":"6c45910e-b654-460b-ab94-a9f965045cb7","Type":"ContainerStarted","Data":"29b35ef188b4f5e7bfe0880425ddb785393122fb44439171539b814a62b52e74"} Apr 17 14:45:22.408092 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:22.408068 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-678768785c-74wq8" event={"ID":"70d01211-a2a6-4b53-ac2a-08255eb342d8","Type":"ContainerStarted","Data":"37e7ab3dd711c8aeb41d860c25369f22a98c34806c7b619eb546858f57271798"} Apr 17 14:45:22.408248 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:22.408221 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-678768785c-74wq8" Apr 17 14:45:22.428305 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:22.428265 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-7cb75c8495-dnv2h" podStartSLOduration=1.500518805 podStartE2EDuration="4.428254244s" podCreationTimestamp="2026-04-17 14:45:18 +0000 UTC" firstStartedPulling="2026-04-17 14:45:18.666892334 +0000 UTC m=+685.023619936" lastFinishedPulling="2026-04-17 14:45:21.594627766 +0000 UTC m=+687.951355375" observedRunningTime="2026-04-17 14:45:22.427103089 +0000 UTC m=+688.783830715" watchObservedRunningTime="2026-04-17 14:45:22.428254244 +0000 UTC m=+688.784981859" Apr 17 14:45:22.445739 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:22.445698 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-678768785c-74wq8" podStartSLOduration=2.2914744369999998 podStartE2EDuration="4.445688375s" podCreationTimestamp="2026-04-17 14:45:18 +0000 UTC" firstStartedPulling="2026-04-17 14:45:19.456496988 +0000 UTC m=+685.813224593" lastFinishedPulling="2026-04-17 14:45:21.610710914 +0000 UTC m=+687.967438531" observedRunningTime="2026-04-17 14:45:22.443996493 +0000 UTC m=+688.800724116" watchObservedRunningTime="2026-04-17 14:45:22.445688375 +0000 UTC m=+688.802415999" Apr 17 
14:45:22.462454 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:22.462417 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-676889fd78-94jbq" podStartSLOduration=1.635278001 podStartE2EDuration="3.462407413s" podCreationTimestamp="2026-04-17 14:45:19 +0000 UTC" firstStartedPulling="2026-04-17 14:45:19.783752177 +0000 UTC m=+686.140479790" lastFinishedPulling="2026-04-17 14:45:21.610881595 +0000 UTC m=+687.967609202" observedRunningTime="2026-04-17 14:45:22.461201011 +0000 UTC m=+688.817928637" watchObservedRunningTime="2026-04-17 14:45:22.462407413 +0000 UTC m=+688.819135037" Apr 17 14:45:28.416766 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:28.416731 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-678768785c-74wq8" Apr 17 14:45:28.465939 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:28.465905 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-7b85867767-4cxkh"] Apr 17 14:45:28.588377 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:28.588352 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7b85867767-4cxkh" Apr 17 14:45:28.763041 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:28.762966 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7b32ea11-1e79-4868-8c2b-a5b441595373-maas-api-tls\") pod \"7b32ea11-1e79-4868-8c2b-a5b441595373\" (UID: \"7b32ea11-1e79-4868-8c2b-a5b441595373\") " Apr 17 14:45:28.763041 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:28.763009 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz9n5\" (UniqueName: \"kubernetes.io/projected/7b32ea11-1e79-4868-8c2b-a5b441595373-kube-api-access-hz9n5\") pod \"7b32ea11-1e79-4868-8c2b-a5b441595373\" (UID: \"7b32ea11-1e79-4868-8c2b-a5b441595373\") " Apr 17 14:45:28.765012 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:28.764981 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b32ea11-1e79-4868-8c2b-a5b441595373-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "7b32ea11-1e79-4868-8c2b-a5b441595373" (UID: "7b32ea11-1e79-4868-8c2b-a5b441595373"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:45:28.765012 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:28.764990 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b32ea11-1e79-4868-8c2b-a5b441595373-kube-api-access-hz9n5" (OuterVolumeSpecName: "kube-api-access-hz9n5") pod "7b32ea11-1e79-4868-8c2b-a5b441595373" (UID: "7b32ea11-1e79-4868-8c2b-a5b441595373"). InnerVolumeSpecName "kube-api-access-hz9n5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:45:28.864628 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:28.864582 2570 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7b32ea11-1e79-4868-8c2b-a5b441595373-maas-api-tls\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\"" Apr 17 14:45:28.864628 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:28.864630 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hz9n5\" (UniqueName: \"kubernetes.io/projected/7b32ea11-1e79-4868-8c2b-a5b441595373-kube-api-access-hz9n5\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\"" Apr 17 14:45:29.432923 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:29.432880 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7b85867767-4cxkh" event={"ID":"7b32ea11-1e79-4868-8c2b-a5b441595373","Type":"ContainerDied","Data":"2e6ff7b842adfa636e229c0e1e44dffdf2727c2d8f3bfb7f85d7200610285e6a"} Apr 17 14:45:29.432923 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:29.432901 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7b85867767-4cxkh" Apr 17 14:45:29.467066 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:29.467035 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-7b85867767-4cxkh"] Apr 17 14:45:29.471573 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:29.471543 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-7b85867767-4cxkh"] Apr 17 14:45:30.239556 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:30.239515 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b32ea11-1e79-4868-8c2b-a5b441595373" path="/var/lib/kubelet/pods/7b32ea11-1e79-4868-8c2b-a5b441595373/volumes" Apr 17 14:45:33.416256 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:33.416213 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-7cb75c8495-dnv2h" Apr 17 14:45:33.699043 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:33.698960 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-647fcc54c6-4q6pb"] Apr 17 14:45:33.699355 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:33.699342 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1892b6a8-4bbc-4b41-a42f-c5ceaa67b142" containerName="authorino" Apr 17 14:45:33.699422 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:33.699358 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="1892b6a8-4bbc-4b41-a42f-c5ceaa67b142" containerName="authorino" Apr 17 14:45:33.699486 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:33.699425 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="1892b6a8-4bbc-4b41-a42f-c5ceaa67b142" containerName="authorino" Apr 17 14:45:33.706875 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:33.706852 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-647fcc54c6-4q6pb" Apr 17 14:45:33.711080 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:33.711049 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-647fcc54c6-4q6pb"] Apr 17 14:45:33.806419 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:33.806384 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74n4t\" (UniqueName: \"kubernetes.io/projected/f9629fe1-db36-4ffe-a3c4-a81df7316e6d-kube-api-access-74n4t\") pod \"maas-controller-647fcc54c6-4q6pb\" (UID: \"f9629fe1-db36-4ffe-a3c4-a81df7316e6d\") " pod="opendatahub/maas-controller-647fcc54c6-4q6pb" Apr 17 14:45:33.907720 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:33.907689 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74n4t\" (UniqueName: \"kubernetes.io/projected/f9629fe1-db36-4ffe-a3c4-a81df7316e6d-kube-api-access-74n4t\") pod \"maas-controller-647fcc54c6-4q6pb\" (UID: \"f9629fe1-db36-4ffe-a3c4-a81df7316e6d\") " pod="opendatahub/maas-controller-647fcc54c6-4q6pb" Apr 17 14:45:33.916139 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:33.916116 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74n4t\" (UniqueName: \"kubernetes.io/projected/f9629fe1-db36-4ffe-a3c4-a81df7316e6d-kube-api-access-74n4t\") pod \"maas-controller-647fcc54c6-4q6pb\" (UID: \"f9629fe1-db36-4ffe-a3c4-a81df7316e6d\") " pod="opendatahub/maas-controller-647fcc54c6-4q6pb" Apr 17 14:45:34.018020 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:34.017922 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-647fcc54c6-4q6pb" Apr 17 14:45:34.153897 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:34.153780 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-647fcc54c6-4q6pb"] Apr 17 14:45:34.156207 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:45:34.156178 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9629fe1_db36_4ffe_a3c4_a81df7316e6d.slice/crio-bb366a7befa9dabc3b1943f8a50927de866af3105a829b117cafc98d940c6b81 WatchSource:0}: Error finding container bb366a7befa9dabc3b1943f8a50927de866af3105a829b117cafc98d940c6b81: Status 404 returned error can't find the container with id bb366a7befa9dabc3b1943f8a50927de866af3105a829b117cafc98d940c6b81 Apr 17 14:45:34.450458 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:34.450416 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-647fcc54c6-4q6pb" event={"ID":"f9629fe1-db36-4ffe-a3c4-a81df7316e6d","Type":"ContainerStarted","Data":"bb366a7befa9dabc3b1943f8a50927de866af3105a829b117cafc98d940c6b81"} Apr 17 14:45:35.454975 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:35.454935 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-647fcc54c6-4q6pb" event={"ID":"f9629fe1-db36-4ffe-a3c4-a81df7316e6d","Type":"ContainerStarted","Data":"1e864425ae3faf423b63740d27ee2c28883c25178845ceb867f3d97023bd4cc5"} Apr 17 14:45:35.455383 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:35.455069 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-647fcc54c6-4q6pb" Apr 17 14:45:35.471252 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:35.471192 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-647fcc54c6-4q6pb" podStartSLOduration=2.123011279 podStartE2EDuration="2.471176132s" 
podCreationTimestamp="2026-04-17 14:45:33 +0000 UTC" firstStartedPulling="2026-04-17 14:45:34.157507453 +0000 UTC m=+700.514235056" lastFinishedPulling="2026-04-17 14:45:34.505672305 +0000 UTC m=+700.862399909" observedRunningTime="2026-04-17 14:45:35.469262593 +0000 UTC m=+701.825990221" watchObservedRunningTime="2026-04-17 14:45:35.471176132 +0000 UTC m=+701.827903757" Apr 17 14:45:46.463165 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:46.463075 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-647fcc54c6-4q6pb" Apr 17 14:45:46.501193 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:46.501161 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7cb75c8495-dnv2h"] Apr 17 14:45:46.501911 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:46.501873 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-7cb75c8495-dnv2h" podUID="60f19f03-1ddf-4ad4-8118-b0a21cdc32aa" containerName="manager" containerID="cri-o://2f080d50bf42350bd50a1bed109b98bb846c04590b5f84a23f1a8b22439ae2cc" gracePeriod=10 Apr 17 14:45:46.747752 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:46.747727 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7cb75c8495-dnv2h" Apr 17 14:45:46.811598 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:46.811554 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7wv8\" (UniqueName: \"kubernetes.io/projected/60f19f03-1ddf-4ad4-8118-b0a21cdc32aa-kube-api-access-n7wv8\") pod \"60f19f03-1ddf-4ad4-8118-b0a21cdc32aa\" (UID: \"60f19f03-1ddf-4ad4-8118-b0a21cdc32aa\") " Apr 17 14:45:46.813844 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:46.813811 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60f19f03-1ddf-4ad4-8118-b0a21cdc32aa-kube-api-access-n7wv8" (OuterVolumeSpecName: "kube-api-access-n7wv8") pod "60f19f03-1ddf-4ad4-8118-b0a21cdc32aa" (UID: "60f19f03-1ddf-4ad4-8118-b0a21cdc32aa"). InnerVolumeSpecName "kube-api-access-n7wv8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:45:46.912491 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:46.912438 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n7wv8\" (UniqueName: \"kubernetes.io/projected/60f19f03-1ddf-4ad4-8118-b0a21cdc32aa-kube-api-access-n7wv8\") on node \"ip-10-0-129-134.ec2.internal\" DevicePath \"\"" Apr 17 14:45:47.497148 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:47.497109 2570 generic.go:358] "Generic (PLEG): container finished" podID="60f19f03-1ddf-4ad4-8118-b0a21cdc32aa" containerID="2f080d50bf42350bd50a1bed109b98bb846c04590b5f84a23f1a8b22439ae2cc" exitCode=0 Apr 17 14:45:47.497592 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:47.497181 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7cb75c8495-dnv2h" Apr 17 14:45:47.497592 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:47.497203 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7cb75c8495-dnv2h" event={"ID":"60f19f03-1ddf-4ad4-8118-b0a21cdc32aa","Type":"ContainerDied","Data":"2f080d50bf42350bd50a1bed109b98bb846c04590b5f84a23f1a8b22439ae2cc"} Apr 17 14:45:47.497592 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:47.497255 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7cb75c8495-dnv2h" event={"ID":"60f19f03-1ddf-4ad4-8118-b0a21cdc32aa","Type":"ContainerDied","Data":"2c0b17dbdd238b96f13613817c2f88bb094a686816d07cd5ad50b2c84d932e65"} Apr 17 14:45:47.497592 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:47.497276 2570 scope.go:117] "RemoveContainer" containerID="2f080d50bf42350bd50a1bed109b98bb846c04590b5f84a23f1a8b22439ae2cc" Apr 17 14:45:47.506383 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:47.506041 2570 scope.go:117] "RemoveContainer" containerID="2f080d50bf42350bd50a1bed109b98bb846c04590b5f84a23f1a8b22439ae2cc" Apr 17 14:45:47.506459 ip-10-0-129-134 kubenswrapper[2570]: E0417 14:45:47.506378 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f080d50bf42350bd50a1bed109b98bb846c04590b5f84a23f1a8b22439ae2cc\": container with ID starting with 2f080d50bf42350bd50a1bed109b98bb846c04590b5f84a23f1a8b22439ae2cc not found: ID does not exist" containerID="2f080d50bf42350bd50a1bed109b98bb846c04590b5f84a23f1a8b22439ae2cc" Apr 17 14:45:47.506459 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:47.506409 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f080d50bf42350bd50a1bed109b98bb846c04590b5f84a23f1a8b22439ae2cc"} err="failed to get container status \"2f080d50bf42350bd50a1bed109b98bb846c04590b5f84a23f1a8b22439ae2cc\": rpc error: 
code = NotFound desc = could not find container \"2f080d50bf42350bd50a1bed109b98bb846c04590b5f84a23f1a8b22439ae2cc\": container with ID starting with 2f080d50bf42350bd50a1bed109b98bb846c04590b5f84a23f1a8b22439ae2cc not found: ID does not exist" Apr 17 14:45:47.520875 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:47.520846 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7cb75c8495-dnv2h"] Apr 17 14:45:47.525003 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:47.524982 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-7cb75c8495-dnv2h"] Apr 17 14:45:48.238937 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:45:48.238903 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60f19f03-1ddf-4ad4-8118-b0a21cdc32aa" path="/var/lib/kubelet/pods/60f19f03-1ddf-4ad4-8118-b0a21cdc32aa/volumes" Apr 17 14:48:54.181278 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:48:54.181185 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lr966_0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c/ovn-acl-logging/0.log" Apr 17 14:48:54.185975 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:48:54.185947 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lr966_0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c/ovn-acl-logging/0.log" Apr 17 14:50:50.429544 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:50:50.429511 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-676889fd78-94jbq_6c45910e-b654-460b-ab94-a9f965045cb7/authorino/0.log" Apr 17 14:50:54.254231 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:50:54.254202 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-678768785c-74wq8_70d01211-a2a6-4b53-ac2a-08255eb342d8/maas-api/0.log" Apr 17 14:50:54.369163 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:50:54.369126 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_maas-controller-647fcc54c6-4q6pb_f9629fe1-db36-4ffe-a3c4-a81df7316e6d/manager/0.log" Apr 17 14:50:54.483845 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:50:54.483809 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-k6gvc_cf35ee29-5a30-4277-a1af-d999fbdb36de/manager/2.log" Apr 17 14:50:54.826629 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:50:54.826589 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6c585549bc-zqv2x_b392ed97-9ee8-4818-8103-31efb7f94c80/manager/0.log" Apr 17 14:50:54.943333 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:50:54.943286 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-rr2w2_4952c4ad-a6d9-4658-b9e4-3bcd9971f71a/postgres/0.log" Apr 17 14:50:56.205220 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:50:56.205186 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-676889fd78-94jbq_6c45910e-b654-460b-ab94-a9f965045cb7/authorino/0.log" Apr 17 14:50:56.549393 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:50:56.549366 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-xj54s_97f76fcc-9592-4845-9cef-a42c1d09ddca/kuadrant-console-plugin/0.log" Apr 17 14:50:57.530441 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:50:57.530411 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-596bc867c-54qdn_b247dd44-4d34-4a95-a9a7-8ff7b84ab721/kube-auth-proxy/0.log" Apr 17 14:51:02.609674 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:02.609637 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fvwkz/must-gather-sm479"] Apr 17 14:51:02.610042 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:02.610011 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="60f19f03-1ddf-4ad4-8118-b0a21cdc32aa" containerName="manager" Apr 17 14:51:02.610042 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:02.610022 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="60f19f03-1ddf-4ad4-8118-b0a21cdc32aa" containerName="manager" Apr 17 14:51:02.610111 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:02.610093 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="60f19f03-1ddf-4ad4-8118-b0a21cdc32aa" containerName="manager" Apr 17 14:51:02.613227 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:02.613202 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fvwkz/must-gather-sm479" Apr 17 14:51:02.616037 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:02.616011 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-fvwkz\"/\"default-dockercfg-p84cm\"" Apr 17 14:51:02.616200 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:02.616043 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-fvwkz\"/\"openshift-service-ca.crt\"" Apr 17 14:51:02.617256 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:02.617207 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-fvwkz\"/\"kube-root-ca.crt\"" Apr 17 14:51:02.622074 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:02.622052 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fvwkz/must-gather-sm479"] Apr 17 14:51:02.679610 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:02.679577 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5e2ffa71-a7e9-47a8-9359-2671006a6b60-must-gather-output\") pod \"must-gather-sm479\" (UID: \"5e2ffa71-a7e9-47a8-9359-2671006a6b60\") " pod="openshift-must-gather-fvwkz/must-gather-sm479" Apr 17 
14:51:02.679754 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:02.679633 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr2m5\" (UniqueName: \"kubernetes.io/projected/5e2ffa71-a7e9-47a8-9359-2671006a6b60-kube-api-access-gr2m5\") pod \"must-gather-sm479\" (UID: \"5e2ffa71-a7e9-47a8-9359-2671006a6b60\") " pod="openshift-must-gather-fvwkz/must-gather-sm479" Apr 17 14:51:02.780905 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:02.780872 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5e2ffa71-a7e9-47a8-9359-2671006a6b60-must-gather-output\") pod \"must-gather-sm479\" (UID: \"5e2ffa71-a7e9-47a8-9359-2671006a6b60\") " pod="openshift-must-gather-fvwkz/must-gather-sm479" Apr 17 14:51:02.781055 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:02.780923 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gr2m5\" (UniqueName: \"kubernetes.io/projected/5e2ffa71-a7e9-47a8-9359-2671006a6b60-kube-api-access-gr2m5\") pod \"must-gather-sm479\" (UID: \"5e2ffa71-a7e9-47a8-9359-2671006a6b60\") " pod="openshift-must-gather-fvwkz/must-gather-sm479" Apr 17 14:51:02.781220 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:02.781200 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5e2ffa71-a7e9-47a8-9359-2671006a6b60-must-gather-output\") pod \"must-gather-sm479\" (UID: \"5e2ffa71-a7e9-47a8-9359-2671006a6b60\") " pod="openshift-must-gather-fvwkz/must-gather-sm479" Apr 17 14:51:02.789049 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:02.789028 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr2m5\" (UniqueName: \"kubernetes.io/projected/5e2ffa71-a7e9-47a8-9359-2671006a6b60-kube-api-access-gr2m5\") pod \"must-gather-sm479\" (UID: 
\"5e2ffa71-a7e9-47a8-9359-2671006a6b60\") " pod="openshift-must-gather-fvwkz/must-gather-sm479" Apr 17 14:51:02.923649 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:02.923565 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fvwkz/must-gather-sm479" Apr 17 14:51:03.043916 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:03.043888 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fvwkz/must-gather-sm479"] Apr 17 14:51:03.046384 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:51:03.046355 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e2ffa71_a7e9_47a8_9359_2671006a6b60.slice/crio-7c0a4d9e83a3dace81c5c5436ca082a3cf38512261b71e75fd296c4ef7b27de1 WatchSource:0}: Error finding container 7c0a4d9e83a3dace81c5c5436ca082a3cf38512261b71e75fd296c4ef7b27de1: Status 404 returned error can't find the container with id 7c0a4d9e83a3dace81c5c5436ca082a3cf38512261b71e75fd296c4ef7b27de1 Apr 17 14:51:03.048067 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:03.048050 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:51:03.537304 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:03.537265 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fvwkz/must-gather-sm479" event={"ID":"5e2ffa71-a7e9-47a8-9359-2671006a6b60","Type":"ContainerStarted","Data":"7c0a4d9e83a3dace81c5c5436ca082a3cf38512261b71e75fd296c4ef7b27de1"} Apr 17 14:51:04.544187 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:04.544148 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fvwkz/must-gather-sm479" event={"ID":"5e2ffa71-a7e9-47a8-9359-2671006a6b60","Type":"ContainerStarted","Data":"aa6e31ef47a495d6af70243ff94f9ed896afea897d73d065d0a96b73304dddf7"} Apr 17 14:51:04.544675 ip-10-0-129-134 kubenswrapper[2570]: I0417 
14:51:04.544195 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fvwkz/must-gather-sm479" event={"ID":"5e2ffa71-a7e9-47a8-9359-2671006a6b60","Type":"ContainerStarted","Data":"7170f48d88c5dd4fc7c2657d630612c3168b0956c8774d57de0a190bad1db558"} Apr 17 14:51:04.561699 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:04.561635 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fvwkz/must-gather-sm479" podStartSLOduration=1.846358078 podStartE2EDuration="2.561613358s" podCreationTimestamp="2026-04-17 14:51:02 +0000 UTC" firstStartedPulling="2026-04-17 14:51:03.048180502 +0000 UTC m=+1029.404908105" lastFinishedPulling="2026-04-17 14:51:03.763435782 +0000 UTC m=+1030.120163385" observedRunningTime="2026-04-17 14:51:04.559332108 +0000 UTC m=+1030.916059734" watchObservedRunningTime="2026-04-17 14:51:04.561613358 +0000 UTC m=+1030.918340984" Apr 17 14:51:05.385379 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:05.385324 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-thst5_2404238c-49db-4bc3-b328-96679c365761/global-pull-secret-syncer/0.log" Apr 17 14:51:05.468554 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:05.468526 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-2pbpg_70bf124d-898a-4e10-aece-902d90ea13ac/konnectivity-agent/0.log" Apr 17 14:51:05.536825 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:05.536797 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-134.ec2.internal_6856fb5778dd8b62676b041b83b334bb/haproxy/0.log" Apr 17 14:51:09.490227 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:09.490192 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-676889fd78-94jbq_6c45910e-b654-460b-ab94-a9f965045cb7/authorino/0.log" Apr 17 14:51:09.569568 ip-10-0-129-134 kubenswrapper[2570]: I0417 
14:51:09.569526 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-xj54s_97f76fcc-9592-4845-9cef-a42c1d09ddca/kuadrant-console-plugin/0.log" Apr 17 14:51:10.932984 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:10.932936 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f759a44c-4c49-40f7-9ca4-77917fc3f9d4/alertmanager/0.log" Apr 17 14:51:10.956793 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:10.956767 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f759a44c-4c49-40f7-9ca4-77917fc3f9d4/config-reloader/0.log" Apr 17 14:51:10.978020 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:10.977902 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f759a44c-4c49-40f7-9ca4-77917fc3f9d4/kube-rbac-proxy-web/0.log" Apr 17 14:51:11.001705 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:11.001671 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f759a44c-4c49-40f7-9ca4-77917fc3f9d4/kube-rbac-proxy/0.log" Apr 17 14:51:11.023883 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:11.023842 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f759a44c-4c49-40f7-9ca4-77917fc3f9d4/kube-rbac-proxy-metric/0.log" Apr 17 14:51:11.053186 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:11.053162 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f759a44c-4c49-40f7-9ca4-77917fc3f9d4/prom-label-proxy/0.log" Apr 17 14:51:11.091502 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:11.091463 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f759a44c-4c49-40f7-9ca4-77917fc3f9d4/init-config-reloader/0.log" Apr 17 14:51:11.170963 ip-10-0-129-134 
kubenswrapper[2570]: I0417 14:51:11.170907 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-pvlsg_12e54482-f4d9-41a0-ba0c-533f43bca23f/kube-state-metrics/0.log" Apr 17 14:51:11.196039 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:11.195931 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-pvlsg_12e54482-f4d9-41a0-ba0c-533f43bca23f/kube-rbac-proxy-main/0.log" Apr 17 14:51:11.219602 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:11.219568 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-pvlsg_12e54482-f4d9-41a0-ba0c-533f43bca23f/kube-rbac-proxy-self/0.log" Apr 17 14:51:11.472611 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:11.472518 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jsqmw_ed45cab8-95d1-4284-9dc1-2d602984c4e1/node-exporter/0.log" Apr 17 14:51:11.493086 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:11.493027 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jsqmw_ed45cab8-95d1-4284-9dc1-2d602984c4e1/kube-rbac-proxy/0.log" Apr 17 14:51:11.514974 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:11.514948 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jsqmw_ed45cab8-95d1-4284-9dc1-2d602984c4e1/init-textfile/0.log" Apr 17 14:51:11.541573 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:11.541522 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-csjdg_6aedddde-b37e-4862-ac07-19ecebf2ca41/kube-rbac-proxy-main/0.log" Apr 17 14:51:11.563518 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:11.563490 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-csjdg_6aedddde-b37e-4862-ac07-19ecebf2ca41/kube-rbac-proxy-self/0.log" Apr 17 14:51:11.583615 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:11.583586 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-csjdg_6aedddde-b37e-4862-ac07-19ecebf2ca41/openshift-state-metrics/0.log" Apr 17 14:51:11.865850 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:11.865790 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-768cb8d9f7-s7czf_536cadba-ee0d-4cfa-bd2e-2f09daa67ea7/telemeter-client/0.log" Apr 17 14:51:11.887907 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:11.887874 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-768cb8d9f7-s7czf_536cadba-ee0d-4cfa-bd2e-2f09daa67ea7/reload/0.log" Apr 17 14:51:11.907371 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:11.907346 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-768cb8d9f7-s7czf_536cadba-ee0d-4cfa-bd2e-2f09daa67ea7/kube-rbac-proxy/0.log" Apr 17 14:51:13.218232 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:13.218201 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-jngmx_42b16b82-9513-445d-b01e-228784a51e88/networking-console-plugin/0.log" Apr 17 14:51:13.977888 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:13.977846 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fvwkz/perf-node-gather-daemonset-rdfbx"] Apr 17 14:51:13.984111 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:13.984078 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-rdfbx" Apr 17 14:51:13.990660 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:13.990608 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fvwkz/perf-node-gather-daemonset-rdfbx"] Apr 17 14:51:14.100586 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:14.100538 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7b1ce413-5263-49e9-9367-fc21a3a7c3a6-podres\") pod \"perf-node-gather-daemonset-rdfbx\" (UID: \"7b1ce413-5263-49e9-9367-fc21a3a7c3a6\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-rdfbx" Apr 17 14:51:14.100770 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:14.100598 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z8z8\" (UniqueName: \"kubernetes.io/projected/7b1ce413-5263-49e9-9367-fc21a3a7c3a6-kube-api-access-9z8z8\") pod \"perf-node-gather-daemonset-rdfbx\" (UID: \"7b1ce413-5263-49e9-9367-fc21a3a7c3a6\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-rdfbx" Apr 17 14:51:14.100770 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:14.100640 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7b1ce413-5263-49e9-9367-fc21a3a7c3a6-sys\") pod \"perf-node-gather-daemonset-rdfbx\" (UID: \"7b1ce413-5263-49e9-9367-fc21a3a7c3a6\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-rdfbx" Apr 17 14:51:14.100770 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:14.100664 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7b1ce413-5263-49e9-9367-fc21a3a7c3a6-lib-modules\") pod \"perf-node-gather-daemonset-rdfbx\" (UID: 
\"7b1ce413-5263-49e9-9367-fc21a3a7c3a6\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-rdfbx" Apr 17 14:51:14.100945 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:14.100759 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7b1ce413-5263-49e9-9367-fc21a3a7c3a6-proc\") pod \"perf-node-gather-daemonset-rdfbx\" (UID: \"7b1ce413-5263-49e9-9367-fc21a3a7c3a6\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-rdfbx" Apr 17 14:51:14.201610 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:14.201567 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7b1ce413-5263-49e9-9367-fc21a3a7c3a6-podres\") pod \"perf-node-gather-daemonset-rdfbx\" (UID: \"7b1ce413-5263-49e9-9367-fc21a3a7c3a6\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-rdfbx" Apr 17 14:51:14.201798 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:14.201630 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9z8z8\" (UniqueName: \"kubernetes.io/projected/7b1ce413-5263-49e9-9367-fc21a3a7c3a6-kube-api-access-9z8z8\") pod \"perf-node-gather-daemonset-rdfbx\" (UID: \"7b1ce413-5263-49e9-9367-fc21a3a7c3a6\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-rdfbx" Apr 17 14:51:14.201798 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:14.201668 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7b1ce413-5263-49e9-9367-fc21a3a7c3a6-sys\") pod \"perf-node-gather-daemonset-rdfbx\" (UID: \"7b1ce413-5263-49e9-9367-fc21a3a7c3a6\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-rdfbx" Apr 17 14:51:14.201798 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:14.201692 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7b1ce413-5263-49e9-9367-fc21a3a7c3a6-lib-modules\") pod \"perf-node-gather-daemonset-rdfbx\" (UID: \"7b1ce413-5263-49e9-9367-fc21a3a7c3a6\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-rdfbx" Apr 17 14:51:14.201798 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:14.201736 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7b1ce413-5263-49e9-9367-fc21a3a7c3a6-proc\") pod \"perf-node-gather-daemonset-rdfbx\" (UID: \"7b1ce413-5263-49e9-9367-fc21a3a7c3a6\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-rdfbx" Apr 17 14:51:14.201986 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:14.201871 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7b1ce413-5263-49e9-9367-fc21a3a7c3a6-proc\") pod \"perf-node-gather-daemonset-rdfbx\" (UID: \"7b1ce413-5263-49e9-9367-fc21a3a7c3a6\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-rdfbx" Apr 17 14:51:14.202032 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:14.201989 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7b1ce413-5263-49e9-9367-fc21a3a7c3a6-podres\") pod \"perf-node-gather-daemonset-rdfbx\" (UID: \"7b1ce413-5263-49e9-9367-fc21a3a7c3a6\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-rdfbx" Apr 17 14:51:14.202210 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:14.202183 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7b1ce413-5263-49e9-9367-fc21a3a7c3a6-sys\") pod \"perf-node-gather-daemonset-rdfbx\" (UID: \"7b1ce413-5263-49e9-9367-fc21a3a7c3a6\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-rdfbx" Apr 17 14:51:14.202417 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:14.202391 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7b1ce413-5263-49e9-9367-fc21a3a7c3a6-lib-modules\") pod \"perf-node-gather-daemonset-rdfbx\" (UID: \"7b1ce413-5263-49e9-9367-fc21a3a7c3a6\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-rdfbx" Apr 17 14:51:14.210572 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:14.210548 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z8z8\" (UniqueName: \"kubernetes.io/projected/7b1ce413-5263-49e9-9367-fc21a3a7c3a6-kube-api-access-9z8z8\") pod \"perf-node-gather-daemonset-rdfbx\" (UID: \"7b1ce413-5263-49e9-9367-fc21a3a7c3a6\") " pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-rdfbx" Apr 17 14:51:14.301372 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:14.301340 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-rdfbx" Apr 17 14:51:14.453185 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:14.453151 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fvwkz/perf-node-gather-daemonset-rdfbx"] Apr 17 14:51:14.456059 ip-10-0-129-134 kubenswrapper[2570]: W0417 14:51:14.456015 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7b1ce413_5263_49e9_9367_fc21a3a7c3a6.slice/crio-3b514a16df5b32cfb4dd88c8902869ea419b7f424bf163892a283ac6e3b8dea0 WatchSource:0}: Error finding container 3b514a16df5b32cfb4dd88c8902869ea419b7f424bf163892a283ac6e3b8dea0: Status 404 returned error can't find the container with id 3b514a16df5b32cfb4dd88c8902869ea419b7f424bf163892a283ac6e3b8dea0 Apr 17 14:51:14.596128 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:14.596030 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-rdfbx" 
event={"ID":"7b1ce413-5263-49e9-9367-fc21a3a7c3a6","Type":"ContainerStarted","Data":"f76ae86c85c304c0ea92eeff01840c5ae6814dcc2a2c6c925e85ae66f8b6675d"} Apr 17 14:51:14.596128 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:14.596082 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-rdfbx" event={"ID":"7b1ce413-5263-49e9-9367-fc21a3a7c3a6","Type":"ContainerStarted","Data":"3b514a16df5b32cfb4dd88c8902869ea419b7f424bf163892a283ac6e3b8dea0"} Apr 17 14:51:14.596925 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:14.596897 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-rdfbx" Apr 17 14:51:14.616151 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:14.616092 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-rdfbx" podStartSLOduration=1.616077556 podStartE2EDuration="1.616077556s" podCreationTimestamp="2026-04-17 14:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:51:14.6146425 +0000 UTC m=+1040.971370141" watchObservedRunningTime="2026-04-17 14:51:14.616077556 +0000 UTC m=+1040.972805216" Apr 17 14:51:15.610447 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:15.610420 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wbd2m_d1beaeea-4ae8-452c-b3be-201c1ad4568e/dns/0.log" Apr 17 14:51:15.629735 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:15.629703 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wbd2m_d1beaeea-4ae8-452c-b3be-201c1ad4568e/kube-rbac-proxy/0.log" Apr 17 14:51:15.673803 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:15.673775 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-s9w25_e599dbf6-a663-42a7-82bb-12ed438c2ba8/dns-node-resolver/0.log" Apr 17 14:51:16.152910 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:16.152875 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pzwpd_c66a6c5d-9d68-42d2-ac7d-1ef80b44ed2f/node-ca/0.log" Apr 17 14:51:17.160740 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:17.160716 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-596bc867c-54qdn_b247dd44-4d34-4a95-a9a7-8ff7b84ab721/kube-auth-proxy/0.log" Apr 17 14:51:17.732759 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:17.732727 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-l9shq_05c803a3-345a-4b4a-b40b-575211301efb/serve-healthcheck-canary/0.log" Apr 17 14:51:18.330475 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:18.330442 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v5hj8_254b8d3e-f0ea-4092-933e-966862da913d/kube-rbac-proxy/0.log" Apr 17 14:51:18.358698 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:18.358672 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v5hj8_254b8d3e-f0ea-4092-933e-966862da913d/exporter/0.log" Apr 17 14:51:18.382026 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:18.381996 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v5hj8_254b8d3e-f0ea-4092-933e-966862da913d/extractor/0.log" Apr 17 14:51:20.238161 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:20.238133 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-678768785c-74wq8_70d01211-a2a6-4b53-ac2a-08255eb342d8/maas-api/0.log" Apr 17 14:51:20.262095 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:20.262066 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_maas-controller-647fcc54c6-4q6pb_f9629fe1-db36-4ffe-a3c4-a81df7316e6d/manager/0.log" Apr 17 14:51:20.281201 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:20.281171 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-k6gvc_cf35ee29-5a30-4277-a1af-d999fbdb36de/manager/1.log" Apr 17 14:51:20.292438 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:20.292405 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-k6gvc_cf35ee29-5a30-4277-a1af-d999fbdb36de/manager/2.log" Apr 17 14:51:20.379731 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:20.379696 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6c585549bc-zqv2x_b392ed97-9ee8-4818-8103-31efb7f94c80/manager/0.log" Apr 17 14:51:20.398530 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:20.398501 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-rr2w2_4952c4ad-a6d9-4658-b9e4-3bcd9971f71a/postgres/0.log" Apr 17 14:51:21.477885 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:21.477851 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-66f567c4b6-skq9g_2e17f585-e743-4564-8174-9b44434f5f66/manager/0.log" Apr 17 14:51:21.613094 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:21.613068 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-fvwkz/perf-node-gather-daemonset-rdfbx" Apr 17 14:51:27.480724 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:27.480691 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lr2qq_4790e8ee-77ab-402e-a1df-7e728d62db98/kube-multus-additional-cni-plugins/0.log" Apr 17 14:51:27.503707 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:27.503672 2570 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lr2qq_4790e8ee-77ab-402e-a1df-7e728d62db98/egress-router-binary-copy/0.log" Apr 17 14:51:27.523288 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:27.523253 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lr2qq_4790e8ee-77ab-402e-a1df-7e728d62db98/cni-plugins/0.log" Apr 17 14:51:27.544734 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:27.544706 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lr2qq_4790e8ee-77ab-402e-a1df-7e728d62db98/bond-cni-plugin/0.log" Apr 17 14:51:27.565846 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:27.565824 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lr2qq_4790e8ee-77ab-402e-a1df-7e728d62db98/routeoverride-cni/0.log" Apr 17 14:51:27.588514 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:27.588489 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lr2qq_4790e8ee-77ab-402e-a1df-7e728d62db98/whereabouts-cni-bincopy/0.log" Apr 17 14:51:27.609558 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:27.609531 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lr2qq_4790e8ee-77ab-402e-a1df-7e728d62db98/whereabouts-cni/0.log" Apr 17 14:51:27.645013 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:27.644985 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s7m27_29dc8572-3cfc-4d9d-b915-1e8a137c2e00/kube-multus/0.log" Apr 17 14:51:27.773698 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:27.773610 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-j6cgs_4807a6e2-14df-4ba4-8aee-7422a65508f2/network-metrics-daemon/0.log" Apr 17 14:51:27.790975 ip-10-0-129-134 
kubenswrapper[2570]: I0417 14:51:27.790945 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-j6cgs_4807a6e2-14df-4ba4-8aee-7422a65508f2/kube-rbac-proxy/0.log" Apr 17 14:51:28.820408 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:28.820376 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lr966_0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c/ovn-controller/0.log" Apr 17 14:51:28.838453 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:28.838426 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lr966_0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c/ovn-acl-logging/0.log" Apr 17 14:51:28.845306 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:28.845266 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lr966_0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c/ovn-acl-logging/1.log" Apr 17 14:51:28.863881 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:28.863844 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lr966_0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c/kube-rbac-proxy-node/0.log" Apr 17 14:51:28.883845 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:28.883810 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lr966_0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 14:51:28.901000 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:28.900955 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lr966_0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c/northd/0.log" Apr 17 14:51:28.922256 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:28.922214 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lr966_0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c/nbdb/0.log" Apr 17 14:51:28.942671 ip-10-0-129-134 
kubenswrapper[2570]: I0417 14:51:28.942643 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lr966_0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c/sbdb/0.log" Apr 17 14:51:29.064072 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:29.064033 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lr966_0cf20fd9-1bae-45f5-af6f-5f39f00c8f3c/ovnkube-controller/0.log" Apr 17 14:51:30.400073 ip-10-0-129-134 kubenswrapper[2570]: I0417 14:51:30.400049 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-2jb96_651b7208-91cf-42ee-b675-0a50ef1389f0/network-check-target-container/0.log"